WorldWideScience

Sample records for automated process monitoring

  1. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection, and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results can be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, is discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.
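
    The data analysis/reporting step of such a monitor ultimately reduces detector counts to an activity concentration. A minimal sketch of that standard radiometric conversion (net count rate divided by detection efficiency and sample volume) is given below; the function name and parameter values are illustrative, not taken from the PNNL instrument:

```python
def activity_concentration(gross_counts, live_time_s, background_cps,
                           efficiency, sample_volume_mL):
    """Convert raw detector counts to activity concentration in Bq/mL.

    net count rate (cps) = gross_counts / live_time - background
    activity (Bq/mL)     = net cps / (efficiency * sample volume)
    """
    net_cps = gross_counts / live_time_s - background_cps
    return net_cps / (efficiency * sample_volume_mL)

# Example: 6000 counts in 60 s, 10 cps background, 30% efficiency,
# 3 mL sample -> 100 Bq/mL
```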

  2. Plug-and-play monitoring and performance optimization for industrial automation processes

    CERN Document Server

    Luo, Hao

    2017-01-01

    Dr.-Ing. Hao Luo demonstrates the development of advanced plug-and-play (PnP) process monitoring and control systems for industrial automation processes. With the aid of the so-called Youla parameterization, a novel PnP process monitoring and control architecture (PnP-PMCA) with modularized components is proposed. To validate the developments, a case study on an industrial rolling mill benchmark is performed, and the real-time implementation on a laboratory brushless DC motor is presented. Contents: PnP Process Monitoring and Control Architecture; Real-Time Configuration Techniques for PnP Process Monitoring; Real-Time Configuration Techniques for PnP Performance Optimization; Benchmark Study and Real-Time Implementation. Target Groups: Researchers and students of Automation and Control Engineering; Practitioners in the area of Industrial and Production Engineering. The Author: Hao Luo received the Ph.D. degree at the Institute for Automatic Control and Complex Systems (AKS) at the University of Duisburg-Essen, Germany, ...

  3. Automated Radioanalytical Chemistry: Applications For The Laboratory And Industrial Process Monitoring

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Farawila, Anne F.; Grate, Jay W.

    2009-01-01

    The identification and quantification of targeted α- and β-emitting radionuclides via destructive analysis in complex radioactive liquid matrices is highly challenging. Analyses are typically accomplished at on- or off-site laboratories through laborious sample preparation steps and extensive chemical separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, alpha energy spectroscopy, mass spectrometry). Analytical results may take days or weeks to report. When an industrial-scale plant requires periodic or continuous monitoring of radionuclides as an indication of, for example, the composition of its feed stream, diversion of safeguarded nuclides, or plant operational conditions, radiochemical measurements should be rapid, but not at the expense of precision and accuracy. Scientists at Pacific Northwest National Laboratory have developed and characterized a host of automated radioanalytical systems designed to perform reproducible and rapid radioanalytical processes. Platforms have been assembled for (1) automation and acceleration of sample analysis in the laboratory and (2) automated monitoring of industrial-scale nuclear processes on-line with near-real-time results. These methods have been applied to samples ranging from environmental-level actinides and fission products to high-level nuclear process fluids. Systems have been designed to integrate a number of discrete sample handling steps, including sample pretreatment (e.g., digestion and valence state adjustment) and chemical separations. The systems have either utilized on-line analyte detection or have collected the purified analyte fractions for off-line measurement applications. One PNNL system of particular note is a fully automated prototype on-line radioanalytical system designed for the Waste Treatment Plant at Hanford, WA, USA. This system demonstrated nearly continuous destructive analysis of the soft β-emitting radionuclide 99Tc in nuclear

  4. Technology Transfer Opportunities: Automated Ground-Water Monitoring

    Science.gov (United States)

    Smith, Kirk P.; Granato, Gregory E.

    1997-01-01

    Introduction: A new automated ground-water monitoring system developed by the U.S. Geological Survey (USGS) measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automation of water-quality monitoring systems in the field, in laboratories, and in industry has increased data density and utility while reducing operating costs. Uses for an automated ground-water monitoring system include (but are not limited to): monitoring ground-water quality for research; monitoring known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored; and serving as an early warning system for ground-water quality near public water-supply wells.

  5. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; Grate, Jay W.; DeVol, Timothy A.

    2003-01-01

    This research program is directed toward rapid, sensitive, and selective determination of beta and alpha-emitting radionuclides such as 99Tc, 90Sr, and trans-uranium (TRU) elements in low activity waste (LAW) processing streams. The overall technical approach is based on automated radiochemical measurement principles. Nuclear waste process streams are particularly challenging for rapid analytical methods due to the complex, high-ionic-strength, caustic brine sample matrix, the presence of interfering radionuclides, and the variable and uncertain speciation of the radionuclides of interest. As a result, matrix modification, speciation control, and separation chemistries are required for use in automated process analyzers. Significant knowledge gaps exist relative to the design of chemistries for such analyzers so that radionuclides can be quantitatively and rapidly separated and analyzed in solutions derived from low-activity waste processing operations. This research is addressing these knowledge gaps in the areas of separation science, nuclear detection, and analytical chemistry and instrumentation. The outcome of these investigations will be the knowledge necessary to choose appropriate chemistries for sample matrix modification and analyte speciation control and chemistries for rapid and selective separation and preconcentration of target radionuclides from complex sample matrices. In addition, new approaches for quantification of alpha emitters in solution using solid-state diode detectors, as well as improved instrumentation and signal processing techniques for use with solid-state and scintillation detectors, will be developed. New knowledge of the performance of separation materials, matrix modification and speciation control chemistries, instrument configurations, and quantitative analytical approaches will provide the basis for designing effective instrumentation for radioanalytical process monitoring. Specific analytical targets include 99Tc, 90Sr and

  6. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; Grate, Jay W.; DeVol, Timothy A.

    2004-01-01

    This research program is directed toward rapid, sensitive, and selective determination of beta and alpha-emitting radionuclides such as 99Tc, 90Sr, and trans-uranium (TRU) elements in low activity waste (LAW) processing streams. The overall technical approach is based on automated radiochemical measurement principles, which entails integration of sample treatment and separation chemistries and radiometric detection within a single functional analytical instrument. Nuclear waste process streams are particularly challenging for rapid analytical methods due to the complex, high-ionic-strength, caustic brine sample matrix, the presence of interfering radionuclides, and the variable and uncertain speciation of the radionuclides of interest. As a result, matrix modification, speciation control, and separation chemistries are required for use in automated process analyzers. Significant knowledge gaps exist relative to the design of chemistries for such analyzers so that radionuclides can be quantitatively and rapidly separated and analyzed in solutions derived from low-activity waste processing operations. This research is addressing these knowledge gaps in the area of separation science, nuclear detection, and analytical chemistry and instrumentation. The outcome of these investigations will be the knowledge necessary to choose appropriate chemistries for sample matrix modification and analyte speciation control and chemistries for rapid and selective separation and preconcentration of target radionuclides from complex sample matrices. In addition, new approaches for quantification of alpha emitters in solution using solid-state diode detectors, as well as improved instrumentation and signal processing techniques for use with solid-state and scintillation detectors, will be developed. New knowledge of the performance of separation materials, matrix modification and speciation control chemistries, instrument configurations, and quantitative analytical approaches will

  7. Elektronische monitoring van luchtwassers op veehouderijbedrijven = Automated process monitoring and data logging of air scrubbers at animal houses

    NARCIS (Netherlands)

    Melse, R.W.; Franssen, J.C.T.J.

    2010-01-01

    At 6 animal houses, air scrubbers equipped with an automated process monitoring and data logging system were tested. The measured values were successfully stored, but some values, especially the pH and EC of the recirculation water, appeared not to be correct at all times.

  8. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
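
    The purge protocols referenced above require field parameters to stabilize before a sample is considered representative. A minimal sketch of such a stabilization check follows; the parameter names, tolerances, and window size are illustrative, not the USGS-specified values:

```python
# Illustrative stability tolerances per monitored parameter.
STABILITY_CRITERIA = {
    "pH": 0.1,                    # pH units
    "specific_conductance": 5.0,  # uS/cm
    "temperature": 0.2,           # deg C
}

def is_stable(readings, n_consecutive=3):
    """Return True when the last n consecutive readings of every
    parameter vary by no more than that parameter's tolerance.
    `readings` is a list of dicts, one per purge-cycle measurement."""
    if len(readings) < n_consecutive:
        return False
    window = readings[-n_consecutive:]
    for param, tol in STABILITY_CRITERIA.items():
        values = [r[param] for r in window]
        if max(values) - min(values) > tol:
            return False
    return True
```

    In an automated monitor, this predicate would gate the transition from purging to recording a sample value.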

  9. Automated Vehicle Monitoring System

    OpenAIRE

    Wibowo, Agustinus Deddy Arief; Heriansyah, Rudi

    2014-01-01

    An automated vehicle monitoring system is proposed in this paper. The surveillance system is based on image processing techniques such as background subtraction, colour balancing, chain-code-based shape detection, and blob detection. The proposed system detects any human head appearing in the side mirrors. The detected head is tracked and recorded for further action.
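
    Background subtraction, the first technique named above, can be sketched as a pixel-wise thresholded difference between the current frame and a background model; grayscale frames are represented here as nested lists for simplicity, and the threshold value is illustrative:

```python
def background_subtract(frame, background, threshold=30):
    """Return a binary foreground mask: 1 where the absolute difference
    between a frame pixel and the background model exceeds the
    threshold, else 0. This is the simplest form of the technique;
    the paper's full pipeline adds colour balancing and shape analysis."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(row_f, row_b)]
            for row_f, row_b in zip(frame, background)]
```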

  10. 49 CFR 238.445 - Automated monitoring.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Automated monitoring. 238.445 Section 238.445... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor the... limiting the speed of the train. (c) The monitoring system shall be designed with an automatic self-test...

  11. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Jay W. Grate; Timothy A. DeVol

    2006-01-01

    The objectives of our research were to develop the first automated radiochemical process analyzer, including sample pretreatment methodology, and to initiate work on new detection approaches, especially using modified diode detectors.

  12. Automated contamination monitoring for hot particles

    International Nuclear Information System (INIS)

    Johnstone, G.; Case, L.

    1987-01-01

    INS Corp., the largest nuclear laundry company in the United States, has recently developed two types of automated contamination monitoring systems: 1) the Automated Laundry Monitor (ALM), which provides quality assurance monitoring for protective clothing contamination, and 2) a low-level automated monitoring system for the Plastic Volume Reduction Service (PVRS). The presentation details the inaccuracies associated with hand-probe frisking which led to the development of the ALM. The ALM was designed for 100% quality assurance monitoring of garments to the most stringent customer requirements. A review of why the ALM is essential in verifying the absence of hot particles on garments is given. The final topic addresses the expansion of the ALM technology in support of the INS Plastic Volume Reduction Service by monitoring decontaminated plastics to free release levels. This presentation reviews the design and operation of both monitoring systems.

  13. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly, and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.

  14. 49 CFR 238.237 - Automated monitoring.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Automated monitoring. 238.237 Section 238.237 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.237 Automated monitoring. (a) Except as further specified in this paragraph, on or after...

  15. Monitoring system for automation of experimental researches in cutting

    International Nuclear Information System (INIS)

    Kuzinovski, Mikolaj; Trajchevski, Neven; Filipovski, Velimir; Tomov, Mite; Cichosz, Piotr

    2009-01-01

    This study presents the procedures performed when designing and realizing experimental scientific research using an automated measurement system with computer support in all stages of the experiment. Particular emphasis is placed on integration of the measurement system and mathematical processing of the experimental data. The automation processes are described through our own automated monitoring system for research of physical phenomena in the cutting process, with computer-aided data acquisition. The monitoring system is intended for determining the tangential, axial, and radial components of the cutting force, as well as the average temperature in the cutting process. The hardware acquisition part consists of amplifiers and A/D converters, while the software for the PC, used for analysis and visualization, was developed using MS Visual C++. For mathematical description of the investigated physical phenomena, the CADEX software was created, which, in connection with MATLAB, is intended for the design, processing, and analysis of experimental scientific research according to the theory of multi-factorial experiment planning. The design and construction of the interface and the computerized measurement system were done by the Faculty of Mechanical Engineering in Skopje in collaboration with the Faculty of Electrical Engineering and Information Technologies in Skopje and the Institute of Production Engineering and Automation, Wroclaw University of Technology, Poland. Having an in-house scientific research measurement system with free access to its hardware and software parts provides conditions for complete control of the research process and reduction of the measurement uncertainty of the results obtained.

  16. An automated neutron monitor maintenance system

    International Nuclear Information System (INIS)

    Moore, F.S.; Griffin, J.C.; Odell, D.M.C.

    1996-01-01

    Neutron detectors are commonly used by the nuclear materials processing industry to monitor fissile materials in process vessels and tanks. The proper functioning of these neutron monitors must be periodically evaluated. We have developed and placed in routine use a PC-based multichannel analyzer (MCA) system for on-line BF3 and He-3 gas-filled detector function testing. The automated system: 1) acquires spectral data from the monitor system, 2) analyzes the spectrum to determine the detector's functionality, 3) makes suggestions for maintenance or repair, as required, and 4) saves the spectrum and results to disk for review. The operator interface has been designed to be user-friendly and to minimize the training requirements of the user. The system may also be easily customized for various applications.
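
    Step 2 above, judging detector functionality from the acquired spectrum, can be sketched as a pair of rule checks on gross counts and peak position. The thresholds, channel window, and suggested actions below are invented for illustration, not those used by the system described:

```python
def check_detector(spectrum, expected_peak=(40, 70), min_counts=1000):
    """Evaluate a gas-filled neutron detector from its pulse-height
    spectrum (a list of counts per MCA channel).

    Returns a status string: OK, a warning with a maintenance hint,
    or a failure with a repair hint. All limits are illustrative."""
    total = sum(spectrum)
    if total < min_counts:
        return "FAIL: count rate too low - check HV supply and cabling"
    # Locate the channel with the most counts (the spectrum peak).
    peak_channel = max(range(len(spectrum)), key=lambda i: spectrum[i])
    lo, hi = expected_peak
    if not lo <= peak_channel <= hi:
        return "WARN: peak shifted - check gas fill or amplifier gain"
    return "OK"
```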

  17. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  18. AUTOMATED LOW-COST PHOTOGRAMMETRY FOR FLEXIBLE STRUCTURE MONITORING

    Directory of Open Access Journals (Sweden)

    C. H. Wang

    2012-07-01

    Full Text Available Structural monitoring requires instruments which can provide high precision and accuracy, reliable measurements at good temporal resolution, and rapid processing speeds. Long-term campaigns and flexible structures are regarded as two of the most challenging subjects in monitoring engineering structures. Long-term monitoring in civil engineering is generally considered to be labour-intensive and financially expensive, and it can take significant effort to arrange the necessary human resources, transportation, and equipment maintenance. When dealing with flexible structure monitoring, it is of paramount importance that any monitoring equipment used is able to carry out rapid sampling. Low-cost, automated photogrammetric techniques therefore have the potential to become routinely viable for monitoring non-rigid structures. This research aims to provide a photogrammetric solution for long-term flexible structural monitoring purposes. The automated approach was achieved using low-cost imaging devices (mobile phones) to replace traditional image acquisition stations and substantially reduce the equipment costs. A self-programmed software package was developed to deal with the hardware-software integration and system operation. In order to evaluate the performance of this low-cost monitoring system, a shaking table experiment was undertaken. Different network configurations and target sizes were used to determine the best configuration. A large quantity of image data was captured by four DSLR cameras and four mobile phone cameras respectively. These image data were processed using photogrammetric techniques to calculate the final results for the system evaluation.

  19. Migration monitoring with automated technology

    Science.gov (United States)

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  20. Process computers automate CERN power supply installations

    International Nuclear Information System (INIS)

    Ullrich, H.; Martin, A.

    1974-01-01

    Higher standards of performance and reliability in the power plants of large particle accelerators necessitate increasing use of automation. The CERN (European Nuclear Research Centre) in Geneva started to employ process computers for plant automation at an early stage in its history. The great complexity and extent of the plants for high-energy physics first led to the setting-up of decentralized automatic systems which are now being increasingly combined into one interconnected automation system. One of these automatic systems controls and monitors the extensive power supply installations for the main ring magnets in the experimental zones. (orig.)

  1. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    Science.gov (United States)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yuan, Fuh-Gwo

    2016-01-01

    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel (Registered Trademark) IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately a closed-loop process control to maximize composite part quality and consistency.
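
    The time-of-arrival (TOA) metric used above is commonly estimated as the lag that maximizes the cross-correlation between the excitation waveform and the received guided-wave signal. The paper does not specify its TOA estimator, so the following is one standard approach, sketched in pure Python with a direct (unoptimized) correlation adequate for short records:

```python
def cross_correlate(ref, sig):
    """Full cross-correlation of a reference (excitation) waveform
    with a received waveform, computed directly."""
    n, m = len(ref), len(sig)
    out = []
    for lag in range(-(n - 1), m):
        s = 0.0
        for i, r in enumerate(ref):
            j = lag + i
            if 0 <= j < m:
                s += r * sig[j]
        out.append(s)
    return out

def time_of_arrival(ref, sig, dt):
    """Estimate TOA as the lag (in seconds) maximizing the
    cross-correlation; dt is the sampling interval."""
    xc = cross_correlate(ref, sig)
    best = max(range(len(xc)), key=lambda k: xc[k])
    lag = best - (len(ref) - 1)
    return lag * dt
```

    As the laminate cures and wave speed changes, repeated TOA estimates like this one trace out the inverse relationship with degree of cure reported in the abstract.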

  2. Means of storage and automated monitoring of versions of text technical documentation

    Science.gov (United States)

    Leonovets, S. A.; Shukalov, A. V.; Zharinov, I. O.

    2018-03-01

    The paper considers automation of the preparation, storage, and version-control monitoring of textual design and program documentation by means of specialized software. Automated preparation of documentation is based on processing the engineering data contained in the specifications and technical documentation. Data handling assumes strictly structured electronic documents, prepared in widespread formats from templates based on industry standards, from which the program or design text document is generated automatically. The subsequent life cycle of the document, and of the engineering data it contains, is then controlled, with archival data storage at each stage of the life cycle. Performance studies of different widespread document formats under automated monitoring and storage are given. The newly developed software and the workbenches available to the developer of instrumentation equipment are described.

  3. Real-time bioacoustics monitoring and automated species identification

    Directory of Open Access Journals (Sweden)

    T. Mitchell Aide

    2013-07-01

    Full Text Available Traditionally, animal species diversity and abundance are assessed using a variety of methods that are generally costly, limited in space and time, and, most importantly, rarely include a permanent record. Given the urgency of climate change and the loss of habitat, it is vital that we use new technologies to improve and expand global biodiversity monitoring to thousands of sites around the world. In this article, we describe the acoustical component of the Automated Remote Biodiversity Monitoring Network (ARBIMON, a novel combination of hardware and software for automating data acquisition, data management, and species identification based on audio recordings. The major components of the cyberinfrastructure include: a solar-powered remote monitoring station that sends 1-min recordings every 10 min to a base station, which relays the recordings in real-time to the project server, where the recordings are processed and uploaded to the project website (arbimon.net. Along with a module for viewing, listening, and annotating recordings, the website includes a species identification interface to help users create machine learning algorithms to automate species identification. To demonstrate the system, we present data on the vocal activity patterns of birds, frogs, insects, and mammals from Puerto Rico and Costa Rica.
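
    ARBIMON's identifiers are trained machine-learning models, but the underlying signal-processing idea, scoring a recording's energy at a species' characteristic call frequency, can be illustrated with the Goertzel algorithm. This is a deliberately simplified stand-in for the real system; the frequencies and threshold are invented for the sketch:

```python
import math

def goertzel(samples, rate, freq):
    """Power of a single frequency component of `samples`
    (Goertzel algorithm): a cheap alternative to a full FFT when
    only one frequency is of interest."""
    k = 2.0 * math.cos(2.0 * math.pi * freq / rate)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + k * s1 - s2
        s2, s1 = s1, s0
    return s1 * s1 + s2 * s2 - k * s1 * s2

def detect_call(samples, rate, call_freq, threshold):
    """Flag a recording segment when the energy at the (hypothetical)
    call frequency exceeds a threshold."""
    return goertzel(samples, rate, call_freq) > threshold
```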

  4. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management.

  5. Using artificial intelligence to automate remittance processing.

    Science.gov (United States)

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  6. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially for desalting samples before analysis (described in more detail in Automated Desalting Apparatus, (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also help optimize operational time and the use of consumables.
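    The trigger logic described above amounts to a small state machine. The sketch below is illustrative only; the threshold values and state names are assumptions, not taken from the NASA brief:

```python
# Illustrative three-step desalting sequence advanced by outlet
# conductivity readings; thresholds and state names are assumed.

LOW, HIGH = 50.0, 500.0  # assumed thresholds, microsiemens per cm

def next_step(state, conductivity):
    """Advance the protocol based on the latest outlet reading."""
    if state == "neutralize" and conductivity < LOW:
        return "acidify"   # flushed back to low-conductivity baseline
    if state == "acidify" and conductivity > HIGH:
        return "basify"    # strong-acid wash seen at the outlet
    if state == "basify" and conductivity > HIGH:
        return "done"      # high-pH wash seen; sequence complete
    return state           # otherwise keep waiting in the same step

state = "neutralize"
for reading in [800.0, 30.0, 650.0, 700.0]:
    state = next_step(state, reading)
```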

  7. Automated Cryocooler Monitor and Control System Software

    Science.gov (United States)

    Britchcliffe, Michael J.; Conroy, Bruce L.; Anderson, Paul E.; Wilson, Ahmad

    2011-01-01

    This software is used in an automated cryogenic control system developed to monitor and control the operation of small-scale cryocoolers. The system was designed to automate the cryogenically cooled low-noise amplifier system described in "Automated Cryocooler Monitor and Control System" (NPO-47246), NASA Tech Briefs, Vol. 35, No. 5 (May 2011), page 7a. The software contains algorithms necessary to convert non-linear output voltages from the cryogenic diode-type thermometers and vacuum pressure and helium pressure sensors, to temperature and pressure units. The control function algorithms use the monitor data to control the cooler power, vacuum solenoid, vacuum pump, and electrical warm-up heaters. The control algorithms are based on a rule-based system that activates the required device based on the operating mode. The external interface is Web-based. It acts as a Web server, providing pages for monitor, control, and configuration. No client software from the external user is required.
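    The conversion-plus-rules structure described above can be sketched as follows. The calibration table, threshold, and mode name are assumptions for illustration, not values from the JPL software:

```python
from bisect import bisect_left

# Illustrative conversion of a non-linear diode voltage to kelvin by
# linear interpolation, plus one heater rule; the calibration table,
# threshold and mode name are assumptions, not JPL values.

CAL = [(0.5, 300.0), (1.0, 100.0), (1.4, 20.0), (1.6, 4.2)]  # (V, K)

def volts_to_kelvin(v):
    volts = [p[0] for p in CAL]
    i = bisect_left(volts, v)
    if i == 0:
        return CAL[0][1]         # below table: clamp to first point
    if i == len(CAL):
        return CAL[-1][1]        # above table: clamp to last point
    (v0, t0), (v1, t1) = CAL[i - 1], CAL[i]
    return t0 + (t1 - t0) * (v - v0) / (v1 - v0)

def heater_on(temp_k, mode):
    # Rule: drive the warm-up heater only in warm-up mode below 290 K.
    return mode == "warmup" and temp_k < 290.0
```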

  8. Engineering Process Monitoring for Control Room Operation

    OpenAIRE

    Bätz, M

    2001-01-01

    A major challenge in process operation is to reduce costs and increase system efficiency while the complexity of automated process engineering, control and monitoring systems increases continuously. To cope with this challenge, the design, implementation and operation of process monitoring systems for control room operation have to be treated as an ensemble. This is only possible if the engineering of the monitoring information is focused on the production objective and is led in close collaboration of control room teams, exploitation personnel and process specialists. In this paper some principles for the engineering of monitoring information for control room operation are developed using the example of the exploitation of a particle accelerator at the European Laboratory for Nuclear Research (CERN).

  9. Automating the personnel dosimeter monitoring program

    International Nuclear Information System (INIS)

    Compston, M.W.

    1982-12-01

    The personnel dosimetry monitoring program at the Portsmouth uranium enrichment facility has been improved by using thermoluminescent dosimetry to monitor for ionizing radiation exposure, and by automating most of the operations and all of the associated information handling. A thermoluminescent dosimeter (TLD) card, worn by personnel inside their security badges, stores the energy of ionizing radiation. The dosimeters are changed out periodically and loaded 150 cards at a time into an automated reader-processor. The resulting data are recorded and organized into a useful form by software developed for this purpose.

  10. Automated sampling and data processing derived from biomimetic membranes

    International Nuclear Information System (INIS)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H; Boesen, T P; Emneus, J

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  11. Automated sampling and data processing derived from biomimetic membranes

    Energy Technology Data Exchange (ETDEWEB)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H [Aquaporin A/S, Diplomvej 377, DK-2800 Kgs. Lyngby (Denmark); Boesen, T P [Xefion ApS, Kildegaardsvej 8C, DK-2900 Hellerup (Denmark); Emneus, J, E-mail: Claus.Nielsen@fysik.dtu.d [DTU Nanotech, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark)

    2009-12-15

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  12. Automated personal dosimetry monitoring system for NPP

    International Nuclear Information System (INIS)

    Chanyshev, E.; Chechyotkin, N.; Kondratev, A.; Plyshevskaya, D.

    2006-01-01

    Full text: Radiation safety of personnel at nuclear power plants (NPP) is a priority. The degree of radiation exposure of personnel is determined by many factors: NPP design, operation of equipment, organizational management of radiation-hazardous work and, certainly, the safety culture of every employee. An Automated Personal Dosimetry Monitoring System (A.P.D.M.S.) is now applied at all nuclear power plants in Russia to eliminate the possibility of occupational radiation exposure beyond regulated levels under different modes of NPP operation. The A.P.D.M.S. provides individual radiation dose registration. This paper presents the efforts of Design Bureau 'Promengineering' in constructing the software and hardware complex of the A.P.D.M.S. (S.H.W. A.P.D.M.S.) for NPPs with PWRs. The developed complex is intended to automate the activities of the radiation safety department in carrying out individual dosimetry control. The complex covers all main processes concerning individual monitoring of external and internal radiation exposure, as well as dose recording, management, and planning. S.H.W. A.P.D.M.S. is a multi-purpose system whose software was designed on a modular approach, which allows the software to be modified and extended with new components (modules) without changes to other components. This structure makes the system flexible and allows it to be modified if new radiation safety requirements are implemented or the scope of dosimetry monitoring is extended. This makes it possible to add, over time, new kinds of dosimetry control at Russian NPPs in compliance with IAEA recommendations, for instance, control of the equivalent dose rate to the skin and to the lens of the eye. S.H.W. A.P.D.M.S. provides dosimetry control as follows: Current monitoring of external radiation exposure: - Gamma radiation dose measurement using radio-photoluminescent personal dosimeters. - Neutron radiation dose measurement using thermoluminescent personal dosimeters
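    The dose-recording core of such a system reduces to accumulating individual doses and flagging exceedances. This sketch is illustrative; the 20 mSv annual limit and the record layout are assumptions, not taken from the paper:

```python
# Illustrative dose-recording core: accumulate recorded doses per worker
# and flag anyone whose annual total exceeds a regulated level. The
# 20 mSv limit and record layout are assumptions.

ANNUAL_LIMIT_MSV = 20.0

def over_limit(records):
    """records: iterable of (worker_id, dose_mSv) entries."""
    totals = {}
    for worker, dose in records:
        totals[worker] = totals.get(worker, 0.0) + dose
    return sorted(w for w, total in totals.items()
                  if total > ANNUAL_LIMIT_MSV)
```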

  13. SHARP - Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. G.

    1990-01-01

    Briefly discussed here are the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  14. SHARP: Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. Gaius

    1991-01-01

    Briefly discussed here are the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  15. Lifecycle, Iteration, and Process Automation with SMS Gateway

    Directory of Open Access Journals (Sweden)

    Fenny Fenny

    2015-12-01

    Full Text Available Producing a better-quality software system requires an understanding of software quality indicators through defect detection and automated testing. This paper aims to improve the design and automated testing process for an engine water pump in a drinking water plant. It proposes how software developers can improve the maintainability and reliability of an automated testing system and report abnormal states when an error occurs on the machine. The method uses a literature review to explain best practices, together with a case study of a drinking water plant. Furthermore, this paper is expected to provide insight into how to better handle errors and perform automated testing and monitoring on a machine.

  16. Advanced health monitor for automated driving functions

    NARCIS (Netherlands)

    Mikovski Iotov, I.

    2017-01-01

    There is a trend in the automotive domain where driving functions are taken over from the driver by automated driving functions. In order to guarantee the correct behavior of these automated driving functions, the report introduces an Advanced Health Monitor that uses Temporal Logic and Probabilistic Analysis to indicate the system's health.

  17. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  18. Engineering Process Monitoring for Control Room Operation

    CERN Document Server

    Bätz, M

    2001-01-01

    A major challenge in process operation is to reduce costs and increase system efficiency whereas the complexity of automated process engineering, control and monitoring systems increases continuously. To cope with this challenge the design, implementation and operation of process monitoring systems for control room operation have to be treated as an ensemble. This is only possible if the engineering of the monitoring information is focused on the production objective and is lead in close collaboration of control room teams, exploitation personnel and process specialists. In this paper some principles for the engineering of monitoring information for control room operation are developed at the example of the exploitation of a particle accelerator at the European Laboratory for Nuclear Research (CERN).

  19. Advanced health monitor for automated driving functions

    OpenAIRE

    Mikovski Iotov, I.

    2017-01-01

    There is a trend in the automotive domain where driving functions are taken from the driver by automated driving functions. In order to guarantee the correct behavior of these automated driving functions, the report introduces an Advanced Health Monitor that uses Temporal Logic and Probabilistic Analysis to indicate the system’s health.
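    A minimal flavor of a temporal-logic health check can be sketched as below; the properties, trace fields, and bounds are hypothetical and far simpler than the formalism the report describes:

```python
# Illustrative bounded temporal-logic checks over a recorded trace;
# the properties and trace fields are hypothetical.

def always(trace, pred):
    """Property holds at every sample of the trace."""
    return all(pred(s) for s in trace)

def eventually(trace, pred, within):
    """Property holds at some sample among the first `within`."""
    return any(pred(s) for s in trace[:within])

trace = [{"err": 0.1, "status": "init"},
         {"err": 0.2, "status": "active"},
         {"err": 0.3, "status": "active"}]

# Hypothetical requirement: lateral error stays below 0.5 m and the
# function reports "active" within the first 3 samples.
healthy = (always(trace, lambda s: s["err"] < 0.5)
           and eventually(trace, lambda s: s["status"] == "active", 3))
```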

  20. Unattended reaction monitoring using an automated microfluidic sampler and on-line liquid chromatography.

    Science.gov (United States)

    Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve

    2018-04-03

    In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, the sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, there can be variability introduced by different analysts, and in some cases, the integrity of the sample can be compromised during handling. While there are commercial instruments available for on-line monitoring with HPLC, they lack capabilities in many key areas. Some do not provide integration of the sampling and analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations available for sample processing and no option for workflow customizability. This work describes development of a microfluidic automated program (MAP) which fully automates the sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel based user interface. The autonomous system is capable of unattended reaction monitoring that allows flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.
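    The workflow-customization idea, a user-defined sequence of unit operations applied to each sample, can be sketched as below. The operation names and sample fields are hypothetical, not MAP's actual interface:

```python
# Illustrative workflow customization: a monitoring workflow is a list
# of unit operations applied in order to each sample. Operation names
# and sample fields are hypothetical.

def extract(sample):
    sample["extracted"] = True
    return sample

def dilute(factor):
    def op(sample):
        sample["conc"] /= factor
        return sample
    return op

def inject(sample):
    sample["injected"] = True
    return sample

def run_workflow(sample, operations):
    for op in operations:
        sample = op(sample)
    return sample

workflow = [extract, dilute(10.0), inject]     # customizable per experiment
result = run_workflow({"conc": 50.0}, workflow)
```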

  1. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    International Nuclear Information System (INIS)

    Brown, Aaron W.; Simone, Paul S.; York, J.C.; Emmert, Gary L.

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01–0.04 μg L⁻¹. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap, followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01–0.04 μg L⁻¹. Mean percent recoveries ranged from 77.1 to 86.5%, with percent relative standard deviation values ranging from 1.2 to 4.6%. Out of more than 5200 samples analyzed, 95% of the concentrations were detectable and 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.
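    The detectable/quantifiable bookkeeping reported above can be sketched as a simple classification against detection and quantitation limits; the MDL and LOQ values below are assumptions for illustration:

```python
# Illustrative classification of readings against assumed detection and
# quantitation limits (values invented for illustration).

MDL, LOQ = 0.04, 0.12  # micrograms per liter; assumed

def classify(conc):
    if conc < MDL:
        return "non-detect"
    if conc < LOQ:
        return "detectable"    # seen, but below reliable quantitation
    return "quantifiable"
```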

  2. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Aaron W. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Simone, Paul S. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States); York, J.C. [City of Lebanon, TN Water Treatment Plant, 7 Gilmore Hill Rd., Lebanon, TN 37087 (United States); Emmert, Gary L., E-mail: gemmert@memphis.edu [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States)

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01–0.04 μg L⁻¹. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap, followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01–0.04 μg L⁻¹. Mean percent recoveries ranged from 77.1 to 86.5%, with percent relative standard deviation values ranging from 1.2 to 4.6%. Out of more than 5200 samples analyzed, 95% of the concentrations were detectable and 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.

  3. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    Science.gov (United States)

    Yip, Hon Ming; Li, John C. S.; Cui, Xin; Gao, Qiannan; Leung, Chi Chiu

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities. PMID:25133248
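    The realignment strategy, locating an alignment mark and computing the offset back to its preset position, can be sketched with brute-force template matching; this is an illustrative stand-in for the authors' image processing algorithms:

```python
# Illustrative brute-force template matching: find the alignment mark in
# a small grayscale frame, then compute the stage offset that restores
# it to its preset position.

def find_mark(frame, template):
    """Return (row, col) of the best match by sum of squared differences."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            ssd = sum((frame[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

def stage_offset(found, preset):
    """Row/column shift needed to move the mark back to its preset spot."""
    return (preset[0] - found[0], preset[1] - found[1])

frame = [[0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0],
         [0, 9, 9, 0, 0],
         [0, 9, 9, 0, 0],
         [0, 0, 0, 0, 0]]
mark = [[9, 9],
        [9, 9]]
found = find_mark(frame, mark)          # locate the mark in this frame
offset = stage_offset(found, (1, 1))    # preset position assumed at (1, 1)
```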

  4. Automated long-term monitoring of parallel microfluidic operations applying a machine vision-assisted positioning method.

    Science.gov (United States)

    Yip, Hon Ming; Li, John C S; Xie, Kai; Cui, Xin; Prasad, Agrim; Gao, Qiannan; Leung, Chi Chiu; Lam, Raymond H W

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities.

  5. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    Directory of Open Access Journals (Sweden)

    Hon Ming Yip

    2014-01-01

    Full Text Available As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities.

  6. Automated biomonitoring: living sensors as environmental monitors

    National Research Council Canada - National Science Library

    Gruber, D; Diamond, J

    1988-01-01

    Water quality continues to present problems of global concern and has resulted in greatly increased use of automated biological systems in monitoring drinking water, industrial effluents and wastewater...

  7. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  8. Automated Method for Monitoring Water Quality Using Landsat Imagery

    Directory of Open Access Journals (Sweden)

    D. Clay Barrett

    2016-06-01

    Full Text Available Regular monitoring of water quality is increasingly necessary to keep pace with rapid environmental change and protect human health and well-being. Remote sensing has been suggested as a potential solution for monitoring certain water quality parameters without the need for in situ sampling, but universal methods and tools are lacking. While many studies have developed predictive relationships between remotely sensed surface reflectance and water parameters, these relationships are often unique to a particular geographic region and have little applicability in other areas. In order to remotely monitor water quality, these relationships must be developed on a region-by-region basis. This paper presents an automated method for processing remotely sensed images from the Landsat Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+) and extracting corrected reflectance measurements around known sample locations to allow rapid development of predictive water quality relationships and improve remote monitoring. Using open Python scripting, this study (1) provides an openly accessible and simple method for processing publicly available remote sensing data; and (2) allows determination of relationships between sampled water quality parameters and reflectance values to ultimately allow predictive monitoring. The method is demonstrated through a case study of the Ozark/Ouachita-Appalachian ecoregion in eastern Oklahoma using data collected for the Beneficial Use Monitoring Program (BUMP).
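    The end goal such a pipeline enables, a region-specific predictive relationship, can be sketched as an ordinary least squares fit; the reflectance and turbidity values below are invented for illustration and are not BUMP data:

```python
# Illustrative region-specific fit: ordinary least squares between
# corrected band reflectance and a sampled parameter. Data invented.

def fit_line(x, y):
    """Fit y = a + b*x by ordinary least squares; return (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

reflectance = [0.10, 0.20, 0.30, 0.40]  # hypothetical corrected band values
turbidity = [2.0, 4.0, 6.0, 8.0]        # hypothetical in situ samples (NTU)
a, b = fit_line(reflectance, turbidity)
```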

  9. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, methodology, work performed during the summer, and planned future work.

  10. Automation of technical specification monitoring for nuclear power plants

    International Nuclear Information System (INIS)

    Lin, J.C.; Abbott, E.C.; Hubbard, F.R.

    1986-01-01

    The complexity of today's nuclear power plants, combined with an equally detailed regulatory process, makes it necessary for the plant staff to have access to an automated system capable of monitoring the status of limiting conditions for operation (LCO). Pickard, Lowe and Garrick, Inc. (PLG), has developed the first such system, called the Limiting Conditions for Operation Monitor (LIMCOM). LIMCOM provides members of the operating staff with an up-to-date comparison of currently operable equipment and plant operating conditions with what is required in the technical specifications. LIMCOM also provides an effective method of screening tagout requests by evaluating their impact on the LCOs. Finally, LIMCOM provides an accurate method of tracking and scheduling routine surveillance. (author)

  11. Miniature multichannel analyzer for process monitoring

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.; Russo, P.A.; Sprinkle, J.K. Jr.; Stephens, M.M.; Wiig, L.G.; Ianakiev, K.D.

    1993-01-01

    A new 4,000-channel analyzer has been developed for gamma-ray spectroscopy applications. A design philosophy of hardware and software building blocks has been combined with design goals of simplicity, compactness, portability, and reliability. The result is a miniature, modular multichannel analyzer (MMMCA), which offers a solution to a variety of nondestructive assay (NDA) needs in many areas of general application, independent of computer platform or operating system. Detector-signal analog electronics, the bias supply, and batteries are included in the virtually pocket-size, low-power MMMCA unit. The MMMCA features digital setup and control, automated data reduction, and automated quality assurance. Areas of current NDA application include on-line continuous (process) monitoring, process material holdup measurements, and field inspections.
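    The central MCA operation, binning pulse heights into channels, can be sketched as below; the 4,000-channel count follows the record, while the ADC full-scale range is an assumption:

```python
# Illustrative pulse-height binning for a 4,000-channel analyzer; the
# 0-4 V ADC full-scale range is an assumption.

CHANNELS, FULL_SCALE = 4000, 4.0

def histogram(pulse_heights):
    """Bin pulse amplitudes (volts) into CHANNELS equal-width channels."""
    counts = [0] * CHANNELS
    for v in pulse_heights:
        ch = min(int(v / FULL_SCALE * CHANNELS), CHANNELS - 1)
        counts[ch] += 1   # overrange pulses land in the top channel
    return counts

counts = histogram([0.0, 1.0, 1.0005, 3.9999, 5.0])
```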

  12. Tools for automated acoustic monitoring within the R package monitoR

    DEFF Research Database (Denmark)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.

  13. Tools for automated acoustic monitoring within the R package monitoR

    Science.gov (United States)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.
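The spectrogram cross-correlation detector mentioned in the abstract can be illustrated with a minimal numpy sketch. monitoR itself is an R package (its template detectors are corMatch and binMatch); the Python function and variable names below are illustrative only, not monitoR's API:

```python
import numpy as np

def xcorr_detect(survey_spec, template, threshold=0.8):
    """Slide a template over a survey spectrogram (freq x time bins) and
    return the normalized cross-correlation score at each time offset,
    plus the offsets where the score exceeds the detection threshold."""
    n_f, n_t = template.shape
    t_flat = (template - template.mean()) / template.std()
    scores = []
    for start in range(survey_spec.shape[1] - n_t + 1):
        window = survey_spec[:n_f, start:start + n_t]
        # Standardize the window so the score is a correlation in [-1, 1]
        w_flat = (window - window.mean()) / (window.std() + 1e-12)
        scores.append(float((t_flat * w_flat).mean()))
    scores = np.array(scores)
    return scores, np.flatnonzero(scores >= threshold)
```

A template embedded in a noisy survey spectrogram scores near 1.0 at its true offset and near 0 elsewhere, which is the basis for thresholded detection.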

  14. ERP processes automation in corporate environments

    OpenAIRE

    Antonoaie Victor; Irimeş Adrian; Chicoş Lucia-Antoneta

    2017-01-01

The automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic Automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation, its benefits such as security, ease of use, reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and meaningful impact was obtained.

  15. Monitoring and control of fine abrasive finishing processes

    DEFF Research Database (Denmark)

    Lazarev, Ruslan

In engineering, surfaces with specified functional properties are in high demand in various applications, and the desired surface finish can be obtained using several methods. Abrasive finishing is one of the most important processes in the manufacturing of mould and die tools. It is a principal method to remove unwanted material and to obtain the desired geometry, surface quality and surface functional properties. The automation and computerization of finishing processes involves the utilisation of robots, specialized machines with several degrees of freedom, sensors and data acquisition systems. The focus of this work was to investigate foundations for process monitoring and control methods as applied to a semi-automated polishing machine based on an industrial robot. The monitoring system was built on an NI data acquisition system with two sensors, an acoustic emission sensor and an accelerometer. The acquired sensory data was segmented using discretization methods. The applied methodology was proposed for implementation as an on-line system and is considered to be a part of the next generation of the STRECON NanoRAP machine.

  16. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives

    DEFF Research Database (Denmark)

    Podevin, Michael Paul Ambrose; Fotidis, Ioannis; Angelidaki, Irini

    2018-01-01

The contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity – the capability of monitoring several analytes simultaneously – in the interest of improving product quality, productivity, and process automation in microalgae production, towards multivariate process control (MVPC) and software sensors trained on "big data". The paper also includes a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation, to aid and motivate development.

  17. The Automator: Intelligent control system monitoring

    International Nuclear Information System (INIS)

    M. Bickley; D.A. Bryan; K.S. White

    1999-01-01

A large-scale control system may contain several hundred thousand control points which must be monitored to ensure smooth operation. Knowledge of the current state of such a system is often implicit in the values of these points, and operators must be cognizant of the state while making decisions. Repetitive operations requiring human intervention lead to fatigue, which can in turn lead to mistakes. The authors propose a tool called the Automator based on a middleware software server. This tool would provide a user-configurable engine for monitoring control points. Based on the status of these control points, a specified action could be taken. The action could range from setting another control point, to triggering an alarm, to running an executable. Often the data presented by a system is meaningless without context information from other channels. Such a tool could be configured to present interpreted information based on the values of other channels. Additionally, this tool could translate numerous values in a non-friendly form (such as numbers, bits, or return codes) into meaningful strings of information. Multiple instances of this server could be run, allowing individuals or groups to configure their own Automators. The configuration of the tool will be file-based. In the future, these files could be generated by graphical design tools, allowing for rapid development of new configurations. In addition, the server will be able to explicitly maintain information about the state of the control system. This state information can be used in decision-making processes and shared with other applications. A conceptual framework and software design for the tool are presented.
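The engine described — watch control points, evaluate a condition on each, and take an action ranging from setting another point to raising an alarm — can be sketched as a small table-driven monitor. This is a hedged illustration; all names are hypothetical, not the Automator's actual interface:

```python
# Minimal table-driven monitor: each rule watches one channel and fires
# an action when its predicate on the channel's value is satisfied.
# All names here are illustrative, not the Automator's real API.

def run_rules(channels, rules):
    """channels: dict name -> value; rules: list of (name, predicate, action).
    Returns the results of actions for every rule that fired."""
    fired = []
    for name, predicate, action in rules:
        value = channels.get(name)
        if value is not None and predicate(value):
            fired.append(action(name, value))
    return fired
```

A rule's action can translate a raw value into a meaningful string (the "context" role described above), and rules for channels absent from the snapshot are simply skipped.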

  18. Monitoring activities of satellite data processing services in real-time with SDDS Live Monitor

    Science.gov (United States)

    Duc Nguyen, Minh

    2017-10-01

This work describes Live Monitor, the monitoring subsystem of SDDS - an automated system for space experiment data processing, storage, and distribution created at SINP MSU. Live Monitor allows operators and developers of satellite data centers to quickly identify errors that occur in data processing and to prevent further consequences caused by those errors. All activities of the whole data processing cycle are illustrated via a web interface in real time. Notification messages are delivered to the responsible people via email and the Telegram messenger service. The flexible monitoring mechanism implemented in Live Monitor allows us to dynamically change and control the events shown on the web interface on demand. Physicists whose space weather analysis models run on satellite data provided by SDDS can use the developed RESTful API to monitor their own events and deliver customized notification messages to suit their needs.

  19. Monitoring and control of the Rossendorf research reactor using a microcomputerized automation system

    International Nuclear Information System (INIS)

    Ba weg, F.; Enkelmann, W.; Klebau, J.

    1982-01-01

A decentralized hierarchical information system (HIS) is presented, which has been developed for monitoring and control of the Rossendorf Research Reactor RFR, but which may also be considered a prototype of a digital automation system (AS) for use in power stations. The functions integrated in the HIS are as follows: process monitoring, process control, and use of a specialized industrial robot to control charging and discharging of the materials to be irradiated. The AS is realized on the basis of the process computer system PRA 30 (A 6492) developed in the GDR, comprising a K 1630 computer and intelligent ursadat 5000 process terminals connected by a fast serial interface (IFLS). (author)

  20. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
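The spike-addition calibration described above reduces to simple arithmetic: the count-rate increase produced by a known added activity gives the overall measurement efficiency, which then converts a sample's net count rate into a concentration. A sketch under assumed units (counts/s, Bq, mL), with illustrative names:

```python
def efficiency_from_spike(net_cps_sample, net_cps_spiked, spike_bq):
    """Overall measurement efficiency (counts/s per Bq) from a spike
    addition: the count-rate increase divided by the added activity."""
    return (net_cps_spiked - net_cps_sample) / spike_bq

def activity_conc(net_cps_sample, eff, volume_ml):
    """Sample activity concentration (Bq/mL) using that efficiency."""
    return net_cps_sample / eff / volume_ml
```

Because the efficiency is re-derived with each spike, a drifting value also serves as the self-diagnostic parameter the abstract mentions.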

  1. ERP processes automation in corporate environments

    Directory of Open Access Journals (Sweden)

    Antonoaie Victor

    2017-01-01

Full Text Available The automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic Automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation, its benefits such as security, ease of use, reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and meaningful impact was obtained.

  2. Automated processing of endoscopic surgical instruments.

    Science.gov (United States)

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  3. Monitoring activities of satellite data processing services in real-time with SDDS Live Monitor

    Directory of Open Access Journals (Sweden)

    Duc Nguyen Minh

    2017-01-01

Full Text Available This work describes Live Monitor, the monitoring subsystem of SDDS – an automated system for space experiment data processing, storage, and distribution created at SINP MSU. Live Monitor allows operators and developers of satellite data centers to quickly identify errors that occur in data processing and to prevent further consequences caused by those errors. All activities of the whole data processing cycle are illustrated via a web interface in real time. Notification messages are delivered to the responsible people via email and the Telegram messenger service. The flexible monitoring mechanism implemented in Live Monitor allows us to dynamically change and control the events shown on the web interface on demand. Physicists whose space weather analysis models run on satellite data provided by SDDS can use the developed RESTful API to monitor their own events and deliver customized notification messages to suit their needs.

  4. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline, without any manual operation. Processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiment via a Web browser. (author)

  5. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness

    Science.gov (United States)

    Singh, Preetpal

Canada is home to thousands of freshwater lakes and rivers. Apart from being sources of infinite natural beauty, rivers and lakes are an important source of water, food and transportation. The northern hemisphere of Canada experiences extreme cold temperatures in the winter, resulting in a freeze-up of regional lakes and rivers. Frozen lakes and rivers tend to offer unique opportunities in terms of wildlife harvesting and winter transportation. Ice roads built on frozen rivers and lakes are vital supply lines for industrial operations in the remote north. Monitoring the ice freeze-up and break-up dates annually can help predict regional climatic changes. Lake ice impacts a variety of physical, ecological and economic processes. The construction and maintenance of a winter road can cost millions of dollars annually. A good understanding of ice mechanics is required to build and deem an ice road safe. A crucial factor in calculating the load-bearing capacity of ice sheets is the thickness of the ice. Construction costs are mainly attributed to producing and maintaining a specific thickness and density of ice that can support different loads. Climate change is leading to warmer temperatures, causing the ice to thin faster. At a certain point, a winter road may not be thick enough to support travel and transportation. There is considerable interest in monitoring winter road conditions given the high construction and maintenance costs involved. Remote sensing technologies such as Synthetic Aperture Radar have been successfully utilized to study the extent of ice covers and record freeze-up and break-up dates of ice on lakes and rivers across the north. Ice road builders often use ultrasound equipment to measure ice thickness. However, an automated monitoring system, based on machine vision and image processing technology, which can measure ice thickness on lakes has not yet been developed. Machine vision and image processing techniques have successfully been used in manufacturing

  6. Automated wireless monitoring system for cable tension using smart sensors

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jongwoong; Cho, Soojin; Spencer, Billie F.; Yun, Chung-Bang

    2013-04-01

    Cables are critical load carrying members of cable-stayed bridges; monitoring tension forces of the cables provides valuable information for SHM of the cable-stayed bridges. Monitoring systems for the cable tension can be efficiently realized using wireless smart sensors in conjunction with vibration-based cable tension estimation approaches. This study develops an automated cable tension monitoring system using MEMSIC's Imote2 smart sensors. An embedded data processing strategy is implemented on the Imote2-based wireless sensor network to calculate cable tensions using a vibration-based method, significantly reducing the wireless data transmission and associated power consumption. The autonomous operation of the monitoring system is achieved by AutoMonitor, a high-level coordinator application provided by the Illinois SHM Project Services Toolsuite. The monitoring system also features power harvesting enabled by solar panels attached to each sensor node and AutoMonitor for charging control. The proposed wireless system has been deployed on the Jindo Bridge, a cable-stayed bridge located in South Korea. Tension forces are autonomously monitored for 12 cables in the east, land side of the bridge, proving the validity and potential of the presented tension monitoring system for real-world applications.
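Vibration-based cable tension estimation commonly uses the taut-string relation T = 4 m L² (f_n / n)², where m is the cable's mass per unit length, L its free length, and f_n its n-th measured natural frequency. The sketch below illustrates that relation; the deployed Imote2 system's exact embedded estimator may differ (e.g., corrections for sag and bending stiffness), and the numbers in the usage note are invented:

```python
def cable_tension(mass_per_len, length, freq_hz, mode=1):
    """Taut-string estimate of cable tension (N) from the n-th measured
    natural frequency: T = 4 * m * L**2 * (f_n / n)**2.
    mass_per_len in kg/m, length in m, freq_hz in Hz."""
    return 4.0 * mass_per_len * length ** 2 * (freq_hz / mode) ** 2
```

For an (invented) 100 m cable of 50 kg/m with a 1.2 Hz fundamental, this gives T = 4 · 50 · 100² · 1.2² = 2.88 MN; a higher mode with f_n = n · f_1 yields the same tension, which is one consistency check such systems can apply.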

  7. Automated System of Diagnostic Monitoring at Bureya HPP Hydraulic Engineering Installations: a New Level of Safety

    Energy Technology Data Exchange (ETDEWEB)

    Musyurka, A. V., E-mail: musyurkaav@burges.rushydro.ru [Bureya HPP (a JSC RusGidro affiliate) (Russian Federation)

    2016-09-15

    This article presents the design, hardware, and software solutions developed and placed in service for the automated system of diagnostic monitoring (ASDM) for hydraulic engineering installations at the Bureya HPP, and assuring a reliable process for monitoring hydraulic engineering installations. Project implementation represents a timely solution of problems addressed by the hydraulic engineering installation diagnostics section.

  8. Automated System of Diagnostic Monitoring at Bureya HPP Hydraulic Engineering Installations: a New Level of Safety

    International Nuclear Information System (INIS)

    Musyurka, A. V.

    2016-01-01

    This article presents the design, hardware, and software solutions developed and placed in service for the automated system of diagnostic monitoring (ASDM) for hydraulic engineering installations at the Bureya HPP, and assuring a reliable process for monitoring hydraulic engineering installations. Project implementation represents a timely solution of problems addressed by the hydraulic engineering installation diagnostics section.

  9. Multivariate process monitoring of EAFs

    Energy Technology Data Exchange (ETDEWEB)

    Sandberg, E.; Lennox, B.; Marjanovic, O.; Smith, K.

    2005-06-01

    Improved knowledge of the effect of scrap grades on the electric steelmaking process and optimised scrap loading practices increase the potential for process automation. As part of an ongoing programme, process data from four Scandinavian EAFs have been analysed, using the multivariate process monitoring approach, to develop predictive models for end point conditions such as chemical composition, yield and energy consumption. The models developed generally predict final Cr, Ni and Mo and tramp element contents well, but electrical energy consumption, yield and content of oxidisable and impurity elements (C, Si, Mn, P, S) are at present more difficult to predict. Potential scrap management applications of the prediction models are also presented. (author)
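As an illustration of the end-point prediction idea, the sketch below fits a multivariate linear model from process variables (e.g., scrap mix fractions, energy input) to a final value such as Cr content. The cited work used multivariate process monitoring methods, likely latent-variable techniques such as PLS, so ordinary least squares here is only a stand-in with invented variables:

```python
import numpy as np

def fit_endpoint_model(X, y):
    """Ordinary least squares with intercept: X is (n_heats, n_vars) of
    process data, y the end-point quantity measured for each heat."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, x_new):
    """End-point prediction for a new heat's process variables."""
    return coef[0] + np.dot(coef[1:], x_new)
```

On noise-free synthetic data the fit recovers the generating coefficients exactly; on real heats the residual spread indicates which end points (composition vs. energy consumption) are predictable, mirroring the paper's findings.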

  10. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.

  11. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    Science.gov (United States)

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization and finally investigated this unit operation in more detail by a multivariate approach. A combination of HPLC and automated data analysis provides a valuable, novel tool to monitor and evaluate cell disruption processes. Our methodology, which can be used both in upstream processing (USP) and downstream processing (DSP), is valuable for evaluating cell disruption processes because it can be implemented at-line, gives results within minutes after sampling and does not need manual intervention.

  12. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    Science.gov (United States)

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel TL using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.
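The photogrammetric length measurement described above amounts to scaling a pixel distance by a reference object of known real size in the same image plane. A hedged sketch (coordinates and the reference size are invented; real photogrammetry would also correct for lens distortion and off-plane eels):

```python
import math

def px_dist(p, q):
    """Euclidean distance between two (x, y) pixel coordinates."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def total_length_mm(snout, tail, ref_a, ref_b, ref_mm):
    """Photogrammetric total length: the snout-to-tail pixel distance
    scaled by a reference object of known size (ref_mm) in the image."""
    scale = ref_mm / px_dist(ref_a, ref_b)  # mm per pixel
    return px_dist(snout, tail) * scale
```

With a 500 mm reference spanning 1000 px, an eel spanning 1000 px in the same frame measures 500 mm, independent of its orientation in the image.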

  13. Automated extinction monitor for the NLOT site survey

    Science.gov (United States)

    Kumar Sharma, Tarun

In order to search for a few potential sites for the National Large Optical Telescope (NLOT) project, we have initiated a site survey program. Since most of the instruments used for the site survey are custom made, we also started developing our own site characterization instruments. In this process we have designed and developed a device called the Automated Extinction Monitor (AEM) and installed it at IAO, Hanle. The AEM is a small wide-field robotic telescope dedicated to recording atmospheric extinction in one or more photometric bands. It gives very accurate statistics of the distribution of photometric nights. In addition, the instrument also provides measurements of sky brightness. Here we briefly describe the overall instrument and initial results obtained.

  14. Office automation: a look beyond word processing

    OpenAIRE

    DuBois, Milan Ephriam, Jr.

    1983-01-01

    Approved for public release; distribution is unlimited Word processing was the first of various forms of office automation technologies to gain widespread acceptance and usability in the business world. For many, it remains the only form of office automation technology. Office automation, however, is not just word processing, although it does include the function of facilitating and manipulating text. In reality, office automation is not one innovation, or one office system, or one tech...

  15. An automated, self-verifying system for monitoring uranium in effluent streams

    International Nuclear Information System (INIS)

    Reda, R.J.; Pickett, J.L.

    1992-01-01

In nuclear facilities such as nuclear fuel fabrication plants, a constant vigil is required to ensure that the concentrations of uranium in process or waste streams do not exceed required specifications. The specifications may be dictated by the process owner, a regulatory agency such as the U.S. Nuclear Regulatory Commission or the Environmental Protection Agency, or by criticality safety engineering criteria. Traditionally, uranium monitoring in effluent streams has been accomplished by taking periodic samples of the liquid stream and determining the concentration by chemical analysis. Despite its accuracy, chemical sampling is not timely enough for practical use in continuously flowing systems because of the possibility that a significant quantity of uranium may be discharged between sampling intervals. To completely satisfy regulatory standards, the liquid waste stream must be monitored for uranium on a 100% basis. To this end, an automated, radioisotopic liquid-waste monitoring system was developed by GE Nuclear Energy as an integral part of the uranium conversion and waste recovery operations. The system utilizes passive gamma-ray spectroscopy and is thus a robust, on-line, and nondestructive assay for uranium. The system provides uranium concentration data for process monitoring and assures regulatory compliance for criticality safety. A summary of the principles of system operation, calibration, and verification is presented in this paper

  16. Proof-of-concept automation of propellant processing

    Science.gov (United States)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production have some form of human interaction. A mixer was acquired to help perform the tasks of automation. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verify the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that the mixing process itself will be automated. This is a concept demonstration task, proving that propellant production can be automated reliably.

  17. Automation in a material processing/storage facility

    International Nuclear Information System (INIS)

    Peterson, K.; Gordon, J.

    1997-01-01

    The Savannah River Site (SRS) is currently developing a new facility, the Actinide Packaging and Storage Facility (APSF), to process and store legacy materials from the United States nuclear stockpile. A variety of materials, with a variety of properties, packaging and handling/storage requirements, will be processed and stored at the facility. Since these materials are hazardous and radioactive, automation will be used to minimize worker exposure. Other benefits derived from automation of the facility include increased throughput capacity and enhanced security. The diversity of materials and packaging geometries to be handled poses challenges to the automation of facility processes. In addition, the nature of the materials to be processed underscores the need for safety, reliability and serviceability. The application of automation in this facility must, therefore, be accomplished in a rational and disciplined manner to satisfy the strict operational requirements of the facility. Among the functions to be automated are the transport of containers between process and storage areas via an Automatic Guided Vehicle (AGV), and various processes in the Shipping Package Unpackaging (SPU) area, the Accountability Measurements (AM) area, the Special Isotope Storage (SIS) vault and the Special Nuclear Materials (SNM) vault. Other areas of the facility are also being automated, but are outside the scope of this paper

  18. Automated Groundwater Monitoring of Uranium at the Hanford Site, Washington - 13116

    Energy Technology Data Exchange (ETDEWEB)

Burge, Scott R. [Burge Environmental, Inc., 6100 South Maple Avenue, no. 114, Tempe, AZ, 85283 (United States); O'Hara, Matthew J. [Pacific Northwest National Laboratory, 902 Battelle Blvd., Richland, WA, 99352 (United States)

    2013-07-01

    An automated groundwater monitoring system for the detection of uranyl ion in groundwater was deployed at the 300 Area Industrial Complex, Hanford Site, Washington. The research was conducted to determine if at-site, automated monitoring of contaminant movement in the subsurface is a viable alternative to the baseline manual sampling and analytical laboratory assay methods currently employed. The monitoring system used Arsenazo III, a colorimetric chelating compound, for the detection of the uranyl ion. The analytical system had a limit of quantification of approximately 10 parts per billion (ppb, μg/L). The EPA's drinking water maximum contaminant level (MCL) is 30 ppb [1]. In addition to the uranyl ion assay, the system was capable of acquiring temperature, conductivity, and river level data. The system was fully automated and could be operated remotely. The system was capable of collecting water samples from four sampling sources, quantifying the uranyl ion, and periodically performing a calibration of the analytical cell. The system communications were accomplished by way of cellular data link with the information transmitted through the internet. Four water sample sources were selected for the investigation: one location provided samples of Columbia River water, and the remaining three sources provided groundwater from aquifer sampling tubes positioned in a vertical array at the Columbia River shoreline. The typical sampling schedule was to sample the four locations twice per day with one calibration check per day. This paper outlines the instrumentation employed, the operation of the instrumentation, and analytical results for a period of time between July and August, 2012. The presentation includes the uranyl ion concentration and conductivity results from the automated sampling/analysis system, along with a comparison between the automated monitor's analytical performance and an independent laboratory analysis. Benefits of using the automated

  19. Automated Groundwater Monitoring of Uranium at the Hanford Site, Washington - 13116

    International Nuclear Information System (INIS)

    Burge, Scott R.; O'Hara, Matthew J.

    2013-01-01

    An automated groundwater monitoring system for the detection of uranyl ion in groundwater was deployed at the 300 Area Industrial Complex, Hanford Site, Washington. The research was conducted to determine if at-site, automated monitoring of contaminant movement in the subsurface is a viable alternative to the baseline manual sampling and analytical laboratory assay methods currently employed. The monitoring system used Arsenazo III, a colorimetric chelating compound, for the detection of the uranyl ion. The analytical system had a limit of quantification of approximately 10 parts per billion (ppb, μg/L). The EPA's drinking water maximum contaminant level (MCL) is 30 ppb [1]. In addition to the uranyl ion assay, the system was capable of acquiring temperature, conductivity, and river level data. The system was fully automated and could be operated remotely. The system was capable of collecting water samples from four sampling sources, quantifying the uranyl ion, and periodically performing a calibration of the analytical cell. The system communications were accomplished by way of cellular data link with the information transmitted through the internet. Four water sample sources were selected for the investigation: one location provided samples of Columbia River water, and the remaining three sources provided groundwater from aquifer sampling tubes positioned in a vertical array at the Columbia River shoreline. The typical sampling schedule was to sample the four locations twice per day with one calibration check per day. This paper outlines the instrumentation employed, the operation of the instrumentation, and analytical results for a period of time between July and August, 2012. The presentation includes the uranyl ion concentration and conductivity results from the automated sampling/analysis system, along with a comparison between the automated monitor's analytical performance and an independent laboratory analysis. 
Benefits of using the automated system as an
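The assay described in this record converts a colorimetric (Arsenazo III) reading into a uranyl concentration via a calibration curve and compares it against the 30 ppb MCL. A minimal sketch of that calibration-and-flagging step; all absorbance values below are hypothetical illustrations, not data from the deployed system:

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

MCL_PPB = 30.0  # EPA drinking-water maximum contaminant level for uranium

# Calibration standards: known uranyl concentrations (ppb) vs. measured
# absorbance. The absorbance values are hypothetical.
std_conc = [0.0, 10.0, 25.0, 50.0, 100.0]
std_abs = [0.002, 0.051, 0.124, 0.249, 0.501]

slope, intercept = fit_linear(std_conc, std_abs)

def absorbance_to_ppb(absorbance):
    return (absorbance - intercept) / slope

def check_sample(absorbance):
    """Return (concentration in ppb, whether it exceeds the MCL)."""
    conc = absorbance_to_ppb(absorbance)
    return conc, conc > MCL_PPB

conc, exceeds = check_sample(0.180)
print(f"{conc:.1f} ppb, exceeds MCL: {exceeds}")
```

A real deployment would also rerun the calibration periodically, as the record notes the system did once per day.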

  20. G-Cloud Monitor: A Cloud Monitoring System for Factory Automation for Sustainable Green Computing

    Directory of Open Access Journals (Sweden)

    Hwa-Young Jeong

    2014-11-01

Green and cloud computing (G-cloud) are new trends in all areas of computing. The G-cloud enables users to access their programs, systems and platforms at any time and place, while green computing reduces power consumption for sustainable environments. Furthermore, in order to apply user needs to system development, user characteristics are regarded as among the most important factors to consider in product industries. In this paper, we propose a cloud monitoring system to observe and manage manufacturing systems/factory automation for sustainable green computing. The monitoring system utilizes resources in the G-cloud environment and hence reduces the amount of system resources and devices required, such as system power and processes. In addition, we propose adding a user profile to the monitoring system in order to provide a user-friendly function: system configurations are automatically matched to the individual’s requirements, thus increasing efficiency.

  1. Automated Cryocooler Monitor and Control System

    Science.gov (United States)

    Britcliffe, Michael J.; Hanscon, Theodore R.; Fowler, Larry E.

    2011-01-01

    A system was designed to automate cryogenically cooled low-noise amplifier systems used in the NASA Deep Space Network. It automates the entire operation of the system including cool-down, warm-up, and performance monitoring. The system is based on a single-board computer with custom software and hardware to monitor and control the cryogenic operation of the system. The system provides local display and control, and can be operated remotely via a Web interface. The system controller is based on a commercial single-board computer with onboard data acquisition capability. The commercial hardware includes a microprocessor, an LCD (liquid crystal display), seven LED (light emitting diode) displays, a seven-key keypad, an Ethernet interface, 40 digital I/O (input/output) ports, 11 A/D (analog to digital) inputs, four D/A (digital to analog) outputs, and an external relay board to control the high-current devices. The temperature sensors used are commercial silicon diode devices that provide a non-linear voltage output proportional to temperature. The devices are excited with a 10-microamp bias current. The system is capable of monitoring and displaying three temperatures. The vacuum sensors are commercial thermistor devices. The output of the sensors is a non-linear voltage proportional to vacuum pressure in the 1-Torr to 1-millitorr range. Two sensors are used. One measures the vacuum pressure in the cryocooler and the other the pressure at the input to the vacuum pump. The helium pressure sensor is a commercial device that provides a linear voltage output from 1 to 5 volts, corresponding to a gas pressure from 0 to 3.5 MPa (approx. = 500 psig). Control of the vacuum process is accomplished with a commercial electrically operated solenoid valve. A commercial motor starter is used to control the input power of the compressor. The warm-up heaters are commercial power resistors sized to provide the appropriate power for the thermal mass of the particular system, and
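The sensor conversions this record describes can be sketched as follows. The helium pressure mapping (1 to 5 V onto 0 to 3.5 MPa) is taken from the text; the silicon-diode temperature curve is non-linear, so a lookup table with interpolation is a natural approach, but the table values below are hypothetical placeholders, not a real calibration:

```python
from bisect import bisect_left

def helium_pressure_mpa(volts):
    """Map the linear 1-5 V transducer output onto 0-3.5 MPa (per the record)."""
    return (volts - 1.0) * 3.5 / 4.0

# (voltage, kelvin) pairs: diode voltage falls as temperature rises.
# These numbers are illustrative, not a manufacturer calibration.
DIODE_TABLE = [(1.62, 4.2), (1.20, 20.0), (1.05, 77.0), (0.95, 170.0), (0.52, 300.0)]

def diode_temperature_k(volts):
    """Piecewise-linear interpolation over the (hypothetical) diode curve."""
    vs = [-v for v, _ in DIODE_TABLE]          # negate so values ascend for bisect
    i = bisect_left(vs, -volts)
    i = max(1, min(i, len(DIODE_TABLE) - 1))   # clamp to table range
    (v0, t0), (v1, t1) = DIODE_TABLE[i - 1], DIODE_TABLE[i]
    return t0 + (t1 - t0) * (volts - v0) / (v1 - v0)

print(helium_pressure_mpa(3.0), diode_temperature_k(1.05))
```

A controller like the one described would apply these conversions to its A/D readings before display and limit checking.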

  2. Monitored Retrievable Storage/Multi-Purpose Canister analysis: Simulation and economics of automation

    International Nuclear Information System (INIS)

    Bennett, P.C.; Stringer, J.B.

    1994-01-01

Robotic automation is examined as a possible alternative to manual spent nuclear fuel, transport cask and Multi-Purpose Canister (MPC) handling at a Monitored Retrievable Storage (MRS) facility. Automation of key operational aspects of the MRS/MPC system is analyzed to determine equipment requirements, throughput times and equipment costs. The economic and radiation dose impacts resulting from this automation are compared to manual handling methods

  3. Adaptive Algorithms for Automated Processing of Document Images

    Science.gov (United States)

    2011-01-01

Title of dissertation: Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy, 2011. Dissertation submitted to the Faculty of the Graduate School of the University

  4. National Automated Conformity Inspection Process -

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  5. Automated high-volume aerosol sampling station for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S.

    1998-07-01

An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (attached to a cassette); the airflow through the filter is 800 m³/h at maximum. During sampling, the filter is continuously monitored with NaI scintillation detectors. After sampling, the large filter is automatically cut into 15 pieces that form a small sample and, after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1–10 × 10⁻⁶ Bq/m³. The station is equipped with various sensors to reveal unauthorized admittance; these sensors can be monitored remotely in real time via the Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of the CTBTO for aerosol monitoring. The concept is also well suited to nuclear material safeguards

  6. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  7. Safety Evaluation of an Automated Remote Monitoring System for Heart Failure in an Urban, Indigent Population.

    Science.gov (United States)

    Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Hertz, Crystal Coyazo; Guterman, Jeffrey J

    2017-12-01

    Heart Failure (HF) is the most expensive preventable condition, regardless of patient ethnicity, race, socioeconomic status, sex, and insurance status. Remote telemonitoring with timely outpatient care can significantly reduce avoidable HF hospitalizations. Human outreach, the traditional method used for remote monitoring, is effective but costly. Automated systems can potentially provide positive clinical, fiscal, and satisfaction outcomes in chronic disease monitoring. The authors implemented a telephonic HF automated remote monitoring system that utilizes deterministic decision tree logic to identify patients who are at risk of clinical decompensation. This safety study evaluated the degree of clinical concordance between the automated system and traditional human monitoring. This study focused on a broad underserved population and demonstrated a safe, reliable, and inexpensive method of monitoring patients with HF.
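The record describes deterministic decision-tree logic for identifying patients at risk of decompensation. A hedged sketch of what such triage logic might look like; the questions, thresholds, and escalation labels below are invented for illustration and are not taken from the study:

```python
def hf_triage(weight_gain_lb: float, worse_breathing: bool, swelling: bool) -> str:
    """Map automated-call answers to an escalation level (illustrative rules)."""
    if weight_gain_lb >= 5.0 or (worse_breathing and swelling):
        return "nurse-callback-today"    # possible decompensation: escalate to human
    if weight_gain_lb >= 2.0 or worse_breathing or swelling:
        return "monitor-48h"             # re-contact within two days
    return "routine"                     # continue scheduled automated calls

print(hf_triage(6.0, worse_breathing=False, swelling=False))
```

The safety question the study evaluates is exactly whether such deterministic rules escalate the same patients a human monitor would.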

  8. Test Automation Process Improvement A case study of BroadSoft

    OpenAIRE

    Gummadi, Jalendar

    2016-01-01

This master's thesis concerns improvement of the test automation process at BroadSoft Finland, presented as a case study. A test automation project, converting manual test cases to automated test cases, recently started at BroadSoft, but the project is not properly integrated into the existing process. The aim of this thesis is to study the existing BroadSoft test process and different test automation frameworks. In this thesis different test automation processes are studied ...

  9. In-process monitoring and control of microassembly by utilising force sensor

    OpenAIRE

    S. Tangjitsitcharoen; P. Tangpornprasert; Ch. Virulsri; N. Rojanarowan

    2008-01-01

Purpose: The aim of this research is to develop an in-process monitoring system to control the position of the shaft within a tolerance of ±2.5 μm regardless of any conditions of the geometries of the shaft and the thrust plate. Design/methodology/approach: To realize an automated and intelligent microassembly process, a method has been developed to monitor and control the position of the shaft in the plate of the high-precision spindle motor for hard disk drives in order to reduce the shaft high ...

  10. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  11. Powder handling for automated fuel processing

    International Nuclear Information System (INIS)

    Frederickson, J.R.; Eschenbaum, R.C.; Goldmann, L.H.

    1989-01-01

    Installation of the Secure Automated Fabrication (SAF) line has been completed. It is located in the Fuel Cycle Plant (FCP) at the Department of Energy's (DOE) Hanford site near Richland, Washington. The SAF line was designed to fabricate advanced reactor fuel pellets and assemble fuel pins by automated, remote operation. This paper describes powder handling equipment and techniques utilized for automated powder processing and powder conditioning systems in this line. 9 figs

  12. RAPID AUTOMATED RADIOCHEMICAL ANALYZER FOR DETERMINATION OF TARGETED RADIONUCLIDES IN NUCLEAR PROCESS STREAMS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Egorov, Oleg; Devol, Timothy A.

    2008-01-01

    Some industrial process-scale plants require the monitoring of specific radionuclides as an indication of the composition of their feed streams or as indicators of plant performance. In this process environment, radiochemical measurements must be fast, accurate, and reliable. Manual sampling, sample preparation, and analysis of process fluids are highly precise and accurate, but tend to be expensive and slow. Scientists at Pacific Northwest National Laboratory (PNNL) have assembled and characterized a fully automated prototype Process Monitor instrument which was originally designed to rapidly measure Tc-99 in the effluent streams of the Waste Treatment Plant at Hanford, WA. The system is capable of a variety of tasks: extraction of a precise volume of sample, sample digestion/analyte redox adjustment, column-based chemical separations, flow-through radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results can be immediately calculated and electronically reported. It is capable of performing a complete analytical cycle in less than 15 minutes. The system is highly modular and can be adapted to a variety of sample types and analytical requirements. It exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs

  13. Pyrochemical processing automation at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Dennison, D.K.; Domning, E.E.; Seivers, R.

    1991-01-01

Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system for pyrochemical processing of special nuclear materials (SNM). The system utilizes a glove box, an automated tilt-pour furnace (TPF), an IBM-developed gantry robot, and specialized automation tooling. All material handling within the glove box (i.e., furnace loading, furnace unloading, product and slag separation, and product packaging) is performed automatically. The objectives of the effort are to increase process productivity, decrease operator radiation exposure, reduce process wastes, and demonstrate system reliability and availability. This paper provides an overview of the automated system hardware, outlines the overall operations sequence, and discusses the current status

  14. Computer simulation and automation of data processing

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1981-01-01

The principles of computerized simulation and automation of data processing are presented. The automated processing system is constructed according to a modular-hierarchical principle. The main operating modes of the system are: preprocessing, installation analysis, interpretation, accuracy analysis and parameter control. A definition is given of the quasi-real experiment, which permits planning of the real experiment. It is pointed out that realization of the quasi-real experiment by means of a computerized model of the installation, with subsequent automated processing, makes it possible to examine the quantitative behaviour of the system as a whole and provides optimal design of installation parameters for obtaining maximum resolution [ru]

  15. Automated testing of arrhythmia monitors using annotated databases.

    Science.gov (United States)

    Elghazzawi, Z; Murray, W; Porter, M; Ezekiel, E; Goodall, M; Staats, S; Geheb, F

    1992-01-01

    Arrhythmia-algorithm performance is typically tested using the AHA and MIT/BIH databases. The tools for this test are simulation software programs. While these simulations provide rapid results, they neglect hardware and software effects in the monitor. To provide a more accurate measure of performance in the actual monitor, a system has been developed for automated arrhythmia testing. The testing system incorporates an IBM-compatible personal computer, a digital-to-analog converter, an RS232 board, a patient-simulator interface to the monitor, and a multi-tasking software package for data conversion and communication with the monitor. This system "plays" patient data files into the monitor and saves beat classifications in detection files. Tests were performed using the MIT/BIH and AHA databases. Statistics were generated by comparing the detection files with the annotation files. These statistics were marginally different from those that resulted from the simulation. Differences were then examined. As expected, the differences were related to monitor hardware effects.
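The comparison step described above, matching detection files against annotation files to generate performance statistics, can be sketched as follows. The 150 ms tolerance follows the common ANSI/AAMI matching convention for arrhythmia detector testing; the beat times below are illustrative, not database excerpts:

```python
def match_beats(annotated, detected, tol=0.150):
    """Greedy one-to-one matching of sorted beat times (in seconds);
    returns (sensitivity, positive predictivity)."""
    tp, j = 0, 0
    for t in annotated:
        while j < len(detected) and detected[j] < t - tol:
            j += 1                       # skipped detections become false positives
        if j < len(detected) and abs(detected[j] - t) <= tol:
            tp += 1                      # matched beat: true positive
            j += 1
    fn = len(annotated) - tp             # annotated beats never matched
    fp = len(detected) - tp              # detections never matched
    se = tp / (tp + fn) if annotated else 0.0
    ppv = tp / (tp + fp) if detected else 0.0
    return se, ppv

ann = [0.80, 1.62, 2.41, 3.20, 4.05]        # reference annotation times
det = [0.82, 1.60, 2.90, 3.19, 4.04, 4.70]  # monitor output: one miss, two extras
se, ppv = match_beats(ann, det)
print(f"Se={se:.2f}  +P={ppv:.2f}")
```

Running the same comparison on simulator output versus monitor output isolates the hardware and software effects the study set out to measure.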

  16. Methodology for monitoring and automated diagnosis of ball bearing using paraconsistent logic, wavelet transform and digital signal processing

    International Nuclear Information System (INIS)

    Masotti, Paulo Henrique Ferraz

    2006-01-01

The monitoring and diagnosis area has seen impressive development in recent years with the introduction of new diagnosis techniques and with the use of computers to process information and run diagnostic algorithms. The contribution of artificial intelligence to automated defect diagnosis is developing continually, and growing automation in industry is adopting these new techniques. In the nuclear area, growing concern with facility safety demands more effective techniques to increase the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow verification of their operational conditions. In this way, the present work can also collaborate in this area, helping to diagnose the operational condition of machines. This work presents a new feature-extraction technique based on the zero crossings of the wavelet transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work is the Paraconsistent Logic of Annotation with Two Values (LPA2v), which contributes to automating defect diagnosis because this logic can deal with the contradictory results that feature-extraction techniques can present. This work also concentrated on identifying defects in their initial phase using accelerometers, because they are robust, low-cost sensors that are easily found in industry. The results of this work were obtained using an experimental database, and the diagnoses showed good performance for defects in their initial phase. (author)

  17. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

Automation, the removal of the human element in inspection, has not generally been applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content: approximately 8 × 10⁹ bits per 14 × 17 inch film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, already applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14 × 17 inch film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (Film Digital Radiography System), is moving toward 50 micron (approx. 16 lines/mm) resolution, which is believed to meet the majority of image-content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author) [pt]

  18. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    Science.gov (United States)

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and found that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunity and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
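A toy simulation in the spirit of this record shows why false positives matter: the naive fraction of sites with at least one detection overestimates true occupancy whenever the classifier produces spurious detections at unoccupied sites. All rates below are arbitrary illustrations, not values from the paper:

```python
import random

random.seed(42)

N_SITES, N_VISITS = 500, 10
PSI = 0.4        # true occupancy probability
P_TRUE = 0.5     # per-visit true-detection probability at occupied sites
P_FALSE = 0.05   # per-visit false-positive probability at any site

occupied = [random.random() < PSI for _ in range(N_SITES)]
detections = [
    sum(
        (occ and random.random() < P_TRUE) or (random.random() < P_FALSE)
        for _ in range(N_VISITS)
    )
    for occ in occupied
]

# Naive estimate: fraction of sites with at least one (possibly false) detection.
naive = sum(d > 0 for d in detections) / N_SITES
print(f"true occupancy {sum(occupied) / N_SITES:.2f}, naive estimate {naive:.2f}")
```

The hierarchical model in the record corrects this bias by modelling both error types, aided by human post-validation of a small subset of detections.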

  19. Automated Remote Monitoring of Depression: Acceptance Among Low-Income Patients in Diabetes Disease Management

    OpenAIRE

    Ramirez, Magaly; Wu, Shinyi; Jin, Haomiao; Ell, Kathleen; Gross-Schulman, Sandra; Myerchin Sklaroff, Laura; Guterman, Jeffrey

    2016-01-01

Background: Remote patient monitoring is increasingly integrated into health care delivery to expand access and increase effectiveness. Automation can add efficiency to remote monitoring, but patient acceptance of automated tools is critical for success. From 2010 to 2013, the Diabetes-Depression Care-management Adoption Trial (DCAT), a quasi-experimental comparative effectiveness research trial aimed at accelerating the adoption of collaborative depression care in a safety-net health care syst...

  20. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives.

    Science.gov (United States)

    Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini

    2018-08-01

Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and to mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper will also include a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation to aid and motivate development.
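As a minimal stand-in for the "software sensor" idea discussed above, the sketch below fits an inverse linear calibration from synthetic three-channel spectra to analyte concentration; real PAT applications would use full vibrational spectra with PLS or similar multivariate methods. The channel sensitivities and noise level are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: concentrations and three-channel "spectra" built
# from assumed channel sensitivities plus small measurement noise.
true_coef = np.array([0.8, 0.3, -0.1])
conc = rng.uniform(0.0, 10.0, size=40)
spectra = conc[:, None] * true_coef + rng.normal(0.0, 0.02, size=(40, 3))

# Fit the inverse calibration model conc ~= spectra @ b by least squares.
b, *_ = np.linalg.lstsq(spectra, conc, rcond=None)

def predict_conc(spectrum):
    """Software-sensor prediction for one measured spectrum."""
    return float(spectrum @ b)

sample = np.array([4.0, 1.5, -0.5])  # ideal spectrum of a 5.0-unit sample
print(round(predict_conc(sample), 2))
```

On-line, such a model turns each new spectrum into an immediate concentration estimate that a multivariate process controller can act on.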

  1. AUTOMATION OF CHAMPAGNE WINES PROCESS IN SPARKLING WINE PRESSURE TANK

    Directory of Open Access Journals (Sweden)

    E. V. Lukyanchuk

    2016-08-01

The wine industry has now successfully automated the receiving points for grapes, the crushing and pressing departments, continuous-fermentation installations, blending tanks, production lines for ordinary Madeira, continuously operating plants for ethyl alcohol, installations for champagne wine in continuous flow, etc. With technological progress, automation of the winemaking process is developing in the following directions: organization of integrated automation of grape-processing sites with bulk transportation of the grapes; improving the quality and durability of wines by wide application of cold and heat treatment of wine, together with technical and microbiological control using powerful automation equipment; introduction of automated continuous production processes for champagne, sherry and cognac alcohol, and Madeira; integrated automation of auxiliary production sites (boilers, air conditioners, refrigeration units and others); and integrated automation of whole enterprises and wine-bottling sites. The wine industry is developing more sophisticated automation schemes and devices that enable the transition to integrated production automation and will create model automated enterprises serving as laboratories for studying the main problems of automating winemaking production processes.

  2. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

In an age when celerity is expected of every sector, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To meet the speed demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries and conservation of resources in multiple sectors. To implement RPA, many software ...

  3. [Algorithm for the automated processing of rheosignals].

    Science.gov (United States)

    Odinets, G S

    1988-01-01

An algorithm for rheosignal recognition was examined for a microprocessor device with a display apparatus and automated and manual cursor control. The algorithm permits automation of rheosignal registration and processing, taking their variability into account.

  4. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  5. Technology transfer potential of an automated water monitoring system. [market research

    Science.gov (United States)

    Jamieson, W. M.; Hillman, M. E. D.; Eischen, M. A.; Stilwell, J. M.

    1976-01-01

    The nature and characteristics of the potential economic need (markets) for a highly integrated water quality monitoring system were investigated. The technological, institutional and marketing factors that would influence the transfer and adoption of an automated system were studied for application to public and private water supply, public and private wastewater treatment and environmental monitoring of rivers and lakes.

  6. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    International Nuclear Information System (INIS)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-01-01

    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier for the industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defects avoidance is fundamental. Because of this, there is the need to develop novel in situ monitoring tools able to keep under control the stability of the process on a layer-by-layer basis, and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize the defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems. (paper)
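One common form of the layer-wise statistical monitoring this review discusses is a control chart over a scalar per-layer process signature (for example, mean melt-pool intensity). A minimal EWMA sketch; the baseline statistics and signature values below are illustrative, not from any real PBF system:

```python
def ewma_alarms(signature, mu0, sigma0, lam=0.2, L=3.0):
    """Return indices of layers where the EWMA statistic leaves its limits."""
    z, alarms = mu0, []
    var_factor = lam / (2.0 - lam)
    limit = L * sigma0 * (var_factor ** 0.5)   # asymptotic control limit
    for i, x in enumerate(signature):
        z = lam * x + (1.0 - lam) * z          # exponentially weighted average
        if abs(z - mu0) > limit:
            alarms.append(i)
    return alarms

# 30 in-control layers oscillating around a mean of 100, then an upward
# drift from layer 30 onward (simulating the onset of a defect).
layers = [100.0 + (i % 3 - 1) * 0.5 for i in range(30)] + [103.0 + 0.2 * i for i in range(10)]
print(ewma_alarms(layers, mu0=100.0, sigma0=0.5))
```

Flagging the first out-of-control layer promptly is what enables the "as soon as possible" defect detection the review calls for, and could ultimately feed the process-control strategies it highlights.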

  7. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    Science.gov (United States)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-04-01

    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier for the industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defects avoidance is fundamental. Because of this, there is the need to develop novel in situ monitoring tools able to keep under control the stability of the process on a layer-by-layer basis, and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize the defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems.

  8. Automation trust and attention allocation in multitasking workspace.

    Science.gov (United States)

    Karpinsky, Nicole D; Chancey, Eric T; Palmer, Dakota B; Yamani, Yusuke

    2018-07-01

    Previous research suggests that operators with high workload can distrust and then poorly monitor automation, which has been generally inferred from automation dependence behaviors. To test automation monitoring more directly, the current study measured operators' visual attention allocation, workload, and trust toward imperfect automation in a dynamic multitasking environment. Participants concurrently performed a manual tracking task with two levels of difficulty and a system monitoring task assisted by an unreliable signaling system. Eye movement data indicate that operators allocate less visual attention to monitor automation when the tracking task is more difficult. Participants reported reduced levels of trust toward the signaling system when the tracking task demanded more focused visual attention. Analyses revealed that trust mediated the relationship between the load of the tracking task and attention allocation in Experiment 1, an effect that was not replicated in Experiment 2. Results imply a complex process underlying task load, visual attention allocation, and automation trust during multitasking. Automation designers should consider operators' task load in multitasking workspaces to avoid reduced automation monitoring and distrust toward imperfect signaling systems. Copyright © 2018. Published by Elsevier Ltd.

  9. Development of automated patrol-type monitoring and inspection system for nuclear power plant and application to actual plant

    International Nuclear Information System (INIS)

    Senoo, Makoto; Koga, Kazunori; Hirakawa, Hiroshi; Tanaka, Keiji

    1996-01-01

    An automated patrol-type monitoring and inspection system was developed and applied in a nuclear power plant. This system consists of a monorail, a monitoring robot and an operator's console. The monitoring robot consists of a sensor unit and a control unit. Three kinds of sensors, a color ITV camera, an infrared camera and a microphone, are installed in the sensor unit. The features of this monitoring robot are: (1) it weighs 15 kg, with cross-sectional dimensions of 152 mm in width and 290 mm in height; (2) several automatic monitoring functions are provided, using image processing and frequency analysis of the three sensor signals. (author)

  10. Evaluation of new and conventional thermoluminescent phosphors for environmental monitoring using automated thermoluminescent dosimeter readers

    International Nuclear Information System (INIS)

    Rathbone, B.A.; Endres, A.W.; Antonio, E.J.

    1994-10-01

    In recent years, there has been considerable interest in a new generation of super-sensitive thermoluminescent (TL) phosphors for potential use in routine personnel and environmental monitoring. Two of these phosphors, α-Al2O3:C and LiF:Mg,Cu,P, are evaluated in this paper for selected characteristics relevant to environmental monitoring, along with two conventional phosphors widely used in environmental monitoring, LiF:Mg,Ti and CaF2:Dy. The characteristics evaluated are light-induced fading, light-induced background, linearity and variability at low dose, and the minimum measurable dose. These characteristics were determined using an automated commercial dosimetry system (Harshaw System 8800) and routine processing protocols. Annealing and readout protocols for each phosphor were optimized for use in a large-scale environmental monitoring program

  11. Automated Student Aid Processing: The Challenge and Opportunity.

    Science.gov (United States)

    St. John, Edward P.

    1985-01-01

    To utilize automated technology for student aid processing, it is necessary to work with multiple institutional offices (student aid, admissions, registration, and business) and to develop automated interfaces with external processing systems at state and federal agencies, and perhaps at need-analysis organizations and lenders. (MLW)

  12. Quantitative monitoring of the fluorination process by neutron counting

    International Nuclear Information System (INIS)

    Russo, P.A.; Appert, Q.D.; Biddle, R.S.; Kelley, T.A.; Martinez, M.M.; West, M.H.

    1993-01-01

    Plutonium metal is produced by reducing PuF4 prepared from PuO2 by fluorination. Both fluorination and reduction are batch processes at the Los Alamos Plutonium Facility. The conversion of plutonium oxide to fluoride greatly increases the neutron yield, a result of the high cross section for alpha-neutron (α,n) reactions on fluorine targets compared to the more than 100 times smaller (α,n) yield on oxygen targets. Because of this increase, total neutron counting can be used to monitor the conversion process. This monitoring ability can lead to an improved metal product, reduced scrap for recycle, waste reduction, minimized reagent usage, and reduced personnel radiation exposures. A new stirred-bed fluorination process has been developed simultaneously with a recent evaluation of an automated neutron-counting instrument for quantitative process monitoring. Neutrons are counted with polyethylene-moderated 3He-gas proportional counters. Results include a calibration of the real-time neutron-count-rate indicator for the extent of fluorination, using reference values obtained from destructive analysis of samples from the blended fluorinated batch
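
    The calibration described above amounts to regressing the observed neutron count rate against the extent of fluorination determined by destructive analysis, then inverting the fit online. The sketch below shows that idea with invented numbers (these are not Los Alamos data, and the real instrument applies corrections this toy fit omits):

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x (closed form)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical calibration points: fluorination extent (fraction) from
# destructive analysis vs. observed neutron count rate (counts/s).
extent = [0.0, 0.25, 0.5, 0.75, 1.0]
rate = [50.0, 300.0, 550.0, 800.0, 1050.0]
a, b = fit_line(extent, rate)

def extent_from_rate(r):
    """Invert the calibration to estimate fluorination extent in real time."""
    return (r - a) / b

print(round(extent_from_rate(675.0), 3))  # → 0.625
```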

  13. Acoustic emission-based in-process monitoring of surface generation in robot-assisted polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas; Bissacco, Giuliano; De Chiffre, Leonardo

    2016-01-01

    The applicability of acoustic emission (AE) measurements for in-process monitoring of surface generation in robot-assisted polishing (RAP) was investigated. Surface roughness measurements require interruption of the process, proper surface cleaning and measurements that sometimes necessitate removal of the part from the machine tool. In this study, stabilisation of surface roughness during polishing of rotationally symmetric surfaces by the RAP process was monitored by AE measurements. An AE sensor was placed on a polishing arm in direct contact with a bonded abrasive polishing tool … Automatic detection of the optimal process endpoint allows intelligent process control, creating fundamental elements in the development of a robust, fully automated RAP process for its widespread industrial application.
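
    One way to operationalize "automatic detection of the optimal process endpoint" is to watch for stabilisation of the AE signal level across polishing passes. The sketch below is an illustration of the general idea only, not the authors' algorithm: the per-pass AE RMS values and the tolerance are invented, and the endpoint is declared once the relative change stays small for several consecutive passes.

```python
def detect_endpoint(rms_per_pass, tol=0.02, consecutive=2):
    """Return the index of the pass at which the AE RMS has stabilised,
    i.e. its relative change stayed below `tol` for `consecutive`
    passes in a row; None if it never stabilises."""
    stable = 0
    for i in range(1, len(rms_per_pass)):
        change = abs(rms_per_pass[i] - rms_per_pass[i - 1]) / rms_per_pass[i - 1]
        stable = stable + 1 if change < tol else 0
        if stable >= consecutive:
            return i
    return None

# Invented per-pass AE RMS values: roughness-driven AE decays, then flattens.
rms = [10.0, 8.0, 6.0, 5.0, 4.8, 4.75, 4.74, 4.74]
print(detect_endpoint(rms))  # → 6
```

    Stopping at the detected pass is what removes the need to interrupt the process for roughness measurements.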

  14. More steps towards process automation for optical fabrication

    Science.gov (United States)

    Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina

    2017-06-01

    In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.

  15. Automated monitoring: a potential solution for achieving sustainable improvement in hand hygiene practices.

    Science.gov (United States)

    Levchenko, Alexander I; Boscart, Veronique M; Fernie, Geoff R

    2014-08-01

    Adequate hand hygiene is often considered as the most effective method of reducing the rates of hospital-acquired infections, which are one of the major causes of increased cost, morbidity, and mortality in healthcare. Electronic monitoring technologies provide a promising direction for achieving sustainable hand hygiene improvement by introducing the elements of automated feedback and creating the possibility to automatically collect individual hand hygiene performance data. The results of the multiphase testing of an automated hand hygiene reminding and monitoring system installed in a complex continuing care setting are presented. The study included a baseline Phase 1, with the system performing automated data collection only, a preintervention Phase 2 with hand hygiene status indicator enabled, two intervention Phases 3 and 4 with the system generating hand hygiene reminding signals and periodic performance feedback sessions provided, and a postintervention Phase 5 with only hand hygiene status indicator enabled and no feedback sessions provided. A significant increase in hand hygiene performance observed during the first intervention Phase 3 was sustained over the second intervention Phase 4, with the postintervention phase also indicating higher hand hygiene activity rates compared with the preintervention and baseline phases. The overall trends observed during the multiphase testing, the factors affecting acceptability of the automated hand hygiene monitoring system, and various strategies of technology deployment are discussed.

  16. Implementation of a fully automated process purge-and-trap gas chromatograph at an environmental remediation site

    International Nuclear Information System (INIS)

    Blair, D.S.; Morrison, D.J.

    1997-01-01

    The AQUASCAN, a commercially available, fully automated purge-and-trap gas chromatograph from Sentex Systems Inc., was implemented and evaluated as an in-field, automated monitoring system for contaminated groundwater at an active DOE remediation site in Pinellas, FL. Though the AQUASCAN is designed as a stand-alone process analytical unit, implementation at this site required additional hardware. The hardware included a sample dilution system and a method for delivering standard solution to the gas chromatograph for automated calibration. As a result of the evaluation, the system was determined to be a reliable and accurate instrument. The AQUASCAN-reported concentration values for methylene chloride, trichloroethylene, and toluene in the Pinellas groundwater were within 20% of reference laboratory values

  17. Containerless automated processing of intermetallic compounds and composites

    Science.gov (United States)

    Johnson, D. R.; Joslin, S. M.; Reviere, R. D.; Oliver, B. F.; Noebe, R. D.

    1993-01-01

    An automated containerless processing system has been developed to directionally solidify high temperature materials, intermetallic compounds, and intermetallic/metallic composites. The system incorporates a wide range of ultra-high purity chemical processing conditions. The utilization of image processing for automated control negates the need for temperature measurements for process control. The list of recent systems that have been processed includes Cr, Mo, Mn, Nb, Ni, Ti, V, and Zr containing aluminides. Possible uses of the system, process control approaches, and properties and structures of recently processed intermetallics are reviewed.

  18. Automated terrestrial laser scanning with near-real-time change detection – monitoring of the Séchilienne landslide

    Directory of Open Access Journals (Sweden)

    R. A. Kromer

    2017-05-01

    We present an automated terrestrial laser scanning (ATLS) system with automatic near-real-time change detection processing. The ATLS system was tested on the Séchilienne landslide in France for a 6-week period with data collected at 30 min intervals. The purpose of developing the system was to fill the gap of high-temporal-resolution TLS monitoring studies of earth surface processes and to offer a cost-effective, light, portable alternative to ground-based interferometric synthetic aperture radar (GB-InSAR) deformation monitoring. During the study, we detected the flux of talus, displacement of the landslide and pre-failure deformation of discrete rockfall events. Additionally, we found the ATLS system to be an effective tool in monitoring landslide and rockfall processes despite missing points due to poor atmospheric conditions or rainfall. Furthermore, such a system has the potential to help us better understand a wide variety of slope processes at high levels of temporal detail.
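
    Near-real-time change detection between successive scans is commonly reduced to comparing each new point against its nearest neighbour in a reference epoch. The brute-force sketch below is illustrative only: real pipelines add scan alignment, spatial indexing and level-of-detection statistics, and the coordinates and threshold here are invented.

```python
import math

def nearest_distance(p, cloud):
    """Distance from point p to its nearest neighbour in cloud (brute force)."""
    return min(math.dist(p, q) for q in cloud)

def changed_points(reference, new_scan, threshold=0.1):
    """Indices of new-scan points farther than `threshold` (same units
    as the coordinates) from every reference point."""
    return [i for i, p in enumerate(new_scan)
            if nearest_distance(p, reference) > threshold]

reference = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
new_scan = [(0.01, 0.0, 0.0), (1.0, 0.02, 0.0), (0.0, 1.0, 0.5)]  # last moved
print(changed_points(reference, new_scan))  # → [2]
```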

  19. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of a process automation system for the Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus-based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been contributing successfully towards understanding of the plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access for diagnostics, manual control of subsystems and monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. The machine is currently undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe-based fast data acquisition system for the equipped diagnostics; (2) a Modbus-based Process Automation System (PAS) for the subsystem controls and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating of the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives with a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using

  20. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of a process automation system for the Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus-based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been contributing successfully towards understanding of the plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access for diagnostics, manual control of subsystems and monitoring of a large number of signals. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. The machine is currently undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe-based fast data acquisition system for the equipped diagnostics; (2) a Modbus-based Process Automation System (PAS) for the subsystem controls and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating of the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives with a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
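
    The RS485/Modbus backbone described in this record frames every request with a CRC-16 checksum. As a hedged illustration of the protocol itself (not of the LVPD software, which the abstract says is written in LabVIEW), the following builds a standard Modbus RTU "read holding registers" request:

```python
import struct

def crc16_modbus(frame: bytes) -> int:
    """CRC-16/Modbus: init 0xFFFF, reflected polynomial 0xA001."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

def read_holding_registers(slave: int, start: int, count: int) -> bytes:
    """Build a Modbus RTU request frame for function code 0x03."""
    pdu = struct.pack('>BBHH', slave, 0x03, start, count)
    return pdu + struct.pack('<H', crc16_modbus(pdu))  # CRC low byte first

# Read one register at address 0 from slave 1.
print(read_holding_registers(1, 0, 1).hex())  # → 010300000001840a
```

    On the wire, the master broadcasts this frame and the addressed slave answers with its own CRC-protected response; the same framing underlies subsystem controllers like the FPS and PPS mentioned above.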

  1. Implementing The Automated Phases Of The Partially-Automated Digital Triage Process Model

    Directory of Open Access Journals (Sweden)

    Gary D Cantrell

    2012-12-01

    Digital triage is a pre-digital-forensic phase that sometimes takes place as a way of gathering quick intelligence. Although effort has been undertaken to model the digital forensics process, little has been done to date to model digital triage. This work discusses the further development of a model that does attempt to address digital triage: the Partially-automated Crime Specific Digital Triage Process model. The model itself will be presented along with a description of how its automated functionality was implemented to facilitate model testing.

  2. Welding process automation in power machine building

    International Nuclear Information System (INIS)

    Mel'bard, S.N.; Shakhnov, A.F.; Shergov, I.V.

    1977-01-01

    The level of automation of welding operations in power engineering, and ways of enhancing it, are highlighted. Examples of complex automation include an apparatus for the horizontal welding of turbine rotors, a remotely controlled automatic machine for welding ring joints of large-sized vessels, and equipment for the electron-beam welding of steam turbine assemblies of alloyed steels. The prospects of industrial robots are noted. The importance of the complex automation of the technological process, including stocking, assembling, transportation and auxiliary operations, is emphasized

  3. Automated analysis of PET based in-vivo monitoring in ion beam therapy

    International Nuclear Information System (INIS)

    Kuess, P.

    2014-01-01

    Particle Therapy (PT)-PET is currently the only clinically approved in-vivo method for monitoring PT. Due to fragmentation processes in the patient's tissue and the beam projectiles, a beta-plus activity distribution (BAD) can be measured during or shortly after the irradiation. The recorded activity map cannot be directly compared to the planned dose distribution. However, by means of a Monte Carlo (MC) simulation it is possible to predict the measured BAD from a treatment plan (TP). Thus, to verify a patient's treatment fraction, the actual PET measurement can be compared to the respective BAD prediction. This comparison is currently performed by visual inspection, which requires experienced evaluators and is rather time consuming. In this PhD thesis an evaluation tool is presented to compare BADs in an automated and objective way. The evaluation method was based on Pearson's correlation coefficient (PCC), an established measure in medical image processing, which was coded into a software tool. The patient data used to develop, test and validate the software tool were acquired at the GSI research facility, where over 400 patient treatments with 12C were monitored by means of an in-beam PET prototype. The number of data sets was increased by artificially altering BADs to simulate different beam ranges. The automated detection tool was tested on head and neck (H&N), prostate, lung and brain cases. To generate carbon ion TPs, the treatment planning system TRiP98 was used for all cases. From these TPs the respective BAD predictions were derived. Besides the detection of range deviations by means of PT-PET, the automated detection of patient setup uncertainties was also investigated. Although all measured patient data were recorded during the irradiation (in-beam), scenarios performing PET scans shortly after the irradiation (in-room) were also considered. To analyze the achievable precision of PT-PET with the automated evaluation tool based on
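
    The Pearson correlation comparison at the heart of such a tool fits in a few lines. In the sketch below, two 1-D activity profiles stand in for the measured and predicted distributions; the numbers are invented, and the thesis works on full activity maps rather than toy vectors:

```python
import math

def pearson(x, y):
    """Pearson's correlation coefficient between two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

predicted = [0.0, 0.2, 0.8, 1.0, 0.7, 0.1]  # simulated activity profile
measured = [0.0, 0.2, 0.8, 1.0, 0.7, 0.1]   # matching measurement
shifted = [0.2, 0.8, 1.0, 0.7, 0.1, 0.0]    # range-shifted measurement

print(round(pearson(predicted, measured), 3))  # → 1.0
print(pearson(predicted, shifted) < 0.9)       # → True
```

    A correlation well below 1 between prediction and measurement is the kind of signal that, in the thesis, triggers closer inspection for range deviations or setup errors.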

  4. CONCEPT AND STRUCTURE OF AUTOMATED SYSTEM FOR MONITORING STUDENT LEARNING QUALITY

    Directory of Open Access Journals (Sweden)

    M. Yu. Kataev

    2017-01-01

    organization and management of the learning process in a higher educational institution. The factors that affect the level of student knowledge obtained during training are shown, and on this basis the determining factors in assessing the level of knowledge are highlighted. It is proposed to manage individual training over any time interval on the basis of a generalized criterion calculated from students' current progress, their activity and the time they spend on training. The block structure of the automated program system for continuous monitoring of the achievements of each student is described. All functional blocks of the system are interconnected with the educational process. The main advantage of this system is that students have continuous access to materials about their own individual achievements and mistakes; from passive consumers of information they turn into active participants in their education, and can thus achieve greater effectiveness in personal vocational training. It is pointed out that the information base of such a system has to be available not only to students and teachers, but also to future employers of university graduates. Practical significance. The concept of an automated system for monitoring education results and the technique for processing the collected material presented in the article are based on a simple and obvious circumstance: a student with high progress spends more time on training and leads a more active lifestyle than fellow students; therefore, that student will, with high probability, be more successful in the chosen profession. Thus, complete, fully detailed and digitized information on the individual educational achievements of a future specialist is necessary not only for effective management of the educational process in higher education institutions, but also for employers interested in well-prepared, qualified and hard-working staff ready to take responsibility for their duties.

  5. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  6. Organization of film data processing in the PPI-SA automated system

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Perekatov, V.G.

    1984-01-01

    The organization of processing nuclear interaction images on PUOS-type standard devices using the PPI-SA automated system is considered. The system is made in the form of a complete module comprising two scanning measuring projectors and a scanning automatic device which operate in real time on line with the BESM-4 computer. The system comprises: a subsystem for photographic film scanning, selection of events for measurements and preliminary encoding; a subsystem for the formation and generation of libraries with data required for monitoring the scanning automatic device; and a subsystem for precision measurements of separate coordinates on photo images of nuclear particle tracks and of ionization losses. The system software comprises monitoring programs for the projectors and the scanning automatic device, as well as test functional control programs and the operating system. The programs are organized on a modular concept; by changing the module set, the system can be modified and adapted for image processing in different fields of science and technology

  7. Development of an automation system for a tablet coater

    DEFF Research Database (Denmark)

    Ruotsalainen, Mirja; Heinämäki, Jyrki; Rantanen, Jukka

    2002-01-01

    An instrumentation and automation system for a side-vented pan coater with a novel air-flow rate measurement system for monitoring the film-coating process of tablets was designed and tested. The instrumented coating system was tested and validated by film-coating over 20 pilot-scale batches. The instrumented and automated pan-coating system described, including historical data storage capability and a novel air-flow measurement system, is a useful tool for controlling and characterizing the tablet film-coating process. Monitoring of critical process parameters increases the overall coating process efficiency.

  8. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1988-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automation is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 3×10^9 bits per 14×17 (inch) film, equivalent to 2200 computer floppy disks. Parts handling systems and robotics, applied in manufacturing and in some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14×17 (inch) film in 6-8 seconds can digitize the film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (for Film Digital Radiography System), is moving toward 50 micron (16 lines/mm) resolution. This is believed to meet the majority of image content needs. (Author). 4 refs.; 21 figs

  9. Acoustic Emission Based In-process Monitoring in Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas; Bissacco, Giuliano; De Chiffre, Leonardo

    The applicability of acoustic emission (AE) measurements for in-process monitoring in the Robot Assisted Polishing (RAP) process was investigated. Surface roughness measurements require interruption of the process, proper surface cleaning and measurements that sometimes necessitate removal of the part from the machine tool. In this study, the development of surface roughness during polishing of rotationally symmetric surfaces by the RAP process was inferred from AE measurements. An AE sensor was placed on a polishing tool, and a cylindrical rod of Vanadis 4E steel having an initial turned surface … improving the efficiency of the process. It also allows for intelligent process control and generally enhances the robustness and reliability of the automated RAP system in industrial applications.

  10. AUTOMATION DESIGN FOR MONORAIL - BASED SYSTEM PROCESSES

    Directory of Open Access Journals (Sweden)

    Bunda BESA

    2016-12-01

    Currently, conventional methods of decline development put enormous cost pressure on the profitability of mining operations. This is the case with narrow-vein ore bodies, where current methods and mine designs of decline development may be too expensive to support economic extraction of the ore. According to studies, the time it takes to drill, clean and blast an end in conventional decline development can be up to 224 minutes. This is because once an end is blasted, cleaning must first be completed before drilling can commence, resulting in low advance rates per shift. Improvements in advance rates during decline development can be achieved by application of the Electric Monorail Transport System (EMTS)-based drilling system. The system consists of drilling and loading components that use monorail technology to drill and clean the face during decline development. The two systems work simultaneously at the face in such a way that as the top part of the face is being drilled, the pneumatic loading system cleans the face. However, to improve the efficiency of the two systems, the critical processes performed by them during mining operations must be automated. Automation increases safety and productivity, reduces operator fatigue and also reduces the labour costs of the system. The aim of this paper is, therefore, to describe automation designs for the two processes performed by the monorail drilling and loading systems during operations. During automation design, the critical processes performed by the two systems and the control requirements necessary to allow them to execute such processes automatically have also been identified.

  11. Process monitoring

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    Many of the measurements and observations made in a nuclear processing facility to monitor processes and product quality can also be used to monitor the location and movements of nuclear materials. In this session information is presented on how to use process monitoring data to enhance nuclear material control and accounting (MC and A). It will be seen that SNM losses can generally be detected with greater sensitivity and timeliness, and the point of loss localized more closely, than by conventional MC and A systems if process monitoring data are applied. The purpose of this session is to enable the participants to: (1) identify process unit operations that could improve control units for monitoring SNM losses; (2) choose key measurement points and formulate a loss indicator for each control unit; and (3) describe how the sensitivity and timeliness of loss detection could be determined for each loss indicator
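
    A loss indicator for a control unit is typically a material balance: inputs minus outputs minus the change in in-process inventory, tested against an alarm limit derived from measurement uncertainty. The sketch below is a generic illustration with invented quantities and a simple 3-sigma limit, not a prescribed MC and A procedure:

```python
def loss_indicator(inputs, outputs, inv_begin, inv_end):
    """Material balance for one control unit over one balance period (kg)."""
    return inputs - outputs - (inv_end - inv_begin)

def alarm(indicator, sigma, k=3.0):
    """Flag the balance period if the indicator exceeds k standard
    deviations of the combined measurement uncertainty."""
    return abs(indicator) > k * sigma

# Hypothetical balance period: 100.0 kg in, 98.7 kg out,
# in-process inventory grows from 5.0 to 6.2 kg; sigma = 0.05 kg.
mb = loss_indicator(100.0, 98.7, 5.0, 6.2)
print(round(mb, 3), alarm(mb, 0.05))  # → 0.1 False
```

    Shrinking the control unit (more measurement points, shorter balance periods) is what gives process-monitoring-based accounting its improved sensitivity and localization.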

  12. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meier, David E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coble, Jamie B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jordan, David V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mcdonald, Luther W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Forrester, Joel B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schwantes, Jon M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Unlu, Kenan [Pennsylvania State Univ., University Park, PA (United States); Landsberger, Sheldon [Univ. of Texas, Austin, TX (United States); Bender, Sarah [Pennsylvania State Univ., University Park, PA (United States); Dayman, Kenneth J. [Univ. of Texas, Austin, TX (United States); Reilly, Dallas D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-01

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
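
    The pattern-recognition step can be illustrated with a deliberately simplified scheme: learn a mean "normal" spectrum and a residual threshold from training measurements, then flag spectra whose residual exceeds it. This is only a stand-in for the multivariate analysis the MIP Monitor actually uses, and the channel counts below are invented:

```python
import math

def train(normal_spectra):
    """Mean spectrum and maximum training residual from in-spec runs."""
    n = len(normal_spectra)
    mean = [sum(s[i] for s in normal_spectra) / n
            for i in range(len(normal_spectra[0]))]
    worst = max(math.dist(s, mean) for s in normal_spectra)
    return mean, worst

def is_anomalous(spectrum, mean, limit, margin=1.5):
    """Flag a spectrum whose distance from the normal pattern exceeds
    the largest training residual by a safety margin."""
    return math.dist(spectrum, mean) > margin * limit

# Invented 4-channel gamma count patterns under normal process conditions.
normal = [[100, 50, 20, 5], [102, 49, 21, 5], [98, 51, 19, 6]]
mean, limit = train(normal)

print(is_anomalous([101, 50, 20, 5], mean, limit))  # → False
print(is_anomalous([80, 70, 20, 5], mean, limit))   # → True
```

    Because only the distance to a learned pattern is reported, not the spectrum itself, this style of monitoring also illustrates the information barrier the abstract mentions.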

  13. Automated processing of nuclear materials accounting data

    International Nuclear Information System (INIS)

    Straka, J.; Pacak, P.; Moravec, J.

    1980-01-01

An automated system for nuclear materials accounting was developed in Czechoslovakia. The system automates data processing, including data storage, and comprises record-keeping of inventories and material balances. In designing the system, the aim of the IAEA was taken into consideration, i.e., building a unified information system interconnected with the state-run systems for accounting and checking nuclear materials in the signatory countries of the non-proliferation treaty. The nuclear materials accounting programs were written in PL/I and were tested on an EC 1040 computer at UJV Rez, where the routine data processing also takes place. (B.S.)

  14. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in OMB...

  15. Automation in irrigation process in family farm with Arduino platform

    Directory of Open Access Journals (Sweden)

    Kianne Crystie Bezerra da Cunha

    2016-03-01

Full Text Available Small farmers tend not to use mechanized inputs in the irrigation process because of the high cost of conventional irrigation systems; in other cases, a lack of knowledge and technical guidance keeps the farmer from using such systems. Thus, all control and monitoring is done by hand, without the aid of machines, and this practice can lead to numerous problems, from poor irrigation and waste of water and energy to deficits in production. It is difficult to deduce when to irrigate or how much water to apply to the crop, or to measure variables such as soil temperature, air temperature, and humidity. The objective of this work is to implement an automated irrigation system aimed at family farming that is low cost and accessible to the farmer. The system will be able to monitor all parameters of the irrigation process. To this end, the key characteristics of family farming, the Arduino platform, and irrigation were analyzed.
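As a hedged illustration of the kind of low-cost decision logic such a system might implement (the thresholds and sensor scales below are assumptions, not values from the paper):

```python
# Hypothetical irrigation decision logic for a low-cost automated system of
# the kind described; all thresholds are illustrative assumptions.
def should_irrigate(soil_moisture_pct, air_temp_c, humidity_pct,
                    moisture_threshold=30.0):
    """Open the valve when the soil is dry; delay during peak heat and low
    humidity to cut evaporation losses."""
    if soil_moisture_pct >= moisture_threshold:
        return False  # soil is still wet enough
    if air_temp_c > 35.0 and humidity_pct < 20.0:
        return False  # wait: too much of the applied water would evaporate
    return True

assert should_irrigate(20.0, 25.0, 60.0)
assert not should_irrigate(45.0, 25.0, 60.0)
```

On an actual Arduino the same rule would run in the sketch's loop, reading the moisture sensor and driving a relay; the Python form here only captures the control decision.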

  16. [The use of automated processing of information obtained during space flights for the monitoring and evaluation of airborne pollution].

    Science.gov (United States)

    Bagmanov, B Kh; Mikhaĭlova, A Iu; Pavlov, S V

    1997-01-01

The article describes experience with the use of automated processing of information obtained during spaceflights for the analysis of urban air pollution. The authors present a method for processing information obtained during spaceflights and show how to identify sources of industrial releases and the areas of their spread within and beyond the cities.

  17. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. In...by ARL modelers. 2. Development Environment The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5.* Python was selected...because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool. In

  18. E-health, phase two: the imperative to integrate process automation with communication automation for large clinical reference laboratories.

    Science.gov (United States)

    White, L; Terner, C

    2001-01-01

    The initial efforts of e-health have fallen far short of expectations. They were buoyed by the hype and excitement of the Internet craze but limited by their lack of understanding of important market and environmental factors. E-health now recognizes that legacy systems and processes are important, that there is a technology adoption process that needs to be followed, and that demonstrable value drives adoption. Initial e-health transaction solutions have targeted mostly low-cost problems. These solutions invariably are difficult to integrate into existing systems, typically requiring manual interfacing to supported processes. This limitation in particular makes them unworkable for large volume providers. To meet the needs of these providers, e-health companies must rethink their approaches, appropriately applying technology to seamlessly integrate all steps into existing business functions. E-automation is a transaction technology that automates steps, integration of steps, and information communication demands, resulting in comprehensive automation of entire business functions. We applied e-automation to create a billing management solution for clinical reference laboratories. Large volume, onerous regulations, small margins, and only indirect access to patients challenge large laboratories' billing departments. Couple these problems with outmoded, largely manual systems and it becomes apparent why most laboratory billing departments are in crisis. Our approach has been to focus on the most significant and costly problems in billing: errors, compliance, and system maintenance and management. The core of the design relies on conditional processing, a "universal" communications interface, and ASP technologies. The result is comprehensive automation of all routine processes, driving out errors and costs. Additionally, compliance management and billing system support and management costs are dramatically reduced. 
The implications of e-automated processes can extend

  19. Automated radiochemical processing for clinical PET

    International Nuclear Information System (INIS)

    Padgett, H.C.; Schmidt, D.G.; Bida, G.T.; Wieland, B.W.; Pekrul, E.; Kingsbury, W.G.

    1991-01-01

With the recent emergence of positron emission tomography (PET) as a viable clinical tool, there is a need for a convenient, cost-effective source of positron-emitting radiotracers labeled with carbon-11, nitrogen-13, oxygen-15, and fluorine-18. These short-lived radioisotopes are accelerator produced and thus require a cyclotron and radiochemistry processing instrumentation that can be operated in a clinical environment by competent technicians. The basic goal is to ensure safety and reliability while setting new standards for economy and ease of operation. The Siemens Radioisotope Delivery System (RDS 112) is a fully automated system dedicated to the production and delivery of positron emitter-labeled precursors and radiochemicals required to support a clinical PET imaging program. Thus, the entire RDS can be thought of as an automated radiochemical processing apparatus

  20. Automated and electronically assisted hand hygiene monitoring systems: a systematic review.

    Science.gov (United States)

    Ward, Melissa A; Schweizer, Marin L; Polgreen, Philip M; Gupta, Kalpana; Reisinger, Heather S; Perencevich, Eli N

    2014-05-01

    Hand hygiene is one of the most effective ways to prevent transmission of health care-associated infections. Electronic systems and tools are being developed to enhance hand hygiene compliance monitoring. Our systematic review assesses the existing evidence surrounding the adoption and accuracy of automated systems or electronically enhanced direct observations and also reviews the effectiveness of such systems in health care settings. We systematically reviewed PubMed for articles published between January 1, 2000, and March 31, 2013, containing the terms hand AND hygiene or hand AND disinfection or handwashing. Resulting articles were reviewed to determine if an electronic system was used. We identified 42 articles for inclusion. Four types of systems were identified: electronically assisted/enhanced direct observation, video-monitored direct observation systems, electronic dispenser counters, and automated hand hygiene monitoring networks. Fewer than 20% of articles identified included calculations for efficiency or accuracy. Limited data are currently available to recommend adoption of specific automatic or electronically assisted hand hygiene surveillance systems. Future studies should be undertaken that assess the accuracy, effectiveness, and cost-effectiveness of such systems. Given the restricted clinical and infection prevention budgets of most facilities, cost-effectiveness analysis of specific systems will be required before these systems are widely adopted. Published by Mosby, Inc.

  1. ARTIP: Automated Radio Telescope Image Processing Pipeline

    Science.gov (United States)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging for radio-interferometric data. ARTIP starts with raw data, i.e., a measurement set, and goes through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging, to generate continuum and spectral line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts and logs. It is written using standard Python libraries and the CASA package. The pipeline can deal with datasets with multiple spectral windows and also multiple target sources which may have arbitrary combinations of flux/bandpass/phase calibrators.
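The stage structure described above (named stages that can run independently or chained) can be sketched as a registry of stage functions. The stage names follow the abstract; the code organization is an assumption, not ARTIP's actual implementation:

```python
# Sketch of a stage-based pipeline in the spirit of ARTIP: each stage is
# registered under a name and any subset of stages can be requested.
from typing import Callable, Dict

STAGES: Dict[str, Callable[[dict], dict]] = {}

def stage(name):
    """Decorator registering a pipeline stage under a name."""
    def register(fn):
        STAGES[name] = fn
        return fn
    return register

@stage("flux_calibration")
def flux_cal(ms): ms["flux_calibrated"] = True; return ms

@stage("bandpass_calibration")
def bandpass_cal(ms): ms["bandpass_calibrated"] = True; return ms

@stage("phase_calibration")
def phase_cal(ms): ms["phase_calibrated"] = True; return ms

@stage("imaging")
def imaging(ms): ms["image"] = "continuum+spectral"; return ms

def run(ms, stages):
    """Run the requested stages, in order, on a measurement-set stand-in."""
    for name in stages:
        ms = STAGES[name](ms)
    return ms

result = run({"raw": True}, ["flux_calibration", "imaging"])
assert result["flux_calibrated"] and result["image"] == "continuum+spectral"
```

The registry pattern is what makes "each stage can also be run independently" cheap: the driver simply invokes whichever named stages the user requests.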

  2. Automating the conflict resolution process

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here are how resource conflicts are currently resolved as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  3. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements in this direction are briefly reviewed using the example of domestic NPPs. Two models for solving the problem of distributing functions between the operator and technical means are outlined. The processes subject to automation are enumerated. Development of optimal methods for automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations, especially during start-up, shut-down or in emergency situations, is becoming important [ru

  4. Electrical - light current remote monitoring, control and automation. [Coal mine, United Kingdom

    Energy Technology Data Exchange (ETDEWEB)

    Collingwood, C H

    1981-06-01

A brief discussion is given of the application of control, monitoring and automation techniques to coal mining in the United Kingdom, especially the use of microprocessors, for the purpose of enhancing safety and productivity. Lighting systems for coal mines are similarly discussed.

  5. Distributed process control system for remote control and monitoring of the TFTR tritium systems

    International Nuclear Information System (INIS)

    Schobert, G.; Arnold, N.; Bashore, D.; Mika, R.; Oliaro, G.

    1989-01-01

This paper reviews the progress made in the application of a commercially available distributed process control system to support the requirements established for the Tritium REmote Control And Monitoring System (TRECAMS) of the Tokamak Fusion Test Reactor (TFTR). The system that will be discussed was purchased from the Texas Instruments (TI) Automation Controls Division, previously marketed by Rexnord Automation. It consists of three fully redundant, distributed process controllers interfaced to over 1800 analog and digital I/O points. The operator consoles located throughout the facility are supported by four Digital Equipment Corporation (DEC) PDP-11/73 computers. The PDP-11/73s and the three process controllers communicate over a fully redundant one-megabaud fiber optic network. All system functionality is based on a set of completely integrated databases loaded to the process controllers and the PDP-11/73s. (author). 2 refs.; 2 figs

  6. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    International Nuclear Information System (INIS)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M.; Rahmat, K.; Ariffin, H.

    2012-01-01

The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)
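The FA and MD measures quantified in this record have standard definitions in terms of the three eigenvalues of the diffusion tensor; a minimal sketch of those definitions (not code from the study, and the example eigenvalues are illustrative):

```python
import numpy as np

def fa_md(eigenvalues):
    """Fractional anisotropy and mean diffusivity from the three eigenvalues
    of a diffusion tensor (standard DTI definitions)."""
    lam = np.asarray(eigenvalues, dtype=float)
    md = lam.mean()
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
    return fa, md

# Strongly anisotropic tensor (mature white matter-like) vs an isotropic one.
fa1, _ = fa_md([1.7e-3, 0.3e-3, 0.3e-3])  # units mm^2/s, illustrative values
fa2, _ = fa_md([1.0e-3, 1.0e-3, 1.0e-3])
assert fa1 > 0.7 and fa2 < 1e-6
```

The triphasic FA increase reported above corresponds to these eigenvalues becoming progressively more unequal as axons myelinate.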

  7. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); Rahmat, K. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); University Malaya, Biomedical Imaging Department, Kuala Lumpur (Malaysia); Ariffin, H. [University of Malaya, Department of Paediatrics, Faculty of Medicine, Kuala Lumpur (Malaysia)

    2012-07-15

The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)

  8. An overview of the Environmental Monitoring Computer Automation Project

    International Nuclear Information System (INIS)

    Johnson, S.M.; Lorenz, R.

    1992-01-01

The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, analytical methods and results obtained are important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS

  9. A fuzzy model for processing and monitoring vital signs in ICU patients

    Directory of Open Access Journals (Sweden)

    Valentim Ricardo AM

    2011-08-01

Full Text Available Abstract Background The area of hospital automation has been the subject of much research, addressing relevant issues which can be automated, such as: management and control (electronic medical records, scheduling of appointments, hospitalization, among others); communication (tracking of patients, staff and materials); development of medical, hospital and laboratory equipment; monitoring (patients, staff and materials); and aid to medical diagnosis (according to each specialty). Methods In this context, this paper presents a fuzzy model for helping medical diagnosis of Intensive Care Unit (ICU) patients whose vital signs are monitored through a multiparameter heart screen. Intelligent systems techniques were used in the data acquisition and in processing it (sorting, transforming, among others) into useful information, conducting a pre-diagnosis and providing, when necessary, alert signs to the medical staff. Conclusions Fuzzy logic applied to the medical area can be very useful when seen as a tool to assist specialists in this area. This paper presented a fuzzy model able to monitor and classify the condition of the vital signs of hospitalized patients, sending alerts according to the pre-diagnosis performed, thus aiding medical diagnosis.
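A minimal sketch of fuzzy classification of a single vital sign, in the spirit of the model described; the triangular membership functions, ranges and labels below are illustrative assumptions, not the paper's rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to b and falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def heart_rate_label(hr_bpm):
    """Fuzzy degrees for one vital sign; the dominant degree picks the label
    used for pre-diagnosis and alerting."""
    normal = tri(hr_bpm, 50, 75, 100)
    high = tri(hr_bpm, 90, 120, 150)
    critical = max(tri(hr_bpm, 140, 180, 220),  # tachycardia side
                   tri(hr_bpm, 0, 25, 50))      # bradycardia side
    return max([("normal", normal), ("high", high), ("critical", critical)],
               key=lambda kv: kv[1])[0]

assert heart_rate_label(72) == "normal"
assert heart_rate_label(185) == "critical"
```

A full model would combine several vital signs through fuzzy rules before defuzzifying into an alert level; this sketch shows only the membership-and-label step.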

  10. Processing and review interface for strong motion data (PRISM) software, version 1.0.0—Methodology and automated processing

    Science.gov (United States)

    Jones, Jeanne; Kalkan, Erol; Stephens, Christopher

    2017-02-23

    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong-Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the United States, call for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong-motion records. When used without AQMS, PRISM provides batch-processing capabilities. The PRISM version 1.0.0 is platform independent (coded in Java), open source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine and a review tool that has a graphical user interface (GUI) to manually review, edit, and process records. To facilitate use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible in order to accommodate new processing techniques. This report provides a thorough description and examples of the record processing features supported by PRISM. All the computing features of PRISM have been thoroughly tested.

  11. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

In automated production processes grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases and describes how different principles are adopted at different scales in different applications (from assembly to disassembly, from aerospace to the food industry, from textile to logistics). Finally, the most recent research is reviewed in order to introduce the new trends in grasping, providing an outlook on the future of both grippers and robotic hands in automated production processes.

  12. The Earth Observation Monitor - Automated monitoring and alerting for spatial time-series data based on OGC web services

    Science.gov (United States)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2014-12-01

Spatial time series data have been freely available around the globe from earth observation satellites and meteorological stations for many years. They provide useful and important information for detecting ongoing changes of the environment, but for end-users it is often too complex to extract this information from the original time series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project to provide simple access, analysis and monitoring tools for global spatial time series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Active Archive Center (LP DAAC) and Google Earth Engine as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can either use the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTiff). Besides providing data access tools, users can also perform further time series analyses such as trend calculations, breakpoint detections or the derivation of phenological parameters from vegetation time series data. Furthermore, data from climate stations can be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline usage. Automated monitoring and alerting for the time series data integrated by the user is provided by an OGC Sensor Observation Service with a coupled OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., precipitation value higher than x millimeters per day, occurrence of a MODIS fire point, detection of a time series anomaly).
Datasets integrated in the SOS service are
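The user-defined monitoring rules the EOM abstract mentions (e.g. precipitation above some threshold per day) can be sketched as simple filter expressions evaluated over observed time series; the rule tuple format below is an assumption for illustration:

```python
import operator

# Comparison operators a user-supplied rule may reference.
OPS = {">": operator.gt, "<": operator.lt, ">=": operator.ge, "<=": operator.le}

def check_alerts(observations, rules):
    """Return an alert message for each observation that matches a
    (parameter, operator, limit) rule."""
    alerts = []
    for param, op, limit in rules:
        for ts, value in observations.get(param, []):
            if OPS[op](value, limit):
                alerts.append(f"{ts}: {param} {value} {op} {limit}")
    return alerts

obs = {"precipitation_mm": [("2014-07-01", 12.5), ("2014-07-02", 55.0)]}
alerts = check_alerts(obs, [("precipitation_mm", ">", 50.0)])
assert alerts == ["2014-07-02: precipitation_mm 55.0 > 50.0"]
```

In the EOM itself, a Sensor Observation Service holds the observations and a Web Notification Service delivers the resulting alerts; the sketch only captures the filter-evaluation step.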

  13. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline.

    Science.gov (United States)

    Loh, K B; Ramli, N; Tan, L K; Roziah, M; Rahmat, K; Ariffin, H

    2012-07-01

The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. Diffusion tensor imaging outperforms conventional MRI in depicting white matter maturation. • DTI will become an important clinical tool for diagnosing paediatric neurological diseases. • DTI appears especially helpful for developmental abnormalities, tumours and white matter disease. • An automated processing pipeline assists quantitative analysis of high throughput DTI data.

  14. Building Automation Systems.

    Science.gov (United States)

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  15. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    International Nuclear Information System (INIS)

    Kazarov, A; Miotto, G Lehmann; Magnoni, L

    2012-01-01

The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain timeline. The AAL project aims to reduce manpower needs and to assure a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines; in particular, it leverages an Event-Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. 
All components work in a loosely coupled, event-based architecture, with a message broker

  16. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    Science.gov (United States)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain timeline. The AAL project aims to reduce manpower needs and to assure a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines; in particular, it leverages an Event-Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. 
All components work in a loosely coupled, event-based architecture, with a message broker

  17. Advanced process monitoring and feedback control to enhance cell culture process production and robustness.

    Science.gov (United States)

    Zhang, An; Tsang, Valerie Liu; Moore, Brandon; Shen, Vivian; Huang, Yao-Ming; Kshirsagar, Rashmi; Ryll, Thomas

    2015-12-01

    It is a common practice in biotherapeutic manufacturing to define a fixed-volume feed strategy for nutrient feeds, based on historical cell demand. However, once the feed volumes are defined, they are inflexible to batch-to-batch variations in cell growth and physiology and can lead to inconsistent productivity and product quality. In an effort to control critical quality attributes and to apply process analytical technology (PAT), a fully automated cell culture feedback control system has been explored in three different applications. The first study illustrates that frequent monitoring and automatically controlling the complex feed based on a surrogate (glutamate) level improved protein production. More importantly, the resulting feed strategy was translated into a manufacturing-friendly manual feed strategy without impact on product quality. The second study demonstrates the improved process robustness of an automated feed strategy based on online bio-capacitance measurements for cell growth. In the third study, glucose and lactate concentrations were measured online and were used to automatically control the glucose feed, which in turn changed lactate metabolism. These studies suggest that the auto-feedback control system has the potential to significantly increase productivity and improve robustness in manufacturing, with the goal of ensuring process performance and product quality consistency. © 2015 Wiley Periodicals, Inc.
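The third study's idea, measuring glucose online and computing a feed to restore a setpoint, can be illustrated with a minimal sketch. All constants (setpoint, stock concentration, volumes) are assumptions, and dilution of the culture by the feed itself is neglected:

```python
# Illustrative bolus feed calculation for glucose feedback control; constants
# are assumed values, not figures from the study.
def glucose_feed_volume(measured_g_per_l, setpoint_g_per_l,
                        culture_volume_l, feed_stock_g_per_l=500.0):
    """Feed volume (L) of concentrated glucose stock that brings the culture
    back to the setpoint; zero when the measurement is at or above it.
    Dilution by the added feed is neglected for simplicity."""
    deficit = max(0.0, setpoint_g_per_l - measured_g_per_l)
    grams_needed = deficit * culture_volume_l
    return grams_needed / feed_stock_g_per_l

# 2 L culture measured at 3 g/L with a 6 g/L setpoint needs 6 g of glucose,
# i.e. 0.012 L of a 500 g/L stock.
assert abs(glucose_feed_volume(3.0, 6.0, 2.0) - 0.012) < 1e-9
```

A production controller would wrap a calculation like this in a measurement loop and add rate limits and safety interlocks; the sketch shows only the core feedback arithmetic.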

  18. Automated complex spectra processing of actinide α-radiation

    International Nuclear Information System (INIS)

    Anichenkov, S.V.; Popov, Yu.S.; Tselishchev, I.V.; Mishenev, V.B.; Timofeev, G.A.

    1989-01-01

    Previously described algorithms for the automated processing of complex α-spectra of actinides were implemented on an Ehlektronika D3-28 computer line connected to an ICA-070 multichannel amplitude pulse analyzer. The developed program calculates peak intensities and the relative isotope content, performs energy calibration of spectra, calculates the peak center of gravity and energy resolution, and performs integral counting over a selected part of the spectrum. The error of the automated processing method depends on the degree of spectrum complexity and lies within the limits of 1-12%. 8 refs.; 4 figs.; 2 tabs
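The basic quantities listed (peak center of gravity, integral counting over a region of interest) can be sketched for a one-dimensional spectrum as follows; the toy spectrum and channel bounds are invented for illustration:

```python
def peak_centroid(counts, lo, hi):
    """Center of gravity (in channel units) of the peak in channels [lo, hi)."""
    window = counts[lo:hi]
    total = sum(window)
    if total == 0:
        return None
    return sum((lo + i) * c for i, c in enumerate(window)) / total

def integral_counts(counts, lo, hi, background_per_channel=0.0):
    """Background-subtracted integral counting over channels [lo, hi)."""
    return sum(counts[lo:hi]) - background_per_channel * (hi - lo)

# toy spectrum: a symmetric peak centred on channel 52
spectrum = [0] * 100
for ch, c in [(50, 10), (51, 40), (52, 80), (53, 40), (54, 10)]:
    spectrum[ch] = c

print(peak_centroid(spectrum, 48, 57))    # 52.0 (symmetric peak)
print(integral_counts(spectrum, 48, 57))  # 180.0
```

Energy calibration then maps the centroid channel to an energy, typically via a linear fit to reference lines.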

  19. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    …of the massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra. By projecting all detected ions - within defined intervals on both the time… …infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques to fully exploit the gain in high resolution and accuracy… …infusion analyses of crude extracts to find the relationship between species from several terverticillate Penicillium species, and also that the ions responsible for the segregation can be identified. Furthermore, the method can automate the detection of unique species and unique metabolites…

  20. Integration of disabled people in an automated work process

    Science.gov (United States)

    Jalba, C. K.; Muminovic, A.; Epple, S.; Barz, C.; Nasui, V.

    2017-05-01

    Automation processes are entering more and more areas of life and production. People with disabilities, in particular, can hardly keep pace with this change. In sheltered workshops in Germany, people with physical and mental disabilities receive dedicated help to be integrated into work processes. This work shows that cooperation between disabled people and industrial robots, by means of industrial image processing, can successfully result in the production of highly complex products. It describes how high-pressure hydraulic pumps are assembled by people with disabilities in cooperation with industrial robots in a sheltered workshop. After the assembly process, the pumps are checked for leaks at very high pressures in a completely automated process.

  1. An Automated, Image Processing System for Concrete Evaluation

    International Nuclear Information System (INIS)

    Baumgart, C.W.; Cave, S.P.; Linder, K.E.

    1998-01-01

    Allied Signal Federal Manufacturing & Technologies (FM&T) was asked to perform a proof-of-concept study for the Missouri Highway and Transportation Department (MHTD), Research Division, in June 1997. The goal of this proof-of-concept study was to ascertain whether automated scanning and imaging techniques might be applied effectively to the problem of concrete evaluation. In the current evaluation process, a concrete sample core is manually scanned under a microscope. Voids (or air spaces) within the concrete are then detected visually by a human operator by incrementing the sample under the cross-hairs of a microscope and counting the number of "pixels" which fall within a void. Automation of the scanning and image analysis processes is desired to improve the speed of the scanning process, to improve evaluation consistency, and to reduce operator fatigue. An initial, proof-of-concept image analysis approach was successfully developed and demonstrated using acquired black-and-white imagery of concrete samples. In this paper, the automated scanning and image capture system currently under development is described, and the image processing approach developed for the proof-of-concept study is demonstrated. A development update and plans for future enhancements are also presented
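Once the imagery is digitized, the manual pixel count described above becomes a simple thresholding operation. The sketch below counts dark pixels as void area in a toy grayscale patch; the intensity threshold and sample values are invented for illustration:

```python
def void_fraction(image, void_threshold=80):
    """Fraction of pixels darker than the threshold, i.e. lying inside voids
    (automating the manual per-pixel count done under the microscope)."""
    pixels = [px for row in image for px in row]
    voids = sum(1 for px in pixels if px < void_threshold)
    return voids / len(pixels)

# toy 4x5 grayscale scan of a concrete sample: two dark voids
scan = [
    [180, 175,  30,  25, 170],
    [185,  20,  28, 180, 175],
    [170, 180, 175,  35,  40],
    [175, 170, 180, 178, 172],
]
print(void_fraction(scan))  # 6 dark pixels out of 20 -> 0.3
```

A production system would add calibration (pixel size to physical area) and connected-component analysis to count individual voids rather than raw pixels.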

  2. Failsafe automation of Phase II clinical trial interim monitoring for stopping rules.

    Science.gov (United States)

    Day, Roger S

    2010-02-01

    In Phase II clinical trials in cancer, preventing the treatment of patients on a study when current data demonstrate that the treatment is insufficiently active or too toxic has obvious benefits, both in protecting patients and in reducing sponsor costs. Considerable efforts have gone into experimental designs for Phase II clinical trials with flexible sample size, usually implemented by early stopping rules. The intended benefits will not ensue, however, if the design is not followed. Despite the best intentions, failures can occur for many reasons. The main goal is to develop an automated system for interim monitoring, as a backup system supplementing the protocol team, to ensure that patients are protected. A secondary goal is to stimulate timely recording of patient assessments. We developed key concepts and performance needs, then designed, implemented, and deployed a software solution embedded in the clinical trials database system. The system has been in place since October 2007. One clinical trial tripped the automated monitor, resulting in e-mails that initiated statistician/investigator review in a timely fashion. Several essential contributing activities still require human intervention, institutional policy decisions, and institutional commitment of resources. We believe that implementing the concepts presented here will provide greater assurance that interim monitoring plans are followed and that patients are protected from inadequate response or excessive toxicity. This approach may also facilitate wider acceptance and quicker implementation of new interim monitoring algorithms.
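An automated interim monitor of this kind periodically evaluates the protocol's stopping rule against current accrual data. The sketch below assumes a Simon two-stage design with hypothetical first-stage parameters (n1 patients, futility bound r1); a real system would read the design parameters and response counts from the clinical trials database and e-mail the protocol team when the rule trips:

```python
def interim_check(n_enrolled, n_responses, n1, r1):
    """First-stage futility check for a Simon two-stage design:
    stop if, once n1 patients are evaluable, responses do not exceed r1."""
    if n_enrolled >= n1 and n_responses <= r1:
        return "stop: insufficient activity"
    return "continue"

# hypothetical design: stop after 19 patients if 4 or fewer responses
print(interim_check(n_enrolled=19, n_responses=3, n1=19, r1=4))  # stop
print(interim_check(n_enrolled=19, n_responses=6, n1=19, r1=4))  # continue
```

Running such a check on a schedule, rather than relying on the team to remember it, is the "failsafe" backup the paper argues for.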

  3. The SARVIEWS Project: Automated SAR Processing in Support of Operational Near Real-time Volcano Monitoring

    Science.gov (United States)

    Meyer, F. J.; Webley, P. W.; Dehn, J.; Arko, S. A.; McAlpin, D. B.; Gong, W.

    2016-12-01

    Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing has become established in operational volcano monitoring. Centers like the Alaska Volcano Observatory rely heavily on remote sensing data from optical and thermal sensors to provide time-critical hazard information. Despite this high use of remote sensing data, the presence of clouds and a dependence on solar illumination often limit their impact on decision making. Synthetic Aperture Radar (SAR) systems are widely considered superior to optical sensors in operational monitoring situations, due to their weather and illumination independence. Still, the contribution of SAR to operational volcano monitoring has been limited in the past due to high data costs, long processing times, and low temporal sampling rates of most SAR systems. In this study, we introduce the automatic SAR processing system SARVIEWS, whose advanced data analysis and data integration techniques allow, for the first time, a meaningful integration of SAR into operational monitoring systems. We will introduce the SARVIEWS database interface that allows for automatic, rapid, and seamless access to the data holdings of the Alaska Satellite Facility. We will also present a set of processing techniques designed to automatically generate a set of SAR-based hazard products (e.g. change detection maps, interferograms, geocoded images). The techniques take advantage of modern signal processing and radiometric normalization schemes, enabling the combination of data from different geometries. Finally, we will show how SAR-based hazard information is integrated in existing multi-sensor decision support tools to enable joint hazard analysis with data from optical and thermal sensors. We will showcase the SAR processing system using a set of recent natural disasters (both earthquakes and volcanic eruptions) to demonstrate its

  4. Automated electrohysterographic detection of uterine contractions for monitoring of pregnancy: feasibility and prospects.

    Science.gov (United States)

    Muszynski, C; Happillon, T; Azudin, K; Tylcz, J-B; Istrate, D; Marque, C

    2018-05-08

    Preterm birth is a major public health problem in developed countries. In this context, we have conducted research into outpatient monitoring of uterine electrical activity in women at risk of preterm delivery. The objective of this preliminary study was to perform automated detection of uterine contractions (without human intervention or tocographic signal, TOCO) by processing the EHG recorded on the abdomen of pregnant women. The feasibility and accuracy of uterine contraction detection based on EHG processing were tested and compared to expert decision using external tocodynamometry (TOCO). The study protocol was approved by local Ethics Committees under numbers ID-RCB 2016-A00663-48 for France and VSN 02-0006-V2 for Iceland. Two populations of women were included (threatened preterm birth and labour) in order to test our system for recognizing the various types of uterine contractions. EHG signal acquisition was performed according to a standardized protocol to ensure optimal reproducibility of EHG recordings. A system of 18 Ag/AgCl surface electrodes was used, with 16 recording electrodes placed between the woman's pubis and umbilicus in a 4 × 4 matrix. TOCO was recorded simultaneously with the EHG. EHG signals were analysed in real time by calculation of the nonlinear correlation coefficient H2. A curve representing the number of correlated pairs of signals, according to the value of H2 calculated between bipolar signals, was then plotted. High values of H2 indicated the presence of an event that may correspond to a contraction. Two tests were performed after detection of an event (fusion and elimination of certain events) in order to increase the contraction detection rate. The EHG database contained 51 recordings from pregnant women, with a total of 501 contractions previously labelled by analysis of the corresponding tocographic recording. The recognition percentage obtained by applying the method based on coefficient H2 was
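The detection principle, counting electrode pairs whose coupling exceeds a threshold, can be sketched as follows. For simplicity, the linear r² stands in here for the nonlinear H2 coefficient used in the study, and the channel data are synthetic: during a contraction many channels carry a common waveform, so the number of correlated pairs jumps.

```python
from itertools import combinations
from statistics import mean

def pearson_r2(x, y):
    """Squared linear correlation (a simplified stand-in for nonlinear H2)."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return (sxy * sxy) / (sxx * syy) if sxx and syy else 0.0

def correlated_pairs(channels, threshold=0.5):
    """Count channel pairs whose coupling exceeds the threshold."""
    return sum(1 for a, b in combinations(channels, 2)
               if pearson_r2(a, b) > threshold)

# contraction: four channels sharing one waveform at different gains
common = [0, 1, 2, 3, 2, 1, 0, -1, -2, -1]
contraction = [[c * g for c in common] for g in (1.0, 0.8, 1.2, 0.9)]
# rest: two deliberately uncorrelated channels
rest = [[1, -1, 1, -1, 1, -1, 1, -1, 1, -1],
        [1, 1, -1, -1, 1, 1, -1, -1, 1, 1]]

print(correlated_pairs(contraction))  # all 6 pairs correlated -> 6
print(correlated_pairs(rest))         # -> 0
```

Thresholding this pair count over time, then fusing or discarding short events, mirrors the two post-detection tests the abstract describes.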

  5. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring.

    Science.gov (United States)

    Shu, Tongxin; Xia, Min; Chen, Jiahong; Silva, Clarence de

    2017-11-05

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. On the same dataset, compared with a traditional adaptive sampling algorithm (ASA), DDASA achieved around the same Normalized Mean Error (NME) while saving a further 5.31% of battery energy.
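A data-driven adaptive sampling rule of this general kind can be sketched as follows. The halving/doubling policy, thresholds, and readings below are invented for illustration and are not the published DDASA algorithm: the idea is simply to sample faster while the monitored parameter changes quickly and slower while it is stable.

```python
def next_interval(prev_value, new_value, interval, min_s=30.0, max_s=600.0,
                  change_threshold=0.1):
    """Adapt the sampling interval to the observed rate of change."""
    rel_change = abs(new_value - prev_value) / max(abs(prev_value), 1e-9)
    if rel_change > change_threshold:
        return max(min_s, interval / 2)  # parameter moving: sample faster
    return min(max_s, interval * 2)      # stable: lengthen interval, save battery

interval = 120.0
readings = [8.0, 8.05, 8.02, 6.5, 5.9]  # e.g. dissolved oxygen, mg/L
intervals = []
for prev, new in zip(readings, readings[1:]):
    interval = next_interval(prev, new, interval)
    intervals.append(interval)
print(intervals)  # -> [240.0, 480.0, 240.0, 480.0]
```

The interval backs off while DO is flat, snaps back down when the 8.02 → 6.5 drop arrives, and relaxes again once the signal stabilizes.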

  6. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring

    Directory of Open Access Journals (Sweden)

    Tongxin Shu

    2017-11-01

    Full Text Available Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset to compare with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior in saving 5.31% more battery energy.

  7. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
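The linked-automation idea, chaining the analysis steps so one run carries a recording from raw signal to summary statistics, can be sketched as a pipeline of small functions. A crude threshold-crossing detector stands in for real spike-sorting here; the interval bounds, threshold, and data are invented:

```python
def isolate_interval(signal, start, end):
    """Step 1: isolate the post-stimulation interval."""
    return signal[start:end]

def detect_spikes(signal, threshold):
    """Step 3 (simplified): indices where the signal crosses the threshold upward."""
    return [i for i in range(1, len(signal))
            if signal[i - 1] < threshold <= signal[i]]

def quantify(spikes):
    """Step 4: summarize spike data."""
    return {"n_spikes": len(spikes)}

# the steps run as one linked chain, as in the paper's automated workflow
recording = [0.1, 0.2, 1.5, 0.3, 0.1, 2.0, 1.8, 0.2, 0.1, 1.7]
post_stim = isolate_interval(recording, 2, len(recording))
spikes = detect_spikes(post_stim, threshold=1.0)
print(quantify(spikes))  # -> {'n_spikes': 2}
```

Band filtering and power spectral density (steps 2 and 5) would slot into the same chain; the gain is that no intermediate file is handled by hand.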

  8. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    Science.gov (United States)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological, and remote sensing datasets are assimilated, processed, and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), visualization (ViSiT, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis, and so on. These capabilities use data collected with sensors and analytical tools from multiple manufacturers, producing many different kinds of measurements. While scientists obviously leverage tools, capabilities, and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable, and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of the ingestion, processing, and visualization of hydrological, geochemical, and geophysical (ERT/DTS) data. The core organizational element of PAF is the project/user pair, through which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical

  9. AUTOMATION OF IMAGE DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Preuss Ryszard

    2014-12-01

    Full Text Available This article discusses the current capabilities of automated processing of image data, using the PhotoScan software by Agisoft as an example. At present, image data obtained by various registration systems (metric and non-metric cameras placed on airplanes, satellites, or more often on UAVs) are used to create photogrammetric products. Multiple registrations of an object or land area (capturing large groups of photos) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of this, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied. They can create a model of a block in the local coordinate system or, using initial exterior orientation and measured control points, can provide image georeferencing in an external reference frame. In the case of non-metric images, it is also possible to carry out a self-calibration process at this stage. Image matching algorithms are also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM, and a photorealistic solid model of an object. All the aforementioned processing steps are implemented in a single program, in contrast to standard commercial software, which divides the steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential execution of the processing steps at predetermined control parameters. The paper presents practical results of fully automatic generation of orthomosaics, both for images obtained by a metric Vexell camera and for a block of images acquired by a non-metric UAV system.

  10. Automated transmission system operation and management : meeting stakeholder information needs

    Energy Technology Data Exchange (ETDEWEB)

    Peelo, D.F.; Toom, P.O. [British Columbia Hydro, Vancouver, BC (Canada)

    1998-12-01

    Information monitoring is considered the fundamental basis for moving beyond substation automation into automated transmission system operation and management. Information monitoring was defined as the acquisition of data and the processing of those data into decisions. Advances in digital technology and cheaper, more powerful computing capability have made it possible to capture all transmission stakeholder needs in a shared and automated operation and management system. Recognizing that the key to success in the development of transmission systems is automation, BC Hydro has initiated a long-term research and development project to develop the structure and detail of transmission system automation. The involvement of partners, be they utilities or equipment suppliers, is essential in order to deal with protocol and similar issues. 3 refs., 1 tab., 3 figs.

  11. Computer-based diagnostic monitoring to enhance the human-machine interface of complex processes

    International Nuclear Information System (INIS)

    Kim, I.S.

    1992-02-01

    There is a growing interest in introducing an automated, on-line, diagnostic monitoring function into the human-machine interfaces (HMIs) or control rooms of complex process plants. The design of such a system should be properly integrated with other HMI systems in the control room, such as the alarms system or the Safety Parameter Display System (SPDS). This paper provides a conceptual foundation for the development of a Plant-wide Diagnostic Monitoring System (PDMS), along with functional requirements for the system and other advanced HMI systems. Insights are presented into the design of an efficient and robust PDMS, which were gained from a critical review of various methodologies developed in the nuclear power industry, the chemical process industry, and the space technological community

  12. Completely automated measurement facility (PAVICOM) for track-detector data processing

    CERN Document Server

    Aleksandrov, A B; Feinberg, E L; Goncharova, L A; Konovalova, N S; Martynov, A G; Polukhina, N G; Roussetski, A S; Starkov, NI; Tsarev, V A

    2004-01-01

    A review of the technical capabilities and investigations performed using the completely automated measuring facility (PAVICOM) is presented. This highly efficient facility for track-detector data processing in the field of nuclear and high-energy particle physics has been constructed at the Lebedev Physical Institute. PAVICOM is widely used in Russia for the treatment of experimental data from track detectors (emulsion and solid-state trackers) in high- and low-energy physics, cosmic-ray physics, etc. PAVICOM provides an essential improvement in the efficiency of experimental studies. In contrast to the semi-automated microscopes widely used until now, PAVICOM is capable of performing completely automated measurements of charged-particle tracks in nuclear emulsions and track detectors without hard visual work. Track images are recorded by CCD cameras, then digitized and converted into files. Thus, experimental data processing is accelerated by approximately a thousand times. Completely autom...

  13. Dispatcher's monitoring systems of coal preparation processes. Systemy dyspozytorskiej kontroli procesow wzbogacania wegla

    Energy Technology Data Exchange (ETDEWEB)

    Cierpisz, S [Politechnika Slaska, Gliwice (Poland); Cierpisz, T; Glowacki, D; Puczylowski, T [Min-Tech Sp. z o.o., Katowice (Poland)

    1994-08-01

    The computer-based control and dispatcher's monitoring systems for coal preparation plants are described. The article covers local automation systems for coal blending production, control systems for the heavy-media separation process, and dispatcher's visualization systems for technological line operation. The effects of implementing these systems, as well as some experience gained at the design and operational stages, are given. (author). 2 refs., 6 figs.

  14. Rules-based analysis with JBoss Drools: adding intelligence to automation

    International Nuclear Information System (INIS)

    Ley, E. de; Jacobs, D.

    2012-01-01

    Rule engines are specialized software systems for applying conditional actions (if/then rules) to data. They are also known as 'production rule systems'. Rule engines are less well known as a software technology than the traditional procedural, object-oriented, scripting, or dynamic development languages. This is a pity, as they can be an important enrichment of a development toolbox. JBoss Drools is an open-source rules engine that can easily be embedded in any Java application. Through an integration in our Passerelle process automation suite, we have been able to provide advanced solutions for intelligent process automation, complex event processing, system monitoring and alarming, automated repair, etc. This platform has been proven over many years as an automated diagnosis and repair engine for Belgium's largest telecom provider, and it is being piloted at Synchrotron Soleil for device monitoring and alarming. After an introduction to rules engines in general and JBoss Drools in particular, we present its integration in a solution platform, some important principles, and a practical use case. (authors)
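The production-rule idea, if/then rules repeatedly applied to a fact base until nothing new fires, can be sketched in a few lines. A real Drools deployment would express the rules in DRL and execute them in a Java session with an optimized matching algorithm; the toy engine and the monitoring/repair facts below are purely illustrative:

```python
# A toy forward-chaining production-rule engine: each rule is an
# (if-condition, then-action) pair evaluated against a set of facts.
def run_rules(facts, rules, max_cycles=10):
    for _ in range(max_cycles):
        fired = False
        for condition, action in rules:
            new_fact = condition(facts) and action(facts)
            if new_fact and new_fact not in facts:
                facts = facts | {new_fact}
                fired = True
        if not fired:  # fixpoint reached: no rule produced a new fact
            return facts
    return facts

# hypothetical monitoring/alarming/repair chain
rules = [
    (lambda f: "device_timeout" in f, lambda f: "device_down"),
    (lambda f: "device_down" in f,    lambda f: "alarm_raised"),
    (lambda f: "alarm_raised" in f,   lambda f: "repair_scheduled"),
]
print(sorted(run_rules({"device_timeout"}, rules)))
```

The point of the declarative style is visible even here: the diagnosis chain emerges from independent rules, so adding a new rule never requires touching control flow.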

  15. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    Science.gov (United States)

    2018-01-01

    ARL-TR-8284 ● JAN 2018 ● US Army Research Laboratory. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation. …although some minor changes may be needed. The program processes a GTRAJ output text file that contains results from 2 or more simulations, where each

  16. IDAPS (Image Data Automated Processing System) System Description

    Science.gov (United States)

    1988-06-24

    This document describes the physical configuration and components used in the image processing system referred to as IDAPS (Image Data Automated Processing System). This system was developed by the Environmental Research Institute of Michigan (ERIM) for Eglin Air Force Base. The system is designed

  17. DEEP LEARNING AND IMAGE PROCESSING FOR AUTOMATED CRACK DETECTION AND DEFECT MEASUREMENT IN UNDERGROUND STRUCTURES

    Directory of Open Access Journals (Sweden)

    F. Panella

    2018-05-01

    Full Text Available This work presents the combination of Deep Learning (DL) and image processing to produce an automated crack recognition and defect measurement tool for civil structures. The authors focus on tunnel structures and surveying, and have developed an end-to-end tool for asset management of underground structures. In order to maintain the serviceability of tunnels, regular inspection is needed to assess their structural status. The traditional survey method is visual inspection: simple, but slow and relatively expensive, and the quality of the output depends on the ability and experience of the engineer as well as on the total workload (stress and tiredness may influence the ability to observe and record information). As a result of these issues, in the last decade there has been a growing desire to automate monitoring using new methods of inspection. The present paper has the goal of combining DL with traditional image processing to create a tool able to detect, locate, and measure structural defects.

  18. Affective processes in human-automation interactions.

    Science.gov (United States)

    Merritt, Stephanie M

    2011-08-01

    This study contributes to the literature on automation reliance by illuminating the influences of user moods and emotions on reliance on automated systems. Past work has focused predominantly on cognitive and attitudinal variables, such as perceived machine reliability and trust. However, recent work on human decision making suggests that affective variables (i.e., moods and emotions) are also important. Drawing from the affect infusion model, significant effects of affect are hypothesized. Furthermore, a new affectively laden attitude termed liking is introduced. Participants watched video clips selected to induce positive or negative moods, then interacted with a fictitious automated system on an X-ray screening task. At five time points, important variables were assessed, including trust, liking, perceived machine accuracy, user self-perceived accuracy, and reliance. These variables, along with propensity to trust machines and state affect, were integrated in a structural equation model. Happiness significantly increased trust and liking for the system throughout the task. Liking was the only variable that significantly predicted reliance early in the task. Trust predicted reliance later in the task, whereas perceived machine accuracy and user self-perceived accuracy had no significant direct effects on reliance at any time. Affective influences on automation reliance are demonstrated, suggesting that this decision-making process may be less rational and more emotional than previously acknowledged. Liking for a new system may be key to appropriate reliance, particularly early in the task. Positive affect can be easily induced and may be a lever for increasing liking.

  19. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    Science.gov (United States)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beamforming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks, or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time-series.
We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for
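The channel-QC idea can be approximated with a much simpler stand-in than the full principal-component analysis the abstract describes: compare each channel against the array "beam" (the channel average) and flag outliers that do not share the array-wide structure. The correlation threshold and waveforms below are invented for illustration:

```python
from statistics import mean

def corr(x, y):
    """Pearson correlation between two equal-length channels."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def flag_bad_channels(channels, threshold=0.3):
    """Flag channels poorly correlated with the array beam - a simplified
    stand-in for the subspace-dimensionality test described above."""
    beam = [mean(col) for col in zip(*channels)]
    return [i for i, ch in enumerate(channels) if abs(corr(ch, beam)) < threshold]

signal = [0, 1, 3, 2, 0, -2, -3, -1, 0, 1]
good = [[s * g for s in signal] for g in (1.0, 0.9, 1.1)]   # coherent channels
dead = [[0.5, -0.5] * 5]                                    # malfunctioning channel
print(flag_bad_channels(good + dead))  # -> [3]
```

A malfunctioning channel like this inflates the apparent dimensionality of the array data, which is exactly the anomaly signature the subspace approach monitors before beamforming.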

  20. AUTOMATION OF CONTROL OF THE BUSINESS PROCESS OF PUBLISHING SCIENTIFIC JOURNALS

    Directory of Open Access Journals (Sweden)

    O. Yu. Sakaliuk

    2016-09-01

    Full Text Available We consider the automation of the business process of publishing scientific journals. The paper describes the publishing activities of the Odessa National Academy of Food Technologies (ONAFT) and the automation of its business processes. A set of business process models for publishing scientific journals was developed. The organizational structure of the ONAFT Coordinating Centre of Scientific Journals' Publishing was analyzed and modeled. Process models were simulated in the eEPC and BPMN notations. Database design, creation of the file structure, and development of the AIS interface were also carried out, including interaction with a webcam. Based on the feasibility justification of the software development and the performance results presented in a radar (petal) chart, it is safe to say that the automated mode is much more efficient than the manual one. The developed software will accelerate the development of ONAFT's scientific periodicals, which in turn will improve the Academy's ratings at the global level and enhance its image and credibility.

  1. Test/score/report: Simulation techniques for automating the test process

    Science.gov (United States)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system, which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repeatable execution of specific test procedures for a ground data system become simpler tasks. Ideally, complete automation would mean plugging the operational deliverable into the simulator, pressing the start button, executing the test procedure, accumulating and analyzing the data, scoring the results, and reporting them to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc.; most tests are accomplished with a certain degree of automation and with labor-intensive test procedures. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. It displays statistics on the success of POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing, and it provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and is currently being used to test some POCC/MOC software deliveries.
When this capability is fully operational it should greatly reduce the time necessary

  2. Automation Processes and Blockchain Systems

    OpenAIRE

    Hegadekatti, Kartik

    2017-01-01

    Blockchain Systems and Ubiquitous computing are changing the way we do business and lead our lives. One of the most important applications of Blockchain technology is in automation processes and Internet-of-Things (IoT). Machines have so far been limited in ability primarily because they have restricted capacity to exchange value. Any monetary exchange of value has to be supervised by humans or human-based centralised ledgers. Blockchain technology changes all that. It allows machines to have...

  3. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells ... system performance by monitoring in real time the cell concentration and viability of yeast extracted directly from an in-house-made bioreactor. This is the first demonstration of using the Dean drag force, generated by the implementation of a curved microchannel geometry in conjunction with high ... flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for administering air bubbles from the bioreactor in the microfluidic system, so ... and thereby ensure optimal cell production by prolonging the fermentation cycle and increasing the bioreactor output.

  4. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed worldwide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  5. A Comparative Experimental Study on the Use of Machine Learning Approaches for Automated Valve Monitoring Based on Acoustic Emission Parameters

    Science.gov (United States)

    Ali, Salah M.; Hui, K. H.; Hee, L. M.; Salman Leong, M.; Al-Obaidi, M. A.; Ali, Y. H.; Abdelrhman, Ahmed M.

    2018-03-01

    Acoustic emission (AE) analysis has become a vital tool for initiating maintenance tasks in many industries. However, the analysis process and its interpretation have been found to be highly dependent on experts. An automated monitoring method is therefore required to reduce the cost and time consumed in interpreting the AE signal. This paper investigates the application of two of the most common machine learning approaches, namely the artificial neural network (ANN) and the support vector machine (SVM), to automate the diagnosis of valve faults in a reciprocating compressor based on AE signal parameters. Since accuracy is an essential factor in any automated diagnostic system, this paper also provides a comparative study based on the predictive performance of the ANN and the SVM. AE parameter data were acquired from a single-stage reciprocating air compressor with different operational and valve conditions. ANN and SVM diagnosis models were subsequently devised by combining AE parameters of the different conditions. Results demonstrate that the ANN and SVM models achieve the same prediction accuracy. However, the SVM model is recommended for automating the diagnosis of valve condition owing to its ability to handle a large number of input features with small training data sets.
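The classification step such a system performs can be sketched as follows. The AE feature names, class means, and data are purely illustrative stand-ins (the paper's actual parameters and measurements are not given in the abstract), and scikit-learn's SVC is used as the SVM implementation:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Synthetic AE parameters per event: [RMS, peak amplitude, count rate] --
# invented feature set standing in for the paper's AE signal parameters.
healthy = rng.normal([0.2, 1.0, 50], [0.05, 0.2, 10], size=(200, 3))
faulty = rng.normal([0.6, 2.5, 120], [0.10, 0.4, 20], size=(200, 3))
X = np.vstack([healthy, faulty])
y = np.r_[np.zeros(200), np.ones(200)]            # 0 = healthy valve, 1 = faulty

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(Xtr, ytr)  # SVM diagnosis model
acc = clf.score(Xte, yte)
print(f"valve-condition accuracy: {acc:.2f}")
```

With real AE data the features would of course need calibration and scaling, but the pipeline shape (features in, condition label out) is the one the abstract describes.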

  6. Automation of data processing | G | African Journal of Range and ...

    African Journals Online (AJOL)

    Data processing can be time-consuming when experiments with advanced designs are employed. This, coupled with a shortage of research workers, necessitates automation. It is suggested that with automation the first step is to determine how the data must be analysed. The second step is to determine what programmes ...

  7. Problems of collaborative work of the automated process control system (APCS) and its information security, and their solutions.

    Science.gov (United States)

    Arakelyan, E. K.; Andryushin, A. V.; Mezin, S. V.; Kosoy, A. A.; Kalinina, Ya V.; Khokhlov, I. S.

    2017-11-01

    A principle for the interaction of the technological protection systems of the automated process control system (APCS) with the information security system in the case of incorrect execution of a technological protection algorithm is proposed: the correctness of the operation of each technological protection is checked in every specific situation using the functional relationship between the monitored parameters. A methodology for assessing the economic feasibility of developing and implementing an information security system is also presented.

  8. Automated inundation monitoring using TerraSAR-X multitemporal imagery

    Science.gov (United States)

    Gebhardt, S.; Huth, J.; Wehrmann, T.; Schettler, I.; Künzer, C.; Schmidt, M.; Dech, S.

    2009-04-01

    The Mekong Delta in Vietnam offers natural resources for several million inhabitants. However, a strong population increase, changing climatic conditions, and regulatory measures at the upper reaches of the Mekong lead to severe changes in the Delta. Extreme flood events occur more frequently, drinking water availability is increasingly limited, soils show signs of salinization or acidification, and species and complete habitats diminish. During the monsoon season the river regularly overflows its banks in the lower Mekong area, usually with beneficial effects. However, extreme flood events causing extensive damage occur more frequently; on average once every 6 to 10 years, river flood levels exceed the critical beneficial level. X-band SAR data are well suited for deriving inundated surface areas. The TerraSAR-X sensor, with its different scanning modes, allows for the derivation of spatially and temporally highly resolved inundation masks. The paper presents an automated procedure for deriving inundated areas from TerraSAR-X ScanSAR and Stripmap image data. Within the framework of the German-Vietnamese WISDOM project, focusing on the Mekong Delta region in Vietnam, images have been acquired covering the flood season from June 2008 to November 2008. Based on these images, a time series of so-called water masks showing inundated areas has been derived. The product is required as an intermediate to (i) calibrate 2D inundation model scenarios, (ii) estimate the extent of affected areas, and (iii) analyze the scope of prior crises. The image processing approach is based on the assumption that water surfaces scatter the radar signal forward, so that little backscatter returns to the sensor. It uses multiple grey-level thresholds and image morphological operations, and performs well in terms of automation, accuracy, robustness, and processing time.
The resulting water masks show the seasonal flooding pattern, with inundations starting in July and having their peak at the end
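The threshold-plus-morphology step can be sketched on synthetic data as below; the backscatter values in dB, the single threshold, and the 3x3 structuring element are illustrative assumptions, not the paper's calibrated multi-threshold parameters:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
# Synthetic backscatter image (dB): land ~ -8 dB, a water body ~ -18 dB
img = rng.normal(-8.0, 1.5, size=(100, 100))
img[40:70, 20:80] = rng.normal(-18.0, 1.5, size=(30, 60))  # inundated patch
img[5, 5] = -25.0                                          # isolated speckle outlier

# Water forward-scatters the radar signal, so it appears dark: threshold in dB,
# then a morphological opening removes isolated speckle false alarms.
water = img < -13.0
water = ndimage.binary_opening(water, structure=np.ones((3, 3)))

print("water fraction: %.2f" % water.mean())
```

A production chain would add the multiple grey-level thresholds and further morphological cleaning the abstract mentions, but the dark-pixel assumption is the same.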

  9. Research progress of laser welding process dynamic monitoring technology based on plasma characteristics signal

    Directory of Open Access Journals (Sweden)

    Teng WANG

    2017-02-01

    Full Text Available During the high-power laser welding process, plasmas are induced by the evaporation of metal under laser radiation, which can affect the coupling of laser energy into the workpiece and thus directly impact the reliability of laser welding quality and the process itself. Research on laser-induced plasma is a focus of the high-power deep-penetration welding field and provides a promising avenue for automating quality inspection of the welding process. In recent years, research on dynamic monitoring of the laser welding process based on plasma characteristics has proceeded mainly in two directions: the detection of plasma signals and the modeling of the laser welding process. This paper introduces the laser-induced plasma in laser welding, analyzes the related research at home and abroad on dynamic monitoring technology based on plasma characteristics, summarizes the current problems in the field, and puts forward the future development trend.

  10. Automated radon-thoron monitoring for earthquake prediction research

    International Nuclear Information System (INIS)

    Shapiro, M.H.; Melvin, J.D.; Copping, N.A.; Tombrello, T.A.; Whitcomb, J.H.

    1980-01-01

    This paper describes an automated instrument for earthquake prediction research which monitors the emission of radon ( 222 Rn) and thoron ( 220 Rn) from rock. The instrument uses aerosol filtration techniques and beta counting to determine radon and thoron levels. Data from the first year of operation of a field prototype suggest an annual cycle in the radon level at the site which is related to thermoelastic strains in the crust. Two anomalous increases in the radon level of short duration have been observed during the first year of operation. One anomaly appears to have been a precursor for a nearby earthquake (2.8 magnitude, Richter scale), and the other may have been associated with changing hydrological conditions resulting from heavy rainfall

  11. An Automated 476 MHz RF Cavity Processing Facility at SLAC

    CERN Document Server

    McIntosh, P; Schwarz, H

    2003-01-01

    The 476 MHz accelerating cavities currently used at SLAC are those installed on the PEP-II B-Factory collider accelerator. They are designed to operate at a maximum accelerating voltage of 1 MV and are routinely utilized on PEP-II at voltages up to 750 kV. During the summer of 2003, SPEAR3 will undergo a substantial upgrade, part of which will be to replace the existing 358.54 MHz RF system with essentially a PEP-II high energy ring (HER) RF station operating at 476.3 MHz and 3.2 MV (or 800 kV/cavity). Prior to installation, cavity RF processing is required to prepare them for use. A dedicated high power test facility is employed at SLAC to provide the capability of conditioning each cavity up to the required accelerating voltage. An automated LabVIEW based interface controls and monitors various cavity and test stand parameters, increasing the RF fields accordingly such that stable operation is finally achieved. This paper describes the high power RF cavity processing facility, highlighting the features of t...

  12. Automated micro fluidic system for PCR applications in the monitoring of drinking water quality

    International Nuclear Information System (INIS)

    Soria Soria, E.; Yanez Amoros, A.; Murtula Corbi, R.; Catalan Cuenca, V.; Martin-Cisneros, C. S.; Ymbern, O.; Alonso-Chamorro, J.

    2009-01-01

    Microbiological laboratories show a growing interest in automated, simple, and user-friendly methodologies able to perform simultaneous analysis of a large number of samples. Analytical tools based on microfluidics could play an important role in this field. In this work, the development of an automated microfluidic system for PCR applications, aimed at the monitoring of drinking water quality, is presented. The device will be able to determine, simultaneously, fecal pollution indicators and water-transmitted pathogens. Furthermore, complemented with DNA pre-concentration and extraction modules, the device would represent a highly integrated solution for microbiological diagnostic laboratories. (Author) 13 refs.

  13. Introducing 2D barcode on TLD cards - a step towards automation in personnel monitoring

    International Nuclear Information System (INIS)

    Ajoy, K.C.; Dhanasekaran, A.; Annalakshmi, O; Rajagopal, V.; Santhanam, R.; Jose, M.T.

    2018-01-01

    As part of its personnel monitoring services, the TLD lab, RSD, IGCAR issues and receives large numbers of TLD cards every month for use by occupational workers belonging to various hot facilities at Kalpakkam. Considering that the work is manual, routine, labour-intensive, and prone to human error, introducing automation is necessary at the TLD lab as well as at the user facilities. This requires identifying the individual components of the TLD and embedding them with unique identification so that the system can accomplish the task. The paper discusses the automation related to the TLD cards.

  14. DEVELOPMENT OF AN AUTOMATED BATCH-PROCESS SOLAR ...

    African Journals Online (AJOL)

    One of the shortcomings of solar disinfection of water (SODIS) is the absence of a feedback mechanism indicating treatment completion. This work presents the development of an automated batch-process water disinfection system aimed at solving this challenge. Locally sourced materials in addition to an Arduinomicro ...

  15. Concurrent Pilot Instrument Monitoring in the Automated Multi-Crew Airline Cockpit.

    Science.gov (United States)

    Jarvis, Stephen R

    2017-12-01

    Pilot instrument monitoring has been described as "inadequate," "ineffective," and "insufficient" after multi-crew aircraft accidents. Regulators have called for improved instrument monitoring by flight crews, but scientific knowledge in the area is scarce. Research has tended to investigate the monitoring of individual pilots in the pilot-flying role; very little research has looked at crew monitoring, or at the monitoring-pilot role, despite it being half of the apparent problem. Eye-tracking data were collected from 17 properly constituted and current Boeing 737 crews operating in a full-motion simulator. Each crew flew four realistic flight segments, with pilots swapping between the pilot-flying and pilot-monitoring roles, with and without the autopilot engaged. Analysis was performed on the 375 maneuvering segments prior to localizer intercept. Autopilot engagement led to significantly less visual dwell time on the attitude director indicator (mean dwell falling from 212.8 s to 47.8 s for the flying pilot and from 58.5 s to 39.8 s for the monitoring pilot) and an associated increase on the horizontal situation indicator (from 18 s to 52.5 s and from 36.4 s to 50.5 s, respectively). The flying pilots' withdrawal of attention from the primary flight reference and increased attention to the primary navigational reference was paralleled rather than complemented by the monitoring pilot, suggesting that monitoring vulnerabilities can be duplicated in the flight deck. It is therefore possible that accident causes identified as "inadequate" or "insufficient" monitoring are in fact a result of parallel monitoring. Jarvis SR. Concurrent pilot instrument monitoring in the automated multi-crew airline cockpit. Aerosp Med Hum Perform. 2017; 88(12):1100-1106.

  16. Automated Internal Revenue Processing System: A Panacea For ...

    African Journals Online (AJOL)

    Automated Internal Revenue Processing System: A Panacea For Financial ... for the collection and management of internal revenue which is the financial ... them, computational errors, high level of redundancy and inconsistencies in record, ...

  17. Automated read-out of thermoluminescence dosemeters in a centralized individual monitoring service

    International Nuclear Information System (INIS)

    Toivonen, M.

    The organizational problems of maintaining a centralized individual monitoring service with erasable and re-usable dosemeters are evaluated, and design criteria for an automated thermoluminescence reader are laid down. A characteristic of the planned monitoring system is that the issuing of dosemeters can be arranged without keeping two dosemeters for each worker. A home-made reader designed to fulfil these criteria is presented. The use of a standard barcode and a standard optical barcode reader for the identification of dosemeters is described, as is a method of using a minicomputer to prepare the self-fastening identification labels, print mailing lists, and print results.

  18. An Automated Processing Method for Agglomeration Areas

    Directory of Open Access Journals (Sweden)

    Chengming Li

    2018-05-01

    Full Text Available Agglomeration operations are a core component of the automated generalization of aggregated area groups. However, because geographical elements that possess agglomeration features are relatively scarce, the current literature has not given sufficient attention to agglomeration operations, and most reports on the subject are limited to the general conceptual level. Consequently, current agglomeration methods rely heavily on subjective determinations and cannot support intelligent computer processing. This paper proposes an automated processing method for agglomeration areas. First, the proposed method automatically identifies agglomeration areas based on the width of the striped bridging area, the distribution pattern index (DPI), the shape similarity index (SSI), and the overlap index (OI). Next, the progressive agglomeration operation is carried out, including the computation of the external boundary outlines and the extraction of agglomeration lines. The effectiveness and rationality of the proposed method have been validated using actual census data on Chinese geographical conditions in Jiangsu Province.

  19. A modular, prospective, semi-automated drug safety monitoring system for use in a distributed data environment.

    Science.gov (United States)

    Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian

    2014-06-01

    The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. 
In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years
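The sequential alerting idea can be conveyed with a deliberately simplified sketch: at each monitoring look, cumulative event counts in the two comparison cohorts are turned into a rate ratio and checked against a signaling threshold. The counts, the excess risk, and the fixed threshold below are all invented for illustration; the actual system applies a formal sequential alerting algorithm with proper error control, which this toy loop does not implement.

```python
import numpy as np

rng = np.random.default_rng(3)
# Invented monthly adverse-event counts in two equal-size user cohorts:
# the comparator drug has a baseline rate, the monitored drug a 3x excess risk.
looks = 12
comparator = rng.poisson(10, size=looks)
monitored = rng.poisson(30, size=looks)

threshold = 2.0   # simplistic fixed signaling threshold on the cumulative rate ratio
alert_at = None
for i in range(1, looks + 1):
    # Continuity-corrected cumulative rate ratio at monitoring look i
    rr = (monitored[:i].sum() + 0.5) / (comparator[:i].sum() + 0.5)
    if rr >= threshold:
        alert_at = i
        break
print("signal raised at look:", alert_at)
```

The point of the sequential design is exactly this early stopping: a genuine excess risk triggers an alert after a few looks rather than at the end of a fixed study period.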

  20. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high-throughput automatic modules with the most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses, including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases.

  1. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
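As an illustration of the PCA data-reduction step that tools like Automics automate, the sketch below runs PCA (via SVD) on synthetic 1D spectra; the peak positions, group sizes, and noise level are invented, and real spectra would first pass through the phasing, baseline-correction, and binning stages such pipelines provide.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic 1D NMR spectra: 40 samples x 200 binned chemical-shift features
n, p = 40, 200
base = np.exp(-0.5 * ((np.arange(p) - 80) / 6.0) ** 2)   # metabolite peak shared by all
group = np.r_[np.zeros(20), np.ones(20)]                 # e.g. control vs. patient
marker = np.exp(-0.5 * ((np.arange(p) - 150) / 4.0) ** 2)
X = (np.outer(np.ones(n), base)
     + np.outer(group, 3.0 * marker)                     # group-specific marker peak
     + 0.05 * rng.normal(size=(n, p)))

# PCA by SVD of the mean-centred data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                                           # per-sample PC scores

# PC1 should separate the two groups along the marker-peak direction
sep = scores[group == 1, 0].mean() - scores[group == 0, 0].mean()
print("PC1 group separation:", round(abs(sep), 2))
```

The shared baseline peak vanishes under mean-centring, so the first component captures exactly the between-group metabolite difference, which is why PCA score plots are the standard first look in metabonomics.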

  2. Automated system of monitoring and positioning of functional units of mining technological machines for coal-mining enterprises

    Directory of Open Access Journals (Sweden)

    Meshcheryakov Yaroslav

    2018-01-01

    Full Text Available This article is devoted to the development of an automated monitoring and positioning system for the functional units of mining technological machines. It describes the structure, element base, and algorithms for identifying the operating states of a walking excavator; the various types of errors in the functioning of microelectromechanical gyroscopes and accelerometers; and methods for their correction based on the Madgwick fusion filter. The results of industrial tests of the automated monitoring and positioning system for functional units at one of the opencast coal mines of Kuzbass are presented. This work is addressed to specialists working in the development of embedded systems and control systems, radio electronics, mechatronics, and robotics.
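The sensor-fusion principle behind such filters can be conveyed with a far simpler one-axis complementary filter; this is a stand-in for illustration only, not Madgwick's quaternion gradient-descent algorithm, and the sample rate, bias, and gain are invented. The gyro integral tracks fast motion while the gravity-referenced accelerometer angle bounds the drift.

```python
import math

def complementary_tilt(gyro, accel, dt=0.01, alpha=0.98):
    """Fuse a gyro rate stream (rad/s) and accelerometer samples (ax, az)
    into a single tilt-angle estimate with a complementary filter."""
    angle = 0.0
    for w, (ax, az) in zip(gyro, accel):
        accel_angle = math.atan2(ax, az)                 # gravity-referenced tilt
        angle = alpha * (angle + w * dt) + (1 - alpha) * accel_angle
    return angle

# Static sensor tilted 0.2 rad; the gyro reports a constant 0.05 rad/s bias
n = 5000
true_tilt = 0.2
gyro = [0.05] * n                                        # pure drift, no real motion
accel = [(math.sin(true_tilt), math.cos(true_tilt))] * n
est = complementary_tilt(gyro, accel)
print(f"estimated tilt: {est:.4f} rad (true {true_tilt} rad)")
```

Note that the steady-state estimate retains a small offset proportional to the uncorrected gyro bias; removing exactly this class of MEMS gyro/accelerometer error is what the correction methods in the article target.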

  3. Expanding the functional significance of automated control systems for the production process at hydroelectric plants

    International Nuclear Information System (INIS)

    Vasil'ev, Yu.S.; Kononova, M.Yu.

    1993-01-01

    Automated control systems for the production process (ACS PP) have been successfully implemented in a number of hydroelectric plants in the Russian Federation. The circle of problems that can be solved using ACS PP can be conditionally divided into two classes: on-line/technological control and production-technological control. This article describes successes and future directions in the solution of these two classes of problems. From the discussion, it is concluded (1) that the databases of existing ACS PP at hydroelectric plants can be successfully employed as points for monitoring the conservation of the local environment; (2) that it is expedient to discuss the problem with organizations, including local control groups, interested in the development of territorial-basin systems for ecological monitoring; and (3) that the initiative in creating local territorial-basin support points for monitoring should emanate from guidelines for hydroelectric plants with ACS PP. 3 refs., 2 figs

  4. Automation of the micro-arc oxidation process

    Science.gov (United States)

    Golubkov, P. E.; Pecherskaya, E. A.; Karpanin, O. V.; Shepeleva, Y. V.; Zinchenko, T. O.; Artamonov, D. V.

    2017-11-01

    At present, significantly increased interest in micro-arc oxidation (MAO) encourages scientists to look for a solution to the problem of controllability of this technological process. To solve this problem, an automated MAO installation was developed; its structure and control principles are presented in this article. The device allows the controlled synthesis of MAO coatings and the identification of MAO process patterns, which contributes to the commercialization of this technology.

  5. INFORMATION SYSTEM OF AUTOMATION OF PREPARATION EDUCATIONAL PROCESS DOCUMENTS

    Directory of Open Access Journals (Sweden)

    V. A. Matyushenko

    2016-01-01

    Full Text Available Information technology is rapidly conquering the world, permeating all spheres of human activity, and education is no exception. An important direction in the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all types of institutional activity. The purpose of this paper is the development of a system that automates the preparation of accounting documents for the educational process. The article describes the problem of preparing educational-process documents. The information system was designed and created in the Microsoft Access environment. The result is four types of reports obtained using the developed system, whose use now allows the process to be automated and reduces the effort required to prepare accounting documents. All reports are implemented in Microsoft Excel and can be used for further analysis and processing.

  6. Marketing automation processes as a way to improve contemporary marketing of a company

    Directory of Open Access Journals (Sweden)

    Witold Świeczak

    2013-09-01

    Full Text Available The main aim of this article is to identify the possibilities that the processes included in a marketing automation system give to contemporary companies. This publication deals with the key aspects of the issue: it shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation, and it defines the factors and processes that influence an effective course of actions taken as part of marketing automation. The concept of marketing automation is a completely new reality: it gives up communication based on mass distribution of uniform content in favor of truly personalized, individual, and fully automated communication. It is a new kind of coexistence, in which the sales and marketing departments cooperate closely to achieve the best result, and a situation in which marketing can definitively confirm its contribution to the income generated by the company. Marketing automation also means huge analytical possibilities and a real increase in a company's value: the value added generated by the system as the source of information about clients and about all marketing and sales processes taking place in a company. The introduction of a marketing automation system alters not only the current functioning of a marketing department but also the marketers themselves. In fact, everything that a marketing automation system provides, primarily its accumulated unique knowledge of the client, is a critical marketing asset of every modern enterprise.

  7. The feasibility of automated online flow cytometry for in-situ monitoring of microbial dynamics in aquatic ecosystems

    Science.gov (United States)

    Besmer, Michael D.; Weissbrodt, David G.; Kratochvil, Bradley E.; Sigrist, Jürg A.; Weyland, Mathias S.; Hammes, Frederik

    2014-01-01

    Fluorescent staining coupled with flow cytometry (FCM) is often used for the monitoring, quantification and characterization of bacteria in engineered and environmental aquatic ecosystems including seawater, freshwater, drinking water, wastewater, and industrial bioreactors. However, infrequent grab sampling hampers accurate characterization and subsequent understanding of microbial dynamics in all of these ecosystems. A logical technological progression is high-throughput and full automation of the sampling, staining, measurement, and data analysis steps. Here we assess the feasibility and applicability of automated FCM by means of actual data sets produced with prototype instrumentation. As proof-of-concept we demonstrate examples of microbial dynamics in (i) flowing tap water from a municipal drinking water supply network and (ii) river water from a small creek subject to two rainfall events. In both cases, automated measurements were done at 15-min intervals during 12–14 consecutive days, yielding more than 1000 individual data points for each ecosystem. The extensive data sets derived from the automated measurements allowed for the establishment of baseline data for each ecosystem, as well as for the recognition of daily variations and specific events that would most likely be missed (or mischaracterized) by infrequent sampling. In addition, the online FCM data from the river water was combined and correlated with online measurements of abiotic parameters, showing considerable potential for a better understanding of cause-and-effect relationships in aquatic ecosystems. Although several challenges remain, the successful operation of an automated online FCM system and the basic interpretation of the resulting data sets represent a breakthrough toward the eventual establishment of fully automated online microbiological monitoring technologies. PMID:24917858
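
    The prototype instrumentation is not described at code level; a minimal sketch of how event recognition against a baseline might work on such 15-min interval data, with an invented detection rule (rolling-median baseline and a fixed exceedance factor):

```python
from statistics import median

def flag_events(counts, window=8, factor=1.5):
    """Flag measurements that deviate from a rolling-median baseline.

    counts: cell concentrations from successive (e.g. 15-min) FCM runs.
    A point is flagged when it exceeds `factor` times the median of the
    preceding `window` measurements (the local baseline).
    """
    events = []
    for i in range(window, len(counts)):
        baseline = median(counts[i - window:i])
        if counts[i] > factor * baseline:
            events.append(i)
    return events

# Stable baseline around 100 with one transient spike, as after a rain event.
series = [100, 102, 98, 101, 99, 103, 100, 97, 250, 104, 101]
```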

  8. Modern business process automation : YAWL and its support environment

    NARCIS (Netherlands)

    Hofstede, ter A.H.M.; Aalst, van der W.M.P.; Adams, M.; Russell, N.C.

    2010-01-01

    This book provides a comprehensive treatment of the field of Business Process Management (BPM) with a focus on Business Process Automation. It achieves this by covering a wide range of topics, both introductory and advanced, illustrated through and grounded in the YAWL (Yet Another Workflow Language) system.

  9. Integrated safeguards and security for a highly automated process

    International Nuclear Information System (INIS)

    Zack, N.R.; Hunteman, W.J.; Jaeger, C.D.

    1993-01-01

    Before the cancellation of the New Production Reactor Programs for the production of tritium, the reactors and associated processing were being designed to contain some of the most highly automated and remote systems conceived for a Department of Energy facility. Integrating safety, security, materials control and accountability (MC and A), and process systems at the proposed facilities would enhance the overall information and protection-in-depth available. Remote, automated fuel handling and assembly/disassembly techniques would deny access to the nuclear materials while upholding ALARA principles but would also require the full integration of all data/information systems. Such systems would greatly enhance MC and A as well as facilitate materials tracking. Physical protection systems would be connected with materials control features to cross check activities and help detect and resolve anomalies. This paper will discuss the results of a study of the safeguards and security benefits achieved from a highly automated and integrated remote nuclear facility and the impacts that such systems have on safeguards and computer and information security

  10. Automated system for acquisition and image processing for the control and monitoring boned nopal

    Science.gov (United States)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of a system for image acquisition and processing to control the removal of thorns from the nopal vegetable (Opuntia ficus indica) in an automated machine that uses pulses from an Nd:YAG laser. The areolas, the areas of the bark where the thorns grow, are located by applying segmentation algorithms to the images obtained by a CCD. Once the positions of the areolas are known, their coordinates are sent to a motor system that steers the laser to interact with every areola and remove the thorns of the nopal. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs the acquisition, preprocessing, segmentation, recognition and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal
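
    The paper's segmentation algorithms are not given; a toy sketch of one step they imply (labeling connected regions in a thresholded image and producing the coordinate table for the motor system), using plain 4-connected flood fill:

```python
def areola_centroids(mask):
    """Locate areolas in a binary image via 4-connected component labeling.

    mask: 2D list of 0/1 (1 = candidate areola pixel after thresholding).
    Returns one (row, col) centroid per connected component -- the kind of
    coordinate table that would be sent to the galvo motor system.
    """
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:          # iterative depth-first flood fill
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                n = len(pixels)
                centroids.append((sum(p[0] for p in pixels) / n,
                                  sum(p[1] for p in pixels) / n))
    return centroids

grid = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 1],
]
```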

  11. Long-term monitoring of soil gas fluxes with closed chambers using automated and manual systems

    Energy Technology Data Exchange (ETDEWEB)

    Scott, A.; Crichton, I.; Ball, B.C.

    1999-10-01

    The authors describe two gas sample collection techniques, each of which is used in conjunction with custom made automated or manually operated closed chambers. The automated system allows automatic collection of gas samples for simultaneous analysis of multiple trace gas efflux from soils, permitting long-term monitoring. Since the manual system is cheaper to produce, it can be replicated more extensively than the automated one and used to estimate the spatial variability of soil fluxes. The automated chamber covers a soil area of 0.5 m{sup 2} and has a motor driven lid that remains operational throughout a range of weather conditions. Both systems use gas-tight containers of robust metal construction, which give good sample retention, thereby allowing long-term storage and convenience of transport from remote locations. The containers in the automated system are filled by pumping gas from the closed chamber via a multiway rotary valve. Stored samples from both systems are analyzed simultaneously for N{sub 2}O and CO{sub 2} using automated injection into laboratory-based gas chromatographs. The use of both collection systems is illustrated by results from a field experiment on sewage sludge disposal to land where N{sub 2}O fluxes were high. The automated gas sampling system permitted quantification of the marked temporal variability of concurrent N{sub 2}O and CO{sub 2} fluxes and allowed improved estimation of cumulative fluxes. The automated measurement approach yielded higher estimates of cumulative flux because integration of manual point-in-time observations missed a number of transient high-flux events.
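
    The cumulative-flux estimation can be illustrated with trapezoidal integration; the numbers below are invented, but reproduce the reported effect that sparse manual sampling misses transient high-flux events:

```python
def cumulative_flux(times_h, fluxes):
    """Trapezoidal integration of point-in-time flux measurements.

    times_h: sampling times in hours; fluxes: flux at each time
    (e.g. mg N2O m-2 h-1). Returns cumulative flux over the whole period.
    """
    total = 0.0
    for i in range(1, len(times_h)):
        total += 0.5 * (fluxes[i] + fluxes[i - 1]) * (times_h[i] - times_h[i - 1])
    return total

# Hourly automated record containing a 2-h emission burst ...
auto_t = list(range(9))                 # 0..8 h
auto_f = [1, 1, 10, 10, 1, 1, 1, 1, 1]
# ... versus sparse manual sampling every 4 h that happens to miss the burst.
manual_t = [0, 4, 8]
manual_f = [1, 1, 1]
# The automated estimate is much larger than the manual one here.
```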

  12. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMs) can provide continuous, unattended recording of the reactor's fueling activity for later, qualitative review by a safeguards inspector. A quantitative analysis of this collected data could prove to be a great asset to inspectors because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype for an automated software analysis system capable of identifying when fuel bundle pushes occurred and of monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and for predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for the automated monitoring of on-load reactors, significantly reducing time and effort
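
    The record gives no algorithmic detail; a deliberately simple stand-in for push identification (rising threshold crossings on the detector count rate; the threshold and data are invented, not the paper's neural-network approach):

```python
def detect_pushes(count_rate, threshold):
    """Identify fuel-bundle pushes as rising crossings of a count-rate threshold.

    Returns the indices at which the detector count rate rises above
    `threshold` -- a toy substitute for the qualitative review an inspector
    would otherwise perform on the CDM record.
    """
    pushes = []
    for i in range(1, len(count_rate)):
        if count_rate[i] >= threshold > count_rate[i - 1]:
            pushes.append(i)
    return pushes

# Background around 5 counts with two discharge transients.
record = [5, 6, 5, 40, 42, 6, 5, 5, 38, 7, 5]
```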

  13. An Automated Sample Processing System for Planetary Exploration

    Science.gov (United States)

    Soto, Juancarlos; Lasnik, James; Roark, Shane; Beegle, Luther

    2012-01-01

    An Automated Sample Processing System (ASPS) for wet chemistry processing of organic materials on the surface of Mars has been jointly developed by Ball Aerospace and the Jet Propulsion Laboratory. The mechanism has been built and tested to demonstrate TRL level 4. This paper describes the function of the system, mechanism design, lessons learned, and several challenges that were overcome.

  14. Monitoring of transport contamination

    International Nuclear Information System (INIS)

    Turkin, N.F.

    1980-01-01

    The organization of the monitoring of transport contamination is considered. Particularly thorough monitoring is recommended during loading and unloading operations. Monitoring is performed when vehicles leave the loading-unloading site and the controlled zone, and prior to preventive examination, technical service or repair. A method for monitoring the contamination of motor transport with high-energy β-emitters by means of a special stand, which permits automation of the monitoring process, is described [ru

  15. AUTOMATION OF CHAMPAGNE WINES PROCESS IN SPARKLING WINE PRESSURE TANK

    OpenAIRE

    E. V. Lukyanchuk; V. A. Khobin; V. A. Khobin

    2016-01-01

    The wine industry has by now successfully automated grape receiving points, crushing and pressing departments, continuous-fermentation installations, blend tanks, production lines for ordinary Madeira, continuously operating plants for ethyl alcohol, installations for champagne wine in continuous flow, etc. With the development of automation, the productivity of the winemaking process develops in the following areas: organization of complex a...

  16. Automated full matrix capture for industrial processes

    Science.gov (United States)

    Brown, Roy H.; Pierce, S. Gareth; Collison, Ian; Dutton, Ben; Dziewierz, Jerzy; Jackson, Joseph; Lardner, Timothy; MacLeod, Charles; Morozov, Maxim

    2015-03-01

    Full matrix capture (FMC) ultrasound can be used to generate a permanent re-focusable record of data describing the geometry of a part; a valuable asset for an inspection process. FMC is a desirable acquisition mode for automated scanning of complex geometries, as it allows compensation for surface shape in post processing and application of the total focusing method. However, automating the delivery of such FMC inspection remains a significant challenge for real industrial processes due to the high data overhead associated with the ultrasonic acquisition. The benefits of NDE delivery using six-axis industrial robots are well established when considering complex inspection geometries, but such an approach brings additional challenges to scanning speed and positional accuracy when combined with FMC inspection. This study outlines steps taken to optimize the scanning speed and data management of a process to scan the diffusion bonded membrane of a titanium test plate. A system combining a KUKA robotic arm and a reconfigurable FMC phased array controller is presented. The speed and data implications of different scanning methods are compared, and the impacts on data visualization quality are discussed with reference to this study. For the 0.5 m2 sample considered, typical acquisitions of 18 TB/m2 were measured for a triple back wall FMC acquisition, illustrating the challenge of combining high data throughput with acceptable scanning speeds.
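
    The TB-per-square-metre scale of FMC can be sanity-checked from first principles: one frame stores an A-scan per transmit-receive element pair. A back-of-the-envelope sketch with illustrative parameters (not the paper's actual settings):

```python
def fmc_data_rate(elements, samples_per_ascan, bytes_per_sample, positions_per_m2):
    """Rough FMC data volume per square metre of scanned surface.

    One FMC frame stores an A-scan for every transmit-receive element pair
    (elements**2 A-scans). Multiplying by scan-position density gives the
    storage burden that makes automated FMC delivery challenging.
    """
    frame_bytes = elements ** 2 * samples_per_ascan * bytes_per_sample
    return frame_bytes * positions_per_m2

# Illustrative numbers: a 64-element array, 4000 samples per A-scan at
# 2 bytes each, and 500 x 500 scan positions per square metre.
per_m2 = fmc_data_rate(64, 4000, 2, 500 * 500)
per_m2_tb = per_m2 / 1e12   # same order of magnitude as the reported 18 TB/m2
```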

  17. Automated Processing Workflow for Ambient Seismic Recordings

    Science.gov (United States)

    Girard, A. J.; Shragge, J.

    2017-12-01

    Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface wave energy. In a number of situations body-wave energy has been extracted successfully; however, (nearly) all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly to non-correlated ambient data that has been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) and enhance body-wave arrivals. Following this approach, though, requires significant preprocessing of data sets that often reach terabytes in size, which is expensive and must be automated to be feasible. In this work we outline an automated processing workflow designed to optimize body-wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong-amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates, and the emergence of reflection events, in the cross-correlation-plus-stack waveforms over hour-long windows. Overall, the QC analyses suggest that
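
    The window-selection step described in the abstract can be sketched as a simple RMS-based burst rejection; the window length and rejection factor below are invented, not the authors' values:

```python
import math

def keep_quiet_windows(trace, window, factor=3.0):
    """Window selection for ambient data: drop windows contaminated by bursts.

    Splits `trace` into fixed-length windows, computes each window's RMS
    amplitude, and keeps only windows whose RMS is below `factor` times the
    median RMS -- a simple version of automatic burst rejection.
    """
    n = len(trace) // window
    rms = []
    for k in range(n):
        seg = trace[k * window:(k + 1) * window]
        rms.append(math.sqrt(sum(x * x for x in seg) / window))
    ordered = sorted(rms)
    med = ordered[n // 2] if n % 2 else 0.5 * (ordered[n // 2 - 1] + ordered[n // 2])
    return [k for k in range(n) if rms[k] < factor * med]

# Four windows of quiet noise and one window with a high-energy burst.
trace = [1, -1, 1, -1] * 2 + [50, -50, 50, -50] + [1, -1, 1, -1] * 2
```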

  18. AUTOMATING THE DATA SECURITY PROCESS

    Directory of Open Access Journals (Sweden)

    Florin Ogigau-Neamtiu

    2017-11-01

    Full Text Available Contemporary organizations face big data security challenges in the cyber environment due to modern threats and the current business working model, which relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. Today's data classification and data obfuscation selection processes (encryption, masking or tokenization) suffer because of the human involvement in the process. Organizations need to strengthen the data security domain by classifying information based on its importance, conducting risk assessment plans and using the most cost-effective data obfuscation technique. The paper proposes a new model for data protection that uses automated machine decision-making procedures to classify data and to select the appropriate data obfuscation technique. The proposed system uses natural language processing capabilities to analyze input data and to select the best course of action. The system has capabilities to learn from previous experience, thus improving itself and reducing the risk of wrong data classification.
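
    The paper's NLP classifier is not specified; a toy sketch of the overall decision flow it describes (classify, then pick an obfuscation technique from a policy table), with invented keywords and policy:

```python
def classify(text):
    """Toy keyword-based classifier standing in for the paper's NLP component."""
    text = text.lower()
    if any(k in text for k in ("password", "secret", "ssn")):
        return "restricted"
    if any(k in text for k in ("salary", "contract", "invoice")):
        return "internal"
    return "public"

# Invented policy table: classification level -> obfuscation technique.
POLICY = {"restricted": "encryption", "internal": "tokenization", "public": "none"}

def select_obfuscation(text):
    """Automated selection of the data obfuscation technique."""
    return POLICY[classify(text)]
```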

  19. Near-infrared spectroscopy monitoring and control of the fluidized bed granulation and coating processes-A review.

    Science.gov (United States)

    Liu, Ronghua; Li, Lian; Yin, Wenping; Xu, Dongbo; Zang, Hengchang

    2017-09-15

    The fluidized bed granulation and pellet coating technologies are widely used in the pharmaceutical industry because the particles made in a fluidized bed have good flowability and compressibility, and the coating thickness of pellets is homogeneous. With the popularization of process analytical technology (PAT), real-time analysis of critical quality attributes (CQA) is receiving more attention. Near-infrared (NIR) spectroscopy, as a PAT tool, can realize real-time monitoring and control during the granulating and coating processes, which can optimize the manufacturing processes. This article reviews the application of NIR spectroscopy to CQA (moisture content, particle size and tablet/pellet thickness) monitoring during fluidized bed granulation and coating processes. Through this review, we would like to provide references for realizing automated control and intelligent production in the fluidized bed granulation and pellet coating of the pharmaceutical industry. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.

    Science.gov (United States)

    Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H

    2009-01-01

    Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.

  1. A method for automated processing of measurement information during mechanical drilling

    Energy Technology Data Exchange (ETDEWEB)

    Samonenko, V.I.; Belinkov, V.G.; Romanova, L.A.

    1984-01-01

    An algorithm is presented for a method developed for the automated processing of measurement information during mechanical drilling. Its use under the operating conditions of an automated drilling control system (ASU) will make it possible to precisely identify changes in the lithology, in the physical-mechanical and abrasive properties, and in the stratum (pore) pressure of the rock being drilled during mechanical drilling, which, along with other methods for testing the drilling process, will increase the reliability of the decisions made.

  2. Healthcare Blockchain System Using Smart Contracts for Secure Automated Remote Patient Monitoring.

    Science.gov (United States)

    Griggs, Kristen N; Ossipova, Olya; Kohlios, Christopher P; Baccarini, Alessandro N; Howson, Emily A; Hayajneh, Thaier

    2018-06-06

    As Internet of Things (IoT) devices and other remote patient monitoring systems increase in popularity, security concerns about the transfer and logging of data transactions arise. In order to handle the protected health information (PHI) generated by these devices, we propose utilizing blockchain-based smart contracts to facilitate secure analysis and management of medical sensors. Using a private blockchain based on the Ethereum protocol, we created a system where the sensors communicate with a smart device that calls smart contracts and writes records of all events on the blockchain. This smart contract system would support real-time patient monitoring and medical interventions by sending notifications to patients and medical professionals, while also maintaining a secure record of who has initiated these activities. This would resolve many security vulnerabilities associated with remote patient monitoring and automate the delivery of notifications to all involved parties in a HIPAA compliant manner.
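
    The Ethereum smart-contract code is not reproduced in the record; the core integrity property it relies on (each record linked to the hash of its predecessor, so tampering is detectable) can be sketched in plain Python with hashlib:

```python
import hashlib
import json

def append_event(chain, event):
    """Append a monitoring event, linking it to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    """Recompute every link; any tampered record breaks the chain."""
    prev = "0" * 64
    for rec in chain:
        body = {"event": rec["event"], "prev": rec["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != digest:
            return False
        prev = rec["hash"]
    return True

# Invented sensor events standing in for the PHI records written on-chain.
log = []
append_event(log, {"sensor": "heart_rate", "value": 62})
append_event(log, {"sensor": "heart_rate", "value": 140, "alert": "tachycardia"})
```

    A real deployment would of course run this logic inside an Ethereum smart contract rather than a local list, with notifications sent to patients and providers on each append.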

  3. Vibration based structural health monitoring of an arch bridge: From automated OMA to damage detection

    Science.gov (United States)

    Magalhães, F.; Cunha, A.; Caetano, E.

    2012-04-01

    In order to evaluate the usefulness of approaches based on modal parameter tracking for the structural health monitoring of bridges, a dynamic monitoring system was installed in September 2007 in a concrete arch bridge in the city of Porto, Portugal. The implementation of algorithms to perform continuous on-line identification of modal parameters based on structural responses to ambient excitation (automated Operational Modal Analysis) has made it possible to create a very complete database of the time evolution of the bridge's modal characteristics over more than 2 years. This paper describes the strategy that was followed to minimize the effects of environmental and operational factors on the bridge's natural frequencies, enabling, in a subsequent stage, the identification of structural anomalies. Alternative static and dynamic regression models are tested and complemented by a Principal Components Analysis. Afterwards, the identification of damage is attempted with control charts. In the end, it is demonstrated that the adopted processing methodology permits the detection of realistic damage scenarios, associated with frequency shifts of around 0.2%, which were simulated with a numerical model.
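
    The regression-plus-control-chart pipeline can be sketched end to end. The temperature model, data, and 3-sigma rule below are invented simplifications of the paper's static regression models, used only to show how a ~0.2% frequency shift becomes detectable once the environmental effect is removed:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept (frequency vs. temperature)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def control_chart_alarms(temp, freq, limit=3.0):
    """Remove the temperature effect by regression, then flag residuals
    outside +/- limit * sigma (a Shewhart-style control chart)."""
    slope, intercept = fit_line(temp, freq)
    resid = [f - (slope * t + intercept) for t, f in zip(temp, freq)]
    n = len(resid)
    mean = sum(resid) / n
    sigma = (sum((r - mean) ** 2 for r in resid) / n) ** 0.5
    return [i for i, r in enumerate(resid) if abs(r - mean) > limit * sigma]

# Natural frequency (Hz) that tracks temperature, plus one simulated damage
# event: a ~0.2% frequency drop at sample 7.
temp = [5, 10, 15, 20, 25] * 3
freq = [0.770 + 0.001 * t for t in temp]
freq[7] -= 0.002 * freq[7]
```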

  4. Cluster processing business level monitor

    International Nuclear Information System (INIS)

    Muniz, Francisco J.

    2017-01-01

    This article describes a Cluster Processing Monitor. Several applications with this functionality can be found freely through a Google search. However, those applications may offer more features than are needed in the Processing Monitor being proposed, making their output difficult for the user to understand at a glance. In addition, such monitors may add unnecessary processing cost to the Cluster. For these reasons, a completely new Cluster Processing Monitor module was designed and implemented. In the CDTN, Clusters are broadly used, mainly in deterministic methods (CFD) and non-deterministic methods (Monte Carlo). (author)

  5. Cluster processing business level monitor

    Energy Technology Data Exchange (ETDEWEB)

    Muniz, Francisco J., E-mail: muniz@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    This article describes a Cluster Processing Monitor. Several applications with this functionality can be found freely through a Google search. However, those applications may offer more features than are needed in the Processing Monitor being proposed, making their output difficult for the user to understand at a glance. In addition, such monitors may add unnecessary processing cost to the Cluster. For these reasons, a completely new Cluster Processing Monitor module was designed and implemented. In the CDTN, Clusters are broadly used, mainly in deterministic methods (CFD) and non-deterministic methods (Monte Carlo). (author)

  6. The automated data processing architecture for the GPI Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce

    2017-09-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  7. Automated processing for proton spectroscopic imaging using water reference deconvolution.

    Science.gov (United States)

    Maudsley, A A; Wu, Z; Meyerhoff, D J; Weiner, M W

    1994-06-01

    Automated formation of MR spectroscopic images (MRSI) is necessary before routine application of these methods is possible for in vivo studies; however, this task is complicated by the presence of spatially dependent instrumental distortions and the complex nature of the MR spectrum. A data processing method is presented for completely automated formation of in vivo proton spectroscopic images, and applied for analysis of human brain metabolites. This procedure uses the water reference deconvolution method (G. A. Morris, J. Magn. Reson. 80, 547(1988)) to correct for line shape distortions caused by instrumental and sample characteristics, followed by parametric spectral analysis. Results for automated image formation were found to compare favorably with operator dependent spectral integration methods. While the water reference deconvolution processing was found to provide good correction of spatially dependent resonance frequency shifts, it was found to be susceptible to errors for correction of line shape distortions. These occur due to differences between the water reference and the metabolite distributions.

  8. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    Science.gov (United States)

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for the analyses of coagulation. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet deficient plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
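
    Passing-Bablok regression itself is somewhat involved; a sketch using the closely related Theil-Sen estimator (median of pairwise slopes, without Passing-Bablok's shift correction) conveys the idea of the method comparison, on invented paired data:

```python
from itertools import combinations
from statistics import median

def robust_slope_intercept(x, y):
    """Theil-Sen estimator: median of pairwise slopes, then median intercept.

    A robust relative of the Passing-Bablok regression used in the paper to
    compare centrifugation protocols (Passing-Bablok additionally applies a
    shift correction to the slope median, omitted here).
    """
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2) if x[j] != x[i]]
    slope = median(slopes)
    intercept = median(b - slope * a for a, b in zip(x, y))
    return slope, intercept

# Invented paired results: manual 1500 g protocol (x) vs automated
# 3000 g / 7 min protocol (y) for the same specimens.
manual = [10, 20, 30, 40, 50, 60]
auto_ = [9, 21, 29, 41, 52, 59]
slope, intercept = robust_slope_intercept(manual, auto_)
# Close agreement between the two protocols shows as slope ~ 1, intercept ~ 0.
```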

  9. Automated processing of data on the use of motor vehicles in the Serbian Armed Forces

    Directory of Open Access Journals (Sweden)

    Nikola S. Osmokrović

    2012-10-01

    Full Text Available The main aim of introducing information technology into the armed forces is the automation of the management process. The management in movement and transport (M&T in our armed forces has been included in the process of automation from the beginning. For that reason, today we can speak about the automated processing of data on road traffic safety and on the use of motor vehicles. With regard to the overall development of the information system of the movement and transport service, the paper presents an information system of the M&T service for the processing of data on the use of motor vehicles. The main features, components and functions of the 'Vozila' application, which was specially developed for the automated processing of data on motor vehicle use, are explained in particular.

  10. Automation of the aircraft design process

    Science.gov (United States)

    Heldenfels, R. R.

    1974-01-01

    The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.

  11. Development of a fully automated network system for long-term health-care monitoring at home.

    Science.gov (United States)

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.

  12. A Multi-Scale Flood Monitoring System Based on Fully Automatic MODIS and TerraSAR-X Processing Chains

    Directory of Open Access Journals (Sweden)

    Enrico Stein

    2013-10-01

    Full Text Available A two-component fully automated flood monitoring system is described and evaluated. It is the result of combining two individual flood services that are currently under development at the Center for Satellite based Crisis Information (ZKI) of DLR (German Aerospace Center) to rapidly support disaster management activities. A first-phase monitoring component of the system systematically detects potential flood events on a continental scale using daily-acquired medium spatial resolution optical data from the Moderate Resolution Imaging Spectroradiometer (MODIS). A threshold set controls the activation of the second-phase crisis component of the system, which derives flood information at higher spatial detail using a Synthetic Aperture Radar (SAR) based satellite mission (TerraSAR-X). The proposed activation procedure finds use in the identification of flood situations at different spatial resolutions and in the time-critical, on-demand programming of SAR satellite acquisitions at an early stage of an evolving flood situation. The automated processing chains of the MODIS Flood Service (MFS) and the TerraSAR-X Flood Service (TFS) include data pre-processing, the computation and adaptation of global auxiliary data, thematic classification, and the subsequent dissemination of flood maps using an interactive web-client. The system is operationally demonstrated and evaluated via the monitoring of two recent flood events in Russia (2013) and Albania/Montenegro (2013).
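
    The threshold set that activates the second-phase component is not detailed in the record; a minimal sketch of what such an activation rule could look like (an invented flooded-pixel-fraction threshold applied to the MODIS water classification):

```python
def trigger_sar_tasking(flood_mask, threshold=0.05):
    """First-phase activation rule: request a TerraSAR-X acquisition when the
    fraction of MODIS pixels classified as water exceeds `threshold`.

    flood_mask: 2D list of 0/1 MODIS water-classification results.
    Returns (activate, flooded_fraction).
    """
    pixels = [p for row in flood_mask for p in row]
    fraction = sum(pixels) / len(pixels)
    return fraction > threshold, fraction

# Invented classification tiles: a dry scene and a partially flooded one.
normal = [[0, 0, 0, 0]] * 5
flooded = [[0, 1, 1, 0],
           [1, 1, 1, 0],
           [0, 0, 0, 0],
           [0, 0, 0, 0],
           [0, 0, 0, 0]]
```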

  13. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    Science.gov (United States)

    2018-01-01

    ARL-TR-8270 ● JAN 2018 ● US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom. Reporting period: 1 October 2016–30 September 2017.

  14. Automated Intelligent Monitoring and the Controlling Software System for Solar Panels

    Science.gov (United States)

    Nalamwar, H. S.; Ivanov, M. A.; Baidali, S. A.

    2017-01-01

    Inspection of solar panels on a periodic basis is important to improve longevity and ensure performance of the solar system. Getting the most out of the solar potential of a photovoltaic (PV) system is possible through an intelligent monitoring and controlling system. Such systems have rapidly grown in popularity because of their user-friendly graphical interfaces for data acquisition, monitoring, controlling and measurement. To monitor the performance of renewable energy sources such as solar PV, data-acquisition systems have been used to collect all the data regarding the installed system. In this paper the development of a smart automated monitoring and controlling system for solar panels is described; the core idea is based on the IoT (Internet of Things). Measurements are made using sensors, block management data acquisition modules, and a software system. All real-time electrical output parameters of the PV plant, such as voltage, current and generated electricity, are then displayed and stored in the block management system. The proposed system is smart enough to make suggestions if a panel is not working properly, to display errors, to send maintenance reminders through email or SMS, and to rotate panels according to the sun's position using an ephemeris table stored in the system. The advantage of the system is that the performance of the solar panels can be monitored and analyzed.
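The "suggest if the panel is not working properly" logic can be sketched as a comparison of measured output against an expected value. The function name, tolerance, and alert text are illustrative assumptions, not the authors' implementation.

```python
# Toy sketch of an underperformance check for a PV monitoring system.
# Names, thresholds, and messages are assumptions for illustration.

def check_panel(measured_w, expected_w, tolerance=0.2):
    """Return an alert string if measured output falls more than
    `tolerance` (as a fraction) below the expected output, else None."""
    if expected_w <= 0:
        return None  # no insolation expected (e.g. at night)
    deficit = (expected_w - measured_w) / expected_w
    if deficit > tolerance:
        return f"panel underperforming: {deficit:.0%} below expected"
    return None

print(check_panel(measured_w=120.0, expected_w=200.0))  # alert: 40% below
print(check_panel(measured_w=190.0, expected_w=200.0))  # None (within tolerance)
```

In a real deployment the expected output would come from irradiance sensors or the ephemeris-driven sun-position model mentioned in the abstract, and the alert string would be routed to the email/SMS notification channel.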

  15. Automated selected reaction monitoring software for accurate label-free protein quantification.

    Science.gov (United States)

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted or when high flexibility in peptide selection is desired. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  16. Automated radiological monitoring at a Russian Ministry of Defence Naval Site

    International Nuclear Information System (INIS)

    Moskowitz, P.D.; Pomerville, J.; Gavrilov, S.; Kisselev, V.; Daniylan, V.; Belikov, A.; Egorkin, A.; Sokolovski, Y.; Endregard, M.; Krosshavn, M.; Sundling, C.V.; Yokstad, H.

    2001-01-01

    The Arctic Military Environmental Cooperation (AMEC) Program is a cooperative effort between the military establishments of the Kingdom of Norway, the Russian Federation, and the US. This paper discusses joint activities conducted over the past year among Norwegian, Russian, and US technical experts on a project to develop, demonstrate and implement automated radiological monitoring at Russian Navy facilities engaged in the dismantlement of nuclear-powered strategic ballistic missile launching submarines. Radiological monitoring is needed at these facilities to help protect workers engaged in the dismantlement program and the public living within the footprint of routine and accidental radiation exposure areas. By providing remote stand-alone monitoring, the Russian Navy will achieve added protection through the defense-in-depth strategy afforded by local (at the site), regional (Kola) and national-level (Moscow) oversight. The system being implemented at the Polyaminsky Russian Naval Shipyard was developed from a working model tested at the Russian Institute for Nuclear Safety, Moscow, Russia. It includes Russian-manufactured terrestrial and underwater gamma detectors, smart controllers for graded sampling, radio modems for offsite transmission of the data, and a data fusion/display system. The data fusion/display system is derived from the Norwegian Picasso AMEC Environmental Monitoring software package. This package allows monitoring personnel to review the real-time and historical status of monitoring at specific sites and objects, and to establish new monitoring protocols as required, for example, in an off-normal accident situation. Plans are being developed to implement this system at most RF Naval sites handling spent nuclear fuel.

  17. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book covers methods for building an automation plan and designing automation facilities; automation of chip-producing processes, including the basics of cutting, NC machining and chip handling; automation units such as drilling, tapping, boring, milling and slide units; applications of hydraulics, covering characteristics and basic hydraulic circuits; applications of pneumatics; and kinds of automation and their application to processes such as assembly, transportation, automatic machines and factory automation.

  18. Automated multivariate analysis of multi-sensor data submitted online: Real-time environmental monitoring.

    Science.gov (United States)

    Eide, Ingvar; Westad, Frank

    2018-01-01

    A pilot study demonstrating real-time environmental monitoring with automated multivariate analysis of multi-sensor data submitted online has been performed at the cabled LoVe Ocean Observatory, located at 258 m depth, 20 km off the coast of Lofoten-Vesterålen, Norway. The major purpose was efficient monitoring of many variables simultaneously and early detection of changes and time trends in the overall response pattern before changes were evident in individual variables. The pilot study was performed with 12 sensors from May 16 to August 31, 2015. The sensors provided data for chlorophyll, turbidity, conductivity, temperature (three sensors), salinity (calculated from temperature and conductivity), biomass at three depth intervals (5-50, 50-120, 120-250 m), and current speed measured in two directions (east and north) using two sensors covering different depths with overlap. A total of 88 variables were monitored, 78 of them from the two current-speed sensors. The time resolution varied, so the data had to be aligned to a common time resolution. After alignment, the data were interpreted using principal component analysis (PCA). Initially, a calibration model was established using data from May 16 to July 31. The current-speed data from the two sensors were subjected to two separate PCA models, and the score vectors from these two models were combined with the other 10 variables in a multi-block PCA model. The observations from August were projected onto the calibration model consecutively, one at a time, and the result was visualized in a score plot. Automated PCA of multi-sensor data submitted online is illustrated with an attached time-lapse video covering the relatively short time period used in the pilot study. Methods for statistical validation, warning limits and alarm limits are described. Redundant sensors enable sensor diagnostics and quality assurance. In a future perspective, the concept may be used in integrated environmental monitoring.
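The calibrate-then-project workflow above can be sketched in a deliberately simplified per-variable form: fit location and spread on a calibration window, then flag later observations against warning and alarm limits. This stands in for, and is much cruder than, the multi-block PCA score projection the study actually used; all names and limit multipliers are assumptions.

```python
# Simplified stand-in for calibration-model monitoring: per-variable
# z-score limits instead of multivariate PCA scores. Illustrative only.
import statistics

def fit_limits(calibration, warn=2.0, alarm=3.0):
    """Fit mean and sample standard deviation on a calibration window."""
    return {
        "mu": statistics.mean(calibration),
        "sd": statistics.stdev(calibration),  # sample (n-1) stdev
        "warn": warn,
        "alarm": alarm,
    }

def classify(model, x):
    """Project a new observation and classify it against the limits."""
    z = abs(x - model["mu"]) / model["sd"]
    if z >= model["alarm"]:
        return "alarm"
    if z >= model["warn"]:
        return "warning"
    return "normal"

model = fit_limits([10.0, 10.4, 9.8, 10.2, 9.6, 10.0])
print(classify(model, 10.1))  # normal
print(classify(model, 11.2))  # alarm
```

The multivariate version has the advantage that a small coordinated shift across many variables can trip the limits before any single variable looks abnormal, which is exactly the early-detection goal stated in the abstract.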

  19. Development of an automated batch-process solar water disinfection

    African Journals Online (AJOL)

    This work presents the development of an automated batch-process water disinfection system ... Locally sourced materials in addition to an Arduino microprocessor were used to control ..... As already mentioned in section 3.1.1, a statistical.

  20. AIRSAR Automated Web-based Data Processing and Distribution System

    Science.gov (United States)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  1. A WEB-BASED SOLUTION TO VISUALIZE OPERATIONAL MONITORING LINUX CLUSTER FOR THE PROTODUNE DATA QUALITY MONITORING CLUSTER

    CERN Document Server

    Mosesane, Badisa

    2017-01-01

    The Neutrino computing cluster, made of 300 Dell PowerEdge 1950 U1 nodes, plays an integral role in the CERN Neutrino Platform (CENF). It represents an effort to foster fundamental research in the field of neutrino physics by providing a data processing facility. The need for data quality monitoring, coupled with automated system configuration and remote monitoring of the cluster, cannot be overemphasized. To achieve these goals, a software stack has been chosen to implement automatic propagation of configurations across all the nodes in the cluster. The bulk of this report discusses the automated configuration management system on this cluster, which enables fast online data processing and the Data Quality Monitoring (DQM) process for the Neutrino Platform cluster (npcmp.cern.ch).

  2. BiomaSoft: data processing system for monitoring and evaluating food and energy production. Part I

    International Nuclear Information System (INIS)

    Quevedo, J. R.; Suárez, J.

    2015-01-01

    The integrated food and energy production in Cuba demands processing diverse and voluminous information to make local, sectoral and national decisions and to influence public policies, for which the support of automated systems that facilitate the monitoring and evaluation (M&E) of integrated food and energy production in Cuban municipalities is necessary. The objective of this research was to identify the tools for the design of the data processing system BiomaSoft and to contextualize its application environment. The software development methodology was RUP (Rational Unified Process), with UML (Unified Modeling Language) as the modeling language and PHP (Hypertext Pre-Processor) as the programming language. The environment was conceptualized through a domain model, and the functional and non-functional requirements to be fulfilled, as well as the use case diagram of the system with the description of actors, were specified. For the deployment of BiomaSoft, a configuration based on two types of physical nodes (a web server and client computers) was conceived for the municipalities that participate in the project «Biomass as renewable energy source for Cuban rural areas» (BIOMAS-CUBA). It is concluded that the monitoring and evaluation of integrated food and energy production under Cuban conditions can be carried out through the automated system BiomaSoft, and that the identification of tools for its design and the contextualization of its application environment contribute to this purpose. (author)

  3. Analysis of Real Time Technical Data Obtained While Shotcreting: An Approach Towards Automation

    OpenAIRE

    Rodríguez, Ángel; Río, Olga

    2010-01-01

    Automation of the shotcreting process is a key factor in improving working conditions, increasing productivity, and increasing the quality of shotcrete. Confidence in the quality of the automation process itself and of shotcrete linings can be improved by real-time monitoring of pumping as well as other shotcreting-machine-related parameters. Prediction of how the different technical parameters of application govern the whole process is the subject of increasing ...

  4. Regimes of data output from an automated scanning system into a computer

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Shaislamov, P.T.

    1984-01-01

    A method is described for implementing a rather complex algorithm for transmitting various coordinate and service data from different automated scanning system devices to a monitoring computer in an automated system for processing bubble chamber images. The adopted data output algorithm and the equipment developed for it enable data transmission both as separate words and as word arrays.

  5. Design and development on automated control system of coated fuel particle fabrication process

    International Nuclear Information System (INIS)

    Liu Malin; Shao Youlin; Liu Bing

    2013-01-01

    With the trend towards large-scale production of HTR coated fuel particles, the original manual control system can no longer meet the requirements, and an industrial-grade automated control system for coated fuel particle fabrication needs to be developed. A comprehensive analysis of the successive four-layer coating process for TRISO-type coated fuel particles was carried out. It was found that the coating process could be divided into five subsystems and nine operating states. The establishment of a DCS-type (distributed control system) automation control system was proposed. According to the rigorous requirements of the preparation process for coated particles, the design considerations for the DCS were proposed, including the principles of coordinated control, safety and reliability, integration specification, practicality and ease of use, and openness and ease of updating. A complete automated control system for the coated fuel particle preparation process was manufactured on the basis of these principles in manufacturing practice. The automated control system was put into operation in the production of irradiated samples for the HTRPM demonstration project. The experimental results prove that the system achieves better control of the coated fuel particle preparation process and meets the requirements of factory-scale production. (authors)

  6. Automated processing of X-ray images in medicine

    International Nuclear Information System (INIS)

    Babij, Ya.S.; B'yalyuk, Ya.O.; Yanovich, I.A.; Lysenko, A.V.

    1991-01-01

    Theoretical and practical achievements in the application of computing technology to the processing of X-ray images in medicine are generalized. A scheme of the main directions and tasks in the processing of X-ray images is given and analyzed. The principal problems arising in automated processing of X-ray images are identified. It is shown that, for the interpretation of X-ray images, it is expedient to introduce the notion of the relative operating characteristic (ROC) of a roentgenologist. Every point on the ROC curve determines the individual criterion of the roentgenologist for making a positive diagnosis in a given situation.

  7. Automation of the DoD Export License Application Review Process

    National Research Council Canada - National Science Library

    Young, Shelton

    2002-01-01

    .... The overall audit objective was to determine whether Federal automation programs supporting the export license and review process could be used to establish a common electronic interface creating...

  8. Quality Control in Automated Manufacturing Processes – Combined Features for Image Processing

    Directory of Open Access Journals (Sweden)

    B. Kuhlenkötter

    2006-01-01

    Full Text Available The use of image processing systems is widespread in production processes, and hardware solutions and cameras are available for nearly every application. One important challenge for image processing systems is the development and selection of appropriate algorithms and software solutions in order to realise ambitious quality control for production processes. This article characterises the development of innovative software that combines features for automatic defect classification on product surfaces. The artificial intelligence method Support Vector Machine (SVM) is used to execute the classification task according to the combined features. This software is one crucial element for the automation of a manually operated production process.

  9. A method for the automated long-term monitoring of three-spined stickleback Gasterosteus aculeatus shoal dynamics.

    Science.gov (United States)

    Kleinhappel, T K; Al-Zoubi, A; Al-Diri, B; Burman, O; Dickinson, P; John, L; Wilkinson, A; Pike, T W

    2014-04-01

    This paper describes and evaluates a flexible, non-invasive tagging system for the automated identification and long-term monitoring of individual three-spined sticklebacks Gasterosteus aculeatus. The system is based on barcoded tags, which can be reliably and robustly detected and decoded to provide information on an individual's identity and location. Because large numbers of fish can be individually tagged, it can be used to monitor individual- and group-level dynamics within fish shoals. © 2014 The Fisheries Society of the British Isles.
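The decode step of a barcode-tag identification system of this kind can be sketched as reading a short binary code with an error-detecting check bit, yielding an individual's identity or rejecting the read. The 8-bit layout and even-parity scheme below are illustrative assumptions, not the actual tag design used in the study.

```python
# Toy decoder for a binary ID tag with an even-parity check bit.
# Bit layout and parity scheme are illustrative assumptions.

def decode_tag(bits):
    """bits: list of 0/1 where the last bit is an even-parity check
    over the ID bits. Returns the integer ID, or None for a
    corrupted read."""
    *id_bits, parity = bits
    if sum(id_bits) % 2 != parity:
        return None  # read failed the parity check
    ident = 0
    for b in id_bits:
        ident = (ident << 1) | b  # accumulate most-significant-bit first
    return ident

print(decode_tag([0, 0, 0, 0, 0, 1, 0, 1, 0]))  # 5
print(decode_tag([0, 0, 0, 0, 0, 1, 1, 1, 0]))  # None (corrupted)
```

Rejecting ambiguous reads rather than guessing matters for long-term monitoring: a mis-decoded tag would silently attribute one fish's trajectory to another individual.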

  10. A New Device to Automate the Monitoring of Critical Patients’ Urine Output

    Directory of Open Access Journals (Sweden)

    Abraham Otero

    2014-01-01

    Full Text Available Urine output (UO) is usually measured manually each hour in acutely ill patients, a task that consumes a substantial amount of time. Furthermore, there is evidence in the literature that more frequent (minute-by-minute) UO measurement could impact clinical decision making and improve patient outcomes. However, it is not feasible to take minute-by-minute UO measurements manually. A device capable of automatically monitoring UO could save precious healthcare staff time and improve patient outcomes through more precise and continuous monitoring of this parameter. This paper presents a device capable of automatically monitoring UO. It provides minute-by-minute measurements and can generate alarms that warn of deviations from therapeutic goals. It uses a capacitive sensor to measure the UO collected within a rigid container. When the container is full, it empties automatically without requiring any internal or external power supply or any intervention by the nursing staff. In vitro tests have been conducted to verify the proper operation and measurement accuracy of the device. These tests confirm the viability of the device for automating the monitoring of UO.
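The alarm logic described above can be sketched as a rolling comparison of the hourly UO rate, derived from minute-by-minute samples, against a therapeutic goal. The 0.5 mL/kg/h goal is a common clinical reference value; the function name and window handling are illustrative assumptions, not the device's firmware.

```python
# Sketch of a therapeutic-goal alarm on minute-by-minute urine output.
# Goal value, names, and windowing are illustrative assumptions.

def uo_alarm(minute_volumes_ml, weight_kg, goal_ml_kg_h=0.5):
    """Return True when the rate over the most recent hour of
    minute-by-minute volume samples falls below the goal."""
    window = minute_volumes_ml[-60:]           # last hour of samples
    hours = len(window) / 60.0
    rate = sum(window) / weight_kg / hours     # mL/kg/h
    return rate < goal_ml_kg_h

# 70 kg patient at 0.4 mL/min -> 24 mL/h -> ~0.34 mL/kg/h: alarm
print(uo_alarm([0.4] * 60, weight_kg=70.0))   # True
# 1.0 mL/min -> 60 mL/h -> ~0.86 mL/kg/h: no alarm
print(uo_alarm([1.0] * 60, weight_kg=70.0))   # False
```

A minute-resolution rate makes the deviation visible within minutes of onset, whereas the manual hourly reading described in the abstract can only detect it at the next rounding.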

  11. Marketing automation processes as a way to improve contemporary marketing of a company

    OpenAIRE

    Witold Świeczak

    2013-01-01

    The main aim of this article is to identify the possibilities given to contemporary companies by the use of processes included in a marketing automation system. This publication deals with the key aspects of this issue. It shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation. This article defines the factors and processes which influenc...

  12. Monitoring of operating processes

    International Nuclear Information System (INIS)

    Barry, R.F.

    1981-01-01

    Apparatus is described for monitoring the processes of a nuclear reactor to detect off-normal operation of any process and for testing the monitoring apparatus itself. The processes are evaluated through their parameters, such as temperature, pressure, etc. The apparatus includes a pair of monitoring paths, or signal-processing units. Each unit includes facilities for receiving, on a time-sharing basis, a status binary word made up of digits each indicating the status of a process (normal or off-normal), and test-signal binary words simulating the status binary words. The status words and test words are processed in succession during successive cycles. During each cycle, the two units receive the same status word and the same test word. The test words simulate the status words both when they indicate normal operation and when they indicate off-normal operation. Each signal-processing unit includes a pair of memories. Each memory receives a status word or a test word, as the case may be, and converts the received word into a converted status word or a converted test word. The memories of each monitoring unit feed a non-coincidence circuit, which signals when the converted word out of one memory of a signal-processing unit is not identical to the converted word from the other memory of the same unit.

  13. AUTOMATION OF TRACEABILITY PROCESS AT GRAIN TERMINAL LLC “ UKRTRANSAGRO"

    Directory of Open Access Journals (Sweden)

    F. A. TRISHYN

    2017-07-01

    Full Text Available A positive trend of growth in both grain production and export is indicated. In the current marketing year the export potential of the Ukrainian grain market is close to the record level. However, high positions in the rating of world exporters are achieved not only through high export potential, but also through higher quality and logistics. These factors depend directly on the quality of enterprise management and of all processes occurring within it. One promising way to develop an enterprise is to implement a traceability system and then automate the traceability process. European integration laws oblige Ukrainian enterprises to have a traceability system. Traceability is the ability to follow the movement of a feed or food through specified stages of production, processing and distribution. The traceability process is managed by people, which implies a human factor. Automation will, to a greater extent, exclude the human factor, which means fewer errors in documentation and a faster process of grain transshipment. Research work on the process was carried out at a highly modern grain terminal, LLC “UkrTransAgro”. The terminal is located in the Ukrainian water area of the Azov Sea (Mariupol, Ukraine). Characteristics of the terminal: capacity of simultaneous storage, 48,120 thousand tons; acceptance of crops from transport, 4,500 tons/day; acceptance of crops from railway transport, 3,000 tons/day; transshipment capacity, up to 1.2 million tons per year; shipment to sea vessels, 7,000 tons/day. An analysis of the automation level of the grain terminal was carried out. The company uses software from 1C, «1C: Enterprise 8. Accounting for grain elevator, mill, and feed mill for Ukraine». This software is used for quantitative and qualitative registration at the elevator in accordance with industry guidelines and standards. The software product has many

  14. Automated data processing architecture for the Gemini Planet Imager Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Maire, Jérôme; Marchis, Franck; Graham, James R.; Macintosh, Bruce; Ammons, S. Mark; Bailey, Vanessa P.; Barman, Travis S.; Bruzzone, Sebastian; Bulger, Joanna; Cotten, Tara; Doyon, René; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Goodsell, Stephen; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn M.; Larkin, James E.; Marley, Mark S.; Metchev, Stanimir; Nielsen, Eric L.; Oppenheimer, Rebecca; Palmer, David W.; Patience, Jennifer; Poyneer, Lisa A.; Pueyo, Laurent; Rajan, Abhijith; Rantakyrö, Fredrik T.; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane J.

    2018-01-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multiyear direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the Data Cruncher, combines multiple data reduction pipelines (DRPs) together to process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our DRPs. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  15. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing.

    Science.gov (United States)

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-02-15

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Unlike other pipeline installation methods, HAB requires a more precise and automated guidance system for use in practical projects. This paper proposes an economical, enhanced automated optical guidance system, based on optimization research on a light-emitting diode (LED) light target and five automated image-processing bore-path deviation algorithms. The LED light target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, direction location algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for computing and judging deflection. After multiple indoor experiments, the guidance system was applied in a hot-water pipeline installation project, with accuracy controlled to within 2 mm over a 48-m distance, providing accurate line and grade control and verifying the feasibility and reliability of the guidance system.
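The deviation computation at the core of such an optical guidance system can be sketched as: locate the light target's centroid in the image, then convert its pixel offset from the optical axis into a line/grade deviation. The centroid-of-bright-pixels approach and the mm-per-pixel scale below are generic stand-ins, not the paper's five MATLAB algorithms.

```python
# Toy bore-path deviation computation from an image of an LED target.
# Thresholds, image geometry, and scale factor are assumptions; a
# visible target (at least one bright pixel) is assumed.

def centroid(pixels, threshold=200):
    """Intensity-weighted centroid of pixels at or above `threshold`.
    `pixels` maps (x, y) -> intensity."""
    bright = {p: v for p, v in pixels.items() if v >= threshold}
    total = sum(bright.values())
    cx = sum(x * v for (x, y), v in bright.items()) / total
    cy = sum(y * v for (x, y), v in bright.items()) / total
    return cx, cy

def deviation_mm(pixels, center=(320.0, 240.0), mm_per_px=0.1):
    """Offset of the target centroid from the optical axis, in mm:
    (line deviation, grade deviation)."""
    cx, cy = centroid(pixels)
    return ((cx - center[0]) * mm_per_px, (cy - center[1]) * mm_per_px)

# LED spot slightly right of and below the optical axis:
spot = {(330, 245): 255, (331, 245): 255, (330, 246): 255, (331, 246): 255}
print(deviation_mm(spot))  # approximately (1.05, 0.55)
```

In practice the mm-per-pixel scale depends on the bore length and optics, which is one reason the paper pairs deflection detection with dedicated angle-measurement and auto-focus algorithms.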

  16. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  17. Development and implementation of an automatic integration system for fibre optic sensors in the braiding process with the objective of online-monitoring of composite structures

    Science.gov (United States)

    Hufenbach, W.; Gude, M.; Czulak, A.; Kretschmann, Martin

    2014-04-01

    Increasing economic, political and ecological pressure is leading to a steadily rising share of modern processing and manufacturing processes for fibre-reinforced polymers in industrial batch production. Component weights below the level achievable with classic construction materials, which lead to a reduced energy and cost balance over the product lifetime, justify the higher fabrication costs. However, complex quality control and failure prediction slow down substitution by composite materials. High-resolution fibre-optic sensors (FOS), due to their small diameter, high measuring-point density and simple handling, show high potential for automated sensor integration in manufacturing processes, and therefore for the online monitoring of composite products manufactured at industrial scale. Integrated sensors can be used to monitor manufacturing processes and part tests as well as the component structure during the product life cycle, which allows quality control during production and the optimization of individual manufacturing processes [1;2]. Furthermore, detailed failure analyses lead to an enhanced understanding of the failure processes appearing in composite materials. This leads to less scrap and to products of higher value and longer product life cycle, whereby costs, material and energy are saved. This work presents an automation approach for FOS integration in the braiding process. For this purpose a braiding wheel was supplemented with an appliance for automatic sensor application, which was used to manufacture preforms of high-pressure composite vessels with FOS networks integrated between the fibre layers. All subsequent manufacturing processes (vacuum infiltration, curing) and component tests (quasi-static pressure test, programmed delamination) were monitored with the help of the integrated sensor networks. Keywords: SHM, high-pressure composite vessel, braiding, automated sensor integration, pressure test, quality control, optic

  18. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of a physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
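The physical-model intuition above (humans cannot issue queries arbitrarily fast, and usually click results) can be sketched as a toy rule over per-session features. The feature names and cutoffs are illustrative assumptions, far simpler than the learned binary and multiclass classifiers the paper actually uses.

```python
# Toy rule-based stand-in for bot-traffic detection over session
# features. Feature names and cutoffs are illustrative assumptions.

def is_automated(session):
    """session: dict with 'queries' (count), 'duration_s' (seconds),
    and 'clicks' (result clicks). Returns True if the session looks
    machine-generated."""
    rate = session["queries"] / max(session["duration_s"], 1)
    click_ratio = session["clicks"] / max(session["queries"], 1)
    # Humans rarely sustain more than a query every two seconds,
    # and high-volume sessions with no clicks suggest scraping.
    return rate > 0.5 or (session["queries"] > 100 and click_ratio < 0.01)

print(is_automated({"queries": 600, "duration_s": 300, "clicks": 0}))  # True
print(is_automated({"queries": 8, "duration_s": 900, "clicks": 5}))    # False
```

Hard cutoffs like these are exactly what the paper's learned classifiers replace: trained models can weigh many weak signals jointly instead of relying on a few brittle thresholds.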

  19. Comparative analysis of automation of production process with industrial robots in Asia/Australia and Europe

    Directory of Open Access Journals (Sweden)

    I. Karabegović

    2017-01-01

    The term "INDUSTRY 4.0", or "fourth industrial revolution", was first introduced at the Hannover fair in 2011. It comes from the high-tech strategy of the German Federal Government that promotes the development of automation-computerization into complete smart automation, meaning the introduction of methods of self-automation, self-configuration, self-diagnosis and problem fixing, knowledge and intelligent decision-making. No automation, including smart automation, can be imagined without industrial robots. Along with the fourth industrial revolution, a "robotic revolution" is taking place in Japan. The robotic revolution refers to the development and research of robotic technology with the aim of using robots in all production processes and in real life, to be of service to people in daily life. With these facts in mind, an analysis was conducted of the representation of industrial robots in production processes on the two continents of Europe and Asia/Australia, together with research into whether industry is ready for the introduction of intelligent automation with the goal of establishing future smart factories. The paper presents the automation of production processes in Europe and Asia/Australia, with predictions for the future.

  20. Automating the Fireshed Assessment Process with ArcGIS

    Science.gov (United States)

    Alan Ager; Klaus Barber

    2006-01-01

    A library of macros was developed to automate the Fireshed process within ArcGIS. The macros link a number of vegetation simulation and wildfire behavior models (FVS, SVS, FARSITE, and FlamMap) with ESRI geodatabases, desktop software (Access, Excel), and ArcGIS. The macros provide for (1) an interactive linkage between digital imagery, vegetation data, FVS-FFE, and...

  1. Automated defect spatial signature analysis for semiconductor manufacturing process

    Science.gov (United States)

    Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed

    1999-01-01

    An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high level categories, classifying the categorized data contained in each high level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.
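
    A toy version of the categorize-then-classify idea can be sketched as follows; the grid binning and density threshold are simple stand-ins for the patented signature analysis, and the numbers are invented:

```python
# Minimal sketch of grouping wafer defect coordinates into spatial clusters
# as a stand-in for signature-event detection; the grid cell size and the
# density threshold are illustrative assumptions, not the patented method.

from collections import defaultdict

def signature_cells(defects, cell=10.0, min_count=3):
    """Bin (x, y) defect coordinates into a grid and return the cells whose
    defect count reaches min_count -- candidate 'signature events'."""
    grid = defaultdict(int)
    for x, y in defects:
        grid[(int(x // cell), int(y // cell))] += 1
    return {c: n for c, n in grid.items() if n >= min_count}

# A scratch-like line of defects plus two isolated random falls:
defects = [(1, 1), (2, 2), (3, 3), (4, 4), (55, 70), (90, 12)]
```

    Dense cells would then be matched against user-labeled signature classes (scratch, ring, cluster) and correlated with process conditions.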

  2. Automated CD-SEM metrology for efficient TD and HVM

    Science.gov (United States)

    Starikov, Alexander; Mulapudi, Satya P.

    2008-03-01

    CD-SEM is the metrology tool of choice for patterning process development and production process control. We can make these applications more efficient by extracting more information from each CD-SEM image. This enables direct monitors of key process parameters, such as lithography dose and focus, or predicting the outcome of processing, such as etched dimensions or electrical parameters. Automating CD-SEM recipes at the early stages of process development can accelerate technology characterization, segmentation of variance and process improvements. This leverages the engineering effort, reduces development costs and helps to manage the risks inherent in new technology. Automating CD-SEM for manufacturing enables efficient operations. Novel SEM Alarm Time Indicator (SATI) makes this task manageable. SATI pulls together data mining, trend charting of the key recipe and Operations (OPS) indicators, Pareto of OPS losses and inputs for root cause analysis. This approach proved natural to our FAB personnel. After minimal initial training, we applied new methods in 65nm FLASH manufacture. This resulted in significant lasting improvements of CD-SEM recipe robustness, portability and automation, increased CD-SEM capacity and MT productivity.
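
    The Pareto-of-losses step that SATI provides can be illustrated with a small sketch; the loss categories and minutes below are invented for illustration:

```python
# Minimal sketch of a Pareto ranking: order categorized operational (OPS)
# losses so the few categories causing most lost time surface first.
# Category names and minutes are invented, not from the paper.

def pareto(losses):
    """Return (category, minutes, cumulative_fraction) sorted by impact."""
    total = sum(losses.values())
    ranked = sorted(losses.items(), key=lambda kv: kv[1], reverse=True)
    out, cum = [], 0
    for cat, minutes in ranked:
        cum += minutes
        out.append((cat, minutes, cum / total))
    return out

losses = {"recipe failure": 120, "pattern recognition": 300,
          "stage error": 60, "operator assist": 20}
```

    The cumulative-fraction column is what makes the "vital few" categories visible for root cause analysis.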

  3. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    Science.gov (United States)

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinarily accepted graphical process control notation is provided, allowing process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning the real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modeling. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  4. Vision-Based Geo-Monitoring - A New Approach for an Automated System

    Science.gov (United States)

    Wagner, A.; Reiterer, A.; Wasmeier, P.; Rieke-Zapp, D.; Wunderlich, T.

    2012-04-01

    The necessity of monitoring geo-risk areas such as rock slides is growing due to the increasing probability of such events caused by environmental change. Geodetic deformation monitoring turns life with this threat into a calculable risk. An in-depth monitoring concept with modern measurement technologies allows the estimation of the hazard potential and the prediction of life-threatening situations. The movements can be monitored by sensors placed in the unstable slope area. In most cases, it is necessary to enter the regions at risk in order to place the sensors and maintain them. Using long-range monitoring systems (e.g. terrestrial laser scanners, total stations, ground-based synthetic aperture radar) allows this risk to be avoided. To close the gap between the existing low-resolution, medium-accuracy sensors and conventional (co-operative target-based) surveying methods, image-assisted total stations (IATS) are a promising solution. IATS offer the user (e.g. a metrology expert) an image capturing system (CCD/CMOS camera) in addition to 3D point measurements. The images of the telescope's visual field are projected onto the camera's chip. With appropriate calibration, these images are accurately geo-referenced and oriented, since the horizontal and vertical angles of rotation are continuously recorded. The oriented images can directly be used for direction measurements with no need for object control points or further photogrammetric orientation processes. IATS are able to provide high-density deformation fields with high accuracy (down to the mm range) in all three coordinate directions. Tests have shown that with suitable image processing a measurement precision of 0.05 pixel ± 0.04·σ is possible (which corresponds to 0.03 mgon ± 0.04·σ). These results have to be seen under the consideration that such measurements are image-based only; for measuring in 3D object space the precision of pointing has to be taken into account. IATS can be used in two different ways
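
    The pixel-to-angle relation quoted above (0.05 pixel corresponding to 0.03 mgon) can be reproduced with small-angle arithmetic. The focal length and pixel pitch below are assumed values chosen to match that ratio, not specifications of any particular instrument:

```python
# Minimal sketch: turning a lateral image offset measured by an IATS camera
# into a direction angle, using the telescope's recorded horizontal angle.
# Focal length and pixel pitch are assumed illustrative values.

import math

GON_PER_RAD = 200.0 / math.pi  # 1 gon = pi/200 rad

def pixel_to_gon(offset_px, pixel_pitch_mm=0.00236, focal_mm=250.0):
    """Small-angle conversion of a lateral pixel offset into gon."""
    return math.atan2(offset_px * pixel_pitch_mm, focal_mm) * GON_PER_RAD

def target_direction(hz_gon, offset_px):
    """Geo-referenced direction = telescope angle + image-measured offset."""
    return hz_gon + pixel_to_gon(offset_px)
```

    With these assumed optics, a 0.05 pixel offset maps to roughly 0.03 mgon, matching the precision figure in the abstract.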

  5. Analysis And Control System For Automated Welding

    Science.gov (United States)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

    Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.
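
    The feedback idea behind such a system can be sketched with a bare proportional control loop; the gain, setpoint, and the assumption that the measured quality tracks the adjusted parameter directly are all illustrative, not details of the NASA system:

```python
# Minimal sketch of feedback control: a proportional controller nudging a
# weld process parameter toward a quality setpoint each measurement cycle.
# Gains and setpoints are invented for illustration.

def control_step(current, measured, setpoint, gain=0.5):
    """Adjust the process parameter proportionally to the quality error."""
    return current + gain * (setpoint - measured)

# Simulate a process in which measured quality tracks the parameter directly.
param, setpoint = 100.0, 140.0
for _ in range(20):
    param = control_step(param, measured=param, setpoint=setpoint)
```

    Each cycle halves the remaining error, so the parameter converges geometrically on the setpoint.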

  6. System of automated processing of radionuclide investigations (SAPRI-01) in clinical practice

    International Nuclear Information System (INIS)

    Sivachenko, T.P.; Mechev, D.S.; Krupka, I.N.

    1988-01-01

    The authors described the results of clinical testing of SAPRI-01, a system designed for automated collection, storage and processing of data from radionuclide investigations. Examples were given of automated processing of RCG data and of the results of positive scintigraphy of tumors of different sites using 67Ga-citrate and 99mTc-pertechnetate in static and dynamic investigations. Shortcomings, and ways of updating the system during its serial production, were pointed out. The introduction of the system into clinical practice on a wide scale was shown to hold promise

  7. Development of Information Support of the Automated System for Monitoring the State of the Gas Transportation System’s Industrial Safety

    Directory of Open Access Journals (Sweden)

    Ruslan Skrynkovskyy

    2017-08-01

    The purpose of the article is to develop the information support of the automated system for monitoring the state of industrial safety of the gas transportation system within the framework of the safety management system, which will enable timely and objective detection of accident hazards (hazardous events) and the taking of the necessary specific measures to eliminate them and operate the gas transport system safely. It is proved that the basis of the information provision of the automated system for monitoring the state of the industrial safety of the gas transmission system is a methodology that includes the following basic procedures: identifying hazards; qualitative and quantitative assessment of emergencies; establishing unacceptable (unallowable) risks and entering them into the information base (register of unacceptable risks) of objects of the gas transportation system; comprehensive assessment and certification of the state of industrial safety of objects of the gas transportation system; and identification of effective, productive (efficient) risk management measures. The prospect of further research in this area is the development and implementation of an automated system for monitoring the state of industrial safety of the objects of the gas transmission system based on the results of the research (the information provision presented here).
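
    The register-building step in the methodology above can be sketched as a simple risk-matrix filter. The 1-5 likelihood/severity scale, the threshold, and the hazard names are assumptions for illustration only:

```python
# Minimal sketch of a register of unacceptable risks: score each identified
# hazard by likelihood x severity and keep those above a threshold. The
# scoring scale, threshold, and hazards are illustrative assumptions.

def build_risk_register(hazards, threshold=12):
    """hazards: list of (name, likelihood 1-5, severity 1-5).
    Return hazards whose risk score is unacceptable, worst first."""
    register = []
    for name, likelihood, severity in hazards:
        score = likelihood * severity
        if score >= threshold:
            register.append((name, score))
    return sorted(register, key=lambda r: r[1], reverse=True)

hazards = [("pipeline corrosion leak", 4, 5),
           ("compressor vibration", 3, 3),
           ("valve seal fatigue", 3, 4)]
```
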

  8. Automated electronic monitoring of circuit pressures during continuous renal replacement therapy: a technical report.

    Science.gov (United States)

    Zhang, Ling; Baldwin, Ian; Zhu, Guijun; Tanaka, Aiko; Bellomo, Rinaldo

    2015-03-01

    Automated electronic monitoring and analysis of circuit pressures during continuous renal replacement therapy (CRRT) has the potential to predict failure and allow intervention to optimise function. Current CRRT machines can measure and store pressure readings for downloading into databases and for analysis. We developed a procedure to obtain such data at intervals of 1 minute and analyse them using the Prismaflex CRRT machine, and we present an example of such analysis. We obtained pressure data at intervals of 1 minute in a patient with acute kidney injury and sepsis treated with continuous haemofiltration at 2 L/hour of ultrafiltration and a blood flow of 200 mL/minute. Data analysis identified progressive increases in transmembrane pressure (TMP) and prefilter pressure (PFP) from time 0 until clotting at 33 hours. TMP increased from 104 mmHg to 313 mmHg and PFP from 131 mmHg to 185 mmHg. Effluent pressure showed a progressive increase in the negative pressure applied to achieve ultrafiltration, from 0 mmHg to -168 mmHg. The inflection point for these changes was also identified. Blood pathway pressures for access and return remained unchanged throughout. Automated electronic monitoring of circuit pressure during CRRT is possible and provides useful information on the evolution of circuit clotting.
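
    Detecting the onset of such a pressure rise from 1-minute readings can be sketched with a sliding-window test. The window length, the rise threshold, and the synthetic trace are illustrative choices, not clinical recommendations:

```python
# Minimal sketch: flagging the onset of circuit clotting from 1-minute TMP
# readings by watching the rise over a sliding window. The window and
# threshold are illustrative, not derived from the study.

def clotting_alert(tmp_series, window=60, rise_mmhg=30):
    """Return the first index where TMP rose by >= rise_mmhg within the
    preceding `window` samples, or None if it never does."""
    for i in range(window, len(tmp_series)):
        if tmp_series[i] - tmp_series[i - window] >= rise_mmhg:
            return i
    return None

# Synthetic trace: stable ~104 mmHg for 10 hours, then a steep climb.
tmp = [104.0] * 600 + [104.0 + 2.0 * t for t in range(120)]
```
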

  9. Number crunchers : instrumentation sector treads path of selective automation

    International Nuclear Information System (INIS)

    Budd, G.

    2006-01-01

    Automation guided by adequate monitoring and control instrumentation is playing an increasingly important role in the oil and gas sector. This article presented an overview of new instruments in the Western Canada Sedimentary Basin (WCSB), where a niche market for instrumentation and automation has grown in tandem with increased drilling activities. Fluctuating prices in oil and gas have also meant that available production methods must be optimized in order to ensure bottom line profits. However, economies of production scale can exclude extensive monitoring techniques in coalbed methane (CBM) activities. Compressor stations are the site of most monitoring and control instrumentation in CBM activities. Compressor packages at the stations include an engine panel that monitors suction pressure and water temperature. Alarm points on all monitoring instrumentation can shut down operations or assist in slight adjustments to machinery to optimize production. In addition, acoustical flow meters are fitted to headers to identify drops in a station's overall volumetric flow of natural gas. Instrumentation at the stations monitors and controls boilers that heat glycol for the gas dehydration process through the use of a pneumatic control loop that communicates with the motor control centre. The system is capable of ensuring shut-downs in emergencies. The combination of automation and carbon dioxide (CO2) flooding has dramatically improved production in the Weyburn oilfield in southeastern Saskatchewan, where data transfer is now completed over Ethernet to a SCADA system communicating with 5 satellite sites and a central main plant. It was estimated that the life expectancy of the Weyburn oilfield has been extended by almost 25 years. It was concluded that when harnessed to other technologies and combined with a user-friendly interface, automation can make a huge difference in the production profile of a field. 2 figs

  10. An automated digital imaging system for environmental monitoring applications

    Science.gov (United States)

    Bogle, Rian; Velasco, Miguel; Vogel, John

    2013-01-01

    Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.

  11. Some considerations on automated image processing of pathline photographs

    International Nuclear Information System (INIS)

    Kobayashi, T.; Saga, T.; Segawa, S.

    1987-01-01

    It is presently shown that flow visualization velocity vectors can be automatically obtained from tracer particle photographs by means of an image processing system. The system involves automated gray level threshold selection during the digitization process and separation or erasure of the intersecting path lines, followed by use of the pathline picture in the identification process and an adjustment of the averaging area in the rearrangement process. Attention is given to the results obtained for two-dimensional flows past an airfoil cascade and around a circular cylinder. 7 references
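
    The automated gray-level threshold selection mentioned above can be illustrated with a between-class-variance criterion in the spirit of Otsu's method; the paper does not name its exact criterion, so this is a stand-in, and the pixel histogram is synthetic:

```python
# Minimal sketch of automated gray-level threshold selection (Otsu-style:
# maximize between-class variance). A stand-in for the paper's unnamed
# criterion; the bimodal pixel sample below is synthetic.

def otsu_threshold(pixels, levels=256):
    """Pick the gray level that best separates background from pathlines."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0.0
    for t in range(levels):
        w0 += hist[t]
        if w0 in (0, total):
            continue
        sum0 += t * hist[t]
        w1 = total - w0
        mu0, mu1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal sample: dark background plus bright tracer-particle streaks.
pixels = [20] * 900 + [30] * 50 + [200] * 40 + [210] * 10
```
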

  12. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    International Nuclear Information System (INIS)

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan

    2012-01-01

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA) compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period, and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj values that differed by up to 44% from effective dose estimates that were not
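
    The dose arithmetic underlying such a program can be sketched in two lines: the standard ED = k x DLP conversion, followed by a thickness-based correction. The k-factor value and the linear size-correction model below are illustrative assumptions; the paper's actual correction is not reproduced here:

```python
# Minimal sketch of the dose arithmetic: effective dose from the dose-length
# product (DLP) via a protocol k-factor, then a size adjustment from patient
# thickness. The k-factor and the linear correction model are assumptions.

def effective_dose_msv(dlp_mgy_cm, k_factor):
    """Standard conversion: ED = k * DLP."""
    return k_factor * dlp_mgy_cm

def size_adjusted_dose(dlp_mgy_cm, k_factor, thickness_cm,
                       ref_thickness_cm=30.0, slope=0.04):
    """Scale ED up for thinner patients, down for thicker (assumed model)."""
    correction = 1.0 + slope * (ref_thickness_cm - thickness_cm)
    return effective_dose_msv(dlp_mgy_cm, k_factor) * correction
```

    At the reference thickness the correction is unity, so ED_adj reduces to the unadjusted estimate.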

  13. Automated multivariate analysis of multi-sensor data submitted online: Real-time environmental monitoring.

    Directory of Open Access Journals (Sweden)

    Ingvar Eide

    A pilot study demonstrating real-time environmental monitoring with automated multivariate analysis of multi-sensor data submitted online has been performed at the cabled LoVe Ocean Observatory located at 258 m depth 20 km off the coast of Lofoten-Vesterålen, Norway. The major purpose was efficient monitoring of many variables simultaneously and early detection of changes and time-trends in the overall response pattern before changes were evident in individual variables. The pilot study was performed with 12 sensors from May 16 to August 31, 2015. The sensors provided data for chlorophyll, turbidity, conductivity, temperature (three sensors), salinity (calculated from temperature and conductivity), biomass at three different depth intervals (5-50, 50-120, and 120-250 m), and current speed measured in two directions (east and north) using two sensors covering different depths with overlap. A total of 88 variables were monitored, 78 from the two current speed sensors. The time-resolution varied, thus the data had to be aligned to a common time resolution. After alignment, the data were interpreted using principal component analysis (PCA). Initially, a calibration model was established using data from May 16 to July 31. The data on current speed from the two sensors were subject to two separate PCA models, and the score vectors from these two models were combined with the other 10 variables in a multi-block PCA model. The observations from August were projected on the calibration model consecutively, one at a time, and the result was visualized in a score plot. Automated PCA of multi-sensor data submitted online is illustrated with an attached time-lapse video covering the relatively short time period used in the pilot study. Methods for statistical validation, and warning and alarm limits, are described. Redundant sensors enable sensor diagnostics and quality assurance. In a future perspective, the concept may be used in integrated environmental monitoring.
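
    The calibrate-then-project monitoring pattern can be sketched with a single principal component obtained by power iteration (pure Python, two variables). This is a deliberately simplified stand-in for the multi-block PCA used in the study; the data are synthetic:

```python
# Minimal sketch of PCA-based monitoring: fit the first principal component
# on calibration data (power iteration on the covariance matrix), then
# project new observations onto it. A simplified stand-in for the study's
# multi-block PCA; all data here are synthetic.

def pca_first_component(rows, iters=200):
    """Return (column means, unit loading vector) of the first PC."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(xi[a] * xi[b] for xi in x) / (n - 1) for b in range(d)]
           for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):                      # power iteration
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return means, v

def score(obs, means, loading):
    """Project one observation onto the calibration model."""
    return sum((obs[j] - means[j]) * loading[j] for j in range(len(obs)))
```

    In a monitoring setting, each new observation's score would be checked against warning/alarm limits (e.g. plus or minus three standard deviations of the calibration scores).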

  14. Monitoring and analysis of air emissions based on condition models derived from process history

    Directory of Open Access Journals (Sweden)

    M. Liukkonen

    2016-12-01

    Evaluation of online information on operating conditions is necessary when reducing air emissions in energy plants. In this respect, automated monitoring and control are of primary concern, particularly in biomass combustion. As monitoring of emissions in power plants is ever more challenging because of low-grade fuels and fuel mixtures, new monitoring applications are needed to extract essential information from the large amount of measurement data. The management of emissions in energy boilers lacks economically efficient, fast, and competent computational systems that could support decision-making regarding the improvement of emission efficiency. In this paper, a novel emission monitoring platform based on the self-organizing map method is presented. The system is capable not only of visualizing the prevailing status of the process and detecting problem situations (i.e., increased emission release rates), but also of analyzing these situations automatically and presenting factors potentially affecting them. The system is demonstrated using measurement data from an industrial circulating fluidized bed boiler fired by forest residue as the primary fuel and coal as the supporting fuel.
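
    The self-organizing map idea behind the platform can be sketched with a tiny 1-D map: operating states are mapped onto a few prototype vectors, and observations far from every prototype signal an abnormal state. Map size, learning rate, and the two-cluster data are illustrative assumptions:

```python
# Minimal sketch of a 1-D self-organizing map (SOM): map operating states
# onto a small line of prototype vectors, then use the distance to the
# best-matching unit as an anomaly score. All parameters are illustrative.

import math
import random

def train_som(data, n_units=4, epochs=50, lr=0.3):
    random.seed(0)                              # reproducible initialization
    units = [list(random.choice(data)) for _ in range(n_units)]
    for _ in range(epochs):
        for x in data:
            bmu = min(range(n_units), key=lambda u: sum(
                (units[u][j] - x[j]) ** 2 for j in range(len(x))))
            for u in range(n_units):            # neighborhood decays with distance
                h = math.exp(-abs(u - bmu))
                for j in range(len(x)):
                    units[u][j] += lr * h * (x[j] - units[u][j])
    return units

def quantization_error(x, units):
    """Distance to the best-matching unit; large values signal anomalies."""
    return min(sum((u[j] - x[j]) ** 2 for j in range(len(x))) ** 0.5
               for u in units)

# Two synthetic operating regimes of a boiler (e.g. normal load vs. peak load).
data = [(0.0, 0.0), (0.5, 0.2), (10.0, 10.0), (9.5, 10.2)] * 5
units = train_som(data)
```
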

  15. Integrated Monitoring System of Production Processes

    Directory of Open Access Journals (Sweden)

    Oborski Przemysław

    2016-12-01

    An integrated monitoring system for discrete manufacturing processes is presented in the paper. A multilayer hardware and software reference model was developed. The original research answers industry needs for integrating the information flow in the production process. The reference model corresponds to a proposed data model based on a multilayer data tree that allows orders, products and processes to be described and monitoring data to be saved. The elaborated models were implemented in the integrated monitoring system demonstrator developed in the project. It was built on the basis of multi-agent technology to assure high flexibility and openness to applying intelligent algorithms for data processing. Currently, on the basis of the experience gained, an integrated monitoring system for a real production system is being developed. The article presents the main problems of monitoring integration, including the specificity of discrete production, data processing, and the future application of Cyber-Physical Systems. Development of manufacturing systems is based more and more on taking advantage of applying intelligent solutions to machine and production process control and monitoring. Connecting technical systems, machine tools and manufacturing process monitoring with advanced information processing seems to be one of the most important areas of near-future development. It will play an important role in the efficient operation and competitiveness of the whole production system. It is also an important area for the future application of Cyber-Physical Systems, which can radically improve the functionality of monitoring systems and reduce the cost of their implementation.

  16. A system of automated processing of deep water hydrological information

    Science.gov (United States)

    Romantsov, V. A.; Dyubkin, I. A.; Klyukbin, L. N.

    1974-01-01

    An automated system for primary and scientific analysis of deep water hydrological information is presented. Primary processing of the data in this system is carried out on a drifting station, which also calculates the parameters of vertical stability of the sea layers, as well as their depths and altitudes. Methods of processing the raw data are described.

  17. 3-D image pre-processing algorithms for improved automated tracing of neuronal arbors.

    Science.gov (United States)

    Narayanaswamy, Arunachalam; Wang, Yu; Roysam, Badrinath

    2011-09-01

    The accuracy and reliability of automated neurite tracing systems is ultimately limited by image quality as reflected in the signal-to-noise ratio, contrast, and image variability. This paper describes a novel combination of image processing methods that operate on images of neurites captured by confocal and widefield microscopy, and produce synthetic images that are better suited to automated tracing. The algorithms are based on the curvelet transform (for denoising curvilinear structures and local orientation estimation), perceptual grouping by scalar voting (for elimination of non-tubular structures and improvement of neurite continuity while preserving branch points), adaptive focus detection, and depth estimation (for handling widefield images without deconvolution). The proposed methods are fast, and capable of handling large images. Their ability to handle images of unlimited size derives from automated tiling of large images along the lateral dimension, and processing of 3-D images one optical slice at a time. Their speed derives in part from the fact that the core computations are formulated in terms of the Fast Fourier Transform (FFT), and in part from parallel computation on multi-core computers. The methods are simple to apply to new images since they require very few adjustable parameters, all of which are intuitive. Examples of pre-processing DIADEM Challenge images are used to illustrate improved automated tracing resulting from our pre-processing methods.
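
    The lateral tiling step described above (handling images of unlimited size by processing tiles) can be sketched as follows; the tile and overlap sizes are illustrative choices, not the paper's parameters:

```python
# Minimal sketch of lateral tiling: split a large image into overlapping
# tiles so each fits in memory, recording bounds so traced results can be
# stitched back together. Tile and overlap sizes are illustrative.

def tile_bounds(width, height, tile=512, overlap=32):
    """Return (x0, y0, x1, y1) tile windows covering the full image."""
    step = tile - overlap
    bounds = []
    for y0 in range(0, max(1, height - overlap), step):
        for x0 in range(0, max(1, width - overlap), step):
            bounds.append((x0, y0, min(x0 + tile, width), min(y0 + tile, height)))
    return bounds
```

    The overlap region lets structures crossing a tile boundary be traced consistently in both tiles before stitching.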

  18. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long-term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; and the research and development of the new electronic custody transfer method

  19. Analysis of the thoracic aorta using a semi-automated post processing tool

    International Nuclear Information System (INIS)

    Entezari, Pegah; Kino, Aya; Honarmand, Amir R.; Galizia, Mauricio S.; Yang, Yan; Collins, Jeremy; Yaghmai, Vahid; Carr, James C.

    2013-01-01

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA-compliant study was approved by our IRB. Transaxial maximum diameters, outer wall to outer wall, were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of the right pulmonary artery, proximal aortic arch (PROX) immediately proximal to the innominate artery, distal aortic arch (DIST) immediately distal to the left subclavian artery, and descending aorta (DESC) at the level of the diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to the intraclass correlation coefficient (ICC) and Bland-Altman plots. The number of cases requiring manual contouring or center line adjustment for the semi-automated method, and the post-processing time for each method, were recorded. Results: The mean difference between the semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to perform the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels with the highest and lowest numbers of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is
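
    The agreement statistics behind such a comparison can be sketched with a Bland-Altman computation: the bias (mean difference between paired measurements) and the 95% limits of agreement. The paired diameters below are invented, chosen only to be consistent with the reported sub-1.3 mm mean difference:

```python
# Minimal sketch of Bland-Altman agreement: bias and 95% limits of
# agreement for paired manual vs. semi-automated measurements.
# The measurement values are invented for illustration.

from statistics import mean, stdev

def bland_altman(a, b):
    """Return (bias, lower_loa, upper_loa) for paired measurements a, b."""
    diffs = [x - y for x, y in zip(a, b)]
    bias, sd = mean(diffs), stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

manual = [36.1, 34.0, 31.2, 29.8, 33.5]       # mm, invented
semi_auto = [35.6, 34.4, 30.9, 29.5, 33.9]    # mm, invented
```
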

  20. Immunosuppressant therapeutic drug monitoring by LC-MS/MS: workflow optimization through automated processing of whole blood samples.

    Science.gov (United States)

    Marinova, Mariela; Artusi, Carlo; Brugnolo, Laura; Antonelli, Giorgia; Zaninotto, Martina; Plebani, Mario

    2013-11-01

    Although, due to its high specificity and sensitivity, LC-MS/MS is an efficient technique for the routine determination of immunosuppressants in whole blood, it involves time-consuming manual sample preparation. The aim of the present study was therefore to develop an automated sample-preparation protocol for the quantification of sirolimus, everolimus and tacrolimus by LC-MS/MS using a liquid handling platform. Six-level commercially available blood calibrators were used for assay development, while four quality control materials and three blood samples from patients under immunosuppressant treatment were employed for the evaluation of imprecision. Barcode reading, sample re-suspension, transfer of whole blood samples into 96-well plates, addition of internal standard solution, mixing, and protein precipitation were performed with a liquid handling platform. After plate filtration, the deproteinised supernatants were submitted for SPE on-line. The only manual steps in the entire process were de-capping of the tubes and transfer of the well plates to the HPLC autosampler. Calibration curves were linear throughout the selected ranges. The imprecision and accuracy data for all analytes were highly satisfactory. The agreement between the results obtained with manual and those obtained with automated sample preparation was optimal (n = 390, r = 0.96). In daily routine (100 patient samples) the typical overall turnaround time was less than 6 h. Our findings indicate that the proposed analytical system is suitable for routine analysis, since it is straightforward and precise. Furthermore, it incurs less manual workload and less risk of error in the quantification of whole blood immunosuppressant concentrations than conventional methods.
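
    The calibration-curve step mentioned above can be sketched as an ordinary least-squares line through six calibrator levels, inverted to back-calculate unknowns. The concentrations and response ratios below are invented, not the commercial calibrator values:

```python
# Minimal sketch of a six-level calibration: fit a straight line through
# calibrator responses (analyte/internal-standard area ratio vs.
# concentration) and invert it for unknowns. All values are invented.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def concentration(response, slope, intercept):
    """Back-calculate concentration from a measured response."""
    return (response - intercept) / slope

conc = [0, 2, 5, 10, 20, 40]                   # ng/mL calibrator levels (invented)
resp = [0.01, 0.21, 0.52, 1.02, 2.01, 4.03]    # area ratios (invented)
```
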

  1. Manual of process automation. On-line control systems for devices in the process technology. 3. tot. rev. and enl. ed.; Handbuch der Prozessautomatisierung. Prozessleittechnik fuer verfahrenstechnische Anlagen

    Energy Technology Data Exchange (ETDEWEB)

    Maier, U.; Frueh, K.F. (eds.)

    2004-07-01

    This is a reference manual for engineers who need answers to automation problems in chemical engineering. Some new current subjects have been introduced to complement the information. The following chapters are new or have been rewritten by new authors: Internet and intranet technologies; Outline of process-related functions; Control systems in industrial applications: Problems and solutions; Model-based predictive control (MPC); Report archive analysis; Control Loop Performance Monitoring (CPM); Automation structures; Explosion protection; Remote-I/O; Integration of intelligent field equipment in PLS; Weighing and filling techniques; Safety; Maintenance - structures and strategies. The other chapters have been revised and updated as well. (orig.) [German] Das grundsaetzliche Konzept des Handbuchs ist unveraendert: Es dient als Nachschlagewerk fuer Ingenieure, die sich in verschiedenen Taetigkeitsbereichen mit Fragen der Automatisierung verfahrenstechnischer Anlagen auseinandersetzen muessen. Einige Themen wurden neu aufgenommen - wegen ihrer Aktualitaet und zur Abrundung des Themenspektrums. Folgende Kapitel sind voellig neu oder mit neuen Autoren wesentlich erweitert: Internet-/Intranettechnologien; Uebersicht ueber prozessnahe Funktionen; Industrielle Regelung: Probleme und Problemloesungen; Modellgestuetzte praediktive Regelung (MPC); Meldearchivanalyse; Control Loop Performance Monitoring (CPM); Automatisierungsstrukturen; Explosionsschutz; Remote-I/O; Integration intelligenter Feldgeraete in PLS; Waege- und Abfuelltechnik; Anlagensicherheit; Ganzheitliche Instandhaltung - Strukturen und Strategien. Die uebrigen Kapitel wurden aktualisiert und teilweise auch wesentlich ueberarbeitet. (orig.)

  2. Monitoring Device Safety in Interventional Cardiology

    OpenAIRE

    Matheny, Michael E.; Ohno-Machado, Lucila; Resnic, Frederic S.

    2006-01-01

    Objective: A variety of postmarketing surveillance strategies to monitor the safety of medical devices have been supported by the U.S. Food and Drug Administration, but there are few systems to automate surveillance. Our objective was to develop a system to perform real-time monitoring of safety data using a variety of process control techniques.

  3. Comprehensive automation and monitoring of MV grids as the key element of improvement of energy supply reliability and continuity

    Directory of Open Access Journals (Sweden)

    Stanisław Kubacki

    2012-03-01

    The paper presents the issue of comprehensive automation and monitoring of medium voltage (MV) grids as a key element of the Smart Grid concept. The existing state of MV grid control and monitoring is discussed, and the concept of a solution is introduced that will provide the possibility of remote automatic grid reconfiguration and ensure full grid observability from the dispatching system level. Automation of MV grid switching is discussed in detail, aimed at isolating a faulty line section and maintaining supply during the failure to the largest possible number of customers. An example of such automation controls’ operation is also presented. The paper’s second part presents the key role of a quick fault location function and of remote MV grid reconfiguration in improving power supply reliability (SAIDI and SAIFI indices). It is also shown how increasing the number of points fitted with faulted circuit indicators, with the option of remote control of switches from the dispatch system in MV grids, may reduce SAIDI and SAIFI indices across ENERGA-OPERATOR SA divisions.

  4. Simplified automated image analysis for detection and phenotyping of Mycobacterium tuberculosis on porous supports by monitoring growing microcolonies.

    Directory of Open Access Journals (Sweden)

    Alice L den Hertog

    BACKGROUND: Even with the advent of nucleic acid (NA) amplification technologies the culture of mycobacteria for diagnostic and other applications remains of critical importance. Notably microscopic observed drug susceptibility testing (MODS), as opposed to traditional culture on solid media or automated liquid culture, has shown potential to both speed up and increase the provision of mycobacterial culture in high burden settings. METHODS: Here we explore the growth of Mycobacterium tuberculosis microcolonies, imaged by automated digital microscopy, cultured on porous aluminium oxide (PAO) supports. Repeated imaging during colony growth greatly simplifies "computer vision", and presumptive identification of microcolonies was achieved here using existing publicly available algorithms. Our system thus allows the growth of individual microcolonies to be monitored and, critically, also the media to be changed during the growth phase without disrupting the microcolonies. Transfer of identified microcolonies onto selective media allowed us, within 1-2 bacterial generations, to rapidly detect the drug susceptibility of individual microcolonies, eliminating the need for time-consuming subculturing or the inoculation of multiple parallel cultures. SIGNIFICANCE: Monitoring the phenotype of individual microcolonies as they grow has immense potential for research, screening, and ultimately M. tuberculosis diagnostic applications. The method described is particularly appealing with respect to speed and automation.
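The presumptive identification step described above can be illustrated with a minimal connected-component count over a binarized microscopy frame. This is an illustrative stand-in only; the study used existing publicly available computer-vision algorithms, not this code.

```python
def label_colonies(binary):
    """Count connected foreground regions (4-connectivity) in a binary
    image given as nested lists of 0/1, a minimal stand-in for presumptive
    microcolony detection in a thresholded frame."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for i in range(rows):
        for j in range(cols):
            if binary[i][j] and not seen[i][j]:
                count += 1  # new colony found; flood-fill its pixels
                stack = [(i, j)]
                while stack:
                    x, y = stack.pop()
                    if 0 <= x < rows and 0 <= y < cols and binary[x][y] and not seen[x][y]:
                        seen[x][y] = True
                        stack.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    return count
```

Tracking this count (and per-region areas) frame to frame is what makes repeated imaging of the same support so much simpler than single-shot detection.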

  5. Simplified Automated Image Analysis for Detection and Phenotyping of Mycobacterium tuberculosis on Porous Supports by Monitoring Growing Microcolonies

    Science.gov (United States)

    den Hertog, Alice L.; Visser, Dennis W.; Ingham, Colin J.; Fey, Frank H. A. G.; Klatser, Paul R.; Anthony, Richard M.

    2010-01-01

    Background Even with the advent of nucleic acid (NA) amplification technologies the culture of mycobacteria for diagnostic and other applications remains of critical importance. Notably microscopic observed drug susceptibility testing (MODS), as opposed to traditional culture on solid media or automated liquid culture, has shown potential to both speed up and increase the provision of mycobacterial culture in high burden settings. Methods Here we explore the growth of Mycobacterium tuberculosis microcolonies, imaged by automated digital microscopy, cultured on porous aluminium oxide (PAO) supports. Repeated imaging during colony growth greatly simplifies “computer vision”, and presumptive identification of microcolonies was achieved here using existing publicly available algorithms. Our system thus allows the growth of individual microcolonies to be monitored and, critically, also the media to be changed during the growth phase without disrupting the microcolonies. Transfer of identified microcolonies onto selective media allowed us, within 1-2 bacterial generations, to rapidly detect the drug susceptibility of individual microcolonies, eliminating the need for time-consuming subculturing or the inoculation of multiple parallel cultures. Significance Monitoring the phenotype of individual microcolonies as they grow has immense potential for research, screening, and ultimately M. tuberculosis diagnostic applications. The method described is particularly appealing with respect to speed and automation. PMID:20544033

  6. A prototype of an automated high resolution InSAR volcano-monitoring system in the MED-SUV project

    Science.gov (United States)

    Chowdhury, Tanvir A.; Minet, Christian; Fritz, Thomas

    2016-04-01

    Volcanic processes, which produce a variety of geological and hydrological hazards, are difficult to predict and capable of triggering natural disasters on regional to global scales. It is therefore important to monitor volcanoes continuously and with a high spatial and temporal sampling rate. The monitoring of active volcanoes requires reliable measurement of surface deformation before, during and after volcanic activity, and it supports better understanding and modelling of the geophysical processes involved. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and the small baseline subset algorithm (SBAS) provide powerful tools for observing eruptive activity and measuring surface changes with millimetre accuracy. All of these techniques, together with deformation time-series extraction, address the challenges by exploiting medium to large SAR image stacks. The process of selecting, ordering, downloading, storing, logging, extracting and preparing the data for processing is very time consuming and has to be done manually for every single data stack. In many cases it is even an iterative process which has to be done regularly and continuously. Data processing therefore becomes slow, which causes significant delays in data delivery. The SAR Satellite based High Resolution Data Acquisition System, which will be developed at DLR, will automate these time-consuming tasks and allow an operational volcano monitoring system. Every 24 hours the system searches for newly acquired scenes over the volcanoes, keeps track of data orders, logs their status and downloads the provided data via ftp transfer, including e-mail alerts. Furthermore, the system will deliver specified reports and maps to a database for review and use by specialists. User interaction will be minimized and iterative processes will be avoided entirely. In this presentation, a prototype of SAR Satellite based High Resolution Data
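The core of the daily polling step described above is a filter over a scene catalogue: anything newly acquired and not yet ordered gets queued. A minimal sketch, assuming a hypothetical catalogue structure (list of dicts with `id` and `acquired` fields) rather than DLR's actual interfaces:

```python
from datetime import date, timedelta

def find_new_scenes(catalog, ordered_ids, since_days=1):
    """Return scenes acquired within the last `since_days` days that have
    not yet been ordered. `catalog` is a list of dicts with 'id' (str) and
    'acquired' (datetime.date) keys, standing in for a real catalogue query;
    `ordered_ids` is the set of scene ids already tracked by the system."""
    cutoff = date.today() - timedelta(days=since_days)
    return [s for s in catalog if s["acquired"] >= cutoff and s["id"] not in ordered_ids]
```

A scheduler would call this every 24 hours, then hand the result to the ordering, logging and ftp-download stages.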

  7. Real time fish pond monitoring and automation using Arduino

    Science.gov (United States)

    Harun, Z.; Reda, E.; Hashim, H.

    2018-03-01

    Investment and operating costs are the biggest obstacles to modernizing fish ponds in an otherwise very lucrative industry, i.e. food production, in this region. Small-scale farmers running small ponds cannot afford to hire workers for daily operations, which usually consist of monitoring water levels and temperature and feeding the fish. Bigger-scale enterprises usually have some kind of automation for water monitoring and replacement, and sooner or later, as their farms grow, they have to consider employing pH and dissolved oxygen (DO) sensors to ensure the health and growth of the fish. This project identifies one such site, located in Malacca. In this project, water level, temperature, pH and DO are measured and integrated with the aerating and water supply pumps using an Arduino. Users receive information at predetermined intervals on their preferred communication or display devices as long as they have an internet connection. The integrating devices are comparatively inexpensive, usually consisting of an Arduino board, internet connectivity, relay frames and a display system, so farmers can source these components easily. A sample of two days of measurements of temperature, pH and DO shows that this farm has high-quality water. Oxygen levels increase during the day as sunshine supports photosynthesis in the pond. With this integrated system, farmers need not hire workers at the site, consequently driving down operating costs and improving efficiency.
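The control logic such a system runs on the Arduino reduces to mapping sensor readings onto relay states. A minimal sketch in Python, with illustrative thresholds that are assumptions, not values from the paper:

```python
def control_pond(temp_c, ph, do_mg_l, water_level_cm,
                 do_min=5.0, level_min=90.0):
    """Map pond sensor readings to relay/alert states: aerate when
    dissolved oxygen falls below do_min, top up water when the level
    drops below level_min, and raise an alert when pH or temperature
    leave an acceptable band. All thresholds are illustrative."""
    return {
        "aerator_on": do_mg_l < do_min,
        "supply_pump_on": water_level_cm < level_min,
        "alert": not (6.5 <= ph <= 9.0) or not (20.0 <= temp_c <= 35.0),
    }
```

For example, a reading of 4 mg/L DO at a normal water level would switch the aerator on and leave the supply pump off.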

  8. Project TwEATs: a feasibility study testing the use of automated text messaging to monitor appetite ratings in a free-living population

    Science.gov (United States)

    Schembre, Susan M.; Yuen, Jessica

    2011-01-01

    There are no standardized methods for monitoring appetite in free-living populations. Fifteen participants tested a computer-automated text-messaging system designed to track hunger ratings over seven days. Participants were sent text messages (SMS) hourly and instructed to reply during waking hours with their current hunger rating. Of 168 SMS, 0.6-7.1% were undelivered, varying by mobile service provider. On average, 12 SMS responses were received daily, with minor variations by observation day or day of the week. Compliance was over 74%, and 93% of the ratings were received within 30 minutes. Automated text-messaging is a feasible method to monitor appetite ratings in this population. PMID:21251941

  9. Fundamentals of business process management

    NARCIS (Netherlands)

    Dumas, Marlon; La Rosa, Marcello; Mendling, Jan; Reijers, Hajo A.

    2018-01-01

    This textbook covers the entire Business Process Management (BPM) lifecycle, from process identification to process monitoring, covering along the way process modelling, analysis, redesign and automation. Concepts, methods and tools from business management, computer science and industrial

  10. Minimising life cycle costs of automated valves in offshore platforms

    Energy Technology Data Exchange (ETDEWEB)

    Yli-Petays, Juha [Metso Automation do Brasil Ltda., Rio de Janeiro, RJ (Brazil); Niemela, Ismo [Metso Automation, Imatra (Finland)

    2012-07-01

    Automated process valves play an essential role in offshore platform operation. If their operation and maintenance activities can be optimized, extensive operational savings can be achieved with minimal investment. Valves used on offshore platforms do not differ much from those used downstream, but certain particularities make operations more challenging offshore: process valves are more difficult to access and maintain because of space limitations, and spare part inventories and deliveries are challenging because of the platform's remote location. To overcome these challenges, the use of digital positioners with diagnostic features has become more common, because predictive maintenance capabilities make it possible to plan maintenance activities and thereby optimise spare part orders for the valves. Intelligent controllers are available for control valves, automated on/off valves and ESD valves, and the whole network of automated valves on a platform can be controlled by intelligent valve controllers. This creates many new opportunities from an optimized process performance or predictive maintenance point of view. By means of intelligent valve controllers and predictive diagnostics, condition monitoring and maintenance planning can also be performed remotely from an onshore location. Intelligent valve controllers thus provide a good way to minimize spending related to the total cost of ownership (TCO) of automated process valves. When the purchase value of a control valve represents 20% of TCO, an intelligent positioner and predictive maintenance methods can enable savings as high as 30% over the life cycle of the asset, i.e. savings greater than the whole investment in the monitored asset over its life cycle. This is mainly achieved through optimized maintenance activities, since real-life examples have shown that with a time-based (preventive) maintenance approach 70% of

  11. Results and discussion of laboratory experiences with different automated TLD readers for personnel monitoring

    International Nuclear Information System (INIS)

    Regulla, D.F.; Drexeler, G.

    Although film seems likely to continue serving as the main personnel dosemeter in Germany for the foreseeable future, the evolution of solid-state techniques in particular and their properties are thoroughly considered with respect to a possible generalized application in personnel monitoring. For this reason, different commercially available automated TLD systems have been investigated in the laboratory in order to determine their usefulness for a large-scale or decentralized service. Along with studying the dosimetric and instrumental parameters, the question of which monitoring philosophy these TLD systems best fit has been discussed. Both the experimental experience gained and the results of basic discussions that in turn influence the debate about the necessary configuration of personnel TL dosemeters are reported.

  12. The Automated Threaded Fastening Based on On-line Identification

    Directory of Open Access Journals (Sweden)

    Nicolas Ivan Giannoccaro

    2008-11-01

    The principle of threaded fastening has been known and used for decades for the purpose of joining one component to another. Threaded fastenings are popular because they permit easy disassembly for maintenance, repair, relocation and recycling. Screw insertions are typically carried out manually, and the task is difficult to automate. As a result there is very little published research on automating threaded fastenings, and most research on automated assembly focuses on the peg-in-hole assembly problem. This paper investigates the problem of automated monitoring of the screw insertion process. The monitoring problem deals with predicting the integrity of a threaded insertion based on the torque vs. insertion depth curve generated during the insertion. The authors have developed an analytical model to predict the torque signature signals during self-tapping screw insertions. However, the model requires parameters, such as screw dimensions and plate material properties, that are difficult to measure. This paper presents a study of on-line identification during screw fastening. An identification methodology for estimating two unknown parameters during a self-tapping screw insertion process is presented. It is shown that the friction and screw properties required by the model can be reliably estimated on-line. Experimental results are presented to validate the identification procedure.
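The idea of estimating unknown model parameters from the torque-depth curve can be sketched with ordinary least squares on a model that is linear in its parameters. This is a simplified stand-in: the paper's actual torque model and its two parameters are more detailed than the straight line assumed here.

```python
def identify_params(depths, torques):
    """Ordinary least-squares fit of a linear-in-parameters torque model
    T(d) = theta1 * d + theta2 to sampled (depth, torque) pairs, as an
    illustration of on-line two-parameter identification. Returns
    (theta1, theta2) from the closed-form normal equations."""
    n = len(depths)
    sx = sum(depths)
    sy = sum(torques)
    sxx = sum(d * d for d in depths)
    sxy = sum(d * t for d, t in zip(depths, torques))
    theta1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    theta2 = (sy - theta1 * sx) / n
    return theta1, theta2
```

In an on-line setting the same normal-equation sums can be accumulated incrementally as each new torque sample arrives, so the estimates refine during the insertion itself.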

  13. Development of an automated processing system for potential fishing zone forecast

    Science.gov (United States)

    Ardianto, R.; Setiawan, A.; Hidayat, J. J.; Zaky, A. R.

    2017-01-01

    The Institute for Marine Research and Observation (IMRO) - Ministry of Marine Affairs and Fisheries Republic of Indonesia (MMAF) has developed a potential fishing zone (PFZ) forecast using satellite data, called Peta Prakiraan Daerah Penangkapan Ikan (PPDPI). Since 2005, IMRO has disseminated daily PPDPI maps for fisheries marine ports and 3-day averages for national areas. The accuracy in determining the PFZ and the processing time of the maps depend largely on the experience of the operators creating them. This paper presents our research in developing an automated processing system for PPDPI in order to increase accuracy and shorten processing time. PFZ are identified by combining MODIS sea surface temperature (SST) and chlorophyll-a (CHL) data in order to detect the presence of upwelling, thermal fronts and biological productivity enhancement, where the integration of these phenomena generally represents the PFZ. The whole process involves data download, map geoprocessing and layout, which are carried out automatically by Python and ArcPy. The results showed that the automated processing system could be used to reduce dependence on the operators in determining the PFZ and to speed up processing time.
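The core combination rule can be sketched as a per-cell test: flag a grid cell as a potential fishing zone when a thermal front (a large SST difference to a neighbouring cell) coincides with enhanced chlorophyll-a. The thresholds below are illustrative assumptions; the operational PPDPI criteria are not reproduced here.

```python
def flag_pfz(sst_grid, chl_grid, sst_front=0.5, chl_min=0.3):
    """Return a boolean grid marking cells where a thermal front
    (SST difference to any 4-neighbour >= sst_front, in deg C) coincides
    with enhanced chlorophyll-a (>= chl_min, in mg/m^3). Grids are
    nested lists of equal shape; thresholds are illustrative."""
    rows, cols = len(sst_grid), len(sst_grid[0])
    pfz = [[False] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            if chl_grid[i][j] < chl_min:
                continue  # no productivity enhancement here
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if (0 <= ni < rows and 0 <= nj < cols
                        and abs(sst_grid[i][j] - sst_grid[ni][nj]) >= sst_front):
                    pfz[i][j] = True  # front + chlorophyll: candidate PFZ
                    break
    return pfz
```

In the automated pipeline this step would sit between the MODIS download and the ArcPy map layout stages.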

  14. Use of automated medication adherence monitoring in bipolar disorder research: pitfalls, pragmatics, and possibilities.

    Science.gov (United States)

    Levin, Jennifer B; Sams, Johnny; Tatsuoka, Curtis; Cassidy, Kristin A; Sajatovic, Martha

    2015-04-01

    Medication nonadherence occurs in 20-60% of persons with bipolar disorder (BD) and is associated with serious negative outcomes, including relapse, hospitalization, incarceration, suicide and high healthcare costs. Various strategies have been developed to measure adherence in BD. This descriptive paper summarizes challenges and workable strategies using electronic medication monitoring in a randomized clinical trial (RCT) in patients with BD. Descriptive data from 57 nonadherent individuals with BD enrolled in a prospective RCT evaluating a novel customized adherence intervention versus control were analyzed. Analyses focused on whole group data and did not assess intervention effects. Adherence was assessed with the self-reported Tablets Routine Questionnaire and the Medication Event Monitoring System (MEMS). The majority of participants were women (74%), African American (69%), with type I BD (77%). Practical limitations of MEMS included misuse in conjunction with pill minders, polypharmacy, cost, failure to bring to research visits, losing the device, and the device impacting baseline measurement. The advantages were more precise measurement, less biased recall, and collecting data from past time periods for missed interim visits. Automated devices such as MEMS can assist investigators in evaluating adherence in patients with BD. Knowing the anticipated pitfalls allows study teams to implement preemptive procedures for successful implementation in BD adherence studies and can help pave the way for future refinements as automated adherence assessment technologies become more sophisticated and readily available.

  15. Simulation based optimization on automated fibre placement process

    Science.gov (United States)

    Lei, Shi

    2018-02-01

    In this paper, a software simulation (Autodesk TruPlan & TruFiber) based method is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data are taken into consideration prior to tool path generation to achieve a high success rate of manufacturing.

  16. Monitoring, accounting and automated decision support for the ALICE experiment based on the MonALISA framework

    CERN Document Server

    Cirstoiu, C; Betev, L; Saiz, P; Peters, A J; Muraru, A; Voicu, R; Legrand, I

    2007-01-01

    We are developing a general purpose monitoring system for the ALICE experiment, based on the MonALISA framework. MonALISA (Monitoring Agents using a Large Integrated Services Architecture) is a fully distributed system with no single point of failure that is able to collect and store monitoring information and present it as significant perspectives and synthetic views of the status and trends of the entire system. Furthermore, agents can use it to take automated operational decisions. Monitoring information is gathered locally from all the components running at each site. The entire flow of information is aggregated at site level by a MonALISA service and then collected and presented in various forms by a central MonALISA Repository. Based on this information, other services take operational decisions such as alerts, triggers, service restarts and automatic production job or transfer submissions. The system monitors all the components: computer clusters (all major parameters of each computing node), jobs ...

  17. Automation of NLO processes and decays and POWHEG matching in WHIZARD

    International Nuclear Information System (INIS)

    Reuter, Juergen; Chokoufe, Bijan; Stahlhofen, Maximilian

    2016-03-01

    We give a status report on the automation of next-to-leading order processes within the Monte Carlo event generator WHIZARD, using GoSam and OpenLoops as providers of one-loop matrix elements. To deal with divergences, WHIZARD uses automated FKS subtraction, and the phase space for singular regions is generated automatically. NLO examples for both scattering and decay processes, with a focus on e+e- processes, are shown. Also, first NLO studies of observables for collisions of polarized lepton beams, e.g. at the ILC, are presented. Furthermore, the automatic matching of the fixed-order NLO amplitudes with emissions from the parton shower within the POWHEG formalism inside WHIZARD is discussed. We also present results for top pairs at threshold in lepton collisions, including matching between a resummed threshold calculation and fixed-order NLO. This allows the investigation of more exclusive differential observables.

  18. Enhancing Business Process Automation by Integrating RFID Data and Events

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    Business process automation is one of the major benefits of utilising Radio Frequency Identification (RFID) technology. From readers through to RFID middleware systems, information on the identity and movements of tagged objects can be used to trigger business transactions. These features change the way business applications deal with the physical world, from mostly quantity-based to object-based. Aiming to facilitate business process automation, this paper introduces a new method to model and incorporate business logic into RFID edge systems from an object-oriented perspective, with emphasis on RFID's event-driven characteristics. A framework covering business rule modelling, event handling and system operation invocations is presented on the basis of the event calculus. To address the delayed effects identified in RFID-enabled applications, a two-block buffering mechanism is proposed to improve RFID query efficiency within the framework. The performance improvements are analysed in related experiments.
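The two-block buffering idea can be sketched as a pair of swappable event blocks: read events accumulate in an active block while a previously filled block is handed to the query side, so event ingestion is never blocked by processing. The block size, flush policy and class shape below are illustrative assumptions, not the paper's design.

```python
class TwoBlockBuffer:
    """Minimal sketch of two-block buffering for RFID read events:
    events append to the active block; when it fills, the blocks swap
    so the full one becomes available for query processing while new
    events keep flowing into a fresh active block."""

    def __init__(self, block_size=4):
        self.block_size = block_size
        self.active = []   # block currently receiving events
        self.ready = None  # filled block awaiting processing, if any

    def append(self, event):
        self.active.append(event)
        if len(self.active) >= self.block_size:
            self.ready, self.active = self.active, []  # swap blocks

    def take_ready(self):
        """Hand the filled block to the consumer (or None if not full yet)."""
        block, self.ready = self.ready, None
        return block
```

Queries then run against completed blocks in batches instead of contending with each individual tag read.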

  19. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    Energy Technology Data Exchange (ETDEWEB)

    Christianson, Olav; Li, Xiang; Frush, Donald; Samei, Ehsan [Clinical Imaging Physics Group, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Duke University, Durham, North Carolina 27710 (United States); Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708 (United States)]

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA) compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period, and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj that differed by up to 44% from effective dose
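The size-adjustment step can be sketched as: effective dose from the protocol k-factor and dose-length product, scaled by a correction that grows as the patient gets thinner than a reference size. The exponential form and every coefficient below are illustrative assumptions (in the spirit of SSDE-style size conversions), not the paper's actual formula.

```python
import math

def size_adjusted_effective_dose(dlp_mgy_cm, k_factor, thickness_cm,
                                 ref_thickness_cm=28.0, b=0.04):
    """Illustrative size-adjusted effective dose: ED = k * DLP (mSv),
    scaled by an exponential correction in patient thickness relative
    to a reference thickness. Coefficients are placeholders, not the
    values used in the monitoring program described."""
    ed = k_factor * dlp_mgy_cm               # conventional effective dose
    size_factor = math.exp(b * (ref_thickness_cm - thickness_cm))
    return ed * size_factor
```

At the reference thickness the correction factor is 1 and ED_adj reduces to the conventional k·DLP estimate; thinner patients receive a larger adjusted dose for the same DLP, which is what drives the reported divergence between ED_adj and unadjusted effective dose.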

  20. Current status of process monitoring for IAEA safeguards

    International Nuclear Information System (INIS)

    Koroyasu, M.

    1987-06-01

    Based on literature survey, this report tries to answer some of the following questions on process monitoring for safeguards purposes of future large scale reprocessing plants: what is process monitoring, what are the basic elements of process monitoring, what kinds of process monitoring are there, what are the basic problems of process monitoring, what is the relationship between process monitoring and near-real-time materials accountancy, what are actual results of process monitoring tests and what should be studied in future. A brief description of Advanced Safeguards Approaches proposed by the four states (France, U.K., Japan and U.S.A.), the approach proposed by the U.S.A., the description of the process monitoring, the main part of the report published as a result of one of the U.S. Support Programmes for IAEA Safeguards and an article on process monitoring presented at an IAEA Symposium held in November 1986 are given in the annexes. 24 refs, 20 figs, tabs

  1. Automated integration of continuous glucose monitor data in the electronic health record using consumer technology.

    Science.gov (United States)

    Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A

    2016-05-01

    The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient's/parent's smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  2. Automated multiscale morphometry of muscle disease from second harmonic generation microscopy using tensor-based image processing.

    Science.gov (United States)

    Garbe, Christoph S; Buttgereit, Andreas; Schürmann, Sebastian; Friedrich, Oliver

    2012-01-01

    Practically all chronic diseases are characterized by tissue remodeling that alters organ and cellular function through changes to normal organ architecture. Some morphometric alterations become irreversible and account for disease progression even at the cellular level. Early diagnostics to categorize tissue alterations, as well as monitoring progression or remission of disturbed cytoarchitecture upon treatment in the same individual, are a newly emerging field. They strongly challenge spatial resolution and require advanced imaging techniques and strategies for detecting morphological changes. We use a combined second harmonic generation (SHG) microscopy and automated image processing approach to quantify morphology with age in an animal model of inherited Duchenne muscular dystrophy (the mdx mouse). Multiphoton XYZ image stacks from tissue slices reveal vast morphological deviation in muscles from old mdx mice at different scales of cytoskeletal architecture: cell calibers are irregular, myofibrils within cells are twisted, and sarcomere lattice disruptions (detected as "verniers") are larger in number compared to samples from healthy mice. In young mdx mice, such alterations are only minor. The boundary-tensor approach, adapted and optimized for SHG data, is a suitable approach to allow quick quantitative morphometry in whole tissue slices. The overall detection performance of the automated algorithm compares very well with manual "by eye" detection, the latter being time-consuming and prone to subjective errors. Our algorithm outperforms manual detection in time with similar reliability. This approach will be an important prerequisite for the implementation of clinical image databases to diagnose and monitor specific morphological alterations in chronic (muscle) diseases. © 2011 IEEE

  3. Development of an automated chip culture system with integrated on-line monitoring for maturation culture of retinal pigment epithelial cells

    Directory of Open Access Journals (Sweden)

    Mee-Hae Kim

    2017-10-01

Full Text Available In cell manufacturing, the establishment of a fully automated, microfluidic, cell culture system that can be used for long-term cell cultures, as well as for process optimization, is highly desirable. This study reports the development of a novel chip bioreactor system that can be used for automated long-term maturation cultures of retinal pigment epithelial (RPE) cells. The system consists of an incubation unit, a medium supply unit, a culture observation unit, and a control unit. In the incubation unit, the chip contains a closed culture vessel (2.5 mm diameter, working volume 9.1 μL), which can be set to 37 °C and 5% CO2, and uses a gas-permeable resin (polydimethylsiloxane) as the vessel wall. RPE cells were seeded at 5.0 × 10⁴ cells/cm² and the medium was changed every day by introducing fresh medium using the medium supply unit. Culture solutions were stored either in the refrigerator or the freezer, and fresh medium was prepared before any medium change by warming to 37 °C and mixing. Automated culture was allowed to continue for 30 days to allow maturation of the RPE cells. This chip culture system allows for the long-term, bubble-free culture of RPE cells, while also being able to observe cells in order to elucidate their cell morphology or show the presence of tight junctions. This culture system, along with an integrated on-line monitoring system, can therefore be applied to long-term cultures of RPE cells, and should contribute to process control in RPE cell manufacturing.

  4. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    Science.gov (United States)

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and
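The stages the abstract names (filtering, artifact rejection, re-referencing) can be sketched in miniature. HAPPE's actual artifact rejection is far more sophisticated (ICA-based); the variance-outlier channel rule below is only an illustrative stand-in, and all function names and thresholds are this sketch's own assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def simple_eeg_clean(data, fs, band=(1.0, 40.0), z_thresh=2.5):
    """Toy HAPPE-style pass over data of shape (n_channels, n_samples):
    band-pass filter, drop channels with outlying variance, then
    re-reference to the average of the retained channels."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, data, axis=1)
    # Crude channel rejection by z-scored log-variance (a stand-in for
    # HAPPE's artifact-rejection steps)
    log_var = np.log(filtered.var(axis=1) + 1e-12)
    z = (log_var - log_var.mean()) / (log_var.std() + 1e-12)
    good = np.abs(z) < z_thresh
    cleaned = filtered[good]
    # Average re-reference
    cleaned = cleaned - cleaned.mean(axis=0, keepdims=True)
    return cleaned, good

rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 1000))
eeg[3] *= 50.0  # one grossly contaminated channel
out, good = simple_eeg_clean(eeg, fs=250.0)
```

On this synthetic recording the contaminated channel is dropped and the remaining seven channels come out average-referenced.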

5. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

Full Text Available Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact

  6. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells.

    Directory of Open Access Journals (Sweden)

    Dai Fei Elmer Ker

    Full Text Available Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and
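The system's two key computations, estimating confluency from an image and predicting when it will cross the subculture threshold (the basis for the 4-hour advance notification), can be sketched simply. The thresholding and linear extrapolation below are illustrative simplifications of the paper's computer-vision pipeline; all names and parameters are assumptions:

```python
import numpy as np

def confluency(image, bg_thresh):
    """Fraction of the field covered by cells, taken here as pixels
    brighter than a background threshold (a stand-in for the paper's
    vision-based confluency measure)."""
    return float((image > bg_thresh).mean())

def hours_until_threshold(times_h, confluencies, target=0.8):
    """Linearly extrapolate the confluency trend to the target level
    and return hours remaining from the latest observation."""
    slope, intercept = np.polyfit(times_h, confluencies, 1)
    if slope <= 0:
        return float("inf")  # not growing toward the threshold
    return max(0.0, (target - intercept) / slope - times_h[-1])

# Synthetic trend: confluency grows ~5%/h from 40%
t = np.array([0.0, 2.0, 4.0, 6.0])
c = np.array([0.40, 0.50, 0.60, 0.70])
eta = hours_until_threshold(t, c, target=0.80)

img = np.zeros((10, 10))
img[:5] = 1.0  # half the field "covered"
frac = confluency(img, bg_thresh=0.5)
```

A real system would page operators once `eta` drops below the 4-hour notice window.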

  7. An improved, computer-based, on-line gamma monitor for plutonium anion exchange process control

    International Nuclear Information System (INIS)

    Pope, N.G.; Marsh, S.F.

    1987-06-01

    An improved, low-cost, computer-based system has replaced a previously developed on-line gamma monitor. Both instruments continuously profile uranium, plutonium, and americium in the nitrate anion exchange process used to recover and purify plutonium at the Los Alamos Plutonium Facility. The latest system incorporates a personal computer that provides full-feature multichannel analyzer (MCA) capabilities by means of a single-slot, plug-in integrated circuit board. In addition to controlling all MCA functions, the computer program continuously corrects for gain shift and performs all other data processing functions. This Plutonium Recovery Operations Gamma Ray Energy Spectrometer System (PROGRESS) provides on-line process operational data essential for efficient operation. By identifying abnormal conditions in real time, it allows operators to take corrective actions promptly. The decision-making capability of the computer will be of increasing value as we implement automated process-control functions in the future. 4 refs., 6 figs
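The abstract notes that the program continuously corrects for gain shift. One common approach (not necessarily the one PROGRESS uses) is to track the centroid of a reference photopeak and rescale the channel axis so it returns to its nominal position. A hypothetical sketch:

```python
import numpy as np

def correct_gain_shift(spectrum, ref_window, nominal_channel):
    """Rescale the channel axis so the reference photopeak centroid
    returns to its nominal channel. An illustrative stand-in for the
    continuous gain-shift correction mentioned in the abstract."""
    lo, hi = ref_window
    window = spectrum[lo:hi]
    centroid = lo + float(np.average(np.arange(hi - lo), weights=window))
    gain = nominal_channel / centroid
    # Resample counts onto the corrected channel axis
    channels = np.arange(len(spectrum))
    return np.interp(channels, channels * gain, spectrum)

chan = np.arange(512)
# Gaussian photopeak drifted from its nominal channel 200 to 210
peak = np.exp(-0.5 * ((chan - 210) / 4.0) ** 2)
corrected = correct_gain_shift(peak, ref_window=(180, 240), nominal_channel=200)
```

After correction, the peak maximum falls back at (or within one channel of) its nominal position.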

  8. AUTOMATED PROCESSING OF DAIRY PRODUCT MICROPHOTOS USING IMAGEJ AND STATISTICA

    Directory of Open Access Journals (Sweden)

    V. K. Bitiukov

    2014-01-01

Full Text Available Summary. The article discusses the construction of algorithms for automated processing of microphotos of dairy products. Automated processing of microphotos of dairy products is relevant in the study of the degree of homogenization. Microphotos of dairy products contain information about the distribution of fat globules by mass fraction. Today there are several software products offering image processing and relieving researchers from routine manual data-processing operations, but they need to be adapted for processing microphotos of dairy products. In this paper we propose to use the application package ImageJ to process image files taken with a digital microscope, and the software package Statistica to calculate the statistical characteristics. The processing algorithm consists of successive stages of conversion to gray scale, scaling, filtering, binarization, object recognition, and statistical processing of the recognition results. The result of the implemented data processing algorithms is the distribution function of the fat globules in terms of volume or mass fraction, as well as the statistical parameters of the distribution (the mathematical expectation, variance, skewness, and kurtosis coefficients). Experimental studies were carried out to verify and debug the algorithm: farm milk was homogenized at different homogenization pressures, microphotos were taken for each sample, and image processing was carried out in accordance with the proposed algorithm. The studies have shown the effectiveness and feasibility of the proposed algorithm, implemented as a script for ImageJ that then sends the data to a file for the software package Statistica.
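The described chain (binarization, object recognition, distribution statistics) can be sketched in Python rather than as an ImageJ macro. The threshold and the use of pixel area as the size measure are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage, stats

def globule_size_stats(image, threshold):
    """Binarize a grayscale microphoto, label connected fat globules,
    and return the statistical parameters the article computes:
    mean (mathematical expectation), variance, skewness and kurtosis
    of the globule areas."""
    binary = image > threshold
    labels, n = ndimage.label(binary)
    areas = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    return {
        "count": int(n),
        "mean": float(areas.mean()),
        "variance": float(areas.var()),
        "skewness": float(stats.skew(areas)),
        "kurtosis": float(stats.kurtosis(areas)),
    }

# Tiny synthetic microphoto: two bright "globules" on a dark background
img = np.zeros((32, 32))
img[2:6, 2:6] = 1.0      # 16-pixel globule
img[20:22, 20:22] = 1.0  # 4-pixel globule
res = globule_size_stats(img, threshold=0.5)
```

On real data the `areas` array would be binned into the volume- or mass-fraction distribution function the article reports.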

  9. Automated Processing of 2-D Gel Electrophoretograms of Genomic DNA for Hunting Pathogenic DNA Molecular Changes.

    Science.gov (United States)

    Takahashi; Nakazawa; Watanabe; Konagaya

    1999-01-01

    We have developed the automated processing algorithms for 2-dimensional (2-D) electrophoretograms of genomic DNA based on RLGS (Restriction Landmark Genomic Scanning) method, which scans the restriction enzyme recognition sites as the landmark and maps them onto a 2-D electrophoresis gel. Our powerful processing algorithms realize the automated spot recognition from RLGS electrophoretograms and the automated comparison of a huge number of such images. In the final stage of the automated processing, a master spot pattern, on which all the spots in the RLGS images are mapped at once, can be obtained. The spot pattern variations which seemed to be specific to the pathogenic DNA molecular changes can be easily detected by simply looking over the master spot pattern. When we applied our algorithms to the analysis of 33 RLGS images derived from human colon tissues, we successfully detected several colon tumor specific spot pattern changes.

  10. A New Tool for Automated Data Collection and Complete On-site Flux Data Processing for Eddy Covariance Measurements

    Science.gov (United States)

    Begashaw, I. G.; Kathilankal, J. C.; Li, J.; Beaty, K.; Ediger, K.; Forgione, A.; Fratini, G.; Johnson, D.; Velgersdyk, M.; Hupp, J. R.; Xu, L.; Burba, G. G.

    2014-12-01

The eddy covariance method is widely used for direct measurements of turbulent exchange of gases and energy between the surface and atmosphere. In the past, raw data were collected in the field and then processed back in the laboratory to achieve fully corrected, publication-ready flux results. This post-processing consumed a significant amount of time and resources, and precluded researchers from accessing near real-time final flux results. A new automated measurement system with novel hardware and software designs was developed, tested, and deployed starting in late 2013. The major advancements of this automated flux system include: 1) logging high-frequency three-dimensional wind speeds and multiple gas densities (CO2, H2O and CH4), low-frequency meteorological data, and site metadata simultaneously through a specially designed file format; 2) conducting fully corrected, real-time, on-site flux computations using conventional as well as user-specified methods, by implementing EddyPro Software on a small low-power microprocessor; 3) providing precision clock control and coordinate information for data synchronization and inter-site data comparison by incorporating GPS and the Precision Time Protocol. Along with these innovations, a data management server application was also developed to chart fully corrected real-time fluxes to assist remote system monitoring, to send e-mail alerts, and to automate data QA/QC, transfer, and archiving at individual stations or on a network level. The combination of all of these functions is designed to save a substantial amount of time and cost associated with managing a research site by eliminating post-field data processing, reducing user errors, and facilitating real-time access to fully corrected flux results. The design, functionality, and test results from this new eddy covariance measurement tool will be presented.
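At its core, the on-site flux computation is the covariance of vertical-wind and scalar fluctuations over the averaging period (Reynolds decomposition). The sketch below shows only that core, omitting the many corrections a tool like EddyPro applies (coordinate rotation, density/WPL and spectral corrections); the synthetic data are assumptions:

```python
import numpy as np

def eddy_flux(w, c):
    """Covariance flux w'c': mean product of fluctuations about the
    period means -- the core eddy covariance computation."""
    wp = w - w.mean()
    cp = c - c.mean()
    return float(np.mean(wp * cp))

rng = np.random.default_rng(1)
n = 36000  # e.g., 30 min of 20 Hz data
w = rng.standard_normal(n)            # vertical wind fluctuations (m/s)
c = 0.5 * w + rng.standard_normal(n)  # scalar density partly driven by w
flux = eddy_flux(w, c)                # expected near 0.5 * var(w) = 0.5
```

With the scalar constructed as `0.5*w` plus independent noise, the estimated flux converges to about 0.5 for long records.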

  11. Automated Pre-processing for NMR Assignments with Reduced Tedium

    Energy Technology Data Exchange (ETDEWEB)

    2004-05-11

An important rate-limiting step in the resonance assignment process is accurate identification of resonance peaks in NMR spectra. NMR spectra are noisy. Hence, automatic peak-picking programs must navigate between the Scylla of reliable but incomplete picking, and the Charybdis of noisy but complete picking. Each of these extremes complicates the assignment process: incomplete peak-picking results in the loss of essential connectivities, while noisy picking conceals the true connectivities under a combinatorial explosion of false positives. Intermediate processing can simplify the assignment process by preferentially removing false peaks from noisy peak lists. This is accomplished by requiring consensus between multiple NMR experiments, exploiting a priori information about NMR spectra, and drawing on empirical statistical distributions of chemical shifts extracted from the BioMagResBank. Experienced NMR practitioners currently apply many of these techniques "by hand", which is tedious, and may appear arbitrary to the novice. To increase efficiency, we have created a systematic and automated approach to this process, known as APART. Automated pre-processing has three main advantages: reduced tedium, standardization, and pedagogy. In the hands of experienced spectroscopists, the main advantage is reduced tedium (a rapid increase in the ratio of true peaks to false peaks with minimal effort). When a project is passed from hand to hand, the main advantage is standardization. APART automatically documents the peak filtering process by archiving its original recommendations, the accompanying justifications, and whether a user accepted or overrode a given filtering recommendation. In the hands of a novice, this tool can reduce the stumbling block of learning to differentiate between real peaks and noise, by providing real-time examples of how such decisions are made.
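The consensus idea, keeping only peaks corroborated across experiments, can be sketched as follows. Function names, tolerance, and the example shift values are all illustrative, not APART's actual interface:

```python
def consensus_peaks(peak_lists, tol=0.05, min_votes=2):
    """Keep only peaks whose chemical shift (ppm) is corroborated by at
    least `min_votes` peak lists (its own list counts as one vote),
    within `tol` ppm -- the kind of consensus filtering APART automates."""
    kept = []
    for i, peaks in enumerate(peak_lists):
        for p in peaks:
            votes = 1 + sum(
                any(abs(p - q) <= tol for q in other)
                for j, other in enumerate(peak_lists) if j != i
            )
            if votes >= min_votes:
                kept.append(p)
    return sorted(set(kept))

# Hypothetical picks from two experiments, plus one spurious pick
exp1 = [8.21, 7.95, 4.60, 3.10]  # 3.10 is noise, 4.60 is uncorroborated
exp2 = [8.23, 7.96]
result = consensus_peaks([exp1, exp2])
```

Only the mutually corroborated amide-region peaks survive; the lone 4.60 and the spurious 3.10 are filtered out.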

12. Development and implementation of an automated system for the process of gamma radiation monitor calibration

    International Nuclear Information System (INIS)

    Silva Junior, Iremar Alves

    2012-01-01

In this study, a system for the process of gamma radiation monitor calibration was developed and implemented, consisting of a pneumatic device to exchange the attenuators and a positioning table, both actuated through a control panel. We also implemented a Caesa-Gammatron irradiator system, which increased the range of air kerma rates due to its higher activity compared with the gamma radiation system currently in use in the gamma calibration laboratory. Hence, it was necessary to install a remotely controlled attenuator device in this irradiator system. Lastly, an evaluation of the reduction in occupational dose rates was carried out. This dissertation was developed with the aim of improving the quality of the calibration and testing services for gamma radiation monitors - provided by the IPEN Laboratory of Instrument Calibration - as well as decreasing the occupational dose of the technicians involved in the calibration process, following thus the principles of radiation protection. (author)

  13. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules: Final Subcontract Report, April 1998 - April 2002

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Miller, D. C.; Moore, S. B.; Hogan, S. J.

    2002-11-01

    This report describes the automated systems developed for PV module assembly and testing processes after lamination. These processes are applicable to a broad range of module types, including those made with wafer-based and thin-film solar cells. Survey data and input from module manufacturers gathered during site visits were used to define system capabilities and process specifications. Spire completed mechanical, electrical, and software engineering for four automation systems: a module edge trimming system, the SPI-TRIM 350; an edge sealing and framing system, the SPI-FRAMER 350; an integrated module testing system, the SPI-MODULE QA 350; and a module buffer storage system, the SPI-BUFFER 350. A fifth system for junction-box installation, the SPI-BOXER 350, was nearly completed during the program. A new-size solar simulator, the SPI-SUN SIMULATOR 350i, was designed as part of the SPI-MODULE QA 350. This simulator occupies minimal production floor space, and its test area is large enough to handle most production modules. The automated systems developed in this program are designed for integration to create automated production lines.

  14. "SmartMonitor"--an intelligent security system for the protection of individuals and small properties with the possibility of home automation.

    Science.gov (United States)

    Frejlichowski, Dariusz; Gościewska, Katarzyna; Forczmański, Paweł; Hofman, Radosław

    2014-06-05

"SmartMonitor" is an intelligent security system based on image analysis that combines the advantages of alarm, video surveillance and home automation systems. The system is a complete solution that automatically reacts to every learned situation in a pre-specified way and has various applications, e.g., home and surrounding protection against unauthorized intrusion, crime detection or supervision over ill persons. The software is based on well-known and proven methods and algorithms for visual content analysis (VCA) that were appropriately modified and adapted to fit specific needs and create a video processing model which consists of foreground region detection and localization, candidate object extraction, object classification and tracking. In this paper, the "SmartMonitor" system is presented along with its architecture, employed methods and algorithms, and object analysis approach. Some experimental results on system operation are also provided. The focus is placed on one of the aforementioned functionalities of the system, namely supervision over ill persons.
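The first stage of the described video processing model, foreground region detection, is commonly built on a running-average background model with thresholded differencing. A minimal stand-in for that stage (not SmartMonitor's actual VCA implementation; parameters are assumptions):

```python
import numpy as np

def detect_foreground(frames, alpha=0.05, thresh=25.0):
    """Running-average background model: pixels that differ from the
    model by more than `thresh` are foreground; the model is updated
    only where the scene looks like background. Returns the foreground
    mask for the last frame."""
    background = frames[0].astype(float)
    mask = np.zeros(background.shape, dtype=bool)
    for frame in frames[1:]:
        frame = frame.astype(float)
        mask = np.abs(frame - background) > thresh
        background = np.where(mask, background,
                              (1 - alpha) * background + alpha * frame)
    return mask

# Static scene with an 8x8 "intruder" appearing only in the last frame
frames = [np.full((24, 24), 100.0) for _ in range(10)]
intruder = frames[-1].copy()
intruder[8:16, 8:16] = 200.0
frames[-1] = intruder
mask = detect_foreground(frames)
```

Downstream stages (object extraction, classification, tracking) would operate on the connected regions of this mask.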

  15. An Aspect-Oriented Framework for Business Process Improvement

    Science.gov (United States)

    Pourshahid, Alireza; Mussbacher, Gunter; Amyot, Daniel; Weiss, Michael

    Recently, many organizations invested in Business Process Management Systems (BPMSs) in order to automate and monitor their processes. Business Activity Monitoring is one of the essential modules of a BPMS as it provides the core monitoring capabilities. Although the natural step after process monitoring is process improvement, most of the existing systems do not provide the means to help users with the improvement step. In this paper, we address this issue by proposing an aspect-oriented framework that allows the impact of changes to business processes to be explored with what-if scenarios based on the most appropriate process redesign patterns among several possibilities. As the four cornerstones of a BPMS are process, goal, performance and validation views, these views need to be aligned automatically by any approach that intends to support automated improvement of business processes. Our framework therefore provides means to reflect process changes also in the other views of the business process. A health care case study presented as a proof of concept suggests that this novel approach is feasible.

  16. An architectural framework for developing intelligent applications for the carbon dioxide capture process

    Energy Technology Data Exchange (ETDEWEB)

    Luo, C.; Zhou, Q.; Chan, C.W. [Regina Univ., SK (Canada)

    2009-07-01

This presentation reported on the development of automated application solutions for the carbon dioxide (CO{sub 2}) capture process. An architectural framework was presented for developing intelligent systems for the process system. The chemical absorption process consists of dozens of components and therefore generates more than a hundred different types of data. Developing automated support for these tasks is desirable because the monitoring, analysis, and diagnosis of the data are very complex. The proposed framework interacts with an implemented domain ontology for the CO{sub 2} capture process, which consists of information derived from senior operators of the CO{sub 2} pilot plant at the International Test Centre for Carbon Dioxide Capture at the University of Regina. The well-defined library within the framework reduces development time and cost. The framework also has built-in web-based software components for data monitoring, management, and analysis. These components provide support for generating automated solutions for the CO{sub 2} capture process. An automated monitoring system was also developed based on the architectural framework.

17. A Study on the Cost-Effectiveness of a Semi-Automated Cutting Process at a Garment Manufacturing Company

    Directory of Open Access Journals (Sweden)

    Castro, Mark Daniel

    2017-11-01

Full Text Available The subject of the study, Company X, has been experiencing variations between the quantity report from the cutting department and the transmittal reports. The management found that these processes are heavily affected by manual labor. To reduce the system's proneness to human error, the management decided to explore the possibility of adopting a semi-automated spreading and cutting process. This research aims to evaluate the pre-sewing processes of Company X and whether introducing automation can be beneficial to the company and the garments industry. The researchers used process mapping tools, descriptive research, and process flowcharts to assess the current and proposed systems, and engineering economics to evaluate the costs and benefits of implementing the semi-automated system. The results showed that with the implementation of the semi-automated system, the company will attain 66.61% more savings per year than with the current system. In terms of cycle time, the semi-automated system eliminated the relaxation of fabric before the cutting process, thereby greatly reducing cycle time. In addition, the researchers found that as long as the company produces more than 4,140 pieces per day, the system will be economically feasible. Unquantifiable benefits of introducing the semi-automated system were also identified: the company can have a cleaner work environment that will lead to more productivity and greater quality of goods, and in turn a better company image that will encourage more customers to place job orders.
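The reported break-even of 4,140 pieces per day implies a simple ratio of fixed daily cost to per-piece saving. The sketch below shows that arithmetic with hypothetical cost figures chosen only so the example reproduces the reported break-even; the study's actual cost parameters are not given in the abstract:

```python
def break_even_volume(fixed_cost_per_day, saving_per_piece):
    """Daily production volume at which a semi-automated system pays
    for itself: amortized fixed daily cost divided by the variable
    saving per piece. All inputs below are hypothetical."""
    return fixed_cost_per_day / saving_per_piece

# Hypothetical: 2,070 currency units/day fixed cost, 0.5/piece saved
volume = break_even_volume(2070.0, 0.5)  # -> 4140.0 pieces/day
```

Any volume above this threshold makes the semi-automated system economically feasible under these (assumed) cost figures.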

  18. An automated system for monitoring bird collisions with power lines and tower guys

    Energy Technology Data Exchange (ETDEWEB)

    Carlton, R.G. [Electric Power Research Inst., Palo Alto, CA (United States)

    2005-07-01

An automated system for monitoring collisions between birds and power lines was presented. The bird strike indicator (BSI) was developed to gather bird collision information that is difficult to obtain through direct human observation, as well as to aid in the calculation of inherent biases which must be considered when attempting to determine total mortality from data obtained in on-the-ground dead bird searches. The BSI can be placed directly on power lines, static wires, or tower guy cables with a standard hot stick power line clamp. The sensor consists of state-of-the-art accelerometers, power supplies, signal processors, and data acquisition systems. The BSI also includes a communication system for transmitting data to a ground-based unit in which raw data can be stored. A complete BSI consists of 30 sensors with signal processing and data logging capabilities, and a base station. The sensors integrate several components, including wireless radio, data storage, and a microcontroller with an A/D converter. Full-scale field deployment has shown that the BSI is both robust and sensitive to vibrations in the guy wires, as the system has been tuned to eliminate wind-induced vibrations. 3 figs.

  19. A methodology to determine the level of automation to improve the production process and reduce the ergonomics index

    Science.gov (United States)

    Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo

    2017-08-01

Companies are constantly looking for productivity improvements to increase their competitiveness, and the use of automation technologies is an approach that has been proven effective for achieving this. Some companies are not familiar with the process of acquiring automation technologies; they therefore abstain from investing and thereby miss the opportunity to take advantage of it. This document proposes a methodology to determine the level of automation appropriate for the production process, thereby improving production and reducing the ergonomics index.

  20. Development of nuclear power plant automated remote patrol system

    International Nuclear Information System (INIS)

    Nakayama, R.; Kubo, K.; Sato, K.; Taguchi, J.

    1984-01-01

An Automated Remote Patrol System was developed for remote inspection, observation, and monitoring of nuclear power plant components. This automated remote patrol system consists of: a vehicle moving along a monorail; three rails mounted in the monorail for data transmission and power supply; an image fiber connected to a TV camera; an arm-type mechanism (manipulator) for moving the image fiber; a computer for control and data processing; and an operator's console. Special features of this Automated Remote Patrol System are as follows: the inspection vehicle runs along horizontal and vertical (up/down) monorails; the arm-type mechanism (manipulator) on the vehicle is used to move the image fiber; slide-type electric collectors are used for data transmission and power supply; time-division multiplexing is adopted for data transmission; voice communication is used for controlling mechanisms; and pattern recognition is used for data processing. The experience obtained from a series of various tests is summarized. (author)

  1. Implementation of a novel postoperative monitoring system using automated Modified Early Warning Scores (MEWS) incorporating end-tidal capnography.

    Science.gov (United States)

    Blankush, Joseph M; Freeman, Robbie; McIlvaine, Joy; Tran, Trung; Nassani, Stephen; Leitman, I Michael

    2017-10-01

Modified Early Warning Scores (MEWS) provide real-time vital sign (VS) trending and reduce ICU admissions in post-operative patients. These early warning calculations classically incorporate oxygen saturation, heart rate, respiratory rate, systolic blood pressure, and temperature, but have not previously included end-tidal CO2 (EtCO2), more recently identified as an independent predictor of critical illness. These systems may be subject to failure when physiologic data is incorrectly measured, leading to false alarms and increased workload. This study investigates whether automated devices that provide ongoing vital signs monitoring and MEWS calculations, inclusive of a score for EtCO2, can be feasibly implemented on the general care hospital floor and effectively identify derangements in a post-operative patient's condition while limiting the number of false alarms that would increase provider workload. From July to November 2014, post-operative patients meeting the inclusion criteria (BMI > 30 kg/m², history of obstructive sleep apnea, or the use of patient-controlled analgesia (PCA) or epidural narcotics) were monitored using automated devices that record minute-by-minute VS included in classic MEWS calculations as well as EtCO2. Automated messages were sent via pager to providers when the device measured an elevated MEWS, abnormal EtCO2, or oxygen desaturation below 85%. Data, including alarm and message details from the first 133 patients, were recorded and analyzed. Overall, 3.3 alarms and pages sounded per hour of monitoring. Device-only alarms sounded 2.7 times per hour; 21% were technical alarms. The remaining device-only alarms for concerning VS sounded 2.0/h, 70% for falsely recorded VS. Pages for abnormal EtCO2 sounded 0.4/h (82% false recordings) while pages for low blood oxygen saturation sounded 0.1/h (55% false alarms). 143 times (0.1 pages/h) the devices calculated
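A MEWS calculation of the kind the devices automate can be sketched as banded scoring of each vital sign. The bands below follow a common, simplified MEWS variant and are not necessarily those used by the study's devices; the EtCO2 band is purely illustrative, since the abstract does not give the study's exact EtCO2 scoring:

```python
def mews_score(resp_rate, heart_rate, systolic_bp, temp_c, etco2=None):
    """Simplified MEWS: each vital sign contributes 0 for a normal
    band, more for deviations. Bands here are a common textbook
    variant (illustrative only, not the study's exact thresholds)."""
    score = 0
    score += 0 if 9 <= resp_rate <= 14 else (1 if 15 <= resp_rate <= 20 else 2)
    score += 0 if 51 <= heart_rate <= 100 else (1 if 41 <= heart_rate <= 110 else 2)
    score += 0 if 101 <= systolic_bp <= 199 else 2
    score += 0 if 36.0 <= temp_c <= 38.4 else 2
    if etco2 is not None:
        # Hypothetical EtCO2 component: 0 within 35-45 mmHg, else 1
        score += 0 if 35 <= etco2 <= 45 else 1
    return score

normal = mews_score(12, 80, 120, 37.0, etco2=40)
worrying = mews_score(24, 120, 90, 38.9, etco2=55)
```

A monitoring device would page providers once the running score crosses an alert threshold, as described above.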

  2. Role of automation in the ACRV operations

    Science.gov (United States)

    Sepahban, S. F.

    1992-01-01

    The Assured Crew Return Vehicle (ACRV) will provide the Space Station Freedom with contingency means of return to earth (1) of one disabled crew member during medical emergencies, (2) of all crew members in case of accidents or failures of SSF systems, and (3) in case of interruption of the Space Shuttle flights. A wide range of vehicle configurations and system approaches are currently under study. The Program requirements focus on minimizing life cycle costs by ensuring simple operations, built-in reliability and maintainability. The ACRV philosophy of embedded operations is based on maximum use of existing facilities, resources and processes, while minimizing the interfaces and impacts to the Space Shuttle and Freedom programs. A preliminary integrated operations concept based on this philosophy and covering the ground, flight, mission support, and landing and recovery operations has been produced. To implement the ACRV operations concept, the underlying approach has been to rely on vehicle autonomy and automation, to the extent possible. Candidate functions and processes which may benefit from current or near-term automation and robotics technologies are identified. These include, but are not limited to, built-in automated ground tests and checkouts; use of the Freedom and the Orbiter remote manipulator systems, for ACRV berthing; automated passive monitoring and performance trend analysis, and periodic active checkouts during dormant periods. The major ACRV operations concept issues as they relate to the use of automation are discussed.

  3. Default mode contributions to automated information processing.

    Science.gov (United States)

    Vatansever, Deniz; Menon, David K; Stamatakis, Emmanuel A

    2017-11-28

    Concurrent with mental processes that require rigorous computation and control, a series of automated decisions and actions govern our daily lives, providing efficient and adaptive responses to environmental demands. Using a cognitive flexibility task, we show that a set of brain regions collectively known as the default mode network plays a crucial role in such "autopilot" behavior, i.e., when rapidly selecting appropriate responses under predictable behavioral contexts. While applying learned rules, the default mode network shows both greater activity and connectivity. Furthermore, functional interactions between this network and hippocampal and parahippocampal areas as well as primary visual cortex correlate with the speed of accurate responses. These findings indicate a memory-based "autopilot role" for the default mode network, which may have important implications for our current understanding of healthy and adaptive brain processing.

  4. Automated Mobility Transitions: Governing Processes in the UK

    Directory of Open Access Journals (Sweden)

    Debbie Hopkins

    2018-03-01

    Full Text Available Contemporary systems of mobility are undergoing a transition towards automation. In the UK, this transition is being led by (often new) partnerships between incumbent manufacturers and new entrants, in collaboration with national governments, local/regional councils, and research institutions. This paper first offers a framework for analyzing the governance of the transition, adapting ideas from the Transition Management (TM) perspective, and then applies the framework to ongoing automated vehicle transition dynamics in the UK. The empirical analysis suggests that the UK has adopted a reasonably comprehensive approach to the governing of automated vehicle innovation but that this approach cannot be characterized as sufficiently inclusive, democratic, diverse and open. The lack of inclusivity, democracy, diversity and openness is symptomatic of the post-political character of how the UK’s automated mobility transition is being governed. The paper ends with a call for a reconfiguration of the automated vehicle transition in the UK and beyond, so that much more space is created for dissent and for reflexive and comprehensive big picture thinking on (automated) mobility futures.

  5. Process monitoring by display devices

    International Nuclear Information System (INIS)

    Eggerdinger, C.; Schattner, R.

    1984-01-01

    The use of extensive automation, regulating, protection and limiting devices, together with the application of ergonomic principles (e.g. the increased use of mimic diagrams), has led to plants capable of continued reliable operation. German nuclear power stations are in a top position worldwide as regards safety and availability. However, renewed efforts are required to overcome the lack of surveyability caused by the large number and miniaturization of display elements. One attempt at this with conventional technology is a mimic board, which was provided in a power station that was just being commissioned. Such mimic boards give the opportunity of monitoring the most important parameters at a glance, but their use is limited by the large space they require. The use of VDU screens represents a possible solution to this problem. (orig./DG) [de

  6. Environmental and process monitoring technologies

    International Nuclear Information System (INIS)

    Vo-Dinh, Tuan

    1993-01-01

    The objective of this conference was to provide a multidisciplinary forum dealing with state-of-the-art methods and instrumentation for environmental and process monitoring. In the last few years, important advances have been made in improving existing analytical methods and in developing new techniques for trace detection of chemicals. These monitoring technologies are a topic of great interest for environmental and industrial control in a wide spectrum of areas. Sensitive detection, selective characterization, and cost-effective analysis are among the most important challenges facing monitoring technologies. This conference, integrating interdisciplinary research and development, aimed to present the most recent advances and applications in the important areas of environmental and process monitoring. Separate abstracts have been prepared for 34 papers for inclusion in the appropriate databases

  7. Automated Monitoring of Carbon Fluxes in a Northern Rocky Mountain Forest Indicates Above-Average Net Primary Productivity During the 2015 Western U.S. Drought

    Science.gov (United States)

    Stenzel, J.; Hudiburg, T. W.

    2016-12-01

    As global temperatures rise in the 21st century, "hotter" droughts will become more intense and persistent, particularly in areas which already experience seasonal drought. Because forests represent a large and persistent terrestrial carbon sink which has previously offset a significant proportion of anthropogenic carbon emissions, forest carbon cycle responses to drought have become a prominent research concern. However, robust mechanistic modeling of carbon balance responses to projected drought effects requires improved observation-driven representations of carbon cycle processes; many such component processes are rarely monitored in complex terrain, are modeled or unrepresented quantities at eddy covariance sites, or are monitored at coarse temporal scales that are not conducive to elucidating process responses at process time scales. In the present study, we demonstrate the use of newly available and affordable automated dendrometers for the estimation of intra-seasonal Net Primary Productivity (NPP) in a Northern Rocky Mountain conifer forest which is impacted by seasonal drought. Results from our pilot study suggest that NPP was restricted by mid-summer moisture deficit under the extraordinary 2015 Western U.S. drought, with greater than 90% of stand growth occurring prior to August. Examination of growth on an inter-annual scale, however, suggests that the study site experienced above-average NPP during this exceptionally hot year. Taken together, these findings indicate that intensifying mid-summer drought in regional forests has affected the timing but has not diminished the magnitude of this carbon flux. By employing automated instrumentation for the intra-annual assessment of NPP, we reveal that annual NPP in regional forests is largely determined before mid-summer and is therefore surprisingly resilient to intensities of seasonal drought that exceed normal conditions of the 20th century.

  8. An automated image processing method for classification of diabetic retinopathy stages from conjunctival microvasculature images

    Science.gov (United States)

    Khansari, Maziyar M.; O'Neill, William; Penn, Richard; Blair, Norman P.; Chau, Felix; Shahidi, Mahnaz

    2017-03-01

    The conjunctiva is a densely vascularized tissue of the eye that provides an opportunity for imaging of human microcirculation. In the current study, automated fine structure analysis of conjunctival microvasculature images was performed to discriminate stages of diabetic retinopathy (DR). The study population consisted of one group of nondiabetic control subjects (NC) and three groups of diabetic subjects, with no clinical DR (NDR), non-proliferative DR (NPDR), or proliferative DR (PDR). Ordinary least squares regression and Fisher linear discriminant analyses were performed to automatically discriminate images between group pairs of subjects. Human observers who were masked to the grouping of subjects performed image discrimination between group pairs. Over 80% and 70% of images of subjects with clinical and non-clinical DR were correctly discriminated by the automated method, respectively. The discrimination rates of the automated method were higher than those of the human observers. The fine structure analysis of conjunctival microvasculature images provided discrimination of DR stages and can be potentially useful for DR screening and monitoring.

  9. Automated process control system for heat-treating nuclear power station parts

    International Nuclear Information System (INIS)

    Afanasiadi, N.G.; Demin, V.P.; Launin, B.N.

    1984-01-01

    The basic factors determining the need for an automated process control system (APCS) are discussed, as are system requirements. The basic tasks solved by the system are discussed. The functional scheme for a decentralized, two-level APCS is given

  10. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided

  11. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC)

  12. Concept of a computer network architecture for complete automation of nuclear power plants

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ray, A.

    1990-01-01

    The state of the art in automation of nuclear power plants has been largely limited to computerized data acquisition, monitoring, display, and recording of process signals. Complete automation of nuclear power plants, which would include plant operations, control, and management, fault diagnosis, and system reconfiguration with efficient and reliable man/machine interactions, has been projected as a realistic goal. This paper presents the concept of a computer network architecture that would use a high-speed optical data highway to integrate diverse, interacting, and spatially distributed functions that are essential for a fully automated nuclear power plant

  13. Automated processing of data generated by molecular dynamics

    International Nuclear Information System (INIS)

    Lobato Hoyos, Ivan; Rojas Tapia, Justo; Instituto Peruano de Energia Nuclear, Lima

    2008-01-01

    A new integrated tool for automated processing of data generated by molecular dynamics packages and programs has been developed. The program allows calculation of important quantities such as the pair correlation function, common-neighbor analysis, counting of nanoparticles and their size distribution, and conversion of output files between different formats. The work explains in detail the modules of the tool and the interfaces between them. The use of the program is illustrated with application examples in the calculation of various properties of silver nanoparticles. (author)
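    As an example of one quantity such a tool computes, a minimal pair-correlation-function g(r) routine for particles in a cubic periodic box might look like the following. This is a simplified sketch of the standard textbook algorithm, not the program's actual code.

    ```python
    import numpy as np

    # Minimal g(r) sketch for a cubic box with periodic boundaries; a
    # simplified stand-in for the kind of routine the tool automates.
    def pair_correlation(positions, box, dr=0.1, r_max=None):
        n = len(positions)
        r_max = r_max if r_max is not None else box / 2  # valid minimum-image range
        bins = np.arange(0.0, r_max + dr, dr)
        counts = np.zeros(len(bins) - 1)
        for i in range(n - 1):
            d = positions[i + 1:] - positions[i]
            d -= box * np.round(d / box)            # minimum-image convention
            r = np.linalg.norm(d, axis=1)
            counts += np.histogram(r[r < r_max], bins=bins)[0]
        rho = n / box**3                             # number density
        shell = 4.0 / 3.0 * np.pi * (bins[1:]**3 - bins[:-1]**3)
        g = 2.0 * counts / (n * rho * shell)         # factor 2: each pair counted once
        return 0.5 * (bins[1:] + bins[:-1]), g       # bin centers, g(r)
    ```

    For an ideal gas (uniformly random positions), g(r) fluctuates around 1; structure in a nanoparticle shows up as peaks at the coordination-shell distances.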

  14. Managing Automation: A Process, Not a Project.

    Science.gov (United States)

    Hoffmann, Ellen

    1988-01-01

    Discussion of issues in management of library automation includes: (1) hardware, including systems growth and contracts; (2) software changes, vendor relations, local systems, and microcomputer software; (3) item and authority databases; (4) automation and library staff, organizational structure, and managing change; and (5) environmental issues,…

  15. The Effects of Automated Prompting and Self-Monitoring on Homework Completion for a Student with Attention Deficit Hyperactivity Disorder

    Science.gov (United States)

    Blicha, Amy; Belfiore, Phillip J.

    2013-01-01

    This study examined the effects of an intervention consisting of automated prompting and self-monitoring on the level of independent homework task completion for an elementary-age student with attention deficit hyperactivity disorder (ADHD). Using a single-subject within-series ABAB design, the results showed a consistent increase and…

  16. Flexible Automation System for Determination of Elemental Composition of Incrustations in Clogged Biliary Endoprostheses Using ICP-MS.

    Science.gov (United States)

    Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin

    2018-02-01

    Automation systems are well established in industry and in life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand for automation solutions can be seen in the field of analytical measurement in chemical synthesis, quality control, and the medical and pharmaceutical fields, as well as in research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be checked regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.

  17. MO-G-BRE-03: Automated Continuous Monitoring of Patient Setup with Second-Check Independent Image Registration

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, X; Fox, T; Schreibmann, E [Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA (United States)

    2014-06-15

    Purpose: To create a non-supervised quality assurance program to monitor image-based patient setup. The system acts as a secondary check by independently computing shifts and rotations, and it interfaces with Varian's database to verify the therapist's work and warn against sub-optimal setups. Methods: Temporary digitally-reconstructed radiographs (DRRs) and OBI radiographic image files created by Varian's treatment console during patient setup are intercepted and used as input to an independent registration module, customized for accuracy, that determines the optimal rotations and shifts. To deal with the poor quality of OBI images, histogram equalization of the live images to their DRR counterparts is performed as a pre-processing step. A search for the most sensitive metric was performed by plotting search spaces subject to various translations, and convergence analysis was applied to ensure the optimizer finds the global minima. The final system configuration uses the NCC metric with 150 histogram bins and a one-plus-one optimizer running for 2000 iterations with customized scales for translations and rotations, in a multi-stage optimization setup that first corrects translations and subsequently rotations. Results: The system was installed clinically to monitor and provide almost real-time feedback on patient positioning. Over a 2-month period, uncorrected pitch values had a mean of 0.016° with a standard deviation of 1.692°, and couch rotations of −0.090° ± 1.547°. The couch shifts were −0.157 ± 0.466 cm vertically, 0.045 ± 0.286 cm laterally and 0.084 ± 0.501 cm longitudinally. Uncorrected pitch angles were the most common source of discrepancies. Large variations in the pitch angles were correlated with patient motion inside the mask. Conclusion: A system for automated quality assurance of the therapist's registration was designed and tested in clinical practice. The approach complements the clinical software's automated registration in
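    The two pre-processing and scoring ingredients named in the Methods, histogram matching of the live image to the DRR and a normalized cross-correlation (NCC) similarity, can be sketched as follows. This is an illustrative stand-in, not Varian's or the authors' implementation, and the function names are my own.

    ```python
    import numpy as np

    # Sketch of histogram matching (live image onto the DRR's intensity
    # distribution) followed by an NCC alignment score. Illustrative only.
    def match_histogram(live, drr):
        """Map live-image gray levels onto the DRR's intensity distribution."""
        src, idx, cnt = np.unique(live.ravel(), return_inverse=True, return_counts=True)
        ref, ref_cnt = np.unique(drr.ravel(), return_counts=True)
        src_cdf = np.cumsum(cnt) / live.size
        ref_cdf = np.cumsum(ref_cnt) / drr.size
        mapped = np.interp(src_cdf, ref_cdf, ref)   # CDF-to-CDF lookup
        return mapped[idx].reshape(live.shape)

    def ncc(a, b):
        """Normalized cross-correlation: 1.0 for perfectly correlated images."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float((a * b).mean())
    ```

    An optimizer such as the one-plus-one scheme mentioned above would repeatedly shift and rotate the live image and keep the transform that maximizes this score.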

  18. MO-G-BRE-03: Automated Continuous Monitoring of Patient Setup with Second-Check Independent Image Registration

    International Nuclear Information System (INIS)

    Jiang, X; Fox, T; Schreibmann, E

    2014-01-01

    Purpose: To create a non-supervised quality assurance program to monitor image-based patient setup. The system acts as a secondary check by independently computing shifts and rotations, and it interfaces with Varian's database to verify the therapist's work and warn against sub-optimal setups. Methods: Temporary digitally-reconstructed radiographs (DRRs) and OBI radiographic image files created by Varian's treatment console during patient setup are intercepted and used as input to an independent registration module, customized for accuracy, that determines the optimal rotations and shifts. To deal with the poor quality of OBI images, histogram equalization of the live images to their DRR counterparts is performed as a pre-processing step. A search for the most sensitive metric was performed by plotting search spaces subject to various translations, and convergence analysis was applied to ensure the optimizer finds the global minima. The final system configuration uses the NCC metric with 150 histogram bins and a one-plus-one optimizer running for 2000 iterations with customized scales for translations and rotations, in a multi-stage optimization setup that first corrects translations and subsequently rotations. Results: The system was installed clinically to monitor and provide almost real-time feedback on patient positioning. Over a 2-month period, uncorrected pitch values had a mean of 0.016° with a standard deviation of 1.692°, and couch rotations of −0.090° ± 1.547°. The couch shifts were −0.157 ± 0.466 cm vertically, 0.045 ± 0.286 cm laterally and 0.084 ± 0.501 cm longitudinally. Uncorrected pitch angles were the most common source of discrepancies. Large variations in the pitch angles were correlated with patient motion inside the mask. Conclusion: A system for automated quality assurance of the therapist's registration was designed and tested in clinical practice. The approach complements the clinical software's automated registration in

  19. Performance Monitoring Of A Computer Numerically Controlled (CNC) Lathe Using Pattern Recognition Techniques

    Science.gov (United States)

    Daneshmend, L. K.; Pak, H. A.

    1984-02-01

    On-line monitoring of the cutting process in a CNC lathe is desirable to ensure unattended fault-free operation in an automated environment. The state of the cutting tool is one of the most important parameters that characterise the cutting process. Direct monitoring of the cutting tool or workpiece is not feasible during machining. However, several variables related to the state of the tool can be measured on-line. A novel monitoring technique is presented which uses cutting torque as the variable for on-line monitoring. A classifier is designed on the basis of the empirical relationship between cutting torque and flank wear. The empirical model required by the on-line classifier is established during an automated training cycle using machine vision for off-line direct inspection of the tool.
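    The scheme described above, an empirical torque-to-wear model fitted offline (where wear is measured directly by machine vision) and then used online for classification from torque alone, can be sketched as follows. The linear model form and the 0.3 mm flank-wear limit are illustrative assumptions, not the paper's.

    ```python
    import numpy as np

    # Illustrative torque-to-wear classifier sketch; model form and wear
    # limit are assumptions, not the authors' empirical model.
    def fit_wear_model(torque, wear):
        """Fit a linear empirical model during the offline training cycle."""
        slope, intercept = np.polyfit(torque, wear, 1)
        return slope, intercept

    def tool_state(model, torque_now, wear_limit=0.3):
        """Classify tool state online from a torque reading (wear limit in mm)."""
        slope, intercept = model
        predicted = slope * torque_now + intercept
        return "worn" if predicted >= wear_limit else "ok"
    ```

    In use, the training cycle pairs measured torques with vision-measured flank wear to fit the model; during production, only the torque signal is needed to flag a worn tool.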

  20. Automation System Products and Research

    OpenAIRE

    Rintala, Mikko; Sormunen, Jussi; Kuisma, Petri; Rahkala, Matti

    2014-01-01

    Automation systems are used in most buildings nowadays. In the past they were mainly used in industry to control and monitor critical systems. Over the past few decades automation systems have become more common and are used today in everything from large industrial installations to private homes. With the growing need for ecological and cost-efficient management systems, home and building automation systems are becoming a standard way of controlling lighting, ventilation, heating, etc. Auto...

  1. Application of neural network and pattern recognition software to the automated analysis of continuous nuclear monitoring of on-load reactors

    Energy Technology Data Exchange (ETDEWEB)

    Howell, J.A.; Eccleston, G.W.; Halbig, J.K.; Klosterbuer, S.F. [Los Alamos National Lab., NM (United States); Larson, T.W. [California Polytechnic State Univ., San Luis Obispo, CA (US)

    1993-08-01

    Automated analysis using pattern recognition and neural network software can help interpret data, call attention to potential anomalies, and improve safeguards effectiveness. Automated software analysis, based on pattern recognition and neural networks, was applied to data collected from a radiation core discharge monitor system located adjacent to an on-load reactor core. Unattended radiation sensors continuously collect data to monitor on-line refueling operations in the reactor. The huge volume of data collected from a number of radiation channels makes it difficult for a safeguards inspector to review it all, check for consistency among the measurement channels, and find anomalies. Pattern recognition and neural network software can analyze large volumes of data from continuous, unattended measurements, thereby improving and automating the detection of anomalies. The authors developed a prototype pattern recognition program that determines the reactor power level and identifies the times when fuel bundles are pushed through the core during on-line refueling. Neural network models were also developed to predict fuel bundle burnup and to identify the region of the on-load reactor face from which fuel bundles were discharged, based on the radiation signals. In the preliminary data set, which was limited and consisted of four distinct burnup regions, the neural network model correctly predicted the burnup region with an accuracy of 92%.
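    One of the pattern-recognition tasks described, flagging the times when fuel bundles are pushed through the core, can be sketched as a simple excursion detector over a running baseline in a single radiation channel. The windowing and threshold scheme here are my assumptions for illustration, not the authors' algorithm.

    ```python
    import numpy as np

    # Toy excursion detector for fuel-push events in one radiation channel;
    # the running-baseline threshold scheme is an illustrative assumption.
    def detect_pushes(signal, window=20, k=4.0):
        """Return sample indices where the signal jumps k sigmas above baseline."""
        events = []
        for i in range(window, len(signal)):
            base = signal[i - window:i]
            mu, sigma = base.mean(), base.std() + 1e-9
            # debounce: ignore re-triggers within one window of the last event
            if signal[i] > mu + k * sigma and (not events or i - events[-1] > window):
                events.append(i)
        return events
    ```

    A production system would fuse several channels and add consistency checks between them, which is where the neural network burnup models come in.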

  2. Automated system for data acquisition and monitoring

    Directory of Open Access Journals (Sweden)

    Borza Sorin

    2017-01-01

    Full Text Available Environmental management has become a very important issue with the development of human society. Multiple systems exist that automatically monitor the environment. In this paper we propose a system that integrates GIS software and data acquisition software. In addition, the proposed system implements the AHP multi-criteria method, which can provide an online answer on the influence of each pollutant within the limited geographical area being monitored. Pollutant factors of the limited geographical area are captured automatically by specific sensors through an acquisition board and LabVIEW software with a virtual instrument, and transferred into an Access database. From the Access database they are taken up by Geomedia Professional software and processed using the AHP multi-criteria method, so that at any moment their influence on the environment can be classified and plotted on the screen of the monitoring system. The system allows automatic collection of data, storage, and the generation of GIS elements. The research presented in this paper was aimed at implementing multi-criteria methods in GIS software.
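    The AHP step such a system applies can be sketched as deriving criterion weights from a pairwise-comparison matrix via its principal eigenvector, with the standard consistency-ratio check. This is the textbook version of AHP, not the paper's specific implementation, and the matrix values in the test are illustrative.

    ```python
    import numpy as np

    # Textbook AHP sketch: weights from the principal eigenvector of a
    # pairwise-comparison matrix, plus Saaty's consistency ratio.
    def ahp_weights(pairwise):
        vals, vecs = np.linalg.eig(pairwise)
        i = np.argmax(vals.real)                       # principal eigenvalue
        w = np.abs(vecs[:, i].real)
        w /= w.sum()                                   # normalized priority vector
        n = pairwise.shape[0]
        ci = (vals.real[i] - n) / (n - 1)              # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)   # Saaty's random index
        return w, ci / ri                              # weights, consistency ratio
    ```

    A consistency ratio above about 0.1 conventionally signals that the pairwise judgments should be revisited before the weights are used to rank pollutant influences.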

  3. [Automated processing of data from the 1985 population and housing census].

    Science.gov (United States)

    Cholakov, S

    1987-01-01

    The author describes the method of automated data processing used in the 1985 census of Bulgaria. He notes that the computerization of the census involves decentralization and the use of regional computing centers as well as data processing at the Central Statistical Office's National Information Computer Center. Special attention is given to problems concerning the projection and programming of census data. (SUMMARY IN ENG AND RUS)

  4. Longwall automation 2

    Energy Technology Data Exchange (ETDEWEB)

    David Hainsworth; David Reid; Con Caris; J.C. Ralston; C.O. Hargrave; Ron McPhee; I.N. Hutchinson; A. Strange; C. Wesner [CSIRO (Australia)

    2008-05-15

    This report covers a nominal two-year extension to the Major Longwall Automation Project (C10100). Production-standard implementation of Longwall Automation Steering Committee (LASC) automation systems has been achieved at Beltana and Broadmeadow mines. The systems are now used on a 24/7 basis and have provided production benefits to the mines. The LASC Information System (LIS) has been updated and has been implemented successfully in the IT environment of major coal mining houses. This enables 3D visualisation of the longwall environment and equipment to be accessed on line. A simulator has been specified and a prototype system is now ready for implementation. The Shearer Position Measurement System (SPMS) has been upgraded to a modular, commercial production-standard hardware solution. A compact hardware solution for visual face monitoring has been developed, an approved enclosure for a thermal infrared camera has been produced, and software for providing horizon control through faulted conditions has been delivered. The incorporation of the LASC Cut Model information into OEM horizon control algorithms has been bench- and underground-tested. A prototype system for shield convergence monitoring has been produced and studies to identify techniques for coal flow optimisation and void monitoring have been carried out. Liaison with equipment manufacturers has been maintained and technology delivery mechanisms for LASC hardware and software have been established.

  5. Automation facilities for agricultural machinery control

    Directory of Open Access Journals (Sweden)

    A. Yu. Izmaylov

    2017-01-01

    Full Text Available The possibility of using automation equipment for agricultural machinery control is investigated. The authors propose solutions for creating a centralized, unified automated information system for the management of mobile aggregates. In accordance with modern requirements this system should be open and integrated into the general control scheme of the agricultural enterprise. Standard hardware, software and communication features should be used for monitoring and control tasks, so the scheme should be built with unified modules and Russian standards. A complex, multivariate, unified automated control system for different objects of agricultural purpose, based on block-and-modular construction, should satisfy the following principles: high reliability, simplicity of service, low operating expenses, a short payback period owing to increased productivity, reduced losses during harvesting, postharvest processing and storage, and improved energy indices. Technological process control in agricultural production generally operates with feedback. An example without feedback is program control of the temperature in a storage facility in the cooling mode. Feedback in technological process control allows optimal distribution of functions in man-machine systems and the formation of intelligent, ergonomic interfaces consistent with the professional perceptions of decision-makers. The negative feedback created by the control unit automatically maintains a quality index of the technological process at the set level. Quantitative analysis of a production situation rests on the deeply formalized basis of computer facilities, which promotes making optimal decisions. Introduction of an automated information control system increases labor productivity by 40 percent and reduces energy costs by 25 percent.
Improvement of quality of the executed technological

  6. Starting the automation process by using group technology

    Directory of Open Access Journals (Sweden)

    Jorge Andrés García Barbosa

    2004-09-01

    Full Text Available This article describes starting up an automation process based on applying group technology (GT). Mecanizados CNC, a company making metallurgical-sector products, bases the layout (organisation and disposition) of its machinery on the concept of manufacturing cells; production is programmed once the best location for the equipment has been determined. The order of making products and the suitable setting up of tools for the machinery in the cells is established, aimed at minimising set-up time, leading to a 15% improvement in productivity.

  7. Automated forms processing - An alternative to time-consuming manual double entry of data in arthroplasty registries?

    DEFF Research Database (Denmark)

    Paulsen, Aksel

    2012-01-01

    Background: The clinical and scientific usage of patient-reported outcome measures is increasing in the health services. Often paper forms are used. Manual double entry of data is defined as the definitive gold standard for transferring data to an electronic format, but the process is laborious. Automated forms processing may be an alternative, but further validation is warranted. Materials and Methods: 200 patients were randomly selected from a cohort of 5777 patients who had previously answered two different questionnaires. The questionnaires were scanned using an automated forms processing... Results: There was no statistical difference compared to single-key data entry (error proportion = 0.007 (95% CI: 0.001-0.024), p = 0.656), as well as double-key data entry (error proportion = 0.003 (95% CI: 0.000-0.019), p = 0.319). Discussion and conclusion: Automated forms processing is a valid alternative to double manual data...

  8. Automated Impedance Tomography for Monitoring Permeable Reactive Barrier Health

    Energy Technology Data Exchange (ETDEWEB)

    LaBrecque, D J; Adkins, P L

    2009-07-02

    The objective of this research was the development of an autonomous, automated electrical geophysical monitoring system that allows near real-time assessment of Permeable Reactive Barrier (PRB) health and aging and that provides this assessment through a web-based interface to site operators, owners and regulatory agencies. Field studies were performed at four existing PRB sites: (1) a uranium tailings site near Monticello, Utah, (2) the DOE complex at Kansas City, Missouri, (3) the Denver Federal Center in Denver, Colorado and (4) the Asarco Smelter site in East Helena, Montana. Preliminary surface data over the PRB sites were collected in December 2005. After the initial round of data collection, the plan was modified to include studies inside the barriers in order to better understand barrier aging processes. In September 2006 an autonomous data collection system was designed and installed at the EPA PRB; the electrode setups in the barrier were revised and three new vertical electrode arrays were placed in dedicated boreholes in direct contact with the PRB material. Final data were collected at the Kansas City, Denver and Monticello, Utah PRB sites in the fall of 2007. At the Asarco Smelter site in East Helena, Montana, nearly continuous data were collected by the autonomous monitoring system from June 2006 to November 2007. These data provided us with a picture of the evolution of the barrier, enabling us to examine barrier changes more precisely and determine whether these changes are due to installation issues or are normal barrier aging. Two rounds of laboratory experiments were carried out during the project. We conducted column experiments to investigate the effect of mineralogy on the electrical signatures resulting from iron corrosion and mineral precipitation in zero valent iron (ZVI) columns. In the second round of laboratory experiments we observed the electrical response from simulation of actual field PRBs at two sites: the

  9. Standard IEC 61850 substation automation

    Energy Technology Data Exchange (ETDEWEB)

    Bricchi, A.; Mezzadri, D. [Selta, Tortoreto (Italy)

    2008-07-01

    The International Electrotechnical Commission (IEC) 61850 standard is the reference communication protocol for protection and control systems in electrical substations. It models all the elements and functionalities of an electrical substation, including physical elements such as switches or circuit breakers, as well as protection, control and monitoring functionalities. Network managers need to renew power substation automation and control systems in order to improve the efficiency and quality of services offered by electric utilities. Selta has proposed a new integrated solution for the automation of power substations which is fully compliant with the IEC 61850 standard. The solution integrates control, automation, protection, monitoring and maintenance functions and applies leading-edge technology, particularly for the TERNA network. The system is based on the use of many electronic devices at a power plant, each with a specialized function, all interconnected via a station LAN. This solution was tested on the TERNA network in Italy, in VHV and HV stations. It was shown to offer many advantages, such as an architecture based on full interoperability between control, monitoring and protection equipment; centralized and distributed automation; a station LAN that allows full interoperability between different bay units and protection relays in order to integrate equipment from various suppliers; the integration of automation systems in existing bay units and protection relays equipped with standard communication buses or with proprietary interfaces; and time synchronization for the entire system through a station GPS reception system. 10 refs., 1 tab., 7 figs.
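    As a toy illustration of the data-modeling idea behind the standard, the sketch below represents a logical device containing a circuit-breaker logical node (XCBR) with a position data object. The class structure, and any name beyond XCBR and Pos.stVal, is an illustrative assumption, not a conformant IEC 61850 implementation:

```python
from dataclasses import dataclass, field

# Toy sketch of IEC 61850-style data modeling: a logical device contains
# logical nodes (e.g. XCBR for a circuit breaker), each holding named data
# objects. The classes are illustrative, not a conformant implementation.
@dataclass
class LogicalNode:
    name: str                       # e.g. "XCBR1"
    data: dict = field(default_factory=dict)

@dataclass
class LogicalDevice:
    name: str
    nodes: dict = field(default_factory=dict)

    def add(self, node):
        self.nodes[node.name] = node

bay = LogicalDevice("Bay1")
bay.add(LogicalNode("XCBR1", {"Pos.stVal": "open"}))   # breaker position
bay.nodes["XCBR1"].data["Pos.stVal"] = "closed"        # close command
```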

  10. Automated processing of thermal infrared images of Osservatorio Vesuviano permanent surveillance network by using Matlab code

    Science.gov (United States)

    Sansivero, Fabio; Vilardo, Giuseppe; Caputo, Teresa

    2017-04-01

    The permanent thermal infrared surveillance network of Osservatorio Vesuviano (INGV) is composed of 6 stations which acquire IR frames of fumarole fields in the Campi Flegrei caldera and inside the Vesuvius crater (Italy). The IR frames are uploaded to a dedicated server in the Surveillance Center of Osservatorio Vesuviano in order to process the infrared data and to extract all the information they contain. In a first phase the infrared data are processed by an automated system (A.S.I.R.A. Acq - Automated System of IR Analysis and Acquisition) developed in the Matlab environment with a user-friendly graphical user interface (GUI). ASIRA daily generates time-series of residual temperature values of the maximum temperatures observed in the IR scenes after the removal of seasonal effects. These time-series are displayed in the Surveillance Room of Osservatorio Vesuviano and provide information about the evolution of the shallow temperature field of the observed areas. In particular, the features of ASIRA Acq include: a) efficient quality selection of IR scenes, b) IR image co-registration with respect to a reference frame, c) seasonal correction using a background-removal methodology, d) archiving of IR matrices and of the processed data in shared archives accessible to interrogation. The daily archived records can also be processed by ASIRA Plot (Matlab code with GUI) to visualize IR data time-series and to help in evaluating input parameters for further data processing and analysis. Additional processing features are accomplished in a second phase by ASIRA Tools, Matlab code with a GUI developed to extract further information from the dataset in an automated way.
The main functions of ASIRA Tools are: a) the analysis of temperature variations of each pixel of the IR frame in a given time interval, b) the removal of seasonal effects from the temperature of every pixel in the IR frames by using an analytic approach (removal of the sinusoidal long-term seasonal component by using a
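    The removal of a sinusoidal long-term seasonal component can be sketched as follows. This is a minimal stand-alone illustration, not the ASIRA Matlab code; the `remove_seasonal` function and the synthetic two-year series are assumptions:

```python
import math

def remove_seasonal(temps, period):
    """Fit and remove a sinusoidal seasonal component of known period.

    Assumes the record spans a whole number of periods, so the constant,
    cosine and sine basis functions are orthogonal and the least-squares
    coefficients reduce to simple projections.
    """
    n = len(temps)
    w = 2.0 * math.pi / period
    mean = sum(temps) / n
    a = 2.0 / n * sum((x - mean) * math.cos(w * t) for t, x in enumerate(temps))
    b = 2.0 / n * sum((x - mean) * math.sin(w * t) for t, x in enumerate(temps))
    seasonal = [mean + a * math.cos(w * t) + b * math.sin(w * t) for t in range(n)]
    return [x - s for x, s in zip(temps, seasonal)]

# Synthetic two-year series of daily maxima: 15 degC mean, 10 degC annual swing.
series = [15.0 + 10.0 * math.sin(2 * math.pi * t / 365.0) for t in range(730)]
residuals = remove_seasonal(series, 365.0)  # non-seasonal anomalies would survive here
```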

  11. Testing the effectiveness of automated acoustic sensors for monitoring vocal activity of Marbled Murrelets Brachyramphus marmoratus

    Science.gov (United States)

    Cragg, Jenna L.; Burger, Alan E.; Piatt, John F.

    2015-01-01

    Cryptic nest sites and secretive breeding behavior make population estimates and monitoring of Marbled Murrelets Brachyramphus marmoratus difficult and expensive. Standard audio-visual and radar protocols have been refined but require intensive field time by trained personnel. We examined the detection range of automated sound recorders (Song Meters; Wildlife Acoustics Inc.) and the reliability of automated recognition models (“recognizers”) for identifying and quantifying Marbled Murrelet vocalizations during the 2011 and 2012 breeding seasons at Kodiak Island, Alaska. The detection range of murrelet calls by Song Meters was estimated to be 60 m. Recognizers detected 20 632 murrelet calls (keer and keheer) from a sample of 268 h of recordings, yielding 5 870 call series, which compared favorably with human scanning of spectrograms (on average detecting 95% of the number of call series identified by a human observer, but not necessarily the same call series). The false-negative rate (percentage of murrelet call series that the recognizers failed to detect) was 32%, mainly involving weak calls and short call series. False-positives (other sounds included by recognizers as murrelet calls) were primarily due to complex songs of other bird species, wind and rain. False-positive rates were lowest in forest nesting habitat (48%) and highest in shrubby vegetation where calls of other birds were common (97%–99%). Acoustic recorders tracked spatial and seasonal trends in vocal activity, with higher call detections in high-quality forested habitat and during late July/early August. Automated acoustic monitoring of Marbled Murrelet calls could provide cost-effective, valuable information for assessing habitat use and temporal and spatial trends in nesting activity; reliability is dependent on careful placement of sensors to minimize false-positives and on prudent application of digital recognizers with visual checking of spectrograms.

  12. An Advanced Pre-Processing Pipeline to Improve Automated Photogrammetric Reconstructions of Architectural Scenes

    Directory of Open Access Journals (Sweden)

    Marco Gaiani

    2016-02-01

    Automated image-based 3D reconstruction methods are increasingly common in 3D modeling applications. Fully automated solutions give the impression that quite impressive visual 3D models can be derived from a sample of randomly acquired images. Although the level of automation is reaching very high standards, image quality is a fundamental prerequisite for successful and photo-realistic 3D products, in particular when dealing with large datasets of images. This article presents an efficient pipeline based on color enhancement, image denoising, color-to-gray conversion and image content enrichment. The pipeline stems from an analysis of various state-of-the-art algorithms and aims to adjust the most promising methods, giving solutions to typical failure causes. The assessment evaluation proves how effective image pre-processing, which considers the entire image dataset, can improve the automated orientation procedure and dense 3D point cloud reconstruction, even in the case of poor-texture scenarios.
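    Two of the simplest stages of such a pipeline, color-to-gray conversion and denoising, can be sketched as follows. This is an illustrative stand-in for the article's methods (which also include color enhancement and content enrichment), assuming BT.601 luminance weights and a 3x3 median filter:

```python
def to_gray(rgb):
    """Color-to-gray conversion with ITU-R BT.601 luminance weights."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row] for row in rgb]

def median3(gray):
    """3x3 median filter: a simple edge-preserving denoising step."""
    h, w = len(gray), len(gray[0])
    out = [row[:] for row in gray]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(gray[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of the 9 samples
    return out

noisy = [[10.0] * 5 for _ in range(5)]
noisy[2][2] = 255.0               # a single impulse-noise pixel
denoised = median3(noisy)         # the impulse is removed
```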

  13. The Atmospheric Data Acquisition And Interpolation Process For Center-TRACON Automation System

    Science.gov (United States)

    Jardin, M. R.; Erzberger, H.; Denery, Dallas G. (Technical Monitor)

    1995-01-01

    The Center-TRACON Automation System (CTAS), an advanced new air traffic automation program, requires knowledge of spatial and temporal atmospheric conditions such as wind speed and direction, temperature and pressure in order to accurately predict aircraft trajectories. Real-time atmospheric data are available in a grid format, so CTAS must interpolate between the grid points to estimate atmospheric parameter values. The atmospheric data grid is generally not in the same coordinate system as that used by CTAS, so coordinate conversions are required. Both the interpolation and coordinate conversion processes can introduce errors into the atmospheric data and reduce interpolation accuracy. More accurate algorithms may be computationally expensive or may require a prohibitively large amount of data storage capacity, so trade-offs must be made between accuracy and the available computational and data storage resources. The atmospheric data acquisition and processing employed by CTAS will be outlined in this report. The effects of atmospheric data processing on CTAS trajectory prediction will also be analyzed, and several examples of the trajectory prediction process will be given.
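    The grid interpolation step can be sketched with a bilinear interpolator. The function and the tiny wind grid below are illustrative assumptions; a real implementation would interpolate in three or four dimensions after the coordinate conversion:

```python
def bilinear(grid, x, y):
    """Bilinear interpolation on a regular grid.

    grid[j][i] holds the field value (e.g. the eastward wind component in
    m/s) at integer grid coordinates (i, j); (x, y) is the query point in
    the same coordinates, assumed already converted from the CTAS frame.
    """
    i, j = int(x), int(y)
    fx, fy = x - i, y - j
    v00, v10 = grid[j][i], grid[j][i + 1]
    v01, v11 = grid[j + 1][i], grid[j + 1][i + 1]
    return (v00 * (1 - fx) * (1 - fy) + v10 * fx * (1 - fy)
            + v01 * (1 - fx) * fy + v11 * fx * fy)

wind_u = [[0.0, 10.0],
          [20.0, 30.0]]              # a 2x2 patch of eastward wind (m/s)
center = bilinear(wind_u, 0.5, 0.5)  # average of the four corners
```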

  14. Future of Automated Insulin Delivery Systems

    NARCIS (Netherlands)

    Castle, Jessica R.; DeVries, J. Hans; Kovatchev, Boris

    2017-01-01

    Advances in continuous glucose monitoring (CGM) have brought on a paradigm shift in the management of type 1 diabetes. These advances have enabled the automation of insulin delivery, where an algorithm determines the insulin delivery rate in response to the CGM values. There are multiple automated

  15. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation.

    Science.gov (United States)

    Capel, Andrew J; Wright, Andrew; Harding, Matthew J; Weaver, George W; Li, Yuqi; Harris, Russell A; Edmondson, Steve; Goodridge, Ruth D; Christie, Steven D R

    2017-01-01

    Additive manufacturing or '3D printing' is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis.
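    The automated optimisation loop, in which online analysis feeds back into the choice of reaction conditions, can be sketched as a simple screen-and-select routine. The `optimise` function and the simulated yield curve are assumptions standing in for the commercial flow-chemistry and analysis equipment:

```python
def optimise(measure_yield, setpoints):
    """Screen candidate setpoints and return the best one.

    measure_yield stands in for the online chromatographic/spectroscopic
    analysis: it maps a temperature setpoint to an observed yield.
    """
    best_t, best_y = None, float("-inf")
    for t in setpoints:
        y = measure_yield(t)
        if y > best_y:
            best_t, best_y = t, y
    return best_t, best_y

# Hypothetical response surface with a yield peak near 80 degC.
simulated = lambda t: 1.0 - ((t - 80.0) / 50.0) ** 2
best_t, best_y = optimise(simulated, range(40, 121, 10))
```

A real run would replace `simulated` with a call that sets the reactor temperature, waits for steady state, and reads the inline analysis.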

  16. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation

    Directory of Open Access Journals (Sweden)

    Andrew J. Capel

    2017-01-01

    Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis.

  17. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation

    Science.gov (United States)

    Capel, Andrew J; Wright, Andrew; Harding, Matthew J; Weaver, George W; Li, Yuqi; Harris, Russell A; Edmondson, Steve; Goodridge, Ruth D

    2017-01-01

    Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro- and milliscale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multifunctional fluidic devices with embedded reaction monitoring capability. The selectively laser melted parts are the first published examples of multifunctional 3D printed metal fluidic devices. These devices allow high temperature and pressure chemistry to be performed in solvent systems destructive to the majority of devices manufactured via stereolithography, polymer jetting and fused deposition modelling processes previously utilised for this application. These devices were integrated with commercially available flow chemistry, chromatographic and spectroscopic analysis equipment, allowing automated online and inline optimisation of the reaction medium. This set-up allowed the optimisation of two reactions, a ketone functional group interconversion and a fused polycyclic heterocycle formation, via spectroscopic and chromatographic analysis. PMID:28228852

  18. Statistical algorithm for automated signature analysis of power spectral density data

    International Nuclear Information System (INIS)

    Piety, K.R.

    1977-01-01

    A statistical algorithm has been developed and implemented on a minicomputer system for on-line surveillance applications. Power spectral density (PSD) measurements on process signals are the performance signatures that characterize the "health" of the monitored equipment. Statistical methods provide a quantitative basis for automating the detection of anomalous conditions. The surveillance algorithm has been tested on signals from neutron sensors, proximeter probes, and accelerometers to determine its potential for monitoring nuclear reactors and rotating machinery
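    A minimal form of such statistical anomaly detection compares each PSD bin of a new measurement against a baseline mean plus k standard deviations. The function and data below are an illustrative sketch, not the minicomputer algorithm itself:

```python
def band_anomaly(psd, baseline_mean, baseline_std, k=3.0):
    """Flag PSD bins whose power exceeds the baseline mean + k*sigma.

    The per-bin baseline statistics would be accumulated from PSD
    measurements taken while the monitored equipment was known healthy.
    """
    return [i for i, p in enumerate(psd)
            if p > baseline_mean[i] + k * baseline_std[i]]

baseline_mean = [1.0, 1.0, 1.0, 1.0]   # healthy per-bin power levels
baseline_std = [0.1, 0.1, 0.1, 0.1]
current = [1.05, 1.80, 0.95, 1.10]     # new measurement to screen
flags = band_anomaly(current, baseline_mean, baseline_std)  # bins above mean + 3*sigma
```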

  19. Automated information and control complex of hydro-gas endogenous mine processes

    Science.gov (United States)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

    An automated information and control complex is considered that is designed to prevent accidents related to the aerological situation in underground workings, to track the issue and return of individual devices, to transmit and display measurement data, and to form preemptive solutions. Examples of the automated workstation of an air-gas control operator using individual devices are given. The statistical characteristics of field data characterizing the aerological situation in the mine are obtained. These studies confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of gas control points. An adaptive (multivariant) algorithm has been developed for processing measurement information on continuous multidimensional quantities and influencing factors.

  20. ADVANCES IN CLOG STATE MONITORING FOR USE IN AUTOMATED REED BED INSTALLATIONS

    Directory of Open Access Journals (Sweden)

    Theodore HUGHES-RILEY

    2014-06-01

    Constructed wetlands are a popular, environmentally conscious form of wastewater treatment that has proliferated across Europe and the rest of the world in recent years. The ability to monitor the conditions in the bed and control input factors such as heating and aeration may extend the lifetime of the reed bed substantially beyond the ten-year lifetime normally reached. The Autonomous Reed Bed Installation (ARBI) project is an EU FP7 initiative to develop a reed bed with automated control over input parameters based on readings taken from embedded sensors. Automated remedial action may improve bed treatment efficiency and prolong the life of the bed, avoiding the need to refurbish it, which is both time consuming and costly. One critical parameter to observe is the clog state of the reed bed, as clogging can severely impact the efficiency of water treatment to the point of the bed becoming inoperable. Magnetic resonance (MR) sensors can be a powerful tool in determining clogging levels and have previously been explored in the literature. This work is based on a conference paper (2nd International Conference "Water resources and wetlands", 2014) and details magnetic sensors suitable for long-term embedding in a constructed wetland. Unlike previous studies, this work examines a probe embedded in a wetland.

  1. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena

    Science.gov (United States)

    Savant, Vaibhav; Smith, Niall

    2016-07-01

    We report on the current status in the development of a pilot automated data acquisition and reduction pipeline built around the operation of two nodes of remotely operated robotic telescopes located in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes, 6" and 16" OTAs, housed in two separate domes, while the node in California is a replica of the 6" telescope. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to shed light on the microvariability of blazars through precision optical photometry, using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of their locations in strategically separated time zones. Ultimately we wish to investigate the applicability of shock-in-jet and geometric models, which attempt to explain the processes at work in AGNs that result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and optimised for simultaneous two-band photometry on our 16" OTA.
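    The variability check that triggers follow-up observations can be sketched with differential photometry against a comparison star. The functions, fluxes and the 0.05 mag threshold below are illustrative assumptions, not the QuickPhot implementation:

```python
import math

def differential_mags(target_flux, comp_flux):
    """Differential magnitudes of the target against a comparison star,
    cancelling first-order atmospheric variations."""
    return [-2.5 * math.log10(t / c) for t, c in zip(target_flux, comp_flux)]

def is_variable(mags, threshold=0.05):
    """Flag the target when its peak-to-peak differential magnitude
    exceeds the threshold (in magnitudes)."""
    return max(mags) - min(mags) > threshold

target = [1000.0, 995.0, 1100.0, 1005.0]  # aperture-photometry counts
comp = [2000.0, 1990.0, 2005.0, 2010.0]   # steady comparison star
mags = differential_mags(target, comp)
flagged = is_variable(mags)               # would trigger multi-band follow-up
```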

  2. Improved automated lumen contour detection by novel multifrequency processing algorithm with current intravascular ultrasound system.

    Science.gov (United States)

    Kume, Teruyoshi; Kim, Byeong-Keuk; Waseda, Katsuhisa; Sathyanarayana, Shashidhar; Li, Wenguang; Teo, Tat-Jin; Yock, Paul G; Fitzgerald, Peter J; Honda, Yasuhiro

    2013-02-01

    The aim of this study was to evaluate a new fully automated lumen border tracing system based on a novel multifrequency processing algorithm. We developed the multifrequency processing method to enhance arterial lumen detection by exploiting the differential scattering characteristics of blood and arterial tissue. The implementation of the method can be integrated into current intravascular ultrasound (IVUS) hardware. This study was performed in vivo with conventional 40-MHz IVUS catheters (Atlantis SR Pro™, Boston Scientific Corp, Natick, MA) in 43 clinical patients with coronary artery disease. A total of 522 frames were randomly selected, and lumen areas were measured after automatically tracing lumen borders with the new tracing system and a commercially available tracing system (TraceAssist™) referred to as the "conventional tracing system." The data assessed by the two automated systems were compared with the results of manual tracings by experienced IVUS analysts. New automated lumen measurements showed better agreement with manual lumen area tracings compared with those of the conventional tracing system (correlation coefficient: 0.819 vs. 0.509). When compared against manual tracings, the new algorithm also demonstrated improved systematic error (mean difference: 0.13 vs. -1.02 mm²) and random variability (standard deviation of difference: 2.21 vs. 4.02 mm²) compared with the conventional tracing system. This preliminary study showed that the novel fully automated tracing system based on the multifrequency processing algorithm can provide more accurate lumen border detection than current automated tracing systems and thus, offer a more reliable quantitative evaluation of lumen geometry. Copyright © 2011 Wiley Periodicals, Inc.
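    The agreement statistics reported above (mean difference as systematic error, standard deviation of differences as random variability) can be computed for any pair of measurement series. The sample areas below are made up for illustration, not taken from the study:

```python
import math

def agreement(auto, manual):
    """Mean difference (systematic error) and sample standard deviation of
    differences (random variability) between paired measurements."""
    diffs = [a - m for a, m in zip(auto, manual)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, sd

auto = [5.1, 6.3, 7.0, 4.8]    # automated lumen areas (mm^2), made up
manual = [5.0, 6.0, 7.4, 4.7]  # manual tracings of the same frames
bias, sd = agreement(auto, manual)
```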

  3. Partitioning, Automation and Error Recovery in the Control and Monitoring System of an LHC Experiment

    Institute of Scientific and Technical Information of China (English)

    C.Gaspar

    2001-01-01

    The Joint Controls Project (JCOP) is a collaboration between CERN and the four LHC experiments to find and implement common solutions for their control and monitoring systems. As part of this project, an Architecture Working Group was set up in order to study the requirements and devise an architectural model that would suit the four experiments. Many issues were studied by this working group: alarm handling, access control, hierarchical control, etc. This paper will report on the specific issue of hierarchical control, and in particular partitioning, automation and error recovery.
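    The partitioning idea, excluding a subsystem from the hierarchy so it can run stand-alone while commands propagate to the rest, can be sketched as a small control tree. The class, state names and tree below are illustrative assumptions, not the JCOP design:

```python
class ControlNode:
    """Toy hierarchical-control node: commands propagate down the tree,
    states are summarized upward, and a partitioned-out child is excluded
    so its subsystem can be operated stand-alone."""

    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.state = "READY"
        self.partitioned_out = False

    def command(self, cmd):
        for child in self.children:
            if not child.partitioned_out:
                child.command(cmd)
        if not self.children:    # leaf device reacts to the command
            self.state = "RUNNING" if cmd == "START" else "READY"

    def summary(self):
        active = [c.summary() for c in self.children if not c.partitioned_out]
        if not active:
            return self.state
        return "RUNNING" if all(s == "RUNNING" for s in active) else "MIXED"

daq = ControlNode("DAQ")
trigger = ControlNode("Trigger")
experiment = ControlNode("Experiment", [daq, trigger])
trigger.partitioned_out = True   # partition the trigger out for stand-alone tests
experiment.command("START")      # starts only the non-partitioned branch
```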

  4. Developing an automated database for monitoring ultrasound- and computed tomography-guided procedure complications and diagnostic yield.

    Science.gov (United States)

    Itri, Jason N; Jones, Lisa P; Kim, Woojin; Boonn, William W; Kolansky, Ana S; Hilton, Susan; Zafar, Hanna M

    2014-04-01

    Monitoring complications and diagnostic yield for image-guided procedures is an important component of maintaining high quality patient care promoted by professional societies in radiology and accreditation organizations such as the American College of Radiology (ACR) and Joint Commission. These outcome metrics can be used as part of a comprehensive quality assurance/quality improvement program to reduce variation in clinical practice, provide opportunities to engage in practice quality improvement, and contribute to developing national benchmarks and standards. The purpose of this article is to describe the development and successful implementation of an automated web-based software application to monitor procedural outcomes for US- and CT-guided procedures in an academic radiology department. The open source tools PHP: Hypertext Preprocessor (PHP) and MySQL were used to extract relevant procedural information from the Radiology Information System (RIS), auto-populate the procedure log database, and develop a user interface that generates real-time reports of complication rates and diagnostic yield by site and by operator. Utilizing structured radiology report templates resulted in significantly improved accuracy of information auto-populated from radiology reports, as well as greater compliance with manual data entry. An automated web-based procedure log database is an effective tool to reliably track complication rates and diagnostic yield for US- and CT-guided procedures performed in a radiology department.
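    The report-generation query can be sketched against an in-memory SQLite stand-in for the MySQL database described above. The schema, column names and sample rows are illustrative assumptions, not the authors' actual design:

```python
import sqlite3

# In-memory stand-in for the procedure-log database; table and column
# names are illustrative, not the authors' actual schema.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE procedures (
    id INTEGER PRIMARY KEY,
    modality TEXT,         -- 'US' or 'CT'
    operator TEXT,
    complication INTEGER,  -- 0/1, parsed from the structured report
    diagnostic INTEGER     -- 0/1, specimen yielded a diagnosis
)""")
rows = [("US", "smith", 0, 1), ("US", "smith", 1, 1),
        ("CT", "jones", 0, 0), ("CT", "jones", 0, 1)]
db.executemany(
    "INSERT INTO procedures (modality, operator, complication, diagnostic)"
    " VALUES (?, ?, ?, ?)", rows)

# Real-time report: complication rate and diagnostic yield by operator.
rates = db.execute("""SELECT operator, AVG(complication), AVG(diagnostic)
                      FROM procedures GROUP BY operator
                      ORDER BY operator""").fetchall()
```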

  5. Wireless energizing system for an automated implantable sensor

    Energy Technology Data Exchange (ETDEWEB)

    Swain, Biswaranjan; Nayak, Praveen P.; Kar, Durga P.; Bhuyan, Satyanarayan; Mishra, Laxmi P. [Department of Electronics and Instrumentation Engineering, Siksha ‘O’ Anusandhan University, Bhubaneswar 751030 (India)

    2016-07-15

    The wireless drive of an automated implantable electronic sensor has been explored for health monitoring applications. The proposed system comprises an automated biomedical sensing system which is energized through resonant inductive coupling. The implantable sensor unit is able to monitor the body temperature parameter and sends the corresponding telemetry data back wirelessly to the data recording unit. It has been observed that the wireless power delivery system is capable of energizing the automated biomedical implantable electronic sensor placed at a distance of 3 cm from the power transmitter with an energy transfer efficiency of 26% at the operating resonant frequency of 562 kHz. This proposed method ensures real-time monitoring of different human body temperatures around the clock. The monitored temperature data have been compared with a calibrated temperature measurement system to ascertain the accuracy of the proposed system. The investigated technique can also be useful for monitoring other body parameters such as blood pressure, bladder pressure, and physiological signals of the patient in vivo using various implantable sensors.

  6. Wireless energizing system for an automated implantable sensor

    International Nuclear Information System (INIS)

    Swain, Biswaranjan; Nayak, Praveen P.; Kar, Durga P.; Bhuyan, Satyanarayan; Mishra, Laxmi P.

    2016-01-01

    The wireless drive of an automated implantable electronic sensor has been explored for health monitoring applications. The proposed system comprises an automated biomedical sensing system which is energized through resonant inductive coupling. The implantable sensor unit is able to monitor the body temperature parameter and sends the corresponding telemetry data back wirelessly to the data recording unit. It has been observed that the wireless power delivery system is capable of energizing the automated biomedical implantable electronic sensor placed at a distance of 3 cm from the power transmitter with an energy transfer efficiency of 26% at the operating resonant frequency of 562 kHz. This proposed method ensures real-time monitoring of different human body temperatures around the clock. The monitored temperature data have been compared with a calibrated temperature measurement system to ascertain the accuracy of the proposed system. The investigated technique can also be useful for monitoring other body parameters such as blood pressure, bladder pressure, and physiological signals of the patient in vivo using various implantable sensors.

  7. Wireless energizing system for an automated implantable sensor.

    Science.gov (United States)

    Swain, Biswaranjan; Nayak, Praveen P; Kar, Durga P; Bhuyan, Satyanarayan; Mishra, Laxmi P

    2016-07-01

    The wireless drive of an automated implantable electronic sensor has been explored for health monitoring applications. The proposed system comprises an automated biomedical sensing system which is energized through resonant inductive coupling. The implantable sensor unit is able to monitor the body temperature parameter and sends the corresponding telemetry data back wirelessly to the data recording unit. It has been observed that the wireless power delivery system is capable of energizing the automated biomedical implantable electronic sensor placed at a distance of 3 cm from the power transmitter with an energy transfer efficiency of 26% at the operating resonant frequency of 562 kHz. This proposed method ensures real-time monitoring of different human body temperatures around the clock. The monitored temperature data have been compared with a calibrated temperature measurement system to ascertain the accuracy of the proposed system. The investigated technique can also be useful for monitoring other body parameters such as blood pressure, bladder pressure, and physiological signals of the patient in vivo using various implantable sensors.
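    The operating point of such a resonant inductive link is set by the LC resonance of the coupled coils, f0 = 1/(2π√(LC)). The component values below are hypothetical choices that land near the reported 562 kHz, not the authors' actual coil and capacitor:

```python
import math

def resonant_frequency(L, C):
    """Resonant frequency of an LC link: f0 = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Hypothetical component values near the reported operating point.
L = 100e-6    # 100 uH coil
C = 802e-12   # 802 pF tuning capacitor
f0 = resonant_frequency(L, C)   # close to 562 kHz
```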

  8. Process monitoring for reprocessing plant safeguards: a summary review

    International Nuclear Information System (INIS)

    Kerr, H.T.; Ehinger, M.H.; Wachter, J.W.; Hebble, T.L.

    1986-10-01

    Process monitoring is a term typically associated with a detailed look at plant operating data to determine plant status. Process monitoring has been generally associated with operational control of plant processes. Recently, process monitoring has been given new attention for a possible role in international safeguards. International Safeguards Project Office (ISPO) Task C.59 has the goal to identify specific roles for process monitoring in international safeguards. As the preliminary effort associated with this task, a review of previous efforts in process monitoring for safeguards was conducted. Previous efforts mentioned concepts and a few specific applications. None were comprehensive in addressing all aspects of a process monitoring application for safeguards. This report summarizes the basic elements that must be developed in a comprehensive process monitoring application for safeguards. It then summarizes the significant efforts that have been documented in the literature with respect to the basic elements that were addressed

  9. Development of mathematical models for automation of strength calculation during plastic deformation processing

    Science.gov (United States)

    Steposhina, S. V.; Fedonin, O. N.

    2018-03-01

    Dependencies have been developed that make it possible to automate force calculation during surface plastic deformation (SPD) processing and thus shorten the time required for technological preparation of production.

  10. Application of an automated wireless structural monitoring system for long-span suspension bridges

    International Nuclear Information System (INIS)

    Kurata, M.; Lynch, J. P.; Linden, G. W. van der; Hipley, P.; Sheng, L.-H.

    2011-01-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  11. Application of AN Automated Wireless Structural Monitoring System for Long-Span Suspension Bridges

    Science.gov (United States)

    Kurata, M.; Lynch, J. P.; van der Linden, G. W.; Hipley, P.; Sheng, L.-H.

    2011-06-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  12. Development of Process Automation in the Neutron Activation Analysis Facility in Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Yussup, N.; Azman, A.; Ibrahim, M.M.; Rahman, N.A.A.; Che Sohashaari, S.; Atan, M.N.; Hamzah, M.A.; Mokhtar, M.; Khalid, M.A.; Salim, N.A.A.; Hamzah, M.S.

    2018-01-01

    Neutron Activation Analysis (NAA) has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, from sample registration to analysis, are performed manually. These manual procedures carried out by NAA laboratory personnel are time-consuming and inefficient. Hence, system automation was developed to replace redundant manual data entries and to speed up sample analysis and calculation. This report explains the NAA process at Nuclear Malaysia and describes the automation development in detail, which includes sample registration software; an automatic sample changer system consisting of hardware and software; and sample analysis software. (author)

  13. Using historical wafermap data for automated yield analysis

    International Nuclear Information System (INIS)

    Tobin, K.W.; Karnowski, T.P.; Gleason, S.S.; Jensen, D.; Lakhani, F.

    1999-01-01

    To be productive and profitable in a modern semiconductor fabrication environment, large amounts of manufacturing data must be collected, analyzed, and maintained. This includes data collected from in- and off-line wafer inspection systems and from the process equipment itself. This data is increasingly being used to design new processes, control and maintain tools, and to provide the information needed for rapid yield learning and prediction. Because of increasing device complexity, the amount of data being generated is outstripping the yield engineer's ability to effectively monitor and correct unexpected trends and excursions. The 1997 SIA National Technology Roadmap for Semiconductors highlights a need to address these issues through "automated data reduction algorithms to source defects from multiple data sources and to reduce defect sourcing time." SEMATECH and the Oak Ridge National Laboratory have been developing new strategies and technologies for providing the yield engineer with higher levels of assisted data reduction for the purpose of automated yield analysis. In this article, we will discuss the current state of the art and trends in yield management automation. copyright 1999 American Vacuum Society

  14. Bioreactor process monitoring using an automated microfluidic platform for cell-based assays

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    We report on a novel microfluidic system designed to monitor in real-time the concentration of live and dead cells in industrial cell production. Custom-made stepper motor actuated peristaltic pumps and valves, fluidic interconnections, sample-to-waste liquid management and image cytometry-based ...

  15. Opportunities for Automated Demand Response in California’s Dairy Processing Industry

    Energy Technology Data Exchange (ETDEWEB)

    Homan, Gregory K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    During periods of peak electrical demand on the energy grid or when there is a shortage of supply, the stability of the grid may be compromised or the cost of supplying electricity may rise dramatically, respectively. Demand response programs are designed to mitigate the severity of these problems and improve reliability by reducing the demand on the grid during such critical times. In 2010, the Demand Response Research Center convened a group of industry experts to suggest potential industries that would be good demand response program candidates for further review. The dairy industry was suggested due to the perception that the industry had suitable flexibility and automatic controls in place. The purpose of this report is to provide an initial description of the industry with regard to demand response potential, specifically automated demand response. This report qualitatively describes the potential for participation in demand response and automated demand response by dairy processing facilities in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use. Typical process equipment and controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Two case studies of demand response at dairy facilities in California and across the country are reviewed. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  16. Reference Tools for Data Processing, Office Automation, and Data Communications: An Introductory Guide.

    Science.gov (United States)

    Cupoli, Patricia Dymkar

    1981-01-01

    Provides an introduction to various reference sources which are useful in dealing with the areas of data processing, office automation, and communications technologies. A bibliography with vendor listings is included. (FM)

  17. Automation and efficiency in the operational processes: a case study in a logistics operator

    OpenAIRE

    Nascimento, Dener Gomes do; Silva, Giovanni Henrique da

    2017-01-01

    Globalization has made automation increasingly feasible, and with technological development many operations can be optimized, bringing productivity gains. Logistics is a major beneficiary of all this development, because it operates in an extremely competitive environment in which being efficient is a requirement to stay alive in the market. In this context, this article analyzes the processes in a distribution center to identify opportunities to automate operations to gai...

  18. Distribution automation at BC Hydro : a case study

    Energy Technology Data Exchange (ETDEWEB)

    Siew, C. [BC Hydro, Vancouver, BC (Canada). Smart Grid Development Program

    2009-07-01

    This presentation discussed a distribution automation study conducted by BC Hydro to determine methods of improving grid performance by supporting intelligent transmission and distribution systems. The utility's smart grid program includes a number of utility-side and customer-side applications, including enabled demand response, microgrid, and operational efficiency applications. The smart grid program will improve reliability and power quality by 40 per cent, improve conservation and energy efficiency throughout the province, and provide enhanced customer service. Programs and initiatives currently underway at the utility include distribution management, smart metering, distribution automation, and substation automation programs. The utility's automation functionality will include fault interruption and locating, restoration capability, and restoration success. A decision support system has also been established to assist control room and field operating personnel with monitoring and control of the electric distribution system. Protection, control and monitoring (PCM) and volt VAR optimization upgrades are also planned. Reclosers are also being automated, and an automation guide has been developed for switches. tabs., figs.

  19. Automated data processing and radioassays.

    Science.gov (United States)

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assays, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there is probably steric and cooperative influence on binding. An alternative, more flexible mathematical model, based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen, has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls of point-to-point data reduction, linear transformations, and curvilinear fitting approaches are presented. A third-order polynomial using the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of significant dose response, and also towards saturation. The importance is stressed of limiting the range of reported automated assay results to that portion of the standard curve that delivers optimal sensitivity.
Published methods for automated data reduction of Scatchard plots
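    The third-order polynomial on the square root of concentration recommended above is straightforward to implement. A sketch with hypothetical standard-curve values (the data and the grid-search inversion are illustrative, not the published method):

```python
import numpy as np

# Hypothetical standards: percent bound (B/B0) vs. known concentration
conc = np.array([0.0, 1.0, 2.5, 5.0, 10.0, 25.0, 50.0])        # ng/mL
bound = np.array([100.0, 92.0, 83.0, 71.0, 55.0, 32.0, 18.0])  # % B/B0

# Cubic least-squares fit in the square root of concentration
coef = np.polyfit(np.sqrt(conc), bound, 3)

def predicted_bound(c):
    return np.polyval(coef, np.sqrt(c))

def estimate_concentration(b):
    """Invert the fitted standard curve by grid search over the
    calibrated range (results outside it should not be reported)."""
    grid = np.linspace(conc[0], conc[-1], 10001)
    return grid[np.argmin(np.abs(predicted_bound(grid) - b))]

# An unknown binding at 60% B/B0 maps back to a dose roughly
# between the 5 and 10 ng/mL standards
print(estimate_concentration(60.0))
```

    The grid search sidesteps an analytic inversion of the cubic; limiting reported results to the sensitive portion of the curve, as the abstract advises, amounts to rejecting estimates that fall near either end of the grid.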

  20. Automated Selection Of Pictures In Sequences

    Science.gov (United States)

    Rorvig, Mark E.; Shelton, Robert O.

    1995-01-01

    Method of automated selection of film or video motion-picture frames for storage or examination developed. Beneficial in situations in which quantity of visual information available exceeds amount stored or examined by humans in reasonable amount of time, and/or necessary to reduce large number of motion-picture frames to few conveying significantly different information in manner intermediate between movie and comic book or storyboard. For example, computerized vision system monitoring industrial process programmed to sound alarm when changes in scene exceed normal limits.

  1. Automated Remote Monitoring of Depression: Acceptance Among Low-Income Patients in Diabetes Disease Management.

    Science.gov (United States)

    Ramirez, Magaly; Wu, Shinyi; Jin, Haomiao; Ell, Kathleen; Gross-Schulman, Sandra; Myerchin Sklaroff, Laura; Guterman, Jeffrey

    2016-01-25

    Remote patient monitoring is increasingly integrated into health care delivery to expand access and increase effectiveness. Automation can add efficiency to remote monitoring, but patient acceptance of automated tools is critical for success. From 2010 to 2013, the Diabetes-Depression Care-management Adoption Trial (DCAT)-a quasi-experimental comparative effectiveness research trial aimed at accelerating the adoption of collaborative depression care in a safety-net health care system-tested a fully automated telephonic assessment (ATA) depression monitoring system serving low-income patients with diabetes. The aim of this study was to determine patient acceptance of ATA calls over time, and to identify factors predicting long-term patient acceptance of ATA calls. We conducted two analyses using data from the DCAT technology-facilitated care arm, in which for 12 months the ATA system periodically assessed depression symptoms, monitored treatment adherence, prompted self-care behaviors, and inquired about patients' needs for provider contact. Patients received assessments at 6, 12, and 18 months using Likert-scale measures of willingness to use ATA calls, preferred mode of reach, perceived ease of use, usefulness, nonintrusiveness, privacy/security, and long-term usefulness. For the first analysis (patient acceptance over time), we computed descriptive statistics of these measures. In the second analysis (predictive factors), we collapsed patients into two groups: those reporting "high" versus "low" willingness to use ATA calls. To compare them, we used independent t tests for continuous variables and Pearson chi-square tests for categorical variables. Next, we jointly entered independent factors found to be significantly associated with 18-month willingness to use ATA calls at the univariate level into a logistic regression model with backward selection to identify predictive factors. We performed a final logistic regression model with the identified significant

  2. Adaptive Soa Stack-Based Business Process Monitoring Platform

    Directory of Open Access Journals (Sweden)

    Przemysław Dadel

    2014-01-01

    Full Text Available Executable business processes that formally describe company activities are well placed in the SOA environment, as they allow for declarative organization of high-level system logic. However, for both technical and non-technical users to fully benefit from that element of abstraction, appropriate business process monitoring systems are required, and existing solutions remain unsatisfactory. The paper discusses the problem of business process monitoring in the context of the service orientation paradigm in order to propose an architectural solution and provide an implementation of a system for business process monitoring that alleviates the shortcomings of the existing solutions. Various platforms are investigated to obtain a broader view of the monitoring problem and to gather functional and non-functional requirements. These requirements constitute input for the further analysis and the system design. The monitoring software is then implemented and evaluated according to the specified criteria. An extensible business process monitoring system was designed and built on top of OSGiMM, a dynamic, event-driven, configurable communications layer that provides real-time monitoring capabilities for various types of resources. The system was tested against the stated functional requirements, and its implementation provides a starting point for further work. It is concluded that providing a uniform business process monitoring solution that satisfies a wide range of users and business process platform vendors is a difficult endeavor. It is furthermore reasoned that only an extensible, open-source monitoring platform built on top of a scalable communication core has a chance to address all the stated and future requirements.

  3. Driver Vigilance in Automated Vehicles: Hazard Detection Failures Are a Matter of Time.

    Science.gov (United States)

    Greenlee, Eric T; DeLucia, Patricia R; Newton, David C

    2018-03-01

    The primary aim of the current study was to determine whether monitoring the roadway for hazards during automated driving results in a vigilance decrement. Although automated vehicles are relatively novel, the nature of human-automation interaction within them has the classic hallmarks of a vigilance task. Drivers must maintain attention for prolonged periods of time to detect and respond to rare and unpredictable events, for example, roadway hazards that automation may be ill-equipped to detect. Given the similarity with traditional vigilance tasks, we predicted that drivers of a simulated automated vehicle would demonstrate a vigilance decrement in hazard detection performance. Participants "drove" a simulated automated vehicle for 40 minutes. During that time, their task was to monitor the roadway for hazards. As predicted, hazard detection rate declined precipitously, and reaction times slowed as the drive progressed. Further, subjective ratings of workload and task-related stress indicated that sustained monitoring is demanding and distressing and that it is a challenge to maintain task engagement. Monitoring the roadway for potential hazards during automated driving results in workload, stress, and performance decrements similar to those observed in traditional vigilance tasks. To the degree that vigilance is required of automated vehicle drivers, performance errors and associated safety risks are likely to occur as a function of time on task. Vigilance should be a focal safety concern in the development of vehicle automation.

  4. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    Science.gov (United States)

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated, pre- and post-processing of resting state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2 (20-49), 82% female) had three consecutive 256-channel resting-state EEG at one year intervals. Results of frequency analysis of AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). Mean correlation coefficient between AA and VA was 0.94±0.07, mean ICC for AA 0.83±0.05 and for VA 0.84±0.07. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
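    The test-retest reliability measure used above, the intraclass correlation coefficient, can be computed directly from a subjects-by-sessions matrix. A sketch of ICC(3,1) (two-way mixed model, consistency, single measures) on synthetic data; the 34x3 layout mirrors the study design, but the variance figures are assumptions:

```python
import numpy as np

def icc_consistency(data):
    """ICC(3,1): two-way mixed model, consistency, single measures.
    data: subjects x sessions matrix of a spectral EEG measure."""
    n, k = data.shape
    grand = data.mean()
    ss_rows = k * np.sum((data.mean(axis=1) - grand) ** 2)
    ss_cols = n * np.sum((data.mean(axis=0) - grand) ** 2)
    ss_err = np.sum((data - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Synthetic alpha-power values: 34 subjects, 3 yearly recordings
rng = np.random.default_rng(1)
trait = rng.normal(10.0, 2.0, size=(34, 1))        # stable per subject
data = trait + rng.normal(0.0, 0.8, size=(34, 3))  # session noise
print(round(icc_consistency(data), 2))
```

    With these assumed variances the true ICC is about 4/(4 + 0.64) ≈ 0.86, the same order as the 0.83-0.84 reliabilities reported above.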

  5. Monitoring and controlling the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Ahring, B K; Angelidaki, I [The Technical Univ. of Denmark, Dept. of Environmental Science and Engineering, Lyngby (Denmark)

    1997-08-01

    Many modern large-scale biogas plants have been constructed recently, increasing the demand for proper monitoring and control of these large reactor systems. For monitoring the biogas process, an easy-to-measure and reliable indicator is required which reflects the metabolic state and the activity of the bacterial populations in the reactor. In this paper, we discuss existing indicators as well as indicators under development which can potentially be used to monitor the state of the biogas process in a reactor. Furthermore, data are presented from two large-scale thermophilic biogas plants that were subjected to temperature changes and where the concentration of volatile fatty acids was monitored. The results clearly demonstrated that significant changes in the concentration of the individual VFAs occurred although the biogas production was not significantly changed. Especially the concentrations of butyrate, isobutyrate and isovalerate showed significant changes. Future improvements of process control could therefore be based on monitoring of the concentration of specific VFAs together with information about the bacterial populations in the reactor. The latter information could be supplied by the use of modern molecular techniques. (au) 51 refs.

  6. Robotic platform for parallelized cultivation and monitoring of microbial growth parameters in microwell plates.

    Science.gov (United States)

    Knepper, Andreas; Heiser, Michael; Glauche, Florian; Neubauer, Peter

    2014-12-01

    The enormous range of possible variations in bioprocesses challenges process development to fix a commercial process with respect to costs and time. Although some cultivation systems and some devices for unit operations combine the latest technology in miniaturization, parallelization, and sensing, the degree of automation in upstream and downstream bioprocess development is still limited to single steps. We aim to face this challenge with an interdisciplinary approach to significantly shorten development times and costs. As a first step, we scaled down analytical assays to the microliter scale and created automated procedures for starting the cultivation and monitoring the optical density (OD), pH, concentrations of glucose and acetate in the culture medium, and product formation in fed-batch cultures in the 96-well format. Then, the separate measurements of pH, OD, and concentrations of acetate and glucose were combined into one method. This method enables automated process monitoring at dedicated intervals (e.g., also during the night). By this approach, we managed to increase the information content of cultivations in 96-microwell plates, thus turning them into a suitable tool for high-throughput bioprocess development. Here, we present the flowcharts as well as cultivation data of our automation approach. © 2014 Society for Laboratory Automation and Screening.

  7. Equipment monitoring and diagnosis of their mechanical condition

    International Nuclear Information System (INIS)

    Morel, J.; Monnier, B.

    1994-01-01

    The main objectives of reactor monitoring are reviewed and the three monitoring types are described: reception controls and alarms, periodical controls, and specific monitoring. The monitoring process is then presented: information acquisition, dysfunction detection and diagnosis, risk analysis and maintenance action determination. Diagnosis assistance is now automated with expert systems such as DIVA (shaft vibration diagnosis) and DIAPO (primary pump diagnosis). Several application examples at EDF are described: SEXTEN reactor containment tightness monitoring, large- and small-size turbo-machine monitoring, reactor inner structure monitoring, and loose part detection in the primary circuit. All this information will be centralized in a general monitoring and diagnosis assistance station. 3 fig

  8. Shared Representations and the Translation Process

    DEFF Research Database (Denmark)

    Schaeffer, Moritz; Carl, Michael

    2015-01-01

    The purpose of the present chapter is to investigate automated processing during translation. We provide evidence from a translation priming study which suggests that translation involves activation of shared lexico-semantic and syntactical representations, i.e., the activation of features of both source and target language items which share one single cognitive representation. We argue that activation of shared representations facilitates automated processing. The chapter revises the literal translation hypothesis and the monitor model (Ivir 1981; Toury 1995; Tirkkonen-Condit 2005), and re...

  9. Shared Representations and the Translation Process

    DEFF Research Database (Denmark)

    Schaeffer, Moritz; Carl, Michael

    2013-01-01

    The purpose of the present paper is to investigate automated processing during translation. We provide evidence from a translation priming study which suggests that translation involves activation of shared lexico-semantic and syntactical representations, i.e., the activation of features of both source and target language items which share one single cognitive representation. We argue that activation of shared representations facilitates automated processing. The paper revises the literal translation hypothesis and the monitor model (Ivir 1981; Toury 1995; Tirkkonen-Condit 2005), and re...

  10. Analysis of the Optimal Duration of Behavioral Observations Based on an Automated Continuous Monitoring System in Tree Swallows (Tachycineta bicolor): Is One Hour Good Enough?

    Directory of Open Access Journals (Sweden)

    Ádám Z Lendvai

    Full Text Available Studies of animal behavior often rely on human observation, which introduces a number of limitations on sampling. Recent developments in automated logging of behaviors make it possible to circumvent some of these problems. Once verified for efficacy and accuracy, these automated systems can be used to determine optimal sampling regimes for behavioral studies. Here, we used a radio-frequency identification (RFID) system to quantify parental effort in a bi-parental songbird species: the tree swallow (Tachycineta bicolor). We found that the accuracy of the RFID monitoring system was similar to that of video-recorded behavioral observations for quantifying parental visits. Using RFID monitoring, we also quantified the optimum duration of sampling periods for male and female parental effort by looking at the relationship between nest visit rates estimated from sampling periods with different durations and the total visit numbers for the day. The optimum sampling duration (the shortest observation time that explained the most variation in total daily visits per unit time) was 1 h for both sexes. These results show that RFID and other automated technologies can be used to quantify behavior when human observation is constrained, and the information from these monitoring technologies can be useful for evaluating the efficacy of human observation methods.
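    The optimum-duration logic above (how much variance in daily visit totals a short observation window explains) can be illustrated with synthetic visit data; the nest count, day length, and Poisson rates below are assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(2)
n_nests, day_hours = 40, 14
# Each nest has its own true hourly visit rate; counts are Poisson
true_rates = rng.uniform(5.0, 25.0, n_nests)
visits = rng.poisson(true_rates[:, None], size=(n_nests, day_hours))
daily_total = visits.sum(axis=1)

def r_squared(x, y):
    r = np.corrcoef(x, y)[0, 1]
    return r * r

# Variance in daily totals explained by the first d hours of watching
for d in (1, 2, 4):
    window = visits[:, :d].sum(axis=1)
    print(d, round(r_squared(window, daily_total), 2))
```

    R² rises with window length, but R² per unit observation time falls, which is the trade-off behind choosing the shortest window that still explains most of the variation.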

  11. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    Sawant, Pramilla D.; Prabhu, Supreetha P.; Suja, A.; Wankhede, Sonal; Chaudhary, Seema; Rao, D.D.; Pradeepkumar, K.S.; Das, A.P.; Badodkar, B.D.

    2014-01-01

    The procedure for initial handling of fecal samples at the Bioassay Lab., Trombay is as follows: an overnight fecal sample is collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receiving the sample, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is then lifted out of the container using a pair of tongs, placed inside a crucible, and ashed inside a muffle furnace at 450℃. After complete ashing, the crucible containing white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system for initial handling of fecal samples, intended to automate the above procedure. The system, once developed, will help eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples.

  12. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS

    Science.gov (United States)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew

    2011-01-01

    substantially by combining it with other technologies for automated, miniaturized, high-throughput biological measurements, such as fast sequencing, protein identification (proteomics) and metabolite profiling (metabolomics). Thus, the system can be integrated with other biomedical instruments in order to support and enhance telemedicine capability onboard ISS. NASA's mission includes sustained investment in critical research leading to effective countermeasures to minimize the risks associated with human spaceflight, and the use of appropriate technology to sustain space exploration at reasonable cost. Our integrated microarray technology is expected to fulfill these two critical requirements and to enable the scientific community to better understand and monitor the effects of the space environment on microorganisms and on the astronaut, in the process leveraging current capabilities and overcoming present limitations.

  13. Integrating Standard Operating Procedures with Spacecraft Automation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Spacecraft automation can be used to greatly reduce the demands on crew member and flight controllers time and attention. Automation can monitor critical resources,...

  14. Work Planning Automation at Mechanical Subdivision

    OpenAIRE

    Dzindzelėta, Vytautas

    2005-01-01

    Work planning automation, installation possibilities and future outlook at the mechanical subdivision. The aim is to study how work planning has changed before and after the automation process and to analyse the automation process methodology.

  15. Stromal vascular fraction isolated from lipo-aspirates using an automated processing system: bench and bed analysis.

    Science.gov (United States)

    Doi, Kentaro; Tanaka, Shinsuke; Iida, Hideo; Eto, Hitomi; Kato, Harunosuke; Aoi, Noriyuki; Kuno, Shinichiro; Hirohi, Toshitsugu; Yoshimura, Kotaro

    2013-11-01

    The heterogeneous stromal vascular fraction (SVF), containing adipose-derived stem/progenitor cells (ASCs), can be easily isolated through enzymatic digestion of aspirated adipose tissue. In clinical settings, however, strict control of technical procedures according to standard operating procedures and validation of cell-processing conditions are required. Therefore, we evaluated the efficiency and reliability of an automated system for SVF isolation from adipose tissue. SVF cells, freshly isolated using the automated procedure, showed comparable number and viability to those from manual isolation. Flow cytometric analysis confirmed an SVF cell composition profile similar to that after manual isolation. In addition, the ASC yield after 1 week in culture was also not significantly different between the two groups. Our clinical study, in which SVF cells isolated with the automated system were transplanted with aspirated fat tissue for soft tissue augmentation/reconstruction in 42 patients, showed satisfactory outcomes with no serious side-effects. Taken together, our results suggested that the automated isolation system is as reliable a method as manual isolation and may also be useful in clinical settings. Automated isolation is expected to enable cell-based clinical trials in small facilities with an aseptic room, without the necessity of a good manufacturing practice-level cell processing area. Copyright © 2012 John Wiley & Sons, Ltd.

  16. A data-driven multiplicative fault diagnosis approach for automation processes.

    Science.gov (United States)

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of multiplicative fault are extracted. To identify the root cause, the impact of fault on each process variable is evaluated in the sense of contribution to performance degradation. Then, a numerical example is used to illustrate the functionalities of the method and Monte-Carlo simulation is performed to demonstrate the effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented. Copyright © 2013. Published by Elsevier Ltd.
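    The distinguishing feature of a multiplicative fault, an increased noise gain on some low-level component, can be illustrated with a crude variance-ratio index; this sketch is illustrative only and is not the contribution analysis developed in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
# Nominal operation: three cross-correlated process variables
mix = np.array([[1.0, 0.4, 0.2],
                [0.0, 1.0, 0.3],
                [0.0, 0.0, 1.0]])
nominal = rng.standard_normal((n, 3)) @ mix

# Multiplicative fault: the gain on variable 1 is tripled, which
# raises its variance without shifting its mean
faulty = nominal.copy()
faulty[:, 1] *= 3.0

# Per-variable variance ratio (faulty vs. nominal) as a rough
# pointer to the component responsible for the degradation
ratio = faulty.var(axis=0) / nominal.var(axis=0)
print(np.argmax(ratio))  # prints 1: variable 1 is flagged
```

    An additive fault would shift the mean instead, which is why residuals built for mean shifts can miss this class of degradation.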

  17. The automated testing system of programs with the graphic user interface within the context of educational process

    OpenAIRE

    Sychev, O.; Kiryushkin, A.

    2009-01-01

    The paper describes the problems of automating the educational process in the course "Programming in a high-level language. Algorithmic languages". The complexities of testing programs with a user interface are noted, and existing analogues are reviewed. Methods for automating the testing of students' work are proposed.

  18. Automation of ETE-CC 2/3 (Effluent Treatment Station); Automacao da ETE-CC 2/3 (Estacao de Tratamento de Efluentes)

    Energy Technology Data Exchange (ETDEWEB)

    Sinzato, Frederico Takashi Di Tanno; Esteves, Joao Paulo Leite; Souza, Rafael Soares de; Gomes, Lucio Nascimento; Santos, Leonardo Paiva [Companhia Siderurgica Nacional (CSN), Volta Redonda, RJ (Brazil)

    2009-11-01

    This technical contribution presents the results of implementing a complete automation system for ETE-CC 2/3 (the Effluent Treatment Station of Continuous Casting Machines 2 and 3 at CSN), improving the reliability and operation of the plant. The implemented system has the following features: remote operation and monitoring of all station equipment; redundancy of operator stations, PLCs, communication networks and UPSs; the possibility of local equipment control without the automation system; a wireless control and monitoring system for the filters; and a recording system for all process variables. (author)

  19. Processing Approaches for DAS-Enabled Continuous Seismic Monitoring

    Science.gov (United States)

    Dou, S.; Wood, T.; Freifeld, B. M.; Robertson, M.; McDonald, S.; Pevzner, R.; Lindsey, N.; Gelvin, A.; Saari, S.; Morales, A.; Ekblaw, I.; Wagner, A. M.; Ulrich, C.; Daley, T. M.; Ajo Franklin, J. B.

    2017-12-01

    Distributed Acoustic Sensing (DAS) is creating a "field as laboratory" capability for seismic monitoring of subsurface changes. By providing unprecedented spatial and temporal sampling at a relatively low cost, DAS enables field-scale seismic monitoring to have durations and temporal resolutions that are comparable to those of laboratory experiments. Here we report on seismic processing approaches developed during data analyses of three case studies all using DAS-enabled seismic monitoring with applications ranging from shallow permafrost to deep reservoirs: (1) 10-hour downhole monitoring of cement curing at Otway, Australia; (2) 2-month surface monitoring of controlled permafrost thaw at Fairbanks, Alaska; (3) multi-month downhole and surface monitoring of carbon sequestration at Decatur, Illinois. We emphasize the data management and processing components relevant to DAS-based seismic monitoring, which include scalable approaches to data management, pre-processing, denoising, filtering, and wavefield decomposition. DAS has dramatically increased the data volume to the extent that terabyte-per-day data loads are now typical, straining conventional approaches to data storage and processing. To achieve more efficient use of disk space and network bandwidth, we explore improved file structures and data compression schemes. Because the noise floor of DAS measurements is higher than that of conventional sensors, optimal processing workflows involving advanced denoising, deconvolution (of the source signatures), and stacking are being established to maximize the signal content of DAS data. The resulting workflow of data management and processing could accelerate the broader adoption of DAS for continuous monitoring of critical processes.
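
    One of the stacking approaches mentioned above can be sketched in a few lines: averaging N repeated recordings of the same signal suppresses incoherent noise by roughly the square root of N. The synthetic trace, shot count and noise level below are illustrative assumptions, not DAS field parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 25 * t)          # idealized repeatable arrival

# 64 repeated source shots recorded on one noisy channel
shots = signal + rng.standard_normal((64, t.size)) * 2.0
stacked = shots.mean(axis=0)                 # stacking: noise drops ~1/sqrt(64)

noise_single = np.std(shots[0] - signal)     # residual noise, single shot
noise_stacked = np.std(stacked - signal)     # residual noise after stacking
print(round(noise_single / noise_stacked, 1))
```

With 64 shots the noise reduction factor should land near 8, consistent with the square-root law.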

  20. An automated dose tracking system for adaptive radiation therapy.

    Science.gov (United States)

    Liu, Chang; Kim, Jinkoo; Kumarasiri, Akila; Mayyas, Essa; Brown, Stephen L; Wen, Ning; Siddiqui, Farzan; Chetty, Indrin J

    2018-02-01

    The implementation of adaptive radiation therapy (ART) into routine clinical practice is technically challenging and requires significant resources to perform and validate each process step. The objective of this report is to identify the key components of ART, to illustrate how a specific automated procedure improves efficiency, and to facilitate the routine clinical application of ART. Patient image data were exported from a clinical database and converted to an intermediate format for point-wise dose tracking and accumulation. The process was automated using in-house developed software containing three modularized components: an ART engine, user interactive tools, and integration tools. The ART engine conducts computing tasks using the following modules: data importing, image pre-processing, dose mapping, dose accumulation, and reporting. In addition, custom graphical user interfaces (GUIs) were developed to allow user interaction with select processes such as deformable image registration (DIR). A commercial scripting application programming interface was used to incorporate automated dose calculation for application in routine treatment planning. Each module was considered an independent program, written in C++ or C#, running in a distributed Windows environment, scheduled and monitored by integration tools. The automated tracking system was retrospectively evaluated for 20 patients with prostate cancer and 96 patients with head and neck cancer, under institutional review board (IRB) approval. In addition, the system was evaluated prospectively using 4 patients with head and neck cancer. Altogether 780 prostate dose fractions and 2586 head and neck cancer dose fractions were processed, including DIR and dose mapping. On average, daily cumulative dose was computed in 3 h and the manual work was limited to 13 min per case with approximately 10% of cases requiring an additional 10 min for image registration refinement. An efficient and convenient
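
    The dose-mapping and dose-accumulation modules can be caricatured as follows. In the real system each daily dose grid is warped onto the reference anatomy by deformable image registration (DIR); here that step is a hypothetical stand-in (`map_to_reference`), and the grid size and fraction scheme are invented for illustration.

```python
import numpy as np

def map_to_reference(daily_dose, deformation):
    # Stand-in for DIR-based dose mapping: apply an integer voxel shift
    # instead of a true deformable warp
    return np.roll(daily_dose, shift=deformation, axis=0)

# A conventional course: 35 fractions of 2 Gy on a tiny 4x4 dose grid
fractions = [np.full((4, 4), 2.0) for _ in range(35)]

cumulative = np.zeros((4, 4))
for dose in fractions:
    cumulative += map_to_reference(dose, deformation=0)  # point-wise accumulation
print(cumulative[0, 0])
```

The point-wise sum over mapped fractions is exactly the "dose accumulation" module's job; in practice each `map_to_reference` call is the expensive DIR step that the paper automates.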

  1. Selecting automation for the clinical chemistry laboratory.

    Science.gov (United States)

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  2. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Bonnie [Adventium Labs; Boddy, Mark [Adventium Labs; Doyle, Frank [Univ. of California, Santa Barbara, CA (United States); Jamshidi, Mo [Univ. of New Mexico, Albuquerque, NM (United States); Ogunnaike, Tunde [Univ. of Delaware, Newark, DE (United States)

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  3. Automated Plasma Spray (APS) process feasibility study: Plasma spray process development and evaluation

    Science.gov (United States)

    Fetheroff, C. W.; Derkacs, T.; Matay, I. M.

    1979-01-01

    An automated plasma spray (APS) process was developed to apply two-layer (NiCrAlY and ZrO2-12Y2O3) thermal-barrier coatings to aircraft gas turbine engine blade airfoils. The APS process hardware consists of four subsystems: a mechanical blade positioner incorporating two interlaced six-degree-of-freedom assemblies; a noncoherent optical metrology subsystem; a microprocessor-based adaptive system controller; and commercial plasma spray equipment. Over fifty JT9D first-stage turbine blade specimens were coated with the APS process in preliminary checkout and evaluation studies. The best of the preliminary specimens achieved an overall coating thickness uniformity of ±53 micrometers, much better than is achievable manually. Factors limiting this performance were identified and process modifications were initiated accordingly. Comparative evaluations of coating thickness uniformity for manually sprayed and APS coated specimens were initiated. One of the preliminary evaluation specimens was subjected to a torch test and metallographic evaluation.

  4. Text mining from ontology learning to automated text processing applications

    CERN Document Server

    Biemann, Chris

    2014-01-01

    This book comprises a set of articles that specify the methodology of text mining, describe the creation of lexical resources in the framework of text mining and use text mining for various tasks in natural language processing (NLP). The analysis of large amounts of textual data is a prerequisite to build lexical resources such as dictionaries and ontologies and also has direct applications in automated text processing in fields such as history, healthcare and mobile applications, just to name a few. This volume gives an update in terms of the recent gains in text mining methods and reflects

  5. Monitoring endemic livestock diseases using laboratory diagnostic data: A simulation study to evaluate the performance of univariate process monitoring control algorithms.

    Science.gov (United States)

    Lopes Antunes, Ana Carolina; Dórea, Fernanda; Halasa, Tariq; Toft, Nils

    2016-05-01

    Surveillance systems are critical for accurate, timely monitoring and effective disease control. In this study, we investigated the performance of univariate process monitoring control algorithms in detecting changes in seroprevalence for endemic diseases. We also assessed the effect of sample size (number of sentinel herds tested in the surveillance system) on the performance of the algorithms. Three univariate process monitoring control algorithms were compared: the Shewhart p chart (PSHEW), the cumulative sum chart (CUSUM) and the exponentially weighted moving average chart (EWMA). Increases in seroprevalence were simulated from 0.10 to 0.15 and 0.20 over 4, 8, 24, 52 and 104 weeks. Each epidemic scenario was run with 2000 iterations. The cumulative sensitivity (CumSe) and timeliness were used to evaluate the algorithms' performance with a 1% false alarm rate. Using these performance evaluation criteria, it was possible to assess the accuracy and timeliness of the surveillance system working in real-time. The results showed that EWMA and PSHEW had higher CumSe (when compared with the CUSUM) from week 1 until the end of the period for all simulated scenarios. Changes in seroprevalence from 0.10 to 0.20 were more easily detected (higher CumSe) than changes from 0.10 to 0.15 for all three algorithms. Similar results were found with EWMA and PSHEW, based on the median time to detection. Changes in the seroprevalence were detected later with CUSUM, compared to EWMA and PSHEW, for the different scenarios. Increasing the sample size 10-fold halved the time to detection (CumSe=1), whereas increasing the sample size 100-fold reduced the time to detection by a factor of 6. This study investigated the performance of three univariate process monitoring control algorithms in monitoring endemic diseases. It was shown that automated systems based on these detection methods identified changes in seroprevalence at different times. Increasing the number of tested herds would lead to faster
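
    Of the three algorithms compared, the EWMA chart is the simplest to sketch. The following example uses invented weekly prevalence values and standard textbook control limits, not the study's simulation setup; it shows the statistic crossing its upper control limit shortly after the seroprevalence steps from 0.10 to 0.20.

```python
import math

p0, n = 0.10, 100      # baseline seroprevalence, herds tested per week
lam, L = 0.2, 3.0      # EWMA smoothing weight and control-limit width

# Steady-state upper control limit for an EWMA of binomial proportions
ucl = p0 + L * math.sqrt(lam / (2 - lam) * p0 * (1 - p0) / n)

# 8 baseline weeks around 0.10, then a step change to 0.20
weeks = [0.10, 0.09, 0.11, 0.10, 0.10, 0.09, 0.11, 0.10] + [0.20] * 8

z, alarm_week = p0, None
for t, p in enumerate(weeks):
    z = lam * p + (1 - lam) * z          # exponentially weighted moving average
    if alarm_week is None and z > ucl:
        alarm_week = t                   # first week the chart signals
print("first alarm at week", alarm_week)
```

With these parameters the limit works out to 0.13, and the chart signals on the second week after the shift, illustrating the timeliness metric used in the study.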

  6. [Automation and organization of technological process of urinalysis].

    Science.gov (United States)

    Kolenkin, S M; Kishkun, A A; Kol'chenko, O L

    2000-12-01

    The introduction into practice of a working model of an industrial technology for laboratory studies, using KONE Specific Supra and Miditron M analyzers, is illustrated by the example of clinical urinalysis. This technology helps standardize all stages and operations, improves the efficiency of quality control of laboratory studies, rationally organizes the work at all stages of the process, and creates a system for continuous improvement of the efficiency of investigations at the preanalytical, analytical, and postanalytical stages of the laboratory process. As a result of introducing this technology into laboratory practice, violations of the quality criteria of clinical urinalysis decreased from 15 to 8% at the preanalytical stage and from 6 to 3% at the analytical stage. Automation of the analysis reduced reagent consumption 3-fold and improved productivity at the analytical stage 4-fold.

  7. 3D printed fluidics with embedded analytic functionality for automated reaction optimisation

    OpenAIRE

    Andrew J. Capel; Andrew Wright; Matthew J. Harding; George W. Weaver; Yuqi Li; Russell A. Harris; Steve Edmondson; Ruth D. Goodridge; Steven D. R. Christie

    2017-01-01

    Additive manufacturing or ‘3D printing’ is being developed as a novel manufacturing process for the production of bespoke micro and milli-scale fluidic devices. When coupled with online monitoring and optimisation software, this offers an advanced, customised method for performing automated chemical synthesis. This paper reports the use of two additive manufacturing processes, stereolithography and selective laser melting, to create multi-functional fluidic devices with embedded reaction moni...

  8. Development of automated welding process for field fabrication of thick walled pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, U A

    1981-01-01

    Research on automatic welding processes for the fabrication of thick-walled pressure vessels continued. A literature review on the subject was completed. A laboratory study of criteria for judging acceptable root parameters continued. Equipment for a demonstration facility to test the components and processes of the automated welding system has been specified and is being obtained. (LCL)

  9. Development of automated welding process for field fabrication of thick walled pressure vessels

    International Nuclear Information System (INIS)

    Schneider, U.A.

    Research on automatic welding processes for the fabrication of thick-walled pressure vessels continued. A literature review on the subject was completed. A laboratory study of criteria for judging acceptable root parameters continued. Equipment for a demonstration facility to test the components and processes of the automated welding system has been specified and is being obtained

  10. Automated control of the laser welding process of heart valve scaffolds

    Directory of Open Access Journals (Sweden)

    Weber Moritz

    2016-09-01

    Using the electrospinning process, the geometry of a heart valve cannot be replicated in a single manufacturing step: the heart valve leaflets and the vessel have to be produced in separate spinning processes and mated afterwards to obtain the final heart valve. In this work an existing three-axis laser system was enhanced to laser-weld these scaffolds. The automation control software is based on the Robot Operating System (ROS), and the mechatronic control is performed by an Arduino Mega. A graphical user interface (GUI) is written with Python and Kivy.

  11. Context-awareness in task automation services by distributed event processing

    OpenAIRE

    Coronado Barrios, Miguel; Bruns, Ralf; Dunkel, Jürgen; Stipković, Sebastian

    2014-01-01

    Everybody has to coordinate several tasks every day, usually in a manual manner. Recently, the concept of Task Automation Services has been introduced to automate and personalize task coordination. Several user-centered platforms and applications have arisen in recent years that let their users configure their own automations based on third-party services. In this paper, we propose a new system architecture for Task Automation Services in a heterogeneous mobile, smart devic...

  12. Automation bias: a systematic review of frequency, effect mediators, and mitigators.

    Science.gov (United States)

    Goddard, Kate; Roudsari, Abdul; Wyatt, Jeremy C

    2012-01-01

    Automation bias (AB)--the tendency to over-rely on automation--has been studied in various academic fields. Clinical decision support systems (CDSS) aim to benefit the clinical decision-making process. Although most research shows overall improved performance with use, there is often a failure to recognize the new errors that CDSS can introduce. With a focus on healthcare, a systematic review of the literature from a variety of research fields has been carried out, assessing the frequency and severity of AB, the effect mediators, and interventions potentially mitigating this effect. This is discussed alongside automation-induced complacency, or insufficient monitoring of automation output. A mix of subject-specific and free-text terms around the themes of automation, human-automation interaction, and task performance and error were used to search article databases. Of 13 821 retrieved papers, 74 met the inclusion criteria. User factors such as cognitive style, decision support systems (DSS), and task-specific experience mediated AB, as did attitudinal driving factors such as trust and confidence. Environmental mediators included workload, task complexity, and time constraint, which strained cognitive resources. Mitigators of AB included implementation factors such as training and emphasizing user accountability, and DSS design factors such as the position of advice on the screen, updated confidence levels attached to DSS output, and the provision of information versus recommendation. By uncovering the mechanisms by which AB operates, this review aims to help optimize the clinical decision-making process for CDSS developers and healthcare practitioners.

  13. Manned spacecraft automation and robotics

    Science.gov (United States)

    Erickson, Jon D.

    1987-01-01

    The Space Station holds promise of being a showcase user and driver of advanced automation and robotics technology. The author addresses the advances in automation and robotics from the Space Shuttle - with its high-reliability redundancy management and fault tolerance design and its remote manipulator system - to the projected knowledge-based systems for monitoring, control, fault diagnosis, planning, and scheduling, and the telerobotic systems of the future Space Station.

  14. Managing laboratory automation.

    Science.gov (United States)

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  15. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour-intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6 month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and

  16. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.

    Science.gov (United States)

    Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar

    2016-06-28

    eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.

  17. Automation's influence on nuclear power plants: a look at three accidents and how automation played a role.

    Science.gov (United States)

    Schmitt, Kara

    2012-01-01

    Nuclear power is one of the ways that we can design an efficient sustainable future. Automation is the primary system used to assist operators in the task of monitoring and controlling nuclear power plants (NPP). Automation performs tasks such as assessing the status of the plant's operations as well as making real time life critical situational specific decisions. While the advantages and disadvantages of automation are well studied in variety of domains, accidents remind us that there is still vulnerability to unknown variables. This paper will look at the effects of automation within three NPP accidents and incidents and will consider why automation failed in preventing these accidents from occurring. It will also review the accidents at the Three Mile Island, Chernobyl, and Fukushima Daiichi NPP's in order to determine where better use of automation could have resulted in a more desirable outcome.

  18. Modern Solutions for Automation of Electrical Traction Power Supply Systems

    Directory of Open Access Journals (Sweden)

    Ana Mihaela Andreica

    2011-09-01

    This paper presents modern solutions for the automation of the electrical traction power supply systems used in urban public transport (trams, trolleybuses and subway trains). The monitoring and control of this process uses SCADA distributed architectures grouped around a central point (the dispatcher), which controls all field sensors, transmitters and actuators using programmable logic controllers. The applications presented refer to the Bucharest electrical transport infrastructure.

  19. A scale-up field experiment for the monitoring of a burning process using chemical, audio, and video sensors.

    Science.gov (United States)

    Stavrakakis, P; Agapiou, A; Mikedi, K; Karma, S; Statheropoulos, M; Pallis, G C; Pappa, A

    2014-01-01

    Fires are becoming more violent and frequent, resulting in major economic losses and long-lasting effects on communities and ecosystems; thus, efficient fire monitoring is becoming a necessity. A novel triple multi-sensor approach was developed for monitoring and studying the burning of dry forest fuel in an open-field scheduled experiment; chemical, optical, and acoustical sensors were combined to record the fire spread. The results of this integrated field campaign for real-time monitoring of the fire event are presented and discussed. Chemical analysis, despite its limitations, corresponded to the burning process with a minor time delay. Nevertheless, the evolution profiles of CO2, CO, NO, and O2 were detected and monitored. The chemical monitoring of smoke components enabled observing the different fire phases (flaming, smoldering) based on the emissions identified in each phase. The analysis of fire acoustical signals presented an accurate and timely response to the fire event. In the same context, the use of a thermographic camera for monitoring the biomass burning also proved valuable (both the average gray-level and the red-component intensity profiles exceeded 230) and showed promise similar to the audio results. Further work is needed towards integrating sensor signals for automation purposes, leading to potential applications in real situations.
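
    The video criterion reported above (average gray and red intensities above 230) can be turned into a toy frame classifier. The frame dimensions, the gray-level definition and the function name below are illustrative assumptions, not the study's implementation.

```python
import numpy as np

FLAME_THRESHOLD = 230  # intensity level reported in the study

def frame_flags_fire(rgb):
    """Flag a frame when both the average gray level and the average
    red component exceed the threshold (illustrative criterion)."""
    gray = rgb.mean(axis=2)  # simple gray level: mean over the RGB channels
    return gray.mean() > FLAME_THRESHOLD and rgb[..., 0].mean() > FLAME_THRESHOLD

hot = np.full((120, 160, 3), 250, dtype=np.uint8)   # bright, saturated frame
cool = np.full((120, 160, 3), 40, dtype=np.uint8)   # dark background frame
print(frame_flags_fire(hot), frame_flags_fire(cool))
```

A real pipeline would add temporal smoothing and region analysis, but the per-frame threshold captures the criterion the abstract describes.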

  20. Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS

    Science.gov (United States)

    Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    We present views and analysis of the execution of several PVM (Parallel Virtual Machine) codes for Computational Fluid Dynamics on a network of Sparcstations, including: (1) NAS Parallel Benchmarks CG and MG; (2) a multi-partitioning algorithm for NAS Parallel Benchmark SP; and (3) an overset grid flowsolver. These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We will describe the architecture, operation and application of AIMS. The AIMS toolkit contains: (1) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (2) Monitor, a library of runtime trace-collection routines; (3) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (4) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran 77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (1) the impact of long message latencies; (2) the impact of multiprogramming overheads and associated load imbalance; (3) cache and virtual-memory effects; and (4) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs.
Some of the features planned for the near future include: (1) ConfigView, showing the physical topology
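
    The skew-compensation idea described above can be sketched as follows: subtract a calibrated constant offset from a child's timestamps, then clamp any receive time that would still precede its matching send. This is an illustrative reconstruction, not AIMS code; `EPS`, the function name and the message tuples are invented.

```python
# Constant-skew compensation in the spirit of AIMS: align a child's clock
# to its parent, then nudge any receive timestamp that would otherwise
# precede the matching send (a physically impossible message).

EPS = 1e-6  # assumed minimal network latency, in seconds

def compensate(messages, child_offset):
    """messages: list of (send_ts_parent, recv_ts_child) pairs.
    child_offset: calibrated constant skew of the child's clock."""
    fixed = []
    for send_ts, recv_ts in messages:
        recv_ts -= child_offset            # remove the constant skew
        if recv_ts < send_ts + EPS:        # message still "back in time"?
            recv_ts = send_ts + EPS        # clamp to respect causality
        fixed.append((send_ts, recv_ts))
    return fixed

msgs = [(1.000, 0.950), (2.000, 2.100)]    # first message appears to arrive early
fixed_msgs = compensate(msgs, child_offset=-0.080)
print(fixed_msgs)
```

With the child's clock 80 ms behind, the first message's receive time moves from 0.950 to 1.030, after its send time, so the animated trace no longer shows a message traveling backwards.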

  1. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  2. Process analysis in a THTR trial reprocessing plant

    International Nuclear Information System (INIS)

    Brodda, B.G.; Filss, P.; Kirchner, H.; Kroth, K.; Lammertz, H.; Schaedlich, W.; Brocke, W.; Buerger, K.; Halling, H.; Watzlawik, K.H.

    1979-01-01

    The demands on an analytical control system for a THTR trial reprocessing plant are specified. In a rather detailed example, a typical sampling, sample monitoring and measuring process is described. Analytical control is partly automated. Data acquisition and evaluation by computer are described for some important, largely automated processes. Sample management and recording of in-line and off-line data are carried out by a data processing system. Some important experiments on sample taking, sample transport and on special analysis are described. (RB) [de]

  3. Automated monitoring of recovered water quality

    Science.gov (United States)

    Misselhorn, J. E.; Hartung, W. H.; Witz, S. W.

    1974-01-01

    A laboratory prototype water quality monitoring system provides an automatic system for online monitoring of the chemical, physical, and bacteriological properties of recovered water and for signaling malfunctions in the water recovery system. The monitor incorporates commercially available sensors, suitably modified, wherever possible.

  4. Automating usability of ATLAS distributed computing resources

    International Nuclear Information System (INIS)

    Tupputi, S A; Girolamo, A Di; Kouba, T; Schovancová, J

    2014-01-01

    The automation of ATLAS Distributed Computing (ADC) operations is essential to reduce manpower costs and allow performance-enhancing actions, which improve the reliability of the system. A crucial case in this perspective is the automatic handling of outages of ATLAS computing sites' storage resources, which are continuously exploited at the edge of their capabilities. It is challenging to adopt unambiguous decision criteria for storage resources of non-homogeneous types, sizes and roles. The recently developed Storage Area Automatic Blacklisting (SAAB) tool provides a suitable solution by employing an inference algorithm that processes the history of storage monitoring test outcomes. SAAB accomplishes both the task of providing global monitoring and that of performing automatic operations on single sites. The implementation of the SAAB tool was the first step in a comprehensive review of storage area monitoring and central management at all levels. This review involved the reordering and optimization of SAM test deployment and the inclusion of SAAB results in the ATLAS Site Status Board with both dedicated metrics and views. The resulting structure allows the status of storage resources to be monitored with fine time granularity and automatic actions to be taken in foreseen cases, such as automatic outage handling and notifications to sites. Human actions are hence restricted to reporting and following up on problems, where and when needed. In this work we show SAAB's working principles and features. We also present the decrease in human interactions achieved within the ATLAS Computing Operation team. The automation results in a prompt reaction to failures, which leads to the optimization of resource exploitation.
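
    The abstract does not specify SAAB's actual inference rule. As a minimal sketch of history-based blacklisting, assuming a hypothetical failure-fraction criterion over a sliding window of test outcomes:

```python
def should_blacklist(test_history, window=10, max_fail_fraction=0.5):
    """Decide whether to blacklist a storage area from its recent test
    outcomes (True = test passed). Hypothetical criterion, not the
    actual SAAB inference algorithm."""
    recent = list(test_history)[-window:]
    if not recent:
        return False  # no data: leave the site alone
    failures = sum(1 for ok in recent if not ok)
    return failures / len(recent) >= max_fail_fraction

# A site failing 6 of its last 10 monitoring tests would be blacklisted.
print(should_blacklist([True, True, False, False, True,
                        False, False, True, False, False]))  # True
```

    A real criterion would likely weight recent outcomes more heavily and distinguish test types; the window and threshold here are illustrative only.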

  5. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test-tube technique has given way to newer techniques such as the column agglutination technique, solid-phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have introduced semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are further major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  6. Automated Solar Cell Assembly Teamed Process Research. Final subcontract report, 6 January 1993--31 October 1995

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Hogan, S. J.; Breen, W. F.; Murach, J. M.; Sutherland, S. F.; Patterson, J. S.; Darkazalli, G. [Spire Corp., Bedford, MA (US)

    1996-02-01

    This is the Final Technical Report for a program entitled "Automated Solar Cell Assembly Teamed Process Research," funded by the US Department of Energy. This program was part of Phase 3A of the Photovoltaic Manufacturing Technology (PVMaT) project, which addressed the generic needs of the photovoltaic (PV) industry for improved quality, accelerated production scale-up, and substantially reduced manufacturing cost. Crystalline silicon solar cells (Czochralski monocrystalline, cast polycrystalline, and ribbon polycrystalline) are used in the great majority of PV modules produced in the US, accounting for 95% of all shipments in 1994. Spire's goal in this program was to reduce the cost of these modules by developing high-throughput (5 MW per year) automated processes for interconnecting solar cells made from standard and thin silicon wafers. Spire achieved this goal by developing a completely new automated processing system, designated the SPI-ASSEMBLER™ 5000, which is now offered as a commercial product to the PV industry. A discussion of the project and of the Assembler is provided.

  7. Understanding human management of automation errors

    Science.gov (United States)

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  8. The use of control room automation against human errors

    International Nuclear Information System (INIS)

    Kautto, A.

    1993-01-01

    Control room automation developed very strongly during the 1980s at IVO (Imatran Voima Oy). The earlier work expanded considerably with the building of the full-scope training simulator for the Loviisa plant. Important milestones have included, for example, the testing of the Critical Function Monitoring System, a concept developed by Combustion Engineering Inc., in the Loviisa training simulator in 1982; the replacement of the process and simulator computers in Loviisa in 1989 and 1990; and the introduction of computer-based procedures in operator training in 1993. With the development of automation and procedures it is possible to minimize the probability of human error. However, it is not possible to totally eliminate the risks caused by human errors. (orig.)

  9. Effects of Secondary Task Modality and Processing Code on Automation Trust and Utilization During Simulated Airline Luggage Screening

    Science.gov (United States)

    Phillips, Rachel; Madhavan, Poornima

    2010-01-01

    The purpose of this research was to examine the impact of environmental distractions on human trust and utilization of automation during the process of visual search. Participants performed a computer-simulated airline luggage screening task with the assistance of a 70% reliable automated decision aid (called DETECTOR) both with and without environmental distractions. The distraction was implemented as a secondary task in either a competing modality (visual) or non-competing modality (auditory). The secondary task processing code either competed with the luggage screening task (spatial code) or with the automation's textual directives (verbal code). We measured participants' system trust, perceived reliability of the system (when a target weapon was present and absent), compliance, reliance, and confidence when agreeing and disagreeing with the system under both distracted and undistracted conditions. Results revealed that system trust was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Perceived reliability of the system (when the target was present) was significantly higher when the secondary task was visual rather than auditory. Compliance with the aid increased in all conditions except for the auditory-verbal condition, where it decreased. Similar to the pattern for trust, reliance on the automation was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Confidence when agreeing with the system decreased with the addition of any kind of distraction; however, confidence when disagreeing increased with the addition of an auditory secondary task but decreased with the addition of a visual task. A model was developed to represent the research findings and demonstrate the relationship between secondary task modality, processing code, and automation use. Results suggest that the nature of environmental distractions influence

  10. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy.

  11. Cloud-based CT dose monitoring using the DICOM-structured report. Fully automated analysis in regard to national diagnostic reference levels

    International Nuclear Information System (INIS)

    Boos, J.; Rubbert, C.; Heusch, P.; Lanzman, R.S.; Aissa, J.; Antoch, G.; Kroepil, P.

    2016-01-01

    Purpose: To implement automated CT dose data monitoring using the DICOM Structured Report (DICOM-SR) in order to monitor dose-related CT data with regard to national diagnostic reference levels (DRLs). Materials and Methods: We used a novel in-house co-developed software tool based on the DICOM-SR to automatically monitor dose-related data from CT examinations. The DICOM-SR for each CT examination performed between 09/2011 and 03/2015 was automatically anonymized and sent from the CT scanners to a cloud server. Data were automatically analyzed according to body region, patient age and the corresponding DRLs for the volumetric computed tomography dose index (CTDIvol) and dose-length product (DLP). Results: Data from 36,523 examinations (131,527 scan series) performed on three different CT scanners and one PET/CT were analyzed. The overall mean CTDIvol and DLP were 51.3% and 52.8% of the national DRLs, respectively. CTDIvol and DLP reached 43.8% and 43.1% of the compared national DRLs for abdominal CT (n = 10,590), 66.6% and 69.6% for cranial CT (n = 16,098), and 37.8% and 44.0% for chest CT (n = 10,387), respectively. Overall, the CTDIvol exceeded national DRLs in 1.9% of the examinations, while the DLP exceeded national DRLs in 2.9% of the examinations. Between different CT protocols for the same body region, radiation exposure varied by up to 50% of the DRLs. Conclusion: The implemented cloud-based CT dose monitoring based on the DICOM-SR enables automated benchmarking against national DRLs. Overall, the local dose exposure from CT reached approximately 50% of these DRLs, indicating that an update of the DRLs as well as protocol-specific DRLs are desirable. The cloud-based approach enables multi-center dose monitoring and offers great potential to further optimize radiation exposure in radiological departments.
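
    The benchmarking step described above amounts to comparing each examination's dose indices against a DRL lookup table. A minimal sketch, with entirely hypothetical DRL values (real national DRLs differ by country, age group and protocol):

```python
# Hypothetical DRL lookup table (CTDIvol in mGy, DLP in mGy*cm);
# these numbers are illustrative, not any country's actual DRLs.
DRLS = {
    "abdomen": {"ctdi_vol": 20.0, "dlp": 900.0},
    "cranial": {"ctdi_vol": 60.0, "dlp": 950.0},
    "chest":   {"ctdi_vol": 12.0, "dlp": 400.0},
}

def check_exam(region, ctdi_vol, dlp):
    """Express an exam's dose indices as a percentage of the DRLs
    and flag any exceedance, as the monitoring tool would."""
    drl = DRLS[region]
    return {
        "ctdi_pct": 100.0 * ctdi_vol / drl["ctdi_vol"],
        "dlp_pct": 100.0 * dlp / drl["dlp"],
        "exceeds": ctdi_vol > drl["ctdi_vol"] or dlp > drl["dlp"],
    }

r = check_exam("chest", 4.5, 176.0)
print(r["ctdi_pct"], r["dlp_pct"], r["exceeds"])  # 37.5 44.0 False
```

    Aggregating the per-exam percentages over all examinations yields exactly the kind of mean-versus-DRL statistics the study reports.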

  12. Monitoring of polymer melt processing

    International Nuclear Information System (INIS)

    Alig, Ingo; Steinhoff, Bernd; Lellinger, Dirk

    2010-01-01

    The paper reviews the state-of-the-art of in-line and on-line monitoring during polymer melt processing by compounding, extrusion and injection moulding. Different spectroscopic and scattering techniques as well as conductivity and viscosity measurements are reviewed and compared concerning their potential for different process applications. In addition to information on chemical composition and state of the process, the in situ detection of morphology, which is of specific interest for multiphase polymer systems such as polymer composites and polymer blends, is described in detail. For these systems, the product properties strongly depend on the phase or filler morphology created during processing. Examples for optical (UV/vis, NIR) and ultrasonic attenuation spectra recorded during extrusion are given, which were found to be sensitive to the chemical composition as well as to size and degree of dispersion of micro or nanofillers in the polymer matrix. By small-angle light scattering experiments, process-induced structures were detected in blends of incompatible polymers during compounding. Using conductivity measurements during extrusion, the influence of processing conditions on the electrical conductivity of polymer melts with conductive fillers (carbon black or carbon nanotubes) was monitored. (topical review)

  13. A portable, automated, inexpensive mass and balance calibration system

    International Nuclear Information System (INIS)

    Maxwell, S.L. III; Clark, J.P.

    1987-01-01

    Reliable mass measurements are essential for a nuclear production facility or process control laboratory. DOE Order 5630.2 requires that traceable standards be used to calibrate and monitor equipment used for nuclear material measurements. To ensure the reliability of mass measurements and to comply with DOE traceability requirements, a portable, automated mass and balance calibration system is used at the Savannah River Plant. Automation is achieved using an EPSON HX-20 notebook computer, which can be operated via an RS232C interface to electronic balances or function with manual data entry if computer interfacing is not feasible. This economical, comprehensive, user-friendly system has three main functions in a mass measurement control program (MMCP): balance certification, calibration of mass standards, and daily measurement of traceable standards. The balance certification program tests for accuracy, precision, sensitivity, linearity, and corner-loading against specific requirements. The mass calibration program allows rapid calibration of inexpensive mass standards traceable to certified Class S standards. The MMCP permits daily measurement of traceable standards to monitor the reliability of balances during routine use. The automated system verifies balance calibration, stores results for future use, and provides a printed control chart of the stored data. Another feature of the system permits three different weighing routines that accommodate the need for varying degrees of reliability in routine weighing operations.
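
    The daily-measurement function of such an MMCP can be sketched as a simple Shewhart-style control check, assuming hypothetical 3-sigma limits derived from repeated weighings of a traceable standard (the actual acceptance criteria of the Savannah River system are not given in the abstract):

```python
import statistics

def control_limits(calibration_weighings):
    """Center line and 3-sigma control limits from repeated weighings
    of a traceable standard (hypothetical acceptance criterion)."""
    mean = statistics.mean(calibration_weighings)
    sd = statistics.stdev(calibration_weighings)
    return mean - 3 * sd, mean, mean + 3 * sd

def daily_check(measurement, limits):
    """True if today's standard weighing stays within control limits."""
    lo, _, hi = limits
    return lo <= measurement <= hi

# Weighings of a nominal 100 g traceable standard, in grams.
history = [100.0002, 99.9998, 100.0001, 99.9999, 100.0000]
limits = control_limits(history)
print(daily_check(100.0001, limits))  # True: balance in control
print(daily_check(100.0100, limits))  # False: investigate before use
```

    Logging each daily result against these limits is what produces the printed control chart the abstract mentions.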

  15. AUTOMATED SYSTEM OF DATA PROCESSING WITH THE IMPLEMENTATION OF RATING TECHNOLOGY OF TEACHING

    Directory of Open Access Journals (Sweden)

    О. И. Дзювина

    2014-01-01

    Rating technology of teaching enables independent and individual work by students and increases their motivation. Purpose: to increase the efficiency of data processing with the implementation of rating technology of teaching. Method: analysis, synthesis, experiment. Results: an automated data processing system was developed for the implementation of rating technology of teaching. Practical implications: education.

  16. Web-based execution of graphical work-flows: a modular platform for multifunctional scientific process automation

    International Nuclear Information System (INIS)

    De Ley, E.; Jacobs, D.; Ounsy, M.

    2012-01-01

    The Passerelle process automation suite offers a fundamentally modular solution platform, based on a layered integration of several best-of-breed technologies. It has been successfully applied by Synchrotron Soleil as the sequencer for data acquisition and control processes on its beamlines, integrated with TANGO as a control bus and GlobalScreen™ as the SCADA package. Since last year it has been used as the graphical work-flow component for the development of an Eclipse-based Data Analysis Work Bench at ESRF. The top layer of Passerelle exposes an actor-based development paradigm, based on the Ptolemy framework (UC Berkeley). Actors provide explicit reusability and strong decoupling, combined with an inherently concurrent execution model. Actor libraries exist for TANGO integration, web services, database operations, flow control, rules-based analysis, mathematical calculations, launching external scripts, etc. Passerelle's internal architecture is based on OSGi, the major Java framework for modular service-based applications. A large set of modules exists that can be recombined as desired to obtain different features and deployment models. Besides desktop versions of the Passerelle work-flow workbench, there is also the Passerelle Manager. It is a secured web application, including a graphical editor, for centralized design, execution, management and monitoring of process flows, integrating standard Java Enterprise services with OSGi. We will present the internal technical architecture, some interesting application cases and the lessons learnt. (authors)
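
    The Ptolemy-style actor model described above (decoupled actors connected by channels, executing concurrently) can be sketched in miniature; the actor function and pipeline here are illustrative, not Passerelle's actual API:

```python
import queue
import threading

def actor(fn, inbox, outbox):
    """Run fn on every incoming message until the None sentinel arrives;
    each actor owns its own thread, giving concurrent, decoupled stages."""
    def loop():
        while (msg := inbox.get()) is not None:
            outbox.put(fn(msg))
        outbox.put(None)  # propagate shutdown downstream
    t = threading.Thread(target=loop)
    t.start()
    return t

# Pipeline: source queue -> doubling actor -> formatting actor -> sink
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
threads = [actor(lambda x: x * 2, q1, q2),
           actor(lambda x: f"value={x}", q2, q3)]
for item in [1, 2, 3]:
    q1.put(item)
q1.put(None)

results = []
while (r := q3.get()) is not None:
    results.append(r)
for t in threads:
    t.join()
print(results)  # ['value=2', 'value=4', 'value=6']
```

    Each stage only knows its input and output channels, which is the decoupling property that makes actor libraries reusable across work-flows.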

  17. FY-2010 Process Monitoring Technology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Christopher R.; Bryan, Samuel A.; Casella, Amanda J.; Hines, Wes; Levitskaia, Tatiana G.; Henkell, J.; Schwantes, Jon M.; Jordan, Elizabeth A.; Lines, Amanda M.; Fraga, Carlos G.; Peterson, James M.; Verdugo, Dawn E.; Christensen, Ronald N.; Peper, Shane M.

    2011-01-01

    During FY 2010, work under the Spectroscopy-Based Process Monitoring task included ordering and receiving four fluid flow meters and four flow visible-near-infrared spectrometer cells to be instrumented within the centrifugal contactor system at Pacific Northwest National Laboratory (PNNL). Initial demonstrations of real-time spectroscopic measurements on cold-stream simulants were conducted under plutonium (Pu)/uranium (U) (PUREX) solvent extraction process conditions. The specific test case examined the extraction of neodymium nitrate (Nd(NO3)3) from an aqueous nitric acid (HNO3) feed into a tri-n-butyl phosphate (TBP)/n-dodecane solvent. Demonstration testing of this system included diverting a sample from the aqueous feed while monitoring every phase of the process with the on-line spectroscopic process monitoring system. The purpose of this demonstration was to test whether spectroscopic monitoring is capable of determining the mass balance of the metal nitrate species involved in a cross-current solvent extraction scheme while a sample is being diverted from the system. The diversion scenario involved diverting a portion of the feed from a counter-current extraction system while a continuous extraction experiment was underway. A successful test would demonstrate the ability of the process monitoring system to detect and quantify the diversion of material from the system during a real-time continuous solvent extraction experiment. The system was designed to mimic a PUREX-type extraction process with a bank of four centrifugal contactors. The aqueous feed contained Nd(NO3)3 in HNO3, and the organic phase was composed of TBP/n-dodecane. The amount of sample diverted, as observed by on-line spectroscopic process monitoring, was measured to be 3 mmol (3 × 10⁻³ mol) of Nd3+. This value was in excellent agreement with the 2.9 mmol Nd3+ value based on the known mass of sample taken (i.e., diverted) directly from the system feed solution.
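
    The diverted quantity follows from the spectroscopically measured feed concentration and the diverted sample volume. A sketch of the arithmetic, with illustrative concentration and volume values chosen to reproduce the ~2.9 mmol figure (the actual feed concentration and sample size are not given in the abstract):

```python
def moles_diverted(concentration_mol_per_l, volume_ml):
    """Moles of Nd3+ leaving with a diverted sample, computed from the
    spectroscopically measured feed concentration and sample volume."""
    return concentration_mol_per_l * volume_ml / 1000.0

# Illustrative values only: a 0.058 M Nd3+ feed and a 50 mL sample
# give the ~2.9 mmol figure quoted in the report.
est = moles_diverted(0.058, 50.0)
print(round(est * 1000, 1), "mmol")  # 2.9 mmol
```

    Comparing this computed quantity against the total metal unaccounted for in the contactor-bank mass balance is what lets the monitoring system flag a diversion.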

  18. Remote monitoring field trial. Application to automated air sampling. Report on Task FIN-E935 of the Finnish Support Programme to IAEA Safeguards

    International Nuclear Information System (INIS)

    Poellaenen, R.; Ilander, T.; Lehtinen, J.; Leppaenen, A.; Nikkinen, M.; Toivonen, H.; Ylaetalo, S.; Smartt, H.; Garcia, R.; Martinez, R.; Glidewell, D.; Krantz, K.

    1999-01-01

    An automated air sampling station has recently been developed by the Radiation and Nuclear Safety Authority (STUK). The station is furnished with equipment that allows comprehensive remote monitoring of the station and its data. Under the Finnish Support Programme to IAEA Safeguards, STUK and Sandia National Laboratories (SNL) established a field trial to demonstrate the use of remote monitoring technologies. STUK provided the means for real-time radiation monitoring and sample authentication, whereas SNL delivered the means for authenticated surveillance of the equipment and its location. The field trial showed that remote monitoring can be carried out using simple means, although advanced facilities are needed for comprehensive surveillance. Authenticated measurement data could be reliably transferred from the monitoring site to headquarters without the presence of authorized personnel at the monitoring site. The operation of the station and the remote monitoring system was reliable. (orig.)
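
    The abstract does not describe how the measurement data were authenticated. One standard approach is a keyed hash (HMAC) over each payload, sketched here with a hypothetical pre-shared key and message format:

```python
import hashlib
import hmac

SECRET = b"station-shared-key"  # hypothetical pre-shared key

def sign(payload: bytes) -> bytes:
    """Tag a measurement payload so headquarters can verify it was not
    altered in transit. The field trial's actual mechanism is not
    specified in the abstract; HMAC is one common choice."""
    return hmac.new(SECRET, payload, hashlib.sha256).digest()

def verify(payload: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(payload), tag)

msg = b"1999-06-01T12:00Z Cs-137 activity=12 uBq/m3"
tag = sign(msg)
print(verify(msg, tag))         # True: authentic data accepted
print(verify(msg + b"x", tag))  # False: tampered payload rejected
```

    A scheme like this lets data travel over untrusted links without requiring authorized personnel at the monitoring site, which is the property the field trial set out to demonstrate.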

  19. Process development for automated solar cell and module production. Task 4: automated array assembly. Quarterly report No. 5

    Energy Technology Data Exchange (ETDEWEB)

    Hagerty, J.J.

    1980-01-31

    Construction of an automated solar cell layup and interconnect system is now complete. The system incorporates a Unimate 2000 B industrial robot with an end effector consisting of a vacuum pick-up and an induction heating coil. The robot interfaces with a smart cell preparation station which correctly orients the cell, applies solder paste, and forms and positions the correct lengths of interconnect lead. The system is controlled and monitored by a TRS-80 microcomputer. The first operational tests of the fully integrated station have been run. These tests proved the soundness of the basic design concept but also pointed to areas in which modifications are necessary. These modifications are nearly complete and the improved parts are being integrated. Development of the controlling computer program is progressing both to reflect these changes and to reduce operating time.

  20. Smart membranes for monitoring membrane based desalination processes

    KAUST Repository

    Laleg-Kirati, Taous-Meriem

    2017-10-12

    Various examples relate to smart membranes for monitoring membrane-based processes such as membrane distillation. In one example, a membrane includes a porous surface and a plurality of sensors (e.g., temperature, flow and/or impedance sensors) mounted on the porous surface. In another example, a membrane distillation (MD) process includes the membrane. Processing circuitry can be configured to monitor the outputs of the sensors. The monitored outputs can be used to detect membrane degradation or fouling, or to provide an indication that the membrane should be replaced or cleaned. The sensors can also provide temperatures or temperature differentials across the porous surface, which can be used to improve modeling or to control the MD process.
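
    As an illustration of how such sensor outputs might drive a cleaning indication, the sketch below flags fouling when the transmembrane temperature differential collapses relative to a clean-membrane baseline; the criterion and threshold are assumptions, not taken from the source:

```python
def needs_cleaning(temp_feed_c, temp_permeate_c, baseline_dt_c, threshold=0.7):
    """Flag fouling when the transmembrane temperature differential
    falls below a fraction of its clean-membrane baseline.
    The criterion and 0.7 threshold are illustrative assumptions."""
    dt = temp_feed_c - temp_permeate_c
    return dt < threshold * baseline_dt_c

# Clean membrane showed a 30 C differential; it has now fallen to 18 C.
print(needs_cleaning(60.0, 42.0, baseline_dt_c=30.0))  # True
print(needs_cleaning(60.0, 32.0, baseline_dt_c=30.0))  # False: 28 C is fine
```

    In practice the impedance and flow sensors would feed analogous checks, and the processing circuitry would combine them before signaling replacement or cleaning.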

  1. Multi-method automated diagnostics of rotating machines

    Science.gov (United States)

    Kostyukov, A. V.; Boychenko, S. N.; Shchelkanov, A. V.; Burda, E. A.

    2017-08-01

    The automated machinery diagnostics and monitoring systems used within petrochemical plants are an integral part of the measures taken to ensure the safety and, as a consequence, the efficiency of these industrial facilities. Such systems are often limited in their functionality due to the specifics of the diagnostic techniques adopted. As the diagnostic techniques applied in each system are limited, and machinery defects can have a different physical nature, it becomes necessary to combine several diagnostics and monitoring systems to control the various machinery components. Such an approach is inconvenient, since it requires additional measures to bring the diagnostic results into a single view of the technical condition of production assets. Here, by a production facility we mean a bonded complex of a process unit, a drive, a power source and lines; a failure of any of these components will cause an outage of the production asset, which is unacceptable. The purpose of the study is to test the combined use of vibration diagnostics and partial discharge techniques within enterprise diagnostic systems for automated control of the technical condition of rotating machinery, during maintenance and at production facilities. The described solutions allow the condition of both mechanical and electrical components of rotating machines to be controlled. It is shown that the functionality of the diagnostic systems can be expanded with minimal changes to the technological chains of repair and operation of rotating machinery. Automation of such systems reduces the influence of the human factor on the quality of repair and diagnostics of the machinery.

  2. SOFTWARE OF MONITORING SYSTEM FOR ALLOCATED INFORMATION OF STATE PROGRAM ON INNOVATIVE DEVELOPMENT OF THE REPUBLIC OF BELARUS

    Directory of Open Access Journals (Sweden)

    V. A. Rybak

    2009-01-01

    The paper analyzes the main indices (indicators) of the realization of the State Program on Innovative Development of the Republic of Belarus (SPIDRB), presents and justifies a hierarchical structure for data processing and display, and finalizes the list of SPIDRB participants and executors. The major functions of the automation units of an automated SPIDRB monitoring system are determined in the paper. To accumulate, process and furnish information, a system of documentary databases based on IBM Lotus Domino/Notes 8, together with the relational databases IBM DB2 and MS SQL Server 2005, is used. Interaction with data suppliers is ensured by means of e-mail. The proposed scientific principles and software make it possible to automate the SPIDRB monitoring process and to raise decision-making efficiency in the field of the innovative economic development of the country.

  3. Highly Automated Agile Testing Process: An Industrial Case Study

    Directory of Open Access Journals (Sweden)

    Jarosław Berłowski

    2016-09-01

    This paper presents a description of an agile testing process in a medium-size software project developed using Scrum. The research method used was a case study; surveys, quantifiable project data sources and the qualitative opinions of project members were used for data collection. Challenges related to the testing process, concerning a complex project environment and unscheduled releases, were identified. Based on the obtained results, we concluded that the described approach addresses the aforementioned issues well. Recommendations were therefore made with regard to the employed principles of agility, specifically: continuous integration, responding to change, test automation and test-driven development. Furthermore, an efficient testing environment that combines a number of test frameworks (e.g. JUnit, Selenium, Jersey Test) with custom-developed simulators is presented.

  4. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

Process monitoring provides important information on the product, process and manufacturing system during part manufacturing. Such information can be used for process optimization and detection of undesired processing conditions to initiate timely actions for avoidance of defects, thereby improving quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero-defect manufacturing. A methodological approach of general applicability is presented in this concern. The approach consists of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensor placement, validation of the monitoring solutions, definition of the reference manufacturing performance...

  5. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    Science.gov (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
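The closing list of optimization strategies (genetic algorithm, multiple-hill climbing, simulated annealing) can be illustrated with a minimal simulated-annealing sketch. Everything here is a hypothetical stand-in, not the authors' implementation: the cost function substitutes for whatever score the optical sensor would report, and the schedule constants are arbitrary.

```python
import math
import random

def simulated_annealing(cost, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000, seed=42):
    """Minimize `cost` over a scalar parameter by simulated annealing."""
    rng = random.Random(seed)
    x, best = x0, x0
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        delta = cost(cand) - cost(x)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
        if cost(x) < cost(best):
            best = x
        t *= cooling  # geometric cooling schedule
    return best

# Hypothetical objective: the optimum mixing ratio sits near 0.3.
best = simulated_annealing(lambda r: (r - 0.3) ** 2, x0=0.9)
```

As the temperature decays, the search degenerates into hill climbing, which is why the authors could reuse much of the same infrastructure across all three algorithm families.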

  6. Automated road segment creation process : a report on research sponsored by SaferSim.

    Science.gov (United States)

    2016-08-01

    This report provides a summary of a set of tools that can be used to automate the process : of generating roadway surfaces from alignment and texture information. The tools developed : were created in Python 3.x and rely on the availability of two da...

  7. Towards the development of an automated ATP measuring platform to monitor microbial quality of drinking water

    DEFF Research Database (Denmark)

    Tatari, Karolina; Hansen, C. B.; Rasmussen, A.

This work aimed to develop an automated and nearly on-line method to monitor ATP levels in drinking water as an indicator of microbial contamination. The system consists of a microfluidic cartridge installed in a light-tight box, where the sample is mixed with the reagents and the emitted light is detected by a photomultiplier. Temperature in the assay box is controlled and set to 25°C. Calibration of the system using ATP standard solutions was successful, both for free and for total ATP. Chemical release of ATP by reagent addition, however, resulted in the formation of particles that ultimately...

  8. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

Highlights: • We propose an appropriate automation rate that enables the best human performance. • We analyze the shortest working time considering Situation Awareness Recovery (SAR). • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods. • The process to derive the optimized automation rate is demonstrated through case studies. - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed to promise greater efficiency, lower workloads, and fewer operator errors through enhancing operator and system performance. However, excessive introduction of automation has deteriorated operator performance due to side effects referred to as Out-of-the-Loop (OOTL) problems, and this is a critical issue that must be resolved. Thus, in order to determine the level of automation that assures the best human operator performance, a quantitative method of optimizing automation is proposed in this paper. To this end, the automation rate and the ostracism rate, estimation methods that quantitatively analyze the positive and negative effects of automation, respectively, are integrated. The integration derives the shortest working time by considering the concept of Situation Awareness Recovery (SAR); the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency-operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the
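The idea of an automation rate that minimizes working time can be pictured with a toy model. This sketch is not the authors' method: the linear term standing in for manual workload and the quadratic term standing in for SAR overhead are invented purely for illustration.

```python
def working_time(rate, manual_time=100.0, sar_penalty=60.0):
    """Hypothetical model: automation shortens manual work linearly,
    while Situation Awareness Recovery (SAR) cost grows quadratically."""
    return manual_time * (1.0 - rate) + sar_penalty * rate ** 2

# Scan candidate automation rates and keep the one with the shortest time.
rates = [i / 100 for i in range(101)]
optimal = min(rates, key=working_time)
```

With these made-up coefficients the minimum falls at rate ≈ 0.83: more automation than that costs more in awareness recovery than it saves in manual work.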

  9. MICROBIOLOGICAL MONITORING AND AUTOMATED EVENT SAMPLING AT KARST SPRINGS USING LEO-SATELLITES

    Science.gov (United States)

    Stadler, Hermann; Skritek, Paul; Sommer, Regina; Mach, Robert L.; Zerobin, Wolfgang; Farnleitner, Andreas H.

    2010-01-01

Data communication via Low-Earth-Orbit satellites between portable hydro-meteorological measuring stations is the backbone of our system. This networking allows automated event sampling with short time increments, also for E. coli field analysis. All activities in the course of event sampling can be observed on an internet platform based on a Linux server. Samples taken conventionally by hand, compared with the auto-sampling procedure, gave corresponding results in agreement with the ISO 9308-1 reference method. E. coli concentrations were individually corrected by event-specific die-off rates (0.10–0.14 day−1), compensating losses due to sample storage at spring temperature in the auto-sampler. Two large summer events in 2005/2006 at a large alpine karst spring (LKAS2) were monitored, including detailed analysis of E. coli dynamics (n = 271) together with comprehensive hydrological characterisations. High-resolution time series demonstrated a sudden increase of E. coli concentrations in spring water (approx. 2 log10 units) with a specific time delay after the beginning of the event. Statistical analysis suggested the spectral absorption coefficient measured at 254 nm (SAC254) as an early-warning surrogate for real-time monitoring of faecal input. Together with the LEO-satellite based system, it is a helpful tool for early-warning systems in the field of drinking water protection. PMID:18776628
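The die-off correction described above amounts to back-extrapolating a first-order decay over the storage time. A minimal sketch, assuming first-order kinetics and the reported 0.10–0.14 day−1 rate range; the function name and example numbers are hypothetical.

```python
import math

def correct_for_die_off(measured_cfu, storage_days, die_off_rate=0.12):
    """Back-correct an E. coli count for first-order die-off during storage
    in the auto-sampler (rate in 1/day; 0.10-0.14 day-1 reported)."""
    return measured_cfu * math.exp(die_off_rate * storage_days)

# A sample stored for 2 days at 0.12/day underestimates the spring
# concentration by roughly 27%, so the count is scaled back up.
corrected = correct_for_die_off(100.0, 2.0)
```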

  10. Application of the informational reference system OZhUR to the automated processing of data from satellites of the Kosmos series

    Science.gov (United States)

    Pokras, V. M.; Yevdokimov, V. P.; Maslov, V. D.

    1978-01-01

The structure and potential of the information reference system OZhUR, designed for the automated data processing systems of scientific space vehicles (SV), are considered. The system OZhUR ensures control of the extraction phase of processing with respect to a concrete SV and the exchange of data between phases. The practical application of the system OZhUR is exemplified in the construction of a data processing system for satellites of the Kosmos series. As a result of automating the operations of exchange and control, the volume of manual data preparation is significantly reduced, and there is no longer any need for individual logs fixing the status of data processing. The system OZhUR is included in the automated data processing system Nauka, which is realized in the PL-1 language on a binary one-address, one-state (BOS OS) electronic computer.

  11. A guide to automation techniques in calorimetry

    International Nuclear Information System (INIS)

    Renz, D.P.; Wetzel, J.R.; Breakall, K.L.; James, S.J.; Kasperski, P.W.

    1992-01-01

Many improvements occurring in calorimetry measurement technology in the past few years will help current users of calorimeters make better use of their existing systems. These include a more user-friendly operator-computer interface, a more detailed printout at the end of each run, improved data processing to eliminate operator error, and improved system monitoring to detect system or environmental problems. In addition, an electrical calibration heater has been developed to replace plutonium-238 heat standards for calibrating calorimeters, and several automation systems allow for easier and safer system operation

  12. Interpreting complex data by methods of recognition and classification in an automated system of aerogeophysical material processing

    Energy Technology Data Exchange (ETDEWEB)

    Koval' , L.A.; Dolgov, S.V.; Liokumovich, G.B.; Ovcharenko, A.V.; Priyezzhev, I.I.

    1984-01-01

The ASOM-AGS/YeS system for automated processing of aerogeophysical data is equipped for complex interpretation of multichannel measurements. Algorithms of factor analysis, automatic classification and an apparatus of a priori specified (selected) decision rules are used. The areas of effect of these procedures can be initially limited by the specified geological information. The possibilities of the method are demonstrated by the results of automated processing of airborne gamma-spectrometric measurements in the region of a known copper-porphyry occurrence in Kazakhstan. After principal-component processing, this ore deposit was clearly marked by a composite halo of independent factors: U (sharp increase), Th (noticeable increase), K (decrease).
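The factor-analysis step in such a pipeline is essentially principal component analysis. Below is a minimal numpy sketch on synthetic three-channel data (think U, Th, K count rates); the channel scalings and noise levels are invented, and the ASOM-AGS/YeS algorithms themselves are not described in enough detail here to reproduce.

```python
import numpy as np

def principal_components(channels):
    """Project multichannel measurements (rows = samples, cols = channels)
    onto their principal components via an eigendecomposition of the
    channel covariance matrix."""
    X = channels - channels.mean(axis=0)      # centre each channel
    cov = np.cov(X, rowvar=False)             # channel covariance
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues, ascending
    order = np.argsort(eigvals)[::-1]         # strongest factor first
    return X @ eigvecs[:, order], eigvals[order]

# Synthetic 3-channel data driven by one dominant shared factor.
rng = np.random.default_rng(0)
factor = rng.normal(size=500)
data = np.column_stack([3 * factor, 2 * factor, 0.5 * factor])
data += rng.normal(scale=0.1, size=(500, 3))
scores, variances = principal_components(data)
```

The first component soaks up the shared variance, which is exactly the behavior that lets correlated multichannel anomalies stand out as a single factor halo.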

  13. Automated Hydroforming of Seamless Superconducting RF Cavity

    International Nuclear Information System (INIS)

    Nagata, Tomohiko; Shinozawa, Seiichi; Abe, Noriyuki; Nagakubo, Junki; Murakami, Hirohiko; Tajima, Tsuyoshi; Inoue, Hitoshi; Yamanaka, Masashi; Ueno, Kenji

    2012-01-01

We are studying the possibility of an automated hydroforming process for seamless superconducting RF cavities. Preliminary hydroforming tests of three-cell cavities from seamless tubes made of C1020 copper have been performed. The key point of automated forming is to monitor and strictly control parameters such as operation time, internal pressure and material displacement. In particular, our studies require the ability to control axial and radial deformation independently. We plan to perform the forming in two stages to increase the reliability of successful forming. In the first stage of hydroforming, using intermediate constraint dies, three-cell cavities were successfully formed in less than 1 minute. In parallel, we performed elongation tests on cavity-quality niobium and confirmed that the elongation of >64% in two stages required for forming our 1.3 GHz cavities is achievable.

  14. Using microwave Doppler radar in automated manufacturing applications

    Science.gov (United States)

    Smith, Gregory C.

    Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). By advancing the state of the art in manufacturing automation, the studies may help
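Any radar monitor of the kind described must at minimum recover the dominant Doppler frequency from the detector's baseband output. The sketch below is not the dissertation's algorithm, just a windowed-FFT peak pick on a synthetic tone at the Doppler shift a common 10.525 GHz module would produce for a 0.1 m/s target (f_d = 2·v·f0/c ≈ 7 Hz).

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the strongest positive frequency in a radar baseband signal,
    e.g. the Doppler shift produced by a moving robot arm."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

fs = 1000.0                                # 1 kHz sampling, 1 s window
f_d = 2 * 0.1 * 10.525e9 / 3e8             # Doppler shift, ~7.02 Hz
t = np.arange(0, 1.0, 1 / fs)
measured = dominant_frequency(np.sin(2 * np.pi * f_d * t), fs)
```

A 1 s window gives 1 Hz bins, so even this crude estimator resolves the shift well enough to distinguish, say, a stalled axis from one moving at nominal speed.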

  15. A wireless smart sensor network for automated monitoring of cable tension

    International Nuclear Information System (INIS)

    Sim, Sung-Han; Cho, Soojin; Li, Jian; Jo, Hongki; Park, Jong-Woong; Jung, Hyung-Jo; Spencer Jr, Billie F

    2014-01-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea. (paper)

  16. A wireless smart sensor network for automated monitoring of cable tension

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jong-Woong; Cho, Soojin; Spencer, Billie F., Jr.; Jung, Hyung-Jo

    2014-02-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea.
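Vibration-based tension estimation of the kind run in-network on the Imote2 nodes typically rests on the taut-string relation f_n = (n/2L)·sqrt(T/m). Here is a minimal sketch with invented cable parameters; the deployed system's actual algorithm (peak picking, multi-mode fitting, bending-stiffness corrections) is more elaborate.

```python
def cable_tension(frequency_hz, mode_number, length_m, mass_per_m):
    """Estimate cable tension (N) from a measured natural frequency using
    the flat taut-string model: f_n = (n / 2L) * sqrt(T / m),
    so T = 4 * m * L**2 * (f_n / n)**2."""
    return 4.0 * mass_per_m * length_m ** 2 * (frequency_hz / mode_number) ** 2

# Hypothetical stay cable: 100 m long, 50 kg/m, first mode at 1.2 Hz.
tension = cable_tension(1.2, 1, 100.0, 50.0)  # about 2.88 MN
```

Because only the identified modal frequency needs to leave the node, transmitting tension estimates instead of raw acceleration records is what cuts the wireless traffic and power draw mentioned above.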

  17. Implementation and Impact of an Automated Group Monitoring and Feedback System to Promote Hand Hygiene Among Health Care Personnel

    Science.gov (United States)

    Conway, Laurie J.; Riley, Linda; Saiman, Lisa; Cohen, Bevin; Alper, Paul; Larson, Elaine L.

    2015-01-01

Article-at-a-Glance Background Despite substantial evidence to support the effectiveness of hand hygiene for preventing health care–associated infections, hand hygiene practice is often inadequate. Hand hygiene product dispensers that can electronically capture hand hygiene events have the potential to improve hand hygiene performance. An automated group monitoring and feedback system was implemented from January 2012 through March 2013 at a 140-bed community hospital. Methods An electronic system that monitors the use of sanitizer and soap but does not identify individual health care personnel was used to calculate hand hygiene events per patient-hour for each of eight inpatient units and hand hygiene events per patient-visit for the six outpatient units. Hand hygiene was monitored but feedback was not provided during a six-month baseline period and a three-month rollout period. During the rollout, focus groups were conducted to determine preferences for feedback frequency and format. During the six-month intervention period, graphical reports were e-mailed monthly to all managers and administrators, and focus groups were repeated. Results After the feedback began, hand hygiene increased on average by 0.17 events/patient-hour in inpatient units (interquartile range = 0.14, p = .008). In outpatient units, hand hygiene performance did not change significantly. A variety of challenges were encountered, including obtaining accurate census and staffing data, engendering confidence in the system, disseminating information in the reports, and using the data to drive improvement. Conclusions Feedback via an automated system was associated with improved hand hygiene performance in the short term. PMID:25252389

  18. Implementation and impact of an automated group monitoring and feedback system to promote hand hygiene among health care personnel.

    Science.gov (United States)

    Conway, Laurie J; Riley, Linda; Saiman, Lisa; Cohen, Bevin; Alper, Paul; Larson, Elaine L

    2014-09-01

Despite substantial evidence to support the effectiveness of hand hygiene for preventing health care-associated infections, hand hygiene practice is often inadequate. Hand hygiene product dispensers that can electronically capture hand hygiene events have the potential to improve hand hygiene performance. An automated group monitoring and feedback system was implemented from January 2012 through March 2013 at a 140-bed community hospital. An electronic system that monitors the use of sanitizer and soap but does not identify individual health care personnel was used to calculate hand hygiene events per patient-hour for each of eight inpatient units and hand hygiene events per patient-visit for the six outpatient units. Hand hygiene was monitored but feedback was not provided during a six-month baseline period and a three-month rollout period. During the rollout, focus groups were conducted to determine preferences for feedback frequency and format. During the six-month intervention period, graphical reports were e-mailed monthly to all managers and administrators, and focus groups were repeated. After the feedback began, hand hygiene increased on average by 0.17 events/patient-hour in inpatient units (interquartile range = 0.14, p = .008). In outpatient units, hand hygiene performance did not change significantly. A variety of challenges were encountered, including obtaining accurate census and staffing data, engendering confidence in the system, disseminating information in the reports, and using the data to drive improvement. Feedback via an automated system was associated with improved hand hygiene performance in the short term.
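The unit-level metric in both hand hygiene records is straightforward to reproduce. The sketch below computes events per patient-hour from dispenser counts; the counts themselves are invented so that the difference matches the reported average inpatient gain of 0.17 events/patient-hour.

```python
def hand_hygiene_rate(dispenser_events, patient_hours):
    """Unit-level hand hygiene performance: dispenses per patient-hour."""
    if patient_hours <= 0:
        raise ValueError("patient_hours must be positive")
    return dispenser_events / patient_hours

# Hypothetical inpatient unit, baseline vs. feedback period.
baseline = hand_hygiene_rate(4200, 2000)       # 2.10 events/patient-hour
with_feedback = hand_hygiene_rate(4540, 2000)  # 2.27 events/patient-hour
improvement = with_feedback - baseline         # 0.17, the reported average gain
```

The denominator is why the challenges section stresses accurate census data: an error in patient-hours shifts the rate even when dispenser counts are exact.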

  19. Chemotherapy-related neuropathic symptom management: a randomized trial of an automated symptom-monitoring system paired with nurse practitioner follow-up.

    Science.gov (United States)

    Kolb, Noah Allan; Smith, Albert Gordon; Singleton, John Robinson; Beck, Susan L; Howard, Diantha; Dittus, Kim; Karafiath, Summer; Mooney, Kathi

    2018-05-01

The purpose of this study was to evaluate a new care model to reduce chemotherapy-induced neuropathic symptoms. Usual care for neuropathic symptoms was prospectively compared to an automated symptom-monitoring and coaching system, SymptomCare@Home (SCH), which included nurse practitioner follow-up triggered by moderate to severe symptoms. Patients beginning chemotherapy were randomized to usual care (UC) or to the SCH intervention. This sub-analysis included only taxane/platin therapies. Participants called the automated telephone symptom-monitoring system daily to report numbness and tingling. The monitoring system recorded patient-reported neuropathic symptom severity, distress, and activity interference on a 0-10 scale. UC participants were instructed to call their oncologist for symptom management. SCH participants with symptom severity of ≥ 4 received automated self-care strategies, and a nurse practitioner (NP) provided guideline-based care. There were 252 participants, 78.6% of whom were female. Mean age was 55.1 years. Mean follow-up was 90.2 ± 39.9 days (81.1 ± 40.3 calls). SCH participants had fewer days of moderate (1.8 ± 4.0 vs. 8.6 ± 17.3, p < 0.001) and severe chemotherapy-induced peripheral neuropathy symptoms (0.3 ± 1.0 vs. 1.1 ± 5.2, p = 0.006). SCH participants had fewer days with moderate and severe symptom-related distress (1.4 ± 3.7 vs. 6.9 ± 15.0, p < 0.001; 0.2 ± 0.9 vs. 1.5 ± 6.1, p = 0.001) and trended towards less activity interference (3.3 ± 1.9 vs. 3.8 ± 2.1, p = 0.08). Other neuropathic symptoms were addressed in 5.8-15.4% of SCH follow-up calls. The SCH system effectively identified neuropathic symptoms and their severity and, paired with NP follow-up, reduced symptom prevalence, severity, and distress compared to usual care.
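The SCH triage step can be sketched as a threshold rule on the daily 0-10 severity score. The ≥ 4 trigger comes from the record above; the "severe" cutoff of 7 used to separate the moderate and severe day counts is an assumption here, not stated in the abstract.

```python
def triage_report(severity):
    """Classify a daily automated symptom report on a 0-10 scale.
    Scores >= 4 trigger automated self-care strategies plus nurse
    practitioner follow-up; the severe cutoff of 7 is assumed."""
    if not 0 <= severity <= 10:
        raise ValueError("severity must be within 0-10")
    if severity >= 7:
        return "severe"
    if severity >= 4:
        return "moderate"
    return "mild"
```

Counting "moderate" and "severe" days per participant under such a rule is how the day-based outcome measures reported above would be produced.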

  20. Automated Processing of Plasma Samples for Lipoprotein Separation by Rate-Zonal Ultracentrifugation.

    Science.gov (United States)

    Peters, Carl N; Evans, Iain E J

    2016-12-01

Plasma lipoproteins are the primary means of lipid transport among tissues. Defining alterations in lipid metabolism is critical to our understanding of disease processes. However, lipoprotein measurement is limited to specialized centers. Preparation for ultracentrifugation involves the formation of complex density gradients, a process that is both laborious and subject to handling errors. We created a fully automated device capable of forming the required gradient. The design has been made freely available for download by the authors. It is inexpensive relative to commercial density gradient formers, which generally create linear gradients unsuitable for rate-zonal ultracentrifugation. The design can easily be modified to suit user requirements and any potential future improvements. Evaluation of the device showed reliable peristaltic pump accuracy and precision for fluid delivery. We also demonstrate accurate fluid layering with reduced mixing at the gradient layers when compared to usual practice by experienced laboratory personnel. Reduction in layer mixing is of critical importance, as it is crucial for reliable lipoprotein separation. The automated device significantly reduces laboratory staff input and the likelihood of error. Overall, this device provides a simple and effective solution to the formation of complex density gradients. © 2015 Society for Laboratory Automation and Screening.

  1. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.

  2. Software for an automated processing system for radioisotope information from multichannel radiodiagnostic instruments

    International Nuclear Information System (INIS)

    Zelenin, P.E.; Meier, V.P.

    1985-01-01

The SAORI-01 system for the automated processing of radioisotope information is designed for the collection, processing, and representation of information coming from gamma cameras and multichannel radiodiagnostic instruments (MRI), and is oriented primarily toward the radiodiagnostic laboratories of major multidisciplinary hospitals and scientific-research institutes. The functional characteristics of the basic software are discussed; it permits performance of the following functions: collection of information from the MRI; processing and representation of recorded information; storage of patient files on magnetic media; and writing of special processing programs in the FORTRAN and BASIC high-level languages

  3. Automation of the software production process for multiple cryogenic control applications

    OpenAIRE

    Fluder, Czeslaw; Lefebvre, Victor; Pezzetti, Marco; Plutecki, Przemyslaw; Tovar-González, Antonio; Wolak, Tomasz

    2018-01-01

    The development of process control systems for the cryogenic infrastructure at CERN is based on an automatic software generation approach. The overall complexity of the systems, their frequent evolution as well as the extensive use of databases, repositories, commercial engineering software and CERN frameworks have led to further efforts towards improving the existing automation based software production methodology. A large number of control system upgrades were successfully performed for th...

  4. Physiological Self-Regulation and Adaptive Automation

    Science.gov (United States)

    Prinzell, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, concerns have been voiced regarding issues associated with the use of adaptive automation, such as automation surprises. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode, which included a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
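The electroencephalogram-derived engagement index used to cycle the tracking task is, in the related NASA adaptive-automation literature (Pope et al.), commonly computed as beta band power divided by the sum of alpha and theta band power. A minimal sketch with invented band-power values; extraction of the band powers from raw EEG is assumed done elsewhere.

```python
def engagement_index(alpha_power, beta_power, theta_power):
    """EEG engagement index beta / (alpha + theta); higher values
    indicate greater task engagement. Band powers are pre-computed."""
    return beta_power / (alpha_power + theta_power)

# Rising beta relative to alpha/theta indicates higher engagement,
# which drives the switch between automatic, adaptive-aiding and
# manual modes described above. Example values are hypothetical.
low = engagement_index(alpha_power=8.0, beta_power=4.0, theta_power=6.0)
high = engagement_index(alpha_power=4.0, beta_power=9.0, theta_power=3.0)
```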

  5. A survey on automated wheeze detection systems for asthmatic patients

    Directory of Open Access Journals (Sweden)

    Syamimi Mardiah Shaharum

    2012-11-01

Full Text Available The purpose of this paper is to present a survey of automated wheeze detection systems, which can be very beneficial for asthmatic patients. Generally, a stethoscope is used to ascertain whether wheezes are present when assessing asthma in a patient. This is a major problem nowadays because interpretation is often delayed, which can lead to misinterpretation and, in the worst cases, to death. The development of automated systems would therefore ease the burden on medical personnel. A further discussion of automated wheeze detection systems is presented later in the paper. As for the methodology, a systematic search of articles published from 1985 to 2012 was conducted. Important details, including the hardware used, its placement, and the signal processing methods, are presented clearly, in the hope of helping and encouraging future researchers to develop commercial systems that will improve the diagnosis and monitoring of asthmatic patients.
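Most detectors surveyed in papers like this reduce to finding sustained narrowband energy in the lung-sound spectrum. The sketch below flags a frame whose spectrum shows a strong tonal peak; the frequency band, threshold and synthetic test signal are illustrative only, not taken from any particular surveyed system.

```python
import numpy as np

def is_wheeze_frame(frame, sample_rate, peak_ratio=20.0, band=(100, 2500)):
    """Flag a lung-sound frame as wheeze-like when its power spectrum
    has a strong narrowband peak inside `band` (Hz)."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame)))) ** 2
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sample_rate)
    in_band = spectrum[(freqs >= band[0]) & (freqs <= band[1])]
    return in_band.max() > peak_ratio * in_band.mean()

fs = 8000
t = np.arange(0, 0.128, 1 / fs)              # one 128 ms frame
rng = np.random.default_rng(1)
noise = rng.normal(scale=0.3, size=t.size)   # broadband breath-like noise
tonal = np.sin(2 * np.pi * 400 * t) + noise  # add a 400 Hz wheeze-like tone
wheeze_flag = is_wheeze_frame(tonal, fs)
noise_flag = is_wheeze_frame(noise, fs)
```

Real systems add duration criteria and more robust spectral features, but this peak-to-mean ratio captures the tonal-versus-broadband distinction on which they all rely.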

  6. 78 FR 44142 - Modification of Two National Customs Automation Program (NCAP) Tests Concerning Automated...

    Science.gov (United States)

    2013-07-23

    ... Customs Automation Program (NCAP) Tests Concerning Automated Commercial Environment (ACE) Document Image... (CBP's) plan to modify the National Customs Automation Program (NCAP) tests concerning document imaging... entry process by reducing the number of data elements required to obtain release for cargo transported...

  7. Automating multistep flow synthesis: approach and challenges in integrating chemistry, machines and logic

    Directory of Open Access Journals (Sweden)

    Chinmay A. Shukla

    2017-05-01

Full Text Available The implementation of automation in multistep flow synthesis is essential for transforming laboratory-scale chemistry into a reliable industrial process. In this review, we briefly introduce the role of automation based on its application in synthesis, viz. auto-sampling and inline monitoring, optimization, and process control. Subsequently, we critically review a few multistep flow syntheses and suggest a possible control strategy to be implemented, so that the laboratory-scale synthesis strategy can be reliably transferred to pilot scale at its optimum conditions. Owing to the vast literature on multistep synthesis, we have classified the literature and identified case studies based on a few criteria, viz. type of reaction, heating method, processes involving inline separation units, telescoped synthesis, processes involving inline quenching, and processes with the smallest time scale of operation. This classification covers the broader range of the multistep synthesis literature.

  8. Advanced monitoring with complex stream processing

    CERN Multimedia

    CERN. Geneva

    2015-01-01

Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP) that analyses series of events to derive conclusions from them.
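The kind of time-window correlation Esper performs can be illustrated with a small sketch. The Python below is a toy analogue (not Esper itself, which is a Java engine queried in EPL) of a query along the lines of `select avg(latency) from Events.win:time(60 sec)` combined with a threshold alert; the class and parameter names are invented for the example.

```python
from collections import deque

class TimeWindowAlert:
    """Toy sliding time-window average with a threshold alert -- an
    analogue of a CEP query such as
    'select avg(latency) from Events.win:time(60 sec)'."""
    def __init__(self, window_sec=60.0, threshold=100.0):
        self.window = window_sec
        self.threshold = threshold
        self.events = deque()   # (timestamp, value) pairs
        self.total = 0.0

    def on_event(self, ts, value):
        self.events.append((ts, value))
        self.total += value
        # Expire events that have fallen out of the time window.
        while self.events and self.events[0][0] <= ts - self.window:
            _, old = self.events.popleft()
            self.total -= old
        avg = self.total / len(self.events)
        return avg > self.threshold   # True means "raise an alert"

cep = TimeWindowAlert(window_sec=60, threshold=100)
print(cep.on_event(0, 50))    # avg 50 over window -> False
print(cep.on_event(30, 90))   # avg 70 -> False
print(cep.on_event(70, 200))  # event at t=0 expired; avg 145 -> True
```

A real CEP engine adds pattern matching and joins across many such streams, but expiry plus incremental aggregation is the core mechanism.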

  9. Automation of Test Cases for Web Applications : Automation of CRM Test Cases

    OpenAIRE

    Seyoum, Alazar

    2012-01-01

The main theme of this project was to design a test automation framework for automating web-related test cases. Automating test cases designed for testing a web interface provides a means of improving the software development process by shortening the testing phase of the software development life cycle. In this project the existing AutoTester framework and the iMacros test automation tool were used. A CRM Test Agent was developed to integrate AutoTester with iMacros and to enable the AutoTester,...

  10. Laser materials processing of complex components. From reverse engineering via automated beam path generation to short process development cycles.

    Science.gov (United States)

    Görgl, R.; Brandstätter, E.

    2016-03-01

The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art along the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser welding, laser cladding and additive laser manufacturing are given.

  11. Laser materials processing of complex components: from reverse engineering via automated beam path generation to short process development cycles

    Science.gov (United States)

    Görgl, Richard; Brandstätter, Elmar

    2017-01-01

The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art along the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser cladding and laser-based additive manufacturing are given.

  12. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

Development of an automated PCB inspection system meeting the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection process to an automated one, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision, followed by testing and analysis, is proposed in order to aid the manufacturer in the process of automation.
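A common computer-vision baseline for such systems (the abstract does not disclose the paper's exact algorithm, so this is a generic sketch) is reference comparison: align the image of the inspected board with a "golden" defect-free template, threshold the pixel difference, and fail boards with enough deviating pixels. A minimal numpy version:

```python
import numpy as np

def inspect_pcb(golden, test_img, diff_thresh=40, min_defect_px=5):
    """Template comparison: pixels deviating from the golden board by
    more than diff_thresh are defect candidates; the board fails when
    at least min_defect_px candidates appear."""
    diff = np.abs(golden.astype(int) - test_img.astype(int))
    defect_mask = diff > diff_thresh
    return defect_mask.sum() >= min_defect_px, defect_mask

golden = np.full((20, 20), 200, dtype=np.uint8)   # synthetic golden board
good = golden.copy()
bad = golden.copy()
bad[5:8, 5:8] = 30          # simulate a missing pad (dark patch)

print(inspect_pcb(golden, good)[0])
print(inspect_pcb(golden, bad)[0])
```

Production systems add registration, illumination normalization and per-region thresholds before this comparison step.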

  13. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  14. Automation for mineral resource development

    Energy Technology Data Exchange (ETDEWEB)

    Norrie, A.W.; Turner, D.R. (eds.)

    1986-01-01

    A total of 55 papers were presented at the symposium under the following headings: automation and the future of mining; modelling and control of mining processes; transportation for mining; automation and the future of metallurgical processes; modelling and control of metallurgical processes; and general aspects. Fifteen papers have been abstracted separately.

  15. Robotic Automation Process - The next major revolution in terms of back office operations improvement

    Directory of Open Access Journals (Sweden)

    Anagnoste Sorin

    2017-07-01

Forced to provide consistent results to shareholders, organizations have turned to Robotic Process Automation (RPA) to tackle the typical challenges they face: (1) cost reduction, (2) quality increase and (3) faster processes. RPA is now considered the next big thing for Shared Services Centres (SSCs) and Business Process Outsourcing (BPO) providers around the world, and especially in Central and Eastern Europe. In SSCs and BPOs the activities with the highest potential for automation are in the finance, supply chain and human resource departments. This means that the problems these businesses face are mostly related to high data-entry volumes, high error rates, significant rework, numerous manual processes, multiple non-integrated legacy systems and high turnover due to repetitive, low-value-added activities. One advantage of RPA is that robots can be trained by users to undertake structured, repeatable, computer-based tasks, interacting with multiple systems at the same time while performing complex decisions based on algorithms. In doing so, a robot can identify exceptions for manual processing, remove idle time and keep logs of the actions performed. Another advantage is that automated solutions can work 24/7, can be implemented quickly, work with the existing architecture, cut data-entry costs by up to 70% and perform at 30% of the cost of a full-time employee, thus providing a quick and tangible return to organizations. For Romania, a key destination for SSCs and BPOs, this technology will make them more competitive, but it will also lead to the creation of a series of high-paid jobs while eliminating low-input ones. The paper also analyzes the most important vendors of RPA solutions on the market and provides specific case studies from different industries, thus helping future leaders and organizations take better decisions.

  16. The design of an automated electrolytic enrichment apparatus for tritium

    Energy Technology Data Exchange (ETDEWEB)

    Myers, J.L.

    1994-12-01

The Radiation Analytical Sciences (RAS) Section at Lawrence Livermore National Laboratory performs analysis of low-level tritium concentrations in natural water samples from the Tri-Valley area, the DOE Nevada Test Site, Site 300 in Tracy, CA, and various other places around the world. Tritium, a radioactive isotope of hydrogen, is pre-concentrated in the RAS laboratory using an electrolytic enrichment apparatus. The enriched waters are later analyzed by liquid scintillation counting to determine the tritium activity. The enrichment procedure and the subsequent purification by vacuum distillation are currently carried out manually, and are therefore highly labor-intensive. The whole process typically takes 2 to 3 weeks to complete a batch of 30 samples, with dedicated personnel operating the process. The goal is to automate the entire process, specifically placing the operation under PC-LabVIEW™ control with real-time monitoring capability. My involvement was in the design and fabrication of a prototypical automated electrolytic enrichment cell. Work will be done on optimizing the electrolytic process by assessing the different parameters of the enrichment procedure. Hardware and software development have also been an integral component of this project.

  17. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    Science.gov (United States)

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: Automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
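The discrete wavelet transform at the heart of such software splits the phantom image into approximation and detail sub-bands, in which small high-contrast targets (fiber, speck and mass analogues) stand out against the background. The sketch below is a generic one-level 2-D Haar transform, assumed here for illustration rather than the authors' actual multiresolution code:

```python
import numpy as np

def haar2d(img):
    """One-level 2-D Haar transform: approximation (ll) plus
    horizontal (lh), vertical (hl) and diagonal (hh) detail sub-bands."""
    a = img.astype(float)
    # Rows: pairwise average / difference.
    lo = (a[:, 0::2] + a[:, 1::2]) / 2
    hi = (a[:, 0::2] - a[:, 1::2]) / 2
    # Columns: repeat on both row outputs.
    ll = (lo[0::2] + lo[1::2]) / 2
    lh = (lo[0::2] - lo[1::2]) / 2
    hl = (hi[0::2] + hi[1::2]) / 2
    hh = (hi[0::2] - hi[1::2]) / 2
    return ll, lh, hl, hh

# A flat background with one bright speck (a microcalcification stand-in).
img = np.full((8, 8), 100.0)
img[3, 3] += 80
ll, lh, hl, hh = haar2d(img)
detail_energy = lh**2 + hl**2 + hh**2   # high where fine detail lives
r, c = np.unravel_index(detail_energy.argmax(), detail_energy.shape)
print(int(r), int(c))   # sub-band location of the speck
```

Thresholding detail energy at each decomposition level against phantom-specific reference values is one way a QC tool can score fiber, speck and mass visibility automatically.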

  18. Autonomous monitoring of control hardware to predict off-normal conditions using NIF automatic alignment systems

    International Nuclear Information System (INIS)

    Awwal, Abdul A.S.; Wilhelmsen, Karl; Leach, Richard R.; Miller-Kamm, Vicki; Burkhart, Scott; Lowe-Webb, Roger; Cohen, Simon

    2012-01-01

    Highlights: ► An automatic alignment system was developed to process images of the laser beams. ► System uses processing to adjust a series of control loops until alignment criteria are satisfied. ► Monitored conditions are compared against nominal values with an off-normal alert. ► Automated health monitoring system trends off-normals with a large image history. - Abstract: The National Ignition Facility (NIF) is a high power laser system capable of supporting high-energy-density experimentation as a user facility for the next 30 years. In order to maximize the facility availability, preventive maintenance enhancements are being introduced into the system. An example of such an enhancement is a camera-based health monitoring system, integrated into the automated alignment system, which provides an opportunity to monitor trends in measurements such as average beam intensity, size of the beam, and pixel saturation. The monitoring system will generate alerts based on observed trends in measurements to allow scheduled pro-active maintenance before routine off-normal detection stops system operations requiring unscheduled intervention.
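The described alert logic (trending a measurement such as average beam intensity and flagging drifts before they halt operations) can be sketched as a rolling-mean comparison against a nominal band. The class, parameters and numbers below are hypothetical, not the NIF implementation:

```python
from collections import deque

class TrendMonitor:
    """Alert when the rolling mean of a monitored measurement drifts
    outside a nominal band for several consecutive samples."""
    def __init__(self, nominal, tolerance, window=5, patience=3):
        self.nominal, self.tol = nominal, tolerance
        self.buf = deque(maxlen=window)   # recent measurements
        self.patience = patience
        self.bad = 0                      # consecutive out-of-band windows

    def update(self, value):
        self.buf.append(value)
        mean = sum(self.buf) / len(self.buf)
        if abs(mean - self.nominal) > self.tol:
            self.bad += 1
        else:
            self.bad = 0
        return self.bad >= self.patience  # True -> schedule maintenance

mon = TrendMonitor(nominal=1000, tolerance=50, window=5, patience=3)
readings = [1010, 995, 1005, 990, 900, 880, 870, 860, 850, 840]  # slow dimming
alerts = [mon.update(r) for r in readings]
print(alerts)   # alerts appear only after a sustained drift
```

The windowing and patience parameters trade alert latency against false alarms from single noisy frames, which is the point of trending rather than thresholding raw values.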

  19. Autonomous monitoring of control hardware to predict off-normal conditions using NIF automatic alignment systems

    Energy Technology Data Exchange (ETDEWEB)

    Awwal, Abdul A.S., E-mail: awwal1@llnl.gov [Lawrence Livermore National Laboratory, Livermore, CA 94550 (United States); Wilhelmsen, Karl; Leach, Richard R.; Miller-Kamm, Vicki; Burkhart, Scott; Lowe-Webb, Roger; Cohen, Simon [Lawrence Livermore National Laboratory, Livermore, CA 94550 (United States)

    2012-12-15

Highlights: ► An automatic alignment system was developed to process images of the laser beams. ► System uses processing to adjust a series of control loops until alignment criteria are satisfied. ► Monitored conditions are compared against nominal values with an off-normal alert. ► Automated health monitoring system trends off-normals with a large image history. - Abstract: The National Ignition Facility (NIF) is a high power laser system capable of supporting high-energy-density experimentation as a user facility for the next 30 years. In order to maximize the facility availability, preventive maintenance enhancements are being introduced into the system. An example of such an enhancement is a camera-based health monitoring system, integrated into the automated alignment system, which provides an opportunity to monitor trends in measurements such as average beam intensity, size of the beam, and pixel saturation. The monitoring system will generate alerts based on observed trends in measurements to allow scheduled pro-active maintenance before routine off-normal detection stops system operations requiring unscheduled intervention.

  20. Methodology for monitoring and automated diagnosis of ball bearing using para consistent logic, wavelet transform and digital signal processing; Metodologia de monitoracao e diagnostico automatizado de rolamentos utilizando logica paraconsistente, transformada de Wavelet e processamento de sinais digitais

    Energy Technology Data Exchange (ETDEWEB)

    Masotti, Paulo Henrique Ferraz

    2006-07-01

The monitoring and diagnosis area has developed impressively in recent years with the introduction of new diagnostic techniques and the use of computers to process information and run those techniques. The contribution of artificial intelligence to the automation of defect diagnosis is developing continually, and the growing automation in industry is adopting these new techniques. In the nuclear area, growing concern with facility safety demands more effective techniques, which have been sought in order to increase the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow the verification of their operational conditions. In this way, the present work can also collaborate in this area, helping to diagnose the operational condition of machines. This work presents a new feature extraction technique based on the zero crossings of the wavelet transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work is the Paraconsistent Logic of Annotation with Two Values (LPA2v), contributing to the automation of defect diagnosis, because this logic can deal with the contradictory results that feature extraction techniques can present. This work also concentrated on identifying defects in their initial phase using accelerometers, because they are robust, low-cost sensors that are easily found in industry in general. The results of this work were obtained using an experimental database, and it was observed that the defect diagnoses showed good results for defects in their initial phase. (author)
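The zero-crossing idea can be illustrated in miniature: take the detail (high-frequency) coefficients of a wavelet transform of the vibration signal and count sign changes, which rise when bearing-fault impulses inject high-frequency content. The sketch below substitutes a one-level Haar detail for the thesis's specific wavelet and uses synthetic "healthy" and "impulsive" signals, so it shows the shape of the feature, not the published method:

```python
import numpy as np

def haar_detail(x):
    """One-level Haar detail coefficients (scaled pairwise differences)."""
    x = np.asarray(x, dtype=float)
    n = len(x) // 2 * 2
    return (x[0:n:2] - x[1:n:2]) / 2

def zero_crossings(c, eps=1e-3):
    """Count sign changes among coefficients above a small noise floor."""
    s = np.sign(c[np.abs(c) > eps])
    return int(np.sum(s[:-1] != s[1:]))

fs = 1000
t = np.arange(fs) / fs
smooth = np.sin(2 * np.pi * 5 * t)        # healthy-like, low-frequency
impulses = smooth.copy()
impulses[::50] += 1.0                      # periodic fault-like impacts
print(zero_crossings(haar_detail(smooth)),
      zero_crossings(haar_detail(impulses)))
```

The impulsive signal yields the larger count, so the zero-crossing tally serves as a compact feature for a downstream classifier such as the LPA2v stage.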

  1. Novel automated blood separations validate whole cell biomarkers.

    Directory of Open Access Journals (Sweden)

    Douglas E Burger

Progress in clinical trials in infectious disease, autoimmunity, and cancer is stymied by a dearth of successful whole cell biomarkers for peripheral blood lymphocytes (PBLs). Successful biomarkers could help to track drug effects at early time points in clinical trials to prevent costly trial failures late in development. One major obstacle is the inaccuracy of Ficoll density centrifugation, the decades-old method of separating PBLs from the abundant red blood cells (RBCs) of fresh blood samples. To replace the Ficoll method, we developed and studied a novel blood-based magnetic separation method. The magnetic method strikingly surpassed Ficoll in viability, purity and yield of PBLs. To reduce labor, we developed an automated platform and compared two magnet configurations for cell separations. These more accurate and labor-saving magnet configurations allowed the lymphocytes to be tested in bioassays for rare antigen-specific T cells. The automated method succeeded at identifying 79% of patients with the rare PBLs of interest, as compared with Ficoll's uniform failure. We validated improved upfront blood processing and show accurate detection of rare antigen-specific lymphocytes. Improving, automating and standardizing lymphocyte detection from whole blood may facilitate development of new cell-based biomarkers for human diseases. Improved upfront blood processes may lead to broad improvements in monitoring early trial outcome measurements in human clinical trials.

  2. Novel automated blood separations validate whole cell biomarkers.

    Science.gov (United States)

    Burger, Douglas E; Wang, Limei; Ban, Liqin; Okubo, Yoshiaki; Kühtreiber, Willem M; Leichliter, Ashley K; Faustman, Denise L

    2011-01-01

    Progress in clinical trials in infectious disease, autoimmunity, and cancer is stymied by a dearth of successful whole cell biomarkers for peripheral blood lymphocytes (PBLs). Successful biomarkers could help to track drug effects at early time points in clinical trials to prevent costly trial failures late in development. One major obstacle is the inaccuracy of Ficoll density centrifugation, the decades-old method of separating PBLs from the abundant red blood cells (RBCs) of fresh blood samples. To replace the Ficoll method, we developed and studied a novel blood-based magnetic separation method. The magnetic method strikingly surpassed Ficoll in viability, purity and yield of PBLs. To reduce labor, we developed an automated platform and compared two magnet configurations for cell separations. These more accurate and labor-saving magnet configurations allowed the lymphocytes to be tested in bioassays for rare antigen-specific T cells. The automated method succeeded at identifying 79% of patients with the rare PBLs of interest as compared with Ficoll's uniform failure. We validated improved upfront blood processing and show accurate detection of rare antigen-specific lymphocytes. Improving, automating and standardizing lymphocyte detections from whole blood may facilitate development of new cell-based biomarkers for human diseases. Improved upfront blood processes may lead to broad improvements in monitoring early trial outcome measurements in human clinical trials.

  3. Automated detection of macular drusen using geometric background leveling and threshold selection.

    Science.gov (United States)

    Smith, R Theodore; Chan, Jackie K; Nagasaki, Takayuki; Ahmad, Umer F; Barbazetto, Irene; Sparrow, Janet; Figueroa, Marta; Merriam, Joanna

    2005-02-01

Age-related macular degeneration (ARMD) is the most prevalent cause of visual loss in patients older than 60 years in the United States. Observation of drusen is the hallmark finding in the clinical evaluation of ARMD. To segment and quantify drusen found in patients with ARMD using image analysis and to compare the efficacy of image analysis segmentation with that of stereoscopic manual grading of drusen. Retrospective study. University referral center. Patients: Photographs were randomly selected from an available database of patients with known ARMD in the ongoing Columbia University Macular Genetics Study. All patients were white and older than 60 years. Twenty images from 17 patients were selected as representative of common manifestations of drusen. Image preprocessing included automated color balancing and, where necessary, manual segmentation of confounding lesions such as geographic atrophy (3 images). The operator then chose among 3 automated processing options suggested by predominant drusen type. Automated processing consisted of elimination of background variability by a mathematical model and subsequent histogram-based threshold selection. A retinal specialist using a graphic tablet while viewing stereo pairs constructed digital drusen drawings for each image. The sensitivity and specificity of drusen segmentation using the automated method with respect to manual stereoscopic drusen drawings were calculated on a rigorous pixel-by-pixel basis. The median sensitivity and specificity of automated segmentation were 70% and 81%, respectively. After preprocessing and option choice, reproducibility of automated drusen segmentation was necessarily 100%. Automated drusen segmentation can be reliably performed on digital fundus photographs and result in successful quantification of drusen in a more precise manner than is traditionally possible with manual stereoscopic grading of drusen. With only minor preprocessing requirements, this automated detection
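The two automated stages (background leveling, then histogram-based threshold selection) can be sketched generically. The code below substitutes a least-squares polynomial surface for the paper's geometric background model and a robust median/MAD rule for its threshold selection, so it illustrates the shape of the pipeline rather than the published method:

```python
import numpy as np

def level_background(img, order=2):
    """Fit a low-order polynomial surface to the image and subtract it,
    flattening slow illumination gradients."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # Design matrix with terms 1, x, y, x^2, xy, y^2 (for order 2).
    cols = [np.ones(h * w)]
    for i in range(1, order + 1):
        for j in range(i + 1):
            cols.append((xx**(i - j) * yy**j).ravel())
    A = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(A, img.ravel(), rcond=None)
    return img - (A @ coef).reshape(h, w)

def threshold_mask(residual, k=3.0):
    """Histogram-based rule: keep pixels k robust-sigmas above the
    leveled background (MAD scaled to approximate a std deviation)."""
    med = np.median(residual)
    mad = np.median(np.abs(residual - med)) + 1e-9
    return residual > med + k * 1.4826 * mad

# Synthetic fundus: an illumination gradient plus two bright 'drusen'.
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
img = 100 + 0.5 * xx + 0.3 * yy
img[10:14, 10:14] += 25
img[40:44, 30:34] += 25
mask = threshold_mask(level_background(img.astype(float)))
print(int(mask.sum()))
```

Pixel-by-pixel sensitivity and specificity against a manual drawing then reduce to comparing `mask` with a reference boolean mask.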

  4. FLAME MONITORING IN POWER STATION BOILERS USING IMAGE PROCESSING

    Directory of Open Access Journals (Sweden)

    K. Sujatha

    2012-05-01

Combustion quality in power station boilers plays an important role in minimizing flue gas emissions. In the present work, various intelligent schemes to infer the flue gas emissions by monitoring the flame colour in the furnace of the boiler are proposed. Flame image monitoring involves capturing the flame video over a period of time and measuring various parameters: carbon dioxide (CO2), excess oxygen (O2), nitrogen dioxide (NOx), sulphur dioxide (SOx) and carbon monoxide (CO) emissions, plus the flame temperature at the core of the fireball, the air/fuel ratio and the combustion quality. The higher the quality of combustion, the lower the flue gas emissions at the exhaust. The flame video was captured using an infrared camera and then split into frames for further analysis; a video splitter is used for progressive extraction of the flame images from the video. The flame images are then pre-processed to reduce noise. The conventional classification and clustering techniques include the Euclidean distance (L2 norm) classifier. The intelligent classifiers include the Radial Basis Function network (RBF), the Back Propagation Algorithm (BPA) and a parallel architecture combining RBF and BPA (PRBFBPA). The results of the validation are supported with the above-mentioned performance measures, whose values are in the optimal range. The temperatures, combustion quality, SOx, NOx, CO and CO2 concentrations, and air and fuel supplied corresponding to the images were obtained, thereby indicating the necessary control action to increase or decrease the air supply so as to ensure complete combustion. In this work, by continuously monitoring the flame images, combustion quality was inferred (complete/partial/incomplete combustion) and the air/fuel ratio can be automatically varied. Moreover, in the existing set-up, measurements such as NOx, CO and CO2 are inferred from samples that are collected periodically or by
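The conventional L2-norm classifier mentioned above assigns a flame image to the combustion-quality class whose feature centroid is nearest in Euclidean distance. A minimal sketch using mean-RGB features; the class centroids are invented for illustration, since the real system's features and training values are not given in the abstract:

```python
import numpy as np

def nearest_centroid(feature, centroids):
    """L2-norm (minimum Euclidean distance) classifier: assign the
    feature vector to the closest class centroid."""
    dists = {label: np.linalg.norm(feature - c) for label, c in centroids.items()}
    return min(dists, key=dists.get)

# Hypothetical mean-RGB centroids for combustion-quality classes.
centroids = {
    "complete":   np.array([200.0, 140.0, 60.0]),   # bright orange flame
    "partial":    np.array([180.0, 100.0, 40.0]),
    "incomplete": np.array([120.0,  60.0, 30.0]),   # dull reddish flame
}

flame_pixels = np.array([[205, 138, 58], [198, 144, 61], [202, 141, 60]], float)
feature = flame_pixels.mean(axis=0)   # mean colour of the flame region
print(nearest_centroid(feature, centroids))
```

The RBF and BPA networks in the paper replace this fixed-centroid rule with learned decision boundaries over the same kind of feature vector.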

  5. Seismic array processing and computational infrastructure for improved monitoring of Alaskan and Aleutian seismicity and volcanoes

    Science.gov (United States)

    Lindquist, Kent Gordon

We constructed a near-real-time system, called Iceworm, to automate seismic data collection, processing, storage, and distribution at the Alaska Earthquake Information Center (AEIC). Phase-picking, phase association, and interprocess communication components come from Earthworm (U.S. Geological Survey). A new generic, internal format for digital data supports unified handling of data from diverse sources. A new infrastructure for applying processing algorithms to near-real-time data streams supports automated information extraction from seismic wavefields. Integration of Datascope (U. of Colorado) provides relational database management of all automated measurements, parametric information for located hypocenters, and waveform data from Iceworm. Data from 1997 yield 329 earthquakes located by both Iceworm and the AEIC. Of these, 203 have location residuals under 22 km, sufficient for hazard response. Regionalized inversions for local magnitude in Alaska yield M_L calibration curves (log A_0) that differ from the Californian Richter magnitude. The new curve is 0.2 M_L units more attenuative than the Californian curve at 400 km for earthquakes north of the Denali fault. South of the fault, and for a region north of Cook Inlet, the difference is 0.4 M_L. A curve for deep events differs by 0.6 M_L at 650 km. We expand geographic coverage of Alaskan regional seismic monitoring to the Aleutians, the Bering Sea, and the entire Arctic by initiating the processing of four short-period Alaskan seismic arrays. To show the array stations' sensitivity, we detect and locate two microearthquakes that were missed by the AEIC. An empirical study of the location sensitivity of the arrays predicts improvements over the Alaskan regional network that are shown as map-view contour plots. We verify these predictions by detecting an M_L 3.2 event near Unimak Island with one array. The detection and location of four representative earthquakes illustrates the expansion
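The regionalized calibration follows Richter's standard definition of local magnitude (stated here for context, not taken from the abstract): for an event with maximum trace amplitude A recorded at distance r,

```latex
M_L = \log_{10} A - \log_{10} A_0(r)
```

where the distance-correction term -log10 A_0(r) is the calibration curve being inverted for. Saying the Alaskan curve is "0.2 M_L units more attenuative at 400 km" therefore means its -log10 A_0 value exceeds the Californian one by 0.2 at that distance, so the same recorded amplitude implies a magnitude 0.2 units larger.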

  6. A GPS-based Real-time Road Traffic Monitoring System

    Science.gov (United States)

    Tanti, Kamal Kumar

In recent years, monitoring systems have tended strongly towards ever more automatic, reliably interconnected, distributed and autonomous operation. Specifically, the measurement, logging, data processing and interpretation activities may be carried out by separate units at different locations in near real time. The recent evolution of mobile communication devices and communication technologies has fostered a growing interest in GIS- and GPS-based location-aware systems and services. This paper describes a real-time road traffic monitoring system based on integrated mobile field devices (GPS/GSM/IOs) working in tandem with advanced GIS-based application software providing on-the-fly authentication for real-time monitoring and security enhancement. The described system is a fully automated, continuous, real-time monitoring system that employs GPS sensors; Ethernet and/or serial port communication is used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely, based on the client's requirements. Due to the modular architecture of the system, other sensor types may be supported with minimal effort. Data on the distributed network and measurements are transmitted via cellular SIM cards to a Control Unit, which provides for post-processing and network management. The Control Unit may be remotely accessed via an Internet connection. The new system will not only provide more consistent data about road traffic conditions but will also provide methods for integration with other Intelligent Transportation Systems (ITS). For communication between the mobile devices and the central monitoring service, GSM technology is used. The resulting system is characterized by autonomy, reliability and a high degree of automation.
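At the core of any such GPS traffic monitor is turning consecutive fixes into segment speeds. A standard haversine computation suffices; the probe-vehicle track coordinates and sampling interval below are invented for the example:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000.0   # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def speeds_kmh(fixes):
    """Per-segment speed in km/h from (timestamp_s, lat, lon) fixes."""
    out = []
    for (t1, la1, lo1), (t2, la2, lo2) in zip(fixes, fixes[1:]):
        d = haversine_m(la1, lo1, la2, lo2)
        out.append(d / (t2 - t1) * 3.6)
    return out

# Hypothetical track: three fixes 10 s apart, moving due north.
track = [(0, 28.6139, 77.2090), (10, 28.6149, 77.2090), (20, 28.6159, 77.2090)]
print([round(s, 1) for s in speeds_kmh(track)])   # [40.0, 40.0]
```

Aggregating such per-segment speeds over many probe vehicles on a road link is what yields the traffic-condition estimate the Control Unit distributes.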

  7. Automated processing of whole blood units: operational value and in vitro quality of final blood components.

    Science.gov (United States)

    Jurado, Marisa; Algora, Manuel; Garcia-Sanchez, Félix; Vico, Santiago; Rodriguez, Eva; Perez, Sonia; Barbolla, Luz

    2012-01-01

The Community Transfusion Centre in Madrid currently processes whole blood using a conventional procedure (Compomat, Fresenius) followed by automated processing of buffy coats with the OrbiSac system (CaridianBCT). The Atreus 3C system (CaridianBCT) automates the production of red blood cells, plasma and an interim platelet unit from a whole blood unit. Interim platelet units are pooled to produce a transfusable platelet unit. In this study the Atreus 3C system was evaluated and compared to the routine method with regard to product quality and operational value. Over a 5-week period 810 whole blood units were processed using the Atreus 3C system. The attributes of the automated process were compared to those of the routine method by assessing productivity, space, equipment and staffing requirements. The data obtained were evaluated in order to estimate the impact of implementing the Atreus 3C system in the routine setting of the blood centre. The yield and in vitro quality of the final blood components processed with the two systems were evaluated and compared. The Atreus 3C system enabled higher throughput while requiring less space and employee time, by decreasing the amount of equipment and the processing time per unit of whole blood. Whole blood units processed on the Atreus 3C system gave a higher platelet yield, a similar amount of red blood cells and a smaller volume of plasma. These results support the conclusion that the Atreus 3C system produces blood components meeting quality requirements while providing high operational efficiency. Implementation of the Atreus 3C system could result in a large organisational improvement.

  8. Automation of the process of generation of the students insurance, applying RFID and GPRS technologies

    Directory of Open Access Journals (Sweden)

    Nelson Barrera-Lombana

    2013-07-01

This article describes the design and implementation of a system that performs queries of various parameters against a web server using a GSM modem, information exchange systems over the Internet (ISS) and radio-frequency identification (RFID). The application is validated for use in automating the generation of student insurance, and hardware and software developed by the Research Group in Robotics and Industrial Automation (GIRA) of UPTC are used as the platform.

  9. Uranium concentration monitor manual: 2300 system

    International Nuclear Information System (INIS)

    Russo, P.A.; Sprinkle, J.K. Jr.; Stephens, M.M.

    1985-04-01

    This manual describes the design, operation, and procedures for measurement control for the automated uranium concentration monitor on the 2300 solvent extraction system at the Oak Ridge Y-12 Plant. The nonintrusive monitor provides a near-real time readout of uranium concentration at two locations simultaneously in the solvent extraction system for process monitoring and control. Detectors installed at the top of the extraction column and at the bottom of the backwash column acquire spectra of gamma rays from the solvent extraction solutions in the columns. Pulse-height analysis of these spectra gives the concentration of uranium in the organic product of the extraction column and in the aqueous product of the solvent extraction system. The visual readouts of concentrations for process monitoring are updated every 2 min for both detection systems. Simultaneously, the concentration results are shipped to a remote computer that has been installed by Y-12 to demonstrate automatic control of the solvent extraction system based on input of near-real time process operation information. 8 refs., 13 figs., 4 tabs
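Pulse-height analysis of this kind typically integrates counts in a region of interest around a uranium gamma line and subtracts a background interpolated from flanking channels; the net area then maps to concentration through a calibration factor. The channel numbers and counts below are synthetic, and this is a generic sketch rather than the 2300 system's actual algorithm:

```python
import numpy as np

def net_peak_area(spectrum, peak_lo, peak_hi, bg_width=5):
    """Net counts in channels [peak_lo, peak_hi): gross counts minus a
    background level estimated from flanking channel windows."""
    spectrum = np.asarray(spectrum, dtype=float)
    left = spectrum[peak_lo - bg_width:peak_lo].mean()
    right = spectrum[peak_hi:peak_hi + bg_width].mean()
    n_chan = peak_hi - peak_lo
    gross = spectrum[peak_lo:peak_hi].sum()
    background = (left + right) / 2 * n_chan   # linear interpolation
    return gross - background

# Synthetic spectrum: flat background of 10 counts/channel plus a peak.
spec = np.full(100, 10.0)
spec[48:52] += [20, 60, 55, 15]          # photopeak around channel 50
area = net_peak_area(spec, 45, 55)
print(area)   # 150.0 net counts
```

Dividing the net area by detector efficiency, branching ratio and live time, then applying a solution-geometry calibration, would give the concentration figure updated every couple of minutes on the readout.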

  10. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  11. The Celution® System: Automated Processing of Adipose-Derived Regenerative Cells in a Functionally Closed System.

    Science.gov (United States)

    Fraser, John K; Hicok, Kevin C; Shanahan, Rob; Zhu, Min; Miller, Scott; Arm, Douglas M

    2014-01-01

    Objective: To develop a closed, automated system that standardizes the processing of human adipose tissue to obtain and concentrate regenerative cells suitable for clinical treatment of thermal and radioactive burn wounds. Approach: A medical device was designed to automate processing of adipose tissue to obtain a clinical-grade cell output of stromal vascular cells that may be used immediately as a therapy for a number of conditions, including nonhealing wounds resulting from radiation damage. Results: The Celution® System reliably and reproducibly generated adipose-derived regenerative cells (ADRCs) from tissue collected manually and from three commercial power-assisted liposuction devices. The entire process of introducing tissue into the system, tissue washing and proteolytic digestion, isolation and concentration of the nonadipocyte nucleated cell fraction, and return to the patient as a wound therapeutic, can be achieved in approximately 1.5 h. An alternative approach that applies ultrasound energy in place of enzymatic digestion demonstrated extremely poor cell-extraction efficiency. Innovation: The Celution System is the first medical device validated and approved by multiple international regulatory authorities to generate autologous stromal vascular cells from adipose tissue that can be used in a real-time bedside manner. Conclusion: Initial preclinical and clinical studies using ADRCs obtained with the automated tissue-processing Celution device described herein validate a safe and effective way to obtain a promising novel cell-based treatment for wound healing.

  12. Automated analysis of heterogeneous carbon nanostructures by high-resolution electron microscopy and on-line image processing

    International Nuclear Information System (INIS)

    Toth, P.; Farrer, J.K.; Palotas, A.B.; Lighty, J.S.; Eddings, E.G.

    2013-01-01

    High-resolution electron microscopy is an efficient tool for characterizing heterogeneous nanostructures; however, currently the analysis is a laborious and time-consuming manual process. In order to be able to accurately and robustly quantify heterostructures, one must obtain a statistically high number of micrographs showing images of the appropriate sub-structures. The second step of analysis is usually the application of digital image processing techniques in order to extract meaningful structural descriptors from the acquired images. In this paper it will be shown that by applying on-line image processing and basic machine vision algorithms, it is possible to fully automate the image acquisition step; therefore, the number of acquired images in a given time can be increased drastically without the need for additional human labor. The proposed automation technique works by computing fields of structural descriptors in situ and thus outputs sets of the desired structural descriptors in real-time. The merits of the method are demonstrated by using combustion-generated black carbon samples. - Highlights: ► The HRTEM analysis of heterogeneous nanostructures is a tedious manual process. ► Automatic HRTEM image acquisition and analysis can improve data quantity and quality. ► We propose a method based on on-line image analysis for the automation of HRTEM image acquisition. ► The proposed method is demonstrated using HRTEM images of soot particles
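The record above describes computing fields of structural descriptors over acquired micrographs in real time. As a hedged sketch of that pattern, the toy function below tiles an image and computes a simple per-tile contrast descriptor (intensity variance); the real work uses fringe-based metrics on HRTEM images, so the descriptor here is only a stand-in.

```python
# Sketch of a descriptor field computed over image tiles, in the spirit of
# on-line HRTEM analysis. The descriptor (per-tile intensity variance) is
# an illustrative stand-in for real nanostructure metrics.

def descriptor_field(image, tile):
    """Tile a 2-D image (list of rows) and compute a variance descriptor per tile."""
    h, w = len(image), len(image[0])
    field = []
    for r in range(0, h - tile + 1, tile):
        row = []
        for c in range(0, w - tile + 1, tile):
            px = [image[r + i][c + j] for i in range(tile) for j in range(tile)]
            mean = sum(px) / len(px)
            row.append(sum((p - mean) ** 2 for p in px) / len(px))
        field.append(row)
    return field
```

In an automated acquisition loop, such a field could steer the microscope toward regions whose descriptor values indicate the sub-structures of interest.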

  13. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues, and careful radioprotection of the operators have driven extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry have driven the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development will also be considered

  14. Process automation using combinations of process and machine control technologies with application to a continuous dissolver

    International Nuclear Information System (INIS)

    Spencer, B.B.; Yarbro, O.O.

    1991-01-01

    Operation of a continuous rotary dissolver, designed to leach uranium-plutonium fuel from chopped sections of reactor fuel cladding using nitric acid, has been automated. The dissolver is a partly continuous, partly batch process that interfaces at both ends with batchwise processes, thereby requiring synchronization of certain operations. Liquid acid is fed and flows through the dissolver continuously, whereas chopped fuel elements are fed to the dissolver in small batches and move through the compartments of the dissolver stagewise. Sequential logic (or machine control) techniques are used to control discrete activities such as the sequencing of isolation valves. Feedback control is used to control acid flowrates and temperatures. Expert systems technology is used for on-line material balances and diagnostics of process operation. 1 ref., 3 figs
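The record above distinguishes sequential (machine) control of discrete actions, such as isolation-valve sequencing, from feedback control of continuous variables, such as acid flowrate. The sketch below illustrates both ideas side by side; the step names and gain are invented for illustration and do not come from the ORNL dissolver.

```python
# Illustrative sketch: sequential logic for batch valve sequencing plus a
# proportional feedback correction for acid flow. Step names and the gain
# are assumptions, not the actual dissolver control logic.

VALVE_SEQUENCE = ["close_outlet", "open_inlet", "charge_fuel",
                  "close_inlet", "open_outlet"]

def next_step(current_step):
    """Advance the batch charging sequence one interlocked step at a time."""
    i = VALVE_SEQUENCE.index(current_step)
    return VALVE_SEQUENCE[(i + 1) % len(VALVE_SEQUENCE)]

def acid_flow_correction(setpoint_l_min, measured_l_min, gain=0.8):
    """Proportional feedback: valve adjustment proportional to the flow error."""
    return gain * (setpoint_l_min - measured_l_min)
```

The key design point the record makes is that the two control styles coexist: discrete steps advance only when interlock conditions are met, while the continuous loop runs independently at its own sampling rate.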

  15. Processing of the WLCG monitoring data using NoSQL

    Science.gov (United States)

    Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  16. Processing of the WLCG monitoring data using NoSQL

    International Nuclear Information System (INIS)

    Andreeva, J; Beche, A; Karavakis, E; Saiz, P; Tuckett, D; Belov, S; Kadochnikov, I; Schovancova, J; Dzhunov, I

    2014-01-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.
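The two records above describe processing large volumes of job-monitoring events with NoSQL back ends. As a minimal sketch of the kind of aggregation such a store performs, the function below groups job events by site and counts terminal states; the event field names are illustrative, not the Experiment Dashboard schema.

```python
# Hedged sketch of a monitoring aggregation: count terminal job states per
# site. The "site"/"status" event fields are assumed, not the real schema.
from collections import defaultdict

def summarize_jobs(events):
    """Group job events by site and count terminal states, map-reduce style."""
    summary = defaultdict(lambda: {"done": 0, "failed": 0})
    for ev in events:
        if ev["status"] in summary[ev["site"]]:
            summary[ev["site"]][ev["status"]] += 1
    return dict(summary)
```

At WLCG scale the same grouping would be expressed as a distributed map-reduce or an aggregation pipeline over the NoSQL store rather than a single-process loop.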

  17. METHODOLOGICAL PROBLEMS AND WAYS OF CREATION OF THE AIRCRAFT EQUIPMENT TEST AUTOMATED MANAGEMENT SYSTEM

    Directory of Open Access Journals (Sweden)

    Vladimir Michailovich Vetoshkin

    2017-01-01

    Full Text Available The development of new and the modernization of existing aviation equipment of different classes are accompanied and completed by a complex process of ground and flight tests. This phase of the aviation equipment life cycle is implemented by means of organizational and technical systems - running centers. The latter include various proving grounds, measuring complexes and systems, aircraft, ships, security and flight control offices, information processing laboratories and many other elements. The results of a system analysis of the challenges in developing automated control systems for aviation equipment test operations are presented. Such automated control systems are, in essence, automated data banks. The key role of the flight test automated control system in the creation of automated control systems for aviation equipment test operations is substantiated. The approach of integrating mobile modular measuring complexes, and the need for national methodologies and technological standards for database system design concepts, are grounded. The database system, as the central element in this scheme, collects, stores and updates the values of the elements described above at the pace and frequency required for monitoring the state of the controlled object. It is the database system that provides the supervisory unit with actual data for specific moments of time concerning state processes and assessments of the progress and results of flight experiments, creating the environment necessary for managing and testing aviation equipment as a whole. The basis for developing subsystems of automated control systems for aviation equipment test operations is the conceptual design of the respective database system, the effectiveness of which largely determines the level of success and the ability to develop the systems being created. The introduced conclusions and suggestions can be used in the

  18. AUTOMATED TECHNIQUE FOR FLOW MEASUREMENTS FROM MARIOTTE RESERVOIRS.

    Science.gov (United States)

    Constantz, Jim; Murphy, Fred

    1987-01-01

    The Mariotte reservoir supplies water at a constant hydraulic pressure by self-regulation of its internal gas pressure. Automated outflow measurements from Mariotte reservoirs are generally difficult because of the reservoir's self-regulation mechanism. This paper describes an automated flow meter specifically designed for use with Mariotte reservoirs. The flow meter monitors changes in the Mariotte reservoir's gas pressure during outflow to determine changes in the reservoir's water level. The flow measurement is performed by attaching a pressure transducer to the top of a Mariotte reservoir and monitoring gas pressure changes during outflow with a programmable data logger. The advantages of the new automated flow measurement technique include: (i) the ability to rapidly record a large range of fluxes without restricting outflow, and (ii) the ability to accurately average the pulsing flow that commonly occurs during outflow from a Mariotte reservoir.
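The measurement principle in the record above - inferring the water-level drop from the gas-pressure change and converting it to outflow via the reservoir cross-section - can be put in a short back-of-envelope form. The function below is an illustrative hydrostatic sketch, not the instrument's actual data-logger program.

```python
# Illustrative hydrostatics: a drop in the reservoir's internal gas pressure
# tracks the falling water level, so outflow volume follows from the
# reservoir cross-section. Not the actual data-logger code.

RHO_WATER = 1000.0  # kg/m^3
G = 9.81            # m/s^2

def outflow_l_per_min(dp_pa, dt_s, reservoir_area_m2):
    """Convert a gas-pressure drop over an interval into an average outflow rate."""
    dh_m = dp_pa / (RHO_WATER * G)             # water-level drop (hydrostatic)
    volume_m3 = dh_m * reservoir_area_m2       # volume released over the interval
    return volume_m3 * 1000.0 / (dt_s / 60.0)  # litres per minute
```

Because the rate is an average over the logging interval, this approach naturally smooths the pulsing flow that the record notes is characteristic of Mariotte outflow.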

  19. Automated Calibration of Dosimeters for Diagnostic Radiology

    International Nuclear Information System (INIS)

    Romero Acosta, A.; Gutierrez Lores, S.

    2015-01-01

    Calibration of dosimeters for diagnostic radiology involves current and charge measurements that are often repetitive. These measurements are usually made with modern electrometers, which are equipped with an RS-232 interface that enables instrument control from a computer. This paper presents an automated system for the measurements required in the calibration of dosimeters used in diagnostic radiology. A software application was developed to acquire the electric charge readings and the measured values of the monitor chamber, calculate the calibration coefficient, and issue a calibration certificate. A primary data record file is filled in and stored on the computer hard disk. The calibration method used was calibration by substitution. With this system, better control over the calibration process is achieved and the need for human intervention is reduced. The automated system will be used in the calibration of dosimeters for diagnostic radiology at the Cuban Secondary Standard Dosimetry Laboratory of the Center for Radiation Protection and Hygiene. (Author)
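In calibration by substitution, as named in the record above, the device under test replaces a reference standard in the same beam, and a monitor chamber corrects for beam-output variation between the two exposures. The sketch below shows that arithmetic; the variable names and units are illustrative assumptions, not taken from the Cuban SSDL software.

```python
# Hedged sketch of calibration by substitution. A monitor-chamber ratio
# corrects the device-under-test (DUT) reading for beam-output drift
# between the reference and DUT exposures. Names/units are assumptions.

def calibration_coefficient(ref_kerma_mGy, dut_reading_nC,
                            monitor_ref, monitor_dut):
    """N = reference air kerma / monitor-normalized DUT reading (mGy/nC)."""
    normalized = dut_reading_nC * (monitor_ref / monitor_dut)
    return ref_kerma_mGy / normalized
```

Automating this step removes the repetitive readout-and-ratio bookkeeping that the record identifies as the main source of manual effort.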

  20. Automated Psychological Categorization via Linguistic Processing System

    National Research Council Canada - National Science Library

    Eramo, Mark

    2004-01-01

    .... This research examined whether or not Information Technology (IT) tools, specializing in text mining, are robust enough to automate the categorization/segmentation of individual profiles for the purpose...