WorldWideScience

Sample records for automated process monitoring

  1. Automated process safety parameters monitoring system

    International Nuclear Information System (INIS)

    Iyudina, O.S.; Solov'eva, A.G.; Syrov, A.A.

    2015-01-01

    Based on its expertise in upgrading and creating control systems for NPP process equipment, “Diakont” has developed an automated process safety parameters monitoring system. The monitoring system is a set of hardware, software and data analysis tools based on a dynamic logical-and-probabilistic model of process safety. The proposed monitoring system can be used for safety monitoring and analysis of the following processes: reactor core reloading; spent nuclear fuel transfer; and startup, loading, on-load operation and shutdown of an NPP turbine

  2. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; Grate, Jay W.; DeVol, Timothy A.

    2004-01-01

    This research program is directed toward rapid, sensitive, and selective determination of beta- and alpha-emitting radionuclides such as 99Tc, 90Sr, and transuranium (TRU) elements in low activity waste (LAW) processing streams. The overall technical approach is based on automated radiochemical measurement principles, which entail the integration of sample treatment and separation chemistries and radiometric detection within a single functional analytical instrument. Nuclear waste process streams are particularly challenging for rapid analytical methods due to the complex, high-ionic-strength, caustic brine sample matrix, the presence of interfering radionuclides, and the variable and uncertain speciation of the radionuclides of interest. As a result, matrix modification, speciation control, and separation chemistries are required for use in automated process analyzers. Significant knowledge gaps exist relative to the design of chemistries for such analyzers so that radionuclides can be quantitatively and rapidly separated and analyzed in solutions derived from low-activity waste processing operations. This research is addressing these knowledge gaps in the areas of separation science, nuclear detection, and analytical chemistry and instrumentation. The outcome of these investigations will be the knowledge necessary to choose appropriate chemistries for sample matrix modification and analyte speciation control, and chemistries for rapid and selective separation and preconcentration of target radionuclides from complex sample matrices. In addition, new approaches for quantification of alpha emitters in solution using solid-state diode detectors, as well as improved instrumentation and signal processing techniques for use with solid-state and scintillation detectors, will be developed. New knowledge of the performance of separation materials, matrix modification and speciation control chemistries, instrument configurations, and quantitative analytical approaches will

  3. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Jay W. Grate; Timothy A. DeVol

    2006-01-01

    The objectives of our research were to develop the first automated radiochemical process analyzer, including sample pretreatment methodology, and to initiate work on new detection approaches, especially using modified diode detectors.

  4. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  5. Automated Vehicle Monitoring System

    OpenAIRE

    Wibowo, Agustinus Deddy Arief; Heriansyah, Rudi

    2014-01-01

    An automated vehicle monitoring system is proposed in this paper. The surveillance system is based on image processing techniques such as background subtraction, colour balancing, chain-code-based shape detection, and blob analysis. The proposed system detects any human head appearing at the side mirrors. The detected head is tracked and recorded for further action.
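
    As a rough illustration of the pipeline this abstract names (background subtraction followed by blob detection), the following Python/OpenCV sketch flags moving blobs in a video frame. It is not the authors' implementation; the file name, subtractor settings and minimum blob area are hypothetical.

      # Minimal sketch of background subtraction + blob detection
      # (assumed parameters); not the system described in the paper.
      import cv2

      cap = cv2.VideoCapture("side_mirror.avi")  # hypothetical input video
      subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          mask = subtractor.apply(frame)  # foreground (moving) pixels
          mask = cv2.morphologyEx(
              mask, cv2.MORPH_OPEN,
              cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)
          for c in contours:  # candidate blobs, e.g. a head at the mirror
              if cv2.contourArea(c) > 500:  # ignore small noise blobs
                  x, y, w, h = cv2.boundingRect(c)
                  cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)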

  6. Plug-and-play monitoring and performance optimization for industrial automation processes

    CERN Document Server

    Luo, Hao

    2017-01-01

    Dr.-Ing. Hao Luo demonstrates the development of advanced plug-and-play (PnP) process monitoring and control systems for industrial automation processes. With the aid of the so-called Youla parameterization, a novel PnP process monitoring and control architecture (PnP-PMCA) with modularized components is proposed. To validate the developments, a case study on an industrial rolling mill benchmark is performed, and the real-time implementation on a laboratory brushless DC motor is presented. Contents: PnP Process Monitoring and Control Architecture; Real-Time Configuration Techniques for PnP Process Monitoring; Real-Time Configuration Techniques for PnP Performance Optimization; Benchmark Study and Real-Time Implementation. Target groups: researchers and students of automation and control engineering; practitioners in the area of industrial and production engineering. The author: Hao Luo received the Ph.D. degree at the Institute for Automatic Control and Complex Systems (AKS) at the University of Duisburg-Essen, Germany, ...

  7. Bioreactor process monitoring using an automated microfluidic platform for cell-based assays

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    We report on a novel microfluidic system designed to monitor in real time the concentration of live and dead cells in industrial cell production. Custom-made stepper-motor-actuated peristaltic pumps and valves, fluidic interconnections, sample-to-waste liquid management and image cytometry-based detection contribute to the high programmability and automation of this platform. Furthermore, this is, to the best of our knowledge, the first use of Dean vortices to apply a wide range of dilution factors to highly concentrated cell samples. The combination of a curved channel geometry and high flow rates enables the rapid passive mixing and homogenization of the diluted cell plug.

  8. Automated personnel radiation monitor

    International Nuclear Information System (INIS)

    Sterling, S.G.

    1981-01-01

    An automated Personnel Low-Level Radiation Portal Monitor has been developed by UNC Nuclear Industries, Inc. It is microcomputer controlled and uses nineteen large gas-flow radiation detectors. By employing a microcomputer, sophisticated mathematical analysis is applied to the detector data base to determine the statistical probability of contamination. This system provides for: (1) increased sensitivity to point-source contamination; (2) real-time background level compensation before and during portal occupancy; (3) variable counting periods as necessary to provide a significant statistical probability of contamination; (4) continuous self-testing of system components, detector operability and sensitivity; and (5) multiple modes of operation allowing the operator/owner control from continuous walk-through (for SNM detection at gates) to complete whole-body counts (at step-off points from radiation zones). Sr-90 sources of 0.005 uCi can be detected on the hands and feet with a 90% confidence level and less than 0.1% false alarm rate at background levels up to 0.1 mR/hr. For the occupant's periphery adjacent to the detectors, a sensitivity of 0.01 uCi is readily attainable. Alpha particle detection is available on hands, owing to close-proximity detection and thin Mylar detector cover techniques.
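
    The decision logic sketched below illustrates, in Python, the kind of statistical test such a monitor applies: compare gross counts against the real-time background estimate and declare contamination only when the excess is statistically significant. The k-factor and count values are illustrative assumptions, not UNC's algorithm.

      # Poisson counting decision: is (gross - background) significant?
      from math import sqrt

      def contaminated(gross_counts, bkg_counts, k=1.282):
          """One-sided test; k = 1.282 corresponds roughly to the 90%
          confidence level quoted in the abstract (assumed mapping)."""
          # Variance of the difference of two Poisson counts ~ gross + bkg
          return (gross_counts - bkg_counts) > k * sqrt(gross_counts + bkg_counts)

      # Variable counting periods: keep counting until the test is decisive.
      print(contaminated(gross_counts=480, bkg_counts=400))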

  9. Automated personnel radiation monitor

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, S.G.

    1981-06-01

    An automated Personnel Low-Level Radiation Portal Monitor has been developed by UNC Nuclear Industries, Inc. It is microcomputer controlled and uses nineteen large gas-flow radiation detectors. By employing a microcomputer, sophisticated mathematical analysis is applied to the detector data base to determine the statistical probability of contamination. This system provides for: (1) increased sensitivity to point-source contamination; (2) real-time background level compensation before and during portal occupancy; (3) variable counting periods as necessary to provide a significant statistical probability of contamination; (4) continuous self-testing of system components, detector operability and sensitivity; and (5) multiple modes of operation allowing the operator/owner control from continuous walk-through (for SNM detection at gates) to complete whole-body counts (at step-off points from radiation zones). Sr-90 sources of 0.005 uCi can be detected on the hands and feet with a 90% confidence level and less than 0.1% false alarm rate at background levels up to 0.1 mR/hr. For the occupant's periphery adjacent to the detectors, a sensitivity of 0.01 uCi is readily attainable. Alpha particle detection is available on hands, owing to close-proximity detection and thin Mylar detector cover techniques.

  10. Identity Management Processes Automation

    Directory of Open Access Journals (Sweden)

    A. Y. Lavrukhin

    2010-03-01

    Implementation of identity management systems consists of two main parts: consulting and automation. The consulting part includes development of a role model and a description of the identity management processes. The automation part is based on the results of the consulting part. This article describes the most important aspects of IdM implementation.

  11. Automated system for acquisition and image processing for the control and monitoring boned nopal

    Science.gov (United States)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of a system for image acquisition and processing to control the removal of thorns from the nopal vegetable (Opuntia ficus-indica) in an automated machine that uses pulses of an Nd:YAG laser. The areolas, the areas where thorns grow on the bark of the nopal, are located by applying segmentation algorithms to the images obtained by a CCD. Once the positions of the areolas are known, their coordinates are sent to a motor system that steers the laser to interact with all areolas and remove the thorns from the nopal. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs the tasks of acquisition, preprocessing, segmentation, recognition and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.
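
    A compact sketch of the segmentation-to-coordinates step described above, in Python with OpenCV: threshold the CCD image, keep areola-sized blobs, and emit a centroid table for the galvo motor system. The thresholding choice and the area limits are invented for illustration.

      # Hypothetical areola locator; not the DSP firmware from the paper.
      import cv2

      def areola_coordinates(image_path, min_area=20, max_area=400):
          img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
          # Otsu threshold: areolas assumed darker than the surrounding bark
          _, mask = cv2.threshold(img, 0, 255,
                                  cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)
          coords = []
          for c in contours:
              if min_area < cv2.contourArea(c) < max_area:
                  m = cv2.moments(c)
                  coords.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
          return coords  # (x, y) centroids to send to the galvo controller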

  12. Methodology for monitoring and automated diagnosis of ball bearings using paraconsistent logic, wavelet transform and digital signal processing

    International Nuclear Information System (INIS)

    Masotti, Paulo Henrique Ferraz

    2006-01-01

    The monitoring and diagnosis area has developed impressively in recent years with the introduction of new diagnosis techniques and with the use of computers to process information and apply diagnosis techniques. The contribution of artificial intelligence to the automation of defect diagnosis is developing continually, and the growing automation in industry embraces these new techniques. In the nuclear area, growing concern with facility safety requires more effective techniques, which have been sought to increase the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow the verification of their operational conditions. In this way, the present work can also collaborate in this area, helping in the diagnosis of the operational condition of machines. This work presents a new technique for feature extraction based on the zero crossings of the wavelet transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work was the Paraconsistent Logic of Annotation with Two Values (LPA2v), contributing to the automation of defect diagnosis, because this logic can deal with the contradictory results that feature extraction techniques can present. This work also concentrated on the identification of defects in their initial phase, using accelerometers, because they are robust, low-cost sensors that are easily found in industry in general. The results in this work were obtained through the use of an experimental database, and it was observed that the defect diagnoses were good for defects in their initial phase. (author)
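
    To make the feature-extraction idea concrete, here is a small Python sketch (using the PyWavelets package) that counts zero crossings in each detail band of a discrete wavelet decomposition of a vibration trace. It follows the spirit of the zero-crossing technique named in the abstract; the wavelet, decomposition depth and synthetic signal are assumptions.

      import numpy as np
      import pywt  # PyWavelets

      def zero_crossing_features(trace, wavelet="db4", levels=4):
          """Count sign changes per wavelet detail band; bearing defects
          change these counts (illustrative feature vector)."""
          coeffs = pywt.wavedec(trace, wavelet, level=levels)
          features = []
          for detail in coeffs[1:]:  # skip the approximation band
              crossings = np.count_nonzero(np.diff(np.sign(detail)))
              features.append(crossings)
          return features

      # Synthetic accelerometer-like trace, for demonstration only
      t = np.linspace(0, 1, 4096)
      trace = np.sin(2 * np.pi * 60 * t) + 0.1 * np.random.randn(t.size)
      print(zero_crossing_features(trace))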

  13. Automated time-lapse electrical resistivity tomography (ERT) for improved process analysis and long-term monitoring of frozen ground

    Science.gov (United States)

    Hauck, Christian; Hilbich, Christin; Fuss, Christian

    2010-05-01

    Determining the subsurface ice and unfrozen water content in cold regions is an important task in all kinds of cryospheric studies, but especially on perennial (permafrost) or seasonally frozen ground, where little insight can be gained from direct observations at the surface. In the absence of boreholes, geophysical methods are often the only possibility for visualising and quantifying subsurface characteristics. Their successful application in recent years has led to more and more sophisticated approaches, including 2- and 3-dimensional monitoring and even quantifying the evolution of ice and unfrozen water content within the subsurface. Due to the strong sensitivity of electrical resistivity to the phase change between unfrozen water and ice, the application of electrical and electromagnetic techniques has been especially successful. Among these methods, Electrical Resistivity Tomography (ERT) is often favoured due to its comparatively easy and fast data processing, its robustness against ambient noise and its good performance even in harsh, cold and heterogeneous environments. Numerous recent studies have shown that ERT is in principle suitable for spatially delineating ground ice, differentiating between ice-poor and ice-rich occurrences, and monitoring freezing, thawing and infiltration processes. However, resistivity surveys still have to be made manually, which poses large constraints concerning the comparability of measurements at specific time instances, e.g. the choice of the date for end-of-summer measurements, and/or the possibility of measurements during winter, when many locations are inaccessible. Furthermore, many climate studies require the analysis of statistically meaningful properties, such as maximum/minimum values and monthly or annual mean values, which cannot be determined using temporally sparse and irregularly spaced measurements. As a new system for future automated measurements with a regular time interval (e.g. 1 measurement per day), an automated ERT

  14. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    Science.gov (United States)

    Jones, Michael K.

    1998-01-01

    Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  15. Simultaneous and automated monitoring of the multimetal biosorption processes by potentiometric sensor array and artificial neural network.

    Science.gov (United States)

    Wilson, D; del Valle, M; Alegret, S; Valderrama, C; Florido, A

    2013-09-30

    In this communication, a new methodology for the simultaneous and automated monitoring of biosorption processes of multimetal mixtures of polluting heavy metals on vegetable wastes, based on flow-injection potentiometry (FIP) with electronic tongue detection (ET), is presented. A fixed-bed column filled with grape stalks from wine industry wastes is used as the biosorption setup to remove the metal mixtures from the influent solution. The monitoring system consists of a computer-controlled FIP prototype with the ET, based on an array of 9 flow-through ion-selective electrodes and electrodes with generic response to divalent ions placed in series, plus an artificial neural network response model. The cross-response to Cu(2+), Cd(2+), Zn(2+), Pb(2+) and Ca(2+) (as target ions) is used, and correct operation of the system is achieved only when dynamic treatment of the kinetic components of the transient signal is incorporated. For this purpose, the FIA peaks are transformed via Fourier treatment, and selected coefficients are used to feed an artificial neural network response model. Real-time monitoring of different binary (Cu(2+)/Pb(2+), Cu(2+)/Zn(2+)) and ternary (Cu(2+)/Pb(2+)/Zn(2+), Cu(2+)/Zn(2+)/Cd(2+)) mixtures, simultaneous with the release of Ca(2+) into the effluent solution, is achieved satisfactorily using the reported system, obtaining the corresponding breakthrough curves and showing the ion-exchange mechanism among the different metals. Analytical performance is verified against conventional spectroscopic techniques, with good concordance of the obtained breakthrough curves and modeled adsorption parameters. Copyright © 2013 Elsevier B.V. All rights reserved.
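
    The signal pipeline described above (FIA transient, Fourier coefficients, neural network) can be sketched in a few lines of Python. Everything below is illustrative: the transient length, number of retained coefficients, network size and the synthetic training data are assumptions, not the paper's settings.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      def fourier_features(transient, n_coeffs=16):
          """Compress one electrode's FIA peak into low-order Fourier terms."""
          c = np.fft.rfft(transient)[:n_coeffs]
          return np.concatenate([c.real, c.imag])

      rng = np.random.default_rng(0)
      transients = rng.normal(size=(120, 9, 200))  # 120 injections x 9 electrodes
      X = np.array([np.concatenate([fourier_features(e) for e in inj])
                    for inj in transients])
      y = rng.uniform(0, 1, size=(120, 5))  # stand-in Cu/Cd/Zn/Pb/Ca levels
      model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000).fit(X, y)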

  16. Tracking forest canopy dynamics from an automated proximal hyperspectral monitoring system: linking remote sensing observations to leaf level photosynthetic processes

    Science.gov (United States)

    Woodgate, W.; van Gorsel, E.; Hughes, D.; Suarez, L.; Cabello-Leblic, A.; Held, A. A.; Norton, A.; Dempsey, R.

    2017-12-01

    To better understand the vegetation response to climate extremes, we have developed a fully automated hyperspectral and thermal monitoring system installed on a flux tower at a mature eucalypt forest site - Tumbarumba, Australia. The automated system bridges spatial, spectral and temporal scales between satellite and in situ observations. Here, we have been acquiring high-resolution panoramic hyperspectral and thermal images of the forest canopy three times per day since mid-2014. A specific focus of the work to date has been linking light use efficiency (LUE), as measured by the flux tower, to remote sensing observations from the leaf, to crown, to canopy scale. Specifically, targeted field campaigns were conducted in 2016 to establish the interrelationship between structure, function, and spectra. At the leaf level, destructive sampling to quantify photosynthetic pigments was conducted to pick apart the mechanisms contributing to the photosynthetic processes of non-photochemical quenching and the resultant changes in observed leaf spectra. At the crown level, Terrestrial Laser Scanning data was used to derive canopy structural information, enabling distance to crown and crown foliage density to be calculated to a fine degree of detail. This information is critical for correcting attenuation of the thermal signal from atmospheric transmission, and for distinguishing the relative foliage-to-soil contribution to the thermal and hyperspectral imagery. Ancillary data streams from sap flow and dendrometer devices serve to link leaf, crown and canopy observations. Preliminary results of the leaf- and crown-level relationships between function and spectra will be discussed. We will demonstrate that operating in a tall-canopy (40 m) forest can lead to additional complexities. We have found that the relationship strength between traditional remote sensing LUE proxies and photosynthetic proxies derived from pigments varies strongly with canopy height and pigment pool size. Additionally, the

  17. Automated biomonitoring: living sensors as environmental monitors

    National Research Council Canada - National Science Library

    Gruber, D; Diamond, J

    1988-01-01

    Water quality continues to present problems of global concern and has resulted in greatly increased use of automated biological systems in monitoring drinking water, industrial effluents and wastewater...

  18. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
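
    The self-diagnostic spike-addition calibration lends itself to a short worked example: the efficiency recovered from the spiked run converts sample count rates into activity concentration. The numbers below are invented for illustration and are not from the paper.

      # Matrix-matched calibration via standard (spike) addition - a sketch.
      def efficiency_cps_per_bq(spiked_cps, sample_cps, spike_bq):
          """Counts per second gained per Bq of added 99Tc standard."""
          return (spiked_cps - sample_cps) / spike_bq

      def activity_bq_per_ml(sample_cps, bkg_cps, eff, volume_ml):
          """99Tc activity concentration of the processed sample."""
          return (sample_cps - bkg_cps) / eff / volume_ml

      eff = efficiency_cps_per_bq(spiked_cps=52.0, sample_cps=40.0, spike_bq=30.0)
      print(activity_bq_per_ml(sample_cps=40.0, bkg_cps=1.0, eff=eff,
                               volume_ml=0.495))  # ~197 Bq/mL (toy numbers)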

  19. An automated neutron monitor maintenance system

    International Nuclear Information System (INIS)

    Moore, F.S.; Griffin, J.C.; Odell, D.M.C.

    1996-01-01

    Neutron detectors are commonly used by the nuclear materials processing industry to monitor fissile materials in process vessels and tanks. The proper functioning of these neutron monitors must be periodically evaluated. We have developed and placed in routine use a PC-based multichannel analyzer (MCA) system for on-line BF3 and He-3 gas-filled detector function testing. The automated system: 1) acquires spectral data from the monitor system, 2) analyzes the spectrum to determine the detector's functionality, 3) makes suggestions for maintenance or repair, as required, and 4) saves the spectrum and results to disk for review. The operator interface has been designed to be user-friendly and to minimize the training requirements of the user. The system may also be easily customized for various applications

  20. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    The industrial production of cells has a large unmet need for greater process monitoring, in addition to the standard determination of temperature, pH and oxygen concentration. Monitoring cell health by a vast range of fluorescence cell-based assays can greatly improve the feedback control...

  1. Process computers automate CERN power supply installations

    International Nuclear Information System (INIS)

    Ullrich, H.; Martin, A.

    1974-01-01

    Higher standards of performance and reliability in the power plants of large particle accelerators necessitate increasing use of automation. CERN (the European Organization for Nuclear Research) in Geneva started to employ process computers for plant automation at an early stage in its history. The great complexity and extent of the plants for high-energy physics first led to the setting-up of decentralized automatic systems, which are now being increasingly combined into one interconnected automation system. One of these automatic systems controls and monitors the extensive power supply installations for the main ring magnets in the experimental zones. (orig.)

  2. National Automated Conformity Inspection Process -

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 810-10 Request...

  3. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 (inch) film - the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate fully the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14 x 17 (inch) film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (Film Digital Radiography System), is moving toward 50 micron (approx. 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author)

  4. Automated personal dosimetry monitoring system for NPP

    International Nuclear Information System (INIS)

    Chanyshev, E.; Chechyotkin, N.; Kondratev, A.; Plyshevskaya, D.

    2006-01-01

    Radiation safety of personnel at nuclear power plants (NPPs) is a priority. The degree of radiation exposure of personnel is defined by many factors: NPP design, operation of equipment, organizational management of radiation-hazardous work and, certainly, the safety culture of every employee. An Automated Personal Dosimetry Monitoring System (A.P.D.M.S.) is nowadays applied at all nuclear power plants in Russia to eliminate the possibility of occupational radiation exposure beyond regulated levels under different modes of NPP operation. The A.P.D.M.S. provides individual radiation dose registration. In this paper the efforts of Design Bureau 'Promengineering' in constructing the software and hardware complex of the A.P.D.M.S. (S.H.W. A.P.D.M.S.) for NPPs with PWRs are presented. The developed complex is intended to automate the activities of the radiation safety department when carrying out individual dosimetry control. The complex covers all main processes concerning individual monitoring of external and internal radiation exposure as well as dose recording, management, and planning. The S.H.W. A.P.D.M.S. is a multi-purpose system whose software was designed on a modular approach. This approach allows modification and extension of the software using new components (modules) without changes in other components. Such a structure makes the system flexible and allows it to be modified when new radiation safety requirements are implemented and the scope of dosimetry monitoring is extended. This gives the possibility of including, in time, new kinds of dosimetry control for Russian NPPs in compliance with IAEA recommendations, for instance control of the equivalent dose rate to the skin and the equivalent dose rate to the lens of the eye. The S.H.W. A.P.D.M.S. provides dosimetry control as follows: current monitoring of external radiation exposure: - gamma radiation dose measurement using radio-photoluminescent personal dosimeters; - neutron radiation dose measurement using thermoluminescent

  5. Development and implementation of an automated system for adapting the process of gamma radiation monitor calibration

    International Nuclear Information System (INIS)

    Silva Junior, Iremar Alves

    2012-01-01

    In this study, a system for adapting the process of gamma radiation monitor calibration was developed and implemented, consisting of a pneumatic device to exchange the attenuators and a positioning table, both actuated through a control panel. We also implemented a Caesa-Gammatron irradiator system, which increased the range of air kerma rates due to its higher activity compared with the gamma radiation system currently in use in the gamma irradiation calibration laboratory. Hence, the installation of a remotely controlled attenuator device in this irradiator system was necessary. Lastly, an evaluation of the reduction in occupational dose rates was carried out. This dissertation was developed with the aim of improving the quality of the services of calibration and testing of gamma radiation monitors - provided by the IPEN Laboratory of Instrument Calibration - as well as decreasing the occupational dose of the technicians involved in the calibration process, following the principles of radiation protection. (author)

  6. Automated data acquisition for dam monitoring

    International Nuclear Information System (INIS)

    Koopmans, R.; Jakubick, A.T.

    1990-01-01

    Automated data acquisition for dam monitoring is crucial to emergency response, allows frequent readings without increased labour and cost, allows monitoring of instrument response to changing environmental and physical influences, enables direct computer acquisition of data, and has numerous other advantages. The experience of Ontario Hydro, British Columbia Hydro, and other utilities with automated data acquisition systems is described. Details are provided of remote monitoring systems, instrumentation, data loggers, tape cassette backup, power sources, data transmission equipment, modems and telephone networks, computers and peripherals, and system performance. The utility's plans for future expansion of the systems are described. Utility experience with the automated systems is also described for Clarence Dam, Bath County Pumped Storage Station, Scott Dam, and Vermillion Dam in the U.S., Ajaure Dam in Sweden, and ENEL Dam in the Valtellina area of Italy. 17 refs., 2 figs

  7. Automating the personnel dosimeter monitoring program

    International Nuclear Information System (INIS)

    Compston, M.W.

    1982-12-01

    The personnel dosimetry monitoring program at the Portsmouth uranium enrichment facility has been improved by using thermoluminescent dosimetry to monitor for ionizing radiation exposure and by automating most of the operations and all of the associated information handling. A thermoluminescent dosimeter (TLD) card, worn by personnel inside their security badges, stores the energy of ionizing radiation. The dosimeters are changed out periodically and are loaded 150 cards at a time into an automated reader-processor. The resulting data are recorded and filed in a useful form by computer programs developed for this purpose.

  8. The Automator: Intelligent control system monitoring

    International Nuclear Information System (INIS)

    M. Bickley; D.A. Bryan; K.S. White

    1999-01-01

    A large-scale control system may contain several hundred thousand control points which must be monitored to ensure smooth operation. Knowledge of the current state of such a system is often implicit in the values of these points, and operators must be cognizant of the state while making decisions. Repetitive operations requiring human intervention lead to fatigue, which can in turn lead to mistakes. The authors propose a tool called the Automator based on a middleware software server. This tool would provide a user-configurable engine for monitoring control points. Based on the status of these control points, a specified action could be taken. The action could range from setting another control point, to triggering an alarm, to running an executable. Often the data presented by a system is meaningless without context information from other channels. Such a tool could be configured to present interpreted information based on the values of other channels. Additionally, this tool could translate numerous values in a non-friendly form (such as numbers, bits, or return codes) into meaningful strings of information. Multiple instances of this server could be run, allowing individuals or groups to configure their own Automators. The configuration of the tool will be file-based. In the future, these files could be generated by graphical design tools, allowing for rapid development of new configurations. In addition, the server will be able to explicitly maintain information about the state of the control system. This state information can be used in decision-making processes and shared with other applications. A conceptual framework and software design for the tool are presented
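
    A toy Python version of such an engine shows the shape of the idea: user-configured rules map conditions over control-point values to actions (raise an alarm, set a point, run an executable). The rule syntax, point names and thresholds here are invented, not the Automator's configuration format.

      import subprocess

      RULES = [
          # (predicate over current point values, action when it holds)
          (lambda pts: pts["tank.level"] > 0.9,
           lambda pts: print("ALARM: tank level high")),
          (lambda pts: pts["pump.status"] == 0 and pts["flow.rate"] > 1.0,
           lambda pts: subprocess.run(["echo", "flow with pump off"])),
      ]

      def scan(points):
          """One monitoring pass over the control-point database."""
          for predicate, action in RULES:
              if predicate(points):
                  action(points)

      scan({"tank.level": 0.95, "pump.status": 1, "flow.rate": 0.2})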

  9. Automated Cryocooler Monitor and Control System

    Science.gov (United States)

    Britcliffe, Michael J.; Hanscon, Theodore R.; Fowler, Larry E.

    2011-01-01

    A system was designed to automate cryogenically cooled low-noise amplifier systems used in the NASA Deep Space Network. It automates the entire operation of the system including cool-down, warm-up, and performance monitoring. The system is based on a single-board computer with custom software and hardware to monitor and control the cryogenic operation of the system. The system provides local display and control, and can be operated remotely via a Web interface. The system controller is based on a commercial single-board computer with onboard data acquisition capability. The commercial hardware includes a microprocessor, an LCD (liquid crystal display), seven LED (light emitting diode) displays, a seven-key keypad, an Ethernet interface, 40 digital I/O (input/output) ports, 11 A/D (analog to digital) inputs, four D/A (digital to analog) outputs, and an external relay board to control the high-current devices. The temperature sensors used are commercial silicon diode devices that provide a non-linear voltage output proportional to temperature. The devices are excited with a 10-microamp bias current. The system is capable of monitoring and displaying three temperatures. The vacuum sensors are commercial thermistor devices. The output of the sensors is a non-linear voltage proportional to vacuum pressure in the 1 Torr to 1 millitorr range. Two sensors are used. One measures the vacuum pressure in the cryocooler and the other the pressure at the input to the vacuum pump. The helium pressure sensor is a commercial device that provides a linear voltage output from 1 to 5 volts, corresponding to a gas pressure from 0 to 3.5 MPa (approximately 500 psig). Control of the vacuum process is accomplished with a commercial electrically operated solenoid valve. A commercial motor starter is used to control the input power of the compressor. The warm-up heaters are commercial power resistors sized to provide the appropriate power for the thermal mass of the particular system, and

  10. Using artificial intelligence to automate remittance processing.

    Science.gov (United States)

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  11. Monitoring system for automation of experimental researches in cutting

    International Nuclear Information System (INIS)

    Kuzinovski, Mikolaj; Trajchevski, Neven; Filipovski, Velimir; Tomov, Mite; Cichosz, Piotr

    2009-01-01

    This study presents the procedures performed in the design and realization of experimental scientific research using an automated measurement system with computer support in all stages of the experiment. Special emphasis is placed on the integration of the measurement system and the mathematical processing of the experimental data. The automation is described through our own automated monitoring system, with computer-aided data acquisition, for the investigation of physical phenomena in the cutting process. The monitoring system is intended for determining the tangential, axial and radial components of the cutting force, as well as the average temperature in the cutting process. The hardware acquisition part consists of amplifiers and A/D converters, while the software for analysis and visualization on a PC was developed using MS Visual C++. For the mathematical description of the investigated physical phenomena, the CADEX software was created, which, in connection with MATLAB, is intended for the design, processing and analysis of experimental scientific research according to the theory of multi-factorial experiment planning. The design and construction of the interface and the computerized measurement system were done by the Faculty of Mechanical Engineering in Skopje in collaboration with the Faculty of Electrical Engineering and Information Technologies in Skopje and the Institute of Production Engineering and Automation, Wroclaw University of Technology, Poland. Having our own scientific research measurement system, with free access to its hardware and software parts, provides conditions for complete control of the research process and for reducing the measurement uncertainty of the results obtained from the performed research.

  12. AUTOMATING THE DATA SECURITY PROCESS

    Directory of Open Access Journals (Sweden)

    Florin Ogigau-Neamtiu

    2017-11-01

    Contemporary organizations face big data security challenges in the cyber environment due to modern threats and the current business working model, which relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. Nowadays, the data classification and data obfuscation selection processes (encryption, masking or tokenization) suffer because of the human involvement in the process. Organizations need to address the data security domain by classifying information based on its importance, conducting risk assessment plans and using the most cost-effective data obfuscation technique. The paper proposes a new model for data protection that uses automated machine decision-making procedures to classify data and to select the appropriate data obfuscation technique. The proposed system uses natural language processing capabilities to analyze input data and to select the best course of action. The system has capabilities to learn from previous experiences, thus improving itself and reducing the risk of wrong data classification.
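
    As a minimal sketch of the decision being automated, the Python snippet below classifies a record from keywords in its text and picks an obfuscation technique accordingly. The term-to-technique table and strictness ranking are invented assumptions; a real system would use trained NLP models, as the paper proposes.

      SENSITIVE_TERMS = {  # hypothetical classification table
          "secret": "encryption", "password": "encryption",
          "ssn": "tokenization", "account": "tokenization",
          "name": "masking", "address": "masking",
      }

      def choose_obfuscation(text: str) -> str:
          """Return the technique for the most sensitive term found."""
          ranking = ["encryption", "tokenization", "masking"]  # strictest first
          found = {SENSITIVE_TERMS[w] for w in text.lower().split()
                   if w in SENSITIVE_TERMS}
          for technique in ranking:
              if technique in found:
                  return technique
          return "none"

      print(choose_obfuscation("customer name and account number"))  # tokenization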

  13. AUTOMATION OF IMAGE DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Preuss Ryszard

    2014-12-01

    This article discusses the current capabilities of automated processing of image data, using the example of the PhotoScan software by Agisoft. At present, image data obtained by various registration systems (metric and non-metric cameras) placed on airplanes, satellites, or more often on UAVs, is used to create photogrammetric products. Multiple registrations of an object or land area (large groups of photos are captured) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of such a situation, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied. They can create a model of a block in the local coordinate system or, using initial exterior orientation and measured control points, can provide image georeference in an external reference frame. In the case of non-metric image application, it is also possible to carry out a self-calibration process at this stage. Image matching algorithms are also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM and a photorealistic solid model of an object. All aforementioned processing steps are implemented in a single program, in contrast to standard commercial software dividing the steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential implementation of the processing steps with predetermined control parameters. The paper presents the practical results of fully automatic generation of orthomosaics for both images obtained by a metric Vexcel camera and a block of images acquired by a non-metric UAV system.

  14. AUTOMATED LOW-COST PHOTOGRAMMETRY FOR FLEXIBLE STRUCTURE MONITORING

    Directory of Open Access Journals (Sweden)

    C. H. Wang

    2012-07-01

    Structural monitoring requires instruments which can provide high precision and accuracy, reliable measurements at good temporal resolution and rapid processing speeds. Long-term campaigns and flexible structures are regarded as two of the most challenging subjects in monitoring engineering structures. Long-term monitoring in civil engineering is generally considered to be labour-intensive and financially expensive, and it can take significant effort to arrange the necessary human resources, transportation and equipment maintenance. When dealing with flexible structure monitoring, it is of paramount importance that any monitoring equipment used is able to carry out rapid sampling. Low-cost, automated, photogrammetric techniques therefore have the potential to become routinely viable for monitoring non-rigid structures. This research aims to provide a photogrammetric solution for long-term flexible structural monitoring purposes. The automated approach was achieved using low-cost imaging devices (mobile phones) to replace traditional image acquisition stations and substantially reduce the equipment costs. A self-programmed software package was developed to deal with the hardware-software integration and system operation. In order to evaluate the performance of this low-cost monitoring system, a shaking table experiment was undertaken. Different network configurations and target sizes were used to determine the best configuration. A large quantity of image data was captured by four DSLR cameras and four mobile phone cameras respectively. These image data were processed using photogrammetric techniques to calculate the final results for the system evaluation.

  15. SHARP - Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. G.

    1990-01-01

    Briefly discussed here are the spacecraft and ground systems monitoring processes at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  16. SHARP: Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. Gaius

    1991-01-01

    Briefly discussed here are the spacecraft and ground systems monitoring processes at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  17. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and automated machine processing.

  18. ERP processes automation in corporate environments

    Directory of Open Access Journals (Sweden)

    Antonoaie Victor

    2017-01-01

    Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and a meaningful impact was obtained.

  19. Real-time bioacoustics monitoring and automated species identification

    Directory of Open Access Journals (Sweden)

    T. Mitchell Aide

    2013-07-01

    Traditionally, animal species diversity and abundance are assessed using a variety of methods that are generally costly, limited in space and time, and most importantly, rarely include a permanent record. Given the urgency of climate change and the loss of habitat, it is vital that we use new technologies to improve and expand global biodiversity monitoring to thousands of sites around the world. In this article, we describe the acoustical component of the Automated Remote Biodiversity Monitoring Network (ARBIMON), a novel combination of hardware and software for automating data acquisition, data management, and species identification based on audio recordings. The major components of the cyberinfrastructure include: a solar-powered remote monitoring station that sends 1-min recordings every 10 min to a base station, which relays the recordings in real time to the project server, where the recordings are processed and uploaded to the project website (arbimon.net). Along with a module for viewing, listening to, and annotating recordings, the website includes a species identification interface to help users create machine learning algorithms to automate species identification. To demonstrate the system we present data on the vocal activity patterns of birds, frogs, insects, and mammals from Puerto Rico and Costa Rica.

  20. Real-time bioacoustics monitoring and automated species identification.

    Science.gov (United States)

    Aide, T Mitchell; Corrada-Bravo, Carlos; Campos-Cerqueira, Marconi; Milan, Carlos; Vega, Giovany; Alvarez, Rafael

    2013-01-01

    Traditionally, animal species diversity and abundance are assessed using a variety of methods that are generally costly, limited in space and time, and most importantly, rarely include a permanent record. Given the urgency of climate change and the loss of habitat, it is vital that we use new technologies to improve and expand global biodiversity monitoring to thousands of sites around the world. In this article, we describe the acoustical component of the Automated Remote Biodiversity Monitoring Network (ARBIMON), a novel combination of hardware and software for automating data acquisition, data management, and species identification based on audio recordings. The major components of the cyberinfrastructure include: a solar-powered remote monitoring station that sends 1-min recordings every 10 min to a base station, which relays the recordings in real time to the project server, where the recordings are processed and uploaded to the project website (arbimon.net). Along with a module for viewing, listening to, and annotating recordings, the website includes a species identification interface to help users create machine learning algorithms to automate species identification. To demonstrate the system we present data on the vocal activity patterns of birds, frogs, insects, and mammals from Puerto Rico and Costa Rica.
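
    The species identification step such a network automates can be sketched as template matching on spectrograms, as below in Python. This is only an illustration of the general approach; ARBIMON's actual per-species models are trained through the web interface, and the window size and threshold here are assumptions.

      import numpy as np
      from scipy import signal

      def log_spectrogram(audio, fs):
          f, t, sxx = signal.spectrogram(audio, fs, nperseg=512)
          return 10 * np.log10(sxx + 1e-12)  # dB scale

      def detect_calls(audio, template_db, fs, threshold=0.6):
          """Slide a call template over the recording's spectrogram and
          flag windows whose normalized correlation exceeds threshold."""
          sxx = log_spectrogram(audio, fs)
          tpl = (template_db - template_db.mean()) / (template_db.std() + 1e-12)
          hits = []
          for i in range(sxx.shape[1] - tpl.shape[1]):
              win = sxx[:tpl.shape[0], i:i + tpl.shape[1]]
              win = (win - win.mean()) / (win.std() + 1e-12)
              if float((win * tpl).mean()) > threshold:
                  hits.append(i)  # column index where the call matched
          return hits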

  1. Automated wireless monitoring system for cable tension using smart sensors

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jongwoong; Cho, Soojin; Spencer, Billie F.; Yun, Chung-Bang

    2013-04-01

    Cables are critical load-carrying members of cable-stayed bridges; monitoring the tension forces of the cables provides valuable information for SHM of cable-stayed bridges. Monitoring systems for cable tension can be efficiently realized using wireless smart sensors in conjunction with vibration-based cable tension estimation approaches. This study develops an automated cable tension monitoring system using MEMSIC's Imote2 smart sensors. An embedded data processing strategy is implemented on the Imote2-based wireless sensor network to calculate cable tensions using a vibration-based method, significantly reducing the wireless data transmission and associated power consumption. The autonomous operation of the monitoring system is achieved by AutoMonitor, a high-level coordinator application provided by the Illinois SHM Project Services Toolsuite. The monitoring system also features power harvesting, enabled by solar panels attached to each sensor node, with AutoMonitor for charging control. The proposed wireless system has been deployed on the Jindo Bridge, a cable-stayed bridge located in South Korea. Tension forces are autonomously monitored for 12 cables on the east, land side of the bridge, proving the validity and potential of the presented tension monitoring system for real-world applications.
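
    For a flat taut string (negligible sag and bending stiffness), the vibration-based estimate reduces to T = 4 m L^2 f1^2, with m the cable mass per unit length, L its length and f1 the measured fundamental frequency; this simplified relation, with invented cable values, is sketched below. Practical systems add corrections for sag and bending stiffness.

      def cable_tension_newtons(f1_hz, length_m, mass_per_m):
          """Taut-string estimate: T = 4 * m * L^2 * f1^2."""
          return 4.0 * mass_per_m * length_m**2 * f1_hz**2

      # Illustrative cable: 120 m long, 75 kg/m, fundamental at 1.2 Hz
      print(cable_tension_newtons(1.2, 120.0, 75.0))  # ~6.2e6 N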

  2. AUTOMATED CONTROL AND MONITORING SYSTEM FOR TECHNOLOGICAL PROCESSES IN THE PRODUCTION OF POLYMERIC AND BITUMINOUS TAPES BASED ON A SCADA SYSTEM

    Directory of Open Access Journals (Sweden)

    A. S. Kirienko

    2016-01-01

    The article demonstrates the expediency of using a control and monitoring system for the technological processes of production, which will lower labour expenses and increase productivity through a better production process. The main objective of the system's remote monitoring is to make it possible to assess the current situation in production quickly and remotely, and to take reasonable and timely administrative decisions.

  3. Automated behaviour monitoring in dairy cows

    NARCIS (Netherlands)

    Mol, de R.M.; Verhoeven, P.H.F.M.; Hogewerf, P.H.; Ipema, A.H.

    2011-01-01

    Acceleration sensors in a Wireless Sensor Network (WSN) were used to monitor the behaviour of dairy cows. The data processing from 3D acceleration into behaviour classification (lying, standing or walking) was based on a two-step method: first the distinction between lying and standing/walking was
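
    A minimal Python sketch of such a two-step classifier: first separate lying from standing/walking using the static (gravity) component of a leg-mounted sensor, then split standing from walking by motion energy. The axis convention and thresholds are invented, not the calibrated values from this study.

      import numpy as np

      def classify_behaviour(ax, ay, az):
          """ax, ay, az: windows of 3D acceleration in units of g."""
          # Step 1: posture from orientation - when the leg is horizontal
          # (cow lying), the mean of the along-leg axis drops.
          if np.mean(ax) < 0.5:
              return "lying"
          # Step 2: activity from dynamic acceleration energy.
          dyn = np.sqrt(ax**2 + ay**2 + az**2) - 1.0  # remove gravity magnitude
          return "walking" if np.std(dyn) > 0.1 else "standing"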

  4. Lifecycle, Iteration, and Process Automation with SMS Gateway

    Directory of Open Access Journals (Sweden)

    Fenny Fenny

    2015-12-01

    Full Text Available Producing better-quality software requires an understanding of software-quality indicators through defect detection and automated testing. This paper aims to improve the design and automated testing process for an engine water pump at a drinking water plant. It proposes how software developers can improve the maintainability and reliability of an automated testing system and report abnormal states when an error occurs on the machine. The method uses the literature to explain best practices, together with a case study of a drinking water plant. The paper is expected to provide insights into better error handling and into performing automated testing and monitoring on a machine.

  5. Monitoring of operating processes

    International Nuclear Information System (INIS)

    Barry, R.F.

    1981-01-01

    Apparatus is described for monitoring the processes of a nuclear reactor to detect off-normal operation of any process and for testing the monitoring apparatus. The processes are evaluated by response to their parameters, such as temperature, pressure, etc. The apparatus includes a pair of monitoring paths or signal-processing units. Each unit includes facilities for receiving, on a time-sharing basis, a status binary word made up of digits each indicating the status of a process, whether normal or off-normal, and test-signal binary words simulating the status binary words. The status words and test words are processed in succession during successive cycles. During each cycle, the two units receive the same status word and the same test word. The test words simulate the status words both when they indicate normal operation and when they indicate off-normal operation. Each signal-processing unit includes a pair of memories. Each memory receives a status word or a test word, as the case may be, and converts the received word into a converted status word or a converted test word. The memories of each monitoring unit feed a non-coincidence circuit, which signals when the converted word out of one memory of a signal-processing unit is not identical to the converted word of the other memory of the same unit.

  6. Practical Automated Vulnerability Monitoring Using Program State Invariants

    NARCIS (Netherlands)

    Giuffrida, C.; Cavallaro, L.; Tanenbaum, A.S.

    2013-01-01

    Despite the growing attention to security concerns and advances in code verification tools, many memory errors still escape testing and plague production applications with security vulnerabilities. We present RCORE, an efficient dynamic program monitoring infrastructure to perform automated security vulnerability monitoring.

  7. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus, (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid phase sample preparation media, allows monitoring of the process, and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols, and greatly minimizes use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement will define three key conditions for the sample preparation process. First, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified, by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
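    The trigger logic is simple enough to sketch. The following is a minimal illustration, assuming hypothetical probe-reading, flushing, and stage functions (the brief does not specify an API) and an assumed low-conductivity threshold.

```python
import time

LOW_US_CM = 50.0  # assumed conductivity threshold for the flushed state

def run_protocol(read_outlet_conductivity, stages, flush, poll_s=1.0):
    """Start each processing stage only after a de-ionized-water flush
    has returned the outlet probe to a low-conductivity baseline,
    mirroring the trigger logic described above (names illustrative)."""
    for stage in stages:
        flush()  # wash with excess de-ionized water
        while read_outlet_conductivity() >= LOW_US_CM:
            time.sleep(poll_s)  # wait for the low-conductivity baseline
        stage()  # the low reading triggers the next step in the sequence
```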

  8. Design and implementation of an Internet based effective controlling and monitoring system with wireless fieldbus communications technologies for process automation--an experimental study.

    Science.gov (United States)

    Cetinceviz, Yucel; Bayindir, Ramazan

    2012-05-01

    The network requirements of control systems in industrial applications increase day by day. Internet-based control systems and various fieldbus systems have been designed to meet these requirements. This paper describes an Internet-based control system with wireless fieldbus communication designed for distributed processes. The system was implemented as an experimental setup in a laboratory. In industrial facilities, the process control layer and the remote connection of the distributed control devices at the lowest levels of the industrial production environment are provided by fieldbus networks. In this paper, an Internet-based control system able to meet the system requirements with a new-generation communication structure, called a wired/wireless hybrid system, has been designed at field level and implemented to cover all sectors of distributed automation, from process control to distributed input/output (I/O). The system comprises a hardware structure with a programmable logic controller (PLC), a communication processor (CP) module, two industrial wireless modules, a distributed I/O module and a Motor Protection Package (MPP), and a software structure with the WinCC flexible program used for the SCADA (Supervisory Control and Data Acquisition) screens and the SIMATIC MANAGER package ("STEP7") used for hardware and network configuration and for downloading the control program to the PLC. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  9. An automated system for processing electrodermal activity.

    Science.gov (United States)

    Frantzidis, Christos A; Konstantinidis, Evdokimos; Pappas, Costas; Bamidis, Panagiotis D

    2009-01-01

    A new approach is presented in this paper for the display and processing of electrodermal activity. It offers a fully automated interface for pre-processing and scoring individual skin conductance responses (SCRs). The application supports parallel processing by means of multiple threads. Batch processing is also available. The XML format is used to describe the derived features. The system is employed to analyze emotion-related data.

  10. ARTIP: Automated Radio Telescope Image Processing Pipeline

    Science.gov (United States)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

    The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging for radio-interferometric data. ARTIP starts with raw data, i.e., a measurement set, and goes through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging, to generate continuum and spectral line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts and logs. It is written using standard Python libraries and the CASA package. The pipeline can deal with datasets with multiple spectral windows and multiple target sources which may have arbitrary combinations of flux/bandpass/phase calibrators.
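    As a rough illustration of the kind of sequence such a pipeline automates, a generic CASA calibration-and-imaging script might look as follows; all file and field names are placeholders, and this is a sketch of the standard CASA task flow, not ARTIP's own code.

```python
# Generic CASA sequence (placeholder names); not ARTIP's actual code.
from casatasks import flagdata, setjy, bandpass, gaincal, applycal, tclean

vis = 'target.ms'                                    # raw measurement set
flagdata(vis=vis, mode='tfcrop')                     # automated flagging
setjy(vis=vis, field='flux_cal')                     # set the flux scale
bandpass(vis=vis, caltable='cal.B', field='bp_cal', refant='ea01')
gaincal(vis=vis, caltable='cal.G', field='phase_cal',
        gaintable=['cal.B'], calmode='ap')           # amp/phase gains
applycal(vis=vis, field='target', gaintable=['cal.B', 'cal.G'])
tclean(vis=vis, imagename='target_cont', specmode='mfs')  # continuum image
```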

  11. Automated system for data acquisition and monitoring

    Directory of Open Access Journals (Sweden)

    Borza Sorin

    2017-01-01

    Full Text Available Environmental management has become a very important issue with the development of human society, and multiple systems exist that monitor the environment automatically. In this paper we propose a system that integrates GIS software and data acquisition software, and that additionally implements the AHP multicriteria method, so that the influence of each pollutant on a limited geographical area can be assessed online. Pollutant levels in the monitored area are acquired automatically by specific sensors through an acquisition board and read by LabVIEW virtual instruments, which transfer them into an Access database. From there they are taken up by the Geomedia Professional GIS software and processed using the AHP multicriteria method, so that at any moment their influence on the environment can be ranked and plotted on the monitoring-system screen. The system allows automatic data collection, storage, and the generation of GIS elements. The research presented in this paper was aimed at implementing multicriteria methods in GIS software.
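    The AHP step named above reduces to computing a priority vector from a pairwise-comparison matrix. A minimal sketch of the standard eigenvector method follows; the example matrix is illustrative, not the paper's data.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive criteria weights from an AHP pairwise-comparison matrix
    via its principal eigenvector, normalized to sum to 1."""
    a = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(a)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Illustrative matrix: relative influence of three pollutants.
weights = ahp_weights([[1, 3, 5],
                       [1/3, 1, 2],
                       [1/5, 1/2, 1]])
print(weights)  # weights sum to 1, e.g. ~[0.65, 0.23, 0.12]
```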

  12. Automated System of Diagnostic Monitoring at Bureya HPP Hydraulic Engineering Installations: a New Level of Safety

    Energy Technology Data Exchange (ETDEWEB)

    Musyurka, A. V., E-mail: musyurkaav@burges.rushydro.ru [Bureya HPP (a JSC RusGidro affiliate) (Russian Federation)

    2016-09-15

    This article presents the design, hardware, and software solutions developed and placed in service for the automated system of diagnostic monitoring (ASDM) of hydraulic engineering installations at the Bureya HPP, assuring a reliable process for monitoring those installations. The project's implementation provides a timely solution to the problems addressed by the hydraulic engineering installation diagnostics section.

  13. Wind Turbine Manufacturing Process Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Waseem Faidi; Chris Nafis; Shatil Sinha; Chandra Yerramalli; Anthony Waas; Suresh Advani; John Gangloff; Pavel Simacek

    2012-04-26

    The objective is to develop a practical inline inspection that can be used in combination with automated composite material placement equipment to economically manufacture high-performance and reliable carbon composite wind turbine blade spar caps. The technical feasibility and cost benefit of the approach will be assessed to provide a solid basis for further development and implementation in the wind turbine industry. The program is focused on the following technology development: (1) develop in-line monitoring methods, using optical metrology and ultrasound inspection, and perform an appropriate demonstration in the lab; (2) develop methods to predict composite strength reduction due to defects; and (3) develop process models to predict defects from leading indicators found in the uncured composites.

  14. Automated radiochemical processing for clinical PET

    International Nuclear Information System (INIS)

    Padgett, H.C.; Kingsbury, W.G.

    1990-01-01

    The Siemens RDS 112, an automated radiochemical production and delivery system designed to support a clinical PET program, consists of an 11 MeV, proton-only, negative-ion cyclotron, a shield, a computer, and targetry and chemical processing modules to produce radiochemicals used in PET imaging. The principal clinical PET tracers are [18F]FDG, [13N]ammonia and [15O]water. Automated synthesis of [18F]FDG is achieved using the Chemistry Process Control Unit (CPCU), a general-purpose valve-and-tubing device that emulates manual processes while allowing for competent operator intervention. Using function-based command file software, this pressure-driven synthesis system carries out chemical processing procedures by timing only, without process-based feedback. To date, nine CPCUs have been installed at seven institutions, resulting in 1,200+ syntheses of [18F]FDG, with an average yield of 55% (EOB).

  15. Employee on Boarding Process Automation

    OpenAIRE

    Khushboo Nalband; Priyanka Jadhav; Geetanjali Salunke

    2017-01-01

    On boarding, also known as organizational socialization, plays a vital role in building the initial relationship between an organization and an employee. It also contributes to employees’ satisfaction, better performance and greater organizational commitment, thus increasing employees’ effectiveness and productivity in their roles. Therefore, it is essential that the on boarding process of an organization be efficient and effective to improve new employees’ retention. Generally this on boar...

  16. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for analysis, tools used, the methodology, work performed during the summer, and future work planned.

  17. An automated platform for phytoplankton ecology and aquatic ecosystem monitoring.

    Science.gov (United States)

    Pomati, Francesco; Jokela, Jukka; Simona, Marco; Veronesi, Mauro; Ibelings, Bas W

    2011-11-15

    High quality monitoring data are vital for tracking and understanding the causes of ecosystem change. We present a potentially powerful approach for phytoplankton and aquatic ecosystem monitoring, based on integration of scanning flow-cytometry for the characterization and counting of algal cells with multiparametric vertical water profiling. This approach affords high-frequency data on phytoplankton abundance, functional traits and diversity, coupled with the characterization of environmental conditions for growth over the vertical structure of a deep water body. Data from a pilot study revealed effects of an environmental disturbance event on the phytoplankton community in Lake Lugano (Switzerland), characterized by a reduction in cytometry-based functional diversity and by a period of cyanobacterial dominance. These changes were missed by traditional limnological methods, employed in parallel to high-frequency monitoring. Modeling of phytoplankton functional diversity revealed the importance of integrated spatiotemporal data, including circadian time-lags and variability over the water column, to understand the drivers of diversity and dynamic processes. The approach described represents progress toward an automated and trait-based analysis of phytoplankton natural communities. Streamlining of high-frequency measurements may represent a resource for understanding, modeling and managing aquatic ecosystems under impact of environmental change, yielding insight into processes governing phytoplankton community resistance and resilience.

  18. Automated chemical monitoring in new projects of nuclear power plant units

    Science.gov (United States)

    Lobanok, O. I.; Fedoseev, M. V.

    2013-07-01

    The development of automated chemical monitoring systems in nuclear power plant units for the past 30 years is briefly described. The modern level of facilities used to support the operation of automated chemical monitoring systems in Russia and abroad is shown. Hardware solutions suggested by the All-Russia Institute for Nuclear Power Plant Operation (which is the General Designer of automated process control systems for power units used in the AES-2006 and VVER-TOI Projects) are presented, including the structure of additional equipment for monitoring water chemistry (taking the Novovoronezh 2 nuclear power plant as an example). It is shown that the solutions proposed with respect to receiving and processing of input measurement signals and subsequent construction of standard control loops are unified in nature. Simultaneous receipt of information from different sources for ensuring that water chemistry is monitored in sufficient scope and with required promptness is one of the problems that have been solved successfully. It is pointed out that improved quality of automated chemical monitoring can be supported by organizing full engineering follow-up of the automated chemical monitoring system's equipment throughout its entire service life.

  19. Monitoring bivariate process

    Directory of Open Access Journals (Sweden)

    Marcela A. G. Machado

    2009-12-01

    Full Text Available The T² chart and the generalized variance |S| chart are the usual tools for monitoring the mean vector and the covariance matrix of multivariate processes. The main drawback of these charts is the difficulty of obtaining and interpreting the values of their monitoring statistics. In this paper, we study control charts for monitoring bivariate processes that require only the computation of sample means (the ZMAX chart) for monitoring the mean vector, sample variances (the VMAX chart) for monitoring the covariance matrix, or both sample means and sample variances (the MCMAX chart) in the case of the joint control of the mean vector and the covariance matrix.
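    As a worked illustration of how simple the VMAX statistic is compared with |S|, the sketch below computes it on consecutive subgroups; control limits, which come from the chart's design, are not shown.

```python
import numpy as np

def vmax_points(x, y, n):
    """VMAX statistic per subgroup: the larger of the two sample
    variances of a bivariate process, on consecutive subgroups of
    size n (a point signals when it exceeds the chart's control limit)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    stats = []
    for i in range(len(x) // n):
        sx2 = np.var(x[i*n:(i+1)*n], ddof=1)   # sample variance of X
        sy2 = np.var(y[i*n:(i+1)*n], ddof=1)   # sample variance of Y
        stats.append(max(sx2, sy2))
    return np.array(stats)
```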

  20. Automated sampling and data processing derived from biomimetic membranes

    International Nuclear Information System (INIS)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H; Boesen, T P; Emneus, J

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  1. Means of storage and automated monitoring of versions of text technical documentation

    Science.gov (United States)

    Leonovets, S. A.; Shukalov, A. V.; Zharinov, I. O.

    2018-03-01

    The paper considers automation of the processes of preparing, storing and monitoring versions of text design and program documentation by means of specialized software. Automated preparation of documentation is based on processing the engineering data contained in specifications and technical documentation. Data handling assumes strictly structured electronic documents prepared in widespread formats from templates based on industry standards, and automated generation of the program or design text document. The further life cycle of the document and the engineering data it contains are controlled, with archival data storage at each stage of the life cycle. Performance measurements for the use of different widespread document formats under automated monitoring and storage are given. The newly developed software and the workbenches available to the developer of instrumentation equipment are described.

  2. Methodology for monitoring and automated diagnosis of ball bearings using paraconsistent logic, wavelet transform and digital signal processing

    Energy Technology Data Exchange (ETDEWEB)

    Masotti, Paulo Henrique Ferraz

    2006-07-01

    The monitoring and diagnosis area has developed impressively in recent years, with the introduction of new diagnosis techniques and the use of computers to process information and run diagnostic methods. The contribution of artificial intelligence to the automation of defect diagnosis is developing continually, and the growing automation in industry meets these new techniques. In the nuclear area, growing concern with facility safety requires more effective techniques, which have been sought to increase the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow verification of their operational conditions; the present work can collaborate in this area by helping to diagnose the operational condition of machines. This work presents a new feature-extraction technique based on the Zero Crossing of the Wavelet Transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work is the Paraconsistent Logic of Annotation with Two Values (LPA2v), contributing to the automation of defect diagnosis, because this logic can deal with the contradictory results that feature-extraction techniques can present. The work also concentrated on identifying defects in their initial phase, using accelerometers, because they are robust, low-cost sensors that can easily be found in industry in general. The results, obtained using an experimental database, show that defect diagnoses give good results for defects in their initial phase. (author)
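    A simplified reading of the zero-crossing feature extraction is sketched below: decompose a vibration signal with the wavelet transform and count zero crossings per sub-band. The wavelet choice and decomposition level are assumptions, not the thesis's parameters.

```python
import numpy as np
import pywt  # PyWavelets

def zero_crossing_features(signal, wavelet='db4', level=4):
    """Count zero crossings in each wavelet sub-band of a vibration
    signal, yielding one feature per band for defect classification."""
    coeffs = pywt.wavedec(np.asarray(signal, float), wavelet, level=level)
    # A zero crossing occurs wherever consecutive samples change sign.
    return [int(np.sum(np.diff(np.signbit(c).astype(int)) != 0))
            for c in coeffs]
```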

  3. Automated Irrigation System for Greenhouse Monitoring

    Science.gov (United States)

    Sivagami, A.; Hareeshvare, U.; Maheshwar, S.; Venkatachalapathy, V. S. K.

    2018-03-01

    The continuous requirement for food demands rapid improvement in food production technology. The economics of food production depend mainly on agriculture and on weather conditions, which cannot be controlled, so the available agricultural resources are not fully utilized. The main reasons are deficient rainfall and scarce land-reservoir water; continuous withdrawal of water from the ground reduces the water level, causing much of the land to become arid. In cultivation, the use of an appropriate method of irrigation plays a vital role. Drip irrigation is a renowned methodology that is very economical and proficient. When a conventional drip irrigation system is used, the farmer has to follow an irrigation timetable, which differs from crop to crop. The current work makes the drip irrigation system an automated one, so the farmer does not need to follow any timetable: a sensor senses the soil moisture content and the system supplies water accordingly. Moreover, the use of economical sensors and simple circuitry makes this project an inexpensive product, which can be bought even by an underprivileged farmer. The current project is best suited for places where water is limited and has to be used in limited quantity.

  4. Automated TLD system for gamma radiation monitoring

    International Nuclear Information System (INIS)

    Nyberg, P.C.; Ott, J.D.; Edmonds, C.M.; Hopper, J.L.

    1979-01-01

    A gamma radiation monitoring system utilizing a commercially available TLD reader and unique microcomputer control has been built to assess the external radiation exposure to the resident population near a nuclear weapons testing facility. Maximum use of the microcomputer was made to increase the efficiency of data acquisition, transmission, and preparation, and to reduce operational costs. The system was tested for conformance with an applicable national standard for TLDs used in environmental measurements.

  6. Cluster processing business level monitor

    International Nuclear Information System (INIS)

    Muniz, Francisco J.

    2017-01-01

    This article describes a cluster processing monitor. Several applications with this functionality can be found freely through a Google search; however, those applications may offer more features than the proposed processing monitor needs, making their output difficult for the user to evaluate at a glance. In addition, such monitors may add unnecessary processing cost to the cluster. For these reasons, a completely new cluster processing monitor module was designed and implemented. At CDTN, clusters are broadly used, mainly for deterministic methods (CFD) and non-deterministic methods (Monte Carlo). (author)

  7. Evaluation of new and conventional thermoluminescent phosphors for environmental monitoring using automated thermoluminescent dosimeter readers

    International Nuclear Information System (INIS)

    Rathbone, B.A.; Endres, A.W.; Antonio, E.J.

    1994-01-01

    In recent years there has been considerable interest in a new generation of super-sensitive thermoluminescent (TL) phosphors for potential use in routine personnel and environmental monitoring. Two of these phosphors are evaluated in this paper for selected characteristics relevant to environmental monitoring, along with two conventional phosphors widely used in environmental monitoring. The characteristics evaluated are light-induced fading, light-induced background, linearity and variability at low dose, and the minimum measurable dose. These characteristics were determined using an automated commercial dosimetry system and routine processing protocols. Annealing and readout protocols for each phosphor were optimized for use in a large-scale environmental monitoring program

  8. Automating slope monitoring in mines with terrestrial lidar scanners

    Science.gov (United States)

    Conforti, Dario

    2014-05-01

    Static terrestrial laser scanners (TLS) have been an important component of slope monitoring for some time, and many solutions for monitoring the progress of a slide have been devised over the years. However, all of these solutions have required users to operate the lidar equipment in the field, creating a high cost in time and resources, especially if the surveys must be performed very frequently. This paper presents a new solution for monitoring slides, developed using a TLS and an automated data acquisition, processing and analysis system. In this solution, a TLS is permanently mounted within sight of the target surface and connected to a control computer. The control software on the computer automatically triggers surveys according to a user-defined schedule, parses data into point clouds, and compares data against a baseline. The software can base the comparison on either the original survey of the site or the most recent survey, depending on whether the operator needs to measure the total or recent movement of the slide. If the displacement exceeds a user-defined safety threshold, the control computer transmits alerts via SMS text messaging and/or email, including graphs and tables describing the nature and size of the displacement. The solution can also be configured to trigger external visual/audio alarm systems. If the survey areas contain high-traffic zones such as roads, the operator can mark them for exclusion in the comparison to prevent false alarms. To improve usability and safety, the control computer can connect to a local intranet and allow remote access through the software's web portal. This enables operators to perform most tasks with the TLS from their office, including reviewing displacement reports, downloading survey data, and adjusting the scan schedule. This solution has proved invaluable in automatically detecting and alerting users to potential danger within the monitored areas while lowering the cost and work required for slope monitoring.

  9. Process development for automated solar cell and module production. Task 4: automated array assembly

    Energy Technology Data Exchange (ETDEWEB)

    Hagerty, J.J.

    1980-06-30

    The scope of work under this contract involves specifying a process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use. This process sequence is then critically analyzed from a technical and economic standpoint to determine the technological readiness of each process step for implementation. The process steps are ranked according to the degree of development effort required and according to their significance to the overall process. Under this contract the steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development. Economic analysis using the SAMICS system has been performed during these studies to assure that development efforts have been directed towards the ultimate goal of price reduction. Details are given. (WHK)

  10. D-MSR: a distributed network management scheme for real-time monitoring and process control applications in wireless industrial automation.

    Science.gov (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-06-27

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. According to our knowledge, this is the first distributed management scheme based on IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.

  11. Default mode contributions to automated information processing.

    Science.gov (United States)

    Vatansever, Deniz; Menon, David K; Stamatakis, Emmanuel A

    2017-11-28

    Concurrent with mental processes that require rigorous computation and control, a series of automated decisions and actions govern our daily lives, providing efficient and adaptive responses to environmental demands. Using a cognitive flexibility task, we show that a set of brain regions collectively known as the default mode network plays a crucial role in such "autopilot" behavior, i.e., when rapidly selecting appropriate responses under predictable behavioral contexts. While applying learned rules, the default mode network shows both greater activity and connectivity. Furthermore, functional interactions between this network and hippocampal and parahippocampal areas as well as primary visual cortex correlate with the speed of accurate responses. These findings indicate a memory-based "autopilot role" for the default mode network, which may have important implications for our current understanding of healthy and adaptive brain processing.

  12. Wellhead monitors automate Lake Maracaibo gas lift

    Energy Technology Data Exchange (ETDEWEB)

    Adjunta, J.C. (Maraven S.A., Lagunillas (Venezuela)); Majek, A. (Texas Electronic Resources, Houston, TX (United States))

    1994-11-28

    High-performance personal computer (PC) and intelligent remote terminal unit (IRTU) technology have optimized the remote control of gas lift injection and surveillance of over 1,000 offshore production wells at Lake Maracaibo in Venezuela. In its 3-year program, Maraven expects a 27,000 b/d increase in oil production by reducing deferred production and optimizing gas lift injection by as much as 20%. In addition, real time data on well performance will enhance production management as well as allocation of operational and maintenance resources. The remote control system consists of a solar-powered wellhead monitor (WHM) installed on each well platform. At each flow gathering station within a 2-mile range of a family of wells, a host terminal unit polls and stores the well data with low power, 250-mw radios. From a remote location, 60 miles onshore, an operator interface polls the host units for real time data with 5-watt radios operating in the 900-megahertz band. The paper describes the design, optimization, telemetry management, and selection of a single vendor for this system. The economic impact of this system to Maraven is also discussed.

  13. Automated Processing Workflow for Ambient Seismic Recordings

    Science.gov (United States)

    Girard, A. J.; Shragge, J.

    2017-12-01

    Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface wave energy. In a number of situations body-wave energy has been extracted successfully; however, (nearly) all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly on non-correlated ambient data that has been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) to enhance body-wave arrivals. Following this approach, though, requires a significant preprocessing effort on often Terabytes of ambient seismic data, which is expensive and requires automation to be a feasible approach. In this work we outline an automated processing workflow designed to optimize body wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates - and the emergence of reflection events - in the cross-correlation plus stack waveforms over hour-long windows. Overall, the QC analyses suggest that
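    One common form of the burst-rejection step described above is an RMS-amplitude screen over fixed windows; the sketch below shows that idea, with the window length and threshold factor as assumptions rather than the authors' settings.

```python
import numpy as np

def keep_quiet_windows(trace, fs, win_s=60.0, k=3.0):
    """Split a continuous record into fixed windows and drop those
    whose RMS amplitude exceeds k times the median window RMS,
    rejecting burst-like noise before correlation or stacking."""
    n = int(win_s * fs)
    windows = [trace[i:i + n] for i in range(0, len(trace) - n + 1, n)]
    rms = np.array([np.sqrt(np.mean(w ** 2)) for w in windows])
    return [w for w, r in zip(windows, rms) if r < k * np.median(rms)]
```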

  14. Automated inundation monitoring using TerraSAR-X multitemporal imagery

    Science.gov (United States)

    Gebhardt, S.; Huth, J.; Wehrmann, T.; Schettler, I.; Künzer, C.; Schmidt, M.; Dech, S.

    2009-04-01

    The Mekong Delta in Vietnam offers natural resources for several million inhabitants. However, a strong population increase, changing climatic conditions and regulatory measures at the upper reaches of the Mekong lead to severe changes in the Delta. Extreme flood events occur more frequently, drinking water availability is increasingly limited, soils show signs of salinization or acidification, and species and complete habitats diminish. During the monsoon season the river regularly overflows its banks in the lower Mekong area, usually with beneficial effects. However, extreme flood events causing extensive damage occur more frequently; on average once every 6 to 10 years, river flood levels exceed the critical beneficial level. X-band SAR data are well suited for deriving inundated surface areas, and the TerraSAR-X sensor with its different scanning modes allows the derivation of spatially and temporally highly resolved inundation masks. The paper presents an automated procedure for deriving inundated areas from TerraSAR-X ScanSAR and Stripmap image data. Within the framework of the German-Vietnamese WISDOM project, focussing on the Mekong Delta region in Vietnam, images have been acquired covering the flood season from June 2008 to November 2008. Based on these images, a time series of so-called water masks showing inundated areas has been derived. The product is required as an intermediate to (i) calibrate 2D inundation model scenarios, (ii) estimate the extent of affected areas, and (iii) analyze the scope of prior crises. The image processing approach is based on the assumption that water surfaces forward-scatter the radar signal, resulting in low backscatter signals at the sensor. It uses multiple grey-level thresholds and image morphological operations. The approach performs well in terms of automation, accuracy, robustness, and processing time. The resulting water masks show the seasonal flooding pattern, with inundations starting in July, having their peak at the end
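    The thresholding-plus-morphology idea is easy to sketch. The following minimal version uses a single grey-level threshold and a size filter (the paper uses multiple thresholds; the values here are illustrative).

```python
import numpy as np
from scipy import ndimage

def water_mask(backscatter_db, threshold_db=-15.0, min_pixels=50):
    """Classify low-backscatter pixels as water, despeckle the mask
    morphologically, and drop regions smaller than min_pixels."""
    mask = ndimage.binary_opening(backscatter_db < threshold_db)
    labels, n = ndimage.label(mask)                 # connected regions
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    big = np.nonzero(sizes >= min_pixels)[0] + 1    # labels to keep
    return np.isin(labels, big)
```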

  15. Integrated system for automated financial document processing

    Science.gov (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.

  16. Automation in irrigation process in family farm with Arduino platform

    Directory of Open Access Journals (Sweden)

    Kianne Crystie Bezerra da Cunha

    2016-03-01

    Full Text Available The small farmers tend not to use mechanical inputs in the irrigation process due to the high cost than conventional irrigation systems have and in other cases, the lack of knowledge and technical guidance makes the farmer theme using the system. Thus, all control and monitoring are made by hand without the aid of machines and this practice can lead to numerous problems from poor irrigation, and water waste, energy, and deficits in production. It is difficult to deduce when to irrigate, or how much water applied in cultivation, measure the soil temperature variables, temperature, and humidity, etc. The objective of this work is to implement an automated irrigation system aimed at family farming that is low cost and accessible to the farmer. The system will be able to monitor all parameters from irrigation. For this to occur, the key characteristics of family farming, Arduino platform, and irrigation were analyzed.

  17. G-Cloud Monitor: A Cloud Monitoring System for Factory Automation for Sustainable Green Computing

    Directory of Open Access Journals (Sweden)

    Hwa-Young Jeong

    2014-11-01

    Full Text Available Green and cloud computing (G-cloud) are new trends in all areas of computing. The G-cloud provides an efficient function which enables users to access their programs, systems and platforms at any time and place. Green computing can also yield greener technology by reducing power consumption for sustainable environments. Furthermore, in order to reflect user needs in system development, user characteristics are regarded as some of the most important factors to be considered in product industries. In this paper, we propose a cloud monitoring system to observe and manage manufacturing systems/factory automation for sustainable green computing. For the monitoring system, we utilized resources in the G-cloud environment, and hence it can reduce the amount of system resources and devices, such as system power and processes. In addition, we propose adding a user profile to the monitoring system in order to provide a user-friendly function. That is, this function allows system configurations to be automatically matched to the individual’s requirements, thus increasing efficiency.

  18. Semisupervised Gaussian Process for Automated Enzyme Search.

    Science.gov (United States)

    Mellor, Joseph; Grigoras, Ioana; Carbonell, Pablo; Faulon, Jean-Loup

    2016-06-17

    Synthetic biology is today harnessing the design of novel and greener biosynthesis routes for the production of added-value chemicals and natural products. The design of novel pathways often requires a detailed selection of enzyme sequences to import into the chassis at each of the reaction steps. To address such design requirements in an automated way, we present here a tool for exploring the space of enzymatic reactions. Given a reaction and an enzyme, the tool provides a probability estimate that the enzyme catalyzes the reaction. Our tool first considers the similarity of a reaction to known biochemical reactions with respect to signatures around their reaction centers. Signatures are defined based on chemical transformation rules by using extended connectivity fingerprint descriptors. A semisupervised Gaussian process model associated with the similar known reactions then provides the probability estimate. The Gaussian process model uses information about both the reaction and the enzyme in providing the estimate. These estimates were validated experimentally by the application of the Gaussian process model to a newly identified metabolite in Escherichia coli in order to search for the enzymes catalyzing its associated reactions. Furthermore, we show with several pathway design examples how the ability to assign probability estimates to enzymatic reactions can assist in bioengineering applications, providing experimental validation of our proposed approach. To the best of our knowledge, the proposed approach is the first application of Gaussian processes dealing with biological sequences and chemicals; the use of a semisupervised Gaussian process framework is also novel in the context of machine learning applied to bioinformatics. However, the ability of an enzyme to catalyze a reaction depends on the affinity between the substrates of the reaction and the enzyme. This affinity is generally quantified by the Michaelis constant KM
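    For readers unfamiliar with Gaussian process classification, the toy sketch below shows the supervised core of such a model on random stand-in features; the paper's extended-connectivity fingerprints and its semisupervised extension are not reproduced here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.random((40, 16))            # stand-in reaction+enzyme descriptors
y = rng.integers(0, 2, 40)          # 1 = enzyme catalyzes the reaction

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
gpc.fit(X, y)
print(gpc.predict_proba(X[:3]))     # probability estimates per pair
```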

  19. Managing Automation: A Process, Not a Project.

    Science.gov (United States)

    Hoffmann, Ellen

    1988-01-01

    Discussion of issues in management of library automation includes: (1) hardware, including systems growth and contracts; (2) software changes, vendor relations, local systems, and microcomputer software; (3) item and authority databases; (4) automation and library staff, organizational structure, and managing change; and (5) environmental issues,…

  20. Automation of film badge monitoring in the GDR

    International Nuclear Information System (INIS)

    Pohl, K.P.; Krey, W.; Sowa, W.; Klucke, H.; Scheler, R.

    1985-01-01

    An automated procedure for central film badge processing in the National Board of Nuclear Safety and Radiation Protection (SAAS) is presented. The main unit is a K 1620 computer connected with the remote working place, on the one hand, via an SKR-CAMAC crate controller and a link connection with an AMCA computer controlling the measuring process and, on the other hand, via an IFSS interface with a K 8911 terminal with printer (hard copy). (author)

  2. Unattended reaction monitoring using an automated microfluidic sampler and on-line liquid chromatography.

    Science.gov (United States)

    Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve

    2018-04-03

    In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, the sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, there can be variability introduced by different analysts, and in some cases, the integrity of the sample can be compromised during handling. While there are commercial instruments available for on-line monitoring with HPLC, they lack capabilities in many key areas. Some do not provide integration of the sampling and analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations available for sample processing and no option for workflow customizability. This work describes development of a microfluidic automated program (MAP) which fully automates the sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel based user interface. The autonomous system is capable of unattended reaction monitoring that allows flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. AUTOMATION OF CHAMPAGNE WINES PROCESS IN SPARKLING WINE PRESSURE TANK

    Directory of Open Access Journals (Sweden)

    E. V. Lukyanchuk

    2016-08-01

    Full Text Available The wine industry has successfully solved the automation of grape receiving points, of crushing and pressing departments, of continuously operating fermentation installations, of blending tanks, of production lines for ordinary Madeira, of continuously operating plants for ethyl alcohol, and of champagne production in continuous flow. With technological progress, automation of the winemaking process is developing in the following directions: complex automation of grape-processing sites with bulk transportation of the grapes; improving the quality and durability of wines by widely applying cold and heat treatment of wine, as well as technical and microbiological control with powerful automation equipment; introducing automated continuous production processes for champagne, sherry and Madeira wines and cognac alcohol; complex automation of auxiliary production sites (boilers, air conditioners, refrigeration units, and others); and complex automation of wine-bottling enterprises and sites. More sophisticated automation schemes and devices are being developed in the wine industry that enable the transition to integrated production automation and the creation of automated enterprises, with laboratories serving to study the main problems of automating winemaking production processes.

  4. Automated Student Aid Processing: The Challenge and Opportunity.

    Science.gov (United States)

    St. John, Edward P.

    1985-01-01

    To utilize automated technology for student aid processing, it is necessary to work with multi-institutional offices (student aid, admissions, registration, and business) and to develop automated interfaces with external processing systems at state and federal agencies and perhaps at need-analysis organizations and lenders. (MLW)

  5. Problems and Challenges of Automating Cataloguing Process at ...

    African Journals Online (AJOL)

    This paper discusses the problems faced by Kenneth Dike Library in automating its cataloguing process since 1992. It further attempts to identify some of the constraints inhibiting the success of the process: inadequate funding, dearth of systems analysts, absence of dedicated commitment to automation on the part of the ...

  6. Automated testing of arrhythmia monitors using annotated databases.

    Science.gov (United States)

    Elghazzawi, Z; Murray, W; Porter, M; Ezekiel, E; Goodall, M; Staats, S; Geheb, F

    1992-01-01

    Arrhythmia-algorithm performance is typically tested using the AHA and MIT/BIH databases. The tools for this test are simulation software programs. While these simulations provide rapid results, they neglect hardware and software effects in the monitor. To provide a more accurate measure of performance in the actual monitor, a system has been developed for automated arrhythmia testing. The testing system incorporates an IBM-compatible personal computer, a digital-to-analog converter, an RS232 board, a patient-simulator interface to the monitor, and a multi-tasking software package for data conversion and communication with the monitor. This system "plays" patient data files into the monitor and saves beat classifications in detection files. Tests were performed using the MIT/BIH and AHA databases. Statistics were generated by comparing the detection files with the annotation files. These statistics were marginally different from those that resulted from the simulation. Differences were then examined. As expected, the differences were related to monitor hardware effects.
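    The comparison of detection files against annotation files reduces to matching events within a tolerance. A minimal scorer is sketched below; the 150 ms matching tolerance is a common choice for beat comparison, assumed rather than taken from the paper.

```python
def beat_stats(annotations, detections, tol_s=0.15):
    """Match detected beat times to annotated beat times within a
    tolerance window and report sensitivity and positive predictivity."""
    ann, det = sorted(annotations), sorted(detections)
    tp, j = 0, 0
    for a in ann:
        while j < len(det) and det[j] < a - tol_s:
            j += 1                      # skip detections too early
        if j < len(det) and abs(det[j] - a) <= tol_s:
            tp += 1                     # matched annotation/detection pair
            j += 1
    sens = tp / len(ann) if ann else 0.0
    ppv = tp / len(det) if det else 0.0
    return sens, ppv
```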

  7. Automate The Tax Levy Process (Taxy)

    Data.gov (United States)

    Social Security Administration — This data store contains information to support the automation of Tax Levy payments. Data includes but is not limited to Title II benefits adjustment data, as well...

  8. Automation of data processing | G | African Journal of Range and ...

    African Journals Online (AJOL)

    Data processing can be time-consuming when experiments with advanced designs are employed. This, coupled with a shortage of research workers, necessitates automation. It is suggested that with automation the first step is to determine how the data must be analysed. The second step is to determine what programmes ...

  9. Tools for automated acoustic monitoring within the R package monitoR

    Science.gov (United States)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program, including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe the typical workflow when using the tools in monitoR, which follows a generic sequence of functions, with the option of either binary point matching or spectrogram cross-correlation detectors.

  10. Process Monitoring for Nuclear Safeguards

    International Nuclear Information System (INIS)

    Ehinger, Michael H.; Pomeroy, George D.; Budlong-Sylvester, Kory W.

    2009-01-01

    Process Monitoring has long been used to evaluate industrial processes and operating conditions in nuclear and non-nuclear facilities. In nuclear applications there is a recognized need to demonstrate the safeguards benefits from using advanced process monitoring on spent fuel reprocessing technologies and associated facilities, as a complement to nuclear materials accounting. This can be accomplished by: defining credible diversion pathway scenarios as a sample problem; using advanced sensor and data analysis techniques to illustrate detection capabilities; and formulating 'event detection' methodologies as a means to quantify performance of the safeguards system. Over the past 30 years there have been rapid advances and improvement in the technology associated with monitoring and control of industrial processes. In the context of bulk handling facilities that process nuclear materials, modern technology can provide more timely information on the location and movement of nuclear material to help develop more effective safeguards. For international safeguards, inspection means verification of material balance data as reported by the operator through the State to the international inspectorate agency. This verification recognizes that the State may be in collusion with the operator to hide clandestine activities, potentially during abnormal process conditions with falsification of data to mask the removal. Records provided may show material is accounted for even though a removal occurred. Process monitoring can offer additional fidelity during a wide variety of operating conditions to help verify the declaration or identify possible diversions. The challenge is how to use modern technology for process monitoring and control in a proprietary operating environment subject to safeguards inspectorate or other regulatory oversight. Under the U.S. National Nuclear Security Administration's Next Generation Safeguards Initiative, a range of potential safeguards applications

  11. More steps towards process automation for optical fabrication

    Science.gov (United States)

    Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina

    2017-06-01

    In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.

  12. An overview of the Environmental Monitoring Computer Automation Project

    International Nuclear Information System (INIS)

    Johnson, S.M.; Lorenz, R.

    1992-01-01

    The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, analytical methods and results obtained are important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS

  13. OCT monitoring of pathophysiological processes

    Science.gov (United States)

    Gladkova, Natalia D.; Shakhova, Natalia M.; Shakhov, Andrei; Petrova, Galina P.; Zagainova, Elena; Snopova, Ludmila; Kuznetzova, Irina N.; Chumakov, Yuri; Feldchtein, Felix I.; Gelikonov, Valentin M.; Gelikonov, Grigory V.; Kamensky, Vladislav A.; Kuranov, Roman V.; Sergeev, Alexander M.

    1999-04-01

    Based on the results of clinical examination of about 200 patients, we discuss the capabilities of optical coherence tomography (OCT) in monitoring and diagnosing various pathophysiological processes. Performed in several clinical areas including dermatology, urology, laryngology, gynecology, and dentistry, our study shows the existence of common optical features in the manifestation of a pathophysiological process in different organs. In this paper we focus on such universal tomographic optical signs for the processes of inflammation, necrosis and tumor growth. We also present data on dynamic OCT monitoring of the evolution of pathophysiological processes, both at the stage of disease development and in following up the results of different treatments such as drug application, radiation therapy, cryodestruction, and laser vaporization. The discovered peculiarities of OCT images for structural and functional imaging of biological tissues can serve as a basis for applying this method to the diagnosis of pathology, guidance of treatment, estimation of its adequacy, and assessment of the healing process.

  14. Automated radon-thoron monitoring for earthquake prediction research

    International Nuclear Information System (INIS)

    Shapiro, M.H.; Melvin, J.D.; Copping, N.A.; Tombrello, T.A.; Whitcomb, J.H.

    1980-01-01

    This paper describes an automated instrument for earthquake prediction research which monitors the emission of radon ( 222 Rn) and thoron ( 220 Rn) from rock. The instrument uses aerosol filtration techniques and beta counting to determine radon and thoron levels. Data from the first year of operation of a field prototype suggest an annual cycle in the radon level at the site which is related to thermoelastic strains in the crust. Two anomalous increases in the radon level of short duration have been observed during the first year of operation. One anomaly appears to have been a precursor for a nearby earthquake (2.8 magnitude, Richter scale), and the other may have been associated with changing hydrological conditions resulting from heavy rainfall
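
    A minimal sketch of the kind of anomaly screening such an instrument's data invites: flag days whose count rate departs from a running baseline by several standard deviations. The window length and threshold below are illustrative assumptions, not the authors' method.

    ```python
    import numpy as np

    def radon_anomalies(daily_counts, window=30, nsigma=3.0):
        """Return indices of days whose radon count rate departs from the
        mean of the preceding `window` days by more than `nsigma` standard
        deviations (candidate short-duration anomalies)."""
        x = np.asarray(daily_counts, dtype=float)
        flagged = []
        for i in range(window, x.size):
            base = x[i - window:i]
            mu, sigma = base.mean(), base.std(ddof=1)
            if sigma > 0 and abs(x[i] - mu) > nsigma * sigma:
                flagged.append(i)
        return flagged
    ```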

  15. Process computers automate CERN power supply installations

    CERN Document Server

    Ullrich, H

    1974-01-01

    Computerized automation systems are being used at CERN, Geneva, to improve the capacity, operational reliability and flexibility of the power supply installations for main ring magnets in the experimental zones of particle accelerators. A detailed account of the technological problem involved is followed in the article by a description of the system configuration, the program system and field experience already gathered in similar schemes. (1 refs).

  16. Automation in a material processing/storage facility

    International Nuclear Information System (INIS)

    Peterson, K.; Gordon, J.

    1997-01-01

    The Savannah River Site (SRS) is currently developing a new facility, the Actinide Packaging and Storage Facility (APSF), to process and store legacy materials from the United States nuclear stockpile. A variety of materials, with a variety of properties, packaging and handling/storage requirements, will be processed and stored at the facility. Since these materials are hazardous and radioactive, automation will be used to minimize worker exposure. Other benefits derived from automation of the facility include increased throughput capacity and enhanced security. The diversity of materials and packaging geometries to be handled poses challenges to the automation of facility processes. In addition, the nature of the materials to be processed underscores the need for safety, reliability and serviceability. The application of automation in this facility must, therefore, be accomplished in a rational and disciplined manner to satisfy the strict operational requirements of the facility. Among the functions to be automated are the transport of containers between process and storage areas via an Automatic Guided Vehicle (AGV), and various processes in the Shipping Package Unpackaging (SPU) area, the Accountability Measurements (AM) area, the Special Isotope Storage (SIS) vault and the Special Nuclear Materials (SNM) vault. Other areas of the facility are also being automated, but are outside the scope of this paper

  17. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new...

  18. Monitoring and control of the Rossendorf research reactor using a microcomputerized automation system

    International Nuclear Information System (INIS)

    Baldeweg, F.; Enkelmann, W.; Klebau, J.

    1982-01-01

    A decentralized hierarchical information system (HIS) is presented, which has been developed for monitoring and control of the Rossendorf Research Reactor RFR but may also be considered a prototype of a digital automation system (AS) for use in power stations. The functions integrated in the HIS are as follows: process monitoring, process control, and use of a specialized industrial robot to control charging and discharging of the materials to be irradiated. The AS is realized on the basis of the process computer system PRA 30 (A 6492) developed in the GDR, comprising a K 1630 computer and the intelligent process terminals ursadat 5000 connected by a fast serial interface (IFLS). (author)

  19. Automated radiation monitoring at the Russian shipyards Atomflot and Polyarninski

    Energy Technology Data Exchange (ETDEWEB)

    Sidhu, R.S. [Inst. for Energy Technology, Kjeller (Norway); Endregard, M. [Norwegian Defence Research Establishment, Kjeller (Norway); Moskowitz, P.D. [Brookhaven National Laboratory, Upton, NY (United States); Sanders, J. [US Dept. of Defense, Washington, DC (United States); Bursuk, V. [Russian Navy, Moscow (Russian Federation); Kuzmin, V. [Polyarninsky shipyard, Polyarny (Russian Federation); Gavrilov, S.; Kisselev, V. [Russian Academy of Sciences, Moscow (Russian Federation). Nuclear Safety Inst.

    2005-09-15

    The increased rate of decommissioning and dismantling of Russian nuclear submarines has created the need for improved radioactive waste management and radiation monitoring in Northwest Russia. The Arctic Military Environmental Cooperation (AMEC) programme addresses these needs. AMEC is a cooperative effort between military establishments of the Russian Federation, United States, Norway and the United Kingdom to reduce potential environmental threats from military installations and activities in the Arctic and enhancing the environmental security in the region. Results from the AMEC Nuclear Safety project AMEC 1.5-1 are presented. The goal of this project is to enhance the ability of the Russian Navy to effectively and safely perform radioecological monitoring at selected facilities for dismantling of nuclear submarines and handling and disposition of spent nuclear fuel and radioactive waste. This has been accomplished by development of an automated and centralised radiological surveillance system based on the Norwegian software package PICASSO. The system has successfully been installed and is in regular operation at Atomflot. The second installation of the PICASSO system is at the Naval shipyard FSUE 10 SRZ in Polyarny, northwest of Murmansk. The installation was initiated in October 2004 and will be completed in the fall of 2005.

  20. Monitoring activities of satellite data processing services in real-time with SDDS Live Monitor

    Directory of Open Access Journals (Sweden)

    Duc Nguyen Minh

    2017-01-01

    Full Text Available This work describes Live Monitor, the monitoring subsystem of SDDS – an automated system for space experiment data processing, storage, and distribution created at SINP MSU. Live Monitor allows operators and developers of satellite data centers to identify errors occurred in data processing quickly and to prevent further consequences caused by the errors. All activities of the whole data processing cycle are illustrated via a web interface in real-time. Notification messages are delivered to responsible people via emails and Telegram messenger service. The flexible monitoring mechanism implemented in Live Monitor allows us to dynamically change and control events being shown on the web interface on our demands. Physicists, whose space weather analysis models are functioning upon satellite data provided by SDDS, can use the developed RESTful API to monitor their own events and deliver customized notification messages by their needs.
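
    For illustration, a client of a RESTful monitoring API of this kind might register a custom event as sketched below. The endpoint path, payload fields, and token scheme are hypothetical, since the SDDS API itself is not specified in the record.

    ```python
    import requests

    def post_event(base_url, token, source, level, message):
        """Register a custom processing event with the monitoring service
        so it appears on the web interface and fans out notifications.
        Endpoint route, payload fields, and auth scheme are assumed."""
        resp = requests.post(
            f"{base_url}/api/events",                      # assumed route
            headers={"Authorization": f"Bearer {token}"},  # assumed auth
            json={"source": source, "level": level, "message": message},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    # Example: a space-weather model flags a telemetry gap.
    # post_event("https://sdds.example", "TOKEN", "model-A", "warning",
    #            "no new satellite telemetry for 2 h")
    ```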

  1. Mechanization and automation of production processes in turbine building

    Science.gov (United States)

    Slobodyanyuk, V. P.

    1984-02-01

    Specialists at the All-Union Institute of Planning and Technology of Energy Machine Building are working on the problem of mechanization and automation of production processes. One of the major technological processes being worked on is the production of welded units. At present the Institute has designed a centralized cutting and manufacturing shop in use at several metallurgical plants, clamping devices for materials hoists based on permanent magnets, a program-controlled installation for driving shaped apertures in welded diaphragm rims, and an automated system for planning the technological processes involved in manufacturing operations. Even in the manufacture of such individualized devices as turbines, mechanization and automation of production processes are economically justified. During the 11th Five Year Plan, the Institute will continue to develop progressive technological processes and equipment for precise shaping of turbine blade blanks and mechanical working of parts of steam, gas and hydraulic turbines, as well as nuclear power plant turbines.

  2. Food intake monitoring: automated chew event detection in chewing sounds.

    Science.gov (United States)

    Päßler, Sebastian; Fischer, Wolf-Joachim

    2014-01-01

    The analysis of the food intake behavior has the potential to provide insights into the development of obesity and eating disorders. As an elementary part of this analysis, chewing strokes have to be detected and counted. Our approach for food intake analysis is the evaluation of chewing sounds generated during the process of eating. These sounds were recorded by microphones applied to the outer ear canal of the user. Eight different algorithms for automated chew event detection were presented and evaluated on two datasets. The first dataset contained food intake sounds from the consumption of six types of food. The second dataset consisted of recordings of different environmental sounds. These datasets contained 68,094 chew events in around 18 h recording data. The results of the automated chew event detection were compared to manual annotations. Precision and recall over 80% were achieved by most of the algorithms. A simple noise reduction algorithm using spectral subtraction was implemented for signal enhancement. Its benefit on the chew event detection performance was evaluated. A reduction of the number of false detections by 28% on average was achieved by maintaining the detection performance. The system is able to be used for calculation of the chewing frequency in laboratory settings.
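
    The noise-reduction step named in the abstract, spectral subtraction, can be sketched in a few lines: estimate a noise magnitude spectrum from a presumed noise-only stretch of the recording and subtract it from every short-time frame. The parameters, and the choice of the first half-second as the noise estimate, are our assumptions rather than the paper's settings.

    ```python
    import numpy as np
    from scipy.signal import stft, istft

    def spectral_subtraction(x, fs, noise_seconds=0.5, floor=0.02):
        """Subtract a noise magnitude spectrum, estimated from the first
        `noise_seconds` of the signal, from every short-time frame."""
        _, _, X = stft(x, fs=fs, nperseg=512)              # hop = 256 samples
        n_noise = max(1, int(noise_seconds * fs / 256))
        noise_mag = np.abs(X[:, :n_noise]).mean(axis=1, keepdims=True)
        mag = np.maximum(np.abs(X) - noise_mag, floor * noise_mag)
        X_clean = mag * np.exp(1j * np.angle(X))           # keep noisy phase
        _, x_clean = istft(X_clean, fs=fs, nperseg=512)
        return x_clean
    ```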

  3. An automated process for deceit detection

    Science.gov (United States)

    Nwogu, Ifeoma; Frank, Mark; Govindaraju, Venu

    2010-04-01

    In this paper we present a prototype for an automated deception detection system. Similar to polygraph examinations, we attempt to take advantage of the theory that false answers will produce distinctive measurements in certain physiological manifestations. We investigate the role of dynamic eye-based features such as eye closure/blinking and lateral movements of the iris in detecting deceit. The features are recorded both when the test subjects are having non-threatening conversations as well as when they are being interrogated about a crime they might have committed. The rates of the behavioral changes are blindly clustered into two groups. Examining the clusters and their characteristics, we observe that the dynamic features selected for deception detection show promising results with an overall deceptive/non-deceptive prediction rate of 71.43% from a study consisting of 28 subjects.

  4. Implementing The Automated Phases Of The Partially-Automated Digital Triage Process Model

    Directory of Open Access Journals (Sweden)

    Gary D Cantrell

    2012-12-01

    Full Text Available Digital triage is a pre-digital-forensic phase that sometimes takes place as a way of gathering quick intelligence. Although effort has been undertaken to model the digital forensics process, little has been done to date to model digital triage. This work discusses the further development of a model that attempts to address digital triage: the Partially-automated Crime Specific Digital Triage Process model. The model itself will be presented along with a description of how its automated functionality was implemented to facilitate model testing.

  5. Automation of the Marine Corps Planning Process

    Science.gov (United States)

    2014-06-01

    Acronym glossary fragment recovered from the thesis front matter: ... Continuance Model; EOU, ease of use; EPLRS, Enhanced Position Location and Reporting System; ESGPP, ESG Planning Process; FA, field artillery; FRAGO ... planning processes like the MCPP, the MEU/ARG Planning Process (MEU/ARG PP), the Army Planning Process (the MDMP) and the ESG Planning Process (ESGPP) (PAE Manual ...

  6. Monitoring of polymer melt processing

    International Nuclear Information System (INIS)

    Alig, Ingo; Steinhoff, Bernd; Lellinger, Dirk

    2010-01-01

    The paper reviews the state-of-the-art of in-line and on-line monitoring during polymer melt processing by compounding, extrusion and injection moulding. Different spectroscopic and scattering techniques as well as conductivity and viscosity measurements are reviewed and compared concerning their potential for different process applications. In addition to information on chemical composition and state of the process, the in situ detection of morphology, which is of specific interest for multiphase polymer systems such as polymer composites and polymer blends, is described in detail. For these systems, the product properties strongly depend on the phase or filler morphology created during processing. Examples for optical (UV/vis, NIR) and ultrasonic attenuation spectra recorded during extrusion are given, which were found to be sensitive to the chemical composition as well as to size and degree of dispersion of micro or nanofillers in the polymer matrix. By small-angle light scattering experiments, process-induced structures were detected in blends of incompatible polymers during compounding. Using conductivity measurements during extrusion, the influence of processing conditions on the electrical conductivity of polymer melts with conductive fillers (carbon black or carbon nanotubes) was monitored. (topical review)

  7. Automated method of processing video data from track detectors

    Science.gov (United States)

    Aleksandrov, A. B.; Goncharova, L. A.; Davydov, D. A.; Publichenko, P. A.; Roganova, T. M.; Polukhina, N. G.; Feinberg, E. L.

    2007-10-01

    New automated methods significantly simplify and accelerate the processing of data from emulsion detectors. In addition to acceleration, automation of measurements allows large volumes of experimental data to be processed with sufficient statistics. It also gives impetus to the development of projects for new experiments with large-volume targets and emulsions and large-area solid-state track detectors. In this regard, the problem of increasing the number of scientists trained to operate automated equipment of this class becomes urgent. Every year, ten Moscow students master the new methods working at the P. N. Lebedev Institute of Physics of the Russian Academy of Sciences with the PAVIKOM fully-automated measuring complex [1-3]. Most students now engaged in high-energy physics gain a notion of only outdated manual methods of processing data from track detectors. In 2005, a new practical work on determination of the energy of neutrons transmitted through a nuclear emulsion was prepared on the basis of the PAVIKOM complex and physical experimental work of the Physical Department of Moscow State University. This practical work makes it possible to acquaint students with the basic skills used in automated processing of data from track detectors and can be included in the educational process for students of physics departments.

  8. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring.

    Science.gov (United States)

    Shu, Tongxin; Xia, Min; Chen, Jiahong; Silva, Clarence de

    2017-11-05

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected on line using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset to compare with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior in saving 5.31% more battery energy.
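
    To make the idea concrete, a toy version of a data-driven adaptive sampler might shorten the sampling interval while the monitored parameter is changing quickly and lengthen it while the signal is stable, as sketched below. This illustrates the principle only; the paper's actual DDASA update rule is not reproduced here, and all parameter values are invented.

    ```python
    def next_interval(history, base=60.0, gain=4.0, min_s=15.0, max_s=900.0):
        """Return the next sampling interval in seconds: sample densely
        while the parameter (e.g., dissolved oxygen) changes quickly,
        sparsely while it is stable, to stretch battery life."""
        if len(history) < 2:
            return base
        rate = abs(history[-1] - history[-2])          # latest change
        span = max(history) - min(history)
        typical = span / (len(history) - 1) if span else 1e-9
        interval = base / (1.0 + gain * rate / typical)
        return min(max(interval, min_s), max_s)
    ```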

  9. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring

    Directory of Open Access Journals (Sweden)

    Tongxin Shu

    2017-11-01

    Full Text Available Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected on line using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset to compare with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior in saving 5.31% more battery energy.

  10. MODELING OF AUTOMATION PROCESSES CONCERNING CROP CULTIVATION BY AVIATION

    Directory of Open Access Journals (Sweden)

    V. I. Ryabkov

    2010-01-01

    Full Text Available The paper considers modeling of automation processes concerning crop cultivation by aviation. Processes that take place in three interconnected environments (human, technical, and mobile airborne objects) are described by a model based on set theory. A stochastic network theory of queueing (mass service) systems is proposed for describing the real-time human-machine system.

  11. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, there are many manual steps involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for further increase of throughput appears to be the scoring.

  12. High-Throughput Automation in Chemical Process Development.

    Science.gov (United States)

    Selekman, Joshua A; Qiu, Jun; Tran, Kristy; Stevens, Jason; Rosso, Victor; Simmons, Eric; Xiao, Yi; Janey, Jacob

    2017-06-07

    High-throughput (HT) techniques built upon laboratory automation technology and coupled to statistical experimental design and parallel experimentation have enabled the acceleration of chemical process development across multiple industries. HT technologies are often applied to interrogate wide, often multidimensional experimental spaces to inform the design and optimization of any number of unit operations that chemical engineers use in process development. In this review, we outline the evolution of HT technology and provide a comprehensive overview of how HT automation is used throughout different industries, with a particular focus on chemical and pharmaceutical process development. In addition, we highlight the common strategies of how HT automation is incorporated into routine development activities to maximize its impact in various academic and industrial settings.
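
    As a small illustration of the statistical experimental design that HT platforms execute in parallel, the sketch below enumerates a full-factorial screen; the factor names and levels are invented for the example.

    ```python
    from itertools import product

    def full_factorial(factors):
        """Enumerate every combination of factor levels as one run."""
        names = list(factors)
        return [dict(zip(names, combo))
                for combo in product(*(factors[n] for n in names))]

    runs = full_factorial({
        "catalyst_mol_pct": [1, 2, 5],
        "temperature_C": [40, 60, 80],
        "solvent": ["MeCN", "THF", "toluene"],
    })   # 27 runs, e.g. one per well of a parallel reactor block
    ```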

  13. The automated discovery of hybrid processes

    NARCIS (Netherlands)

    Maggi, F.M.; Slaats, T.; Reijers, H.A.

    2014-01-01

    The declarative-procedural dichotomy is highly relevant when choosing the most suitable process modeling language to represent a discovered process. Less-structured processes with a high level of variability can be described in a more compact way using a declarative language. By contrast, procedural

  14. The Automated Discovery of Hybrid Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Reijers, Hajo; Maggi, Fabrizio Maria

    2014-01-01

    The declarative-procedural dichotomy is highly relevant when choosing the most suitable process modeling language to represent a discovered process. Less-structured processes with a high level of variability can be described in a more compact way using a declarative language. By contrast, procedu...

  15. Towards the Automated Annotation of Process Models

    NARCIS (Netherlands)

    Leopold, H.; Meilicke, C.; Fellmann, M.; Pittke, F.; Stuckenschmidt, H.; Mendling, J.

    2016-01-01

    Many techniques for the advanced analysis of process models build on the annotation of process models with elements from predefined vocabularies such as taxonomies. However, the manual annotation of process models is cumbersome and sometimes even hardly manageable taking the size of taxonomies into

  16. Monitored Retrievable Storage/Multi-Purpose Canister analysis: Simulation and economics of automation

    International Nuclear Information System (INIS)

    Bennett, P.C.; Stringer, J.B.

    1994-01-01

    Robotic automation is examined as a possible alternative to manual spent nuclear fuel, transport cask and Multi-Purpose Canister (MPC) handling at a Monitored Retrievable Storage (MRS) facility. Automation of key operational aspects of the MRS/MPC system is analyzed to determine equipment requirements, throughput times and equipment costs. The economic and radiation dose impacts resulting from this automation are compared to manual handling methods

  17. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  18. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    assembly to disassembly, from aerospace to food industry, from textile to logistics) are discussed. Finally, the most recent research is reviewed in order to introduce the new trends in grasping. They provide an outlook on the future of both grippers and robotic hands in automated production processes.

  19. DEVELOPMENT OF AN AUTOMATED BATCH-PROCESS SOLAR ...

    African Journals Online (AJOL)

    One of the shortcomings of solar disinfection of water (SODIS) is the absence of a feedback mechanism indicating treatment completion. This work presents the development of an automated batch-process water disinfection system aimed at solving this challenge. Locally sourced materials in addition to an Arduino micro ...

  20. Automating the Fireshed Assessment Process with ArcGIS

    Science.gov (United States)

    Alan Ager; Klaus Barber

    2006-01-01

    A library of macros was developed to automate the Fireshed process within ArcGIS. The macros link a number of vegetation simulation and wildfire behavior models (FVS, SVS, FARSITE, and FlamMap) with ESRI geodatabases, desktop software (Access, Excel), and ArcGIS. The macros provide for (1) an interactive linkage between digital imagery, vegetation data, FVS-FFE, and...

  1. Automating Software Development Process using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet; Damiani, Ernesto; Jain, Lakhmi C.; Madravio, Mauro

    2004-01-01

    In this chapter, we aim to highlight how fuzzy logic can be a valid expressive tool to manage the software development process. We characterize a software development method in terms of two major components: artifact types and methodological rules. Classes, attributes, operations, and inheritance

  2. Integration of disabled people in an automated work process

    Science.gov (United States)

    Jalba, C. K.; Muminovic, A.; Epple, S.; Barz, C.; Nasui, V.

    2017-05-01

    Automation processes enter more and more into all areas of life and production. People with disabilities, especially, can hardly keep step with this change. In sheltered workshops in Germany, people with physical and mental disabilities receive dedicated support to be integrated into work processes. This work shows that cooperation between disabled people and industrial robots by means of industrial image processing can successfully result in the production of highly complex products. It describes how high-pressure hydraulic pumps are assembled by people with disabilities in cooperation with industrial robots in a sheltered workshop. After the assembly process, the pumps are checked for leaks at very high pressures in a completely automated process.

  3. INFORMATION SYSTEM OF AUTOMATION OF PREPARATION EDUCATIONAL PROCESS DOCUMENTS

    Directory of Open Access Journals (Sweden)

    V. A. Matyushenko

    2016-01-01

    Full Text Available Information technology is rapidly conquering the world, permeating all spheres of human activity. Education is no exception. An important direction in the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all types of activities of an institution. The purpose of this paper is the development of a system that automates the preparation of accounting documents for the educational process. The article describes the problem of preparing educational process documents. The information system was designed and implemented in the Microsoft Access environment. The result is four types of reports obtained by using the developed system. The use of this system allows the process to be automated, reducing the effort required to prepare accounting documents. All reports were implemented in Microsoft Excel and can be used for further analysis and processing.

  4. Automated high-volume aerosol sampling station for environmental radiation monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S

    1998-07-01

    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (attached into a cassette); the airflow through the filter is 800 m{sup 3}/h at maximum. During the sampling, the filter is continuously monitored with NaI scintillation detectors. After the sampling, the large filter is automatically cut into 15 pieces that form a small sample, and after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1-10 x 10{sup -6} Bq/m{sup 3}. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of the CTBTO for aerosol monitoring. The concept is also well suited to nuclear material safeguards. (10 refs)
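
    For orientation, a textbook estimate of such a station's minimum detectable concentration can be derived from Currie's detection limit, L_D = 2.71 + 4.65*sqrt(B). The sketch below is a generic calculation under assumed counting efficiency and background, not the station's accredited MDC procedure.

    ```python
    import math

    def mdc_bq_per_m3(bkg_counts, efficiency, flow_m3_h,
                      sample_h=24.0, count_h=24.0):
        """Currie detection limit L_D = 2.71 + 4.65*sqrt(B) in counts,
        converted to an air concentration for a filter sample collected
        at `flow_m3_h` for `sample_h` hours and counted for `count_h`."""
        l_d = 2.71 + 4.65 * math.sqrt(bkg_counts)      # counts
        volume = flow_m3_h * sample_h                  # m3 of air sampled
        return l_d / (efficiency * count_h * 3600.0 * volume)

    # e.g. mdc_bq_per_m3(bkg_counts=400, efficiency=0.3, flow_m3_h=800)
    # gives a value of order 10^-7 to 10^-6 Bq/m3, consistent with the record.
    ```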

  5. ECG acquisition and automated remote processing

    CERN Document Server

    Gupta, Rajarshi; Bera, Jitendranath

    2014-01-01

    The book is focused on the area of remote processing of ECG in the context of telecardiology, an emerging area in the field of Biomedical Engineering Application. Considering the poor infrastructure and inadequate numbers of physicians in rural healthcare clinics in India and other developing nations, telemedicine services assume special importance. Telecardiology, a specialized area of telemedicine, is taken up in this book considering the importance of cardiac diseases, which is prevalent in the population under discussion. The main focus of this book is to discuss different aspects of ECG acquisition, its remote transmission and computerized ECG signal analysis for feature extraction. It also discusses ECG compression and application of standalone embedded systems, to develop a cost effective solution of a telecardiology system.

  6. Automation of the micro-arc oxidation process

    Science.gov (United States)

    Golubkov, P. E.; Pecherskaya, E. A.; Karpanin, O. V.; Shepeleva, Y. V.; Zinchenko, T. O.; Artamonov, D. V.

    2017-11-01

    At present, significantly increased interest in micro-arc oxidation (MAO) encourages scientists to seek a solution to the problem of this technological process's controllability. To solve this problem, an automated MAO installation was developed; its structure and control principles are presented in this article. This device will allow controlled synthesis of MAO coatings and help identify MAO process patterns, which contributes to the commercialization of this technology.

  7. The automated data processing architecture for the GPI Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce

    2017-09-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  8. Semi-automated Image Processing for Preclinical Bioluminescent Imaging.

    Science.gov (United States)

    Slavine, Nikolai V; McColl, Roderick W

    Bioluminescent imaging is a valuable noninvasive technique for investigating tumor dynamics and specific biological molecular events in living animals to better understand the effects of human disease in animal models. The purpose of this study was to develop and test a strategy behind automated methods for bioluminescence image processing, from data acquisition to obtaining 3D images. In order to optimize this procedure, a semi-automated image processing approach with a multi-modality image handling environment was developed. To identify a bioluminescent source location and strength, we used the light flux detected on the surface of the imaged object by CCD cameras. For phantom calibration tests and object surface reconstruction we used the MLEM algorithm. For internal bioluminescent sources we used the diffusion approximation, balancing the internal and external intensities on the boundary of the medium; after determining an initial-order approximation for the photon fluence, we applied a novel iterative deconvolution method to obtain the final reconstruction result. We find that the reconstruction techniques successfully used the depth-dependent light transport approach and semi-automated image processing to provide a realistic 3D model of the lung tumor. Our image processing software can optimize and decrease the time of the volumetric imaging and quantitative assessment. The data obtained from light phantom and lung mouse tumor images demonstrate the utility of the image reconstruction algorithms and the semi-automated approach for the bioluminescent image processing procedure. We suggest that the developed image processing approach can be applied to preclinical imaging studies: characterizing tumor growth, identifying metastases, and potentially determining the effectiveness of cancer treatment.

  9. Providing security for automated process control systems at hydropower engineering facilities

    Science.gov (United States)

    Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.

    2016-12-01

    This article suggests the concept of a cyberphysical system to manage computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy of improving cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is consideration of interrelations and cyber threats, arising when SCADA is integrated with the unified enterprise information system.

  10. An Automated, Image Processing System for Concrete Evaluation

    International Nuclear Information System (INIS)

    Baumgart, C.W.; Cave, S.P.; Linder, K.E.

    1998-01-01

    AlliedSignal Federal Manufacturing & Technologies (FM&T) was asked to perform a proof-of-concept study for the Missouri Highway and Transportation Department (MHTD), Research Division, in June 1997. The goal of this proof-of-concept study was to ascertain if automated scanning and imaging techniques might be applied effectively to the problem of concrete evaluation. In the current evaluation process, a concrete sample core is manually scanned under a microscope. Voids (or air spaces) within the concrete are then detected visually by a human operator by incrementing the sample under the cross-hairs of a microscope and by counting the number of "pixels" which fall within a void. Automation of the scanning and image analysis processes is desired to improve the speed of the scanning process, to improve evaluation consistency, and to reduce operator fatigue. An initial, proof-of-concept image analysis approach was successfully developed and demonstrated using acquired black and white imagery of concrete samples. In this paper, the automated scanning and image capture system currently under development will be described and the image processing approach developed for the proof-of-concept study will be demonstrated. A development update and plans for future enhancements are also presented
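
    A minimal sketch of the kind of automated analysis described, in Python with OpenCV: threshold the dark regions of a scanned grayscale section and count connected components as voids. The threshold and minimum-area values are illustrative assumptions, not the system's calibrated parameters.

    ```python
    import cv2
    import numpy as np

    def void_fraction(gray, dark_threshold=60, min_area_px=20):
        """Threshold dark regions of a grayscale scan and count connected
        components as voids; return (void count, void area fraction)."""
        _, mask = cv2.threshold(gray, dark_threshold, 255,
                                cv2.THRESH_BINARY_INV)   # dark pixels -> 255
        n, _, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
        areas = stats[1:, cv2.CC_STAT_AREA]              # label 0 is background
        voids = areas[areas >= min_area_px]              # ignore speckle
        return int(voids.size), float(voids.sum()) / mask.size
    ```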

  11. Automated long-term monitoring of parallel microfluidic operations applying a machine vision-assisted positioning method.

    Science.gov (United States)

    Yip, Hon Ming; Li, John C S; Xie, Kai; Cui, Xin; Prasad, Agrim; Gao, Qiannan; Leung, Chi Chiu; Lam, Raymond H W

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities.
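
    The image-based realignment strategy can be illustrated with normalized cross-correlation template matching, as sketched below with OpenCV. The function, confidence cutoff, and coordinate convention are our assumptions rather than the authors' implementation.

    ```python
    import cv2

    def realign_offset(frame, mark_template, expected_xy, min_score=0.7):
        """Locate the fabricated alignment mark by normalized
        cross-correlation and return the (dx, dy) pixel correction toward
        its preset position, or None if the mark is not found confidently."""
        result = cv2.matchTemplate(frame, mark_template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)
        if max_val < min_score:                 # mark not found confidently
            return None
        dx = expected_xy[0] - max_loc[0]
        dy = expected_xy[1] - max_loc[1]
        return dx, dy                           # feed to the motorized stage
    ```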

  12. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    Directory of Open Access Journals (Sweden)

    Hon Ming Yip

    2014-01-01

    Full Text Available As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities.

  13. Enhancing Business Process Automation by Integrating RFID Data and Events

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    Business process automation is one of the major benefits of utilising Radio Frequency Identification (RFID) technology. Through readers to RFID middleware systems, the information and the movements of tagged objects can be used to trigger business transactions. These features shift the way business applications deal with the physical world from mostly quantity-based to object-based. Aiming to facilitate business process automation, this paper introduces a new method to model and incorporate business logic into RFID edge systems from an object-oriented perspective, with emphasis on RFID's event-driven characteristics. A framework covering business rule modelling, event handling and system operation invocations is presented on the basis of the event calculus. In regard to the identified delayed effects in RFID-enabled applications, a two-block buffering mechanism is proposed to improve RFID query efficiency within the framework; see the sketch below. The performance improvements are analysed with related experiments.
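
    The two-block buffering mechanism is described only briefly in the record; one plausible reading, sketched here, is a double buffer in which readers append to an active block while queries scan the most recently sealed block, so queries never contend with the incoming tag-event stream. This interpretation is ours, not the paper's specification.

    ```python
    from collections import deque

    class TwoBlockBuffer:
        """Readers append tag events to the active block; queries scan the
        most recently sealed block, avoiding contention with the stream."""
        def __init__(self, block_size=1024):
            self.block_size = block_size
            self.active = deque()
            self.sealed = deque()

        def append(self, event):
            self.active.append(event)
            if len(self.active) >= self.block_size:
                self.sealed = self.active      # seal the full block
                self.active = deque()          # start a fresh one

        def query(self, predicate):
            return [e for e in self.sealed if predicate(e)]
    ```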

  14. Technology transfer potential of an automated water monitoring system. [market research

    Science.gov (United States)

    Jamieson, W. M.; Hillman, M. E. D.; Eischen, M. A.; Stilwell, J. M.

    1976-01-01

    The nature and characteristics of the potential economic need (markets) for a highly integrated water quality monitoring system were investigated. The technological, institutional and marketing factors that would influence the transfer and adoption of an automated system were studied for application to public and private water supply, public and private wastewater treatment and environmental monitoring of rivers and lakes.

  15. Mass spectrometry-based monitoring of millisecond protein–ligand binding dynamics using an automated microfluidic platform

    Energy Technology Data Exchange (ETDEWEB)

    Cong, Yongzheng; Katipamula, Shanta; Trader, Cameron D.; Orton, Daniel J.; Geng, Tao; Baker, Erin S.; Kelly, Ryan T.

    2016-01-01

    Characterizing protein-ligand binding dynamics is crucial for understanding protein function and developing new therapeutic agents. We have developed a novel microfluidic platform that features rapid mixing of protein and ligand solutions, variable incubation times, and on-chip electrospray ionization to perform label-free, solution-based monitoring of protein-ligand binding dynamics. This platform offers many advantages including automated processing, rapid mixing, and low sample consumption.

  16. Simulation based optimization on automated fibre placement process

    Science.gov (United States)

    Lei, Shi

    2018-02-01

    In this paper, a software-simulation-based method (using Autodesk TruPlan and TruFiber) is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data are taken into consideration prior to tool path generation to achieve a high rate of manufacturing success.

  17. AUTOMATION OF TRACEABILITY PROCESS AT GRAIN TERMINAL LLC “ UKRTRANSAGRO"

    Directory of Open Access Journals (Sweden)

    F. A. TRISHYN

    2017-07-01

    Full Text Available A positive trend of growth in both grain production and export is indicated. In the current marketing year the export potential of the Ukrainian grain market is close to the record level. However, the high positions in the rating of world exporters are achieved not only due to the high export potential, but also because of higher quality and logistics. These factors depend directly on the quality of enterprise management and of all processes occurring at the enterprise. One perspective way to develop an enterprise is to implement a traceability system and then automate the traceability process. European integration laws oblige Ukrainian enterprises to have a traceability system. Traceability is the ability to follow the movement of a feed or food through specified stages of production, processing and distribution. The traceability process is managed by people, which implies a human factor. Automation will largely exclude that human factor, which means fewer errors in documentation and faster grain transshipment. Research work on the process was carried out at a modern grain terminal, LLC “UkrTransAgro”. The terminal is located in the Ukrainian water area of the Azov Sea (Mariupol, Ukraine). Characteristics of the terminal: capacity of simultaneous storage - 48,120 thousand tons; acceptance of crops from transport - 4,500 tons/day; acceptance of crops from railway transport - 3,000 tons/day; transshipment capacity - up to 1.2 million tons per year; shipment to sea vessels - 7,000 tons/day. An analysis of the automation level of the grain terminal was carried out. The company uses software from 1C - «1C: Enterprise 8. Accounting for grain elevator, mill, and feed mill for Ukraine». This software is used for quantitative and qualitative registration at the elevator in accordance with industry guidelines and standards. The software product has many ...

  18. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    ... infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques needed to fully exploit the gain in resolution and accuracy ... infusion analyses of crude extracts to find the relationship between species from several terverticillate Penicillium species, and also that the ions responsible for the segregation can be identified. Furthermore, the process of detecting unique species and unique metabolites can be automated ...

  19. Intelligent sensor-model automated control of PMR-15 autoclave processing

    Science.gov (United States)

    Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.

    1992-01-01

    An intelligent sensor-model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent electromagnetic sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process, including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in viscosity, reaction advancement, and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows the comparison of sensor monitoring with model predictions to be broken down into a series of discrete steps and provides a language for making decisions on what to do next regarding time, temperature, and pressure.
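
    A schematic of one decision step in such a sensor-model comparison loop is sketched below: if the sensor-derived cure state tracks the model prediction within a tolerance, the planned cycle continues; otherwise an adjustment is recommended. The rule structure and tolerance are illustrative, not the QPAL rule base.

    ```python
    def cure_control_step(sensor_visc, model_visc, tol=0.15):
        """Compare the sensor-derived viscosity with the model prediction
        and return the next control action as a string."""
        error = abs(sensor_visc - model_visc) / max(model_visc, 1e-9)
        if error <= tol:
            return "continue planned time-temperature-pressure profile"
        if sensor_visc > model_visc:
            return "cure lagging model: extend hold or raise temperature"
        return "cure ahead of model: shorten hold or apply pressure earlier"
    ```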

  20. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness

    Science.gov (United States)

    Singh, Preetpal

    Canada is home to thousands of freshwater lakes and rivers. Apart from being sources of infinite natural beauty, rivers and lakes are an important source of water, food and transportation. The northern hemisphere of Canada experiences extreme cold temperatures in the winter, resulting in a freeze-up of regional lakes and rivers. Frozen lakes and rivers tend to offer unique opportunities in terms of wildlife harvesting and winter transportation. Ice roads built on frozen rivers and lakes are vital supply lines for industrial operations in the remote north. Monitoring the ice freeze-up and break-up dates annually can help predict regional climatic changes. Lake ice impacts a variety of physical, ecological and economic processes. The construction and maintenance of a winter road can cost millions of dollars annually. A good understanding of ice mechanics is required to build an ice road and deem it safe. A crucial factor in calculating the load-bearing capacity of ice sheets is the thickness of the ice. Construction costs are mainly attributed to producing and maintaining a specific thickness and density of ice that can support different loads. Climate change is leading to warmer temperatures, causing the ice to thin faster. At a certain point, a winter road may not be thick enough to support travel and transportation. There is considerable interest in monitoring winter road conditions given the high construction and maintenance costs involved. Remote sensing technologies such as Synthetic Aperture Radar have been successfully utilized to study the extent of ice covers and record freeze-up and break-up dates of ice on lakes and rivers across the north. Ice road builders often use ultrasound equipment to measure ice thickness. However, an automated monitoring system based on machine vision and image processing technology that can measure ice thickness on lakes has not previously been developed. Machine vision and image processing techniques have successfully been used in manufacturing ...
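
    Once thickness is measured, a widely used first-order estimate of safe load is Gold's formula, P = A*h^2, with h the effective ice thickness in centimetres and A an empirical constant (a value around 3.5 kg/cm^2 is a common conservative choice for sound, clear ice). A sketch, for illustration only; real winter-road practice applies further safety factors for ice quality, cracks, and moving loads.

    ```python
    def allowable_load_kg(thickness_cm, a_kg_per_cm2=3.5):
        """Gold's formula P = A * h**2 for a point load on an ice sheet;
        A ~= 3.5 kg/cm^2 is a conservative value for sound, clear ice."""
        return a_kg_per_cm2 * thickness_cm ** 2

    # e.g. 40 cm of clear ice -> allowable_load_kg(40) == 5600 kg
    ```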

  1. Integrated safeguards and security for a highly automated process

    International Nuclear Information System (INIS)

    Zack, N.R.; Hunteman, W.J.; Jaeger, C.D.

    1993-01-01

    Before the cancellation of the New Production Reactor Programs for the production of tritium, the reactors and associated processing were being designed to contain some of the most highly automated and remote systems conceived for a Department of Energy facility. Integrating safety, security, materials control and accountability (MC and A), and process systems at the proposed facilities would enhance the overall information and protection-in-depth available. Remote, automated fuel handling and assembly/disassembly techniques would deny access to the nuclear materials while upholding ALARA principles but would also require the full integration of all data/information systems. Such systems would greatly enhance MC and A as well as facilitate materials tracking. Physical protection systems would be connected with materials control features to cross check activities and help detect and resolve anomalies. This paper will discuss the results of a study of the safeguards and security benefits achieved from a highly automated and integrated remote nuclear facility and the impacts that such systems have on safeguards and computer and information security

  2. Text mining from ontology learning to automated text processing applications

    CERN Document Server

    Biemann, Chris

    2014-01-01

    This book comprises a set of articles that specify the methodology of text mining, describe the creation of lexical resources in the framework of text mining and use text mining for various tasks in natural language processing (NLP). The analysis of large amounts of textual data is a prerequisite to build lexical resources such as dictionaries and ontologies and also has direct applications in automated text processing in fields such as history, healthcare and mobile applications, just to name a few. This volume gives an update in terms of the recent gains in text mining methods and reflects

  3. Silicon Carbide Temperature Monitor Processing Improvements. Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Unruh, Troy Casey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Daw, Joshua Earl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Al Rashdan, Ahamad [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-01-29

    Silicon carbide (SiC) temperature monitors are used as temperature sensors in Advanced Test Reactor (ATR) irradiations at the Idaho National Laboratory (INL). Although thermocouples are typically used to provide real-time temperature indication in instrumented lead tests, other indicators, such as melt wires, are also often included in such tests as an independent technique of detecting peak temperatures incurred during irradiation. In addition, less expensive static capsule tests, which have no leads attached for real-time data transmission, often rely on melt wires as a post-irradiation technique for peak temperature indication. Melt wires are limited in that they can only detect whether a single temperature is or is not exceeded. SiC monitors are advantageous because a single monitor can be used to detect a range of temperatures that occurred during irradiation. As part of the process initiated to make SiC temperature monitors available at the ATR, post-irradiation evaluations of these monitors have been previously completed at the High Temperature Test Laboratory (HTTL). INL selected the resistance measurement approach for determining irradiation temperature from SiC temperature monitors because it is considered to be the most accurate measurement. The current process involves the repeated annealing of the SiC monitors at incrementally increasing temperature, with resistivity measurements made between annealing steps. The process is time consuming and requires the nearly constant attention of a trained staff member. In addition to the expensive and lengthy post-irradiation analysis required, the current process adds many potential sources of error in the measurement, as the sensor must be repeatedly moved from furnace to test fixture. This time-consuming post irradiation analysis is a significant portion of the total cost of using these otherwise inexpensive sensors. An additional consideration of this research is that, if the SiC post processing can be automated, it
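
    A minimal sketch of what automating the anneal/measure cycle described above might look like, assuming hypothetical `furnace` and `meter` instrument drivers (none of these names or parameters come from the report):

        import time

        def recover_temperature(furnace, meter, t_start_c=200.0, t_stop_c=800.0,
                                t_step_c=20.0, soak_s=1800):
            """Isochronal annealing sketch: anneal a SiC monitor at
            incrementally increasing temperatures, measuring resistance
            between steps. The irradiation temperature is indicated by the
            anneal temperature at which resistance begins to recover (drop)
            markedly. `furnace` and `meter` are hypothetical drivers."""
            readings = []
            t = t_start_c
            while t <= t_stop_c:
                furnace.set_temperature(t)       # hypothetical API
                furnace.wait_until_stable()
                time.sleep(soak_s)               # isochronal soak at this step
                furnace.cool_to_ambient()
                r = meter.read_resistance_ohm()  # hypothetical API
                readings.append((t, r))
                # onset of resistance recovery: resistance drops vs. prior step
                if len(readings) >= 2 and r < 0.95 * readings[-2][1]:
                    return t, readings
                t += t_step_c
            return None, readings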

  4. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    CERN Document Server

    Ge, Zhiqiang

    2013-01-01

      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...
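
    For a flavor of the MSPC machinery such a book builds on, a minimal PCA-based monitoring sketch with Hotelling's T² and SPE/Q statistics (synthetic data, empirical-percentile control limits for illustration only; not code from the book):

        import numpy as np

        # Fit PCA on normal-operation data; score new samples with T^2
        # (variation inside the PC subspace) and SPE/Q (residual variation).
        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(500, 10))      # stand-in historical data
        mu, sd = X_train.mean(0), X_train.std(0)
        Z = (X_train - mu) / sd

        U, S, Vt = np.linalg.svd(Z, full_matrices=False)
        k = 3                                      # retained components
        P = Vt[:k].T                               # loadings (10 x k)
        lam = (S[:k] ** 2) / (len(Z) - 1)          # PC variances

        def t2_spe(x):
            z = (x - mu) / sd
            t = z @ P                              # scores in PC subspace
            resid = z - t @ P.T
            return np.sum(t ** 2 / lam), resid @ resid

        scores = np.array([t2_spe(x) for x in X_train])
        t2_lim, spe_lim = np.percentile(scores, 99, axis=0)

        x_new = rng.normal(size=10) + 3.0          # simulated faulty sample
        t2, spe = t2_spe(x_new)
        print(f"T2={t2:.1f} (limit {t2_lim:.1f}), SPE={spe:.2f} (limit {spe_lim:.2f})")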

  5. Using Automation to Improve the Flight Software Testing Process

    Science.gov (United States)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills: software engineering, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
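
    A minimal illustration of the kind of automated verification step described, comparing flight software telemetry against high-fidelity simulation output and flagging discrepancies; the file layout, column names, and tolerance are hypothetical, not from the MAP project:

        import csv

        def compare_runs(flight_csv, hifi_csv, tolerance=1e-3):
            """Compare time-aligned telemetry from a flight software test
            against HiFi simulation output, returning samples that disagree
            beyond a tolerance. Columns 'time' and 'value' are hypothetical."""
            def load(path):
                with open(path, newline="") as f:
                    return {float(r["time"]): float(r["value"])
                            for r in csv.DictReader(f)}
            fsw, sim = load(flight_csv), load(hifi_csv)
            return [(t, fsw[t], sim[t]) for t in sorted(set(fsw) & set(sim))
                    if abs(fsw[t] - sim[t]) > tolerance]

        # Usage: an empty list means the test point passes.
        # for t, f, s in compare_runs("fsw_run1.csv", "hifi_run1.csv"): print(t, f, s)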

  6. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    Science.gov (United States)

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel TL using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.

  7. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  8. An automated platform for phytoplankton ecology and aquatic ecosystem monitoring

    NARCIS (Netherlands)

    Pomati, F.; Jokela, J.; Simona, M.; Veronesi, M.; Ibelings, B.W.

    2011-01-01

    High quality monitoring data are vital for tracking and understanding the causes of ecosystem change. We present a potentially powerful approach for phytoplankton and aquatic ecosystem monitoring, based on integration of scanning flow-cytometry for the characterization and counting of algal cells

  9. Tracking forest canopy stress from an automated proximal hyperspectral monitoring system

    Science.gov (United States)

    Woodgate, William; van Gorsel, Eva; Hughes, Dale; Cabello-Leblic, Arantxa

    2016-04-01

    Increasing climate variability and associated extreme weather events such as drought are likely to profoundly affect ecosystems, as many ecological processes are more sensitive to climate extremes than to changes in the mean states. However, the response of vegetation to these changes is one of the largest uncertainties in projecting future climate, carbon sequestration, and water resources. This remains a major limitation for long term climate prediction models integrating vegetation dynamics that are crucial for modelling the interplay of water, carbon and radiation fluxes. Satellite remote sensing data, such as that from the MODIS, Landsat and Sentinel missions, are the only viable means to study national and global vegetation trends. Highly accurate in-situ data is critical to better understand and validate our satellite products. Here, we developed a fully automated hyperspectral monitoring system installed on a flux monitoring tower at a mature Eucalypt forest site. The monitoring system is designed to provide a long-term (May 2014 - ongoing) and high temporal characterisation (3 acquisitions per day) of the proximal forest canopy to an unprecedented level of detail. The system comprises four main instruments: a thermal imaging camera and hyperspectral line camera (spectral ranges 7.5-14 μm and 0.4-1 μm, respectively), an upward pointing spectrometer (350-1000 nm), and hemispherical camera. The time series of hyperspectral and thermal imagery and flux tower data provides a unique dataset to study the impacts of logging, nutrient, and heat stress on trees and forest. Specifically, the monitoring system can be used to derive a range of physiological and structural indices that are also derived by satellites, such as PRI, TCARI/OSAVI, and NDVI. The monitoring system, to our knowledge, is the first fully automated data acquisition system that allows for spatially resolved spectral measurements at the sub-crown scale. Preliminary results indicate that canopy
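
    The physiological and structural indices mentioned are simple band-ratio formulas; for instance, using the standard definitions of NDVI and PRI (not specific to this system):

        import numpy as np

        def ndvi(r_nir, r_red):
            """Normalized Difference Vegetation Index: (NIR - red)/(NIR + red)."""
            return (r_nir - r_red) / (r_nir + r_red)

        def pri(r_531, r_570):
            """Photochemical Reflectance Index from 531 nm and 570 nm bands."""
            return (r_531 - r_570) / (r_531 + r_570)

        # Per-pixel indices from hyperspectral reflectance bands (illustrative values).
        nir, red = np.array([0.45, 0.50]), np.array([0.08, 0.30])
        print(ndvi(nir, red))   # high values indicate dense green canopy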

  10. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    Science.gov (United States)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims at reducing the manpower needs and at assuring a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. This project combines technologies coming from different disciplines; in particular it leverages an Event Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events and a message oriented architecture for components integration. The project is composed of 2 main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web based front-end to present real-time information and interact with the system. All components work in a loosely coupled, event-based architecture, with a message broker
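
    The aggregated-behavior queries described above are the province of the CEP engine; a toy Python analogue of one such rule (the real AAL core uses expert-defined Esper queries, not this code) flags an error burst from a single application within a sliding time window:

        from collections import deque
        import time

        class ErrorBurstDetector:
            """Toy CEP-style rule: alert when more than `threshold` ERROR
            messages from one application arrive within a sliding window,
            mimicking an aggregated-behavior query."""
            def __init__(self, window_s=60.0, threshold=10):
                self.window_s, self.threshold = window_s, threshold
                self.events = {}  # application name -> deque of timestamps

            def on_message(self, app, severity, ts=None):
                if severity != "ERROR":
                    return None
                ts = ts if ts is not None else time.time()
                q = self.events.setdefault(app, deque())
                q.append(ts)
                while q and q[0] < ts - self.window_s:   # expire old events
                    q.popleft()
                if len(q) > self.threshold:
                    return f"ALERT: {len(q)} errors from {app} in {self.window_s:.0f}s"
                return None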

  11. CONCEPT AND STRUCTURE OF AUTOMATED SYSTEM FOR MONITORING STUDENT LEARNING QUALITY

    Directory of Open Access Journals (Sweden)

    M. Yu. Kataev

    2017-01-01

    organization and management of the learning process in a higher educational institution. The factors that affect the level of student knowledge obtained during training are shown, and on this basis the determining factors in assessing the level of knowledge are highlighted. It is proposed to manage individual training over any time interval on the basis of a generalized criterion calculated from students' current progress, their activity, and the time they spend on training. The block structure of the automated program system for continuous monitoring of each student's achievements is described; all functional blocks of the system are interconnected with the educational process. The main advantage of this system is that students have continuous access to materials about their own individual achievements and mistakes; from passive consumers of information they turn into active participants in their education and can thus achieve greater effectiveness in personal vocational training. It is pointed out that the information base of such a system has to be available not only to students and teachers, but also to future employers of university graduates. Practical significance. The concept of an automated system for monitoring education results and the technique for processing the collected material presented in the article are based on a simple and obvious circumstance: a student with high progress spends more time on training and leads a more active lifestyle than fellow students, and therefore will, with high probability, be more successful in the chosen profession. Thus, complete, fully detailed and digitized information on the individual educational achievements of a future expert is necessary not only for effective management of the educational process in higher education institutions, but also for employers interested in well-prepared, qualified and hard-working staff ready to take responsibility for their labour duties.

  12. Integrated Monitoring System of Production Processes

    Directory of Open Access Journals (Sweden)

    Oborski Przemysław

    2016-12-01

    Full Text Available An integrated monitoring system for discrete manufacturing processes is presented in the paper. A multilayer hardware and software reference model was developed. The original research answers industry needs for integrating the flow of information in the production process. The reference model corresponds with a proposed data model based on a multilayer data tree that makes it possible to describe orders, products and processes and to save monitoring data. The elaborated models were implemented in the integrated monitoring system demonstrator developed in the project. It was built on the basis of multiagent technology to assure high flexibility and openness in applying intelligent algorithms for data processing. Currently, on the basis of the experience gained, an integrated monitoring system for a real production system is being developed. The article presents the main problems of monitoring integration, including the specificity of discrete production, data processing and the future application of Cyber-Physical Systems. Development of manufacturing systems is increasingly based on applying intelligent solutions to machine and production process control and monitoring. Connecting technical systems, machine tools and manufacturing process monitoring with advanced information processing seems to be one of the most important areas of near-future development; it will play an important role in the efficient operation and competitiveness of the whole production system. It is also an important area for the future application of Cyber-Physical Systems, which can radically improve the functionality of monitoring systems and reduce the cost of their implementation.

  13. Westinghouse integrated cementation facility. Smart process automation minimizing secondary waste

    International Nuclear Information System (INIS)

    Fehrmann, H.; Jacobs, T.; Aign, J.

    2015-01-01

    The Westinghouse Cementation Facility described in this paper is an example of a typical standardized turnkey project in the area of waste management. The facility is able to handle NPP waste such as evaporator concentrates, spent resins and filter cartridges. The facility scope covers all equipment required for a fully integrated system, including all required auxiliary equipment for the hydraulic, pneumatic and electric control systems. The control system is based on current PLC technology and the process is highly automated. The equipment is designed to be remotely operated under radiation exposure conditions. Four cementation facilities have been built for new CPR-1000 nuclear power stations in China

  14. DEVELOPING UNIVERSAL INSTALLATION WITH AUTOMATIC MONITORING AND CONTROL PROCESS OF MIXING, WHIPPING AND MOLDING BISCUIT DOUGH

    Directory of Open Access Journals (Sweden)

    G. O. Magomedov

    2013-01-01

    Full Text Available A mixing-whipping-molding installation is presented, with an automated system for monitoring and managing the technological processes and structure formation during intensive kneading, whipping and molding of biscuit dough.

  15. Automated Pre-processing for NMR Assignments with Reduced Tedium

    Energy Technology Data Exchange (ETDEWEB)

    2004-05-11

    An important rate-limiting step in the resonance assignment process is accurate identification of resonance peaks in NMR spectra. NMR spectra are noisy. Hence, automatic peak-picking programs must navigate between the Scylla of reliable but incomplete picking, and the Charybdis of noisy but complete picking. Each of these extremes complicates the assignment process: incomplete peak-picking results in the loss of essential connectivities, while noisy picking conceals the true connectivities under a combinatorial explosion of false positives. Intermediate processing can simplify the assignment process by preferentially removing false peaks from noisy peak lists. This is accomplished by requiring consensus between multiple NMR experiments, exploiting a priori information about NMR spectra, and drawing on empirical statistical distributions of chemical shift extracted from the BioMagResBank. Experienced NMR practitioners currently apply many of these techniques "by hand", which is tedious, and may appear arbitrary to the novice. To increase efficiency, we have created a systematic and automated approach to this process, known as APART. Automated pre-processing has three main advantages: reduced tedium, standardization, and pedagogy. In the hands of experienced spectroscopists, the main advantage is reduced tedium (a rapid increase in the ratio of true peaks to false peaks with minimal effort). When a project is passed from hand to hand, the main advantage is standardization. APART automatically documents the peak filtering process by archiving its original recommendations, the accompanying justifications, and whether a user accepted or overrode a given filtering recommendation. In the hands of a novice, this tool can reduce the stumbling block of learning to differentiate between real peaks and noise, by providing real-time examples of how such decisions are made.
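
    A minimal sketch of one of the filters described, requiring consensus between peak lists from multiple experiments; the tolerance, data, and 1D simplification are illustrative, not APART's actual implementation:

        def consensus_filter(peak_lists, tol_ppm=0.03, min_votes=2):
            """Keep peaks from the first list that are corroborated, within a
            chemical-shift tolerance, by at least `min_votes` experiments in
            total. Peaks are 1D shifts here for simplicity; real spectra are
            multidimensional."""
            kept = []
            for p in peak_lists[0]:
                votes = sum(any(abs(p - q) <= tol_ppm for q in other)
                            for other in peak_lists)
                if votes >= min_votes:
                    kept.append(p)
            return kept

        hsqc   = [8.21, 7.95, 4.60, 3.10]        # illustrative shift lists (ppm)
        hncacb = [8.20, 7.50, 4.62]
        print(consensus_filter([hsqc, hncacb]))   # uncorroborated peaks drop out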

  16. Experience of using automated monitoring systems of the strain state of bearing structures on the olympic objects sochi-2014

    Directory of Open Access Journals (Sweden)

    Shakhraman’yan Andrey Mikhaylovich

    2015-12-01

    Full Text Available Various defects that occur under the influence of different environmental factors can cause emergency conditions in building structures. Monitoring certain parameters of bearing structures during their erection and early operation helps detect negative processes that may endanger the mechanical safety of buildings. The authors present the operating results of the automated monitoring system for the bearing structures of the ice arena “Shayba” in the Olympic Park in Sochi during the earthquake of December 23, 2012. The arena was equipped with a dynamic monitoring system, which made it possible to estimate the influence of the seismic event on the building structures, to conclude promptly that the bearing structures were undamaged, and to obtain important data on the dynamic response of the structure.

  17. Organization of film data processing in the PPI-SA automated system

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Perekatov, V.G.

    1984-01-01

    Organization of processing nuclear interaction images at PUOS-type standard devices using the PPI-SA automated system is considered. The system is made in the form of a complete module comprising two scanning measuring projectors and a scanning automatic device which operate in real time on line with the BESM-4 computer. The system comprises: a subsystem for photographic film scanning, selection of events for measurements and preliminary encoding; a subsystem for the formation and generation of libraries with data required for monitoring the scanning automatic device; and a subsystem for precision measurements of separate coordinates on photographic images of nuclear particle tracks and ionization losses. The system software comprises monitoring programs for the projectors and scanning automatic device as well as test functional control programs and an operating system. The programs are organized according to a modular concept. By changing the module set, the system can be modified and adapted for image processing in different fields of science and technology

  18. Process development for automated solar cell and module production. Task 4. Automated array assembly. Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Witham, C.R.

    1979-06-12

    MBA has been working on the automated array assembly task of the Low-Cost Solar Array project. A baseline sequence for the manufacture of solar cell modules is specified. Starting with silicon wafers, the process goes through damage etching, texture etching, junction formation, plasma edge etch, aluminum back surface field formation, and screen printed metallization to produce finished solar cells which are then series connected on a ribbon and bonded into a finished glass, PVB, tedlar module. A number of steps required additional developmental effort to verify technical and economic feasibility. These steps include texture etching, plasma edge etch, aluminum back surface field formation, array layup and interconnect, and module edge sealing and framing.

  19. Monitoring and Control of the Automated Transfer Vehicle

    Science.gov (United States)

    Hugonnet, C.; D'Hoine, S.

    The objective of this paper is to present succinctly the architecture of the heart of the ATV Control Centre: the Monitoring and Control developed by CS for the French Space Agency (CNES) and the European Space Agency (ESA). At the moment, the Monitoring and Control is in the development phase, a first real time version will be delivered to CNES in July 2003, then a second version will be delivered in October including off line capabilities. The following paper introduces the high level specifications and the main driving performance criteria of the monitoring and control system in order to successfully operate these complex ATV space vehicles from the first flight planned in 2004. It presents the approach taken by CS and CNES in order to meet this challenge in a very short time. ATV-CC Monitoring and Control system is based on the reuse of flight proven components that are integrated in a software bus based architecture. The paper particularly shows the advantages of using new computer technologies in operational system: use of Object Oriented technologies from specification, design (UML) to development (C++, Java, PLSQL), use of a CORBA Object Request Broker for the exchange of messages and some centralised services, use of Java for the development of an ergonomic and standardised (for all functions of the M&C) Graphical User Interface and the extensive use of XML for data exchanges.

  20. Highly Automated Agile Testing Process: An Industrial Case Study

    Directory of Open Access Journals (Sweden)

    Jarosław Berłowski

    2016-09-01

    Full Text Available This paper presents a description of an agile testing process in a medium size software project that is developed using Scrum. The research method used was the case study; surveys, quantifiable project data sources and qualitative opinions of project members were used for data collection. Challenges related to the testing process, concerning a complex project environment and unscheduled releases, were identified. Based on the obtained results, we concluded that the described approach addresses the aforementioned issues well. Therefore, recommendations were made with regard to the employed principles of agility, specifically: continuous integration, responding to change, test automation and test driven development. Furthermore, an efficient testing environment that combines a number of test frameworks (e.g. JUnit, Selenium, Jersey Test) with custom-developed simulators is presented.

  1. Automated Grid Monitoring for LHCb through HammerCloud

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  2. Automated analysis of PET based in-vivo monitoring in ion beam therapy

    International Nuclear Information System (INIS)

    Kuess, P.

    2014-01-01

    Particle Therapy (PT)-PET is currently the only clinically approved in-vivo method for monitoring PT. Due to fragmentation processes in the patient's tissue and the beam projectiles, a beta-plus activity distribution (BAD) can be measured during or shortly after the irradiation. The recorded activity map cannot be directly compared to the planned dose distribution. However, by means of a Monte Carlo (MC) simulation it is possible to predict the measured BAD from a treatment plan (TP). Thus, to verify a patient's treatment fraction, the actual PET measurement can be compared to the respective BAD prediction. This comparison is currently performed by visual inspection, which requires experienced evaluators and is rather time consuming. In this PhD thesis an evaluation tool is presented to compare BADs in an automated and objective way. The evaluation method was based on the Pearson correlation coefficient (PCC) – an established measure in medical image processing – which was coded into a software tool. The patient data used to develop, test and validate the software tool were acquired at the GSI research facility, where over 400 patient treatments with 12C were monitored by means of an in-beam PET prototype. The number of data sets was increased by artificially altering BADs to simulate different beam ranges. The automated detection tool was tested on head and neck (H&N), prostate, lung, and brain cases. The treatment planning system TRiP98 was used to generate carbon ion TPs for all cases. From these TPs the respective BAD predictions were derived. Besides the detection of range deviations by means of PT-PET, the automated detection of patient setup uncertainties was also investigated. Although all measured patient data were recorded during the irradiation (in-beam), scenarios in which PET scans are performed shortly after the irradiation (in-room) were also considered. To analyze the achievable precision of PT-PET with the automated evaluation tool based on
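
    The core comparison measure is the Pearson correlation coefficient between measured and predicted activity maps; a minimal numpy sketch (the arrays are stand-ins, not patient data):

        import numpy as np

        def pcc(measured, predicted):
            """Pearson correlation coefficient between two beta+ activity
            distributions, flattened to 1D. Values near 1 indicate the
            measurement matches the Monte Carlo prediction."""
            a, b = np.ravel(measured), np.ravel(predicted)
            a, b = a - a.mean(), b - b.mean()
            return float(a @ b / np.sqrt((a @ a) * (b @ b)))

        rng = np.random.default_rng(1)
        pred = rng.random((32, 32, 32))                  # stand-in predicted BAD
        meas = pred + 0.1 * rng.normal(size=pred.shape)  # noisy "measurement"
        print(f"PCC = {pcc(meas, pred):.3f}")            # close to 1 for a good match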

  3. Automated speech quality monitoring tool based on perceptual evaluation

    OpenAIRE

    Vozňák, Miroslav; Rozhon, Jan

    2010-01-01

    The paper deals with a speech quality monitoring tool which we have developed in accordance with PESQ (Perceptual Evaluation of Speech Quality); it runs automatically and calculates the MOS (Mean Opinion Score). Results are stored in a database and used in a research project investigating how meteorological conditions influence speech quality in a GSM network. The meteorological station located on our university campus provides information about temperature,...

  4. A Comparative Experimental Study on the Use of Machine Learning Approaches for Automated Valve Monitoring Based on Acoustic Emission Parameters

    Science.gov (United States)

    Ali, Salah M.; Hui, K. H.; Hee, L. M.; Salman Leong, M.; Al-Obaidi, M. A.; Ali, Y. H.; Abdelrhman, Ahmed M.

    2018-03-01

    Acoustic emission (AE) analysis has become a vital tool for initiating maintenance tasks in many industries. However, the analysis process and interpretation have been found to be highly dependent on experts. Therefore, an automated monitoring method would be required to reduce the cost and time consumed in the interpretation of AE signals. This paper investigates the application of two of the most common machine learning approaches, namely artificial neural network (ANN) and support vector machine (SVM), to automate the diagnosis of valve faults in reciprocating compressors based on AE signal parameters. Since accuracy is an essential factor in any automated diagnostic system, this paper also provides a comparative study based on the predictive performance of ANN and SVM. AE parameter data were acquired from a single stage reciprocating air compressor with different operational and valve conditions. ANN and SVM diagnosis models were subsequently devised by combining AE parameters of different conditions. Results demonstrate that the ANN and SVM models give the same results in terms of prediction accuracy. However, the SVM model is recommended for automated diagnosis of valve condition due to its ability to handle a high number of input features with small sampling data sets.
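
    A compact scikit-learn sketch of the comparison described, using synthetic stand-in features; the paper's AE parameters and valve conditions are not reproduced here:

        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Synthetic stand-in for AE parameter vectors (e.g. amplitude, counts,
        # energy, rise time) labeled with valve condition classes.
        X, y = make_classification(n_samples=300, n_features=6, n_informative=4,
                                   n_classes=3, n_clusters_per_class=1,
                                   random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        for name, clf in [("ANN", MLPClassifier(hidden_layer_sizes=(20,),
                                                max_iter=2000, random_state=0)),
                          ("SVM", SVC(kernel="rbf", C=10.0))]:
            model = make_pipeline(StandardScaler(), clf)
            model.fit(X_tr, y_tr)
            print(name, f"accuracy = {model.score(X_te, y_te):.2f}")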

  5. An automated approach to monitoring low-level waste allocations

    International Nuclear Information System (INIS)

    Fuchs, R.L.; Hill, G.R.

    1986-01-01

    A program is underway to identify a means for monitoring low-level waste (LLW) allocations for compliance with the limited availability provisions contained in the Low-Level Radioactive Waste Policy Amendments Act of 1985. These amendments establish annual disposal limitations at the operating sites through transition, licensing, and construction periods and set out detailed short term reactor allocations to ensure compliance. The approach is complex because it assigns allocations according to reactor type and licensure date, allows allocation aggregation among commonly-owned units, permits the carrying forward of unused capacity from one period to the next, and provides for exchanges of allocations. Successful implementation of the congressional mandate will require careful monitoring of the division of capacity. This task can be accomplished with a sophisticated information management system. The Southern States Energy Board and the DOE Low-Level Waste Management Program, through its lead contractor EG and G Idaho, Inc., in cooperation with South Carolina and the Southeast Interstate Low-Level Radioactive Waste Management Compact Commission, are developing a computer interface that will allow the exchange of information with the U.S. Department of Energy's National Low-Level Waste Information System. This computer network is being used as a regional prototype for monitoring low-level waste allocations

  6. DPMine Graphical Language for Automation of Experiments in Process Mining

    Directory of Open Access Journals (Sweden)

    S. A. Shershakov

    2014-01-01

    Full Text Available Process mining is a new direction in the field of modeling and analysis of processes, where the use of information from event logs describing the history of the system behavior plays an important role. Methods and approaches used in process mining are often based on various heuristics, and experiments with large event logs are crucial for the study and comparison of the developed methods and algorithms. Such experiments are very time consuming, so automation of experiments is an important task in the field of process mining. This paper presents the DPMine language, developed specifically to describe and carry out experiments on the discovery and analysis of process models. The basic concepts of the DPMine language as well as the principles and mechanisms of its extension are described. Ways of integrating the DPMine language as dynamically loaded components into the VTMine modeling tool are considered. An illustrative example of an experiment for building a fuzzy model of the process discovered from log data stored in a normalized database is given.

  7. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of a process automation system for the Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards understanding of the plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access for diagnostics, manual control of subsystems, monitoring of a large number of signals, etc. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing a major upgrade for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine's operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for heating the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
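
    A sketch of polling one subsystem over an RS485/Modbus backbone like the one described, using the pymodbus library (3.x API assumed); the port, slave ID, register addresses, and scaling below are hypothetical:

        from pymodbus.client import ModbusSerialClient

        client = ModbusSerialClient(port="/dev/ttyUSB0", baudrate=19200,
                                    parity="N", stopbits=1, timeout=1.0)
        if client.connect():
            # e.g. two holding registers holding a filament current reading
            rr = client.read_holding_registers(address=0x10, count=2, slave=1)
            if not rr.isError():
                raw = (rr.registers[0] << 16) | rr.registers[1]
                print(f"filament current = {raw / 1000:.3f} A")  # assumed scaling
            client.close()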

  9. Automated Groundwater Monitoring of Uranium at the Hanford Site, Washington - 13116

    International Nuclear Information System (INIS)

    Burge, Scott R.; O'Hara, Matthew J.

    2013-01-01

    An automated groundwater monitoring system for the detection of uranyl ion in groundwater was deployed at the 300 Area Industrial Complex, Hanford Site, Washington. The research was conducted to determine if at-site, automated monitoring of contaminant movement in the subsurface is a viable alternative to the baseline manual sampling and analytical laboratory assay methods currently employed. The monitoring system used Arsenazo III, a colorimetric chelating compound, for the detection of the uranyl ion. The analytical system had a limit of quantification of approximately 10 parts per billion (ppb, μg/L). The EPA's drinking water maximum contaminant level (MCL) is 30 ppb [1]. In addition to the uranyl ion assay, the system was capable of acquiring temperature, conductivity, and river level data. The system was fully automated and could be operated remotely. The system was capable of collecting water samples from four sampling sources, quantifying the uranyl ion, and periodically performing a calibration of the analytical cell. The system communications were accomplished by way of cellular data link with the information transmitted through the internet. Four water sample sources were selected for the investigation: one location provided samples of Columbia River water, and the remaining three sources provided groundwater from aquifer sampling tubes positioned in a vertical array at the Columbia River shoreline. The typical sampling schedule was to sample the four locations twice per day with one calibration check per day. This paper outlines the instrumentation employed, the operation of the instrumentation, and analytical results for a period of time between July and August, 2012. The presentation includes the uranyl ion concentration and conductivity results from the automated sampling/analysis system, along with a comparison between the automated monitor's analytical performance and an independent laboratory analysis. Benefits of using the automated system as an
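
    The colorimetric assay reduces to a linear (Beer-Lambert) calibration of absorbance against uranyl concentration; a minimal sketch with made-up standards:

        import numpy as np

        # Linear calibration for a colorimetric assay: absorbance of the
        # uranyl-Arsenazo III complex vs. concentration, assumed linear in
        # the working range. Standard values below are illustrative only.
        std_ppb = np.array([0.0, 10.0, 25.0, 50.0, 100.0])
        std_abs = np.array([0.002, 0.031, 0.077, 0.152, 0.305])

        slope, intercept = np.polyfit(std_ppb, std_abs, 1)

        def uranium_ppb(absorbance):
            """Invert the calibration line to report a concentration."""
            return (absorbance - intercept) / slope

        a = 0.095                  # absorbance of an unknown groundwater sample
        c = uranium_ppb(a)
        print(f"{c:.1f} ppb", "EXCEEDS 30 ppb MCL" if c > 30 else "below MCL")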

  10. Automation of a problem list using natural language processing

    Directory of Open Access Journals (Sweden)

    Haug Peter J

    2005-08-01

    Full Text Available Abstract Background The medical problem list is an important part of the electronic medical record in development in our institution. To serve the functions it is designed for, the problem list has to be as accurate and timely as possible. However, the current problem list is usually incomplete and inaccurate, and is often totally unused. To alleviate this issue, we are building an environment where the problem list can be easily and effectively maintained. Methods For this project, 80 medical problems were selected for their frequency of use in our future clinical field of evaluation (cardiovascular. We have developed an Automated Problem List system composed of two main components: a background and a foreground application. The background application uses Natural Language Processing (NLP to harvest potential problem list entries from the list of 80 targeted problems detected in the multiple free-text electronic documents available in our electronic medical record. These proposed medical problems drive the foreground application designed for management of the problem list. Within this application, the extracted problems are proposed to the physicians for addition to the official problem list. Results The set of 80 targeted medical problems selected for this project covered about 5% of all possible diagnoses coded in ICD-9-CM in our study population (cardiovascular adult inpatients, but about 64% of all instances of these coded diagnoses. The system contains algorithms to detect first document sections, then sentences within these sections, and finally potential problems within the sentences. The initial evaluation of the section and sentence detection algorithms demonstrated a sensitivity and positive predictive value of 100% when detecting sections, and a sensitivity of 89% and a positive predictive value of 94% when detecting sentences. Conclusion The global aim of our project is to automate the process of creating and maintaining a problem

  11. Automation of the DoD Export License Application Review Process

    National Research Council Canada - National Science Library

    Young, Shelton

    2002-01-01

    .... The overall audit objective was to determine whether Federal automation programs supporting the export license and review process could be used to establish a common electronic interface creating...

  12. Automated Machinery Health Monitoring Using Stress Wave Analysis & Artificial Intelligence

    National Research Council Canada - National Science Library

    Board, David

    1998-01-01

    .... Army, for application to helicopter drive train components. The system will detect structure borne, high frequency acoustic data, and process it with feature extraction and polynomial network artificial intelligence software...

  13. A rapid automated procedure for laboratory and shipboard spectrophotometric measurements of seawater alkalinity: continuously monitored single-step acid additions

    Science.gov (United States)

    Liu, X.; Byrne, R. H.; Lindemuth, M.; Easley, R. A.; Patsavas, M. C.

    2012-12-01

    An automated system for shipboard and laboratory alkalinity measurements is presented. The simple system, which consists of a Dosimat titrator to deliver acid volumetrically and a USB 4000 spectrophotometer to monitor the titration progress, provides fast, precise and accurate measurements of total alkalinity for oceanographic research. The analytical method is based on single-point HCl titrations of seawater samples of known volume; bromocresol purple is used as an indicator to determine the final pH. Field data from an Arctic cruise demonstrate an accuracy and precision of around 1 μmol/kg and a sample processing rate of 6 min per sample.
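
    The single-point method amounts to a simple proton balance: one measured HCl addition takes the sample past the alkalinity endpoint, and the excess acid, determined spectrophotometrically from the indicator, is subtracted. A commonly used form of the single-point relation (symbols defined below; not quoted from this summary) is

        A_T \approx \frac{N_A V_A - [\mathrm{H^+}]\,(V_0 + V_A)}{V_0}

    where N_A and V_A are the molarity and volume of the added HCl, V_0 is the initial sample volume, and [H^+] is the residual hydrogen ion concentration computed from the bromocresol purple absorbance ratio at the final pH.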

  14. AIRCRAFT POWER SUPPLY SYSTEM DESIGN PROCESS AS AN AUTOMATION OBJECT

    Directory of Open Access Journals (Sweden)

    Boris V. Zhmurov

    2018-01-01

    aircraft and take into account all the requirements of the customer and the regulatory and technical documentation is its automation. Automation of the design of an aircraft EPS as an optimization task involves the formalization of the object of optimization, as well as the choice of the efficiency criterion and control actions. By the object of optimization we here mean the EPS design process, whose formalization also includes formalizing the design object – the aircraft power supply system.

  15. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in OMB...

  16. Process monitoring of abrasive waterjet formation

    OpenAIRE

    Putz, Matthias; Dittrich, Markus; Dix, Martin

    2016-01-01

    Difficult to machine materials require innovative processing solutions for a stable and high quality contouring process of complex forms. Abrasive waterjet cutting gains in importance due to the continuous development of novel high performance materials and multi-material components. Reliable process monitoring during the machining operation becomes essential to avoid waste production. However, the measurement of the process conditions during abrasive waterjet cutting is difficult based on ...

  17. Automation of the CFD Process on Distributed Computing Systems

    Science.gov (United States)

    Tejnil, Ed; Gee, Ken; Rizk, Yehia M.

    2000-01-01

    A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational
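
    For resources without queueing software, the scripts implemented a simple first-in-first-out queue; a minimal Python analogue of that idea (the original used UNIX shell and Perl, and the solver commands below are placeholders):

        import subprocess
        from collections import deque

        def run_fifo(cases, max_concurrent=2):
            """Submit solver runs first-in-first-out, never exceeding
            `max_concurrent` simultaneous jobs. Each case is a command list,
            e.g. ["ins2d", "case_a.inp"] (placeholder; any solver works)."""
            pending, running = deque(cases), []
            while pending or running:
                running = [p for p in running if p.poll() is None]  # reap finished
                while pending and len(running) < max_concurrent:
                    running.append(subprocess.Popen(pending.popleft()))
                if running:
                    running[0].wait()  # block until the oldest job finishes

        run_fifo([["echo", "case_a"], ["echo", "case_b"], ["echo", "case_c"]])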

  18. Quality Control in Automated Manufacturing Processes – Combined Features for Image Processing

    Directory of Open Access Journals (Sweden)

    B. Kuhlenkötter

    2006-01-01

    Full Text Available In production processes the use of image processing systems is widespread. Hardware solutions and cameras respectively are available for nearly every application. One important challenge of image processing systems is the development and selection of appropriate algorithms and software solutions in order to realise ambitious quality control for production processes. This article characterises the development of innovative software that combines features for automatic defect classification on product surfaces. The artificial intelligence method Support Vector Machine (SVM) is used to execute the classification task according to the combined features. This software is one crucial element for the automation of a manually operated production process
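
    A toy sketch of the combined-features idea: several simple feature sets computed from a surface patch are concatenated into one vector and fed to an SVM classifier (features and data below are synthetic stand-ins, not the article's feature set):

        import numpy as np
        from sklearn.svm import SVC

        def patch_features(patch):
            """Combine intensity statistics with a coarse gradient-magnitude
            histogram into one feature vector (illustrative stand-ins)."""
            gy, gx = np.gradient(patch.astype(float))
            grad_hist, _ = np.histogram(np.hypot(gx, gy), bins=8, range=(0, 64))
            stats = [patch.mean(), patch.std(), patch.min(), patch.max()]
            return np.concatenate([stats, grad_hist / grad_hist.sum()])

        rng = np.random.default_rng(0)
        ok     = [rng.integers(100, 120, (16, 16)) for _ in range(40)]  # smooth
        defect = [rng.integers(60, 200, (16, 16)) for _ in range(40)]   # scratched
        X = np.array([patch_features(p) for p in ok + defect])
        y = np.array([0] * 40 + [1] * 40)

        clf = SVC(kernel="rbf", C=10.0).fit(X, y)
        print("training accuracy:", clf.score(X, y))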

  19. An improved approach for process monitoring in laser material processing

    Science.gov (United States)

    König, Hans-Georg; Pütsch, Oliver; Stollenwerk, Jochen; Loosen, Peter

    2016-04-01

    Process monitoring is used in many different laser material processes due to the demand for reliable and stable processes. Among the different methods, on-axis process monitoring offers multiple advantages. To observe a laser material process, it is unavoidable to choose an observation wavelength different from the one used for material processing; otherwise the light of the processing laser would outshine the image of the process. By choosing a different wavelength, lateral chromatic aberration occurs in optical systems with optical scanning units and f-Theta lenses that are not chromatically corrected. These aberrations lead to a truncated image of the process on the camera or the pyrometer. This is the reason for distorted measurements and unsatisfactory images of the process. A new approach for solving the problem of field dependent lateral chromatic aberration in process monitoring is presented. The scanner-based optical system is reproduced in a simulation environment to predict the occurring lateral chromatic aberrations. In addition, a second deflecting system is integrated into the system. Using the simulation, a predictive control is designed that uses the additional deflecting system to introduce reverse lateral deviations in order to compensate the lateral effect of chromatic aberration. This paper illustrates the concept and the implementation of the predictive control used to eliminate lateral chromatic aberrations in process monitoring, the simulation on which the system is based, the optical system, as well as the control concept.

  20. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    Science.gov (United States)

    2018-01-01

    ARL-TR-8270 ● JAN 2018 ● US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom. Reporting period: 1 October 2016 – 30 September 2017.

  1. Water quality monitoring using an automated portable fiber optic biosensor: RAPTOR

    Science.gov (United States)

    Anderson, George P.; Rowe-Taitt, Chris A.

    2001-03-01

    The RAPTOR is a portable, automated biosensor capable of performing rapid, ten-minute assays on a sample for four target analytes simultaneously. Samples are analyzed using a fluorescent sandwich immunoassay on the surface of short polystyrene optical probes with capture antibody adsorbed to the probe surface. Target analytes bound to the fiber by capture antibodies are detected with fluorescently labeled tracer antibodies, which are held in a separate reservoir. Since target recognition is a two-step process, selectivity is enhanced, and the optical probes can be reused up to forty times, or until a positive result is obtained. This greatly reduces the logistical burden for field operations. Numerous assays for toxins, such as SEB and ricin, and bacteria, such as Bacillus anthracis and Francisella tularensis, have been developed for the RAPTOR. An assay of particular interest for water quality monitoring and the screening of fruits and vegetables is detection of Giardia cysts. Giardia lamblia is a parasitic protozoan common in the developing world that causes severe intestinal infections. Thus, a simple field assay for screening water supplies would be highly useful. Such an assay has been developed using the RAPTOR. The detection limit for Giardia cysts was 5×10^4/ml for a 10-minute assay.

  2. Using process-oriented interfaces for solving the automation paradox in highly automated navy vessels

    NARCIS (Netherlands)

    Diggelen, J. van; Post, W.; Rakhorst, M.; Plasmeijer, R.; Staal, W. van

    2014-01-01

    This paper describes a coherent engineering method for developing high level human machine interaction within a highly automated environment consisting of sensors, actuators, automatic situation assessors and planning devices. Our approach combines ideas from cognitive work analysis, cognitive

  3. Advanced monitoring with complex stream processing

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution which is used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP) that analyses series of events to derive conclusions from them.

  4. Automated integration of continuous glucose monitor data in the electronic health record using consumer technology

    OpenAIRE

    Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A

    2016-01-01

    The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishm...

  5. FY-2010 Process Monitoring Technology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Christopher R.; Bryan, Samuel A.; Casella, Amanda J.; Hines, Wes; Levitskaia, Tatiana G.; henkell, J.; Schwantes, Jon M.; Jordan, Elizabeth A.; Lines, Amanda M.; Fraga, Carlos G.; Peterson, James M.; Verdugo, Dawn E.; Christensen, Ronald N.; Peper, Shane M.

    2011-01-01

    During FY 2010, work under the Spectroscopy-Based Process Monitoring task included ordering and receiving four fluid flow meters and four flow visible-near infrared spectrometer cells to be instrumented within the centrifugal contactor system at Pacific Northwest National Laboratory (PNNL). Initial demonstrations of real-time spectroscopic measurements on cold-stream simulants were conducted using plutonium (Pu)/uranium (U) (PUREX) solvent extraction process conditions. The specific test case examined the extraction of neodymium nitrate (Nd(NO3)3) from an aqueous nitric acid (HNO3) feed into a tri-n-butyl phosphate (TBP)/n-dodecane solvent. Demonstration testing of this system included diverting a sample from the aqueous feed while monitoring the process in every phase using the on-line spectroscopic process monitoring system. The purpose of this demonstration was to test whether spectroscopic monitoring is capable of determining the mass balance of metal nitrate species involved in a cross-current solvent extraction scheme while also diverting a sample from the system. The diversion scenario involved diverting a portion of the feed from a counter-current extraction system while a continuous extraction experiment was underway. A successful test would demonstrate the ability of the process monitoring system to detect and quantify the diversion of material from the system during a real-time continuous solvent extraction experiment. The system was designed to mimic a PUREX-type extraction process with a bank of four centrifugal contactors. The aqueous feed contained Nd(NO3)3 in HNO3, and the organic phase was composed of TBP/n-dodecane. The amount of sample observed to be diverted by on-line spectroscopic process monitoring was measured to be 3 mmol (3 × 10^-3 mol) Nd3+. This value was in excellent agreement with the 2.9 mmol Nd3+ value based on the known mass of sample taken (i.e., diverted) directly from the system feed solution.
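
    The diverted quantity follows from integrating concentration times flow over the diversion window; a minimal sketch with illustrative numbers chosen only to reproduce the reported order of magnitude (the experiment's actual flows and concentrations are not given in this summary):

        import numpy as np

        # Moles diverted = integral of C(t) * Q(t) dt over the diversion window.
        # Concentrations would come from the on-line vis-NIR measurement of
        # Nd(NO3)3 in the feed; all values below are illustrative only.
        t = np.linspace(0.0, 600.0, 61)            # s, diversion window
        conc = np.full_like(t, 0.05)               # mol/L Nd3+ in the diverted stream
        flow = np.full_like(t, 1.0e-4)             # L/s diverted flow rate

        moles = np.trapezoid(conc * flow, t)       # NumPy >= 2.0; np.trapz on older
        print(f"diverted Nd3+ = {moles * 1e3:.1f} mmol")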

  6. Effect of Automated Prescription Drug Monitoring Program Queries on Emergency Department Opioid Prescribing.

    Science.gov (United States)

    Sun, Benjamin C; Charlesworth, Christina J; Lupulescu-Mann, Nicoleta; Young, Jenny I; Kim, Hyunjee; Hartung, Daniel M; Deyo, Richard A; McConnell, K John

    2018-03-01

    We assess whether an automated prescription drug monitoring program intervention in emergency department (ED) settings is associated with reductions in opioid prescribing and quantities. We performed a retrospective cohort study of ED visits by Medicaid beneficiaries. We assessed the staggered implementation (pre-post) of automated prescription drug monitoring program queries at 86 EDs in Washington State from January 1, 2013, to September 30, 2015. The outcomes included any opioid prescribed within 1 day of the index ED visit and total dispensed morphine milligram equivalents. The exposure was the automated prescription drug monitoring program query intervention. We assessed program effects stratified by previous high-risk opioid use. We performed multiple sensitivity analyses, including restriction to pain-related visits, restriction to visits with a confirmed prescription drug monitoring program query, and assessment of 6 specific opioid high-risk indicators. The study included 1,187,237 qualifying ED visits (898,162 preintervention; 289,075 postintervention). Compared with the preintervention period, automated prescription drug monitoring program queries were not significantly associated with reductions in the proportion of visits with opioid prescribing (5.8 per 1,000 encounters; 95% confidence interval [CI] -0.11 to 11.8) or the amount of prescribed morphine milligram equivalents (difference 2.66; 95% CI -0.15 to 5.48). There was no evidence of selective reduction in patients with previous high-risk opioid use (1.2 per 1,000 encounters, 95% CI -9.5 to 12.0; morphine milligram equivalents 1.22, 95% CI -3.39 to 5.82). The lack of a selective reduction in high-risk patients was robust to all sensitivity analyses. An automated prescription drug monitoring program query intervention was not associated with reductions in ED opioid prescribing or quantities, even in patients with previous high-risk opioid use. Copyright © 2017 American College of Emergency Physicians.

  7. Automating the Human Factors Engineering and Evaluation Processes

    International Nuclear Information System (INIS)

    Mastromonico, C.

    2002-01-01

    The Westinghouse Savannah River Company (WSRC) has developed a software tool for automating the Human Factors Engineering (HFE) design review, analysis, and evaluation processes. The tool provides a consistent, cost effective, graded, user-friendly approach for evaluating process control system Human System Interface (HSI) specifications, designs, and existing implementations. The initial set of HFE design guidelines, used in the tool, was obtained from NUREG- 0700. Each guideline was analyzed and classified according to its significance (general concept vs. supporting detail), the HSI technology (computer based vs. non-computer based), and the HSI safety function (safety vs. non-safety). Approximately 10 percent of the guidelines were determined to be redundant or obsolete and were discarded. The remaining guidelines were arranged in a Microsoft Access relational database, and a Microsoft Visual Basic user interface was provided to facilitate the HFE design review. The tool also provides the capability to add new criteria to accommodate advances in HSI technology and incorporate lessons learned. Summary reports produced by the tool can be easily ported to Microsoft Word and other popular PC office applications. An IBM compatible PC with Microsoft Windows 95 or higher is required to run the application

  8. Problems of collaborative work of the automated process control system (APCS) and its information security, and solutions.

    Science.gov (United States)

    Arakelyan, E. K.; Andryushin, A. V.; Mezin, S. V.; Kosoy, A. A.; Kalinina, Ya V.; Khokhlov, I. S.

    2017-11-01

    The paper proposes a principle for the interaction between the technological protection systems of the automated process control system (APCS) and the information security system in the case of incorrect execution of a technological protection algorithm: the correctness of each protection operation is checked in each specific situation using the functional relationships between the monitored parameters. A methodology for assessing the economic feasibility of developing and implementing the information security system is also presented.

  9. On-line process control monitoring system

    International Nuclear Information System (INIS)

    O'Rourke, P.E.; Van Hare, D.R.; Prather, W.S.

    1992-01-01

    This patent describes apparatus for monitoring, at a plurality of locations within a system, the concentration of at least one chemical substance involved in a chemical process. It comprises a plurality of process cells; first means for carrying the light; second means for carrying the light; means for producing a spectrum from the light received by the second carrying means; multiplexing means for selecting one process cell of the plurality of process cells at a time so that the producing means can produce a process spectrum from the one cell of the process cells; a reference cell for producing a reference spectrum for comparison to the process spectrum; a standard cell for producing a standard spectrum for comparison to the process spectrum; and means for comparing the reference spectrum, the standard spectrum and the process spectrum and determining the concentration of the chemical substance in the process cell.
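
    The comparison of reference, standard and process spectra described in the patent amounts to a Beer-Lambert ratio; below is a minimal Python sketch under that assumption, with invented intensities, wavelength channels and standard concentration.

        import numpy as np

        # Beer-Lambert: A = eps * b * C. With a standard of known concentration
        # measured in an identical cell, C_process ~ C_std * (A_process / A_std).
        def absorbance(intensity, reference_intensity):
            return -np.log10(intensity / reference_intensity)

        I_ref = np.array([1.00, 1.00, 1.00])     # blank/reference cell intensities
        I_std = np.array([0.50, 0.40, 0.60])     # standard cell, C_std known
        I_proc = np.array([0.35, 0.25, 0.45])    # process cell, C unknown

        A_std = absorbance(I_std, I_ref)
        A_proc = absorbance(I_proc, I_ref)
        C_std = 0.10  # mol/L, illustrative standard concentration

        # Least-squares scale factor across the selected wavelength channels
        scale = np.dot(A_proc, A_std) / np.dot(A_std, A_std)
        print(f"process concentration ~ {scale * C_std:.3f} mol/L")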

  10. Monitoring and controlling the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Ahring, B.K.; Angelidaki, I. [The Technical Univ. of Denmark, Dept. of Environmental Science and Engineering, Lyngby (Denmark)

    1997-08-01

    Many modern large-scale biogas plants have been constructed recently, increasing the demand for proper monitoring and control of these large reactor systems. For monitoring the biogas process, an easy-to-measure and reliable indicator is required, which reflects the metabolic state and the activity of the bacterial populations in the reactor. In this paper, we discuss existing indicators as well as indicators under development which can potentially be used to monitor the state of the biogas process in a reactor. Furthermore, data are presented from two large-scale thermophilic biogas plants that were subjected to temperature changes and where the concentration of volatile fatty acids (VFA) was monitored. The results clearly demonstrated that significant changes in the concentration of the individual VFAs occurred although the biogas production was not significantly changed. In particular, the concentrations of butyrate, isobutyrate and isovalerate showed significant changes. Future improvements of process control could therefore be based on monitoring of the concentration of specific VFAs together with information about the bacterial populations in the reactor. The latter information could be supplied by the use of modern molecular techniques. (au) 51 refs.

  11. Automated force volume image processing for biological samples.

    Directory of Open Access Journals (Sweden)

    Pavel Polyakov

    2011-04-01

    Full Text Available Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular, bacteria. The goal is to provide fully automated tools to achieve theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for the processing of approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between AFM probe and bacterium are accounted for and mechanical interactions operating after contact are described in terms of Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted making use of a regression procedure for fitting experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
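
    A minimal Python sketch of the robust critical-point detection step is given below. It only detects jumps, via a median/MAD threshold on the local derivative, whereas the paper's algorithms also detect changes of slope and curvature; the simulated force curve and all parameters are invented.

        import numpy as np

        def critical_points(z, f, jump_sigma=5.0):
            """Return indices where the force curve changes abruptly (jumps),
            flagged when the point-to-point derivative deviates from its median
            by more than jump_sigma times the robust (MAD-based) noise level."""
            df = np.diff(f) / np.diff(z)
            noise = 1.4826 * np.median(np.abs(df - np.median(df)))  # robust sigma
            return np.where(np.abs(df - np.median(df)) > jump_sigma * noise)[0]

        rng = np.random.default_rng(0)
        z = np.linspace(0, 100, 1001)                 # separation distance (nm)
        f = 0.02 * z + rng.normal(0.0, 0.01, z.size)  # noisy baseline force (nN)
        f[400:] += 1.5                                # simulated adhesion jump
        print(critical_points(z, f))                  # -> index near 399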

  12. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    Science.gov (United States)

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

    The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and found that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunity and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
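
    Below is a toy Python simulation of why the abundance of detections matters when false positives occur. The simple two-detection cutoff stands in for the paper's hierarchical model, and all rates are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(42)
        n_sites, n_visits = 200, 10
        psi, p_true, p_false = 0.5, 0.4, 0.05  # occupancy, detection, false-positive

        occupied = rng.random(n_sites) < psi
        # Per-visit detection probability: true + false detections if occupied,
        # false detections only if not (a deliberate simplification).
        detections = rng.binomial(n_visits, np.where(occupied, p_true + p_false, p_false))

        # A naive "any detection" rule badly overestimates occupancy...
        naive = np.mean(detections > 0)
        # ...while using the abundance of detections gets much closer to the truth.
        count_aware = np.mean(detections >= 2)
        print(f"true psi={occupied.mean():.2f}  naive={naive:.2f}  "
              f"count-aware={count_aware:.2f}")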

  13. Electronic Tongue-FIA system for the Monitoring of Heavy Metal Biosorption Processes

    Science.gov (United States)

    Wilson, D.; Florido, A.; Valderrama, C.; de Labastida, M. Fernández; Alegret, S.; del Valle, M.

    2011-09-01

    An automated flow injection potentiometric (FIP) system with electronic tongue detection (ET) was used for monitoring biosorption processes of heavy metals on waste biomaterial. Grape stalk wastes were used as the biosorbent to remove Cu2+ ions in a fixed-bed column setup. For the monitoring, the ET employed a sensor array formed by Cu2+- and Ca2+-selective electrodes and two generic heavy-metal electrodes. The cross-response obtained was then processed by a multilayer artificial neural network (ANN) model in order to resolve the concentrations of the monitored species. Coupling the electronic tongue with the automation features of the flow-injection system (ET-FIP) allowed us to accurately characterize the biosorption process by obtaining its breakthrough curves. In parallel, fractions of the extract solution were analyzed by atomic absorption spectroscopy in order to validate the results obtained with the reported methodology.
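
    A minimal Python sketch of the ANN calibration step follows, mapping a four-electrode cross-response to two ion concentrations with scikit-learn; the Nernstian-like slopes and noise are synthetic stand-ins for real calibration data.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)
        # Synthetic calibration set: 4 electrode potentials (mV) with a
        # Nernstian-like cross response to two ions (e.g. Cu2+ and Ca2+).
        conc = rng.uniform(1e-5, 1e-3, size=(300, 2))          # mol/L
        slopes = np.array([[29.0, 3.0], [4.0, 28.0],
                           [15.0, 10.0], [9.0, 18.0]])         # mV per decade
        emf = np.log10(conc) @ slopes.T + rng.normal(0.0, 0.5, (300, 4))

        # Learn the inverse mapping: potentials -> log10 concentrations
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(8,),
                                           max_iter=5000, random_state=0))
        model.fit(emf, np.log10(conc))
        print(10 ** model.predict(emf[:3]))                    # back to mol/L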

  14. Automated Processing of Zebrafish Imaging Data: A Survey

    Science.gov (United States)

    Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A.; Kausler, Bernhard X.; Ledesma-Carbayo, María J.; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine

    2013-01-01

    Abstract Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines. PMID:23758125

  15. Robust processing of mining subsidence monitoring data

    Energy Technology Data Exchange (ETDEWEB)

    Wang Mingzhong; Huang Guogang [Pingdingshan Mining Bureau (China); Wang Yunjia; Guogangli [China Univ. of Mining and Technology, Xuzhou (China)

    1996-12-31

    Since China began research on mining subsidence in the 1950s, more than one thousand observation lines have been surveyed. Yet monitoring data sometimes contain quite a lot of outliers because of the limitations of observation and of geological and mining conditions. Nowadays in China, the processing of mining subsidence monitoring data is based on the principle of least squares, which can produce lower accuracy, less reliability, or even errors. For the reasons given above, the authors, taking account of the actual situation in China, have done research on the robust processing of mining subsidence monitoring data with respect to obtaining prediction parameters. The authors have derived the related formulas, designed computational programs, carried out a large quantity of calculations and simulations, and achieved good results. (orig.)
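
    A minimal Python sketch contrasting least-squares and robust (Huber) fitting on levelling data containing gross outliers, in the spirit of the robust processing advocated above; the data are synthetic and the linear model is only an illustration, not the paper's prediction model.

        import numpy as np
        from sklearn.linear_model import HuberRegressor, LinearRegression

        rng = np.random.default_rng(7)
        x = np.linspace(0, 500, 40)[:, None]       # distance along line (m)
        subsidence = 2.0 + 0.01 * x.ravel() + rng.normal(0, 0.5, 40)
        subsidence[[5, 17]] += 25.0                # gross outliers in the data

        ols = LinearRegression().fit(x, subsidence)
        huber = HuberRegressor().fit(x, subsidence)
        print(f"OLS slope:    {ols.coef_[0]:.4f}")   # pulled away by outliers
        print(f"robust slope: {huber.coef_[0]:.4f}") # close to the true 0.01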

  16. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    Science.gov (United States)

    2018-01-09

    ARL-TR-8272 ● JAN 2018 ● US Army Research Laboratory. Technical report describing an automated energy detection algorithm based on morphological and statistical processing techniques.

  17. Monitoring of batch processes using spectroscopy

    NARCIS (Netherlands)

    Gurden, S. P.; Westerhuis, J. A.; Smilde, A. K.

    2002-01-01

    There is an increasing need for new techniques for the understanding, monitoring and control of batch processes. Spectroscopy is now becoming established as a means of obtaining real-time, high-quality chemical information at frequent time intervals and across a wide range of industrial ...

  18. Monitoring process variability using auxiliary information

    NARCIS (Netherlands)

    Riaz, M.

    2008-01-01

    In this study, a Shewhart-type control chart, namely the V(r) chart, is proposed for improved monitoring of the process variability (targeting large shifts) of a quality characteristic of interest Y. The proposed control chart is based on a regression-type estimator of the variance using a single auxiliary variable ...
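
    A minimal Python sketch of a Shewhart-type variance chart with probability limits follows. Note that the plain sample variance below stands in for the paper's auxiliary-variable regression estimator, and the target variance and subgroup size are invented.

        import numpy as np
        from scipy import stats

        # Chi-square probability limits for S^2 when the in-control variance
        # is sigma0_sq, subgroup size n, false-alarm rate alpha.
        sigma0_sq, n, alpha = 4.0, 10, 0.0027
        lcl = sigma0_sq * stats.chi2.ppf(alpha / 2, n - 1) / (n - 1)
        ucl = sigma0_sq * stats.chi2.ppf(1 - alpha / 2, n - 1) / (n - 1)

        rng = np.random.default_rng(3)
        sample = rng.normal(0.0, 4.0, n)   # subgroup drawn with doubled sigma
        s2 = sample.var(ddof=1)
        signal = not (lcl <= s2 <= ucl)
        print(f"S^2={s2:.2f}, limits=({lcl:.2f}, {ucl:.2f}), signal={signal}")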

  19. Monitoring process mean level using auxiliary information

    NARCIS (Netherlands)

    Riaz, M.

    2008-01-01

    In this study, a Shewhart-type control chart is proposed for the improved monitoring of the process mean level (targeting both moderate and large shifts, which is the major concern of Shewhart-type control charts) of a quality characteristic of interest Y. The proposed control chart, namely the M-r ...

  20. Concurrent Pilot Instrument Monitoring in the Automated Multi-Crew Airline Cockpit.

    Science.gov (United States)

    Jarvis, Stephen R

    2017-12-01

    Pilot instrument monitoring has been described as "inadequate," "ineffective," and "insufficient" after multicrew aircraft accidents. Regulators have called for improved instrument monitoring by flight crews, but scientific knowledge in the area is scarce. Research has tended to investigate the monitoring of individual pilots in the pilot-flying role; very little research has looked at crew monitoring, or at the "monitoring-pilot" role, despite it being half of the apparent problem. Eye-tracking data were collected from 17 properly constituted and current Boeing 737 crews operating in a full motion simulator. Each crew flew four realistic flight segments, with pilots swapping between the pilot-flying and pilot-monitoring roles, with and without the autopilot engaged. Analysis was performed on the 375 maneuvering-segments prior to localizer intercept. Autopilot engagement led to significantly less visual dwell time on the attitude director indicator (mean dwell falling from 212.8 to 47.8 s for the flying pilot and from 58.5 to 39.8 s for the monitoring-pilot) and an associated increase on the horizontal situation indicator (from 18.0 to 52.5 s and from 36.4 to 50.5 s, respectively). The flying-pilots' withdrawal of attention from the primary flight reference and increased attention to the primary navigational reference was paralleled rather than complemented by the monitoring-pilot, suggesting that monitoring vulnerabilities can be duplicated in the flight deck. It is therefore possible that accident causes identified as "inadequate" or "insufficient" monitoring are in fact a result of parallel monitoring. Jarvis SR. Concurrent pilot instrument monitoring in the automated multi-crew airline cockpit. Aerosp Med Hum Perform. 2017; 88(12):1100-1106.

  1. Automated Mobility Transitions: Governing Processes in the UK

    Directory of Open Access Journals (Sweden)

    Debbie Hopkins

    2018-03-01

    Full Text Available Contemporary systems of mobility are undergoing a transition towards automation. In the UK, this transition is being led by (often new) partnerships between incumbent manufacturers and new entrants, in collaboration with national governments, local/regional councils, and research institutions. This paper first offers a framework for analyzing the governance of the transition, adapting ideas from the Transition Management (TM) perspective, and then applies the framework to ongoing automated vehicle transition dynamics in the UK. The empirical analysis suggests that the UK has adopted a reasonably comprehensive approach to the governing of automated vehicle innovation, but that this approach cannot be characterized as sufficiently inclusive, democratic, diverse and open. The lack of inclusivity, democracy, diversity and openness is symptomatic of the post-political character of how the UK's automated mobility transition is being governed. The paper ends with a call for a reconfiguration of the automated vehicle transition in the UK and beyond, so that much more space is created for dissent and for reflexive and comprehensive big-picture thinking on (automated) mobility futures.

  2. Failsafe automation of Phase II clinical trial interim monitoring for stopping rules.

    Science.gov (United States)

    Day, Roger S

    2010-02-01

    In Phase II clinical trials in cancer, preventing the treatment of patients on a study when current data demonstrate that the treatment is insufficiently active or too toxic has obvious benefits, both in protecting patients and in reducing sponsor costs. Considerable efforts have gone into experimental designs for Phase II clinical trials with flexible sample size, usually implemented by early stopping rules. The intended benefits will not ensue, however, if the design is not followed. Despite the best intentions, failures can occur for many reasons. The main goal is to develop an automated system for interim monitoring, as a backup system supplementing the protocol team, to ensure that patients are protected. A secondary goal is to stimulate timely recording of patient assessments. We developed key concepts and performance needs, then designed, implemented, and deployed a software solution embedded in the clinical trials database system. The system has been in place since October 2007. One clinical trial tripped the automated monitor, resulting in e-mails that initiated statistician/investigator review in a timely fashion. Several essential contributing activities still require human intervention, institutional policy decisions, and institutional commitment of resources. We believe that implementing the concepts presented here will provide greater assurance that interim monitoring plans are followed and that patients are protected from inadequate response or excessive toxicity. This approach may also facilitate wider acceptance and quicker implementation of new interim monitoring algorithms.
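
    A minimal Python sketch of the kind of stopping-rule check such a monitor could automate is shown below. This generic Beta-binomial toxicity rule is illustrative only; the actual system applies whatever rule the protocol specifies.

        from scipy import stats

        def stop_for_toxicity(n_toxic, n_treated, p_max=0.30, certainty=0.90,
                              prior=(1.0, 1.0)):
            """Suggest stopping when Pr(toxicity rate > p_max | data) exceeds
            `certainty`, using a Beta-binomial posterior with the given prior."""
            a, b = prior
            posterior = stats.beta(a + n_toxic, b + n_treated - n_toxic)
            return 1.0 - posterior.cdf(p_max) > certainty

        # e.g. 6 toxicities among the first 10 patients trips the monitor:
        print(stop_for_toxicity(6, 10))   # -> True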

  3. Automated measurement of pressure injury through image processing.

    Science.gov (United States)

    Li, Dan; Mathews, Carol

    2017-11-01

    To develop an image processing algorithm to automatically measure pressure injuries using electronic pressure injury images stored in nursing documentation. Photographing pressure injuries and storing the images in the electronic health record is standard practice in many hospitals. However, the manual measurement of pressure injury is time-consuming, challenging and subject to intra/inter-reader variability given the complexities of the pressure injury and the clinical environment. A cross-sectional algorithm development study. A set of 32 pressure injury images was obtained from a western Pennsylvania hospital. First, we transformed the images from an RGB (i.e. red, green and blue) colour space to a YCbCr colour space to eliminate interference from varying light conditions and skin colours. Second, a probability map, generated by a skin colour Gaussian model, guided the pressure injury segmentation process using the Support Vector Machine classifier. Third, after segmentation, the reference ruler - included in each of the images - enabled perspective transformation and determination of pressure injury size. Finally, two nurses independently measured those 32 pressure injury images, and the intraclass correlation coefficient was calculated. An image processing algorithm was developed to automatically measure the size of pressure injuries. Both inter- and intra-rater analyses achieved a good level of reliability. Validation of the size measurement of the pressure injury (1) demonstrates that our image processing algorithm is a reliable approach to monitoring pressure injury progress through clinical pressure injury images and (2) offers new insight into pressure injury evaluation and documentation. Once our algorithm is further developed, clinicians can be provided with an objective, reliable and efficient computational tool for segmentation and measurement of pressure injuries. With this, clinicians will be able to more effectively monitor the healing process of pressure injuries ...
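
    A minimal Python sketch of the first two steps (RGB-to-YCbCr conversion and a Gaussian skin-probability map) follows; the conversion uses the standard ITU-R BT.601 coefficients, while the Gaussian mean/std values are placeholders rather than the paper's fitted parameters.

        import numpy as np

        def rgb_to_ycbcr(img):
            """ITU-R BT.601 full-range RGB -> YCbCr; img as float array in [0, 255]."""
            r, g, b = img[..., 0], img[..., 1], img[..., 2]
            y  =  0.299 * r + 0.587 * g + 0.114 * b
            cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
            cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
            return np.stack([y, cb, cr], axis=-1)

        def skin_probability(img, mean=(140.0, 110.0, 150.0), std=(40.0, 10.0, 10.0)):
            """Per-pixel Gaussian skin-likeness score in YCbCr space; the mean and
            std values here are invented placeholders, not fitted parameters."""
            ycc = rgb_to_ycbcr(img.astype(float))
            z = (ycc - np.asarray(mean)) / np.asarray(std)
            return np.exp(-0.5 * np.sum(z * z, axis=-1))

        img = np.random.randint(0, 256, (4, 4, 3))   # stand-in for a wound photo
        print(skin_probability(img).round(3))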

  4. Automated terrestrial laser scanning with near-real-time change detection – monitoring of the Séchilienne landslide

    Directory of Open Access Journals (Sweden)

    R. A. Kromer

    2017-05-01

    Full Text Available We present an automated terrestrial laser scanning (ATLS) system with automatic near-real-time change detection processing. The ATLS system was tested on the Séchilienne landslide in France for a 6-week period with data collected at 30 min intervals. The purpose of developing the system was to fill the gap of high-temporal-resolution TLS monitoring studies of earth surface processes and to offer a cost-effective, light, portable alternative to ground-based interferometric synthetic aperture radar (GB-InSAR) deformation monitoring. During the study, we detected the flux of talus, displacement of the landslide and pre-failure deformation of discrete rockfall events. Additionally, we found the ATLS system to be an effective tool in monitoring landslide and rockfall processes despite missing points due to poor atmospheric conditions or rainfall. Furthermore, such a system has the potential to help us better understand a wide variety of slope processes at high levels of temporal detail.
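
    Below is a minimal Python sketch of nearest-neighbour cloud-to-cloud change detection between two scan epochs, a simplified stand-in for the system's near-real-time processing; the point clouds and the distance threshold are synthetic.

        import numpy as np
        from scipy.spatial import cKDTree

        def c2c_change(reference, new_scan, threshold=0.05):
            """Cloud-to-cloud change detection: distance from each new point to
            its nearest reference neighbour; points beyond `threshold` (m) are
            flagged as candidate change."""
            dist, _ = cKDTree(reference).query(new_scan, k=1)
            return new_scan[dist > threshold], dist

        rng = np.random.default_rng(0)
        ref = rng.uniform(0, 10, (5000, 3))             # reference epoch (x, y, z)
        new = ref + rng.normal(0, 0.005, ref.shape)     # mostly unchanged surface
        new[:50, 2] -= 0.5                              # a displaced block of points
        changed, dist = c2c_change(ref, new)
        print(f"{len(changed)} points flagged as change")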

  5. Quality monitoring of the extinguisher maintenance process

    Directory of Open Access Journals (Sweden)

    Raphael Henrique Teixeira da Silva

    2016-09-01

    Full Text Available Statistical Process Control (SPC) comprises a collection of statistical tools used to monitor processes and serves as a great aid in reducing variability. This study therefore presents the application of some of the tools proposed by SPC at an extinguisher company. More specifically, the most frequent defects and their quantity in the fire extinguisher maintenance process were evaluated. Assessment of the defects found identified that overloading of the 1 kg dry chemical (class BC powder) extinguishers generated losses for the company. Furthermore, study of the process capability ratios showed that the process was not able to achieve the limits specified by the owner. It should be noted that the free software R was used to develop the study, without generating additional costs for the company. This article presents the software routines used to create the charts, the problems encountered during the evaluation process of the company, and the correction of inadequate processes.
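
    As a minimal Python sketch of one such SPC tool, the following builds a c-chart with Shewhart 3-sigma limits for defect counts per batch; the counts are invented, not the company's data.

        import numpy as np

        # c-chart for the number of defects found per batch of serviced extinguishers
        defects = np.array([4, 6, 3, 5, 7, 4, 2, 5, 6, 4, 3, 12, 5, 4, 6])
        c_bar = defects.mean()
        ucl = c_bar + 3 * np.sqrt(c_bar)   # 3-sigma limits for Poisson counts
        lcl = max(c_bar - 3 * np.sqrt(c_bar), 0.0)

        for i, c in enumerate(defects, start=1):
            if not lcl <= c <= ucl:
                print(f"batch {i}: {c} defects -> out of control "
                      f"(limits {lcl:.1f}-{ucl:.1f})")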

  6. Automated integration of continuous glucose monitor data in the electronic health record using consumer technology.

    Science.gov (United States)

    Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A

    2016-05-01

    The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient's/parent's smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  8. Increasing the Efficiency of Automation of Production Processes by Reporting the Parameters of the Parts’ Flow

    Directory of Open Access Journals (Sweden)

    Pancho Tomov

    2017-08-01

    Full Text Available This paper presents an analysis of, and a proposal for, increasing the efficiency of automation of production processes by reporting the parameters of the parts’ flow. The main focus is the correlation and dependence between the input and output parameters of the automated production process. On that basis, contemporary requirements for the development of the production process call for it to be considered as a whole, regardless of the stage in which the automation is performed.

  9. Vision-Based Geo-Monitoring - A New Approach for an Automated System

    Science.gov (United States)

    Wagner, A.; Reiterer, A.; Wasmeier, P.; Rieke-Zapp, D.; Wunderlich, T.

    2012-04-01

    The necessity of monitoring geo-risk areas such as rock slides is growing due to the increasing probability of such events caused by environmental change. Geodetic deformation monitoring turns living with this threat into a calculable risk. An in-depth monitoring concept with modern measurement technologies allows the estimation of the hazard potential and the prediction of life-threatening situations. The movements can be monitored by sensors placed in the unstable slope area. In most cases, it is necessary to enter the regions at risk in order to place the sensors and maintain them. Using long-range monitoring systems (e.g. terrestrial laser scanners, total stations, ground-based synthetic aperture radar) allows this risk to be avoided. To close the gap between the existing low-resolution, medium-accuracy sensors and conventional (co-operative target-based) surveying methods, image-assisted total stations (IATS) are a promising solution. IATS offer the user (e.g. metrology expert) an image capturing system (CCD/CMOS camera) in addition to 3D point measurements. The images of the telescope's visual field are projected onto the camera's chip. With appropriate calibration, these images are accurately geo-referenced and oriented, since the horizontal and vertical angles of rotation are continuously recorded. The oriented images can directly be used for direction measurements with no need for object control points or further photogrammetric orientation processes. IATS are able to provide high-density deformation fields with high accuracy (down to the mm range), in all three coordinate directions. Tests have shown that with suitable image processing a measurement precision of 0.05 pixel ± 0.04·σ is possible (which corresponds to 0.03 mgon ± 0.04·σ). These results have to be seen under the consideration that such measurements are image-based only. For measuring in 3D object space, the precision of pointing has to be taken into account. IATS can be used in two different ways ...
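
    A small Python helper below shows how a sub-pixel image displacement converts to an angle for an image-assisted total station; the pixel size and effective focal length are invented values, chosen so that 0.05 px lands near the quoted 0.03 mgon.

        import math

        def pixel_to_angle_mgon(pixels, pixel_size_um=6.0, focal_mm=640.0):
            """Convert an image displacement in pixels to an angle in milligon.
            Pixel size and telescope focal length are illustrative values only."""
            angle_rad = math.atan(pixels * pixel_size_um * 1e-6 / (focal_mm * 1e-3))
            # rad -> gon is 200/pi; gon -> mgon is x1000
            return angle_rad * 200_000.0 / math.pi

        print(f"{pixel_to_angle_mgon(0.05):.4f} mgon")   # ~0.03 mgon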

  10. Verifiable process monitoring through enhanced data authentication

    International Nuclear Information System (INIS)

    Goncalves, Joao G.M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-01-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
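
    A minimal Python sketch of the capture-authenticate-timestamp idea follows; symmetric HMAC stands in here for EDAS's public-key authentication and encryption, and the sensor name and shared key are placeholders.

        import hashlib
        import hmac
        import json
        import time

        SECRET = b"inspectorate-shared-key"   # placeholder; EDAS uses public keys

        def authenticate(record: dict) -> dict:
            """Wrap a captured process record with a timestamp and a MAC so the
            inspectorate can verify integrity and origin of the branched data."""
            record = dict(record, captured_at=time.time())
            payload = json.dumps(record, sort_keys=True).encode()
            tag = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
            return {"payload": payload.decode(), "mac": tag}

        def verify(msg: dict) -> bool:
            tag = hmac.new(SECRET, msg["payload"].encode(), hashlib.sha256).hexdigest()
            return hmac.compare_digest(tag, msg["mac"])

        msg = authenticate({"sensor": "flow-meter-3", "value": 12.7})  # hypothetical
        print(verify(msg))   # True; any tampering with payload makes this False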

  11. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); Rahmat, K. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); University Malaya, Biomedical Imaging Department, Kuala Lumpur (Malaysia); Ariffin, H. [University of Malaya, Department of Paediatrics, Faculty of Medicine, Kuala Lumpur (Malaysia)

    2012-07-15

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)
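
    For reference, the two reported scalars follow directly from the eigenvalues of the diffusion tensor; here is a minimal Python sketch using the standard definitions (the eigenvalues below are illustrative, not from the study).

        import numpy as np

        def fa_md(eigvals):
            """Fractional anisotropy and mean diffusivity from the three
            eigenvalues of the diffusion tensor (standard definitions)."""
            lam = np.asarray(eigvals, dtype=float)
            md = lam.mean()
            fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
            return fa, md

        # Units: 10^-3 mm^2/s, illustrative voxel values
        print(fa_md([1.6, 0.4, 0.3]))   # high FA: coherent, myelinated fibres
        print(fa_md([0.9, 0.8, 0.8]))   # low FA: weakly anisotropic tissue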

  13. Automated Processing of ISIS Topside Ionograms into Electron Density Profiles

    Science.gov (United States)

    Reinisch, bodo W.; Huang, Xueqin; Bilitza, Dieter; Hills, H. Kent

    2004-01-01

    Modeling of the topside ionosphere has for the most part relied on just a few years of data from topside sounder satellites. The widely used Bent et al. (1972) model, for example, is based on only 50,000 Alouette 1 profiles. The International Reference Ionosphere (IRI) (Bilitza, 1990, 2001) uses an analytical description of the graphs and tables provided by Bent et al. (1972). The Alouette 1, 2 and ISIS 1, 2 topside sounder satellites of the sixties and seventies were ahead of their times in terms of the sheer volume of data obtained and in terms of the computer and software requirements for data analysis. As a result, only a small percentage of the collected topside ionograms was converted into electron density profiles. Recently, a NASA-funded data restoration project has undertaken and is continuing the process of digitizing the Alouette/ISIS ionograms from the analog 7-track tapes. Our project involves the automated processing of these digital ionograms into electron density profiles. The project accomplished a set of important goals that will have a major impact on understanding and modeling of the topside ionosphere: (1) The TOPside Ionogram Scaling and True height inversion (TOPIST) software was developed for the automated scaling and inversion of topside ionograms. (2) The TOPIST software was applied to the over 300,000 ISIS-2 topside ionograms that had been digitized in the framework of a separate AISRP project (PI: R.F. Benson). (3) The new TOPIST-produced database of global electron density profiles for the topside ionosphere was made publicly available through NASA's National Space Science Data Center (NSSDC) ftp archive. (4) Earlier Alouette 1, 2 and ISIS 1, 2 data sets of electron density profiles from manual scaling of selected sets of ionograms were converted from a highly-compressed binary format into a user-friendly ASCII format and made publicly available through nssdcftp.gsfc.nasa.gov. The new database for the topside ionosphere established ...

  14. Safety monitoring in process and control

    International Nuclear Information System (INIS)

    Esparza, V. Jr.; Sebo, D.E.

    1984-01-01

    Safety Functions provide a method of ensuring the safe operation of any large-scale processing plant. Successful implementation of safety functions requires continuous monitoring of safety function values and trends. Because the volume of information handled by a plant operator occasionally can become overwhelming, attention may be diverted from the primary concern of maintaining plant safety. With this in mind, EG&G Idaho developed various methods and techniques for use in a computerized Safety Function Monitoring System and tested the application of these techniques using a simulated nuclear power plant, the Loss-of-Fluid Test Facility (LOFT) at the Idaho National Engineering Laboratory (INEL). This paper presents the methods used in the development of a Safety Function Monitoring System.

  15. Automated processing of thermal infrared images of Osservatorio Vesuviano permanent surveillance network by using Matlab code

    Science.gov (United States)

    Sansivero, Fabio; Vilardo, Giuseppe; Caputo, Teresa

    2017-04-01

    The permanent thermal infrared surveillance network of Osservatorio Vesuviano (INGV) is composed of 6 stations which acquire IR frames of fumarole fields in the Campi Flegrei caldera and inside the Vesuvius crater (Italy). The IR frames are uploaded to a dedicated server in the Surveillance Center of Osservatorio Vesuviano in order to process the infrared data and to extract all the information they contain. In a first phase, the infrared data are processed by an automated system (A.S.I.R.A. Acq - Automated System of IR Analysis and Acquisition) developed in the Matlab environment and with a user-friendly graphical user interface (GUI). ASIRA daily generates time-series of residual values of the maximum temperatures observed in the IR scenes after the removal of seasonal effects. These time-series are displayed in the Surveillance Room of Osservatorio Vesuviano and provide information about the evolution of the shallow temperature field of the observed areas. In particular, the features of ASIRA Acq include: a) efficient quality selection of IR scenes, b) IR image co-registration with respect to a reference frame, c) seasonal correction by using a background-removal methodology, d) filing of IR matrices and of the processed data in shared archives accessible to interrogation. The daily archived records can also be processed by ASIRA Plot (Matlab code with GUI) to visualize IR data time-series and to help in evaluating input parameters for further data processing and analysis. Additional processing features are accomplished in a second phase by ASIRA Tools, a Matlab code with GUI developed to extract further information from the dataset in an automated way. The main functions of ASIRA Tools are: a) the analysis of temperature variations of each pixel of the IR frame in a given time interval, b) the removal of seasonal effects from the temperature of every pixel in the IR frames by using an analytic approach (removal of the sinusoidal long-term seasonal component by using a ...
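
    A minimal Python sketch of the analytic seasonal correction follows: fitting and removing an annual sinusoid by linear least squares, with synthetic temperatures and an artificial thermal anomaly (the document's system is Matlab-based; Python is used here only for illustration).

        import numpy as np

        def remove_seasonal(t_days, temps, period=365.25):
            """Fit offset + annual sinusoid by linear least squares and
            return the residual temperatures after subtracting the fit."""
            w = 2.0 * np.pi / period
            basis = np.column_stack([np.ones_like(t_days),
                                     np.sin(w * t_days), np.cos(w * t_days)])
            coef, *_ = np.linalg.lstsq(basis, temps, rcond=None)
            return temps - basis @ coef

        rng = np.random.default_rng(0)
        t = np.arange(0, 730, 1.0)                   # two years of daily maxima
        temps = 60 + 8 * np.sin(2 * np.pi * t / 365.25) + rng.normal(0, 1, t.size)
        temps[500:] += 4.0                           # hypothetical thermal anomaly
        residual = remove_seasonal(t, temps)
        print(residual[490:510].round(1))            # the +4 degC step stands out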

  16. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  17. Pilot study of enhanced social support with automated telephone monitoring after psychiatric hospitalization for depression.

    Science.gov (United States)

    Pfeiffer, Paul N; Valenstein, Marcia; Ganoczy, Dara; Henry, Jennifer; Dobscha, Steven K; Piette, John D

    2017-02-01

    Following discharge, patients hospitalized for depression are at high risk for poor retention in outpatient care and adverse outcomes. This study pilot tests a post-hospital monitoring and enhanced support program for depression. 48 patients at a Veterans Affairs Medical Center discharged following a depression-related inpatient stay received weekly visits or phone calls for 6 months from their choice of either a family member/friend (n = 19) or a certified peer support specialist (n = 29). Participants also completed weekly automated telephone monitoring calls assessing depressive symptoms and antidepressant medication adherence. Over 90% of participants were more satisfied with their care due to the service. The mean change from baseline to 6 months in depression symptoms was -7.9 (p < 0.001) on the Beck Depression Inventory-II for those supported by a family member/friend, whereas those supported by a peer specialist had a mean change of -3.5 (p = 0.10). Increased contact with a chosen support person coupled with automated telephone monitoring after psychiatric hospitalization is an acceptable service for patients with depression. Those who received the service, and particularly those supported by a family member/friend, experienced reductions in symptoms of depression.

  18. Marketing automation processes as a way to improve contemporary marketing of a company

    OpenAIRE

    Witold Świeczak

    2013-01-01

    The main aim of this article is to identify the possibilities offered to contemporary companies by the processes included in a marketing automation system. This publication deals with the key aspects of this issue. It shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation. This article defines the factors and processes which influenc...

  19. Automated Intelligent Monitoring and the Controlling Software System for Solar Panels

    Science.gov (United States)

    Nalamwar, H. S.; Ivanov, M. A.; Baidali, S. A.

    2017-01-01

    The inspection of solar panels on a periodic basis is important to improve longevity and ensure performance of the solar system. Getting the most solar potential out of the photovoltaic (PV) system is possible through an intelligent monitoring and controlling system. The monitoring and controlling system has rapidly increased in popularity because of its user-friendly graphical interface for data acquisition, monitoring, controlling and measurement. In order to monitor the performance of the system, especially for renewable energy source applications such as solar photovoltaics (PV), data-acquisition systems have been used to collect all the data regarding the installed system. In this paper the development of a smart automated monitoring and controlling system for solar panels is described; the core idea is based on the IoT (Internet of Things). The measurements of data are made using sensors, block management data acquisition modules, and a software system. All the real-time data on the electrical output parameters of the PV plant, such as voltage, current and generated electricity, are then displayed and stored in the block management. The proposed system is smart enough to make suggestions if the panel is not working properly, to display errors, to remind about maintenance of the system through email or SMS, and to rotate panels according to the sun's position using the ephemeris table stored in the system. The advantage of the system is that the performance of the solar panel system can be monitored and analyzed.

  20. Ideal versus real automated twin column recycling chromatography process.

    Science.gov (United States)

    Gritti, Fabrice; Leal, Mike; McDonald, Thomas; Gilar, Martin

    2017-07-28

    The full baseline separation of two compounds with selectivity factors close to unity cannot be achieved on a single column of any length given the pressure limitations of current LC instruments. The maximum efficiency is that of an infinitely long column operated at infinitely small flow rates. It is determined by the maximum allowable system pressure, the column permeability (particle size), the viscosity of the eluent, and the intensity of the effective diffusivity of the analytes along the column. Alternatively, the twin-column recycling separation process (TCRSP) can overcome the efficiency limit of the single-column approach. In the TCRSP, the sample mixture may be transferred from one to a second (twin) column until its band has spread over one column length. Basic theory of chromatography is used to confirm that the speed-resolution performance of the TCRSP is intrinsically superior to that of the single-column process. This advantage is illustrated in this work by developing an automated TCRSP for the challenging separation of two polycyclic aromatic hydrocarbon (PAH) isomers (benzo[a]anthracene and chrysene) in the reversed-phase retention mode at pressures below 5000 psi. The columns used are the 3.0 mm × 150 mm column packed with 3.5 μm XBridge BEH-C18 material (α = 1.010) and the 3.0 mm or 4.6 mm × 150 mm columns packed with the same 3.5 μm XSelect HSS T3 material (α = 1.025). The isocratic mobile phase is an acetonitrile-water mixture (80/20, v/v). Remarkably, significant differences are observed between the predicted retention times and efficiencies of the ideal TCRSP (given by the number of cycles multiplied by the retention time and efficiency of one column) and those of the real TCRSP. The fundamental explanation lies in the pressure-dependent retention of these PAHs, or in the change of their partial molar volume as they are transferred from the mobile to the stationary phase. A revisited retention and efficiency model is then built to predict the actual performance of real TCRSPs ...
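
    A minimal Python sketch of the ideal-TCRSP scaling is given below: plate counts add with each pass, so resolution grows like the square root of the number of cycles (Purnell equation; the column parameters are invented, and real TCRSPs deviate from this ideal as the abstract explains).

        import math

        def resolution(n_cycles, N=15000, alpha=1.010, k=2.0):
            """Ideal TCRSP resolution after n passes: total plate count n*N,
            inserted into the Purnell equation for two closely eluting peaks."""
            return ((math.sqrt(n_cycles * N) / 4.0)
                    * ((alpha - 1.0) / alpha) * (k / (1.0 + k)))

        for n in (1, 10, 20, 40, 60):
            print(f"{n:>3} cycles: Rs = {resolution(n):.2f}")
        # With these illustrative parameters, baseline resolution (Rs ~ 1.5)
        # for alpha = 1.010 is only reached after several tens of cycles.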

  1. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Devol, Timothy A.

    2005-01-01

    Comparison of different pulse shape discrimination methods was performed under two different experimental conditions and the best method was identified. Beta/gamma discrimination of 90Sr/90Y and 137Cs was performed using a phoswich detector made of BC400 (2.5 cm OD x 1.2 cm) and BGO (2.5 cm O.D. x 2.5 cm ) scintillators. Alpha/gamma discrimination of 210Po and 137Cs was performed using a CsI:Tl (2.8 x 1.4 x 1.4 cm3) scintillation crystal. The pulse waveforms were digitized with a DGF-4c (X-Ray Instrumentation Associates) and analyzed offline with IGOR Pro software (Wavemetrics, Inc.). The four pulse shape discrimination methods that were compared include: rise time discrimination, digital constant fraction discrimination, charge ratio, and constant time discrimination (CTD) methods. The CTD method is the ratio of the pulse height at a particular time after the beginning of the pulse to the time at the maximum pulse height. The charge comparison method resulted in a Figure of Merit (FoM) of 3.3 (9.9 % spillover) and 3.7 (0.033 % spillover) for the phoswich and the CsI:Tl scintillator setups, respectively. The CTD method resulted in a FoM of 3.9 (9.2 % spillover) and 3.2 (0.25 % spillover), respectively. Inverting the pulse shape data typically resulted in a significantly higher FoM than conventional methods, but there was no reduction in % spillover values. This outcome illustrates that the FoM may not be a good scheme for the quantification of a system to perform pulse shape discrimination. Comparison of several pulse shape discrimination (PSD) methods was performed as a means to compare traditional analog and digital PSD methods on the same scintillation pulses. The X-ray Instrumentation Associates DGF-4C (40 Msps, 14-bit) was used to digitize waveforms from a CsI:Tl crystal and BC400/BGO phoswich detector
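
    A minimal Python sketch of the figure-of-merit computation used to compare PSD methods follows, applied to synthetic charge-ratio distributions; the means and widths are invented, and a Gaussian-equivalent FWHM is assumed.

        import numpy as np

        def figure_of_merit(psd_a, psd_b):
            """FoM = peak separation / (FWHM_a + FWHM_b) for two PSD-parameter
            distributions (e.g. charge-ratio values for beta vs alpha events)."""
            fwhm = lambda x: 2.355 * np.std(x)   # Gaussian-equivalent FWHM
            return abs(np.mean(psd_a) - np.mean(psd_b)) / (fwhm(psd_a) + fwhm(psd_b))

        rng = np.random.default_rng(0)
        beta_like = rng.normal(0.30, 0.02, 10000)    # synthetic charge ratios
        alpha_like = rng.normal(0.55, 0.02, 10000)
        print(f"FoM = {figure_of_merit(beta_like, alpha_like):.2f}")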

  2. A controlled trial of electronic automated advisory vital signs monitoring in general hospital wards.

    Science.gov (United States)

    Bellomo, Rinaldo; Ackerman, Michael; Bailey, Michael; Beale, Richard; Clancy, Greg; Danesh, Valerie; Hvarfner, Andreas; Jimenez, Edgar; Konrad, David; Lecardo, Michele; Pattee, Kimberly S; Ritchie, Josephine; Sherman, Kathie; Tangkau, Peter

    2012-08-01

    Deteriorating ward patients are at increased risk. Electronic automated advisory vital signs monitors may help identify such patients and improve their outcomes. A total of 349 beds, in 12 general wards in ten hospitals in the United States, Europe, and Australia. Cohort of 18,305 patients. Before-and-after controlled trial. We deployed electronic automated advisory vital signs monitors to assist in the acquisition of vital signs and calculation of early warning scores. We assessed their effect on frequency, type, and treatment of rapid response team calls; survival to hospital discharge or to 90 days for rapid response team call patients; overall type and number of serious adverse events and length of hospital stay. We studied 9,617 patients before (control) and 8,688 after (intervention) deployment of electronic automated advisory vital signs monitors. Among rapid response team call patients, intervention was associated with an increased proportion of calls secondary to abnormal respiratory vital signs (from 21% to 31%; difference [95% confidence interval] 9.9 [0.1-18.5]; p=.029). Survival immediately after rapid response team treatment and survival to hospital discharge or 90 days increased from 86% to 92% (difference [95% confidence interval] 6.3 [0.0-12.6]; p=.04). Intervention was also associated with a decrease in median length of hospital stay in all patients (unadjusted p<.0001; adjusted p=.09) and more so in U.S. patients (from 3.4 to 3.0 days; unadjusted p<.0001; adjusted ratio [95% confidence interval] 1.03 [1.00-1.06]; p=.026). The time required to complete and record a set of vital signs decreased from 4.1±1.3 mins to 2.5±0.5 mins (difference [95% confidence interval] 1.6 [1.4-1.8]; p<.0001). Deployment of electronic automated advisory vital signs monitors was associated with an improvement in the proportion of rapid response team-calls triggered by respiratory criteria, increased survival of patients receiving rapid response team calls, and
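
    A minimal Python sketch of a difference-in-proportions estimate with a Wald 95% confidence interval, the kind of contrast quoted above; the counts are invented, and the paper's adjusted analysis is more involved.

        import math

        def diff_ci(x1, n1, x2, n2, z=1.96):
            """Wald 95% CI for the difference of two independent proportions."""
            p1, p2 = x1 / n1, x2 / n2
            se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
            d = p2 - p1
            return d, (d - z * se, d + z * se)

        # e.g. survival rising from 86% to 92% in two hypothetical cohorts:
        d, (lo, hi) = diff_ci(430, 500, 460, 500)
        print(f"difference = {d:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")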

  3. Automated processing of data on the use of motor vehicles in the Serbian Armed Forces

    Directory of Open Access Journals (Sweden)

    Nikola S. Osmokrović

    2012-10-01

    Full Text Available The main aim of introducing information technology into the armed forces is the automation of the management process. The management of movement and transport (M&T) in our armed forces has been included in the process of automation from the beginning. For that reason, today we can speak about the automated processing of data on road traffic safety and on the use of motor vehicles. With regard to the overall development of the information system of the movement and transport service, the paper presents an information system of the M&T service for the processing of data on the use of motor vehicles. The main features, components and functions of the 'Vozila' application, which was specially developed for the automated processing of data on motor vehicle use, are explained in particular.

  4. Prajna: adding automated reasoning to the visual-analysis process.

    Science.gov (United States)

    Swing, E

    2010-01-01

    Developers who create applications for knowledge representation must contend with challenges in both the abundance of data and the variety of toolkits, architectures, and standards for representing it. Prajna is a flexible Java toolkit designed to overcome these challenges with an extensible architecture that supports both visualization and automated reasoning.

  5. Automated Plasma Spray (APS) process feasibility study: Plasma spray process development and evaluation

    Science.gov (United States)

    Fetheroff, C. W.; Derkacs, T.; Matay, I. M.

    1979-01-01

    An automated plasma spray (APS) process was developed to apply two-layer (NiCrAlY and ZrO2-12Y2O3) thermal-barrier coatings to aircraft gas turbine engine blade airfoils. The APS process hardware consists of four subsystems: a mechanical blade positioner incorporating two interlaced six-degree-of-freedom assemblies; a noncoherent optical metrology subsystem; a microprocessor-based adaptive system controller; and commercial plasma spray equipment. Over fifty JT9D first-stage turbine blade specimens were coated with the APS process in preliminary checkout and evaluation studies. The best of the preliminary specimens achieved an overall coating thickness uniformity of ±53 micrometers, much better than is achievable manually. Factors limiting this performance were identified and process modifications were initiated accordingly. Comparative evaluations of coating thickness uniformity for manually sprayed and APS-coated specimens were initiated. One of the preliminary evaluation specimens was subjected to a torch test and metallographic evaluation.

  6. Automated external cardioversion defibrillation monitoring in cardiac arrest: a randomized trial.

    Science.gov (United States)

    Ali, Bakhtiar; Bloom, Heather; Veledar, Emir; House, Dorothy; Norvel, Robert; Dudley, Samuel C; Zafari, A Maziar

    2008-06-11

    In-hospital cardiac arrest has a poor prognosis despite active electrocardiography monitoring. The initial rhythm of approximately 25% of in-hospital cardiopulmonary resuscitation (CPR) events is pulseless ventricular tachycardia/ventricular fibrillation (VT/VF). Early defibrillation is an independent predictor of survival in CPR events caused by VT/VF. The automated external cardioverter defibrillator (AECD) is a device attached by pads to the chest wall that monitors, detects, and, within seconds, automatically delivers electric countershock to an appropriate tachyarrhythmia. To evaluate safety of AECD monitoring in hospitalized patients. To evaluate whether AECDs provide earlier defibrillation than hospital code teams. The study is a prospective trial randomizing patients admitted to the telemetry ward to standard CPR (code team) or standard CPR plus AECD monitoring (PowerHeart CRM). The AECD is programmed to deliver one 150 J biphasic shock to patients in sustained VT/VF. Data are collected using the Utstein criteria for cardiac arrest. The primary endpoint is time-to-defibrillation; secondary outcomes include neurological status and survival to discharge, with 3-year follow-up. To date, 192 patients have been recruited between 10/10/2006 and 7/20/2007. A total of 3,655 hours of telemetry data have been analyzed in the AECD arm. The AECD has monitored ambulatory telemetry patients in sinus rhythm, sinus tachycardia, supraventricular tachycardia, atrial flutter or fibrillation, with premature ventricular complexes and non-sustained VT, without delivery of inappropriate shocks. One patient experienced sustained VT during AECD monitoring and was successfully defibrillated (17 seconds after meeting programmed criteria). There are no events to report in the control arm. The patient survived the event without neurological complications. During the same time period, mean time to shock for VT/VF cardiac arrest occurring outside the telemetry ward was

  7. Reference Tools for Data Processing, Office Automation, and Data Communications: An Introductory Guide.

    Science.gov (United States)

    Cupoli, Patricia Dymkar

    1981-01-01

    Provides an introduction to various reference sources which are useful in dealing with the areas of data processing, office automation, and communications technologies. A bibliography with vendor listings is included. (FM)

  8. Results and discussion of laboratory experiences with different automated TLD readers for personnel monitoring

    International Nuclear Information System (INIS)

    Regulla, D.F.; Drexeler, G.

    Although film seems set to continue serving as the main personnel dosemeter in Germany for the foreseeable future, the evolution of solid-state techniques in particular and their properties are being thoroughly considered with respect to a possible generalized application in personnel monitoring. For this reason, different commercially available automated TLD systems have been investigated in the laboratory in order to determine their usefulness for a large-scale or decentralized service. Along with studying the dosimetric and instrumental parameters, the question of which monitoring philosophy these TLD systems fit has been discussed. Both the experimental experience gained and the results of basic discussions that in turn influence the necessary design of personnel TL dosemeters are reported.

  9. Advanced process monitoring and feedback control to enhance cell culture process production and robustness.

    Science.gov (United States)

    Zhang, An; Tsang, Valerie Liu; Moore, Brandon; Shen, Vivian; Huang, Yao-Ming; Kshirsagar, Rashmi; Ryll, Thomas

    2015-12-01

    It is a common practice in biotherapeutic manufacturing to define a fixed-volume feed strategy for nutrient feeds, based on historical cell demand. However, once the feed volumes are defined, they are inflexible to batch-to-batch variations in cell growth and physiology and can lead to inconsistent productivity and product quality. In an effort to control critical quality attributes and to apply process analytical technology (PAT), a fully automated cell culture feedback control system has been explored in three different applications. The first study illustrates that frequent monitoring and automatically controlling the complex feed based on a surrogate (glutamate) level improved protein production. More importantly, the resulting feed strategy was translated into a manufacturing-friendly manual feed strategy without impact on product quality. The second study demonstrates the improved process robustness of an automated feed strategy based on online bio-capacitance measurements for cell growth. In the third study, glucose and lactate concentrations were measured online and were used to automatically control the glucose feed, which in turn changed lactate metabolism. These studies suggest that the auto-feedback control system has the potential to significantly increase productivity and improve robustness in manufacturing, with the goal of ensuring process performance and product quality consistency. © 2015 Wiley Periodicals, Inc.
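
    A minimal sketch of the feedback idea in the third study: an online glucose reading drives the feed volume toward a residual-glucose setpoint. All names and numbers below are illustrative assumptions, not values from the paper.

```python
# Sketch of automatic glucose feed control: feed enough stock solution to
# bring the measured residual glucose back up to a setpoint. Illustrative
# constants only.

GLUCOSE_SETPOINT = 4.0   # g/L target residual glucose
FEED_STOCK_CONC = 500.0  # g/L glucose concentration of the feed stock

def glucose_feed_volume(measured_glucose, culture_volume):
    """Volume of feed stock (L) needed to restore glucose to the setpoint."""
    deficit = GLUCOSE_SETPOINT - measured_glucose   # g/L below target
    if deficit <= 0:
        return 0.0                                  # at or above target: no feed
    return deficit * culture_volume / FEED_STOCK_CONC

# e.g. a 2,000 L culture measured at 2.5 g/L glucose:
print(glucose_feed_volume(2.5, 2000.0))  # -> 6.0 L of 500 g/L stock
```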

  10. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    Science.gov (United States)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beamforming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time series. We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for
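
    The dimensionality idea can be illustrated with a toy example: the effective dimension of the array-wide data, here measured by the participation ratio of the covariance eigenvalues, shifts sharply when a channel malfunctions. This is a synthetic sketch, not the authors' implementation.

```python
# Effective dimension of multichannel array data via the participation
# ratio of the covariance eigenvalues: near n_channels for healthy,
# statistically similar channels, collapsing when one channel misbehaves.
import numpy as np

def effective_dimension(data):
    """data: (n_samples, n_channels) array of traces."""
    cov = np.cov(data, rowvar=False)
    eig = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
    # Participation ratio: (sum w)^2 / sum w^2, between 1 and n_channels.
    return eig.sum() ** 2 / (eig ** 2).sum()

rng = np.random.default_rng(0)
good = rng.standard_normal((5000, 9))   # 9 healthy channels
print(effective_dimension(good))        # close to 9

bad = good.copy()
bad[:, 3] *= 50.0                       # one channel with a gain/spiking fault
print(effective_dimension(bad))         # collapses toward 1
```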

  11. A modular, prospective, semi-automated drug safety monitoring system for use in a distributed data environment.

    Science.gov (United States)

    Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian

    2014-06-01

    The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years
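
    The sequential alerting step lends itself to a compact illustration. The sketch below is a generic Poisson log-likelihood-ratio rule applied at repeated looks, a hedged stand-in for, not a reproduction of, the alerting algorithm the system actually uses; all counts and the threshold are invented.

```python
# Generic sequential alerting sketch: after each monitoring period, compare
# cumulative observed events in the exposed cohort against the cumulative
# expected count from the comparator, and alert when a log-likelihood-ratio
# statistic crosses a threshold.
import math

def llr_poisson(observed, expected):
    """One-sided log-likelihood ratio for observed vs expected counts."""
    if observed <= expected:
        return 0.0
    return observed * math.log(observed / expected) - (observed - expected)

THRESHOLD = 3.0   # chosen to control the type I error over repeated looks
cum_obs, cum_exp = 0, 0.0
for obs, exp in [(2, 1.1), (6, 2.3), (9, 3.6)]:   # one (observed, expected) per look
    cum_obs += obs
    cum_exp += exp
    if llr_poisson(cum_obs, cum_exp) > THRESHOLD:
        print(f"alert at {cum_obs} observed vs {cum_exp:.1f} expected")
        break
```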

  12. Testing the effectiveness of automated acoustic sensors for monitoring vocal activity of Marbled Murrelets Brachyramphus marmoratus

    Science.gov (United States)

    Cragg, Jenna L.; Burger, Alan E.; Piatt, John F.

    2015-01-01

    Cryptic nest sites and secretive breeding behavior make population estimates and monitoring of Marbled Murrelets Brachyramphus marmoratus difficult and expensive. Standard audio-visual and radar protocols have been refined but require intensive field time by trained personnel. We examined the detection range of automated sound recorders (Song Meters; Wildlife Acoustics Inc.) and the reliability of automated recognition models (“recognizers”) for identifying and quantifying Marbled Murrelet vocalizations during the 2011 and 2012 breeding seasons at Kodiak Island, Alaska. The detection range of murrelet calls by Song Meters was estimated to be 60 m. Recognizers detected 20,632 murrelet calls (keer and keheer) from a sample of 268 h of recordings, yielding 5,870 call series, which compared favorably with human scanning of spectrograms (on average detecting 95% of the number of call series identified by a human observer, but not necessarily the same call series). The false-negative rate (percentage of murrelet call series that the recognizers failed to detect) was 32%, mainly involving weak calls and short call series. False-positives (other sounds included by recognizers as murrelet calls) were primarily due to complex songs of other bird species, wind and rain. False-positive rates were lowest in forest nesting habitat (48%) and highest in shrubby vegetation where calls of other birds were common (97%–99%). Acoustic recorders tracked spatial and seasonal trends in vocal activity, with higher call detections in high-quality forested habitat and during late July/early August. Automated acoustic monitoring of Marbled Murrelet calls could provide cost-effective, valuable information for assessing habitat use and temporal and spatial trends in nesting activity; reliability is dependent on careful placement of sensors to minimize false-positives and on prudent application of digital recognizers with visual checking of spectrograms.

  13. Automated electronic monitoring of circuit pressures during continuous renal replacement therapy: a technical report.

    Science.gov (United States)

    Zhang, Ling; Baldwin, Ian; Zhu, Guijun; Tanaka, Aiko; Bellomo, Rinaldo

    2015-03-01

    Automated electronic monitoring and analysis of circuit pressures during continuous renal replacement therapy (CRRT) has the potential to predict failure and allow intervention to optimise function. Current CRRT machines can measure and store pressure readings for downloading into databases and for analysis. We developed a procedure to obtain such data at intervals of 1 minute and analyse them using the Prismaflex CRRT machine, and we present an example of such analysis. We obtained data on pressures recorded at intervals of 1 minute in a patient with acute kidney injury and sepsis treated with continuous haemofiltration at 2 L/hour of ultrafiltration and a blood flow of 200 mL/minute. Data analysis identified progressive increases in transmembrane pressure (TMP) and prefilter pressure (PFP) from time 0 until circuit clotting at 33 hours. TMP increased from 104 mmHg to 313 mmHg and PFP increased from 131 mmHg to 185 mmHg. Effluent pressure showed a progressive increase in the negative pressure applied to achieve ultrafiltration, from 0 mmHg to -168 mmHg. The inflection point for such changes was also identified. Blood pathway pressures for access and return remained unchanged throughout. Automated electronic monitoring of circuit pressure during CRRT is possible and provides useful information on the evolution of circuit clotting.
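
    One way to locate the inflection point in such a 1-minute pressure log is to estimate the local slope in a sliding window and flag the first window where the rise rate exceeds a limit. The sketch below uses synthetic values shaped like the reported TMP trend; the window length and slope limit are assumptions, not the authors' method.

```python
# Locate the onset of an accelerating transmembrane pressure (TMP) rise
# by sliding-window linear fits over a 1-minute pressure log.
import numpy as np

def first_inflection(tmp, window=60, slope_limit=0.3):
    """Index (minutes) where TMP first rises faster than slope_limit mmHg/min."""
    t = np.arange(window)
    for start in range(len(tmp) - window):
        slope = np.polyfit(t, tmp[start:start + window], 1)[0]
        if slope > slope_limit:
            return start
    return None

minutes = np.arange(33 * 60)                        # 33 h of 1-min readings
tmp = 104 + 0.02 * minutes                          # slow drift...
tmp[1500:] += 0.5 * (minutes[1500:] - 1500)         # ...then accelerating rise
print(first_inflection(tmp))                        # index shortly before minute 1500
```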

  14. Use of automated medication adherence monitoring in bipolar disorder research: pitfalls, pragmatics, and possibilities.

    Science.gov (United States)

    Levin, Jennifer B; Sams, Johnny; Tatsuoka, Curtis; Cassidy, Kristin A; Sajatovic, Martha

    2015-04-01

    Medication nonadherence occurs in 20-60% of persons with bipolar disorder (BD) and is associated with serious negative outcomes, including relapse, hospitalization, incarceration, suicide and high healthcare costs. Various strategies have been developed to measure adherence in BD. This descriptive paper summarizes challenges and workable strategies using electronic medication monitoring in a randomized clinical trial (RCT) in patients with BD. Descriptive data from 57 nonadherent individuals with BD enrolled in a prospective RCT evaluating a novel customized adherence intervention versus control were analyzed. Analyses focused on whole group data and did not assess intervention effects. Adherence was assessed with the self-reported Tablets Routine Questionnaire and the Medication Event Monitoring System (MEMS). The majority of participants were women (74%), African American (69%), with type I BD (77%). Practical limitations of MEMS included misuse in conjunction with pill minders, polypharmacy, cost, failure to bring to research visits, losing the device, and the device impacting baseline measurement. The advantages were more precise measurement, less biased recall, and collecting data from past time periods for missed interim visits. Automated devices such as MEMS can assist investigators in evaluating adherence in patients with BD. Knowing the anticipated pitfalls allows study teams to implement preemptive procedures for successful implementation in BD adherence studies and can help pave the way for future refinements as automated adherence assessment technologies become more sophisticated and readily available.

  15. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Semi-Disk Structure

    Science.gov (United States)

    2018-01-01

    ARL-TR-8271, January 2018. US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Semi-Disk Structure, by Kwok F Tom, Sensors and Electron Devices Directorate.
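
    The report's title suggests the following flavor of technique: a grayscale morphological opening with a semicircular ("semi-disk") structuring element removes narrow signal peaks from a power spectrum, leaving a noise-floor estimate, and bins sitting well above that floor are declared detections. The sketch below is a hedged reconstruction of that idea with invented parameters, not the ARL algorithm itself.

```python
# Morphological energy detection in a 1-D power spectrum: a grayscale
# opening with a semi-disk structuring element estimates the noise floor.
import numpy as np
from scipy.ndimage import grey_opening

def semidisk(radius, height):
    """Semicircular height profile over [-radius, radius] bins."""
    x = np.arange(-radius, radius + 1, dtype=float)
    return height * np.sqrt(1.0 - (x / radius) ** 2)

def detect_energy(psd_db, radius=25, height_db=5.0, margin_db=15.0):
    floor = grey_opening(psd_db, structure=semidisk(radius, height_db))
    return psd_db - floor > margin_db            # boolean mask of detected bins

rng = np.random.default_rng(1)
psd = rng.normal(-90.0, 2.0, 1024)               # noise floor around -90 dB
psd[400:410] += 30.0                             # a 10-bin emitter
print(np.flatnonzero(detect_energy(psd)))        # bins near 400-409
```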

  16. Process automation using combinations of process and machine control technologies with application to a continuous dissolver

    International Nuclear Information System (INIS)

    Spencer, B.B.; Yarbro, O.O.

    1991-01-01

    Operation of a continuous rotary dissolver, designed to leach uranium-plutonium fuel from chopped sections of reactor fuel cladding using nitric acid, has been automated. The dissolver is a partly continuous, partly batch process that interfaces at both ends with batchwise processes, thereby requiring synchronization of certain operations. Liquid acid is fed and flows through the dissolver continuously, whereas chopped fuel elements are fed to the dissolver in small batches and move through the compartments of the dissolver stagewise. Sequential logic (or machine control) techniques are used to control discrete activities such as the sequencing of isolation valves. Feedback control is used to control acid flowrates and temperatures. Expert systems technology is used for on-line material balances and diagnostics of process operation. 1 ref., 3 figs
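
    The mix of control styles the abstract describes, sequential logic for discrete activities alongside continuous feedback loops, can be sketched compactly. States, interlocks, gains, and setpoints below are invented for illustration; they are not the dissolver's actual control design.

```python
# Hybrid control sketch: a state machine sequences batch charging steps
# while a PI loop continuously trims the acid flow rate.

ACID_FLOW_SETPOINT = 12.0   # L/min (illustrative)
KP, KI = 0.4, 0.05

class DissolverControl:
    def __init__(self):
        self.state = "WAIT_FOR_BASKET"      # sequential-logic state
        self.integral = 0.0                 # PI controller memory

    def step_sequence(self, basket_ready, compartment_clear):
        # Discrete activities advance only when interlock conditions hold.
        if self.state == "WAIT_FOR_BASKET" and basket_ready:
            self.state = "OPEN_INLET_VALVE"
        elif self.state == "OPEN_INLET_VALVE" and compartment_clear:
            self.state = "INDEX_DRUM"
        elif self.state == "INDEX_DRUM":
            self.state = "WAIT_FOR_BASKET"  # ready for the next batch
        return self.state

    def trim_acid_flow(self, measured_flow, dt=1.0):
        # Continuous activity: PI feedback on the acid flow rate.
        error = ACID_FLOW_SETPOINT - measured_flow
        self.integral += error * dt
        return KP * error + KI * self.integral   # valve adjustment signal

ctl = DissolverControl()
print(ctl.step_sequence(basket_ready=True, compartment_clear=False))
print(ctl.trim_acid_flow(measured_flow=11.2))
```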

  17. Signal Processing for Beam Position Monitors

    CERN Document Server

    Vismara, Giuseppe

    2000-01-01

    At first sight, the problem of determining the beam position from the ratio of the induced charges on the opposite electrodes of a beam monitor seems trivial, but up to now no unique solution has been found that fits the various demands of all particle accelerators. The purpose of this paper is to help "instrumentalists" choose the best processing system for their particular application, depending on the machine size, the input dynamic range, the required resolution and the acquisition speed. After a general introduction and an analysis of the electrical signals to be treated (frequency and time domain), the definition of the electronic specifications will be reviewed. The tutorial will present the different families in which the processing systems can be grouped. A general description of the operating principles, with the relative advantages and disadvantages of the most commonly employed processing systems, is presented. Special emphasis will be put on recent technological developments based on telecommunication circ...
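
    The ratio the abstract refers to is usually normalized as a difference over a sum, which makes the position estimate independent of beam intensity. A minimal sketch, where the sensitivity constant is an illustrative value rather than any particular monitor's calibration:

```python
# Delta-over-sigma beam position estimate from two opposite electrodes.
# k_mm is a monitor-specific sensitivity constant (illustrative here).

def beam_position(q_right, q_left, k_mm=10.0):
    """Position in mm from the induced charges on opposite electrodes."""
    return k_mm * (q_right - q_left) / (q_right + q_left)

print(beam_position(1.05, 0.95))   # -> 0.5 mm toward the right electrode
```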

  18. Automated radiological monitoring at a Russian Ministry of Defence Naval Site

    International Nuclear Information System (INIS)

    Moskowitz, P.D.; Pomerville, J.; Gavrilov, S.; Kisselev, V.; Daniylan, V.; Belikov, A.; Egorkin, A.; Sokolovski, Y.; Endregard, M.; Krosshavn, M.; Sundling, C.V.; Yokstad, H.

    2001-01-01

    The Arctic Military Environmental Cooperation (AMEC) Program is a cooperative effort between the military establishments of the Kingdom of Norway, the Russian Federation, and the US. This paper discusses joint activities conducted over the past year among Norwegian, Russian, and US technical experts on a project to develop, demonstrate and implement automated radiological monitoring at Russian Navy facilities engaged in the dismantlement of nuclear-powered strategic ballistic missile launching submarines. Radiological monitoring is needed at these facilities to help protect workers engaged in the dismantlement program and the public living within the footprint of routine and accidental radiation exposure areas. By providing remote stand-alone monitoring, the Russian Navy will achieve added protection due to the defense-in-depth strategy afforded by local (at the site), regional (Kola) and national-level (Moscow) oversight. The system being implemented at the Polyaminsky Russian Naval Shipyard was developed from a working model tested at the Russian Institute for Nuclear Safety, Moscow, Russia. It includes Russian-manufactured terrestrial and underwater gamma detectors, smart controllers for graded sampling, radio-modems for offsite transmission of the data, and a data fusion/display system. The data fusion/display system is derived from the Norwegian Picasso AMEC Environmental Monitoring software package. This computer package allows monitoring personnel to review the real-time and historical status of monitoring at specific sites and objects and to establish new monitoring protocols as required, for example, in an off-normal accident situation. Plans are being developed to implement the use of this system at most RF Naval sites handling spent nuclear fuel.

  19. Infrasonic Stethoscope for Monitoring Physiological Processes

    Science.gov (United States)

    Shams, Qamar A. (Inventor); Zuckerwar, Allan J. (Inventor); Dimarcantonio, Albert L. (Inventor)

    2016-01-01

    An infrasonic stethoscope for monitoring physiological processes of a patient includes a microphone capable of detecting acoustic signals in the audible frequency bandwidth and in the infrasonic bandwidth (0.03 to 1000 Hertz), a body coupler attached to the body at a first opening in the microphone, a flexible tube attached to the body at a second opening in the microphone, and an earpiece attached to the flexible tube. The body coupler is capable of engagement with a patient to transmit sounds from the patient to the microphone and then to the earpiece.

  20. Process versus content in eyewitness metamemory monitoring.

    Science.gov (United States)

    Robinson, M D; Johnson, J T; Robertson, D A

    2000-09-01

    Three studies (Ns = 200, 135, and 187 college undergraduates) contrasted process versus content accounts of eyewitness metamemory monitoring. Subjective vividness, a cue related to memory content, was a better predictor of confidence and accuracy than were cues related to the retrieval process. Participants who were asked to recall, rather than recognize, event details displayed greater insight into accuracy, primarily because vividness was a more valid accuracy cue under recall conditions. Results reinforce the value of recall-based protocols for eliciting eyewitness testimony and suggest some specific conditions (e.g., yes-no recognition) under which investigators should be especially cautious in relying on confidence to infer accuracy. In addition, results point to a general framework for understanding moderating effects on eyewitness metamemory accuracy.

  1. Information processing for aerospace structural health monitoring

    Science.gov (United States)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life-cycle costs of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information which indicates both the diagnostics of the current structural integrity as well as the prognostics necessary for planning and managing the future health of the structure in a cost effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
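
    As a small illustration of the spectral feature extraction step, the sketch below turns a raw accelerometer trace into a vector of Fourier band energies of the kind that pattern-recognition stages consume. The sampling rate, band count, and test signal are invented for the example.

```python
# Fourier band-energy features from a raw accelerometer trace.
import numpy as np

def band_energies(signal, n_bands=8):
    """Split the one-sided power spectrum into n_bands equal-width bands."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(psd, n_bands)
    return np.array([b.sum() for b in bands])

fs = 1000.0                                   # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1 / fs)
trace = (np.sin(2 * np.pi * 50 * t)           # 50 Hz structural response
         + 0.1 * np.random.default_rng(0).standard_normal(t.size))
features = band_energies(trace)
print(features / features.sum())              # energy concentrated in the lowest band
```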

  2. New signal processing algorithms for automated external defibrillators

    OpenAIRE

    Irusta Zarandona, Unai

    2017-01-01

    Ventricular fibrillation (VF) is the first rhythm recorded in 40% of sudden deaths from out-of-hospital cardiac arrest (OHCA). The only effective treatment for VF is defibrillation by means of an electric shock. Outside the hospital, the shock is delivered with an automated external defibrillator (AED), which first analyses the patient's electrocardiogram (ECG) and determines whether a shockable rhythm is present. Survival in an OHCA case ...

  3. CULTURE AND TECHNOLOGY: AUTOMATION IN THE CREATIVE PROCESSES OF NARRATIVE

    Directory of Open Access Journals (Sweden)

    Fernando Fogliano

    2013-12-01

    Full Text Available The objective here is to reflect on the problem raised by the increasingly opaque presence of technology in contemporary artistic production. Automation is the most evident technological aspect of the devices used for the production, post-production and dissemination of this cultural activity. Throughout the text, the philosophers Vilém Flusser and Gilbert Simondon are brought into confrontation so that a more profound insight can be obtained. Language is considered here as the integrative factor in the search for a new convergent conceptual scenario that enables us to understand the consequences of technological convergence.

  4. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives.

    Science.gov (United States)

    Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini

    2017-11-27

    Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and to mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery and to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT technologies under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper will also include a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation to aid and motivate development.
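
    The "software sensor" idea pairs naturally with a small example: a partial least squares (PLS) regression maps spectra to an analyte concentration so the analyte can be tracked online between laboratory reference measurements. The data below are synthetic, and PLS is offered as a standard chemometric choice, not necessarily the one any given plant uses.

```python
# PLS-based software sensor: predict an analyte concentration from spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 200
concentration = rng.uniform(0.0, 10.0, n_samples)        # e.g. lipid content, g/L
peak = np.exp(-0.5 * ((np.arange(n_wavelengths) - 80) / 6.0) ** 2)
spectra = (concentration[:, None] * peak                 # analyte absorption band
           + rng.normal(0.0, 0.05, (n_samples, n_wavelengths)))  # noise

model = PLSRegression(n_components=3).fit(spectra, concentration)
new_spectrum = 5.0 * peak + rng.normal(0.0, 0.05, n_wavelengths)
print(model.predict(new_spectrum[None, :]))              # ~5.0
```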

  5. Marketing automation processes as a way to improve contemporary marketing of a company

    Directory of Open Access Journals (Sweden)

    Witold Świeczak

    2013-09-01

    Full Text Available The main aim of this article is to identify the possibilities offered to contemporary companies by the processes included in a marketing automation system. This publication deals with the key aspects of this issue. It shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation. This article defines the factors and processes which influence the effective course of actions taken as part of marketing automation. Marketing automation is a completely new reality: it gives up communication based on the mass distribution of uniform content in favor of truly personalized, individual and fully automated communication. It is a new kind of coexistence, in which the sales and marketing departments cooperate closely to achieve the best result, and in which marketing can clearly demonstrate its contribution to the income generated by the company. Marketing automation also brings huge analytical possibilities and a real increase in a company's value: the value added generated by the system as the source of information about clients and about all marketing and sales processes taking place in a company. The introduction of a marketing automation system alters not only the current functioning of a marketing department, but also marketers themselves. In fact, everything that a marketing automation system provides, primarily the accumulated and unique knowledge of the client, is a critical marketing asset of every modern enterprise.

  6. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
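
    Two of the preprocessing steps such a platform automates, uniform binning ("bucketing") of a 1D NMR spectrum and total-area normalization, fit in a few lines. The sketch below uses a synthetic spectrum and illustrative bin counts; it is not Automics code.

```python
# Bucketing and total-area normalization of a 1D NMR spectrum,
# yielding a feature vector for downstream statistical analysis.
import numpy as np

def bin_and_normalize(intensities, n_bins=250):
    buckets = np.array_split(intensities, n_bins)     # equal-width buckets
    binned = np.array([b.sum() for b in buckets])
    return binned / binned.sum()                      # total-area normalization

rng = np.random.default_rng(0)
ppm_axis = np.linspace(10, 0, 16384)
spectrum = (np.exp(-0.5 * ((ppm_axis - 3.2) / 0.01) ** 2)   # one metabolite peak
            + rng.normal(0, 0.001, ppm_axis.size))          # baseline noise
features = bin_and_normalize(spectrum)
print(features.shape, round(features.sum(), 6))             # (250,) 1.0
```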

  7. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases

  8. The Multi-Isotope Process (MIP) Monitor Project: FY12 Progress and Accomplishments

    Energy Technology Data Exchange (ETDEWEB)

    Coble, Jamie B.; Orton, Christopher R.; Jordan, David V.; Schwantes, Jon M.; Bender, Sarah; Dayman, Kenneth J.; Unlu, Kenan; Landsberger, Sheldon

    2012-09-27

    The Multi-Isotope Process (MIP) Monitor, being developed at Pacific Northwest National Laboratory (PNNL), provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of "...(minimization of) the risks of nuclear proliferation and terrorism." The MIP Monitor measures distributions of a suite of indicator (radioactive) isotopes present within product and waste streams of a nuclear reprocessing facility. These indicator isotopes are monitored on-line by gamma spectrometry and compared, in near-real-time, to spectral patterns representing "normal" process conditions using multivariate pattern recognition software. The monitor utilizes this multivariate analysis and gamma spectroscopy of reprocessing streams to detect small changes in the gamma spectrum, which may indicate changes in process conditions. Multivariate analysis methods common in chemometrics, such as principal component analysis (PCA) and partial least squares regression (PLS), act as pattern recognition techniques, which can detect small deviations from the expected, nominal condition. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting. Development of the MIP Monitor approach continues to evaluate the efficacy of the monitor for automated, real-time or near-real-time application. This report details follow-on research and development efforts sponsored by the U.S. Department of Energy Fuel Cycle Research and Development related to the MIP Monitor for fiscal year
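
    The pattern-recognition step the abstract names can be illustrated compactly: a PCA model built from gamma spectra collected under normal process conditions scores each new spectrum by its squared residual outside the model subspace (the Q statistic), and a large residual flags a change in process conditions. The sketch below is a synthetic toy with invented dimensions and thresholds, not PNNL's implementation.

```python
# PCA residual (Q statistic) monitoring of gamma spectra.
import numpy as np

rng = np.random.default_rng(0)
normal = rng.normal(100, 10, (200, 512))          # training spectra, 512 channels
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
P = vt[:5]                                        # retain 5 principal components

def q_statistic(spectrum):
    r = spectrum - mean
    residual = r - P.T @ (P @ r)                  # part outside the PCA subspace
    return residual @ residual

threshold = np.percentile([q_statistic(s) for s in normal], 99)
anomalous = normal[0].copy()
anomalous[300:310] += 80                          # new peak from an indicator isotope
print(q_statistic(anomalous) > threshold)         # True
```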

  9. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    International Nuclear Information System (INIS)

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan

    2012-01-01

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj values that differed by up to 44% from effective dose estimates that were not
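
    The dose arithmetic being automated can be sketched as follows: effective dose is conventionally estimated as ED = k × DLP, and the size adjustment is a multiplicative correction derived from the measured patient thickness. The exponential correction below is a hypothetical placeholder for illustration; the paper derives its own size-specific conversion.

```python
# Effective dose from DLP with a placeholder thickness-based size correction.
import math

def effective_dose_msv(dlp_mgy_cm, k_factor):
    """Standard estimate: ED = k * DLP, with k in mSv/(mGy*cm)."""
    return k_factor * dlp_mgy_cm

def size_adjusted_dose_msv(dlp_mgy_cm, k_factor, thickness_cm,
                           ref_thickness_cm=30.0):
    # Hypothetical exponential correction: smaller patients absorb a larger
    # fraction of the delivered dose than the reference phantom does.
    correction = math.exp(0.04 * (ref_thickness_cm - thickness_cm))
    return effective_dose_msv(dlp_mgy_cm, k_factor) * correction

print(size_adjusted_dose_msv(dlp_mgy_cm=450.0, k_factor=0.015, thickness_cm=24.0))
```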

  10. STAMPS: software tool for automated MRI post-processing on a supercomputer

    OpenAIRE

    Bigler, Don C.; Aksu, Yaman; Yang, Qing X.

    2009-01-01

    This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features....

  11. ADVANCES IN CLOG STATE MONITORING FOR USE IN AUTOMATED REED BED INSTALLATIONS

    Directory of Open Access Journals (Sweden)

    Theodore HUGHES-RILEY

    2014-06-01

    Full Text Available Constructed wetlands are a popular, environmentally conscious form of waste-water treatment that has proliferated across Europe and the rest of the world in recent years. The ability to monitor the conditions in the bed and control input factors such as heating and aeration may extend the lifetime of the reed bed substantially beyond the ten-year lifetime normally reached. The Autonomous Reed Bed Installation (ARBI) project is an EU FP7 initiative to develop a reed bed with automated control over input parameters based on readings taken from embedded sensors. Automated remedial action may improve bed treatment efficiency and prolong the life of the bed, avoiding the need to refurbish it, which is both time consuming and costly. One critical parameter to observe is the clog state of the reed bed, as this can severely impact the efficiency of water treatment to the point of the bed becoming non-operable. Magnetic resonance (MR) sensors can be a powerful tool in determining clogging levels, and this has previously been explored in the literature. This work is based on a conference paper (2nd International Conference "Water resources and wetlands", 2014) and details magnetic sensors suitable for long-term embedding into a constructed wetland. Unlike previous studies, this work examines a probe embedded into a wetland.

  12. Automated selected reaction monitoring software for accurate label-free protein quantification.

    Science.gov (United States)

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.
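
    A toy version of the peak-detection task such software performs is sketched below: locate the chromatographic elution peak of a targeted transition and integrate its area. SciPy's generic peak finder stands in for the published Anubis algorithm, which additionally suppresses interfering signals; all signal parameters are synthetic.

```python
# Locate and integrate an SRM elution peak in a synthetic chromatogram.
import numpy as np
from scipy.signal import find_peaks

rt = np.linspace(0, 10, 1000)                             # retention time, min
trace = 1000 * np.exp(-0.5 * ((rt - 6.2) / 0.08) ** 2)    # elution peak
trace += np.random.default_rng(0).normal(0, 10, rt.size)  # detector noise

peaks, props = find_peaks(trace, height=200, width=5)
idx = int(np.argmax(props["peak_heights"]))               # tallest candidate
apex = peaks[idx]
lo, hi = int(props["left_ips"][idx]), int(props["right_ips"][idx])
area = trace[lo:hi].sum() * (rt[1] - rt[0])               # area between half-max bounds
print(f"apex at {rt[apex]:.2f} min, area {area:.0f}")
```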

  13. AUTOMATION OF CONTROL OF THE BUSINESS PROCESS OF PUBLISHING SCIENTIFIC JOURNALS

    Directory of Open Access Journals (Sweden)

    O. Yu. Sakaliuk

    2016-09-01

    Full Text Available We consider the automation of the business process of publishing scientific journals. The paper describes the publishing activities of the Odessa National Academy of Food Technology (ONAFT) and the automation of its business processes, and presents a set of business process models for publishing scientific journals. The organizational structure of the Coordinating Centre of Scientific Journals' Publishing of ONAFT was analyzed and modeled. Business process models were built in the eEPC and BPMN notations. Database design, creation of the file structure and development of the AIS interface were also carried out, and interaction with a webcam was implemented. Based on the feasibility justification for the software development and the performance results shown in a radar chart, it is safe to say that the automated approach is much more efficient than the manual mode. The developed software will accelerate the development of ONAFT's scientific periodicals, which in turn will improve the academy's ratings at the global level and enhance its image and credibility.

  14. Distributed process control system for remote control and monitoring of the TFTR tritium systems

    International Nuclear Information System (INIS)

    Schobert, G.; Arnold, N.; Bashore, D.; Mika, R.; Oliaro, G.

    1989-01-01

    This paper reviews the progress made in the application of a commercially available distributed process control system to support the requirements established for the Tritium REmote Control And Monitoring System (TRECAMS) of the Tokamak Fusion Test Reactor (TFTR). The system that will be discussed was purchased from the Texas Instruments (TI) Automation Controls Division, previously marketed by Rexnord Automation. It consists of three fully redundant distributed process controllers interfaced to over 1800 analog and digital I/O points. The operator consoles located throughout the facility are supported by four Digital Equipment Corporation (DEC) PDP-11/73 computers. The PDP-11/73s and the three process controllers communicate over a fully redundant one-megabaud fiber optic network. All system functionality is based on a set of completely integrated databases loaded to the process controllers and the PDP-11/73s. (author). 2 refs.; 2 figs

  15. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    Science.gov (United States)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-04-01

    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier to the industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defect avoidance is fundamental. Because of this, there is a need to develop novel in situ monitoring tools able to keep the stability of the process under control on a layer-by-layer basis, and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize the defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems.

  16. Comparative analysis of automation of production process with industrial robots in Asia/Australia and Europe

    Directory of Open Access Journals (Sweden)

    I. Karabegović

    2017-01-01

    Full Text Available The term "INDUSTRY 4.0", or "fourth industrial revolution", was first introduced at the Hannover Fair in 2011. It comes from the high-tech strategy of the German Federal Government, which promotes the progression from automation-computerization to complete smart automation, meaning the introduction of methods of self-automation, self-configuration, self-diagnosis and problem fixing, knowledge and intelligent decision-making. No automation, including smart automation, can be imagined without industrial robots. Alongside the fourth industrial revolution, a "robotic revolution" is taking place in Japan. The robotic revolution refers to the development and research of robotic technology with the aim of using robots in all production processes and in real life, to be of service to people in daily life. With these facts in mind, an analysis was conducted of the presence of industrial robots in production processes on the two continents of Europe and Asia/Australia, together with research into how ready industry is for the introduction of intelligent automation with the goal of establishing future smart factories. The paper presents the automation of production processes in Europe and Asia/Australia, with predictions for the future.

  17. Automated locomotor activity monitoring as a quality control assay for mass-reared tephritid flies.

    Science.gov (United States)

    Dominiak, Bernard C; Fanson, Benjamin G; Collins, Samuel R; Taylor, Phillip W

    2014-02-01

    The Sterile Insect Technique (SIT) requires vast numbers of consistently high-quality insects to be produced over long periods. Quality control (QC) procedures are critical to effective SIT, both providing quality assurance and warning of operational deficiencies. We here present a potential new QC assay for mass rearing of Queensland fruit flies (Bactrocera tryoni Froggatt) for SIT: locomotor activity monitoring. We investigated whether automated locomotor activity monitors (LAMs) that simply detect how often a fly passes an infrared sensor in a glass tube might provide such quality insights with much greater economy. Activity levels were generally lower for females than for males, and declined over five days in the monitor for both sexes. Female activity levels were not affected by irradiation, but males irradiated at 60 or 70 Gy had reduced activity levels compared with unirradiated controls. We also found some evidence that mild heat shock of pupae results in adults with reduced activity. LAM offers a convenient, effective and economical assay to probe such changes. © 2013 Society of Chemical Industry.

  18. Automated tests for diagnosing and monitoring cognitive impairment: a diagnostic accuracy review.

    Science.gov (United States)

    Aslam, Rabeea'h W; Bates, Vickie; Dundar, Yenal; Hounsome, Juliet; Richardson, Marty; Krishan, Ashma; Dickson, Rumona; Boland, Angela; Kotas, Eleanor; Fisher, Joanne; Sikdar, Sudip; Robinson, Louise

    2016-10-01

    Cognitive impairment is a growing public health concern, and is one of the most distinctive characteristics of all dementias. The timely recognition of dementia syndromes can be beneficial, as some causes of dementia are treatable and are fully or partially reversible. Several automated cognitive assessment tools for assessing mild cognitive impairment (MCI) and early dementia are now available. Proponents of these tests cite as benefits the tests' repeatability and robustness and the saving of clinicians' time. However, the use of these tools to diagnose and/or monitor progressive cognitive impairment or response to treatment has not yet been evaluated. The aim of this review was to determine whether or not automated computerised tests could accurately identify patients with progressive cognitive impairment in MCI and dementia and, if so, to investigate their role in monitoring disease progression and/or response to treatment. Five electronic databases (MEDLINE, EMBASE, The Cochrane Library, ISI Web of Science and PsycINFO), plus ProQuest, were searched from 2005 to August 2015. The bibliographies of retrieved citations were also examined. Trial and research registers were searched for ongoing studies and reviews. A second search was run to identify individual test costs and acquisition costs for the various tools identified in the review. Two reviewers independently screened all titles and abstracts to identify potentially relevant studies for inclusion in the review. Full-text copies were assessed independently by two reviewers. Data were extracted and assessed for risk of bias by one reviewer and independently checked for accuracy by a second. The results of the data extraction and quality assessment for each study are presented in structured tables and as a narrative summary. The electronic searching of databases, including ProQuest, resulted in 13,542 unique citations. The titles and abstracts of these were screened and 399 articles were shortlisted for full

  19. Utility of an Automated Thermal-Based Approach for Monitoring Evapotranspiration

    Science.gov (United States)

    Timmermans, Wim J.; Kustas, William P.; Andreu, Ana

    2015-12-01

    A very simple remote sensing-based model for water use monitoring is presented. The model acronym DATTUTDUT (Deriving Atmosphere Turbulent Transport Useful To Dummies Using Temperature) is a Dutch word which loosely translates as "it's unbelievable that it works". DATTUTDUT is fully automated and only requires a surface temperature map, making it simple to use and providing a rapid estimate of spatially-distributed fluxes. The algorithm is first tested over a range of environmental and land-cover conditions using data from four short-term field experiments and then evaluated over a growing season in an agricultural region. Flux model output is in satisfactory agreement with observations and established remote sensing-based models, except under dry and partial canopy cover conditions. This suggests that DATTUTDUT has utility in identifying relative water use and as an operational tool providing initial estimates of ET anomalies in data-poor regions that would be confirmed using more robust modeling techniques.
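
    The heart of such a temperature-only scheme can be sketched in a few lines: each pixel's evaporative fraction is scaled linearly between the hottest (assumed non-evaporating) and coldest (assumed fully evaporating) pixels of the scene, and latent heat flux then follows from the available energy. The percentile limits below are an assumption used to soften outliers, not necessarily the model's published choice.

```python
# Evaporative fraction scaled between scene temperature extremes.
import numpy as np

def evaporative_fraction(ts_kelvin):
    t_cold = np.percentile(ts_kelvin, 0.5)    # ~fully evaporating pixel
    t_hot = np.percentile(ts_kelvin, 99.5)    # ~non-evaporating pixel
    ef = (t_hot - ts_kelvin) / (t_hot - t_cold)
    return np.clip(ef, 0.0, 1.0)

rng = np.random.default_rng(0)
ts = rng.uniform(295.0, 320.0, (100, 100))    # synthetic temperature map, K
ef = evaporative_fraction(ts)
# Latent heat flux would then follow as LE = EF * (Rn - G).
print(ef.min(), ef.max())
```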

  20. On-line process control monitoring system

    Science.gov (United States)

    O'Rourke, Patrick E.; Van Hare, David R.; Prather, William S.

    1992-01-01

    An on-line, fiber-optic based apparatus for monitoring the concentration of a chemical substance at a plurality of locations in a chemical processing system comprises a plurality of probes, each of which is at a different location in the system, a light source, optic fibers for carrying light to and from the probes, a multiplexer for switching light from the source from one probe to the next in series, a diode array spectrophotometer for producing a spectrum from the light received from the probes, and a computer programmed to analyze the spectra so produced. The probes allow the light to pass through the chemical substance so that a portion of the light is absorbed before being returned to the multiplexer. A standard and a reference cell are included for data validation and error checking.
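
    The quantitation behind such an absorbance monitor is the Beer-Lambert law: the spectrophotometer measures absorbance A = -log10(I/I0), and concentration follows as c = A/(epsilon * l). A minimal sketch with illustrative values:

```python
# Beer-Lambert concentration estimate from measured light intensities.
import math

def concentration(intensity, reference_intensity, epsilon, path_cm):
    """c = A / (epsilon * l), with A = -log10(I / I0)."""
    absorbance = -math.log10(intensity / reference_intensity)
    return absorbance / (epsilon * path_cm)

# epsilon in L/(mol*cm); a 1 cm probe gap; 60% transmission:
print(concentration(0.60, 1.0, epsilon=120.0, path_cm=1.0))  # ~0.0018 mol/L
```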

  1. Automated external cardioversion defibrillation monitoring in cardiac arrest: a randomized trial

    Directory of Open Access Journals (Sweden)

    Norvel Robert

    2008-06-01

    Full Text Available Abstract Background In-hospital cardiac arrest has a poor prognosis despite active electrocardiography monitoring. The initial rhythm of approximately 25% of in-hospital cardiopulmonary resuscitation (CPR) events is pulseless ventricular tachycardia/ventricular fibrillation (VT/VF). Early defibrillation is an independent predictor of survival in CPR events caused by VT/VF. The automated external cardioverter defibrillator (AECD) is a device attached by pads to the chest wall that monitors, detects, and, within seconds, automatically delivers electric countershock to an appropriate tachyarrhythmia. Study Objectives • To evaluate safety of AECD monitoring in hospitalized patients. • To evaluate whether AECDs provide earlier defibrillation than hospital code teams. Methods The study is a prospective trial randomizing patients admitted to the telemetry ward to standard CPR (code team) or standard CPR plus AECD monitoring (PowerHeart CRM). The AECD is programmed to deliver one 150 J biphasic shock to patients in sustained VT/VF. Data are collected using the Utstein criteria for cardiac arrest. The primary endpoint is time-to-defibrillation; secondary outcomes include neurological status and survival to discharge, with 3-year follow-up. Results To date, 192 patients have been recruited between 10/10/2006 and 7/20/2007. A total of 3,655 hours of telemetry data have been analyzed in the AECD arm. The AECD has monitored ambulatory telemetry patients in sinus rhythm, sinus tachycardia, supraventricular tachycardia, atrial flutter or fibrillation, with premature ventricular complexes and non-sustained VT, without delivery of inappropriate shocks. One patient experienced sustained VT during AECD monitoring and was successfully defibrillated (17 seconds after meeting programmed criteria). There are no events to report in the control arm. The patient survived the event without neurological complications. During the same time period, mean time to

  2. Measures and mechanisms for process monitoring in evolving business networks

    OpenAIRE

    Comuzzi, M.; Vonk, J.; Grefen, P.

    2012-01-01

    The literature on monitoring of cross-organizational processes, executed within business networks, considers monitoring only in the network formation phase, since network establishment determines what can be monitored during process execution. In particular, the impact of evolution in such networks on monitoring is not considered. When a business network evolves, e.g. contracts are introduced, updated, or dropped, or actors join or leave the network, the monitoring requirements of the network...

  3. A wireless smart sensor network for automated monitoring of cable tension

    International Nuclear Information System (INIS)

    Sim, Sung-Han; Cho, Soojin; Li, Jian; Jo, Hongki; Park, Jong-Woong; Jung, Hyung-Jo; Spencer Jr, Billie F

    2014-01-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea. (paper)
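
    Vibration-based tension estimation rests, to first order, on the taut-string relation between a cable's fundamental frequency and its tension. The sketch below shows only that relation; the in-network estimator described above also accounts for effects such as bending stiffness and sag that this one-liner ignores.

```python
def cable_tension_n(f1_hz, length_m, mass_per_m_kg):
    """First-order taut-string estimate: T = 4 * m * L^2 * f1^2.

    f1_hz: identified fundamental frequency of the cable;
    length_m: cable length; mass_per_m_kg: mass per unit length.
    Ignores bending stiffness and sag, so it is only a rough check.
    """
    return 4.0 * mass_per_m_kg * (length_m ** 2) * (f1_hz ** 2)

# Example: a 100 m cable of 50 kg/m with f1 = 1.2 Hz carries
# roughly 4 * 50 * 100**2 * 1.2**2 = 2.88e6 N, i.e. about 2.9 MN.
```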

  4. A wireless smart sensor network for automated monitoring of cable tension

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jong-Woong; Cho, Soojin; Spencer, Billie F., Jr.; Jung, Hyung-Jo

    2014-02-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea.

  5. The Effects of Automated Prompting and Self-Monitoring on Homework Completion for a Student with Attention Deficit Hyperactivity Disorder

    Science.gov (United States)

    Blicha, Amy; Belfiore, Phillip J.

    2013-01-01

    This study examined the effects of an intervention consisting of automated prompting and self-monitoring on the level of independent homework task completion for an elementary-age student with attention deficit hyperactivity disorder (ADHD). Instituting a single subject, within series ABAB design, the results showed a consistent increase and…

  6. Behavioural and physiological responses of laying hens to automated monitoring equipment.

    Science.gov (United States)

    Buijs, Stephanie; Booth, Francesca; Richards, Gemma; McGaughey, Laura; Nicol, Christine J; Edgar, Joanne; Tarlton, John F

    2018-02-01

Automated monitoring of behaviour can offer a wealth of information in circumstances where observing behaviour is difficult or time consuming. However, this often requires attaching monitoring devices to the animal, which can alter behaviour, potentially invalidating any data collected. Birds often show increased preening and energy expenditure when wearing devices and, especially in laying hens, there is a risk that individuals wearing devices will attract aggression from conspecifics. We studied the behavioural and physiological response of 20 laying hens to backpacks containing monitoring devices fastened with elastic loops around the wing base. We hypothesised that backpacks would lead to a stress-induced decrease in peripheral temperature, increased preening, more aggression from conspecifics, and reduced bodyweights. This was evaluated by thermography of the eye and comb (when isolated after fitting backpacks), direct observations of behaviour (when isolated, when placed back into the group, and on later days), and weighing (before and after each 7-day experimental period). Each hen wore a backpack during one of the two experimental periods only and was used as her own control. Contrary to our hypothesis, eye temperature was higher when hens wore a backpack (No backpack: 30.2 °C (IQR: 29.0-30.6) vs. Backpack: 30.9 °C (IQR: 30.0-32.0), P < 0.05). Backpack-directed pecking (i.e., pecking the backpack or leg rings) was still affected 2-7 days after fitting (No backpack: 0 pecks/hen/minute (IQR: 0-0), vs. Backpack: 0 (IQR: 0-0.07), P < 0.05). We found no effect of our backpacks on bodyweight. In conclusion, our backpacks seem suitable to attach monitoring equipment to hens with only a very minor effect on their behaviour after a short acclimation period (≤2 days).

  7. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC)

  8. Study on diesel vertical migration characteristics and mechanism in water-bearing sand stratum using an automated resistivity monitoring system.

    Science.gov (United States)

    Pan, Yuying; Jia, Yonggang; Wang, Yuhua; Xia, Xin; Guo, Lei

    2018-02-01

Oil spills frequently occur on both land and sea. Petroleum in mobile phase will cause serious pollution in the sediment and can form a secondary pollution source. Therefore, it is very important to study the migration of petroleum in sediments, ideally with a rapid and simplified approach. The release of diesel was simulated using fine beach sand to construct a model aquifer, and dynamic monitoring was carried out using an automated monitoring system including a resistivity probe originally developed by our research group. The mobile phase migration fronts were determined accurately using a wavelet analysis method combined with a resistivity curve method. Then, a relationship between resistivity and the joint oil-water content was established. The main conclusions were as follows. Infiltration of the highly mobile diesel was fastest at the initial stage, followed by a period when gravity seepage was dominant, and finally a redistribution period at the later stage, which was mainly an oil-water displacement process. The resistivity trends for diesel infiltration in different water-saturated soil layers varied with depth. The resistivity in the vadose zone fluctuated significantly, increasing initially and later decreasing. The resistivity change in the capillary zone was relatively small and constant in the initial stage; then, it increased and subsequently decreased. The resistivity in the saturated zone was basically unchanged with depth, and the value became slightly larger than the background value over time. Overall, for a large volume of mobile phase diesel leakage, the arrival migration fronts can be detected by wavelet analysis combined with resistivity curves. The thickness of the oil slick in the capillary zone can be estimated by resistivity changes. The relationships between resistivity and both the moisture content and oil-water joint saturation are in agreement with the linear models. The research results provide basic data and a
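
    The abstract does not state the wavelet basis or decomposition depth used, so the sketch below substitutes the simplest option, a single-level Haar detail signal, to illustrate how an abrupt resistivity change can be localized and read off as the migration front depth.

```python
import numpy as np

def migration_front_depth(depths, resistivity):
    """Locate the sharpest break in a resistivity-depth profile.

    Single-level Haar detail coefficients stand in for the paper's
    wavelet analysis: the largest coefficient marks the most abrupt
    resistivity change, taken here as the migration front.
    """
    r = np.asarray(resistivity, dtype=float)
    n = len(r) - len(r) % 2                      # even length for pairing
    detail = (r[0:n:2] - r[1:n:2]) / np.sqrt(2)  # Haar detail coefficients
    return depths[int(np.argmax(np.abs(detail))) * 2]
```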

  9. Development of automated welding process for field fabrication of thick walled pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, U A

    1981-01-01

    Research on automatic welding processes for the fabrication of thick-walled pressure vessels continued. A literature review on the subject was completed. A laboratory study of criteria for judging acceptable root parameters continued. Equipment for a demonstration facility to test the components and processes of the automated welding system has been specified and is being obtained. (LCL)

  10. Application of AN Automated Wireless Structural Monitoring System for Long-Span Suspension Bridges

    Science.gov (United States)

    Kurata, M.; Lynch, J. P.; van der Linden, G. W.; Hipley, P.; Sheng, L.-H.

    2011-06-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.
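
    The abstract does not detail the identification method used to extract the bridge's dynamic characteristics, so the sketch below uses the simplest stand-in, spectral peak picking on a windowed acceleration record, to show how rough natural-frequency estimates can be obtained.

```python
import numpy as np

def modal_frequencies(accel, fs, n_peaks=3, fmin=0.05):
    """Rough natural-frequency estimates by spectral peak picking.

    accel: acceleration record (1-D array); fs: sampling rate in Hz.
    Returns the frequencies of the n_peaks strongest local maxima of
    the windowed amplitude spectrum above fmin (to exclude drift/DC).
    """
    spec = np.abs(np.fft.rfft(accel * np.hanning(len(accel))))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    spec[freqs < fmin] = 0.0
    peaks = np.where((spec[1:-1] > spec[:-2]) & (spec[1:-1] > spec[2:]))[0] + 1
    top = peaks[np.argsort(spec[peaks])[::-1][:n_peaks]]
    return np.sort(freqs[top])
```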

  11. An Automated Policy Refinement Process Supported by Expert Knowledge

    OpenAIRE

    Rochaeli, Taufiq

    2009-01-01

    In a policy-based system management, a policy refinement process is required to translate abstract policies, which are specified by human, into enforceable policies, which are enforced by machine. However, a manual policy refinement process imposes some problems. The first problem is that it requires expert knowledge to perform the policy refinement process. The second problem is that refining policies for complex systems is a tedious task. Manual refinement process may cause some negative co...

  12. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Yan, W.

    1993-11-01

The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks that use the back-propagation algorithm for network training were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods.
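
    For illustration, a bare-bones feedforward network with back-propagation, the training scheme named above, can be written in a few lines of numpy. This is a minimal sketch, not ORNL's modular architecture; the feature extraction that precedes it (the data compression step) is assumed to have already produced the input vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(x, y, hidden=8, lr=0.1, epochs=2000):
    """Minimal one-hidden-layer back-propagation classifier.

    x: (n, f) compressed eddy-current feature vectors; y: (n, c) one-hot
    defect classes. Trained on a squared-error loss with full-batch
    gradient descent.
    """
    w1 = rng.normal(0.0, 0.5, (x.shape[1], hidden))
    w2 = rng.normal(0.0, 0.5, (hidden, y.shape[1]))
    for _ in range(epochs):
        h = np.tanh(x @ w1)                    # hidden activations
        p = 1.0 / (1.0 + np.exp(-(h @ w2)))    # sigmoid outputs
        g2 = (p - y) * p * (1.0 - p)           # output-layer error signal
        g1 = (g2 @ w2.T) * (1.0 - h ** 2)      # back-propagated to hidden layer
        w2 -= lr * h.T @ g2 / len(x)
        w1 -= lr * x.T @ g1 / len(x)
    return w1, w2
```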

  13. Extreme temperature regulation in the automation process in the production of biogas methane tanks

    OpenAIRE

    Uzhelovs'kyy, V.; Tkachenko, S.

    2015-01-01

Problem statement. The efficiency of bioconversion processes is largely determined by the level of automation and control of the biogas process, which makes it possible to optimize the process, enhance its efficiency, and adapt it to real operating conditions. Analysis of recent research. Studies by contemporary experts in the field of managing energy production from unconventional sources show a steady and sound understanding of the management of biogas production from biomas...

  14. 40 CFR 63.1429 - Process vent monitoring requirements.

    Science.gov (United States)

    2010-07-01

40 CFR Protection of Environment, § 63.1429, Process vent monitoring requirements (revised as of 2010-07-01). (a) Monitoring equipment requirements. The owner or operator... Analyzer vents, open-ended valves or lines, and pressure relief valves needed for safety purposes are not...

  15. Greenhouse Gases in the South Atlantic: Testing and Automation of Instrumentation for Long-Term Monitoring

    Science.gov (United States)

    Lowry, D.; Fisher, R.; Sriskantharajah, S.; Lanoisellé, M.; Etchells, A.; Manning, A.; Nisbet, E.

    2009-04-01

Understanding ocean uptake of atmospheric CO2 by the Southern Ocean is important for modelling of future global warming scenarios, particularly since it was recently proposed that this sink was reducing (Le Quéré et al., 2007). To help our understanding of this problem, a new project aims to flask sample air from 5 South Atlantic sites and set up continuous monitoring at the 2 most accessible of these: Ascension Island and the Falklands. Flask sample measurements will include CO2 and CH4 mixing ratios and the δ13C measurement of both of these gases using the rapid continuous flow trace gas analysis system at Royal Holloway, University of London (RHUL). Routine precisions are ±0.03 per mil and ±0.05 per mil for CO2 and CH4, respectively (Fisher et al., 2006). A time series of δ13C in CH4 was maintained for Ascension Island from 2000-2005 and a time series for methane isotopes commenced for the Falkland Islands in autumn 2007. To meet the continuous monitoring requirements of the new project, three Picarro G1301 CO2 / CH4 / H2O Cavity Ring Down Spectrometers (CRDS) were installed at RHUL in October 2008 for testing, calibration and the development of an automated air inlet system suitable for analysis of calibration gases at the remote sites. Initial testing included calibration with NOAA calibrated and target gases, validation of the Picarro-defined H2O-correction of CO2, and derivation of an H2O-correction for CH4. Continuing checks on the H2O correction are made by having 2 instruments side-by-side taking air from the same inlet, but one having a combined Nafion / Mg-perchlorate drying system that utilizes the analysis system exhaust gas for the reverse flow through the Nafion and maintains water-levels at 0.05% for more than 2 weeks. These instruments are connected to the same air inlet as a GC measuring CH4 mixing ratio and a LiCor 6252 measuring CO2 mixing ratio at 30-minute and 1-minute intervals respectively. The third CRDS instrument is connected to a
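
    The water-vapour correction mentioned above typically takes a quadratic empirical form; a sketch is below. No particular coefficient values are assumed here, since they are instrument-specific and must be derived experimentally, exactly as the abstract describes for CH4.

```python
def dry_mole_fraction(x_wet, h2o_percent, a, b):
    """Water-vapour correction of a CRDS mixing ratio.

    Applies the usual empirical dilution/pressure-broadening form
    x_dry = x_wet / (1 + a*H + b*H**2), where H is the reported H2O
    in percent. The coefficients a and b are instrument-specific and
    determined by calibration experiments; none are assumed here.
    """
    return x_wet / (1.0 + a * h2o_percent + b * h2o_percent ** 2)
```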

  16. A Continuous Automated Vault Inventory System (CAVIS) for accountability monitoring of stored nuclear materials

    Energy Technology Data Exchange (ETDEWEB)

Pickett, C.A.; Barham, M.A.; Gafford, T.A.; Hutchinson, D.P.; Jordan, J.K.; Maxey, L.C.; Moran, B.W.; Muhs, J.; Nodine, R.; Simpson, M.L. [and others]

    1994-12-08

Nearly all facilities that store hazardous (radioactive or non-radioactive) materials must comply with prevailing federal, state, and local laws. These laws usually have components that require periodic physical inspections to ensure that all materials remain safely and securely stored. The inspections are generally labor intensive, slow, put personnel at risk, and only find anomalies after they have occurred. The system described in this paper was developed for monitoring stored nuclear materials resulting from weapons dismantlement, but its applications extend to any storage facility that meets the above criteria. The traditional special nuclear material (SNM) accountability programs that are currently used within most of the Department of Energy (DOE) complex require the physical entry of highly trained personnel into SNM storage vaults. This imposes the need for additional security measures, which typically mandate that extra security personnel be present while SNM inventories are performed. These requirements increase labor costs and put additional personnel at risk of radiation exposure. In some cases, individuals have received radiation exposure equivalent to the annual maximum during just one inventory verification. With increasing overhead costs, the current system is rapidly becoming too expensive to operate, and the need for an automated method of inventory verification is evident. The Continuous Automated Vault Inventory System (CAVIS) described in this paper was designed and prototyped as a low-cost, highly reliable, and user-friendly system that is capable of providing real-time weight, gamma, and neutron energy confirmation from each item stored in an SNM vault. This paper describes the sensor technologies, the CAVIS prototype system (built at Y-12 for highly enriched uranium storage), the technical requirements that must be achieved to assure successful implementation, and the sensor technologies needed for a plutonium facility.

  17. Safeguards inventory and process monitoring regulatory comparison

    Energy Technology Data Exchange (ETDEWEB)

    Cavaluzzi, Jack M. [Texas A & M Univ., College Station, TX (United States); Gibbs, Philip W. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2013-06-27

Detecting the theft or diversion of the relatively small amount of fissile material needed to make a nuclear weapon, given the normal operating capacity of many of today's running nuclear production facilities, is a difficult task. As throughput increases, the ability of the Material Control and Accountability (MC&A) Program to detect material loss decreases because the statistical measurement uncertainty also increases. The challenge is that, under some regulatory approaches, the ability of current accounting, measurement, and material control programs to detect small yet significant losses can decrease to the point of being extremely low, if not practically non-existent, at normal operating capacities. Adding concern to this topic is that there are variations among regulatory bodies as to what is considered a Significant Quantity (SQ). Some research suggests that thresholds should be lower than those found in any current regulation, which, if adopted, would make meeting detection goals even more difficult. This paper reviews and compares the current regulatory requirements for the MA elements related to physical inventory, uncertainty of the Inventory Difference (ID), and Process Monitoring (PM) in the United States Department of Energy (DOE) and Nuclear Regulatory Commission (NRC), Rosatom of the Russian Federation, and the Chinese Atomic Energy Agency (CAEA) of China. The comparison looks at how the regulatory requirements for the implementation of various MA elements perform across a range of operating capacities in example facilities.

  18. The integration of process monitoring for safeguards

    International Nuclear Information System (INIS)

    Cipiti, Benjamin B.; Zinaman, Owen R.

    2010-01-01

The Separations and Safeguards Performance Model is a reprocessing plant model that has been developed for safeguards analyses of future plant designs. The model has been modified to integrate bulk process monitoring data with traditional plutonium inventory balances to evaluate potential advanced safeguards systems. Taking advantage of the wealth of operator data such as flow rates and mass balances of bulk material, the timeliness of detection of material loss was shown to improve considerably. Four diversion cases were tested, including both abrupt and protracted diversions at early and late times in the run. The first three cases indicated alarms before half of a significant quantity of material was removed. The buildup of error over time prevented detection in the case of a protracted diversion late in the run. Some issues related to the alarm conditions and bias correction will need to be addressed in future work. This work demonstrates the use of the model both for performing diversion scenario analyses and for testing advanced safeguards system designs.
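
    The balance logic underlying such analyses can be sketched as a cumulative material-unaccounted-for (MUF) test; the sketch below is a generic illustration under the assumption of independent per-period measurement errors, not the model's actual implementation. It also makes visible why the protracted late-run diversion above escaped detection: the variance of the cumulative balance grows with time.

```python
import numpy as np

def cumulative_muf_alarm(inputs, outputs, inventories, sigmas, k=3.0):
    """Cumulative MUF (material unaccounted for) test.

    inputs/outputs: measured transfers per balance period (length n);
    inventories: n+1 physical inventories bracketing the n periods;
    sigmas: per-period measurement standard deviations, assumed
    independent. Returns the cumulative MUF and a boolean alarm series
    that fires when |cumulative MUF| exceeds k standard deviations.
    """
    muf = np.asarray(inputs) - np.asarray(outputs) - np.diff(np.asarray(inventories))
    cum_muf = np.cumsum(muf)
    cum_sigma = np.sqrt(np.cumsum(np.asarray(sigmas, dtype=float) ** 2))
    return cum_muf, np.abs(cum_muf) > k * cum_sigma
```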

  19. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS

    Science.gov (United States)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew

    2011-01-01

    substantially by combining it with other technologies for automated, miniaturized, high-throughput biological measurements, such as fast sequencing, protein identification (proteomics) and metabolite profiling (metabolomics). Thus, the system can be integrated with other biomedical instruments in order to support and enhance telemedicine capability onboard ISS. NASA's mission includes sustained investment in critical research leading to effective countermeasures to minimize the risks associated with human spaceflight, and the use of appropriate technology to sustain space exploration at reasonable cost. Our integrated microarray technology is expected to fulfill these two critical requirements and to enable the scientific community to better understand and monitor the effects of the space environment on microorganisms and on the astronaut, in the process leveraging current capabilities and overcoming present limitations.

  20. An automated fog monitoring system for the Indo-Gangetic Plains based on satellite measurements

    Science.gov (United States)

    Patil, Dinesh; Chourey, Reema; Rizvi, Sarwar; Singh, Manoj; Gautam, Ritesh

    2016-05-01

Fog is a meteorological phenomenon that causes reduction in regional visibility and affects air quality, thus leading to various societal and economic implications, especially disrupting air and rail transportation. The persistent and widespread winter fog impacts the entire Indo-Gangetic Plains (IGP), as frequently observed in satellite imagery. The IGP is a densely populated region in south Asia, home to about 1/6th of the world's population, with a strong upward pollution trend. In this study, we have used multi-spectral radiances and aerosol/cloud retrievals from Terra/Aqua MODIS data for developing an automated web-based fog monitoring system over the IGP. Using our previous and existing methodologies, and ongoing algorithm development for the detection of fog and retrieval of associated microphysical properties (e.g. fog droplet effective radius), we characterize widespread fog during both daytime and nighttime. Specifically, for the nighttime fog detection, the algorithm employs a satellite-based bi-spectral brightness temperature difference technique between two spectral channels: MODIS band-22 (3.9μm) and band-31 (10.75μm). Further, we are extending our algorithm development to geostationary satellites to provide continuous monitoring of the spatial-temporal variation of fog. We anticipate that the ongoing and future development of a fog monitoring system would be of assistance to air, rail and vehicular transportation management, as well as for the dissemination of fog information to government agencies and the general public. The outputs of the fog detection algorithm and related aerosol/cloud parameters are operationally disseminated via http://fogsouthasia.com/.
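
    The bi-spectral test named above exploits the fact that water droplets emit less efficiently at 3.9 μm than at 10.75 μm, so fog and low stratus show a negative brightness temperature difference at night. A minimal sketch follows; the -2 K threshold is a commonly used starting point, not the tuned value of the system described here.

```python
import numpy as np

def night_fog_mask(bt_39_k, bt_1075_k, threshold_k=-2.0):
    """Bi-spectral night-time fog test on brightness temperatures.

    bt_39_k, bt_1075_k: brightness temperature arrays (kelvin) for the
    3.9 um and 10.75 um channels. Pixels with a sufficiently negative
    difference are flagged as fog/low stratus.
    """
    return (np.asarray(bt_39_k) - np.asarray(bt_1075_k)) < threshold_k
```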

  1. FLAME MONITORING IN POWER STATION BOILERS USING IMAGE PROCESSING

    Directory of Open Access Journals (Sweden)

    K. Sujatha

    2012-05-01

Full Text Available Combustion quality in power station boilers plays an important role in minimizing the flue gas emissions. In the present work, various intelligent schemes to infer the flue gas emissions by monitoring the flame colour at the furnace of the boiler are proposed. Flame image monitoring involves capturing the flame video over a period of time with the measurement of various parameters like Carbon dioxide (CO2), excess oxygen (O2), Nitrogen dioxide (NOx), Sulphur dioxide (SOx) and Carbon monoxide (CO) emissions plus the flame temperature at the core of the fire ball, air/fuel ratio and the combustion quality. The higher the quality of combustion, the lower the flue gas emissions at the exhaust. The flame video was captured using an infrared camera. The flame video is then split up into frames for further analysis. The video splitter is used for progressive extraction of the flame images from the video. The images of the flame are then pre-processed to reduce noise. The conventional classification and clustering techniques include the Euclidean distance (L2 norm) classifier. The intelligent classifiers include the Radial Basis Function Network (RBF), Back Propagation Algorithm (BPA) and a parallel architecture with RBF and BPA (PRBFBPA). The results of the validation are supported with the above-mentioned performance measures, whose values are in the optimal range. The values of the temperatures, combustion quality, SOx, NOx, CO, CO2 concentrations, and air and fuel supplied corresponding to the images were obtained, thereby indicating the necessary control action to increase or decrease the air supply so as to ensure complete combustion. In this work, by continuously monitoring the flame images, combustion quality was inferred (complete/partial/incomplete combustion) and the air/fuel ratio could be automatically varied. Moreover, in the existing set-up, measurements like NOx, CO and CO2 are inferred from the samples that are collected periodically or by

  2. The Use of an Automated System (GreenFeed) to Monitor Enteric Methane and Carbon Dioxide Emissions from Ruminant Animals.

    Science.gov (United States)

    Hristov, Alexander N; Oh, Joonpyo; Giallongo, Fabio; Frederick, Tyler; Weeks, Holley; Zimmerman, Patrick R; Harper, Michael T; Hristova, Rada A; Zimmerman, R Scott; Branco, Antonio F

    2015-09-07

    Ruminant animals (domesticated or wild) emit methane (CH4) through enteric fermentation in their digestive tract and from decomposition of manure during storage. These processes are the major sources of greenhouse gas (GHG) emissions from animal production systems. Techniques for measuring enteric CH4 vary from direct measurements (respiration chambers, which are highly accurate, but with limited applicability) to various indirect methods (sniffers, laser technology, which are practical, but with variable accuracy). The sulfur hexafluoride (SF6) tracer gas method is commonly used to measure enteric CH4 production by animal scientists and more recently, application of an Automated Head-Chamber System (AHCS) (GreenFeed, C-Lock, Inc., Rapid City, SD), which is the focus of this experiment, has been growing. AHCS is an automated system to monitor CH4 and carbon dioxide (CO2) mass fluxes from the breath of ruminant animals. In a typical AHCS operation, small quantities of baiting feed are dispensed to individual animals to lure them to AHCS multiple times daily. As the animal visits AHCS, a fan system pulls air past the animal's muzzle into an intake manifold, and through an air collection pipe where continuous airflow rates are measured. A sub-sample of air is pumped out of the pipe into non-dispersive infra-red sensors for continuous measurement of CH4 and CO2 concentrations. Field comparisons of AHCS to respiration chambers or SF6 have demonstrated that AHCS produces repeatable and accurate CH4 emission results, provided that animal visits to AHCS are sufficient so emission estimates are representative of the diurnal rhythm of rumen gas production. Here, we demonstrate the use of AHCS to measure CO2 and CH4 fluxes from dairy cows given a control diet or a diet supplemented with technical-grade cashew nut shell liquid.
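
    The per-visit flux computation reduces to integrating airflow times the concentration excess over ambient. The sketch below shows the general form only; C-Lock's actual calibration and corrections are not reproduced here, and the ideal-gas simplification (22.4 L/mol at STP, no temperature/pressure correction) is an assumption.

```python
import numpy as np

def emission_rate_g_per_day(flow_l_per_s, conc_ppm, background_ppm, molar_mass_g):
    """Average gas emission rate over one head-chamber visit, in g/day.

    flow_l_per_s: measured airflow past the muzzle (L/s), scalar or array;
    conc_ppm: measured gas concentration trace; background_ppm: ambient
    level; molar_mass_g: e.g. 16.04 for CH4, 44.01 for CO2.
    """
    excess = np.asarray(conc_ppm, dtype=float) - background_ppm  # ppm above ambient
    mol_per_s = np.asarray(flow_l_per_s) * excess * 1e-6 / 22.4  # captured mol/s
    return float(np.mean(mol_per_s)) * molar_mass_g * 86400.0    # scale to g/day
```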

  3. Synthesis of many different types of organic small molecules using one automated process.

    Science.gov (United States)

    Li, Junqi; Ballmer, Steven G; Gillis, Eric P; Fujii, Seiko; Schmidt, Michael J; Palazzolo, Andrea M E; Lehmann, Jonathan W; Morehouse, Greg F; Burke, Martin D

    2015-03-13

    Small-molecule synthesis usually relies on procedures that are highly customized for each target. A broadly applicable automated process could greatly increase the accessibility of this class of compounds to enable investigations of their practical potential. Here we report the synthesis of 14 distinct classes of small molecules using the same fully automated process. This was achieved by strategically expanding the scope of a building block-based synthesis platform to include even C(sp3)-rich polycyclic natural product frameworks and discovering a catch-and-release chromatographic purification protocol applicable to all of the corresponding intermediates. With thousands of compatible building blocks already commercially available, many small molecules are now accessible with this platform. More broadly, these findings illuminate an actionable roadmap to a more general and automated approach for small-molecule synthesis. Copyright © 2015, American Association for the Advancement of Science.

  4. Automated processing of webcam images for phenological classification.

    Directory of Open Access Journals (Sweden)

    Ludwig Bothmann

Full Text Available Along with global climate change, there is increasing interest in its effects on phenological patterns such as the start and end of the growing season. Scientific digital webcams are used for this purpose, taking one or more images every day of the same natural motif, showing for example trees or grassland sites. To derive phenological patterns from the webcam images, regions of interest are manually defined on these images by an expert, and subsequently a time series of percentage greenness is derived and analyzed with respect to structural changes. While this standard approach leads to satisfying results and allows the determination of dates of phenological change points, it is associated with a considerable amount of manual work and is therefore constrained to a limited number of webcams only. In particular, this precludes applying the phenological analysis to a large network of publicly accessible webcams in order to capture spatial phenological variation. In order to be able to scale up the analysis to several hundreds or thousands of webcams, we propose and evaluate two automated alternatives for the definition of regions of interest, allowing for efficient analyses of webcam images. A semi-supervised approach selects pixels based on the correlation of the pixels' time series of percentage greenness with a few prototype pixels. An unsupervised approach clusters pixels based on scores of a singular value decomposition. We show for a scientific webcam that the resulting regions of interest are at least as informative as those chosen by an expert, with the advantage that no manual action is required. Additionally, we show that the methods can even be applied to publicly available webcams accessed via the internet, yielding interesting partitions of the analyzed images. Finally, we show that the methods are suitable for the intended big data applications by analyzing 13988 webcams from the AMOS database. All developed methods are implemented in the
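
    The greenness time series and the semi-supervised region-of-interest selection can be sketched as follows; the green chromatic coordinate g/(r+g+b) is the standard greenness measure, while the correlation threshold used below is an assumption rather than the paper's tuned value.

```python
import numpy as np

def greenness_series(frames):
    """Per-pixel green chromatic coordinate g / (r + g + b).

    frames: array of shape (t, h, w, 3), e.g. daily webcam images.
    Averaging the result over a region of interest gives the site's
    percentage-greenness time series analyzed above.
    """
    f = frames.astype(float)
    return f[..., 1] / np.clip(f.sum(axis=-1), 1.0, None)

def roi_by_correlation(gcc, prototype_yx, r_min=0.9):
    """Semi-supervised ROI: keep pixels correlated with a prototype pixel.

    gcc: output of greenness_series, shape (t, h, w); prototype_yx:
    (row, col) of an expert-chosen prototype pixel; r_min: assumed
    correlation threshold.
    """
    t = gcc.shape[0]
    series = gcc.reshape(t, -1)
    proto = gcc[:, prototype_yx[0], prototype_yx[1]]
    z = (series - series.mean(axis=0)) / (series.std(axis=0) + 1e-9)
    zp = (proto - proto.mean()) / (proto.std() + 1e-9)
    r = (z * zp[:, None]).mean(axis=0)  # Pearson correlation per pixel
    return (r >= r_min).reshape(gcc.shape[1:])
```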

  5. Fully automated concentration control of the acidic texturisation process

    OpenAIRE

    Dannenberg, T.; Zimmer, M.; Rentsch, J.

    2012-01-01

To enable concentration control in the acidic texturing process, we have closed the feedback loop from analytical data to the dosing mechanism of the process tool. To analyze the process bath, we used near-infrared spectroscopy in an online setup as well as ion chromatography as an inline method in a second approach. The developed dosing algorithm allows a concentration optimization of HF and HNO3 as a function of the Si concentration. This allows a further optimization o...

  6. Development of an automated foam processing system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Gallaher, J.B.

    1978-01-01

    Processing variables in the polyurethane foam encapsulation procedure on an electronic assembly timer occasionally yielded foam which was dimensionally unstable. This change in size was large enough that the affected timers would not meet gage requirements and had to be reworked. This instability was indicative of a marginal process. A thorough investigation of the problem determined that inadequate mixing of the two constituents of the foam was the cause. To eliminate the cause of the marginal process, requirements were defined which were used as guidelines in specifying the necessary equipment. This specification was then issued to suppliers for quotes. Once the quotes were received, the capabilities of the different foam processing systems were reviewed to assure conformity to the specification.

  7. Aozan: an automated post-sequencing data-processing pipeline.

    Science.gov (United States)

    Perrin, Sandrine; Firmo, Cyril; Lemoine, Sophie; Le Crom, Stéphane; Jourdren, Laurent

    2017-07-15

    Data management and quality control of output from Illumina sequencers is a disk space- and time-consuming task. Thus, we developed Aozan to automatically handle data transfer, demultiplexing, conversion and quality control once a run has finished. This software greatly improves run data management and the monitoring of run statistics via automatic emails and HTML web reports. Aozan is implemented in Java and Python, supported on Linux systems, and distributed under the GPLv3 License at: http://www.outils.genomique.biologie.ens.fr/aozan/ . Aozan source code is available on GitHub: https://github.com/GenomicParisCentre/aozan . aozan@biologie.ens.fr. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  8. Automated input data management in manufacturing process simulation

    OpenAIRE

    Ettefaghian, Alireza

    2015-01-01

    Input Data Management (IDM) is a time consuming and costly process for Discrete Event Simulation (DES) projects. Input Data Management is considered as the basis of real-time process simulation (Bergmann, Stelzer and Strassburger, 2011). According to Bengtsson et al. (2009), data input phase constitutes on the average about 31% of the time of an entire simulation project. Moreover, the lack of interoperability between manufacturing applications and simulation software leads to a high cost to ...

  9. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen

    2011-12-01

Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by the existing technical design practices and assumptions (design documents, expert feedback), available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking with other process models and methods (notably the RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information, and automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into

  10. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    Science.gov (United States)

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for the analyses of coagulation. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet deficient plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
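
    Passing-Bablok regression, the comparison method named above, estimates the slope as a shifted median of all pairwise slopes, which makes it robust to outliers and symmetric in the two methods being compared. The sketch below is a simplified version: it omits confidence intervals and handles ties and edge cases only approximately.

```python
import numpy as np
from itertools import combinations

def passing_bablok(x, y):
    """Simplified Passing-Bablok regression for method comparison.

    x, y: paired measurements of the same samples by two methods.
    Returns (slope, intercept). Slopes of exactly -1 are discarded and
    the median index is shifted by the count of slopes below -1, per
    the original estimator; no confidence intervals are computed.
    """
    slopes = []
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        if x1 != x2:
            s = (y2 - y1) / (x2 - x1)
            if s != -1.0:
                slopes.append(s)
    slopes = np.sort(np.asarray(slopes))
    k = int(np.sum(slopes < -1.0))  # shift keeps the estimator unbiased
    n = len(slopes)
    if n % 2:
        slope = slopes[(n - 1) // 2 + k]
    else:
        slope = 0.5 * (slopes[n // 2 - 1 + k] + slopes[n // 2 + k])
    intercept = float(np.median(np.asarray(y) - slope * np.asarray(x)))
    return slope, intercept
```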

  11. Automated control of the laser welding process of heart valve scaffolds

    OpenAIRE

    Weber Moritz; Hoheisel Anna L.; Glasmacher Birgit

    2016-01-01

Using the electrospinning process, the geometry of a heart valve cannot be replicated in a single manufacturing step. To produce heart valve scaffolds, the heart valve leaflets and the vessel have to be produced in separate spinning processes and then mated to form the final heart valve. In this work, an existing three-axis laser system was enhanced to laser-weld those scaffolds. The automation control software is based on the robot operating system (ROS). The mecha...

  12. A scalable, fully automated process for construction of sequence-ready human exome targeted capture libraries.

    Science.gov (United States)

    Fisher, Sheila; Barry, Andrew; Abreu, Justin; Minie, Brian; Nolan, Jillian; Delorey, Toni M; Young, Geneva; Fennell, Timothy J; Allen, Alexander; Ambrogio, Lauren; Berlin, Aaron M; Blumenstiel, Brendan; Cibulskis, Kristian; Friedrich, Dennis; Johnson, Ryan; Juhn, Frank; Reilly, Brian; Shammas, Ramy; Stalker, John; Sykes, Sean M; Thompson, Jon; Walsh, John; Zimmer, Andrew; Zwirko, Zac; Gabriel, Stacey; Nicol, Robert; Nusbaum, Chad

    2011-01-01

    Genome targeting methods enable cost-effective capture of specific subsets of the genome for sequencing. We present here an automated, highly scalable method for carrying out the Solution Hybrid Selection capture approach that provides a dramatic increase in scale and throughput of sequence-ready libraries produced. Significant process improvements and a series of in-process quality control checkpoints are also added. These process improvements can also be used in a manual version of the protocol.

  13. Automated multiscale morphometry of muscle disease from second harmonic generation microscopy using tensor-based image processing.

    Science.gov (United States)

    Garbe, Christoph S; Buttgereit, Andreas; Schürmann, Sebastian; Friedrich, Oliver

    2012-01-01

Practically all chronic diseases are characterized by tissue remodeling that alters organ and cellular function through changes to normal organ architecture. Some morphometric alterations become irreversible and account for disease progression even on cellular levels. Early diagnostics to categorize tissue alterations, as well as monitoring progression or remission of disturbed cytoarchitecture upon treatment in the same individual, are a new emerging field. They strongly challenge spatial resolution and require advanced imaging techniques and strategies for detecting morphological changes. We use a combined second harmonic generation (SHG) microscopy and automated image processing approach to quantify morphology in an animal model of inherited Duchenne muscular dystrophy (mdx mouse) with age. Multiphoton XYZ image stacks from tissue slices reveal vast morphological deviation in muscles from old mdx mice at different scales of cytoskeleton architecture: cell calibers are irregular, myofibrils within cells are twisted, and sarcomere lattice disruptions (detected as "verniers") are larger in number compared to samples from healthy mice. In young mdx mice, such alterations are only minor. The boundary-tensor approach, adapted and optimized for SHG data, is a suitable approach to allow quick quantitative morphometry in whole tissue slices. The overall detection performance of the automated algorithm compares very well with manual "by eye" detection, the latter being time consuming and prone to subjective errors. Our algorithm outperforms manual detection in speed with similar reliability. This approach will be an important prerequisite for the implementation of clinical image databases to diagnose and monitor specific morphological alterations in chronic (muscle) diseases. © 2011 IEEE

  14. COED Transactions, Vol. IX, No. 9, September 1977. A Complete Course in Process Automation.

    Science.gov (United States)

    Marcovitz, Alan B., Ed.

    This document presents a mechanical engineering unit addressing essential aspects of computerized plant automation. The unit reviews the control of the simplest of all processes, a 1-measured variable, 1-controlled variable system, through the computer control of an air compressor. (SL)

  15. GIS-based NEXRAD Stage III precipitation database: automated approaches for data processing and visualization

    Science.gov (United States)

    Xie, Hongjie; Zhou, Xiaobing; Vivoni, Enrique R.; Hendrickx, Jan M. H.; Small, Eric E.

    2005-02-01

    This study develops a geographical information system (GIS) approach for automated processing of the Next Generation Weather Radar (NEXRAD) Stage III precipitation data. The automated processing system, implemented by using commercial GIS and a number of Perl scripts and C/C++ programs, allows for rapid data display, requires less storage capacity, and provides the analytical and data visualization tools inherent in GIS as compared to traditional methods. In this paper, we illustrate the development of automatic techniques to preprocess raw NEXRAD Stage III data, transform the data to a GIS format, select regions of interest, and retrieve statistical rainfall analysis over user-defined spatial and temporal scales. Computational expense is reduced significantly using the GIS-based automated techniques. For example, 1-year Stage III data processing (˜9000 files) for the West Gulf River Forecast Center takes about 3 days of computation time instead of months of manual work. To illustrate the radar precipitation database and its visualization capabilities, we present three application examples: (1) GIS-based data visualization and integration, and ArcIMS-based web visualization and publication system, (2) a spatial-temporal analysis of monsoon rainfall patterns over the Rio Grande River Basin, and (3) the potential of GIS-based radar data for distributed watershed models. We conclude by discussing the potential applications of automated techniques for radar rainfall processing and its integration with GIS-based hydrologic information systems.

  16. 3-D image pre-processing algorithms for improved automated tracing of neuronal arbors.

    Science.gov (United States)

    Narayanaswamy, Arunachalam; Wang, Yu; Roysam, Badrinath

    2011-09-01

    The accuracy and reliability of automated neurite tracing systems is ultimately limited by image quality as reflected in the signal-to-noise ratio, contrast, and image variability. This paper describes a novel combination of image processing methods that operate on images of neurites captured by confocal and widefield microscopy, and produce synthetic images that are better suited to automated tracing. The algorithms are based on the curvelet transform (for denoising curvilinear structures and local orientation estimation), perceptual grouping by scalar voting (for elimination of non-tubular structures and improvement of neurite continuity while preserving branch points), adaptive focus detection, and depth estimation (for handling widefield images without deconvolution). The proposed methods are fast, and capable of handling large images. Their ability to handle images of unlimited size derives from automated tiling of large images along the lateral dimension, and processing of 3-D images one optical slice at a time. Their speed derives in part from the fact that the core computations are formulated in terms of the Fast Fourier Transform (FFT), and in part from parallel computation on multi-core computers. The methods are simple to apply to new images since they require very few adjustable parameters, all of which are intuitive. Examples of pre-processing DIADEM Challenge images are used to illustrate improved automated tracing resulting from our pre-processing methods.

  17. Process methods and levels of automation of wood pallet repair in the United States

    Science.gov (United States)

    Jonghun Park; Laszlo Horvath; Robert J. Bush

    2016-01-01

This study documented the current status of wood pallet repair in the United States by identifying the types of processing and equipment usage in repair operations from an automation perspective. The wood pallet repair firms included in the study received an average of approximately 1.28 million cores (i.e., used pallets) for recovery in 2012. A majority of the cores...

  18. Improved monitoring of batch processes by incorporating external information

    NARCIS (Netherlands)

    Ramaker, H. J.; van Sprang, E. N. M.; Gurden, S. P.; Westerhuis, J. A.; Smilde, A. K.

    2002-01-01

    In this paper an overview is given of statistical process monitoring with the emphasis on batch processes and the possible steps to take for improving this by incorporating external information. First, the general concept of statistical process monitoring of batches is explained. This concept has

  19. Emergency healthcare process automation using mobile computing and cloud services.

    Science.gov (United States)

    Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G

    2012-10-01

    Emergency care is basically concerned with the provision of pre-hospital and in-hospital medical and/or paramedical services and it typically involves a wide variety of interdependent and distributed activities that can be interconnected to form emergency care processes within and between Emergency Medical Service (EMS) agencies and hospitals. Hence, in developing an information system for emergency care processes, it is essential to support individual process activities and to satisfy collaboration and coordination needs by providing readily access to patient and operational information regardless of location and time. Filling this information gap by enabling the provision of the right information, to the right people, at the right time fosters new challenges, including the specification of a common information format, the interoperability among heterogeneous institutional information systems or the development of new, ubiquitous trans-institutional systems. This paper is concerned with the development of an integrated computer support to emergency care processes by evolving and cross-linking institutional healthcare systems. To this end, an integrated EMS cloud-based architecture has been developed that allows authorized users to access emergency case information in standardized document form, as proposed by the Integrating the Healthcare Enterprise (IHE) profile, uses the Organization for the Advancement of Structured Information Standards (OASIS) standard Emergency Data Exchange Language (EDXL) Hospital Availability Exchange (HAVE) for exchanging operational data with hospitals and incorporates an intelligent module that supports triaging and selecting the most appropriate ambulances and hospitals for each case.

  20. Transportation informatics : advanced image processing techniques automated pavement distress evaluation.

    Science.gov (United States)

    2010-01-01

The current project, funded by MIOH-UTC for the period 1/1/2009-4/30/2010, is concerned with the development of the framework for a transportation facility inspection system using advanced image processing techniques. The focus of this study is ...

  1. Lyophilization: a useful approach to the automation of analytical processes?

    OpenAIRE

    de Castro, M. D. Luque; Izquierdo, A.

    1990-01-01

    An overview of the state-of-the-art in the use of lyophilization for the pretreatment of samples and standards prior to their storage and/or preconcentration is presented. The different analytical applications of this process are dealt with according to the type of material (reagent, standard, samples) and matrix involved.

  2. Automation and efficiency in the operational processes: a case study in a logistics operator

    Directory of Open Access Journals (Sweden)

    Dener Gomes do Nascimento

    2017-07-01

Full Text Available Globalization has made automation increasingly feasible, and with technological development many operations can be optimized, bringing productivity gains. Logistics benefits greatly from this development because it operates in an extremely competitive environment in which efficiency is a requirement for staying in the market. In this context, this article analyzes the processes in a distribution center to identify opportunities to automate operations, gain productivity, and offer better working conditions for employees.

  3. Automation of the process of generation of the students insurance, applying RFID and GPRS technologies

    Directory of Open Access Journals (Sweden)

    Nelson Barrera-Lombana

    2013-07-01

Full Text Available This article describes the design and implementation of a system that performs queries of various parameters against a web server using a GSM modem, information exchange systems over the Internet (ISS), and radio-frequency identification (RFID). The application is validated for use in automating the process of generating the student insurance, and the hardware and software, developed by the Research Group in Robotics and Industrial Automation (GIRA) of UPTC, are used as a platform.

  4. Automated monitoring of behavior reveals bursty interaction patterns and rapid spreading dynamics in honeybee social networks.

    Science.gov (United States)

    Gernat, Tim; Rao, Vikyath D; Middendorf, Martin; Dankowicz, Harry; Goldenfeld, Nigel; Robinson, Gene E

    2018-02-13

    Social networks mediate the spread of information and disease. The dynamics of spreading depends, among other factors, on the distribution of times between successive contacts in the network. Heavy-tailed (bursty) time distributions are characteristic of human communication networks, including face-to-face contacts and electronic communication via mobile phone calls, email, and internet communities. Burstiness has been cited as a possible cause for slow spreading in these networks relative to a randomized reference network. However, it is not known whether burstiness is an epiphenomenon of human-specific patterns of communication. Moreover, theory predicts that fast, bursty communication networks should also exist. Here, we present a high-throughput technology for automated monitoring of social interactions of individual honeybees and the analysis of a rich and detailed dataset consisting of more than 1.2 million interactions in five honeybee colonies. We find that bees, like humans, also interact in bursts but that spreading is significantly faster than in a randomized reference network and remains so even after an experimental demographic perturbation. Thus, while burstiness may be an intrinsic property of social interactions, it does not always inhibit spreading in real-world communication networks. We anticipate that these results will inform future models of large-scale social organization and information and disease transmission, and may impact health management of threatened honeybee populations. Copyright © 2018 the Author(s). Published by PNAS.
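
    The burstiness the authors quantify is conventionally summarized by Goh and Barabási's coefficient on the inter-event time distribution, sketched below; the paper's analysis of spreading goes well beyond this single statistic.

```python
import numpy as np

def burstiness(event_times):
    """Burstiness coefficient B = (sigma - mu) / (sigma + mu).

    Computed on inter-event times: B approaches 1 for very bursty
    sequences, 0 for Poisson activity, and -1 for perfectly regular
    activity. event_times: timestamps of successive interactions,
    e.g. of one bee or one pair of bees.
    """
    gaps = np.diff(np.sort(np.asarray(event_times, dtype=float)))
    mu, sigma = gaps.mean(), gaps.std()
    return (sigma - mu) / (sigma + mu)
```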

  5. ConfocalCheck--a software tool for the automated monitoring of confocal microscope performance.

    Directory of Open Access Journals (Sweden)

    Keng Imm Hng

    Full Text Available Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system's performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments.

  6. Automated remote cameras for monitoring alluvial sandbars on the Colorado River in Grand Canyon, Arizona

    Science.gov (United States)

    Grams, Paul E.; Tusso, Robert B.; Buscombe, Daniel

    2018-02-27

    Automated camera systems deployed at 43 remote locations along the Colorado River corridor in Grand Canyon National Park, Arizona, are used to document sandbar erosion and deposition associated with the operations of Glen Canyon Dam. The camera systems, which can operate independently for a year or more, consist of a digital camera triggered by a separate data controller, both of which are powered by an external battery and solar panel. Analysis of the images for categorical changes in sandbar size shows deposition at 50 percent or more of monitoring sites during controlled flood releases conducted in 2012, 2013, 2014, and 2016. The images also depict sandbar erosion and show that erosion rates were highest in the first 3 months following each controlled flood; annual erosion was greatest in 2015, the year of highest annual dam release volume. The categorical estimates of sandbar change agree with sandbar change (erosion or deposition) measured by topographic surveys in 76 percent of the cases evaluated. A semiautomated method for quantifying changes in sandbar area from the remote-camera images, by rectifying the oblique images and segmenting the sandbar from the rest of the image, is presented. Sandbar area calculated by this method agrees with sandbar area determined by topographic survey to within approximately 8 percent and allows quantification of sandbar area monthly (or more frequently).
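
    The record outlines a two-step method: rectify the oblique image, then segment the sandbar. A minimal OpenCV sketch of that idea follows; the file name, ground-control-point coordinates, intensity threshold, and 10 px/m grid are hypothetical stand-ins, not the values of the USGS workflow.

        import cv2
        import numpy as np

        img = cv2.imread("sandbar_oblique.jpg", cv2.IMREAD_GRAYSCALE)

        # Pixel and world (metre) coordinates of four surveyed control points.
        px = np.float32([[410, 620], [1480, 600], [1700, 980], [300, 1010]])
        world = np.float32([[0, 0], [80, 0], [80, 45], [0, 45]])

        # Map world metres onto a 10 px/m grid and warp the oblique view onto it.
        scale = 10.0
        H, _ = cv2.findHomography(px, world * scale)
        rect = cv2.warpPerspective(img, H, (int(80 * scale), int(45 * scale)))

        # Segment bright sand from darker water/vegetation; convert pixels to m^2.
        _, sand = cv2.threshold(rect, 170, 255, cv2.THRESH_BINARY)
        area_m2 = (sand > 0).sum() * (1.0 / scale) ** 2
        print(f"sandbar area ~ {area_m2:.1f} m^2")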

  7. Current status of process monitoring for IAEA safeguards

    International Nuclear Information System (INIS)

    Koroyasu, M.

    1987-06-01

    Based on a literature survey, this report attempts to answer some of the following questions on process monitoring for safeguards purposes in future large-scale reprocessing plants: what is process monitoring, what are its basic elements, what kinds of process monitoring are there, what are its basic problems, what is the relationship between process monitoring and near-real-time materials accountancy, what are the actual results of process monitoring tests, and what should be studied in the future. The annexes give a brief description of the advanced safeguards approaches proposed by the four states (France, U.K., Japan and U.S.A.); the approach proposed by the U.S.A. and its description of process monitoring; the main part of a report published as a result of one of the U.S. Support Programmes for IAEA Safeguards; and an article on process monitoring presented at an IAEA symposium held in November 1986. 24 refs, 20 figs, tabs

  8. UNICOS CPC6: Automated Code Generation for Process Control Applications

    OpenAIRE

    Fernandez Adiego, B; Blanco Vinuela, E; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform oriented plugins ...

  9. Monitoring, accounting and automated decision support for the ALICE experiment based on the MonALISA framework

    CERN Document Server

    Cirstoiu, C; Betev, L; Saiz, P; Peters, A J; Muraru, A; Voicu, R; Legrand, I

    2007-01-01

    We are developing a general purpose monitoring system for the ALICE experiment, based on the MonALISA framework. MonALISA (Monitoring Agents using a Large Integrated Services Architecture) is a fully distributed system with no single point of failure that is able to collect and store monitoring information and present it as meaningful perspectives and synthetic views of the status and trends of the entire system. Furthermore, agents can use it to take automated operational decisions. Monitoring information is gathered locally from all the components running at each site. The entire flow of information is aggregated at site level by a MonALISA service and then collected and presented in various forms by a central MonALISA Repository. Based on this information, other services take operational decisions such as alerts, triggers, service restarts and automatic production job or transfer submissions. The system monitors all the components: computer clusters (all major parameters of each computing node), jobs ...

  10. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    Science.gov (United States)

    2008-01-01

    The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionality of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS simulator, calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The calibration GSE (Radiometer Active Test Source) provides a choice of multiple targets for external calibration of the radiometer. The power supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies them to the central archiver PC. The archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLAB programs to parse, process and save all data to a MySQL database. Analysis tools (MATLAB programs) in the ISDS are capable of displaying radiometer science, telemetry and auxiliary data in near real time, as well as performing data analysis and producing automated performance assessment reports of the Aquarius
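
    As a concrete illustration of the archive-and-upload step described above, here is a hedged Python sketch that mirrors new GSE text files to a local archive and an external FTP server; the directory paths, host name, and credentials are placeholders rather than actual mission configuration.

        import ftplib
        import pathlib
        import shutil

        GSE_DIR = pathlib.Path("gse_output")     # where EGSE systems drop files
        ARCHIVE = pathlib.Path("archive")        # local archive copy

        def archive_and_upload(host, user, password):
            ARCHIVE.mkdir(exist_ok=True)
            new_files = [p for p in GSE_DIR.glob("*.txt")
                         if not (ARCHIVE / p.name).exists()]
            for p in new_files:
                shutil.copy2(p, ARCHIVE / p.name)          # archive first
            with ftplib.FTP(host, user, password) as ftp:  # then mirror offsite
                for p in new_files:
                    with open(p, "rb") as fh:
                        ftp.storbinary(f"STOR {p.name}", fh)

        archive_and_upload("ftp.example.org", "user", "secret")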

  11. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules: Final Subcontract Report, April 1998 - April 2002

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Miller, D. C.; Moore, S. B.; Hogan, S. J.

    2002-11-01

    This report describes the automated systems developed for PV module assembly and testing processes after lamination. These processes are applicable to a broad range of module types, including those made with wafer-based and thin-film solar cells. Survey data and input from module manufacturers gathered during site visits were used to define system capabilities and process specifications. Spire completed mechanical, electrical, and software engineering for four automation systems: a module edge trimming system, the SPI-TRIM 350; an edge sealing and framing system, the SPI-FRAMER 350; an integrated module testing system, the SPI-MODULE QA 350; and a module buffer storage system, the SPI-BUFFER 350. A fifth system for junction-box installation, the SPI-BOXER 350, was nearly completed during the program. A new-size solar simulator, the SPI-SUN SIMULATOR 350i, was designed as part of the SPI-MODULE QA 350. This simulator occupies minimal production floor space, and its test area is large enough to handle most production modules. The automated systems developed in this program are designed for integration to create automated production lines.

  12. Automation of the Process to Obtain UF4 Powders

    International Nuclear Information System (INIS)

    Fenocchio, A.D

    2001-01-01

    This work presents a preliminary analysis of the control system to be implemented in the UF4 powder production plant. The work was carried out in the electronics laboratory and involved configuring the devices (PLC, temperature controllers, etc.) and setting up communications using the appropriate protocol. A study of the control logic for the first stage of the UF6 conversion process, evaporation, is also presented; this study is used to define the methodology to follow in a future PLC program

  13. A Critical Review of Automated Photogrammetric Processing of Large Datasets

    Science.gov (United States)

    Remondino, F.; Nocerino, E.; Toschi, I.; Menna, F.

    2017-08-01

    The paper reports some comparisons between commercial software packages able to automatically process image datasets for 3D reconstruction purposes. The main aspects investigated in the work are the capability to correctly orient large sets of images of complex environments, the metric quality of the results, replicability and redundancy. Different datasets are employed, each one featuring a different number of images, GSDs at cm and mm resolutions, and ground truth information to perform statistical analyses of the 3D results. A summary of photogrammetric terms is also provided, in order to establish rigorous terms of reference for comparisons and critical analyses.

  14. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    Science.gov (United States)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets be assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), and visualization (ViSiT, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected with sensors and analytical tools from multiple manufacturers, which produce many different kinds of measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software that automates ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user model in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical

  15. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with non-uniform rational basis splines (NURBS) and multiple levels of detail (mixed and reverse LoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of the modern structure for the courtyard of West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  16. Automated solar cell assembly teamed process research. Semiannual subcontract report, December 6, 1993--June 30, 1994

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. [Spire Corp., Bedford, MA (United States)]

    1995-01-01

    This is the second Semiannual Technical Progress Report for the program titled 'Automated Solar Cell Assembly Teamed Process Research' funded under National Renewable Energy Laboratory (NREL) subcontract No. ZAG-3-11219-01. This report describes the work done on Phase II of the program in the period from December 6, 1993 to June 30, 1994. Spire's objective in this program is to develop high throughput (5 MW/yr) automated processes for interconnecting thin (200 μm) silicon solar cells. High yield will be achieved with these fragile cells through the development of low mechanical stress and low thermal stress processes. For example, a machine vision system is being developed for cell alignment without mechanically contacting the cell edges, while a new soldering process is being developed to solder metal interconnect ribbons simultaneously to a cell's front and back contacts, eliminating one of the two heating steps normally used for soldering each cell.

  17. A methodology to determine the level of automation to improve the production process and reduce the ergonomics index

    Science.gov (United States)

    Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo

    2017-08-01

    Companies are constantly looking for improvements in productivity to increase their competitiveness. The use of automation technologies is a tool that has proven effective for achieving this. Some companies are not familiar with the process of acquiring automation technologies and therefore abstain from investing, missing the opportunity to take advantage of them. The present document proposes a methodology to determine the level of automation appropriate for the production process, thus minimizing automation while improving production and taking the ergonomics factor into consideration.

  18. Automated system of monitoring and positioning of functional units of mining technological machines for coal-mining enterprises

    Directory of Open Access Journals (Sweden)

    Meshcheryakov Yaroslav

    2018-01-01

    Full Text Available This article is devoted to the development of an automated monitoring and positioning system for the functional units of mining technological machines. It describes the system's structure and component base; algorithms for identifying the operating states of a walking excavator; the various types of errors in the functioning of microelectromechanical gyroscopes and accelerometers; and methods for their correction based on the Madgwick fusion filter. The results of industrial tests of the automated monitoring and positioning system for functional units on one of the opencast coal mines of Kuzbass are presented. This work is addressed to specialists working in the fields of embedded systems and control systems development, radio electronics, mechatronics, and robotics.
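
    The record names the Madgwick fusion filter for correcting gyroscope drift with accelerometer data. The sketch below is not Madgwick's algorithm (which performs a gradient-descent quaternion update) but a simpler one-axis complementary filter illustrating the same correction principle; the sample data, time step, and 0.98 blending gain are illustrative.

        import math

        def complementary_filter(samples, dt=0.01, alpha=0.98, pitch=0.0):
            """samples: iterable of (gyro_rate_rad_s, accel_x_g, accel_z_g)."""
            for gyro_rate, ax, az in samples:
                gyro_pitch = pitch + gyro_rate * dt    # gyro integration: drifts
                accel_pitch = math.atan2(ax, az)       # gravity reference: noisy,
                                                       # but drift-free
                pitch = alpha * gyro_pitch + (1 - alpha) * accel_pitch
            return pitch

        # 500 identical IMU samples; the estimate settles near the accel angle.
        print(complementary_filter([(0.02, 0.05, 0.99)] * 500))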

  19. Laser materials processing of complex components. From reverse engineering via automated beam path generation to short process development cycles.

    Science.gov (United States)

    Görgl, R.; Brandstätter, E.

    2016-03-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser welding, laser cladding and additive laser manufacturing are given.

  20. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives

    DEFF Research Database (Denmark)

    Podevin, Michael Paul Ambrose; Fotidis, Ioannis; Angelidaki, Irini

    2017-01-01

    Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT), with high selectivity (the capability of monitoring several analytes simultaneously), in the interest of improving product quality, productivity, and process automation.
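
    Multivariate calibration is the standard chemometric route from spectra to several analyte concentrations at once. A minimal sketch using scikit-learn's partial least squares regression follows; the spectra and concentration matrices are synthetic placeholders, not data from the review.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        spectra = rng.normal(size=(60, 200))   # 60 spectra x 200 wavelengths
        conc = rng.normal(size=(60, 3))        # e.g. lipid, protein, carbohydrate

        # Fit a 5-component PLS model mapping spectra to all analytes jointly.
        pls = PLSRegression(n_components=5).fit(spectra, conc)
        print(pls.predict(spectra[:1]))        # three concentrations at once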

  1. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system for the fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120,000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system automatically isolates each individual image and computes its area and weighted area. These results are directly displayed on the command panel and can be transferred to a minicomputer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping [fr]

  2. An image-processing program for automated counting

    Science.gov (United States)

    Cunningham, D.J.; Anderson, W.H.; Anthony, R.M.

    1996-01-01

    An image-processing program developed by the National Institutes of Health, IMAGE, was modified in a cooperative project between remote sensing specialists at the Ohio State University Center for Mapping and scientists at the Alaska Science Center to facilitate estimating the number of black brant (Branta bernicla nigricans) in flocks at Izembek National Wildlife Refuge. The modified program, DUCK HUNT, runs on Apple computers. Modifications provide users with a pull-down menu that optimizes image quality; identifies objects of interest (e.g., brant) by spectral, morphometric, and spatial parameters defined interactively by users; counts and labels objects of interest; and produces summary tables. Images from digitized photography, videography, and high-resolution digital photography have been used with this program to count various species of waterfowl.
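
    The core of such counting is typically thresholding followed by connected-component labelling with a size filter. A generic Python sketch of that step (not DUCK HUNT's actual code) follows; the file name, intensity threshold, and size limits are illustrative.

        import numpy as np
        from PIL import Image
        from scipy import ndimage

        img = np.asarray(Image.open("flock.png").convert("L"), dtype=float)
        mask = img < 80                           # dark birds on bright water
        labels, n = ndimage.label(mask)           # connected components
        sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
        count = int(np.sum((sizes >= 4) & (sizes <= 400)))  # plausible bird sizes
        print(f"{count} birds counted")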

  3. AUTOMATED SYSTEM OF DATA PROCESSING WITH THE IMPLEMENTATION OF RATING TECHNOLOGY OF TEACHING

    Directory of Open Access Journals (Sweden)

    О. И. Дзювина

    2014-01-01

    Full Text Available Rating technology of teaching enables independent and individual work by students and increases their motivation. Purpose: to increase the efficiency of data processing in the implementation of rating technology of teaching. Method: analysis, synthesis, experiment. Results: an automated data processing system was developed for the implementation of rating technology of teaching. Practical implication: education.

  4. Impact of Business Process Automation on Employees’ Efficiency

    OpenAIRE

    Sarfraz Ahmad Sirohey; Ahmed Imran Hunjra; Babar Khalid

    2012-01-01

    Business Process Automation (BPA) is assumed to enhance organizational efficiency by decreasing the level of effort and eliminating redundant processes and procedures. The study in hand analyzes the impact of BPA on the efficiency of AGPR employees. Five variables were examined, namely New System Understanding, Adaptation to New Methodology, Response of Employees to Change, Conformity to Standards, and Employees' Efficiency. A questionnaire comprising 29 items was adapted for primary data collec...

  5. Quantitative and Qualitative Analysis of Aconitum Alkaloids in Raw and Processed Chuanwu and Caowu by HPLC in Combination with Automated Analytical System and ESI/MS/MS

    Directory of Open Access Journals (Sweden)

    Aimin Sun

    2012-01-01

    Full Text Available HPLC in combination with an automated analytical system and ESI/MS/MS was used to analyze aconitine (A), mesaconitine (MA), hypaconitine (HA), and their benzoyl analogs in the Chinese herbs Caowu and Chuanwu. First, an HPLC method was developed and validated to determine A, MA, and HA in raw and processed Caowu and Chuanwu. Then an automated analytical system and ESI/MS/MS were applied to analyze these alkaloids and their semihydrolyzed products. The results obtained from the automated analytical system are identical to those from ESI/MS/MS, which indicates that the method is a convenient and rapid tool for the qualitative analysis of herbal preparations. Furthermore, HA was little hydrolyzed by heating processes and thus might account more for the toxicity of processed aconites. Hence, HA could be used as an indicator when one alkaloid is required as a reference to monitor the quality of raw and processed Chuanwu and Caowu. In addition, raw and processed Chuanwu and Caowu can be distinguished by monitoring the ratio of A and MA to HA.

  6. Monitoring Industrial Food Processes Using Spectroscopy & Chemometrics

    DEFF Research Database (Denmark)

    Pedersen, Dorthe Kjær; Engelsen, Søren Balling

    2001-01-01

    In the last decade rapid spectroscopic measurements have revolutionized quality control in practically all areas of primary food and feed production. Near-infrared spectroscopy (NIR & NIT) has been implemented for monitoring the quality of millions of samples of cereals, milk and meat with unprec...

  7. Automated Coronal Loop Identification Using Digital Image Processing Techniques

    Science.gov (United States)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.

    2003-01-01

    The results of a master's thesis project on computer algorithms for automatic identification of optically thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images will be presented. These center splines are proxies of associated magnetic field lines. The project addresses pattern recognition problems in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parametric space via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity, as possible candidates for digital loop detection schemes. We have developed synthesized images of coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information for the identification process. Results from both synthesized and solar images will be presented.
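
    Of the three candidate techniques, the Hough transform is the most readily sketched. Below is a minimal example using scikit-image's probabilistic variant on a synthetic binary image; the returned line segments are a first approximation to thin curvilinear features, and all parameter values are illustrative.

        import numpy as np
        from skimage.transform import probabilistic_hough_line

        img = np.zeros((200, 200), dtype=bool)
        rr = np.arange(40, 160)
        img[rr, rr] = True                  # one synthetic bright linear feature

        # Detect line segments supported by at least `threshold` votes.
        segments = probabilistic_hough_line(img, threshold=10,
                                            line_length=30, line_gap=3)
        print(segments[:3])                 # [((x0, y0), (x1, y1)), ...]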

  8. Automated vehicle counting using image processing and machine learning

    Science.gov (United States)

    Meany, Sean; Eskew, Edward; Martinez-Castro, Rosana; Jang, Shinae

    2017-04-01

    Vehicle counting is used by the government to improve roadways and the flow of traffic, and by private businesses for purposes such as determining the value of locating a new store in an area. A vehicle count can be performed manually or automatically. Manual counting requires an individual to be on-site and tally the traffic electronically or by hand; however, this can lead to miscounts due to factors such as human error. A common form of automatic counting involves pneumatic tubes, but pneumatic tubes disrupt traffic during installation and removal, and can be damaged by passing vehicles. Vehicle counting can also be performed via the use of a camera at the count site recording video of the traffic, with counting being performed manually post-recording or using automatic algorithms. This paper presents a low-cost procedure to perform automatic vehicle counting using remote video cameras with an automatic counting algorithm. The procedure utilizes a Raspberry Pi micro-computer to detect when a car is in a lane and generate an accurate count of vehicle movements. The method utilized in this paper uses background subtraction to process the images and a machine learning algorithm to provide the count. This method avoids the fatigue issues that are encountered in manual video counting and prevents the disruption of roadways that occurs when installing pneumatic tubes.
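
    A hedged sketch of the background-subtraction stage follows: OpenCV's MOG2 subtractor flags moving pixels, and a vehicle is counted on the rising edge of lane occupancy. The video path, lane coordinates, and occupancy thresholds are placeholders, and the paper's machine learning classifier is not reproduced here.

        import cv2

        cap = cv2.VideoCapture("traffic.mp4")
        subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
        lane = (slice(300, 360), slice(200, 280))  # rows, cols of one lane zone
        count, occupied = 0, False

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            fg = subtractor.apply(frame)           # 255 where motion is detected
            frac = (fg[lane] > 0).mean()
            if frac > 0.3 and not occupied:        # rising edge = new vehicle
                count += 1
                occupied = True
            elif frac < 0.1:                       # lane cleared
                occupied = False

        print(f"{count} vehicles")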

  9. Automated processing of massive audio/video content using FFmpeg

    Directory of Open Access Journals (Sweden)

    Kia Siang Hock

    2014-01-01

    Full Text Available Audio and video content forms an integral, important and expanding part of the digital collections in libraries and archives world-wide. While these memory institutions are familiar and well-versed in the management of more conventional materials such as books, periodicals, ephemera and images, the handling of audio (e.g., oral history recordings) and video content (e.g., audio-visual recordings, broadcast content) requires additional toolkits. In particular, a robust and comprehensive tool that provides a programmable interface is indispensable when dealing with tens of thousands of hours of audio and video content. FFmpeg is a comprehensive and well-established open source software package that is capable of the full range of audio/video processing tasks (such as encode, decode, transcode, mux, demux, stream and filter). It is also capable of handling a wide range of audio and video formats, a unique challenge in memory institutions. It comes with a command line interface, as well as a set of developer libraries that can be incorporated into applications.
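
    Because FFmpeg exposes its full capability on the command line, batch processing reduces to scripting that interface. A minimal Python sketch of a transcoding pass over a directory of masters follows; the directory names are placeholders, while the flags used (-i, -c:v libx264, -crf, -c:a aac) are standard FFmpeg options.

        import pathlib
        import subprocess

        SRC = pathlib.Path("masters")    # archival originals
        DST = pathlib.Path("access")     # web-friendly derivatives
        DST.mkdir(exist_ok=True)

        for src in SRC.glob("*.mov"):
            out = DST / (src.stem + ".mp4")
            subprocess.run(
                ["ffmpeg", "-y", "-i", str(src),
                 "-c:v", "libx264", "-crf", "23",  # H.264 video, default quality
                 "-c:a", "aac", str(out)],         # AAC audio
                check=True,
            )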

  10. UNICOS CPC6: automated code generation for process control applications

    International Nuclear Information System (INIS)

    Fernandez Adiego, B.; Blanco Vinuela, E.; Prieto Barreiro, I.

    2012-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS). As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software; it dynamically discovers and calls the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA generator (based on PVSS) and the CPC wizard, a dedicated plug-in created to provide the user with a friendly GUI (Graphical User Interface). A tool called UAB Bootstrap will manage the different UAB components, like CPC, and their dependencies with the resource packages. This tool guides the control system developer during the installation, update and execution of the UAB components. (authors)
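
    UNICOS-CPC itself is a plug-in based software factory; the toy Python sketch below only illustrates the underlying idea of an instance generator that expands a device-type library plus instance specifications into PLC source text. The device type, template text, and instance fields are invented for the example.

        from string import Template

        # A one-entry "device type library" of source-text templates.
        DEVICE_TEMPLATES = {
            "OnOffValve": Template(
                "VALVE_$name : CPC_OnOffValve;  (* interlock: $interlock *)"),
        }

        # Instance specifications, as a generator would read from a spec file.
        instances = [
            {"type": "OnOffValve", "name": "V101", "interlock": "PT101 > 2.5 bar"},
            {"type": "OnOffValve", "name": "V102", "interlock": "none"},
        ]

        for spec in instances:
            print(DEVICE_TEMPLATES[spec["type"]].substitute(spec))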

  11. UNICOS CPC6: Automated Code Generation for Process Control Applications

    CERN Document Server

    Fernandez Adiego, B; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform oriented plugins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both, the library of device types and the generated file syntax, are defined. The UAB core is the generic part of this software, it discovers and calls dynamically the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both, Siemens and Schneider PLCs, the SCADA g...

  12. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing.

    Science.gov (United States)

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-02-15

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economic and enhanced automated optical guidance system, based on optimization research of a light-emitting diode (LED) light target and five automated image-processing bore-path deviation algorithms. The LED light target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, direction location algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system was applied in a hot-water pipeline installation project, with accuracy controlled within 2 mm over a 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system.

  13. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meier, David E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coble, Jamie B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jordan, David V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mcdonald, Luther W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Forrester, Joel B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schwantes, Jon M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Unlu, Kenan [Pennsylvania State Univ., University Park, PA (United States); Landsberger, Sheldon [Univ. of Texas, Austin, TX (United States); Bender, Sarah [Pennsylvania State Univ., University Park, PA (United States); Dayman, Kenneth J. [Univ. of Texas, Austin, TX (United States); Reilly, Dallas D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-01

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
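
    The pattern-recognition step pairs multivariate analysis with gamma spectra: a model of "normal" spectra flags departures from it. A hedged sketch using PCA reconstruction error as the control statistic follows; the spectra are synthetic Poisson counts, and the actual MIP Monitor models and limits are not described in this record.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        normal = rng.poisson(lam=100.0, size=(200, 1024)).astype(float)  # training

        # Fit "normal" conditions; set the control limit from training residuals.
        pca = PCA(n_components=5).fit(normal)
        resid = normal - pca.inverse_transform(pca.transform(normal))
        limit = np.percentile(np.linalg.norm(resid, axis=1), 99)

        def is_anomalous(spectrum):
            r = spectrum - pca.inverse_transform(pca.transform(spectrum[None, :]))[0]
            return np.linalg.norm(r) > limit

        print(is_anomalous(normal[0]))       # training spectrum: almost surely False
        print(is_anomalous(normal[0] + 50))  # uniformly shifted spectrum: True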

  14. A method for the automated long-term monitoring of three-spined stickleback Gasterosteus aculeatus shoal dynamics.

    Science.gov (United States)

    Kleinhappel, T K; Al-Zoubi, A; Al-Diri, B; Burman, O; Dickinson, P; John, L; Wilkinson, A; Pike, T W

    2014-04-01

    This paper describes and evaluates a flexible, non-invasive tagging system for the automated identification and long-term monitoring of individual three-spined sticklebacks Gasterosteus aculeatus. The system is based on barcoded tags, which can be reliably and robustly detected and decoded to provide information on an individual's identity and location. Because large numbers of fish can be individually tagged, it can be used to monitor individual- and group-level dynamics within fish shoals. © 2014 The Fisheries Society of the British Isles.

  15. Automation of NLO processes and decays and POWHEG matching in WHIZARD

    International Nuclear Information System (INIS)

    Reuter, Jürgen; Chokoufé, Bijan; Weiss, Christian; Hoang, André; Kilian, Wolfgang; Stahlhofen, Maximilian; Teubner, Thomas

    2016-01-01

    We give a status report on the automation of next-to-leading order processes within the Monte Carlo event generator WHIZARD, using GoSam and OpenLoops as providers of one-loop matrix elements. To deal with divergences, WHIZARD uses automated FKS subtraction, and the phase space for singular regions is generated automatically. NLO examples for both scattering and decay processes, with a focus on e+e- processes, are shown. Also, first NLO studies of observables for collisions of polarized lepton beams, e.g. at the ILC, will be presented. Furthermore, the automatic matching of the fixed-order NLO amplitudes with emissions from the parton shower within the POWHEG formalism inside WHIZARD will be discussed. We also present results for top pairs at threshold in lepton collisions, including matching between a resummed threshold calculation and fixed-order NLO. This allows the investigation of more exclusive differential observables. (paper)

  16. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure, to obtain knowledge about the parameters of other response processes that are important but not monitored, when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  17. Automated Hardware and Software System for Monitoring the Earth’s Magnetic Environment

    Directory of Open Access Journals (Sweden)

    Alexei Gvishiani

    2016-12-01

    Full Text Available The continuous growth of geophysical observations requires adequate methods for their processing and analysis. This has become one of the most important and widely discussed issues in the data science community. System analysis methods and data mining techniques can sustain the solution of this problem. This paper presents an innovative holistic hardware/software system (HSS) developed for efficient management and intellectual analysis of geomagnetic data registered by Russian geomagnetic observatories and international satellites. Geomagnetic observatories that comprise the International Real-time Magnetic Observatory Network (INTERMAGNET) produce preliminary (raw) and definitive (corrected) geomagnetic data of the highest quality. The designed system automates and accelerates the routine production of definitive data from the preliminary magnetograms obtained by Russian observatories, owing to implemented algorithms that involve artificial intelligence elements. The HSS is the first system that provides sophisticated automatic detection and multi-criteria classification of extreme geomagnetic conditions, which may be hazardous for technological infrastructure and economic activity in Russia. It enables online access to digital geomagnetic data, its processing results and modelling calculations, along with their visualization on conventional and spherical screens. The concept of the presented system agrees with the accepted 'four Vs' paradigm of Big Data. The HSS can significantly increase the 'velocity' and 'veracity' features of the INTERMAGNET system. It also provides fusion of large sets of ground-based and satellite geomagnetic data, thus facilitating the 'volume' and 'variety' of handled data.

  18. Automated data processing architecture for the Gemini Planet Imager Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Maire, Jérôme; Marchis, Franck; Graham, James R.; Macintosh, Bruce; Ammons, S. Mark; Bailey, Vanessa P.; Barman, Travis S.; Bruzzone, Sebastian; Bulger, Joanna; Cotten, Tara; Doyon, René; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Goodsell, Stephen; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn M.; Larkin, James E.; Marley, Mark S.; Metchev, Stanimir; Nielsen, Eric L.; Oppenheimer, Rebecca; Palmer, David W.; Patience, Jennifer; Poyneer, Lisa A.; Pueyo, Laurent; Rajan, Abhijith; Rantakyrö, Fredrik T.; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane J.

    2018-01-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multiyear direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the Data Cruncher, combines multiple data reduction pipelines (DRPs) together to process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our DRPs. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  19. High-automated system of film data mathematical processing for polarized deuteron experiments

    International Nuclear Information System (INIS)

    Balgansuren, Ya.; Buzdavina, N.A.; Glagolev, V.V.

    1986-01-01

    A specialized software system that substantially reduces the time required for experimental data analysis has been developed in order to provide timely processing of film information in polarized-deuteron experiments. With its help, preliminary data on deuteron polarization were obtained within a few months of experiment start-up, and the complete data processing (15 thousand events) was carried out in less than a year from the time of chamber irradiation. The high rate of data processing was achieved through comprehensive automation of all stages of processing

  20. Developments and automation in purex process control analytical measurement systems (Preprint no. IT-20)

    International Nuclear Information System (INIS)

    Ramanujam, A.

    1991-02-01

    The fuel reprocessing facility based on the purex process depends on efficient process-control analytical measurement systems for its successful operation. The process control laboratory plays a vital role in catering to these requirements. This paper describes the various efforts made to improve its performance capabilities in three major areas of operation, viz. sample handling, analysis and data processing. In developing automation aids and analytical techniques, apart from the special emphasis placed on reducing personnel exposure to radiation and the time required for analysis, due consideration has been given to the operational reliability and safety of the system. (author). 15 refs., 4 tabs., 3 figs

  1. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  2. Processing Approaches for DAS-Enabled Continuous Seismic Monitoring

    Science.gov (United States)

    Dou, S.; Wood, T.; Freifeld, B. M.; Robertson, M.; McDonald, S.; Pevzner, R.; Lindsey, N.; Gelvin, A.; Saari, S.; Morales, A.; Ekblaw, I.; Wagner, A. M.; Ulrich, C.; Daley, T. M.; Ajo Franklin, J. B.

    2017-12-01

    Distributed Acoustic Sensing (DAS) is creating a "field as laboratory" capability for seismic monitoring of subsurface changes. By providing unprecedented spatial and temporal sampling at a relatively low cost, DAS enables field-scale seismic monitoring to have durations and temporal resolutions that are comparable to those of laboratory experiments. Here we report on seismic processing approaches developed during data analyses of three case studies, all using DAS-enabled seismic monitoring with applications ranging from shallow permafrost to deep reservoirs: (1) 10-hour downhole monitoring of cement curing at Otway, Australia; (2) 2-month surface monitoring of controlled permafrost thaw at Fairbanks, Alaska; (3) multi-month downhole and surface monitoring of carbon sequestration at Decatur, Illinois. We emphasize the data management and processing components relevant to DAS-based seismic monitoring, which include scalable approaches to data management, pre-processing, denoising, filtering, and wavefield decomposition. DAS has dramatically increased the data volume to the extent that terabyte-per-day data loads are now typical, straining conventional approaches to data storage and processing. To achieve more efficient use of disk space and network bandwidth, we explore improved file structures and data compression schemes. Because the noise floor of DAS measurements is higher than that of conventional sensors, optimal processing workflows involving advanced denoising, deconvolution (of the source signatures), and stacking approaches are being established to maximize the signal content of DAS data. The resulting workflow of data management and processing could accelerate the broader adoption of DAS for continuous monitoring of critical processes.

  3. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells.

    Directory of Open Access Journals (Sweden)

    Dai Fei Elmer Ker

    Full Text Available Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and
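
    The decisive quantity in this system is the degree of confluency estimated from phase-contrast images. A generic Python sketch of one simple way to approximate confluency (local texture variance thresholding) follows; it is a stand-in for, not a reproduction of, the authors' computer-vision pipeline, and the file name and thresholds are illustrative.

        import numpy as np
        from PIL import Image
        from scipy import ndimage

        img = np.asarray(Image.open("dish.png").convert("L"), dtype=float)

        # Cells produce locally high intensity variance relative to empty dish
        # (generic_filter applies np.var over 9x9 windows; slow but simple).
        local_var = ndimage.generic_filter(img, np.var, size=9)
        confluency = (local_var > 50.0).mean()

        if confluency > 0.80:            # pre-defined passage threshold
            print("notify operator: subculture now")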

  4. An image processing framework for automated analysis of swimming behavior in tadpoles with vestibular alterations

    Science.gov (United States)

    Zarei, Kasra; Fritzsch, Bernd; Buchholz, James H. J.

    2017-03-01

    Microgravity, as experienced during prolonged space flight, presents a problem for space exploration. Animal models, specifically tadpoles with altered connections of the vestibular ear, allow the examination of the effects of microgravity and can be quantitatively monitored through tadpole swimming behavior. We describe an image analysis framework for performing automated quantification of tadpole swimming behavior. Speckle-reducing anisotropic diffusion is used to smooth tadpole image signals by diffusing noise while retaining edges. A narrow-band level set approach is used for sharp tracking of the tadpole body. Level set interface tracking has the inherent advantage of pairing naturally with level-set-based image segmentation (active contouring). Active contour segmentation is followed by two-dimensional skeletonization, which allows the automated quantification of tadpole deflection angles, and subsequently tadpole escape (or C-start) response times. The image analysis methodology was evaluated by comparing the automated quantifications of deflection angles to manual assessments (obtained using a standard grading scheme), and produced a high correlation (r2 = 0.99), indicating high reliability and accuracy of the proposed method. The methods presented form an important element of objective quantification of the escape response of the tadpole vestibular system to mechanical and biochemical manipulations, and can ultimately contribute to a better understanding of the effects of altered gravity perception on humans.
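
    The final quantification step (skeletonization, then angle measurement) is easy to sketch. The hedged example below skeletonizes a synthetic binary body mask with scikit-image and estimates the orientation of the midline from its endpoints; a real deflection angle would compare body segments over time.

        import numpy as np
        from skimage.morphology import skeletonize

        mask = np.zeros((100, 100), dtype=bool)
        mask[20:80, 45:55] = True                 # crude synthetic body segment

        skel = skeletonize(mask)
        rows, cols = np.nonzero(skel)             # skeleton pixels, row-major order
        dy, dx = rows[-1] - rows[0], cols[-1] - cols[0]
        angle = np.degrees(np.arctan2(dy, dx))    # midline orientation vs. x-axis
        print(f"midline orientation: {angle:.1f} deg")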

  5. Computer-based diagnostic monitoring to enhance the human-machine interface of complex processes

    International Nuclear Information System (INIS)

    Kim, I.S.

    1992-02-01

    There is a growing interest in introducing an automated, on-line, diagnostic monitoring function into the human-machine interfaces (HMIs) or control rooms of complex process plants. The design of such a system should be properly integrated with other HMI systems in the control room, such as the alarms system or the Safety Parameter Display System (SPDS). This paper provides a conceptual foundation for the development of a Plant-wide Diagnostic Monitoring System (PDMS), along with functional requirements for the system and other advanced HMI systems. Insights are presented into the design of an efficient and robust PDMS, which were gained from a critical review of various methodologies developed in the nuclear power industry, the chemical process industry, and the space technological community

  6. Processing of the WLCG monitoring data using NoSQL

    Science.gov (United States)

    Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  7. Processing of the WLCG monitoring data using NoSQL

    International Nuclear Information System (INIS)

    Andreeva, J; Beche, A; Karavakis, E; Saiz, P; Tuckett, D; Belov, S; Kadochnikov, I; Schovancova, J; Dzhunov, I

    2014-01-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  8. The automated testing system of programs with the graphic user interface within the context of educational process

    OpenAIRE

    Sychev, O.; Kiryushkin, A.

    2009-01-01

    The paper describes the problems of automating the educational process in the course "Programming in a high-level language. Algorithmic languages". The complexities of testing programs with a graphical user interface are noted and existing analogues are considered. Methods for automating the testing of students' assignments are proposed.

  9. An automated method for large-scale monitoring of seed dispersal by ants.

    Science.gov (United States)

    Bologna, Audrey; Toffin, Etienne; Detrain, Claire; Campo, Alexandre

    2017-01-10

    Myrmecochory is the process of seed dispersal by ants; however, it is highly challenging to study, mainly because of the small size of both partners and the comparatively large range of dispersal. The mutualistic interaction between ants and seeds involves the former retrieving diaspores, consuming their elaiosome (a nutrient-rich appendage), and the rejection of seeds from the nest. Here, we introduce a semi-automated method based on stitching high resolution images together, allowing the study of myrmecochory in a controlled environment over time. We validate the effectiveness of our method in detecting and discriminating seeds and ants. We show that the number of retrieved diaspores varies highly among colonies, and is independent of both their size and activity level, even though the dynamics of diaspore collection are correlated with the arrival of ants at the food source. We find that all retrieved seeds are rejected from the nest in a clustered pattern, and, surprisingly, they are also frequently redispersed within the arena afterwards, despite lacking elaiosome. This finding suggests that the dispersal pattern might be more complex and dynamic than expected. Our method unveils new insights on the mechanisms of myrmecochory, and could be usefully adapted to study other dispersal phenomena.

  10. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    Science.gov (United States)

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and

  11. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867-file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact

  12. Spectral imaging applications: Remote sensing, environmental monitoring, medicine, military operations, factory automation and manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Gat, N.; Subramanian, S. [Opto-Knowledge Systems, Inc. (United States); Barhen, J. [Oak Ridge National Lab., TN (United States); Toomarian, N. [Jet Propulsion Lab., Pasadena, CA (United States)

    1996-12-31

    This paper reviews the activities at OKSI related to imaging spectroscopy, presenting current and future applications of the technology. The authors discuss the development of several systems, including hardware, signal processing, data classification algorithms and benchmarking techniques to determine algorithm performance. Signal processing for each application is tailored by incorporating the phenomenology appropriate to that process into the algorithms. Pixel signatures are classified using techniques such as principal component analysis, generalized eigenvalue analysis and novel, very fast neural network methods. The major hyperspectral imaging systems developed at OKSI include the Intelligent Missile Seeker (IMS) demonstration project for real-time target/decoy discrimination, and the Thermal InfraRed Imaging Spectrometer (TIRIS) for detection and tracking of toxic plumes and gases. In addition, systems for applications in medical photodiagnosis, manufacturing technology, and crop monitoring are under development.
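    Of the classification techniques named above, principal component analysis is compact enough to sketch. The following hypothetical example, using numpy only, projects hyperspectral pixel signatures onto their leading principal components as a preprocessing step for a classifier; the cube dimensions and component count are made up.

    ```python
    # Sketch: PCA projection of hyperspectral pixel signatures, using numpy only.
    import numpy as np

    rng = np.random.default_rng(0)
    cube = rng.normal(size=(64, 64, 128))      # hypothetical (rows, cols, bands) cube

    pixels = cube.reshape(-1, cube.shape[-1])  # one spectral signature per row
    pixels -= pixels.mean(axis=0)              # center each band

    # Eigendecomposition of the band covariance matrix.
    cov = np.cov(pixels, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]          # sort components by variance

    k = 10                                     # keep the 10 strongest components
    scores = pixels @ eigvecs[:, order[:k]]    # reduced features for a classifier
    print(scores.shape)                        # (4096, 10)
    ```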

  13. Process development for automated solar-cell and module production. Task 4. Automated array assembly. Quarterly report No. 3

    Energy Technology Data Exchange (ETDEWEB)

    Hagerty, J. J.; Gifford, M.

    1981-04-15

    The Automated Lamination Station is mechanically complete and is currently undergoing final wiring. The high current driver and isolator boards have been completed and installed, and the main interface board is under construction. The automated vacuum chamber has had a minor redesign to increase stiffness and improve the cover open/close mechanism. Design of the Final Assembly Station has been completed and construction is underway.

  14. A realization of an automated data flow for data collecting, processing, storing and retrieving

    International Nuclear Information System (INIS)

    Friedsam, H.; Pushor, R.; Ruland, R.

    1986-11-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XT's. 14 refs., 4 figs

  15. Monitoring of the process of Flash-Butt Welding

    OpenAIRE

    Chvertko, Yevgenia; Shevchenko, Mykola; Pirumov, Andriy

    2013-01-01

    Statistical methods of analysis are currently widely used to develop control and monitoring systems for different welding processes. These methods make it possible to obtain information about the process, including the effect of all factors on its results, which is often difficult to evaluate because of the complexity of the process. The authors apply these methods to develop a system for monitoring the parameters of flash-butt welding in real time. The paper gives brief information about ...

  16. Comprehensive automation and monitoring of MV grids as the key element of improvement of energy supply reliability and continuity

    Directory of Open Access Journals (Sweden)

    Stanisław Kubacki

    2012-03-01

    The paper presents the issue of comprehensive automation and monitoring of medium voltage (MV) grids as a key element of the Smart Grid concept. The existing condition of MV grid control and monitoring is discussed, and the concept of a solution which will provide the possibility of remote automatic grid reconfiguration and ensure full grid observability from the dispatching system level is introduced. Automation of MV grid switching is discussed in detail, showing how a faulty line section can be isolated and electricity supplied to the largest possible number of customers for the duration of the failure. An example of such automation controls' operation is also presented. The paper's second part presents the key role of the quick fault location function and the possibility of remote MV grid reconfiguration for improving power supply reliability (SAIDI and SAIFI indices). It is also shown how increasing the number of points fitted with faulted circuit indicators, with the option of remote switch control from the dispatch system, may reduce SAIDI and SAIFI indices across ENERGA-OPERATOR SA divisions.

  17. Simplified automated image analysis for detection and phenotyping of Mycobacterium tuberculosis on porous supports by monitoring growing microcolonies.

    Directory of Open Access Journals (Sweden)

    Alice L den Hertog

    BACKGROUND: Even with the advent of nucleic acid (NA) amplification technologies, the culture of mycobacteria for diagnostic and other applications remains of critical importance. Notably, microscopic observed drug susceptibility testing (MODS), as opposed to traditional culture on solid media or automated liquid culture, has shown potential to both speed up and increase the provision of mycobacterial culture in high-burden settings. METHODS: Here we explore the growth of Mycobacterium tuberculosis microcolonies, imaged by automated digital microscopy, cultured on a porous aluminium oxide (PAO) support. Repeated imaging during colony growth greatly simplifies "computer vision", and presumptive identification of microcolonies was achieved here using existing publicly available algorithms. Our system thus allows the growth of individual microcolonies to be monitored and, critically, the media to be changed during the growth phase without disrupting the microcolonies. Transfer of identified microcolonies onto selective media allowed us, within 1-2 bacterial generations, to rapidly detect the drug susceptibility of individual microcolonies, eliminating the need for time-consuming subculturing or the inoculation of multiple parallel cultures. SIGNIFICANCE: Monitoring the phenotype of individual microcolonies as they grow has immense potential for research, screening, and ultimately M. tuberculosis diagnostic applications. The method described is particularly appealing with respect to speed and automation.

  18. BiomaSoft: data processing system for monitoring and evaluating food and energy production. Part I

    International Nuclear Information System (INIS)

    Quevedo, J. R.; Suárez, J.

    2015-01-01

    The integrated food and energy production in Cuba demands the processing of diverse and voluminous information to support local, sectoral and national decisions and to influence public policies; this requires automated systems that facilitate the monitoring and evaluation (M&E) of integrated food and energy production in Cuban municipalities. The objective of this research was to identify the tools for the design of the data processing system BiomaSoft and to contextualize its application environment. The software development methodology was RUP (Rational Unified Process), with UML (Unified Modeling Language) as the modeling language and PHP (Hypertext Pre-Processor) as the programming language. The environment was conceptualized through a domain model, and the functional and non-functional requirements to be fulfilled, as well as the Use Case Diagram of the system with its description of actors, were specified. For the deployment of BiomaSoft, a configuration based on two types of physical nodes (a web server and client computers) was conceived in the municipalities that participate in the project «Biomass as renewable energy source for Cuban rural areas» (BIOMAS-CUBA). It is concluded that the monitoring and evaluation of integrated food and energy production under Cuban conditions can be carried out through the automated system BiomaSoft, and that the identification of tools for its design and the contextualization of its application environment contribute to this purpose. (author)

  19. Development strategy and process models for phased automation of design and digital manufacturing electronics

    Science.gov (United States)

    Korshunov, G. I.; Petrushevskaya, A. A.; Lipatnikov, V. A.; Smirnova, M. S.

    2018-03-01

    The strategy of ensuring the quality of electronics is regarded as the most important. To provide quality, the sequence of processes is considered and modeled by a Markov chain. The improvement is distinguished by simple design-for-manufacturing database tools intended for future step-by-step development. Phased automation of design and digital manufacturing of electronics is proposed. MATLAB modeling results showed an increase in effectiveness. New tools and software should be more effective. A primary digital model is proposed to represent the product across the sequence of processes, from individual process steps to the whole life cycle.

  20. Automated control of the laser welding process of heart valve scaffolds

    Directory of Open Access Journals (Sweden)

    Weber Moritz

    2016-09-01

    When using the electrospinning process, the geometry of a heart valve cannot be replicated with just one manufacturing step. To produce heart valve scaffolds, the heart valve leaflets and the vessel have to be produced in separate spinning processes and mated afterwards to obtain the final heart valve. In this work an existing three-axis laser was enhanced to laser-weld those scaffolds. The automation control software is based on the Robot Operating System (ROS). The mechatronic control is handled by an Arduino Mega. A graphical user interface (GUI) is written with Python and Kivy.

  1. Automated Radiology Report Summarization Using an Open-Source Natural Language Processing Pipeline.

    Science.gov (United States)

    Goff, Daniel J; Loehfelm, Thomas W

    2017-10-30

    Diagnostic radiologists are expected to review and assimilate findings from prior studies when constructing their overall assessment of the current study. Radiology information systems facilitate this process by presenting the radiologist with a subset of prior studies that are more likely to be relevant to the current study, usually by comparing anatomic coverage of both the current and prior studies. It is incumbent on the radiologist to review the full text report and/or images from those prior studies, a process that is time-consuming and confers substantial risk of overlooking a relevant prior study or finding. This risk is compounded when patients have dozens or even hundreds of prior imaging studies. Our goal is to assess the feasibility of natural language processing techniques to automatically extract asserted and negated disease entities from free-text radiology reports as a step towards automated report summarization. We compared automatically extracted disease mentions to a gold-standard set of manual annotations for 50 radiology reports from CT abdomen and pelvis examinations. The automated report summarization pipeline found perfect or overlapping partial matches for 86% of the manually annotated disease mentions (sensitivity 0.86, precision 0.66, accuracy 0.59, F1 score 0.74). The performance of the automated pipeline was good, and the overall accuracy was similar to the interobserver agreement between the two manual annotators.
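    The core operation of such a pipeline, extracting disease mentions and deciding whether each is asserted or negated, can be sketched with a simple NegEx-style window rule. The toy lexicon and negation triggers below are illustrative assumptions, not the open-source pipeline the authors used.

    ```python
    # Sketch: NegEx-style assertion/negation of disease mentions in report text.
    # The lexicon and negation triggers are illustrative, not the study's pipeline.
    import re

    DISEASES = ["appendicitis", "diverticulitis", "free fluid", "mass"]
    NEGATION_TRIGGERS = ["no", "without", "negative for", "no evidence of"]

    def extract_mentions(report: str):
        findings = []
        for sentence in re.split(r"[.!?]", report.lower()):
            for disease in DISEASES:
                if disease in sentence:
                    prefix = sentence.split(disease)[0]
                    negated = any(t in prefix for t in NEGATION_TRIGGERS)
                    findings.append((disease, "negated" if negated else "asserted"))
        return findings

    report = "No evidence of appendicitis. There is free fluid in the pelvis."
    print(extract_mentions(report))
    # [('appendicitis', 'negated'), ('free fluid', 'asserted')]
    ```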

  2. Monitoring Car Drivers' Condition Using Image Processing

    Science.gov (United States)

    Adachi, Kazumasa; Yamamto, Nozomi; Yamamoto, Osami; Nakano, Tomoaki; Yamamoto, Shin

    We have developed a car driver monitoring system for measuring drivers' consciousness, with which we aim to reduce car accidents caused by driver drowsiness. The system consists of the following three subsystems: an image capturing system with a pulsed infrared CCD camera; a system for detecting the blinking waveform in the images using a neural network that extracts face and eye regions; and a system for measuring drivers' consciousness by analyzing the waveform with a fuzzy inference technique and other methods. The third subsystem first extracts three factors from the waveform and analyzes them with a statistical method, whereas our previous system used only one factor. Our experiments showed that the three-factor method used here is more effective for measuring drivers' consciousness than the one-factor method described in our previous paper. Moreover, the method is more suitable for fitting the system's parameters to each individual driver.

  3. Neural network monitoring of resistance welding processes

    OpenAIRE

    Quero Reboul, José Manuel; Millán Vázquez de la Torre, Rafael Luis; García Franquelo, Leopoldo; Cañas, J.

    1994-01-01

    Control of weld quality is one of the most important and complex processes to be carried out on production lines. Neural networks have shown good results in fields such as modelling and control of physical processes. It is suggested in this article that a neural classifier should be used to carry out non-destructive on-line analysis. This system has been developed and installed at resistance welding stations. Results confirm the validity of neural networks used for this type of application.

  4. Analysis of the thoracic aorta using a semi-automated post processing tool

    International Nuclear Information System (INIS)

    Entezari, Pegah; Kino, Aya; Honarmand, Amir R.; Galizia, Mauricio S.; Yang, Yan; Collins, Jeremy; Yaghmai, Vahid; Carr, James C.

    2013-01-01

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA-compliant study was approved by our IRB. Transaxial maximum diameters of outer wall to outer wall were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of the right pulmonary artery, proximal aortic arch (PROX) immediately proximal to the innominate artery, distal aortic arch (DIST) immediately distal to the left subclavian artery, and descending aorta (DESC) at the level of the diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to the intraclass correlation coefficient (ICC) and Bland–Altman plots. The number of cases requiring manual contouring or center-line adjustment for the semi-automated method, as well as the post-processing time for each method, were recorded. Results: The mean difference between the semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of the center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to perform the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels, with the highest and lowest numbers of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is

  5. Development of an efficient automated hyperspectral processing system using embedded computing

    Science.gov (United States)

    Brown, Matthew S.; Glaser, Eli; Grassinger, Scott; Slone, Ambrose; Salvador, Mark

    2012-06-01

    Automated hyperspectral image processing enables rapid detection and identification of important military targets from hyperspectral surveillance and reconnaissance images. The majority of this processing is done using ground-based CPUs on hyperspectral data after it has been manually exfiltrated from the mobile sensor platform. However, by utilizing high-performance, on-board processing hardware, the data can be immediately processed, and the exploitation results can be distributed over a low-bandwidth downlink, allowing rapid responses to situations as they unfold. Additionally, transitioning to higher-performance and more-compact processing architectures such as GPUs, DSPs, and FPGAs will allow the size, weight, and power (SWaP) demands of the system to be reduced. This will allow the next generation of hyperspectral imaging and processing systems to be deployed on a much wider range of smaller manned and unmanned vehicles. In this paper, we present results on the development of an automated, near-real-time hyperspectral processing system using a commercially available NVIDIA® Tesla™ GPU. The processing chain utilizes GPU-optimized implementations of well-known atmospheric-correction, anomaly-detection, and target-detection algorithms in order to identify target-material spectra from a hyperspectral image. We demonstrate that the system can return target-detection results for HYDICE data with 308×1280 pixels and 145 bands against 30 target spectra in less than four seconds.

  6. A fuzzy model for processing and monitoring vital signs in ICU patients

    Directory of Open Access Journals (Sweden)

    Valentim Ricardo AM

    2011-08-01

    Background: The area of hospital automation has been the subject of much research, addressing relevant issues which can be automated, such as: management and control (electronic medical records, scheduling appointments, hospitalization, among others); communication (tracking patients, staff and materials); development of medical, hospital and laboratory equipment; monitoring (patients, staff and materials); and aid to medical diagnosis (according to each specialty). Methods: In this context, this paper presents a fuzzy model for helping medical diagnosis of Intensive Care Unit (ICU) patients whose vital signs are monitored through a multiparameter heart monitor. Intelligent systems techniques were used in acquiring the data and processing it (sorting, transforming, among others) into useful information, conducting pre-diagnosis and providing, when necessary, alert signs to the medical staff. Conclusions: Fuzzy logic applied to the medical area can be very useful if seen as a tool to assist specialists in this area. This paper presented a fuzzy model able to monitor and classify the condition of the vital signs of hospitalized patients, sending alerts according to the pre-diagnosis and thus helping medical diagnosis.
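    A toy version of one such fuzzy pre-diagnosis rule can be written directly with triangular membership functions. The vital-sign ranges and the single alert rule in this sketch are illustrative assumptions, not the model from the paper.

    ```python
    # Sketch: one fuzzy alert rule for ICU vital signs -- illustrative ranges only.

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def alert_degree(heart_rate, spo2):
        """Degree of 'send alert' for rule: IF HR is high AND SpO2 is low THEN alert."""
        hr_high = tri(heart_rate, 100, 140, 180)   # bpm, illustrative range
        spo2_low = tri(spo2, 80, 85, 92)           # percent, illustrative range
        return min(hr_high, spo2_low)              # Mamdani AND = minimum

    print(alert_degree(heart_rate=130, spo2=86))   # ~0.75: strong alert
    print(alert_degree(heart_rate=75, spo2=98))    # 0.0: no alert
    ```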

  7. Process monitoring in international safeguards for reprocessing plants: A demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Ehinger, M.H.

    1989-01-01

    In the period 1985-1987, the Oak Ridge National Laboratory investigated the possible role of process monitoring for international safeguards applications in fuel reprocessing plants. This activity was conducted under Task C.59, "Review of Process Monitoring Safeguards Technology for Reprocessing Facilities," of the US program of Technical Assistance to the International Atomic Energy Agency (IAEA) Safeguards program. The final phase was a demonstration of process monitoring applied in a prototypical reprocessing plant test facility at ORNL. This report documents the demonstration and test results. 35 figs.

  8. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.

    Science.gov (United States)

    Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar

    2016-06-28

    eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.

  9. Monitoring autocorrelated process: A geometric Brownian motion process approach

    Science.gov (United States)

    Li, Lee Siaw; Djauhari, Maman A.

    2013-09-01

    Autocorrelated process control is common in today's modern industrial process control practice. The current practice of autocorrelated process control is to eliminate the autocorrelation by using an appropriate model, such as a Box-Jenkins model, and then to conduct the process control operation based on the residuals. In this paper we show that many time series are governed by a geometric Brownian motion (GBM) process. In this case, by using the properties of a GBM process, we only need an appropriate transformation and a model of the transformed data to arrive at the conditions needed in traditional process control. An industrial example of a cocoa powder production process in a Malaysian company is presented and discussed to illustrate the advantages of the GBM approach.
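    The transformation the authors exploit is standard: if a series follows a GBM, its log-increments are i.i.d. normal, so a classical Shewhart-type chart applies after taking logs. A minimal sketch with simulated data (all parameters illustrative):

    ```python
    # Sketch: control chart for a GBM-governed series via the log transformation.
    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, dt, n = 0.05, 0.2, 1 / 250, 500

    # Simulate a geometric Brownian motion path.
    increments = (mu - sigma**2 / 2) * dt + sigma * np.sqrt(dt) * rng.normal(size=n)
    series = 100 * np.exp(np.cumsum(increments))

    # Log-returns of a GBM are i.i.d. normal, so a Shewhart chart applies to them.
    log_returns = np.diff(np.log(series))
    center, spread = log_returns.mean(), log_returns.std(ddof=1)
    ucl, lcl = center + 3 * spread, center - 3 * spread

    out_of_control = np.flatnonzero((log_returns > ucl) | (log_returns < lcl))
    print(f"limits: [{lcl:.4f}, {ucl:.4f}], signals at samples: {out_of_control}")
    ```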

  10. Signal Processing Methods Monitor Cranial Pressure

    Science.gov (United States)

    2010-01-01

    Dr. Norden Huang, of Goddard Space Flight Center, invented a set of algorithms (called the Hilbert-Huang Transform, or HHT) for analyzing nonlinear and nonstationary signals that developed into a user-friendly signal processing technology for analyzing time-varying processes. At an auction managed by Ocean Tomo Federal Services LLC, licenses of 10 U.S. patents and 1 domestic patent application related to HHT were sold to DynaDx Corporation, of Mountain View, California. DynaDx is now using the licensed NASA technology for medical diagnosis and prediction of brain blood flow-related problems, such as stroke, dementia, and traumatic brain injury.
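    The Hilbert spectral step of HHT is straightforward to illustrate: the analytic signal yields an instantaneous amplitude and frequency at every sample. A minimal sketch using scipy follows; the empirical mode decomposition that precedes this step in the full HHT is omitted.

    ```python
    # Sketch: instantaneous frequency via the Hilbert transform (one HHT ingredient).
    import numpy as np
    from scipy.signal import hilbert

    fs = 1000.0                                 # sampling rate, Hz
    t = np.arange(0, 1, 1 / fs)
    signal = np.cos(2 * np.pi * (5 * t + 10 * t**2))   # chirp: 5 Hz sweeping upward

    analytic = hilbert(signal)                  # analytic signal x + i*H[x]
    amplitude = np.abs(analytic)                # instantaneous amplitude envelope
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency, Hz

    print(inst_freq[100], inst_freq[800])       # ~7 Hz early, ~21 Hz late
    ```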

  11. An improved, computer-based, on-line gamma monitor for plutonium anion exchange process control

    International Nuclear Information System (INIS)

    Pope, N.G.; Marsh, S.F.

    1987-06-01

    An improved, low-cost, computer-based system has replaced a previously developed on-line gamma monitor. Both instruments continuously profile uranium, plutonium, and americium in the nitrate anion exchange process used to recover and purify plutonium at the Los Alamos Plutonium Facility. The latest system incorporates a personal computer that provides full-feature multichannel analyzer (MCA) capabilities by means of a single-slot, plug-in integrated circuit board. In addition to controlling all MCA functions, the computer program continuously corrects for gain shift and performs all other data processing functions. This Plutonium Recovery Operations Gamma Ray Energy Spectrometer System (PROGRESS) provides on-line process operational data essential for efficient operation. By identifying abnormal conditions in real time, it allows operators to take corrective actions promptly. The decision-making capability of the computer will be of increasing value as we implement automated process-control functions in the future. 4 refs., 6 figs

  12. APACS: Monitoring and diagnosis of complex processes

    International Nuclear Information System (INIS)

    Kramer, B.M.; Mylopoulos, J.; Cheng Wang

    1994-01-01

    This paper describes APACS - a new framework for a system that detects, predicts and identifies faults in industrial processes. The APACS framework provides a structure in which a heterogeneous set of programs can share a common view of the problem and a common model of the domain. (author). 17 refs, 2 figs

  13. Monitoring of steam sterilization processes in the dental office

    NARCIS (Netherlands)

    van Doornmalen, J.P.C.M.; Rietmeijer, A.G.M.; Feilzer, A.J.; Kopinga, K.

    2013-01-01

    In dental offices, steam sterilization is used to prevent infection of staff and patients. The necessity of sterilization is obvious. To ensure effective sterilization processes, each load has to be monitored. Based on literature and standards, a state-of-the-art concept of every-load monitoring is

  14. Opportunities for Automated Demand Response in California’s Dairy Processing Industry

    Energy Technology Data Exchange (ETDEWEB)

    Homan, Gregory K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    During periods of peak electrical demand on the energy grid, the stability of the grid may be compromised; when there is a shortage of supply, the cost of supplying electricity may rise dramatically. Demand response programs are designed to mitigate the severity of these problems and improve reliability by reducing the demand on the grid during such critical times. In 2010, the Demand Response Research Center convened a group of industry experts to suggest potential industries that would be good demand response program candidates for further review. The dairy industry was suggested due to the perception that the industry had suitable flexibility and automatic controls in place. The purpose of this report is to provide an initial description of the industry with regard to demand response potential, specifically automated demand response. This report qualitatively describes the potential for participation in demand response and automated demand response by dairy processing facilities in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use. Typical process equipment and controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Two case studies of demand response at dairy facilities in California and across the country are reviewed. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  15. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems with the validation limits, a general validation concept and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  16. Automated information and control complex of hydro-gas endogenous mine processes

    Science.gov (United States)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

    An automated information and control complex is considered which is designed to prevent accidents related to the aerological situation in underground workings, to keep track of individual devices received and handed over, to transmit and display measurement data, and to form preemptive solutions. Examples from the automated workstation of an air-gas control operator using individual devices are given. The statistical characteristics of field data characterizing the aerological situation in the mine are obtained. The studies of statistical characteristics confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of gas control points. An adaptive (multivariant) algorithm has been developed for processing measurement information on continuous multidimensional quantities and influencing factors.

  17. Adaptive SOA Stack-Based Business Process Monitoring Platform

    Directory of Open Access Journals (Sweden)

    Przemysław Dadel

    2014-01-01

    Executable business processes that formally describe company activities are well placed in the SOA environment, as they allow for declarative organization of high-level system logic. However, for both technical and non-technical users to fully benefit from that element of abstraction, appropriate business process monitoring systems are required, and existing solutions remain unsatisfactory. The paper discusses the problem of business process monitoring in the context of the service-orientation paradigm in order to propose an architectural solution and provide an implementation of a system for business process monitoring that alleviates the shortcomings of the existing solutions. Various platforms are investigated to obtain a broader view of the monitoring problem and to gather functional and non-functional requirements. These requirements constitute input for the further analysis and the system design. The monitoring software is then implemented and evaluated according to the specified criteria. An extensible business process monitoring system was designed and built on top of OSGiMM - a dynamic, event-driven, configurable communications layer that provides real-time monitoring capabilities for various types of resources. The system was tested against the stated functional requirements, and its implementation provides a starting point for further work. It is concluded that providing a uniform business process monitoring solution that satisfies a wide range of users and business process platform vendors is a difficult endeavor. It is furthermore reasoned that only an extensible, open-source monitoring platform built on top of a scalable communication core has a chance to address all the stated and future requirements.

  18. Lateralization of spatial information processing in response monitoring

    Directory of Open Access Journals (Sweden)

    Ann-Kathrin Stock

    2014-01-01

    The current study aims at identifying how lateralized multisensory spatial information processing affects response monitoring and action control. In a previous study, we investigated multimodal sensory integration in response monitoring processes using a Simon task. Behavioral and neurophysiological results suggested that different aspects of response monitoring are asymmetrically and independently allocated to the hemispheres: while efference-copy-based information on the motor execution of the task is further processed in the hemisphere that originally generated the motor command, proprioception-based spatial information is processed in the hemisphere contralateral to the effector. Hence, crossing hands (entering a foreign spatial hemifield) yielded an augmented bilateral activation during response monitoring, since these two kinds of information were processed in opposing hemispheres. Because the traditional Simon task does not provide the possibility to investigate which aspect of the spatial configuration leads to the observed hemispheric allocation, we introduced a new double-crossed condition that allows for the dissociation of internal/physiological and external/physical influences on response monitoring processes. Comparing behavioral and neurophysiological measures of this new condition to those of the traditional Simon task setup, we demonstrate that the egocentric representation of the physiological effector's spatial location accounts for the observed lateralization of spatial information in action control. The finding that the location of the physical effector had a very small influence on response monitoring measures suggests that this aspect is either less important and/or processed in different brain areas than egocentric physiological information.

  19. Metrology Sampling Strategies for Process Monitoring Applications

    KAUST Repository

    Vincent, Tyrone L.

    2011-11-01

    Shrinking process windows in very large scale integration semiconductor manufacturing have already necessitated the development of control systems capable of addressing sub-lot-level variation. Within-wafer control is the next milestone in the evolution of advanced process control from lot-based and wafer-based control. In order to adequately comprehend and control within-wafer spatial variation, inline measurements must be performed at multiple locations across the wafer. At the same time, economic pressures prompt a reduction in metrology, for both capital and cycle-time reasons. This paper explores the use of modeling and minimum-variance prediction as a method to select the sites for measurement on each wafer. The models are developed using the standard statistical tools of principal component analysis and canonical correlation analysis. The proposed selection method is validated using real manufacturing data, and results indicate that it is possible to significantly reduce the number of measurements with little loss in the information obtained for the process control systems. © 2011 IEEE.
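    The modeling-plus-prediction idea can be sketched with a PCA model: historical wafer maps supply the principal spatial components, and scores estimated from a measured subset of sites reconstruct the full map. Everything below (map size, number of components, the chosen sites) is an illustrative assumption, not the paper's actual selection algorithm.

    ```python
    # Sketch: predict unmeasured wafer sites from a measured subset via PCA.
    import numpy as np

    rng = np.random.default_rng(2)
    n_wafers, n_sites, k = 200, 49, 3          # illustrative dimensions

    # Historical maps: low-rank spatial variation plus measurement noise.
    components = rng.normal(size=(k, n_sites))
    history = (rng.normal(size=(n_wafers, k)) @ components
               + 0.05 * rng.normal(size=(n_wafers, n_sites)))

    mean = history.mean(axis=0)
    _, _, vt = np.linalg.svd(history - mean, full_matrices=False)
    P = vt[:k].T                               # n_sites x k loading matrix

    measured = np.array([0, 12, 24, 36, 48])   # 5 measured sites out of 49
    new_wafer = rng.normal(size=k) @ components + 0.05 * rng.normal(size=n_sites)

    # Least-squares scores from the measured sites, then full-map prediction.
    scores, *_ = np.linalg.lstsq(P[measured], (new_wafer - mean)[measured], rcond=None)
    prediction = mean + P @ scores

    rmse = np.sqrt(np.mean((prediction - new_wafer) ** 2))
    print(f"full-map RMSE from 5 of 49 measurements: {rmse:.3f}")
    ```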

  20. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Bonnie [Adventium Labs; Boddy, Mark [Adventium Labs; Doyle, Frank [Univ. of California, Santa Barbara, CA (United States); Jamshidi, Mo [Univ. of New Mexico, Albuquerque, NM (United States); Ogunnaike, Tunde [Univ. of Delaware, Newark, DE (United States)

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  1. Potentiometric electronic tongue-flow injection analysis system for the monitoring of heavy metal biosorption processes.

    Science.gov (United States)

    Wilson, D; del Valle, M; Alegret, S; Valderrama, C; Florido, A

    2012-05-15

    An automated flow-injection potentiometric (FIP) system with electronic tongue (ET) detection is used for the monitoring of biosorption processes of heavy metals on vegetable wastes. Grape stalk wastes are used as the biosorbent to remove Cu²⁺ ions in a fixed-bed column configuration. The ET is formed by a 5-sensor array with Cu²⁺- and Ca²⁺-selective electrodes and electrodes with a generic response to heavy metals, plus an artificial neural network model of the sensors' cross-response. The real-time monitoring of both the Cu²⁺ and the exchanged and released cation (Ca²⁺) in the effluent solution is performed using the flow-injection potentiometric electronic tongue system. The coupling of the electronic tongue with the automation features of the flow-injection system allows us to accurately characterize the Cu²⁺ ion-biosorption process by obtaining its breakthrough curves and the profile of the Ca²⁺ ion release. In parallel, fractions of the extract solution are analysed by spectroscopic techniques in order to validate the results obtained with the reported methodology. The sorption performance of grape stalks is also evaluated by means of well-established sorption models.
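    The cross-response modeling step can be sketched with a small feed-forward network that maps the five electrode potentials to the two ion concentrations. The sketch below uses synthetic stand-ins for calibration solutions and scikit-learn's generic MLP rather than the authors' actual network.

    ```python
    # Sketch: ANN model of a 5-electrode tongue's cross-response (synthetic data).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(3)
    conc = rng.uniform(0.01, 1.0, size=(300, 2))   # [Cu2+, Ca2+], arbitrary units

    # Synthetic Nernstian-like responses with cross-sensitivity on all 5 electrodes.
    weights = rng.uniform(0.2, 1.0, size=(2, 5))
    potentials = np.log10(conc) @ weights + 0.01 * rng.normal(size=(300, 5))

    model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    model.fit(potentials[:250], conc[:250])        # calibration solutions
    print("R^2 on held-out samples:", model.score(potentials[250:], conc[250:]))
    ```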

  2. A Log Mining Approach for Process Monitoring in SCADA

    NARCIS (Netherlands)

    Hadziosmanovic, D.; Bolzoni, D.; Hartel, Pieter H.

    2010-01-01

    SCADA (Supervisory Control and Data Acquisition) systems are used for controlling and monitoring industrial processes. We propose a methodology to systematically identify potential process-related threats in SCADA. Process-related threats take place when an attacker gains user access rights and

  3. A Log Mining Approach for Process Monitoring in SCADA

    NARCIS (Netherlands)

    Hadziosmanovic, D.; Bolzoni, D.; Hartel, Pieter H.

    2012-01-01

    SCADA (Supervisory Control and Data Acquisition) systems are used for controlling and monitoring industrial processes. We propose a methodology to systematically identify potential process-related threats in SCADA. Process-related threats take place when an attacker gains user access rights and

  4. Power up your plant - An introduction to integrated process and power automation

    Energy Technology Data Exchange (ETDEWEB)

    Vasel, Jeffrey

    2010-09-15

    This paper discusses how a single integrated system can increase energy efficiency, improve plant uptime, and lower life cycle costs. Integrated Process and Power Automation is a new system integration architecture and power strategy that addresses the needs of the process and power generation industries. The architecture is based on Industrial Ethernet standards such as IEC 61850 and Profinet as well as Fieldbus technologies. The energy efficiency gains from integration are discussed in a power generation use case. A power management system success story from a major oil and gas company, Petrobras, is also discussed.

  5. Automated Collection of Real-Time Alerts of Citizens as a Useful Tool to Continuously Monitor Malodorous Emissions.

    Science.gov (United States)

    Brattoli, Magda; Mazzone, Antonio; Giua, Roberto; Assennato, Giorgio; de Gennaro, Gianluigi

    2016-02-26

    The evaluation of odor emissions and their dispersion is a difficult problem to address; real-time monitoring of odor emissions, identification of the chemical components and attribution of the annoyance to a source with adequate certainty represent a challenge for stakeholders such as local authorities. Complaints from citizens, often unsystematic and variously distributed, generally do not allow the perceived annoyance to be quantified. Experimental research has been performed to detect and evaluate olfactory annoyance, based on field testing of an innovative monitoring methodology grounded in the automatic recording of citizen alerts. It was applied in Taranto, in the south of Italy, where a large industrial area is located, using Odortel® for the automated collection of citizen alerts. To evaluate its reliability, the collection system was integrated with automated samplers able to sample odorous air in real time according to the citizens' annoyance alerts, and with meteorological data (especially wind direction) and trends in odor marker compounds recorded by air quality monitoring stations. The results have allowed us, for the first time, to manage annoyance complaints, test their reliability, and obtain information about the distribution and extent of the odor phenomena, such that we were able to identify, with supporting evidence, the source as an oil refinery plant.

  6. Technical note: Validation of a commercial system for the continuous and automated monitoring of dairy cow activity.

    Science.gov (United States)

    Tullo, E; Fontana, I; Gottardo, D; Sloth, K H; Guarino, M

    2016-09-01

    Current farm sizes do not allow the precise identification and tracking of individual cows and their health and behavioral records. The application of information technology in intensive dairy farming now plays a key role in proper routine management, improving animal welfare and enhancing the comfort of dairy cows. One existing application is the GEA CowView system (GEA Farm Technologies, Bönen, Germany). This system detects and monitors animal behavioral activities based on positioning, through the creation of a virtual map of the barn that outlines all the areas to which cows have access. The aim of this study was to validate the accuracy, sensitivity, and specificity of the data provided by the CowView system. The validation was performed by comparing data obtained automatically from the CowView system with those obtained by a manual labeling procedure performed on video recordings. The data used for the comparisons were the zone-related activities performed by the selected dairy cows, classified into 2 categories: activity and localization. The duration in seconds of each of the activities/localizations detected both with the manual labeling and with the automated system was used to evaluate the correlation coefficients among the data, and subsequently the accuracy, sensitivity, specificity, and positive and negative predictive values of the automated monitoring system were calculated. The results of this validation study showed that the CowView automated monitoring system is able to identify cow localization/position (alley, trough, cubicles) with high reliability in relation to the zone-related activities performed by dairy cows (accuracy higher than 95%). These results support the CowView system as a potential solution for easier management of dairy cows.

  7. Robotic Automation Process - The next major revolution in terms of back office operations improvement

    Directory of Open Access Journals (Sweden)

    Anagnoste Sorin

    2017-07-01

    Forced to provide consistent results to shareholders, organizations have turned to Robotic Process Automation (RPA) in order to tackle the following typical challenges they face: (1) cost reduction, (2) quality increase and (3) faster processes. RPA is now considered the next big thing for Shared Services Centers (SSCs) and Business Process Outsourcing (BPO) providers around the world, and especially in Central and Eastern Europe. In SSCs and BPOs, the activities with the highest potential for automation are in finance, supply chain and human resources departments. The problems these businesses face are mostly related to high data entry volumes, high error rates, significant rework, numerous manual processes, multiple non-integrated legacy systems and high turnover due to repetitive, low-value-added activities. One advantage of RPA is that it can be trained by users to undertake structured, repeatable, computer-based tasks, interacting with multiple systems while making complex decisions based on algorithms. By doing this, the robot can identify exceptions for manual processing, remove idle times and keep logs of the actions performed. Another advantage is that automated solutions can work 24/7, can be implemented fast, work with the existing architecture, cut data entry costs by up to 70% and perform at 30% of the cost of a full-time employee, thus providing a quick and tangible return to organizations. For Romania, a key destination for SSCs and BPOs, this technology will make them more competitive, but will also lead to the creation of a series of high-paid jobs while eliminating low-input jobs. The paper also analyzes the most important vendors of RPA solutions on the market and provides specific case studies from different industries, thus helping future leaders and organizations take better decisions.

  8. Improved protein hydrogen/deuterium exchange mass spectrometry platform with fully automated data processing.

    Science.gov (United States)

    Zhang, Zhongqi; Zhang, Aming; Xiao, Gang

    2012-06-05

    Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with slow intrinsic HDX rate is employed as another internal standard to reflect the possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model to simulate the deuterium labeling and back exchange process. The HDX model is implemented into the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set starting from ion detection and peptide identification to final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most possible protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody by increasing concentrations of guanidine.
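    The back-exchange correction at the heart of such a platform reduces to a normalization between undeuterated and fully deuterated controls. A minimal sketch of the standard formula follows; the scalar internal-standard adjustment is a simplified stand-in for the run-to-run correction the authors describe.

    ```python
    # Sketch: back-exchange-corrected deuterium uptake for one peptide.
    # The standard-peptide scaling is a simplified stand-in for the paper's method.

    def corrected_uptake(m_t, m_0, m_100, n_amides, back_exchange_scale=1.0):
        """Deuterons incorporated at time t, corrected for back exchange.

        m_t      -- measured centroid mass at exchange time t
        m_0      -- undeuterated (0%) control mass
        m_100    -- fully deuterated (100%) control mass
        n_amides -- exchangeable backbone amides in the peptide
        back_exchange_scale -- run-to-run factor from fast-exchanging internal
                               standards (1.0 means no adjustment)
        """
        fraction = (m_t - m_0) / (m_100 - m_0) * back_exchange_scale
        return fraction * n_amides

    # Hypothetical numbers: peptide with 9 exchangeable amides.
    print(corrected_uptake(m_t=1254.6, m_0=1250.0, m_100=1259.0, n_amides=9))
    # -> ~4.6 deuterons
    ```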

  9. Infrared signature analysis - Real time monitoring of manufacturing processes

    International Nuclear Information System (INIS)

    Bangs, E.R.

    1988-01-01

    The ability to monitor manufacturing processes in an adaptive control mode and perform inspection in real time is of interest to fabricators in the pressure vessel, aerospace, automotive, nuclear, and shipbuilding industries. Results of a series of experiments using infrared thermography as the principal sensing mode are presented to show how much critical process information is contained in infrared isotherms. Image-processing software development has demonstrated, in a spot welding application, how the process can be monitored and controlled in real time. The IR vision sensor program is now under way. Research thus far has focused on fusion welding, resistance spot welding and metal removal. 6 references

  10. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and for monitoring the whole process and its results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user

  11. A data-driven multiplicative fault diagnosis approach for automation processes.

    Science.gov (United States)

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of the multiplicative fault are extracted. To identify the root cause, the impact of the fault on each process variable is evaluated in the sense of its contribution to performance degradation. Then, a numerical example is used to illustrate the functionality of the method, and a Monte Carlo simulation is performed to demonstrate its effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented.

  12. PLS-based memory control scheme for enhanced process monitoring

    KAUST Repository

    Harrou, Fouzi

    2017-01-20

    Fault detection is important for the safe operation of various modern engineering systems. Partial least squares (PLS) has been widely used in monitoring highly correlated process variables. Conventional PLS-based methods, nevertheless, often fail to detect incipient faults. In this paper, we develop a new PLS-based monitoring chart, combining PLS with a multivariate memory control chart, the multivariate exponentially weighted moving average (MEWMA) chart. The MEWMA chart is sensitive to incipient faults in the process mean, which significantly improves the performance of PLS methods and widens their applicability in practice. Using simulated distillation column data, we demonstrate that the proposed PLS-based MEWMA control chart is more effective in detecting incipient faults in the mean of the multivariate process variables, and outperforms the conventional PLS-based monitoring charts.
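    The memory element added to PLS here is the MEWMA recursion. The following sketch applies the MEWMA statistic directly to simulated multivariate observations (standing in for PLS scores or residuals); the smoothing weight and alarm threshold are illustrative.

    ```python
    # Sketch: multivariate EWMA (MEWMA) statistic with a small simulated mean shift.
    import numpy as np

    rng = np.random.default_rng(4)
    p, n, lam = 3, 200, 0.2                    # dimension, samples, smoothing weight
    X = rng.normal(size=(n, p))
    X[120:] += 0.7                             # incipient mean shift after sample 120

    sigma = np.eye(p)                          # in-control covariance (known here)
    sigma_z = (lam / (2 - lam)) * sigma        # asymptotic covariance of the EWMA
    sigma_z_inv = np.linalg.inv(sigma_z)

    z = np.zeros(p)
    t2 = np.empty(n)
    for i in range(n):
        z = lam * X[i] + (1 - lam) * z         # MEWMA recursion
        t2[i] = z @ sigma_z_inv @ z            # Hotelling-type chart statistic

    h = 12.0                                   # illustrative alarm threshold
    print("first alarm at sample:", int(np.argmax(t2 > h)))
    ```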

  13. Development and evaluation of a profile negotiation process for integrating aircraft and air traffic control automation

    Science.gov (United States)

    Green, Steven M.; Denbraven, Wim; Williams, David H.

    1993-01-01

    The development and evaluation of the profile negotiation process (PNP), an interactive process between an aircraft and air traffic control (ATC) that integrates airborne and ground-based automation capabilities to determine conflict-free trajectories that are as close to an aircraft's preference as possible, are described. The PNP was evaluated in a real-time simulation experiment conducted jointly by NASA's Ames and Langley Research Centers. The Ames Center/TRACON Automation System (CTAS) was used to support the ATC environment, and the Langley Transport Systems Research Vehicle (TSRV) piloted cab was used to simulate a 4D Flight Management System (FMS) capable aircraft. Both systems were connected in real time by way of voice and data lines; digital datalink communications capability was developed and evaluated as a means of supporting the air/ground exchange of trajectory data. The controllers were able to consistently and effectively negotiate nominally conflict-free vertical profiles with the 4D-equipped aircraft. The actual profiles flown were substantially closer to the aircraft's preference than would have been possible without the PNP. However, there was a strong consensus among the pilots and controllers that the level of automation of the PNP should be increased to make the process more transparent. The experiment demonstrated the importance of an aircraft's ability to accurately execute a negotiated profile as well as the need for digital datalink to support advanced air/ground data communications. The concept of trajectory space is proposed as a comprehensive approach for coupling the processes of trajectory planning and tracking to allow maximum pilot discretion in meeting ATC constraints.

  14. Real time monitoring automation of dose rate absorbed in air due to environmental gamma radiation

    International Nuclear Information System (INIS)

    Dominguez Ley, Orlando; Capote Ferrera, Eduardo; Carrazana Gonzalez, Jorge A.; Manzano de Armas, Jose F.; Alonso Abad, Dolores; Prendes Alonso, Miguel; Tomas Zerquera, Juan; Caveda Ramos, Celia A.; Kalber, Olof; Fabelo Bonet, Orlando; Montalvan Estrada, Adelmo; Cartas Aguila, Hector; Leyva Fernandez, Julio C.

    2005-01-01

    The Center of Radiation Protection and Hygiene (CPHR), as the head institution of the National Radiological Environmental Surveillance Network (RNVRA), has strengthened its detection and response capacity for a radiological emergency situation. The measurements of gamma dose rate at the main points of the RNVRA are obtained in real time, and the CPHR receives the data coming from those points within a short time. To achieve the operability of the RNVRA it was necessary to complete the existing monitoring facilities with 4 automatic gamma probes, implementing in this way a real-time measurement system. The software packages GenitronProbe (for obtaining the data automatically from the probe), Data Mail (for sending the data via e-mail), and Gamma Red (for receiving and processing the data at the head institution) were developed.

  15. Smart membranes for monitoring membrane based desalination processes

    KAUST Repository

    Laleg-Kirati, Taous-Meriem

    2017-10-12

    Various examples are related to smart membranes for monitoring membrane-based processes such as, e.g., membrane distillation processes. In one example, a membrane includes a porous surface and a plurality of sensors (e.g., temperature, flow and/or impedance sensors) mounted on the porous surface. In another example, a membrane distillation (MD) process includes the membrane. Processing circuitry can be configured to monitor outputs of the plurality of sensors. The monitored outputs can be used to determine membrane degradation, membrane fouling, or to provide an indication of membrane replacement or cleaning. The sensors can also provide temperatures or temperature differentials across the porous surface, which can be used to improve modeling or control of the MD process.
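
    Purely as a hypothetical illustration of how such sensor outputs might be combined by the processing circuitry (the thresholds and the impedance-based fouling signature below are assumptions, not from the patent):

        def membrane_status(feed_temps, permeate_temps, impedance, baseline_impedance):
            """Toy check on smart-membrane sensor outputs (thresholds illustrative)."""
            # temperature differential across the porous surface, per sensor pair
            dT = [f - p for f, p in zip(feed_temps, permeate_temps)]
            # assumed fouling signature: impedance rises well above its baseline
            fouling = impedance > 1.2 * baseline_impedance
            return {"mean_dT": sum(dT) / len(dT), "fouling_suspected": fouling}

        print(membrane_status([60.1, 59.8], [39.5, 40.2], 130.0, 100.0))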

  16. Automated identification of Monogeneans using digital image processing and K-nearest neighbour approaches.

    Science.gov (United States)

    Yousef Kalafi, Elham; Tan, Wooi Boon; Town, Christopher; Dhillon, Sarinder Kaur

    2016-12-22

    Monogeneans are flatworms (Platyhelminthes) that are primarily found on the gills and skin of fishes. Monogenean parasites have attachment appendages at their haptoral regions that help them move about the body surface and feed on skin and gill debris. Haptoral attachment organs consist of sclerotized hard parts such as hooks, anchors and marginal hooks. Monogenean species are differentiated based on the morphological characters of their haptoral bars, anchors, marginal hooks, reproductive parts (male and female copulatory organs) and soft anatomical parts. The complex structure of these diagnostic organs, and their overlapping in microscopic digital images, are impediments to developing a fully automated identification system for monogeneans (LNCS 7666:256-263, 2012), (ISDA; 457-462, 2011), (J Zoolog Syst Evol Res 52(2): 95-99. 2013). In this study, images of hard parts of the haptoral organs, such as bars and anchors, are used to develop a fully automated technique for monogenean species identification by applying image processing techniques and machine learning methods. Images of four monogenean species, namely Sinodiplectanotrema malayanus, Trianchoratus pahangensis, Metahaliotrema mizellei and Metahaliotrema sp. (undescribed), were used to develop the automated technique. K-nearest neighbour (KNN) was applied to classify the monogenean specimens based on the extracted features. 50% of the dataset was used for training and the other 50% was used as the testing set for system evaluation. Our approach demonstrated an overall classification accuracy of 90%. Leave-one-out (LOO) cross-validation was also used to validate the system, giving an accuracy of 91.25%. The methods presented in this study facilitate fast and accurate fully automated classification of monogeneans at the species level. In future studies more classes will be included in the model, the time to capture the monogenean images will be reduced and improvements in
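
    As a rough illustration of the classification and validation steps (the features below are synthetic placeholders for the measured haptoral hard-part descriptors):

        import numpy as np
        from sklearn.model_selection import train_test_split, LeaveOneOut, cross_val_score
        from sklearn.neighbors import KNeighborsClassifier

        # hypothetical stand-in for morphological features extracted from
        # haptoral bars/anchors (4 species, 20 specimens each, 6 features)
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(loc=k, scale=0.5, size=(20, 6)) for k in range(4)])
        y = np.repeat(np.arange(4), 20)

        # 50/50 train/test split, as in the study
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                                  stratify=y, random_state=0)
        knn = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)
        print("hold-out accuracy:", knn.score(X_te, y_te))

        # leave-one-out validation, as in the study
        print("LOO accuracy:", cross_val_score(knn, X, y, cv=LeaveOneOut()).mean())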

  17. The integrated business information system: using automation to monitor cost-effectiveness of park operations

    Science.gov (United States)

    Dick Stanley; Bruce Jackson

    1995-01-01

    The cost-effectiveness of park operations is often neglected because the information is laborious to compile. The information, however, is critical if we are to derive maximum benefit from scarce resources. This paper describes an automated system for calculating cost-effectiveness ratios with minimum effort, using data from existing databases.

  18. Monitoring the Performance of Human and Automated Scores for Spoken Responses

    Science.gov (United States)

    Wang, Zhen; Zechner, Klaus; Sun, Yu

    2018-01-01

    As automated scoring systems for spoken responses are increasingly used in language assessments, testing organizations need to analyze their performance, as compared to human raters, across several dimensions, for example, on individual items or based on subgroups of test takers. In addition, there is a need in testing organizations to establish…

  19. Development of an automated chip culture system with integrated on-line monitoring for maturation culture of retinal pigment epithelial cells

    Directory of Open Access Journals (Sweden)

    Mee-Hae Kim

    2017-10-01

    Full Text Available In cell manufacturing, the establishment of a fully automated, microfluidic cell culture system that can be used for long-term cell cultures, as well as for process optimization, is highly desirable. This study reports the development of a novel chip bioreactor system that can be used for automated long-term maturation cultures of retinal pigment epithelial (RPE) cells. The system consists of an incubation unit, a medium supply unit, a culture observation unit, and a control unit. In the incubation unit, the chip contains a closed culture vessel (2.5 mm diameter, working volume 9.1 μL), which can be held at 37 °C and 5% CO2 and uses a gas-permeable resin (polydimethylsiloxane) as the vessel wall. RPE cells were seeded at 5.0 × 10⁴ cells/cm² and the medium was changed every day by introducing fresh medium through the medium supply unit. Culture solutions were stored either in the refrigerator or the freezer, and fresh medium was prepared before each medium change by warming to 37 °C and mixing. Automated culture was continued for 30 days to allow maturation of the RPE cells. This chip culture system allows long-term, bubble-free culture of RPE cells, while also permitting observation of the cells to elucidate their morphology or show the presence of tight junctions. This culture system, with its integrated on-line monitoring, can therefore be applied to long-term cultures of RPE cells and should contribute to process control in RPE cell manufacturing.

  20. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time-consuming and inefficient, especially the sample counting and measurement process: the sample needs to be changed and the measurement software set up for every one-hour counting period, and both procedures are performed manually for every sample. Hence, an automatic sample changer (ASC) system consisting of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  1. Automated radioxenon monitoring for the comprehensive nuclear-test-ban treaty in two distinctive locations: Ottawa and Tahiti.

    Science.gov (United States)

    Stocki, T J; Blanchard, X; D'Amours, R; Ungar, R K; Fontaine, J P; Sohier, M; Bean, M; Taffary, T; Racine, J; Tracy, B L; Brachet, G; Jean, M; Meyerhof, D

    2005-01-01

    In preparation for verification of the Comprehensive Nuclear-Test-Ban Treaty, automated radioxenon monitoring is performed in two distinctive environments: Ottawa and Tahiti. These sites are monitored with SPALAX (Systeme de Prelevement d'air Automatique en Ligne avec l'Analyse des radioXenons) technology, which automatically extracts radioxenon from the atmosphere and measures the activity concentrations of 131mXe, 133mXe, 133Xe and 135Xe. The resulting isotopic concentrations can be useful for discerning nuclear explosions from nuclear-industry xenon emissions. Ambient radon background, which may adversely impact analyser sensitivity, is discussed. Upper concentration limits are reported for the apparently radioxenon-free Tahiti environment. Ottawa has a complex radioxenon background due to its proximity to nuclear reactors and medical isotope facilities. Meteorological models suggest that, depending on the wind direction, the radioxenon detected in Ottawa can be characteristic of the normal radioxenon background in the Eastern United States, Europe, and Japan, or distinctive due to medical isotope production.

  2. Process development of human multipotent stromal cell microcarrier culture using an automated high‐throughput microbioreactor

    Science.gov (United States)

    Hanga, Mariana P.; Heathman, Thomas R. J.; Coopman, Karen; Nienow, Alvin W.; Williams, David J.; Hewitt, Christopher J.

    2017-01-01

    Microbioreactors play a critical role in process development, as they reduce reagent requirements and can facilitate high-throughput screening of process parameters and culture conditions. Here, we have demonstrated and explained in detail, for the first time, the amenability of the automated ambr15 cell culture microbioreactor system for the development of scalable adherent human mesenchymal multipotent stromal/stem cell (hMSC) microcarrier culture processes. This was achieved by first improving suspension and mixing of the microcarriers and then improving cell attachment, thereby reducing the initial growth lag phase. The latter was achieved by using only 50% of the final working volume of medium for the first 24 h and using an intermittent agitation strategy. These changes resulted in a >150% increase in viable cell density after 24 h compared to the original process (no agitation for 24 h and 100% working volume). Using the same methodology as in the ambr15, similar improvements were obtained in larger scale spinner flask studies. Finally, this improved bioprocess methodology based on a serum-based medium was applied to a serum-free process in the ambr15, resulting in a >250% increase in yield compared to the serum-based process. At both scales, the agitation used during culture was the minimum required for microcarrier suspension, NJS. The use of the ambr15, with its improved control compared to the spinner flask, reduced the coefficient of variation on viable cell density in the serum-containing medium from 7.65% to 4.08%, and the switch to serum-free further reduced these to 1.06% and 0.54%, respectively. The combination of both serum-free and automated processing improved the reproducibility more than 10-fold compared to the serum-based, manual spinner flask process. The findings of this study demonstrate that the ambr15 microbioreactor is an effective tool for bioprocess development of hMSC microcarrier cultures and that a combination

  3. Use of process monitoring data to enhance material accounting

    International Nuclear Information System (INIS)

    Brouns, R.J.; Smith, B.W.

    1980-06-01

    A study was conducted for the United States Nuclear Regulatory Commission as part of a continuing program to estimate the effectiveness of using process monitoring data to enhance special nuclear material accounting in nuclear facilities. Two licensed fuel fabrication facilities with internal scrap recovery processes were examined. The loss detection sensitivity, timeliness, and localization capabilities of the process monitoring technique were evaluated for single and multiple (trickle) losses. The impact of records manipulation, mass and isotopic substitution, and collusion between two insiders as methods for concealing diversion was also studied.

  4. Algorithms of control parameters selection for automation of FDM 3D printing process

    Directory of Open Access Journals (Sweden)

    Kogut Paweł

    2017-01-01

    Full Text Available The paper presents algorithms for the selection of control parameters of the Fused Deposition Modelling (FDM) technology, in the case of an open printing-solutions environment and the 3DGence ONE printer. The following parameters were distinguished: model mesh density, material flow speed, cooling performance, retraction and printing speed. These parameters are independent in principle, but in practice depend to a certain degree on the features of the selected printing equipment. This is a first step toward automation of the 3D printing process in FDM technology.
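
    A toy, rule-based sketch of such parameter selection is shown below; all thresholds are invented for illustration and are not the paper's derived rules:

        def select_fdm_parameters(nozzle_d_mm, layer_h_mm, material="PLA"):
            """Toy rule-based selection of FDM control parameters.

            Thresholds are illustrative only; the paper derives its rules from
            the 3DGence ONE printer and open-environment slicer settings.
            """
            print_speed = 60.0 if layer_h_mm <= 0.2 else 40.0      # mm/s
            cooling_pct = 100 if material == "PLA" else 30         # fan duty cycle
            retraction_mm = 1.0 if nozzle_d_mm <= 0.4 else 2.0
            # reduce flow slightly when the layer is thick relative to the nozzle
            flow_mult = 1.0 if layer_h_mm < 0.75 * nozzle_d_mm else 0.95
            return dict(speed=print_speed, cooling=cooling_pct,
                        retraction=retraction_mm, flow=flow_mult)

        print(select_fdm_parameters(0.4, 0.2))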

  5. Groundwater monitoring plan for the 300 Area process trenches

    International Nuclear Information System (INIS)

    Lindberg, J.W.; Chou, C.J.; Johnson, V.G.

    1995-01-01

    This document describes the groundwater monitoring program for the Hanford Site 300 Area Process Trenches (300 APT). The 300 APT are a Resource Conservation and Recovery Act of 1976 (RCRA) regulated unit. The 300 APT are included in the Dangerous Waste Portion of the Resource Conservation and Recovery Act Permit for the Treatment, Storage, and Disposal of Dangerous Waste, Permit No. WA890008967, and are subject to final-status requirements for groundwater monitoring. This document describes a compliance monitoring program for groundwater in the uppermost aquifer system at the 300 APT. This plan describes the 300 APT monitoring network, constituent list, sampling schedule, statistical methods, and sampling and analysis protocols that will be employed for the 300 APT. This plan will be used to meet groundwater monitoring requirements from the time the 300 APT becomes part of the Permit and through the postclosure care period until certification of final closure

  6. Groundwater monitoring plan for the 300 Area process trenches

    Energy Technology Data Exchange (ETDEWEB)

    Lindberg, J.W.; Chou, C.J.; Johnson, V.G.

    1995-05-23

    This document describes the groundwater monitoring program for the Hanford Site 300 Area Process Trenches (300 APT). The 300 APT are a Resource Conservation and Recovery Act of 1976 (RCRA) regulated unit. The 300 APT are included in the Dangerous Waste Portion of the Resource Conservation and Recovery Act Permit for the Treatment, Storage, and Disposal of Dangerous Waste, Permit No. WA890008967, and are subject to final-status requirements for groundwater monitoring. This document describes a compliance monitoring program for groundwater in the uppermost aquifer system at the 300 APT. This plan describes the 300 APT monitoring network, constituent list, sampling schedule, statistical methods, and sampling and analysis protocols that will be employed for the 300 APT. This plan will be used to meet groundwater monitoring requirements from the time the 300 APT becomes part of the Permit and through the postclosure care period until certification of final closure.

  7. Process monitoring IAN Agroparks in India : Transforum report 2009

    NARCIS (Netherlands)

    Gerritsen, A.L.; Chakravarthy, G.K.D.K.; Giesen, E.

    2009-01-01

    This is the first report of the TransForum project Process Monitoring Agroparks International, which focuses on India, and specifically on the development of the IFFCO Kisan SEZ Nellore in the south of India. It contains an overview of the process design and the content of the proposition of IAN agroparks in

  8. Sampling plans for monitoring quality control process at a plastic

    African Journals Online (AJOL)

    Dr Obe

    control techniques. Quality control techniques are a means of seeing that quality standards are maintained. A more specific definition of quality control, simply stated by Juran [5], is that it is "the ... but acceptance sampling is a common tool in practice. ... are useful for process monitoring and process capability analysis. Also ...

  9. Developing an automated database for monitoring ultrasound- and computed tomography-guided procedure complications and diagnostic yield.

    Science.gov (United States)

    Itri, Jason N; Jones, Lisa P; Kim, Woojin; Boonn, William W; Kolansky, Ana S; Hilton, Susan; Zafar, Hanna M

    2014-04-01

    Monitoring complications and diagnostic yield for image-guided procedures is an important component of maintaining the high-quality patient care promoted by professional societies in radiology and by accreditation organizations such as the American College of Radiology (ACR) and the Joint Commission. These outcome metrics can be used as part of a comprehensive quality assurance/quality improvement program to reduce variation in clinical practice, provide opportunities to engage in practice quality improvement, and contribute to developing national benchmarks and standards. The purpose of this article is to describe the development and successful implementation of an automated web-based software application to monitor procedural outcomes for US- and CT-guided procedures in an academic radiology department. The open-source tools PHP: Hypertext Preprocessor (PHP) and MySQL were used to extract relevant procedural information from the Radiology Information System (RIS), auto-populate the procedure log database, and develop a user interface that generates real-time reports of complication rates and diagnostic yield by site and by operator. Utilizing structured radiology report templates resulted in significantly improved accuracy of the information auto-populated from radiology reports, as well as greater compliance with manual data entry. An automated web-based procedure log database is an effective tool to reliably track complication rates and diagnostic yield for US- and CT-guided procedures performed in a radiology department.
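
    The article's implementation uses PHP and MySQL against the RIS; purely as an illustration of the reporting logic, here is a Python/SQLite sketch with a hypothetical schema:

        import sqlite3

        # hypothetical schema mirroring a procedure log auto-populated from the RIS
        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE procedures (
            id INTEGER PRIMARY KEY, modality TEXT, site TEXT, operator TEXT,
            complication INTEGER, diagnostic INTEGER);
        INSERT INTO procedures (modality, site, operator, complication, diagnostic) VALUES
            ('US', 'liver', 'A', 0, 1), ('CT', 'lung', 'B', 1, 1),
            ('CT', 'lung', 'A', 0, 0), ('US', 'thyroid', 'B', 0, 1);
        """)

        # real-time report: complication rate and diagnostic yield by site and operator
        report = con.execute("""
            SELECT site, operator,
                   AVG(complication) AS complication_rate,
                   AVG(diagnostic)   AS diagnostic_yield,
                   COUNT(*)          AS n
            FROM procedures GROUP BY site, operator
        """).fetchall()
        for row in report:
            print(row)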

  10. Identification of stable areas in unreferenced laser scans for automated geomorphometric monitoring

    Science.gov (United States)

    Wujanz, Daniel; Avian, Michael; Krueger, Daniel; Neitzel, Frank

    2018-04-01

    Current research questions in the field of geomorphology focus on the impact of climate change on several processes that subsequently cause natural hazards. Geodetic deformation measurements are a suitable tool to document such geomorphic mechanisms, e.g. by capturing a region of interest with terrestrial laser scanners, which results in a so-called 3-D point cloud. The main problem in deformation monitoring is the transformation of 3-D point clouds captured at different points in time (epochs) into a stable reference coordinate system. In this contribution, a surface-based registration methodology is applied, termed the iterative closest proximity algorithm (ICProx), that solely uses point cloud data as input, similar to the iterative closest point (ICP) algorithm. The aim of this study is to automatically classify deformations that occurred at a rock glacier and an ice glacier, as well as in a rockfall area. For every case study, two epochs were processed, while the datasets differ notably in their geometric characteristics and in the distribution and magnitude of deformation. In summary, the ICProx algorithm's classification accuracy is 70 % on average in comparison to reference data.

  11. Web-based execution of graphical work-flows: a modular platform for multifunctional scientific process automation

    International Nuclear Information System (INIS)

    De Ley, E.; Jacobs, D.; Ounsy, M.

    2012-01-01

    The Passerelle process automation suite offers a fundamentally modular solution platform, based on a layered integration of several best-of-breed technologies. It has been successfully applied by Synchrotron Soleil as the sequencer for data acquisition and control processes on its beamlines, integrated with TANGO as a control bus and GlobalScreen™ as the SCADA package. Since last year it has been used as the graphical work-flow component for the development of an Eclipse-based Data Analysis Work Bench at ESRF. The top layer of Passerelle exposes an actor-based development paradigm, based on the Ptolemy framework (UC Berkeley). Actors provide explicit reusability and strong decoupling, combined with an inherently concurrent execution model. Actor libraries exist for TANGO integration, web services, database operations, flow control, rules-based analysis, mathematical calculations, launching external scripts, etc. Passerelle's internal architecture is based on OSGi, the major Java framework for modular service-based applications. A large set of modules exists that can be recombined as desired to obtain different features and deployment models. Besides desktop versions of the Passerelle work-flow workbench, there is also the Passerelle Manager, a secured web application including a graphical editor for centralized design, execution, management and monitoring of process flows, integrating standard Java Enterprise services with OSGi. We will present the internal technical architecture, some interesting application cases and the lessons learnt. (authors)

  12. ATTEMPTS TO AUTOMATE THE PROCESS OF GENERATION OF ORTHOIMAGES OF OBJECTS OF CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    J. S. Markiewicz

    2015-02-01

    Full Text Available At present, digital documentation recorded in the form of raster or vector files is the obligatory way of inventorying historical objects. The orthoimage is a cartometric form of photographic presentation of information in a two-dimensional reference system. The paper discusses the automation of orthoimage generation based on TLS data and digital images. At present, attempts are made to apply modern technologies not only for surveying, but also during data processing. This paper presents attempts at utilising appropriate algorithms and the author's application for automatic generation of the projection plane, needed to acquire intensity orthoimages from the TLS data; such planes are defined manually in the majority of popular TLS data processing applications. A separate issue related to RGB image generation is the orientation of digital images in relation to scans, which matters in particular when scans and photographs are not taken simultaneously. The paper presents experiments concerning the use of the SIFT algorithm for automatic matching of the intensity orthoimages and the digital (RGB) photographs. Satisfactory results have been obtained, both for the automation of the process and for the quality of the resulting orthoimages.
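
    For the matching step, a minimal SIFT sketch with OpenCV (synthetic images stand in for the intensity orthoimage and RGB photograph; assumes OpenCV >= 4.4, where SIFT lives in the main package):

        import cv2
        import numpy as np

        # synthetic stand-ins; in practice these are the intensity orthoimage
        # and the RGB photograph converted to grayscale
        rng = np.random.default_rng(0)
        img1 = (rng.random((400, 400)) * 255).astype(np.uint8)
        img2 = cv2.GaussianBlur(img1, (3, 3), 0)       # slightly altered copy

        sift = cv2.SIFT_create()
        kp1, des1 = sift.detectAndCompute(img1, None)
        kp2, des2 = sift.detectAndCompute(img2, None)

        # Lowe's ratio test on 2-nearest-neighbour descriptor matches
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
                if m.distance < 0.75 * n.distance]
        print(len(good), "putative correspondences")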

  13. Geocoding uncertainty analysis for the automated processing of Sentinel-1 data using Sentinel-1 Toolbox software

    Science.gov (United States)

    Dostálová, Alena; Naeimi, Vahid; Wagner, Wolfgang; Elefante, Stefano; Cao, Senmao; Persson, Henrik

    2016-10-01

    One of the major advantages of Sentinel-1 data is the capability to provide very high spatio-temporal coverage, allowing the mapping of large areas as well as the creation of dense time series of Sentinel-1 acquisitions. The SGRT software developed at TU Wien aims at automated processing of Sentinel-1 data for global and regional products. The first step of the processing consists of geocoding the Sentinel-1 data with the help of the S1TBX software and resampling them to a common grid. These resampled images serve as input for product derivation. Thus, it is very important to select the most reliable processing settings and to assess the geocoding uncertainty for both backscatter and projected local incidence angle images. Within this study, a selection of Sentinel-1 acquisitions over 3 test areas in Europe was processed manually in the S1TBX software, testing multiple software versions, processing settings and digital elevation models (DEM), and the accuracy of the resulting geocoded images was assessed. Secondly, all available Sentinel-1 data over the areas were processed using the selected settings, and a detailed quality check was performed. Overall, a strong influence of the DEM used on the geocoding quality was confirmed, with differences up to 80 m in areas with higher terrain variation. In flat areas, the geocoding accuracy of backscatter images was overall good, with observed shifts between 0 and 30 m. Larger systematic shifts were identified in the case of projected local incidence angle images. These results encourage the automated processing of large volumes of Sentinel-1 data.

  14. Automated Cognitive Health Assessment Using Smart Home Monitoring of Complex Tasks.

    Science.gov (United States)

    Dawadi, Prafulla N; Cook, Diane J; Schmitter-Edgecombe, Maureen

    2013-11-01

    One of the many services that intelligent systems can provide is the automated assessment of resident well-being. We hypothesize that the functional health of individuals, or ability of individuals to perform activities independently without assistance, can be estimated by tracking their activities using smart home technologies. In this paper, we introduce a machine learning-based method for assessing activity quality in smart homes. To validate our approach we quantify activity quality for 179 volunteer participants who performed a complex, interweaved set of activities in our smart home apartment. We observed a statistically significant correlation (r=0.79) between automated assessment of task quality and direct observation scores. Using machine learning techniques to predict the cognitive health of the participants based on task quality is accomplished with an AUC value of 0.64. We believe that this capability is an important step in understanding everyday functional health of individuals in their home environments.

  15. Automated identification of copepods using digital image processing and artificial neural network.

    Science.gov (United States)

    Leow, Lee Kien; Chew, Li-Lee; Chong, Ving Ching; Dhillon, Sarinder Kaur

    2015-01-01

    Copepods are planktonic organisms that play a major role in the marine food chain. Studying the community structure and abundance of copepods in relation to the environment is essential to evaluate their contribution to mangrove trophodynamics and coastal fisheries. The routine identification of copepods can be very technical, requiring taxonomic expertise, experience and much effort, and can be very time-consuming. Hence, there is an urgent need to introduce novel methods and approaches to automate the identification and classification of copepod specimens. This study aims to apply digital image processing and machine learning methods to build an automated identification and classification technique. We developed an automated technique to extract morphological features of copepod specimens from captured images using digital image processing techniques. An Artificial Neural Network (ANN) was used to classify the copepod specimens from the species Acartia spinicauda, Bestiolina similis, Oithona aruensis, Oithona dissimilis, Oithona simplex, Parvocalanus crassirostris, Tortanus barbatus and Tortanus forcipatus based on the extracted features. 60% of the dataset was used to train a two-layer feed-forward network and the remaining 40% was used as the testing dataset for system evaluation. Our approach demonstrated an overall classification accuracy of 93.13% (100% for A. spinicauda, B. similis and O. aruensis, 95% for T. barbatus, 90% for O. dissimilis and P. crassirostris, 85% for O. simplex and T. forcipatus). The methods presented in this study enable fast classification of copepods to the species level. Future studies should include more classes in the model, improve the selection of features, and reduce the time needed to capture the copepod images.
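
    A compact sketch of the classification stage using a one-hidden-layer feed-forward network (synthetic features stand in for the measured morphology; scikit-learn's MLP is used here rather than the authors' original implementation):

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        # hypothetical morphological feature vectors for 8 copepod species
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(loc=k, scale=0.6, size=(30, 7)) for k in range(8)])
        y = np.repeat(np.arange(8), 30)

        # 60/40 split as in the study; a two-layer feed-forward network
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.6,
                                                  stratify=y, random_state=1)
        ann = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=1)
        ann.fit(X_tr, y_tr)
        print("test accuracy:", ann.score(X_te, y_te))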

  16. An algorithm for automated layout of process description maps drawn in SBGN.

    Science.gov (United States)

    Genc, Begum; Dogrusoz, Ugur

    2016-01-01

    Evolving technology has increased the focus on genomics. The combination of today's advanced techniques with decades of molecular biology research has yielded huge amounts of pathway data. A standard named the Systems Biology Graphical Notation (SBGN) was recently introduced to allow scientists to represent biological pathways in an unambiguous, easy-to-understand and efficient manner. Although there are a number of automated layout algorithms for various types of biological networks, currently none specializes in process description (PD) maps as defined by SBGN. We propose a new automated layout algorithm for PD maps drawn in SBGN. Our algorithm is based on a force-directed automated layout algorithm called Compound Spring Embedder (CoSE). On top of the existing force scheme, additional heuristics employing new types of forces and movement rules are defined to address SBGN-specific rules. Our algorithm is the only automatic layout algorithm that properly addresses all SBGN rules for drawing PD maps, including placement of substrates and products of process nodes on opposite sides, compact tiling of members of molecular complexes, and extensive use of nested structures (compound nodes) to properly draw cellular locations and molecular complex structures. As demonstrated experimentally, the algorithm yields significant improvements over a generic layout algorithm such as CoSE in addressing SBGN rules on top of commonly accepted graph drawing criteria. An implementation of our algorithm in Java is available within the ChiLay library (https://github.com/iVis-at-Bilkent/chilay). Contact: ugur@cs.bilkent.edu.tr or dogrusoz@cbio.mskcc.org. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
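
    To make the force scheme concrete, the sketch below implements a bare spring-embedder iteration (inverse-square repulsion between all nodes plus spring attraction along edges); CoSE and the proposed algorithm layer SBGN-specific forces and movement rules on top of something like this. The graph and constants are illustrative:

        import numpy as np

        def spring_layout(adj, n_iter=200, k=1.0, step=0.05, seed=0):
            """Minimal force-directed (spring-embedder) layout sketch."""
            n = len(adj)
            pos = np.random.default_rng(seed).uniform(-1, 1, (n, 2))
            for _ in range(n_iter):
                disp = np.zeros_like(pos)
                for i in range(n):
                    d = pos[i] - pos                  # vectors from every node to i
                    dist = np.maximum(np.linalg.norm(d, axis=1), 1e-9)
                    # repulsion from all other nodes
                    disp[i] += ((k * k / dist**2)[:, None] * d).sum(axis=0)
                    for j in adj[i]:                  # spring attraction along edges
                        delta = pos[j] - pos[i]
                        disp[i] += (np.linalg.norm(delta) / k) * delta
                norms = np.maximum(np.linalg.norm(disp, axis=1, keepdims=True), 1e-9)
                pos += step * disp / norms            # capped displacement per step
            return pos

        print(spring_layout({0: [1], 1: [0, 2], 2: [1]}))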

  17. Knowledge management and process monitoring of pharmaceutical processes in the quality by design paradigm.

    Science.gov (United States)

    Rathore, Anurag S; Bansal, Anshuman; Hans, Jaspinder

    2013-01-01

    Pharmaceutical processes are complex and highly variable in nature. The complexity and variability associated with these processes result in inconsistent and sometimes unpredictable process outcomes. To deal with the complexity and understand the causes of variability in these processes, in-depth knowledge and thorough understanding of the process and the various factors affecting the process performance become critical. This makes knowledge management and process monitoring an indispensable part of the process improvement efforts for any pharmaceutical organization.

  18. Enhanced way of securing automated teller machine to track the misusers using secure monitor tracking analysis

    Science.gov (United States)

    Sadhasivam, Jayakumar; Alamelu, M.; Radhika, R.; Ramya, S.; Dharani, K.; Jayavel, Senthil

    2017-11-01

    Nowadays, people's use of Automated Teller Machines (ATMs) has been increasing, even in rural areas. At present, the security provided by banks is the ATM PIN. Hackers can easily identify the PIN and withdraw money if they have stolen the ATM card; ATMs are also broken open and the money stolen. To overcome these disadvantages, we propose an approach, the "Automated Secure Tracking System", to secure ATMs and track misuse. In this approach, when the bank account is created, the bank scans the customer's iris (including the position and movement of the eye) and fingerprint, identified with the shortest measurements. When the card is swiped, the ATM requests the PIN, scans the iris, recognizes the fingerprint, and then allows the customer to withdraw money. If somebody tries to break open the ATM, an alert message is sent to the nearby police station and the ATM shutter is automatically closed. This helps in stopping hackers who withdraw money using stolen ATM cards, and also helps the government identify criminals more easily.

  19. Implementation of a novel postoperative monitoring system using automated Modified Early Warning Scores (MEWS) incorporating end-tidal capnography.

    Science.gov (United States)

    Blankush, Joseph M; Freeman, Robbie; McIlvaine, Joy; Tran, Trung; Nassani, Stephen; Leitman, I Michael

    2017-10-01

    Modified Early Warning Scores (MEWS) provide real-time vital sign (VS) trending and reduce ICU admissions in post-operative patients. These early-warning calculations classically incorporate oxygen saturation, heart rate, respiratory rate, systolic blood pressure, and temperature, but have not previously included end-tidal CO2 (EtCO2), more recently identified as an independent predictor of critical illness. These systems may be subject to failure when physiologic data are incorrectly measured, leading to false alarms and increased workload. This study investigates whether automated devices that provide ongoing vital sign monitoring and MEWS calculations, inclusive of a score for EtCO2, can be feasibly implemented on the general-care hospital floor and effectively identify derangements in a post-operative patient's condition, while limiting the number of false alarms that would otherwise increase provider workload. From July to November 2014, post-operative patients meeting the inclusion criteria (BMI > 30 kg/m2, history of obstructive sleep apnea, or the use of patient-controlled analgesia (PCA) or epidural narcotics) were monitored using automated devices that record minute-by-minute VS included in classic MEWS calculations, as well as EtCO2. Automated messages were sent via pager to providers when the device measured an elevated MEWS, abnormal EtCO2, or oxygen desaturation below 85%. Data, including alarm and message details from the first 133 patients, were recorded and analyzed. Overall, 3.3 alarms and pages sounded per hour of monitoring. Device-only alarms sounded 2.7 times per hour; 21% were technical alarms. The remaining device-only alarms for concerning VS sounded 2.0/h, 70% for falsely recorded VS. Pages for abnormal EtCO2 sounded 0.4/h (82% false recordings), while pages for low blood oxygen saturation sounded 0.1/h (55% false alarms). 143 times (0.1 pages/h) the devices calculated
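
    As a toy illustration of an EtCO2-augmented early-warning calculation (the score bands below are invented for the example and are not the study's validated thresholds):

        def mews_with_etco2(hr, rr, sbp, temp_c, spo2, etco2_mmhg):
            """Toy MEWS-like score with an added EtCO2 component; all banding
            is illustrative, not the study's clinical thresholds."""
            score = 0
            score += 2 if hr < 40 or hr > 130 else (1 if hr > 110 else 0)
            score += 2 if rr < 9 or rr > 25 else (1 if rr > 20 else 0)
            score += 2 if sbp < 90 else (1 if sbp < 100 else 0)
            score += 1 if temp_c < 35.0 or temp_c > 38.5 else 0
            score += 2 if spo2 < 90 else (1 if spo2 < 94 else 0)
            score += 2 if etco2_mmhg < 25 or etco2_mmhg > 60 else 0  # hypoventilation flag
            return score

        # e.g. page the covering provider when the score crosses a threshold
        if mews_with_etco2(118, 22, 95, 37.1, 92, 28) >= 4:
            print("page covering provider")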

  20. An automated and integrated framework for dust storm detection based on OGC web processing services

    Science.gov (United States)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climate modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is compounded by advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines geo-processing frameworks, scientific models and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Services (WPS) initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, a dust storm detecting and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detecting and tracking component combines three earth science models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A severe dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data

  1. Analysis of irradiated U-7wt%Mo dispersion fuel microstructures using automated image processing

    Energy Technology Data Exchange (ETDEWEB)

    Collette, R. [Colorado School of Mines, Nuclear Science and Engineering Program, 1500 Illinois St, Golden, CO 80401 (United States); King, J., E-mail: kingjc@mines.edu [Colorado School of Mines, Nuclear Science and Engineering Program, 1500 Illinois St, Golden, CO 80401 (United States); Buesch, C. [Oregon State University, 1500 SW Jefferson St., Corvallis, OR 97331 (United States); Keiser, D.D.; Williams, W.; Miller, B.D.; Schulthess, J. [Nuclear Fuels and Materials Division, Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-6188 (United States)

    2016-07-15

    The High Performance Research Reactor Fuel Development (HPPRFD) program is responsible for developing low enriched uranium (LEU) fuel substitutes for high performance reactors fueled with highly enriched uranium (HEU) that have not yet been converted to LEU. The uranium-molybdenum (U-Mo) fuel system was selected for this effort. In this study, fission gas pore segmentation was performed on U-7wt%Mo dispersion fuel samples at three separate fission densities using an automated image processing interface developed in MATLAB. Pore size distributions were obtained that showed both expected and unexpected fission gas behavior. In general, it proved challenging to identify any dominant trends when comparing fission bubble data across samples from different fuel plates, due to varying compositions and fabrication techniques. The results exhibited fair agreement with the fission density vs. porosity correlation developed by the Russian reactor conversion program. - Highlights: • Automated image processing is used to extract fission gas bubble data from irradiated U-Mo fuel samples. • Verification and validation tests are performed to ensure the algorithm's accuracy. • Fission bubble parameters are predictably difficult to compare across samples of varying compositions. • The 2-D results suggest the need for more homogenized fuel sampling in future studies. • The results also demonstrate the value of 3-D reconstruction techniques.
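
    The study's segmentation interface was written in MATLAB; as a hedged Python analogue of the generic steps (threshold, clean, label, measure), one might write:

        import numpy as np
        from skimage import filters, measure, morphology

        def pore_size_distribution(image):
            """Sketch of fission-gas pore segmentation; assumes pores appear
            dark against a brighter fuel matrix."""
            thresh = filters.threshold_otsu(image)            # global threshold
            pores = image < thresh
            pores = morphology.remove_small_objects(pores, min_size=5)
            labels = measure.label(pores)
            return np.array([r.equivalent_diameter
                             for r in measure.regionprops(labels)])

        # synthetic micrograph: bright matrix with a few dark circular "pores"
        img = np.full((200, 200), 200, dtype=np.uint8)
        yy, xx = np.mgrid[:200, :200]
        for cy, cx, r in [(50, 60, 8), (120, 150, 5), (160, 40, 12)]:
            img[(yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2] = 40
        print(pore_size_distribution(img))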

  2. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    Sawant, Pramilla D.; Prabhu, Supreetha P.; Suja, A.; Wankhede, Sonal; Chaudhary, Seema; Rao, D.D.; Pradeepkumar, K.S.; Das, A.P.; Badodkar, B.D.

    2014-01-01

    The procedure for initial handling of fecal samples at the Bioassay Lab., Trombay, is as follows: overnight fecal samples are collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receiving the sample, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is then lifted out of the container using a pair of tongs, placed inside a crucible, and ashed inside a muffle furnace at 450 °C. After complete ashing, the crucible containing white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system intended to automate this initial handling of fecal samples. The system, once developed, will help eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples.

  3. Computational testing for automated preprocessing: a Matlab toolbox to enable large scale electroencephalography data processing

    Directory of Open Access Journals (Sweden)

    Benjamin U. Cowley

    2017-03-01

    Full Text Available Electroencephalography (EEG) is a rich source of information regarding brain function. However, the preprocessing of EEG data can be quite complicated, due to several factors. For example, the distinction between true neural sources and noise is indeterminate, and EEG data can be very large. These factors create a large number of subjective decisions, with a consequent risk of compound error. Existing tools present the experimenter with a large choice of analysis methods. Yet it remains a challenge for the researcher to integrate methods for batch processing of large datasets, and to compare methods in order to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artefacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularise EEG preprocessing, and thus reduce human effort, subjectivity and consequent error. We present the computational testing for automated preprocessing (CTAP) toolbox, to facilitate: (i) batch processing that is easy for experts and novices alike; (ii) testing and manual comparison of preprocessing methods. CTAP extends the existing data structure and functions from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under the MIT licence from https://github.com/bwrc/ctap.

  4. A Noble Approach of Process Automation in Galvanized Nut, Bolt Manufacturing Industry

    Directory of Open Access Journals (Sweden)

    Akash Samanta

    2012-05-01

    Full Text Available "Corrosion costs money": the Battelle Columbus Institute estimates that corrosion costs Americans more than $220 billion annually, about 4.3% of the gross national product [1]. Nowadays, due to increasing pollution, the rate of corrosion is also rising day by day, especially in India; so, to save steel structures, galvanizing is the best and simplest solution. For this reason, galvanizing industries have been growing since the mid-1700s. Galvanizing is a controlled metallurgical combination of zinc and steel that can provide corrosion resistance in a wide variety of environments; in fact, the corrosion resistance of galvanized metal can be some 70 to 80 times greater than that of the base metal. Keeping in mind the importance of this industry, a novel approach to process automation in a galvanized nut-bolt manufacturing plant is presented here, as nuts and bolts are the prime ingredients of any structure. In this paper the main objectives of any industry, such as survival, profit maximization, profit satisfying and sales growth, are addressed. Furthermore, environmental aspects, i.e. pollution control and energy saving, are also considered. The whole automation process is implemented using a programmable logic controller (PLC), which has a number of unique advantages: it is faster, more reliable, requires less maintenance and is reprogrammable. The whole system has been designed and tested using a GE Fanuc PLC.

  5. Innovative strategy for effective critical laboratory result management: end-to-end process using automation and manual call centre.

    Science.gov (United States)

    Ti, Lian Kah; Ang, Sophia Bee Leng; Saw, Sharon; Sethi, Sunil Kumar; Yip, James W L

    2012-08-01

    Timely reporting and acknowledgement are crucial steps in critical laboratory results (CLR) management. The authors previously showed that an automated pathway incorporating short messaging system (SMS) texts, auto-escalation, and manual telephone back-up improved the rate and speed of physician acknowledgement compared with manual telephone calling alone. This study investigated whether it also improved the rate and speed of physician intervention on CLR, and whether utilising the manual back-up affected intervention rates. Data from seven audits between November 2007 and January 2011 were analysed. These audits were carried out to assess the robustness of the CLR reporting process in the authors' institution. Comparisons were made in the rate and speed of acknowledgement and intervention between the audits performed before and after automation. Using the automation audits, the authors compared intervention data between communication with SMS only and cases where manual intervention was required. 1680 CLR were reported during the audit periods. Automation improved the acknowledgement rate (100% vs 84.2%). In the automation audits, the use of SMS only did not improve physician intervention rates. The automated communication pathway improved physician intervention rate and time, in tandem with improved acknowledgement rate and time, when compared with manual telephone calling. The use of manual intervention to augment automation did not adversely affect physician intervention rate, implying that an end-to-end pathway was more important than automation alone.
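
    A schematic of such an SMS-first, auto-escalating pathway with a manual call-centre back-up might look like the following (function names, retry counts and timeouts are all illustrative, not the institution's actual workflow):

        import random
        import time

        def send_sms(physician, message):
            print(f"SMS to {physician}: {message}")

        def wait_for_ack(timeout_s):
            time.sleep(0)                    # placeholder for polling the ack database
            return random.random() < 0.8     # simulated acknowledgement

        def call_centre(physician, message):
            print(f"call centre phones {physician}: {message}")

        def notify_clr(physician, message, max_sms=2, ack_timeout_s=600):
            """SMS first, auto-escalate on silence, manual telephone back-up."""
            for _ in range(max_sms):
                send_sms(physician, message)
                if wait_for_ack(ack_timeout_s):
                    return "acknowledged via SMS"
            call_centre(physician, message)  # manual back-up closes the loop
            return "escalated to call centre"

        print(notify_clr("Dr. Tan", "K+ 6.8 mmol/L"))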

  6. Implementation and impact of an automated group monitoring and feedback system to promote hand hygiene among health care personnel.

    Science.gov (United States)

    Conway, Laurie J; Riley, Linda; Saiman, Lisa; Cohen, Bevin; Alper, Paul; Larson, Elaine L

    2014-09-01

    Despite substantial evidence supporting the effectiveness of hand hygiene for preventing health care-associated infections, hand hygiene practice is often inadequate. Hand hygiene product dispensers that can electronically capture hand hygiene events have the potential to improve hand hygiene performance. An automated group monitoring and feedback system was implemented from January 2012 through March 2013 at a 140-bed community hospital. An electronic system that monitors the use of sanitizer and soap, but does not identify individual health care personnel, was used to calculate hand hygiene events per patient-hour for each of eight inpatient units and hand hygiene events per patient-visit for the six outpatient units. Hand hygiene was monitored but feedback was not provided during a six-month baseline period and a three-month rollout period. During the rollout, focus groups were conducted to determine preferences for feedback frequency and format. During the six-month intervention period, graphical reports were e-mailed monthly to all managers and administrators, and the focus groups were repeated. After the feedback began, hand hygiene increased on average by 0.17 events/patient-hour in inpatient units (interquartile range = 0.14, p = .008). In outpatient units, hand hygiene performance did not change significantly. A variety of challenges were encountered, including obtaining accurate census and staffing data, engendering confidence in the system, disseminating the information in the reports, and using the data to drive improvement. Feedback via an automated system was associated with improved hand hygiene performance in the short term.

  7. Process monitoring using a Quality and Technical Surveillance Program

    International Nuclear Information System (INIS)

    Rafferty, C.A.

    1995-01-01

    The purpose of process monitoring using a Quality and Technical Surveillance Program was to help ensure that manufactured clad vent sets fully met the technical and quality requirements established by the manufacturer and the customer, and that line and program management were immediately alerted if any aspect of the manufacturing activities drifted out of acceptable limits. The Quality and Technical Surveillance Program provided a planned, scheduled approach to monitoring key processes and documentation, illuminating potential problem areas early enough to permit timely corrective actions to reverse negative trends that, if left uncorrected, could have resulted in deficient hardware. Significant schedule and cost impacts were thereby eliminated.

  8. Structural health monitoring an advanced signal processing perspective

    CERN Document Server

    Chen, Xuefeng; Mukhopadhyay, Subhas

    2017-01-01

    This book highlights the latest advances and trends in advanced signal processing (such as wavelet theory, time-frequency analysis, empirical mode decomposition, compressive sensing and sparse representation, and stochastic resonance) for structural health monitoring (SHM). Its primary focus is on the utilization of advanced signal processing techniques to help monitor the health status of critical structures and machines encountered in our daily lives: wind turbines, gas turbines, machine tools, etc. As such, it offers a key reference guide for researchers, graduate students, and industry professionals who work in the field of SHM.

  9. Thermal monitoring of the thermoplastic injection molding process with FBGs

    Science.gov (United States)

    Alberto, Nélia J.; Nogueira, Rogério N.; Neto, Victor F.

    2014-08-01

    Injection molding is an important polymer processing method for manufacturing plastic components. In this work, thermal monitoring of thermoplastic injection molding is presented, since temperature is a critical parameter influencing the process. A set of fiber Bragg gratings was multiplexed, aiming at two-dimensional monitoring of the mold. The results allowed the different stages of the thermoplastic molding cycle to be identified. Additionally, the data provide information about the heat transfer phenomena, an important issue for the thermoplastic injection sector and thus for the endless number of applications that employ this type of material.

  10. FT-NIR: A Tool for Process Monitoring and More.

    Science.gov (United States)

    Martoccia, Domenico; Lutz, Holger; Cohen, Yvan; Jerphagnon, Thomas; Jenelten, Urban

    2018-03-30

    With ever-increasing pressure to optimize product quality, reduce cost and safely increase production output from existing assets, combined with regular changes in feedstock and operational targets, process monitoring with traditional instruments reaches its limits. One promising answer to these challenges is in-line, real-time process analysis with spectroscopic instruments, above all Fourier-transform near-infrared (FT-NIR) spectroscopy. Its potential to afford decreased batch cycle times, higher yields, reduced rework and minimized batch variance is presented, and application examples in the field of fine chemicals are given. We demonstrate that FT-NIR can be an efficient tool for improved process monitoring and optimization, effective process design and advanced process control.
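
    In practice, in-line FT-NIR spectra are commonly converted to analyte concentrations through a chemometric calibration such as partial least squares regression. A minimal sketch with synthetic spectra (the wavelength grid, sample count and component count are all assumptions):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # hypothetical calibration set: NIR absorbance spectra -> concentration
        rng = np.random.default_rng(3)
        spectra = rng.normal(size=(80, 500))          # 80 samples x 500 channels
        conc = spectra[:, 100] * 2.0 + rng.normal(scale=0.05, size=80)

        pls = PLSRegression(n_components=5).fit(spectra, conc)
        print("calibration R^2:", pls.score(spectra, conc))
        # in production, each new in-line spectrum yields a real-time estimate:
        # pls.predict(new_spectrum.reshape(1, -1))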

  11. Monitoring and analysis of air emissions based on condition models derived from process history

    Directory of Open Access Journals (Sweden)

    M. Liukkonen

    2016-12-01

    Full Text Available Evaluation of online information on operating conditions is necessary when reducing air emissions in energy plants. In this respect, automated monitoring and control are of primary concern, particularly in biomass combustion. As the monitoring of emissions in power plants becomes ever more challenging because of low-grade fuels and fuel mixtures, new monitoring applications are needed to extract essential information from the large amount of measurement data. The management of emissions in energy boilers lacks economically efficient, fast, and competent computational systems that could support decision-making regarding the improvement of emission efficiency. In this paper, a novel emission monitoring platform based on the self-organizing map method is presented. The system is capable not only of visualizing the prevailing status of the process and detecting problem situations (i.e., increased emission release rates), but also of analyzing these situations automatically and presenting factors potentially affecting them. The system is demonstrated using measurement data from an industrial circulating fluidized bed boiler fired by forest residue as the primary fuel and coal as the supporting fuel.
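
    As an illustration of the self-organizing-map idea, the sketch below trains a SOM on (synthetic) process history and uses the quantization error of a new sample as a novelty signal; the third-party 'minisom' package and all variable choices are assumptions, not the paper's implementation:

        import numpy as np
        from minisom import MiniSom   # third-party 'minisom' package (assumed available)

        # hypothetical process history: rows = time stamps, columns = scaled
        # boiler measurements (O2, bed temperature, fuel feed, NOx, SO2, ...)
        rng = np.random.default_rng(2)
        history = rng.normal(size=(1000, 6))

        som = MiniSom(10, 10, history.shape[1], sigma=1.5,
                      learning_rate=0.5, random_seed=2)
        som.train_random(history, 5000)

        # online monitoring: a large quantization error means the operating
        # point lies outside the conditions seen in the training history
        new_sample = rng.normal(size=6)
        q_err = np.linalg.norm(new_sample - som.quantization(new_sample.reshape(1, -1))[0])
        print("quantization error:", q_err)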

  12. Process development for automated solar cell and module production. Task 4. Automated array assembly. Quarterly report No. 1

    Energy Technology Data Exchange (ETDEWEB)

    Hagerty, J. J.

    1980-10-15

    Work has been divided into five phases. The first phase is to modify existing hardware and controlling computer software to: (1) improve cell-to-cell placement accuracy, (2) improve the solder joint while reducing the amount of solder and flux smear on the cell's surface, and (3) reduce the system cycle time to 10 seconds. The second phase involves expanding the existing system's capabilities to be able to reject broken cells and make post-solder electrical tests. Phase 3 involves developing new hardware to allow for the automated encapsulation of solar modules. This involves three discrete pieces of hardware: (1) a vacuum platen end effector for the robot which allows it to pick up the 1' x 4' array of 35 inter-connected cells. With this, it can also pick up the cover glass and completed module, (2) a lamination preparation station which cuts the various encapsulation components from roll storage and positions them for encapsulation, and (3) an automated encapsulation chamber which interfaces with the above two and applies the heat and vacuum to cure the encapsulants. Phase 4 involves the final assembly of the encapsulated array into a framed, edge-sealed module completed for installation. For this we are using MBA's Glass Reinforced Concrete (GRC) in panels such as those developed by MBA for JPL under contract No. 955281. The GRC panel plays the multiple role of edge frame, substrate and mounting structure. An automated method of applying the edge seal will also be developed. The final phase (5) is the fabrication of six 1' x 4' electrically active solar modules using the above developed equipment. Progress is reported. (WHK)

  13. An alarm filtering system for an automated process: a multiple-agent approach

    International Nuclear Information System (INIS)

    Khoualdi, Kamel

    1994-01-01

    Nowadays, the supervision of industrial installations is more and more complex, involving the automation of their control. A malfunction generates an avalanche of alarms. The operator in charge of the supervision must face the incident and execute the right actions to recover a normal situation, but is generally drowned under the great number of alarms. Our aim, within the framework of our research, is to build an alarm filtering system for an automated metro line, to help the operator find the main alarm responsible for the malfunction. Our work is divided into two parts, both dealing with the study and development of an alarm filtering system, but using two different approaches. The first part was developed in the framework of the SARA project (an operator assistance system for an automated metro line), an expert system prototype helping the operators of a command center. In this part, a centralized approach was used, representing the events with a single event graph and using a global procedure to perform diagnosis. This approach showed its limits. In the second part of our work, we considered distributed artificial intelligence (DAI) techniques, and more especially the multi-agent approach. The multi-agent approach is motivated by the natural distribution of the metro line equipment and by the fact that each piece of equipment has its own local control and knowledge; each piece of equipment is therefore considered an autonomous agent. Through agent cooperation, the system is able to determine the main alarm and the faulty equipment responsible for the incident. A prototype, written in SPIRAL (a tool for knowledge-based systems), is running on a workstation. This prototype has allowed the concretization and validation of our multi-agent approach. (author) [fr]
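
    The following toy sketch illustrates the cooperation idea: each equipment agent knows its upstream dependencies, and an alarm is reported as the main alarm only when no alarmed upstream agent can explain it (agents, topology and rule are illustrative, not the SARA design):

        class EquipmentAgent:
            """Toy agent: knows which upstream equipment can explain its alarm."""
            def __init__(self, name, upstream=()):
                self.name, self.upstream, self.alarmed = name, list(upstream), False

        def root_alarms(agents):
            # an alarm is "main" if no alarmed upstream agent can explain it
            active = {a.name for a in agents if a.alarmed}
            return [a.name for a in agents if a.alarmed
                    and not any(u in active for u in a.upstream)]

        power = EquipmentAgent("traction_power")
        signal = EquipmentAgent("signalling", upstream=["traction_power"])
        train = EquipmentAgent("train_12", upstream=["signalling"])
        power.alarmed = signal.alarmed = train.alarmed = True
        print(root_alarms([power, signal, train]))   # -> ['traction_power']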

  14. Use of the module of functional operator in designing the standard means for data processing in the automated control system of NPP construction

    International Nuclear Information System (INIS)

    Lyanko, S.D.; Teslya, Yu.N.

    1988-01-01

    Problems in introducing automated control systems into the process of NPP construction are discussed. The notion of an information medium and the structure of the automated system software are considered. The automated system of information service for management departments introduced at the South Ukrainian NPP is described. The block structure of the system's program modules permitted increasing its efficiency and simplified further development.

  15. Fluorescence based real time monitoring of fouling in process chromatography

    Science.gov (United States)

    Pathak, Mili; Lintern, Katherine; Chopda, Viki; Bracewell, Daniel G.; Rathore, Anurag S.

    2017-01-01

    An approach for real-time monitoring of fouling in liquid chromatography is presented. The versatility of the approach has been proven by successful implementation in three case studies. The first involves monitoring of protein A ligand density and foulant concentration for assessing the performance of protein A chromatography resin during purification of monoclonal antibodies; the observations are supported by independently performed LC-MS/MS studies. The second application involves monitoring of foulant deposition during multimode cation exchange chromatography based purification of human serum albumin. Finally, in the third application, monitoring of foulants during multimodal hydrophobic interaction chromatography of recombinant human granulocyte colony stimulating factor is demonstrated. In all three cases, it is observed that the fluorescence intensity consistently increases with resin reuse as more foulants are deposited over time. The proposed approach can be readily used for real-time monitoring of fouling and process control. PMID:28358349

  16. Computer-controlled radiochemical synthesis: a chemistry process control unit for the automated production of radiochemicals

    Energy Technology Data Exchange (ETDEWEB)

    Padgett, H.C.; Schmidt, D.G.; Luxen, A.; Bida, G.T.; Satyamurthy, N.; Barrio, J.R. (California Univ., Los Angeles, CA (USA). Dept. of Radiology)

    1989-01-01

    A computer-controlled general purpose chemistry process control unit (CPCU) suitable for the automated production of radiochemicals has been developed. This valve-and-tubing synthesis system can be user programmed to accommodate a variety of chemical processes. In a practical demonstration of its utility, the CPCU has been configured and programmed to synthesize 2-deoxy-2-[18F]fluoro-D-glucose (2-[18F]FDG) using aqueous [18F]fluoride ion. The unit has been similarly configured and programmed to synthesize 2-deoxy-2-[18F]fluoro-D-mannose (48% EOB), 3-(2'-[18F]fluoroethyl)spiperone (29% EOB), and [18F]fluoroacetate (66% EOB) from aqueous [18F]fluoride ion, and 2-[18F]FDG from gaseous acetyl hypo[18F]fluorite (20% EOB). (author).

  17. Automated work-flow for processing high-resolution direct infusion electrospray ionization mass spectral fingerprints

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    2007-01-01

    An automated data processing pipeline is presented to compare large numbers of fingerprint spectra from direct infusion experiments analyzed by high-resolution MS. We describe some of the intriguing problems that have to be addressed, from the conversion and pre-processing of the raw data to the final data analysis. Illustrated on the direct infusion analysis (ESI-TOF-MS) of complex mixtures, the method exploits the full quality of the high resolution present in the mass spectra. Although the method is illustrated as a new library search method for high-resolution MS, we demonstrate that the output of the preprocessing is applicable to cluster analysis, discriminant analysis, and related multivariate methods applied directly to mass spectra from direct infusion analysis of crude extracts. This is done to find the relationship between several terverticillate Penicillium species and identify the ions responsible...
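
    One typical pre-processing step for such fingerprints is binning each high-resolution spectrum onto a common m/z axis so that spectra become comparable vectors for library search or cluster analysis. The sketch below illustrates that step only; bin width, mass range, and the toy spectra are invented, not the paper's values.

```python
# Hedged sketch: bin direct-infusion spectra onto a shared m/z grid, then
# compare them with a cosine score. Not the authors' actual pipeline.

import numpy as np

def bin_spectrum(mz, intensity, lo=100.0, hi=1000.0, width=0.01):
    edges = np.arange(lo, hi + width, width)
    binned, _ = np.histogram(mz, bins=edges, weights=intensity)
    total = binned.sum()
    return binned / total if total > 0 else binned   # normalise to TIC

# Two toy "spectra" given as (m/z, intensity) arrays:
s1 = bin_spectrum(np.array([256.263, 301.141]), np.array([1e5, 3e4]))
s2 = bin_spectrum(np.array([256.264, 520.340]), np.array([9e4, 2e4]))

# Cosine similarity as a simple spectral match score:
score = s1 @ s2 / (np.linalg.norm(s1) * np.linalg.norm(s2) + 1e-12)
print(f"match score: {score:.3f}")
```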

  18. Instrumentation, Field Network And Process Automation for the LHC Cryogenic Line Tests

    CERN Document Server

    Bager, T; Bertrand, G; Casas-Cubillos, J; Gomes, P; Parente, C; Riddone, G; Suraci, A

    2000-01-01

    This paper describes the cryogenic control system and associated instrumentation of the test facility for 3 pre-series units of the LHC Cryogenic Distribution Line. For each unit, the process automation is based on a Programmable Logic Controller implementing more than 30 closed control loops and handling alarms, interlocks and overall process management. More than 160 sensors and actuators are distributed over 150 m on a Profibus DP/PA network. Parameterization, calibration and diagnosis are remotely available through the bus. Considering the diversity, amount and geographical distribution of the instrumentation involved, this is a representative approach to the cryogenic control system for CERN's next accelerator.
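
    As an illustration of what one such closed control loop does, here is a minimal discrete PID sketch against a toy first-order process; the gains, setpoint and plant dynamics are invented and bear no relation to the actual LHC loops.

```python
# Illustrative discrete PID loop of the kind a PLC runs for each closed
# control loop (e.g. holding a level or pressure setpoint). All numbers
# are invented for the demo.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=1.0)
level = 0.0
for _ in range(20):                       # simple first-order plant response
    u = pid.step(setpoint=50.0, measurement=level)
    level += 0.1 * (u - 0.02 * level)     # toy process dynamics
print(f"level after 20 s: {level:.1f}")
```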

  19. Automated four-dimensional Monte Carlo workflow using log files and real-time motion monitoring

    DEFF Research Database (Denmark)

    Sibolt, Patrik; Cronholm, R.O.; Heath, E.

    2017-01-01

    With emerging techniques for tracking and gating methods in radiotherapy of lung cancer patients, there is an increasing need for efficient four-dimensional Monte Carlo (4DMC) based quality assurance (QA). An automated and flexible workflow for 4DMC QA, based on the 4DdefDOSXYZnrc user code, has been developed in Python. The workflow has been tested and verified using an in-house developed dosimetry system comprised of a dynamic thorax phantom constructed for plastic scintillator dosimetry. The workflow is directly compatible with any treatment planning system and can also be triggered by the appearance of linac log files.

  20. Automated solar cell assembly team process research. Annual subcontract report, 1 January 1993--31 December 1993

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M J; Hogan, S J; Darkazalli, G; Breen, W F; Murach, J M; Sutherland, S F; Patterson, J S [Spire Corp., Bedford, MA (United States)

    1994-06-01

    This report describes work done under the Photovoltaic Manufacturing Technology (PVMaT) project, Phase 3A, which addresses problems that are generic to the photovoltaic (PV) industry. Spire's objective during Phase 3A was to use its light soldering technology and experience to design and fabricate solar cell tabbing and interconnecting equipment to develop new, high-yield, high-throughput, fully automated processes for tabbing and interconnecting thin cells. Areas that were addressed include processing rates, process control, yield, throughput, material utilization efficiency, and increased use of automation. Spire teamed with Solec International, a PV module manufacturer, and the University of Massachusetts at Lowell's Center for Productivity Enhancement (CPE), automation specialists, who are lower-tier subcontractors. A number of other PV manufacturers, including Siemens Solar, Mobil Solar, Solar Web, and Texas Instruments, agreed to evaluate the processes developed under this program.

  1. Process control monitoring systems, industrial plants, and process control monitoring methods

    Science.gov (United States)

    Skorpik, James R [Kennewick, WA; Gosselin, Stephen R [Richland, WA; Harris, Joe C [Kennewick, WA

    2010-09-07

    A system comprises a valve; a plurality of RFID sensor assemblies coupled to the valve to monitor a plurality of parameters associated with the valve; a control tag configured to wirelessly communicate with the respective tags that are coupled to the valve, the control tag being further configured to communicate with an RF reader; and an RF reader configured to selectively communicate with the control tag, the reader including an RF receiver. Other systems and methods are also provided.

  2. Acoustic monitoring of a fluidized bed coating process

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Veski, Peep; Pedersen, Joan G.

    2007-01-01

    The aim of the study was to investigate the potential of acoustic monitoring of a production-scale fluidized bed coating process. The correlations between sensor signals and the estimated amount of film applied and the percentage release, respectively, were investigated in coating potassium chloride (KCl) crystals with ethylcellulose (EC). Vibrations were measured with two different types of accelerometers. Different positions for placing the accelerometers and two different product containers were included in the study. Top spray coating of KCl was chosen as a 'worst case' scenario from a coating point of view. Acoustic monitoring has the potential of summarising the commonly used means of monitoring the coating process. The best partial least squares (PLS) regressions, obtained with the high-frequency accelerometer, showed for the release a correlation coefficient of 0.92 and a root mean...
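
    The chemometric step described above can be sketched as regressing a coating response (e.g. percentage release) on vibration features with partial least squares. The data below are synthetic stand-ins for accelerometer power-spectrum features; scikit-learn's PLSRegression is assumed available.

```python
# Minimal PLS sketch: relate spectral vibration features to a release value.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 200))            # 40 batches x 200 spectral features
y = X[:, :5].sum(axis=1) + 0.1 * rng.normal(size=40)   # hidden relationship

pls = PLSRegression(n_components=3)
pls.fit(X, y)
r = np.corrcoef(y, pls.predict(X).ravel())[0, 1]
print(f"correlation coefficient: {r:.2f}")   # cf. the 0.92 reported above
```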

  3. Interpreting complex data by methods of recognition and classification in an automated system of aerogeophysical material processing

    Energy Technology Data Exchange (ETDEWEB)

    Koval', L.A.; Dolgov, S.V.; Liokumovich, G.B.; Ovcharenko, A.V.; Priyezzhev, I.I.

    1984-01-01

    The ASOM-AGS/YeS system for automated processing of aerogeophysical data supports integrated interpretation of multichannel measurements. Algorithms of factor analysis and automatic classification are used, together with an apparatus of a priori specified (selected) decision rules. The areas over which these procedures act can be initially limited using the available geological information. The possibilities of the method are demonstrated by the results of automated processing of airborne gamma-spectrometric measurements in the region of a known porphyry copper occurrence in Kazakhstan. After processing by the method of principal components, this ore deposit was clearly marked by a composite halo of independent factors: U (strong increase), Th (noticeable increase), K (decrease).

  4. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide the library with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual efforts in library routines and supports collection, storage, administration, processing, preservation and communication.

  5. High-throughput microfluidics automated cytogenetic processing for effectively lowering biological process time and aid triage during radiation accidents

    International Nuclear Information System (INIS)

    Ramakumar, Adarsh

    2016-01-01

    Nuclear or radiation mass casualties require individual, rapid, and accurate dose-based triage of exposed subjects for cytokine therapy and supportive care, to save lives. Radiation mass casualties will demand high-throughput individual diagnostic dose assessment for medical management of exposed subjects. Cytogenetic techniques are widely used for triage and definitive radiation biodosimetry. A prototype platform has been developed to demonstrate high-throughput microfluidic micro-incubation, supporting the logistics of moving samples in miniaturized incubators from the site of an accident to analytical labs. Efforts have been made, both at the level of concept development and of advanced systems, towards higher throughput in sample processing and towards better and more efficient logistics leading to lab-on-chip analyses. An automated high-throughput platform with automated feature extraction, storage, cross-platform data linkage, cross-platform validation and inclusion of multi-parametric biomarker approaches will provide the first generation of high-throughput platform systems for effective medical management, particularly during radiation mass casualty events.

  6. Condition Monitoring of a Process Filter Applying Wireless Vibration Analysis

    Directory of Open Access Journals (Sweden)

    Pekka KOSKELA

    2011-05-01

    This paper presents a novel wireless vibration-based method for monitoring the degree of feed filter clogging. In the process industry, these filters are applied to prevent impurities from entering the process. During operation, the filters gradually become clogged, decreasing the feed flow and, in the worst case, preventing it entirely. The cleaning of the filter should therefore be carried out predictively in order to avoid equipment damage and unnecessary process downtime. The degree of clogging is estimated by first calculating time-domain indices from low-frequency accelerometer samples and then taking the median of the processed values. Nine different statistical quantities are compared based on estimation accuracy and on criteria for operating in resource-constrained environments, with a particular focus on energy efficiency. The initial results show that the method is able to detect the degree of clogging, and the approach may be applicable to filter clogging monitoring.
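
    The estimation scheme reads directly as code: compute time-domain indices per accelerometer frame, then take the median as the clogging indicator. The indices shown (RMS, kurtosis, crest factor) are common choices for illustration; the paper itself compares nine quantities, and the signal here is synthetic.

```python
# Sketch of the clogging estimator: per-frame indices, then a median.
import numpy as np

def frame_indices(x):
    rms = np.sqrt(np.mean(x**2))
    kurt = np.mean((x - x.mean())**4) / (x.std()**4 + 1e-12)
    crest = np.max(np.abs(x)) / (rms + 1e-12)
    return rms, kurt, crest

rng = np.random.default_rng(1)
# Ten synthetic frames with slowly growing vibration amplitude:
frames = [rng.normal(scale=1.0 + 0.05 * k, size=512) for k in range(10)]
rms_values = [frame_indices(f)[0] for f in frames]
clogging_indicator = float(np.median(rms_values))
print(f"median RMS over frames: {clogging_indicator:.3f}")
```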

  7. Monitoring and control of fine abrasive finishing processes

    DEFF Research Database (Denmark)

    Lazarev, Ruslan

    Signals were analysed in the time-frequency domain and specific process features were extracted in relation to machining parameters and processed-surface properties. Development and research of the process monitoring was done against the background of the evaluation of surface roughness parameters. The characterization of surface topography is an essential part of the engineering of new tools and machine elements. Therefore, the generation of surface texture should be indirectly monitored, and the machining parameters should be adjusted appropriately. Based on the evaluation of the surface parameters, the polishing process was segmented using discretization methods. The applied methodology was proposed for implementation as an on-line system and is considered to be part of the next generation of the STRECON NanoRAP machine.

  8. Monitoring sodium in commercially processed foods from stores and restaurants

    Science.gov (United States)

    Most of the sodium we eat comes from commercially processed foods from stores and restaurants. Sodium reduction in these foods is a key component of several recent public health efforts. Agricultural Research Service (ARS) of USDA, CDC and FDA have launched a collaborative program to monitor sodium ...

  9. Forest Service National Visitor Use Monitoring Process: Research Method Documentation

    Science.gov (United States)

    Donald B.K. English; Susan M. Kocis; Stanley J. Zarnoch; J. Ross Arnold

    2002-01-01

    In response to the need for improved information on recreational use of National Forest System lands, the authors have developed a nationwide, systematic monitoring process. This report documents the methods they used in estimating recreational use on an annual basis. The basic unit of measure is the volume of visitors exiting a recreation site on a given day. Sites...

  10. Using warping information for batch process monitoring and fault classification.

    NARCIS (Netherlands)

    González-Martínez, J.M.; Westerhuis, J.A.; Ferrer, A.

    2013-01-01

    This paper discusses how to use the warping information obtained after batch synchronization for process monitoring and fault classification. The warping information can be used for (i) building unsupervised control charts or (ii) fault classification when a rich database of faulty batches is available.

  11. Developments and Trends in Monitoring and Control of Machining Processes

    NARCIS (Netherlands)

    Tönshoff, H.K.; Wulfsberg, J.P.; Kals, H.J.J.; König, W.; van Luttervelt, C.A.

    1988-01-01

    This paper describes conventional and enhanced methods for the monitoring and control of machining processes with a limitation to cutting and grinding machine tools. The differences between the various methods and the corresponding equipment, software and strategies are considered as well as the

  12. Facility Effluent Monitoring Plan for the 325 Radiochemical Processing Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Shields, K.D.; Ballinger, M.Y.

    1999-04-02

    This Facility Effluent Monitoring Plan (FEMP) has been prepared for the 325 Building Radiochemical Processing Laboratory (RPL) at the Pacific Northwest National Laboratory (PNNL) to meet the requirements in DOE Order 5400.1, "General Environmental Protection Programs." This FEMP has been prepared for the RPL primarily because it has a "major" (potential to emit >0.1 mrem/yr) emission point for radionuclide air emissions according to the annual National Emission Standards for Hazardous Air Pollutants (NESHAP) assessment performed. This section summarizes the airborne and liquid effluents and the inventory-based NESHAP assessment for the facility. The complete monitoring plan includes characterization of effluent streams, monitoring/sampling design criteria, a description of the monitoring systems and sample analysis, and quality assurance requirements. The RPL at PNNL houses radiochemistry research, radioanalytical service, radiochemical process development, and hazardous and radioactive mixed waste treatment activities. The laboratories and specialized facilities enable work ranging from that with nonradioactive materials to work with picogram to kilogram quantities of fissionable materials and up to megacurie quantities of other radionuclides. The special facilities within the building include two shielded hot-cell areas that provide for process development or analytical chemistry work with highly radioactive materials and a waste treatment facility for processing hazardous, mixed radioactive, low-level radioactive, and transuranic wastes generated by PNNL activities.

  13. Generation and monitoring of a discrete stable random process

    CERN Document Server

    Hopcraft, K I; Matthews, J O

    2002-01-01

    A discrete stochastic process with stationary power law distribution is obtained from a death-multiple immigration population model. Emigrations from the population form a random series of events which are monitored by a counting process with finite dynamic range and response time. It is shown that the power law behaviour of the population is manifested in the intermittent behaviour of the series of events. (letter to the editor)

  14. Modelling and automation of the process of phosphate ion removal from waste waters

    Directory of Open Access Journals (Sweden)

    L. Lupa

    2008-03-01

    Phosphate removal from waste waters has become an environmental necessity, since these phosphates stimulate the growth of aquatic plants and plankton and contribute to the eutrophication process in general. The physicochemical methods of phosphate ion removal are the most effective and reliable. This paper presents studies on the removal of phosphate ions from waste waters originating in the fertiliser industry, using co-precipitation with iron salts and with calcium hydroxide as the neutralizing agent. The optimal process conditions were established as those that achieve the maximum degree of separation of the phosphate ions. The precipitate resulting from the co-precipitation process was analysed to establish its chemical composition and its thermal and structural stability, and to determine in which form the phosphate ions occur in it. Based on these considerations, the experimental data obtained in the process of phosphate ion removal from waste waters were analysed mathematically, and equations for the dependence of the degree of phosphate separation and of the residual concentration on the main parameters of the process were formulated. An automated scheme for phosphate ion removal from waste waters by co-precipitation is also presented.

  15. Near-infrared spectroscopy monitoring and control of the fluidized bed granulation and coating processes-A review.

    Science.gov (United States)

    Liu, Ronghua; Li, Lian; Yin, Wenping; Xu, Dongbo; Zang, Hengchang

    2017-09-15

    The fluidized bed granulation and pellet coating technologies are widely used in the pharmaceutical industry, because particles made in a fluidized bed have good flowability and compressibility, and the coating thickness of pellets is homogeneous. With the popularization of process analytical technology (PAT), real-time analysis of critical quality attributes (CQA) is attracting more attention. Near-infrared (NIR) spectroscopy, as a PAT tool, can realize real-time monitoring and control during the granulating and coating processes, which can optimize the manufacturing processes. This article reviews the application of NIR spectroscopy in monitoring CQAs (moisture content, particle size and tablet/pellet thickness) during fluidized bed granulation and coating processes. Through this review, we would like to provide references for realizing automated control and intelligent production in fluidized bed granulation and pellet coating in the pharmaceutical industry. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Exposing exposure: enhancing patient safety through automated data mining of nuclear medicine reports for quality assurance and organ dose monitoring.

    Science.gov (United States)

    Ikuta, Ichiro; Sodickson, Aaron; Wasser, Elliot J; Warden, Graham I; Gerbaudo, Victor H; Khorasani, Ramin

    2012-08-01

    To develop and validate an open-source informatics toolkit capable of creating a radiation exposure data repository from existing nuclear medicine report archives and to demonstrate potential applications of such data for quality assurance and longitudinal patient-specific radiation dose monitoring. This study was institutional review board approved and HIPAA compliant. Informed consent was waived. An open-source toolkit designed to automate the extraction of data on radiopharmaceuticals and administered activities from nuclear medicine reports was developed. After iterative code training, manual validation was performed on 2359 nuclear medicine reports randomly selected from September 17, 1985, to February 28, 2011. Recall (sensitivity) and precision (positive predictive value) were calculated with 95% binomial confidence intervals. From the resultant institutional data repository, examples of usage in quality assurance efforts and patient-specific longitudinal radiation dose monitoring obtained by calculating organ doses from the administered activity and radiopharmaceutical of each examination were provided. Validation statistics yielded a combined recall of 97.6% ± 0.7 (95% confidence interval) and precision of 98.7% ± 0.5. Histograms of administered activity for fluorine 18 fluorodeoxyglucose and iodine 131 sodium iodide were generated. An organ dose heatmap which displays a sample patient's dose accumulation from multiple nuclear medicine examinations was created. Large-scale repositories of radiation exposure data can be extracted from institutional nuclear medicine report archives with high recall and precision. Such repositories enable new approaches in radiation exposure patient safety initiatives and patient-specific radiation dose monitoring.
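
    As a rough illustration of the report-mining step, the sketch below pulls a radiopharmaceutical and administered activity out of free report text with a regular expression. The pattern and report phrasing are invented for the demo; the published toolkit relies on iteratively trained extraction code, not this exact rule.

```python
# Hedged sketch: regex extraction of agent and administered activity from a
# nuclear medicine report. Pattern and phrasing are illustrative assumptions.
import re

PATTERN = re.compile(
    r"(?P<activity>\d+(?:\.\d+)?)\s*(?P<unit>mCi|MBq)\s+of\s+"
    r"(?P<agent>[A-Za-z0-9\- ]+?)(?:\s+was administered|[.,])",
    re.IGNORECASE,
)

report = ("PROCEDURE: PET/CT. 12.3 mCi of F-18 FDG was administered "
          "intravenously 60 minutes prior to imaging.")

m = PATTERN.search(report)
if m:
    print(m.group("agent").strip(), m.group("activity"), m.group("unit"))
    # -> F-18 FDG 12.3 mCi
```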

  17. Automated microfluidic platform of bead-based electrochemical immunosensor integrated with bioreactor for continual monitoring of cell secreted biomarkers

    Science.gov (United States)

    Riahi, Reza; Shaegh, Seyed Ali Mousavi; Ghaderi, Masoumeh; Zhang, Yu Shrike; Shin, Su Ryon; Aleman, Julio; Massa, Solange; Kim, Duckjin; Dokmeci, Mehmet Remzi; Khademhosseini, Ali

    2016-04-01

    There is an increasing interest in developing microfluidic bioreactors and organs-on-a-chip platforms combined with sensing capabilities for continual monitoring of cell-secreted biomarkers. Conventional approaches such as ELISA and mass spectroscopy cannot satisfy the needs of continual monitoring as they are labor-intensive and not easily integrable with low-volume bioreactors. This paper reports on the development of an automated microfluidic bead-based electrochemical immunosensor for in-line measurement of cell-secreted biomarkers. For the operation of the multi-use immunosensor, disposable magnetic microbeads were used to immobilize biomarker-recognition molecules. Microvalves were further integrated in the microfluidic immunosensor chip to achieve programmable operations of the immunoassay including bead loading and unloading, binding, washing, and electrochemical sensing. The platform allowed convenient integration of the immunosensor with liver-on-chips to carry out continual quantification of biomarkers secreted from hepatocytes. Transferrin and albumin productions were monitored during a 5-day hepatotoxicity assessment in which human primary hepatocytes cultured in the bioreactor were treated with acetaminophen. Taken together, our unique microfluidic immunosensor provides a new platform for in-line detection of biomarkers in low volumes and long-term in vitro assessments of cellular functions in microfluidic bioreactors and organs-on-chips.

  18. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

    CERN Document Server

    Kruger, Uwe

    2012-01-01

    The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike. Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering. The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applications.
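
    A canonical example of what such books cover is PCA-based monitoring: fit a PCA model on in-control data, then flag new samples whose Hotelling T² statistic exceeds a control limit. The sketch below uses a crude empirical quantile as the limit rather than the usual F-distribution formula, and the data are synthetic.

```python
# Minimal PCA/T^2 monitoring sketch on synthetic in-control data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
X_normal = rng.normal(size=(500, 10))          # historical in-control data

pca = PCA(n_components=3).fit(X_normal)

def t2(X):
    scores = pca.transform(X)
    return np.sum(scores**2 / pca.explained_variance_, axis=1)

limit = np.quantile(t2(X_normal), 0.99)        # empirical 99% control limit
x_new = rng.normal(size=(1, 10)) + 4.0         # deliberately shifted sample
print(t2(x_new)[0] > limit)                    # True -> out-of-control alarm
```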

  19. Quality control of CT systems by automated monitoring of key performance indicators: a two-year study.

    Science.gov (United States)

    Nowik, Patrik; Bujila, Robert; Poludniowski, Gavin; Fransson, Annette

    2015-07-08

    The purpose of this study was to develop a method of performing routine periodic quality control (QC) of CT systems by automatically analyzing key performance indicators (KPIs) obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine: CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In cases where results were out of tolerance, actions could be initiated in less than 10 min. A total of 900 QC scans from two CT scanners were collected and analyzed over the two-year period that MonitorCT has been active. Two types of errors were registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system, such that swift actions can be taken to ensure the quality of CT examinations, patient safety, and minimal disruption of service.
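
    The automated evaluation loop amounts to checking each scan's KPI values against tolerance windows and raising an action when any falls outside. KPI names and tolerances below are illustrative assumptions, not the MonitorCT configuration.

```python
# Sketch of an automated KPI tolerance check for daily phantom scans.
TOLERANCES = {                     # (low, high) acceptance window per KPI
    "water_ct_number": (-4.0, 4.0),      # HU
    "image_noise":     (0.0, 6.0),       # HU standard deviation
    "uniformity":      (-5.0, 5.0),      # HU centre-periphery difference
}

def evaluate(kpis):
    return [k for k, v in kpis.items()
            if not TOLERANCES[k][0] <= v <= TOLERANCES[k][1]]

scan = {"water_ct_number": 6.2, "image_noise": 4.1, "uniformity": 1.0}
failed = evaluate(scan)
if failed:
    print("out of tolerance:", failed)   # e.g. notify the medical physicist
```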

  20. RESEARCH OF PROCESS OF FINANCIAL MONITORING ORGANIZING IN BANKS

    Directory of Open Access Journals (Sweden)

    Viktoriia Kovalenko

    2017-03-01

    The article aims to study methods and tools for financial monitoring by banks. One of the main global financial problems of recent years is the increasing number of cases of banks participating in money laundering. This causes banks huge losses, undermines the trust of honest depositors and, in addition, harms the national economy through the circulation of such funds. The article develops recommendations to improve the effectiveness of financial monitoring in banks. The current model of the national financial monitoring system includes the following elements: purposes at the macroeconomic and microeconomic levels; principles; functions; objects; subjects; types of financial monitoring; methods of implementation; and legal regulation. The major problems related to financial monitoring in banks are the following: the lack of legislatively established quality requirements for customer information; the high qualification of persons engaged in the legalization of illegal incomes, which greatly facilitates their passing of suspicious transactions through the bank; and weaknesses in banks' procedures for handling questionable transactions. Keywords: financial monitoring, suspicious transactions, bank, money laundering, financing of terrorism.

  1. Acoustic monitoring of rotating machine by advanced signal processing technology

    International Nuclear Information System (INIS)

    Kanemoto, Shigeru

    2010-01-01

    Acoustic data measured remotely by hand-held microphones are investigated for monitoring and diagnosing the integrity of rotating machines in nuclear power plants. The plant operator's patrol monitoring is one of the important activities for condition monitoring. However, remotely measured sound presents difficulties for precise diagnosis or quantitative judgment of rotating machine anomalies, since the measurement sensitivity differs between measurements and deteriorates in comparison with an attached sensor. Hence, in the present study, several advanced signal processing methods are examined and compared in order to find the optimum anomaly monitoring technology from the viewpoints of both sensitivity and robustness of performance. The dimensions of the pre-processed signal feature patterns are reduced to a two-dimensional space for visualization using standard principal component analysis (PCA) or kernel-based PCA. The normal state is then classified using a probabilistic neural network (PNN) or support vector data description (SVDD). Using a mockup test facility of a rotating machine, it is shown that an appropriate combination of the above algorithms gives sensitive and robust anomaly monitoring performance. (author)
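
    A minimal sketch of that pipeline, under stated assumptions: acoustic feature vectors are reduced to two dimensions with kernel PCA, and the normal-state region is learned with a one-class model. scikit-learn's OneClassSVM is used here as a stand-in for SVDD, to which it is closely related; the feature data are synthetic.

```python
# Kernel PCA for visualisation + one-class boundary for anomaly flagging.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(3)
normal = rng.normal(0.0, 1.0, size=(200, 30))    # features of healthy machine
faulty = rng.normal(3.0, 1.0, size=(5, 30))      # features of anomalous sound

kpca = KernelPCA(n_components=2, kernel="rbf").fit(normal)
clf = OneClassSVM(nu=0.05, gamma="scale").fit(kpca.transform(normal))

print(clf.predict(kpca.transform(faulty)))       # -1 entries flag anomalies
```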

  2. Optimisation of the discrete conductivity and dissolved oxygen monitoring using continuous data series obtained with automated measurement stations.

    Science.gov (United States)

    D'heygere, T; Goethals, P; van Griensven, A; Vandenberghe, V; Bauwens, W; Vanrolleghem, P; De Pauw, N

    2001-01-01

    During the last five years, research on the relation between pollution loads and ecological river water quality has been done on the river Dender. In addition to biological sampling of macroinvertebrates and fish, automated measurement stations were also used to investigate the spatio-temporal variability of physical-chemical water pollution. This study on on-line water quality data collection is based on a measurement campaign during March-April 2000 with two automated measurement stations at two different sites: the flow control weirs at Geraardsbergen and Denderleeuw. These measurement stations contain sensors for temperature, turbidity, conductivity, pH, redox potential and dissolved oxygen. Short-wave radiation and rainfall were monitored by means of pyranometers and rain gauges. A refrigerated sampler with 24 bottles allowed samples to be taken for additional laboratory analyses. In this study, continuous measurements of two physical-chemical parameters, conductivity and dissolved oxygen, were analysed to evaluate the adequacy of the current monitoring frequency in Flanders. The analysis showed that discrete conductivity measurements can be sufficient for trend detection, but the measuring frequency must be increased substantially, from one measurement per month to at least eight. Continuous measurements of conductivity are nevertheless preferred because extreme values are captured as well. For dissolved oxygen, a single measurement per month is not enough. The percentage of dissolved oxygen showed a strong diurnal variation, with maxima in the late afternoon (photosynthesis) and minima at night (respiration); it also differed significantly from day to day. Continuous measurements are therefore necessary for a reliable assessment of the dissolved oxygen budget of surface waters. When using discrete measurements for dissolved oxygen, a fixed sampling time should be introduced to eliminate diurnal variation.
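
    The diurnal-variation point can be illustrated numerically. The toy series below (synthetic numbers, not Dender data) shows how a single monthly grab sample misses the night-time oxygen minima that a continuous record captures.

```python
# Toy illustration: monthly grab sample vs continuous dissolved-oxygen record.
import numpy as np

hours = np.arange(0, 24 * 30)                         # one month, hourly
do = 80 + 15 * np.sin(2 * np.pi * (hours - 10) / 24)  # % saturation, peaks ~16h

monthly_sample = do[10 * 24 + 9]                      # one mid-morning grab sample
print(f"grab sample: {monthly_sample:.0f}%  "
      f"continuous min/max: {do.min():.0f}%/{do.max():.0f}%")
```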

  3. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  4. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    Science.gov (United States)

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

    Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring a substantial input of time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone, semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence that a particular sequence of images contained a fence crossing event. This resulted in a 54.8% reduction in the images that required further human operator characterization, while retaining 72.6% of the known fence crossing events. This program can give researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
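
    The two ideas named above can be sketched for still images: background subtraction against a reference frame, plus a histogram-style rule that discards frames whose change is too small to be an animal. Thresholds and file names are invented; OpenCV is assumed available, and this is not the authors' program.

```python
# Hedged sketch: flag camera-trap stills that differ enough from an
# empty-scene reference to merit human review.
import cv2
import numpy as np

def is_candidate(ref_path, img_path, diff_thresh=30, frac_thresh=0.01):
    ref = cv2.imread(ref_path, cv2.IMREAD_GRAYSCALE)
    img = cv2.imread(img_path, cv2.IMREAD_GRAYSCALE)
    diff = cv2.absdiff(ref, img)                  # background subtraction
    changed = np.count_nonzero(diff > diff_thresh)
    return changed / diff.size > frac_thresh      # histogram-style rule

# Example usage (paths are hypothetical):
# if is_candidate("empty_scene.jpg", "IMG_0421.jpg"):
#     print("possible fence-crossing event; keep for human review")
```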

  5. Automated business process management – in times of digital transformation using machine learning or artificial intelligence

    Directory of Open Access Journals (Sweden)

    Paschek Daniel

    2017-01-01

    The continuous optimization of business processes is still a challenge for companies. In times of digital transformation, with faster-changing internal and external framework conditions, new customer expectations of fastest delivery and best quality of goods, and many more, companies should set up their internal processes in the best way. But what should they do if framework conditions change unexpectedly? The purpose of the paper is to analyse how the digital transformation will impact Business Process Management (BPM) when methods like machine learning or artificial intelligence are used. Therefore, the core components are explained, compared and placed in relation to one another. To identify application areas, interviews and analyses were conducted with digital companies. The findings of the paper are recommendations for action in the field of BPM and process optimization through machine learning and artificial intelligence. The approach of optimizing and managing processes via machine learning and artificial intelligence will help companies decide which tool is best for automated BPM.

  6. Image processing technologies in nuclear power plant monitoring

    International Nuclear Information System (INIS)

    Kubo, Katsumi; Kanemoto, Shigeru; Shimada, Hideo.

    1995-01-01

    Various monitoring activities are carried out in nuclear power plants to ensure that the high reliability requirements of such plants are met. Inspection patrols by operators are important for detecting small anomalies in equipment. Vibration, temperature, and visual images are major forms of information used in equipment inspections. We are developing remote automatic inspection technologies comprising image sensing of equipment conditions and automatic recognition of the images. This paper shows examples of image processing technologies, such as equipment monitoring using three-dimensional graphic plant models and vibration/temperature image data, and intelligent image recognition technology for detecting steam leakage. (author)

  7. System Health Monitoring Using a Novel Method: Security Unified Process

    Directory of Open Access Journals (Sweden)

    Alireza Shameli-Sendi

    2012-01-01

    ...and change management, and project management. The dynamic dimension, or phases, contains inception, analysis and design, construction, and monitoring. Risk assessment is a major part of the ISMS process. In SUP, we present a risk assessment model which uses a fuzzy expert system to assess risks in an organization. Since the classification of assets is an important aspect of risk management and ensures that effective protection occurs, a Security Cube is proposed as an asset classification model to identify organization assets. The proposed model leads to an offline system health monitoring tool that addresses a critical need in any organization.

  8. Effects of Secondary Task Modality and Processing Code on Automation Trust and Utilization During Simulated Airline Luggage Screening

    Science.gov (United States)

    Phillips, Rachel; Madhavan, Poornima

    2010-01-01

    The purpose of this research was to examine the impact of environmental distractions on human trust and utilization of automation during the process of visual search. Participants performed a computer-simulated airline luggage screening task with the assistance of a 70% reliable automated decision aid (called DETECTOR) both with and without environmental distractions. The distraction was implemented as a secondary task in either a competing modality (visual) or non-competing modality (auditory). The secondary task processing code either competed with the luggage screening task (spatial code) or with the automation's textual directives (verbal code). We measured participants' system trust, perceived reliability of the system (when a target weapon was present and absent), compliance, reliance, and confidence when agreeing and disagreeing with the system under both distracted and undistracted conditions. Results revealed that system trust was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Perceived reliability of the system (when the target was present) was significantly higher when the secondary task was visual rather than auditory. Compliance with the aid increased in all conditions except for the auditory-verbal condition, where it decreased. Similar to the pattern for trust, reliance on the automation was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Confidence when agreeing with the system decreased with the addition of any kind of distraction; however, confidence when disagreeing increased with the addition of an auditory secondary task but decreased with the addition of a visual task. A model was developed to represent the research findings and demonstrate the relationship between secondary task modality, processing code, and automation use. Results suggest that the nature of environmental distractions influence

  9. Automated four-dimensional Monte Carlo workflow using log files and real-time motion monitoring

    International Nuclear Information System (INIS)

    Sibolt, P; Andersen, C E; Cronholm, R O; Heath, E; Behrens, C F

    2017-01-01

    With emerging techniques for tracking and gating methods in radiotherapy of lung cancer patients, there is an increasing need for efficient four-dimensional Monte Carlo (4DMC) based quality assurance (QA). An automated and flexible workflow for 4DMC QA, based on the 4DdefDOSXYZnrc user code, has been developed in Python. The workflow has been tested and verified using an in-house developed dosimetry system comprised of a dynamic thorax phantom constructed for plastic scintillator dosimetry. The workflow is directly compatible with any treatment planning system and can also be triggered by the appearance of linac log files. It has minimum user interaction and, with the use of linac log files, it provides a method for verification of the actually delivered dose in the patient geometry. (paper)

  10. Automated four-dimensional Monte Carlo workflow using log files and real-time motion monitoring

    Science.gov (United States)

    Sibolt, P.; Cronholm, R. O.; Heath, E.; Andersen, C. E.; Behrens, C. F.

    2017-05-01

    With emerging techniques for tracking and gating methods in radiotherapy of lung cancer patients, there is an increasing need for efficient four-dimensional Monte Carlo (4DMC) based quality assurance (QA). An automated and flexible workflow for 4DMC QA, based on the 4DdefDOSXYZnrc user code, has been developed in Python. The workflow has been tested and verified using an in-house developed dosimetry system comprised of a dynamic thorax phantom constructed for plastic scintillator dosimetry. The workflow is directly compatible with any treatment planning system and can also be triggered by the appearance of linac log files.

  11. Automated Defect Classification (ADC) and Progression Monitoring (DPM) in wafer fab reticle requalification

    Science.gov (United States)

    Yen, T. H.; Lai, Rick; Tuo, Laurent C.; Tolani, Vikram; Chen, Dongxue; Hu, Peter; Yu, Jiao; Hwa, George; Zheng, Yan; Lakkapragada, Suresh; Wang, Kechang; Peng, Danping; Wang, Bill; Chiang, Kaiming

    2013-09-01

    As optical lithography continues to extend into the low-k1 regime, the resolution of mask patterns continues to diminish, and so do mask defect requirements, due to increasing MEEF. Post-inspection, mask defects have traditionally been classified manually by operators based on visual review. This approach may have worked down to 65/55nm node layers. However, starting at 45nm and smaller nodes, visually reviewing 50 to sometimes hundreds of defects on masks with complex model-based OPC, SRAF, and ILT geometries is error-prone and takes up valuable inspection tool capacity. Both these shortcomings in manual defect review are overcome by the adoption of a computational solution called Automated Defect Classification (ADC), wherein mask defects are accurately classified within seconds, consistent with the guidelines used by production technicians and engineers.

  12. A METHOD OF COMPLEX AUTOMATED MONITORING OF UKRAINIAN POWER ENERGY SYSTEM OBJECTS TO INCREASE ITS OPERATION SAFETY

    Directory of Open Access Journals (Sweden)

    Ye.I. Sokol

    2016-05-01

    The paper describes an algorithm for complex automated monitoring of Ukraine's power energy system, aimed at ensuring the safety of its personnel and equipment. This monitoring involves the use of unmanned aerial vehicles (UAVs) for planned and unplanned registration of the status of power transmission lines (PTL) and high-voltage substations (HVS). It is assumed that unscheduled overflights will be made in emergency situations on power lines. With the help of the UAV, images of the transmission lines and HVS will be recorded from the air in the optical and infrared ranges, and the strengths of the electric (EF) and magnetic (MF) fields will be measured along the flight route. Specially developed software allows the recorded images to be compared with reference patterns, acquired by UAV beforehand, corresponding to normal operation of the investigated transmission lines and HVSs. Such reference patterns, together with experimentally obtained maps of the HVS's protective grounding, will be summarized in a single document: a passport of the HVS and PTL. This passport must also contain the measured and calculated strength levels of the EF and MF in the places where staff of the power facilities stay, as well as the layout of the equipment most vulnerable to the effects of electromagnetic interference. If necessary, as part of ongoing monitoring, recommendations will be given on the design and location of electromagnetic screens reducing the levels of electromagnetic interference, as well as on the location of lightning rods reducing the probability of lightning attachment to the objects. The paper presents analytic expressions which form the basis of the developed software for calculating the EF strength in the vicinity of power lines. This software will be used as a basis for UAV navigation along the transmission lines, as well as to detect violations in the transmission lines' operation. Comparison of distributions of EF strength calculated with the help of the elaborated software with the known...

  13. A comparison of timed artificial insemination and automated activity monitoring with hormone intervention in 3 commercial dairy herds.

    Science.gov (United States)

    Dolecheck, K A; Silvia, W J; Heersche, G; Wood, C L; McQuerry, K J; Bewley, J M

    2016-02-01

    The objective of this study was to compare the reproductive performance of cows inseminated based on automated activity monitoring with hormone intervention (AAM) to cows from the same herds inseminated using only an intensive timed artificial insemination (TAI) program. Cows (n=523) from 3 commercial dairy herds participated in this study. To be considered eligible for participation, cows must have been classified with a body condition score of at least 2.50, but no more than 3.50, passed a reproductive tract examination, and experienced no incidences of clinical, recorded metabolic diseases in the current lactation. Within each herd, cows were balanced for parity and predicted milk yield, then randomly assigned to 1 of 2 treatments: TAI or AAM. Cows assigned to the TAI group were subjected to an ovulation synchronization protocol consisting of presynchronization, Ovsynch, and Resynch for up to 3 inseminations. Cows assigned to the AAM treatment were fitted with a leg-mounted accelerometer (AfiAct Pedometer Plus, Afimilk, Kibbutz Afikim, Israel) at least 10 d before the end of the herd voluntary waiting period (VWP). Cows in the AAM treatment were inseminated at times indicated by the automated alert system for up to 90 d after the VWP. If an open cow experienced no AAM alert for a 39±7-d period (beginning at the end of the VWP), hormone intervention in the form of a single injection of either PGF2α or GnRH (no TAI) was permitted as directed by the herd veterinarian. Subsequent to hormone intervention, cows were inseminated when alerted in estrus by the AAM system. Pregnancy was diagnosed by ultrasound 33 to 46 d after insemination. Pregnancy loss was determined via a second ultrasound after 60 d pregnant. Timed artificial insemination cows experienced a median 11.0 d shorter time to first service. Automated activity-monitored cows experienced a median 17.5-d shorter service interval. No treatment difference in probability of pregnancy to first AI, probability

  14. Process control and recovery in the Link Monitor and Control Operator Assistant

    Science.gov (United States)

    Lee, Lorrine; Hill, Randall W., Jr.

    1993-01-01

    This paper describes our approach to providing process control and recovery functions in the Link Monitor and Control Operator Assistant (LMCOA). The focus of the LMCOA is to provide semi-automated monitor and control to support station operations in the Deep Space Network. The LMCOA will be demonstrated with precalibration operations for Very Long Baseline Interferometry on a 70-meter antenna. Precalibration, the task of setting up the equipment to support a communications link with a spacecraft, is a manual, time-consuming and error-prone process. One problem with the current system is that it does not provide explicit feedback about the effects of control actions. The LMCOA uses a Temporal Dependency Network (TDN) to represent an end-to-end sequence of operational procedures and a Situation Manager (SM) module to provide process control, diagnosis, and recovery functions. The TDN is a directed network representing precedence, parallelism, precondition, and postcondition constraints. The SM maintains an internal model of the expected and actual states of the subsystems in order to determine whether each control action executed successfully and to provide feedback to the user. The LMCOA is implemented on a NeXT workstation using Objective C, Interface Builder and the C Language Integrated Production System.
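
    A Temporal Dependency Network of this kind can be sketched as a directed graph of steps with precondition and postcondition checks, executed in dependency order; the situation-manager role below reduces to comparing expected postconditions against a modelled state. Step names are invented, not the actual DSN precalibration procedure.

```python
# Minimal TDN execution sketch with hypothetical precalibration steps.
steps = {                                # step -> (depends_on, postcondition)
    "configure_antenna": ([], "antenna_ready"),
    "configure_receiver": ([], "receiver_ready"),
    "connect_link": (["configure_antenna", "configure_receiver"], "link_up"),
}

state = set()                            # modelled state of the subsystems

def run(step):
    deps, post = steps[step]
    for d in deps:                       # precondition: dependencies satisfied
        if steps[d][1] not in state:
            run(d)
    state.add(post)                      # pretend the command succeeded
    print(f"{step}: expected '{post}' confirmed")

run("connect_link")
```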

  15. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    Science.gov (United States)

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree and of the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by two parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus be widely used by scientists studying rodent mammary gland morphology.
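
    One common way to obtain end-point and branch-point counts of this kind is to skeletonise a binary mask of the ductal tree and count skeleton pixels by their number of neighbours (one for an end-point, three or more for a branch-point). The sketch below uses scikit-image on a tiny synthetic "T" mask rather than a real whole-mount image, and is an assumed approach, not the authors' exact pipeline.

```python
# Skeletonise a toy ductal mask and count end-points / branch-points.
import numpy as np
from skimage.morphology import skeletonize
from scipy.ndimage import convolve

mask = np.zeros((9, 9), bool)
mask[4, 1:8] = True                     # horizontal duct
mask[1:4, 4] = True                     # side branch meeting it at (4, 4)

skel = skeletonize(mask)
kernel = np.array([[0, 1, 0],           # 4-connected neighbour count
                   [1, 0, 1],           # (centre excluded)
                   [0, 1, 0]])
neighbours = convolve(skel.astype(int), kernel, mode="constant")
end_points = int(np.logical_and(skel, neighbours == 1).sum())
branch_points = int(np.logical_and(skel, neighbours >= 3).sum())
print(f"end-points: {end_points}, branch-points: {branch_points}")
```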

  16. Use of an Automated Mobile Phone Messaging Robot in Postoperative Patient Monitoring.

    Science.gov (United States)

    Anthony, Chris A; Lawler, Ericka A; Ward, Christina M; Lin, Ines C; Shah, Apurva S

    2018-01-01

    Mobile phone messaging software robots allow clinicians and healthcare systems to communicate with patients without the need for human intervention. The purpose of this study was to (1) describe a method for communicating with patients postoperatively outside of the traditional healthcare setting by utilizing an automated software and mobile phone messaging platform and to (2) evaluate the first week of postoperative pain and opioid use after common ambulatory hand surgery procedures. The investigation was a prospective, multicenter study of patient-reported pain and opioid usage after ambulatory hand surgery. Inclusion criteria included any adult with a mobile phone capable of text messaging who was undergoing a common ambulatory hand surgical procedure at one of three tertiary care institutions. Participants received daily, automated text messages inquiring about their pain level and how many tablets of prescription pain medication they had taken in the past 24 h. The initial 1-week response rate was assessed and compared between different patient demographics. Patient-reported pain and opioid use were also quantified for the first postoperative week. Statistical significance was set as p [...]. Pain was reported throughout the first postoperative week, with the highest levels of pain being reported in the first 48 h after surgery. Patients reported an average use of 15.9 ± 14.8 tablets of prescription opioid pain medication. We find that a mobile phone messaging software robot allows for effective data collection of postoperative pain and pain medication use. Patients undergoing common ambulatory hand procedures utilized an average of 16 tablets of opioid medication in the first postoperative week.

  17. Studies and research concerning BNFP: process monitoring and process surveillance demonstration program

    Energy Technology Data Exchange (ETDEWEB)

    Kight, H R

    1979-11-01

    Computerized methods of monitoring process functions and alarming off-standard conditions were implemented and demonstrated during the FY 1979 Uranium Run. In addition, prototype applications of instruments for the purpose of tamper indication and surveillance were tested.

  18. Automated process flowsheet synthesis for membrane processes using genetic algorithm: role of crossover operators

    KAUST Repository

    Shafiee, Alireza

    2016-06-25

    In optimization-based process flowsheet synthesis, optimization methods, including genetic algorithms (GA), are used as advantageous tools to select a high-performance flowsheet by 'screening' large numbers of possible flowsheets. In this study, we expand the role of the GA to include flowsheet generation by proposing a modified Greedy sub-tour crossover operator. The performance of the proposed crossover operator is compared with four other commonly used operators. The proposed GA optimization-based process synthesis method is applied to generate the optimum process flowsheet for a multicomponent membrane-based CO2 capture process. Within the defined constraints and using the random-point crossover, a CO2 purity of 0.827 (equivalent to 0.986 on a dry basis) is achieved, a 3.4% improvement over the simplest crossover operator applied. In addition, the least variability in the converged flowsheet and CO2 purity is observed for the random-point crossover operator, which approximately implies closeness of the solution to the global optimum, and hence the consistency of the algorithm. The proposed crossover operator is found to improve the convergence speed of the algorithm by 77.6%.
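
    To illustrate the family of operators being compared, here is a simplified greedy sub-tour style crossover on permutation-encoded candidates: starting from a shared element, the child is grown by copying one parent's neighbours leftwards and the other's rightwards until blocked, then the remainder is placed randomly. This is a generic rendition of the classic operator, not the paper's modified version, and the integer "units" are arbitrary.

```python
# Simplified greedy sub-tour crossover for permutation encodings.
import random

def greedy_subtour_crossover(p1, p2, rng=random.Random(0)):
    start = rng.choice(p1)
    child = [start]
    i, j = p1.index(start), p2.index(start)
    left, right = True, True
    while left or right:
        i -= 1
        if left and i >= 0 and p1[i] not in child:
            child.insert(0, p1[i])            # extend with parent 1 leftwards
        else:
            left = False
        j += 1
        if right and j < len(p2) and p2[j] not in child:
            child.append(p2[j])               # extend with parent 2 rightwards
        else:
            right = False
    rest = [g for g in p1 if g not in child]  # fill in the missing elements
    rng.shuffle(rest)
    return child + rest

print(greedy_subtour_crossover([0, 1, 2, 3, 4, 5], [3, 1, 5, 0, 4, 2]))
```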

  19. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    Science.gov (United States)

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors arising from communication failures are enormous during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information need and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize an individual patient's raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) for the continuum of care, multimedia abstract generation of intensive care data (MAGIC), an expert system that automatically generates a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) for an isolated point in time, the "Inference Engine", a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing (MAGIC), the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and against that given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhances situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results.

  20. OpenLMD, multimodal monitoring and control of LMD processing

    Science.gov (United States)

    Rodríguez-Araújo, Jorge; García-Díaz, Antón

    2017-02-01

    This paper presents OpenLMD, a novel open-source solution for on-line multimodal monitoring of Laser Metal Deposition (LMD). The solution is also applicable to a wider range of laser-based applications that require on-line control (e.g. laser welding). OpenLMD is a middleware that enables the orchestration and virtualization of an LMD robot cell using several open-source frameworks (e.g. ROS, OpenCV, PCL). The solution also allows reconfiguration through easy integration of multiple sensors and processing equipment. As a result, OpenLMD delivers significant advantages over existing monitoring and control approaches, such as improved scalability and multimodal monitoring and data-sharing capabilities.
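
    The orchestration idea can be pictured as a ROS node that time-synchronizes multiple sensing modalities. The minimal sketch below uses the standard ROS message_filters API; the topic names and message types are assumptions for illustration, not OpenLMD's actual interfaces.

```python
# Sketch of multimodal synchronization in a ROS node, in the spirit of an
# LMD monitoring middleware. Topics "/camera/melt_pool" and
# "/pyrometer/reading" are invented for this example.
import rospy
import message_filters
from sensor_msgs.msg import Image, Temperature

def fused_callback(image_msg, temp_msg):
    # Messages arrive with closely matching timestamps, so melt-pool imagery
    # can be correlated with pyrometer readings for on-line control.
    rospy.loginfo("frame at %s, melt-pool T = %.1f K",
                  image_msg.header.stamp, temp_msg.temperature)

rospy.init_node("lmd_multimodal_monitor")
cam = message_filters.Subscriber("/camera/melt_pool", Image)
pyro = message_filters.Subscriber("/pyrometer/reading", Temperature)
sync = message_filters.ApproximateTimeSynchronizer([cam, pyro],
                                                   queue_size=10, slop=0.05)
sync.registerCallback(fused_callback)
rospy.spin()
```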

  1. Automated measurement and monitoring of bioprocesses: key elements of the M(3)C strategy.

    Science.gov (United States)

    Sonnleitner, Bernhard

    2013-01-01

    The state-of-routine monitoring items established in the bioprocess industry as well as some important state-of-the-art methods are briefly described and the potential pitfalls discussed. Among those are physical and chemical variables such as temperature, pressure, weight, volume, mass and volumetric flow rates, pH, redox potential, gas partial pressures in the liquid and molar fractions in the gas phase, infrared spectral analysis of the liquid phase, and calorimetry over an entire reactor. Classical as well as new optical versions are addressed. Biomass and bio-activity monitoring (as opposed to "measurement") via turbidity, permittivity, in situ microscopy, and fluorescence are critically analyzed. Some new(er) instrumental analytical tools, interfaced to bioprocesses, are explained. Among those are chromatographic methods, mass spectrometry, flow and sequential injection analyses, field flow fractionation, capillary electrophoresis, and flow cytometry. This chapter surveys the principles of monitoring rather than compiling instruments.

  2. Intelligent Classification of Heartbeats for Automated Real-Time ECG Monitoring

    Science.gov (United States)

    Park, Juyoung

    2014-01-01

    Abstract Background: The automatic interpretation of electrocardiography (ECG) data can provide continuous analysis of heart activity, allowing the effective use of wireless devices such as the Holter monitor. Materials and Methods: We propose an intelligent heartbeat monitoring system to detect the possibility of arrhythmia in real time. We detected heartbeats and extracted features such as the QRS complex and P wave from ECG signals using the Pan–Tompkins algorithm, and the heartbeats were then classified into 16 types using a decision tree. Results: We tested the sensitivity, specificity, and accuracy of our system against data from the MIT-BIH Arrhythmia Database. Our system achieved an average accuracy of 97% in heartbeat detection and an average heartbeat classification accuracy of above 96%, which is comparable with the best competing schemes. Conclusions: This work provides a guide to the systematic design of an intelligent classification system for decision support in Holter ECG monitoring. PMID:25010717
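
    The detection stage described above follows the classic Pan–Tompkins structure, which the sketch below reproduces in simplified form (band-pass filter, derivative, squaring, moving-window integration, peak picking). The fixed thresholds are illustrative; the published algorithm adapts them on-line, and this is not the authors' code.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def detect_qrs(ecg, fs):
    """Simplified Pan-Tompkins-style QRS detection."""
    # Band-pass roughly 5-15 Hz to emphasize the QRS complex.
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    squared = np.diff(filtered) ** 2            # differentiate, then square
    window = int(0.150 * fs)                    # 150 ms integration window
    integrated = np.convolve(squared, np.ones(window) / window, mode="same")
    peaks, _ = find_peaks(integrated,
                          height=0.35 * integrated.max(),   # illustrative threshold
                          distance=int(0.25 * fs))          # 250 ms refractory period
    return peaks

# Heart rate then follows from successive R-peak intervals:
# hr_bpm = 60 * fs / np.diff(peaks)
```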

  3. Monitoring in the nearshore: A process for making reasoned decisions

    Science.gov (United States)

    Bodkin, James L.; Dean, T.A.

    2003-01-01

    Over the past several years, a conceptual framework for the GEM nearshore monitoring program has been developed through a series of workshops. However, details of the proposed monitoring program, e.g. what to sample, where to sample, when to sample, and at how many sites, have yet to be determined. In FY 03 we were funded under Project 03687 to outline a process whereby specific monitoring alternatives are developed and presented to the EVOS Trustee Council for consideration. As part of this process, two key elements are required before reasoned decisions can be made: 1) a comprehensive historical perspective on the locations and types of past studies conducted in the nearshore marine communities of the Gulf of Alaska, and 2) estimates of costs for each element of a proposed monitoring program. We have developed a GIS database that details available information from past studies of selected nearshore habitats and species in the Gulf of Alaska and provides a visual means of selecting sites based (in part) on the locations for which historical data of interest are available. We also provide cost estimates for specific monitoring plan alternatives and outline several alternative plans that can be accomplished within reasonable budgetary constraints. The products that we will provide are: 1) a GIS database and maps showing the location and types of information available from the nearshore in the Gulf of Alaska; 2) a list of several specific monitoring alternatives that can be conducted within reasonable budgetary constraints; and 3) cost estimates for proposed tasks to be conducted as part of the nearshore program. Because data compilation and management will not be completed until late in FY 03, we are requesting support for close-out of this project in FY 04.

  4. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

    Process monitoring provides important information on the product, process and manufacturing system during part manufacturing. Such information can be used for process optimization and detection of undesired processing conditions to initiate timely actions for avoidance of defects, thereby improving quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero defect manufacturing. A methodological approach of general applicability is presented in this concern. The approach consists of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensor placement, validation of the monitoring solutions, and definition of the reference manufacturing performance...

  5. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    International Nuclear Information System (INIS)

    Sudowe, Ralf; Roman, Audrey; Dailey, Ashlee; Go, Elaine

    2013-01-01

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or large numbers of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples as well as mass loading, flow rate and resin performance is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of possible synergistic effects in the presence of iron.

  6. PyDBS: an automated image processing workflow for deep brain stimulation surgery.

    Science.gov (United States)

    D'Albis, Tiziano; Haegelen, Claire; Essert, Caroline; Fernández-Vidal, Sara; Lalys, Florent; Jannin, Pierre

    2015-02-01

    Deep brain stimulation (DBS) is a surgical procedure for treating motor-related neurological disorders. DBS clinical efficacy hinges on precise surgical planning and accurate electrode placement, which in turn call upon several image processing and visualization tasks, such as image registration, image segmentation, image fusion, and 3D visualization. These tasks are often performed by a heterogeneous set of software tools, which adopt differing formats and geometrical conventions and require patient-specific parameterization or interactive tuning. To overcome these issues, we introduce in this article PyDBS, a fully integrated and automated image processing workflow for DBS surgery. PyDBS consists of three image processing pipelines and three visualization modules assisting clinicians through the entire DBS surgical workflow, from the preoperative planning of electrode trajectories to the postoperative assessment of electrode placement. The system's robustness, speed, and accuracy were assessed by means of a retrospective validation, based on 92 clinical cases. The complete PyDBS workflow achieved satisfactory results in 92 % of tested cases, with a median processing time of 28 min per patient. The results obtained are compatible with the adoption of PyDBS in clinical practice.
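
    The workflow pattern behind such a system can be pictured as a chain of stages that each enrich a per-patient context, so that registration, segmentation, and fusion run unattended in sequence. The stage names and data below are hypothetical, not PyDBS's actual modules.

```python
# Minimal sketch of an automated image-processing pipeline: each stage
# consumes and enriches a per-patient context dictionary. Stage names,
# file names, and target structures are invented for illustration.
from typing import Callable, Dict

Stage = Callable[[Dict], Dict]

def run_pipeline(context: Dict, stages: list[Stage]) -> Dict:
    for stage in stages:
        context = stage(context)       # each stage returns the updated context
    return context

def register_to_atlas(ctx):
    ctx["registered"] = f"atlas-aligned({ctx['preop_mri']})"
    return ctx

def segment_targets(ctx):
    ctx["targets"] = ["STN_left", "STN_right"]   # assumed target structures
    return ctx

result = run_pipeline({"preop_mri": "patient001_T1.nii"},
                      [register_to_atlas, segment_targets])
print(result["targets"])
```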

  7. "SmartMonitor"--an intelligent security system for the protection of individuals and small properties with the possibility of home automation.

    Science.gov (United States)

    Frejlichowski, Dariusz; Gościewska, Katarzyna; Forczmański, Paweł; Hofman, Radosław

    2014-06-05

    "SmartMonitor" is an intelligent security system based on image analysis that combines the advantages of alarm, video surveillance and home automation systems. The system is a complete solution that automatically reacts to every learned situation in a pre-specified way and has various applications, e.g., home and surrounding protection against unauthorized intrusion, crime detection or supervision over ill persons. The software is based on well-known and proven methods and algorithms for visual content analysis (VCA) that were appropriately modified and adopted to fit specific needs and create a video processing model which consists of foreground region detection and localization, candidate object extraction, object classification and tracking. In this paper, the "SmartMonitor" system is presented along with its architecture, employed methods and algorithms, and object analysis approach. Some experimental results on system operation are also provided. In the paper, focus is put on one of the aforementioned functionalities of the system, namely supervision over ill persons.

  8. Light scattering application for bacterial cell monitoring during cultivation process

    Science.gov (United States)

    Kotsyumbas, Igor Ya.; Kushnir, Igor M.; Bilyy, Rostyslav O.; Yarynovska, Ivanna H.; Getman, Vasyl'B.; Bilyi, Alexander I.

    2007-07-01

    Monitoring of bacterial cell numbers is of great importance not only in the microbiological industry but also for the control of liquid contamination in the food and pharmaceutical industries. Here we describe a novel low-cost and highly efficient technology for bacterial cell monitoring during the cultivation process. The technology incorporates a previously developed monitoring device and the algorithm governing its operation. The device analyses light scattered by suspended bacterial cells. The current stage utilizes monochromatic coherent light and detects the amplitudes and durations of scattered light impulses; it does not require any labeling of the bacterial cells. The system is calibrated using highly purified bacteria-free water as a standard. Liquid media are diluted and analyzed by the proposed technology to determine the presence of bacteria. Detection covers particle sizes from 0.1 to 10 μm, so a particle size distribution is determined. We analyzed a set of different bacterial suspensions as well as their changes in quantity and size distribution during cultivation. Based on the obtained results, we conclude that the proposed technology can be very effective for bacterial monitoring during the cultivation process, providing the benefits of simplicity and low cost of analysis together with high detection precision.
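
    The reduction from detected pulses to a size distribution can be illustrated as binning calibrated pulse amplitudes into a logarithmic size histogram over the stated 0.1-10 μm range. The power-law calibration below is a stand-in assumption; a real instrument is calibrated against particle standards.

```python
import numpy as np

# Toy reduction of scattered-light pulse amplitudes to a particle size
# distribution. The amplitude-to-size curve is an assumed placeholder.
rng = np.random.default_rng(0)
amplitudes = rng.lognormal(mean=0.0, sigma=0.6, size=5000)   # detected pulses

sizes_um = 0.8 * amplitudes ** 0.5                  # assumed calibration curve
bins = np.logspace(np.log10(0.1), np.log10(10.0), 20)   # 0.1-10 um range
counts, edges = np.histogram(sizes_um, bins=bins)

for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:5.2f}-{hi:5.2f} um: {n}")
```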

  9. Process monitoring and control: Ammonia measurements in off-gases

    Energy Technology Data Exchange (ETDEWEB)

    Allendorf, S.; Ottesen, D.; Johnson, H. [Sandia National Labs., Livermore, CA (United States); Lambert, D. [Westinghouse Savannah River Company, Aiken, SC (United States)

    1997-05-01

    This interim report describes technical progress in the development of a laser-based, real-time optical monitor for ammonia in off-gas streams from defense waste processing applications at the Savannah River Site (SRS). An optimized monitor has been fabricated by Spectrum Diagnostix using a tunable diode laser operating in the 1.55-µm wavelength region. Instrument detection limits of 2-3 ppm for ammonia are demonstrated that are more than adequate for the SRS required sensitivity of 10 ppm. Laboratory research at Sandia revealed a lack of interference at the operating wavelength by other molecular species that might be present in the SRS off-gas stream. Initial tests of the ammonia monitor by Sandia were conducted at SRS using a bench-scale processing system for surrogate defense waste sludges. The results of these experiments confirmed that ammonia concentrations issuing from the ammonia-scrubber section of the bench-scale reactor were below the design limit of 10 ppm. We also found that no other molecular species in the off-gas produced observable false-positive readings from the monitor. 5 refs., 6 figs.
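
    The concentration retrieval in such a tunable-diode-laser monitor ultimately rests on the Beer-Lambert law, I/I0 = exp(-σNL). The toy calculation below shows the arithmetic with assumed values for the cross section, path length, and transmission; none of these are the instrument's actual parameters.

```python
import math

# Beer-Lambert estimate of ammonia concentration from a transmission
# measurement. All numbers are illustrative assumptions.
I_over_I0 = 0.999            # measured fractional transmission at line center
sigma = 5.0e-19              # assumed absorption cross section, cm^2/molecule
L = 100.0                    # assumed optical path length, cm (multipass cell)

N = -math.log(I_over_I0) / (sigma * L)       # absorber density, molecules/cm^3
N_total = 2.5e19                             # air number density at ~1 atm, 20 C
print(f"NH3 ~ {N / N_total * 1e6:.1f} ppm")  # mixing ratio in ppm
```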

  10. Intelligent Production Monitoring and Control based on Three Main Modules for Automated Manufacturing Cells in the Automotive Industry

    International Nuclear Information System (INIS)

    Berger, Ulrich; Kretzschmann, Ralf; Algebra, A. Vargas Veronica

    2008-01-01

    The automotive industry is distinguished by regionalization and customization of products. As a consequence, the diversity of products will increase while lot sizes decrease. Thus, more product types will be handled along the process chain and common production paradigms will fail. Although Rapid Manufacturing (RM) methodology will be used for producing small individual lot sizes, new solutions for joining and assembling these components are needed. On the other hand, the non-availability of existing operational knowledge and the absence of dynamic and explicit knowledge retrieval limit the achievement of on-demand capabilities. Thus, in this paper, an approach for an Intelligent Production System is introduced. The concept is based on three interlinked main modules: a Technology Data Catalogue (TDC) based on an ontology system, an Automated Scheduling Processor (ASP) based on graph theory, and a central Programmable Automation Controller (PAC) for real-time sensor/actuator communication. The concept is being implemented in a laboratory set-up with several assembly and joining processes and will be experimentally validated in several research and development projects.
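
    The graph-theoretic core of a scheduling processor like the ASP can be illustrated with a topological sort over a precedence graph of process steps, as sketched below with Python's standard-library graphlib; the step names and constraints are invented for illustration.

```python
# Minimal sketch of precedence-constrained scheduling: process steps form a
# directed acyclic graph, and any valid execution order is a topological sort.
from graphlib import TopologicalSorter

# Each step maps to the set of steps that must finish before it starts.
precedence = {
    "deposit_bracket": set(),                      # RM part must exist first
    "fit_seal":        set(),
    "join_bracket":    {"deposit_bracket"},
    "assemble_door":   {"join_bracket", "fit_seal"},
    "inspect":         {"assemble_door"},
}

order = list(TopologicalSorter(precedence).static_order())
print(order)   # e.g. ['deposit_bracket', 'fit_seal', 'join_bracket', ...]
```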

  11. Remote monitoring field trial. Application to automated air sampling. Report on Task FIN-E935 of the Finnish Support Programme to IAEA Safeguards

    International Nuclear Information System (INIS)

    Poellaenen, R.; Ilander, T.; Lehtinen, J.; Leppaenen, A.; Nikkinen, M.; Toivonen, H.; Ylaetalo, S.; Smartt, H.; Garcia, R.; Martinez, R.; Glidewell, D.; Krantz, K.

    1999-01-01

    An automated air sampling station has recently been developed by the Radiation and Nuclear Safety Authority (STUK). The station is furnished with equipment that allows comprehensive remote monitoring of the station and the data. Under the Finnish Support Programme to IAEA Safeguards, STUK and Sandia National Laboratories (SNL) established a field trial to demonstrate the use of remote monitoring technologies. STUK provided means for real-time radiation monitoring and sample authentication, whereas SNL delivered means for authenticated surveillance of the equipment and its location. The field trial showed that remote monitoring can be carried out using simple means, although advanced facilities are needed for comprehensive surveillance. Authenticated measurement data could be reliably transferred from the monitoring site to the headquarters without the presence of authorized personnel at the monitoring site. The operation of the station and the remote monitoring system was reliable. (orig.)

  12. Method and apparatus for monitoring plasma processing operations

    Science.gov (United States)

    Smith, Jr., Michael Lane; Ward, Pamela Denise Peardon; Stevenson, Joel O'Don

    2002-01-01

    The invention generally relates to various aspects of a plasma process, and more specifically the monitoring of such plasma processes. One aspect relates in at least some manner to calibrating or initializing a plasma monitoring assembly. This type of calibration may be used to address wavelength shifts, intensity shifts, or both associated with optical emissions data obtained on a plasma process. A calibration light may be directed at a window through which optical emissions data is being obtained to determine the effect, if any, that the inner surface of the window is having on the optical emissions data being obtained therethrough, the operation of the optical emissions data gathering device, or both. Another aspect relates in at least some manner to various types of evaluations which may be undertaken of a plasma process which was run, and more typically one which is currently being run, within the processing chamber. Plasma health evaluations and process identification through optical emissions analysis are included in this aspect. Yet another aspect associated with the present invention relates in at least some manner to the endpoint of a plasma process (e.g., plasma recipe, plasma clean, conditioning wafer operation) or discrete/discernible portion thereof (e.g., a plasma step of a multiple step plasma recipe). Another aspect associated with the present invention relates to how one or more of the above-noted aspects may be implemented into a semiconductor fabrication facility, such as the distribution of wafers to a wafer production system. A final aspect of the present invention relates to networking a plurality of plasma monitoring systems, including with remote capabilities (i.e., outside of the clean room).

  13. Prototypic automated continuous recreational water quality monitoring of nine Chicago beaches

    Science.gov (United States)

    Shively, Dawn; Nevers, Meredith; Breitenbach, Cathy; Phanikumar, Mantha S.; Przybyla-Kelly, Kasia; Spoljaric, Ashley M.; Whitman, Richard L.

    2015-01-01

    Predictive empirical modeling is used in many locations worldwide as a rapid, alternative recreational water quality management tool to eliminate delayed notifications associated with traditional fecal indicator bacteria (FIB) culturing (referred to as the persistence model, PM) and to prevent errors in releasing swimming advisories. The goal of this study was to develop a fully automated water quality management system for multiple beaches using predictive empirical models (EM) and state-of-the-art technology. Many recent EMs rely on samples or data collected manually, which adds to analysis time and increases the burden to the beach manager. In this study, data from water quality buoys and weather stations were transmitted through cellular telemetry to a web hosting service. An executable program simultaneously retrieved and aggregated data for regression equations and calculated EM results each morning at 9:30 AM; results were transferred through RSS feed to a website, mapped to each beach, and received by the lifeguards to be posted at the beach. Models were initially developed for five beaches, but by the third year, 21 beaches were managed using refined and validated modeling systems. The adjusted R2 of the regressions relating Escherichia coli to hydrometeorological variables for the EMs were greater than those for the PMs, and ranged from 0.220 to 0.390 (2011) and 0.103 to 0.381 (2012). Validation results in 2013 revealed reduced predictive capabilities; however, three of the originally modeled beaches showed improvement in 2013 compared to 2012. The EMs generally showed higher accuracy and specificity than those of the PMs, and sensitivity was low for both approaches. In 2012 EM accuracy was 70–97%; specificity, 71–100%; and sensitivity, 0–64% and in 2013 accuracy was 68–97%; specificity, 73–100%; and sensitivity 0–36%. Factors that may have affected model capabilities include instrument malfunction, non-point source inputs, and sparse
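
    The empirical-model idea, regressing a fecal-indicator response on hydrometeorological predictors and issuing an advisory when the prediction exceeds a standard, can be sketched as below. The predictors, simulated data, and the 235 CFU/100 mL single-sample threshold are illustrative of this class of model, not the study's fitted equations.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Sketch of an empirical model (EM): regress log10 E. coli on assumed
# hydrometeorological predictors, then check a notification threshold.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))          # e.g. turbidity, wave height, rainfall
log_ecoli = 1.5 + 0.6 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.4, 200)

model = LinearRegression().fit(X, log_ecoli)
today = np.array([[1.2, 0.8, -0.1]])           # this morning's buoy/weather data
predicted = 10 ** model.predict(today)[0]
print(f"predicted E. coli: {predicted:.0f} CFU/100 mL",
      "-> advisory" if predicted > 235 else "-> open")
```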

  14. Performance assessment and improvement of control charts for statistical batch process monitoring

    NARCIS (Netherlands)

    Ramaker, Henk-Jan; van Sprang, Eric N. M.; Westerhuis, Johan A.; Gurden, Stephen P.; Smilde, Age K.; van der Meulen, Frank H.

    2006-01-01

    This paper describes the concepts of statistical batch process monitoring and the associated problems. It starts with an introduction to process monitoring in general which is then extended to batch process monitoring. The performance of control charts for batch process monitoring is discussed by
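
    A minimal example of the underlying control-chart logic is a Shewhart-style check of new observations against 3-sigma limits estimated from in-control data; batch charts extend this idea by replacing the constant center line with a time-resolved average batch trajectory. The sketch below uses simulated data.

```python
import numpy as np

# Shewhart-style control chart check on simulated in-control training data.
rng = np.random.default_rng(2)
in_control = rng.normal(loc=50.0, scale=2.0, size=100)   # training phase

center = in_control.mean()
sigma = in_control.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma        # 3-sigma limits

new_batch = np.array([49.7, 51.2, 50.4, 57.9, 50.1])     # monitored values
for t, x in enumerate(new_batch):
    if not lcl <= x <= ucl:
        print(f"time {t}: {x:.1f} outside [{lcl:.1f}, {ucl:.1f}] -> alarm")
```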

  15. Signal processing methodologies for an acoustic fetal heart rate monitor

    Science.gov (United States)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    Research and development of real-time signal processing methodologies is presented for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor. A linear predictor algorithm is utilized for detection of the heart tone event, and additional processing derives heart rate. The linear predictor is adaptively 'trained' in a least mean square error sense on generic fetal heart tones recorded from patients. A real-time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. Comparative data provide favorable indications of the feasibility of the acoustic monitor for clinical use.
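
    The linear-predictor idea can be sketched as an LMS-adapted one-step-ahead filter: the weights are trained to predict the next acoustic sample, and the prediction error tracks how well the learned heart-tone structure explains the incoming signal. The filter order, step size, and simulated signal below are illustrative.

```python
import numpy as np

# LMS-trained one-step-ahead linear predictor on a simulated noisy tone.
def lms_predictor(signal, order=16, mu=0.01):
    w = np.zeros(order)
    errors = np.zeros(len(signal))
    for n in range(order, len(signal)):
        x = signal[n - order:n][::-1]     # most recent samples first
        y_hat = w @ x                     # one-step-ahead prediction
        e = signal[n] - y_hat             # prediction error (residual)
        w += 2 * mu * e * x               # LMS weight update
        errors[n] = e
    return w, errors

fs = 2000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 140 * t) + 0.5 * np.random.randn(fs)  # assumed tone
w, e = lms_predictor(tone)
print("final residual power:", np.mean(e[-200:] ** 2))
```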

  16. Neural network training by Kalman filtering in process system monitoring

    International Nuclear Information System (INIS)

    Ciftcioglu, Oe.

    1996-03-01

    A Kalman filtering approach for neural network training is described. Its extended form is used as an adaptive filter in a nonlinear environment, namely a feedforward neural network. The Kalman filtering approach generally provides fast training and avoids excessive learning, which results in enhanced generalization capability. The network is used in a process monitoring application where the inputs are measurement signals. Since the measurement errors are also modelled in the Kalman filter, the approach yields accurate training and hence an accurate neural network model of the input-output relationships in the application. As the process of concern is a dynamic system, the input to the neural network is time dependent, so the training algorithm takes an adaptive form suitable for real-time operation in the monitoring task. (orig.)
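
    The training scheme can be illustrated with a toy extended Kalman filter for a single sigmoid neuron: the weights are the filter state, the network output is the nonlinear measurement model, and the measurement-noise variance R encodes the sensor error model. The dimensions and noise levels below are illustrative, not the paper's configuration.

```python
import numpy as np

# Toy EKF weight update for one sigmoid neuron: state = weights,
# measurement model = network output, R = sensor noise variance.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(3)
w = np.zeros(3)                       # network weights = EKF state
P = np.eye(3) * 10.0                  # state covariance
R = 0.05                              # measurement noise variance (assumed)

true_w = np.array([1.5, -2.0, 0.7])   # weights generating the "sensor" data
for _ in range(500):
    x = rng.normal(size=3)                    # input measurement vector
    y = sigmoid(true_w @ x) + rng.normal(0, np.sqrt(R))
    h = sigmoid(w @ x)                        # predicted output
    H = h * (1 - h) * x                       # Jacobian of h w.r.t. weights
    S = H @ P @ H + R                         # innovation variance (scalar)
    K = P @ H / S                             # Kalman gain
    w = w + K * (y - h)                       # weight (state) update
    P = P - np.outer(K, H @ P)                # covariance update

print("estimated weights:", w.round(2))
```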

  17. Monitoring tablet surface roughness during the film coating process

    DEFF Research Database (Denmark)

    Seitavuopio, Paulus; Heinämäki, Jyrki; Rantanen, Jukka

    2006-01-01

    The purpose of this study was to evaluate the change of surface roughness and the development of the film during the film coating process using laser profilometer roughness measurements, SEM imaging, and energy dispersive X-ray (EDX) analysis. Surface roughness and texture changes developing during the process of film coating tablets were studied by noncontact laser profilometry and scanning electron microscopy (SEM). An EDX analysis was used to monitor the magnesium stearate and titanium dioxide of the tablets. The tablet cores were film coated with aqueous hydroxypropyl methylcellulose, and the film coating was performed using an instrumented pilot-scale side-vented drum coater. The SEM images of the film-coated tablets showed that within the first 30 minutes, the surface of the tablet cores was completely covered with a thin film. The magnesium signal that was monitored by SEM-EDX disappeared after...

  18. A fuzzy logic control in adjustable autonomy of a multi-agent system for an automated elderly movement monitoring application.

    Science.gov (United States)

    Mostafa, Salama A; Mustapha, Aida; Mohammed, Mazin Abed; Ahmad, Mohd Sharifuddin; Mahmoud, Moamin A

    2018-04-01

    Autonomous agents are being widely used in many systems, such as ambient assisted-living systems, to perform tasks on behalf of humans. However, these systems usually operate in complex environments that entail uncertain, highly dynamic, or irregular workload. In such environments, autonomous agents tend to make decisions that lead to undesirable outcomes. In this paper, we propose a fuzzy-logic-based adjustable autonomy (FLAA) model to manage the autonomy of multi-agent systems that are operating in complex environments. This model aims to facilitate the autonomy management of agents and help them make competent autonomous decisions. The FLAA model employs fuzzy logic to quantitatively measure and distribute autonomy among several agents based on their performance. We implement and test this model in the Automated Elderly Movements Monitoring (AEMM-Care) system, which uses agents to monitor the daily movement activities of elderly users and perform fall detection and prevention tasks in a complex environment. The test results show that the FLAA model improves the accuracy and performance of these agents in detecting and preventing falls.
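
    The fuzzy quantification step can be sketched as fuzzifying an agent's performance score into linguistic grades and defuzzifying a membership-weighted autonomy level, as below. The membership shapes and autonomy levels are invented for illustration and are not the FLAA model's actual rule base.

```python
import numpy as np

# Fuzzify a performance score into low/medium/high grades, then defuzzify an
# autonomy level as the membership-weighted average of per-grade levels.
def triangular(x, a, b, c):
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9),
                                 (c - x) / (c - b + 1e-9)), 0.0)

def autonomy_level(performance):          # performance score in [0, 1]
    low = triangular(performance, -0.01, 0.0, 0.5)
    med = triangular(performance, 0.0, 0.5, 1.0)
    high = triangular(performance, 0.5, 1.0, 1.01)
    levels = np.array([0.2, 0.6, 1.0])    # assumed autonomy granted per grade
    weights = np.array([low, med, high])
    return float(weights @ levels / weights.sum())   # weighted-average defuzzification

for p in (0.2, 0.5, 0.9):
    print(f"performance {p:.1f} -> autonomy {autonomy_level(p):.2f}")
```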

  19. Monitoring Satellite Data Ingest and Processing for the Atmosphere Science Investigator-led Processing Systems (SIPS)

    Science.gov (United States)

    Witt, J.; Gumley, L.; Braun, J.; Dutcher, S.; Flynn, B.

    2017-12-01

    The Atmosphere SIPS (Science Investigator-led Processing Systems) team at the Space Science and Engineering Center (SSEC), which is funded through a NASA contract, creates Level 2 cloud and aerosol products from the VIIRS instrument aboard the S-NPP satellite. In order to monitor the ingest and processing of files, we have developed an extensive monitoring system to observe every step in the process. The status grid is used for real-time monitoring and shows the current state of the system, including what files we have and whether or not we are meeting our latency requirements. Our snapshot tool displays the state of the system in the past. It displays which files were available at a given hour and is used for historical and backtracking purposes. In addition to these grid-like tools we have created histograms and other statistical graphs for tracking processing and ingest metrics, such as total processing time, job queue time, and latency statistics.

  20. Sensor fault-tolerant control for gear-shifting engaging process of automated manual transmission

    Science.gov (United States)

    Li, Liang; He, Kai; Wang, Xiangyu; Liu, Yahui

    2018-01-01

    The angular displacement sensor on the actuator of an automated manual transmission (AMT) is prone to faults, and a sensor fault disturbs normal control, affecting the entire gear-shifting process of the AMT and degrading riding comfort. In order to solve this problem, this paper proposes a fault-tolerant control method for the AMT gear-shifting engaging process. Using the measured current of the actuator motor and the angular displacement of the actuator, a gear-shifting engaging load torque table is built and updated before the occurrence of the sensor fault. Meanwhile, the residual between the estimated and measured angular displacements is used to detect the sensor fault. Once the residual exceeds a determined fault threshold, the sensor fault is detected. Then, switch control is triggered, and the current observer and load torque table estimate the actual gear-shifting position, which replaces the measured one to continue controlling the gear-shifting process. Numerical and experimental tests are carried out to evaluate the reliability and feasibility of the proposed methods, and the results show that the performance of estimation and control is satisfactory.
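
    The residual-based detection and switch logic described above can be sketched as follows: a model-based estimate is maintained alongside the sensor reading, and control switches to the estimate once the residual exceeds the fault threshold. The crude current-to-displacement model and the threshold value are assumptions, not the paper's identified dynamics.

```python
# Sketch of residual-based sensor fault detection with switch control.
# The position model is a crude stand-in for the paper's current observer
# plus load-torque table; the threshold is an assumed value.
THRESHOLD = 0.5   # mm, assumed fault threshold

def estimate_position(prev_est, motor_current, dt, gain=2.0):
    # Crude observer: displacement rate assumed proportional to motor current.
    return prev_est + gain * motor_current * dt

est, dt = 0.0, 0.01
for step in range(1, 200):
    current = 1.0                                   # measured motor current, A
    est = estimate_position(est, current, dt)       # model-based estimate
    measured = est if step < 100 else 0.0           # sensor sticks at step 100
    if abs(est - measured) > THRESHOLD:             # residual check
        position = est                              # fault: use the estimate
        print(f"step {step}: sensor fault detected, switching to observer")
        break
    position = measured                             # healthy: trust the sensor
```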