Sample records for automated process monitoring

  1. Elektronische monitoring van luchtwassers op veehouderijbedrijven = Automated process monitoring and data logging of air scrubbers at animal houses

    NARCIS (Netherlands)

    Melse, R.W.; Franssen, J.C.T.J.


    At 6 animal houses, air scrubbers equipped with an automated process monitoring and data logging system were tested. The measured values were successfully stored, but some of them, especially the pH and EC of the recirculation water, appeared not to be correct at all times.

  2. Plug-and-play monitoring and performance optimization for industrial automation processes

    CERN Document Server

    Luo, Hao


    Dr.-Ing. Hao Luo demonstrates the development of advanced plug-and-play (PnP) process monitoring and control systems for industrial automation processes. With the aid of the so-called Youla parameterization, a novel PnP process monitoring and control architecture (PnP-PMCA) with modularized components is proposed. To validate the developments, a case study on an industrial rolling mill benchmark is performed, and the real-time implementation on a laboratory brushless DC motor is presented. Contents: PnP Process Monitoring and Control Architecture; Real-Time Configuration Techniques for PnP Process Monitoring; Real-Time Configuration Techniques for PnP Performance Optimization; Benchmark Study and Real-Time Implementation. Target Groups: researchers and students of automation and control engineering; practitioners in the area of industrial and production engineering. The Author: Hao Luo received the Ph.D. degree at the Institute for Automatic Control and Complex Systems (AKS) at the University of Duisburg-Essen, Germany, ...

  3. Robowell: An automated process for monitoring ground water quality using established sampling protocols (United States)

    Granato, G.E.; Smith, K.P.


    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  4. Complex Event Processing Approach To Automated Monitoring Of Particle Accelerator And Its Control System

    Directory of Open Access Journals (Sweden)

    Karol Grzegorczyk


    Full Text Available This article presents the design and implementation of a software component for automated monitoring and diagnostic information analysis of a particle accelerator and its control system. The information that is analyzed can be seen as streams of events. A Complex Event Processing (CEP) approach to event processing was selected. The main advantage of this approach is the ability to continuously query data coming from several streams. The presented software component is based on Esper, the most popular open-source implementation of CEP. As a test bed, the control system of the accelerator complex located at CERN, the European Organization for Nuclear Research, was chosen. The complex includes the Large Hadron Collider, the world's most powerful accelerator. The main contribution is to show that the CEP approach can successfully address many of the previously unsolved challenges associated with automated monitoring of the accelerator and its control system. Test results, performance analysis, and a proposal for further work are also presented.
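
The continuous-query idea at the heart of CEP can be illustrated with a minimal sketch: an aggregate is re-evaluated over a sliding window each time an event arrives, and a listener fires when a condition holds. This is a simplified Python analogue of an Esper length-window query, not Esper's actual API; the class name and threshold are illustrative.

```python
from collections import deque

class SlidingWindowQuery:
    """Continuously evaluate an aggregate over the last `size` events,
    mimicking a CEP engine's length-window query with a listener."""
    def __init__(self, size, threshold):
        self.window = deque(maxlen=size)   # oldest event drops out automatically
        self.threshold = threshold

    def on_event(self, value):
        self.window.append(value)
        avg = sum(self.window) / len(self.window)
        # Fire (return the aggregate) when the windowed average crosses
        # the threshold, analogous to a CEP statement's listener.
        return avg if avg > self.threshold else None

q = SlidingWindowQuery(size=3, threshold=50.0)
alerts = [q.on_event(v) for v in [10, 20, 90, 95, 100]]
# Early events stay below the threshold; later windows trigger alerts.
```

A real CEP engine generalizes this to time-based windows, joins across several streams, and declarative query syntax, but the incremental evaluate-on-arrival pattern is the same.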

  5. Distributed cyberinfrastructure tools for automated data processing of structural monitoring data (United States)

    Zhang, Yilan; Kurata, Masahiro; Lynch, Jerome P.; van der Linden, Gwendolyn; Sederat, Hassan; Prakash, Atul


    The emergence of cost-effective sensing technologies has now enabled the use of dense arrays of sensors to monitor the behavior and condition of large-scale bridges. The continuous operation of dense networks of sensors presents a number of new challenges, including how to manage the massive amounts of data that such a system can create. This paper reports on the progress of the creation of cyberinfrastructure tools which hierarchically control networks of wireless sensors deployed in a long-span bridge. The internet-enabled cyberinfrastructure is centrally managed by a powerful database which controls the flow of data in the entire monitoring system architecture. A client-server model built upon the database provides both data providers and system end-users with secured access to various levels of information about a bridge. In the system, information on bridge behavior (e.g., acceleration, strain, displacement) and environmental conditions (e.g., wind speed, wind direction, temperature, humidity) is uploaded to the database from sensor networks installed in the bridge. Then, data interrogation services interface with the database via client APIs to autonomously process data. The current research effort focuses on an assessment of the scalability and long-term robustness of the proposed cyberinfrastructure framework, which has been implemented along with a permanent wireless monitoring system on the New Carquinez (Alfred Zampa Memorial) Suspension Bridge in Vallejo, CA. Many data interrogation tools are under development using sensor data and bridge metadata (e.g., geometric details, material properties, etc.). Sample data interrogation clients include those for the detection of faulty sensors and for automated modal parameter extraction.

  6. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates (United States)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yaun, Fuh-Gwo


    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel (Registered Trademark) IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and that the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately closed-loop process control to maximize composite part quality and consistency.

  7. Automated system for acquisition and image processing for the control and monitoring of de-thorned nopal (United States)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.


    This paper describes the design and fabrication of a system for image acquisition and processing to control the removal of thorns from the nopal vegetable (Opuntia ficus indica) in an automated machine that uses pulses from a Nd:YAG laser. The areolas, the areas where thorns grow on the bark of the nopal, are located by applying segmentation algorithms to the images obtained by a CCD. Once the position of the areolas is known, their coordinates are sent to a motor system that steers the laser to interact with all areolas and remove the thorns from the nopal. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs the tasks of acquisition, preprocessing, segmentation, recognition, and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.

  8. Automated monitoring of milk meters

    NARCIS (Netherlands)

    Mol, de R.M.; Andre, G.


    Automated monitoring might be an alternative to periodic checking of electronic milk meters. A computer model based on Dynamic Linear Modelling (DLM) has been developed for this purpose. Two situations are distinguished: multiple milking stands in the milking parlour, and only one milking stand in the milking parlour.
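
The record does not give the model equations, but a one-dimensional dynamic linear model (the local level model) with Kalman filtering is the standard DLM building block, and it shows how such monitoring can flag a drifting meter via standardized forecast errors. The parameter values and yield data below are illustrative assumptions, not taken from the paper.

```python
def local_level_filter(observations, q=0.5, r=2.0, m0=0.0, c0=100.0):
    """One-dimensional dynamic linear (local level) model:
        state: level_t = level_{t-1} + w_t,  w_t ~ N(0, q)
        obs:   y_t     = level_t     + v_t,  v_t ~ N(0, r)
    Returns filtered means and standardized one-step forecast errors;
    persistently large errors suggest a meter needs checking."""
    m, c = m0, c0
    means, std_errors = [], []
    for y in observations:
        rr = c + q             # prior variance of the state
        f, qf = m, rr + r      # one-step forecast and its variance
        e = y - f              # forecast error (innovation)
        std_errors.append(e / qf ** 0.5)
        k = rr / qf            # Kalman gain
        m = m + k * e
        c = (1 - k) * rr
        means.append(m)
    return means, std_errors

# Stable yields around 10 kg, then a sudden jump the monitor should flag.
means, errs = local_level_filter([10.2, 9.9, 10.1, 10.0, 10.1, 17.5])
```

The final standardized error is several sigma, which is the kind of signal an automated check could use instead of a periodic manual inspection.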



    Carrión Muñoz, Rolando; Docente de la FII - UNMSM


    The article shows the role that ergonomics plays in the automation of processes, and its importance for Industrial Engineering.

  10. High-Throughput, Automated Protein A Purification Platform with Multiattribute LC-MS Analysis for Advanced Cell Culture Process Monitoring. (United States)

    Dong, Jia; Migliore, Nicole; Mehrman, Steven J; Cunningham, John; Lewis, Michael J; Hu, Ping


    The levels of many product related variants observed during the production of monoclonal antibodies are dependent on control of the manufacturing process, especially the cell culture process. However, it is difficult to characterize samples pulled from the bioreactor due to the low levels of product during the early stages of the process and the high levels of interfering reagents. Furthermore, analytical results are often not available for several days, which slows the process development cycle and prevents "real time" adjustments to the manufacturing process. To reduce the delay and enhance our ability to achieve quality targets, we have developed a low-volume, high-throughput, and high-content analytical platform for at-line product quality analysis. This workflow includes an automated, 96-well plate protein A purification step to isolate antibody product from the cell culture fermentation broth, followed by rapid, multiattribute LC-MS analysis. We have demonstrated quantitative correlations between particular process parameters with the levels of glycosylated and glycated species in a series of small scale experiments, but the platform could be used to monitor other attributes and applied across the biopharmaceutical industry.

  11. Heavy Oil Process Monitor: Automated On-Column Asphaltene Precipitation and Re-Dissolution

    Energy Technology Data Exchange (ETDEWEB)

    John F. Schabron; Joseph F. Rovani; Mark Sanderson


    An automated separation technique was developed that provides a new approach to measuring the distribution profiles of the most polar, or asphaltenic, components of an oil, using a continuous flow system to precipitate and re-dissolve asphaltenes from the oil. Methods of analysis based on this new technique were explored. One method based on the new technique involves precipitation of a portion of a residua sample in heptane on a polytetrafluoroethylene (PTFE)-packed column. The precipitated material is re-dissolved in three steps using solvents of increasing polarity: cyclohexane, toluene, and methylene chloride. The amount of asphaltenes that dissolve in cyclohexane is a useful diagnostic of the thermal history of an oil and its proximity to coke formation. For example, about 40% (w/w) of the heptane asphaltenes from unpyrolyzed residua dissolves in cyclohexane. As pyrolysis progresses, this number decreases to below 15% as coke and toluene-insoluble pre-coke materials appear. Currently, the procedure for the isolation of heptane asphaltenes and the determination of the amount of asphaltenes soluble in cyclohexane spans three days. The automated procedure takes one hour. Another method uses a single solvent, methylene chloride, to re-dissolve the material that precipitates in heptane on the PTFE-packed column. The area of this second peak can be used to calculate a value which correlates with gravimetric asphaltene content. Currently the gravimetric procedure to determine asphaltenes takes about 24 hours. The automated procedure takes 30 minutes. Results for four series of original and pyrolyzed residua were compared with data from the gravimetric methods. Methods based on the new on-column precipitation and re-dissolution technique provide significantly more detail about the polar constituents of oils than the gravimetric determination of asphaltenes.

  12. Bioreactor process monitoring using an automated microfluidic platform for cell-based assays

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin


    We report on a novel microfluidic system designed to monitor in real-time the concentration of live and dead cells in industrial cell production. Custom-made stepper motor actuated peristaltic pumps and valves, fluidic interconnections, sample-to-waste liquid management and image cytometry...

  13. Simultaneous and automated monitoring of the multimetal biosorption processes by potentiometric sensor array and artificial neural network. (United States)

    Wilson, D; del Valle, M; Alegret, S; Valderrama, C; Florido, A


    In this communication, a new methodology for the simultaneous and automated monitoring of biosorption processes of multimetal mixtures of polluting heavy metals on vegetable wastes, based on flow-injection potentiometry (FIP) and electronic tongue (ET) detection, is presented. A fixed-bed column filled with grape stalks from wine industry wastes is used as the biosorption setup to remove the metal mixtures from the influent solution. The monitoring system consists of a computer-controlled FIP prototype with the ET, based on an array of 9 flow-through ion-selective electrodes and electrodes with generic response to divalent ions placed in series, plus an artificial neural network response model. The cross-response to Cu(2+), Cd(2+), Zn(2+), Pb(2+) and Ca(2+) (as target ions) is used; correct operation of the system is achieved only when dynamic treatment of the kinetic components of the transient signal is incorporated. For this purpose, the FIA peaks are transformed via Fourier treatment, and selected coefficients are used to feed an artificial neural network response model. Real-time monitoring of different binary (Cu(2+)/Pb(2+)), (Cu(2+)/Zn(2+)) and ternary mixtures (Cu(2+)/Pb(2+)/Zn(2+)), (Cu(2+)/Zn(2+)/Cd(2+)), simultaneous with the release of Ca(2+) in the effluent solution, is achieved satisfactorily using the reported system, obtaining the corresponding breakthrough curves and showing the ion-exchange mechanism among the different metals. Analytical performance is verified against conventional spectroscopic techniques, with good concordance of the obtained breakthrough curves and modeled adsorption parameters.
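
As a rough sketch of the feature-extraction step described above, a transient FIA peak can be compressed into a handful of low-frequency Fourier coefficients before being fed to the neural network. The coefficient count and the synthetic peak below are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def fourier_features(transient, n_coeffs=8):
    """Compress a flow-injection transient (sensor peak) into a small
    real-valued feature vector of low-frequency Fourier coefficients,
    suitable as inputs to a neural network response model."""
    spectrum = np.fft.rfft(transient)
    coeffs = spectrum[:n_coeffs]
    # Interleave real and imaginary parts into one real feature vector.
    return np.column_stack([coeffs.real, coeffs.imag]).ravel()

t = np.linspace(0, 1, 128)
peak = np.exp(-((t - 0.3) ** 2) / 0.005)   # synthetic FIA-like peak
features = fourier_features(peak)
# 128 samples are reduced to 16 network inputs; the low-frequency
# coefficients retain the peak's overall height and shape information.
```

The design rationale is dimensionality reduction: the network trains on a compact, denoised representation of each transient rather than on raw sampled signals.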

  14. Evaluation of the occupational dose reduction after automation process for calibration of gamma radiation monitors; Avaliacao da reducao da dose ocupacional apos automacao do processo de calibracao de monitores de radiacao gama

    Energy Technology Data Exchange (ETDEWEB)

    Silva Junior, Iremar Alves da; Potiens, Maria da P.A. [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]


    This study evaluated the occupational dose of the calibration technicians responsible for the calibration of gamma radiation monitors at the Instrument Calibration Laboratory of the Institute of Energy and Nuclear Research (IPEN-LCI), Sao Paulo, Brazil, in their calibration activities before and after automation of the gamma monitor calibration process was completed. Various occupational dose measurements were taken inside the calibration room and in the irradiator control room, allowing the occupational dose in these environments to be calculated and reported for a full calibration of a gamma radiation monitor. The results show the advantage of the automated process: a decrease in both dose and calibration time. (author)

  15. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams. (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W


    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.

  16. National Automated Conformity Inspection Process - (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  17. Automated process planning system (United States)

    Mann, W.


    Program helps process engineers set up manufacturing plans for machined parts. System allows one to develop and store library of similar parts characteristics, as related to particular facility. Information is then used in interactive system to help develop manufacturing plans that meet required standards.

  18. Retrofitting automated process control systems at Ukrainian power stations

    Energy Technology Data Exchange (ETDEWEB)

    B.E. Simkin; V.S. Naumchik; B.D. Kozitskii (and others) [OAO L'vovORGRES, Lviv (Ukraine)]


    Approaches and principles for retrofitting automated process control systems at Ukrainian power stations are considered. The results obtained from retrofitting the monitoring and control system of Unit 9 at the Burshtyn thermal power station are described.

  19. Automated wind-icing monitoring system

    Energy Technology Data Exchange (ETDEWEB)

    Horokhov, Y.; Nekrasov, Y.; Turbin, S. [Donbas National Academy of Civil Engineering and Architecture, Makeyevka, Donetsk (Ukraine); Grimud, G. [NEC Ukrenergo, Kiev (Ukraine)


    The development of automated wind-icing monitoring systems (AWIMS) has increased the operational reliability of existing overhead lines through more accurate prediction of icing events in the Ukraine. The systems are capable of operating without the presence of personnel, and allow operators to immediately obtain information about icing processes. The systems provide statistically significant sets of data for determining and predicting loading conditions, combining measurements of icing mass, wind speed and direction, temperature, and humidity. An outline of the principles of AWIMS is presented in this paper, as well as a description of the system's architecture and operating principles. The monitoring system consists of an ice mass measuring device; a strain gauge sensor; a photoelectric pickup to determine perpendicular mean wind direction; and a wire simulator. The measuring devices are installed 10 meters above ground. Data are transmitted every 30 minutes to a central information office, where the information is processed and stored. Details of the ultrasonic anemometer for wind measurements, as well as the devices used for humidity and temperature measurement, were presented. The AWIMS computer software measures 6 climatic parameters: wind speed; wind direction; air temperature; humidity; icing mass; and wind pressure on ice-covered wires. Results of a series of tests were presented, which included a weather station data analysis and a comparison of the AWIMS with standard climatic loads. An analysis of overhead line failure statistical data was also conducted. Spatial icing distributions were used to calculate the threshold sensitivity for the AWIMS. An estimation of overhead line density per square kilometer was made to determine the placement of the systems. It was concluded that 8 more AWIMS will be installed in the following year. 3 refs., 10 figs.

  20. Using artificial intelligence to automate remittance processing. (United States)

    Adams, W T; Snow, G M; Helmick, P M


    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  1. Automation of Large-scale Computer Cluster Monitoring Information Analysis (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi


    High-throughput computing platforms consist of a complex infrastructure and provide a number of services that are prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems, and to identify the status of a service at a given time. The approach for the SRT predictions is based on an Adaptive Neuro-Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier-2 center GoeGrid. Ten-fold cross-validation results demonstrate the high efficiency of both approaches in comparison to known methods.
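
An ANFIS predictor is too involved to reproduce here, but the prediction task itself, mapping recent SRT history to a short-term forecast, can be sketched with a deliberately simplified stand-in: exponential smoothing. This is not the paper's method; `alpha` and the sample histories are illustrative.

```python
def forecast_srt(history, alpha=0.3):
    """Exponentially weighted one-step forecast of service response time.
    A deliberately simplified stand-in for an ANFIS predictor: both map
    recent SRT observations to a short-term prediction, which an
    automated monitor can compare against service-level thresholds."""
    level = history[0]
    for y in history[1:]:
        # Blend the newest observation into the running level estimate.
        level = alpha * y + (1 - alpha) * level
    return level

# A rising SRT trend (seconds) yields a forecast between the recent values.
pred = forecast_srt([0.8, 0.9, 1.1, 1.4])
```

A monitor built on such forecasts would raise an alert when the predicted SRT crosses a service-level threshold, rather than waiting for the service to actually degrade.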

  2. Elements of EAF automation processes (United States)

    Ioana, A.; Constantin, N.; Dragna, E. C.


    Our article presents elements of Electric Arc Furnace (EAF) automation. We present and analyze in detail two automation schemes: the scheme of the electrical EAF automation system, and the scheme of the thermal EAF automation system. The results of applying these automation schemes consist in: a significant reduction of the specific consumption of electrical energy of the Electric Arc Furnace, increased productivity of the Electric Arc Furnace, increased quality of the steel produced, and increased durability of the structural elements of the Electric Arc Furnace.

  3. Biogeochemical processing of nutrients in groundwater-fed stream during baseflow conditions - the value of fluorescence spectroscopy and automated high-frequency nutrient monitoring (United States)

    Bieroza, Magdalena; Heathwaite, Louise


    Recent research in groundwater-dominated streams indicates that organic matter plays an important role in nutrient transformations at the surface-groundwater interface known as the hyporheic zone. Mixing of water and nutrient fluxes in the hyporheic zone controls in-stream nutrient availability, dynamics, and export to downstream reaches. In particular, benthic sediments can form adsorptive sinks for organic matter and reactive nutrients (nitrogen and phosphorus) that sustain a variety of hyporheic processes, e.g. denitrification and microbial uptake. Thus, hyporheic metabolism can have an important effect on both the quantity (concentration) and quality (labile vs. refractory character) of organic matter. Here, high-frequency nutrient monitoring combined with spectroscopic analysis was used to provide insights into the biogeochemical processing of a small agricultural stream in NE England subject to diffuse nutrient pollution. Biogeochemical data were collected hourly for a week at baseflow conditions, when in-stream-hyporheic nutrient dynamics have the greatest impact on stream health. In-stream nutrients (total phosphorus, reactive phosphorus, nitrate nitrogen) and water quality parameters (turbidity, specific conductivity, pH, temperature, dissolved oxygen, redox potential) were measured in situ hourly by an automated bank-side laboratory. Concurrent hourly autosamples were retrieved daily and analysed for nutrients and fine sediments, including spectroscopic analyses of dissolved organic matter: excitation-emission matrix (EEM) fluorescence spectroscopy and ultraviolet-visible (UV-Vis) absorbance spectroscopy. Our results show that organic matter can potentially be utilised as a natural, environmental tracer of the biogeochemical processes occurring at the surface-groundwater interface in streams. High-frequency spectroscopic characterisation of in-stream organic matter can provide useful quantitative and qualitative information on fluxes of reactive nutrients in

  4. Real-time bioacoustics monitoring and automated species identification (United States)

    Corrada-Bravo, Carlos; Campos-Cerqueira, Marconi; Milan, Carlos; Vega, Giovany; Alvarez, Rafael


    Traditionally, animal species diversity and abundance are assessed using a variety of methods that are generally costly, limited in space and time, and most importantly, rarely include a permanent record. Given the urgency of climate change and the loss of habitat, it is vital that we use new technologies to improve and expand global biodiversity monitoring to thousands of sites around the world. In this article, we describe the acoustical component of the Automated Remote Biodiversity Monitoring Network (ARBIMON), a novel combination of hardware and software for automating data acquisition, data management, and species identification based on audio recordings. The major components of the cyberinfrastructure include: a solar powered remote monitoring station that sends 1-min recordings every 10 min to a base station, which relays the recordings in real-time to the project server, where the recordings are processed and uploaded to the project website. Along with a module for viewing, listening, and annotating recordings, the website includes a species identification interface to help users create machine learning algorithms to automate species identification. To demonstrate the system we present data on the vocal activity patterns of birds, frogs, insects, and mammals from Puerto Rico and Costa Rica. PMID:23882441

  5. Real-time bioacoustics monitoring and automated species identification

    Directory of Open Access Journals (Sweden)

    T. Mitchell Aide


    Full Text Available Traditionally, animal species diversity and abundance are assessed using a variety of methods that are generally costly, limited in space and time, and most importantly, rarely include a permanent record. Given the urgency of climate change and the loss of habitat, it is vital that we use new technologies to improve and expand global biodiversity monitoring to thousands of sites around the world. In this article, we describe the acoustical component of the Automated Remote Biodiversity Monitoring Network (ARBIMON), a novel combination of hardware and software for automating data acquisition, data management, and species identification based on audio recordings. The major components of the cyberinfrastructure include: a solar powered remote monitoring station that sends 1-min recordings every 10 min to a base station, which relays the recordings in real-time to the project server, where the recordings are processed and uploaded to the project website. Along with a module for viewing, listening, and annotating recordings, the website includes a species identification interface to help users create machine learning algorithms to automate species identification. To demonstrate the system we present data on the vocal activity patterns of birds, frogs, insects, and mammals from Puerto Rico and Costa Rica.

  6. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.;


    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition...

  7. ERP processes automation in corporate environments

    Directory of Open Access Journals (Sweden)

    Antonoaie Victor


    Full Text Available Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources, and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use, and reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and a meaningful impact was obtained.

  8. Multiplatform automated system for monitoring and sprinkler irrigation control

    Directory of Open Access Journals (Sweden)

    PINTO, M. L.


    Full Text Available Automation systems with web and mobile control facilitate processes in several areas, among them the agricultural sector. Specifically in irrigation management, the lowest-cost technology is not able to satisfy the farmer's needs: correct water supply to plants and remote monitoring of the irrigation. The objective of this paper is to present a system for controlling and monitoring irrigation with multiplatform support for both desktop and web/mobile. The system is designed to perform automatic irrigation management, providing the exact amount of water needed by the crop and avoiding both water stress to the crop and the waste of resources such as water and electricity. Additionally, the system allows remote monitoring from anywhere by means of a computer and/or mobile device over the Internet. This work was developed during the undergraduate mentorship of the authors.
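    The management loop described above amounts to a refill-point rule: irrigate only when soil moisture drops below a set point, and run just long enough to cover the deficit. The sketch below illustrates that idea; the thresholds, conversion factors, and function names are illustrative assumptions, since the paper does not publish its control algorithm.

    ```python
    # Hypothetical refill-point irrigation rule (all numbers are assumed,
    # not taken from the paper).

    def irrigation_minutes(soil_moisture_pct, field_capacity_pct=30.0,
                           refill_point_pct=18.0, mm_per_minute=0.5,
                           mm_per_pct=1.2):
        """Return sprinkler run time (minutes) needed to bring soil
        moisture back to field capacity; 0 if moisture is still above
        the refill point, so no water or electricity is wasted."""
        if soil_moisture_pct >= refill_point_pct:
            return 0.0
        deficit_mm = (field_capacity_pct - soil_moisture_pct) * mm_per_pct
        return deficit_mm / mm_per_minute
    ```

    Wet soil yields no irrigation; dry soil yields a run time sized to the deficit, which is the "exact amount of water" behaviour the abstract describes.
    
    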


    Directory of Open Access Journals (Sweden)

    A. S. Kirienko


    Full Text Available The article demonstrates the expediency of a system for the control and monitoring of production processes, which would lower labour costs and increase productivity through better organization of the production process. The main purpose of remote monitoring is to enable a quick, remote assessment of the current situation on the production floor, so that sound and timely management decisions can be made.

  10. Automated solar cell assembly team process research (United States)

    Nowlan, M. J.; Hogan, S. J.; Darkazalli, G.; Breen, W. F.; Murach, J. M.; Sutherland, S. F.; Patterson, J. S.


    This report describes work done under the Photovoltaic Manufacturing Technology (PVMaT) project, Phase 3A, which addresses problems that are generic to the photovoltaic (PV) industry. Spire's objective during Phase 3A was to use its light soldering technology and experience to design and fabricate solar cell tabbing and interconnecting equipment to develop new, high-yield, high-throughput, fully automated processes for tabbing and interconnecting thin cells. Areas that were addressed include processing rates, process control, yield, throughput, material utilization efficiency, and increased use of automation. Spire teamed with Solec International, a PV module manufacturer, and the University of Massachusetts at Lowell's Center for Productivity Enhancement (CPE), automation specialists, who are lower-tier subcontractors. A number of other PV manufacturers, including Siemens Solar, Mobil Solar, Solar Web, and Texas Instruments, agreed to evaluate the processes developed under this program.

  11. Non-Contact Conductivity Measurement for Automated Sample Processing Systems (United States)

    Beegle, Luther W.; Kirby, James P.


    A new method has been developed for monitoring and control of automated sample processing and preparation, focusing especially on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also serve to optimize operational time and the use of consumables.
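    The triggering logic above can be sketched in a few lines: the next protocol stage starts as soon as the outlet probe shows the flush has returned the system to its low-conductivity state. The threshold value is an arbitrary assumption, since the brief gives no numbers.

    ```python
    # Conductivity-triggered step sequencing sketch. The low-conductivity
    # threshold is an assumed value for illustration only.

    LOW_US_CM = 5.0  # assumed "system flushed/neutral" level, uS/cm

    def next_step_trigger(readings_us_cm, low_threshold=LOW_US_CM):
        """Return the index of the first outlet reading below the low
        threshold (flush complete, next protocol stage may start), or
        None if the flush never completes."""
        for i, value in enumerate(readings_us_cm):
            if value < low_threshold:
                return i
        return None
    ```

    A falling trace such as acid wash (high) followed by a de-ionized water flush (low) produces a trigger at the first low reading; a trace that never drops returns None, which a supervisory loop could treat as a fault.
    
    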

  12. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, the methodology, work performed during the summer, and future work planned.

  13. An automated platform for phytoplankton ecology and aquatic ecosystem monitoring. (United States)

    Pomati, Francesco; Jokela, Jukka; Simona, Marco; Veronesi, Mauro; Ibelings, Bas W


    High quality monitoring data are vital for tracking and understanding the causes of ecosystem change. We present a potentially powerful approach for phytoplankton and aquatic ecosystem monitoring, based on integration of scanning flow-cytometry for the characterization and counting of algal cells with multiparametric vertical water profiling. This approach affords high-frequency data on phytoplankton abundance, functional traits and diversity, coupled with the characterization of environmental conditions for growth over the vertical structure of a deep water body. Data from a pilot study revealed effects of an environmental disturbance event on the phytoplankton community in Lake Lugano (Switzerland), characterized by a reduction in cytometry-based functional diversity and by a period of cyanobacterial dominance. These changes were missed by traditional limnological methods, employed in parallel to high-frequency monitoring. Modeling of phytoplankton functional diversity revealed the importance of integrated spatiotemporal data, including circadian time-lags and variability over the water column, to understand the drivers of diversity and dynamic processes. The approach described represents progress toward an automated and trait-based analysis of phytoplankton natural communities. Streamlining of high-frequency measurements may represent a resource for understanding, modeling and managing aquatic ecosystems under impact of environmental change, yielding insight into processes governing phytoplankton community resistance and resilience.
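    Cytometry-based functional diversity of the kind tracked in this study is commonly summarized with a Shannon-type index over trait-based groups. The sketch below is a generic illustration of that idea, not the study's actual metric.

    ```python
    # Shannon diversity over trait-group abundances (generic illustration).
    import math

    def shannon_diversity(counts):
        """Shannon index H' = -sum(p_i * ln p_i) over group abundances."""
        total = sum(counts)
        return -sum((c / total) * math.log(c / total)
                    for c in counts if c > 0)
    ```

    A community dominated by one group, as during the cyanobacterial dominance reported above, scores lower than an even community of the same richness, which is the kind of diversity reduction the flow-cytometry data revealed.
    
    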

  14. Automated chemical monitoring in new projects of nuclear power plant units (United States)

    Lobanok, O. I.; Fedoseev, M. V.


    The development of automated chemical monitoring systems in nuclear power plant units for the past 30 years is briefly described. The modern level of facilities used to support the operation of automated chemical monitoring systems in Russia and abroad is shown. Hardware solutions suggested by the All-Russia Institute for Nuclear Power Plant Operation (which is the General Designer of automated process control systems for power units used in the AES-2006 and VVER-TOI Projects) are presented, including the structure of additional equipment for monitoring water chemistry (taking the Novovoronezh 2 nuclear power plant as an example). It is shown that the solutions proposed with respect to receiving and processing of input measurement signals and subsequent construction of standard control loops are unified in nature. Simultaneous receipt of information from different sources for ensuring that water chemistry is monitored in sufficient scope and with required promptness is one of the problems that have been solved successfully. It is pointed out that improved quality of automated chemical monitoring can be supported by organizing full engineering follow-up of the automated chemical monitoring system's equipment throughout its entire service life.

  15. Automated Monitoring of Pipeline Rights-of-Way (United States)

    Frost, Chard Ritchie


    NASA Ames Research Center and the Pipeline Research Council International, Inc. have partnered in the formation of a research program to identify and develop the key technologies required to enable automated detection of threats to gas and oil transmission and distribution pipelines. This presentation describes the Right-of-way Automated Monitoring (RAM) program and highlights research successes to date, continuing challenges to implementing the RAM objectives, and the program's ongoing work and plans.

  16. Monitoring Business Processes (United States)

    Bellandi, Valerio; Ceravolo, Paolo; Damiani, Ernesto; Frati, Fulvio

    In this chapter, we introduce the TEKNE Metrics Framework that performs services to monitor business processes. This framework was designed to support the prescription and explanation of these processes. TEKNE's most innovative contribution is managing data expressed in declarative form. To face this challenge, the TEKNE project implemented an infrastructure that relies on declarative Semantic Web technologies designed to be used in distributed systems.

  17. Wind Turbine Manufacturing Process Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Waseem Faidi; Chris Nafis; Shatil Sinha; Chandra Yerramalli; Anthony Waas; Suresh Advani; John Gangloff; Pavel Simacek


    The objective is to develop a practical inline inspection that could be used in combination with automated composite material placement equipment to economically manufacture high-performance and reliable carbon composite wind turbine blade spar caps. The technical feasibility and cost benefit of the approach will be assessed to provide a solid basis for further development and implementation in the wind turbine industry. The program is focused on the following technology development: (1) Develop in-line monitoring methods, using optical metrology and ultrasound inspection, and perform an appropriate demonstration in the lab; (2) Develop methods to predict composite strength reduction due to defects; and (3) Develop process models to predict defects from leading indicators found in the uncured composites.

  18. Classification Trees for Quality Control Processes in Automated Constructed Response Scoring. (United States)

    Williamson, David M.; Hone, Anne S.; Miller, Susan; Bejar, Isaac I.

    As the automated scoring of constructed responses reaches operational status, the issue of monitoring the scoring process becomes a primary concern, particularly when the goal is to have automated scoring operate completely unassisted by humans. Using a vignette from the Architectural Registration Examination and data for 326 cases with both human…

  19. Automation and control of off-planet oxygen production processes (United States)

    Marner, W. J.; Suitor, J. W.; Schooley, L. S.; Cellier, F. E.


    This paper addresses several aspects of the automation and control of off-planet production processes. First, a general approach to process automation and control is discussed from the viewpoint of translating human process control procedures into automated procedures. Second, the control issues for the automation and control of off-planet oxygen processes are discussed. Sensors, instruments, and components are defined and discussed in the context of off-planet applications, and the need for 'smart' components is clearly established.

  20. Methodology for monitoring and automated diagnosis of ball bearings using paraconsistent logic, wavelet transform and digital signal processing; Metodologia de monitoracao e diagnostico automatizado de rolamentos utilizando logica paraconsistente, transformada de Wavelet e processamento de sinais digitais

    Energy Technology Data Exchange (ETDEWEB)

    Masotti, Paulo Henrique Ferraz


    The monitoring and diagnosis area has seen impressive development in recent years, with the introduction of new diagnosis techniques and the use of computers to process information and apply those techniques. The contribution of artificial intelligence to the automation of defect diagnosis is developing continually, and the growing automation in industry is adopting these new techniques. In the nuclear area, the growing concern with facility safety demands ever more effective techniques to increase the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow verification of their operational conditions. The present work can thus also contribute to this area by helping to diagnose the operational condition of machines. This work presents a new technique for feature extraction based on the zero crossings of the wavelet transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work was the Paraconsistent Logic of Annotation with Two Values (LPA2v), which contributes to the automation of defect diagnosis because this logic can deal with the contradictory results that feature extraction techniques can present. The work also concentrated on identifying defects in their initial phase, using accelerometers because they are robust, low-cost sensors that are easily found in industry in general. The results, obtained using an experimental database, show that the diagnosis performs well for defects in their initial phase. (author)
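    The feature-extraction idea, counting zero crossings in a wavelet band of the accelerometer signal, can be sketched as follows. A one-level Haar transform stands in for the transform actually used in the thesis, which the abstract does not restate.

    ```python
    # Sketch: zero-crossing count of a wavelet detail band as a bearing
    # defect feature. A one-level Haar detail is used for illustration.

    def haar_detail(signal):
        """One-level Haar detail coefficients (scaled pairwise differences)."""
        return [(signal[i] - signal[i + 1]) / 2.0
                for i in range(0, len(signal) - 1, 2)]

    def zero_crossings(coeffs):
        """Count sign changes in the detail band: the kind of compact
        feature fed to the diagnostic logic."""
        return sum(1 for a, b in zip(coeffs, coeffs[1:]) if a * b < 0)
    ```

    An incipient defect adds high-frequency impacts, which raise activity (and sign changes) in the detail band even while the raw vibration level still looks normal.
    
    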

  1. Automated full matrix capture for industrial processes (United States)

    Brown, Roy H.; Pierce, S. Gareth; Collison, Ian; Dutton, Ben; Dziewierz, Jerzy; Jackson, Joseph; Lardner, Timothy; MacLeod, Charles; Morozov, Maxim


    Full matrix capture (FMC) ultrasound can be used to generate a permanent re-focusable record of data describing the geometry of a part; a valuable asset for an inspection process. FMC is a desirable acquisition mode for automated scanning of complex geometries, as it allows compensation for surface shape in post processing and application of the total focusing method. However, automating the delivery of such FMC inspection remains a significant challenge for real industrial processes due to the high data overhead associated with the ultrasonic acquisition. The benefits of NDE delivery using six-axis industrial robots are well established when considering complex inspection geometries, but such an approach brings additional challenges to scanning speed and positional accuracy when combined with FMC inspection. This study outlines steps taken to optimize the scanning speed and data management of a process to scan the diffusion bonded membrane of a titanium test plate. A system combining a KUKA robotic arm and a reconfigurable FMC phased array controller is presented. The speed and data implications of different scanning methods are compared, and the impacts on data visualization quality are discussed with reference to this study. For the 0.5 m2 sample considered, typical acquisitions of 18 TB/m2 were measured for a triple back wall FMC acquisition, illustrating the challenge of combining high data throughput with acceptable scanning speeds.
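    The quoted data overhead is easy to sanity-check with back-of-envelope arithmetic: FMC stores one A-scan per transmit-receive element pair, so volume grows with the square of the element count. The array size, sampling depth, and bit depth below are assumed values chosen for illustration, not figures from the paper.

    ```python
    # Back-of-envelope FMC data volume (assumed parameters, for scale only).

    def fmc_bytes_per_capture(elements, samples_per_ascan, bytes_per_sample=2):
        """Full matrix capture stores one A-scan per transmit-receive
        pair, i.e. elements**2 waveforms per capture position."""
        return elements ** 2 * samples_per_ascan * bytes_per_sample

    # e.g. a 64-element array, 4000 samples per A-scan, 16-bit samples:
    # 64*64 waveforms * 4000 samples * 2 B is roughly 33 MB per capture,
    # so a dense scan with many captures per square metre quickly climbs
    # into the TB/m2 range reported above.
    ```
    
    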

  2. Automating slope monitoring in mines with terrestrial lidar scanners (United States)

    Conforti, Dario


    Static terrestrial laser scanners (TLS) have been an important component of slope monitoring for some time, and many solutions for monitoring the progress of a slide have been devised over the years. However, all of these solutions have required users to operate the lidar equipment in the field, creating a high cost in time and resources, especially if the surveys must be performed very frequently. This paper presents a new solution for monitoring slides, developed using a TLS and an automated data acquisition, processing and analysis system. In this solution, a TLS is permanently mounted within sight of the target surface and connected to a control computer. The control software on the computer automatically triggers surveys according to a user-defined schedule, parses data into point clouds, and compares data against a baseline. The software can base the comparison against either the original survey of the site or the most recent survey, depending on whether the operator needs to measure the total or recent movement of the slide. If the displacement exceeds a user-defined safety threshold, the control computer transmits alerts via SMS text messaging and/or email, including graphs and tables describing the nature and size of the displacement. The solution can also be configured to trigger the external visual/audio alarm systems. If the survey areas contain high-traffic areas such as roads, the operator can mark them for exclusion in the comparison to prevent false alarms. To improve usability and safety, the control computer can connect to a local intranet and allow remote access through the software's web portal. This enables operators to perform most tasks with the TLS from their office, including reviewing displacement reports, downloading survey data, and adjusting the scan schedule. This solution has proved invaluable in automatically detecting and alerting users to potential danger within the monitored areas while lowering the cost and work required for
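    The comparison-and-alert step at the heart of this solution can be sketched with point clouds reduced to matched (x, y, z) tuples; the displacement threshold and message format below are illustrative assumptions, not the vendor's implementation.

    ```python
    # Sketch: compare a new survey against a baseline and alert when the
    # mean displacement exceeds a user-defined safety threshold.
    import math

    def mean_displacement(baseline, survey):
        """Mean Euclidean distance between corresponding points."""
        dists = [math.dist(p, q) for p, q in zip(baseline, survey)]
        return sum(dists) / len(dists)

    def check_slope(baseline, survey, threshold_m=0.05):
        """Return an alert string if displacement exceeds the threshold,
        else None (no SMS/email would be sent)."""
        d = mean_displacement(baseline, survey)
        if d > threshold_m:
            return f"ALERT: mean displacement {d:.3f} m exceeds {threshold_m} m"
        return None
    ```

    Comparing against the original survey measures total slide movement; comparing against the most recent survey measures recent movement, exactly the choice the control software exposes.
    
    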

  3. A plasma process monitor/control system

    Energy Technology Data Exchange (ETDEWEB)

    Stevenson, J.O.; Ward, P.P.; Smith, M.L. [Sandia National Labs., Albuquerque, NM (United States); Markle, R.J. [Advanced Micro Devices, Inc., Austin, TX (United States)


    Sandia National Laboratories has developed a system to monitor plasma processes for control of industrial applications. The system is designed to act as a fully automated, stand-alone process monitor during printed wiring board and semiconductor production runs. The monitor routinely performs data collection, analysis, process identification, and error detection/correction without the need for human intervention. The monitor can also be used in research mode to allow process engineers to gather additional information about plasma processes. The plasma monitor can perform real-time control of support systems known to influence plasma behavior. The monitor can also signal personnel to modify plasma parameters when the system is operating outside of desired specifications and requires human assistance. A notification protocol can be selected for conditions detected in the plasma process. The Plasma Process Monitor/Control System consists of a computer running software developed by Sandia National Laboratories, a commercially available spectrophotometer equipped with a charge-coupled device camera, an input/output device, and a fiber optic cable.
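    The monitor's decide-and-notify behaviour can be caricatured as a control-limit check on a spectral reading: in-spec readings pass, mildly out-of-spec readings trigger automatic correction of support systems, and readings outside the correctable range notify personnel. The normalization and limit values here are hypothetical, not Sandia's.

    ```python
    # Toy control-limit classifier for a normalized emission-line reading
    # (all limits are invented for illustration).

    def plasma_status(intensity, lo=0.8, hi=1.2):
        """Classify a normalized emission-line reading against spec limits."""
        if lo <= intensity <= hi:
            return "in-spec"
        if 0.5 <= intensity < lo or hi < intensity <= 1.5:
            return "auto-correct"    # monitor adjusts support systems
        return "notify-operator"     # outside the correctable range
    ```
    
    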

  4. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino


    In automated production processes grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases, describes how different principles are adopted at different scales in different applications, and continues by explaining different releasing strategies and principles. Then the paper classifies the numerous sensors used to monitor the effectiveness of grasping (part presence, exchanged force, stick-slip transitions, etc.). Later, the grasping and releasing problems in different fields (from mechanical...

  5. D-MSR: a distributed network management scheme for real-time monitoring and process control applications in wireless industrial automation. (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul


    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. According to our knowledge, this is the first distributed management scheme based on IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.

  6. G-Cloud Monitor: A Cloud Monitoring System for Factory Automation for Sustainable Green Computing

    Directory of Open Access Journals (Sweden)

    Hwa-Young Jeong


    Full Text Available Green and cloud computing (G-cloud) are new trends in all areas of computing. The G-cloud enables users to access their programs, systems and platforms at any time and place, while green computing yields greener technology by reducing power consumption for sustainable environments. Furthermore, in order to apply user needs to system development, user characteristics are regarded as among the most important factors to consider in product industries. In this paper, we propose a cloud monitoring system to observe and manage manufacturing systems/factory automation for sustainable green computing. The monitoring system utilizes resources in the G-cloud environment and hence can reduce the amount of system resources and devices used, such as system power and processes. In addition, we propose adding a user profile to the monitoring system in order to provide a user-friendly function: system configurations are automatically matched to the individual's requirements, thus increasing efficiency.

  7. Automation of electroweak corrections for LHC processes (United States)

    Chiesa, Mauro; Greiner, Nicolas; Tramontano, Francesco


    Next-to-leading order (NLO) electroweak corrections will play an important role in Run 2 of the Large Hadron Collider (LHC). Even though they are typically moderate at the level of total cross sections, they can lead to substantial deviations in the shapes of distributions. In particular, for the search for new physics, but also for a precise determination of Standard Model observables, their inclusion in theoretical predictions is mandatory for a reliable estimation of the Standard Model contribution. In this article we review the status and recent developments in electroweak calculations and their automation for LHC processes. We discuss general issues and properties of NLO electroweak corrections and present some examples, including the full calculation of the NLO corrections to the production of a W-boson in association with two jets computed using GoSam interfaced to MadDipole.

  8. Integrated system for automated financial document processing (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai


    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
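    The blackboard's opportunistic combination of knowledge sources can be sketched as confidence pooling over candidate readings posted by the recognition engines; the engine names and scores below are invented for illustration.

    ```python
    # Blackboard-style resolution sketch: each engine posts a candidate
    # value with a confidence; the best-supported value wins.

    def blackboard_resolve(candidates):
        """candidates: list of (engine, value, confidence). Sum the
        confidence per distinct value and return the best-supported one."""
        support = {}
        for _engine, value, conf in candidates:
            support[value] = support.get(value, 0.0) + conf
        return max(support, key=support.get)
    ```

    Two engines agreeing on the same courtesy amount can outweigh a single, more confident engine reading a different amount, which is the incremental, cooperative behaviour the blackboard architecture is meant to provide.
    
    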

  9. Biosensors and Automation for Bioprocess Monitoring and Control



    Bioprocess monitoring and control is a complex task that needs rapid and reliable methods which are adaptable to continuous analysis. Process monitoring during fermentation is widely applicable in the field of pharmaceutical, food and beverages and wastewater treatment. The ability to monitor has direct relevance in improving performance, quality, productivity, and yield of the process. In fact, the complexity of the bioprocesses requires almost real time insight into the dynamic process for ...

  10. Semisupervised Gaussian Process for Automated Enzyme Search. (United States)

    Mellor, Joseph; Grigoras, Ioana; Carbonell, Pablo; Faulon, Jean-Loup


    Synthetic biology is today harnessing the design of novel and greener biosynthesis routes for the production of added-value chemicals and natural products. The design of novel pathways often requires a detailed selection of enzyme sequences to import into the chassis at each of the reaction steps. To address such design requirements in an automated way, we present here a tool for exploring the space of enzymatic reactions. Given a reaction and an enzyme the tool provides a probability estimate that the enzyme catalyzes the reaction. Our tool first considers the similarity of a reaction to known biochemical reactions with respect to signatures around their reaction centers. Signatures are defined based on chemical transformation rules by using extended connectivity fingerprint descriptors. A semisupervised Gaussian process model associated with the similar known reactions then provides the probability estimate. The Gaussian process model uses information about both the reaction and the enzyme in providing the estimate. These estimates were validated experimentally by the application of the Gaussian process model to a newly identified metabolite in Escherichia coli in order to search for the enzymes catalyzing its associated reactions. Furthermore, we show with several pathway design examples how such ability to assign probability estimates to enzymatic reactions provides the potential to assist in bioengineering applications, providing experimental validation to our proposed approach. To the best of our knowledge, the proposed approach is the first application of Gaussian processes dealing with biological sequences and chemicals; the use of a semisupervised Gaussian process framework is also novel in the context of machine learning applied to bioinformatics. However, the ability of an enzyme to catalyze a reaction depends on the affinity between the substrates of the reaction and the enzyme. This affinity is generally quantified by the Michaelis constant KM

  11. Automated monitoring of activated sludge using image analysis


    Motta, Maurício da; M. N. Pons; Roche, N; A.L. Amaral; Ferreira, E. C.; Alves, M.M.; Mota, M.; Vivier, H.


    An automated procedure for characterising the morphology of activated sludge by image analysis has been used to monitor, in a systematic manner, the biomass in wastewater treatment plants. Over a period of one year, variations, mainly in the fractal dimension of flocs and in the amount of filamentous bacteria, could be related to rain events affecting the plant influent flow rate and composition. Grand Nancy Council. Météo-France. Brasil. Ministério da Ciênc...

  12. Brainstem Monitoring in the Neurocritical Care Unit: A Rationale for Real-Time, Automated Neurophysiological Monitoring. (United States)

    Stone, James L; Bailes, Julian E; Hassan, Ahmed N; Sindelar, Brian; Patel, Vimal; Fino, John


    Patients with severe traumatic brain injury or large intracranial space-occupying lesions (spontaneous cerebral hemorrhage, infarction, or tumor) commonly present to the neurocritical care unit with an altered mental status. Many experience progressive stupor and coma from mass effects and transtentorial brain herniation compromising the ascending arousal (reticular activating) system. Yet, little progress has been made in the practicality of bedside, noninvasive, real-time, automated, neurophysiological brainstem or cerebral hemispheric monitoring. In this critical review, we discuss the ascending arousal system, brain herniation, and the shortcomings of our current management, including the neurological exam, intracranial pressure monitoring, and neuroimaging. We present a rationale for the development of nurse-friendly (continuous, automated, and alarmed) evoked potential monitoring, based upon the clinical and experimental literature, advances in the prognostication of cerebral anoxia, and intraoperative neurophysiological monitoring.


    Directory of Open Access Journals (Sweden)

    E. V. Lukyanchuk


    Full Text Available The wine industry has now successfully automated grape receiving points, crushing and pressing departments, continuously operating fermentation installations, blending tanks, production lines for ordinary Madeira, continuously operating plants for ethyl alcohol, installations for champagne wine in continuous flow, and more. As automation of technological processes advances, the productivity of the winemaking process develops in the following directions: organization of fully automated grape-processing sites with bulk transportation of the grapes; improvement of the quality and durability of wines through wide application of cold and heat treatment of wine, together with technical and microbiological control by powerful automation equipment; introduction of automated continuous production processes for champagne, sherry wine, and cognac alcohol and Madeira; complex automation of auxiliary production sites (boilers, air conditioners, refrigeration units) and others; and the creation of fully automated enterprises, including wine-bottling sites. The wine industry is developing more sophisticated automation schemes and devices that enable the transition to integrated production automation and will create model automated enterprises serving as laboratories for studying the main problems of automating winemaking production processes.

  14. Automated Monitoring System for Waste Disposal Sites and Groundwater

    Energy Technology Data Exchange (ETDEWEB)

    S. E. Rawlinson


    A proposal submitted to the U.S. Department of Energy (DOE), Office of Science and Technology, Accelerated Site Technology Deployment (ASTD) program to deploy an automated monitoring system for waste disposal sites and groundwater, herein referred to as the ''Automated Monitoring System,'' was funded in fiscal year (FY) 2002. This two-year project included three parts: (1) deployment of cellular telephone modems on existing dataloggers, (2) development of a data management system, and (3) development of Internet accessibility. The proposed concept was initially (in FY 2002) to deploy cellular telephone modems on existing dataloggers and partially develop the data management system at the Nevada Test Site (NTS). This initial effort included both Bechtel Nevada (BN) and the Desert Research Institute (DRI). The following year (FY 2003), cellular modems were to be similarly deployed at Sandia National Laboratories (SNL) and Los Alamos National Laboratory (LANL), and the early data management system developed at the NTS was to be brought to those locations for site-specific development and use. Also in FY 2003, additional site-specific development of the complete system was to be conducted at the NTS. To complete the project, certain data, depending on site-specific conditions or restrictions involving distribution of data, were to be made available through the Internet via the DRI/Western Region Climate Center (WRCC) WEABASE platform. If the complete project had been implemented, the system schematic would have looked like the figure on the following page.

  15. Integrated Automation System for Rare Earth Countercurrent Extraction Process

    Institute of Scientific and Technical Information of China (English)

    柴天佑; 杨辉


    Lower automation levels in industrial rare-earth extraction processes result in high production cost, inconsistent product quality, and great consumption of resources in China. An integrated automation system for the rare-earth extraction process is proposed to realize optimal product indices, such as product purity, recycle rate, and output. The optimal control strategy for the output component, and the structure and function of the two-graded integrated automation system composed of the process management grade and the process control grade, are discussed. The system was successfully applied to a HAB yttrium extraction production process and was found to provide optimal control, optimal operation, optimal management, and remarkable benefits.

  16. Automate The Tax Levy Process (Taxy) (United States)

    Social Security Administration — This data store contains information to support the automation of Tax Levy payments. Data includes but is not limited to Title II benefits adjustment data, as well...

  17. Tools for automated acoustic monitoring within the R package monitoR (United States)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese


    The R package monitoR contains tools for managing an acoustic-monitoring program, including survey metadata, template creation and manipulation, automated detection, and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe the typical workflow when using the tools in monitoR. A typical workflow utilizes a generic sequence of functions, with the option of either binary point matching or spectrogram cross-correlation detectors.
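The spectrogram cross-correlation detector mentioned above can be sketched generically. This is an illustrative Python analogue of the technique, not monitoR's R implementation; the function names and the cutoff value are hypothetical:

```python
# Sketch of spectrogram cross-correlation detection: slide a template
# spectrogram across a survey spectrogram and score each time offset by
# the Pearson correlation between the template and the aligned window.
import numpy as np

def xcorr_scores(survey: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Correlation score of `template` at each time offset of `survey`.

    Both arguments are spectrogram matrices (frequency bins x time frames).
    """
    n_frames = survey.shape[1] - template.shape[1] + 1
    t = template.ravel()
    t = (t - t.mean()) / (t.std() + 1e-12)
    scores = np.empty(n_frames)
    for i in range(n_frames):
        w = survey[:, i:i + template.shape[1]].ravel()
        w = (w - w.mean()) / (w.std() + 1e-12)
        scores[i] = float(w @ t) / t.size  # Pearson correlation in [-1, 1]
    return scores

def detect(scores: np.ndarray, cutoff: float = 0.6) -> np.ndarray:
    """Offsets whose score exceeds a user-chosen score cutoff."""
    return np.flatnonzero(scores >= cutoff)
```

A detection event is then simply a time offset whose score passes the cutoff, which is how template-matching detectors of this kind typically report candidate calls for later review.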

  19. Automated Impedance Tomography for Monitoring Permeable Reactive Barrier Health

    Energy Technology Data Exchange (ETDEWEB)

    LaBrecque, D J; Adkins, P L


    The objective of this research was the development of an autonomous, automated electrical geophysical monitoring system which allows for near real-time assessment of Permeable Reactive Barrier (PRB) health and aging and which provides this assessment through a web-based interface to site operators, owners and regulatory agencies. Field studies were performed at four existing PRB sites; (1) a uranium tailing site near Monticello, Utah, (2) the DOE complex at Kansas City, Missouri, (3) the Denver Federal Center in Denver, Colorado and (4) the Asarco Smelter site in East Helena, Montana. Preliminary surface data over the PRB sites were collected (in December, 2005). After the initial round of data collection, the plan was modified to include studies inside the barriers in order to better understand barrier aging processes. In September 2006 an autonomous data collection system was designed and installed at the EPA PRB and the electrode setups in the barrier were revised and three new vertical electrode arrays were placed in dedicated boreholes which were in direct contact with the PRB material. Final data were collected at the Kansas City, Denver and Monticello, Utah PRB sites in the fall of 2007. At the Asarco Smelter site in East Helena, Montana, nearly continuous data was collected by the autonomous monitoring system from June 2006 to November 2007. This data provided us with a picture of the evolution of the barrier, enabling us to examine barrier changes more precisely and determine whether these changes are due to installation issues or are normal barrier aging. Two rounds of laboratory experiments were carried out during the project. We conducted column experiments to investigate the effect of mineralogy on the electrical signatures resulting from iron corrosion and mineral precipitation in zero valent iron (ZVI) columns. In the second round of laboratory experiments we observed the electrical response from simulation of actual field PRBs at two sites: the

  20. Accuracy of the Dinamap 1846 XT automated blood pressure monitor. (United States)

    Beaubien, E R; Card, C M; Card, S E; Biem, H J; Wilson, T W


    Accurate blood pressure (BP) measurement is important for the detection and treatment of hypertension. Despite widespread use of automated devices, there is limited published evidence for their reliability and accuracy. To determine the reliability and accuracy of the Dinamap 1846XT (Critikon Corporation, Tampa, FL, USA), a commonly used non-invasive oscillometric BP monitor, the Dinamap was evaluated against the mercury manometer in 70 randomly selected adult hospitalised medical patients. Each individual underwent three sets of standardised BP measurement by the automated method and three sets by mercury manometer by two independent observers. Reliability of BP measurement was assessed by repeated-measures analysis. Dinamap accuracy was evaluated according to the Association for the Advancement of Medical Instrumentation (AAMI) and British Hypertension Society (BHS) guidelines. Most patients were either normotensive or had stage I hypertension. The Dinamap tended to overestimate lower diastolic BP and displayed poor reliability compared with the mercury manometer; 84% of systolic and 80% of diastolic readings were within 10 mm Hg (BHS grade C). Systolic and diastolic accuracy were worse with pressures >160/90 mm Hg (grade D), although these measures were based on a smaller sample of subjects. In conclusion, the Dinamap yields inaccurate estimates of both systolic and diastolic BP even under standardised, and thus optimal, conditions. This inaccuracy is exaggerated at higher BP (>160/90 mm Hg), although the number of measurements at higher pressures was small. We recommend that this device not be used when accurate BP measurement is needed for therapeutic decision-making.
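The BHS grades cited above are assigned from the cumulative percentage of device readings falling within 5, 10, and 15 mm Hg of the reference readings. A minimal sketch of that grading logic, using the cutoffs of the BHS 1993 protocol (the function name and data are illustrative):

```python
# Sketch of British Hypertension Society (BHS) device grading: grade by the
# cumulative percentage of device readings within 5, 10, and 15 mmHg of the
# paired reference (mercury) readings. Cutoffs follow the BHS 1993 protocol:
# grade A requires 60/85/95 %, B requires 50/75/90 %, C requires 40/65/85 %.

def bhs_grade(device, reference):
    diffs = [abs(d - r) for d, r in zip(device, reference)]
    n = len(diffs)
    pct = {t: 100.0 * sum(x <= t for x in diffs) / n for t in (5, 10, 15)}
    for grade, (p5, p10, p15) in (("A", (60, 85, 95)),
                                  ("B", (50, 75, 90)),
                                  ("C", (40, 65, 85))):
        if pct[5] >= p5 and pct[10] >= p10 and pct[15] >= p15:
            return grade, pct
    return "D", pct  # fails grade C on at least one threshold
```

A device must satisfy all three thresholds of a grade simultaneously, which is why the Dinamap's 80% within 10 mm Hg lands it at grade C rather than A or B.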

  1. An overview of the Environmental Monitoring Computer Automation Project

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, S.M.; Lorenz, R.


    The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, the analytical methods, and the results obtained are important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS.

  3. Effects of automation of information-processing functions on teamwork. (United States)

    Wright, Melanie C; Kaber, David B


    We investigated the effects of automation applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated, with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated that different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty, but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.

  4. Wideband impedance spectrum analyzer for process automation applications (United States)

    Doerner, Steffen; Schneider, Thomas; Hauptmann, Peter R.


    For decades, impedance spectroscopy has been used in technical laboratories and research departments to investigate effects or material characteristics that affect the impedance spectrum of the sensor. Establishing this analytical approach for process automation and stand-alone applications will deliver additional and valuable information beside traditional measurement techniques such as the measurement of temperature, flow rate, and conductivity, among others. As yet, most current impedance analysis methods are suited only for laboratory applications, since they involve stand-alone network analyzers that are slow, expensive, large, or immobile. Furthermore, those systems offer a large range of functionality that is not used in process control and other fields of application. We developed a sensor interface based on high-speed direct digital signal processing, offering wideband impedance spectrum analysis with high resolution for frequency adjustment, excellent noise rejection, a very high measurement rate, and convenient data exchange with common interfaces. The electronics have been implemented on two small circuit boards and are well suited for process control applications such as monitoring phase transitions, characterization of fluidic systems, and control of biological processes. The impedance spectrum analyzer can be customized easily for different measurement applications by adapting the appropriate sensor module. It has been tested for industrial applications, e.g., dielectric spectroscopy and high-temperature gas analysis.
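The core measurement step behind such a direct-digital-synthesis impedance analyzer can be illustrated with a single-bin DFT (a digital lock-in) that extracts the complex voltage and current amplitudes at the excitation frequency. This is a hedged sketch under an idealized signal model, not the authors' implementation; names and parameters are illustrative:

```python
# Sketch of digital lock-in impedance measurement: recover the complex
# amplitude of the sampled voltage and current at the excitation frequency f,
# then form Z = V / I. Assumes an integer number of excitation periods in
# the record so that the off-frequency term averages out exactly.
import numpy as np

def complex_amplitude(signal: np.ndarray, f: float, fs: float) -> complex:
    """Single-bin DFT: complex amplitude of `signal` at frequency f (Hz)."""
    n = np.arange(signal.size)
    ref = np.exp(-2j * np.pi * f * n / fs)  # complex reference oscillator
    return 2.0 * np.mean(signal * ref)      # A*exp(j*phi) for A*cos(wt+phi)

def impedance(v: np.ndarray, i: np.ndarray, f: float, fs: float) -> complex:
    """Complex impedance from simultaneously sampled voltage and current."""
    return complex_amplitude(v, f, fs) / complex_amplitude(i, f, fs)
```

Sweeping `f` over the excitation band and repeating this step yields the wideband impedance spectrum; the narrowband reference multiplication is also what provides the strong noise rejection mentioned in the abstract.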

  5. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; McCormick, T. W.; Lewis, E. R.; Hogan, S. J. (Spire Corporation)


    This report describes work performed by Spire Corporation during Phase 1 of this three-phase PVMaT subcontract to develop new automated post-lamination processes for PV module manufacturing. These processes are applicable to a very broad range of module types, including those made with wafer-based and thin-film solar cells. No off-the-shelf automation was available for these processes prior to this program. Spire conducted a survey of PV module manufacturers to identify current industry practices and to determine the requirements for the automated systems being developed in this program. Spire also completed detailed mechanical and electrical designs and developed software for two prototype automation systems: a module buffer storage system, designated the SPI-BUFFER 350, and an integrated module testing system, designated the SPI-MODULE QA 350. Researchers fabricated, tested, and evaluated both systems with module components from several module manufacturers. A new solar simulator, the SPI-SUN SIMULATOR 350i, was designed with a test area that can handle most production modules without consuming excessive floor space. Spire's subcontractor, the Automation and Robotics Research Institute (ARRI) at the University of Texas, developed and demonstrated module edge trimming, edge sealing, and framing processes that are suitable for automation. The automated processes under development throughout this program are being designed to be combined together to create automated production lines. ARRI completed a cost study to determine the level of investment that can be justified by implementing automation for post-lamination assembly and testing processes. The study concluded that a module production line operating two shifts per day and producing 10 MW of modules per year can justify $2.37 million in capital equipment, assuming a 5-year payback period.

  6. Automation of the Marine Corps Planning Process (United States)


    ...training, education, and doctrine that support C2 (MCWP 3-40.1, 2003). C. INFORMATION THEORY: It is important to recognize that information comprises... ...elevate the importance of using automation. MCDP 1 describes war as a "complex phenomenon" (p. 12, para. 2) and continues to explain, "as a result, war..." ...elements associated with conducting and planning warfare. A balance between artistic elements, such as creative solutions resulting from an...

  7. An automated qualification framework for the MeerKAT CAM (Control-And-Monitoring) (United States)

    van den Heever, Lize; Marais, Neilen; Slabber, Martin


    This paper introduces and discusses the design of an Automated Qualification Framework (AQF) that was developed to automate as much as possible of the formal qualification testing of the Control And Monitoring (CAM) subsystem of the 64-dish MeerKAT radio telescope currently under construction in the Karoo region of South Africa. The AQF allows each Integrated CAM Test to reference the MeerKAT CAM requirement and associated verification requirement it covers, and automatically produces the Qualification Test Procedure and Qualification Test Report from the test steps and evaluation steps annotated in the Integrated CAM Tests. The MeerKAT System Engineers are extremely happy not only with the AQF results, but especially with the approach and process it enforces.

  8. Monitoring and control of fine abrasive finishing processes

    DEFF Research Database (Denmark)

    Lazarev, Ruslan

    ...of this work was to investigate foundations for process monitoring and control methods applied to a semi-automated polishing machine based on an industrial robot. The monitoring system was built on an NI data acquisition system with two sensors, an acoustic emission sensor and an accelerometer. The acquired sensory... ...was quantified in terms of material removal volume. This property shows how efficient the surface processing is and leads to end-point time detection of the process. It is one of the central topics in polishing process control due to its monotonous and time-consuming nature. To deal with it, the process...

  9. A system for automated monitoring of embankment deformation along the Qinghai-Tibet Railway in permafrost regions

    Institute of Scientific and Technical Information of China (English)

    YongPeng Yang; YaoHui Qu; HanCheng Cai; Jia Cheng; CaiMei Tang


    At present, the monitoring of embankment deformation in permafrost regions along the Qinghai-Tibet Railway is mainly done manually. However, the harsh climate on the plateau affects the results greatly by lowering the observation frequency, so manual monitoring can barely meet the observational demand. This research develops a system for automated monitoring of embankment deformation, and aims to address the problems caused by the plateau climate and the permafrost conditions in the region. The equipment consists of a monitoring module, a data collection module, a transmission module, and a data processing module. The field experiments during this program indicate that (1) the combined automated monitoring device overcame the problems associated with the complicated and tough plateau environment by means of wireless transmission and automatic analysis of the embankment settlement data; (2) the calibration of the combined settlement gauge at −20 °C was highly accurate, with an error rate always <0.5%; (3) the gauge calibration under high-temperature conditions was also highly accurate, with an error rate <0.5% even though the surface of the instrument reached more than 50 °C; and (4) compared with the data taken manually, the data automatically acquired during field monitoring experiments demonstrated that the combined settlement gauge and the automated monitoring system can meet the requirements of the monitoring mission in permafrost regions along the Qinghai-Tibet Railway.

  10. The Automated Discovery of Hybrid Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Reijers, Hajo; Maggi, Fabrizio Maria


    The declarative-procedural dichotomy is highly relevant when choosing the most suitable process modeling language to represent a discovered process. Less-structured processes with a high level of variability can be described in a more compact way using a declarative language. By contrast, procedu...

  11. The Tracking of Referents in Discourse: Automated versus Attended Processes. (United States)


    THE TRACKING OF REFERENTS IN DISCOURSE: AUTOMATED VERSUS ATTENDED PROCESSES (Oregon Univ., Eugene, Dept. of Psychology; T. Givon, W. Kellogg, et al.; Cognitive Science Program). ...of language processing, used in early childhood or early second-language acquisition (Pidginization). This less routinized processing mode can be...

  12. Toward the automation of road networks extraction processes (United States)

    Leymarie, Frederic; Boichis, Nicolas; Airault, Sylvain; Jamet, Olivier


    Syseca and IGN are working on various steps in the ongoing march from digital photogrammetry to the semi-automation and ultimately the full automation of data manipulation, i.e., capture and analysis. The immediate goals are to reduce production costs and data availability delays. Within this context, we have tackled the distinctive problem of 'automated road network extraction.' The methodology adopted is to first study semi-automatic solutions, which probably increase the global efficiency of human operators in topographic data capture; in a second step, automatic solutions are designed based upon the experience gained. We report on different (semi-)automatic solutions for the road-following algorithm. One key aspect of our method is to have the stages of 'detection' and 'geometric recovery' cooperate while remaining distinct. 'Detection' is based on a local (texture) analysis of the image, while 'geometric recovery' is concerned with the extraction of 'road objects' from both monocular and stereo information. 'Detection' is a low-level visual process, 'reasoning' directly at the level of image intensities, while the mid-level visual process, 'geometric recovery', uses contextual knowledge about roads, both generic, e.g., parallelism of borders, and specific, e.g., using previously extracted road segments and disparities. We then pursue our 'march' by reporting on steps we are exploring toward full automation. In particular, we have made attempts at tackling the automation of the initialization step, to start searching in a valid direction.

  13. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.


    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  14. Automated high-volume aerosol sampling station for environmental radiation monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S


    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (attached to a cassette); the airflow through the filter is 800 m{sup 3}/h at maximum. During the sampling, the filter is continuously monitored with NaI scintillation detectors. After the sampling, the large filter is automatically cut into 15 pieces that form a small sample and, after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay, and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1-10 x 10{sup -6} Bq/m{sup 3}. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via the Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of the CTBTO for aerosol monitoring. The concept suits well for nuclear material safeguards, too.
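Minimum detectable concentrations of the order quoted above follow from a Currie-style detection limit divided by detection efficiency, gamma-emission yield, counting time, and sampled air volume. A hedged sketch of that calculation (the parameter values in the usage note are illustrative assumptions, not STUK's figures):

```python
# Sketch of a minimum detectable concentration (MDC) estimate for an air
# sampler, using the Currie detection limit L_D = 2.71 + 4.65*sqrt(B) counts,
# where B is the background count in the peak region of interest.
import math

def mdc_bq_per_m3(background_counts: float,
                  efficiency: float,      # absolute detection efficiency
                  gamma_yield: float,     # gamma emission probability
                  count_time_s: float,    # counting time in seconds
                  air_volume_m3: float) -> float:
    """MDC = L_D / (efficiency * yield * counting time * sampled air volume)."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * gamma_yield * count_time_s * air_volume_m3)
```

For example, with an assumed 100-count background, 5% efficiency, 0.85 yield, one day of counting (86400 s), and 19200 m3 of air (800 m3/h for 24 h), the estimate lands in the 10^-7 to 10^-6 Bq/m3 range, consistent with the figures reported for the station.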

  15. Recent Developments in Advanced Automated Post-Processing at AMOS (United States)


    Recent Developments in Advanced Automated Post-Processing at AMOS. Michael Werth, Brandoch Calef, Daniel Thompson (The Boeing Company); Kathy...

  16. Automation of J2EE Product Certification Process

    Directory of Open Access Journals (Sweden)

    Iddalgave Sandeep


    Full Text Available Automation is the process of developing a tool or product that performs on its own, achieving all the functionality required of a complex system without human intervention. The current certification process for the J2EE product in IBM consumes unnecessary time, even for valid products. To reduce this processing overhead, automation of the process was proposed to save time and accomplish the task earlier; the need to automate the certification tool arose from unwanted use of man-hours and man-power. The existing system halts after the tool finishes comparing the listings (which contain details of the build, its environment, and the list of files present within the build) until the administrator logs in; the administrator then checks each transaction ID individually to notify clients about the status of their products. The proposed system aims to notify clients as soon as the comparison completes successfully, without administrator intervention, i.e., an automated notification is sent after a successful comparison of the listings. The tool also provides an option to generate a chart or graph whenever the managerial team needs it, at one click. The work carried out here automates the process of certifying the Java products or assets developed by IBM Software Product Groups, and generates real-time graphs of the number of products certified to date. The certification process applies to the IBM JRE/SDK used by IBM Software Product Groups. It is done by comparing the listings submitted by the IBM Software Product Groups with the reference listings present on the certification server. The JRE/SDK should adhere to the constraints of the Oracle license.
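The comparison step described above, checking a submitted build listing against the reference listing kept on the certification server, reduces to a set difference. A minimal illustrative sketch (the function name and fields are hypothetical, not IBM's tool):

```python
# Sketch of the core listing-comparison step: report files that are missing
# from the submitted build and files that are present but not expected,
# so a pass/fail notification can be sent without manual review.

def compare_listings(submitted, reference):
    """Compare a submitted file listing against a reference listing."""
    submitted, reference = set(submitted), set(reference)
    return {
        "missing": sorted(reference - submitted),    # expected but absent
        "unexpected": sorted(submitted - reference),  # present but not expected
        "ok": submitted == reference,                # certification passes
    }
```

On a successful comparison (`ok` is true), the proposed system would send the client notification immediately rather than waiting for an administrator to inspect each transaction ID.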

  17. Knowledge Automation How to Implement Decision Management in Business Processes

    CERN Document Server

    Fish, Alan N


    A proven decision management methodology for increased profits and lowered risks Knowledge Automation: How to Implement Decision Management in Business Processes describes a simple but comprehensive methodology for decision management projects, which use business rules and predictive analytics to optimize and automate small, high-volume business decisions. It includes Decision Requirements Analysis (DRA), a new method for taking the crucial first step in any IT project to implement decision management: defining a set of business decisions and identifying all the information-business knowledge

  18. Automating Software Development Process using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet; Damiani, Ernesto; Jain, Lakhmi C.; Madravio, Mauro


    In this chapter, we aim to highlight how fuzzy logic can be a valid expressive tool to manage the software development process. We characterize a software development method in terms of two major components: artifact types and methodological rules. Classes, attributes, operations, and inheritance an

  19. Automated analysis for lifecycle assembly processes

    Energy Technology Data Exchange (ETDEWEB)

    Calton, T.L.; Brown, R.G.; Peters, R.R.


    Many manufacturing companies today expend more effort on upgrade and disposal projects than on clean-slate design, and this trend is expected to become more prevalent in coming years. However, commercial CAD tools are better suited to initial product design than to the product's full life cycle. Computer-aided analysis, optimization, and visualization of life-cycle assembly processes based on the product CAD data can help ensure accuracy and reduce the effort expended in planning these processes for existing products, as well as provide design-for-lifecycle analysis for new designs. To be effective, computer-aided assembly planning systems must allow users to express the plan selection criteria that apply to their companies and products, as well as to the life cycles of their products. Designing products for easy assembly and disassembly during the entire life cycle, for purposes including service, field repair, upgrade, and disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and its intended life cycle. Different goals and constraints (compared to initial assembly) require one to revisit the significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited to either academic studies of issues in assembly planning or applied studies of life-cycle assembly processes that give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life-cycle assembly processes.

  20. The Automation of Nowcast Model Assessment Processes (United States)


    Leelinda P. Dawson, John W. Raby, and Jeffrey A. Smith. ...model runs. An example PSA log file, ps_auto_log, using DDA, one case-study date, 3 domains, 3 model runs... ...study date could be set for each run. This process was time-consuming when multiple configurations were required by the user. Also, each run...


    Directory of Open Access Journals (Sweden)

    V. A. Matyushenko


    Full Text Available Information technology is rapidly conquering the world, permeating all spheres of human activity, and education is no exception. An important direction of the informatization of education is the development of university management systems. Modern information systems improve and facilitate the management of all types of activities of an institution. The purpose of this paper is the development of a system that automates the formation of accounting documents. The article describes the problem of preparing educational-process documents, and the decision to design and create an information system in the Microsoft Access environment. The result is four types of reports obtained using the developed system. The use of this system allows the process to be automated and reduces the effort required to prepare accounting documents. All reports were implemented in Microsoft Excel and can be used for further analysis and processing.

  2. Scientific Process Automation and Workflow Management

    Energy Technology Data Exchange (ETDEWEB)

    Ludaescher, Bertram T.; Altintas, Ilkay; Bowers, Shawn; Cummings, J.; Critchlow, Terence J.; Deelman, Ewa; De Roure, D.; Freire, Juliana; Goble, Carole; Jones, Matt; Klasky, S.; McPhillips, Timothy; Podhorszki, Norbert; Silva, C.; Taylor, I.; Vouk, M.


    We introduce and describe scientific workflows, i.e., executable descriptions of automatable scientific processes such as computational science simulations and data analyses. Scientific workflows are often expressed in terms of tasks and their (dataflow) dependencies. This chapter first provides an overview of the characteristic features of scientific workflows and outlines their life cycle. A detailed case study highlights workflow challenges and solutions in simulation management. We then provide a brief overview of how some concrete systems support the various phases of the workflow life cycle, i.e., design, resource management, execution, and provenance management. We conclude with a discussion on community-based workflow sharing.
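A workflow expressed as tasks plus dataflow dependencies can be executed by running each task once all of its inputs are available, i.e., in topological order. This is a minimal illustrative sketch of that idea, not any particular workflow system:

```python
# Sketch of executing a scientific workflow given as tasks and their
# dataflow dependencies: order the tasks topologically, then run each task
# on the results of its prerequisites.
from graphlib import TopologicalSorter

def run_workflow(tasks, deps):
    """tasks: name -> callable; deps: name -> iterable of prerequisite names."""
    order = list(TopologicalSorter(deps).static_order())
    results = {}
    for name in order:
        # Pass each prerequisite's result as a positional argument.
        results[name] = tasks[name](*(results[d] for d in deps.get(name, ())))
    return results

# Example: a simulate -> analyze -> plot pipeline.
tasks = {"sim": lambda: 3,
         "analyze": lambda x: 2 * x,
         "plot": lambda x: f"plot({x})"}
deps = {"analyze": ["sim"], "plot": ["analyze"]}
```

Real workflow systems layer scheduling, distributed resource management, and provenance capture on top of essentially this dependency-ordered execution.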

  3. Technology transfer potential of an automated water monitoring system. [market research (United States)

    Jamieson, W. M.; Hillman, M. E. D.; Eischen, M. A.; Stilwell, J. M.


    The nature and characteristics of the potential economic need (markets) for a highly integrated water quality monitoring system were investigated. The technological, institutional and marketing factors that would influence the transfer and adoption of an automated system were studied for application to public and private water supply, public and private wastewater treatment and environmental monitoring of rivers and lakes.

  4. ECG acquisition and automated remote processing

    CERN Document Server

    Gupta, Rajarshi; Bera, Jitendranath


    The book focuses on remote processing of ECG in the context of telecardiology, an emerging area in the field of biomedical engineering applications. Considering the poor infrastructure and inadequate number of physicians in rural healthcare clinics in India and other developing nations, telemedicine services assume special importance. Telecardiology, a specialized area of telemedicine, is taken up in this book considering the importance of cardiac diseases, which are prevalent in the population under discussion. The main focus of this book is to discuss different aspects of ECG acquisition, its remote transmission, and computerized ECG signal analysis for feature extraction. It also discusses ECG compression and the application of standalone embedded systems to develop a cost-effective telecardiology solution.

  5. Mass Spectrometry-Based Monitoring of Millisecond Protein-Ligand Binding Dynamics Using an Automated Microfluidic Platform

    Energy Technology Data Exchange (ETDEWEB)

    Cong, Yongzheng; Katipamula, Shanta; Trader, Cameron D.; Orton, Daniel J.; Geng, Tao; Baker, Erin Shammel; Kelly, Ryan T.


    Characterizing protein-ligand binding dynamics is crucial for understanding protein function and developing new therapeutic agents. We have developed a novel microfluidic platform that features rapid mixing of protein and ligand solutions, variable incubation times, and on-chip electrospray ionization to perform label-free, solution-based monitoring of protein-ligand binding dynamics. This platform offers many advantages including automated processing, rapid mixing, and low sample consumption.

  6. Providing security for automated process control systems at hydropower engineering facilities (United States)

    Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.


    This article suggests the concept of a cyberphysical system to manage computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy for improving the cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, and a secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is its consideration of the interrelations and cyber threats arising when SCADA is integrated with the unified enterprise information system.

  7. ZigBee Based Industrial Automation Profile for Power Monitoring Systems


    Archana R. Raut; Dr. L. G. Malik


    Industrial automation depends largely on power systems and requires remotely controlled and regulated systems. Voltage and current parameters, together with power and energy management, form the typical industrial automation scenario. A wireless technology that meets cost, speed and distance requirements will always be a point of interest for research. In this work we mainly monitor power-related parameters and enable remote switching devices for...

  8. An extended process automation system : an approach based on a multi-agent system


    Seilonen, Ilkka


    This thesis describes studies on the application of multi-agent systems (MAS) to enhance process automation systems. A specification of an extended process automation system is presented. According to this specification, MAS can be used to extend the functionality of ordinary process automation systems at higher levels of control. Anticipated benefits of the specification include enhanced reconfigurability, responsiveness and flexibility of process automation. Previous res...

  9. Intelligent sensor-model automated control of PMR-15 autoclave processing (United States)

    Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.


    An intelligent sensor-model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent electromagnetic sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process, including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows the comparison of sensor monitoring with model predictions to be broken down into a series of discrete steps, and provides a language for making decisions on what to do next regarding time, temperature and pressure.

  10. Thin film production with a new fully automated optical thickness monitoring system (Invited Paper) (United States)

    Lardon, M.; Selhofer, H.


    The increasing demand for complex multilayer optical coatings requires equipment with a completely automated process control system. The new optical thickness monitor GSM 420, which is part of the deposition control system BPU 420, allows remotely controlled wavelength selection either with a grating monochromator combined with the appropriate order-sorting filters or with a set of six narrow bandpass filters. The endpoint detection is based on digital processing of the signal corresponding to the light intensity after transmission through, or reflection from, a test glass located side by side with a quartz crystal microbalance at the center of the coating plant. Turning-value monitoring and termination of the process at an arbitrary predetermined point are both possible. Single and multiple layers of silicon dioxide and titanium dioxide, and combinations thereof, were deposited. Excellent linear correlation between the optical thickness on the test glass and the geometrical layer thickness as measured by the quartz crystal microbalance was observed. The reproducibility for single layers of quarter-wave thickness was found to be between +/- 0.7 and +/- 1.7 % of the center wavelength of the spectral extremum measured on the test glass, depending on wavelength (350 - 3200 nm) and coating material (SiO2 or TiO2 on glass).
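    Turning-value endpoint detection of the kind described can be illustrated with a short sketch. This is an illustrative assumption, not the GSM 420's actual signal processing: it simply scans the sampled intensity signal for sign changes in the first difference, which mark the quarter-wave extrema.

    ```python
    def turning_points(signal):
        """Return indices where the sampled intensity signal reverses
        direction (local extrema, i.e. quarter-wave turning values)."""
        tp = []
        for i in range(1, len(signal) - 1):
            d_prev = signal[i] - signal[i - 1]  # slope before sample i
            d_next = signal[i + 1] - signal[i]  # slope after sample i
            if d_prev * d_next < 0:             # sign change -> extremum
                tp.append(i)
        return tp

    # A simulated transmission trace oscillating as the layer grows:
    trace = [10, 14, 18, 15, 11, 13, 17]
    print(turning_points(trace))  # -> [2, 4]
    ```

    A production monitor would smooth the signal first and handle plateaus; the sketch only shows the core extremum test.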

  11. Advanced oxidation protein products (AOPP) for monitoring oxidative stress in critically ill patients: a simple, fast and inexpensive automated technique. (United States)

    Selmeci, László; Seres, Leila; Antal, Magda; Lukács, Júlia; Regöly-Mérei, Andrea; Acsády, György


    Oxidative stress is known to be involved in many human pathological processes. Although there are numerous methods available for the assessment of oxidative stress, most of them are still not easily applicable in a routine clinical laboratory due to the complex methodology and/or lack of automation. In research into human oxidative stress, the simplification and automation of techniques represent a key issue from a laboratory point of view at present. In 1996 a novel oxidative stress biomarker, referred to as advanced oxidation protein products (AOPP), was detected in the plasma of chronic uremic patients. Here we describe in detail an automated version of the originally published microplate-based technique that we adapted for a Cobas Mira Plus clinical chemistry analyzer. AOPP reference values were measured in plasma samples from 266 apparently healthy volunteers (university students; 81 male and 185 female subjects) with a mean age of 21.3 years (range 18-33). Over a period of 18 months we determined AOPP concentrations in more than 300 patients in our department. Our experiences appear to demonstrate that this technique is especially suitable for monitoring oxidative stress in critically ill patients (sepsis, reperfusion injury, heart failure) even at daily intervals, since AOPP exhibited rapid responses in both directions. We believe that the well-established relationship between AOPP response and induced damage makes this simple, fast and inexpensive automated technique applicable in daily routine laboratory practice for assessing and monitoring oxidative stress in critically ill or other patients.

  12. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness (United States)

    Singh, Preetpal

    Canada is home to thousands of freshwater lakes and rivers. Apart from being sources of infinite natural beauty, rivers and lakes are an important source of water, food and transportation. The northern hemisphere of Canada experiences extreme cold temperatures in the winter, resulting in a freeze-up of regional lakes and rivers. Frozen lakes and rivers offer unique opportunities in terms of wildlife harvesting and winter transportation. Ice roads built on frozen rivers and lakes are vital supply lines for industrial operations in the remote north. Monitoring the ice freeze-up and break-up dates annually can help predict regional climatic changes. Lake ice impacts a variety of physical, ecological and economic processes. The construction and maintenance of a winter road can cost millions of dollars annually, and a good understanding of ice mechanics is required to build an ice road and deem it safe. A crucial factor in calculating the load-bearing capacity of an ice sheet is the thickness of the ice. Construction costs are mainly attributed to producing and maintaining a specific thickness and density of ice that can support different loads. Climate change is leading to warmer temperatures, causing the ice to thin faster; at a certain point, a winter road may not be thick enough to support travel and transportation. There is considerable interest in monitoring winter road conditions given the high construction and maintenance costs involved. Remote sensing technologies such as Synthetic Aperture Radar have been successfully utilized to study the extent of ice covers and record freeze-up and break-up dates of ice on lakes and rivers across the north. Ice road builders have often used ultrasound equipment to measure ice thickness. However, an automated monitoring system based on machine vision and image processing technology that can measure ice thickness on lakes has not previously been developed. Machine vision and image processing techniques have successfully been used in manufacturing

  13. Tracking forest canopy stress from an automated proximal hyperspectral monitoring system (United States)

    Woodgate, William; van Gorsel, Eva; Hughes, Dale; Cabello-Leblic, Arantxa


    Increasing climate variability and associated extreme weather events such as drought are likely to profoundly affect ecosystems, as many ecological processes are more sensitive to climate extremes than to changes in mean states. However, the response of vegetation to these changes is one of the largest uncertainties in projecting future climate, carbon sequestration, and water resources. This remains a major limitation for long-term climate prediction models integrating the vegetation dynamics that are crucial for modelling the interplay of water, carbon and radiation fluxes. Satellite remote sensing data, such as those from the MODIS, Landsat and Sentinel missions, are the only viable means to study national and global vegetation trends. Highly accurate in-situ data are critical to better understand and validate our satellite products. Here, we developed a fully automated hyperspectral monitoring system installed on a flux monitoring tower at a mature Eucalypt forest site. The monitoring system is designed to provide long-term (May 2014 - ongoing), high-temporal-frequency (3 acquisitions per day) characterisation of the proximal forest canopy to an unprecedented level of detail. The system comprises four main instruments: a thermal imaging camera and a hyperspectral line camera (spectral ranges 7.5-14 μm and 0.4-1 μm, respectively), an upward-pointing spectrometer (350-1000 nm), and a hemispherical camera. The time series of hyperspectral and thermal imagery and flux tower data provides a unique dataset to study the impacts of logging, nutrient stress, and heat stress on trees and forest. Specifically, the monitoring system can be used to derive a range of physiological and structural indices that are also derived by satellites, such as PRI, TCARI/OSAVI, and NDVI. The monitoring system is, to our knowledge, the first fully automated data acquisition system that allows for spatially resolved spectral measurements at the sub-crown scale. Preliminary results indicate that canopy

  14. An automated platform for phytoplankton ecology and aquatic ecosystem monitoring

    NARCIS (Netherlands)

    Pomati, F.; Jokela, J.; Simona, M.; Veronesi, M.; Ibelings, B.W.


    High quality monitoring data are vital for tracking and understanding the causes of ecosystem change. We present a potentially powerful approach for phytoplankton and aquatic ecosystem monitoring, based on integration of scanning flow-cytometry for the characterization and counting of algal cells wi

  15. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure (United States)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.


    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment including thousands of computing nodes, with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data-taking runs, streams of messages sent by applications via the message reporting system, together with data published by applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (at an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims to reduce manpower needs and to assure a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines; in particular, it leverages an event-driven architecture to unify the flow of data from the ATLAS infrastructure, a complex event processing (CEP) engine for correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. All components work in a loosely coupled event-based architecture, with a message broker
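    The time-windowed aggregation that such monitoring relies on can be sketched in miniature. The class below is a hedged illustration, not AAL's actual CEP queries (which are expert-defined): it keeps a sliding window of event timestamps and flags when the rate exceeds a hypothetical threshold.

    ```python
    from collections import deque

    class RateAlert:
        """Sliding-window rate check: flag when more than max_events
        arrive within window_s seconds (thresholds are illustrative)."""

        def __init__(self, window_s=10.0, max_events=100):
            self.window_s = window_s
            self.max_events = max_events
            self.times = deque()

        def on_event(self, t):
            """Record an event at time t (seconds); return True if the
            windowed event count now exceeds the threshold."""
            self.times.append(t)
            # Drop events that have fallen out of the window.
            while self.times and t - self.times[0] > self.window_s:
                self.times.popleft()
            return len(self.times) > self.max_events
    ```

    A real CEP engine would correlate across many streams and keys; this shows only the single-stream windowing primitive.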

  16. Text mining from ontology learning to automated text processing applications

    CERN Document Server

    Biemann, Chris


    This book comprises a set of articles that specify the methodology of text mining, describe the creation of lexical resources in the framework of text mining, and use text mining for various tasks in natural language processing (NLP). The analysis of large amounts of textual data is a prerequisite for building lexical resources such as dictionaries and ontologies, and also has direct applications in automated text processing in fields such as history, healthcare and mobile applications, to name just a few. This volume gives an update on recent gains in text mining methods and reflects

  17. Automated separation process for radioanalytical purposes at nuclear power plants. (United States)

    Nagy, L G; Vajda, N; Vodicska, M; Zagyvai, P; Solymosi, J


    Chemical separation processes have been developed to remove the matrix components and thus to determine fission products, especially radioiodine nuclides, in the primary coolant of WWER-type nuclear reactors. Special procedures have been elaborated to enrich long-lived nuclides in waste waters to be released and to separate and enrich caesium isotopes in the environment. All processes are based mainly on ion-exchange separations using amorphous zirconium phosphate. Automated equipment was constructed to meet the demands of the plant personnel for serial analysis.

  18. QualitySpy: a framework for monitoring software development processes

    Directory of Open Access Journals (Sweden)

    Marian Jureczko


    The growing popularity of highly iterative, agile processes creates an increasing need for automated monitoring of the quality of software artifacts, focused on short terms (in the case of the eXtreme Programming process, an iteration can be limited to one week). This paper presents a framework that calculates software metrics and cooperates with development tools (e.g. a source version control system and an issue tracking system) to describe the current state of a software project with regard to its quality. The framework is designed to support a high level of automation of data collection and to be useful for researchers as well as for industry. The framework is currently being developed, hence the paper reports already implemented features as well as future plans. The first release is scheduled for July.
  19. Automated processing of microphotos of dairy products


    Directory of Open Access Journals (Sweden)

    V. K. Bitiukov


    The article discusses the construction of algorithms for automated processing of microphotos of dairy products. Automated processing of microphotos of dairy products is relevant in the study of the degree of homogenization, since microphotos contain information about the mass-fraction distribution of fat globules. Today there are software products offering image processing that relieve researchers from routine manual data processing, but they need to be adapted for processing microphotos of dairy products. In this paper we propose to use the application package ImageJ to process image files taken with a digital microscope, and the software package Statistica to calculate the statistical characteristics. The processing algorithm consists of successive stages: conversion to gray scale, scaling, filtering, binarization, object recognition, and statistical processing of the recognition results. The result of the implemented data processing algorithms is the distribution function of the fat globules in terms of volume or mass fraction, as well as the statistical parameters of the distribution (the mathematical expectation, variance, skewness and kurtosis coefficients). To inspect and debug the algorithm, experimental studies were carried out: farm milk was homogenized at different homogenization pressures, microphotos were taken of each sample, and image processing was carried out in accordance with the proposed algorithm. The studies have shown the effectiveness and feasibility of the proposed algorithm, implemented as a script for ImageJ, with the data then exported to a file for the software package Statistica.
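    The stages named in this record (binarization, object recognition, distribution statistics) can be sketched in a few dozen lines. This is a self-contained toy version under stated assumptions (a fixed threshold, 4-connected labeling, globule size measured in pixels), not the authors' ImageJ script:

    ```python
    from collections import deque
    from statistics import mean, pvariance

    def binarize(gray, threshold=128):
        """Dark fat globules on a bright background -> 1/0 mask."""
        return [[1 if px < threshold else 0 for px in row] for row in gray]

    def label_objects(binary):
        """4-connected flood fill; return the pixel count of each object."""
        h, w = len(binary), len(binary[0])
        seen = [[False] * w for _ in range(h)]
        sizes = []
        for y in range(h):
            for x in range(w):
                if binary[y][x] and not seen[y][x]:
                    q, area = deque([(y, x)]), 0
                    seen[y][x] = True
                    while q:
                        cy, cx = q.popleft()
                        area += 1
                        for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                            if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                                seen[ny][nx] = True
                                q.append((ny, nx))
                    sizes.append(area)
        return sizes

    def distribution_stats(sizes):
        """Summary statistics of the globule size distribution."""
        return {"n": len(sizes), "mean": mean(sizes), "variance": pvariance(sizes)}

    # Tiny synthetic "microphoto": two dark globules on a bright field.
    gray = [
        [255,  40,  40, 255, 255, 255, 255, 255],
        [255,  40,  40, 255, 255,  60, 255, 255],
        [255, 255, 255, 255, 255,  60, 255, 255],
    ]
    print(distribution_stats(label_objects(binarize(gray))))  # two globules, mean size 3 px
    ```

    A real pipeline would add the scaling and filtering stages and convert pixel areas to volume or mass fractions via the microscope calibration.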

  20. A Process Model of Trust in Automation: A Signal Detection Theory Based Approach (United States)


    lead to trust in automation. We also discuss a simple process model, which helps us understand the results. Our experimental paradigm suggests that... participants are agnostic to the automation's behavior; instead, they merely focus on alarm rate. A process model suggests this is the result of a simple reward structure and a non-explicit cost of trusting the automation.

  1. A Low Cost Automated Monitoring System for Landslides Using Dual Frequency GPS (United States)

    Mills, H.; Edwards, S.


    Landslides are an existing and permanent threat to societies across the globe, generating financial and human losses whenever and wherever they occur. Drawing together the strands of science that provide increased understanding of landslide triggers through accurate modelling is therefore vital for the development of mitigation and management strategies. Together with climatic and geomorphological data, a key input here is information on the precise location and timing of landslide events. However, detailed monitoring of landslides and precursor movements is generally limited to episodic campaigns, where limiting factors include equipment and mobilisation costs, time constraints and spatial resolution. This research has developed a geodetic tool of benefit to scientists developing closely coupled models that seek to relate trigger mechanisms, such as rainfall duration and intensity and changes in groundwater pressure, to actual land movements. A fully automated, low-cost, dual-frequency GPS station for continuous in-situ monitoring of landslide sites has been developed. The system configuration combines a dual-frequency GPS receiver, a PC board with a GPRS modem, and a power supply to deliver 24 hr/365 day operation capability. Individual components have been chosen to provide the highest accuracies while minimising power consumption, resulting in a system cost around half that of equivalent commercial systems. Per-point measurement costs can be further reduced through the use of antenna switching and multi-antenna arrays. Continuous data are delivered via a mobile phone uplink and processed automatically using geodetic software. The developed system has been extensively tested on a purpose-built platform capable of simulating ground movements. Co-mounted antennas have allowed direct comparisons with more expensive geodetic GPS receivers. The system is capable of delivering precise 3D coordinates with a 9 mm rms. The system can be up-scaled resulting in the

  2. Silicon Carbide Temperature Monitor Processing Improvements. Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Unruh, Troy Casey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Daw, Joshua Earl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Al Rashdan, Ahamad [Idaho National Lab. (INL), Idaho Falls, ID (United States)


    Silicon carbide (SiC) temperature monitors are used as temperature sensors in Advanced Test Reactor (ATR) irradiations at the Idaho National Laboratory (INL). Although thermocouples are typically used to provide real-time temperature indication in instrumented lead tests, other indicators, such as melt wires, are also often included in such tests as an independent technique for detecting peak temperatures incurred during irradiation. In addition, less expensive static capsule tests, which have no leads attached for real-time data transmission, often rely on melt wires as a post-irradiation technique for peak temperature indication. Melt wires are limited in that they can only detect whether a single temperature is or is not exceeded; SiC monitors are advantageous because a single monitor can detect a range of temperatures that occurred during irradiation. As part of the process initiated to make SiC temperature monitors available at the ATR, post-irradiation evaluations of these monitors have previously been completed at the High Temperature Test Laboratory (HTTL). INL selected the resistance measurement approach for determining irradiation temperature from SiC temperature monitors because it is considered to be the most accurate measurement. The current process involves repeated annealing of the SiC monitors at incrementally increasing temperatures, with resistivity measurements made between annealing steps. The process is time consuming and requires the nearly constant attention of a trained staff member. In addition to the expensive and lengthy post-irradiation analysis required, the current process adds many potential sources of measurement error, as the sensor must be repeatedly moved from furnace to test fixture. This time-consuming post-irradiation analysis is a significant portion of the total cost of using these otherwise inexpensive sensors. An additional consideration of this research is that, if the SiC post processing can be automated, it
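    The resistance approach described above lends itself to a simple sketch. Assuming, as such monitors are typically read out, that resistance begins to recover once the annealing temperature exceeds the temperature experienced during irradiation, a minimal knee-finding routine might look like the following. The function name, units, and the 2 % drop tolerance are illustrative assumptions, not INL's actual procedure:

    ```python
    def estimate_irradiation_temperature(anneal_temps_c, resistances_ohm, rel_drop=0.02):
        """Return the first annealing temperature at which resistance has
        fallen more than rel_drop below the initial baseline -- a stand-in
        for the knee of the resistance-vs-annealing-temperature curve."""
        baseline = resistances_ohm[0]
        for t, r in zip(anneal_temps_c, resistances_ohm):
            if r < baseline * (1.0 - rel_drop):
                return t
        return None  # no significant recovery observed

    # Synthetic data: resistance is flat until annealing passes ~300 C.
    temps = [100, 150, 200, 250, 300, 350]
    readings = [1000.0, 1001.0, 999.0, 998.0, 940.0, 900.0]
    print(estimate_irradiation_temperature(temps, readings))  # -> 300
    ```

    An automated rig would drive the furnace and meter through this loop directly, which is exactly the hand-tended sequence the record describes.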

  3. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process (United States)

    Spaulding, Trent Joseph


    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  4. Starting the automation process by using group technology

    Directory of Open Access Journals (Sweden)

    Jorge Andrés García Barbosa


    This article describes starting up an automation process based on applying group technology (GT). Mecanizados CNC, a company making metallurgical-sector products, bases its layout (the organisation and disposition of its machinery) on the concept of manufacturing cells; production is scheduled once the best location for the equipment has been determined. The order in which products are made and the appropriate setup of tools for the machinery in the cells are established with the aim of minimising setup time, leading to a 15% improvement in productivity.

  5. Monitoring and Control of the Automated Transfer Vehicle (United States)

    Hugonnet, C.; D'Hoine, S.

    The objective of this paper is to present succinctly the architecture of the heart of the ATV Control Centre: the Monitoring and Control developed by CS for the French Space Agency (CNES) and the European Space Agency (ESA). At the moment, the Monitoring and Control is in the development phase, a first real time version will be delivered to CNES in July 2003, then a second version will be delivered in October including off line capabilities. The following paper introduces the high level specifications and the main driving performance criteria of the monitoring and control system in order to successfully operate these complex ATV space vehicles from the first flight planned in 2004. It presents the approach taken by CS and CNES in order to meet this challenge in a very short time. ATV-CC Monitoring and Control system is based on the reuse of flight proven components that are integrated in a software bus based architecture. The paper particularly shows the advantages of using new computer technologies in operational system: use of Object Oriented technologies from specification, design (UML) to development (C++, Java, PLSQL), use of a CORBA Object Request Broker for the exchange of messages and some centralised services, use of Java for the development of an ergonomic and standardised (for all functions of the M&C) Graphical User Interface and the extensive use of XML for data exchanges.

  6. Automated Grid Monitoring for LHCb through HammerCloud

    CERN Document Server

    CERN. Geneva


    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  7. Automated Performance Monitoring and Assessment for DCS Digital Systems. (United States)


    eye opening monitor performance assessment techniques. (3) Implement and program a CPMAS test processor subsystem. (4) Perform a field test... internodal path in the planned DEB network is 11 (Hillingdon to Schoenfeld). For this reason, the path from Node A to Node B in the Transmission... [fragmentary record; the remainder is residue of a site/radio-count table: Hohenstadt 4, Bann 6, Stuttgart 4, Hillingdon 3, Langerkopf 5, Croughton 4, Donnersberg 10, Martlesham Heath 3, Pirmasens 4, Adenau 3]

  8. Automated monitoring: a potential solution for achieving sustainable improvement in hand hygiene practices. (United States)

    Levchenko, Alexander I; Boscart, Veronique M; Fernie, Geoff R


    Adequate hand hygiene is often considered the most effective method of reducing the rates of hospital-acquired infections, which are one of the major causes of increased cost, morbidity, and mortality in healthcare. Electronic monitoring technologies provide a promising direction for achieving sustainable hand hygiene improvement by introducing elements of automated feedback and making it possible to collect individual hand hygiene performance data automatically. The results of multiphase testing of an automated hand hygiene reminding and monitoring system installed in a complex continuing care setting are presented. The study included a baseline Phase 1, with the system performing automated data collection only; a preintervention Phase 2, with the hand hygiene status indicator enabled; two intervention Phases 3 and 4, with the system generating hand hygiene reminding signals and periodic performance feedback sessions provided; and a postintervention Phase 5, with only the hand hygiene status indicator enabled and no feedback sessions provided. A significant increase in hand hygiene performance observed during the first intervention Phase 3 was sustained over the second intervention Phase 4, with the postintervention phase also indicating higher hand hygiene activity rates compared with the preintervention and baseline phases. The overall trends observed during the multiphase testing, the factors affecting acceptability of the automated hand hygiene monitoring system, and various strategies of technology deployment are discussed.

  9. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    CERN Document Server

    Ge, Zhiqiang


      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...
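    The core of many MSPC schemes is the Hotelling T² statistic: score each new observation by its covariance-weighted distance from the in-control mean. A minimal two-variable illustration (pure Python; a real MSPC implementation would monitor PCA scores and set control limits from an F-distribution, and all data here are synthetic):

    ```python
    def mean_vec(data):
        """Column-wise mean of a list of equal-length tuples."""
        n = len(data)
        return [sum(row[i] for row in data) / n for i in range(len(data[0]))]

    def cov2(data, mu):
        """Sample covariance matrix for two variables."""
        n = len(data)
        sxx = sum((x - mu[0]) ** 2 for x, _ in data) / (n - 1)
        syy = sum((y - mu[1]) ** 2 for _, y in data) / (n - 1)
        sxy = sum((x - mu[0]) * (y - mu[1]) for x, y in data) / (n - 1)
        return [[sxx, sxy], [sxy, syy]]

    def t_squared(point, mu, s):
        """Hotelling T^2 = (x - mu)^T S^-1 (x - mu) for the 2x2 case."""
        det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
        inv = [[ s[1][1] / det, -s[0][1] / det],
               [-s[1][0] / det,  s[0][0] / det]]
        d = [point[0] - mu[0], point[1] - mu[1]]
        return (d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1])
              + d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))

    # "In-control" training data for two correlated process variables:
    train = [(1, 5), (2, 3), (3, 6), (4, 4), (5, 7)]
    mu = mean_vec(train)   # [3.0, 5.0]
    s = cov2(train, mu)
    ```

    An out-of-control observation such as (10, 0) then scores far above typical in-control points, which is the fault-detection signal MSPC charts against a control limit.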

  10. Monitoring method for automated CD-SEM recipes (United States)

    Maeda, Tatsuya; Iwama, Satoru; Nishihara, Makoto; Berger, Daniel; Berger, Andrew; Ueda, Kazuhiro; Kenichi, Takenouchi; Iizumi, Takashi


    A prototype digital video storage system (CD-watcher) has been developed and attached to a Hitachi S-9380 CD-SEM. The storage system has several modes that are selectable depending on the phenomenon of interest. The system can store video images of durations from a few seconds to a few weeks, depending on resolution, sampling rate, and hard disc drive capacity. The system was used to analyze apparent focusing problems that occurred during the execution of automated recipes. Intermittent focusing problems had been an issue on a particular tool for a period of approximately three months; review of the saved images suggested that the problem was auto focus. Two days after installation, the CD-watcher system was able to record the errors, making it possible to determine the root cause by checking the stored video files. After analysis of the stored video files, it was apparent that the problem consisted of three types of errors. The ability to record and store video files reduced the time needed to isolate the problem and prevented incorrect diagnosis. The system was also used to explain a complex phenomenon that occurred during the observation of a particular layer. Because it is sometimes difficult to accurately describe, and to have others easily understand, certain phenomena in a written report, the video storage system can be used in place of manual annotation. In this report, we describe the CD-watcher system, test results after installing the system on a Hitachi S-9380 CD-SEM, and potential applications of the system.

  11. Highly Automated Agile Testing Process: An Industrial Case Study

    Directory of Open Access Journals (Sweden)

    Jarosław Berłowski


    Full Text Available This paper presents a description of an agile testing process in a medium-size software project developed using Scrum. The research method used was a case study; surveys, quantifiable project data sources, and qualitative opinions of project members were used for data collection. Challenges related to the testing process, arising from a complex project environment and unscheduled releases, were identified. Based on the obtained results, we concluded that the described approach addresses these issues well. Recommendations were therefore made with regard to the employed principles of agility, specifically: continuous integration, responding to change, test automation, and test-driven development. Furthermore, an efficient testing environment that combines a number of test frameworks (e.g. JUnit, Selenium, Jersey Test) with custom-developed simulators is presented.

  12. Automated Image Processing for the Analysis of DNA Repair Dynamics

    CERN Document Server

    Riess, Thorsten; Tomas, Martin; Ferrando-May, Elisa; Merhof, Dorit


    The efficient repair of cellular DNA is essential for the maintenance and inheritance of genomic information. In order to cope with the high frequency of spontaneous and induced DNA damage, a multitude of repair mechanisms have evolved. These are enabled by a wide range of protein factors specifically recognizing different types of lesions and finally restoring the normal DNA sequence. This work focuses on the repair factor XPC (xeroderma pigmentosum complementation group C), which identifies bulky DNA lesions and initiates their removal via the nucleotide excision repair pathway. The binding of XPC to damaged DNA can be visualized in living cells by following the accumulation of a fluorescent XPC fusion at lesions induced by laser microirradiation in a fluorescence microscope. In this work, an automated image processing pipeline is presented which allows to identify and quantify the accumulation reaction without any user interaction. The image processing pipeline comprises a preprocessing stage where the ima...

  13. Integrated Monitoring System of Production Processes

    Directory of Open Access Journals (Sweden)

    Oborski Przemysław


    Full Text Available An integrated monitoring system for discrete manufacturing processes is presented in the paper. A multilayer hardware and software reference model was developed. The original research answers industry's need for integrating the flow of information in the production process. The reference model corresponds with the proposed data model, based on a multilayer data tree, which allows orders, products, and processes to be described and monitoring data to be saved. The elaborated models were implemented in the integrated monitoring system demonstrator developed in the project. It was built on the basis of multi-agent technology to ensure high flexibility and openness in applying intelligent algorithms for data processing. Currently, based on the experience gained, an integrated monitoring system for a real production system is being developed. The article presents the main problems of monitoring integration, including the specificity of discrete production, data processing, and the future application of Cyber-Physical Systems. The development of manufacturing systems is increasingly based on applying intelligent solutions to machine and production-process control and monitoring. Connecting technical systems, machine tools, and manufacturing-process monitoring with advanced information processing appears to be one of the most important areas of near-future development, and it will play an important role in the efficient operation and competitiveness of the whole production system. It is also an important area for the future application of Cyber-Physical Systems, which can radically improve the functionality of monitoring systems and reduce the cost of their implementation.

  14. Automated, Multiplexed Electrical Impedance Spectroscopy Platform for Continuous Monitoring of Microtissue Spheroids. (United States)

    Bürgel, Sebastian C; Diener, Laurin; Frey, Olivier; Kim, Jin-Young; Hierlemann, Andreas


    Microtissue spheroids in microfluidic devices are increasingly used to establish novel in vitro organ models of the human body. As the spheroids are comparably sizable, it is difficult to monitor larger numbers of them by optical means. Therefore, electrical impedance spectroscopy (EIS) emerges as a viable alternative to probing spheroid properties. Current spheroid EIS systems are, however, not suitable for investigating multiple spheroids in parallel over extended time in an automated fashion. Here we address this issue by presenting an automated, multiplexed EIS (AMEIS) platform for impedance analysis in a microfluidic setting. The system was used to continuously monitor the effect of the anticancer drug fluorouracil (5-FU) on HCT116 cancer spheroids. Simultaneous EIS monitoring of up to 15 spheroids was performed in parallel over 4 days at a temporal resolution of 2 min without any need for pumps. The measurements were continuous in nature, and the setup was kept in a standard incubator under controlled conditions during the measurements. A baseline normalization method to improve robustness and to reduce the influence of slow changes in the medium conductivity on the spheroid EIS readings has been developed and validated by experiments and by means of a finite-element model. The same method and platform were then used for online monitoring of cardiac spheroids. The beating frequency of each cardiac spheroid could be read out in a completely automated fashion. The developed system constitutes a promising method for simultaneously evaluating drug impact and/or toxic effects on multiple microtissue spheroids.
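The abstract does not give the details of the baseline normalization, but the underlying idea, removing slow medium-conductivity drift from an impedance trace while preserving fast spheroid-related changes, can be sketched as follows. The running-median scheme and window size below are illustrative assumptions, not the authors' published method:

```python
import statistics

def baseline_normalize(trace, window=25):
    """Divide each impedance reading by a running-median baseline.

    The median over a wide window tracks slow drift (e.g. medium
    conductivity) but is robust to short spheroid-related transients,
    so dividing by it leaves fast changes visible around a value of 1.
    """
    half = window // 2
    out = []
    for i, value in enumerate(trace):
        lo, hi = max(0, i - half), min(len(trace), i + half + 1)
        out.append(value / statistics.median(trace[lo:hi]))
    return out
```

On a slowly drifting trace the normalized signal sits at 1.0, while a short transient (such as a beating or drug-response event) stands out as a clear excursion.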

  15. Comparison of Standard Automated Perimetry, Short-Wavelength Automated Perimetry, and Frequency-Doubling Technology Perimetry to Monitor Glaucoma Progression. (United States)

    Hu, Rongrong; Wang, Chenkun; Gu, Yangshun; Racette, Lyne


    Detection of progression is paramount to the clinical management of glaucoma. Our goal is to compare the performance of standard automated perimetry (SAP), short-wavelength automated perimetry (SWAP), and frequency-doubling technology (FDT) perimetry in monitoring glaucoma progression. Longitudinal data of paired SAP, SWAP, and FDT from 113 eyes with primary open-angle glaucoma enrolled in the Diagnostic Innovations in Glaucoma Study or the African Descent and Glaucoma Evaluation Study were included. Data from all tests were expressed in comparable units by converting the sensitivity from decibels to unitless contrast sensitivity and by expressing sensitivity values in percent of mean normal based on an independent dataset of 207 healthy eyes, with aging deterioration taken into consideration. Pointwise linear regression analysis was performed and 3 criteria (conservative, moderate, and liberal) were used to define progression and improvement. Global mean sensitivity (MS) was fitted with linear mixed models. No statistically significant difference in the proportion of progressing and improving eyes was observed across tests using the conservative criterion. Fewer eyes showed improvement on SAP compared to SWAP and FDT using the moderate criterion; and FDT detected fewer progressing eyes than SAP and SWAP using the liberal criterion. The agreement between these test types was poor. The linear mixed model showed a progressing trend of global MS over time for SAP and SWAP, but not for FDT. The baseline estimate of SWAP MS was significantly lower than SAP MS by 21.59% of mean normal. FDT showed comparable estimation of baseline MS with SAP. SWAP and FDT do not appear to have significant benefits over SAP in monitoring glaucoma progression. SAP, SWAP, and FDT may, however, detect progression in different glaucoma eyes.
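Pointwise linear regression of the kind described fits a least-squares slope to each test location's sensitivity series and flags locations whose decline is both steep and statistically reliable. A minimal sketch follows; the slope and significance cut-offs are illustrative assumptions, not the study's actual conservative/moderate/liberal criteria, and a normal approximation stands in for the exact t-test:

```python
from statistics import NormalDist

def plr_slope(times, sens):
    """Least-squares slope (sensitivity change per year) and approximate p-value."""
    n = len(times)
    tbar, ybar = sum(times) / n, sum(sens) / n
    sxx = sum((t - tbar) ** 2 for t in times)
    sxy = sum((t - tbar) * (y - ybar) for t, y in zip(times, sens))
    slope = sxy / sxx
    sse = sum((y - (ybar + slope * (t - tbar))) ** 2 for t, y in zip(times, sens))
    if sse == 0:                       # perfect fit: treat as maximally significant
        return slope, 0.0
    se = (sse / (n - 2) / sxx) ** 0.5  # standard error of the slope
    z = abs(slope / se)
    p = 2 * (1 - NormalDist().cdf(z))  # normal approximation to the t-test
    return slope, p

def is_progressing(times, sens, slope_cut=-1.0, alpha=0.01):
    """Flag a location as progressing if decline is steep and significant."""
    slope, p = plr_slope(times, sens)
    return slope < slope_cut and p < alpha
```

In practice one such test runs per visual-field location per eye, and a progression criterion requires some number of locations to be flagged.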

  16. Material quality development during the automated tow placement process (United States)

    Tierney, John Joseph

    Automated tow placement (ATP) of thermoplastic composites builds on the existing industrial base for equipment, robotics and kinematic placement of material with the aim of further cost reduction by eliminating the autoclave entirely. During ATP processing, thermoplastic composite tows are deposited on a preconsolidated substrate at rates ranging from 10–100 mm/s and consolidated using the localized application of heat and pressure by a tow placement head mounted on a robot. The process is highly non-isothermal, subjecting the material to multiple heating and cooling rates approaching 1000°C/s. The requirement for the ATP process is to achieve the same quality in seconds (low void content, full translation of mechanical properties and degree of bonding, and minimal warpage) as the autoclave process achieves in hours. The scientific challenge was to first understand and then model the relationships between processing, material response, microstructure and quality. The important phenomena affecting quality investigated in this study include a steady state heat transfer simulation, consolidation and deconsolidation (void dynamics), intimate contact and polymer interdiffusion (degree of bonding/mechanical properties) and residual stress and warpage (crystallization and viscoelastic response). A fundamental understanding of the role of materials related to these mechanisms and their relationship to final quality is developed and applied towards a method of process control and optimization.

  17. FDEMS Sensing for Automated Intelligent Processing of PMR-15 (United States)

    Kranbuehl, David E.; Hood, D. K.; Rogozinski, J.; Barksdale, R.; Loos, Alfred C.; McRae, Doug


    The purpose of this grant was to develop frequency dependent dielectric measurements, often called FDEMS (frequency dependent electromagnetic sensing), to monitor and intelligently control the cure process in PMR-15, a stoichiometric mixture of a nadic ester, dimethyl ester, and methylenedianiline in a monomer ratio.

  18. Automated EEG monitoring in defining a chronic epilepsy model. (United States)

    Mascott, C R; Gotman, J; Beaudet, A


    There has been a recent surge of interest in chronic animal models of epilepsy. Proper assessment of these models requires documentation of spontaneous seizures by EEG, observation, or both in each individual animal to confirm the presumed epileptic condition. We used the same automatic seizure detection system as that currently used for patients in our institution and many others. Electrodes were implanted in 43 rats before intraamygdalar administration of kainic acid (KA). Animals were monitored intermittently for 3 months. Nine of the rats were protected by anticonvulsants [pentobarbital (PB) and diazepam (DZP)] at the time of KA injection. Between 1 and 3 months after KA injection, spontaneous seizures were detected in 20 of the 34 unprotected animals (59%). Surprisingly, spontaneous seizures were also detected during the same period in 2 of the 9 protected animals that were intended to serve as nonepileptic controls. Although the absence of confirmed spontaneous seizures in the remaining animals cannot exclude their occurrence, it indicates that, if present, they are at least rare. On the other hand, definitive proof of epilepsy is invaluable in the attempt to interpret pathologic data from experimental brains.

  19. Process independent automated sizing methodology for current steering DAC (United States)

    Vural, R. A.; Kahraman, N.; Erkmen, B.; Yildirim, T.


    This study introduces a process independent automated sizing methodology based on a general regression neural network (GRNN) for current steering complementary metal-oxide semiconductor (CMOS) digital-to-analog converter (DAC) circuits. The aim is to utilise circuit structures designed with previous process technologies and to synthesise circuit structures for novel process technologies, in contrast to other modelling research that considers a particular process technology. The simulations were performed using ON SEMI 1.5 µm, ON SEMI 0.5 µm and TSMC 0.35 µm technology process parameters. A high-dimensional database was then developed consisting of transistor sizes of DAC designs and the corresponding static specification errors obtained from simulation results. The key point is that the GRNN was trained with the data set comprising the simulation results for the ON SEMI 1.5 µm and 0.5 µm technology parameters, while the test data comprised only the simulation results for the TSMC 0.35 µm technology parameters, which had not been used to train the GRNN. The proposed methodology provides the channel lengths and widths of all transistors for a newer technology when the designer sets the numeric values of the DAC static output specifications (Differential Non-linearity error, Integral Non-linearity error, monotonicity and gain error) as the inputs of the network.
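A GRNN of the kind used here is a kernel-weighted average of the stored training outputs: each previously simulated design contributes in proportion to a Gaussian of its distance from the query specification. A minimal sketch follows; the bandwidth value and the toy data are assumptions for illustration, and real use would scale the specification features first:

```python
import math

def grnn_predict(x, train_x, train_y, sigma=0.5):
    """Predict an output vector as a Gaussian-kernel-weighted mean of training outputs."""
    weights = [
        math.exp(-sum((a - b) ** 2 for a, b in zip(x, xi)) / (2 * sigma ** 2))
        for xi in train_x
    ]
    total = sum(weights)
    dims = len(train_y[0])
    return [sum(w * y[d] for w, y in zip(weights, train_y)) / total
            for d in range(dims)]
```

In the paper's setting, `x` would hold the target static specifications (DNL, INL, monotonicity, gain error) and each training output vector the transistor channel widths and lengths of a known-good design.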

  20. ZigBee Based Industrial Automation Profile for Power Monitoring Systems

    Directory of Open Access Journals (Sweden)

    Archana R. Raut


    Full Text Available Industrial automation mostly depends on power systems and requires remotely controlled and regulated systems. Voltage and current parameters, along with power and energy management, form the industrial scenario for automation. Wireless technology that meets cost, speed and distance requirements will always be a point of interest for research. In this research work we monitored power-related parameters and enabled remote switching devices for proper power management using ZigBee. This paper proposes a digital system for condition monitoring, diagnosis, and supervisory control of electric-system parameters such as voltage and current using wireless sensor networks (WSNs) based on ZigBee. Its main feature is its use of the ZigBee protocol as the communication medium between the transmitter and receiver modules. It illustrates that the new ZigBee standard performs well in industrial environments.

  1. Analysing risk factors for urinary tract infection based on automated monitoring of hospital-acquired infection. (United States)

    Redder, J D; Leth, R A; Møller, J K


    Urinary tract infections account for as much as one-third of all nosocomial infections. The aim of this study was to examine previously reported characteristics of patients with hospital-acquired urinary tract infections (HA-UTI) using an automated infection monitoring system (Hospital-Acquired Infection Registry: HAIR). A matched case-control study was conducted to investigate the association of risk factors with HA-UTI. Patients with HA-UTI more frequently had indwelling urinary catheters or a disease in the genitourinary or nervous system than the controls. Automated hospital-acquired infection monitoring enables documentation of key risk factors to better evaluate infection control interventions in general or for selected groups of patients.

  2. Possibilities in optical monitoring of laser welding process (United States)

    Horník, Petr; Mrňa, Libor; Pavelka, Jan


    Laser welding is a modern method of welding that, though widely used, is still not commonplace. With increasing demands on the quality of the welds, it is usual to apply automated machine welding with on-line monitoring of the welding process. The resulting quality of the weld is largely affected by the behavior of the keyhole. However, its direct observation during the welding process is practically impossible and it is necessary to use indirect methods. At ISI we have developed optical methods of monitoring the process. The most advanced is an analysis of the radiation of the laser-induced plasma plume forming in the keyhole, where changes in the frequency of the plasma bursts are monitored and evaluated using Fourier and autocorrelation analysis. Another solution, robust and suitable for industry, is based on observation of the keyhole inlet opening through a coaxial camera mounted in the welding head and subsequent image processing by computer vision methods. A high-speed camera is used to understand the dynamics of the plasma plume. Through optical spectroscopy of the plume, we can study the excitation of elements in a material. It is also beneficial to monitor the flow of shielding gas using the schlieren method.

  3. Automation of a problem list using natural language processing

    Directory of Open Access Journals (Sweden)

    Haug Peter J


    Full Text Available Abstract Background The medical problem list is an important part of the electronic medical record in development in our institution. To serve the functions it is designed for, the problem list has to be as accurate and timely as possible. However, the current problem list is usually incomplete and inaccurate, and is often totally unused. To alleviate this issue, we are building an environment where the problem list can be easily and effectively maintained. Methods For this project, 80 medical problems were selected for their frequency of use in our future clinical field of evaluation (cardiovascular. We have developed an Automated Problem List system composed of two main components: a background and a foreground application. The background application uses Natural Language Processing (NLP to harvest potential problem list entries from the list of 80 targeted problems detected in the multiple free-text electronic documents available in our electronic medical record. These proposed medical problems drive the foreground application designed for management of the problem list. Within this application, the extracted problems are proposed to the physicians for addition to the official problem list. Results The set of 80 targeted medical problems selected for this project covered about 5% of all possible diagnoses coded in ICD-9-CM in our study population (cardiovascular adult inpatients, but about 64% of all instances of these coded diagnoses. The system contains algorithms to detect first document sections, then sentences within these sections, and finally potential problems within the sentences. The initial evaluation of the section and sentence detection algorithms demonstrated a sensitivity and positive predictive value of 100% when detecting sections, and a sensitivity of 89% and a positive predictive value of 94% when detecting sentences. 
Conclusion The global aim of our project is to automate the process of creating and maintaining a problem

  4. Water quality monitoring using an automated portable fiber optic biosensor: RAPTOR (United States)

    Anderson, George P.; Rowe-Taitt, Chris A.


    The RAPTOR is a portable, automated biosensor capable of performing rapid, ten-minute assays on a sample for four target analytes simultaneously. Samples are analyzed using a fluorescent sandwich immunoassay on the surface of short polystyrene optical probes with capture antibody adsorbed to the probe surface. Target analytes bound to the fiber by capture antibodies are detected with fluorescently labeled tracer antibodies, which are held in a separate reservoir. Since target recognition is a two-step process, selectivity is enhanced, and the optical probes can be reused up to forty times, or until a positive result is obtained. This greatly reduces the logistical burden for field operations. Numerous assays for toxins, such as SEB and ricin, and bacteria, such as Bacillus anthracis and Francisella tularensis, have been developed for the RAPTOR. An assay of particular interest for water quality monitoring and the screening of fruits and vegetables is detection of Giardia cysts. Giardia lamblia is a parasitic protozoan common in the developing world that causes severe intestinal infections. Thus, a simple field assay for screening water supplies would be highly useful. Such an assay has been developed using the RAPTOR. The detection limit for Giardia cysts was 5×10⁴/ml for a 10-minute assay.

  5. Water monitoring: automated and real time identification and classification of algae using digital microscopy. (United States)

    Coltelli, Primo; Barsanti, Laura; Evangelista, Valtere; Frassanito, Anna Maria; Gualtieri, Paolo


    Microalgae are unicellular photoautotrophs that grow in any habitat from fresh and saline water bodies to hot springs and ice. Microalgae can be used as indicators to monitor water ecosystem conditions. These organisms react quickly and predictably to a broad range of environmental stressors, thus providing early signals of a changing environment. When grown extensively, microalgae may produce harmful effects on marine or freshwater ecology and fishery resources. Rapid and accurate recognition and classification of microalgae is one of the most important issues in water resource management. In this paper, a methodology for automatic and real-time identification and enumeration of microalgae by means of image analysis is presented. The methodology is based on segmentation, shape feature extraction, pigment signature determination and neural network grouping; it attained 98.6% accuracy on a set of 53,869 images of 23 different microalgae representing the major algal phyla. In our opinion this methodology partly overcomes the lack of automated identification systems and is at the forefront of developing a computer-based image processing technique to automatically detect, recognize, identify and enumerate microalgae genera and species from all the divisions. This methodology could be useful for appropriate and effective water resource management.

  6. Generic HPLC platform for automated enzyme reaction monitoring: Advancing the assay toolbox for transaminases and other PLP-dependent enzymes. (United States)

    Börner, Tim; Grey, Carl; Adlercreutz, Patrick


    Methods for rapid and direct quantification of enzyme kinetics independent of the substrate are in high demand for both fundamental research and bioprocess development. This study addresses the need for a generic method by developing an automated, standardizable HPLC platform monitoring reaction progress in near real-time. The method was applied to amine transaminase (ATA) catalyzed reactions, intensifying process development for chiral amine synthesis. Autosampler-assisted pipetting facilitates integrated mixing and sampling under controlled temperature. Crude enzyme formulations at high and low substrate concentrations can be employed. Sequential, small (1 µL) sample injections and immediate detection after separation permit fast reaction monitoring with excellent sensitivity, accuracy and reproducibility. Due to its modular design, different chromatographic techniques, e.g. reverse phase and size exclusion chromatography (SEC), can be employed. A novel assay for pyridoxal 5'-phosphate-dependent enzymes is presented using SEC for direct monitoring of enzyme-bound and free reaction intermediates. Time-resolved changes of the different cofactor states, e.g. pyridoxal 5'-phosphate, pyridoxamine 5'-phosphate and the internal aldimine, were traced in both half reactions. The combination of the automated HPLC platform with SEC offers a method for substrate-independent screening, supplying a missing piece in the assay and screening toolbox for ATAs and other PLP-dependent enzymes.

  7. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS). (United States)


    ... 10 Energy 4 2010-01-01 2010-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in...

  8. EWMA control charts in statistical process monitoring

    NARCIS (Netherlands)

    Zwetsloot, I.M.


    In today’s world, the amount of available data is steadily increasing, and it is often of interest to detect changes in the data. Statistical process monitoring (SPM) provides tools to monitor data streams and to signal changes in the data. One of these tools is the control chart. The topic of this
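The EWMA control chart mentioned here smooths the incoming data stream with an exponentially weighted moving average and signals a change when the statistic leaves control limits derived from the in-control mean and standard deviation. A minimal sketch with conventional parameter choices (λ = 0.2, limit width L = 3; these defaults are illustrative, not taken from the thesis):

```python
import statistics

def ewma_chart(data, lam=0.2, L=3.0, mu0=None, sigma0=None):
    """Return the 1-based indices at which the EWMA statistic signals a change."""
    mu0 = statistics.mean(data) if mu0 is None else mu0
    sigma0 = statistics.stdev(data) if sigma0 is None else sigma0
    z, signals = mu0, []
    for i, x in enumerate(data, start=1):
        z = lam * x + (1 - lam) * z
        # time-varying limits widen toward their asymptotic value as i grows
        var = sigma0 ** 2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * i))
        if abs(z - mu0) > L * var ** 0.5:
            signals.append(i)
    return signals
```

Because the EWMA accumulates evidence over successive observations, a sustained 1.5σ shift that a single-observation rule might miss is flagged within a handful of samples.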

  9. Automation of the CFD Process on Distributed Computing Systems (United States)

    Tejnil, Ed; Gee, Ken; Rizk, Yehia M.


    A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational
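For resources without queueing software, the simple first-in-first-out scheme the script system established can be sketched as follows. The class, the slot model, and the method names are illustrative assumptions, not the ADTT scripts themselves:

```python
from collections import deque

class JobQueue:
    """First-in-first-out job queue with a fixed number of concurrent slots."""

    def __init__(self, slots=2):
        self.pending = deque()   # jobs submitted but not yet started
        self.running = []        # jobs currently occupying a slot
        self.finished = []       # jobs that have completed
        self.slots = slots

    def submit(self, job):
        self.pending.append(job)

    def dispatch(self):
        # start pending jobs, oldest first, while free slots remain
        while self.pending and len(self.running) < self.slots:
            self.running.append(self.pending.popleft())

    def complete(self, job):
        # free the slot and immediately hand it to the next pending job
        self.running.remove(job)
        self.finished.append(job)
        self.dispatch()
```

Each "job" would stand for one flow-solver run in the parametric design space; submission order is preserved, which is the defining property of the first-in-first-out structure described.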

  10. FTIR monitoring of industrial scale CVD processes (United States)

    Hopfe, V.; Mosebach, H.; Meyer, M.; Sheel, D.; Grählert, W.; Throl, O.; Dresler, B.


    The goal is to improve chemical vapour deposition (CVD) and infiltration (CVI) process control by a multipurpose, knowledge based feedback system. For monitoring the CVD/CVI process, in-situ FTIR spectroscopic data have been identified as the input information. In the presentation, three commonly used, and distinctly different, types of industrial CVD/CVI processes are taken as test cases: (i) a thermal high capacity CVI batch process for manufacturing carbon fibre reinforced SiC composites for high temperature applications, (ii) a continuously driven CVD thermal process for coating float glass for energy protection, and (iii) a laser stimulated CVD process for continuously coating bundles of thin ceramic fibers. The feasibility of the concept with FTIR in-situ monitoring as a core technology has been demonstrated. FTIR monitoring sensitively reflects process conditions.

  11. Quality Control in Automated Manufacturing Processes – Combined Features for Image Processing

    Directory of Open Access Journals (Sweden)

    B. Kuhlenkötter


    Full Text Available In production processes the use of image processing systems is widespread, and hardware solutions and cameras are available for nearly every application. One important challenge for image processing systems is the development and selection of appropriate algorithms and software solutions in order to realise ambitious quality control for production processes. This article characterises the development of innovative software that combines features for automatic defect classification on product surfaces. The artificial-intelligence method Support Vector Machine (SVM) is used to execute the classification task according to the combined features. This software is one crucial element in the automation of a manually operated production process.

  12. Automation of the Technological Process to Produce Building Frame-Monolithic Modules Based on Fluoranhydrite (United States)

    Fedorchuk, J.; Sadenova, M.; Rusina, O.


    The paper first proposes the automation of the technological process to produce building frame-monolithic modules from production wastes, namely technogenic anhydrite and fluoranhydrite. A functional diagram of the process automation is developed, and the devices to perform control and maintenance are chosen with account taken of the production characteristics.

  13. Single-cell bacteria growth monitoring by automated DEP-facilitated image analysis. (United States)

    Peitz, Ingmar; van Leeuwen, Rien


    Growth monitoring is the method of choice in many assays measuring the presence or properties of pathogens, e.g. in diagnostics and food quality. Established methods, relying on culturing large numbers of bacteria, are rather time-consuming, while in healthcare time often is crucial. Several new approaches have been published, mostly aiming at assaying growth or other properties of a small number of bacteria. However, no method so far readily achieves single-cell resolution with a convenient and easy to handle setup that offers the possibility for automation and high throughput. We demonstrate these benefits in this study by employing dielectrophoretic capturing of bacteria in microfluidic electrode structures, optical detection and automated bacteria identification and counting with image analysis algorithms. For a proof-of-principle experiment we chose an antibiotic susceptibility test with Escherichia coli and polymyxin B. Growth monitoring is demonstrated on single cells and the impact of the antibiotic on the growth rate is shown. The minimum inhibitory concentration as a standard diagnostic parameter is derived from a dose-response plot. This report is the basis for further integration of image analysis code into device control. Ultimately, an automated and parallelized setup may be created, using an optical microscanner and many of the electrode structures simultaneously. Sufficient data for a sound statistical evaluation and a confirmation of the initial findings can then be generated in a single experiment.
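Deriving the minimum inhibitory concentration from the dose-response plot described reduces to finding the lowest antibiotic concentration at which the measured growth rate no longer exceeds a threshold. A minimal sketch (the dict-based interface and the zero-growth threshold are assumptions for illustration):

```python
def minimum_inhibitory_concentration(dose_response, threshold=0.0):
    """Return the lowest concentration whose growth rate is at or below threshold.

    dose_response maps antibiotic concentration -> measured growth rate,
    e.g. doublings per hour estimated from single-cell image counts.
    """
    inhibitory = [c for c, rate in sorted(dose_response.items())
                  if rate <= threshold]
    return inhibitory[0] if inhibitory else None
```

With growth rates estimated per captured bacterium by the image analysis step, this yields the MIC as the standard diagnostic parameter the abstract refers to.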

  14. Robotics, automation, and the new role of process control. (United States)

    McPherson, R A


    The natural progression of automation in the clinical laboratory next will lead to robotic devices to perform many of the manual tasks still remaining. To date, most efforts of laboratory automation have been directed at the analytic phase. New targets for automation will be at the preanalytic and postanalytic phases where many of the bottlenecks in specimen flow now occur in highly repetitive manual tasks. Laboratory professionals will have a unique opportunity to incorporate new concepts of robotics in their facilities to improve error rates and to use massive laboratory databases to improve medical and public health services.

  15. Process development for automated solar cell and module production. Task 4: Automated array assembly (United States)

    Hagerty, J. J.


    Progress in the development of automated solar cell and module production is reported. The Unimate robot is programmed for the final 35-cell pattern to be used in the fabrication of the deliverable modules. The mechanical construction phases of the automated lamination station and the final assembly station are completed and the first operational testing is underway. The final controlling program is written and optimized. The glass reinforced concrete (GRC) panels to be used for testing and deliverables are in production. Test routines are grouped together and defined to produce the final control program.

  16. Automating the packing heuristic design process with genetic programming. (United States)

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John


    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains.
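The kind of heuristic such a system evolves can be pictured as a scoring function over (item, bin) pairs plugged into a generic packing loop. The sketch below hand-writes two candidate scoring expressions as stand-ins for evolved ones; the items and capacity are illustrative.

```python
# Each "heuristic" scores candidate bins for an item; a GP system would evolve
# the scoring expression. Two hand-written expressions stand in for evolved ones.
def pack(items, capacity, score):
    bins = []  # residual capacity of each open bin
    for size in items:
        feasible = [i for i, r in enumerate(bins) if r >= size]
        if feasible:
            best = max(feasible, key=lambda i: score(size, bins[i]))
            bins[best] -= size
        else:
            bins.append(capacity - size)  # open a new bin
    return len(bins)

best_fit = lambda size, residual: -(residual - size)   # tightest fit wins
worst_fit = lambda size, residual: residual - size     # loosest fit wins

items = [0.6, 0.5, 0.4, 0.3, 0.2]
n_best = pack(items, 1.0, best_fit)    # -> 2 bins
n_worst = pack(items, 1.0, worst_fit)  # -> 3 bins
```

The same loop evaluates any evolved expression, which is what lets one methodology cover different packing domains: only the scoring function changes.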


    Directory of Open Access Journals (Sweden)

    Wasana C. Uduwela


    Full Text Available Information and Communication Technology (ICT) improves business competitiveness in large-scale enterprises as well as in small and medium-scale enterprises. Lack of technical knowledge of ICT and its cost have been identified as challenges for small and medium enterprises adopting ICT for their businesses. They can overcome this problem by using freely available tools/systems that generate information systems automatically. However, these tools require the database structure; it is therefore desirable to have a tool that automates the relational database design process. In the proposed approach, business forms were considered as the database requirement input source, among the alternatives of learning from examples, natural language, structured input/output definition, and schema definition and forms. The approach applies a functional dependency algorithm to the un-normalized data fed through the business form, and then applies a normalization algorithm to the discovered functional dependencies to obtain the normalized database structure. User intervention is needed to supply the domain knowledge for this approach. Finally, it develops the normalized database with all the keys and relationships; the accuracy of the outcome depends entirely on the data fed by end users.
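The core test in such functional dependency discovery is whether an attribute set determines another: no two rows may agree on the left-hand side while differing on the right. A minimal sketch, with hypothetical form fields and rows:

```python
# Does lhs -> rhs hold in the tabular (un-normalized) form data?
def holds(rows, lhs, rhs):
    seen = {}
    for row in rows:
        key = tuple(row[a] for a in lhs)
        val = row[rhs]
        # setdefault stores the first value seen for this key; a later
        # mismatch means two rows agree on lhs but differ on rhs.
        if seen.setdefault(key, val) != val:
            return False
    return True

# Hypothetical order-form data
rows = [
    {"order_id": 1, "customer": "A", "city": "Bonn"},
    {"order_id": 2, "customer": "A", "city": "Bonn"},
    {"order_id": 3, "customer": "B", "city": "Kiel"},
]
fd_customer_city = holds(rows, ["customer"], "city")   # customer -> city holds
fd_city_order = holds(rows, ["city"], "order_id")      # city does not determine order_id
```

Dependencies found this way (here, customer → city) are what the normalization step then uses to split the form data into normalized relations.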

  18. Advanced monitoring with complex stream processing

    CERN Document Server

    CERN. Geneva


    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP) that analyses series of events to derive conclusions from them.
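A CEP rule of the kind described, aggregation over a sliding time window with an alarm condition, can be sketched outside Esper as follows; the metric, window span and limit are illustrative, not the IT Department's configuration.

```python
from collections import deque

class WindowAverage:
    """Keeps events from the last `span` seconds and flags threshold breaches,
    mimicking a CEP rule such as 'average latency over 60 s exceeds a limit'."""
    def __init__(self, span, limit):
        self.span, self.limit = span, limit
        self.events = deque()  # (timestamp, value) pairs
        self.total = 0.0

    def on_event(self, ts, value):
        self.events.append((ts, value))
        self.total += value
        # Evict events that have fallen out of the time window
        while self.events and self.events[0][0] <= ts - self.span:
            _, old = self.events.popleft()
            self.total -= old
        avg = self.total / len(self.events)
        return avg > self.limit  # True -> raise an alarm

w = WindowAverage(span=60, limit=100.0)
alarms = [w.on_event(t, v) for t, v in [(0, 50), (30, 80), (61, 200)]]
# alarms == [False, False, True]: only the last window average exceeds 100
```

An engine like Esper expresses the same logic declaratively in a query language and correlates many such rules across streams; the sketch shows only the windowing mechanics.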

  19. Ultrasonic techniques for process monitoring and control.

    Energy Technology Data Exchange (ETDEWEB)

    Chien, H.-T.


    Ultrasonic techniques have been applied successfully to process monitoring and control in many industries, such as energy, medical, textile, oil, and materials. They help those industries with quality control, improving energy efficiency, reducing waste, and saving costs. This paper presents four ultrasonic systems developed at Argonne National Laboratory in the past five years for various applications: an ultrasonic viscometer, an on-loom real-time ultrasonic imaging system, an ultrasonic leak detection system, and an ultrasonic solid concentration monitoring system.

  20. Electronic Tongue-FIA system for the Monitoring of Heavy Metal Biosorption Processes (United States)

    Wilson, D.; Florido, A.; Valderrama, C.; de Labastida, M. Fernández; Alegret, S.; del Valle, M.


    An automated flow injection potentiometric (FIP) system with electronic tongue (ET) detection was used for the monitoring of biosorption processes of heavy metals on waste biomaterial. Grape stalk wastes were used as a biosorbent to remove Cu2+ ions in a fixed-bed column setup. For the monitoring, the ET employed a sensor array formed by Cu2+ and Ca2+ selective electrodes and two generic heavy-metal electrodes. The resulting cross-response was processed by a multilayer artificial neural network (ANN) model in order to resolve the concentrations of the monitored species. The coupling of the electronic tongue with the automation features of the flow-injection system (ET-FIP) allowed us to accurately characterize the biosorption process by obtaining its breakthrough curves. In parallel, fractions of the extract solution were analyzed by atomic absorption spectroscopy in order to validate the results obtained with the reported methodology.

  1. An improved approach for process monitoring in laser material processing (United States)

    König, Hans-Georg; Pütsch, Oliver; Stollenwerk, Jochen; Loosen, Peter


    Process monitoring is used in many different laser material processes due to the demand for reliable and stable processes. Among different methods, on-axis process monitoring offers multiple advantages. To observe a laser material process, it is unavoidable to choose an observation wavelength different from the one used for material processing; otherwise the light of the processing laser would outshine the image of the process. With a different wavelength, lateral chromatic aberration occurs in optical systems with scanning units and f-Theta lenses that are not chromatically corrected. These aberrations lead to a truncated image of the process on the camera or the pyrometer, respectively, causing distorted measurements and unsatisfactory images of the process. A new approach for solving the problem of field-dependent lateral chromatic aberration in process monitoring is presented. The scanner-based optical system is reproduced in a simulation environment to predict the occurring lateral chromatic aberrations, and a second deflecting system is integrated into the system. Using the simulation, a predictive control is designed that uses the additional deflecting system to introduce reverse lateral deviations in order to compensate the lateral effect of chromatic aberration. This paper illustrates the concept and the implementation of the predictive control used to eliminate lateral chromatic aberrations in process monitoring, the simulation on which the system is based, the optical system, and the control concept.

  2. FY-2010 Process Monitoring Technology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Christopher R.; Bryan, Samuel A.; Casella, Amanda J.; Hines, Wes; Levitskaia, Tatiana G.; Henkell, J.; Schwantes, Jon M.; Jordan, Elizabeth A.; Lines, Amanda M.; Fraga, Carlos G.; Peterson, James M.; Verdugo, Dawn E.; Christensen, Ronald N.; Peper, Shane M.


    During FY 2010, work under the Spectroscopy-Based Process Monitoring task included ordering and receiving four fluid flow meters and four flow visible-near infrared spectrometer cells to be instrumented within the centrifugal contactor system at Pacific Northwest National Laboratory (PNNL). Initial demonstrations of real-time spectroscopic measurements on cold-stream simulants were conducted using plutonium (Pu)/uranium (U) (PUREX) solvent extraction process conditions. The specific test case examined the extraction of neodymium nitrate (Nd(NO3)3) from an aqueous nitric acid (HNO3) feed into a tri-n-butyl phosphate (TBP)/n-dodecane solvent. Demonstration testing of this system included diverting a sample from the aqueous feed while monitoring every phase of the process with the on-line spectroscopic process monitoring system. The purpose of this demonstration was to test whether spectroscopic monitoring is capable of determining the mass balance of metal nitrate species involved in a cross-current solvent extraction scheme while a sample is diverted from the system. The diversion scenario involved diverting a portion of the feed from a counter-current extraction system while a continuous extraction experiment was underway. A successful test would demonstrate the ability of the process monitoring system to detect and quantify the diversion of material from the system during a real-time continuous solvent extraction experiment. The system was designed to mimic a PUREX-type extraction process with a bank of four centrifugal contactors. The aqueous feed contained Nd(NO3)3 in HNO3, and the organic phase was composed of TBP/n-dodecane. The amount of sample observed to be diverted by on-line spectroscopic process monitoring was measured to be 3 mmol (3 × 10⁻³ mol) of Nd3+. This value was in excellent agreement with the 2.9 mmol Nd3+ value based on the known mass of sample taken (i.e., diverted) directly from the system feed solution.
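The underlying mass-balance check is simple: moles diverted equal the spectroscopically estimated concentration times the diverted volume. A sketch with illustrative numbers chosen to reproduce the reported 3 mmol; the report does not give the concentration or volume, so both are assumptions.

```python
# Moles of Nd3+ in a diverted aliquot from a spectroscopically estimated
# concentration; the 0.060 M and 50 mL figures are illustrative, not the
# report's raw data.
def mmol_diverted(conc_mol_per_L, volume_mL):
    return conc_mol_per_L * volume_mL  # (mol/L) * mL = mmol

measured = mmol_diverted(0.060, 50.0)   # -> 3.0 mmol
expected = 2.9                          # from the known mass of sample taken
relative_error = abs(measured - expected) / expected  # ~3.4 %
```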

  3. Partitioning, Automation and Error Recovery in the Control and Monitoring System of an LHC Experiment

    Institute of Scientific and Technical Information of China (English)



    The Joint Controls Project (JCOP) is a collaboration between CERN and the four LHC experiments to find and implement common solutions for their control and monitoring systems. As part of this project, an Architecture Working Group was set up in order to study the requirements and devise an architectural model that would suit the four experiments. Many issues were studied by this working group: alarm handling, access control, hierarchical control, etc. This paper will report on the specific issue of hierarchical control, in particular partitioning, automation and error recovery.

  4. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena (United States)

    Savant, Vaibhav; Smith, Niall


    We report on the current status in the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Irish node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes - while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to shed more light on the microvariability of blazars, employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of the fact that they are located in strategically separated time zones. Ultimately we wish to investigate the applicability of shock-in-jet and geometric models, which try to explain the processes at work in AGNs that result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and optimised for simultaneous two-band photometry on our 16" OTA.

  5. Batch Statistical Process Monitoring Approach to a Cocrystallization Process. (United States)

    Sarraguça, Mafalda C; Ribeiro, Paulo R S; Santos, Adenilson O Dos; Lopes, João A


    Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature, held together by noncovalent bonds. Their main advantages are increased solubility, bioavailability, permeability, and stability, while retaining the bioactivity of the active pharmaceutical ingredient. The cocrystallization of furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to follow the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, such as the amount of solvent or the amount of initial components present in the cocrystallization.
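A minimal sketch of the batch-monitoring idea, a PCA model of normal-operation data plus a Hotelling T² statistic for new observations, is shown below; the random stand-in data are not NIR spectra, and the two-component model is an assumption.

```python
import numpy as np

def fit_pca_monitor(X, n_pc=2):
    """PCA model of normal-operation data (rows = observations)."""
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    P = Vt[:n_pc].T                       # loadings (variables x components)
    var = (s[:n_pc] ** 2) / (len(X) - 1)  # variance of each score
    return mu, P, var

def hotelling_t2(x, mu, P, var):
    """T2 statistic of a new observation in the score space; values above a
    control limit derived from the normal batches would flag a fault."""
    t = (x - mu) @ P
    return float(np.sum(t ** 2 / var))

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(50, 10))  # stand-in for spectral data
mu, P, var = fit_pca_monitor(normal)
in_control = hotelling_t2(normal[0], mu, P, var)
faulty = hotelling_t2(normal[0] + 10.0, mu, P, var)  # gross shift in all variables
```

In the batch setting of the paper, the control limits for T² (and usually a residual statistic as well) are computed from the normal operating condition batches, and each new batch trajectory is charted against them.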

  6. Monitoring and controlling the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Ahring, B.K.; Angelidaki, I. [The Technical Univ. of Denmark, Dept. of Environmental Science and Engineering, Lyngby (Denmark)


    Many modern large-scale biogas plants have been constructed recently, increasing the demand for proper monitoring and control of these large reactor systems. For monitoring the biogas process, an easily measured and reliable indicator is required, which reflects the metabolic state and the activity of the bacterial populations in the reactor. In this paper, we discuss existing indicators as well as indicators under development which can potentially be used to monitor the state of the biogas process in a reactor. Furthermore, data are presented from two large-scale thermophilic biogas plants that were subjected to temperature changes and where the concentration of volatile fatty acids (VFAs) was monitored. The results clearly demonstrated that significant changes in the concentrations of the individual VFAs occurred although the biogas production was not significantly changed. The concentrations of butyrate, isobutyrate and isovalerate in particular showed significant changes. Future improvements of process control could therefore be based on monitoring of the concentrations of specific VFAs together with information about the bacterial populations in the reactor. The latter information could be supplied by the use of modern molecular techniques. (au) 51 refs.

  7. Marketing automation processes as a way to improve contemporary marketing of a company


    Witold Świeczak


    The main aim of this article is to identify the possibilities offered to contemporary companies by the processes included in a marketing automation system. This publication deals with the key aspects of this issue. It shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation. This article defines the factors and processes which influenc...

  8. Design Automation Systems for Production Preparation : Applied on the Rotary Draw Bending Process


    Johansson, Joel


    Intensive competition on the global market puts great pressure on manufacturing companies to develop and produce products that meet requirements from customers and investors. One key factor in meeting these requirements is the efficiency of the product development and production preparation processes. Design automation is a powerful tool to increase efficiency in these two processes. The benefits of automating the production preparation process are shortened lead-time, improved product perfo...

  9. Robust processing of mining subsidence monitoring data

    Energy Technology Data Exchange (ETDEWEB)

    Wang Mingzhong; Huang Guogang [Pingdingshan Mining Bureau (China); Wang Yunjia; Guogangli [China Univ. of Mining and Technology, Xuzhou (China)


    Since China began research on mining subsidence in the 1950s, more than one thousand observation lines have been surveyed. Yet monitoring data sometimes contain quite a lot of outliers because of the limits of observation and geological mining conditions. In China nowadays, the method of processing mining subsidence monitoring data is based on the principle of least squares, which can produce lower accuracy, less reliability, or even errors. For the reasons given above, the authors, according to the actual situation in China, have done research on the robust processing of mining subsidence monitoring data with respect to obtaining prediction parameters. The authors have derived the related formulas, designed computational programs, performed extensive calculations and simulations, and achieved good results. (orig.)
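The contrast between least-squares and robust processing can be illustrated with a Huber-weighted iteratively reweighted least-squares (IRLS) line fit, a standard robust estimator rather than the authors' specific method; the data are synthetic, with one gross outlier of the kind monitoring lines can contain.

```python
import numpy as np

def huber_line_fit(x, y, k=1.345, iters=20):
    """IRLS with Huber weights: outlying residuals are down-weighted
    instead of dominating the fit, unlike plain least squares."""
    A = np.column_stack([x, np.ones_like(x)])
    w = np.ones_like(y)
    for _ in range(iters):
        AtW = A.T * w  # equivalent to A.T @ diag(w)
        beta = np.linalg.solve(AtW @ A, AtW @ y)
        r = y - A @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12      # robust scale (MAD)
        w = np.minimum(1.0, k * s / np.maximum(np.abs(r), 1e-12))
    return beta  # (slope, intercept)

x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0
y[7] += 50.0                   # one gross outlier, e.g. a bad survey point
ols = np.polyfit(x, y, 1)      # pulled toward the outlier
robust = huber_line_fit(x, y)  # stays near slope 2, intercept 1
```

The least-squares slope is dragged well above the true value of 2 by the single bad point, while the robust fit recovers the underlying line, which is the behaviour the abstract argues for.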

  10. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to gain insight into the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing in the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitoring and operating complex and distributed computing systems, in particular the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  11. Automated processing of data on the use of motor vehicles in the Serbian Armed Forces

    Directory of Open Access Journals (Sweden)

    Nikola S. Osmokrović


    Full Text Available The main aim of introducing information technology into the armed forces is the automation of the management process. The management in movement and transport (M&T in our armed forces has been included in the process of automation from the beginning. For that reason, today we can speak about the automated processing of data on road traffic safety and on the use of motor vehicles. With regard to the overall development of the information system of the movement and transport service, the paper presents an information system of the M&T service for the processing of data on the use of motor vehicles. The main features, components and functions of the 'Vozila' application, which was specially developed for the automated processing of data on motor vehicle use, are explained in particular.

  12. Digital Automation and Real-Time Monitoring of an Original Installation for "Wet Combustion" of Organic Wastes (United States)

    Morozov, Yegor; Tikhomirov, Alexander A.; Saltykov, Mikhail; Trifonov, Sergey V.; Kudenko, Yurii A.


    An original method for "wet combustion" of organic wastes, under development at the IBP SB RAS, is a very promising approach to regenerating nutrient solutions for plants in the closed Bioregenerative Life Support Systems (BLSS) of future spacecraft. The method is quick and ecofriendly, does not require special conditions such as high pressure and temperature, and the resulting nitrogen remains in forms suitable for further preparation of the fertilizer. An experimental testbed of a new-generation closed ecosystem is currently being run at the IBP SB RAS to examine the compatibility of the latest technologies for accelerating the cycling. The integration of "wet combustion" of organic wastes into the information system of the closed ecosystem experimental testbed has been studied as part of the preparatory work. Digital automation and real-time monitoring of the operating parameters of the original "wet combustion" installation have been implemented. The new system enables remotely controlled or automatic operation of the installation. Data are stored in standard, easily processed formats, allowing further mathematical processing where necessary. During ongoing experiments on improving "wet combustion" of organic wastes, automatic monitoring can notice slight changes in process parameters and record them in more detail. The ultimate goal of the study is to include the "wet combustion" installation in a future full-scale experiment with humans, thus reducing the time spent by the crew on life support issues while living in the BLSS. The work was carried out with the financial support of the Russian Scientific Foundation (project 14-14-00599).

  13. 7 CFR 277.18 - Establishment of an Automated Data Processing (ADP) and Information Retrieval System. (United States)


    ... acquisition including hardware, software, telecommunications, system testing, and data security; (iv) The... need for human assistance or intervention. Automated Data Processing Equipment or hardware means: (1... unauthorized use; (C) Software and data security; (D) Telecommunications security; (E) Personnel security;...

  14. A New Device to Automate the Monitoring of Critical Patients’ Urine Output

    Directory of Open Access Journals (Sweden)

    Abraham Otero


    Full Text Available Urine output (UO) is usually measured manually each hour in acutely ill patients. This task consumes a substantial amount of time. Furthermore, there is evidence in the literature that more frequent (minute-by-minute) UO measurement could impact clinical decision making and improve patient outcomes. However, it is not feasible to take minute-by-minute UO measurements manually. A device capable of automatically monitoring UO could save precious time for the healthcare staff and improve patient outcomes through more precise and continuous monitoring of this parameter. This paper presents a device capable of automatically monitoring UO. It provides minute-by-minute measurements and can generate alarms that warn of deviations from therapeutic goals. It uses a capacitive sensor for the measurement of the UO collected within a rigid container. When the container is full, it empties automatically without requiring any internal or external power supply or any intervention by the nursing staff. In vitro tests have been conducted to verify the proper operation and the accuracy of the device's measurements. These tests confirm the viability of the device for automating the monitoring of UO.

  15. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing (United States)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.


    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals, but instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time-series.
We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for
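One simple dimensionality measure that behaves as described, rising when a malfunctioning channel injects incoherent energy into the array-wide time series, is the participation ratio of the covariance eigenvalues. The sketch below uses synthetic array data, not the authors' method or data.

```python
import numpy as np

def effective_dimension(data):
    """Participation ratio of covariance eigenvalues: near 1 when one coherent
    component dominates the channels, larger when energy spreads over many."""
    cov = np.cov(data)                 # channels x channels covariance
    lam = np.clip(np.linalg.eigvalsh(cov), 0.0, None)
    return (lam.sum() ** 2) / (lam ** 2).sum()

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 2000)
signal = np.sin(2 * np.pi * 1.5 * t)          # common seismic arrival
healthy = np.tile(signal, (8, 1)) + 0.05 * rng.normal(size=(8, 2000))
dim_ok = effective_dimension(healthy)          # close to 1: coherent array

bad = healthy.copy()
bad[3] = rng.normal(size=2000)                 # dead channel outputs pure noise
dim_bad = effective_dimension(bad)             # dimensionality rises
```

A QC rule could flag the array (and then search for the offending channel) whenever the effective dimension drifts above the value established during known-good operation.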

  16. A Camera and Multi-Sensor Automated Station Design for Polar Physical and Biological Systems Monitoring: AMIGOS (United States)

    Bohlander, J. A.; Ross, R.; Scambos, T.; Haran, T. M.; Bauer, R. J.


    The Automated Meteorology - Ice/Indigenous species - Geophysics Observation System (AMIGOS) consists of a set of measurement instruments and camera(s) controlled by a single-board computer with a simplified Linux operating system and an Iridium satellite modem supporting two-way communication. Primary features of the system relevant to polar operations are low power requirements, daily data uploading, reprogramming, tolerance for low temperatures, and various approaches for automatic resets and recovery from low power or cold shut-down. Instruments include a compact weather station, C/A or dual-frequency GPS, solar flux and reflectivity sensors, sonic snow gages, simplified radio-echo-sounder, and resistance thermometer string in the firn column. In the current state of development, there are two basic designs. One is intended for in situ observations of glacier conditions. The other design supports a high-resolution camera for monitoring biological or geophysical systems from short distances (100 m to 20 km). The stations have been successfully used in several locations for operational support, monitoring rapid ice changes in response to climate change or iceberg drift, and monitoring penguin colony activity. As of June, 2012, there are 9 AMIGOS systems installed, all on the Antarctic continent. The stations are a working prototype for a planned series of upgraded stations, currently termed 'Sentinels'. These stations would carry further instrumentation, communications, and processing capability to investigate ice - ocean interaction from ice tongue, ice shelf, or fjord coastline areas.

  17. The automated system for technological process of spacecraft's waveguide paths soldering (United States)

    Tynchenko, V. S.; Murygin, A. V.; Emilova, O. A.; Bocharov, A. N.; Laptenok, V. D.


    The paper solves the problem of automated process control of the induction-heating soldering of spacecraft waveguide paths. The peculiarities of the induction soldering process are analyzed and the necessity of automating the information-control system is identified. The developed automated system controls the heating of the product by varying the power supplied to the inductor, on the basis of information about the soldering zone temperature, stabilizing the temperature in a narrow range above the melting point of the solder but below the melting point of the waveguide. Automating the soldering process in this way improves the quality of the waveguides and eliminates burn-throughs. The article shows a block diagram of a software system consisting of five modules and describes the main algorithm of its work. The operation of the automated waveguide path soldering system is also described, explaining the basic functions and limitations of the system. The developed software allows configuration of the measurement equipment, setting and changing of soldering process parameters, as well as viewing graphs of the temperatures recorded by the system. Results of experimental studies are shown that prove the high quality of the soldering process control and the applicability of the system to automation tasks.
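The band-keeping behaviour described, holding the temperature above the solder's melting point but below the waveguide's, can be sketched as a hysteresis rule on inductor power; the temperatures, rates, and toy plant model below are illustrative assumptions, not the paper's parameters.

```python
def regulate(temp, heating, t_low, t_high):
    """Hysteresis rule: switch inductor power on below the lower bound and
    off above the upper bound, holding temperature inside the band
    (above the solder melting point, below the waveguide's)."""
    if temp < t_low:
        return True
    if temp > t_high:
        return False
    return heating  # inside the band: keep the current state

# Toy plant: temperature rises 5 K/step when heating, falls 2 K/step otherwise
temp, heating, trace = 200.0, True, []
for _ in range(60):
    heating = regulate(temp, heating, t_low=230.0, t_high=240.0)
    temp += 5.0 if heating else -2.0
    trace.append(temp)
band = trace[20:]  # after warm-up, temperature stays near the band
```

A production system would use a continuous controller on the measured soldering-zone temperature rather than on/off switching, but the safety logic, never crossing the upper bound toward the waveguide melting point, is the same.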

  18. Process optimization and biocompatibility of cell carriers suitable for automated magnetic manipulation. (United States)

    Krejci, I; Piana, C; Howitz, S; Wegener, T; Fiedler, S; Zwanzig, M; Schmitt, D; Daum, N; Meier, K; Lehr, C M; Batista, U; Zemljic, S; Messerschmidt, J; Franzke, J; Wirth, M; Gabor, F


    There is increasing demand for automated cell reprogramming in the fields of cell biology, biotechnology and the biomedical sciences. Microfluidic-based platforms that provide unattended manipulation of adherent cells promise to be an appropriate basis for cell manipulation. In this study we developed a magnetically driven cell carrier to serve as a vehicle within an in vitro environment. To elucidate the impact of the carrier on cells, biocompatibility was estimated using the human adenocarcinoma cell line Caco-2. Besides evaluation of the quality of the magnetic carriers by field emission scanning electron microscopy, the rate of adherence, proliferation and differentiation of Caco-2 cells grown on the carriers was quantified. Moreover, the morphology of the cells was monitored by immunofluorescent staining. Early generations of the cell carrier suffered from release of cytotoxic nickel from the magnetic cushion. Biocompatibility was achieved by complete encapsulation of the nickel bulk within galvanic gold. The insulation process had to be developed stepwise and was controlled by parallel monitoring of the cell viability. The final carrier generation proved to be a proper support for cell manipulation, allowing proliferation of Caco-2 cells equal to that on glass or polystyrene as a reference for up to 10 days. Functional differentiation was enhanced by more than 30% compared with the reference. A flat, ferromagnetic and fully biocompatible carrier for cell manipulation was developed for application in microfluidic systems. Beyond that, this study offers advice for the development of magnetic cell carriers and the estimation of their biocompatibility.

  19. Seismic monitoring of torrential and fluvial processes (United States)

    Burtin, Arnaud; Hovius, Niels; Turowski, Jens M.


    In seismology, the signal is usually analysed for earthquake data, but earthquakes represent less than 1 % of continuous recording. The remaining data are considered as seismic noise and were for a long time ignored. Over the past decades, the analysis of seismic noise has constantly increased in popularity, and this has led to the development of new approaches and applications in geophysics. The study of continuous seismic records is now open to other disciplines, like geomorphology. The motion of mass at the Earth's surface generates seismic waves that are recorded by nearby seismometers and can be used to monitor mass transfer throughout the landscape. Surface processes vary in nature, mechanism, magnitude, space and time, and this variability can be observed in the seismic signals. This contribution gives an overview of the development and current opportunities for the seismic monitoring of geomorphic processes. We first describe the common principles of seismic signal monitoring and introduce time-frequency analysis for the purpose of identification and differentiation of surface processes. Second, we present techniques to detect, locate and quantify geomorphic events. Third, we review the diverse layout of seismic arrays and highlight their advantages and limitations for specific processes, like slope or channel activity. Finally, we illustrate all these characteristics with the analysis of seismic data acquired in a small debris-flow catchment where geomorphic events show interactions and feedbacks. Further developments must aim to fully understand the richness of the continuous seismic signals, to better quantify the geomorphic activity and to improve the performance of warning systems. Seismic monitoring may ultimately allow the continuous survey of erosion and transfer of sediments in the landscape on the scales of external forcing.

  20. Automated selected reaction monitoring software for accurate label-free protein quantification. (United States)

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik


    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.
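
    As an illustration of how technical versus biological variability is summarized in such a workflow, the sketch below computes coefficients of variation (CV) from hypothetical peak areas; the numbers are invented, and the published Anubis pipeline is not reproduced here:

```python
import numpy as np

# Hypothetical peak areas for one targeted peptide across three
# technical replicate injections of four biological samples.
areas = np.array([
    [1.02e6, 0.98e6, 1.05e6],   # sample A replicates
    [2.10e6, 2.25e6, 2.02e6],   # sample B
    [0.51e6, 0.49e6, 0.53e6],   # sample C
    [1.48e6, 1.55e6, 1.41e6],   # sample D
])

# Technical CV: variation among replicate injections of the same sample.
tech_cv = areas.std(axis=1, ddof=1) / areas.mean(axis=1) * 100

# Biological CV: variation among the per-sample means.
means = areas.mean(axis=1)
bio_cv = means.std(ddof=1) / means.mean() * 100
```

    With numbers like these, the technical CV stays in the single digits while the biological CV is far larger, mirroring the paper's observation that technical variation sits well below variation across biological samples.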


    Directory of Open Access Journals (Sweden)

    Theodore HUGHES-RILEY


    Constructed wetlands have proliferated across Europe and the rest of the world in recent years as an environmentally conscious form of waste-water treatment. The ability to monitor the conditions in the bed and control input factors such as heating and aeration may extend the lifetime of the reed bed substantially beyond the ten-year lifetime normally reached. The Autonomous Reed Bed Installation (ARBI) project is an EU FP7 initiative to develop a reed bed with automated control over input parameters based on readings taken from embedded sensors. Automated remedial action may improve bed treatment efficiency and prolong the life of the bed, avoiding the need to refurbish it, which is both time consuming and costly. One critical parameter to observe is the clog state of the reed bed, as clogging can severely impact the efficiency of water treatment, to the point of the bed becoming non-operable. Magnetic resonance (MR) sensors can be a powerful tool for determining clogging levels and have previously been explored in the literature. This work is based on a conference paper (2nd International Conference "Water resources and wetlands", 2014) and details magnetic sensors suitable for long-term embedding into a constructed wetland. Unlike previous studies, this work examines a probe embedded into a wetland.

  2. A prototype of an automated high resolution InSAR volcano-monitoring system in the MED-SUV project (United States)

    Chowdhury, Tanvir A.; Minet, Christian; Fritz, Thomas


    Volcanic processes, which produce a variety of geological and hydrological hazards, are difficult to predict and capable of triggering natural disasters on regional to global scales. It is therefore important to monitor volcanoes continuously and with a high spatial and temporal sampling rate. The monitoring of active volcanoes requires reliable measurement of surface deformation before, during and after volcanic activity, and it supports better understanding and modelling of the underlying geophysical processes. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and the small baseline subset algorithm (SBAS) provide powerful tools for observing eruptive activity and measuring surface changes with millimetre accuracy. All of these techniques address the challenges of deformation time-series extraction by exploiting medium to large SAR image stacks. The process of selecting, ordering, downloading, storing, logging, extracting and preparing the data for processing is very time consuming and has to be done manually for every single data stack. In many cases it is even an iterative process which has to be done regularly and continuously. As a result, data processing becomes slow, causing significant delays in data delivery. The SAR Satellite based High Resolution Data Acquisition System, to be developed at DLR, will automate these time-consuming tasks and allow an operational volcano-monitoring system. Every 24 hours the system searches for newly acquired scenes over the volcanoes, keeps track of the data orders, logs their status and downloads the provided data via FTP transfer, including e-mail alerts. Furthermore, the system will deliver specified reports and maps to a database for review and use by specialists. User interaction will be minimized and iterative manual processes will be avoided entirely. In this presentation, a prototype of SAR Satellite based High Resolution Data
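
    The 24-hour search-order-download cycle described above can be sketched as a simple polling loop. The catalogue query function and the scene identifier below are hypothetical placeholders, not DLR's actual interface:

```python
import datetime

# Stub standing in for a catalogue search over an area of interest.
# Name, signature and return format are assumptions for illustration.
def query_new_scenes(aoi, since):
    return ["S1A_20240101_ETNA"]  # pretend one new scene was found

def daily_monitor_cycle(aoi, state):
    """One 24 h cycle: find newly acquired scenes, track the orders,
    and update the logging state (download + e-mail alert would follow)."""
    new_scenes = query_new_scenes(aoi, state["last_check"])
    for scene_id in new_scenes:
        state["ordered"].append(scene_id)   # keep track of the data order
    state["last_check"] = datetime.datetime.now(datetime.timezone.utc)
    return new_scenes

state = {"last_check": datetime.datetime(2024, 1, 1,
                                         tzinfo=datetime.timezone.utc),
         "ordered": []}
found = daily_monitor_cycle("Mt. Etna", state)
```

    A production system would add retry logic, persistent logging, and the FTP transfer and alerting steps described in the abstract.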

  3. Automated tests for diagnosing and monitoring cognitive impairment: a diagnostic accuracy review. (United States)

    Aslam, Rabeea'h W; Bates, Vickie; Dundar, Yenal; Hounsome, Juliet; Richardson, Marty; Krishan, Ashma; Dickson, Rumona; Boland, Angela; Kotas, Eleanor; Fisher, Joanne; Sikdar, Sudip; Robinson, Louise


    BACKGROUND Cognitive impairment is a growing public health concern, and is one of the most distinctive characteristics of all dementias. The timely recognition of dementia syndromes can be beneficial, as some causes of dementia are treatable and are fully or partially reversible. Several automated cognitive assessment tools for assessing mild cognitive impairment (MCI) and early dementia are now available. Proponents of these tests cite as benefits the tests' repeatability and robustness and the saving of clinicians' time. However, the use of these tools to diagnose and/or monitor progressive cognitive impairment or response to treatment has not yet been evaluated. OBJECTIVES The aim of this review was to determine whether or not automated computerised tests could accurately identify patients with progressive cognitive impairment in MCI and dementia and, if so, to investigate their role in monitoring disease progression and/or response to treatment. DATA SOURCES Five electronic databases (MEDLINE, EMBASE, The Cochrane Library, ISI Web of Science and PsycINFO), plus ProQuest, were searched from 2005 to August 2015. The bibliographies of retrieved citations were also examined. Trial and research registers were searched for ongoing studies and reviews. A second search was run to identify individual test costs and acquisition costs for the various tools identified in the review. REVIEW METHODS Two reviewers independently screened all titles and abstracts to identify potentially relevant studies for inclusion in the review. Full-text copies were assessed independently by two reviewers. Data were extracted and assessed for risk of bias by one reviewer and independently checked for accuracy by a second. The results of the data extraction and quality assessment for each study are presented in structured tables and as a narrative summary. RESULTS The electronic searching of databases, including ProQuest, resulted in 13,542 unique citations. 
The titles and abstracts of these

  4. A novel automated bioreactor for scalable process optimisation of haematopoietic stem cell culture. (United States)

    Ratcliffe, E; Glen, K E; Workman, V L; Stacey, A J; Thomas, R J


    Proliferation and differentiation of haematopoietic stem cells (HSCs) from umbilical cord blood at large scale will potentially underpin production of a number of therapeutic cellular products in development, including erythrocytes and platelets. However, to achieve production processes that are scalable and optimised for cost and quality, scaled-down development platforms that can define process parameter tolerances and consequent manufacturing controls are essential. We have demonstrated the potential of a new, automated, 24×15 mL replicate suspension bioreactor system, with online monitoring and control, to develop an HSC proliferation and differentiation process for erythroid-committed cells (CD71+, CD235a+). Cell proliferation was relatively robust to cell density and oxygen levels and reached up to 6 population doublings over 10 days. The maximum suspension culture density for a 48 h total media exchange protocol was established to be in the order of 10^7 cells/mL. This system will be valuable for the further cost reduction and optimisation of HSC suspension culture necessary before the application of conventional stirred tank technology to scaled manufacture of HSC-derived products.

  5. Advanced process monitoring and feedback control to enhance cell culture process production and robustness. (United States)

    Zhang, An; Tsang, Valerie Liu; Moore, Brandon; Shen, Vivian; Huang, Yao-Ming; Kshirsagar, Rashmi; Ryll, Thomas


    It is a common practice in biotherapeutic manufacturing to define a fixed-volume feed strategy for nutrient feeds, based on historical cell demand. However, once the feed volumes are defined, they are inflexible to batch-to-batch variations in cell growth and physiology and can lead to inconsistent productivity and product quality. In an effort to control critical quality attributes and to apply process analytical technology (PAT), a fully automated cell culture feedback control system has been explored in three different applications. The first study illustrates that frequent monitoring and automatically controlling the complex feed based on a surrogate (glutamate) level improved protein production. More importantly, the resulting feed strategy was translated into a manufacturing-friendly manual feed strategy without impact on product quality. The second study demonstrates the improved process robustness of an automated feed strategy based on online bio-capacitance measurements for cell growth. In the third study, glucose and lactate concentrations were measured online and were used to automatically control the glucose feed, which in turn changed lactate metabolism. These studies suggest that the auto-feedback control system has the potential to significantly increase productivity and improve robustness in manufacturing, with the goal of ensuring process performance and product quality consistency.
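
    A minimal sketch of the third study's control idea: compute a corrective feed volume from an online glucose reading. The setpoint, culture volume and feed concentration below are illustrative assumptions, not the published process values:

```python
# Feed-volume calculation from an online glucose measurement.
# All numbers are assumed for illustration.
def glucose_feed_volume(measured_g_per_l, setpoint_g_per_l=6.0,
                        culture_volume_l=2.0, feed_conc_g_per_l=400.0):
    """Feed volume (L) that brings glucose back up to the setpoint.

    deficit (g) = (setpoint - measured) * culture volume
    feed volume = deficit / feed concentration (dilution ignored).
    """
    deficit = max(0.0, setpoint_g_per_l - measured_g_per_l) * culture_volume_l
    return deficit / feed_conc_g_per_l

# Below setpoint -> feed; at or above setpoint -> no feed.
low_reading_feed = glucose_feed_volume(4.0)    # 2 g/L deficit * 2 L / 400 g/L
high_reading_feed = glucose_feed_volume(6.5)   # no deficit, no feed
```

    In a real auto-feedback system this calculation would run on each online measurement cycle and drive a feed pump, with the same structure applying to the bio-capacitance-driven strategy of the second study.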

  6. Automated external cardioversion defibrillation monitoring in cardiac arrest: a randomized trial

    Directory of Open Access Journals (Sweden)

    Norvel Robert


    Background: In-hospital cardiac arrest has a poor prognosis despite active electrocardiography monitoring. The initial rhythm of approximately 25% of in-hospital cardiopulmonary resuscitation (CPR) events is pulseless ventricular tachycardia/ventricular fibrillation (VT/VF). Early defibrillation is an independent predictor of survival in CPR events caused by VT/VF. The automated external cardioverter defibrillator (AECD) is a device attached by pads to the chest wall that monitors, detects, and within seconds automatically delivers an electric countershock to an appropriate tachyarrhythmia. Study Objectives: • To evaluate the safety of AECD monitoring in hospitalized patients. • To evaluate whether AECDs provide earlier defibrillation than hospital code teams. Methods: The study is a prospective trial randomizing patients admitted to the telemetry ward to standard CPR (code team) or standard CPR plus AECD monitoring (PowerHeart CRM). The AECD is programmed to deliver one 150 J biphasic shock to patients in sustained VT/VF. Data are collected using the Utstein criteria for cardiac arrest. The primary endpoint is time-to-defibrillation; secondary outcomes include neurological status and survival to discharge, with 3-year follow-up. Results: To date, 192 patients have been recruited between 10/10/2006 and 7/20/2007. A total of 3,655 hours of telemetry data have been analyzed in the AECD arm. The AECD has monitored ambulatory telemetry patients in sinus rhythm, sinus tachycardia, supraventricular tachycardia, atrial flutter or fibrillation, with premature ventricular complexes and non-sustained VT, without delivery of inappropriate shocks. One patient experienced sustained VT during AECD monitoring and was successfully defibrillated 17 seconds after meeting the programmed criteria; the patient survived the event without neurological complications. There are no events to report in the control arm. During the same time period, mean time to

  7. Evaluation of the Colin STBP-680 at rest and during exercise: an automated blood pressure monitor using R-wave gating.


    Bond, V.; Bassett, D. R.; Howley, E. T.; Lewis, J.; Walker, A. J.; Swan, P. D.; Tearney, R. J.; Adams, R. G.


    The application of automated blood pressure measurement during exercise has been limited by inaccuracies introduced by the effects of accompanying motion and noise. We evaluated a newly developed automated blood pressure monitor for measuring exercise blood pressure (Colin STBP-680; Colin, San Antonio, Texas, USA). The STBP-680 uses acoustic transduction with the assistance of the electrocardiogram R-wave to trigger the sampling period for blood pressure measurement. The automated monitor rea...

  8. Process monitoring using ultrasonic sensor systems. (United States)

    Henning, Bernd; Rautenberg, Jens


    Continuous in-line measurement of substance concentration in liquid mixtures is valuable in improving industrial processes in terms of material properties, energy efficiency and process safety. Ultrasonic sensor systems meet the practical requirements of a chemical sensor quite well. Currently ultrasonic sensor systems are widely used as acoustic chemical sensors to measure concentration of selected substances or to monitor the course of polymerisation, crystallisation or fermentation processes. Useable acoustic properties for the characterisation of liquid mixtures are sound velocity, sound absorption and acoustic impedance. This contribution will give a short overview of the state of the art and several trends for the use of ultrasonic sensor systems in process applications. Novel investigations show the very promising possibility to analyse liquid multi-phase mixtures like suspensions, emulsions and dispersions.
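
    The concentration-from-sound-velocity idea can be sketched as an inverse calibration curve. The reference mixtures below are synthetic, and a real sensor system would also compensate for temperature, which strongly affects sound velocity:

```python
import numpy as np

# Synthetic calibration data: sound velocity measured for reference
# mixtures of known concentration (all values assumed for illustration).
conc_ref = np.array([0.0, 5.0, 10.0, 15.0, 20.0])           # wt%
v_ref = np.array([1482.0, 1500.5, 1519.8, 1539.0, 1557.6])  # m/s

# Fit an inverse calibration curve: concentration as a function of
# measured sound velocity.
cal = np.polyfit(v_ref, conc_ref, 2)

def concentration(v_measured):
    """Estimate concentration (wt%) from a sound-velocity reading (m/s)."""
    return float(np.polyval(cal, v_measured))

c = concentration(1510.0)   # lies between the 5 and 10 wt% references
```

    For multi-phase mixtures such as suspensions and emulsions, sound absorption and acoustic impedance would enter as additional inputs rather than a single velocity reading.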

  9. Knowledge Acquisition, Validation, and Maintenance in a Planning System for Automated Image Processing (United States)

    Chien, Steve A.


    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.

  10. The Effects of Automated Prompting and Self-Monitoring on Homework Completion for a Student with Attention Deficit Hyperactivity Disorder (United States)

    Blicha, Amy; Belfiore, Phillip J.


    This study examined the effects of an intervention consisting of automated prompting and self-monitoring on the level of independent homework task completion for an elementary-age student with attention deficit hyperactivity disorder (ADHD). Using a single-subject, within-series ABAB design, the results showed a consistent increase and…

  11. Sociolinguistically Informed Natural Language Processing: Automating Irony Detection (United States)


    In contrast to most text classification problems, word counts and syntactic features alone do not constitute an adequate representation for verbal irony detection. Indeed, sociolinguistic theories of verbal irony imply that a…

  12. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells. (United States)

    Ker, Dai Fei Elmer; Weiss, Lee E; Junkers, Silvina N; Chen, Mei; Yin, Zhaozheng; Sandbothe, Michael F; Huh, Seung-il; Eom, Sungeun; Bise, Ryoma; Osuna-Highley, Elvira; Kanade, Takeo; Campbell, Phil G


    Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and developing robotic cell
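
    The threshold-crossing prediction step can be sketched as a linear extrapolation of recent confluency estimates. The confluency values below are synthetic stand-ins for the output of the paper's computer-vision stage, and the threshold is an assumed example:

```python
import numpy as np

def hours_until_threshold(times_h, confluency, threshold=0.8):
    """Extrapolate a linear trend in confluency to the threshold crossing.

    Returns hours from the last sample to the predicted crossing,
    or None if the culture is not growing. A notification could be
    scheduled e.g. 4 h before this predicted time.
    """
    slope, intercept = np.polyfit(times_h, confluency, 1)
    if slope <= 0:
        return None
    t_cross = (threshold - intercept) / slope
    return max(0.0, t_cross - times_h[-1])

# Synthetic confluency readings every 6 h, growing at +0.025 per hour.
times = np.array([0.0, 6.0, 12.0, 18.0])
conf = np.array([0.20, 0.35, 0.50, 0.65])
eta = hours_until_threshold(times, conf)   # crossing predicted at t = 24 h
```

    The published system feeds predictions like this into text-message and e-mail notifications 4 hours before, and again upon, reaching the confluency threshold.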

  13. The National Shipbuilding Research Program. Automated Process Application in Steel Fabrication and Subassembly Facilities; Phase I (Process Analysis) (United States)


    U.S. Department of the Navy, Carderock.

  14. Process monitoring to determine electrostatic risks

    Energy Technology Data Exchange (ETDEWEB)

    Swenson, David E [Affinity Static Control Consulting, LLC 2609 Quanah Drive, Round Rock, TX 78681 (United States)], E-mail:


    Designing a factory electrostatic discharge (ESD) control program requires an understanding of all the processes where unprotected ESD susceptible items are handled either manually or by machine. Human handling aspects are generally understood and control procedures where people are involved are commonly implemented with great care. Personnel grounding systems, monitors, and the like, are installed in order to make sure that personnel do not accumulate and transfer electrostatic charge that could damage sensitive parts during handling or assembly operations. However, the ability to determine what is occurring inside of process equipment has not been particularly easy up to now. Equipment is now available that allows the measurement and recording of electrical field information inside of many process tools. One of the goals of this work is to be able to characterize equipment as capable of handling parts susceptible to specific levels that may be related to component part sensitivity.

  15. Monitoring of an antigen manufacturing process. (United States)

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih


    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis used in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated using principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Partial least squares (PLS) regression was then employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content measured by a Kjeldahl test on these samples. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could be further used as a method to predict the final protein content.
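
    To illustrate the calibration idea, the sketch below relates synthetic spectral intensities to protein content, with ordinary least squares standing in for PLS; all numbers are invented and do not reflect the study's data:

```python
import numpy as np

# Synthetic calibration set: 20 samples with known protein content
# ("Kjeldahl" values) and two fluorescence intensities that scale
# with protein plus measurement noise.
rng = np.random.default_rng(1)
n = 20
protein = rng.uniform(0.5, 2.0, n)                 # g/L, assumed range
X = np.column_stack([
    300 * protein + rng.normal(0, 5, n),           # band 1 intensity
    120 * protein + rng.normal(0, 5, n),           # band 2 intensity
])

# Ordinary least squares with intercept (a stand-in for PLS, which
# would be preferred for many collinear spectral channels).
X1 = np.column_stack([X, np.ones(n)])
coef, *_ = np.linalg.lstsq(X1, protein, rcond=None)
pred = X1 @ coef

# Goodness of fit of the calibration.
r2 = 1 - np.sum((protein - pred) ** 2) / np.sum((protein - protein.mean()) ** 2)
```

    PLS earns its keep over plain least squares when the predictors are hundreds of highly correlated spectral channels rather than two well-behaved bands as assumed here.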

  16. Swab culture monitoring of automated endoscope reprocessors after high-level disinfection

    Institute of Scientific and Technical Information of China (English)

    Lung-Sheng Lu; Keng-Liang Wu; Yi-Chun Chiu; Ming-Tzung Lin; Tsung-Hui Hu; King-Wah Chiu


    AIM: To conduct a bacterial culture study for monitoring decontamination of automated endoscope reprocessors (AERs) after high-level disinfection (HLD). METHODS: From February 2006 to January 2011, the authors conducted randomized consecutive sampling each month for 7 AERs, collecting a total of 420 swab cultures: 300 from 5 gastroscope AERs and 120 from 2 colonoscope AERs. Swab cultures were obtained from the residual water of the AERs after a full reprocessing cycle. Samples were cultured to test for aerobic bacteria, anaerobic bacteria, and Mycobacterium tuberculosis. RESULTS: The positive culture rate was 2.0% (6/300) for gastroscope AERs and 0.8% (1/120) for colonoscope AERs. All positive cultures, including 6 from gastroscope and 1 from colonoscope AERs, showed monofloral colonization. Of the positive gastroscope AER samples, 50% (3/6) were colonized by aerobic bacteria and 50% (3/6) by fungal contaminants. CONCLUSION: A full reprocessing cycle of an AER with HLD is adequate for disinfection of the machine. Swab culture is a useful method for monitoring AER decontamination after each reprocessing cycle. Fungal contamination of AERs after reprocessing should also be kept in mind.

  17. Application of AN Automated Wireless Structural Monitoring System for Long-Span Suspension Bridges (United States)

    Kurata, M.; Lynch, J. P.; van der Linden, G. W.; Hipley, P.; Sheng, L.-H.


    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.
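
    Extracting dynamic characteristics from the vibration response can be sketched as spectral peak-picking on deck acceleration. The sampling rate and modal frequency below are illustrative assumptions, not the NCB's measured modes:

```python
import numpy as np

# Synthetic 2-minute deck acceleration record at 50 Hz: a ~0.19 Hz
# "first mode" buried in ambient (traffic/wind) noise. Assumed values.
fs = 50.0
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(2)
accel = np.sin(2 * np.pi * 0.19 * t) + 0.3 * rng.standard_normal(t.size)

# Windowed spectrum; the dominant peak estimates the modal frequency,
# which can then be compared against a finite element model.
spec = np.abs(np.fft.rfft(accel * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spec)]
```

    The frequency resolution here is 1/120 s of record, i.e. about 0.008 Hz; longer records sharpen the estimate, which is one reason long-duration autonomous monitoring is valuable for tracking modal parameters over time.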

  18. Processing and review interface for strong motion data (PRISM) software, version 1.0.0—Methodology and automated processing (United States)

    Jones, Jeanne; Kalkan, Erol; Stephens, Christopher


    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong-Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the United States, call for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong-motion records. When used without AQMS, PRISM provides batch-processing capabilities. The PRISM version 1.0.0 is platform independent (coded in Java), open source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine and a review tool that has a graphical user interface (GUI) to manually review, edit, and process records. To facilitate use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible in order to accommodate new processing techniques. This report provides a thorough description and examples of the record processing features supported by PRISM. All the computing features of PRISM have been thoroughly tested.

  19. The Multi-Isotope Process (MIP) Monitor Project: FY12 Progress and Accomplishments

    Energy Technology Data Exchange (ETDEWEB)

    Coble, Jamie B.; Orton, Christopher R.; Jordan, David V.; Schwantes, Jon M.; Bender, Sarah; Dayman, Kenneth J.; Unlu, Kenan; Landsberger, Sheldon


    The Multi-Isotope Process (MIP) Monitor, being developed at Pacific Northwest National Laboratory (PNNL), provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of "...(minimization of) the risks of nuclear proliferation and terrorism." The MIP Monitor measures distributions of a suite of indicator (radioactive) isotopes present within product and waste streams of a nuclear reprocessing facility. These indicator isotopes are monitored on-line by gamma spectrometry and compared, in near-real-time, to spectral patterns representing "normal" process conditions using multivariate pattern recognition software. The monitor utilizes this multivariate analysis and gamma spectroscopy of reprocessing streams to detect small changes in the gamma spectrum, which may indicate changes in process conditions. Multivariate analysis methods common in chemometrics, such as principal component analysis (PCA) and partial least squares regression (PLS), act as pattern recognition techniques, which can detect small deviations from the expected, nominal condition. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting. Development of the MIP Monitor approach continues to evaluate the efficacy of the monitor for automated, real-time or near-real-time application. This report details follow-on research and development efforts sponsored by the U.S. 
Department of Energy Fuel Cycle Research and Development related to the MIP Monitor for fiscal year
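
    The pattern-recognition step can be sketched as a PCA model of "normal" spectra with a Q-statistic (squared residual) alarm. The spectra below are synthetic toys, not reprocessing data, and the 99th-percentile limit is an assumed choice:

```python
import numpy as np

# Synthetic "nominal" gamma spectra: a smooth shape plus small noise
# over 64 channels (all values assumed for illustration).
rng = np.random.default_rng(3)
base = np.exp(-np.linspace(0, 5, 64))
normal = base + 0.01 * rng.standard_normal((50, 64))

# PCA model of normal conditions: mean plus two principal components.
mu = normal.mean(axis=0)
U, S, Vt = np.linalg.svd(normal - mu, full_matrices=False)
P = Vt[:2].T                                 # retained loadings

def q_statistic(spectrum):
    """Squared residual off the PCA model (Q, a.k.a. SPE)."""
    r = spectrum - mu
    r = r - P @ (P.T @ r)                    # remove modelled variation
    return float(r @ r)

# Control limit from the Q values of the nominal training spectra.
q_normal = np.array([q_statistic(s) for s in normal])
limit = np.percentile(q_normal, 99)

# A process change: one indicator line's intensity shifts upward.
anomaly = base.copy()
anomaly[40] += 0.3
flagged = q_statistic(anomaly) > limit
```

    In chemometrics practice the Q limit is usually derived analytically from the discarded eigenvalues rather than from a training-set percentile, and Hotelling's T² monitors the retained components in parallel.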

  20. A Continuous Automated Vault Inventory System (CAVIS) for accountability monitoring of stored nuclear materials

    Energy Technology Data Exchange (ETDEWEB)

    Pickett, C.A.; Barham, M.A.; Gafford, T.A.; Hutchinson, D.P.; Jordan, J.K.; Maxey, L.C.; Moran, B.W.; Muhs, J.; Nodine, R.; Simpson, M.L. [and others


    Nearly all facilities that store hazardous (radioactive or non-radioactive) materials must comply with prevailing federal, state, and local laws. These laws usually have components that require periodic physical inspections to ensure that all materials remain safely and securely stored. The inspections are generally labor intensive, slow, put personnel at risk, and only find anomalies after they have occurred. The system described in this paper was developed for monitoring stored nuclear materials resulting from weapons dismantlement, but its applications extend to any storage facility that meets the above criteria. The traditional special nuclear material (SNM) accountability programs currently used within most of the Department of Energy (DOE) complex require the physical entry of highly trained personnel into SNM storage vaults. This imposes the need for additional security measures, which typically mandate that extra security personnel be present while SNM inventories are performed. These requirements increase labor costs and put additional personnel at risk of radiation exposure. In some cases, individuals have received radiation exposure equivalent to the annual maximum during just one inventory verification. With increasing overhead costs, the current system is rapidly becoming too expensive to operate, and the need for an automated method of inventory verification is evident. The Continuous Automated Vault Inventory System (CAVIS) described in this paper was designed and prototyped as a low-cost, highly reliable, and user-friendly system capable of providing real-time weight, gamma, and neutron energy confirmation from each item stored in an SNM vault. This paper describes the sensor technologies, the CAVIS prototype system (built at Y-12 for highly enriched uranium storage), the technical requirements that must be achieved to assure successful implementation, and descriptions of sensor technologies needed for a plutonium facility.

  1. Process defects and in situ monitoring methods in metal powder bed fusion: a review (United States)

    Grasso, Marco; Colosimo, Bianca Maria


    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier to industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defect avoidance is fundamental. Because of this, there is a need to develop novel in situ monitoring tools able to keep the stability of the process under control on a layer-by-layer basis and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems.
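
    A layerwise statistical monitoring rule of the kind discussed can be sketched as an EWMA control chart on a scalar process signature (e.g. mean melt-pool intensity per layer). The signal model, baseline length and control limits below are illustrative assumptions:

```python
import numpy as np

def ewma_alarms(signal, lam=0.2, L=3.0, n_baseline=20):
    """Indices of layers where the EWMA leaves its control band.

    Baseline mean/sigma come from the first `n_baseline` layers;
    the band uses the steady-state EWMA standard deviation.
    """
    mu = float(np.mean(signal[:n_baseline]))
    sigma = float(np.std(signal[:n_baseline]))
    s = sigma * np.sqrt(lam / (2 - lam))     # steady-state EWMA sigma
    z, alarms = mu, []
    for i, x in enumerate(signal):
        z = lam * x + (1 - lam) * z
        if abs(z - mu) > L * s:
            alarms.append(i)
    return alarms

# Synthetic per-layer signature: stable, then a +5 shift from layer 45
# onward standing in for the onset of a defect-inducing drift.
rng = np.random.default_rng(4)
layers = rng.normal(100.0, 2.0, 60)
layers[45:] += 5.0
alarms = ewma_alarms(layers)   # alarms expected within a few layers of 45
```

    EWMA charts respond faster to small sustained shifts than Shewhart charts, which is why they suit drifts that build up over consecutive layers; spatially resolved defect localization would need image-based signatures rather than a single scalar.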

  2. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS (United States)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew


    substantially by combining it with other technologies for automated, miniaturized, high-throughput biological measurements, such as fast sequencing, protein identification (proteomics) and metabolite profiling (metabolomics). Thus, the system can be integrated with other biomedical instruments in order to support and enhance telemedicine capability onboard ISS. NASA's mission includes sustained investment in critical research leading to effective countermeasures to minimize the risks associated with human spaceflight, and the use of appropriate technology to sustain space exploration at reasonable cost. Our integrated microarray technology is expected to fulfill these two critical requirements and to enable the scientific community to better understand and monitor the effects of the space environment on microorganisms and on the astronaut, in the process leveraging current capabilities and overcoming present limitations.

  3. An automated fog monitoring system for the Indo-Gangetic Plains based on satellite measurements (United States)

    Patil, Dinesh; Chourey, Reema; Rizvi, Sarwar; Singh, Manoj; Gautam, Ritesh


    Fog is a meteorological phenomenon that reduces regional visibility and affects air quality, leading to various societal and economic implications, especially disruption of air and rail transportation. Persistent and widespread winter fog impacts the entire Indo-Gangetic Plains (IGP), as frequently observed in satellite imagery. The IGP is a densely populated region in south Asia, home to about one-sixth of the world's population, with a strong upward pollution trend. In this study, we have used multi-spectral radiances and aerosol/cloud retrievals from Terra/Aqua MODIS data to develop an automated web-based fog monitoring system over the IGP. Using our previous and existing methodologies, and ongoing algorithm development for the detection of fog and retrieval of associated microphysical properties (e.g. fog droplet effective radius), we characterize widespread fog during both daytime and nighttime. Specifically, for nighttime fog detection, the algorithm employs a satellite-based bi-spectral brightness temperature difference technique between two spectral channels: MODIS band-22 (3.9μm) and band-31 (10.75μm). Further, we are extending our algorithm development to geostationary satellites to provide continuous monitoring of the spatial-temporal variation of fog. We anticipate that the ongoing and future development of a fog monitoring system will assist air, rail and vehicular transportation management, as well as the dissemination of fog information to government agencies and the general public. The outputs of the fog detection algorithm and related aerosol/cloud parameters are operationally disseminated via
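    The nighttime bi-spectral test can be sketched as follows. The MODIS band pairing (3.9 μm vs. 10.75 μm) is from the record; the threshold value, function name, and toy brightness temperatures are illustrative placeholders, not the authors' calibrated settings.

```python
import numpy as np

def night_fog_mask(bt_3_9um, bt_10_75um, threshold=-2.0):
    """Flag pixels as fog where BT(3.9 um) - BT(10.75 um) falls below a
    (hypothetical) negative threshold, exploiting the lower emissivity of
    small water droplets at 3.9 um at night."""
    btd = bt_3_9um - bt_10_75um
    return btd < threshold

# Toy scene: two foggy pixels (strongly negative BTD), two clear pixels.
bt39 = np.array([268.0, 270.5, 285.0, 284.0])    # Kelvin
bt1075 = np.array([272.5, 274.0, 285.5, 283.8])  # Kelvin
mask = night_fog_mask(bt39, bt1075)
```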

  4. Secure VM for Monitoring Industrial Process Controllers

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, Dipankar [ORNL; Ali, Mohammad Hassan [University of Memphis; Abercrombie, Robert K [ORNL; Schlicher, Bob G [ORNL; Sheldon, Frederick T [ORNL; Carvalho, Marco [Institute of Human and Machine Cognition


    In this paper, we examine the biological immune system as an autonomic system for self-protection, one that has evolved over millions of years, probably through an extensive process of redesigning, testing, tuning, and optimization. The powerful information processing capabilities of the immune system, such as feature extraction, pattern recognition, learning, and memory, together with its distributive nature, provide rich metaphors for its artificial counterpart. Our study focuses on building an autonomic defense system, using immunological metaphors for information gathering, analysis, decision making, and launching threat and attack responses. In order to detect Stuxnet-like malware, we propose adding a secure VM (or dedicated host) to the SCADA network to monitor behavior and all software updates. This ongoing research effort does not aim to mimic nature, but to explore and learn valuable lessons useful for self-adaptive cyber defense systems.

  5. Development and Assessment of an Automated High-Resolution InSAR Volcano-Monitoring System (United States)

    Chowdhury, Tanvir A.; Minet, Christian; Fritz, Thomas


    Monitoring volcanoes and volcanic areas using synthetic aperture radar (SAR) data is a well-established method of risk assessment. However, acquisition planning, ordering, and downloading are time- and work-intensive, yet unavoidable, steps. They must be carried out not just once before the actual processing; for continuous monitoring they pose a continuous and expensive effort. Therefore, an automatic acquisition and processing system has been developed at DLR, which allows pseudo-continuous processing of data sequences over the test site and is also applicable to any other test-site extension, including an increase in data volume. This system reduces the manual work necessary to perform interferometric stacking and to quickly gain first information on evolving geophysical processes at, but not limited to, the Italian supersites.

  6. Development of a fully automated network system for long-term health-care monitoring at home. (United States)

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K


    Daily monitoring of health condition at home is very important, not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for the prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. The results of preliminary experiments using this room confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor systems installed in the bathtub, toilet, and bed, respectively.

  7. The monitoring and control of TRUEX processes

    Energy Technology Data Exchange (ETDEWEB)

    Regalbuto, M.C.; Misra, B.; Chamberlain, D.B.; Leonard, R.A.; Vandegrift, G.F.


    The Generic TRUEX Model (GTM) was used to design a flowsheet for the TRUEX solvent extraction process that would be used to determine its instrumentation and control requirements. Sensitivity analyses of the key process variables, namely, the aqueous and organic flow rates, feed compositions, and the number of contactor stages, were carried out to assess their impact on the operation of the TRUEX process. Results of these analyses provide a basis for the selection of an instrument and control system and the eventual implementation of a control algorithm. Volume Two of this report is an evaluation of the instruments available for measuring many of the physical parameters. Equations that model the dynamic behavior of the TRUEX process have been generated. These equations can be used to describe the transient or dynamic behavior of the process for a given flowsheet in accordance with the TRUEX model. Further work will be done with the dynamic model to determine how and how quickly the system responds to various perturbations. The use of perturbation analysis early in the design stage will lead to a robust flowsheet, namely, one that will meet all process goals and allow for wide control bounds. The process time delay, that is, the speed with which the system reaches a new steady state, is an important parameter in monitoring and controlling a process. In the future, instrument selection and point-of-variable measurement, now done using the steady-state results reported here, will be reviewed and modified as necessary based on this dynamic method of analysis.
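    The notion of process time delay, i.e. how quickly the system settles at a new steady state, can be illustrated with a generic first-order step response. This is a textbook relation with a hypothetical time constant, not the TRUEX dynamic model itself.

```python
import math

def settling_time(tau_s, fraction=0.95):
    """Time for a first-order process x(t) = x_ss * (1 - exp(-t/tau)) to
    complete the given fraction of its step response."""
    return -tau_s * math.log(1.0 - fraction)

# Hypothetical 120 s time constant: time to reach 95% of the new steady state.
t95 = settling_time(tau_s=120.0)
```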

  8. The Use of an Automated System (GreenFeed) to Monitor Enteric Methane and Carbon Dioxide Emissions from Ruminant Animals. (United States)

    Hristov, Alexander N; Oh, Joonpyo; Giallongo, Fabio; Frederick, Tyler; Weeks, Holley; Zimmerman, Patrick R; Harper, Michael T; Hristova, Rada A; Zimmerman, R Scott; Branco, Antonio F


    Ruminant animals (domesticated or wild) emit methane (CH4) through enteric fermentation in their digestive tract and from decomposition of manure during storage. These processes are the major sources of greenhouse gas (GHG) emissions from animal production systems. Techniques for measuring enteric CH4 vary from direct measurements (respiration chambers, which are highly accurate, but with limited applicability) to various indirect methods (sniffers, laser technology, which are practical, but with variable accuracy). The sulfur hexafluoride (SF6) tracer gas method is commonly used to measure enteric CH4 production by animal scientists and more recently, application of an Automated Head-Chamber System (AHCS) (GreenFeed, C-Lock, Inc., Rapid City, SD), which is the focus of this experiment, has been growing. AHCS is an automated system to monitor CH4 and carbon dioxide (CO2) mass fluxes from the breath of ruminant animals. In a typical AHCS operation, small quantities of baiting feed are dispensed to individual animals to lure them to AHCS multiple times daily. As the animal visits AHCS, a fan system pulls air past the animal's muzzle into an intake manifold, and through an air collection pipe where continuous airflow rates are measured. A sub-sample of air is pumped out of the pipe into non-dispersive infra-red sensors for continuous measurement of CH4 and CO2 concentrations. Field comparisons of AHCS to respiration chambers or SF6 have demonstrated that AHCS produces repeatable and accurate CH4 emission results, provided that animal visits to AHCS are sufficient so emission estimates are representative of the diurnal rhythm of rumen gas production. Here, we demonstrate the use of AHCS to measure CO2 and CH4 fluxes from dairy cows given a control diet or a diet supplemented with technical-grade cashew nut shell liquid.
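    The mass-flux computation underlying such breath measurements can be sketched from first principles: airflow times the concentration excess over background, converted to a mass rate. The function, constants, and numbers below are a simplified illustration, not C-Lock's actual processing algorithm; the molar volume assumes roughly 25 °C and 1 atm.

```python
def methane_flux_g_per_day(airflow_lps, ch4_ppm_sample, ch4_ppm_background,
                           molar_volume_l=24.45, molar_mass_g=16.04):
    """Approximate CH4 mass flux: airflow (L/s) times the volumetric
    concentration excess over background (ppm = uL/L), converted via the
    molar volume and molar mass to grams per day."""
    excess_ppm = ch4_ppm_sample - ch4_ppm_background
    litres_ch4_per_s = airflow_lps * excess_ppm * 1e-6
    grams_per_s = litres_ch4_per_s / molar_volume_l * molar_mass_g
    return grams_per_s * 86400.0

# Illustrative values: 30 L/s airflow, 250 ppm CH4 in the manifold air
# against a 2 ppm atmospheric background.
flux = methane_flux_g_per_day(airflow_lps=30.0, ch4_ppm_sample=250.0,
                              ch4_ppm_background=2.0)
```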

  9. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.; Yan, W. [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering


    The primary purpose of the current research was to develop an integrated approach, combining information compression methods and artificial neural networks, for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully connected neural networks that use the back-propagation algorithm for training were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automating diagnostics using nondestructive examination methods.
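    The two-stage pipeline described above (data compression followed by classification) can be sketched with an SVD projection standing in for the compression step and a gradient-descent logistic classifier standing in for the back-propagation networks. The data below are synthetic and purely illustrative, not the ORNL inspection database.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for eddy current feature vectors: 8-D signals whose
# dominant-variance direction also separates the two defect classes.
X = rng.normal(size=(100, 8))
X[:, 0] *= 3.0
y = (X[:, 0] > 0).astype(float)

# Compression step: project onto the top-2 right singular vectors.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# Classification step: logistic regression trained by gradient descent,
# a simplified stand-in for a back-propagation network.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Z @ w + b)))
    w -= 0.5 * (Z.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

pred = (1.0 / (1.0 + np.exp(-(Z @ w + b)))) > 0.5
accuracy = (pred == (y > 0.5)).mean()
```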

  10. A Recursive Multiscale Correlation-Averaging Algorithm for an Automated Distributed Road Condition Monitoring System

    Energy Technology Data Exchange (ETDEWEB)

    Ndoye, Mandoye [Lawrence Livermore National Laboratory (LLNL); Barker, Alan M [ORNL; Krogmeier, James [Purdue University; Bullock, Darcy [Purdue University


    A signal processing approach is proposed to jointly filter and fuse spatially indexed measurements captured from many vehicles. It is assumed that these measurements are influenced by both sensor noise and measurement indexing uncertainties. Measurements from low-cost vehicle-mounted sensors (e.g., accelerometers and Global Positioning System (GPS) receivers) are properly combined to produce higher quality road roughness data for cost-effective road surface condition monitoring. The proposed algorithms are recursively implemented and thus require only moderate computational power and memory space. These algorithms are important for future road management systems, which will use on-road vehicles as a distributed network of sensing probes gathering spatially indexed measurements for condition monitoring, in addition to other applications, such as environmental sensing and/or traffic monitoring. Our method and the related signal processing algorithms have been successfully tested using field data.
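    The recursive, memory-light averaging idea can be sketched as a per-bin recursive mean over spatially indexed roughness samples. The class name and bin size are illustrative, and the multiscale correlation step of the actual algorithm is omitted here.

```python
from collections import defaultdict

class RecursiveRoadAverager:
    """Per-location recursive mean of roughness samples; each update costs
    O(1) time and memory, as required for an in-vehicle implementation."""
    def __init__(self, bin_size_m=10.0):
        self.bin_size_m = bin_size_m
        self.counts = defaultdict(int)
        self.means = defaultdict(float)

    def update(self, position_m, roughness):
        b = int(position_m // self.bin_size_m)  # spatial bin index
        self.counts[b] += 1
        # Recursive mean update: m_n = m_{n-1} + (x_n - m_{n-1}) / n
        self.means[b] += (roughness - self.means[b]) / self.counts[b]
        return self.means[b]

# Three GPS-indexed samples: two in bin 0, one in bin 1.
avg = RecursiveRoadAverager()
for pos, r in [(3.0, 2.0), (7.0, 4.0), (12.0, 10.0)]:
    avg.update(pos, r)
```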

  11. Effect of Using Automated Auditing Tools on Detecting Compliance Failures in Unmanaged Processes (United States)

    Doganata, Yurdaer; Curbera, Francisco

    The effect of using automated auditing tools to detect compliance failures in unmanaged business processes is investigated. In the absence of a process execution engine, compliance of an unmanaged business process is tracked by an auditing tool built on business provenance technology or by employing auditors. Since budget constraints preclude employing auditors to evaluate all process instances, a methodology is devised that combines expert opinion on a limited set of process instances with the results produced by fallible automated audit machines on all process instances. An improvement factor is defined based on the average number of non-compliant process instances detected, and it is shown that the improvement depends on the prevalence of non-compliance in the process as well as on the sensitivity and specificity of the audit machine.
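    The dependence on prevalence, sensitivity, and specificity can be made concrete with the standard expected-detection arithmetic. The numbers are illustrative, and this is not the paper's improvement-factor definition itself.

```python
def expected_detections(n_instances, prevalence, sensitivity, specificity):
    """Expected true positives and false positives when a fallible audit
    machine with the given sensitivity/specificity screens all instances."""
    n_noncompliant = n_instances * prevalence
    n_compliant = n_instances - n_noncompliant
    true_pos = n_noncompliant * sensitivity
    false_pos = n_compliant * (1.0 - specificity)
    return true_pos, false_pos

# 1000 instances, 10% non-compliance, 80% sensitivity, 95% specificity:
# 100 non-compliant instances yield ~80 expected true positives, while
# 900 compliant instances yield ~45 expected false positives.
tp, fp = expected_detections(1000, 0.10, 0.80, 0.95)
```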




    The article addresses the following problem: which physical and virtual instruments interact in a human-machine interface in order to automate a prototype for the processing of pastes and liquids? The design covers the measurements and the selection of devices. The implementation uses the SMAR SYSTEM 302 platform for process supervision and control. As a result, we have an automated module whose investment was recovered through the CEMA's courses.

  13. Automated processing of webcam images for phenological classification. (United States)

    Bothmann, Ludwig; Menzel, Annette; Menze, Bjoern H; Schunk, Christian; Kauermann, Göran


    Along with global climate change, there is increasing interest in its effect on phenological patterns such as the start and end of the growing season. Scientific digital webcams are used for this purpose, taking one or more images every day of the same natural motif, showing for example trees or grassland sites. To derive phenological patterns from the webcam images, regions of interest are manually defined on these images by an expert, and subsequently a time series of percentage greenness is derived and analyzed with respect to structural changes. While this standard approach leads to satisfying results and allows the dates of phenological change points to be determined, it is associated with a considerable amount of manual work and is therefore constrained to a limited number of webcams. In particular, this forbids applying the phenological analysis to a large network of publicly accessible webcams in order to capture spatial phenological variation. To be able to scale up the analysis to several hundreds or thousands of webcams, we propose and evaluate two automated alternatives for the definition of regions of interest, allowing for efficient analyses of webcam images. A semi-supervised approach selects pixels based on the correlation of the pixels' time series of percentage greenness with a few prototype pixels. An unsupervised approach clusters pixels based on scores of a singular value decomposition. We show for a scientific webcam that the resulting regions of interest are at least as informative as those chosen by an expert, with the advantage that no manual action is required. Additionally, we show that the methods can even be applied to publicly available webcams accessed via the internet, yielding interesting partitions of the analyzed images. Finally, we show that the methods are suitable for the intended big data applications by analyzing 13988 webcams from the AMOS database. All developed methods are implemented in the statistical software
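    The percentage-greenness quantity underlying the analysis is the standard green chromatic coordinate, G/(R+G+B); a minimal sketch with made-up pixel values for three spring snapshots:

```python
def percentage_greenness(r, g, b):
    """Green chromatic coordinate G/(R+G+B): the per-pixel quantity whose
    time series is analyzed for phenological change points."""
    return g / (r + g + b)

# A hypothetical pixel greening up over three spring snapshots.
series = [percentage_greenness(*rgb)
          for rgb in [(90, 95, 85), (80, 120, 70), (60, 150, 60)]]
```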


    Greiter, M B; Denk, J; Hoedlmoser, H


    The individual monitoring service at the Helmholtz Zentrum München has adopted the recommendations of the ISO 4037 and ISO 6980 standards series as the basis of its dosimetric systems for X-ray, gamma and beta dosimetry. These standards define technical requirements for radiation spectra and measurement processes, but leave flexibility in the implementation of irradiations as well as in the resulting uncertainty in dose or dose rate. This article provides an example of their practical implementation in the Munich IAEA/WHO secondary standard dosimetry laboratory. It focuses on two aspects: automation issues and uncertainties in calibration.

  15. Research progress of laser welding process dynamic monitoring technology based on plasma characteristics signal

    Directory of Open Access Journals (Sweden)

    Teng WANG


    Full Text Available During high-power laser welding, plasmas are induced by the evaporation of metal under laser radiation; they can affect the coupling of laser energy into the workpiece and thus directly impact the reliability and quality of the laser welding process. Research on laser-induced plasma is a focus of the high-power deep-penetration welding field and offers a promising route toward automated inspection of welding process quality. In recent years, research on dynamic monitoring of the laser welding process based on plasma characteristics has concentrated on two aspects: plasma signal detection and laser welding process modeling. This paper introduces the laser-induced plasma in laser welding, analyzes related research at home and abroad on plasma-based dynamic monitoring of the laser welding process, summarizes the current problems in the field, and outlines future development trends.

  16. The feasibility of automated online flow cytometry for in-situ monitoring of microbial dynamics in aquatic ecosystems

    Directory of Open Access Journals (Sweden)

    Michael Domenic Besmer


    Full Text Available Fluorescent staining coupled with flow cytometry (FCM) is often used for the monitoring, quantification and characterization of bacteria in engineered and environmental aquatic ecosystems including seawater, freshwater, drinking water, wastewater, and industrial bioreactors. However, infrequent grab sampling hampers accurate characterization and subsequent understanding of microbial dynamics in all of these ecosystems. A logical technological progression is high-throughput operation and full automation of the sampling, staining, measurement, and data analysis steps. Here we assess the feasibility and applicability of automated FCM by means of actual data sets produced with prototype instrumentation. As proof of concept, we demonstrate examples of microbial dynamics in (i) flowing tap water from a municipal drinking water supply network and (ii) river water from a small creek subject to two rainfall events. In both cases, automated measurements were made at 15-min intervals during 12 to 14 consecutive days, yielding more than 1000 individual data points for each ecosystem. The extensive data sets derived from the automated measurements allowed for the establishment of baseline data for each ecosystem, as well as for the recognition of daily variations and specific events that would most likely be missed (or mischaracterized) by infrequent sampling. In addition, the online FCM data from the river water was combined and correlated with online measurements of abiotic parameters, showing considerable potential for a better understanding of cause-and-effect relationships in aquatic ecosystems. Although several challenges remain, the successful operation of an automated online FCM system and the basic interpretation of the resulting data sets represent a breakthrough towards the eventual establishment of fully automated online microbiological monitoring technologies.
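    Recognizing events against a baseline in a 15-min measurement series, as described above, can be sketched with a simple rolling-median rule. The window length, threshold factor, and data are hypothetical, not the prototype's actual analysis.

```python
def detect_events(counts, window=8, factor=1.5):
    """Flag time points whose cell count exceeds `factor` times the median
    of the preceding `window` measurements (a simple rolling baseline)."""
    events = []
    for i in range(window, len(counts)):
        recent = sorted(counts[i - window:i])
        median = recent[len(recent) // 2]
        if counts[i] > factor * median:
            events.append(i)
    return events

# Stable baseline around 100 cells/uL with a rainfall-like spike from t=10.
series = [100, 102, 98, 101, 99, 103, 100, 97, 101, 100, 220, 230, 180]
spikes = detect_events(series)
```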

  17. Synthesis of many different types of organic small molecules using one automated process. (United States)

    Li, Junqi; Ballmer, Steven G; Gillis, Eric P; Fujii, Seiko; Schmidt, Michael J; Palazzolo, Andrea M E; Lehmann, Jonathan W; Morehouse, Greg F; Burke, Martin D


    Small-molecule synthesis usually relies on procedures that are highly customized for each target. A broadly applicable automated process could greatly increase the accessibility of this class of compounds to enable investigations of their practical potential. Here we report the synthesis of 14 distinct classes of small molecules using the same fully automated process. This was achieved by strategically expanding the scope of a building block-based synthesis platform to include even C(sp3)-rich polycyclic natural product frameworks and discovering a catch-and-release chromatographic purification protocol applicable to all of the corresponding intermediates. With thousands of compatible building blocks already commercially available, many small molecules are now accessible with this platform. More broadly, these findings illuminate an actionable roadmap to a more general and automated approach for small-molecule synthesis.

  18. Monitoring of pharmaceutical processes with image analysis

    DEFF Research Database (Denmark)

    Kucheryavskiy, Sergey; Wu, Jian-Xiong


    Automated Guided Vehicle (AGV) fleet scheduling is one of the major problems in Flexible Manufacturing System (FMS) control. The problem is more complicated when concurrent multi-product manufacturing and resource deadlock avoidance policies are considered. The objective of the research is to pro...

  19. An automated method to quantify microglia morphology and application to monitor activation state longitudinally in vivo.

    Directory of Open Access Journals (Sweden)

    Cleopatra Kozlowski

    Full Text Available Microglia are specialized immune cells of the brain. Upon insult, microglia initiate a cascade of cellular responses including a characteristic change in cell morphology. To study the dynamics of microglia immune response in situ, we developed an automated image analysis method that enables the quantitative assessment of microglia activation state within tissue based solely on cell morphology. Per cell morphometric analysis of fluorescently labeled microglia is achieved through local iterative threshold segmentation, which reduces errors caused by signal-to-noise variation across large volumes. We demonstrate, utilizing systemic application of lipopolysaccharide as a model of immune challenge, that several morphological parameters, including cell perimeter length, cell roundness and soma size, quantitatively distinguish resting versus activated populations of microglia within tissue comparable to traditional immunohistochemistry methods. Furthermore, we provide proof-of-concept data that monitoring soma size enables the longitudinal assessment of microglia activation in the mouse neocortex imaged via 2-photon in vivo microscopy. The ability to quantify microglia activation automatically by shape alone allows unbiased and rapid analysis of both fixed and in vivo central nervous system tissue.
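    Morphometric parameters of the kind listed above can be computed from per-cell area and perimeter. The sketch below uses the standard isoperimetric roundness 4πA/P², which may differ from the paper's exact definition; the shapes are illustrative.

```python
import math

def roundness(area, perimeter):
    """Isoperimetric roundness 4*pi*A/P^2: 1.0 for a circle, lower for
    elongated or ramified (resting-like) shapes."""
    return 4.0 * math.pi * area / perimeter ** 2

# A circle of radius 10 vs. an elongated 5 x 62.8 rectangle of similar area.
circle = roundness(math.pi * 100.0, 2.0 * math.pi * 10.0)
rod = roundness(5 * 62.8, 2 * (5 + 62.8))
```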

  20. ConfocalCheck--a software tool for the automated monitoring of confocal microscope performance.

    Directory of Open Access Journals (Sweden)

    Keng Imm Hng

    Full Text Available Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system's performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments.

  1. Utility of an Automated Thermal-Based Approach for Monitoring Evapotranspiration

    Directory of Open Access Journals (Sweden)

    Timmermans Wim J.


    Full Text Available A very simple remote sensing-based model for water use monitoring is presented. The model acronym DATTUTDUT (Deriving Atmosphere Turbulent Transport Useful To Dummies Using Temperature) is a Dutch word which loosely translates as “it’s unbelievable that it works”. DATTUTDUT is fully automated and only requires a surface temperature map, making it simple to use and providing a rapid estimate of spatially distributed fluxes. The algorithm is first tested over a range of environmental and land-cover conditions using data from four short-term field experiments and then evaluated over a growing season in an agricultural region. Flux model output is in satisfactory agreement with observations and established remote sensing-based models, except under dry and partial canopy cover conditions. This suggests that DATTUTDUT has utility in identifying relative water use and as an operational tool providing initial estimates of ET anomalies in data-poor regions, which would then be confirmed using more robust modeling techniques.
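    The core idea of a temperature-only flux model can be sketched as an evaporative-fraction map scaled between the scene's temperature extremes: the coolest pixel is assumed fully evaporating, the hottest fully sensible. This is a simplified illustration of the DATTUTDUT concept, not the published algorithm's exact formulation.

```python
import numpy as np

def evaporative_fraction(surface_temp):
    """Scale each pixel between the scene's temperature extremes:
    coolest pixel -> EF = 1 (all available energy into evaporation),
    hottest pixel -> EF = 0 (no evaporation)."""
    t = np.asarray(surface_temp, dtype=float)
    t_min, t_max = t.min(), t.max()
    return (t_max - t) / (t_max - t_min)

# A 2x2 toy surface temperature map in Kelvin.
temps = np.array([[300.0, 310.0],
                  [305.0, 320.0]])
ef = evaporative_fraction(temps)
```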

  2. Automated simultaneous monitoring of nitrate and nitrite in surface water by sequential injection analysis. (United States)

    Legnerová, Zlatuse; Solich, Petr; Sklenárová, Hana; Satínský, Dalibor; Karlícek, Rolf


    A fully automated procedure based on Sequential Injection Analysis (SIA) methodology for simultaneous monitoring of nitrate and nitrite in surface water samples is described. Nitrite was determined directly using the Griess diazo-coupling reaction, and the azo dye formed was measured at 540 nm in the flow cell of the fibre-optic spectrophotometer. The nitrate zone was passed through a reducing mini-column containing copperised cadmium. After the reduction of nitrate to nitrite, the sample was aspirated by flow reversal into the holding coil, treated with the reagent, and finally passed through the flow cell. The calibration curve was linear over the range 0.05-1.00 mg N l(-1) of nitrite and 0.50-50.00 mg N l(-1) of nitrate; correlation coefficients were 0.9993 and 0.9988 for nitrite and nitrate, respectively. Detection limits were 0.015 and 0.10 mg N l(-1) for nitrite and nitrate, respectively. The relative standard deviation (RSD) values (n = 3) were 1.10% and 1.32% for nitrite and nitrate, respectively. The total time of one measuring cycle was 250 s; thus the sample throughput was about 14 h(-1). Nitrate and nitrite were determined in real samples of surface water, and the results have been compared with those obtained by two other flow methods: flow injection analysis based on the same reactions, and an isotachophoretic determination used in a routine environmental control laboratory.
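    Converting detector readings to concentrations via a linear calibration curve is ordinary least squares. The standards and absorbance values below are hypothetical (chosen to lie on a line), not the paper's data.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical absorbances for nitrite standards (mg N/l).
std_conc = [0.05, 0.25, 0.50, 0.75, 1.00]
std_abs = [0.021, 0.105, 0.210, 0.315, 0.420]
slope, intercept = linear_fit(std_conc, std_abs)

def concentration(absorbance):
    """Invert the calibration curve for an unknown sample."""
    return (absorbance - intercept) / slope

c = concentration(0.168)
```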

  3. Automating security monitoring and analysis for Space Station Freedom's electric power system (United States)

    Dolce, James L.; Sobajic, Dejan J.; Pao, Yoh-Han


    Operating a large, space power system requires classifying the system's status and analyzing its security. Conventional algorithms are used by terrestrial electric utilities to provide such information to their dispatchers, but their application aboard Space Station Freedom will consume too much processing time. A novel approach for monitoring and analysis using adaptive pattern techniques is presented. This approach yields an on-line security monitoring and analysis algorithm that is accurate and fast; and thus, it can free the Space Station Freedom's power control computers for other tasks.

  4. Contaminant analysis automation demonstration proposal

    Energy Technology Data Exchange (ETDEWEB)

    Dodson, M.G.; Schur, A.; Heubach, J.G.


    The nationwide and global need for environmental restoration and waste remediation (ER&WR) presents significant challenges to the analytical chemistry laboratory. The expansion of ER&WR programs forces an increase in the volume of samples processed and the demand for analysis data. To handle this expanding volume, productivity must be increased. However, the need for significantly increased productivity confronts a contaminant analysis process that is costly in time, labor, equipment, and safety protection. Laboratory automation offers a cost-effective approach to meeting current and future contaminant analytical laboratory needs. The proposed demonstration will present a proof-of-concept automated laboratory conducting varied sample preparations. This automated process also highlights a graphical user interface that provides supervisory control and monitoring of the automated process. The demonstration provides affirming answers to the following questions about laboratory automation: Can preparation of contaminants be successfully automated? Can a full-scale working proof-of-concept automated laboratory be developed that is capable of preparing contaminant and hazardous chemical samples? Can the automated processes be seamlessly integrated and controlled? Can the automated laboratory be customized through readily convertible design? And can automated sample preparation concepts be extended to the other phases of the sample analysis process? To fully reap the benefits of automation, four human factors areas should be studied and the outputs used to increase the efficiency of laboratory automation: (1) laboratory configuration, (2) procedures, (3) receptacles and fixtures, and (4) the human-computer interface for the fully automated system and complex laboratory information management systems.


  6. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example (United States)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.


    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. Data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated, to improve efficiency and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.

  7. Automated business processes in outbound logistics: An information system perspective

    DEFF Research Database (Denmark)

    Tambo, Torben


    by securing a much higher quality of data and eliminating a number of defect management processes in the domain of fashion wholesale and retail. The information system perspective is used in dealing with adopting bespoke ERP development effort of creating an ERP system on the leading edge of business process...

  8. Monitoring seasonal and diurnal changes in photosynthetic pigments with automated PRI and NDVI sensors (United States)

    Gamon, J. A.; Kovalchuck, O.; Wong, C. Y. S.; Harris, A.; Garrity, S. R.


    The vegetation indices normalized difference vegetation index (NDVI) and photochemical reflectance index (PRI) provide indicators of pigmentation and photosynthetic activity that can be used to model photosynthesis from remote sensing with the light-use-efficiency model. To help develop and validate this approach, reliable proximal NDVI and PRI sensors have been needed. We tested new NDVI and PRI sensors, "spectral reflectance sensors" (SRS sensors), recently developed by Decagon Devices, during spring activation of photosynthetic activity in evergreen and deciduous stands. We also evaluated two methods of sensor cross-calibration - one that considered sky conditions (cloud cover) at midday only, and another that also considered diurnal sun angle effects. Cross-calibration clearly affected sensor agreement with independent measurements, with the best method dependent upon the study aim and time frame (seasonal vs. diurnal). The seasonal patterns of NDVI and PRI differed for evergreen and deciduous species, demonstrating the complementary nature of these two indices. Over the spring season, PRI was most strongly influenced by changing chlorophyll : carotenoid pool sizes, while over the diurnal timescale, PRI was most affected by the xanthophyll cycle epoxidation state. This finding demonstrates that the SRS PRI sensors can resolve different processes affecting PRI over different timescales. The advent of small, inexpensive, automated PRI and NDVI sensors offers new ways to explore environmental and physiological constraints on photosynthesis, and may be particularly well suited for use at flux tower sites. Wider application of automated sensors could lead to improved integration of flux and remote sensing approaches for studying photosynthetic carbon uptake, and could help define the concept of contrasting vegetation optical types.
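    Both indices are standard two-band normalized differences; a minimal sketch of their computation (the 531 nm and 570 nm bands for PRI follow the standard definition):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def pri(r531, r570):
    """Photochemical reflectance index from 531 nm and 570 nm reflectance."""
    return (r531 - r570) / (r531 + r570)

# Example with illustrative reflectance values:
# dense green canopy -> high NDVI; slightly negative PRI is typical midday
print(ndvi(0.50, 0.10))   # NIR 0.50, red 0.10
print(pri(0.04, 0.05))
```

    A proximal SRS-type sensor pair reports exactly these band reflectances, so logging the two index values over a season reproduces the curves discussed above.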

  9. Towards automated processing of the right of access in inter-organizational Web Service compositions

    DEFF Research Database (Denmark)

    Herkenhöner, Ralph; De Meer, Hermann; Jensen, Meiko;


    with trade secret protection. In this paper, we present an automated architecture to enable exercising the right of access in the domain of inter-organizational business processes based on Web Services technology. Deriving its requirements from the legal, economical, and technical obligations, we show...

  10. Automating the Weekly Flight Scheduling Process at the USAF Test Pilot School (United States)


    For the resulting mixed integer programming model, all known solution techniques were impractical; consequently, a heuristic algorithm was developed for automating the flight scheduling process at the TPS.

  11. Performance of Three Mode-Meter Block-Processing Algorithms for Automated Dynamic Stability Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel J.; Pierre, John W.; Zhou, Ning; Hauer, John F.; Parashar, Manu


    The frequency and damping of electromechanical modes offer considerable insight into the dynamic stability properties of a power system. The performance properties of three block-processing algorithms from the perspective of near real-time automated stability assessment are demonstrated and examined. The algorithms are: the extended modified Yule-Walker (YW); the extended modified Yule-Walker with spectral analysis (YWS); and the numerical state-space subspace system identification (N4SID) algorithm. The YW and N4SID have been introduced in previous publications, while the YWS is introduced here. Issues addressed include: stability assessment requirements; automated selection of a subset of the identified modes; using the algorithms in an automated format; data assumptions and quality; and expected algorithm estimation performance.
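    As an illustration of the block-processing idea, the sketch below fits an autoregressive model to a measured signal with ordinary Yule-Walker equations and extracts mode frequency and damping from the model poles. This is a generic textbook sketch, not the paper's extended modified YW or N4SID algorithms.

```python
import numpy as np

def yule_walker_ar(y, order):
    """Estimate AR coefficients from sample autocorrelations (Yule-Walker)."""
    y = np.asarray(y, float) - np.mean(y)
    n = len(y)
    r = np.array([np.dot(y[:n - k], y[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])       # a such that y[t] ~ sum a_i y[t-i]

def modes(a, dt):
    """Frequency (Hz) and damping ratio of each oscillatory mode of an AR model."""
    poles = np.roots(np.r_[1.0, -np.asarray(a)])   # z-plane poles
    s = np.log(poles.astype(complex)) / dt         # continuous-time eigenvalues
    out = []
    for si in s:
        if si.imag > 0:                            # one entry per conjugate pair
            out.append((si.imag / (2 * np.pi), -si.real / abs(si)))
    return out
```

    In an automated monitoring loop, each incoming data block would be passed through this estimator and the resulting frequency/damping pairs tracked against stability thresholds.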

  12. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen


    Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components.Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by the existing technical design practices and assumptions (design documents, expert feedback, available technologies (pre-studies and experiments with scripting and pipeline tools, benchmarking with other process models and methods (notably the RUP and DITA, and formal requirements (computability and the critical information paths for the generated applications. In practice, the work includes both quantitative and qualitative components.Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information, and automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of technical information or knowledge engineer, to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory.Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into

  13. Accounting Automation




    "Accounting Automation": Please respond to the following: Imagine you are a consultant hired to convert a manual accounting system to an automated system. Suggest the key advantages and disadvantages of automating a manual accounting system. Identify the most important step in the conversion process. Provide a rationale for your response. ...


    Directory of Open Access Journals (Sweden)

    K. Sujatha


    Combustion quality in power station boilers plays an important role in minimizing flue gas emissions. In the present work, various intelligent schemes are proposed to infer the flue gas emissions by monitoring the flame colour at the furnace of the boiler. Flame image monitoring involves capturing the flame video over a period of time and measuring parameters such as carbon dioxide (CO2), excess oxygen (O2), nitrogen dioxide (NOx), sulphur dioxide (SOx), and carbon monoxide (CO) emissions, plus the flame temperature at the core of the fireball, the air/fuel ratio, and the combustion quality. The higher the quality of combustion, the lower the flue gas emissions at the exhaust. The flame video was captured using an infrared camera and then split into frames for further analysis; a video splitter is used for progressive extraction of the flame images from the video. The flame images are then pre-processed to reduce noise. The conventional classification and clustering techniques include the Euclidean distance (L2 norm) classifier. The intelligent classifiers include the Radial Basis Function network (RBF), the Back Propagation Algorithm (BPA), and a parallel architecture combining RBF and BPA (PRBFBPA). The validation results are supported by the above-mentioned performance measures, whose values are in the optimal range. The temperatures, combustion quality, SOx, NOx, CO, and CO2 concentrations, and the air and fuel supplied corresponding to the images were obtained, indicating the control action needed to increase or decrease the air supply so as to ensure complete combustion. In this work, by continuously monitoring the flame images, combustion quality was inferred (complete/partial/incomplete combustion) and the air/fuel ratio can be varied automatically. Moreover, in the existing set-up, measurements such as NOx, CO and CO2 are inferred from samples that are collected periodically or by
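    A minimal sketch of the Euclidean distance (L2 norm) classifier mentioned above, applied to hypothetical flame-image feature vectors (the feature extraction itself is not shown; class names and cluster values are illustrative):

```python
import numpy as np

def train_centroids(X, y):
    """Mean feature vector per combustion class (e.g. complete/incomplete)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def classify_l2(x, centroids):
    """Assign x to the class whose centroid is nearest in Euclidean distance."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
```

    The intelligent classifiers in the paper (RBF, BPA, PRBFBPA) replace this fixed-distance rule with learned decision boundaries, but the input/output contract is the same: flame-image features in, combustion-quality class out.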

  15. Lyophilization: a useful approach to the automation of analytical processes?


    de Castro, M. D. Luque; Izquierdo, A.


    An overview of the state-of-the-art in the use of lyophilization for the pretreatment of samples and standards prior to their storage and/or preconcentration is presented. The different analytical applications of this process are dealt with according to the type of material (reagent, standard, samples) and matrix involved.

  16. Automation System in Rare Earths Countercurrent Extraction Processes

    Institute of Scientific and Technical Information of China (English)

    贾江涛; 严纯华; 廖春生; 吴声; 王明文; 李标国


    Based on countercurrent extraction theory for the optimized design and simulation of rare earth separation processes, the selection of detecting points (stages) and on-line analysis for elements, the simulation of open-loop response and its response speed, and the diagnosis and regulative prescription for running solvent extraction cascades were studied.

  17. Automated Braille production from word-processed documents. (United States)

    Blenkhorn, P; Evans, G


    This paper describes a novel method for automatically generating Braille documents from word-processed (Microsoft Word) documents. In particular it details how, by using the Word Object Model, the translation system can map the layout information (format) in the print document into an appropriate Braille equivalent.

  18. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation. (United States)

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin


    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent and interdisciplinary accepted graphical process control notation is provided, allowing process analysis, while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example, concerning the real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of the business process management system (BPMS) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for a material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life science specialized LESs, the reduction of numerous different interfaces between BPMSs and subsystems, and the simplification of complex process modelings. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  19. Development and validation of an automated operational modal analysis algorithm for vibration-based monitoring and tensile load estimation (United States)

    Rainieri, Carlo; Fabbrocino, Giovanni


    In the last few decades large research efforts have been devoted to the development of methods for automated detection of damage and degradation phenomena at an early stage. Modal-based damage detection techniques are well-established methods, whose effectiveness for Level 1 (existence) and Level 2 (location) damage detection is demonstrated by several studies. The indirect estimation of tensile loads in cables and tie-rods is another attractive application of vibration measurements. It provides interesting opportunities for cheap and fast quality checks in the construction phase, as well as for safety evaluations and structural maintenance over the structure lifespan. However, the lack of automated modal identification and tracking procedures has been for long a relevant drawback to the extensive application of the above-mentioned techniques in the engineering practice. An increasing number of field applications of modal-based structural health and performance assessment are appearing after the development of several automated output-only modal identification procedures in the last few years. Nevertheless, additional efforts are still needed to enhance the robustness of automated modal identification algorithms, control the computational efforts and improve the reliability of modal parameter estimates (in particular, damping). This paper deals with an original algorithm for automated output-only modal parameter estimation. Particular emphasis is given to the extensive validation of the algorithm based on simulated and real datasets in view of continuous monitoring applications. The results point out that the algorithm is fairly robust and demonstrate its ability to provide accurate and precise estimates of the modal parameters, including damping ratios. As a result, it has been used to develop systems for vibration-based estimation of tensile loads in cables and tie-rods. Promising results have been achieved for non-destructive testing as well as continuous

  20. Adaptive Software Development supported by an Automated Process: a Reference Model

    Directory of Open Access Journals (Sweden)

    AFFONSO, F. J.


    This paper presents a reference model, as an automated process, to assist adaptive software development at runtime, also known as self-adaptive systems (SaS) at runtime. This type of software has specific characteristics in comparison to traditional software, since it allows changes (structural or behavioral) to be incorporated at runtime. Automated processes have been used as a feasible solution to conduct software adaptation at runtime by minimizing human involvement (developers) and speeding up the execution of tasks. In parallel, reference models have been used to aggregate knowledge and architectural artifacts, since they capture the essence of systems in specific domains. However, there is presently no reference model based on reflection for the automation of software adaptation at runtime. In this scenario, this paper presents a reference model based on reflection, as an automated process, for the development of software systems that require adaptation at runtime. To show the applicability of the model, a case study was conducted, and a good perspective on efficiently contributing to the area of SaS was obtained.

  1. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules: Final Subcontract Report, April 1998 - April 2002

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Miller, D. C.; Moore, S. B.; Hogan, S. J.


    This report describes the automated systems developed for PV module assembly and testing processes after lamination. These processes are applicable to a broad range of module types, including those made with wafer-based and thin-film solar cells. Survey data and input from module manufacturers gathered during site visits were used to define system capabilities and process specifications. Spire completed mechanical, electrical, and software engineering for four automation systems: a module edge trimming system, the SPI-TRIM 350; an edge sealing and framing system, the SPI-FRAMER 350; an integrated module testing system, the SPI-MODULE QA 350; and a module buffer storage system, the SPI-BUFFER 350. A fifth system for junction-box installation, the SPI-BOXER 350, was nearly completed during the program. A new-size solar simulator, the SPI-SUN SIMULATOR 350i, was designed as part of the SPI-MODULE QA 350. This simulator occupies minimal production floor space, and its test area is large enough to handle most production modules. The automated systems developed in this program are designed for integration to create automated production lines.

  2. Automated Hardware and Software System for Monitoring the Earth’s Magnetic Environment

    Directory of Open Access Journals (Sweden)

    Alexei Gvishiani


    The continuous growth of geophysical observations requires adequate methods for their processing and analysis. This has become one of the most important and widely discussed issues in the data science community. System analysis methods and data mining techniques can sustain the solution of this problem. This paper presents an innovative holistic hardware/software system (HSS) developed for efficient management and intellectual analysis of geomagnetic data registered by Russian geomagnetic observatories and international satellites. Geomagnetic observatories that comprise the International Real-time Magnetic Observatory Network (INTERMAGNET) produce preliminary (raw) and definitive (corrected) geomagnetic data of the highest quality. The designed system automates and accelerates the routine production of definitive data from the preliminary magnetograms obtained by Russian observatories, owing to implemented algorithms that involve artificial intelligence elements. The HSS is the first system that provides sophisticated automatic detection and multi-criteria classification of extreme geomagnetic conditions, which may be hazardous for technological infrastructure and economic activity in Russia. It enables online access to digital geomagnetic data, its processing results, and modelling calculations, along with their visualization on conventional and spherical screens. The concept of the presented system agrees with the accepted ‘four Vs’ paradigm of Big Data. The HSS can significantly increase the ‘velocity’ and ‘veracity’ features of the INTERMAGNET system. It also provides fusion of large sets of ground-based and satellite geomagnetic data, thus facilitating the ‘volume’ and ‘variety’ of handled data.

  3. The integration of process monitoring for safeguards.

    Energy Technology Data Exchange (ETDEWEB)

    Cipiti, Benjamin B.; Zinaman, Owen R.


    The Separations and Safeguards Performance Model is a reprocessing plant model that has been developed for safeguards analyses of future plant designs. The model has been modified to integrate bulk process monitoring data with traditional plutonium inventory balances to evaluate potential advanced safeguards systems. Taking advantage of the wealth of operator data such as flow rates and mass balances of bulk material, the timeliness of detection of material loss was shown to improve considerably. Four diversion cases were tested including both abrupt and protracted diversions at early and late times in the run. The first three cases indicated alarms before half of a significant quantity of material was removed. The buildup of error over time prevented detection in the case of a protracted diversion late in the run. Some issues related to the alarm conditions and bias correction will need to be addressed in future work. This work both demonstrates the use of the model for performing diversion scenario analyses and for testing advanced safeguards system designs.
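    The core bookkeeping behind such a safeguards analysis (a cumulative material balance whose measurement uncertainty grows over time) can be sketched as follows. This is an illustrative toy, not the Separations and Safeguards Performance Model itself; the error model and the 3-sigma alarm rule are assumptions.

```python
import numpy as np

def cumulative_muf(inputs, outputs, inventory_change, sigma_per_balance):
    """Cumulative material unaccounted for (MUF) and its 1-sigma uncertainty.
    Each argument is a per-balance-period array (kg); measurement error is
    assumed independent across periods, so variances add over time."""
    muf = np.cumsum(np.asarray(inputs) - np.asarray(outputs)
                    - np.asarray(inventory_change))
    sigma = sigma_per_balance * np.sqrt(np.arange(1, len(muf) + 1))
    return muf, sigma

def first_alarm(muf, sigma, k=3.0):
    """Index of the first balance period where |MUF| exceeds k-sigma, else None."""
    hits = np.flatnonzero(np.abs(muf) > k * sigma)
    return int(hits[0]) if hits.size else None
```

    The sqrt growth of sigma is exactly the "buildup of error over time" that prevented detection of the late protracted diversion in the study: a small constant loss rate eventually disappears inside the widening uncertainty band unless bias is corrected.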

  4. Cassini's Maneuver Automation Software (MAS) Process: How to Successfully Command 200 Navigation Maneuvers (United States)

    Yang, Genevie Velarde; Mohr, David; Kirby, Charles E.


    To keep Cassini on its complex trajectory, more than 200 orbit trim maneuvers (OTMs) have been planned from July 2004 to July 2010. With only a few days between many of these OTMs, the operations process of planning and executing the necessary commands had to be automated. The resulting Maneuver Automation Software (MAS) process minimizes the workforce required for, and maximizes the efficiency of, the maneuver design and uplink activities. The MAS process is a well-organized and logically constructed interface between Cassini's Navigation (NAV), Spacecraft Operations (SCO), and Ground Software teams. Upon delivery of an orbit determination (OD) from NAV, the MAS process can generate a maneuver design and all related uplink and verification products within 30 minutes. To date, all 112 OTMs executed by the Cassini spacecraft have been successful. MAS was even used to successfully design and execute a maneuver while the spacecraft was in safe mode.

  5. Quantitative and Qualitative Analysis of Aconitum Alkaloids in Raw and Processed Chuanwu and Caowu by HPLC in Combination with Automated Analytical System and ESI/MS/MS

    Directory of Open Access Journals (Sweden)

    Aimin Sun


    HPLC in combination with an automated analytical system and ESI/MS/MS was used to analyze aconitine (A), mesaconitine (MA), hypaconitine (HA), and their benzoyl analogs in the Chinese herbs Caowu and Chuanwu. First, an HPLC method was developed and validated to determine A, MA, and HA in raw and processed Caowu and Chuanwu. Then the automated analytical system and ESI/MS/MS were applied to analyze these alkaloids and their semi-hydrolyzed products. The results obtained from the automated analytical system are identical to those from ESI/MS/MS, which indicates that the method is a convenient and rapid tool for the qualitative analysis of herbal preparations. Furthermore, HA was little hydrolyzed by heating processes and thus might account more for the toxicity of processed aconites. Hence, HA could be used as an indicator when one alkaloid is required as a reference to monitor the quality of raw and processed Chuanwu and Caowu. In addition, raw and processed Chuanwu and Caowu can be distinguished by monitoring the ratio of A and MA to HA.

  6. Monitoring polio supplementary immunization activities using an automated short text messaging system in Karachi, Pakistan (United States)

    Murtaza, A; Khoja, S; Zaidi, AK; Ali, SA


    Abstract Problem Polio remains endemic in many areas of Pakistan, including large urban centres such as Karachi. Approach During each of seven supplementary immunization activities against polio in Karachi, mobile phone numbers of the caregivers of a random sample of eligible children were obtained. A computer-based system was developed to send two questions – as short message service (SMS) texts – automatically to each number after the immunization activity: “Did the vaccinator visit your house?” and “Did the enrolled child in your household receive oral polio vaccine?” Persistent non-responders were phoned directly by an investigator. Local setting A cluster sampling technique was used to select representative samples of the caregivers of young children in Karachi in general and of such caregivers in three of the six “high-risk” districts of the city where polio cases were detected in 2011. Relevant changes In most of the supplementary immunization activities investigated, vaccine coverages estimated using the SMS system were very similar to those estimated by interviewing by phone those caregivers who never responded to the SMS messages. In the high-risk districts investigated, coverages estimated using the SMS system were also similar to those recorded – using lot quality assurance sampling – by the World Health Organization. Lessons learnt For the monitoring of coverage in supplementary immunization activities, automated SMS-based systems appear to be an attractive and relatively inexpensive option. Further research is needed to determine if coverage data collected by SMS-based systems provide estimates that are sufficiently accurate. Such systems may be useful in other large-scale immunization campaigns. PMID:24700982
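    The coverage estimation described above (SMS replies, with direct phone calls to persistent non-responders) reduces to a simple merge-and-tally. The sketch below is hypothetical; field names and the yes/no encoding are illustrative assumptions, not the study's actual data schema.

```python
def estimate_coverage(sms_replies, phone_followup):
    """Vaccination coverage from SMS replies ('yes'/'no' per caregiver id),
    with phone follow-up answers filling in for SMS non-responders.
    Both arguments: dict mapping caregiver id -> 'yes' | 'no'."""
    merged = dict(phone_followup)
    merged.update(sms_replies)          # an SMS reply takes precedence
    answered = [v for v in merged.values() if v in ('yes', 'no')]
    if not answered:
        return None                     # no usable responses
    return sum(v == 'yes' for v in answered) / len(answered)
```

    Comparing this SMS-based estimate with the phone-only estimate for non-responders is essentially the validation the study performed across the seven immunization rounds.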

  7. Intelligent remote health monitoring using evident-based DSS for automated assistance. (United States)

    Serhani, Mohamed Adel; Benharref, Abdelghani; Nujum, Al Ramzana


    The shift from common diagnosis practices to continuous monitoring based on body sensors has transformed healthcare from hospital-centric to patient-centric. Continuous monitoring generates huge and continuous amounts of data revealing changing insights. Existing approaches to analyzing streams of data in order to produce validated decisions have relied mostly on static learning and analytics techniques. In this paper, we propose an incremental learning and adaptive analytics scheme relying on evident data and a rule-based Decision Support System (DSS). The latter continuously enriches its knowledge base with incremental learning information, impacting the decision and proposing up-to-date recommendations. Intelligent features augment the monitoring scheme with data pre-processing and cleansing support, which helps improve data analytics efficiency. Generated assistance is viewable to users on their mobile devices and to physicians via a portal. We evaluate our incremental learning and analytics scheme using seven well-known learning techniques. Experimental scenarios of continuous heart rate and ECG monitoring demonstrated that incremental learning combined with the rule-based DSS afforded high classification accuracy, evidenced decisions, and validated assistance.

  8. Temperature versus time curves for manual and automated soldering processes

    Energy Technology Data Exchange (ETDEWEB)

    Trent, M.A.


    Temperature-versus-time curves were recorded for various electronic components during pre-tinning, hand soldering, and drag soldering operations to determine the temperature ranges encountered. The component types investigated included a wide range of electronic assemblies. The data collected has been arranged by process and will help engineers to: (1) predetermine the thermal profile to which various components are subjected during the soldering operation; (2) decide--on the basis of component heat sensitivity and the need for thermal relief--where hand soldering would be more feasible than drag soldering; and (3) determine the optimum drag solder control parameters.

  9. Process development for automated solar cell and module production. Task 4: automated array assembly. Quarterly report No. 1

    Energy Technology Data Exchange (ETDEWEB)

    Witham, C.R.


    The objective of this program is to determine the state of the art and to develop some of the technology required to allow large-volume, low-cost terrestrial solar panel production. The baseline production facility being studied would provide for production of 200 megawatts of solar panels per year, with as-sawn Czochralski wafers as the input commodity. Initial analysis of available automation equipment applicable to the 1986 goals shows that most of the equipment will have to be of special design: the currently available equipment is designed for the semiconductor industry, where process volumes are low and maximum speeds are in the range of 100 to 200 wafers per hour. Using special equipment, it appears feasible to produce the solar cells with 6 to 8 parallel production lines operating three shifts per day, seven days per week, and to produce the encapsulated modules with 1 to 3 parallel production lines. Preliminary cost analyses show promise for reaching the 1986 price goals, assuming a SAMICS wafer price of $0.28/wafer (1986 dollars). Initial work has been done to study the applicability of a plasma process to perform the back etch of the cells. This area shows promise for eliminating wet chemical etching procedures, with the attendant rinse and dry equipment and time required.


    Directory of Open Access Journals (Sweden)

    О. И. Дзювина


    Rating technology of teaching enables independent and individual work by students and increases their motivation. Purpose: to increase the efficiency of data processing through the implementation of rating technology of teaching. Method: analysis, synthesis, experiment. Results: an automated data processing system was developed for the implementation of rating technology of teaching. Practical implication: education.

  11. Monitoring individual cow udder health in automated milking systems using online somatic cell counts. (United States)

    Sørensen, L P; Bjerring, M; Løvendahl, P


    This study presents and validates a detection and monitoring model for mastitis based on automated frequent sampling of online cell count (OCC). Initially, data were filtered and adjusted for sensor drift and skewed distribution using ln-transformation. Acceptable data were passed on to a time-series model using double exponential smoothing to estimate level and trends at cow level. The OCC levels and trends were converted to a continuous (0-1) scale, termed elevated mastitis risk (EMR), where values close to zero indicate healthy cow status and values close to 1 indicate high risk of mastitis. Finally, a feedback loop was included to dynamically request a time to next sample, based on latest EMR values or errors in the raw data stream. The estimated EMR values were used to issue 2 types of alerts, new and (on-going) intramammary infection (IMI) alerts. The new alerts were issued when the EMR values exceeded a threshold, and the IMI alerts were issued for subsequent alerts. New alerts were only issued after the EMR had been below the threshold for at least 8d. The detection model was evaluated using time-window analysis and commercial herd data (6 herds, 595,927 milkings) at different sampling intensities. Recorded treatments of mastitis were used as gold standard. Significantly higher EMR values were detected in treated than in contemporary untreated cows. The proportion of detected mastitis cases using new alerts was between 28.0 and 43.1% and highest for a fixed sampling scheme aiming at 24h between measurements. This was higher for IMI alerts, between 54.6 and 89.0%, and highest when all available measurements were used. The lowest false alert rate of 6.5 per 1,000 milkings was observed when all measurements were used. The results showed that a dynamic sampling scheme with a default value of 24h between measurements gave only a small reduction in proportion of detected mastitis treatments and remained at 88.5%. It was concluded that filtering of raw data
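    The core of the detection model above is double exponential (Holt) smoothing of ln-transformed online cell counts, with level and trend mapped to a 0-1 elevated mastitis risk (EMR). The sketch below follows that description; the smoothing constants and the linear-clip EMR mapping are placeholder assumptions, since the abstract does not give the actual parameters or transformation.

```python
import numpy as np

def des_update(level, trend, x, alpha=0.2, beta=0.1):
    """One step of double exponential (Holt) smoothing on ln-transformed OCC."""
    prev = level
    level = alpha * x + (1 - alpha) * (level + trend)
    trend = beta * (level - prev) + (1 - beta) * trend
    return level, trend

def emr(level, trend, lo=4.0, hi=7.0):
    """Map smoothed ln(cell count) level+trend to a 0-1 elevated mastitis risk.
    lo/hi are illustrative healthy/at-risk anchors on the ln scale."""
    score = level + trend
    return float(np.clip((score - lo) / (hi - lo), 0.0, 1.0))
```

    A "new" alert would fire when EMR crosses a threshold after a quiet period; feeding the latest EMR back into the sampler gives the dynamic time-to-next-sample behaviour described in the study.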

  12. Laser materials processing of complex components. From reverse engineering via automated beam path generation to short process development cycles. (United States)

    Görgl, R.; Brandstätter, E.


    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art along the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser welding, laser cladding and additive laser manufacturing are given.

  13. Automated processing of massive audio/video content using FFmpeg

    Directory of Open Access Journals (Sweden)

    Kia Siang Hock


    Full Text Available Audio and video content forms an integral, important and expanding part of the digital collections in libraries and archives world-wide. While these memory institutions are familiar and well-versed in the management of more conventional materials such as books, periodicals, ephemera and images, the handling of audio (e.g., oral history recordings) and video content (e.g., audio-visual recordings, broadcast content) requires additional toolkits. In particular, a robust and comprehensive tool that provides a programmable interface is indispensable when dealing with tens of thousands of hours of audio and video content. FFmpeg is comprehensive and well-established open source software that is capable of the full range of audio/video processing tasks (such as encode, decode, transcode, mux, demux, stream and filter). It is also capable of handling a wide range of audio and video formats, a unique challenge in memory institutions. It comes with a command line interface, as well as a set of developer libraries that can be incorporated into applications.
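    As a sketch of the programmatic batch processing the abstract describes, the snippet below builds (but does not execute) FFmpeg command lines for a set of files; the file names, codec and quality settings are illustrative defaults, not archival policy.

```python
from pathlib import Path

def transcode_cmd(src: Path, dst_dir: Path) -> list:
    """Build an ffmpeg command that transcodes a source file to H.264/AAC MP4."""
    dst = dst_dir / (src.stem + ".mp4")
    return [
        "ffmpeg", "-hide_banner", "-y",
        "-i", str(src),                     # input file
        "-c:v", "libx264", "-crf", "23",    # video: H.264, constant quality
        "-c:a", "aac", "-b:a", "128k",      # audio: AAC
        str(dst),
    ]

# Over tens of thousands of hours of content, each command would be handed
# to subprocess.run(); here the commands are only constructed.
sources = [Path("tape01.mov"), Path("broadcast02.mxf")]
cmds = [transcode_cmd(p, Path("out")) for p in sources]
```

    The same pattern extends to demuxing, filtering or streaming by swapping the option list, which is what makes a programmable interface attractive at this scale.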

  14. Automated Coronal Loop Identification Using Digital Image Processing Techniques (United States)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.


    The results of a master's thesis project on computer algorithms for automatic identification of optically thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images will be presented. These center splines are proxies of associated magnetic field lines. The project addresses pattern recognition problems in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parametric space via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity, as possible candidates for digital loop detection schemes. We have developed synthesized images of the coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information for the identification process. Results from both synthesized and solar images will be presented.
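    Of the three techniques listed, the Hough transform is the most compact to sketch: edge pixels vote in an accumulator over (rho, theta) line parameters, and peaks mark candidate straight features. The toy implementation below is a straight-line simplification of the parametric-space idea, not the thesis code; curved loop segments would be handled piecewise or with a higher-order parameterization.

```python
import numpy as np

def hough_lines(img, n_theta=180):
    """Vote each nonzero pixel of a binary image into a (rho, theta)
    accumulator; rho = x*cos(theta) + y*sin(theta), offset by the image
    diagonal so row indices stay non-negative."""
    h, w = img.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag + 1, n_theta), dtype=np.int32)
    ys, xs = np.nonzero(img)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1
    return acc, thetas, diag

# A horizontal row of bright pixels yields a sharp peak near theta = 90 degrees.
img = np.zeros((20, 20), dtype=bool)
img[10, :] = True
acc, thetas, diag = hough_lines(img)
rho_i, theta_i = np.unravel_index(acc.argmax(), acc.shape)
```

    In the loop-detection setting, accumulator peaks would seed center-spline fits, with the snake stage then refining curvature and continuity.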

  15. Automation of NLO processes and decays and POWHEG matching in WHIZARD

    Energy Technology Data Exchange (ETDEWEB)

    Reuter, Juergen; Chokoufe, Bijan [DESY, Hamburg (Germany). Theory Group; Hoang, Andre [Vienna Univ. (Austria). Faculty of Physics; Vienna Univ. (Austria). Erwin Schroedinger International Inst. for Mathematical Physics; Kilian, Wolfgang [Siegen Univ. (Germany); Stahlhofen, Maximilian [Mainz Univ. (Germany). PRISMA Cluster of Excellence; DESY, Hamburg (Germany). Theory Group; Teubner, Thomas [Liverpool Univ. (United Kingdom). Dept. of Mathematical Sciences; Weiss, Christian [DESY, Hamburg (Germany). Theory Group; Siegen Univ. (Germany)


    We give a status report on the automation of next-to-leading order processes within the Monte Carlo event generator WHIZARD, using GoSam and OpenLoops as providers of one-loop matrix elements. To deal with divergences, WHIZARD uses automated FKS subtraction, and the phase space for singular regions is generated automatically. NLO examples for both scattering and decay processes, with a focus on e+e- processes, are shown. Also, first NLO studies of observables for collisions of polarized lepton beams, e.g. at the ILC, are presented. Furthermore, the automatic matching of the fixed-order NLO amplitudes with emissions from the parton shower within the POWHEG formalism inside WHIZARD is discussed. We also present results for top pairs at threshold in lepton collisions, including matching between a resummed threshold calculation and fixed-order NLO. This allows the investigation of more exclusive differential observables.

  16. Simplified automated image analysis for detection and phenotyping of Mycobacterium tuberculosis on porous supports by monitoring growing microcolonies.

    Directory of Open Access Journals (Sweden)

    Alice L den Hertog

    Full Text Available BACKGROUND: Even with the advent of nucleic acid (NA) amplification technologies, the culture of mycobacteria for diagnostic and other applications remains of critical importance. Notably, microscopic observed drug susceptibility testing (MODS), as opposed to traditional culture on solid media or automated liquid culture, has shown potential to both speed up and increase the provision of mycobacterial culture in high-burden settings. METHODS: Here we explore the growth of Mycobacterium tuberculosis microcolonies, imaged by automated digital microscopy, cultured on porous aluminium oxide (PAO) supports. Repeated imaging during colony growth greatly simplifies "computer vision", and presumptive identification of microcolonies was achieved here using existing publicly available algorithms. Our system thus allows the growth of individual microcolonies to be monitored and, critically, also allows the media to be changed during the growth phase without disrupting the microcolonies. Transfer of identified microcolonies onto selective media allowed us, within 1-2 bacterial generations, to rapidly detect the drug susceptibility of individual microcolonies, eliminating the need for time-consuming subculturing or the inoculation of multiple parallel cultures. SIGNIFICANCE: Monitoring the phenotype of individual microcolonies as they grow has immense potential for research, screening, and ultimately M. tuberculosis diagnostic applications. The method described is particularly appealing with respect to speed and automation.

  17. Comprehensive automation and monitoring of MV grids as the key element of improvement of energy supply reliability and continuity

    Directory of Open Access Journals (Sweden)

    Stanisław Kubacki


    Full Text Available The paper presents the issue of comprehensive automation and monitoring of medium voltage (MV) grids as a key element of the Smart Grid concept. The existing condition of MV grid control and monitoring is discussed, and the concept of a solution which will provide the possibility of remote automatic grid reconfiguration and ensure full grid observability from the dispatching system level is introduced. Automation of MV grid switching, which isolates a faulty line section and keeps the largest possible number of customers supplied with electricity for the duration of the failure, is discussed in detail. An example of such automation controls’ operation is also presented. The paper’s second part presents the key role of the quick fault location function and the possibility of the MV grid’s remote reconfiguration for improving power supply reliability (SAIDI and SAIFI indices). It is also shown how an increase in the number of points fitted with faulted circuit indicators, with the option of remote control of switches from the dispatch system in MV grids, may affect reduction of SAIDI and SAIFI indices across ENERGA-OPERATOR SA divisions.
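    For reference, the SAIDI and SAIFI indices mentioned above follow directly from per-outage records of customers interrupted and restoration time (the standard IEEE 1366 definitions); the outage data below are invented for illustration.

```python
def reliability_indices(outages, customers_served):
    """SAIDI (average interruption duration per customer served, in minutes)
    and SAIFI (average number of interruptions per customer served), computed
    from a list of (customers_interrupted, duration_minutes) sustained outages."""
    saifi = sum(n for n, _ in outages) / customers_served
    saidi = sum(n * d for n, d in outages) / customers_served
    return saidi, saifi

# Remote fault location and automatic switching shrink both each outage's
# duration and the number of customers left unsupplied, lowering both indices.
saidi, saifi = reliability_indices([(1200, 90), (300, 45)], customers_served=50_000)
```

    Isolating a fault faster cuts the duration term (SAIDI), while re-supplying customers through reconfiguration before an event counts as sustained also removes it from the SAIFI numerator.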

  18. 30 CFR 828.12 - In situ processing: Monitoring. (United States)


    ... 30 Mineral Resources 3 2010-07-01 2010-07-01 false In situ processing: Monitoring. 828.12 Section 828.12 Mineral Resources OFFICE OF SURFACE MINING RECLAMATION AND ENFORCEMENT, DEPARTMENT OF THE... PROCESSING § 828.12 In situ processing: Monitoring. (a) Each person who conducts in situ...

  19. Spectral imaging applications: Remote sensing, environmental monitoring, medicine, military operations, factory automation and manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Gat, N.; Subramanian, S. [Opto-Knowledge Systems, Inc. (United States); Barhen, J. [Oak Ridge National Lab., TN (United States); Toomarian, N. [Jet Propulsion Lab., Pasadena, CA (United States)


    This paper reviews the activities at OKSI related to imaging spectroscopy, presenting current and future applications of the technology. The authors discuss the development of several systems including hardware, signal processing, data classification algorithms and benchmarking techniques to determine algorithm performance. Signal processing for each application is tailored by incorporating the phenomenology appropriate to the process into the algorithms. Pixel signatures are classified using techniques such as principal component analysis, generalized eigenvalue analysis, and novel very fast neural network methods. The major hyperspectral imaging systems developed at OKSI include the Intelligent Missile Seeker (IMS) demonstration project for real-time target/decoy discrimination, and the Thermal InfraRed Imaging Spectrometer (TIRIS) for detection and tracking of toxic plumes and gases. In addition, systems for applications in medical photodiagnosis, manufacturing technology, and crop monitoring are also under development.

  20. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells.

    Directory of Open Access Journals (Sweden)

    Dai Fei Elmer Ker

    Full Text Available Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and
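    The precision and recall quoted above follow the usual definitions over true/false positives and false negatives; a minimal sketch, with invented confusion counts chosen only to land near the reported values.

```python
def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP): how many flagged-confluent frames were correct.
    Recall = TP/(TP+FN): how many truly confluent frames were flagged."""
    return tp / (tp + fp), tp / (tp + fn)

# Invented counts for illustration only, not the study's confusion matrix.
p, r = precision_recall(tp=790, fp=210, fn=620)
```

    A high-precision, moderate-recall operating point like this favors avoiding premature subculture alerts at the cost of occasionally flagging confluency late.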

  1. Research on the Correlation Between Oil Monitoring and Vibration Monitoring in Information Collecting and Processing Monitoring

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xin-ze; YAN Xin-ping; ZHAO Chun-hong; GAO Xiao-hong; XIAO Han-liang


    Oil monitoring and vibration monitoring are two principal techniques for mechanical fault diagnosis and condition monitoring at present. They monitor the mechanical condition by different approaches; nevertheless, oil and vibration monitoring are related in information collecting and processing. In the same mechanical system, the information obtained from the same information source can be described with the same expression form. The expressions are constituted of a structure matrix, a relative matrix and a system matrix. For oil and vibration monitoring, the information sources are correlated while the collection is independent and complementary, and the two techniques share the same processing method when they yield their information. This research has provided a reasonable and useful approach to combining oil monitoring and vibration monitoring.

  2. InPRO: Automated Indoor Construction Progress Monitoring Using Unmanned Aerial Vehicles (United States)

    Hamledari, Hesam

    In this research, an envisioned automated intelligent robotic solution for automated indoor data collection and inspection that employs a series of unmanned aerial vehicles (UAVs), entitled "InPRO", is presented. InPRO consists of four stages, namely: 1) automated path planning; 2) autonomous UAV-based indoor inspection; 3) automated computer vision-based assessment of progress; and 4) automated updating of 4D building information models (BIM). The work presented in this thesis addresses the third stage of InPRO. A series of computer vision-based methods that automate the assessment of construction progress using images captured at indoor sites are introduced. The proposed methods employ computer vision and machine learning techniques to detect the components of under-construction indoor partitions. In particular, framing (studs), insulation, electrical outlets, and different states of drywall sheets (installing, plastering, and painting) are automatically detected in digital images. High accuracy rates, real-time performance, and operation without a priori information are indicators of the methods' promising performance.

  3. Evaluation of a Multi-Parameter Sensor for Automated, Continuous Cell Culture Monitoring in Bioreactors (United States)

    Pappas, D.; Jeevarajan, A.; Anderson, M. M.


    offer automated, continuous monitoring of cell cultures with a temporal resolution of 1 minute, which is not attainable by sampling via a handheld blood analyzer (i-STAT). Conclusion: The resulting bias and precision found in these cell culture-based studies are comparable to Paratrend sensor clinical results. Although the large error in pO2 measurements (+/-18 mm Hg) may be acceptable for clinical applications, where Paratrend values are periodically adjusted to a BGA measurement, the O2 sensor in this bundle may not be reliable enough for the single-calibration requirement of sensors used in NASA's bioreactors. The pH and pCO2 sensors in the bundle are reliable and stable over the measurement period, and can be used without recalibration to measure cell cultures in microgravity biotechnology experiments. Future work will test additional Paratrend sensors to provide statistical assessment of sensor performance.


    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager


    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure to obtain knowledge about the parameters of other not monitored but important response processes when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  5. Automated system for monitoring groundwater levels at an experimental low-level waste disposal site

    Energy Technology Data Exchange (ETDEWEB)

    Newbold, J.D.; Bogle, M.A.


    One of the major problems with disposing of low-level solid wastes in the eastern United States is the potential for water-waste interactions and leachate migration. To monitor groundwater fluctuations and the frequency with which groundwater comes into contact with a group of experimental trenches, work at Oak Ridge National Laboratory's Engineered Test Facility (ETF) has employed a network of water level recorders that feed information from 15 on-site wells to a centralized data recording system. The purpose of this report is to describe the monitoring system being used and to document the computer programs that have been developed to process the data. Included in this report are data based on more than 2 years of water level information for ETF wells 1 through 12 and more than 6 months of data from all 15 wells. The data thus reflect both long-term trends as well as a large number of short-term responses to individual storm events. The system was designed to meet the specific needs of the ETF, but the hardware and computer routines have generic application to a variety of groundwater monitoring situations. 5 references.




    The paper describes the problems of automating the educational process in the course "Programming in a high-level language. Algorithmic languages". The complexities of testing programs with a user interface are noted. Existing analogues are considered, and methods for automating the testing of students' assignments are proposed.

  7. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meier, David E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coble, Jamie B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jordan, David V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mcdonald, Luther W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Forrester, Joel B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schwantes, Jon M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Unlu, Kenan [Pennsylvania State Univ., University Park, PA (United States); Landsberger, Sheldon [Univ. of Texas, Austin, TX (United States); Bender, Sarah [Pennsylvania State Univ., University Park, PA (United States); Dayman, Kenneth J. [Univ. of Texas, Austin, TX (United States); Reilly, Dallas D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
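    One standard way to realize the "multivariate analysis and pattern recognition" step is a PCA model trained on normal-condition spectra, with the squared prediction error (Q statistic) of each new spectrum flagging off-normal conditions. The sketch below runs on synthetic stand-in spectra; it illustrates the general technique, not the MIP Monitor's actual algorithm.

```python
import numpy as np

def fit_pca(X, k):
    """Fit a k-component PCA model to rows of X (training spectra assumed normal)."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:k]                      # mean spectrum and principal directions

def spe(x, mu, comps):
    """Squared prediction error (Q statistic): residual norm after projecting a
    new spectrum onto the normal-conditions subspace. Large SPE = off-normal."""
    r = (x - mu) - ((x - mu) @ comps.T) @ comps
    return float(r @ r)

rng = np.random.default_rng(0)
base = np.sin(np.linspace(0, 6, 128))              # stand-in "normal" spectral shape
train = base + 0.01 * rng.standard_normal((50, 128))
mu, comps = fit_pca(train, k=3)

normal_spe = spe(base + 0.01 * rng.standard_normal(128), mu, comps)
shifted = base.copy()
shifted[40:50] += 0.5                              # injected peak = process change
anomal_spe = spe(shifted + 0.01 * rng.standard_normal(128), mu, comps)
```

    In practice the SPE threshold would be set from the training residuals, and the model itself (rather than raw spectra) can be shared, which is one way the approach provides the information barrier noted above.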

  8. Total Column Greenhouse Gas Monitoring in Central Munich: Automation and Measurements (United States)

    Chen, Jia; Heinle, Ludwig; Paetzold, Johannes C.; Le, Long


    It is challenging to use in-situ surface measurements of CO2 and CH4 to derive emission fluxes in urban regions. Surface concentrations typically have high variance due to the influence of nearby sources, and they are strongly modulated by mesoscale transport phenomena that are difficult to simulate in atmospheric models. The integrated amount of a tracer through the whole atmosphere is a direct measure of the mass loading of the atmosphere given by emissions. Column measurements are insensitive to vertical redistribution of tracer mass, e.g. due to growth of the planetary boundary layer, and are also less influenced by nearby point sources, whose emissions are concentrated in a thin layer near the surface. Column observations are more compatible with the scale of atmospheric models and hence provide stronger constraints for inverse modeling. In Munich we are aiming at establishing a regional sensor network with differential column measurements, i.e. total column measurements of CO2 and CH4 inside and outside of the city. The inner-city station is equipped with a compact solar-tracking Fourier transform spectrometer (Bruker EM27/SUN) on the campus of Technische Universität München, and our measurements started in Aug. 2015. The measurements over the seasons will be shown, as well as preliminary emission studies using these observations. To deploy the compact spectrometers for stationary monitoring of the urban emissions, an automatic protection and control system is mandatory and a challenging task. It allows solar measurements whenever the sun is out and reliable protection of the instrument when it starts to rain. We have developed a simplified and highly reliable concept for the enclosure, aiming for a fully automated data collection station without the need for local human interaction. Furthermore, we are validating and combining the OCO-2 satellite-based measurements with our ground-based measurements. For this purpose, we have developed a software tool that

  9. Monitoring Industrial Food Processes Using Spectroscopy & Chemometrics

    DEFF Research Database (Denmark)

    Pedersen, Dorthe Kjær; Engelsen, Søren Balling


    In the last decade rapid spectroscopic measurements have revolutionized quality control in practically all areas of primary food and feed production. Near-infrared spectroscopy (NIR & NIT) has been implemented for monitoring the quality of millions of samples of cereals, milk and meat...

  10. Potentiometric electronic tongue-flow injection analysis system for the monitoring of heavy metal biosorption processes



    An automated flow injection potentiometric (FIP) system with electronic tongue detection (ET) is used for the monitoring of biosorption processes of heavy metals on vegetable wastes. Grape stalk wastes are used as biosorbent to remove Cu2+ ions in a fixed-bed column configuration. The ET is formed by a 5-sensor array with Cu2+ and Ca2+-selective electrodes and electrodes with generic response to heavy metals, plus an artificial neural network response model of the sensors' cross-response. The...

  11. A Practical Guide to the Technology and Adoption of Software Process Automation (United States)


    Only fragments of the report text survive in this record: a trademark notice from the front matter, table-of-contents entries ("Introduction"; "Software Process Automation in Context"; "Table 2-1: Pages of Practices at the Five CMM Levels"), and a passage on what a Level 1 organization performs in order to achieve Level 2 (CMU/SEI-94-TR-007).

  12. Analysis of the thoracic aorta using a semi-automated post-processing tool

    Energy Technology Data Exchange (ETDEWEB)

    Entezari, Pegah, E-mail: [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Kino, Aya, E-mail: [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Honarmand, Amir R., E-mail: [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Galizia, Mauricio S., E-mail: [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yang, Yan, E-mail: [Vital images Inc, Minnetonka, MN (United States); Collins, Jeremy, E-mail: [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yaghmai, Vahid, E-mail: [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Carr, James C., E-mail: [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States)


    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA-compliant study was approved by our IRB. Transaxial maximum outer-wall-to-outer-wall diameters were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of the right pulmonary artery, proximal aortic arch (PROX) immediately proximal to the innominate artery, distal aortic arch (DIST) immediately distal to the left subclavian artery, and descending aorta (DESC) at the level of the diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to the intraclass correlation coefficient (ICC) and Bland–Altman plots. The number of cases requiring manual contouring or center-line adjustment for the semi-automated method, and the post-processing time for each method, were recorded. Results: The mean difference between the semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to do the measurements with the semi-automated tool (Vitrea version software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels, with the highest and lowest numbers of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is

  13. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels


    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff. Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  14. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Bonnie [Adventium Labs; Boddy, Mark [Adventium Labs; Doyle, Frank [Univ. of California, Santa Barbara, CA (United States); Jamshidi, Mo [Univ. of New Mexico, Albuquerque, NM (United States); Ogunnaike, Tunde [Univ. of Delaware, Newark, DE (United States)


    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  15. Development of an automated sample preparation module for environmental monitoring of biowarfare agents. (United States)

    Hindson, Benjamin J; Brown, Steve B; Marshall, Graham D; McBride, Mary T; Makarewicz, Anthony J; Gutierrez, Dora M; Wolcott, Duane K; Metz, Thomas R; Madabhushi, Ramakrishna S; Dzenitis, John M; Colston, Billy W


    An automated sample preparation module, based upon sequential injection analysis (SIA), has been developed for use within an autonomous pathogen detection system. The SIA system interfaced aerosol sampling with multiplexed microsphere immunoassay-flow cytometric detection. Metering and sequestering of microspheres using SIA was found to be reproducible and reliable, over 24-h periods of autonomous operation. Four inbuilt immunoassay controls showed excellent immunoassay and system stability over five days of unattended continuous operation. Titration curves for two biological warfare agents, Bacillus anthracis and Yersinia pestis, obtained using the automated SIA procedure were shown to be similar to those generated using a manual microtiter plate procedure.

  16. Implementation of a novel postoperative monitoring system using automated Modified Early Warning Scores (MEWS) incorporating end-tidal capnography. (United States)

    Blankush, Joseph M; Freeman, Robbie; McIlvaine, Joy; Tran, Trung; Nassani, Stephen; Leitman, I Michael


    Modified Early Warning Scores (MEWS) provide real-time vital sign (VS) trending and reduce ICU admissions in post-operative patients. These early warning calculations classically incorporate oxygen saturation, heart rate, respiratory rate, systolic blood pressure, and temperature, but have not previously included end-tidal CO2 (EtCO2), more recently identified as an independent predictor of critical illness. These systems may be subject to failure when physiologic data are incorrectly measured, leading to false alarms and increased workload. This study investigates whether automated devices that perform ongoing vital signs monitoring and MEWS calculations, inclusive of a score for EtCO2, can be feasibly implemented on the general care hospital floor and effectively identify derangements in a post-operative patient's condition while limiting the number of false alarms that would increase provider workload. From July to November 2014, post-operative patients meeting the inclusion criteria (BMI > 30 kg/m(2), history of obstructive sleep apnea, or the use of patient-controlled analgesia (PCA) or epidural narcotics) were monitored using automated devices that record minute-by-minute VS included in classic MEWS calculations as well as EtCO2. Automated messages were sent via pager to providers when the device measured an elevated MEWS, abnormal EtCO2, or oxygen desaturation below 85%. Data, including alarm and message details from the first 133 patients, were recorded and analyzed. Overall, 3.3 alarms and pages sounded per hour of monitoring. Device-only alarms sounded 2.7 times per hour; 21% were technical alarms. The remaining device-only alarms for concerning VS sounded 2.0/h, 70% of them for falsely recorded VS. Pages for abnormal EtCO2 sounded 0.4/h (82% false recordings), while pages for low blood oxygen saturation sounded 0.1/h (55% false alarms). 143 times (0.1 pages/h) the devices calculated a MEWS
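    For orientation, a MEWS is the sum of banded sub-scores for each vital sign. The sketch below uses one commonly published banding; institutional schemes differ, and the study's EtCO2 extension is not reproduced here.

```python
def mews(resp_rate, heart_rate, sbp, temp_c, avpu="alert"):
    """Modified Early Warning Score from vital signs. The bands follow one
    commonly published scheme (illustrative, not this study's); any value
    falling outside every listed band scores the maximum of 3."""
    def band(value, bands):
        for lo, hi, pts in bands:
            if lo <= value <= hi:
                return pts
        return 3
    score = band(resp_rate, [(9, 14, 0), (15, 20, 1), (21, 29, 2)])
    score += band(heart_rate, [(51, 100, 0), (41, 50, 1), (101, 110, 1), (111, 129, 2)])
    score += band(sbp, [(101, 199, 0), (81, 100, 1), (71, 80, 2)])
    score += band(temp_c, [(36.0, 38.4, 0), (35.0, 35.9, 1), (38.5, 41.0, 2)])
    score += {"alert": 0, "voice": 1, "pain": 2, "unresponsive": 3}[avpu]
    return score

# Normal post-operative vitals score 0; deranged vitals accumulate points.
normal = mews(resp_rate=12, heart_rate=85, sbp=120, temp_c=37.0)
deranged = mews(resp_rate=24, heart_rate=115, sbp=85, temp_c=38.8)
```

    An automated device recomputes this sum minute by minute and pages providers once it crosses an alert threshold, which is why false VS readings translate directly into false pages.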

  17. Development of an automated processing system for potential fishing zone forecast (United States)

    Ardianto, R.; Setiawan, A.; Hidayat, J. J.; Zaky, A. R.


    The Institute for Marine Research and Observation (IMRO) of the Ministry of Marine Affairs and Fisheries Republic of Indonesia (MMAF) has developed a potential fishing zone (PFZ) forecast using satellite data, called Peta Prakiraan Daerah Penangkapan Ikan (PPDPI). Since 2005, IMRO has disseminated daily PPDPI maps for fisheries marine ports and 3-day averages for national areas. The accuracy in determining the PFZ and the processing time of the maps depend greatly on the experience of the operators creating them. This paper presents our research in developing an automated processing system for PPDPI in order to increase accuracy and shorten processing time. PFZs are identified by combining MODIS sea surface temperature (SST) and chlorophyll-a (CHL) data in order to detect the presence of upwelling, thermal fronts and biological productivity enhancement; the integration of these phenomena generally represents the PFZ. The whole process, involving data download, map geo-processing and layout, is carried out automatically by Python and ArcPy. The results showed that the automated processing system could be used to reduce dependence on the operator in determining the PFZ and speed up processing time.
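The front-plus-productivity rule described above can be sketched on a toy grid: flag a cell when the local SST gradient indicates a thermal front and chlorophyll-a is elevated. The thresholds and grid values are illustrative assumptions, not IMRO's operational parameters.

```python
# Illustrative PFZ flagging on a toy grid; thresholds are assumptions.

def sst_gradient(sst, i, j):
    """Magnitude of the local SST gradient via central differences."""
    dx = (sst[i][j + 1] - sst[i][j - 1]) / 2.0
    dy = (sst[i + 1][j] - sst[i - 1][j]) / 2.0
    return (dx ** 2 + dy ** 2) ** 0.5

def flag_pfz(sst, chl, grad_thresh=0.5, chl_thresh=0.3):
    """Flag interior cells where a thermal front and high CHL coincide."""
    rows, cols = len(sst), len(sst[0])
    flags = [[False] * cols for _ in range(rows)]
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            front = sst_gradient(sst, i, j) >= grad_thresh
            productive = chl[i][j] >= chl_thresh
            flags[i][j] = front and productive
    return flags

# Toy 4x4 grids: a thermal front crosses the lower-right corner, where
# chlorophyll-a is also elevated.
sst = [[20.0, 20.0, 20.0, 20.0],
       [20.0, 20.0, 21.0, 21.0],
       [20.0, 21.0, 23.0, 23.0],
       [20.0, 21.0, 23.0, 23.0]]
chl = [[0.1, 0.1, 0.1, 0.1],
       [0.1, 0.1, 0.4, 0.4],
       [0.1, 0.4, 0.6, 0.6],
       [0.1, 0.4, 0.6, 0.6]]
flags = flag_pfz(sst, chl)
print(sum(row.count(True) for row in flags))  # -> 3 interior cells flagged
```

An operational pipeline would apply the same per-cell rule to full MODIS rasters inside the Python/ArcPy workflow.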

  18. Opportunities for Automated Demand Response in California’s Dairy Processing Industry

    Energy Technology Data Exchange (ETDEWEB)

    Homan, Gregory K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)


    During periods of peak electrical demand on the energy grid or when there is a shortage of supply, the stability of the grid may be compromised or the cost of supplying electricity may rise dramatically, respectively. Demand response programs are designed to mitigate the severity of these problems and improve reliability by reducing the demand on the grid during such critical times. In 2010, the Demand Response Research Center convened a group of industry experts to suggest potential industries that would be good demand response program candidates for further review. The dairy industry was suggested due to the perception that the industry had suitable flexibility and automatic controls in place. The purpose of this report is to provide an initial description of the industry with regard to demand response potential, specifically automated demand response. This report qualitatively describes the potential for participation in demand response and automated demand response by dairy processing facilities in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use. Typical process equipment and controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Two case studies of demand response at dairy facilities in California and across the country are reviewed. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  19. Validation of the process control system of an automated large scale manufacturing plant. (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H


    The validation procedure for the process control system of a plant for the large scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems with the validation limits, a general validation concept and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  20. An Empirical Study on the Impact of Automation on the Requirements Analysis Process

    Institute of Scientific and Technical Information of China (English)

    Giuseppe Lami; Robert W. Ferguson


    Requirements analysis is an important phase in a software project. The analysis is often performed in an informal way by specialists who review documents looking for ambiguities, technical inconsistencies and incomplete parts. Automation is still far from being applied in requirements analyses, above all since natural languages are informal and thus difficult to treat automatically. There are only a few tools that can analyse texts. One of them, called QuARS, was developed by the Istituto di Scienza e Tecnologie dell'Informazione and can analyse texts in terms of ambiguity. This paper describes how QuARS was used in a formal empirical experiment to assess the impact, in terms of effectiveness and efficacy, of the automation in the requirements review process of a software company.

  1. A fuzzy model for processing and monitoring vital signs in ICU patients

    Directory of Open Access Journals (Sweden)

    Valentim Ricardo AM


    Full Text Available Abstract Background The area of hospital automation has been the subject of much research, addressing relevant issues which can be automated, such as: management and control (electronic medical records, scheduling appointments, hospitalization, among others); communication (tracking patients, staff and materials); development of medical, hospital and laboratory equipment; monitoring (patients, staff and materials); and aid to medical diagnosis (according to each speciality). Methods In this context, this paper presents a fuzzy model for helping medical diagnosis of Intensive Care Unit (ICU) patients whose vital signs are monitored through a multiparameter heart screen. Intelligent systems techniques were used in acquiring the data and processing it (sorting, transforming, among others) into useful information, conducting pre-diagnosis and providing, when necessary, alert signs to the medical staff. Conclusions The use of fuzzy logic applied to the medical area can be very useful when seen as a tool to assist specialists in this area. This paper presented a fuzzy model able to monitor and classify the condition of the vital signs of hospitalized patients, sending alerts according to the pre-diagnosis performed, thus helping the medical diagnosis.
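A minimal sketch of the fuzzy-classification idea: triangular membership functions map a vital sign to linguistic terms, and the strongest term drives a pre-diagnosis alert. The membership breakpoints are illustrative assumptions; the paper's actual rule base is not reproduced here.

```python
# Fuzzy classification sketch with assumed, non-clinical breakpoints.

def triangular(x, a, b, c):
    """Triangular membership: 0 at a and c, peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def classify_heart_rate(bpm):
    """Return the strongest linguistic term and its membership degree."""
    memberships = {
        "bradycardia": triangular(bpm, 20, 40, 60),
        "normal":      triangular(bpm, 50, 75, 100),
        "tachycardia": triangular(bpm, 90, 130, 180),
    }
    term = max(memberships, key=memberships.get)
    return term, memberships[term]

term, degree = classify_heart_rate(72)
print(term)  # -> normal
```

A full model would combine several such fuzzified signs through fuzzy rules before issuing an alert.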

  2. Power up your plant - An introduction to integrated process and power automation

    Energy Technology Data Exchange (ETDEWEB)

    Vasel, Jeffrey


    This paper discusses how a single integrated system can increase energy efficiency, improve plant uptime, and lower life cycle costs. Integrated Process and Power Automation is a new system integration architecture and power strategy that addresses the needs of the process and power generation industries. The architecture is based on Industrial Ethernet standards such as IEC 61850 and Profinet as well as Fieldbus technologies. The energy efficiency gains from integration are discussed in a power generation use case. A power management system success story from a major oil and gas company, Petrobras, is also discussed.

  3. Processing of the WLCG monitoring data using NoSQL (United States)

    Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.


    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  4. An Integrated Solution for both Monitoring and Controlling for Automization Using Wireless Sensor Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    M Gnana Seelan


    Full Text Available Temperature monitoring plays a major role in controlling temperature according to its varied conditions. This process is common in all critical areas such as data centres, server rooms, grid rooms and other rooms equipped with data communication equipment. It is mandatory for each organization/industry to impart such a process, as most of the critical data resides in the data centre along with the network infrastructure, in which various electronic, electrical and mechanical devices are involved in data transmission. These devices depend heavily on environmental factors such as temperature, moisture and humidity, and also emit heat in the form of thermal energy when operating. To dissipate this heat, the server/data centre room(s) are equipped with multiple (distributed) air-conditioning (AC) systems to provide a cooling environment and maintain the temperature level of the room. The proposed paper is a study of the automization of monitoring and controlling temperature as per desired requirements using a WSN network.

  5. Automated feature extraction by combining polarimetric SAR and object-based image analysis for monitoring of natural resource exploitation


    Plank, Simon; Mager, Alexander; Schöpfer, Elisabeth


    An automated feature extraction procedure based on the combination of a pixel-based unsupervised classification of polarimetric synthetic aperture radar data (PolSAR) and an object-based post-classification is presented. High resolution SpotLight dual-polarimetric (HH/VV) TerraSAR-X imagery acquired over the Doba basin, Chad, is used for method development and validation. In an iterative training procedure the best suited polarimetric speckle filter, processing parameters for the following en...

  6. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms. (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov


    Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
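Of the optimization strategies mentioned above, simulated annealing is the simplest to sketch: accept worse candidates with probability exp(-delta/T) while the temperature cools. The objective function, cooling schedule, and parameters below are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def simulated_annealing(objective, x0, step=0.5, t0=1.0, cooling=0.95,
                        iterations=2000, seed=42):
    """Minimize `objective` over one variable with simulated annealing."""
    rng = random.Random(seed)
    x, fx = x0, objective(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iterations):
        candidate = x + rng.uniform(-step, step)
        fc = objective(candidate)
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / T).
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = candidate, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # geometric cooling schedule
    return best_x, best_f

# Minimize a simple bowl; the optimum is at x = 3.
x, f = simulated_annealing(lambda v: (v - 3.0) ** 2, x0=-5.0)
print(round(x, 1))  # close to 3.0
```

In the paper's setting the objective would instead be a sensor-derived measure evaluated on the running microfluidic circuit.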

  7. Process monitoring in international safeguards for reprocessing plants: A demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Ehinger, M.H.


    In the period 1985-1987, the Oak Ridge National Laboratory investigated the possible role of process monitoring for international safeguards applications in fuel reprocessing plants. This activity was conducted under Task C.59, "Review of Process Monitoring Safeguards Technology for Reprocessing Facilities", of the US program of Technical Assistance to the International Atomic Energy Agency (IAEA) Safeguards program. The final phase was a demonstration of process monitoring applied in a prototypical reprocessing plant test facility at ORNL. This report documents the demonstration and test results. 35 figs.

  8. Potentiometric electronic tongue-flow injection analysis system for the monitoring of heavy metal biosorption processes. (United States)

    Wilson, D; del Valle, M; Alegret, S; Valderrama, C; Florido, A


    An automated flow injection potentiometric (FIP) system with electronic tongue detection (ET) is used for the monitoring of biosorption processes of heavy metals on vegetable wastes. Grape stalk wastes are used as biosorbent to remove Cu(2+) ions in a fixed-bed column configuration. The ET is formed by a 5-sensor array with Cu(2+) and Ca(2+)-selective electrodes and electrodes with generic response to heavy-metals, plus an artificial neural network response model of the sensor's cross-response. The real-time monitoring of both the Cu(2+) and the cation exchanged and released (Ca(2+)) in the effluent solution is performed by using flow-injection potentiometric electronic tongue system. The coupling of the electronic tongue with automation features of the flow-injection system allows us to accurately characterize the Cu(2+) ion-biosorption process, through obtaining its breakthrough curves, and the profile of the Ca(2+) ion release. In parallel, fractions of the extract solution are analysed by spectroscopic techniques in order to validate the results obtained with the reported methodology. The sorption performance of grape stalks is also evaluated by means of well-established sorption models.

  9. [Fetal ECG monitoring system based on MCU processing]. (United States)

    Hu, Gang; Chen, Wei; Xie, Xicheng; Zhang, Hao


    In order to monitor the fetus in labor, the characteristics of the signal from a fetal scalp electrode were studied. An adaptive algorithm and a peak-to-peak detection technology are adopted in signal processing, and an adaptive gain control method is used to eliminate disturbance from baseline shift. A fetal ECG monitoring system is designed on the basis of the C8051F020 MCU.
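The peak-to-peak detection idea can be sketched as a thresholded local-maximum search with a refractory period, from which beat intervals and heart rate follow. The threshold, refractory window, sampling rate, and synthetic trace are assumptions for demonstration, not the paper's firmware.

```python
# Illustrative R-wave detection on a synthetic trace; parameters assumed.

def detect_r_peaks(signal, threshold, refractory):
    """Indices of local maxima above `threshold`, at least `refractory`
    samples apart (a simple refractory period suppresses double counts)."""
    peaks = []
    for i in range(1, len(signal) - 1):
        is_local_max = signal[i - 1] < signal[i] >= signal[i + 1]
        if is_local_max and signal[i] >= threshold:
            if not peaks or i - peaks[-1] >= refractory:
                peaks.append(i)
    return peaks

# Synthetic trace: a spike every 50 samples on a flat baseline.
fs = 100  # samples per second (assumed)
trace = [0.0] * 300
for k in range(25, 300, 50):
    trace[k] = 1.0

peaks = detect_r_peaks(trace, threshold=0.5, refractory=20)
intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
bpm = 60.0 / (sum(intervals) / len(intervals))
print(len(peaks), round(bpm))  # -> 6 120
```

A real system would first remove baseline wander (the adaptive gain control step) before thresholding.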

  10. Automated delineation and characterization of watersheds for more than 3,000 surface-water-quality monitoring stations active in 2010 in Texas (United States)

    Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.


    The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.

  11. Adaptive Soa Stack-Based Business Process Monitoring Platform

    Directory of Open Access Journals (Sweden)

    Przemysław Dadel


    Full Text Available Executable business processes that formally describe company activities are well placed in the SOA environment, as they allow for declarative organization of high-level system logic. However, for both technical and non-technical users to fully benefit from that element of abstraction, appropriate business process monitoring systems are required, and existing solutions remain unsatisfactory. The paper discusses the problem of business process monitoring in the context of the service orientation paradigm in order to propose an architectural solution and provide an implementation of a system for business process monitoring that alleviates the shortcomings of the existing solutions. Various platforms are investigated to obtain a broader view of the monitoring problem and to gather functional and non-functional requirements. These requirements constitute input for the further analysis and the system design. The monitoring software is then implemented and evaluated according to the specified criteria. An extensible business process monitoring system was designed and built on top of OSGiMM, a dynamic, event-driven, configurable communications layer that provides real-time monitoring capabilities for various types of resources. The system was tested against the stated functional requirements, and its implementation provides a starting point for the further work. It is concluded that providing a uniform business process monitoring solution that satisfies a wide range of users and business process platform vendors is a difficult endeavor. It is furthermore reasoned that only an extensible, open-source monitoring platform built on top of a scalable communication core has a chance to address all the stated and future requirements.

  12. Implications of critical chain methodology for business process flexible automation projects in economic organizations

    Directory of Open Access Journals (Sweden)

    Paul BRUDARU


    Full Text Available Business process flexible automation projects involve the use of methods and technologies from the Business Process Management (BPM) area that aim at increasing the agility of organizations in changing their business processes in response to environmental changes. BPM-type projects are a mix between process improvement projects and software development, which implies a high complexity in managing them. The successful implementation of these projects involves overcoming inherent problems such as delays in project activities, multi-tasking and lack of focus, which cannot be solved by traditional project management tools. An approach which takes account of the difficulties of BPM projects is the critical chain methodology. Using the critical chain method provides the methodological foundation necessary for the successful completion of BPM-type projects.

  13. A Scheme for Automation of Telecom Data Processing for Business Application

    CERN Document Server

    Nair, T R Gopalakrishnan; V., Suma; Maharajan, Ezhilarasan


    As the telecom industry is witnessing large-scale growth, one of the major challenges in the domain is the analysis and processing of telecom transactional data, which is generated in large volumes by embedded-system communication controllers having various functions. This paper deals with the analysis of such raw data files, which are made up of sequences of tokens. It also describes the method by which the files are parsed to extract the information, leading to final storage in predefined database tables. The parser is capable of reading the file in a line-structured way and storing the tokens into the predefined database tables. The whole process is automated using the SSIS tools available in SQL Server. A log table is maintained at each step of the process, which enables tracking of the file for risk mitigation. The system can extract, transform and load the data, completing the processing.

  14. Automated work-flow for processing high-resolution direct infusion electrospray ionization mass spectral fingerprints

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn


    The use of mass spectrometry (MS) is pivotal in analyses of the metabolome and presents a major challenge for subsequent data processing. While the last few years have given new high performance instruments, there has not been a comparable development in data processing. In this paper we discuss...... an automated data processing pipeline to compare large numbers of fingerprint spectra from direct infusion experiments analyzed by high resolution MS. We describe some of the intriguing problems that have to be addressed. starting with the conversion and pre-processing of the raw data to the final data...... analysis. Illustrated on the direct infusion analysis (ESI-TOF-MS) of complex mixtures the method exploits the full quality of the high-resolution present in the mass spectra. Although the method is illustrated as a new library search method for high resolution MS, we demonstrate that the output...

  15. Development and testing of an automated High-resolution InSAR volcano-monitoring system in the MED-SUV project (United States)

    Chowdhury, Tanvir Ahmed; Minet, Christian; Fritz, Thomas; Rodriguez Gonzalez, Fernando


    Volcanic unrest, which produces a variety of geological and hydrological hazards, is difficult to predict, so it is important to monitor volcanoes continuously. The monitoring of active volcanoes requires the reliable measurement of surface deformation before, during and after volcanic activity. Besides improving the understanding of the geophysical processes underlying the volcanic systems of Vesuvius/Campi Flegrei and Mt. Etna, one of the main goals of the MED-SUV (MEDiterranean SUpersite Volcanoes) project is to design a system for automatically monitoring ground deformation over active volcanoes. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and the small baseline subset algorithm (SBAS) provide powerful tools for observing surface changes with millimeter accuracy. All the mentioned techniques address the challenges by exploiting medium to large SAR image stacks. The generation of interferometric products constitutes a major effort in terms of processing and planning; it requires a high degree of automation, robustness and quality control of the overall process. As a consequence of these requirements and constraints, the Integrated Wide Area Processor (IWAP) developed at DLR is introduced in the framework of a remote sensing task of the MED-SUV project. The IWAP has been conceived and designed to optimize the processing workflow in order to minimize the processing time. Moreover, a quality control concept has been developed and integrated into the workflow. The IWAP is structured into three parts: (i) preparation of an order file containing configuration parameters, which invokes the processor; (ii) manual interactions performed by the operator, upon request from the processor, by means of visual interfaces; (iii) analysis of the final product supported by extensive product visualization. This visualization supports the interpretation of the results without the need of

  16. Annotated bibliography of films in automation, data processing, and computer science

    CERN Document Server

    Soloman, Martin B Jr


    With the rapid development of computer science and the expanding use of computers in all facets of American life, there has been made available a wide range of instructional and informational films on automation, data processing, and computer science. Here is the first annotated bibliography of these and related films, gathered from industrial, institutional, and other sources.This bibliography annotates 244 films, alphabetically arranged by title, with a detailed subject index. Information is also provided concerning the intended audience, rental-purchase data, ordering procedures, and such s

  17. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA


    Full Text Available In order to maintain a competitive edge in a very active banking market, a company requires the implementation of a web-based solution to standardize, optimize and manage the flow of sales/pre-sales and the generation of new leads. This article presents the realization of a development framework for software interoperability in banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for security and confidentiality of stored data, as well as the techniques and procedures identified to implement these requirements.


    Energy Technology Data Exchange (ETDEWEB)

    Schwantes, Jon M.; Orton, Christopher R.; Fraga, Carlos G.; Christensen, Richard; Laspe, Amy R.; Ward, Rebecca M.


    Model and experimental estimates of the Multi-Isotope Process Monitor performance for determining burnup after dissolution and acid concentration during solvent extraction steps during reprocessing of spent nuclear fuel are presented.

  19. Dedicated monolithic infrared spectrometer for process monitoring (United States)

    Chadha, Suneet; Kyle, William; Bolduc, Roy A.; Curtiss, Lawrence E.


    Foster-Miller has leveraged its innovations in IR fiber-optic probes and the recent development of a miniature spectrometer to build a novel IR sensor system for process applications. The developed sensor system is a low-cost alternative to process FTIR and filter-based systems. A monolithic wedge-grating optic provides the spectral dispersion, with low-cost thermopile point or array detectors picking off the diffracted wavelengths from the optic. The integrated optic provides spectral discrimination between 3-12 micrometers with resolution of 8 cm-1 or better and high overall optical throughput. The device has a fixed cylindrical grating uniquely bonded to the edge of a ZnSe conditioning 'wedge'. The conditioning optic overcomes the limitations of concave gratings, as it accepts high-angle light at the narrow end of the wedge and progressively conditions it to be near normal to the grating. On return, the diffracted wavelengths are concentrated on the discrete or array detector elements by the wedge, providing throughput comparable to that of an FTIR. The miniature spectrometer coupled to flow-through liquid cells or multipass gas cells provides a significant cost advantage over conventional sampling methodologies. Currently, we are investigating process applications for the petroleum and dairy markets. The sensor system eliminates the cost, complexity, reliability and bandwidth/resolution problems associated with either Fabry-Perot or Michelson interferometer based approaches for low-cost process applications.

  20. MAC3 Evaluation: Monitoring Process, Documenting Outcomes (United States)

    Korey, Jane


    The role of evaluation is to determine whether a project achieves what it sets out to do. Using a strategy often referred to as "backwards planning" or "backwards research design," the evaluation process operationalizes project goals and then, asking the question "What would success look like?" identifies measurable indices of success (Friedman,…

  1. The missing link to success: using a business process management system to automate and manage process improvement. (United States)

    Hess, Ray


    Healthcare continues to face many significant challenges in its quest to provide optimal patient care. Many hospitals have instituted various process improvement methodologies to address these challenges. The outcome of these efforts still produces a large volume of manual tasks that must be addressed by the caregiver. The Chester County Hospital employed a Business Process Management (BPM) engine to automate and manage several of these processes. A BPM engine can perform key tasks and interact with the clinician to decrease the manual requirements of a process. The result is reduced workloads and improved outcomes. The Chester County Hospital has been able to demonstrate significant decreases in hospital acquired MRSA infections and compliance with several CMS core measures. There are multiple items to evaluate before attempting to use a BPM engine. This paper reviews the work at Chester County, its outcomes and the considerations that were important for achieving success.

  2. Library Automation


    Dhakne, B. N.; Giri, V. V.; Waghmode, S. S.


    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual efforts in library routines. Library automation is applied to collection, storage, administration, processing, preservation, communication, etc.

  3. Automated support for processing special assignment airlift missions (SAAMs): A concept paper

    Energy Technology Data Exchange (ETDEWEB)

    Sexton, A.; Hwang, Ho-Ling.


    The Airlift Development Analysis System (ADANS) is a research and development effort sponsored by the Headquarters, Military Airlift Command (HQ MAC) and conducted by researchers at Oak Ridge National Laboratory (ORNL). The purpose of this effort is to upgrade HQ MAC's automated capabilities for scheduling peacetime/wartime missions, deliberate and execution planning, and analysis of the airlift system. HQ MAC is also integrating its airlift scheduling processes to provide an easier transition from peacetime to wartime duties. The goal of this research and development effort is to provide an integrated system that allows HQ MAC to better maintain its forces in a constant state of readiness. The development for ADANS is scheduled in three increments and is to be completed in September 1992. This paper documents the procedures for processing SAAMs at HQ MAC as of May 1989 and presents functions that will be provided by ADANS to support SAAMs. In general, ADANS will provide HQ MAC Current Operations, Airlift Management Division, Special Airlift Branch (HQ MAC/DOOMS) and Air Transportation, Director of Cargo and Requirements, Special Assignment Airlift Division (HQ MAC/TRKS) with the ability to automate SAAM information management/data processing tasks, along with systematic error and consistency checking. Reports will be generated automatically by the system for analysis of SAAM requirements and airlift allocation. Consequently, better data integrity among various forms and reports can be expected.


    Directory of Open Access Journals (Sweden)

    J. S. Markiewicz


    Full Text Available At present, digital documentation recorded in the form of raster or vector files is the obligatory way of inventorying historical objects. The orthoimage is a cartometric form of photographic presentation of information in a two-dimensional reference system. The paper will discuss the issue of automation of orthoimage generation based on TLS data and digital images. At present, attempts are made to apply modern technologies not only for the needs of surveys, but also during data processing. This paper will present attempts aiming at the utilisation of appropriate algorithms and the author’s application for automatic generation of the projection plane, for the needs of acquisition of intensity orthoimages from the TLS data. Such planes are defined manually in the majority of popular TLS data processing applications. A separate issue related to RGB image generation is the orientation of digital images in relation to scans. This is important particularly in cases when scans and photographs are not taken simultaneously. This paper will present experiments concerning the utilisation of the SIFT algorithm for automatic matching of the intensity orthoimages and digital (RGB) photographs. Satisfactory results have been obtained, both for the automation of the process and for the quality of the resulting orthoimages.
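The SIFT matching step ultimately reduces to nearest-neighbour descriptor search with Lowe's ratio test. This pure-Python sketch applies the ratio test to toy 4-D descriptors; real SIFT descriptors are 128-D, and production pipelines use an optimized matcher.

```python
# Lowe's ratio test on toy descriptors; dimensions and data are assumed.

def dist(a, b):
    """Euclidean distance between two equal-length descriptors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def ratio_test_matches(desc_a, desc_b, ratio=0.8):
    """For each descriptor in desc_a, keep its nearest neighbour in desc_b
    only if it is clearly closer than the second-nearest (Lowe's test)."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = sorted((dist(d, e), j) for j, e in enumerate(desc_b))
        (d1, j1), (d2, _) = dists[0], dists[1]
        if d1 < ratio * d2:
            matches.append((i, j1))
    return matches

# Toy 4-D descriptors from "image A" and "image B".
a = [(0, 0, 1, 1), (5, 5, 0, 0)]
b = [(0, 0, 1, 1.1), (5, 4.9, 0, 0), (9, 9, 9, 9)]
print(ratio_test_matches(a, b))  # -> [(0, 0), (1, 1)]
```

The surviving matches would then feed a robust transform estimation between the orthoimage and the photograph.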

  5. Conflict Monitoring in Dual Process Theories of Thinking (United States)

    De Neys, Wim; Glumicic, Tamara


    Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and a demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoiding decision-making errors, there are widely different views on the efficiency of the process. Kahneman…

  6. A log mining approach for process monitoring in SCADA

    NARCIS (Netherlands)

    Hadziosmanovic, Dina; Bolzoni, Damiano; Hartel, Pieter


    SCADA (Supervisory Control and Data Acquisition) systems are used for controlling and monitoring industrial processes. We propose a methodology to systematically identify potential process-related threats in SCADA. Process-related threats take place when an attacker gains user access rights and perf

  7. EWMAREG control chart applied in the monitoring of industrial processes

    Directory of Open Access Journals (Sweden)

    Danilo Cuzzuol Pedrini


    Full Text Available If the process quality characteristics depend on control variables that vary during process operation, the basic assumptions of control charts are violated. If the values of the control variables are known, it is possible to apply a regression control chart. One of the most recent developments in this area is the EWMAREG chart, which monitors the standardized regression residuals using an exponentially weighted moving average (EWMA) control chart. In this paper, we present a systematic application of the EWMAREG control chart to the monitoring of a simulated chemical-industry process. The characteristic monitored was the corrosion rate of a steel pipe as a function of four process control variables. The tool demonstrated high potential to detect changes in the monitored corrosion rate, helping to ensure process stability.
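
    The residual-based EWMA idea described above can be sketched as follows. This is a minimal illustration, not the authors' EWMAREG implementation: the regression model, the in-control training window, λ = 0.2, and the simulated shift are all assumptions.

```python
import numpy as np

def ewma_residual_chart(y, X, n_train, lam=0.2, L=3.0):
    """EWMAREG-style sketch: regress the quality characteristic on the
    control variables over an in-control training window, then run an EWMA
    chart on the standardized residuals."""
    X1 = np.column_stack([np.ones(len(y)), X])       # add intercept column
    beta, *_ = np.linalg.lstsq(X1[:n_train], y[:n_train], rcond=None)
    resid = y - X1 @ beta
    mu, sd = resid[:n_train].mean(), resid[:n_train].std(ddof=1)
    z = (resid - mu) / sd                            # standardized residuals

    ewma = np.empty_like(z)
    prev = 0.0                                       # chart starts at target 0
    for t, zt in enumerate(z):
        prev = lam * zt + (1 - lam) * prev
        ewma[t] = prev
    k = np.arange(1, len(z) + 1)
    # time-varying control limits: L * sqrt(lam/(2-lam) * (1-(1-lam)^(2k)))
    limits = L * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
    return ewma, limits

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                        # four control variables
y = 2.0 + X @ np.array([0.5, -0.3, 0.8, 0.1]) + rng.normal(scale=0.2, size=200)
y[150:] += 0.5                                       # simulated shift in the monitored rate
ewma, limits = ewma_residual_chart(y, X, n_train=100)
```

    After the simulated shift at sample 150, the EWMA statistic climbs past the upper control limit within a few samples, which is the incipient-change sensitivity the chart is designed for.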

  8. Metrology Sampling Strategies for Process Monitoring Applications

    KAUST Repository

    Vincent, Tyrone L.


    Shrinking process windows in very large scale integration semiconductor manufacturing have already necessitated the development of control systems capable of addressing sub-lot-level variation. Within-wafer control is the next milestone in the evolution of advanced process control from lot-based and wafer-based control. In order to adequately comprehend and control within-wafer spatial variation, inline measurements must be performed at multiple locations across the wafer. At the same time, economic pressures prompt a reduction in metrology, for both capital and cycle-time reasons. This paper explores the use of modeling and minimum-variance prediction as a method to select the sites for measurement on each wafer. The models are developed using the standard statistical tools of principal component analysis and canonical correlation analysis. The proposed selection method is validated using real manufacturing data, and results indicate that it is possible to significantly reduce the number of measurements with little loss in the information obtained for the process control systems. © 2011 IEEE.
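
    The flavor of model-based site selection described above can be sketched with a plain PCA model and a greedy D-optimal criterion. This is an assumed, simplified stand-in: the paper's canonical-correlation step and exact minimum-variance formulation are not reproduced, and the wafer data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
m, k = 49, 3                                    # 49 candidate sites, 3 spatial modes
B = rng.normal(size=(m, k))                     # hypothetical spatial patterns
# historical full-map measurements: one column per wafer, low-rank + noise
train = B @ rng.normal(size=(k, 120)) + 0.05 * rng.normal(size=(m, 120))

mean = train.mean(axis=1)
U, s, Vt = np.linalg.svd(train - mean[:, None], full_matrices=False)
P = U[:, :k]                                    # PCA basis over the sites

# Greedy D-optimal selection: add the site that most increases det(P_S' P_S)
selected = []
for _ in range(6):
    best, best_val = None, -np.inf
    for j in range(m):
        if j in selected:
            continue
        Ps = P[selected + [j]]
        val = np.linalg.slogdet(Ps.T @ Ps + 1e-9 * np.eye(k))[1]
        if val > best_val:
            best, best_val = j, val
    selected.append(best)

# Reconstruct a new wafer map from measurements at the 6 selected sites only
x_new = B @ np.array([1.0, -1.0, 0.5]) + 0.05 * rng.normal(size=m)
t_hat, *_ = np.linalg.lstsq(P[selected], x_new[selected] - mean[selected], rcond=None)
x_hat = mean + P @ t_hat
rel_err = np.linalg.norm(x_hat - x_new) / np.linalg.norm(x_new)
```

    With only 6 of 49 sites measured, the full-wafer map is recovered to within the noise level, which is the measurement-reduction argument the paper makes with real fab data.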

  9. Geocoding uncertainty analysis for the automated processing of Sentinel-1 data using Sentinel-1 Toolbox software (United States)

    Dostálová, Alena; Naeimi, Vahid; Wagner, Wolfgang; Elefante, Stefano; Cao, Senmao; Persson, Henrik


    One of the major advantages of Sentinel-1 data is its capability to provide very high spatio-temporal coverage, allowing the mapping of large areas as well as the creation of dense time series of Sentinel-1 acquisitions. The SGRT software developed at TU Wien aims at automated processing of Sentinel-1 data for global and regional products. The first step of the processing consists of geocoding the Sentinel-1 data with the help of the S1TBX software and resampling them to a common grid. These resampled images serve as input for product derivation. Thus, it is very important to select the most reliable processing settings and assess the geocoding uncertainty for both backscatter and projected local incidence angle images. Within this study, a selection of Sentinel-1 acquisitions over three test areas in Europe was processed manually in the S1TBX software, testing multiple software versions, processing settings, and digital elevation models (DEMs), and the accuracy of the resulting geocoded images was assessed. Subsequently, all available Sentinel-1 data over the areas were processed using the selected settings, and a detailed quality check was performed. Overall, a strong influence of the DEM on geocoding quality was confirmed, with differences of up to 80 meters in areas with higher terrain variation. In flat areas, the geocoding accuracy of backscatter images was generally good, with observed shifts between 0 and 30 m. Larger systematic shifts were identified in the case of projected local incidence angle images. These results encourage the automated processing of large volumes of Sentinel-1 data.

  10. Methods of Complex Data Processing from Technical Means of Monitoring

    Directory of Open Access Journals (Sweden)

    Serhii Tymchuk


    Full Text Available The problem of processing information from different types of monitoring equipment is examined. As a possible solution, the use of generalized information-processing methods is proposed, based on clustering of the combined territorial information sources for monitoring and on a frame model of the knowledge base for identification of the monitored objects. The clustering methods were formed on the basis of the Lance-Williams hierarchical agglomerative procedure using Ward's metric. The frame model of the knowledge base was built using object-oriented modeling tools.
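
    The clustering step named above, the Lance-Williams agglomerative procedure with Ward's update, corresponds to `ward` linkage in SciPy. A small sketch on hypothetical monitoring-source feature vectors (the actual features are not given in the abstract):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
# Hypothetical 2-D feature vectors from three territorial groups of
# monitoring sources, well separated for illustration
sources = np.vstack([
    rng.normal(loc=[0, 0], scale=0.3, size=(10, 2)),
    rng.normal(loc=[5, 0], scale=0.3, size=(10, 2)),
    rng.normal(loc=[0, 5], scale=0.3, size=(10, 2)),
])

# Ward's method is the Lance-Williams agglomerative scheme with Ward's
# distance-update formula; SciPy exposes it as method="ward"
Z = linkage(sources, method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")   # cut the tree at 3 clusters
```

    Cutting the resulting dendrogram at three clusters recovers the three territorial groups exactly in this synthetic setting.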

  11. Application of PLC’s for Automation of Processes in Industries

    Directory of Open Access Journals (Sweden)

    Rahul Pawar


    Full Text Available Several industries utilize sequential industrial processes that are repetitive in nature. For such processes, industries have had to depend on relays, stepping drums, timers, and controls, and experienced considerable difficulty whenever a change in the nature of production necessitated reprogramming; often the whole system had to be scrapped and redesigned. PLC control systems were introduced to overcome these problems. The PLC program can be described as a control ladder comprising a sequence program. A PLC sequence program consists of normally open and normally closed contacts connected in parallel or in series, together with relay coils that turn ON and OFF as the states of these contacts change. In this paper, all aspects of these powerful and versatile tools and their application to process automation are discussed.
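
    The contact-and-coil behavior described above can be illustrated with a tiny scan-cycle simulation of a classic start/stop seal-in rung. The rung itself is a hypothetical textbook example, not one from the paper:

```python
def scan(inputs, state):
    """One PLC scan of a start/stop seal-in rung:

        |--[ START ]--+--[/ STOP ]--( MOTOR )--|
        |--[ MOTOR ]--+

    A normally-open START contact in parallel with the MOTOR seal-in
    contact, in series with a normally-closed STOP contact."""
    motor = (inputs["start"] or state["motor"]) and not inputs["stop"]
    state["motor"] = motor          # coil state persists into the next scan
    return motor

state = {"motor": False}
assert scan({"start": True, "stop": False}, state)       # start pressed: runs
assert scan({"start": False, "stop": False}, state)      # seal-in holds it on
assert not scan({"start": False, "stop": True}, state)   # stop breaks the rung
```

    The seal-in contact is what makes the rung stateful: once energized, the coil keeps itself on through its own contact until the normally-closed stop contact opens.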

  12. Developing Human-Machine Interfaces to Support Monitoring of UAV Automation (United States)


    …automation emerging in UAV systems, an analysis of the changing tasks of UAV crews, and a selective review of relevant research…mixed initiative. A comprehensive review of the trust literature, conducted recently, highlighted design guidelines…which occurs when designers incorporate as much automation as possible into the system without considering the consequences for human operators, such as mental

  13. The Technique of Building a Networked Manufacturing Process Monitoring System

    Institute of Scientific and Technical Information of China (English)

    XIE Yong; ZHANG Yu; YANG Musheng


    This paper introduces the constitution, structure, and software model of a networked manufacturing process monitoring system, using Java network techniques to realize a three-layer distributed system comprising a remote management center, a manufacturing process supervision center, and a measurement-and-control layer with units such as displacement sensors and temperature measurement and alarm devices. With this approach, the network integration of the production management layer, the process control layer, and the hardware control layer is realized. The object-oriented, Java-based design can easily be ported to different operating systems and offers good extensibility.

  14. PLS-based memory control scheme for enhanced process monitoring

    KAUST Repository

    Harrou, Fouzi


    Fault detection is important for the safe operation of modern engineering systems. Partial least squares (PLS) has been widely used in monitoring highly correlated process variables. Conventional PLS-based methods, nevertheless, often fail to detect incipient faults. In this paper, we develop a new PLS-based monitoring chart that combines PLS with a multivariate memory control chart, the multivariate exponentially weighted moving average (MEWMA) chart. The MEWMA chart is sensitive to incipient faults in the process mean, which significantly improves the performance of PLS methods and widens their applicability in practice. Using simulated distillation column data, we demonstrate that the proposed PLS-based MEWMA control chart is more effective in detecting incipient faults in the mean of the multivariate process variables, and outperforms conventional PLS-based monitoring charts.
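
    The memory behavior of the MEWMA statistic can be sketched in isolation. This is a standalone illustration using the common asymptotic-covariance simplification; the PLS projection step of the paper is omitted, and the in-control window and λ = 0.1 are assumptions.

```python
import numpy as np

def mewma(X, lam=0.1, n_incontrol=50):
    """MEWMA T^2 statistics for observations X (n x p), assumed centered
    in control. Uses the asymptotic covariance lam/(2-lam) * Sigma."""
    n, p = X.shape
    sigma = np.cov(X[:n_incontrol].T)            # in-control covariance estimate
    sigma_z_inv = np.linalg.inv(lam / (2 - lam) * sigma)
    z = np.zeros(p)
    t2 = np.empty(n)
    for i in range(n):
        z = lam * X[i] + (1 - lam) * z           # exponentially weighted memory
        t2[i] = z @ sigma_z_inv @ z              # Hotelling-type statistic on z
    return t2

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 4))
X[200:] += 0.5                                   # small (incipient) mean shift
t2 = mewma(X)
```

    Because the statistic accumulates evidence across samples, the T² values after the small 0.5-sigma shift rise well above their in-control level even though each individual observation is barely abnormal.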

  15. Automated Formosat Image Processing System for Rapid Response to International Disasters (United States)

    Cheng, M. C.; Chou, S. C.; Chen, Y. C.; Chen, B.; Liu, C.; Yu, S. J.


    FORMOSAT-2, Taiwan's first remote sensing satellite, was successfully launched in May 2004 into a Sun-synchronous orbit at an altitude of 891 kilometers. With its daily revisit capability, the captured 2-m panchromatic and 8-m multispectral images have been used for research and operations in various societal benefit areas. This paper details the orchestration of the tasks conducted in different institutions in Taiwan in the efforts responding to international disasters. The institutions involved include Taiwan's space agency, the National Space Organization (NSPO); the Center for Satellite Remote Sensing Research of National Central University; the GIS Center of Feng-Chia University; and the National Center for High-performance Computing. Since each institution has its own mandate, the coordinated tasks ranged from receiving emergency observation requests, scheduling and tasking of satellite operations, and downlink to ground stations, through image processing including data injection and ortho-rectification, to delivery of image products. With the lessons learned from working with international partners, the FORMOSAT Image Processing System has been extensively automated and streamlined with the goal of shortening the time between request and delivery in an efficient manner. The integrated team has developed an Application Interface to its system platform that provides functions for searching the archive catalogue, requesting data services, mission planning, inquiring about service status, and image download. This automated system enables timely image acquisition and substantially increases the value of the data products. An example outcome of these efforts, the recent response supporting Sentinel Asia during the Nepal earthquake, is demonstrated herein.

  16. Automation of data processing and calculation of retention parameters and thermodynamic data for gas chromatography (United States)

    Makarycheva, A. I.; Faerman, V. A.


    An analysis of automation patterns is performed, and a programming solution for automating the processing of chromatographic data and their subsequent storage is developed using a software package of Mathcad and MS Excel spreadsheets. The proposed approach permits modification of the data-processing algorithm and does not require the participation of programming experts. It provides calculation of retention times and retention volumes, specific retention volumes, the differential molar free energy of adsorption, partial molar solution enthalpies, and isosteric heats of adsorption. The developed solution is aimed at use in a small research group and was tested on a series of new gas chromatography sorbents; retention parameters and thermodynamic sorption quantities were calculated for more than 20 analytes. The resulting data are presented in a form suitable for comparative analysis and make it possible to identify sorbents with the most favorable properties for specific analytical problems.
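
    The retention-parameter arithmetic being automated can be illustrated with standard textbook GC relations. The symbols, formulas, and example values below are common conventions assumed for illustration, not the authors' exact equations or data:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def net_retention_volume(t_r, t_m, flow, j=1.0):
    """V_N = j * F * (t_R - t_M); j is the James-Martin pressure correction."""
    return j * flow * (t_r - t_m)

def specific_retention_volume(v_n, mass, t_col):
    """V_g = 273.15 * V_N / (T_col * m): net retention volume per gram of
    sorbent, referred to 0 degrees C."""
    return 273.15 * v_n / (t_col * mass)

def delta_g_relative(v_g_a, v_g_b, t_col):
    """Difference in molar free energy of sorption between two analytes:
    dG = -R * T * ln(Vg_a / Vg_b)."""
    return -R * t_col * math.log(v_g_a / v_g_b)

# Illustrative numbers: t_R = 4.80 min, t_M = 0.60 min, F = 30 mL/min,
# 1.2 g of sorbent, column at 400 K
v_n = net_retention_volume(4.80, 0.60, 30.0)      # 126.0 mL
v_g = specific_retention_volume(v_n, 1.2, 400.0)  # about 71.7 mL/g
dg = delta_g_relative(2 * v_g, v_g, 400.0)        # about -2.3 kJ/mol
```

    Chaining such small functions over a table of retention times is exactly the kind of repetitive calculation the paper moves out of manual spreadsheet work.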

  17. An automated and integrated framework for dust storm detection based on ogc web processing services (United States)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.


    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with the scientific computation advancement, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data together to enable the dust storm detection and tracking processes in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Services (WPS) initiated by Open Geospatial Consortium (OGC). The presented WPS framework consists of EO data retrieval components, dust storm detecting and tracking component, and service chain orchestration engine. The EO data processing component is implemented based on OPeNDAP standard. The dust storm detecting and tracking component combines three earth scientific models, which are SBDART model (for computing aerosol optical depth (AOT) of dust particles), WRF model (for simulating meteorological parameters) and HYSPLIT model (for simulating the dust storm transport processes). The service chain orchestration engine is implemented based on Business Process Execution Language for Web Service (BPEL4WS) using open-source software. The output results, including horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework. 
Our aim here is to solve a specific instance of a complex EO data

  18. Automated microfluidic platform of bead-based electrochemical immunosensor integrated with bioreactor for continual monitoring of cell secreted biomarkers (United States)

    Riahi, Reza; Shaegh, Seyed Ali Mousavi; Ghaderi, Masoumeh; Zhang, Yu Shrike; Shin, Su Ryon; Aleman, Julio; Massa, Solange; Kim, Duckjin; Dokmeci, Mehmet Remzi; Khademhosseini, Ali


    There is an increasing interest in developing microfluidic bioreactors and organs-on-a-chip platforms combined with sensing capabilities for continual monitoring of cell-secreted biomarkers. Conventional approaches such as ELISA and mass spectroscopy cannot satisfy the needs of continual monitoring as they are labor-intensive and not easily integrable with low-volume bioreactors. This paper reports on the development of an automated microfluidic bead-based electrochemical immunosensor for in-line measurement of cell-secreted biomarkers. For the operation of the multi-use immunosensor, disposable magnetic microbeads were used to immobilize biomarker-recognition molecules. Microvalves were further integrated in the microfluidic immunosensor chip to achieve programmable operations of the immunoassay including bead loading and unloading, binding, washing, and electrochemical sensing. The platform allowed convenient integration of the immunosensor with liver-on-chips to carry out continual quantification of biomarkers secreted from hepatocytes. Transferrin and albumin productions were monitored during a 5-day hepatotoxicity assessment in which human primary hepatocytes cultured in the bioreactor were treated with acetaminophen. Taken together, our unique microfluidic immunosensor provides a new platform for in-line detection of biomarkers in low volumes and long-term in vitro assessments of cellular functions in microfluidic bioreactors and organs-on-chips.


  20. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing (United States)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo


    The realization of network infrastructures with lower environmental impact, and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration, play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both the position and depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) over 2D acquisitions are well known. Nonetheless, the amount of data acquired in such 3D surveys does not usually allow processing and interpretation to be completed directly in the field and in real time, limiting the overall efficiency of the GPR acquisition. As an example, the "low-impact mini-trench" technique (addressed in ITU - International Telecommunication Union - recommendation L.83) requires that non-destructive mapping of buried services enhance its productivity to match the improvements of new digging equipment. Nowadays, multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high-quality subsurface images while taking full advantage of 3D data: the development of a fully automated, real-time 3D GPR processing system plays a key role in the overall profitability of optical network deployment. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets applied in real time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency).
The proposed processing

  1. Quality control of CT systems by automated monitoring of key performance indicators: a two-year study. (United States)

    Nowik, Patrik; Bujila, Robert; Poludniowski, Gavin; Fransson, Annette


    The purpose of this study was to develop a method of performing routine periodical quality controls (QC) of CT systems by automatically analyzing key performance indicators (KPIs), obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine, where CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In the cases where results were out of tolerance, actions could be initiated in less than 10 min. 900 QC scans from two CT scanners have been collected and analyzed over the two-year period that MonitorCT has been active. Two types of errors have been registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system such that swift actions can be taken in order to ensure the quality of the CT examinations, patient safety, and minimal disruption of service.
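
    The kind of KPI extraction such a system performs on phantom images can be sketched as follows. The ROI placement, simulated HU values, and tolerance numbers are illustrative assumptions, not the MonitorCT settings:

```python
import numpy as np

rng = np.random.default_rng(3)
# Simulated water-phantom slice: mean 0 HU, 5 HU noise (assumed values)
image = rng.normal(loc=0.0, scale=5.0, size=(256, 256))

def roi_stats(img, cy, cx, r=10):
    """Mean and standard deviation inside a circular ROI centered at (cy, cx)."""
    y, x = np.ogrid[:img.shape[0], :img.shape[1]]
    mask = (y - cy) ** 2 + (x - cx) ** 2 <= r ** 2
    return img[mask].mean(), img[mask].std(ddof=1)

# KPIs: CT number of water and image noise from the central ROI,
# uniformity from the worst center-to-periphery mean difference
water_hu, noise = roi_stats(image, 128, 128)
periphery = [roi_stats(image, cy, cx)[0]
             for cy, cx in [(40, 128), (216, 128), (128, 40), (128, 216)]]
uniformity = max(abs(water_hu - p) for p in periphery)

# Illustrative tolerances; a real QC program takes these from its QA protocol
kpis_ok = (abs(water_hu) <= 4.0) and (noise <= 7.0) and (uniformity <= 4.0)
```

    Running such checks on every daily phantom scan, and trending the three numbers over time, is what turns a routine QA image into the automated alarm described in the abstract.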

  2. Classification of Automated Search Traffic (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or possibly maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
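
    The binary human-versus-bot classification step can be sketched with an intentionally simple example. The three session features and their distributions are illustrative assumptions; the paper uses far richer feature sets, multiple classifiers, and a multiclass stage:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
# Illustrative per-session features: [queries per minute, click-through rate,
# mean inter-query gap in seconds]; real systems combine many more signals
human = np.column_stack([rng.normal(2, 0.5, 300),
                         rng.uniform(0.3, 0.9, 300),
                         rng.normal(30, 8, 300)])
bot = np.column_stack([rng.normal(40, 10, 300),
                       rng.uniform(0.0, 0.1, 300),
                       rng.normal(1.5, 0.5, 300)])
X = np.vstack([human, bot])
y = np.r_[np.zeros(300), np.ones(300)]       # 0 = human, 1 = automated

clf = LogisticRegression(max_iter=1000).fit(X, y)
acc = clf.score(X, y)
```

    With features this well separated, a linear classifier is already near-perfect; the hard part in practice, and the point of the paper's active-learning stage, is labeling ambiguous sessions and discovering new bot subclasses.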

  3. An Automated Electronic Tongue for In-Situ Quick Monitoring of Trace Heavy Metals in Water Environment (United States)

    Cai, Wei; Li, Yi; Gao, Xiaoming; Guo, Hongsun; Zhao, Huixin; Wang, Ping


    An automated electronic tongue instrument has been developed for in-situ determination of trace heavy metal concentrations in the water environment. The electronic tongue contains two main parts. The sensor part consists of a silicon-based Hg-coated Au microelectrode array (MEA) for the detection of Zn(II), Cd(II), Pb(II), and Cu(II), and a multiple light-addressable potentiometric sensor (MLAPS) for the detection of Fe(III) and Cr(VI). The control part employs pumps, valves, and tubes to enable the pick-up and pretreatment of aqueous samples. The electronic tongue detects the six metals mentioned above at the part-per-billion (ppb) level without manual operation. This instrument will have wide application in the rapid monitoring and prediction of heavy metal pollution in lakes and oceans.

  4. Monitoring of recombinant protein production using bioluminescence in a semiautomated fermentation process. (United States)

    Trezzani, I; Nadri, M; Dorel, C; Lejeune, P; Bellalou, J; Lieto, J; Hammouri, H; Longin, R; Dhurjati, P


    On-line optimization of fermentation processes can be greatly aided by the availability of information on the physiological state of the cell. The goal of our "BioLux" research project was to design a recombinant cell capable of intracellular monitoring of product synthesis and to use it as part of an automated fermentation system. A recombinant plasmid was constructed containing an inducible promoter that controls the gene coding for a model protein and the genes necessary for bioluminescence. The cells were cultured in microfermenters equipped with an on-line turbidity sensor and a specially designed on-line light sensor capable of continuous measurement of bioluminescence. Initial studies were done under simple culture conditions, and a linear correlation between luminescence and protein production was obtained. Such specially designed recombinant bioluminescent cells can potentially be applied for model-based inference of intracellular product formation, as well as for optimization and control of recombinant fermentation processes.

  5. A Review for Model Plant Mismatch Measures in Process Monitoring

    Institute of Scientific and Technical Information of China (English)

    王洪; 谢磊; 宋执环


    A model is usually necessary for the design of a control loop. Due to simplification and unknown dynamics, model-plant mismatch is inevitable in the control loop. In process monitoring, detection of mismatch and evaluation of its influence are required. In this paper, several mismatch measures are presented based on different model descriptions. They are categorized into groups from different perspectives, and their potential in detection and diagnosis is evaluated. Two case studies, on a mixing process and a distillation process, demonstrate the efficacy of the mismatch-monitoring framework.

  6. Nonlinear Statistical Process Monitoring and Fault Detection Using Kernel ICA

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xi; YAN Wei-wu; ZHAO Xu; SHAO Hui-he


    A novel nonlinear process monitoring and fault detection method based on kernel independent component analysis (ICA) is proposed. The kernel ICA method is a two-phase algorithm: whitened kernel principal component analysis (KPCA) followed by ICA. KPCA whitens the data and makes the data structure as close to linearly separable as possible by virtue of an implicit nonlinear mapping determined by the kernel. ICA then seeks projection directions in the KPCA-whitened space that make the distribution of the projected data as non-Gaussian as possible. Application to a simulated fluid catalytic cracking unit (FCCU) process indicates that the proposed monitoring method based on kernel ICA can effectively capture nonlinear relationships among the process variables. Its performance significantly exceeds that of monitoring methods based on ICA or KPCA alone.
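
    The two-phase scheme (whitened KPCA, then ICA) can be sketched with scikit-learn. This is a minimal illustration on synthetic data; the kernel choice, component counts, and the simple I²-style statistic are assumptions, and the authors' FCCU study is not reproduced:

```python
import numpy as np
from sklearn.decomposition import FastICA, KernelPCA

rng = np.random.default_rng(5)
# Nonlinear in-control process: two latent sources plus a nonlinear reading
s = rng.uniform(-1, 1, size=(500, 2))
X = np.column_stack([s[:, 0], s[:, 1],
                     np.sin(2 * s[:, 0]) + 0.05 * rng.normal(size=500)])

# Phase 1: KPCA maps the data through an implicit nonlinear feature space;
# standardizing the scores approximates the whitening step
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5)
T = kpca.fit_transform(X)
T = (T - T.mean(0)) / T.std(0)

# Phase 2: ICA seeks maximally non-Gaussian directions in the whitened space
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(T)

# Simple I^2-style monitoring statistic with an empirical control limit
i2 = np.sum(S_hat ** 2, axis=1)
limit = np.quantile(i2, 0.99)
```

    New observations would be projected through the same `kpca` and `ica` transforms, and an `i2` value above `limit` flagged as a potential fault.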


    Directory of Open Access Journals (Sweden)

    Ye.I. Sokol


    Full Text Available The paper describes an algorithm for complex automated monitoring of Ukraine’s power energy system, aimed at ensuring the safety of its personnel and equipment. This monitoring involves the use of unmanned aerial vehicles (UAVs) for planned and unplanned registration of the status of power transmission lines (PTLs) and high-voltage substations (HVSs). It is assumed that unscheduled overflights will be made in emergency situations on power lines. With the help of the UAV, pictures of the PTLs and HVSs will be recorded from the air in the optical and infrared ranges, and the strengths of the electric field (EF) and magnetic field (MF) will be measured along the flight route. Specially developed software allows the recorded pictures to be compared with reference patterns, recorded before the UAV flight, that correspond to normal operation of the investigated PTLs and HVSs. Such reference patterns, together with experimentally obtained maps of the HVS’s protective grounding, will be summarized in a single document: a passport of the HVS and PTL. This passport must also contain the measured and calculated EF and MF strength levels in the places where the staff of power facilities stay, as well as the layout of the equipment most vulnerable to the effects of electromagnetic interference. If necessary, as part of ongoing monitoring, recommendations will be given on the design and location of electromagnetic screens that reduce the levels of electromagnetic interference, as well as on the location of lightning rods that reduce the probability of lightning attachment to the objects. The paper presents the analytic expressions that form the basis of the developed software for calculating the EF strength in the vicinity of power lines. This software will be used as a base for UAV navigation along the transmission lines, as well as for detecting violations in transmission line operation. Comparison of the EF strength distributions calculated with the help of the elaborated software with the known

  8. A comparison of timed artificial insemination and automated activity monitoring with hormone intervention in 3 commercial dairy herds. (United States)

    Dolecheck, K A; Silvia, W J; Heersche, G; Wood, C L; McQuerry, K J; Bewley, J M


    The objective of this study was to compare the reproductive performance of cows inseminated based on automated activity monitoring with hormone intervention (AAM) to cows from the same herds inseminated using only an intensive timed artificial insemination (TAI) program. Cows (n=523) from 3 commercial dairy herds participated in this study. To be considered eligible for participation, cows must have been classified with a body condition score of at least 2.50, but no more than 3.50, passed a reproductive tract examination, and experienced no incidences of clinical, recorded metabolic diseases in the current lactation. Within each herd, cows were balanced for parity and predicted milk yield, then randomly assigned to 1 of 2 treatments: TAI or AAM. Cows assigned to the TAI group were subjected to an ovulation synchronization protocol consisting of presynchronization, Ovsynch, and Resynch for up to 3 inseminations. Cows assigned to the AAM treatment were fitted with a leg-mounted accelerometer (AfiAct Pedometer Plus, Afimilk, Kibbutz Afikim, Israel) at least 10 d before the end of the herd voluntary waiting period (VWP). Cows in the AAM treatment were inseminated at times indicated by the automated alert system for up to 90 d after the VWP. If an open cow experienced no AAM alert for a 39±7-d period (beginning at the end of the VWP), hormone intervention in the form of a single injection of either PGF2α or GnRH (no TAI) was permitted as directed by the herd veterinarian. Subsequent to hormone intervention, cows were inseminated when alerted in estrus by the AAM system. Pregnancy was diagnosed by ultrasound 33 to 46 d after insemination. Pregnancy loss was determined via a second ultrasound after 60 d pregnant. Timed artificial insemination cows experienced a median 11.0 d shorter time to first service. Automated activity-monitored cows experienced a median 17.5-d shorter service interval. No treatment difference in probability of pregnancy to first AI, probability

  9. Advanced Process Monitoring Techniques for Safeguarding Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Christopher R.; Bryan, Samuel A.; Schwantes, Jon M.; Levitskaia, Tatiana G.; Fraga, Carlos G.; Peper, Shane M.


    The International Atomic Energy Agency (IAEA) has established international safeguards standards for fissionable material at spent fuel reprocessing plants to ensure that significant quantities of weapons-grade nuclear material are not diverted from these facilities. For large-throughput nuclear facilities, it is difficult to satisfy the IAEA safeguards accountancy goal for detection of abrupt diversion. Current methods for verifying material control and accountancy (MC&A) at these facilities require time-consuming and resource-intensive destructive assay (DA). Leveraging new on-line nondestructive assay (NDA) process monitoring techniques in conjunction with traditional, highly precise DA methods may provide an additional measure of nuclear material accountancy and, potentially, a more timely, cost-effective, and resource-efficient means of safeguards verification at such facilities. By monitoring process control measurements (e.g., flow rates, temperatures, or concentrations of reagents, products, or wastes), abnormal plant operations can be detected. Pacific Northwest National Laboratory (PNNL) is developing on-line NDA process monitoring technologies, including both the Multi-Isotope Process (MIP) Monitor and a spectroscopy-based monitoring system, to potentially reduce the time and resource burden associated with current techniques. The MIP Monitor uses gamma spectroscopy and multivariate analysis to identify off-normal conditions in process streams. The spectroscopic monitor continuously measures chemical compositions of the process streams, including actinide metal ions (U, Pu, Np), selected fission products, and major cold flowsheet chemicals, using UV-Vis, near-IR, and Raman spectroscopy. This paper provides an overview of our methods and reports our ongoing efforts to develop and demonstrate the technologies.
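
As a rough illustration of how multivariate analysis can flag off-normal process streams, a PCA model can be fit to spectra from normal operation and new spectra scored by their residual (Q-statistic). This sketch is a generic technique, not the MIP Monitor's actual algorithm; the spectra, channel counts, and threshold below are invented stand-ins.

```python
import numpy as np

def fit_pca(spectra, n_components=3):
    """Fit a PCA subspace to spectra from normal operation (rows = observations)."""
    mean = spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(spectra - mean, full_matrices=False)
    return mean, vt[:n_components]

def q_statistic(spectrum, mean, axes):
    """Squared residual after projection onto the PCA subspace; large values
    indicate a spectrum unlike anything seen under normal conditions."""
    centered = spectrum - mean
    residual = centered - axes.T @ (axes @ centered)
    return float(residual @ residual)

rng = np.random.default_rng(0)
base = np.abs(np.sin(np.linspace(0, 3, 64)))            # invented "normal" spectral shape
normal = base + 0.01 * rng.standard_normal((50, 64))    # training spectra
mean, axes = fit_pca(normal)

# Alarm threshold: worst Q seen under normal conditions
threshold = max(q_statistic(s, mean, axes) for s in normal)

off_normal = base.copy()
off_normal[20:24] += 0.5   # simulated off-normal feature in a few channels
```

A spectrum whose Q-statistic exceeds the threshold would be flagged for follow-up, which is the sense in which such monitoring complements, rather than replaces, precise destructive assay.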

  10. Working Toward Robust Process Monitoring for Safeguards Applications

    Energy Technology Data Exchange (ETDEWEB)

    Krichinsky, Alan M [ORNL; Bell, Lisa S [ORNL; Gilligan, Kimberly V [ORNL; Laughter, Mark D [ORNL; Miller, Paul [ORNL; Pickett, Chris A [ORNL; Richardson, Dave [ORNL; Rowe, Nathan C [ORNL; Younkin, James R [ORNL


    New safeguards technologies allow continuous monitoring of plant processes. Efforts to deploy these technologies, as described in a preponderance of literature, typically have consisted of case studies attempting to prove their efficacy in proof-of-principle installations. While the enhanced safeguards capabilities of continuous monitoring have been established, studies thus far have not addressed such challenges as manipulation of a system by a host nation. To prevent this and other such vulnerabilities, one technology, continuous load cell monitoring, was reviewed. This paper will present vulnerabilities as well as mitigation strategies that were identified.


  12. a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation (United States)

    Kıvılcım, C. Ö.; Duran, Z.


    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. However, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-related applications for historic building documentation has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open source software environment, using the example project of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  13. Monitoring and analysis of air emissions based on condition models derived from process history

    Directory of Open Access Journals (Sweden)

    M. Liukkonen


    Evaluation of online information on operating conditions is necessary when reducing air emissions in energy plants. In this respect, automated monitoring and control are of primary concern, particularly in biomass combustion. As monitoring of emissions in power plants becomes ever more challenging because of low-grade fuels and fuel mixtures, new monitoring applications are needed to extract essential information from the large amount of measurement data. The management of emissions in energy boilers lacks economically efficient, fast, and competent computational systems that could support decision-making regarding the improvement of emission efficiency. In this paper, a novel emission monitoring platform based on the self-organizing map method is presented. The system is capable not only of visualizing the prevailing status of the process and detecting problem situations (i.e., increased emission release rates), but also of analyzing these situations automatically and presenting factors potentially affecting them. The system is demonstrated using measurement data from an industrial circulating fluidized bed boiler fired by forest residue as the primary fuel and coal as the supporting fuel.
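
A minimal sketch of the self-organizing map idea behind such a platform may help; this is not the authors' implementation, and the process variables and data below are invented.

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=100, seed=0):
    """Tiny self-organizing map: one weight vector per node of a 2-D grid."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((grid[0] * grid[1], data.shape[1]))
    coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], dtype=float)
    for t in range(epochs):
        lr = 0.5 * (1 - t / epochs)              # decaying learning rate
        sigma = 2.0 * (1 - t / epochs) + 0.5     # decaying neighbourhood width
        for x in data:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))        # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)     # grid distance to BMU
            w += lr * np.exp(-d2 / (2 * sigma**2))[:, None] * (x - w)
    return w

def bmu_index(w, x):
    return int(np.argmin(((w - x) ** 2).sum(axis=1)))

# Invented process vectors, e.g. [O2, bed temperature, NOx] after scaling
rng = np.random.default_rng(1)
normal = 0.1 * rng.standard_normal((100, 3))
high_emission = normal + np.array([0.0, 0.0, 3.0])   # elevated-emission cluster
w = train_som(np.vstack([normal, high_emission]))
```

After training, samples from normal and elevated-emission operation land on different map regions, which is what lets such a map both visualize the prevailing process state and flag problem situations.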

  14. Automated measurement and monitoring of bioprocesses: key elements of the M(3)C strategy. (United States)

    Sonnleitner, Bernhard


    The state-of-routine monitoring items established in the bioprocess industry as well as some important state-of-the-art methods are briefly described and the potential pitfalls discussed. Among those are physical and chemical variables such as temperature, pressure, weight, volume, mass and volumetric flow rates, pH, redox potential, gas partial pressures in the liquid and molar fractions in the gas phase, infrared spectral analysis of the liquid phase, and calorimetry over an entire reactor. Classical as well as new optical versions are addressed. Biomass and bio-activity monitoring (as opposed to "measurement") via turbidity, permittivity, in situ microscopy, and fluorescence are critically analyzed. Some new(er) instrumental analytical tools, interfaced to bioprocesses, are explained. Among those are chromatographic methods, mass spectrometry, flow and sequential injection analyses, field flow fractionation, capillary electrophoresis, and flow cytometry. This chapter surveys the principles of monitoring rather than compiling instruments.

  15. Modelling and automation of the process of phosphate ion removal from waste waters

    Directory of Open Access Journals (Sweden)

    L. Lupa


    Phosphate removal from waste waters has become an environmental necessity, since these phosphates stimulate the growth of aquatic plants and plankton and contribute to the eutrophication process in general. The physicochemical methods of phosphate ion removal are the most effective and reliable. This paper presents studies on the removal of phosphate ions from waste waters of the fertiliser industry, using co-precipitation with iron salts and with calcium hydroxide as the neutralizing agent. The optimal process conditions were established as those that allow a maximum degree of separation of the phosphate ions. The precipitate resulting from the co-precipitation process was analysed to determine its chemical composition and its thermal and structural stability, and to establish in which form the phosphate ions occur in the precipitate. Based on these considerations, the experimental data obtained in the process of phosphate ion removal from waste waters were analysed mathematically, and equations were formulated for the dependence of the degree of phosphate separation and the residual concentration on the main parameters of the process. Finally, an automated scheme for phosphate ion removal from waste waters by co-precipitation is presented.

  16. "SmartMonitor"--an intelligent security system for the protection of individuals and small properties with the possibility of home automation. (United States)

    Frejlichowski, Dariusz; Gościewska, Katarzyna; Forczmański, Paweł; Hofman, Radosław


    "SmartMonitor" is an intelligent security system based on image analysis that combines the advantages of alarm, video surveillance, and home automation systems. The system is a complete solution that automatically reacts to every learned situation in a pre-specified way and has various applications, e.g., home and surrounding protection against unauthorized intrusion, crime detection, or supervision over ill persons. The software is based on well-known and proven methods and algorithms for visual content analysis (VCA) that were appropriately modified and adapted to fit specific needs, creating a video processing model that consists of foreground region detection and localization, candidate object extraction, object classification, and tracking. In this paper, the "SmartMonitor" system is presented along with its architecture, employed methods and algorithms, and object analysis approach. Some experimental results on system operation are also provided. The paper focuses on one of the aforementioned functionalities of the system, namely supervision over ill persons.
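
The foreground region detection step of such a VCA pipeline is commonly built on background subtraction. The sketch below shows the generic technique, not necessarily the exact method used by "SmartMonitor"; the frame values and thresholds are invented.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Exponential running-average background model."""
    return (1 - alpha) * bg + alpha * frame

def foreground_mask(bg, frame, thresh=25.0):
    """Pixels that differ strongly from the background are candidate objects."""
    return np.abs(frame - bg) > thresh

bg = np.zeros((4, 4))                    # learned empty-scene background
frame = np.zeros((4, 4))
frame[1:3, 1:3] = 200.0                  # a bright moving object enters
mask = foreground_mask(bg, frame)        # localizes the candidate region
bg = update_background(bg, frame)        # background slowly absorbs the scene
```

The binary mask feeds the later stages (candidate object extraction, classification, tracking), while the slow background update keeps gradual lighting changes from raising alarms.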

  17. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C


    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed worldwide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  18. Effects of Secondary Task Modality and Processing Code on Automation Trust and Utilization During Simulated Airline Luggage Screening (United States)

    Phillips, Rachel; Madhavan, Poornima


    The purpose of this research was to examine the impact of environmental distractions on human trust and utilization of automation during the process of visual search. Participants performed a computer-simulated airline luggage screening task with the assistance of a 70% reliable automated decision aid (called DETECTOR) both with and without environmental distractions. The distraction was implemented as a secondary task in either a competing modality (visual) or non-competing modality (auditory). The secondary task processing code either competed with the luggage screening task (spatial code) or with the automation's textual directives (verbal code). We measured participants' system trust, perceived reliability of the system (when a target weapon was present and absent), compliance, reliance, and confidence when agreeing and disagreeing with the system under both distracted and undistracted conditions. Results revealed that system trust was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Perceived reliability of the system (when the target was present) was significantly higher when the secondary task was visual rather than auditory. Compliance with the aid increased in all conditions except for the auditory-verbal condition, where it decreased. Similar to the pattern for trust, reliance on the automation was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Confidence when agreeing with the system decreased with the addition of any kind of distraction; however, confidence when disagreeing increased with the addition of an auditory secondary task but decreased with the addition of a visual task. A model was developed to represent the research findings and demonstrate the relationship between secondary task modality, processing code, and automation use. Results suggest that the nature of environmental distractions influence

  19. Knowledge management and process monitoring of pharmaceutical processes in the quality by design paradigm. (United States)

    Rathore, Anurag S; Bansal, Anshuman; Hans, Jaspinder


    Pharmaceutical processes are complex and highly variable in nature. The complexity and variability associated with these processes result in inconsistent and sometimes unpredictable process outcomes. To deal with the complexity and understand the causes of variability in these processes, in-depth knowledge and thorough understanding of the process and the various factors affecting the process performance become critical. This makes knowledge management and process monitoring an indispensable part of the process improvement efforts for any pharmaceutical organization.

  20. Prototypic automated continuous recreational water quality monitoring of nine Chicago beaches (United States)

    Dawn Shively,; Nevers, Meredith; Cathy Breitenbach,; Phanikumar, Mantha S.; Kasia Przybyla-Kelly,; Ashley M. Spoljaric,; Richard L. Whitman,


    Predictive empirical modeling is used in many locations worldwide as a rapid, alternative recreational water quality management tool to eliminate delayed notifications associated with traditional fecal indicator bacteria (FIB) culturing (referred to as the persistence model, PM) and to prevent errors in releasing swimming advisories. The goal of this study was to develop a fully automated water quality management system for multiple beaches using predictive empirical models (EM) and state-of-the-art technology. Many recent EMs rely on samples or data collected manually, which adds to analysis time and increases the burden to the beach manager. In this study, data from water quality buoys and weather stations were transmitted through cellular telemetry to a web hosting service. An executable program simultaneously retrieved and aggregated data for regression equations and calculated EM results each morning at 9:30 AM; results were transferred through RSS feed to a website, mapped to each beach, and received by the lifeguards to be posted at the beach. Models were initially developed for five beaches, but by the third year, 21 beaches were managed using refined and validated modeling systems. The adjusted R2 of the regressions relating Escherichia coli to hydrometeorological variables for the EMs were greater than those for the PMs, and ranged from 0.220 to 0.390 (2011) and 0.103 to 0.381 (2012). Validation results in 2013 revealed reduced predictive capabilities; however, three of the originally modeled beaches showed improvement in 2013 compared to 2012. The EMs generally showed higher accuracy and specificity than those of the PMs, and sensitivity was low for both approaches. In 2012 EM accuracy was 70–97%; specificity, 71–100%; and sensitivity, 0–64% and in 2013 accuracy was 68–97%; specificity, 73–100%; and sensitivity 0–36%. Factors that may have affected model capabilities include instrument malfunction, non-point source inputs, and sparse
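
The core of such an empirical model (EM) is a regression of log-transformed E. coli on hydrometeorological predictors, judged by adjusted R² and by advisory accuracy, specificity, and sensitivity. The sketch below runs on synthetic data: the 235 CFU/100 mL advisory threshold follows the common U.S. EPA criterion, but the predictors, coefficients, and sample sizes are invented.

```python
import numpy as np

def fit_em(X, y):
    """Least-squares regression of log10(E. coli) on predictor columns."""
    A = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    yhat = A @ beta
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    n, p = A.shape
    adj_r2 = 1 - (ss_res / (n - p)) / (ss_tot / (n - 1))
    return beta, yhat, adj_r2

def advisory_metrics(pred, obs, threshold=np.log10(235.0)):
    """Accuracy/specificity/sensitivity of predicted advisory exceedances."""
    ep, eo = pred > threshold, obs > threshold
    tp, tn = np.sum(ep & eo), np.sum(~ep & ~eo)
    fp, fn = np.sum(ep & ~eo), np.sum(~ep & eo)
    return ((tp + tn) / len(obs),
            tn / (tn + fp) if tn + fp else float("nan"),
            tp / (tp + fn) if tp + fn else float("nan"))

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 2))          # e.g. turbidity, wave height (scaled)
y = 2.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.standard_normal(60)
beta, yhat, adj_r2 = fit_em(X, y)
accuracy, specificity, sensitivity = advisory_metrics(yhat, y)
```

In an automated deployment like the one described, the predictor values would arrive via buoy and weather-station telemetry and the fitted equation would be evaluated each morning before posting.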

  1. An automated image processing method to quantify collagen fibre organization within cutaneous scar tissue. (United States)

    Quinn, Kyle P; Golberg, Alexander; Broelsch, G Felix; Khan, Saiqa; Villiger, Martin; Bouma, Brett; Austen, William G; Sheridan, Robert L; Mihm, Martin C; Yarmush, Martin L; Georgakoudi, Irene


    Standard approaches to evaluate scar formation within histological sections rely on qualitative evaluations and scoring, which limits our understanding of the remodelling process. We have recently developed an image analysis technique for the rapid quantification of fibre alignment at each pixel location. The goal of this study was to evaluate its application for quantitatively mapping scar formation in histological sections of cutaneous burns. To this end, we utilized directional statistics to define maps of fibre density and directional variance from Masson's trichrome-stained sections for quantifying changes in collagen organization during scar remodelling. Significant increases in collagen fibre density are detectable soon after burn injury in a rat model. Decreased fibre directional variance in the scar was also detectable between 3 weeks and 6 months after injury, indicating increasing fibre alignment. This automated analysis of fibre organization can provide objective surrogate endpoints for evaluating cutaneous wound repair and regeneration.
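
The directional variance used to quantify fibre alignment can be computed with standard circular statistics on axial data, doubling the angles so that θ and θ + π describe the same fibre axis. This is a small generic sketch, independent of the authors' implementation.

```python
import numpy as np

def directional_variance(theta, weights=None):
    """Directional variance of fibre orientations theta in [0, pi).

    Angles are doubled so that an axis and its opposite direction coincide;
    the result is 0 for perfectly aligned fibres and approaches 1 for
    uniformly random orientations."""
    z = np.exp(2j * np.asarray(theta))
    r = np.abs(np.average(z, weights=weights))   # mean resultant length
    return 1.0 - r

aligned = np.full(100, 0.3)                                   # one shared axis
random = np.random.default_rng(0).uniform(0, np.pi, 100_000)  # isotropic fibres
```

Per-pixel orientation maps (e.g. from a structure tensor) can be fed through the same formula with fibre-density weights, which is how decreasing directional variance can track increasing fibre alignment during scar remodelling.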

  2. Use of an Automated Image Processing Program to Quantify Recombinant Adenovirus Particles (United States)

    Obenauer-Kutner, Linda J.; Halperin, Rebecca; Ihnat, Peter M.; Tully, Christopher P.; Bordens, Ronald W.; Grace, Michael J.


    Electron microscopy has a pivotal role as an analytical tool in pharmaceutical research. However, digital image data have proven to be too large for efficient quantitative analysis. We describe here the development and application of an automated image processing (AIP) program that rapidly quantifies shape measurements of recombinant adenovirus (rAd) obtained from digitized field emission scanning electron microscope (FESEM) images. The program was written using the macro-recording features within Image-Pro® Plus software. The macro program, which is linked to a Microsoft Excel spreadsheet, consists of a series of subroutines designed to automatically measure rAd vector objects from the FESEM images. The application and utility of this macro program has enabled us to rapidly and efficiently analyze very large data sets of rAd samples while minimizing operator time.

  3. Analysis of irradiated U-7wt%Mo dispersion fuel microstructures using automated image processing (United States)

    Collette, R.; King, J.; Buesch, C.; Keiser, D. D.; Williams, W.; Miller, B. D.; Schulthess, J.


    The High Performance Research Reactor Fuel Development (HPPRFD) program is responsible for developing low enriched uranium (LEU) fuel substitutes for high performance reactors fueled with highly enriched uranium (HEU) that have not yet been converted to LEU. The uranium-molybdenum (U-Mo) fuel system was selected for this effort. In this study, fission gas pore segmentation was performed on U-7wt%Mo dispersion fuel samples at three separate fission densities using an automated image processing interface developed in MATLAB. Pore size distributions were attained that showed both expected and unexpected fission gas behavior. In general, it proved challenging to identify any dominant trends when comparing fission bubble data across samples from different fuel plates due to varying compositions and fabrication techniques. The results exhibited fair agreement with the fission density vs. porosity correlation developed by the Russian reactor conversion program.
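
The pore segmentation step can be illustrated with thresholding plus connected-component labelling. The study used a MATLAB interface; the stand-alone sketch below shows the generic technique, with an invented image and threshold.

```python
import numpy as np

def label_pores(binary):
    """4-connected component labelling via flood fill (no external deps)."""
    labels = np.zeros(binary.shape, dtype=int)
    sizes, current = [], 0
    for i, j in zip(*np.nonzero(binary)):
        if labels[i, j]:
            continue                       # pixel already belongs to a pore
        current += 1
        stack, count = [(i, j)], 0
        labels[i, j] = current
        while stack:
            y, x = stack.pop()
            count += 1
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                        and binary[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = current
                    stack.append((ny, nx))
        sizes.append(count)
    return labels, sizes

img = np.ones((32, 32))
img[2:6, 2:6] = 0.0       # a 16-pixel pore
img[20:22, 20:25] = 0.0   # a 10-pixel pore
binary = img < 0.5        # pores are darker than the metal matrix
labels, sizes = label_pores(binary)
porosity = binary.mean()
```

The per-pore `sizes` list gives the pore size distribution, and `porosity` gives the area fraction that can be compared against fission density correlations.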

  4. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection. (United States)

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel


    The assessment of rodent mammary gland morphology is largely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods including image processing and quantification of the whole ductal tree and of the terminal end buds as well. It allows to accurately and objectively measure both growth parameters and fine morphological glandular structures. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software and thus largely used by scientists studying rodent mammary gland morphology.

  5. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke. (United States)

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T


    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h after symptom onset from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow, cerebral blood volume, and mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms, including a piecewise continuous regression method to determine bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method, and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area on motion-corrected MTT maps and compared that with time-to-peak (TTP) maps using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 ± 2.5 s (mean ± s.d.) without motion correction and 267 ± 80 s (mean ± s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image
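
Gamma-variate curve fitting, mentioned for the AIF analysis, uses the classic bolus-shape model C(t) = A (t - t0)^alpha exp(-(t - t0)/beta) for t > t0. The sketch below fits it to synthetic data; it is not the authors' code, and all parameter values are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    """Gamma-variate bolus model; zero before the bolus arrival time t0."""
    dt = np.clip(t - t0, 0.0, None)
    return A * dt**alpha * np.exp(-dt / beta)

t = np.linspace(0.0, 30.0, 120)
true_params = (5.0, 4.0, 2.0, 1.5)        # A, t0 (arrival), alpha, beta
rng = np.random.default_rng(0)
signal = gamma_variate(t, *true_params) + 0.05 * rng.standard_normal(t.size)

# Bounded fit keeps alpha and beta positive so the model stays well-defined
popt, _ = curve_fit(gamma_variate, t, signal, p0=(1.0, 3.0, 1.0, 1.0),
                    bounds=([0.0, 0.0, 0.1, 0.1], [50.0, 20.0, 10.0, 10.0]))
```

The fitted t0 recovers the bolus arrival time (BAT), and the fit residual is one way to express the curve-fitting error analyzed in the abstract.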

  6. Porosity of additive manufacturing parts for process monitoring (United States)

    Slotwinski, J. A.; Garboczi, E. J.


    Some metal additive manufacturing processes can produce parts with internal porosity, either intentionally (with careful selection of the process parameters) or unintentionally (if the process is not well-controlled). Material porosity is undesirable for aerospace parts, since porosity could lead to premature failure, but desirable for some biomedical implants, since surface-breaking pores allow for better integration with biological tissue. Changes in a part's porosity during an additive manufacturing build may also be an indication of an undesired change in the process. We are developing an ultrasonic sensor for detecting changes in porosity in metal parts during fabrication on a metal powder bed fusion system, for use as a process monitor. This paper describes our work to develop an ultrasonic-based sensor for monitoring part porosity during an additive build, including background theory, the development and detailed characterization of reference additive porosity samples, and a potential design for in-situ implementation.

  7. An intelligent path to quality—process monitoring and control (United States)

    Wen, Sheree


    The potential application of process monitoring and control in various industries is infinite and the impact is major. With computer-aided design and manufacturing, the process and tools can be managed beyond the reach of human hands. In the current environment, where computing power is increasing on an exponential scale and application software has been developing like blossoming spring flowers, material manufacturers can easily capitalize on these advantages. Conversely, major research is still needed in the in-situ sensing of material properties, the processing environment during fabrication processes, and adaptive control schemes to feed these parameters back to the process controllers.

  8. Automated Planning of Science Products Based on Nadir Overflights and Alerts for Onboard and Ground Processing (United States)

    Chien, Steve A.; McLaren, David A.; Rabideau, Gregg R.; Mandl, Daniel; Hengemihle, Jerry


    A set of automated planning algorithms is the current operations baseline approach for the Intelligent Payload Module (IPM) of the proposed Hyper spectral Infrared Imager (HyspIRI) mission. For this operations concept, there are only local (e.g. non-depletable) operations constraints, such as real-time downlink and onboard memory, and the forward sweeping algorithm is optimal for determining which science products should be generated onboard and on ground based on geographical overflights, science priorities, alerts, requests, and onboard and ground processing constraints. This automated planning approach was developed for the HyspIRI IPM concept. The HyspIRI IPM is proposed to use an X-band Direct Broadcast (DB) capability that would enable data to be delivered to ground stations virtually as it is acquired. However, the HyspIRI VSWIR and TIR instruments will produce approximately 1 Gbps data, while the DB capability is 15 Mbps for a approx. =60X oversubscription. In order to address this mismatch, this innovation determines which data to downlink based on both the type of surface the spacecraft is overflying, and the onboard processing of data to detect events. For example, when the spacecraft is overflying Polar Regions, it might downlink a snow/ice product. Additionally, the onboard software will search for thermal signatures indicative of a volcanic event or wild fire and downlink summary information (extent, spectra) when detected, thereby reducing data volume. The planning system described above automatically generated the IPM mission plan based on requested products, the overflight regions, and available resources.
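
A greatly simplified sketch of priority-driven product selection under a downlink budget follows; it is illustrative only. The product names, priorities, and sizes are invented, and the real planner also handles overflight geometry, alerts, and onboard memory.

```python
def plan_products(candidates, downlink_budget_mb):
    """Pick the highest-priority products that fit the downlink budget
    (priority 1 = most urgent; ties keep input order)."""
    chosen, used = [], 0.0
    for name, priority, size_mb in sorted(candidates, key=lambda c: c[1]):
        if used + size_mb <= downlink_budget_mb:
            chosen.append(name)
            used += size_mb
    return chosen

candidates = [
    ("snow_ice_summary", 2, 40.0),
    ("volcano_alert", 1, 5.0),
    ("full_swath_raw", 3, 900.0),
    ("wildfire_spectra", 1, 10.0),
]
plan = plan_products(candidates, downlink_budget_mb=60.0)
```

Under a 60 MB budget the oversized raw swath is dropped while the small, high-priority event summaries survive, mirroring the abstract's strategy of downlinking summary information instead of full-rate data.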

  9. Development of an automated system for continuous monitoring of powered roof support in longwall panel

    Institute of Scientific and Technical Information of China (English)



    This paper describes the development of an intrinsically safe system for continuous monitoring of the load and convergence of powered roof supports installed at longwall faces. The system was developed to monitor the behavior of a powered support in a mechanized longwall sublevel caving face. The logging system can be programmed to log data from the sensors at intervals ranging from 16 h down to 1 ms, recording variations in the hydraulic pressure in the legs and the convergence of the support during progressive face advance. For recording dynamic loads, the data logger can be programmed to start fast logging, say at 10 ms intervals, when the pressure in a leg reaches a pre-specified threshold value, and to stop fast logging automatically when the pressure drops below this threshold value.
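
The threshold-triggered fast-logging behaviour can be sketched as a small state machine. This is illustrative only; the threshold and intervals below are invented, not those of the deployed system.

```python
from dataclasses import dataclass, field

@dataclass
class LegPressureLogger:
    """Switches between slow and fast logging around a pressure threshold."""
    threshold: float
    slow_interval_s: float = 60.0
    fast_interval_s: float = 0.01          # e.g. 10 ms while above threshold
    fast: bool = False
    samples: list = field(default_factory=list)

    def next_interval(self, pressure):
        # Enter fast mode when pressure reaches the threshold and drop
        # back to slow mode automatically once it falls below again.
        self.fast = pressure >= self.threshold
        self.samples.append((pressure, self.fast))
        return self.fast_interval_s if self.fast else self.slow_interval_s

log = LegPressureLogger(threshold=300.0)   # hypothetical leg-pressure setting
intervals = [log.next_interval(p) for p in (250, 310, 320, 290)]
```

Each call returns the delay until the next sample, so a dynamic load burst above the threshold is captured at high resolution without filling storage during quiet periods.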

  10. Automated Grid Monitoring for the LHCb Experiment Through HammerCloud

    CERN Document Server

    Dice, Bradley


    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  11. The AAL project: Automated monitoring and intelligent AnaLysis for the ATLAS data taking infrastructure

    CERN Document Server

    Magnoni, L; The ATLAS collaboration; Kazarov, A


    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for filtering and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting, and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The huge flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful in...

  13. Statistical process control methods for expert system performance monitoring. (United States)

    Kahn, M G; Bailey, T C; Steib, S A; Fraser, V J; Dunagan, W C


    The literature on the performance evaluation of medical expert systems is extensive, yet most of the techniques used in the early stages of system development are inappropriate for deployed expert systems. Because extensive clinical and informatics expertise and resources are required to perform evaluations, efficient yet effective methods of monitoring performance during the long-term maintenance phase of the expert system life cycle must be devised. Statistical process control techniques provide a well-established methodology that can be used to define policies and procedures for continuous, concurrent performance evaluation. Although the field of statistical process control was developed for monitoring industrial processes, its tools, techniques, and theory are easily transferred to the evaluation of expert systems. Statistical process control tools provide convenient visual methods and heuristic guidelines for detecting meaningful changes in expert system performance. The underlying statistical theory provides estimates of the detection capabilities of alternative evaluation strategies. This paper describes a set of statistical process control tools that can be used to monitor the performance of a number of deployed medical expert systems. It describes how p-charts are used in practice to monitor the GermWatcher expert system. The case volume and error rate of GermWatcher are then used to demonstrate how different inspection strategies would perform.
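
The p-chart monitoring described above can be sketched in a few lines. This is a generic illustration of attribute control limits; the error counts, sample sizes, and limits below are hypothetical, not GermWatcher's actual figures:

```python
import math

def p_chart_limits(error_counts, sample_sizes):
    """Centre line and 3-sigma limits for a p-chart.

    error_counts[i] errors were observed out of sample_sizes[i] cases
    in monitoring period i.
    """
    p_bar = sum(error_counts) / sum(sample_sizes)  # overall error proportion
    limits = []
    for n in sample_sizes:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        lcl = max(0.0, p_bar - 3 * sigma)
        ucl = min(1.0, p_bar + 3 * sigma)
        limits.append((lcl, ucl))
    return p_bar, limits

def out_of_control(error_counts, sample_sizes):
    """Flag periods whose error proportion falls outside the limits."""
    _, limits = p_chart_limits(error_counts, sample_sizes)
    flags = []
    for errors, n, (lcl, ucl) in zip(error_counts, sample_sizes, limits):
        p = errors / n
        flags.append(p < lcl or p > ucl)
    return flags
```

A period whose error rate jumps well above the historical proportion (e.g. 15 errors in 100 cases against a ~5% baseline) is flagged for expert review, which is the kind of signal the inspection strategies above compare.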

  14. FY 2009 Progress: Process Monitoring Technology Demonstration at PNNL

    Energy Technology Data Exchange (ETDEWEB)

    Arrigo, Leah M.; Christensen, Ronald N.; Fraga, Carlos G.; Liezers, Martin; Peper, Shane M.; Thomas, Elizabeth M.; Bryan, Samuel A.; Douglas, Matthew; Laspe, Amy R.; Lines, Amanda M.; Peterson, James M.; Ward, Rebecca M.; Casella, Amanda J.; Duckworth, Douglas C.; Levitskaia, Tatiana G.; Orton, Christopher R.; Schwantes, Jon M.


    Pacific Northwest National Laboratory (PNNL) is developing and demonstrating three technologies designed to assist in the monitoring of reprocessing facilities in near-real time. These technologies include 1) a multi-isotope process monitor (MIP), 2) a spectroscopy-based monitor that uses UV-Vis-NIR (ultraviolet-visible-near infrared) and Raman spectrometers, and 3) an electrochemically modulated separations approach (EMS). The MIP monitor uses gamma spectroscopy and pattern recognition software to identify off-normal conditions in process streams. The UV-Vis-NIR and Raman spectroscopic monitoring continuously measures chemical compositions of the process streams including actinide metal ions (uranium, plutonium, neptunium), selected fission products, and major cold flow sheet chemicals. The EMS approach provides an on-line means for separating and concentrating elements of interest out of complex matrices prior to detection via nondestructive assay by gamma spectroscopy or destructive analysis with mass spectrometry. A general overview of the technologies and ongoing demonstration results are described in this report.

  15. Automation or De-automation (United States)

    Gorlach, Igor; Wessel, Oliver


    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to the high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality for reasons related to plant location, such as inadequate worker skills and motivation. Hence, the automation strategy should be formulated on the basis of an analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility, in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  16. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    Energy Technology Data Exchange (ETDEWEB)

    Sudowe, Ralf [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program and Health Physics Dept.; Roman, Audrey [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program; Dailey, Ashlee [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program; Go, Elaine [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program


    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or large numbers of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as of mass loading, flow rate and resin performance, is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is being evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and is also exhibiting signs of a possible synergistic effect in the presence of iron.

  17. Application of Citect in an Automated Stereoscopic Warehouse Monitoring and Control System

    Institute of Scientific and Technical Information of China (English)

    王涛; 王忠庆


    This paper introduces an application of the Citect development platform in an automated stereoscopic warehouse monitoring and control system. Based on the characteristics of the stereoscopic warehouse, the system implements data monitoring and control operations during the running process, and provides data processing, alarming and data storage functions. Operational experience shows that the system is convenient to operate, offers a good human-machine interface and achieves high reliability.

  18. Evaluation of the Colin STBP-680 at rest and during exercise: an automated blood pressure monitor using R-wave gating. (United States)

    Bond, V; Bassett, D R; Howley, E T; Lewis, J; Walker, A J; Swan, P D; Tearney, R J; Adams, R G


    The application of automated blood pressure measurement during exercise has been limited by inaccuracies introduced by the accompanying motion and noise. We evaluated a newly developed automated blood pressure monitor for measuring exercise blood pressure (Colin STBP-680; Colin, San Antonio, Texas, USA). The STBP-680 uses acoustic transduction, with the electrocardiogram R-wave triggering the sampling period for blood pressure measurement. The automated monitor readings were compared with simultaneous technician mercury sphygmomanometric readings in the same arm. Blood pressure was measured in 18 men at rest and during exercise at 40% VO2 peak (low intensity), 70% VO2 peak (moderate intensity) and VO2 peak (high intensity) on the cycle ergometer. Mean (s.d.) systolic blood pressure differences between the automated monitor and mercury manometer readings at rest and during exercise at low, moderate and high work intensities were 3(0) mmHg, 3(2) mmHg, 1(1) mmHg and 0(11) mmHg, respectively (analysis of variance; P > 0.05). Resting diastolic blood pressure obtained with the STBP-680 was similar to the mercury manometer readings (78(10) versus 81(7) mmHg; P > 0.05). Exercise diastolic pressure at the low work intensity was almost identical between the automated monitor and mercury manometer readings (64(8) versus 65(10) mmHg; not significant). Diastolic blood pressure readings between the STBP-680 and mercury manometer showed a greater difference at the moderate and high workloads (11 mmHg and 9 mmHg, respectively), but this difference was not significant (P > 0.05). (ABSTRACT TRUNCATED AT 250 WORDS)
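
The mean(s.d.) difference figures reported above come from simple paired-difference statistics between the automated monitor and the reference sphygmomanometer. A minimal sketch (the readings below are made up for illustration, not the study's data):

```python
import math

def mean_sd_difference(device_readings, reference_readings):
    """Mean and sample standard deviation of paired differences between
    an automated monitor and simultaneous reference (manual) readings."""
    diffs = [d - r for d, r in zip(device_readings, reference_readings)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in diffs) / (n - 1)) if n > 1 else 0.0
    return mean, sd
```

A mean difference near zero with a small standard deviation, as in the systolic results above, indicates close agreement between the two methods.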

  19. Automated process flowsheet synthesis for membrane processes using genetic algorithm: role of crossover operators

    KAUST Repository

    Shafiee, Alireza


    In optimization-based process flowsheet synthesis, optimization methods, including genetic algorithms (GA), are used as advantageous tools to select a high-performance flowsheet by ‘screening’ large numbers of possible flowsheets. In this study, we expand the role of GA to include flowsheet generation by proposing a modified Greedy sub-tour crossover operator. Performance of the proposed crossover operator is compared with four other commonly used operators. The proposed GA optimization-based process synthesis method is applied to generate the optimum process flowsheet for a multicomponent membrane-based CO2 capture process. Within defined constraints and using the random-point crossover, a CO2 purity of 0.827 (equivalent to 0.986 on a dry basis) is achieved, a 3.4% improvement over the simplest crossover operator applied. In addition, the least variability in the converged flowsheet and CO2 purity is observed for the random-point crossover operator, which implies closeness of the solution to the global optimum and hence the consistency of the algorithm. The proposed crossover operator is found to improve the convergence speed of the algorithm by 77.6%.
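
As a rough illustration of the random-point crossover compared above (not the paper's implementation; the flowsheet encoding as an equal-length list of unit-operation codes is a hypothetical assumption):

```python
import random

def random_point_crossover(parent_a, parent_b, rng=None):
    """One random cut point; the two children swap tails.

    Parents are equal-length sequences of unit-operation codes
    (hypothetical flowsheet encoding).
    """
    rng = rng or random.Random()
    if len(parent_a) != len(parent_b):
        raise ValueError("parents must have equal length")
    cut = rng.randrange(1, len(parent_a))  # cut strictly inside the sequence
    child_1 = parent_a[:cut] + parent_b[cut:]
    child_2 = parent_b[:cut] + parent_a[cut:]
    return child_1, child_2
```

Operators such as the Greedy sub-tour crossover differ in that they preserve permutation validity of the offspring, which matters when a flowsheet is encoded as an ordering of units rather than a free string of codes.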

  20. Memory-type control charts for monitoring the process dispersion

    NARCIS (Netherlands)

    Abbas, N.; Riaz, M.; Does, R.J.M.M.


    Control charts have been broadly used for monitoring the process mean and dispersion. Cumulative sum (CUSUM) and exponentially weighted moving average (EWMA) control charts are memory control charts as they utilize the past information in setting up the control structure. This makes CUSUM and EWMA-t
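
The memory property described above, where each plotted point carries weighted past information, can be illustrated with the standard EWMA recursion (generic textbook form with exact time-varying limits; the smoothing constant and limit width below are conventional illustrative choices, not the paper's):

```python
def ewma_chart(data, target, sigma, lam=0.2, L=3.0):
    """EWMA control chart: z_i = lam * x_i + (1 - lam) * z_{i-1}, z_0 = target.

    Returns the EWMA values and the indices of samples that signal
    an out-of-control condition.
    """
    z = target
    zs, signals = [], []
    for i, x in enumerate(data):
        z = lam * x + (1 - lam) * z
        zs.append(z)
        # exact (time-varying) variance factor of the EWMA statistic
        var = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * (i + 1)))
        limit = L * sigma * var ** 0.5
        if abs(z - target) > limit:
            signals.append(i)
    return zs, signals
```

Because each z accumulates a weighted history of past observations, a small sustained shift in the process mean triggers a signal after a few samples, whereas a Shewhart chart with the same data might never signal.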

  1. Monitoring sodium in commercially processed foods from stores and restaurants (United States)

    Most of the sodium we eat comes from commercially processed foods from stores and restaurants. Sodium reduction in these foods is a key component of several recent public health efforts. Agricultural Research Service (ARS) of USDA, CDC and FDA have launched a collaborative program to monitor sodium ...

  2. Monitoring of anaerobic digestion processes: A review perspective

    DEFF Research Database (Denmark)

    Madsen, Michael; Holm-Nielsen, Jens Bo; Esbensen, Kim


    to a new level of reliability and effectiveness. It is shown, how proper involvement of process sampling understanding, Theory of Sampling (TOS), constitutes a critical success factor. We survey the more recent trends within the field of AD monitoring and the powerful PAT/TOS/chemometrics application...

  3. Acoustic monitoring of a fluidized bed coating process

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Veski, Peep; Pedersen, Joan G.


    The aim of the study was to investigate the potential of acoustic monitoring of a production-scale fluidized bed coating process. The correlations between sensor signals and the estimated amount of film applied and the percentage release, respectively, were investigated in coating potassium chloride...

  4. Facility Effluent Monitoring Plan for the 325 Radiochemical Processing Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Shields, K.D.; Ballinger, M.Y.


    This Facility Effluent Monitoring Plan (FEMP) has been prepared for the 325 Building Radiochemical Processing Laboratory (RPL) at the Pacific Northwest National Laboratory (PNNL) to meet the requirements in DOE Order 5400.1, ''General Environmental Protection Programs.'' This FEMP has been prepared for the RPL primarily because it has a ''major'' (potential to emit >0.1 mrem/yr) emission point for radionuclide air emissions according to the annual National Emission Standards for Hazardous Air Pollutants (NESHAP) assessment performed. This section summarizes the airborne and liquid effluents and the inventory based NESHAP assessment for the facility. The complete monitoring plan includes characterization of effluent streams, monitoring/sampling design criteria, a description of the monitoring systems and sample analysis, and quality assurance requirements. The RPL at PNNL houses radiochemistry research, radioanalytical service, radiochemical process development, and hazardous and radioactive mixed waste treatment activities. The laboratories and specialized facilities enable work ranging from that with nonradioactive materials to work with picogram to kilogram quantities of fissionable materials and up to megacurie quantities of other radionuclides. The special facilities within the building include two shielded hot-cell areas that provide for process development or analytical chemistry work with highly radioactive materials and a waste treatment facility for processing hazardous, mixed radioactive, low-level radioactive, and transuranic wastes generated by PNNL activities.

  5. Photometric Monitoring of Active Galactic Nuclei in the Center for Automated Space Science: Preliminary Results (United States)

    Culler, Ryan; Deckard, Monica; Guilaran, Fonsie; Watson, Casey; Carini, Michael; Gelderman, Richard; Neely, William


    In this paper, we will present preliminary results of our program to photometrically monitor a set of Active Galactic Nuclei (AGN) known as Blazars. Using CCDs as N-star photometers and a technique known as aperture photometry, we can achieve close to 0.02 magnitude precision with small to midsize telescopes. Blazars are highly luminous and highly variable; studying these variations provides insight into the central engines producing the high luminosities. We report on our reduction and analysis of CCD data obtained at one of our collaborating institutions, the NF Observatory at Western New Mexico University. CCD data obtained at the Western Kentucky University 24 inch telescope will also be discussed.
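
The N-star (ensemble) differential photometry mentioned above reduces, at its core, to comparing the target's measured flux against the summed flux of several comparison stars in the same frame, which cancels frame-to-frame atmospheric variations. A minimal sketch (the fluxes are illustrative, not survey data):

```python
import math

def differential_magnitude(target_flux, comparison_fluxes):
    """Instrumental differential magnitude of a target against the
    summed flux of N comparison stars measured in the same CCD frame."""
    ensemble = sum(comparison_fluxes)
    return -2.5 * math.log10(target_flux / ensemble)
```

Tracking this differential magnitude across frames yields the blazar light curve; its night-to-night scatter for constant check stars is what sets the ~0.02 mag precision quoted above.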

  6. Automated System Of Monitoring Of The Physical Condition Of The Staff Of The Enterprise (United States)

    Pilipenko, A.


    This work addresses the applied problem of increasing the safety of engineering procedures and production by monitoring the condition of employees. The author proposes an operating algorithm, together with structural and basic electrical schematics, for a system that collects data on the condition of the enterprise's employees and on selected parameters of the surrounding environment. The article also proposes an approach to improving the efficiency of management decision-making at the enterprise through prompt analysis of information about an employee's condition and work productivity, as well as the various parameters influencing these factors.

  7. Automated modal tracking and fatigue assessment of a wind turbine based on continuous dynamic monitoring

    Directory of Open Access Journals (Sweden)

    Oliveira Gustavo


    The paper describes the implementation of a dynamic monitoring system at a 2.0 MW onshore wind turbine. The system is composed of two components aimed at structural integrity and fatigue assessment. The first component enables continuous tracking of the modal characteristics of the wind turbine (natural frequency values, modal damping ratios and mode shapes) in order to detect abnormal deviations of these properties, which may be caused by the occurrence of structural damage. The second component allows estimation of the remaining fatigue lifetime of the structure based on the analysis of the measured cycles of structural vibration.

  8. Generation and monitoring of a discrete stable random process

    CERN Document Server

    Hopcraft, K I; Matthews, J O


    A discrete stochastic process with stationary power law distribution is obtained from a death-multiple immigration population model. Emigrations from the population form a random series of events which are monitored by a counting process with finite-dynamic range and response time. It is shown that the power law behaviour of the population is manifested in the intermittent behaviour of the series of events. (letter to the editor)

  9. Monitoring solar energetic particles with an armada of European spacecraft and the new automated SEPF (Solar Energetic Proton Fluxes) Tool (United States)

    Sandberg, I.; Daglis, I. A.; Anastasiadis, A.; Balasis, G.; Georgoulis, M.; Nieminen, P.; Evans, H.; Daly, E.


    previous presentations and papers that the exploration and analysis of SREM data may contribute significantly to investigations and modeling efforts of SPE generation and propagation in the heliosphere and in the Earth's magnetosphere. ISARS/NOA recently released an automated software tool for the monitoring of Solar Energetic Proton Fluxes (SEPF) using measurements of SREM. The SEPF tool is based on the automated implementation of the inverse method developed by ISARS/NOA, permitting the calculation of high-energy proton fluxes from SREM data. Results of the method have been validated for a selected number of past solar energetic particle events using measurements from other space-borne proton monitors. The SEPF tool unfolds downlinked SREM count rates, calculates the omnidirectional differential proton fluxes and provides results to the space weather community, acting as a multi-point proton flux monitor on a daily basis. The SEPF tool is a significant European space weather asset and will support the efforts towards an efficient European Space Situational Awareness programme.

  10. A new highly automated sputter equipment for in situ investigation of deposition processes with synchrotron radiation (United States)

    Döhrmann, Ralph; Botta, Stephan; Buffet, Adeline; Santoro, Gonzalo; Schlage, Kai; Schwartzkopf, Matthias; Bommel, Sebastian; Risch, Johannes F. H.; Mannweiler, Roman; Brunner, Simon; Metwalli, Ezzeldin; Müller-Buschbaum, Peter; Roth, Stephan V.


    HASE (Highly Automated Sputter Equipment) is a new mobile setup developed to investigate deposition processes with synchrotron radiation. HASE is based on an ultra-high vacuum sputter deposition chamber equipped with an in-vacuum sample pick-and-place robot. This enables a fast and reliable sample change without breaking the vacuum conditions and helps to save valuable measurement time, which is required for experiments at synchrotron sources like PETRA III at DESY. An advantageous arrangement of several sputter guns, mounted on a rotative flange, gives the possibility to sputter under different deposition angles or to sputter different materials on the same substrate. The chamber is also equipped with a modular sample stage, which allows for the integration of different sample environments, such as a sample heating and cooling device. The design of HASE is unique in its flexibility. The combination of several different sputtering methods like standard deposition, glancing angle deposition, and high pressure sputter deposition, combined with heating and cooling possibilities of the sample, the large exit windows, and the degree of automation, facilitates many different grazing incidence X-ray scattering experiments, such as grazing incidence small and wide angle X-ray scattering, in one setup. In this paper we describe in detail the design and the performance of the new equipment and present the installation of the HASE apparatus at the Micro and Nano focus X-ray Scattering beamline (MiNaXS) at PETRA III. Furthermore, we describe the measurement options and present some selected results. The HASE setup has been successfully commissioned and is now available for users.

  11. Process control and recovery in the Link Monitor and Control Operator Assistant (United States)

    Lee, Lorrine; Hill, Randall W., Jr.


    This paper describes our approach to providing process control and recovery functions in the Link Monitor and Control Operator Assistant (LMCOA). The focus of the LMCOA is to provide semi-automated monitor and control to support station operations in the Deep Space Network. The LMCOA will be demonstrated with precalibration operations for Very Long Baseline Interferometry on a 70-meter antenna. Precalibration, the task of setting up the equipment to support a communications link with a spacecraft, is a manual, time consuming and error-prone process. One problem with the current system is that it does not provide explicit feedback about the effects of control actions. The LMCOA uses a Temporal Dependency Network (TDN) to represent an end-to-end sequence of operational procedures and a Situation Manager (SM) module to provide process control, diagnosis, and recovery functions. The TDN is a directed network representing precedence, parallelism, precondition, and postcondition constraints. The SM maintains an internal model of the expected and actual states of the subsystems in order to determine if each control action executed successfully and to provide feedback to the user. The LMCOA is implemented on a NeXT workstation using Objective C, Interface Builder and the C Language Integrated Production System.
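
The TDN described above is essentially a directed graph executed under precedence and pre/postcondition constraints. A minimal sketch of such execution with failure detection (the step names and the success-checking callback are hypothetical, and the real LMCOA additionally models parallelism, timing, and recovery):

```python
from collections import deque

def run_tdn(steps, edges, execute):
    """Execute a Temporal Dependency Network sketch.

    steps: list of step names; edges: (before, after) precedence pairs;
    execute(step): runs a step and returns True if its postconditions hold.
    Steps are run in topological order, stopping at the first failure.
    """
    preds = {s: set() for s in steps}
    succs = {s: set() for s in steps}
    for a, b in edges:
        preds[b].add(a)
        succs[a].add(b)
    ready = deque(s for s in steps if not preds[s])
    done, failed = [], None
    while ready:
        step = ready.popleft()
        if not execute(step):      # postcondition check failed
            failed = step
            break
        done.append(step)
        for nxt in succs[step]:
            preds[nxt].discard(step)
            if not preds[nxt]:     # all preconditions now satisfied
                ready.append(nxt)
    return done, failed
```

Tracking which steps completed and which failed gives exactly the explicit feedback about control actions that the abstract says the existing system lacks.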

  12. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

    CERN Document Server

    Kruger, Uwe


    The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike.  Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering.  The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applica

  13. Batch process monitoring based on multilevel ICA-PCA

    Institute of Scientific and Technical Information of China (English)

    Zhi-qiang GE; Zhi-huan SONG


    In this paper, we describe a new batch process monitoring method based on multilevel independent component analysis and principal component analysis (MLICA-PCA). Unlike the conventional multi-way principal component analysis (MPCA) method, MLICA-PCA provides a separate interpretation for multilevel batch process data. Batch process data are partitioned into two levels: the within-batch level and the between-batch level. In each level, the Gaussian and non-Gaussian components of process information can be separately extracted. I2, T2 and SPE statistics are individually built and monitored. The new method facilitates fault diagnosis: since the two variation levels are decomposed, the variables responsible for faults in each level can be identified and interpreted more easily. A case study of the DuPont benchmark process showed that the proposed method was more efficient and interpretable in fault detection and diagnosis, compared to the alternative batch process monitoring method.
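
The T2 and SPE statistics monitored at each level can be sketched with plain PCA. This is the generic single-level form, not the paper's multilevel ICA-PCA decomposition, and the training data and component count are illustrative:

```python
import numpy as np

def fit_pca_monitor(X, n_components):
    """Fit a PCA monitoring model on normal-operation data X (samples x vars).

    Returns the variable means, loading matrix P, and retained
    component variances needed for the T2 and SPE statistics.
    """
    mean = X.mean(axis=0)
    Xc = X - mean
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                        # loadings (vars x comps)
    var = (s[:n_components] ** 2) / (len(X) - 1)   # retained component variances
    return mean, P, var

def t2_spe(x, mean, P, var):
    """T2 (within-model) and SPE/Q (residual) statistics for a new sample."""
    xc = x - mean
    t = P.T @ xc                  # scores in the PCA subspace
    t2 = float(np.sum(t ** 2 / var))
    resid = xc - P @ t            # part of the sample outside the subspace
    spe = float(resid @ resid)
    return t2, spe
```

T2 flags unusual movement inside the retained subspace, while SPE flags samples that break the correlation structure entirely; comparing each against control limits estimated from the training data is what constitutes the monitoring.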

  14. Automated process parameters tuning for an injection moulding machine with soft computing§

    Institute of Scientific and Technical Information of China (English)

    Peng ZHAO; Jian-zhong FU; Hua-min ZHOU; Shu-biao CUI


    In injection moulding production, the tuning of the process parameters is a challenging job, which relies heavily on the experience of skilled operators. In this paper, taking into consideration operator assessment during moulding trials, a novel intelligent model for automated tuning of process parameters is proposed. It consists of case-based reasoning (CBR), empirical model (EM), and fuzzy logic (FL) methods. CBR and EM are used to imitate the recall and intuitive thoughts of skilled operators, respectively, while FL is adopted to simulate the skilled operator's optimization thoughts. First, CBR is used to set up the initial process parameters. If CBR fails, EM is employed to calculate the initial parameters. Next, a moulding trial is performed using the initial parameters. Then FL is adopted to optimize these parameters and correct defects repeatedly until the moulded part is found to be satisfactory. Based on the above methodologies, intelligent software was developed and embedded in the controller of an injection moulding machine. Experimental results show that the intelligent software can be effectively used in practical production, and it greatly reduces the dependence on the experience of the operators.
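
The CBR-then-EM fallback for the initial parameters can be sketched as follows. The similarity measure, the acceptance threshold, and the feature/parameter names are all hypothetical illustrations, not the paper's actual design:

```python
def similarity(a, b):
    """Normalised similarity over shared numeric features (toy measure)."""
    keys = set(a) & set(b)
    if not keys:
        return 0.0
    d = sum(abs(a[k] - b[k]) / (abs(a[k]) + abs(b[k]) + 1e-9) for k in keys)
    return 1.0 - d / len(keys)

def initial_parameters(case_base, part_features, empirical_model):
    """CBR first: reuse parameters of the most similar past case;
    fall back to an empirical model if no case is close enough."""
    best, best_score = None, 0.0
    for case in case_base:
        score = similarity(part_features, case["features"])
        if score > best_score:
            best, best_score = case, score
    if best is not None and best_score >= 0.8:   # assumed CBR acceptance threshold
        return dict(best["parameters"])
    return empirical_model(part_features)
```

The FL stage then iterates on these initial parameters after each moulding trial, adjusting them according to fuzzy rules that encode how an operator would correct each observed defect.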

  15. Current good manufacturing practice in plant automation of biological production processes. (United States)

    Dorresteijn, R C; Wieten, G; van Santen, P T; Philippi, M C; de Gooijer, C D; Tramper, J; Beuvery, E C


    The production of biologicals is subject to strict governmental regulations. These are drawn up in current good manufacturing practices (cGMP), among others by the U.S. Food and Drug Administration. To implement cGMP in a production facility, plant automation becomes an essential tool. For this purpose, Manufacturing Execution Systems (MES) have been developed that control all operations inside a production facility. The introduction of these recipe-driven control systems, which follow the ISA S88 standards for batch processes, has made it possible to implement cGMP regulations in the control strategy of biological production processes. In addition, an MES offers features such as stock management, planning and routing tools, process-dependent control, implementation of software sensors and predictive models, application of historical data, and on-line statistical techniques for trend analysis and detection of instrumentation failures. This paper focuses on the development of new production strategies in which cGMP guidelines are an essential part.

  16. Automating Finance (United States)

    Moore, John


    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  17. Development of automated welding process for field fabrication of thick walled pressure vessels. Fourth quarter, FY 1980

    Energy Technology Data Exchange (ETDEWEB)


    Progress is reported in research on the automated welding of heavy steel plate for the fabrication of pressure vessels. Information is included on: torch and shield adaptation; mechanical control of the welding process; welding parameters; joint design; filler wire optimization; nondestructive testing of welds; and weld repair. (LCL)

  18. Laboratory support for the didactic process of engineering processes automation at the Faculty of Mechanical Engineering

    Directory of Open Access Journals (Sweden)

    G. Wszołek


    Purpose: The scope of the paper is to present the effects of creating laboratory support for the didactic process of automatic control of engineering processes. Design/methodology/approach: The discussed laboratory framework is a complex system, flexible in terms of further development, operating on four basic levels: a rudimentary level serving general introductory classes on the subject; an advanced level suitable for specialisation classes; hardware and software for individual or team assignments completed in the course of self-study, semester projects, and BSc and MSc theses; and a sophisticated level designed for PhD and DSc researchers. Findings: Close cooperation with industry and practical implementation of joint research projects play a crucial role in the functioning of the laboratory framework. Practical implications: The education of modern engineers and Masters of Science in automatic control and robotics is a challenging task which may be successfully accomplished only if faced with industrial reality. Continuously advancing industrial companies demand graduates who can quickly adjust to the workflow and who can instantly utilize the knowledge and skills acquired in the complex, interdisciplinary field of mechatronics. Originality/value: The discussed laboratory framework successfully couples software and hardware, providing a complex yet flexible system open for further development, enabling teaching and research into the design and operation of modern control systems, both by means of virtual construction and testing in simulation programs and on real industrial structures configured in laboratory workstations.

  19. 3D cutting tool-wear monitoring in the process

    Energy Technology Data Exchange (ETDEWEB)

    Cerce, Luka; Pusavec, Franci; Kopac, Janez [University of Ljubljana, Askerceva (Slovenia)]


    The tool-wear of cutting tools has a very strong impact on product quality as well as on the efficiency of machining processes. Therefore, its in-process characterization is crucial. This paper presents an innovative and reliable direct measuring procedure for measuring spatial cutting tool-wear using a laser profile sensor. The technique makes it possible to determine 3D wear profiles, an advantage over currently used 2D techniques. The influence of the orientation of the measurement head on the accuracy and the amount of reliable data captured was examined, and the optimal setup of the measuring system was defined. Further, a special clamping system was designed to mount the measurement device on the machine tool turret. To test the measurement system, a tool-life experiment was performed. Additionally, a new tool-life criterion was developed that includes spatial characteristics of the tool-wear. The results showed that the novel tool-wear and tool-life diagnostics represent an objective and robust estimator of the machining process. Such automation of tool-wear diagnostics on the machine tool also provides higher productivity and quality of the machining process.
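
A 3D wear profile lends itself to simple volumetric wear measures. Below is a minimal sketch that integrates material loss between a reference (unworn) and a worn height map; the regular-grid format and the loss criterion are assumptions for illustration, not the paper's actual tool-life criterion:

```python
def wear_volume(reference, worn, dx, dy):
    """Spatial wear volume from two height maps of the cutting edge.

    reference/worn: lists of rows of surface heights sampled on a regular
    grid with spacing dx x dy; material loss is counted wherever the worn
    surface lies below the reference surface.
    """
    volume = 0.0
    for ref_row, worn_row in zip(reference, worn):
        for h_ref, h_worn in zip(ref_row, worn_row):
            loss = h_ref - h_worn
            if loss > 0:
                volume += loss * dx * dy
    return volume
```

Unlike a single 2D flank-wear width, such a volumetric quantity captures where on the edge the wear is concentrated, which is the kind of spatial characteristic the new tool-life criterion exploits.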

  20. The Error Monitoring and Processing System in Alcohol Use

    Directory of Open Access Journals (Sweden)

    Menizibeya O. Welcome


Background: Current data suggest that alcohol might play a significant role in error commission. Error commission is related to the functions of the Error Monitoring and Processing System (EMPS), located in the substantia nigra of the midbrain, the basal ganglia, and the cortex of the forebrain. The main components of the EMPS are the dopaminergic system and the anterior cingulate cortex. Although recent data show that alcohol disrupts the EMPS, the ways in which alcohol affects this system are poorly understood. Aims & Objectives: We reviewed recent data that suggest an indirect effect of alcohol use on error commission. Methods / Study Design: Databases were searched for relevant literature using the following keyword combination: Alcohol AND Error Commission (OR Processing, Monitoring, Correction, Detection). Literature was searched in scientific databases (Medline, DOAJ, Embase) from 1940 to August 2010 and on journal websites (Psychophysiology, Neuroscience and Trends in Neuroscience). Manual book searches, including library information, were included in the data collection process; other additional information was searched through Google. Results / Findings: Blood and brain glucose levels play a vital role in error commission and are related to error commission, monitoring and processing through modulation of the activity of the dopaminergic system. To summarize our findings, we suggest an Alcohol-Related Glucose-Dependent System of Error Monitoring and Processing (ARGD-EMPS) hypothesis, which holds that disruption of the EMPS is related to the competency of glucose homeostasis regulation, which in turn may determine the dopamine level as a major component of the EMPS. The ARGD-EMPS hypothesis explains the general processes and mechanism of alcohol-related disruption of the EMPS. Conclusion: Alcohol may indirectly disrupt the EMPS by affecting the dopamine level through disorders of blood glucose homeostasis regulation.


    Esson, Douglas W; Nollens, Hendrik H; Schmitt, Todd L; Fritz, Kevin J; Simeone, Claire A; Stewart, Brent S


    A female harbor seal pup rescued along the coast of San Diego on 13 June 2012 was diagnosed with bilateral mature cataracts, apparently congenital, in association with vitreal herniation in the anterior chamber of each eye. The cataracts were surgically removed on 1 August 2012 with single-port aphakic phacoemulsification and automated anterior vitrectomy. Postoperative monitoring during the next several weeks indicated that vision had been functionally repaired and that she could visually orient to and capture live fish in three different environments and in the presence of other animals. Consequently, we equipped the seal with a satellite-linked radio transmitter and returned her to the Pacific Ocean on 21 November 2012, and then monitored her movements until radio contact ended on 2 March 2013. She remained along the San Diego coast from 21 November until 5 December 2012 when she relocated to the Coronado Islands and remained there until 26 December. She then traveled directly to San Clemente Island and remained foraging in the near-shore kelp beds there through 2 March 2013, when radio contact ended. To our knowledge, this is the first published report of cataract treatment in a marine mammal using high-frequency ultrasound to emulsify the lenses followed by suction removal of the emulsified microfragments (i.e., phacoemulsification). Moreover, the rapid postoperative recovery of the seal and its quick acclimation, orientation, navigation, and foraging in marine habitats after return to the Pacific Ocean indicates that these surgical procedures can be safe and effective treatments for cataracts in seals, with substantially reduced postsurgical complications relative to other types of lens fragmentation and removal procedures.

  2. A semi-automated AFM photomask repair process for manufacturing application using SPR6300 (United States)

    Dellagiovanna, Mario; Yoshioka, Hidenori; Miyashita, Hiroyuki; Murai, Shiaki; Nakaue, Takuya; Takaoka, Osamu; Uemoto, Atsushi; Kikuchi, Syuichi; Hagiwara, Ryoji; Benard, Stephane


For almost a decade the nanomachining application has been studied and developed to repair next-generation photomasks. This technique, based on Atomic Force Microscopy (AFM), mechanically removes defects with almost negligible quartz damage, high edge-placement accuracy, and none of the spurious depositions (stain, implanted elements, etc.) that may affect optical transmission. SII NanoTechnology Inc. (SIINT) is carrying out a joint development project with DNP Photomask Europe S.p.A. (DPE) that has allowed the installation at DPE of the next-generation, state-of-the-art AFM-based system SPR6300 to meet the repair specifications for the 65 nm node. Drift phenomena of the AFM probe represent one of the major obstacles for any kind of nano-manipulation (imaging and material or pattern modification): AFM drift undermines the repeatability and accuracy of the process. The repair methodology implemented on SPR6300, called NewDLock, is a semi-automated procedure by which the drift amount, regardless of its origin, is estimated in advance and compensated during the process. The AFM nanomachining approach thus achieves the repeatability and ease of use that make it suitable for the production environment.

  3. SigMate: a Matlab-based automated tool for extracellular neuronal signal processing and analysis. (United States)

    Mahmud, Mufti; Bertoldo, Alessandra; Girardi, Stefano; Maschietto, Marta; Vassanelli, Stefano


Rapid advances in neuronal probe technology for multisite recording of brain activity have posed a significant challenge to neuroscientists in processing and analyzing the recorded signals. To infer meaningful conclusions quickly and accurately from large datasets, automated and sophisticated signal processing and analysis tools are required. This paper presents a novel Matlab-based tool, "SigMate", incorporating standard methods to analyze spikes and EEG signals, and in-house solutions for local field potential (LFP) analysis. Available modules at present are: 1. in-house developed algorithms for data display (2D and 3D), file operations (file splitting, file concatenation, and file column rearranging), baseline correction, slow stimulus artifact removal, noise characterization and signal quality assessment, current source density (CSD) analysis, latency estimation from LFPs and CSDs, determination of cortical layer activation order using LFPs and CSDs, and single LFP clustering; 2. existing modules for spike detection, sorting and spike train analysis, and EEG signal analysis. SigMate has the flexibility to analyze multichannel signals as well as signals from multiple recording sources. The in-house developed tools for LFP analysis have been extensively tested with signals recorded using standard extracellular recording electrodes, and with planar and implantable multi-transistor array (MTA) based neural probes. SigMate will be disseminated shortly to the neuroscience community under the open-source GNU General Public License.

  4. Achieving mask order processing automation, interoperability and standardization based on P10 (United States)

    Rodriguez, B.; Filies, O.; Sadran, D.; Tissier, Michel; Albin, D.; Stavroulakis, S.; Voyiatzis, E.


Last year the MUSCLE (Masks through User's Supply Chain: Leadership by Excellence) project was presented; here we report its progress. A key process in mask supply chain management is the exchange of technical information for ordering masks. This process is large, complex, company specific and error prone, and leads to longer cycle times and higher costs due to missing or wrong inputs. Its automation and standardization could produce significant benefits. We need to agree on a standard for mandatory and optional parameters, and also on a common way to describe parameters when ordering. A system was created to improve performance in terms of Key Performance Indicators (KPIs) such as cycle time and cost of production. This tool allows us to evaluate and measure the effect of individual factors, as well as the effect of implementing the improvements of the complete project. Next, a benchmark study and a gap analysis were performed. These studies show the feasibility of standardization, as there is a large overlap in requirements. We find that the SEMI P10 standard needs enhancements: a format supporting the standard is required, and XML offers the ability to describe P10 in a flexible way. Beyond using XML for P10, the semantics of the mask order should also be addressed. A system design and requirements for a reference implementation of a P10-based management system are presented, covering a mechanism for evolution and version management and a design for P10 editing and data validation.
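The XML-encoding idea can be sketched as follows; the element names, parameter names, and the mandatory set are invented for illustration and do not reproduce the actual SEMI P10 parameter list.

```python
# Hedged sketch of an XML-encoded mask order in the spirit of the paper's
# P10-over-XML proposal. All names below are hypothetical, not SEMI P10.
import xml.etree.ElementTree as ET

# Hypothetical mandatory parameters for a valid order.
MANDATORY = {"MaskSetID", "Technology", "GridSize"}

def build_order(params):
    """Build a <MaskOrder> element with one <Parameter> child per entry."""
    order = ET.Element("MaskOrder", version="P10-draft")
    for name, value in params.items():
        p = ET.SubElement(order, "Parameter", name=name)
        p.text = str(value)
    return order

def missing_mandatory(order):
    """Return the set of mandatory parameter names absent from the order."""
    present = {p.get("name") for p in order.findall("Parameter")}
    return MANDATORY - present

incomplete = build_order({"MaskSetID": "MS-001", "Technology": "65nm"})
complete = build_order({"MaskSetID": "MS-001", "Technology": "65nm",
                        "GridSize": "1nm"})
```

Validation of this kind is one way the order-entry errors described above could be caught before they propagate into the supply chain.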

  5. A Permanent Automated Real-Time Passive Acoustic Monitoring System for Bottlenose Dolphin Conservation in the Mediterranean Sea.

    Directory of Open Access Journals (Sweden)

    Marco Brunoldi

Within the framework of the EU Life+ project LIFE09 NAT/IT/000190 ARION, a permanent automated real-time passive acoustic monitoring system for the improvement of the conservation status of the transient and resident population of bottlenose dolphins (Tursiops truncatus) has been implemented and installed in the Portofino Marine Protected Area (MPA), Ligurian Sea. The system is able to detect the simultaneous presence of dolphins and boats in the area and to give their positions in real time. This information is used to prevent collisions by diffusing warning messages to all the categories involved (tourists, professional fishermen and so on). The system consists of two GPS-synchronized acoustic units, based on a particular type of marine buoy (elastic beacon), deployed about 1 km off the Portofino headland. Each unit is equipped with a four-hydrophone array and an onboard acquisition system which can record the typical social communication whistles emitted by the dolphins and the sound emitted by boat engines. Signals are pre-filtered, digitized and then broadcast to the ground station via Wi-Fi. The raw data are processed to obtain the direction of the acoustic target relative to each unit, and hence the position of dolphins and boats in real time by triangulation.
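The triangulation step can be sketched as a plane-geometry computation: each unit contributes a bearing to the acoustic source, and the two bearing lines are intersected. Coordinates and angles below are illustrative; the real system works with GPS-synchronized units in geographic coordinates.

```python
# Hedged sketch of position-by-triangulation from two bearing measurements.
# Buoy positions and bearings are illustrative, not from the ARION system.
import math

def triangulate(p1, theta1, p2, theta2):
    """p1, p2: (x, y) unit positions; theta1, theta2: bearings in radians
    measured from the x-axis. Returns the intersection of the two bearing
    lines, or None if they are (near-)parallel."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t*d1 = p2 + s*d2 for t via Cramer's rule.
    det = -d1[0] * d2[1] + d2[0] * d1[1]
    if abs(det) < 1e-12:
        return None
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (-rx * d2[1] + d2[0] * ry) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two buoys 2 km apart both hear a source; bearings 45° and 135°:
pos = triangulate((0.0, 0.0), math.radians(45), (2.0, 0.0), math.radians(135))
```

With a four-hydrophone array per buoy, each bearing would itself come from time-difference-of-arrival processing; only the final intersection step is shown here.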

  6. Semi-automated tracking and continuous monitoring of inferior vena cava diameter in simulated and experimental ultrasound imaging. (United States)

    Mesin, Luca; Pasquero, Paolo; Albani, Stefano; Porta, Massimo; Roatta, Silvestro


Assessment of respirophasic fluctuations in the diameter of the inferior vena cava (IVC) is detrimentally affected by its concomitant displacements. This study aimed to present and validate a method to compensate for IVC movement artifacts while continuously measuring IVC diameter in an automated fashion (with minimal interaction from the user) from a longitudinal B-mode ultrasound clip. Performance was tested both on experimental ultrasound clips collected from four healthy subjects and on simulations implementing rigid IVC displacements and pulsation. Compared with traditional M-mode measurements, the new approach systematically reduced errors in caval index assessment (range over maximum diameter value) to an extent depending on individual vessel geometry, IVC movement and choice of the M-line (the line along which the diameter is computed). In experimental recordings, this approach identified both the cardiac and respiratory components of IVC movement and pulsatility and evidenced the spatial dependence of IVC pulsatility. IVC tracking appears to be a promising approach to reduce movement artifacts and to improve the reliability of IVC diameter monitoring.
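The caval index mentioned above (diameter range over its maximum) can be computed directly from a tracked diameter series; the sample values are illustrative, not patient data.

```python
# Hedged sketch: caval index from a tracked IVC diameter time series.
# The diameter values (mm) are invented for illustration.

def caval_index(diameters):
    """Caval index = (max - min) / max of the diameter series, i.e. the
    respirophasic collapse as a fraction of the maximum diameter."""
    d_max, d_min = max(diameters), min(diameters)
    return (d_max - d_min) / d_max

series = [18.0, 17.2, 15.5, 13.5, 14.8, 17.6]  # one respiratory cycle, mm
ci = caval_index(series)
```

In the study's setting, the series would come from automated tracking along a chosen M-line after motion compensation; the index itself is this one-line ratio.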

  7. Photoacoustic monitoring of inhomogeneous curing processes in polystyrene emulsions

    Energy Technology Data Exchange (ETDEWEB)

Vargas-Luna, M.; Gutierrez-Juarez, G.; Rodriguez-Vizcaino, J.M.; Varela-Najera, J.B.; Rodriguez-Palencia, J.M.; Bernal-Alvarado, J.; Sosa, M. [Instituto de Fisica, Universidad de Guanajuato, Leon, Guanajuato (Mexico); Alvarado-Gil, J.J. [Centro de Investigacion y de Estudios Avanzados del IPN, Unidad Merida, Antigua Carretera a Progreso, Merida, Yucatan (Mexico)


The time evolution of the inhomogeneous curing process of polystyrene emulsions is studied using a variant of the conventional photoacoustic (PA) technique. The thermal effusivity, as a function of time, is determined in order to monitor the sintering process of a styrene emulsion at different steps of the manufacturing procedure. PA measurements of thermal effusivity show a sigmoidal growth as a function of time during the curing process. The parameterization of these curves permits the determination of the characteristic curing time and velocity of the process. A decrease in the curing time and an increase in the curing velocity toward the final steps of the manufacturing process are observed. The feasibility of our approach and its potential for the characterization of other curing processes are discussed.
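A minimal sketch of the sigmoidal parameterization: model thermal effusivity versus time as a logistic curve, whose inflection time plays the role of a characteristic curing time and whose maximum slope gives a curing velocity. The model form and the numbers are assumptions for illustration, not the paper's fitted values.

```python
# Hedged sketch: logistic parameterization of effusivity-vs-time curves.
# e0, de, t0, tau are hypothetical fit parameters.
import math

def effusivity(t, e0, de, t0, tau):
    """Logistic model: e0 + de / (1 + exp(-(t - t0)/tau)).
    t0 acts as the characteristic curing time (inflection point)."""
    return e0 + de / (1.0 + math.exp(-(t - t0) / tau))

def curing_velocity(de, tau):
    """Maximum slope of the logistic curve, reached at t = t0."""
    return de / (4.0 * tau)
```

Fitting such a curve to measured effusivity data (e.g. by least squares) would yield t0 and de/(4·tau) as the curing time and velocity discussed in the abstract.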

  8. A Multi-Scale Flood Monitoring System Based on Fully Automatic MODIS and TerraSAR-X Processing Chains

    Directory of Open Access Journals (Sweden)

    Enrico Stein


A two-component fully automated flood monitoring system is described and evaluated. This is the result of combining two individual flood services that are currently under development at the DLR (German Aerospace Center) Center for Satellite-based Crisis Information (ZKI) to rapidly support disaster management activities. A first-phase monitoring component of the system systematically detects potential flood events on a continental scale using daily-acquired medium spatial resolution optical data from the Moderate Resolution Imaging Spectroradiometer (MODIS). A threshold set controls the activation of the second-phase crisis component of the system, which derives flood information at higher spatial detail using a Synthetic Aperture Radar (SAR) based satellite mission (TerraSAR-X). The proposed activation procedure finds use in the identification of flood situations at different spatial resolutions and in the time-critical, on-demand programming of SAR satellite acquisitions at an early stage of an evolving flood situation. The automated processing chains of the MODIS Flood Service (MFS) and the TerraSAR-X Flood Service (TFS) include data pre-processing, the computation and adaptation of global auxiliary data, thematic classification, and the subsequent dissemination of flood maps using an interactive web client. The system is operationally demonstrated and evaluated by monitoring two recent flood events, in Russia in 2013 and in Albania/Montenegro in 2013.
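The threshold-based activation of the crisis component can be sketched as a simple decision rule; the area and persistence thresholds and the function signature are invented for illustration, not taken from the MFS/TFS implementation.

```python
# Hedged sketch of a two-phase activation rule: the daily MODIS flood mask
# drives a threshold test that would trigger tasking of a SAR acquisition.
# All threshold values are hypothetical.

def should_task_sar(flooded_km2, affected_days,
                    area_threshold=100.0, persistence_days=2):
    """Trigger the crisis component when the detected flood extent exceeds
    a minimum area for a minimum number of consecutive days."""
    return flooded_km2 >= area_threshold and affected_days >= persistence_days
```

A rule of this shape filters out single-day false alarms (e.g. cloud-shadow misclassification) while still tasking the higher-resolution sensor early in an evolving event.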

  9. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

quality assurance. This thesis is aimed at a systematic development of process monitoring solutions, constituting a key element of intelligent manufacturing systems towards zero-defect manufacturing. A methodological approach of general applicability is presented in this regard. The approach consists... of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensor placement, validation of the monitoring solutions, and definition of the reference manufacturing performance...

  10. Process monitoring of ultrasound compaction as a small-scale heating process. (United States)

    Ueda, Hiroshi; Kawai, Akira; Hayashi, Takashi; Ida, Yasuo; Kakemi, Masawo; Tozuka, Yuichi


Ultrasound compaction is a simple small-scale heating process. The aim of this study was to elucidate the polymer phase transition during ultrasound compaction by process monitoring. Morphological change with heat occurs when ultrasound energy is supplied. Monitoring of the process revealed changes in both punch position and die pressure that reflect the polymer's phase transition. The optimum ultrasound energy for complete transition could be detected by a sudden increase in the pressure on the lower punch. This optimum energy clearly depended on the polymer's glass transition temperature (Tg), suggesting that Tg is the predominant parameter in the ultrasound compaction process. Optimization of ultrasound energy based on monitoring profiles is a promising way to obtain a desirable product by thermoplastic treatment with minimal thermal degradation due to excess supply of energy.
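The detection of the "sudden increase in the pressure on the lower punch" can be sketched as a simple jump detector on the monitored pressure trace; the trace values and the step threshold are illustrative assumptions.

```python
# Hedged sketch: flag the first sudden sample-to-sample pressure increase
# in a monitored lower-punch trace. Values and threshold are invented.

def detect_jump(pressures, min_step):
    """Return the index of the first sample whose increase over the
    previous sample exceeds min_step, or None if no jump occurs."""
    for i in range(1, len(pressures)):
        if pressures[i] - pressures[i - 1] > min_step:
            return i
    return None

trace = [1.0, 1.1, 1.1, 1.2, 2.5, 2.6]  # lower-punch pressure, arbitrary units
jump_at = detect_jump(trace, min_step=0.5)
```

In the study's terms, the ultrasound energy delivered up to the detected sample would be read off as the optimum energy for complete phase transition.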

  11. Monitoring tablet surface roughness during the film coating process

    DEFF Research Database (Denmark)

    Seitavuopio, Paulus; Heinämäki, Jyrki; Rantanen, Jukka


The purpose of this study was to evaluate the change of surface roughness and the development of the film during the film coating process using laser profilometer roughness measurements, SEM imaging, and energy-dispersive X-ray (EDX) analysis. Surface roughness and texture changes developing during... the process of film coating tablets were studied by noncontact laser profilometry and scanning electron microscopy (SEM). An EDX analysis was used to monitor the magnesium stearate and titanium dioxide of the tablets. The tablet cores were film coated with aqueous hydroxypropyl methylcellulose, and the film... coating was performed using an instrumented pilot-scale side-vented drum coater. The SEM images of the film-coated tablets showed that within the first 30 minutes, the surface of the tablet cores was completely covered with a thin film. The magnesium signal that was monitored by SEM-EDX disappeared after...

  12. Automated Monitoring for Circuit Breaker on Line

    Institute of Scientific and Technical Information of China (English)



Circuit breakers (CBs) are very important elements in the power system: they are used to switch other equipment into and out of service. Reliable operation of circuit breakers is critical to the ability to reconfigure a power system and can be assured by regular inspection and maintenance. An automated circuit breaker monitoring system is proposed to monitor the circuit breaker's control circuit. The system consists of a new CB monitoring data-acquisition intelligent electronic device (IED) that is located at the circuit breaker and captures detailed information about its operation in real time. An application of system-wide data analysis is demonstrated; it makes it possible to track circuit breaker switching sequences and draw conclusions about their performance and final outcome.
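Tracking a switching sequence from the IED's timestamped control-circuit events can be sketched as a subsequence check; the event names and expected order below are illustrative assumptions, not the system's actual event model.

```python
# Hedged sketch: verify that a recorded trip operation contains the expected
# control-circuit events in order. Event names are hypothetical.

EXPECTED_TRIP = ["trip_initiate", "coil_energized",
                 "contacts_parted", "fault_cleared"]

def sequence_ok(events, expected=EXPECTED_TRIP):
    """events: iterable of (timestamp, name) pairs. True when the expected
    names occur, in order, once events are sorted by timestamp."""
    names = [name for _, name in sorted(events)]
    it = iter(names)
    # Each membership test consumes the iterator, enforcing order.
    return all(name in it for name in expected)

trip_events = [(0.00, "trip_initiate"), (0.02, "coil_energized"),
               (0.05, "contacts_parted"), (0.08, "fault_cleared")]
```

A missing or out-of-order event in the recorded sequence would flag the operation for maintenance review.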

  13. Lunar surface mining for automated acquisition of helium-3: Methods, processes, and equipment (United States)

    Li, Y. T.; Wittenberg, L. J.


In this paper, several techniques considered for mining and processing the regolith on the lunar surface are presented. These techniques have been proposed and evaluated based primarily on the following criteria: (1) mining operations should be relatively simple; (2) procedures of mineral processing should be few and relatively easy; (3) transferring tonnages of regolith on the Moon should be minimized; (4) operations outside the lunar base should be readily automated; (5) all equipment should be maintainable; and (6) economic benefit should be sufficient for commercial exploitation. The economic benefits are not addressed in this paper; however, the energy benefits have been estimated to be between 250 and 350 times the mining energy. A mobile mining scheme is proposed that meets most of the mining objectives. This concept uses a bucket-wheel excavator for excavating the regolith, several mechanical electrostatic separators for beneficiation of the regolith, a fast-moving fluidized bed reactor to heat the particles, and a palladium diffuser to separate H2 from the other solar wind gases. At the final stage of the miner, the regolith 'tailings' are deposited directly into the ditch behind the miner, and cylinders of the valuable solar wind gases are transported to a central gas processing facility. During the production of He-3, large quantities of valuable H2, H2O, CO, CO2, and N2 are produced for utilization at the lunar base. For larger production of He-3, the use of multiple miners is recommended rather than increasing their size. Multiple miners permit operations at more sites and provide redundancy in case of equipment failure.

  14. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules; Annual Technical Progress Report: 15 June 1999--14 July 2000

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Lewis, E. R.; Hogan, S. J.


    Spire is addressing the PVMaT project goals of photovoltaic (PV) module cost reduction and improved module manufacturing process technology. New cost-effective automation processes are being developed for post-lamination PV module assembly, where post-lamination is defined as the processes after the solar cells are encapsulated. These processes apply to both crystalline and thin-film solar cell modules. Four main process areas are being addressed: (1) Module buffer storage and handling between steps; (2) Module edge trimming, edge sealing, and framing; (3) Junction-box installation; and (4) Testing for module performance, electrical isolation, and ground-path continuity.

  15. Real-Time Monitoring of Psychotherapeutic Processes: Concept and Compliance (United States)

    Schiepek, Günter; Aichhorn, Wolfgang; Gruber, Martin; Strunk, Guido; Bachler, Egon; Aas, Benjamin


Objective: The feasibility of a high-frequency real-time monitoring approach to psychotherapy is outlined and tested for patients' compliance to evaluate its integration into everyday practice. Criteria concern the ecological momentary assessment, the assessment of therapy-related cognitions and emotions, equidistant time sampling, real-time nonlinear time series analysis, continuous participative process control by client and therapist, and the application of idiographic (person-specific) surveys. Methods: The process-outcome monitoring is technically realized by an internet-based device for data collection and data analysis, the Synergetic Navigation System. Its feasibility is documented by a compliance study on 151 clients treated in an inpatient and a day-treatment clinic. Results: We found high compliance rates (mean: 78.3%, median: 89.4%) amongst the respondents, independent of the severity of symptoms or the degree of impairment. Compared to other diagnoses, the compliance rate was lower in the group diagnosed with personality disorders. Conclusion: The results support the feasibility of high-frequency monitoring in routine psychotherapy settings. Daily collection of psychological surveys allows for the assessment of highly resolved, equidistant time series data which gives insight into the nonlinear qualities of therapeutic change processes (e.g., pattern transitions, critical instabilities). PMID:27199837

  16. Real-Time Monitoring of Psychotherapeutic Processes: Concept and Compliance

    Directory of Open Access Journals (Sweden)

    Guenter Karl Schiepek


Objective: The feasibility of a high-frequency real-time monitoring approach to psychotherapy is outlined and tested for patients' compliance to evaluate its integration into everyday practice. Criteria concern the ecological momentary assessment, the assessment of therapy-related cognitions and emotions, equidistant time sampling, real-time nonlinear time series analysis, continuous participative process control by client and therapist, and the application of idiographic (person-specific) surveys. Methods: The process-outcome monitoring is technically realized by an internet-based device for data collection and data analysis, the Synergetic Navigation System. Its feasibility is documented by a compliance study on 151 clients treated in an inpatient and a day-treatment clinic. Results: We found high compliance rates (mean: 78.3%, median: 89.4%) amongst the respondents, independent of the severity of symptoms or the degree of impairment. Compared to other diagnoses, the compliance rate was lower in the group diagnosed with personality disorders. Conclusion: The results support the feasibility of high-frequency monitoring in routine psychotherapy settings. Daily collection of psychological surveys allows for the assessment of highly resolved, equidistant time series data which gives insight into the nonlinear qualities of therapeutic change processes (e.g., pattern transitions, critical instabilities).


    Directory of Open Access Journals (Sweden)

    V.I. Milykh


Attention is paid to the popular FEMM (Finite Element Method Magnetics) program, which is effective for numerical calculations of the magnetic fields of electrical machines. The principles of automated calculations providing analysis of the dynamics of electromagnetic processes in turbo-generators are presented. This is realized as a script in the algorithmic language Lua, integrated with FEMM. The temporal functions of electromagnetic quantities are obtained by multi-position calculations of the magnetic field, ensuring its rotation together with the turbo-generator rotor. The developed program is universal in terms of the geometry and dimensions of turbo-generators, as well as the modes of their operation, with a minimum of input data in numerical form. This paper shows the extraction of discrete temporal functions of: the magnetic flux linkage of the stator phase winding; the forces acting on the current-carrying and ferromagnetic elements of the structure; the magnetic induction at fixed points; and the electromagnetic torque. This list can be expanded as part of the created program, and use of the program can be extended to other types of electrical machines. A full period of any of these functions is obtained by rotating the rotor through 60°.

  18. An automated process for generating archival data files from MATLAB figures (United States)

    Wallace, G. M.; Greenwald, M.; Stillerman, J.


A new directive from the White House Office of Science and Technology Policy requires that all publications supported by federal funding agencies (e.g., the Department of Energy Office of Science and the National Science Foundation) include machine-readable datasets for figures and tables. An automated script was developed at the PSFC to make this process easier for authors using the MATLAB plotting environment to create figures. All relevant data (x, y, z, error bars) and metadata (line style, color, symbol shape, labels) are contained within the MATLAB .fig file created when saving a figure. The export_fig script extracts data and metadata from a .fig file and exports it into an HDF5 data file with no additional user input required. Support is included for a number of plot types including 2-D and 3-D line, contour, and surface plots, quiver plots, bar graphs, and histograms. This work was supported by US Department of Energy cooperative agreement DE-FC02-99ER54512 using the Alcator C-Mod tokamak, a DOE Office of Science user facility.

  19. Methodology on Investigating the Influences of Automated Material Handling System in Automotive Assembly Process (United States)

    Saffar, Seha; Azni Jafar, Fairul; Jamaludin, Zamberi


A case study was selected as the method to collect data in an actual industry situation. The study aimed to assess the influences of an automated material handling system in the automotive industry by proposing a new design of integration system through simulation, and to analyze the significant effects and influence of the system. The modelling and simulation tools used are CAD software packages (Delmia and Quest). Phase 1, preliminary data gathering, collects all related data from the actual industry situation and is expected to produce guidelines and constraints for designing the new integration system. In phase 2, a design concept will be developed using the 10 principles of design consideration for manufacturing. A full factorial design will be used as the design of experiments in order to compare the measured performance of the integration system with that of the current system in the case study. From the experimental results, an ANOVA will be performed to study the measured performance. Thus, the influence of the improvements made to the system is expected to become evident.
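The full factorial design mentioned above enumerates every combination of factor levels; a minimal sketch follows, with factor names and levels invented for illustration rather than taken from the case study.

```python
# Hedged sketch: generate a full factorial experiment plan.
# Factor names and levels are hypothetical.
from itertools import product

def full_factorial(factors):
    """factors: dict mapping factor name -> list of levels.
    Returns a list of dicts, one experimental run per combination."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

runs = full_factorial({"conveyor_speed": ["low", "high"],
                       "buffer_size": [1, 2, 3]})  # 2 x 3 = 6 runs
```

Each run would then be simulated (e.g. in Quest), and the response values fed into the ANOVA.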

  20. Analysis of the Optimal Duration of Behavioral Observations Based on an Automated Continuous Monitoring System in Tree Swallows (Tachycineta bicolor): Is One Hour Good Enough?

    Directory of Open Access Journals (Sweden)

    Ádám Z Lendvai

Studies of animal behavior often rely on human observation, which introduces a number of limitations on sampling. Recent developments in automated logging of behaviors make it possible to circumvent some of these problems. Once verified for efficacy and accuracy, these automated systems can be used to determine optimal sampling regimes for behavioral studies. Here, we used a radio-frequency identification (RFID) system to quantify parental effort in a bi-parental songbird species, the tree swallow (Tachycineta bicolor). We found that the accuracy of the RFID monitoring system was similar to that of video-recorded behavioral observations for quantifying parental visits. Using RFID monitoring, we also quantified the optimum duration of sampling periods for male and female parental effort by examining the relationship between nest visit rates estimated from sampling periods of different durations and the total number of visits for the day. The optimum sampling duration (the shortest observation time that explained the most variation in total daily visits per unit time) was 1 h for both sexes. These results show that RFID and other automated technologies can be used to quantify behavior when human observation is constrained, and the information from these monitoring technologies can be useful for evaluating the efficacy of human observation methods.
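The evaluation behind the optimal-duration question can be sketched as follows: estimate a visit rate from a candidate observation window and correlate it, across nests, with the total daily visits. The helper functions and data are illustrative, not the study's actual pipeline.

```python
# Hedged sketch: window-based visit rates and their correlation with daily
# totals, the comparison underlying an "optimal sampling duration" analysis.
# Visit times and nest data are invented for illustration.

def window_rate(visit_times, start, duration):
    """Visits per hour within [start, start + duration); times in hours."""
    return sum(start <= t < start + duration for t in visit_times) / duration

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

Repeating the correlation for window lengths of, say, 0.5, 1, and 2 h across many nests, and picking the shortest window whose rate still explains most of the variance in daily totals, mirrors the comparison reported above.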



  2. A Tool for the Automated Collection of Space Utilization Data: Three Dimensional Space Utilization Monitor (United States)

    Vos, Gordon A.; Fink, Patrick; Ngo, Phong H.; Morency, Richard; Simon, Cory; Williams, Robert E.; Perez, Lance C.


The Space Human Factors and Habitability (SHFH) Element within the Human Research Program (HRP) and the Behavioral Health and Performance (BHP) Element are conducting research regarding Net Habitable Volume (NHV), the internal volume within a spacecraft or habitat that is available to crew for required activities, as well as layout and accommodations within the volume. NASA needs methods to unobtrusively collect NHV data without impacting crew time. Required data include metrics such as location and orientation of crew, volume used to complete tasks, internal translation paths, flow of work, and task completion times. In less constrained environments such methods exist, yet many are obtrusive and require significant post-processing. Examples used in terrestrial settings include infrared (IR) retro-reflective marker-based motion capture, GPS sensor tracking, inertial tracking, and multi-camera methods. Due to the constraints of space operations, many such methods are infeasible: inertial tracking systems typically rely upon a gravity vector to normalize sensor readings, and traditional IR systems are large and require extensive calibration. However, two technologies have not yet been applied to space operations for these purposes: 3D Radio Frequency Identification Real-Time Localization Systems (3D RFID-RTLS), and depth imaging systems which allow for 3D motion capture and volumetric scanning (such as those using IR-depth cameras like the Microsoft Kinect, or Light Detection and Ranging (LIDAR) systems).

  3. A systematic framework for design of process monitoring and control (PAT) systems for crystallization processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist


    A generic computer-aided framework for systematic design of a process monitoring and control system for crystallization processes has been developed to study various aspects of crystallization operations. The systematic design framework contains a generic crystallizer modelling toolbox, a tool for...

  4. A Generic Framework for Systematic Design of Process Monitoring and Control System for Crystallization Processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Meisler, Kresten Troelstrup; Sin, Gürkan


    A generic framework for systematic design of a process monitoring and control system for crystallization processes has been developed in order to obtain the desired end-product properties notably the crystal size distribution (CSD). The design framework contains a generic crystallizer modelling t...

  5. Layerwise Monitoring of the Selective Laser Melting Process by Thermography (United States)

    Krauss, Harald; Zeugner, Thomas; Zaeh, Michael F.

    Selective Laser Melting is utilized to build parts directly from CAD data. In this study layerwise monitoring of the temperature distribution is used to gather information about the process stability and the resulting part quality. The heat distribution varies with different kinds of parameters including scan vector length, laser power, layer thickness and inter-part distance in the job layout. By integration of an off-axis mounted uncooled thermal detector, the solidification as well as the layer deposition are monitored and evaluated. This enables the identification of hot spots in an early stage during the solidification process and helps to avoid process interrupts. Potential quality indicators are derived from spatially resolved measurement data and are correlated to the resulting part properties. A model of heat dissipation is presented based on the measurement of the material response for varying heat input. Current results show the feasibility of process surveillance by thermography for a limited section of the building platform in a commercial system.
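    As a toy illustration of identifying hot spots in a layer image, one can flag pixels that exceed a layer-relative threshold; the mean-plus-k·sigma rule and the value k = 3 below are assumptions for the sketch, not the detection criterion used by the authors:

```python
import math

def layer_hot_spots(frame, k=3.0):
    """Flag pixels hotter than mean + k*sigma of the current layer image.

    frame: 2D list of temperatures; k: assumed tuning parameter.
    Returns (row, col) coordinates of flagged pixels."""
    vals = [t for row in frame for t in row]
    mu = sum(vals) / len(vals)
    sd = math.sqrt(sum((t - mu) ** 2 for t in vals) / len(vals))
    return [(r, c) for r, row in enumerate(frame)
            for c, t in enumerate(row) if t > mu + k * sd]
```

    A relative threshold of this kind adapts to the overall layer temperature, so a uniformly hot layer raises no alarms while a localized spike does.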

  6. Enhanced cumulative sum charts for monitoring process dispersion. (United States)

    Abujiya, Mu'azu Ramat; Riaz, Muhammad; Lee, Muhammad Hisyam


    The cumulative sum (CUSUM) control chart is widely used in industry for the detection of small and moderate shifts in process location and dispersion. For efficient monitoring of process variability, we present several CUSUM control charts for monitoring changes in the standard deviation of a normal process. The newly developed control charts, based on well-structured sampling techniques - extreme ranked set sampling, extreme double ranked set sampling and double extreme ranked set sampling - significantly enhance the CUSUM chart's ability to detect a wide range of shifts in process variability. The relative performances of the proposed CUSUM scale charts are evaluated in terms of the average run length (ARL) and standard deviation of run length, for point shifts in variability. Moreover, for overall performance, we employ the average ratio of ARL and the average extra quadratic loss. A comparison of the proposed CUSUM control charts with the classical CUSUM R chart, the classical CUSUM S chart, the fast initial response (FIR) CUSUM R chart, the FIR CUSUM S chart, the ranked set sampling (RSS) based CUSUM R chart and the RSS based CUSUM S chart, among others, is presented. An illustrative example using a real dataset is given to demonstrate the practicability of the proposed schemes.
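    For orientation, a minimal two-sided tabular CUSUM on subgroup standard deviations (the classical chart, not the ranked-set-sampling variants proposed in the paper) might look like this; the in-control sigma0, reference value k and decision interval h are illustrative choices:

```python
import math

def cusum_scale(samples, sigma0=1.0, k=0.25, h=5.0):
    """Two-sided tabular CUSUM on standardized subgroup std deviations.

    samples: list of subgroups (lists of floats); sigma0: in-control sigma.
    k: reference value, h: decision interval (standardized units).
    Returns the index of the first signaling subgroup, or None."""
    c_plus, c_minus = 0.0, 0.0
    for i, grp in enumerate(samples):
        n = len(grp)
        mean = sum(grp) / n
        s = math.sqrt(sum((x - mean) ** 2 for x in grp) / (n - 1))
        z = (s - sigma0) / sigma0           # crude standardization
        c_plus = max(0.0, c_plus + z - k)   # accumulates variance increases
        c_minus = max(0.0, c_minus - z - k) # accumulates variance decreases
        if c_plus > h or c_minus > h:
            return i
    return None
```

    A sustained increase in subgroup spread accumulates in the upper sum until it crosses h and signals, while small in-control fluctuations are absorbed by the slack k.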

  7. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi


    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring detection tools: the conventional PCA-based monitoring indices, Hotelling’s T2 and Q, and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performances of the proposed methods were compared with those of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
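    The Q-EWMA idea — smoothing a PCA residual statistic with an EWMA before charting it — can be sketched for a two-variable process, where the dominant principal direction has a closed form. The data, the choice of retaining a single component, and the smoothing constant below are hypothetical:

```python
import math

def first_pc_2d(data):
    """Center of 2-D data and the dominant eigenvector of its 2x2
    covariance matrix (closed form; assumes nonzero cross-covariance)."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for y in data for y in [y[1]][:1]) if False else \
          sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam1 = tr / 2 + math.sqrt(tr * tr / 4 - det)  # largest eigenvalue
    vx, vy = sxy, lam1 - sxx                      # its eigenvector
    norm = math.hypot(vx, vy)
    return (mx, my), (vx / norm, vy / norm)

def q_stat(point, center, pc):
    """Q (squared prediction error): squared residual after projecting
    the centered point onto the retained principal component."""
    dx, dy = point[0] - center[0], point[1] - center[1]
    t = dx * pc[0] + dy * pc[1]
    rx, ry = dx - t * pc[0], dy - t * pc[1]
    return rx * rx + ry * ry

def ewma(values, lam=0.2):
    """EWMA smoothing of a statistic stream (z_0 = first value)."""
    z, out = values[0], []
    for x in values:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out
```

    A point on the training line has Q near zero; a point off the correlation structure has a large Q, and the EWMA accumulates small but persistent Q shifts that a raw Q chart would miss.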

  8. Adaptive Local Outlier Probability for Dynamic Process Monitoring

    Institute of Scientific and Technical Information of China (English)

    Yuxin Ma; Hongbo Shi; Mengling Wang


    Complex industrial processes often have multiple operating modes and present time-varying behavior. The data in one mode may follow specific Gaussian or non-Gaussian distributions. In this paper, a numerically efficient moving-window local outlier probability algorithm is proposed. Its key feature is the capability to handle complex data distributions and operating condition changes, including slow dynamic variations and instant mode shifts. First, a two-step adaptation approach is introduced and designed updating rules are applied to keep the monitoring model up to date. Then, a semi-supervised monitoring strategy is developed with an updating switch rule to deal with mode changes. Based on local probability models, the algorithm has a superior ability to detect faulty conditions and to adapt quickly to slow variations and new operating modes. Finally, the utility of the proposed method is demonstrated with a numerical example and a non-isothermal continuous stirred tank reactor.
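    The static building block of such a scheme — the Local Outlier Probability (LoOP) score itself, before any moving-window adaptation — can be sketched compactly; the brute-force O(n²) neighbour search and the values of k and λ below are illustrative simplifications:

```python
import math

def loop_scores(points, k=3, lam=3.0):
    """Local Outlier Probability (LoOP) scores for 2-D points.

    Compact sketch: brute-force k-NN, probabilistic set distance,
    PLOF, and erf-based normalization to a [0, 1] outlier probability."""
    n = len(points)
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # k nearest neighbours of each point (excluding itself)
    knn = []
    for i in range(n):
        idx = sorted((j for j in range(n) if j != i),
                     key=lambda j: dist(points[i], points[j]))[:k]
        knn.append(idx)
    # probabilistic set distance of each point to its neighbourhood
    pdist = []
    for i in range(n):
        sigma = math.sqrt(sum(dist(points[i], points[j]) ** 2
                              for j in knn[i]) / k)
        pdist.append(lam * sigma)
    # probabilistic local outlier factor: own density vs neighbours' density
    plof = []
    for i in range(n):
        expect = sum(pdist[j] for j in knn[i]) / k
        plof.append(pdist[i] / expect - 1.0 if expect > 0 else 0.0)
    nplof = lam * math.sqrt(sum(p * p for p in plof) / n)
    return [max(0.0, math.erf(p / (nplof * math.sqrt(2.0))))
            for p in plof]
```

    Scores lie in [0, 1] and can be read as the probability that a point is an outlier relative to its local neighbourhood, which is what makes the score convenient for thresholding in a monitoring loop.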

  9. Automated analysis of heterogeneous carbon nanostructures by high-resolution electron microscopy and on-line image processing

    Energy Technology Data Exchange (ETDEWEB)

    Toth, P., E-mail: [Department of Chemical Engineering, University of Utah, 50 S. Central Campus Drive, Salt Lake City, UT 84112-9203 (United States); Farrer, J.K. [Department of Physics and Astronomy, Brigham Young University, N283 ESC, Provo, UT 84602 (United States); Palotas, A.B. [Department of Combustion Technology and Thermal Energy, University of Miskolc, H3515, Miskolc-Egyetemvaros (Hungary); Lighty, J.S.; Eddings, E.G. [Department of Chemical Engineering, University of Utah, 50 S. Central Campus Drive, Salt Lake City, UT 84112-9203 (United States)


    High-resolution electron microscopy is an efficient tool for characterizing heterogeneous nanostructures; however, currently the analysis is a laborious and time-consuming manual process. In order to be able to accurately and robustly quantify heterostructures, one must obtain a statistically high number of micrographs showing images of the appropriate sub-structures. The second step of analysis is usually the application of digital image processing techniques in order to extract meaningful structural descriptors from the acquired images. In this paper it will be shown that by applying on-line image processing and basic machine vision algorithms, it is possible to fully automate the image acquisition step; therefore, the number of acquired images in a given time can be increased drastically without the need for additional human labor. The proposed automation technique works by computing fields of structural descriptors in situ and thus outputs sets of the desired structural descriptors in real-time. The merits of the method are demonstrated by using combustion-generated black carbon samples. - Highlights: ► The HRTEM analysis of heterogeneous nanostructures is a tedious manual process. ► Automatic HRTEM image acquisition and analysis can improve data quantity and quality. ► We propose a method based on on-line image analysis for the automation of HRTEM image acquisition. ► The proposed method is demonstrated using HRTEM images of soot particles.

  10. Relationships Between Animal Health Monitoring and the Risk Assessment Process

    Directory of Open Access Journals (Sweden)

    Salman MD


    Full Text Available Risk assessment is part of the risk analysis process as it is used in veterinary medicine to estimate risks related to international trade and food safety. Data from monitoring and surveillance systems (MO&SS) are used throughout the risk assessment process for hazard identification, release assessment, exposure assessment and consequence assessment. As the quality of risk assessments depends to a large extent on the availability and quality of input data, there is a close relationship between MO&SS and risk assessment. In order to improve the quality of risk assessments, MO&SS should first be designed according to minimum quality standards. Second, recent scientific developments in state-of-the-art design and analysis of surveys need to be translated into field applications and legislation. Finally, knowledge about the risk assessment process among MO&SS planners and managers should be promoted in order to assure high-quality data.

  11. Monitoring metal-fill in a lost foam casting process. (United States)

    Abdelrahman, Mohamed; Arulanantham, Jeanison Pradeep; Dinwiddie, Ralph; Walford, Graham; Vondra, Fred


    The lost foam casting (LFC) process is emerging as a reliable casting method. The metal-fill profile in LFC plays an important role among several factors that affect casting quality. The metal-fill profile is in turn affected by several factors. Several casting defects may result due to an improper metal-fill process. Hence, it becomes essential to characterize and control, if possible, the metal-fill process in LFC. This research presents instrumentation and a technique to monitor and characterize the metal-fill process. The characterization included the determination of the position of the metal front and the profile in which the metal fills up the foam pattern. The instrumentation included capacitive sensors. Each sensor is comprised of two electrodes whose capacitive coupling changes as the metal fills the foam pattern. Foundry tests were conducted to obtain the sensors' responses to the metal fill. Two such sensors were used in the foundry tests. Data representing the responses of these sensors during the metal-fill process were collected using a data acquisition system. A number of finite element electrostatic simulations were carried out to study the metal-fill process under conditions similar to those experienced in foundry tests. An artificial neural network was trained using the simulation data as inputs and the corresponding metal-fill profiles as outputs. The neural network was then used to infer the profile of the metal-fill during foundry tests. The results were verified by comparing the metal-fill profile inferred from the neural network to the actual metal-fill profile captured by an infrared camera used during the foundry tests. The match up between the inferred profiles and the infrared camera measurements was satisfactory, indicating that the developed technique provides a reliable and cost effective method to monitor the metal-fill profile in LFC.
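    As a deliberately simplified stand-in for the trained neural network described above, the inference step can be illustrated as a nearest-neighbour lookup against simulated sensor responses (the response tuples and fill-profile labels below are invented):

```python
def infer_fill_profile(reading, simulated):
    """Map a capacitive-sensor reading to the fill profile whose simulated
    response is closest in squared Euclidean distance.

    simulated: dict mapping response tuples -> profile labels
    (hypothetical calibration data, e.g. from FEM simulations)."""
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(simulated, key=lambda resp: d2(resp, reading))
    return simulated[best]
```

    A trained network generalizes between simulated cases rather than snapping to the closest one, but the lookup conveys the same input-to-profile mapping idea.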

  12. Automated Thermal Image Processing for Detection and Classification of Birds and Bats - FY2012 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Duberstein, Corey A.; Matzner, Shari; Cullinan, Valerie I.; Virden, Daniel J.; Myers, Joshua R.; Maxwell, Adam R.


    Surveying wildlife at risk from offshore wind energy development is difficult and expensive. Infrared video can be used to record birds and bats that pass through the camera view, but it is also time consuming and expensive to review video and determine what was recorded. We proposed to conduct algorithm and software development to identify and to differentiate thermally detected targets of interest that would allow automated processing of thermal image data to enumerate birds, bats, and insects. During FY2012 we developed computer code within MATLAB to identify objects recorded in video and extract attribute information that describes the objects recorded. We tested the efficiency of track identification using observer-based counts of tracks within segments of sample video. We examined object attributes, modeled the effects of random variability on attributes, and produced data smoothing techniques to limit random variation within attribute data. We also began drafting and testing methodology to identify objects recorded on video. We also recorded approximately 10 hours of infrared video of various marine birds, passerine birds, and bats near the Pacific Northwest National Laboratory (PNNL) Marine Sciences Laboratory (MSL) at Sequim, Washington. A total of 6 hours of bird video was captured overlooking Sequim Bay over a series of weeks. An additional 2 hours of video of birds was also captured during two weeks overlooking Dungeness Bay within the Strait of Juan de Fuca. Bats and passerine birds (swallows) were also recorded at dusk on the MSL campus during nine evenings. An observer noted the identity of objects viewed through the camera concurrently with recording. These video files will provide the information necessary to produce and test software developed during FY2013. The annotation will also form the basis for creation of a method to reliably identify recorded objects.



    Mrs. Kavitha R.; Tripty Singh


    The vein pattern in palms is a random mesh of interconnected and intertwining blood vessels. This project is the application of the vein detection concept to automate the drug delivery process. It deals with extracting palm dorsal vein structures, which is a key procedure for selecting the optimal drug needle insertion point. Grayscale images obtained from a low-cost IR webcam are poor in contrast and usually noisy, which makes effective vein segmentation a great challenge. Here a new vein image s...

  14. DEdicated MONitor of EXotransits and Transients (DEMONEXT): Low-Cost Robotic and Automated Telescope for Followup of Exoplanetary Transits and Transients (United States)

    Villanueva, Steven; Eastman, Jason D.; Gaudi, B. Scott; Pogge, Richard W.; Stassun, Keivan G.; Trueblood, Mark; Trueblood, Patricia


    We present the design, development, and early science from the DEdicated MONitor of EXotransits and Transients (DEMONEXT), an automated and robotic 20 inch telescope jointly funded by The Ohio State University and Vanderbilt University. The telescope is a PlaneWave CDK20 f/6.8 Corrected Dall-Kirkham Astrograph telescope on a Mathis Instruments MI-750/1000 Fork Mount located at Winer Observatory in Sonoita, AZ. DEMONEXT has a Hedrick electronic focuser, Finger Lakes Instrumentation (FLI) CFW-3-10 filter wheel, and a 2048 x 2048 pixel FLI Proline CCD3041 camera with a pixel scale of 0.90 arc-seconds per pixel and a 30.7 x 30.7 arc-minute field-of-view. The telescope's automation, controls, and scheduling are implemented in Python, including a facility to add new targets in real time for rapid follow-up of time-critical targets. DEMONEXT will be used for the confirmation and detailed investigation of newly discovered planet candidates from the Kilodegree Extremely Little Telescope (KELT) survey, exploration of the atmospheres of Hot Jupiters via transmission spectroscopy and thermal emission measurements, and monitoring of select eclipsing binary star systems as benchmarks for models of stellar evolution. DEMONEXT will enable rapid confirmation imaging of supernovae, flare stars, tidal disruption events, and other transients discovered by the All Sky Automated Survey for SuperNovae (ASAS-SN).

  15. A national system for monitoring the population of agricultural pests using an integrated approach of remote sensing data from in situ automated traps and satellite images (United States)

    Diofantos, Hadjimitsis G.; Panayiotis, Philimis; Elias, Psimolophitis; Georgiou, George K.; Kyriacos, Themistocleous


    A national system for monitoring the population increase of the agricultural pest "Lobesia botrana" (vine moth that attacks grapes) in Cyprus has been developed. The system comprises automated delta traps with GPS that use a wireless (Wi-Fi) camera, automated image analysis for identification of the specific species, and Wi-Fi technology for transferring the data over the mobile telephony network to a central station for result presentation and analysis. A GIS database was developed that includes details of the pilot vineyards, environmental conditions, and daily counts of captured moths from each automated trap. The results were compared with MODIS and LANDSAT satellite thermal images, since the appearance of the vine moth depends strongly on microclimate temperatures (degree days). Results showed that satellite data can accurately estimate the appearance of the vine moth. The proposed system can be an important tool for the improvement of a national Integrated Pest Management (IPM) system, and it can also be used for monitoring other agricultural pests and insects.
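    The degree-day dependence mentioned above is commonly accumulated with the simple averaging method; the base (developmental threshold) temperature of 10 °C in this sketch is an assumption, not the value used for Lobesia botrana in the study:

```python
def growing_degree_days(daily_min_max, base=10.0):
    """Accumulate degree-days from daily (min, max) temperatures using
    the averaging method: max(0, (Tmin + Tmax)/2 - base) per day."""
    return sum(max(0.0, (tmin + tmax) / 2.0 - base)
               for tmin, tmax in daily_min_max)
```

    Pest phenology models compare the accumulated total against species-specific thresholds to predict the timing of flight and egg-laying generations.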

  16. Baocang main canal automation monitoring system framework design

    Institute of Scientific and Technical Information of China (English)

    刘长征; 耿海菊; 王楠; 刘晓菁


    The Baocang main canal, part of Hebei Province's complementary works for the South-to-North Water Transfer Project, is an important component of the project's middle route. Its automation system consists of four parts: a computer monitoring and control system, an image (video) monitoring system, a management information system, and an integrated office system. The system performs data acquisition, monitoring and supervision of the equipment at the management stations, pump stations, and diversion-outlet monitoring stations along the line, tracking dynamic changes in water use and ensuring the safe operation of the pipeline monitoring equipment. This paper describes the structure and framework design of the Baocang main canal automation monitoring and control system.

  17. Unsupervised process monitoring and fault diagnosis with machine learning methods

    CERN Document Server

    Aldrich, Chris


    This unique text/reference describes in detail the latest advances in unsupervised process monitoring and fault diagnosis with machine learning methods. Abundant case studies throughout the text demonstrate the efficacy of each method in real-world settings. The broad coverage examines such cutting-edge topics as the use of information theory to enhance unsupervised learning in tree-based methods, the extension of kernel methods to multiple kernel learning for feature extraction from data, and the incremental training of multilayer perceptrons to construct deep architectures for enhanced data

  18. Conception through build of an automated liquids processing system for compound management in a low-humidity environment. (United States)

    Belval, Richard; Alamir, Ab; Corte, Christopher; DiValentino, Justin; Fernandes, James; Frerking, Stuart; Jenkins, Derek; Rogers, George; Sanville-Ross, Mary; Sledziona, Cindy; Taylor, Paul


    Boehringer Ingelheim's Automated Liquids Processing System (ALPS) in Ridgefield, Connecticut, was built to accommodate all compound solution-based operations following dissolution in neat DMSO. Process analysis resulted in the design of two nearly identical conveyor-based subsystems, each capable of executing 1400 × 384-well plate or punch tube replicates per batch. Two parallel-positioned subsystems are capable of independent execution or alternatively executed as a unified system for more complex or higher throughput processes. Primary ALPS functions include creation of high-throughput screening plates, concentration-response plates, and reformatted master stock plates (e.g., 384-well plates from 96-well plates). Integrated operations included centrifugation, unsealing/piercing, broadcast diluent addition, barcode print/application, compound transfer/mix via disposable pipette tips, and plate sealing. ALPS key features included instrument pooling for increased capacity or fail-over situations, programming constructs to associate one source plate to an array of replicate plates, and stacked collation of completed plates. Due to the hygroscopic nature of DMSO, ALPS was designed to operate within a 10% relative humidity environment. The activities described are the collaborative efforts that contributed to the specification, build, delivery, and acceptance testing between Boehringer Ingelheim Pharmaceuticals, Inc. and the automation integration vendor, Thermo Scientific Laboratory Automation (Burlington, ON, Canada).

  19. ["Veille sanitaire": tools, functions, process of healthcare monitoring in France]. (United States)

    Eilstein, D; Salines, G; Desenclos, J-C


    In France, the term "veille sanitaire" is widely used to designate healthcare monitoring. It covers, however, a set of concepts that are not shared equally by the entire scientific community. The same is true for the activities that are part of it, even if some (surveillance, for example) are already well defined. Concepts such as "observation", "vigilance" and "alert", for example, are not always clear. Furthermore, the use of these words in everyday language maintains this ambiguity. Thus, it seemed necessary to recall these definitions as already used in the literature or legislative texts and to make alternative suggestions. This formalization cannot be carried out without thinking about the structure of "veille sanitaire" and its components. Proposals are provided bringing out the concepts of formatted "veille" (monitoring) and non-formatted "veille" (monitoring). Definitions, functions, methods and tools, and processes of these two components are outlined here, as well as the cooperative relationship they sustain. The authors have attempted to provide the scientific community with a reference framework useful for exchanging information to promote research and methodological development dedicated to this public health application of epidemiology.


    Directory of Open Access Journals (Sweden)

    Litvinov V. N.


    Full Text Available Prompt, full-scale diagnostics based on modern methods can help to address the problem of declining power supply system reliability. Introducing information systems for operational diagnostics provides operating personnel with information that makes it possible to predict malfunctions in power transformer operation and to prepare in advance an action plan to address them. The paper presents fragments of a monitoring system for a power transformer, developed using programmable logic controllers. The monitored parameters are grouped as follows: temperature and cooling system data; winding voltage magnitudes per phase; winding current values per phase; transmitted power; and insulation state. A functional scheme of the power transformer monitoring system was designed, and a general algorithm of system functioning is described. A graphical operator interface was developed that allows the operator to monitor the object's state and manage the system. The format of the data packets was designed using the XML markup language. The developed hardware and software package can be used in education, where it improves the quality of student training and brings students closer to the realities of modern professional practice; in operations, as a replacement for foreign software that complies with approved domestic calculation methods; and in research, for analysis and optimization of power transformer operating parameters.

  1. Online monitoring and control of the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Boe, K.


    The demand for online monitoring and control of the biogas process is increasing, since a better monitoring and control system can improve process stability and enhance process performance for a better economy of the biogas plants. A number of parameters in both the liquid and the gas phase have been suggested as process indicators. These include gas production, pH, alkalinity, volatile fatty acids (VFA) and hydrogen. Of these, VFA is the most widely recognised as a direct, relevant measure of stability. The individual, rather than collective, VFA concentrations are recognised as providing significantly more information for diagnosis. However, classic on-line measurement is based on filtration, which suffers from fouling, especially in particulate or slurry wastes. In this project, a new online VFA monitoring system has been developed using gas-phase VFA extraction to avoid sample filtration. The liquid sample is pumped into a sampling chamber, acidified, salted and heated to extract VFA into the gas phase before analysis by GC-FID. This allows easy application to manure. Sample and analysis time of the system varies from 25-40 min, depending on the washing duration. The sampling frequency is fast enough for the dynamics of a manure digester, which are in the range of several hours. This system has been validated over more than 6 months and has shown good agreement with offline VFA measurement. The response from this sensor was compared with other process parameters such as biogas production, pH and dissolved hydrogen during overload situations in a laboratory-scale digester, to investigate the suitability of each measure as a process indicator. VFA was the most reliable indicator of process imbalance, and propionate was the most persistent. However, when coupling the online VFA monitoring with a simple controller for automatically regulating the propionate level in a digester, it was found that propionate decreased so slowly that the biogas production fluctuated. Therefore, it is more
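    The kind of simple propionate-based control described can be sketched as an on/off feed rule with hysteresis — stop feeding when propionate exceeds an upper threshold, resume once it falls below a lower one. The threshold values are hypothetical, not those used in the project:

```python
def control_feed(propionate_series, high=15.0, low=8.0):
    """On/off feed schedule with hysteresis.

    propionate_series: measured propionate concentrations (e.g., mM)
    in sampling order; high/low: assumed stop/resume thresholds.
    Returns a boolean feeding decision per measurement."""
    feeding, schedule = True, []
    for p in propionate_series:
        if feeding and p > high:
            feeding = False          # overload indicated: pause feeding
        elif not feeding and p < low:
            feeding = True           # recovered: resume feeding
        schedule.append(feeding)
    return schedule
```

    The gap between the two thresholds prevents rapid on/off chattering, though — as the abstract notes — slow propionate dynamics can still make gas production oscillate under such a rule.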

  2. The european primary care monitor: structure, process and outcome indicators

    Directory of Open Access Journals (Sweden)

    Wilson Andrew


    Full Text Available Abstract Background Scientific research has provided evidence on the benefits of well-developed primary care systems. The relevance of some of this research for the European situation is limited. There is currently a lack of up-to-date, comprehensive and comparable information on variation in the development of primary care, and a lack of knowledge of structures and strategies conducive to strengthening primary care in Europe. The EC-funded project Primary Health Care Activity Monitor for Europe (PHAMEU) aims to fill this gap by developing a Primary Care Monitoring System (PC Monitor) for application in 31 European countries. This article describes the development of the indicators of the PC Monitor, which will make it possible to create an alternative model for holistic analyses of primary care. Methods A systematic review of the primary care literature published between 2003 and July 2008 was carried out. This resulted in an overview of: (1) the dimensions of primary care and their relevance to outcomes at (primary) health system level; (2) essential features per dimension; (3) applied indicators to measure the features of primary care dimensions. The indicators were evaluated by the project team against criteria of relevance, precision, flexibility, and discriminating power. The resulting indicator set was evaluated on its suitability for Europe-wide comparison of primary care systems by a panel of primary care experts from various European countries (representing a variety of primary care systems). Results The developed PC Monitor approaches primary care in Europe as a multidimensional concept. It describes the key dimensions of primary care systems at three levels: structure, process, and outcome level. On structure level, it includes indicators for governance, economic conditions, and workforce development. On process level, indicators describe access, comprehensiveness, continuity, and coordination of primary care services.
On outcome level, indicators

  3. A Case Study of Reverse Engineering Integrated in an Automated Design Process (United States)

    Pescaru, R.; Kyratsis, P.; Oancea, G.


    This paper presents a design methodology which automates the generation of curves extracted from point clouds obtained by digitizing physical objects. The methodology is demonstrated on a product belonging to the consumables industry, namely a footwear-type product that has a complex shape with many curves. The final result is the automated generation of wrapping curves, surfaces and solids according to the characteristics of the customer's foot and to the preferences for the chosen model, which leads to the development of customized products.

  4. Land Cover and Land Use Classification with TWOPAC: towards Automated Processing for Pixel- and Object-Based Image Classification

    Directory of Open Access Journals (Sweden)

    Stefan Dech


    Full Text Available We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework, named TWOPAC (TWinned Object and Pixel based Automated classification Chain), enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information, with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also allows classification based on object-based characteristics. Classification is based on a Decision Tree approach (DT), for which the well-known C5.0 code has been implemented, which builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier based on a C5.0-retrieved ASCII file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in preferably a short processing time, standardized interfaces for process control, Web Processing Services (WPS), as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC's functionality to process geospatial raster or vector data via web resources (server, network) enables TWOPAC's usability independent of any commercial client or desktop software and allows for large-scale data processing on servers. Furthermore, the components of TWOPAC were built up using open source code components and are implemented as a plug-in for Quantum GIS software for easy handling of the classification process from the user's perspective.
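    The information-entropy criterion on which C5.0-style decision trees choose their splits can be sketched as follows (the feature values and class labels are invented):

```python
import math

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in (labels.count(v) for v in set(labels)) if c)

def info_gain(samples, labels, feature, threshold):
    """Information gain of splitting on samples[i][feature] <= threshold."""
    left = [l for s, l in zip(samples, labels) if s[feature] <= threshold]
    right = [l for s, l in zip(samples, labels) if s[feature] > threshold]
    if not left or not right:
        return 0.0   # degenerate split carries no information
    n = len(labels)
    return entropy(labels) - (len(left) / n) * entropy(left) \
                           - (len(right) / n) * entropy(right)
```

    The tree builder evaluates candidate (feature, threshold) pairs and greedily picks the one with the highest gain at each node; a split that perfectly separates two classes yields the full entropy of the parent node as gain.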

  5. Multi-Source Data Processing Middleware for Land Monitoring within a Web-Based Spatial Data Infrastructure for Siberia

    Directory of Open Access Journals (Sweden)

    Christiane Schmullius


    Full Text Available Land monitoring is a key issue in Earth system sciences to study environmental changes. To generate knowledge about change, e.g., to decrease uncertainty in the results and build confidence in land change monitoring, multiple information sources are needed. Earth observation (EO) satellites and in situ measurements are available for operational monitoring of the land surface. As the availability of well-prepared geospatial time-series data for environmental research is limited, user-dependent processing steps with respect to the data source and formats pose additional challenges. In most cases, it is possible to support science with spatial data infrastructures (SDI) and services to provide such data in a processed format. A data processing middleware is proposed as a technical solution to improve interdisciplinary research using multi-source time-series data and standardized data acquisition, pre-processing, updating and analyses. This solution is being implemented within the Siberian Earth System Science Cluster (SIB-ESS-C), which combines various sources of EO data, climate data and analytical tools. The development of this SDI is based on the definition of automated and on-demand tools for data searching, ordering and processing, implemented along with standard-compliant web services. These tools, consisting of a user-friendly download, analysis and interpretation infrastructure, are available within SIB-ESS-C for operational use.

  6. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)


    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  7. Process Model of Quality Cost Monitoring for Small and Medium Wood-Processing Enterprises

    Directory of Open Access Journals (Sweden)

    Denis Jelačić


    Full Text Available Quality is not only a technical category, and the system of quality management is not only focused on product quality. Quality and costs are closely interlinked. The paper deals with quality cost monitoring in small and medium wood-processing enterprises (SMEs) in Slovakia, and also presents the results of a questionnaire survey. An empirical study aims to determine the level of understanding and level of implementation of quality cost monitoring in wood-processing SMEs in Slovakia. The research is based on the PAF model. A suitable model for quality cost monitoring is also proposed in the paper based on the research results, with guidelines for using the methods of Activity Based Costing. The empirical study is focused on SMEs, which make up 99.8% of all companies in the branch, and where quality cost monitoring often works as a latent management subsystem. SME managers use indicators for monitoring process performance and production quality, but they usually do not develop a separate framework for measuring and evaluating quality costs.
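The PAF model referenced above groups quality costs into prevention, appraisal, and (internal/external) failure categories. A minimal aggregation sketch, with hypothetical category names and figures of our own choosing:

```python
def paf_summary(costs):
    """Aggregate quality costs by PAF category.

    costs: iterable of (category, amount) pairs, category one of
    'prevention', 'appraisal', 'internal_failure', 'external_failure'.
    Returns (totals, shares): absolute totals per category and each
    category's share of the total quality cost."""
    totals = {'prevention': 0.0, 'appraisal': 0.0,
              'internal_failure': 0.0, 'external_failure': 0.0}
    for category, amount in costs:
        totals[category] += amount
    grand = sum(totals.values())
    shares = {c: (t / grand if grand else 0.0) for c, t in totals.items()}
    return totals, shares
```

In a monitoring framework the failure share would be tracked over time; a rising failure share relative to prevention spending is the classic PAF warning signal.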

  8. Monitoring of Soil Remediation Process in the Metal Mining Area (United States)

    Kim, Kyoung-Woong; Ko, Myoung-Soo; Han, Hyeop-jo; Lee, Sang-Ho; Na, So-Young


    Stabilization using proper additives is an effective soil remediation technique to reduce As mobility in soil. Several studies have reported that Fe-containing materials such as amorphous Fe-oxides, goethite and hematite are effective in As immobilization, and therefore acid mine drainage sludge (AMDS) may be a potential material for As immobilization. The AMDS is a by-product of the electrochemical treatment of acid mine drainage and mainly contains Fe-oxide. The Chungyang area in Korea is located in the vicinity of the huge abandoned Au-Ag Gubong mine, which was closed in the 1970s. Large amounts of mine tailings have remained without proper treatment, and mobilization of the mine tailings occurs mainly during the summer heavy rainfall season. Soil contamination from this mobilization may become an urgent issue because it can cause the contamination of groundwater and crop plants in sequence. In order to reduce the mobilization of the mine tailings, a pilot-scale study of in-situ stabilization using AMDS was applied after batch and column experiments in the lab. For monitoring of the stabilization process, the As concentration in crop plants grown on the field site was determined, but this is not easily applicable because of time and cost. Therefore, a simple monitoring technique is needed to measure the mobility or leachability in a way comparable with the As concentration in crop plants. We compared several extraction methods to suggest a representative single extraction method for monitoring soil stabilization efficiency. Several selected extraction methods were examined, and the Mehlich 3 extraction method, using a mixture of NH4F, EDTA, NH4NO3, CH3COOH and HNO3, was selected as the best predictor of the leachability or mobility of As in the soil remediation process.

  9. Thermographic process monitoring in powderbed based additive manufacturing (United States)

    Krauss, Harald; Zeugner, Thomas; Zaeh, Michael F.


    Selective Laser Melting is utilized to build metallic parts directly from CAD-Data by solidification of thin powder layers through application of a fast scanning laser beam. In this study layerwise monitoring of the temperature distribution is used to gather information about the process stability and the resulting part quality. The heat distribution varies with different kinds of parameters including scan vector length, laser power, layer thickness and inter-part distance in the job layout which in turn influence the resulting part quality. By integration of an off-axis mounted uncooled thermal detector the solidification as well as the layer deposition are monitored and evaluated. Errors in the generation of new powder layers usually result in a locally varying layer thickness that may cause poor part quality. For effect quantification, the locally applied layer thickness is determined by evaluating the heat-up of the newly deposited powder. During the solidification process space and time-resolved data is used to characterize the zone of elevated temperatures and to derive locally varying heat dissipation properties. Potential quality indicators are evaluated and correlated to the resulting part quality: Thermal diffusivity is derived from a simplified heat dissipation model and evaluated for every pixel and cool-down phase of a layer. This allows the quantification of expected material homogeneity properties. Maximum temperature and time above certain temperatures are measured in order to detect hot spots or delamination issues that may cause a process breakdown. Furthermore, a method for quantification of sputter activity is presented. Since high sputter activity indicates unstable melt dynamics this can be used to identify parameter drifts, improper atmospheric conditions or material binding errors. 
The resulting surface structure after solidification complicates temperature determination on the one hand, but on the other enables the detection of potential surface defects.
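The per-pixel cool-down evaluation described above can be sketched by fitting an exponential decay to each pixel's temperature trace; the time constant then serves as a proxy for local heat dissipation (a smaller constant suggests higher effective diffusivity). This is a generic single-exponential assumption of ours, not the authors' simplified heat dissipation model:

```python
import math

def cooldown_time_constant(temps, times, t_ambient):
    """Estimate the cool-down time constant tau from one pixel's trace,
    assuming T(t) = T_amb + (T0 - T_amb) * exp(-t / tau).
    Uses a log-linear least-squares fit on the excess temperature."""
    xs, ys = [], []
    for t, temp in zip(times, temps):
        excess = temp - t_ambient
        if excess > 0:  # log only defined for positive excess
            xs.append(t)
            ys.append(math.log(excess))
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return -1.0 / slope  # tau, in the same units as `times`
```

Mapping tau over all pixels of a layer would yield the kind of heat-dissipation image the abstract describes for homogeneity assessment.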

  10. Thermographic process monitoring in powderbed based additive manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Krauss, Harald, E-mail:; Zaeh, Michael F. [AMLab, iwb Application Center Augsburg, Technische Universität München (Germany); Zeugner, Thomas [Augsburg University (Germany)


    Selective Laser Melting is utilized to build metallic parts directly from CAD-Data by solidification of thin powder layers through application of a fast scanning laser beam. In this study layerwise monitoring of the temperature distribution is used to gather information about the process stability and the resulting part quality. The heat distribution varies with different kinds of parameters including scan vector length, laser power, layer thickness and inter-part distance in the job layout which in turn influence the resulting part quality. By integration of an off-axis mounted uncooled thermal detector the solidification as well as the layer deposition are monitored and evaluated. Errors in the generation of new powder layers usually result in a locally varying layer thickness that may cause poor part quality. For effect quantification, the locally applied layer thickness is determined by evaluating the heat-up of the newly deposited powder. During the solidification process space and time-resolved data is used to characterize the zone of elevated temperatures and to derive locally varying heat dissipation properties. Potential quality indicators are evaluated and correlated to the resulting part quality: Thermal diffusivity is derived from a simplified heat dissipation model and evaluated for every pixel and cool-down phase of a layer. This allows the quantification of expected material homogeneity properties. Maximum temperature and time above certain temperatures are measured in order to detect hot spots or delamination issues that may cause a process breakdown. Furthermore, a method for quantification of sputter activity is presented. Since high sputter activity indicates unstable melt dynamics this can be used to identify parameter drifts, improper atmospheric conditions or material binding errors. 
The resulting surface structure after solidification complicates temperature determination on the one hand, but on the other enables the detection of potential surface defects.

  11. Virtual instrument for monitoring process of brush plating

    Institute of Scientific and Technical Information of China (English)

    JING Xue-dong; XU Bin-shi; WANG Cheng-tao; ZHU Sheng; DONG Shi-yun


    A virtual instrument (VI) was developed to monitor the technological parameters in the process of brush plating, including coating thickness, brush-plating current, current density, deposition rate, and brush-plating voltage. Meanwhile, two approaches were presented to improve the measurement accuracy of coating thickness. One aims at eliminating random interference by moving-average filtering, while the other calculates the quantity of electricity consumed accurately by rectangular integration. With these two approaches, the coating thickness can be measured in real time with higher accuracy than with the voltage-frequency conversion method. During the plating process all the technological parameters are displayed visually on the front panel of the VI. Once the brush-plating current or current density overruns the limit values, or when the coating thickness reaches the objective value, the VI will alarm. With this VI, the solution consumption can be decreased and the operating efficiency is improved.
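The two accuracy measures named in the abstract, moving-average filtering of the current signal and rectangular integration of the consumed charge, can be sketched as below. The proportionality constant linking charge to thickness is a hypothetical placeholder for the empirical deposition factor such a VI would be calibrated with:

```python
def moving_average(samples, window):
    """Smooth noisy current samples with a causal moving average."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

def charge_rectangular(currents, dt):
    """Rectangular (left-endpoint) integration of current over time,
    giving the quantity of electricity in coulombs."""
    return sum(i * dt for i in currents)

def coating_thickness(charge, k):
    """Thickness proportional to consumed charge; k is an assumed
    empirical deposition constant (e.g. micrometres per coulomb)."""
    return k * charge
```

In operation the VI would update `charge_rectangular` incrementally each sampling period and raise its alarm once `coating_thickness` reaches the objective value.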

  12. Contributions to ultrasound monitoring of the process of milk curdling. (United States)

    Jiménez, Antonio; Rufo, Montaña; Paniagua, Jesús M; Crespo, Abel T; Guerrero, M Patricia; Riballo, M José


    Ultrasound evaluation permits the state of milk being curdled to be determined quickly and cheaply, thus satisfying the demands faced by today's dairy product producers. This paper describes the non-invasive ultrasonic method of in situ monitoring the changing physical properties of milk during the renneting process. The basic objectives of the study were, on the one hand, to confirm the usefulness of conventional non-destructive ultrasonic testing (time-of-flight and attenuation of the ultrasound waves) in monitoring the process in the case of ewe's milk, and, on the other, to include other ultrasound parameters which have not previously been considered in studies on this topic, in particular, parameters provided by the Fast Fourier Transform technique. The experimental study was carried out in a dairy industry environment on four 52-l samples of raw milk in which were immersed 500kHz ultrasound transducers. Other physicochemical parameters of the raw milk (pH, dry matter, protein, Gerber fat test, and lactose) were measured, as also were the pH and temperature of the curdled samples simultaneously with the ultrasound tests. Another contribution of this study is the linear correlation analysis of the aforementioned ultrasound parameters and the physicochemical properties of the curdled milk.

  13. Process Diagnostics and Monitoring Using the Multipole Resonance Probe (MRP) (United States)

    Harhausen, J.; Awakowicz, P.; Brinkmann, R. P.; Foest, R.; Lapke, M.; Musch, T.; Mussenbrock, T.; Oberrath, J.; Ohl, A.; Rolfes, I.; Schulz, Ch.; Storch, R.; Styrnoll, T.


    In this contribution we present the application of the MRP in an industrial plasma ion assisted deposition (PIAD) chamber (Leybold optics SYRUS-pro). The MRP is a novel plasma diagnostic which is suitable for an industrial environment - which means that the proposed method is robust, calibration free, and economical, and can be used for ideal and reactive plasmas alike. In order to employ the MRP as process diagnostics we mounted the probe on a manipulator to obtain spatially resolved information on the electron density and temperature. As monitoring tool the MRP is installed at a fixed position. Even during the deposition process it provides stable measurement results while other diagnostic methods, e.g. the Langmuir probe, may suffer from dielectric coatings. Funded by the German Ministry for Education and Research (BMBF, Fkz. 13N10462).

  14. The Allocation of Automated Test Equipment Capacity with Variability in Demand and Processing Rates (United States)


    L., Richardson, S. C., Savage , H. S., Devers, W. C., Balaban, H. S., Bailey, E. K., et al. (1996). The Capability of the Consolidated Automated...Postgraduate School Monterey, CA 8. Kara Harp Okulu Savunma Bilimleri Enstitusu Bakanliklar Ankara, Turkey 9. Kara Kuvvetleri Komutanligi

  15. Optical sensors for process control and emissions monitoring in industry

    Energy Technology Data Exchange (ETDEWEB)

    S. W. Alendorf; D. K. Ottensen; D. W. Hahn; T. J. Kulp; U. B. Goers


    Sandia National Laboratories has a number of ongoing projects developing optical sensors for industrial environments. Laser-based sensors can be attractive for relatively harsh environments where extractive sampling is difficult, inaccurate, or impractical. Tools developed primarily for laboratory research can often be adapted for the real world and applied to problems far from their original uses. Spectroscopic techniques, appropriately selected, have the potential to impact the bottom line of a number of industries and industrial processes. In this paper the authors discuss three such applications: a laser-based instrument for process control in steelmaking, a laser-induced breakdown method for hazardous metal detection in process streams, and a laser-based imaging sensor for evaluating surface cleanliness. Each has the potential to provide critical, process-related information in a real-time, continuous manner. These sensor techniques encompass process control applications and emissions monitoring for pollution prevention. They also span the range from a field-tested pre-commercial prototype to laboratory instrumentation. Finally, these sensors employ a wide range of sophistication in both the laser source and associated analytical spectroscopy. In the ultimate applications, however, many attributes of the sensors are in common, such as the need for robust operation and hardening for harsh industrial environments.

  16. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery (United States)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John


    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance such as heating and cooling rates restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput and the successful and efficient translation of materials processing knowledge to production-scale systems rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called "Robofurnace." Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  17. Development of automated welding process for field fabrication of thick walled pressure vessels. Fourth quarter technical progress report for period ending September 30, 1979

    Energy Technology Data Exchange (ETDEWEB)



    Progress in developing an automated welding process for the field fabrication of thick walled pressure vessels is reported. Plans for the demonstration facility, for nondestructive testing, and for the procurement of materials are discussed. (LCL)

  18. Distributed multisensor processing, decision making, and control under constrained resources for remote health and environmental monitoring (United States)

    Talukder, Ashit; Sheikh, Tanwir; Chandramouli, Lavanya


    Previous field-deployable distributed sensing systems for health/biomedical applications and environmental sensing have been designed for data collection and data transmission at pre-set intervals, rather than for on-board processing. These previous sensing systems lack autonomous capabilities and have limited lifespans. We propose the use of an integrated machine learning architecture, with automated planning-scheduling and resource management capabilities, that can be used for a variety of autonomous sensing applications with very limited computing, power, and bandwidth resources. We lay out general solutions for efficient processing in a multi-tiered (three-tier) machine learning framework that is suited for remote, mobile sensing systems. Novel dimensionality reduction techniques that are designed for classification are used to compress each individual sensor's data and pass only relevant information to the mobile multisensor fusion module (second tier). Statistical classifiers that are capable of handling missing/partial sensory data due to sensor failure or power loss are used to detect critical events and pass the information to the third tier (central server) for manual analysis and/or analysis by advanced pattern recognition techniques. Genetic optimisation algorithms are used to control the system in the presence of dynamic events, and also ensure that system requirements (i.e. minimum life of the system) are met. This tight integration of control optimisation and machine learning algorithms results in a highly efficient sensor network with intelligent decision making capabilities. The applicability of our technology in remote health monitoring and environmental monitoring is shown. Other uses of our solution are also discussed.

  19. Spectral induced polarization for monitoring electrokinetic remediation processes (United States)

    Masi, Matteo; Losito, Gabriella


    Electrokinetic remediation is an emerging technology for extracting heavy metals from contaminated soils and sediments. This method uses a direct or alternating electric field to induce the transport of contaminants toward the electrodes. The electric field also produces pH variations, sorption/desorption and precipitation/dissolution of species in the porous medium during remediation. Since heavy metal mobility is pH-dependent, accurate control of pH inside the material is required in order to enhance the removal efficiency. The common approach for monitoring the remediation process, both in the laboratory and in the field, is the chemical analysis of samples collected from discrete locations. The purpose of this study is the evaluation of Spectral Induced Polarization as an alternative method for monitoring geochemical changes in the contaminated mass during remediation. The advantage of this technique applied at field scale is to offer higher-resolution mapping of the remediation site and lower cost compared to the conventional sampling procedure. We carried out laboratory-scale electrokinetic remediation experiments on fine-grained marine sediments contaminated by heavy metals, and we made Spectral Induced Polarization measurements before and after each treatment. Measurements were done in the frequency range 10⁻³ to 10³ Hz. By deconvolution of the spectra using the Debye Decomposition method we obtained the mean relaxation time and total chargeability. The main finding of this work is that a linear relationship exists between the local total chargeability and pH, with good agreement. The observed behaviour of chargeability is interpreted as a direct consequence of the alteration of the zeta potential of the sediment particles due to pH changes. Such a relationship has significant value for the interpretation of induced polarization data, allowing the use of this technique for monitoring electrokinetic remediation at field scale.
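The linear chargeability-pH relationship reported above suggests a simple monitoring workflow: calibrate a line on co-located chargeability and pH measurements, then invert it to map pH from chargeability alone. A sketch of that idea, with illustrative coefficients of our own (the paper's actual regression values are not reproduced here):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

def ph_from_chargeability(m, a, b):
    """Invert the calibrated model m = a*pH + b to estimate pH
    from a measured total chargeability m."""
    return (m - b) / a
```

At field scale, applying the inverted model to an induced-polarization chargeability map would yield an approximate pH map of the treated volume between sampling campaigns.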

  20. Development of an automation system for a tablet coater

    DEFF Research Database (Denmark)

    Ruotsalainen, Mirja; Heinämäki, Jyrki; Rantanen, Jukka


    An instrumentation and automation system for a side-vented pan coater with a novel air-flow rate measurement system for monitoring the film-coating process of tablets was designed and tested. The instrumented coating system was tested and validated by film-coating over 20 pilot-scale batches...... and automated pan-coating system described, including historical data storage capability and a novel air-flow measurement system, is a useful tool for controlling and characterizing the tablet film-coating process. Monitoring of critical process parameters increases the overall coating process efficiency...

  1. Automation Security


    Mirzoev, Dr. Timur


    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  2. Estimation of urinary stone composition by automated processing of CT images

    CERN Document Server

    Chevreau, Grégoire; Conort, Pierre; Renard-Penna, Raphaëlle; Mallet, Alain; Daudon, Michel; Mozer, Pierre; 10.1007/s00240-009-0195-3


    The objective of this article was to develop an automated tool for routine clinical practice to estimate urinary stone composition from CT images based on the density of all constituent voxels. A total of 118 stones, for which the composition had been determined by infrared spectroscopy, were placed in a helical CT scanner. A standard acquisition and low-dose and high-dose acquisitions were performed. All voxels constituting each stone were automatically selected. A dissimilarity index evaluating variations of density around each voxel was created in order to minimize partial volume effects: stone composition was established on the basis of voxel density of homogeneous zones. Stone composition was determined in 52% of cases. Sensitivities for each compound were: uric acid: 65%, struvite: 19%, cystine: 78%, carbapatite: 33.5%, calcium oxalate dihydrate: 57%, calcium oxalate monohydrate: 66.5%, brushite: 75%. Low-dose acquisition did not lower the performances (P < 0.05). This entirely automated approach eliminat...
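A dissimilarity index of the kind described, local density variation around each voxel, can be illustrated with a simple 2D stand-in: the mean absolute Hounsfield-unit difference to a voxel's 4-neighbours. This is our own minimal interpretation, not the authors' exact index:

```python
def dissimilarity_index(grid, i, j):
    """Mean absolute density difference between voxel (i, j) and its
    in-bounds 4-neighbours (a 2D stand-in for the 3D case)."""
    neighbours = []
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < len(grid) and 0 <= nj < len(grid[0]):
            neighbours.append(grid[ni][nj])
    return sum(abs(grid[i][j] - v) for v in neighbours) / len(neighbours)

def homogeneous_mean_density(grid, threshold):
    """Mean density over voxels whose dissimilarity index does not
    exceed `threshold`, i.e. the homogeneous zones used for
    composition estimation. Returns None if no voxel qualifies."""
    vals = [grid[i][j]
            for i in range(len(grid)) for j in range(len(grid[0]))
            if dissimilarity_index(grid, i, j) <= threshold]
    return sum(vals) / len(vals) if vals else None
```

Composition would then be read off from the homogeneous-zone density via a per-compound density lookup, excluding edge voxels blurred by partial volume effects.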

  3. Monitoring of Lactic Fermentation Process by Ultrasonic Technique (United States)

    Alouache, B.; Touat, A.; Boutkedjirt, T.; Bennamane, A.

    Non-destructive control using ultrasound techniques has become of great importance in the food industry. In this work, ultrasound has been used for quality control and for monitoring the fermentation stages of yogurt, which is a highly consumed product. In contrast to physico-chemical methods, where the measurement instruments are directly introduced into the sample, ultrasound techniques have the advantage of being non-destructive and contactless, thus reducing the risk of contamination. Results obtained in this study using ultrasound seem to be in good agreement with those obtained by physico-chemical methods such as acidity measurement using a pH-meter. This allows us to conclude that the ultrasound method may be an alternative for healthy control of the yoghurt fermentation process.

  4. Monitoring Biological Modes in a Bioreactor Process by Computer Simulation

    Directory of Open Access Journals (Sweden)

    Samia Semcheddine


    Full Text Available This paper deals with the general framework of fermentation system modeling and monitoring, focusing on the fermentation of Escherichia coli. Our main objective is to develop an algorithm for the online detection of acetate production during the culture of recombinant proteins. Analysis of the fermentation process shows that it behaves like a hybrid dynamic system with commutation (since it can be represented by 5 nonlinear models). We present a strategy of fault detection based on residual generation for detecting the different actual biological modes. The residual generation is based on nonlinear analytical redundancy relations. The simulation results show that the several modes occurring during bacteria cultivation can be detected by residuals using a nonlinear dynamic model and a reduced instrumentation.
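The residual-based mode detection described above can be sketched as follows: each candidate model predicts the measured output, and the mode whose residual is smallest (and below a consistency threshold) is declared active. This is a deliberately simplified scalar sketch under our own assumptions, not the paper's nonlinear redundancy relations:

```python
def detect_mode(measurement, model_predictions, threshold):
    """Pick the biological mode whose model best explains the measurement.

    model_predictions: dict mapping mode name -> predicted output.
    Returns the best-fitting mode if its absolute residual is within
    `threshold`; otherwise None (no model currently consistent)."""
    best_mode, best_res = None, None
    for mode, pred in model_predictions.items():
        res = abs(measurement - pred)
        if best_res is None or res < best_res:
            best_mode, best_res = mode, res
    return best_mode if best_res is not None and best_res <= threshold else None
```

Run at each sampling instant, a switch in the returned mode flags a commutation of the hybrid system, e.g. the onset of acetate production.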

  5. Opportunities for Process Monitoring Techniques at Delayed Access Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, Michael M.; Gitau, Ernest TN; Johnson, Shirley J.; Schanfein, Mark; Toomey, Christopher


    Except for specific cases where the International Atomic Energy Agency (IAEA) maintains a continuous presence at a facility (such as the Japanese Rokkasho Reprocessing Plant), there is always a period of time or delay between the moment a State is notified or aware of an upcoming inspection, and the time the inspector actually enters the material balance area or facility. Termed by the authors as “delayed access,” this period of time between inspection notice and inspector entrance to a facility poses a concern. Delayed access also has the potential to reduce the effectiveness of measures applied as part of the Safeguards Approach for a facility (such as short-notice inspections). This report investigates the feasibility of using process monitoring to address safeguards challenges posed by delayed access at a subset of facility types.

  6. Condition monitoring for marine refrigeration plants based on process models

    NARCIS (Netherlands)

    Grimmelius, H.T.


    Over the last decades the reliability, availability and safety of ships have become increasingly important. The cost and safety risks of ships have increased with their size, and the complexity has led to extensive automation. Many parameters, variables and alarms are available simultaneously for evaluation.

  7. A Comparison of Natural Language Processing Methods for Automated Coding of Motivational Interviewing. (United States)

    Tanana, Michael; Hallgren, Kevin A; Imel, Zac E; Atkins, David C; Srikumar, Vivek


    Motivational interviewing (MI) is an efficacious treatment for substance use disorders and other problem behaviors. Studies on MI fidelity and mechanisms of change typically use human raters to code therapy sessions, which requires considerable time, training, and financial costs. Natural language processing techniques have recently been utilized for coding MI sessions using machine learning techniques, rather than human coders, and preliminary results have suggested these methods hold promise. The current study extends this previous work by introducing two natural language processing models for automatically coding MI sessions via computer. The two models differ in the way they semantically represent session content, utilizing either 1) simple discrete sentence features (DSF model) or 2) more complex recursive neural networks (RNN model). Utterance- and session-level predictions from these models were compared to ratings provided by human coders using a large sample of MI sessions (N=341 sessions; 78,977 clinician and client talk turns) from 6 MI studies. Results show that the DSF model generally had slightly better performance compared to the RNN model. The DSF model had "good" or higher utterance-level agreement with human coders (Cohen's kappa>0.60) for open and closed questions, affirm, giving information, and follow/neutral (all therapist codes); considerably higher agreement was obtained for session-level indices, and many estimates were competitive with human-to-human agreement. However, there was poor agreement for client change talk, client sustain talk, and therapist MI-inconsistent behaviors. Natural language processing methods provide accurate representations of human-derived behavioral codes and could offer substantial improvements to the efficiency and scale at which MI mechanisms-of-change research and fidelity monitoring are conducted.
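Cohen's kappa, the agreement statistic used above, corrects raw rater agreement for agreement expected by chance. A self-contained computation (standard formula, not the study's code):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    who assigned one categorical code per item."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    # Expected agreement under independent raters with these marginals.
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)
```

With the study's 0.60 threshold, a kappa of 0.5 as in the test below would fall just short of "good" utterance-level agreement.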

  8. A New Digital Signal Processing Method for Spectrum Interference Monitoring (United States)

    Angrisani, L.; Capriglione, D.; Ferrigno, L.; Miele, G.


    Frequency spectrum is a limited shared resource, nowadays used by an ever-growing number of different applications. Generally, the companies providing such services pay the governments for the right to use a limited portion of the spectrum; consequently, they expect assurance that the licensed radio spectrum resource is not affected by significant external interference. At the same time, they have to guarantee that their devices make efficient use of the spectrum and meet the electromagnetic compatibility regulations. Therefore the competent authorities are called to control access to the spectrum by adopting suitable management and monitoring policies, and the manufacturers have to periodically verify the correct working of their apparatuses. Several measurement solutions are present on the market. They generally rely on real-time spectrum analyzers and measurement receivers. Both are characterized by good metrological accuracy but have costs, dimensions and weights that make use "in the field" impractical. The paper presents a first step in realizing a digital-signal-processing-based measurement instrument able to suitably address the above-mentioned needs. In particular, attention has been given to the DSP-based measurement section of the instrument. To this aim, an innovative measurement method for spectrum monitoring and management is proposed in this paper. It performs an efficient sequential analysis based on sample-by-sample digital processing. Three main issues are in particular pursued: (i) measurement performance comparable to that exhibited by other methods proposed in the literature; (ii) fast measurement time; (iii) easy implementation on cost-effective measurement hardware.
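Sample-by-sample sequential analysis of individual spectral bins on cost-effective hardware is commonly realized with the Goertzel algorithm, which evaluates one DFT bin with a two-tap recursion per sample. The sketch below is a generic illustration under that assumption, not the authors' method:

```python
import math

def goertzel_power(samples, k, n):
    """Power of DFT bin k over an n-sample block, computed sample by
    sample with the Goertzel recursion -- suited to sequential
    monitoring of a few frequencies on low-cost hardware."""
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples[:n]:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Squared magnitude of bin k from the final two recursion states.
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2
```

Monitoring a licensed channel then reduces to tracking `goertzel_power` for the bins of interest and flagging blocks whose power exceeds an interference threshold.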


    Energy Technology Data Exchange (ETDEWEB)

    Lascola, R.; Sharma, V.


    The characteristic strong colors of aqueous actinide solutions form the basis of analytical techniques for actinides based on absorption spectroscopy. Colorimetric measurements of samples from processing activities have been used for at least half a century. This seemingly mature technology has been recently revitalized by developments in chemometric data analysis. Where reliable measurements could formerly only be obtained under well-defined conditions, modern methods are robust with respect to variations in acidity, concentration of complexants and spectral interferents, and temperature. This paper describes two examples of the use of process absorption spectroscopy for Pu analysis at the Savannah River Site, in Aiken, SC. In one example, custom optical filters allow accurate colorimetric measurements of Pu in a stream with rapid nitric acid variation. The second example demonstrates simultaneous measurement of Pu and U by chemometric treatment of absorption spectra. The paper concludes with a description of the use of these analyzers to supplement existing technologies in nuclear materials monitoring in processing, reprocessing, and storage facilities.
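Simultaneous measurement of two absorbers from one spectrum, as in the Pu/U example, can be illustrated with a classical least-squares (Beer's law) mixture fit: the measured spectrum is modeled as a concentration-weighted sum of pure-component spectra. This two-component closed-form sketch is our own generic illustration, not the Savannah River Site analyzer's chemometric model:

```python
def cls_concentrations(mixture, pure_a, pure_b):
    """Classical least-squares estimate of two component concentrations
    from an absorption spectrum: mixture ~ ca*pure_a + cb*pure_b.
    Solves the 2x2 normal equations in closed form."""
    aa = sum(x * x for x in pure_a)
    bb = sum(x * x for x in pure_b)
    ab = sum(x * y for x, y in zip(pure_a, pure_b))
    am = sum(x * y for x, y in zip(pure_a, mixture))
    bm = sum(x * y for x, y in zip(pure_b, mixture))
    det = aa * bb - ab * ab  # nonzero when the pure spectra are independent
    return (am * bb - bm * ab) / det, (bm * aa - am * ab) / det
```

Robustness to acidity, complexants, and temperature, as described above, requires richer chemometric models (e.g. PLS with many calibration spectra); the CLS form shown here is the simplest instance of the idea.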

  10. Development and implementation of an automatic integration system for fibre optic sensors in the braiding process with the objective of online-monitoring of composite structures (United States)

    Hufenbach, W.; Gude, M.; Czulak, A.; Kretschmann, Martin


    Increasing economic, political and ecological pressure is leading to a steadily rising share of modern processing and manufacturing processes for fibre-reinforced polymers in industrial batch production. Component weights below the level achievable with classic construction materials, which lead to a better energy and cost balance over the product lifetime, justify the higher fabrication costs. However, complex quality control and failure prediction slow down the substitution by composite materials. High-resolution fibre-optic sensors (FOS), owing to their small diameter, high measuring-point density and simple handling, show high potential for automated sensor integration in manufacturing processes, and therefore for online monitoring of composite products manufactured at industrial scale. Integrated sensors can be used to monitor the manufacturing process, part tests, and the component structure during the product life cycle, which allows quality control during production and the optimization of individual manufacturing processes.[1;2] Furthermore, detailed failure analyses lead to an enhanced understanding of the failure processes occurring in composite materials. This leads to a lower scrap rate and to products of higher value and longer product life, whereby costs, material and energy are saved. This work presents an automation approach for FOS integration in the braiding process. For this purpose a braiding wheel was supplemented with an appliance for automatic sensor application, which was used to manufacture preforms of high-pressure composite vessels with FOS networks integrated between the fibre layers. All subsequent manufacturing processes (vacuum infiltration, curing) and component tests (quasi-static pressure test, programmed delamination) were monitored with the help of the integrated sensor networks. Keywords: SHM, high-pressure composite vessel, braiding, automated sensor integration, pressure test, quality control, optic

  11. Automated analysis of long-term bridge behavior and health using a cyber-enabled wireless monitoring system (United States)

    O'Connor, Sean M.; Zhang, Yilan; Lynch, Jerome; Ettouney, Mohammed; van der Linden, Gwen


    A worthy goal for the structural health monitoring field is the creation of a scalable monitoring system architecture that abstracts many of the system details (e.g., sensors, data) from the structure owner, with the aim of providing "actionable" information that aids their decision-making process. While a broad array of sensor technologies has emerged, the ability of sensing systems to generate large amounts of data has far outpaced advances in data management and processing. To reverse this trend, this study explores the creation of a cyber-enabled wireless SHM system for highway bridges. The system is designed from the top down, by considering the damage mechanisms of concern to bridge owners and then tailoring the sensing and decision support system around those concerns. The enabling element of the proposed system is a powerful data repository system termed SenStore. SenStore is designed to combine sensor data with bridge meta-data (e.g., geometric configuration, material properties, maintenance history, sensor locations, sensor types, inspection history). A wireless sensor network deployed on a bridge autonomously streams its measurement data to SenStore via a 3G cellular connection for storage. SenStore securely exposes the bridge meta-data and sensor data to software clients that can process the data to extract information relevant to the decision-making process of the bridge owner. To validate the proposed cyber-enabled SHM system, it is implemented on the Telegraph Road Bridge (Monroe, MI), a traditional steel girder-concrete deck composite bridge located along a heavily travelled corridor in the Detroit metropolitan area. A permanent wireless sensor network has been installed to measure bridge accelerations, strains and temperatures. System identification and damage detection algorithms are created to automatically mine bridge response data stored in SenStore over an 18-month period. Tools like Gaussian Process (GP

  12. The Development of Automated Detection Techniques for Passive Acoustic Monitoring as a Tool for Studying Beaked Whale Distribution and Habitat Preferences in the California Current Ecosystem (United States)

    Yack, Tina M.

    The objectives of this research were to test available automated detection methods for passive acoustic monitoring and to integrate the best available method into standard marine mammal monitoring protocols for ship-based surveys. The goal of the first chapter was to evaluate the performance and utility of PAMGUARD 1.0 Core software for automated detection of marine mammal acoustic signals during towed-array surveys. Three different detector configurations of PAMGUARD were compared. These automated detection algorithms were evaluated against manual detections made by an experienced bio-acoustician (author TMY). This study provides the first detailed comparison of PAMGUARD automated detection algorithms with manual detection methods. The results clearly illustrate the utility of automated detection methods for odontocete species, showing that the majority of whistle and click events can be reliably detected using PAMGUARD software. The second chapter moves beyond automated detection to examine and test automated classification algorithms for beaked whale species. Beaked whales are notoriously elusive and difficult to study, especially using visual survey methods. The purpose of the second chapter was to test, validate, and compare algorithms for detection of beaked whales in acoustic line-transect survey data. Using data collected at sea with the PAMGUARD classifier developed in Chapter 2, it was possible to measure the clicks from visually verified Baird's beaked whale encounters and use these data to develop classifiers that could discriminate Baird's beaked whales from other beaked whale species in future work. Echolocation clicks from Baird's beaked whales, Berardius bairdii, were recorded during combined visual and acoustic shipboard surveys of cetacean populations in the California Current Ecosystem (CCE) and with autonomous, long-term recorders at four different sites in the Southern
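A minimal energy-threshold detector conveys the basic idea behind automated click detection; it is not the PAMGUARD algorithm, and the click waveform, sample rate, window length, and threshold below are invented for illustration:

```python
import numpy as np

def detect_clicks(x, thresh_db=20, win=64):
    """Flag samples whose short-time energy exceeds the median noise
    floor by thresh_db; return the onset index of each detection."""
    e = np.convolve(x ** 2, np.ones(win) / win, mode="same")  # smoothed energy
    floor = np.median(e)                                      # robust noise floor
    hot = e > floor * 10 ** (thresh_db / 10)
    # rising edges of the "hot" mask = click onsets
    return np.flatnonzero(hot[1:] & ~hot[:-1]) + 1

fs = 96000
rng = np.random.default_rng(1)
x = rng.normal(0, 0.01, fs // 10)              # 100 ms of background noise
for t0 in (2000, 5000, 8000):                  # three synthetic clicks
    n = np.arange(200)
    x[t0:t0 + 200] += np.sin(2 * np.pi * 30000 * n / fs) * np.exp(-n / 40)

onsets = detect_clicks(x)
```

Real detectors add band-pass filtering and classification stages on top of a trigger like this, which is what separates beaked whale clicks from other impulsive sounds.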

  13. Recursive Gaussian Process Regression Model for Adaptive Quality Monitoring in Batch Processes

    Directory of Open Access Journals (Sweden)

    Le Zhou


    Full Text Available In chemical batch processes with slow responses and a long duration, it is time-consuming and expensive to obtain sufficient normal data for statistical analysis. As newly evolving data accumulate, the model gradually becomes adequate, while subsequent batches change only slightly owing to the slow time-varying behavior. To make efficient use of both the small amount of initial data and the newly evolving data sets, an adaptive monitoring scheme based on a recursive Gaussian process (RGP) model is designed in this paper. Based on the initial data, a Gaussian process model and the corresponding SPE statistic are first constructed. When new batches of data are included, a strategy based on the RGP model is used to choose the proper data for model updating. The performance of the proposed method is finally demonstrated on a penicillin fermentation batch process, and the results indicate that the proposed monitoring scheme is effective for adaptive modelling and online monitoring.
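The scheme pairs a GP regression model with an SPE (squared prediction error) statistic and updates the model as validated batches arrive. A toy numpy version, with an RBF kernel, a full refit standing in for the paper's recursive update, and arbitrary noise and data, might look like:

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel between two 1-D coordinate arrays."""
    d = np.asarray(a, float)[:, None] - np.asarray(b, float)[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

class OnlineGP:
    """Toy adaptive monitor: GP regression plus an SPE statistic,
    refit whenever a validated batch is added (stand-in for RGP updating)."""
    def __init__(self, x, y, noise=1e-4):
        self.x = np.asarray(x, float)
        self.y = np.asarray(y, float)
        self.noise = noise
        self._fit()
    def _fit(self):
        K = rbf(self.x, self.x) + self.noise * np.eye(self.x.size)
        self.alpha = np.linalg.solve(K, self.y)
    def predict(self, xs):
        return rbf(xs, self.x) @ self.alpha
    def spe(self, xs, ys):
        r = np.asarray(ys, float) - self.predict(xs)
        return float(r @ r)                     # squared prediction error
    def update(self, xs, ys):                   # include a validated batch, refit
        self.x = np.concatenate([self.x, np.asarray(xs, float)])
        self.y = np.concatenate([self.y, np.asarray(ys, float)])
        self._fit()

t = np.linspace(0, 6, 30)
gp = OnlineGP(t, np.sin(t))                         # initial normal data
spe_normal = gp.spe(t + 0.1, np.sin(t + 0.1))       # in-control batch: low SPE
gp.update(t + 0.1, np.sin(t + 0.1))                 # accepted, model updated
spe_fault = gp.spe(t + 0.1, np.sin(t + 0.1) + 0.5)  # offset fault: high SPE
```

A control limit on SPE (e.g. from the normal batches' SPE distribution) then decides which new batches are safe to fold into the model.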

  14. Modeling Wireless Sensor Networks for Monitoring in Biological Processes

    DEFF Research Database (Denmark)

    Nadimi, Esmaeil

    parameters, as the use of wired sensors is impractical. In this thesis, a ZigBee based wireless sensor network was employed and only a part of the herd was monitored, as monitoring each individual animal in a large herd under practical conditions is inefficient. Investigations to show that the monitored...

  15. Automated Characterization of Spent Fuel through the Multi-Isotope Process (MIP) Monitor

    Energy Technology Data Exchange (ETDEWEB)

    Coble, Jamie B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Orton, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schwantes, Jon M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    This research developed an algorithm for characterizing spent nuclear fuel (SNF) samples based on simulated gamma spectra. Gamma spectra were simulated for a variety of light water reactor fuels typical of those found in the United States. Fuel nuclide concentrations were simulated in ORIGEN-ARP for 1296 fuel samples with a variety of reactor designs, initial enrichments, burnups, and cooling times. The results of the ORIGEN-ARP simulation were then input to SYNTH to simulate the gamma spectrum for each sample. These spectra were evaluated with partial least squares (PLS)-based multivariate analysis methods to characterize the fuel according to reactor type (pressurized or boiling water reactor), enrichment, burnup, and cooling time. Characterizing some of the features in series, by using previously estimated features in the prediction, greatly improves performance. By first classifying the spent fuel reactor type and then using type-specific models, the prediction error for enrichment, burnup, and cooling time improved by a factor of two to four. For some features, the prediction was further improved by including additional information, such as the predicted burnup in the estimation of cooling time. The optimal prediction flow was determined based on the simulated data. A PLS discriminant analysis model was developed which perfectly classified SNF reactor type. Burnup was predicted within 0.1% root mean squared percent error (RMSPE), and both cooling time and initial enrichment within approximately 2% RMSPE.
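The benefit of classifying reactor type before regressing burnup can be reproduced with a toy example: two synthetic "reactor types" with different feature-burnup relations, compared against a single pooled model. The linear models, slopes, and noise levels below are invented for illustration (the paper uses PLS on simulated gamma spectra):

```python
import numpy as np

rng = np.random.default_rng(2)

def make_type(slope, intercept, n=200):
    """Synthetic samples: one spectral feature linearly tied to burnup."""
    burnup = rng.uniform(10, 60, n)              # arbitrary burnup range
    feature = slope * burnup + intercept + rng.normal(0, 0.5, n)
    return feature, burnup

f_a, b_a = make_type(1.0, 0.0)     # hypothetical "type A" response
f_b, b_b = make_type(0.6, 12.0)    # hypothetical "type B" response

def fit_predict(f_train, b_train, f_test):
    """Ordinary least-squares line: burnup as a function of the feature."""
    A = np.vstack([f_train, np.ones_like(f_train)]).T
    coef, *_ = np.linalg.lstsq(A, b_train, rcond=None)
    return coef[0] * f_test + coef[1]

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

# pooled model ignores reactor type
f_all, b_all = np.concatenate([f_a, f_b]), np.concatenate([b_a, b_b])
pooled = rmse(fit_predict(f_all, b_all, f_all), b_all)
# classify-first: a separate model per (known) type
split = (rmse(fit_predict(f_a, b_a, f_a), b_a) +
         rmse(fit_predict(f_b, b_b, f_b), b_b)) / 2
```

Because the two types follow different feature-burnup lines, the pooled model carries a systematic error that the type-specific models avoid, mirroring the factor-of-two-to-four improvement reported above.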

  16. Automation in Clinical Microbiology (United States)

    Ledeboer, Nathan A.


    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  17. A Scalable Gaussian Process Analysis Algorithm for Biomass Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Chandola, Varun [ORNL; Vatsavai, Raju [ORNL


    Biomass monitoring is vital for studying the carbon cycle of earth's ecosystem and has several significant implications, especially in the context of understanding climate change and its impacts. Recently, several change detection methods have been proposed to identify land cover changes in temporal profiles (time series) of vegetation collected using remote sensing instruments, but they do not satisfy one or both of the two requirements of the biomass monitoring problem, i.e., operating in online mode and handling periodic time series. In this paper, we adapt Gaussian process regression to detect changes in such time series in an online fashion. While Gaussian process (GP) models have been widely used as a kernel-based learning method for regression and classification, their applicability to massive spatio-temporal data sets, such as remote sensing data, has been limited owing to the high computational costs involved. We focus on addressing the scalability issues associated with the proposed GP based change detection algorithm. This paper makes several significant contributions. First, we propose a GP based online time series change detection algorithm and demonstrate its effectiveness in detecting different types of changes in Normalized Difference Vegetation Index (NDVI) data obtained from a study area in Iowa, USA. Second, we propose an efficient Toeplitz matrix based solution which significantly improves the computational complexity and memory requirements of the proposed GP based method. Specifically, the proposed solution can analyze a time series of length t in O(t^2) time while maintaining an O(t) memory footprint, compared to the O(t^3) time and O(t^2) memory requirements of standard matrix manipulation based methods. Third, we describe a parallel version of the proposed solution which can be used to simultaneously analyze a large number of time series. We study three different parallel implementations: using threads, MPI, and a
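The Toeplitz shortcut works because a stationary kernel evaluated on a regularly sampled grid yields a covariance matrix fully determined by its first column. A sketch using scipy's Levinson-recursion solver, with an arbitrary RBF kernel and synthetic periodic data (not the paper's NDVI kernel), compared against the dense O(t^3) route:

```python
import numpy as np
from scipy.linalg import solve_toeplitz

# Regularly sampled series: a stationary kernel on a uniform grid gives a
# Toeplitz covariance described by its first column -- O(t) memory.
t = np.arange(200)
ls, noise = 10.0, 1e-2
col = np.exp(-0.5 * (t / ls) ** 2)   # first column of the covariance
col[0] += noise                       # observation noise on the diagonal

rng = np.random.default_rng(3)
y = np.sin(2 * np.pi * t / 46) + rng.normal(0, 0.1, t.size)  # periodic signal

# Levinson-style Toeplitz solve: O(t^2) time, never forms the full matrix
alpha = solve_toeplitz(col, y)

# reference: dense O(t^3) solve with the explicit O(t^2)-memory matrix
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / ls) ** 2) + noise * np.eye(t.size)
alpha_dense = np.linalg.solve(K, y)
```

The weight vector `alpha` is the expensive part of GP prediction, so replacing the dense solve with the Toeplitz one delivers the complexity gains quoted in the abstract for regularly sampled series.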

  18. Advanced modelling, monitoring, and process control of bioconversion systems (United States)

    Schmitt, Elliott C.

    Production of fuels and chemicals from lignocellulosic biomass is an increasingly important area of research and industrialization throughout the world. In order to be competitive with fossil-based fuels and chemicals, maintaining cost-effectiveness is critical. Advanced process control (APC) and optimization methods could significantly reduce operating costs in the biorefining industry. Two reasons APC has previously proven challenging to implement for bioprocesses are the lack of suitable online sensor technology for key system components, and the strongly nonlinear first-principles models required to predict bioconversion behavior. To overcome these challenges, batch fermentations with the acetogen Moorella thermoacetica were monitored with Raman spectroscopy for the conversion of real lignocellulosic hydrolysates, and a kinetic model for the conversion of synthetic sugars was developed. Raman spectroscopy was shown to be effective in monitoring the fermentation of sugarcane bagasse and sugarcane straw hydrolysate, where univariate models predicted acetate concentrations with a root mean square error of prediction (RMSEP) of 1.9 and 1.0 g L-1 for bagasse and straw, respectively. Multivariate partial least squares (PLS) models were employed to predict acetate, xylose, glucose, and total sugar concentrations for both hydrolysate fermentations. The PLS models were more robust than the univariate models, and yielded a percent error of approximately 5% for both sugarcane bagasse and sugarcane straw. In addition, a screening technique was discussed for improving Raman spectra of hydrolysate samples prior to collecting fermentation data. Furthermore, a mechanistic model was developed to predict batch fermentation of synthetic glucose, xylose, and a mixture of the two sugars to acetate. The models accurately described the bioconversion process, with an RMSEP of approximately 1 g L-1 for each model, and provided insights into how kinetic parameters changed during dual substrate

  19. Update on scribe–cleave–passivate (SCP) slim edge technology for silicon sensors: Automated processing and radiation resistance

    Energy Technology Data Exchange (ETDEWEB)

    Fadeyev, V., E-mail: [Santa Cruz Institute for Particle Physics, University of California, Santa Cruz, CA 95064 (United States); Ely, S.; Galloway, Z.; Ngo, J.; Parker, C.; Sadrozinski, H.F.-W. [Santa Cruz Institute for Particle Physics, University of California, Santa Cruz, CA 95064 (United States); Christophersen, M.; Phlips, B.F. [U.S. Naval Research Laboratory, Code 7654, 4555 Overlook Avenue, Southwest Washington, DC 20375 (United States); Pellegrini, G.; Rafi, J.M.; Quirion, D. [Instituto de Microelectrónica de Barcelona, IMB-CNM-CSIC, Bellaterra, Barcelona (Spain); Dalla Betta, G.-F. [INFN and University of Trento, Via Sommarive, 14, 38123 Povo di Trento (Italy); Boscardin, M. [Fondazione Bruno Kessler, Via Sommarive, 18, 38123 Povo di Trento (Italy); Casse, G. [Department of Physics, University of Liverpool, O. Lodge Laboratory, Oxford Street, Liverpool L69 7ZE (United Kingdom); Gorelov, I.; Hoeferkamp, M.; Metcalfe, J.; Seidel, S. [Department of Physics and Astronomy, University of New Mexico, MSC 07 4220, 1919 Lomas Boulevard NE, Albuquerque, NM 87131 (United States); Gaubas, E.; Ceponis, T. [Institute of Applied Research, Vilnius University, Sauletekio 9, LT-10222 Vilnius (Lithuania); and others


    We pursue scribe–cleave–passivate (SCP) technology for making “slim edge” sensors. The goal is to reduce the inactive region at the periphery of the devices while maintaining their performance. In this paper we report on two aspects of the current efforts. The first one involves fabrication options for mass production. We describe the automated cleaving tests and a simplified version of SCP post-processing of n-type devices. Another aspect is the radiation resistance of the passivation. We report on the radiation tests of n- and p-type devices with protons and neutrons.

  20. Feasibility of automated dropsize distributions from holographic data using digital image processing techniques. [particle diameter measurement technique (United States)

    Feinstein, S. P.; Girard, M. A.


    An automated technique for measuring particle diameters and their spatial coordinates from holographic reconstructions is being developed. Preliminary tests on actual cold-flow holograms of impinging jets indicate that a suitable discriminant algorithm consists of a Fourier-Gaussian noise filter and a contour thresholding technique. This process identifies circular as well as noncircular objects. The desired objects (in this case, circular or possibly ellipsoidal) are then selected automatically from the above set and stored with their parametric representations. From this data, dropsize distributions as a function of spatial coordinates can be generated and combustion effects due to hardware and/or physical variables studied.
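A common discriminant for separating the desired circular droplet images from noncircular objects is the circularity ratio 4πA/P², which equals 1 for a circle and falls off for elongated contours. The sketch below applies it to sampled synthetic contours; the hologram-specific filtering pipeline is not reproduced, and the shapes and threshold are illustrative assumptions:

```python
import math

def circularity(points):
    """4*pi*A / P^2 for a closed polygon: 1.0 for a circle,
    smaller for elongated or irregular contours."""
    n = len(points)
    area = 0.0     # shoelace accumulator (twice the signed area)
    perim = 0.0
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        area += x0 * y1 - x1 * y0
        perim += math.hypot(x1 - x0, y1 - y0)
    area = abs(area) / 2.0
    return 4.0 * math.pi * area / perim ** 2

def ellipse_contour(rx, ry, n=256):    # densely sampled ellipse outline
    return [(rx * math.cos(2 * math.pi * k / n),
             ry * math.sin(2 * math.pi * k / n)) for k in range(n)]

drop = circularity(ellipse_contour(10, 10))   # circular droplet: near 1.0
streak = circularity(ellipse_contour(30, 3))  # elongated artifact: well below 1
```

After thresholding the contours as described above, a cut on this ratio (optionally loosened to admit ellipsoidal drops) selects the objects to keep for the dropsize distribution.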

  1. DEdicated MONitor of EXotransits and Transients (DEMONEXT): a low-cost robotic and automated telescope for followup of exoplanetary transits and other transient events (United States)

    Villanueva, S.; Eastman, J. D.; Gaudi, B. S.; Pogge, R. W.; Stassun, K. G.; Trueblood, M.; Trueblood, P.


    We present the design and development of the DEdicated MONitor of EXotransits and Transients (DEMONEXT), an automated and robotic 20 inch telescope jointly funded by The Ohio State University and Vanderbilt University. The telescope is a PlaneWave CDK20 f/6.8 Corrected Dall-Kirkham Astrograph on a Mathis Instruments MI-750/1000 Fork Mount located at Winer Observatory in Sonoita, AZ. DEMONEXT has a Hedrick electronic focuser, a Finger Lakes Instrumentation (FLI) CFW-3-10 filter wheel, and a 2048 x 2048 pixel FLI ProLine CCD3041 camera with a pixel scale of 0.90 arc-seconds per pixel and a 30.7 × 30.7 arc-minute field-of-view. The telescope's automation, controls, and scheduling are implemented in Python, including a facility to add new targets in real time for rapid follow-up of time-critical targets. DEMONEXT will be used for the confirmation and detailed investigation of newly discovered planet candidates from the Kilodegree Extremely Little Telescope (KELT) survey, exploration of the atmospheres of Hot Jupiters via transmission spectroscopy and thermal emission measurements, and monitoring of select eclipsing binary star systems as benchmarks for models of stellar evolution. DEMONEXT will enable rapid confirmation imaging of supernovae, flare stars, tidal disruption events, and other transients discovered by the All-Sky Automated Survey for SuperNovae (ASAS-SN). DEMONEXT will also provide follow-up observations of single-transit planets identified by the Transiting Exoplanet Survey Satellite (TESS) mission, and will validate long-period eclipsing systems discovered by Gaia.

  2. Kohonen Self-Organizing Maps in Validity Maintenance for Automated Scoring of Constructed Response. (United States)

    Williamson, David M.; Bejar, Isaac I.

    As the automated scoring of constructed responses reaches operational status, monitoring the scoring process becomes a primary concern, particularly if automated scoring is intended to operate completely unassisted by humans. Using actual candidate selections from the Architectural Registration Examination (n=326), this study uses Kohonen…

  3. Automated Ground-based Time-lapse Camera Monitoring of West Greenland ice sheet outlet Glaciers: Challenges and Solutions (United States)

    Ahn, Y.; Box, J. E.; Balog, J.; Lewinter, A.


    Monitoring Greenland outlet glaciers using remotely sensed data has drawn great attention in the earth science community for decades, and time series analysis of sensor data has provided important information on glacier flow variability by detecting speed and thickness changes, tracking features, and acquiring model input. Thanks to advances in commercial digital camera technology and increased solid-state storage, we activated automatic ground-based time-lapse camera stations with high spatial/temporal resolution at west Greenland outlet glaciers and collected data at one-hour intervals, continuously for more than one year at some (but not all) sites. We believe that important information on ice dynamics is contained in these data, and that terrestrial mono-/stereo-photogrammetry can provide the theoretical and practical fundamentals for data processing, along with digital image processing techniques. Time-lapse images over these periods in west Greenland show various phenomena. Problems include rain, snow, fog, shadows, freezing of water on the camera enclosure window, image over-exposure, camera motion, sensor platform drift, foxes chewing instrument cables, and ravens pecking the plastic window. Other problems include feature identification, camera orientation, image registration, feature matching in image pairs, and feature tracking. Another obstacle is that non-metric digital cameras contain large distortions that must be compensated for precise photogrammetric use. Further, a massive number of images needs to be processed in a way that is sufficiently computationally efficient. We meet these challenges by 1) identifying problems in possible photogrammetric processes, 2) categorizing them based on feasibility, and 3) clarifying limitations and alternatives, while emphasizing displacement computation and analyzing regional/temporal variability. We experiment with mono- and stereo-photogrammetric techniques with the aid of automatic correlation matching for efficiently handling the enormous
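Displacement computation by correlation matching, as mentioned above, amounts to searching for a template from one frame inside a search window of the next frame and taking the offset with the highest normalized cross-correlation. The sketch below uses a synthetic "flow" (a circular shift of a random image) and arbitrary patch sizes; real imagery needs the distortion and registration corrections discussed in the abstract:

```python
import numpy as np

def track(ref, cur, top, left, size, search):
    """Find the integer-pixel displacement of a template (patch of ref)
    inside a search window of the later image cur, by maximizing the
    normalized cross-correlation score."""
    tpl = ref[top:top + size, left:left + size].astype(float)
    tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-12)
    best, best_dy, best_dx = -2.0, 0, 0
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = cur[top + dy:top + dy + size,
                      left + dx:left + dx + size].astype(float)
            w = (win - win.mean()) / (win.std() + 1e-12)
            score = float((tpl * w).mean())      # NCC in [-1, 1]
            if score > best:
                best, best_dy, best_dx = score, dy, dx
    return best_dy, best_dx

rng = np.random.default_rng(4)
ref = rng.random((64, 64))
cur = np.roll(ref, (3, -2), axis=(0, 1))   # simulate 3 px down, 2 px left motion
dy, dx = track(ref, cur, 20, 20, 16, 6)
```

FFT-based correlation and subpixel peak fitting make the same idea fast enough for the "massive number of images" the abstract mentions.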

  4. Marketing automation


    Raluca Dania TODOR


    The automation of the marketing process seems nowadays to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to the...

  5. A conceptual framework for automating the operational and strategic decision-making process in the health care delivery system. (United States)

    Ruohonen, Toni; Ennejmy, Mohammed


    Making reliable and justified operational and strategic decisions is a challenging task in the health care domain. So far, decisions have been made based on the experience of managers and staff, or evaluated with traditional methods using inadequate data. As a result of this kind of decision-making process, attempts to improve operations have usually failed or led only to local improvements. Health care organizations hold a lot of operational data, in addition to clinical data, which is the key element for making reliable and justified decisions. However, it is increasingly difficult to access this data and make use of it. In this paper we discuss how to exploit operational data most efficiently in the decision-making process. We share our vision of the future and propose a conceptual framework for automating the decision-making process.

  6. A computer based, automated analysis of process and outcomes of diabetic care in 23 GP practices.

    LENUS (Irish Health Repository)

    Hill, F


    The predicted prevalence of diabetes in Ireland by 2015 is 190,000. Structured diabetes care in general practice has outcomes equivalent to secondary care, and good diabetes care has been shown to be associated with the use of electronic healthcare records (EHRs). This automated analysis of EHRs in 23 practices took 10 minutes per practice, compared with 15 hours per practice for manual searches. Data were extracted for 1901 type II diabetics. There were valid data for >80% of patients for 6 of the 9 key indicators in the previous year. 543 (34%) had an HbA1c > 7.5%, 142 (9%) had a total cholesterol > 6 mmol/l, 83 (6%) had an LDL cholesterol > 4 mmol/l, 367 (22%) had triglycerides > 2.2 mmol/l, and 162 (10%) had blood pressure > 160/100 mmHg. Data quality and key indicators of care compare well with manual audits in Ireland and the U.K. Electronic healthcare records and automated audits should be a feature of all chronic disease management programs.
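An automated audit of this kind is essentially a set of threshold rules applied to fields extracted from the EHR. A minimal sketch using the indicators reported above, with invented patient records and the blood pressure rule interpreted here as both thresholds exceeded (an assumption, since the abstract does not specify):

```python
# Hypothetical extract: each record holds one patient's latest results.
patients = [
    {"hba1c": 8.1, "total_chol": 5.2, "ldl": 3.1, "trig": 2.5, "sys": 150, "dia": 95},
    {"hba1c": 6.9, "total_chol": 6.4, "ldl": 4.2, "trig": 1.8, "sys": 165, "dia": 102},
    {"hba1c": 7.2, "total_chol": 5.9, "ldl": 2.7, "trig": 2.0, "sys": 130, "dia": 80},
]

# Audit rules mirroring the key indicators reported in the abstract
rules = {
    "HbA1c > 7.5%":          lambda p: p["hba1c"] > 7.5,
    "Total cholesterol > 6": lambda p: p["total_chol"] > 6.0,
    "LDL > 4":               lambda p: p["ldl"] > 4.0,
    "Triglycerides > 2.2":   lambda p: p["trig"] > 2.2,
    "BP > 160/100":          lambda p: p["sys"] > 160 and p["dia"] > 100,
}

# count patients flagged by each rule
flags = {name: sum(rule(p) for p in patients) for name, rule in rules.items()}
```

Running such rules directly against the practice database is what turns a 15-hour manual chart search into a 10-minute query.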

  7. Comparison of manually produced and automated cross country movement maps using digital image processing techniques (United States)

    Wynn, L. K.


    The Image-Based Information System (IBIS) was used to automate the cross country movement (CCM) mapping model developed by the Defense Mapping Agency (DMA). Existing terrain factor overlays and a CCM map, produced by DMA for the Fort Lewis, Washington area, were digitized and reformatted into geometrically registered images. Terrain factor data from Slope, Soils, and Vegetation overlays were entered into IBIS and combined using IBIS-programmed equations to implement the DMA CCM model. The resulting IBIS-generated CCM map was then compared with the digitized manually produced map to test similarity. The number of pixels comprising each CCM region was compared between the two map images, and the percent agreement between each pair of regional counts was computed. The mean percent agreement equalled 86.21%, with an areally weighted standard deviation of 11.11%. Calculation of Pearson's correlation coefficient yielded +0.997. In some cases, the IBIS-calculated map code differed from the DMA codes: analysis revealed that IBIS had calculated the codes correctly. These highly positive results demonstrate the power and accuracy of IBIS in automating models which synthesize a variety of thematic geographic data.
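The per-region pixel-count comparison can be sketched as follows: count the pixels carrying each CCM code in both maps and take the ratio of the smaller to the larger count as percent agreement. This is one plausible reading of the comparison described above, applied to synthetic label maps:

```python
import numpy as np

def regional_agreement(map_a, map_b):
    """Per-region pixel-count comparison between two label maps:
    percent agreement = 100 * min(count_a, count_b) / max(count_a, count_b)."""
    out = {}
    for code in np.union1d(map_a, map_b):
        ca = int((map_a == code).sum())
        cb = int((map_b == code).sum())
        out[int(code)] = 100.0 * min(ca, cb) / max(ca, cb)
    return out

rng = np.random.default_rng(5)
manual = rng.integers(1, 5, (100, 100))   # digitized "manual" map, codes 1-4
auto = manual.copy()
auto[:2, :] = 1                            # small band of disagreement
agree = regional_agreement(manual, auto)
mean_agree = sum(agree.values()) / len(agree)
```

A per-pixel confusion matrix would be a stricter test (counts can agree while pixels disagree spatially), which is why the study also inspected where map codes differed.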

  8. Development of automated welding process for field fabrication of thick walled pressure vessels. Fourth quarter technical progress report for period ending September 28, 1980

    Energy Technology Data Exchange (ETDEWEB)


    Progress is reported in research aimed at optimizing an automated welding process for the field fabrication of thick-walled pressure vessels and for evaluating the welded joints. Information is included on the welding equipment, mechanical control of the process, joint design, filler wire optimization, in-process nondestructive testing of welds, and repair techniques. (LCL)

  9. Application of the informational reference system OZhUR to the automated processing of data from satellites of the Kosmos series (United States)

    Pokras, V. M.; Yevdokimov, V. P.; Maslov, V. D.


    The structure and potential of the information reference system OZhUR, designed for the automated data-processing systems of scientific space vehicles (SV), are considered. The system OZhUR ensures control of the extraction phase of processing with respect to a concrete SV and the exchange of data between phases. The practical application of the system OZhUR is exemplified by the construction of a data-processing system for satellites of the Kosmos series. As a result of automating the operations of exchange and control, the volume of manual data preparation is significantly reduced, and there is no longer any need for the individual logs which fixed the status of data processing. The system OZhUR is included in the automated data-processing system Nauka, which is realized in the PL-1 language on a BOS OS electronic computer.

  10. Intelligent instruments for process measurement techniques (monitoring of sensors) (United States)

    Bauer, B.; Hess, H. D.; Kalinski, J. R.; Leisenberg, W.; Marsch, D.


    Possibilities to extract redundant information from temperature sensors (resistance thermometers, thermocouples, semiconductor temperature sensors), and to find out which of the suggested redundancies are best suited for self-controlled monitoring, were investigated. Practical experience with equipment for process measurement shows that sensor failures are five times more frequent than electronic malfunctions. For resistance thermometers the measured values of the redundant information source (ac resistance) are too small (relative inductivity change of 7 per million). The information sources strain gage and propagation of ultrasonic waves are excluded because of physical properties of the sensor materials. Changes in the crystalline structure of thermocouples mean that there is no well-defined relationship between the thermoelectric voltage and the redundant information sources, resistance and coupled current impulses. A correlation of thermovoltage with these redundant values would yield a measurement uncertainty corresponding to more than ±50 K. Experiments with negative-temperature-coefficient sensors show that a failure is preceded by a change in capacitance of the order of 0.1 pF.

  11. Automated DNA Sequencing System

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.


    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  12. A Case Study Improvement of a Testing Process by Combining Lean Management, Industrial Engineering and Automation Methods

    Directory of Open Access Journals (Sweden)

    Simon Withers


    Full Text Available Increasingly competitive market environments have forced not only large manufacturers, but also small- and medium-sized enterprises (SMEs), to look for means to improve their operations in order to increase competitive strength. This paper presents the adaptation and adoption, by a UK SME engineering service organisation, of lean management, industrial engineering, and automation methods developed within larger organisations. This SME sought to improve the overall performance of one of its core testing processes. An exploratory analysis, based on the lean management concept of “value added” and the work measurement technique “time study”, was developed and carried out in order to understand the current performance of a testing process for gas turbine fuel flow dividers. A design for the automation of some operations of the testing process was followed as an approach to reduce non-value-added activities and improve the overall efficiency of the testing process. The overall testing time was reduced from 12.41 to 7.93 hours (36.09 percent), while the man hours and non-value-added time were also reduced from 23.91 to 12.94 hours (45.87 percent) and from 11.08 to 6.69 hours (39.67 percent), respectively. This resulted in an increase in process efficiency in terms of man hours from 51.91 to 61.28 percent. The contribution of this paper resides in presenting a case study that can be used as a guiding reference for managers and engineers to undertake improvement projects, in their organisations, similar to the one presented in this paper.

  13. Monitoring endemic livestock diseases using laboratory diagnostic data: A simulation study to evaluate the performance of univariate process monitoring control algorithms. (United States)

    Lopes Antunes, Ana Carolina; Dórea, Fernanda; Halasa, Tariq; Toft, Nils


    Surveillance systems are critical for accurate, timely monitoring and effective disease control. In this study, we investigated the performance of univariate process monitoring control algorithms in detecting changes in seroprevalence for endemic diseases. We also assessed the effect of sample size (number of sentinel herds tested in the surveillance system) on the performance of the algorithms. Three univariate process monitoring control algorithms were compared: the Shewhart p chart (PSHEW), the Cumulative Sum (CUSUM) and the Exponentially Weighted Moving Average (EWMA). Increases in seroprevalence were simulated from 0.10 to 0.15 and 0.20 over 4, 8, 24, 52 and 104 weeks. Each epidemic scenario was run with 2000 iterations. The cumulative sensitivity (CumSe) and timeliness were used to evaluate the algorithms' performance with a 1% false alarm rate. Using these performance evaluation criteria, it was possible to assess the accuracy and timeliness of the surveillance system working in real time. The results showed that EWMA and PSHEW had higher CumSe (when compared with CUSUM) from week 1 until the end of the period for all simulated scenarios. Changes in seroprevalence from 0.10 to 0.20 were more easily detected (higher CumSe) than changes from 0.10 to 0.15 for all three algorithms. Similar results were found with EWMA and PSHEW, based on the median time to detection. Changes in the seroprevalence were detected later with CUSUM, compared to EWMA and PSHEW, for the different scenarios. Increasing the sample size 10-fold halved the time to detection (CumSe = 1), whereas increasing the sample size 100-fold reduced the time to detection by a factor of 6. This study investigated the performance of three univariate process monitoring control algorithms in monitoring endemic diseases. It was shown that automated systems based on these detection methods identified changes in seroprevalence at different times. Increasing the number of tested herds would lead to faster detection.
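    The EWMA scheme compared in the study can be sketched as follows. This is a minimal illustration, not the authors' implementation: the smoothing weight, control-limit multiplier, herd sample size and the simulated seroprevalence series are all assumed values.

```python
import numpy as np

def ewma_alarm_time(series, p0, n_herds, lam=0.2, L=2.7):
    """Return the first week at which an EWMA chart on weekly
    seroprevalence estimates signals an upward shift from baseline
    prevalence p0, or None if it never signals."""
    sigma = np.sqrt(p0 * (1 - p0) / n_herds)          # sd of a proportion
    ucl = p0 + L * sigma * np.sqrt(lam / (2 - lam))   # asymptotic limit
    z = p0                                            # EWMA start value
    for week, p_hat in enumerate(series):
        z = lam * p_hat + (1 - lam) * z
        if z > ucl:
            return week
    return None

rng = np.random.default_rng(1)
n_herds = 100
weeks_baseline = rng.binomial(n_herds, 0.10, size=20) / n_herds  # p = 0.10
weeks_shifted = rng.binomial(n_herds, 0.20, size=40) / n_herds   # outbreak
alarm_week = ewma_alarm_time(
    np.concatenate([weeks_baseline, weeks_shifted]), p0=0.10, n_herds=n_herds)
```

    Raising `n_herds` tightens the control limit and shortens the time to detection, mirroring the sample-size effect reported above.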


    Institute of Scientific and Technical Information of China (English)

    WU Yaohua; ZHANG Yigong; WU Yingying


    Compared to the fixed virtual window algorithm (FVWA), the dynamic virtual window algorithm (DVWA) determines the length of each virtual container according to the sizes of the goods in each order, which saves virtual-container space and improves picking efficiency. However, the gaps between consecutive goods caused by the dispensers on the conveyor cannot be eliminated by DVWA, which limits further improvement of the picking efficiency. To solve this problem, a compressible virtual window algorithm (CVWA) is presented. It not only inherits the merits of DVWA but also compresses the length of the virtual containers without congestion of order accumulation, by advancing the beginning time of order picking and reasonably coordinating the pace of order accumulation. The simulation results show that the picking efficiency of an automated sorting system is greatly improved by CVWA.
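    The relationship between the three algorithms can be illustrated with a toy calculation. The order sizes, unit good length, dispenser gap and fixed window length below are invented for illustration; the point is only that total container length shrinks from FVWA to DVWA to CVWA.

```python
# Sketch contrasting fixed, dynamic and compressed virtual-container sizing.

def fixed_window_length(orders, window):
    # FVWA: every order occupies one fixed-length virtual container.
    return len(orders) * window

def dynamic_window_length(orders, unit=1.0, gap=0.5):
    # DVWA: each container is sized to its order's goods, but the
    # dispenser-induced gap between consecutive goods remains.
    return sum(n * (unit + gap) for n in orders)

def compressed_window_length(orders, unit=1.0):
    # CVWA (idealised): gaps are compressed away by advancing the start
    # of order picking and coordinating the accumulation pace.
    return sum(n * unit for n in orders)

orders = [3, 1, 5, 2]           # number of goods per order (illustrative)
fvwa = fixed_window_length(orders, window=8.0)
dvwa = dynamic_window_length(orders)
cvwa = compressed_window_length(orders)
```

    Shorter total container length on the same conveyor translates directly into higher sorting throughput, which is the efficiency gain the abstract describes.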

  15. Acoustic Emission Based In-process Monitoring in Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas; Bissacco, Giuliano; De Chiffre, Leonardo

    The applicability of acoustic emission (AE) measurements for in-process monitoring in the Robot Assisted Polishing (RAP) process was investigated. Surface roughness measurements require interruption of the process, proper surface cleaning and measurements that sometimes necessitate removal of the...

  16. A high throughput MATLAB program for automated force-curve processing using the AdG polymer model. (United States)

    O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A


    Research in understanding biofilm formation depends on accurate and representative measurements of the steric forces related to polymer brushes on bacterial surfaces. A MATLAB program to analyze AFM force curves efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the amount of time required to process 100 force curves from several days to less than 2 minutes. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher-quality results in a shorter period of time.
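    As a sketch of the fitting step, one commonly used exponential approximation of the AdG steric force between an AFM probe and a polymer brush is F(D) ≈ A·exp(−2πD/L0), where L0 is the equilibrium brush length and the prefactor A lumps probe radius, temperature and grafting density. The approximation, the log-linear fitting shortcut and all parameter values below are assumptions for illustration, not the program's actual modified AdG model.

```python
import numpy as np

def adg_force(D, L0, A):
    """Exponential approximation of the AdG steric force,
    F(D) = A * exp(-2*pi*D / L0), valid roughly for 0.2 < D/L0 < 0.9."""
    return A * np.exp(-2 * np.pi * D / L0)

def fit_adg(D, F):
    """Recover (L0, A) by linear least squares on the log-linearised
    model: ln F = ln A - (2*pi/L0) * D."""
    slope, intercept = np.polyfit(D, np.log(F), 1)
    return -2 * np.pi / slope, np.exp(intercept)

# Synthetic noiseless force curve (illustrative units, e.g. nm and nN):
D = np.linspace(20.0, 90.0, 50)
F = adg_force(D, L0=100.0, A=5.0)
L0_fit, A_fit = fit_adg(D, F)
```

    In a batch pipeline the same fit would be applied to each cropped curve, which is how hundreds of curves can be processed in minutes rather than days.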

  17. A fully automated meltwater monitoring and collection system for spatially distributed isotope analysis in snowmelt-dominated catchments (United States)

    Rücker, Andrea; Boss, Stefan; Von Freyberg, Jana; Zappa, Massimiliano; Kirchner, James


    In many mountainous catchments the seasonal snowpack stores a significant volume of water, which is released as streamflow during the melting period. The predicted change in future climate will bring new challenges in water resource management in snow-dominated headwater catchments and their receiving lowlands. To improve predictions of hydrologic extreme events, particularly summer droughts, it is important to characterize the relationship between the winter snowpack and summer (low) flows in such areas (e.g., Godsey et al., 2014). In this context, stable water isotopes (18O, 2H) are a powerful tool for fingerprinting the sources of streamflow and tracing water flow pathways. For this reason, we have established an isotope sampling network in the Alptal catchment (46.4 km2) in central Switzerland as part of the SREP-Drought project (Snow Resources and the Early Prediction of hydrological DROUGHT in mountainous streams). Samples of precipitation (daily), snow cores (weekly) and runoff (daily) are analyzed for their isotopic signature in a regular cycle. Precipitation is also sampled along a horizontal transect at the valley bottom and along an elevational transect. Additionally, the analysis of snow meltwater is of importance. As the collection of snow meltwater samples in mountainous terrain is often impractical, we have developed a fully automatic snow lysimeter system, which measures meltwater volume and collects samples for isotope analysis at daily intervals. The system consists of three lysimeters built from Decagon ECRN-100 High Resolution Rain Gauges as standard components that allow monitoring of meltwater flow. Each lysimeter leads the meltwater into a 10-liter container that is automatically sampled and then emptied daily. These water samples are collected regularly and later analyzed for their isotopic composition in the lab. Snowmelt events as well as system status can be monitored in real time. In our presentation we describe this automatic snow lysimeter system.

  18. A fully automated health-care monitoring at home without attachment of any biological sensors and its clinical evaluation. (United States)

    Motoi, Kosuke; Ogawa, Mitsuhiro; Ueno, Hiroshi; Kuwae, Yutaka; Ikarashi, Akira; Yuji, Tadahiko; Higashi, Yuji; Tanaka, Shinobu; Fujimoto, Toshiro; Asanoi, Hidetsugu; Yamakoshi, Ken-ichi


    Daily monitoring of health condition is important for an effective scheme of early diagnosis, treatment and prevention of lifestyle-related diseases such as adiposis, diabetes, cardiovascular diseases and other diseases. Commercially available devices for health-care monitoring at home are cumbersome in terms of self-attachment of biological sensors and self-operation of the devices. From this viewpoint, we have been developing a non-conscious physiological monitor installed in a bath, a lavatory, and a bed for home health care, and have evaluated its measurement accuracy by simultaneous recordings from biological sensors directly attached to the body surface. In order to investigate its applicability to health condition monitoring, we have further developed a new monitoring system which can automatically monitor and store health condition data. In this study, in an evaluation of 3 patients with cardiac infarct or sleep apnea syndrome, health conditions such as body weight and excretion weight in the toilet, and apnea and hypopnea during sleep, were successfully monitored, indicating that the system appears useful for monitoring health condition during daily living.

  19. Perfect error processing: Perfectionism-related variations in action monitoring and error processing mechanisms. (United States)

    Stahl, Jutta; Acharki, Manuela; Kresimon, Miriam; Völler, Frederike; Gibbons, Henning


    Showing excellent performance and avoiding poor performance are the main characteristics of perfectionists. Perfectionism-related variations (N = 94) in the neural correlates of performance monitoring were investigated in a flanker task by assessing two perfectionism-related trait dimensions: personal standard perfectionism (PSP), reflecting the intrinsic motivation to show error-free performance, and evaluative concern perfectionism (ECP), representing the worry of being poorly evaluated because of bad performance. A moderating effect of ECP and PSP on error processing - an important performance monitoring system - was investigated by examining the error(-related) negativity (Ne/ERN) and the error positivity (Pe). The smallest Ne/ERN difference (error minus correct) was obtained for pure-ECP participants (high ECP, low PSP), whereas the largest difference was shown by those with high ECP and high PSP (i.e., mixed perfectionists). Pe was positively correlated with PSP only. Our results support the cognitive-bias hypothesis, which suggests that pure-ECP participants reduce response-related attention to avoid intense error processing by minimising the subjective threat of negative evaluations. The PSP-related variations in late error processing are consistent with the goal-oriented tendency of participants high in PSP to optimise their behaviour.

  20. An EWMA-type control chart for monitoring the process mean using auxiliary information

    NARCIS (Netherlands)

    Abbas, N.; Riaz, M.; Does, R.J.M.M.


    Statistical process control (SPC) is an important application of statistics in which the outputs of production processes are monitored. Control charts are an important tool of SPC. A very popular example is Shewhart's X-bar chart, used to monitor the mean of a process characteristic. Two alternative

  1. Generalized model of synthesis quality monitoring system of extraction, processing and transportation of gas

    Directory of Open Access Journals (Sweden)

    Леонид Иванович Нефедов


    Full Text Available Gas extraction, processing, and transportation are analyzed in this article. On that basis, a decomposition of the monitoring process is introduced, from which the hierarchical structure of the monitoring system is derived. The scientific novelty consists in a generalized model for the synthesis of a quality monitoring system for gas extraction, processing, and transportation, which allows the synthesis problem to be solved from unified system and criteria positions.

  2. Cloud-based CT dose monitoring using the DICOM-structured report. Fully automated analysis in regard to national diagnostic reference levels

    Energy Technology Data Exchange (ETDEWEB)

    Boos, J.; Rubbert, C.; Heusch, P.; Lanzman, R.S.; Aissa, J.; Antoch, G.; Kroepil, P. [Univ. Duesseldorf (Germany). Dept. of Diagnostic and Interventional Radiology; Meineke, A. [Cerner Health Services, Idstein (Germany)


    To implement automated CT dose data monitoring using the DICOM structured report (DICOM-SR) in order to monitor dose-related CT data with regard to national diagnostic reference levels (DRLs). Materials and Methods: We used a novel in-house co-developed software tool based on the DICOM-SR to automatically monitor dose-related data from CT examinations. The DICOM-SR for each CT examination performed between 09/2011 and 03/2015 was automatically anonymized and sent from the CT scanners to a cloud server. Data was automatically analyzed in accordance with body region, patient age and the corresponding DRL for the volumetric computed tomography dose index (CTDIvol) and the dose length product (DLP). Results: Data from 36 523 examinations (131 527 scan series) performed on three different CT scanners and one PET/CT were analyzed. The overall mean CTDIvol and DLP were 51.3 % and 52.8 % of the national DRLs, respectively. CTDIvol and DLP reached 43.8 % and 43.1 % for abdominal CT (n = 10 590), 66.6 % and 69.6 % for cranial CT (n = 16 098) and 37.8 % and 44.0 % for chest CT (n = 10 387) of the compared national DRLs, respectively. Overall, the CTDIvol exceeded national DRLs in 1.9 % of the examinations, while the DLP exceeded national DRLs in 2.9 % of the examinations. Between different CT protocols for the same body region, radiation exposure varied by up to 50 % of the DRLs. Conclusion: The implemented cloud-based CT dose monitoring based on the DICOM-SR enables automated benchmarking with regard to national DRLs. Overall, the local dose exposure from CT reached approximately 50 % of these DRLs, indicating that updated DRLs as well as protocol-specific DRLs are desirable. The cloud-based approach enables multi-center dose monitoring and offers great potential to further optimize radiation exposure in radiological departments.
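    The benchmarking logic can be sketched in a few lines. The DRL table and the examination records below are invented for illustration; they are not the national DRLs used in the study.

```python
# Map body region -> (CTDIvol limit [mGy], DLP limit [mGy*cm]).
# These reference values are placeholders, not official DRLs.
DRLS = {
    "abdomen": (20.0, 900.0),
    "cranium": (65.0, 950.0),
    "chest":   (12.0, 400.0),
}

def benchmark(exams):
    """Return per-exam dose as a percentage of the DRL plus an
    exceedance flag, given (region, CTDIvol, DLP) tuples."""
    out = []
    for region, ctdi, dlp in exams:
        ctdi_ref, dlp_ref = DRLS[region]
        out.append({
            "region": region,
            "ctdi_pct": 100.0 * ctdi / ctdi_ref,
            "dlp_pct": 100.0 * dlp / dlp_ref,
            "exceeds": ctdi > ctdi_ref or dlp > dlp_ref,
        })
    return out

report = benchmark([("abdomen", 8.8, 388.0), ("chest", 13.1, 300.0)])
```

    Aggregating `ctdi_pct`/`dlp_pct` per protocol is what reveals the roughly 50 % headroom below the DRLs that the study reports.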



    Sujatha, K.; Venmathi, M.; Pappa, N.


    Combustion quality in power station boilers plays an important role in minimizing flue gas emissions. In the present work, various intelligent schemes to infer the flue gas emissions by monitoring the flame colour at the furnace of the boiler are proposed. Flame image monitoring involves capturing the flame video over a period of time with the measurement of various parameters like carbon dioxide (CO2), excess oxygen (O2), nitrogen oxides (NOx), sulphur oxides (SOx) and carbon monox...

  4. Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.12 (United States)


    images in JPEG format. AOPS uses a simple monitoring technique. The main driver regularly polls a specified input directory for incoming data and for...Inversion technique by comparison of diver visibility products. • The new image merge capability allowing enhanced spatial coverage by combining the...Medium Resolution Imaging Spectrometer (MERIS), Meteorology and Oceanography (METOC), Mine Warfare (MIW), Moderate Resolution Imaging Spectrometers

  5. A KPI-based process monitoring and fault detection framework for large-scale processes. (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang


    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, and their performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches have not been developed within a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least-squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrumental variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods.
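    The static case described above can be sketched as ordinary least squares from process variables to KPIs, with a residual-based detection statistic. The dimensions, noise level, threshold rule and fault magnitude below are illustrative assumptions, not the paper's exact test statistic or thresholds.

```python
import numpy as np

# Simulated training data: KPIs y depend linearly on process variables x.
rng = np.random.default_rng(0)
n, p, m = 500, 4, 2                       # samples, process vars, KPIs
X = rng.normal(size=(n, p))
Theta_true = rng.normal(size=(p, m))
Y = X @ Theta_true + 0.05 * rng.normal(size=(n, m))

Theta, *_ = np.linalg.lstsq(X, Y, rcond=None)   # training: LS estimate

def residual_stat(x, y):
    """Squared norm of the KPI prediction residual for one sample."""
    r = y - x @ Theta
    return float(r @ r)

# Detection threshold from in-control residuals (99th percentile is an
# assumed rule of thumb, matching a ~1% false alarm rate).
train_stats = np.array([residual_stat(X[i], Y[i]) for i in range(n)])
threshold = np.quantile(train_stats, 0.99)

x_new = rng.normal(size=p)
y_fault = x_new @ Theta_true + 2.0        # additive fault on the KPIs
faulty = residual_stat(x_new, y_fault) > threshold
```

    A fault that shifts the KPIs without a matching change in the process variables inflates the residual and trips the threshold.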

  6. RNA extracted from blood samples with a rapid automated procedure is fit for molecular diagnosis or minimal residual disease monitoring in patients with a variety of malignant blood disorders. (United States)

    Bechlian, Didier; Honstettre, Amélie; Terrier, Michèle; Brest, Christelle; Malenfant, Carine; Mozziconacci, Marie-Joëlle; Chabannon, Christian


    Scientific studies in oncology, cancer diagnosis, and the monitoring of tumor response to therapeutics currently rely on a growing body of clinico-pathological information. This often includes molecular analyses. The quality of these analyses depends on both pre-analytical and analytical factors and often includes the extraction of DNA and/or RNA from human tissues and cells. The quality and quantity of the obtained nucleic acids are of utmost importance. The use of automated techniques presents several advantages over manual techniques, such as reducing technical time and thus cost, and facilitating standardization. The purpose of this study was to validate an automated technique for RNA extraction from cells of patients treated for various malignant blood diseases. A well-established manual technique was compared to an automated technique for extracting RNA from blood samples drawn for the molecular diagnosis of a variety of leukemic diseases or the monitoring of minimal residual disease. The quality of the RNA was evaluated by real-time quantitative RT-PCR (RQ-PCR) analyses of the Abelson gene transcript. The results show that both techniques produce RNA of comparable quality and quantity, suggesting that an automated technique can be substituted for the manual reference technique used in the daily routine of a molecular pathology laboratory involved in minimal residual disease monitoring. The increased costs of reagents and disposables used for automated techniques can be compensated by a decrease in human resources.

  7. Miniaturized, Multi-Analyte Sensor Array for the Automated Monitoring of Major Atmospheric Constituents in Spacecraft Environment Project (United States)

    National Aeronautics and Space Administration — The objective of the Phase II SBIR project is to develop a prototype sensor system to detect gaseous analytes in support of the spacecraft environmental monitoring...

  8. Miniaturized, Multi-Analyte Sensor Array for the Automated Monitoring of Major Atmospheric Constituents in Spacecraft Environment Project (United States)

    National Aeronautics and Space Administration — InnoSense LLC (ISL) proposes to develop a miniaturized, multi-analyte sensor for near real-time monitoring of analytes in the spacecraft environment. The proposed...

  9. Evaluating an Automated Approach for Monitoring Forest Disturbances in the Pacific Northwest from Logging, Fire and Insect Outbreaks with Landsat Time Series Data

    Directory of Open Access Journals (Sweden)

    Christopher S. R. Neigh


    Full Text Available Forests are the largest aboveground sink for atmospheric carbon (C), and understanding how they change through time is critical to reduce our C-cycle uncertainties. We investigated a strong decline in the Normalized Difference Vegetation Index (NDVI) from 1982 to 1991 in Pacific Northwest forests, observed with the National Oceanic and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometers (AVHRRs). To understand the causal factors of this decline, we evaluated an automated classification method developed for Landsat time series stacks (LTSS) to map forest change. This method included: (1) multiple disturbance index thresholds; and (2) a spectral trajectory-based image analysis with multiple confidence thresholds. We produced 48 maps and verified their accuracy with air photos, Monitoring Trends in Burn Severity data and insect aerial detection survey data. Area-based accuracy estimates for change in forest cover resulted in producer's and user's accuracies of 0.21 ± 0.06 to 0.38 ± 0.05 for insect disturbance, 0.23 ± 0.07 to 1 ± 0 for burned area and 0.74 ± 0.03 to 0.76 ± 0.03 for logging. We believe that accuracy was low for insect disturbance because air photo reference data were temporally sparse, hence missing some outbreaks, and the annual anniversary time step is not dense enough to track defoliation and progressive stand mortality. Producer's and user's accuracy for burned area was low due to the temporally abrupt nature of fire and harvest, with a similar response of the spectral indices between the disturbance index and the normalized burn ratio. We conclude that the spectral trajectory approach also captures multi-year stress that could be caused by climate, acid deposition, pathogens, partial harvest, thinning, etc. Our study focused on understanding the transferability of previously successful methods to new ecosystems and found that this automated method does not perform with the same accuracy in Pacific Northwest forests.
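    The first component of the method, applying multiple disturbance index thresholds, can be sketched as follows. The index values and candidate thresholds are invented for illustration; in the study each threshold yields a candidate change map whose accuracy is then checked against reference data.

```python
import numpy as np

def flag_disturbance(di_series, thresholds=(1.5, 2.0, 2.5)):
    """For each candidate threshold, flag the years whose disturbance
    index exceeds it; returns {threshold: array of year indices}.
    Trying several thresholds mimics producing multiple candidate maps
    for later accuracy assessment."""
    di = np.asarray(di_series, dtype=float)
    return {t: np.flatnonzero(di > t) for t in thresholds}

# Illustrative annual disturbance-index trajectory for one pixel:
di = [0.2, 0.3, 0.1, 2.8, 2.1, 0.4, 0.2, 1.7, 0.3]
flags = flag_disturbance(di)
```

    A strict threshold catches only the abrupt event (year 3), while looser thresholds also pick up lower-magnitude, multi-year change, which is the trade-off behind the 48 candidate maps.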

  10. Evaluating an Automated Approach for Monitoring Forest Disturbances in the Pacific Northwest from Logging, Fire and Insect Outbreaks with Landsat Time Series Data (United States)

    Neigh, Christopher S. R.; Bolton, Douglas K.; Williams, Jennifer J.; Diabate, Mouhamad


    Forests are the largest aboveground sink for atmospheric carbon (C), and understanding how they change through time is critical to reduce our C-cycle uncertainties. We investigated a strong decline in the Normalized Difference Vegetation Index (NDVI) from 1982 to 1991 in Pacific Northwest forests, observed with the National Oceanic and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometers (AVHRRs). To understand the causal factors of this decline, we evaluated an automated classification method developed for Landsat time series stacks (LTSS) to map forest change. This method included: (1) multiple disturbance index thresholds; and (2) a spectral trajectory-based image analysis with multiple confidence thresholds. We produced 48 maps and verified their accuracy with air photos, Monitoring Trends in Burn Severity data and insect aerial detection survey data. Area-based accuracy estimates for change in forest cover resulted in producer's and user's accuracies of 0.21 +/- 0.06 to 0.38 +/- 0.05 for insect disturbance, 0.23 +/- 0.07 to 1 +/- 0 for burned area and 0.74 +/- 0.03 to 0.76 +/- 0.03 for logging. We believe that accuracy was low for insect disturbance because air photo reference data were temporally sparse, hence missing some outbreaks, and the annual anniversary time step is not dense enough to track defoliation and progressive stand mortality. Producer's and user's accuracy for burned area was low due to the temporally abrupt nature of fire and harvest, with a similar response of the spectral indices between the disturbance index and the normalized burn ratio. We conclude that the spectral trajectory approach also captures multi-year stress that could be caused by climate, acid deposition, pathogens, partial harvest, thinning, etc. Our study focused on understanding the transferability of previously successful methods to new ecosystems and found that this automated method does not perform with the same accuracy in Pacific Northwest forests.

  11. A new process monitoring method based on noisy time structure independent component analysis

    Institute of Scientific and Technical Information of China (English)

    Lianfang Cai; Xuemin Tian


    The conventional process monitoring method based on fast independent component analysis (FastICA) cannot take the ubiquitous measurement noises into account and may exhibit degraded monitoring performance under their adverse effects. In this paper, a new process monitoring approach based on noisy time structure ICA (NoisyTSICA) is proposed to solve this problem. A NoisyTSICA algorithm which can consider the measurement noises explicitly is first developed to estimate the mixing matrix and extract the independent components (ICs). Subsequently, a monitoring statistic is built to detect process faults on the basis of recursive kurtosis estimations of the dominant ICs. Lastly, a contribution plot for the monitoring statistic is constructed to identify the fault variables based on sensitivity analysis. Simulation studies on a continuous stirred tank reactor system demonstrate that the proposed NoisyTSICA-based monitoring method outperforms the conventional FastICA-based monitoring method.
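    The recursive kurtosis idea can be sketched with exponentially weighted moment estimates: when a dominant IC stops being near-Gaussian, its excess kurtosis drifts away from zero. The forgetting factor, the Gaussian/Laplacian stand-ins for normal and faulty ICs, and the absence of a formal control limit are all simplifications for illustration, not the paper's NoisyTSICA statistic.

```python
import numpy as np

def recursive_kurtosis(x, beta=0.99):
    """Exponentially weighted recursive estimates of E[s^2] and E[s^4],
    from which excess kurtosis kappa = E[s^4]/E[s^2]^2 - 3 is formed
    at every time step."""
    m2, m4 = 1.0, 3.0          # initialise at unit-variance Gaussian moments
    kappa = np.empty(len(x))
    for t, s in enumerate(x):
        m2 = beta * m2 + (1 - beta) * s**2
        m4 = beta * m4 + (1 - beta) * s**4
        kappa[t] = m4 / m2**2 - 3.0
    return kappa

rng = np.random.default_rng(0)
normal_ic = rng.normal(size=2000)                # in-control: Gaussian IC
faulty_ic = rng.laplace(size=2000) / np.sqrt(2)  # fault: heavy-tailed IC
kappa = recursive_kurtosis(np.concatenate([normal_ic, faulty_ic]))
```

    In a full scheme, `kappa` for each dominant IC would be compared against control limits estimated from in-control data, with a contribution plot tracing an alarm back to the responsible process variables.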

  12. Automated three-dimensional detection and classification of living organisms using digital holographic microscopy with partial spatial coherent source: application to the monitoring of drinking water resources. (United States)

    El Mallahi, Ahmed; Minetti, Christophe; Dubois, Frank


    In this paper, we investigate the use of a digital holographic microscope working with partially coherent spatial illumination for the automated detection and classification of living organisms. A robust automatic method based on the computation of propagation matrices is proposed to detect the 3D positions of organisms. We apply this procedure to the evaluation of drinking water resources by developing a classification process to identify parasitic Giardia lamblia cysts among two other similar organisms. By selecting textural features from the quantitative optical phase instead of morphological ones, a robust classifier is built, offering a new method for the unambiguous detection of Giardia lamblia cysts, which present a critical contamination risk.
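    The abstract does not give the form of its propagation matrices, but the standard numerical refocusing step in digital holography is angular spectrum propagation, sketched below under that assumption; the wavelength, pixel pitch and random test field are likewise invented. As a sanity check, propagating forward and then backward by the same distance recovers the input field.

```python
import numpy as np

def angular_spectrum_propagate(field, z, wavelength, dx):
    """Numerically propagate a sampled complex field by a distance z
    (z, wavelength and pixel pitch dx in the same units)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))  # drop evanescent part
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

rng = np.random.default_rng(0)
f0 = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
f1 = angular_spectrum_propagate(f0, z=50e-6, wavelength=0.5e-6, dx=1e-6)
f_back = angular_spectrum_propagate(f1, z=-50e-6, wavelength=0.5e-6, dx=1e-6)
roundtrip_error = float(np.max(np.abs(f_back - f0)))
```

    In practice the recorded hologram is propagated to a stack of z planes and each organism is located at the plane where a focus metric peaks, giving its 3D position.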

  13. Validation Test Report for the Automated Optical Processing System (AOPS) Version 4.10 (United States)


    generate co-registered image databases of geophysical parameters derived from remotely sensed data. To accomplish this, AOPS uses the techniques of...quick-look “browse” images in JPEG format. AOPS uses a simple monitoring technique. The main driver regularly polls a specified input directory for...Oceanography (METOC), Mine Warfare (MIW), Moderate Resolution Imaging Spectrometers (MODIS, on Terra and Aqua), National Aeronautic and Space

  14. Warehouse automation


    Pogačnik, Jure


    An automated high-bay warehouse is commonly used for storing large numbers of items with a high throughput. In an automated warehouse, pallet movements are mainly performed by a number of automated devices like conveyor systems, trolleys, and stacker cranes. From the introduction of the material into the automated warehouse system to its dispatch, the system requires no operator input or intervention, since all material movements are done automatically. This allows the automated warehouse to op...


    Directory of Open Access Journals (Sweden)

    Mrs. Kavitha. R,


    Full Text Available The vein pattern in palms is a random mesh of interconnected and intertwining blood vessels. This project is an application of the vein detection concept to automate the drug delivery process. It deals with extracting palm dorsal vein structures, which is a key procedure for selecting the optimal drug needle insertion point. Gray-scale images obtained from a low-cost IR webcam are poor in contrast and usually noisy, which makes effective vein segmentation a great challenge. Here a new vein image segmentation method is introduced, based on enhancement techniques, which resolves the conflict between poor-contrast vein images and good-quality image segmentation. A Gaussian filter is used to remove the high-frequency noise in the image. The ultimate goal is to identify venous bifurcations and determine the insertion point for the needle between their branches.

  16. Case study of verification, validation, and testing in the Automated Data Processing (ADP) system development life cycle

    Energy Technology Data Exchange (ETDEWEB)

    Riemer, C.A.


    Staff of the Environmental Assessment and Information Sciences Division of Argonne National Laboratory (ANL) studied the role played by the organizational participants in the Department of Veterans Affairs (VA) that conduct verification, validation, and testing (VV&T) activities at various stages in the automated data processing (ADP) system development life cycle (SDLC). A case-study methodology was used to assess the effectiveness of VV&T activities (tasks) and products (inputs and outputs). The case selected for the study was a project designed to interface the compensation and pension (C&P) benefits systems with the centralized accounts receivable system (CARS). Argonne developed an organizational SDLC VV&T model and checklists to help collect information from C&P/CARS participants on VV&T procedures and activities, and these were then evaluated against VV&T standards.

  17. Code it rite the first time : automated invoice processing solution designed to ensure validity to field ticket coding

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, G.


    An entrepreneur who ran 55 rigs for a major oilfield operator in Calgary has developed a solution for the oil industry that reduces field ticketing errors from 40 per cent to almost none. The Code-Rite not only simplifies field ticketing but can eliminate weeks of trying to balance authorization for expenditure (AFE) numbers. A service provider who wants a field ticket signed for billing purposes following a service call to a well site receives all pertinent information on a barcode that includes AFE number, location, routing, approval authority and mailing address. Attaching the label to the field ticket provides all the invoicing information needed. This article described the job profile, education and life experiences and opportunities that led the innovator to develop this technology that solves an industry-wide problem. Code-Rite is currently being used by 3 large upstream oil and gas operators and plans are underway to automate the entire invoice processing system. 1 fig.

  18. Online monitoring of food processes using subsurface laser scattering

    DEFF Research Database (Denmark)

    Carstensen, Jens Michael; Møller, Flemming

    Online monitoring of physical parameters during food production is not a trivial task, but promising results can often be obtained with Subsurface Laser Scattering (SLS). The first SLS instruments are on the market today, and studies are needed to assess the potential of the technology. SLS can...... monitor particle changes and gelation formation in a fast and non-invasive manner during production of most food products. SLS is correlated to classical particle sizing parameters, i.e. size, number of light scatterers and refractive index, as well as sensoric parameters like mouthfeel. The background......

  19. An alternative method for monitoring carbonyls, and the development of a 24-port fully automated carbonyl sampler for PAMS program

    Energy Technology Data Exchange (ETDEWEB)

    Parmar, S.S.; Ugarova, L. [Atmospheric Analysis and Consulting, Ventura, CA (United States); Fernandes, C.; Guyton, J.; Lee, C.P. [Arizona Dept. of Environmental Quality, Phoenix, AZ (United States)


    The authors have investigated the possibility of collecting different aldehydes and ketones on different sorbents such as silica gel, molecular sieve and charcoal, followed by solvent extraction, DNPH derivatization and HPLC/UV analysis. Carbonyl collection efficiencies for these sorbents were calculated relative to a DNPH-coated C18 Sep-Pak cartridge. From a limited number of laboratory experiments, at various concentrations, it appears that silica gel tubes can be used for sampling aldehydes (collection efficiencies ~1), whereas charcoal tubes are suitable for collecting ketones. Molecular sieve was found to be unsuitable for collecting most of the carbonyls studied. The authors also report the development of a fully automated 24-port carbonyl sampler specially designed for EPA's PAMS program.

  20. Automated discrimination of dicentric and monocentric chromosomes by machine learning-based image processing. (United States)

    Li, Yanxin; Knoll, Joan H; Wilkins, Ruth C; Flegal, Farrah N; Rogan, Peter K


    Dose from radiation exposure can be estimated from dicentric chromosome (DC) frequencies in metaphase cells of peripheral blood lymphocytes. We automated DC detection by extracting features in Giemsa-stained metaphase chromosome images and classifying objects by machine learning (ML). DC detection involves (i) intensity thresholded segmentation of metaphase objects, (ii) chromosome separation by watershed transformation and elimination of inseparable chromosome clusters, fragments and staining debris using a morphological decision tree filter, (iii) determination of chromosome width and centreline, (iv) derivation of centromere candidates, and (v) distinction of DCs from monocentric chromosomes (MC) by ML. Centromere candidates are inferred from 14 image features input to a Support Vector Machine (SVM). Sixteen features derived from these candidates are then supplied to a Boosting classifier and a second SVM which determines whether a chromosome is either a DC or MC. The SVM was trained with 292 DCs and 3135 MCs, and then tested with cells exposed to either low (1 Gy) or high (2-4 Gy) radiation dose. Results were then compared with those of 3 experts. True positive rates (TPR) and positive predictive values (PPV) were determined for the tuning parameter, σ. At larger σ, PPV decreases and TPR increases. At high dose, for σ = 1.3, TPR = 0.52 and PPV = 0.83, while at σ = 1.6, the TPR = 0.65 and PPV = 0.72. At low dose and σ = 1.3, TPR = 0.67 and PPV = 0.26. The algorithm differentiates DCs from MCs, overlapped chromosomes and other objects with acceptable accuracy over a wide range of radiation exposures.
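The trade-off the abstract reports between TPR and PPV at different values of σ comes down to counting true and false positives among the classified chromosomes. A minimal sketch of the two metrics (pure Python, with hypothetical labels; this is not the authors' pipeline, only an illustration of the reported quantities, where 1 = dicentric and 0 = monocentric):

```python
def confusion_metrics(y_true, y_pred):
    """Compute the true positive rate (TPR, sensitivity) and positive
    predictive value (PPV, precision) for binary labels:
    1 = dicentric chromosome (DC), 0 = monocentric chromosome (MC)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tpr = tp / (tp + fn) if tp + fn else 0.0  # fraction of real DCs found
    ppv = tp / (tp + fp) if tp + fp else 0.0  # fraction of calls that are DCs
    return tpr, ppv
```

Raising σ in the authors' classifier admits more centromere candidates, which (as reported) raises TPR while lowering PPV.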

  1. Intelligent Machines in the 21st Century: Automating the Processes of Inference and Inquiry (United States)

    Knuth, Kevin H.


    The last century saw the application of Boolean algebra toward the construction of computing machines, which work by applying logical transformations to information contained in their memory. The development of information theory and the generalization of Boolean algebra to Bayesian inference have enabled these computing machines, in the last quarter of the twentieth century, to be endowed with the ability to learn by making inferences from data. This revolution is just beginning as new computational techniques continue to make difficult problems more accessible. However, modern intelligent machines work by inferring knowledge using only their pre-programmed prior knowledge and the data provided. They lack the ability to ask questions, or request data that would aid their inferences. Recent advances in understanding the foundations of probability theory have revealed implications for areas other than logic. Of relevance to intelligent machines, we identified the algebra of questions as the free distributive algebra, which now allows us to work with questions in a way analogous to that which Boolean algebra enables us to work with logical statements. In this paper we describe this logic of inference and inquiry using the mathematics of partially ordered sets and the scaffolding of lattice theory, discuss the far-reaching implications of the methodology, and demonstrate its application with current examples in machine learning. Automation of both inference and inquiry promises to allow robots to perform science in the far reaches of our solar system and in other star systems by enabling them to not only make inferences from data, but also decide which question to ask, experiment to perform, or measurement to take given what they have learned and what they are designed to understand.

  2. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A


    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. 
Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to
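Routine peak finding, one of the modular processing functions named in this record, can be illustrated with a toy local-maximum detector. This is a hedged, minimal sketch in pure Python, not the Decon2LS implementation, whose algorithms are considerably more sophisticated:

```python
def find_peaks(intensities, threshold):
    """Return indices of local maxima whose intensity exceeds a threshold:
    a toy stand-in for the routine peak finding step of an MS pipeline."""
    peaks = []
    for i in range(1, len(intensities) - 1):
        y = intensities[i]
        # A peak must rise above both neighbours and clear the noise floor.
        if y > threshold and y > intensities[i - 1] and y >= intensities[i + 1]:
            peaks.append(i)
    return peaks
```

A real deisotoping stage would then group such peaks into isotopic envelopes and fit them against theoretical isotope distributions.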

  3. A pipeline for comprehensive and automated processing of electron diffraction data in IPLT. (United States)

    Schenk, Andreas D; Philippsen, Ansgar; Engel, Andreas; Walz, Thomas


    Electron crystallography of two-dimensional crystals allows the structural study of membrane proteins in their native environment, the lipid bilayer. Determining the structure of a membrane protein at near-atomic resolution by electron crystallography remains, however, a very labor-intense and time-consuming task. To simplify and accelerate the data processing aspect of electron crystallography, we implemented a pipeline for the processing of electron diffraction data using the Image Processing Library and Toolbox (IPLT), which provides a modular, flexible, integrated, and extendable cross-platform, open-source framework for image processing. The diffraction data processing pipeline is organized as several independent modules implemented in Python. The modules can be accessed either from a graphical user interface or through a command line interface, thus meeting the needs of both novice and expert users. The low-level image processing algorithms are implemented in C++ to achieve optimal processing performance, and their interface is exported to Python using a wrapper. For enhanced performance, the Python processing modules are complemented with a central data managing facility that provides a caching infrastructure. The validity of our data processing algorithms was verified by processing a set of aquaporin-0 diffraction patterns with the IPLT pipeline and comparing the resulting merged data set with that obtained by processing the same diffraction patterns with the classical set of MRC programs.

  4. Advances in monitoring dynamic hydrologic conditions in the vadose zone through automated high-resolution ground-penetrating radar imaging and analysis (United States)

    Mangel, Adam R.

    This body of research focuses on resolving physical and hydrological heterogeneities in the subsurface with ground-penetrating radar (GPR). Essentially, there are two facets of this research centered on the goal of improving the collective understanding of unsaturated flow processes: i) modifications to commercially available equipment to optimize hydrologic value of the data and ii) the development of novel methods for data interpretation and analysis in a hydrologic context given the increased hydrologic value of the data. Regarding modifications to equipment, automation of GPR data collection substantially enhances our ability to measure changes in the hydrologic state of the subsurface at high spatial and temporal resolution (Chapter 1). Additionally, automated collection shows promise for quick high-resolution mapping of dangerous subsurface targets, like unexploded ordinance, that may have alternate signals depending on the hydrologic environment (Chapter 5). Regarding novel methods for data inversion, dispersive GPR data collected during infiltration can constrain important information about the local 1D distribution of water in waveguide layers (Chapters 2 and 3), however, more data is required for reliably analyzing complicated patterns produced by the wetting of the soil. In this regard, data collected in 2D and 3D geometries can further illustrate evidence of heterogeneous flow, while maintaining the content for resolving wave velocities and therefore, water content. This enables the use of algorithms like reflection tomography, which show the ability of the GPR data to independently resolve water content distribution in homogeneous soils (Chapter 5). In conclusion, automation enables the non-invasive study of highly dynamic hydrologic processes by providing the high resolution data required to interpret and resolve spatial and temporal wetting patterns associated with heterogeneous flow. By automating the data collection, it also allows for the novel

  5. Marine pollution monitoring and coastal processes off Andhra Coast

    Digital Repository Service at National Institute of Oceanography (India)

    Sadhuram, Y.

    plants are some of them. ESSAR group is going to invest Rs.1000 crores to set up industries in this belt. In view of the above, regular monitoring of pollution concentration in the harbour and coastal waters is being done by NIO, RC, Visakhapatnam under...

  6. Automated Signal Processing Applied to Volatile-Based Inspection of Greenhouse Crops

    Directory of Open Access Journals (Sweden)

    Eldert van Henten


    Full Text Available Gas chromatograph–mass spectrometers (GC-MS) have been used and shown utility for volatile-based inspection of greenhouse crops. However, a widely recognized difficulty associated with GC-MS application is the large and complex data generated by this instrument. As a consequence, experienced analysts are often required to process this data in order to determine the concentrations of the volatile organic compounds (VOCs) of interest. Manual processing is time-consuming, labour-intensive and may be subject to errors due to fatigue. The objective of this study was to assess whether or not GC-MS data can also be automatically processed in order to determine the concentrations of crop-health-associated VOCs in a greenhouse. An experimental dataset that consisted of twelve data files was processed both manually and automatically to address this question. Manual processing was based on simple peak integration, while the automatic processing relied on the algorithms implemented in the MetAlign™ software package. Automatic processing of the experimental dataset yielded concentrations similar to those obtained by manual processing. These results demonstrate that GC-MS data can be automatically processed to accurately determine the concentrations of crop-health-associated VOCs in a greenhouse. When processing GC-MS data automatically, noise reduction, alignment, baseline correction and normalisation are required.
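Two of the preprocessing steps named above, baseline correction and normalisation, can be sketched in a few lines. This is an illustrative pure-Python sketch of one simple approach (rolling-minimum baseline estimate and unit-area scaling), not the MetAlign algorithms:

```python
def baseline_correct(signal, window=5):
    """Subtract a rolling-minimum baseline estimate from a chromatogram
    trace, a crude form of baseline correction."""
    n = len(signal)
    corrected = []
    for i in range(n):
        lo, hi = max(0, i - window), min(n, i + window + 1)
        corrected.append(signal[i] - min(signal[lo:hi]))
    return corrected

def normalise(signal):
    """Scale a trace to unit total intensity so separate runs are
    comparable (one simple normalisation convention)."""
    total = sum(signal) or 1.0
    return [s / total for s in signal]
```

In a full pipeline these would follow noise reduction and retention-time alignment, as the abstract notes.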

  7. Use of information system data of jet crushing acoustic monitoring for the process management

    Directory of Open Access Journals (Sweden)

    T.M. Bulanaya


    Full Text Available Graphic interpretations of the amplitude and frequency of acoustic signals from the jet grinding of loose material are presented. Criteria for process management are determined on the basis of acoustic monitoring data from the operating jet mill.

  8. Acoustic emission-based in-process monitoring of surface generation in robot-assisted polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas; Bissacco, Giuliano; De Chiffre, Leonardo


    The applicability of acoustic emission (AE) measurements for in-process monitoring of surface generation in the robot-assisted polishing (RAP) was investigated. Surface roughness measurements require interruption of the process, proper surface cleaning and measurements that sometimes necessitate ...

  9. An Analysis of the Records Management Process to Determine the Impact of Automation on Productivity (United States)


    part of the DoD Functional Process Improvement Program, the IDEF modeling methodology consists of two modeling tools, IDEF0 and IDEF1X. The IDEF0 model...defines the activities of the business process for process improvement. The IDEF1X model is a data model used to complement the IDEF0 model by defining...entities, along with their attributes and relationships (7:70). For example, IDEF1X would define a file folder as having a unique identifier, with

  10. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri


    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they do not always succeed in this endeavor. The reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  11. 99Tc Process Monitoring System In-Lab Performance Characterization

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, Matthew J.; Niver, Cynthia M.


    Executive Summary A 99Tc Process Monitoring (Tc-Mon) System has been designed and built for deployment at the recently constructed 200 West Pump & Treat (200W P&T) Plant in the 200 West Area ZP-1 Operable Unit of the Hanford Site. The plant is operated by CH2M Hill Plateau Remediation Company (CHPRC). The Tc-Mon system was created through collaboration between Pacific Northwest National Laboratory (PNNL) and Burge Environmental, Inc. The new system’s design has been optimized based on experience from an earlier field test (2011) of a prototype system at the 200W-ZP-1 Interim Pump & Treat Plant. A portion of the new 200W P&T Plant is dedicated to removal of 99Tc from contaminated groundwater in the 200 West Area. 99Tc, as the pertechnetate anion (99TcO4-), is remediated through delivery of water into two trains (Trains A and B) of three tandem extraction columns filled with Purolite A530E resin. The resin columns cannot be regenerated; therefore, once they have reached their maximum useful capacity, the columns must be disposed of as radioactive waste. The Tc-Mon system’s primary duty will be to periodically sample and analyze the effluents from each of the two primary extraction columns to determine 99Tc breakthrough. The Tc-Mon system will enable CHPRC to measure primary extraction column breakthrough on demand. In this manner, CHPRC will be able to utilize each extraction column to its maximum capacity. This will significantly reduce column disposal and replacement costs over the life of the plant. The Tc-Mon system was constructed by Burge Environmental, Inc. and was delivered to PNNL in June 2013 for setup and initial hardware and software performance testing in the 325 Building. By early July, PNNL had initiated an in-laboratory performance characterization study on the system. The objective was to fully calibrate the system and then evaluate the quality of the analytical outputs 1) against a series of clean


    Directory of Open Access Journals (Sweden)

    Gennady G. Kulikov


    Full Text Available This article discusses, from a modern point of view, the development of methods for structuring the lifecycle management of specialist training conducted by the university jointly with an industrial enterprise on the basis of a comprehensive base chair. The possibility of using IT to improve the efficiency of educational processes is also considered.

  13. Fibromyalgia symptom reduction by online behavioral self-monitoring, longitudinal single subject analysis and automated delivery of individualized guidance

    Directory of Open Access Journals (Sweden)

    William Collinge


    Full Text Available Background: Fibromyalgia (FM) is a complex chronic pain condition that is difficult to treat. The prevailing approach is an integration of pharmacological, psycho-educational, and behavioral strategies. Information technology offers great potential for FM sufferers to systematically monitor symptoms as well as potential impacts of various management strategies. Aims: This study aimed to evaluate effects of a web-based, self-monitoring and symptom management system (SMARTLog) that analyzes personal self-monitoring data and delivers data-based feedback over time. Materials and Methods: Subjects were self-referred, anonymous, and recruited via publicity on FM advocacy websites. Standardized instruments assessed health status, self-efficacy, and locus of control at baseline and monthly during participation. Subjects were encouraged to complete the SMARTLog several times weekly. Within-subject, univariate, and multivariate analyses were used to derive classification trees for each user associating specific behavior variables with symptom levels over time. Results: Moderate use (3 times weekly × 3 months) increased the likelihood of clinically significant improvements in pain, memory, gastrointestinal problems, depression, fatigue, and concentration; heavy use (4.5 times weekly × 5 months) produced the above plus improvement in stiffness and sleep difficulties. Conclusions: Individualized, web-based behavioral self-monitoring with personally-tailored feedback can enable FM sufferers to significantly reduce symptom levels over time.

  14. Statistical Process Control Charts for Public Health Monitoring (United States)


    Poisson counts) [21-23]. Cumulative sum (CUSUM) and exponentially weighted moving average (EWMA) control charts are often used with Phase II data. These...charts have been shown to more quickly detect small changes than traditional Shewhart charts. There have been several applications of CUSUM charts in...distribution, a CUSUM or EWMA chart would be required. Risk adjustment for health data has been applied when monitoring variables that can be
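The tabular CUSUM mentioned in this record accumulates deviations above a target value and raises an alarm when the running sum crosses a decision interval h. A minimal one-sided (upper) sketch in pure Python; the reference value k and decision interval h are the standard tuning parameters, and the data here are illustrative:

```python
def tabular_cusum(data, target, k, h):
    """One-sided upper tabular CUSUM: accumulate deviations above
    target + k and flag the first sample where the cumulative sum
    exceeds the decision interval h.
    Returns (cusum_path, alarm_index or None)."""
    s = 0.0
    path = []
    for i, x in enumerate(data):
        # Reset at zero so only sustained upward shifts accumulate.
        s = max(0.0, s + (x - target - k))
        path.append(s)
        if s > h:
            return path, i  # out-of-control signal at sample i
    return path, None
```

A matching lower-sided statistic (accumulating deviations below target − k) completes a two-sided chart; EWMA charts replace the cumulative sum with an exponentially weighted average.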

  15. [Near infrared spectroscopy and multivariate statistical process analysis for real-time monitoring of production process]. (United States)

    Wang, Yi; Ma, Xiang; Wen, Ya-Dong; Zou, Quan; Wang, Jun; Tu, Jia-Run; Cai, Wen-Sheng; Shao, Xue-Guang


    Near infrared diffusive reflectance spectroscopy has been applied in on-site and on-line analysis due to its speed, non-destructive nature, and feasibility for the analysis of real, complex samples. The present work reports a real-time monitoring method for industrial production using near infrared spectroscopy and multivariate statistical process analysis. In the method, real-time near infrared spectra of the materials are collected on the production line, and the production process is then evaluated with a Hotelling T² statistic calculated from the established model. In this work, principal component analysis (PCA) is adopted for building the model, and the statistic is calculated by projecting the real-time spectra onto the PCA model. In an application of the method to a practical production run, it was demonstrated that variations in the production can be evaluated in real time by investigating changes in the statistic, and that products from different batches can be compared by further summarizing the statistic. Therefore, the proposed method may provide a practical way of assuring the quality of production processes.
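Given a fitted PCA model, the Hotelling T² statistic for a new spectrum is the sum of squared component scores, each scaled by the variance its component captures. A minimal sketch in pure Python; the model quantities (mean spectrum, loading vectors, component variances) are assumed to come from an offline PCA fit, and the toy numbers below are illustrative, not from this study:

```python
def hotelling_t2(x, mean, loadings, variances):
    """Project a mean-centred spectrum onto PCA loading vectors and
    compute the Hotelling T-squared statistic: the sum of squared
    scores, each scaled by the variance of its component."""
    centred = [xi - mi for xi, mi in zip(x, mean)]
    t2 = 0.0
    for load, var in zip(loadings, variances):
        score = sum(c * l for c, l in zip(centred, load))  # projection
        t2 += score * score / var
    return t2
```

In monitoring use, T² values from incoming spectra are compared against a control limit fitted on in-control production data; a sustained rise signals an off-normal process state.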

  16. Design and Development of FPGA Based Data Acquisition System for Process Automation

    Directory of Open Access Journals (Sweden)

    H.S Murali


    Full Text Available This paper presents a novel approach to the design of a data acquisition system for process applications. The core of the proposed system is a Field Programmable Gate Array (FPGA), which is configured and programmed to acquire a maximum of 16 MB of real-time data. For real-time validation of the designed system, a process plant with three parameters, i.e. pressure, temperature and level, is considered. Real-time data from the process is acquired using suitable temperature, pressure and level sensors. Signal conditioners are designed for each sensor and tested in real time. The designed FPGA-based data acquisition system, along with the corresponding signal conditioners, is validated in real time by running the process and comparing the acquired values with the corresponding references. The data acquired in real time compares well with the references.

  17. Quality of data entry using single entry, double entry and automated forms processing--an example based on a study of patient-reported outcomes

    DEFF Research Database (Denmark)

    Paulsen, Aksel; Overgaard, Søren; Lauritsen, Jens Martin


    The clinical and scientific usage of patient-reported outcome measures is increasing in the health services. Often paper forms are used. Manual double entry of data is defined as the definitive gold standard for transferring data to an electronic format, but the process is laborious. Automated...

  18. The multi-isotope process monitor: Non-destructive, near-real-time nuclear safeguards monitoring at a reprocessing facility (United States)

    Orton, Christopher Robert

    The IAEA will require advanced technologies to effectively safeguard nuclear material at envisioned large scale nuclear reprocessing plants. This dissertation describes results from simulations and experiments designed to test the Multi-Isotope Process (MIP) Monitor, a novel safeguards approach for process monitoring in reprocessing plants. The MIP Monitor combines the detection of intrinsic gamma ray signatures emitted from process solutions with multivariate analysis to detect off-normal conditions in process streams, nondestructively and in near-real time (NRT). Three different models were used to predict spent nuclear fuel composition, estimate chemical distribution during separation, and simulate spectra from a variety of gamma detectors in product and raffinate streams for processed fuel. This was done for fuel with various irradiation histories and under a variety of plant operating conditions. Experiments were performed to validate the results from the model. Three segments of commercial spent nuclear fuel with variations in burnup and cooling time were dissolved and subjected to a batch PUREX method to separate the uranium and plutonium from fission and activation products. Gamma spectra were recorded by high purity germanium (HPGe) and cadmium zinc telluride (CZT) detectors. Hierarchical Cluster Analysis (HCA) and Principal Component Analysis (PCA) were applied to spectra from both model and experiment to investigate spectral variations as a function of acid concentration, burnup level and cooling time. Partial Least Squares was utilized to extract quantitative information about process variables, such as acid concentration or burnup. The MIP Monitor was found to be sensitive to the induced variations of the process and was capable of extracting quantitative process information from the analyzed spectra.
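The Hierarchical Cluster Analysis applied to the spectra in this work can be illustrated with a minimal single-linkage sketch. This pure-Python example runs on toy 2-D points rather than gamma spectra, and is only a schematic of the agglomerative idea, not the authors' analysis:

```python
def single_linkage_merges(points):
    """Greedy single-linkage agglomerative clustering: repeatedly merge
    the two closest clusters (closest = minimum pairwise point distance)
    and return the merge order as a list of cluster-index pairs."""
    clusters = {i: [i] for i in range(len(points))}

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    merges = []
    while len(clusters) > 1:
        best = None
        for i in clusters:
            for j in clusters:
                if i < j:
                    d = min(dist(points[p], points[q])
                            for p in clusters[i] for q in clusters[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
        _, i, j = best
        merges.append((i, j))
        clusters[i] = clusters[i] + clusters[j]  # absorb cluster j into i
        del clusters[j]
    return merges
```

Applied to spectra, the merge order groups samples with similar spectral signatures (e.g. similar burnup or cooling time) before dissimilar ones, which is what makes HCA useful for spotting off-normal process conditions.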

  19. Assessing V and V Processes for Automation with Respect to Vulnerabilities to Loss of Airplane State Awareness (United States)

    Whitlow, Stephen; Wilkinson, Chris; Hamblin, Chris


    Automation has contributed substantially to the sustained improvement of aviation safety by minimizing the physical workload of the pilot and increasing operational efficiency. Nevertheless, in complex and highly automated aircraft, automation also has unintended consequences. As systems become more complex and the authority and autonomy (A&A) of the automation increases, human operators become relegated to the role of a system supervisor or administrator, a passive role not conducive to maintaining engagement and airplane state awareness (ASA). The consequence is that flight crews often come to over-rely on the automation, become less engaged in the human-machine interaction, and lose awareness of the automation mode under which the aircraft is operating. Likewise, the complexity of the system and automation modes may lead to poor understanding of the interaction between a mode of automation and a particular system configuration or phase of flight. These and other examples of mode confusion often lead to mismanaging the aircraft's energy state or the aircraft deviating from the intended flight path. This report examines methods for assessing whether, and how, operational constructs properly assign authority and autonomy in a safe and coordinated manner, with particular emphasis on assuring adequate airplane state awareness by the flight crew and air traffic controllers in off-nominal and/or complex situations.

  20. Automation in immunohematology. (United States)

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta


    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever increasing workload. This article discusses the various issues involved in the process.