WorldWideScience

Sample records for automated process monitoring

  1. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection, and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.
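The analysis cycle described above (automated sampling, digestion/redox adjustment, separation, detection, reporting) can be sketched as a simple step sequencer. The stage names and placeholder functions below are illustrative only, not PNNL's implementation.

```python
# Hypothetical sketch of a fluidically linked analysis cycle: each stage is
# a callable; the sequencer passes the sample through them in order and
# keeps a log of the stages run.

def run_cycle(stages, sample):
    """Pass a sample through each stage in order, collecting a log."""
    log = []
    for name, fn in stages:
        sample = fn(sample)
        log.append(name)
    return sample, log

# Placeholder stage functions standing in for the real chemistry/detection.
stages = [
    ("sampling",   lambda s: dict(s, sampled=True)),
    ("digestion",  lambda s: dict(s, digested=True)),
    ("separation", lambda s: dict(s, separated=True)),
    ("detection",  lambda s: dict(s, counts=1234)),
    ("reporting",  lambda s: s),
]

result, log = run_cycle(stages, {"id": "feed-001"})
```

In a real monitor each stage would drive valves, pumps, and the detector; the point of the sketch is only the fixed, repeatable ordering that makes several measurements per hour possible.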

  2. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; Grate, Jay W.; DeVol, Timothy A.

    2004-01-01

    This research program is directed toward rapid, sensitive, and selective determination of beta and alpha-emitting radionuclides such as 99Tc, 90Sr, and trans-uranium (TRU) elements in low activity waste (LAW) processing streams. The overall technical approach is based on automated radiochemical measurement principles, which entails integration of sample treatment and separation chemistries and radiometric detection within a single functional analytical instrument. Nuclear waste process streams are particularly challenging for rapid analytical methods due to the complex, high-ionic-strength, caustic brine sample matrix, the presence of interfering radionuclides, and the variable and uncertain speciation of the radionuclides of interest. As a result, matrix modification, speciation control, and separation chemistries are required for use in automated process analyzers. Significant knowledge gaps exist relative to the design of chemistries for such analyzers so that radionuclides can be quantitatively and rapidly separated and analyzed in solutions derived from low-activity waste processing operations. This research is addressing these knowledge gaps in the area of separation science, nuclear detection, and analytical chemistry and instrumentation. The outcome of these investigations will be the knowledge necessary to choose appropriate chemistries for sample matrix modification and analyte speciation control and chemistries for rapid and selective separation and preconcentration of target radionuclides from complex sample matrices. In addition, new approaches for quantification of alpha emitters in solution using solid-state diode detectors, as well as improved instrumentation and signal processing techniques for use with solid-state and scintillation detectors, will be developed. New knowledge of the performance of separation materials, matrix modification and speciation control chemistries, instrument configurations, and quantitative analytical approaches will

  3. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; Grate, Jay W.; DeVol, Timothy A.

    2003-01-01

    This research program is directed toward rapid, sensitive, and selective determination of beta and alpha-emitting radionuclides such as 99Tc, 90Sr, and trans-uranium (TRU) elements in low activity waste (LAW) processing streams. The overall technical approach is based on automated radiochemical measurement principles. Nuclear waste process streams are particularly challenging for rapid analytical methods due to the complex, high-ionic-strength, caustic brine sample matrix, the presence of interfering radionuclides, and the variable and uncertain speciation of the radionuclides of interest. As a result, matrix modification, speciation control, and separation chemistries are required for use in automated process analyzers. Significant knowledge gaps exist relative to the design of chemistries for such analyzers so that radionuclides can be quantitatively and rapidly separated and analyzed in solutions derived from low-activity waste processing operations. This research is addressing these knowledge gaps in the areas of separation science, nuclear detection, and analytical chemistry and instrumentation. The outcome of these investigations will be the knowledge necessary to choose appropriate chemistries for sample matrix modification and analyte speciation control and chemistries for rapid and selective separation and preconcentration of target radionuclides from complex sample matrices. In addition, new approaches for quantification of alpha emitters in solution using solid-state diode detectors, as well as improved instrumentation and signal processing techniques for use with solid-state and scintillation detectors, will be developed. New knowledge of the performance of separation materials, matrix modification and speciation control chemistries, instrument configurations, and quantitative analytical approaches will provide the basis for designing effective instrumentation for radioanalytical process monitoring. Specific analytical targets include 99Tc, 90Sr and

  4. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Jay W. Grate; Timothy A. DeVol

    2006-01-01

    The objectives of our research were to develop the first automated radiochemical process analyzer, including sample pretreatment methodology, and to initiate work on new detection approaches, especially using modified diode detectors.

  5. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  6. Automated Vehicle Monitoring System

    OpenAIRE

    Wibowo, Agustinus Deddy Arief; Heriansyah, Rudi

    2014-01-01

    An automated vehicle monitoring system is proposed in this paper. The surveillance system is based on image processing techniques such as background subtraction, colour balancing, chain-code-based shape detection, and blob analysis. The proposed system detects any human head that appears in the side mirrors. The detected head is tracked and recorded for further action.
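A minimal sketch of two of the image-processing steps named in the abstract, background subtraction followed by blob extraction, on plain Python lists. The threshold and test frame are invented; this is not the authors' code.

```python
# Frame differencing against a background model, then reporting the
# bounding box of the changed ("foreground") pixels as one blob.

def subtract_background(frame, background, threshold=30):
    """Return a binary mask of pixels that differ from the background."""
    return [[abs(f - b) > threshold for f, b in zip(fr, br)]
            for fr, br in zip(frame, background)]

def blob_bbox(mask):
    """Bounding box (top, left, bottom, right) of all foreground pixels."""
    coords = [(r, c) for r, row in enumerate(mask)
              for c, on in enumerate(row) if on]
    if not coords:
        return None
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return min(rows), min(cols), max(rows), max(cols)

background = [[10] * 5 for _ in range(5)]   # flat grey background
frame = [row[:] for row in background]
frame[1][2] = 200   # a bright region appears (stand-in for a head)
frame[2][2] = 200
mask = subtract_background(frame, background)
```

A production system would use a library such as OpenCV and a statistical background model, but the pipeline shape (difference, threshold, group, track) is the same.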

  7. Elektronische monitoring van luchtwassers op veehouderijbedrijven = Automated process monitoring and data logging of air scrubbers at animal houses

    NARCIS (Netherlands)

    Melse, R.W.; Franssen, J.C.T.J.

    2010-01-01

    At six animal houses, air scrubbers equipped with an automated process monitoring and data logging system were tested. The measured values were successfully stored, but some of them, especially the pH and EC of the recirculation water, appeared not to be correct at all times.

  8. Plug-and-play monitoring and performance optimization for industrial automation processes

    CERN Document Server

    Luo, Hao

    2017-01-01

    Dr.-Ing. Hao Luo demonstrates the developments of advanced plug-and-play (PnP) process monitoring and control systems for industrial automation processes. With aid of the so-called Youla parameterization, a novel PnP process monitoring and control architecture (PnP-PMCA) with modularized components is proposed. To validate the developments, a case study on an industrial rolling mill benchmark is performed, and the real-time implementation on a laboratory brushless DC motor is presented. Contents PnP Process Monitoring and Control Architecture Real-Time Configuration Techniques for PnP Process Monitoring Real-Time Configuration Techniques for PnP Performance Optimization Benchmark Study and Real-Time Implementation Target Groups Researchers and students of Automation and Control Engineering Practitioners in the area of Industrial and Production Engineering The Author Hao Luo received the Ph.D. degree at the Institute for Automatic Control and Complex Systems (AKS) at the University of Duisburg-Essen, Germany, ...

  9. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which proved the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
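One element of the purge protocol described above, taking a sample only after successive field readings stabilize, can be illustrated as follows. The window size, tolerance, and readings are hypothetical, not USGS protocol values.

```python
# Stabilization check used during well purging: a parameter is considered
# stable when the last few readings span no more than a set tolerance.

def stabilized(readings, tolerance, window=3):
    """True when the last `window` readings span no more than `tolerance`."""
    if len(readings) < window:
        return False
    tail = readings[-window:]
    return max(tail) - min(tail) <= tolerance

# Specific conductance readings (uS/cm) logged while purging (invented):
sc = [412.0, 398.5, 391.2, 388.9, 388.4, 388.6]
```

An automated sampler would run a check like this per parameter (pH, conductance, dissolved oxygen, ...) and trigger sample collection only when all of them pass.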

  10. Automated Radioanalytical Chemistry: Applications For The Laboratory And Industrial Process Monitoring

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Farawila, Anne F.; Grate, Jay W.

    2009-01-01

    The identification and quantification of targeted α- and β-emitting radionuclides via destructive analysis in complex radioactive liquid matrices is highly challenging. Analyses are typically accomplished at on- or off-site laboratories through laborious sample preparation steps and extensive chemical separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, alpha energy spectroscopy, mass spectrometry). Analytical results may take days or weeks to report. When an industrial-scale plant requires periodic or continuous monitoring of radionuclides as an indication of the composition of its feed stream, diversion of safeguarded nuclides, or of plant operational conditions (for example), radiochemical measurements should be rapid, but not at the expense of precision and accuracy. Scientists at Pacific Northwest National Laboratory have developed and characterized a host of automated radioanalytical systems designed to perform reproducible and rapid radioanalytical processes. Platforms have been assembled for (1) automation and acceleration of sample analysis in the laboratory and (2) automated monitors for industrial-scale nuclear processes on-line with near-real-time results. These methods have been applied to analyses ranging from environmental-level actinides and fission products to high-level nuclear process fluids. Systems have been designed to integrate a number of discrete sample handling steps, including sample pretreatment (e.g., digestion and valence state adjustment) and chemical separations. The systems have either utilized on-line analyte detection or have collected the purified analyte fractions for off-line measurement applications. One PNNL system of particular note is a fully automated prototype on-line radioanalytical system designed for the Waste Treatment Plant at Hanford, WA, USA. This system demonstrated nearly continuous destructive analysis of the soft β-emitting radionuclide 99Tc in nuclear

  11. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    Science.gov (United States)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yuan, Fuh-Gwo

    2016-01-01

    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel (Registered Trademark) IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately a closed-loop process control to maximize composite part quality and consistency.
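The time-of-arrival (TOA) metric mentioned above is commonly estimated as the first threshold crossing of the received waveform. The sketch below uses that textbook estimator; the actual NASA Langley processing may differ, and the signal and sample rate are invented.

```python
# First-threshold-crossing TOA estimator: the arrival time is taken as the
# first sample whose absolute amplitude exceeds a fraction of the peak.

def time_of_arrival(signal, sample_rate_hz, fraction=0.1):
    """First crossing of `fraction` * peak amplitude, in seconds."""
    peak = max(abs(s) for s in signal)
    threshold = fraction * peak
    for i, s in enumerate(signal):
        if abs(s) >= threshold:
            return i / sample_rate_hz
    return None

# A toy guided-wave record sampled at 1 MHz (invented values):
signal = [0.0, 0.0, 0.01, 0.02, 0.5, 1.0, 0.4, 0.1]
toa = time_of_arrival(signal, sample_rate_hz=1_000_000)
```

Tracking how this TOA (and the peak amplitude) evolve during cure gives curves like those described in the abstract: amplitude rising sharply near vitrification, TOA varying inversely with degree of cure.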

  12. Migration monitoring with automated technology

    Science.gov (United States)

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  13. Automated system for acquisition and image processing for the control and monitoring boned nopal

    Science.gov (United States)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of a system for image acquisition and processing that controls the removal of thorns from the nopal vegetable (Opuntia ficus-indica) in an automated machine using pulses from an Nd:YAG laser. The areolas, the areas on the bark of the nopal where thorns grow, are located by applying segmentation algorithms to the images obtained by a CCD. Once the positions of the areolas are known, their coordinates are sent to a motor system that steers the laser to interact with each areola and remove the thorns. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs the tasks of acquisition, preprocessing, segmentation, recognition, and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.
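The segmentation output described above, a coordinate table of areolas handed to the motor/galvo system, can be sketched with a generic flood-fill labeling of a binary image. This is illustrative only, not the firmware's algorithm.

```python
# Label 4-connected foreground regions of a binary image and emit the
# centroid of each region: the "table of coordinates" for the laser.

def areola_centroids(image):
    """Return (row, col) centroids of 4-connected foreground regions."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] and not seen[r][c]:
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:                      # iterative flood fill
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids

# Toy binary image: 1 = areola pixel after segmentation (invented):
image = [
    [0, 1, 1, 0, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 0, 1],
]
```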

  14. The SARVIEWS Project: Automated SAR Processing in Support of Operational Near Real-time Volcano Monitoring

    Science.gov (United States)

    Meyer, F. J.; Webley, P. W.; Dehn, J.; Arko, S. A.; McAlpin, D. B.; Gong, W.

    2016-12-01

    Volcanic eruptions are among the most significant hazards to human society, capable of triggering natural disasters on regional to global scales. In the last decade, remote sensing has become established in operational volcano monitoring. Centers like the Alaska Volcano Observatory rely heavily on remote sensing data from optical and thermal sensors to provide time-critical hazard information. Despite this high use of remote sensing data, the presence of clouds and a dependence on solar illumination often limit their impact on decision making. Synthetic Aperture Radar (SAR) systems are widely considered superior to optical sensors in operational monitoring situations, due to their weather and illumination independence. Still, the contribution of SAR to operational volcano monitoring has been limited in the past due to high data costs, long processing times, and low temporal sampling rates of most SAR systems. In this study, we introduce the automatic SAR processing system SARVIEWS, whose advanced data analysis and data integration techniques allow, for the first time, a meaningful integration of SAR into operational monitoring systems. We will introduce the SARVIEWS database interface that allows for automatic, rapid, and seamless access to the data holdings of the Alaska Satellite Facility. We will also present a set of processing techniques designed to automatically generate a set of SAR-based hazard products (e.g. change detection maps, interferograms, geocoded images). The techniques take advantage of modern signal processing and radiometric normalization schemes, enabling the combination of data from different geometries. Finally, we will show how SAR-based hazard information is integrated in existing multi-sensor decision support tools to enable joint hazard analysis with data from optical and thermal sensors. We will showcase the SAR processing system using a set of recent natural disasters (both earthquakes and volcanic eruptions) to demonstrate its
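Change-detection maps like those listed above are often built from the per-pixel log-ratio of two co-registered SAR amplitude images. The sketch below shows that textbook operator; it is not necessarily SARVIEWS' exact method, and the images and threshold are invented.

```python
import math

# Log-ratio change detection: pixels whose amplitude ratio between two
# acquisitions exceeds a threshold (in log space) are flagged as changed.

def log_ratio_change(before, after, threshold=0.5):
    """Binary change mask from |log(after/before)| per pixel."""
    return [[abs(math.log(a / b)) > threshold for a, b in zip(ar, br)]
            for ar, br in zip(after, before)]

# Two toy 2x2 amplitude images (invented values):
before = [[100.0, 100.0], [100.0, 100.0]]
after  = [[102.0,  98.0], [300.0, 100.0]]
mask = log_ratio_change(before, after)
```

The log-ratio is popular for SAR because multiplicative speckle becomes additive in log space; operational systems add speckle filtering and radiometric normalization before thresholding.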

  15. Methodology for monitoring and automated diagnosis of ball bearing using paraconsistent logic, wavelet transform and digital signal processing

    International Nuclear Information System (INIS)

    Masotti, Paulo Henrique Ferraz

    2006-01-01

    The monitoring and diagnosis area has developed impressively in recent years with the introduction of new diagnosis techniques and with the use of computers to process information and apply diagnostic methods. The contribution of artificial intelligence to the automation of defect diagnosis is developing continually, and the growing automation in industry is adopting these new techniques. In the nuclear area, growing concern with facility safety demands more effective techniques to raise the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow verification of their operational condition. In this way, the present work can also contribute to this area by helping diagnose the operational condition of machines. This work presents a new feature-extraction technique based on the zero crossings of the wavelet transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work is the Paraconsistent Logic of Annotation with Two Values (LPA2v), which contributes to the automation of defect diagnosis because this logic can deal with the contradictory results that feature-extraction techniques can present. This work also concentrated on identifying defects in their initial phase, using accelerometers, because they are robust, low-cost sensors that are easily found in industry. The results in this work were obtained from an experimental database, and the diagnoses showed good performance for defects in their initial phase. (author)
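The zero-crossing feature at the heart of the technique above can be illustrated with a plain zero-crossing counter. The wavelet filtering stage is omitted here, and the signal values are invented.

```python
# Count sign changes in a (wavelet-filtered) vibration signal; the
# zero-crossing rate is one simple feature for bearing-defect detection.

def zero_crossings(signal):
    """Count sign changes between consecutive non-zero samples."""
    signs = [s for s in signal if s != 0]
    return sum(1 for a, b in zip(signs, signs[1:]) if (a > 0) != (b > 0))

# A short toy accelerometer trace (invented values):
healthy = [0.2, 0.5, -0.1, -0.4, 0.3, 0.1, -0.2]
```

In the paper's scheme, features like this (computed per wavelet band) feed the LPA2v logic, which weighs possibly contradictory evidence from several features.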

  16. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    International Nuclear Information System (INIS)

    Brown, Aaron W.; Simone, Paul S.; York, J.C.; Emmert, Gary L.

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01 to 0.04 μg L⁻¹. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap, followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01 to 0.04 μg L⁻¹. Mean percent recoveries ranged from 77.1 to 86.5%, with percent relative standard deviation values ranging from 1.2 to 4.6%. Of more than 5200 samples analyzed, 95% of the concentrations were detectable and 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.
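The figures of merit quoted above (method detection limit, mean percent recovery, percent RSD) follow standard formulas. Below is a sketch using an EPA-style MDL (Student's t times the standard deviation of replicate low-level spikes); the replicate data are made up, not the paper's measurements.

```python
import statistics

# One-sided Student's t at 99% confidence for 6 degrees of freedom
# (7 replicates), as used in the common EPA-style MDL calculation.
T_99_N7 = 3.143

def method_detection_limit(replicates, t=T_99_N7):
    """MDL = t * sample standard deviation of replicate spike results."""
    return t * statistics.stdev(replicates)

def percent_recovery(measured_mean, spiked):
    """Mean measured value as a percentage of the spiked amount."""
    return 100.0 * measured_mean / spiked

def percent_rsd(replicates):
    """Relative standard deviation, in percent."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Seven invented replicate results (ug/L) for a 0.120 ug/L spike:
reps = [0.103, 0.098, 0.101, 0.097, 0.102, 0.099, 0.100]
```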

  17. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Aaron W. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Simone, Paul S. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States); York, J.C. [City of Lebanon, TN Water Treatment Plant, 7 Gilmore Hill Rd., Lebanon, TN 37087 (United States); Emmert, Gary L., E-mail: gemmert@memphis.edu [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States)

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01 to 0.04 μg L⁻¹. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap, followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01 to 0.04 μg L⁻¹. Mean percent recoveries ranged from 77.1 to 86.5%, with percent relative standard deviation values ranging from 1.2 to 4.6%. Of more than 5200 samples analyzed, 95% of the concentrations were detectable and 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.

  18. Process monitoring

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    Many of the measurements and observations made in a nuclear processing facility to monitor processes and product quality can also be used to monitor the location and movement of nuclear materials. In this session, information is presented on how to use process monitoring data to enhance nuclear material control and accounting (MC and A). It will be seen that, if process monitoring data are applied, SNM losses can generally be detected with greater sensitivity and timeliness, and the point of loss can be localized more closely, than with conventional MC and A systems. The purpose of this session is to enable the participants to: (1) identify process unit operations that could serve as control units for monitoring SNM losses; (2) choose key measurement points and formulate a loss indicator for each control unit; and (3) describe how the sensitivity and timeliness of loss detection could be determined for each loss indicator.
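The loss indicator per control unit described above amounts to a material balance. Below is a minimal sketch with invented flows and an assumed k-sigma alarm threshold; real MC and A systems use far more detailed uncertainty propagation.

```python
# Material balance for one control unit: material in, minus material out,
# minus the change in in-process inventory, is the material unaccounted
# for (MUF). A loss is flagged when |MUF| exceeds k measurement sigmas.

def loss_indicator(inflow, outflow, inventory_change):
    """Material unaccounted for (MUF) for one control unit, in kg."""
    return inflow - outflow - inventory_change

def flag_loss(muf, sigma, k=3.0):
    """Flag when |MUF| exceeds k standard deviations of measurement error."""
    return abs(muf) > k * sigma

# Invented balance for one process unit over one balance period:
muf = loss_indicator(inflow=120.0, outflow=118.2, inventory_change=1.5)
```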

  19. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  20. Bioreactor process monitoring using an automated microfluidic platform for cell-based assays

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    We report on a novel microfluidic system designed to monitor in real-time the concentration of live and dead cells in industrial cell production. Custom-made stepper motor actuated peristaltic pumps and valves, fluidic interconnections, sample-to-waste liquid management and image cytometry-based ...

  1. 49 CFR 238.445 - Automated monitoring.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Automated monitoring. 238.445 Section 238.445... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor the... limiting the speed of the train. (c) The monitoring system shall be designed with an automatic self-test...

  2. [The use of automated processing of information obtained during space flights for the monitoring and evaluation of airborne pollution].

    Science.gov (United States)

    Bagmanov, B Kh; Mikhaĭlova, A Iu; Pavlov, S V

    1997-01-01

    The article describes experience with the use of automated processing of information obtained during spaceflights for the analysis of urban air pollution. The authors present a method for processing information obtained during spaceflights and show how to identify foci of industrial release and the areas of their spread within and beyond cities.

  3. Automated biomonitoring: living sensors as environmental monitors

    National Research Council Canada - National Science Library

    Gruber, D; Diamond, J

    1988-01-01

    Water quality continues to present problems of global concern and has resulted in greatly increased use of automated biological systems in monitoring drinking water, industrial effluents and wastewater...

  4. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-03

    An automated fluidic instrument is described that rapidly determines the total (99)Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the (99)Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of (99)Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of (99)Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
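The spike-addition calibration described above can be sketched as an efficiency calculation: the net count-rate gain from a spike of known activity gives the overall measurement efficiency, which then converts sample count rates into activity concentrations. The numbers below are illustrative, not the paper's data; only the 0.495 mL sample volume is taken from the abstract.

```python
# Matrix-matched calibration by standard addition: comparing spiked and
# unspiked count rates yields counts-per-second per Bq, a combined
# (separation x detection) efficiency that also serves as a self-diagnostic.

def measurement_efficiency(spiked_cps, unspiked_cps, spike_bq):
    """Counts per second gained per Bq of spiked activity."""
    return (spiked_cps - unspiked_cps) / spike_bq

def activity_concentration(net_cps, efficiency, sample_volume_ml):
    """Sample activity in Bq/mL from net count rate and efficiency."""
    return net_cps / efficiency / sample_volume_ml

# Invented count rates and spike activity:
eff = measurement_efficiency(spiked_cps=45.0, unspiked_cps=15.0, spike_bq=100.0)
conc = activity_concentration(net_cps=15.0, efficiency=eff, sample_volume_ml=0.495)
```

Because the spike passes through the same chemistry as the sample, a drop in `eff` signals a separation or detector problem, which is the self-diagnostic role the abstract describes.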

  5. 49 CFR 238.237 - Automated monitoring.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Automated monitoring. 238.237 Section 238.237 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.237 Automated monitoring. (a) Except as further specified in this paragraph, on or after...

  6. An automated neutron monitor maintenance system

    International Nuclear Information System (INIS)

    Moore, F.S.; Griffin, J.C.; Odell, D.M.C.

    1996-01-01

    Neutron detectors are commonly used by the nuclear materials processing industry to monitor fissile materials in process vessels and tanks. The proper functioning of these neutron monitors must be periodically evaluated. We have developed and placed in routine use a PC-based multichannel analyzer (MCA) system for on-line BF3 and He-3 gas-filled detector function testing. The automated system: 1) acquires spectral data from the monitor system, 2) analyzes the spectrum to determine the detector's functionality, 3) makes suggestions for maintenance or repair, as required, and 4) saves the spectrum and results to disk for review. The operator interface has been designed to be user-friendly and to minimize the training requirements of the user. The system may also be easily customized for various applications
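    Step 2 of the automated sequence, judging detector functionality from the acquired spectrum, can be illustrated with a toy health check: verify that the count rate is adequate and that the spectral peak has not drifted. All thresholds, messages, and names here are hypothetical, not those of the deployed system:

```python
def check_detector(spectrum, ref_peak_channel, tolerance=0.1, min_counts=1000):
    """Crude health check on a gas-filled detector's pulse-height spectrum.

    spectrum: list of counts per MCA channel.
    Returns (ok, message)."""
    total = sum(spectrum)
    if total < min_counts:
        return False, "low count rate: check HV supply and cabling"
    # Channel with the most counts serves as a proxy for gain stability.
    peak = max(range(len(spectrum)), key=spectrum.__getitem__)
    drift = abs(peak - ref_peak_channel) / ref_peak_channel
    if drift > tolerance:
        return False, "gain drift: recalibrate amplifier or replace tube"
    return True, "detector functioning normally"
```

    The real system additionally archives each spectrum and its verdict to disk for later review, as the abstract notes.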

  7. Technology Transfer Opportunities: Automated Ground-Water Monitoring

    Science.gov (United States)

    Smith, Kirk P.; Granato, Gregory E.

    1997-01-01

    Introduction A new automated ground-water monitoring system developed by the U.S. Geological Survey (USGS) measures and records values of selected water-quality properties and constituents using protocols approved for manual sampling. Prototypes using the automated process have demonstrated the ability to increase the quantity and quality of data collected and have shown the potential for reducing labor and material costs for ground-water quality data collection. Automation of water-quality monitoring systems in the field, in laboratories, and in industry has increased data density and utility while reducing operating costs. Uses for an automated ground-water monitoring system include (but are not limited to) monitoring ground-water quality for research; monitoring known or potential contaminant sites, such as near landfills, underground storage tanks, or other facilities where potential contaminants are stored; and serving as an early warning system monitoring ground-water quality near public water-supply wells.

  8. Process computers automate CERN power supply installations

    International Nuclear Information System (INIS)

    Ullrich, H.; Martin, A.

    1974-01-01

    Higher standards of performance and reliability in the power plants of large particle accelerators necessitate increasing use of automation. The CERN (European Nuclear Research Centre) in Geneva started to employ process computers for plant automation at an early stage in its history. The great complexity and extent of the plants for high-energy physics first led to the setting-up of decentralized automatic systems which are now being increasingly combined into one interconnected automation system. One of these automatic systems controls and monitors the extensive power supply installations for the main ring magnets in the experimental zones. (orig.) [de

  9. National Automated Conformity Inspection Process -

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  10. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    system performance by monitoring in real time the cell concentration and viability of yeast extracted directly from an in-house made bioreactor. This is the first demonstration of using the Dean drag force, generated due to the implementation of a curved microchannel geometry in conjunction with high...... flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for administering air bubbles from the bioreactor in the microfluidic system, so...... and thereby ensure optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output. In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells...

  11. Automated contamination monitoring for hot particles

    International Nuclear Information System (INIS)

    Johnstone, G.; Case, L.

    1987-01-01

    INS Corp., the largest nuclear laundry company in the United States, has recently developed two types of automated contamination monitoring systems: 1) the Automated Laundry Monitor (ALM), which provides quality assurance monitoring for protective clothing contamination and 2) a low-level automated monitoring system for Plastic Volume Reduction Service (PVRS). The presentation details the inaccuracies associated with hand-probe frisking which led to the development of the ALM. The ALM was designed for 100% quality assurance monitoring of garments to the most stringent customer requirements. A review of why the ALM is essential in verifying the absence of hot particles on garments is given. The final topic addresses the expansion of the ALM technology in support of the INS Plastic Volume Reduction Service by monitoring decontaminated plastics to free release levels. This presentation reviews the design and operation of both monitoring systems

  12. Advanced health monitor for automated driving functions

    OpenAIRE

    Mikovski Iotov, I.

    2017-01-01

    There is a trend in the automotive domain where driving functions are taken from the driver by automated driving functions. In order to guarantee the correct behavior of these automated driving functions, the report introduces an Advanced Health Monitor that uses Temporal Logic and Probabilistic Analysis to indicate the system’s health.

  13. Advanced health monitor for automated driving functions

    NARCIS (Netherlands)

    Mikovski Iotov, I.

    2017-01-01

    There is a trend in the automotive domain where driving functions are taken from the driver by automated driving functions. In order to guarantee the correct behavior of these automated driving functions, the report introduces an Advanced Health Monitor that uses Temporal Logic and Probabilistic

  14. Immunosuppressant therapeutic drug monitoring by LC-MS/MS: workflow optimization through automated processing of whole blood samples.

    Science.gov (United States)

    Marinova, Mariela; Artusi, Carlo; Brugnolo, Laura; Antonelli, Giorgia; Zaninotto, Martina; Plebani, Mario

    2013-11-01

    Although, due to its high specificity and sensitivity, LC-MS/MS is an efficient technique for the routine determination of immunosuppressants in whole blood, it involves time-consuming manual sample preparation. The aim of the present study was therefore to develop an automated sample-preparation protocol for the quantification of sirolimus, everolimus and tacrolimus by LC-MS/MS using a liquid handling platform. Six-level commercially available blood calibrators were used for assay development, while four quality control materials and three blood samples from patients under immunosuppressant treatment were employed for the evaluation of imprecision. Barcode reading, sample re-suspension, transfer of whole blood samples into 96-well plates, addition of internal standard solution, mixing, and protein precipitation were performed with a liquid handling platform. After plate filtration, the deproteinised supernatants were submitted to on-line SPE. The only manual steps in the entire process were de-capping of the tubes and transfer of the well plates to the HPLC autosampler. Calibration curves were linear throughout the selected ranges. The imprecision and accuracy data for all analytes were highly satisfactory. The agreement between the results obtained with manual and those obtained with automated sample preparation was optimal (n=390, r=0.96). In daily routine (100 patient samples), the typical overall turnaround time was less than 6 h. Our findings indicate that the proposed analytical system is suitable for routine analysis, since it is straightforward and precise. Furthermore, it incurs less manual workload and less risk of error in the quantification of whole blood immunosuppressant concentrations than conventional methods. © 2013.

  15. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 (inch) film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate fully the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 (inch) film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogations (computer aided interpretation). The system called FDRS (for Film Digital Radiography System) is moving toward 50 micron (approx. 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author) [pt

  16. Automated Cryocooler Monitor and Control System Software

    Science.gov (United States)

    Britchcliffe, Michael J.; Conroy, Bruce L.; Anderson, Paul E.; Wilson, Ahmad

    2011-01-01

    This software is used in an automated cryogenic control system developed to monitor and control the operation of small-scale cryocoolers. The system was designed to automate the cryogenically cooled low-noise amplifier system described in "Automated Cryocooler Monitor and Control System" (NPO-47246), NASA Tech Briefs, Vol. 35, No. 5 (May 2011), page 7a. The software contains the algorithms necessary to convert the non-linear output voltages from the cryogenic diode-type thermometers and the vacuum and helium pressure sensors to temperature and pressure units. The control function algorithms use the monitor data to control the cooler power, vacuum solenoid, vacuum pump, and electrical warm-up heaters. The control algorithms are based on a rule-based system that activates the required device based on the operating mode. The external interface is Web-based. It acts as a Web server, providing pages for monitoring, control, and configuration. No client software from the external user is required.
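    The diode-voltage conversion mentioned above is typically a table lookup with piecewise-linear interpolation, since the sensor response is non-linear. A sketch under assumed calibration points (the table values below are illustrative, not the actual calibration of the DSN hardware):

```python
import bisect

# Hypothetical excerpt of a silicon-diode calibration table:
# forward voltage (V) at 10 uA bias vs. temperature (K).
# Voltage rises as temperature falls; stored in increasing-voltage order.
CAL = [(0.50, 300.0), (0.60, 250.0), (0.80, 150.0), (1.00, 60.0), (1.60, 10.0)]

def diode_temperature(v):
    """Interpolate temperature (K) from a diode voltage reading."""
    volts = [p[0] for p in CAL]
    if not volts[0] <= v <= volts[-1]:
        raise ValueError("voltage outside calibration range")
    i = bisect.bisect_right(volts, v)
    if i == len(CAL):
        i -= 1
    (v0, t0), (v1, t1) = CAL[i - 1], CAL[i]
    return t0 + (t1 - t0) * (v - v0) / (v1 - v0)
```

    The same pattern applies to the thermistor vacuum sensors; only the helium pressure sensor, with its linear 1-5 V output, reduces to a single scale-and-offset step.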

  17. Automated personal dosimetry monitoring system for NPP

    International Nuclear Information System (INIS)

    Chanyshev, E.; Chechyotkin, N.; Kondratev, A.; Plyshevskaya, D.

    2006-01-01

    Full text: Radiation safety of personnel at nuclear power plants (NPP) is a priority aim. The degree of radiation exposure of personnel is defined by many factors: NPP design, operation of equipment, organizational management of radiation hazardous works and, certainly, the safety culture of every employee. An Automated Personal Dosimetry Monitoring System (A.P.D.M.S.) is applied at all nuclear power plants in Russia nowadays to eliminate the possibility of occupational radiation exposure beyond regulated levels under different modes of NPP operation. A.P.D.M.S. provides individual radiation dose registration. In the paper the efforts of Design Bureau 'Promengineering' in constructing the software and hardware complex of A.P.D.M.S. (S.H.W. A.P.D.M.S.) for NPPs with PWRs are presented. The developed complex is intended to automate the activities of the radiation safety department when carrying out individual dosimetry control. The complex covers all main processes concerning individual monitoring of external and internal radiation exposure, as well as dose recording, management, and planning. S.H.W. A.P.D.M.S. is a multi-purpose system whose software was designed on a modular approach. This approach permits modification and extension of the software using new components (modules) without changes to other components. Such a structure makes the system flexible and allows it to be modified when new radiation safety requirements are implemented or the scope of dosimetry monitoring is extended. That gives the possibility to include, over time, new kinds of dosimetry control at Russian NPPs in compliance with IAEA recommendations, for instance, control of the equivalent dose rate to the skin and the equivalent dose rate to the lens of the eye. S.H.W. A.P.D.M.S. provides dosimetry control as follows: Current monitoring of external radiation exposure: - Gamma radiation dose measurement using radio-photoluminescent personal dosimeters. - Neutron radiation dose measurement using thermoluminescent

  18. Development and implementation of an automated system for the adequacy of the gamma radiation monitor calibration process

    International Nuclear Information System (INIS)

    Silva Junior, Iremar Alves

    2012-01-01

    In this study, the development and implementation of a system for the appropriate process of gamma radiation monitor calibration were carried out, constituted by a pneumatic device to exchange the attenuators and a positioning table, both actuated through a control panel. We also implemented a Caesa-Gammatron irradiator system, which increased the range of air kerma rates due to its higher activity compared with the current gamma radiation system in use in the gamma irradiation calibration laboratory. Hence, the installation of a remotely controlled attenuator device in this irradiator system was necessary. Lastly, an evaluation of the reduction in occupational dose rates was carried out. This dissertation was developed with the aim of improving the quality of the services of calibration and testing of gamma radiation monitors provided by the IPEN Laboratory of Instrument Calibration, as well as decreasing the occupational dose of the technicians involved in the calibration process, following thus the principles of radiation protection. (author)

  19. Automating the personnel dosimeter monitoring program

    International Nuclear Information System (INIS)

    Compston, M.W.

    1982-12-01

    The personnel dosimetry monitoring program at the Portsmouth uranium enrichment facility has been improved by using thermoluminescent dosimetry to monitor for ionizing radiation exposure, and by automating most of the operations and all of the associated information handling. A thermoluminescent dosimeter (TLD) card, worn by personnel inside security badges, stores the energy of ionizing radiation. The dosimeters are changed out periodically and are loaded, 150 cards at a time, into an automated reader-processor. The resulting data are recorded and filed into a useful form by computer programs developed for this purpose

  20. Automating the conflict resolution process

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here are how resource conflicts are currently resolved as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  1. The Automator: Intelligent control system monitoring

    International Nuclear Information System (INIS)

    M. Bickley; D.A. Bryan; K.S. White

    1999-01-01

    A large-scale control system may contain several hundred thousand control points which must be monitored to ensure smooth operation. Knowledge of the current state of such a system is often implicit in the values of these points, and operators must be cognizant of the state while making decisions. Repetitive operations requiring human intervention lead to fatigue, which can in turn lead to mistakes. The authors propose a tool called the Automator based on a middleware software server. This tool would provide a user-configurable engine for monitoring control points. Based on the status of these control points, a specified action could be taken. The action could range from setting another control point, to triggering an alarm, to running an executable. Often the data presented by a system is meaningless without context information from other channels. Such a tool could be configured to present interpreted information based on values of other channels. Additionally, this tool could translate numerous values in a non-friendly form (such as numbers, bits, or return codes) into meaningful strings of information. Multiple instances of this server could be run, allowing individuals or groups to configure their own Automators. The configuration of the tool will be file-based. In the future, these files could be generated by graphical design tools, allowing for rapid development of new configurations. In addition, the server will be able to explicitly maintain information about the state of the control system. This state information can be used in decision-making processes and shared with other applications. A conceptual framework and software design for the tool are presented.
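    A rule engine of the kind the abstract describes (watch control points, fire configured actions, translate raw codes into meaningful strings) can be captured in a few lines. The class and channel names below are illustrative, not from the described implementation:

```python
# Minimal sketch of a rule engine in the spirit of the Automator:
# each rule watches a channel and fires its action when the predicate holds.
class Automator:
    def __init__(self):
        self.rules = []   # list of (channel, predicate, action) triples
        self.state = {}   # latest value per channel, shared with actions

    def add_rule(self, channel, predicate, action):
        self.rules.append((channel, predicate, action))

    def update(self, channel, value):
        """Ingest a new control-point value; return results of fired actions."""
        self.state[channel] = value
        fired = []
        for ch, pred, action in self.rules:
            if ch == channel and pred(value):
                fired.append(action(value, self.state))
        return fired

# Translate a raw return code into a meaningful string (hypothetical codes):
codes = {0: "OK", 3: "vacuum fault"}
a = Automator()
a.add_rule("pump.status", lambda v: v != 0,
           lambda v, s: f"ALARM: pump.status={codes.get(v, 'unknown')}")
```

    Because each rule's action receives the full state dictionary, a rule can interpret one channel in the context of others, which is the contextualization the abstract emphasizes.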

  2. Automated interactive sales processes

    NARCIS (Netherlands)

    T.B. Klos (Tomas); D.J.A. Somefun (Koye); J.A. La Poutré (Han)

    2010-01-01

    When we look at successful sales processes occurring in practice, we find they combine two techniques which have been studied separately in the literature. Recommender systems are used to suggest additional products or accessories to include in the bundle under consideration, and

  3. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1988-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automation is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 3 x 10^9 bits per 14 x 17 (inch) film, equivalent to 2200 computer floppy disks. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate fully the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 (inch) film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogations (computer aided interpretation). The system called FDRS (for film digital radiography system) is moving toward 50 micron (16 lines/mm) resolution. This is believed to meet the majority of image content needs. (Author). 4 refs.; 21 figs

  4. Automated Cryocooler Monitor and Control System

    Science.gov (United States)

    Britcliffe, Michael J.; Hanscon, Theodore R.; Fowler, Larry E.

    2011-01-01

    A system was designed to automate cryogenically cooled low-noise amplifier systems used in the NASA Deep Space Network. It automates the entire operation of the system including cool-down, warm-up, and performance monitoring. The system is based on a single-board computer with custom software and hardware to monitor and control the cryogenic operation of the system. The system provides local display and control, and can be operated remotely via a Web interface. The system controller is based on a commercial single-board computer with onboard data acquisition capability. The commercial hardware includes a microprocessor, an LCD (liquid crystal display), seven LED (light emitting diode) displays, a seven-key keypad, an Ethernet interface, 40 digital I/O (input/output) ports, 11 A/D (analog to digital) inputs, four D/A (digital to analog) outputs, and an external relay board to control the high-current devices. The temperature sensors used are commercial silicon diode devices that provide a non-linear voltage output proportional to temperature. The devices are excited with a 10-microamp bias current. The system is capable of monitoring and displaying three temperatures. The vacuum sensors are commercial thermistor devices. The output of the sensors is a non-linear voltage proportional to vacuum pressure in the 1-Torr to 1-millitorr range. Two sensors are used. One measures the vacuum pressure in the cryocooler and the other the pressure at the input to the vacuum pump. The helium pressure sensor is a commercial device that provides a linear voltage output from 1 to 5 volts, corresponding to a gas pressure from 0 to 3.5 MPa (approx. = 500 psig). Control of the vacuum process is accomplished with a commercial electrically operated solenoid valve. A commercial motor starter is used to control the input power of the compressor. The warm-up heaters are commercial power resistors sized to provide the appropriate power for the thermal mass of the particular system, and

  5. Automation Processes and Blockchain Systems

    OpenAIRE

    Hegadekatti, Kartik

    2017-01-01

    Blockchain Systems and Ubiquitous computing are changing the way we do business and lead our lives. One of the most important applications of Blockchain technology is in automation processes and Internet-of-Things (IoT). Machines have so far been limited in ability primarily because they have restricted capacity to exchange value. Any monetary exchange of value has to be supervised by humans or human-based centralised ledgers. Blockchain technology changes all that. It allows machines to have...

  6. Using artificial intelligence to automate remittance processing.

    Science.gov (United States)

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision.

  7. [Algorithm for the automated processing of rheosignals].

    Science.gov (United States)

    Odinets, G S

    1988-01-01

    An algorithm for rheosignal recognition for a microprocessor device with a display and with automated and manual cursor control was examined. The algorithm permits automation of rheosignal registration and processing, taking into account their variability.

  8. Monitoring system for automation of experimental researches in cutting

    International Nuclear Information System (INIS)

    Kuzinovski, Mikolaj; Trajchevski, Neven; Filipovski, Velimir; Tomov, Mite; Cichosz, Piotr

    2009-01-01

    This study presents the procedures performed when designing and carrying out experimental scientific research using an automated, computer-supported measurement system in all stages of the experiment. Special emphasis is placed on measurement system integration and on the mathematical processing of experimental data. The automation is described through the authors' own monitoring system, with computer-aided data acquisition, for researching physical phenomena in the cutting process. The monitoring system determines the tangential, axial and radial components of the cutting force, as well as the average temperature in the cutting process. The hardware acquisition part consists of amplifiers and A/D converters, while the analysis and visualization software for PC was developed using MS Visual C++. For the mathematical description of the investigated physical phenomena, CADEX software was created, which, in connection with MATLAB, supports the design, processing and analysis of experimental research according to the theory of multi-factorial experiment planning. The design and construction of the interface and the computerized measurement system were done by the Faculty of Mechanical Engineering in Skopje in collaboration with the Faculty of Electrical Engineering and Information Technologies in Skopje and the Institute of Production Engineering and Automation, Wroclaw University of Technology, Poland. Having its own scientific research measurement system, with full access to the hardware and software, provides complete control of the research process and reduces the measurement uncertainty of the results obtained.
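    As a minimal illustration of the acquisition path described above (amplifier plus A/D converter feeding force values to the PC), raw converter counts for each channel can be scaled to engineering units. The gains, offsets and converter resolution below are invented for the example, not taken from the described system:

```python
# Illustrative conversion of raw A/D readings to cutting-force components.
ADC_BITS = 12                 # assumed 12-bit converter
V_REF = 10.0                  # assumed full-scale input, volts

CHANNELS = {                  # channel: (newtons per volt, offset in volts)
    "tangential": (200.0, 0.05),
    "axial":      (150.0, 0.02),
    "radial":     (180.0, 0.00),
}

def counts_to_force(channel, counts):
    """Convert one A/D sample on a named channel to force in newtons."""
    volts = counts / (2 ** ADC_BITS - 1) * V_REF
    n_per_v, offset = CHANNELS[channel]
    return (volts - offset) * n_per_v
```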

  9. Office automation: a look beyond word processing

    OpenAIRE

    DuBois, Milan Ephriam, Jr.

    1983-01-01

    Approved for public release; distribution is unlimited. Word processing was the first of various forms of office automation technologies to gain widespread acceptance and usability in the business world. For many, it remains the only form of office automation technology. Office automation, however, is not just word processing, although it does include the function of facilitating and manipulating text. In reality, office automation is not one innovation, or one office system, or one tech...

  10. ERP processes automation in corporate environments

    OpenAIRE

    Antonoaie Victor; Irimeş Adrian; Chicoş Lucia-Antoneta

    2017-01-01

    Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP proj...

  11. AUTOMATING THE DATA SECURITY PROCESS

    Directory of Open Access Journals (Sweden)

    Florin Ogigau-Neamtiu

    2017-11-01

    Full Text Available Contemporary organizations face big data security challenges in the cyber environment due to modern threats and to an actual business working model which relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. The current data classification and data obfuscation selection processes (encryption, masking or tokenization) suffer because of the human involvement in the process. Organizations need to strengthen the data security domain by classifying information based on its importance, conducting risk assessment plans and using the most cost-effective data obfuscation technique. The paper proposes a new model for data protection that uses automated machine decision-making procedures to classify data and to select the appropriate data obfuscation technique. The proposed system uses natural language processing capabilities to analyze input data and to select the best course of action. The system has capabilities to learn from previous experiences, thus improving itself and reducing the risk of wrong data classification.
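    The classify-then-obfuscate decision the paper automates can be caricatured with a simple rule table standing in for its NLP model; the field names and the technique mapping below are purely illustrative:

```python
# Toy stand-in for the paper's classifier: match a field name against
# keywords and pick an obfuscation technique. Mapping is illustrative only.
RULES = [
    ("card_number", "tokenization"),   # payment data: reversible token
    ("ssn",         "masking"),        # identifiers shown only partially
    ("password",    "encryption"),     # secrets: strong encryption
]

def select_obfuscation(field_name):
    """Pick a data-obfuscation technique from a field name."""
    name = field_name.lower()
    for keyword, technique in RULES:
        if keyword in name:
            return technique
    return "none"
```

    The paper's contribution is precisely to replace such a brittle keyword table with a learning system, removing the human from the classification loop.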

  12. AUTOMATION OF IMAGE DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Preuss Ryszard

    2014-12-01

    Full Text Available This article discusses the current capabilities of automated processing of image data on the example of using PhotoScan software by Agisoft. At present, image data obtained by various registration systems (metric and non-metric cameras placed on airplanes, satellites, or more often on UAVs) is used to create photogrammetric products. Multiple registrations of an object or land area (large groups of photos) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of this, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied. They can create a model of a block in the local coordinate system or, using initial exterior orientation and measured control points, can provide image georeference in an external reference frame. In the case of non-metric image application, it is also possible to carry out a self-calibration process at this stage. The image matching algorithm is also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM, and a photorealistic solid model of an object. All the aforementioned processing steps are implemented in a single program, in contrast to standard commercial software dividing the steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential implementation of the processing steps at predetermined control parameters. The paper presents the practical results of fully automatic generation of orthomosaics for both images obtained by a metric Vexell camera and a block of images acquired by a non-metric UAV system.

  13. AUTOMATED LOW-COST PHOTOGRAMMETRY FOR FLEXIBLE STRUCTURE MONITORING

    Directory of Open Access Journals (Sweden)

    C. H. Wang

    2012-07-01

    Full Text Available Structural monitoring requires instruments which can provide high precision and accuracy, reliable measurements at good temporal resolution and rapid processing speeds. Long-term campaigns and flexible structures are regarded as two of the most challenging subjects in monitoring engineering structures. Long-term monitoring in civil engineering is generally considered to be labour-intensive and financially expensive, and it can take significant effort to arrange the necessary human resources, transportation and equipment maintenance. When dealing with flexible structure monitoring, it is of paramount importance that any monitoring equipment used is able to carry out rapid sampling. Low-cost, automated, photogrammetric techniques therefore have the potential to become routinely viable for monitoring non-rigid structures. This research aims to provide a photogrammetric solution for long-term flexible structural monitoring purposes. The automated approach was achieved using low-cost imaging devices (mobile phones) to replace traditional image acquisition stations and substantially reduce the equipment costs. A self-programmed software package was developed to deal with the hardware-software integration and system operation. In order to evaluate the performance of this low-cost monitoring system, a shaking table experiment was undertaken. Different network configurations and target sizes were used to determine the best configuration. A large quantity of image data was captured by four DSLR cameras and four mobile phone cameras respectively. These image data were processed using photogrammetric techniques to calculate the final results for the system evaluation.

  14. An automated processing chain for surface temperature monitoring on Earth's most active volcanoes by optical data from multiple satellites

    Science.gov (United States)

    Silvestri, Malvina; Musacchio, Massimo; Fabrizia Buongiorno, Maria

    2017-04-01

    The Geohazards Exploitation Platform, or GEP, is one of six Thematic Exploitation Platforms developed by ESA to serve data user communities. As a new element of the ground segment delivering satellite results to users, these cloud-based platforms provide an online environment to access information, processing tools and computing resources for community collaboration. The aim is to enable the easy extraction of valuable knowledge from the vast quantities of satellite-sensed data now being produced by Europe's Copernicus programme and other Earth observation satellites. In this context, the estimation of surface temperature on active volcanoes around the world is considered. End-to-end (E2E) processing chains have been developed for different satellite data (the ASTER, Landsat 8 and Sentinel-3 missions) using thermal infrared (TIR) channels and applying specific algorithms. These chains have been implemented on the GEP platform, enabling non-expert users to exploit EO missions and generate added-value products such as surface temperature maps. This solution will enhance the use of satellite data and improve the dissemination of results, saving valuable time (no manual browsing, downloading or processing is needed) and producing time series that can be speedily extracted from a single co-registered pixel to highlight gradual trends within a narrow area. Moreover, thanks to the high-resolution optical imagery of Sentinel-2 (MSI), lava maps can be obtained automatically during an eruption. The proposed lava detection method is based on a contextual algorithm applied to Sentinel-2 NIR (band 8, 0.8 micron) and SWIR (band 12, 2.25 micron) data. Examples derived from recent eruptions on active volcanoes are shown.
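The contextual NIR/SWIR test can be illustrated with a short sketch. This is not the published algorithm: the normalized index, the scene-wide background statistics and the threshold factor `k` are all illustrative assumptions, and the reflectance values are invented.

```python
from statistics import mean, stdev

def hotspot_index(nir, swir):
    """Normalized difference of SWIR (band 12) and NIR (band 8) reflectance.
    Hot surfaces such as active lava emit strongly in the SWIR, pushing the
    index toward +1."""
    return (swir - nir) / (swir + nir)

def detect_lava_pixels(nir_band, swir_band, k=1.5):
    """Contextual test (illustrative): flag pixels whose index exceeds the
    background mean by more than k standard deviations."""
    idx = [hotspot_index(n, s) for n, s in zip(nir_band, swir_band)]
    bg_mean, bg_std = mean(idx), stdev(idx)
    return [i for i, v in enumerate(idx) if v > bg_mean + k * bg_std]

# Mostly cool scene with two anomalously SWIR-bright pixels (indices 4 and 6)
nir  = [0.20, 0.21, 0.19, 0.20, 0.05, 0.22, 0.06, 0.20]
swir = [0.10, 0.11, 0.10, 0.09, 0.60, 0.10, 0.55, 0.11]
print(detect_lava_pixels(nir, swir))  # → [4, 6]
```

Real detectors estimate the background locally around each pixel rather than scene-wide, which is what makes the algorithm "contextual".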

  15. Multivariate process monitoring of EAFs

    Energy Technology Data Exchange (ETDEWEB)

    Sandberg, E.; Lennox, B.; Marjanovic, O.; Smith, K.

    2005-06-01

    Improved knowledge of the effect of scrap grades on the electric steelmaking process and optimised scrap loading practices increase the potential for process automation. As part of an ongoing programme, process data from four Scandinavian EAFs have been analysed, using the multivariate process monitoring approach, to develop predictive models for end point conditions such as chemical composition, yield and energy consumption. The models developed generally predict final Cr, Ni and Mo and tramp element contents well, but electrical energy consumption, yield and content of oxidisable and impurity elements (C, Si, Mn, P, S) are at present more difficult to predict. Potential scrap management applications of the prediction models are also presented. (author)

  16. Automated processing of endoscopic surgical instruments.

    Science.gov (United States)

    Roth, K; Sieber, J P; Schrimm, H; Heeg, P; Buess, G

    1994-10-01

    This paper deals with the requirements for automated processing of endoscopic surgical instruments. After a brief analysis of the current problems, solutions are discussed. Test procedures have been developed to validate the automated processing, so that the cleaning results are guaranteed and reproducible. A device for testing and cleaning, called TC-MIC, was also designed together with Netzsch Newamatic and PCI to automate processing and reduce manual work.

  17. SHARP - Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. G.

    1990-01-01

    Briefly discussed here is the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  18. SHARP: Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. Gaius

    1991-01-01

    Briefly discussed here is the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  19. Engineering Process Monitoring for Control Room Operation

    OpenAIRE

    Bätz, M

    2001-01-01

    A major challenge in process operation is to reduce costs and increase system efficiency, whereas the complexity of automated process engineering, control and monitoring systems increases continuously. To cope with this challenge, the design, implementation and operation of process monitoring systems for control room operation have to be treated as an ensemble. This is only possible if the engineering of the monitoring information is focused on the production objective and is led in close collaboration ...

  20. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower required for processing tasks far exceeds the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.

  1. ERP processes automation in corporate environments

    Directory of Open Access Journals (Sweden)

    Antonoaie Victor

    2017-01-01

    Full Text Available Automation processes are used in organizations to speed up analysis processes and reduce manual labour. Robotic automation of IT processes implemented in a modern corporate workspace provides an excellent tool for assisting professionals in making decisions, saving resources and serving as a know-how repository. This study presents the newest trends in process automation and its benefits, such as security, ease of use and reduction of overall process duration, and provides examples of SAP ERP projects where this technology was implemented and a meaningful impact was obtained.

  2. Powder handling for automated fuel processing

    International Nuclear Information System (INIS)

    Frederickson, J.R.; Eschenbaum, R.C.; Goldmann, L.H.

    1989-01-01

    Installation of the Secure Automated Fabrication (SAF) line has been completed. It is located in the Fuel Cycle Plant (FCP) at the Department of Energy's (DOE) Hanford site near Richland, Washington. The SAF line was designed to fabricate advanced reactor fuel pellets and assemble fuel pins by automated, remote operation. This paper describes powder handling equipment and techniques utilized for automated powder processing and powder conditioning systems in this line. 9 figs

  3. Real-time bioacoustics monitoring and automated species identification

    Directory of Open Access Journals (Sweden)

    T. Mitchell Aide

    2013-07-01

    Full Text Available Traditionally, animal species diversity and abundance are assessed using a variety of methods that are generally costly, limited in space and time, and, most importantly, rarely include a permanent record. Given the urgency of climate change and the loss of habitat, it is vital that we use new technologies to improve and expand global biodiversity monitoring to thousands of sites around the world. In this article, we describe the acoustical component of the Automated Remote Biodiversity Monitoring Network (ARBIMON, a novel combination of hardware and software for automating data acquisition, data management, and species identification based on audio recordings. The major components of the cyberinfrastructure include: a solar powered remote monitoring station that sends 1-min recordings every 10 min to a base station, which relays the recordings in real-time to the project server, where the recordings are processed and uploaded to the project website (arbimon.net. Along with a module for viewing, listening, and annotating recordings, the website includes a species identification interface to help users create machine learning algorithms to automate species identification. To demonstrate the system we present data on the vocal activity patterns of birds, frogs, insects, and mammals from Puerto Rico and Costa Rica.

  4. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management.

  5. Automated wireless monitoring system for cable tension using smart sensors

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jongwoong; Cho, Soojin; Spencer, Billie F.; Yun, Chung-Bang

    2013-04-01

    Cables are critical load carrying members of cable-stayed bridges; monitoring tension forces of the cables provides valuable information for SHM of the cable-stayed bridges. Monitoring systems for the cable tension can be efficiently realized using wireless smart sensors in conjunction with vibration-based cable tension estimation approaches. This study develops an automated cable tension monitoring system using MEMSIC's Imote2 smart sensors. An embedded data processing strategy is implemented on the Imote2-based wireless sensor network to calculate cable tensions using a vibration-based method, significantly reducing the wireless data transmission and associated power consumption. The autonomous operation of the monitoring system is achieved by AutoMonitor, a high-level coordinator application provided by the Illinois SHM Project Services Toolsuite. The monitoring system also features power harvesting enabled by solar panels attached to each sensor node and AutoMonitor for charging control. The proposed wireless system has been deployed on the Jindo Bridge, a cable-stayed bridge located in South Korea. Tension forces are autonomously monitored for 12 cables on the east, land side of the bridge, proving the validity and potential of the presented tension monitoring system for real-world applications.
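In its simplest form, vibration-based cable tension estimation reduces to the taut-string relation T = 4 m L² (fₙ/n)², relating tension to the measured natural frequency fₙ of mode n. A minimal sketch with invented cable properties, ignoring the sag and bending-stiffness corrections that practical algorithms apply:

```python
def cable_tension(mass_per_length, length, freq, mode=1):
    """Taut-string estimate of cable tension in newtons from the measured
    natural frequency (Hz) of the given mode number. Neglects sag and
    bending stiffness, so it is only a first approximation."""
    return 4.0 * mass_per_length * length**2 * (freq / mode) ** 2

# Hypothetical stay cable: 80 kg/m, 120 m long, first-mode frequency 1.1 Hz
T = cable_tension(80.0, 120.0, 1.1)
print(f"{T/1e6:.2f} MN")  # → 5.58 MN
```

Because only identified modal frequencies are needed, the smart sensor can transmit a handful of numbers instead of raw acceleration records, which is the power saving the abstract refers to.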

  6. Computer simulation and automation of data processing

    International Nuclear Information System (INIS)

    Tikhonov, A.N.

    1981-01-01

    The principles of computerized simulation and automation of data processing are presented. The automated processing system is constructed according to the module-hierarchical principle. The main operating modes of the system are as follows: preprocessing, installation analysis, interpretation, accuracy analysis and parameter control. A definition is given of the quasi-real experiment, which permits planning of the real experiment. It is pointed out that realization of the quasi-real experiment by means of a computerized model of the installation, with subsequent automated processing, makes it possible to examine the quantitative behaviour of the system as a whole and provides optimal design of installation parameters for obtaining maximum resolution [ru

  7. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In this age and time when celerity is expected out of all the sectors of the country, the speed of execution of various processes and hence efficiency, becomes a prominent factor. To facilitate the speeding demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries and conservation of resources in multiple sectors. To implement RPA, many software ...

  8. Automated Method for Monitoring Water Quality Using Landsat Imagery

    Directory of Open Access Journals (Sweden)

    D. Clay Barrett

    2016-06-01

    Full Text Available Regular monitoring of water quality is increasingly necessary to keep pace with rapid environmental change and protect human health and well-being. Remote sensing has been suggested as a potential solution for monitoring certain water quality parameters without the need for in situ sampling, but universal methods and tools are lacking. While many studies have developed predictive relationships between remotely sensed surface reflectance and water parameters, these relationships are often unique to a particular geographic region and have little applicability in other areas. In order to remotely monitor water quality, these relationships must be developed on a region-by-region basis. This paper presents an automated method for processing remotely sensed images from the Landsat Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+) and extracting corrected reflectance measurements around known sample locations to allow rapid development of predictive water quality relationships and improve remote monitoring. Using open Python scripting, this study (1) provides an openly accessible and simple method for processing publicly available remote sensing data; and (2) allows determination of relationships between sampled water quality parameters and reflectance values to ultimately allow predictive monitoring. The method is demonstrated through a case study of the Ozark/Ouachita-Appalachian ecoregion in eastern Oklahoma using data collected for the Beneficial Use Monitoring Program (BUMP).
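Once reflectance values have been extracted around the sample locations, the per-region calibration step can be as simple as an ordinary least-squares fit between a band's reflectance and the sampled parameter. A minimal sketch with invented numbers (the study's actual relationships are region- and parameter-specific):

```python
def fit_linear(x, y):
    """Ordinary least-squares fit y ≈ a*x + b relating band reflectance to a
    sampled water quality parameter."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Corrected red-band reflectance vs. measured turbidity (NTU) — invented data
refl = [0.02, 0.05, 0.08, 0.11, 0.14]
ntu  = [3.1, 9.0, 15.2, 20.8, 27.1]
a, b = fit_linear(refl, ntu)
print(round(a, 1), round(b, 2))  # slope ≈ 199.3 NTU per unit reflectance
```

With the relationship in hand, new Landsat scenes can be converted to predicted parameter maps without further field sampling, which is the predictive-monitoring step the abstract describes.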

  9. Lifecycle, Iteration, and Process Automation with SMS Gateway

    Directory of Open Access Journals (Sweden)

    Fenny Fenny

    2015-12-01

    Full Text Available Producing a better-quality software system requires an understanding of software quality indicators, through defect detection and automated testing. This paper aims to improve the design and automated testing process for an engine water pump of a drinking water plant. It proposes how software developers can improve the maintainability and reliability of an automated testing system and report the abnormal state when an error occurs on the machine. The method in this paper uses the literature to explain best practices, together with a case study of a drinking water plant. Furthermore, this paper is expected to provide insights into efforts to better handle errors and perform automated testing and monitoring on a machine.

  10. Welding process automation in power machine building

    International Nuclear Information System (INIS)

    Mel'bard, S.N.; Shakhnov, A.F.; Shergov, I.V.

    1977-01-01

    The level of automation of welding operations in power engineering and ways of enhancing it are highlighted. Examples of complex automation include an apparatus for the horizontal welding of turbine rotors, a remotely controlled automatic machine for welding ring joints of large-sized vessels, and equipment for the electron-beam welding of steam turbine assemblies of alloyed steels. The prospects of industrial robots are noted. The importance of the complex automation of the technological process, including stocking, assembling, transportation and auxiliary operations, is emphasized

  11. Automated data processing and radioassays.

    Science.gov (United States)

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assays, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there is probably steric and cooperative influence on binding. An alternative, more flexible mathematical model, based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen, has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls are presented of point-to-point data reduction, linear transformations, and curvilinear fitting approaches. A third-order polynomial using the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of significant dose response, and also towards saturation. The importance is stressed of limiting the range of reported automated assay results to that portion of the standard curve that delivers optimal sensitivity.
Published methods for automated data reduction of Scatchard plots
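The recommended curve shape, a third-order polynomial in the square root of concentration, can be fitted by ordinary least squares. A self-contained sketch with invented standards (a real assay would add replicate counts, weighting and the piecewise linear segments the abstract mentions):

```python
import math

def fit_sqrt_cubic(conc, response):
    """Least-squares fit of response = c0 + c1*u + c2*u^2 + c3*u^3 with
    u = sqrt(concentration), the curve shape recommended in the article."""
    u = [math.sqrt(c) for c in conc]
    # Normal equations A c = b for the 4 polynomial coefficients
    A = [[sum(ui ** (i + j) for ui in u) for j in range(4)] for i in range(4)]
    b = [sum(yi * ui ** i for ui, yi in zip(u, response)) for i in range(4)]
    # Gaussian elimination with partial pivoting
    for col in range(4):
        piv = max(range(col, 4), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 4):
            f = A[r][col] / A[col][col]
            for c in range(col, 4):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * 4
    for r in range(3, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 4))) / A[r][r]
    return coef

# Invented standards generated from a known curve, then recovered by the fit
conc = [0, 1, 4, 9, 16, 25, 36]
counts = [100.0, 82.9, 71.2, 64.3, 61.6, 62.5, 66.4]
coef = fit_sqrt_cubic(conc, counts)
print([round(c, 3) for c in coef])  # → [100.0, -20.0, 3.0, -0.1]
```

Unknowns are then read off the fitted curve by inverting it numerically, restricted, as the abstract stresses, to the portion of the curve with adequate sensitivity.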

  12. The Automation of Nowcast Model Assessment Processes

    Science.gov (United States)

    2016-09-01

    ... secondly, provide modelers with the information needed to understand the model errors and how their algorithm changes might mitigate these errors. ... The automation of Point-Stat processes (i.e., PSA) was developed using Python 3.5. Python was selected because it is easy to use, widely used for scripting, and satisfies all the requirements to automate the implementation of the Point-Stat tool.

  13. Monitoring of operating processes

    International Nuclear Information System (INIS)

    Barry, R.F.

    1981-01-01

    Apparatus is described for monitoring the processes of a nuclear reactor to detect off-normal operation of any process and for testing the monitoring apparatus. The processes are evaluated through their parameters, such as temperature, pressure, etc. The apparatus includes a pair of monitoring paths, or signal-processing units. Each unit includes facilities for receiving, on a time-sharing basis, a status binary word made up of digits each indicating the status of a process, whether normal or off-normal, and test-signal binary words simulating the status binary words. The status words and test words are processed in succession during successive cycles. During each cycle, the two units receive the same status word and the same test word. The test words simulate the status words both when they indicate normal operation and when they indicate off-normal operation. Each signal-processing unit includes a pair of memories. Each memory receives a status word or a test word, as the case may be, and converts the received word into a converted status word or a converted test word. The memories of each monitoring unit operate into a non-coincidence circuit, which signals when the converted word out of one memory of a signal-processing unit is not identical to the converted word of the other memory of the same unit

  14. Automation of technical specification monitoring for nuclear power plants

    International Nuclear Information System (INIS)

    Lin, J.C.; Abbott, E.C.; Hubbard, F.R.

    1986-01-01

    The complexity of today's nuclear power plants, combined with an equally detailed regulatory process, makes it necessary for the plant staff to have access to an automated system capable of monitoring the status of limiting conditions for operation (LCO). Pickard, Lowe and Garrick, Inc. (PLG), has developed the first such system, called the Limiting Conditions for Operation Monitor (LIMCOM). LIMCOM provides members of the operating staff with an up-to-date comparison of currently operable equipment and plant operating conditions with what is required in the technical specifications. LIMCOM also provides an effective method of screening tagout requests by evaluating their impact on the LCOs. Finally, LIMCOM provides an accurate method of tracking and scheduling routine surveillance. (author)

  15. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    Science.gov (United States)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, focusing especially on desalting of samples before analytical analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of sample processing protocols and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize operational time and the use of consumables.
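The trigger logic amounts to a small state machine stepped by the outlet conductivity. A sketch under stated assumptions: the step names follow the three conditions in the abstract, but the threshold values and sequencing rules are invented for illustration.

```python
LOW, HIGH = 50.0, 500.0  # µS/cm thresholds — illustrative values only

def next_step(step, conductivity):
    """Advance the desalting sequence when the outlet conductivity confirms
    the expected chemical state; otherwise stay in the current step."""
    if step == "neutralize" and conductivity < LOW:
        return "acidify"   # de-ionized water flush confirmed at outlet
    if step == "acidify" and conductivity > HIGH:
        return "basify"    # strong acid detected at outlet
    if step == "basify" and conductivity > HIGH:
        return "done"      # high-pH eluent detected at outlet
    return step            # trigger condition not yet met

seq = ["neutralize"]
for reading in [800.0, 30.0, 650.0, 700.0]:
    seq.append(next_step(seq[-1], reading))
print(seq)  # → ['neutralize', 'neutralize', 'acidify', 'basify', 'done']
```

Because each transition waits for the measured state rather than a fixed time, reagent is pumped only as long as needed, which is where the consumables saving comes from.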

  16. Automated system for data acquisition and monitoring

    Directory of Open Access Journals (Sweden)

    Borza Sorin

    2017-01-01

    Full Text Available Environmental management has become a very important issue with the development of human society. Multiple systems have been built that automatically monitor the environment. In this paper we propose a system that integrates GIS software and data acquisition software. In addition, the proposed system implements the AHP multi-criteria method, which can provide an online answer on the influence of each pollutant in the limited geographical area being monitored. Pollutant factors of the limited geographical area are acquired automatically by specific sensors through an acquisition board and LabVIEW software with a virtual instrument, and transferred into an Access database. From the Access database they are taken up by the Geomedia Professional software and processed using the AHP multi-criteria method, so that at any moment their influence on the environment can be classified and plotted on the screen of the monitoring system. The system allows the automatic collection of data, their storage and the generation of GIS elements. The research presented in this paper was aimed at implementing multi-criteria methods in GIS software.
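The AHP step can be sketched with the standard column-normalization approximation of the priority vector. The pairwise judgments below (how much more one pollutant matters than another) are invented; a full implementation would also compute the consistency ratio of the judgment matrix.

```python
def ahp_priorities(pairwise):
    """Approximate AHP priority vector: normalize each column of the
    pairwise comparison matrix, then average across each row."""
    n = len(pairwise)
    col_sums = [sum(row[j] for row in pairwise) for j in range(n)]
    norm = [[pairwise[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(norm[i]) / n for i in range(n)]

# Hypothetical judgments for three pollutants: NOx, SO2, PM10
matrix = [
    [1.0, 3.0, 5.0],   # NOx vs (NOx, SO2, PM10)
    [1/3, 1.0, 3.0],   # SO2
    [1/5, 1/3, 1.0],   # PM10
]
w = ahp_priorities(matrix)
print([round(x, 3) for x in w])  # weights sum to 1; NOx ranked highest
```

The resulting weights are what the GIS layer would use to rank each pollutant's influence on the monitored area.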

  17. Design and implementation of an Internet based effective controlling and monitoring system with wireless fieldbus communications technologies for process automation--an experimental study.

    Science.gov (United States)

    Cetinceviz, Yucel; Bayindir, Ramazan

    2012-05-01

    The network requirements of control systems in industrial applications increase day by day. Internet-based control systems and various fieldbus systems have been designed in order to meet these requirements. This paper describes an Internet-based control system with wireless fieldbus communication designed for distributed processes. The system was implemented as an experimental setup in a laboratory. In industrial facilities, the process control layer and the remote connection of the distributed control devices at the lowest levels of the industrial production environment are provided by fieldbus networks. In this paper, an Internet-based control system that meets the system requirements with a new-generation communication structure, called a wired/wireless hybrid system, has been designed at the field level and implemented to cover all sectors of distributed automation, from process control to distributed input/output (I/O). The system comprises a hardware structure with a programmable logic controller (PLC), a communication processor (CP) module, two industrial wireless modules, a distributed I/O module and a Motor Protection Package (MPP), and a software structure with the WinCC flexible program used for the SCADA (Supervisory Control And Data Acquisition) screen and the SIMATIC MANAGER package ("STEP7") used for the hardware and network configuration and for downloading the control program to the PLC. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.

  18. Automated monitoring of recovered water quality

    Science.gov (United States)

    Misselhorn, J. E.; Hartung, W. H.; Witz, S. W.

    1974-01-01

    Laboratory prototype water quality monitoring system provides automatic system for online monitoring of chemical, physical, and bacteriological properties of recovered water and for signaling malfunction in water recovery system. Monitor incorporates whenever possible commercially available sensors suitably modified.

  19. ARTIP: Automated Radio Telescope Image Processing Pipeline

    Science.gov (United States)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

    The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging radio-interferometric data. ARTIP starts with raw data, i.e., a measurement set, and goes through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging, to generate continuum and spectral line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts and logs. It is written using standard Python libraries and the CASA package. The pipeline can deal with datasets with multiple spectral windows and also multiple target sources which may have arbitrary combinations of flux/bandpass/phase calibrators.
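The stage structure described, each stage runnable on its own or as part of the chain, can be sketched as a simple sequential runner. Everything here (stage names as plain strings, a dict standing in for a measurement set, print statements as user feedback) is an illustrative assumption, not ARTIP's actual interface:

```python
# Ordered stages, following the abstract; each consumes the previous output.
STAGES = ["flagging", "flux_calibration", "bandpass_calibration",
          "phase_calibration", "imaging"]

def run_stage(name, data):
    """Run one stage, logging progress as continuous feedback to the user."""
    print(f"running {name} ...")
    out = dict(data)  # copy so each stage's output is a distinct record
    out["completed"] = data.get("completed", []) + [name]
    return out

def run_pipeline(data, stages=STAGES):
    """Chain the stages; passing stages=['imaging'] runs a single stage."""
    for name in stages:
        data = run_stage(name, data)
    return data

result = run_pipeline({"measurement_set": "target.ms"})
print(result["completed"])
```

Keeping each stage's output self-describing is what lets any stage be re-run independently after an upstream parameter change.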

  20. Automated processing of nuclear materials accounting data

    International Nuclear Information System (INIS)

    Straka, J.; Pacak, P.; Moravec, J.

    1980-01-01

    An automated system for nuclear materials accounting was developed in Czechoslovakia. The system automates data processing, including data storage, and comprises record keeping of inventories and the material balance. In designing the system, the aim of the IAEA was taken into consideration, i.e., building a unified information system interconnected with the state-run systems for accounting and checking nuclear materials in the signatory countries of the non-proliferation treaty. The nuclear materials accounting programs were written in PL/1 and were tested on an EC 1040 computer at UJV Rez, where the routine data processing also takes place. (B.S.)

  1. Automated System of Diagnostic Monitoring at Bureya HPP Hydraulic Engineering Installations: a New Level of Safety

    International Nuclear Information System (INIS)

    Musyurka, A. V.

    2016-01-01

    This article presents the design, hardware, and software solutions developed and placed in service for the automated system of diagnostic monitoring (ASDM) for hydraulic engineering installations at the Bureya HPP, and assuring a reliable process for monitoring hydraulic engineering installations. Project implementation represents a timely solution of problems addressed by the hydraulic engineering installation diagnostics section.

  2. Automated System of Diagnostic Monitoring at Bureya HPP Hydraulic Engineering Installations: a New Level of Safety

    Energy Technology Data Exchange (ETDEWEB)

    Musyurka, A. V., E-mail: musyurkaav@burges.rushydro.ru [Bureya HPP (a JSC RusGidro affiliate) (Russian Federation)

    2016-09-15

    This article presents the design, hardware, and software solutions developed and placed in service for the automated system of diagnostic monitoring (ASDM) for hydraulic engineering installations at the Bureya HPP, and assuring a reliable process for monitoring hydraulic engineering installations. Project implementation represents a timely solution of problems addressed by the hydraulic engineering installation diagnostics section.

  3. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) on the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, the methodology, work performed during the summer, and future work planned.

  4. AUTOMATING THE DATA SECURITY PROCESS

    OpenAIRE

    Florin Ogigau-Neamtiu

    2017-01-01

    Contemporary organizations face big data security challenges in the cyber environment due to modern threats and the current business working model, which relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. Today's data classification and data obfuscation selection processes (encryption, masking or tokenization) suffer because of the human involvement in the process. Organizations need to shrink the data security domain by classifying information based on its...

  5. Automated processing of pulsar observations

    Energy Technology Data Exchange (ETDEWEB)

    Byzhlov, B.V.; Ivanova, V.V.; Izvekova, V.A.; Kuz' min, A.D.; Kuz' min, Yu.P.; Malofeev, V.M.; Popov, Yu.M.; Solomin, N.S.; Shabanova, T.V.; Shitov, Yu.P.

    1977-01-01

    Digital computer technology that processes observation results in real time is used on the meter-range radiotelescopes DKR-100 and BSA of the Physics Institute of the USSR Academy of Sciences to study pulsars. To increase sensitivity and resolution, pulses are accumulated with preliminary compensation of pulsar dispersion over a broad band. Known pulsars are studied with the aid of a ''neuron''-type analyzer, and a system for processing observations on-line was created on the M-6000 computer for the search for unknown pulsars. 8 figures, 1 table, references.

  6. Employee on Boarding Process Automation

    OpenAIRE

    Khushboo Nalband; Priyanka Jadhav; Geetanjali Salunke

    2017-01-01

    On boarding, also known as organizational socialization, plays a vital role in building the initial relationship between an organization and an employee. It also contributes to an employee's satisfaction, better performance and greater organizational commitment, thus increasing an employee's effectiveness and productivity in his/her role. Therefore, it is essential that the on boarding process of an organization is efficient and effective to improve new employees' retention. Generally this on boar...

  7. Means of storage and automated monitoring of versions of text technical documentation

    Science.gov (United States)

    Leonovets, S. A.; Shukalov, A. V.; Zharinov, I. O.

    2018-03-01

    The paper considers automating the preparation, storage and version-control monitoring of text design and program documentation by means of specialized software. Automated preparation of documentation is based on processing the engineering data contained in the specifications and technical documentation. Data handling assumes the existence of strictly structured electronic documents, prepared in widespread formats from templates based on industry standards, and the automated generation of the program or design text document. The subsequent life cycle of the document, and the engineering data entering it, are controlled; at each stage of the life cycle, archival data storage is carried out. Studies of the processing speed of different widespread document formats under automated monitoring and storage are given. The newly developed software and the workbenches available to the developer of instrumental equipment are described.
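    As an illustration of the template-based document generation and version monitoring described here, the following minimal sketch renders a text document from structured data and uses a content hash to flag new versions; the template fields, field names and hash length are invented for the example and are not taken from the paper:

```python
import hashlib
from string import Template

# Hypothetical template following an industry-standard document layout;
# the fields (device, voltage) are illustrative, not from the paper.
TEMPLATE = Template("Device: $device\nSupply voltage: $voltage V\n")

def render_document(spec: dict) -> str:
    """Generate a text document from structured engineering data."""
    return TEMPLATE.substitute(spec)

def version_id(text: str) -> str:
    """Content hash used to detect a new version during monitoring."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()[:12]

doc_v1 = render_document({"device": "PSU-10", "voltage": "24"})
doc_v2 = render_document({"device": "PSU-10", "voltage": "28"})
assert version_id(doc_v1) != version_id(doc_v2)  # edit detected as a new version
```

    Comparing stored hashes against freshly computed ones is one simple way such a system could decide when to archive a new revision.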

  8. Automated extinction monitor for the NLOT site survey

    Science.gov (United States)

    Kumar Sharma, Tarun

    In order to survey a few potential sites for the National Large Optical Telescope (NLOT) project, we have initiated a site survey program. Since most of the instruments used for the site survey are custom made, we also started developing our own site characterization instruments. In this process we have designed and developed a device called the Automated Extinction Monitor (AEM) and installed it at IAO, Hanle. The AEM is a small wide-field robotic telescope dedicated to recording atmospheric extinction in one or more photometric bands. It gives very accurate statistics of the distribution of photometric nights. In addition, the instrument provides measurements of sky brightness. Here we briefly describe the overall instrument and initial results obtained.

  9. Automated sampling and data processing derived from biomimetic membranes

    International Nuclear Information System (INIS)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H; Boesen, T P; Emneus, J

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long time-series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost-efficient and fast way to acquire, process and administer large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)
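    The on-line/off-line split described in this record could be sketched as an acquisition loop followed by a separate reduction step; `read_current` below is a hypothetical stand-in for the instrument driver the WaveManager would wrap, and the placeholder signal is invented:

```python
import statistics

def read_current(t):
    """Hypothetical instrument read; returns a placeholder current in pA."""
    return 100.0 + (t % 5)

def acquire(n_samples):
    """On-line stage: log every sample so long time series are kept whole."""
    return [read_current(t) for t in range(n_samples)]

def summarize(trace):
    """Off-line stage: reduce a long trace to summary statistics."""
    return {"mean": statistics.fmean(trace), "n": len(trace)}

trace = acquire(1000)
report = summarize(trace)  # e.g. stored alongside the raw trace in a database
```

    Keeping acquisition and processing as separate programs, as the authors did, lets long experiments run uninterrupted while analysis happens later.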

  10. Automated sampling and data processing derived from biomimetic membranes

    Energy Technology Data Exchange (ETDEWEB)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H [Aquaporin A/S, Diplomvej 377, DK-2800 Kgs. Lyngby (Denmark); Boesen, T P [Xefion ApS, Kildegaardsvej 8C, DK-2900 Hellerup (Denmark); Emneus, J, E-mail: Claus.Nielsen@fysik.dtu.d [DTU Nanotech, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark)

    2009-12-15

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long time-series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost-efficient and fast way to acquire, process and administer large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  11. Engineering Process Monitoring for Control Room Operation

    CERN Document Server

    Bätz, M

    2001-01-01

    A major challenge in process operation is to reduce costs and increase system efficiency, whereas the complexity of automated process engineering, control and monitoring systems increases continuously. To cope with this challenge, the design, implementation and operation of process monitoring systems for control room operation have to be treated as an ensemble. This is only possible if the engineering of the monitoring information is focused on the production objective and is led in close collaboration between control room teams, exploitation personnel and process specialists. In this paper some principles for the engineering of monitoring information for control room operation are developed using the example of the exploitation of a particle accelerator at the European Laboratory for Nuclear Research (CERN).

  12. Affective processes in human-automation interactions.

    Science.gov (United States)

    Merritt, Stephanie M

    2011-08-01

    This study contributes to the literature on automation reliance by illuminating the influences of user moods and emotions on reliance on automated systems. Past work has focused predominantly on cognitive and attitudinal variables, such as perceived machine reliability and trust. However, recent work on human decision making suggests that affective variables (i.e., moods and emotions) are also important. Drawing from the affect infusion model, significant effects of affect are hypothesized. Furthermore, a new affectively laden attitude termed liking is introduced. Participants watched video clips selected to induce positive or negative moods, then interacted with a fictitious automated system on an X-ray screening task. At five time points, important variables were assessed, including trust, liking, perceived machine accuracy, user self-perceived accuracy, and reliance. These variables, along with propensity to trust machines and state affect, were integrated in a structural equation model. Happiness significantly increased trust and liking for the system throughout the task. Liking was the only variable that significantly predicted reliance early in the task. Trust predicted reliance later in the task, whereas perceived machine accuracy and user self-perceived accuracy had no significant direct effects on reliance at any time. Affective influences on automation reliance are demonstrated, suggesting that this decision-making process may be less rational and more emotional than previously acknowledged. Liking for a new system may be key to appropriate reliance, particularly early in the task. Positive affect can be easily induced and may be a lever for increasing liking.

  13. Lean automation development : applying lean principles to the automation development process

    OpenAIRE

    Granlund, Anna; Wiktorsson, Magnus; Grahn, Sten; Friedler, Niklas

    2014-01-01

    A broad empirical study indicates that automation development shows potential for improvement. In the paper, 13 lean product development principles are contrasted with the automation development process, and it is suggested why and how these principles can facilitate, support and improve the automation development process. The paper summarises what characterises a lean automation development process and what consequences it entails. Main differences compared to current pr...

  14. An Automated Processing Method for Agglomeration Areas

    Directory of Open Access Journals (Sweden)

    Chengming Li

    2018-05-01

    Full Text Available Agglomeration operations are a core component of the automated generalization of aggregated area groups. However, because geographical elements that possess agglomeration features are relatively scarce, the current literature has not given sufficient attention to agglomeration operations, and most reports on the subject are limited to the general conceptual level. Consequently, current agglomeration methods are highly reliant on subjective determinations and cannot support intelligent computer processing. This paper proposes an automated processing method for agglomeration areas. Firstly, the proposed method automatically identifies agglomeration areas based on the width of the striped bridging area, the distribution pattern index (DPI), the shape similarity index (SSI), and the overlap index (OI). Next, the progressive agglomeration operation is carried out, including the computation of the external boundary outlines and the extraction of agglomeration lines. The effectiveness and rationality of the proposed method have been validated using actual census data on Chinese geographical conditions in Jiangsu Province.
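    The identification step based on the bridging-area width and the three indices could, under assumed thresholds, be sketched as a simple predicate; the threshold values below are illustrative, not the paper's calibrated parameters, and the index values are taken as already computed:

```python
# Sketch of the identification step with assumed thresholds; the DPI, SSI
# and OI inputs stand in for the paper's computed index values.
def is_agglomeration(width, dpi, ssi, oi,
                     max_width=20.0, min_dpi=0.6, min_ssi=0.7, min_oi=0.5):
    """Flag an area group as an agglomeration candidate when the striped
    bridging area is narrow and all three pattern indices pass."""
    return (width <= max_width and dpi >= min_dpi
            and ssi >= min_ssi and oi >= min_oi)

assert is_agglomeration(12.0, 0.8, 0.9, 0.6)
assert not is_agglomeration(35.0, 0.8, 0.9, 0.6)  # bridging area too wide
```

    Groups passing the predicate would then enter the progressive agglomeration stage (boundary outlines, agglomeration lines) described in the abstract.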

  15. Automated radiochemical processing for clinical PET

    International Nuclear Information System (INIS)

    Padgett, H.C.; Schmidt, D.G.; Bida, G.T.; Wieland, B.W.; Pekrul, E.; Kingsbury, W.G.

    1991-01-01

    With the recent emergence of positron emission tomography (PET) as a viable clinical tool, there is a need for a convenient, cost-effective source of radiotracers labeled with the positron emitters carbon-11, nitrogen-13, oxygen-15, and fluorine-18. These short-lived radioisotopes are accelerator produced and thus require a cyclotron and radiochemistry processing instrumentation that can be operated in a clinical environment by competent technicians. The basic goal is to ensure safety and reliability while setting new standards for economy and ease of operation. The Siemens Radioisotope Delivery System (RDS 112) is a fully automated system dedicated to the production and delivery of positron-emitter-labeled precursors and radiochemicals required to support a clinical PET imaging program. Thus, the entire RDS can be thought of as an automated radiochemical processing apparatus.

  16. AUTOMATION DESIGN FOR MONORAIL - BASED SYSTEM PROCESSES

    Directory of Open Access Journals (Sweden)

    Bunda BESA

    2016-12-01

    Full Text Available Currently, conventional methods of decline development put enormous cost pressure on the profitability of mining operations. This is the case with narrow-vein ore bodies, where current methods and mine designs for decline development may be too expensive to support economic extraction of the ore. According to studies, the time it takes to drill, clean and blast an end in conventional decline development can be up to 224 minutes. This is because, once an end is blasted, cleaning must first be completed before drilling can commence, resulting in low advance rates per shift. Improvements in advance rates during decline development can be achieved by application of a drilling system based on the Electric Monorail Transport System (EMTS). The system consists of drilling and loading components that use monorail technology to drill and clean the face during decline development. The two systems work simultaneously at the face in such a way that while the top part of the face is being drilled, the pneumatic loading system cleans the face. However, to improve the efficiency of the two systems, the critical processes they perform during mining operations must be automated. Automation increases safety and productivity, reduces operator fatigue and also reduces the labour costs of the system. The aim of this paper is, therefore, to describe automation designs for the two processes performed by the monorail drilling and loading systems during operations. During automation design, the critical processes performed by the two systems and the control requirements necessary to allow the two systems to execute such processes automatically have also been identified.

  17. Automation of the aircraft design process

    Science.gov (United States)

    Heldenfels, R. R.

    1974-01-01

    The increasing use of the computer to automate the aerospace product development and engineering process is examined with emphasis on structural analysis and design. Examples of systems of computer programs in aerospace and other industries are reviewed and related to the characteristics of aircraft design in its conceptual, preliminary, and detailed phases. Problems with current procedures are identified, and potential improvements from optimum utilization of integrated disciplinary computer programs by a man/computer team are indicated.

  18. Methodology for monitoring and automated diagnosis of ball bearing using para consistent logic, wavelet transform and digital signal processing; Metodologia de monitoracao e diagnostico automatizado de rolamentos utilizando logica paraconsistente, transformada de Wavelet e processamento de sinais digitais

    Energy Technology Data Exchange (ETDEWEB)

    Masotti, Paulo Henrique Ferraz

    2006-07-01

    The monitoring and diagnosis area has seen impressive development in recent years with the introduction of new diagnosis techniques and with the use of computers in processing information and applying those techniques. The contribution of artificial intelligence to the automation of defect diagnosis is developing continually, and the growing automation in industry takes up these new techniques. In the nuclear area, the growing concern with safety in facilities requires more effective techniques, which have been sought to increase the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow the verification of their operational conditions. In this way, the present work can also collaborate in this area, helping to diagnose the operational condition of the machines. This work presents a new technique for feature extraction based on the zero crossings of the wavelet transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work was the Paraconsistent Logic of Annotation with Two values (LPA2v), contributing to the automation of defect diagnosis, because this logic can deal with the contradictory results that feature extraction techniques can present. The work also concentrated on identifying defects in their initial phase, using accelerometers, because they are robust, low-cost sensors that can easily be found in industry in general. The results were obtained using an experimental database, and the diagnoses showed good results for defects in their initial phase. (author)
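    A feature of the kind described, zero crossings of wavelet-transform detail coefficients, can be sketched with a single-level Haar transform; the Haar choice and the synthetic signals are illustrative assumptions, not the author's actual wavelet or vibration data:

```python
import numpy as np

def haar_detail(signal):
    """One level of a Haar wavelet transform: high-pass (detail) coefficients."""
    s = np.asarray(signal, dtype=float)
    if len(s) % 2:
        s = s[:-1]
    return (s[0::2] - s[1::2]) / np.sqrt(2.0)

def zero_crossings(x):
    """Count sign changes; used here as the extracted feature."""
    signs = np.sign(x)
    signs = signs[signs != 0]  # ignore exact zeros
    return int(np.sum(signs[:-1] != signs[1:]))

# Synthetic vibration traces: a defect adds high-frequency ringing, which
# raises the zero-crossing count of the detail coefficients.
t = np.linspace(0.0, 1.0, 256, endpoint=False)
healthy = np.sin(2 * np.pi * 5 * t)
faulty = healthy + 0.5 * np.sin(2 * np.pi * 60 * t)
assert zero_crossings(haar_detail(faulty)) > zero_crossings(haar_detail(healthy))
```

    Such counts could then feed a classifier like the LPA2v logic the author uses, which is designed to tolerate contradictory feature evidence.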

  19. Automated Irrigation System for Greenhouse Monitoring

    Science.gov (United States)

    Sivagami, A.; Hareeshvare, U.; Maheshwar, S.; Venkatachalapathy, V. S. K.

    2018-06-01

    The continuous requirement for food needs rapid improvement in food production technology. The economy of food production is mainly dependent on agriculture and on weather conditions, which are isotropic, and thus we are not able to utilize the whole of the agricultural resources. The main reason is the deficiency of rainfall and the paucity of land-reservoir water. The continuous withdrawal of water from the ground reduces the water level, resulting in most of the land coming under arid conditions. In the field of cultivation, use of an appropriate method of irrigation plays a vital role. Drip irrigation is a renowned methodology which is very economical and proficient. When the conventional drip irrigation system is followed, the farmer has to follow the irrigation timetable, which is different for different crops. The current work makes the drip irrigation system an automated one, whereby the farmer does not need to follow any timetable, since the sensor senses the soil moisture content and supplies water accordingly. Moreover, the use of economical sensors and simple circuitry makes this an inexpensive product, which can be bought even by an underprivileged farmer. The project is best suited for places where water is limited and has to be used in limited quantity.
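    The sensor-driven valve logic described in this record amounts to a simple threshold rule; a minimal sketch, with an invented moisture threshold and example readings:

```python
# Assumed threshold in percent volumetric water content; an illustrative
# value, not a figure from the paper.
MOISTURE_THRESHOLD = 30.0

def valve_command(moisture_percent: float) -> str:
    """Open the drip valve only when the soil is drier than the threshold,
    so watering follows the sensor instead of a fixed timetable."""
    return "OPEN" if moisture_percent < MOISTURE_THRESHOLD else "CLOSED"

assert valve_command(18.5) == "OPEN"    # dry soil: irrigate
assert valve_command(42.0) == "CLOSED"  # moist soil: save water
```

    In a deployed system this rule would run on the controller that polls the soil-moisture sensor, with the threshold tuned per crop.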

  20. Automated Irrigation System for Greenhouse Monitoring

    Science.gov (United States)

    Sivagami, A.; Hareeshvare, U.; Maheshwar, S.; Venkatachalapathy, V. S. K.

    2018-03-01

    The continuous requirement for food needs rapid improvement in food production technology. The economy of food production is mainly dependent on agriculture and on weather conditions, which are isotropic, and thus we are not able to utilize the whole of the agricultural resources. The main reason is the deficiency of rainfall and the paucity of land-reservoir water. The continuous withdrawal of water from the ground reduces the water level, resulting in most of the land coming under arid conditions. In the field of cultivation, use of an appropriate method of irrigation plays a vital role. Drip irrigation is a renowned methodology which is very economical and proficient. When the conventional drip irrigation system is followed, the farmer has to follow the irrigation timetable, which is different for different crops. The current work makes the drip irrigation system an automated one, whereby the farmer does not need to follow any timetable, since the sensor senses the soil moisture content and supplies water accordingly. Moreover, the use of economical sensors and simple circuitry makes this an inexpensive product, which can be bought even by an underprivileged farmer. The project is best suited for places where water is limited and has to be used in limited quantity.

  1. Automated TLD system for gamma radiation monitoring

    International Nuclear Information System (INIS)

    Nyberg, P.C.; Ott, J.D.; Edmonds, C.M.; Hopper, J.L.

    1979-01-01

    A gamma radiation monitoring system utilizing a commercially available TLD reader and unique microcomputer control has been built to assess the external radiation exposure of the resident population near a nuclear weapons testing facility. Maximum use of the microcomputer was made to increase the efficiency of data acquisition, transmission, and preparation, and to reduce operational costs. The system was tested for conformance with an applicable national standard for TLDs used in environmental measurements.

  2. Automated full matrix capture for industrial processes

    Science.gov (United States)

    Brown, Roy H.; Pierce, S. Gareth; Collison, Ian; Dutton, Ben; Dziewierz, Jerzy; Jackson, Joseph; Lardner, Timothy; MacLeod, Charles; Morozov, Maxim

    2015-03-01

    Full matrix capture (FMC) ultrasound can be used to generate a permanent re-focusable record of data describing the geometry of a part; a valuable asset for an inspection process. FMC is a desirable acquisition mode for automated scanning of complex geometries, as it allows compensation for surface shape in post-processing and application of the total focusing method. However, automating the delivery of such FMC inspection remains a significant challenge for real industrial processes due to the high data overhead associated with the ultrasonic acquisition. The benefits of NDE delivery using six-axis industrial robots are well established when considering complex inspection geometries, but such an approach brings additional challenges to scanning speed and positional accuracy when combined with FMC inspection. This study outlines steps taken to optimize the scanning speed and data management of a process to scan the diffusion-bonded membrane of a titanium test plate. A system combining a KUKA robotic arm and a reconfigurable FMC phased array controller is presented. The speed and data implications of different scanning methods are compared, and the impacts on data visualization quality are discussed with reference to this study. For the 0.5 m2 sample considered, typical acquisitions of 18 TB/m2 were measured for a triple back wall FMC acquisition, illustrating the challenge of combining high data throughput with acceptable scanning speeds.
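    As background to the FMC data-handling challenge, the total focusing method mentioned above can be sketched on synthetic data: every pixel is focused by summing each transmit-receive trace at its two-way delay. The array geometry, wave speed, sampling rate and idealized impulse traces below are illustrative assumptions, not parameters from the study:

```python
import numpy as np

c = 1500.0   # m/s, assumed wave speed
fs = 5e6     # Hz, assumed sampling rate
elems = np.linspace(-0.01, 0.01, 8)  # 8-element array positions (m)
scatterer = (0.002, 0.015)           # true reflector position (x, z) in m

# Synthetic FMC data: one ideal impulse per tx-rx pair at its arrival time.
n_t = 400
fmc = np.zeros((len(elems), len(elems), n_t))
for i, xt in enumerate(elems):
    for j, xr in enumerate(elems):
        d = (np.hypot(xt - scatterer[0], scatterer[1])
             + np.hypot(xr - scatterer[0], scatterer[1]))
        fmc[i, j, int(round(d / c * fs))] = 1.0

def tfm(fmc, xs, zs):
    """Total focusing method: sum every tx-rx trace at the pixel's delay."""
    img = np.zeros((len(zs), len(xs)))
    for iz, z in enumerate(zs):
        for ix, x in enumerate(xs):
            for i, xt in enumerate(elems):
                for j, xr in enumerate(elems):
                    d = np.hypot(xt - x, z) + np.hypot(xr - x, z)
                    k = int(round(d / c * fs))
                    if k < fmc.shape[2]:
                        img[iz, ix] += fmc[i, j, k]
    return img

xs = np.linspace(-0.005, 0.005, 11)
zs = np.linspace(0.010, 0.020, 11)
img = tfm(fmc, xs, zs)
iz, ix = np.unravel_index(np.argmax(img), img.shape)  # peaks at the reflector
```

    Because every pixel touches all N² traces, the method's cost grows with both image size and array size, which is one reason FMC data volumes strain automated scanning.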

  3. Automation system for tritium contaminated surface monitoring

    International Nuclear Information System (INIS)

    Culcer, Mihai; Iliescu, Mariana; Curuia, Marian; Raceanu, Mircea; Enache, Adrian; Stefanescu, Ioan; Ducu, Catalin; Malinovschi, Viorel

    2005-01-01

    The low energy of tritium betas makes tritium difficult to detect. However, there are several methods used in tritium detection, such as liquid scintillation and ionization chambers. Tritium on or near a surface can also be detected using proportional counters and, more recently, solid-state devices. The paper presents our results in the design and construction of a surface tritium monitor using a PIN photodiode as a solid-state charged-particle detector to count betas emitted from the surface. This method allows continuous, real-time and non-destructive measurement of tritium. (authors)

  4. Cluster processing business level monitor

    International Nuclear Information System (INIS)

    Muniz, Francisco J.

    2017-01-01

    This article describes a Cluster Processing Monitor. Several applications with this functionality can be found freely via a Google search. However, those applications may offer more features than are needed in the Processing Monitor proposed here, making the monitor output difficult for the user to understand at a glance. In addition, such monitors may add unnecessary processing cost to the cluster. For these reasons, a completely new Cluster Processing Monitor module was designed and implemented. At CDTN, clusters are broadly used, mainly for deterministic methods (CFD) and non-deterministic methods (Monte Carlo). (author)

  5. Cluster processing business level monitor

    Energy Technology Data Exchange (ETDEWEB)

    Muniz, Francisco J., E-mail: muniz@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    This article describes a Cluster Processing Monitor. Several applications with this functionality can be found freely via a Google search. However, those applications may offer more features than are needed in the Processing Monitor proposed here, making the monitor output difficult for the user to understand at a glance. In addition, such monitors may add unnecessary processing cost to the cluster. For these reasons, a completely new Cluster Processing Monitor module was designed and implemented. At CDTN, clusters are broadly used, mainly for deterministic methods (CFD) and non-deterministic methods (Monte Carlo). (author)

  6. Miniature multichannel analyzer for process monitoring

    International Nuclear Information System (INIS)

    Halbig, J.K.; Klosterbuer, S.F.; Russo, P.A.; Sprinkle, J.K. Jr.; Stephens, M.M.; Wiig, L.G.; Ianakiev, K.D.

    1993-01-01

    A new 4,000-channel analyzer has been developed for gamma-ray spectroscopy applications. A design philosophy of hardware and software building blocks has been combined with the design goals of simplicity, compactness, portability, and reliability. The result is a miniature, modular multichannel analyzer (MMMCA), which offers a solution to a variety of nondestructive assay (NDA) needs in many areas of general application, independent of computer platform or operating system. The detector-signal analog electronics, the bias supply, and batteries are included in the virtually pocket-size, low-power MMMCA unit. The MMMCA features digital setup and control, automated data reduction, and automated quality assurance. Areas of current NDA application include on-line continuous (process) monitoring, process material holdup measurements, and field inspections.

  7. D-MSR: A Distributed Network Management Scheme for Real-Time Monitoring and Process Control Applications in Wireless Industrial Automation

    Science.gov (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-01-01

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. To our knowledge, this is the first distributed management scheme based on the IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead. PMID:23807687

  8. D-MSR: a distributed network management scheme for real-time monitoring and process control applications in wireless industrial automation.

    Science.gov (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-06-27

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. To our knowledge, this is the first distributed management scheme based on the IEEE 802.15.4e standard, which guides the nodes in different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.

  9. Default mode contributions to automated information processing.

    Science.gov (United States)

    Vatansever, Deniz; Menon, David K; Stamatakis, Emmanuel A

    2017-11-28

    Concurrent with mental processes that require rigorous computation and control, a series of automated decisions and actions govern our daily lives, providing efficient and adaptive responses to environmental demands. Using a cognitive flexibility task, we show that a set of brain regions collectively known as the default mode network plays a crucial role in such "autopilot" behavior, i.e., when rapidly selecting appropriate responses under predictable behavioral contexts. While applying learned rules, the default mode network shows both greater activity and connectivity. Furthermore, functional interactions between this network and hippocampal and parahippocampal areas as well as primary visual cortex correlate with the speed of accurate responses. These findings indicate a memory-based "autopilot role" for the default mode network, which may have important implications for our current understanding of healthy and adaptive brain processing.

  10. Automated Processing Workflow for Ambient Seismic Recordings

    Science.gov (United States)

    Girard, A. J.; Shragge, J.

    2017-12-01

    Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface wave energy. In a number of situations body-wave energy has been extracted successfully; however, (nearly) all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly on non-correlated ambient data that has been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) to enhance body-wave arrivals. Following this approach, though, requires a significant preprocessing effort on often Terabytes of ambient seismic data, which is expensive and requires automation to be a feasible approach. In this work we outline an automated processing workflow designed to optimize body wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates - and the emergence of reflection events - in the cross-correlation plus stack waveforms over hour-long windows. 
Overall, the QC analyses suggest that
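The window-selection step described above can be sketched as a simple per-window energy-outlier test. This is purely illustrative; the function name, window length, and MAD threshold are assumptions, not the authors' implementation:

```python
import numpy as np

def reject_burst_windows(trace, fs, win_s=60.0, mad_factor=8.0):
    """Illustrative sketch: flag fixed-length windows whose RMS energy
    is an outlier relative to the median (a stand-in for the coherent
    high-energy burst rejection described in the abstract)."""
    n = int(win_s * fs)          # samples per window
    n_win = len(trace) // n
    rms = np.array([np.sqrt(np.mean(trace[i * n:(i + 1) * n] ** 2))
                    for i in range(n_win)])
    med = np.median(rms)
    mad = np.median(np.abs(rms - med)) + 1e-12  # robust spread estimate
    keep = np.abs(rms - med) / mad < mad_factor
    return keep  # boolean mask: True = window retained for imaging
```

A robust (median/MAD) statistic is used so that a handful of strong bursts does not inflate the rejection threshold itself.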

  11. Proof-of-concept automation of propellant processing

    Science.gov (United States)

    Ramohalli, Kumar; Schallhorn, P. A.

    1989-01-01

For space-based propellant production, automation of the process is needed. Currently, all phases of terrestrial production involve some form of human interaction. A mixer was acquired to help perform the automation tasks. A heating system to be used with the mixer was designed, built, and installed. Tests performed on the heating system verified the design criteria. An IBM PS/2 personal computer was acquired for the future automation work. It is hoped that some of the mixing process itself will be automated. This is a concept demonstration task, proving that propellant production can be automated reliably.

  12. Automated inundation monitoring using TerraSAR-X multitemporal imagery

    Science.gov (United States)

    Gebhardt, S.; Huth, J.; Wehrmann, T.; Schettler, I.; Künzer, C.; Schmidt, M.; Dech, S.

    2009-04-01

The Mekong Delta in Vietnam offers natural resources for several million inhabitants. However, strong population growth, changing climatic conditions and regulatory measures at the upper reaches of the Mekong are leading to severe changes in the Delta. Extreme flood events occur more frequently, drinking water availability is increasingly limited, soils show signs of salinization or acidification, and species and complete habitats diminish. During the monsoon season the river regularly overflows its banks in the lower Mekong area, usually with beneficial effects. However, extreme flood events causing extensive damage occur more frequently; on average once every 6 to 10 years, river flood levels exceed the critical beneficial level. X-band SAR data are well suited for deriving inundated surface areas. The TerraSAR-X sensor with its different scanning modes allows for the derivation of spatially and temporally highly resolved inundation masks. The paper presents an automated procedure for deriving inundated areas from TerraSAR-X ScanSAR and Stripmap image data. Within the framework of the German-Vietnamese WISDOM project, focusing on the Mekong Delta region in Vietnam, images were acquired covering the flood season from June 2008 to November 2008. Based on these images, a time series of so-called water masks showing inundated areas has been derived. The product is required as an intermediate to (i) calibrate 2D inundation model scenarios, (ii) estimate the extent of affected areas, and (iii) analyze the scope of prior crises. The image processing approach is based on the assumption that water surfaces forward-scatter the radar signal, resulting in low backscatter returns to the sensor. It uses multiple grey-level thresholds and image morphological operations. The approach performs well in terms of automation, accuracy, robustness, and processing time.
The resulting watermasks show the seasonal flooding pattern with inundations starting in July, having their peak at the end
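The grey-level thresholding plus morphology idea from the abstract above can be sketched in a few lines. This is a minimal illustration, not the WISDOM processing chain; the function name, dB thresholds, and region-size cutoff are assumed values:

```python
import numpy as np
from scipy import ndimage

def water_mask(backscatter_db, thresholds=(-18.0, -15.0), min_region=64):
    """Illustrative sketch: water surfaces forward-scatter the radar
    signal and so appear dark; threshold low backscatter, then clean
    the mask with morphological operations."""
    # Seed water pixels: below the strictest grey-level threshold.
    mask = backscatter_db < min(thresholds)
    # Close small gaps inside water bodies, then open to remove speckle.
    mask = ndimage.binary_closing(mask, iterations=2)
    mask = ndimage.binary_opening(mask, iterations=1)
    # Drop tiny connected components that survived the opening.
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return np.isin(labels, 1 + np.flatnonzero(sizes >= min_region))
```

A real chain would add calibration, incidence-angle handling and the second threshold for region growing; the sketch only shows the core threshold-plus-morphology pattern.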

  13. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline, without any manual operations. Processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiment via a Web browser. (author)

  14. Adaptive Algorithms for Automated Processing of Document Images

    Science.gov (United States)

    2011-01-01

ABSTRACT Title of dissertation: Adaptive Algorithms for Automated Processing of Document Images. Mudit Agrawal, Doctor of Philosophy, 2011. Dissertation submitted to the Faculty of the Graduate School of the University

  15. Development and implementation of an automated system for adaptation of the gamma radiation monitor calibration process; Desenvolvimento e implantacao de um sistema automatizado para adequacao do processo de calibracao de monitores de radiacao gama

    Energy Technology Data Exchange (ETDEWEB)

    Silva Junior, Iremar Alves

    2012-07-01

This study carried out the development and implementation of a system to adapt the gamma radiation monitor calibration process, consisting of a pneumatic device to exchange the attenuators and a positioning table, both actuated through a control panel. A Caesa-Gammatron irradiator system was also implemented, which increased the range of available air kerma rates due to its higher activity compared with the gamma radiation system currently in use in the gamma calibration laboratory. Hence, it was necessary to install a remotely controlled attenuator device in this irradiator system. Lastly, an evaluation of the reduction in occupational dose rates was carried out. This dissertation was developed with the aim of improving the quality of the calibration and testing services for gamma radiation monitors - provided by the IPEN Laboratory of Instrument Calibration - as well as decreasing the occupational dose of the technicians involved in the calibration process, thus following the principles of radiation protection. (author)

  16. Integrated system for automated financial document processing

    Science.gov (United States)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
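The blackboard coordination described above can be illustrated with a minimal sketch (class, method, and field names are hypothetical, not the paper's system): each recognition engine posts candidate readings for a field, and agreeing candidates accumulate support.

```python
class Blackboard:
    """Illustrative blackboard: knowledge sources contribute candidate
    values incrementally; the best-supported candidate is selected."""

    def __init__(self):
        self.candidates = {}  # field -> list of (value, confidence, source)

    def post(self, field, value, confidence, source):
        """A knowledge source (e.g. a courtesy-amount engine) posts a
        candidate reading with its confidence."""
        self.candidates.setdefault(field, []).append((value, confidence, source))

    def resolve(self, field):
        """Sum confidence per distinct value so that engines that agree
        reinforce each other; return the winning value (or None)."""
        totals = {}
        for value, conf, _src in self.candidates.get(field, []):
            totals[value] = totals.get(value, 0.0) + conf
        return max(totals, key=totals.get) if totals else None
```

With this pattern, two moderately confident engines that agree can outvote a single highly confident outlier, which is the "incremental and opportunistic" behavior the abstract describes.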

  17. Automation in irrigation process in family farm with Arduino platform

    Directory of Open Access Journals (Sweden)

    Kianne Crystie Bezerra da Cunha

    2016-03-01

Full Text Available Small farmers tend not to use mechanized inputs in the irrigation process because of the high cost of conventional irrigation systems and, in other cases, because a lack of knowledge and technical guidance keeps the farmer from using such systems. Thus, all control and monitoring is done by hand, without the aid of machines, and this practice can lead to numerous problems: poor irrigation, waste of water and energy, and deficits in production. It is difficult to deduce when to irrigate or how much water to apply to the crop, and to measure soil variables such as temperature and humidity. The objective of this work is to implement an automated irrigation system aimed at family farming that is low cost and accessible to the farmer. The system is able to monitor all parameters of the irrigation. To this end, the key characteristics of family farming, the Arduino platform, and irrigation were analyzed.
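The kind of decision rule such an Arduino loop might apply can be sketched as a threshold with hysteresis (all names and values here are illustrative assumptions, not the paper's design): start the pump when soil moisture drops below a set point, and stop only once it recovers past a higher level, so the pump does not chatter.

```python
def irrigation_decision(soil_moisture, threshold=30.0, hysteresis=5.0,
                        pump_on=False):
    """Illustrative sketch of a hysteresis irrigation rule.

    soil_moisture : current reading, e.g. percent volumetric moisture.
    threshold     : start irrigating below this value (assumed).
    hysteresis    : stop only above threshold + hysteresis (assumed).
    pump_on       : current pump state, kept inside the dead band.
    """
    if soil_moisture < threshold:
        return True                    # too dry: switch pump on
    if soil_moisture > threshold + hysteresis:
        return False                   # wet enough: switch pump off
    return pump_on                     # dead band: keep current state
```

On a microcontroller the same rule would run inside the main loop against the moisture sensor reading; the dead band is what prevents rapid on/off cycling around a single threshold.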

  18. G-Cloud Monitor: A Cloud Monitoring System for Factory Automation for Sustainable Green Computing

    Directory of Open Access Journals (Sweden)

    Hwa-Young Jeong

    2014-11-01

Full Text Available Green and cloud computing (G-cloud) are new trends in all areas of computing. The G-cloud provides an efficient function that enables users to access their programs, systems and platforms at any time and from any place. Green computing can also yield greener technology by reducing power consumption for sustainable environments. Furthermore, in order to apply user needs to system development, user characteristics are regarded as some of the most important factors to be considered in product industries. In this paper, we propose a cloud monitoring system to observe and manage manufacturing systems/factory automation for sustainable green computing. For the monitoring system, we utilize resources in the G-cloud environment, which reduces the system resources and devices required, such as system power and processes. In addition, we propose adding a user profile to the monitoring system in order to provide a user-friendly function. That is, this function allows system configurations to be automatically matched to the individual’s requirements, thus increasing efficiency.

  19. Unattended reaction monitoring using an automated microfluidic sampler and on-line liquid chromatography.

    Science.gov (United States)

    Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve

    2018-04-03

    In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, the sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, there can be variability introduced by different analysts, and in some cases, the integrity of the sample can be compromised during handling. While there are commercial instruments available for on-line monitoring with HPLC, they lack capabilities in many key areas. Some do not provide integration of the sampling and analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations available for sample processing and no option for workflow customizability. This work describes development of a microfluidic automated program (MAP) which fully automates the sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel based user interface. The autonomous system is capable of unattended reaction monitoring that allows flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Pyrochemical processing automation at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Dennison, D.K.; Domning, E.E.; Seivers, R.

    1991-01-01

Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system for pyrochemical processing of special nuclear materials (SNM). The system utilizes a glove box, an automated tilt-pour furnace (TPF), an IBM-developed gantry robot, and specialized automation tooling. All material handling within the glove box (i.e., furnace loading, furnace unloading, product and slag separation, and product packaging) is performed automatically. The objectives of the effort are to increase process productivity, decrease operator radiation exposure, reduce process wastes, and demonstrate system reliability and availability. This paper provides an overview of the automated system hardware, outlines the overall operations sequence, and discusses the current status

  1. Managing Automation: A Process, Not a Project.

    Science.gov (United States)

    Hoffmann, Ellen

    1988-01-01

    Discussion of issues in management of library automation includes: (1) hardware, including systems growth and contracts; (2) software changes, vendor relations, local systems, and microcomputer software; (3) item and authority databases; (4) automation and library staff, organizational structure, and managing change; and (5) environmental issues,…

  2. Automated testing of arrhythmia monitors using annotated databases.

    Science.gov (United States)

    Elghazzawi, Z; Murray, W; Porter, M; Ezekiel, E; Goodall, M; Staats, S; Geheb, F

    1992-01-01

    Arrhythmia-algorithm performance is typically tested using the AHA and MIT/BIH databases. The tools for this test are simulation software programs. While these simulations provide rapid results, they neglect hardware and software effects in the monitor. To provide a more accurate measure of performance in the actual monitor, a system has been developed for automated arrhythmia testing. The testing system incorporates an IBM-compatible personal computer, a digital-to-analog converter, an RS232 board, a patient-simulator interface to the monitor, and a multi-tasking software package for data conversion and communication with the monitor. This system "plays" patient data files into the monitor and saves beat classifications in detection files. Tests were performed using the MIT/BIH and AHA databases. Statistics were generated by comparing the detection files with the annotation files. These statistics were marginally different from those that resulted from the simulation. Differences were then examined. As expected, the differences were related to monitor hardware effects.
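The statistics step in the abstract above - comparing the monitor's detection files against the database annotation files - is typically a tolerance-window beat-matching procedure. A minimal sketch (function name, tolerance, and file format are illustrative assumptions, not the authors' software):

```python
def beat_match_stats(annotations, detections, tol=0.15):
    """Illustrative sketch: score detected beat times (seconds) against
    reference annotation times using a +/- tol matching window."""
    ann = sorted(annotations)
    det = sorted(detections)
    used = [False] * len(det)   # each detection may match only one beat
    tp = 0
    for a in ann:
        for i, d in enumerate(det):
            if not used[i] and abs(d - a) <= tol:
                used[i] = True
                tp += 1
                break
    fn = len(ann) - tp          # annotated beats never detected
    fp = used.count(False)      # detections matching no annotation
    sensitivity = tp / len(ann) if ann else 1.0
    ppv = tp / len(det) if det else 1.0
    return {"TP": tp, "FP": fp, "FN": fn,
            "sensitivity": sensitivity, "ppv": ppv}
```

Running this over every record in a database and aggregating TP/FP/FN yields the kind of performance statistics the abstract compares between the simulated and hardware-in-the-loop tests.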

  3. AUTOMATION OF CHAMPAGNE WINES PROCESS IN SPARKLING WINE PRESSURE TANK

    Directory of Open Access Journals (Sweden)

    E. V. Lukyanchuk

    2016-08-01

Full Text Available The wine industry has now successfully solved the problems of automating grape receiving points, crushing and pressing departments, continuously operating fermentation installations, blending tanks, production lines for ordinary Madeira, continuously operating plants for ethyl alcohol, and installations for champagne wine production in continuous flow. With the development of automation, the productivity of the winemaking process is advancing in the following directions: organization of complex automation of grape-processing sites with bulk transportation of the grapes; improving the quality and durability of wines through wide application of cold and heat treatment of the wine, as well as technical and microbiological control using powerful automation equipment; introduction of automated continuous production processes for champagne, sherry wine, cognac alcohol and Madeira; use of complex automation at auxiliary production sites (boilers, air conditioners, refrigeration units and others); and complex automation of entire enterprises and wine-bottling sites. The wine industry is developing ever more sophisticated automation schemes and devices that enable the transition to integrated production automation and will create automated enterprises served by laboratories studying the main problems of automating winemaking production processes.

  4. Automated Student Aid Processing: The Challenge and Opportunity.

    Science.gov (United States)

    St. John, Edward P.

    1985-01-01

    To utilize automated technology for student aid processing, it is necessary to work with multi-institutional offices (student aid, admissions, registration, and business) and to develop automated interfaces with external processing systems at state and federal agencies and perhaps at need-analysis organizations and lenders. (MLW)

  5. Automated Psychological Categorization via Linguistic Processing System

    National Research Council Canada - National Science Library

    Eramo, Mark

    2004-01-01

    .... This research examined whether or not Information Technology (IT) tools, specializing in text mining, are robust enough to automate the categorization/segmentation of individual profiles for the purpose...

  6. Tools for automated acoustic monitoring within the R package monitoR

    Science.gov (United States)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.
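monitoR's detectors are R functions; as a language-neutral illustration of the spectrogram cross-correlation option mentioned above, the core idea can be sketched as sliding a template over a spectrogram and scoring normalized correlation per time bin (this is not monitoR's implementation; names and the threshold are assumptions):

```python
import numpy as np

def xcorr_detect(spec, template, threshold=0.6):
    """Illustrative sketch of spectrogram cross-correlation detection:
    return time-bin indices where the normalized correlation between
    the template and the spectrogram window exceeds threshold."""
    n_f, n_t = template.shape
    t_flat = (template - template.mean()) / (template.std() + 1e-12)
    hits = []
    for start in range(spec.shape[1] - n_t + 1):
        win = spec[:n_f, start:start + n_t]
        w_flat = (win - win.mean()) / (win.std() + 1e-12)
        # Mean of the elementwise product of standardized arrays is a
        # correlation-like score in roughly [-1, 1].
        if float((t_flat * w_flat).mean()) >= threshold:
            hits.append(start)
    return hits
```

The binary point matching alternative replaces the continuous correlation with a count of matching above-threshold spectrogram cells, but the sliding-window structure is the same.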

  7. Tools for automated acoustic monitoring within the R package monitoR

    DEFF Research Database (Denmark)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.

  8. Problems of complex automation of process at a NPP

    International Nuclear Information System (INIS)

    Naumov, A.V.

    1981-01-01

The importance of theoretical investigation in determining the level and quality of NPP automation is discussed. Achievements in this direction are briefly reviewed using the example of domestic NPPs. Two models for solving the problem of function distribution between the operator and technical means are outlined. The processes subject to automation are enumerated. Development of optimal methods of automatic power control of power units is one of the most important problems of NPP automation. Automation of discrete operations, especially during start-up, shut-down or in emergency situations, is becoming important [ru

  9. Process monitoring by display devices

    International Nuclear Information System (INIS)

    Eggerdinger, C.; Schattner, R.

    1984-01-01

The use of extensive automation, regulating, protection and limiting devices and the application of ergonomic principles (e.g. the increased use of mimic diagrams) has kept plants capable of continued operation. German nuclear power stations are in a top position worldwide as regards safety and availability. However, renewed efforts are already required to overcome the unmanageable state caused by the large number and miniaturization of elements. One attempt at this using conventional technology is a mimic board, which was provided in a power station just being commissioned. Such mimic boards offer the opportunity to monitor the most important parameters at a glance, but there are limits to their use due to the large space required. The use of VDU screens represents a possible solution to this problem. (orig./DG) [de

  10. Automation of data processing | G | African Journal of Range and ...

    African Journals Online (AJOL)

    Data processing can be time-consuming when experiments with advanced designs are employed. This, coupled with a shortage of research workers, necessitates automation. It is suggested that with automation the first step is to determine how the data must be analysed. The second step is to determine what programmes ...

  11. Automated Impedance Tomography for Monitoring Permeable Reactive Barrier Health

    Energy Technology Data Exchange (ETDEWEB)

    LaBrecque, D J; Adkins, P L

    2009-07-02

    The objective of this research was the development of an autonomous, automated electrical geophysical monitoring system which allows for near real-time assessment of Permeable Reactive Barrier (PRB) health and aging and which provides this assessment through a web-based interface to site operators, owners and regulatory agencies. Field studies were performed at four existing PRB sites; (1) a uranium tailing site near Monticello, Utah, (2) the DOE complex at Kansas City, Missouri, (3) the Denver Federal Center in Denver, Colorado and (4) the Asarco Smelter site in East Helena, Montana. Preliminary surface data over the PRB sites were collected (in December, 2005). After the initial round of data collection, the plan was modified to include studies inside the barriers in order to better understand barrier aging processes. In September 2006 an autonomous data collection system was designed and installed at the EPA PRB and the electrode setups in the barrier were revised and three new vertical electrode arrays were placed in dedicated boreholes which were in direct contact with the PRB material. Final data were collected at the Kansas City, Denver and Monticello, Utah PRB sites in the fall of 2007. At the Asarco Smelter site in East Helena, Montana, nearly continuous data was collected by the autonomous monitoring system from June 2006 to November 2007. This data provided us with a picture of the evolution of the barrier, enabling us to examine barrier changes more precisely and determine whether these changes are due to installation issues or are normal barrier aging. Two rounds of laboratory experiments were carried out during the project. We conducted column experiments to investigate the effect of mineralogy on the electrical signatures resulting from iron corrosion and mineral precipitation in zero valent iron (ZVI) columns. In the second round of laboratory experiments we observed the electrical response from simulation of actual field PRBs at two sites: the

  12. Environmental and process monitoring technologies

    International Nuclear Information System (INIS)

    Vo-Dinh, Tuan

    1993-01-01

The objective of this conference was to provide a multidisciplinary forum dealing with state-of-the-art methods and instrumentation for environmental and process monitoring. In the last few years, important advances have been made in improving existing analytical methods and developing new techniques for trace detection of chemicals. These monitoring technologies are a topic of great interest for environmental and industrial control in a wide spectrum of areas. Sensitive detection, selective characterization, and cost-effective analysis are among the most important challenges facing monitoring technologies. This conference, integrating interdisciplinary research and development, aimed to present the most recent advances and applications in the important areas of environmental and process monitoring. Separate abstracts have been prepared for 34 papers for inclusion in the appropriate data bases

  13. Automation for Primary Processing of Hardwoods

    Science.gov (United States)

    Daniel L. Schmoldt

    1992-01-01

    Hardwood sawmills critically need to incorporate automation and computer technology into their operations. Social constraints, forest biology constraints, forest product market changes, and financial necessity are forcing primary processors to boost their productivity and efficiency to higher levels. The locations, extent, and types of defects found in logs and on...

  14. An overview of the Environmental Monitoring Computer Automation Project

    International Nuclear Information System (INIS)

    Johnson, S.M.; Lorenz, R.

    1992-01-01

The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, the analytical methods and the results obtained are important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS

  15. Automated radon-thoron monitoring for earthquake prediction research

    International Nuclear Information System (INIS)

    Shapiro, M.H.; Melvin, J.D.; Copping, N.A.; Tombrello, T.A.; Whitcomb, J.H.

    1980-01-01

This paper describes an automated instrument for earthquake prediction research which monitors the emission of radon (222Rn) and thoron (220Rn) from rock. The instrument uses aerosol filtration techniques and beta counting to determine radon and thoron levels. Data from the first year of operation of a field prototype suggest an annual cycle in the radon level at the site which is related to thermoelastic strains in the crust. Two anomalous increases in the radon level of short duration have been observed during the first year of operation. One anomaly appears to have been a precursor for a nearby earthquake (2.8 magnitude, Richter scale), and the other may have been associated with changing hydrological conditions resulting from heavy rainfall

  16. Evaluation of new and conventional thermoluminescent phosphors for environmental monitoring using automated thermoluminescent dosimeter readers

    International Nuclear Information System (INIS)

    Rathbone, B.A.; Endres, A.W.; Antonio, E.J.

    1994-10-01

In recent years, there has been considerable interest in a new generation of super-sensitive thermoluminescent (TL) phosphors for potential use in routine personnel and environmental monitoring. Two of these phosphors, α-Al2O3:C and LiF:Mg,Cu,P, are evaluated in this paper for selected characteristics relevant to environmental monitoring, along with two conventional phosphors widely used in environmental monitoring, LiF:Mg,Ti and CaF2:Dy. The characteristics evaluated are light-induced fading, light-induced background, linearity and variability at low dose, and the minimum measurable dose. These characteristics were determined using an automated commercial dosimetry system (Harshaw System 8800) and routine processing protocols. Annealing and readout protocols for each phosphor were optimized for use in a large-scale environmental monitoring program

  17. Containerless automated processing of intermetallic compounds and composites

    Science.gov (United States)

    Johnson, D. R.; Joslin, S. M.; Reviere, R. D.; Oliver, B. F.; Noebe, R. D.

    1993-01-01

    An automated containerless processing system has been developed to directionally solidify high temperature materials, intermetallic compounds, and intermetallic/metallic composites. The system incorporates a wide range of ultra-high purity chemical processing conditions. The utilization of image processing for automated control negates the need for temperature measurements for process control. The list of recent systems that have been processed includes Cr, Mo, Mn, Nb, Ni, Ti, V, and Zr containing aluminides. Possible uses of the system, process control approaches, and properties and structures of recently processed intermetallics are reviewed.

  18. development of an automated batch-process solar water disinfection

    African Journals Online (AJOL)

    user

This work presents the development of an automated batch-process water disinfection system ... Locally sourced materials in addition to an Arduino microprocessor were used to control ..... As already mentioned in section 3.1.1, a statistical.

  19. Automated Internal Revenue Processing System: A Panacea For ...

    African Journals Online (AJOL)

    Automated Internal Revenue Processing System: A Panacea For Financial ... for the collection and management of internal revenue which is the financial ... them, computational errors, high level of redundancy and inconsistencies in record, ...

  20. More steps towards process automation for optical fabrication

    Science.gov (United States)

    Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina

    2017-06-01

    In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.

  1. Development of automated patrol-type monitoring and inspection system for nuclear power plant and application to actual plant

    International Nuclear Information System (INIS)

    Senoo, Makoto; Koga, Kazunori; Hirakawa, Hiroshi; Tanaka, Keiji

    1996-01-01

An automated patrol-type monitoring and inspection system was developed and applied in a nuclear power plant. This system consists of a monorail, a monitoring robot and an operator's console. The monitoring robot consists of a sensor unit and a control unit. Three kinds of sensors, a color ITV camera, an infrared camera and a microphone, are installed in the sensor unit. The features of this monitoring robot are: (1) It weighs 15 kg, with cross-sectional dimensions of 152 mm width and 290 mm height. (2) Several automatic monitoring functions are provided, using image processing and frequency analysis of the three sensor signals. (author)

  2. Monitoring and control of the Rossendorf research reactor using a microcomputerized automation system

    International Nuclear Information System (INIS)

    Ba weg, F.; Enkelmann, W.; Klebau, J.

    1982-01-01

A decentralized hierarchical information system (HIS) is presented, which has been developed for monitoring and control of the Rossendorf Research Reactor RFR, but which may also be considered a prototype of a digital automation system (AS) to be used in power stations. The functions integrated in the HIS are: process monitoring, process control, and the use of a specialized industrial robot to control the charging and discharging of the materials to be irradiated. The AS is realized on the basis of the process computer system PRA 30 (A 6492), developed in the GDR, comprising a K 1630 computer and intelligent ursadat 5000 process terminals connected by a fast serial interface (IFLS). (author)

  3. Process computers automate CERN power supply installations

    CERN Document Server

    Ullrich, H

    1974-01-01

    Computerized automation systems are being used at CERN, Geneva, to improve the capacity, operational reliability and flexibility of the power supply installations for main ring magnets in the experimental zones of particle accelerators. A detailed account of the technological problem involved is followed in the article by a description of the system configuration, the program system, and field experience already gathered in similar schemes. (1 ref.)

  4. Automation in a material processing/storage facility

    International Nuclear Information System (INIS)

    Peterson, K.; Gordon, J.

    1997-01-01

    The Savannah River Site (SRS) is currently developing a new facility, the Actinide Packaging and Storage Facility (APSF), to process and store legacy materials from the United States nuclear stockpile. A variety of materials, with a variety of properties, packaging and handling/storage requirements, will be processed and stored at the facility. Since these materials are hazardous and radioactive, automation will be used to minimize worker exposure. Other benefits derived from automation of the facility include increased throughput capacity and enhanced security. The diversity of materials and packaging geometries to be handled poses challenges to the automation of facility processes. In addition, the nature of the materials to be processed underscores the need for safety, reliability and serviceability. The application of automation in this facility must, therefore, be accomplished in a rational and disciplined manner to satisfy the strict operational requirements of the facility. Among the functions to be automated are the transport of containers between process and storage areas via an Automatic Guided Vehicle (AGV), and various processes in the Shipping Package Unpackaging (SPU) area, the Accountability Measurements (AM) area, the Special Isotope Storage (SIS) vault and the Special Nuclear Materials (SNM) vault. Other areas of the facility are also being automated, but are outside the scope of this paper

  5. Real time fish pond monitoring and automation using Arduino

    Science.gov (United States)

    Harun, Z.; Reda, E.; Hashim, H.

    2018-03-01

    Investment and operating costs are the biggest obstacles to modernizing fish ponds in this region, in an otherwise very lucrative industry, i.e. food production. Small-scale farmers running small ponds cannot afford to hire workers to man daily operations, which usually consist of monitoring water levels and temperature and feeding the fish. Bigger enterprises usually have some kind of automation for water monitoring and replacement, and sooner or later, as their farms grow, these entities have to consider employing pH and dissolved oxygen (DO) sensors to ensure the health and growth of the fish. This project identifies one such site, located in Malacca. In this project, water level, temperature, pH and DO are measured and integrated with the aerating and water supply pumps using an Arduino. Users can receive information at predetermined intervals on their preferred communication or display devices as long as they have internet access. The integrating devices are comparatively inexpensive, usually consisting of an Arduino board, internet and relay frames and a display system, so farmers can source these components easily. A sample of two days of measurements of temperature, pH and DO shows that this farm has high-quality water. Oxygen levels increase during the day as sunshine supports photosynthesis in the pond. With this integrated system, farmers need not hire workers at their sites, consequently driving down operating costs and improving efficiency.
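The measurement-and-actuation loop described above can be sketched as pure decision logic; the threshold values, sensor names and limits below are assumptions for illustration, not settings from the Malacca installation:

```python
# Hypothetical sketch of the Arduino-style control logic: all thresholds
# are illustrative assumptions, not values from the project.

DO_MIN_MG_L = 5.0      # aerate when dissolved oxygen falls below this
LEVEL_MIN_CM = 40.0    # top up water when the pond level falls below this
PH_RANGE = (6.5, 8.5)  # assumed healthy range; outside it, raise an alert

def control_actions(level_cm, temp_c, ph, do_mg_l):
    """Map one set of sensor readings to actuator/alert decisions,
    mirroring the relay-switching behaviour sketched in the abstract."""
    return {
        "aerator_on": do_mg_l < DO_MIN_MG_L,
        "supply_pump_on": level_cm < LEVEL_MIN_CM,
        "ph_alert": not (PH_RANGE[0] <= ph <= PH_RANGE[1]),
        "report": f"L={level_cm}cm T={temp_c}C pH={ph} DO={do_mg_l}mg/L",
    }
```

On a real Arduino this logic would run inside `loop()` with the `report` string pushed to the display or internet gadget at the predetermined interval.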

  6. Monitoring activities of satellite data processing services in real-time with SDDS Live Monitor

    Science.gov (United States)

    Duc Nguyen, Minh

    2017-10-01

    This work describes Live Monitor, the monitoring subsystem of SDDS - an automated system for space experiment data processing, storage, and distribution created at SINP MSU. Live Monitor allows operators and developers of satellite data centers to quickly identify errors that occur in data processing and to prevent further consequences caused by those errors. All activities of the whole data processing cycle are shown via a web interface in real time. Notification messages are delivered to the responsible people via email and the Telegram messenger service. The flexible monitoring mechanism implemented in Live Monitor allows events shown on the web interface to be dynamically changed and controlled on demand. Physicists whose space weather analysis models run on satellite data provided by SDDS can use the developed RESTful API to monitor their own events and deliver notification messages customized to their needs.
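Purely as an illustration of the event/notification routing such a monitoring subsystem provides, the sketch below records processing-cycle events and fans error notifications out to subscribed channels; the class and method names are assumptions, not SDDS's actual RESTful API:

```python
# Illustrative sketch only: a minimal in-memory event monitor with
# channel-based notification, in the spirit of the subsystem described above.

class EventMonitor:
    def __init__(self):
        self.events = []
        self.subscribers = {"email": [], "telegram": []}

    def subscribe(self, channel, address):
        """Register an address (e.g. an operator's email) on a channel."""
        self.subscribers[channel].append(address)

    def publish(self, stage, level, message):
        """Record an event from one stage of the processing cycle;
        return the notifications to send (errors only)."""
        self.events.append({"stage": stage, "level": level, "message": message})
        if level == "error":
            return [(ch, addr, message)
                    for ch, addrs in self.subscribers.items() for addr in addrs]
        return []
```

A real deployment would replace the returned tuples with actual SMTP/Telegram deliveries and expose `publish` over HTTP.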

  7. Monitoring activities of satellite data processing services in real-time with SDDS Live Monitor

    Directory of Open Access Journals (Sweden)

    Duc Nguyen Minh

    2017-01-01

    Full Text Available This work describes Live Monitor, the monitoring subsystem of SDDS – an automated system for space experiment data processing, storage, and distribution created at SINP MSU. Live Monitor allows operators and developers of satellite data centers to quickly identify errors that occur in data processing and to prevent further consequences caused by those errors. All activities of the whole data processing cycle are shown via a web interface in real time. Notification messages are delivered to the responsible people via email and the Telegram messenger service. The flexible monitoring mechanism implemented in Live Monitor allows events shown on the web interface to be dynamically changed and controlled on demand. Physicists whose space weather analysis models run on satellite data provided by SDDS can use the developed RESTful API to monitor their own events and deliver notification messages customized to their needs.

  8. AUTOMATION OF CHAMPAGNE WINES PROCESS IN SPARKLING WINE PRESSURE TANK

    OpenAIRE

    E. V. Lukyanchuk; V. A. Khobin; V. A. Khobin

    2016-01-01

    The wine industry has now successfully solved the problems of automating grape receiving points, crushing and pressing departments, continuously operating fermentation installations, blend tanks, production lines for ordinary Madeira, continuously operating ethyl alcohol plants, champagne wine installations working in continuous flow, etc. With the development of automation, the productivity of the winemaking process develops in the following areas: organization of complex a...

  9. An Automated 476 MHz RF Cavity Processing Facility at SLAC

    CERN Document Server

    McIntosh, P; Schwarz, H

    2003-01-01

    The 476 MHz accelerating cavities currently used at SLAC are those installed on the PEP-II B-Factory collider accelerator. They are designed to operate at a maximum accelerating voltage of 1 MV and are routinely utilized on PEP-II at voltages up to 750 kV. During the summer of 2003, SPEAR3 will undergo a substantial upgrade, part of which will be to replace the existing 358.54 MHz RF system with essentially a PEP-II high energy ring (HER) RF station operating at 476.3 MHz and 3.2 MV (or 800 kV/cavity). Prior to installation, cavity RF processing is required to prepare them for use. A dedicated high power test facility is employed at SLAC to provide the capability of conditioning each cavity up to the required accelerating voltage. An automated LabVIEW based interface controls and monitors various cavity and test stand parameters, increasing the RF fields accordingly such that stable operation is finally achieved. This paper describes the high power RF cavity processing facility, highlighting the features of t...

  10. Monitoring of polymer melt processing

    International Nuclear Information System (INIS)

    Alig, Ingo; Steinhoff, Bernd; Lellinger, Dirk

    2010-01-01

    The paper reviews the state of the art of in-line and on-line monitoring during polymer melt processing by compounding, extrusion and injection moulding. Different spectroscopic and scattering techniques as well as conductivity and viscosity measurements are reviewed and compared concerning their potential for different process applications. In addition to information on chemical composition and the state of the process, the in situ detection of morphology, which is of specific interest for multiphase polymer systems such as polymer composites and polymer blends, is described in detail. For these systems, the product properties strongly depend on the phase or filler morphology created during processing. Examples of optical (UV/vis, NIR) and ultrasonic attenuation spectra recorded during extrusion are given, which were found to be sensitive to the chemical composition as well as to the size and degree of dispersion of micro- or nanofillers in the polymer matrix. By small-angle light scattering experiments, process-induced structures were detected in blends of incompatible polymers during compounding. Using conductivity measurements during extrusion, the influence of processing conditions on the electrical conductivity of polymer melts with conductive fillers (carbon black or carbon nanotubes) was monitored. (topical review)

  11. Implementing The Automated Phases Of The Partially-Automated Digital Triage Process Model

    Directory of Open Access Journals (Sweden)

    Gary D Cantrell

    2012-12-01

    Full Text Available Digital triage is a pre-digital-forensic phase that sometimes takes place as a way of gathering quick intelligence. Although effort has been undertaken to model the digital forensics process, little has been done to date to model digital triage. This work discusses the further development of a model that does attempt to address digital triage: the Partially-Automated Crime-Specific Digital Triage Process model. The model itself is presented along with a description of how its automated functionality was implemented to facilitate model testing.

  12. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring.

    Science.gov (United States)

    Shu, Tongxin; Xia, Min; Chen, Jiahong; Silva, Clarence de

    2017-11-05

    Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset for comparison with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior, saving 5.31% more battery energy.
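The data-driven idea in the abstract above can be sketched as follows; this is a minimal illustration in the spirit of DDASA, not the published update rule, and the tolerance and interval bounds are assumptions:

```python
# Minimal sketch of data-driven adaptive sampling: shorten the interval
# when the monitored parameter (e.g. DO or turbidity) changes quickly,
# lengthen it when the signal is stable, to save battery energy.
# The tolerance and interval bounds are illustrative assumptions.

def next_interval(readings, base_s=60.0, min_s=15.0, max_s=300.0, tol=0.05):
    """Return the next sampling interval in seconds from recent readings."""
    if len(readings) < 2:
        return base_s
    last, prev = readings[-1], readings[-2]
    rel_change = abs(last - prev) / max(abs(prev), 1e-9)
    if rel_change > tol:           # dynamic signal: sample faster
        return max(min_s, base_s / 2)
    return min(max_s, base_s * 2)  # stable signal: conserve power
```

The energy saving comes from the stable-signal branch dominating in a slowly varying environment, while accuracy is preserved because fast changes trigger denser sampling.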

  13. An Energy Efficient Adaptive Sampling Algorithm in a Sensor Network for Automated Water Quality Monitoring

    Directory of Open Access Journals (Sweden)

    Tongxin Shu

    2017-11-01

    Full Text Available Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected online using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset for comparison with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior, saving 5.31% more battery energy.

  14. Monitored Retrievable Storage/Multi-Purpose Canister analysis: Simulation and economics of automation

    International Nuclear Information System (INIS)

    Bennett, P.C.; Stringer, J.B.

    1994-01-01

    Robotic automation is examined as a possible alternative to manual spent nuclear fuel, transport cask and Multi-Purpose Canister (MPC) handling at a Monitored Retrievable Storage (MRS) facility. Automation of key operational aspects of the MRS/MPC system is analyzed to determine equipment requirements, throughput times and equipment costs. The economic and radiation dose impacts resulting from this automation are compared to manual handling methods

  15. Modern business process automation : YAWL and its support environment

    NARCIS (Netherlands)

    Hofstede, ter A.H.M.; Aalst, van der W.M.P.; Adams, M.; Russell, N.C.

    2010-01-01

    This book provides a comprehensive treatment of the field of Business Process Management (BPM) with a focus on Business Process Automation. It achieves this by covering a wide range of topics, both introductory and advanced, illustrated through and grounded in YAWL (Yet Another Workflow Language)

  16. A system of automated processing of deep water hydrological information

    Science.gov (United States)

    Romantsov, V. A.; Dyubkin, I. A.; Klyukbin, L. N.

    1974-01-01

    An automated system for primary and scientific analysis of deep water hydrological information is presented. Primary processing of the data in this system is carried out on a drifting station, which also calculates the parameters of vertical stability of the sea layers, as well as their depths and altitudes. Methods of processing the raw data are described.

  17. IDAPS (Image Data Automated Processing System) System Description

    Science.gov (United States)

    1988-06-24

    This document describes the physical configuration and components used in the image processing system referred to as IDAPS (Image Data Automated Processing System). This system was developed by the Environmental Research Institute of Michigan (ERIM) for Eglin Air Force Base. The system is designed

  18. An Automated Sample Processing System for Planetary Exploration

    Science.gov (United States)

    Soto, Juancarlos; Lasnik, James; Roark, Shane; Beegle, Luther

    2012-01-01

    An Automated Sample Processing System (ASPS) for wet chemistry processing of organic materials on the surface of Mars has been jointly developed by Ball Aerospace and the Jet Propulsion Laboratory. The mechanism has been built and tested to demonstrate TRL level 4. This paper describes the function of the system, mechanism design, lessons learned, and several challenges that were overcome.

  19. High throughput sample processing and automated scoring

    Directory of Open Access Journals (Sweden)

    Gunnar eBrunborg

    2014-10-01

    Full Text Available The comet assay is a sensitive and versatile method for assessing DNA damage in cells. In the traditional version of the assay, many manual steps are involved and few samples can be treated in one experiment. High throughput modifications have been developed during recent years, and they are reviewed and discussed here. These modifications include accelerated scoring of comets; other important elements that have been studied and adapted to high throughput are cultivation and manipulation of cells or tissues before and after exposure, and freezing of treated samples until comet analysis and scoring. High throughput methods save time and money, but they are also useful for other reasons: large-scale experiments may be performed which are otherwise not practicable (e.g., analysis of many organs from exposed animals, and human biomonitoring studies), and automation gives more uniform sample treatment and less dependence on operator performance. The high throughput modifications now available vary largely in their versatility, capacity, complexity and costs. The bottleneck for further increases in throughput appears to be the scoring.

  20. Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS

    Science.gov (United States)

    Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    We present views and analysis of the execution of several PVM (Parallel Virtual Machine) codes for Computational Fluid Dynamics on a network of Sparcstations, including: (1) NAS Parallel Benchmarks CG and MG; (2) a multi-partitioning algorithm for NAS Parallel Benchmark SP; and (3) an overset grid flowsolver. These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We describe the architecture, operation and application of AIMS. The AIMS toolkit contains: (1) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (2) Monitor, a library of runtime trace-collection routines; (3) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (4) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran 77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (1) the impact of long message latencies; (2) the impact of multiprogramming overheads and associated load imbalance; (3) cache and virtual-memory effects; and (4) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs.
Some of the features planned for the near future include: (1) ConfigView, showing the physical topology
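The causality-based timestamp adjustment described above (no message may arrive before it was sent) can be sketched as follows; the data layout and single-pass update are assumptions for illustration, not AIMS' actual algorithm:

```python
# Illustrative sketch of clock-skew compensation: when a traced message
# appears to go backwards in time, bump the receiver's clock offset just
# enough to restore send <= receive. Structure is assumed, not AIMS' code.

def adjust_offsets(messages):
    """messages: list of (sender, send_ts, receiver, recv_ts) tuples.
    Returns per-host clock offsets that eliminate physically impossible
    communications under a constant-skew (zero-drift) assumption."""
    offsets = {}
    for sender, send_ts, receiver, recv_ts in messages:
        s = send_ts + offsets.get(sender, 0.0)
        r = recv_ts + offsets.get(receiver, 0.0)
        if r < s:  # message would travel backwards in time
            offsets[receiver] = offsets.get(receiver, 0.0) + (s - r)
    return offsets
```

Applying the returned offsets to each host's timestamps yields a trace in which every receive follows its send, which is the invariant the skew-compensation step enforces.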

  1. The Automated Discovery of Hybrid Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Reijers, Hajo; Maggi, Fabrizio Maria

    2014-01-01

    The declarative-procedural dichotomy is highly relevant when choosing the most suitable process modeling language to represent a discovered process. Less-structured processes with a high level of variability can be described in a more compact way using a declarative language. By contrast, procedu...

  2. The automated discovery of hybrid processes

    NARCIS (Netherlands)

    Maggi, F.M.; Slaats, T.; Reijers, H.A.

    2014-01-01

    The declarative-procedural dichotomy is highly relevant when choosing the most suitable process modeling language to represent a discovered process. Less-structured processes with a high level of variability can be described in a more compact way using a declarative language. By contrast, procedural

  3. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    In automated production processes, grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases and describes how different principles are adopted at different scales in different applications (from assembly to disassembly, from aerospace to the food industry, from textile to logistics). Finally, the most recent research is reviewed in order to introduce the new trends in grasping, providing an outlook on the future of both grippers and robotic hands in automated production processes.

  4. Automated complex spectra processing of actinide α-radiation

    International Nuclear Information System (INIS)

    Anichenkov, S.V.; Popov, Yu.S.; Tselishchev, I.V.; Mishenev, V.B.; Timofeev, G.A.

    1989-01-01

    Previously described algorithms for the automated processing of complex α-spectra of actinides, using an Ehlektronika D3-28 computer connected to an ICA-070 multichannel amplitude pulse analyzer, were implemented. The developed program enables calculation of peak intensities and relative isotope content, energy calibration of spectra, calculation of peak centres of gravity and energy resolution, and integral counting in a particular part of the spectrum. The error of the automated processing method depends on the degree of spectrum complexity and lies within the limits of 1-12%. 8 refs.; 4 figs.; 2 tabs

  5. An automated digital imaging system for environmental monitoring applications

    Science.gov (United States)

    Bogle, Rian; Velasco, Miguel; Vogel, John

    2013-01-01

    Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.

  6. Software Process Automation: Experiences from the Trenches.

    Science.gov (United States)

    1996-07-01

    [Garbled table residue: organizations' process automation environments, listing tools such as WordPerfect, All-in-One, Oracle, Weaver, FrameMaker and CM systems used for tool integration and for handling change requests and problem reports.] Tools mentioned include: Autoplan, a project management tool; FrameMaker, a document processing system; Worldview, a document ...; Cadre TeamWork; something for requirements traceability; a homegrown scheduling tool; and a homegrown tool integrator.

  7. AIRSAR Automated Web-based Data Processing and Distribution System

    Science.gov (United States)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  8. Automated processing for proton spectroscopic imaging using water reference deconvolution.

    Science.gov (United States)

    Maudsley, A A; Wu, Z; Meyerhoff, D J; Weiner, M W

    1994-06-01

    Automated formation of MR spectroscopic images (MRSI) is necessary before routine application of these methods is possible for in vivo studies; however, this task is complicated by the presence of spatially dependent instrumental distortions and the complex nature of the MR spectrum. A data processing method is presented for completely automated formation of in vivo proton spectroscopic images, and applied to the analysis of human brain metabolites. This procedure uses the water reference deconvolution method (G. A. Morris, J. Magn. Reson. 80, 547 (1988)) to correct for lineshape distortions caused by instrumental and sample characteristics, followed by parametric spectral analysis. Results for automated image formation were found to compare favorably with operator-dependent spectral integration methods. While the water reference deconvolution processing was found to provide good correction of spatially dependent resonance frequency shifts, it was found to be susceptible to errors in the correction of lineshape distortions, which occur due to differences between the water reference and the metabolite distributions.
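The core of water-reference deconvolution can be sketched as follows; in this simplified, assumed formulation, the metabolite FID is divided point-by-point by the acquired water FID and rescaled by an ideal decay, so that lineshape distortions common to both signals cancel. The decay rate and sampling step are illustrative assumptions, not the published procedure:

```python
import cmath

# Simplified sketch of water-reference lineshape correction:
# distortions shared by the water reference and the metabolite
# signal divide out in the time domain.

def deconvolve_fid(metab_fid, water_fid, ideal_decay_rate=10.0, dt=0.001):
    """Each FID is a sequence of complex samples taken every dt seconds.
    Returns the corrected metabolite FID: metab * ideal / water."""
    corrected = []
    for k, (m, w) in enumerate(zip(metab_fid, water_fid)):
        ideal = cmath.exp(-ideal_decay_rate * k * dt)  # ideal Lorentzian decay
        corrected.append(m * ideal / w if abs(w) > 1e-12 else 0.0)
    return corrected
```

When the metabolite and water signals carry the same instrumental distortion, the division leaves only the ideal decay times the undistorted metabolite signal, which is the effect the abstract describes.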

  9. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  10. An automated, self-verifying system for monitoring uranium in effluent streams

    International Nuclear Information System (INIS)

    Reda, R.J.; Pickett, J.L.

    1992-01-01

    In nuclear facilities such as nuclear fuel fabrication plants, a constant vigil is required to ensure that the concentrations of uranium in process or waste streams do not exceed required specifications. The specifications may be dictated by the process owner, a regulatory agency such as the US Nuclear Regulatory Commission or the Environmental Protection Agency, or by criticality safety engineering criteria. Traditionally, uranium monitoring in effluent streams has been accomplished by taking periodic samples of the liquid stream and determining the concentration by chemical analysis. Despite its accuracy, chemical sampling is not timely enough for practical use in continuously flowing systems, because a significant quantity of uranium may be discharged between sampling intervals. To completely satisfy regulatory standards, the liquid waste stream must be monitored for uranium on a 100% basis. To this end, an automated, radioisotopic liquid-waste monitoring system was developed by GE Nuclear Energy as an integral part of the uranium conversion and waste recovery operations. The system utilizes passive gamma-ray spectroscopy and thus provides a robust, on-line, nondestructive assay for uranium. The system provides uranium concentration data for process monitoring and assures regulatory compliance for criticality safety. A summary of the principles of system operation, calibration, and verification is presented in this paper
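The kind of on-line concentration check such a monitor performs can be sketched as follows; the calibration-factor approach and every number in the example are illustrative assumptions, not the actual GE Nuclear Energy calibration:

```python
# Hypothetical sketch: convert a net gamma count rate to a uranium
# concentration via an assumed linear calibration factor, then compare
# it against a criticality-safety limit. All values are illustrative.

def uranium_alarm(gross_cps, background_cps, cal_g_per_l_per_cps, limit_g_per_l):
    """Return (concentration in g/L, alarm flag) for one measurement."""
    net = max(gross_cps - background_cps, 0.0)   # background-subtracted rate
    conc = net * cal_g_per_l_per_cps             # calibration: rate -> g/L
    return conc, conc > limit_g_per_l
```

In a continuous monitor this check would run on every spectroscopy integration period, giving the 100% coverage that periodic chemical sampling cannot.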

  11. Test Automation Process Improvement A case study of BroadSoft

    OpenAIRE

    Gummadi, Jalendar

    2016-01-01

    This master's thesis research concerns the improvement of the test automation process at BroadSoft Finland, as a case study. A test automation project recently started at BroadSoft, but it is not properly integrated into the existing process. The project converts manual test cases into automated test cases. The aim of this thesis is to study the existing BroadSoft test process and different test automation frameworks. In this thesis different test automation processes are studied ...

  12. Automated Remote Monitoring of Depression: Acceptance Among Low-Income Patients in Diabetes Disease Management

    OpenAIRE

    Ramirez, Magaly; Wu, Shinyi; Jin, Haomiao; Ell, Kathleen; Gross-Schulman, Sandra; Myerchin Sklaroff, Laura; Guterman, Jeffrey

    2016-01-01

    Background: Remote patient monitoring is increasingly integrated into health care delivery to expand access and increase effectiveness. Automation can add efficiency to remote monitoring, but patient acceptance of automated tools is critical for success. From 2010 to 2013, the Diabetes-Depression Care-management Adoption Trial (DCAT), a quasi-experimental comparative effectiveness research trial aimed at accelerating the adoption of collaborative depression care in a safety-net health care syst...

  13. Automated processing of data for supertree construction

    OpenAIRE

    Hill, Jon; Davis, Katie; Tover, Jaime; Wills, Matthew

    2015-01-01

    Talk given at Evolution 2015 on the new auto-processing functionality of the STK. This involves collecting nomenclature and taxonomic information on the OTUs to create a consistent naming scheme, followed by the normal processing.

  14. DEVELOPMENT OF AN AUTOMATED BATCH-PROCESS SOLAR ...

    African Journals Online (AJOL)

    One of the shortcomings of solar disinfection of water (SODIS) is the absence of a feedback mechanism indicating treatment completion. This work presents the development of an automated batch-process water disinfection system aimed at solving this challenge. Locally sourced materials, in addition to an Arduino micro ...

  15. Automating the Fireshed Assessment Process with ArcGIS

    Science.gov (United States)

    Alan Ager; Klaus Barber

    2006-01-01

    A library of macros was developed to automate the Fireshed process within ArcGIS. The macros link a number of vegetation simulation and wildfire behavior models (FVS, SVS, FARSITE, and FlamMap) with ESRI geodatabases, desktop software (Access, Excel), and ArcGIS. The macros provide for (1) an interactive linkage between digital imagery, vegetation data, FVS-FFE, and...

  16. Integration of disabled people in an automated work process

    Science.gov (United States)

    Jalba, C. K.; Muminovic, A.; Epple, S.; Barz, C.; Nasui, V.

    2017-05-01

    Automation processes are entering more and more areas of life and production. People with disabilities, in particular, can hardly keep pace with this change. In sheltered workshops in Germany, people with physical and mental disabilities receive dedicated help to integrate into work processes. This work shows that cooperation between disabled people and industrial robots, by means of industrial image processing, can successfully result in the production of highly complex products. It describes how high-pressure hydraulic pumps are assembled by people with disabilities in cooperation with industrial robots in a sheltered workshop. After the assembly process, the pumps are checked for leaks at very high pressures in a completely automated process.

  17. Automated high-volume aerosol sampling station for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Toivonen, H.; Honkamaa, T.; Ilander, T.; Leppaenen, A.; Nikkinen, M.; Poellaenen, R.; Ylaetalo, S.

    1998-07-01

    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (attached into a cassette); the airflow through the filter is 800 m³/h at maximum. During the sampling, the filter is continuously monitored with NaI scintillation detectors. After the sampling, the large filter is automatically cut into 15 pieces that form a small sample and, after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1-10 × 10⁻⁶ Bq/m³. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via the Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of the CTBTO for aerosol monitoring. The concept is well suited to nuclear material safeguards, too
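The quoted minimum detectable concentrations can be related to the station's sampling and counting parameters through a Currie-style detection limit. The sketch below is illustrative only; the background counts and detector efficiency are assumed values, not figures from the STUK report:

```python
import math

def minimum_detectable_concentration(background_counts, efficiency,
                                     count_time_s, air_volume_m3):
    """Currie-style minimum detectable concentration (Bq/m^3).

    L_D = 2.71 + 4.65*sqrt(B) counts, converted to an activity
    concentration via detector efficiency, counting time and sampled
    air volume. Illustrative only; the station's actual MDC derivation
    is not given in the abstract.
    """
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit, counts
    activity_bq = l_d / (efficiency * count_time_s)   # counts -> Bq
    return activity_bq / air_volume_m3                # Bq -> Bq/m^3

# One day of sampling at 800 m^3/h followed by one day of counting:
mdc = minimum_detectable_concentration(
    background_counts=5000, efficiency=0.3,
    count_time_s=86400, air_volume_m3=800 * 24)
```

With these assumed inputs the result lands in the 10⁻⁶ Bq/m³ range the abstract reports.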

  18. Automating Software Development Process using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet; Damiani, Ernesto; Jain, Lakhmi C.; Madravio, Mauro

    2004-01-01

    In this chapter, we aim to highlight how fuzzy logic can be a valid expressive tool to manage the software development process. We characterize a software development method in terms of two major components: artifact types and methodological rules. Classes, attributes, operations, and inheritance

  19. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives

    DEFF Research Database (Denmark)

    Podevin, Michael Paul Ambrose; Fotidis, Ioannis; Angelidaki, Irini

    2018-01-01

    The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT), with high selectivity – the capability of monitoring several analytes simultaneously – in the interest of improving product quality, productivity, and process automation in microalgae production through multivariate process control (MVPC) and software sensors trained on “big data”. The paper also includes a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation, to aid and motivate development.

  20. Prospect Theory in the Automated Advisory Process

    OpenAIRE

    WERNER, JONATAN; SJÖBERG, JONAS

    2016-01-01

    With robo-advisors and regulation eventually changing the market conditions of the financial advisory industry, traditional advisors will have to adapt to a new world of asset management. Thus, it will be of interest to traditional advisors to further explore the topic of how to automatically evaluate soft aspects such as client preferences and behavior, and transform them into portfolio allocations while retaining stringency and high quality in the process. In this thesis, we show how client pr...

  1. Streamlining Compliance Validation Through Automation Processes

    Science.gov (United States)

    2014-03-01

    ACAS Assured Compliance Assessment Suite; AMP Apache-MySQL-PHP; ANSI American... enemy. Of course, a common standard for DoD security personnel to write and share compliance validation content would prevent duplicate work and aid in... process and consume much of the SCAP content available. Finally, it is free and easy to install as part of the Apache/MySQL/PHP (AMP) [37

  2. Safety Evaluation of an Automated Remote Monitoring System for Heart Failure in an Urban, Indigent Population.

    Science.gov (United States)

    Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Hertz, Crystal Coyazo; Guterman, Jeffrey J

    2017-12-01

    Heart Failure (HF) is the most expensive preventable condition, regardless of patient ethnicity, race, socioeconomic status, sex, and insurance status. Remote telemonitoring with timely outpatient care can significantly reduce avoidable HF hospitalizations. Human outreach, the traditional method used for remote monitoring, is effective but costly. Automated systems can potentially provide positive clinical, fiscal, and satisfaction outcomes in chronic disease monitoring. The authors implemented a telephonic HF automated remote monitoring system that utilizes deterministic decision tree logic to identify patients who are at risk of clinical decompensation. This safety study evaluated the degree of clinical concordance between the automated system and traditional human monitoring. This study focused on a broad underserved population and demonstrated a safe, reliable, and inexpensive method of monitoring patients with HF.
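The deterministic decision-tree logic described above can be pictured as a small rule set. All thresholds, branch structure, and the function name below are illustrative assumptions, not the study's actual protocol:

```python
def triage_hf_response(weight_gain_lb, breathing_worse, swelling_worse):
    """Toy deterministic decision tree in the spirit of the automated
    HF telemonitoring system described above. Thresholds and branches
    are illustrative assumptions, not the study's clinical protocol."""
    if weight_gain_lb >= 5 or (breathing_worse and swelling_worse):
        return "alert-clinician"   # possible decompensation
    if weight_gain_lb >= 2 or breathing_worse or swelling_worse:
        return "nurse-callback"    # intermediate risk
    return "routine"               # continue automated monitoring
```

A safety study like the one above would then measure how often such automated classifications agree with human outreach.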

  3. INFORMATION SYSTEM OF AUTOMATION OF PREPARATION EDUCATIONAL PROCESS DOCUMENTS

    Directory of Open Access Journals (Sweden)

    V. A. Matyushenko

    2016-01-01

    Full Text Available Information technology is rapidly conquering the world, permeating all spheres of human activity, and education is no exception. An important direction in the informatization of education is the development of university management systems; modern information systems improve and facilitate the management of all types of institutional activities. The purpose of this paper is the development of a system that automates the preparation of accounting documents. The article describes the problem of preparing educational process documents; the information system was designed and built in the Microsoft Access environment. The result is four types of reports produced by the developed system, which now automates the process and reduces the effort required to prepare accounting documents. All reports are implemented in the Microsoft Excel software product and can be used for further analysis and processing.

  4. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    Direct infusion of crude extracts into the source takes advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques needed to fully exploit this gain in resolution and accuracy and to handle the massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra by projecting all detected ions within defined intervals on both the time and mass axes. We demonstrate that direct infusion analyses of crude extracts can reveal the relationships between several species of the terverticillate Penicillium, that the ions responsible for the segregation can be identified, and that the method can automate the detection of unique species and unique metabolites.

  5. Automated defect spatial signature analysis for semiconductor manufacturing process

    Science.gov (United States)

    Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed

    1999-01-01

    An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high level categories, classifying the categorized data contained in each high level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.

  6. Automation of the micro-arc oxidation process

    Science.gov (United States)

    Golubkov, P. E.; Pecherskaya, E. A.; Karpanin, O. V.; Shepeleva, Y. V.; Zinchenko, T. O.; Artamonov, D. V.

    2017-11-01

    At present, significantly increased interest in micro-arc oxidation (MAO) encourages scientists to seek solutions to the problem of controlling this technological process. To solve this problem, an automated MAO installation was developed; its structure and control principles are presented in this article. The device provides controlled synthesis of MAO coatings and helps identify MAO process patterns, which contributes to the commercialization of this technology.

  7. ECG acquisition and automated remote processing

    CERN Document Server

    Gupta, Rajarshi; Bera, Jitendranath

    2014-01-01

    The book is focused on the area of remote processing of ECG in the context of telecardiology, an emerging area in the field of biomedical engineering applications. Considering the poor infrastructure and inadequate numbers of physicians in rural healthcare clinics in India and other developing nations, telemedicine services assume special importance. Telecardiology, a specialized area of telemedicine, is taken up in this book considering the importance of cardiac diseases, which are prevalent in the population under discussion. The main focus of this book is to discuss different aspects of ECG acquisition, its remote transmission, and computerized ECG signal analysis for feature extraction. It also discusses ECG compression and the application of standalone embedded systems to develop a cost-effective solution for a telecardiology system.

  8. Portal monitoring technology control process

    International Nuclear Information System (INIS)

    York, R.L.

    1998-01-01

    Portal monitors are an important part of the material protection, control, and accounting (MPC and A) programs in Russia and the US. Although portal monitors are only a part of an integrated MPC and A system, they are an effective means of controlling the unauthorized movement of special nuclear material (SNM). Russian technical experts have gained experience in the use of SNM portal monitors from US experts, and this has allowed them to use the monitors more effectively. Several Russian institutes and companies are designing and manufacturing SNM portal monitors in Russia. Interactions between Russian and US experts have resulted in improvements to the instruments. SNM portal monitor technology has been effectively transferred from the US to Russia and should be a permanent part of the Russian MPC and A Program. Progress in the implementation of the monitors and improvements to how they are used are discussed

  9. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    Science.gov (United States)

    Yip, Hon Ming; Li, John C. S.; Cui, Xin; Gao, Qiannan; Leung, Chi Chiu

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities. PMID:25133248

  10. Automated long-term monitoring of parallel microfluidic operations applying a machine vision-assisted positioning method.

    Science.gov (United States)

    Yip, Hon Ming; Li, John C S; Xie, Kai; Cui, Xin; Prasad, Agrim; Gao, Qiannan; Leung, Chi Chiu; Lam, Raymond H W

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities.

  11. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    Directory of Open Access Journals (Sweden)

    Hon Ming Yip

    2014-01-01

    Full Text Available As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities.
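The realignment idea in these records (fixed alignment marks plus image processing to recover stage drift) can be sketched with a centroid-based offset estimate. The brightest-blob assumption and the helper name are illustrative; a production system would more likely use template matching against the fabricated mark:

```python
import numpy as np

def mark_offset(image, reference_xy, threshold=0.5):
    """Estimate the (dx, dy) drift of an alignment mark relative to its
    preset reference position. The mark is assumed to be the brightest
    blob in the field of view; illustrative sketch only."""
    mask = image > threshold * image.max()
    ys, xs = np.nonzero(mask)          # pixel coordinates of the mark
    cx, cy = xs.mean(), ys.mean()      # mark centroid (x = col, y = row)
    return cx - reference_xy[0], cy - reference_xy[1]

# Simulated frame: a 3x3 bright mark drifted by (+1, +1) px from (11, 10)
frame = np.zeros((32, 32))
frame[10:13, 11:14] = 1.0              # rows ~ y, cols ~ x
dx, dy = mark_offset(frame, reference_xy=(11.0, 10.0))
# The motorized stage would then be commanded to move by (-dx, -dy)
# before each image is recorded.
```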

  12. The automated data processing architecture for the GPI Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce

    2017-09-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  13. Technology transfer potential of an automated water monitoring system. [market research

    Science.gov (United States)

    Jamieson, W. M.; Hillman, M. E. D.; Eischen, M. A.; Stilwell, J. M.

    1976-01-01

    The nature and characteristics of the potential economic need (markets) for a highly integrated water quality monitoring system were investigated. The technological, institutional and marketing factors that would influence the transfer and adoption of an automated system were studied for application to public and private water supply, public and private wastewater treatment and environmental monitoring of rivers and lakes.

  14. Some considerations on automated image processing of pathline photographs

    International Nuclear Information System (INIS)

    Kobayashi, T.; Saga, T.; Segawa, S.

    1987-01-01

    It is presently shown that flow visualization velocity vectors can be automatically obtained from tracer particle photographs by means of an image processing system. The system involves automated gray-level threshold selection during the digitization process and separation or erasure of intersecting pathlines, followed by use of the pathline picture in the identification process and an adjustment of the averaging area in the rearrangement process. Attention is given to the results obtained for two-dimensional flows past an airfoil cascade and around a circular cylinder. 7 references
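Automated gray-level threshold selection, the first step mentioned in the abstract, is commonly done with Otsu's method; the paper does not specify its criterion, so the following is a generic sketch:

```python
import numpy as np

def otsu_threshold(gray):
    """Automated gray-level threshold selection by Otsu's method: pick
    the level that maximizes between-class variance. One standard way
    to perform the digitization step described above; the paper's exact
    criterion is not specified."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()   # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0
        mu1 = (levels[t:] * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t
```

On a bimodal pathline photograph (dark background, bright tracers) the returned level separates the two populations.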

  15. Personalized and automated remote monitoring of atrial fibrillation.

    Science.gov (United States)

    Rosier, Arnaud; Mabo, Philippe; Temal, Lynda; Van Hille, Pascal; Dameron, Olivier; Deléger, Louise; Grouin, Cyril; Zweigenbaum, Pierre; Jacques, Julie; Chazard, Emmanuel; Laporte, Laure; Henry, Christine; Burgun, Anita

    2016-03-01

    Remote monitoring of cardiac implantable electronic devices is a growing standard; yet, remote follow-up and management of alerts represent a time-consuming task for physicians or trained staff. This study evaluates an automatic mechanism based on artificial intelligence tools to filter atrial fibrillation (AF) alerts based on their medical significance. We evaluated this method on alerts for AF episodes that occurred in 60 pacemaker recipients. The AKENATON prototype workflow includes two steps: natural language-processing algorithms abstract the patient health record to a digital version, then a knowledge-based algorithm based on an applied formal ontology calculates the CHA2DS2-VASc score and evaluates the anticoagulation status of the patient. Each alert is then automatically classified by importance from low to critical, by mimicking medical reasoning. Final classification was compared with human expert analysis by two physicians. A total of 1783 alerts about AF episodes >5 min in 60 patients were processed. Overall, 1749 of 1783 alerts (98%) were adequately classified, and there was no underestimation of alert importance in the remaining 34 misclassified alerts. This work demonstrates the ability of a pilot system to classify alerts and improve personalized remote monitoring of patients. In particular, our method allows integration of patient medical history with device alert notifications, which is useful both from medical and resource-management perspectives. The system was able to automatically classify the importance of 1783 AF alerts in 60 patients, which resulted in an 84% reduction in notification workload, while preserving patient safety. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
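The CHA2DS2-VASc score that the AKENATON workflow computes is a standard, published scoring rule, so it can be written directly (the function signature is of course an illustrative choice):

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_or_tia, vascular_disease):
    """Standard CHA2DS2-VASc stroke-risk score: age >=75 and prior
    stroke/TIA score 2 points each; age 65-74, female sex, congestive
    heart failure, hypertension, diabetes and vascular disease score
    1 point each."""
    score = 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if female else 0
    score += 1 if chf else 0
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 2 if stroke_or_tia else 0
    score += 1 if vascular_disease else 0
    return score
```

Combined with the patient's anticoagulation status, such a score lets each AF alert be ranked from low to critical as the abstract describes.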

  16. An Automated, Image Processing System for Concrete Evaluation

    International Nuclear Information System (INIS)

    Baumgart, C.W.; Cave, S.P.; Linder, K.E.

    1998-01-01

    AlliedSignal Federal Manufacturing & Technologies (FM&T) was asked to perform a proof-of-concept study for the Missouri Highway and Transportation Department (MHTD), Research Division, in June 1997. The goal of this proof-of-concept study was to ascertain whether automated scanning and imaging techniques might be applied effectively to the problem of concrete evaluation. In the current evaluation process, a concrete sample core is manually scanned under a microscope. Voids (or air spaces) within the concrete are then detected visually by a human operator by incrementing the sample under the cross-hairs of a microscope and counting the number of "pixels" which fall within a void. Automation of the scanning and image analysis processes is desired to improve the speed of the scanning process, to improve evaluation consistency, and to reduce operator fatigue. An initial, proof-of-concept image analysis approach was successfully developed and demonstrated using acquired black-and-white imagery of concrete samples. In this paper, the automated scanning and image capture system currently under development is described and the image processing approach developed for the proof-of-concept study is demonstrated. A development update and plans for future enhancements are also presented

  17. Enhancing Business Process Automation by Integrating RFID Data and Events

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    Business process automation is one of the major benefits for utilising Radio Frequency Identification (RFID) technology. Through readers to RFID middleware systems, the information and the movements of tagged objects can be used to trigger business transactions. These features change the way of business applications for dealing with the physical world from mostly quantity-based to object-based. Aiming to facilitate business process automation, this paper introduces a new method to model and incorporate business logics into RFID edge systems from an object-oriented perspective with emphasises on RFID's event-driven characteristics. A framework covering business rule modelling, event handling and system operation invocations is presented on the basis of the event calculus. In regard to the identified delayed effects in RFID-enabled applications, a two-block buffering mechanism is proposed to improve RFID query efficiency within the framework. The performance improvements are analysed with related experiments.
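The two-block buffering idea can be sketched as a double buffer: incoming RFID events land in an active block while queries scan a sealed block, so neither side blocks the other. The class below is an interpretation of the mechanism, not the paper's implementation:

```python
class TwoBlockBuffer:
    """Double-buffering sketch inspired by the two-block mechanism
    described above: readers append into an active block while queries
    scan the sealed block. Illustrative; the paper's actual design and
    swap policy may differ."""

    def __init__(self):
        self.active, self.sealed = [], []

    def append(self, event):
        self.active.append(event)                    # reader side: cheap append

    def swap(self):
        self.sealed, self.active = self.active, []   # seal the block for querying

    def query(self, predicate):
        return [e for e in self.sealed if predicate(e)]
```

A middleware loop would call `swap()` periodically (or when delayed effects have settled), then evaluate business rules against the sealed block while new tag reads keep arriving.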

  18. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
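Step (3), spike-sorting, typically begins with amplitude-threshold detection; the abstract does not specify the algorithm, so the following stand-in uses a simple standard-deviation threshold with a refractory period:

```python
import numpy as np

def detect_spikes(signal, fs, threshold_sd=4.0, refractory_s=0.005):
    """Amplitude-threshold spike detection: flag samples exceeding
    threshold_sd standard deviations, enforcing a refractory gap so one
    spike is not counted twice. An illustrative stand-in for step (3)
    of the pipeline above, not the study's spike-sorter."""
    thr = threshold_sd * signal.std()
    refractory = int(refractory_s * fs)      # refractory period in samples
    spikes, last = [], -refractory
    for i, v in enumerate(signal):
        if abs(v) > thr and i - last >= refractory:
            spikes.append(i)
            last = i
    return spikes
```

Linking such a detector to the interval-isolation and band-filtering steps is what makes large-scale batch EEG analysis practical.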

  19. Automated processing of data generated by molecular dynamics

    International Nuclear Information System (INIS)

    Lobato Hoyos, Ivan; Rojas Tapia, Justo; Instituto Peruano de Energia Nuclear, Lima

    2008-01-01

    A new integrated tool for automated processing of data generated by molecular dynamics packages and programs has been developed. The program computes important quantities such as the pair correlation function and common-neighbor analysis, counts nanoparticles and their size distribution, and converts output files between different formats. The work explains in detail the modules of the tool and the interfaces between them. Use of the program is illustrated with application examples computing various properties of silver nanoparticles. (author)
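The pair correlation function named in the abstract is a well-defined quantity, so a minimal version can be sketched (a cubic box with periodic boundaries is assumed; the tool's actual implementation details are not given):

```python
import numpy as np

def pair_correlation(positions, box, dr, r_max):
    """Radial distribution function g(r) for particles in a cubic box
    with periodic boundaries -- the kind of quantity the tool above
    extracts from molecular-dynamics output. Minimal sketch."""
    n = len(positions)
    rho = n / box**3                          # number density
    edges = np.arange(0.0, r_max + dr, dr)
    hist = np.zeros(len(edges) - 1)
    for i in range(n):
        d = positions - positions[i]
        d -= box * np.round(d / box)          # minimum-image convention
        r = np.sqrt((d * d).sum(axis=1))
        hist += np.histogram(r[r > 0], bins=edges)[0]
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    return edges[:-1] + dr / 2, hist / (n * rho * shell)

# Demo: 4x4x4 simple cubic lattice, spacing 1, in a periodic box of side 4.
grid = np.arange(4.0)
pos = np.array([[x, y, z] for x in grid for y in grid for z in grid])
r, g = pair_correlation(pos, box=4.0, dr=0.1, r_max=1.5)
# g vanishes below the nearest-neighbour distance and peaks at r = 1.
```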

  20. Simulation based optimization on automated fibre placement process

    Science.gov (United States)

    Lei, Shi

    2018-02-01

    In this paper, a software simulation (Autodesk TruPlan & TruFiber) based method is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects, and advanced fibre path generation algorithms are compared on geometrically different parts. Major manufacturing data are taken into consideration prior to tool path generation to achieve a high success rate in manufacturing.

  1. Automated processing of X-ray images in medicine

    International Nuclear Information System (INIS)

    Babij, Ya.S.; B'yalyuk, Ya.O.; Yanovich, I.A.; Lysenko, A.V.

    1991-01-01

    Theoretical and practical achievements in the application of computing technology to the processing of X-ray images in medicine are reviewed. A scheme of the main directions and tasks of X-ray image processing is given and analyzed, and the principal problems arising in automated processing of X-ray images are identified. It is shown that, for the interpretation of X-ray images, it is expedient to introduce the notion of the relative operating characteristic (ROC) of a roentgenologist: each point on the ROC curve determines the individual criterion of the roentgenologist for making a positive diagnosis in a given situation

  2. AUTOMATION OF TRACEABILITY PROCESS AT GRAIN TERMINAL LLC “UKRTRANSAGRO”

    Directory of Open Access Journals (Sweden)

    F. A. TRISHYN

    2017-07-01

    Full Text Available A positive trend of growth in both grain production and export is indicated. In the current marketing year the export potential of the Ukrainian grain market is close to a record level. However, high positions in the rating of world exporters are achieved not only through high export potential, but also through higher quality and logistics, factors that depend directly on the quality of management of the enterprise and of all the processes occurring at it. One promising direction of enterprise development is the implementation of a traceability system and the subsequent automation of the traceability process; European integration laws oblige Ukrainian enterprises to have such a system. Traceability is the ability to follow the movement of a feed or food through specified stages of production, processing and distribution. The traceability process is managed by people, which implies a human factor; automation will largely eliminate this factor, reducing documentation errors and speeding up grain transshipment. Research on the process was carried out at a modern grain terminal, LLC “UkrTransAgro”, located in the Ukrainian water area of the Azov Sea (Mariupol, Ukraine). Characteristics of the terminal: capacity of simultaneous storage - 48,120 thousand tons; acceptance of crops from transport - 4,500 tons/day; acceptance of crops from railway transport - 3,000 tons/day; transshipment capacity - up to 1.2 million tons per year; shipment to sea vessels - 7,000 tons/day. An analysis of the automation level of the grain terminal was carried out. The company uses software from 1C - «1C: Enterprise 8. Accounting for grain elevator, mill, and feed mill for Ukraine» - for quantitative and qualitative registration at the elevator in accordance with industry guidelines and standards. The software product has many

  3. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness

    Science.gov (United States)

    Singh, Preetpal

    Canada is home to thousands of freshwater lakes and rivers. Apart from being sources of infinite natural beauty, rivers and lakes are an important source of water, food and transportation. The northern hemisphere of Canada experiences extreme cold temperatures in the winter, resulting in a freeze-up of regional lakes and rivers. Frozen lakes and rivers tend to offer unique opportunities in terms of wildlife harvesting and winter transportation. Ice roads built on frozen rivers and lakes are vital supply lines for industrial operations in the remote north, and monitoring the ice freeze-up and break-up dates annually can help predict regional climatic changes. Lake ice impacts a variety of physical, ecological and economic processes. The construction and maintenance of a winter road can cost millions of dollars annually, and a good understanding of ice mechanics is required to build and deem an ice road safe. A crucial factor in calculating the load-bearing capacity of ice sheets is the thickness of the ice: construction costs are mainly attributed to producing and maintaining a specific thickness and density of ice that can support different loads. Climate change is leading to warmer temperatures, causing the ice to thin faster; at a certain point, a winter road may no longer be thick enough to support travel and transportation. There is considerable interest in monitoring winter road conditions given the high construction and maintenance costs involved. Remote sensing technologies such as Synthetic Aperture Radar have been successfully utilized to study the extent of ice covers and record freeze-up and break-up dates of ice on lakes and rivers across the north, and ice road builders have often used ultrasound equipment to measure ice thickness. However, an automated monitoring system based on machine vision and image processing technology that can measure lake ice thickness has not yet been developed.
Machine vision and image processing techniques have successfully been used in manufacturing
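The link between ice thickness and load-bearing capacity mentioned above is often estimated with Gold's empirical formula P = A·h²; the value of A below is an assumption within the commonly quoted range, and real ice-road ratings depend on ice quality and loading conditions:

```python
def ice_bearing_load_kg(thickness_cm, a_kg_per_cm2=3.5):
    """Gold's empirical formula P = A * h^2 relating allowable load P
    to ice thickness h (in cm). A is commonly taken between roughly
    3.5 and 7 kg/cm^2 depending on ice quality; 3.5 is a conservative
    illustrative choice. Shows why thickness is the crucial measured
    input for ice-road load ratings."""
    return a_kg_per_cm2 * thickness_cm ** 2
```

For example, one metre of good ice (h = 100 cm) with A = 3.5 kg/cm² rates at 35 tonnes, which is the order of magnitude used for heavy winter-road traffic.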

  4. The Earth Observation Monitor - Automated monitoring and alerting for spatial time-series data based on OGC web services

    Science.gov (United States)

    Eberle, J.; Hüttich, C.; Schmullius, C.

    2014-12-01

    Spatial time series data have been freely available around the globe from earth observation satellites and meteorological stations for many years. They provide useful and important information for detecting ongoing changes of the environment, but for end-users it is often too complex to extract this information from the original time series datasets. This issue led to the development of the Earth Observation Monitor (EOM), an operational framework and research project providing simple access, analysis and monitoring tools for global spatial time series data. A multi-source data processing middleware in the backend is linked to MODIS data from the Land Processes Distributed Active Archive Center (LP DAAC) and Google Earth Engine, as well as daily climate station data from the NOAA National Climatic Data Center. OGC Web Processing Services are used to integrate datasets from linked data providers or external OGC-compliant interfaces into the EOM. Users can use either the web portal (webEOM) or the mobile application (mobileEOM) to execute these processing services and to retrieve the requested data for a given point or polygon in user-friendly file formats (CSV, GeoTIFF). Beyond plain data access, users can run further time series analyses such as trend calculations, breakpoint detections or the derivation of phenological parameters from vegetation time series data; data from climate stations can also be aggregated over a given time interval. Calculated results can be visualized in the client and downloaded for offline usage. Automated monitoring and alerting for the time series data integrated by the user is provided by an OGC Sensor Observation Service coupled with an OGC Web Notification Service. Users can decide which datasets and parameters are monitored with a given filter expression (e.g., precipitation value higher than x millimeters per day, occurrence of a MODIS fire point, detection of a time series anomaly). 
Datasets integrated in the SOS service are
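The filter-expression alerting described above can be illustrated with a minimal sketch. The function name, data layout and the 50 mm threshold are invented for the example and are not part of the EOM API.

```python
# Hedged sketch of EOM-style alerting: a user-defined filter expression
# (here, "daily precipitation above a threshold in mm") is evaluated against
# incoming station values, and matching observations trigger alerts.

def check_alerts(observations, threshold_mm=50.0):
    """Return (date, value) pairs whose daily precipitation exceeds the threshold."""
    return [(date, value) for date, value in observations if value > threshold_mm]

daily_precip = [("2014-06-01", 3.2), ("2014-06-02", 61.5), ("2014-06-03", 12.0)]
alerts = check_alerts(daily_precip)
print(alerts)  # one alert, on 2014-06-02
```

A real deployment would evaluate such predicates inside the notification service rather than client-side, but the filtering logic is the same.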

  5. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    Science.gov (United States)

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel TL using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.
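The photogrammetric length measurement mentioned above reduces, at its core, to converting a pixel measurement into a physical length via a calibrated scale. The sketch below is illustrative only (the function names and numbers are invented, not the authors' workflow), assuming a reference object of known size in the camera's field of view.

```python
# Hypothetical sketch of photogrammetric total-length (TL) measurement:
# calibrate mm-per-pixel from a reference object, then convert an eel's
# measured pixel length to TL. Values are toy numbers for illustration.

def calibrate_scale(ref_length_mm, ref_length_px):
    """Millimetres per pixel, from a reference object of known size."""
    return ref_length_mm / ref_length_px

def total_length_mm(eel_length_px, mm_per_px):
    """Convert a digitised eel length in pixels to TL in millimetres."""
    return eel_length_px * mm_per_px

mm_per_px = calibrate_scale(ref_length_mm=100.0, ref_length_px=250.0)  # 0.4 mm/px
print(total_length_mm(900.0, mm_per_px))  # 360.0 mm
```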

  6. Integrated safeguards and security for a highly automated process

    International Nuclear Information System (INIS)

    Zack, N.R.; Hunteman, W.J.; Jaeger, C.D.

    1993-01-01

    Before the cancellation of the New Production Reactor Programs for the production of tritium, the reactors and associated processing were being designed to contain some of the most highly automated and remote systems conceived for a Department of Energy facility. Integrating safety, security, materials control and accountability (MC and A), and process systems at the proposed facilities would enhance the overall information and protection-in-depth available. Remote, automated fuel handling and assembly/disassembly techniques would deny access to the nuclear materials while upholding ALARA principles but would also require the full integration of all data/information systems. Such systems would greatly enhance MC and A as well as facilitate materials tracking. Physical protection systems would be connected with materials control features to cross check activities and help detect and resolve anomalies. This paper will discuss the results of a study of the safeguards and security benefits achieved from a highly automated and integrated remote nuclear facility and the impacts that such systems have on safeguards and computer and information security

  7. A combination of HPLC and automated data analysis for monitoring the efficiency of high-pressure homogenization.

    Science.gov (United States)

    Eggenreich, Britta; Rajamanickam, Vignesh; Wurm, David Johannes; Fricke, Jens; Herwig, Christoph; Spadiut, Oliver

    2017-08-01

    Cell disruption is a key unit operation to make valuable, intracellular target products accessible for further downstream unit operations. Independent of the applied cell disruption method, each cell disruption process must be evaluated with respect to disruption efficiency and potential product loss. Current state-of-the-art methods, like measuring the total amount of released protein and plating-out assays, are usually time-delayed and involve manual intervention, making them error-prone. An automated method to monitor cell disruption efficiency at-line is not available to date. In the current study we implemented a methodology, which we had originally developed to monitor E. coli cell integrity during bioreactor cultivations, to automatically monitor and evaluate cell disruption of a recombinant E. coli strain by high-pressure homogenization. We compared our tool with a library of state-of-the-art methods, analyzed the effect of freezing the biomass before high-pressure homogenization and finally investigated this unit operation in more detail by a multivariate approach. The combination of HPLC and automated data analysis constitutes a valuable, novel tool to monitor and evaluate cell disruption processes: it can be used in both upstream (USP) and downstream processing (DSP), can be implemented at-line, gives results within minutes of sampling and needs no manual intervention.
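The kind of disruption-efficiency evaluation the tool automates can be sketched as follows. This is a hedged illustration, not the published method: it assumes an HPLC peak area proportional to the amount of intact cells, and all names and numbers are invented.

```python
# Hypothetical disruption-efficiency calculation: if an HPLC peak area is
# proportional to the amount of intact cells, the fraction disrupted by
# high-pressure homogenization follows from the areas before and after.

def disruption_efficiency(peak_area_before, peak_area_after):
    """Fraction of cells disrupted, from intact-cell peak areas."""
    if peak_area_before <= 0:
        raise ValueError("reference peak area must be positive")
    return 1.0 - peak_area_after / peak_area_before

print(disruption_efficiency(1200.0, 180.0))  # 0.85
```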

  8. Text mining from ontology learning to automated text processing applications

    CERN Document Server

    Biemann, Chris

    2014-01-01

    This book comprises a set of articles that specify the methodology of text mining, describe the creation of lexical resources in the framework of text mining and use text mining for various tasks in natural language processing (NLP). The analysis of large amounts of textual data is a prerequisite to build lexical resources such as dictionaries and ontologies and also has direct applications in automated text processing in fields such as history, healthcare and mobile applications, just to name a few. This volume gives an update in terms of the recent gains in text mining methods and reflects

  9. An automated platform for phytoplankton ecology and aquatic ecosystem monitoring

    NARCIS (Netherlands)

    Pomati, F.; Jokela, J.; Simona, M.; Veronesi, M.; Ibelings, B.W.

    2011-01-01

    High quality monitoring data are vital for tracking and understanding the causes of ecosystem change. We present a potentially powerful approach for phytoplankton and aquatic ecosystem monitoring, based on integration of scanning flow-cytometry for the characterization and counting of algal cells

  10. Automated electrohysterographic detection of uterine contractions for monitoring of pregnancy: feasibility and prospects.

    Science.gov (United States)

    Muszynski, C; Happillon, T; Azudin, K; Tylcz, J-B; Istrate, D; Marque, C

    2018-05-08

    Preterm birth is a major public health problem in developed countries. In this context, we have conducted research into outpatient monitoring of uterine electrical activity in women at risk of preterm delivery. The objective of this preliminary study was to perform automated detection of uterine contractions (without human intervention or tocographic signal, TOCO) by processing the EHG recorded on the abdomen of pregnant women. The feasibility and accuracy of uterine contraction detection based on EHG processing were tested and compared to expert decision using external tocodynamometry (TOCO). The study protocol was approved by local Ethics Committees under numbers ID-RCB 2016-A00663-48 for France and VSN 02-0006-V2 for Iceland. Two populations of women were included (threatened preterm birth and labour) in order to test our system of recognition of the various types of uterine contractions. EHG signal acquisition was performed according to a standardized protocol to ensure optimal reproducibility of EHG recordings. A system of 18 Ag/AgCl surface electrodes was used, with 16 recording electrodes placed between the woman's pubis and umbilicus according to a 4 × 4 matrix. TOCO was recorded simultaneously with EHG recording. EHG signals were analysed in real-time by calculation of the nonlinear correlation coefficient H2. A curve representing the number of correlated pairs of signals according to the value of H2 calculated between bipolar signals was then plotted. High values of H2 indicated the presence of an event that may correspond to a contraction. Two tests were performed after detection of an event (fusion and elimination of certain events) in order to increase the contraction detection rate. The EHG database contained 51 recordings from pregnant women, with a total of 501 contractions previously labelled by analysis of the corresponding tocographic recording. The percentage of recognitions obtained by application of the method based on coefficient H2 was
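The detection principle described above (flagging windows in which many channel pairs are strongly correlated) can be sketched as follows. This is an illustrative stand-in, not the authors' implementation: ordinary Pearson r replaces the nonlinear coefficient H2, and the signals and threshold are invented.

```python
# Hedged sketch: compute a correlation coefficient between every pair of
# EHG channels in a window and count the pairs whose coefficient exceeds a
# threshold; a high count suggests an event (possible contraction).
from itertools import combinations
from math import sqrt

def pearson(a, b):
    """Plain Pearson r (stand-in for the nonlinear H2 of the paper)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / sqrt(var_a * var_b)

def correlated_pairs(channels, threshold=0.8):
    """Number of channel pairs with |r| above threshold in this window."""
    return sum(1 for a, b in combinations(channels, 2)
               if abs(pearson(a, b)) > threshold)

s1 = [0.1, 0.5, 0.9, 0.4, 0.2]
s2 = [0.2, 0.6, 1.0, 0.5, 0.3]   # tracks s1, so highly correlated
s3 = [0.9, 0.1, 0.8, 0.2, 0.7]   # unrelated channel
print(correlated_pairs([s1, s2, s3]))  # 1 correlated pair (s1, s2)
```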

  11. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    International Nuclear Information System (INIS)

    Kazarov, A; Miotto, G Lehmann; Magnoni, L

    2012-01-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data-taking runs, streams of messages sent by applications via the message reporting system, together with data published by applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (at an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims to reduce manpower needs and to assure consistently high-quality problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines: it leverages an Event-Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for the correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for the correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. 
All components work in a loosely coupled, event-based architecture, with a message broker
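A tiny sketch can convey the kind of time-windowed aggregation a CEP engine performs: a single message is unremarkable, but a burst of errors from one application within a short window is a pattern worth alerting on. Everything below (the detector shape, the 10 s window, the threshold) is invented for illustration and is not the AAL code.

```python
# Hedged sketch of windowed event correlation: keep recent ERROR messages in
# a deque, evict those older than the window, and raise an alert when one
# application exceeds a rate threshold within the window.
from collections import deque

def make_detector(window_s=10.0, max_errors=3):
    events = deque()  # (timestamp, app) of recent ERROR messages

    def on_error(timestamp, app):
        events.append((timestamp, app))
        # drop events that fell out of the sliding window
        while events and events[0][0] < timestamp - window_s:
            events.popleft()
        count = sum(1 for _, a in events if a == app)
        return count > max_errors  # True means: raise an operator alert

    return on_error

detect = make_detector()
alerts = [detect(t, "ROSApp") for t in (0.0, 1.0, 2.0, 3.0, 4.0)]
print(alerts)  # only the later events push the rate over the threshold
```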

  12. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    Science.gov (United States)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data-taking runs, streams of messages sent by applications via the message reporting system, together with data published by applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (at an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims to reduce manpower needs and to assure consistently high-quality problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines: it leverages an Event-Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for the correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for the correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. 
All components work in a loosely coupled, event-based architecture, with a message broker

  13. AUTOMATED PROCESSING OF DAIRY PRODUCT MICROPHOTOS USING IMAGEJ AND STATISTICA

    Directory of Open Access Journals (Sweden)

    V. K. Bitiukov

    2014-01-01

    The article discusses the construction of algorithms for automated processing of microphotos of dairy products. Such automated processing is relevant in studies of the degree of homogenization, since the microphotos contain information about the distribution of fat globules across mass fractions. Several existing software products offer image processing and relieve researchers of routine manual data processing, but they need to be adapted to microphotos of dairy products. In this paper we propose to use the ImageJ application package to process image files taken with a digital microscope, and the Statistica software package to calculate the statistical characteristics. The processing algorithm consists of successive stages: conversion to grey scale, scaling, filtering, binarization, object recognition and statistical processing of the recognition results. The output of the implemented algorithm is the distribution function of the fat globules by volume or mass fraction, together with the statistical parameters of the distribution (mathematical expectation, variance, skewness and kurtosis coefficients). To verify and debug the algorithm, experimental studies were carried out: farm milk was homogenized at different pressures, microphotos were made of each sample, and the images were processed in accordance with the proposed algorithm. The studies showed the effectiveness and feasibility of the proposed algorithm, implemented as an ImageJ script that exports the data to a file for the Statistica package.
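The binarization, object recognition and statistics stages described above can be sketched in a few lines of plain Python (instead of ImageJ and Statistica). The threshold, the toy image and the treatment of each connected blob as a "fat globule" are illustrative assumptions.

```python
# Hedged sketch of the processing chain: greyscale image -> binarisation ->
# object recognition (connected-component labelling) -> area statistics.

def binarise(img, threshold=128):
    """Foreground = pixels brighter than the threshold."""
    return [[1 if px > threshold else 0 for px in row] for row in img]

def label_areas(binary):
    """Areas of 4-connected foreground regions (one flood fill per blob)."""
    h, w = len(binary), len(binary[0])
    seen, areas = set(), []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and (y, x) not in seen:
                stack, area = [(y, x)], 0
                seen.add((y, x))
                while stack:
                    cy, cx = stack.pop()
                    area += 1
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny][nx] and (ny, nx) not in seen:
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                areas.append(area)
    return areas

img = [[0, 200, 0, 0],
       [0, 200, 0, 180],
       [0, 0, 0, 180],
       [150, 0, 0, 180]]
areas = label_areas(binarise(img))
print(sorted(areas))  # [1, 2, 3]: three "globules" of areas 1, 2 and 3
```

From the list of areas, the distribution function and its moments (mean, variance, skewness, kurtosis) follow by standard statistics.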

  14. CONCEPT AND STRUCTURE OF AUTOMATED SYSTEM FOR MONITORING STUDENT LEARNING QUALITY

    Directory of Open Access Journals (Sweden)

    M. Yu. Kataev

    2017-01-01

    organization and management of the learning process in a higher educational institution. The factors that affect the level of knowledge students obtain during training are shown, and on this basis the determining factors in assessing the level of knowledge are highlighted. It is proposed to manage individual training over any time interval on the basis of a generalized criterion combining a student's current progress, activity and time spent on training. The block structure of the automated program system for continuous monitoring of each student's achievements is described. All functional blocks of the system are interconnected with the educational process. The main advantage of this system is that students have continuous access to materials about their own individual achievements and mistakes; from passive consumers of information they turn into active participants in their education, and can thus achieve greater effectiveness in personal vocational training. It is pointed out that the information base of such a system should be available not only to students and teachers, but also to future employers of university graduates. Practical significance. The concept of an automated system for monitoring education results and the technique for processing the collected material presented in the article are based on a simple and obvious circumstance: a student with high progress spends more time on training and leads a more active lifestyle than fellow students, and therefore will, with high probability, be more successful in the chosen profession. Complete, fully detailed and digitized information on the individual educational achievements of a future specialist is thus necessary not only for effective management of the educational process in higher education institutions, but also for employers interested in well-prepared, qualified and hard-working staff ready to take responsibility for their duties.
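A generalized criterion of the kind described above can be sketched as a weighted combination of the three factors. The weights and the normalisation of time spent are illustrative assumptions; the article does not prescribe specific values.

```python
# Hedged sketch of a generalized learning criterion combining current
# progress, activity and time spent on training into one score per student.

def generalized_criterion(progress, activity, time_spent_h,
                          w=(0.5, 0.3, 0.2), max_time_h=100.0):
    """Weighted combination of progress [0..1], activity [0..1] and time."""
    time_norm = min(time_spent_h / max_time_h, 1.0)  # cap at 1.0
    return w[0] * progress + w[1] * activity + w[2] * time_norm

print(round(generalized_criterion(0.9, 0.7, 80.0), 3))  # 0.82
```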

  15. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  16. Integrated Monitoring System of Production Processes

    Directory of Open Access Journals (Sweden)

    Oborski Przemysław

    2016-12-01

    An integrated monitoring system for discrete manufacturing processes is presented in the paper. A multilayer hardware and software reference model was developed. The original research answers industry's need for integration of the information flow in the production process. The reference model corresponds with a proposed data model, based on a multilayer data tree, that describes orders, products and processes and stores monitoring data. The elaborated models were implemented in the integrated monitoring system demonstrator developed in the project. It was built on multiagent technology to assure high flexibility and openness to applying intelligent algorithms for data processing. On the basis of the experience gained, an applied integrated monitoring system for a real production system is currently being developed. The article presents the main problems of monitoring integration, including the specificity of discrete production, data processing and the future application of Cyber-Physical Systems. The development of manufacturing systems increasingly relies on applying intelligent solutions to machine and production process control and monitoring. Connecting technical systems, machine tools and manufacturing process monitoring with advanced information processing is one of the most important areas of near-future development, and it will play an important role in the efficient operation and competitiveness of the whole production system. It is also an important area for the future application of Cyber-Physical Systems, which can radically improve the functionality of monitoring systems and reduce the cost of their implementation.
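The multilayer data tree mentioned above links orders to products, products to process steps, and process steps to monitoring data. A minimal sketch of such a structure follows; the class and field names are illustrative assumptions, not the paper's schema.

```python
# Hedged sketch of a multilayer data tree: orders contain products, products
# contain process steps, and monitoring data hangs off the process steps.
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    name: str
    monitoring_data: list = field(default_factory=list)  # e.g. sensor samples

@dataclass
class Product:
    name: str
    processes: list = field(default_factory=list)

@dataclass
class Order:
    order_id: str
    products: list = field(default_factory=list)

step = ProcessStep("milling", monitoring_data=[("spindle_load", 0.63)])
order = Order("ORD-001", products=[Product("housing", processes=[step])])
print(order.products[0].processes[0].monitoring_data[0])
```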

  17. Expanding the functional significance of automated control systems for the production process at hydroelectric plants

    International Nuclear Information System (INIS)

    Vasil'ev, Yu.S.; Kononova, M.Yu.

    1993-01-01

    Automated control systems for the production process (ACS PP) have been successfully implemented in a number of hydroelectric plants in the Russian Federation. The range of problems that can be solved using ACS PP can be conditionally divided into two classes: on-line/technological control and production-technological control. This article describes successes and future directions for the solution of these two classes of problems. From the discussion, it is concluded (1) that the database of an existing ACS PP at a hydroelectric plant can be successfully employed as a point for monitoring the conservation of an environment of local significance; (2) that it is expedient to discuss the problem with organizations, including local control groups, interested in the development of territorial-basin systems for ecological monitoring; and (3) that the initiative in creating local territorial-basin support points for monitoring should emanate from guidelines for hydroelectric plants with ACS PP. 3 refs., 2 figs

  18. Automated Performance Monitoring and Assessment for DCS Digital Systems

    Science.gov (United States)

    1980-07-01


  19. Westinghouse integrated cementation facility. Smart process automation minimizing secondary waste

    International Nuclear Information System (INIS)

    Fehrmann, H.; Jacobs, T.; Aign, J.

    2015-01-01

    The Westinghouse Cementation Facility described in this paper is an example of a typical standardized turnkey project in the area of waste management. The facility is able to handle NPP waste such as evaporator concentrates, spent resins and filter cartridges. The facility scope covers all equipment required for a fully integrated system, including the auxiliary equipment for the hydraulic, pneumatic and electric control systems. The control system is based on current PLC technology and the process is highly automated. The equipment is designed to be operated remotely, under radiation exposure conditions. Four cementation facilities have been built for new CPR-1000 nuclear power stations in China

  20. Starting the automation process by using group technology

    Directory of Open Access Journals (Sweden)

    Jorge Andrés García Barbosa

    2004-09-01

    This article describes the start-up of an automation process based on applying group technology (GT). Mecanizados CNC, a company making metallurgical-sector products, bases the layout (organisation and disposition) of its machinery on the concept of manufacturing cells; production is programmed once the best location for the equipment has been determined. The order of making products and the suitable setting up of tools for the machinery in the cells are established, aimed at minimising set-up time, which led to a 15% improvement in productivity.

  1. Electrical - light current remote monitoring, control and automation. [Coal mine, United Kingdom

    Energy Technology Data Exchange (ETDEWEB)

    Collingwood, C H

    1981-06-01

    A brief discussion is given of the application of control, monitoring and automation techniques to coal mining in the United Kingdom, especially the use of microprocessors, for the purpose of enhancing safety and productivity. Lighting systems for coal mines are similarly discussed.

  2. Automated Pre-processing for NMR Assignments with Reduced Tedium

    Energy Technology Data Exchange (ETDEWEB)

    2004-05-11

    An important rate-limiting step in the resonance assignment process is accurate identification of resonance peaks in NMR spectra. NMR spectra are noisy. Hence, automatic peak-picking programs must navigate between the Scylla of reliable but incomplete picking and the Charybdis of noisy but complete picking. Each of these extremes complicates the assignment process: incomplete peak-picking results in the loss of essential connectivities, while noisy picking conceals the true connectivities under a combinatorial explosion of false positives. Intermediate processing can simplify the assignment process by preferentially removing false peaks from noisy peak lists. This is accomplished by requiring consensus between multiple NMR experiments, exploiting a priori information about NMR spectra, and drawing on empirical statistical distributions of chemical shifts extracted from the BioMagResBank. Experienced NMR practitioners currently apply many of these techniques "by hand", which is tedious and may appear arbitrary to the novice. To increase efficiency, we have created a systematic and automated approach to this process, known as APART. Automated pre-processing has three main advantages: reduced tedium, standardization, and pedagogy. In the hands of experienced spectroscopists, the main advantage is reduced tedium (a rapid increase in the ratio of true peaks to false peaks with minimal effort). When a project is passed from hand to hand, the main advantage is standardization: APART automatically documents the peak filtering process by archiving its original recommendations, the accompanying justifications, and whether a user accepted or overrode a given filtering recommendation. In the hands of a novice, this tool can reduce the stumbling block of learning to differentiate between real peaks and noise by providing real-time examples of how such decisions are made.
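The consensus-filtering idea described above (keep a peak only if another experiment confirms it) can be sketched in a few lines. This is a hedged illustration, not APART itself: the tolerance and peak lists are invented, and real spectra are multidimensional rather than single shift values.

```python
# Hedged sketch of consensus peak filtering: a peak from one experiment is
# kept only if a peak in a second experiment matches it within a
# chemical-shift tolerance, preferentially removing false positives.

def consensus_peaks(peaks_a, peaks_b, tol=0.05):
    """Keep peaks from list A (shifts in ppm) confirmed by a peak in list B."""
    return [p for p in peaks_a if any(abs(p - q) <= tol for q in peaks_b)]

exp1 = [8.21, 7.95, 4.60, 1.31]      # candidate peaks from one spectrum
exp2 = [8.19, 4.58, 3.70]            # peaks from a second experiment
print(consensus_peaks(exp1, exp2))   # 7.95 and 1.31 are rejected as noise
```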

  3. RAPID AUTOMATED RADIOCHEMICAL ANALYZER FOR DETERMINATION OF TARGETED RADIONUCLIDES IN NUCLEAR PROCESS STREAMS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Egorov, Oleg; Devol, Timothy A.

    2008-01-01

    Some industrial process-scale plants require the monitoring of specific radionuclides as an indication of the composition of their feed streams or as indicators of plant performance. In this process environment, radiochemical measurements must be fast, accurate, and reliable. Manual sampling, sample preparation, and analysis of process fluids are highly precise and accurate, but tend to be expensive and slow. Scientists at Pacific Northwest National Laboratory (PNNL) have assembled and characterized a fully automated prototype Process Monitor instrument which was originally designed to rapidly measure Tc-99 in the effluent streams of the Waste Treatment Plant at Hanford, WA. The system is capable of a variety of tasks: extraction of a precise volume of sample, sample digestion/analyte redox adjustment, column-based chemical separations, flow-through radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results can be immediately calculated and electronically reported. It is capable of performing a complete analytical cycle in less than 15 minutes. The system is highly modular and can be adapted to a variety of sample types and analytical requirements. It exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs

  4. Vibration based structural health monitoring of an arch bridge: From automated OMA to damage detection

    Science.gov (United States)

    Magalhães, F.; Cunha, A.; Caetano, E.

    2012-04-01

    In order to evaluate the usefulness of approaches based on modal parameter tracking for structural health monitoring of bridges, a dynamic monitoring system was installed in September 2007 in a concrete arch bridge in the city of Porto, Portugal. The implementation of algorithms to perform continuous on-line identification of modal parameters based on structural responses to ambient excitation (automated Operational Modal Analysis) has made it possible to create a very complete database with the time evolution of the bridge's modal characteristics over more than two years. This paper describes the strategy followed to minimize the effects of environmental and operational factors on the bridge's natural frequencies, enabling, in a subsequent stage, the identification of structural anomalies. Alternative static and dynamic regression models are tested and complemented by a Principal Components Analysis. Afterwards, damage identification is attempted with control charts. In the end, it is demonstrated that the adopted processing methodology permits the detection of realistic damage scenarios, associated with frequency shifts of around 0.2%, which were simulated with a numerical model.
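The normalisation step described above can be sketched as follows: fit a regression of natural frequency on an environmental variable over a baseline period, then watch the residuals with a control chart. The linear model, temperature as the sole regressor, and the control limit are illustrative assumptions; the paper compares several static and dynamic models.

```python
# Hedged sketch: remove the temperature dependence of a natural frequency
# with ordinary least squares, then flag observations whose residual falls
# outside a simple control limit (a stand-in for a control chart).

def fit_line(x, y):
    """Ordinary least squares fit y = a + b*x, returning (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# baseline period: frequency rises slightly with temperature (toy values)
temps = [5.0, 10.0, 15.0, 20.0, 25.0]
freqs = [0.810, 0.812, 0.814, 0.816, 0.818]
a, b = fit_line(temps, freqs)

def residual(temp, freq):
    """Frequency residual after removing the fitted temperature effect."""
    return freq - (a + b * temp)

# a later observation: same temperature, frequency ~0.2% lower
r = residual(15.0, 0.814 * 0.998)
print(abs(r) > 0.001)  # True: outside an illustrative +/-0.001 Hz limit
```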

  5. Monitoring and control of fine abrasive finishing processes

    DEFF Research Database (Denmark)

    Lazarev, Ruslan

    In engineering, surfaces with specified functional properties are in high demand in various applications, and the desired surface finish can be obtained using several methods. Abrasive finishing is one of the most important processes in the manufacturing of mould and die tools: it is a principal method to remove unwanted material and obtain the desired geometry, surface quality and surface functional properties. The automation and computerization of finishing processes involves the utilisation of robots, specialized machines with several degrees of freedom, sensors and data acquisition systems. The focus of this work was to investigate foundations for process monitoring and control methods applied to a semi-automated polishing machine based on an industrial robot. The monitoring system was built on an NI data acquisition system with two sensors, an acoustic emission sensor and an accelerometer. The acquired sensory data was segmented using discretization methods. The applied methodology was proposed for implementation as an on-line system and is considered to be part of the next generation of the STRECON NanoRAP machine.
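One common way such sensory streams are prepared for monitoring is to cut the signal into fixed windows and reduce each window to a feature. The sketch below uses windowed RMS as an example feature; the window size and the signal are invented, and the thesis's actual discretization method may differ.

```python
# Hedged sketch of sensor-signal segmentation for process monitoring: the
# acoustic-emission stream is split into non-overlapping windows, each
# reduced to its RMS value, yielding a compact feature sequence.
from math import sqrt

def windowed_rms(signal, window=4):
    """RMS value of each non-overlapping window of the signal."""
    return [sqrt(sum(x * x for x in signal[i:i + window]) / window)
            for i in range(0, len(signal) - window + 1, window)]

ae = [0.1, -0.1, 0.1, -0.1, 0.9, -0.9, 0.9, -0.9]  # quiet, then intense contact
print(windowed_rms(ae))  # roughly [0.1, 0.9]
```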

  6. Organization of film data processing in the PPI-SA automated system

    International Nuclear Information System (INIS)

    Ovsov, Yu.V.; Perekatov, V.G.

    1984-01-01

    The organization of processing nuclear interaction images at PUOS-type standard devices using the PPI-SA automated system is considered. The system is made in the form of a complete module comprising two scanning measuring projectors and a scanning automatic device, which operate in real time, on-line with the BESM-4 computer. The system comprises: a subsystem for photographic film scanning, selection of events for measurement and preliminary encoding; a subsystem for the formation and generation of libraries with the data required for monitoring the scanning automatic device; and a subsystem for precision measurement of separate coordinates on photo images of nuclear particle tracks and of ionization losses. The system software comprises monitoring programs for the projectors and the scanning automatic device, as well as test functional control programs and an operating system. The programs are organized on a modular concept: by changing the module set, the system can be modified and adapted for image processing in different fields of science and technology

  7. Automated Grid Monitoring for LHCb through HammerCloud

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  8. Automated analysis of PET based in-vivo monitoring in ion beam therapy

    International Nuclear Information System (INIS)

    Kuess, P.

    2014-01-01

    Particle Therapy (PT)-PET is currently the only clinically approved in-vivo method for monitoring PT. Due to fragmentation processes in the patient's tissue and the beam projectiles, a beta-plus activity distribution (BAD) can be measured during or shortly after the irradiation. The recorded activity map cannot be compared directly to the planned dose distribution. However, by means of a Monte Carlo (MC) simulation it is possible to predict the measured BAD from a treatment plan (TP). Thus, to verify a patient's treatment fraction, the actual PET measurement can be compared to the respective BAD prediction. This comparison is currently performed by visual inspection, which requires experienced evaluators and is rather time consuming. In this PhD thesis an evaluation tool is presented to compare BADs in an automated and objective way. The evaluation method was based on Pearson's correlation coefficient (PCC), an established measure in medical image processing, which was coded into a software tool. The patient data used to develop, test and validate the software tool were acquired at the GSI research facility, where over 400 patient treatments with 12C were monitored by means of an in-beam PET prototype. The number of data sets was increased by artificially altering BADs to simulate different beam ranges. The automated detection tool was tested on head and neck (H&N), prostate, lung, and brain cases. The treatment planning system TRiP98 was used to generate carbon ion TPs for all cases, and from these TPs the respective BAD predictions were derived. Besides the detection of range deviations by means of PT-PET, the automated detection of patient setup uncertainties was also investigated. Although all measured patient data were recorded during the irradiation (in-beam), scenarios performing PET scans shortly after the irradiation (in-room) were also considered. To analyze the achievable precision of PT-PET with the automated evaluation tool based on
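The PCC comparison of a measured and a predicted activity distribution can be sketched in a few lines. The depth-activity profiles below are hypothetical Gaussian shapes invented for illustration, not data from the thesis; a real BAD would be a 3-D activity map, flattened the same way.

```python
import numpy as np

def pcc(measured, predicted):
    """Pearson's correlation coefficient between two beta+ activity
    distributions, flattened to 1-D as in image-similarity use."""
    a = np.asarray(measured, dtype=float).ravel()
    b = np.asarray(predicted, dtype=float).ravel()
    return float(np.corrcoef(a, b)[0, 1])

# Illustrative 1-D depth-activity profiles (hypothetical): a Gaussian
# "prediction" and a measurement shifted by a simulated range deviation.
z = np.linspace(0.0, 100.0, 200)                      # depth in mm
predicted = np.exp(-(z - 60.0) ** 2 / 50.0)
measured_ok = np.exp(-(z - 60.0) ** 2 / 50.0)
measured_shifted = np.exp(-(z - 65.0) ** 2 / 50.0)    # 5 mm range shift

print(pcc(measured_ok, predicted))       # ~1.0: fraction delivered as planned
print(pcc(measured_shifted, predicted))  # lower: flags a range deviation
```

A low PCC alone does not localize the deviation, which is why the thesis also studies setup uncertainties separately.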

  9. Automated speech quality monitoring tool based on perceptual evaluation

    OpenAIRE

    Vozňák, Miroslav; Rozhon, Jan

    2010-01-01

    The paper deals with a speech quality monitoring tool which we have developed in accordance with PESQ (Perceptual Evaluation of Speech Quality) and which runs automatically and calculates the MOS (Mean Opinion Score). Results are stored in a database and used in a research project investigating how meteorological conditions influence speech quality in a GSM network. The meteorological station, which is located on our university campus, provides information about a temperature,...
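As a sketch of how such automatically calculated MOS values might be consumed downstream (an illustration, not the authors' tool), a rolling mean can flag degraded periods; the window size, floor, and scores below are invented:

```python
from statistics import mean

WINDOW, FLOOR = 5, 3.0   # hypothetical smoothing window and quality floor

def degraded(mos_series):
    """Return one flag per full window: True when the rolling mean of
    PESQ MOS scores drops below the chosen floor."""
    out = []
    for i in range(WINDOW - 1, len(mos_series)):
        window = mos_series[i - WINDOW + 1 : i + 1]
        out.append(mean(window) < FLOOR)
    return out

scores = [4.2, 4.1, 4.0, 3.9, 4.0, 2.1, 2.0, 2.2, 2.3, 2.1]
print(degraded(scores))   # flips to True once low scores dominate the window
```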

  10. A Comparative Experimental Study on the Use of Machine Learning Approaches for Automated Valve Monitoring Based on Acoustic Emission Parameters

    Science.gov (United States)

    Ali, Salah M.; Hui, K. H.; Hee, L. M.; Salman Leong, M.; Al-Obaidi, M. A.; Ali, Y. H.; Abdelrhman, Ahmed M.

    2018-03-01

    Acoustic emission (AE) analysis has become a vital tool for initiating maintenance tasks in many industries. However, the analysis process and its interpretation have been found to be highly dependent on experts. Therefore, an automated monitoring method is required to reduce the cost and time consumed in interpreting the AE signal. This paper investigates the application of two of the most common machine learning approaches, namely the artificial neural network (ANN) and the support vector machine (SVM), to automate the diagnosis of valve faults in reciprocating compressors based on AE signal parameters. Since accuracy is an essential factor in any automated diagnostic system, this paper also provides a comparative study based on the predictive performance of ANN and SVM. AE parameter data were acquired from a single-stage reciprocating air compressor with different operational and valve conditions. ANN and SVM diagnosis models were subsequently devised by combining AE parameters of different conditions. Results demonstrate that the ANN and SVM models achieve the same prediction accuracy. However, the SVM model is recommended for automated diagnosis of valve condition due to its ability to handle a large number of input features with small sample data sets.
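The ANN-versus-SVM comparison can be illustrated with a minimal pure-NumPy sketch on synthetic AE-parameter data. Everything here is invented for illustration (the feature set, class means, and training scheme); the paper used full ANN and SVM implementations, whereas this sketch stands in a single logistic unit and a linear hinge-loss classifier trained the same way, then compares their accuracies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for AE parameters (amplitude, RMS, count) under two
# valve conditions -- hypothetical data, not from the paper.
n = 200
healthy = rng.normal([1.0, 0.5, 10.0], 0.3, size=(n, 3))
faulty = rng.normal([2.0, 1.2, 18.0], 0.3, size=(n, 3))
X = np.vstack([healthy, faulty])
y = np.hstack([-np.ones(n), np.ones(n)])          # -1 healthy, +1 faulty

# Standardize features, append a bias column.
X = (X - X.mean(0)) / X.std(0)
X = np.hstack([X, np.ones((2 * n, 1))])

def train(X, y, loss_grad, epochs=200, lr=0.05):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        w -= lr * loss_grad(w, X, y)
    return w

# "ANN" stand-in: single logistic unit (gradient of the log-loss).
def grad_logistic(w, X, y):
    t = (y + 1) / 2                                # targets in {0, 1}
    p = 1 / (1 + np.exp(-X @ w))
    return X.T @ (p - t) / len(y)

# Linear SVM: sub-gradient of the hinge loss with a small L2 penalty.
def grad_hinge(w, X, y, lam=1e-3):
    margin = y * (X @ w)
    mask = margin < 1
    return lam * w - X[mask].T @ y[mask] / len(y)

acc = {}
for name, g in [("ann-like", grad_logistic), ("svm-like", grad_hinge)]:
    w = train(X, y, g)
    acc[name] = float(np.mean(np.sign(X @ w) == y))

print(acc)   # both classifiers separate this well-spaced synthetic data
```

On real AE data the two would be compared on held-out samples rather than training accuracy.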

  11. Long-term monitoring of soil gas fluxes with closed chambers using automated and manual systems

    Energy Technology Data Exchange (ETDEWEB)

    Scott, A.; Crichton, I.; Ball, B.C.

    1999-10-01

    The authors describe two gas sample collection techniques, each of which is used in conjunction with custom-made automated or manually operated closed chambers. The automated system allows automatic collection of gas samples for simultaneous analysis of multiple trace gas effluxes from soils, permitting long-term monitoring. Since the manual system is cheaper to produce, it can be replicated more extensively than the automated one and used to estimate the spatial variability of soil fluxes. The automated chamber covers a soil area of 0.5 m{sup 2} and has a motor-driven lid that remains operational throughout a range of weather conditions. Both systems use gas-tight containers of robust metal construction, which give good sample retention, thereby allowing long-term storage and convenient transport from remote locations. The containers in the automated system are filled by pumping gas from the closed chamber via a multiway rotary valve. Stored samples from both systems are analyzed simultaneously for N{sub 2}O and CO{sub 2} using automated injection into laboratory-based gas chromatographs. The use of both collection systems is illustrated by results from a field experiment on sewage sludge disposal to land where N{sub 2}O fluxes were high. The automated gas sampling system permitted quantification of the marked temporal variability of concurrent N{sub 2}O and CO{sub 2} fluxes and allowed improved estimation of cumulative fluxes. The automated measurement approach yielded higher estimates of cumulative flux because integration of manual point-in-time observations missed a number of transient high-flux events.
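The last point, that sparse manual sampling misses transient high-flux events, can be shown numerically. The flux series below is invented: a constant baseline with one short emission burst, integrated trapezoidally from hourly (automated) versus daily (manual) samples.

```python
import numpy as np

# Hypothetical N2O flux series over 10 days: low baseline plus one
# 6-hour high-flux event, sampled hourly by the automated system.
t = np.arange(0, 240)                              # hours
flux = np.full_like(t, 5.0, dtype=float)           # ug N2O-N m-2 h-1
flux[100:106] = 80.0                               # transient burst

def cumulative(t_h, f):
    """Trapezoidal integral of flux over time -> cumulative emission."""
    return float(np.sum((f[1:] + f[:-1]) / 2 * np.diff(t_h)))

auto = cumulative(t, flux)
manual_idx = np.arange(0, 240, 24)                 # one sample per day
manual = cumulative(t[manual_idx], flux[manual_idx])

print(auto, manual)   # the daily series misses the burst entirely
```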

  12. Automated monitoring: a potential solution for achieving sustainable improvement in hand hygiene practices.

    Science.gov (United States)

    Levchenko, Alexander I; Boscart, Veronique M; Fernie, Geoff R

    2014-08-01

    Adequate hand hygiene is often considered the most effective method of reducing the rates of hospital-acquired infections, which are one of the major causes of increased cost, morbidity, and mortality in healthcare. Electronic monitoring technologies provide a promising direction for achieving sustainable hand hygiene improvement by introducing elements of automated feedback and making it possible to automatically collect individual hand hygiene performance data. The results of multiphase testing of an automated hand hygiene reminding and monitoring system installed in a complex continuing care setting are presented. The study included a baseline Phase 1, with the system performing automated data collection only; a preintervention Phase 2 with the hand hygiene status indicator enabled; two intervention Phases 3 and 4 with the system generating hand hygiene reminder signals and periodic performance feedback sessions provided; and a postintervention Phase 5 with only the hand hygiene status indicator enabled and no feedback sessions provided. A significant increase in hand hygiene performance observed during the first intervention Phase 3 was sustained over the second intervention Phase 4, with the postintervention phase also indicating higher hand hygiene activity rates compared with the preintervention and baseline phases. The overall trends observed during the multiphase testing, the factors affecting acceptability of the automated hand hygiene monitoring system, and various strategies of technology deployment are discussed.

  13. [Automation and organization of technological process of urinalysis].

    Science.gov (United States)

    Kolenkin, S M; Kishkun, A A; Kol'chenko, O L

    2000-12-01

    Results of introducing into practice a working model of industrial laboratory technology, using the KONE Specific Supra and Miditron M analyzers, are shown with clinical urinalysis as an example. This technology helps standardize all stages and operations, improves the efficiency of quality control of laboratory studies, rationally organizes the work at all stages of the process, and creates a system for continuous improvement of the efficiency of investigations at the preanalytical, analytical, and postanalytical stages of the technological process of laboratory studies. As a result of introducing this technology into laboratory practice, violations of the quality criteria of clinical urinalysis decreased from 15 to 8% at the preanalytical stage and from 6 to 3% at the analytical stage. Automation of the analysis reduced reagent consumption 3-fold and improved productivity at the analytical stage 4-fold.

  14. Highly Automated Agile Testing Process: An Industrial Case Study

    Directory of Open Access Journals (Sweden)

    Jarosław Berłowski

    2016-09-01

    Full Text Available This paper presents a description of an agile testing process in a medium-size software project developed using Scrum. The research method used was a case study; surveys, quantifiable project data sources, and qualitative opinions of project members were used for data collection. Challenges related to the testing process regarding a complex project environment and unscheduled releases were identified. Based on the obtained results, we concluded that the described approach addresses the aforementioned issues well. Therefore, recommendations were made with regard to the employed principles of agility, specifically: continuous integration, responding to change, test automation and test-driven development. Furthermore, an efficient testing environment that combines a number of test frameworks (e.g. JUnit, Selenium, Jersey Test) with custom-developed simulators is presented.

  15. Automated micro fluidic system for PCR applications in the monitoring of drinking water quality

    International Nuclear Information System (INIS)

    Soria Soria, E.; Yanez Amoros, A.; Murtula Corbi, R.; Catalan Cuenca, V.; Martin-Cisneros, C. S.; Ymbern, O.; Alonso-Chamorro, J.

    2009-01-01

    Microbiological laboratories show a growing interest in automated, simple and user-friendly methodologies able to perform simultaneous analysis of a large number of samples. Analytical tools based on microfluidics could play an important role in this field. In this work, the development of an automated microfluidic system for PCR applications, aimed at the monitoring of drinking water quality, is presented. The device will be able to determine, simultaneously, fecal pollution indicators and water-transmitted pathogens. Furthermore, complemented with DNA pre-concentration and extraction modules, the device would present a highly integrated solution for microbiological diagnostic laboratories. (Author) 13 refs.

  16. Introducing 2D barcode on TLD cards - a step towards automation in personnel monitoring

    International Nuclear Information System (INIS)

    Ajoy, K.C.; Dhanasekaran, A.; Annalakshmi, O; Rajagopal, V.; Santhanam, R.; Jose, M.T.

    2018-01-01

    As part of personnel monitoring services, the TLD lab, RSD, IGCAR issues and receives large numbers of TLD cards every month for use by occupational workers belonging to various hot facilities at Kalpakkam. Since the work is manual, routine, labour intensive and prone to human error, introducing automation is necessary at the TLD lab as well as at the user facility. This requires identifying the individual components of the TLD and embedding them with unique identification for the system to accomplish the task. The paper discusses the automation aspects related to the TLD cards

  17. E-health, phase two: the imperative to integrate process automation with communication automation for large clinical reference laboratories.

    Science.gov (United States)

    White, L; Terner, C

    2001-01-01

    The initial efforts of e-health have fallen far short of expectations. They were buoyed by the hype and excitement of the Internet craze but limited by their lack of understanding of important market and environmental factors. E-health now recognizes that legacy systems and processes are important, that there is a technology adoption process that needs to be followed, and that demonstrable value drives adoption. Initial e-health transaction solutions have targeted mostly low-cost problems. These solutions invariably are difficult to integrate into existing systems, typically requiring manual interfacing to supported processes. This limitation in particular makes them unworkable for large volume providers. To meet the needs of these providers, e-health companies must rethink their approaches, appropriately applying technology to seamlessly integrate all steps into existing business functions. E-automation is a transaction technology that automates steps, integration of steps, and information communication demands, resulting in comprehensive automation of entire business functions. We applied e-automation to create a billing management solution for clinical reference laboratories. Large volume, onerous regulations, small margins, and only indirect access to patients challenge large laboratories' billing departments. Couple these problems with outmoded, largely manual systems and it becomes apparent why most laboratory billing departments are in crisis. Our approach has been to focus on the most significant and costly problems in billing: errors, compliance, and system maintenance and management. The core of the design relies on conditional processing, a "universal" communications interface, and ASP technologies. The result is comprehensive automation of all routine processes, driving out errors and costs. Additionally, compliance management and billing system support and management costs are dramatically reduced. 
The implications of e-automated processes can extend

  18. Automated Groundwater Monitoring of Uranium at the Hanford Site, Washington - 13116

    Energy Technology Data Exchange (ETDEWEB)

    Burge, Scott R. [Burge Environmental, Inc., 6100 South Maple Avenue, no. 114, Tempe, AZ, 85283 (United States); O'Hara, Matthew J. [Pacific Northwest National Laboratory, 902 Battelle Blvd., Richland, WA, 99352 (United States)

    2013-07-01

    An automated groundwater monitoring system for the detection of uranyl ion in groundwater was deployed at the 300 Area Industrial Complex, Hanford Site, Washington. The research was conducted to determine if at-site, automated monitoring of contaminant movement in the subsurface is a viable alternative to the baseline manual sampling and analytical laboratory assay methods currently employed. The monitoring system used Arsenazo III, a colorimetric chelating compound, for the detection of the uranyl ion. The analytical system had a limit of quantification of approximately 10 parts per billion (ppb, μg/L). The EPA's drinking water maximum contaminant level (MCL) is 30 ppb [1]. In addition to the uranyl ion assay, the system was capable of acquiring temperature, conductivity, and river level data. The system was fully automated and could be operated remotely. The system was capable of collecting water samples from four sampling sources, quantifying the uranyl ion, and periodically performing a calibration of the analytical cell. The system communications were accomplished by way of cellular data link with the information transmitted through the internet. Four water sample sources were selected for the investigation: one location provided samples of Columbia River water, and the remaining three sources provided groundwater from aquifer sampling tubes positioned in a vertical array at the Columbia River shoreline. The typical sampling schedule was to sample the four locations twice per day with one calibration check per day. This paper outlines the instrumentation employed, the operation of the instrumentation, and analytical results for a period of time between July and August, 2012. The presentation includes the uranyl ion concentration and conductivity results from the automated sampling/analysis system, along with a comparison between the automated monitor's analytical performance and an independent laboratory analysis. Benefits of using the automated

  19. Automated Groundwater Monitoring of Uranium at the Hanford Site, Washington - 13116

    International Nuclear Information System (INIS)

    Burge, Scott R.; O'Hara, Matthew J.

    2013-01-01

    An automated groundwater monitoring system for the detection of uranyl ion in groundwater was deployed at the 300 Area Industrial Complex, Hanford Site, Washington. The research was conducted to determine if at-site, automated monitoring of contaminant movement in the subsurface is a viable alternative to the baseline manual sampling and analytical laboratory assay methods currently employed. The monitoring system used Arsenazo III, a colorimetric chelating compound, for the detection of the uranyl ion. The analytical system had a limit of quantification of approximately 10 parts per billion (ppb, μg/L). The EPA's drinking water maximum contaminant level (MCL) is 30 ppb [1]. In addition to the uranyl ion assay, the system was capable of acquiring temperature, conductivity, and river level data. The system was fully automated and could be operated remotely. The system was capable of collecting water samples from four sampling sources, quantifying the uranyl ion, and periodically performing a calibration of the analytical cell. The system communications were accomplished by way of cellular data link with the information transmitted through the internet. Four water sample sources were selected for the investigation: one location provided samples of Columbia River water, and the remaining three sources provided groundwater from aquifer sampling tubes positioned in a vertical array at the Columbia River shoreline. The typical sampling schedule was to sample the four locations twice per day with one calibration check per day. This paper outlines the instrumentation employed, the operation of the instrumentation, and analytical results for a period of time between July and August, 2012. The presentation includes the uranyl ion concentration and conductivity results from the automated sampling/analysis system, along with a comparison between the automated monitor's analytical performance and an independent laboratory analysis. 
Benefits of using the automated system as an
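The quantification step in the two records above rests on a linear colorimetric calibration: absorbance of the Arsenazo III complex against uranyl standards, inverted to report concentration. The standard concentrations and absorbance readings below are invented for illustration; only the 30 ppb MCL comes from the text.

```python
import numpy as np

# Hypothetical Arsenazo III calibration: absorbance vs uranyl standards.
std_ppb = np.array([0.0, 10.0, 25.0, 50.0, 100.0])
std_abs = np.array([0.002, 0.041, 0.100, 0.201, 0.399])  # made-up readings

# Least-squares line A = m*C + b (Beer-Lambert linearity assumed).
m, b = np.polyfit(std_ppb, std_abs, 1)

def quantify(absorbance):
    """Invert the calibration to report uranyl concentration in ppb."""
    return float((absorbance - b) / m)

MCL = 30.0   # EPA drinking-water maximum contaminant level, ppb
for a in (0.05, 0.15):
    c = quantify(a)
    print(f"A={a:.3f} -> {c:.1f} ppb", "EXCEEDS MCL" if c > MCL else "ok")
```

The periodic calibration check mentioned in the abstracts amounts to re-fitting (or verifying) `m` and `b` against fresh standards.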

  20. Material quality development during the automated tow placement process

    Science.gov (United States)

    Tierney, John Joseph

    Automated tow placement (ATP) of thermoplastic composites builds on the existing industrial base for equipment, robotics and kinematic placement of material, with the aim of further cost reduction by eliminating the autoclave entirely. During ATP processing, thermoplastic composite tows are deposited on a preconsolidated substrate at rates ranging from 10 to 100 mm/s and consolidated using the localized application of heat and pressure by a tow placement head mounted on a robot. The process is highly non-isothermal, subjecting the material to multiple heating and cooling rates approaching 1000 °C/s. The requirement for the ATP process is to achieve in seconds the same quality (low void content, full translation of mechanical properties and degree of bonding, and minimal warpage) that the autoclave process achieves in hours. The scientific challenge was to first understand and then model the relationships between processing, material response, microstructure and quality. The important phenomena affecting quality investigated in this study include a steady-state heat transfer simulation, consolidation and deconsolidation (void dynamics), intimate contact and polymer interdiffusion (degree of bonding/mechanical properties), and residual stress and warpage (crystallization and viscoelastic response). A fundamental understanding of the role of materials related to these mechanisms and their relationship to final quality is developed and applied towards a method of process control and optimization.

  1. Process automation system for integration and operation of Large Volume Plasma Device

    International Nuclear Information System (INIS)

    Sugandhi, R.; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-01-01

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access to the diagnostics, manual control of subsystems, monitoring of a large number of signals, etc. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing major upgradation for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for the heating of the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using

  2. Process automation system for integration and operation of Large Volume Plasma Device

    Energy Technology Data Exchange (ETDEWEB)

    Sugandhi, R., E-mail: ritesh@ipr.res.in; Srivastava, P.K.; Sanyasi, A.K.; Srivastav, Prabhakar; Awasthi, L.M.; Mattoo, S.K.

    2016-11-15

    Highlights: • Analysis and design of process automation system for Large Volume Plasma Device (LVPD). • Data flow modeling for process model development. • Modbus based data communication and interfacing. • Interface software development for subsystem control in LabVIEW. - Abstract: The Large Volume Plasma Device (LVPD) has been successfully contributing towards the understanding of plasma turbulence driven by the Electron Temperature Gradient (ETG), considered a major contributor to plasma loss in fusion devices. The large size of the device imposes certain difficulties in operation, such as access to the diagnostics, manual control of subsystems, monitoring of a large number of signals, etc. To achieve integrated operation of the machine, automation is essential for enhanced performance and operational efficiency. Recently, the machine has been undergoing major upgradation for new physics experiments. The new operation and control system consists of the following: (1) a PXIe based fast data acquisition system for the equipped diagnostics; (2) a Modbus based Process Automation System (PAS) for the subsystem controls; and (3) a Data Utilization System (DUS) for efficient storage, processing and retrieval of the acquired data. In the ongoing development, a data flow model of the machine’s operation has been developed. As a proof of concept, the following two subsystems have been successfully integrated: (1) the Filament Power Supply (FPS) for the heating of the W-filament-based plasma source and (2) the Probe Positioning System (PPS) for control of 12 linear probe drives over a travel length of 100 cm. The process model of the vacuum production system has been prepared and validated against acquired pressure data. In the next upgrade, all the subsystems of the machine will be integrated in a systematic manner. The automation backbone is based on a 4-wire multi-drop serial interface (RS485) using the Modbus communication protocol. Software is developed on the LabVIEW platform using
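The Modbus RTU framing used on such an RS485 backbone appends a CRC-16 to every message. A minimal sketch of the well-known CRC-16/MODBUS algorithm follows; it is independent of the LVPD software, and the example request (read two holding registers from unit 1) is only illustrative of what a supervisory layer might poll.

```python
def modbus_crc16(frame: bytes) -> bytes:
    """CRC-16/MODBUS over an RTU frame, returned little-endian
    (low byte first), as appended on the wire."""
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001   # reflected polynomial 0x8005
            else:
                crc >>= 1
    return crc.to_bytes(2, "little")

# Example: unit address 0x01, function 0x03 (read holding registers),
# start address 0x0000, register count 0x0002.
pdu = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])
print((pdu + modbus_crc16(pdu)).hex())
```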

  3. Automated Machinery Health Monitoring Using Stress Wave Analysis & Artificial Intelligence

    National Research Council Canada - National Science Library

    Board, David

    1998-01-01

    .... Army, for application to helicopter drive train components. The system will detect structure borne, high frequency acoustic data, and process it with feature extraction and polynomial network artificial intelligence software...

  4. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided

  5. Automation of a problem list using natural language processing

    Directory of Open Access Journals (Sweden)

    Haug Peter J

    2005-08-01

    Full Text Available Abstract Background The medical problem list is an important part of the electronic medical record in development in our institution. To serve the functions it is designed for, the problem list has to be as accurate and timely as possible. However, the current problem list is usually incomplete and inaccurate, and is often totally unused. To alleviate this issue, we are building an environment where the problem list can be easily and effectively maintained. Methods For this project, 80 medical problems were selected for their frequency of use in our future clinical field of evaluation (cardiovascular. We have developed an Automated Problem List system composed of two main components: a background and a foreground application. The background application uses Natural Language Processing (NLP to harvest potential problem list entries from the list of 80 targeted problems detected in the multiple free-text electronic documents available in our electronic medical record. These proposed medical problems drive the foreground application designed for management of the problem list. Within this application, the extracted problems are proposed to the physicians for addition to the official problem list. Results The set of 80 targeted medical problems selected for this project covered about 5% of all possible diagnoses coded in ICD-9-CM in our study population (cardiovascular adult inpatients, but about 64% of all instances of these coded diagnoses. The system contains algorithms to detect first document sections, then sentences within these sections, and finally potential problems within the sentences. The initial evaluation of the section and sentence detection algorithms demonstrated a sensitivity and positive predictive value of 100% when detecting sections, and a sensitivity of 89% and a positive predictive value of 94% when detecting sentences. 
Conclusion The global aim of our project is to automate the process of creating and maintaining a problem
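The sensitivity and positive predictive value figures reported above follow directly from the standard definitions. The counts below are illustrative values chosen to reproduce the 89%/94% sentence-detection numbers, not the study's actual tallies.

```python
def sensitivity_ppv(tp: int, fp: int, fn: int):
    """Sensitivity (recall) and positive predictive value (precision)."""
    sens = tp / (tp + fn)   # fraction of true sentences that were detected
    ppv = tp / (tp + fp)    # fraction of detections that were correct
    return sens, ppv

sens, ppv = sensitivity_ppv(tp=89, fp=6, fn=11)
print(f"sensitivity={sens:.0%} ppv={ppv:.0%}")   # sensitivity=89% ppv=94%
```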

  6. Automation of the DoD Export License Application Review Process

    National Research Council Canada - National Science Library

    Young, Shelton

    2002-01-01

    .... The overall audit objective was to determine whether Federal automation programs supporting the export license and review process could be used to establish a common electronic interface creating...

  7. Quantitative monitoring of the fluorination process by neutron counting

    International Nuclear Information System (INIS)

    Russo, P.A.; Appert, Q.D.; Biddle, R.S.; Kelley, T.A.; Martinez, M.M.; West, M.H.

    1993-01-01

    Plutonium metal is produced by reducing PuF4 prepared from PuO2 by fluorination. Both fluorination and reduction are batch processes at the Los Alamos Plutonium Facility. The conversion of plutonium oxide to fluoride greatly increases the neutron yield, a result of the high cross section for alpha-neutron (α,n) reactions on fluorine targets compared to the (more than 100 times) smaller (α,n) yield on oxygen targets. Because of this increase, total neutron counting can be used to monitor the conversion process. This monitoring ability can lead to an improved metal product, reduced scrap for recycle, waste reduction, minimized reagent usage, and reduced personnel radiation exposure. A new stirred-bed fluorination process has been developed simultaneously with a recent evaluation of an automated neutron-counting instrument for quantitative process monitoring. Neutrons are counted with polyethylene-moderated 3He gas proportional counters. Results include a calibration of the real-time neutron-count-rate indicator for the extent of fluorination, using reference values obtained from destructive analysis of samples from the blended fluorinated batch
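The calibration step can be sketched as a linear inverse fit from neutron count rate to conversion fraction, anchored by destructive-analysis reference values. The rates and fractions below are invented to illustrate the shape of such a calibration, not measured data.

```python
import numpy as np

# Hypothetical calibration points: net neutron count rate vs fraction of
# PuO2 converted to PuF4, from destructive analysis of sampled batches.
frac_ref = np.array([0.0, 0.25, 0.50, 0.75, 1.00])
rate_ref = np.array([1.0e3, 2.6e4, 5.1e4, 7.4e4, 1.0e5])   # counts/s

m, b = np.polyfit(rate_ref, frac_ref, 1)   # linear inverse calibration

def fluorination_extent(count_rate):
    """Estimate conversion fraction from a real-time neutron count rate,
    clipped to the physically meaningful range [0, 1]."""
    return float(np.clip(m * count_rate + b, 0.0, 1.0))

print(fluorination_extent(6.0e4))   # a batch roughly 60% converted
```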

  8. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.
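The NAWK-style pattern scanning described above, matching suspicious patterns in a stream of call records and alerting on repeat offenders, can be mirrored in Python. The record format, patterns, and threshold below are invented for illustration; they are not the ASNS rules.

```python
import re

# Hypothetical suspicious-activity patterns and alert threshold.
SUSPECT = re.compile(r"(FAILED LOGIN|INVALID CODE)")
THRESHOLD = 3   # alert after this many hits from one extension

def scan(lines):
    """Count suspect lines per extension; return extensions that
    reach the alert threshold, in the order they trip it."""
    counts = {}
    alerts = []
    for line in lines:
        if SUSPECT.search(line):
            ext = line.split()[0]
            counts[ext] = counts.get(ext, 0) + 1
            if counts[ext] == THRESHOLD:
                alerts.append(ext)
    return alerts

log = [
    "x4821 FAILED LOGIN trunk 7",
    "x4821 INVALID CODE trunk 7",
    "x1100 CALL COMPLETE",
    "x4821 FAILED LOGIN trunk 9",
]
print(scan(log))   # ['x4821']
```

Run continuously over a tailed log, this is the same always-on monitoring loop the ASNS automated.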

  9. Experimental demonstration of microscopic process monitoring

    International Nuclear Information System (INIS)

    Hurt, R.D.; Hurrell, S.J.; Wachter, J.W.; Hebble, T.L.; Crawford, A.B.

    1982-01-01

    Microscopic process monitoring (MPM) is a material control strategy designed to use standard process control data to provide expanded safeguards protection of nuclear fuel cycle facilities. The MPM methodology identifies process events by recognizing significant patterns of changes in on-line measurements. The goals of MPM are to detect diversions of nuclear material and to provide information on process status useful to other facility safeguards operations

  10. Process understanding and cooperative design. Keys to high quality automation

    International Nuclear Information System (INIS)

    Tommila, T.; Heinonen, R.

    1995-01-01

    A systematic approach to the specification of process control systems, and four practical methods supporting user participation and interdisciplinary co-operation are described. The main steps of the design approach are: (1) hierarchical decomposition of the plant to process items of different types; (2) analysis and definition of requirements and control strategies associated with each process item; (3) definition of automation degree; and (4) functional specification of the control system and its user interface. The specification language used for this step is a combination of principles found in object oriented design, structured analysis as well as new language standards for programmable controllers and open information systems. The design review methods presented include structured control strategy meetings, safety analysis of sequential controls, review of graphic displays, and a usability questionnaire for existing plants. These methods can be used to elicit users' needs and operational experience, to gain a common understanding of the process functionality, or to detect errors in design specifications or in existing systems. (8 refs., 9 figs.)

  11. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false Processing on Automated Information Systems (AIS). 1017.28... UNCLASSIFIED CONTROLLED NUCLEAR INFORMATION Physical Protection Requirements § 1017.28 Processing on Automated Information Systems (AIS). UCNI may be processed or produced on any AIS that complies with the guidance in OMB...

  12. AIRCRAFT POWER SUPPLY SYSTEM DESIGN PROCESS AS AN AUTOMATION OBJECT

    Directory of Open Access Journals (Sweden)

    Boris V. Zhmurov

    2018-01-01

    aircraft and take into account all the requirements of the customer and the regulatory and technical documentation is its automation. Automating the design of the aircraft EPS as an optimization task involves formalizing the object of optimization and choosing the efficiency criterion and the control actions. The object of optimization in this case is the EPS design process, whose formalization also requires formalizing the design object: the aircraft power supply system.

  13. An automated system for monitoring bird collisions with power lines and tower guys

    Energy Technology Data Exchange (ETDEWEB)

    Carlton, R.G. [Electric Power Research Inst., Palo Alto, CA (United States)

    2005-07-01

    An automated system for monitoring collisions between birds and power lines was presented. The bird strike indicator (BSI) was developed to gather bird collision information that is difficult to obtain through direct human observation, as well as to aid in the calculation of inherent biases which must be considered when attempting to determine total mortality from data obtained in on-the-ground dead bird searches. The BSI can be placed directly on power lines, static wires, or tower guy cables with a standard hot stick power line clamp. The sensor package consists of state-of-the-art accelerometers, power supplies, signal processors, and data acquisition systems. The BSI also includes a communication system for transmitting data to a ground-based unit in which raw data can be stored. A complete BSI consists of 30 sensors with signal processing and data logging capabilities, and a base station. The sensors integrate several components, including wireless radio, data storage, and a microcontroller with an A/D converter. Full-scale field deployment has shown that the BSI is both robust and sensitive to vibrations in the guy wires, as the system has been tuned to eliminate vibrations induced by wind. 3 figs.
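
    The record notes the system was tuned to reject wind-induced vibration while staying sensitive to strikes. One simple way to separate a sharp collision transient from slow sway is a first-difference (high-pass) threshold; the sketch below uses a synthetic trace and an invented threshold, not the BSI's actual signal processing:

```python
# Sketch: separating sharp collision transients from slow wind-induced sway
# in an accelerometer trace via a first-difference threshold. Synthetic data.
import math

def detect_strikes(samples, threshold):
    """Return indices where |x[i] - x[i-1]| exceeds the threshold."""
    return [i for i in range(1, len(samples))
            if abs(samples[i] - samples[i - 1]) > threshold]

# Slow sinusoidal "wind" sway plus one abrupt jump at sample 60.
trace = [math.sin(2 * math.pi * i / 50) * 0.5 for i in range(100)]
trace[60] += 4.0   # simulated strike

hits = detect_strikes(trace, threshold=1.0)
print(hits)
```

    The slow sway changes by only a few hundredths per sample, so it never crosses the threshold; the strike produces a large step into and out of sample 60.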

  14. Acoustic emission-based in-process monitoring of surface generation in robot-assisted polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas; Bissacco, Giuliano; De Chiffre, Leonardo

    2016-01-01

    The applicability of acoustic emission (AE) measurements for in-process monitoring of surface generation in robot-assisted polishing (RAP) was investigated. Surface roughness measurements require interruption of the process, proper surface cleaning and measurements that sometimes necessitate removal of the part from the machine tool. In this study, stabilisation of surface roughness during polishing of rotational symmetric surfaces by the RAP process was monitored by AE measurements. An AE sensor was placed on a polishing arm in direct contact with a bonded abrasive polishing tool … automatic detection of the optimal process endpoint allows intelligent process control, creating fundamental elements in the development of a robust, fully automated RAP process for its widespread industrial application.

  15. Advanced monitoring with complex stream processing

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution that is used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP) that analyses series of events to derive conclusions from them.
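
    Esper expresses such rules in its own event-processing language on the JVM; purely to illustrate the windowing-and-correlation idea, here is a stand-alone Python sketch that raises an alert when too many error events from one service fall inside a sliding time window (event format and thresholds are invented):

```python
# Conceptual sketch of complex event processing: alert when more than `limit`
# error events from the same service arrive within a sliding time window.
from collections import defaultdict, deque

def alerts(events, window=60.0, limit=3):
    """events: iterable of (timestamp_seconds, service, level)."""
    recent = defaultdict(deque)   # service -> timestamps of recent errors
    out = []
    for ts, service, level in events:
        if level != "ERROR":
            continue
        q = recent[service]
        q.append(ts)
        while q and ts - q[0] > window:
            q.popleft()           # drop errors that fell out of the window
        if len(q) > limit:
            out.append((ts, service))
    return out

stream = [(0, "broker", "ERROR"), (10, "broker", "ERROR"),
          (20, "broker", "INFO"), (30, "broker", "ERROR"),
          (40, "broker", "ERROR"), (200, "broker", "ERROR")]
print(alerts(stream))
```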

  16. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    Science.gov (United States)

    2018-01-01

    ARL-TR-8270, January 2018, US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom, Sensors and Electron Devices Directorate. Reporting period: 1 October 2016–30 September 2017.

  17. Quality Control in Automated Manufacturing Processes – Combined Features for Image Processing

    Directory of Open Access Journals (Sweden)

    B. Kuhlenkötter

    2006-01-01

    In production processes the use of image processing systems is widespread, and hardware solutions and cameras are available for nearly every application. One important challenge for image processing systems is the development and selection of appropriate algorithms and software solutions in order to realise ambitious quality control for production processes. This article describes the development of innovative software that combines features for automatic defect classification on product surfaces. The artificial intelligence method Support Vector Machine (SVM) is used to execute the classification task on the combined features. This software is one crucial element for the automation of a manually operated production process
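
    The article's classifier is a Support Vector Machine over combined surface features. As a dependency-free stand-in for the same idea (a linear decision over a combined feature vector), the sketch below trains a simple perceptron instead; all feature values are invented and this is not the authors' implementation:

```python
# Sketch: classifying surface patches as defect (+1) or good (-1) from a
# combined feature vector. A perceptron stands in for the SVM named above;
# features and labels are invented.

def train_perceptron(data, labels, epochs=20, lr=0.1):
    w = [0.0] * len(data[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(data, labels):      # y is +1 (defect) or -1 (good)
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Combined features per surface patch: (mean grey level, edge density)
X = [(0.2, 0.1), (0.3, 0.2), (0.8, 0.7), (0.9, 0.9)]
y = [-1, -1, 1, 1]
w, b = train_perceptron(X, y)
print([predict(w, b, x) for x in X])
```

    An SVM would additionally maximize the margin of this linear boundary; the feature-combination step is the same either way.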

  18. Automated read-out of thermoluminescence dosemeters in a centralized individual monitoring service

    International Nuclear Information System (INIS)

    Toivonen, M.

    The organizational problems of maintaining a centralized individual monitoring service with erasable and re-usable dosemeters are evaluated, and design criteria for an automated thermoluminescence reader are laid down. A characteristic of the planned monitoring system is that dosemeters can be issued without providing two dosemeters for each worker. A home-made reader designed to fulfil these criteria is presented. The use of a standard barcode and a standard optical barcode reader to identify dosemeters is described, as is a method of using a minicomputer to prepare the self-fastening identification labels and to print mailing lists and results

  19. FY-2010 Process Monitoring Technology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Christopher R.; Bryan, Samuel A.; Casella, Amanda J.; Hines, Wes; Levitskaia, Tatiana G.; henkell, J.; Schwantes, Jon M.; Jordan, Elizabeth A.; Lines, Amanda M.; Fraga, Carlos G.; Peterson, James M.; Verdugo, Dawn E.; Christensen, Ronald N.; Peper, Shane M.

    2011-01-01

    During FY 2010, work under the Spectroscopy-Based Process Monitoring task included ordering and receiving four fluid flow meters and four flow visible-near infrared spectrometer cells to be instrumented within the centrifugal contactor system at Pacific Northwest National Laboratory (PNNL). Initial demonstrations of real-time spectroscopic measurements on cold-stream simulants were conducted under plutonium (Pu)/uranium (U) (PUREX) solvent extraction process conditions. The specific test case examined the extraction of neodymium nitrate (Nd(NO3)3) from an aqueous nitric acid (HNO3) feed into a tri-n-butyl phosphate (TBP)/n-dodecane solvent. Demonstration testing of this system included diverting a sample from the aqueous feed while monitoring every phase of the process with the on-line spectroscopic process monitoring system. The purpose of this demonstration was to test whether spectroscopic monitoring is capable of determining the mass balance of metal nitrate species involved in a cross-current solvent extraction scheme while a sample is also being diverted from the system. The diversion scenario involved diverting a portion of the feed from a counter-current extraction system while a continuous extraction experiment was underway. A successful test would demonstrate the ability of the process monitoring system to detect and quantify the diversion of material from the system during a real-time continuous solvent extraction experiment. The system was designed to mimic a PUREX-type extraction process with a bank of four centrifugal contactors. The aqueous feed contained Nd(NO3)3 in HNO3, and the organic phase was composed of TBP/n-dodecane. The amount of sample observed to be diverted by on-line spectroscopic process monitoring was measured to be 3 mmol (3 × 10⁻³ mol) Nd3+. This value was in excellent agreement with the 2.9 mmol Nd3+ value based on the known mass of sample taken (i.e., diverted) directly from the system feed solution.
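
    The underlying accounting is a mass balance: material unaccounted for across the measured streams indicates a diversion. A minimal sketch of that bookkeeping, with invented quantities chosen only to echo the millimole scale of the demonstration:

```python
# Sketch: flagging a diversion when the spectroscopically measured inventory
# across all phases falls short of the feed. All quantities invented.

def diverted_amount(feed_mmol, measured_streams_mmol, tolerance=0.5):
    """Return (diverted, flag): shortfall between feed and the summed
    per-stream inventory, flagged when it exceeds the tolerance (mmol)."""
    measured = sum(measured_streams_mmol)
    diff = feed_mmol - measured
    return diff, diff > tolerance

# Feed contained 20.0 mmol of metal; on-line spectroscopy accounts for
# 17.0 mmol across the aqueous raffinate and organic product streams.
diff, flag = diverted_amount(20.0, [9.5, 7.5])
print(diff, flag)
```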

  20. Using process-oriented interfaces for solving the automation paradox in highly automated navy vessels

    NARCIS (Netherlands)

    Diggelen, J. van; Post, W.; Rakhorst, M.; Plasmeijer, R.; Staal, W. van

    2014-01-01

    This paper describes a coherent engineering method for developing high level human machine interaction within a highly automated environment consisting of sensors, actuators, automatic situation assessors and planning devices. Our approach combines ideas from cognitive work analysis, cognitive

  1. Automated monitoring of fissile and fertile materials in incinerator residue

    International Nuclear Information System (INIS)

    Schoenig, F.C. Jr.; Glendinning, S.G.; Tunnell, G.W.; Zucker, M.S.

    1986-01-01

    This patent describes an apparatus for determining the fissile and fertile material content of incinerator residue contained in a manipulatable container. The apparatus comprises a main body member formed of neutron moderating material and formed with a well for receiving the container; a first plug formed of neutron reflecting material for closing the top of the well; and a second plug containing a first neutron source for alternatively closing the top of the well and for directing neutrons into the well. It also includes a second neutron source selectively positionable in the bottom of the well for directing neutrons into the well; manipulating means for placing the container in the well and removing the container therefrom and for selectively placing one of the first and second plugs in the top of the well. Neutron detectors are positioned within the neutron moderating material of the main body member around the sides of the well. At least one gamma ray detector is positioned adjacent the bottom of the well. A means receives and processes the signals from the neutron and gamma ray detectors when the container is in the well for determining the fissile and fertile material content of the incinerator residue in the container

  2. Implementation of a fully automated process purge-and-trap gas chromatograph at an environmental remediation site

    International Nuclear Information System (INIS)

    Blair, D.S.; Morrison, D.J.

    1997-01-01

    The AQUASCAN, a commercially available, fully automated purge-and-trap gas chromatograph from Sentex Systems Inc., was implemented and evaluated as an in-field, automated monitoring system for contaminated groundwater at an active DOE remediation site in Pinellas, FL. Though the AQUASCAN is designed as a stand-alone process analytical unit, implementation at this site required additional hardware, including a sample dilution system and a method for delivering standard solution to the gas chromatograph for automated calibration. As a result of the evaluation, the system was determined to be a reliable and accurate instrument. The AQUASCAN-reported concentration values for methylene chloride, trichloroethylene, and toluene in the Pinellas groundwater were within 20% of reference laboratory values

  3. Problems of collaborative work of the automated process control system (APCS) and its information security, and solutions.

    Science.gov (United States)

    Arakelyan, E. K.; Andryushin, A. V.; Mezin, S. V.; Kosoy, A. A.; Kalinina, Ya V.; Khokhlov, I. S.

    2017-11-01

    A principle is proposed for the interaction between the technological protection functions of the automated process control system (APCS) and the information security system when a technological protection algorithm executes incorrectly: the correctness of each protection action is checked against the specific situation using the functional relationships between the monitored parameters. A methodology for assessing the economic feasibility of developing and implementing such an information security system is also presented.
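
    The cross-check described above validates a protection action against functional relationships between independently monitored parameters. A minimal sketch of such a plausibility check, with invented parameter names and limits:

```python
# Sketch: validate that a tripped protection is consistent with a functional
# relationship between monitored parameters. Names and limits are invented.

def protection_plausible(trip_signal, drum_pressure, safety_valve_open):
    """An overpressure trip is plausible only if pressure is actually high
    or the safety valve has already lifted; otherwise suspect a spurious
    (possibly spoofed) protection action."""
    OVERPRESSURE_LIMIT = 11.0   # MPa, illustrative
    if not trip_signal:
        return True
    return drum_pressure >= OVERPRESSURE_LIMIT or safety_valve_open

print(protection_plausible(True, 11.4, False))   # consistent trip
print(protection_plausible(True, 9.8, False))    # suspicious trip
```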

  4. On-line process control monitoring system

    International Nuclear Information System (INIS)

    O'Rourke, P.E.; Van Hare, D.R.; Prather, W.S.

    1992-01-01

    This patent describes an apparatus for monitoring, at a plurality of locations within a system, the concentration of at least one chemical substance involved in a chemical process. It comprises a plurality of process cells; first means for carrying the light; second means for carrying the light; means for producing a spectrum from the light received by the second carrying means; multiplexing means for selecting one process cell of the plurality of process cells at a time so that the producing means can produce a process spectrum from the one cell of the process cells; a reference cell for producing a reference spectrum for comparison to the process spectrum; a standard cell for producing a standard spectrum for comparison to the process spectrum; and means for comparing the reference spectrum, the standard spectrum and the process spectrum and determining the concentration of the chemical substance in the process cell

  5. Automating the Human Factors Engineering and Evaluation Processes

    International Nuclear Information System (INIS)

    Mastromonico, C.

    2002-01-01

    The Westinghouse Savannah River Company (WSRC) has developed a software tool for automating the Human Factors Engineering (HFE) design review, analysis, and evaluation processes. The tool provides a consistent, cost-effective, graded, user-friendly approach for evaluating process control system Human System Interface (HSI) specifications, designs, and existing implementations. The initial set of HFE design guidelines used in the tool was obtained from NUREG-0700. Each guideline was analyzed and classified according to its significance (general concept vs. supporting detail), the HSI technology (computer based vs. non-computer based), and the HSI safety function (safety vs. non-safety). Approximately 10 percent of the guidelines were determined to be redundant or obsolete and were discarded. The remaining guidelines were arranged in a Microsoft Access relational database, and a Microsoft Visual Basic user interface was provided to facilitate the HFE design review. The tool also provides the capability to add new criteria to accommodate advances in HSI technology and incorporate lessons learned. Summary reports produced by the tool can be easily ported to Microsoft Word and other popular PC office applications. An IBM-compatible PC with Microsoft Windows 95 or higher is required to run the application

  6. Monitoring and controlling the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Ahring, B K; Angelidaki, I [The Technical Univ. of Denmark, Dept. of Environmental Science and Engineering, Lyngby (Denmark)

    1997-08-01

    Many modern large-scale biogas plants have been constructed recently, increasing the demand for proper monitoring and control of these large reactor systems. Monitoring the biogas process requires an easy-to-measure and reliable indicator that reflects the metabolic state and the activity of the bacterial populations in the reactor. In this paper, we discuss existing indicators as well as indicators under development which can potentially be used to monitor the state of the biogas process in a reactor. Furthermore, data are presented from two large-scale thermophilic biogas plants subjected to temperature changes, at which the concentration of volatile fatty acids (VFA) was monitored. The results clearly demonstrated that significant changes in the concentrations of the individual VFAs occurred even though biogas production did not change significantly. In particular, the concentrations of butyrate, isobutyrate and isovalerate showed significant changes. Future improvements in process control could therefore be based on monitoring the concentrations of specific VFAs together with information about the bacterial populations in the reactor; the latter information could be supplied by modern molecular techniques. (au) 51 refs.

  7. Concurrent Pilot Instrument Monitoring in the Automated Multi-Crew Airline Cockpit.

    Science.gov (United States)

    Jarvis, Stephen R

    2017-12-01

    Pilot instrument monitoring has been described as "inadequate," "ineffective," and "insufficient" after multi-crew aircraft accidents. Regulators have called for improved instrument monitoring by flight crews, but scientific knowledge in the area is scarce. Research has tended to investigate the monitoring of individual pilots in the pilot-flying role; very little research has looked at crew monitoring, or at the monitoring-pilot role, despite it being half of the apparent problem. Eye-tracking data were collected from 17 properly constituted and current Boeing 737 crews operating in a full-motion simulator. Each crew flew four realistic flight segments, with pilots swapping between the pilot-flying and pilot-monitoring roles, with and without the autopilot engaged. Analysis was performed on the 375 maneuvering segments prior to localizer intercept. Autopilot engagement led to significantly less visual dwell time on the attitude director indicator (mean dwell time fell from 212.8 s to 47.8 s for the flying pilot and from 58.5 s to 39.8 s for the monitoring pilot) and an associated increase on the horizontal situation indicator (from 18 s to 52.5 s and from 36.4 s to 50.5 s, respectively). The flying pilots' withdrawal of attention from the primary flight reference and increased attention to the primary navigational reference was paralleled rather than complemented by the monitoring pilot, suggesting that monitoring vulnerabilities can be duplicated in the flight deck. It is therefore possible that accident causes identified as "inadequate" or "insufficient" monitoring are in fact a result of parallel monitoring. Jarvis SR. Concurrent pilot instrument monitoring in the automated multi-crew airline cockpit. Aerosp Med Hum Perform. 2017; 88(12):1100-1106.

  8. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    Science.gov (United States)

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

    The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and find that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunities and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
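
    The post-validation step can be illustrated with a simple extrapolation: estimate the true-positive rate from the human-validated subset and correct the raw detection count. This sketch is not the authors' hierarchical occupancy model, and all counts are invented:

```python
# Sketch: correcting an automated detection count with a small human-validated
# subset. Only the extrapolation idea is shown; counts are invented.

def corrected_count(n_detections, validated_sample):
    """validated_sample: booleans, True = detection confirmed genuine."""
    tp_rate = sum(validated_sample) / len(validated_sample)
    return n_detections * tp_rate

# 10,000 automated detections; a 1% subset (100 clips) is human-checked and
# 85 prove to be genuine calls.
validated = [True] * 85 + [False] * 15
print(corrected_count(10_000, validated))
```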

  9. Robust processing of mining subsidence monitoring data

    Energy Technology Data Exchange (ETDEWEB)

    Mingzhong, Wang; Guogang, Huang [Pingdingshan Mining Bureau (China); Yunjia, Wang; Guogangli, [China Univ. of Mining and Technology, Xuzhou (China)

    1997-12-31

    Since China began research on mining subsidence in the 1950s, more than one thousand observation lines have been measured. Yet monitoring data sometimes contain a large number of outliers because of the limits of observation and geological mining conditions. In China, the processing of mining subsidence monitoring data is currently based on the principle of least squares, which can produce lower accuracy, less reliability, or even errors. For the reasons given above, the authors, taking China's actual situation into account, have carried out research on the robust processing of mining subsidence monitoring data with respect to obtaining prediction parameters. The authors have derived the related formulas, designed computational programs, carried out a large amount of calculation and simulation, and achieved good results. (orig.)
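
    The contrast between least squares and a robust estimator is easy to demonstrate. The sketch below fits a line to a synthetic levelling profile with one gross error, using ordinary least squares and the Theil-Sen estimator (median of pairwise slopes) as a stand-in for the robust methods the authors developed:

```python
# Sketch: why robust estimation matters for subsidence data with outliers.
# Ordinary least squares vs the Theil-Sen estimator on synthetic data with
# one gross error.
from statistics import median

def ols_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

def theil_sen_slope(x, y):
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x)) for j in range(i + 1, len(x))]
    return median(slopes)

x = list(range(10))
y = [2.0 * xi for xi in x]   # true slope is 2.0
y[7] = 50.0                  # one gross outlier

print(round(ols_slope(x, y), 2), round(theil_sen_slope(x, y), 2))
```

    The single outlier drags the least-squares slope well away from 2.0, while the median-based estimate is unaffected.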

  11. Failsafe automation of Phase II clinical trial interim monitoring for stopping rules.

    Science.gov (United States)

    Day, Roger S

    2010-02-01

    In Phase II clinical trials in cancer, preventing the treatment of patients on a study when current data demonstrate that the treatment is insufficiently active or too toxic has obvious benefits, both in protecting patients and in reducing sponsor costs. Considerable efforts have gone into experimental designs for Phase II clinical trials with flexible sample size, usually implemented by early stopping rules. The intended benefits will not ensue, however, if the design is not followed. Despite the best intentions, failures can occur for many reasons. The main goal is to develop an automated system for interim monitoring, as a backup system supplementing the protocol team, to ensure that patients are protected. A secondary goal is to stimulate timely recording of patient assessments. We developed key concepts and performance needs, then designed, implemented, and deployed a software solution embedded in the clinical trials database system. The system has been in place since October 2007. One clinical trial tripped the automated monitor, resulting in e-mails that initiated statistician/investigator review in a timely fashion. Several essential contributing activities still require human intervention, institutional policy decisions, and institutional commitment of resources. We believe that implementing the concepts presented here will provide greater assurance that interim monitoring plans are followed and that patients are protected from inadequate response or excessive toxicity. This approach may also facilitate wider acceptance and quicker implementation of new interim monitoring algorithms.
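
    The kind of rule such a monitor evaluates can be as simple as a Simon-style two-stage futility bound: stop if, after the first n1 evaluable patients, responses do not exceed r1. The design parameters below are illustrative, not from the article:

```python
# Sketch of an interim futility check of the kind an automated monitor can
# run against the trial database. Design parameters (n1, r1) are invented.

def futility_stop(responses, enrolled, n1=19, r1=4):
    """True when stage-1 accrual is complete and responses are too few."""
    return enrolled >= n1 and responses <= r1

# 19 patients evaluated, 3 responses -> the monitor should trip and e-mail
# the statistician/investigator team for review.
print(futility_stop(responses=3, enrolled=19))
```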

  12. Automated force volume image processing for biological samples.

    Directory of Open Access Journals (Sweden)

    Pavel Polyakov

    2011-04-01

    Atomic force microscopy (AFM) has now become a powerful technique for investigating, on a molecular level, surface forces, nanomechanical properties of deformable particles, biomolecular interactions, kinetics, and dynamic processes. This paper specifically focuses on the analysis of AFM force curves collected on biological systems, in particular bacteria. The goal is to provide fully automated tools for the theoretical interpretation of force curves on the basis of adequate, available physical models. In this respect, we propose two algorithms, one for processing approach force curves and another for the quantitative analysis of retraction force curves. In the former, electrostatic interactions prior to contact between the AFM probe and bacterium are accounted for, and mechanical interactions operating after contact are described in terms of a Hertz-Hooke formalism. Retraction force curves are analyzed on the basis of the Freely Jointed Chain model. For both algorithms, the quantitative reconstruction of force curves is based on the robust detection of critical points (jumps, changes of slope or changes of curvature) which mark the transitions between the various relevant interactions taking place between the AFM tip and the studied sample during approach and retraction. Once the key regions of separation distance and indentation are detected, the physical parameters describing the relevant interactions operating in these regions are extracted using a regression procedure to fit experiments to theory. The flexibility, accuracy and strength of the algorithms are illustrated with the processing of two force-volume images, which collect a large set of approach and retraction curves measured on a single biological surface. For each force-volume image, several maps are generated, representing the spatial distribution of the searched physical parameters as estimated for each pixel of the force-volume image.
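
    The approach-curve algorithm hinges on detecting a critical point (the contact point) and then fitting a mechanical model to the post-contact region. The sketch below illustrates that two-step idea on a synthetic curve with a Hertz-type law F = k·δ^1.5, linearized via F^(2/3); it is a simplification of, not a reproduction of, the authors' algorithm:

```python
# Sketch: locate the contact point as the last sample before the force rises
# above a noise floor, then fit F = k * delta**1.5 to the post-contact region
# by linearizing F**(2/3) = k**(2/3) * delta. Curve and threshold synthetic.

def analyse_approach(z, force, noise=0.05):
    """Return (contact_index, k) for a Hertz-type fit F = k * delta**1.5."""
    rise = next(i for i, f in enumerate(force) if f > noise)
    contact = rise - 1                    # last sample before the force rises
    delta = [zi - z[contact] for zi in z[contact:]]
    lin = [f ** (2.0 / 3.0) for f in force[contact:]]
    # Through-origin least squares for the slope k**(2/3).
    k23 = sum(d * l for d, l in zip(delta, lin)) / sum(d * d for d in delta)
    return contact, k23 ** 1.5

# Synthetic approach curve: no force before contact at z = 1.0, then
# F = 2.0 * delta**1.5 (arbitrary units).
z = [i * 0.1 for i in range(20)]
force = [0.0 if zi <= 1.0 else 2.0 * (zi - 1.0) ** 1.5 for zi in z]
contact, k = analyse_approach(z, force)
print(contact, round(k, 3))
```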

  13. Automating the simulator testing and data collection process

    Energy Technology Data Exchange (ETDEWEB)

    Magi, T.; Dimitri-Hakim, R. [L-3 Communications MAPPS Inc., Montreal, Quebec (Canada)

    2012-07-01

    Scenario-based training is a key process in the use of Full Scope Simulators (FSS) for operator training. Scenario-based training can be defined as any set of simulated plant operations performed with a specific training objective in mind. In order to meet this training objective, the ANSI/ANS-3.5-2009 standard requires that certain simulator training scenarios be tested to ensure that they reproduce the expected plant responses, that all plant procedures can be followed, and that scenario-based training objectives can be met. While malfunction testing provides a narrow view of simulator performance revolving around the malfunction itself, scenario testing provides a broader, overall view. The concept of instructor validation of simulator scenarios to be used for training and evaluation, and oversight of simulator performance during the validation process, work hand-in-hand. This is where Scenario-Based Testing (SBT) comes into play. With the description of SBT in the Nuclear Energy Institute NEI 09-09 white paper and in the ANSI/ANS-3.5-2009 standard, the industry now has a way forward that reduces regulatory uncertainty. Together, scenario-based testing and scenario-based training combine to produce better simulators, which in turn can be used to more effectively and efficiently train new and existing power plant operators. However, they also impose a significant data gathering and analysis burden on FSS users. L-3 MAPPS Orchid Instructor Station (Orchid IS) facilitates this data gathering and analysis by providing features that automate the process with a simple, centralized, easy-to-use interface. (author)

  14. Monitoring of batch processes using spectroscopy

    NARCIS (Netherlands)

    Gurden, S. P.; Westerhuis, J. A.; Smilde, A. K.

    2002-01-01

    There is an increasing need for new techniques for the understanding, monitoring and the control of batch processes. Spectroscopy is now becoming established as a means of obtaining real-time, high-quality chemical information at frequent time intervals and across a wide range of industrial

  15. DEEP LEARNING AND IMAGE PROCESSING FOR AUTOMATED CRACK DETECTION AND DEFECT MEASUREMENT IN UNDERGROUND STRUCTURES

    Directory of Open Access Journals (Sweden)

    F. Panella

    2018-05-01

    This work presents the combination of Deep Learning (DL) and image processing to produce an automated crack recognition and defect measurement tool for civil structures. The authors focus on tunnel structures and surveying, and have developed an end-to-end tool for asset management of underground structures. In order to maintain the serviceability of tunnels, regular inspection is needed to assess their structural status. The traditional survey method is visual inspection: simple, but slow and relatively expensive, and the quality of the output depends on the ability and experience of the engineer as well as on the total workload (stress and tiredness may influence the ability to observe and record information). As a result of these issues, over the last decade there has been a desire to automate monitoring using new methods of inspection. The present paper has the goal of combining DL with traditional image processing to create a tool able to detect, locate and measure structural defects.

  16. Automated Processing of Two-Dimensional Correlation Spectra

    Science.gov (United States)

    Sengstschmid; Sterk; Freeman

    1998-04-01

    An automated scheme is described which locates the centers of cross peaks in two-dimensional correlation spectra, even under conditions of severe overlap. Double-quantum-filtered correlation (DQ-COSY) spectra have been investigated, but the method is also applicable to TOCSY and NOESY spectra. The search criterion is the intrinsic symmetry (or antisymmetry) of cross-peak multiplets. An initial global search provides the preliminary information to build up a two-dimensional "chemical shift grid." All genuine cross peaks must be centered at intersections of this grid, a fact that reduces the extent of the subsequent search program enormously. The program recognizes cross peaks by examining the symmetry of signals in a test zone centered at a grid intersection. This "symmetry filter" employs a "lowest value algorithm" to discriminate against overlapping responses from adjacent multiplets. A progressive multiplet subtraction scheme provides further suppression of overlap effects. The processed two-dimensional correlation spectrum represents cross peaks as points at the chemical shift coordinates, with some indication of their relative intensities. Alternatively, the information is presented in the form of a correlation table. The authenticity of a given cross peak is judged by a set of "confidence criteria" expressed as numerical parameters. Experimental results are presented for the 400-MHz double-quantum-filtered COSY spectrum of 4-androsten-3,17-dione, a case where there is severe overlap. Copyright 1998 Academic Press.
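The "symmetry filter" and "lowest value algorithm" mentioned above can be conveyed with a toy version (an illustration of the principle, not the published implementation): each pair of intensities placed symmetrically about a candidate centre contributes only its smaller member to the score, so an overlapping response that appears on one side only is suppressed.

```python
# Toy symmetry filter with a lowest-value rule (illustrative, not the
# authors' code): score a square test zone centred at (cy, cx) by summing
# the minimum of each symmetric pair of intensities.

def symmetry_score(spectrum, cy, cx, half):
    """Score the symmetry of a (2*half+1)^2 zone centred at (cy, cx)."""
    score = 0.0
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            a = spectrum[cy + dy][cx + dx]
            b = spectrum[cy - dy][cx - dx]
            # lowest-value rule: a pair contributes only its smaller member
            score += min(a, b)
    return score

symmetric = [[9, 0, 0], [0, 5, 0], [0, 0, 9]]
one_sided = [[9, 0, 0], [0, 5, 0], [0, 0, 0]]
print(symmetry_score(symmetric, 1, 1, 1), symmetry_score(one_sided, 1, 1, 1))
```

The symmetric multiplet keeps its full intensity in the score, while the one-sided (overlap-contaminated) response collapses toward the centre value alone.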

  17. Automated processing of zebrafish imaging data: a survey.

    Science.gov (United States)

    Mikut, Ralf; Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A; Kausler, Bernhard X; Ledesma-Carbayo, María J; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine

    2013-09-01

    Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines.

  18. Automated Processing of Zebrafish Imaging Data: A Survey

    Science.gov (United States)

    Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A.; Kausler, Bernhard X.; Ledesma-Carbayo, María J.; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine

    2013-01-01

    Abstract Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines. PMID:23758125

  19. Automated terrestrial laser scanning with near-real-time change detection – monitoring of the Séchilienne landslide

    Directory of Open Access Journals (Sweden)

    R. A. Kromer

    2017-05-01

    Full Text Available We present an automated terrestrial laser scanning (ATLS) system with automatic near-real-time change detection processing. The ATLS system was tested on the Séchilienne landslide in France for a 6-week period with data collected at 30 min intervals. The purpose of developing the system was to fill the gap in high-temporal-resolution TLS monitoring studies of earth surface processes and to offer a cost-effective, light, portable alternative to ground-based interferometric synthetic aperture radar (GB-InSAR) deformation monitoring. During the study, we detected the flux of talus, displacement of the landslide and pre-failure deformation of discrete rockfall events. Additionally, we found the ATLS system to be an effective tool in monitoring landslide and rockfall processes despite missing points due to poor atmospheric conditions or rainfall. Furthermore, such a system has the potential to help us better understand a wide variety of slope processes at high levels of temporal detail.
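The core of near-real-time change detection between successive scans can be sketched as follows (a hedged simplification; the authors' processing chain is not described in this abstract, and real systems use spatial indexing and surface-based comparisons rather than the brute-force point test shown here):

```python
# Minimal point-cloud change detection sketch (illustrative assumption, not
# the published algorithm): flag points of a new scan whose distance to the
# nearest reference-scan point exceeds a detection threshold.

import math

def detect_changes(reference, new_scan, threshold):
    """Return points of new_scan farther than threshold from every reference point."""
    changed = []
    for p in new_scan:
        d = min(math.dist(p, q) for q in reference)  # brute force; use a k-d tree in practice
        if d > threshold:
            changed.append(p)
    return changed

ref = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(detect_changes(ref, [(0.0, 0.0, 0.1), (1.0, 0.0, 0.9)], 0.5))
```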

  20. Automated integration of continuous glucose monitor data in the electronic health record using consumer technology.

    Science.gov (United States)

    Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A

    2016-05-01

    The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient's/parent's smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
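The triage idea, screening relayed home readings so the provider reviews only exceptions between visits, can be sketched minimally (thresholds and structure are hypothetical, not the published interface):

```python
# Hypothetical CGM triage sketch: thresholds are illustrative assumptions,
# not clinical guidance or the study's actual rules.

LOW_MG_DL, HIGH_MG_DL = 70, 250

def triage(readings):
    """Return the CGM readings (mg/dL) that need provider review."""
    return [r for r in readings if r < LOW_MG_DL or r > HIGH_MG_DL]

print(triage([95, 62, 180, 300]))  # -> [62, 300]
```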

  1. Automated Mobility Transitions: Governing Processes in the UK

    Directory of Open Access Journals (Sweden)

    Debbie Hopkins

    2018-03-01

    Full Text Available Contemporary systems of mobility are undergoing a transition towards automation. In the UK, this transition is being led by (often new partnerships between incumbent manufacturers and new entrants, in collaboration with national governments, local/regional councils, and research institutions. This paper first offers a framework for analyzing the governance of the transition, adapting ideas from the Transition Management (TM perspective, and then applies the framework to ongoing automated vehicle transition dynamics in the UK. The empirical analysis suggests that the UK has adopted a reasonably comprehensive approach to the governing of automated vehicle innovation but that this approach cannot be characterized as sufficiently inclusive, democratic, diverse and open. The lack of inclusivity, democracy, diversity and openness is symptomatic of the post-political character of how the UK’s automated mobility transition is being governed. The paper ends with a call for a reconfiguration of the automated vehicle transition in the UK and beyond, so that much more space is created for dissent and for reflexive and comprehensive big picture thinking on (automated mobility futures.

  2. Vision-Based Geo-Monitoring - A New Approach for an Automated System

    Science.gov (United States)

    Wagner, A.; Reiterer, A.; Wasmeier, P.; Rieke-Zapp, D.; Wunderlich, T.

    2012-04-01

    The necessity for monitoring geo-risk areas such as rock slides is growing due to the increasing probability of such events caused by environmental change. Geodetic deformation monitoring turns life under threat into a calculable risk. An in-depth monitoring concept with modern measurement technologies allows the estimation of the hazard potential and the prediction of life-threatening situations. The movements can be monitored by sensors placed in the unstable slope area. In most cases, it is necessary to enter the regions at risk in order to place the sensors and maintain them. Using long-range monitoring systems (e.g. terrestrial laser scanners, total stations, ground based synthetic aperture radar) makes it possible to avoid this risk. To close the gap between the existing low-resolution, medium-accuracy sensors and conventional (co-operative target-based) surveying methods, image-assisted total stations (IATS) are a promising solution. IATS offer the user (e.g. metrology expert) an image capturing system (CCD/CMOS camera) in addition to 3D point measurements. The images of the telescope's visual field are projected onto the camera's chip. With appropriate calibration, these images are accurately geo-referenced and oriented since the horizontal and vertical angles of rotation are continuously recorded. The oriented images can directly be used for direction measurements with no need for object control points or further photogrammetric orientation processes. IATS are able to provide high-density deformation fields with high accuracy (down to the mm range) in all three coordinate directions. Tests have shown that, with suitable image processing, a measurement precision of 0.05 pixel ± 0.04·σ is possible (which corresponds to 0.03 mgon ± 0.04·σ). These results must be seen in light of the fact that such measurements are image-based only. For measuring in 3D object space the precision of pointing has to be taken into account. IATS can be used in two different ways
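The conversion from an image observation to an oriented direction can be illustrated with a deliberately simplified calibration model (a bare linear pixel-to-angle mapping, assumed only for this example; real IATS calibration accounts for distortion and the full camera model): because the telescope's horizontal and vertical angles are recorded continuously, a target's pixel offset from the principal point maps directly to direction measurements.

```python
# Simplified image-assisted direction measurement (illustrative linear
# calibration model; real IATS calibration is more elaborate).

def pixel_to_direction(hz_gon, v_gon, px, py, cx, cy, mgon_per_pixel):
    """Convert a target's pixel position to oriented directions in gon."""
    dhz = (px - cx) * mgon_per_pixel / 1000.0  # mgon -> gon
    dv = (py - cy) * mgon_per_pixel / 1000.0
    return hz_gon + dhz, v_gon + dv

# telescope pointing 100.0000/95.0000 gon; target 4 px right, 28 px up
hz, v = pixel_to_direction(100.0, 95.0, 1300, 1000, 1296, 972, 0.9)
print(round(hz, 4), round(v, 4))
```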

  3. Healthcare Blockchain System Using Smart Contracts for Secure Automated Remote Patient Monitoring.

    Science.gov (United States)

    Griggs, Kristen N; Ossipova, Olya; Kohlios, Christopher P; Baccarini, Alessandro N; Howson, Emily A; Hayajneh, Thaier

    2018-06-06

    As Internet of Things (IoT) devices and other remote patient monitoring systems increase in popularity, security concerns about the transfer and logging of data transactions arise. In order to handle the protected health information (PHI) generated by these devices, we propose utilizing blockchain-based smart contracts to facilitate secure analysis and management of medical sensors. Using a private blockchain based on the Ethereum protocol, we created a system where the sensors communicate with a smart device that calls smart contracts and writes records of all events on the blockchain. This smart contract system would support real-time patient monitoring and medical interventions by sending notifications to patients and medical professionals, while also maintaining a secure record of who has initiated these activities. This would resolve many security vulnerabilities associated with remote patient monitoring and automate the delivery of notifications to all involved parties in a HIPAA compliant manner.
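The tamper-evident event log at the heart of this design can be conveyed with a plain hash-linked record chain (a conceptual stand-in, not Ethereum or Solidity code; field names are invented): each monitoring event embeds the hash of the previous record, so altering any stored event breaks verification of the chain.

```python
# Conceptual hash-linked event log (stand-in for the blockchain record of
# monitoring events; not Ethereum code, field names are assumptions).

import hashlib
import json

def append_event(chain, event):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    """Recompute every hash and check each record points at its predecessor."""
    for i, rec in enumerate(chain):
        body = {"event": rec["event"], "prev": rec["prev"]}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != rec["hash"]:
            return False
        if i and rec["prev"] != chain[i - 1]["hash"]:
            return False
    return True

log = []
append_event(log, {"sensor": "hr", "value": 58, "alert": "bradycardia"})
append_event(log, {"sensor": "hr", "value": 72, "alert": None})
print(verify(log))  # -> True
```

A smart-contract platform adds what this sketch lacks: distributed consensus on the chain and automated execution of the notification logic.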

  4. Automated measurement of pressure injury through image processing.

    Science.gov (United States)

    Li, Dan; Mathews, Carol

    2017-11-01

    To develop an image processing algorithm to automatically measure pressure injuries using electronic pressure injury images stored in nursing documentation. Photographing pressure injuries and storing the images in the electronic health record is standard practice in many hospitals. However, the manual measurement of pressure injury is time-consuming, challenging and subject to intra/inter-reader variability with complexities of the pressure injury and the clinical environment. A cross-sectional algorithm development study. A set of 32 pressure injury images were obtained from a western Pennsylvania hospital. First, we transformed the images from an RGB (i.e. red, green and blue) colour space to a YCbCr colour space to eliminate inferences from varying light conditions and skin colours. Second, a probability map, generated by a skin colour Gaussian model, guided the pressure injury segmentation process using the Support Vector Machine classifier. Third, after segmentation, the reference ruler - included in each of the images - enabled perspective transformation and determination of pressure injury size. Finally, two nurses independently measured those 32 pressure injury images, and the intraclass correlation coefficient was calculated. An image processing algorithm was developed to automatically measure the size of pressure injuries. Both inter- and intra-rater analysis achieved a good level of reliability. Validation of the size measurement of the pressure injury (1) demonstrates that our image processing algorithm is a reliable approach to monitoring pressure injury progress through clinical pressure injury images and (2) offers new insight to pressure injury evaluation and documentation. Once our algorithm is further developed, clinicians can be provided with an objective, reliable and efficient computational tool for segmentation and measurement of pressure injuries. With this, clinicians will be able to more effectively monitor the healing process of pressure
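The final measurement step, after segmentation and perspective transformation, reduces to simple arithmetic and can be sketched in isolation (the numbers are illustrative; the study's actual pipeline is summarized in the abstract above): the imaged reference ruler gives a mm-per-pixel scale, and the segmented wound pixel count becomes a physical area.

```python
# Stand-alone sketch of ruler-based wound sizing (segmentation omitted;
# all numbers are illustrative assumptions).

def wound_area_cm2(wound_pixels, ruler_pixels, ruler_mm):
    """Convert a segmented pixel count to cm^2 using the imaged ruler."""
    mm_per_px = ruler_mm / ruler_pixels
    area_mm2 = wound_pixels * mm_per_px ** 2
    return area_mm2 / 100.0  # mm^2 -> cm^2

# a 10 cm ruler spanning 400 px, wound mask of 20,000 px
print(wound_area_cm2(20000, 400, 100.0))  # -> 12.5
```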

  5. Verifiable process monitoring through enhanced data authentication

    International Nuclear Information System (INIS)

    Goncalves, Joao G.M.; Schwalbach, Peter; Schoeneman, Barry Dale; Ross, Troy D.; Baldwin, George Thomas

    2010-01-01

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
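The capture-authenticate-timestamp pattern of the branched data stream can be sketched as follows. Note the hedge: EDAS uses public-key authentication, while this self-contained example substitutes a symmetric HMAC purely to stay within the standard library; the structure (timestamp, data source, verifiable tag per captured message) is the point being illustrated.

```python
# Conceptual sketch of an authenticated branched data stream. EDAS itself
# uses public-key authentication; the HMAC here is a stdlib stand-in, and
# all field names and values are illustrative assumptions.

import hashlib
import hmac
import json
import time

KEY = b"shared-demo-key"  # demo only; EDAS uses asymmetric keys

def branch_record(source, payload, timestamp=None):
    """Capture a message, adding timestamp, source tag and a verifiable MAC."""
    record = {
        "source": source,
        "time": timestamp if timestamp is not None else time.time(),
        "payload": payload,
    }
    msg = json.dumps(record, sort_keys=True).encode()
    record["mac"] = hmac.new(KEY, msg, hashlib.sha256).hexdigest()
    return record

def verify_record(record):
    body = {k: v for k, v in record.items() if k != "mac"}
    msg = json.dumps(body, sort_keys=True).encode()
    return hmac.compare_digest(record["mac"], hmac.new(KEY, msg, hashlib.sha256).hexdigest())

rec = branch_record("density-meter-3", {"u_g_per_l": 212.4}, timestamp=1700000000.0)
print(verify_record(rec))  # -> True
```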

  6. Automated selected reaction monitoring data analysis workflow for large-scale targeted proteomic studies.

    Science.gov (United States)

    Surinova, Silvia; Hüttenhain, Ruth; Chang, Ching-Yun; Espona, Lucia; Vitek, Olga; Aebersold, Ruedi

    2013-08-01

    Targeted proteomics based on selected reaction monitoring (SRM) mass spectrometry is commonly used for accurate and reproducible quantification of protein analytes in complex biological mixtures. Strictly hypothesis-driven, SRM assays quantify each targeted protein by collecting measurements on its peptide fragment ions, called transitions. To achieve sensitive and accurate quantitative results, experimental design and data analysis must consistently account for the variability of the quantified transitions. This consistency is especially important in large experiments, which increasingly require profiling up to hundreds of proteins over hundreds of samples. Here we describe a robust and automated workflow for the analysis of large quantitative SRM data sets that integrates data processing, statistical protein identification and quantification, and dissemination of the results. The integrated workflow combines three software tools: mProphet for peptide identification via probabilistic scoring; SRMstats for protein significance analysis with linear mixed-effect models; and PASSEL, a public repository for storage, retrieval and query of SRM data. The input requirements for the protocol are files with SRM traces in mzXML format, and a file with a list of transitions in a text tab-separated format. The protocol is especially suited for data with heavy isotope-labeled peptide internal standards. We demonstrate the protocol on a clinical data set in which the abundances of 35 biomarker candidates were profiled in 83 blood plasma samples of subjects with ovarian cancer or benign ovarian tumors. The time frame to realize the protocol is 1-2 weeks, depending on the number of replicates used in the experiment.
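The quantification idea underlying heavy isotope-labeled internal standards can be conveyed compactly (a simplification for illustration, not what mProphet or SRMstats actually compute): each transition yields a light/heavy intensity ratio, and the peptide-level estimate is summarized robustly across its transitions.

```python
# Simplified light/heavy SRM quantification sketch (illustrative; the
# workflow described above uses probabilistic scoring and mixed-effect
# models rather than a bare median).

from statistics import median

def peptide_ratio(transitions):
    """transitions: list of (light_intensity, heavy_intensity) pairs."""
    ratios = [light / heavy for light, heavy in transitions if heavy > 0]
    return median(ratios)

print(peptide_ratio([(1200, 600), (900, 450), (100, 40)]))  # -> 2.0
```

The median keeps a single interfered transition (here the 100/40 pair) from skewing the peptide estimate, which is the same variability concern the workflow's statistical models address formally.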

  7. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...
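The flavour of a complex-event-processing rule, deriving a higher-level alert from a pattern in low-level events, can be sketched minimally (an illustration of the CEP idea, not the AAL project's code): a sliding time window over an error stream raises an alert when failures cluster.

```python
# Minimal CEP-style rule (illustrative): raise a higher-level alert when
# low-level errors cluster within a sliding time window.

from collections import deque

class BurstDetector:
    def __init__(self, window_s=60, threshold=3):
        self.window_s, self.threshold = window_s, threshold
        self.events = deque()

    def on_error(self, t):
        """Feed an error timestamp (seconds); return True on a detected burst."""
        self.events.append(t)
        while self.events and t - self.events[0] > self.window_s:
            self.events.popleft()
        return len(self.events) >= self.threshold

d = BurstDetector(window_s=60, threshold=3)
print([d.on_error(t) for t in (0, 20, 40, 200)])  # -> [False, False, True, False]
```

Production CEP engines generalize this to declarative pattern queries over many correlated streams, but the windowed-aggregation core is the same.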

  8. Safety monitoring in process and control

    International Nuclear Information System (INIS)

    Esparza, V. Jr.; Sebo, D.E.

    1984-01-01

    Safety Functions provide a method of ensuring the safe operation of any large-scale processing plant. Successful implementation of safety functions requires continuous monitoring of safety function values and trends. Because the volume of information handled by a plant operator occasionally can become overwhelming, attention may be diverted from the primary concern of maintaining plant safety. With this in mind, EG&G Idaho developed various methods and techniques for use in a computerized Safety Function Monitoring System and tested the application of these techniques using a simulated nuclear power plant, the Loss-of-Fluid Test Facility (LOFT) at the Idaho National Engineering Laboratory (INEL). This paper presents the methods used in the development of a Safety Function Monitoring System
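Monitoring both the current value and the trend of a safety function, so the operator is warned before a limit is actually crossed, can be sketched as follows (limits, horizon and messages are illustrative assumptions, not from the LOFT work):

```python
# Illustrative value-and-trend safety function check (all numbers assumed).

def assess(values, limit, horizon=5):
    """values: recent samples, oldest first. Returns a status string."""
    current = values[-1]
    if current >= limit:
        return "ALARM"
    # linear trend extrapolated `horizon` samples ahead
    slope = (values[-1] - values[0]) / (len(values) - 1)
    if slope > 0 and current + slope * horizon >= limit:
        return "WARNING: trending toward limit"
    return "OK"

print(assess([300, 310, 325, 340], limit=400))
```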

  9. Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.

    Science.gov (United States)

    Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H

    2009-01-01

    Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.

  10. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    International Nuclear Information System (INIS)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M.; Rahmat, K.; Ariffin, H.

    2012-01-01

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)
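The two reported scalars follow directly from the diffusion tensor eigenvalues via the standard DTI definitions (these are the textbook formulas, not code from the study):

```python
# Standard DTI scalar maps from tensor eigenvalues (textbook definitions).

import math

def md(l1, l2, l3):
    """Mean diffusivity: the average eigenvalue."""
    return (l1 + l2 + l3) / 3.0

def fa(l1, l2, l3):
    """Fractional anisotropy, in [0, 1]."""
    m = md(l1, l2, l3)
    num = (l1 - m) ** 2 + (l2 - m) ** 2 + (l3 - m) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return math.sqrt(1.5 * num / den)

print(round(fa(1.7, 0.3, 0.3), 3))  # strongly anisotropic voxel -> high FA
print(fa(1.0, 1.0, 1.0))            # isotropic voxel -> 0.0
```

Rising FA with falling MD, the pattern the study observes in the first year, corresponds to water diffusion becoming increasingly directional as axons myelinate.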

  11. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); Rahmat, K. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); University Malaya, Biomedical Imaging Department, Kuala Lumpur (Malaysia); Ariffin, H. [University of Malaya, Department of Paediatrics, Faculty of Medicine, Kuala Lumpur (Malaysia)

    2012-07-15

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)

  12. Automated and electronically assisted hand hygiene monitoring systems: a systematic review.

    Science.gov (United States)

    Ward, Melissa A; Schweizer, Marin L; Polgreen, Philip M; Gupta, Kalpana; Reisinger, Heather S; Perencevich, Eli N

    2014-05-01

    Hand hygiene is one of the most effective ways to prevent transmission of health care-associated infections. Electronic systems and tools are being developed to enhance hand hygiene compliance monitoring. Our systematic review assesses the existing evidence surrounding the adoption and accuracy of automated systems or electronically enhanced direct observations and also reviews the effectiveness of such systems in health care settings. We systematically reviewed PubMed for articles published between January 1, 2000, and March 31, 2013, containing the terms hand AND hygiene or hand AND disinfection or handwashing. Resulting articles were reviewed to determine if an electronic system was used. We identified 42 articles for inclusion. Four types of systems were identified: electronically assisted/enhanced direct observation, video-monitored direct observation systems, electronic dispenser counters, and automated hand hygiene monitoring networks. Fewer than 20% of articles identified included calculations for efficiency or accuracy. Limited data are currently available to recommend adoption of specific automatic or electronically assisted hand hygiene surveillance systems. Future studies should be undertaken that assess the accuracy, effectiveness, and cost-effectiveness of such systems. Given the restricted clinical and infection prevention budgets of most facilities, cost-effectiveness analysis of specific systems will be required before these systems are widely adopted. Published by Mosby, Inc.

  13. Automated processing of thermal infrared images of Osservatorio Vesuviano permanent surveillance network by using Matlab code

    Science.gov (United States)

    Sansivero, Fabio; Vilardo, Giuseppe; Caputo, Teresa

    2017-04-01

    The permanent thermal infrared surveillance network of Osservatorio Vesuviano (INGV) is composed of 6 stations which acquire IR frames of fumarole fields in the Campi Flegrei caldera and inside the Vesuvius crater (Italy). The IR frames are uploaded to a dedicated server in the Surveillance Center of Osservatorio Vesuviano in order to process the infrared data and to extract all the information they contain. In a first phase the infrared data are processed by an automated system (A.S.I.R.A. Acq - Automated System of IR Analysis and Acquisition) developed in the Matlab environment and with a user-friendly graphical user interface (GUI). ASIRA daily generates time-series of residual values of the maximum temperatures observed in the IR scenes after the removal of seasonal effects. These time-series are displayed in the Surveillance Room of Osservatorio Vesuviano and provide information about the evolution of the shallow temperature field of the observed areas. In particular, the features of ASIRA Acq include: a) efficient quality selection of IR scenes, b) IR image co-registration with respect to a reference frame, c) seasonal correction using a background-removal methodology, d) filing of IR matrices and of the processed data in shared archives accessible for interrogation. The daily archived records can also be processed by ASIRA Plot (Matlab code with GUI) to visualize IR data time-series and to help in evaluating input parameters for further data processing and analysis. Additional processing features are accomplished in a second phase by ASIRA Tools, Matlab code with a GUI developed to extract further information from the dataset in an automated way. The main functions of ASIRA Tools are: a) the analysis of temperature variations of each pixel of the IR frame in a given time interval, b) the removal of seasonal effects from the temperature of every pixel in the IR frames by using an analytic approach (removal of the sinusoidal long-term seasonal component by using a
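The seasonal correction can be illustrated with a simpler variant than the sinusoidal fit mentioned above (a monthly-mean background removal, assumed purely for this example and written in Python rather than the system's Matlab): the residual temperature is the observation minus the long-term mean for that calendar month.

```python
# Simplified seasonal background removal (monthly means; the actual system
# removes a fitted sinusoidal component, and is written in Matlab).

from collections import defaultdict

def residuals(samples):
    """samples: list of (month, max_temp_C). Returns (month, residual) pairs."""
    by_month = defaultdict(list)
    for month, temp in samples:
        by_month[month].append(temp)
    means = {m: sum(v) / len(v) for m, v in by_month.items()}
    return [(m, t - means[m]) for m, t in samples]

data = [(1, 60.0), (1, 62.0), (7, 80.0), (7, 86.0)]
print(residuals(data))  # seasonal (winter/summer) offset removed
```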

  14. Automated Processing of ISIS Topside Ionograms into Electron Density Profiles

    Science.gov (United States)

    Reinisch, bodo W.; Huang, Xueqin; Bilitza, Dieter; Hills, H. Kent

    2004-01-01

    Modeling of the topside ionosphere has for the most part relied on just a few years of data from topside sounder satellites. The widely used Bent et al. (1972) model, for example, is based on only 50,000 Alouette 1 profiles. The International Reference Ionosphere (IRI) (Bilitza, 1990, 2001) uses an analytical description of the graphs and tables provided by Bent et al. (1972). The Alouette 1, 2 and ISIS 1, 2 topside sounder satellites of the sixties and seventies were ahead of their times in terms of the sheer volume of data obtained and in terms of the computer and software requirements for data analysis. As a result, only a small percentage of the collected topside ionograms was converted into electron density profiles. Recently, a NASA-funded data restoration project has undertaken and is continuing the process of digitizing the Alouette/ISIS ionograms from the analog 7-track tapes. Our project involves the automated processing of these digital ionograms into electron density profiles. The project accomplished a set of important goals that will have a major impact on understanding and modeling of the topside ionosphere: (1) The TOPside Ionogram Scaling and True height inversion (TOPIST) software was developed for the automated scaling and inversion of topside ionograms. (2) The TOPIST software was applied to the over 300,000 ISIS-2 topside ionograms that had been digitized in the framework of a separate AISRP project (PI: R.F. Benson). (3) The new TOPIST-produced database of global electron density profiles for the topside ionosphere was made publicly available through NASA's National Space Science Data Center (NSSDC) ftp archive at . (4) Earlier Alouette 1, 2 and ISIS 1, 2 data sets of electron density profiles from manual scaling of selected sets of ionograms were converted from a highly-compressed binary format into a user-friendly ASCII format and made publicly available through nssdcftp.gsfc.nasa.gov. 
The new database for the topside ionosphere established

  15. Automated Intelligent Monitoring and the Controlling Software System for Solar Panels

    Science.gov (United States)

    Nalamwar, H. S.; Ivanov, M. A.; Baidali, S. A.

    2017-01-01

    The inspection of solar panels on a periodic basis is important to improve longevity and ensure performance of the solar system. Getting the most out of a photovoltaic (PV) system is possible through an intelligent monitoring and controlling system. The monitoring and controlling system has rapidly increased in popularity because of its user-friendly graphical interface for data acquisition, monitoring, controlling and measurements. In order to monitor the performance of a system, especially for renewable energy source applications such as solar photovoltaics (PV), data-acquisition systems have been used to collect all the data regarding the installed system. In this paper the development of a smart automated monitoring and controlling system for solar panels is described; the core idea is based on IoT (the Internet of Things). The measurements are made using sensors, block management data acquisition modules, and a software system. All the real-time electrical output parameters of the PV plant, such as voltage, current and generated electricity, are then displayed and stored in the block management. The proposed system is smart enough to make suggestions if a panel is not working properly, to display errors, to send maintenance reminders through email or SMS, and to rotate panels according to the sun's position using an ephemeris table stored in the system. The main advantage of the system is that the performance of the solar panels can be continuously monitored and analyzed.
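    The fault-suggestion logic is not specified in the abstract; a minimal sketch of one plausible rule (all names, values and tolerances hypothetical) compares each panel's measured output against an expected value:

    ```python
    # Hypothetical fault-suggestion rule: flag panels whose measured power
    # falls more than a tolerance fraction below the expected output.

    def check_panels(readings, expected_w, tolerance=0.2):
        """Return maintenance suggestions for underperforming panels."""
        alerts = []
        for panel_id, measured_w in readings.items():
            if measured_w < expected_w * (1 - tolerance):
                alerts.append(
                    f"Panel {panel_id}: {measured_w:.0f} W measured, "
                    f"expected ~{expected_w:.0f} W -- check for faults"
                )
        return alerts

    alerts = check_panels({"P1": 210.0, "P2": 140.0}, expected_w=200.0)
    ```

    In a deployed system such alerts would feed the email/SMS reminder channel the abstract describes.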

  16. Ideal versus real automated twin column recycling chromatography process.

    Science.gov (United States)

    Gritti, Fabrice; Leal, Mike; McDonald, Thomas; Gilar, Martin

    2017-07-28

    The full baseline separation of two compounds with selectivity factors close to one by twin-column recycling separation process (TCRSP) chromatography is used to confirm that the speed-resolution performance of the TCRSP is intrinsically superior to that of the single-column process. This advantage is illustrated in this work by developing an automated TCRSP for the challenging separation of two polycyclic aromatic hydrocarbon (PAH) isomers (benzo[a]anthracene and chrysene) in the reversed-phase retention mode at pressures smaller than 5000 psi. The columns used are a 3.0 mm x 150 mm column packed with 3.5 μm XBridge BEH-C18 material (α = 1.010) and 3.0 mm or 4.6 mm x 150 mm columns packed with the same 3.5 μm XSelect HSS T3 material (α = 1.025). The isocratic mobile phase is an acetonitrile-water mixture (80/20, v/v). Remarkably, significant differences are observed between the predicted retention times and efficiencies of the ideal TCRSP (given by the number of cycles multiplied by the retention time and efficiency of one column) and those of the real TCRSP. The fundamental explanation lies in the pressure-dependent retention of these PAHs or, equivalently, in the change of their partial molar volume as they are transferred from the mobile to the stationary phase. A revisited retention and efficiency model is then built to predict the actual performance of real TCRSPs. The experimental and calculated resolution data are found to be in very good agreement for a change, Δvm = -10 cm³/mol, of the partial molar volume of the two PAH isomers upon transfer from the acetonitrile-water eluent to the silica-C18 stationary phase. Copyright © 2017 Elsevier B.V. All rights reserved.
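    The pressure dependence invoked here follows the standard thermodynamic relation d(ln k)/dP = -Δvm/(RT). A short sketch (assuming T = 298 K, which the abstract does not state) shows that Δvm = -10 cm³/mol across a 5000 psi pressure change shifts the retention factor by roughly 15%:

    ```python
    import math

    # Illustration of the standard relation (d ln k / dP) = -dv_m / (R*T):
    # a negative partial-molar-volume change makes retention grow with pressure.

    R = 8.314            # gas constant, J/(mol*K)
    T = 298.0            # assumed column temperature, K (not given in the abstract)
    dv_m = -10e-6        # dv_m = -10 cm^3/mol expressed in m^3/mol
    dP = 5000 * 6894.76  # 5000 psi expressed in Pa

    # Relative change of the retention factor k across the pressure range
    ratio = math.exp(-dv_m * dP / (R * T))   # ~1.15, i.e. ~15% more retention
    ```

    A shift of this size is easily large enough to explain measurable deviations between the ideal and real TCRSP predictions.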

  17. Alcoholics' selective attention to alcohol stimuli: automated processing?

    Science.gov (United States)

    Stormark, K M; Laberg, J C; Nordby, H; Hugdahl, K

    2000-01-01

    This study investigated alcoholics' selective attention to alcohol words in a version of the Stroop color-naming task. Alcoholic subjects (n = 23) and nonalcoholic control subjects (n = 23) identified the color of Stroop versions of alcohol, emotional, neutral and color words. Manual reaction times (RTs), skin conductance responses (SCRs) and heart rate (HR) were recorded. Alcoholics showed overall longer RTs than controls while both groups were slower in responding to the incongruent color words than to the other words. Alcoholics showed longer RTs to both alcohol (1522.7 milliseconds [ms]) and emotional words (1523.7 ms) than to neutral words (1450.8 ms) which suggests that the content of these words interfered with the ability to attend to the color of the words. There was also a negative correlation (r = -.41) between RT and response accuracy to alcohol words for the alcoholics, reflecting that the longer time the alcoholics used to respond to the color of the alcohol words, the more incorrect their responses were. The alcoholics also showed significantly greater SCRs to alcohol words (0.16 microSiemens) than to any of the other words (ranging from 0.04-0.08 microSiemens), probably reflecting the emotional significance of the alcohol words. Finally, the alcoholics evidenced smaller HR acceleration to alcohol (1.9 delta bpm) compared to neutral (2.8 delta bpm), which could be related to difficulties alcoholics experience in terminating their attention to the alcohol words. These findings indicate that it is difficult for alcoholics to regulate their attention to alcohol stimuli, suggesting that alcoholics' processing of alcohol information is automated.
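    The interference effect reported above is simple arithmetic on the mean reaction times; a sketch using the values quoted in the abstract:

    ```python
    # Stroop interference = RT for content words minus RT for neutral words.
    # Mean RTs (ms) are taken from the abstract.

    rts = {"alcohol": 1522.7, "emotional": 1523.7, "neutral": 1450.8}

    interference = {w: rts[w] - rts["neutral"] for w in ("alcohol", "emotional")}
    # interference["alcohol"] is ~71.9 ms of slowing relative to neutral words
    ```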

  18. Fully automated data acquisition, processing, and display in equilibrium radioventriculography

    International Nuclear Information System (INIS)

    Bourguignon, M.H.; Douglass, K.H.; Links, J.M.; Wagner, H.N. Jr.; Johns Hopkins Medical Institutions, Baltimore, MD

    1981-01-01

    A fully automated data acquisition, processing, and display procedure was developed for equilibrium radioventriculography. After a standardized acquisition, the study is automatically analyzed to yield both right and left ventricular time-activity curves. The program first creates a series of edge-enhanced images (difference between squared images and scaled original images). A marker point within each ventricle is then identified as that pixel with maximum counts to the patient's right and left of the count center of gravity of a stroke volume image. Regions of interest are selected on each frame as the first contour of local maxima of the two-dimensional second derivative (pseudo-Laplacian) which encloses the appropriate marker point, using a method developed by Goris. After shifting the left ventricular end-systolic region of interest four pixels to the patient's left, a background region of interest is generated as the crescent-shaped area of the shifted region of interest not intersected by the end systolic region. The average counts/pixel in this background region in the end systolic frame of the original study are subtracted from each pixel in all frames of the gated study. Right and left ventricular time-activity curves are then obtained by applying each region of interest to its corresponding background-subtracted frame, and the ejection fraction, end diastolic, end systolic, and stroke counts determined for both ventricles. In fourteen consecutive patients, in addition to the automatic ejection fractions, manually drawn regions of interest were used to obtain ejection fractions for both ventricles. The manual regions of interest were drawn twice, and the average obtained. (orig./TR)
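    The ejection-fraction computation described above reduces to arithmetic on background-subtracted counts; a minimal sketch with illustrative numbers (not from the study):

    ```python
    # Count-based ejection fraction from background-subtracted
    # end-diastolic (ED) and end-systolic (ES) frames.

    def ejection_fraction(ed_counts, es_counts):
        """EF = stroke counts / end-diastolic counts."""
        stroke_counts = ed_counts - es_counts
        return stroke_counts / ed_counts

    ef = ejection_fraction(ed_counts=12000.0, es_counts=5400.0)  # illustrative
    ```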

  19. Partitioning, Automation and Error Recovery in the Control and Monitoring System of an LHC Experiment

    Institute of Scientific and Technical Information of China (English)

    C.Gaspar

    2001-01-01

    The Joint Controls Project (JCOP) is a collaboration between CERN and the four LHC experiments to find and implement common solutions for their control and monitoring systems. As part of this project, an Architecture Working Group was set up in order to study the requirements and devise an architectural model that would suit the four experiments. Many issues were studied by this working group: alarm handling, access control, hierarchical control, etc. This paper will report on the specific issue of hierarchical control and in particular partitioning, automation and error recovery.

  20. Towards the development of an automated ATP measuring platform to monitor microbial quality of drinking water

    DEFF Research Database (Denmark)

    Tatari, Karolina; Hansen, C. B.; Rasmussen, A.

    This work aimed to develop an automated and nearly on-line method to monitor ATP levels in drinking water as an indicator of microbial contamination. The system consists of a microfluidic cartridge installed in a light-tight box, where the sample is mixed with the reagents and the emitted light is detected by a photomultiplier. Temperature in the assay box is controlled and set to 25°C. Calibration of the system using ATP standard solutions was successful, both for free and for total ATP. Chemical release of ATP by reagent addition, however, resulted in the formation of particles that ultimately...
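    The calibration step (ATP standards versus emitted light) can be sketched as a simple linear fit followed by inverse prediction; the standard concentrations and luminescence readings below are hypothetical:

    ```python
    # Linear calibration of luminescence (RLU) against ATP standards,
    # then inversion to estimate a sample's ATP concentration.

    def fit_line(xs, ys):
        """Least-squares slope and intercept for y = slope*x + intercept."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
                sum((x - mx) ** 2 for x in xs)
        return slope, my - slope * mx

    std_atp = [0.0, 1.0, 2.0, 4.0]        # nM ATP standards (hypothetical)
    std_rlu = [5.0, 105.0, 205.0, 405.0]  # measured luminescence (hypothetical)

    slope, intercept = fit_line(std_atp, std_rlu)
    sample_atp = (250.0 - intercept) / slope   # invert for a sample reading
    ```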

  1. Development of an automated data acquisition and processing pipeline using multiple telescopes for observing transient phenomena

    Science.gov (United States)

    Savant, Vaibhav; Smith, Niall

    2016-07-01

    We report on the current status in the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes based in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to throw more light on the microvariability of blazars employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of the fact that they are located in strategically separated time-zones. Ultimately we wish to investigate the applicability of Shock-in-Jet and Geometric models. These try to explain the processes at work in AGNs which result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and which has been optimised for simultaneous two-band photometry on our 16" OTA.
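    The variability trigger described above (alert when photometry deviates beyond a threshold) can be sketched as follows; the 0.1 mag threshold and the sample values are illustrative, not TARA's actual parameters:

    ```python
    # Flag a target when its latest differential magnitude departs from
    # the running baseline by more than a threshold.

    def variability_alert(diff_mags, threshold=0.1):
        """True if the newest measurement deviates from the baseline mean."""
        baseline, latest = diff_mags[:-1], diff_mags[-1]
        mean = sum(baseline) / len(baseline)
        return abs(latest - mean) > threshold

    alert = variability_alert([0.02, 0.01, 0.03, 0.18])  # deviates by 0.16 mag
    ```

    On an alert, the reporting node would broadcast the source so the other time-zone-separated nodes can follow up in multiple bands.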

  2. How automation helps steer the revenue cycle process.

    Science.gov (United States)

    Colpas, Phil

    2013-06-01

    top-of-mind issue as we see how healthcare reform plays out. Here's what our select group of experts had to say about how automation helps to steer the revenue cycle process.

  3. Process monitoring using optical ultrasonic wave detection

    International Nuclear Information System (INIS)

    Telschow, K.L.; Walter, J.B.; Garcia, G.V.; Kunerth, D.C.

    1989-01-01

    Optical ultrasonic wave detection techniques are being developed for process monitoring. An important limitation on optical techniques is that the material surface, in materials processing applications, is usually not a specular reflector and in many cases is totally diffusely reflecting. This severely degrades the light collected by the detection optics, greatly reducing the intensity and randomly scattering the phase of the reflected light. A confocal Fabry-Perot interferometer, which is sensitive to the Doppler frequency shift resulting from the surface motion and not to the phase of the collected light, is well suited to detecting ultrasonic waves in diffusely reflecting materials. This paper describes the application of this detector to the real-time monitoring of the sintering of ceramic materials. 8 refs., 5 figs

  4. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    Devol, Timothy A.

    2005-01-01

    Comparison of different pulse shape discrimination methods was performed under two different experimental conditions and the best method was identified. Beta/gamma discrimination of 90Sr/90Y and 137Cs was performed using a phoswich detector made of BC400 (2.5 cm O.D. x 1.2 cm) and BGO (2.5 cm O.D. x 2.5 cm) scintillators. Alpha/gamma discrimination of 210Po and 137Cs was performed using a CsI:Tl (2.8 x 1.4 x 1.4 cm³) scintillation crystal. The pulse waveforms were digitized with a DGF-4c (X-Ray Instrumentation Associates) and analyzed offline with IGOR Pro software (Wavemetrics, Inc.). The four pulse shape discrimination methods compared were: rise time discrimination, digital constant fraction discrimination, charge ratio, and constant time discrimination (CTD). The CTD method takes the ratio of the pulse height at a particular time after the beginning of the pulse to the maximum pulse height. The charge comparison method resulted in a Figure of Merit (FoM) of 3.3 (9.9% spillover) and 3.7 (0.033% spillover) for the phoswich and CsI:Tl setups, respectively. The CTD method resulted in a FoM of 3.9 (9.2% spillover) and 3.2 (0.25% spillover), respectively. Inverting the pulse shape data typically resulted in a significantly higher FoM than conventional methods, but there was no reduction in the spillover values. This outcome illustrates that the FoM may not be a good metric for quantifying a system's ability to perform pulse shape discrimination. Comparison of several pulse shape discrimination (PSD) methods was performed as a means to compare traditional analog and digital PSD methods on the same scintillation pulses. The X-Ray Instrumentation Associates DGF-4C (40 Msps, 14-bit) was used to digitize waveforms from a CsI:Tl crystal and a BC400/BGO phoswich detector
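    The charge-ratio method and the figure of merit used above can be sketched as follows; the digitized pulses and peak statistics are illustrative, not data from the DGF-4c:

    ```python
    # Charge-comparison PSD: ratio of tail integral to total integral of a
    # digitized pulse. FoM = |mu1 - mu2| / (FWHM1 + FWHM2) for the two
    # peaks in the PSD-parameter histogram.

    def charge_ratio(pulse, tail_start):
        """Tail-to-total charge ratio of a digitized pulse."""
        return sum(pulse[tail_start:]) / sum(pulse)

    def figure_of_merit(mu1, fwhm1, mu2, fwhm2):
        return abs(mu1 - mu2) / (fwhm1 + fwhm2)

    fast = [0, 8, 40, 20, 6, 2, 1, 0]    # fast-decay pulse (illustrative)
    slow = [0, 6, 30, 22, 14, 9, 5, 2]   # slow-decay pulse (illustrative)
    r_fast = charge_ratio(fast, tail_start=4)
    r_slow = charge_ratio(slow, tail_start=4)     # larger tail fraction
    fom = figure_of_merit(0.12, 0.03, 0.34, 0.04)
    ```

    As the abstract notes, a high FoM measures only peak separation, not the spillover in the overlap region, which is why the two can disagree.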

  5. Geophysical methods for monitoring soil stabilization processes

    Science.gov (United States)

    Saneiyan, Sina; Ntarlagiannis, Dimitrios; Werkema, D. Dale; Ustra, Andréa

    2018-01-01

    Soil stabilization involves methods used to turn unconsolidated and unstable soil into a stiffer, consolidated medium that could support engineered structures, alter permeability, change subsurface flow, or immobilize contamination through mineral precipitation. Among the variety of available methods, carbonate precipitation is a very promising one, especially when it is induced through common soil-borne microbes (MICP: microbially induced carbonate precipitation). Such microbially mediated precipitation has the added benefit of not harming the environment, whereas other methods can be environmentally detrimental. Carbonate precipitation, typically in the form of calcite, is a naturally occurring process that can be manipulated to deliver the expected soil strengthening results or permeability changes. This study investigates the ability of spectral induced polarization (SIP) and shear-wave velocity measurements to monitor calcite-driven soil strengthening processes. The results support the use of these geophysical methods as soil strengthening characterization and long-term monitoring tools, which is a requirement for viable soil stabilization projects. Both tested methods are sensitive to calcite precipitation, with SIP offering additional information related to long-term stability of precipitated carbonate. Carbonate precipitation has been confirmed with direct methods, such as direct sampling and scanning electron microscopy (SEM). This study advances our understanding of soil strengthening processes and permeability alterations, and is a crucial step for the use of geophysical methods as monitoring tools in microbially induced soil alterations through carbonate precipitation.

  6. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    Science.gov (United States)

    2018-01-01

    ARL-TR-8284 ● JAN 2018 ● US Army Research Laboratory ● Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation. ...although some minor changes may be needed. The program processes a GTRAJ output text file that contains results from 2 or more simulations, where each

  7. Marketing automation processes as a way to improve contemporary marketing of a company

    OpenAIRE

    Witold Świeczak

    2013-01-01

    The main aim of this article is to identify the possibilities which are given to contemporary companies thanks to the usage of processes included in a marketing automation system. This publication deals with the key aspects of this issue. It shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation. This article defines the factors and processes which influence...

  8. Process instrument monitoring for SNM solution surveillance

    International Nuclear Information System (INIS)

    Armatys, C.M.; Johnson, C.E.; Wagner, E.P.

    1983-02-01

    A process monitoring computer system at the Idaho Chemical Processing Plant (ICPP) is being used to evaluate nuclear fuel reprocessing plant data for Safeguards surveillance capabilities. The computer system was installed to collect data from the existing plant instruments and to evaluate what safeguards assurances can be provided to complement conventional accountability and physical protection measures. Movements of solutions containing special nuclear material (SNM) can be observed, activities associated with accountancy measurements (mixing, sampling, and bulk measurement) can be confirmed, and long-term storage of SNM solutions can be monitored to ensure containment. Special precautions must be taken, both in system design and operation, to ensure adequate coverage of essential measured parameters and interpretation of process data, which can be compromised by instrument malfunctions or failures, unreliable data collection, or process activities that deviate from readily identified procedures. Experience at ICPP and prior evaluations at the Tokai reprocessing plant show that the use of process data can provide assurances that accountability measurement procedures are followed and SNM solutions are properly contained and can help confirm that SNM controls are in effect within a facility

  9. Automated processing of data on the use of motor vehicles in the Serbian Armed Forces

    Directory of Open Access Journals (Sweden)

    Nikola S. Osmokrović

    2012-10-01

    The main aim of introducing information technology into the armed forces is the automation of the management process. The management in movement and transport (M&T) in our armed forces has been included in the process of automation from the beginning. For that reason, today we can speak about the automated processing of data on road traffic safety and on the use of motor vehicles. With regard to the overall development of the information system of the movement and transport service, the paper presents an information system of the M&T service for the processing of data on the use of motor vehicles. The main features, components and functions of the 'Vozila' application, which was specially developed for the automated processing of data on motor vehicle use, are explained in particular.

  10. Prajna: adding automated reasoning to the visual- analysis process.

    Science.gov (United States)

    Swing, E

    2010-01-01

    Developers who create applications for knowledge representation must contend with challenges in both the abundance of data and the variety of toolkits, architectures, and standards for representing it. Prajna is a flexible Java toolkit designed to overcome these challenges with an extensible architecture that supports both visualization and automated reasoning.

  11. Automation of photographic film processing in industrial radiography

    International Nuclear Information System (INIS)

    Arzhenukhin, V.K.; Grachev, A.V.; Kuleshov, A.V.; Majorov, A.N.

    1973-01-01

    The automation of photoprocessing of industrial radiographs is discussed. It brings to the foreground the requirements on the physicomechanical characteristics of the films. The most widely used radiographic film types and photoprocessing conditions are presented. The permissible mechanical effects during film photoprocessing are established as a function of temperature. The main technical parameters of the automatic unit for radiographic film photoprocessing are presented

  12. Results and discussion of laboratory experiences with different automated TLD readers for personnel monitoring

    International Nuclear Information System (INIS)

    Regulla, D.F.; Drexeler, G.

    Although film seems set to continue serving as the main personnel dosemeter in Germany for the foreseeable future, the evolution of solid state techniques in particular, and their properties, is being thoroughly considered with respect to a possible generalized application in personnel monitoring. For this reason, different commercially available automated TLD systems have been investigated in the laboratory in order to determine their usefulness for a large-scale or decentralized service. Along with studying the dosimetric and instrumental parameters, the question of which monitoring philosophy these TLD systems best fit has been discussed. Both the experimental experience achieved and the results of basic discussions that in turn influence the discussion about the necessary outfit of personnel TL dosemeters are reported

  13. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline.

    Science.gov (United States)

    Loh, K B; Ramli, N; Tan, L K; Roziah, M; Rahmat, K; Ariffin, H

    2012-07-01

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using automated ROIs from an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months, during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. Diffusion tensor imaging outperforms conventional MRI in depicting white matter maturation. • DTI will become an important clinical tool for diagnosing paediatric neurological diseases. • DTI appears especially helpful for developmental abnormalities, tumours and white matter disease. • An automated processing pipeline assists quantitative analysis of high-throughput DTI data.
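    The atlas-based VOI extraction amounts to averaging the FA (or MD) map over atlas-labelled voxels; a minimal sketch with plain lists standing in for the registered NIfTI volumes (labels and values illustrative):

    ```python
    # Mean FA over the voxels carrying a given atlas label, after the FA
    # map and atlas have been registered to the same space.

    fa_map       = [0.2, 0.5, 0.6, 0.1]   # flattened per-voxel FA values
    atlas_labels = [1,   2,   2,   0]     # 0 = background, 1/2 = atlas VOIs

    def mean_fa(fa, labels, region):
        vals = [v for v, lab in zip(fa, labels) if lab == region]
        return sum(vals) / len(vals)

    fa_voi2 = mean_fa(fa_map, atlas_labels, region=2)   # (0.5 + 0.6) / 2
    ```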

  14. A modular, prospective, semi-automated drug safety monitoring system for use in a distributed data environment.

    Science.gov (United States)

    Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian

    2014-06-01

    The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. 
In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years
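    The sequential alerting algorithm itself is not specified in the abstract; a toy stand-in (not the authors' method) alerts when cumulative observed events in the exposed cohort exceed expectation by a z-score margin under a Poisson assumption:

    ```python
    import math

    # Toy sequential alerting rule: at each monitoring look, alert when the
    # cumulative observed event count exceeds the expected count by more
    # than z standard deviations (Poisson variance = expected count).

    def sequential_alert(observed_by_look, expected_by_look, z=2.0):
        """Return the first look index that triggers an alert, else None."""
        for i, (obs, exp) in enumerate(zip(observed_by_look, expected_by_look)):
            if obs > exp + z * math.sqrt(exp):
                return i
        return None

    look = sequential_alert(observed_by_look=[3, 8, 15],
                            expected_by_look=[2.5, 5.0, 7.5])
    ```

    A production system would instead use a method with controlled overall type I error across looks (e.g. a sequential probability ratio test), since repeated naive testing inflates false alarms.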

  15. Automated multivariate analysis of multi-sensor data submitted online: Real-time environmental monitoring.

    Science.gov (United States)

    Eide, Ingvar; Westad, Frank

    2018-01-01

    A pilot study demonstrating real-time environmental monitoring with automated multivariate analysis of multi-sensor data submitted online has been performed at the cabled LoVe Ocean Observatory located at 258 m depth 20 km off the coast of Lofoten-Vesterålen, Norway. The major purpose was efficient monitoring of many variables simultaneously and early detection of changes and time-trends in the overall response pattern before changes were evident in individual variables. The pilot study was performed with 12 sensors from May 16 to August 31, 2015. The sensors provided data for chlorophyll, turbidity, conductivity, temperature (three sensors), salinity (calculated from temperature and conductivity), biomass at three different depth intervals (5-50, 50-120, 120-250 m), and current speed measured in two directions (east and north) using two sensors covering different depths with overlap. A total of 88 variables were monitored, 78 from the two current speed sensors. The time-resolution varied, thus the data had to be aligned to a common time resolution. After alignment, the data were interpreted using principal component analysis (PCA). Initially, a calibration model was established using data from May 16 to July 31. The data on current speed from two sensors were subject to two separate PCA models and the score vectors from these two models were combined with the other 10 variables in a multi-block PCA model. The observations from August were projected on the calibration model consecutively one at a time and the result was visualized in a score plot. Automated PCA of multi-sensor data submitted online is illustrated with an attached time-lapse video covering the relatively short time period used in the pilot study. Methods for statistical validation, and warning and alarm limits are described. Redundant sensors enable sensor diagnostics and quality assurance. In a future perspective, the concept may be used in integrated environmental monitoring.
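    The projection step, scoring each new observation against the calibration-period PCA model and checking the score against warning limits, can be sketched as follows; the means, loadings, and limits are hypothetical, not values derived from the LoVe data:

    ```python
    # Project a new observation onto a fixed first-PC loading vector from
    # the calibration model, then test the score against warning limits.

    def project(obs, loadings, means):
        """Score of one mean-centred observation on a principal component."""
        return sum((x - m) * w for x, m, w in zip(obs, means, loadings))

    means    = [10.0, 5.0, 0.5]   # calibration-period variable means (hypothetical)
    loadings = [0.8, -0.5, 0.33]  # first-PC loadings (hypothetical)
    limits   = (-2.0, 2.0)        # warning limits on the score (hypothetical)

    score = project([10.5, 4.0, 0.5], loadings, means)
    in_control = limits[0] <= score <= limits[1]
    ```

    Consecutive scores plotted against the limits reproduce the score-plot monitoring the abstract describes.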

  16. Automated multivariate analysis of multi-sensor data submitted online: Real-time environmental monitoring.

    Directory of Open Access Journals (Sweden)

    Ingvar Eide

    A pilot study demonstrating real-time environmental monitoring with automated multivariate analysis of multi-sensor data submitted online has been performed at the cabled LoVe Ocean Observatory located at 258 m depth 20 km off the coast of Lofoten-Vesterålen, Norway. The major purpose was efficient monitoring of many variables simultaneously and early detection of changes and time-trends in the overall response pattern before changes were evident in individual variables. The pilot study was performed with 12 sensors from May 16 to August 31, 2015. The sensors provided data for chlorophyll, turbidity, conductivity, temperature (three sensors), salinity (calculated from temperature and conductivity), biomass at three different depth intervals (5-50, 50-120, 120-250 m), and current speed measured in two directions (east and north) using two sensors covering different depths with overlap. A total of 88 variables were monitored, 78 from the two current speed sensors. The time-resolution varied, thus the data had to be aligned to a common time resolution. After alignment, the data were interpreted using principal component analysis (PCA). Initially, a calibration model was established using data from May 16 to July 31. The data on current speed from two sensors were subject to two separate PCA models and the score vectors from these two models were combined with the other 10 variables in a multi-block PCA model. The observations from August were projected on the calibration model consecutively one at a time and the result was visualized in a score plot. Automated PCA of multi-sensor data submitted online is illustrated with an attached time-lapse video covering the relatively short time period used in the pilot study. Methods for statistical validation, and warning and alarm limits are described. Redundant sensors enable sensor diagnostics and quality assurance. In a future perspective, the concept may be used in integrated environmental monitoring.

  17. Testing the effectiveness of automated acoustic sensors for monitoring vocal activity of Marbled Murrelets Brachyramphus marmoratus

    Science.gov (United States)

    Cragg, Jenna L.; Burger, Alan E.; Piatt, John F.

    2015-01-01

    Cryptic nest sites and secretive breeding behavior make population estimates and monitoring of Marbled Murrelets Brachyramphus marmoratus difficult and expensive. Standard audio-visual and radar protocols have been refined but require intensive field time by trained personnel. We examined the detection range of automated sound recorders (Song Meters; Wildlife Acoustics Inc.) and the reliability of automated recognition models (“recognizers”) for identifying and quantifying Marbled Murrelet vocalizations during the 2011 and 2012 breeding seasons at Kodiak Island, Alaska. The detection range of murrelet calls by Song Meters was estimated to be 60 m. Recognizers detected 20 632 murrelet calls (keer and keheer) from a sample of 268 h of recordings, yielding 5 870 call series, which compared favorably with human scanning of spectrograms (on average detecting 95% of the number of call series identified by a human observer, but not necessarily the same call series). The false-negative rate (percentage of murrelet call series that the recognizers failed to detect) was 32%, mainly involving weak calls and short call series. False-positives (other sounds included by recognizers as murrelet calls) were primarily due to complex songs of other bird species, wind and rain. False-positives were lower in forest nesting habitat (48%) and highest in shrubby vegetation where calls of other birds were common (97%–99%). Acoustic recorders tracked spatial and seasonal trends in vocal activity, with higher call detections in high-quality forested habitat and during late July/early August. Automated acoustic monitoring of Marbled Murrelet calls could provide cost-effective, valuable information for assessing habitat use and temporal and spatial trends in nesting activity; reliability is dependent on careful placement of sensors to minimize false-positives and on prudent application of digital recognizers with visual checking of spectrograms.
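    The error rates quoted above are simple proportions over call series; a sketch of the arithmetic (counts illustrative, not the study's raw tallies):

    ```python
    # False-negative rate: share of human-identified call series the
    # recognizer missed. False-positive rate: share of recognizer
    # detections that were not murrelet calls.

    def fn_rate(human_series, detected_of_those):
        return 1 - detected_of_those / human_series

    def fp_rate(detections, true_detections):
        return (detections - true_detections) / detections

    fn = fn_rate(human_series=100, detected_of_those=68)   # 0.32, as reported
    fp = fp_rate(detections=200, true_detections=104)      # 0.48, forest habitat
    ```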

  18. Use of automated medication adherence monitoring in bipolar disorder research: pitfalls, pragmatics, and possibilities.

    Science.gov (United States)

    Levin, Jennifer B; Sams, Johnny; Tatsuoka, Curtis; Cassidy, Kristin A; Sajatovic, Martha

    2015-04-01

    Medication nonadherence occurs in 20-60% of persons with bipolar disorder (BD) and is associated with serious negative outcomes, including relapse, hospitalization, incarceration, suicide and high healthcare costs. Various strategies have been developed to measure adherence in BD. This descriptive paper summarizes challenges and workable strategies using electronic medication monitoring in a randomized clinical trial (RCT) in patients with BD. Descriptive data from 57 nonadherent individuals with BD enrolled in a prospective RCT evaluating a novel customized adherence intervention versus control were analyzed. Analyses focused on whole group data and did not assess intervention effects. Adherence was assessed with the self-reported Tablets Routine Questionnaire and the Medication Event Monitoring System (MEMS). The majority of participants were women (74%), African American (69%), with type I BD (77%). Practical limitations of MEMS included misuse in conjunction with pill minders, polypharmacy, cost, failure to bring to research visits, losing the device, and the device impacting baseline measurement. The advantages were more precise measurement, less biased recall, and collecting data from past time periods for missed interim visits. Automated devices such as MEMS can assist investigators in evaluating adherence in patients with BD. Knowing the anticipated pitfalls allows study teams to implement preemptive procedures for successful implementation in BD adherence studies and can help pave the way for future refinements as automated adherence assessment technologies become more sophisticated and readily available.
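
A MEMS cap records bottle-opening events, and an adherence score is then the fraction of prescribed doses with a recorded opening. A minimal sketch (the function name and the 100% cap are assumptions, not the trial's scoring rule; note the abstract's caveat that pill minders and extra openings distort raw counts):

```python
def mems_adherence(opening_events, prescribed_per_day, days):
    """Percent of prescribed doses for which a bottle opening was recorded.
    Capped at 100% so pill-minder refills or curiosity openings cannot
    inflate the score."""
    expected = prescribed_per_day * days
    return min(len(opening_events) / expected, 1.0) * 100
```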

  19. Automated electronic monitoring of circuit pressures during continuous renal replacement therapy: a technical report.

    Science.gov (United States)

    Zhang, Ling; Baldwin, Ian; Zhu, Guijun; Tanaka, Aiko; Bellomo, Rinaldo

    2015-03-01

    Automated electronic monitoring and analysis of circuit pressures during continuous renal replacement therapy (CRRT) has the potential to predict failure and allow intervention to optimise function. Current CRRT machines can measure and store pressure readings for downloading into databases and for analysis. We developed a procedure to obtain such data at intervals of 1 minute and analyse them using the Prismaflex CRRT machine, and we present an example of such analysis. We obtained data on pressures obtained at intervals of 1 minute in a patient with acute kidney injury and sepsis treated with continuous haemofiltration at 2 L/hour of ultrafiltration and a blood flow of 200 mL/minute. Data analysis identified progressive increases in transmembrane pressure (TMP) and prefilter pressure (PFP) from time 0 until clotting occurred at 33 hours. TMP increased from 104 mmHg to 313 mmHg and PFP increased from 131 mmHg to 185 mmHg. Effluent pressure showed a progressive increase in the negative pressure applied to achieve ultrafiltration, from 0 mmHg to -168 mmHg. The inflection point for such changes was also identified. Blood pathway pressures for access and return remained unchanged throughout. Automated electronic monitoring of circuit pressure during CRRT is possible and provides useful information on the evolution of circuit clotting.
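
The pressure trends described (stable access/return pressures, TMP and PFP climbing toward clotting) lend themselves to simple inflection detection on the 1-minute download. A sketch under the assumption that clotting announces itself where the local slope steepens most; the window and test data are illustrative, not the paper's method:

```python
def inflection_index(pressures, window=3):
    """Index where a pressure trend steepens most: maximize the difference
    between the mean slope after and before each sample. A crude marker
    for the onset of circuit clotting."""
    n = len(pressures)
    best_i, best_gain = None, 0.0
    for i in range(window, n - window):
        before = (pressures[i] - pressures[i - window]) / window
        after = (pressures[i + window] - pressures[i]) / window
        if after - before > best_gain:
            best_i, best_gain = i, after - before
    return best_i
```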

  20. Automated Plasma Spray (APS) process feasibility study: Plasma spray process development and evaluation

    Science.gov (United States)

    Fetheroff, C. W.; Derkacs, T.; Matay, I. M.

    1979-01-01

    An automated plasma spray (APS) process was developed to apply two layer (NiCrAlY and ZrO2-12Y2O3) thermal-barrier coatings to aircraft gas turbine engine blade airfoils. The APS process hardware consists of four subsystems: a mechanical blade positioner incorporating two interlaced six-degree-of-freedom assemblies; a noncoherent optical metrology subsystem; a microprocessor-based adaptive system controller; and commercial plasma spray equipment. Over fifty JT9D first stage turbine blade specimens were coated with the APS process in preliminary checkout and evaluation studies. The best of the preliminary specimens achieved an overall coating thickness uniformity of ±53 micrometers, much better than is achievable manually. Factors limiting this performance were identified and process modifications were initiated accordingly. Comparative evaluations of coating thickness uniformity for manually sprayed and APS coated specimens were initiated. One of the preliminary evaluation specimens was subjected to a torch test and metallographic evaluation.

  1. Advanced process monitoring and feedback control to enhance cell culture process production and robustness.

    Science.gov (United States)

    Zhang, An; Tsang, Valerie Liu; Moore, Brandon; Shen, Vivian; Huang, Yao-Ming; Kshirsagar, Rashmi; Ryll, Thomas

    2015-12-01

    It is a common practice in biotherapeutic manufacturing to define a fixed-volume feed strategy for nutrient feeds, based on historical cell demand. However, once the feed volumes are defined, they are inflexible to batch-to-batch variations in cell growth and physiology and can lead to inconsistent productivity and product quality. In an effort to control critical quality attributes and to apply process analytical technology (PAT), a fully automated cell culture feedback control system has been explored in three different applications. The first study illustrates that frequent monitoring and automatically controlling the complex feed based on a surrogate (glutamate) level improved protein production. More importantly, the resulting feed strategy was translated into a manufacturing-friendly manual feed strategy without impact on product quality. The second study demonstrates the improved process robustness of an automated feed strategy based on online bio-capacitance measurements for cell growth. In the third study, glucose and lactate concentrations were measured online and were used to automatically control the glucose feed, which in turn changed lactate metabolism. These studies suggest that the auto-feedback control system has the potential to significantly increase productivity and improve robustness in manufacturing, with the goal of ensuring process performance and product quality consistency. © 2015 Wiley Periodicals, Inc.
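
The third study's loop (measure glucose online, adjust the glucose feed accordingly) is classic proportional feedback. A minimal sketch; the setpoint, gain, and units are placeholders, not the paper's controller:

```python
def glucose_feed_rate(measured_g_per_l, setpoint=4.0, gain=0.5, max_rate=2.0):
    """Proportional feed controller: feed faster the further glucose falls
    below the setpoint; do not feed at or above it. Rate is in arbitrary
    volume units per hour, clamped to the pump maximum."""
    error = setpoint - measured_g_per_l
    if error <= 0:
        return 0.0
    return min(gain * error, max_rate)
```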

  2. Program software for the automated processing of gravity and magnetic survey data for the Mir computer

    Energy Technology Data Exchange (ETDEWEB)

    Lyubimov, G.A.

    1980-01-01

    A presentation is made of the content of program software for the automated processing of gravity and magnetic survey data for the small Mir-1 and Mir-2 computers as worked out on the Voronezh geophysical expedition.

  3. Automated process control system for heat-treating nuclear power station parts

    International Nuclear Information System (INIS)

    Afanasiadi, N.G.; Demin, V.P.; Launin, B.N.

    1984-01-01

    The basic factors determining the need for an automated process control system (APCS) are discussed, as are system requirements. The basic tasks solved by the system are discussed. The functional scheme for a decentralized, two-level APCS is given

  4. Reference Tools for Data Processing, Office Automation, and Data Communications: An Introductory Guide.

    Science.gov (United States)

    Cupoli, Patricia Dymkar

    1981-01-01

    Provides an introduction to various reference sources which are useful in dealing with the areas of data processing, office automation, and communications technologies. A bibliography with vendor listings is included. (FM)

  5. Development of mathematical models for automation of strength calculation during plastic deformation processing

    Science.gov (United States)

    Steposhina, S. V.; Fedonin, O. N.

    2018-03-01

    Dependencies that make it possible to automate the force calculation during surface plastic deformation (SPD) processing and, thus, to shorten the time for technological preparation of production have been developed.

  6. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    Science.gov (United States)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic on a node-by-node basis across enormous numbers of nodes is impractical, so the dimensionality of the aggregate node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals, but instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time-series. 
We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for
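
As a loose stand-in for the subspace-dimensionality idea: a malfunctioning channel decorrelates from the beam formed by its neighbors, so a correlation screen before beamforming catches it. This is a simplified proxy in pure Python, not the authors' principal-components method:

```python
from statistics import mean

def corr(a, b):
    """Pearson correlation of two equal-length sequences (0.0 if flat)."""
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def flag_bad_channels(channels, threshold=0.5):
    """Flag channels poorly correlated with the beam (mean trace) formed
    from the remaining channels."""
    bad = []
    for i, ch in enumerate(channels):
        others = [c for j, c in enumerate(channels) if j != i]
        beam = [mean(col) for col in zip(*others)]
        if corr(ch, beam) < threshold:
            bad.append(i)
    return bad
```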

  7. Automated radiological monitoring at a Russian Ministry of Defence Naval Site

    International Nuclear Information System (INIS)

    Moskowitz, P.D.; Pomerville, J.; Gavrilov, S.; Kisselev, V.; Daniylan, V.; Belikov, A.; Egorkin, A.; Sokolovski, Y.; Endregard, M.; Krosshavn, M.; Sundling, C.V.; Yokstad, H.

    2001-01-01

    The Arctic Military Environmental Cooperation (AMEC) Program is a cooperative effort between the military establishments of the Kingdom of Norway, the Russian Federation, and the US. This paper discusses joint activities conducted over the past year among Norwegian, Russian, and US technical experts on a project to develop, demonstrate and implement automated radiological monitoring at Russian Navy facilities engaged in the dismantlement of nuclear-powered strategic ballistic missile launching submarines. Radiological monitoring is needed at these facilities to help protect workers engaged in the dismantlement program and the public living within the footprint of routine and accidental radiation exposure areas. By providing remote stand-alone monitoring, the Russian Navy will achieve added protection due to the defense-in-depth strategy afforded by local (at the site), regional (Kola) and national-level (Moscow) oversight. The system being implemented at the Polyaminsky Russian Naval Shipyard was developed from a working model tested at the Russian Institute for Nuclear Safety, Moscow, Russia. It includes Russian manufactured terrestrial and underwater gamma detectors, smart controllers for graded sampling, radio-modems for offsite transmission of the data, and a data fusion/display system. The data fusion/display system is derived from the Norwegian Picasso AMEC Environmental Monitoring software package. This computer package allows monitoring personnel to review the real-time and historical status of monitoring at specific sites and objects and to establish new monitoring protocols as required, for example, in an off-normal accident situation. Plans are being developed to implement the use of this system at most RF Naval sites handling spent nuclear fuel

  8. A simple condition monitoring model for a direct monitoring process

    NARCIS (Netherlands)

    Christer, A.H.; Wang, Wenbin

    1995-01-01

    This paper addresses the problem of condition monitoring of a component which has available a measure of condition called wear. Wear accumulates over time and monitoring inspections are performed at chosen times to monitor and measure the cumulative wear. If past measurements of wear are available

  9. A method for automated processing of measurement information during mechanical drilling

    Energy Technology Data Exchange (ETDEWEB)

    Samonenko, V.I.; Belinkov, V.G.; Romanova, L.A.

    1984-01-01

    An algorithm is presented for a method developed for automated processing of measurement information during mechanical drilling. Its use under the operating conditions of an automated drilling control system (ASU) will make it possible to precisely identify changes in the lithology, in the physical, mechanical, and abrasive properties, and in the stratum (pore) pressure of the rock being drilled, which, along with other methods for testing the drilling process, will increase the reliability of the decisions made.

  10. Process automation using combinations of process and machine control technologies with application to a continuous dissolver

    International Nuclear Information System (INIS)

    Spencer, B.B.; Yarbro, O.O.

    1991-01-01

    Operation of a continuous rotary dissolver, designed to leach uranium-plutonium fuel from chopped sections of reactor fuel cladding using nitric acid, has been automated. The dissolver is a partly continuous, partly batch process that interfaces at both ends with batchwise processes, thereby requiring synchronization of certain operations. Liquid acid is fed and flows through the dissolver continuously, whereas chopped fuel elements are fed to the dissolver in small batches and move through the compartments of the dissolver stagewise. Sequential logic (or machine control) techniques are used to control discrete activities such as the sequencing of isolation valves. Feedback control is used to control acid flowrates and temperatures. Expert systems technology is used for on-line material balances and diagnostics of process operation. 1 ref., 3 figs
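
The split the abstract describes (sequential "machine control" for discrete valve steps, feedback control for flows and temperatures) can be illustrated with a toy interlocked sequencer; the state names are invented for illustration, not taken from the dissolver design:

```python
# Hypothetical valve-sequencing steps for charging one batch of chopped fuel.
SEQUENCE = ["idle", "open_feed_valve", "charge_basket",
            "close_feed_valve", "advance_stage"]

def next_state(state, step_complete):
    """Advance only when the current step reports complete (interlock
    behavior), wrapping back to idle after the last step."""
    if not step_complete:
        return state
    i = SEQUENCE.index(state)
    return SEQUENCE[(i + 1) % len(SEQUENCE)]
```

Synchronization with the upstream and downstream batch processes then amounts to holding `step_complete` false until the neighboring process signals ready.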

  11. A New Device to Automate the Monitoring of Critical Patients’ Urine Output

    Directory of Open Access Journals (Sweden)

    Abraham Otero

    2014-01-01

    Urine output (UO) is usually measured manually each hour in acutely ill patients. This task consumes a substantial amount of time. Furthermore, in the literature there is evidence that more frequent (minute-by-minute) UO measurement could impact clinical decision making and improve patient outcomes. However, it is not feasible to manually take minute-by-minute UO measurements. A device capable of automatically monitoring UO could save precious time of the healthcare staff and improve patient outcomes through a more precise and continuous monitoring of this parameter. This paper presents a device capable of automatically monitoring UO. It provides minute-by-minute measurements and it can generate alarms that warn of deviations from therapeutic goals. It uses a capacitive sensor for the measurement of the UO collected within a rigid container. When the container is full, it automatically empties without requiring any internal or external power supply or any intervention by the nursing staff. In vitro tests have been conducted to verify the proper operation and accuracy in the measures of the device. These tests confirm the viability of the device to automate the monitoring of UO.
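
The alarm logic such a device needs is straightforward once minute-by-minute readings exist: compare a rolling hour of output against a weight-based therapeutic goal. A sketch using a commonly cited 0.5 mL/kg/h threshold; the function and its defaults are illustrative, not the device's firmware:

```python
def uo_alarm(minute_ml, weight_kg, goal_ml_kg_h=0.5, window=60):
    """True if urine output over the last `window` minutes, scaled to an
    hourly rate, falls below the therapeutic goal."""
    recent = minute_ml[-window:]
    hourly_ml = sum(recent) * (60 / len(recent))
    return hourly_ml < goal_ml_kg_h * weight_kg
```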

  12. In-process monitoring and control of microassembly by utilising force sensor

    OpenAIRE

    S. Tangjitsitcharoen; P. Tangpornprasert; Ch. Virulsri; N. Rojanarowan

    2008-01-01

    Purpose: The aim of this research is to develop an in-process monitoring system to control the position of the shaft within a tolerance of ±2.5 μm regardless of any conditions of the geometries of the shaft and the thrust plate. Design/methodology/approach: To realize an automated and intelligent microassembly process, a method has been developed to monitor and control the position of the shaft in the plate of the high-precision spindle motor for hard disk drive in order to reduce the shaft high ...

  13. Infrasonic Stethoscope for Monitoring Physiological Processes

    Science.gov (United States)

    Shams, Qamar A. (Inventor); Zuckerwar, Allan J. (Inventor); Dimarcantonio, Albert L. (Inventor)

    2018-01-01

    An infrasonic stethoscope for monitoring physiological processes of a patient includes a microphone capable of detecting acoustic signals in the audible frequency bandwidth and in the infrasonic bandwidth (0.03 to 1000 Hertz), a body coupler attached to the body at a first opening in the microphone, a flexible tube attached to the body at a second opening in the microphone, and an earpiece attached to the flexible tube. The body coupler is capable of engagement with a patient to transmit sounds from the person to the microphone and then to the earpiece.

  14. Automated Processing of 2-D Gel Electrophoretograms of Genomic DNA for Hunting Pathogenic DNA Molecular Changes.

    Science.gov (United States)

    Takahashi; Nakazawa; Watanabe; Konagaya

    1999-01-01

    We have developed automated processing algorithms for 2-dimensional (2-D) electrophoretograms of genomic DNA based on the RLGS (Restriction Landmark Genomic Scanning) method, which scans restriction enzyme recognition sites as landmarks and maps them onto a 2-D electrophoresis gel. Our processing algorithms enable automated spot recognition in RLGS electrophoretograms and automated comparison of a huge number of such images. In the final stage of the automated processing, a master spot pattern, onto which all the spots in the RLGS images are mapped at once, can be obtained. Spot pattern variations that appear specific to pathogenic DNA molecular changes can be easily detected by simply looking over the master spot pattern. When we applied our algorithms to the analysis of 33 RLGS images derived from human colon tissues, we successfully detected several colon tumor specific spot pattern changes.

  15. Fluorescence monitoring of ultrasound degradation processes

    International Nuclear Information System (INIS)

    Hassoon, Salah; Bulatov, Valery; Yasman, Yakov; Schechter, Israel

    2004-01-01

    Ultrasound-based water treatment is often applied for degradation of stable organic pollutants, such as polycyclic aromatic hydrocarbons and halogenated compounds. Monitoring the degradation process, during the application of ultrasound radiation, is of considerable economic interest. In this work, the possibility of performing on-line spectral analysis during sonication was examined and it was found that direct absorption or fluorescence readings are misleading. Optical monitoring is strongly affected by the absorption and scattering of light by cavitation micro-bubbles and ultrasound induced particulates. A model was developed to account for these effects and to allow for on-line fluorescence analysis. The model takes into account the absorption and scattering coefficients of the micro-bubbles and particulates, as well as their time dependent concentration. The model parameters are found from independent measurements where the pollutants are added to already sonicated pure water. Then, the model is tested for predicting the actual fluorescence behavior during the sonication process. It has been shown that the model allows for recovery of the true degradation data, as obtained by off-line HPLC measurements
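
The heart of such a correction is undoing attenuation by bubbles and particulates along the optical path. A single-pass Beer-Lambert-style sketch, far simpler than the time-dependent model in the abstract; the coefficients and path length are hypothetical:

```python
import math

def corrected_fluorescence(f_measured, mu_bubbles, mu_particles, path_cm):
    """Estimate the true fluorescence signal by undoing exponential
    attenuation over the optical path. mu_* are combined
    absorption + scattering coefficients in 1/cm."""
    return f_measured * math.exp((mu_bubbles + mu_particles) * path_cm)
```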

  16. Information processing for aerospace structural health monitoring

    Science.gov (United States)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle costs of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information which indicates both the diagnostics of the current structural integrity as well as the prognostics necessary for planning and managing the future health of the structure in a cost effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.

  17. Measuring oxidation processes: Atomic oxygen flux monitor

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    Of the existing 95 high-energy accelerators in the world, the Stanford Linear Collider (SLC) at the Stanford Linear Accelerator Center (SLAC) is the only one of the linear-collider type, where electrons and positrons are smashed together at energies of 50 GeV using linear beams instead of beam rings for achieving interactions. Use of a linear collider eliminates energy losses, in the form of x-rays, due to the curved trajectory of the rings, a phenomenon known as bremsstrahlung. Because these losses are eliminated, higher interaction energies are reached. Consequently the SLC produced the first Z particles in quantities large enough to allow measurement of their physical properties with some accuracy. SLAC intends to probe still deeper into the structure of matter by next polarizing the electrons in the beam. The surface of the source for these polarized particles, typically gallium arsenide, must be kept clean of contaminants. One method for accomplishing this task requires the oxidation of the surface, from which the oxidized contaminants are later boiled off. The technique requires careful measurement of the oxidation process. SLAC researchers have developed a technique for measuring the atomic oxygen flux in this process. The method uses a silver film on a quartz-crystal deposition-rate monitor. Measuring the initial oxidation rate of the silver, which is proportional to the atomic oxygen flux, determines a lower limit on that flux in the range of 10^13 to 10^17 atoms per square centimeter per second. Furthermore, the deposition is reversible by exposing the sensor to atomic hydrogen. This technique has wider applications to processes in solid-state and surface physics as well as surface chemistry. In semiconductor manufacturing, where a precise thickness of oxide must be deposited, this technique could be used to monitor the critical flux of atomic oxygen in the process

  18. Dispatcher's monitoring systems of coal preparation processes. Systemy dyspozytorskiej kontroli procesow wzbogacania wegla

    Energy Technology Data Exchange (ETDEWEB)

    Cierpisz, S [Politechnika Slaska, Gliwice (Poland); Cierpisz, T; Glowacki, D; Puczylowski, T [Min-Tech Sp. z o.o., Katowice (Poland)

    1994-08-01

    The computer-based control and dispatcher's monitoring systems for coal preparation plants are described. The article covers local automation systems for coal blending production, control systems for the heavy-media separation process, and dispatcher's visualization systems for technological line operation. The effects of implementing these systems, as well as experience gained during the design and operational stages, are given. (author). 2 refs., 6 figs.

  19. The monitoring and control of TRUEX processes

    International Nuclear Information System (INIS)

    Regalbuto, M.C.; Misra, B.; Chamberlain, D.B.; Leonard, R.A.; Vandegrift, G.F.

    1992-04-01

    The Generic TRUEX Model (GTM) was used to design a flowsheet for the TRUEX solvent extraction process that would be used to determine its instrumentation and control requirements. Sensitivity analyses of the key process variables, namely, the aqueous and organic flow rates, feed compositions, and the number of contactor stages, were carried out to assess their impact on the operation of the TRUEX process. Results of these analyses provide a basis for the selection of an instrument and control system and the eventual implementation of a control algorithm. Volume Two of this report is an evaluation of the instruments available for measuring many of the physical parameters. Equations that model the dynamic behavior of the TRUEX process have been generated. These equations can be used to describe the transient or dynamic behavior of the process for a given flowsheet in accordance with the TRUEX model. Further work will be done with the dynamic model to determine how and how quickly the system responds to various perturbations. The use of perturbation analysis early in the design stage will lead to a robust flowsheet, namely, one that will meet all process goals and allow for wide control bounds. The process time delay, that is, the speed with which the system reaches a new steady state, is an important parameter in monitoring and controlling a process. In the future, instrument selection and point-of-variable measurement, now done using the steady-state results reported here, will be reviewed and modified as necessary based on this dynamic method of analysis

  20. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives.

    Science.gov (United States)

    Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini

    2018-08-01

    Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity (the capability of monitoring several analytes simultaneously), in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper will also include a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation to aid and motivate development.

  1. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    International Nuclear Information System (INIS)

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan

    2012-01-01

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA)-compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj values that differed by up to 44% from effective dose estimates that were not adjusted for patient size.
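
The pipeline reduces to: extract the dose-length product (DLP), multiply by the protocol k-factor, then scale by a size term derived from the scout-measured thickness. A sketch in that spirit; the exponential form, reference thickness, and slope are illustrative placeholders, not the paper's fitted correction:

```python
import math

def size_adjusted_effective_dose(dlp_mgy_cm, k_factor, thickness_cm,
                                 reference_cm=30.0, slope=0.04):
    """Effective dose = k-factor * DLP, then an exponential size
    correction that raises the estimate for thinner-than-reference
    patients and lowers it for thicker ones."""
    effective_dose = k_factor * dlp_mgy_cm
    return effective_dose * math.exp(slope * (reference_cm - thickness_cm))
```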

  2. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    Energy Technology Data Exchange (ETDEWEB)

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan [Clinical Imaging Physics Group, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 and Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Clinical Imaging Physics Group, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States) and Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Clinical Imaging Physics Group, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Duke University, Durham, North Carolina 27710 (United States); and Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708 (United States)

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA) compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period, and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj values that differed by up to 44% from effective dose
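    The dose pipeline described above (a k-factor conversion of dose-length product, followed by a thickness-based correction) can be sketched as follows. The k-factor values and the exponential size-correction model are illustrative assumptions for this sketch, not the authors' exact method.

    ```python
    import math

    # Hypothetical k-factors (mSv per mGy*cm); real values depend on protocol
    # and scanner, and the exponential size correction below is an illustrative
    # model, not the authors' exact formulation.
    K_FACTORS = {
        "head": 0.0021,
        "chest": 0.014,
        "abdomen_pelvis": 0.015,
    }

    def effective_dose(dlp_mgy_cm, protocol):
        """Effective dose (mSv) from dose-length product via a protocol k-factor."""
        return dlp_mgy_cm * K_FACTORS[protocol]

    def size_adjusted_dose(dlp_mgy_cm, protocol, thickness_cm,
                           ref_thickness_cm=30.0, coeff=0.04):
        """Scale effective dose by patient thickness: a thinner patient receives
        a higher effective dose for the same scanner output."""
        ed = effective_dose(dlp_mgy_cm, protocol)
        return ed * math.exp(coeff * (ref_thickness_cm - thickness_cm))

    ed = effective_dose(500.0, "chest")                # 7.0 mSv
    ed_adj = size_adjusted_dose(500.0, "chest", 25.0)  # larger: patient is thin
    ```

    With these assumed parameters, a patient 5 cm thinner than the reference gets a size-adjusted dose about 22% above the uncorrected estimate, which is the kind of size-driven difference the study quantifies.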

  3. New signal processing algorithms for automated external defibrillators

    OpenAIRE

    Irusta Zarandona, Unai

    2017-01-01

    [ES] Ventricular fibrillation (VF) is the first rhythm recorded in 40% of sudden deaths from out-of-hospital cardiorespiratory arrest (OHCA). The only effective treatment for VF is defibrillation by means of an electric shock. Outside the hospital, the shock is delivered by an automated external defibrillator (AED), which first analyzes the patient's electrocardiogram (ECG) and checks whether it presents a shockable rhythm. Survival in an OHCA case ...

  4. Automation of periodic replenishment process for fashionable product

    OpenAIRE

    Lauwerier, Rémi

    2015-01-01

    The thesis depicted in this report was the subject of my internship at Dior Haute Couture in Paris. I was a member of the supply chain team, working on the women's shoes product line. The aim was to automate the replenishment of the stores around the world. As a planner for Europe and the United States, I interacted closely with the buyers, who followed specific stores and had strong knowledge of the product. The product required a good understanding of the ...

  5. CULTURE AND TECHNOLOGY: AUTOMATION IN THE CREATIVE PROCESSES OF NARRATIVE

    Directory of Open Access Journals (Sweden)

    Fernando Fogliano

    2013-12-01

    Full Text Available The objective here is to reflect on the problem raised by the progressively opaque presence of technology in contemporary artistic production. Automation is the most evident technological aspect of the devices used for the production, post-production and dissemination of this cultural activity. Throughout the text the philosophers Vilém Flusser and Gilbert Simondon are brought into confrontation so that a more profound insight can be obtained. Language is considered here as the integrative factor in the search for a new convergent conceptual scenario that enables us to understand the consequences of technological convergence.

  6. A prototype of an automated high resolution InSAR volcano-monitoring system in the MED-SUV project

    Science.gov (United States)

    Chowdhury, Tanvir A.; Minet, Christian; Fritz, Thomas

    2016-04-01

    Volcanic processes, which produce a variety of geological and hydrological hazards, are difficult to predict and capable of triggering natural disasters on regional to global scales. It is therefore important to monitor volcanoes continuously and with a high spatial and temporal sampling rate. The monitoring of active volcanoes requires reliable measurement of surface deformation before, during and after volcanic activity, and it helps in the better understanding and modelling of the geophysical processes involved. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and the small baseline subset algorithm (SBAS) provide powerful tools for observing eruptive activity and measuring surface changes with millimetre accuracy. All the mentioned techniques, with deformation time series extraction, address these challenges by exploiting medium to large SAR image stacks. The process of selecting, ordering, downloading, storing, logging, extracting and preparing the data for processing is very time consuming and has to be done manually for every single data stack. In many cases it is even an iterative process which has to be done regularly and continuously. As a result, data processing becomes slow, which causes significant delays in data delivery. The SAR Satellite based High Resolution Data Acquisition System, which will be developed at DLR, will automate these time-consuming tasks and allow an operational volcano monitoring system. Every 24 hours the system searches for newly acquired scenes over the volcanoes, keeps track of the data orders, logs their status and downloads the provided data via ftp transfer, including e-mail alerts. Furthermore, the system will deliver specified reports and maps to a database for review and use by specialists. User interaction will be minimized and iterative processes will be avoided entirely. In this presentation, a prototype of the SAR Satellite based High Resolution Data
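    The 24-hour polling logic described above can be sketched as a deduplicating acquisition tracker. The class, method names, and scene identifiers below are hypothetical stand-ins for the DLR system's actual interfaces.

    ```python
    # Minimal sketch of a daily acquisition-polling loop; all names are
    # hypothetical stand-ins for the DLR system described in the abstract.
    from dataclasses import dataclass, field

    @dataclass
    class AcquisitionTracker:
        """Tracks which SAR scenes over a volcano have already been ordered."""
        seen: set = field(default_factory=set)
        log: list = field(default_factory=list)

        def poll(self, catalogue):
            """Return scene IDs not yet handled; record them as ordered."""
            new = [s for s in catalogue if s not in self.seen]
            for scene in new:
                self.seen.add(scene)
                # In the real system this would trigger the ftp download
                # and the e-mail alert; here we only log the order.
                self.log.append(f"ordered {scene}")
            return new

    tracker = AcquisitionTracker()
    day1 = tracker.poll(["S1A_20160101", "S1A_20160113"])
    day2 = tracker.poll(["S1A_20160113", "S1A_20160125"])  # only the new scene
    ```

    Running the poll daily makes re-processing idempotent: scenes already ordered are skipped, which is what removes the manual, iterative bookkeeping the abstract describes.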

  7. Automated selected reaction monitoring software for accurate label-free protein quantification.

    Science.gov (United States)

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.
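    As a rough illustration of label-free SRM quantification, the sketch below integrates a transition's chromatographic signal over a retention-time window after a flat baseline correction. It is a toy stand-in with invented data, not the Anubis algorithm itself.

    ```python
    # Toy label-free quantification: trapezoidal peak area of one SRM
    # transition inside a retention-time window, baseline-corrected with
    # the minimum intensity in that window. Data values are invented.
    def peak_area(times, intensities, rt_start, rt_end):
        """Baseline-corrected trapezoidal area over [rt_start, rt_end]."""
        pts = [(t, y) for t, y in zip(times, intensities)
               if rt_start <= t <= rt_end]
        baseline = min(y for _, y in pts)
        area = 0.0
        for (t0, y0), (t1, y1) in zip(pts, pts[1:]):
            area += 0.5 * ((y0 - baseline) + (y1 - baseline)) * (t1 - t0)
        return area

    times = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]   # retention time (min)
    signal = [10, 12, 50, 90, 55, 13, 11]          # transition intensity
    area = peak_area(times, signal, 0.5, 2.5)      # -> 79.75
    ```

    A real implementation must additionally reject interfering signals from co-eluting peptides, which is the hard part the Anubis algorithm addresses.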

  8. ADVANCES IN CLOG STATE MONITORING FOR USE IN AUTOMATED REED BED INSTALLATIONS

    Directory of Open Access Journals (Sweden)

    Theodore HUGHES-RILEY

    2014-06-01

    Full Text Available Constructed wetlands are a popular, environmentally conscious form of waste-water treatment that has proliferated across Europe and the rest of the world in recent years. The ability to monitor conditions in the bed and control input factors such as heating and aeration may extend the lifetime of a reed bed substantially beyond the ten-year lifetime normally reached. The Autonomous Reed Bed Installation (ARBI) project is an EU FP7 initiative to develop a reed bed with automated control over input parameters based on readings taken from embedded sensors. Automated remedial action may improve bed treatment efficiency, prolong the life of the bed and avoid the need to refurbish it, which is both time consuming and costly. One critical parameter to observe is the clog state of the reed bed, as this can severely impact the efficiency of water treatment to the point of the bed becoming non-operable. Magnetic resonance (MR) sensors can be a powerful tool in determining clogging levels, and have previously been explored in the literature. This work is based on a conference paper (2nd International Conference "Water Resources and Wetlands", 2014) and details magnetic sensors suitable for long-term embedding in a constructed wetland. Unlike previous studies, this work examines a probe embedded in a wetland.

  9. Completely automated measurement facility (PAVICOM) for track-detector data processing

    CERN Document Server

    Aleksandrov, A B; Feinberg, E L; Goncharova, L A; Konovalova, N S; Martynov, A G; Polukhina, N G; Roussetski, A S; Starkov, N I; Tsarev, V A

    2004-01-01

    A review of the technical capabilities and investigations performed using the completely automated measuring facility (PAVICOM) is presented. This highly efficient facility for track-detector data processing in the field of nuclear and high-energy particle physics was constructed at the Lebedev Physical Institute. PAVICOM is widely used in Russia for the treatment of experimental data from track detectors (emulsion and solid-state trackers) in high- and low-energy physics, cosmic ray physics, etc., and provides an essential improvement in the efficiency of experimental studies. In contrast to the semi-automated microscopes widely used until now, PAVICOM is capable of performing completely automated measurements of charged-particle tracks in nuclear emulsions and track detectors without laborious visual work. Track images are recorded by CCD cameras and then digitized and converted into files. Thus, experimental data processing is accelerated by approximately a thousand times. Completely autom...

  10. Statistical process control for electron beam monitoring.

    Science.gov (United States)

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess statistical process control (SPC) of electron beam monitoring in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data were under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected, they were less well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied, it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
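    The SPC quantities referred to above, Shewhart-style control limits and the process capability ratio, can be computed as in this sketch; the daily output values are invented for illustration and are not the study's data.

    ```python
    # Shewhart-style control limits and process capability ratio (Cp) for a
    # daily beam-output check. The sample values are invented.
    import statistics

    def control_limits(samples, k=3.0):
        """Center line and +/- k-sigma limits estimated from phase I data."""
        mean = statistics.fmean(samples)
        sd = statistics.stdev(samples)
        return mean - k * sd, mean, mean + k * sd

    def capability_ratio(samples, lower_spec, upper_spec):
        """Cp = specification width / (6 sigma)."""
        sd = statistics.stdev(samples)
        return (upper_spec - lower_spec) / (6.0 * sd)

    daily_output = [99.8, 100.1, 100.0, 99.9, 100.2, 100.0, 99.9, 100.1]
    lcl, center, ucl = control_limits(daily_output)
    cp = capability_ratio(daily_output, 98.0, 102.0)  # 2% specification level
    ```

    A point outside (lcl, ucl) signals an out-of-control session, and Cp well above 1 (as the study's 1.6 to 9.3 range suggests) means the process variability fits comfortably inside the specification.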

  11. Distributed process control system for remote control and monitoring of the TFTR tritium systems

    International Nuclear Information System (INIS)

    Schobert, G.; Arnold, N.; Bashore, D.; Mika, R.; Oliaro, G.

    1989-01-01

    This paper reviews the progress made in the application of a commercially available distributed process control system to support the requirements established for the Tritium REmote Control And Monitoring System (TRECAMS) of the Tokamak Fusion Test Reactor (TFTR). The system that will be discussed was purchased from the Texas Instruments (TI) Automation Controls Division, previously marketed by Rexnord Automation. It consists of three fully redundant distributed process controllers interfaced to over 1800 analog and digital I/O points. The operator consoles located throughout the facility are supported by four Digital Equipment Corporation (DEC) PDP-11/73 computers. The PDP-11/73s and the three process controllers communicate over a fully redundant one-megabaud fiber optic network. All system functionality is based on a set of completely integrated databases loaded to the process controllers and the PDP-11/73s. (author). 2 refs.; 2 figs

  12. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    Science.gov (United States)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-04-01

    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier to industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defect avoidance is fundamental. Because of this, there is a need to develop novel in situ monitoring tools able to keep the stability of the process under control on a layer-by-layer basis, and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize the defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems.

  13. Process defects and in situ monitoring methods in metal powder bed fusion: a review

    International Nuclear Information System (INIS)

    Grasso, Marco; Colosimo, Bianca Maria

    2017-01-01

    Despite continuous technological enhancements of metal Additive Manufacturing (AM) systems, the lack of process repeatability and stability still represents a barrier to industrial breakthrough. The most relevant metal AM applications currently involve industrial sectors (e.g. aerospace and bio-medical) where defect avoidance is fundamental. Because of this, there is a need to develop novel in situ monitoring tools able to keep the stability of the process under control on a layer-by-layer basis, and to detect the onset of defects as soon as possible. On the one hand, AM systems must be equipped with in situ sensing devices able to measure relevant quantities during the process, a.k.a. process signatures. On the other hand, in-process data analytics and statistical monitoring techniques are required to detect and localize the defects in an automated way. This paper reviews the literature and the commercial tools for in situ monitoring of powder bed fusion (PBF) processes. It explores the different categories of defects and their main causes, the most relevant process signatures and the in situ sensing approaches proposed so far. Particular attention is devoted to the development of automated defect detection rules and the study of process control strategies, which represent two critical fields for the development of future smart PBF systems. (paper)

  14. Marketing automation processes as a way to improve contemporary marketing of a company

    Directory of Open Access Journals (Sweden)

    Witold Świeczak

    2013-09-01

    Full Text Available The main aim of this article is to identify the opportunities that the processes included in a marketing automation system offer to contemporary companies. This publication deals with the key aspects of the issue. It shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation. The article defines the factors and processes which influence an effective course of actions taken as part of marketing automation. The concept of marketing automation is a completely new reality: it gives up communication based on mass distribution of uniform content in favor of personalized, individual and fully automated communication. It is a new kind of coexistence, in which the sales and marketing departments cooperate closely to achieve the best result, and in which marketing can definitively confirm its contribution to the income generated by the company. Marketing automation also means huge analytical possibilities and a real increase in a company's value, the added value generated by the system: the source of information about clients and about all marketing and sales processes taking place in a company. The introduction of a marketing automation system alters not only the current functioning of a marketing department, but also marketers themselves. In fact, everything that a marketing automation system provides, including primarily the accumulated unique knowledge of the client, is a critical marketing value of every modern enterprise.

  15. Automation and efficiency in the operational processes: a case study in a logistics operator

    OpenAIRE

    Nascimento, Dener Gomes do; Silva, Giovanni Henrique da

    2017-01-01

    Globalization has made automation increasingly feasible, and with technological development many operations can be optimized, bringing productivity gains. Logistics is a major beneficiary of all this development, because it operates in an extremely competitive environment, in which being efficient is a requirement to stay alive in the market. In this context, this article seeks, from the analysis of the processes in a distribution center, to identify opportunities to automate operations to gai...

  16. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist with these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with an open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high-throughput automatic modules with the most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses, including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases

  17. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist with these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with an open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high-throughput automatic modules with the most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses, including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
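    As a minimal illustration of the unsupervised clustering step (K-means) that such platforms apply to reduced spectral features, here is a small pure-Python sketch of Lloyd's algorithm; the feature vectors are invented, not NMR data.

    ```python
    # Lloyd's K-means on 2-D points, pure Python; a toy stand-in for the
    # clustering module of an NMR data-analysis platform. Data are invented.
    def sqdist(a, b):
        """Squared Euclidean distance between two equal-length vectors."""
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def kmeans(points, k=2, iters=20):
        """Return (centroids, labels) after a fixed number of Lloyd updates."""
        centroids = list(points[:k])          # naive initialization
        labels = [0] * len(points)
        for _ in range(iters):
            # Assignment step: nearest centroid for each point.
            labels = [min(range(k), key=lambda j: sqdist(pt, centroids[j]))
                      for pt in points]
            # Update step: mean of each cluster's members.
            for j in range(k):
                members = [pt for pt, lab in zip(points, labels) if lab == j]
                if members:
                    centroids[j] = tuple(sum(xs) / len(members)
                                         for xs in zip(*members))
        return centroids, labels

    # Two well-separated groups of (hypothetical) reduced spectral features.
    data = [(0.1, 0.2), (0.0, 0.1), (0.2, 0.0),
            (5.1, 5.0), (4.9, 5.2), (5.0, 4.8)]
    centroids, labels = kmeans(data, k=2)
    ```

    On well-separated data like this, the algorithm converges in a couple of iterations and recovers the two groups regardless of the naive initialization.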

  18. Process development for automated solar cell and module production. Task 4: automated array assembly. Quarterly report No. 5

    Energy Technology Data Exchange (ETDEWEB)

    Hagerty, J.J.

    1980-01-31

    Construction of an automated solar cell layup and interconnect system is now complete. This system incorporates a Unimate 2000 B industrial robot with an end effector consisting of a vacuum pick-up and an induction heating coil. The robot interfaces with a smart cell preparation station which correctly orients the cell, applies solder paste, and forms and positions the correct lengths of interconnect lead. The system is controlled and monitored by a TRS-80 microcomputer. The first operational tests of the fully integrated station have been run. These tests proved the soundness of the basic design concept but also pointed to areas in which modifications are necessary. These modifications are nearly complete and the improved parts are being integrated. Development of the controlling computer program is progressing, both to reflect these changes and to reduce operating time.

  19. AUTOMATION OF CONTROL OF THE BUSINESS PROCESS OF PUBLISHING SCIENTIFIC JOURNALS

    Directory of Open Access Journals (Sweden)

    O. Yu. Sakaliuk

    2016-09-01

    Full Text Available We consider the automation of the business process of publishing scientific journals. The paper describes the publishing activities of the Odessa National Academy of Food Technologies (ONAFT) and the automation of its business processes. A complex of business process models for publishing scientific journals is developed. The organizational structure of the Coordinating Centre of Scientific Journals' Publishing of ONAFT is analyzed and its model created. Process model simulation was conducted in the eEPC and BPMN business process notations. Database design, creation of the file structure and development of the AIS interface were also carried out, and interaction with a webcam was implemented. Based on the justification of the feasibility of the software development and the performance results shown in a radar chart, it is safe to say that the automated approach is much more efficient than the manual mode. The developed software will accelerate the development of the scientific periodicals of ONAFT, which in turn will improve the academy's ratings at the global level and enhance its image and credibility.

  20. Comparative analysis of automation of production process with industrial robots in Asia/Australia and Europe

    Directory of Open Access Journals (Sweden)

    I. Karabegović

    2017-01-01

    Full Text Available The term "INDUSTRY 4.0", or "fourth industrial revolution", was first introduced at the 2011 fair in Hannover. It comes from the high-tech strategy of the German Federal Government, which promotes automation and computerization up to complete smart automation, meaning the introduction of methods of self-automation, self-configuration, self-diagnosis and problem fixing, knowledge and intelligent decision-making. No automation, including smart automation, can be imagined without industrial robots. Along with the fourth industrial revolution, a "robotic revolution" is taking place in Japan. The robotic revolution refers to the development and research of robotic technology with the aim of using robots in all production processes, and the use of robots in real life, to be of service to people in daily life. Knowing these facts, an analysis was conducted of the presence of industrial robots in production processes on the two continents of Europe and Asia/Australia, as well as research into whether industry is ready for the introduction of intelligent automation with the goal of establishing future smart factories. The paper presents the automation of production processes in Europe and Asia/Australia, with predictions for the future.

  1. Method of noncontacting ultrasonic process monitoring

    Science.gov (United States)

    Garcia, Gabriel V.; Walter, John B.; Telschow, Kenneth L.

    1992-01-01

    A method of monitoring a material during processing comprising the steps of: (a) shining a detection light on the surface of a material; (b) generating ultrasonic waves at the surface of the material to cause a change in frequency of the detection light; (c) detecting the change in frequency of the detection light at the surface of the material; (d) detecting said ultrasonic waves at the surface point of detection of the material; (e) measuring the time elapsed between generating the ultrasonic waves at the surface of the material and their return to the surface point of detection, to determine the transit time; and (f) comparing the transit time to predetermined values to determine properties such as density and the elastic quality of the material.
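    Step (f) above can be sketched numerically: derive a wave speed from the measured round-trip transit time and flag the part if it drifts from a reference value. The reference speed, tolerance, and sample values below are illustrative assumptions, not figures from the patent.

    ```python
    # Transit-time comparison sketch for laser-ultrasonic process monitoring.
    # Reference speed and tolerance are illustrative assumptions.
    def wave_speed(thickness_m, transit_time_s):
        """Longitudinal wave speed for a pulse-echo round trip (2 x thickness)."""
        return 2.0 * thickness_m / transit_time_s

    def within_spec(measured_speed, reference_speed, tolerance=0.02):
        """Accept if the speed is within a fractional tolerance of reference."""
        return abs(measured_speed - reference_speed) <= tolerance * reference_speed

    # 10 mm steel plate; ~5900 m/s longitudinal speed is typical for steel.
    speed = wave_speed(0.010, 3.39e-6)
    ok = within_spec(speed, 5900.0)
    ```

    Because the wave speed depends on density and elastic moduli, a transit time outside the predetermined band indicates that one of those material properties has drifted, which is exactly the comparison the patented method performs.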

  2. Performance Analysis of Wireless Networks for Industrial Automation-Process Automation (WIA-PA)

    Science.gov (United States)

    2017-09-01

    accordance with [28]. Our model does not simulate the joining process because we were not able to use multiple processing threads in our MATLAB...of 1 ms, but implementing this becomes too computationally intensive and other processes do not get scheduled. Due to this, we expanded our simulation ... time to be in terms of seconds, and scaled data rates and processing rates to match. For simulation purposes, our timeslot within the superframe is

  3. A wireless smart sensor network for automated monitoring of cable tension

    International Nuclear Information System (INIS)

    Sim, Sung-Han; Cho, Soojin; Li, Jian; Jo, Hongki; Park, Jong-Woong; Jung, Hyung-Jo; Spencer Jr, Billie F

    2014-01-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea. (paper)

  4. A wireless smart sensor network for automated monitoring of cable tension

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jong-Woong; Cho, Soojin; Spencer, Billie F., Jr.; Jung, Hyung-Jo

    2014-02-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea.
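    Vibration-based tension estimation of the kind used in such systems commonly rests on the taut-string relation f_n = (n / 2L) sqrt(T / m), which inverts to T = 4 m L^2 (f_n / n)^2. The cable properties in this sketch are invented, not Jindo Bridge values.

    ```python
    # Taut-string tension estimate from an identified natural frequency.
    # Cable properties are illustrative, not from the Jindo Bridge deployment.
    def cable_tension(mass_per_length, length, freq_hz, mode=1):
        """Tension (N) from the n-th natural frequency of a taut string:
        T = 4 * m * L^2 * (f_n / n)^2."""
        return 4.0 * mass_per_length * length ** 2 * (freq_hz / mode) ** 2

    # 100 m cable, 50 kg/m, identified first-mode frequency of 1.5 Hz
    tension_n = cable_tension(50.0, 100.0, 1.5)   # -> 4.5e6 N (4.5 MN)
    ```

    In the deployed system this inversion runs in-network on the smart sensors, so only the identified frequencies and the resulting tension, rather than raw acceleration records, need to be transmitted, which is what cuts the wireless traffic and power consumption.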

  5. Processing and review interface for strong motion data (PRISM) software, version 1.0.0—Methodology and automated processing

    Science.gov (United States)

    Jones, Jeanne; Kalkan, Erol; Stephens, Christopher

    2017-02-23

A continually increasing number of high-quality digital strong-motion records from stations of the National Strong-Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the United States, call for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong-motion records. When used without AQMS, PRISM provides batch-processing capabilities. PRISM version 1.0.0 is platform-independent (coded in Java), open source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine and a review tool that has a graphical user interface (GUI) to manually review, edit, and process records. To facilitate use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible in order to accommodate new processing techniques. This report provides a thorough description and examples of the record processing features supported by PRISM. All the computing features of PRISM have been thoroughly tested.

  6. Automated Remote Monitoring of Depression: Acceptance Among Low-Income Patients in Diabetes Disease Management.

    Science.gov (United States)

    Ramirez, Magaly; Wu, Shinyi; Jin, Haomiao; Ell, Kathleen; Gross-Schulman, Sandra; Myerchin Sklaroff, Laura; Guterman, Jeffrey

    2016-01-25

Remote patient monitoring is increasingly integrated into health care delivery to expand access and increase effectiveness. Automation can add efficiency to remote monitoring, but patient acceptance of automated tools is critical for success. From 2010 to 2013, the Diabetes-Depression Care-management Adoption Trial (DCAT), a quasi-experimental comparative effectiveness research trial aimed at accelerating the adoption of collaborative depression care in a safety-net health care system, tested a fully automated telephonic assessment (ATA) depression monitoring system serving low-income patients with diabetes. The aim of this study was to determine patient acceptance of ATA calls over time, and to identify factors predicting long-term patient acceptance of ATA calls. We conducted two analyses using data from the DCAT technology-facilitated care arm, in which for 12 months the ATA system periodically assessed depression symptoms, monitored treatment adherence, prompted self-care behaviors, and inquired about patients' needs for provider contact. Patients received assessments at 6, 12, and 18 months using Likert-scale measures of willingness to use ATA calls, preferred mode of reach, perceived ease of use, usefulness, nonintrusiveness, privacy/security, and long-term usefulness. For the first analysis (patient acceptance over time), we computed descriptive statistics of these measures. In the second analysis (predictive factors), we collapsed patients into two groups: those reporting "high" versus "low" willingness to use ATA calls. To compare them, we used independent t tests for continuous variables and Pearson chi-square tests for categorical variables. Next, we jointly entered independent factors found to be significantly associated with 18-month willingness to use ATA calls at the univariate level into a logistic regression model with backward selection to identify predictive factors. We performed a final logistic regression model with the identified significant

  7. The Effects of Automated Prompting and Self-Monitoring on Homework Completion for a Student with Attention Deficit Hyperactivity Disorder

    Science.gov (United States)

    Blicha, Amy; Belfiore, Phillip J.

    2013-01-01

    This study examined the effects of an intervention consisting of automated prompting and self-monitoring on the level of independent homework task completion for an elementary-age student with attention deficit hyperactivity disorder (ADHD). Instituting a single subject, within series ABAB design, the results showed a consistent increase and…

  8. Process monitoring for reprocessing plant safeguards: a summary review

    International Nuclear Information System (INIS)

    Kerr, H.T.; Ehinger, M.H.; Wachter, J.W.; Hebble, T.L.

    1986-10-01

    Process monitoring is a term typically associated with a detailed look at plant operating data to determine plant status. Process monitoring has been generally associated with operational control of plant processes. Recently, process monitoring has been given new attention for a possible role in international safeguards. International Safeguards Project Office (ISPO) Task C.59 has the goal to identify specific roles for process monitoring in international safeguards. As the preliminary effort associated with this task, a review of previous efforts in process monitoring for safeguards was conducted. Previous efforts mentioned concepts and a few specific applications. None were comprehensive in addressing all aspects of a process monitoring application for safeguards. This report summarizes the basic elements that must be developed in a comprehensive process monitoring application for safeguards. It then summarizes the significant efforts that have been documented in the literature with respect to the basic elements that were addressed

  9. Research progress of laser welding process dynamic monitoring technology based on plasma characteristics signal

    Directory of Open Access Journals (Sweden)

    Teng WANG

    2017-02-01

During the high-power laser welding process, plasma is induced by the evaporation of metal under laser radiation, which affects the coupling of laser energy into the workpiece and thus directly impacts the reliability and quality of the welding process. Research on laser-induced plasma is a focus of the high-power deep-penetration welding field, and provides a promising avenue for automating weld quality inspection. In recent years, research on dynamic monitoring of the laser welding process based on plasma characteristics has proceeded along two main lines: plasma signal detection and laser welding process modeling. This article introduces the laser-induced plasma in laser welding, analyzes related research on plasma-based dynamic monitoring of the laser welding process in China and abroad, summarizes the current problems in the field, and outlines future development trends.

  10. MIR-ATR sensor for process monitoring

    International Nuclear Information System (INIS)

    Geörg, Daniel; Schalk, Robert; Beuermann, Thomas; Methner, Frank-Jürgen

    2015-01-01

A mid-infrared attenuated total reflectance (MIR-ATR) sensor has been developed for chemical reaction monitoring. The optical setup of the compact and low-priced sensor consists of an IR emitter as light source, a zinc selenide (ZnSe) ATR prism as boundary to the process, and four thermopile detectors, each equipped with an optical bandpass filter. The practical applicability was tested during esterification of ethanol and formic acid to ethyl formate and water as a model reaction with subsequent distillation. For reference analysis, a Fourier transform mid-infrared (FT-MIR) spectrometer with diamond ATR module was applied. On-line measurements using the MIR-ATR sensor and the FT-MIR spectrometer were performed in a bypass loop. The sensor was calibrated by multiple linear regression in order to link the measured absorbance in the four optical channels to the analyte concentrations. The analytical potential of the MIR-ATR sensor was demonstrated by simultaneous real-time monitoring of all four chemical substances involved in the esterification and distillation process. The temporal courses of the sensor signals are in accordance with the concentration values achieved by the commercial FT-MIR spectrometer. The standard errors of prediction for ethanol, formic acid, ethyl formate, and water were 0.38 mol L−1, 0.48 mol L−1, 0.38 mol L−1, and 1.12 mol L−1, respectively. A procedure based on MIR spectra is presented to simulate the response characteristics of the sensor if the transmission ranges of the filters are varied. Using this tool, analyte-specific bandpass filters for a particular chemical reaction can be identified. By exchanging the optical filters, the sensor can be adapted to a wide range of processes in the chemical, pharmaceutical, and beverage industries. (paper)
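
The calibration step described above, multiple linear regression linking the four filter-channel absorbances to the four analyte concentrations, can be sketched as an ordinary least-squares fit. The mixing matrix and data below are synthetic stand-ins, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear mixing: absorbance[channel] = sum_k K[channel, k] * conc[k]
true_K = rng.uniform(0.1, 1.0, size=(4, 4))   # channel-vs-analyte sensitivities
conc = rng.uniform(0.0, 2.0, size=(50, 4))    # known calibration concentrations, mol/L
absorb = conc @ true_K.T                      # simulated 4-channel absorbances

# Multiple linear regression: fit concentrations onto the 4 optical channels
coef, *_ = np.linalg.lstsq(absorb, conc, rcond=None)

def predict(channels):
    """Map measured 4-channel absorbances to the 4 analyte concentrations."""
    return channels @ coef
```

With noiseless synthetic data the fit recovers the concentrations exactly; in practice the regression would be trained against reference FT-MIR measurements and evaluated by standard error of prediction.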

  11. Application of an automated wireless structural monitoring system for long-span suspension bridges

    International Nuclear Information System (INIS)

    Kurata, M.; Lynch, J. P.; Linden, G. W. van der; Hipley, P.; Sheng, L.-H.

    2011-01-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  12. MICROBIOLOGICAL MONITORING AND AUTOMATED EVENT SAMPLING AT KARST SPRINGS USING LEO-SATELLITES

    Science.gov (United States)

    Stadler, Hermann; Skritek, Paul; Sommer, Regina; Mach, Robert L.; Zerobin, Wolfgang; Farnleitner, Andreas H.

    2010-01-01

Data communication via Low-Earth-Orbit Satellites between portable hydro-meteorological measuring stations is the backbone of our system. This networking allows automated event sampling with short time increments also for E.coli field analysis. All activities of the course of the event-sampling can be observed on an internet platform based on a Linux-Server. Samples taken conventionally by hand and by the auto-sampling procedure yielded corresponding results, in agreement with the ISO 9308-1 reference method. E.coli concentrations were individually corrected by event-specific die-off rates (0.10–0.14 day−1), compensating losses due to sample storage at spring temperature in the auto sampler. Two large summer events 2005/2006 at a large alpine karst spring (LKAS2) were monitored including detailed analysis of E.coli dynamics (n = 271) together with comprehensive hydrological characterisations. High resolution time series demonstrated a sudden increase of E.coli concentrations in spring water (approx. 2 log10 units) with a specific time delay after the beginning of the event. Statistical analysis suggested the spectral absorbent coefficient measured at 254 nm (SAC254) as an early warning surrogate for real time monitoring of faecal input. Together with the LEO-Satellite based system it is a helpful tool for Early-Warning-Systems in the field of drinking water protection. PMID:18776628
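
The die-off correction mentioned above amounts to inverting first-order exponential decay over the storage time. A minimal sketch, with the default rate constant picked from the reported 0.10–0.14 day−1 range (the actual event-specific values would be substituted):

```python
import math

def correct_ecoli(measured_cfu, storage_days, die_off_rate=0.12):
    """Back-correct an E.coli count for first-order die-off during storage
    in the auto sampler: N0 = N(t) * exp(k * t), with k in day^-1.
    The default k = 0.12 is illustrative, within the reported range."""
    return measured_cfu * math.exp(die_off_rate * storage_days)
```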

  13. Application of AN Automated Wireless Structural Monitoring System for Long-Span Suspension Bridges

    Science.gov (United States)

    Kurata, M.; Lynch, J. P.; van der Linden, G. W.; Hipley, P.; Sheng, L.-H.

    2011-06-01

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  14. Automated synthesis of image processing procedures using AI planning techniques

    Science.gov (United States)

    Chien, Steve; Mortensen, Helen

    1994-01-01

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.

  15. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC)

  16. Development of automated welding process for field fabrication of thick walled pressure vessels

    International Nuclear Information System (INIS)

    Schneider, U.A.

Research on automatic welding processes for the fabrication of thick-walled pressure vessels continued. A literature review on the subject was completed. A laboratory study of criteria for judging acceptable root parameters continued. Equipment for a demonstration facility to test the components and processes of the automated welding system has been specified and is being obtained.

  17. Development of automated welding process for field fabrication of thick walled pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, U A

    1981-01-01

    Research on automatic welding processes for the fabrication of thick-walled pressure vessels continued. A literature review on the subject was completed. A laboratory study of criteria for judging acceptable root parameters continued. Equipment for a demonstration facility to test the components and processes of the automated welding system has been specified and is being obtained. (LCL)

  18. A Continuous Automated Vault Inventory System (CAVIS) for accountability monitoring of stored nuclear materials

    Energy Technology Data Exchange (ETDEWEB)

Pickett, C.A.; Barham, M.A.; Gafford, T.A.; Hutchinson, D.P.; Jordan, J.K.; Maxey, L.C.; Moran, B.W.; Muhs, J.; Nodine, R.; Simpson, M.L. [and others]

    1994-12-08

Nearly all facilities that store hazardous (radioactive or non-radioactive) materials must comply with prevailing federal, state, and local laws. These laws usually have components that require periodic physical inspections to ensure that all materials remain safely and securely stored. The inspections are generally labor intensive, slow, put personnel at risk, and only find anomalies after they have occurred. The system described in this paper was developed for monitoring stored nuclear materials resulting from weapons dismantlement, but its applications extend to any storage facility that meets the above criteria. The traditional special nuclear material (SNM) accountability programs currently used within most of the Department of Energy (DOE) complex require the physical entry of highly trained personnel into SNM storage vaults. This imposes the need for additional security measures, which typically mandate that extra security personnel be present while SNM inventories are performed. These requirements increase labor costs and put additional personnel at risk of radiation exposure. In some cases, individuals have received radiation exposure equivalent to the annual maximum during just one inventory verification. With increasing overhead costs, the current system is rapidly becoming too expensive to operate, and the need for an automated method of inventory verification is evident. The Continuous Automated Vault Inventory System (CAVIS) described in this paper was designed and prototyped as a low-cost, highly reliable, and user-friendly system capable of providing real-time weight, gamma, and neutron energy confirmation from each item stored in an SNM vault. This paper describes the sensor technologies, the CAVIS prototype system (built at Y-12 for highly enriched uranium storage), the technical requirements that must be achieved to assure successful implementation, and descriptions of the sensor technologies needed for a plutonium facility.
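
The attribute-confirmation idea, in which each stored item's weight, gamma, and neutron readings are checked against its enrollment baselines, might be sketched as a simple per-channel tolerance test. The 5% tolerance and the tuple layout are purely illustrative, not CAVIS design values.

```python
def item_confirmed(reading, baseline, tol_frac=0.05):
    """Confirm a stored item when every sensor channel (e.g. weight,
    gamma count rate, neutron count rate) stays within a fractional
    tolerance of its enrollment baseline. Tolerance is illustrative."""
    return all(abs(r - b) <= tol_frac * abs(b)
               for r, b in zip(reading, baseline))


# Example: 2% drift on one channel is accepted, a 20% drop is flagged
ok = item_confirmed((102.0, 50.0, 10.0), (100.0, 50.0, 10.0))
flagged = not item_confirmed((100.0, 40.0, 10.0), (100.0, 50.0, 10.0))
```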

  19. A Continuous Automated Vault Inventory System (CAVIS) for accountability monitoring of stored nuclear materials

    International Nuclear Information System (INIS)

    Pickett, C.A.; Barham, M.A.; Gafford, T.A.; Hutchinson, D.P.; Jordan, J.K.; Maxey, L.C.; Moran, B.W.; Muhs, J.; Nodine, R.; Simpson, M.L.

    1994-01-01

Nearly all facilities that store hazardous (radioactive or non-radioactive) materials must comply with prevailing federal, state, and local laws. These laws usually have components that require periodic physical inspections to ensure that all materials remain safely and securely stored. The inspections are generally labor intensive, slow, put personnel at risk, and only find anomalies after they have occurred. The system described in this paper was developed for monitoring stored nuclear materials resulting from weapons dismantlement, but its applications extend to any storage facility that meets the above criteria. The traditional special nuclear material (SNM) accountability programs currently used within most of the Department of Energy (DOE) complex require the physical entry of highly trained personnel into SNM storage vaults. This imposes the need for additional security measures, which typically mandate that extra security personnel be present while SNM inventories are performed. These requirements increase labor costs and put additional personnel at risk of radiation exposure. In some cases, individuals have received radiation exposure equivalent to the annual maximum during just one inventory verification. With increasing overhead costs, the current system is rapidly becoming too expensive to operate, and the need for an automated method of inventory verification is evident. The Continuous Automated Vault Inventory System (CAVIS) described in this paper was designed and prototyped as a low-cost, highly reliable, and user-friendly system capable of providing real-time weight, gamma, and neutron energy confirmation from each item stored in an SNM vault. This paper describes the sensor technologies, the CAVIS prototype system (built at Y-12 for highly enriched uranium storage), the technical requirements that must be achieved to assure successful implementation, and descriptions of the sensor technologies needed for a plutonium facility

  20. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS

    Science.gov (United States)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew

    2011-01-01

    substantially by combining it with other technologies for automated, miniaturized, high-throughput biological measurements, such as fast sequencing, protein identification (proteomics) and metabolite profiling (metabolomics). Thus, the system can be integrated with other biomedical instruments in order to support and enhance telemedicine capability onboard ISS. NASA's mission includes sustained investment in critical research leading to effective countermeasures to minimize the risks associated with human spaceflight, and the use of appropriate technology to sustain space exploration at reasonable cost. Our integrated microarray technology is expected to fulfill these two critical requirements and to enable the scientific community to better understand and monitor the effects of the space environment on microorganisms and on the astronaut, in the process leveraging current capabilities and overcoming present limitations.

  1. An automated fog monitoring system for the Indo-Gangetic Plains based on satellite measurements

    Science.gov (United States)

    Patil, Dinesh; Chourey, Reema; Rizvi, Sarwar; Singh, Manoj; Gautam, Ritesh

    2016-05-01

Fog is a meteorological phenomenon that causes reduction in regional visibility and affects air quality, thus leading to various societal and economic implications, especially disrupting air and rail transportation. The persistent and widespread winter fog impacts the entire Indo-Gangetic Plains (IGP), as frequently observed in satellite imagery. The IGP is a densely populated region in south Asia, home to about one-sixth of the world's population, with a strong upward pollution trend. In this study, we have used multi-spectral radiances and aerosol/cloud retrievals from Terra/Aqua MODIS data for developing an automated web-based fog monitoring system over the IGP. Using our previous and existing methodologies, and ongoing algorithm development for the detection of fog and retrieval of associated microphysical properties (e.g. fog droplet effective radius), we characterize the widespread fog detection during both daytime and nighttime. Specifically, for the nighttime fog detection, the algorithm employs a satellite-based bi-spectral brightness temperature difference technique between two spectral channels: MODIS band-22 (3.9 μm) and band-31 (10.75 μm). Further, we are extending our algorithm development to geostationary satellites, for providing continuous monitoring of the spatial-temporal variation of fog. We anticipate that the ongoing and future development of a fog monitoring system would be of assistance to air, rail and vehicular transportation management, as well as for dissemination of fog information to government agencies and the general public. The outputs of the fog detection algorithm and related aerosol/cloud parameters are operationally disseminated via http://fogsouthasia.com/.
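
The nighttime bi-spectral technique described above reduces to a brightness temperature difference test between the two MODIS channels: small fog droplets emit less at 3.9 μm than at 10.75 μm, pushing the difference negative over fog. A minimal sketch; the -2 K threshold is illustrative, not the paper's tuned value.

```python
import numpy as np

def night_fog_mask(bt_band22, bt_band31, threshold=-2.0):
    """Nighttime fog test: BTD = BT(3.9 um) - BT(10.75 um).
    Fog/low stratus depresses the 3.9 um brightness temperature,
    so pixels with BTD at or below the (illustrative) threshold
    are flagged as foggy."""
    btd = np.asarray(bt_band22) - np.asarray(bt_band31)
    return btd <= threshold


# Two pixels: the first (BTD = -4 K) is flagged, the second (-0.5 K) is not
mask = night_fog_mask([270.0, 280.0], [274.0, 280.5])
```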

  2. The newborn oxygram: automated processing of transcutaneous oxygen data.

    Science.gov (United States)

    Horbar, J D; Clark, J T; Lucey, J F

    1980-12-01

    Hypoxemic and hyperoxemic episodes are common in newborns with respiratory disorders. We have developed a microprocessor-based data system for use with transcutaneous oxygen (TcPO2) monitors in an attempt to quantitate these episodes. The amount of time spent by an infant in each of ten preset TcPO2 ranges can be automatically recorded. These data are referred to as the oxygram. Fourteen newborn infants were monitored for a total of 552 hours using this system. They spent a mean of 2.96% of the time with a TcPO2 less than or equal to 40 torr and 0.26% of the time with a TcPO2 greater than 100 torr. Representative oxygrams are presented. Clinical and research applications of the data system are discussed.
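
Computing the oxygram, i.e. the fraction of monitored time spent in each preset TcPO2 range, is essentially a histogram over regularly sampled readings. The ten-range layout follows the abstract, but the specific bin boundaries below are illustrative, not the device's actual presets.

```python
import numpy as np

def oxygram(tcpo2_torr, edges=(0, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200)):
    """Fraction of monitored time in each preset TcPO2 range (torr).
    Eleven edges define the ten ranges; with equally spaced samples,
    the count fraction per bin equals the time fraction."""
    counts, _ = np.histogram(tcpo2_torr, bins=edges)
    return counts / len(tcpo2_torr)


# Example: three samples, one per bin of interest
fractions = oxygram([10.0, 55.0, 150.0])
```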

  3. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Yan, W.

    1993-11-01

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods
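
A feedforward, fully-connected network trained by back-propagation, as used here for defect classification, can be sketched in a few lines of NumPy. The feature vectors, layer sizes, learning rate, and loss are synthetic stand-ins, not the eddy-current data or the modular architecture of the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for compressed eddy-current feature vectors: 8 features,
# 3 defect classes encoded as one-hot targets (all values synthetic).
X = rng.normal(size=(30, 8))
Y = np.eye(3)[rng.integers(0, 3, size=30)]

# One hidden layer of sigmoid units, trained with plain back-propagation.
W1 = rng.normal(scale=0.1, size=(8, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 3)); b2 = np.zeros(3)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

losses, lr = [], 0.5
for _ in range(200):
    H = sigmoid(X @ W1 + b1)            # hidden activations
    P = sigmoid(H @ W2 + b2)            # class outputs
    losses.append(float(np.mean((P - Y) ** 2)))
    dZ2 = (P - Y) * P * (1 - P)         # output error signal (MSE * sigmoid')
    dZ1 = (dZ2 @ W2.T) * H * (1 - H)    # error back-propagated to hidden layer
    W2 -= lr * H.T @ dZ2 / len(X); b2 -= lr * dZ2.mean(axis=0)
    W1 -= lr * X.T @ dZ1 / len(X); b1 -= lr * dZ1.mean(axis=0)
```

Training reduces the loss on the synthetic set; the study's networks additionally used modular architectures and engineered feature vectors from data compression.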

  4. An Automated Process for Generation of New Fuel Breakdown Mechanisms

    National Research Council Canada - National Science Library

    Violi, Angela

    2006-01-01

    .... It combines advanced computational techniques in a synergistic study of the critical processes in fuel decomposition at a level of detail that can help distinguish, correct, and quantify mechanisms for these processes...

  5. Automated simulation and study of spatial-structural design processes

    NARCIS (Netherlands)

    Davila Delgado, J.M.; Hofmeyer, H.; Stouffs, R.; Sariyildiz, S.

    2013-01-01

    A so-called "Design Process Investigation toolbox" (DPI toolbox), has been developed. It is a set of computational tools that simulate spatial-structural design processes. Its objectives are to study spatial-structural design processes and to support the involved actors. Two case-studies are

  6. Safeguards inventory and process monitoring regulatory comparison

    Energy Technology Data Exchange (ETDEWEB)

    Cavaluzzi, Jack M. [Texas A & M Univ., College Station, TX (United States); Gibbs, Philip W. [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2013-06-27

Detecting the theft or diversion of the relatively small amount of fissile material needed to make a nuclear weapon, given the normal operating capacity of many of today's running nuclear production facilities, is a difficult task. As throughput increases, the ability of the Material Control and Accountability (MC&A) Program to detect material loss decreases, because the statistical measurement uncertainty also increases. The challenge is that the ability of current accounting, measurement, and material control programs to detect small yet significant losses can, under some regulatory approaches, decrease to the point where it is extremely low, if not practically non-existent, at normal operating capacities. Adding concern to this topic, there are variations among regulatory bodies as to what is considered a Significant Quantity (SQ). Some research suggests that thresholds should be lower than those found in any current regulation, which, if adopted, would make meeting detection goals even more difficult. This paper reviews and compares the current regulatory requirements for the MA elements related to physical inventory, uncertainty of the Inventory Difference (ID), and Process Monitoring (PM) in the United States Department of Energy (DOE) and Nuclear Regulatory Commission (NRC), Rosatom of the Russian Federation, and the Chinese Atomic Energy Agency (CAEA) of China. The comparison looks at how the regulatory requirements for the implementation of various MA elements perform across a range of operating capacities in example facilities.

  7. The integration of process monitoring for safeguards

    International Nuclear Information System (INIS)

    Cipiti, Benjamin B.; Zinaman, Owen R.

    2010-01-01

    The Separations and Safeguards Performance Model is a reprocessing plant model that has been developed for safeguards analyses of future plant designs. The model has been modified to integrate bulk process monitoring data with traditional plutonium inventory balances to evaluate potential advanced safeguards systems. Taking advantage of the wealth of operator data such as flow rates and mass balances of bulk material, the timeliness of detection of material loss was shown to improve considerably. Four diversion cases were tested including both abrupt and protracted diversions at early and late times in the run. The first three cases indicated alarms before half of a significant quantity of material was removed. The buildup of error over time prevented detection in the case of a protracted diversion late in the run. Some issues related to the alarm conditions and bias correction will need to be addressed in future work. This work both demonstrates the use of the model for performing diversion scenario analyses and for testing advanced safeguards system designs.
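
At the core of combining bulk process data with plutonium inventory balances is an inventory-difference (ID) test against propagated measurement uncertainty. A minimal sketch with a conventional 3-sigma alarm rule; the formula is the standard materials-balance definition, while the alarm constant and error model are illustrative, not the model's actual logic.

```python
import math

def inventory_difference(begin_inv, receipts, removals, end_inv):
    """ID (also called MUF): BI + R - S - EI for one balance period."""
    return begin_inv + receipts - removals - end_inv

def id_alarm(id_value, measurement_sigmas, k=3.0):
    """Alarm when |ID| exceeds k standard deviations of the combined
    measurement uncertainty, with independent errors added in
    quadrature. k = 3 is a conventional, illustrative choice."""
    sigma_id = math.sqrt(sum(s * s for s in measurement_sigmas))
    return abs(id_value) > k * sigma_id
```

As the abstract notes, the propagated sigma grows over time for protracted scenarios, which is exactly why a late, slow diversion can stay below the alarm threshold.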

  8. FLAME MONITORING IN POWER STATION BOILERS USING IMAGE PROCESSING

    Directory of Open Access Journals (Sweden)

    K. Sujatha

    2012-05-01

Combustion quality in power station boilers plays an important role in minimizing flue gas emissions. In the present work, various intelligent schemes to infer flue gas emissions by monitoring the flame colour at the furnace of the boiler are proposed. Flame image monitoring involves capturing the flame video over a period of time together with measurement of various parameters: carbon dioxide (CO2), excess oxygen (O2), nitrogen dioxide (NOx), sulphur dioxide (SOx) and carbon monoxide (CO) emissions, plus the flame temperature at the core of the fireball, the air/fuel ratio and the combustion quality. The higher the combustion quality, the lower the flue gas emissions at the exhaust. The flame video was captured using an infrared camera and then split into frames for further analysis; the video splitter is used for progressive extraction of the flame images from the video. The images of the flame are then pre-processed to reduce noise. The conventional classification and clustering techniques include the Euclidean distance (L2 norm) classifier. The intelligent classifiers include the Radial Basis Function network (RBF), the Back Propagation Algorithm (BPA) and a parallel architecture combining RBF and BPA (PRBFBPA). The results of the validation are supported by the above-mentioned performance measures, whose values are in the optimal range. The temperatures, combustion quality, SOx, NOx, CO and CO2 concentrations, and air and fuel supplied corresponding to the images were obtained, thereby indicating the necessary control action to increase or decrease the air supply so as to ensure complete combustion. In this work, by continuously monitoring the flame images, combustion quality was inferred (complete/partial/incomplete combustion) and the air/fuel ratio could be automatically varied. Moreover, in the existing set-up, measurements like NOx, CO and CO2 are inferred from the samples that are collected periodically or by
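
The Euclidean distance (L2 norm) classifier named among the conventional techniques is a nearest-centroid rule over flame-image features. A minimal sketch with made-up two-dimensional features standing in for per-frame flame colour statistics:

```python
import numpy as np

def fit_centroids(features, labels):
    """Mean feature vector (e.g. mean flame-region colour) per class."""
    classes = np.unique(labels)
    centroids = np.array([features[labels == c].mean(axis=0) for c in classes])
    return classes, centroids

def l2_classify(x, classes, centroids):
    """Assign x to the class whose centroid is nearest in Euclidean
    (L2 norm) distance."""
    return classes[np.argmin(np.linalg.norm(centroids - x, axis=1))]


# Two toy classes (e.g. "complete" vs "incomplete" combustion frames)
feats = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
labs = np.array([0, 0, 1, 1])
classes, cents = fit_centroids(feats, labs)
```

RBF and back-propagation networks, as in the study, replace this fixed-distance rule with learned decision boundaries.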

  9. Development of a fully automated network system for long-term health-care monitoring at home.

    Science.gov (United States)

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for the prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. Preliminary experiments using this room confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor systems installed in the bathtub, toilet and bed, respectively.

  10. System of automated processing of radionuclide investigations (SAPRI-01) in clinical practice

    International Nuclear Information System (INIS)

    Sivachenko, T.P.; Mechev, D.S.; Krupka, I.N.

    1988-01-01

    The results of clinical testing of the SAPRI-01 system, designed for the automated collection, storage and processing of radionuclide investigation data, are described. Examples are given of automated RCG processing and of positive scintigraphy of tumours of different sites using 67Ga-citrate and 99mTc-pertechnetate in static and dynamic investigations. Shortcomings of the system, and ways of updating it during serial production, are pointed out. The wide introduction of the system into clinical practice was shown to hold promise

  11. Development of Process Automation in the Neutron Activation Analysis Facility in Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Yussup, N.; Azman, A.; Ibrahim, M.M.; Rahman, N.A.A.; Che Sohashaari, S.; Atan, M.N.; Hamzah, M.A.; Mokhtar, M.; Khalid, M.A.; Salim, N.A.A.; Hamzah, M.S.

    2018-01-01

    Neutron Activation Analysis (NAA) has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, from sample registration to analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time-consuming and inefficient. Hence, system automation was developed in order to provide an effective method to replace redundant manual data entries and speed up the sample analysis and calculation process. This report explains the NAA process at Nuclear Malaysia and describes the automation development in detail, including the sample registration software; the automatic sample changer system, which consists of hardware and software; and the sample analysis software. (author)

  12. Automated processing of webcam images for phenological classification.

    Science.gov (United States)

    Bothmann, Ludwig; Menzel, Annette; Menze, Bjoern H; Schunk, Christian; Kauermann, Göran

    2017-01-01

    Along with global climate change, there is increasing interest in its effect on phenological patterns such as the start and end of the growing season. Scientific digital webcams are used for this purpose, taking one or more images per day of the same natural scene, showing for example trees or grassland sites. To derive phenological patterns from the webcam images, regions of interest are manually defined on these images by an expert, and subsequently a time series of percentage greenness is derived and analyzed with respect to structural changes. While this standard approach leads to satisfying results and allows dates of phenological change points to be determined, it involves a considerable amount of manual work and is therefore constrained to a limited number of webcams. In particular, this precludes applying the phenological analysis to a large network of publicly accessible webcams in order to capture spatial phenological variation. To scale the analysis up to several hundreds or thousands of webcams, we propose and evaluate two automated alternatives for the definition of regions of interest, allowing for efficient analyses of webcam images. A semi-supervised approach selects pixels based on the correlation of each pixel's time series of percentage greenness with a few prototype pixels. An unsupervised approach clusters pixels based on scores of a singular value decomposition. We show for a scientific webcam that the resulting regions of interest are at least as informative as those chosen by an expert, with the advantage that no manual action is required. Additionally, we show that the methods can even be applied to publicly available webcams accessed via the internet, yielding interesting partitions of the analyzed images. Finally, we show that the methods are suitable for the intended big-data applications by analyzing 13988 webcams from the AMOS database. All developed methods are implemented in the statistical software
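
    The semi-supervised approach described above can be sketched as follows: keep the pixels whose greenness time series correlates strongly with a prototype pixel's series. The pixel names, series values, and 0.8 threshold below are illustrative assumptions, not the paper's actual settings:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equally long time series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def select_roi(greenness, prototype, threshold=0.8):
    """greenness: dict mapping pixel id -> time series of % greenness.
    Keep pixels whose series correlates strongly with the prototype's."""
    proto = greenness[prototype]
    return {p for p, series in greenness.items()
            if pearson(series, proto) >= threshold}

# Hypothetical greenness series: two vegetation pixels rise over the
# season, one sky pixel does not.
greenness = {
    "p_tree_1": [10, 20, 30, 40, 50],   # chosen as prototype
    "p_tree_2": [12, 19, 33, 41, 49],
    "p_sky":    [50, 48, 52, 49, 51],
}
print(select_roi(greenness, "p_tree_1"))
```

The unsupervised alternative would replace the prototype correlation with clustering on the pixel scores of a singular value decomposition of the same series matrix.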

  13. Automating the Object-Oriented Software Development Process: Workshop Report

    NARCIS (Netherlands)

    Aksit, Mehmet; Tekinerdogan, B.

    1998-01-01

    Cost-effective realization of robust, adaptable and reusable software systems demands efficient and effective management of the overall software production process. Current object-oriented methods are not completely formalized and lack the ability to reason about the quality of processes and

  14. Automating the Object-Oriented Software Development Process: Workshop Report

    NARCIS (Netherlands)

    Aksit, Mehmet; Demeyer, S.; Bosch, H.G.P.; Tekinerdogan, B.

    Cost-effective realization of robust, adaptable and reusable software systems demands efficient and effective management of the overall software production process. Current object-oriented methods are not completely formalized and lack the ability to reason about the quality of processes and

  15. Automated processing of webcam images for phenological classification.

    Directory of Open Access Journals (Sweden)

    Ludwig Bothmann

    Full Text Available Along with global climate change, there is increasing interest in its effect on phenological patterns such as the start and end of the growing season. Scientific digital webcams are used for this purpose, taking one or more images per day of the same natural scene, showing for example trees or grassland sites. To derive phenological patterns from the webcam images, regions of interest are manually defined on these images by an expert, and subsequently a time series of percentage greenness is derived and analyzed with respect to structural changes. While this standard approach leads to satisfying results and allows dates of phenological change points to be determined, it involves a considerable amount of manual work and is therefore constrained to a limited number of webcams. In particular, this precludes applying the phenological analysis to a large network of publicly accessible webcams in order to capture spatial phenological variation. To scale the analysis up to several hundreds or thousands of webcams, we propose and evaluate two automated alternatives for the definition of regions of interest, allowing for efficient analyses of webcam images. A semi-supervised approach selects pixels based on the correlation of each pixel's time series of percentage greenness with a few prototype pixels. An unsupervised approach clusters pixels based on scores of a singular value decomposition. We show for a scientific webcam that the resulting regions of interest are at least as informative as those chosen by an expert, with the advantage that no manual action is required. Additionally, we show that the methods can even be applied to publicly available webcams accessed via the internet, yielding interesting partitions of the analyzed images. Finally, we show that the methods are suitable for the intended big-data applications by analyzing 13988 webcams from the AMOS database. All developed methods are implemented in the

  16. Design and development on automated control system of coated fuel particle fabrication process

    International Nuclear Information System (INIS)

    Liu Malin; Shao Youlin; Liu Bing

    2013-01-01

    With the trend toward large-scale production of HTR coated fuel particles, the original manual control system can no longer meet the requirements, and an industrial-grade automated control system for coated fuel particle fabrication needs to be developed. A comprehensive analysis of the successive four-layer coating process for TRISO-type coated fuel particles was carried out. It was found that the coating process could be divided into five subsystems and nine operating states, and the establishment of a DCS-type (distributed control system) automated control system was proposed. According to the rigorous requirements of the coated particle preparation process, design considerations for the DCS were proposed, including coordinated control, safety and reliability, integration to specification, practicality and ease of use, and openness to updates. A complete automated control system for the coated fuel particle preparation process was built to these principles in manufacturing practice. The automated control system was put into operation in the production of irradiated samples for the HTR-PM demonstration project. The experimental results prove that the system achieves better control of the coated fuel particle preparation process and meets the requirements of factory-scale production. (authors)

  17. The feasibility of automated online flow cytometry for in-situ monitoring of microbial dynamics in aquatic ecosystems

    Science.gov (United States)

    Besmer, Michael D.; Weissbrodt, David G.; Kratochvil, Bradley E.; Sigrist, Jürg A.; Weyland, Mathias S.; Hammes, Frederik

    2014-01-01

    Fluorescent staining coupled with flow cytometry (FCM) is often used for the monitoring, quantification and characterization of bacteria in engineered and environmental aquatic ecosystems including seawater, freshwater, drinking water, wastewater, and industrial bioreactors. However, infrequent grab sampling hampers accurate characterization and subsequent understanding of microbial dynamics in all of these ecosystems. A logical technological progression is high-throughput operation and full automation of the sampling, staining, measurement, and data analysis steps. Here we assess the feasibility and applicability of automated FCM by means of actual data sets produced with prototype instrumentation. As proof-of-concept we demonstrate examples of microbial dynamics in (i) flowing tap water from a municipal drinking water supply network and (ii) river water from a small creek subject to two rainfall events. In both cases, automated measurements were done at 15-min intervals during 12–14 consecutive days, yielding more than 1000 individual data points for each ecosystem. The extensive data sets derived from the automated measurements allowed for the establishment of baseline data for each ecosystem, as well as for the recognition of daily variations and specific events that would most likely be missed (or mischaracterized) by infrequent sampling. In addition, the online FCM data from the river water was combined and correlated with online measurements of abiotic parameters, showing considerable potential for a better understanding of cause-and-effect relationships in aquatic ecosystems. Although several challenges remain, the successful operation of an automated online FCM system and the basic interpretation of the resulting data sets represent a breakthrough toward the eventual establishment of fully automated online microbiological monitoring technologies. PMID:24917858
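
    One way such a 15-min-interval cell-count stream could be screened for specific events against a baseline is a rolling median with a MAD-based tolerance band; this is a minimal sketch under assumed settings (window length, threshold), not the authors' actual analysis:

```python
import statistics

def flag_events(counts, window=96, k=3.0):
    """Flag measurements deviating strongly from a rolling baseline.
    counts: cell concentrations at fixed intervals; window=96 spans one
    day at 15-min resolution. A point is flagged when it lies more than
    k robust standard deviations (MAD-based) from the rolling median of
    the preceding window."""
    events = []
    for i in range(window, len(counts)):
        base = counts[i - window:i]
        med = statistics.median(base)
        mad = statistics.median(abs(c - med) for c in base)
        sigma = 1.4826 * mad or 1.0   # fall back when MAD is zero
        if abs(counts[i] - med) > k * sigma:
            events.append(i)
    return events

# A stable series with one spike (hypothetical counts, short window):
print(flag_events([100] * 10 + [500, 100], window=4))  # [10]
```

Flagged indices would then be inspected alongside the abiotic parameters, as in the rainfall-event correlations described above.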

  18. Fully Automated Concentration Control of the Acidic Texturisation Process

    OpenAIRE

    Dannenberg, T.; Zimmer, M.; Rentsch, J.

    2012-01-01

    To enable concentration control in the acidic texturing process, we have closed the feedback loop from analytical data to the dosing mechanism of the process tool. To analyze the process bath, we used near-infrared spectroscopy in an online setup, as well as ion chromatography as an inline method in a second approach. The developed dosing algorithm allows concentration optimization of HF and HNO3 as a function of the Si concentration. This allows a further optimization o...

  19. Aozan: an automated post-sequencing data-processing pipeline.

    Science.gov (United States)

    Perrin, Sandrine; Firmo, Cyril; Lemoine, Sophie; Le Crom, Stéphane; Jourdren, Laurent

    2017-07-15

    Data management and quality control of output from Illumina sequencers is a disk space- and time-consuming task. Thus, we developed Aozan to automatically handle data transfer, demultiplexing, conversion and quality control once a run has finished. This software greatly improves run data management and the monitoring of run statistics via automatic emails and HTML web reports. Aozan is implemented in Java and Python, supported on Linux systems, and distributed under the GPLv3 License at: http://www.outils.genomique.biologie.ens.fr/aozan/ . Aozan source code is available on GitHub: https://github.com/GenomicParisCentre/aozan . aozan@biologie.ens.fr. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  20. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    Science.gov (United States)

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for coagulation analyses. During the implementation of laboratory automation, we began operating a pre- and postanalytical automation system whose preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet-poor plasma specimens. To this end, manually processed specimens centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between the different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between −5% and +6%. For seldom-performed assays that do not mandate full automation, Passing-Bablok regression analysis showed acceptable to poor agreement between the centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
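
    The Passing-Bablok method-comparison regression used above can be sketched as a shifted median of all pairwise slopes; this is a minimal illustration of the point estimates from the 1983 procedure, omitting its confidence intervals and linearity test:

```python
import statistics

def passing_bablok(x, y):
    """Passing-Bablok point estimates (sketch): the slope is the median
    of all pairwise slopes, shifted by the number K of slopes < -1
    (slopes equal to -1 are excluded); the intercept is the median
    residual y - b*x."""
    slopes = []
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            if x[j] != x[i]:
                s = (y[j] - y[i]) / (x[j] - x[i])
                if s != -1.0:
                    slopes.append(s)
    slopes.sort()
    k = sum(1 for s in slopes if s < -1.0)
    m = len(slopes)
    if m % 2:                      # shifted median (0-indexed)
        b = slopes[(m - 1) // 2 + k]
    else:
        b = 0.5 * (slopes[m // 2 - 1 + k] + slopes[m // 2 + k])
    a = statistics.median(yi - b * xi for xi, yi in zip(x, y))
    return b, a

# Hypothetical paired results from two centrifugation methods:
print(passing_bablok([1, 2, 3, 4], [3, 5, 7, 9]))  # (2.0, 1.0)
```

A slope near 1 and intercept near 0 would indicate agreement between the two centrifugation methods; the paper's reported biases come from this kind of comparison.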

  1. An Improvement in Thermal Modelling of Automated Tape Placement Process

    International Nuclear Information System (INIS)

    Barasinski, Anaies; Leygue, Adrien; Poitou, Arnaud; Soccard, Eric

    2011-01-01

    The thermoplastic tape placement process offers the possibility of manufacturing large laminated composite parts with all kinds of geometries (e.g., doubly curved). The process is based on the fusion bonding of a thermoplastic tape onto a substrate, and it has received growing interest in recent years because of its out-of-autoclave capability. In order to control and optimize the quality of the manufactured part, the temperature field throughout the processing of the laminate must be predicted. In this work, we focus on a thermal model of the process that takes into account the imperfect bonding between the different layers of the substrate by introducing a thermal contact resistance into the model. This study builds on experimental results indicating that the value of the thermal resistance evolves with the temperature and pressure applied to the material.

  2. Automated input data management in manufacturing process simulation

    OpenAIRE

    Ettefaghian, Alireza

    2015-01-01

    Input Data Management (IDM) is a time-consuming and costly process for Discrete Event Simulation (DES) projects. Input data management is considered the basis of real-time process simulation (Bergmann, Stelzer and Strassburger, 2011). According to Bengtsson et al. (2009), the data input phase constitutes on average about 31% of the time of an entire simulation project. Moreover, the lack of interoperability between manufacturing applications and simulation software leads to a high cost to ...

  3. A feasibility study of color flow Doppler vectorization for automated blood flow monitoring.

    Science.gov (United States)

    Schorer, R; Badoual, A; Bastide, B; Vandebrouck, A; Licker, M; Sage, D

    2017-12-01

    An ongoing issue in vascular medicine is the measurement of blood flow. Catheterization remains the gold-standard measurement method, although non-invasive techniques are an area of intense research. We present a computational method for real-time measurement of blood flow from color flow Doppler data, with a focus on simplicity and monitoring rather than diagnostics, and we analyze the performance of a proof-of-principle software implementation. We devised a geometrical model geared towards blood flow computation from a color flow Doppler signal, and we developed a software implementation requiring only a standard diagnostic ultrasound device. Detection performance was evaluated by computing flow and its determinants (flow speed, vessel area, and ultrasound beam angle of incidence) on purposely designed synthetic and phantom-based arterial flow simulations. Flow was appropriately detected in all cases. Errors on synthetic images ranged from nonexistent to substantial depending on experimental conditions. Mean errors on measurements from our phantom flow simulation ranged from 1.2 to 40.2% for angle estimation, and from 3.2 to 25.3% for real-time flow estimation. This study is a proof of concept showing that accurate measurement can be achieved with automated color flow Doppler signal extraction, giving the industry an opportunity for further optimization using raw ultrasound data.
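
    The flow computation from the determinants listed above (flow speed, vessel area, beam angle) reduces to an angle-corrected velocity times a cross-sectional area. A minimal sketch, with all input numbers hypothetical and a circular-vessel assumption:

```python
import math

def volumetric_flow(v_doppler_cm_s, vessel_diameter_cm, beam_angle_deg):
    """Estimate volumetric blood flow (mL/s) from a colour-Doppler
    velocity reading. Doppler measures only the velocity component along
    the beam, so the reading is divided by cos(angle of incidence);
    flow = corrected mean velocity x vessel cross-sectional area."""
    v_true = v_doppler_cm_s / math.cos(math.radians(beam_angle_deg))
    area = math.pi * (vessel_diameter_cm / 2.0) ** 2  # cm^2; 1 cm^3 = 1 mL
    return v_true * area

print(volumetric_flow(20.0, 1.0, 60.0))  # ≈ 31.4 mL/s
```

The large angle-estimation errors reported above matter because the cosine correction amplifies them sharply as the beam approaches perpendicular incidence.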

  4. Affordable Bimodal Optical Sensors to Spread the Use of Automated Insect Monitoring

    Directory of Open Access Journals (Sweden)

    Ilyas Potamitis

    2018-01-01

    Full Text Available We present a novel bimodal optoelectronic sensor, based on Fresnel lenses, and the associated stereo-recording device that records the wingbeat of an insect in flight as backscattered and extinction light. We investigate the complementary information of these two sources of biometric evidence, and we finally embed part of this technology in an electronic e-trap for fruit flies. The e-trap examines the spectral content of the wingbeat of an insect flying in and wirelessly reports counts and species identity. We designed our devices to be optimized in terms of detection accuracy and power consumption but, above all, we ensured that they are affordable. Our aim is to spread the use of electronic insect traps that report, in virtually real time, the level of the pest population from the field straight to a human-controlled agency. Our vision is to establish remote automated monitoring for all insects of economic and hygienic importance at large spatial scales, using their wingbeat as biometric evidence. To this end, we provide open access to the implementation details, recordings, and classification code we developed.
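
    The spectral-content step described above starts by extracting the wingbeat fundamental from a recording. This can be sketched with a naive discrete Fourier transform; the 8 kHz sampling rate and 250 Hz test tone are illustrative assumptions, not the sensor's actual parameters:

```python
import math

def dominant_frequency(signal, fs):
    """Naive DFT: return the frequency of the bin with the largest
    magnitude (ignoring DC), a stand-in for the wingbeat fundamental."""
    n = len(signal)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

# Synthetic 250 Hz "wingbeat" sampled at 8 kHz:
fs = 8000
sig = [math.sin(2 * math.pi * 250 * t / fs) for t in range(256)]
print(dominant_frequency(sig, fs))  # 250.0
```

A species classifier would then operate on the fundamental plus its harmonic envelope rather than on this single value.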

  5. Automated remote cameras for monitoring alluvial sandbars on the Colorado River in Grand Canyon, Arizona

    Science.gov (United States)

    Grams, Paul E.; Tusso, Robert B.; Buscombe, Daniel

    2018-02-27

    Automated camera systems deployed at 43 remote locations along the Colorado River corridor in Grand Canyon National Park, Arizona, are used to document sandbar erosion and deposition associated with the operations of Glen Canyon Dam. The camera systems, which can operate independently for a year or more, consist of a digital camera triggered by a separate data controller, both powered by an external battery and solar panel. Analysis of images for categorical changes in sandbar size shows deposition at 50 percent or more of monitoring sites during controlled flood releases in 2012, 2013, 2014, and 2016. The images also depict erosion of sandbars and show that erosion rates were highest in the first 3 months following each controlled flood. Erosion rates were highest in 2015, the year of the highest annual dam-release volume. The categorical estimates of sandbar change agree with sandbar change (erosion or deposition) measured by topographic surveys in 76 percent of the cases evaluated. A semiautomated method is presented for quantifying changes in sandbar area from the remote-camera images by rectifying the oblique images and segmenting the sandbar from the rest of the image. Sandbar area calculated by this method agrees with sandbar area determined by topographic survey to within approximately 8 percent, and allows sandbar area to be quantified monthly (or more frequently).
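
    Once an oblique image has been rectified and the sandbar segmented, the area calculation reduces to a pixel count times the ground footprint of a pixel; the sketch below also reuses the ~8 percent agreement figure as a hypothetical no-change band for categorical calls (an assumption, not the authors' rule):

```python
def sandbar_area_m2(mask, m_per_px):
    """Planimetric sandbar area from a rectified (map-view) binary mask:
    sandbar pixel count times the ground area of one pixel."""
    return sum(row.count(1) for row in mask) * m_per_px ** 2

def categorize_change(area_before, area_after, tol=0.08):
    """Categorical change call with a relative no-change band of tol."""
    rel = (area_after - area_before) / area_before
    if rel > tol:
        return "deposition"
    if rel < -tol:
        return "erosion"
    return "no change"

# Tiny hypothetical mask (1 = sandbar) at 0.5 m ground resolution:
print(sandbar_area_m2([[1, 1, 0], [0, 1, 0]], 0.5))  # 0.75
print(categorize_change(100.0, 120.0))               # deposition
```

Repeating the area calculation on monthly images yields the erosion time series described above without repeat topographic surveys.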

  6. Statistical data processing with automatic system for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Zarkh, V.G.; Ostroglyadov, S.V.

    1986-01-01

    The practice of statistical processing of radiation monitoring data is exemplified, and some of the results obtained are presented. Experience in the practical application of mathematical statistics to radiation monitoring data allowed a concrete statistical processing algorithm to be developed and implemented on an M-6000 minicomputer. The algorithm is divided into three parts: parametric data processing and hypothesis testing, pair correlation analysis, and multiple correlation analysis. The statistical processing programs run in dialogue (interactive) mode. The algorithm was used to process observation data from a radioactive waste disposal control region; results of the processing of surface-water monitoring data are presented.

  7. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen

    2011-12-01

    Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by existing technical design practices and assumptions (design documents, expert feedback), available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking with other process models and methods (notably the RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced by semantic process thinking, by enriching design information, and by automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into

  8. Automated multiscale morphometry of muscle disease from second harmonic generation microscopy using tensor-based image processing.

    Science.gov (United States)

    Garbe, Christoph S; Buttgereit, Andreas; Schürmann, Sebastian; Friedrich, Oliver

    2012-01-01

    Practically all chronic diseases are characterized by tissue remodeling that alters organ and cellular function through changes to normal organ architecture. Some morphometric alterations become irreversible and account for disease progression even at the cellular level. Early diagnostics to categorize tissue alterations, as well as monitoring progression or remission of disturbed cytoarchitecture upon treatment in the same individual, are a new emerging field. They strongly challenge spatial resolution and require advanced imaging techniques and strategies for detecting morphological changes. We use a combined second harmonic generation (SHG) microscopy and automated image processing approach to quantify morphology with age in an animal model of inherited Duchenne muscular dystrophy (the mdx mouse). Multiphoton XYZ image stacks from tissue slices reveal vast morphological deviation in muscles from old mdx mice at different scales of cytoskeletal architecture: cell calibers are irregular, myofibrils within cells are twisted, and sarcomere lattice disruptions (detected as "verniers") are larger in number compared to samples from healthy mice. In young mdx mice, such alterations are only minor. The boundary-tensor approach, adapted and optimized for SHG data, is a suitable approach for quick quantitative morphometry in whole tissue slices. The overall detection performance of the automated algorithm compares very well with manual "by eye" detection, the latter being time consuming and prone to subjective errors. Our algorithm outperforms manual detection in speed with similar reliability. This approach will be an important prerequisite for the implementation of clinical image databases to diagnose and monitor specific morphological alterations in chronic (muscle) diseases. © 2011 IEEE
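
    A structure-tensor orientation estimate, a close relative of the boundary-tensor approach named above, illustrates the kind of tensor-based image processing involved; the synthetic stripe pattern below stands in for SHG data and is not from the study:

```python
import math

def dominant_orientation(img):
    """Structure-tensor sketch: average the gradient outer products over
    the image and return the dominant gradient orientation in degrees.
    For a regular sarcomere lattice this is well defined; twisted
    myofibrils would show locally varying orientations."""
    h, w = len(img), len(img[0])
    jxx = jyy = jxy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix = (img[y][x + 1] - img[y][x - 1]) / 2.0  # central differences
            iy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            jxx += ix * ix
            jyy += iy * iy
            jxy += ix * iy
    return math.degrees(0.5 * math.atan2(2.0 * jxy, jxx - jyy))

# Synthetic stripe pattern (bands varying along x only):
img = [[math.sin(x) for x in range(20)] for _ in range(20)]
print(dominant_orientation(img))  # 0.0 (gradients along x)
```

Local deviations from the dominant orientation, accumulated over windows, are one way lattice disruptions ("verniers") can be made countable.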

  9. Monitoring, accounting and automated decision support for the ALICE experiment based on the MonALISA framework

    CERN Document Server

    Cirstoiu, C; Betev, L; Saiz, P; Peters, A J; Muraru, A; Voicu, R; Legrand, I

    2007-01-01

    We are developing a general-purpose monitoring system for the ALICE experiment, based on the MonALISA framework. MonALISA (Monitoring Agents using a Large Integrated Services Architecture) is a fully distributed system with no single point of failure that is able to collect and store monitoring information and present it as significant perspectives and synthetic views of the status and trends of the entire system. Furthermore, agents can use it to take automated operational decisions. Monitoring information is gathered locally from all the components running at each site. The entire flow of information is aggregated at site level by a MonALISA service and then collected and presented in various forms by a central MonALISA Repository. Based on this information, other services take operational decisions such as alerts, triggers, service restarts and automatic production-job or transfer submissions. The system monitors all the components: computer clusters (all major parameters of each computing node), jobs ...

  10. MO-G-BRE-03: Automated Continuous Monitoring of Patient Setup with Second-Check Independent Image Registration

    International Nuclear Information System (INIS)

    Jiang, X; Fox, T; Schreibmann, E

    2014-01-01

    Purpose: To create an unsupervised quality assurance program to monitor image-based patient setup. The system acts as a secondary check by independently computing shifts and rotations, and interfaces with Varian's database to verify the therapist's work and warn against sub-optimal setups. Methods: Temporary digitally reconstructed radiographs (DRRs) and OBI radiographic image files created by Varian's treatment console during patient setup are intercepted and used as input to an independent registration module, customized for accuracy, that determines the optimal rotations and shifts. To deal with the poor quality of OBI images, histogram equalization of the live images to their DRR counterparts is performed as a pre-processing step. A search for the most sensitive metric was performed by plotting search spaces subject to various translations, and convergence analysis was applied to ensure the optimizer finds the global minimum. The final system configuration uses the NCC metric with 150 histogram bins and a one-plus-one optimizer running for 2000 iterations, with customized scales for translations and rotations, in a multi-stage optimization that first corrects translations and subsequently rotations. Results: The system was installed clinically to monitor and provide near real-time feedback on patient positioning. Over a 2-month period, uncorrected pitch values had a mean of 0.016° with a standard deviation of 1.692°, and couch rotations were −0.090° ± 1.547°. The couch shifts were −0.157 ± 0.466 cm vertically, 0.045 ± 0.286 cm laterally and 0.084 ± 0.501 cm longitudinally. Uncorrected pitch angles were the most common source of discrepancies; large variations in the pitch angles were correlated with patient motion inside the mask. Conclusion: A system for automated quality assurance of the therapist's registration was designed and tested in clinical practice. The approach complements the clinical software's automated registration in
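
    The role of the NCC metric in the translation search can be illustrated in one dimension: score every candidate shift of an intensity profile against the reference and keep the best. This is a simplified sketch with hypothetical profiles, not the clinical implementation (which optimizes 2D/3D transforms iteratively):

```python
import math

def ncc(a, b):
    """Normalized cross-correlation of two equal-length intensity
    profiles; 1.0 indicates a perfect linear match."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def best_shift(ref, live, max_shift=4):
    """Exhaustive 1-D translation search: NCC between the reference
    (DRR-like) profile and every shifted window of the live profile."""
    scores = {}
    for s in range(-max_shift, max_shift + 1):
        a = [ref[i] for i in range(len(ref)) if 0 <= i + s < len(live)]
        b = [live[i + s] for i in range(len(ref)) if 0 <= i + s < len(live)]
        scores[s] = ncc(a, b)
    return max(scores, key=scores.get)

ref = [0, 0, 1, 3, 7, 3, 1, 0, 0, 0]          # hypothetical DRR profile
live = [0, 0, 0, 0, 1, 3, 7, 3, 1, 0]         # same feature, shifted by 2
print(best_shift(ref, live))  # 2
```

NCC's invariance to linear intensity changes is what makes it robust to the brightness differences between DRR and OBI images that the histogram equalization step only partially removes.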

  11. MO-G-BRE-03: Automated Continuous Monitoring of Patient Setup with Second-Check Independent Image Registration

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, X; Fox, T; Schreibmann, E [Department of Radiation Oncology and Winship Cancer Institute, Emory University, Atlanta, GA (United States)

    2014-06-15

    Purpose: To create an unsupervised quality assurance program to monitor image-based patient setup. The system acts as a secondary check by independently computing shifts and rotations, and interfaces with Varian's database to verify the therapist's work and warn against sub-optimal setups. Methods: Temporary digitally reconstructed radiographs (DRRs) and OBI radiographic image files created by Varian's treatment console during patient setup are intercepted and used as input to an independent registration module, customized for accuracy, that determines the optimal rotations and shifts. To deal with the poor quality of OBI images, histogram equalization of the live images to their DRR counterparts is performed as a pre-processing step. A search for the most sensitive metric was performed by plotting search spaces subject to various translations, and convergence analysis was applied to ensure the optimizer finds the global minimum. The final system configuration uses the NCC metric with 150 histogram bins and a one-plus-one optimizer running for 2000 iterations, with customized scales for translations and rotations, in a multi-stage optimization that first corrects translations and subsequently rotations. Results: The system was installed clinically to monitor and provide near real-time feedback on patient positioning. Over a 2-month period, uncorrected pitch values had a mean of 0.016° with a standard deviation of 1.692°, and couch rotations were −0.090° ± 1.547°. The couch shifts were −0.157 ± 0.466 cm vertically, 0.045 ± 0.286 cm laterally and 0.084 ± 0.501 cm longitudinally. Uncorrected pitch angles were the most common source of discrepancies; large variations in the pitch angles were correlated with patient motion inside the mask. Conclusion: A system for automated quality assurance of the therapist's registration was designed and tested in clinical practice. The approach complements the clinical software's automated registration in

  12. Towards automated processing of the right of access in inter-organizational Web Service compositions

    DEFF Research Database (Denmark)

    Herkenhöner, Ralph; De Meer, Hermann; Jensen, Meiko

    2010-01-01

    with trade secret protection. In this paper, we present an automated architecture to enable exercising the right of access in the domain of inter-organizational business processes based on Web Services technology. Deriving its requirements from the legal, economical, and technical obligations, we show...

  13. COED Transactions, Vol. IX, No. 9, September 1977. A Complete Course in Process Automation.

    Science.gov (United States)

    Marcovitz, Alan B., Ed.

    This document presents a mechanical engineering unit addressing essential aspects of computerized plant automation. The unit reviews the control of the simplest of all processes, a 1-measured variable, 1-controlled variable system, through the computer control of an air compressor. (SL)

  14. Process methods and levels of automation of wood pallet repair in the United States

    Science.gov (United States)

    Jonghun Park; Laszlo Horvath; Robert J. Bush

    2016-01-01

    This study documented the current status of wood pallet repair in the United States by identifying the types of processing and equipment usage in repair operations from an automation perspective. The wood pallet repair firms included in the study received an average of approximately 1.28 million cores (i.e., used pallets) for recovery in 2012. A majority of the cores...

  15. Relay for the automation of the exposition process in X-ray control of material quality

    International Nuclear Information System (INIS)

    Vladimirov, L.V.; Ermakova, T.N.; Krongauz, A.N.; Kurozaev, V.P.; Khlebtsevich, V.Yu.; Chernobrovov, S.V.; Shul'gina, Z.I.

    1977-01-01

    Discussed are the theoretical and experimental conceptions which constitute the basis for elaboration of an electronic relay intended for automation of the exposure process during X-ray inspection of the material quality. The operating principle and circuitry of the relay are described

  16. Preliminary Evaluation of an Aviation Safety Thesaurus' Utility for Enhancing Automated Processing of Incident Reports

    Science.gov (United States)

    Barrientos, Francesca; Castle, Joseph; McIntosh, Dawn; Srivastava, Ashok

    2007-01-01

    This document presents a preliminary evaluation of the utility of the FAA Safety Analytics Thesaurus (SAT) in enhancing automated document processing applications under development at NASA Ames Research Center (ARC). Current development efforts at ARC are described, including overviews of the statistical machine learning techniques that have been investigated. An analysis of opportunities for applying thesaurus knowledge to improving algorithm performance is then presented.

  17. Identifying and locating surface defects in wood: Part of an automated lumber processing system

    Science.gov (United States)

    Richard W. Conners; Charles W. McMillin; Kingyao Lin; Ramon E. Vasquez-Espinosa

    1983-01-01

    Continued increases in the cost of materials and labor make it imperative for furniture manufacturers to control costs by improved yield and increased productivity. This paper describes an Automated Lumber Processing System (ALPS) that employs computed tomography, optical scanning technology, the calculation of an optimum cutting strategy, and a computer-driven laser...

  18. Automated road segment creation process : a report on research sponsored by SaferSim.

    Science.gov (United States)

    2016-08-01

    This report provides a summary of a set of tools that can be used to automate the process of generating roadway surfaces from alignment and texture information. The tools developed were created in Python 3.x and rely on the availability of two da...

  19. 3-D image pre-processing algorithms for improved automated tracing of neuronal arbors.

    Science.gov (United States)

    Narayanaswamy, Arunachalam; Wang, Yu; Roysam, Badrinath

    2011-09-01

    The accuracy and reliability of automated neurite tracing systems are ultimately limited by image quality as reflected in the signal-to-noise ratio, contrast, and image variability. This paper describes a novel combination of image processing methods that operate on images of neurites captured by confocal and widefield microscopy, and produce synthetic images that are better suited to automated tracing. The algorithms are based on the curvelet transform (for denoising curvilinear structures and local orientation estimation), perceptual grouping by scalar voting (for elimination of non-tubular structures and improvement of neurite continuity while preserving branch points), adaptive focus detection, and depth estimation (for handling widefield images without deconvolution). The proposed methods are fast and capable of handling large images. Their ability to handle images of unlimited size derives from automated tiling of large images along the lateral dimension, and from processing 3-D images one optical slice at a time. Their speed derives in part from the fact that the core computations are formulated in terms of the Fast Fourier Transform (FFT), and in part from parallel computation on multi-core computers. The methods are simple to apply to new images since they require very few adjustable parameters, all of which are intuitive. Examples of pre-processing DIADEM Challenge images are used to illustrate improved automated tracing resulting from our pre-processing methods.
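The tiling strategy described above, processing an arbitrarily large slice block-by-block so the FFT-based core never sees more than one tile at a time, can be sketched as below. This is an illustrative stand-in only: the paper uses curvelet-based denoising, whereas this sketch substitutes a simple FFT low-pass filter, and all names are hypothetical.

```python
import numpy as np

def denoise_tile(tile, keep=0.25):
    """Suppress high-frequency content in one tile by zeroing FFT
    coefficients outside a central low-frequency square. A crude
    stand-in for the curvelet denoising used in the paper."""
    f = np.fft.fftshift(np.fft.fft2(tile))
    h, w = f.shape
    ch, cw = h // 2, w // 2
    rh, rw = max(1, int(h * keep)), max(1, int(w * keep))
    mask = np.zeros(f.shape, dtype=bool)
    mask[ch - rh:ch + rh, cw - rw:cw + rw] = True
    return np.real(np.fft.ifft2(np.fft.ifftshift(np.where(mask, f, 0))))

def process_in_tiles(image, tile=64, fn=denoise_tile):
    """Apply `fn` tile-by-tile along the lateral dimensions, so images
    of effectively unlimited size fit in memory one block at a time."""
    out = np.empty(image.shape, dtype=float)
    for r in range(0, image.shape[0], tile):
        for c in range(0, image.shape[1], tile):
            block = image[r:r + tile, c:c + tile]
            out[r:r + tile, c:c + tile] = fn(block)
    return out
```

A 3-D stack would be handled the same way, calling `process_in_tiles` on each optical slice in turn, as the abstract describes.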

  20. Current status of process monitoring for IAEA safeguards

    International Nuclear Information System (INIS)

    Koroyasu, M.

    1987-06-01

    Based on a literature survey, this report tries to answer some of the following questions on process monitoring for safeguards purposes in future large-scale reprocessing plants: what is process monitoring; what are its basic elements; what kinds of process monitoring are there; what are its basic problems; what is the relationship between process monitoring and near-real-time materials accountancy; what are the actual results of process monitoring tests; and what should be studied in the future. The annexes give a brief description of the Advanced Safeguards Approaches proposed by four states (France, the U.K., Japan and the U.S.A.), the approach proposed by the U.S.A., the description of process monitoring from the main part of a report published as a result of one of the U.S. Support Programmes for IAEA Safeguards, and an article on process monitoring presented at an IAEA Symposium held in November 1986. 24 refs, 20 figs, tabs

  1. Automation of the process of generation of the students insurance, applying RFID and GPRS technologies

    Directory of Open Access Journals (Sweden)

    Nelson Barrera-Lombana

    2013-07-01

    Full Text Available This article describes the design and implementation of a system that provides a consultation service on various parameters from a web server, using a GSM modem, information exchange systems over the Internet (ISS) and radio-frequency identification (RFID). The application is validated for use in automating the process of generating student insurance, and the hardware and software, developed by the Research Group in Robotics and Industrial Automation (GIRA) of UPTC, are used as a platform.

  2. Emergency healthcare process automation using mobile computing and cloud services.

    Science.gov (United States)

    Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G

    2012-10-01

    Emergency care is basically concerned with the provision of pre-hospital and in-hospital medical and/or paramedical services, and it typically involves a wide variety of interdependent and distributed activities that can be interconnected to form emergency care processes within and between Emergency Medical Service (EMS) agencies and hospitals. Hence, in developing an information system for emergency care processes, it is essential to support individual process activities and to satisfy collaboration and coordination needs by providing ready access to patient and operational information regardless of location and time. Filling this information gap by enabling the provision of the right information, to the right people, at the right time raises new challenges, including the specification of a common information format, interoperability among heterogeneous institutional information systems, and the development of new, ubiquitous trans-institutional systems. This paper is concerned with the development of integrated computer support for emergency care processes by evolving and cross-linking institutional healthcare systems. To this end, an integrated EMS cloud-based architecture has been developed that allows authorized users to access emergency case information in standardized document form, as proposed by the Integrating the Healthcare Enterprise (IHE) profile, uses the Organization for the Advancement of Structured Information Standards (OASIS) standard Emergency Data Exchange Language (EDXL) Hospital Availability Exchange (HAVE) for exchanging operational data with hospitals, and incorporates an intelligent module that supports triaging and selecting the most appropriate ambulances and hospitals for each case.
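The hospital-availability exchange mentioned above boils down to hospitals publishing small structured status documents that the EMS side can parse when choosing a destination. The sketch below builds such a document with the standard library; the element names are illustrative only and are not the normative OASIS EDXL-HAVE schema.

```python
import xml.etree.ElementTree as ET

def build_availability_message(hospital, beds_available, ed_status):
    """Build a simplified hospital-availability document in the spirit
    of EDXL-HAVE. Element names are illustrative, not the real schema."""
    root = ET.Element("HospitalStatus")
    h = ET.SubElement(root, "Hospital")
    ET.SubElement(h, "Name").text = hospital
    ET.SubElement(h, "BedsAvailable").text = str(beds_available)
    ET.SubElement(h, "EmergencyDepartmentStatus").text = ed_status
    return ET.tostring(root, encoding="unicode")

def beds_free(xml_str):
    """Consumer side: parse a status document and read the bed count."""
    root = ET.fromstring(xml_str)
    return int(root.find("Hospital/BedsAvailable").text)
```

A triage module like the one in the abstract would rank candidate hospitals by fields such as `BedsAvailable` together with travel time.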

  3. Exponential models applied to automated processing of radioimmunoassay standard curves

    International Nuclear Information System (INIS)

    Morin, J.F.; Savina, A.; Caroff, J.; Miossec, J.; Legendre, J.M.; Jacolot, G.; Morin, P.P.

    1979-01-01

    An improved computer-based procedure is described for the fitting of radioimmunoassay standard curves by means of an exponential model on a desk-top calculator. This method has been applied to a variety of radioassays and the results are in accordance with those obtained by more sophisticated models. [fr]
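An exponential standard-curve fit of the general kind described above can be done with nothing more than linear least squares on the logarithm of the response, which is well within reach of a desk-top calculator. The model form and function names below are my own illustration; the paper's exact parameterization is not given in the record.

```python
import math

def fit_exponential(xs, ys):
    """Fit y = a * exp(b * x) by ordinary least squares on ln(y),
    a minimal stand-in for an exponential standard-curve model."""
    n = len(xs)
    lys = [math.log(y) for y in ys]
    sx, sy = sum(xs), sum(lys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * ly for x, ly in zip(xs, lys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope of ln(y) vs x
    a = math.exp((sy - b * sx) / n)                 # exp(intercept)
    return a, b

def invert(a, b, y):
    """Read an unknown concentration off the fitted curve for a
    measured response y (the routine RIA use of the standard curve)."""
    return math.log(y / a) / b
```

In practice each unknown sample's counts would be passed through `invert` after the standards establish `a` and `b`.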

  4. Lyophilization: a useful approach to the automation of analytical processes?

    OpenAIRE

    de Castro, M. D. Luque; Izquierdo, A.

    1990-01-01

    An overview of the state-of-the-art in the use of lyophilization for the pretreatment of samples and standards prior to their storage and/or preconcentration is presented. The different analytical applications of this process are dealt with according to the type of material (reagent, standard, samples) and matrix involved.

  5. Automated system of monitoring and positioning of functional units of mining technological machines for coal-mining enterprises

    Directory of Open Access Journals (Sweden)

    Meshcheryakov Yaroslav

    2018-01-01

    Full Text Available This article describes the development of an automated monitoring and positioning system for functional units of mining technological machines. It describes the structure, component base, and algorithms for identifying the operating states of a walking excavator; various types of errors in the functioning of microelectromechanical gyroscopes and accelerometers; and methods for their correction based on the Madgwick fusion filter. The results of industrial tests of an automated monitoring and positioning system for functional units at one of the opencast coal mines of Kuzbass are presented. This work is addressed to specialists working in the fields of embedded systems and control systems, radio electronics, mechatronics, and robotics.
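The gyroscope/accelerometer correction problem named above is usually solved by fusing the two sensors: the gyro is accurate over short intervals but drifts, while the accelerometer's tilt estimate is drift-free but noisy. The sketch below shows a complementary filter, a much simpler cousin of the Madgwick filter cited in the record, purely to illustrate the fusion idea; it is not the system's actual algorithm.

```python
import math

def accel_tilt(ax, az):
    """Pitch estimate (radians) from the measured gravity vector."""
    return math.atan2(ax, az)

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One fusion step: integrate the gyro rate for short-term accuracy,
    and blend in the accelerometer tilt to cancel gyro drift.
    `alpha` close to 1 trusts the gyro; (1 - alpha) leaks in the
    drift-free accelerometer reference."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Run at each IMU sample, the filter converges to the accelerometer's long-term average while following fast gyro-measured motion in between.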

  6. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    Science.gov (United States)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets are assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), visualization (ViSiT, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected with sensors and analytical tools developed by multiple manufacturers, producing many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user model in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical...

  7. Radiation Monitoring System in Advanced Spent Fuel Conditioning Process Facility

    Energy Technology Data Exchange (ETDEWEB)

    You, Gil Sung; Kook, D. H.; Choung, W. M.; Ku, J. H.; Cho, I. J.; You, G. S.; Kwon, K. C.; Lee, W. K.; Lee, E. P

    2006-09-15

    The Advanced spent fuel Conditioning Process is under development for effective management of spent fuel by converting UO{sub 2} into U-metal. For demonstration of this process, an {alpha}-{gamma} type hot cell was built in the IMEF basement. To secure against radiation hazards, this facility needs a radiation monitoring system observing the entire operating area in front of the hot cell and the service area behind it. The system consists of seven parts: an Area Monitor for {gamma}-rays; a Room Air Monitor for particulates and iodine in both areas; a Hot Cell Monitor for the high-radiation hot cell interior and the rear door interlock; a Duct Monitor for particulates in the outlet ventilation; an Iodine Monitor for iodine in the outlet duct; CCTV for watching workers and material movement; and a Server for management of the whole monitoring system. After installation and testing, the radiation monitoring system is expected to assist the successful ACP demonstration.

  8. Radiation Monitoring System in Advanced Spent Fuel Conditioning Process Facility

    International Nuclear Information System (INIS)

    You, Gil Sung; Kook, D. H.; Choung, W. M.; Ku, J. H.; Cho, I. J.; You, G. S.; Kwon, K. C.; Lee, W. K.; Lee, E. P.

    2006-09-01

    The Advanced spent fuel Conditioning Process is under development for effective management of spent fuel by converting UO 2 into U-metal. For demonstration of this process, an α-γ type hot cell was built in the IMEF basement. To secure against radiation hazards, this facility needs a radiation monitoring system observing the entire operating area in front of the hot cell and the service area behind it. The system consists of seven parts: an Area Monitor for γ-rays; a Room Air Monitor for particulates and iodine in both areas; a Hot Cell Monitor for the high-radiation hot cell interior and the rear door interlock; a Duct Monitor for particulates in the outlet ventilation; an Iodine Monitor for iodine in the outlet duct; CCTV for watching workers and material movement; and a Server for management of the whole monitoring system. After installation and testing, the radiation monitoring system is expected to assist the successful ACP demonstration

  9. Performance evaluation of enzyme immunoassay for voriconazole therapeutic drug monitoring with automated clinical chemistry analyzers

    Directory of Open Access Journals (Sweden)

    Yongbum Jeon

    2017-08-01

    Full Text Available Objective: Voriconazole is a triazole antifungal developed for the treatment of fungal infectious disease, and the clinical utility of its therapeutic drug monitoring has been evaluated. Recently, a new assay for analyzing the serum voriconazole concentration with an automated clinical chemistry analyzer was developed. We evaluated the performance of the new assay based on standardized protocols. Methods: The analytical performance of the assay was evaluated according to its precision, trueness by recovery, limit of quantitation, linearity, and correlation with results from liquid chromatography-tandem mass spectrometry (LC-MS/MS. The evaluation was performed with the same protocol on two different routine chemistry analyzers. All evaluations were performed according to CLSI Guidelines EP15, EP17, EP6, and EP9 [1–4]. Results: Coefficients of variation for within-run and between-day imprecision were 3.2–5.1% and 1.5–3.0%, respectively, on the two different analyzers for pooled serum samples. The recovery rates were in the range of 95.4–102.2%. The limit of blank was 0.0049 μg/mL, and the limit of detection of the samples was 0.0266–0.0376 μg/mL. The percent recoveries at the three LoQ levels were 67.9–74.6% for 0.50 μg/mL, 75.5–80.2% for 0.60 μg/mL, and 89.9–96.6% for 0.70 μg/mL. A linear relationship was demonstrated between 0.5 μg/mL and 16.0 μg/mL (R2=0.9995–0.9998. The assay correlated well with LC-MS/MS results (R2=0.9739–0.9828. Conclusions: The assay showed acceptable precision, trueness, linearity, and limit of quantification, and correlated well with LC-MS/MS. Therefore, its analytical performance is satisfactory for monitoring the drug concentration of voriconazole. Keywords: Voriconazole, Antifungal agents, Therapeutic drug monitoring
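The imprecision figures quoted above are coefficients of variation (CV), i.e. the standard deviation of replicate measurements expressed as a percentage of their mean. As a minimal worked example (the function name is mine, not part of the CLSI protocols):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%) = 100 * SD / mean, the imprecision
    statistic reported for within-run and between-day replicates."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def recovery_percent(measured, expected):
    """Trueness by recovery: measured result as a % of the spiked value."""
    return 100.0 * measured / expected
```

For replicate results of 9, 10 and 11 μg/mL, `cv_percent` gives 10.0%, which would sit well outside the 3.2-5.1% within-run range the study reports.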

  10. A New Tool for Automated Data Collection and Complete On-site Flux Data Processing for Eddy Covariance Measurements

    Science.gov (United States)

    Begashaw, I. G.; Kathilankal, J. C.; Li, J.; Beaty, K.; Ediger, K.; Forgione, A.; Fratini, G.; Johnson, D.; Velgersdyk, M.; Hupp, J. R.; Xu, L.; Burba, G. G.

    2014-12-01

    The eddy covariance method is widely used for direct measurements of turbulent exchange of gases and energy between the surface and atmosphere. In the past, raw data were collected first in the field and then processed back in the laboratory to achieve fully corrected, publication-ready flux results. This post-processing consumed a significant amount of time and resources, and precluded researchers from accessing near real-time final flux results. A new automated measurement system with novel hardware and software designs was developed, tested, and deployed starting in late 2013. The major advancements with this automated flux system include: 1) enabling simultaneous logging of high-frequency three-dimensional wind speeds and multiple gas densities (CO2, H2O and CH4), low-frequency meteorological data, and site metadata through a specially designed file format; 2) conducting fully corrected, real-time, on-site flux computations using conventional as well as user-specified methods, by implementing EddyPro software on a small low-power microprocessor; 3) providing precision clock control and coordinate information for data synchronization and inter-site data comparison by incorporating GPS and the Precision Time Protocol. Along with these innovations, a data management server application was also developed to chart fully corrected real-time fluxes to assist remote system monitoring, to send e-mail alerts, and to automate data QA/QC, transfer and archiving at individual stations or on a network level. The combination of all of these functions was designed to help save a substantial amount of time and cost associated with managing a research site by eliminating post-field data processing, reducing user errors, and facilitating real-time access to fully corrected flux results. The design, functionality, and test results from this new eddy covariance measurement tool will be presented.
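At its core, the eddy covariance method computes a flux as the covariance between vertical wind speed and a scalar (gas density): F = mean(w'c'), with primes denoting deviations from the averaging-block mean. A minimal sketch of that core computation (the full EddyPro processing adds many corrections this example omits):

```python
def eddy_flux(w, c):
    """Turbulent flux as the covariance of vertical wind speed `w` and
    scalar concentration `c` over one averaging block:
    F = mean((w - mean(w)) * (c - mean(c)))."""
    n = len(w)
    wm = sum(w) / n
    cm = sum(c) / n
    return sum((wi - wm) * (ci - cm) for wi, ci in zip(w, c)) / n
```

Real systems apply this to high-frequency (e.g. 10-20 Hz) samples over 30-minute blocks, then apply coordinate rotation, density (WPL) and spectral corrections before reporting a final flux.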

  11. Sociolinguistically Informed Natural Language Processing: Automating Irony Detection

    Science.gov (United States)

    2017-10-23

    ...interaction feature using the entire training dataset, and repeated this process 100 times to account for variation due to the SGD procedure (Table 6). ... arbitrary and variable sizes. We pre-trained our own syntactic embeddings following Levy and Goldberg (2014). We parsed the ukWaC corpus (Baroni et al., 2009) using the Stanford Dependency Parser v3.5.2 with Stanford Dependencies...

  12. Potential of Laboratory Execution Systems (LESs) to Simplify the Application of Business Process Management Systems (BPMSs) in Laboratory Automation.

    Science.gov (United States)

    Neubert, Sebastian; Göde, Bernd; Gu, Xiangyu; Stoll, Norbert; Thurow, Kerstin

    2017-04-01

    Modern business process management (BPM) is increasingly interesting for laboratory automation. End-to-end workflow automation and improved top-level systems integration for information technology (IT) and automation systems are especially prominent objectives. With the ISO Standard Business Process Model and Notation (BPMN) 2.X, a system-independent, interdisciplinarily accepted graphical process control notation is provided that allows process analysis while also being executable. The transfer of BPM solutions to structured laboratory automation places novel demands, for example concerning real-time-critical process and systems integration. The article discusses the potential of laboratory execution systems (LESs) for an easier implementation of business process management systems (BPMSs) in hierarchical laboratory automation. In particular, complex application scenarios, including long process chains based on, for example, several distributed automation islands and mobile laboratory robots for material transport, are difficult to handle in BPMSs. The presented approach deals with the displacement of workflow control tasks into life-science-specialized LESs, the reduction of the numerous different interfaces between BPMSs and subsystems, and the simplification of complex process models. Thus, the integration effort for complex laboratory workflows can be significantly reduced for strictly structured automation solutions. An example application, consisting of a mixture of manual and automated subprocesses, is demonstrated by the presented BPMS-LES approach.

  13. Recent progress in online ultrasonic process monitoring

    Science.gov (United States)

    Wen, Szu-Sheng L.; Chen, Tzu-Fang; Ramos-Franca, Demartonne; Nguyen, Ky T.; Jen, Cheng-Kuei; Ihara, Ikuo; Derdouri, A.; Garcia-Rejon, Andres

    1998-03-01

    On-line ultrasonic monitoring of polymer co-extrusion and gas-assisted injection molding are presented. During the co-extrusion of high-density polyethylene and Santoprene, ultrasonic sensors consisting of piezoelectric transducers and clad ultrasonic buffer rods are used to detect the interface between these two polymers and the stability of the extrusion. The same ultrasonic sensor also measures the surface temperature of the extruded polymer. The results indicate that temperature measurements using ultrasound have a faster response time than those obtained by a conventional thermocouple. In gas-assisted injection molding, the polymer and gas flow front positions are monitored simultaneously. This information may be used to control the plunger movement.
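Ultrasonic temperature measurement of the kind mentioned above typically exploits the temperature dependence of sound speed: from a pulse-echo round-trip time over a known path, the sound speed and hence the temperature can be inferred. The sketch below assumes a hypothetical linear calibration v(T) = v0 + k·(T − T0); real probes need material-specific calibration curves, and nothing here reproduces the authors' sensor.

```python
def temperature_from_tof(t_round_trip, length, v0, k, t0):
    """Infer temperature from an ultrasonic pulse-echo round-trip time
    over a path of known `length`, assuming a linear sound-speed
    calibration v(T) = v0 + k * (T - t0). Illustrative only."""
    v = 2.0 * length / t_round_trip      # pulse travels the path twice
    return t0 + (v - v0) / k
```

Because the time-of-flight responds immediately to the medium along the acoustic path, such measurements can react faster than a thermocouple, whose sensing junction must first thermally equilibrate, which matches the response-time observation in the record.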

  14. Monitoring Industrial Food Processes Using Spectroscopy & Chemometrics

    DEFF Research Database (Denmark)

    Pedersen, Dorthe Kjær; Engelsen, Søren Balling

    2001-01-01

    In the last decade rapid spectroscopic measurements have revolutionized quality control in practically all areas of primary food and feed production. Near-infrared spectroscopy (NIR & NIT) has been implemented for monitoring the quality of millions of samples of cereals, milk and meat with unprec...

  15. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules: Final Subcontract Report, April 1998 - April 2002

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Miller, D. C.; Moore, S. B.; Hogan, S. J.

    2002-11-01

    This report describes the automated systems developed for PV module assembly and testing processes after lamination. These processes are applicable to a broad range of module types, including those made with wafer-based and thin-film solar cells. Survey data and input from module manufacturers gathered during site visits were used to define system capabilities and process specifications. Spire completed mechanical, electrical, and software engineering for four automation systems: a module edge trimming system, the SPI-TRIM 350; an edge sealing and framing system, the SPI-FRAMER 350; an integrated module testing system, the SPI-MODULE QA 350; and a module buffer storage system, the SPI-BUFFER 350. A fifth system for junction-box installation, the SPI-BOXER 350, was nearly completed during the program. A new-size solar simulator, the SPI-SUN SIMULATOR 350i, was designed as part of the SPI-MODULE QA 350. This simulator occupies minimal production floor space, and its test area is large enough to handle most production modules. The automated systems developed in this program are designed for integration to create automated production lines.

  16. A methodology to determine the level of automation to improve the production process and reduce the ergonomics index

    Science.gov (United States)

    Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo

    2017-08-01

    Companies are constantly looking for improvements in productivity to increase their competitiveness. The use of automation technologies is a tool that has been proven effective in achieving this. Some companies are not familiar with the process of acquiring automation technologies; they therefore abstain from investment and miss the opportunity to take advantage of it. The present document proposes a methodology to determine the level of automation appropriate for the production process, in order to improve production and reduce the ergonomics index.

  17. Automation of the Process to Obtain UF4 Powders

    International Nuclear Information System (INIS)

    Fenocchio, A.D

    2001-01-01

    This paper presents a preliminary analysis of the control system to be implemented in the UF4 powder production plant. The work was done in the electronics laboratory and involved configuring the devices (PLC, temperature controllers, etc.) and setting up communications using the proper protocol. Also shown is a study of the logic for the first part of the UF6 conversion process, evaporation; this study is used to define the methodology to follow in a future PLC program

  18. Mobile Monitoring Data Processing and Analysis Strategies

    Science.gov (United States)

    The development of portable, high-time-resolution instruments for measuring the concentrations of a variety of air pollutants has made it possible to collect data while in motion. This strategy, known as mobile monitoring, involves mounting air sensors on a variety of different pla...

  19. Mobile Monitoring Data Processing & Analysis Strategies

    Science.gov (United States)

    The development of portable, high-time-resolution instruments for measuring the concentrations of a variety of air pollutants has made it possible to collect data while in motion. This strategy, known as mobile monitoring, involves mounting air sensors on a variety of different pla...

  20. A method for the automated long-term monitoring of three-spined stickleback Gasterosteus aculeatus shoal dynamics.

    Science.gov (United States)

    Kleinhappel, T K; Al-Zoubi, A; Al-Diri, B; Burman, O; Dickinson, P; John, L; Wilkinson, A; Pike, T W

    2014-04-01

    This paper describes and evaluates a flexible, non-invasive tagging system for the automated identification and long-term monitoring of individual three-spined sticklebacks Gasterosteus aculeatus. The system is based on barcoded tags, which can be reliably and robustly detected and decoded to provide information on an individual's identity and location. Because large numbers of fish can be individually tagged, it can be used to monitor individual- and group-level dynamics within fish shoals. © 2014 The Fisheries Society of the British Isles.

  1. Automated processing of human bone marrow grafts for transplantation.

    Science.gov (United States)

    Zingsem, J; Zeiler, T; Zimmermanm, R; Weisbach, V; Mitschulat, H; Schmid, H; Beyer, J; Siegert, W; Eckstein, R

    1993-01-01

    Prior to purging or cryopreservation, we concentrated 21 bone marrow (BM) harvests using a modification of the 'grancollect-protocol' of the Fresenius AS 104 cell separator with the P1-Y set. Within 40-70 min, the initial marrow volume of 1,265 ml (+/- 537 ml) was processed two to three times. A mean of 47% (+/- 21%) of the initial mononuclear cells was recovered in a mean volume of 128 ml (+/- 36 ml). The recovery of clonogenic cells, measured by CFU-GM assays, was 68% (+/- 47%). Red blood cells in the BM concentrates were reduced to 7% (+/- 4%) of the initial number. The procedure was efficient and yielded a BM cell fraction suitable for purging, cryopreservation and transplantation. At this time, 10 of the 21 patients whose BM was processed using this technique have been transplanted. Seven of these 10 patients have been grafted using the BM alone. Three of the 10 patients showed reduced cell viability and colony growth in the thawed BM samples, and therefore obtained BM and peripheral blood-derived stem cells. All transplanted patients showed an evaluable engraftment, achieving 1,000 granulocytes per microliter of peripheral blood in a mean of 18 days.

  2. Bim Automation: Advanced Modeling Generative Process for Complex Structures

    Science.gov (United States)

    Banfi, F.; Fai, S.; Brumana, R.

    2017-08-01

    The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models needs new scientific knowledge concerning new digital technologies. These elements are helpful to store a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational B-Splines (NURBS) and multiple levels of detail (Mixed and Reverse LoD), based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Masegra Castel in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.

  3. Laser materials processing of complex components. From reverse engineering via automated beam path generation to short process development cycles.

    Science.gov (United States)

    Görgl, R.; Brandstätter, E.

    2016-03-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser welding, laser cladding and additive laser manufacturing are given.

  4. Laser materials processing of complex components: from reverse engineering via automated beam path generation to short process development cycles

    Science.gov (United States)

    Görgl, Richard; Brandstätter, Elmar

    2017-01-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser cladding and laser-based additive manufacturing are given.

  5. Process monitoring of fibre reinforced composites using optical fibre sensors

    Energy Technology Data Exchange (ETDEWEB)

    Fernando, G.F.; Degamber, B.

    2006-04-15

    The deployment of optical fibre based sensor systems for process monitoring of advanced fibre reinforced organic matrix composites is reviewed. The focus is on thermosetting resins and the various optical and spectroscopy-based techniques that can be used to monitor the processing of these materials. Following brief consideration of the manufacturing methods commonly used in the production of thermoset based composites, a discussion is presented on sensor systems that can be used to facilitate real-time chemical process monitoring. Although the focus is on thermosets, the techniques described can be adapted for chemical monitoring of organic species in general. (author)

  6. AUTOMATED SYSTEM OF DATA PROCESSING WITH THE IMPLEMENTATION OF RATING TECHNOLOGY OF TEACHING

    Directory of Open Access Journals (Sweden)

    О. И. Дзювина

    2014-01-01

    Full Text Available Rating technology of teaching enables independent and individual work of students and increases their motivation. Purpose: to increase the efficiency of data processing with the implementation of rating technology of teaching. Method: analysis, synthesis, experiment. Results: an automated data processing system was developed for the implementation of rating technology of teaching. Practical implication: education.

  7. [Automated processing of data from the 1985 population and housing census].

    Science.gov (United States)

    Cholakov, S

    1987-01-01

    The author describes the method of automated data processing used in the 1985 census of Bulgaria. He notes that the computerization of the census involves decentralization and the use of regional computing centers as well as data processing at the Central Statistical Office's National Information Computer Center. Special attention is given to problems concerning the projection and programming of census data. (SUMMARY IN ENG AND RUS)

  8. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system relating to the fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120,000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system performs automatically the isolation of any individual image, the area and weighted area of which are computed. These results are directly displayed on the command panel and can be transferred to a mini-computer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping. [fr]

  9. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meier, David E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coble, Jamie B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jordan, David V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mcdonald, Luther W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Forrester, Joel B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schwantes, Jon M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Unlu, Kenan [Pennsylvania State Univ., University Park, PA (United States); Landsberger, Sheldon [Univ. of Texas, Austin, TX (United States); Bender, Sarah [Pennsylvania State Univ., University Park, PA (United States); Dayman, Kenneth J. [Univ. of Texas, Austin, TX (United States); Reilly, Dallas D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-01

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
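The pattern-matching idea described above can be illustrated with a minimal sketch. This is not PNNL's actual algorithm: the "normal" reference pattern, the sum-of-squares statistic, and the threshold below are all invented for demonstration; a real MIP-style monitor would use proper multivariate methods (e.g., PCA) on full gamma spectra.

```python
# Hedged sketch: flag off-normal process conditions by comparing a measured
# gamma spectrum against the channel-wise mean of "normal" training spectra.

def mean_spectrum(training):
    """Channel-wise mean of a list of equal-length spectra."""
    n = len(training)
    return [sum(s[i] for s in training) / n for i in range(len(training[0]))]

def q_statistic(spectrum, reference):
    """Sum of squared channel residuals against the reference pattern."""
    return sum((a - b) ** 2 for a, b in zip(spectrum, reference))

def is_off_normal(spectrum, reference, threshold):
    # Threshold is hypothetical; in practice it would be calibrated from
    # the spread of the training spectra.
    return q_statistic(spectrum, reference) > threshold

normal = [[10, 50, 30], [12, 48, 32], [11, 52, 28]]   # toy 3-channel spectra
ref = mean_spectrum(normal)                            # [11.0, 50.0, 30.0]
print(is_off_normal([11, 50, 30], ref, threshold=25.0))  # in-family spectrum
print(is_off_normal([40, 20, 10], ref, threshold=25.0))  # anomalous spectrum
```

Because only the statistic and a pass/fail flag need to leave the instrument, a scheme like this also illustrates the "information barrier" property mentioned in the abstract: the raw spectrum itself never has to be disclosed.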

  10. Automated Hardware and Software System for Monitoring the Earth’s Magnetic Environment

    Directory of Open Access Journals (Sweden)

    Alexei Gvishiani

    2016-12-01

    Full Text Available The continuous growth of geophysical observations requires adequate methods for their processing and analysis. This becomes one of the most important and widely discussed issues in the data science community. The system analysis methods and data mining techniques are able to sustain the solution of this problem. This paper presents an innovative holistic hardware/software system (HSS developed for efficient management and intellectual analysis of geomagnetic data, registered by Russian geomagnetic observatories and international satellites. Geomagnetic observatories that comprise the International Real-time Magnetic Observatory Network (INTERMAGNET produce preliminary (raw and definitive (corrected geomagnetic data of the highest quality. The designed system automates and accelerates routine production of definitive data from the preliminary magnetograms, obtained by Russian observatories, due to implemented algorithms that involve artificial intelligence elements. The HSS is the first system that provides sophisticated automatic detection and multi-criteria classification of extreme geomagnetic conditions, which may be hazardous for technological infrastructure and economic activity in Russia. It enables the online access to digital geomagnetic data, its processing results and modelling calculations along with their visualization on conventional and spherical screens. The concept of the presented system agrees with the accepted ‘four Vs’ paradigm of Big Data. The HSS can increase significantly the ‘velocity’ and ‘veracity’ features of the INTERMAGNET system. It also provides fusion of large sets of ground-based and satellite geomagnetic data, thus facilitating the ‘volume’ and ‘variety’ of handled data.

  11. Acoustic Emission Based In-process Monitoring in Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas; Bissacco, Giuliano; De Chiffre, Leonardo

    The applicability of acoustic emission (AE) measurements for in-process monitoring in the Robot Assisted Polishing (RAP) process was investigated. Surface roughness measurements require interruption of the process, proper surface cleaning and measurements that sometimes necessitate removal of the part from the machine tool. In this study, development of surface roughness during polishing rotational symmetric surfaces by the RAP process was inferred from AE measurements. An AE sensor was placed on a polishing tool, and a cylindrical rod of Vanadis 4E steel having an initial turned surface ... improving the efficiency of the process. It also allows for intelligent process control and generally enhances the robustness and reliability of the automated RAP system in industrial applications.

  12. Manual of process automation. On-line control systems for devices in the process technology. 3. tot. rev. and enl. ed.; Handbuch der Prozessautomatisierung. Prozessleittechnik fuer verfahrenstechnische Anlagen

    Energy Technology Data Exchange (ETDEWEB)

    Maier, U.; Frueh, K.F. (eds.)

    2004-07-01

    This is a reference manual for engineers who need answers to automation problems in chemical engineering. Some new, current subjects have been introduced to complement the information. The following chapters are new or have been rewritten by new authors: Internet and intranet technologies; Outline of process-related functions; Industrial control: problems and solutions; Model-based predictive control (MPC); Report archive analysis; Control Loop Performance Monitoring (CPM); Automation structures; Explosion protection; Remote I/O; Integration of intelligent field devices in process control systems (PLS); Weighing and filling techniques; Plant safety; Maintenance - structures and strategies. The other chapters have been revised and updated as well. (orig.)

  13. Smart membranes for monitoring membrane based desalination processes

    KAUST Repository

    Laleg-Kirati, Taous-Meriem; Karam, Ayman M.

    2017-01-01

    Various examples are related to smart membranes for monitoring membrane-based processes such as, e.g., membrane distillation processes. In one example, a membrane includes a porous surface and a plurality of sensors (e.g., temperature, flow and

  14. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure to obtain knowledge about the parameters of other not monitored but important response processes when the structure is subject to some Gaussian load field in space and time. The considered example is a ship sailing with a given speed through a Gaussian wave field.

  15. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing.

    Science.gov (United States)

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-02-15

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economic and enhanced automated optical guidance system, based on optimization research of light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED light target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, direction location algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system is applied in a project of hot water pipeline installation, with accuracy controlled within 2 mm in 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system.
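The deflection-detection step described above can be sketched in a few lines. This is an illustration only, not the paper's MATLAB implementation: the threshold, the toy frame, and the centroid-based deviation measure are assumptions. The idea is to threshold the bright LED target in a grayscale frame, take its intensity-weighted centroid, and report the lateral offset from the optical axis.

```python
# Hedged sketch of LED-target localization by intensity-weighted centroid.

def led_centroid(frame, threshold=200):
    """frame: 2-D list of pixel intensities (0-255). Returns (cx, cy) or None."""
    total = sx = sy = 0.0
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v >= threshold:
                total += v
                sx += x * v
                sy += y * v
    if total == 0:
        return None  # no target visible above threshold
    return (sx / total, sy / total)

frame = [
    [0,   0,   0,   0, 0],
    [0,   0, 255,   0, 0],
    [0, 255, 255, 255, 0],
    [0,   0, 255,   0, 0],
    [0,   0,   0,   0, 0],
]
cx, cy = led_centroid(frame)
deviation_px = cx - (len(frame[0]) - 1) / 2.0  # lateral offset from image center
print(cx, cy, deviation_px)
```

Converting the pixel offset into a bore-path deviation in millimetres would require the camera calibration (pixels per mm at the target distance), which the abstract does not give.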

  16. Automated vehicle counting using image processing and machine learning

    Science.gov (United States)

    Meany, Sean; Eskew, Edward; Martinez-Castro, Rosana; Jang, Shinae

    2017-04-01

    Vehicle counting is used by the government to improve roadways and the flow of traffic, and by private businesses for purposes such as determining the value of locating a new store in an area. A vehicle count can be performed manually or automatically. Manual counting requires an individual to be on-site and tally the traffic electronically or by hand. However, this can lead to miscounts due to factors such as human error. A common form of automatic counting involves pneumatic tubes, but pneumatic tubes disrupt traffic during installation and removal, and can be damaged by passing vehicles. Vehicle counting can also be performed via the use of a camera at the count site recording video of the traffic, with counting being performed manually post-recording or using automatic algorithms. This paper presents a low-cost procedure to perform automatic vehicle counting using remote video cameras with an automatic counting algorithm. The procedure would utilize a Raspberry Pi micro-computer to detect when a car is in a lane, and generate an accurate count of vehicle movements. The method utilized in this paper would use background subtraction to process the images and a machine learning algorithm to provide the count. This method avoids fatigue issues that are encountered in manual video counting and prevents the disruption of roadways that occurs when installing pneumatic tubes.
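The background-subtraction counting idea reduces, at its core, to detecting transitions from "lane empty" to "lane occupied". The sketch below is an assumed simplification (not the authors' code): the per-frame lane signal, background level, and threshold are invented, and a real system would operate on images rather than a 1-D intensity trace.

```python
# Hedged sketch: count vehicles as rising edges of a background-subtracted
# lane-occupancy signal.

def count_vehicles(signal, background, threshold):
    """Count transitions from 'empty' to 'occupied' in a per-frame lane signal."""
    count = 0
    occupied = False
    for sample in signal:
        foreground = abs(sample - background) > threshold
        if foreground and not occupied:
            count += 1  # a vehicle has just entered the detection zone
        occupied = foreground
    return count

lane = [10, 11, 90, 95, 12, 10, 85, 88, 86, 11]  # two passing vehicles
print(count_vehicles(lane, background=10, threshold=40))  # → 2
```

The same edge-counting logic applies unchanged if `signal` is replaced by, e.g., the count of foreground pixels per frame from an image-based background subtractor.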

  17. UNICOS CPC6: automated code generation for process control applications

    International Nuclear Information System (INIS)

    Fernandez Adiego, B.; Blanco Vinuela, E.; Prieto Barreiro, I.

    2012-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS). As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) to develop CPC applications. The CPC component is composed of several platform oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both, the library of device types and the generated file syntax, are defined. The UAB core is the generic part of this software, it discovers and calls dynamically the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both, Siemens and Schneider PLCs, the SCADA generator (based on PVSS) and the CPC wizard as a dedicated plug-in created to provide the user a friendly GUI (Graphical User Interface). A tool called UAB Bootstrap will manage the different UAB components, like CPC, and its dependencies with the resource packages. This tool guides the control system developer during the installation, update and execution of the UAB components. (authors)

  18. A method to automate the radiological survey process

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.

    1987-01-01

    This document describes the USRAD system, a hardware/software ranging and data transmission system, that provides real-time position data and combines it with other portable instrument measurements. Live display of position data and onsite data reduction, presentation, and formatting for reports and automatic transfer into databases are among the unusual attributes of USRADS. Approximately 25% of any survey-to-survey report process is dedicated to data recording and formatting, which is eliminated by USRADS. Cost savings are realized by the elimination of manual transcription of instrument readout in the field and clerical formatting of data in the office. Increased data reliability is realized by ensuring complete survey coverage of an area in the field, by elimination of mathematical errors in conversion of instrument readout to unit concentration, and by elimination of errors associated with transcribing data from the field into report format. The USRAD system can be adapted to measure other types of pollutants or physical/chemical/geological/biological conditions in which portable instrumentation exists. 2 refs., 2 figs
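The core data-fusion step USRADS automates, pairing live position data with portable-instrument readouts, can be sketched as follows. The data layout and function name here are invented for illustration; the actual USRADS hardware and formats are not described in enough detail in the abstract to reproduce.

```python
# Hedged sketch: attach to each radiation reading the position sample
# closest in time, eliminating manual transcription of instrument readout.

def tag_readings_with_position(readings, positions):
    """readings: [(t, counts)]; positions: [(t, x, y)]. Returns (t, x, y, counts)."""
    tagged = []
    for t_r, counts in readings:
        # nearest-in-time position fix for this reading
        t_p, x, y = min(positions, key=lambda p: abs(p[0] - t_r))
        tagged.append((t_r, x, y, counts))
    return tagged

positions = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 2.0, 0.0)]  # t, x, y (m)
readings = [(0.1, 120), (1.9, 300)]                              # t, counts
print(tag_readings_with_position(readings, positions))
```

Position-tagged readings like these can then be loaded directly into a database or report template, which is where the abstract's claimed 25% time saving comes from.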

  19. Automated processing of massive audio/video content using FFmpeg

    Directory of Open Access Journals (Sweden)

    Kia Siang Hock

    2014-01-01

    Full Text Available Audio and video content forms an integral, important and expanding part of the digital collections in libraries and archives world-wide. While these memory institutions are familiar and well-versed in the management of more conventional materials such as books, periodicals, ephemera and images, the handling of audio (e.g., oral history recordings) and video content (e.g., audio-visual recordings, broadcast content) requires additional toolkits. In particular, a robust and comprehensive tool that provides a programmable interface is indispensable when dealing with tens of thousands of hours of audio and video content. FFmpeg is comprehensive and well-established open source software that is capable of the full range of audio/video processing tasks (such as encode, decode, transcode, mux, demux, stream and filter. It is also capable of handling a wide range of audio and video formats, a unique challenge in memory institutions. It comes with a command line interface, as well as a set of developer libraries that can be incorporated into applications.

  20. UNICOS CPC6: Automated Code Generation for Process Control Applications

    CERN Document Server

    Fernandez Adiego, B; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both, the library of device types and the generated file syntax, are defined. The UAB core is the generic part of this software, it discovers and calls dynamically the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both, Siemens and Schneider PLCs, the SCADA g...

  1. Diagnostic monitor for carbon fiber processing

    Science.gov (United States)

    Paulauskas, Felix L.; Bigelow, Timothy S.; Meek, Thomas T.

    2002-01-01

    A method for monitoring characteristics of materials includes placing a material in an application zone, measuring a change in at least one property value of the application zone caused by placing the material in the application zone, and relating changes in the property value of the application zone caused by the material to at least one characteristic of the material. An apparatus for monitoring characteristics of a material includes a measuring device for measuring a property value resulting from applying a frequency signal to the application zone after placing a material in the application zone and a processor for relating changes in the property value caused by placement of the material in the application zone to at least one desired characteristic of the material. The application zone is preferably a resonant cavity.

  2. Improved automated lumen contour detection by novel multifrequency processing algorithm with current intravascular ultrasound system.

    Science.gov (United States)

    Kume, Teruyoshi; Kim, Byeong-Keuk; Waseda, Katsuhisa; Sathyanarayana, Shashidhar; Li, Wenguang; Teo, Tat-Jin; Yock, Paul G; Fitzgerald, Peter J; Honda, Yasuhiro

    2013-02-01

    The aim of this study was to evaluate a new fully automated lumen border tracing system based on a novel multifrequency processing algorithm. We developed the multifrequency processing method to enhance arterial lumen detection by exploiting the differential scattering characteristics of blood and arterial tissue. The implementation of the method can be integrated into current intravascular ultrasound (IVUS) hardware. This study was performed in vivo with conventional 40-MHz IVUS catheters (Atlantis SR Pro™, Boston Scientific Corp, Natick, MA) in 43 clinical patients with coronary artery disease. A total of 522 frames were randomly selected, and lumen areas were measured after automatically tracing lumen borders with the new tracing system and a commercially available tracing system (TraceAssist™) referred to as the "conventional tracing system." The data assessed by the two automated systems were compared with the results of manual tracings by experienced IVUS analysts. New automated lumen measurements showed better agreement with manual lumen area tracings compared with those of the conventional tracing system (correlation coefficient: 0.819 vs. 0.509). When compared against manual tracings, the new algorithm also demonstrated improved systematic error (mean difference: 0.13 vs. -1.02 mm(2) ) and random variability (standard deviation of difference: 2.21 vs. 4.02 mm(2) ) compared with the conventional tracing system. This preliminary study showed that the novel fully automated tracing system based on the multifrequency processing algorithm can provide more accurate lumen border detection than current automated tracing systems and thus, offer a more reliable quantitative evaluation of lumen geometry. Copyright © 2011 Wiley Periodicals, Inc.

  3. Performance evaluation of enzyme immunoassay for voriconazole therapeutic drug monitoring with automated clinical chemistry analyzers.

    Science.gov (United States)

    Jeon, Yongbum; Han, Minje; Han, Eun Young; Lee, Kyunghoon; Song, Junghan; Song, Sang Hoon

    2017-08-01

    Voriconazole is a triazole antifungal developed for the treatment of fungal infectious disease, and the clinical utility of its therapeutic drug monitoring has been evaluated. Recently, a new assay for analyzing the serum voriconazole concentration with an automated clinical chemistry analyzer was developed. We evaluated the performance of the new assay based on standardized protocols. The analytical performance of the assay was evaluated according to its precision, trueness by recovery, limit of quantitation, linearity, and correlation with results from liquid chromatography-tandem mass spectrometry (LC-MS/MS). The evaluation was performed with the same protocol on two different routine chemistry analyzers. All evaluations were performed according to CLSI Guidelines EP15, EP17, EP6, and EP9 [1-4]. Coefficients of variation for within-run and between-day imprecision were 3.2-5.1% and 1.5-3.0%, respectively, on the two different analyzers for pooled serum samples. The recovery rates were in the range of 95.4-102.2%. The limit of blank was 0.0049 μg/mL, and the limit of detection of the samples was 0.0266-0.0376 μg/mL. The percent recoveries at three LoQ levels were 67.9-74.6% for 0.50 μg/mL, 75.5-80.2% for 0.60 μg/mL, and 89.9-96.6% for 0.70 μg/mL. A linear relationship was demonstrated between 0.5 μg/mL and 16.0 μg/mL (R²=0.9995-0.9998). The assay correlated well with LC-MS/MS results (R²=0.9739-0.9828). The assay showed acceptable precision, trueness, linearity, and limit of quantification, and correlated well with LC-MS/MS. Therefore, its analytical performance is satisfactory for monitoring the drug concentration of voriconazole.
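The imprecision figures quoted above are coefficients of variation over replicate measurements, the standard summary in CLSI EP15-style evaluations. A minimal sketch of that computation follows; the replicate concentrations below are invented, not data from the paper.

```python
# Hedged sketch of the within-run precision summary (CV%) used in
# EP15-style method evaluations.
import statistics

def cv_percent(replicates):
    """Coefficient of variation (%) of replicate measurements."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

within_run = [2.05, 2.10, 1.98, 2.02, 2.07]  # voriconazole, ug/mL (invented)
print(round(cv_percent(within_run), 1))
```

A full EP15 protocol additionally partitions variance into within-run and between-day components over five days of runs; the single-run CV here is only the first ingredient.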

  4. Automation of NLO processes and decays and POWHEG matching in WHIZARD

    International Nuclear Information System (INIS)

    Reuter, Juergen; Chokoufe, Bijan; Stahlhofen, Maximilian

    2016-03-01

    We give a status report on the automation of next-to-leading order processes within the Monte Carlo event generator WHIZARD, using GoSam and OpenLoops as providers for one-loop matrix elements. To deal with divergences, WHIZARD uses automated FKS subtraction, and the phase space for singular regions is generated automatically. NLO examples for both scattering and decay processes with a focus on e+e- processes are shown. Also, first NLO studies of observables for collisions of polarized lepton beams, e.g. at the ILC, are presented. Furthermore, the automatic matching of the fixed-order NLO amplitudes with emissions from the parton shower within the POWHEG formalism inside WHIZARD is discussed. We also present results for top pairs at threshold in lepton collisions, including matching between a resummed threshold calculation and fixed-order NLO. This allows the investigation of more exclusive differential observables.

  5. Building Automation Systems.

    Science.gov (United States)

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  6. Processing Approaches for DAS-Enabled Continuous Seismic Monitoring

    Science.gov (United States)

    Dou, S.; Wood, T.; Freifeld, B. M.; Robertson, M.; McDonald, S.; Pevzner, R.; Lindsey, N.; Gelvin, A.; Saari, S.; Morales, A.; Ekblaw, I.; Wagner, A. M.; Ulrich, C.; Daley, T. M.; Ajo Franklin, J. B.

    2017-12-01

    Distributed Acoustic Sensing (DAS) is creating a "field as laboratory" capability for seismic monitoring of subsurface changes. By providing unprecedented spatial and temporal sampling at a relatively low cost, DAS enables field-scale seismic monitoring to have durations and temporal resolutions that are comparable to those of laboratory experiments. Here we report on seismic processing approaches developed during data analyses of three case studies, all using DAS-enabled seismic monitoring with applications ranging from shallow permafrost to deep reservoirs: (1) 10-hour downhole monitoring of cement curing at Otway, Australia; (2) 2-month surface monitoring of controlled permafrost thaw at Fairbanks, Alaska; (3) multi-month downhole and surface monitoring of carbon sequestration at Decatur, Illinois. We emphasize the data management and processing components relevant to DAS-based seismic monitoring, which include scalable approaches to data management, pre-processing, denoising, filtering, and wavefield decomposition. DAS has dramatically increased the data volume to the extent that terabyte-per-day data loads are now typical, straining conventional approaches to data storage and processing. To achieve more efficient use of disk space and network bandwidth, we explore improved file structures and data compression schemes. Because the noise floor of DAS measurements is higher than that of conventional sensors, optimal processing workflows involving advanced denoising, deconvolution (of the source signatures), and stacking approaches are being established to maximize the signal content of DAS data. The resulting workflow of data management and processing could accelerate the broader adoption of DAS for continuous monitoring of critical processes.
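The stacking step mentioned above exploits a generic property, not specific to the authors' workflow: averaging N repeated traces suppresses incoherent noise by roughly a factor of sqrt(N) while preserving the coherent signal. A small synthetic demonstration (invented signal and noise level):

```python
# Hedged sketch: stacking repeated noisy DAS traces to recover a coherent signal.
import random

def stack(traces):
    """Sample-wise mean of equal-length traces."""
    n = len(traces)
    return [sum(t[i] for t in traces) / n for i in range(len(traces[0]))]

random.seed(0)
signal = [0.0, 1.0, 0.0, -1.0, 0.0]                    # coherent arrival (toy)
traces = [[s + random.gauss(0, 0.5) for s in signal]   # 200 noisy repeats
          for _ in range(200)]
stacked = stack(traces)
rms_err = (sum((a - b) ** 2 for a, b in zip(stacked, signal)) / 5) ** 0.5
print(rms_err)  # far below the per-trace noise level of 0.5
```

With 200 repeats the residual noise should be near 0.5/sqrt(200) ≈ 0.035, which is why stacking is so effective against the elevated noise floor of DAS compared with conventional geophones.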

  7. Automated data processing architecture for the Gemini Planet Imager Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Maire, Jérôme; Marchis, Franck; Graham, James R.; Macintosh, Bruce; Ammons, S. Mark; Bailey, Vanessa P.; Barman, Travis S.; Bruzzone, Sebastian; Bulger, Joanna; Cotten, Tara; Doyon, René; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Goodsell, Stephen; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn M.; Larkin, James E.; Marley, Mark S.; Metchev, Stanimir; Nielsen, Eric L.; Oppenheimer, Rebecca; Palmer, David W.; Patience, Jennifer; Poyneer, Lisa A.; Pueyo, Laurent; Rajan, Abhijith; Rantakyrö, Fredrik T.; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane J.

    2018-01-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multiyear direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the Data Cruncher, combines multiple data reduction pipelines (DRPs) together to process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our DRPs. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.
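The indexing idea behind the backend database can be sketched with an in-memory SQLite table. The schema, file names, and "reduced" flag below are invented for illustration; GPIES actually uses MySQL with a far richer schema.

```python
# Hedged sketch: index raw files and track which have been reduced, so the
# survey's data products can be queried uniformly.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE files (name TEXT, kind TEXT, reduced INTEGER)")
for name, kind in [("S20180101E0001.fits", "spec"),
                   ("S20180101E0002.fits", "pol")]:
    conn.execute("INSERT INTO files VALUES (?, ?, 0)", (name, kind))

# "Data Cruncher" step: mark files as reduced once a pipeline finishes.
conn.execute("UPDATE files SET reduced = 1 WHERE kind = 'spec'")
pending = conn.execute("SELECT name FROM files WHERE reduced = 0").fetchall()
print(pending)  # only the unreduced polarimetric frame remains
```

A front-end web server or chatbot can then answer "what is still unprocessed?" with a single query, which is the operational benefit the abstract describes.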

  8. Software for an automated processing system for radioisotope information from multichannel radiodiagnostic instruments

    International Nuclear Information System (INIS)

    Zelenin, P.E.; Meier, V.P.

    1985-01-01

The SAORI-01 system for the automated processing of radioisotope information is designed for the collection, processing, and representation of information coming from gamma cameras and multichannel radiodiagnostic instruments (MRI), and is oriented primarily toward the radiodiagnostic laboratories of large multidisciplinary hospitals and scientific research institutes. The functional characteristics of the basic software are discussed; it performs the following functions: collection of information from the MRI; processing and representation of the recorded information; storage of patient files on magnetic media; and writing of special processing programs in the FORTRAN and BASIC high-level languages

  9. Automated work-flow for processing high-resolution direct infusion electrospray ionization mass spectral fingerprints

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    2007-01-01

The use of mass spectrometry (MS) is pivotal in analyses of the metabolome and presents a major challenge for subsequent data processing. While the last few years have brought new high-performance instruments, there has not been a comparable development in data processing. In this paper we discuss an automated data processing pipeline to compare large numbers of fingerprint spectra from direct infusion experiments analyzed by high-resolution MS. We describe some of the intriguing problems that have to be addressed, starting with the conversion and pre-processing of the raw data to the final data...

  10. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    STUBBS, A.M.

    2001-01-01

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  11. Computer-based diagnostic monitoring to enhance the human-machine interface of complex processes

    International Nuclear Information System (INIS)

    Kim, I.S.

    1992-02-01

There is a growing interest in introducing an automated, on-line, diagnostic monitoring function into the human-machine interfaces (HMIs) or control rooms of complex process plants. The design of such a system should be properly integrated with other HMI systems in the control room, such as the alarm system or the Safety Parameter Display System (SPDS). This paper provides a conceptual foundation for the development of a Plant-wide Diagnostic Monitoring System (PDMS), along with functional requirements for the system and other advanced HMI systems. Insights into the design of an efficient and robust PDMS are presented, gained from a critical review of various methodologies developed in the nuclear power industry, the chemical process industry, and the space technology community

  12. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells.

    Directory of Open Access Journals (Sweden)

    Dai Fei Elmer Ker

    Full Text Available Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and
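
The prediction step described above (estimating when confluency will cross a pre-defined threshold so operators can be notified in advance) can be sketched with a simple least-squares extrapolation. This is an illustrative stand-in, not the paper's computer-vision system; the data values and the linear-growth assumption are invented for the example.

```python
def hours_to_threshold(times_h, confluency, threshold=0.85):
    """Fit a least-squares line to recent confluency measurements and
    extrapolate to the time at which the threshold will be crossed."""
    n = len(times_h)
    mt = sum(times_h) / n
    mc = sum(confluency) / n
    slope = sum((t - mt) * (c - mc) for t, c in zip(times_h, confluency)) / \
            sum((t - mt) ** 2 for t in times_h)
    intercept = mc - slope * mt
    return (threshold - intercept) / slope

# Confluency rising 5 percentage points per hour from 0.60 at t = 0:
eta = hours_to_threshold([0, 1, 2, 3], [0.60, 0.65, 0.70, 0.75], 0.85)
```

A system like the one described would re-fit as each new image arrives and trigger the 4-hour-warning notification once `eta` falls below 4.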

  13. Study on Laser Welding Process Monitoring Method

    OpenAIRE

Knag, Heeshin

    2017-01-01

International audience; In this paper, a study of quality monitoring technology for laser welding was conducted. Laser welding and industrial robotic systems were combined into a robot-based laser welding system. The laser used in this study was a 1.6 kW fiber laser, while the robot was an industrial robot (payload: 130 kg). The robot-based laser welding system was equipped with a laser scanner system for remote laser welding. The welding joints of steel plate and steel plat...

  14. Study on Laser Welding Process Monitoring Method

    OpenAIRE

    Heeshin Knag

    2016-01-01

In this paper, a study of quality monitoring technology for laser welding was conducted. Laser welding and industrial robotic systems were combined into a robot-based laser welding system. The laser used in this study was a 1.6 kW fiber laser, while the robot was an industrial robot (payload: 130 kg). The robot-based laser welding system was equipped with a laser scanner system for remote laser welding. The welding joints of steel plate and steel plate coated with zinc were ...

  15. Spectral imaging applications: Remote sensing, environmental monitoring, medicine, military operations, factory automation and manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Gat, N.; Subramanian, S. [Opto-Knowledge Systems, Inc. (United States); Barhen, J. [Oak Ridge National Lab., TN (United States); Toomarian, N. [Jet Propulsion Lab., Pasadena, CA (United States)

    1996-12-31

This paper reviews the activities at OKSI related to imaging spectroscopy, presenting current and future applications of the technology. The authors discuss the development of several systems including hardware, signal processing, data classification algorithms and benchmarking techniques to determine algorithm performance. Signal processing for each application is tailored by incorporating the phenomenology appropriate to the process into the algorithms. Pixel signatures are classified using techniques such as principal component analysis, generalized eigenvalue analysis and novel very fast neural network methods. The major hyperspectral imaging systems developed at OKSI include the Intelligent Missile Seeker (IMS) demonstration project for real-time target/decoy discrimination, and the Thermal InfraRed Imaging Spectrometer (TIRIS) for detection and tracking of toxic plumes and gases. In addition, systems for applications in medical photodiagnosis, manufacturing technology, and for crop monitoring are also under development.

  16. Simplified Automated Image Analysis for Detection and Phenotyping of Mycobacterium tuberculosis on Porous Supports by Monitoring Growing Microcolonies

    Science.gov (United States)

    den Hertog, Alice L.; Visser, Dennis W.; Ingham, Colin J.; Fey, Frank H. A. G.; Klatser, Paul R.; Anthony, Richard M.

    2010-01-01

Background Even with the advent of nucleic acid (NA) amplification technologies the culture of mycobacteria for diagnostic and other applications remains of critical importance. Notably microscopic observed drug susceptibility testing (MODS), as opposed to traditional culture on solid media or automated liquid culture, has shown potential to both speed up and increase the provision of mycobacterial culture in high burden settings. Methods Here we explore the growth of Mycobacterium tuberculosis microcolonies, imaged by automated digital microscopy, cultured on porous aluminium oxide (PAO) supports. Repeated imaging during colony growth greatly simplifies “computer vision” and presumptive identification of microcolonies was achieved here using existing publicly available algorithms. Our system thus allows the growth of individual microcolonies to be monitored and, critically, also allows the media to be changed during the growth phase without disrupting the microcolonies. Transfer of identified microcolonies onto selective media allowed us, within 1-2 bacterial generations, to rapidly detect the drug susceptibility of individual microcolonies, eliminating the need for time-consuming subculturing or the inoculation of multiple parallel cultures. Significance Monitoring the phenotype of individual microcolonies as they grow has immense potential for research, screening, and ultimately M. tuberculosis diagnostic applications. The method described is particularly appealing with respect to speed and automation. PMID:20544033

  17. Simplified automated image analysis for detection and phenotyping of Mycobacterium tuberculosis on porous supports by monitoring growing microcolonies.

    Directory of Open Access Journals (Sweden)

    Alice L den Hertog

Full Text Available BACKGROUND: Even with the advent of nucleic acid (NA amplification technologies the culture of mycobacteria for diagnostic and other applications remains of critical importance. Notably microscopic observed drug susceptibility testing (MODS, as opposed to traditional culture on solid media or automated liquid culture, has shown potential to both speed up and increase the provision of mycobacterial culture in high burden settings. METHODS: Here we explore the growth of Mycobacterium tuberculosis microcolonies, imaged by automated digital microscopy, cultured on porous aluminium oxide (PAO supports. Repeated imaging during colony growth greatly simplifies "computer vision" and presumptive identification of microcolonies was achieved here using existing publicly available algorithms. Our system thus allows the growth of individual microcolonies to be monitored and, critically, also allows the media to be changed during the growth phase without disrupting the microcolonies. Transfer of identified microcolonies onto selective media allowed us, within 1-2 bacterial generations, to rapidly detect the drug susceptibility of individual microcolonies, eliminating the need for time-consuming subculturing or the inoculation of multiple parallel cultures. SIGNIFICANCE: Monitoring the phenotype of individual microcolonies as they grow has immense potential for research, screening, and ultimately M. tuberculosis diagnostic applications. The method described is particularly appealing with respect to speed and automation.

  18. Comprehensive automation and monitoring of MV grids as the key element of improvement of energy supply reliability and continuity

    Directory of Open Access Journals (Sweden)

    Stanisław Kubacki

    2012-03-01

Full Text Available The paper presents the issue of comprehensive automation and monitoring of medium voltage (MV) grids as a key element of the Smart Grid concept. The existing condition of MV grid control and monitoring is discussed, and the concept of a solution which will provide the possibility of remote automatic grid reconfiguration and ensure full grid observability from the dispatching system level is introduced. Automation of MV grid switching is discussed in detail, the aim being to isolate a faulty line section and to supply electricity during the failure to the largest possible number of customers. An example of such automation controls’ operation is also presented. The paper’s second part presents the key role of the quick fault location function and the possibility of the MV grid’s remote reconfiguration for improving power supply reliability (SAIDI and SAIFI indices). It is also shown how increasing the number of MV grid points fitted with faulted-circuit indicators and with switches remotely controlled from the dispatch system may reduce SAIDI and SAIFI indices across ENERGA-OPERATOR SA divisions.
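
The SAIDI and SAIFI reliability indices mentioned above have standard definitions (per-customer interruption duration and frequency, as in IEEE Std 1366), which can be computed as follows. The outage figures below are invented for illustration.

```python
def saidi_saifi(interruptions, total_customers):
    """Compute reliability indices from a list of outage events.
    interruptions: list of (customers_affected, duration_minutes) pairs.
    SAIDI: customer-minutes of interruption per customer served.
    SAIFI: customer interruptions per customer served."""
    saidi = sum(n * d for n, d in interruptions) / total_customers
    saifi = sum(n for n, _ in interruptions) / total_customers
    return saidi, saifi

# Two outages in a service area of 10,000 customers:
saidi, saifi = saidi_saifi([(1000, 90), (500, 30)], 10000)
```

Faster fault location and remote switching shrink both the duration and the number of customers affected per event, which is exactly how the automation described above lowers these indices.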

  19. Processing of the WLCG monitoring data using NoSQL

    Science.gov (United States)

    Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.
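
The kind of large-scale aggregation described above (summarizing millions of job records per site for a dashboard) can be illustrated with a toy map-style aggregation. This is a generic sketch, not Experiment Dashboard code; the record fields are invented for the example.

```python
from collections import defaultdict

def aggregate_jobs(events):
    """events: iterable of dicts like {'site': ..., 'status': ...}.
    Returns per-site counts by job status, the kind of pre-computed
    aggregate a monitoring dashboard would store in a NoSQL database."""
    counts = defaultdict(lambda: defaultdict(int))
    for e in events:
        counts[e["site"]][e["status"]] += 1
    return {site: dict(by_status) for site, by_status in counts.items()}

summary = aggregate_jobs([
    {"site": "SITE-A", "status": "done"},
    {"site": "SITE-A", "status": "failed"},
    {"site": "SITE-B", "status": "done"},
])
```

In a NoSQL deployment this aggregation would run incrementally (e.g. as a map-reduce or streaming job) over sharded storage rather than in one Python process.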

  20. Processing of the WLCG monitoring data using NoSQL

    International Nuclear Information System (INIS)

    Andreeva, J; Beche, A; Karavakis, E; Saiz, P; Tuckett, D; Belov, S; Kadochnikov, I; Schovancova, J; Dzhunov, I

    2014-01-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  1. An automated image processing method for classification of diabetic retinopathy stages from conjunctival microvasculature images

    Science.gov (United States)

    Khansari, Maziyar M.; O'Neill, William; Penn, Richard; Blair, Norman P.; Chau, Felix; Shahidi, Mahnaz

    2017-03-01

The conjunctiva is a densely vascularized tissue of the eye that provides an opportunity for imaging of human microcirculation. In the current study, automated fine structure analysis of conjunctival microvasculature images was performed to discriminate stages of diabetic retinopathy (DR). The study population consisted of one group of nondiabetic control subjects (NC) and 3 groups of diabetic subjects, with no clinical DR (NDR), non-proliferative DR (NPDR), or proliferative DR (PDR). Ordinary least squares regression and Fisher linear discriminant analyses were performed to automatically discriminate images between group pairs of subjects. Human observers who were masked to the grouping of subjects performed image discrimination between group pairs. Over 80% and 70% of images of subjects with clinical and non-clinical DR were correctly discriminated by the automated method, respectively. The discrimination rates of the automated method were higher than human observers. The fine structure analysis of conjunctival microvasculature images provided discrimination of DR stages and can be potentially useful for DR screening and monitoring.
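
The Fisher linear discriminant analysis used above to separate group pairs can be sketched on toy feature vectors. This shows the standard two-class Fisher direction, not the paper's exact implementation; the feature values are invented for the example.

```python
import numpy as np

def fisher_direction(X0, X1):
    """Fisher linear discriminant direction w proportional to
    Sw^{-1}(m1 - m0) for two classes, one sample per row."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    S0 = np.cov(X0, rowvar=False) * (len(X0) - 1)   # within-class scatter
    S1 = np.cov(X1, rowvar=False) * (len(X1) - 1)
    Sw = S0 + S1 + 1e-9 * np.eye(X0.shape[1])        # regularize for stability
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)

# Toy 2-D "vessel feature" groups separated along the first axis:
X0 = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1], [0.1, 0.1]])
X1 = X0 + np.array([3.0, 0.0])
w = fisher_direction(X0, X1)
```

Projecting images' feature vectors onto `w` and thresholding the projection yields the pairwise group discrimination described in the abstract.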

  2. 10 CFR 74.53 - Process monitoring.

    Science.gov (United States)

    2010-01-01

    ... estimated measurement standard deviation greater than five percent that is either input or output material... differences greater than three times the estimated standard deviation of the process difference estimator and...; and (4) SSNM involved in research and development operations that process less than five formula...
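
The rule excerpt above mentions flagging process differences greater than three times the estimated standard deviation of the difference estimator. A minimal sketch of such a three-sigma flagging check (difference values and sigma invented for the example):

```python
def flag_differences(diffs, sigma, k=3.0):
    """Return indices of process differences whose magnitude exceeds k times
    the estimated standard deviation of the difference estimator
    (k = 3 per the rule text above)."""
    return [i for i, d in enumerate(diffs) if abs(d) > k * sigma]

# With sigma = 0.5, the alarm limit is 1.5; only the third value trips it:
alarms = flag_differences([0.2, -0.1, 1.7, 0.4], sigma=0.5)
```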

  3. Monitoring Assertion-Based Business Processes

    NARCIS (Netherlands)

    Aiello, Marco; Lazovik, Alexander

    2006-01-01

    Business processes that span organizational borders describe the interaction between multiple parties working towards a common objective. They also express business rules that govern the behavior of the process and account for expressing changes reflecting new business objectives and new market

  4. An Advanced Pre-Processing Pipeline to Improve Automated Photogrammetric Reconstructions of Architectural Scenes

    Directory of Open Access Journals (Sweden)

    Marco Gaiani

    2016-02-01

    Full Text Available Automated image-based 3D reconstruction methods are more and more flooding our 3D modeling applications. Fully automated solutions give the impression that from a sample of randomly acquired images we can derive quite impressive visual 3D models. Although the level of automation is reaching very high standards, image quality is a fundamental pre-requisite to produce successful and photo-realistic 3D products, in particular when dealing with large datasets of images. This article presents an efficient pipeline based on color enhancement, image denoising, color-to-gray conversion and image content enrichment. The pipeline stems from an analysis of various state-of-the-art algorithms and aims to adjust the most promising methods, giving solutions to typical failure causes. The assessment evaluation proves how an effective image pre-processing, which considers the entire image dataset, can improve the automated orientation procedure and dense 3D point cloud reconstruction, even in the case of poor texture scenarios.

  5. The R package EchoviewR for automated processing of active acoustic data using Echoview

    Directory of Open Access Journals (Sweden)

    Lisa-Marie Katarina Harrison

    2015-02-01

Full Text Available Acoustic data is time-consuming to process due to its large size and the need to undertake some processing steps manually. Manual processing may introduce subjective, irreproducible decisions into the data processing work flow, reducing consistency in processing between surveys. We introduce the R package EchoviewR as an interface between R and Echoview, a commercially available acoustic processing software package. EchoviewR allows Echoview to be automated through scripting, which can drastically reduce the manual work required when processing acoustic surveys. This package plays an important role in reducing subjectivity in acoustic data processing by allowing exactly the same process to be applied automatically to multiple surveys and by documenting where subjective decisions have been made. Using data from a survey of Antarctic krill, we provide two examples of using EchoviewR: krill estimation and swarm detection.

  6. Evaluation of a Multi-Parameter Sensor for Automated, Continuous Cell Culture Monitoring in Bioreactors

    Science.gov (United States)

    Pappas, D.; Jeevarajan, A.; Anderson, M. M.

    2004-01-01

offer automated, continuous monitoring of cell cultures with a temporal resolution of 1 minute, which is not attainable by sampling via handheld blood analyzer (i-STAT). Conclusion: The resulting bias and precision found in these cell culture-based studies is comparable to Paratrend sensor clinical results. Although the large error in pO2 measurements (+/-18 mm Hg) may be acceptable for clinical applications, where Paratrend values are periodically adjusted to a BGA measurement, the O2 sensor in this bundle may not be reliable enough for the single-calibration requirement of sensors used in NASA's bioreactors. The pH and pCO2 sensors in the bundle are reliable and stable over the measurement period, and can be used without recalibration to measure cell cultures in microgravity biotechnology experiments. Future work will test additional Paratrend sensors to provide statistical assessment of sensor performance.

  7. BiomaSoft: data processing system for monitoring and evaluating food and energy production. Part I

    International Nuclear Information System (INIS)

    Quevedo, J. R.; Suárez, J.

    2015-01-01

The integrated food and energy production in Cuba demands the processing of diverse and voluminous information to make local, sectoral and national decisions, and to inform public policies; this requires automated systems that facilitate the monitoring and evaluation (M&E) of integrated food and energy production in Cuban municipalities. The objective of this research was to identify the tools for the design of the data processing system BiomaSoft and to contextualize its application environment. The software development methodology was RUP (Rational Unified Process), with UML (Unified Modeling Language) as the modeling language and PHP (Hypertext Pre-Processor) as the programming language. The environment was conceptualized through a domain model, and the functional and non-functional requirements to be fulfilled, as well as the Use Case Diagram of the system with the description of actors, were specified. For the deployment of BiomaSoft, a configuration based on two types of physical nodes (a web server and client computers) was conceived, in the municipalities that participate in the project «Biomass as renewable energy source for Cuban rural areas» (BIOMAS-CUBA). It is concluded that the monitoring and evaluation of integrated food and energy production under Cuban conditions can be made through the automated system BiomaSoft, and that the identification of tools for its design and the contextualization of its application environment contribute to this purpose. (author)

  8. Use of process monitoring data to enhance material accounting

    International Nuclear Information System (INIS)

    Brouns, R.J.; Smith, B.W.

    1980-01-01

    A study was conducted for the Nuclear Regulatory Commission as part of a continuing program to estimate the effectiveness of using process monitoring data to enhance special nuclear material accounting in nuclear facilities. Two licensed fuel fabrication facilities with internal scrap recovery processes were examined. The loss detection sensitivity, timeliness, and localization capabilities of the process monitoring technique were evaluated for single and multiple (trickle) losses. 4 refs
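
For the multiple small ("trickle") losses mentioned above, a cumulative-sum (CUSUM) test on standardized material balances is a standard detection technique; the sketch below is illustrative and not necessarily the method used in the cited study. Parameter values (`k`, `h`) and the balance data are invented for the example.

```python
def cusum_alarm(balances, sigma, k=0.5, h=5.0):
    """One-sided CUSUM on material balances standardized by their
    measurement uncertainty sigma. Returns the index of the first alarm,
    or None. Sensitive to small, repeated ('trickle') losses that a
    single-balance three-sigma test would miss."""
    s = 0.0
    for i, b in enumerate(balances):
        s = max(0.0, s + b / sigma - k)  # accumulate excess over slack k
        if s > h:
            return i
    return None

# A persistent 1-sigma loss per period alarms after 11 periods:
first_alarm = cusum_alarm([1.0] * 20, sigma=1.0)
```

The choice of `k` (slack) and `h` (decision limit) trades detection timeliness against false-alarm rate, which mirrors the sensitivity/timeliness evaluation described in the abstract.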

  9. The automated testing system of programs with the graphic user interface within the context of educational process

    OpenAIRE

    Sychev, O.; Kiryushkin, A.

    2009-01-01

The paper describes the problems of automating the educational process in the course "Programming in a high-level language. Algorithmic languages". The complexities of testing programs with a graphical user interface are noted, and existing analogues are reviewed. Methods for automating the testing of students' assignments are proposed.

  10. Process development for automated solar-cell and module production. Task 4. Automated array assembly. Quarterly report No. 3

    Energy Technology Data Exchange (ETDEWEB)

    Hagerty, J. J.; Gifford, M.

    1981-04-15

    The Automated Lamination Station is mechanically complete and is currently undergoing final wiring. The high current driver and isolator boards have been completed and installed, and the main interface board is under construction. The automated vacuum chamber has had a minor redesign to increase stiffness and improve the cover open/close mechanism. Design of the Final Assembly Station has been completed and construction is underway.

  11. Prompt Gamma Ray Spectroscopy for process monitoring

    International Nuclear Information System (INIS)

    Zoller, W.H.; Holmes, J.L.

    1991-01-01

Prompt Gamma Ray Spectroscopy (PGRS) is a very powerful analytical technique able to measure many metallic contaminant elements. The technique involves measuring the gamma rays emitted by nuclei upon capturing a neutron. The method is sensitive not only to the target element but also to the particular isotope of that element. PGRS is capable of measuring dissolved metal ions in a flowing system. In the field, isotopic neutron sources are used to produce the desired neutron flux (²⁵²Cf can produce a neutron flux on the order of 10⁸ neutrons/cm²·s). Due to the high penetrating power of gamma radiation, high-efficiency gamma-ray detectors can be placed in an appropriate geometry to maximize sensitivity, providing real-time monitoring with low detection level capabilities

  12. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data.

    Science.gov (United States)

    Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R

    2018-01-01

Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and
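
Two of the pipeline stages named above (artifact rejection and re-referencing) can be illustrated in toy form. This is only a shape-of-the-pipeline sketch: HAPPE's actual artifact rejection is far more sophisticated (e.g. wavelet-based), and the amplitude threshold here is invented for the example.

```python
import numpy as np

def reject_and_rereference(eeg, thresh_uv=150.0):
    """Toy versions of two pipeline stages: drop segments whose peak-to-peak
    amplitude exceeds a threshold on any channel, then re-reference the
    surviving segments to the per-sample channel average.
    eeg: array of shape (segments, channels, samples), in microvolts."""
    ptp = eeg.max(axis=2) - eeg.min(axis=2)            # per segment/channel
    keep = (ptp < thresh_uv).all(axis=1)               # reject bad segments
    clean = eeg[keep]
    return clean - clean.mean(axis=1, keepdims=True)   # average reference

# Two segments of 3-channel data; the second has a 500 uV artifact spike:
eeg = np.zeros((2, 3, 10))
eeg[0] += 10.0
eeg[1, 0, 5] = 500.0
out = reject_and_rereference(eeg)
```

After processing, only the clean segment remains and each sample's channel mean is zero, which is what average re-referencing guarantees.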

  13. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE: Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

Full Text Available Electroencephalography (EEG recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact

  14. Automated processing of measuring information and control processes of eutrophication in water for household purpose, based on artificial neural networks

    Directory of Open Access Journals (Sweden)

    О.М. Безвесільна

    2006-04-01

Full Text Available The possibilities of applying information and computer technologies to the automated processing of measurement information on the development of algae (eutrophication) in household reservoirs are considered. The input data for studying eutrophication processes are video images of water samples, which are used to determine the geometric characteristics, number, and biomass of the algae. For processing the measurement information, methods of digital video image processing and the mathematical tools of artificial neural networks are proposed.

  15. Approach to automation of a process of yeast inoculum production on industrial scale for ethanol production

    Directory of Open Access Journals (Sweden)

    Ibeth Viviana Ordóñez-Ortega

    2013-07-01

    Full Text Available This paper presents the results of applied research on automating the reproduction stage of Saccharomyces cerevisiae yeast for ethanol production. The identification of the variables to be instrumented, the instrumentation requirements and the proposed control scheme are based on an analysis of the functioning and operation of the process.
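    A control scheme for an instrumented propagation stage typically closes a loop on a measured variable such as broth temperature. The abstract does not specify the controller, so the following is a generic discrete PI loop against a first-order thermal model; every gain, limit, and name is an assumption for illustration.

    ```python
    def pi_controller(setpoint, kp, ki, dt):
        """Return a stateful discrete PI control step for one measured variable."""
        integral = 0.0
        def step(measured):
            nonlocal integral
            error = setpoint - measured
            integral += error * dt
            return kp * error + ki * integral
        return step

    def tank_model(temp, heat_in, dt, ambient=20.0, loss=0.1, gain=0.5):
        """First-order thermal model: heating warms the broth, losses pull it to ambient."""
        return temp + dt * (gain * heat_in - loss * (temp - ambient))

    ctrl = pi_controller(setpoint=30.0, kp=2.0, ki=0.5, dt=0.1)
    temp = 20.0
    for _ in range(5000):                       # 500 s of simulated time
        u = max(0.0, min(ctrl(temp), 10.0))     # actuator limited to 0..10 units
        temp = tank_model(temp, u, dt=0.1)
    ```

    With the integral term present, the loop removes the steady-state offset that a proportional-only controller would leave against the constant heat loss.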

  16. - GEONET - A Realization of an Automated Data Flow for Data Collecting, Processing, Storing, and Retrieving

    International Nuclear Information System (INIS)

    Friedsam, Horst; Pushor, Robert; Ruland, Robert; SLAC

    2005-01-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XTs.
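    The final step of such a survey data flow, producing adjusted coordinates from redundant measurements, is a least-squares adjustment. GEONET's actual algorithms are not given in the abstract; the following is a generic one-dimensional example (a small leveling network with made-up observations) showing the idea.

    ```python
    import numpy as np

    # Observed height differences dh = h_to - h_from between benchmarks
    # A(0), B(1), C(2); benchmark A is held fixed at height 0.
    # The redundant third observation lets least squares distribute the misclosure.
    obs = [(0, 1, 1.02), (1, 2, 0.49), (0, 2, 1.50)]  # (from, to, measured dh in m)

    A = np.zeros((len(obs), 2))   # unknowns: heights of B and C (A is fixed)
    b = np.zeros(len(obs))
    for row, (i, j, dh) in enumerate(obs):
        if j > 0:
            A[row, j - 1] += 1.0
        if i > 0:
            A[row, i - 1] -= 1.0
        b[row] = dh

    heights, *_ = np.linalg.lstsq(A, b, rcond=None)   # adjusted heights of B and C
    ```

    The 1 cm misclosure (1.02 + 0.49 vs. 1.50) is spread over the observations, yielding adjusted heights of about 1.0167 m and 1.5033 m.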

  17. Database Security for an Integrated Solution to Automate Sales Processes in Banking

    OpenAIRE

    Alexandra Maria Ioana FLOREA

    2013-01-01

    In order to maintain a competitive edge in a very active banking market, companies require the implementation of a web-based solution to standardize, optimize and manage the flow of sales/pre-sales and to generate new leads. This article presents the realization of a development framework for software interoperability in banking financial institutions and an integrated solution for achieving sales process automation in banking. The paper focuses on presenting the requirements for ...

  18. Electromagnetic compatibility of tools and automated process control systems of NPP units

    International Nuclear Information System (INIS)

    Alpeev, A.S.

    1994-01-01

    Problems of electromagnetic compatibility of automated process control subsystems in NPP units are discussed. It is emphasized that, at the request-for-proposal stage for each APC subsystem, special attention should be paid to the electromagnetic situation in the specific room and to the quality requirements for the functions performed by the system. In addition, requirements for electromagnetic compatibility tests at the work stations should be formulated, and mock-ups of the subsystems should be tested

  19. A realization of an automated data flow for data collecting, processing, storing and retrieving

    International Nuclear Information System (INIS)

    Friedsam, H.; Pushor, R.; Ruland, R.

    1986-11-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XTs. 14 refs., 4 figs

  20. Automation of the software production process for multiple cryogenic control applications

    OpenAIRE

    Fluder, Czeslaw; Lefebvre, Victor; Pezzetti, Marco; Plutecki, Przemyslaw; Tovar-González, Antonio; Wolak, Tomasz

    2018-01-01

    The development of process control systems for the cryogenic infrastructure at CERN is based on an automatic software generation approach. The overall complexity of the systems, their frequent evolution, as well as the extensive use of databases, repositories, commercial engineering software and CERN frameworks, have led to further efforts towards improving the existing automation-based software production methodology. A large number of control system upgrades were successfully performed for th...
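    Automatic software generation of the kind described usually expands device definitions held in a database or repository through templates. The CERN tooling itself is not detailed in the abstract; the sketch below uses a hardcoded device list and an invented IEC 61131-style template purely to show the pattern.

    ```python
    # Template-driven generation: device records (standing in for the database)
    # are expanded into controller source text, one block per device.
    TEMPLATE = """FUNCTION_BLOCK {name}
      (* {kind} on channel {channel} *)
      VAR setpoint : REAL := {setpoint}; END_VAR
    END_FUNCTION_BLOCK
    """

    devices = [
        {"name": "TT_101", "kind": "temperature transmitter", "channel": 1, "setpoint": 4.5},
        {"name": "PT_202", "kind": "pressure transmitter", "channel": 2, "setpoint": 1.2},
    ]

    generated = "\n".join(TEMPLATE.format(**d) for d in devices)
    ```

    Regenerating from the single source of device definitions is what makes frequent evolution of the control systems manageable: a change in the database propagates to every generated block.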