WorldWideScience

Sample records for automated process monitoring

  1. Automated point clouds processing for deformation monitoring

    Directory of Open Access Journals (Sweden)

    Ján Erdélyi

    2015-12-01

    Full Text Available The weather conditions and the operation load cause changes in the spatial position and shape of engineering constructions, which affect their static and dynamic function and reliability. Because of these facts, geodetic measurements are an integral part of engineering structure diagnosis. The advantage of terrestrial laser scanning (TLS) over conventional surveying methods is the efficiency of spatial data acquisition. TLS allows contactless determination of the spatial coordinates of points lying on the surface of the measured object. The scan rate of current scanners (up to 1 million points/s) allows a significant reduction of the time necessary for the measurement and, correspondingly, an increase in the quantity of information obtained about the measured object. To increase the accuracy of results, chosen parts of the monitored construction can be approximated by single geometric entities using regression. In this case the position of a measured point is calculated from tens or hundreds of scanned points. This paper presents the possibility of deformation monitoring of engineering structures using the technology of TLS. For automated data processing, an application based on Matlab®, Displacement_TLS, was developed. The operation mode, the basic parts of this application and the calculation of displacements are described.
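
    The record names the Displacement_TLS application but not its algorithms. As a language-neutral illustration of the regression idea it describes (approximating a monitored surface patch by a geometric entity fitted to many scanned points, then measuring movement of the fitted entity between epochs), here is a minimal Python sketch; the least-squares plane fit and the displacement convention are assumptions, not the published method.

      import numpy as np

      def fit_plane(points):
          """Least-squares plane through an Nx3 array of scanned points.

          Returns the centroid and a unit normal (the direction of least
          variance of the cloud, obtained via SVD)."""
          centroid = points.mean(axis=0)
          _, _, vt = np.linalg.svd(points - centroid)
          return centroid, vt[-1]

      def displacement_along_normal(epoch0, epoch1):
          """Signed movement of a surface patch between two scanning epochs,
          measured along the reference-epoch normal (sign follows the
          arbitrary orientation of the fitted normal)."""
          c0, n0 = fit_plane(epoch0)
          c1, _ = fit_plane(epoch1)
          return float(np.dot(c1 - c0, n0))

      # Illustrative use: two synthetic epochs of ~100 points on a plane,
      # the second shifted 5 mm along z.
      rng = np.random.default_rng(1)
      xy = rng.uniform(0.0, 1.0, (100, 2))
      epoch0 = np.column_stack([xy, 0.002 * rng.standard_normal(100)])
      epoch1 = epoch0 + np.array([0.0, 0.0, 0.005])
      print(f"displacement: {displacement_along_normal(epoch0, epoch1):+.4f} m")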

  2. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs.

  3. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jay W. Grate; Timothy A. DeVol

    2006-07-20

    The objectives of our research were to develop the first automated radiochemical process analyzer including sample pretreatment methodology, and to initiate work on new detection approaches, especially using modified diode detectors.

  4. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    The objectives of our research were to develop the first automated radiochemical process analyzer including sample pretreatment methodology, and to initiate work on new detection approaches, especially using modified diode detectors.

  5. Process automation

    International Nuclear Information System (INIS)

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  6. Complex Event Processing Approach To Automated Monitoring Of Particle Accelerator And Its Control System

    OpenAIRE

    Karol Grzegorczyk; Vito Baggiolini; Krzysztof Zieliński

    2014-01-01

    This article presents the design and implementation of a software component for automated monitoring and diagnostic information analysis of a particle accelerator and its control system. The information that is analyzed can be seen as streams of events. A Complex Event Processing (CEP) approach to event processing was selected. The main advantage of this approach is the ability to continuously query data coming from several streams. The presented software component is based on Esper, the most...

  7. Complex Event Processing Approach To Automated Monitoring Of Particle Accelerator And Its Control System

    Directory of Open Access Journals (Sweden)

    Karol Grzegorczyk

    2014-01-01

    Full Text Available This article presents the design and implementation of a software component for automated monitoring and diagnostic information analysis of a particle accelerator and its control system. The information that is analyzed can be seen as streams of events. A Complex Event Processing (CEP) approach to event processing was selected. The main advantage of this approach is the ability to continuously query data coming from several streams. The presented software component is based on Esper, the most popular open-source implementation of CEP. As a test bed, the control system of the accelerator complex located at CERN, the European Organization for Nuclear Research, was chosen. The complex includes the Large Hadron Collider, the world’s most powerful accelerator. The main contribution of this work is to show that the CEP approach can successfully address many of the challenges associated with automated monitoring of the accelerator and its control system that were previously unsolved. Test results, performance analysis, and a proposal for further works are also presented.
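
    Esper queries are written in its EPL dialect on the JVM and the record does not reproduce them. As a language-neutral sketch of the core CEP idea (a standing query that aggregates a sliding time window over an unbounded event stream and fires when a condition holds), the following Python is illustrative only; the event fields, window and threshold are invented.

      from collections import deque
      from dataclasses import dataclass

      @dataclass
      class Event:
          timestamp: float     # seconds
          device: str
          error_count: int

      class SlidingWindowQuery:
          """Continuously evaluates "sum of error_count per device over the
          last `window` seconds exceeds `threshold`" -- the kind of standing
          query a CEP engine such as Esper keeps open over a stream."""

          def __init__(self, window=60.0, threshold=10):
              self.window, self.threshold = window, threshold
              self.events = deque()

          def on_event(self, ev):
              self.events.append(ev)
              # Expire events that fell out of the time window.
              while self.events and ev.timestamp - self.events[0].timestamp > self.window:
                  self.events.popleft()
              totals = {}
              for e in self.events:
                  totals[e.device] = totals.get(e.device, 0) + e.error_count
              return [d for d, n in totals.items() if n > self.threshold]

      query = SlidingWindowQuery(window=60.0, threshold=10)
      for t in range(0, 120, 5):
          alarms = query.on_event(Event(timestamp=float(t), device="PSU-7", error_count=1))
          if alarms:
              print(f"t={t:3d}s alarm raised for {alarms}")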

  8. Fully automated measuring equipment for aqueous boron and its application to online monitoring of industrial process effluents.

    Science.gov (United States)

    Ohyama, Seiichi; Abe, Keiko; Ohsumi, Hitoshi; Kobayashi, Hirokazu; Miyazaki, Naotsugu; Miyadera, Koji; Akasaka, Kin-ichi

    2009-06-01

    Fully automated measuring equipment for aqueous boron (referred to as the online boron monitor) was developed on the basis of a rapid potentiometric determination method using a commercial BF4(-) ion-selective electrode (ISE). The equipment can measure boron compounds with concentrations ranging from a few to several hundred mg/L, and the measurement is completed in less than 20 min without any pretreatment of the sample. In the monitor, a series of operations for the measurement, i.e., sampling and dispensing of the sample, addition of the chemicals, acquisition and processing of potentiometric data, rinsing of the measurement cell, and calibration of the BF4(-) ISE, is automated. To demonstrate the performance, we installed the monitor in full-scale coal-fired power plants and measured the effluent from a flue gas desulfurization unit. The boron concentration in the wastewater varied significantly depending on the type of coal and the load of power generation. An excellent correlation (R2 = 0.987) was obtained in the measurements between the online boron monitor and inductively coupled plasma atomic emission spectrometry, which proved that the developed monitor can serve as a useful tool for managing boron emissions in industrial process effluents. PMID:19569339
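
    The record names the measurement principle (potentiometry with a BF4- ISE) but not the data reduction. A standard way to convert electrode potential to concentration is a Nernstian calibration, linear in log10 of concentration; the sketch below uses invented calibration points, not the authors' data.

      import numpy as np

      # Calibration standards (boron converted to BF4-) and electrode
      # potentials in mV; values are invented for illustration.
      conc_mg_per_L = np.array([1.0, 10.0, 100.0, 500.0])
      potential_mV = np.array([152.0, 95.0, 37.0, -3.0])

      # Nernstian response: E = E0 + S * log10(C).
      S, E0 = np.polyfit(np.log10(conc_mg_per_L), potential_mV, 1)
      print(f"slope {S:.1f} mV/decade (ideal ~ -59 mV/decade for a monovalent anion)")

      def boron_from_potential(E_mV):
          """Invert the calibration to estimate concentration from a reading."""
          return 10 ** ((E_mV - E0) / S)

      print(f"sample at 60 mV -> {boron_from_potential(60.0):.1f} mg/L")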

  9. Automated system for acquisition and image processing for the control and monitoring of boned nopal

    Science.gov (United States)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of a system for acquisition and image processing to control the removal of thorns from the nopal vegetable (Opuntia ficus indica) in an automated machine that uses pulses of an Nd:YAG laser. The areolas, the areas where thorns grow on the bark of the nopal, are located by applying segmentation algorithms to the images obtained by a CCD. Once the position of the areolas is known, coordinates are sent to a motor system that steers the laser to interact with all areolas and remove the thorns of the nopal. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs tasks of acquisition, preprocessing, segmentation, recognition and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.
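
    The segmentation algorithms themselves are not given in the record. A minimal stand-in for locating areolas and emitting the coordinate table (intensity thresholding followed by connected-component labelling) is sketched below with scipy; the threshold and minimum blob size are assumptions.

      import numpy as np
      from scipy import ndimage

      def areola_coordinates(image, threshold=0.6, min_pixels=5):
          """Return (row, col) centroids of bright blobs in a grayscale
          frame: threshold, label connected components, reject tiny
          regions, and tabulate centroids for the galvo coordinate table."""
          mask = image > threshold
          labels, n = ndimage.label(mask)
          sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
          keep = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
          return ndimage.center_of_mass(mask, labels, keep)

      # Synthetic 64x64 frame with two bright "areolas".
      img = np.zeros((64, 64))
      img[10:14, 20:24] = 1.0
      img[40:45, 50:55] = 1.0
      for r, c in areola_coordinates(img):
          print(f"areola at row {r:.1f}, col {c:.1f}")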

  10. Characterization and Application of SuperLig 620 Solid Phase Extraction Resin for Automated Process Monitoring of 90Sr

    International Nuclear Information System (INIS)

    Characterization of SuperLig® 620 solid phase extraction resin was performed in order to develop an automated on-line process monitor for 90Sr. The main focus was on strontium separation from barium, with the goal of developing an automated separation process for 90Sr in high-level wastes. High-level waste contains significant 137Cs activity, of which 137mBa is of great concern as an interference to the quantification of strontium. In addition barium, yttrium and plutonium were studied as potential interferences to strontium uptake and detection. A number of complexants were studied in a series of batch Kd experiments, as SuperLig® 620 was not previously known to elute strontium in typical mineral acids. The optimal separation was found using a 2M nitric acid load solution with a strontium elution step of ∼0.49M ammonium citrate and a barium elution step of ∼1.8M ammonium citrate. 90Sr quantification of Hanford high-level tank waste was performed on a sequential injection analysis microfluidics system coupled to a flow-cell detector. The results of the on-line procedure are compared to standard radiochemical techniques in this paper.
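
    The batch Kd experiments mentioned above follow the standard mass-balance definition of a distribution coefficient; the sketch below applies it with invented activities, not the reported measurements.

      def batch_kd(activity_initial, activity_final, volume_mL, resin_mass_g):
          """Distribution coefficient from a batch contact experiment:

              Kd = [(A0 - Af) / Af] * (V / m)   [mL/g]

          where A0 and Af are solution activities (any consistent units)
          before and after equilibration with the resin."""
          return (activity_initial - activity_final) / activity_final \
              * (volume_mL / resin_mass_g)

      # Illustrative values only: a strontium tracer contacted with resin.
      print(f"Kd = {batch_kd(1000.0, 50.0, 10.0, 0.1):.0f} mL/g")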

  11. Characterization and application of SuperLig® 620 solid phase extraction resin for automated process monitoring of 90Sr

    International Nuclear Information System (INIS)

    Characterization of SuperLig® 620 solid phase extraction resin was performed in order to develop an automated on-line process monitor for 90Sr. The main focus was on strontium separation from barium, with the goal of developing an automated separation process for 90Sr in high-level wastes. High-level waste contains significant 137Cs activity, of which 137mBa is of great concern as an interference to the quantification of strontium. In addition barium, yttrium and plutonium were studied as potential interferences to strontium uptake and detection. A number of complexants were studied in a series of batch Kd experiments, as SuperLig® 620 was not previously known to elute strontium in typical mineral acids. The optimal separation was found using a 2 M nitric acid load solution with a strontium elution step of ∼0.49 M ammonium citrate and a barium elution step of ∼1.8 M ammonium citrate. 90Sr quantification of Hanford high-level tank waste was performed on a sequential injection analysis microfluidics system coupled to a flow-cell detector. The results of the on-line procedure are compared to standard radiochemical techniques in this paper. (author)

  12. Methodology for monitoring and automated diagnosis of ball bearing using paraconsistent logic, wavelet transform and digital signal processing

    International Nuclear Information System (INIS)

    The monitoring and diagnosis area has seen impressive development in recent years with the introduction of new diagnosis techniques as well as with the use of computers in the processing of information and of the diagnosis techniques. The contribution of artificial intelligence to the automation of defect diagnosis is developing continually, and the growing automation in industry is adopting these new techniques. In the nuclear area, the growing concern with safety in the facilities requires more effective techniques, which have been sought to increase the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow the verification of their operational conditions. In this way, the present work can also collaborate in this area, helping in the diagnosis of the operational condition of the machines. This work presents a new technique for feature extraction based on the zero crossings of the wavelet transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work was the Paraconsistent Logic of Annotation with Two Values (LPA2v), contributing to the automation of defect diagnosis, because this logic can deal with the contradictory results that feature extraction techniques can present. This work also concentrated on the identification of defects in their initial phase, using accelerometers, because they are robust, low-cost sensors that can easily be found in industry in general. The results of this work were obtained using an experimental database, and it was observed that the defect diagnoses showed good results for defects in their initial phase. (author)
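
    The record names the feature extractor, zero crossings of the wavelet transform, without detail. One plausible reading, sketched here with PyWavelets, counts sign changes in each detail band of a vibration signal; the wavelet family and decomposition level are assumptions.

      import numpy as np
      import pywt  # PyWavelets

      def zero_crossing_features(signal, wavelet="db4", level=4):
          """Count zero crossings in each wavelet detail band of a
          vibration signal -- one reading of the feature extraction
          named in the record."""
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          # coeffs[0] is the approximation; coeffs[1:] are detail bands.
          return [int(np.sum(np.diff(np.sign(d)) != 0)) for d in coeffs[1:]]

      # Synthetic accelerometer trace: 50 Hz shaft tone plus a weak
      # 600 Hz bearing-defect tone and noise.
      t = np.arange(4096) / 4096.0
      x = (np.sin(2 * np.pi * 50 * t)
           + 0.2 * np.sin(2 * np.pi * 600 * t)
           + 0.05 * np.random.default_rng(0).standard_normal(t.size))
      print(zero_crossing_features(x))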

  13. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    International Nuclear Information System (INIS)

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01 to 0.04 μg L−1. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap, followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01 to 0.04 μg L−1. Mean percent recoveries ranged from 77.1 to 86.5%, with percent relative standard deviation values ranging from 1.2 to 4.6%. Out of more than 5200 samples analyzed, 95% of the measured concentrations were detectable and 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.
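
    The record reports method detection limits without the computation behind them. The conventional replicate-based definition, MDL = t(n-1, 1-alpha) * s (as in the US EPA procedure), is sketched below with invented replicate data.

      import numpy as np
      from scipy import stats

      def method_detection_limit(replicates, alpha=0.01):
          """MDL from replicate measurements of a low-level spike:
          Student's t critical value times the sample standard deviation."""
          reps = np.asarray(replicates, dtype=float)
          t_crit = stats.t.ppf(1 - alpha, df=reps.size - 1)
          return t_crit * reps.std(ddof=1)

      # Seven invented replicate measurements of a 0.05 ug/L spike.
      reps = [0.048, 0.052, 0.047, 0.055, 0.050, 0.046, 0.053]
      print(f"MDL = {method_detection_limit(reps):.3f} ug/L")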

  14. Heavy Oil Process Monitor: Automated On-Column Asphaltene Precipitation and Re-Dissolution

    Energy Technology Data Exchange (ETDEWEB)

    John F. Schabron; Joseph F. Rovani; Mark Sanderson

    2007-03-31

    An automated separation technique was developed that provides a new approach to measuring the distribution profiles of the most polar, or asphaltenic, components of an oil, using a continuous flow system to precipitate and re-dissolve asphaltenes from the oil. Methods of analysis based on this new technique were explored. One method based on the new technique involves precipitation of a portion of residua sample in heptane on a polytetrafluoroethylene (PTFE)-packed column. The precipitated material is re-dissolved in three steps using solvents of increasing polarity: cyclohexane, toluene, and methylene chloride. The amount of asphaltenes that dissolve in cyclohexane is a useful diagnostic of the thermal history of oil, and its proximity to coke formation. For example, about 40% (w/w) of the heptane asphaltenes from unpyrolyzed residua dissolves in cyclohexane. As pyrolysis progresses, this number decreases to below 15% as coke and toluene insoluble pre-coke materials appear. Currently, the procedure for the isolation of heptane asphaltenes and the determination of the amount of asphaltenes soluble in cyclohexane spans three days. The automated procedure takes one hour. Another method uses a single solvent, methylene chloride, to re-dissolve the material that precipitates in heptane on the PTFE-packed column. The area of this second peak can be used to calculate a value which correlates with gravimetric asphaltene content. Currently the gravimetric procedure to determine asphaltenes takes about 24 hours. The automated procedure takes 30 minutes. Results for four series of original and pyrolyzed residua were compared with data from the gravimetric methods. Methods based on the new on-column precipitation and re-dissolution technique provide significantly more detail about the polar constituents of oils than the gravimetric determination of asphaltenes.
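
    The correlation between the second-peak area and gravimetric asphaltene content implies a peak-integration step that the record does not spell out. A simple baseline-corrected trapezoidal integration of a detector trace is sketched below with an invented chromatogram.

      import numpy as np

      def peak_area(time_min, signal, t_start, t_end):
          """Trapezoidal area of a chromatographic peak above a straight
          baseline drawn between the signal values at the integration
          limits -- a simple stand-in for the integration behind the
          asphaltene-content correlation described above."""
          sel = (time_min >= t_start) & (time_min <= t_end)
          t, y = time_min[sel], signal[sel]
          corrected = y - np.interp(t, [t[0], t[-1]], [y[0], y[-1]])
          return float(np.sum((corrected[1:] + corrected[:-1]) / 2 * np.diff(t)))

      # Invented detector trace: a Gaussian "re-dissolution" peak at 22 min.
      t = np.linspace(0.0, 30.0, 600)
      y = 0.02 + 1.5 * np.exp(-((t - 22.0) / 0.8) ** 2)
      print(f"peak area = {peak_area(t, y, 19.0, 25.0):.3f} (signal*min)")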

  15. The value of automated high-frequency nutrient monitoring in inference of biogeochemical processes, temporal variability and trends

    Science.gov (United States)

    Bieroza, Magdalena; Heathwaite, Louise

    2013-04-01

    Stream water quality signals integrate catchment-scale processes responsible for delivery and biogeochemical transformation of the key biotic macronutrients (N, C, P). This spatial and temporal integration is particularly pronounced in groundwater-dominated streams, as in-stream nutrient dynamics are mediated by the processes occurring within riparian and hyporheic ecotones. In this paper we show long-term high-frequency in-stream macronutrient dynamics from a small agricultural catchment located in North West England. Hourly in-situ measurements of total and reactive phosphorus (Systea, IT), nitrate (Hach Lange, DE) and physical water quality parameters (turbidity, specific conductivity, dissolved oxygen, temperature, pH; WaterWatch, UK) were carried out on the lowland, gaining reach of the River Leith. High-frequency data show complex non-linear nutrient concentration-discharge relationships. The dominance of hysteresis effects suggests the presence of a temporally varying apportionment of allochthonous and autochthonous nutrient sources. The varying direction, magnitude and dynamics of the hysteretic responses between storm events are driven by the variation in the contributing source areas and show the importance of the coupling of catchment-scale, in-stream, riparian and hyporheic biogeochemical cycles. The synergistic effect of physical drivers (temperature, diffusion-controlled hyporheic exchange) and biogeochemical drivers (stream and hyporheic metabolism) on in-stream nutrient concentrations manifests itself in observed diurnal patterns. As inferred from the high-frequency nutrient monitoring, the diurnal dynamics are of the greatest importance under baseflow conditions. Understanding the role and relative importance of these processes can be difficult due to spatial and temporal heterogeneity of the key mechanisms involved. This study shows the importance of in-situ, fine temporal resolution, automated monitoring approaches in providing evidence
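
    The hysteresis analysis is described only qualitatively. One common way to quantify storm-event hysteresis compares normalized concentration on the rising and falling limbs at matched discharge (a Lloyd-style index); the sketch below uses synthetic data and is an assumption about the analysis, not the authors' exact method.

      import numpy as np

      def hysteresis_index(discharge, concentration, q_frac=0.5):
          """Normalized concentration difference between the rising and
          falling limb at a chosen fraction of the discharge range.
          HI > 0 indicates a clockwise loop (concentration peaks first)."""
          q = (discharge - discharge.min()) / (discharge.max() - discharge.min())
          c = (concentration - concentration.min()) / \
              (concentration.max() - concentration.min())
          peak = int(np.argmax(discharge))
          c_rise = np.interp(q_frac, q[:peak + 1], c[:peak + 1])
          # The falling limb runs backwards in q; flip it for interpolation.
          c_fall = np.interp(q_frac, q[peak:][::-1], c[peak:][::-1])
          return float(c_rise - c_fall)

      # Synthetic clockwise storm event: concentration leads discharge.
      hours = np.arange(48.0)
      q = np.exp(-((hours - 20.0) / 8.0) ** 2)
      c = np.exp(-((hours - 16.0) / 8.0) ** 2)
      print(f"HI = {hysteresis_index(q, c):+.2f}")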

  16. HEAVY OIL PROCESS MONITOR: AUTOMATED ON-COLUMN ASPHALTENE PRECIPITATION AND RE-DISSOLUTION

    Energy Technology Data Exchange (ETDEWEB)

    John F. Schabron; Joseph F. Rovani Jr; Mark Sanderson

    2006-06-01

    About 37-50% (w/w) of the heptane asphaltenes from unpyrolyzed residua dissolve in cyclohexane. As pyrolysis progresses, this number decreases to below 15% as coke and toluene insoluble pre-coke materials appear. This solubility measurement can be used after coke begins to form, unlike the flocculation titration, which cannot be applied to multi-phase systems. Currently, the procedure for the isolation of heptane asphaltenes and the determination of the amount of asphaltenes soluble in cyclohexane spans three days. A more rapid method to measure asphaltene solubility was explored using a novel on-column asphaltene precipitation and re-dissolution technique. This was automated using high performance liquid chromatography (HPLC) equipment with a step gradient sequence using the solvents: heptane, cyclohexane, toluene:methanol (98:2). Results for four series of original and pyrolyzed residua were compared with data from the gravimetric method. The measurement time was reduced from three days to forty minutes. The separation was expanded further with the use of four solvents: heptane, cyclohexane, toluene, and cyclohexanone or methylene chloride. This provides a fourth peak which represents the most polar components in the oil.

  17. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    Science.gov (United States)

    Jones, Michael K.

    1998-01-01

    Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  18. Automated monitoring of milk meters

    OpenAIRE

    de Mol, M.J.; Andre, G.

    2009-01-01

    Automated monitoring might be an alternative for the periodic checking of electronic milk meters. A computer model based on Dynamic Linear Modelling (DLM) has been developed for this purpose. Two situations are distinguished: multiple milking stands in the milking parlour, and only one milking stand in the milking parlour, e.g. in the case of robotic milking. In the first case the model is based on a comparison, per milking session, of the average per stand with the overall average over all stands. The mode...

  19. National Automated Conformity Inspection Process

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 8100-10 Request...

  20. Automated satellite telemetry processing system

    Science.gov (United States)

    Parunakian, David; Kalegaev, Vladimir; Barinova, Vera

    In this paper we describe the design and important implementation details of the new automated system for processing satellite telemetry developed at Skobeltsyn Institute of Nuclear Physics of Moscow State University (SINP MSU). We discuss the most common tasks and pitfalls for such systems built around a data stream from a single spacecraft or a single instrument, and suggest a solution that allows telemetry processing modules to be developed quickly and integrated with an existing polling mechanism, support infrastructure and data storage in Oracle or MySQL database systems. We also demonstrate the benefits of this approach using modules for processing three different spacecraft data streams: Coronas-Photon (2009-003A), Tatiana-2 (2009-049D) and Meteor-M no.1 (2009-049A). The data format and protocols used by each of these spacecraft have distinct peculiarities, which nevertheless did not pose a problem for integrating their modules into the main system. Remote access via web interface to Oracle databases and sophisticated visualization tools create a possibility of efficient scientific exploitation of satellite data. Such a system is already deployed at the web portal of the Space Monitoring Data Center (SMDC) of SINP MSU (http://smdc.sinp.msu.ru).
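
    The module architecture is not spelled out in the record. A plausible minimal shape (processing modules registered per spacecraft behind a common interface, so a polling loop can dispatch raw frames) is sketched below; the class names, registry and frame format are hypothetical.

      from abc import ABC, abstractmethod

      class TelemetryModule(ABC):
          """Hypothetical per-spacecraft processing module: the polling
          mechanism hands each raw frame to the module registered for
          the spacecraft that produced it."""

          @abstractmethod
          def process(self, frame: bytes) -> dict:
              """Decode one raw telemetry frame into named parameters."""

      REGISTRY = {}

      def register(spacecraft_id, module):
          REGISTRY[spacecraft_id] = module

      class CoronasPhotonModule(TelemetryModule):
          # Invented format: big-endian unsigned 16-bit counter rates.
          def process(self, frame):
              values = [int.from_bytes(frame[i:i + 2], "big")
                        for i in range(0, len(frame), 2)]
              return {"counter_rates": values}

      register("2009-003A", CoronasPhotonModule())
      print(REGISTRY["2009-003A"].process(bytes([0, 17, 1, 2])))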

  1. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    Science.gov (United States)

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity. PMID:8738184
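
    The standard-deviation routine is described only in outline. The widely used moving-block standard deviation of successive NIR spectra, where homogeneity is declared once the statistic settles below a threshold, can be sketched as follows; block size, threshold and the synthetic spectra are assumptions.

      import numpy as np

      def moving_block_std(spectra, block=5):
          """Mean over wavelengths of the standard deviation of `block`
          consecutive spectra; `spectra` is (n_times, n_wavelengths).
          The value flattens toward zero as the blend homogenizes."""
          out = [spectra[i:i + block].std(axis=0, ddof=1).mean()
                 for i in range(len(spectra) - block + 1)]
          return np.array(out)

      # Synthetic run: spectra converge toward a common profile over time.
      rng = np.random.default_rng(3)
      target = rng.uniform(0.2, 0.8, 100)            # "true" blend spectrum
      mixing = np.exp(-np.arange(40) / 8.0)          # decaying inhomogeneity
      spectra = target + mixing[:, None] * 0.05 * rng.standard_normal((40, 100))
      s = moving_block_std(spectra)
      print(f"first block SD {s[0]:.4f} -> last block SD {s[-1]:.4f}")
      print("homogeneous" if s[-1] < 0.005 else "still blending")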

  2. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 (inch) film. This is equivalent to 2200 computer floppy discs. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate fully the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 (inch) film in 6 - 8 seconds can digitize film information for further manipulation and possible automatic interrogations (computer aided interpretation). The system called FDRS (for Film Digital Radiography System) is moving toward 50 micron (approximately 16 lines/mm) resolution. This is believed to meet the need of the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author)

  3. Automating the radiographic ndt process

    International Nuclear Information System (INIS)

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 (inch) film. This is equivalent to 2200 computer floppy discs. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate fully the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 inch film in 6 - 8 seconds can digitize film information for further manipulation and possible automatic interrogations (computer aided interpretation). The system called FDRS (for Film Digital Radiography System) is moving toward 50 micron (16 lines/mm) resolution. This is believed to meet the need of the majority of image content needs. We expect the automated system to appear first in separate parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system

  4. Automated personal dosimetry monitoring system for NPP

    Energy Technology Data Exchange (ETDEWEB)

    Chanyshev, E.; Chechyotkin, N.; Kondratev, A.; Plyshevskaya, D. [Design Bureau 'Promengineering', Moscow (Russian Federation)]

    2006-07-01

    Full text: Radiation safety of personnel at nuclear power plants (NPP) is a priority aim. The degree of radiation exposure of personnel is defined by many factors: NPP design, operation of equipment, organizational management of radiation hazardous works and, certainly, the safety culture of every employee. The Automated Personal Dosimetry Monitoring System (A.P.D.M.S.) is applied at all nuclear power plants in Russia nowadays to eliminate the possibility of occupational radiation exposure beyond the regulated level under different modes of NPP operation. A.P.D.M.S. provides individual radiation dose registration. In the paper the efforts of Design Bureau 'Promengineering' in the construction of the software and hardware complex of A.P.D.M.S. (S.H.W. A.P.D.M.S.) for NPP with PWR are presented. The developed complex is intended to automate the activities of the radiation safety department when carrying out individual dosimetry control. The complex covers all main processes concerning individual monitoring of external and internal radiation exposure as well as dose recording, management, and planning. S.H.W. A.P.D.M.S. is a multi-purpose system whose software was designed on the modular approach. This approach presumes modification and extension of the software using new components (modules) without changes in other components. Such a structure makes the system flexible and allows modifying it in case of implementation of new radiation safety requirements and extending the scope of dosimetry monitoring. That gives the possibility to include with time new kinds of dosimetry control for Russian NPP in compliance with IAEA recommendations, for instance, control of the equivalent dose rate to the skin and the equivalent dose rate to the lens of the eye. S.H.W. A.P.D.M.S. provides dosimetry control as follows: Current monitoring of external radiation exposure: - Gamma radiation dose measurement using radio-photoluminescent personal dosimeters. - Neutron radiation dose measurement using

  5. Automated personal dosimetry monitoring system for NPP

    International Nuclear Information System (INIS)

    Full text: Radiation safety of personnel at nuclear power plants (NPP) is a priority aim. The degree of radiation exposure of personnel is defined by many factors: NPP design, operation of equipment, organizational management of radiation hazardous works and, certainly, the safety culture of every employee. The Automated Personal Dosimetry Monitoring System (A.P.D.M.S.) is applied at all nuclear power plants in Russia nowadays to eliminate the possibility of occupational radiation exposure beyond the regulated level under different modes of NPP operation. A.P.D.M.S. provides individual radiation dose registration. In the paper the efforts of Design Bureau 'Promengineering' in the construction of the software and hardware complex of A.P.D.M.S. (S.H.W. A.P.D.M.S.) for NPP with PWR are presented. The developed complex is intended to automate the activities of the radiation safety department when carrying out individual dosimetry control. The complex covers all main processes concerning individual monitoring of external and internal radiation exposure as well as dose recording, management, and planning. S.H.W. A.P.D.M.S. is a multi-purpose system whose software was designed on the modular approach. This approach presumes modification and extension of the software using new components (modules) without changes in other components. Such a structure makes the system flexible and allows modifying it in case of implementation of new radiation safety requirements and extending the scope of dosimetry monitoring. That gives the possibility to include with time new kinds of dosimetry control for Russian NPP in compliance with IAEA recommendations, for instance, control of the equivalent dose rate to the skin and the equivalent dose rate to the lens of the eye. S.H.W. A.P.D.M.S. provides dosimetry control as follows: Current monitoring of external radiation exposure: - Gamma radiation dose measurement using radio-photoluminescent personal dosimeters. - Neutron radiation dose measurement using thermoluminescent

  6. Automating the conflict resolution process

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here are the current approach to resolving resource conflicts and the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  7. The Automator: Intelligent control system monitoring

    International Nuclear Information System (INIS)

    A large-scale control system may contain several hundred thousand control points which must be monitored to ensure smooth operation. Knowledge of the current state of such a system is often implicit in the values of these points and operators must be cognizant of the state while making decisions. Repetitive operations requiring human intervention lead to fatigue, which can in turn lead to mistakes. The authors propose a tool called the Automator based on a middleware software server. This tool would provide a user-configurable engine for monitoring control points. Based on the status of these control points, a specified action could be taken. The action could range from setting another control point, to triggering an alarm, to running an executable. Often the data presented by a system is meaningless without context information from other channels. Such a tool could be configured to present interpreted information based on values of other channels. Additionally, this tool could translate numerous values in a non-friendly form (such as numbers, bits, or return codes) into meaningful strings of information. Multiple instances of this server could be run, allowing individuals or groups to configure their own Automators. The configuration of the tool will be file-based. In the future, these files could be generated by graphical design tools, allowing for rapid development of new configurations. In addition, the server will be able to explicitly maintain information about the state of the control system. This state information can be used in decision-making processes and shared with other applications. A conceptual framework and software design for the tool are presented
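
    The record describes the engine abstractly. A minimal sketch of a user-configurable monitor (named control points, a predicate per rule, and an action that may set another point or raise an alarm) follows; all rule and channel names are invented.

      from dataclasses import dataclass
      from typing import Callable

      @dataclass
      class Rule:
          name: str
          condition: Callable[[dict], bool]   # predicate over point values
          action: Callable[[dict], None]      # set a point, alarm, ...

      points = {"magnet_temp_C": 41.5, "cooling_on": 0}

      def alarm(msg):
          return lambda pts: print(f"ALARM: {msg} (temp={pts['magnet_temp_C']})")

      def set_point(key, value):
          def do(pts):
              pts[key] = value
              print(f"set {key} = {value}")
          return do

      rules = [
          Rule("overtemp", lambda p: p["magnet_temp_C"] > 40.0, alarm("magnet hot")),
          Rule("autocool",
               lambda p: p["magnet_temp_C"] > 40.0 and not p["cooling_on"],
               set_point("cooling_on", 1)),
      ]

      # One evaluation cycle; a real server would poll continuously.
      for rule in rules:
          if rule.condition(points):
              rule.action(points)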

  8. The Automator: Intelligent Control System Monitoring

    International Nuclear Information System (INIS)

    A large-scale control system may contain several hundred thousand control points which must be monitored to ensure smooth operation. Knowledge of the current state of such a system is often implicit in the values of these points and operators must be cognizant of the state while making decisions. Repetitive operations requiring human intervention lead to fatigue, which can in turn lead to mistakes. The authors propose a tool called the Automator based on a middleware software server. This tool would provide a user-configurable engine for monitoring control points. Based on the status of these control points, a specified action could be taken. The action could range from setting another control point, to triggering an alarm, to running an executable. Often the data presented by a system is meaningless without context information from other channels. Such a tool could be configured to present interpreted information based on values of other channels. Additionally, this tool could translate numerous values in a non-friendly form (such as numbers, bits, or return codes) into meaningful strings of information. Multiple instances of this server could be run, allowing individuals or groups to configure their own Automators. The configuration of the tool will be file-based. In the future, these files could be generated by graphical design tools, allowing for rapid development of new configurations. In addition, the server will be able to explicitly maintain information about the state of the control system. This state information can be used in decision-making processes and shared with other applications. A conceptual framework and software design for the tool are presented

  9. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automation is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 3 x 10^9 bits per 14x17 (inch) film. This is equivalent to 2200 computer floppy disks. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate fully the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14x17 (inch) film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogations (computer aided interpretation). The system called FDRS (for film digital radiography system) is moving toward 50 micron (16 lines/mm) resolution. This is believed to meet the need of the majority of image content needs. (Author). 4 refs.; 21 figs

  10. Monitoring of the physical status of Mars-500 subjects as a model of structuring an automated system in support of the training process in an exploration mission

    Science.gov (United States)

    Fomina, Elena; Savinkina, Alexandra; Kozlovskaya, Inesa; Lysova, Nataliya; Angeli, Tomas; Chernova, Maria; Uskov, Konstantin; Kukoba, Tatyana; Sonkin, Valentin; Ba, Norbert

    Physical training sessions aboard the ISS are performed under permanent control from Earth. Every week the instructors give their recommendations on how to proceed with the training, considering the results of analysis of the daily records of training cosmonauts and the data of the monthly fitness testing. It is obvious that in very long exploration missions this system of monitoring will be inapplicable. For this reason we ventured to develop an automated system to control the physical training process using the current ISS locomotion test parameters as the leading criteria. Simulation of an extended exploration mission in experiment MARS-500 enabled the trial application of the automated system for assessing shifts in cosmonauts’ physical status in response to exercises of varying category and dismissal periods. Methods. Six subjects spent 520 days in the analog of an interplanetary vehicle at IBMP (Moscow). A variety of training regimens and facilities were used to maintain a high level of physical performance of the subjects. The resistance exercises involved expanders, a strength training device (MDS) and a vibrotraining device (Galileo). The cycling exercises were performed on the bicycle ergometer (VB-3) and a treadmill with the motor in or out of motion. To study the effect of prolonged periods of dismissal from training on physical performance, the training flow was interrupted for a month once in the middle and then at the end of isolation. In addition to the in-flight locomotion test integrated into the automated training control system, the physical status of the subjects was attested by analysis of the records of the monthly incremental testing on the bicycle ergometer and MDS. Results. It was demonstrated that the recommended training regimens maintained high physical performance levels despite the limited motor activities in isolation. According to the locomotion testing, the subjects increased velocity significantly and reduced the physiological

  11. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    ... thereby ensure optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output. In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells, and determining the total cell and dead cell concentrations, within a time frame of 10.3 min. The platform consists of custom made stepper motor actuated peristaltic pumps and valves, fluidic interconnections, sample to waste liquid management and image cytometry-based detection. ... high flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for administering air bubbles from the bioreactor in the microfluidic system, so that these ... The total concentration of cells ...
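
    The counting arithmetic behind image cytometry is standard: concentration is counts per imaged volume scaled by the on-chip dilution, and viability follows from the stained subpopulation. The sketch below uses invented numbers; the 10.3 min cycle above covers sampling and staining, not this calculation.

      def cell_concentrations(total_count, dead_count, imaged_volume_uL, dilution):
          """Back-calculate bioreactor concentrations from one cytometry
          frame: counts per imaged volume, scaled by the dilution factor;
          the factor 1e3 converts counts/uL to counts/mL."""
          total = total_count / imaged_volume_uL * dilution * 1e3
          dead = dead_count / imaged_volume_uL * dilution * 1e3
          viability = 100.0 * (1.0 - dead_count / total_count)
          return total, dead, viability

      # Invented frame: 412 cells, 37 stained dead, 0.5 uL imaged, 1:20 dilution.
      total, dead, viab = cell_concentrations(412, 37, 0.5, 20)
      print(f"{total:.2e} cells/mL, {dead:.2e} dead/mL, viability {viab:.1f}%")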

  12. Monitoring system for automation of experimental researches in cutting

    International Nuclear Information System (INIS)

    This study presents the procedures performed when designing and realizing experimental scientific research through the application of an automated measurement system with computer support in all stages of the experiment. A special accent is placed on the integration of the measurement system and the mathematical processing of data from experiments. The automation processes are described through our own automated monitoring system for research of physical phenomena in the cutting process with computer-aided data acquisition. The monitoring system is intended for determining the tangential, axial and radial components of the cutting force, as well as the average temperature in the cutting process. The hardware acquisition part consists of amplifiers and A/D converters, while the analysis and visualization software for PC was developed using MS Visual C++. For the mathematical description of the researched physical phenomena, the CADEX software was made, which in connection with MATLAB is intended for the design, processing and analysis of experimental scientific research according to the theory of planning multi-factorial experiments. The design and construction of the interface and the computerized measurement system were done by the Faculty of Mechanical Engineering in Skopje in collaboration with the Faculty of Electrical Engineering and Information Technologies in Skopje and the Institute of Production Engineering and Automation, Wroclaw University of Technology, Poland. Having one's own scientific research measurement system with full access to its hardware and software provides complete control of the research process and reduces the measurement uncertainty of the results obtained from the performed research.

  13. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long and short term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on the Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross validation results demonstrate the high efficiency of both approaches in comparison to known methods.
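
    A full ANFIS implementation is beyond a short sketch, but the reported evaluation protocol, ten-fold cross-validation of a service-response-time predictor built on monitoring features, can be illustrated with an ordinary regressor standing in for ANFIS; the data and lagged features below are synthetic.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import KFold

      rng = np.random.default_rng(7)

      # Synthetic monitoring series: SRT driven by system load plus noise.
      load = rng.uniform(0.0, 1.0, 500)
      srt = 0.2 + 1.5 * load ** 2 + 0.05 * rng.standard_normal(500)

      # Lagged features: predict SRT(t) from the three preceding load samples.
      X = np.column_stack([load[2:-1], load[1:-2], load[:-3]])
      y = srt[3:]

      errors = []
      for train, test in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
          model = RandomForestRegressor(n_estimators=50, random_state=0)
          model.fit(X[train], y[train])
          errors.append(np.mean(np.abs(model.predict(X[test]) - y[test])))
      print(f"10-fold mean absolute error: {np.mean(errors):.3f} s")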

  14. Automating the training development process

    Science.gov (United States)

    Scott, Carol J.

    1993-01-01

    The Automated Training Development System (ATDS) was developed as a training tool for the JPL training environment. ATDS is based on the standard for military training programs and is designed to develop training from a system perspective, focusing on components in terms of the whole process. The principal feature of ATDS is data base maintainability. Everything is contained and maintained within the data base, and, if properly developed, it could be a training component of a software delivery and provided to CM as a controlled item. The analysis, development, design, presentation, and reporting phases in the ATDS instructional design method are illustrated.

  15. Automated Low-Cost Photogrammetry for Flexible Structure Monitoring

    Science.gov (United States)

    Wang, C. H.; Mills, J. P.; Miller, P. E.

    2012-07-01

    Structural monitoring requires instruments which can provide high precision and accuracy, reliable measurements at good temporal resolution and rapid processing speeds. Long-term campaigns and flexible structures are regarded as two of the most challenging subjects in monitoring engineering structures. Long-term monitoring in civil engineering is generally considered to be labour-intensive and financially expensive, and it can take significant effort to arrange the necessary human resources, transportation and equipment maintenance. When dealing with flexible structure monitoring, it is of paramount importance that any monitoring equipment used is able to carry out rapid sampling. Low cost, automated, photogrammetric techniques therefore have the potential to become routinely viable for monitoring non-rigid structures. This research aims to provide a photogrammetric solution for long-term flexible structural monitoring purposes. The automated approach was achieved using low-cost imaging devices (mobile phones) to replace traditional image acquisition stations and substantially reduce the equipment costs. A self-programmed software package was developed to deal with the hardware-software integration and system operation. In order to evaluate the performance of this low-cost monitoring system, a shaking table experiment was undertaken. Different network configurations and target sizes were used to determine the best configuration. A large quantity of image data was captured by four DSLR cameras and four mobile phone cameras respectively. These image data were processed using photogrammetric techniques to calculate the final results for the system evaluation.

  16. Automated engineering of process automation systems; Automatisiertes Engineering von Prozessleitsystem-Funktionen

    Energy Technology Data Exchange (ETDEWEB)

    Schmidberger, T.; Fay, A. [Univ. der Bundeswehr Hamburg (Germany). Inst. fuer Automatisierungstechnik]; Drath, R. [ABB AG, Ladenburg (Germany). Forschungszentrum]

    2005-07-01

    The paper proposes a concept to reduce the engineering effort for the planning and implementation of process control systems. According to this concept, knowledge-based methods accomplish engineering tasks automatically. This approach makes use of information provided electronically by a new, object-oriented P and I diagram tool, thus allowing the 'automation of automation'. As examples of this concept, the automatic engineering of interlockings and asset monitors is described. (orig.)

  17. SHARP: Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. Gaius

    1991-01-01

    Briefly discussed here is the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  18. SHARP - Automated monitoring of spacecraft health and status

    Science.gov (United States)

    Atkinson, David J.; James, Mark L.; Martin, R. G.

    1990-01-01

    Briefly discussed here is the spacecraft and ground systems monitoring process at the Jet Propulsion Laboratory (JPL). Some of the difficulties associated with the existing technology used in mission operations are highlighted. A new automated system based on artificial intelligence technology is described which seeks to overcome many of these limitations. The system, called the Spacecraft Health Automated Reasoning Prototype (SHARP), is designed to automate health and status analysis for multi-mission spacecraft and ground data systems operations. The system has proved to be effective for detecting and analyzing potential spacecraft and ground systems problems by performing real-time analysis of spacecraft and ground data systems engineering telemetry. Telecommunications link analysis of the Voyager 2 spacecraft was the initial focus for evaluation of the system in real-time operations during the Voyager spacecraft encounter with Neptune in August 1989.

  19. Design and development of automated TLD contamination monitor

    International Nuclear Information System (INIS)

    A thermoluminescent dosimeter (TLD) is issued to occupational workers to register the external exposure received during the course of their work. Before sending the TLDs back for processing, it is the responsibility of the parent institution to check and certify that the TLDs are free of radioactive contamination. To ease the duty of the health physicist, a PC-based automated TLD contamination monitor was designed and developed, and the details of the same are presented in this paper

  20. Biogeochemical processing of nutrients in groundwater-fed stream during baseflow conditions - the value of fluorescence spectroscopy and automated high-frequency nutrient monitoring

    Science.gov (United States)

    Bieroza, Magdalena; Heathwaite, Louise

    2014-05-01

    Recent research in groundwater-dominated streams indicates that organic matter plays an important role in nutrient transformations at the surface-groundwater interface known as the hyporheic zone. Mixing of water and nutrient fluxes in the hyporheic zone controls in-stream nutrient availability, dynamics and export to downstream reaches. In particular, benthic sediments can form adsorptive sinks for organic matter and reactive nutrients (nitrogen and phosphorus) that sustain a variety of hyporheic processes, e.g. denitrification and microbial uptake. Thus, hyporheic metabolism can have an important effect on both the quantity (concentration) and quality (labile vs. refractory character) of organic matter. Here high-frequency nutrient monitoring combined with spectroscopic analysis was used to provide insights into the biogeochemical processing of a small, agricultural stream in NE England subject to diffuse nutrient pollution. Biogeochemical data were collected hourly for a week at baseflow conditions, when in-stream-hyporheic nutrient dynamics have the greatest impact on stream health. In-stream nutrients (total phosphorus, reactive phosphorus, nitrate nitrogen) and water quality parameters (turbidity, specific conductivity, pH, temperature, dissolved oxygen, redox potential) were measured in situ hourly by an automated bank-side laboratory. Concurrent hourly autosamples were retrieved daily and analysed for nutrients and fine sediments, including spectroscopic analyses of dissolved organic matter - excitation-emission matrix (EEM) fluorescence spectroscopy and ultraviolet-visible (UV-Vis) absorbance spectroscopy. Our results show that organic matter can potentially be utilised as a natural, environmental tracer of the biogeochemical processes occurring at the surface-groundwater interface in streams. High-frequency spectroscopic characterisation of in-stream organic matter can provide useful quantitative and qualitative information on fluxes of reactive nutrients in

  1. Automation of Design Engineering Processes

    Science.gov (United States)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that "all the written records" as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.
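
    The implementation described is a Microsoft Access relational database; the core data model (every written record in one store, linked by a single tracking number) can be shown schematically with sqlite3. Table and column names are invented.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
          CREATE TABLE design_item (
              tracking_no TEXT PRIMARY KEY,
              title       TEXT NOT NULL
          );
          -- Every kind of written record (minutes, memos, change requests,
          -- approvals, test reports) hangs off the same tracking number.
          CREATE TABLE document (
              doc_id      INTEGER PRIMARY KEY,
              tracking_no TEXT REFERENCES design_item(tracking_no),
              doc_type    TEXT,    -- 'minutes', 'memo', 'change_request', ...
              filed_on    TEXT,
              body        TEXT
          );
      """)
      conn.execute("INSERT INTO design_item VALUES ('TRK-0042', 'Valve actuator')")
      conn.execute("INSERT INTO document(tracking_no, doc_type, filed_on, body) "
                   "VALUES ('TRK-0042', 'change_request', '2004-01-15', "
                   "'Increase stroke margin')")
      for row in conn.execute(
              "SELECT doc_type, filed_on, body FROM document "
              "WHERE tracking_no = 'TRK-0042'"):
          print(row)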

  2. Real-time bioacoustics monitoring and automated species identification

    Directory of Open Access Journals (Sweden)

    T. Mitchell Aide

    2013-07-01

    Full Text Available Traditionally, animal species diversity and abundance is assessed using a variety of methods that are generally costly, limited in space and time, and most importantly, they rarely include a permanent record. Given the urgency of climate change and the loss of habitat, it is vital that we use new technologies to improve and expand global biodiversity monitoring to thousands of sites around the world. In this article, we describe the acoustical component of the Automated Remote Biodiversity Monitoring Network (ARBIMON), a novel combination of hardware and software for automating data acquisition, data management, and species identification based on audio recordings. The major components of the cyberinfrastructure include: a solar powered remote monitoring station that sends 1-min recordings every 10 min to a base station, which relays the recordings in real-time to the project server, where the recordings are processed and uploaded to the project website (arbimon.net). Along with a module for viewing, listening, and annotating recordings, the website includes a species identification interface to help users create machine learning algorithms to automate species identification. To demonstrate the system we present data on the vocal activity patterns of birds, frogs, insects, and mammals from Puerto Rico and Costa Rica.
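
    The species-identification step can be approximated with ordinary supervised learning on spectral features. The sketch below is a generic stand-in, not the ARBIMON implementation (whose models users train through the website): it reduces each recording to a time-averaged spectrum with SciPy and fits a random forest; the file names and species labels are placeholders.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
from sklearn.ensemble import RandomForestClassifier

def mean_spectrum(path, n_bins=128):
    """Reduce a recording to its time-averaged log power spectrum."""
    rate, audio = wavfile.read(path)
    if audio.ndim > 1:                       # mix stereo down to mono
        audio = audio.mean(axis=1)
    f, t, sxx = spectrogram(audio, fs=rate, nperseg=2 * n_bins - 2)
    return np.log1p(sxx).mean(axis=1)        # one feature per frequency bin

# Placeholder training data: annotated 1-min clips and their species labels.
paths = ["clip_frog_01.wav", "clip_bird_01.wav"]
labels = ["Eleutherodactylus coqui", "Coereba flaveola"]
X = np.array([mean_spectrum(p) for p in paths])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
print(clf.predict([mean_spectrum("new_clip.wav")]))
```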

  3. Automated wireless monitoring system for cable tension using smart sensors

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jongwoong; Cho, Soojin; Spencer, Billie F.; Yun, Chung-Bang

    2013-04-01

    Cables are critical load carrying members of cable-stayed bridges; monitoring tension forces of the cables provides valuable information for SHM of the cable-stayed bridges. Monitoring systems for the cable tension can be efficiently realized using wireless smart sensors in conjunction with vibration-based cable tension estimation approaches. This study develops an automated cable tension monitoring system using MEMSIC's Imote2 smart sensors. An embedded data processing strategy is implemented on the Imote2-based wireless sensor network to calculate cable tensions using a vibration-based method, significantly reducing the wireless data transmission and associated power consumption. The autonomous operation of the monitoring system is achieved by AutoMonitor, a high-level coordinator application provided by the Illinois SHM Project Services Toolsuite. The monitoring system also features power harvesting enabled by solar panels attached to each sensor node and AutoMonitor for charging control. The proposed wireless system has been deployed on the Jindo Bridge, a cable-stayed bridge located in South Korea. Tension forces are autonomously monitored for 12 cables in the east, land side of the bridge, proving the validity and potential of the presented tension monitoring system for real-world applications.
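
    Vibration-based tension estimation commonly rests on the taut-string relation f_n = (n/2L)sqrt(T/m), so that T = 4mL^2(f_n/n)^2. The following sketch of the embedded-processing idea picks the fundamental frequency from an acceleration record and converts it to tension; it assumes a known cable length L and mass per unit length m, and is the generic method rather than the Imote2 firmware.

```python
import numpy as np

def cable_tension(accel, fs, length_m, mass_per_m, mode=1):
    """Estimate tension from the n-th natural frequency of a taut cable."""
    accel = accel - accel.mean()                 # remove DC offset
    spec = np.abs(np.fft.rfft(accel * np.hanning(accel.size)))
    freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)
    f_n = freqs[np.argmax(spec[1:]) + 1]         # dominant peak, skipping DC
    # Taut-string formula: f_n = (n / 2L) * sqrt(T / m)  =>  T = 4 m L^2 (f_n / n)^2
    return 4.0 * mass_per_m * length_m**2 * (f_n / mode) ** 2

# Synthetic check: a 100 m cable, 50 kg/m, vibrating at its 1 Hz fundamental
fs = 100.0
t = np.arange(0, 60, 1 / fs)
accel = np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.random.randn(t.size)
print(f"{cable_tension(accel, fs, 100.0, 50.0) / 1e3:.0f} kN")  # ~2000 kN
```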

  4. Automated Method for Monitoring Water Quality Using Landsat Imagery

    Directory of Open Access Journals (Sweden)

    D. Clay Barrett

    2016-06-01

    Full Text Available Regular monitoring of water quality is increasingly necessary to keep pace with rapid environmental change and protect human health and well-being. Remote sensing has been suggested as a potential solution for monitoring certain water quality parameters without the need for in situ sampling, but universal methods and tools are lacking. While many studies have developed predictive relationships between remotely sensed surface reflectance and water parameters, these relationships are often unique to a particular geographic region and have little applicability in other areas. In order to remotely monitor water quality, these relationships must be developed on a region by region basis. This paper presents an automated method for processing remotely sensed images from Landsat Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+) and extracting corrected reflectance measurements around known sample locations to allow rapid development of predictive water quality relationships to improve remote monitoring. Using open Python scripting, this study (1) provides an openly accessible and simple method for processing publicly available remote sensing data; and (2) allows determination of relationships between sampled water quality parameters and reflectance values to ultimately allow predictive monitoring. The method is demonstrated through a case study of the Ozark/Ouachita-Appalachian ecoregion in eastern Oklahoma using data collected for the Beneficial Use Monitoring Program (BUMP).
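
    In the spirit of the paper's open Python scripting, a condensed sketch of the core step: sample reflectance at known station coordinates and regress a water quality parameter on it. It assumes a single-band GeoTIFF already converted to corrected reflectance and uses rasterio's sample API; the file name, band choice, coordinates and the simple linear model are illustrative.

```python
import numpy as np
import rasterio
from sklearn.linear_model import LinearRegression

# Placeholder inputs: station coordinates (in the raster's CRS) and the
# water quality values measured there, e.g. turbidity from BUMP sampling.
coords = [(345120.0, 3952410.0), (351880.0, 3948020.0), (349300.0, 3955750.0)]
turbidity = np.array([12.4, 7.1, 9.8])

with rasterio.open("landsat_tm_band3_reflectance.tif") as src:
    # sample() yields one array of band values per coordinate pair
    reflectance = np.array([vals[0] for vals in src.sample(coords)])

model = LinearRegression().fit(reflectance.reshape(-1, 1), turbidity)
print("R^2 on training stations:", model.score(reflectance.reshape(-1, 1), turbidity))
```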

  5. Welding process automation in power machine building

    International Nuclear Information System (INIS)

    The level of automation of welding operations in power engineering and ways of enhancing it are highlighted. Examples of complex automation include an apparatus for the horizontal welding of turbine rotors, a remotely controlled automatic machine for welding ring joints of large-sized vessels, and equipment for the electron-beam welding of steam turbine assemblies of alloyed steels. The prospects of industrial robots are noted. The importance of the complex automation of the technological process, including stocking, assembling, transportation and auxiliary operations, is emphasized.

  6. Area γ radiation monitoring network systems based on totally integrated automation

    International Nuclear Information System (INIS)

    A kind of area γ radiation monitoring network system based on Totally Integrated Automation is introduced. It features simple and safe process control, easy integration of the information network, field bus and field instrumentation, and modular design with powerful system expansion; it implements management and control integration and is of positive significance for the localization of radiation monitoring systems. (authors)

  7. Design and implementation of an Internet based effective controlling and monitoring system with wireless fieldbus communications technologies for process automation--an experimental study.

    Science.gov (United States)

    Cetinceviz, Yucel; Bayindir, Ramazan

    2012-05-01

    The network requirements of control systems in industrial applications increase day by day. Internet based control systems and various fieldbus systems have been designed in order to meet these requirements. This paper describes an Internet based control system with wireless fieldbus communication designed for distributed processes. The system was implemented as an experimental setup in a laboratory. In industrial facilities, the process control layer and the distance connection of the distributed control devices in the lowest levels of the industrial production environment are provided with fieldbus networks. In this paper, an Internet based control system that meets the system requirements with a new-generation communication structure, a so-called wired/wireless hybrid system, has been designed on the field level and carried out to cover all sectors of distributed automation, from process control to distributed input/output (I/O). The system comprises a hardware structure with a programmable logic controller (PLC), a communication processor (CP) module, two industrial wireless modules, a distributed I/O module and a Motor Protection Package (MPP), and a software structure with the WinCC flexible program used for the SCADA (Supervisory Control and Data Acquisition) screen and the SIMATIC MANAGER package program ("STEP7") used for the hardware and network configuration and for downloading the control program to the PLC. PMID:22306882

  8. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, the methodology, work performed during the summer, and future work planned.

  9. Automation of loading process equipment industrial companies

    Directory of Open Access Journals (Sweden)

    П.М. Павленко

    2006-01-01

    Full Text Available  The results of mathematical modelling of the loading process of metal-cutting equipment using methods of queueing theory are presented. The results are used for the construction of the corresponding program module of an automated system for the technological preparation of manufacture.
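
    The queueing-theory approach can be illustrated with the standard M/M/1 formulas for a single machine: with job arrival rate λ and service rate μ, the utilization is ρ = λ/μ, and the mean queue length and waiting time follow directly. A toy calculation, illustrative of the method rather than taken from the paper:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 measures for a single machine loading process."""
    rho = arrival_rate / service_rate          # machine utilization
    assert rho < 1, "queue is unstable: jobs arrive faster than they are cut"
    lq = rho**2 / (1 - rho)                    # mean number of jobs waiting
    wq = lq / arrival_rate                     # mean wait per job (Little's law)
    return rho, lq, wq

# Example: 4 workpieces/hour arriving at a machine that serves 5/hour
rho, lq, wq = mm1_metrics(4.0, 5.0)
print(f"utilization={rho:.0%}, queue={lq:.1f} jobs, wait={wq:.2f} h")
```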

  10. Automated radiochemical processing for clinical PET

    International Nuclear Information System (INIS)

    The Siemens RDS 112, an automated radiochemical production and delivery system designed to support a clinical PET program, consists of an 11 MeV, proton only, negative ion cyclotron, a shield, a computer, and targetry and chemical processing modules to produce radiochemicals used in PET imaging. The principal clinical PET tracers are [18F]FDG, [13N]ammonia and [15O]water. Automated synthesis of [18F]FDG is achieved using the Chemistry Process Control Unit (CPCU), a general purpose valve-and-tubing device that emulates manual processes while allowing for competent operator intervention. Using function-based command file software, this pressure-driven synthesis system carries out chemical processing procedures by timing only, without process-based feedback. To date, nine CPCUs have been installed at seven institutions, resulting in 1,200+ syntheses of [18F]FDG, with an average yield of 55% (EOB)
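
    The CPCU's distinguishing feature, chemistry driven by timing alone with no process-based feedback, can be sketched as an open-loop sequence runner. The step table and valve names below are invented for illustration; they are not the actual FDG command files.

```python
import time

# Hypothetical open-loop recipe: (duration in seconds, action). As in the
# CPCU, each step runs for a fixed time with no process-based feedback.
RECIPE = [
    (5, "open V1: transfer [18F]fluoride to the reaction vessel"),
    (10, "open V2: add precursor, start heater"),
    (8, "open V3: add hydrolysis reagent"),
    (4, "open V4: push product to the purification column"),
]

def run_recipe(recipe, speedup=10.0):
    start = time.monotonic()
    for seconds, action in recipe:
        print(f"t+{time.monotonic() - start:5.1f}s  {action}")
        time.sleep(seconds / speedup)      # timing only -- no sensor feedback
    print("sequence complete; an operator may inspect between runs")

run_recipe(RECIPE)
```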

  11. Monitoring of Microalgal Processes.

    Science.gov (United States)

    Havlik, Ivo; Scheper, Thomas; Reardon, Kenneth F

    2016-01-01

    Process monitoring, which can be defined as the measurement of process variables with the smallest possible delay, is combined with process models to form the basis for successful process control. Minimizing the measurement delay leads inevitably to employing online, in situ sensors where possible, preferably using noninvasive measurement methods with stable, low-cost sensors. Microalgal processes have similarities to traditional bioprocesses but also have unique monitoring requirements. In general, variables to be monitored in microalgal processes can be categorized as physical, chemical, and biological, and they are measured in gaseous, liquid, and solid (biological) phases. Physical and chemical process variables can usually be monitored online using standard industrial sensors. The monitoring of biological process variables, however, relies mostly on sensors developed and validated using laboratory-scale systems or uses offline methods because of difficulties in developing suitable online sensors. Here, we review current technologies for online, in situ monitoring of all types of process parameters of microalgal cultivations, with a focus on monitoring of biological parameters. We discuss newly introduced methods for measuring biological parameters that could possibly be adapted for routine online use, should preferably be noninvasive, and are based on approaches that have been proven in other bioprocesses. New sensor types for measuring physicochemical parameters using optical methods or ion-specific field effect transistor (ISFET) sensors are also discussed. Reviewed methods with online implementation or online potential include measurement of irradiance, biomass concentration by optical density and image analysis, cell count, chlorophyll fluorescence, growth rate, lipid concentration by infrared spectrophotometry, dielectric scattering, and nuclear magnetic resonance. Future perspectives are discussed, especially in the field of image analysis using in situ microscopy.

  12. Automated chemical monitoring in new projects of nuclear power plant units

    Science.gov (United States)

    Lobanok, O. I.; Fedoseev, M. V.

    2013-07-01

    The development of automated chemical monitoring systems in nuclear power plant units for the past 30 years is briefly described. The modern level of facilities used to support the operation of automated chemical monitoring systems in Russia and abroad is shown. Hardware solutions suggested by the All-Russia Institute for Nuclear Power Plant Operation (which is the General Designer of automated process control systems for power units used in the AES-2006 and VVER-TOI Projects) are presented, including the structure of additional equipment for monitoring water chemistry (taking the Novovoronezh 2 nuclear power plant as an example). It is shown that the solutions proposed with respect to receiving and processing of input measurement signals and subsequent construction of standard control loops are unified in nature. Simultaneous receipt of information from different sources for ensuring that water chemistry is monitored in sufficient scope and with required promptness is one of the problems that have been solved successfully. It is pointed out that improved quality of automated chemical monitoring can be supported by organizing full engineering follow-up of the automated chemical monitoring system's equipment throughout its entire service life.

  13. Automated extinction monitor for the NLOT site survey

    Science.gov (United States)

    Kumar Sharma, Tarun

    In order to search for a few potential sites for the National Large Optical Telescope (NLOT) project, we have initiated a site survey program. Since most of the instruments used for the site survey are custom made, we also started developing our own site characterization instruments. In this process we have designed and developed a device called the Automated Extinction Monitor (AEM) and installed it at IAO, Hanle. The AEM is a small wide-field robotic telescope dedicated to recording atmospheric extinction in one or more photometric bands. It gives very accurate statistics of the distribution of photometric nights. In addition, the instrument also provides measurements of sky brightness. Here we briefly describe the overall instrument and the initial results obtained.
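
    Atmospheric extinction from such an instrument is conventionally derived from the Bouguer relation m(X) = m0 + kX: instrumental magnitudes of a star are regressed on airmass X, and the slope k is the extinction coefficient in that band. A minimal sketch with synthetic photometry (the AEM's actual pipeline is not described here):

```python
import numpy as np

# Synthetic nightly photometry of one star: airmass and instrumental magnitude.
airmass = np.array([1.0, 1.2, 1.5, 1.9, 2.4])
true_k, m0 = 0.21, 11.3                       # mag/airmass, zero-airmass magnitude
mag = m0 + true_k * airmass + np.random.normal(0, 0.01, airmass.size)

# Bouguer law m(X) = m0 + k*X: a straight-line fit recovers the extinction k.
k_fit, m0_fit = np.polyfit(airmass, mag, 1)
print(f"extinction k = {k_fit:.3f} mag/airmass (true {true_k})")
```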

  14. In Process Beam Monitoring

    Science.gov (United States)

    Steen, W. M.; Weerasinghe, V. M.

    1986-11-01

    The industrial future of lasers in material processing lies in the combination of the laser with automatic machinery. One possible form of such a combination is an intelligent workstation which monitors the process as it occurs and adjusts itself accordingly, either by self-teaching or by comparison to a process data bank or algorithm. In order to achieve this attractive goal, in-process signals are required. Two devices are described in this paper. One is the Laser Beam Analyser, which is now maturing into a second generation with computerised output. The other is the Acoustic Mirror, a totally novel analytic technique, not yet fully understood, but which nevertheless can act as a very effective process monitor.

  15. Classification Trees for Quality Control Processes in Automated Constructed Response Scoring.

    Science.gov (United States)

    Williamson, David M.; Hone, Anne S.; Miller, Susan; Bejar, Isaac I.

    As the automated scoring of constructed responses reaches operational status, the issue of monitoring the scoring process becomes a primary concern, particularly when the goal is to have automated scoring operate completely unassisted by humans. Using a vignette from the Architectural Registration Examination and data for 326 cases with both human…
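
    A hedged sketch of the technique class: a classification tree trained on hypothetical case features, with labels indicating whether human adjudication disagreed with the automated score, can route new cases to human review. The feature set is invented for illustration and is not the study's actual variables.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical QC features per scored response:
# [automated score, human score, |difference|, response length]
X = np.array([
    [4, 4, 0, 310], [2, 3, 1, 120], [1, 4, 3, 95],
    [3, 3, 0, 240], [4, 1, 3, 60],  [2, 2, 0, 180],
])
needs_review = np.array([0, 0, 1, 0, 1, 0])   # 1 = adjudication disagreed

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, needs_review)
new_case = np.array([[1, 3, 2, 100]])
print("flag for human review?", bool(tree.predict(new_case)[0]))
```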

  16. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    OpenAIRE

    Hon Ming Yip; John C. S. Li; Kai Xie; Xin Cui; Agrim Prasad; Qiannan Gao; Chi Chiu Leung; Lam, Raymond H. W.

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet...
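
    Machine vision-assisted positioning of repeated chamber patterns is commonly implemented with normalized cross-correlation template matching. A short OpenCV sketch follows; the file names are placeholders, and this is a generic stand-in rather than the authors' method.

```python
import cv2

# Locate one microchamber template within a full-field image of the device.
scene = cv2.imread("chip_field.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("chamber_template.png", cv2.IMREAD_GRAYSCALE)

result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

h, w = template.shape
cx, cy = max_loc[0] + w // 2, max_loc[1] + h // 2
print(f"best match at centre ({cx}, {cy}) with score {max_val:.2f}")
# The offset from the expected position could then drive a stage correction.
```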

  17. Wind Turbine Manufacturing Process Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Waseem Faidi; Chris Nafis; Shatil Sinha; Chandra Yerramalli; Anthony Waas; Suresh Advani; John Gangloff; Pavel Simacek

    2012-04-26

    The objective is to develop a practical inline inspection that could be used in combination with automated composite material placement equipment to economically manufacture high-performance and reliable carbon composite wind turbine blade spar caps. The technical feasibility and cost benefit of the approach will be assessed to provide a solid basis for further development and implementation in the wind turbine industry. The program is focused on the following technology development: (1) develop in-line monitoring methods, using optical metrology and ultrasound inspection, and perform a demonstration in the lab, including development of the approach and an appropriate laboratory demonstration; (2) develop methods to predict composite strength reduction due to defects; and (3) develop process models to predict defects from leading indicators found in the uncured composites.

  18. Automated Synthesis of Assertion Monitors using Visual Specifications

    CERN Document Server

    Gadkari, Ambar A

    2011-01-01

    Automated synthesis of monitors from high-level properties plays a significant role in assertion-based verification. We present here a methodology to synthesize assertion monitors from visual specifications given in CESC (Clocked Event Sequence Chart). CESC is a visual language designed for specifying system level interactions involving single and multiple clock domains. It has well-defined graphical and textual syntax and formal semantics based on synchronous language paradigm enabling formal analysis of specifications. In this paper we provide an overview of CESC language with few illustrative examples. The algorithm for automated synthesis of assertion monitors from CESC specifications is described. A few examples from standard bus protocols (OCP-IP and AMBA) are presented to demonstrate the application of monitor synthesis algorithm.

  19. Demonstration of expert systems in automated monitoring

    International Nuclear Information System (INIS)

    The Reactor Systems Section of Oak Ridge National Laboratory's Instrumentation and Controls Division has been developing expertise in the application of artificial intelligence (AI) tools and techniques to control complex systems. One of the applications developed demonstrates the capabilities of a rule-based expert system to monitor a nuclear reactor. Based on the experience acquired with the demonstration described in this paper, a 2-yr program was initiated during fiscal year 1985 for the development and implementation of an intelligent monitoring adviser to the operators of the HFIR facility. The intelligent monitoring system will act as an alert and cooperative expert to relieve the operators of routine tasks, request their attention when abnormalities are detected, and provide them with interactive diagnostic aid and project action/effects information as needed or on demand

  20. Java Implementation based Heterogeneous Video Sequence Automated Surveillance Monitoring

    Directory of Open Access Journals (Sweden)

    Sankari Muthukarupan

    2013-04-01

    Full Text Available Automated video-based surveillance monitoring is an essential and computationally challenging task for resolving issues in secure-access localities. This paper deals with some of the issues encountered when integrating surveillance monitoring into real-life circumstances. We employ video frames extracted from heterogeneous video formats. Each video frame is chosen to identify the anomalous events occurring in the sequence of the time-driven process. Background subtraction is essentially required, based on an optimal threshold and a reference frame. The remaining frames are subtracted from the reference image, so that all the foreground image paradigms are obtained. The coordinates existing in the subtracted images are found by scanning the images horizontally until the occurrence of the first black pixel. The obtained coordinate is twinned with the existing coordinates in the primary images. The twinned coordinate in the primary image is considered an active region of interest. At the end, the starred images are converted to a temporal video that scrutinizes the moving silhouettes of human behaviors against a static background. The proposed model is implemented in Java. Results and performance analysis are carried out in real-life environments.
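
    The described pipeline (difference each frame against a reference, threshold, then scan for the first foreground pixel to seed an active region of interest) can be sketched in a few lines of OpenCV/NumPy; the original is a Java implementation, and the threshold value and file name here are placeholders.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("surveillance.avi")
ok, reference = cap.read()                    # first frame as static background
reference = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, reference)       # subtract the reference frame
    _, fg = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)
    ys, xs = np.nonzero(fg)                   # row-major scan of foreground pixels
    if xs.size:
        x0, y0, x1, y1 = xs.min(), ys.min(), xs.max(), ys.max()
        cv2.rectangle(frame, (x0, y0), (x1, y1), (0, 0, 255), 2)  # active ROI
    cv2.imshow("monitor", frame)
    if cv2.waitKey(1) == 27:                  # Esc quits
        break
cap.release()
```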

  1. Methodology for monitoring and automated diagnosis of ball bearing using para consistent logic, wavelet transform and digital signal processing; Metodologia de monitoracao e diagnostico automatizado de rolamentos utilizando logica paraconsistente, transformada de Wavelet e processamento de sinais digitais

    Energy Technology Data Exchange (ETDEWEB)

    Masotti, Paulo Henrique Ferraz

    2006-07-01

    The monitoring and diagnosis area has shown impressive development in recent years with the introduction of new diagnosis techniques as well as the use of computers in the processing of information and of the diagnosis techniques themselves. The contribution of artificial intelligence to the automation of defect diagnosis is developing continually, and the growing automation in industry is adopting these new techniques. In the nuclear area, growing concern with the safety of facilities demands more effective techniques, which have been sought to increase the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow the verification of their operational conditions. In this way, the present work can also collaborate in this area, helping in the diagnosis of the operational condition of machines. This work presents a new technique for feature extraction based on the zero crossings of the wavelet transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work is the Paraconsistent Logic of Annotation with Two Values (LPA2v), contributing to the automation of defect diagnosis, because this logic can deal with the contradictory results that feature extraction techniques can present. This work also concentrated on the identification of defects in their initial phase, using accelerometers, because they are robust, low-cost sensors that can easily be found in industry in general. The results obtained in this work were accomplished through the use of an experimental database, and the defect diagnoses showed good results for defects in their initial phase. (author)
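
    The feature-extraction idea, counting zero crossings of wavelet-decomposed vibration signals, can be approximated with PyWavelets. The sketch below assumes an accelerometer record in a NumPy array; the wavelet choice and decomposition level are illustrative, and the paraconsistent-logic classification stage is omitted.

```python
import numpy as np
import pywt

def zero_crossing_features(signal, wavelet="db4", level=4):
    """Count zero crossings in each wavelet sub-band of a vibration signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return [int(np.sum(np.diff(np.sign(c)) != 0)) for c in coeffs]

# Synthetic bearing signal: low-frequency shaft tone plus a weak, fast
# oscillation standing in for an incipient-defect component.
fs = 10_000
t = np.arange(0, 1.0, 1 / fs)
healthy = np.sin(2 * np.pi * 30 * t)
faulty = healthy + 0.05 * np.sin(2 * np.pi * 3_000 * t)

print("healthy:", zero_crossing_features(healthy))
print("faulty: ", zero_crossing_features(faulty))   # detail bands change first
```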

  2. Plutonium monitor: data processing

    International Nuclear Information System (INIS)

    The principle of the real-time determination of air voluminal activity from the measurement of the activity of the filter is described. The ''Pu'' measurement processing has to complete the Pu/natural radioactivity discrimination that the sampler cannot achieve alone. The basic process of the measurement processing is described. For the operation checkout and the examination of the performance of the processing, and for the technical success of a measurement-processing system, it is possible to use a real-time simulation of the different sensors; in the case of ''Pu'' processing, a mockup of the sampler has been preferred; it gives the elementary countings due to natural radioactivity for the two ''Pu'' and ''RaA'' windows; it has been associated with a simulator giving the pulses corresponding in the ''Pu'' window to ''Pu'' only, according to the chosen profile. The main results obtained after several hundred simulations are given; eight diagrams, quite representative, are presented. To conclude, the performance of the BFSA monitor for plutonium aerosol monitoring, in which the TMAPU2 measurement-processing system and a performant detection head are associated, is reviewed

  3. Automated full matrix capture for industrial processes

    Science.gov (United States)

    Brown, Roy H.; Pierce, S. Gareth; Collison, Ian; Dutton, Ben; Dziewierz, Jerzy; Jackson, Joseph; Lardner, Timothy; MacLeod, Charles; Morozov, Maxim

    2015-03-01

    Full matrix capture (FMC) ultrasound can be used to generate a permanent re-focusable record of data describing the geometry of a part; a valuable asset for an inspection process. FMC is a desirable acquisition mode for automated scanning of complex geometries, as it allows compensation for surface shape in post processing and application of the total focusing method. However, automating the delivery of such FMC inspection remains a significant challenge for real industrial processes due to the high data overhead associated with the ultrasonic acquisition. The benefits of NDE delivery using six-axis industrial robots are well versed when considering complex inspection geometries, but such an approach brings additional challenges to scanning speed and positional accuracy when combined with FMC inspection. This study outlines steps taken to optimize the scanning speed and data management of a process to scan the diffusion bonded membrane of a titanium test plate. A system combining a KUKA robotic arm and a reconfigurable FMC phased array controller is presented. The speed and data implications of different scanning methods are compared, and the impacts on data visualization quality are discussed with reference to this study. For the 0.5 m2 sample considered, typical acquisitions of 18 TB/m2 were measured for a triple back wall FMC acquisition, illustrating the challenge of combining high data throughput with acceptable scanning speeds.
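
    A compact NumPy sketch of the total focusing method applied to an FMC dataset A[tx, rx, t]: for every image pixel, sum the samples whose arrival time matches the transmitter-to-pixel plus pixel-to-receiver travel time. The array geometry and sampling values are placeholders; production implementations add interpolation, apodization and surface correction.

```python
import numpy as np

def tfm(fmc, elem_x, fs, c, grid_x, grid_z):
    """Delay-and-sum TFM image from full matrix capture data fmc[tx, rx, t]."""
    n = len(elem_x)
    image = np.zeros((grid_z.size, grid_x.size))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            d = np.hypot(elem_x - x, z)          # element-to-pixel distances
            for tx in range(n):
                for rx in range(n):
                    idx = int((d[tx] + d[rx]) / c * fs)   # round-trip sample index
                    if idx < fmc.shape[2]:
                        image[iz, ix] += fmc[tx, rx, idx]
    return image

# Toy example: 8-element array, random data standing in for real A-scans
fs, c = 50e6, 5900.0                      # 50 MHz sampling, steel velocity in m/s
elem_x = np.arange(8) * 0.6e-3            # 0.6 mm pitch
fmc = np.random.randn(8, 8, 4000)
img = tfm(fmc, elem_x, fs, c, np.linspace(0, 5e-3, 32), np.linspace(1e-3, 20e-3, 64))
print(img.shape)
```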

  4. Process development for automated solar cell and module production. Task 4: automated array assembly

    Energy Technology Data Exchange (ETDEWEB)

    Hagerty, J.J.

    1980-06-30

    The scope of work under this contract involves specifying a process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use. This process sequence is then critically analyzed from a technical and economic standpoint to determine the technological readiness of each process step for implementation. The process steps are ranked according to the degree of development effort required and according to their significance to the overall process. Under this contract the steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development. Economic analysis using the SAMICS system has been performed during these studies to assure that development efforts have been directed towards the ultimate goal of price reduction. Details are given. (WHK)

  5. A plasma process monitor/control system

    Energy Technology Data Exchange (ETDEWEB)

    Stevenson, J.O.; Ward, P.P.; Smith, M.L. [Sandia National Labs., Albuquerque, NM (United States); Markle, R.J. [Advanced Micro Devices, Inc., Austin, TX (United States)

    1997-08-01

    Sandia National Laboratories has developed a system to monitor plasma processes for control of industrial applications. The system is designed to act as a fully automated, stand-alone process monitor during printed wiring board and semiconductor production runs. The monitor routinely performs data collection, analysis, process identification, and error detection/correction without the need for human intervention. The monitor can also be used in research mode to allow process engineers to gather additional information about plasma processes. The plasma monitor can perform real-time control of support systems known to influence plasma behavior. The monitor can also signal personnel to modify plasma parameters when the system is operating outside of desired specifications and requires human assistance. A notification protocol can be selected for conditions detected in the plasma process. The Plasma Process Monitor/Control System consists of a computer running software developed by Sandia National Laboratories, a commercially available spectrophotometer equipped with a charge-coupled device camera, an input/output device, and a fiber optic cable.

  6. Automated HPLC monitoring of broth components on bioreactors

    OpenAIRE

    Favre, Eric; Pugeaud, Patrick; Raboud, Jean Philippe; Péringer, Paul

    1989-01-01

    Under proper operating conditions, a low dead volume continuous filtration module operated on biological broths (yeast and bacteria suspensions in stirred reactors) still fulfills the flow-rate requirements of an analytical apparatus (for example HPLC or FIA) without membrane regeneration. The filtrate stream has been successfully connected to a bioreactor in order to perform the automated HPLC analysis of broth components. The monitoring of the carbon source (lactose), and minor products (gl...

  7. Main principles of automated radiation monitoring system construction at NPPs

    International Nuclear Information System (INIS)

    The main purpose of the automated radiation monitoring system (ARMS) is to control the radiation situation on the operating site, in the sanitary-protection zone and in the observation zone, with the aim of preparing data for radiation safety control under normal NPP operation. The ARMS functions necessary for solving this problem are enumerated. The ARMS organization structure and the functions of its main parts are described. Special attention is paid to the ARMS central desk, including requirements for equipment, communication means, reliability, and radiometric and dosimetric devices. 1 fig

  8. Automated inundation monitoring using TerraSAR-X multitemporal imagery

    Science.gov (United States)

    Gebhardt, S.; Huth, J.; Wehrmann, T.; Schettler, I.; Künzer, C.; Schmidt, M.; Dech, S.

    2009-04-01

    The Mekong Delta in Vietnam offers natural resources for several million inhabitants. However, a strong population increase, changing climatic conditions and regulatory measures at the upper reaches of the Mekong lead to severe changes in the Delta. Extreme flood events occur more frequently, drinking water availability is increasingly limited, soils show signs of salinization or acidification, and species and complete habitats diminish. During the monsoon season the river regularly overflows its banks in the lower Mekong area, usually with beneficial effects. However, extreme flood events occur more frequently, causing extensive damage; on average, once every 6 to 10 years river flood levels exceed the critical beneficial level. X-band SAR data are well suited for deriving inundated surface areas. The TerraSAR-X sensor with its different scanning modes allows for the derivation of spatially and temporally highly resolved inundation masks. The paper presents an automated procedure for deriving inundated areas from TerraSAR-X ScanSAR and Stripmap image data. Within the framework of the German-Vietnamese WISDOM project, focussing on the Mekong Delta region in Vietnam, images have been acquired covering the flood season from June 2008 to November 2008. Based on these images, a time series of the so-called watermask showing inundated areas has been derived. The product is required as an intermediate to (i) calibrate 2D inundation model scenarios, (ii) estimate the extent of affected areas, and (iii) analyze the scope of prior crises. The image processing approach is based on the assumption that water surfaces forward-scatter the radar signal, resulting in low backscatter returns to the sensor. It uses multiple grey-level thresholds and image morphological operations. The approach performs well in terms of automation, accuracy, robustness, and processing time. The resulting watermasks show the seasonal flooding pattern, with inundations starting in July and having their peak at the end
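
    The core of the described classification, low-backscatter thresholding followed by morphological clean-up, in a few lines of NumPy/SciPy; the single fixed threshold and structuring element are simplifying assumptions, whereas the paper uses multiple grey-level thresholds.

```python
import numpy as np
from scipy import ndimage

def watermask(backscatter_db, threshold_db=-15.0, size=3):
    """Binary water mask: smooth water surfaces forward-scatter the signal,
    so they appear as low backscatter in X-band SAR amplitude images."""
    mask = backscatter_db < threshold_db
    structure = np.ones((size, size), dtype=bool)
    mask = ndimage.binary_opening(mask, structure=structure)   # drop speckle
    mask = ndimage.binary_closing(mask, structure=structure)   # fill pinholes
    return mask

# Toy scene: land around -8 dB with a darker, flooded block
scene = np.random.normal(-8.0, 1.5, (200, 200))
scene[60:140, 40:160] = np.random.normal(-18.0, 1.5, (80, 120))
print("flooded fraction:", watermask(scene).mean())
```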

  9. D-MSR: A Distributed Network Management Scheme for Real-Time Monitoring and Process Control Applications in Wireless Industrial Automation

    Directory of Open Access Journals (Sweden)

    Paul Havinga

    2013-06-01

    Full Text Available Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving the communication resources for addressing real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. To our knowledge, this is the first distributed management scheme based on the IEEE 802.15.4e standard, which guides the nodes through the different phases from joining until publishing their sensor data in the network. We demonstrate via simulation that D-MSR can address real-time and reliable communication as well as the high throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.

  10. A software architecture for automating operations processes

    Science.gov (United States)

    Miller, Kevin J.

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a software architecture based on an integrated toolkit approach for simplifying and automating mission operations tasks. The toolkit approach is based on building adaptable, reusable graphical tools that are integrated through a combination of libraries, scripts, and system-level user interface shells. The graphical interface shells are designed to integrate and visually guide a user through the complex steps in an operations process. They provide a user with an integrated system-level picture of an overall process, defining the required inputs and possible output through interactive on-screen graphics. The OEL has developed the software for building these process-oriented graphical user interface (GUI) shells. The OEL Shell development system (OEL Shell) is an extension of JPL's Widget Creation Library (WCL). The OEL Shell system can be used to easily build user interfaces for running complex processes, applications with extensive command-line interfaces, and tool-integration tasks. The interface shells display a logical process flow using arrows and box graphics. They also allow a user to select which output products are desired and which input sources are needed, eliminating the need to know which program and its associated command-line parameters must be executed in each case. The shells have also proved valuable for use as operations training tools because of the OEL Shell hypertext help environment. The OEL toolkit approach is guided by several principles, including the use of ASCII text file interfaces with a multimission format, Perl scripts for mission-specific adaptation code, and programs that include a simple command-line interface for batch mode processing. Projects can adapt the interface shells by simple changes to the resources configuration file. This approach has allowed the development of sophisticated, automated software systems that are easy, cheap, and fast to build. This paper will

  11. Automated Web-based Monitoring of a Pump and Treat System at the Hanford Site

    Science.gov (United States)

    Webber, W.; Versteeg, R.; Richardson, A.; Ankeny, M.; Gilmore, T.; Morse, J.; Thompson, M.

    2006-05-01

    Automated and autonomous monitoring of environmental conditions can be used to improve operational efficiency, verify remedial action decisions, and promote confidence in the monitoring process by making data and associated derived information readily accessible to regulators and stakeholders. Ultimately, autonomous monitoring systems can reduce the overall costs associated with regulatory compliance of performance and long-term monitoring. As part of a joint decision between DOE and the WA Department of Ecology to put on "cold standby" a pump and treat system that has been operating on the Department of Energy's Hanford site in Washington State since 1995, a web site was developed to display the automated water level network around the pump and treat system. The automated water level network consists of nineteen wells with water level transducers and temperature and conductivity probes for selected wells. Data from this network will be used to evaluate the impacts of the pump-and-treat system and the response of the aquifer to shutdown of the system. The website will provide access to data from the automated network, along with additional information pertaining to the shutdown of the pump and treat system, to the various stakeholders in a convenient and timely fashion. This will allow the various stakeholders to observe the impacts of the shutdown as the aquifer responds. There are future plans to expand this web-based data reporting platform to other environmental data pertaining to the various remedial actions planned at the Hanford site. The benefits of the web site application for monitoring and stewardship are consistency of data processing and analyses, with automated and on-demand data and information delivery. The system and data access are password controlled, and access to various data or fields can be restricted to specified users. An important feature is that the stakeholders have access to the data in near-real time, providing a checks-and-balances system.

  12. Concept of Educational and Administrative Processes Automation System for Department

    OpenAIRE

    Ivan N. Berlinets

    2012-01-01

    The article describes the concept of and approach to the implementation of an educational and administrative processes automation system for a graduate department. The program components and technologies implementing the system's functions are described.

  13. Automated Solution for Data Monitoring (Dashboard of ASIC Design Flow

    Directory of Open Access Journals (Sweden)

    Kariyappa B S; Aravind; Dhananjaya A; Vineet Puri

    2013-07-01

    Full Text Available Application Specific Integrated Circuit (ASIC) design flow consists of several steps involving Electronic Design Automation (EDA) tools. For an ASIC designer it is very important to know the status of design development. Finding the status of the actual design is currently manual work. It is difficult to track the status and error information using the log/report files generated by the tools at different stages of the design flow. Therefore it is necessary to develop an automated tool to solve these issues and hence reduce the designer's effort significantly. In this paper a smart data monitoring (dashboard) system is developed as an automated solution using the PERL scripting language. An 8-bit Arithmetic Logic Unit (ALU) is designed for the verification of the developed dashboard system. The log/report files are generated at each stage of the design. Information such as errors, warnings, time of execution and report parameters is extracted from the design runs and stored into a database by the dashboard system. The stored design status information and report results are visualized in a single-window dashboard view at each stage of the design flow. The developed dashboard system is generic and can be used for any kind of ASIC design. Thus, by monitoring multiple design products using the dashboard, the time and effort required for checking design status is reduced significantly.
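
    The dashboard's extraction step, pulling error and warning counts and run times out of EDA log files, reduces to simple pattern matching. The sketch below is in Python rather than the authors' Perl, and the log-line conventions it matches are invented for illustration; real EDA tools vary by vendor.

```python
import re
from pathlib import Path

# Assumed log conventions: lines beginning "Error:"/"Warning:" and a
# "CPU time: <seconds>" summary line; actual formats differ per tool.
def summarize_log(path):
    text = Path(path).read_text(errors="ignore")
    summary = {
        "stage": Path(path).stem,
        "errors": len(re.findall(r"^Error:", text, re.MULTILINE)),
        "warnings": len(re.findall(r"^Warning:", text, re.MULTILINE)),
    }
    m = re.search(r"CPU time:\s*([\d.]+)", text)
    summary["cpu_s"] = float(m.group(1)) if m else None
    return summary

for log in sorted(Path("runs/alu8").glob("*.log")):   # synthesis.log, sta.log, ...
    print(summarize_log(log))                         # one row per flow stage
```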

  14. G-Cloud Monitor: A Cloud Monitoring System for Factory Automation for Sustainable Green Computing

    Directory of Open Access Journals (Sweden)

    Hwa-Young Jeong

    2014-11-01

    Full Text Available Green and cloud computing (G-cloud) are new trends in all areas of computing. The G-cloud provides an efficient function, which enables users to access their programs, systems and platforms at anytime and anyplace. Green computing can also yield greener technology by reducing power consumption for sustainable environments. Furthermore, in order to apply user needs to system development, user characteristics are regarded as some of the most important factors to be considered in product industries. In this paper, we propose a cloud monitoring system to observe and manage manufacturing systems/factory automation for sustainable green computing. For the monitoring system, we utilized the resources in G-cloud environments; hence, it can reduce the amount of system resources and devices, such as system power and processes. In addition, we propose adding a user profile to the monitoring system in order to provide a user-friendly function. That is, this function allows system configurations to be automatically matched to the individual's requirements, thus increasing efficiency.

  15. Biosensors and Automation for Bioprocess Monitoring and Control

    OpenAIRE

    Kumar, M A

    2011-01-01

    Bioprocess monitoring and control is a complex task that needs rapid and reliable methods which are adaptable to continuous analysis. Process monitoring during fermentation is widely applicable in the field of pharmaceutical, food and beverages and wastewater treatment. The ability to monitor has direct relevance in improving performance, quality, productivity, and yield of the process. In fact, the complexity of the bioprocesses requires almost real time insight into the dynamic process for ...

  16. Pyrochemical processing automation at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system for pyrochemical processing of special nuclear materials (SNM). The system utilizes a glove box, an automated tilt-pour furnace (TPF), an IBM-developed gantry robot, and specialized automation tooling. All material handling within the glove box (i.e., furnace loading, furnace unloading, product and slag separation, and product packaging) is performed automatically. The objectives of the effort are to increase process productivity, decrease operator radiation exposure, reduce process wastes, and demonstrate system reliability and availability. This paper provides an overview of the automated system hardware, outlines the overall operations sequence, and discusses the current status

  17. Semisupervised Gaussian Process for Automated Enzyme Search.

    Science.gov (United States)

    Mellor, Joseph; Grigoras, Ioana; Carbonell, Pablo; Faulon, Jean-Loup

    2016-06-17

    Synthetic biology is today harnessing the design of novel and greener biosynthesis routes for the production of added-value chemicals and natural products. The design of novel pathways often requires a detailed selection of enzyme sequences to import into the chassis at each of the reaction steps. To address such design requirements in an automated way, we present here a tool for exploring the space of enzymatic reactions. Given a reaction and an enzyme the tool provides a probability estimate that the enzyme catalyzes the reaction. Our tool first considers the similarity of a reaction to known biochemical reactions with respect to signatures around their reaction centers. Signatures are defined based on chemical transformation rules by using extended connectivity fingerprint descriptors. A semisupervised Gaussian process model associated with the similar known reactions then provides the probability estimate. The Gaussian process model uses information about both the reaction and the enzyme in providing the estimate. These estimates were validated experimentally by the application of the Gaussian process model to a newly identified metabolite in Escherichia coli in order to search for the enzymes catalyzing its associated reactions. Furthermore, we show with several pathway design examples how such ability to assign probability estimates to enzymatic reactions provides the potential to assist in bioengineering applications, providing experimental validation to our proposed approach. To the best of our knowledge, the proposed approach is the first application of Gaussian processes dealing with biological sequences and chemicals, the use of a semisupervised Gaussian process framework is also novel in the context of machine learning applied to bioinformatics. However, the ability of an enzyme to catalyze a reaction depends on the affinity between the substrates of the reaction and the enzyme. This affinity is generally quantified by the Michaelis constant KM
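
    A reduced, fully supervised stand-in for the paper's model: a Gaussian process classifier over fingerprint vectors that returns a catalysis probability. The random bit vectors here are placeholders for real extended-connectivity descriptors, and scikit-learn's GaussianProcessClassifier replaces the authors' semisupervised framework.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Placeholder data: 64-bit fingerprint vectors describing (reaction, enzyme)
# pairs, with labels 1 = enzyme known to catalyze the reaction.
X = rng.integers(0, 2, size=(40, 64)).astype(float)
y = (X[:, :8].sum(axis=1) > 4).astype(int)   # synthetic rule for the toy labels

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=4.0)).fit(X, y)

candidate = rng.integers(0, 2, size=(1, 64)).astype(float)
print("P(enzyme catalyzes reaction) ~", gpc.predict_proba(candidate)[0, 1])
```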

  18. Process improvement and automation in construction: Opposing or complementing approaches?

    OpenAIRE

    Koskela, Lauri

    1992-01-01

    It is widely recognized that there must be wide-ranging changes in construction before automation can be implemented in practice. On the other hand, the innovation rate of construction is rather low, and thus it is unclear, how the steps necessary for automation could be realized. It is argued, that an insufficient attention to process improvement is a major barrier to automation and other technological progress of construction.

  19. Automate The Tax Levy Process (Taxy)

    Data.gov (United States)

    Social Security Administration — This data store contains information to support the automation of Tax Levy payments. Data includes but is not limited to Title II benefits adjustment data, as well...

  20. Integrated Automation System for Rare Earth Countercurrent Extraction Process

    Institute of Scientific and Technical Information of China (English)

    柴天佑; 杨辉

    2004-01-01

    The low automation level in industrial rare-earth extraction processes results in high production cost, inconsistent product quality and great consumption of resources in China. An integrated automation system for the rare-earth extraction process is proposed to realize optimal product indices, such as product purity, recycle rate and output. The optimal control strategy for output components, and the structure and function of the two-graded integrated automation system composed of the process management grade and the process control grade, are discussed. This system has been successfully applied to a HAB yttrium extraction production process and was found to provide optimal control, optimal operation, optimal management and remarkable benefits.

  1. Automated Monitoring System for Waste Disposal Sites and Groundwater

    Energy Technology Data Exchange (ETDEWEB)

    S. E. Rawlinson

    2003-03-01

    A proposal submitted to the U.S. Department of Energy (DOE), Office of Science and Technology, Accelerated Site Technology Deployment (ASTD) program to deploy an automated monitoring system for waste disposal sites and groundwater, herein referred to as the ''Automated Monitoring System,'' was funded in fiscal year (FY) 2002. This two-year project included three parts: (1) deployment of cellular telephone modems on existing dataloggers, (2) development of a data management system, and (3) development of Internet accessibility. The proposed concept was initially (in FY 2002) to deploy cellular telephone modems on existing dataloggers and partially develop the data management system at the Nevada Test Site (NTS). This initial effort included both Bechtel Nevada (BN) and the Desert Research Institute (DRI). The following year (FY 2003), cellular modems were to be similarly deployed at Sandia National Laboratories (SNL) and Los Alamos National Laboratory (LANL), and the early data management system developed at the NTS was to be brought to those locations for site-specific development and use. Also in FY 2003, additional site-specific development of the complete system was to be conducted at the NTS. To complete the project, certain data, depending on site-specific conditions or restrictions involving distribution of data, were to made available through the Internet via the DRI/Western Region Climate Center (WRCC) WEABASE platform. If the complete project had been implemented, the system schematic would have looked like the figure on the following page.

  2. An online mass-based gas analyser for simultaneous determination of H2, CH4, CO, N2 and CO2: an automated sensor for process monitoring in industry

    International Nuclear Information System (INIS)

    An automated gas analyser has been designed, constructed and installed for the online monitoring of H2, CH4, CO and CO2 in the reduction plant at Mobarakeh Steel Company, Iran. A small, low-resolution mass spectrometer is used in this instrument. The analyser accepts the sample directly at ambient pressure. Mass spectra are converted to percentages of the species in the mixture based on a derived mathematical expression and specially developed software. The instrument is capable of simultaneously analyzing six different gas inputs. The instrument recorded a precision level of below 3%. (paper)
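
    Converting a low-resolution mass spectrum to species percentages is, in essence, solving a linear mixing model: the measured intensities are approximately S·x, where the columns of S are calibration spectra of the pure gases. The paper's derived expression is not given, so the sketch below uses a generic least-squares solution with invented calibration numbers.

```python
import numpy as np

# Columns: calibration spectra of pure H2, CH4, CO, CO2 at four m/z channels
# (e.g. 2, 16, 28, 44); all values are invented for illustration.
S = np.array([
    [1.00, 0.02, 0.00, 0.00],   # m/z 2
    [0.00, 1.00, 0.01, 0.02],   # m/z 16
    [0.00, 0.05, 1.00, 0.11],   # m/z 28 (CO; a CO2 fragment overlaps here)
    [0.00, 0.00, 0.00, 1.00],   # m/z 44
])
measured = np.array([0.55, 0.21, 0.30, 0.08])   # one spectrum from the plant

x, *_ = np.linalg.lstsq(S, measured, rcond=None)
x = np.clip(x, 0, None)                          # concentrations can't be negative
percent = 100 * x / x.sum()
for gas, p in zip(["H2", "CH4", "CO", "CO2"], percent):
    print(f"{gas}: {p:.1f} %")
```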

  3. Agents and Daemons, automating Data Quality Monitoring operations

    Science.gov (United States)

    Lopera, Luis I.; DQM Group

    2012-12-01

    Since 2009, when the LHC came back to active service, the Data Quality Monitoring (DQM) team has been faced with the need to homogenize and automate operations across all the different environments within which DQM is used. The main goal of automation is to reduce operator intervention to the minimum possible level, especially in the area of DQM file management, where long-term archival presented the greatest challenges. Manually operated procedures cannot cope with the constant increase in luminosity, datasets and uptime of the CMS detector. Therefore a solid and reliable set of sophisticated scripts, the agents, has been designed from the beginning to manage all DQM-related workflows. This allows all available resources to be fully exploited in every condition, maximizing performance and reducing the latency in making data available for validation and certification. The agents can be easily fine-tuned to adapt to current and future hardware constraints and have proved to be flexible enough to include unforeseen features, like ad-hoc quota management and a real-time sound alarm system.

  4. An overview of the Environmental Monitoring Computer Automation Project

    International Nuclear Information System (INIS)

    The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, analytical methods and results obtained are important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of the EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS

  5. Automated medical diagnosis with fuzzy stochastic models: monitoring chronic diseases.

    Science.gov (United States)

    Jeanpierre, Laurent; Charpillet, François

    2004-01-01

    As the world population ages, the patients-per-physician ratio keeps increasing. This is even more important in the domain of chronic pathologies, where people are usually monitored for years and need regular consultations. To address this problem, we propose an automated system to monitor a patient population, detecting anomalies in instantaneous data and in their temporal evolution so that it can alert physicians. By handling the population of healthy patients autonomously and by drawing the physicians' attention to patients at risk, the system allows physicians to spend comparatively more time with patients who need their services. In such a system, the interaction between the patients, the diagnosis module, and the physicians is very important. We have based this system on a combination of stochastic models, fuzzy filters, and strong medical semantics. We focused on a particular tele-medicine application: the Diatelic project. Its objective is to monitor chronic kidney-insufficient patients and to detect hydration troubles. For two years, physicians from the ALTIR conducted a prospective randomized study of the system. This experiment clearly shows that the proposed system is really beneficial to the patients' health. PMID:15520535

  6. Automated Monitoring System for Fall Detection in the Elderly

    Directory of Open Access Journals (Sweden)

    Shadi Khawandi

    2010-12-01

    Full Text Available Falls are a major problem for elderly people living independently. According to the World Health Organization, falls and the injuries they cause are the third cause of chronic disability. In recent years there have been many commercial solutions aimed at automatic and non-automatic detection of falls, like the social alarm (a wrist watch with a button that is activated by the subject in case of a fall event) and wearable fall detectors based on combinations of accelerometers and tilt sensors. Critical problems are associated with those solutions: the button is often unreachable after the fall, wearable devices produce many false alarms, and old people tend to forget to wear them frequently. To solve these problems, we propose an automated monitoring system that detects the face of the person, extracts features such as speed, and determines whether a human fall has occurred. An alarm is triggered immediately upon detection of a fall.
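
    A minimal sketch of the proposed idea: detect the face, track its vertical speed, and raise an alarm when the drop is too fast. It uses OpenCV's stock Haar cascade; the speed threshold is an illustrative assumption, not a clinically validated value.

```python
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)                      # webcam monitoring the room
fps = cap.get(cv2.CAP_PROP_FPS) or 25.0
prev_y, FALL_SPEED = None, 250.0               # pixels/second, assumed threshold

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.2, 5)
    if len(faces):
        x, y, w, h = faces[0]
        cy = y + h / 2                         # vertical centre of the face
        if prev_y is not None and (cy - prev_y) * fps > FALL_SPEED:
            print("ALARM: possible fall detected")   # trigger alert immediately
        prev_y = cy
    cv2.imshow("monitor", frame)
    if cv2.waitKey(1) == 27:                   # Esc quits
        break
cap.release()
```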

  7. Wideband impedance spectrum analyzer for process automation applications

    Science.gov (United States)

    Doerner, Steffen; Schneider, Thomas; Hauptmann, Peter R.

    2007-10-01

    For decades impedance spectroscopy has been used in technical laboratories and research departments to investigate effects or material characteristics that affect the impedance spectrum of a sensor. Establishing this analytical approach for process automation and stand-alone applications will deliver additional and valuable information beside traditional measurement techniques such as the measurement of temperature, flow rate, and conductivity, among others. As yet, most current impedance analysis methods are suited for laboratory applications only, since they involve stand-alone network analyzers that are slow, expensive, large, or immobile. Furthermore, those systems offer a large range of functionality that is not used in process control and other fields of application. We developed a sensor interface based on high-speed direct digital signal processing offering wideband impedance spectrum analysis with high resolution for frequency adjustment, excellent noise rejection, a very high measurement rate, and convenient data exchange to common interfaces. The electronics have been implemented on two small circuit boards, and the system is well suited for process control applications such as monitoring phase transitions, characterization of fluidic systems, and control of biological processes. The impedance spectrum analyzer can be customized easily for different measurement applications by adapting the appropriate sensor module. It has been tested for industrial applications, e.g., dielectric spectroscopy and high-temperature gas analysis.

  8. Automation in a material processing/storage facility

    International Nuclear Information System (INIS)

    The Savannah River Site (SRS) is currently developing a new facility, the Actinide Packaging and Storage Facility (APSF), to process and store legacy materials from the United States nuclear stockpile. A variety of materials, with a variety of properties, packaging and handling/storage requirements, will be processed and stored at the facility. Since these materials are hazardous and radioactive, automation will be used to minimize worker exposure. Other benefits derived from automation of the facility include increased throughput capacity and enhanced security. The diversity of materials and packaging geometries to be handled poses challenges to the automation of facility processes. In addition, the nature of the materials to be processed underscores the need for safety, reliability and serviceability. The application of automation in this facility must, therefore, be accomplished in a rational and disciplined manner to satisfy the strict operational requirements of the facility. Among the functions to be automated are the transport of containers between process and storage areas via an Automatic Guided Vehicle (AGV), and various processes in the Shipping Package Unpackaging (SPU) area, the Accountability Measurements (AM) area, the Special Isotope Storage (SIS) vault and the Special Nuclear Materials (SNM) vault. Other areas of the facility are also being automated, but are outside the scope of this paper

  9. Monitoring and control of the Rossendorf research reactor using a microcomputerized automation system

    International Nuclear Information System (INIS)

    A decentralized hierarchical information system (HIS) is presented, which has been developed for monitoring and control of the Rossendorf Research Reactor RFR but may also be considered a prototype of a digital automation system (AS) for use in power stations. The functions integrated in the HIS are: process monitoring, process control, and the use of a specialized industrial robot to control the charging and discharging of the materials to be irradiated. The AS is realized on the basis of the process computer system PRA 30 (A 6492) developed in the GDR, comprising a K 1630 computer and the intelligent process terminals ursadat 5000 connected by a fast serial interface (IFLS). (author)

  10. Process monitoring by display devices

    International Nuclear Information System (INIS)

    The use of extensive automation, regulating, protection and limiting devices and the application of ergonomic principles (e.g. the increased use of mimic diagrams) has led to plants being capable of continued operation. German nuclear power stations hold a top position worldwide as regards safety and availability. However, renewed efforts are required to overcome the lack of clarity caused by the large number and miniaturization of display elements. One attempt to do this with conventional technology is a mimic board, which was provided in a power station just being commissioned. Such mimic boards give the opportunity of monitoring the most important parameters at a glance, but the large space required limits their use. The use of VDU screens represents one possible solution to this problem. (orig./DG)

  11. AUTOMATED DEPLOYMENT PROCESS WITHIN ENTERPRISE SOLUTIONS : Case Episerver

    OpenAIRE

    Heinänen, Michael

    2016-01-01

    This research focused on studying the concept of automated deployment in Web-hosted applications. The work, conducted within Episerver, had three objectives: to reduce deployment times, cost and dependency on managed services engineers; to introduce a more reliable deployment solution with the current infrastructure in order to minimize human error; and to develop an agile and secure automated deployment process for the case company. The research presents a fully functional deplo...

  12. Industrial internet and its role in process automation

    OpenAIRE

    Solovyev, Anatoly

    2016-01-01

    Modern process automation is undergoing a major shift in the way it addresses conventional challenges. Moreover, it is adapting to newly arising challenges due to changing business scenarios. Areas of automation that until recently were rather separate are starting to merge, and the border between them is fading. This situation only adds to the struggle in the already highly competitive production industry. In order to be successful, companies should adopt new approaches to the w...

  13. Process Monitoring for Nuclear Safeguards

    International Nuclear Information System (INIS)

    Process Monitoring has long been used to evaluate industrial processes and operating conditions in nuclear and non-nuclear facilities. In nuclear applications there is a recognized need to demonstrate the safeguards benefits from using advanced process monitoring on spent fuel reprocessing technologies and associated facilities, as a complement to nuclear materials accounting. This can be accomplished by: defining credible diversion pathway scenarios as a sample problem; using advanced sensor and data analysis techniques to illustrate detection capabilities; and formulating 'event detection' methodologies as a means to quantify performance of the safeguards system. Over the past 30 years there have been rapid advances and improvement in the technology associated with monitoring and control of industrial processes. In the context of bulk handling facilities that process nuclear materials, modern technology can provide more timely information on the location and movement of nuclear material to help develop more effective safeguards. For international safeguards, inspection means verification of material balance data as reported by the operator through the State to the international inspectorate agency. This verification recognizes that the State may be in collusion with the operator to hide clandestine activities, potentially during abnormal process conditions with falsification of data to mask the removal. Records provided may show material is accounted for even though a removal occurred. Process monitoring can offer additional fidelity during a wide variety of operating conditions to help verify the declaration or identify possible diversions. The challenge is how to use modern technology for process monitoring and control in a proprietary operating environment subject to safeguards inspectorate or other regulatory oversight. Under the U.S. National Nuclear Security Administration's Next Generation Safeguards Initiative, a range of potential safeguards applications

  14. An Automated 476 MHz RF Cavity Processing Facility at SLAC

    CERN Document Server

    McIntosh, P; Schwarz, H

    2003-01-01

    The 476 MHz accelerating cavities currently used at SLAC are those installed on the PEP-II B-Factory collider accelerator. They are designed to operate at a maximum accelerating voltage of 1 MV and are routinely utilized on PEP-II at voltages up to 750 kV. During the summer of 2003, SPEAR3 will undergo a substantial upgrade, part of which will be to replace the existing 358.54 MHz RF system with essentially a PEP-II high energy ring (HER) RF station operating at 476.3 MHz and 3.2 MV (or 800 kV/cavity). Prior to installation, cavity RF processing is required to prepare the cavities for use. A dedicated high-power test facility is employed at SLAC to provide the capability of conditioning each cavity up to the required accelerating voltage. An automated LabVIEW-based interface controls and monitors various cavity and test stand parameters, increasing the RF fields accordingly such that stable operation is finally achieved. This paper describes the high power RF cavity processing facility, highlighting the features of t...
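    The conditioning sequence being automated can be pictured as an interlocked ramp: raise the cavity voltage in small steps and back off whenever a monitored parameter goes out of bounds. The sketch below illustrates the pattern with invented I/O stubs and limits; the actual system is a LabVIEW application driving real hardware:

    ```python
    # Sketch of an automated RF-conditioning ramp with a vacuum interlock.
    # read_vacuum()/set_voltage() stand in for real instrument I/O.
    import random
    import time

    TARGET_KV, STEP_KV = 800.0, 5.0
    VACUUM_TRIP = 5e-8          # hypothetical pressure limit (mbar)

    def read_vacuum() -> float:
        return random.uniform(1e-9, 8e-8)        # placeholder gauge readout

    def set_voltage(kv: float) -> None:
        print(f"cavity voltage -> {kv:.0f} kV")  # placeholder RF drive

    kv = 0.0
    while kv < TARGET_KV:
        if read_vacuum() > VACUUM_TRIP:
            kv = max(0.0, kv - 2 * STEP_KV)   # back off, let pressure recover
            set_voltage(kv)
            time.sleep(1.0)
            continue
        kv = min(TARGET_KV, kv + STEP_KV)     # otherwise keep ramping
        set_voltage(kv)
        time.sleep(0.2)
    print("conditioning target reached")
    ```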

  15. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.;

    2009-01-01

    ...data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management. The combined solution provides a cost-efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually.

  16. Monitored Retrievable Storage/Multi-Purpose Canister analysis: Simulation and economics of automation

    International Nuclear Information System (INIS)

    Robotic automation is examined as a possible alternative to manual spent nuclear fuel, transport cask and Multi-Purpose Canister (MPC) handling at a Monitored Retrievable Storage (MRS) facility. Automation of key operational aspects of the MRS/MPC system is analyzed to determine equipment requirements, throughput times and equipment costs. The economic and radiation dose impacts resulting from this automation are compared to manual handling methods

  17. Spray automated balancing of rotors - How process parameters influence performance

    Science.gov (United States)

    Smalley, A. J.; Baldwin, R. M.; Fleming, D. P.; Yuhas, J. S.

    1989-01-01

    This paper addresses the application of spray-automated balancing of rotors, and the influence that various operating parameters will have on balancing performance. Spray-automated balancing uses the fuel-air repetitive explosion process to imbed short, discrete bursts of high velocity, high temperature powder into a rotating part at an angle selected to reduce unbalance of the part. The shortness of the burst, the delay in firing of the gun, the speed of the disk and the variability in speed all influence the accuracy and effectiveness of the automated balancing process. The paper evaluates this influence by developing an analytical framework and supplementing the analysis with empirical data obtained while firing the gun at a rotating disk. Encouraging results are obtained, and it is shown that the process should perform satisfactorily over a wide range of operating parameters. Further experimental results demonstrate the ability of the method to reduce vibration levels induced by mass unbalance in a rotating disk.
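    One effect analyzed here, the smearing of a correction burst over an arc of the spinning disk, has a simple closed form: averaging a unit correction vector over an arc theta = omega * dt reduces its effective magnitude by sin(theta/2)/(theta/2). A worked example with assumed numbers:

    ```python
    # Effectiveness of a correction burst smeared over an arc of rotation:
    # averaging a unit vector over arc theta scales its magnitude by
    # sin(theta/2) / (theta/2).
    import math

    rpm = 3000.0            # assumed disk speed
    burst_ms = 2.0          # assumed burst duration

    omega = 2 * math.pi * rpm / 60.0          # rad/s
    theta = omega * burst_ms / 1000.0         # arc swept during the burst (rad)
    eff = math.sin(theta / 2) / (theta / 2)   # fraction that actually corrects

    print(f"arc swept: {math.degrees(theta):.1f} deg, effectiveness: {eff:.3f}")
    # 3000 rpm and 2 ms -> 36 deg of arc, ~98% effective; at 20 ms the burst
    # spans a full revolution and contributes almost nothing.
    ```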

  18. Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS

    Science.gov (United States)

    Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    We present views and analysis of the execution of several PVM (Parallel Virtual Machine) codes for Computational Fluid Dynamics on networks of Sparcstations, including: (1) NAS Parallel Benchmarks CG and MG; (2) a multi-partitioning algorithm for NAS Parallel Benchmark SP; and (3) an overset grid flowsolver. These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We will describe the architecture, operation and application of AIMS. The AIMS toolkit contains: (1) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (2) Monitor, a library of runtime trace-collection routines; (3) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (4) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran 77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (1) the impact of long message latencies; (2) the impact of multiprogramming overheads and associated load imbalance; (3) cache and virtual-memory effects; and (4) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs. Some of the features planned for the near future include: (1) ConfigView, showing the physical topology
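    The timestamp-adjustment idea, shifting a child's clock just enough that no message appears to arrive before it was sent, can be sketched as a constant-offset correction (a simplification of AIMS' actual algorithm):

    ```python
    # Sketch of constant-skew compensation between a parent and a child clock:
    # shift all child timestamps by the smallest offset that makes every
    # parent->child message arrive no earlier than it was sent.
    def skew_offset(messages):
        """messages: (send_ts_parent, recv_ts_child) pairs for one child."""
        return max(send - recv for send, recv in messages)

    def compensate(child_events, messages):
        offset = max(0.0, skew_offset(messages))
        return [t + offset for t in child_events]

    # Example: the child's clock lags, so one message seems to go back in time.
    msgs = [(10.00, 9.95), (12.00, 12.10), (15.00, 14.90)]
    print(skew_offset(msgs))                 # 0.10 s shift removes violations
    print(compensate([9.95, 14.90], msgs))   # child events on corrected clock
    ```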

  19. The Automated Discovery of Hybrid Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Reijers, Hajo; Maggi, Fabrizio Maria

    2014-01-01

    The declarative-procedural dichotomy is highly relevant when choosing the most suitable process modeling language to represent a discovered process. Less-structured processes with a high level of variability can be described in a more compact way using a declarative language. By contrast, procedu...

  20. An automated digital imaging system for environmental monitoring applications

    Science.gov (United States)

    Bogle, Rian; Velasco, Miguel; Vogel, John

    2013-01-01

    Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.

  1. An automated data system for the monitoring of RBMK-1500 reactors: Current status and potential

    International Nuclear Information System (INIS)

    The automated monitoring systems of the Ignalina nuclear power plant's RBMK-1500 reactor include a computing system which collects, logs, processes and presents to the reactor operator data on process operations and the status of various plant systems. Data obtained by the system through complex logical processing of measurement results and model calculations, and by comparison of the measured parameters with the calculated settings stored in the computing system, are presented on mimic panels, colour and monochromatic displays, and printouts. The system is a multilevel multimachine complex which has a hierarchical structure with functional and topological decentralization. The approach adopted in designing a flexible computing system, the modular style of its software, and the distributed database facilitate the updating of the functional and technical structure of the system. The updating being undertaken provides for an expansion of functions for monitoring compliance with operating regulations, data presentation, and solutions to a range of other problems connected with increasing the operational safety of the reactor. The experience acquired in creating the Ignalina plant's computer system is being used to develop a new automated monitoring and control system for pressure tube reactors, based on promising computing facilities. Different versions of the system are being examined which optimize the combination of analogue and digital facilities in a manner which eases the operator's burden with regard to evaluating the status of the reactor unit and taking reactor control decisions. The design takes into account the fact that the improvement of microprocessors and their incorporation in monitoring and plant control and safety systems and equipment will in the future enable systems which are at present functionally and instrumentally separate to be closely integrated. (author). 1 fig

  2. The Automated Discovery of Hybrid Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Reijers, Hajo; Maggi, Fabrizio Maria

    2014-01-01

    technique for discovering from an event log a so-called hybrid process model. A hybrid process model is hierarchical, where each of its sub-processes may be specified in a declarative or procedural fashion. We have implemented the proposed approach as a plug-in of the ProM platform. To evaluate the approach, we used our plug-in to mine a real-life log from a financial context.

  3. Initial Flight Results for an Automated Satellite Beacon Health Monitoring Network

    OpenAIRE

    Young, Anthony; Kitts, Christopher; Neumann, Michael; Mas, Ignacio; Rasay, Mike

    2010-01-01

    Beacon monitoring is an automated satellite health monitoring architecture that combines telemetry analysis, periodic low data rate message broadcasts by a spacecraft, and automated ground reception and data handling in order to implement a cost-effective anomaly detection and notification capability for spacecraft missions. Over the past two decades, this architecture has been explored and prototyped for a range of spacecraft mission classes to include use on NASA deep space probes, military...

  4. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  5. Model-Based Test Automation Strategies for Data Processing Systems

    OpenAIRE

    Di Nardo, Daniel

    2016-01-01

    Data processing software is an essential component of systems that aggregate and analyse real-world data, thereby enabling automated interaction between such systems and the real world. In data processing systems, inputs are often big and complex files that have a well-defined structure, and that often have dependencies between several of their fields. Testing of data processing systems is complex. Software engineers, in charge of testing these systems, have to handcraft complex data files of...

  6. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino;

    2014-01-01

    assembly to disassembly, from aerospace to the food industry, from textile to logistics) are discussed. Finally, the most recent research is reviewed in order to introduce the new trends in grasping. They provide an outlook on the future of both grippers and robotic hands in automated production processes.

  7. Automated Tow Placement Processing and Characterization of Composites

    Science.gov (United States)

    Prabhakaran, R.

    2004-01-01

    One of the initial objectives of the project was automated tow placement (ATP), in which a robot is used to place a collimated band of pre-impregnated ribbons or a wide preconsolidated tape onto a tool surface. It was proposed to utilize the Automated Tow Placement machine that was already available and to fabricate carbon fiber reinforced PEEK (polyether-ether-ketone) matrix composites. After initial experiments with the fabrication of flat plates, composite cylinders were to be fabricated. Specimens from the fabricated parts were to be tested for mechanical characterization. A second objective was to conduct various types of tests for characterizing composite specimens cured by different fabrication processes.

  8. Knowledge Automation How to Implement Decision Management in Business Processes

    CERN Document Server

    Fish, Alan N

    2012-01-01

    A proven decision management methodology for increased profits and lowered risks Knowledge Automation: How to Implement Decision Management in Business Processes describes a simple but comprehensive methodology for decision management projects, which use business rules and predictive analytics to optimize and automate small, high-volume business decisions. It includes Decision Requirements Analysis (DRA), a new method for taking the crucial first step in any IT project to implement decision management: defining a set of business decisions and identifying all the information-business knowledge

  9. Automated Solution for Data Monitoring (Dashboard) of ASIC Design Flow

    OpenAIRE

    Kariyappa B S, Aravind, Dhananjaya A, Vineet Puri

    2013-01-01

    Application Specific Integrated Circuit (ASIC) design flow consists of several steps involving Electronic Design Automation (EDA) tools. For an ASIC designer it is very important to know the status of design development. Finding the status of the actual design is currently manual work. It is difficult to track the status and error information using the log/report files generated by the tools at different stages of the design flow. Therefore it is necessary to develop an automated tool to solve t...

  10. Automation of Extraction Chromatograhic and Ion Exchange Separations for Radiochemical Analysis and Monitoring

    International Nuclear Information System (INIS)

    Radiochemical analysis, complete with the separation of radionuclides of interest from the sample matrix and from other interfering radionuclides, is often an essential step in the determination of the radiochemical composition of a nuclear sample or process stream. Although some radionuclides can be determined nondestructively by gamma spectroscopy, where the gamma rays penetrate significant distances in condensed media and the gamma ray energies are diagnostic for specific radionuclides, other radionuclides that may be of interest emit only alpha or beta particles. For these, samples must be taken for destructive analysis and radiochemical separations are required. For process monitoring purposes, the radiochemical separation and detection methods must be rapid so that the results will be timely. These results could be obtained by laboratory analysis or by radiochemical process analyzers operating on-line or at-site. In either case, there is a need for automated radiochemical analysis methods to provide speed, throughput, safety, and consistent analytical protocols. Classical methods of separation used during the development of nuclear technologies, namely manual precipitations, solvent extractions, and ion exchange, are slow and labor intensive. Fortunately, the convergence of digital instrumentation for preprogrammed fluid manipulation and the development of new separation materials for column-based isolation of radionuclides has enabled the development of automated radiochemical analysis methodology. The primary means for separating radionuclides in solution are liquid-liquid extraction and ion exchange. These processes are well known and have been reviewed in the past [1]. Ion exchange is readily employed in column formats. Liquid-liquid extraction can also be implemented in column formats using solvent-impregnated resins as extraction chromatographic materials. The organic liquid extractant is immobilized in the pores of a microporous polymer material. Under

  11. Automated high-volume aerosol sampling station for environmental radiation monitoring

    International Nuclear Information System (INIS)

    An automated high-volume aerosol sampling station, known as CINDERELLA.STUK, for environmental radiation monitoring has been developed by the Radiation and Nuclear Safety Authority (STUK), Finland. The sample is collected on a glass fibre filter (mounted in a cassette); the airflow through the filter is 800 m3/h at maximum. During the sampling, the filter is continuously monitored with NaI scintillation detectors. After the sampling, the large filter is automatically cut into 15 pieces that form a small sample and, after ageing, the pile of filter pieces is moved onto an HPGe detector. These actions are performed automatically by a robot. The system is operated at a duty cycle of 1 d sampling, 1 d decay and 1 d counting. Minimum detectable concentrations of radionuclides in air are typically 1-10 x 10-6 Bq/m3. The station is equipped with various sensors to reveal unauthorized admittance. These sensors can be monitored remotely in real time via the Internet or telephone lines. The processes and operation of the station are monitored and partly controlled by computer. The present approach fulfils the requirements of the CTBTO for aerosol monitoring. The concept suits well for nuclear material safeguards, too.
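    The quoted detection limits can be related to counting conditions through the commonly used Currie formula. The record does not give the station's parameters, so the values below are assumptions chosen only to show that the arithmetic lands in the quoted range:

    ```python
    # Minimum detectable concentration (MDC) from the Currie formula:
    #   MDA = (2.71 + 4.65 * sqrt(B)) / (eff * yield * t_count)   [Bq]
    #   MDC = MDA / V_air                                         [Bq/m3]
    import math

    def mdc(bkg_counts, eff, gamma_yield, t_count_s, volume_m3):
        """Currie minimum detectable concentration in Bq/m3."""
        mda_bq = (2.71 + 4.65 * math.sqrt(bkg_counts)) / (
            eff * gamma_yield * t_count_s)
        return mda_bq / volume_m3

    # Assumed example: 1 d of counting after sampling at 800 m3/h for 1 d.
    c = mdc(bkg_counts=5000, eff=0.05, gamma_yield=0.85,
            t_count_s=86400, volume_m3=800 * 24)
    print(f"MDC = {c:.1e} Bq/m3")   # ~5e-06, consistent with the quoted range
    ```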

  12. Novel automated process for aspheric surfaces

    Science.gov (United States)

    Bingham, Richard G.; Walker, David D.; Kim, Do-Hyung; Brooks, David; Freeman, Richard; Riley, Darren

    2000-10-01

    We report on the development of a novel industrial process, embodied in a new robotic polishing machine, for automatically grinding and polishing aspheric optics. The machine is targeted at meeting the growing demand for inexpensive axially symmetric but aspherical lenses and mirrors for industry and science, non-axisymmetric and conformal optics of many kinds, the planarization of silicon wafers and associated devices, and for controlling form and texture in other artifacts including prosthetic joints. We describe both the physics and the implementation of the process. It is based on an innovative pressurized tool of variable effective size, spun to give a high removal rate. The tool traverse and orientation are orchestrated in a unique (and patented) way to avoid completely the characteristic fast peripheral-velocity and center-zero left by conventional spinning tools. The pressurized tooling supports loose abrasive grinding and polishing, plus a new bound-abrasive grinding process, providing for a wide range of work from coarse profiling to fine polishing and figuring. Finally we discuss the critical control, data handling and software challenges in the implementation of the process, contrast the approach with alternative technologies, and present preliminary results of polishing trials.

  13. Monitoring cognitive function and need with the automated neuropsychological assessment metrics in Decompression Sickness (DCS) research

    Science.gov (United States)

    Nesthus, Thomas E.; Schiflett, Sammuel G.

    1993-01-01

    Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM) developed through the Office of Military Performance Assessment Technology (OMPAT) was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables as well as RMS error were collected. An automated mood survey provided 'state' information on six scales including anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.

  14. Monitoring of polymer melt processing

    International Nuclear Information System (INIS)

    The paper reviews the state-of-the-art of in-line and on-line monitoring during polymer melt processing by compounding, extrusion and injection moulding. Different spectroscopic and scattering techniques as well as conductivity and viscosity measurements are reviewed and compared concerning their potential for different process applications. In addition to information on chemical composition and state of the process, the in situ detection of morphology, which is of specific interest for multiphase polymer systems such as polymer composites and polymer blends, is described in detail. For these systems, the product properties strongly depend on the phase or filler morphology created during processing. Examples for optical (UV/vis, NIR) and ultrasonic attenuation spectra recorded during extrusion are given, which were found to be sensitive to the chemical composition as well as to size and degree of dispersion of micro or nanofillers in the polymer matrix. By small-angle light scattering experiments, process-induced structures were detected in blends of incompatible polymers during compounding. Using conductivity measurements during extrusion, the influence of processing conditions on the electrical conductivity of polymer melts with conductive fillers (carbon black or carbon nanotubes) was monitored. (topical review)

  15. Seismic monitoring of geomorphic processes

    Science.gov (United States)

    Burtin, A.; Hovius, N.; Turowski, J. M.

    2014-12-01

    In seismology, the signal is usually analysed for earthquake data, but these represent less than 1% of continuous recordings. The remaining data are considered seismic noise and were for a long time ignored. Over the past decades, the analysis of seismic noise has constantly increased in popularity, and this has led to the development of new approaches and applications in geophysics. The study of continuous seismic records is now open to other disciplines, like geomorphology. The motion of mass at the Earth's surface generates seismic waves that are recorded by nearby seismometers and can be used to monitor its transfer through the landscape. Surface processes vary in nature, mechanism and magnitude, and in space and time, and this variability can be observed in the seismic signals. This contribution aims to give an overview of the development and current opportunities for the seismic monitoring of geomorphic processes. We first describe the common principles of seismic signal monitoring and introduce time-frequency analysis for the purpose of identification and differentiation of surface processes. Second, we present techniques to detect, locate and quantify geomorphic events. Third, we review the diverse layouts of seismic arrays and highlight their advantages and limitations for specific processes, like slope or channel activity. Finally, we illustrate all these characteristics with the analysis of seismic data acquired in a small debris-flow catchment where geomorphic events show interactions and feedbacks. Further developments must aim to fully understand the richness of the continuous seismic signals, to better quantify the geomorphic activity and to improve the performance of warning systems. Seismic monitoring may ultimately allow the continuous survey of erosion and transfer of sediments in the landscape on the scales of external forcing.
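    The time-frequency analysis mentioned as the basic identification tool can be sketched with a standard spectrogram; below, scipy is applied to a synthetic trace with an injected high-frequency burst standing in for a debris-flow signature (all parameters are assumed):

    ```python
    # Sketch: time-frequency analysis of a continuous seismic trace with a
    # spectrogram, the basic tool for separating surface-process signatures.
    import numpy as np
    from scipy.signal import spectrogram

    fs = 100.0                          # assumed sampling rate (Hz)
    t = np.arange(0, 600, 1 / fs)       # 10 minutes of synthetic data
    trace = 0.1 * np.random.randn(t.size)
    burst = (t > 300) & (t < 320)       # synthetic 20 s high-energy event
    trace[burst] += np.sin(2 * np.pi * 15 * t[burst])  # energy near 15 Hz

    f, seg_t, sxx = spectrogram(trace, fs=fs, nperseg=512, noverlap=256)
    band = (f >= 10) & (f <= 30)        # band where such events are expected
    power = sxx[band].mean(axis=0)
    events = seg_t[power > 10 * np.median(power)]
    print(f"high-energy windows around t = {events.min():.0f}-{events.max():.0f} s")
    ```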

  16. Automated business processes in outbound logistics: An information system perspective

    DEFF Research Database (Denmark)

    Tambo, Torben

    2010-01-01

    This article analyses the potentials and possibilities of changing outbound logistics from highly labour intensive on the information processing side to a more or less fully automated solution. Automation offers advantages in terms of direct labour cost reduction as well as indirect cost reduction... process alignment with highly standardised outbound logistics although serving a vast range of customers and countries. Expressing a number of compliance requirements and associated business processes outlines the design criteria for the information system. Implementation of this design with bespoke ERP... is not a matter of whether the system can or cannot, but a matter of making a technological and economical best fit. Along with the formal implementation issues there is a parallel process focused on mutuality between IT teams, business users, management and external stakeholders in offering relevant...

  17. ECG acquisition and automated remote processing

    CERN Document Server

    Gupta, Rajarshi; Bera, Jitendranath

    2014-01-01

    The book focuses on the remote processing of ECG in the context of telecardiology, an emerging area in the field of biomedical engineering applications. Considering the poor infrastructure and inadequate numbers of physicians in rural healthcare clinics in India and other developing nations, telemedicine services assume special importance. Telecardiology, a specialized area of telemedicine, is taken up in this book considering the importance of cardiac diseases, which are prevalent in the population under discussion. The main focus of this book is to discuss different aspects of ECG acquisition, its remote transmission and computerized ECG signal analysis for feature extraction. It also discusses ECG compression and the application of standalone embedded systems to develop a cost-effective solution for a telecardiology system.

  18. Trust in automation. Part II. Experimental studies of trust and human intervention in a process control simulation.

    Science.gov (United States)

    Muir, B M; Moray, N

    1996-03-01

    Two experiments are reported which examined operators' trust in and use of the automation in a simulated supervisory process control task. Tests of the integrated model of human trust in machines proposed by Muir (1994) showed that models of interpersonal trust capture some important aspects of the nature and dynamics of human-machine trust. Results showed that operators' subjective ratings of trust in the automation were based mainly upon their perception of its competence. Trust was significantly reduced by any sign of incompetence in the automation, even one which had no effect on overall system performance. Operators' trust changed very little with experience, with a few notable exceptions. Distrust in one function of an automatic component spread to reduce trust in another function of the same component, but did not generalize to another independent automatic component in the same system, or to other systems. There was high positive correlation between operators' trust in and use of the automation; operators used automation they trusted and rejected automation they distrusted, preferring to do the control task manually. There was an inverse relationship between trust and monitoring of the automation. These results suggest that operators' subjective ratings of trust and the properties of the automation which determine their trust, can be used to predict and optimize the dynamic allocation of functions in automated systems. PMID:8849495

  19. Automated control system for a mashing process

    Science.gov (United States)

    Teterin, E.; Rudnickiy, V.

    2015-10-01

    The goal of this paper is to describe a system for a mashing process, which is the first step of brewing beer. Mashing is a procedure in which the fermentable (and some non-fermentable) sugars are extracted from malts. The software, based on LabVIEW, is used to control an NI CompactRIO. The main target of the project is to reach predefined temperature levels and maintain them during the rest pauses. When the required rest time has elapsed, the system moves on to the next value. Precise control of the temperatures during the rests is one of the critical factors that define the texture and alcohol content of the beer. The system has two tanks, with PT100 resistance sensors in both of them, a heat exchanger (coil), a heater and a pump. The first tank has a heating element in order to raise the temperature in the other one. The project includes a practical solution with explanations and graphs that demonstrate the working ability of this control system.
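    The control pattern described, ramping to each temperature setpoint and holding it for a programmed rest, is essentially a step sequencer with hysteresis. A minimal sketch with a fake thermal model standing in for the real PT100/heater I/O (the actual code runs in LabVIEW on the CompactRIO; setpoints and rest times below are invented):

    ```python
    # Sketch of a mashing step controller: for each (setpoint, rest) pair,
    # heat up to the setpoint and then hold it with on/off hysteresis control.
    import time

    STEPS = [(52.0, 900), (63.0, 2400), (72.0, 1200), (78.0, 300)]  # (C, s)
    HYST = 0.5  # deg C of hysteresis below the setpoint

    class FakeRig:
        """Stand-in for the PT100 reading and heater relay on the real rig."""
        def __init__(self):
            self.temp, self.on = 20.0, False
        def read_temp(self):
            self.temp += 0.5 if self.on else -0.02   # crude thermal model
            return self.temp
        def heater(self, on):
            self.on = on

    def run_mash(rig, tick=0.0):
        for setpoint, rest_s in STEPS:
            while rig.read_temp() < setpoint:        # ramp to the setpoint
                rig.heater(True)
                time.sleep(tick)
            ticks = 0
            while ticks < rest_s:                    # hold during the rest
                rig.heater(rig.read_temp() < setpoint - HYST)
                ticks += 1
                time.sleep(tick)
            print(f"rest at {setpoint} C complete")
        rig.heater(False)

    run_mash(FakeRig())
    ```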

  20. Automated Science Processing for the Fermi Large Area Telescope

    Science.gov (United States)

    Chiang, James

    2012-03-01

    The Large Area Telescope (LAT) onboard the Fermi γ-ray Space Telescope provides high sensitivity to emission from astronomical sources over a broad energy range (20MeV to >300 GeV) and has substantially improved spatial, energy, and timing resolution compared with previous observatories at these energies [4]. One of the LAT's most innovative features is that it performs continuous monitoring of the gamma-ray sky with all-sky coverage every 3 h. This survey strategy greatly enables the search for transient behavior from both previously known and unknown sources. In addition, the constant accumulation of data allows for increasingly improved measurements of persistent sources. These include the Milky Way Galaxy itself, which produces gamma-ray emission as a result from interactions of cosmic rays with gas in the Galaxy, and potential signals from candidate dark matter particles in the Milky Way and its neighboring galaxies. The automated science processing (ASP) functionality of the Fermi Instrument Science Operations Center (ISOC) is a part of the automated data pipeline that processes the raw data arriving from the spacecraft and puts it into a form amenable to scientific analysis. ASP operates at the end of the pipeline on the processed data and is intended to detect and characterize transient behavior (e.g., short time scale increases or “flares” in the gamma-ray flux) from astronomical sources. On detection of a flaring event, ASP will alert other observatories on a timely basis so that they may train their telescopes on the flaring source in order to detect possible correlated activity in other wavelength bands. Since the data from the LAT is archived and publicly available as soon as it is processed, ASP serves mainly to provide triggers for those follow-up observations; its estimates of the properties of the flaring sources (flux, spectral index, location) need not be the best possible, as subsequent off-line analysis can provide more refined
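    The core "flare" trigger can be illustrated with a simple threshold on binned fluxes; ASP's real pipeline performs likelihood fits, so the sketch below (with synthetic data) conveys only the shape of the idea:

    ```python
    # Sketch of transient detection on binned light-curve data: flag bins
    # whose flux exceeds the long-term level by k standard errors.
    import numpy as np

    def find_flares(flux, flux_err, k=5.0):
        baseline = np.median(flux)                 # robust quiescent level
        return np.where(flux - baseline > k * flux_err)[0]

    rng = np.random.default_rng(0)
    flux = rng.normal(1e-7, 1e-8, size=240)        # 240 bins (e.g. 3 h each)
    flux_err = np.full(240, 1e-8)
    flux[100:103] *= 5                             # injected flare
    print(find_flares(flux, flux_err))             # -> [100 101 102]
    ```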

  1. Mass Spectrometry-Based Monitoring of Millisecond Protein-Ligand Binding Dynamics Using an Automated Microfluidic Platform

    Energy Technology Data Exchange (ETDEWEB)

    Cong, Yongzheng; Katipamula, Shanta; Trader, Cameron D.; Orton, Daniel J.; Geng, Tao; Baker, Erin Shammel; Kelly, Ryan T.

    2016-03-24

    Characterizing protein-ligand binding dynamics is crucial for understanding protein function and developing new therapeutic agents. We have developed a novel microfluidic platform that features rapid mixing of protein and ligand solutions, variable incubation times, and on-chip electrospray ionization to perform label-free, solution-based monitoring of protein-ligand binding dynamics. This platform offers many advantages including automated processing, rapid mixing, and low sample consumption.

  2. An Automated Image Processing System for Concrete Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Baumgart, C.W.; Cave, S.P.; Linder, K.E.

    1998-11-23

    AlliedSignal Federal Manufacturing & Technologies (FM&T) was asked to perform a proof-of-concept study for the Missouri Highway and Transportation Department (MHTD), Research Division, in June 1997. The goal of this proof-of-concept study was to ascertain if automated scanning and imaging techniques might be applied effectively to the problem of concrete evaluation. In the current evaluation process, a concrete sample core is manually scanned under a microscope. Voids (or air spaces) within the concrete are then detected visually by a human operator by incrementing the sample under the cross-hairs of a microscope and by counting the number of "pixels" which fall within a void. Automation of the scanning and image analysis processes is desired to improve the speed of the scanning process, to improve evaluation consistency, and to reduce operator fatigue. An initial, proof-of-concept image analysis approach was successfully developed and demonstrated using acquired black and white imagery of concrete samples. In this paper, the automated scanning and image capture system currently under development will be described and the image processing approach developed for the proof-of-concept study will be demonstrated. A development update and plans for future enhancements are also presented.
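    The pixel-counting step being automated amounts to thresholding and connected-component labelling; here is a minimal sketch with an invented threshold and a synthetic image (the study's actual processing is not specified at this level of detail):

    ```python
    # Sketch: count voids (dark air pockets) in a grayscale concrete image by
    # thresholding and connected-component labelling.
    import numpy as np
    from scipy import ndimage

    def count_voids(image, dark_thresh=60, min_area_px=20):
        mask = image < dark_thresh                # voids image darker than paste
        labels, n = ndimage.label(mask)
        areas = ndimage.sum(mask, labels, range(1, n + 1))
        keep = areas >= min_area_px               # reject speckle noise
        return int(keep.sum()), float(areas[keep].sum() / mask.size)

    img = np.full((200, 200), 150, dtype=np.uint8)   # synthetic bright sample
    img[50:60, 50:60] = 30                           # two synthetic voids
    img[120:140, 100:115] = 20
    n_voids, void_fraction = count_voids(img)
    print(n_voids, f"{void_fraction:.3%}")           # -> 2 voids, ~1.0% area
    ```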

  3. An Automated, Image Processing System for Concrete Evaluation

    International Nuclear Information System (INIS)

    AlliedSignal Federal Manufacturing & Technologies (FM&T) was asked to perform a proof-of-concept study for the Missouri Highway and Transportation Department (MHTD), Research Division, in June 1997. The goal of this proof-of-concept study was to ascertain if automated scanning and imaging techniques might be applied effectively to the problem of concrete evaluation. In the current evaluation process, a concrete sample core is manually scanned under a microscope. Voids (or air spaces) within the concrete are then detected visually by a human operator by incrementing the sample under the cross-hairs of a microscope and by counting the number of "pixels" which fall within a void. Automation of the scanning and image analysis processes is desired to improve the speed of the scanning process, to improve evaluation consistency, and to reduce operator fatigue. An initial, proof-of-concept image analysis approach was successfully developed and demonstrated using acquired black and white imagery of concrete samples. In this paper, the automated scanning and image capture system currently under development will be described and the image processing approach developed for the proof-of-concept study will be demonstrated. A development update and plans for future enhancements are also presented.

  4. The automation of analysis of technological process effectiveness

    Directory of Open Access Journals (Sweden)

    B. Krupińska

    2007-10-01

    Full Text Available Purpose: Improvement of technological processes by the use of technological efficiency analysis can create the basis for their optimization. Informatization and computerization of a wider and wider scope of activity is one of the most important current development trends of an enterprise. Design/methodology/approach: Appointing indicators makes it possible to evaluate process efficiency, which can constitute an optimization basis for a particular operation. The model of technological efficiency analysis is based on particular efficiency indicators that characterize an operation, taking into account the following criteria: operation – material, operation – machine, operation – human, operation – technological parameters. Findings: From the point of view of quality and correctness of the choice of technology, comprehensive assessment of technological processes makes up the basis of technological efficiency analysis. Results of the technological efficiency analysis of a technological process prove that the chosen model of technological efficiency analysis makes it possible to improve the process continuously, and the application of computer assistance makes it possible to automate the efficiency analysis and, finally, the controlled improvement of technological processes. Practical implications: Because of the complexity of technological efficiency analysis, an AEPT computer analysis has been created, which outputs: operation efficiency indicators, with distinguished indicators having minimal acceptable values; efficiency values of the applied samples; and the value of technological process efficiency. Originality/value: The created computer analysis of technological process efficiency (AEPT) makes it possible to automate the process of analysis and optimization.

  5. Linked Data approach for selection process automation in Systematic Reviews

    OpenAIRE

    Torchiano, Marco; Morisio, Maurizio; Tomassetti, Federico Cesare Argentino; Ardito, Luca; Vetro, Antonio; Rizzo, Giuseppe

    2011-01-01

    Background: a systematic review identifies, evaluates and synthesizes the available literature on a given topic using scientific and repeatable methodologies. The significant workload required and the subjectivity bias could affect results. Aim: semi-automate the selection process to reduce the amount of manual work needed and the consequent subjectivity bias. Method: extend and enrich the selection of primary studies using the existing technologies in the field of Linked Data and text mining...

  6. Automated processing of data generated by molecular dynamics

    International Nuclear Information System (INIS)

    A new integrated tool for automated processing of data generated by molecular dynamics packages and programs has been developed. The program makes it possible to calculate important quantities such as the pair correlation function, common-neighbor analysis, counting of nanoparticles and their size distribution, and conversion of output files between different formats. The work explains in detail the modules of the tool and the interfaces between them. The use of the program is illustrated with application examples involving the calculation of various properties of silver nanoparticles. (author)

  7. Supporting Communication and Collaboration in the Process Automation Industry

    OpenAIRE

    Brönmark, Jonas; Åkerlind, Mikaela

    2011-01-01

    This thesis shows new domains for social media applications. More specifically, it explores how communication and collaboration can be supported in the process automation industry. A concept demonstrator was implemented using the Sencha Touch framework. The prototype is based on several identified use cases, and has been tested and evaluated with end users. The design and functionality are inspired by social media applications such as Facebook and Stack Overflow. These kinds of popular social m...

  8. Automated DEM extraction in digital aerial photogrammetry: precisions and validation for mass movement monitoring

    Directory of Open Access Journals (Sweden)

    A. Pesci

    2005-06-01

    Full Text Available Automated procedures for photogrammetric image processing and Digital Elevation Model (DEM) extraction yield high-precision terrain models in a short time, reducing manual editing; their accuracy is strictly related to image quality and terrain features. After an analysis of the performance of the Digital Photogrammetric Workstation (DPW 770 Helava), the paper compares DEMs derived from different surveys and registered in the same reference system. In the case of stable areas, the distribution of height residuals and their mean and standard deviation values indicate that the theoretical accuracy is achievable automatically when the terrain is characterized by regular morphology. Steep slopes, corrugated surfaces, vegetation and shadows can degrade results even if manual editing procedures are applied. The comparison of multi-temporal DEMs of unstable areas allows the monitoring of surface deformation and morphological changes.
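    The multi-temporal comparison reduces to differencing two co-registered elevation grids and examining the residual statistics. A minimal numpy sketch, assuming both DEMs are already registered in the same reference system and using an invented noise level:

    ```python
    # Sketch: compare two co-registered DEMs, summarize height residuals, and
    # flag cells whose change exceeds the expected measurement noise.
    import numpy as np

    def dem_change(dem_t0, dem_t1, sigma_m=0.15, k=3.0):
        resid = dem_t1 - dem_t0
        stats = {"mean": float(np.nanmean(resid)),
                 "std": float(np.nanstd(resid))}
        moving = np.abs(resid) > k * sigma_m        # significant deformation
        return resid, stats, moving

    rng = np.random.default_rng(1)
    dem0 = rng.normal(500, 20, size=(100, 100))       # synthetic first survey
    dem1 = dem0 + rng.normal(0, 0.1, size=dem0.shape) # repeat survey + noise
    dem1[40:50, 40:50] -= 2.0                         # synthetic slide scarp
    _, stats, moving = dem_change(dem0, dem1)
    print(stats, int(moving.sum()), "cells flagged")  # ~100 cells flagged
    ```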

  9. Inter-process handling automating system; Koteikan handling jidoka system

    Energy Technology Data Exchange (ETDEWEB)

    Nakamura, H. [Meidensha Corp., Tokyo (Japan)]

    1994-10-18

    This paper introduces the automation of loading work at a production site using robots. Loading robots are required to perform complex movements; they are used for loading work on processing machines, which requires six degrees of freedom, and for relatively simple palletizing work that can be handled with four degrees of freedom. The `inter-machine handling system` is an automated system built around a ceiling-running robot in which workpiece model identification and positional shift measurement are carried out by image processing. The robot uses the image information to exchange hands automatically as required and clamp a workpiece; it then runs to an M/C to replace the processed workpiece, and puts the M/C-processed workpiece onto a multi-axial dedicated machine. Five processing machines are operated in parallel, with the cycle time matched to that of this handling process, and a processing machine that has finished processing is given handling work in order of priority. As a result, improved productivity and a saving of two workers were achieved simultaneously. 6 figs., 5 tabs.

  10. Integrated safeguards and security for a highly automated process

    International Nuclear Information System (INIS)

    Before the cancellation of the New Production Reactor Programs for the production of tritium, the reactors and associated processing were being designed to contain some of the most highly automated and remote systems conceived for a Department of Energy facility. Integrating safety, security, materials control and accountability (MC and A), and process systems at the proposed facilities would enhance the overall information and protection-in-depth available. Remote, automated fuel handling and assembly/disassembly techniques would deny access to the nuclear materials while upholding ALARA principles but would also require the full integration of all data/information systems. Such systems would greatly enhance MC and A as well as facilitate materials tracking. Physical protection systems would be connected with materials control features to cross check activities and help detect and resolve anomalies. This paper will discuss the results of a study of the safeguards and security benefits achieved from a highly automated and integrated remote nuclear facility and the impacts that such systems have on safeguards and computer and information security

  11. Automation of the electron-beam welding process

    Science.gov (United States)

    Koleva, E.; Dzharov, V.; Kardjiev, M.; Mladenov, G.

    2016-03-01

    In this work, the automatic control of the vacuum and cooling systems of the equipment for electron-beam welding, evaporation and surface modification located at the IE-BAS is considered. A project was elaborated for control and management based on the development of an engineering support system using existing and additional technical means of automation. Optimization of the indicators that are critical for the time needed to reach the working regime and to stop the operation of the installation can be performed using experimentally obtained transient characteristics. The automation of the available equipment, aimed at improving its efficiency and the repeatability of the obtained results, as well as at stabilizing the process parameters, should be integrated into an Engineering Support System which, besides operator supervision, consists of several subsystems for equipment control, data acquisition, information analysis, system management and decision-making support.

  12. Automation of film densitometry for application in personal monitoring

    International Nuclear Information System (INIS)

    In this research work, a semi-automatic densitometry system has been developed for large-scale monitoring services using film badge dosemeters. The system consists of a charge-coupled device (CCD)-based scanner that can scan optical densities (ODs) up to 4.2, a computer vision algorithm to improve the quality of digitised films and an analyser program to calculate the necessary information, e.g. the mean OD of a region of interest and the radiation doses. Two reference films were used for calibration of the system. The Microtek scanner International Color Consortium (ICC) profiler is applied to determine the colour attributes of the scanner accurately, and a reference density step tablet from the Bundesanstalt fuer Materialforschung und -pruefung (BAM) is used to calibrate the automatic conversion of gray-level values to OD values in the range of 0.2-4.0 OD. The system helps achieve more objective and reliable results: a set of 20 films can be digitised at once and their relative doses calculated in less than about 4 min, avoiding the disadvantages of manual processing and enhancing the accuracy of dosimetry. (authors)
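    The automatic gray-level-to-OD conversion can be sketched as monotone interpolation through the measured steps of the reference tablet; the calibration values below are invented for illustration:

    ```python
    # Sketch: calibrate scanner gray levels to optical density (OD) using the
    # measured gray values of a reference density step tablet, then apply it.
    import numpy as np

    # Invented calibration data: tablet steps of known OD and the mean gray
    # level the scanner reports for each (gray falls as density rises).
    tablet_od   = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0])
    tablet_gray = np.array([230, 190, 140, 100,  70,  30,   8])

    def gray_to_od(gray):
        # np.interp needs ascending x, so interpolate on reversed arrays.
        return np.interp(gray, tablet_gray[::-1], tablet_od[::-1])

    roi_gray = np.array([[95, 98], [101, 97]])   # region of interest on a film
    print(f"mean OD = {gray_to_od(roi_gray).mean():.2f}")   # ~1.5
    ```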

  13. Automated separation process for radioanalytical purposes at nuclear power plants.

    Science.gov (United States)

    Nagy, L G; Vajda, N; Vodicska, M; Zagyvai, P; Solymosi, J

    1987-10-01

    Chemical separation processes have been developed to remove the matrix components and thus to determine fission products, especially radioiodine nuclides, in the primary coolant of WWER-type nuclear reactors. Special procedures have been elaborated to enrich long-lived nuclides in waste waters to be released and to separate and enrich caesium isotopes in the environment. All processes are based mainly on ion-exchange separations using amorphous zirconium phosphate. Automated equipment was constructed to meet the demands of the plant personnel for serial analysis. PMID:3680447

  14. Text mining from ontology learning to automated text processing applications

    CERN Document Server

    Biemann, Chris

    2014-01-01

    This book comprises a set of articles that specify the methodology of text mining, describe the creation of lexical resources in the framework of text mining and use text mining for various tasks in natural language processing (NLP). The analysis of large amounts of textual data is a prerequisite to build lexical resources such as dictionaries and ontologies and also has direct applications in automated text processing in fields such as history, healthcare and mobile applications, just to name a few. This volume gives an update in terms of the recent gains in text mining methods and reflects

  15. An automated platform for phytoplankton ecology and aquatic ecosystem monitoring

    NARCIS (Netherlands)

    Pomati, F.; Jokela, J.; Simona, M.; Veronesi, M.; Ibelings, B.W.

    2011-01-01

    High quality monitoring data are vital for tracking and understanding the causes of ecosystem change. We present a potentially powerful approach for phytoplankton and aquatic ecosystem monitoring, based on integration of scanning flow-cytometry for the characterization and counting of algal cells wi

  16. Using Automation to Improve the Flight Software Testing Process

    Science.gov (United States)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control, and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  17. Automated Quality Monitoring and Validation of the CMS Reconstruction Software

    CERN Document Server

    Piparo, Danilo

    2011-01-01

    assessed. The automated procedure adopted by CMS to accomplish this ambitious task and the innovative tools developed for that purpose are presented. The whole chain of steps is illustrated, starting from the application testing over large ensembles of datasets emulating Tier-0, Tier-1 and Tier-2 environments, to the collection of the produced physical quantities in the form of several hundred thousand histograms, to the estimation of their compatibility between releases, to the final production and publication of reports characterised by an ef...

  18. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    Science.gov (United States)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from the detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data-taking runs, streams of messages sent by applications via the message reporting system, together with data published by applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (at an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims at reducing manpower needs and at assuring a constant high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines; in particular it leverages an Event Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. All components work in a loosely coupled, event-based architecture, with a message broker
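    The kind of correlation rule such a CEP engine evaluates can be illustrated with a toy sliding-window query: raise an alert when too many error messages from one host arrive within a time window (the real system uses expert-defined queries on a dedicated engine; names and limits below are invented):

    ```python
    # Toy version of a CEP-style rule: alert when more than N error events
    # from the same host arrive within a sliding time window.
    from collections import defaultdict, deque

    WINDOW_S, MAX_ERRORS = 60.0, 5

    class BurstDetector:
        def __init__(self):
            self.recent = defaultdict(deque)     # host -> error timestamps

        def on_event(self, ts, host, severity):
            if severity != "ERROR":
                return None
            q = self.recent[host]
            q.append(ts)
            while q and ts - q[0] > WINDOW_S:    # drop events outside window
                q.popleft()
            if len(q) > MAX_ERRORS:
                return f"ALERT: {len(q)} errors from {host} in {WINDOW_S:.0f}s"
            return None

    det = BurstDetector()
    for i in range(8):                           # burst of errors, one host
        alert = det.on_event(ts=i * 2.0, host="pc-tdq-001", severity="ERROR")
        if alert:
            print(alert)
    ```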

  19. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    International Nuclear Information System (INIS)

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data-taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update, but in the aggregated behavior over a certain time-line. The AAL project aims to reduce the manpower needs and to assure a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines: it leverages an Event Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. All components work in a loosely coupled, event-based architecture, with a message broker

  20. Automated multi-parameter monitoring of neo-nates

    OpenAIRE

    Gangadharan, V.

    2013-01-01

    Advancements in monitoring technology have led to an increasing amount of physiological data, such as heart rate and oxygen saturation, being accumulated in hospitals. A high rate of false alarms in the neonatal intensive care environment due to inadequate analysis of data highlights the need for an intelligent detection system with improved specificity that provides timely alerts to allow early clinical intervention. Current cot-side monitoring systems analyse data channels independently by ...

  1. QualitySpy: a framework for monitoring software development processes

    Directory of Open Access Journals (Sweden)

    Marian Jureczko

    2012-03-01

    Full Text Available The growing popularity of highly iterative, agile processes creates an increasing need for automated monitoring of the quality of software artifacts, focused on short terms (in the case of the eXtreme Programming process, an iteration can be as short as one week). This paper presents a framework that calculates software metrics and cooperates with development tools (e.g. the source version control system and the issue tracking system) to describe the current state of a software project with regard to its quality. The framework is designed to support a high level of automation of data collection and to be useful for researchers as well as for industry. The framework is still under development; hence the paper reports already implemented features as well as future plans. The first release is scheduled for July.

  2. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  3. Real-time systems implementation of industrial computerized process automation

    CERN Document Server

    Halang, WA

    1992-01-01

    This book represents the first comprehensive text in English on real-time and embedded computing systems. It is addressed to engineering students of universities and polytechnics as well as to practitioners and provides the knowledge required for the implementation of industrial computerized process control and manufacturing automation systems. The book avoids mathematical treatment and supports the relevance of the concepts introduced by practical examples and case studies. Special emphasis is placed on a sound conceptual basis and on methodologies and tools for the development of high qualit

  4. Starting the automation process by using group technology

    Directory of Open Access Journals (Sweden)

    Jorge Andrés García Barbosa

    2010-06-01

    Full Text Available This article describes the start-up of an automation process based on applying group technology (GT). Mecanizados CNC, a company making metallurgical-sector products, bases the layout (organisation and disposition) of its machinery on the concept of manufacturing cells; production is programmed once the best location for the equipment has been determined. The order in which products are made and the appropriate tool set-up for the machinery in the cells are established with the aim of minimising set-up time, leading to a 15% improvement in productivity.

  5. Microsoft Business Solutions-Axapta as a basis for automated monitoring of high technology products competitiveness

    Science.gov (United States)

    Tashchiyan, G. O.; Sushko, A. V.; Grichin, S. V.

    2015-09-01

    The competitiveness of high-technology products is one of the conditions for the normal performance of the Russian economy. Various tools are used to assess these products; one of them is automated monitoring of high-technology products in mechanical engineering. This system is developed on the basis of the "Innovator" software integrated into Microsoft Business Solutions-Axapta.

  6. RAPID AUTOMATED RADIOCHEMICAL ANALYZER FOR DETERMINATION OF TARGETED RADIONUCLIDES IN NUCLEAR PROCESS STREAMS

    International Nuclear Information System (INIS)

    Some industrial process-scale plants require the monitoring of specific radionuclides as an indication of the composition of their feed streams or as indicators of plant performance. In this process environment, radiochemical measurements must be fast, accurate, and reliable. Manual sampling, sample preparation, and analysis of process fluids are highly precise and accurate, but tend to be expensive and slow. Scientists at Pacific Northwest National Laboratory (PNNL) have assembled and characterized a fully automated prototype Process Monitor instrument which was originally designed to rapidly measure Tc-99 in the effluent streams of the Waste Treatment Plant at Hanford, WA. The system is capable of a variety of tasks: extraction of a precise volume of sample, sample digestion/analyte redox adjustment, column-based chemical separations, flow-through radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results can be immediately calculated and electronically reported. It is capable of performing a complete analytical cycle in less than 15 minutes. The system is highly modular and can be adapted to a variety of sample types and analytical requirements. It exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs

  7. Atlas-based multichannel monitoring of functional MRI signals in real-time: automated approach.

    Science.gov (United States)

    Lee, Jong-Hwan; O'Leary, Heather M; Park, Hyunwook; Jolesz, Ferenc A; Yoo, Seung-Schik

    2008-02-01

    We report an automated method to simultaneously monitor blood-oxygenation-level-dependent (BOLD) MR signals from multiple cortical areas in real-time. Individual brain anatomy was normalized and registered to a pre-segmented atlas in standardized anatomical space. Subsequently, using real-time fMRI (rtfMRI) data acquisition, localized BOLD signals were measured and displayed from user-selected areas labeled with anatomical and Brodmann's Area (BA) nomenclature. The method was tested on healthy volunteers during the performance of hand motor and internal speech generation tasks employing a trial-based design. Our data normalization and registration algorithm, along with image reconstruction, movement correction and a data display routine, were executed with the processing and communication bandwidth necessary for real-time operation. Task-specific BOLD signals were observed from the hand motor and language areas. One of the study participants was allowed to freely engage in hand clenching tasks, and the associated brain activities were detected from the motor-related neural substrates without prior knowledge of the task onset time. The proposed method may be applied to applications such as neurofeedback, brain-computer interfaces, and functional mapping for surgical planning, where real-time monitoring of region-specific brain activity is needed. PMID:17370340
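
    To make the atlas-based monitoring step above concrete, the following minimal Python sketch computes the mean BOLD signal of user-selected atlas regions for each incoming volume. It assumes NumPy arrays on a shared voxel grid; the region names and label ids are illustrative, not taken from the paper.

```python
import numpy as np

def roi_means(epi_volume: np.ndarray, atlas_labels: np.ndarray, rois: dict) -> dict:
    """epi_volume and atlas_labels share the same voxel grid.
    rois maps a region name (hypothetical) to its atlas label id."""
    return {name: float(epi_volume[atlas_labels == label].mean())
            for name, label in rois.items()}

# Toy usage: a stream of volumes, e.g. one per TR.
rng = np.random.default_rng(0)
atlas = rng.integers(0, 5, size=(8, 8, 8))          # fake label volume
for t in range(3):
    vol = rng.normal(1000, 10, size=(8, 8, 8))      # fake EPI volume
    print(t, roi_means(vol, atlas, {"BA4": 1, "BA44": 2}))
```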

  8. Silicon Carbide Temperature Monitor Processing Improvements. Status Report

    International Nuclear Information System (INIS)

    Silicon carbide (SiC) temperature monitors are used as temperature sensors in Advanced Test Reactor (ATR) irradiations at the Idaho National Laboratory (INL). Although thermocouples are typically used to provide real-time temperature indication in instrumented lead tests, other indicators, such as melt wires, are also often included in such tests as an independent technique of detecting peak temperatures incurred during irradiation. In addition, less expensive static capsule tests, which have no leads attached for real-time data transmission, often rely on melt wires as a post-irradiation technique for peak temperature indication. Melt wires are limited in that they can only detect whether a single temperature is or is not exceeded. SiC monitors are advantageous because a single monitor can be used to detect a range of temperatures that occurred during irradiation. As part of the process initiated to make SiC temperature monitors available at the ATR, post-irradiation evaluations of these monitors have previously been completed at the High Temperature Test Laboratory (HTTL). INL selected the resistance measurement approach for determining irradiation temperature from SiC temperature monitors because it is considered to be the most accurate measurement. The current process involves the repeated annealing of the SiC monitors at incrementally increasing temperatures, with resistivity measurements made between annealing steps. The process is time-consuming and requires the nearly constant attention of a trained staff member. In addition to the expensive and lengthy post-irradiation analysis required, the current process adds many potential sources of error to the measurement, as the sensor must be repeatedly moved from furnace to test fixture. This time-consuming post-irradiation analysis is a significant portion of the total cost of using these otherwise inexpensive sensors. An additional consideration of this research is that, if the SiC post processing can be automated, it

  9. Silicon Carbide Temperature Monitor Processing Improvements. Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Unruh, Troy Casey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Daw, Joshua Earl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Al Rashdan, Ahamad [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-01-29

    Silicon carbide (SiC) temperature monitors are used as temperature sensors in Advanced Test Reactor (ATR) irradiations at the Idaho National Laboratory (INL). Although thermocouples are typically used to provide real-time temperature indication in instrumented lead tests, other indicators, such as melt wires, are also often included in such tests as an independent technique of detecting peak temperatures incurred during irradiation. In addition, less expensive static capsule tests, which have no leads attached for real-time data transmission, often rely on melt wires as a post-irradiation technique for peak temperature indication. Melt wires are limited in that they can only detect whether a single temperature is or is not exceeded. SiC monitors are advantageous because a single monitor can be used to detect a range of temperatures that occurred during irradiation. As part of the process initiated to make SiC temperature monitors available at the ATR, post-irradiation evaluations of these monitors have previously been completed at the High Temperature Test Laboratory (HTTL). INL selected the resistance measurement approach for determining irradiation temperature from SiC temperature monitors because it is considered to be the most accurate measurement. The current process involves the repeated annealing of the SiC monitors at incrementally increasing temperatures, with resistivity measurements made between annealing steps. The process is time-consuming and requires the nearly constant attention of a trained staff member. In addition to the expensive and lengthy post-irradiation analysis required, the current process adds many potential sources of error to the measurement, as the sensor must be repeatedly moved from furnace to test fixture. This time-consuming post-irradiation analysis is a significant portion of the total cost of using these otherwise inexpensive sensors. An additional consideration of this research is that, if the SiC post processing can be automated, it

  10. Cost Analysis of an Automated and Manual Cataloging and Book Processing System.

    Science.gov (United States)

    Druschel, Joselyn

    1981-01-01

    Cost analysis of an automated network system and a manual system of cataloging and book processing indicates a 20 percent savings using automation. Per unit costs based on the average monthly automation rate are used for comparison. Higher manual system costs are attributed to staff costs. (RAA)

  11. An Automated Acoustic System to Monitor and Classify Birds

    Directory of Open Access Journals (Sweden)

    Ho KC

    2006-01-01

    Full Text Available This paper presents a novel bird monitoring and recognition system for noisy environments. The project objective is to avoid bird strikes to aircraft. First, a cost-effective microphone dish concept (a microphone array with many concentric rings) is presented that can provide directional and accurate acquisition of bird sounds and can simultaneously pick up bird sounds from different directions. Second, direction-of-arrival (DOA) and beamforming algorithms have been developed for the circular array. Third, an efficient recognition algorithm is proposed which uses Gaussian mixture models (GMMs). The overall system is suitable for monitoring and recognition of a large number of birds. Fourth, a hardware prototype has been built, and initial experiments demonstrated that the array can acquire and classify birds accurately.
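
    The classification stage can be illustrated with a small sketch: one Gaussian mixture model per species is fit on acoustic feature vectors, and a call is assigned to the species whose model scores it highest. The sketch below uses scikit-learn with synthetic features; the feature extraction (e.g. MFCCs) and the species names are assumptions, not details from the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# One GMM per species, fit on per-frame feature vectors (n_frames x n_dims);
# synthetic Gaussians stand in for real MFCC-like features here.
rng = np.random.default_rng(1)
train = {
    "gull": rng.normal(0.0, 1.0, size=(500, 13)),
    "crow": rng.normal(1.5, 1.2, size=(500, 13)),
}
models = {species: GaussianMixture(n_components=4, random_state=0).fit(X)
          for species, X in train.items()}

def classify(call_features: np.ndarray) -> str:
    # score() is the mean per-frame log-likelihood under each species model
    return max(models, key=lambda sp: models[sp].score(call_features))

print(classify(rng.normal(1.5, 1.2, size=(40, 13))))  # expected: "crow"
```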

  12. Automated Grid Monitoring for LHCb through HammerCloud

    CERN Document Server

    CERN. Geneva

    2015-01-01

    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  13. An IR controlled automated guided vehicle for floor contamination monitoring

    International Nuclear Information System (INIS)

    The present system is useful as a remote monitoring device for floor contamination measurements, using IR communication. The same control concept can be applied with any other wireless communication, increasing the range of control; only the error corrections and tolerances would have to be tuned for the specific type of communication. Owing to its programmability, the system can also serve as a simulation platform for developing methodology and software for large-scale contamination measurements.

  14. An Automated Acoustic System to Monitor and Classify Birds

    OpenAIRE

    Ho KC; Li Y.; Stanford V; Rochet C; Kwan C; Mei G; Ren Z; Xu R; Zhang Y; Lao D; Stevenson M.

    2006-01-01

    This paper presents a novel bird monitoring and recognition system in noisy environments. The project objective is to avoid bird strikes to aircraft. First, a cost-effective microphone dish concept (microphone array with many concentric rings) is presented that can provide directional and accurate acquisition of bird sounds and can simultaneously pick up bird sounds from different directions. Second, direction-of-arrival (DOA) and beamforming algorithms have been developed for the circular a...

  15. Automated Image Processing for the Analysis of DNA Repair Dynamics

    CERN Document Server

    Riess, Thorsten; Tomas, Martin; Ferrando-May, Elisa; Merhof, Dorit

    2011-01-01

    The efficient repair of cellular DNA is essential for the maintenance and inheritance of genomic information. In order to cope with the high frequency of spontaneous and induced DNA damage, a multitude of repair mechanisms have evolved. These are enabled by a wide range of protein factors specifically recognizing different types of lesions and finally restoring the normal DNA sequence. This work focuses on the repair factor XPC (xeroderma pigmentosum complementation group C), which identifies bulky DNA lesions and initiates their removal via the nucleotide excision repair pathway. The binding of XPC to damaged DNA can be visualized in living cells by following the accumulation of a fluorescent XPC fusion at lesions induced by laser microirradiation in a fluorescence microscope. In this work, an automated image processing pipeline is presented which allows the accumulation reaction to be identified and quantified without any user interaction. The image processing pipeline comprises a preprocessing stage where the ima...

  16. Automated Data Processing as an AI Planning Problem

    Science.gov (United States)

    Golden, Keith; Pang, Wanlin; Nemani, Ramakrishna; Votava, Petr

    2003-01-01

    NASA's vision for Earth Science is to build a "sensor web": an adaptive array of heterogeneous satellites and other sensors that will track important events, such as storms, and provide real-time information about the state of the Earth to a wide variety of customers. Achieving this vision will require automation not only in the scheduling of the observations but also in the processing of the resulting data. To address this need, we have developed a planner-based agent to automatically generate and execute data-flow programs to produce the requested data products. Data processing domains are substantially different from other planning domains that have been explored, and this has led us to substantially different choices in terms of representation and algorithms. We discuss some of these differences and discuss the approach we have adopted.

  17. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques to fully exploit the gain in high resolution and accuracy of the … massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra. By projecting all detected ions - within defined intervals on both the time and … mass axis - on to a fixed one-dimensional array, we obtain a vector that can be used directly as input in multivariate statistics or library search methods. We demonstrate that both cluster and discriminant analysis as well as PCA (and related methods) can be applied directly on mass spectra from direct …
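
    The projection described above can be sketched as follows: each detected ion within fixed windows is accumulated into a fixed one-dimensional array, so every sample becomes a vector that can feed PCA or cluster analysis directly. The bin ranges and counts below are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

def spectrum_vector(ions, mz_min=100.0, mz_max=1000.0, n_bins=4500):
    # ions: iterable of (m/z, intensity) pairs for one sample
    vec = np.zeros(n_bins)
    for mz, intensity in ions:
        if mz_min <= mz < mz_max:
            b = int((mz - mz_min) / (mz_max - mz_min) * n_bins)
            vec[b] += intensity
    return vec / (np.linalg.norm(vec) or 1.0)   # normalise total signal

rng = np.random.default_rng(2)
samples = [spectrum_vector(zip(rng.uniform(100, 1000, 300),
                               rng.exponential(1.0, 300)))
           for _ in range(10)]
scores = PCA(n_components=2).fit_transform(np.vstack(samples))
print(scores.shape)  # (10, 2): one point per sample in the score plot
```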

  18. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    CERN Document Server

    Ge, Zhiqiang

    2013-01-01

      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...

  19. Automating the training development process for mission flight operations

    Science.gov (United States)

    Scott, Carol J.

    1994-01-01

    Traditional methods of developing training do not effectively support the changing needs of operational users in a multimission environment. The Automated Training Development System (ATDS) provides advantages over conventional methods in quality, quantity, turnaround, database maintenance, and focus on individualized instruction. The Operations System Training Group at JPL performed a six-month study to assess the potential of ATDS to automate curriculum development and to generate and maintain course materials. To begin the study, the group acquired readily available hardware and participated in a two-week training session to introduce the process. ATDS is a building activity that combines training's traditional information-gathering with a hierarchical method for interleaving the elements. The program can be described fairly simply: a comprehensive list of candidate tasks determines the content of the database; from that database, selected critical tasks dictate which skill and knowledge competencies to include in course material for the target audience. The training developer adds pertinent planning information about each task to the database, and ATDS then generates a tailored set of instructional material based on the specified selection criteria. Course material consistently leads students to a prescribed level of competency.

  20. Automated angiogenesis quantification through advanced image processing techniques.

    Science.gov (United States)

    Doukas, Charlampos N; Maglogiannis, Ilias; Chatziioannou, Aristotle; Papapetropoulos, Andreas

    2006-01-01

    Angiogenesis, the formation of blood vessels in tumors, is an interactive process between tumor, endothelial and stromal cells that creates a network for the oxygen and nutrient supply necessary for tumor growth. Accordingly, angiogenic activity is considered a suitable indicator for detecting both tumor growth and inhibition. The angiogenic potential is usually estimated by counting the number of blood vessels in particular sections. One of the most popular assay tissues for studying the angiogenesis phenomenon is the developing chick embryo and its chorioallantoic membrane (CAM), which is a highly vascular structure lining the inner surface of the egg shell. The aim of this study was to develop and validate an automated image analysis method that would give an unbiased quantification of the micro-vessel density and growth in angiogenic CAM images. The presented method has been validated by comparing automated results to manual counts over a series of digital chick embryo photos. The results indicate the high accuracy of the tool, which has thus been used extensively for tumor growth detection at different stages of embryonic development. PMID:17946107
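
    As one plausible illustration of automated micro-vessel quantification (not the paper's exact pipeline), the sketch below enhances tubular structures, thresholds them, skeletonizes the mask, and reports skeleton length per unit area using scikit-image.

```python
import numpy as np
from skimage.filters import threshold_otsu, frangi
from skimage.morphology import skeletonize

def vessel_density(gray_image: np.ndarray) -> float:
    vesselness = frangi(gray_image)             # emphasises elongated vessels
    mask = vesselness > threshold_otsu(vesselness)
    skeleton = skeletonize(mask)                # 1-pixel-wide centrelines
    return skeleton.sum() / skeleton.size       # vessel pixels per image pixel

# Random noise stands in for a real CAM photo so the snippet runs as-is.
rng = np.random.default_rng(3)
print(vessel_density(rng.random((128, 128))))
```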

  1. Development of a Fully Automated, GPS Based Monitoring System for Disaster Prevention and Emergency Preparedness: PPMS+RT

    Science.gov (United States)

    Bond, Jason; Kim, Don; Chrzanowski, Adam; Szostak-Chrzanowski, Anna

    2007-01-01

    The increasing number of structural collapses, slope failures and other natural disasters has led to a demand for new sensors, sensor integration techniques and data processing strategies for deformation monitoring systems. In order to meet the extraordinary accuracy requirements for displacement detection in recent deformation monitoring projects, research has been devoted to integrating the Global Positioning System (GPS) as a monitoring sensor. Although GPS has been used for monitoring purposes worldwide, certain environments pose challenges where conventional processing techniques cannot provide the required accuracy with sufficient update frequency. Described is the development of a fully automated, continuous, real-time monitoring system that employs GPS sensors and pseudolite technology to meet these requirements in such environments. Ethernet and/or serial port communication techniques are used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely, based upon client needs. A test illustrated that a 10 mm displacement at a target point was remotely detected using the designed system. This information could then be used to signal an alarm if conditions are deemed to be unsafe.
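
    The displacement-detection logic can be reduced to a very small check, sketched below: compare each new coordinate solution for a target point against its reference position and raise an alarm when the 3-D displacement exceeds a threshold (the cited test remotely detected a 10 mm displacement). The coordinates and threshold are illustrative.

```python
import numpy as np

REFERENCE = np.array([0.0, 0.0, 0.0])   # local ENU coordinates of target (m)
THRESHOLD_M = 0.010                     # alarm above 10 mm displacement

def check_epoch(position_enu: np.ndarray) -> bool:
    displacement = np.linalg.norm(position_enu - REFERENCE)
    if displacement > THRESHOLD_M:
        print(f"ALARM: displacement {displacement * 1000:.1f} mm")
        return True
    return False

check_epoch(np.array([0.008, 0.006, 0.002]))  # ~10.2 mm -> alarm
```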

  2. ERT monitoring of environmental remediation processes

    Science.gov (United States)

    La Brecque, D. J.; Ramirez, A. L.; Daily, W. D.; Binley, A. M.; Schima, S. A.

    1996-03-01

    The use of electrical resistance tomography (ERT) to monitor new environmental remediation processes is addressed. An overview of the ERT method, including design of surveys and interpretation, is given. Proper design and lay-out of boreholes and electrodes are important for successful results. Data are collected using an automated collection system and interpreted using a nonlinear least squares inversion algorithm. Case histories are given for three remediation technologies: Joule (ohmic) heating, in which clay layers are heated electrically; air sparging, the injection of air below the water table; and electrokinetic treatment, which moves ions by applying an electric current. For Joule heating, a case history is given for an experiment near Savannah River, Georgia, USA. The target for Joule heating was a clay layer of variable thickness. During the early stages of heating, ERT images show increases in conductivity due to the increased temperatures. Later, the conductivities decreased as the system became dehydrated. For air sparging, a case history from Florence, Oregon, USA is described. Air was injected into a sandy aquifer at the site of a former service station. Successive images clearly show the changes in shape of the region of air saturation with time. The monitoring of an electrokinetic laboratory test on core samples is shown. The electrokinetic treatment creates a large change in the core resistivity, decreasing near the anode and increasing near the cathode. Although remediation efforts were successful both at Savannah River and at Florence, in neither case did experiments progress entirely as predicted. At Savannah River, the effects of heating and venting were not uniform and at Florence the radius of air flow was smaller than expected. Most sites are not as well characterized as these two sites. Improving remediation methods requires an understanding of the movements of heat, air, fluids and ions in the sub-surface which ERT can provide. The

  3. Automation of a problem list using natural language processing

    Directory of Open Access Journals (Sweden)

    Haug Peter J

    2005-08-01

    Full Text Available Background: The medical problem list is an important part of the electronic medical record in development in our institution. To serve the functions it is designed for, the problem list has to be as accurate and timely as possible. However, the current problem list is usually incomplete and inaccurate, and is often totally unused. To alleviate this issue, we are building an environment where the problem list can be easily and effectively maintained. Methods: For this project, 80 medical problems were selected for their frequency of use in our future clinical field of evaluation (cardiovascular). We have developed an Automated Problem List system composed of two main components: a background and a foreground application. The background application uses Natural Language Processing (NLP) to harvest potential problem list entries from the list of 80 targeted problems detected in the multiple free-text electronic documents available in our electronic medical record. These proposed medical problems drive the foreground application designed for management of the problem list. Within this application, the extracted problems are proposed to the physicians for addition to the official problem list. Results: The set of 80 targeted medical problems selected for this project covered about 5% of all possible diagnoses coded in ICD-9-CM in our study population (cardiovascular adult inpatients), but about 64% of all instances of these coded diagnoses. The system contains algorithms to detect first document sections, then sentences within these sections, and finally potential problems within the sentences. The initial evaluation of the section and sentence detection algorithms demonstrated a sensitivity and positive predictive value of 100% when detecting sections, and a sensitivity of 89% and a positive predictive value of 94% when detecting sentences. Conclusion: The global aim of our project is to automate the process of creating and maintaining a problem
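
    A toy sketch of the three detection stages (sections, then sentences, then target problems) is given below. The real system uses a full NLP engine; the section headers and the small problem lexicon here are illustrative assumptions only.

```python
import re

SECTION_RE = re.compile(r"^(HISTORY|ASSESSMENT|IMPRESSION):", re.M)
PROBLEMS = {"atrial fibrillation", "hypertension", "heart failure"}

def find_problems(document: str):
    hits = []
    parts = SECTION_RE.split(document)[1:]       # [name, body, name, body, ...]
    for name, body in zip(parts[::2], parts[1::2]):
        for sentence in re.split(r"(?<=[.!?])\s+", body):
            for problem in PROBLEMS:             # naive lexicon lookup
                if problem in sentence.lower():
                    hits.append((name, problem, sentence.strip()))
    return hits

note = "IMPRESSION: Patient has long-standing hypertension. Denies chest pain."
print(find_problems(note))
```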

  4. ZigBee Based Industrial Automation Profile for Power Monitoring Systems

    Directory of Open Access Journals (Sweden)

    Archana R. Raut,

    2011-05-01

    Full Text Available Industrial automation largely depends upon power systems and requires remotely controlled and regulated operation. Voltage and current parameters, along with power and energy management, form the industrial scenario for automation. Wireless technology that meets cost, speed and distance requirements will always be a point of interest for research. In this work we monitor power-related parameters and enable remote switching devices for proper power management using ZigBee. This paper proposes a digital system for condition monitoring, diagnosis, and supervisory control of electric system parameters such as voltage and current using wireless sensor networks (WSNs) based on ZigBee. Its main feature is its use of the ZigBee protocol as the communication medium between the transmitter and receiver modules. The results illustrate that the new ZigBee standard performs well in industrial environments.
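
    The coordinator-side processing can be sketched in a few lines: parse a voltage/current frame, compute power, and issue a remote switch command when a limit is exceeded. The "voltage,current" line format and the switch command are assumptions for illustration; in deployment the frames would arrive from the ZigBee coordinator's serial port (e.g. via pyserial) rather than from a list.

```python
from typing import Optional

POWER_LIMIT_W = 2000.0

def process_frame(line: str) -> Optional[str]:
    # frame format "voltage,current" is an assumption for this sketch
    volts, amps = (float(x) for x in line.split(","))
    power = volts * amps
    print(f"V={volts:.1f} V  I={amps:.2f} A  P={power:.0f} W")
    if power > POWER_LIMIT_W:
        return "SWITCH OFF"        # command relayed back to the remote node
    return None

for frame in ["229.8,4.10", "230.1,9.80"]:   # simulated sensor frames
    command = process_frame(frame)
    if command:
        print("->", command)
```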

  5. Analysing risk factors for urinary tract infection based on automated monitoring of hospital-acquired infection

    DEFF Research Database (Denmark)

    Redder, J D; Leth, R A; Møller, Jens Kjølseth

    2016-01-01

    Urinary tract infections account for as much as one-third of all nosocomial infections. The aim of this study was to examine previously reported characteristics of patients with hospital-acquired urinary tract infections (HA-UTI) using an automated infection monitoring system (Hospital-Acquired Infection Registry: HAIR). A matched case-control study was conducted to investigate the association of risk factors with HA-UTI. Patients with HA-UTI more frequently had indwelling urinary catheters or a disease in the genitourinary or nervous system than the controls. Automated hospital-acquired infection monitoring enables documentation of key risk factors to better evaluate infection control interventions in general or for selected groups of patients.

  6. Signal Processing under Active Monitoring

    OpenAIRE

    Mostovyi, Oleksii

    2005-01-01

    This paper describes a method of signal preprocessing under active monitoring. Suppose we want to solve the inverse problem of obtaining the response of a medium to one powerful signal, which is equivalent to obtaining the transmission function of the medium, but we do not have the opportunity to conduct such an experiment (it might be too expensive or harmful for the environment). In practice, the problem can be reduced to obtaining the transmission function of the medium. In this case ...

  7. Design and Implementation of Aquarium Remote Automation Monitoring and Control System

    OpenAIRE

    Ma, Yinchi; Ding, Wen

    2013-01-01

    In recent years, with the rapid development of ornamental and recreational fisheries, different varieties of ornamental fish have come into many offices and homes. Aquariums and related equipment are developing from simple functionality towards intelligent operation. Industry research indicates that a set of remote automated monitoring and control equipment for the aquarium has important significance and development prospects. The system stores and transmits...

  8. Automated Monitoring Systems to Assess Gait Score and Feed Intake of Broilers

    OpenAIRE

    Aydin, Arda

    2016-01-01

    The last decades of the 20th century saw important changes in animal production. Production intensified considerably and farms became highly specialised. Traditionally, livestock management decisions were based on the observation and judgment of the farmer. However, because of the increasing scale of farms and the large number of animals, the farmer has a high technical, organisational and logistical workload and therefore has limited time to monitor his animals himself. Automated monitori...

  9. Automated Selected Reaction Monitoring Software for Accurate Label-Free Protein Quantification

    OpenAIRE

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-01-01

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently...

  10. Using Automated On-Site Monitoring to Calibrate Empirical Models of Trihalomethanes Concentrations in Drinking Water

    OpenAIRE

    Thomas E. Watts III; Robyn A. Snow; Brown, Aaron W.; J. C. York; Greg Fantom; Paul S. Simone Jr.; Gary L. Emmert

    2015-01-01

    An automated, on-site trihalomethanes concentration data set from a conventional water treatment plant was used to optimize powdered activated carbon and pre-chlorination doses. The trihalomethanes concentration data set was used with commonly monitored water quality parameters to improve an empirical model of trihalomethanes formation. A calibrated model was used to predict trihalomethanes concentrations the following year. The agreement between the models and measurements was evaluated. The...

  11. Automating the Photogrammetric Bridging Based on MMS Image Sequence Processing

    Science.gov (United States)

    Silva, J. F. C.; Lemes Neto, M. C.; Blasechi, V.

    2014-11-01

    The photogrammetric bridging or traverse is a special bundle block adjustment (BBA) for connecting a sequence of stereo-pairs and determining the exterior orientation parameters (EOP). An object point must be imaged in more than one stereo-pair. In each stereo-pair the distance ratio between an object and its corresponding image point varies significantly. We propose to automate the photogrammetric bridging based on a fully automatic extraction of homologous points in stereo-pairs and on an arbitrary Cartesian datum to which the EOP and tie points are referred. The technique uses the SIFT algorithm, and keypoint matching is based on similarity descriptors of each keypoint, using the smallest distance. All the matched points are used as tie points. The technique was applied initially to two pairs. The block formed by four images was treated by BBA. The process follows up to the end of the sequence, and it is semiautomatic because each block is processed independently and the transition from one block to the next depends on the operator. Besides four-image blocks (two pairs), we experimented with other arrangements with block sizes of six, eight, and up to twenty images (respectively three, four, five and up to ten bases). After the whole image-pair sequence had been sequentially adjusted in each experiment, a simultaneous BBA was run to estimate the EOP set of each image. The results for classical ("normal case") pairs were analyzed based on standard statistics regularly applied to phototriangulation, and they show figures that validate the process.
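
    The tie-point extraction step can be sketched with OpenCV, as below: SIFT keypoints are matched by nearest-neighbour descriptor distance, here with Lowe's ratio test added as a common safeguard. The synthetic image pair simply makes the snippet self-contained; real stereo-pairs would be used in practice.

```python
import cv2
import numpy as np

def match_tie_points(img_left, img_right, ratio=0.75):
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img_left, None)
    kp2, des2 = sift.detectAndCompute(img_right, None)
    if des1 is None or des2 is None:
        return []
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = []
    for match in matcher.knnMatch(des1, des2, k=2):
        if len(match) == 2 and match[0].distance < ratio * match[1].distance:
            m = match[0]                      # keep unambiguous matches only
            pairs.append((kp1[m.queryIdx].pt, kp2[m.trainIdx].pt))
    return pairs                              # homologous points -> tie points

rng = np.random.default_rng(6)
left = (rng.random((256, 256)) * 255).astype(np.uint8)
right = np.roll(left, 5, axis=1)              # simulated 5-pixel overlap shift
print(len(match_tie_points(left, right)), "tie points")
```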

  12. Automation of the CFD Process on Distributed Computing Systems

    Science.gov (United States)

    Tejnil, Ed; Gee, Ken; Rizk, Yehia M.

    2000-01-01

    A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational
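
    The fallback queueing behaviour described above (for hosts without batch-queueing software) amounts to a first-in-first-out job queue; a minimal sketch follows. The echo commands stand in for the generated flow-solver invocations.

```python
import subprocess
from collections import deque

class FifoQueue:
    """Run submitted jobs one at a time, strictly in arrival order."""

    def __init__(self):
        self.jobs = deque()

    def submit(self, command: list[str]):
        self.jobs.append(command)

    def run_all(self):
        while self.jobs:
            command = self.jobs.popleft()          # oldest job first
            print("running:", " ".join(command))
            subprocess.run(command, check=False)   # block until it finishes

queue = FifoQueue()
queue.submit(["echo", "case_alpha_0"])   # placeholder solver invocations
queue.submit(["echo", "case_alpha_5"])
queue.run_all()
```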

  13. Control and Monitoring Winemaking Process Online

    OpenAIRE

    Angelkov, Dimitrija; Martinovska Bande, Cveta

    2014-01-01

    In the process of wine production distributed sensor networks are used for monitoring parameters that enable constant wine quality. This paper presents an ongoing project for monitoring the conditions in the wine cellar and for controlling the wine fermentation process. Temperature and humidity sensors installed in the cellar are used to provide similar conditions for the barrels in the cellar. During the process of fermentation sensors located in the barrels are used to control the level of ...

  14. Implementation of a fully automated process purge-and-trap gas chromatograph at an environmental remediation site

    International Nuclear Information System (INIS)

    The AQUASCAN, a commercially available, fully automated purge-and-trap gas chromatograph from Sentex Systems Inc., was implemented and evaluated as an in-field, automated monitoring system for contaminated groundwater at an active DOE remediation site in Pinellas, FL. Though the AQUASCAN is designed as a stand-alone process analytical unit, implementation at this site required additional hardware. The hardware included a sample dilution system and a method for delivering standard solution to the gas chromatograph for automated calibration. As a result of the evaluation, the system was determined to be a reliable and accurate instrument. The concentration values reported by the AQUASCAN for methylene chloride, trichloroethylene, and toluene in the Pinellas groundwater were within 20% of reference laboratory values

  15. Quality Control in Automated Manufacturing Processes – Combined Features for Image Processing

    Directory of Open Access Journals (Sweden)

    B. Kuhlenkötter

    2006-01-01

    Full Text Available In production processes the use of image processing systems is widespread. Hardware solutions and cameras are available for nearly every application. One important challenge for image processing systems is the development and selection of appropriate algorithms and software solutions in order to realise ambitious quality control for production processes. This article describes the development of innovative software combining features for automatic defect classification on product surfaces. The artificial intelligence method Support Vector Machine (SVM) is used to execute the classification task according to the combined features. This software is one crucial element for the automation of a manually operated production process

  16. Automating the Human Factors Engineering and Evaluation Processes

    International Nuclear Information System (INIS)

    The Westinghouse Savannah River Company (WSRC) has developed a software tool for automating the Human Factors Engineering (HFE) design review, analysis, and evaluation processes. The tool provides a consistent, cost-effective, graded, user-friendly approach for evaluating process control system Human System Interface (HSI) specifications, designs, and existing implementations. The initial set of HFE design guidelines used in the tool was obtained from NUREG-0700. Each guideline was analyzed and classified according to its significance (general concept vs. supporting detail), the HSI technology (computer-based vs. non-computer-based), and the HSI safety function (safety vs. non-safety). Approximately 10 percent of the guidelines were determined to be redundant or obsolete and were discarded. The remaining guidelines were arranged in a Microsoft Access relational database, and a Microsoft Visual Basic user interface was provided to facilitate the HFE design review. The tool also provides the capability to add new criteria to accommodate advances in HSI technology and incorporate lessons learned. Summary reports produced by the tool can be easily ported to Microsoft Word and other popular PC office applications. An IBM-compatible PC with Microsoft Windows 95 or higher is required to run the application

  17. Signal processing of macro pulse monitor

    International Nuclear Information System (INIS)

    A beam current signal processing system for long-burst beams is now under development at the JAERI FEL superconducting linac. The system utilizes core current monitors that are already installed and enables simultaneous monitoring of the long-burst beam current along the linac. (author)

  18. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Science.gov (United States)

    2010-01-01

    Title 10, Energy; Department of Energy (General Provisions); Identification and Protection of Unclassified Controlled Nuclear Information; Physical Protection Requirements; § 1017.28 Processing on Automated Information Systems (AIS). UCNI may...

  19. 10 CFR 74.53 - Process monitoring.

    Science.gov (United States)

    2010-01-01

    ... Quantities of Strategic Special Nuclear Material; § 74.53 Process monitoring. (a) Licensees subject to § 74.51 ... Category IA material from any accessible process location and within seven calendar days of a loss of Category IB material from any accessible process location; (2) a quality control test whereby...

  20. Automating the packing heuristic design process with genetic programming.

    Science.gov (United States)

    Burke, Edmund K; Hyde, Matthew R; Kendall, Graham; Woodward, John

    2012-01-01

    The literature shows that one-, two-, and three-dimensional bin packing and knapsack packing are difficult problems in operational research. Many techniques, including exact, heuristic, and metaheuristic approaches, have been investigated to solve these problems and it is often not clear which method to use when presented with a new instance. This paper presents an approach which is motivated by the goal of building computer systems which can design heuristic methods. The overall aim is to explore the possibilities for automating the heuristic design process. We present a genetic programming system to automatically generate a good quality heuristic for each instance. It is not necessary to change the methodology depending on the problem type (one-, two-, or three-dimensional knapsack and bin packing problems), and it therefore has a level of generality unmatched by other systems in the literature. We carry out an extensive suite of experiments and compare with the best human designed heuristics in the literature. Note that our heuristic design methodology uses the same parameters for all the experiments. The contribution of this paper is to present a more general packing methodology than those currently available, and to show that, by using this methodology, it is possible for a computer system to design heuristics which are competitive with the human designed heuristics from the literature. This represents the first packing algorithm in the literature able to claim human competitive results in such a wide variety of packing domains. PMID:21609273
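
    A deliberately tiny stand-in for the full genetic programming system is sketched below: it evolves an arithmetic scoring expression over the free capacity F of a bin and the item size S, places each item into the feasible bin with the highest score, and keeps whichever expression packs a training instance into fewer bins. It uses a (1+4) loop with subtree mutation only; the paper's system is far richer.

```python
import random

OPS = [("+", lambda a, b: a + b),
       ("-", lambda a, b: a - b),
       ("*", lambda a, b: a * b)]

def random_tree(depth=3):
    # leaves: free capacity F, item size S, or a random constant
    if depth == 0 or random.random() < 0.3:
        return random.choice(["F", "S", random.uniform(-1.0, 1.0)])
    name, _ = random.choice(OPS)
    return (name, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, F, S):
    if tree == "F":
        return F
    if tree == "S":
        return S
    if isinstance(tree, float):
        return tree
    name, left, right = tree
    return dict(OPS)[name](evaluate(left, F, S), evaluate(right, F, S))

def mutate(tree, p=0.2):
    # subtree mutation: occasionally replace a node with a fresh subtree
    if random.random() < p:
        return random_tree(2)
    if isinstance(tree, tuple):
        name, left, right = tree
        return (name, mutate(left, p), mutate(right, p))
    return tree

def bins_used(tree, items, capacity=1.0):
    bins = []                                # remaining free capacity per bin
    for size in items:
        feasible = [i for i, free in enumerate(bins) if free >= size]
        if feasible:
            best = max(feasible, key=lambda i: evaluate(tree, bins[i], size))
            bins[best] -= size
        else:
            bins.append(capacity - size)     # open a new bin
    return len(bins)

random.seed(0)
items = [random.uniform(0.1, 0.7) for _ in range(60)]
best = random_tree()
best_fitness = bins_used(best, items)
for generation in range(200):                # (1+4) evolutionary loop
    for child in (mutate(best) for _ in range(4)):
        fitness = bins_used(child, items)
        if fitness <= best_fitness:
            best, best_fitness = child, fitness
print("bins used by evolved heuristic:", best_fitness)
```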

  1. Automated integration of continuous glucose monitor data in the electronic health record using consumer technology.

    Science.gov (United States)

    Kumar, Rajiv B; Goren, Nira D; Stark, David E; Wall, Dennis P; Longhurst, Christopher A

    2016-05-01

    The diabetes healthcare provider plays a key role in interpreting blood glucose trends, but few institutions have successfully integrated patient home glucose data in the electronic health record (EHR). Published implementations to date have required custom interfaces, which limit wide-scale replication. We piloted automated integration of continuous glucose monitor data in the EHR using widely available consumer technology for 10 pediatric patients with insulin-dependent diabetes. Establishment of a passive data communication bridge via a patient's/parent's smartphone enabled automated integration and analytics of patient device data within the EHR between scheduled clinic visits. It is feasible to utilize available consumer technology to assess and triage home diabetes device data within the EHR, and to engage patients/parents and improve healthcare provider workflow. PMID:27018263

  2. Comprehensive process monitoring for laser welding process optimization

    Science.gov (United States)

    Stritt, P.; Boley, M.; Heider, A.; Fetzer, F.; Jarwitz, M.; Weller, D.; Weber, R.; Berger, P.; Graf, T.

    2016-03-01

    Fundamental process monitoring is very helpful for detecting defects formed during the complex interactions of the capillary laser welding process. Moreover, monitoring and diagnostics of the laser welding process enlarge the process knowledge that is essential to prevent weld defects. Various studies on monitoring of laser welding processes of aluminum, copper and steel were performed. Coaxial real-time analyses with inline coherent imaging and photodiode-based measurements have been applied, as well as off-axis thermography, spectroscopy, online X-ray observation and high-speed imaging with an 808 nm illumination wavelength. The presented diagnostics and monitoring methods were appropriate for studying typical weld defects such as pores, spatter and cracks. Using these diagnostics makes it possible to understand the formation of such defects and to develop strategies to prevent them.
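
    As an illustrative (not paper-derived) example of photodiode-based monitoring, the sketch below flags spatter- or pore-like events as excursions beyond k standard deviations of a rolling baseline.

```python
import numpy as np

def flag_events(signal: np.ndarray, window: int = 50, k: float = 4.0):
    # compare each sample against the mean/std of the preceding window
    flags = np.zeros(len(signal), dtype=bool)
    for i in range(window, len(signal)):
        baseline = signal[i - window:i]
        if abs(signal[i] - baseline.mean()) > k * baseline.std():
            flags[i] = True
    return np.flatnonzero(flags)

rng = np.random.default_rng(4)
trace = rng.normal(1.0, 0.02, 2000)   # simulated photodiode trace
trace[1200] += 0.5                    # injected spatter-like spike
print(flag_events(trace))             # -> [1200]
```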

  3. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); Rahmat, K. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); University Malaya, Biomedical Imaging Department, Kuala Lumpur (Malaysia); Ariffin, H. [University of Malaya, Department of Paediatrics, Faculty of Medicine, Kuala Lumpur (Malaysia)

    2012-07-15

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)
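
    The automated VOI statistics step can be sketched as below: an FA map registered to the ICBM DTI-81 atlas is summarised per labelled region. Synthetic arrays stand in for the NIfTI volumes (which would in practice be loaded with e.g. nibabel) so that the snippet runs as-is; the label ids are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
fa = rng.uniform(0.0, 1.0, size=(32, 32, 20))     # stand-in FA map
labels = rng.integers(0, 4, size=(32, 32, 20))    # stand-in atlas labels

for region in np.unique(labels):
    if region == 0:                               # 0 = background by convention
        continue
    voxels = fa[labels == region]
    print(f"region {region}: mean FA {voxels.mean():.3f} "
          f"(n={voxels.size} voxels)")
```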

  4. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    International Nuclear Information System (INIS)

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using an automated ROI with an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly early in the first 12 months. The second phase lasted 12-24 months during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is a superior technique to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)

  5. Experimental demonstration of microscopic process monitoring

    International Nuclear Information System (INIS)

    Microscopic process monitoring (MPM) is a material control strategy designed to use standard process control data to provide expanded safeguards protection of nuclear fuel cycle facilities. The MPM methodology identifies process events by recognizing significant patterns of changes in on-line measurements. The goals of MPM are to detect diversions of nuclear material and to provide information on process status useful to other facility safeguards operations

  6. AUTOMATION OF BUISNESS-PROCESSES OF A TRAINING CENTER

    Directory of Open Access Journals (Sweden)

    Kovalenko A. V.

    2015-06-01

    Full Text Available Modern Russian companies have realized the need for document flow automation not only as a means of keeping documents in order, but also as a tool for optimizing expenses and an aid in making administrative decisions. For a long time the Russian market of information systems had no software products intended for educational institutions; the majority of automated systems are intended for enterprises active in trade and production. By comparison, the list of software products for commercial training centers is small, and even the existing line of programs cannot be said to meet all the requirements of companies in this field. During the creation of the automated system for a training center, the existing software products intended for the automation of training centers and adjacent institutions were analysed, and a number of specific features of this activity were revealed. The article describes the developed automated document flow information system for a commercial educational institution, namely the "Training center" configuration developed on the "1C: Enterprise 8.2" platform. The developed program complex serves as an administrative tool for analysing the economic activity of the training center, scheduling the educational equipment and teaching staff, and calculating payroll taking into account the specifics of the branch.

  7. Partitioning, Automation and Error Recovery in the Control and Monitoring System of an LHC Experiment

    Institute of Scientific and Technical Information of China (English)

    C.Gaspar

    2001-01-01

    The Joint Controls Project (JCOP) is a collaboration between CERN and the four LHC experiments to find and implement common solutions for their control and monitoring systems. As part of this project, an Architecture Working Group was set up in order to study the requirements and devise an architectural model that would suit the four experiments. Many issues were studied by this working group: Alarm handling, Access Control, Hierarchical Control, etc. This paper reports on the specific issue of hierarchical control, and in particular on partitioning, automation and error recovery.

  8. Using Automated On-Site Monitoring to Calibrate Empirical Models of Trihalomethanes Concentrations in Drinking Water

    Directory of Open Access Journals (Sweden)

    Thomas E. Watts III

    2015-10-01

    Full Text Available An automated, on-site trihalomethanes concentration data set from a conventional water treatment plant was used to optimize powdered activated carbon and pre-chlorination doses. The trihalomethanes concentration data set was combined with commonly monitored water quality parameters to improve an empirical model of trihalomethanes formation. The calibrated model was then used to predict trihalomethanes concentrations the following year, and the agreement between the models and measurements was evaluated. The original model predicted trihalomethanes concentrations within ~10 μg·L−1 of the measurement; calibration improved model predictions by a factor of three to five relative to the literature model.
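
    As a rough illustration of this kind of calibration, the sketch below fits a power-law trihalomethane model to site measurements by linear regression in log space and uses it to predict a new sample. The model form, variable names and all numbers are illustrative assumptions, not the authors' actual model.

        # Recalibrate an empirical (power-law) TTHM model against on-site data.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Hypothetical monitoring data: TOC (mg/L), chlorine dose (mg/L),
        # water temperature (deg C), and measured TTHM (ug/L).
        toc  = np.array([2.1, 2.4, 3.0, 2.8, 3.5, 2.2])
        cl2  = np.array([1.2, 1.5, 1.8, 1.6, 2.0, 1.3])
        temp = np.array([12., 15., 21., 19., 25., 14.])
        tthm = np.array([28., 35., 55., 48., 70., 30.])

        # TTHM = a * TOC^b * Cl2^c * T^d becomes linear in log space.
        X = np.log(np.column_stack([toc, cl2, temp]))
        model = LinearRegression().fit(X, np.log(tthm))

        # Predict next year's concentration from new water-quality readings.
        x_new = np.log(np.array([[2.6, 1.4, 18.0]]))
        print("predicted TTHM (ug/L):", np.exp(model.predict(x_new))[0])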

  9. Advanced monitoring with complex stream processing

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP) that analyses series of events to derive conclusions from them.
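
    Esper rules are written in Java/EPL; the Python sketch below only illustrates the underlying CEP idea of aggregating a sliding time window of events and alerting when a derived quantity breaches a limit. The event fields, window length and threshold are invented for the example.

        # Sliding-window aggregation rule, the core pattern behind CEP queries.
        from collections import deque

        WINDOW_S, LATENCY_LIMIT_MS = 60.0, 500.0

        class SlidingWindowRule:
            def __init__(self):
                self.events = deque()  # (timestamp_s, latency_ms)

            def on_event(self, ts, latency_ms):
                self.events.append((ts, latency_ms))
                # Evict events older than the window.
                while self.events and ts - self.events[0][0] > WINDOW_S:
                    self.events.popleft()
                avg = sum(l for _, l in self.events) / len(self.events)
                if avg > LATENCY_LIMIT_MS:
                    print(f"ALERT t={ts:.0f}s: avg latency {avg:.0f} ms "
                          f"over last {WINDOW_S:.0f}s")

        rule = SlidingWindowRule()
        for ts, lat in [(0, 100), (20, 400), (40, 900), (55, 1200)]:
            rule.on_event(ts, lat)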

  10. How automation helps steer the revenue cycle process.

    Science.gov (United States)

    Colpas, Phil

    2013-06-01

    top-of-mind issue as we see how healthcare reform plays out. Here's what our select group of experts had to say about how automation helps to steer the revenue cycle process. PMID:23855249

  11. Ultrasonic techniques for process monitoring and control.

    Energy Technology Data Exchange (ETDEWEB)

    Chien, H.-T.

    1999-03-24

    Ultrasonic techniques have been applied successfully to process monitoring and control in many industries, such as energy, medicine, textiles, oil, and materials, helping with quality control, improving energy efficiency, reducing waste, and saving costs. This paper presents four ultrasonic systems developed at Argonne National Laboratory in the past five years for various applications: an ultrasonic viscometer, an on-loom real-time ultrasonic imaging system, an ultrasonic leak detection system, and an ultrasonic solid concentration monitoring system.

  12. Monitoring and control of Urex radiochemical processes

    International Nuclear Information System (INIS)

    There is an urgent need for methods to provide on-line monitoring and control of the radiochemical processes that are currently being developed and demonstrated under the Global Nuclear Energy Partnership (GNEP) initiative. The methods used to monitor these processes must be robust (requiring little or no maintenance) and must be able to withstand harsh environments (e.g., high radiation fields and aggressive chemical matrices). The ability for continuous on-line monitoring allows the following benefits: - Accountability of the fissile materials; - Control of the process flowsheet; - Information on flow parameters, solution composition, and chemical speciation; - Enhanced performance by eliminating the need for traditional analytical 'grab samples'; - Improvement of operational and criticality safety; - Elimination of human error. The objective of our project is to use a system of flow, chemical composition, and physical property measurement techniques to develop on-line real-time monitoring systems for the UREX process streams. We will use our past experience in adapting and deploying a Raman spectrometer combined with Coriolis meters and conductivity probes to develop a deployable prototype monitor for the UREX radiochemical streams. This system will be augmented with a UV-vis-NIR spectrophotometer. Flow, temperature, density, and chemical composition and concentration measurements will be combined for real-time data analysis during processing. The current emphasis of our research is on evaluation of commercial instrumentation for the UREX flowsheet. (authors)
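
    A minimal sketch of the kind of data fusion described, assuming a Coriolis flow reading and a spectroscopically derived concentration: the analyte molar flow is their product, and integrating it over time yields a running inventory. All values are illustrative.

        # Fuse flow and spectroscopic data into a time-integrated inventory.
        import numpy as np

        t = np.arange(0.0, 600.0, 10.0)                     # s, sampling instants
        vol_flow = np.full_like(t, 2.0e-3)                  # L/s from Coriolis meter
        conc = 0.05 + 0.001 * np.sin(t / 60.0)              # mol/L from spectral fit

        molar_flow = conc * vol_flow                        # mol/s through the stream
        # Trapezoidal integration, written out for portability.
        total_mol = float(np.sum(0.5 * (molar_flow[1:] + molar_flow[:-1])
                                 * np.diff(t)))
        print(f"metal passed through the stream in 10 min: {total_mol:.3f} mol")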

  13. Electronic Tongue-FIA system for the Monitoring of Heavy Metal Biosorption Processes

    Science.gov (United States)

    Wilson, D.; Florido, A.; Valderrama, C.; de Labastida, M. Fernández; Alegret, S.; del Valle, M.

    2011-09-01

    An automated flow injection potentiometric (FIP) system with electronic tongue detection (ET) was used for the monitoring of biosorption processes of heavy metals on waste biomaterial. Grape stalk wastes were used as biosorbent to remove Cu2+ ions in a fixed-bed column setup. For the monitoring, the ET employed a sensor array formed by Cu2+- and Ca2+-selective electrodes and two generic heavy-metal electrodes. The resulting cross-response was processed by a multilayer artificial neural network (ANN) model in order to resolve the concentrations of the monitored species. Coupling the electronic tongue with the automation features of the flow-injection system (ET-FIP) allowed us to accurately characterize the biosorption process by obtaining its breakthrough curves. In parallel, fractions of the extract solution were analyzed by atomic absorption spectroscopy in order to validate the results obtained with the reported methodology.
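
    A hedged sketch of the modelling step, with scikit-learn standing in for whatever ANN toolkit the authors used: a small feed-forward network maps the cross-sensitive potentials of a four-electrode array to the Cu2+ concentration. The training data are synthetic placeholders.

        # Map cross-sensitive electrode potentials to a concentration with an ANN.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        E = rng.normal(200.0, 30.0, size=(80, 4))   # mV, 4-electrode array
        # Synthetic "truth": concentration depends mostly on two electrodes.
        cu = 5.0 + 0.04 * (E[:, 0] - 200) + 0.01 * (E[:, 2] - 200)  # mg/L

        scaler = StandardScaler().fit(E)
        ann = MLPRegressor(hidden_layer_sizes=(6,), max_iter=5000, random_state=0)
        ann.fit(scaler.transform(E), cu)

        e_new = scaler.transform(np.array([[210.0, 195.0, 205.0, 190.0]]))
        print(f"estimated Cu(II): {ann.predict(e_new)[0]:.2f} mg/L")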

  14. FRP resin process automating system; FRP jushi kako jidoka system

    Energy Technology Data Exchange (ETDEWEB)

    Ochiai, I.; Sakai, H. [Meidensha Corp., Tokyo (Japan)

    1994-10-18

    This paper introduces an FRP resin product processing system using robots. Automatic processing by means of robots requires consideration of the positioning of delivered workpieces, correction of positional shifts of workpieces, monitoring of tools and cutters, disposal of chips, and dust and noise prevention measures. In a bath tank drilling and deburring system, robots must measure and correct the positional shift of workpieces, exchange tools automatically, and shut down the system upon occurrence of an anomaly in processing. The wall panel processing system transports products using a lift-and-carry system designed to prevent nicks on the transported side of a product. Workpieces are positioned by lifting and pressing them onto the standard plate on the upper portion of the panel, and their thickness and length are measured and corrected by a workpiece shift correction sensor mounted on a robot. The purification tank partition drilling system has shuttle-type transportation devices installed on both flanks of a robot; this is a high-efficiency system requiring no robot downtime. A dust collecting duct is placed below the positioning device to prevent chips from leaking outside the device. 4 figs., 7 tabs.

  15. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    Efficient process monitoring and analysis tools provide the means for automated supervision and control of manufacturing plants and therefore play an important role in plant safety, process control and assurance of end product quality. The availability of a large number of different process monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools... On the one hand, it facilitates the selection of proper monitoring and analysis tools for a given application or process. On the other hand, it permits the identification of potential applications for a given monitoring technique or tool. An efficient inference system based on forward as well as reverse search...

  16. A novel automated discontinuous venous blood monitoring system for ex vivo glucose determination in humans.

    Science.gov (United States)

    Schaller, R; Feichtner, F; Köhler, H; Bodenlenz, M; Plank, J; Wutte, A; Mader, J K; Ellmerer, M; Hellmich, R; Wedig, H; Hainisch, R; Pieber, T R; Schaupp, L

    2009-03-15

    Intensive insulin therapy reduces mortality and morbidity in critically ill patients but imposes great demands on medical staff, who must take frequent blood samples for the determination of glucose levels. A solution to this resourcing problem would be provided by an automated blood monitoring system. The aim of the present clinical study was to evaluate such a system, comprising an automatic blood sampling unit linked to a glucose biosensor. Our approach was to determine the correlation and system error of the sampling unit alone and of the combined system with respect to reference levels over 12 h in humans. Two venous cannulae were inserted to connect the automatic and reference systems to the subjects. Blood samples were taken at 15 and 30 min intervals. The median Pearson coefficient of correlation between manually and automatically withdrawn blood samples was 0.982 for the sampling unit alone and 0.950 for the complete system. The biosensor had a linear range up to 20 mmol l(-1) and a 95% response time of …. Titration Error Grid analysis suggested an acceptable treatment in 99.56% of cases. Implementation of a "Keep Vein Open" saline infusion into the automated blood sampling system reduced blood withdrawal failures through occluded catheters fourfold. In summary, automated blood sampling from a peripheral vein coupled with automatic glucose determination is a promising alternative to frequent manual blood sampling. PMID:19135351

  17. FY-2010 Process Monitoring Technology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Christopher R.; Bryan, Samuel A.; Casella, Amanda J.; Hines, Wes; Levitskaia, Tatiana G.; henkell, J.; Schwantes, Jon M.; Jordan, Elizabeth A.; Lines, Amanda M.; Fraga, Carlos G.; Peterson, James M.; Verdugo, Dawn E.; Christensen, Ronald N.; Peper, Shane M.

    2011-01-01

    During FY 2010, work under the Spectroscopy-Based Process Monitoring task included ordering and receiving four fluid flow meters and four flow visible-near infrared spectrometer cells to be instrumented within the centrifugal contactor system at Pacific Northwest National Laboratory (PNNL). Initial demonstrations of real-time spectroscopic measurements on cold-stream simulants were conducted under plutonium (Pu)/uranium (U) (PUREX) solvent extraction process conditions. The specific test case examined the extraction of neodymium nitrate (Nd(NO3)3) from an aqueous nitric acid (HNO3) feed into a tri-n-butyl phosphate (TBP)/n-dodecane solvent. Demonstration testing of this system included diverting a sample from the aqueous feed while monitoring every phase of the process with the on-line spectroscopic process monitoring system. The purpose of this demonstration was to test whether spectroscopic monitoring is capable of determining the mass balance of the metal nitrate species involved in a cross-current solvent extraction scheme while a sample is being diverted from the system. The diversion scenario involved diverting a portion of the feed from a counter-current extraction system while a continuous extraction experiment was underway. A successful test would demonstrate the ability of the process monitoring system to detect and quantify the diversion of material from the system during a real-time continuous solvent extraction experiment. The system was designed to mimic a PUREX-type extraction process with a bank of four centrifugal contactors. The aqueous feed contained Nd(NO3)3 in HNO3, and the organic phase was composed of TBP/n-dodecane. The amount of sample observed to be diverted by on-line spectroscopic process monitoring was measured to be 3 mmol (3 × 10⁻³ mol) Nd3+. This value was in excellent agreement with the 2.9 mmol Nd3+ value based on the known mass of sample taken (i.e., diverted) directly from the system feed solution.
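
    The underlying accountability check can be illustrated in a few lines: compare the inventory fed to the contactor bank against the sum observed spectroscopically in the outgoing phases, and flag any residual above the measurement-noise allowance. The numbers and threshold below are invented, not the experiment's values.

        # Mass-balance diversion check for a solvent extraction bank.
        feed_in_mmol = 125.0      # analyte fed over the run (conc x flow x time)
        raffinate_mmol = 61.5     # seen by the aqueous-side spectrometer
        organic_mmol = 60.5       # seen by the organic-side spectrometer

        unaccounted = feed_in_mmol - (raffinate_mmol + organic_mmol)
        THRESHOLD_MMOL = 1.0      # measurement-noise allowance (assumed)

        if abs(unaccounted) > THRESHOLD_MMOL:
            print(f"possible diversion: {unaccounted:.1f} mmol unaccounted for")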

  18. Marketing automation processes as a way to improve contemporary marketing of a company

    OpenAIRE

    Witold Świeczak

    2013-01-01

    The main aim of this article is to identify the possibilities offered to contemporary companies by the processes included in a marketing automation system. This publication deals with the key aspects of this issue. It shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation. This article defines the factors and processes which influenc...

  19. Automated Computer Systems for Manufacturability Analyses and Tooling Design : Applied to the Rotary Draw Bending Process

    OpenAIRE

    Johansson, Joel

    2011-01-01

    Intensive competition on the global market puts great pressure on manufacturing companies to develop and produce products that meet requirements from customers and investors. One key factor in meeting these requirements is the efficiency of the product development and the production preparation processes. Design automation is a powerful tool to increase efficiency in these two processes. The benefits of automating the manufacturability analysis process, a part of the production preparation pr...

  20. Design Automation Systems for Production Preparation : Applied on the Rotary Draw Bending Process

    OpenAIRE

    Johansson, Joel

    2008-01-01

    Intensive competition on the global market puts great pressure on manufacturing companies to develop and produce products that meet requirements from customers and investors. One key factor in meeting these requirements is the efficiency of the product development and the production preparation process. Design automation is a powerful tool to increase efficiency in these two processes. The benefits of automating the production preparation process are shortened lead-time, improved product perfo...

  1. PetriJet Platform Technology: An Automated Platform for Culture Dish Handling and Monitoring of the Contents.

    Science.gov (United States)

    Vogel, Mathias; Boschke, Elke; Bley, Thomas; Lenk, Felix

    2015-08-01

    Due to the size of the required equipment, automated laboratory systems are often unavailable or impractical for use in small- and mid-sized laboratories. However, recent developments in automation engineering provide endless possibilities for incorporating benchtop devices. Here, the authors describe the development of a platform technology to handle sealed culture dishes. The programming is based on the Petri net method and implemented via Codesys V3.5 pbF. The authors developed a system of three independently driven electrical axes capable of handling sealed culture dishes. The device performs two different processes. First, it automatically obtains an image of every processed culture dish. Second, a server-based image analysis algorithm provides the user with several parameters of the cultivated sample on the culture dish. For demonstration purposes, the authors developed a continuous, systematic, nondestructive, and quantitative method for monitoring the growth of a hairy root culture. New results can be displayed with respect to the previous images. This system is highly accurate, and the results can be used to simulate the growth of biological cultures. The authors believe that the innovative features of this platform can be implemented, for example, in the food industry, clinical environments, and research laboratories. PMID:25787804

  2. Development of a Fully Automated, GPS Based Monitoring System for Disaster Prevention and Emergency Preparedness: PPMS+RT

    Directory of Open Access Journals (Sweden)

    Anna Szostak-Chrzanowski

    2007-06-01

    Full Text Available The increasing number of structural collapses, slope failures and other natural disasters has led to a demand for new sensors, sensor integration techniques and data processing strategies for deformation monitoring systems. In order to meet extraordinary accuracy requirements for displacement detection in recent deformation monitoring projects, research has been devoted to integrating the Global Positioning System (GPS) as a monitoring sensor. Although GPS has been used for monitoring purposes worldwide, certain environments pose challenges where conventional processing techniques cannot provide the required accuracy with sufficient update frequency. Described is the development of a fully automated, continuous, real-time monitoring system that employs GPS sensors and pseudolite technology to meet these requirements in such environments. Ethernet and/or serial port communication techniques are used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely based upon client needs. A test was conducted that illustrated that a 10 mm displacement was remotely detected at a target point using the designed system. This information could then be used to signal an alarm if conditions are deemed to be unsafe.

  3. Automated Identification of Volcanic Plumes using the Ozone Monitoring Instrument (OMI)

    Science.gov (United States)

    Flower, V. J. B.; Oommen, T.; Carn, S. A.

    2015-12-01

    Volcanic eruptions are a global phenomenon with an increasing impact on human populations, due to factors such as the extension of population centres into areas of higher risk, the expansion of agricultural sectors to accommodate increased production, and the growing impact of volcanic plumes on air travel. Where extensive monitoring is present, these impacts can be moderated by ground-based monitoring and alert systems; however, many volcanoes have little or no monitoring capability. In many of these regions, volcanic alerts are generated by local communities with limited resources or formal communication systems, with additional eruption alerts sometimes resulting from chance encounters with passing aircraft. In contrast, satellite-based remote sensing instruments can provide near-global daily monitoring, facilitating automated volcanic eruption detection. One such system, known as MODVOLC, generates eruption alerts through the detection of thermal anomalies and is currently operational using moderate-resolution MODIS satellite data. In this work we outline a method to distinguish SO2 eruptions from background levels recorded by the Ozone Monitoring Instrument (OMI) through the identification and classification of volcanic activity over a 5-year period. The incorporation of these data into a logistic regression model facilitated the classification of volcanic events with an overall accuracy of 80% while consistently identifying plumes with a mass of 400 tons or higher. The implementation of the developed model could facilitate the near-real-time identification of new and ongoing volcanic activity on a global scale.
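
    The classification step can be sketched as follows, assuming features such as retrieved SO2 mass, plume area and maximum column amount (the actual feature set is not specified here); a scikit-learn logistic regression stands in for the authors' model.

        # Logistic regression separating eruption plumes from background scenes.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical training scenes: [SO2 mass (tons), area (px), max DU]
        X = np.array([[900, 120, 8.0], [450, 60, 4.5], [30, 15, 0.7],
                      [1500, 200, 12.0], [60, 25, 1.1], [400, 55, 3.8]])
        y = np.array([1, 1, 0, 1, 0, 1])   # 1 = eruption plume, 0 = background

        clf = LogisticRegression().fit(X, y)
        print("P(plume):", clf.predict_proba([[420, 70, 4.0]])[0, 1])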

  4. Fully Automated Field-Deployable Bioaerosol Monitoring System Using Carbon Nanotube-Based Biosensors.

    Science.gov (United States)

    Kim, Junhyup; Jin, Joon-Hyung; Kim, Hyun Soo; Song, Wonbin; Shin, Su-Kyoung; Yi, Hana; Jang, Dae-Ho; Shin, Sehyun; Lee, Byung Yang

    2016-05-17

    Much progress has been made in the field of automated monitoring systems for airborne pathogens. However, such systems still lack the robustness and stability necessary for field deployment. Here, we demonstrate a bioaerosol automonitoring instrument (BAMI) specifically designed for the in situ capture and continuous monitoring of airborne fungal particles. This was made possible by developing highly sensitive and selective fungi sensors based on two-channel carbon nanotube field-effect transistors (CNT-FETs), followed by integration with a bioaerosol sampler, a Peltier cooler for receptor lifetime enhancement, and a pumping assembly for fluidic control. These four main components work together to enable the real-time monitoring of fungi. The two-channel CNT-FETs can detect two different fungal species simultaneously. The Peltier cooler effectively lowers the working temperature of the sensor device, resulting in extended sensor lifetime and receptor stability. The system performance was verified both under laboratory conditions and in real residential areas. The system response was in accordance with the reported distribution of fungal species in the environment. Our system is versatile enough that it can easily be modified for the monitoring of other airborne pathogens. We expect that our system will expedite the development of hand-held and portable systems for airborne bioaerosol monitoring. PMID:27070239

  5. Process monitoring in solar cell manufacturing

    International Nuclear Information System (INIS)

    In this paper, the authors describe a new method that is capable of on-line monitoring of several solar cell process steps such as texturing, AR coatings, and metal contact properties. The measurement technique is rapid and specifically designed for solar cells and wafers. The system implementing this new concept is named ''PV Reflectometer.'' The idea was originally conceived several years ago and the principle of the method has been demonstrated for some simple cases. Recently, this method has been improved to be more suitable for commercial applications. For completeness, the paper first includes a brief review of the process control requirements and the common monitoring methods in solar cell production

  6. Automated Performance Monitoring Data Analysis and Reporting within the Open Source R Environment

    Science.gov (United States)

    Kennel, J.; Tonkin, M. J.; Faught, W.; Lee, A.; Biebesheimer, F.

    2013-12-01

    Environmental scientists encounter quantities of data at a rate that in many cases outpaces our ability to appropriately store, visualize and convey the information. The free software environment, R, provides a framework for efficiently processing, analyzing, depicting and reporting on data from a multitude of formats in the form of traceable and quality-assured data summary reports. Automated data summary reporting leverages document markup languages such as markdown, HTML, or LaTeX using R-scripts capable of completing a variety of simple or sophisticated data processing, analysis and visualization tasks. Automated data summary reports seamlessly integrate analysis into report production with calculation outputs - such as plots, maps and statistics - included alongside report text. Once a site-specific template is set up, including data types, geographic base data and reporting requirements, reports can be (re-)generated trivially as the data evolve. The automated data summary report can be a stand-alone report, or it can be incorporated as an attachment to an interpretive report prepared by a subject-matter expert, thereby providing the technical basis to report on and efficiently evaluate large volumes of data resulting in a concise interpretive report. Hence, the data summary report does not replace the scientist, but relieves them of repetitive data processing tasks, facilitating a greater level of analysis. This is demonstrated using an implementation developed for monthly groundwater data reporting for a multi-constituent contaminated site, highlighting selected analysis techniques that can be easily incorporated in a data summary report.
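
    The record's implementation is R-based; the sketch below shows the same regenerate-on-demand idea in Python: a markdown summary is re-rendered from the latest monitoring data, so the report always tracks the data. Site name, constituents and values are placeholders.

        # Regenerate a markdown data summary report from current monitoring data.
        import statistics, datetime

        def render_report(site, samples):
            lines = [f"# Monthly groundwater summary: {site}",
                     f"_generated {datetime.date.today()}_", ""]
            for constituent, values in samples.items():
                lines += [f"## {constituent}",
                          f"- n = {len(values)}",
                          f"- mean = {statistics.mean(values):.2f}",
                          f"- max = {max(values):.2f}", ""]
            return "\n".join(lines)

        data = {"Chloride (mg/L)": [24.0, 27.5, 22.1],
                "Nitrate (mg/L)": [1.2, 0.9, 1.4]}
        print(render_report("MW-01", data))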

  7. Monitoring and controlling the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Ahring, B.K.; Angelidaki, I. [The Technical Univ. of Denmark, Dept. of Environmental Science and Engineering, Lyngby (Denmark)

    1997-08-01

    Many modern large-scale biogas plants have been constructed recently, increasing the demand for proper monitoring and control of these large reactor systems. For monitoring the biogas process, an easy-to-measure and reliable indicator is required which reflects the metabolic state and the activity of the bacterial populations in the reactor. In this paper, we discuss existing indicators as well as indicators under development which can potentially be used to monitor the state of the biogas process in a reactor. Furthermore, data are presented from two large-scale thermophilic biogas plants subjected to temperature changes, in which the concentration of volatile fatty acids (VFA) was monitored. The results clearly demonstrated that significant changes in the concentrations of the individual VFAs occurred even though the biogas production was not significantly changed. In particular, the concentrations of butyrate, isobutyrate and isovalerate showed significant changes. Future improvements of process control could therefore be based on monitoring of the concentrations of specific VFAs together with information about the bacterial populations in the reactor. The latter information could be supplied by the use of modern molecular techniques. (au) 51 refs.

  8. A report on the impact of automation in the food process industry

    OpenAIRE

    Dudbridge, Michael

    2008-01-01

    Research Objectives: To understand how the food industry in Europe is using automation; to ascertain what the food processing industry requires from equipment suppliers; and, furthermore, to identify variations by sector and by country.

  9. Robust processing of mining subsidence monitoring data

    Energy Technology Data Exchange (ETDEWEB)

    Wang Mingzhong; Huang Guogang [Pingdingshan Mining Bureau (China); Wang Yunjia; Guogangli [China Univ. of Mining and Technology, Xuzhou (China)

    1996-12-31

    Since China began research on mining subsidence in the 1950s, more than one thousand observation lines have been measured. Yet monitoring data sometimes contain quite a lot of outliers because of the limits of observation and of geological and mining conditions. In China, the method of processing mining subsidence monitoring data is currently based on the principle of least squares, which can produce lower accuracy, less reliability, or even errors. For the reasons given above, the authors, taking account of the actual situation in China, have done research on the robust processing of mining subsidence monitoring data with respect to obtaining prediction parameters. The authors have derived the related formulas, designed computational programs, performed a large quantity of calculation and simulation, and achieved good results. (orig.)
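
    The contrast between least squares and a robust estimator is easy to demonstrate; in the sketch below (a stand-in for the authors' method, using a Huber loss) two gross outliers pull the least-squares slope well away from the truth, while the robust fit stays close.

        # Least squares vs. robust (Huber) fitting on outlier-contaminated data.
        import numpy as np
        from sklearn.linear_model import LinearRegression, HuberRegressor

        x = np.arange(20, dtype=float).reshape(-1, 1)   # distance along line
        y = 3.0 * x.ravel() + 5.0                       # "true" subsidence trend
        y[[4, 12]] += 80.0                              # two gross outliers

        ls = LinearRegression().fit(x, y)
        huber = HuberRegressor().fit(x, y)
        print(f"least squares slope: {ls.coef_[0]:.2f}")    # pulled off 3.0
        print(f"huber slope:         {huber.coef_[0]:.2f}") # close to 3.0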

  10. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to gain insight into overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing in the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitoring and operating complex and distributed computing systems, in particular the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  11. Automated radiological monitoring at a Russian Ministry of Defence Naval Site

    International Nuclear Information System (INIS)

    The Arctic Military Environmental Cooperation (AMEC) Program is a cooperative effort between the military establishments of the Kingdom of Norway, the Russian Federation, and the US. This paper discusses joint activities conducted over the past year among Norwegian, Russian, and US technical experts on a project to develop, demonstrate and implement automated radiological monitoring at Russian Navy facilities engaged in the dismantlement of nuclear-powered strategic ballistic missile launching submarines. Radiological monitoring is needed at these facilities to help protect workers engaged in the dismantlement program and the public living within the footprint of routine and accidental radiation exposure areas. By providing remote stand-alone monitoring, the Russian Navy will achieve added protection due to the defense-in-depth strategy afforded by local (at the site), regional (Kola) and national-level (Moscow) oversight. The system being implemented at the Polyaminsky Russian Naval Shipyard was developed from a working model tested at the Russian Institute for Nuclear Safety, Moscow, Russia. It includes Russian-manufactured terrestrial and underwater gamma detectors, smart controllers for graded sampling, radio-modems for offsite transmission of the data, and a data fusion/display system. The data fusion/display system is derived from the Norwegian Picasso AMEC Environmental Monitoring software package. This computer package allows monitoring personnel to review the real-time and historical status of monitoring at specific sites and objects and to establish new monitoring protocols as required, for example in an off-normal accident situation. Plans are being developed to implement the use of this system at most RF Naval sites handling spent nuclear fuel.

  12. Automated outlier detection framework for identifying damage states in multi-girder steel bridges using long-term wireless monitoring data

    Science.gov (United States)

    O'Connor, Sean M.; Zhang, Yilan; Lynch, Jerome P.

    2015-04-01

    Advances in wireless sensor technology have enabled low-cost and extremely scalable sensing platforms, prompting high-density sensor installations. High-density long-term monitoring generates a wealth of sensor data, demanding an efficient means of data storage and data processing for the extraction of information pertinent to the decision making of bridge owners. This paper reports on decision-making inferences drawn from automated processing of long-term highway bridge data. The Telegraph Road Bridge (TRB) demonstration testbed for sensor technology innovation and data processing tool development has been instrumented with a long-term wireless structural monitoring system that has been in operation since September 2011. The monitoring system has been designed to specifically address concerns stated by the Michigan Department of Transportation regarding pin-and-hanger steel girder bridges. The sensing strategy consists of strain, acceleration and temperature sensors deployed to track specific damage modalities common to multi-girder steel-concrete composite bridges using link plate assemblies. To efficiently store and process long-term sensor data, the TRB monitoring system operates around the SenStore database system. SenStore combines sensor data with bridge information (e.g., material properties, geometry, boundary conditions) and exposes an application programming interface to enable automated data extraction by processing tools. Large long-term data sets are modeled for environmental and operational influence by regression methods. Response processes are characterized by statistical parameters extracted from the long-term data and used to automate decision support in an outlier-detection, or statistical process control, framework.
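
    A minimal sketch of such an outlier-detection framework, under simple assumptions: regress a response on temperature to strip environmental influence, then flag residuals outside 3-sigma control limits. The data and the linear environmental model are synthetic stand-ins.

        # Statistical process control on regression residuals.
        import numpy as np

        rng = np.random.default_rng(1)
        temp = rng.uniform(-5, 35, 500)                  # deg C
        strain = 2.0 * temp + rng.normal(0, 1.0, 500)    # microstrain, healthy
        strain[-20:] += 8.0                              # simulated damage shift

        # Fit the environmental model on an early "training" period.
        b, a = np.polyfit(temp[:300], strain[:300], 1)   # slope, intercept
        resid = strain - (a + b * temp)
        sigma = resid[:300].std()

        outliers = np.flatnonzero(np.abs(resid) > 3 * sigma)
        print(f"{len(outliers)} samples outside control limits")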

  13. Definition and First Year of a New International Master in Industrial Processes Automation

    OpenAIRE

    Witrant, Emmanuel; Thiriet, Jean-Marc; Retière, Nicolas

    2010-01-01

    The aim of this paper is to present a new international master curriculum in the field of information technologies (IT) focused on industrial processes automation (IPA), proposed by University Joseph Fourier (UJF) / University of Grenoble, France. The education objectives are set according to the latest concepts developed in IT for process automation, with clear specifications towards engineering and industry. The local and international frameworks are analyzed and the education objectives ex...

  14. Automation and computer integrated manufacturing in food processing industry: an appraisal

    OpenAIRE

    Mohamed, Ayad Khalifa

    2003-01-01

    This study is concerned with a research programme on automation and computer integrated manufacturing (CIM) in the food processing industry, culminating in an implementation framework detailing the extent of automation and the application of computer-based technologies in Irish food processing industries. This work involved designing a postal survey questionnaire and mailing it to 221 manufacturing companies, and designing a web-based survey and emailing it to 31 manufacturing companies i...

  15. Coating Process Monitoring Using Computer Vision

    OpenAIRE

    Veijola, Erik

    2013-01-01

    The aim of this Bachelor's Thesis was to make a prototype system for Metso Paper Inc. for monitoring a paper roll coating process. If the coating is done badly and there are faults, the process has to be redone, which lowers the profits of the company since the process is costly. The work was proposed by Seppo Parviainen in December 2012. The resulting system was to alarm the personnel of faults in the process, specifically if the system that is applying the synthetic resin on to the roll...

  16. Automated data evaluation and modelling of simultaneous (19) F-(1) H medium-resolution NMR spectra for online reaction monitoring.

    Science.gov (United States)

    Zientek, Nicolai; Laurain, Clément; Meyer, Klas; Paul, Andrea; Engel, Dirk; Guthausen, Gisela; Kraume, Matthias; Maiwald, Michael

    2016-06-01

    Medium-resolution nuclear magnetic resonance spectroscopy (MR-NMR) is currently developing into an important analytical tool for both quality control and process monitoring. In contrast to high-resolution online NMR (HR-NMR), MR-NMR can be operated under rough environmental conditions. A continuously re-circulating stream of reaction mixture from the reaction vessel to the NMR spectrometer enables a non-invasive, volume-integrating online analysis of reactants and products. Here, we investigate the esterification of 2,2,2-trifluoroethanol with acetic acid to 2,2,2-trifluoroethyl acetate both by 1H HR-NMR (500 MHz) and by 1H and 19F MR-NMR (43 MHz) as a model system. The parallel online measurement is realised by splitting the flow, which allows the adjustment of quantitative and independent flow rates in both the HR-NMR probe and the MR-NMR probe, in addition to a fast bypass line back to the reactor. One of the fundamental acceptance criteria for online MR-NMR spectroscopy is a robust data treatment and evaluation strategy with the potential for automation. The MR-NMR spectra are treated by an automated baseline and phase correction using the minimum entropy method. The evaluation strategies comprise (i) direct integration, (ii) automated line fitting, (iii) indirect hard modelling (IHM) and (iv) partial least squares regression (PLS-R). To assess the potential of these evaluation strategies for MR-NMR, prediction results are compared with the line fitting data derived from quantitative HR-NMR spectroscopy. Although superior results are obtained from both IHM and PLS-R for 1H MR-NMR, the latter especially demands elaborate data pretreatment, whereas IHM models needed no previous alignment. Copyright © 2015 John Wiley & Sons, Ltd. PMID:25854892
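
    Strategy (iv) can be sketched with scikit-learn's PLS implementation, training on synthetic single-peak spectra with known concentrations; a real model would be calibrated against reference analytics.

        # PLS regression mapping spectra to concentration.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        ppm = np.linspace(0, 10, 300)
        conc = rng.uniform(0.1, 1.0, 40)                  # mol/L, reference values
        peak = np.exp(-0.5 * ((ppm - 4.2) / 0.15) ** 2)   # product resonance shape
        spectra = conc[:, None] * peak + rng.normal(0, 0.01, (40, 300))

        pls = PLSRegression(n_components=2).fit(spectra, conc)
        test = 0.55 * peak + rng.normal(0, 0.01, 300)
        print(f"predicted conc: {pls.predict(test.reshape(1, -1))[0, 0]:.2f} mol/L")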

  17. A Camera and Multi-Sensor Automated Station Design for Polar Physical and Biological Systems Monitoring: AMIGOS

    Science.gov (United States)

    Bohlander, J. A.; Ross, R.; Scambos, T.; Haran, T. M.; Bauer, R. J.

    2012-12-01

    The Automated Meteorology - Ice/Indigenous species - Geophysics Observation System (AMIGOS) consists of a set of measurement instruments and camera(s) controlled by a single-board computer with a simplified Linux operating system and an Iridium satellite modem supporting two-way communication. Primary features of the system relevant to polar operations are low power requirements, daily data uploading, reprogramming, tolerance for low temperatures, and various approaches for automatic resets and recovery from low-power or cold shut-down. Instruments include a compact weather station, C/A or dual-frequency GPS, solar flux and reflectivity sensors, sonic snow gages, a simplified radio-echo-sounder, and a resistance thermometer string in the firn column. In the current state of development, there are two basic designs. One is intended for in situ observations of glacier conditions. The other supports a high-resolution camera for monitoring biological or geophysical systems from short distances (100 m to 20 km). The stations have been successfully used in several locations for operational support, monitoring rapid ice changes in response to climate change or iceberg drift, and monitoring penguin colony activity. As of June 2012, there are 9 AMIGOS systems installed, all on the Antarctic continent. The stations are a working prototype for a planned series of upgraded stations, currently termed 'Sentinels'. These stations would carry further instrumentation, communications, and processing capability to investigate ice-ocean interaction from ice tongue, ice shelf, or fjord coastline areas.

  18. Automated swimming activity monitor for examining temporal patterns of toxicant effects on individual Daphnia magna.

    Science.gov (United States)

    Bahrndorff, Simon; Michaelsen, Thomas Yssing; Jensen, Anne; Marcussen, Laurits Faarup; Nielsen, Majken Elley; Roslev, Peter

    2016-07-01

    Aquatic pollutants are often biologically active at low concentrations and impact on biota in combination with other abiotic stressors. Traditional toxicity tests may not detect these effects, and there is a need for sensitive high-throughput methods for detecting sublethal effects. We have evaluated an automated infra-red (IR) light-based monitor for recording the swimming activity of Daphnia magna to establish temporal patterns of toxicant effects on an individual level. Activity was recorded for 48 h and the sensitivity of the monitor was evaluated by exposing D. magna to the reference chemicals K2Cr2O7 at 15, 20 and 25 °C and 2,4-dichlorophenol at 20 °C. Significant effects (P < 0.001) of toxicant concentration, exposure time and incubation temperature were observed. At 15 °C, the swimming activity remained unchanged for 48 h at sublethal concentrations of K2Cr2O7, whereas activity at 20 and 25 °C was more biphasic, with decreases in activity occurring after 12-18 h. A similar biphasic pattern was observed after 2,4-dichlorophenol exposure at 20 °C. EC50 values for 2,4-dichlorophenol and K2Cr2O7 determined from automated recording of swimming activity showed increasing toxicity with time, corresponding to decreases in EC50 of 0.03-0.07 mg l(-1) h(-1). EC50 values determined after 48 h were comparable to or lower than EC50 values based on visual inspection according to ISO 6341. The results demonstrated that the swimming activity monitor is capable of detecting sublethal behavioural effects that are toxicant- and temperature-dependent. The method allows EC values to be established at different time points and can serve as a high-throughput screening tool in toxicity testing. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26198804
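
    The time-resolved EC50 values described can be obtained by fitting a dose-response curve to the activity data at each time point; the sketch below fits a log-logistic model with SciPy. Concentrations and responses are illustrative, not the study's data.

        # Fit a log-logistic dose-response curve and extract the EC50.
        import numpy as np
        from scipy.optimize import curve_fit

        def dose_response(c, top, ec50, hill):
            return top / (1.0 + (c / ec50) ** hill)

        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])    # mg/L K2Cr2O7 (assumed)
        activity = np.array([98., 92., 70., 35., 8.])  # % of control at 24 h

        popt, _ = curve_fit(dose_response, conc, activity, p0=[100., 1.0, 1.0])
        print(f"EC50 at 24 h: {popt[1]:.2f} mg/L")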

  19. Process automation using combinations of process and machine control technologies with application to a continuous dissolver

    International Nuclear Information System (INIS)

    Operation of a continuous rotary dissolver, designed to leach uranium-plutonium fuel from chopped sections of reactor fuel cladding using nitric acid, has been automated. The dissolver is a partly continuous, partly batch process that interfaces at both ends with batchwise processes, thereby requiring synchronization of certain operations. Liquid acid is fed and flows through the dissolver continuously, whereas chopped fuel elements are fed to the dissolver in small batches and move through the compartments of the dissolver stagewise. Sequential logic (or machine control) techniques are used to control discrete activities such as the sequencing of isolation valves. Feedback control is used to control acid flowrates and temperatures. Expert systems technology is used for on-line material balances and diagnostics of process operation. 1 ref., 3 figs
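
    The two control layers combine naturally in software; the sketch below pairs a tiny sequential state machine (the machine-control layer) with a PI update for the continuous acid-flow loop. States, setpoints and gains are illustrative assumptions, not the plant's logic.

        # Sequential logic for batch steps plus feedback control for acid flow.
        class DissolverSequencer:
            STATES = ["LOAD_BATCH", "ROTATE_STAGE", "VERIFY_EMPTY", "DISCHARGE"]

            def __init__(self):
                self.i = 0

            def step(self):
                state = self.STATES[self.i]
                print("executing:", state)   # open/close valves, index drum...
                self.i = (self.i + 1) % len(self.STATES)

        def pi_acid_flow(setpoint, measured, integral, kp=0.8, ki=0.1, dt=1.0):
            """One PI update for the continuous acid feed loop."""
            err = setpoint - measured
            integral += err * dt
            return kp * err + ki * integral, integral

        seq = DissolverSequencer()
        for _ in range(4):
            seq.step()
        u, i_term = pi_acid_flow(setpoint=1.5, measured=1.2, integral=0.0)
        print(f"acid valve command: {u:.2f}")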

  20. Verifiable process monitoring through enhanced data authentication

    International Nuclear Information System (INIS)

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.
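
    Conceptually, the data path can be sketched as sign-then-encrypt. The example below uses the Python cryptography package (an assumption for illustration, not the EDAS implementation) with Ed25519 signatures for joint verifiability and symmetric encryption for confidentiality; key handling here is simplified far beyond a real safeguards system.

        # Sign captured instrument data, then encrypt the branched stream.
        import json, time
        from cryptography.fernet import Fernet
        from cryptography.hazmat.primitives.asymmetric.ed25519 import (
            Ed25519PrivateKey)

        signing_key = Ed25519PrivateKey.generate()  # held in tamper enclosure
        session_key = Fernet.generate_key()         # shared with inspectorate
        fernet = Fernet(session_key)

        record = json.dumps({"sensor": "density-01", "value": 1.032,
                             "t": time.time()}).encode()  # timestamp + source
        signature = signing_key.sign(record)
        branched = fernet.encrypt(record + b"|" + signature.hex().encode())

        # Inspectorate side: decrypt, split, and verify authenticity.
        payload, sig_hex = fernet.decrypt(branched).rsplit(b"|", 1)
        signing_key.public_key().verify(bytes.fromhex(sig_hex.decode()), payload)
        print("record authenticated:", json.loads(payload))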

  1. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    Energy Technology Data Exchange (ETDEWEB)

    Christianson, Olav; Li, Xiang; Frush, Donald; Samei, Ehsan [Clinical Imaging Physics Group, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Duke University, Durham, North Carolina 27710 (United States); Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708 (United States)]

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA)-compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED_adj). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period, and to analyze the variability in ED_adj between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED_adj values that differed by up to 44% from effective dose
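
    The dose arithmetic can be illustrated in a few lines: effective dose is estimated as the dose-length product times a protocol k-factor, then scaled by a thickness-dependent size correction. The exponential form and coefficients below are illustrative assumptions, not the authors' published correction.

        # Size-adjusted effective dose from DLP, k-factor and patient thickness.
        import math

        def size_adjusted_effective_dose(dlp_mgy_cm, k_factor, thickness_cm,
                                         ref_thickness_cm=30.0, coeff=0.04):
            ed = dlp_mgy_cm * k_factor    # mSv, standard-size estimate
            # Smaller patients absorb relatively more dose for the same output.
            correction = math.exp(coeff * (ref_thickness_cm - thickness_cm))
            return ed * correction

        print(f"ED_adj = {size_adjusted_effective_dose(450.0, 0.015, 24.0):.2f} mSv")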

  2. ADVANCES IN CLOG STATE MONITORING FOR USE IN AUTOMATED REED BED INSTALLATIONS

    Directory of Open Access Journals (Sweden)

    Theodore HUGHES-RILEY

    2014-06-01

    Full Text Available Constructed wetlands are a popular, environmentally conscious form of waste-water treatment that has proliferated across Europe and the rest of the world in recent years. The ability to monitor the conditions in the bed and control input factors such as heating and aeration may extend the lifetime of the reed bed substantially beyond the ten-year lifetime normally reached. The Autonomous Reed Bed Installation (ARBI) project is an EU FP7 initiative to develop a reed bed with automated control over input parameters, based on readings taken from embedded sensors. Automated remedial action may improve bed treatment efficiency and prolong the life of the bed, avoiding the need to refurbish the bed, which is both time consuming and costly. One critical parameter to observe is the clog state of the reed bed, as this can severely impact the efficiency of water treatment to the point of the bed becoming non-operable. Magnetic resonance (MR) sensors can be a powerful tool for determining clogging levels, and have previously been explored in the literature. This work is based on a conference paper (2nd International Conference "Water resources and wetlands", 2014) and details magnetic sensors suitable for long-term embedding into a constructed wetland. Unlike previous studies, this work examines a probe embedded into a wetland.

  3. Completely automated measurement facility (PAVICOM) for track-detector data processing

    CERN Document Server

    Aleksandrov, A B; Feinberg, E L; Goncharova, L A; Konovalova, N S; Martynov, A G; Polukhina, N G; Roussetski, A S; Starkov, NI; Tsarev, V A

    2004-01-01

    A review of the technical capabilities of, and investigations performed using, the completely automated measuring facility PAVICOM is presented. This highly efficient facility for track-detector data processing in the field of nuclear and high-energy particle physics was constructed at the Lebedev Physical Institute. PAVICOM is widely used in Russia for the treatment of experimental data from track detectors (emulsion and solid-state trackers) in high- and low-energy physics, cosmic ray physics, etc., and provides an essential improvement in the efficiency of experimental studies. In contrast to the semi-automated microscopes widely used until now, PAVICOM is capable of performing completely automated measurements of charged particle tracks in nuclear emulsions and track detectors without hard visual work. Track images are recorded by CCD cameras and then digitized and converted into files, accelerating experimental data processing by approximately a thousand times. Completely autom...

  4. A prototype of an automated high resolution InSAR volcano-monitoring system in the MED-SUV project

    Science.gov (United States)

    Chowdhury, Tanvir A.; Minet, Christian; Fritz, Thomas

    2016-04-01

    Volcanic processes, which produce a variety of geological and hydrological hazards, are difficult to predict and capable of triggering natural disasters on regional to global scales. It is therefore important to monitor volcanoes continuously and with high spatial and temporal sampling rates. The monitoring of active volcanoes requires the reliable measurement of surface deformation before, during and after volcanic activity, and it supports better understanding and modelling of the geophysical processes involved. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and the small baseline subset algorithm (SBAS) provide powerful tools for observing eruptive activity and measuring surface changes with millimetre accuracy. All the mentioned techniques, with deformation time-series extraction, address these challenges by exploiting medium to large SAR image stacks. The process of selecting, ordering, downloading, storing, logging, extracting and preparing the data for processing is very time consuming, has to be done manually for every single data stack, and in many cases is an iterative process that has to be repeated regularly and continuously. Data processing therefore becomes slow, which causes significant delays in data delivery. The SAR Satellite based High Resolution Data Acquisition System, which will be developed at DLR, will automate these time-consuming tasks and allow an operational volcano monitoring system. Every 24 hours the system searches for newly acquired scenes over the volcanoes, keeps track of the data orders, logs their status and downloads the provided data via ftp transfer, including e-mail alerts. Furthermore, the system will deliver specified reports and maps to a database for review and use by specialists. User interaction will be minimized and iterative manual processes will be avoided entirely. In this presentation, a prototype of the SAR Satellite based High Resolution Data

  5. Knowledge Acquisition, Validation, and Maintenance in a Planning System for Automated Image Processing

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering the fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.

  6. Automation of nuclear power plants with a decentralized process control system

    International Nuclear Information System (INIS)

    In the automation of nuclear power plants there is a trend toward microprocessor-controlled automation systems with bus data transmission. The state of the art is discussed, using the BBC PROCONTROL system as an example. Other decentralized process control systems - e.g., TELEPERM M by Siemens or CONTRONIC 3 by Hartmann + Braun - arrive at more or less the same solutions. (orig./WB)

  7. Using Natural Language Processing to Improve Accuracy of Automated Notifiable Disease Reporting

    OpenAIRE

    Friedlin, Jeff; Grannis, Shaun; Overhage, J Marc

    2008-01-01

    We examined whether using a natural language processing (NLP) system results in improved accuracy and completeness of automated electronic laboratory reporting (ELR) of notifiable conditions. We used data from a community-wide health information exchange that has automated ELR functionality. We focused on methicillin-resistant Staphylococcus aureus (MRSA), a reportable infection found in unstructured, free-text culture result reports. We used the Regenstrief EXtraction tool (REX) for this wor...

  8. Seismic monitoring of torrential and fluvial processes

    Science.gov (United States)

    Burtin, Arnaud; Hovius, Niels; Turowski, Jens M.

    2016-04-01

    In seismology, the signal is usually analysed for earthquake data, but earthquakes represent less than 1% of continuous recordings. The remaining data were long treated as seismic noise and ignored. Over the past decades, the analysis of seismic noise has constantly increased in popularity, and this has led to the development of new approaches and applications in geophysics. The study of continuous seismic records is now open to other disciplines, like geomorphology. The motion of mass at the Earth's surface generates seismic waves that are recorded by nearby seismometers and can be used to monitor mass transfer throughout the landscape. Surface processes vary in nature, mechanism, magnitude, space and time, and this variability can be observed in the seismic signals. This contribution gives an overview of the development and current opportunities for the seismic monitoring of geomorphic processes. We first describe the common principles of seismic signal monitoring and introduce time-frequency analysis for the purpose of identification and differentiation of surface processes. Second, we present techniques to detect, locate and quantify geomorphic events. Third, we review the diverse layouts of seismic arrays and highlight their advantages and limitations for specific processes, like slope or channel activity. Finally, we illustrate all these characteristics with the analysis of seismic data acquired in a small debris-flow catchment where geomorphic events show interactions and feedbacks. Further developments must aim to fully understand the richness of the continuous seismic signals, to better quantify geomorphic activity and to improve the performance of warning systems. Seismic monitoring may ultimately allow the continuous survey of erosion and sediment transfer in the landscape on the scales of external forcing.
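
    Time-frequency analysis of a continuous trace is the usual first step; the sketch below computes a spectrogram of a synthetic trace containing one transient and locates the burst of energy in a chosen band. Real data would come from a seismometer (e.g., via ObsPy), and the band and detection rule are illustrative.

        # Spectrogram-based detection of a transient in a continuous trace.
        import numpy as np
        from scipy.signal import spectrogram

        fs = 100.0                                      # Hz sampling rate
        t = np.arange(0, 600, 1 / fs)
        trace = np.random.default_rng(0).normal(0, 1, t.size)
        burst = (t > 300) & (t < 320)                   # 20 s geomorphic event
        trace[burst] += 5 * np.sin(2 * np.pi * 15 * t[burst])  # energy near 15 Hz

        f, seg_t, Sxx = spectrogram(trace, fs=fs, nperseg=1024)
        band = (f > 10) & (f < 20)
        power = Sxx[band].mean(axis=0)
        print("event window detected near t =", seg_t[power.argmax()], "s")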

  9. Chapter 11. The energy supply of electrolytic series. The mechanization and automation of electrolysis process

    International Nuclear Information System (INIS)

    This chapter is devoted to the energy supply of electrolytic series and to the mechanization and automation of the electrolysis process. The energy supply of electrolytic series is considered, as is the mechanization of the maintenance of electrolytic cells. An automatic control system for the technological process is proposed.

  10. STAMPS: software tool for automated MRI post-processing on a supercomputer

    OpenAIRE

    Bigler, Don C.; Aksu, Yaman; Yang, Qing X.

    2009-01-01

    This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features....

  11. The Effects of Automated Prompting and Self-Monitoring on Homework Completion for a Student with Attention Deficit Hyperactivity Disorder

    Science.gov (United States)

    Blicha, Amy; Belfiore, Phillip J.

    2013-01-01

    This study examined the effects of an intervention consisting of automated prompting and self-monitoring on the level of independent homework task completion for an elementary-age student with attention deficit hyperactivity disorder (ADHD). Instituting a single-subject, within-series ABAB design, the results showed a consistent increase and…

  12. A wireless smart sensor network for automated monitoring of cable tension

    International Nuclear Information System (INIS)

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea. (paper)

  13. A wireless smart sensor network for automated monitoring of cable tension

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jong-Woong; Cho, Soojin; Spencer, Billie F., Jr.; Jung, Hyung-Jo

    2014-02-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea.
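
    The vibration-based tension estimation referenced in both records above commonly rests on the taut-string relation between a cable's natural frequencies and its tension, T = 4 m L^2 (f_n / n)^2. A sketch under assumed cable properties follows; the abstract does not specify the exact in-network formulation used on the Imote2 nodes.

    # Taut-string estimate of cable tension from measured natural frequencies.
    # m: mass per unit length (kg/m); L: cable length (m); values are assumed.
    def tension_from_frequency(f_n, n, m, L):
        """Return tension (N) from the n-th natural frequency f_n (Hz)."""
        return 4.0 * m * L**2 * (f_n / n) ** 2

    m, L = 80.0, 120.0                                   # assumed cable properties
    for n, f in enumerate([1.05, 2.11, 3.18], start=1):  # peak-picked frequencies (Hz)
        print(n, round(tension_from_frequency(f, n, m, L) / 1000, 1), "kN")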

  14. Automated assay data processing and quality control: A review and recommendations

    International Nuclear Information System (INIS)

    Automated data processing and quality control of assays offers not only increased speed but also a more thorough and statistically rigorous analysis of results. This review outlines the motivations, statistical definitions, and mathematical methods pertinent to assay data processing. The presentation concentrates on basic concepts rather than specific mathematical formulae. The numerous automated calibration procedures are discussed and summarized in tabular form. A comprehensive view of data processing is offered which includes much more than simple calibration and interpolation. A small number of calculator and computer programs which provide an acceptably detailed statistical analysis of assays are recommended. Finally, possible future developments in hardware and software are discussed. (author)
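
    As one concrete example of the calibration procedures surveyed above, the sketch below fits the widely used four-parameter logistic model to synthetic dose-response data and inverts it to interpolate an unknown; all numbers are made up for illustration, and the review itself is method-agnostic.

    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, a, b, c, d):
        """Response at dose x: a = zero-dose asymptote, d = high-dose asymptote,
        c = mid-point dose (ED50), b = slope factor."""
        return d + (a - d) / (1.0 + (x / c) ** b)

    dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
    resp = np.array([980, 900, 700, 400, 180, 60], dtype=float)  # counts, synthetic

    popt, _ = curve_fit(four_pl, dose, resp, p0=[1000, 1, 3, 50])

    def dose_from_response(y, a, b, c, d):
        # Invert the fitted curve to interpolate an unknown sample.
        return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

    print(dose_from_response(500.0, *popt))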

  15. Microbiological monitoring and automated event sampling at karst springs using LEO-satellites.

    Science.gov (United States)

    Stadler, H; Skritek, P; Sommer, R; Mach, R L; Zerobin, W; Farnleitner, A H

    2008-01-01

    Data communication via Low-Earth-Orbit (LEO) satellites between portable hydrometeorological measuring stations is the backbone of our system. This networking allows automated event sampling with short time increments, also for E. coli field analysis. All activities in the course of the event sampling can be observed on an internet platform based on a Linux server. Conventionally taken samples compared with the auto-sampling procedure revealed corresponding results and were in agreement with the ISO 9308-1 reference method. E. coli concentrations were individually corrected by event-specific inactivation coefficients (0.10-0.14 day(-1)), compensating for losses due to sample storage at spring temperature in the auto sampler. Two large summer events in 2005/2006 at an important alpine karst spring (LKAS2) were monitored, including detailed analysis of E. coli dynamics (n = 271) together with comprehensive hydrological characterisations. High-resolution time series demonstrated a sudden increase of E. coli concentrations in spring water (approximately 2 log10 units) with a specific time delay after the beginning of the event. Statistical analysis suggested the spectral absorption coefficient measured at 254 nm (SAC254) as an early-warning surrogate for real-time monitoring of faecal input. Together with the LEO-satellite-based system, it is a helpful tool for early-warning systems in the field of drinking water protection. PMID:18776628
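
    The event-specific correction mentioned above amounts to first-order decay compensation: a measured count is scaled by exp(k t) for storage time t. A minimal sketch, using a k value inside the 0.10-0.14 day^-1 range quoted in the abstract and an assumed storage time:

    import math

    def corrected_concentration(measured, k_per_day, storage_days):
        """Back-calculate the concentration at sampling time from the measured
        value, assuming first-order inactivation during storage."""
        return measured * math.exp(k_per_day * storage_days)

    print(corrected_concentration(120.0, 0.12, 1.5))  # e.g. MPN/100 mL after 36 h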

  16. Application of an automated wireless structural monitoring system for long-span suspension bridges

    International Nuclear Information System (INIS)

    This paper describes an automated wireless structural monitoring system installed at the New Carquinez Bridge (NCB). The designed system utilizes a dense network of wireless sensors installed in the bridge but remotely controlled by a hierarchically designed cyber-environment. The early efforts have included performance verification of a dense network of wireless sensors installed on the bridge and the establishment of a cellular gateway to the system for remote access from the internet. Acceleration of the main bridge span was the primary focus of the initial field deployment of the wireless monitoring system. An additional focus of the study is on ensuring wireless sensors can survive for long periods without human intervention. Toward this end, the life-expectancy of the wireless sensors has been enhanced by embedding efficient power management schemes in the sensors while integrating solar panels for power harvesting. The dynamic characteristics of the NCB under daily traffic and wind loads were extracted from the vibration response of the bridge deck and towers. These results have been compared to a high-fidelity finite element model of the bridge.

  17. ROLE AND PECULIARITIES OF PROJECT STREAM IN THE FIELD OF AUTOMATION OF BUSINESS PROCESSES IN COMPANIES

    Directory of Open Access Journals (Sweden)

    Kovalenko A. V.

    2015-06-01

    Full Text Available Effective management of the economic and financial activity of a modern enterprise requires information and software support for its separate components: branches, divisions, and services. Automation of accounting significantly accelerates the delivery, processing and analysis of the information needed for management purposes. An important aspect of introducing a software complex is the concept of its realization: the head of the company should define the development path for automating business processes, which type of implementation is preferable for the enterprise, and which documents will formalize each stage. The article generalizes data obtained from completed projects introducing automated systems in companies from various fields of activity. The main stages of the project stream in the field of business process automation are presented, together with the features and characteristics of each stage and the documentary objects needed to realize each of them. On the basis of the analysis, the authors also describe a number of existing shortcomings in realizing the project stream. With the information given in the article, companies will be able to begin a project to automate their own business effectively and quickly.

  18. Signal Processing for Beam Position Monitors

    CERN Document Server

    Vismara, Giuseppe

    2000-01-01

    At first sight, the problem of determining the beam position from the ratio of the charges induced on opposite electrodes of a beam monitor seems trivial, but up to now no unique solution has been found that fits the various demands of all particle accelerators. The purpose of this paper is to help "instrumentalists" choose the best processing system for their particular application, depending on the machine size, the input dynamic range, the required resolution and the acquisition speed. After a general introduction and an analysis of the electrical signals to be treated (frequency and time domain), the definition of the electronic specifications is reviewed. The tutorial presents the different families into which the processing systems can be grouped. A general description of the operating principles, with the relative advantages and disadvantages of the most widely employed processing systems, is presented. Special emphasis is put on recent technological developments based on telecommunication circ...
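
    The opening sentence refers to the standard normalized difference-over-sum estimate: for induced charges Q_a and Q_b on opposite electrodes, the displacement is approximately x = k (Q_a - Q_b) / (Q_a + Q_b), with k a geometry-dependent constant. A toy version, with an assumed k:

    def bpm_position(q_a, q_b, k_mm=10.0):
        """Difference-over-sum beam position estimate; k_mm is an assumed
        geometry constant converting the normalized ratio to millimetres."""
        return k_mm * (q_a - q_b) / (q_a + q_b)

    print(bpm_position(1.05e-12, 0.95e-12))  # about 0.5 mm off-center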

  19. Automated synthesis of image processing procedures using AI planning techniques

    Science.gov (United States)

    Chien, Steve; Mortensen, Helen

    1994-01-01

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.

  20. Automated Planning of context-aware Process Models

    OpenAIRE

    Heinrich, Bernd; Schön, Dominik

    2015-01-01

    Most real-world processes are heavily influenced by environmental factors, which are referred to as the context of a process. Thus, the consideration of context is proposed within the research strand of Business Process Modeling. Most existing context-aware modeling approaches consider context only in terms of static information like, for instance, the location where a process is performed. However, context information like the weather could change during the conduction of a process, which w...

  1. Trend Analysis on the Automation of the Notebook PC Production Process

    Directory of Open Access Journals (Sweden)

    Chin-Ching Yeh

    2012-09-01

    Full Text Available Notebook PCs are among the Taiwanese electronic products that generate the highest production value and market share. According to ITRI IEK statistics, the domestic notebook PC production value in 2011 was about NT$2.3 trillion. Of the roughly 200 million notebook PCs sold in global markets in 2011, Taiwan's output accounted for more than 90%, meaning that nine out of every ten notebook PCs in the world were manufactured by Taiwanese companies. For an industry of such output value and volume, the degree of automation in production processes is not high. This means either that there is still great room for automating the notebook PC production process, or that the degree of automation of notebook production cannot easily be increased. This paper presents an analysis of this situation.

  2. A Continuous Automated Vault Inventory System (CAVIS) for accountability monitoring of stored nuclear materials

    International Nuclear Information System (INIS)

    Nearly all facilities that store hazardous (radioactive or non-radioactive) materials must comply with prevailing federal, state, and local laws. These laws usually have components that require periodic physical inspections to ensure that all materials remain safely and securely stored. The inspections are generally labor intensive, slow, put personnel at risk, and only find anomalies after they have occurred. The system described in this paper was developed for monitoring stored nuclear materials resulting from weapons dismantlement, but its applications extend to any storage facility that meets the above criteria. The traditional special nuclear material (SNM) accountability programs currently used within most of the Department of Energy (DOE) complex require the physical entry of highly trained personnel into SNM storage vaults. This imposes the need for additional security measures, which typically mandate that extra security personnel be present while SNM inventories are performed. These requirements increase labor costs and expose additional personnel to radiation. In some cases, individuals have received radiation exposure equivalent to the annual maximum during just one inventory verification. With increasing overhead costs, the current system is rapidly becoming too expensive to operate, and the need for an automated method of inventory verification is evident. The Continuous Automated Vault Inventory System (CAVIS) described in this paper was designed and prototyped as a low cost, highly reliable, and user friendly system capable of providing real-time weight, gamma, and neutron energy confirmation from each item stored in an SNM vault. This paper describes the sensor technologies, the CAVIS prototype system (built at Y-12 for highly enriched uranium storage), the technical requirements that must be achieved to assure successful implementation, and the sensor technologies needed for a plutonium facility.

  3. The Multi-Isotope Process (MIP) Monitor Project: FY12 Progress and Accomplishments

    Energy Technology Data Exchange (ETDEWEB)

    Coble, Jamie B.; Orton, Christopher R.; Jordan, David V.; Schwantes, Jon M.; Bender, Sarah; Dayman, Kenneth J.; Unlu, Kenan; Landsberger, Sheldon

    2012-09-27

    The Multi-Isotope Process (MIP) Monitor, being developed at Pacific Northwest National Laboratory (PNNL), provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of "...(minimization of) the risks of nuclear proliferation and terrorism." The MIP Monitor measures distributions of a suite of indicator (radioactive) isotopes present within product and waste streams of a nuclear reprocessing facility. These indicator isotopes are monitored on-line by gamma spectrometry and compared, in near-real-time, to spectral patterns representing "normal" process conditions using multivariate pattern recognition software. The monitor utilizes this multivariate analysis and gamma spectroscopy of reprocessing streams to detect small changes in the gamma spectrum, which may indicate changes in process conditions. Multivariate analysis methods common in chemometrics, such as principal component analysis (PCA) and partial least squares regression (PLS), act as pattern recognition techniques, which can detect small deviations from the expected, nominal condition. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting. Development of the MIP Monitor approach continues to evaluate the efficacy of the monitor for automated, real-time or near-real-time application. This report details follow-on research and development efforts sponsored by the U.S. Department of Energy Fuel Cycle Research and Development related to the MIP Monitor for fiscal year
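
    A sketch of the kind of multivariate pattern recognition described above: fit PCA to spectra taken under normal conditions and flag a new spectrum whose reconstruction error (Q statistic) exceeds a percentile threshold. The spectra, component count and threshold below are synthetic stand-ins, not the MIP Monitor's actual configuration.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    normal = rng.normal(size=(200, 512)) + np.linspace(5, 0, 512)  # "nominal" spectra
    model = PCA(n_components=5).fit(normal)

    def q_statistic(spectrum):
        # Squared residual after projecting onto the normal-condition subspace.
        recon = model.inverse_transform(model.transform(spectrum[None, :]))
        return float(((spectrum - recon) ** 2).sum())

    threshold = np.percentile([q_statistic(s) for s in normal], 99)
    test = normal[0] + np.concatenate([np.zeros(400), 3 * np.ones(112)])  # shifted peak region
    print(q_statistic(test) > threshold)  # True -> flag off-normal process conditions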

  4. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS

    Science.gov (United States)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew

    2011-01-01

    substantially by combining it with other technologies for automated, miniaturized, high-throughput biological measurements, such as fast sequencing, protein identification (proteomics) and metabolite profiling (metabolomics). Thus, the system can be integrated with other biomedical instruments in order to support and enhance telemedicine capability onboard ISS. NASA's mission includes sustained investment in critical research leading to effective countermeasures to minimize the risks associated with human spaceflight, and the use of appropriate technology to sustain space exploration at reasonable cost. Our integrated microarray technology is expected to fulfill these two critical requirements and to enable the scientific community to better understand and monitor the effects of the space environment on microorganisms and on the astronaut, in the process leveraging current capabilities and overcoming present limitations.

  5. An automated fog monitoring system for the Indo-Gangetic Plains based on satellite measurements

    Science.gov (United States)

    Patil, Dinesh; Chourey, Reema; Rizvi, Sarwar; Singh, Manoj; Gautam, Ritesh

    2016-05-01

    Fog is a meteorological phenomenon that causes reduced regional visibility and affects air quality, leading to various societal and economic implications, especially disruption of air and rail transportation. The persistent and widespread winter fog impacts the entire Indo-Gangetic Plains (IGP), as frequently observed in satellite imagery. The IGP is a densely populated region in south Asia, home to about one-sixth of the world's population, with a strong upward pollution trend. In this study, we have used multi-spectral radiances and aerosol/cloud retrievals from Terra/Aqua MODIS data to develop an automated web-based fog monitoring system over the IGP. Using our previous and existing methodologies, and ongoing algorithm development for the detection of fog and retrieval of associated microphysical properties (e.g. fog droplet effective radius), we characterize widespread fog during both daytime and nighttime. Specifically, for nighttime fog detection, the algorithm employs a satellite-based bi-spectral brightness temperature difference technique between two spectral channels: MODIS band 22 (3.9 μm) and band 31 (10.75 μm). Further, we are extending our algorithm development to geostationary satellites, to provide continuous monitoring of the spatial-temporal variation of fog. We anticipate that the ongoing and future development of a fog monitoring system will assist air, rail and vehicular transportation management, as well as the dissemination of fog information to government agencies and the general public. The outputs of the fog detection algorithm and related aerosol/cloud parameters are operationally disseminated via http://fogsouthasia.com/.
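
    The night-time test described above reduces to a brightness temperature difference: fog and low water clouds depress the 3.9 μm brightness temperature relative to 10.75 μm. A sketch with a placeholder threshold (operational values are tuned per region and season):

    import numpy as np

    def fog_mask(bt_39, bt_1075, btd_threshold=-2.0):
        """Night-time fog test: low water clouds/fog depress the 3.9 um
        brightness temperature relative to 10.75 um. Threshold is assumed."""
        return (bt_39 - bt_1075) < btd_threshold

    bt39 = np.array([[268.0, 280.0], [270.5, 281.2]])    # Kelvin, synthetic pixels
    bt1075 = np.array([[272.5, 280.4], [274.8, 281.0]])
    print(fog_mask(bt39, bt1075))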

  6. Improvement of an automated neonatal seizure detector using a post-processing technique.

    Science.gov (United States)

    Ansari, A H; Matic, V; De Vos, M; Naulaers, G; Cherian, P J; Van Huffel, S

    2015-08-01

    Visual recognition of neonatal seizures during continuous EEG monitoring in neonatal intensive care units (NICUs) is labor-intensive, has low inter-rater agreement and requires special expertise that is not available around the clock. Development of an accurate automated seizure detection system with a low false alarm rate will support clinical decision making and significantly alleviate the workload. However, this remains a difficult challenge for engineers, as the neonatal EEG signal is non-stationary and often includes complex patterns of seizures and artifacts. In this study, we show an improvement of our previously developed neonatal seizure detector (built on heuristic if-then rules). In order to improve the detection accuracy, mean phase coherence is used as a new feature to characterize artifacts, and a support vector machine is applied in a post-processing step to remove false detections. As a result, the false alarm rate drops by 42% (from 2.6 h(-1) to 1.5 h(-1)), whereas the good detection rate decreases by only 4%. PMID:26737624
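
    A sketch of the post-processing idea: train a support vector machine on features of candidate detections (the paper adds mean phase coherence as one such feature) and keep only candidates classified as seizures. The features and labels below are synthetic; the published detector's feature set is richer.

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.3, 0.1, (100, 2)),   # features of artifacts
                   rng.normal(0.7, 0.1, (100, 2))])  # features of true seizures
    y = np.array([0] * 100 + [1] * 100)
    clf = SVC(kernel="rbf").fit(X, y)

    candidates = rng.normal(0.5, 0.2, (5, 2))        # detector outputs to vet
    print(clf.predict(candidates))                   # keep only label-1 detections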

  7. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods
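
    The compress-then-classify pipeline described above can be sketched in a few lines: reduce each signal to a short feature vector, then train a small feedforward network on the features. Everything below is synthetic (random signals and labels, so the score only shows that the pipeline runs); the study's actual features came from the ORNL eddy current database.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(5)
    signals = rng.normal(size=(300, 256))    # stand-ins for eddy current traces
    labels = rng.integers(0, 3, 300)         # defect classes (synthetic)

    features = PCA(n_components=10).fit_transform(signals)  # data compression step
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(features, labels)
    print(net.score(features, labels))       # training accuracy on synthetic data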

  8. Effect of Using Automated Auditing Tools on Detecting Compliance Failures in Unmanaged Processes

    Science.gov (United States)

    Doganata, Yurdaer; Curbera, Francisco

    The effect of using automated auditing tools to detect compliance failures in unmanaged business processes is investigated. In the absence of a process execution engine, compliance of an unmanaged business process is tracked by using an auditing tool developed based on business provenance technology or employing auditors. Since budget constraints limit employing auditors to evaluate all process instances, a methodology is devised to use both expert opinion on a limited set of process instances and the results produced by fallible automated audit machines on all process instances. An improvement factor is defined based on the average number of non-compliant process instances detected and it is shown that the improvement depends on the prevalence of non-compliance in the process as well as the sensitivity and the specificity of the audit machine.
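
    A back-of-envelope version of the trade-off: with a fallible machine screening all instances and a fixed auditor budget spent on the machine's positives, the expected number of non-compliant instances found depends on prevalence, sensitivity and specificity. The formula below is illustrative and not the paper's exact definition of the improvement factor.

    def expected_detections(n, prevalence, sensitivity, specificity, audit_budget):
        """Expected non-compliant instances caught when auditors review only
        instances flagged by the machine (all quantities are assumptions)."""
        true_pos = n * prevalence * sensitivity
        false_pos = n * (1 - prevalence) * (1 - specificity)
        flagged = true_pos + false_pos
        precision = true_pos / flagged if flagged else 0.0
        reviewed = min(audit_budget, flagged)
        return reviewed * precision

    print(expected_detections(n=10000, prevalence=0.05, sensitivity=0.9,
                              specificity=0.95, audit_budget=500))  # about 243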

  9. Distributed process control system for remote control and monitoring of the TFTR tritium systems

    International Nuclear Information System (INIS)

    This paper reviews the progress made in the application of a commercially available distributed process control system to support the requirements established for the Tritium Remote Control And Monitoring System (TRECAMS) of the Tokamak Fusion Test Reactor (TFTR). The system discussed was purchased from the Texas Instruments (TI) Automation Controls Division and was previously marketed by Rexnord Automation. It consists of three fully redundant distributed process controllers interfaced to over 1800 analog and digital I/O points. The operator consoles located throughout the facility are supported by four Digital Equipment Corporation (DEC) PDP-11/73 computers. The PDP-11/73s and the three process controllers communicate over a fully redundant one-megabaud fiber optic network. All system functionality is based on a set of completely integrated databases loaded into the process controllers and the PDP-11/73s. (author). 2 refs.; 2 figs

  10. Monitoring seasonal and diurnal changes in photosynthetic pigments with automated PRI and NDVI sensors

    Directory of Open Access Journals (Sweden)

    J. A. Gamon

    2015-02-01

    only, and the other that also considered diurnal sun angle effects. Cross-calibration clearly affected sensor agreement with independent measurements, with the best method dependent upon the study aim and time frame (seasonal vs. diurnal). The seasonal patterns of NDVI and PRI differed for evergreen and deciduous species, demonstrating the complementary nature of these two indices. Over the spring season, PRI was most strongly influenced by changing chlorophyll:carotenoid pool sizes, while over the diurnal time scale PRI was most affected by the xanthophyll cycle epoxidation state. This finding demonstrates that the SRS PRI sensors can resolve different processes affecting PRI over different time scales. The advent of small, inexpensive, automated PRI and NDVI sensors offers new ways to explore environmental and physiological constraints on photosynthesis, and may be particularly well-suited for use at flux tower sites. Wider application of automated sensors could lead to improved integration of flux and remote sensing approaches to studying photosynthetic carbon uptake, and could help define the concept of contrasting vegetation optical types.
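
    For reference, both indices are simple normalized band ratios; the band conventions below (531 and 570 nm for PRI, red and near-infrared for NDVI) are the usual ones, and the reflectance values are made up.

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index from NIR and red reflectance."""
        return (nir - red) / (nir + red)

    def pri(r531, r570):
        """Photochemical Reflectance Index from 531 nm and 570 nm reflectance."""
        return (r531 - r570) / (r531 + r570)

    print(ndvi(0.45, 0.08), pri(0.042, 0.047))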

  11. The Use of an Automated System (GreenFeed) to Monitor Enteric Methane and Carbon Dioxide Emissions from Ruminant Animals.

    Science.gov (United States)

    Hristov, Alexander N; Oh, Joonpyo; Giallongo, Fabio; Frederick, Tyler; Weeks, Holley; Zimmerman, Patrick R; Harper, Michael T; Hristova, Rada A; Zimmerman, R Scott; Branco, Antonio F

    2015-01-01

    Ruminant animals (domesticated or wild) emit methane (CH4) through enteric fermentation in their digestive tract and from decomposition of manure during storage. These processes are the major sources of greenhouse gas (GHG) emissions from animal production systems. Techniques for measuring enteric CH4 vary from direct measurements (respiration chambers, which are highly accurate, but with limited applicability) to various indirect methods (sniffers, laser technology, which are practical, but with variable accuracy). The sulfur hexafluoride (SF6) tracer gas method is commonly used to measure enteric CH4 production by animal scientists and more recently, application of an Automated Head-Chamber System (AHCS) (GreenFeed, C-Lock, Inc., Rapid City, SD), which is the focus of this experiment, has been growing. AHCS is an automated system to monitor CH4 and carbon dioxide (CO2) mass fluxes from the breath of ruminant animals. In a typical AHCS operation, small quantities of baiting feed are dispensed to individual animals to lure them to AHCS multiple times daily. As the animal visits AHCS, a fan system pulls air past the animal's muzzle into an intake manifold, and through an air collection pipe where continuous airflow rates are measured. A sub-sample of air is pumped out of the pipe into non-dispersive infra-red sensors for continuous measurement of CH4 and CO2 concentrations. Field comparisons of AHCS to respiration chambers or SF6 have demonstrated that AHCS produces repeatable and accurate CH4 emission results, provided that animal visits to AHCS are sufficient so emission estimates are representative of the diurnal rhythm of rumen gas production. Here, we demonstrate the use of AHCS to measure CO2 and CH4 fluxes from dairy cows given a control diet or a diet supplemented with technical-grade cashew nut shell liquid. PMID:26383886

  12. The Use of an Automated System (GreenFeed) to Monitor Enteric Methane and Carbon Dioxide Emissions from Ruminant Animals

    Science.gov (United States)

    Hristov, Alexander N.; Oh, Joonpyo; Giallongo, Fabio; Frederick, Tyler; Weeks, Holley; Zimmerman, Patrick R.; Harper, Michael T.; Hristova, Rada A.; Zimmerman, R. Scott; Branco, Antonio F.

    2015-01-01

    Ruminant animals (domesticated or wild) emit methane (CH4) through enteric fermentation in their digestive tract and from decomposition of manure during storage. These processes are the major sources of greenhouse gas (GHG) emissions from animal production systems. Techniques for measuring enteric CH4 vary from direct measurements (respiration chambers, which are highly accurate, but with limited applicability) to various indirect methods (sniffers, laser technology, which are practical, but with variable accuracy). The sulfur hexafluoride (SF6) tracer gas method is commonly used to measure enteric CH4 production by animal scientists and more recently, application of an Automated Head-Chamber System (AHCS) (GreenFeed, C-Lock, Inc., Rapid City, SD), which is the focus of this experiment, has been growing. AHCS is an automated system to monitor CH4 and carbon dioxide (CO2) mass fluxes from the breath of ruminant animals. In a typical AHCS operation, small quantities of baiting feed are dispensed to individual animals to lure them to AHCS multiple times daily. As the animal visits AHCS, a fan system pulls air past the animal’s muzzle into an intake manifold, and through an air collection pipe where continuous airflow rates are measured. A sub-sample of air is pumped out of the pipe into non-dispersive infra-red sensors for continuous measurement of CH4 and CO2 concentrations. Field comparisons of AHCS to respiration chambers or SF6 have demonstrated that AHCS produces repeatable and accurate CH4 emission results, provided that animal visits to AHCS are sufficient so emission estimates are representative of the diurnal rhythm of rumen gas production. Here, we demonstrate the use of AHCS to measure CO2 and CH4 fluxes from dairy cows given a control diet or a diet supplemented with technical-grade cashew nut shell liquid. PMID:26383886

  13. Implications of critical chain methodology for business process flexible automation projects in economic organizations

    OpenAIRE

    Paul BRUDARU

    2009-01-01

    Business process flexible automation projects involve the use of methods and technologies from the Business Process Management (BPM) area that aim at increasing the agility of organizations in changing their business processes in response to environmental changes. BPM-type projects are a mix of process improvement projects and software development, which implies a high complexity in managing them. The successful implementation of these projects involves overcoming problems inherent as delay...

  14. Information processing for aerospace structural health monitoring

    Science.gov (United States)

    Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.

    1998-06-01

    Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle costs of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information that indicates both the diagnostics of current structural integrity and the prognostics necessary for planning and managing the future health of the structure in a cost-effective manner. This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.

  15. The monitoring and control of TRUEX processes

    International Nuclear Information System (INIS)

    The Generic TRUEX Model (GTM) was used to design a flowsheet for the TRUEX solvent extraction process that would be used to determine its instrumentation and control requirements. Sensitivity analyses of the key process variables, namely, the aqueous and organic flow rates, feed compositions, and the number of contactor stages, were carried out to assess their impact on the operation of the TRUEX process. Results of these analyses provide a basis for the selection of an instrument and control system and the eventual implementation of a control algorithm. Volume Two of this report is an evaluation of the instruments available for measuring many of the physical parameters. Equations that model the dynamic behavior of the TRUEX process have been generated. These equations can be used to describe the transient or dynamic behavior of the process for a given flowsheet in accordance with the TRUEX model. Further work will be done with the dynamic model to determine how and how quickly the system responds to various perturbations. The use of perturbation analysis early in the design stage will lead to a robust flowsheet, namely, one that will meet all process goals and allow for wide control bounds. The process time delay, that is, the speed with which the system reaches a new steady state, is an important parameter in monitoring and controlling a process. In the future, instrument selection and point-of-variable measurement, now done using the steady-state results reported here, will be reviewed and modified as necessary based on this dynamic method of analysis
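
    The notion of process time delay above can be illustrated with the simplest dynamic model, a first-order response to a step change; the time constant below is an arbitrary stand-in for whatever the TRUEX dynamic equations would yield for a given flowsheet.

    # First-order step response: x(t) = x_ss + (x0 - x_ss) * exp(-t / tau).
    # Reaching within 1% of the new steady state takes about 4.6 * tau.
    import math

    x0, x_ss, tau = 1.0, 2.0, 15.0    # arbitrary units; tau in minutes (assumed)
    for t in (0, 15, 30, 60):
        x = x_ss + (x0 - x_ss) * math.exp(-t / tau)
        print(t, round(x, 3))
    print("99% settling time:", round(-tau * math.log(0.01), 1), "min")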

  16. Dynamic software architecture development: Towards an automated process

    OpenAIRE

    Ter Beek M.H.; Bucchiarone A.; Gnesi S.

    2009-01-01

    We propose a software engineering process to aid the development of Dynamic Software Architectures (DSAs). This process is based on the sequential application of a number of formal methods and tools, and it can support software architects throughout the design, analysis and code generation of software systems. To illustrate the process, we apply it to an industrial case study from the Service-Oriented Computing (SOC) domain.

  17. Capillary electrophoresis for automated on-line monitoring of suspension cultures: Correlating cell density, nutrients and metabolites in near real-time.

    Science.gov (United States)

    Alhusban, Ala A; Breadmore, Michael C; Gueven, Nuri; Guijt, Rosanne M

    2016-05-12

    Increasingly stringent demands on the production of biopharmaceuticals require monitoring of process parameters that impact product quality. We developed an automated platform for on-line, near real-time monitoring of suspension cultures by integrating microfluidic components for cell counting and filtration with a high-resolution separation technique. This enabled the correlation of the growth of a human lymphocyte cell line with changes in the essential metabolic markers glucose, glutamine, leucine/isoleucine and lactate, determined by Sequential Injection-Capillary Electrophoresis (SI-CE). Using 8.1 mL of media (41 μL per run), the metabolic status and cell density were recorded every 30 min over 4 days. The presented platform is flexible, simple and automated, and allows for fast, robust and sensitive analysis with low sample consumption and high sample throughput. It is compatible with up- and out-scaling, and as such provides a promising new solution to meet future demands in process monitoring in the biopharmaceutical industry. PMID:27114228
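
    A quick consistency check of the stated consumption figures: one 41 μL run every 30 min for 4 days comes to roughly the quoted 8.1 mL of media.

    runs = 4 * 24 * 2                           # one run every 30 min for 4 days
    print(runs, round(runs * 0.041, 1), "mL")   # 192 runs, about 7.9 mL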

  18. Automating the Object-Oriented Software Development Process: Workshop Report

    NARCIS (Netherlands)

    Aksit, Mehmet; Tekinerdogan, Bedir

    1998-01-01

    Cost-effective realization of robust, adaptable and reusable software systems demands efficient and effective management of the overall software production process. Current object-oriented methods are not completely formalized and lack the ability to reason about the quality of processes and software.

  19. Automating Measurement for Software Process Models using Attribute Grammar Rules

    Directory of Open Access Journals (Sweden)

    Abdul Azim Abd. Ghani

    2007-08-01

    Full Text Available The modelling concept is well accepted in the software engineering discipline. Some software models are built either to control the development stages, to measure program quality, or to serve as a medium that gives a better understanding of the actual software system. Software process modelling has nowadays reached a level that allows software designs to be transformed into programming languages, through such means as architecture design languages and the Unified Modelling Language. This paper describes the adaptation of the attribute grammar approach to measuring software process models. A tool, called the Software Process Measurement Application, was developed to enable measurement according to specified attribute grammar rules. A context-free grammar to read the process model is derived from the IDEF3 standard, and rules were attached to enable the calculation of measurement metrics. The measurement metric values collected were used to aid in determining the decomposition and structuring of processes for the proposed software systems.

  20. The feasibility of automated online flow cytometry for in-situ monitoring of microbial dynamics in aquatic ecosystems

    Directory of Open Access Journals (Sweden)

    Michael Domenic Besmer

    2014-06-01

    Full Text Available Fluorescent staining coupled with flow cytometry (FCM) is often used for the monitoring, quantification and characterization of bacteria in engineered and environmental aquatic ecosystems including seawater, freshwater, drinking water, wastewater, and industrial bioreactors. However, infrequent grab sampling hampers accurate characterization and subsequent understanding of microbial dynamics in all of these ecosystems. A logical technological progression is high throughput and full automation of the sampling, staining, measurement, and data analysis steps. Here we assess the feasibility and applicability of automated FCM by means of actual data sets produced with prototype instrumentation. As proof-of-concept we demonstrate examples of microbial dynamics in (i) flowing tap water from a municipal drinking water supply network and (ii) river water from a small creek subject to two rainfall events. In both cases, automated measurements were done at 15-min intervals during 12 to 14 consecutive days, yielding more than 1000 individual data points for each ecosystem. The extensive data sets derived from the automated measurements allowed for the establishment of baseline data for each ecosystem, as well as for the recognition of daily variations and specific events that would most likely be missed (or mischaracterized) by infrequent sampling. In addition, the online FCM data from the river water were combined and correlated with online measurements of abiotic parameters, showing considerable potential for a better understanding of cause-and-effect relationships in aquatic ecosystems. Although several challenges remain, the successful operation of an automated online FCM system and the basic interpretation of the resulting data sets represent a breakthrough towards the eventual establishment of fully automated online microbiological monitoring technologies.

  1. Design and development on automated control system of coated fuel particle fabrication process

    International Nuclear Information System (INIS)

    With the trend toward large-scale production of HTR coated fuel particles, the original manual control system can no longer meet the requirements, and an industrial-grade automated control system for coated fuel particle fabrication needs to be developed. A comprehensive analysis of the successive 4-layer coating process for TRISO-type coated fuel particles was carried out. It was found that the coating process could be divided into five subsystems and nine operating states, and the establishment of a DCS-type (distributed control system) automated control system was proposed. According to the rigorous requirements of the coated particle preparation process, the design considerations of the DCS were proposed, including the principles of coordinated control, safety and reliability, integration specification, practicality and ease of use, and openness and ease of updating. A complete automated control system for the coated fuel particle preparation process was built on these principles in manufacturing practice. The automated control system was put into operation in the production of irradiated samples for the HTR-PM demonstration project. The experimental results show that the system achieves better control of the coated fuel particle preparation process and meets the requirements of factory-scale production. (authors)

  2. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    Science.gov (United States)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated to improve efficiency and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.

  3. MIR-ATR sensor for process monitoring

    International Nuclear Information System (INIS)

    A mid-infrared attenuated total reflectance (MIR-ATR) sensor has been developed for chemical reaction monitoring. The optical setup of the compact and low-priced sensor consists of an IR emitter as light source, a zinc selenide (ZnSe) ATR prism as boundary to the process, and four thermopile detectors, each equipped with an optical bandpass filter. The practical applicability was tested during esterification of ethanol and formic acid to ethyl formate and water as a model reaction with subsequent distillation. For reference analysis, a Fourier transform mid-infrared (FT-MIR) spectrometer with diamond ATR module was applied. On-line measurements using the MIR-ATR sensor and the FT-MIR spectrometer were performed in a bypass loop. The sensor was calibrated by multiple linear regression in order to link the measured absorbance in the four optical channels to the analyte concentrations. The analytical potential of the MIR-ATR sensor was demonstrated by simultaneous real-time monitoring of all four chemical substances involved in the esterification and distillation process. The temporal courses of the sensor signals are in accordance with the concentration values achieved by the commercial FT-MIR spectrometer. The standard errors of prediction for ethanol, formic acid, ethyl formate, and water were 0.38, 0.48, 0.38, and 1.12 mol L−1, respectively. A procedure based on MIR spectra is presented to simulate the response characteristics of the sensor when the transmission ranges of the filters are varied. Using this tool, analyte-specific bandpass filters for a particular chemical reaction can be identified. By exchanging the optical filters, the sensor can be adapted to a wide range of processes in the chemical, pharmaceutical, and beverage industries. (paper)
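
    The calibration step described above, multiple linear regression from the four channel absorbances to the four analyte concentrations, can be sketched as a least-squares fit. The sensitivities and training data below are synthetic; the real calibration regresses measured absorbances against reference concentrations (here obtained from the FT-MIR instrument).

    import numpy as np

    rng = np.random.default_rng(2)
    K = rng.uniform(0.1, 1.0, (4, 4))     # unknown channel sensitivities (synthetic)
    C_train = rng.uniform(0, 5, (50, 4))  # mol/L: ethanol, formic acid, ester, water
    A_train = C_train @ K.T + rng.normal(0, 0.01, (50, 4))  # measured absorbances

    # Least-squares estimate of the calibration matrix, then prediction.
    B, *_ = np.linalg.lstsq(A_train, C_train, rcond=None)
    a_new = np.array([0.8, 0.3, 0.5, 1.2])  # absorbances from the four channels
    print(a_new @ B)                        # predicted concentrations (mol/L)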

  4. Automating security monitoring and analysis for Space Station Freedom's electric power system

    Science.gov (United States)

    Dolce, James L.; Sobajic, Dejan J.; Pao, Yoh-Han

    1990-01-01

    Operating a large space power system requires classifying the system's status and analyzing its security. Conventional algorithms are used by terrestrial electric utilities to provide such information to their dispatchers, but their application aboard Space Station Freedom would consume too much processing time. A new approach for monitoring and analysis using adaptive pattern techniques is presented. This approach yields an on-line security monitoring and analysis algorithm that is accurate and fast, and thus it can free Space Station Freedom's power control computers for other tasks.

  5. Uranium monitoring tool for rapid analysis of environmental samples based on automated liquid-liquid microextraction.

    Science.gov (United States)

    Rodríguez, Rogelio; Avivar, Jessica; Ferrer, Laura; Leal, Luz O; Cerdà, Víctor

    2015-03-01

    A fully automated in-syringe (IS) magnetic stirring assisted (MSA) liquid-liquid microextraction (LLME) method for uranium(VI) determination was developed, exploiting a long path-length liquid waveguide capillary cell (LWCC) with spectrophotometric detection. On-line extraction of uranium was performed within a glass syringe containing a magnetic stirrer for homogenization of the sample and the successive reagents: cyanex-272 in dodecane as extractant, EDTA as interference eliminator, hydrochloric acid for the back-extraction of U(VI), and arsenazo-III as chromogenic reagent to accomplish spectrophotometric detection at 655 nm. Magnetic stirring assistance was provided by a specially designed driving device placed around the syringe body, creating a rotating magnetic field in the syringe and forcing the rotation of the stirring bar located inside it. The detection limit (LOD) of the developed method is 3.2 µg L(-1). Its good interday precision (relative standard deviation, RSD, 3.3%) and its high extraction frequency (up to 6 h(-1)) make this method an inexpensive and fast screening tool for monitoring uranium(VI) in environmental samples. It was successfully applied to different environmental matrices: channel sediment certified reference material (BCR-320R), soil and phosphogypsum reference materials, and natural water samples, with recoveries close to 100%. PMID:25618721

  6. The Continuous Monitoring of Flash Flood Velocity Field based on an Automated LSPIV System

    Science.gov (United States)

    Li, W.; Ran, Q.; Liao, Q.

    2014-12-01

    Large-scale particle image velocimetry (LSPIV) is a non-intrusive tool for flow velocity field measurement and has advantages over traditional techniques, with applications on rivers, lakes and oceans, especially under extreme conditions. An automated LSPIV system is presented in this study, which can be easily set up and executed for continuous monitoring of flash floods. The experiment site is Longchi village, Sichuan Province, where a magnitude-8.0 earthquake occurred in 2008 and debris flows have occurred every year since. The area of interest is about 30 m × 40 m of the channel, which has been heavily damaged by debris flows. A series of videos obtained during the flood season indicates that floods break out after rainstorms and last just several hours. Measurements could be completed without being affected by these extreme weather conditions, and the results are more reliable and accurate owing to the high sediment concentration. Compared against direct measurements with an impeller flow meter, LSPIV was validated to work well in the mountain stream, with an average relative error of 6.7% and a Nash-Sutcliffe coefficient of 95%. On June 26, the maximum flood surface velocity reached 4.26 m/s, and the discharge was also determined using the velocity-area method. Overall, the system is safe, non-contact, and can be adjusted flexibly as required. It provides valuable flood data that were previously scarce, which will contribute greatly to the analysis of flood and debris-flow mechanisms.
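
    The core of any LSPIV estimate is locating the displacement of surface texture between consecutive frames. The sketch below does this by phase correlation on synthetic frames; the pixel size and frame interval are assumed values, and a field deployment adds orthorectification, interrogation windows and outlier filtering.

    import numpy as np

    rng = np.random.default_rng(3)
    frame1 = rng.random((64, 64))
    frame2 = np.roll(frame1, shift=(3, 5), axis=(0, 1))  # surface moved 3 px down, 5 px right

    # Phase correlation: the normalized cross-power spectrum peaks at the shift.
    F1, F2 = np.fft.fft2(frame1), np.fft.fft2(frame2)
    cross = F2 * np.conj(F1)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(corr.argmax(), corr.shape)
    dy = dy - 64 if dy > 32 else dy                      # unwrap negative shifts
    dx = dx - 64 if dx > 32 else dx

    pixel_size, dt = 0.05, 0.1        # m per pixel and s between frames (assumed)
    speed = np.hypot(dy, dx) * pixel_size / dt
    print(dy, dx, round(float(speed), 2), "m/s")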

  7. An Improvement in Thermal Modelling of Automated Tape Placement Process

    International Nuclear Information System (INIS)

    The thermoplastic tape placement process offers the possibility of manufacturing large laminated composite parts with all kinds of geometries (e.g., doubly curved). This process is based on the fusion bonding of a thermoplastic tape onto a substrate. It has received growing interest in recent years because of its out-of-autoclave capability. In order to control and optimize the quality of the manufactured part, we need to predict the temperature field throughout the processing of the laminate. In this work, we focus on a thermal model of this process which takes into account the imperfect bonding between the different layers of the substrate by introducing thermal contact resistances into the model. The study draws on experimental results indicating that the value of the thermal resistance evolves with the temperature and pressure applied to the material.

  8. Measures and mechanisms for process monitoring in evolving business networks

    OpenAIRE

    Comuzzi, M.; Vonk, J.; Grefen, P.

    2012-01-01

    The literature on monitoring of cross-organizational processes, executed within business networks, considers monitoring only in the network formation phase, since network establishment determines what can be monitored during process execution. In particular, the impact of evolution in such networks on monitoring is not considered. When a business network evolves, e.g. contracts are introduced, updated, or dropped, or actors join or leave the network, the monitoring requirements of the network...

  9. Monitoring, accounting and automated decision support for the ALICE experiment based on the MonALISA framework

    CERN Document Server

    Cirstoiu, C; Betev, L; Saiz, P; Peters, A J; Muraru, A; Voicu, R; Legrand, I

    2007-01-01

    We are developing a general purpose monitoring system for the ALICE experiment, based on the MonALISA framework. MonALISA (Monitoring Agents using a Large Integrated Services Architecture) is a fully distributed system with no single point of failure that is able to collect and store monitoring information and present it as meaningful perspectives and synthetic views of the status and trends of the entire system. Furthermore, agents can use it to take automated operational decisions. Monitoring information is gathered locally from all the components running in each site. The entire flow of information is aggregated at site level by a MonALISA service and then collected and presented in various forms by a central MonALISA Repository. Based on this information, other services take operational decisions such as alerts, triggers, service restarts and automatic production job or transfer submissions. The system monitors all the components: computer clusters (all major parameters of each computing node), jobs ...

  10. Towards automated processing of the right of access in inter-organizational Web Service compositions

    DEFF Research Database (Denmark)

    Herkenhöner, Ralph; De Meer, Hermann; Jensen, Meiko;

    2010-01-01

    with trade secret protection. In this paper, we present an automated architecture to enable exercising the right of access in the domain of inter-organizational business processes based on Web Services technology. Deriving its requirements from the legal, economical, and technical obligations, we show...

  11. MO-G-BRE-03: Automated Continuous Monitoring of Patient Setup with Second-Check Independent Image Registration

    International Nuclear Information System (INIS)

    Purpose: To create an unsupervised quality assurance program to monitor image-based patient setup. The system acts as a secondary check by independently computing shifts and rotations, interfaces with Varian's database to verify the therapist's work, and warns against sub-optimal setups. Methods: Temporary digitally-reconstructed radiographs (DRRs) and OBI radiographic image files created by Varian's treatment console during patient setup are intercepted and used as input to an independent registration module, customized for accuracy, that determines the optimal rotations and shifts. To deal with the poor quality of OBI images, a histogram equalization of the live images to their DRR counterparts is performed as a pre-processing step. A search for the most sensitive metric was performed by plotting search spaces subject to various translations, and convergence analysis was applied to ensure the optimizer finds the global minimum. The final system configuration uses the NCC metric with 150 histogram bins and a one-plus-one evolutionary optimizer running for 2000 iterations, with customized scales for translations and rotations, in a multi-stage optimization that first corrects translations and subsequently rotations. Results: The system was installed clinically to monitor and provide near real-time feedback on patient positioning. Over a two-month period, uncorrected pitch values had a mean of 0.016° with a standard deviation of 1.692°, and couch rotations were −0.090° ± 1.547°. The couch shifts were −0.157 ± 0.466 cm vertically, 0.045 ± 0.286 cm laterally and 0.084 ± 0.501 cm longitudinally. Uncorrected pitch angles were the most common source of discrepancies. Large variations in the pitch angles were correlated with patient motion inside the mask. Conclusion: A system for automated quality assurance of the therapist's registration was designed and tested in clinical practice. The approach complements the clinical software's automated registration in
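
    A miniature version of the second-check registration: score candidate shifts of the (histogram-equalized) live image against the DRR with normalized cross-correlation and keep the best one. The images and the one-dimensional shift search below are synthetic stand-ins for the clinical module.

    import numpy as np

    def ncc(a, b):
        """Normalized cross-correlation between two equally sized images."""
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float((a * b).mean())

    rng = np.random.default_rng(4)
    drr = rng.random((128, 128))
    live = np.roll(drr, shift=2, axis=1)             # patient 2 px off laterally

    shifts = range(-5, 6)
    best = max(shifts, key=lambda s: ncc(np.roll(live, -s, axis=1), drr))
    print("estimated lateral shift:", best, "px")    # -> 2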

  12. Process monitoring for reprocessing plant safeguards: a summary review

    International Nuclear Information System (INIS)

    Process monitoring is a term typically associated with a detailed look at plant operating data to determine plant status. Process monitoring has been generally associated with operational control of plant processes. Recently, process monitoring has been given new attention for a possible role in international safeguards. International Safeguards Project Office (ISPO) Task C.59 has the goal to identify specific roles for process monitoring in international safeguards. As the preliminary effort associated with this task, a review of previous efforts in process monitoring for safeguards was conducted. Previous efforts mentioned concepts and a few specific applications. None were comprehensive in addressing all aspects of a process monitoring application for safeguards. This report summarizes the basic elements that must be developed in a comprehensive process monitoring application for safeguards. It then summarizes the significant efforts that have been documented in the literature with respect to the basic elements that were addressed

  13. Lyophilization: a useful approach to the automation of analytical processes?

    OpenAIRE

    de Castro, M. D. Luque; Izquierdo, A.

    1990-01-01

    An overview of the state-of-the-art in the use of lyophilization for the pretreatment of samples and standards prior to their storage and/or preconcentration is presented. The different analytical applications of this process are dealt with according to the type of material (reagent, standard, samples) and matrix involved.

  14. Automation System in Rare Earths Countercurrent Extraction Processes

    Institute of Scientific and Technical Information of China (English)

    贾江涛; 严纯华; 廖春生; 吴声; 王明文; 李标国

    2001-01-01

    Based on countercurrent extraction theory for optimized design and simulation, the following aspects of rare earth separation processes were studied: the selection of the detecting points (stages) and on-line analysis for elements, the simulation of open-loop response and its response speed, and the diagnosis and regulative prescriptions for running the solvent extraction cascades.

  15. Emergency healthcare process automation using mobile computing and cloud services.

    Science.gov (United States)

    Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G

    2012-10-01

    Emergency care is basically concerned with the provision of pre-hospital and in-hospital medical and/or paramedical services and it typically involves a wide variety of interdependent and distributed activities that can be interconnected to form emergency care processes within and between Emergency Medical Service (EMS) agencies and hospitals. Hence, in developing an information system for emergency care processes, it is essential to support individual process activities and to satisfy collaboration and coordination needs by providing ready access to patient and operational information regardless of location and time. Filling this information gap by enabling the provision of the right information, to the right people, at the right time poses new challenges, including the specification of a common information format, interoperability among heterogeneous institutional information systems, and the development of new, ubiquitous trans-institutional systems. This paper is concerned with the development of integrated computer support for emergency care processes by evolving and cross-linking institutional healthcare systems. To this end, an integrated EMS cloud-based architecture has been developed that allows authorized users to access emergency case information in standardized document form, as proposed by the Integrating the Healthcare Enterprise (IHE) profile, uses the Organization for the Advancement of Structured Information Standards (OASIS) standard Emergency Data Exchange Language (EDXL) Hospital Availability Exchange (HAVE) for exchanging operational data with hospitals, and incorporates an intelligent module that supports triaging and selecting the most appropriate ambulances and hospitals for each case. PMID:22205383

  16. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen

    2011-12-01

    Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by the existing technical design practices and assumptions (design documents, expert feedback), the available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking with other process models and methods (notably the RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information and by automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer, to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into

  17. Monitoring seasonal and diurnal changes in photosynthetic pigments with automated PRI and NDVI sensors

    Science.gov (United States)

    Gamon, J. A.; Kovalchuck, O.; Wong, C. Y. S.; Harris, A.; Garrity, S. R.

    2015-07-01

    The vegetation indices normalized difference vegetation index (NDVI) and photochemical reflectance index (PRI) provide indicators of pigmentation and photosynthetic activity that can be used to model photosynthesis from remote sensing with the light-use-efficiency model. To help develop and validate this approach, reliable proximal NDVI and PRI sensors have been needed. We tested new NDVI and PRI sensors, "spectral reflectance sensors" (SRS sensors, recently developed by Decagon Devices), during spring activation of photosynthetic activity in evergreen and deciduous stands. We also evaluated two methods of sensor cross-calibration - one that considered sky conditions (cloud cover) at midday only, and another that also considered diurnal sun angle effects. Cross-calibration clearly affected sensor agreement with independent measurements, with the best method dependent upon the study aim and time frame (seasonal vs. diurnal). The seasonal patterns of NDVI and PRI differed for evergreen and deciduous species, demonstrating the complementary nature of these two indices. Over the spring season, PRI was most strongly influenced by changing chlorophyll : carotenoid pool sizes, while over the diurnal timescale, PRI was most affected by the xanthophyll cycle epoxidation state. This finding demonstrates that the SRS PRI sensors can resolve different processes affecting PRI over different timescales. The advent of small, inexpensive, automated PRI and NDVI sensors offers new ways to explore environmental and physiological constraints on photosynthesis, and may be particularly well suited for use at flux tower sites. Wider application of automated sensors could lead to improved integration of flux and remote sensing approaches for studying photosynthetic carbon uptake, and could help define the concept of contrasting vegetation optical types.
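
    For reference, both indices have simple closed forms; the sketch below (plain Python, with band reflectances assumed to be floats or NumPy arrays, and PRI computed conventionally from the 531 nm and 570 nm bands) shows the standard definitions:

        def ndvi(nir, red):
            # Normalized difference vegetation index
            return (nir - red) / (nir + red)

        def pri(r531, r570):
            # Photochemical reflectance index (531 nm and 570 nm bands)
            return (r531 - r570) / (r531 + r570)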

  18. Automated contrast medium monitoring system for computed tomography--Intra-institutional audit.

    Science.gov (United States)

    Lauretti, Dario Luca; Neri, Emanuele; Faggioni, Lorenzo; Paolicchi, Fabio; Caramella, Davide; Bartolozzi, Carlo

    2015-12-01

    iodine concentration and iodine dose (rs=0.3862, p<0.0001) for all CT studies. Automated contrast management systems can provide a full report of contrast use with the possibility to systematically compare different contrast injection protocols, minimize errors, and optimize organ-specific contrast enhancement for any given patient and clinical application. This can be useful to improve and harmonize the quality and consistency of contrast CT procedures within the same radiological department and across the hospital, as well as to monitor potential adverse events and overall costs. PMID:26365621

  19. Automation of functional testing in software development process

    OpenAIRE

    MUHIČ, BENJAMIN

    2014-01-01

    Rapid development and changes in technology and increasing market demands are leading companies to choose agile methods in software development. The main objective of applying agile methods is fast adaptation of the development process to changed market demands. Short deadlines in the product supply chain demand that companies shorten the software testing cycle on the one hand, while on the other hand developing and producing high-quality products. Therefore companies decide on automat...

  20. Automation of the Process to Obtain U F4 Powders

    International Nuclear Information System (INIS)

    The preliminary analysis of the control system to be implemented in the UF4 Powder Production Plant is presented. The work was done in the electronics laboratory. It covers the configuration of devices (PLC, temperature controllers, etc.) and the setting up of communications using the proper protocol. A study of the logic for the first part of the UF6 conversion process, the evaporation, is also shown. This study is used to define the methodology to follow in a future PLC program

  1. Cassini's Maneuver Automation Software (MAS) Process: How to Successfully Command 200 Navigation Maneuvers

    Science.gov (United States)

    Yang, Genevie Velarde; Mohr, David; Kirby, Charles E.

    2008-01-01

    To keep Cassini on its complex trajectory, more than 200 orbit trim maneuvers (OTMs) have been planned from July 2004 to July 2010. With only a few days between many of these OTMs, the operations process of planning and executing the necessary commands had to be automated. The resulting Maneuver Automation Software (MAS) process minimizes the workforce required for, and maximizes the efficiency of, the maneuver design and uplink activities. The MAS process is a well-organized and logically constructed interface between Cassini's Navigation (NAV), Spacecraft Operations (SCO), and Ground Software teams. Upon delivery of an orbit determination (OD) from NAV, the MAS process can generate a maneuver design and all related uplink and verification products within 30 minutes. To date, all 112 OTMs executed by the Cassini spacecraft have been successful. MAS was even used to successfully design and execute a maneuver while the spacecraft was in safe mode.

  2. Quantitative and Qualitative Analysis of Aconitum Alkaloids in Raw and Processed Chuanwu and Caowu by HPLC in Combination with Automated Analytical System and ESI/MS/MS

    Directory of Open Access Journals (Sweden)

    Aimin Sun

    2012-01-01

    Full Text Available HPLC in combination with an automated analytical system and ESI/MS/MS was used to analyze aconitine (A), mesaconitine (MA), hypaconitine (HA), and their benzoyl analogs in the Chinese herbs Caowu and Chuanwu. First, an HPLC method was developed and validated to determine A, MA, and HA in raw and processed Caowu and Chuanwu. Then the automated analytical system and ESI/MS/MS were applied to analyze these alkaloids and their semihydrolyzed products. The results obtained from the automated analytical system are identical to those from ESI/MS/MS, which indicates that the method is a convenient and rapid tool for the qualitative analysis of herbal preparations. Furthermore, HA was little hydrolyzed by the heating processes and thus might account more for the toxicity of processed aconites. Hence, HA could be used as an indicator when one alkaloid is required as a reference to monitor the quality of raw and processed Chuanwu and Caowu. In addition, raw and processed Chuanwu and Caowu can be distinguished by monitoring the ratio of A and MA to HA.

  3. AUTOMATED SYSTEM OF DATA PROCESSING WITH THE IMPLEMENTATION OF RATING TECHNOLOGY OF TEACHING

    Directory of Open Access Journals (Sweden)

    О. И. Дзювина

    2014-01-01

    Full Text Available Rating technology of teaching enables independent and individual work by students and increases their motivation. Purpose: to increase the efficiency of data processing with the implementation of rating technology of teaching. Method: analysis, synthesis, experiment. Results: an automated data processing system for the implementation of rating technology of teaching was developed. Practical implications: education.

  4. Software conception and structure of automated workstation for an experimental data processing

    International Nuclear Information System (INIS)

    In order to increase user efficiency and capabilities in the use of a distributed experimental data processing system based on personal computers, automated experimental data processing workstations were created. Basic principles, including the different kinds of users and their access attributes, operating regimes, the role of the intelligent computer assistant, and the software and hardware needed, are analysed. The software includes a knowledge base, a data base, AI system building tools, intelligent system applications, general programs, and other service and teaching programs. 7 refs.; 4 figs

  5. An image-processing program for automated counting

    Science.gov (United States)

    Cunningham, D.J.; Anderson, W.H.; Anthony, R.M.

    1996-01-01

    An image-processing program developed by the National Institutes of Health, IMAGE, was modified in a cooperative project between remote sensing specialists at the Ohio State University Center for Mapping and scientists at the Alaska Science Center to facilitate estimating numbers of black brant (Branta bernicla nigricans) in flocks at Izembek National Wildlife Refuge. The modified program, DUCK HUNT, runs on Apple computers. Modifications provide users with a pull-down menu that optimizes image quality; identifies objects of interest (e.g., brant) by spectral, morphometric, and spatial parameters defined interactively by users; counts and labels objects of interest; and produces summary tables. Images from digitized photography, videography, and high-resolution digital photography have been used with this program to count various species of waterfowl.
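
    A hedged sketch of the count-and-label step (not DUCK HUNT itself, which ran on Apple computers): threshold the image on intensity as a simple spectral criterion, label connected components, and filter by area as a simple morphometric criterion. NumPy and SciPy are assumed; the threshold and minimum area are illustrative:

        import numpy as np
        from scipy import ndimage

        def count_objects(image, threshold=128, min_area=5):
            mask = image > threshold                # spectral criterion
            labels, n = ndimage.label(mask)         # label connected components
            areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
            return int((areas >= min_area).sum())   # objects passing the area filter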

  6. Automating the parallel processing of fluid and structural dynamics calculations

    Science.gov (United States)

    Arpasi, Dale J.; Cole, Gary L.

    1987-01-01

    The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.

  7. FLAME MONITORING IN POWER STATION BOILERS USING IMAGE PROCESSING

    Directory of Open Access Journals (Sweden)

    K. Sujatha

    2012-05-01

    Full Text Available Combustion quality in power station boilers plays an important role in minimizing the flue gas emissions. In the present work, various intelligent schemes to infer the flue gas emissions by monitoring the flame colour at the furnace of the boiler are proposed. Flame image monitoring involves capturing the flame video over a period of time, with the measurement of various parameters like carbon dioxide (CO2), excess oxygen (O2), nitrogen dioxide (NOx), sulphur dioxide (SOx) and carbon monoxide (CO) emissions, plus the flame temperature at the core of the fireball, the air/fuel ratio and the combustion quality. The higher the quality of combustion, the lower the flue gas emissions at the exhaust. The flame video was captured using an infrared camera. The flame video is then split up into frames for further analysis. The video splitter is used for progressive extraction of the flame images from the video. The images of the flame are then pre-processed to reduce noise. The conventional classification and clustering techniques include the Euclidean distance (L2 norm) classifier. The intelligent classifiers include the Radial Basis Function Network (RBF), the Back Propagation Algorithm (BPA) and a parallel architecture with RBF and BPA (PRBFBPA). The results of the validation are supported with the above-mentioned performance measures, whose values are in the optimal range. The values of the temperatures, combustion quality, SOx, NOx, CO, CO2 concentrations, and air and fuel supplied corresponding to the images were obtained, thereby indicating the necessary control action taken to increase or decrease the air supply so as to ensure complete combustion. In this work, by continuously monitoring the flame images, combustion quality was inferred (complete/partial/incomplete combustion) and the air/fuel ratio can be automatically varied. Moreover, in the existing set-up, measurements like NOx, CO and CO2 are inferred from the samples that are collected periodically or by
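
    As a minimal illustration of the Euclidean distance (L2 norm) classifier mentioned above, assuming each flame image has been reduced to a feature vector and per-class mean vectors are available:

        import numpy as np

        def l2_classify(feature, class_means):
            # Assign the feature vector to the class with the nearest mean.
            names = list(class_means)
            dists = [np.linalg.norm(feature - class_means[n]) for n in names]
            return names[int(np.argmin(dists))]

        # e.g. l2_classify(x, {"complete": m_c, "partial": m_p, "incomplete": m_i})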

  8. Laser materials processing of complex components. From reverse engineering via automated beam path generation to short process development cycles.

    Science.gov (United States)

    Görgl, R.; Brandstätter, E.

    2016-03-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser welding, laser cladding and additive laser manufacturing are given.

  9. UNICOS CPC6: automated code generation for process control applications

    International Nuclear Information System (INIS)

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS). As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software; it dynamically discovers and calls the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA generator (based on PVSS) and the CPC wizard, a dedicated plug-in created to provide the user with a friendly GUI (Graphical User Interface). A tool called UAB Bootstrap manages the different UAB components, like CPC, and their dependencies on the resource packages. This tool guides the control system developer during the installation, update and execution of the UAB components. (authors)
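
    The generation idea itself can be shown with a toy Python sketch (this is not CERN's UAB plug-in API; the template and field names are hypothetical): a plug-in holds a code template per device type and instantiates it for each device in the specification:

        from string import Template

        # One toy template per device type; real UNICOS device types are far richer.
        DEVICE_TEMPLATE = Template(
            "FUNCTION_BLOCK ${name}\n"
            "  (* device type: ${dtype} -- generated, do not edit by hand *)\n"
            "END_FUNCTION_BLOCK\n"
        )

        def generate(instances):
            # instances: list of {"name": ..., "type": ...} dicts from a spec file.
            return "\n".join(
                DEVICE_TEMPLATE.substitute(name=i["name"], dtype=i["type"])
                for i in instances
            )

        # e.g. generate([{"name": "Valve_101", "type": "OnOffValve"}])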

  10. Feynchois: System For Automating The Process Of Feynman Diagram Generation

    CERN Document Server

    Choi, C

    2004-01-01

    We have developed a DTD (Document Type Definition) for an XML (Extensible Markup Language) document for describing Feynman rules of quantum field theoretical models—the document is called FeynPage. A FeynPage can be any XML document that conforms to the FeynPage DTD. A FeynPage can be understood by a human or a computer program that is aware of the FeynPage DTD. We have also developed a Feynman diagram generator, which has been named FeynChois. It provides a user with a full GUI (Graphical User Interface) environment. More importantly, FeynChois knows how to read FeynPage. When FeynChois is asked by a user to generate diagrams, it will first look up the rules in the FeynPage; then, it will generate diagrams according to the rules for any process specified by the user. If the Feynman rules in a FeynPage are modified, FeynChois will generate diagrams according to the modified rules. What FeynChois generates are actually Java™ objects that represent Feynman diagrams. These objects are graphi...

  11. Automated Coronal Loop Identification Using Digital Image Processing Techniques

    Science.gov (United States)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.

    2003-01-01

    The results of a master's thesis project on a study of computer algorithms for automatic identification of optically thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images will be presented. These center splines are proxies of associated magnetic field lines. The project addresses pattern recognition problems in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parametric space via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity as possible candidates for digital loop detection schemes. We have developed synthesized images of the coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information in the identification process. Results from both synthesized and solar images will be presented.
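
    As a hedged illustration of the parametric-space approach (technique 2 above), the sketch below uses scikit-image's straight-line Hough transform; real loop centers are curved, so this shows the parameter-space mechanism rather than the thesis's full scheme:

        from skimage.feature import canny
        from skimage.transform import hough_line, hough_line_peaks

        def linear_features(image):
            edges = canny(image)                       # suppress photon/detector noise
            h, angles, dists = hough_line(edges)       # accumulate parameter space
            return hough_line_peaks(h, angles, dists)  # strongest (angle, distance) pairs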

  12. UNICOS CPC6: Automated Code Generation for Process Control Applications

    CERN Document Server

    Fernandez Adiego, B; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software; it dynamically discovers and calls the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA g...

  13. Automated system for hydrogen and its isotopes monitoring at IVG.1M reactor

    International Nuclear Information System (INIS)

    The system is dedicated to real-time registration of hydrogen (and its isotopes) concentrations in two volumes of an experimental device during irradiation material-testing experiments at the IVG.1M reactor. The system allows automated, simultaneous, real-time sampling and registration of the chemical composition of gases in the operational volumes of two chambers of the experimental installation (at the inlet and outlet sides, for example). This is especially important for experiments on the study of diffusion parameters (with the hydrogen permeation method, for example). The measurement part is based on mass-analyzing radio-frequency gauges of the omegatron type, RMO-13 and MX6407-P (for light masses), with an appropriate set of hardware and software tools. An original programmable oscillator for the omegatron operational frequency and a programmable high-voltage ramp for the MX6407-P deflection system were designed to provide computer control of both mass-analyzers. A CAMAC interface is used to link the measurement system to an IBM PC. The system makes it possible to measure a wide mass spectrum in a chamber under irradiation, as well as to simultaneously measure concentration changes of up to four masses in one chamber and register the spectrum of light masses (2-6) in the second chamber. It also allows operation in data-analysis mode when measurements are finished. Selection of the operational mode, set-up of measurement duration and sampling frequency, and data analysis are provided through the graphic terminal of the IBM PC. The system supports data acquisition and processing during a four-hour reactor power session with the following technical characteristics: number of measurement channels - 2 (one-channel operation is possible); input signal - analogue, 10^-5 to 10.0 V; channel sampling frequency - up to 0.1 Hz. The software runs under MS-DOS or Windows (in DOS emulation mode). Post-measurement data processing provides visual analysis and filtration of the measured arrays. The described automated system is
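
    A minimal sketch of the fixed-rate, two-channel sampling loop implied by the characteristics above (0.1 Hz per channel over a four-hour session); read_channel is a hypothetical stand-in for the CAMAC/mass-analyzer readout:

        import time

        def acquire(read_channel, duration_s=4 * 3600, period_s=10.0):
            samples = []
            t0 = time.time()
            while time.time() - t0 < duration_s:   # four-hour reactor power session
                samples.append((time.time(), read_channel(1), read_channel(2)))
                time.sleep(period_s)               # 0.1 Hz per-channel sampling
            return samples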

  14. An agent-based service-oriented integration architecture for chemical process automation

    Institute of Scientific and Technical Information of China (English)

    Na Luo; Weimin Zhong; Feng Wan; Zhencheng Ye; Feng Qian

    2015-01-01

    In reality, traditional process control systems built upon centralized and hierarchical structures present a weak response to change and are easily shut down by a single failure. Aiming at these problems, a new agent-based service-oriented integration architecture was proposed for chemical process automation systems. Web services were dynamically orchestrated on the internet and agent behaviors were built into them. Data analysis, modeling, optimization, control, fault diagnosis and so on were encapsulated into different web services. Agents were used for service composition by negotiation. A prototype system for poly(ethylene terephthalate) process automation was used as the case study to demonstrate the validity of the integration.

  15. Automation of NLO processes and decays and POWHEG matching in WHIZARD

    CERN Document Server

    Reuter, J; Hoang, A; Kilian, W; Stahlhofen, M; Teubner, T; Weiss, C

    2016-01-01

    We give a status report on the automation of next-to-leading order processes within the Monte Carlo event generator WHIZARD, using GoSam and OpenLoops as providers of one-loop matrix elements. To deal with divergences, WHIZARD uses automated FKS subtraction, and the phase space for singular regions is generated automatically. NLO examples for both scattering and decay processes, with a focus on e+e- processes, are shown. Also, first NLO studies of observables for collisions of polarized lepton beams, e.g. at the ILC, are presented. Furthermore, the automatic matching of the fixed-order NLO amplitudes with emissions from the parton shower within the POWHEG formalism inside WHIZARD is discussed. We also present results for top pairs at threshold in lepton collisions, including matching between a resummed threshold calculation and fixed-order NLO. This allows the investigation of more exclusive differential observables.

  16. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells.

    Directory of Open Access Journals (Sweden)

    Dai Fei Elmer Ker

    Full Text Available Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and
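
    The two quantities the vision system reports can be stated compactly; a minimal sketch, assuming boolean NumPy masks for predicted and ground-truth cell pixels:

        import numpy as np

        def confluency(cell_mask):
            # Fraction of the dish area covered by cells (0..1).
            return cell_mask.mean()

        def precision_recall(predicted, truth):
            tp = np.logical_and(predicted, truth).sum()
            precision = tp / predicted.sum()   # reported above as 0.791 +/- 0.031
            recall = tp / truth.sum()          # reported above as 0.559 +/- 0.043
            return precision, recall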

  17. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  18. Radioactivity monitoring and data processing on the SHMU

    International Nuclear Information System (INIS)

    The radiation monitoring network in the Slovak Republic is presented. The data are collected and processed at the Slovak Hydro-Meteorological Institute. The data from 21 monitored sites are sent to the Slovak Centre of the Radiation Monitoring Network, to the Nuclear Regulatory Authority of the Slovak Republic, to the Bohunice NPP, as well as to Austria and other national monitoring centres. Since January 1999, data from the Slovak Army monitoring network, consisting of 11 measurement sites, have also been obtained. The data processing procedure is described

  19. Comprehensive automation and monitoring of MV grids as the key element of improvement of energy supply reliability and continuity

    Directory of Open Access Journals (Sweden)

    Stanisław Kubacki

    2012-03-01

    Full Text Available The paper presents the issue of comprehensive automation and monitoring of medium voltage (MV) grids as a key element of the Smart Grid concept. The existing condition of MV grid control and monitoring is discussed, and the concept of a solution which will provide the possibility of remote automatic grid reconfiguration and ensure full grid observability from the dispatching system level is introduced. Automation of MV grid switching is discussed in detail, the goal being to isolate a faulty line section and supply electricity at the time of the failure to the largest possible number of recipients. An example of such automation controls' operation is also presented. The paper's second part presents the key role of the quick fault location function and the possibility of remote MV grid reconfiguration in improving power supply reliability (SAIDI and SAIFI indices). It is also shown how increasing the number of points fitted with faulted circuit indicators, with the option of remote switch control from the dispatch system, may help reduce SAIDI and SAIFI indices across ENERGA-OPERATOR SA divisions.

  20. An Advanced Pre-Processing Pipeline to Improve Automated Photogrammetric Reconstructions of Architectural Scenes

    Directory of Open Access Journals (Sweden)

    Marco Gaiani

    2016-02-01

    Full Text Available Automated image-based 3D reconstruction methods are increasingly flooding our 3D modeling applications. Fully automated solutions give the impression that from a sample of randomly acquired images we can derive quite impressive visual 3D models. Although the level of automation is reaching very high standards, image quality is a fundamental pre-requisite to produce successful and photo-realistic 3D products, in particular when dealing with large datasets of images. This article presents an efficient pipeline based on color enhancement, image denoising, color-to-gray conversion and image content enrichment. The pipeline stems from an analysis of various state-of-the-art algorithms and aims to adjust the most promising methods, giving solutions to typical failure causes. The assessment shows how effective image pre-processing, which considers the entire image dataset, can improve the automated orientation procedure and dense 3D point cloud reconstruction, even in the case of poor-texture scenarios.
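
    A hedged sketch of the denoising and color-to-gray steps of the kind described above, using OpenCV; the parameter values are illustrative, not the article's tuned settings:

        import cv2

        def preprocess(path):
            img = cv2.imread(path)                        # BGR color image
            img = cv2.fastNlMeansDenoisingColored(img, None, 10, 10, 7, 21)  # denoise
            gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # color-to-gray conversion
            return cv2.equalizeHist(gray)                 # simple contrast enhancement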

  1. Lifetime of automated control system of technological processes of nuclear power unit and its updating

    International Nuclear Information System (INIS)

    For the analysis of their physical lifetime, components of automated technological process control systems at WWER nuclear power plants are divided into three categories, viz., irreparable equipment, equipment reparable during unit shutdown, and equipment reparable during the unit operation. Conditions for the innovation and upgrading of automated control systems at Czechoslovak WWER-440 nuclear power plants are discussed. Presumably, at the Bohunice V-2 and the Dukovany nuclear power plants the innovation of these systems will proceed in a manner similar to that applied at the Loviisa nuclear power plant in Finland. The decision concerning the global upgrading should be postponed until experience is gained with the new automated control system that will be installed at the first unit of the Mochovce nuclear power plant. The upgrading of the automated control systems at the two Bohunice V-1 power plant units poses specific problems, particularly with respect to the assessment of the true physical lifetime of the technological equipment and to the finding of the best approach to the upgrading. (Z.M.). 2 figs

  2. THE AUTOMATED TESTING SYSTEM OF PROGRAMS WITH THE GRAPHIC USER INTERFACE WITHIN THE CONTEXT OF EDUCATIONAL PROCESS

    OpenAIRE

    Sychev, O.; Kiryushkin, A.

    2009-01-01

    The paper describes the problems of automating the educational process in the course "Programming in a high-level language. Algorithmic languages". The complexities of testing programs with a user interface are noted. Existing analogues were considered. Methods for automating the testing of students' work are offered.

  3. Evaluation of a Multi-Parameter Sensor for Automated, Continuous Cell Culture Monitoring in Bioreactors

    Science.gov (United States)

    Pappas, D.; Jeevarajan, A.; Anderson, M. M.

    2004-01-01

    offer automated, continuous monitoring of cell cultures with a temporal resolution of 1 minute, which is not attainable by sampling via a handheld blood analyzer (i-STAT). Conclusion: The resulting bias and precision found in these cell culture-based studies are comparable to Paratrend sensor clinical results. Although the large error in pO2 measurements (±18 mm Hg) may be acceptable for clinical applications, where Paratrend values are periodically adjusted to a BGA measurement, the O2 sensor in this bundle may not be reliable enough for the single-calibration requirement of sensors used in NASA's bioreactors. The pH and pCO2 sensors in the bundle are reliable and stable over the measurement period, and can be used without recalibration to measure cell cultures in microgravity biotechnology experiments. Future work will test additional Paratrend sensors to provide statistical assessment of sensor performance.

  4. Process monitoring in support of International Atomic Energy Agency safeguards

    International Nuclear Information System (INIS)

    A review of previous efforts in process monitoring for safeguards was conducted. Previous efforts touched on various concepts and a few specific applications, but none was comprehensive in addressing all aspects of a process monitoring application for safeguards. This report develops prototypical process monitoring concepts that can be incorporated into the International Atomic Energy Agency's (IAEA's) general safeguards approach for fuel reprocessing plants. This effort considers existing approaches, recognizing limitations and needed improvements. Prototypical process monitoring applications are developed and proposed for implementation and demonstration in the Integrated Equipment Test (IET) facility, which is located at the Oak Ridge National Laboratory. The specific information needed to accomplish the process monitoring objectives is defined, and the mechanics for obtaining that information are described. Effort is given to the identification and assessment of potential impacts and benefits associated with process monitoring concepts, with particular attention to IAEA, state, and plant operator interests. The historical development of process monitoring is described and the implications of using process monitoring in international safeguards are discussed. Specific process monitoring applications for demonstration in the IET facility are developed in Sects. 6 through 14. 1 fig

  5. Automated system for monitoring groundwater levels at an experimental low-level waste disposal site

    International Nuclear Information System (INIS)

    One of the major problems with disposing of low-level solid wastes in the eastern United States is the potential for water-waste interactions and leachate migration. To monitor groundwater fluctuations and the frequency with which groundwater comes into contact with a group of experimental trenches, work at Oak Ridge National Laboratory's Engineered Test Facility (ETF) has employed a network of water level recorders that feed information from 15 on-site wells to a centralized data recording system. The purpose of this report is to describe the monitoring system being used and to document the computer programs that have been developed to process the data. Included in this report are data based on more than 2 years of water level information for ETF wells 1 through 12 and more than 6 months of data from all 15 wells. The data thus reflect both long-term trends as well as a large number of short-term responses to individual storm events. The system was designed to meet the specific needs of the ETF, but the hardware and computer routines have generic application to a variety of groundwater monitoring situations. 5 references

  6. NeuronMetrics: Software for Semi-Automated Processing of Cultured-Neuron Images

    OpenAIRE

    Narro, Martha L.; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L.

    2007-01-01

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics™ for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based ...

  7. A realization of an automated data flow for data collecting, processing, storing and retrieving

    International Nuclear Information System (INIS)

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XT's. 14 refs., 4 figs

  8. Test/score/report: Simulation techniques for automating the test process

    Science.gov (United States)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repetition of specific test procedures of a ground data system become simpler tasks. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, pressures of schedules, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries. When this capability is fully operational it should greatly reduce the time necessary

  9. Current status of process monitoring for IAEA safeguards

    International Nuclear Information System (INIS)

    Based on a literature survey, this report tries to answer some of the following questions on process monitoring for safeguards purposes in future large-scale reprocessing plants: what is process monitoring, what are the basic elements of process monitoring, what kinds of process monitoring are there, what are the basic problems of process monitoring, what is the relationship between process monitoring and near-real-time materials accountancy, what are the actual results of process monitoring tests, and what should be studied in the future. A brief description of the Advanced Safeguards Approaches proposed by the four states (France, U.K., Japan and U.S.A.), the approach proposed by the U.S.A., the description of process monitoring, the main part of the report published as a result of one of the U.S. Support Programmes for IAEA Safeguards, and an article on process monitoring presented at an IAEA Symposium held in November 1986 are given in the annexes. 24 refs, 20 figs, tabs

  10. Research on the Correlation Between Oil Menitoring and Vibration Monitoring in Information Collecting and Processing Monitoring

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xin-ze; YAN Xin-ping; ZHAO Chun-hong; GAO Xiao-hong; XIAO Han-liang

    2004-01-01

    Oil monitoring and vibration monitoring are two principal techniques for mechanical fault diagnosis and condition monitoring at present. They monitor the mechanical condition by different approaches; nevertheless, oil and vibration monitoring are related in information collecting and processing. In the same mechanical system, the information obtained from the same information source can be described with the same expression form. The expressions are constituted of a structure matrix, a relative matrix and a system matrix. For oil and vibration monitoring, the information sources are correlated while the collection is independent and complementary. Oil monitoring and vibration monitoring also use the same processing method when they yield their information. This research has provided a reasonable and useful approach to combining oil monitoring and vibration monitoring.

  11. Analysis of the thoracic aorta using a semi-automated post processing tool

    Energy Technology Data Exchange (ETDEWEB)

    Entezari, Pegah, E-mail: p-entezari@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Kino, Aya, E-mail: ayakino@gmail.com [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Honarmand, Amir R., E-mail: arhonarmand@yahoo.com [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Galizia, Mauricio S., E-mail: maugalizia@yahoo.com.br [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yang, Yan, E-mail: yyang@vitalimages.com [Vital images Inc, Minnetonka, MN (United States); Collins, Jeremy, E-mail: collins@fsm.northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yaghmai, Vahid, E-mail: vyaghmai@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Carr, James C., E-mail: jcarr@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States)

    2013-09-15

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective, HIPAA-compliant study was approved by our IRB. Transaxial maximum diameters of outer wall to outer wall were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of right pulmonary artery, proximal aortic arch (PROX) immediately proximal to innominate artery, distal aortic arch (DIST) immediately distal to left subclavian artery, and descending aorta (DESC) at the level of diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to intraclass correlation coefficient (ICC) and Bland–Altman plot. The number of cases with manual contouring or center line adjustment for the semi-automated method and also the post-processing time for each method were recorded. Results: The mean difference between semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to do the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels with the highest and lowest numbers of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is

  12. Analysis of the thoracic aorta using a semi-automated post processing tool

    International Nuclear Information System (INIS)

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective, HIPAA-compliant study was approved by our IRB. Transaxial maximum diameters of outer wall to outer wall were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of right pulmonary artery, proximal aortic arch (PROX) immediately proximal to innominate artery, distal aortic arch (DIST) immediately distal to left subclavian artery, and descending aorta (DESC) at the level of diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to intraclass correlation coefficient (ICC) and Bland–Altman plot. The number of cases with manual contouring or center line adjustment for the semi-automated method and also the post-processing time for each method were recorded. Results: The mean difference between semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to do the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels with the highest and lowest numbers of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is
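
    The Bland-Altman comparison used in the records above reduces to a few lines; a minimal sketch, assuming paired diameter measurements in millimetres from the manual and semi-automated methods:

        import numpy as np

        def bland_altman(manual, auto):
            diff = np.asarray(auto) - np.asarray(manual)
            bias = diff.mean()               # mean difference (< 1.3 mm in the study)
            loa = 1.96 * diff.std(ddof=1)    # half-width of the 95% limits of agreement
            return bias, (bias - loa, bias + loa)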

  13. Radiation Monitoring System in Advanced Spent Fuel Conditioning Process Facility

    International Nuclear Information System (INIS)

    The Advanced Spent Fuel Conditioning Process (ACP) is under development for the effective management of spent fuel by converting UO2 into U-metal. For demonstration of this process, a new α-γ type hot cell was built in the IMEF basement. To secure against radiation hazards, this facility needs a radiation monitoring system which will observe the entire operating area in front of the hot cell and the service area at the back of it. This system consists of 7 parts: an Area Monitor for γ-rays, a Room Air Monitor for particulates and iodine in both areas, a Hot Cell Monitor for the high radiation inside the hot cell and the rear door interlock, a Duct Monitor for particulates in the outlet ventilation, an Iodine Monitor for iodine in the outlet duct, CCTV for watching workers and material movement, and a Server for management of the whole monitoring system. After installation and testing, the radiation monitoring system is expected to assist the successful ACP demonstration

  14. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meier, David E.; Coble, Jamie B.; Jordan, David V.; Mcdonald, Luther W.; Forrester, Joel B.; Schwantes, Jon M.; Unlu, Kenan; Landsberger, Sheldon; Bender, Sarah; Dayman, Kenneth J.; Reilly, Dallas D.

    2013-09-01

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
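
    The pattern-recognition step can be hedged into a short sketch: fit a multivariate model (here PCA, via scikit-learn) on gamma spectra from "normal" process conditions and flag spectra whose reconstruction error grows. The component count and threshold are illustrative, not the MIP Monitor's actual algorithm:

        import numpy as np
        from sklearn.decomposition import PCA

        def fit_normal_model(normal_spectra, n_components=5):
            # Rows of normal_spectra are gamma spectra under normal conditions.
            return PCA(n_components=n_components).fit(normal_spectra)

        def is_anomalous(pca, spectrum, threshold):
            # Large reconstruction error suggests a change in process conditions.
            recon = pca.inverse_transform(pca.transform(spectrum[None, :]))
            return float(np.linalg.norm(spectrum - recon[0])) > threshold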

  15. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study

    Science.gov (United States)

    Johansen, Ayna; Brendryen, Håvar

    2016-01-01

    Background eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. Objective We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist’s support of a working alliance, internalization of motivation, and managing lapses. Methods We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several “counseling sessions” about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. Results The program supports the user’s working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. Conclusions A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective. PMID:27354373

  16. Regeneration of Recombinant Antigen Microarrays for the Automated Monitoring of Antibodies against Zoonotic Pathogens in Swine Sera

    Directory of Open Access Journals (Sweden)

    Verena K. Meyer

    2015-01-01

    Full Text Available The ability to regenerate immobilized proteins like recombinant antigens (rAgs) on surfaces is an unsolved problem for flow-based immunoassays on microarray analysis systems. Regeneration on microarray chip surfaces is achieved by changing the protein structures and desorbing antibodies. Afterwards, reactivation of the immobilized protein antigens is necessary for the reconstitution process. Any backfolding should be managed in a way that antibodies are able to detect the protein antigens in the next measurement cycle. The regeneration of rAg microarrays was examined for the first time on the MCR3 flow-based chemiluminescence (CL) microarray analysis platform. The aim was to reuse rAg microarray chips in order to reduce the screening effort and costs. An antibody capturing format was used to detect antibodies against zoonotic pathogens in sera of slaughtered pigs. Different denaturation and reactivation buffers were tested. Acidic glycine-SDS buffer (pH 2.5) and 8 M guanidinium hydrochloride showed the best results in respect of denaturation efficiency. The highest CL signals after regeneration were achieved with a carbonate buffer containing 10 mM DTT and 0.1% BSA for reactivation. Antibodies against Yersinia spp. and hepatitis E virus (HEV) were detected in swine sera on one immunochip over 4 days and 25 measurement cycles. Each cycle took 10 min for detection and regeneration. By using the rAg microarray chip, fast and automated screening of antibodies against pathogens in sera of slaughtered pigs would be possible for zoonosis monitoring.

  17. Development of a Fully Automated, GPS Based Monitoring System for Disaster Prevention and Emergency Preparedness: PPMS+RT

    OpenAIRE

    Anna Szostak-Chrzanowski; Adam Chrzanowski; Don Kim; Jason Bond

    2007-01-01

    The increasing number of structural collapses, slope failures and other natural disasters has led to a demand for new sensors, sensor integration techniques and data processing strategies for deformation monitoring systems. In order to meet extraordinary accuracy requirements for displacement detection in recent deformation monitoring projects, research has been devoted to integrating Global Positioning System (GPS) as a monitoring sensor. Although GPS has been used for monitoring purposes w...

  18. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Bonnie [Adventium Labs; Boddy, Mark [Adventium Labs; Doyle, Frank [Univ. of California, Santa Barbara, CA (United States); Jamshidi, Mo [Univ. of New Mexico, Albuquerque, NM (United States); Ogunnaike, Tunde [Univ. of Delaware, Newark, DE (United States)

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central to many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  19. Development of automated health physics monitoring system for a medical cyclotron complex

    International Nuclear Information System (INIS)

    Full text: A Health Physics surveillance system (Watchdog) for the real-time detection, processing and storage of various radiological protection related data of the Cyclotron facility of the Radiopharmaceutical Division of ANSTO, developed during 1991, has been operational since March 1992 (Mukherjee, B. et al. Proc. 13th Int. Conf. on Cyclotrons and their Applications, Vancouver, Canada, July 1992). In this paper an upgraded version of the Health Physics Monitoring system, installed to monitor the radiation fields in the vicinity of the new PET and SPECT target caves as well as the stack effluent discharge, is presented. Standard gamma and neutron area monitors (GD1...GD7 and ND1) were modified with novel electronic 'piggy-back' circuits to respond to the new ICRP 1990 radiation weighting factors (ICRP Publication No. 61, 1991). The monitor outputs were connected to a datalogger via RF-shielded twisted-pair cables in 'current-loop' mode. Two NaI scintillation detectors (SM1 and SM2) connected to single channel analysers were used in the stack monitors to detect the release of positron-emitting gases and iodine-123. The operation of the suction pump (SP) was controlled by a solenoid valve connected to the datalogger in order to compensate for the 'residence-time' error of the stack detectors. The datalogger was interfaced to a 100 MHz Pentium-CPU based personal computer with a 2 GB hard disk for long-term data storage. The neutron and gamma dose equivalent rates were sampled every minute and displayed in user-friendly mimics. In total 6 mimics were simultaneously operational in 'multi-tasking' mode. The datalogger output signals were linearised using 'multi-degree' polynomials. The data were collected in 24-hour blocks and stored in Excel V5 spreadsheets for statistical analysis and graphical display. The long-term Health Physics data collected in the spreadsheets were used to analyse the global performance of the entire cyclotron facility which includes the
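
    The polynomial linearisation step mentioned above lends itself to a short illustration. Below is a minimal sketch assuming hypothetical calibration pairs of raw current-loop readings versus reference dose-equivalent rates; the polynomial degree, names and numbers are illustrative, not taken from the Watchdog system.

```python
import numpy as np

# Hypothetical calibration data: raw current-loop readings (mA) versus
# reference dose equivalent rates (uSv/h) measured with a standard source.
raw = np.array([4.0, 6.0, 8.0, 12.0, 16.0, 20.0])
dose_rate = np.array([0.0, 1.8, 4.1, 9.7, 16.9, 25.4])

# Fit a "multi-degree" polynomial (here cubic) mapping raw signal to dose rate.
coeffs = np.polyfit(raw, dose_rate, deg=3)

def linearise(reading_ma):
    """Convert a raw datalogger reading into a dose equivalent rate."""
    return np.polyval(coeffs, reading_ma)

print(linearise(10.0))  # interpolated dose rate for a 10 mA reading
```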

  20. 'Au.Raex': An Automated, Long Lasting Exposimeter for Monitoring Persons with Increased Radon-Exposure

    International Nuclear Information System (INIS)

    Within this framework, the automated radon exposimeter 'au.raex' improves the long-established method of radon exposure measurement using nuclear track detectors in a decisive way. Unlike conventional nuclear track exposimeters, this radon measurement is switchable. By movement recognition, the exposure is automatically restricted to the periods in which the device is actually worn, and the exposure time is recorded automatically. Despite these capabilities, au.raex is comfortable to wear: it has roughly the dimensions of a cigarette box. Used as a time-controlled ambient exposimeter, it captures only the radon exposure during relevant, defined periods. The timing control has been implemented in the form of a complete calendar. Thus, on and off times can be defined separately for each weekday, as well as public holidays and holiday periods during which the detector, as an exception, remains completely closed. Data evaluation and programming are performed via the USB port and software on a computer. The switchability of the measurement is achieved by a movable slide at a small distance above the detector film. Both movement- and time-dependent control of the closure are optimized for low electronic energy consumption. The 'au.raex' is applicable for measuring campaigns lasting several years, without the need to charge the device or perform further maintenance. Calibration as well as practical testing of 'au.raex' were performed by the Radon Laboratory of the Karlsruhe Institute of Technology (KIT) using their own nuclear track films and evaluation process. To validate the operation of the instrument, measurements are to be performed on persons with known increased radon exposure. (author)
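
    The calendar-gated shutter logic described above can be sketched in a few lines; the schedule structure, holiday list and sampling loop below are hypothetical placeholders, not the au.raex firmware.

```python
from datetime import datetime

# Hypothetical weekly schedule: open hours per weekday (0 = Monday).
# Holiday dates keep the detector closed regardless of the weekday rule.
schedule = {i: (8, 17) for i in range(5)}  # exposed 08:00-17:00 on workdays
holidays = {(12, 24), (12, 25)}

def shutter_open(now: datetime, worn: bool) -> bool:
    """Return True if the detector film should be exposed right now."""
    if not worn or (now.month, now.day) in holidays:
        return False
    hours = schedule.get(now.weekday())
    return hours is not None and hours[0] <= now.hour < hours[1]

# Exposure time is accumulated only while the shutter is open, e.g. inside
# the sampling loop: exposure_s += dt if shutter_open(t, worn) else 0
```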

  1. Opportunities for Automated Demand Response in California’s Dairy Processing Industry

    Energy Technology Data Exchange (ETDEWEB)

    Homan, Gregory K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    During periods of peak electrical demand on the energy grid or when there is a shortage of supply, the stability of the grid may be compromised or the cost of supplying electricity may rise dramatically, respectively. Demand response programs are designed to mitigate the severity of these problems and improve reliability by reducing the demand on the grid during such critical times. In 2010, the Demand Response Research Center convened a group of industry experts to suggest potential industries that would be good demand response program candidates for further review. The dairy industry was suggested due to the perception that the industry had suitable flexibility and automatic controls in place. The purpose of this report is to provide an initial description of the industry with regard to demand response potential, specifically automated demand response. This report qualitatively describes the potential for participation in demand response and automated demand response by dairy processing facilities in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use. Typical process equipment and controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Two case studies of demand response at dairy facilities in California and across the country are reviewed. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  2. Monitoring of sediment transport processes using tracer stones

    Science.gov (United States)

    Redtenbacher, Matthias; Harb, Gabriele; Barbas, Teresa; Schneider, Josef

    2014-05-01

    In the last decades the vulnerability of our civilization to damaging geomorphological events like debris flows and exceptional floods has increased. The reasons are, on the one hand, that the global hydrological cycle has become more intense during the recent past and, on the other hand, that the material assets of the population have increased. Risk prevention, risk analysis and forecast methods have thus become more important. Geomorphological processes are often not easy to analyse. To get information about the probability and the consequences of these increasingly frequent events, it is necessary to analyse the availability of sediments in the catchment area, the erosion processes of the sediment and the transport of the sediments along torrents. The project ClimCatch, which started in April 2012, investigates the torrential sediment transport processes in a non-glaciated Alpine valley in Austria and the related natural hazards under the viewpoint of on-going climate change. Due to an extreme precipitation event in 2011, debris-flow-like discharges occurred in this catchment, and since then the sediment sources there have been highly erodible. The aims of the project are to derive a quantitative sediment budget model, including geomorphic process domains, determining sediment transport in the river system and the measurement of bed load output, besides others. To quantify river sediment dynamics several different methodologies are applied within the project. Discharge and sediment transport measurement as well as hydrological stations are installed in the catchment area. Aggradation and erosion are analysed by means of laser scanning technology in the sediment storage basin which is located at the outlet of the catchment. The observation and measurement of the sediment transport is performed by the application of radio telemetry stones and colour tracer stones. Line pebble counting, automated grain size determination using photographs and sieving on-site is performed to get qualitative sediment

  3. A chemical sensor and biosensor based totally automated water quality monitor for extended space flight: Step 1

    Science.gov (United States)

    Smith, Robert S.

    1993-01-01

    The result of a literature search to consider which technologies should be represented in a totally automated water quality monitor for extended space flight is presented. It is the result of the first summer of a three-year JOVE project. The next step will be to build a test platform at the author's school, St. John Fisher College. This will involve undergraduates in NASA-related research. The test flow injection analysis system will be used to test the detection limits of sensors and the performance of sensors in groups. Sensor companies and research groups will be encouraged to produce sensors which are not currently available and are needed for this project.

  4. Image Processing for Automated Analysis of the Fluorescence In-Situ Hybridization (FISH) Microscopic Images

    Czech Academy of Sciences Publication Activity Database

    Schier, Jan; Kovář, Bohumil; Kočárek, E.; Kuneš, Michal

    Berlin Heidelberg: Springer-Verlag, 2011, s. 622-633. (Lecture Notes in Computer Science). ISBN 978-3-642-24081-2. [5th International Conference, ICHIT 2011. Daejeon (KR), 22.09.2011-24.09.2011] R&D Projects: GA TA ČR TA01010931 Institutional research plan: CEZ:AV0Z10750506 Keywords: fluorescence in-situ hybridization * image processing * image segmentation Subject RIV: IN - Informatics, Computer Science http://library.utia.cas.cz/separaty/2011/ZS/shier-image processing for automated analysis of the fluorescence in-situ hybridization (fish) microscopic images.pdf

  5. Power up your plant - An introduction to integrated process and power automation

    Energy Technology Data Exchange (ETDEWEB)

    Vasel, Jeffrey

    2010-09-15

    This paper discusses how a single integrated system can increase energy efficiency, improve plant uptime, and lower life cycle costs. Integrated Process and Power Automation is a new system integration architecture and power strategy that addresses the needs of the process and power generation industries. The architecture is based on Industrial Ethernet standards such as IEC 61850 and Profinet as well as Fieldbus technologies. The energy efficiency gains from integration are discussed in a power generation use case. A power management system success story from a major oil and gas company, Petrobras, is also discussed.

  6. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure, with the aim of obtaining knowledge about the parameters of other important response processes that are not monitored, when the structure is subject to some Gaussian load field in space and time. The...
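
    Since the abstract is truncated, only the general shape of the machinery can be indicated: for Gaussian responses, the log-likelihood being maximized takes the standard form below. This notation is a generic formulation, not taken from the paper itself.

```latex
% Log-likelihood of the observed response vector x (the simultaneously
% monitored processes, stacked over time) under a Gaussian model whose
% mean \mu(\theta) and covariance \Sigma(\theta) depend on the load-field
% and structural parameters \theta:
\ell(\theta) = -\frac{1}{2}\Big[\, n\log(2\pi) + \log\det\Sigma(\theta)
  + \big(x-\mu(\theta)\big)^{\top}\Sigma(\theta)^{-1}\big(x-\mu(\theta)\big) \Big],
\qquad \hat{\theta} = \arg\max_{\theta}\,\ell(\theta).
```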

  7. Research on machine vision system of monitoring injection molding processing

    Science.gov (United States)

    Bai, Fan; Zheng, Huifeng; Wang, Yuebing; Wang, Cheng; Liao, Si'an

    2016-01-01

    With the wide development of the injection molding process, an embedded monitoring system based on machine vision has been developed to automatically monitor abnormalities in injection molding processing. First, the hardware system and the embedded software system were designed. Then camera calibration was carried out to establish an accurate model of the camera and correct distortion. Next, a segmentation algorithm was applied to extract the monitored objects of the injection molding process. The operating procedure of the system comprised initialization, process monitoring and product detail detection. Finally, the experimental results were analyzed, including the detection rates for the various kinds of abnormality. The system could realize multi-zone monitoring and product detail detection of the injection molding process with high accuracy and good stability.
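
    The segmentation step can be illustrated with a short OpenCV sketch; the Otsu thresholding, the minimum-area filter and the camera index below are illustrative assumptions, not the paper's actual algorithm.

```python
import cv2

# Grab one frame from the monitoring camera (device index is an assumption).
frame = cv2.VideoCapture(0).read()[1]
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Otsu thresholding separates molded parts from a darker background.
_, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Extract candidate objects and discard specks below a minimum area.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
objects = [c for c in contours if cv2.contourArea(c) > 500.0]

# An empty object list at a moment when a part is expected would be
# flagged as an abnormality (e.g. short shot or ejection failure).
print(f"{len(objects)} object(s) detected in frame")
```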

  8. Remotely Managing Operation, Data Collection and processing in Modern Automated ET Networks

    Science.gov (United States)

    Johnson, D.; Xu, L.; Li, J.; Yuan, G.; Sun, X.; Zhu, Z.; Tang, X.; Velgersdyk, M.; Beaty, K.; Fratini, G.; Kathilankal, J. C.; Burba, G. G.

    2014-12-01

    The significant increase in overall data generation and available computing power in the recent years has greatly improved spatial and temporal data coverage of evapotranspiration (ET) measurements on multiple scales, ranging from a single station to continental scale ET networks. With the increased number of ET stations and increased amount of data flowing from each station, modern tools are needed to effectively and efficiently handle the entire infrastructure (hardware, software and data management). These tools can automate key stages of ET network operation, remotely providing real-time ET rates and alerts for the health of the instruments. This can help maximize time dedicated to answering research questions, rather than to station management. This year, the Chinese Ecosystem Research Network (CERN) within the Chinese Academy of Sciences implemented a large-scale 27-station national ET network across China to measure and understand the water cycle from a variety of ecosystems. It includes automated eddy covariance systems, on-site flux computations, wireless communication, and a network server for system, data, and user management. This presentation will discuss the latest information on the CERN network, methods and hardware for ET measurements, tools for automated data collection, data processing and quality control, and data transport and management of the multiple stations. This system description is beneficial for individuals and institutions interested in setting up or modifying present ET networks consisting of single or multiple stations spread over geographic locations ranging from single field site or watershed to national or continental scale.

  9. An Integrated Solution for both Monitoring and Controlling for Automization Using Wireless Sensor Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    M Gnana Seelan

    2013-02-01

    Full Text Available Temperature monitoring plays a major role in controlling temperature under varied conditions. This process is common in all critical areas like data centres, server rooms, grid rooms and other rooms equipped for data communication. It is mandatory for each organization/industry to implement such a process, as most of the critical data would be in the data centre along with the network infrastructure, in which various electronic, electrical and mechanical devices are involved in data transmission. These devices are very much dependent on environmental factors such as temperature, moisture, humidity etc., and also emit heat in the form of thermal energy when they are functional. To remove this heat, the server/data centre room(s) would be equipped with multiple (distributed) air-conditioning (AC) systems to provide a cooling environment and maintain the temperature level of the room. The proposed paper is a study of the automization of monitoring and controlling temperature as per desired requirements with a WSN network.
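
    A minimal sketch of the monitor-and-control loop such a WSN node could implement is shown below; the sensor read-out, set-points and actuator call are hypothetical placeholders, and the hysteresis band is an assumed design choice to avoid rapid AC cycling.

```python
import time

HIGH_SETPOINT_C = 27.0  # switch extra AC on above this temperature
LOW_SETPOINT_C = 23.0   # switch it back off below this (hysteresis band)

def read_sensor() -> float:
    """Placeholder for a WSN temperature read-out (assumption)."""
    return 25.0

def set_ac(on: bool) -> None:
    """Placeholder for commanding a distributed AC unit (assumption)."""
    print("AC on" if on else "AC off")

ac_on = False
while True:
    t = read_sensor()
    if t > HIGH_SETPOINT_C and not ac_on:
        set_ac(True)   # room too warm: bring extra cooling online
        ac_on = True
    elif t < LOW_SETPOINT_C and ac_on:
        set_ac(False)  # band regained: avoid rapid on/off cycling
        ac_on = False
    time.sleep(60)     # sample once per minute
```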

  10. High performance, inexpensive solar cell process capable of a high degree of automation

    Science.gov (United States)

    Shah, P.; Fuller, C. R.

    1976-01-01

    This paper proposes a process for inexpensive, high-performance solar cell fabrication that can be automated for further cost reduction and higher throughput. The unique feature of the process is the use of oxides as doping sources for simultaneous n(+) junction formation and back p(+) layer, as a mask for metallization and as an in situ AR coating for spectrum matching. Cost analysis is performed to show that significant cost reductions over the conventional process are possible using the proposed scheme, and the cost-intensive steps that can be further reduced to make the process compatible with the needed price goal of 50 cents/watt are identified. The process was demonstrated by fabricating n(+)-p cells using arsenic-doped oxides. Simple n(+)-p structure cells showed corrected efficiencies of 14.5% (AM0) and 12% with doped oxide as an in situ antireflection coating.

  11. An Automated System for the Detection of Stratified Squamous Epithelial Cancer Cell Using Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    Ram Krishna Kumar

    2013-06-01

    Full Text Available Early detection of cancer is a difficult problem, and if it is not detected in the starting phase the cancer can be fatal. Current medical procedures used to diagnose cancer in body parts are time-consuming and require extensive laboratory work. This work is an endeavour towards the possible recognition of cancer cells in a body part. The process consists of taking images of the affected area and applying digital image processing to obtain a morphological pattern that differentiates normal cells from cancer cells. The technique differs from visual inspection and the biopsy process. Image processing enables the visualization of cellular structure with substantial resolution. The aim of the work is to exploit differences in cellular organization between cancerous and normal tissue using image processing techniques, thus allowing for automated, fast and accurate diagnosis.

  12. A fuzzy model for processing and monitoring vital signs in ICU patients

    Directory of Open Access Journals (Sweden)

    Valentim Ricardo AM

    2011-08-01

    Full Text Available Abstract
    Background: The area of hospital automation has been the subject of much research, addressing relevant issues which can be automated, such as: management and control (electronic medical records, scheduling appointments, hospitalization, among others); communication (tracking patients, staff and materials); development of medical, hospital and laboratory equipment; monitoring (patients, staff and materials); and aid to medical diagnosis (according to each speciality).
    Methods: In this context, this paper presents a fuzzy model for helping the medical diagnosis of Intensive Care Unit (ICU) patients whose vital signs are monitored through a multiparameter heart screen. Intelligent systems techniques were used in acquiring the data and processing it (sorting, transforming, among others) into useful information, conducting a pre-diagnosis and providing, when necessary, alert signs to the medical staff.
    Conclusions: The use of fuzzy logic applied to the medical area can be very useful if seen as a tool to assist specialists in this area. This paper presented a fuzzy model able to monitor and classify the condition of the vital signs of hospitalized patients, sending alerts according to the pre-diagnosis performed, thereby helping the medical diagnosis.
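
    To make the fuzzy classification concrete, here is a minimal sketch using triangular membership functions over heart rate; the linguistic terms, breakpoints and alert rule are illustrative assumptions, not the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_heart_rate(hr_bpm):
    """Degrees of membership of a heart rate in three linguistic terms."""
    return {
        "bradycardia": tri(hr_bpm, 20, 40, 60),
        "normal": tri(hr_bpm, 50, 75, 100),
        "tachycardia": tri(hr_bpm, 90, 140, 220),
    }

memberships = classify_heart_rate(112)
# Simple alert rule: warn when an abnormal term dominates the 'normal' one.
if max(memberships["bradycardia"], memberships["tachycardia"]) > memberships["normal"]:
    print("pre-diagnosis alert:", memberships)
```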

  13. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    Science.gov (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits. PMID:26227212
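
    Of the optimization strategies listed, simulated annealing is the easiest to sketch; the energy function, step size and cooling schedule below are generic placeholders, not the calibration objective actually used on the platform.

```python
import math
import random

def anneal(energy, x0, step, t0=1.0, cooling=0.995, iters=5000):
    """Generic simulated annealing: minimize energy(x) from start point x0."""
    x, e, t = x0, energy(x0), t0
    for _ in range(iters):
        cand = x + random.uniform(-step, step)   # propose a neighbour
        de = energy(cand) - e
        # Accept improvements always; accept worse moves with probability
        # exp(-de/t), which shrinks as the temperature t cools down.
        if de < 0 or random.random() < math.exp(-de / t):
            x, e = cand, e + de
        t *= cooling
    return x, e

# Toy objective standing in for, e.g., a sensor calibration residual.
best_x, best_e = anneal(lambda x: (x - 3.0) ** 2 + math.sin(5 * x), 0.0, 0.5)
print(best_x, best_e)
```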

  14. Processing of the WLCG monitoring data using NoSQL

    Science.gov (United States)

    Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  15. Processing of the WLCG monitoring data using NoSQL

    International Nuclear Information System (INIS)

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  16. Automated processing of forensic casework samples using robotic workstations equipped with nondisposable tips: contamination prevention.

    Science.gov (United States)

    Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M

    2008-05-01

    An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol. PMID:18471209

  17. Quality control of environmental radiation monitoring process

    International Nuclear Information System (INIS)

    This report summarizes the analytical results of the Environmental Monitoring Program of the Centro de Desenvolvimento da Tecnologia Nuclear (CDTN) for the period January 2003 to September 2003. A statistical treatment using control charts for periodicity and trend analysis according to temporal variation is also carried out. Moreover, a comparison of radioactive and stable element concentrations with the derived and intake limits for ingestion and inhalation recommended by the Comissao Nacional de Energia Nuclear, the Fundacao Estadual do Meio Ambiente (FEAM) and the Instituto Brasileiro do Meio Ambiente (IBAMA) is performed. The results comply with those recommended by the legislation. (author)

  18. An improved, computer-based, on-line gamma monitor for plutonium anion exchange process control

    International Nuclear Information System (INIS)

    An improved, low-cost, computer-based system has replaced a previously developed on-line gamma monitor. Both instruments continuously profile uranium, plutonium, and americium in the nitrate anion exchange process used to recover and purify plutonium at the Los Alamos Plutonium Facility. The latest system incorporates a personal computer that provides full-feature multichannel analyzer (MCA) capabilities by means of a single-slot, plug-in integrated circuit board. In addition to controlling all MCA functions, the computer program continuously corrects for gain shift and performs all other data processing functions. This Plutonium Recovery Operations Gamma Ray Energy Spectrometer System (PROGRESS) provides on-line process operational data essential for efficient operation. By identifying abnormal conditions in real time, it allows operators to take corrective actions promptly. The decision-making capability of the computer will be of increasing value as we implement automated process-control functions in the future. 4 refs., 6 figs
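
    The continuous gain-shift correction mentioned above can be illustrated briefly: a known reference peak is located in each acquired spectrum and the energy calibration is rescaled so the peak stays put. The channel numbers and peak energy below are hypothetical, and a real system would use a peak centroid rather than a bare maximum.

```python
import numpy as np

REF_ENERGY_KEV = 661.7       # hypothetical reference line used for tracking
NOMINAL_PEAK_CHANNEL = 1324  # channel where the line sits at nominal gain

def gain_correction(spectrum, window=30):
    """Locate the reference peak and return the multiplicative gain factor."""
    lo = NOMINAL_PEAK_CHANNEL - window
    hi = NOMINAL_PEAK_CHANNEL + window
    observed = lo + int(np.argmax(spectrum[lo:hi]))
    return NOMINAL_PEAK_CHANNEL / observed  # >1 means the gain has sagged

# Applied when converting channels to energy:
# energy_kev = channel * kev_per_channel * gain_correction(spectrum)
```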

  19. Subsea flow assurance and process monitoring via gamma radiation

    International Nuclear Information System (INIS)

    Condition monitoring and process control with the use of gamma radiation is considered to be the most reliable detection principle for a wide range of applications throughout the oil and gas industries, from measuring mechanical integrity to dynamic process fluid monitoring. The growing number of advanced subsea processing projects and pipeline flow assurance studies currently adds an increasing number of subsea applications to the radiation measurement reference list. (author)

  20. Implications of critical chain methodology for business process flexible automation projects in economic organizations

    Directory of Open Access Journals (Sweden)

    Paul BRUDARU

    2009-12-01

    Full Text Available Business process flexible automation projects involve the use of methods and technologies from the Business Process Management (BPM) area that aim at increasing the agility of organizations in changing their business processes as a response to environmental changes. BPM-type projects are a mix of process improvement and software development, which implies high complexity in managing them. The successful implementation of these projects involves overcoming inherent problems such as delays in project activities, multi-tasking and lack of focus, which cannot be solved by traditional project management tools. An approach which takes account of the difficulties of BPM projects is the critical chain methodology. Using the critical chain method provides the methodological foundation necessary for the successful completion of BPM-type projects.

  1. A Scheme for Automation of Telecom Data Processing for Business Application

    CERN Document Server

    Nair, T R Gopalakrishnan; V., Suma; Maharajan, Ezhilarasan

    2012-01-01

    As the telecom industry is witnessing large-scale growth, one of the major challenges in the domain is the analysis and processing of telecom transactional data, which is generated in large volumes by embedded system communication controllers having various functions. This paper deals with the analysis of such raw data files, which are made up of sequences of tokens. It also depicts the method by which the files are parsed to extract the information, leading to final storage in predefined database tables. The parser is capable of reading the file in a line-structured way and storing the tokens into the predefined database tables. The whole process is automated using the SSIS tools available in SQL Server. A log table is maintained at each step of the process, which enables tracking of the file for risk mitigation. The result is a pipeline that can extract, transform and load the data for processing.
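
    A minimal sketch of such line-oriented token parsing is given below; the record layout (pipe-separated fields), file name and table structure are invented for illustration, since the abstract does not specify them, and SQLite stands in for the SQL Server/SSIS stack.

```python
import csv
import sqlite3

# Hypothetical layout: each line holds pipe-separated tokens from a
# communication controller, e.g. "20120401|CTRL07|CALL|182".
conn = sqlite3.connect("telecom.db")
conn.execute("CREATE TABLE IF NOT EXISTS events "
             "(day TEXT, controller TEXT, kind TEXT, count INTEGER)")

with open("raw_transactions.txt", newline="") as fh:
    for lineno, row in enumerate(csv.reader(fh, delimiter="|"), start=1):
        if len(row) != 4:
            # Log-and-skip stands in for the paper's per-step log table.
            print(f"line {lineno}: malformed record skipped")
            continue
        day, controller, kind, count = row
        conn.execute("INSERT INTO events VALUES (?, ?, ?, ?)",
                     (day, controller, kind, int(count)))

conn.commit()
```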

  2. Instrumentation, Field Network and Process Automation for the Cryogenic System of the LHC Test String

    OpenAIRE

    Suraci, A.; Bager, T.; Balle, Ch.; Blanco, E.; Casas, J.; Gomes, P.; Pelletier, S.; Serio, L.; Vauthier, N.

    2001-01-01

    CERN is now setting up String 2, a full-size prototype of a regular cell of the LHC arc. It is composed of two quadrupole and six dipole magnets, and a separate cryogenic distribution line (QRL) for the supply and recovery of the cryogen. An electrical feed box (DFB), with up to 38 High Temperature Superconducting (HTS) leads, powers the magnets. About 700 sensors and actuators are distributed along four Profibus DP and two Profibus PA field buses. The process automation is handled by two contro...

  3. A realization of an automated data flow for data collecting, processing, storing and retrieving

    Energy Technology Data Exchange (ETDEWEB)

    Friedsam, H.; Pushor, R.; Ruland, R.

    1986-11-01

    GEONET is a database system developed at the Stanford Linear Accelerator Center for the alignment of the Stanford Linear Collider. It features an automated data flow, ranging from data collection using HP110 handheld computers to processing, storing and retrieving data and finally to adjusted coordinates. This paper gives a brief introduction to the SLC project and the applied survey methods. It emphasizes the hardware and software implementation of GEONET using a network of IBM PC/XT's. 14 refs., 4 figs.

  4. Annotated bibliography of films in automation, data processing, and computer science

    CERN Document Server

    Soloman, Martin B Jr

    2015-01-01

    With the rapid development of computer science and the expanding use of computers in all facets of American life, there has been made available a wide range of instructional and informational films on automation, data processing, and computer science. Here is the first annotated bibliography of these and related films, gathered from industrial, institutional, and other sources.This bibliography annotates 244 films, alphabetically arranged by title, with a detailed subject index. Information is also provided concerning the intended audience, rental-purchase data, ordering procedures, and such s

  5. Automated longitudinal monitoring of in vivo protein aggregation in neurodegenerative disease C. elegans models

    OpenAIRE

    Cornaglia, Matteo; Krishnamani, Gopalan; Mouchiroud, Laurent; Sorrentino, Vincenzo; Lehnert, Thomas; Auwerx, Johan; Gijs, Martin A. M.

    2016-01-01

    Background While many biological studies can be performed on cell-based systems, the investigation of molecular pathways related to complex human dysfunctions – e.g. neurodegenerative diseases – often requires long-term studies in animal models. The nematode Caenorhabditis elegans represents one of the best model organisms for many of these tests and, therefore, versatile and automated systems for accurate time-resolved analyses on C. elegans are becoming highly desirable tools in the field. ...

  6. Automated analyser for monitoring trace amounts of volatile chloro-organic compounds in recirculated industrial water

    OpenAIRE

    Elżbieta Przyk; Jacek Namieśnik; Wojciech Chrzanowski; Andrzej Wasik; Wacław Janicki

    2002-01-01

    An automated analyser of volatile chloro-organic compounds in water was constructed and tested using standard mixtures of dichloromethane and dichloroethane. It was based on continuous, countercurrent gas stripping of the liquid sample followed by periodic trapping of the analytes on two traps alternately connected to the bubbler outlet, and thermal desorption. When one trap performed adsorption, the other underwent desorption and cooling. Analytes were detected by an ECD detector. Integratio...

  7. Web-based execution of graphical work-flows: a modular platform for multifunctional scientific process automation

    International Nuclear Information System (INIS)

    The Passerelle process automation suite offers a fundamentally modular solution platform, based on a layered integration of several best-of-breed technologies. It has been successfully applied by Synchrotron Soleil as the sequencer for data acquisition and control processes on its beamlines, integrated with TANGO as a control bus and GlobalScreen(TM) as the SCADA package. Since last year it has been used as the graphical work-flow component for the development of an Eclipse-based Data Analysis Work Bench at ESRF. The top layer of Passerelle exposes an actor-based development paradigm, based on the Ptolemy framework (UC Berkeley). Actors provide explicit reusability and strong decoupling, combined with an inherently concurrent execution model. Actor libraries exist for TANGO integration, web services, database operations, flow control, rules-based analysis, mathematical calculations, launching external scripts etc. Passerelle's internal architecture is based on OSGi, the major Java framework for modular service-based applications. A large set of modules exists that can be recombined as desired to obtain different features and deployment models. Besides desktop versions of the Passerelle work-flow workbench, there is also the Passerelle Manager. It is a secured web application including a graphical editor, for centralized design, execution, management and monitoring of process flows, integrating standard Java Enterprise services with OSGi. We will present the internal technical architecture, some interesting application cases and the lessons learnt. (authors)
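
    The actor paradigm described here can be miniaturized in a few lines: independent actors communicate only through message queues, giving the decoupling and inherent concurrency mentioned above. This is a generic sketch, not Passerelle or Ptolemy code.

```python
import threading
import queue

def actor(inbox, outbox, transform):
    """Generic actor: consume messages, transform them, pass them on."""
    while (msg := inbox.get()) is not None:   # None is the stop token
        outbox.put(transform(msg))
    outbox.put(None)

q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
# Two decoupled, concurrently running stages of a tiny work-flow.
threading.Thread(target=actor, args=(q1, q2, lambda x: x * 2)).start()
threading.Thread(target=actor, args=(q2, q3, lambda x: x + 1)).start()

for item in [1, 2, 3]:
    q1.put(item)
q1.put(None)

while (out := q3.get()) is not None:
    print(out)   # 3, 5, 7
```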

  8. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual efforts in library routines. Library automation covers collection, storage, administration, processing, preservation, communication etc.

  9. Attempts to Automate the Process of Generation of Orthoimages of Objects of Cultural Heritage

    Science.gov (United States)

    Markiewicz, J. S.; Podlasiak, P.; Zawieska, D.

    2015-02-01

    At present, digital documentation recorded in the form of raster or vector files is the obligatory way of inventorying historical objects. The orthoimage is a cartometric form of photographic presentation of information in a two-dimensional reference system. The paper discusses the issue of automating orthoimage generation based on TLS data and digital images. At present, attempts are made to apply modern technologies not only for the needs of surveys, but also during data processing. This paper presents attempts at utilising appropriate algorithms and the author's application for automatic generation of the projection plane, for the acquisition of intensity orthoimages from TLS data. Such planes are defined manually in the majority of popular TLS data processing applications. A separate issue related to RGB image generation is the orientation of digital images in relation to scans. This is important in particular when scans and photographs are not taken simultaneously. The paper presents experiments concerning the utilisation of the SIFT algorithm for automatic matching of intensity orthoimages and digital (RGB) photographs. Satisfactory results have been obtained, both for the automation of the process and for the quality of the resulting orthoimages.
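
    The SIFT-based matching step can be sketched with OpenCV; the file names and the ratio-test threshold are assumptions, and the original work may well use a different SIFT implementation.

```python
import cv2

# Intensity orthoimage (from TLS) and RGB photograph to be registered.
img1 = cv2.imread("intensity_ortho.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("rgb_photo.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Nearest-neighbour matching with Lowe's ratio test to reject ambiguities.
matcher = cv2.BFMatcher()
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.75 * n.distance]
print(f"{len(good)} tentative correspondences")
```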

  10. Optical sensors for process monitoring in biotechnology

    Science.gov (United States)

    Ploetz, F.; Schelp, Carsten; Anders, K.; Eberhardt, F.; Scheper, Thomas-Helmut; Bueckmann, F.

    1991-09-01

    The development and application of various optical sensors will be presented. Among these are optical sensors (optrodes) with immobilized enzymes, coenzymes, and labeled antibodies. The NADH formation of coenzyme-dependent enzymes was used for the detection of lactate, pyruvate, mannitol, ethanol, and formate. An enzyme optrode based on a pH optrode as a transducer for the monitoring of urea and penicillin in fermentation media was developed. For preparing an oxygen optrode, oxygen-sensitive fluorophores were entrapped in a gas-permeable silicone matrix attached to the distal end of a bifurcated fiber-optic waveguide bundle. By labeling immunocomponents with fluorophores, or with enzymes which generate fluorophores or chromophores, immunoreactions were observed by optical sensors.

  11. Development and testing of an automated High-resolution InSAR volcano-monitoring system in the MED-SUV project

    Science.gov (United States)

    Chowdhury, Tanvir Ahmed; Minet, Christian; Fritz, Thomas; Rodriguez Gonzalez, Fernando

    2015-04-01

    Volcanic unrest, which produces a variety of geological and hydrological hazards, is difficult to predict. It is therefore important to monitor volcanoes continuously. The monitoring of active volcanoes requires the reliable measurement of surface deformation before, during and after volcanic activity. Besides improving the understanding of the geophysical processes underlying the volcanic systems of Vesuvius/Campi Flegrei and Mt. Etna, one of the main goals of the MED-SUV (MEDiterranean SUpersite Volcanoes) project is to design a system for automatically monitoring ground deformation over active volcanoes. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and the small baseline subset algorithm (SBAS) provide powerful tools for observing surface changes with millimeter accuracy. All the mentioned techniques address the challenges by exploiting medium to large SAR image stacks. The generation of interferometric products constitutes a major effort in terms of processing and planning. It requires a high degree of automation, robustness and quality control of the overall process. As a consequence of these requirements and constraints, the Integrated Wide Area Processor (IWAP) developed at DLR is introduced in the framework of a remote sensing task of the MED-SUV project. The IWAP has been conceived and designed to optimize the processing workflow in order to minimize the processing time. Moreover, a quality control concept has been developed and integrated in the workflow. The IWAP is structured into three parts: (i) preparation of an order file containing configuration parameters, which invokes the processor; (ii) manual interactions performed by the operator by means of visual interfaces, upon request from the processor; (iii) analysis of the final product supported by extensive product visualization. This visualization supports the interpretation of the results without the need of

  12. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    International Nuclear Information System (INIS)

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar+ ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern
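
    The disk-by-disk matching against a simulated thickness series reduces, at its core, to a nearest-pattern search; the similarity metric below (sum of squared differences over disk pixels) is a plausible stand-in, not necessarily the authors' choice, and the data shapes are invented.

```python
import numpy as np

def estimate_thickness(experimental_disks, simulated_series):
    """experimental_disks: array of normalized disk intensities;
    simulated_series: dict mapping thickness (nm) to a like-shaped array."""
    def dissimilarity(sim):
        return float(np.sum((experimental_disks - sim) ** 2))
    # The thickness of the most similar simulated pattern is the estimate.
    return min(simulated_series, key=lambda t: dissimilarity(simulated_series[t]))

# Toy usage with random stand-in data for three candidate thicknesses:
rng = np.random.default_rng(0)
exp = rng.random((7, 32, 32))
sims = {t: exp + rng.normal(0, 0.01 * t, exp.shape) for t in (50, 100, 150)}
print(estimate_thickness(exp, sims), "nm")
```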

  13. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar+ ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern.

  14. Monitoring of steam sterilization processes in the dental office

    NARCIS (Netherlands)

    J.P.C.M. van Doornmalen; A.G.M. Rietmeijer; A.J. Feilzer; K. Kopinga

    2013-01-01

    In dental offices steam sterilization is used to prevent infection of staff and patient. The necessity of sterilization is obvious. To ensure effective sterilization processes each load has to be monitored. Based on literature and standards a state of the art concept of every load monitoring is desc

  15. Monitoring and control of fine abrasive finishing processes

    DEFF Research Database (Denmark)

    Lazarev, Ruslan

    In engineering, surfaces with specified functional properties are in high demand in various applications. The desired surface finish can be obtained using several methods. Abrasive finishing is one of the most important processes in the manufacturing of mould and die tools. It is a principal method to remove unwanted material and to obtain the desired geometry, surface quality and surface functional properties. The automation and computerization of finishing processes involves the utilisation of robots, specialized machines with several degrees of freedom, sensors and data acquisition systems. The focus of this

  16. Nystatin-induced changes in yeast monitored by time-resolved automated single cell electrorotation.

    Science.gov (United States)

    Hölzel, R

    1998-10-23

    A widespread use of electrorotation for the determination of cellular and subcellular properties has been hindered so far by the need for manual recording of cell movements. Therefore, a system has been developed that allows the automatic collection of electrorotation spectra of single cells in real time. It employs hardware-based registration of image moments, from which object orientation is calculated. Since the camera's video signal is processed without intermediate image storage, a high data throughput of about two recordings per second could be achieved, independently of image resolution. This made it possible to monitor changes in the cell membrane and cytoplasm of the yeast Saccharomyces cerevisiae under the influence of the antibiotic nystatin with a temporal resolution of 3 min. Up to 20 electrorotation spectra of an individual cell could be collected in the frequency range between 1 kHz and 1 GHz. Two distinct events, 7 and 75 min after addition of nystatin, were observed, with a fast increase in membrane permeability accompanied by a nearly simultaneous drop in cytoplasmic conductivity. PMID:9795246
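
    Object orientation from image moments, the core of the registration step, is compact enough to show; this uses OpenCV's moment computation and the standard second-central-moment formula, as a generic software illustration rather than the paper's hardware implementation.

```python
import math
import cv2

def orientation_deg(binary_image):
    """Orientation of the principal axis of a segmented object, in degrees."""
    m = cv2.moments(binary_image, binaryImage=True)
    # Central second moments give the axis of least second moment:
    #   theta = 0.5 * atan2(2*mu11, mu20 - mu02)
    theta = 0.5 * math.atan2(2.0 * m["mu11"], m["mu20"] - m["mu02"])
    return math.degrees(theta)

# Tracking the frame-to-frame change of this angle yields the rotation
# speed of the cell, i.e. one point of the electrorotation spectrum.
```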

  17. Automated image analyzer for batch processing of CR-39 foils for fast neutron dosimetry

    International Nuclear Information System (INIS)

    An automated image analysis system has been developed for counting tracks generated in CR-39 detectors after processing by electrochemical etching (ECE). The tracks are caused by exposure to fast neutrons and are used for measuring the neutron dose received by radiation workers. The system is capable of batch processing a group of 20 foils in a single cycle, rendering the measurement process elegant and efficient. Thus, the system provides a marked improvement over the earlier one, which had provision for handling one foil at a time. The image analysis software of this system is empowered with the capability to resolve overlapping tracks, which are commonly found in foils exposed to higher levels of neutron dose. The algorithm employed to resolve the tracks is an enhancement over that utilized in the earlier system. This results in improved accuracy of dosimetry. (author)
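
    A simple way to approximate the overlap-resolving count is shown below: label connected blobs, then divide each blob's area by the median single-track area. This is a generic sketch; the system's actual algorithm is described only as an enhancement of the earlier one.

```python
import numpy as np
from scipy import ndimage

def count_tracks(binary_foil_image):
    """Count etched tracks, splitting merged blobs by relative area."""
    labels, n = ndimage.label(binary_foil_image)
    if n == 0:
        return 0
    areas = ndimage.sum(binary_foil_image, labels, index=range(1, n + 1))
    single = np.median(areas)  # typical area of one isolated track
    # A blob twice the typical area is counted as two overlapping tracks.
    return int(np.sum(np.maximum(1, np.round(areas / single))))
```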

  18. Application of PLC’s for Automation of Processes in Industries

    Directory of Open Access Journals (Sweden)

    Rahul Pawar

    2016-06-01

    Full Text Available Several industries utilize sequential industrial processes which are repetitive in nature. For such processes, industries have had to depend upon relays, stepping drums, timers and controls, and considerable difficulties were experienced in the reprogramming necessitated by changes in the nature of production. Often the whole system had to be scrapped and redesigned. To overcome these problems the PLC control system was introduced. The PLC can be described as a control ladder comprising a sequence program. A PLC sequence program consists of normally open and normally closed contacts connected in parallel or in series. It also has relay coils, which turn ON and OFF as the state of these contacts changes. In this paper, all aspects of these powerful and versatile tools and their application to process automation are discussed.
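
    The ladder-program idea maps directly onto boolean expressions, which a short sketch can make concrete; the start/stop motor-latch rung below is a textbook example, not taken from the paper.

```python
# Ladder rung semantics: series contacts = AND, parallel contacts = OR,
# a normally closed (NC) contact = NOT. Classic start/stop seal-in rung:
#
#   |--[start]--+--[/stop]--(motor)--|
#   |--[motor]--+
#
def scan(start_pb: bool, stop_pb: bool, motor: bool) -> bool:
    """One PLC scan cycle: returns the new state of the motor coil."""
    return (start_pb or motor) and not stop_pb

motor = False
for start, stop in [(True, False), (False, False), (False, True), (False, False)]:
    motor = scan(start, stop, motor)
    print(motor)   # True, True (latched), False (stopped), False
```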

  19. Adaptive Soa Stack-Based Business Process Monitoring Platform

    Directory of Open Access Journals (Sweden)

    Przemysław Dadel

    2014-01-01

    Full Text Available Executable business processes that formally describe company activities are well placed in the SOA environment as they allow for a declarative organization of high-level system logic. However, for both technical and non-technical users to fully benefit from that element of abstraction, appropriate business process monitoring systems are required, and existing solutions remain unsatisfactory. The paper discusses the problem of business process monitoring in the context of the service orientation paradigm in order to propose an architectural solution and provide an implementation of a system for business process monitoring that alleviates the shortcomings of existing solutions. Various platforms are investigated to obtain a broader view of the monitoring problem and to gather functional and non-functional requirements. These requirements constitute input for the further analysis and the system design. The monitoring software is then implemented and evaluated according to the specified criteria. An extensible business process monitoring system was designed and built on top of OSGiMM - a dynamic, event-driven, configurable communications layer that provides real-time monitoring capabilities for various types of resources. The system was tested against the stated functional requirements and its implementation provides a starting point for further work. It is concluded that providing a uniform business process monitoring solution that satisfies a wide range of users and business process platform vendors is a difficult endeavor. It is furthermore reasoned that only an extensible, open-source monitoring platform built on top of a scalable communication core has a chance to address all the stated and future requirements.

  20. Determination of atmospheric radiocesium on filter tapes used at automated SPM monitoring stations for estimation of transport pathways of radionuclides from Fukushima Dai-ichi Nuclear Power Plant

    International Nuclear Information System (INIS)

    Suspended particulate matter (SPM), collected hourly on filter tapes at many automated SPM monitoring stations located widely across eastern Japan at the time of the Fukushima Dai-ichi Nuclear Power Plant accident in 2011, was analyzed in order to determine atmospheric radiocesium concentrations. Precise time series of 137Cs concentrations across wide areas of eastern Japan were thereby revealed. Analysis of the radioactivity of SPM collected on filter tapes by automated SPM monitoring stations, even 1-2 years after the accident, is concluded to give very valuable information for resolving questions of radioactive contamination from the Fukushima accident. (author)

  1. Automation of image data processing. (Polish Title: Automatyzacja procesu przetwarzania danych obrazowych)

    Science.gov (United States)

    Preuss, R.

    2014-12-01

    This article discusses the current capabilities of automated processing of image data on the example of the PhotoScan software by Agisoft. At present, image data obtained by various registration systems (metric and non-metric cameras) placed on airplanes, satellites, or more often on UAVs, is used to create photogrammetric products. Multiple registrations of an object or land area (large groups of photos are captured) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of this, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied. They can create a model of a block in a local coordinate system or, using initial exterior orientation and measured control points, can provide image georeferencing in an external reference frame. In the case of non-metric images, it is also possible to carry out a self-calibration process at this stage. Image matching algorithms are also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM and a photorealistic solid model of an object. All the aforementioned processing steps are implemented in a single program, in contrast to standard commercial software dividing the steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential implementation of the processing steps with predetermined control parameters. The paper presents practical results of fully automatic generation of orthomosaics, both for images obtained by a metric Vexcel camera and for a block of images acquired by a non-metric UAV system.

  2. Signal Processing Methods Monitor Cranial Pressure

    Science.gov (United States)

    2010-01-01

    Dr. Norden Huang, of Goddard Space Flight Center, invented a set of algorithms (called the Hilbert-Huang Transform, or HHT) for analyzing nonlinear and nonstationary signals that developed into a user-friendly signal processing technology for analyzing time-varying processes. At an auction managed by Ocean Tomo Federal Services LLC, licenses of 10 U.S. patents and 1 domestic patent application related to HHT were sold to DynaDx Corporation, of Mountain View, California. DynaDx is now using the licensed NASA technology for medical diagnosis and prediction of brain blood flow-related problems, such as stroke, dementia, and traumatic brain injury.
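
    While HHT itself combines empirical mode decomposition with the Hilbert transform, the second half of that pipeline is easy to illustrate with SciPy; this generic sketch, with a toy chirp standing in for an intrinsic mode function, is not the patented NASA implementation.

```python
import numpy as np
from scipy.signal import hilbert

fs = 250.0                                   # sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * (1.0 + 0.2 * t) * t)  # toy chirp standing in for an IMF

analytic = hilbert(x)                        # x + i * HilbertTransform(x)
amplitude = np.abs(analytic)                 # instantaneous amplitude envelope
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency, Hz

# Time-varying amplitude/frequency like this is what makes HHT suited to
# nonstationary physiological signals such as cerebral blood-flow records.
print(amplitude[:5], inst_freq[:5])
```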

  3. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    Science.gov (United States)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created by dedicated structures such as a static-random-access-memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as an input for choosing locations for critical-dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact-model simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to too small an N-well to P-well spacing. Based on this example, for process monitoring and variability analyses, we extensively used this method to analyze transistor gates having different shapes. In addition, an analysis of a large area of a high-density standard cell library was done. Another set of monitoring runs, focused on a high-density SRAM array, is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified unsymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted based on contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data demonstrate the successful in-field implementation of our methodology as a useful process monitoring method.

  4. Design aspects of automation system for initial processing of fecal samples

    International Nuclear Information System (INIS)

    The procedure for initial handling of fecal samples at the Bioassay Lab., Trombay is as follows: overnight fecal samples are collected from the worker in a kit consisting of a polythene bag placed in a wide-mouth polythene container closed with an inner lid and a screw cap. The occupational worker collects the sample in the polythene bag. On receiving the sample, the polythene container along with the sample is weighed; the polythene bag containing the fecal sample is then lifted out of the container using a pair of tongs, placed inside a crucible, and ashed inside a muffle furnace at 450°C. After complete ashing, the crucible containing white ash is taken up for further radiochemical processing. This paper describes the various steps in developing a prototype automated system for initial handling of fecal samples, intended to automate the above procedure. The system, once developed, will help eliminate manual intervention up to the ashing stage and reduce the biological hazard involved in handling such samples

  5. Automated Formosat Image Processing System for Rapid Response to International Disasters

    Science.gov (United States)

    Cheng, M. C.; Chou, S. C.; Chen, Y. C.; Chen, B.; Liu, C.; Yu, S. J.

    2016-06-01

    FORMOSAT-2, Taiwan's first remote sensing satellite, was successfully launched in May of 2004 into a Sun-synchronous orbit at 891 kilometers of altitude. With its daily revisit feature, the 2-m panchromatic and 8-m multi-spectral resolution images captured have been used for research and operations in various societal benefit areas. This paper details the orchestration of tasks conducted in different institutions in Taiwan in the effort of responding to international disasters. The institutes involved include Taiwan's space agency, the National Space Organization (NSPO); the Center for Satellite Remote Sensing Research of National Central University; the GIS Center of Feng-Chia University; and the National Center for High-performance Computing. Since each institution has its own mandate, the coordinated tasks ranged from receiving emergency observation requests, scheduling and tasking of satellite operation, and downlink to ground stations, through image processing (including data ingestion and ortho-rectification), to delivery of image products. With the lessons learned from working with international partners, the FORMOSAT Image Processing System has been extensively automated and streamlined with the goal of shortening the time between request and delivery in an efficient manner. The integrated team has developed an Application Interface to its system platform that provides functions for searching the archive catalogue, requesting data services, mission planning, inquiring about service status, and image download. This automated system enables timely image acquisition and substantially increases the value of the data products. An example outcome of these efforts, the recent response supporting Sentinel Asia during the Nepal earthquake, is demonstrated herein.

  6. A Noble Approach of Process Automation in Galvanized Nut, Bolt Manufacturing Industry

    Directory of Open Access Journals (Sweden)

    Akash Samanta

    2012-05-01

    Full Text Available "Corrosion costs money." The Battelle Columbus Institute estimates that corrosion costs Americans more than $220 billion annually, about 4.3% of the gross national product [1]. Nowadays, due to increasing pollution, the rate of corrosion is also increasing day by day, mainly in India; so, to save steel structures, galvanizing is the best and simplest solution. For this reason galvanizing industries have been growing since the mid-1700s. Galvanizing is a controlled metallurgical combination of zinc and steel that can provide corrosion resistance in a wide variety of environments. In fact, the corrosion resistance factor of galvanized metal can be some 70 to 80 times greater than that of the base metal. Keeping in mind the importance of this industry, a novel approach to process automation in a galvanized nut-bolt manufacturing plant is presented here, as nuts and bolts are the prime ingredients of any structure. In this paper the main objectives of any industry, such as survival, profit maximization, profit satisficing and sales growth, are fulfilled. Furthermore, environmental aspects, i.e. pollution control and energy saving, are also considered in this paper. The whole automation process is implemented using a programmable logic controller (PLC), which has a number of unique advantages: it is faster, reliable, requires less maintenance and is reprogrammable. The whole system has been designed and tested using a GE Fanuc PLC.

  7. Automating PACS quality control with the Vanderbilt image processing enterprise resource

    Science.gov (United States)

    Esparza, Michael L.; Welch, E. Brian; Landman, Bennett A.

    2012-02-01

    Precise image acquisition is an integral part of modern patient care and medical imaging research. Periodic quality control using standardized protocols and phantoms ensures that scanners are operating according to specifications, yet such procedures do not ensure that individual datasets are free from corruption, for example due to patient motion, transient interference, or physiological variability. If unacceptable artifacts are noticed during scanning, a technologist can repeat a procedure. Yet substantial delays may be incurred if a problematic scan is not noticed until a radiologist reads the scans or an automated algorithm fails. Given the scores of slices in typical three-dimensional scans and the wide variety of potential use cases, a technologist cannot practically be expected to inspect all images. In large-scale research, automated pipeline systems have had great success in achieving high throughput. However, clinical and institutional workflows are largely based on DICOM and PACS technologies; these systems are not readily compatible with research systems due to security and privacy restrictions. Hence, quantitative quality control has been relegated to individual investigators and too often neglected. Herein, we propose a scalable system, the Vanderbilt Image Processing Enterprise Resource (VIPER), to integrate modular quality control and image analysis routines with a standard PACS configuration. This server unifies image processing routines across an institutional level and provides a simple interface so that investigators can collaborate to deploy new analysis technologies. VIPER integrates with high-performance computing environments and has successfully analyzed all standard scans from our institutional research center over the course of the last 18 months.
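
    VIPER's internals are not spelled out here, but a modular QC routine of the kind such a server could execute on studies pulled from PACS might look like the following sketch using pydicom; the checks, ROI choice, and thresholds are illustrative assumptions.

```python
# Hypothetical per-scan QC routine of the kind a VIPER-style server might
# run on studies pulled from PACS; tags, ROI, and thresholds are invented.
import numpy as np
import pydicom

def qc_check(dicom_path, max_background_noise=50.0):
    ds = pydicom.dcmread(dicom_path)
    img = ds.pixel_array.astype(np.float32)
    issues = []
    if img.std() == 0:
        issues.append("constant image (possible transfer corruption)")
    background = img[:20, :20]        # assumes air in the image corner
    if background.std() > max_background_noise:
        issues.append("noisy background (motion or interference?)")
    return ds.SOPInstanceUID, issues  # route non-empty issue lists to QC staff
```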

  8. Migrating from Process Automation to Process Management Support: A Holistic Approach to Software Engineering applied to Media Production

    Directory of Open Access Journals (Sweden)

    Atta Badii

    2010-04-01

    Full Text Available  This paper presents the on-going research performed in order to migrate from process automation to process management support in the context of media production and more specifically 3D cinematographic immersive and interactive production. The endeavour has been addressed on the basis of a holistic approach to software engineering applied to media production modelling to ensure design correctness, completeness and effectiveness. The focus of the designed application is on information and metadata management throughout the process in a similar fashion to that achieved in Decision Support Systems (DSS to facilitate well-grounded business decisions. The paper sets out the aims and objectives and the methodology deployed. The proposed solution is then described in some detail including the workflow to be supported and the experimental scenario as planned. The paper concludes with some preliminary conclusions and sets out the planned future work.

  9. Monitoring autocorrelated process: A geometric Brownian motion process approach

    Science.gov (United States)

    Li, Lee Siaw; Djauhari, Maman A.

    2013-09-01

    Autocorrelated processes are common in today's modern industrial process control practice. The current practice of autocorrelated process control is to eliminate the autocorrelation by using an appropriate model, such as a Box-Jenkins model, and then to conduct the process control operation based on the residuals. In this paper we show that many time series are governed by a geometric Brownian motion (GBM) process. In this case, by using the properties of a GBM process, we only need an appropriate transformation and a model of the transformed data to arrive at the conditions needed in traditional process control. An industrial example of a cocoa powder production process in a Malaysian company is presented and discussed to illustrate the advantages of the GBM approach.
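
    The transformation the authors allude to can be made concrete: if X follows a GBM, the log-increments ln(X_k/X_{k-1}) are i.i.d. normal, so a classical Shewhart chart applies to the transformed series. A minimal sketch on synthetic data, with an arbitrary phase-I/phase-II split and an injected shift:

```python
# If X follows a geometric Brownian motion, the log-increments
# ln(X_k / X_{k-1}) are i.i.d. normal, so a classical Shewhart individuals
# chart applies to the transformed series. Synthetic data with a shift.
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, dt = 0.05, 0.20, 1.0 / 250
incr = (mu - sigma**2 / 2) * dt + sigma * np.sqrt(dt) * rng.standard_normal(500)
incr[250:] += 3 * sigma * np.sqrt(dt)     # simulated process shift
x = 100 * np.exp(np.cumsum(incr))         # the observed GBM-like series

r = np.diff(np.log(x))                    # transformed, stationary series
center, s = r[:249].mean(), r[:249].std(ddof=1)   # phase-I estimates
ucl, lcl = center + 3 * s, center - 3 * s

signals = np.nonzero((r[249:] > ucl) | (r[249:] < lcl))[0]
print(f"{signals.size} out-of-control signals after the shift")
```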

  10. APACS: Monitoring and diagnosis of complex processes

    International Nuclear Information System (INIS)

    This paper describes APACS - a new framework for a system that detects, predicts and identifies faults in industrial processes. The APACS framework provides a structure in which a heterogeneous set of programs can share a common view of the problem and a common model of the domain. (author). 17 refs, 2 figs

  11. CMS Tracker Integration: monitoring the process quality

    International Nuclear Information System (INIS)

    The CMS experiment at the LHC features the largest Silicon Strip Tracker (SST) ever built. This detector is composed of about 15000 modules, so the potential problems of the system come from its complexity. This article covers the tests performed during tracker integration, describing their motivation in terms of process quality assurance

  12. Access Control for Monitoring System-Spanning Business Processes

    NARCIS (Netherlands)

    Bassil, S.; Reichert, M.U.; Bobrik, R.; Bauer, Th.

    2007-01-01

    Integrated process support is highly desirable in environments where data related to a particular (business) process are scattered over distributed and heterogeneous information systems (IS). A process monitoring component is a much-needed module in order to provide an integrated view on all these IS.

  13. Intelligent monitoring of business processes using case-based reasoning

    OpenAIRE

    Kapetanakis, Stylianos

    2012-01-01

    The work in this thesis presents an approach towards the effective monitoring of business processes using Case-Based Reasoning (CBR). The rationale behind this research was that business processes constitute a fundamental concept of the modern world and there is a constantly emerging need for their efficient control. They can be efficiently represented but not necessarily monitored and diagnosed effectively via an appropriate platform. Motivated by the above observation this research purs...

  14. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing

    Science.gov (United States)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo

    2016-04-01

    The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both the position and depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) over 2D acquisitions are well known. Nonetheless, the amount of data acquired in such 3D acquisitions does not usually allow processing and interpretation to be completed directly in the field and in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low-impact mini-trench" technique (addressed in the ITU - International Telecommunication Union - L.83 recommendation) requires that non-destructive mapping of buried services enhance its productivity to match the improvements of new digging equipment. Nowadays multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high-quality subsurface images, taking full advantage of 3D data: the development of a fully automated and real-time 3D GPR processing system plays a key role in overall optical network deployment profitability. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets and which can be applied in real time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency). The proposed processing

  15. Ultrasonic flow measurements for irrigation process monitoring

    Science.gov (United States)

    Ziani, Elmostafa; Bennouna, Mustapha; Boissier, Raymond

    2004-02-01

    This paper presents the state of the art of the general principles of liquid flow measurement by ultrasonic methods, and the associated measurement problems. We present an ultrasonic flowmeter designed according to the smart-sensor concept for the measurement of irrigation water flowing through pipelines or open channels, using the ultrasonic transit-time approach. The new flowmeter works on the principle of measuring the time-delay differences between sound pulses transmitted upstream and downstream in the flowing liquid. The speed of sound in the flowing medium is eliminated as a variable because the flow-rate calculations are based on the reciprocals of the transmission times. The transit-time difference is digitally measured by means of suitable, microprocessor-controlled logic. This type of ultrasonic flowmeter, which will be widely used in industry and water management, is studied in detail in this work, followed by some experimental results. For pressurized channels, we use one pair of ultrasonic transducers arranged in proper positions and directions on the pipe; in this case, to determine the liquid velocity, a real-time on-line analysis taking into account the geometry of the hydraulic system is applied to the obtained ultrasonic data. In open channels, we use a single pair or two pairs of ultrasonic emitter-receivers according to the desired performance. Finally, the goals of this work are to integrate the smart sensor into irrigation system monitoring in order to evaluate potential advantages and demonstrate its performance, and to understand and use the ultrasonic approach for determining flow characteristics and improving flow measurements by reducing errors caused by disturbances of the flow profiles.
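
    The reciprocal-time calculation mentioned above is compact enough to show directly: for an acoustic path of length L inclined at angle θ to the flow axis, 1/t_down − 1/t_up = 2v·cos(θ)/L, so the speed of sound cancels. A sketch with illustrative geometry and water-like numbers:

```python
# Transit-time flow computation: for a path of length L at angle theta to
# the pipe axis, 1/t_down - 1/t_up = 2 v cos(theta) / L, so the speed of
# sound cancels and v follows from the reciprocal transit times alone.
import math

def flow_velocity(t_down, t_up, L=0.20, theta_deg=45.0):
    theta = math.radians(theta_deg)
    return L * (1.0 / t_down - 1.0 / t_up) / (2.0 * math.cos(theta))

# Illustrative check: water (c ~ 1480 m/s), true v = 2 m/s, L = 0.20 m.
c, v, L, th = 1480.0, 2.0, 0.20, math.radians(45.0)
t_down = L / (c + v * math.cos(th))       # pulse travelling with the flow
t_up = L / (c - v * math.cos(th))         # pulse travelling against it
print(flow_velocity(t_down, t_up))        # recovers ~2.0 m/s

# Volumetric flow for a full pipe of diameter D (profile factor k ~ 1).
D, k = 0.10, 1.0
Q = k * flow_velocity(t_down, t_up) * math.pi * D**2 / 4
```

    In practice the profile factor k corrects the path-averaged velocity to the cross-section mean, which is part of the flow-profile error the paper discusses.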

  16. In-process monitoring and control of microassembly by utilising force sensor

    Directory of Open Access Journals (Sweden)

    S. Tangjitsitcharoen

    2008-12-01

    Full Text Available Purpose: The aim of this research is to develop an in-process monitoring system to control the position of the shaft within a tolerance of ±2.5 μm regardless of the geometries of the shaft and the thrust plate. Design/methodology/approach: To realize an automated and intelligent microassembly process, a method has been developed to monitor and control the position of the shaft in the plate of the high-precision spindle motor for hard disk drives in order to reduce the shaft height problem. A force sensor is attached to the table of the microassembly machine, under the jig of the plate, to monitor the in-process pressing force. The experimentally obtained pressing force depends on variations of the tolerance fit and the geometrical roundness of the shaft and the plate, but it increases immediately when the shaft touches the stopper on the jig at the setting point. Hence, the proposed method introduces a reference voltage as the threshold value, which is calculated by differentiating the in-process pressing force. A slope detector is developed to calculate the output voltage based on the in-process pressing force, and the motor driver of the machine is controlled when the obtained output voltage is larger than the reference voltage. Findings: It is proved that the shaft height problem is well controlled and reduced by the developed method regardless of any combination of the geometries of the shaft and the plate. Practical implications: The research demonstrates the microassembly process of pressing the shaft into the plate of a high-precision spindle motor for hard disk drives. Originality/value: This paper presents the developed in-process monitoring system that controls the manufacturing process automatically.
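
    A minimal sketch of the slope-detector logic described above: differentiate the sampled pressing force and issue the stop signal when the derivative (the "output voltage") exceeds the reference threshold. The force trace, sampling rate, and threshold are synthetic assumptions, not the paper's values.

```python
# Sketch of the slope-detector logic: the pressing force rises sharply when
# the shaft touches the stopper, so the force derivative crossing a
# reference threshold is the stop signal. All values here are synthetic.
import numpy as np

fs = 1000.0                                  # force sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
force = 5.0 * t                              # slow rise while pressing
force[t > 0.8] += 200.0 * (t[t > 0.8] - 0.8) # sharp rise at stopper contact
force += np.random.default_rng(0).normal(0, 0.01, t.size)

slope = np.gradient(force, 1.0 / fs)         # dF/dt, the "output voltage"
reference = 100.0                            # the "reference voltage"

hits = np.nonzero(slope > reference)[0]
if hits.size:
    print(f"stop press at t = {t[hits[0]]:.3f} s")   # ~0.80 s
```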

  17. An alarm filtering system for an automated process: a multiple-agent approach

    International Nuclear Information System (INIS)

    Nowadays, the supervision of industrial installations is more and more complex, involving the automation of their control. A malfunction generates an avalanche of alarms. The operator in charge of the supervision must face the incident and execute the right actions to recover a normal situation. Generally, he is drowned under the great number of alarms. Our aim, within the framework of our research, is to develop an alarm filtering system for an automated metro line to help the operator find the main alarm responsible for the malfunction. Our work is divided into two parts, both dealing with the study and development of an alarm filtering system but using two different approaches. The first part was developed in the framework of the SARA project (an operator assistance system for an automated metro line), which is an expert system prototype helping the operators of a command center. In this part, a centralized approach was used, representing the events with a single event graph and using a global procedure to perform diagnosis. This approach has shown its limits. In the second part of our work, we considered distributed artificial intelligence (DAI) techniques, and more especially the multi-agent approach. The multi-agent approach was motivated by the natural distribution of the metro line equipment and by the fact that each piece of equipment has its own local control and knowledge. Thus, each piece of equipment was considered as an autonomous agent. Through agent cooperation, the system is able to determine the main alarm and the faulty equipment responsible for the incident. A prototype, written in SPIRAL (a tool for knowledge-based systems), is running on a workstation. This prototype has allowed the concretization and validation of our multi-agent approach. (author)

  18. Design of MPU based process monitoring instrument

    International Nuclear Information System (INIS)

    A display sub-system (DSS) for a process variable like flow is designed around an Intel 8088 microprocessor. It displays the current value of the process variable; average and accumulated value displays are manually selectable. The display consists of 6 units of seven-segment display, and accuracy up to the 2nd decimal place is achieved. The engineering units are indicated by LEDs. The control software is developed in assembler and burnt into an EPROM. The maximum value of the display is 9999.99 kiloliters, and that of time is 99 days, 23 hours and 59 minutes. The sampling period is 1 second. Data acquisition is done using the polling technique. (author)

  19. Process monitoring in modern safeguards applications

    International Nuclear Information System (INIS)

    From the safeguards standpoint, regulatory requirements are finally moving into the modern world of communication and information processing. Gone are the days when the accountant with the green eye shade and arm bands made judgments on the material balance a month after the balance was closed. The most recent Nuclear Regulatory Commission (NRC) regulations and U.S. Department of Energy (DOE) orders have very strict standards for timeliness and sensitivity to loss or removal of material. The latest regulations recognize that plant operators have a lot of information on and control over the location and movement of material within their facilities. This information goes beyond that traditionally reported under accountability requirements. These new regulations allow facility operators to take credit for many of the more informal process controls

  20. Towards process tomography for monitoring pressure filtration

    OpenAIRE

    York T A, Davidson J L, Mazurkiewich L, Mann R, Grieve B D

    2005-01-01

    This paper reports on progress towards the first continuous application of electrical impedance tomography to a production-scale industrial process. It includes the design and implementation of the world's first certified intrinsically safe electrical tomography system. Zener barrier (ZB) modules and intrinsically safe relays provide electrical isolation, and the instrument is certified for operation in a Zone 0 environment. Two systems have been operating successfully on production pressure fi...

  1. Continuously monitored barrier options under Markov processes

    OpenAIRE

    Aleksandar Mijatovic; Martijn Pistorius

    2009-01-01

    In this paper we present an algorithm for pricing barrier options in one-dimensional Markov models. The approach rests on the construction of an approximating continuous-time Markov chain that closely follows the dynamics of the given Markov model. We illustrate the method by implementing it for a range of models, including a local Lévy process and a local volatility jump-diffusion. We also provide a convergence proof and error estimates for this algorithm.
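
    A toy version of the recipe, for a Black-Scholes-type diffusion and an up-and-out call (the paper treats far more general Markov models, and the grid and parameters here are illustrative): build a birth-death generator matching local drift and variance, make the barrier state absorbing, and evaluate the expectation with a matrix exponential.

```python
# Toy version of the CTMC-approximation idea for an up-and-out call under
# Black-Scholes dynamics: match drift/variance on a grid, absorb at the
# barrier, use a matrix exponential for the transition probabilities.
import numpy as np
from scipy.linalg import expm

r, sigma, T = 0.05, 0.2, 1.0
S0, K, B = 100.0, 100.0, 130.0

n = 300
grid = np.linspace(1.0, B, n)             # grid[-1] is the barrier state
h = grid[1] - grid[0]
mu = r * grid                             # risk-neutral drift mu(x) = r x
var = (sigma * grid) ** 2                 # local variance sigma(x)^2

Q = np.zeros((n, n))
for i in range(1, n - 1):
    up = var[i] / (2 * h**2) + max(mu[i], 0.0) / h
    dn = var[i] / (2 * h**2) + max(-mu[i], 0.0) / h
    Q[i, i + 1], Q[i, i - 1], Q[i, i] = up, dn, -(up + dn)
# Rows 0 and n-1 stay zero: absorbing bottom state and absorbing barrier.

P = expm(Q * T)                           # transition matrix over [0, T]
payoff = np.maximum(grid - K, 0.0)
payoff[-1] = 0.0                          # knocked out at the barrier
i0 = int(np.argmin(np.abs(grid - S0)))
print(np.exp(-r * T) * P[i0] @ payoff)    # approximate knock-out price
```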

  2. Metrology Sampling Strategies for Process Monitoring Applications

    KAUST Repository

    Vincent, Tyrone L.

    2011-11-01

    Shrinking process windows in very-large-scale-integration semiconductor manufacturing have already necessitated the development of control systems capable of addressing sub-lot-level variation. Within-wafer control is the next milestone in the evolution of advanced process control from lot-based and wafer-based control. In order to adequately comprehend and control within-wafer spatial variation, inline measurements must be performed at multiple locations across the wafer. At the same time, economic pressures prompt a reduction in metrology, for both capital and cycle-time reasons. This paper explores the use of modeling and minimum-variance prediction as a method to select the sites for measurement on each wafer. The models are developed using the standard statistical tools of principal component analysis and canonical correlation analysis. The proposed selection method is validated using real manufacturing data, and results indicate that it is possible to significantly reduce the number of measurements with little loss in the information obtained for the process control systems. © 2011 IEEE.
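
    One plausible reading of the minimum-variance selection step, not necessarily the paper's exact formulation: learn cross-site structure from historical full-wafer maps, then greedily keep the sites whose readings best predict all the others in a least-squares sense.

```python
# Greedy minimum-variance site selection (an illustrative reading, not
# necessarily the paper's exact algorithm): from historical full-wafer
# maps Y (wafers x sites), keep the sites that best predict all others.
import numpy as np

def select_sites(Y, n_keep):
    Yc = Y - Y.mean(axis=0)                  # centred historical maps
    selected = []
    for _ in range(n_keep):
        best_site, best_err = None, np.inf
        for j in range(Y.shape[1]):
            if j in selected:
                continue
            X = Yc[:, selected + [j]]
            beta, *_ = np.linalg.lstsq(X, Yc, rcond=None)
            err = np.sum((Yc - X @ beta) ** 2)   # total prediction RSS
            if err < best_err:
                best_site, best_err = j, err
        selected.append(best_site)
    return selected

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 3))           # 3 spatial modes (PCA-like)
loadings = rng.normal(size=(3, 49))          # 49 candidate wafer sites
Y = latent @ loadings + 0.1 * rng.normal(size=(200, 49))
print(select_sites(Y, n_keep=3))             # 3 sites nearly suffice here
```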

  3. An Automated Capacitance-Based Fuel Level Monitoring System for Networked Tanks

    Directory of Open Access Journals (Sweden)

    Oke Alice O

    2015-08-01

    Full Text Available The making of an effective fuel measuring system has been a great challenge in the Nigerian industry, as oil organizations run into problems ranging from fire outbreaks, oil pilfering and oil spillage to other negative effects. The use of a meter rule or long rod at most petrol filling stations for assessing the quantity of fuel in a tank is inefficient, stressful, dangerous and almost impossible in a networked environment. This archaic method provides neither a good reorder date nor a good inventory. As such, there is a need to automate the system by providing real-time measurement of fuel storage to meet the demand of the customers. In this paper, a system was designed to sense the level of fuel in networked tanks using a capacitive sensor controlled by an ATmega328 Arduino microcontroller. The result was transmitted in both digital and analogue form through radio-frequency transmission using XBee and interfaced to a computer system for notification of fuel level and refill operations. This enables consumption control, cost analysis and tax accounting for fuel purchases
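
    The capacitance-to-level conversion at the heart of such a probe is linear between two calibration points, so the host-side computation can be as small as the following sketch; the calibration constants, tank capacity, and reorder threshold are invented for illustration.

```python
# Host-side capacitance-to-level conversion for a linear capacitive probe:
# level fraction = (C - C_empty) / (C_full - C_empty). Calibration values,
# tank capacity, and the reorder threshold below are illustrative.
C_EMPTY_PF = 120.0          # probe reading with an empty tank, pF
C_FULL_PF = 480.0           # probe reading with a full tank, pF
TANK_CAPACITY_L = 30000.0   # tank capacity, litres
REORDER_FRACTION = 0.25     # alert the office below this level

def fuel_level(c_pf):
    frac = (c_pf - C_EMPTY_PF) / (C_FULL_PF - C_EMPTY_PF)
    frac = min(max(frac, 0.0), 1.0)          # clamp sensor noise
    return frac, frac * TANK_CAPACITY_L

frac, litres = fuel_level(200.0)             # e.g. a reading from the XBee link
print(f"{frac:.0%} full, {litres:.0f} L")
if frac < REORDER_FRACTION:
    print("reorder alert")
```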

  4. Infrared signature analysis - Real time monitoring of manufacturing processes

    International Nuclear Information System (INIS)

    The ability to monitor manufacturing processes in an adaptive control mode and perform inspection in real time is of interest to fabricators in the pressure vessel, aerospace, automotive, nuclear, and shipbuilding industries. Results of a series of experiments using infrared thermography as the principal sensing mode are presented to show how the information contained in infrared isotherms captures a wealth of critical process variables. Image-processing computer software development has demonstrated, in a spot-welding application, how the process can be monitored and controlled in real time. The IR vision sensor program is now under way. Research thus far has focused on fusion welding, resistance spot welding and metal removal. 6 references

  5. Infrared Signature Analysis: Real Time Monitoring Of Manufacturing Processes

    Science.gov (United States)

    Bangs, Edmund R.

    1988-01-01

    The ability to monitor manufacturing processes in an adaptive control mode and perform inspection in real time is of interest to fabricators in the pressure vessel, aerospace, automotive, nuclear and shipbuilding industries. Results of a series of experiments using infrared thermography as the principal sensing mode are presented to show how the information contained in infrared isotherms captures a wealth of critical process variables. Image-processing computer software development has demonstrated, in a spot-welding application, how the process can be monitored and controlled in real time. The IR vision sensor program is now under way. Research thus far has focused on fusion welding, resistance spot welding and metal removal.

  6. Automated microfluidic platform of bead-based electrochemical immunosensor integrated with bioreactor for continual monitoring of cell secreted biomarkers

    Science.gov (United States)

    Riahi, Reza; Shaegh, Seyed Ali Mousavi; Ghaderi, Masoumeh; Zhang, Yu Shrike; Shin, Su Ryon; Aleman, Julio; Massa, Solange; Kim, Duckjin; Dokmeci, Mehmet Remzi; Khademhosseini, Ali

    2016-04-01

    There is an increasing interest in developing microfluidic bioreactors and organs-on-a-chip platforms combined with sensing capabilities for continual monitoring of cell-secreted biomarkers. Conventional approaches such as ELISA and mass spectrometry cannot satisfy the needs of continual monitoring as they are labor-intensive and not easily integrable with low-volume bioreactors. This paper reports on the development of an automated microfluidic bead-based electrochemical immunosensor for in-line measurement of cell-secreted biomarkers. For the operation of the multi-use immunosensor, disposable magnetic microbeads were used to immobilize biomarker-recognition molecules. Microvalves were further integrated into the microfluidic immunosensor chip to achieve programmable operation of the immunoassay, including bead loading and unloading, binding, washing, and electrochemical sensing. The platform allowed convenient integration of the immunosensor with liver-on-chips to carry out continual quantification of biomarkers secreted from hepatocytes. Transferrin and albumin production was monitored during a 5-day hepatotoxicity assessment in which human primary hepatocytes cultured in the bioreactor were treated with acetaminophen. Taken together, our unique microfluidic immunosensor provides a new platform for in-line detection of biomarkers in low volumes and long-term in vitro assessments of cellular functions in microfluidic bioreactors and organs-on-chips.

  7. Automated work-flow for processing high-resolution direct infusion electrospray ionization mass spectral fingerprints

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    2007-01-01

… automated data processing pipeline to compare large numbers of fingerprint spectra from direct infusion experiments analyzed by high-resolution MS. We describe some of the intriguing problems that have to be addressed, starting with the conversion and pre-processing of the raw data through to the final data analysis. Illustrated on the direct infusion analysis (ESI-TOF-MS) of complex mixtures, the method exploits the full quality of the high resolution present in the mass spectra. Although the method is illustrated as a new library search method for high-resolution MS, we demonstrate that the output of the preprocessing is applicable to cluster-, discriminant-, and related multivariate analyses applied directly to mass spectra from the direct infusion analysis of crude extracts. This is done to find the relationship between several terverticillate Penicillium species and identify the ions responsible for the

  8. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. This CSWD describes the hardware and the PFP/FFS-developed software for control of the Magnesium Hydroxide Precipitation process located in Room 230C, 234-5Z. The Honeywell PlantScape software generates limited configuration reports for the developed control software. These reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, Solutions Stabilization Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell system

  9. Quality control of CT systems by automated monitoring of key performance indicators: a two-year study.

    Science.gov (United States)

    Nowik, Patrik; Bujila, Robert; Poludniowski, Gavin; Fransson, Annette

    2015-01-01

    The purpose of this study was to develop a method of performing routine periodic quality control (QC) of CT systems by automatically analyzing key performance indicators (KPIs) obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program for CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine, where CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In cases where results were out of tolerance, actions could be initiated in less than 10 min. 900 QC scans from two CT scanners were collected and analyzed over the two-year period that MonitorCT has been active. Two types of errors were registered in this period: a ring artifact was discovered with the image-noise test, and a calibration error was detected multiple times with the CT-number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as those defined by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system, such that swift actions can be taken in order to ensure the quality of the CT examinations, patient safety, and minimal disruption of service
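
    MonitorCT itself is not public, but a KPI computation of the kind it performs — the CT number of water and image noise from a central ROI of a water-phantom slice — can be sketched as follows; the ROI size and tolerances are assumptions, not the clinic's values.

```python
# Illustrative KPI evaluation on a water-phantom slice: CT number of water
# and image noise from a central circular ROI, checked against tolerances.
# ROI size and tolerance values are assumptions, not the clinic's numbers.
import numpy as np

def water_kpis(img, roi_radius_px=40, hu_tol=4.0, noise_tol=6.0):
    cy, cx = img.shape[0] // 2, img.shape[1] // 2
    y, x = np.ogrid[:img.shape[0], :img.shape[1]]
    roi = (y - cy) ** 2 + (x - cx) ** 2 <= roi_radius_px ** 2

    mean_hu = float(img[roi].mean())     # KPI: CT number of water (~0 HU)
    noise = float(img[roi].std())        # KPI: image noise

    failures = []
    if abs(mean_hu) > hu_tol:
        failures.append(f"water HU {mean_hu:.1f} outside +/-{hu_tol}")
    if noise > noise_tol:
        failures.append(f"noise {noise:.1f} HU above {noise_tol}")
    return mean_hu, noise, failures

# Synthetic demo slice: water at 0 HU with 5 HU Gaussian noise -> passes.
img = np.random.default_rng(0).normal(0.0, 5.0, size=(512, 512))
print(water_kpis(img))
```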

  10. Capillary electrophoresis with contactless conductivity detection coupled to a sequential injection analysis manifold for extended automated monitoring applications

    International Nuclear Information System (INIS)

    A capillary electrophoresis (CE) instrument with capacitively coupled contactless conductivity detection (C4D) based on a sequential injection analysis (SIA) manifold was refined. Hydrodynamic injection was implemented to avoid a sampling bias by using a split-injection device based on a needle valve for precise adjustment. For safety and reliability, the integrity of the high voltage compartment at the detection end was fully maintained by implementing flushing of the high voltage interface through the capillary. With this set-up, extended fully automated monitoring applications are possible. The system was successfully tested in the field for the determination of the concentration levels of major inorganic cations and anions in a creek over a period of 5 days.

  11. An Automated Electronic Tongue for In-Situ Quick Monitoring of Trace Heavy Metals in Water Environment

    Science.gov (United States)

    Cai, Wei; Li, Yi; Gao, Xiaoming; Guo, Hongsun; Zhao, Huixin; Wang, Ping

    2009-05-01

    An automated electronic tongue instrument has been developed for in-situ concentration determination of trace heavy metals in the water environment. The electronic tongue contains two main parts. The sensor part consists of a silicon-based Hg-coated Au microelectrode array (MEA) for the detection of Zn(II), Cd(II), Pb(II) and Cu(II), and a multiple light-addressable potentiometric sensor (MLAPS) for the detection of Fe(III) and Cr(VI). The control part employs pumps, valves and tubes to enable the pick-up and pretreatment of the aqueous sample. The electronic tongue achieved detection of the six metals mentioned above at the part-per-billion (ppb) level without manual operation. This instrument will have wide application in quick monitoring and prediction of heavy metal pollution in lakes and oceans.

  12. ONLINE WATER MONITORING UTILIZING AN AUTOMATED MICROARRAY BIOSENSOR INSTRUMENT - PHASE I

    Science.gov (United States)

    Constellation Technology Corporation (Constellation) proposes the use of an integrated recovery and detection system for online water supply monitoring.  The integrated system is designed to efficiently capture and recover pathogens such as bacteria, viruses, parasites, an...

  13. A Semi-Automated Point Cloud Processing Methodology for 3D Cultural Heritage Documentation

    Science.gov (United States)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. On the other hand, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation purposes has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open-source software environment, using the example project of a 16th-century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  14. A METHOD OF COMPLEX AUTOMATED MONITORING OF UKRAINIAN POWER ENERGY SYSTEM OBJECTS TO INCREASE ITS OPERATION SAFETY

    Directory of Open Access Journals (Sweden)

    Ye.I. Sokol

    2016-05-01

    Full Text Available The paper describes an algorithm for the complex automated monitoring of Ukraine's power energy system, aimed at ensuring the safety of its personnel and equipment. This monitoring involves the use of unmanned aerial vehicles (UAVs) for planned and unplanned registration of the status of power transmission lines (PTL) and high-voltage substations (HVS). It is assumed that unscheduled overflights will be made in emergency situations on power lines. With the help of the UAV, pictures of the transmission lines and HVS will be recorded from the air in the optical and infrared ranges, and the strengths of the electric (EF) and magnetic (MF) fields will be measured along the flight route. Specially developed software allows the recorded pictures to be compared with previously acquired reference patterns corresponding to normal operation of the investigated transmission lines and HVSs. Such reference patterns, together with experimentally obtained maps of the HVS protective grounding, will be summarized in a single document - a passport of the HVS and PTL. This passport must also contain the measured and calculated strength levels of the EF and MF in the places where the staff of the power facilities stay, as well as the layout of the equipment most vulnerable to the effects of electromagnetic interference. If necessary, as part of ongoing monitoring, recommendations will be given on the design and location of electromagnetic screens to reduce the levels of electromagnetic interference, as well as on the location of lightning rods to reduce the probability of lightning attachment to the objects. The paper presents the analytic expressions that form the basis of the developed software for calculation of the EF strength in the vicinity of power lines. This software will be used as a basis for UAV navigation along the transmission lines, as well as to detect violations in transmission line operation. Comparison of distributions of EF strength calculated with the help of the elaborated software with the known

  15. Monitoring of Welding Processes with Application of Artificial Neural Networks

    OpenAIRE

    Чвертко, Євгенія Петрівна; Пірумов, Андрій Євгенович; Шевченко, Микола Віталійович

    2014-01-01

    The paper presents a summary of methods for developing monitoring systems for processes involving heating of the filler material and/or base metal by electric current and involving periodic short-circuiting of the welding circuit. The processes investigated were MAG welding, underwater flux-cored welding and flash-butt welding. Details of the experiments and the primary data processing procedures, based on statistical analysis methods, are described, the aim of primary processing being the obtaining of informativ...

  16. Modelling and automation of the process of phosphate ion removal from waste waters

    Directory of Open Access Journals (Sweden)

    L. Lupa

    2008-03-01

    Full Text Available Phosphate removal from waste waters has become an environmental necessity, since these phosphates stimulate the growth of aquatic plants and plankton and contribute to the eutrophication process in general. The physicochemical methods of phosphate ion removal are the most effective and reliable. This paper presents studies of the process of phosphate ion removal from waste waters of the fertiliser industry, using the method of co-precipitation with iron salts and with calcium hydroxide as the neutralizing agent. The optimal process conditions were established as those that allow achievement of a maximum degree of separation of the phosphate ions. The precipitate resulting from the co-precipitation process was analysed to determine its chemical composition and its thermal and structural stability, and also to establish in which form the phosphate ions are present in the formed precipitate. Based on these considerations, the experimental data obtained in the process of phosphate ion removal from waste waters were analysed mathematically, and equations for the dependence of the degree of phosphate separation and of the residual concentration on the main parameters of the process were formulated. In this paper an automated scheme for phosphate ion removal from waste waters by co-precipitation is presented.

  17. Spatial monitoring of groundwater drawdown and rebound associated with quarry dewatering using automated time-lapse electrical resistivity tomography and distribution guided clustering

    OpenAIRE

    Chambers, J. E.; Meldrum, P.I.; P. B. Wilkinson; Ward, W.; Jackson, C; Matthews, B.; Joel, P; Kuras, O.; Bai, L.; S. Uhlemann; Gunn, D

    2015-01-01

    Dewatering systems used for mining and quarrying operations often result in highly artificial and complex groundwater conditions, which can be difficult to characterise and monitor using borehole point sampling approaches. Here automated time-lapse electrical resistivity tomography (ALERT) is considered as a means of monitoring subsurface groundwater dynamics associated with changes in the dewatering regime in an operational sand and gravel quarry. We considered two scenarios: the first was u...

  18. Groundwater monitoring plan for the 300 Area process trenches

    International Nuclear Information System (INIS)

    This document describes the groundwater monitoring program for the Hanford Site 300 Area Process Trenches (300 APT). The 300 APT are a Resource Conservation and Recovery Act of 1976 (RCRA) regulated unit. The 300 APT are included in the Dangerous Waste Portion of the Resource Conservation and Recovery Act Permit for the Treatment, Storage, and Disposal of Dangerous Waste, Permit No. WA890008967, and are subject to final-status requirements for groundwater monitoring. This document describes a compliance monitoring program for groundwater in the uppermost aquifer system at the 300 APT. This plan describes the 300 APT monitoring network, constituent list, sampling schedule, statistical methods, and sampling and analysis protocols that will be employed for the 300 APT. This plan will be used to meet groundwater monitoring requirements from the time the 300 APT becomes part of the Permit and through the postclosure care period until certification of final closure

  19. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed worldwide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize the automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  20. Effects of Secondary Task Modality and Processing Code on Automation Trust and Utilization During Simulated Airline Luggage Screening

    Science.gov (United States)

    Phillips, Rachel; Madhavan, Poornima

    2010-01-01

    The purpose of this research was to examine the impact of environmental distractions on human trust and utilization of automation during the process of visual search. Participants performed a computer-simulated airline luggage screening task with the assistance of a 70% reliable automated decision aid (called DETECTOR) both with and without environmental distractions. The distraction was implemented as a secondary task in either a competing modality (visual) or non-competing modality (auditory). The secondary task processing code either competed with the luggage screening task (spatial code) or with the automation's textual directives (verbal code). We measured participants' system trust, perceived reliability of the system (when a target weapon was present and absent), compliance, reliance, and confidence when agreeing and disagreeing with the system under both distracted and undistracted conditions. Results revealed that system trust was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Perceived reliability of the system (when the target was present) was significantly higher when the secondary task was visual rather than auditory. Compliance with the aid increased in all conditions except for the auditory-verbal condition, where it decreased. Similar to the pattern for trust, reliance on the automation was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Confidence when agreeing with the system decreased with the addition of any kind of distraction; however, confidence when disagreeing increased with the addition of an auditory secondary task but decreased with the addition of a visual task. A model was developed to represent the research findings and demonstrate the relationship between secondary task modality, processing code, and automation use. Results suggest that the nature of environmental distractions influence

  1. Automated and Scalable Data Reduction in the SOFIA Data Processing System

    Science.gov (United States)

    Krzaczek, R.; Shuping, R.; Charcos-Llorens, M.; Alles, R.; Vacca, W.

    2015-09-01

    In order to provide suitable data products to general investigators and other end users in a timely manner, the Stratospheric Observatory for Infrared Astronomy (SOFIA) has developed a framework supporting the automated execution of data processing pipelines for the various instruments, called the Data Processing System (DPS); see Shuping et al. (2014) for an overview. The primary requirement is to process all data collected from a flight within eight hours, allowing data quality assessments and inspections to be made the following day. The raw data collected during a flight require processing by a number of different software packages and tools unique to each combination of instrument and mode of operation, much of it developed in-house, in order to create data products for use by investigators and other end users. The requirement to deliver these data products in a consistent, predictable, and performant manner presents a significant challenge for the observatory. Herein we present aspects of the DPS that help to achieve these goals. We discuss how it supports data reduction software written in a variety of languages and environments, its support for new versions and live upgrades to that software and other necessary resources (e.g., calibrations), and its accommodation of sudden processing loads through the addition (and eventual removal) of computing resources, and close with an observation of the performance achieved in the first two observing cycles of SOFIA.

  2. Sensor-model prediction, monitoring and in-situ control of liquid RTM advanced fiber architecture composite processing

    Science.gov (United States)

    Kranbuehl, D.; Kingsley, P.; Hart, S.; Loos, A.; Hasko, G.; Dexter, B.

    In-situ frequency-dependent electromagnetic sensors (FDEMS) and the Loos resin transfer model have been used to select and control the processing properties of an epoxy resin during liquid pressure RTM impregnation and cure. Once correlated with viscosity and degree of cure, the FDEMS sensor monitors, and the RTM processing model predicts, the reaction advancement of the resin, its viscosity, and the impregnation of the fabric. This provides a direct means for predicting, monitoring, and controlling the liquid RTM process in-situ in the mold throughout the fabrication process, including the effects of time, temperature, vacuum and pressure. Most importantly, the FDEMS sensor-model system has been developed to make intelligent decisions, thereby automating the liquid RTM process and removing the need for operator direction.

  3. Remote monitoring field trial. Application to automated air sampling. Report on Task FIN-E935 of the Finnish Support Programme to IAEA Safeguards

    International Nuclear Information System (INIS)

    An automated air sampling station has recently been developed by the Radiation and Nuclear Safety Authority (STUK). The station is furnished with equipment that allows comprehensive remote monitoring of the station and the data. Under the Finnish Support Programme to IAEA Safeguards, STUK and Sandia National Laboratories (SNL) established a field trial to demonstrate the use of remote monitoring technologies. STUK provided means for real-time radiation monitoring and sample authentication, whereas SNL delivered means for authenticated surveillance of the equipment and its location. The field trial showed that remote monitoring can be carried out using simple means, although advanced facilities are needed for comprehensive surveillance. Authenticated measurement data could be reliably transferred from the monitoring site to the headquarters without the presence of authorized personnel at the monitoring site. The operation of the station and the remote monitoring system was reliable. (orig.)

  4. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    Science.gov (United States)

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is largely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis, including image processing and quantification of both the whole ductal tree and the terminal end buds. It allows accurate and objective measurement of growth parameters as well as fine morphological glandular structures. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus largely used by scientists studying rodent mammary gland morphology. PMID:26910307
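
    The branch metrics described above can be derived from a skeleton of the segmented gland. A sketch with scikit-image, assuming segmentation into a binary mask has already been done: skeleton pixels with one neighbour are branch end-points, those with three or more are branching points.

```python
# Ductal-tree quantification on a binary (pre-segmented) gland mask:
# skeletonize, then classify skeleton pixels by 8-neighbour count --
# 1 neighbour = branch end-point, 3 or more = branching point.
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def ductal_metrics(mask):
    skel = skeletonize(mask > 0)
    kernel = np.ones((3, 3), dtype=int)
    kernel[1, 1] = 0
    neighbours = convolve(skel.astype(int), kernel, mode="constant")
    endpoints = skel & (neighbours == 1)
    branchpoints = skel & (neighbours >= 3)
    skel_px = max(int(skel.sum()), 1)
    return {
        "epithelial_area_px": int((mask > 0).sum()),
        "total_duct_length_px": int(skel.sum()),
        "endpoint_density": endpoints.sum() / skel_px,
        "branching_density": branchpoints.sum() / skel_px,
    }
```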

  5. An automated data processing method dedicated to 3D ultrasonic non destructive testing of composite pieces

    International Nuclear Information System (INIS)

    State-of-the-art non-destructive testing using ultrasound is based on the evaluation of C-scan images, which is done mainly visually. The development of the new Sampling Phased Array technique (SPA) by IZFP Fraunhofer provides fast three-dimensional reconstruction of inner object structures. This new inspection technique is to be complemented with fully or semi-automated evaluation of the ultrasonic data, providing maximum support to the operator. We present in this contribution a processing method for SPA ultrasonic data; the main focus of this paper is on speckle-noise reduction. The evaluation method is applied to carbon fibre composites, where it demonstrates robust and successful performance in the recognition of defects.
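
    The paper's specific filter is not spelled out here, so as a baseline illustration of speckle suppression on a reconstructed volume, the following sketch applies a 3-D median filter and a simple amplitude gate to flag defect candidates; the synthetic volume and threshold are assumptions.

```python
# Baseline speckle suppression on a reconstructed SPA volume: a 3-D median
# filter followed by a simple amplitude gate to flag defect candidates.
# The synthetic volume and the 6-sigma gate are assumptions for the demo.
import numpy as np
from scipy.ndimage import label, median_filter

def detect_defects(volume, kernel=3, n_sigma=6.0):
    smooth = median_filter(volume, size=kernel)      # suppress speckle
    threshold = smooth.mean() + n_sigma * smooth.std()
    candidates, n = label(smooth > threshold)        # connected components
    return smooth, candidates, n

rng = np.random.default_rng(0)
vol = rng.rayleigh(1.0, size=(64, 64, 64))           # speckle-like background
vol[30:34, 30:34, 30:34] += 15.0                     # synthetic reflector
_, regions, n = detect_defects(vol)
print(f"{n} defect candidate(s) found")              # expect 1
```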

  6. Analysis of irradiated U-7wt%Mo dispersion fuel microstructures using automated image processing

    Science.gov (United States)

    Collette, R.; King, J.; Buesch, C.; Keiser, D. D.; Williams, W.; Miller, B. D.; Schulthess, J.

    2016-07-01

    The High Performance Research Reactor Fuel Development (HPPRFD) program is responsible for developing low enriched uranium (LEU) fuel substitutes for high performance reactors fueled with highly enriched uranium (HEU) that have not yet been converted to LEU. The uranium-molybdenum (U-Mo) fuel system was selected for this effort. In this study, fission gas pore segmentation was performed on U-7wt%Mo dispersion fuel samples at three separate fission densities using an automated image processing interface developed in MATLAB. Pore size distributions were attained that showed both expected and unexpected fission gas behavior. In general, it proved challenging to identify any dominant trends when comparing fission bubble data across samples from different fuel plates due to varying compositions and fabrication techniques. The results exhibited fair agreement with the fission density vs. porosity correlation developed by the Russian reactor conversion program.

  7. Monitoring of transport contamination

    International Nuclear Information System (INIS)

    The organization of monitoring of transport contamination is considered. Particularly thorough monitoring is recommended during loading-unloading operations. Monitoring is performed when vehicles leave the loading-unloading site and the controlled zone, and prior to preventive examination, technical service or repair. A method for monitoring motor transport contamination with high-energy β-emitters by means of a special stand, permitting automation of the monitoring process, is described

  8. Machine vision process monitoring on a poultry processing kill line: results from an implementation

    Science.gov (United States)

    Usher, Colin; Britton, Dougl; Daley, Wayne; Stewart, John

    2005-11-01

    Researchers at the Georgia Tech Research Institute designed a vision inspection system for poultry kill line sorting, with the potential for process control at various points throughout a processing facility. This system has been successfully operating in a plant for over two and a half years and has been shown to provide multiple benefits. With the introduction of HACCP-Based Inspection Models (HIMP), the opportunity for automated inspection systems to emerge as viable alternatives to human screening is promising. As more plants move to HIMP, these systems have great potential for augmenting a processing facility's visual inspection process. This will help to maintain a more consistent and potentially higher throughput while helping the plant remain within the HIMP performance standards. In recent years, several vision systems have been designed to analyze the exterior of a chicken and are capable of identifying Food Safety 1 (FS1) type defects under HIMP regulatory specifications. This means that a reliable vision system can be used in a processing facility as a carcass sorter to automatically detect and divert product that is not suitable for further processing. This improves evisceration line efficiency by creating a smaller set of features that human screeners are required to identify, which can reduce the required number of screeners or allow for faster processing line speeds. In addition to identifying FS1 category defects, the Georgia Tech vision system can also identify multiple "Other Consumer Protection" (OCP) category defects such as skin tears, bruises, broken wings, and cadavers. Monitoring these data in near real time allows the processing facility to address anomalies as soon as they occur. The Georgia Tech vision system can record minute-by-minute averages of the following defects: septicemia/toxemia, cadaver, over-scald, bruises, skin tears, and broken wings. In addition to these defects, the system also records the length and

  9. Automated Planning of Science Products Based on Nadir Overflights and Alerts for Onboard and Ground Processing

    Science.gov (United States)

    Chien, Steve A.; McLaren, David A.; Rabideau, Gregg R.; Mandl, Daniel; Hengemihle, Jerry

    2013-01-01

    A set of automated planning algorithms is the current operations baseline approach for the Intelligent Payload Module (IPM) of the proposed Hyperspectral Infrared Imager (HyspIRI) mission. For this operations concept, there are only local (e.g., non-depletable) operations constraints, such as real-time downlink and onboard memory, and the forward sweeping algorithm is optimal for determining which science products should be generated onboard and on the ground based on geographical overflights, science priorities, alerts, requests, and onboard and ground processing constraints. This automated planning approach was developed for the HyspIRI IPM concept. The HyspIRI IPM is proposed to use an X-band Direct Broadcast (DB) capability that would enable data to be delivered to ground stations virtually as they are acquired. However, the HyspIRI VSWIR and TIR instruments will produce approximately 1 Gbps of data, while the DB capability is 15 Mbps, an approximately 60X oversubscription. In order to address this mismatch, this innovation determines which data to downlink based on both the type of surface the spacecraft is overflying and the onboard processing of data to detect events. For example, when the spacecraft is overflying polar regions, it might downlink a snow/ice product. Additionally, the onboard software will search for thermal signatures indicative of a volcanic event or wildfire and downlink summary information (extent, spectra) when detected, thereby reducing data volume. The planning system described above automatically generated the IPM mission plan based on requested products, the overflight regions, and available resources.
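
    The oversubscription trade-off can be illustrated with a deliberately simplified greedy sketch that keeps the highest-priority products fitting within a downlink budget; this is an illustration only, not the forward sweeping planner itself, and all product names and sizes are invented:

        # products: (priority, size in bits, label); keep high priority first.
        def select_products(products, downlink_budget_bits):
            chosen, used = [], 0
            for priority, size, label in sorted(products, reverse=True):
                if used + size <= downlink_budget_bits:
                    chosen.append(label)
                    used += size
            return chosen

        plan = select_products(
            [(9, 2e6, "volcano alert summary"),
             (7, 5e8, "snow/ice product"),
             (3, 1e9, "full VSWIR scene")],
            downlink_budget_bits=6e8)
        # -> keeps the alert and the snow/ice product, drops the full scene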

  10. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    Science.gov (United States)

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors arising from communication failures are enormous during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information needs and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize individual patients' raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) for the continuum of care, multimedia abstract generation of intensive care data (MAGIC), an expert system that automatically generates a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) for an isolated point in time, the "Inference Engine," a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing, MAGIC, the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and against briefings given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhanced situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results. PMID:20012610

  11. AUTOMATED LONG-TERM REMOTE MONITORING OF SEDIMENT-WATER INTERFACIAL FLUX

    Science.gov (United States)

    Advective flux across the sediment-water interface is temporally and spatially heterogeneous in nature. For contaminated sediment sites, monitoring spatial as well as temporal variation of advective flux is of importance to proper risk management. This project was conducted to ...

  12. The Effect of Culture on the Sales Process Within a Global Company. Case Company ABB Oy Distribution Automation Sales Unit.

    OpenAIRE

    Kruger, Frantz

    2011-01-01

    My aim in this study was to investigate the possible differences between cultures when looking at them in the context of the sales process within a global company. If these differences did exist, I would further attempt to prove that, through careful analysis of the sales process and the elements within it, the associated activity within the sales process could be predicted or anticipated. I compared the activity of the ABB Distribution Automation Sales Unit (Vaasa, Finland) tow...

  13. Prototypic automated continuous recreational water quality monitoring of nine Chicago beaches.

    Science.gov (United States)

    Shively, Dawn A; Nevers, Meredith B; Breitenbach, Cathy; Phanikumar, Mantha S; Przybyla-Kelly, Kasia; Spoljaric, Ashley M; Whitman, Richard L

    2016-01-15

    Predictive empirical modeling is used in many locations worldwide as a rapid, alternative recreational water quality management tool to eliminate the delayed notifications associated with traditional fecal indicator bacteria (FIB) culturing (referred to as the persistence model, PM) and to prevent errors in releasing swimming advisories. The goal of this study was to develop a fully automated water quality management system for multiple beaches using predictive empirical models (EM) and state-of-the-art technology. Many recent EMs rely on samples or data collected manually, which adds to analysis time and increases the burden on the beach manager. In this study, data from water quality buoys and weather stations were transmitted through cellular telemetry to a web hosting service. An executable program simultaneously retrieved and aggregated data for regression equations and calculated EM results each morning at 9:30 AM; results were transferred through an RSS feed to a website, mapped to each beach, and received by the lifeguards to be posted at the beach. Models were initially developed for five beaches, but by the third year, 21 beaches were managed using refined and validated modeling systems. The adjusted R² values of the regressions relating Escherichia coli to hydrometeorological variables for the EMs were greater than those for the PMs, and ranged from 0.220 to 0.390 (2011) and 0.103 to 0.381 (2012). Validation results in 2013 revealed reduced predictive capabilities; however, three of the originally modeled beaches showed improvement in 2013 compared to 2012. The EMs generally showed higher accuracy and specificity than the PMs, and sensitivity was low for both approaches. In 2012, EM accuracy was 70-97%, specificity 71-100%, and sensitivity 0-64%; in 2013, accuracy was 68-97%, specificity 73-100%, and sensitivity 0-36%. Factors that may have affected model capabilities include instrument malfunction, non-point source inputs, and sparse calibration data.
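
    A minimal sketch of the empirical-model idea: regress log-transformed E. coli concentrations on hydrometeorological predictors and issue an advisory when the prediction exceeds an action threshold. The predictors, training values and the 235 CFU/100 ml threshold are illustrative assumptions, not the study's fitted models:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Columns: turbidity (NTU), wave height (m), 24 h rainfall (mm)
        X_train = np.array([[5.0, 0.3, 0.0],
                            [40.0, 1.2, 18.0],
                            [12.0, 0.6, 2.0],
                            [60.0, 1.5, 30.0]])
        y_train = np.log10([35, 800, 120, 1500])     # E. coli, CFU/100 ml

        model = LinearRegression().fit(X_train, y_train)

        def advisory(turbidity, wave_height, rain_24h, threshold=235.0):
            """Predict E. coli and flag an advisory above the threshold."""
            pred = 10 ** model.predict([[turbidity, wave_height, rain_24h]])[0]
            return pred, pred > threshold

        predicted_cfu, post_advisory = advisory(25.0, 0.9, 10.0)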

  14. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    International Nuclear Information System (INIS)

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput, with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited to situations in which a rapid sample analysis is required and/or large numbers of samples need to be analyzed. To address this issue, we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as of mass loading, flow rate, and resin performance, is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is being evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th, and also exhibits signs of a possible synergistic effect in the presence of iron.

  15. Automation and data processing with the Immucor Galileo® system in a university blood bank

    OpenAIRE

    Wittmann, Georg; Frank, Josef; Schramm, Wolfgang; Spannagl, Michael

    2007-01-01

    Background: The implementation of automated techniques improves the workflow and quality of immunohematological results. The workflows of our university blood bank were reviewed during the implementation of an automated immunohematological testing system. Methods: The work impact of blood grouping and subgrouping, cross-matching and antibody search using the Immucor Galileo system was compared with the previously used standard manual and semi-automated methods. Results: The redesign of our workflo...

  16. An automated medication system reduces errors in the medication administration process: results from a Danish hospital study

    DEFF Research Database (Denmark)

    Risør, Bettina Wulff; Lisby, Marianne; Sørensen, Jan

    2015-01-01

    Abstract Objectives: Improvements in a hospital's medication administration process might reduce the prevalence of medication errors and improve patient safety. The objective of this study was to evaluate the success of an automated medication system in reducing medication administration errors...... number of doses (opportunities for errors). Logistic regression was used to assess changes in error rates after implementation of the automated medication system with time, group, and interaction between time and group as independent variables. The estimated parameter for the interaction term was...... the control ward. The overall risk of errors was reduced by 57% in the intervention ward compared with the control ward (OR 0.43; 95% CI 0.30 to 0.63). Conclusions: The automated medication system reduced the error rate of the medication administration process and thus improved patient safety in the...

  17. A rapid, automated approach to optimisation of multiple reaction monitoring conditions for quantitative bioanalytical mass spectrometry.

    Science.gov (United States)

    Higton, D M

    2001-01-01

    An improvement to the procedure for the rapid optimisation of mass spectrometry (PROMS), for the development of multiple reaction monitoring (MRM) methods for quantitative bioanalytical liquid chromatography/tandem mass spectrometry (LC/MS/MS), is presented. PROMS is an automated protocol that uses flow-injection analysis (FIA) and AppleScripts to create methods and acquire the data for optimisation. The protocol determines the optimum orifice potential and the MRM conditions for each compound, and finally creates the MRM methods needed for sample analysis. The sensitivities of the MRM methods created by PROMS approach those created manually. MRM method development using PROMS currently takes less than three minutes per compound, compared to at least fifteen minutes manually. To further enhance throughput, approaches to MRM optimisation using one injection per compound, two injections per pool of five compounds, and one injection per pool of five compounds have been investigated. No significant differences in the optimised instrumental parameters for the MRM methods were found between the original PROMS approach and these new methods, which are up to ten times faster. The time taken for an AppleScript to determine the optimum conditions and build the MRM methods is the same with all approaches. PMID:11596136
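
    The heart of such a protocol is a sweep-and-pick loop over one instrument parameter at a time. A minimal sketch, with measure_intensity standing in for the instrument read-back during a flow-injection peak (the callback and the synthetic response are illustrative assumptions):

        def optimise_parameter(values, measure_intensity):
            """Step a parameter across `values` and keep the setting that
            gives the highest product-ion signal."""
            best_value, best_signal = None, float("-inf")
            for v in values:
                signal = measure_intensity(v)   # mean intensity over one peak
                if signal > best_signal:
                    best_value, best_signal = v, signal
            return best_value, best_signal

        # Synthetic response peaking at a collision energy of 35 V
        value, signal = optimise_parameter(
            range(10, 61, 5), lambda v: 100 - abs(v - 35))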

  18. Automated Grid Monitoring for the LHCb Experiment Through HammerCloud

    CERN Document Server

    Dice, Bradley

    2015-01-01

    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  19. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    CERN Document Server

    Kazarov, A; The ATLAS collaboration; Magnoni, L

    2011-01-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for filtering and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The huge flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful in...

  20. The AAL project: Automated monitoring and intelligent AnaLysis for the ATLAS data taking infrastructure

    CERN Document Server

    Magnoni, L; The ATLAS collaboration; Kazarov, A

    2011-01-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for filtering and transferring ATLAS experimental data from detectors to the mass storage system. It relies on a large, distributed computing environment, including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data taking runs, streams of messages sent by applications via the message reporting system, together with data published from applications via information services, are the main sources of knowledge about the correctness of running operations. The huge flow of data produced (with an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful in...

  1. Automated Wildlife Monitoring Using Self-Configuring Sensor Networks Deployed in Natural Habitats

    OpenAIRE

    Trifa, Vlad; Girod, Lewis; Travis C. Collier; Blumstein, Daniel; Taylor, C E

    2007-01-01

    To understand the complex interactions among animals within an ecosystem, biologists need to be able to track their location and social interactions. There are a variety of factors that make this difficult. We propose using adaptive, embedded networked sensing technologies to develop an efficient means for wildlife monitoring. This paper surveys our research; we demonstrate how a self-organizing system can efficiently conduct real-time acoustic source detection and localization using distribu...

  2. A STUDY AND REASEARCH ON MONITORING SYSTEM AND EMBEDDED WEB SERVER BASED AUTOMATION

    OpenAIRE

    V. Sai Kishore

    2015-01-01

    Instead of PC-based servers, ARM processor-based servers are becoming the trend in today's market. Cost reduction is achieved by using an ARM processor along with an Ethernet module as an embedded web server. The idea is utilized for monitoring and controlling a maximum number of either home appliances or industrial devices. Without using a computer, the Ethernet module can communicate with the owner of the overall system, who is able to manage appliances from any outside location. This server provides a powerful...

  3. An Automated DICOM Database Capable of Arbitrary Data Mining (Including Radiation Dose Indicators) for Quality Monitoring

    OpenAIRE

    Wang, Shanshan; Pavlicek, William; Roberts, Catherine C.; Langer, Steve G; Zhang, Muhong; Hu, Mengqi; Morin, Richard L.; Schueler, Beth A.; Wellnitz, Clinton V.; Wu, Teresa

    2010-01-01

    The U.S. National Press has brought to full public discussion concerns regarding the use of medical radiation, specifically x-ray computed tomography (CT), in diagnosis. A need exists for developing methods whereby assurance is given that all diagnostic medical radiation use is properly prescribed, and all patients’ radiation exposure is monitored. The “DICOM Index Tracker©” (DIT) transparently captures desired digital imaging and communications in medicine (DICOM) tags from CT, nuclear imagi...

  4. How Sandia's automated on-line monitoring system can cut emergency diesel costs

    International Nuclear Information System (INIS)

    Sandia National Laboratories has developed a system that could reduce the need for additional back-up power equipment at nuclear power plants by automatically monitoring the starting ability of emergency diesel generators. The equipment has arisen out of US DoE efforts on nuclear plant life extension, which have been looking at predicting equipment performance as a means of reducing equipment surveillance testing, maintenance and malfunctions. (author)

  5. Process monitoring using a Quality and Technical Surveillance Program

    Energy Technology Data Exchange (ETDEWEB)

    Rafferty, C.A.

    1995-02-01

    The purpose of process monitoring using a Quality and Technical Surveillance Program was to help ensure that manufactured clad vent sets fully met the technical and quality requirements established by the manufacturer and the customer, and that line and program management were immediately alerted if any aspect of the manufacturing activities drifted out of acceptable limits. The Quality and Technical Surveillance Program provided a planned, scheduled approach to monitoring key processes and documentation, illuminating potential problem areas early enough to permit timely corrective actions to reverse negative trends that, if left uncorrected, could have resulted in deficient hardware. Significant schedule and cost impacts were eliminated.

  6. Process monitoring using a Quality and Technical Surveillance Program

    International Nuclear Information System (INIS)

    The purpose of process monitoring using a Quality and Technical Surveillance Program was to help ensure that manufactured clad vent sets fully met the technical and quality requirements established by the manufacturer and the customer, and that line and program management were immediately alerted if any aspect of the manufacturing activities drifted out of acceptable limits. The Quality and Technical Surveillance Program provided a planned, scheduled approach to monitoring key processes and documentation, illuminating potential problem areas early enough to permit timely corrective actions to reverse negative trends that, if left uncorrected, could have resulted in deficient hardware. Significant schedule and cost impacts were eliminated

  7. Automation of metrological operations on measuring apparatuses of radiation monitoring system

    International Nuclear Information System (INIS)

    In this paper the measuring apparatuses for ionizing radiation used in the radiation monitoring of NPP Dukovany operation are described. The increase in the number of metrological operations has been made possible only by a timely reconstruction of the laboratory and by computerization of the measuring procedure and of the administrative work, which consists mainly of recording a great deal of information about the monitored measuring apparatuses. There are three working places in the laboratory: 1) an irradiation gamma stand with cesium-137 sources; 2) an irradiation stand with plutonium-beryllium neutron sources; 3) a spectrometric working place. With regard to the uniqueness of the laboratory operation, all hardware and software work has been implemented in-house. The equipment of the laboratory makes it possible to test metrologically all the radiation monitoring apparatuses used at NPP Dukovany. The quality of operation of the laboratory of ionizing radiation metrology underpins the proper functioning of the radiation monitoring system, which directly influences the assurance of nuclear safety at NPP Dukovany

  8. Integrating and automating the software environment for the Beam and Radiation Monitoring for CMS

    CERN Document Server

    Filyushkina, Olga; Juslin, J

    2010-01-01

    This paper describes the real-time online visualization framework used by the Beam and Radiation Monitoring group at the Compact Muon Solenoid (CMS) at the Large Hadron Collider, CERN. The purpose of the visualization framework is to provide real-time diagnostics of beam conditions, which defines the set of requirements to be met by the framework. Those requirements include data quality assurance, vital safety issues, low latency, data caching, etc. The real-time visualization framework is written in the Java programming language and based on JDataViewer, a plotting package developed at CERN. At the current time the framework is run by the Beam and Radiation Monitoring, Pixel and Tracker groups, the Run Field Manager, and others. It contributed to real-time data analysis during the 2009-2010 runs as a stable monitoring tool. The displays reflect the beam conditions in real time with low latency; thus this is the first place at the CMS detector where the beam collisions are observed.

  9. In situ monitoring of plasma etch processes with a quantum cascade laser arrangement in semiconductor industrial environment

    International Nuclear Information System (INIS)

    Concentrations of the etch product SiF4 were measured online and in situ in technological etch plasmas with a specially designed quantum cascade laser arrangement for application in the semiconductor industrial environment, the Q-MACS Etch. The combination of quantum cascade lasers and infrared absorption spectroscopy (QCLAS) opens up attractive new possibilities for plasma process monitoring and control. With the realization of a specific interface, the Q-MACS Etch system is synchronized to the etch process and therefore allows automated measurements, which is important in a high-volume production environment.

  10. In situ monitoring of plasma etch processes with a quantum cascade laser arrangement in semiconductor industrial environment

    Energy Technology Data Exchange (ETDEWEB)

    Lang, N; Roepcke, J; Zimmermann, H [INP Greifswald, Felix-Hausdorff-Str. 2, 17489 Greifswald (Germany); Steinbach, A; Wege, S, E-mail: lang@jnp-greifswald.d [Qimonda Dresden GmbH and Co. OHG, Koenigsbruecker-Str. 180, 01099 Dresden (Germany)

    2009-03-01

    Concentrations of the etch product SiF4 were measured online and in situ in technological etch plasmas with a specially designed quantum cascade laser arrangement for application in the semiconductor industrial environment, the Q-MACS Etch. The combination of quantum cascade lasers and infrared absorption spectroscopy (QCLAS) opens up attractive new possibilities for plasma process monitoring and control. With the realization of a specific interface, the Q-MACS Etch system is synchronized to the etch process and therefore allows automated measurements, which is important in a high-volume production environment.

  11. Distributed Inference and Query Processing for RFID Tracking and Monitoring

    CERN Document Server

    Cao, Zhao; Diao, Yanlei; Shenoy, Prashant

    2011-01-01

    In this paper, we present the design of a scalable, distributed stream processing system for RFID tracking and monitoring. Since RFID data lacks containment and location information that is key to query processing, we propose to combine location and containment inference with stream query processing in a single architecture, with inference as an enabling mechanism for high-level query processing. We further consider challenges in instantiating such a system in large distributed settings and design techniques for distributed inference and query processing. Our experimental results, using both real-world data and large synthetic traces, demonstrate the accuracy, efficiency, and scalability of our proposed techniques.

  12. Porosity of additive manufacturing parts for process monitoring

    International Nuclear Information System (INIS)

    Some metal additive manufacturing processes can produce parts with internal porosity, either intentionally (with careful selection of the process parameters) or unintentionally (if the process is not well controlled). Material porosity is undesirable for aerospace parts, since porosity could lead to premature failure, and desirable for some biomedical implants, since surface-breaking pores allow for better integration with biological tissue. Changes in a part's porosity during an additive manufacturing build may also be an indication of an undesired change in the process. We are developing an ultrasonic sensor for detecting changes in porosity in metal parts during fabrication on a metal powder bed fusion system, for use as a process monitor. This paper will describe our work to develop an ultrasonic-based sensor for monitoring part porosity during an additive build, including background theory, the development and detailed characterization of reference additive porosity samples, and a potential design for in-situ implementation.
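
    Porosity changes commonly show up ultrasonically as increased attenuation. A minimal sketch of estimating an attenuation coefficient from two successive backwall echoes, neglecting reflection and diffraction losses (a textbook simplification, not the authors' sensor design):

        import math

        def attenuation_db_per_mm(a1, a2, thickness_mm):
            """a1, a2: amplitudes of the first and second backwall echoes.
            The second echo travels one extra round trip (2 * thickness),
            so the amplitude ratio gives the attenuation coefficient."""
            return 20.0 * math.log10(a1 / a2) / (2.0 * thickness_mm)

        alpha = attenuation_db_per_mm(a1=1.0, a2=0.55, thickness_mm=10.0)
        # A rise in alpha from layer to layer during the build could be
        # flagged as a possible porosity change.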

  13. Microcontroller-Based Process Monitoring Using Petri-Nets

    Directory of Open Access Journals (Sweden)

    Frankowiak, Marcos R.

    2009-01-01

    Full Text Available This paper considers the development of a Petri-net-based modelling tool as a mechanism for process and system monitoring. The use of Petri-nets, which has previously been largely based in the areas of systems modelling and simulation, is shown here to have great potential for deployment as a process monitoring and management application. Interfacing with real-world processes has been achieved in part by introducing a specific set of extensions to the original Petri-net concept. This work has resulted in the engineering of a tool that can be embedded within the process using a microcontroller platform. The potential for such systems to provide low cost, yet powerful process management tools, is becoming increasingly evident, particularly given the ever-improving capabilities of microcontrollers.

  14. Microcontroller-Based Process Monitoring Using Petri-Nets

    Directory of Open Access Journals (Sweden)

    2009-02-01

    Full Text Available This paper considers the development of a Petri-net-based modelling tool as a mechanism for process and system monitoring. The use of Petri-nets, which has previously been largely based in the areas of systems modelling and simulation, is shown here to have great potential for deployment as a process monitoring and management application. Interfacing with real-world processes has been achieved in part by introducing a specific set of extensions to the original Petri-net concept. This work has resulted in the engineering of a tool that can be embedded within the process using a microcontroller platform. The potential for such systems to provide low cost, yet powerful process management tools, is becoming increasingly evident, particularly given the ever-improving capabilities of microcontrollers.
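
    A minimal sketch of the token-marking idea behind such a monitor: a transition fires only when every input place holds a token, so the marking tracks which process states are active (the class and the place names are illustrative):

        class PetriNet:
            def __init__(self, marking):
                self.marking = dict(marking)         # place -> token count

            def enabled(self, inputs):
                return all(self.marking.get(p, 0) > 0 for p in inputs)

            def fire(self, inputs, outputs):
                """Consume one token from each input place and produce one
                in each output place; refuse to fire if not enabled."""
                if not self.enabled(inputs):
                    raise RuntimeError("transition not enabled")
                for p in inputs:
                    self.marking[p] -= 1
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1

        net = PetriNet({"part_loaded": 1, "machine_idle": 1})
        net.fire(inputs=["part_loaded", "machine_idle"], outputs=["machining"])
        # marking: {"part_loaded": 0, "machine_idle": 0, "machining": 1}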

  15. A fuzzy model for processing and monitoring vital signs in ICU patients

    OpenAIRE

    Valentim Ricardo AM; Neto Adrião DD; Sizilio Gláucia RA; Leite Cicília RM; Guerreiro Ana MG

    2011-01-01

    Background: The area of hospital automation has been the subject of much research, addressing relevant issues which can be automated, such as: management and control (electronic medical records, scheduling appointments, hospitalization, among others); communication (tracking patients, staff and materials); development of medical, hospital and laboratory equipment; monitoring (patients, staff and materials); and aid to medical diagnosis (according to each speciality). Methods: In th...

  16. Automating the Mapping Process of Traditional Malay Textile Knowledge Model with the Core Ontology

    Directory of Open Access Journals (Sweden)

    Syerina A.M. Nasir

    2011-01-01

    Full Text Available Problem statement: The wave of ontology has spread drastically in the cultural heritage domain. The impact can be seen in the growing number of cultural heritage web information systems, the available textile ontology, and harmonization works with the core ontology, CIDOC CRM. The aim of this study is to provide a base for common views in automating the process of mapping between the revised TMT Knowledge Model and CIDOC CRM. Approach: Manual mapping was conducted to find similar or overlapping concepts which could be aligned to each other in order to achieve ontology similarity. This was achieved after the TMT Knowledge Model had undergone a transformation process to match the CIDOC CRM structure. Results: Although several problems were encountered during the mapping process, the result shows an instant view of the classes which were found to be easily mapped between both models. Conclusion/Recommendations: Future research will focus on the construction of a Batik Heritage Ontology by using the mapping result obtained in this study. Further testing, evaluation and refinement using real collections of cultural artifacts within museums will also be conducted in the near future.

  17. Automated One-loop Computation in Quarkonium Process within NRQCD Framework

    CERN Document Server

    Feng, Feng

    2013-01-01

    In recent decades, it has been realized that next-to-leading order corrections may become very important, and sometimes requisite, for some processes involving quarkonium production or decay, e.g., $e^+e^- \to J/\psi + \eta_c$ and $J/\psi \to 3\gamma$. In this article, we review some basic steps to perform automated one-loop computations in quarkonium processes within the Non-relativistic Quantum Chromodynamics (NRQCD) factorization framework, and we give an introduction to some related public tools or packages and their usage in each step. We start from generating Feynman diagrams and amplitudes with \textsc{FeynArts} for the quarkonium process, performing Dirac and color algebra simplifications using \textsc{FeynCalc} and \textsc{FeynCalcFormLink}, then doing partial fractioning on the linearly dependent propagators with \textsc{APart}, and finally reducing the Tensor Integrals (TI) into Scalar Integrals (SI) or Master Integrals (MI) using the Integration-By-Parts (IBP) method with the help of \textsc{F...

  18. Automated process parameters tuning for an injection moulding machine with soft computing

    Institute of Scientific and Technical Information of China (English)

    Peng ZHAO; Jian-zhong FU; Hua-min ZHOU; Shu-biao CUI

    2011-01-01

    In injection moulding production, the tuning of the process parameters is a challenging job which relies heavily on the experience of skilled operators. In this paper, taking into consideration operator assessment during moulding trials, a novel intelligent model for the automated tuning of process parameters is proposed. It consists of case-based reasoning (CBR), empirical model (EM), and fuzzy logic (FL) methods. CBR and EM are used to imitate the recall and intuitive thoughts of skilled operators, respectively, while FL is adopted to simulate the skilled operator's optimization thoughts. First, CBR is used to set up the initial process parameters. If CBR fails, EM is employed to calculate the initial parameters. Next, a moulding trial is performed using the initial parameters. Then FL is adopted to optimize these parameters and correct defects repeatedly until the moulded part is found to be satisfactory. Based on the above methodologies, intelligent software was developed and embedded in the controller of an injection moulding machine. Experimental results show that the intelligent software can be used effectively in practical production, and it greatly reduces the dependence on the experience of the operators.

  19. Three-dimensional rotation electron diffraction: software RED for automated data collection and data processing.

    Science.gov (United States)

    Wan, Wei; Sun, Junliang; Su, Jie; Hovmöller, Sven; Zou, Xiaodong

    2013-12-01

    Implementation of a computer program package for automated collection and processing of rotation electron diffraction (RED) data is described. The software package contains two computer programs: RED data collection and RED data processing. The RED data collection program controls the transmission electron microscope and the camera. Electron beam tilts at a fine step (0.05-0.20°) are combined with goniometer tilts at a coarse step (2.0-3.0°) around a common tilt axis, which allows a fine relative tilt to be achieved between the electron beam and the crystal in a large tilt range. An electron diffraction (ED) frame is collected at each combination of beam tilt and goniometer tilt. The RED data processing program processes three-dimensional ED data generated by the RED data collection program or by other approaches. It includes shift correction of the ED frames, peak hunting for diffraction spots in individual ED frames and identification of these diffraction spots as reflections in three dimensions. Unit-cell parameters are determined from the positions of reflections in three-dimensional reciprocal space. All reflections are indexed, and finally a list with hkl indices and intensities is output. The data processing program also includes a visualizer to view and analyse three-dimensional reciprocal lattices reconstructed from the ED frames. Details of the implementation are described. Data collection and data processing with the software RED are demonstrated using a calcined zeolite sample, silicalite-1. The structure of the calcined silicalite-1, with 72 unique atoms, could be solved from the RED data by routine direct methods. PMID:24282334

  20. Automated modal tracking and fatigue assessment of a wind turbine based on continuous dynamic monitoring

    Directory of Open Access Journals (Sweden)

    Oliveira Gustavo

    2015-01-01

    Full Text Available The paper describes the implementation of a dynamic monitoring system at a 2.0 MW onshore wind turbine. The system is composed of two components aimed at structural integrity and fatigue assessment. The first component enables the continuous tracking of the modal characteristics of the wind turbine (natural frequency values, modal damping ratios and mode shapes) in order to detect abnormal deviations of these properties, which may be caused by the occurrence of structural damage. The second component allows the estimation of the remaining fatigue lifetime of the structure based on the analysis of the measured cycles of structural vibration.
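
    A minimal sketch of the tracking idea: estimate the dominant natural frequency of each acceleration record with Welch's method and flag abnormal drift from a reference value. The sampling rate, the 0.35 Hz tower mode and the 5% tolerance are illustrative assumptions:

        import numpy as np
        from scipy.signal import welch

        def dominant_frequency(accel, fs):
            """Return the frequency (Hz) of the largest spectral peak."""
            freqs, psd = welch(accel, fs=fs, nperseg=1024)
            return freqs[np.argmax(psd)]

        def frequency_alarm(accel, fs, f_ref, tol=0.05):
            """Alarm when the tracked frequency drifts more than tol
            (here 5%) from the reference natural frequency f_ref."""
            f = dominant_frequency(accel, fs)
            return f, abs(f - f_ref) / f_ref > tol

        # Synthetic 0.35 Hz tower mode sampled at 20 Hz for 10 minutes
        fs = 20.0
        t = np.arange(0, 600, 1 / fs)
        accel = np.sin(2 * np.pi * 0.35 * t) + 0.1 * np.random.randn(t.size)
        f_now, alarm = frequency_alarm(accel, fs, f_ref=0.35)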

  1. Monitoring of bone regeneration process by means of texture analysis

    International Nuclear Information System (INIS)

    An image analysis method is proposed for the monitoring of the regeneration of the tibial bone. For this purpose, 130 digitized radiographs of 13 patients, who had undergone tibial lengthening by the Ilizarov method, were studied. For each patient, 10 radiographs, taken at an equal number of successive postoperative time moments, were available. Employing available software, three Regions Of Interest (ROIs), corresponding to (a) the upper, (b) the central, and (c) the lower aspect of the gap where bone regeneration was expected to occur, were determined on each radiograph. Employing custom developed algorithms: (i) a number of textural features were generated from each of the ROIs, and (ii) a texture-feature based regression model was designed for the quantitative monitoring of the bone regeneration process. Statistically significant differences were found, and the regression model followed the bone regeneration process closely (R² = 0.9, p < 0.001). The suggested method may contribute to the monitoring of the tibial bone regeneration process.
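
    The study's specific textural features are not listed in this record, so a minimal sketch of three common first-order descriptors (mean, variance and grey-level entropy) for an 8-bit ROI stands in as an illustration of the feature-generation step:

        import numpy as np

        def texture_features(roi):
            """roi: 2-D array of 8-bit grey levels. Returns the mean,
            variance and entropy of the grey-level distribution."""
            hist, _ = np.histogram(roi, bins=256, range=(0, 256),
                                   density=True)
            hist = hist[hist > 0]                  # drop empty bins
            entropy = -np.sum(hist * np.log2(hist))
            return float(roi.mean()), float(roi.var()), float(entropy)

        roi = np.random.default_rng(0).integers(0, 256, (64, 64))
        mean, var, ent = texture_features(roi)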

  2. Laboratory support for the didactic process of engineering processes automation at the Faculty of Mechanical Engineering

    Directory of Open Access Journals (Sweden)

    G. Wszołek

    2006-02-01

    Full Text Available Purpose: The scope of the paper is to present the effects of creating laboratory support for the didactic process of automatic control of engineering processes. Design/methodology/approach: The discussed laboratory framework is a complex system, flexible in terms of further development, operating on four basic levels: a rudimental level serving general introductory classes on the subject; an advanced level suitable for specialisation classes; hardware and software for individual or team assignments completed in the course of self-study, semester projects, and BSc and MSc theses; and a sophisticated level designed for PhD and DSc research workers. Findings: Close cooperation with industry and the practical implementation of joint research projects play a crucial role in the functioning of the laboratory framework. Practical implications: The education of modern engineers and Masters of Science in automatic control and robotics is a challenging task which may be successfully accomplished only if faced with industrial reality. Continuously advancing industrial companies demand graduates who can quickly adjust to the workflow and who can instantly utilize the knowledge and skills acquired in the complex, interdisciplinary field of mechatronics. Originality/value: The discussed laboratory framework successfully couples software and hardware, providing a complex yet flexible system open for further development, enabling teaching and research into the design and operation of modern control systems, both by means of virtual construction and testing in simulation programs, as well as on real industrial structures configured in laboratory workstations.

  3. Modern integrated environmental monitoring and processing systems for nuclear facilities

    International Nuclear Information System (INIS)

    The continuous activity of surveying and monitoring releases and current radiation levels in the vicinity of a nuclear facility is essential for the protection of people and the environment. Considering the vast amount of information and data needed to keep an updated overview of a situation, both during daily surveillance work and during accident situations, the need for an efficient monitoring and processing system is evident. The rapid development of computer technology and telecommunications, together with the evolution of fast and accurate computer codes enabling on-line calculations, improves the quality of decision-making in complex situations and assures high efficiency. Monitoring and processing systems are used both for environmental protection and for controlling nuclear power plant emergency and post-accident situations. Such a system can provide information to radiation management systems in order to assess the consequences of nuclear accidents and to establish a basis for correct decisions in civil defense. The main task of integrated environmental monitoring systems is to record, collect, process and transmit radiation levels and weather data, incorporating a number of stationary or mobile radiation monitoring instruments, weather parameter measuring stations, an information processing center and the communication network, all running under a real-time operating system. They provide automatic on-line and off-line data collection, remote diagnostics, and advanced presentation techniques, including graphically oriented executive support, which has the ability to respond to an emergency by geographical representation of the hazard zones on the map. The systems are based on local intelligent measuring and transmission units, with simultaneous processing and data presentation using a real-time operating system for personal computers and a geographical information system (GIS). All information can be managed directly from the map by multilevel data retrieving and

  4. Process control monitoring systems, industrial plants, and process control monitoring methods

    Science.gov (United States)

    Skorpik, James R. [Kennewick, WA]; Gosselin, Stephen R. [Richland, WA]; Harris, Joe C. [Kennewick, WA]

    2010-09-07

    A system comprises a valve; a plurality of RFID sensor assemblies coupled to the valve to monitor a plurality of parameters associated with the valve; a control tag configured to wirelessly communicate with the respective tags that are coupled to the valve, the control tag being further configured to communicate with an RF reader; and an RF reader configured to selectively communicate with the control tag, the reader including an RF receiver. Other systems and methods are also provided.

  5. Facility Effluent Monitoring Plan for the 325 Radiochemical Processing Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Shields, K.D.; Ballinger, M.Y.

    1999-04-02

    This Facility Effluent Monitoring Plan (FEMP) has been prepared for the 325 Building Radiochemical Processing Laboratory (RPL) at the Pacific Northwest National Laboratory (PNNL) to meet the requirements in DOE Order 5400.1, "General Environmental Protection Programs." This FEMP has been prepared for the RPL primarily because it has a "major" (potential to emit >0.1 mrem/yr) emission point for radionuclide air emissions according to the annual National Emission Standards for Hazardous Air Pollutants (NESHAP) assessment performed. This section summarizes the airborne and liquid effluents and the inventory-based NESHAP assessment for the facility. The complete monitoring plan includes characterization of effluent streams, monitoring/sampling design criteria, a description of the monitoring systems and sample analysis, and quality assurance requirements. The RPL at PNNL houses radiochemistry research, radioanalytical service, radiochemical process development, and hazardous and radioactive mixed waste treatment activities. The laboratories and specialized facilities enable work ranging from that with nonradioactive materials to work with picogram to kilogram quantities of fissionable materials and up to megacurie quantities of other radionuclides. The special facilities within the building include two shielded hot-cell areas that provide for process development or analytical chemistry work with highly radioactive materials and a waste treatment facility for processing hazardous, mixed radioactive, low-level radioactive, and transuranic wastes generated by PNNL activities.

  6. Facility Effluent Monitoring Plan for the 325 Radiochemical Processing Laboratory

    International Nuclear Information System (INIS)

    This Facility Effluent Monitoring Plan (FEMP) has been prepared for the 325 Building Radiochemical Processing Laboratory (RPL) at the Pacific Northwest National Laboratory (PNNL) to meet the requirements in DOE Order 5400.1, "General Environmental Protection Programs." This FEMP has been prepared for the RPL primarily because it has a "major" (potential to emit >0.1 mrem/yr) emission point for radionuclide air emissions according to the annual National Emission Standards for Hazardous Air Pollutants (NESHAP) assessment performed. This section summarizes the airborne and liquid effluents and the inventory-based NESHAP assessment for the facility. The complete monitoring plan includes characterization of effluent streams, monitoring/sampling design criteria, a description of the monitoring systems and sample analysis, and quality assurance requirements. The RPL at PNNL houses radiochemistry research, radioanalytical service, radiochemical process development, and hazardous and radioactive mixed waste treatment activities. The laboratories and specialized facilities enable work ranging from that with nonradioactive materials to work with picogram to kilogram quantities of fissionable materials and up to megacurie quantities of other radionuclides. The special facilities within the building include two shielded hot-cell areas that provide for process development or analytical chemistry work with highly radioactive materials and a waste treatment facility for processing hazardous, mixed radioactive, low-level radioactive, and transuranic wastes generated by PNNL activities

  7. Acoustic monitoring of a fluidized bed coating process

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Veski, Peep; Pedersen, Joan G.; Anov, Dan; Jørgensen, Pia; Kristensen, Henning Gjelstrup; Bertelsen, Poul

    The aim of the study was to investigate the potential of acoustic monitoring of a production-scale fluidized bed coating process. The correlations between the sensor signals and the estimated amount of film applied and the percentage release, respectively, were investigated in coating potassium chloride

  8. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

    CERN Document Server

    Kruger, Uwe

    2012-01-01

    The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike. Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering. The recipe for the tremendous interest in multivariate statistical techniques lies in their simplicity and adaptability for developing monitoring applica...

  9. Process control and recovery in the Link Monitor and Control Operator Assistant

    Science.gov (United States)

    Lee, Lorrine; Hill, Randall W., Jr.

    1993-01-01

    This paper describes our approach to providing process control and recovery functions in the Link Monitor and Control Operator Assistant (LMCOA). The focus of the LMCOA is to provide semi-automated monitoring and control to support station operations in the Deep Space Network. The LMCOA will be demonstrated with precalibration operations for Very Long Baseline Interferometry on a 70-meter antenna. Precalibration, the task of setting up the equipment to support a communications link with a spacecraft, is a manual, time-consuming and error-prone process. One problem with the current system is that it does not provide explicit feedback about the effects of control actions. The LMCOA uses a Temporal Dependency Network (TDN) to represent an end-to-end sequence of operational procedures and a Situation Manager (SM) module to provide process control, diagnosis, and recovery functions. The TDN is a directed network representing precedence, parallelism, precondition, and postcondition constraints. The SM maintains an internal model of the expected and actual states of the subsystems in order to determine whether each control action executed successfully and to provide feedback to the user. The LMCOA is implemented on a NeXT workstation using Objective C, Interface Builder, and the C Language Integrated Production System.
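
    A minimal sketch of the TDN idea: a step executes only when its preconditions hold, and the situation manager's feedback amounts to checking postconditions against the actual state afterwards (the class, state dictionary and antenna example are illustrative, not the LMCOA implementation):

        class TDNStep:
            def __init__(self, name, preconditions, action, postconditions):
                self.name = name
                self.preconditions = preconditions   # predicates on state
                self.action = action                 # callable mutating state
                self.postconditions = postconditions

            def run(self, state):
                """Execute the step; return True only if the control action
                verifiably had its expected effect."""
                if not all(p(state) for p in self.preconditions):
                    return False                     # blocked: report it
                self.action(state)
                return all(p(state) for p in self.postconditions)

        state = {"antenna": "stowed"}
        step = TDNStep(
            "point_antenna",
            preconditions=[lambda s: s["antenna"] == "stowed"],
            action=lambda s: s.update(antenna="on_point"),
            postconditions=[lambda s: s["antenna"] == "on_point"])
        ok = step.run(state)                         # True: effect verified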

  10. Atmosphere Processing Module Automation and Catalyst Durability Analysis for Mars ISRU Pathfinder

    Science.gov (United States)

    Petersen, Elspeth M.

    2016-01-01

    The Mars In-Situ Resource Utilization Pathfinder was designed to create fuel for an ascent vehicle, using components found in the planet's atmosphere and regolith, to return a potential sample-return or crew-return vehicle from Mars. The Atmosphere Processing Module (APM), a subunit of the pathfinder, uses cryocoolers to isolate and collect carbon dioxide from Mars simulant gas. The carbon dioxide is fed with hydrogen into a Sabatier reactor where methane is produced. The APM is currently undergoing the final stages of testing at Kennedy Space Center prior to process integration testing with the other subunits of the pathfinder. The automation software for the APM cryocoolers was tested and found to perform nominally. The catalyst used for the Sabatier reactor was investigated to determine the factors contributing to catalyst failure. The results from the catalyst testing require further analysis, but it appears that either the rapid change in temperature during reactor start-up or the elevated operating temperature is responsible for the changes observed in the catalyst.

  11. Batch process monitoring based on multilevel ICA-PCA

    Institute of Scientific and Technical Information of China (English)

    Zhi-qiang GE; Zhi-huan SONG

    2008-01-01

    In this paper, we describe a new batch process monitoring method based on multilevel independent component analysis and principal component analysis (MLICA-PCA). Unlike the conventional multi-way principal component analysis (MPCA) method, MLICA-PCA provides a separated interpretation of multilevel batch process data. Batch process data are partitioned into two levels: the within-batch level and the between-batch level. In each level, the Gaussian and non-Gaussian components of process information can be separately extracted. I², T² and SPE statistics are individually built and monitored. The new method facilitates fault diagnosis: since the two variation levels are decomposed, the variables responsible for faults in each level can be identified and interpreted more easily. A case study of the DuPont benchmark process showed that the proposed method was more efficient and interpretable in fault detection and diagnosis, compared to the alternative batch process monitoring method.
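
    A minimal sketch of the PCA half of such a scheme: fit a model on normal operating data and compute the T² and SPE (squared prediction error) statistics for new observations. The synthetic data and the three-component choice are illustrative assumptions:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        X_normal = rng.standard_normal((100, 10))    # unfolded batch data
        mean, std = X_normal.mean(0), X_normal.std(0)
        pca = PCA(n_components=3).fit((X_normal - mean) / std)

        def monitor(x_new):
            """Return (T2, SPE) for one new observation."""
            x = (x_new - mean) / std
            scores = pca.transform(x[None, :])[0]
            t2 = np.sum(scores ** 2 / pca.explained_variance_)
            residual = x - pca.inverse_transform(scores[None, :])[0]
            spe = np.sum(residual ** 2)
            return t2, spe

        t2, spe = monitor(rng.standard_normal(10))
        # Alarms would compare t2 and spe against control limits fitted
        # on the normal data.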

  12. Taking account of human factors for interface assessment and design in monitoring automated systems

    International Nuclear Information System (INIS)

    An optimum balance between the control means and the operators' capacities is sought in order to achieve the computerization of man-machine interfaces. Observation of the diagnostic activity of populations of operators working on simulators enables design criteria to be defined which are well suited to the characteristics of the tasks with which the operators are confronted. This observation provides an assessment of the interfaces from the standpoint of the graphic layer, of the human behaviour induced by the machine, and of the nature of the interaction between these two systems. This requires an original approach dialectically involving cognitive psychology and the dynamic management of knowledge bases (artificial intelligence) in a critical industrial control and monitoring application. (author)

  13. Automated Inventory and Monitoring of the ALICE HLT Cluster Resources with the SysMES Framework

    International Nuclear Information System (INIS)

    The High-Level Trigger (HLT) cluster of the ALICE experiment is a computer cluster with about 200 nodes and 20 infrastructure machines. In its current state, the cluster consists of nearly 10 different configurations of nodes in terms of installed hardware, software and network structure. In such a heterogeneous environment with a distributed application, information about the actual configuration of the nodes is needed to distribute and adjust the application automatically. An inventory database provides a unified interface to such information. To be useful, the data in the inventory have to be up to date, complete and consistent. Manual maintenance of such databases is error-prone, and data tend to become outdated. The inventory module of the ALICE HLT cluster overcomes these drawbacks by automatically updating the actual state periodically and, in contrast to existing solutions, it allows the definition of a target state for each node. A target state can simply be a fully operational state, i.e. a state without malfunctions, or a dedicated configuration of the node. The target state is then compared to the actual state to detect deviations and malfunctions which could induce severe problems when running the application. The inventory module of the ALICE HLT cluster has been integrated into the monitoring and management framework SysMES in order to use existing functionality such as transactionality and the monitoring infrastructure. Additionally, SysMES allows detected problems to be solved automatically via its rule system. To describe the heterogeneous environment with all its specifics, such as custom hardware, the inventory module uses an object-oriented model based on the Common Information Model. The inventory module provides an automatically updated actual state of the cluster, detects discrepancies between the actual and the target state, and is able to solve detected problems automatically. This contribution presents the current implementation

  14. Automated Inventory and Monitoring of the ALICE HLT Cluster Resources with the SysMES Framework

    Science.gov (United States)

    Ulrich, J.; Lara, C.; Haaland, Ø.; Böttger, S.; Röhrich, D.; Kebschull, U.

    2012-12-01

    The High-Level Trigger (HLT) cluster of the ALICE experiment is a computer cluster with about 200 nodes and 20 infrastructure machines. In its current state, the cluster consists of nearly 10 different configurations of nodes in terms of installed hardware, software and network structure. In such a heterogeneous environment with a distributed application, information about the actual configuration of the nodes is needed to distribute and adjust the application automatically. An inventory database provides a unified interface to such information. To be useful, the data in the inventory have to be up to date, complete and consistent. Manual maintenance of such databases is error-prone, and data tend to become outdated. The inventory module of the ALICE HLT cluster overcomes these drawbacks by automatically updating the actual state periodically and, in contrast to existing solutions, it allows the definition of a target state for each node. A target state can simply be a fully operational state, i.e. a state without malfunctions, or a dedicated configuration of the node. The target state is then compared to the actual state to detect deviations and malfunctions which could induce severe problems when running the application. The inventory module of the ALICE HLT cluster has been integrated into the monitoring and management framework SysMES in order to use existing functionality such as transactionality and the monitoring infrastructure. Additionally, SysMES allows detected problems to be solved automatically via its rule system. To describe the heterogeneous environment with all its specifics, such as custom hardware, the inventory module uses an object-oriented model based on the Common Information Model. The inventory module provides an automatically updated actual state of the cluster, detects discrepancies between the actual and the target state, and is able to solve detected problems automatically. This contribution presents the current implementation

  15. Heterogeneous recurrence monitoring and control of nonlinear stochastic processes

    International Nuclear Information System (INIS)

    Recurrence is one of the most common phenomena in natural and engineering systems. Process monitoring of dynamic transitions in nonlinear and nonstationary systems is more concerned with aperiodic recurrences and recurrence variations. However, little has been done to investigate the heterogeneous recurrence variations and to link them with the objectives of process monitoring and anomaly detection. Notably, existing nonlinear recurrence methodologies are based on homogeneous recurrences, which treat all recurrence states in the same way, as black dots in recurrence plots, with non-recurrence shown in white. Heterogeneous recurrences are instead concerned with the variations of recurrence states in terms of state properties (e.g., values and relative locations) and the evolving dynamics (e.g., sequential state transitions). This paper presents a novel approach of heterogeneous recurrence analysis that utilizes a new fractal representation to delineate heterogeneous recurrence states in multiple scales, including the recurrences of both single states and multi-state sequences. Further, we developed a new set of heterogeneous recurrence quantifiers that are extracted from the fractal representation in the transformed space. Finally, we integrated multivariate statistical control charts with heterogeneous recurrence analysis to simultaneously monitor two or more related quantifiers. Experimental results on nonlinear stochastic processes show that the proposed approach not only captures heterogeneous recurrence patterns in the fractal representation but also effectively monitors the changes in the dynamics of a complex system.
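
    For readers unfamiliar with recurrence analysis, the following minimal sketch builds the homogeneous recurrence matrix that the paper generalizes; a heterogeneous analysis would additionally label each recurrence by state properties rather than treating all recurrences as identical dots (synthetic data and an illustrative threshold):

```python
import numpy as np

def recurrence_matrix(x, eps):
    """R[i, j] = 1 where states i and j are closer than eps."""
    d = np.abs(x[:, None] - x[None, :])    # pairwise state distances
    return (d <= eps).astype(int)

rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.standard_normal(200)
R = recurrence_matrix(x, eps=0.2)
print("recurrence rate:", R.mean())        # fraction of recurrent pairs
```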

  16. Cognitive processes in the Breakfast Task: Planning and monitoring.

    Science.gov (United States)

    Rose, Nathan S; Luo, Lin; Bialystok, Ellen; Hering, Alexandra; Lau, Karen; Craik, Fergus I M

    2015-09-01

    The Breakfast Task (Craik & Bialystok, 2006) is a computerized task that simulates the planning and monitoring requirements involved in cooking breakfast, an everyday activity important for functional independence. In Experiment 1, 28 adults performed the Breakfast Task, and outcome measures were examined with principal component analysis to elucidate the structure of cognitive processes underlying performance. Analyses revealed a 2-component structure which putatively captured global planning and local monitoring abilities. In Experiment 2, the structure of Breakfast Task performance was cross-validated on a new sample of 59 healthy older adults who also performed tests assessing working memory, processing speed, inhibition, reasoning and prospective memory. Factor analyses showed that the global planning component from the Breakfast Task was significantly correlated with individual differences in executive functions but the local monitoring component was independent of such functions. The Breakfast Task provides a fast, enjoyable, and lifelike assessment of complex everyday planning and monitoring, and their underlying processes such as working memory and executive functions. PMID:25938251

  17. A Permanent Automated Real-Time Passive Acoustic Monitoring System for Bottlenose Dolphin Conservation in the Mediterranean Sea

    Science.gov (United States)

    Brunoldi, Marco; Bozzini, Giorgio; Casale, Alessandra; Corvisiero, Pietro; Grosso, Daniele; Magnoli, Nicodemo; Alessi, Jessica; Bianchi, Carlo Nike; Mandich, Alberta; Morri, Carla; Povero, Paolo; Wurtz, Maurizio; Melchiorre, Christian; Viano, Gianni; Cappanera, Valentina; Fanciulli, Giorgio; Bei, Massimiliano; Stasi, Nicola; Taiuti, Mauro

    2016-01-01

    Within the framework of the EU Life+ project named LIFE09 NAT/IT/000190 ARION, a permanent automated real-time passive acoustic monitoring system for the improvement of the conservation status of the transient and resident population of bottlenose dolphin (Tursiops truncatus) has been implemented and installed in the Portofino Marine Protected Area (MPA), Ligurian Sea. The system is able to detect the simultaneous presence of dolphins and boats in the area and to give their position in real time. This information is used to prevent collisions by diffusing warning messages to all the categories involved (tourists, professional fishermen and so on). The system consists of two GPS-synchronized acoustic units, based on a particular type of marine buoy (elastic beacon), deployed about 1 km off the Portofino headland. Each one is equipped with a four-hydrophone array and an onboard acquisition system which can record the typical social communication whistles emitted by the dolphins and the sound emitted by boat engines. Signals are pre-filtered, digitized and then broadcast to the ground station via Wi-Fi. The raw data are elaborated to get the direction of the acoustic target to each unit, and hence the position of dolphins and boats in real time by triangulation. PMID:26789265
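
    The position-by-triangulation step can be illustrated with elementary geometry: each unit contributes a bearing ray from a known position, and the source lies at the intersection of the rays. A minimal sketch with invented buoy positions and angles:

```python
import numpy as np

def triangulate(p1, theta1, p2, theta2):
    """Intersect two bearing rays; angles in radians, east = 0, CCW."""
    u1 = np.array([np.cos(theta1), np.sin(theta1)])
    u2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*u1 = p2 + t2*u2 for (t1, t2).
    A = np.column_stack([u1, -u2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * u1

pos = triangulate((0.0, 0.0), np.deg2rad(45), (1000.0, 0.0), np.deg2rad(135))
print(pos)   # -> [500. 500.]
```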

  19. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    Science.gov (United States)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

    A fully automated optimization process is presented for the design of ducted propellers under open-water conditions, including 3D geometry modeling, meshing, optimization algorithms and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open-water tests were performed and proved that the optimized ducted propeller improves hydrodynamic performance as predicted.
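
    The structure of such a design loop, reduced to its skeleton: an optimizer proposes geometry parameters, and each evaluation runs the geometry/mesh/solver chain. In this sketch the CFD call is a stand-in analytic function (not a RANSE run), so only the loop architecture is real; the parameter names are invented:

```python
from scipy.optimize import minimize

def evaluate_design(params):
    pitch, camber = params
    # Placeholder for: build 3D geometry -> mesh -> RANSE solve -> objective.
    # Returns a pseudo "1 - efficiency" so that lower is better.
    return (pitch - 1.1) ** 2 + (camber - 0.04) ** 2

result = minimize(evaluate_design, x0=[1.0, 0.02], method="Nelder-Mead")
print("optimum parameters:", result.x)
```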

  20. Lunar surface mining for automated acquisition of helium-3: Methods, processes, and equipment

    Science.gov (United States)

    Li, Y. T.; Wittenberg, L. J.

    1992-01-01

    In this paper, several techniques considered for mining and processing the regolith on the lunar surface are presented. These techniques have been proposed and evaluated based primarily on the following criteria: (1) mining operations should be relatively simple; (2) procedures of mineral processing should be few and relatively easy; (3) transferring tonnages of regolith on the Moon should be minimized; (4) operations outside the lunar base should be readily automated; (5) all equipment should be maintainable; and (6) economic benefit should be sufficient for commercial exploitation. The economic benefits are not addressed in this paper; however, the energy benefits have been estimated to be between 250 and 350 times the mining energy. A mobile mining scheme is proposed that meets most of the mining objectives. This concept uses a bucket-wheel excavator for excavating the regolith, several mechanical electrostatic separators for beneficiation of the regolith, a fast-moving fluidized bed reactor to heat the particles, and a palladium diffuser to separate H2 from the other solar wind gases. At the final stage of the miner, the regolith 'tailings' are deposited directly into the ditch behind the miner and cylinders of the valuable solar wind gases are transported to a central gas processing facility. During the production of He-3, large quantities of valuable H2, H2O, CO, CO2, and N2 are produced for utilization at the lunar base. For larger production of He-3, the utilization of multiple miners is recommended rather than increasing their size. Multiple miners permit operations at more sites and provide redundancy in case of equipment failure.

  1. NeuronMetrics: Software for Semi-Automated Processing of Cultured-Neuron Images

    Science.gov (United States)

    Narro, Martha L.; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L.

    2007-01-01

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics™ for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch-number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of ~60 2D images is 1.0–2.5 hours, from a folder of images to a table of numeric data. NeuronMetrics’ output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery. PMID:17270152
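
    One of the listed parameters, territory, is simply the area of the convex polygon bounding the skeleton and cell body; with 2D skeleton coordinates in hand it is a one-call computation. A minimal sketch with synthetic points (this is not NeuronMetrics code, which is written as ImageJ modules):

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(1)
skeleton_xy = rng.random((200, 2)) * [120.0, 80.0]   # stand-in pixel coords
territory = ConvexHull(skeleton_xy).volume           # in 2D, .volume is area
print(f"territory: {territory:.1f} square pixels")
```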

  2. Signal and image processing for monitoring and testing at EDF

    International Nuclear Information System (INIS)

    The quality of monitoring and non destructive testing devices in plants and utilities today greatly depends on the efficient processing of signal and image data. In this context, signal or image processing techniques, such as adaptive filtering or detection or 3D reconstruction, are required whenever manufacturing nonconformances or faulty operation have to be recognized and identified. This paper reviews the issues of industrial image and signal processing, by briefly considering the relevant studies and projects under way at EDF. (authors). 1 fig., 11 refs

  3. Quantitative and Qualitative Analysis of Aconitum Alkaloids in Raw and Processed Chuanwu and Caowu by HPLC in Combination with Automated Analytical System and ESI/MS/MS

    OpenAIRE

    Aimin Sun; Bo Gao; Xueqing Ding; Chi-Ming Huang; Paul Pui-Hay But

    2012-01-01

    HPLC in combination with automated analytical system and ESI/MS/MS was used to analyze aconitine (A), mesaconitine (MA), hypaconitine (HA), and their benzoyl analogs in the Chinese herbs Caowu and Chuanwu. First, an HPLC method was developed and validated to determine A, MA, and HA in raw and processed Caowu and Chuanwu. Then an automated analytical system and ESI/MS/MS were applied to analyze these alkaloids and their semihydrolyzed products. The results obtained from automated analytical sy...

  4. Methodology on Investigating the Influences of Automated Material Handling System in Automotive Assembly Process

    Science.gov (United States)

    Saffar, Seha; Azni Jafar, Fairul; Jamaludin, Zamberi

    2016-02-01

    A case study was selected as the method for collecting data from an actual industrial situation. The study aimed to assess the influences of an automated material handling system in the automotive industry by proposing a new design of the integration system through simulation, and by analyzing the significant effects and influence of the system. The modelling tools used are CAD software (Delmia & Quest). The preliminary data gathering in phase 1 collects all related data from the actual industrial situation; it is expected to produce guidelines and limitations for designing the new integration system later. In phase 2, a design concept is developed using 10 principles of design consideration for manufacturing. A full factorial design is used as the design of experiments in order to analyze the measured performance of the integration system against the current system in the case study. From the results of the experiment, an ANOVA analysis is performed to study the measured performance. Thus, the influences of the improvements made to the system are expected to become apparent.
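
    A minimal sketch of the design-of-experiments and analysis pair mentioned above, with invented factors, levels, and throughput numbers (the study's actual factors and responses are not specified here):

```python
from itertools import product
from scipy.stats import f_oneway

levels = {"conveyor_speed": [0.5, 1.0], "buffer_size": [5, 10]}
runs = list(product(*levels.values()))              # 2x2 full factorial
throughput = {(0.5, 5): 92, (0.5, 10): 97,          # measured response
              (1.0, 5): 105, (1.0, 10): 111}

slow = [throughput[r] for r in runs if r[0] == 0.5]
fast = [throughput[r] for r in runs if r[0] == 1.0]
print(f_oneway(slow, fast))   # one-way ANOVA: effect of conveyor speed
```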

  5. A Multi-Scale Flood Monitoring System Based on Fully Automatic MODIS and TerraSAR-X Processing Chains

    Directory of Open Access Journals (Sweden)

    Enrico Stein

    2013-10-01

    Full Text Available A two-component fully automated flood monitoring system is described and evaluated. This is a result of combining two individual flood services that are currently under development at DLR’s (German Aerospace Center) Center for Satellite-based Crisis Information (ZKI) to rapidly support disaster management activities. A first-phase monitoring component of the system systematically detects potential flood events on a continental scale using daily-acquired medium spatial resolution optical data from the Moderate Resolution Imaging Spectroradiometer (MODIS). A threshold set controls the activation of the second-phase crisis component of the system, which derives flood information at higher spatial detail using a Synthetic Aperture Radar (SAR) based satellite mission (TerraSAR-X). The proposed activation procedure finds use in the identification of flood situations at different spatial resolutions and in the time-critical and on-demand programming of SAR satellite acquisitions at an early stage of an evolving flood situation. The automated processing chains of the MODIS (MFS) and the TerraSAR-X Flood Service (TFS) include data pre-processing, the computation and adaptation of global auxiliary data, thematic classification, and the subsequent dissemination of flood maps using an interactive web-client. The system is operationally demonstrated and evaluated via the monitoring of two recent flood events, in Russia in 2013 and in Albania/Montenegro in 2013.
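
    The activation logic between the two phases reduces to a threshold comparison per monitored region. A minimal sketch with invented region names and threshold values (the operational threshold set is more elaborate):

```python
# Flooded-area thresholds that trigger tasking of the SAR component.
thresholds_km2 = {"region_A": 50.0, "region_B": 120.0}

def should_task_sar(region: str, flooded_area_km2: float) -> bool:
    """Activate the crisis component when the MODIS estimate exceeds the set."""
    return flooded_area_km2 >= thresholds_km2[region]

if should_task_sar("region_A", 74.5):
    print("activate TerraSAR-X acquisition request for region_A")
```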

  6. Implementation of automated, on-line fatigue monitoring in a boiling water reactor

    International Nuclear Information System (INIS)

    A workstation-based, on-line fatigue monitoring system for tracking fatigue usage applied to a Japanese operating boiling water reactor (BWR), Tsuruga Unit 1, is described. The system uses the influence function approach and rainflow cycle counting methodology, operates on a workstation computer, and determines component stresses using temperature, pressure, and flow rate data that are made available via signal taps from previously existing plant sensors. Using plant-unique influence functions developed specifically for the feedwater nozzle location, the system calculates stresses as a function of time and computes the fatigue usage. The analysis method used to compute fatigue usage complies with MITI Code Notification No. 501. Fatigue values are saved automatically on files at times defined by the user for use at a later time. Of particular note, this paper describes some of the details involved with implementing such a system from the utility perspective. Utility installation details, as well as why such a system was chosen for implementation, are presented. Fatigue results for an entire fuel cycle are presented and compared to assumed design basis events to confirm that actual plant thermal duty is significantly less severe than originally estimated in the design basis stress report. Although the system is specifically set up to address fatigue duty for the feedwater nozzle location, a generic shell structure was implemented so that any other components could be added at a future time without software modifications. As a result, the system provides the technical basis to more accurately evaluate actual reactor conditions as well as the justification for plant life extension
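
    Once rainflow counting has reduced the stress history to cycle counts, the cumulative usage factor follows Miner's rule, U = sum(n_i / N_i), with N_i the allowable cycles at each stress range taken from a design fatigue curve. A minimal sketch with illustrative numbers (not the plant's actual fatigue curve):

```python
# Rainflow output: stress range (MPa) -> counted cycles n_i
counted_cycles = {200.0: 1200, 350.0: 40, 500.0: 3}
# Design fatigue curve: stress range (MPa) -> allowable cycles N_i
allowable = {200.0: 1e6, 350.0: 2e4, 500.0: 1.5e3}

usage = sum(n / allowable[s] for s, n in counted_cycles.items())
print(f"cumulative fatigue usage factor: {usage:.4f}")   # design limit: 1.0
```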

  7. Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments

    Directory of Open Access Journals (Sweden)

    Lisa M. Chung

    2014-06-01

    Full Text Available Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially data pre-processing, where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre-processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets.
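
    Two of the pre-processing steps named above, normalization and outlier detection, can be sketched in a few lines; the synthetic intensities and robust z-score cutoff below are illustrative, not the paper's pipeline:

```python
import numpy as np

# transitions x replicate runs (synthetic peak areas)
intensities = np.array([[1.00e5, 2.10e5, 0.98e5],
                        [5.20e4, 1.10e5, 0.50e5],
                        [8.00e3, 1.60e4, 0.90e4],
                        [2.50e4, 5.60e4, 2.40e4]])

log_i = np.log2(intensities)
normalized = log_i - np.median(log_i, axis=0)        # align run medians

med = np.median(normalized, axis=1, keepdims=True)
mad = np.median(np.abs(normalized - med), axis=1, keepdims=True)
mad = np.where(mad > 0, mad, 1.0)                    # guard degenerate rows
robust_z = 0.6745 * (normalized - med) / mad
print("flagged (transition, run):", np.argwhere(np.abs(robust_z) > 3.5))
```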

  8. Integratable high temperature ultrasonic transducers for NDT of metals and industrial process monitoring

    International Nuclear Information System (INIS)

    Thick (> 40 μm) piezoelectric ceramic films have been successfully deposited on metallic substrates by a sol-gel spray technique as high-temperature ultrasonic transducers (HTUTs). Our novel approach focuses on the fabrication techniques of these HTUTs at the test site with handheld equipment and no furnaces. These HTUTs can be integrated onto large metallic structures such as pipes and molds for real-time, on-line automated NDT and process monitoring at the sensor location. The characteristics of these ultrasonic transducers are that they (1) can be fabricated directly onto the desired planar or curved metallic substrate, such as a large pipe, at the NDT site; (2) do not need couplant; (3) can operate in the pulse/echo mode with a signal-to-noise ratio of more than 30 dB at 10 MHz; and (4) can operate at more than 400 °C. These HTUTs can be made onto thin metallic membranes as flexible transducers that can be wrapped around samples with cylindrical surfaces for NDT applications. The capability of these thick-film UTs for NDT applications at temperatures up to 440 °C and for real-time, non-intrusive and nondestructive process monitoring of polymer injection molding has been demonstrated. (author)

  9. Grinding process monitoring based on electromechanical impedance measurements

    Science.gov (United States)

    Marchi, Marcelo; Guimarães Baptista, Fabricio; de Aguiar, Paulo Roberto; Bianchi, Eduardo Carlos

    2015-04-01

    Grinding is considered one of the last processes in precision parts manufacturing, which makes it indispensable to have a reliable monitoring system to evaluate workpiece surface integrity. This paper proposes the use of the electromechanical impedance (EMI) method to monitor the surface grinding operation in real time, particularly the surface integrity of the ground workpiece. The EMI method stands out for its simplicity and for using low-cost components such as PZT (lead zirconate titanate) piezoelectric transducers. In order to assess the feasibility of applying the EMI method to the grinding process, experimental tests were performed on a surface grinder using a CBN grinding wheel and a SAE 1020 steel workpiece, with PZT transducers mounted on the workpiece and its holder. During the grinding process, the electrical impedance of the transducers was measured and damage indices conventionally used in the EMI method were calculated and compared with workpiece wear, indicating the surface condition of the workpiece. The experimental results indicate that the EMI method can be an efficient and cost-effective alternative for monitoring precision workpieces during the surface grinding process.
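
    The damage indices conventionally used with the EMI method include the root-mean-square deviation (RMSD) between a baseline impedance signature and the current one over the monitored band. A minimal sketch with synthetic signatures (the frequency band and numbers are invented):

```python
import numpy as np

def rmsd_index(z0, z1):
    """RMSD damage index between baseline z0 and current z1 signatures."""
    return np.sqrt(np.sum((np.real(z1) - np.real(z0)) ** 2)
                   / np.sum(np.real(z0) ** 2))

freq = np.linspace(50e3, 150e3, 400)       # Hz, assumed monitoring band
z0 = 100 + 10 * np.sin(freq / 7e3)         # baseline PZT signature
z1 = z0 + 1.5 * np.sin(freq / 3e3)         # signature shifted by wear
print(f"RMSD index: {rmsd_index(z0, z1):.4f}")
```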

  10. Rapid Automated Treatment Planning Process to Select Breast Cancer Patients for Active Breathing Control to Achieve Cardiac Dose Reduction

    International Nuclear Information System (INIS)

    Purpose: To evaluate a rapid automated treatment planning process for the selection of patients with left-sided breast cancer for a moderate deep inspiration breath-hold (mDIBH) technique using active breathing control (ABC); and to determine the dose reduction to the left anterior descending coronary artery (LAD) and the heart using mDIBH. Methods and Materials: Treatment plans were generated using an automated method for patients undergoing left-sided breast radiotherapy (n = 53) with two-field tangential intensity-modulated radiotherapy. All patients with unfavorable cardiac anatomy, defined as having >10 cm3 of the heart receiving 50% of the prescribed dose (V50) on the free-breathing automated treatment plan, underwent repeat scanning on a protocol using the mDIBH technique and ABC. The doses to the LAD and heart were compared between the free-breathing and mDIBH plans. Results: The automated planning process required approximately 9 min to generate a breast intensity-modulated radiotherapy plan. Using the dose-volume criteria, 20 of the 53 patients were selected for ABC. Significant differences were found between the free-breathing and mDIBH plans for the heart V50 (29.9 vs. 3.7 cm3), mean heart dose (317 vs. 132 cGy), mean LAD dose (2,047 vs. 594 cGy), and maximal dose to 0.2 cm3 of the LAD (4,155 vs. 1,507 cGy; all differences statistically significant). Most of the selected patients achieved the intended reduction in heart V50 using the mDIBH technique; the 3 patients with a low breath-hold threshold did not achieve the target heart V50. Conclusions: A rapid automated treatment planning process can be used to select patients who will benefit most from mDIBH. For selected patients with unfavorable cardiac anatomy, the mDIBH technique using ABC can significantly reduce the dose to the LAD and heart, potentially reducing the cardiac risks.

  11. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    Science.gov (United States)

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. The throughput is limited, both in terms of sample preparation as well as analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards increasing critically needed throughput. PMID:26520383

  12. Automated continuous monitoring of inorganic and total mercury in wastewater and other waters by flow-injection analysis and cold-vapour atomic absorption spectrometry

    OpenAIRE

    Birnie, S. E.

    1988-01-01

    An automated continuous monitoring system for the determination of inorganic and total mercury by flow-injection analysis followed by cold-vapour atomic absorption spectrometry is described. The method uses a typical flow-injection manifold where digestion and reduction of the injected sample takes place. Mercury is removed by aeration from the flowing stream in a specially designed air-liquid separator and swept into a silica cell for absorption measurement at a wavelength of 253.7 nm. A cal...

  13. Wind-driven desertification: Process modeling, remote monitoring, and forecasting

    Science.gov (United States)

    Okin, Gregory Stewart

    Arid and semiarid landscapes comprise nearly a third of the Earth's total land surface. These areas are coming under increasing land use pressures. Despite their low productivity these lands are not barren. Rather, they consist of fragile ecosystems vulnerable to anthropogenic disturbance. The purpose of this thesis is threefold: (I) to develop and test a process model of wind-driven desertification, (II) to evaluate next-generation process-relevant remote monitoring strategies for use in arid and semiarid regions, and (III) to identify elements for effective management of the world's drylands. In developing the process model of wind-driven desertification in arid and semiarid lands, field, remote sensing, and modeling observations from a degraded Mojave Desert shrubland are used. This model focuses on aeolian removal and transport of dust, sand, and litter as the primary mechanisms of degradation: killing plants by burial and abrasion, interrupting natural processes of nutrient accumulation, and allowing the loss of soil resources by abiotic transport. This model is tested in field sampling experiments at two sites and is extended by Fourier Transform and geostatistical analysis of high-resolution imagery from one site. Next, the use of hyperspectral remote sensing data is evaluated as a substantive input to dryland remote monitoring strategies. In particular, the efficacy of spectral mixture analysis (SMA) in discriminating vegetation and soil types and determining vegetation cover is investigated. The results indicate that hyperspectral data may be less useful than often thought in determining vegetation parameters. Its usefulness in determining soil parameters, however, may be leveraged by developing simple multispectral classification tools that can be used to monitor desertification. Finally, the elements required for effective monitoring and management of arid and semiarid lands are discussed. Several large-scale multi-site field experiments are proposed to

  14. Analysis of the Optimal Duration of Behavioral Observations Based on an Automated Continuous Monitoring System in Tree Swallows (Tachycineta bicolor): Is One Hour Good Enough?

    Directory of Open Access Journals (Sweden)

    Ádám Z Lendvai

    Full Text Available Studies of animal behavior often rely on human observation, which introduces a number of limitations on sampling. Recent developments in automated logging of behaviors make it possible to circumvent some of these problems. Once verified for efficacy and accuracy, these automated systems can be used to determine optimal sampling regimes for behavioral studies. Here, we used a radio-frequency identification (RFID) system to quantify parental effort in a bi-parental songbird species: the tree swallow (Tachycineta bicolor). We found that the accuracy of the RFID monitoring system was similar to that of video-recorded behavioral observations for quantifying parental visits. Using RFID monitoring, we also quantified the optimum duration of sampling periods for male and female parental effort by looking at the relationship between nest visit rates estimated from sampling periods with different durations and the total visit numbers for the day. The optimum sampling duration (the shortest observation time that explained the most variation in total daily visits per unit time) was 1 h for both sexes. These results show that RFID and other automated technologies can be used to quantify behavior when human observation is constrained, and the information from these monitoring technologies can be useful for evaluating the efficacy of human observation methods.
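
    The optimum-duration analysis can be mimicked as follows: estimate visit rates from observation windows of increasing length and ask how much of the variance in total daily visits each window explains. A minimal sketch with synthetic Poisson visit counts rather than the study's RFID records:

```python
import numpy as np

rng = np.random.default_rng(2)
# 40 nests x 12 hourly visit counts, each nest with its own mean rate
visits = rng.poisson(lam=rng.uniform(2, 10, size=40)[:, None], size=(40, 12))

daily_total = visits.sum(axis=1)
for hours in (1, 2, 4):
    rate = visits[:, :hours].mean(axis=1)    # visits/h from a short window
    r = np.corrcoef(rate, daily_total)[0, 1]
    print(f"{hours} h window: R^2 = {r**2:.2f}")
```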

  15. Signal processing methodologies for an acoustic fetal heart rate monitor

    Science.gov (United States)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    Research and development of real-time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor is presented. A linear predictor algorithm is utilized for detection of the heart tone event, and additional processing derives the heart rate. The linear predictor is adaptively 'trained' in a least-mean-square-error sense on generic fetal heart tones recorded from patients. A real-time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. The comparative data provide favorable indications of the feasibility of the acoustic monitor for clinical use.
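
    A minimal sketch of the adaptive front end described above: a linear predictor updated by the least-mean-squares (LMS) rule, whose prediction error falls as it learns the signal. A synthetic tone stands in for recorded heart tones, and the filter order and step size are illustrative:

```python
import numpy as np

def lms_predict(x, order=8, mu=0.01):
    """Run an LMS-adapted linear predictor; return the prediction error."""
    w = np.zeros(order)
    err = np.zeros(len(x))
    for n in range(order, len(x)):
        window = x[n - order:n][::-1]        # most recent sample first
        err[n] = x[n] - w @ window           # prediction error
        w += 2 * mu * err[n] * window        # LMS weight update
    return err

t = np.arange(4000) / 1000.0
x = np.sin(2 * np.pi * 35 * t)               # stand-in "heart tone"
e = lms_predict(x)
print("error power, early vs late:",
      np.mean(e[8:1000] ** 2), np.mean(e[3000:] ** 2))
```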

  16. Neural network training by Kalman filtering in process system monitoring

    International Nuclear Information System (INIS)

    A Kalman filtering approach for neural network training is described. Its extended form is used as an adaptive filter in a nonlinear environment, here a feedforward neural network. The Kalman filtering approach generally provides fast training as well as avoiding excessive learning, which results in enhanced generalization capability. The network is used in a process monitoring application where the inputs are measurement signals. Since the measurement errors are also modelled in the Kalman filter, the approach yields accurate training, the implication being an accurate neural network model representing the input and output relationships in the application. As the process of concern is a dynamic system, the input source of information to the neural network is time dependent, so the training algorithm takes an adaptive form for real-time operation of the monitoring task. (orig.)
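
    The predict/update cycle underlying such training is easiest to see in the scalar case; an extended Kalman filter for network training applies the same cycle to the weight vector with linearized measurements. A minimal scalar sketch with invented noise parameters:

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.04):
    """Filter a noisy scalar signal modelled as a random walk."""
    x, p = 0.0, 1.0                   # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                        # predict: random-walk state
        k = p / (p + r)               # Kalman gain
        x += k * (z - x)              # update with measurement residual
        p *= (1 - k)
        estimates.append(x)
    return estimates

rng = np.random.default_rng(5)
z = 1.0 + 0.2 * rng.standard_normal(100)   # noisy sensor around 1.0
print("final estimate:", kalman_1d(z)[-1])
```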

  17. A new versatile in-process monitoring system for milling

    CERN Document Server

    Ritou, Mathieu; Furet, Benoît; Hascoët, Jean-Yves

    2013-01-01

    Tool condition monitoring (TCM) systems can improve productivity and ensure workpiece quality, yet, there is a lack of reliable TCM solutions for small-batch or one-off manufacturing of industrial parts. TCM methods which include the characteristics of the cut seem to be particularly suitable for these demanding applications. In the first section of this paper, three process-based indicators have been retrieved from literature dealing with TCM. They are analysed using a cutting force model and experiments are carried out in industrial conditions. Specific transient cuttings encountered during the machining of the test part reveal the indicators to be unreliable. Consequently, in the second section, a versatile in-process monitoring method is suggested. Based on experiments carried out under a range of different cutting conditions, an adequate indicator is proposed: the relative radial eccentricity of the cutters is estimated at each instant and characterizes the tool state. It is then compared with the previo...

  18. AE Monitoring and Analysis of HVOF Thermal Spraying Process

    Science.gov (United States)

    Faisal, N. H.; Ahmed, R.; Reuben, R. L.; Allcock, B.

    2011-09-01

    This work presents in situ monitoring of the HVOF thermal spraying process through an acoustic emission (AE) technique in an industrial coating chamber. Single-layer thermal spraying onto the substrate was carried out through slits. Continuous multilayer thermal spraying onto the sample without a slit was also conducted. The AE was measured using a broadband piezoelectric AE sensor positioned on the back of the substrate. A mathematical model has been developed to determine the total kinetic energy of particles impacting the substrate through slits. Results of this work demonstrate that the AE associated with particle impacts can be used for in situ monitoring of the coating process. Results also show that the amplitude and AE energy are related to the spray gun transverse speed and the oxy-fuel pressure. The measured AE energy was found to vary with the number of particles impacting the substrate, determined using the mathematical model.

  19. Multivariate Statistical Process Monitoring Using Robust Nonlinear Principal Component Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHAO Shijian; XU Yongmao

    2005-01-01

    The principal component analysis (PCA) algorithm is widely applied in a diverse range of fields for performance assessment, fault detection, and diagnosis. However, in the presence of noise and gross errors, nonlinear PCA (NLPCA) using autoassociative bottleneck neural networks is so sensitive that the obtained model may differ significantly from the underlying system. In this paper, a robust version of NLPCA is introduced by replacing the generally used mean-squared-error criterion with a mean log squared error. This is followed by a concise analysis of the corresponding training method. A novel multivariate statistical process monitoring (MSPM) scheme incorporating the proposed robust NLPCA technique is then investigated and its efficiency is assessed through application to an industrial fluidized catalytic cracking plant. The results demonstrate that, compared with NLPCA, the proposed approach can effectively reduce the number of false alarms and is, hence, expected to better monitor real-world processes.
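
    The idea behind the robust criterion can be illustrated by comparing how a quadratic loss and a log-based loss react to one gross error; note that the exact functional form of the paper's mean log squared error may differ from the Cauchy-type loss used here purely for illustration:

```python
import numpy as np

def mse(e):
    return np.mean(e ** 2)

def log_loss(e):
    # Cauchy-type robust criterion; grows only logarithmically for gross errors
    return np.mean(np.log(1.0 + e ** 2))

e_clean = np.array([0.1, -0.2, 0.15, 0.05])
e_gross = np.array([0.1, -0.2, 0.15, 8.0])    # one gross error
print("MSE     :", mse(e_clean), "->", mse(e_gross))
print("log loss:", log_loss(e_clean), "->", log_loss(e_gross))
```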

  20. A new versatile in-process monitoring system for milling

    OpenAIRE

    Ritou, Mathieu; Garnier, Sébastien; FURET, Benoît; Hascoët, Jean-Yves

    2013-01-01

    Tool condition monitoring (TCM) systems can improve productivity and ensure workpiece quality, yet, there is a lack of reliable TCM solutions for small-batch or one-off manufacturing of industrial parts. TCM methods which include the characteristics of the cut seem to be particularly suitable for these demanding applications. In the first section of this paper, three process-based indicators have been retrieved from literature dealing with TCM. They are analysed using a cutting force model and experiments are carried out in industrial conditions...

  1. Real-Time Monitoring of Psychotherapeutic Processes: Concept and Compliance

    OpenAIRE

    Schiepek, Günter; Aichhorn, Wolfgang; Gruber, Martin; Strunk, Guido; Bachler, Egon; Aas, Benjamin

    2016-01-01

    Objective: The feasibility of a high-frequency real-time monitoring approach to psychotherapy is outlined and tested for patients' compliance to evaluate its integration into everyday practice. Criteria concern the ecological momentary assessment, the assessment of therapy-related cognitions and emotions, equidistant time sampling, real-time nonlinear time series analysis, continuous participative process control by client and therapist, and the application of idiographic (person-specific) surveys.

  2. Statistical process control methods for expert system performance monitoring.

    OpenAIRE

    Kahn, M. G.; Bailey, T C; Steib, S. A.; Fraser, V J; Dunagan, W C

    1996-01-01

    The literature on the performance evaluation of medical expert system is extensive, yet most of the techniques used in the early stages of system development are inappropriate for deployed expert systems. Because extensive clinical and informatics expertise and resources are required to perform evaluations, efficient yet effective methods of monitoring performance during the long-term maintenance phase of the expert system life cycle must be devised. Statistical process control techniques pro...

  3. System and process for pulsed multiple reaction monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Belov, Mikhail E

    2013-05-17

    A new pulsed multiple reaction monitoring process and system are disclosed that use a pulsed ion injection mode in conjunction with triple-quadrupole instruments. The pulsed injection mode approach reduces background ion noise at the detector, increases the amplitude of the ion signal, and includes a unity duty cycle that provides a significant sensitivity increase for reliable quantitation of proteins/peptides present at attomole levels in highly complex biological mixtures.

  4. A Generic Framework for Systematic Design of Process Monitoring and Control System for Crystallization Processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Meisler, Kresten Troelstrup; Sin, Gürkan;

    2012-01-01

    A generic framework for systematic design of a process monitoring and control system for crystallization processes has been developed in order to obtain the desired end-product properties, notably the crystal size distribution (CSD). The design framework contains a generic crystallizer modelling tool-box, a tool for design of operational policies, as well as a tool for design of process monitoring and control systems. Through this framework, it is possible for a wide range of crystallization processes to generate the necessary problem-system specific model, the necessary operational policy, and a Process Analytical Technology (PAT) system design including implementation of monitoring tools and control strategies in order to produce a desired product with its corresponding target properties. Application of the framework is highlighted through a case study involving the system potassium...

  5. Real-Time Monitoring of Psychotherapeutic Processes: Concept and Compliance

    Directory of Open Access Journals (Sweden)

    Guenter Karl Schiepek

    2016-05-01

    Full Text Available Abstract Objective. The feasibility of a high-frequency real-time monitoring approach to psychotherapy is outlined and tested for patients’ compliance to evaluate its integration into everyday practice. Criteria concern the ecological momentary assessment, the assessment of therapy-related cognitions and emotions, equidistant time sampling, real-time nonlinear time series analysis, continuous participative process control by client and therapist, and the application of idiographic (person-specific) surveys. Methods. The process-outcome monitoring is technically realized by an internet-based device for data collection and data analysis, the Synergetic Navigation System. Its feasibility is documented by a compliance study on 151 clients treated in an inpatient and a day-treatment clinic. Results. We found high compliance rates (mean: 78.3%, median: 89.4%) amongst the respondents, independent of the severity of symptoms or the degree of impairment. Compared to other diagnoses, the compliance rate was lower in the group diagnosed with personality disorders. Conclusion. The results support the feasibility of high-frequency monitoring in routine psychotherapy settings. Daily collection of psychological surveys allows for assessment of highly resolved, equidistant time series data which gives insight into the nonlinear qualities of therapeutic change processes (e.g., pattern transitions, critical instabilities).

  6. Ultrasonic monitoring of material processing using clad buffer rod sensors

    Science.gov (United States)

    Ramos Franca, Demartonne

    Ultrasonic sensors and techniques are developed for in-line monitoring of polymer extrusion, cleanliness of molten metals and liquid flow speed at elevated temperature. Pulse-echo mode is used for the first two processes, while the through-transmission mode is applied in the third one. The ultrasonic probe consists of high-performance clad buffer rods with different dimensions to thermally isolate the commercial ultrasonic transducer from materials at high temperature. The clad buffer rods are made of steel, polymer and ceramic. Steel clad buffer rods are introduced for in-line monitoring of polymer extrusion processes. Owing to its superior performance in pulse-echo mode, for the first time such a probe is installed and performs ultrasonic monitoring in the die of a co-extrusion machine and in the barrel section of a twin-screw extruder. It can reveal a variety of information relevant to process parameters, such as polymer layer thickness, interface location and adhesion quality, stability, or polymer composition change. For the ultrasonic monitoring of polymer processes, probes with an acoustic impedance that matches that of the processed polymer may offer certain advantages such as quantitative viscoelastic evaluation; thus high-temperature polymer clad buffer rods, in particular PEEK, are developed. It is demonstrated that this new probe exhibits unique advantages for in-line monitoring of the cure of epoxies and the polymer extrusion process. Long steel clad buffer rods with a spherical focus lens machined at the probing end are proposed for cleanliness evaluation of molten metals. The potential of this focusing probe is demonstrated by means of high-resolution imaging and particle detection in molten zinc at temperatures higher than 600°C, using a single probe operated in pulse-echo mode. A contrapropagating ultrasonic flowmeter employing steel clad buffer rods is devised to operate at high temperature. It is demonstrated that these rods guide ultrasonic signals

  7. Implementation of automated macro after develop inspection in a production lithography process

    Science.gov (United States)

    Yanof, Arnold W.; Plachecki, Vincent E.; Fischer, Frank W.; Cusacovich, Marcelo; Nelson, Chris; Merrill, Mark A.

    2000-06-01

    ... impossibility of accurate classification and recording of defect types, locations, and layer of occurrence. In this paper, we discuss a pilot implementation of an automated macro inspection system at Motorola, Inc., which has enabled the early detection and containment of significant photolithography defects. We show a variety of different types of defects that have been effectively detected and identified by this system during production usage. We introduce a methodology for determining the automated tool's ability to discriminate between the defect signal and process noise. We indicate the potential for defect database analysis, and identification of maverick product. Based upon the pilot experience, we discuss the parameters of a cost/benefit analysis of full implementation. The costs involve tool cost, additional wafer dispositions, and the engineering costs of recipe management. The most tangible measurable benefit is the saved revenue of scrapped wafers. An analysis of risk also shows a major reduction due to improved detection, as well as reduced occurrence because of better containment. This reduction of risk extends both to the customer -- in terms of field failures, OTD, and maverick product -- and to the production facility -- in terms of major scrap incidents, forced inking at probe, redo, and containment.

  8. Analysis of dip coating processing parameters by double optical monitoring.

    Science.gov (United States)

    Horowitz, Flavio; Michels, Alexandre F

    2008-05-01

    Double optical monitoring is applied to determine the influence of the main process parameters on the formation of sulfated zirconia and self-assembled mesoporous silica sol-gel films by dip coating. In addition, we analyze, for the first time to the best of our knowledge, the influence of withdrawal speed, temperature, and relative humidity on refractive-index and physical thickness variations (uncertainties of +/-0.005 and +/-7 nm) during the process. Results provide insight into the controlled production of single and multilayer films from complex fluids by dip coating. PMID:18449244

  9. Imaging 3D strain field monitoring during hydraulic fracturing processes

    Science.gov (United States)

    Chen, Rongzhang; Zaghloul, Mohamed A. S.; Yan, Aidong; Li, Shuo; Lu, Guanyi; Ames, Brandon C.; Zolfaghari, Navid; Bunger, Andrew P.; Li, Ming-Jun; Chen, Kevin P.

    2016-05-01

    In this paper, we present a distributed fiber optic sensing scheme to study 3D strain fields inside concrete cubes during the hydraulic fracturing process. Optical fibers embedded in the concrete were used to monitor the 3D strain field build-up under external hydraulic pressures. High spatial resolution strain fields were interrogated by the in-fiber Rayleigh backscattering with 1-cm spatial resolution using optical frequency domain reflectometry. The fiber optic sensor scheme presented in this paper provides scientists and engineers with a unique laboratory tool for understanding the hydraulic fracturing processes in various rock formations and their environmental impacts.

  10. New Tool To Monitor Biofilm Growth in Industrial Process Waters

    OpenAIRE

    Blanco Suárez, Ángeles; Torres, Esperanza; de la Fuente González, Elena; Negro Álvarez, Carlos

    2011-01-01

    A new online methodology based on a continuous process video microscopy and image analysis has been developed to study the effects of enzymes on the formation of biofilm. This research consists of two parts: (1) the monitoring of the growth of a biofilm formed with the axenic culture isolated from the process waters of a recycling paper mill, aiming at determining the most appropriate way to quantify the biofilm growth from the obtained images; and (2) the study of the effects of three new en...

  11. Safety. [requirements for software to monitor and control critical processes

    Science.gov (United States)

    Leveson, Nancy G.

    1991-01-01

    Software requirements, design, implementation, verification and validation, and especially management are affected by the need to produce safe software. This paper discusses the changes in the software life cycle that are necessary to ensure that software will execute without resulting in unacceptable risk. Software is being used increasingly to monitor and control safety-critical processes in which a run-time failure or error could result in unacceptable losses such as death, injury, loss of property, or environmental harm. Examples of such processes may be found in transportation, energy, aerospace, basic industry, medicine, and defense systems.

  12. A Tool for the Automated Collection of Space Utilization Data: Three Dimensional Space Utilization Monitor

    Science.gov (United States)

    Vos, Gordon A.; Fink, Patrick; Ngo, Phong H.; Morency, Richard; Simon, Cory; Williams, Robert E.; Perez, Lance C.

    2015-01-01

    The Space Human Factors and Habitability (SHFH) Element within the Human Research Program (HRP), in collaboration with the Behavioral Health and Performance (BHP) Element, is conducting research regarding Net Habitable Volume (NHV), the internal volume within a spacecraft or habitat that is available to crew for required activities, as well as layout and accommodations within that volume. NASA is looking for innovative methods to unobtrusively collect NHV data without impacting crew time. Required data include metrics such as location and orientation of crew, volume used to complete tasks, internal translation paths, flow of work, and task completion times. In less constrained environments, methods for collecting such data exist, yet many are obtrusive and require significant post-processing. Example technologies used in terrestrial settings include infrared (IR) retro-reflective marker-based motion capture, GPS sensor tracking, inertial tracking, and multiple-camera filmography. However, due to the constraints of space operations, many such methods are infeasible; for example, inertial tracking systems typically rely upon a gravity vector to normalize sensor readings, and traditional IR systems are large and require extensive calibration. Multiple technologies, however, have not yet been applied to space operations for these explicit purposes. Two of these are 3-Dimensional Radio Frequency Identification Real-Time Localization Systems (3D RFID-RTLS) and depth imaging systems which allow for 3D motion capture and volumetric scanning (such as those using IR-depth cameras like the Microsoft Kinect, or Light Detection and Ranging / light-radar systems, referred to as LIDAR).

  13. SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS

    Data.gov (United States)

    National Aeronautics and Space Administration — Scalable Time Series Change Detection for Biomass Monitoring Using Gaussian Process (Varun Chandola and Ranga Raju Vatsavai). Abstract: Biomass monitoring,...
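
    A minimal sketch of the GP-based change detection idea: fit a Gaussian process to a seasonal biomass-like series and flag a new observation that falls outside the predictive band. The kernel choice, threshold, and synthetic NDVI-like data are illustrative, not the cited work's configuration:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

t = np.arange(48, dtype=float)[:, None]       # 4 years of monthly steps
rng = np.random.default_rng(3)
y = np.sin(2 * np.pi * t.ravel() / 12) + 0.05 * rng.standard_normal(48)

gp = GaussianProcessRegressor(RBF(5.0) + WhiteKernel(0.01)).fit(t, y)
t_new, y_new = np.array([[48.0]]), 2.5        # anomalously high observation
mean, std = gp.predict(t_new, return_std=True)
if abs(y_new - mean[0]) > 3 * std[0]:
    print("change detected: observation outside 3-sigma predictive band")
```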

  14. Automated Thermal Image Processing for Detection and Classification of Birds and Bats - FY2012 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Duberstein, Corey A.; Matzner, Shari; Cullinan, Valerie I.; Virden, Daniel J.; Myers, Joshua R.; Maxwell, Adam R.

    2012-09-01

    Surveying wildlife at risk from offshore wind energy development is difficult and expensive. Infrared video can be used to record birds and bats that pass through the camera view, but it is also time consuming and expensive to review video and determine what was recorded. We proposed to conduct algorithm and software development to identify and to differentiate thermally detected targets of interest that would allow automated processing of thermal image data to enumerate birds, bats, and insects. During FY2012 we developed computer code within MATLAB to identify objects recorded in video and extract attribute information that describes the objects recorded. We tested the efficiency of track identification using observer-based counts of tracks within segments of sample video. We examined object attributes, modeled the effects of random variability on attributes, and produced data smoothing techniques to limit random variation within attribute data. We also began drafting and testing methodology to identify objects recorded on video. We also recorded approximately 10 hours of infrared video of various marine birds, passerine birds, and bats near the Pacific Northwest National Laboratory (PNNL) Marine Sciences Laboratory (MSL) at Sequim, Washington. A total of 6 hours of bird video was captured overlooking Sequim Bay over a series of weeks. An additional 2 hours of video of birds was also captured during two weeks overlooking Dungeness Bay within the Strait of Juan de Fuca. Bats and passerine birds (swallows) were also recorded at dusk on the MSL campus during nine evenings. An observer noted the identity of objects viewed through the camera concurrently with recording. These video files will provide the information necessary to produce and test software developed during FY2013. The annotation will also form the basis for creation of a method to reliably identify recorded objects.

  15. The role of social and ecological processes in structuring animal populations: a case study from automated tracking of wild birds

    OpenAIRE

    Farine, Damien R.; Firth, Josh A.; Aplin, Lucy M.; Crates, Ross A.; Culina, Antica; Garroway, Colin J.; Hinde, Camilla A; Kidd, Lindall R.; Milligan, Nicole D.; Psorakis, Ioannis; Radersma, Reinder; Verhelst, Brecht; Voelkl, Bernhard; Sheldon, Ben C.

    2015-01-01

    Both social and ecological factors influence population process and structure, with resultant consequences for phenotypic selection on individuals. Understanding the scale and relative contribution of these two factors is thus a central aim in evolutionary ecology. In this study, we develop a framework using null models to identify the social and spatial patterns that contribute to phenotypic structure in a wild population of songbirds. We used automated technologies to track 1053 individuals...

  16. Translation Expert (TranslationQ & RevisionQ): Automated translation process with real-time feedback & evaluation/ revision with PIE

    OpenAIRE

    Steurs, Frieda; Segers, Winibert; Kockaert, Hendrik

    2015-01-01

    This paper reports on an experiment working with a new evaluation technique for translator training. Organizing high-level translation classes in a master in translation involves intensive assessment of the work delivered by the students. The evaluation has to be precise, professional, and at...

  17. Self-tuning process monitoring system for process-based product

    Energy Technology Data Exchange (ETDEWEB)

    Hillaire, R. [Sandia National Labs., Livermore, CA (United States); Loucks, C. [Sandia National Labs., Albuquerque, NM (United States)

    1998-02-01

    The hidden qualities of a product are often revealed in the process. Subsurface material damage, surface cracks, and unusual burr formation can occur during a poorly controlled machining process. Standard post process inspection is costly and may not reveal these conditions. However, by monitoring the proper process parameters, these conditions are readily detectable without incurring the cost of post process inspection. In addition, many unforeseen process anomalies may be detected using an advanced process monitoring system. This work created a process monitoring system for milling machines which mapped the forces, power, vibration, and acoustic emissions generated during a cutting cycle onto a 3D model of the part being machined. The hyperpoint overlay can be analyzed and visualized with VRML (Virtual Reality Modeling Language). Once the Process Monitoring System is deployed, detailed inspection may be significantly reduced or eliminated. The project deployed a Pro-Engineer to VRML model conversion routine, advanced visualization interface, tool path transformation with mesh generation routine, hyperpoint overlay routine, stable sensor array, sensor calibration routine, and machine calibration methodology. The technology created in this project can help validate production of WR (War Reserve) components by generating process signatures for products, processes, and lot runs. The signatures of each product can be compared across all products made within and across lot runs to determine if the processes that produced the product are consistently providing superior quality. Furthermore, the qualities of the processes are visibly apparent, since the part model is overlaid with process data. The system was evaluated on three different part productions.
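
    The hyperpoint overlay described above amounts to associating time-stamped sensor samples with toolpath positions. A minimal sketch, assuming both streams carry timestamps so the toolpath can be interpolated to the sensor clock (all values invented):

```python
import numpy as np

# Toolpath: timestamps and XYZ positions from the part program (made up).
tp_t = np.linspace(0.0, 10.0, 11)
tp_xyz = np.column_stack([np.linspace(0, 100, 11),
                          np.full(11, 25.0),
                          np.zeros(11)])

# Sensor stream: timestamps and one force channel (made up).
s_t = np.linspace(0.0, 10.0, 101)
s_force = 50.0 + 5.0 * np.sin(2 * np.pi * s_t / 3.0)

# Interpolate the toolpath to the sensor timestamps, then append the channel.
xyz = np.column_stack([np.interp(s_t, tp_t, tp_xyz[:, i]) for i in range(3)])
hyperpoints = np.column_stack([xyz, s_force])    # x, y, z, force
print(hyperpoints[:3])
```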

  18. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi

    2016-01-29

    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and widely applied to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring tools: the conventional PCA-based monitoring indices, Hotelling's T2 and Q, and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performance of the proposed methods was compared with that of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
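
    The core idea, smoothing the PCA monitoring statistics with an EWMA before thresholding so that small mean shifts accumulate into a detectable signal, can be sketched briefly. The code below is an illustrative reimplementation on synthetic data, not the paper's code; the control limits are placeholders rather than properly derived limits.

```python
# Illustrative T2-EWMA / Q-EWMA monitoring on synthetic data.
import numpy as np

rng = np.random.default_rng(0)

# Training data: in-control multivariate process (hypothetical).
X_train = rng.normal(size=(500, 6))
mu, sd = X_train.mean(0), X_train.std(0)
Z = (X_train - mu) / sd

# PCA via SVD; retain k principal components.
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
k = 3
P = Vt[:k].T                            # loadings (6 x k)
lam = (S[:k] ** 2) / (len(Z) - 1)       # retained eigenvalues

def t2_q(x):
    """Hotelling's T2 and Q (SPE) statistics for one sample."""
    z = (x - mu) / sd
    t = P.T @ z                          # scores
    t2 = np.sum(t ** 2 / lam)
    resid = z - P @ t
    return t2, resid @ resid

# Monitor a new stream with a mean shift after sample 100.
X_new = rng.normal(size=(200, 6))
X_new[100:] += 1.5

w, e_t2, e_q = 0.2, 0.0, 0.0             # EWMA weight and states
for i, x in enumerate(X_new):
    t2, q = t2_q(x)
    e_t2 = w * t2 + (1 - w) * e_t2
    e_q = w * q + (1 - w) * e_q
    if e_t2 > 8.0 or e_q > 8.0:          # illustrative control limits
        print(f"sample {i}: alarm (T2-EWMA={e_t2:.1f}, Q-EWMA={e_q:.1f})")
        break
```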

  19. Prototype development of filter monitor for 131I processing plant

    International Nuclear Information System (INIS)

    Iodine-131 (131I) is used extensively in nuclear medicine because of its short half-life and useful beta emission. The Isotope Production and Applications Division (IP and AD) of BARC produces 131I in its processing plant. Charcoal filters capable of extracting high levels of radioactive iodine and particulates from the suction flow are installed in the plant. The radioactive iodine is fully removed and deposited onto activated charcoal impregnated with potassium iodide. These charcoal filters become saturated over a period of use and need to be replaced with fresh ones. A 5-channel filter monitor for online measurement of the radiation level of trapped 131I on the charcoal filters is being developed by IP and AD, BARC; the unavailability of this type of instrument motivated the development. This paper deals with a prototype filter monitor built with a single detector and presents results demonstrating the functionality of the system. (author)
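
    The per-channel logic of such a monitor reduces to polling a count rate and comparing it against a saturation threshold. The sketch below is purely illustrative: the channel count matches the description above, but the readout function, rates, and threshold are invented stand-ins for the actual detector electronics.

```python
# Illustrative multi-channel filter-monitor loop: flag a charcoal
# filter for replacement once its measured count rate suggests
# saturation. All numbers here are hypothetical.
import random

CHANNELS = 5
SATURATION_CPS = 4000.0        # hypothetical replacement threshold

def read_count_rate(channel: int) -> float:
    """Stand-in for the detector readout; returns counts per second."""
    return random.uniform(500.0, 5000.0)

def poll_once() -> None:
    for ch in range(CHANNELS):
        cps = read_count_rate(ch)
        status = "REPLACE" if cps >= SATURATION_CPS else "ok"
        print(f"channel {ch}: {cps:7.1f} cps  [{status}]")

if __name__ == "__main__":
    poll_once()    # a real monitor would run this on a timer loop
```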

  20. CO2 laser tailored blank welding: process monitoring

    Science.gov (United States)

    D'Angelo, Giuseppe; Borello, Elena; Pallaro, Nereo

    1996-09-01

    Tailored blank welding has been a rapidly growing segment of the automotive industry over the last five years. It allows the sheet thickness to be chosen optimally for different zones, taking into account the local mechanical stresses and vehicle-safety reinforcement requirements. By eliminating extra reinforcement parts, tailored blanks yield lighter car bodies and a simpler production cycle. As more laser welding systems are installed in industry, the demand for process monitoring systems that increase productivity and maintain constant product quality grows. This paper presents a monitoring system based on measuring the radiation emitted by the plasma plume during CO2 laser welding of tailored blanks. A complete apparatus was developed using an appropriate combination of optical components, detectors, and dedicated software. The signals were found to correlate with weld-quality parameters, including defects such as holes, overlaps, and open butt joints.
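
    One simple way to turn such a plume-radiation signal into a monitor is to compare windowed signal means against a reference band established on known-good welds, flagging excursions as defect candidates. The sketch below is a hypothetical illustration on synthetic data; it is not the authors' apparatus or algorithm, and the sampling rate, window size, and limits are assumptions.

```python
# Illustrative defect flagging on a synthetic plume-radiation signal:
# windows whose mean departs from the good-weld reference band are
# reported as defect candidates (e.g. holes, open butts).
import numpy as np

rng = np.random.default_rng(1)

fs = 1000                                  # Hz, assumed sampling rate
signal = rng.normal(1.0, 0.05, 5 * fs)     # stable plume emission
signal[2000:2200] *= 0.4                   # synthetic drop, e.g. a hole

ref_mean, ref_std = 1.0, 0.05              # from known-good welds
win = 50                                   # 50 ms analysis window

means = signal[: len(signal) // win * win].reshape(-1, win).mean(axis=1)
alarms = np.abs(means - ref_mean) > 3 * ref_std

for idx in np.flatnonzero(alarms):
    t = idx * win / fs
    print(f"defect candidate at t = {t:.2f} s (window mean = {means[idx]:.2f})")
```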