WorldWideScience

Sample records for automated process monitoring

  1. Automated point clouds processing for deformation monitoring

    Directory of Open Access Journals (Sweden)

    Ján Erdélyi

    2015-12-01

    Weather conditions and operating loads cause changes in the spatial position and shape of engineering structures, which affect their static and dynamic behavior and reliability. Because of this, geodetic measurements are an integral part of engineering structure diagnostics. The advantage of terrestrial laser scanning (TLS) over conventional surveying methods is the efficiency of spatial data acquisition. TLS allows contactless determination of the spatial coordinates of points lying on the surface of the measured object. The scan rate of current scanners (up to 1 million points/s) significantly reduces the time necessary for measurement and increases the quantity of information obtained about the measured object. To increase the accuracy of the results, chosen parts of the monitored structure can be approximated by single geometric entities using regression; in this case the position of a measured point is calculated from tens or hundreds of scanned points. This paper presents the possibility of deformation monitoring of engineering structures using the technology of TLS. For automated data processing, an application based on MATLAB®, Displacement_TLS, was developed. The operation mode, the basic parts of this application and the calculation of displacements are described.
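
    The Displacement_TLS application itself is MATLAB-based; as a language-neutral illustration of the core idea (fitting a regression plane to tens or hundreds of scanned points so that displacement is computed from the fitted entity rather than from a single noisy point), here is a minimal Python sketch. The displacement convention and the synthetic data are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def fit_plane(points):
        """Least-squares plane through an (N, 3) array of scanned points.
        Returns (centroid, unit normal)."""
        centroid = points.mean(axis=0)
        # The right singular vector for the smallest singular value is the normal.
        _, _, vt = np.linalg.svd(points - centroid)
        normal = vt[-1]
        return centroid, normal / np.linalg.norm(normal)

    def displacement_along_normal(epoch0, epoch1):
        """Signed displacement of the patch centroid between two scan epochs,
        measured along the reference-epoch plane normal."""
        c0, n0 = fit_plane(epoch0)
        c1, _ = fit_plane(epoch1)
        return float(np.dot(c1 - c0, n0))

    # Illustrative use: two epochs of ~200 noisy points on a slightly shifted plane.
    rng = np.random.default_rng(1)
    xy = rng.uniform(0, 1, size=(200, 2))
    epoch0 = np.column_stack([xy, 0.02 * xy[:, 0] + rng.normal(0, 1e-4, 200)])
    epoch1 = epoch0 + np.array([0, 0, 0.005])   # 5 mm uplift, units of metres
    print(f"displacement ~ {displacement_along_normal(epoch0, epoch1):.4f} m")
    ```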

  2. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jay W. Grate; Timothy A. DeVol

    2006-07-20

    The objectives of our research were to develop the first automated radiochemical process analyzer including sample pretreatment methodology, and to initiate work on new detection approaches, especially using modified diode detectors.

  3. Radioanalytical Chemistry for Automated Nuclear Waste Process Monitoring

    International Nuclear Information System (INIS)

    The objectives of our research were to develop the first automated radiochemical process analyzer including sample pretreatment methodology, and to initiate work on new detection approaches, especially using modified diode detectors.

  4. Process automation

    International Nuclear Information System (INIS)

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  5. Elektronische monitoring van luchtwassers op veehouderijbedrijven = Automated process monitoring and data logging of air scrubbers at animal houses

    NARCIS (Netherlands)

    Melse, R.W.; Franssen, J.C.T.J.

    2010-01-01

    At six animal houses, air scrubbers equipped with an automated process monitoring and data logging system were tested. The measured values were successfully stored, but some of them, especially the pH and EC of the recirculation water, proved not to be correct at all times.

  6. Complex Event Processing Approach To Automated Monitoring Of Particle Accelerator And Its Control System

    OpenAIRE

    Karol Grzegorczyk; Vito Baggiolini; Krzysztof Zieliński

    2014-01-01

    This article presents the design and implementation of a software component for automated monitoring and diagnostic information analysis of a particle accelerator and its control system. The information that is analyzed can be seen as streams of events. A Complex Event Processing (CEP) approach to event processing was selected. The main advantage of this approach is the ability to continuously query data coming from several streams. The presented software component is based on Esper, the most...

  7. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    Science.gov (United States)

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, thereby proving the method and characterizing the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.
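
    The USGS purge protocols referenced above deem a sample representative once field parameters stabilize across successive readings. A hedged Python sketch of such a purge loop follows; the tolerances and polling interval are illustrative placeholders, not the published USGS criteria.

    ```python
    import time

    # Illustrative stabilization tolerances between successive readings;
    # the actual USGS criteria differ by parameter and should be looked up.
    TOLERANCES = {"pH": 0.1, "spec_cond": 5.0, "temp": 0.2, "dissolved_o2": 0.3}

    def purge_until_stable(read_sensors, required_consecutive=3, max_readings=60):
        """Pump and poll sensors until every parameter is stable over
        `required_consecutive` successive readings, then return the last reading.
        `read_sensors` is a callable returning a dict keyed like TOLERANCES."""
        history = []
        for _ in range(max_readings):
            history.append(read_sensors())
            if len(history) > required_consecutive:
                window = history[-(required_consecutive + 1):]
                stable = all(
                    abs(window[i + 1][k] - window[i][k]) <= tol
                    for k, tol in TOLERANCES.items()
                    for i in range(len(window) - 1)
                )
                if stable:
                    return history[-1]
            time.sleep(60)  # illustrative polling interval between readings
        raise RuntimeError("well did not stabilize within the purge limit")
    ```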

  8. Complex Event Processing Approach To Automated Monitoring Of Particle Accelerator And Its Control System

    Directory of Open Access Journals (Sweden)

    Karol Grzegorczyk

    2014-01-01

    This article presents the design and implementation of a software component for automated monitoring and diagnostic information analysis of a particle accelerator and its control system. The information that is analyzed can be seen as streams of events. A Complex Event Processing (CEP) approach to event processing was selected. The main advantage of this approach is the ability to continuously query data coming from several streams. The presented software component is based on Esper, the most popular open-source implementation of CEP. As a test bed, the control system of the accelerator complex located at CERN, the European Organization for Nuclear Research, was chosen. The complex includes the Large Hadron Collider, the world’s most powerful accelerator. The main contribution is to show that the CEP approach can successfully address many of the previously unsolved challenges associated with automated monitoring of the accelerator and its control system. Test results, performance analysis, and a proposal for further work are also presented.
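
    Esper itself is a Java library whose EPL statements run continuously against event streams; the essential mechanism (a query re-evaluated over a sliding time window on every arriving event) can be sketched in a few lines of Python. The event shape, window length and alarm threshold are illustrative.

    ```python
    from collections import deque
    import time

    class SlidingWindowQuery:
        """Toy continuous query: average of `value` over the last `window_s`
        seconds, re-evaluated on every arriving event (the core CEP idea)."""
        def __init__(self, window_s=60.0, alarm_above=100.0):
            self.window_s = window_s
            self.alarm_above = alarm_above
            self.events = deque()          # (timestamp, value) pairs

        def on_event(self, value, ts=None):
            ts = ts if ts is not None else time.time()
            self.events.append((ts, value))
            while self.events and ts - self.events[0][0] > self.window_s:
                self.events.popleft()      # expire events outside the window
            avg = sum(v for _, v in self.events) / len(self.events)
            if avg > self.alarm_above:     # continuous condition on the stream
                print(f"ALARM: windowed average {avg:.1f} exceeds threshold")
            return avg
    ```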

  9. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    Science.gov (United States)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yuan, Fuh-Gwo

    2016-01-01

    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified, and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four-ply unidirectional composite panel fabricated from Hexcel® IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and that the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to in-process cure monitoring in an autoclave, defect detection during cure, and ultimately closed-loop process control to maximize composite part quality and consistency.
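
    As a rough illustration of the two metrics named above, the sketch below estimates TOA as the first envelope crossing of a fraction of the peak amplitude. The threshold, sampling rate and synthetic waveform are assumptions for illustration, not the NASA system's actual signal processing.

    ```python
    import numpy as np

    def time_of_arrival(signal, fs, threshold_ratio=0.1):
        """Estimate guided-wave TOA as the first time the signal envelope
        exceeds a fraction of its peak. `fs` is the sampling rate in Hz."""
        envelope = np.abs(signal)            # crude envelope; Hilbert also works
        idx = np.argmax(envelope > threshold_ratio * envelope.max())
        return idx / fs

    def peak_amplitude(signal):
        """Amplitude metric used to track cure state (vitrification shows
        up as a sharp amplitude increase)."""
        return float(np.abs(signal).max())

    # Illustrative: a tone burst arriving 40 us into a 1 MHz-sampled record.
    fs = 1e6
    t = np.arange(0, 200e-6, 1 / fs)
    sig = np.where(t > 40e-6, np.sin(2 * np.pi * 100e3 * (t - 40e-6)), 0.0)
    print(f"TOA ~ {time_of_arrival(sig, fs) * 1e6:.1f} us, "
          f"amplitude = {peak_amplitude(sig):.2f}")
    ```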

  10. Automated system for acquisition and image processing for the control and monitoring boned nopal

    Science.gov (United States)

    Luevano, E.; de Posada, E.; Arronte, M.; Ponce, L.; Flores, T.

    2013-11-01

    This paper describes the design and fabrication of a system for image acquisition and processing to control the removal of thorns from nopal (Opuntia ficus-indica) pads in an automated machine that uses pulses from an Nd:YAG laser. The areolas, the areas on the bark of the nopal where thorns grow, are located by applying segmentation algorithms to images obtained with a CCD camera. Once the positions of the areolas are known, their coordinates are sent to a motor system that steers the laser to each areola to remove the thorns. The electronic system comprises a video decoder, memory for image and software storage, and a digital signal processor for system control. The firmware performs acquisition, preprocessing, segmentation, recognition and interpretation of the areolas. The system succeeds in identifying the areolas and generating a table of their coordinates, which is sent to the galvo motor system that controls the laser for thorn removal.
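
    A minimal sketch of the segmentation step described, using OpenCV thresholding and connected components to produce the coordinate table handed to the galvo system. The Otsu threshold and area filter are illustrative choices, not the paper's exact algorithm, and `send_to_galvo` is a hypothetical I/O hook.

    ```python
    import cv2
    import numpy as np

    def locate_areolas(gray_image, min_area=20):
        """Segment dark areolas on the nopal pad and return their centroids
        as (x, y) pixel coordinates for the galvo controller."""
        # Otsu thresholding separates areolas from the pad surface (illustrative).
        _, binary = cv2.threshold(gray_image, 0, 255,
                                  cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        n, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
        return [(float(centroids[label][0]), float(centroids[label][1]))
                for label in range(1, n)                     # label 0 = background
                if stats[label, cv2.CC_STAT_AREA] >= min_area]

    # Usage sketch:
    # image = cv2.imread("pad.png", cv2.IMREAD_GRAYSCALE)
    # for x, y in locate_areolas(image):
    #     send_to_galvo(x, y)   # hypothetical interface to the motor system
    ```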

  11. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin;

    2015-01-01

    The industrial production of cells has a large unmet need for greater process monitoring, in addition to the standard temperature, pH and oxygen concentration determination. Monitoring the cell health by a vast range of fluorescence cell-based assays can greatly improve the feedback control and thereby ensure optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output. In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells, and determining the total cell and dead cell concentrations, within a time frame of 10.3 min. The platform consists of custom-made stepper-motor-actuated peristaltic pumps and valves, fluidic interconnections, sample-to-waste liquid management and image cytometry-based detection. The total concentration of cells …

  12. The value of automated high-frequency nutrient monitoring in inference of biogeochemical processes, temporal variability and trends

    Science.gov (United States)

    Bieroza, Magdalena; Heathwaite, Louise

    2013-04-01

    Stream water quality signals integrate catchment-scale processes responsible for delivery and biogeochemical transformation of the key biotic macronutrients (N, C, P). This spatial and temporal integration is particularly pronounced in groundwater-dominated streams, as in-stream nutrient dynamics are mediated by processes occurring within riparian and hyporheic ecotones. In this paper we show long-term high-frequency in-stream macronutrient dynamics from a small agricultural catchment located in North West England. Hourly in-situ measurements of total and reactive phosphorus (Systea, IT), nitrate (Hach Lange, DE) and physical water quality parameters (turbidity, specific conductivity, dissolved oxygen, temperature, pH; WaterWatch, UK) were carried out on a lowland, gaining reach of the River Leith. High-frequency data show complex non-linear nutrient concentration-discharge relationships. The dominance of hysteresis effects suggests a temporally varying apportionment of allochthonous and autochthonous nutrient sources. The varying direction, magnitude and dynamics of the hysteretic responses between storm events are driven by variation in the contributing source areas and show the importance of the coupling of catchment-scale, in-stream, riparian and hyporheic biogeochemical cycles. The synergistic effect of physical drivers (temperature-driven hyporheic exchange controlled by diffusion) and biogeochemical drivers (stream and hyporheic metabolism) on in-stream nutrient concentrations manifests itself in the observed diurnal patterns. As inferred from the high-frequency nutrient monitoring, the diurnal dynamics are of the greatest importance under baseflow conditions. Understanding the role and relative importance of these processes can be difficult due to the spatial and temporal heterogeneity of the key mechanisms involved. This study shows the importance of in-situ, fine temporal resolution, automated monitoring approaches in providing evidence

  13. HEAVY OIL PROCESS MONITOR: AUTOMATED ON-COLUMN ASPHALTENE PRECIPITATION AND RE-DISSOLUTION

    Energy Technology Data Exchange (ETDEWEB)

    John F. Schabron; Joseph F. Rovani Jr; Mark Sanderson

    2006-06-01

    About 37-50% (w/w) of the heptane asphaltenes from unpyrolyzed residua dissolve in cyclohexane. As pyrolysis progresses, this number decreases to below 15% as coke and toluene-insoluble pre-coke materials appear. This solubility measurement can be used after coke begins to form, unlike the flocculation titration, which cannot be applied to multi-phase systems. Currently, the procedure for the isolation of heptane asphaltenes and the determination of the amount of asphaltenes soluble in cyclohexane spans three days. A more rapid method to measure asphaltene solubility was explored using a novel on-column asphaltene precipitation and re-dissolution technique. This was automated using high performance liquid chromatography (HPLC) equipment with a step gradient sequence using the solvents heptane, cyclohexane, and toluene:methanol (98:2). Results for four series of original and pyrolyzed residua were compared with data from the gravimetric method. The measurement time was reduced from three days to forty minutes. The separation was expanded further with the use of four solvents: heptane, cyclohexane, toluene, and cyclohexanone or methylene chloride. This provides a fourth peak, which represents the most polar components in the oil.
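
    Operationally, the automated separation is a timed step-gradient sequence; a hedged sketch of how such a schedule might be represented and replayed is shown below. The solvent names follow the abstract, but the hold times and the pump abstraction are illustrative.

    ```python
    import time

    # Step-gradient schedule: (minutes to hold, solvent delivered).
    # Solvents follow the abstract; hold times are illustrative only.
    STEP_GRADIENT = [
        (10, "heptane"),                 # elutes heptane-soluble maltenes
        (10, "cyclohexane"),             # re-dissolves cyclohexane-solubles
        (10, "toluene:methanol (98:2)"), # re-dissolves remaining asphaltenes
        (10, "cyclohexanone"),           # most polar components (fourth peak)
    ]

    def run_gradient(set_solvent, schedule=STEP_GRADIENT, scale=1.0):
        """Replay the step gradient on a pump abstraction. `set_solvent`
        switches the solvent line; `scale` shrinks hold times for dry runs."""
        for hold_min, solvent in schedule:
            set_solvent(solvent)
            time.sleep(hold_min * 60 * scale)

    # Dry run: run_gradient(lambda s: print("switch to", s), scale=0.001)
    ```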

  14. Interoperability for Space Mission Monitor and Control: Applying Technologies from Manufacturing Automation and Process Control Industries

    Science.gov (United States)

    Jones, Michael K.

    1998-01-01

    Various issues associated with interoperability for space mission monitor and control are presented in viewgraph form. Specific topics include: 1) Space Project Mission Operations Control Architecture (SuperMOCA) goals and methods for achieving them; 2) Specifics on the architecture: open standards and layering, enhancing interoperability, and promoting commercialization; 3) An advertisement; 4) Status of the task - government/industry cooperation and architecture and technology demonstrations; and 5) Key features of messaging services and virtual devices.

  15. Simultaneous and automated monitoring of the multimetal biosorption processes by potentiometric sensor array and artificial neural network.

    Science.gov (United States)

    Wilson, D; del Valle, M; Alegret, S; Valderrama, C; Florido, A

    2013-09-30

    In this communication, a new methodology for the simultaneous and automated monitoring of biosorption processes of multimetal mixtures of polluting heavy metals on vegetable wastes, based on flow-injection potentiometry (FIP) and electronic tongue (ET) detection, is presented. A fixed-bed column filled with grape stalks from wine industry wastes is used as the biosorption setup to remove the metal mixtures from the influent solution. The monitoring system consists of a computer-controlled FIP prototype with the ET, based on an array of 9 flow-through ion-selective electrodes and electrodes with generic response to divalent ions placed in series, plus an artificial neural network response model. The cross-response to Cu(2+), Cd(2+), Zn(2+), Pb(2+) and Ca(2+) (as target ions) is used, and correct operation of the system is achieved only when dynamic treatment of the kinetic components of the transient signal is incorporated. For this purpose, the FIA peaks are transformed via Fourier treatment, and selected coefficients are used to feed an artificial neural network response model. Real-time monitoring of different binary (Cu(2+)/Pb(2+), Cu(2+)/Zn(2+)) and ternary (Cu(2+)/Pb(2+)/Zn(2+), Cu(2+)/Zn(2+)/Cd(2+)) mixtures, simultaneous with the release of Ca(2+) in the effluent solution, is achieved satisfactorily using the reported system, obtaining the corresponding breakthrough curves and showing the ion-exchange mechanism among the different metals. Analytical performance is verified against conventional spectroscopic techniques, with good concordance of the obtained breakthrough curves and modeled adsorption parameters. PMID:23953435
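
    The dynamic treatment described (Fourier-transforming each transient FIA peak and feeding selected coefficients to a neural network that resolves the overlapping ion responses) can be sketched as follows. The number of retained coefficients, the network size and the synthetic calibration data are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def fourier_features(peak, n_coeffs=8):
        """Compress a transient FIA peak (1-D array) into the magnitudes of
        its first few Fourier coefficients, as in the reported treatment."""
        spectrum = np.fft.rfft(peak)
        return np.abs(spectrum[:n_coeffs])

    # X: one row of Fourier features per sensor transient; y: known metal
    # concentrations from calibration standards (random stand-in data).
    rng = np.random.default_rng(0)
    peaks = rng.random((100, 256))            # stand-in calibration transients
    X = np.array([fourier_features(p) for p in peaks])
    y = rng.random((100, 3))                  # e.g. [Cu2+, Pb2+, Zn2+]

    model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000).fit(X, y)
    unknown = fourier_features(peaks[0])
    print("predicted concentrations:", model.predict([unknown]))
    ```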

  16. Automated process control monitor for 0.18-um technology and beyond

    Science.gov (United States)

    Choo, Bryan; Riley, Trina; Schulz, Bernd; Singh, Bhanwar

    2000-06-01

    Currently, most production fabs use critical dimension (CD) measurements as their primary means of process control when printing lines, spaces and contacts. Historically, this has been adequate to control the lithography and etch processes and produce reasonable yields. However, as the industry moves from 0.25 micrometer manufacturing to 0.18 micrometer and beyond, it is becoming increasingly obvious that CD measurements alone do not provide enough information about the printed structures. As the geometry shrinks, slight changes in shape and profile can significantly affect the electrical characteristics of the circuit while maintaining the same CD value. In this paper, we describe a method which, in conjunction with the CD measurements, better characterizes the circuit structures and therefore provides valuable feedback about the process. This method compares stored image and linescan information from a 'golden' (correctly processed) structure to that of the structure being measured. Based on the collected data, it is possible to distinguish between different profiles and determine whether a process shift has occurred, even when the measured CD remains within specification. The correlation score therefore provides an additional constraint that better defines the true process window, and an additional flag for process problems. Without this information, the process used may not be truly optimized, or a shift may occur that is not detected in a timely manner, resulting in the loss of yield and revenue. This data collection has been implemented in production on a local interconnect (LI) lithography process. Before the correlation information was available, it was very difficult to detect the scumming within the LI trench, and identifying problem lots was a time-consuming and labor-intensive procedure. The correlation scores, collected automatically and concurrently with the CD measurement, allowed tracking through the SPC chart and the automatic flagging
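
    The comparison against the 'golden' structure reduces to a similarity score between a stored linescan and the measured one. A plain Pearson-correlation sketch is shown below, with an illustrative flagging threshold; the production tool's actual scoring may differ.

    ```python
    import numpy as np

    def correlation_score(golden, measured):
        """Pearson correlation between a stored 'golden' linescan and a
        measured one; a score near 1.0 means the profile matches the
        correctly processed structure."""
        g = (golden - golden.mean()) / golden.std()
        m = (measured - measured.mean()) / measured.std()
        return float(np.dot(g, m) / len(g))

    def flag_process_shift(golden, measured, threshold=0.95):
        """Flag a profile change even when the CD itself is in spec."""
        return correlation_score(golden, measured) < threshold
    ```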

  17. Automated radioanalytical system incorporating microwave-assisted sample preparation, chemical separation, and online radiometric detection for the monitoring of total 99Tc in nuclear waste processing streams.

    Science.gov (United States)

    Egorov, Oleg B; O'Hara, Matthew J; Grate, Jay W

    2012-04-01

    An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike-addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min, with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume) and better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
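
    The spike-addition logic amounts to matrix-matched calibration: the net count-rate increase from a known added activity gives the overall measurement efficiency, which in turn converts a sample's count rate into a concentration. A worked sketch with illustrative numbers (not values from the paper):

    ```python
    def measurement_efficiency(counts_spiked, counts_sample, spike_bq):
        """Overall efficiency (counts/s per Bq) from a preprogrammed spike
        addition; it also serves as a self-diagnostic for the separation."""
        return (counts_spiked - counts_sample) / spike_bq

    def tc99_concentration(counts_sample, efficiency, volume_ml, background=0.0):
        """99Tc concentration in Bq/mL from the sample count rate."""
        return (counts_sample - background) / (efficiency * volume_ml)

    # Illustrative numbers only:
    eff = measurement_efficiency(counts_spiked=250.0, counts_sample=50.0,
                                 spike_bq=400.0)
    print(f"efficiency = {eff:.3f} cps/Bq, "
          f"concentration = {tc99_concentration(50.0, eff, 0.495):.1f} Bq/mL")
    ```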

  18. National Automated Conformity Inspection Process

    Data.gov (United States)

    Department of Transportation — The National Automated Conformity Inspection Process (NACIP) Application is intended to expedite the workflow process as it pertains to the FAA Form 81 0-10 Request...

  19. Automated satellite telemetry processing system

    Science.gov (United States)

    Parunakian, David; Kalegaev, Vladimir; Barinova, Vera

    In this paper we describe the design and important implementation details of the new automated system for processing satellite telemetry developed at Skobeltsyn Institute of Nuclear Physics of Moscow State University (SINP MSU). We discuss the most common tasks and pitfalls for such systems built around the data stream from a single spacecraft or a single instrument, and suggest a solution that allows telemetry processing modules to be developed quickly and integrated with an existing polling mechanism, support infrastructure and data storage in Oracle or MySQL database systems. We also demonstrate the benefits of this approach using modules for processing three different spacecraft data streams: Coronas-Photon (2009-003A), Tatiana-2 (2009-049D) and Meteor-M no.1 (2009-049A). The data formats and protocols used by each of these spacecraft have distinct peculiarities, which nevertheless did not pose a problem for integrating their modules into the main system. Remote access via web interface to Oracle databases and sophisticated visualization tools create the possibility of efficient scientific exploitation of satellite data. Such a system is already deployed at the web portal of the Space Monitoring Data Center (SMDC) of SINP MSU (http://smdc.sinp.msu.ru).
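
    The design point (per-spacecraft processing modules plugging into one shared polling and storage pipeline) is essentially a plugin registry. A minimal Python sketch follows; all names are hypothetical, not taken from the SINP MSU code.

    ```python
    # Minimal plugin-registry sketch for per-spacecraft telemetry decoders.
    DECODERS = {}

    def register(spacecraft_id):
        """Class decorator: attach a decoder module to the shared pipeline."""
        def wrap(cls):
            DECODERS[spacecraft_id] = cls()
            return cls
        return wrap

    @register("2009-049D")          # Tatiana-2
    class Tatiana2Decoder:
        def decode(self, frame: bytes) -> dict:
            # Format-specific unpacking of this spacecraft's frames goes here.
            return {"raw_length": len(frame)}

    def process(spacecraft_id, frame: bytes):
        """The shared polling mechanism hands each frame to its decoder;
        the returned record would then be stored in the common database."""
        return DECODERS[spacecraft_id].decode(frame)
    ```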

  20. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 inch film, the equivalent of 2,200 computer floppy discs. Parts handling systems and robotics, applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14 x 17 inch film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (for Film Digital Radiography System), is moving toward 50 micron (approximately 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author)

  1. The Automator: Intelligent control system monitoring

    International Nuclear Information System (INIS)

    A large-scale control system may contain several hundred thousand control points which must be monitored to ensure smooth operation. Knowledge of the current state of such a system is often implicit in the values of these points, and operators must be cognizant of the state while making decisions. Repetitive operations requiring human intervention lead to fatigue, which can in turn lead to mistakes. The authors propose a tool called the Automator based on a middleware software server. This tool would provide a user-configurable engine for monitoring control points. Based on the status of these control points, a specified action could be taken. The action could range from setting another control point, to triggering an alarm, to running an executable. Often the data presented by a system is meaningless without context information from other channels. Such a tool could be configured to present interpreted information based on the values of other channels. Additionally, this tool could translate numerous values in a non-friendly form (such as numbers, bits, or return codes) into meaningful strings of information. Multiple instances of this server could be run, allowing individuals or groups to configure their own Automators. The configuration of the tool will be file-based. In the future, these files could be generated by graphical design tools, allowing for rapid development of new configurations. In addition, the server will be able to explicitly maintain information about the state of the control system. This state information can be used in decision-making processes and shared with other applications. A conceptual framework and software design for the tool are presented.

  2. Automated process planning system

    Science.gov (United States)

    Mann, W.

    1978-01-01

    The program helps process engineers set up manufacturing plans for machined parts. The system allows one to develop and store a library of characteristics of similar parts, as related to a particular facility. This information is then used in an interactive system to help develop manufacturing plans that meet required standards.

  3. Using artificial intelligence to automate remittance processing.

    Science.gov (United States)

    Adams, W T; Snow, G M; Helmick, P M

    1998-06-01

    The consolidated business office of the Allegheny Health Education Research Foundation (AHERF), a large integrated healthcare system based in Pittsburgh, Pennsylvania, sought to improve its cash-related business office activities by implementing an automated remittance processing system that uses artificial intelligence. The goal was to create a completely automated system whereby all monies it processed would be tracked, automatically posted, analyzed, monitored, controlled, and reconciled through a central database. Using a phased approach, the automated payment system has become the central repository for all of the remittances for seven of the hospitals in the AHERF system and has allowed for the complete integration of these hospitals' existing billing systems, document imaging system, and intranet, as well as the new automated payment posting, and electronic cash tracking and reconciling systems. For such new technology, which is designed to bring about major change, factors contributing to the project's success were adequate planning, clearly articulated objectives, marketing, end-user acceptance, and post-implementation plan revision. PMID:10179973

  4. Monitoring of the physical status of Mars-500 subjects as a model of structuring an automated system in support of the training process in an exploration mission

    Science.gov (United States)

    Fomina, Elena; Savinkina, Alexandra; Kozlovskaya, Inesa; Lysova, Nataliya; Angeli, Tomas; Chernova, Maria; Uskov, Konstantin; Kukoba, Tatyana; Sonkin, Valentin; Ba, Norbert

    Physical training sessions aboard the ISS are performed under the permanent continuous control from Earth. Every week the instructors give their recommendations on how to proceed with the training considering the results of analysis of the daily records of training cosmonauts and data of the monthly fitness testing. It is obvious that in very long exploration missions this system of monitoring will be inapplicable. For this reason we venture to develop an automated system to control the physical training process using the current ISS locomotion test parameters as the leading criteria. Simulation of an extended exploration mission in experiment MARS-500 enabled the trial application of the automated system for assessing shifts in cosmonauts’ physical status in response to exercises of varying category and dismissal periods. Methods. Six subjects spent 520 days in the analog of an interplanetary vehicle at IBMP (Moscow). A variety of training regimens and facilities were used to maintain a high level of physical performance of the subjects. The resistance exercises involved expanders, strength training device (MDS) and vibrotraining device (Galileo). The cycling exercises were performed on the bicycle ergometer (VB-3) and a treadmill with the motor in or out of motion. To study the effect of prolonged periods of dismissal from training on physical performance, the training flow was interrupted for a month once in the middle and then at the end of isolation. In addition to the in-flight locomotion test integrated into the automated training control system, the physical status of subjects was attested by analysis of the records of the monthly incremental testing on the bicycle ergometer and MDS. Results. It was demonstrated that the recommended training regimens maintained high physical performance levels despite the limited motor activities in isolation. According to the locomotion testing, the subjects increased velocity significantly and reduced the physiological

  5. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems, and to identify the status of a service at a given time. The approach for the SRT predictions is based on an Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross-validation results demonstrate the high efficiency of both approaches in comparison to known methods.
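
    ANFIS is not part of the standard Python stack, but the evaluation protocol described (ten-fold cross-validation of an SRT predictor trained on monitoring features) can be sketched with a stand-in regressor. The features and targets below are synthetic placeholders for the GoeGrid monitoring records.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor   # stand-in for ANFIS
    from sklearn.model_selection import cross_val_score

    # X: monitoring features (load, queue depth, ...); y: service response time.
    rng = np.random.default_rng(0)
    X = rng.random((500, 6))
    y = X @ rng.random(6) + 0.1 * rng.standard_normal(500)

    scores = cross_val_score(RandomForestRegressor(n_estimators=100),
                             X, y, cv=10, scoring="neg_mean_absolute_error")
    print(f"10-fold MAE: {-scores.mean():.3f} +/- {scores.std():.3f}")
    ```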

  6. Automated engineering of process automation systems; Automatisiertes Engineering von Prozessleitsystem-Funktionen

    Energy Technology Data Exchange (ETDEWEB)

    Schmidberger, T.; Fay, A. [Univ. der Bundeswehr Hamburg (Germany). Inst. fuer Automatisierungstechnik; Drath, R. [ABB AG, Ladenburg (Germany). Forschungszentrum

    2005-07-01

    The paper proposes a concept to reduce the engineering effort for planning and implementing process control systems. According to this concept, knowledge-based methods accomplish engineering tasks automatically. The approach makes use of information provided electronically by a new, object-oriented P&I diagram tool, thus allowing the 'automation of automation'. As examples of this concept, the automatic engineering of interlockings and asset monitors is described. (orig.)

  7. Design and development of automated TLD contamination monitor

    International Nuclear Information System (INIS)

    A thermoluminescent dosimeter (TLD) is issued to each occupational worker to register the external exposure received in the course of work. Before sending the TLDs back for processing, it is the responsibility of the parent institution to check and certify that the TLDs are free of radioactive contamination. To ease the duty of the health physicist, a PC-based automated TLD contamination monitor was designed and developed; the details are presented in this paper.

  8. Biogeochemical processing of nutrients in groundwater-fed stream during baseflow conditions - the value of fluorescence spectroscopy and automated high-frequency nutrient monitoring

    Science.gov (United States)

    Bieroza, Magdalena; Heathwaite, Louise

    2014-05-01

    Recent research in groundwater-dominated streams indicates that organic matter plays an important role in nutrient transformations at the surface-groundwater interface known as the hyporheic zone. Mixing of water and nutrient fluxes in the hyporheic zone controls in-stream nutrient availability, dynamics and export to downstream reaches. In particular, benthic sediments can form adsorptive sinks for organic matter and reactive nutrients (nitrogen and phosphorus) that sustain a variety of hyporheic processes, e.g. denitrification and microbial uptake. Thus, hyporheic metabolism can have an important effect on both the quantity (concentration) and the quality (labile vs. refractory character) of organic matter. Here, high-frequency nutrient monitoring combined with spectroscopic analysis was used to provide insights into the biogeochemical processing of a small agricultural stream in NE England subject to diffuse nutrient pollution. Biogeochemical data were collected hourly for a week at baseflow conditions, when in-stream and hyporheic nutrient dynamics have the greatest impact on stream health. In-stream nutrients (total phosphorus, reactive phosphorus, nitrate nitrogen) and water quality parameters (turbidity, specific conductivity, pH, temperature, dissolved oxygen, redox potential) were measured in situ hourly by an automated bank-side laboratory. Concurrent hourly autosamples were retrieved daily and analysed for nutrients and fine sediments, including spectroscopic analyses of dissolved organic matter: excitation-emission matrix (EEM) fluorescence spectroscopy and ultraviolet-visible (UV-Vis) absorbance spectroscopy. Our results show that organic matter can potentially be utilised as a natural, environmental tracer of the biogeochemical processes occurring at the surface-groundwater interface in streams. High-frequency spectroscopic characterisation of in-stream organic matter can provide useful quantitative and qualitative information on fluxes of reactive nutrients in

  9. Automation of Design Engineering Processes

    Science.gov (United States)

    Torrey, Glenn; Sawasky, Gerald; Courey, Karim

    2004-01-01

    A method, and a computer program that helps to implement the method, have been developed to automate and systematize the retention and retrieval of all the written records generated during the process of designing a complex engineering system. It cannot be emphasized strongly enough that "all the written records" as used here is meant to be taken literally: it signifies not only final drawings and final engineering calculations but also such ancillary documents as minutes of meetings, memoranda, requests for design changes, approval and review documents, and reports of tests. One important purpose served by the method is to make the records readily available to all involved users via their computer workstations from one computer archive, while eliminating the need for voluminous paper files stored in different places. Another important purpose served by the method is to facilitate the work of engineers who are charged with sustaining the system and were not involved in the original design decisions. The method helps the sustaining engineers to retrieve information that enables them to retrace the reasoning that led to the original design decisions, thereby helping them to understand the system better and to make informed engineering choices pertaining to maintenance and/or modifications of the system. The software used to implement the method is written in Microsoft Access. All of the documents pertaining to the design of a given system are stored in one relational database in such a manner that they can be related to each other via a single tracking number.

  10. Real-time bioacoustics monitoring and automated species identification

    Directory of Open Access Journals (Sweden)

    T. Mitchell Aide

    2013-07-01

    Traditionally, animal species diversity and abundance are assessed using a variety of methods that are generally costly, limited in space and time, and, most importantly, rarely include a permanent record. Given the urgency of climate change and the loss of habitat, it is vital that we use new technologies to improve and expand global biodiversity monitoring to thousands of sites around the world. In this article, we describe the acoustical component of the Automated Remote Biodiversity Monitoring Network (ARBIMON), a novel combination of hardware and software for automating data acquisition, data management, and species identification based on audio recordings. The major components of the cyberinfrastructure include a solar-powered remote monitoring station that sends 1-min recordings every 10 min to a base station, which relays the recordings in real time to the project server, where the recordings are processed and uploaded to the project website (arbimon.net). Along with a module for viewing, listening to, and annotating recordings, the website includes a species identification interface to help users create machine learning algorithms to automate species identification. To demonstrate the system, we present data on the vocal activity patterns of birds, frogs, insects, and mammals from Puerto Rico and Costa Rica.

  11. Automated wireless monitoring system for cable tension using smart sensors

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jongwoong; Cho, Soojin; Spencer, Billie F.; Yun, Chung-Bang

    2013-04-01

    Cables are critical load carrying members of cable-stayed bridges; monitoring tension forces of the cables provides valuable information for SHM of the cable-stayed bridges. Monitoring systems for the cable tension can be efficiently realized using wireless smart sensors in conjunction with vibration-based cable tension estimation approaches. This study develops an automated cable tension monitoring system using MEMSIC's Imote2 smart sensors. An embedded data processing strategy is implemented on the Imote2-based wireless sensor network to calculate cable tensions using a vibration-based method, significantly reducing the wireless data transmission and associated power consumption. The autonomous operation of the monitoring system is achieved by AutoMonitor, a high-level coordinator application provided by the Illinois SHM Project Services Toolsuite. The monitoring system also features power harvesting enabled by solar panels attached to each sensor node and AutoMonitor for charging control. The proposed wireless system has been deployed on the Jindo Bridge, a cable-stayed bridge located in South Korea. Tension forces are autonomously monitored for 12 cables in the east, land side of the bridge, proving the validity and potential of the presented tension monitoring system for real-world applications.
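
    Vibration-based tension estimation commonly rests on the taut-string relation T = 4 m L^2 (f_n / n)^2, where m is the cable mass per unit length, L the cable length and f_n the n-th natural frequency identified from the measured spectrum. Whether the Jindo Bridge deployment uses exactly this form is not stated in the abstract, so the sketch below is illustrative.

    ```python
    import numpy as np

    def cable_tension(f_n, n, length_m, mass_per_m):
        """Taut-string estimate of cable tension (N) from the n-th natural
        frequency f_n (Hz); ignores bending stiffness and cable sag."""
        return 4.0 * mass_per_m * length_m**2 * (f_n / n)**2

    def natural_frequency(accel, fs):
        """Pick the dominant spectral peak from an acceleration record
        sampled at fs Hz (the embedded processing does this on-node)."""
        spectrum = np.abs(np.fft.rfft(accel - accel.mean()))
        freqs = np.fft.rfftfreq(len(accel), 1 / fs)
        return freqs[spectrum.argmax()]

    # Illustrative: a 100 m cable at 50 kg/m with its first mode at 1.2 Hz.
    print(f"T ~ {cable_tension(1.2, 1, 100.0, 50.0) / 1e3:.0f} kN")
    ```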

  12. Automated Method for Monitoring Water Quality Using Landsat Imagery

    Directory of Open Access Journals (Sweden)

    D. Clay Barrett

    2016-06-01

    Regular monitoring of water quality is increasingly necessary to keep pace with rapid environmental change and protect human health and well-being. Remote sensing has been suggested as a potential solution for monitoring certain water quality parameters without the need for in situ sampling, but universal methods and tools are lacking. While many studies have developed predictive relationships between remotely sensed surface reflectance and water parameters, these relationships are often unique to a particular geographic region and have little applicability in other areas. In order to remotely monitor water quality, these relationships must be developed on a region-by-region basis. This paper presents an automated method for processing remotely sensed images from the Landsat Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+) and extracting corrected reflectance measurements around known sample locations, to allow rapid development of predictive water quality relationships and improve remote monitoring. Using open Python scripting, this study (1) provides an openly accessible and simple method for processing publicly available remote sensing data; and (2) allows determination of relationships between sampled water quality parameters and reflectance values, to ultimately allow predictive monitoring. The method is demonstrated through a case study of the Ozark/Ouachita-Appalachian ecoregion in eastern Oklahoma, using data collected for the Beneficial Use Monitoring Program (BUMP).
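
    The general pattern the abstract describes (extract corrected reflectance in a small window around each sample site, then regress sampled water quality on the band values) can be sketched as follows. Numpy arrays stand in for the corrected TM/ETM+ bands, and the sample data are made up; this is not the study's actual script.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    def window_mean(band, row, col, half=1):
        """Mean reflectance in a (2*half+1)^2 pixel window around a site."""
        return band[row - half: row + half + 1,
                    col - half: col + half + 1].mean()

    # Stand-ins: corrected bands and in-situ samples as (row, col, value).
    rng = np.random.default_rng(0)
    bands = [rng.random((100, 100)) for _ in range(4)]       # e.g. bands 1-4
    samples = [(20, 30, 5.1), (40, 50, 7.3), (60, 70, 3.8), (80, 15, 6.0)]

    X = np.array([[window_mean(b, r, c) for b in bands] for r, c, _ in samples])
    y = np.array([v for _, _, v in samples])                 # e.g. turbidity

    model = LinearRegression().fit(X, y)                     # region-specific fit
    print("fitted coefficients:", model.coef_)
    ```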

  13. AUTOMATED CONTROL SYSTEM AND MONITORING BY TECHNOLOGICAL PROCESSES BY PRODUCTION OF POLYMERIC AND BITUMINOUS TAPES ON THE BASIS OF APPLICATION OF SCADA OF SYSTEM

    Directory of Open Access Journals (Sweden)

    A. S. Kirienko

    2016-01-01

    The article demonstrates the expediency of using a system for control and monitoring of the technological processes of production, which makes it possible to lower labor expenses and to increase productivity through a better production process. The main objective of the remote monitoring system is to enable a rapid, remote assessment of the current situation on the production floor, so that reasonable and timely administrative decisions can be made.

  14. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for the analysis, tools used, the methodology, work performed during the summer, and future work planned.

  15. Design and implementation of an Internet based effective controlling and monitoring system with wireless fieldbus communications technologies for process automation--an experimental study.

    Science.gov (United States)

    Cetinceviz, Yucel; Bayindir, Ramazan

    2012-05-01

    The network requirements of control systems in industrial applications increase day by day. Internet based control systems and various fieldbus systems have been designed in order to meet these requirements. This paper describes an Internet based control system with wireless fieldbus communication designed for distributed processes. The system was implemented as an experimental setup in a laboratory. In industrial facilities, the process control layer and the remote connection of the distributed control devices at the lowest levels of the industrial production environment are provided by fieldbus networks. In this paper, an Internet based control system able to meet the system requirements with a new-generation communication structure, called a wired/wireless hybrid system, has been designed at the field level and implemented to cover all sectors of distributed automation, from process control to distributed input/output (I/O). The system comprises a hardware structure with a programmable logic controller (PLC), a communication processor (CP) module, two industrial wireless modules, a distributed I/O module and a Motor Protection Package (MPP), and a software structure with the WinCC flexible program used for the SCADA (Supervisory Control And Data Acquisition) screens and the SIMATIC MANAGER package ("STEP7") used for hardware and network configuration and for downloading the control program to the PLC. PMID:22306882

  16. Automated Monitoring of Pipeline Rights-of-Way

    Science.gov (United States)

    Frost, Chard Ritchie

    2010-01-01

    NASA Ames Research Center and the Pipeline Research Council International, Inc. have partnered in the formation of a research program to identify and develop the key technologies required to enable automated detection of threats to gas and oil transmission and distribution pipelines. This presentation describes the Right-of-way Automated Monitoring (RAM) program and highlights research successes to date, continuing challenges to implementing the RAM objectives, and the program's ongoing work and plans.

  17. Automated radiochemical processing for clinical PET

    International Nuclear Information System (INIS)

    The Siemens RDS 112, an automated radiochemical production and delivery system designed to support a clinical PET program, consists of an 11 MeV, proton-only, negative-ion cyclotron, a shield, a computer, and targetry and chemical processing modules to produce radiochemicals used in PET imaging. The principal clinical PET tracers are [18F]FDG, [13N]ammonia and [15O]water. Automated synthesis of [18F]FDG is achieved using the Chemistry Process Control Unit (CPCU), a general-purpose valve-and-tubing device that emulates manual processes while allowing for competent operator intervention. Using function-based command-file software, this pressure-driven synthesis system carries out chemical processing procedures by timing only, without process-based feedback. To date, nine CPCUs have been installed at seven institutions, resulting in 1,200+ syntheses of [18F]FDG with an average yield of 55% (EOB).

  18. Automated chemical monitoring in new projects of nuclear power plant units

    Science.gov (United States)

    Lobanok, O. I.; Fedoseev, M. V.

    2013-07-01

    The development of automated chemical monitoring systems in nuclear power plant units for the past 30 years is briefly described. The modern level of facilities used to support the operation of automated chemical monitoring systems in Russia and abroad is shown. Hardware solutions suggested by the All-Russia Institute for Nuclear Power Plant Operation (which is the General Designer of automated process control systems for power units used in the AES-2006 and VVER-TOI Projects) are presented, including the structure of additional equipment for monitoring water chemistry (taking the Novovoronezh 2 nuclear power plant as an example). It is shown that the solutions proposed with respect to receiving and processing of input measurement signals and subsequent construction of standard control loops are unified in nature. Simultaneous receipt of information from different sources for ensuring that water chemistry is monitored in sufficient scope and with required promptness is one of the problems that have been solved successfully. It is pointed out that improved quality of automated chemical monitoring can be supported by organizing full engineering follow-up of the automated chemical monitoring system's equipment throughout its entire service life.

  1. Monitoring of Microalgal Processes.

    Science.gov (United States)

    Havlik, Ivo; Scheper, Thomas; Reardon, Kenneth F

    2016-01-01

    Process monitoring, which can be defined as the measurement of process variables with the smallest possible delay, is combined with process models to form the basis for successful process control. Minimizing the measurement delay leads inevitably to employing online, in situ sensors where possible, preferably using noninvasive measurement methods with stable, low-cost sensors. Microalgal processes have similarities to traditional bioprocesses but also have unique monitoring requirements. In general, variables to be monitored in microalgal processes can be categorized as physical, chemical, and biological, and they are measured in gaseous, liquid, and solid (biological) phases. Physical and chemical process variables can usually be monitored online using standard industrial sensors. The monitoring of biological process variables, however, relies mostly on sensors developed and validated using laboratory-scale systems, or uses offline methods, because of difficulties in developing suitable online sensors. Here, we review current technologies for online, in situ monitoring of all types of process parameters of microalgal cultivations, with a focus on monitoring of biological parameters. We discuss newly introduced methods for measuring biological parameters that could possibly be adapted for routine online use, that should preferably be noninvasive, and that are based on approaches proven in other bioprocesses. New sensor types for measuring physicochemical parameters using optical methods or ion-specific field effect transistor (ISFET) sensors are also discussed. Reviewed methods with online implementation or online potential include measurement of irradiance, biomass concentration by optical density and image analysis, cell count, chlorophyll fluorescence, growth rate, lipid concentration by infrared spectrophotometry, dielectric scattering, and nuclear magnetic resonance. Future perspectives are discussed, especially in the field of image analysis using in situ

  2. Automated Synthesis of Assertion Monitors using Visual Specifications

    CERN Document Server

    Gadkari, Ambar A

    2011-01-01

    Automated synthesis of monitors from high-level properties plays a significant role in assertion-based verification. We present here a methodology to synthesize assertion monitors from visual specifications given in CESC (Clocked Event Sequence Chart). CESC is a visual language designed for specifying system-level interactions involving single and multiple clock domains. It has well-defined graphical and textual syntax and formal semantics based on the synchronous language paradigm, enabling formal analysis of specifications. In this paper we provide an overview of the CESC language with a few illustrative examples. The algorithm for automated synthesis of assertion monitors from CESC specifications is described. A few examples from standard bus protocols (OCP-IP and AMBA) are presented to demonstrate the application of the monitor synthesis algorithm.

  3. Process monitor gratings

    Science.gov (United States)

    Brunner, T. A.; Ausschnitt, C. P.

    2007-03-01

    Despite the increasing use of advanced imaging methods to pattern chip features, process windows continue to shrink with decreasing critical dimensions. Controlling the manufacturing process within these shrinking windows requires monitor structures designed to maximize both sensitivity and robustness. In particular, monitor structures must exhibit a large, measurable response to dose and focus changes over the entire range of the critical features process window. Any process variations present fundamental challenges to the effectiveness of OPC methods, since the shape compensation assumes a repeatable process. One particular process parameter which is under increasing scrutiny is focus blur, e.g. from finite laser bandwidth, which can cause such OPC instability, and thereby damage pattern fidelity. We introduce a new type of test target called the Process Monitor Grating (PMG) which is designed for extreme sensitivity to process variation. The PMG design principle is to use assist features to zero out higher diffraction orders. We show via simulation and experiment that such structures are indeed very sensitive to process variation. In addition, PMG targets have other desirable attributes such as mask manufacturability, robustness to pattern collapse, and compatibility with standard CD metrology methods such as scatterometry. PMG targets are applicable to the accurate determination of dose and focus deviations, and in combination with an isofocal grating target, allow the accurate determination of focus blur. The methods shown in this paper are broadly applicable to the characterization of process deviations using test wafers or to the control of product using kerf structures.
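
    The design principle can be checked numerically: in the thin-mask approximation, the strength of diffraction order m is the m-th Fourier coefficient of the grating's periodic transmission, and assist features are placed so that selected higher orders cancel. The layout values in this illustrative check are made up, not an actual PMG design.

    ```python
    import numpy as np

    def order_amplitudes(transmission, max_order=4):
        """|a_m| for a one-period transmission function sampled on a grid;
        a_m is the m-th Fourier coefficient, i.e. diffraction order m."""
        coeffs = np.fft.fft(transmission) / transmission.size
        return np.abs(coeffs[: max_order + 1])

    N = 1024
    x = np.arange(N) / N

    def grating(features):
        """Binary transmission: opaque bars given as (center, width) pairs,
        positions expressed as fractions of the grating period."""
        t = np.ones(N)
        for c, w in features:
            t[np.abs((x - c + 0.5) % 1 - 0.5) < w / 2] = 0.0
        return t

    plain = grating([(0.5, 0.30)])                            # main line only
    with_assists = grating([(0.5, 0.30), (0.15, 0.08), (0.85, 0.08)])
    print("orders 0..4, plain:  ", np.round(order_amplitudes(plain), 3))
    print("orders 0..4, assists:", np.round(order_amplitudes(with_assists), 3))
    ```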

  4. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    OpenAIRE

    Hon Ming Yip; John C. S. Li; Kai Xie; Xin Cui; Agrim Prasad; Qiannan Gao; Chi Chiu Leung; Lam, Raymond H. W.

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet...

  5. Automation and control of off-planet oxygen production processes

    Science.gov (United States)

    Marner, W. J.; Suitor, J. W.; Schooley, L. S.; Cellier, F. E.

    1990-01-01

    This paper addresses several aspects of the automation and control of off-planet production processes. First, a general approach to process automation and control is discussed from the viewpoint of translating human process control procedures into automated procedures. Second, the control issues for the automation and control of off-planet oxygen processes are discussed. Sensors, instruments, and components are defined and discussed in the context of off-planet applications, and the need for 'smart' components is clearly established.

  6. Classification Trees for Quality Control Processes in Automated Constructed Response Scoring.

    Science.gov (United States)

    Williamson, David M.; Hone, Anne S.; Miller, Susan; Bejar, Isaac I.

    As the automated scoring of constructed responses reaches operational status, the issue of monitoring the scoring process becomes a primary concern, particularly when the goal is to have automated scoring operate completely unassisted by humans. Using a vignette from the Architectural Registration Examination and data for 326 cases with both human…

  7. Java Implementation based Heterogeneous Video Sequence Automated Surveillance Monitoring

    Directory of Open Access Journals (Sweden)

    Sankari Muthukarupan

    2013-04-01

    Full Text Available Automated video-based surveillance monitoring is an essential and computationally challenging task for resolving security issues in restricted-access localities. This paper deals with some of the issues encountered when integrating surveillance monitoring into real-life circumstances. We employ video frames extracted from heterogeneous video formats. Each video frame is examined to identify anomalous events occurring in the time-driven sequence. Background subtraction is first performed, based on an optimal threshold and a reference frame: the remaining frames are subtracted from the reference image, yielding the foreground image regions. The coordinates in a subtracted image are found by scanning the image horizontally until the first black pixel is encountered. Each obtained coordinate is then matched with the corresponding coordinates in the primary image, and the matched coordinate in the primary image is treated as an active region of interest. Finally, the marked images are assembled into a temporal video that traces the moving silhouettes of human behavior against a static background. The proposed model is implemented in Java. Results and performance analysis are carried out in real-life environments.
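    The paper's implementation is in Java; purely to illustrate the background-subtraction and first-pixel scan steps it describes, here is a small Python/NumPy sketch in which the threshold value and the synthetic frames are assumptions:

```python
import numpy as np

def foreground_mask(frame, reference, threshold=30):
    """Subtract a fixed reference frame and binarize the absolute difference.
    frame, reference: 2-D uint8 grayscale arrays of equal shape.
    Returns a boolean mask of foreground (moving) pixels."""
    diff = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
    return diff > threshold

def foreground_coordinates(mask):
    """Scan each row left to right for the first foreground pixel,
    mimicking the horizontal scan described in the paper."""
    first = np.argmax(mask, axis=1)        # first True per row (0 if none)
    has_hit = mask.any(axis=1)
    return [(r, c) for r, (c, h) in enumerate(zip(first, has_hit)) if h]

# Synthetic example: one bright "moving" pixel against an empty background
reference = np.zeros((4, 6), dtype=np.uint8)
frame = reference.copy()
frame[2, 3] = 200
print(foreground_coordinates(foreground_mask(frame, reference)))  # -> [(2, 3)]
```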

  8. In Process Beam Monitoring

    Science.gov (United States)

    Steen, W. M.; Weerasinghe, V. M.

    1986-11-01

    The industrial future of lasers in material processing lies in the combination of the laser with automatic machinery. One possible form of such a combination is an intelligent workstation which monitors the process as it occurs and adjusts itself accordingly, either by self-teaching or by comparison to a process data bank or algorithm. In order to achieve this attractive goal, in-process signals are required. Two devices are described in this paper. One is the Laser Beam Analyser, which is now maturing into a second generation with computerised output. The other is the Acoustic Mirror, a totally novel analytic technique, not yet fully understood, but which can nevertheless act as a very effective process monitor.

  9. Wind Turbine Manufacturing Process Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Waseem Faidi; Chris Nafis; Shatil Sinha; Chandra Yerramalli; Anthony Waas; Suresh Advani; John Gangloff; Pavel Simacek

    2012-04-26

    To develop a practical inline inspection that could be used in combination with automated composite material placement equipment to economically manufacture high-performance and reliable carbon composite wind turbine blade spar caps. The technical feasibility and cost benefit of the approach will be assessed to provide a solid basis for further development and implementation in the wind turbine industry. The program is focused on the following technology development: (1) Develop in-line monitoring methods, using optical metrology and ultrasound inspection, and demonstrate them in the lab; (2) Develop methods to predict composite strength reduction due to defects; and (3) Develop process models to predict defects from leading indicators found in the uncured composites.

  10. Methodology for monitoring and automated diagnosis of ball bearing using paraconsistent logic, wavelet transform and digital signal processing; Metodologia de monitoracao e diagnostico automatizado de rolamentos utilizando logica paraconsistente, transformada de Wavelet e processamento de sinais digitais

    Energy Technology Data Exchange (ETDEWEB)

    Masotti, Paulo Henrique Ferraz

    2006-07-01

    The monitoring and diagnosis area has seen impressive development in recent years, with the introduction of new diagnosis techniques as well as the use of computers in the processing of information and of the diagnosis techniques themselves. The contribution of artificial intelligence to the automation of defect diagnosis is developing continually, and the growing automation in industry embraces these new techniques. In the nuclear area, the growing concern with safety in facilities requires more effective techniques, which have been sought to increase the safety level. Some nuclear power stations have already installed, on some machines, sensors that allow the verification of their operational conditions. In this way, the present work can also contribute to this area, helping in the diagnosis of the operational condition of machines. This work presents a new technique for feature extraction based on the zero crossings of the wavelet transform, contributing to the development of this dynamic area. The artificial intelligence technique used in this work was the Paraconsistent Logic of Annotation with Two Values (LPA2v), contributing to the automation of defect diagnosis, because this logic can deal with the contradictory results that feature extraction techniques can present. This work also concentrated on the identification of defects in their initial phase, using accelerometers, because they are robust, low-cost sensors that can easily be found in industry in general. The results obtained in this work were accomplished through the use of an experimental database, and it was observed that the diagnoses showed good results for defects in their initial phase. (author)
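    As a rough illustration of the zero-crossing idea (not the author's exact formulation), one can count sign changes in the detail coefficients of a discrete wavelet decomposition of the accelerometer signal; a sketch using NumPy and the PyWavelets package, with an assumed wavelet and decomposition depth:

```python
import numpy as np
import pywt

def zero_crossing_features(signal, wavelet="db4", level=4):
    """Count zero crossings of the detail coefficients at each level of a
    discrete wavelet decomposition -- a simple feature vector for
    bearing-defect classification (illustrative only)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    features = []
    for detail in coeffs[1:]:              # skip the approximation coefficients
        signs = np.sign(detail)
        signs[signs == 0] = 1              # treat exact zeros as positive
        features.append(int(np.count_nonzero(np.diff(signs))))
    return features

# Hypothetical vibration signal: a 50 Hz tone plus broadband noise
t = np.linspace(0, 1, 2048, endpoint=False)
sig = np.sin(2 * np.pi * 50 * t) + 0.3 * np.random.randn(t.size)
print(zero_crossing_features(sig))
```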

  11. Process development for automated solar cell and module production. Task 4: automated array assembly

    Energy Technology Data Exchange (ETDEWEB)

    Hagerty, J.J.

    1980-06-30

    The scope of work under this contract involves specifying a process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use. This process sequence is then critically analyzed from a technical and economic standpoint to determine the technological readiness of each process step for implementation. The process steps are ranked according to the degree of development effort required and according to their significance to the overall process. Under this contract the steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development. Economic analysis using the SAMICS system has been performed during these studies to assure that development efforts have been directed towards the ultimate goal of price reduction. Details are given. (WHK)

  12. Automated inundation monitoring using TerraSAR-X multitemporal imagery

    Science.gov (United States)

    Gebhardt, S.; Huth, J.; Wehrmann, T.; Schettler, I.; Künzer, C.; Schmidt, M.; Dech, S.

    2009-04-01

    The Mekong Delta in Vietnam offers natural resources for several million inhabitants. However, strong population growth, changing climatic conditions and regulatory measures at the upper reaches of the Mekong are leading to severe changes in the Delta. Extreme flood events occur more frequently, drinking water availability is increasingly limited, soils show signs of salinization or acidification, and species and complete habitats diminish. During the monsoon season the river regularly overflows its banks in the lower Mekong area, usually with beneficial effects. However, extreme flood events occur more frequently, causing extensive damage; on average, once every 6 to 10 years river flood levels exceed the critical beneficial level. X-band SAR data are well suited for deriving inundated surface areas. The TerraSAR-X sensor with its different scanning modes allows for the derivation of inundation masks with high spatial and temporal resolution. The paper presents an automated procedure for deriving inundated areas from TerraSAR-X ScanSAR and Stripmap image data. Within the framework of the German-Vietnamese WISDOM project, focusing on the Mekong Delta region in Vietnam, images have been acquired covering the flood season from June 2008 to November 2008. Based on these images, a time series of so-called watermasks showing inundated areas has been derived. The product is required as an intermediate to (i) calibrate 2D inundation model scenarios, (ii) estimate the extent of affected areas, and (iii) analyze the scope of prior crises. The image processing approach is based on the assumption that water surfaces forward-scatter the radar signal, resulting in low backscatter signals at the sensor. It uses multiple grey-level thresholds and image morphological operations. The approach performs well in terms of automation, accuracy, robustness, and processing time. The resulting watermasks show the seasonal flooding pattern with inundations starting in July, having their peak at the end
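    A minimal sketch of the watermask logic as described, in Python with SciPy; the threshold, structuring element, and minimum region size are assumptions, and the operational processor uses multiple grey-level thresholds rather than a single one:

```python
import numpy as np
from scipy import ndimage

def watermask(backscatter_db, threshold_db=-15.0, min_region_px=64):
    """Classify low-backscatter pixels as water, then clean the mask with a
    morphological opening and a minimum-region-size filter.
    Values are illustrative, not the WISDOM processor's parameters."""
    mask = backscatter_db < threshold_db     # water forward-scatters -> low return
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    labels, n = ndimage.label(mask)          # drop speckle-sized regions
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_region_px]
    return np.isin(labels, keep)
```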

  13. A plasma process monitor/control system

    Energy Technology Data Exchange (ETDEWEB)

    Stevenson, J.O.; Ward, P.P.; Smith, M.L. [Sandia National Labs., Albuquerque, NM (United States); Markle, R.J. [Advanced Micro Devices, Inc., Austin, TX (United States)

    1997-08-01

    Sandia National Laboratories has developed a system to monitor plasma processes for control of industrial applications. The system is designed to act as a fully automated, stand-alone process monitor during printed wiring board and semiconductor production runs. The monitor routinely performs data collection, analysis, process identification, and error detection/correction without the need for human intervention. The monitor can also be used in research mode to allow process engineers to gather additional information about plasma processes. The plasma monitor can perform real-time control of support systems known to influence plasma behavior. The monitor can also signal personnel to modify plasma parameters when the system is operating outside of desired specifications and requires human assistance. A notification protocol can be selected for conditions detected in the plasma process. The Plasma Process Monitor/Control System consists of a computer running software developed by Sandia National Laboratories, a commercially available spectrophotometer equipped with a charge-coupled device camera, an input/output device, and a fiber optic cable.
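    The error-detection role of such a monitor can be pictured as comparing each acquired emission spectrum against a known-good baseline; a hedged Python sketch, in which the baseline, tolerance, and notification hook are all assumptions rather than Sandia's implementation:

```python
import numpy as np

def check_spectrum(spectrum, baseline, tolerance=0.05):
    """Flag a plasma emission spectrum whose mean normalized deviation from
    a known-good baseline exceeds a tolerance (illustrative logic only)."""
    spectrum = spectrum / spectrum.max()
    baseline = baseline / baseline.max()
    deviation = float(np.mean(np.abs(spectrum - baseline)))
    return deviation <= tolerance, deviation

# Hypothetical use inside the monitoring loop:
# ok, dev = check_spectrum(ccd_frame.mean(axis=0), golden_spectrum)
# if not ok:
#     notify_operator(f"plasma drift detected, deviation={dev:.3f}")
```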

  14. D-MSR: a distributed network management scheme for real-time monitoring and process control applications in wireless industrial automation.

    Science.gov (United States)

    Zand, Pouria; Dilo, Arta; Havinga, Paul

    2013-01-01

    Current wireless technologies for industrial applications, such as WirelessHART and ISA100.11a, use a centralized management approach where a central network manager handles the requirements of the static network. However, such a centralized approach has several drawbacks. For example, it cannot cope with dynamicity/disturbance in large-scale networks in a real-time manner, and it incurs a high communication overhead and latency for exchanging management traffic. In this paper, we therefore propose a distributed network management scheme, D-MSR. It enables the network devices to join the network, schedule their communications, establish end-to-end connections by reserving communication resources to address real-time requirements, and cope with network dynamicity (e.g., node/edge failures) in a distributed manner. To our knowledge, this is the first distributed management scheme based on the IEEE 802.15.4e standard which guides the nodes in different phases, from joining the network until publishing their sensor data. We demonstrate via simulation that D-MSR can address the real-time, reliable communication and high-throughput requirements of industrial automation wireless networks, while also achieving higher efficiency in network management than WirelessHART, in terms of delay and overhead.

  15. Automated Web-based Monitoring of a Pump and Treat System at the Hanford Site

    Science.gov (United States)

    Webber, W.; Versteeg, R.; Richardson, A.; Ankeny, M.; Gilmore, T.; Morse, J.; Thompson, M.

    2006-05-01

    Automated and autonomous monitoring of environmental conditions can be used to improve operational efficiency, verify remedial action decisions, and promote confidence in the monitoring process by making data and associated derived information readily accessible to regulators and stakeholders. Ultimately, autonomous monitoring systems can reduce overall costs associated with regulatory compliance of performance and long-term monitoring. As part of a joint decision between DOE and the WA Department of Ecology to put on "cold standby" a pump and treat system that has been operating on the Department of Energy's Hanford site in Washington State since 1995, a web site was developed to display the automated water level network around the pump and treat system. The automated water level network consists of nineteen wells with water level transducers and temperature and conductivity probes for selected wells. Data from this network will be used to evaluate the impacts of the pump-and-treat system and the response of the aquifer to shutdown of the system. The website will provide access to data from the automated network, along with additional information pertaining to the shutdown of the pump and treat system, to the various stakeholders in a convenient and timely fashion. This will allow the various stakeholders to observe the impacts of the shutdown as the aquifer responds. There are future plans to expand this web-based data reporting platform to other environmental data that pertain to the various remedial actions planned at the Hanford site. The benefits of the web site application for monitoring and stewardship are consistency of data processing and analysis, together with automated and on-demand data and information delivery. The system and data access are password controlled, and access to various data or fields can be restricted to specified users. An important feature is that the stakeholders have access to the data in near-real time, providing a checks-and-balance system

  16. Automation of electroweak corrections for LHC processes

    Science.gov (United States)

    Chiesa, Mauro; Greiner, Nicolas; Tramontano, Francesco

    2016-01-01

    Next-to-leading order (NLO) electroweak corrections will play an important role in Run 2 of the Large Hadron Collider (LHC). Even though they are typically moderate at the level of total cross sections, they can lead to substantial deviations in the shapes of distributions. In particular, for the search for new physics, but also for a precise determination of Standard Model observables, their inclusion in theoretical predictions is mandatory for a reliable estimation of the Standard Model contribution. In this article we review the status and recent developments in electroweak calculations and their automation for LHC processes. We discuss general issues and properties of NLO electroweak corrections and present some examples, including the full calculation of the NLO corrections to the production of a W-boson in association with two jets computed using GoSam interfaced to MadDipole.

  17. A software architecture for automating operations processes

    Science.gov (United States)

    Miller, Kevin J.

    1994-01-01

    The Operations Engineering Lab (OEL) at JPL has developed a software architecture based on an integrated toolkit approach for simplifying and automating mission operations tasks. The toolkit approach is based on building adaptable, reusable graphical tools that are integrated through a combination of libraries, scripts, and system-level user interface shells. The graphical interface shells are designed to integrate and visually guide a user through the complex steps in an operations process. They provide a user with an integrated system-level picture of an overall process, defining the required inputs and possible output through interactive on-screen graphics. The OEL has developed the software for building these process-oriented graphical user interface (GUI) shells. The OEL Shell development system (OEL Shell) is an extension of JPL's Widget Creation Library (WCL). The OEL Shell system can be used to easily build user interfaces for running complex processes, applications with extensive command-line interfaces, and tool-integration tasks. The interface shells display a logical process flow using arrows and box graphics. They also allow a user to select which output products are desired and which input sources are needed, eliminating the need to know which program and its associated command-line parameters must be executed in each case. The shells have also proved valuable for use as operations training tools because of the OEL Shell hypertext help environment. The OEL toolkit approach is guided by several principles, including the use of ASCII text file interfaces with a multimission format, Perl scripts for mission-specific adaptation code, and programs that include a simple command-line interface for batch mode processing. Projects can adapt the interface shells by simple changes to the resources configuration file. This approach has allowed the development of sophisticated, automated software systems that are easy, cheap, and fast to build. This paper will

  18. Concept of Educationional and Administrative Processes Automation System for Department

    OpenAIRE

    Ivan N. Berlinets

    2012-01-01

    This article describes the concept of, and an approach to, the implementation of an educational and administrative process automation system for a graduate department. The program components and the technologies implementing the system's functions are described.

  19. G-Cloud Monitor: A Cloud Monitoring System for Factory Automation for Sustainable Green Computing

    Directory of Open Access Journals (Sweden)

    Hwa-Young Jeong

    2014-11-01

    Full Text Available Green and cloud computing (G-cloud) are new trends in all areas of computing. The G-cloud provides an efficient function, which enables users to access their programs, systems and platforms at any time and any place. Green computing can also yield greener technology by reducing power consumption for sustainable environments. Furthermore, in order to apply user needs to system development, user characteristics are regarded as some of the most important factors to be considered in product industries. In this paper, we propose a cloud monitoring system to observe and manage the manufacturing system/factory automation for sustainable green computing. For the monitoring system, we utilized the resources of the G-cloud environment; hence, it can reduce the amount of system resources and devices, such as system power and processes. In addition, we propose adding a user profile to the monitoring system in order to provide a user-friendly function. That is, this function allows system configurations to be automatically matched to the individual’s requirements, thus increasing efficiency.

  20. Biosensors and Automation for Bioprocess Monitoring and Control

    OpenAIRE

    Kumar, M A

    2011-01-01

    Bioprocess monitoring and control is a complex task that needs rapid and reliable methods which are adaptable to continuous analysis. Process monitoring during fermentation is widely applicable in the field of pharmaceutical, food and beverages and wastewater treatment. The ability to monitor has direct relevance in improving performance, quality, productivity, and yield of the process. In fact, the complexity of the bioprocesses requires almost real time insight into the dynamic process for ...

  1. Semisupervised Gaussian Process for Automated Enzyme Search.

    Science.gov (United States)

    Mellor, Joseph; Grigoras, Ioana; Carbonell, Pablo; Faulon, Jean-Loup

    2016-06-17

    Synthetic biology is today harnessing the design of novel and greener biosynthesis routes for the production of added-value chemicals and natural products. The design of novel pathways often requires a detailed selection of enzyme sequences to import into the chassis at each of the reaction steps. To address such design requirements in an automated way, we present here a tool for exploring the space of enzymatic reactions. Given a reaction and an enzyme, the tool provides a probability estimate that the enzyme catalyzes the reaction. Our tool first considers the similarity of a reaction to known biochemical reactions with respect to signatures around their reaction centers. Signatures are defined based on chemical transformation rules by using extended connectivity fingerprint descriptors. A semisupervised Gaussian process model associated with the similar known reactions then provides the probability estimate. The Gaussian process model uses information about both the reaction and the enzyme in providing the estimate. These estimates were validated experimentally by the application of the Gaussian process model to a newly identified metabolite in Escherichia coli in order to search for the enzymes catalyzing its associated reactions. Furthermore, we show with several pathway design examples how the ability to assign probability estimates to enzymatic reactions provides the potential to assist in bioengineering applications, providing experimental validation to our proposed approach. To the best of our knowledge, the proposed approach is the first application of Gaussian processes dealing with biological sequences and chemicals; the use of a semisupervised Gaussian process framework is also novel in the context of machine learning applied to bioinformatics. However, the ability of an enzyme to catalyze a reaction depends on the affinity between the substrates of the reaction and the enzyme. This affinity is generally quantified by the Michaelis constant KM
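    As a much-simplified picture of the probability estimate (a plain supervised Gaussian process classifier on fingerprint bit-vectors, not the paper's semisupervised model built on reaction signatures), using scikit-learn with toy data:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# Toy data: each row is a (hypothetical) concatenated reaction/enzyme
# fingerprint; the label says whether the enzyme catalyzes the reaction.
rng = np.random.default_rng(0)
X_train = rng.integers(0, 2, size=(40, 16)).astype(float)
y_train = np.array([0, 1] * 20)              # synthetic labels, both classes

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                random_state=0)
gpc.fit(X_train, y_train)

X_query = rng.integers(0, 2, size=(3, 16)).astype(float)
print(gpc.predict_proba(X_query)[:, 1])     # P(enzyme catalyzes reaction)
```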

  2. Automated Monitoring System for Waste Disposal Sites and Groundwater

    Energy Technology Data Exchange (ETDEWEB)

    S. E. Rawlinson

    2003-03-01

    A proposal submitted to the U.S. Department of Energy (DOE), Office of Science and Technology, Accelerated Site Technology Deployment (ASTD) program to deploy an automated monitoring system for waste disposal sites and groundwater, herein referred to as the ''Automated Monitoring System,'' was funded in fiscal year (FY) 2002. This two-year project included three parts: (1) deployment of cellular telephone modems on existing dataloggers, (2) development of a data management system, and (3) development of Internet accessibility. The proposed concept was initially (in FY 2002) to deploy cellular telephone modems on existing dataloggers and partially develop the data management system at the Nevada Test Site (NTS). This initial effort included both Bechtel Nevada (BN) and the Desert Research Institute (DRI). The following year (FY 2003), cellular modems were to be similarly deployed at Sandia National Laboratories (SNL) and Los Alamos National Laboratory (LANL), and the early data management system developed at the NTS was to be brought to those locations for site-specific development and use. Also in FY 2003, additional site-specific development of the complete system was to be conducted at the NTS. To complete the project, certain data, depending on site-specific conditions or restrictions involving distribution of data, were to be made available through the Internet via the DRI/Western Region Climate Center (WRCC) WEABASE platform. If the complete project had been implemented, the system schematic would have looked like the figure on the following page.

  3. Automate The Tax Levy Process (Taxy)

    Data.gov (United States)

    Social Security Administration — This data store contains information to support the automation of Tax Levy payments. Data includes but is not limited to Title II benefits adjustment data, as well...

  4. Integrated Automation System for Rare Earth Countercurrent Extraction Process

    Institute of Scientific and Technical Information of China (English)

    柴天佑; 杨辉

    2004-01-01

    The low automation level in industrial rare-earth extraction processes results in high production cost, inconsistent product quality and great consumption of resources in China. An integrated automation system for the rare-earth extraction process is proposed to realize optimal product indices, such as product purity, recycle rate and output. The optimal control strategy for the output components, and the structure and function of the two-graded integrated automation system, composed of the process management grade and the process control grade, are discussed. This system has been successfully applied to a HAB yttrium extraction production process and was found to provide optimal control, optimal operation, optimal management and remarkable benefits.

  5. Tools for automated acoustic monitoring within the R package monitoR

    Science.gov (United States)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.
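    Independently of the package's R interface, the spectrogram cross-correlation detector can be pictured as sliding a template spectrogram across a survey spectrogram and scoring the correlation at each time offset; a conceptual Python sketch (parameters and cutoff are assumptions, not monitoR's defaults):

```python
import numpy as np
from scipy import signal

def xcorr_detect(survey, template, fs=22050, nperseg=512, cutoff=0.6):
    """Score a template spectrogram against every time offset of a survey
    spectrogram; return (frame, score) pairs whose Pearson correlation
    exceeds the cutoff. Conceptual only -- not monitoR's implementation."""
    _, _, S = signal.spectrogram(survey, fs=fs, nperseg=nperseg)
    _, _, T = signal.spectrogram(template, fs=fs, nperseg=nperseg)
    n = T.shape[1]
    hits = []
    for i in range(S.shape[1] - n + 1):
        score = np.corrcoef(S[:, i:i + n].ravel(), T.ravel())[0, 1]
        if score > cutoff:
            hits.append((i, score))
    return hits

# Hypothetical use: detections = xcorr_detect(survey_audio, call_template)
```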

  6. Tools for automated acoustic monitoring within the R package monitoR

    DEFF Research Database (Denmark)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.

  7. An overview of the Environmental Monitoring Computer Automation Project

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, S.M.; Lorenz, R.

    1992-01-01

    The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, analytical methods and results obtained is important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS.

  8. An overview of the Environmental Monitoring Computer Automation Project

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, S.M.; Lorenz, R.

    1992-12-31

    The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, analytical methods and results obtained is important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS.

  9. An overview of the Environmental Monitoring Computer Automation Project

    International Nuclear Information System (INIS)

    The Savannah River Site (SRS) was built to produce plutonium and tritium for national defense. As a result of site operations, routine and accidental releases of radionuclides have occurred. The effects these releases have on the local population and environment are of concern to the Department of Energy (DOE) and SRS personnel. Each year, approximately 40,000 environmental samples are collected. The quality of the samples, analytical methods and results obtained is important to site personnel. The Environmental Monitoring Computer Automation Project (EMCAP) was developed to better manage scheduling, log-in, tracking, analytical results, and report generation. EMCAP can be viewed as a custom Laboratory Information Management System (LIMS) with the ability to schedule samples, generate reports, and query data. The purpose of this paper is to give an overview of the SRS environmental monitoring program, describe the development of EMCAP software and hardware, discuss the different software modules, show how EMCAP improved the Environmental Monitoring Section program, and examine the future of EMCAP at SRS

  10. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; McCormick, T. W.; Lewis, E. R.; Hogan, S. J. (Spire Corporation)

    1999-08-31

    This report describes work performed by Spire Corporation during Phase 1 of this three-phase PVMaT subcontract to develop new automated post-lamination processes for PV module manufacturing. These processes are applicable to a very broad range of module types, including those made with wafer-based and thin-film solar cells. No off-the-shelf automation was available for these processes prior to this program. Spire conducted a survey of PV module manufacturers to identify current industry practices and to determine the requirements for the automated systems being developed in this program. Spire also completed detailed mechanical and electrical designs and developed software for two prototype automation systems: a module buffer storage system, designated the SPI-BUFFER 350, and an integrated module testing system, designated the SPI-MODULE QA 350. Researchers fabricated, tested, and evaluated both systems with module components from several module manufacturers. A new sun simulator, the SPI-SUN SIMULATOR 350i, was designed with a test area that can handle most production modules without consuming excessive floor space. Spire's subcontractor, the Automation and Robotics Research Institute (ARRI) at the University of Texas, developed and demonstrated module edge trimming, edge sealing, and framing processes that are suitable for automation. The automated processes under development throughout this program are being designed to be combined together to create automated production lines. ARRI completed a cost study to determine the level of investment that can be justified by implementing automation for post-lamination assembly and testing processes. The study concluded that a module production line operating two shifts per day and producing 10 MW of modules per year can justify $2.37 million in capital equipment, assuming a 5-year payback period.

  11. AUTOMATED DEPLOYMENT PROCESS WITHIN ENTERPRISE SOLUTIONS : Case Episerver

    OpenAIRE

    Heinänen, Michael

    2016-01-01

    This research focused on studying the concept of automated deployment in Web-hosted applications. The work, conducted within Episerver, had three objectives: to reduce deployment times, cost and dependency on managed services engineers; to introduce a more reliable deployment solution with the current infrastructure in order to minimize human error; and to develop an agile and secure automated deployment process for the case company. The research presents a fully functional deplo...

  12. Process monitoring by display devices

    International Nuclear Information System (INIS)

    The use of extensive automation, regulating, protection and limiting devices, together with the application of ergonomic principles (e.g. the increased use of mimic diagrams), has made plants capable of continued operation. German nuclear power stations hold a top position worldwide as regards safety and availability. However, renewed efforts are already required to overcome the unmanageable state caused by the large number and the miniaturization of elements. One attempt at this using conventional technology is a mimic board, which was provided in a power station just being commissioned. Such mimic boards give the opportunity of monitoring the most important parameters at a glance, but there are limits to their use owing to the large space required. The use of VDU screens represents a possible solution to this problem. (orig./DG)

  13. A system for automated monitoring of embankment deformation along the Qinghai-Tibet Railway in permafrost regions

    Institute of Scientific and Technical Information of China (English)

    YongPeng Yang; YaoHui Qu; HanCheng Cai; Jia Cheng; CaiMei Tang

    2015-01-01

    At present, the monitoring of embankment deformation in permafrost regions along the Qinghai-Tibet Railway is mainly done manually. However, the harsh climate on the plateau affects the results greatly by lowering the observation frequency, so the manual monitoring can barely meet the observational demand. This research develops a system of automated monitoring of embankment deformation, and aims to address the problems caused by the plateau climate and the permafrost conditions in the region. The equipment consists of a monitoring module, a data collection module, a transmission module, and a data processing module. The field experiments during this program indicate that (1) the combined automated monitoring device overcame the problems associated with the complicated and tough plateau environment by means of wireless transmission and automatic analysis of the embankment settlement data; (2) the calibration of the combined settlement gauge at −20 °C was highly accurate, with an error rate always <0.5%; (3) the gauge calibration at high-temperature conditions was also highly accurate, with an error rate <0.5% even though the surface of the instrument reached more than 50 °C; and (4) compared with the data manually taken, the data automatically acquired during field monitoring experiments demonstrated that the combined settlement gauge and the automated monitoring system could meet the requirements of the monitoring mission in permafrost regions along the Qinghai-Tibet Railway.

  14. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.;

    2009-01-01

    Biomimetic membrane experiments create a need for data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually.

  15. Process Monitoring for Nuclear Safeguards

    International Nuclear Information System (INIS)

    Process Monitoring has long been used to evaluate industrial processes and operating conditions in nuclear and non-nuclear facilities. In nuclear applications there is a recognized need to demonstrate the safeguards benefits from using advanced process monitoring on spent fuel reprocessing technologies and associated facilities, as a complement to nuclear materials accounting. This can be accomplished by: defining credible diversion pathway scenarios as a sample problem; using advanced sensor and data analysis techniques to illustrate detection capabilities; and formulating 'event detection' methodologies as a means to quantify performance of the safeguards system. Over the past 30 years there have been rapid advances and improvement in the technology associated with monitoring and control of industrial processes. In the context of bulk handling facilities that process nuclear materials, modern technology can provide more timely information on the location and movement of nuclear material to help develop more effective safeguards. For international safeguards, inspection means verification of material balance data as reported by the operator through the State to the international inspectorate agency. This verification recognizes that the State may be in collusion with the operator to hide clandestine activities, potentially during abnormal process conditions with falsification of data to mask the removal. Records provided may show material is accounted for even though a removal occurred. Process monitoring can offer additional fidelity during a wide variety of operating conditions to help verify the declaration or identify possible diversions. The challenge is how to use modern technology for process monitoring and control in a proprietary operating environment subject to safeguards inspectorate or other regulatory oversight. Under the U.S. National Nuclear Security Administration's Next Generation Safeguards Initiative, a range of potential safeguards applications

  16. Automated Instrumentation, Monitoring and Visualization of PVM Programs Using AIMS

    Science.gov (United States)

    Mehra, Pankaj; VanVoorst, Brian; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    We present views and analysis of the execution of several PVM (Parallel Virtual Machine) codes for Computational Fluid Dynamics on a network of Sparcstations, including: (1) NAS Parallel Benchmarks CG and MG; (2) a multi-partitioning algorithm for NAS Parallel Benchmark SP; and (3) an overset grid flowsolver. These views and analysis were obtained using our Automated Instrumentation and Monitoring System (AIMS) version 3.0, a toolkit for debugging the performance of PVM programs. We will describe the architecture, operation and application of AIMS. The AIMS toolkit contains: (1) Xinstrument, which can automatically instrument various computational and communication constructs in message-passing parallel programs; (2) Monitor, a library of runtime trace-collection routines; (3) VK (Visual Kernel), an execution-animation tool with source-code clickback; and (4) Tally, a tool for statistical analysis of execution profiles. Currently, Xinstrument can handle C and Fortran 77 programs using PVM 3.2.x; Monitor has been implemented and tested on Sun 4 systems running SunOS 4.1.2; and VK uses X11R5 and Motif 1.2. Data and views obtained using AIMS clearly illustrate several characteristic features of executing parallel programs on networked workstations: (1) the impact of long message latencies; (2) the impact of multiprogramming overheads and associated load imbalance; (3) cache and virtual-memory effects; and (4) significant skews between workstation clocks. Interestingly, AIMS can compensate for constant skew (zero drift) by calibrating the skew between a parent and its spawned children. In addition, AIMS' skew-compensation algorithm can adjust timestamps in a way that eliminates physically impossible communications (e.g., messages going backwards in time). Our current efforts are directed toward creating new views to explain the observed performance of PVM programs. Some of the features planned for the near future include: (1) ConfigView, showing the physical topology

  17. An automated digital imaging system for environmental monitoring applications

    Science.gov (United States)

    Bogle, Rian; Velasco, Miguel; Vogel, John

    2013-01-01

    Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.

  18. The Automated Discovery of Hybrid Processes

    DEFF Research Database (Denmark)

    Slaats, Tijs; Reijers, Hajo; Maggi, Fabrizio Maria

    2014-01-01

    The declarative-procedural dichotomy is highly relevant when choosing the most suitable process modeling language to represent a discovered process. Less-structured processes with a high level of variability can be described in a more compact way using a declarative language. By contrast, procedural process modeling languages seem more suitable to describe structured and stable processes. However, in various cases, a process may incorporate parts that are better captured in a declarative fashion, while other parts are more suitable to be described procedurally. In this paper, we present a technique for discovering from an event log a so-called hybrid process model. A hybrid process model is hierarchical, where each of its sub-processes may be specified in a declarative or procedural fashion. We have implemented the proposed approach as a plug-in of the ProM platform. To evaluate the approach...

  19. Automated business processes in outbound logistics: An information system perspective

    DEFF Research Database (Denmark)

    Tambo, Torben

    2010-01-01

    This article analyses the potentials and possibilities of changing outbound logistics from highly labour-intensive on the information processing side to a more or less fully automated solution. Automation offers advantages in terms of direct labour cost reduction as well as indirect cost reduction. … It is not a matter of whether the system can or cannot, but a matter of making a technological and economical best fit. Alongside the formal implementation issues there is a parallel process focused on a mutuality between IT teams, business users, management and external stakeholders in offering relevant inputs...

  20. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino;

    2014-01-01

    In automated production processes grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases and describes how different principles are adopted at different scales in different applications; applications from assembly to disassembly, from aerospace to the food industry, from textile to logistics are discussed. Finally, the most recent research is reviewed in order to introduce the new trends in grasping; these provide an outlook on the future of both grippers and robotic hands in automated production processes.

  1. Initial Flight Results for an Automated Satellite Beacon Health Monitoring Network

    OpenAIRE

    Young, Anthony; Kitts, Christopher; Neumann, Michael; Mas, Ignacio; Rasay, Mike

    2010-01-01

    Beacon monitoring is an automated satellite health monitoring architecture that combines telemetry analysis, periodic low data rate message broadcasts by a spacecraft, and automated ground reception and data handling in order to implement a cost-effective anomaly detection and notification capability for spacecraft missions. Over the past two decades, this architecture has been explored and prototyped for a range of spacecraft mission classes to include use on NASA deep space probes, military...

  2. Toward the automation of road networks extraction processes

    Science.gov (United States)

    Leymarie, Frederic; Boichis, Nicolas; Airault, Sylvain; Jamet, Olivier

    1996-12-01

    Syseca and IGN are working on various steps in the ongoing march from digital photogrammetry to the semi-automation and ultimately the full automation of data manipulation, i.e., capture and analysis. The immediate goals are to reduce the production costs and the data availability delays. Within this context, we have tackled the distinctive problem of 'automated road network extraction.' The methodology adopted is to first study semi-automatic solutions, which probably increase the global efficiency of human operators in topographic data capture; in a second step, automatic solutions are designed based upon the gained experience. We report on different (semi-)automatic solutions for the road following algorithm. One key aspect of our method is to have the stages of 'detection' and 'geometric recovery' cooperate while remaining distinct. 'Detection' is based on a local (texture) analysis of the image, while 'geometric recovery' is concerned with the extraction of 'road objects' from both monocular and stereo information. 'Detection' is a low-level visual process, 'reasoning' directly at the level of image intensities, while the mid-level visual process, 'geometric recovery', uses contextual knowledge about roads, both generic, e.g. parallelism of borders, and specific, e.g. using previously extracted road segments and disparities. We then pursue our 'march' by reporting on steps we are exploring toward full automation. We have in particular made attempts at tackling the automation of the initialization step to start searching in a valid direction.

  3. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  4. Automated Tow Placement Processing and Characterization of Composites

    Science.gov (United States)

    Prabhakaran, R.

    2004-01-01

    One of the initial objectives of the project was automated tow placement (ATP), in which a robot is used to place a collimated band of pre-impregnated ribbons or a wide preconsolidated tape onto a tool surface. It was proposed to utilize the Automated Tow Placement machine that was already available and to fabricate carbon fiber reinforced PEEK (polyether-ether-ketone) matrix composites. After initial experiments with the fabrication of flat plates, composite cylinders were to be fabricated. Specimens from the fabricated parts were to be tested for mechanical characterization. A second objective was to conduct various types of tests for characterizing composite specimens cured by different fabrication processes.

  5. Monitoring cognitive function and need with the automated neuropsychological assessment metrics in Decompression Sickness (DCS) research

    Science.gov (United States)

    Nesthus, Thomas E.; Schiflett, Sammuel G.

    1993-01-01

    Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM) developed through the Office of Military Performance Assessment Technology (OMPAT) was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables as well as RMS error were collected. An automated mood survey provided 'state' information on six scales including anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.

  6. Automating Software Development Process using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet; Damiani, Ernesto; Jain, Lakhmi C.; Madravio, Mauro

    2004-01-01

    In this chapter, we aim to highlight how fuzzy logic can be a valid expressive tool to manage the software development process. We characterize a software development method in terms of two major components: artifact types and methodological rules. Classes, attributes, operations, and inheritance an

  7. Monitoring and control of fine abrasive finishing processes

    DEFF Research Database (Denmark)

    Lazarev, Ruslan

    In engineering, surfaces with specified functional properties are in high demand in various applications. A desired surface finish can be obtained using several methods. Abrasive finishing is one of the most important processes in the manufacturing of mould and die tools. It is a principal method to remove unwanted material and obtain the desired geometry, surface quality and surface functional properties. The automation and computerization of finishing processes involves the utilisation of robots, specialized machines with several degrees of freedom, sensors and data acquisition systems. The focus of this work was to investigate foundations for process monitoring and control methods applied to a semi-automated polishing machine based on an industrial robot. The monitoring system was built on an NI data acquisition system with two sensors, an acoustic emission sensor and an accelerometer. Acquired sensory
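    A typical first step with such sensory data is windowed feature extraction from the acoustic-emission and vibration streams; a minimal NumPy sketch of per-window RMS energy, with assumed window sizes (not the thesis's actual feature set):

```python
import numpy as np

def windowed_rms(samples, window=1024):
    """RMS energy per non-overlapping window of an AE/accelerometer stream --
    a common raw feature for monitoring abrasive finishing (illustrative)."""
    n = len(samples) // window
    frames = np.reshape(samples[:n * window], (n, window))
    return np.sqrt(np.mean(frames ** 2, axis=1))

# Hypothetical use: a rising RMS trend in the AE channel may indicate
# increased abrasive engagement between tool and surface.
# rms = windowed_rms(ae_signal, window=2048)
```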

  8. ECG acquisition and automated remote processing

    CERN Document Server

    Gupta, Rajarshi; Bera, Jitendranath

    2014-01-01

    The book is focused on the area of remote processing of ECG in the context of telecardiology, an emerging area in the field of biomedical engineering applications. Considering the poor infrastructure and inadequate numbers of physicians in rural healthcare clinics in India and other developing nations, telemedicine services assume special importance. Telecardiology, a specialized area of telemedicine, is taken up in this book considering the importance of cardiac diseases, which are prevalent in the population under discussion. The main focus of this book is to discuss different aspects of ECG acquisition, its remote transmission, and computerized ECG signal analysis for feature extraction. It also discusses ECG compression and the application of standalone embedded systems to develop a cost-effective solution for a telecardiology system.

  9. Trust in automation. Part II. Experimental studies of trust and human intervention in a process control simulation.

    Science.gov (United States)

    Muir, B M; Moray, N

    1996-03-01

    Two experiments are reported which examined operators' trust in and use of the automation in a simulated supervisory process control task. Tests of the integrated model of human trust in machines proposed by Muir (1994) showed that models of interpersonal trust capture some important aspects of the nature and dynamics of human-machine trust. Results showed that operators' subjective ratings of trust in the automation were based mainly upon their perception of its competence. Trust was significantly reduced by any sign of incompetence in the automation, even one which had no effect on overall system performance. Operators' trust changed very little with experience, with a few notable exceptions. Distrust in one function of an automatic component spread to reduce trust in another function of the same component, but did not generalize to another independent automatic component in the same system, or to other systems. There was high positive correlation between operators' trust in and use of the automation; operators used automation they trusted and rejected automation they distrusted, preferring to do the control task manually. There was an inverse relationship between trust and monitoring of the automation. These results suggest that operators' subjective ratings of trust and the properties of the automation which determine their trust, can be used to predict and optimize the dynamic allocation of functions in automated systems. PMID:8849495

  10. Automated control system for a mashing process

    Science.gov (United States)

    Teterin, E.; Rudnickiy, V.

    2015-10-01

    The goal of this paper is to describe a control system for the mashing process, which is the first step in brewing beer. Mashing is the procedure in which the fermentable (and some non-fermentable) sugars are extracted from malts. The software is based on LabVIEW and is used to control an NI CompactRIO. The main target of the project is to reach predefined temperature levels and maintain them during the pauses; when the necessary break time has ended, the system moves on to the next value. Precise control of the temperatures during the breaks is one of the critical factors that define the texture and alcohol content of the beer. The system has two tanks, each with a PT100 resistance thermometer, a heat exchanger (coil), a heater and a pump. The first tank has a heating element in order to raise the temperature in the other one. The project includes a practical implementation, with explanations and graphs that demonstrate the working ability of this control system.
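
    The reach-then-hold logic described above is easy to picture in code. Below is a minimal sketch assuming a trivial on/off heater and a first-order tank model; the setpoint schedule, tolerance and heating rates are illustrative assumptions, not the authors' LabVIEW/CompactRIO implementation.

      # Step-and-hold mash control sketch; all numbers are illustrative.
      MASH_STEPS = [(52.0, 15 * 60), (63.0, 40 * 60), (72.0, 20 * 60)]  # (deg C, hold s)
      TOLERANCE = 0.5   # acceptable deviation during a hold, deg C
      DT = 1.0          # control-loop period, s

      def read_temperature(state):
          """Stand-in for a PT100 reading; the 'tank' is a trivial model."""
          return state["temp"]

      def set_heater(state, on):
          """Stand-in for the heater relay; heats ~0.05 C/s, loses ~0.005 C/s."""
          state["temp"] += (0.05 if on else 0.0) * DT - 0.005 * DT

      def run_mash(state):
          for setpoint, hold_time in MASH_STEPS:
              # Phase 1: ramp up to the new setpoint.
              while read_temperature(state) < setpoint - TOLERANCE:
                  set_heater(state, True)
              # Phase 2: hold the temperature for the full rest duration.
              held = 0.0
              while held < hold_time:
                  set_heater(state, read_temperature(state) < setpoint)  # on/off control
                  held += DT
              print(f"finished {setpoint} C rest after {hold_time / 60:.0f} min")

      run_mash({"temp": 45.0})

    A real controller would replace the on/off rule with PID and talk to actual sensor hardware, but the reach-then-hold structure is the same.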

  11. Technology transfer potential of an automated water monitoring system. [market research

    Science.gov (United States)

    Jamieson, W. M.; Hillman, M. E. D.; Eischen, M. A.; Stilwell, J. M.

    1976-01-01

    The nature and characteristics of the potential economic need (markets) for a highly integrated water quality monitoring system were investigated. The technological, institutional and marketing factors that would influence the transfer and adoption of an automated system were studied for application to public and private water supply, public and private wastewater treatment and environmental monitoring of rivers and lakes.

  12. Monitoring of polymer melt processing

    International Nuclear Information System (INIS)

    The paper reviews the state of the art of in-line and on-line monitoring during polymer melt processing by compounding, extrusion and injection moulding. Different spectroscopic and scattering techniques as well as conductivity and viscosity measurements are reviewed and compared concerning their potential for different process applications. In addition to information on chemical composition and the state of the process, the in situ detection of morphology, which is of specific interest for multiphase polymer systems such as polymer composites and polymer blends, is described in detail. For these systems, the product properties strongly depend on the phase or filler morphology created during processing. Examples of optical (UV/vis, NIR) and ultrasonic attenuation spectra recorded during extrusion are given, which were found to be sensitive to the chemical composition as well as to the size and degree of dispersion of micro- or nanofillers in the polymer matrix. By small-angle light scattering experiments, process-induced structures were detected in blends of incompatible polymers during compounding. Using conductivity measurements during extrusion, the influence of processing conditions on the electrical conductivity of polymer melts with conductive fillers (carbon black or carbon nanotubes) was monitored. (topical review)

  13. Mass Spectrometry-Based Monitoring of Millisecond Protein-Ligand Binding Dynamics Using an Automated Microfluidic Platform

    Energy Technology Data Exchange (ETDEWEB)

    Cong, Yongzheng; Katipamula, Shanta; Trader, Cameron D.; Orton, Daniel J.; Geng, Tao; Baker, Erin Shammel; Kelly, Ryan T.

    2016-03-24

    Characterizing protein-ligand binding dynamics is crucial for understanding protein function and developing new therapeutic agents. We have developed a novel microfluidic platform that features rapid mixing of protein and ligand solutions, variable incubation times, and on-chip electrospray ionization to perform label-free, solution-based monitoring of protein-ligand binding dynamics. This platform offers many advantages including automated processing, rapid mixing, and low sample consumption.

  14. The automation of analysis of technological process effectiveness

    Directory of Open Access Journals (Sweden)

    B. Krupińska

    2007-10-01

    Purpose: Improvement of technological processes through technological efficiency analysis can create the basis for their optimization. Informatization and computerization of an ever wider scope of activity is one of the most important current development trends of an enterprise. Design/methodology/approach: Appointing indicators makes it possible to evaluate process efficiency, which can constitute the optimization basis for a particular operation. The model of technological efficiency analysis is based on particular efficiency indicators that characterize an operation, taking into account the following criteria: operation - material, operation - machine, operation - human, operation - technological parameters. Findings: From the point of view of quality and the correctness of the choice of technology, comprehensive assessment of technological processes makes up the basis of technological efficiency analysis. Results of the technological efficiency analysis of a technological process prove that the chosen model makes it possible to improve the process continuously through technological analysis; the application of computer assistance makes it possible to automate the efficiency analysis and, finally, the controlled improvement of technological processes. Practical implications: Because of the complexity of technological efficiency analysis, an AEPT computer analysis was created, which yields: operation efficiency indicators, with indicators having minimal acceptable values distinguished; efficiency values of the applied samples; and the value of technological process efficiency. Originality/value: The created computer analysis of technological process efficiency (AEPT) makes it possible to automate the process of analysis and optimization.

  15. Linked Data approach for selection process automation in Systematic Reviews

    OpenAIRE

    Torchiano, Marco; Morisio, Maurizio; Tomassetti, Federico Cesare Argentino; Ardito, Luca; Vetro, Antonio; Rizzo, Giuseppe

    2011-01-01

    Background: a systematic review identifies, evaluates and synthesizes the available literature on a given topic using scientific and repeatable methodologies. The significant workload required and the subjectivity bias could affect results. Aim: semi-automate the selection process to reduce the amount of manual work needed and the consequent subjectivity bias. Method: extend and enrich the selection of primary studies using the existing technologies in the field of Linked Data and text mining...

  16. Automated DEM extraction in digital aerial photogrammetry: precisions and validation for mass movement monitoring

    Directory of Open Access Journals (Sweden)

    A. Pesci

    2005-06-01

    Automated procedures for photogrammetric image processing and Digital Elevation Model (DEM) extraction yield high-precision terrain models in a short time, reducing manual editing; their accuracy is strictly related to image quality and terrain features. After an analysis of the performance of the Digital Photogrammetric Workstation (DPW 770 Helava), the paper compares DEMs derived from different surveys and registered in the same reference system. In the case of stable areas, the distribution of height residuals and their mean and standard deviation values indicate that the theoretical accuracy is achievable automatically when the terrain is characterized by regular morphology. Steep slopes, corrugated surfaces, vegetation and shadows can degrade the results even if manual editing procedures are applied. The comparison of multi-temporal DEMs on unstable areas allows the monitoring of surface deformation and morphological changes.
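
    The stable-area check described above boils down to simple statistics of the height residuals between co-registered DEMs. A minimal sketch with numpy, assuming two synthetic grids already in the same reference system (the 3-sigma change flag is an illustrative choice, not the authors' workflow):

      import numpy as np

      rng = np.random.default_rng(0)
      dem_epoch1 = rng.normal(100.0, 5.0, size=(200, 200))              # first survey
      dem_epoch2 = dem_epoch1 + rng.normal(0.0, 0.05, size=(200, 200))  # repeat survey

      residuals = dem_epoch2 - dem_epoch1
      print(f"mean residual: {residuals.mean():.3f} m")
      print(f"std of residuals: {residuals.std():.3f} m")

      # On a stable area the mean should be ~0 and the std close to the DEM's
      # theoretical accuracy; large offsets or dispersion indicate registration
      # errors or real surface change (mass movement).
      changed = np.abs(residuals) > 3 * residuals.std()
      print(f"cells flagged as significant change: {changed.sum()}")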

  17. An extended process automation system : an approach based on a multi-agent system

    OpenAIRE

    Seilonen, Ilkka

    2006-01-01

    This thesis describes studies on the application of multi-agent systems (MAS) to enhance process automation systems. A specification of an extended process automation system is presented. According to this specification, MAS can be used to extend the functionality of ordinary process automation systems at higher levels of control. Anticipated benefits of the specification include enhanced reconfigurability, responsiveness and flexibility of process automation. Previous res…

  18. Intelligent sensor-model automated control of PMR-15 autoclave processing

    Science.gov (United States)

    Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.

    An intelligent sensor-model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent electromagnetic sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process, including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in the viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows the comparison of the sensor monitoring with the model predictions to be broken down into a series of discrete steps and provides a language for making decisions on what to do next regarding time, temperature and pressure.
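
    The monitor-compare-decide loop this describes can be caricatured in a few lines. In the sketch below, `read_fdems_viscosity` and `model_predicted_viscosity` are hypothetical stand-ins for the FDEMS readout and the Loos model, and the adjustment rule is invented for illustration; it is not the QPAL decision language.

      def read_fdems_viscosity(t_min):
          return 100.0 * 0.90 ** t_min   # placeholder sensor trace

      def model_predicted_viscosity(t_min):
          return 100.0 * 0.85 ** t_min   # placeholder model prediction

      setpoint_c = 150.0
      for t in range(0, 60, 5):
          measured = read_fdems_viscosity(t)
          predicted = model_predicted_viscosity(t)
          # Decision step: if the resin advances more slowly than the
          # optimized cycle expects, raise the hold temperature slightly.
          if measured > 1.2 * predicted:
              setpoint_c += 2.0
          print(f"t={t:3d} min  measured={measured:6.1f}  "
                f"predicted={predicted:6.1f}  setpoint={setpoint_c:.0f} C")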

  19. Inter-process handling automating system; Koteikan handling jidoka system

    Energy Technology Data Exchange (ETDEWEB)

    Nakamura, H. [Meidensha Corp., Tokyo (Japan)

    1994-10-18

    This paper introduces the automation of loading work at production sites using robots. Loading robots are required to perform complex movements; they are used for loading work on processing machines, which requires six degrees of freedom, and for relatively simple palletizing work that can be handled with four degrees of freedom. The `inter-machine handling system` is an automated system based on a ceiling-mounted travelling robot, in which workpiece model determination and positional-shift measurement are carried out by image processing. The robot uses the image information to exchange hands automatically as required and clamp a workpiece; it then travels to a machining center (M/C) to replace the processed workpiece, and places the machined workpiece onto a multi-axial dedicated machine. Five processing machines are operated in parallel, with the cycle time matched to that of the handling process, and a machine that has finished processing is served in order of priority. As a result, improved productivity and a reduction of two workers were achieved simultaneously. 6 figs., 5 tabs.

  20. Advanced oxidation protein products (AOPP) for monitoring oxidative stress in critically ill patients: a simple, fast and inexpensive automated technique.

    Science.gov (United States)

    Selmeci, László; Seres, Leila; Antal, Magda; Lukács, Júlia; Regöly-Mérei, Andrea; Acsády, György

    2005-01-01

    Oxidative stress is known to be involved in many human pathological processes. Although there are numerous methods available for the assessment of oxidative stress, most of them are still not easily applicable in a routine clinical laboratory due to the complex methodology and/or lack of automation. In research into human oxidative stress, the simplification and automation of techniques represent a key issue from a laboratory point of view at present. In 1996 a novel oxidative stress biomarker, referred to as advanced oxidation protein products (AOPP), was detected in the plasma of chronic uremic patients. Here we describe in detail an automated version of the originally published microplate-based technique that we adapted for a Cobas Mira Plus clinical chemistry analyzer. AOPP reference values were measured in plasma samples from 266 apparently healthy volunteers (university students; 81 male and 185 female subjects) with a mean age of 21.3 years (range 18-33). Over a period of 18 months we determined AOPP concentrations in more than 300 patients in our department. Our experiences appear to demonstrate that this technique is especially suitable for monitoring oxidative stress in critically ill patients (sepsis, reperfusion injury, heart failure) even at daily intervals, since AOPP exhibited rapid responses in both directions. We believe that the well-established relationship between AOPP response and induced damage makes this simple, fast and inexpensive automated technique applicable in daily routine laboratory practice for assessing and monitoring oxidative stress in critically ill or other patients.

  1. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness

    Science.gov (United States)

    Singh, Preetpal

    Canada is home to thousands of freshwater lakes and rivers. Apart from being sources of great natural beauty, rivers and lakes are an important source of water, food and transportation. Northern Canada experiences extreme cold temperatures in the winter, resulting in a freeze-up of regional lakes and rivers. Frozen lakes and rivers offer unique opportunities in terms of wildlife harvesting and winter transportation: ice roads built on frozen rivers and lakes are vital supply lines for industrial operations in the remote north, and monitoring the ice freeze-up and break-up dates annually can help predict regional climatic changes. Lake ice impacts a variety of physical, ecological and economic processes. The construction and maintenance of a winter road can cost millions of dollars annually, and a good understanding of ice mechanics is required to build an ice road and deem it safe. A crucial factor in calculating the load-bearing capacity of an ice sheet is its thickness; construction costs are mainly attributed to producing and maintaining a specific thickness and density of ice that can support different loads. Climate change is leading to warmer temperatures, causing the ice to thin faster; at a certain point, a winter road may no longer be thick enough to support travel and transportation. There is considerable interest in monitoring winter road conditions given the high construction and maintenance costs involved. Remote sensing technologies such as Synthetic Aperture Radar have been successfully utilized to study the extent of ice covers and to record freeze-up and break-up dates of ice on lakes and rivers across the north, and ice road builders often use ultrasound equipment to measure ice thickness. However, an automated monitoring system based on machine vision and image processing technology that can measure ice thickness on lakes has not yet been developed. Machine vision and image processing techniques have successfully been used in manufacturing…

  2. An automated platform for phytoplankton ecology and aquatic ecosystem monitoring

    NARCIS (Netherlands)

    Pomati, F.; Jokela, J.; Simona, M.; Veronesi, M.; Ibelings, B.W.

    2011-01-01

    High quality monitoring data are vital for tracking and understanding the causes of ecosystem change. We present a potentially powerful approach for phytoplankton and aquatic ecosystem monitoring, based on the integration of scanning flow-cytometry for the characterization and counting of algal cells with…

  3. Automated Quality Monitoring and Validation of the CMS Reconstruction Software

    CERN Document Server

    Piparo, Danilo

    2011-01-01

    … assessed. The automated procedure adopted by CMS to accomplish this ambitious task and the innovative tools developed for that purpose are presented. The whole chain of steps is illustrated, starting from the application testing over large ensembles of datasets emulating Tier-0, Tier-1 and Tier-2 environments, to the collection of the produced physical quantities in the form of several hundred thousand histograms, to the estimation of their compatibility between releases, to the final production and publication of reports characterised by an ef…

  4. Tracking forest canopy stress from an automated proximal hyperspectral monitoring system

    Science.gov (United States)

    Woodgate, William; van Gorsel, Eva; Hughes, Dale; Cabello-Leblic, Arantxa

    2016-04-01

    Increasing climate variability and associated extreme weather events such as drought are likely to profoundly affect ecosystems, as many ecological processes are more sensitive to climate extremes than to changes in the mean states. However, the response of vegetation to these changes is one of the largest uncertainties in projecting future climate, carbon sequestration, and water resources. This remains a major limitation for long term climate prediction models integrating vegetation dynamics that are crucial for modelling the interplay of water, carbon and radiation fluxes. Satellite remote sensing data, such as that from the MODIS, Landsat and Sentinel missions, are the only viable means to study national and global vegetation trends. Highly accurate in-situ data is critical to better understand and validate our satellite products. Here, we developed a fully automated hyperspectral monitoring system installed on a flux monitoring tower at a mature Eucalypt forest site. The monitoring system is designed to provide a long-term (May 2014 - ongoing) and high temporal characterisation (3 acquisitions per day) of the proximal forest canopy to an unprecedented level of detail. The system comprises four main instruments: a thermal imaging camera and hyperspectral line camera (spectral ranges 7.5-14 μm and 0.4-1 μm, respectively), an upward pointing spectrometer (350-1000 nm), and hemispherical camera. The time series of hyperspectral and thermal imagery and flux tower data provides a unique dataset to study the impacts of logging, nutrient, and heat stress on trees and forest. Specifically, the monitoring system can be used to derive a range of physiological and structural indices that are also derived by satellites, such as PRI, TCARI/OSAVI, and NDVI. The monitoring system, to our knowledge, is the first fully automated data acquisition system that allows for spatially resolved spectral measurements at the sub-crown scale. Preliminary results indicate that canopy

  5. Text mining from ontology learning to automated text processing applications

    CERN Document Server

    Biemann, Chris

    2014-01-01

    This book comprises a set of articles that specify the methodology of text mining, describe the creation of lexical resources in the framework of text mining and use text mining for various tasks in natural language processing (NLP). The analysis of large amounts of textual data is a prerequisite to build lexical resources such as dictionaries and ontologies and also has direct applications in automated text processing in fields such as history, healthcare and mobile applications, just to name a few. This volume gives an update in terms of the recent gains in text mining methods and reflects

  6. Automated separation process for radioanalytical purposes at nuclear power plants.

    Science.gov (United States)

    Nagy, L G; Vajda, N; Vodicska, M; Zagyvai, P; Solymosi, J

    1987-10-01

    Chemical separation processes have been developed to remove the matrix components and thus to determine fission products, especially radioiodine nuclides, in the primary coolant of WWER-type nuclear reactors. Special procedures have been elaborated to enrich long-lived nuclides in waste waters to be released and to separate and enrich caesium isotopes in the environment. All processes are based mainly on ion-exchange separations using amorphous zirconium phosphate. Automated equipment was constructed to meet the demands of the plant personnel for serial analysis.

  7. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    Science.gov (United States)

    Kazarov, A.; Lehmann Miotto, G.; Magnoni, L.

    2012-06-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for collecting and transferring ATLAS experimental data from the detectors to the mass storage system. It relies on a large, distributed computing environment including thousands of computing nodes with thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting and operational monitoring. During data-taking runs, streams of messages sent by applications via the message reporting system, together with data published by applications via information services, are the main sources of knowledge about the correctness of running operations. The flow of data produced (at an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful information is not in a single message or update but in the aggregated behavior over a certain time-line. The AAL project aims at reducing the manpower needs and at assuring a constantly high quality of problem detection by automating most of the monitoring tasks and providing real-time correlation of data-taking and system metrics. The project combines technologies from different disciplines; in particular it leverages an Event-Driven Architecture to unify the flow of data from the ATLAS infrastructure, a Complex Event Processing (CEP) engine for the correlation of events, and a message-oriented architecture for component integration. The project is composed of two main components: a core processing engine, responsible for the correlation of events through expert-defined queries, and a web-based front-end to present real-time information and interact with the system. All components work in a loosely coupled, event-based architecture, with a message broker…
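
    The kind of time-windowed correlation such a CEP engine performs can be illustrated with a small sliding-window sketch; the event format, window length and threshold below are assumptions for illustration, not the AAL expert-defined queries.

      from collections import deque
      import time

      WINDOW_S = 10.0   # sliding window length, seconds
      THRESHOLD = 5     # error messages per window per application

      class SlidingWindowCorrelator:
          def __init__(self):
              self.events = {}  # application name -> deque of timestamps

          def on_event(self, app, severity, timestamp):
              if severity != "ERROR":
                  return None
              window = self.events.setdefault(app, deque())
              window.append(timestamp)
              while window and timestamp - window[0] > WINDOW_S:
                  window.popleft()  # drop events older than the window
              if len(window) >= THRESHOLD:
                  return f"ALERT: {app} produced {len(window)} errors in {WINDOW_S}s"
              return None

      correlator = SlidingWindowCorrelator()
      now = time.time()
      for i in range(6):   # a burst of errors from one (hypothetical) application
          alert = correlator.on_event("SomeApplication", "ERROR", now + i)
          if alert:
              print(alert)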

  9. Automated multi-parameter monitoring of neo-nates

    OpenAIRE

    Gangadharan, V.

    2013-01-01

    Advancements in monitoring technology have led to an increasing amount of physiological data; such as heart rate and oxygen saturation, being accumulated in hospitals. A high rate of false alarms in the neonatal intensive care environment due to inadequate analysis of data highlights the need for an intelligent detection system with improved specificity that provides timely alerts to allow early clinical intervention. Current cot-side monitoring systems analyse data channels independently by ...

  10. Microsoft Business Solutions-Axapta as a basis for automated monitoring of high technology products competitiveness

    Science.gov (United States)

    Tashchiyan, G. O.; Sushko, A. V.; Grichin, S. V.

    2015-09-01

    The competitiveness of high-technology products is one of the conditions for the normal performance of the Russian economy. Various tools are used to assess such products; one of them is automated monitoring of the competitiveness of high-technology products in mechanical engineering. The system is developed on the basis of the "Innovator" software integrated into Microsoft Business Solutions-Axapta.

  11. Starting the automation process by using group technology

    Directory of Open Access Journals (Sweden)

    Jorge Andrés García Barbosa

    2010-06-01

    This article describes the start-up of an automation process based on applying group technology (GT). Mecanizados CNC, a company making metallurgical-sector products, bases the layout (organisation and disposition) of its machinery on the concept of manufacturing cells; production is programmed once the best location for the equipment has been determined. The order in which products are made and the appropriate tool set-up for the machinery in the cells are established, aimed at minimising set-up time, leading to a 15% improvement in productivity.

  12. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  13. Monitoring and Control of the Automated Transfer Vehicle

    Science.gov (United States)

    Hugonnet, C.; D'Hoine, S.

    The objective of this paper is to present succinctly the architecture of the heart of the ATV Control Centre: the Monitoring and Control system developed by CS for the French Space Agency (CNES) and the European Space Agency (ESA). At the moment, the Monitoring and Control system is in the development phase; a first real-time version will be delivered to CNES in July 2003, followed by a second version in October including off-line capabilities. The paper introduces the high-level specifications and the main driving performance criteria of the monitoring and control system needed to successfully operate these complex ATV space vehicles from the first flight, planned in 2004. It presents the approach taken by CS and CNES to meet this challenge in a very short time. The ATV-CC Monitoring and Control system is based on the reuse of flight-proven components integrated in a software-bus-based architecture. The paper particularly shows the advantages of using new computer technologies in an operational system: Object-Oriented technologies from specification and design (UML) to development (C++, Java, PL/SQL), a CORBA Object Request Broker for the exchange of messages and some centralised services, Java for the development of an ergonomic and standardised (for all functions of the M&C) Graphical User Interface, and extensive use of XML for data exchange.

  14. Atlas-based multichannel monitoring of functional MRI signals in real-time: automated approach.

    Science.gov (United States)

    Lee, Jong-Hwan; O'Leary, Heather M; Park, Hyunwook; Jolesz, Ferenc A; Yoo, Seung-Schik

    2008-02-01

    We report an automated method to simultaneously monitor blood-oxygenation-level-dependent (BOLD) MR signals from multiple cortical areas in real-time. Individual brain anatomy was normalized and registered to a pre-segmented atlas in standardized anatomical space. Subsequently, using real-time fMRI (rtfMRI) data acquisition, localized BOLD signals were measured and displayed from user-selected areas labeled with anatomical and Brodmann's Area (BA) nomenclature. The method was tested on healthy volunteers during the performance of hand motor and internal speech generation tasks employing a trial-based design. Our data normalization and registration algorithm, along with image reconstruction, movement correction and a data display routine, were executed with the processing and communication bandwidth necessary for real-time operation. Task-specific BOLD signals were observed from the hand motor and language areas. One of the study participants was allowed to freely engage in hand clenching tasks, and associated brain activities were detected from the motor-related neural substrates without prior knowledge of the task onset time. The proposed method may be applied to various applications such as neurofeedback, brain-computer interfaces, and functional mapping for surgical planning, where real-time monitoring of region-specific brain activity is needed. PMID:17370340
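
    Once a volume is registered to the pre-segmented atlas, the per-region readout reduces to averaging the signal over labeled voxels. A sketch with numpy, where the volume and the label codes are synthetic, illustrative assumptions:

      import numpy as np

      rng = np.random.default_rng(1)
      atlas = rng.integers(0, 4, size=(64, 64, 32))   # 0 = background, 1..3 = ROIs
      ROI_NAMES = {1: "hand motor", 2: "Broca's area", 3: "SMA"}

      def roi_means(volume, atlas, labels):
          """Mean BOLD value per atlas label for one reconstructed volume."""
          return {name: float(volume[atlas == code].mean())
                  for code, name in labels.items()}

      volume = 1000.0 + rng.normal(0.0, 10.0, size=atlas.shape)  # one EPI volume
      for name, value in roi_means(volume, atlas, ROI_NAMES).items():
          print(f"{name}: {value:.1f}")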

  15. QualitySpy: a framework for monitoring software development processes

    Directory of Open Access Journals (Sweden)

    Marian Jureczko

    2012-03-01

    The growing popularity of highly iterative, agile processes creates an increasing need for automated monitoring of the quality of software artifacts, focused on short terms (in the case of the eXtreme Programming process, an iteration can be limited to one week). This paper presents a framework that calculates software metrics and cooperates with development tools (e.g. the source version control system and the issue tracking system) to describe the current state of a software project with regard to its quality. The framework is designed to support a high level of automation of data collection and to be useful for researchers as well as for industry. The framework is currently under development, hence the paper reports already implemented features as well as future plans. The first release is scheduled for July.

  16. Process development for automated solar cell and module production. Task 4. Automated array assembly. Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Witham, C.R.

    1979-06-12

    MBA has been working on the automated array assembly task of the Low-Cost Solar Array project. A baseline sequence for the manufacture of solar cell modules is specified. Starting with silicon wafers, the process goes through damage etching, texture etching, junction formation, plasma edge etch, aluminum back surface field formation, and screen printed metallization to produce finished solar cells which are then series connected on a ribbon and bonded into a finished glass, PVB, tedlar module. A number of steps required additional developmental effort to verify technical and economic feasibility. These steps include texture etching, plasma edge etch, aluminum back surface field formation, array layup and interconnect, and module edge sealing and framing.

  17. Automated Grid Monitoring for LHCb through HammerCloud

    CERN Document Server

    CERN. Geneva

    2015-01-01

    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  18. Cost Analysis of an Automated and Manual Cataloging and Book Processing System.

    Science.gov (United States)

    Druschel, Joselyn

    1981-01-01

    Cost analysis of an automated network system and a manual system of cataloging and book processing indicates a 20 percent savings using automation. Per unit costs based on the average monthly automation rate are used for comparison. Higher manual system costs are attributed to staff costs. (RAA)

  19. Monitoring method for automated CD-SEM recipes

    Science.gov (United States)

    Maeda, Tatsuya; Iwama, Satoru; Nishihara, Makoto; Berger, Daniel; Berger, Andrew; Ueda, Kazuhiro; Kenichi, Takenouchi; Iizumi, Takashi

    2005-05-01

    A prototype digital video storage system (CD-watcher) has been developed and attached to a Hitachi S-9380 CD-SEM. The storage system has several selectable modes depending on the phenomenon of interest, and can store video of durations from a few seconds to a few weeks depending on resolution, sampling rate, and hard disk capacity. The system was used to analyze apparent focusing problems that occurred during the execution of automated recipes. Intermittent focusing problems had been an issue on a particular tool for approximately three months, and based on reviews of saved images the problem had originally been diagnosed as an auto-focus issue. Two days after installation, the CD-watcher system recorded the errors, making it possible to determine the root cause by checking the stored video files; the analysis showed that the problem actually consisted of three types of errors. The ability to record and store video files reduced the time needed to isolate the problem and prevented an incorrect diagnosis. The system was also used to explain a complex phenomenon that occurred during the observation of a particular layer. Because it is sometimes difficult to accurately describe, and to have others easily understand, certain phenomena in a written report, the video storage system can be used in place of manual annotation. In this report, we describe the CD-watcher system, test results after installing the system on a Hitachi S-9380 CD-SEM, and potential applications of the system.

  1. Silicon Carbide Temperature Monitor Processing Improvements. Status Report

    Energy Technology Data Exchange (ETDEWEB)

    Unruh, Troy Casey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Daw, Joshua Earl [Idaho National Lab. (INL), Idaho Falls, ID (United States); Al Rashdan, Ahamad [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-01-29

    Silicon carbide (SiC) temperature monitors are used as temperature sensors in Advanced Test Reactor (ATR) irradiations at the Idaho National Laboratory (INL). Although thermocouples are typically used to provide real-time temperature indication in instrumented lead tests, other indicators, such as melt wires, are also often included in such tests as an independent technique of detecting peak temperatures incurred during irradiation. In addition, less expensive static capsule tests, which have no leads attached for real-time data transmission, often rely on melt wires as a post-irradiation technique for peak temperature indication. Melt wires are limited in that they can only detect whether a single temperature is or is not exceeded. SiC monitors are advantageous because a single monitor can be used to detect a range of temperatures that occurred during irradiation. As part of the process initiated to make SiC temperature monitors available at the ATR, post-irradiation evaluations of these monitors have previously been completed at the High Temperature Test Laboratory (HTTL). INL selected the resistance measurement approach for determining irradiation temperature from SiC temperature monitors because it is considered to be the most accurate measurement. The current process involves the repeated annealing of the SiC monitors at incrementally increasing temperatures, with resistivity measurements made between annealing steps. The process is time-consuming and requires the nearly constant attention of a trained staff member. In addition to the expensive and lengthy post-analysis required, the current process adds many potential sources of error to the measurement, as the sensor must be repeatedly moved from furnace to test fixture. This time-consuming post-irradiation analysis is a significant portion of the total cost of using these otherwise inexpensive sensors. An additional consideration of this research is that, if the SiC post-processing can be automated, it…
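
    The anneal-and-measure cycle described could, in principle, be automated along these lines; `furnace_anneal` and `measure_resistance` are hypothetical stand-ins for instrument drivers, and the resistance model is a placeholder that simply assumes recovery near 600 C.

      ANNEAL_TEMPS_C = range(200, 801, 25)   # incrementally increasing anneal steps
      ANNEAL_TIME_S = 30 * 60

      def furnace_anneal(temp_c, duration_s):
          print(f"annealing at {temp_c} C for {duration_s / 60:.0f} min")

      def measure_resistance(temp_c):
          # Placeholder: resistance recovers (drops) above an assumed 600 C.
          return 100.0 if temp_c < 600 else 40.0

      def run_monitor_readout():
          history = []
          for temp in ANNEAL_TEMPS_C:
              furnace_anneal(temp, ANNEAL_TIME_S)
              resistance = measure_resistance(temp)
              # A sharp drop relative to the previous step marks annealing of
              # irradiation-induced defects, i.e. the peak irradiation temperature.
              if history and resistance < 0.9 * history[-1][1]:
                  print(f"resistance recovery detected near {temp} C")
                  return history + [(temp, resistance)]
              history.append((temp, resistance))
          return history

      run_monitor_readout()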

  2. Automated Data Processing as an AI Planning Problem

    Science.gov (United States)

    Golden, Keith; Pang, Wanlin; Nemani, Ramakrishna; Votava, Petr

    2003-01-01

    NASA's vision for Earth Science is to build a "sensor web": an adaptive array of heterogeneous satellites and other sensors that will track important events, such as storms, and provide real-time information about the state of the Earth to a wide variety of customers. Achieving this vision will require automation not only in the scheduling of the observations but also in the processing of the resulting data. To address this need, we have developed a planner-based agent to automatically generate and execute data-flow programs to produce the requested data products. Data processing domains are substantially different from other planning domains that have been explored, and this has led us to substantially different choices in terms of representation and algorithms. We discuss some of these differences and the approach we have adopted.

  3. Highly Automated Agile Testing Process: An Industrial Case Study

    Directory of Open Access Journals (Sweden)

    Jarosław Berłowski

    2016-09-01

    This paper presents a description of an agile testing process in a medium-size software project developed using Scrum. The research method used was a case study; surveys, quantifiable project data sources and qualitative opinions of project members were used for data collection. Challenges related to the testing process, concerning a complex project environment and unscheduled releases, were identified. Based on the obtained results, we concluded that the described approach addresses these issues well. Recommendations were therefore made with regard to the employed principles of agility, specifically: continuous integration, responding to change, test automation and test-driven development. Furthermore, an efficient testing environment that combines a number of test frameworks (e.g. JUnit, Selenium, Jersey Test) with custom-developed simulators is presented.

  4. Automated Image Processing for the Analysis of DNA Repair Dynamics

    CERN Document Server

    Riess, Thorsten; Tomas, Martin; Ferrando-May, Elisa; Merhof, Dorit

    2011-01-01

    The efficient repair of cellular DNA is essential for the maintenance and inheritance of genomic information. In order to cope with the high frequency of spontaneous and induced DNA damage, a multitude of repair mechanisms have evolved. These are enabled by a wide range of protein factors specifically recognizing different types of lesions and finally restoring the normal DNA sequence. This work focuses on the repair factor XPC (xeroderma pigmentosum complementation group C), which identifies bulky DNA lesions and initiates their removal via the nucleotide excision repair pathway. The binding of XPC to damaged DNA can be visualized in living cells by following the accumulation of a fluorescent XPC fusion at lesions induced by laser microirradiation in a fluorescence microscope. In this work, an automated image processing pipeline is presented which allows to identify and quantify the accumulation reaction without any user interaction. The image processing pipeline comprises a preprocessing stage where the ima...

  5. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    … infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques to fully exploit the gain in resolution and accuracy of the massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra. By projecting all detected ions, within defined intervals on both the time and mass axes, onto a fixed one-dimensional array, we obtain a vector that can be used directly as input for multivariate statistics or library search methods. We demonstrate that both cluster and discriminant analysis as well as PCA (and related methods) can be applied directly to mass spectra from direct…
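
    The projection described, accumulating all detected ions within fixed m/z intervals onto a one-dimensional array, is essentially weighted histogram binning. A sketch with numpy, where the mass range, bin width and synthetic spectra are illustrative assumptions:

      import numpy as np

      MZ_MIN, MZ_MAX, BIN_WIDTH = 100.0, 1000.0, 0.1
      EDGES = np.arange(MZ_MIN, MZ_MAX + BIN_WIDTH, BIN_WIDTH)

      def spectrum_to_vector(mz, intensity):
          """Accumulate detected ion intensities into fixed m/z bins."""
          vec, _ = np.histogram(mz, bins=EDGES, weights=intensity)
          return vec

      rng = np.random.default_rng(2)
      samples = []
      for _ in range(5):                      # five synthetic crude-extract spectra
          mz = rng.uniform(MZ_MIN, MZ_MAX, size=3000)
          intensity = rng.exponential(1.0, size=3000)
          samples.append(spectrum_to_vector(mz, intensity))

      X = np.vstack(samples)   # samples x bins matrix, ready for PCA and clustering
      print(X.shape)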

  6. High-Resolution Time-Lapse Monitoring of Unsaturated Flow using Automated GPR Data Collection

    Science.gov (United States)

    Mangel, A. R.; Moysey, S. M.; Lytle, B. A.; Bradford, J. H.

    2015-12-01

    High-resolution ground-penetrating radar (GPR) data provide the detailed information required to image subsurface structures. Recent advances in GPR monitoring now also make it possible to study transient hydrologic processes, but high-speed data acquisition is critical for this application. We therefore highlight the capabilities of our automated system to acquire time-lapse, high-resolution multifold GPR data during infiltration of water into soils. The system design allows for fast acquisition of constant-offset (COP) and common-midpoint profiles (CMP) to monitor unsaturated flow at multiple locations. Qualitative interpretation of the unprocessed COPs can provide substantial information regarding the hydrologic response of the system, such as the complexities of patterns associated with the wetting of the soil and geophysical evidence of non-uniform propagation of a wetting front. While we find that unprocessed images are informative, we show that the spatial variability of velocity introduced by infiltration events can complicate the images and that migration of the data is an effective tool to improve interpretability of the time-lapse images. The ability of the system to collect high density CMP data also introduces the potential for improving the velocity model along with the image via reflection tomography in the post-migrated domain. We show that for both simulated and empirical time-lapse GPR profiles we can resolve a propagating wetting front in the soil that is in good agreement with the response of in-situ soil moisture measurements. The data from these experiments illustrate the importance of high-speed, high-resolution GPR data acquisition for obtaining insight about the dynamics of hydrologic events. Continuing research is aimed at improving the quantitative analysis of surface-based GPR monitoring data for identifying preferential flow in soils.

  7. Automated angiogenesis quantification through advanced image processing techniques.

    Science.gov (United States)

    Doukas, Charlampos N; Maglogiannis, Ilias; Chatziioannou, Aristotle; Papapetropoulos, Andreas

    2006-01-01

    Angiogenesis, the formation of blood vessels in tumors, is an interactive process between tumor, endothelial and stromal cells that creates a network for the oxygen and nutrient supply necessary for tumor growth. Accordingly, angiogenic activity is considered a suitable indicator for detecting both tumor growth and its inhibition. The angiogenic potential is usually estimated by counting the number of blood vessels in particular sections. One of the most popular assay tissues for studying the angiogenesis phenomenon is the developing chick embryo and its chorioallantoic membrane (CAM), which is a highly vascular structure lining the inner surface of the egg shell. The aim of this study was to develop and validate an automated image analysis method that would give an unbiased quantification of the micro-vessel density and growth in angiogenic CAM images. The presented method has been validated by comparing automated results to manual counts over a series of digital chick embryo photos. The results indicate the high accuracy of the tool, which has thus been used extensively for tumor growth detection at different stages of embryonic development. PMID:17946107
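
    The counting idea, though not the authors' validated pipeline, can be illustrated by thresholding and connected-component labeling on a synthetic image:

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(3)
      image = rng.uniform(0.6, 1.0, size=(256, 256))   # bright background
      image[100:105, 120:250] = 0.20                   # synthetic "vessel" segment
      image[10:80, 60:63] = 0.25                       # second synthetic segment

      vessel_mask = image < 0.5            # vessels are darker than background
      labels, n_vessels = ndimage.label(vessel_mask)
      area_fraction = vessel_mask.sum() / vessel_mask.size

      print(f"vessel segments: {n_vessels}, area fraction: {area_fraction:.4f}")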

  8. Development of a Fully Automated, GPS Based Monitoring System for Disaster Prevention and Emergency Preparedness: PPMS+RT

    Science.gov (United States)

    Bond, Jason; Kim, Don; Chrzanowski, Adam; Szostak-Chrzanowski, Anna

    2007-01-01

    The increasing number of structural collapses, slope failures and other natural disasters has led to a demand for new sensors, sensor integration techniques and data processing strategies for deformation monitoring systems. In order to meet the extraordinary accuracy requirements for displacement detection in recent deformation monitoring projects, research has been devoted to integrating the Global Positioning System (GPS) as a monitoring sensor. Although GPS has been used for monitoring purposes worldwide, certain environments pose challenges where conventional processing techniques cannot provide the required accuracy with sufficient update frequency. Described here is the development of a fully automated, continuous, real-time monitoring system that employs GPS sensors and pseudolite technology to meet these requirements in such environments. Ethernet and/or serial port communication techniques are used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely, based upon client needs. In a test of the designed system, a 10 mm displacement at a target point was remotely detected. This information could then be used to signal an alarm if conditions are deemed to be unsafe.
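
    At its core, the displacement detection reduces to comparing each processed target-point position with its reference and alarming above a threshold; the local coordinates below are synthetic, and the 10 mm threshold mirrors the displacement detected in the reported test.

      import numpy as np

      THRESHOLD_M = 0.010                      # 10 mm alarm threshold

      reference = np.array([0.0, 0.0, 0.0])    # reference epoch (local frame, m)
      epochs = [
          np.array([0.001, -0.002, 0.000]),    # within noise level
          np.array([0.004, 0.009, -0.006]),    # ~11.5 mm -> alarm
      ]

      for i, position in enumerate(epochs):
          displacement = np.linalg.norm(position - reference)
          status = "ALARM" if displacement > THRESHOLD_M else "ok"
          print(f"epoch {i}: displacement {displacement * 1000:.1f} mm -> {status}")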

  9. Material quality development during the automated tow placement process

    Science.gov (United States)

    Tierney, John Joseph

    Automated tow placement (ATP) of thermoplastic composites builds on the existing industrial base for equipment, robotics and kinematic placement of material with the aim of further cost reduction by eliminating the autoclave entirely. During ATP processing, thermoplastic composite tows are deposited on a preconsolidated substrate at rates ranging from 10--100mm/s and consolidated using the localized application of heat and pressure by a tow placement head mounted on a robot. The process is highly non-isothermal subjecting the material to multiple heating and cooling rates approaching 1000°C/sec. The requirement for the ATP process is to achieve the same quality in seconds (low void content, full translation of mechanical properties and degree of bonding and minimal warpage) as the autoclave process achieves in hours. The scientific challenge was to first understand and then model the relationships between processing, material response, microstructure and quality. The important phenomena affecting quality investigated in this study include a steady state heat transfer simulation, consolidation and deconsolidation (void dynamics), intimate contact and polymer interdiffusion (degree of bonding/mechanical properties) and residual stress and warpage (crystallization and viscoelastic response). A fundamental understanding of the role of materials related to these mechanisms and their relationship to final quality is developed and applied towards a method of process control and optimization.

  10. FDEMS Sensing for Automated Intelligent Processing of PMR-15

    Science.gov (United States)

    Kranbuehl, David E.; Hood, D. K.; Rogozinski, J.; Barksdale, R.; Loos, Alfred C.; McRae, Doug

    1993-01-01

    The purpose of this grant was to develop frequency-dependent dielectric measurements, often called FDEMS (frequency dependent electromagnetic sensing), to monitor and intelligently control the cure process in PMR-15, a stoichiometric mixture of a nadic ester, a dimethyl ester, and methylenedianiline in a monomer ratio.

  11. Automated EEG monitoring in defining a chronic epilepsy model.

    Science.gov (United States)

    Mascott, C R; Gotman, J; Beaudet, A

    1994-01-01

    There has been a recent surge of interest in chronic animal models of epilepsy. Proper assessment of these models requires documentation of spontaneous seizures by EEG, observation, or both in each individual animal to confirm the presumed epileptic condition. We used the same automatic seizure detection system as that currently used for patients in our institution and many others. Electrodes were implanted in 43 rats before intraamygdalar administration of kainic acid (KA). Animals were monitored intermittently for 3 months. Nine of the rats were protected by anticonvulsants [pentobarbital (PB) and diazepam (DZP)] at the time of KA injection. Between 1 and 3 months after KA injection, spontaneous seizures were detected in 20 of the 34 unprotected animals (59%). Surprisingly, spontaneous seizures were also detected during the same period in 2 of the 9 protected animals that were intended to serve as nonepileptic controls. Although the absence of confirmed spontaneous seizures in the remaining animals cannot exclude their occurrence, it indicates that, if present, they are at least rare. On the other hand, definitive proof of epilepsy is invaluable in the attempt to interpret pathologic data from experimental brains.

  12. Multivariate Statistical Process Control Process Monitoring Methods and Applications

    CERN Document Server

    Ge, Zhiqiang

    2013-01-01

      Given their key position in the process control industry, process monitoring techniques have been extensively investigated by industrial practitioners and academic control researchers. Multivariate statistical process control (MSPC) is one of the most popular data-based methods for process monitoring and is widely used in various industrial areas. Effective routines for process monitoring can help operators run industrial processes efficiently at the same time as maintaining high product quality. Multivariate Statistical Process Control reviews the developments and improvements that have been made to MSPC over the last decade, and goes on to propose a series of new MSPC-based approaches for complex process monitoring. These new methods are demonstrated in several case studies from the chemical, biological, and semiconductor industrial areas.   Control and process engineers, and academic researchers in the process monitoring, process control and fault detection and isolation (FDI) disciplines will be inter...
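
    The core computation behind PCA-based MSPC can be sketched briefly: fit a PCA model on normal-operation data, then score new samples with Hotelling's T2 (variation inside the model) and SPE/Q (residual) statistics. The data below are synthetic, and the empirical percentile limits stand in for the F- and chi-squared-based control limits used in practice.

      import numpy as np

      rng = np.random.default_rng(4)
      X = rng.normal(size=(500, 10))             # normal operating data
      X -= X.mean(axis=0)                        # mean-center

      U, s, Vt = np.linalg.svd(X, full_matrices=False)
      k = 3                                      # retained principal components
      P = Vt[:k].T                               # loadings
      lam = (s[:k] ** 2) / (len(X) - 1)          # PC variances

      def t2_spe(x):
          t = P.T @ x                            # scores
          t2 = float(np.sum(t ** 2 / lam))       # Hotelling's T2
          residual = x - P @ t
          return t2, float(residual @ residual)  # SPE (Q statistic)

      train = np.array([t2_spe(x) for x in X])
      t2_lim, spe_lim = np.percentile(train, 99, axis=0)  # empirical 99% limits

      x_new = rng.normal(size=10)
      x_new[0] += 3.0                            # inject a fault into one variable
      t2, spe = t2_spe(x_new)
      print(f"T2={t2:.1f} (limit {t2_lim:.1f}), SPE={spe:.1f} (limit {spe_lim:.1f})")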

  13. Automation of a problem list using natural language processing

    Directory of Open Access Journals (Sweden)

    Haug Peter J

    2005-08-01

    Background: The medical problem list is an important part of the electronic medical record under development in our institution. To serve the functions it is designed for, the problem list has to be as accurate and timely as possible. However, the current problem list is usually incomplete and inaccurate, and is often totally unused. To alleviate this issue, we are building an environment where the problem list can be easily and effectively maintained. Methods: For this project, 80 medical problems were selected for their frequency of use in our future clinical field of evaluation (cardiovascular). We have developed an Automated Problem List system composed of two main components: a background and a foreground application. The background application uses Natural Language Processing (NLP) to harvest potential problem list entries from the list of 80 targeted problems detected in the multiple free-text electronic documents available in our electronic medical record. These proposed medical problems drive the foreground application, designed for management of the problem list. Within this application, the extracted problems are proposed to the physicians for addition to the official problem list. Results: The set of 80 targeted medical problems selected for this project covered about 5% of all possible diagnoses coded in ICD-9-CM in our study population (cardiovascular adult inpatients), but about 64% of all instances of these coded diagnoses. The system contains algorithms to detect first document sections, then sentences within these sections, and finally potential problems within the sentences. The initial evaluation of the section and sentence detection algorithms demonstrated a sensitivity and positive predictive value of 100% when detecting sections, and a sensitivity of 89% and a positive predictive value of 94% when detecting sentences. Conclusion: The global aim of our project is to automate the process of creating and maintaining a problem…
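
    The last stage of such a pipeline, detecting targeted problems within detected sentences, can be caricatured with naive keyword matching; the problem list, patterns and sentence-splitting rule below are illustrative assumptions, far simpler than the NLP engine the paper describes.

      import re

      TARGET_PROBLEMS = {
          "congestive heart failure": ["congestive heart failure", "chf"],
          "atrial fibrillation": ["atrial fibrillation", "afib"],
          "hypertension": ["hypertension", "elevated blood pressure"],
      }

      def split_sentences(section_text):
          return [s.strip() for s in re.split(r"(?<=[.!?])\s+", section_text) if s.strip()]

      def detect_problems(section_text):
          found = set()
          for sentence in split_sentences(section_text):
              lower = sentence.lower()
              for problem, patterns in TARGET_PROBLEMS.items():
                  if any(p in lower for p in patterns):
                      found.add(problem)
          return sorted(found)

      note = ("Patient with known CHF, admitted for shortness of breath. "
              "History of hypertension. No evidence of atrial fibrillation today.")
      print(detect_problems(note))

    Note that without negation handling the negated mention ("No evidence of atrial fibrillation") is still flagged, which illustrates why a full NLP engine with context handling is needed.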

  14. Using Automated On-Site Monitoring to Calibrate Empirical Models of Trihalomethanes Concentrations in Drinking Water

    OpenAIRE

    Thomas E. Watts III; Robyn A. Snow; Brown, Aaron W.; J. C. York; Greg Fantom; Paul S. Simone Jr.; Gary L. Emmert

    2015-01-01

    An automated, on-site trihalomethanes concentration data set from a conventional water treatment plant was used to optimize powdered activated carbon and pre-chlorination doses. The trihalomethanes concentration data set was used with commonly monitored water quality parameters to improve an empirical model of trihalomethanes formation. A calibrated model was used to predict trihalomethanes concentrations the following year. The agreement between the models and measurements was evaluated. The...
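
    Calibrating an empirical THM-formation model of this kind typically amounts to regressing measured concentrations on routinely monitored water quality parameters. A least-squares sketch, where the chosen parameters, linear form and synthetic data are illustrative assumptions, not the study's actual model:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 200
      toc = rng.uniform(2, 6, n)         # total organic carbon, mg/L
      cl2 = rng.uniform(1, 4, n)         # applied chlorine dose, mg/L
      temp = rng.uniform(5, 30, n)       # water temperature, deg C
      thm = 5 + 8 * toc + 6 * cl2 + 0.9 * temp + rng.normal(0, 4, n)  # ug/L

      A = np.column_stack([np.ones(n), toc, cl2, temp])
      coef, *_ = np.linalg.lstsq(A, thm, rcond=None)   # calibrate on year 1
      pred = A @ coef                                  # would be year-2 data in practice
      r2 = 1 - np.sum((thm - pred) ** 2) / np.sum((thm - thm.mean()) ** 2)
      print("coefficients:", np.round(coef, 2), f"R^2 = {r2:.3f}")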

  15. Using process-oriented interfaces for solving the automation paradox in highly automated navy vessels

    NARCIS (Netherlands)

    Diggelen, J. van; Post, W.; Rakhorst, M.; Plasmeijer, R.; Staal, W. van

    2014-01-01

    This paper describes a coherent engineering method for developing high level human machine interaction within a highly automated environment consisting of sensors, actuators, automatic situation assessors and planning devices. Our approach combines ideas from cognitive work analysis, cognitive engin

  16. 10 CFR 1017.28 - Processing on Automated Information Systems (AIS).

    Science.gov (United States)

    2010-01-01

    10 CFR Part 1017, Unclassified Controlled Nuclear Information, Physical Protection Requirements, § 1017.28, Processing on Automated Information Systems (AIS): UCNI may be processed or produced on any AIS that complies with the guidance in...

  17. ERT monitoring of environmental remediation processes

    Science.gov (United States)

    La Brecque, D. J.; Ramirez, A. L.; Daily, W. D.; Binley, A. M.; Schima, S. A.

    1996-03-01

    The use of electrical resistance tomography (ERT) to monitor new environmental remediation processes is addressed. An overview of the ERT method, including design of surveys and interpretation, is given. Proper design and lay-out of boreholes and electrodes are important for successful results. Data are collected using an automated collection system and interpreted using a nonlinear least squares inversion algorithm. Case histories are given for three remediation technologies: Joule (ohmic) heating, in which clay layers are heated electrically; air sparging, the injection of air below the water table; and electrokinetic treatment, which moves ions by applying an electric current. For Joule heating, a case history is given for an experiment near Savannah River, Georgia, USA. The target for Joule heating was a clay layer of variable thickness. During the early stages of heating, ERT images show increases in conductivity due to the increased temperatures. Later, the conductivities decreased as the system became dehydrated. For air sparging, a case history from Florence, Oregon, USA is described. Air was injected into a sandy aquifer at the site of a former service station. Successive images clearly show the changes in shape of the region of air saturation with time. The monitoring of an electrokinetic laboratory test on core samples is shown. The electrokinetic treatment creates a large change in the core resistivity, decreasing near the anode and increasing near the cathode. Although remediation efforts were successful both at Savannah River and at Florence, in neither case did experiments progress entirely as predicted. At Savannah River, the effects of heating and venting were not uniform and at Florence the radius of air flow was smaller than expected. Most sites are not as well characterized as these two sites. Improving remediation methods requires an understanding of the movements of heat, air, fluids and ions in the sub-surface which ERT can provide. The
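
    As a toy illustration of the nonlinear least-squares style of inversion mentioned above (not the actual tomographic algorithm), the sketch below recovers two layer conductivities from synthetic apparent-conductivity data through a simplified forward model; the forward model and all values are invented.

```python
# Toy inversion: fit a two-layer conductivity model to noisy synthetic data
# with scipy's nonlinear least-squares solver.
import numpy as np
from scipy.optimize import least_squares

depths = np.linspace(0.5, 10.0, 20)           # pseudo-depths of the measurements

def forward(model, depths):
    sigma1, sigma2 = model                    # conductivities of two layers, S/m
    w = np.exp(-depths / 3.0)                 # illustrative depth weighting
    return w * sigma1 + (1 - w) * sigma2      # blended apparent conductivity

true_model = np.array([0.05, 0.20])
rng = np.random.default_rng(1)
data = forward(true_model, depths) + rng.normal(0, 1e-3, depths.size)

fit = least_squares(lambda m: forward(m, depths) - data, x0=[0.1, 0.1])
print(fit.x)                                  # recovered conductivities, near [0.05, 0.20]
```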

  18. Generic HPLC platform for automated enzyme reaction monitoring: Advancing the assay toolbox for transaminases and other PLP-dependent enzymes.

    Science.gov (United States)

    Börner, Tim; Grey, Carl; Adlercreutz, Patrick

    2016-08-01

    Methods for rapid and direct quantification of enzyme kinetics independent of the substrate are in high demand for both fundamental research and bioprocess development. This study addresses the need for a generic method by developing an automated, standardizable HPLC platform monitoring reaction progress in near real-time. The method was applied to amine transaminase (ATA) catalyzed reactions, intensifying process development for chiral amine synthesis. Autosampler-assisted pipetting facilitates integrated mixing and sampling under controlled temperature. Crude enzyme formulations at high and low substrate concentrations can be employed. Sequential, small (1 µL) sample injections and immediate detection after separation permit fast reaction monitoring with excellent sensitivity, accuracy and reproducibility. Due to its modular design, different chromatographic techniques, e.g. reversed phase and size exclusion chromatography (SEC), can be employed. A novel assay for pyridoxal 5'-phosphate-dependent enzymes is presented using SEC for direct monitoring of enzyme-bound and free reaction intermediates. Time-resolved changes of the different cofactor states, e.g. pyridoxal 5'-phosphate, pyridoxamine 5'-phosphate and the internal aldimine, were traced in both half reactions. The combination of the automated HPLC platform with SEC offers a method for substrate-independent screening, which supplies a missing piece in the assay and screening toolbox for ATAs and other PLP-dependent enzymes.

  19. Automating the Human Factors Engineering and Evaluation Processes

    International Nuclear Information System (INIS)

    The Westinghouse Savannah River Company (WSRC) has developed a software tool for automating the Human Factors Engineering (HFE) design review, analysis, and evaluation processes. The tool provides a consistent, cost-effective, graded, user-friendly approach for evaluating process control system Human System Interface (HSI) specifications, designs, and existing implementations. The initial set of HFE design guidelines used in the tool was obtained from NUREG-0700. Each guideline was analyzed and classified according to its significance (general concept vs. supporting detail), the HSI technology (computer based vs. non-computer based), and the HSI safety function (safety vs. non-safety). Approximately 10 percent of the guidelines were determined to be redundant or obsolete and were discarded. The remaining guidelines were arranged in a Microsoft Access relational database, and a Microsoft Visual Basic user interface was provided to facilitate the HFE design review. The tool also provides the capability to add new criteria to accommodate advances in HSI technology and incorporate lessons learned. Summary reports produced by the tool can be easily ported to Microsoft Word and other popular PC office applications. An IBM compatible PC with Microsoft Windows 95 or higher is required to run the application

  20. Quality Control in Automated Manufacturing Processes – Combined Features for Image Processing

    Directory of Open Access Journals (Sweden)

    B. Kuhlenkötter

    2006-01-01

    Full Text Available In production processes the use of image processing systems is widespread. Hardware solutions and cameras are available for nearly every application. One important challenge for image processing systems is the development and selection of appropriate algorithms and software solutions in order to realise ambitious quality control for production processes. This article characterises the development of innovative software that combines features for automatic defect classification on product surfaces. The artificial intelligence method Support Vector Machine (SVM) is used to execute the classification task according to the combined features. This software is one crucial element for the automation of a manually operated production process.
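
    A minimal sketch of the classification step named above: a Support Vector Machine trained on combined feature vectors to separate good from defective surfaces. The synthetic feature vectors stand in for real surface-image descriptors.

```python
# SVM defect classification on (synthetic) combined image features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
ok = rng.normal(0.0, 1.0, size=(100, 6))      # feature vectors of good surfaces
bad = rng.normal(2.0, 1.0, size=(100, 6))     # feature vectors of defective ones
X = np.vstack([ok, bad])
y = np.array([0] * 100 + [1] * 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```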

  1. Automation of the Technological Process to Produce Building Frame-Monolithic Modules Based on Fluoranhydrite

    Science.gov (United States)

    Fedorchuk, J.; Sadenova, M.; Rusina, O.

    2016-01-01

    The paper proposes, for the first time, the automation of the technological process for producing building frame-monolithic modules from production wastes, namely technogenic anhydrite and fluoranhydrite. A functional diagram of the process automation is developed, and devices for control and maintenance are chosen taking the production characteristics into account.

  2. Process development for automated solar cell and module production. Task 4: Automated array assembly

    Science.gov (United States)

    Hagerty, J. J.

    1981-01-01

    Progress in the development of automated solar cell and module production is reported. The Unimate robot is programmed for the final 35-cell pattern to be used in the fabrication of the deliverable modules. The mechanical construction of the automated lamination station and final assembly station phases is completed and the first operational testing is underway. The final controlling program is written and optimized. The glass reinforced concrete (GRC) panels to be used for testing and deliverables are in production. Test routines are grouped together and defined to produce the final control program.

  3. Vision-Based Geo-Monitoring - A New Approach for an Automated System

    Science.gov (United States)

    Wagner, A.; Reiterer, A.; Wasmeier, P.; Rieke-Zapp, D.; Wunderlich, T.

    2012-04-01

    The necessity of monitoring geo-risk areas such as rock slides is growing due to the increasing probability of such events caused by environmental change. Geodetic deformation monitoring turns life under threat into a calculable risk. An in-depth monitoring concept with modern measurement technologies allows the estimation of the hazard potential and the prediction of life-threatening situations. The movements can be monitored by sensors placed in the unstable slope area. In most cases, it is necessary to enter the regions at risk in order to place the sensors and maintain them. Using long-range monitoring systems (e.g. terrestrial laser scanners, total stations, ground-based synthetic aperture radar) allows avoiding this risk. To close the gap between the existing low-resolution, medium-accuracy sensors and conventional (co-operative target-based) surveying methods, image-assisted total stations (IATS) are a promising solution. IATS offer the user (e.g. metrology expert) an image capturing system (CCD/CMOS camera) in addition to 3D point measurements. The images of the telescope's visual field are projected onto the camera's chip. With appropriate calibration, these images are accurately geo-referenced and oriented, since the horizontal and vertical angles of rotation are continuously recorded. The oriented images can directly be used for direction measurements with no need for object control points or further photogrammetric orientation processes. IATS are able to provide high-density deformation fields with high accuracy (down to the mm range) in all three coordinate directions. Tests have shown that with suitable image processing a measurement precision of 0.05 pixel ± 0.04·σ is possible (which corresponds to 0.03 mgon ± 0.04·σ). These results must be seen with the caveat that such measurements are image-based only; for measuring in 3D object space, the precision of pointing has to be taken into account. IATS can be used in two different ways

  4. Automated frame selection process for high-resolution microendoscopy

    Science.gov (United States)

    Ishijima, Ayumu; Schwarz, Richard A.; Shin, Dongsuk; Mondrik, Sharon; Vigneswaran, Nadarajah; Gillenwater, Ann M.; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2015-04-01

    We developed an automated frame selection algorithm for high-resolution microendoscopy video sequences. The algorithm rapidly selects a representative frame with minimal motion artifact from a short video sequence, enabling fully automated image analysis at the point-of-care. The algorithm was evaluated by quantitative comparison of diagnostically relevant image features and diagnostic classification results obtained using automated frame selection versus manual frame selection. A data set consisting of video sequences collected in vivo from 100 oral sites and 167 esophageal sites was used in the analysis. The area under the receiver operating characteristic curve was 0.78 (automated selection) versus 0.82 (manual selection) for oral sites, and 0.93 (automated selection) versus 0.92 (manual selection) for esophageal sites. The implementation of fully automated high-resolution microendoscopy at the point-of-care has the potential to reduce the number of biopsies needed for accurate diagnosis of precancer and cancer in low-resource settings where there may be limited infrastructure and personnel for standard histologic analysis.
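
    In the spirit of the approach above (not the authors' exact algorithm), a representative frame can be chosen by scoring inter-frame motion and keeping the stillest frame; the video array below is a synthetic stand-in for a microendoscopy clip.

```python
# Score each frame transition by mean absolute difference and keep the
# frame following the stillest transition as the representative frame.
import numpy as np

rng = np.random.default_rng(0)
video = rng.random((30, 64, 64))              # stand-in for a short video sequence

diffs = np.abs(np.diff(video, axis=0)).mean(axis=(1, 2))  # motion proxy per transition
best = int(np.argmin(diffs)) + 1              # frame after the stillest transition
print(f"selected frame index: {best}")
```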

  5. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    International Nuclear Information System (INIS)

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using automated ROIs based on an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly during the first 12 months. The second phase lasted from 12 to 24 months, during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is superior to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)
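
    A minimal sketch of the atlas-based VOI statistics step: after registration, mean FA is extracted per labeled atlas region. The arrays are synthetic stand-ins for NIfTI volumes, which in a real pipeline would be loaded with a library such as nibabel.

```python
# Average FA within each labeled region of a co-registered atlas volume.
import numpy as np

rng = np.random.default_rng(0)
fa = rng.random((32, 32, 32))                 # fractional anisotropy map
atlas = rng.integers(0, 4, size=(32, 32, 32)) # 0 = background, 1..3 = VOI labels

for label in range(1, 4):
    mask = atlas == label
    print(f"VOI {label}: mean FA = {fa[mask].mean():.3f} over {mask.sum()} voxels")
```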

  6. Quantification of diffusion tensor imaging in normal white matter maturation of early childhood using an automated processing pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Loh, K.B.; Ramli, N.; Tan, L.K.; Roziah, M. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); Rahmat, K. [University of Malaya, Department of Biomedical Imaging, University Malaya Research Imaging Centre (UMRIC), Faculty of Medicine, Kuala Lumpur (Malaysia); University Malaya, Biomedical Imaging Department, Kuala Lumpur (Malaysia); Ariffin, H. [University of Malaya, Department of Paediatrics, Faculty of Medicine, Kuala Lumpur (Malaysia)

    2012-07-15

    The degree and status of white matter myelination can be sensitively monitored using diffusion tensor imaging (DTI). This study looks at the measurement of fractional anisotropy (FA) and mean diffusivity (MD) using automated ROIs based on an existing DTI atlas. Anatomical MRI and structural DTI were performed cross-sectionally on 26 normal children (newborn to 48 months old), using 1.5-T MRI. The automated processing pipeline was implemented to convert diffusion-weighted images into the NIfTI format. DTI-TK software was used to register the processed images to the ICBM DTI-81 atlas, while AFNI software was used for automated atlas-based volumes of interest (VOIs) and statistical value extraction. DTI exhibited consistent grey-white matter contrast. Triphasic temporal variation of the FA and MD values was noted, with FA increasing and MD decreasing rapidly during the first 12 months. The second phase lasted from 12 to 24 months, during which the rate of FA and MD changes was reduced. After 24 months, the FA and MD values plateaued. DTI is superior to conventional MR imaging in depicting WM maturation. The use of the automated processing pipeline provides a reliable environment for quantitative analysis of high-throughput DTI data. (orig.)

  7. Using Automated On-Site Monitoring to Calibrate Empirical Models of Trihalomethanes Concentrations in Drinking Water

    Directory of Open Access Journals (Sweden)

    Thomas E. Watts III

    2015-10-01

    Full Text Available An automated, on-site trihalomethanes concentration data set from a conventional water treatment plant was used to optimize powdered activated carbon and pre-chlorination doses. The trihalomethanes concentration data set was used with commonly monitored water quality parameters to improve an empirical model of trihalomethanes formation. The calibrated model was used to predict trihalomethanes concentrations the following year, and the agreement between the models and measurements was evaluated. The original model predicted trihalomethanes concentrations within ~10 μg·L−1 of the measurement; calibration improved model predictions by a factor of three to five relative to the literature model.
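
    A minimal sketch of the calibration idea: fit an empirical linear model of THM concentration on routinely monitored water quality parameters by least squares. The predictors, coefficients, and data below are invented for illustration, not the paper's model.

```python
# Calibrate a linear THM-formation model on synthetic monitoring data.
import numpy as np

rng = np.random.default_rng(0)
n = 120
X = np.column_stack([
    rng.uniform(1, 5, n),     # total organic carbon, mg/L (assumed predictor)
    rng.uniform(0.5, 3, n),   # chlorine dose, mg/L
    rng.uniform(10, 30, n),   # water temperature, deg C
    np.ones(n),               # intercept term
])
thm = X @ np.array([8.0, 6.0, 1.2, 5.0]) + rng.normal(0, 3, n)  # ug/L, synthetic

coef, *_ = np.linalg.lstsq(X, thm, rcond=None)   # calibrated coefficients
pred = X @ coef
print(f"RMSE: {np.sqrt(np.mean((pred - thm) ** 2)):.1f} ug/L")
```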

  8. Partitioning, Automation and Error Recovery in the Control and Monitoring System of an LHC Experiment

    Institute of Scientific and Technical Information of China (English)

    C.Gaspar

    2001-01-01

    The Joint Controls Project (JCOP) is a collaboration between CERN and the four LHC experiments to find and implement common solutions for their control and monitoring systems. As part of this project, an Architecture Working Group was set up in order to study the requirements and devise an architectural model that would suit the four experiments. Many issues were studied by this working group: alarm handling, access control, hierarchical control, etc. This paper reports on the specific issue of hierarchical control, and in particular partitioning, automation and error recovery.

  9. AUTOMATION OF BUISNESS-PROCESSES OF A TRAINING CENTER

    Directory of Open Access Journals (Sweden)

    Kovalenko A. V.

    2015-06-01

    Full Text Available Modern Russian companies have realized the need for document flow automation not only as a means of keeping documents in order, but also as a tool for optimizing expenses and as an aid in making administrative decisions. For a long time, the Russian market of information systems had no software products intended for educational institutions; the majority of automated systems are intended for enterprises active in trade and production. By comparison, the list of software products for commercial training centers is small, and even the existing line of programs does not meet all the requirements of companies in this field. When creating the automated system for a training center, the existing software products intended for automation of training centers and related institutions were analyzed, and a number of features specific to this activity were identified. The article describes the developed automated information system for document flow of a commercial educational institution, namely the "Training center" configuration built on the "1C: Enterprise 8.2" platform. The developed program complex serves as an administrative tool for analyzing the economic activity of a training center, scheduling the educational equipment and teaching staff, and calculating payroll taking into account the specifics of the branch

  10. A novel automated discontinuous venous blood monitoring system for ex vivo glucose determination in humans.

    Science.gov (United States)

    Schaller, R; Feichtner, F; Köhler, H; Bodenlenz, M; Plank, J; Wutte, A; Mader, J K; Ellmerer, M; Hellmich, R; Wedig, H; Hainisch, R; Pieber, T R; Schaupp, L

    2009-03-15

    Intensive insulin therapy reduces mortality and morbidity in critically ill patients but imposes great demands on medical staff, who must take frequent blood samples for the determination of glucose levels. A solution to this resourcing problem would be provided by an automated blood monitoring system. The aim of the present clinical study was to evaluate such a system, comprising an automatic blood sampling unit linked to a glucose biosensor. Our approach was to determine the correlation and system error of the sampling unit alone and of the combined system with respect to reference levels over 12 h in humans. Two venous cannulae were inserted to connect the automatic and reference systems to the subjects. Blood samples were taken at 15 and 30 min intervals. The median Pearson coefficient of correlation between manually and automatically withdrawn blood samples was 0.982 for the sampling unit alone and 0.950 for the complete system. The biosensor had a linear range up to 20 mmol l(-1) and a 95% response time of ... Titration Error Grid analysis suggested an acceptable treatment in 99.56% of cases. Implementation of a "Keep Vein Open" saline infusion into the automated blood sampling system reduced blood withdrawal failures through occluded catheters fourfold. In summary, automated blood sampling from a peripheral vein coupled with automatic glucose determination is a promising alternative to frequent manual blood sampling. PMID:19135351

  11. How automation helps steer the revenue cycle process.

    Science.gov (United States)

    Colpas, Phil

    2013-06-01

    ... a top-of-mind issue as we see how healthcare reform plays out. Here's what our select group of experts had to say about how automation helps to steer the revenue cycle process. PMID:23855249

  12. Comprehensive process monitoring for laser welding process optimization

    Science.gov (United States)

    Stritt, P.; Boley, M.; Heider, A.; Fetzer, F.; Jarwitz, M.; Weller, D.; Weber, R.; Berger, P.; Graf, T.

    2016-03-01

    Fundamental process monitoring is very helpful for detecting defects formed during the complex interactions of the capillary laser welding process. Beyond defect detection, monitoring and diagnostics of the laser welding process enlarge the process knowledge that is essential to prevent weld defects. Various studies on monitoring of laser welding processes of aluminum, copper and steel were performed. Coaxial real-time analyses with inline coherent imaging and photodiode-based measurements have been applied, as well as off-axis thermography, spectroscopy, online X-ray observation and high-speed imaging with 808 nm illumination wavelength. The presented diagnostics and monitoring methods were appropriate for studying typical weld defects like pores, spatter and cracks. Using these diagnostics allows understanding the formation of such defects and developing strategies to prevent them.

  13. Experimental demonstration of microscopic process monitoring

    International Nuclear Information System (INIS)

    Microscopic process monitoring (MPM) is a material control strategy designed to use standard process control data to provide expanded safeguards protection of nuclear fuel cycle facilities. The MPM methodology identifies process events by recognizing significant patterns of changes in on-line measurements. The goals of MPM are to detect diversions of nuclear material and to provide information on process status useful to other facility safeguards operations

  14. FRP resin process automating system; FRP jushi kako jidoka system

    Energy Technology Data Exchange (ETDEWEB)

    Ochiai, I.; Sakai, H. [Meidensha Corp., Tokyo (Japan)

    1994-10-18

    This paper introduces an FRP resin product processing system using robots. Automatic processing by means of robots requires considerations of positioning of delivered workpieces, correction of positional shift of workpieces, monitoring of tools and cutters, disposal of chips, and dust and noise preventive measures. In a bath tank drilling and deburring system, robots must measure and correct the positional shift of workpieces, exchange tools automatically, and shut down the system upon occurrence of an anomaly in processing. The wall panel processing system transports products using a lift-and-carry system designed to prevent nicks on the contact side of a product. Workpieces are positioned by lifting them and pressing them onto the reference plate on the upper portion of the panel, and their thickness and length are measured and corrected by a workpiece shift correcting sensor mounted on a robot. The purification tank partition drilling system has shuttle-type transportation devices installed on both flanks of a robot; this is a high-efficiency system requiring no robot downtime. A dust collecting duct is located below the positioning device to prevent chips from leaking outside the device. 4 figs., 7 tabs.

  15. Marketing automation processes as a way to improve contemporary marketing of a company

    OpenAIRE

    Witold Świeczak

    2013-01-01

    The main aim of this article is to identify the possibilities offered to contemporary companies by the processes included in a marketing automation system. This publication deals with the key aspects of this issue. It shows how the importance of the organization changes and how its value increases as a result of using the tools provided by the processes included in the concept of marketing automation. This article defines the factors and processes which influenc...

  16. A model for business process automation in service oriented systems with knowledge management technologies

    OpenAIRE

    Šaša, Ana

    2009-01-01

    Due to increasing requirements for efficiency, effectiveness and flexibility of business systems, automation of business processes has become an important topic. In the last years, the most successful and predominant approach to business process automation has become the service-oriented architecture approach. In a service-oriented architecture a business process is composed of services, which represent different tasks that have to be performed in a business system. Typically, a business proc...

  17. Design Automation Systems for Production Preparation : Applied on the Rotary Draw Bending Process

    OpenAIRE

    Johansson, Joel

    2008-01-01

    Intensive competition on the global market puts great pressure on manufacturing companies to develop and produce products that meet requirements from customers and investors. One key factor in meeting these requirements is the efficiency of the product development and the production preparation process. Design automation is a powerful tool to increase efficiency in these two processes. The benefits of automating the production preparation process are shortened lead-time, improved product perfo...

  18. Towards Automated Education Demand-Offer Information Monitoring: the Information Extraction

    OpenAIRE

    Rudzājs, P

    2012-01-01

    Dynamically changing work environment in knowledge economy causes the changes in knowledge requirements for labor. Therefore it becomes more and more important to be constantly aware of what education is currently demanded and what education is currently offered. The IT solution is vital to process various information sources, extract education information, and provide analysis mechanisms in automated manner. The education information extraction is detailed in this paper in the context of Edu...

  19. Automated Identification of Volcanic Plumes using the Ozone Monitoring Instrument (OMI)

    Science.gov (United States)

    Flower, V. J. B.; Oommen, T.; Carn, S. A.

    2015-12-01

    Volcanic eruptions are a global phenomenon whose impact on human populations is increasing, due to factors such as the extension of population centres into areas of higher risk, the expansion of agricultural sectors to accommodate increased production, and the growing effect of volcanic plumes on air travel. In areas where extensive monitoring is present, these impacts can be moderated by ground-based monitoring and alert systems; however, many volcanoes have little or no monitoring capability. In many of these regions volcanic alerts are generated by local communities with limited resources or formal communication systems, though additional eruption alerts can result from chance encounters with passing aircraft. In contrast, satellite-based remote sensing instruments provide near-global daily monitoring, facilitating automated volcanic eruption detection. One such system, known as MODVOLC, generates eruption alerts through the detection of thermal anomalies and is currently operational using moderate-resolution MODIS satellite data. In this work we outline a method to distinguish SO2 eruptions from background levels recorded by the Ozone Monitoring Instrument (OMI) through the identification and classification of volcanic activity over a 5-year period. Incorporating these data into a logistic regression model facilitated the classification of volcanic events with an overall accuracy of 80%, while consistently identifying plumes with a mass of 400 tons or higher. The implementation of the developed model could facilitate the near-real-time identification of new and ongoing volcanic activity on a global scale.
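
    A minimal sketch of the classification step described above: a logistic regression separating volcanic scenes from background using summary features. The two features and all data are synthetic, not OMI retrievals.

```python
# Logistic regression on synthetic [SO2 mass, plume area] features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
bg = np.column_stack([rng.normal(50, 30, 200), rng.normal(5, 2, 200)])
erupt = np.column_stack([rng.normal(600, 200, 60), rng.normal(20, 5, 60)])
X = np.vstack([bg, erupt])                    # [SO2 mass (tons), area (pixels)]
y = np.r_[np.zeros(200), np.ones(60)]

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[450.0, 15.0]])[0, 1])  # probability of volcanic activity
```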

  20. Fully Automated Field-Deployable Bioaerosol Monitoring System Using Carbon Nanotube-Based Biosensors.

    Science.gov (United States)

    Kim, Junhyup; Jin, Joon-Hyung; Kim, Hyun Soo; Song, Wonbin; Shin, Su-Kyoung; Yi, Hana; Jang, Dae-Ho; Shin, Sehyun; Lee, Byung Yang

    2016-05-17

    Much progress has been made in the field of automated monitoring systems of airborne pathogens. However, they still lack the robustness and stability necessary for field deployment. Here, we demonstrate a bioaerosol automonitoring instrument (BAMI) specifically designed for the in situ capturing and continuous monitoring of airborne fungal particles. This was possible by developing highly sensitive and selective fungi sensors based on two-channel carbon nanotube field-effect transistors (CNT-FETs), followed by integration with a bioaerosol sampler, a Peltier cooler for receptor lifetime enhancement, and a pumping assembly for fluidic control. These four main components collectively cooperated with each other to enable the real-time monitoring of fungi. The two-channel CNT-FETs can detect two different fungal species simultaneously. The Peltier cooler effectively lowers the working temperature of the sensor device, resulting in extended sensor lifetime and receptor stability. The system performance was verified in both laboratory conditions and real residential areas. The system response was in accordance with reported fungal species distribution in the environment. Our system is versatile enough that it can be easily modified for the monitoring of other airborne pathogens. We expect that our system will expedite the development of hand-held and portable systems for airborne bioaerosol monitoring. PMID:27070239

  1. Development of a Fully Automated, GPS Based Monitoring System for Disaster Prevention and Emergency Preparedness: PPMS+RT

    Directory of Open Access Journals (Sweden)

    Anna Szostak-Chrzanowski

    2007-06-01

    Full Text Available The increasing number of structural collapses, slope failures and other natural disasters has led to a demand for new sensors, sensor integration techniques and data processing strategies for deformation monitoring systems. In order to meet extraordinary accuracy requirements for displacement detection in recent deformation monitoring projects, research has been devoted to integrating the Global Positioning System (GPS) as a monitoring sensor. Although GPS has been used for monitoring purposes worldwide, certain environments pose challenges where conventional processing techniques cannot provide the required accuracy with sufficient update frequency. Described is the development of a fully automated, continuous, real-time monitoring system that employs GPS sensors and pseudolite technology to meet these requirements in such environments. Ethernet and/or serial port communication techniques are used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely based upon client needs. A test was conducted that illustrated a 10 mm displacement was remotely detected at a target point using the designed system. This information could then be used to signal an alarm if conditions are deemed to be unsafe.

  2. Automated Performance Monitoring Data Analysis and Reporting within the Open Source R Environment

    Science.gov (United States)

    Kennel, J.; Tonkin, M. J.; Faught, W.; Lee, A.; Biebesheimer, F.

    2013-12-01

    Environmental scientists encounter quantities of data at a rate that in many cases outpaces our ability to appropriately store, visualize and convey the information. The free software environment, R, provides a framework for efficiently processing, analyzing, depicting and reporting on data from a multitude of formats in the form of traceable and quality-assured data summary reports. Automated data summary reporting leverages document markup languages such as markdown, HTML, or LaTeX using R-scripts capable of completing a variety of simple or sophisticated data processing, analysis and visualization tasks. Automated data summary reports seamlessly integrate analysis into report production with calculation outputs - such as plots, maps and statistics - included alongside report text. Once a site-specific template is set up, including data types, geographic base data and reporting requirements, reports can be (re-)generated trivially as the data evolve. The automated data summary report can be a stand-alone report, or it can be incorporated as an attachment to an interpretive report prepared by a subject-matter expert, thereby providing the technical basis to report on and efficiently evaluate large volumes of data resulting in a concise interpretive report. Hence, the data summary report does not replace the scientist, but relieves them of repetitive data processing tasks, facilitating a greater level of analysis. This is demonstrated using an implementation developed for monthly groundwater data reporting for a multi-constituent contaminated site, highlighting selected analysis techniques that can be easily incorporated in a data summary report.

  3. Electronic Tongue-FIA system for the Monitoring of Heavy Metal Biosorption Processes

    Science.gov (United States)

    Wilson, D.; Florido, A.; Valderrama, C.; de Labastida, M. Fernández; Alegret, S.; del Valle, M.

    2011-09-01

    An automated flow injection potentiometric (FIP) system with electronic tongue detection (ET) was used for the monitoring of biosorption processes of heavy metals on waste biomaterial. Grape stalk wastes were used as biosorbent to remove Cu2+ ions in a fixed-bed column setup. For the monitoring, the ET employed a sensor array formed by Cu2+ and Ca2+ selective electrodes and two generic heavy-metal electrodes. The cross-response obtained was processed by a multilayer artificial neural network (ANN) model in order to resolve the concentrations of the monitored species. Coupling the electronic tongue with the automation features of the flow injection system (ET-FIP) allowed us to accurately characterize the biosorption process by obtaining its breakthrough curves. In parallel, fractions of the extract solution were analyzed by atomic absorption spectroscopy in order to validate the results obtained with the reported methodology.
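
    A minimal sketch of the ANN step: a small neural network learns to resolve individual ion concentrations from the cross-responses of a four-electrode array. The linear mixing model and all data are invented stand-ins for real sensor responses.

```python
# Train a small MLP to map 4-electrode cross-responses to 2 concentrations.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
conc = rng.uniform(0, 1, size=(300, 2))       # [Cu2+, Ca2+] (arbitrary units)
mix = np.array([[1.0, 0.2], [0.3, 1.0], [0.6, 0.5], [0.4, 0.7]])
potentials = conc @ mix.T + rng.normal(0, 0.01, (300, 4))  # 4-electrode array

net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
net.fit(potentials[:250], conc[:250])
print(net.score(potentials[250:], conc[250:]))  # R^2 on held-out samples
```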

  4. Advanced monitoring with complex stream processing

    CERN Document Server

    CERN. Geneva

    2015-01-01

    Making sense of metrics and logs for service monitoring can be a complicated task. Valuable information is normally scattered across several streams of monitoring data, requiring aggregation, correlation and time-based analysis to promptly detect problems and failures. This presentation shows a solution used to support the advanced monitoring of the messaging services provided by the IT Department. It uses Esper, an open-source software product for Complex Event Processing (CEP), which analyses series of events to derive conclusions from them.

  5. Automated processing of data on the use of motor vehicles in the Serbian Armed Forces

    Directory of Open Access Journals (Sweden)

    Nikola S. Osmokrović

    2012-10-01

    Full Text Available The main aim of introducing information technology into the armed forces is the automation of the management process. In our armed forces, management in movement and transport (M&T) has been included in the automation process from the beginning. For that reason, today we can speak about automated processing of data on road traffic safety and on the use of motor vehicles. With regard to the overall development of the information system of the movement and transport service, the paper presents an information system of the M&T service for the processing of data on the use of motor vehicles. The main features, components and functions of the 'Vozila' application, which was specially developed for the automated processing of data on motor vehicle use, are explained in particular.

  6. Ultrasonic techniques for process monitoring and control.

    Energy Technology Data Exchange (ETDEWEB)

    Chien, H.-T.

    1999-03-24

    Ultrasonic techniques have been applied successfully to process monitoring and control in many industries, such as energy, medicine, textiles, oil, and materials. They help these industries with quality control, energy efficiency improvement, waste reduction, and cost savings. This paper presents four ultrasonic systems developed at Argonne National Laboratory over the past five years for various applications: an ultrasonic viscometer, an on-loom real-time ultrasonic imaging system, an ultrasonic leak detection system, and an ultrasonic solid concentration monitoring system.

  7. An improved approach for process monitoring in laser material processing

    Science.gov (United States)

    König, Hans-Georg; Pütsch, Oliver; Stollenwerk, Jochen; Loosen, Peter

    2016-04-01

    Process monitoring is used in many different laser material processes due to the demand for reliable and stable processes. Among different methods, on-axis process monitoring offers multiple advantages. To observe a laser material process, it is unavoidable to choose an observation wavelength different from the one used for material processing; otherwise the light of the processing laser would outshine the image of the process. By choosing a different wavelength, lateral chromatic aberration occurs in optical systems with scanning units and f-Theta lenses that are not chromatically corrected. These aberrations lead to a truncated image of the process on the camera or the pyrometer, respectively, which results in adulterated measurements and unsatisfactory images of the process. A new approach for solving the problem of field-dependent lateral chromatic aberration in process monitoring is presented. The scanner-based optical system is reproduced in a simulation environment to predict the occurring lateral chromatic aberrations, and a second deflecting system is integrated into the setup. Using the simulation, a predictive control is designed that drives the additional deflecting system to introduce reverse lateral deviations, compensating the lateral effect of chromatic aberration. This paper illustrates the concept and the implementation of the predictive control used to eliminate lateral chromatic aberrations in process monitoring, the simulation on which the system is based, the optical system, as well as the control concept.

  8. FY-2010 Process Monitoring Technology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Christopher R.; Bryan, Samuel A.; Casella, Amanda J.; Hines, Wes; Levitskaia, Tatiana G.; henkell, J.; Schwantes, Jon M.; Jordan, Elizabeth A.; Lines, Amanda M.; Fraga, Carlos G.; Peterson, James M.; Verdugo, Dawn E.; Christensen, Ronald N.; Peper, Shane M.

    2011-01-01

    During FY 2010, work under the Spectroscopy-Based Process Monitoring task included ordering and receiving four fluid flow meters and four flow visible-near infrared spectrometer cells to be instrumented within the centrifugal contactor system at Pacific Northwest National Laboratory (PNNL). Initial demonstrations of real-time spectroscopic measurements on cold-stream simulants were conducted under plutonium (Pu)/uranium (U) (PUREX) solvent extraction process conditions. The specific test case examined the extraction of neodymium nitrate (Nd(NO3)3) from an aqueous nitric acid (HNO3) feed into a tri-n-butyl phosphate (TBP)/n-dodecane solvent. Demonstration testing of this system included diverting a sample from the aqueous feed while monitoring every phase of the process with the on-line spectroscopic process monitoring system. The purpose of this demonstration was to test whether spectroscopic monitoring is capable of determining the mass balance of metal nitrate species involved in a cross-current solvent extraction scheme while a sample is being diverted from the system. The diversion scenario involved diverting a portion of the feed from a counter-current extraction system while a continuous extraction experiment was underway. A successful test would demonstrate the ability of the process monitoring system to detect and quantify the diversion of material from the system during a real-time continuous solvent extraction experiment. The system was designed to mimic a PUREX-type extraction process with a bank of four centrifugal contactors. The aqueous feed contained Nd(NO3)3 in HNO3, and the organic phase was composed of TBP/n-dodecane. The amount of sample observed to be diverted by on-line spectroscopic process monitoring was measured to be 3 mmol (3 x 10^-3 mol) Nd3+. This value was in excellent agreement with the 2.9 mmol Nd3+ value based on the known mass of sample taken (i.e., diverted) directly from the system feed solution.
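
    A minimal sketch of how spectroscopic monitoring can quantify diverted material: absorbance is converted to concentration via the Beer-Lambert law and integrated over the diverted flow. The molar absorptivity, path length, flow rate, and absorbance values below are assumptions chosen so the toy result lands near 3 mmol; they are not the experiment's parameters.

```python
# Beer-Lambert quantification of material carried by a diverted stream.
import numpy as np

eps, path = 7.0, 1.0                          # L mol^-1 cm^-1 and cm (assumed)
t = np.linspace(0, 600, 601)                  # s, duration of the diversion
absorbance = np.full_like(t, 0.021)           # measured in the diverted stream
flow = 0.1 / 60.0                             # L/s (0.1 L/min, assumed)

conc = absorbance / (eps * path)              # mol/L via Beer-Lambert
dt = t[1] - t[0]
moles = np.sum(conc * flow) * dt              # integrate concentration x flow over time
print(f"diverted Nd3+: {moles * 1e3:.1f} mmol")
```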

  9. A report on the impact of automation in the food process industry

    OpenAIRE

    Dudbridge, Michael

    2008-01-01

    Research objectives: to understand how the food industry in Europe is using automation; to ascertain what the food processing industry requires from equipment suppliers; and to identify variations by sector and by country.

  10. Automated data evaluation and modelling of simultaneous 19F-1H medium-resolution NMR spectra for online reaction monitoring.

    Science.gov (United States)

    Zientek, Nicolai; Laurain, Clément; Meyer, Klas; Paul, Andrea; Engel, Dirk; Guthausen, Gisela; Kraume, Matthias; Maiwald, Michael

    2016-06-01

    Medium-resolution nuclear magnetic resonance spectroscopy (MR-NMR) is currently developing into an important analytical tool for both quality control and process monitoring. In contrast to high-resolution online NMR (HR-NMR), MR-NMR can be operated under rough environmental conditions. A continuous re-circulating stream of reaction mixture from the reaction vessel to the NMR spectrometer enables a non-invasive, volume-integrating online analysis of reactants and products. Here, we investigate the esterification of 2,2,2-trifluoroethanol with acetic acid to 2,2,2-trifluoroethyl acetate as a model system, both by 1H HR-NMR (500 MHz) and by 1H and 19F MR-NMR (43 MHz). The parallel online measurement is realised by splitting the flow, which allows the adjustment of quantitative and independent flow rates in the HR-NMR probe and in the MR-NMR probe, in addition to a fast bypass line back to the reactor. One of the fundamental acceptance criteria for online MR-NMR spectroscopy is a robust data treatment and evaluation strategy with the potential for automation. The MR-NMR spectra are treated by an automated baseline and phase correction using the minimum entropy method. The evaluation strategies comprise (i) direct integration, (ii) automated line fitting, (iii) indirect hard modelling (IHM) and (iv) partial least squares regression (PLS-R). To assess the potential of these evaluation strategies for MR-NMR, prediction results are compared with the line fitting data derived from quantitative HR-NMR spectroscopy. Superior results are obtained from both IHM and PLS-R for 1H MR-NMR, although the latter especially demands elaborate data pretreatment, whereas IHM models needed no previous alignment. Copyright © 2015 John Wiley & Sons, Ltd. PMID:25854892
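
    A minimal sketch of evaluation strategy (i), direct integration: sum the intensity within a fixed chemical-shift window and relate it to the total signal of both species. The Lorentzian line shapes, peak positions, and integration windows are synthetic, not the paper's spectra.

```python
# Direct integration of a synthetic spectrum to estimate apparent conversion.
import numpy as np

ppm = np.linspace(0, 10, 2000)                # chemical-shift axis

def peak(center, height, width=0.05):         # simple Lorentzian line shape
    return height / (1 + ((ppm - center) / width) ** 2)

spectrum = peak(2.1, 1.0) + peak(4.3, 0.6)    # reactant and product resonances
product = (ppm > 4.0) & (ppm < 4.6)           # integration window for the product
both = ((ppm > 1.8) & (ppm < 2.4)) | product  # windows covering both species

conversion = spectrum[product].sum() / spectrum[both].sum()
print(f"apparent conversion: {conversion:.2f}")
```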

  11. Digital Automation and Real-Time Monitoring of an Original Installation for "Wet Combustion" of Organic Wastes

    Science.gov (United States)

    Morozov, Yegor; Tikhomirov, Alexander A.; Saltykov, Mikhail; Trifonov, Sergey V.; Kudenko, Yurii A.

    2016-07-01

    An original method for "wet combustion" of organic wastes, which is being developed at the IBP SB RAS, is a very promising approach for the regeneration of nutrient solutions for plants in future spacecraft closed Bioregenerative Life Support Systems (BLSS). The method is quick and ecofriendly, does not require special conditions such as high pressure and temperature, and the resulting nitrogen stays in forms suitable for further preparation of the fertilizer. An experimental testbed of a new-generation closed ecosystem is currently being run at the IBP SB RAS to examine the compatibility of the latest technologies for accelerating the cycling. Integration of "wet combustion" of organic wastes into the information system of the closed ecosystem experimental testbed has been studied as part of the preparatory work. Digital automation and real-time monitoring of the operating parameters of the original "wet combustion" installation have been implemented. The new system enables remotely controlled or automatic operation of the installation. Data are stored in standard, easily processed formats, allowing further mathematical processing where necessary. During ongoing experiments on improving "wet combustion" of organic wastes, automatic monitoring can detect slight changes in process parameters and record them in more detail. The ultimate goal of the study is to include the "wet combustion" installation in a future full-scale experiment with humans, thus reducing the time spent by the crew on life support issues while living in the BLSS. The work was carried out with the financial support of the Russian Scientific Foundation (project 14-14-00599).

  12. Batch Statistical Process Monitoring Approach to a Cocrystallization Process.

    Science.gov (United States)

    Sarraguça, Mafalda C; Ribeiro, Paulo R S; Santos, Adenilson O Dos; Lopes, João A

    2015-12-01

    Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature, held together by noncovalent bonds. Their main advantages are increased solubility, bioavailability, permeability and stability, while retaining the bioactivity of the active pharmaceutical ingredient. The cocrystallization between furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to perceive the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, such as the amount of solvent or the amount of initial components present in the cocrystallization.

  13. Automated swimming activity monitor for examining temporal patterns of toxicant effects on individual Daphnia magna.

    Science.gov (United States)

    Bahrndorff, Simon; Michaelsen, Thomas Yssing; Jensen, Anne; Marcussen, Laurits Faarup; Nielsen, Majken Elley; Roslev, Peter

    2016-07-01

    Aquatic pollutants are often biologically active at low concentrations and impact on biota in combination with other abiotic stressors. Traditional toxicity tests may not detect these effects, and there is a need for sensitive high-throughput methods for detecting sublethal effects. We have evaluated an automated infra-red (IR) light-based monitor for recording the swimming activity of Daphnia magna to establish temporal patterns of toxicant effects on an individual level. Activity was recorded for 48 h and the sensitivity of the monitor was evaluated by exposing D. magna to the reference chemicals K2Cr2O7 at 15, 20 and 25 °C and 2,4-dichlorophenol at 20 °C. Significant effects (P < 0.05) of K2Cr2O7 were detected at 15 °C, whereas activity at 20 and 25 °C was more biphasic, with decreases in activity occurring after 12-18 h. A similar biphasic pattern was observed after 2,4-dichlorophenol exposure at 20 °C. EC50 values for 2,4-dichlorophenol and K2Cr2O7 determined from automated recording of swimming activity showed increasing toxicity with time, corresponding to decreases in EC50 of 0.03-0.07 mg l(-1) h(-1). EC50 values determined after 48 h were comparable to or lower than EC50 values based on visual inspection according to ISO 6341. The results demonstrated that the swimming activity monitor is capable of detecting sublethal behavioural effects that are toxicant and temperature dependent. The method allows EC values to be established at different time points and can serve as a high-throughput screening tool in toxicity testing. Copyright © 2015 John Wiley & Sons, Ltd. PMID:26198804
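
    A minimal sketch of how EC50 values can be derived from activity data: fit a two-parameter log-logistic dose-response curve to normalized swimming activity. The concentrations and responses below are invented for illustration.

```python
# Fit a log-logistic dose-response curve and report the EC50.
import numpy as np
from scipy.optimize import curve_fit

def dose_response(c, ec50, hill):
    return 1.0 / (1.0 + (c / ec50) ** hill)   # fraction of control activity

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0])  # mg/L
activity = np.array([0.98, 0.90, 0.55, 0.20, 0.05])

(ec50, hill), _ = curve_fit(dose_response, conc, activity, p0=[1.0, 1.0])
print(f"EC50 = {ec50:.2f} mg/L, Hill slope = {hill:.2f}")
```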

  14. Monitoring and controlling the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Ahring, B.K.; Angelidaki, I. [The Technical Univ. of Denmark, Dept. of Environmental Science and Engineering, Lyngby (Denmark)

    1997-08-01

    Many modern large-scale biogas plants have been constructed recently, increasing the demand for proper monitoring and control of these large reactor systems. For monitoring the biogas process, an easy-to-measure and reliable indicator is required which reflects the metabolic state and the activity of the bacterial populations in the reactor. In this paper, we discuss existing indicators as well as indicators under development which can potentially be used to monitor the state of the biogas process in a reactor. Furthermore, data are presented from two large-scale thermophilic biogas plants that were subjected to temperature changes and where the concentration of volatile fatty acids (VFA) was monitored. The results clearly demonstrated that significant changes in the concentrations of the individual VFAs occurred although the biogas production was not significantly changed; especially the concentrations of butyrate, isobutyrate and isovalerate showed significant changes. Future improvements of process control could therefore be based on monitoring of the concentrations of specific VFAs together with information about the bacterial populations in the reactor. The latter information could be supplied by the use of modern molecular techniques. (au) 51 refs.

  15. A Camera and Multi-Sensor Automated Station Design for Polar Physical and Biological Systems Monitoring: AMIGOS

    Science.gov (United States)

    Bohlander, J. A.; Ross, R.; Scambos, T.; Haran, T. M.; Bauer, R. J.

    2012-12-01

    The Automated Meteorology - Ice/Indigenous species - Geophysics Observation System (AMIGOS) consists of a set of measurement instruments and camera(s) controlled by a single-board computer with a simplified Linux operating system and an Iridium satellite modem supporting two-way communication. Primary features of the system relevant to polar operations are low power requirements, daily data uploading, reprogramming, tolerance for low temperatures, and various approaches for automatic resets and recovery from low power or cold shut-down. Instruments include a compact weather station, C/A or dual-frequency GPS, solar flux and reflectivity sensors, sonic snow gages, simplified radio-echo-sounder, and resistance thermometer string in the firn column. In the current state of development, there are two basic designs. One is intended for in situ observations of glacier conditions. The other design supports a high-resolution camera for monitoring biological or geophysical systems from short distances (100 m to 20 km). The stations have been successfully used in several locations for operational support, monitoring rapid ice changes in response to climate change or iceberg drift, and monitoring penguin colony activity. As of June, 2012, there are 9 AMIGOS systems installed, all on the Antarctic continent. The stations are a working prototype for a planned series of upgraded stations, currently termed 'Sentinels'. These stations would carry further instrumentation, communications, and processing capability to investigate ice - ocean interaction from ice tongue, ice shelf, or fjord coastline areas.

  16. Intelligent monitoring and fault diagnosis for ATLAS TDAQ: a complex event processing solution

    CERN Document Server

    Magnoni, Luca; Luppi, Eleonora

    Effective monitoring and analysis tools are fundamental in modern IT infrastructures to get insights on the overall system behavior and to deal promptly and effectively with failures. In recent years, Complex Event Processing (CEP) technologies have emerged as effective solutions for information processing from the most disparate fields: from wireless sensor networks to financial analysis. This thesis proposes an innovative approach to monitor and operate complex and distributed computing systems, in particular referring to the ATLAS Trigger and Data Acquisition (TDAQ) system currently in use at the European Organization for Nuclear Research (CERN). The result of this research, the AAL project, is currently used to provide ATLAS data acquisition operators with automated error detection and intelligent system analysis. The thesis begins by describing the TDAQ system and the controlling architecture, with a focus on the monitoring infrastructure and the expert system used for error detection and automated reco...

  17. Robust processing of mining subsidence monitoring data

    Energy Technology Data Exchange (ETDEWEB)

    Wang Mingzhong; Huang Guogang [Pingdingshan Mining Bureau (China); Wang Yunjia; Guogangli [China Univ. of Mining and Technology, Xuzhou (China)

    1996-12-31

    Since China began research on mining subsidence in the 1950s, more than one thousand observation lines have been surveyed. Yet monitoring data sometimes contain many outliers because of the limits of observation and of geological and mining conditions. In China nowadays, the processing of mining subsidence monitoring data is based on the principle of least squares, which can produce lower accuracy, less reliability, or even errors. For the reasons given above, the authors, taking the actual situation in China into account, have done research on the robust processing of mining subsidence monitoring data with respect to obtaining prediction parameters. The authors have derived the related formulas, designed computational programmes, carried out a great quantity of calculation and simulation, and achieved good results. (orig.)
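
    A minimal sketch of the robust idea: iteratively reweighted least squares (IRLS) with Huber weights down-weights gross outliers that would distort an ordinary least-squares fit of monitoring data. The data, model, and tuning constant below are illustrative, not the authors' formulas.

```python
# Ordinary least squares vs. Huber IRLS on data with injected outliers.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 40)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, x.size)
y[::9] += 8.0                                 # inject gross outliers

A = np.column_stack([x, np.ones_like(x)])
ols = np.linalg.lstsq(A, y, rcond=None)[0]    # ordinary LS estimate

beta = ols.copy()
for _ in range(20):                           # IRLS with Huber weights
    r = y - A @ beta
    scale = np.median(np.abs(r)) / 0.6745     # robust scale estimate
    k = 1.345 * scale                         # standard Huber tuning constant
    w = np.where(np.abs(r) <= k, 1.0, k / np.abs(r))
    beta = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (w * y))

print(f"OLS slope {ols[0]:.2f} vs robust slope {beta[0]:.2f} (true 2.00)")
```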

  18. Process optimization and biocompatibility of cell carriers suitable for automated magnetic manipulation.

    Science.gov (United States)

    Krejci, I; Piana, C; Howitz, S; Wegener, T; Fiedler, S; Zwanzig, M; Schmitt, D; Daum, N; Meier, K; Lehr, C M; Batista, U; Zemljic, S; Messerschmidt, J; Franzke, J; Wirth, M; Gabor, F

    2012-03-01

    There is increasing demand for automated cell reprogramming in the fields of cell biology, biotechnology and the biomedical sciences. Microfluidic-based platforms that provide unattended manipulation of adherent cells promise to be an appropriate basis for cell manipulation. In this study we developed a magnetically driven cell carrier to serve as a vehicle within an in vitro environment. To elucidate the impact of the carrier on cells, biocompatibility was estimated using the human adenocarcinoma cell line Caco-2. Besides evaluation of the quality of the magnetic carriers by field emission scanning electron microscopy, the rate of adherence, proliferation and differentiation of Caco-2 cells grown on the carriers was quantified. Moreover, the morphology of the cells was monitored by immunofluorescent staining. Early generations of the cell carrier suffered from release of cytotoxic nickel from the magnetic cushion. Biocompatibility was achieved by complete encapsulation of the nickel bulk within galvanic gold. The insulation process had to be developed stepwise and was controlled by parallel monitoring of the cell viability. The final carrier generation proved to be a proper support for cell manipulation, allowing proliferation of Caco-2 cells equal to that on glass or polystyrene as a reference for up to 10 days. Functional differentiation was enhanced by more than 30% compared with the reference. A flat, ferromagnetic and fully biocompatible carrier for cell manipulation was developed for application in microfluidic systems. Beyond that, this study offers advice for the development of magnetic cell carriers and the estimation of their biocompatibility.

  19. Coating Process Monitoring Using Computer Vision

    OpenAIRE

    Veijola, Erik

    2013-01-01

    The aim of this Bachelor’s Thesis was to make a prototype system for Metso Paper Inc. for monitoring a paper roll coating process. If the coating is done badly and there are faults, the process has to be redone, which lowers the profits of the company since the process is costly. The work was proposed by Seppo Parviainen in December of 2012. The resulting system was to alarm the personnel of faults in the process, specifically if the system that is applying the synthetic resin on to the roll...

  20. Automated size-specific CT dose monitoring program: Assessing variability in CT dose

    Energy Technology Data Exchange (ETDEWEB)

    Christianson, Olav; Li Xiang; Frush, Donald; Samei, Ehsan [Clinical Imaging Physics Group, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Carl E. Ravin Advanced Imaging Laboratories, Department of Radiology, Duke University Medical Center, Durham, North Carolina 27705 (United States); Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Department of Physics, Duke University, Durham, North Carolina 27710 (United States); Department of Biomedical Engineering, Duke University, Durham, North Carolina 27708 (United States)

    2012-11-15

    Purpose: The potential health risks associated with low levels of ionizing radiation have created a movement in the radiology community to optimize computed tomography (CT) imaging protocols to use the lowest radiation dose possible without compromising the diagnostic usefulness of the images. Despite efforts to use appropriate and consistent radiation doses, studies suggest that a great deal of variability in radiation dose exists both within and between institutions for CT imaging. In this context, the authors have developed an automated size-specific radiation dose monitoring program for CT and used this program to assess variability in size-adjusted effective dose from CT imaging. Methods: The authors' radiation dose monitoring program operates on an independent Health Insurance Portability and Accountability Act (HIPAA) compliant dosimetry server. Digital Imaging and Communications in Medicine (DICOM) routing software is used to isolate dose report screen captures and scout images for all incoming CT studies. Effective dose conversion factors (k-factors) are determined based on the protocol, and optical character recognition is used to extract the CT dose index and dose-length product. The patient's thickness is obtained by applying an adaptive thresholding algorithm to the scout images and is used to calculate the size-adjusted effective dose (ED{sub adj}). The radiation dose monitoring program was used to collect data on 6351 CT studies from three scanner models (GE Lightspeed Pro 16, GE Lightspeed VCT, and GE Definition CT750 HD) and two institutions over a one-month period and to analyze the variability in ED{sub adj} between scanner models and across institutions. Results: No significant difference was found between computer measurements of patient thickness and observer measurements (p = 0.17), and the average difference between the two methods was less than 4%. Applying the size correction resulted in ED{sub adj} that differed by up to 44% from effective dose
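    As a rough sketch of the pipeline this record describes (thickness estimated by thresholding a scout image, then protocol k-factors applied to the dose-length product), the snippet below uses typical published k-factor values and a plain mean threshold as stand-ins; the study's actual adaptive algorithm and size-correction function are not reproduced here.

        import numpy as np

        # Typical published k-factors in mSv/(mGy*cm); values assumed for illustration.
        K_FACTORS = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}

        def estimate_thickness_cm(scout, row_spacing_cm, threshold=None):
            """Estimate patient thickness from a scout image by thresholding."""
            if threshold is None:
                threshold = scout.mean()  # stand-in for the paper's adaptive threshold
            body_rows = np.any(scout > threshold, axis=1)  # rows containing the body
            return body_rows.sum() * row_spacing_cm

        def effective_dose_msv(dlp_mgy_cm, protocol):
            """Effective dose as k-factor times dose-length product."""
            return K_FACTORS[protocol] * dlp_mgy_cm

        # The size-adjusted dose ED_adj would further scale this estimate by a
        # factor derived from the measured thickness; that function is study-specific.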

  1. ADVANCES IN CLOG STATE MONITORING FOR USE IN AUTOMATED REED BED INSTALLATIONS

    Directory of Open Access Journals (Sweden)

    Theodore HUGHES-RILEY

    2014-06-01

    Full Text Available Constructed wetlands are a popular form of waste-water treatment that has proliferated across Europe and the rest of the world in recent years as an environmentally conscious form of treatment. The ability to monitor the conditions in the bed and control input factors such as heating and aeration may extend the lifetime of the reed bed substantially beyond the ten-year lifetime normally reached. The Autonomous Reed Bed Installation (ARBI) project is an EU FP7 initiative to develop a reed bed with automated control over input parameters based on readings taken from embedded sensors. Automated remedial action may improve bed treatment efficiency and prolong the life of the bed, avoiding the need to refurbish it, which is both time-consuming and costly. One critical parameter to observe is the clog state of the reed bed, as this can severely impact the efficiency of water treatment, to the point of the bed becoming non-operable. Magnetic resonance (MR) sensors can be a powerful tool for determining clogging levels, and have previously been explored in the literature. This work is based on a conference paper (2nd International Conference "Water Resources and Wetlands", 2014) and details magnetic sensors suitable for long-term embedding into a constructed wetland. Unlike previous studies, this work examines a probe embedded into a wetland.

  2. Automated tests for diagnosing and monitoring cognitive impairment: a diagnostic accuracy review.

    Science.gov (United States)

    Aslam, Rabeea'h W; Bates, Vickie; Dundar, Yenal; Hounsome, Juliet; Richardson, Marty; Krishan, Ashma; Dickson, Rumona; Boland, Angela; Kotas, Eleanor; Fisher, Joanne; Sikdar, Sudip; Robinson, Louise

    2016-01-01

    BACKGROUND Cognitive impairment is a growing public health concern, and is one of the most distinctive characteristics of all dementias. The timely recognition of dementia syndromes can be beneficial, as some causes of dementia are treatable and are fully or partially reversible. Several automated cognitive assessment tools for assessing mild cognitive impairment (MCI) and early dementia are now available. Proponents of these tests cite as benefits the tests' repeatability and robustness and the saving of clinicians' time. However, the use of these tools to diagnose and/or monitor progressive cognitive impairment or response to treatment has not yet been evaluated. OBJECTIVES The aim of this review was to determine whether or not automated computerised tests could accurately identify patients with progressive cognitive impairment in MCI and dementia and, if so, to investigate their role in monitoring disease progression and/or response to treatment. DATA SOURCES Five electronic databases (MEDLINE, EMBASE, The Cochrane Library, ISI Web of Science and PsycINFO), plus ProQuest, were searched from 2005 to August 2015. The bibliographies of retrieved citations were also examined. Trial and research registers were searched for ongoing studies and reviews. A second search was run to identify individual test costs and acquisition costs for the various tools identified in the review. REVIEW METHODS Two reviewers independently screened all titles and abstracts to identify potentially relevant studies for inclusion in the review. Full-text copies were assessed independently by two reviewers. Data were extracted and assessed for risk of bias by one reviewer and independently checked for accuracy by a second. The results of the data extraction and quality assessment for each study are presented in structured tables and as a narrative summary. RESULTS The electronic searching of databases, including ProQuest, resulted in 13,542 unique citations. The titles and abstracts of these

  3. A novel automated bioreactor for scalable process optimisation of haematopoietic stem cell culture.

    Science.gov (United States)

    Ratcliffe, E; Glen, K E; Workman, V L; Stacey, A J; Thomas, R J

    2012-10-31

    Proliferation and differentiation of haematopoietic stem cells (HSCs) from umbilical cord blood at large scale will potentially underpin production of a number of therapeutic cellular products in development, including erythrocytes and platelets. However, to achieve production processes that are scalable and optimised for cost and quality, scaled-down development platforms that can define process parameter tolerances and consequent manufacturing controls are essential. We have demonstrated the potential of a new, automated, 24×15 mL replicate suspension bioreactor system, with online monitoring and control, to develop an HSC proliferation and differentiation process for erythroid committed cells (CD71(+), CD235a(+)). Cell proliferation was relatively robust to cell density and oxygen levels and reached up to 6 population doublings over 10 days. The maximum suspension culture density for a 48 h total media exchange protocol was established to be on the order of 10(7) cells/mL. This system will be valuable for the further cost reduction and optimisation of HSC suspension culture necessary before conventional stirred tank technology can be applied to scaled manufacture of HSC-derived products.
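    The expansion figures quoted above follow the standard population-doubling relation; a one-line check with illustrative cell counts:

        import math

        def population_doublings(n_start, n_end):
            """Number of population doublings implied by a fold-expansion."""
            return math.log2(n_end / n_start)

        # Six doublings over 10 days corresponds to a 64-fold expansion:
        print(population_doublings(1.0e5, 6.4e6))  # -> 6.0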

  4. Completely automated measurement facility (PAVICOM) for track-detector data processing

    CERN Document Server

    Aleksandrov, A B; Feinberg, E L; Goncharova, L A; Konovalova, N S; Martynov, A G; Polukhina, N G; Roussetski, A S; Starkov, NI; Tsarev, V A

    2004-01-01

    A review of technical capabilities and investigations performed using the completely automated measuring facility (PAVICOM) is presented. This very efficient facility for track-detector data processing in the field of nuclear and high-energy particle physics was constructed at the Lebedev Physical Institute. PAVICOM is widely used in Russia for the treatment of experimental data from track detectors (emulsion and solid-state trackers) in high- and low-energy physics, cosmic ray physics, etc. PAVICOM provides an essential improvement in the efficiency of experimental studies. In contrast to the semi-automated microscopes widely used until now, PAVICOM is capable of performing completely automated measurements of charged particle tracks in nuclear emulsions and track detectors without hard visual work. Track images are recorded by CCD cameras, then digitized and converted into files. Thus, experimental data processing is accelerated by approximately a thousand times. Completely autom...

  5. A prototype of an automated high resolution InSAR volcano-monitoring system in the MED-SUV project

    Science.gov (United States)

    Chowdhury, Tanvir A.; Minet, Christian; Fritz, Thomas

    2016-04-01

    Volcanic processes, which produce a variety of geological and hydrological hazards, are difficult to predict and capable of triggering natural disasters on regional to global scales. It is therefore important to monitor volcanoes continuously and with a high spatial and temporal sampling rate. The monitoring of active volcanoes requires the reliable measurement of surface deformation before, during and after volcanic activities, and it helps towards a better understanding and modelling of the involved geophysical processes. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and the small baseline subset algorithm (SBAS) provide powerful tools for observing eruptive activities and measuring surface changes with millimetre accuracy. All the mentioned techniques with deformation time series extraction address these challenges by exploiting medium to large SAR image stacks. The process of selecting, ordering, downloading, storing, logging, extracting and preparing the data for processing is very time consuming and has to be done manually for every single data-stack. In many cases it is even an iterative process which has to be done regularly and continuously. Data processing therefore becomes slow, which causes significant delays in data delivery. The SAR Satellite based High Resolution Data Acquisition System, which will be developed at DLR, will automate these time-consuming tasks and allow an operational volcano monitoring system. Every 24 hours the system searches for newly acquired scenes over the volcanoes, keeps track of the data orders, logs the status and downloads the provided data via ftp-transfer, including e-mail alerts. Furthermore, the system will deliver specified reports and maps to a database for review and use by specialists. User interaction will be minimized and iterative processes will be totally avoided. In this presentation, a prototype of the SAR Satellite based High Resolution Data

  6. Verifiable process monitoring through enhanced data authentication

    International Nuclear Information System (INIS)

    To ensure the peaceful intent for production and processing of nuclear fuel, verifiable process monitoring of the fuel production cycle is required. As part of a U.S. Department of Energy (DOE)-EURATOM collaboration in the field of international nuclear safeguards, the DOE Sandia National Laboratories (SNL), the European Commission Joint Research Centre (JRC) and Directorate General-Energy (DG-ENER) developed and demonstrated a new concept in process monitoring, enabling the use of operator process information by branching a second, authenticated data stream to the Safeguards inspectorate. This information would be complementary to independent safeguards data, improving the understanding of the plant's operation. The concept is called the Enhanced Data Authentication System (EDAS). EDAS transparently captures, authenticates, and encrypts communication data that is transmitted between operator control computers and connected analytical equipment utilized in nuclear processes controls. The intent is to capture information as close to the sensor point as possible to assure the highest possible confidence in the branched data. Data must be collected transparently by the EDAS: Operator processes should not be altered or disrupted by the insertion of the EDAS as a monitoring system for safeguards. EDAS employs public key authentication providing 'jointly verifiable' data and private key encryption for confidentiality. Timestamps and data source are also added to the collected data for analysis. The core of the system hardware is in a security enclosure with both active and passive tamper indication. Further, the system has the ability to monitor seals or other security devices in close proximity. This paper will discuss the EDAS concept, recent technical developments, intended application philosophy and the planned future progression of this system.

  7. Using Natural Language Processing to Improve Accuracy of Automated Notifiable Disease Reporting

    OpenAIRE

    Friedlin, Jeff; Grannis, Shaun; Overhage, J Marc

    2008-01-01

    We examined whether using a natural language processing (NLP) system results in improved accuracy and completeness of automated electronic laboratory reporting (ELR) of notifiable conditions. We used data from a community-wide health information exchange that has automated ELR functionality. We focused on methicillin-resistant Staphylococcus Aureus (MRSA), a reportable infection found in unstructured, free-text culture result reports. We used the Regenstrief EXtraction tool (REX) for this wor...

  8. Knowledge Acquisition, Validation, and Maintenance in a Planning System for Automated Image Processing

    Science.gov (United States)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems. This paper describes a planning application for automated image processing and our overall approach to knowledge acquisition for this application.

  9. A wireless smart sensor network for automated monitoring of cable tension

    International Nuclear Information System (INIS)

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea. (paper)

  10. A wireless smart sensor network for automated monitoring of cable tension

    Science.gov (United States)

    Sim, Sung-Han; Li, Jian; Jo, Hongki; Park, Jong-Woong; Cho, Soojin; Spencer, Billie F., Jr.; Jung, Hyung-Jo

    2014-02-01

    As cables are primary load carrying members in cable-stayed bridges, monitoring the tension forces of the cables provides valuable information regarding structural soundness. Incorporating wireless smart sensors with vibration-based tension estimation methods provides an efficient means of autonomous long-term monitoring of cable tensions. This study develops a wireless cable tension monitoring system using MEMSIC’s Imote2 smart sensors. The monitoring system features autonomous operation, sustainable energy harvesting and power consumption, and remote access using the internet. To obtain the tension force, an in-network data processing strategy associated with the vibration-based tension estimation method is implemented on the Imote2-based sensor network, significantly reducing the wireless data transmission and the power consumption. The proposed monitoring system has been deployed and validated on the Jindo Bridge, a cable-stayed bridge located in South Korea.
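    The vibration-based tension estimation used in the two records above is commonly built on the taut-string relation T = 4 m L^2 (f_n / n)^2, with m the cable mass per unit length, L the cable length and f_n the n-th natural frequency identified from measured accelerations. The exact estimator implemented on the Imote2 network is not given in these records, so the sketch below is a generic illustration of that relation.

        def cable_tension_newtons(mass_per_length_kg_m, length_m, freq_hz, mode=1):
            """Taut-string estimate of cable tension from an identified natural frequency."""
            return 4.0 * mass_per_length_kg_m * length_m ** 2 * (freq_hz / mode) ** 2

        # Example: a 60 kg/m, 120 m stay cable whose first-mode frequency is 1.1 Hz
        print(cable_tension_newtons(60.0, 120.0, 1.1))  # ~4.2e6 N

    In practice the frequency would be extracted in-network from acceleration spectra (for example by peak picking), which is what enables the reduction in wireless data transmission described above.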

  11. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells.

    Science.gov (United States)

    Ker, Dai Fei Elmer; Weiss, Lee E; Junkers, Silvina N; Chen, Mei; Yin, Zhaozheng; Sandbothe, Michael F; Huh, Seung-il; Eom, Sungeun; Bise, Ryoma; Osuna-Highley, Elvira; Kanade, Takeo; Campbell, Phil G

    2011-01-01

    Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and developing robotic cell

  12. Seismic monitoring of torrential and fluvial processes

    Science.gov (United States)

    Burtin, Arnaud; Hovius, Niels; Turowski, Jens M.

    2016-04-01

    In seismology, the signal is usually analysed for earthquake data, but earthquakes represent less than 1 % of continuous recording. The remaining data are considered as seismic noise and were for a long time ignored. Over the past decades, the analysis of seismic noise has constantly increased in popularity, and this has led to the development of new approaches and applications in geophysics. The study of continuous seismic records is now open to other disciplines, like geomorphology. The motion of mass at the Earth's surface generates seismic waves that are recorded by nearby seismometers and can be used to monitor mass transfer throughout the landscape. Surface processes vary in nature, mechanism, magnitude, space and time, and this variability can be observed in the seismic signals. This contribution gives an overview of the development and current opportunities for the seismic monitoring of geomorphic processes. We first describe the common principles of seismic signal monitoring and introduce time-frequency analysis for the purpose of identification and differentiation of surface processes. Second, we present techniques to detect, locate and quantify geomorphic events. Third, we review the diverse layout of seismic arrays and highlight their advantages and limitations for specific processes, like slope or channel activity. Finally, we illustrate all these characteristics with the analysis of seismic data acquired in a small debris-flow catchment where geomorphic events show interactions and feedbacks. Further developments must aim to fully understand the richness of the continuous seismic signals, to better quantify the geomorphic activity and to improve the performance of warning systems. Seismic monitoring may ultimately allow the continuous survey of erosion and transfer of sediments in the landscape on the scales of external forcing.

  13. Swab culture monitoring of automated endoscope reprocessors after high-level disinfection

    Institute of Scientific and Technical Information of China (English)

    Lung-Sheng Lu; Keng-Liang Wu; Yi-Chun Chiu; Ming-Tzung Lin; Tsung-Hui Hu; King-Wah Chiu

    2012-01-01

    AIM: To conduct a bacterial culture study for monitoring decontamination of automated endoscope reprocessors (AERs) after high-level disinfection (HLD). METHODS: From February 2006 to January 2011, the authors conducted randomized consecutive sampling each month for 7 AERs. The authors collected a total of 420 swab cultures, including 300 cultures from 5 gastroscope AERs and 120 cultures from 2 colonoscope AERs. Swab cultures were obtained from the residual water of the AERs after a full reprocessing cycle. Samples were cultured to test for aerobic bacteria, anaerobic bacteria, and Mycobacterium tuberculosis. RESULTS: The positive culture rate of the AERs was 2.0% (6/300) for gastroscope AERs and 0.8% (1/120) for colonoscope AERs. All the positive cultures, including 6 from gastroscope and 1 from colonoscope AERs, showed monofloral colonization. Of the gastroscope AER samples, 50% (3/6) were colonized by aerobic bacteria and 50% (3/6) by fungal contamination. CONCLUSION: A full reprocessing cycle of an AER with HLD is adequate for disinfection of the machine. Swab culture is a useful method for monitoring AER decontamination after each reprocessing cycle. Fungal contamination of AERs after reprocessing should also be kept in mind.

  14. Automated assay data processing and quality control: A review and recommendations

    International Nuclear Information System (INIS)

    Automated data processing and quality control of assays offers not only increased speed but also a more thorough and statistically rigorous analysis of results. This review outlines the motivations, statistical definitions, and mathematical methods pertinent to assay data processing. The presentation concentrates on basic concepts rather than specific mathematical formulae. The numerous automated calibration procedures are discussed and summarized in tabular form. A comprehensive view of data processing is offered which includes much more than simple calibration and interpolation. A small number of calculator and computer programs which provide an acceptably detailed statistical analysis of assays are recommended. Finally, possible future developments in hardware and software are discussed. (author)

  15. A Continuous Automated Vault Inventory System (CAVIS) for accountability monitoring of stored nuclear materials

    International Nuclear Information System (INIS)

    Nearly all facilities that store hazardous (radioactive or non-radioactive) materials must comply with prevailing federal, state, and local laws. These laws usually have components that require periodic physical inspections to ensure that all materials remain safely and securely stored. The inspections are generally labor intensive, slow, put personnel at risk, and only find anomalies after they have occurred. The system described in this paper was developed for monitoring stored nuclear materials resulting from weapons dismantlement, but its applications extend to any storage facility that meets the above criteria. The traditional special nuclear material (SNM) accountability programs currently used within most of the Department of Energy (DOE) complex require the physical entry of highly trained personnel into SNM storage vaults. This imposes the need for additional security measures, which typically mandate that extra security personnel be present while SNM inventories are performed. These requirements increase labor costs and put additional personnel at risk of radiation exposure. In some cases, individuals have received radiation exposure equivalent to the annual maximum during just one inventory verification. With increasing overhead costs, the current system is rapidly becoming too expensive to operate, and the need for an automated method of inventory verification is evident. The Continuous Automated Vault Inventory System (CAVIS) described in this paper was designed and prototyped as a low-cost, highly reliable, and user-friendly system capable of providing real-time weight, gamma, and neutron energy confirmation from each item stored in an SNM vault. This paper describes the sensor technologies, the CAVIS prototype system (built at Y-12 for highly enriched uranium storage), the technical requirements that must be achieved to assure successful implementation, and descriptions of sensor technologies needed for a plutonium facility

  16. ROLE AND PECULIARITIES OF PROJECT STREAM IN THE FIELD OF AUTOMATION OF BUSINESS-PROCESSES IN COMPANIES

    Directory of Open Access Journals (Sweden)

    Kovalenko A. V.

    2015-06-01

    Full Text Available For effective management of the economic and financial activity of a modern enterprise, information and software support is required for its separate components: branches, divisions and services. Automation of accounting significantly accelerates the process of providing, processing and analysing the information necessary for management purposes. An important aspect of introducing a software complex is the concept of its realization: the head of the company should define the way automation of business processes will develop, which type of introduction is preferable for the enterprise, and which documents will formalize each of the stages. The article generalizes data obtained through practical consideration of completed projects introducing automated systems in companies from various fields of activity. The main stages of the project direction in the sphere of automation of business processes are presented, together with the features of the subject, the characteristics of each stage, and the documentary objects for the realization of each of them. On the basis of the analysis carried out, the authors also describe a number of existing shortcomings in the realization of the project direction. In view of the data specified in the article, companies will be able to begin a project to automate their own business effectively and quickly.

  17. An automated fog monitoring system for the Indo-Gangetic Plains based on satellite measurements

    Science.gov (United States)

    Patil, Dinesh; Chourey, Reema; Rizvi, Sarwar; Singh, Manoj; Gautam, Ritesh

    2016-05-01

    Fog is a meteorological phenomenon that causes reduction in regional visibility and affects air quality, thus leading to various societal and economic implications, especially disrupting air and rail transportation. The persistent and widespread winter fog impacts the entire Indo-Gangetic Plains (IGP), as frequently observed in satellite imagery. The IGP is a densely populated region in south Asia, home to about one-sixth of the world's population, with a strong upward pollution trend. In this study, we have used multi-spectral radiances and aerosol/cloud retrievals from Terra/Aqua MODIS data to develop an automated web-based fog monitoring system over the IGP. Using our previous and existing methodologies, and ongoing algorithm development for the detection of fog and retrieval of associated microphysical properties (e.g. fog droplet effective radius), we characterize widespread fog detection during both daytime and nighttime. Specifically, for night-time fog detection, the algorithm employs a satellite-based bi-spectral brightness temperature difference technique between two spectral channels: MODIS band 22 (3.9 μm) and band 31 (10.75 μm). Further, we are extending our algorithm development to geostationary satellites to provide continuous monitoring of the spatial-temporal variation of fog. We anticipate that the ongoing and future development of a fog monitoring system will be of assistance to air, rail and vehicular transportation management, as well as for the dissemination of fog information to government agencies and the general public. The outputs of the fog detection algorithm and related aerosol/cloud parameters are operationally disseminated via http://fogsouthasia.com/.
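    The night-time test named in this record reduces, in essence, to thresholding the brightness temperature difference (BTD) between the 3.9 μm and 10.75 μm channels; the sketch below illustrates that logic with an assumed threshold value, which in practice is scene- and sensor-dependent.

        import numpy as np

        def night_fog_mask(bt_3p9_k, bt_10p75_k, btd_threshold_k=-2.0):
            """Flag probable fog/low-stratus pixels at night.

            Water droplets emit less efficiently at 3.9 um than at 10.75 um, so fog
            shows up as a negative brightness temperature difference at night.
            The threshold here is an assumed illustrative value.
            """
            btd = np.asarray(bt_3p9_k) - np.asarray(bt_10p75_k)
            return btd < btd_threshold_k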

  18. Acoustic Emission Based In-process Monitoring in Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas; Bissacco, Giuliano; De Chiffre, Leonardo;

    The applicability of acoustic emission (AE) measurements for in-process monitoring in the Robot Assisted Polishing (RAP) process was investigated. Surface roughness measurements require interruption of the process, proper surface cleaning and measurements that sometimes necessitate removal of the part from the machine tool. In this study, development of surface roughness during polishing rotational symmetric surfaces by the RAP process was inferred from AE measurements. An AE sensor was placed on a polishing tool, and a cylindrical rod of Vanadis 4E steel having an initial turned surface... improving the efficiency of the process. It also allows for intelligent process control and generally enhances the robustness and reliability of the automated RAP system in industrial applications.

  19. Automated Miniaturized Instrument for Space Biology Applications and the Monitoring of the Astronauts Health Onboard the ISS

    Science.gov (United States)

    Karouia, Fathi; Peyvan, Kia; Danley, David; Ricco, Antonio J.; Santos, Orlando; Pohorille, Andrew

    2011-01-01

    substantially by combining it with other technologies for automated, miniaturized, high-throughput biological measurements, such as fast sequencing, protein identification (proteomics) and metabolite profiling (metabolomics). Thus, the system can be integrated with other biomedical instruments in order to support and enhance telemedicine capability onboard ISS. NASA's mission includes sustained investment in critical research leading to effective countermeasures to minimize the risks associated with human spaceflight, and the use of appropriate technology to sustain space exploration at reasonable cost. Our integrated microarray technology is expected to fulfill these two critical requirements and to enable the scientific community to better understand and monitor the effects of the space environment on microorganisms and on the astronaut, in the process leveraging current capabilities and overcoming present limitations.

  20. Trend Analysis on the Automation of the Notebook PC Production Process

    Directory of Open Access Journals (Sweden)

    Chin-Ching Yeh

    2012-09-01

    Full Text Available Notebook PCs are among the Taiwanese electronic products that generate the highest production value and market share. According to ITRI IEK statistics, the domestic notebook PC production value in 2011 was about NT$2.3 trillion. Of about 200 million notebook PCs in global markets in 2011, Taiwan's notebook PC output accounted for more than 90%, meaning that nine out of every ten notebook PCs in the world were manufactured by Taiwanese companies. For such a large industry in terms of output value and quantity, the degree of automation in production processes is not high. This means either that there is still great room for automation of the notebook PC production process, or that the degree of automation of the production process cannot easily be enhanced. This paper presents an analysis of the situation.

  1. The Use of an Automated System (GreenFeed) to Monitor Enteric Methane and Carbon Dioxide Emissions from Ruminant Animals

    Science.gov (United States)

    Hristov, Alexander N.; Oh, Joonpyo; Giallongo, Fabio; Frederick, Tyler; Weeks, Holley; Zimmerman, Patrick R.; Harper, Michael T.; Hristova, Rada A.; Zimmerman, R. Scott; Branco, Antonio F.

    2015-01-01

    Ruminant animals (domesticated or wild) emit methane (CH4) through enteric fermentation in their digestive tract and from decomposition of manure during storage. These processes are the major sources of greenhouse gas (GHG) emissions from animal production systems. Techniques for measuring enteric CH4 vary from direct measurements (respiration chambers, which are highly accurate, but with limited applicability) to various indirect methods (sniffers, laser technology, which are practical, but with variable accuracy). The sulfur hexafluoride (SF6) tracer gas method is commonly used to measure enteric CH4 production by animal scientists and more recently, application of an Automated Head-Chamber System (AHCS) (GreenFeed, C-Lock, Inc., Rapid City, SD), which is the focus of this experiment, has been growing. AHCS is an automated system to monitor CH4 and carbon dioxide (CO2) mass fluxes from the breath of ruminant animals. In a typical AHCS operation, small quantities of baiting feed are dispensed to individual animals to lure them to AHCS multiple times daily. As the animal visits AHCS, a fan system pulls air past the animal’s muzzle into an intake manifold, and through an air collection pipe where continuous airflow rates are measured. A sub-sample of air is pumped out of the pipe into non-dispersive infra-red sensors for continuous measurement of CH4 and CO2 concentrations. Field comparisons of AHCS to respiration chambers or SF6 have demonstrated that AHCS produces repeatable and accurate CH4 emission results, provided that animal visits to AHCS are sufficient so emission estimates are representative of the diurnal rhythm of rumen gas production. Here, we demonstrate the use of AHCS to measure CO2 and CH4 fluxes from dairy cows given a control diet or a diet supplemented with technical-grade cashew nut shell liquid. PMID:26383886

  2. The Use of an Automated System (GreenFeed) to Monitor Enteric Methane and Carbon Dioxide Emissions from Ruminant Animals.

    Science.gov (United States)

    Hristov, Alexander N; Oh, Joonpyo; Giallongo, Fabio; Frederick, Tyler; Weeks, Holley; Zimmerman, Patrick R; Harper, Michael T; Hristova, Rada A; Zimmerman, R Scott; Branco, Antonio F

    2015-01-01

    Ruminant animals (domesticated or wild) emit methane (CH4) through enteric fermentation in their digestive tract and from decomposition of manure during storage. These processes are the major sources of greenhouse gas (GHG) emissions from animal production systems. Techniques for measuring enteric CH4 vary from direct measurements (respiration chambers, which are highly accurate, but with limited applicability) to various indirect methods (sniffers, laser technology, which are practical, but with variable accuracy). The sulfur hexafluoride (SF6) tracer gas method is commonly used to measure enteric CH4 production by animal scientists and more recently, application of an Automated Head-Chamber System (AHCS) (GreenFeed, C-Lock, Inc., Rapid City, SD), which is the focus of this experiment, has been growing. AHCS is an automated system to monitor CH4 and carbon dioxide (CO2) mass fluxes from the breath of ruminant animals. In a typical AHCS operation, small quantities of baiting feed are dispensed to individual animals to lure them to AHCS multiple times daily. As the animal visits AHCS, a fan system pulls air past the animal's muzzle into an intake manifold, and through an air collection pipe where continuous airflow rates are measured. A sub-sample of air is pumped out of the pipe into non-dispersive infra-red sensors for continuous measurement of CH4 and CO2 concentrations. Field comparisons of AHCS to respiration chambers or SF6 have demonstrated that AHCS produces repeatable and accurate CH4 emission results, provided that animal visits to AHCS are sufficient so emission estimates are representative of the diurnal rhythm of rumen gas production. Here, we demonstrate the use of AHCS to measure CO2 and CH4 fluxes from dairy cows given a control diet or a diet supplemented with technical-grade cashew nut shell liquid. PMID:26383886
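    The flux computation sketched in both records (airflow past the muzzle times background-corrected gas concentration) can be illustrated as follows; the function and unit choices are assumptions for illustration, not C-Lock's implementation.

        def gas_flux_g_per_day(airflow_l_s, sample_ppm, background_ppm,
                               molar_mass_g=16.0, molar_volume_l=24.0):
            """Mass flux of a breath gas from captured-air measurements.

            airflow_l_s: airflow through the collection pipe (L/s)
            sample_ppm / background_ppm: gas concentration in sample and ambient air
            molar_volume_l: approximate litres per mole at ambient conditions
            """
            gas_l_s = airflow_l_s * (sample_ppm - background_ppm) * 1e-6
            return gas_l_s / molar_volume_l * molar_mass_g * 86400.0  # g/day

        # Example: CH4 (16 g/mol), 30 L/s airflow, 25 ppm enrichment over background
        print(gas_flux_g_per_day(30.0, 30.0, 5.0))  # ~43 g CH4/day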

  3. Development of a fully automated network system for long-term health-care monitoring at home.

    Science.gov (United States)

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.

  4. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.; Yan, W. [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering

    1993-11-01

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods.

  5. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods

  6. Effect of Using Automated Auditing Tools on Detecting Compliance Failures in Unmanaged Processes

    Science.gov (United States)

    Doganata, Yurdaer; Curbera, Francisco

    The effect of using automated auditing tools to detect compliance failures in unmanaged business processes is investigated. In the absence of a process execution engine, compliance of an unmanaged business process is tracked by using an auditing tool developed based on business provenance technology or employing auditors. Since budget constraints limit employing auditors to evaluate all process instances, a methodology is devised to use both expert opinion on a limited set of process instances and the results produced by fallible automated audit machines on all process instances. An improvement factor is defined based on the average number of non-compliant process instances detected and it is shown that the improvement depends on the prevalence of non-compliance in the process as well as the sensitivity and the specificity of the audit machine.
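    The improvement factor described in this record rests on simple expected-value arithmetic over prevalence, sensitivity and specificity; the sketch below reconstructs that arithmetic from the abstract's description and is not taken from the paper.

        def expected_true_detections(n_instances, prevalence, sensitivity):
            """Non-compliant instances an imperfect audit machine should flag correctly."""
            return n_instances * prevalence * sensitivity

        def expected_false_alarms(n_instances, prevalence, specificity):
            """Compliant instances incorrectly flagged, which human auditors must triage."""
            return n_instances * (1.0 - prevalence) * (1.0 - specificity)

        # Example: 10,000 instances, 2% non-compliance, sensitivity 0.8, specificity 0.95
        print(expected_true_detections(10000, 0.02, 0.80))  # 160.0
        print(expected_false_alarms(10000, 0.02, 0.95))     # 490.0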

  7. The use of process simulation models in virtual commissioning of process automation software in drinking water treatment plants

    NARCIS (Netherlands)

    Worm, G.I.M.; Kelderman, J.P.; Lapikas, T.; Van der Helm, A.W.C.; Van Schagen, K.M.; Rietveld, L.C.

    2012-01-01

    This research deals with the contribution of process simulation models to the factory acceptance test (FAT) of process automation (PA) software of drinking water treatment plants. Two test teams tested the same piece of modified PA-software. One team used an advanced virtual commissioning (AVC) syst

  8. Monitoring of an antigen manufacturing process.

    Science.gov (United States)

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih

    2016-06-01

    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated employing principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Then, partial least squares (PLS) regression was employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content measured by a Kjeldahl test on these samples. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could be further used as a method to predict the final protein content.
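    A minimal sketch of the chemometric workflow this record describes (PCA to expose sources of variability, PLS to regress spectra onto protein content), using scikit-learn; the array shapes, component counts and placeholder data are assumptions.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression

        # X: fluorescence spectra (n_samples x n_wavelengths); y: Kjeldahl protein content
        rng = np.random.default_rng(0)
        X = rng.random((40, 200))  # placeholder spectra
        y = rng.random((40, 1))    # placeholder protein content

        scores = PCA(n_components=3).fit_transform(X)  # inspect process variability

        pls = PLSRegression(n_components=5).fit(X, y)  # calibrate spectra vs. protein
        y_pred = pls.predict(X)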

  9. Signal Processing for Beam Position Monitors

    CERN Document Server

    Vismara, Giuseppe

    2000-01-01

    At first sight, the problem of determining the beam position from the ratio of the induced charges on the opposite electrodes of a beam monitor seems trivial, but up to now no unique solution has been found that fits the various demands of all particle accelerators. The purpose of this paper is to help "instrumentalists" choose the best processing system for their particular application, depending on the machine size, the input dynamic range, the required resolution and the acquisition speed. After a general introduction and an analysis of the electrical signals to be treated (frequency and time domain), the definition of the electronic specifications will be reviewed. The tutorial will present the different families in which the processing systems can be grouped. A general description of the operating principles, with relative advantages and disadvantages, is presented for the most widely employed processing systems. Special emphasis will be put on recent technological developments based on telecommunication circ...

  10. Implications of critical chain methodology for business process flexible automation projects in economic organizations

    OpenAIRE

    Paul BRUDARU

    2009-01-01

    Flexible business process automation projects involve the use of methods and technologies from the Business Process Management (BPM) area that aim at increasing the agility of organizations in changing their business processes as a response to environmental changes. BPM-type projects are a mix between process improvement projects and software development, which implies a high complexity in managing them. The successful implementation of these projects involves overcoming problems inherent as delay...

  11. The Multi-Isotope Process (MIP) Monitor Project: FY12 Progress and Accomplishments

    Energy Technology Data Exchange (ETDEWEB)

    Coble, Jamie B.; Orton, Christopher R.; Jordan, David V.; Schwantes, Jon M.; Bender, Sarah; Dayman, Kenneth J.; Unlu, Kenan; Landsberger, Sheldon

    2012-09-27

    The Multi-Isotope Process (MIP) Monitor, being developed at Pacific Northwest National Laboratory (PNNL), provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of "...(minimization of) the risks of nuclear proliferation and terrorism." The MIP Monitor measures distributions of a suite of indicator (radioactive) isotopes present within product and waste streams of a nuclear reprocessing facility. These indicator isotopes are monitored on-line by gamma spectrometry and compared, in near-real-time, to spectral patterns representing "normal" process conditions using multivariate pattern recognition software. The monitor utilizes this multivariate analysis and gamma spectroscopy of reprocessing streams to detect small changes in the gamma spectrum, which may indicate changes in process conditions. Multivariate analysis methods common in chemometrics, such as principal component analysis (PCA) and partial least squares regression (PLS), act as pattern recognition techniques, which can detect small deviations from the expected, nominal condition. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting. Development of the MIP Monitor approach continues to evaluate the efficacy of the monitor for automated, real-time or near-real-time application. This report details follow-on research and development efforts sponsored by the U.S. Department of Energy Fuel Cycle Research and Development related to the MIP Monitor for fiscal year
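    Pattern-recognition monitoring of the kind described, flagging small deviations of a gamma spectrum from nominal process conditions, is often implemented as a PCA model with a Hotelling T2 control statistic; the sketch below is a generic illustration of that approach, not PNNL's implementation.

        import numpy as np
        from sklearn.decomposition import PCA

        def fit_monitor(nominal_spectra, n_components=4):
            """Fit a PCA model of 'normal' gamma spectra and its score variances."""
            pca = PCA(n_components=n_components).fit(nominal_spectra)
            score_var = pca.transform(nominal_spectra).var(axis=0)
            return pca, score_var

        def hotelling_t2(pca, score_var, spectrum):
            """T2 statistic for a new spectrum; large values flag off-normal conditions."""
            t = pca.transform(spectrum.reshape(1, -1))[0]
            return float(np.sum(t ** 2 / score_var))

        # A control limit would be set from the nominal-run T2 distribution
        # (e.g., an F-distribution quantile); the threshold is application-specific.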

  12. Automated gas chromatography

    Science.gov (United States)

    Mowry, Curtis D.; Blair, Dianna S.; Rodacy, Philip J.; Reber, Stephen D.

    1999-01-01

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute.

  13. Synthesis of many different types of organic small molecules using one automated process.

    Science.gov (United States)

    Li, Junqi; Ballmer, Steven G; Gillis, Eric P; Fujii, Seiko; Schmidt, Michael J; Palazzolo, Andrea M E; Lehmann, Jonathan W; Morehouse, Greg F; Burke, Martin D

    2015-03-13

    Small-molecule synthesis usually relies on procedures that are highly customized for each target. A broadly applicable automated process could greatly increase the accessibility of this class of compounds to enable investigations of their practical potential. Here we report the synthesis of 14 distinct classes of small molecules using the same fully automated process. This was achieved by strategically expanding the scope of a building block-based synthesis platform to include even C(sp3)-rich polycyclic natural product frameworks and discovering a catch-and-release chromatographic purification protocol applicable to all of the corresponding intermediates. With thousands of compatible building blocks already commercially available, many small molecules are now accessible with this platform. More broadly, these findings illuminate an actionable roadmap to a more general and automated approach for small-molecule synthesis.

  14. The feasibility of automated online flow cytometry for in-situ monitoring of microbial dynamics in aquatic ecosystems

    Directory of Open Access Journals (Sweden)

    Michael Domenic Besmer

    2014-06-01

    Full Text Available Fluorescent staining coupled with flow cytometry (FCM) is often used for the monitoring, quantification and characterization of bacteria in engineered and environmental aquatic ecosystems including seawater, freshwater, drinking water, wastewater, and industrial bioreactors. However, infrequent grab sampling hampers accurate characterization and subsequent understanding of microbial dynamics in all of these ecosystems. A logical technological progression is high-throughput and full automation of the sampling, staining, measurement, and data analysis steps. Here we assess the feasibility and applicability of automated FCM by means of actual data sets produced with prototype instrumentation. As proof-of-concept we demonstrate examples of microbial dynamics in (i) flowing tap water from a municipal drinking water supply network and (ii) river water from a small creek subject to two rainfall events. In both cases, automated measurements were done at 15-min intervals during 12 to 14 consecutive days, yielding more than 1000 individual data points for each ecosystem. The extensive data sets derived from the automated measurements allowed for the establishment of baseline data for each ecosystem, as well as for the recognition of daily variations and specific events that would most likely be missed (or mischaracterized) by infrequent sampling. In addition, the online FCM data from the river water was combined and correlated with online measurements of abiotic parameters, showing considerable potential for a better understanding of cause-and-effect relationships in aquatic ecosystems. Although several challenges remain, the successful operation of an automated online FCM system and the basic interpretation of the resulting data sets represent a breakthrough towards the eventual establishment of fully automated online microbiological monitoring technologies.
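    Turning such a 15-minute time series into a baseline with flagged events can be done with a simple rolling statistic; the pandas sketch below (window and threshold values assumed) illustrates the kind of screening these data sets permit.

        import pandas as pd

        def flag_events(counts: pd.Series, window=96, k=3.0):
            """Flag samples deviating strongly from a rolling baseline.

            counts: cell concentration at 15-min spacing; window=96 samples ~ 24 h.
            Uses the median absolute deviation (MAD) as a robust spread estimate.
            """
            roll = counts.rolling(window, center=True, min_periods=window // 2)
            baseline = roll.median()
            mad = (counts - baseline).abs().rolling(window, center=True,
                                                    min_periods=window // 2).median()
            return (counts - baseline).abs() > k * 1.4826 * mad  # MAD-to-sigma factor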

  15. Automating the Object-Oriented Software Development Process: Workshop Report

    NARCIS (Netherlands)

    Aksit, Mehmet; Tekinerdogan, Bedir

    1998-01-01

    Cost-effective realization of robust, adaptable and reusable software systems demands efficient and effective management of the overall software production process. Current object-oriented methods are not completely formalized and lack the ability of reasoning about the quality of processes and soft

  16. An automated method to quantify microglia morphology and application to monitor activation state longitudinally in vivo.

    Directory of Open Access Journals (Sweden)

    Cleopatra Kozlowski

    Full Text Available Microglia are specialized immune cells of the brain. Upon insult, microglia initiate a cascade of cellular responses including a characteristic change in cell morphology. To study the dynamics of microglia immune response in situ, we developed an automated image analysis method that enables the quantitative assessment of microglia activation state within tissue based solely on cell morphology. Per cell morphometric analysis of fluorescently labeled microglia is achieved through local iterative threshold segmentation, which reduces errors caused by signal-to-noise variation across large volumes. We demonstrate, utilizing systemic application of lipopolysaccharide as a model of immune challenge, that several morphological parameters, including cell perimeter length, cell roundness and soma size, quantitatively distinguish resting versus activated populations of microglia within tissue comparable to traditional immunohistochemistry methods. Furthermore, we provide proof-of-concept data that monitoring soma size enables the longitudinal assessment of microglia activation in the mouse neocortex imaged via 2-photon in vivo microscopy. The ability to quantify microglia activation automatically by shape alone allows unbiased and rapid analysis of both fixed and in vivo central nervous system tissue.
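    The morphometric parameters this record relies on (perimeter, roundness, soma size) map directly onto standard region properties; the scikit-image sketch below shows the per-cell measurement step, with the paper's local iterative thresholding replaced by an already-segmented binary mask for brevity.

        import numpy as np
        from skimage import measure

        def cell_morphometrics(binary_mask):
            """Per-cell area, perimeter and roundness from a segmented microglia image."""
            results = []
            for region in measure.regionprops(measure.label(binary_mask)):
                if region.perimeter == 0:  # skip degenerate one-pixel regions
                    continue
                roundness = 4.0 * np.pi * region.area / region.perimeter ** 2  # 1.0 = circle
                results.append({"area": region.area,
                                "perimeter": region.perimeter,
                                "roundness": roundness})
            return results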

  17. ConfocalCheck--a software tool for the automated monitoring of confocal microscope performance.

    Directory of Open Access Journals (Sweden)

    Keng Imm Hng

    Full Text Available Laser scanning confocal microscopy has become an invaluable tool in biomedical research, but regular quality testing is vital to maintain the system's performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope, like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web-browser-compatible file format, the software greatly simplifies record keeping, allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and show how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments.

  18. Utility of an Automated Thermal-Based Approach for Monitoring Evapotranspiration

    Directory of Open Access Journals (Sweden)

    Timmermans Wim J.

    2015-12-01

    Full Text Available A very simple remote sensing-based model for water use monitoring is presented. The model acronym DATTUTDUT (Deriving Atmosphere Turbulent Transport Useful To Dummies Using Temperature) is a Dutch word which loosely translates as "it's unbelievable that it works". DATTUTDUT is fully automated and only requires a surface temperature map, making it simple to use and providing a rapid estimate of spatially-distributed fluxes. The algorithm is first tested over a range of environmental and land-cover conditions using data from four short-term field experiments and then evaluated over a growing season in an agricultural region. Flux model output is in satisfactory agreement with observations and established remote sensing-based models, except under dry and partial canopy cover conditions. This suggests that DATTUTDUT has utility in identifying relative water use and as an operational tool providing initial estimates of ET anomalies in data-poor regions, which would then be confirmed using more robust modeling techniques.
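
    The heart of the algorithm is a linear scaling of each pixel's temperature between scene end-members. A minimal sketch of that idea (the percentile-based end-member choice is an assumption of this illustration, not necessarily the published configuration):

        import numpy as np

        def evaporative_fraction(t_surf):
            """Scale surface temperature linearly between scene end-members:
            EF ~ (T_hot - T) / (T_hot - T_cold)."""
            t_cold = np.percentile(t_surf, 0.5)    # assumed cold end-member
            t_hot = np.percentile(t_surf, 99.5)    # assumed hot end-member
            ef = (t_hot - t_surf) / (t_hot - t_cold)
            return np.clip(ef, 0.0, 1.0)

        t_map = np.random.uniform(295.0, 325.0, (100, 100))   # synthetic map [K]
        print(evaporative_fraction(t_map).mean())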

  19. The Continuous Monitoring of Flash Flood Velocity Field based on an Automated LSPIV System

    Science.gov (United States)

    Li, W.; Ran, Q.; Liao, Q.

    2014-12-01

    Large-scale particle image velocimetry (LSPIV) is a non-intrusive tool for flow velocity field measurement that has advantages over traditional techniques, with applications on rivers, lakes and oceans, especially under extreme conditions. An automated LSPIV system is presented in this study, which can be easily set up and executed for continuous monitoring of flash floods. The experiment site is Longchi village, Sichuan Province, where a magnitude-8.0 earthquake occurred in 2008 and debris flows have happened every year since then. The area of interest is about 30 m * 40 m of a channel which has been heavily damaged by debris flow. A series of videos obtained during the flood season indicates that floods break out after rainstorms and last just several hours. Measurement is completed without being influenced by these extreme weather conditions, and results are reliable and accurate due to the high sediment concentration. Compared with direct measurement by an impellor flow meter, we validated that LSPIV works well on mountain streams, with an Average Relative Error of 6.7% and a Nash-Sutcliffe Coefficient of 95%. On Jun 26, the maximum flood surface velocity reached 4.26 m/s, and the discharge was also determined based on the velocity-area method. Overall, this system is safe, non-contact and can be flexibly adjusted to requirements. It yields valuable flood data that were previously scarce, which will make a great contribution to the analysis of flood and debris flow mechanisms.
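
    The two validation statistics quoted above are straightforward to reproduce; a sketch with made-up velocity values:

        import numpy as np

        def average_relative_error(obs, sim):
            return np.mean(np.abs((sim - obs) / obs))

        def nash_sutcliffe(obs, sim):
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        obs = np.array([1.8, 2.4, 3.9, 4.26, 3.1])   # flow-meter velocities [m/s], made up
        sim = np.array([1.7, 2.6, 3.8, 4.10, 3.2])   # LSPIV estimates [m/s], made up
        print(average_relative_error(obs, sim), nash_sutcliffe(obs, sim))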

  20. Automating Measurement for Software Process Models using Attribute Grammar Rules

    Directory of Open Access Journals (Sweden)

    Abdul Azim Abd. Ghani

    2007-08-01

    Full Text Available The modelling concept is well accepted in the software engineering discipline. Some software models are built either to control the development stages, to measure program quality, or to serve as a medium that gives a better understanding of the actual software systems. Software process modelling has nowadays reached a level that allows software designs to be transformed into programming languages, such as architecture design languages and the unified modelling language. This paper describes the adaptation of an attribute grammar approach to measuring software process models. A tool, called the Software Process Measurement Application, was developed to enable measurement according to specified attribute grammar rules. A context-free grammar to read the process model was derived from the IDEF3 standard, and rules were attached to enable the calculation of measurement metrics. The measurement metric values collected were used to aid in determining the decomposition and structuring of processes for the proposed software systems.

  1. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    Science.gov (United States)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated to improve efficiency and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.

  2. Secure VM for Monitoring Industrial Process Controllers

    Energy Technology Data Exchange (ETDEWEB)

    Dasgupta, Dipankar [ORNL; Ali, Mohammad Hassan [University of Memphis; Abercrombie, Robert K [ORNL; Schlicher, Bob G [ORNL; Sheldon, Frederick T [ORNL; Carvalho, Marco [Institute of Human and Machine Cognition

    2011-01-01

    In this paper, we examine the biological immune system as an autonomic system for self-protection, which has evolved over millions of years, probably through an extensive process of redesigning, testing, tuning and optimization. The powerful information processing capabilities of the immune system, such as feature extraction, pattern recognition, learning, memory, and its distributive nature, provide rich metaphors for its artificial counterpart. Our study focuses on building an autonomic defense system, using immunological metaphors for information gathering, analysis, decision making, and launching of threat and attack responses. In order to detect Stuxnet-like malware, we propose to add a secure VM (or dedicated host) to the SCADA network to monitor behavior and all software updates. This on-going research effort is not to mimic nature but to explore and learn valuable lessons useful for self-adaptive cyber defense systems.

  3. The monitoring and control of TRUEX processes

    Energy Technology Data Exchange (ETDEWEB)

    Regalbuto, M.C.; Misra, B.; Chamberlain, D.B.; Leonard, R.A.; Vandegrift, G.F.

    1992-04-01

    The Generic TRUEX Model (GTM) was used to design a flowsheet for the TRUEX solvent extraction process that would be used to determine its instrumentation and control requirements. Sensitivity analyses of the key process variables, namely, the aqueous and organic flow rates, feed compositions, and the number of contactor stages, were carried out to assess their impact on the operation of the TRUEX process. Results of these analyses provide a basis for the selection of an instrument and control system and the eventual implementation of a control algorithm. Volume Two of this report is an evaluation of the instruments available for measuring many of the physical parameters. Equations that model the dynamic behavior of the TRUEX process have been generated. These equations can be used to describe the transient or dynamic behavior of the process for a given flowsheet in accordance with the TRUEX model. Further work will be done with the dynamic model to determine how and how quickly the system responds to various perturbations. The use of perturbation analysis early in the design stage will lead to a robust flowsheet, namely, one that will meet all process goals and allow for wide control bounds. The process time delay, that is, the speed with which the system reaches a new steady state, is an important parameter in monitoring and controlling a process. In the future, instrument selection and point-of-variable measurement, now done using the steady-state results reported here, will be reviewed and modified as necessary based on this dynamic method of analysis.

  4. The monitoring and control of TRUEX processes

    International Nuclear Information System (INIS)

    The Generic TRUEX Model (GTM) was used to design a flowsheet for the TRUEX solvent extraction process that would be used to determine its instrumentation and control requirements. Sensitivity analyses of the key process variables, namely, the aqueous and organic flow rates, feed compositions, and the number of contactor stages, were carried out to assess their impact on the operation of the TRUEX process. Results of these analyses provide a basis for the selection of an instrument and control system and the eventual implementation of a control algorithm. Volume Two of this report is an evaluation of the instruments available for measuring many of the physical parameters. Equations that model the dynamic behavior of the TRUEX process have been generated. These equations can be used to describe the transient or dynamic behavior of the process for a given flowsheet in accordance with the TRUEX model. Further work will be done with the dynamic model to determine how and how quickly the system responds to various perturbations. The use of perturbation analysis early in the design stage will lead to a robust flowsheet, namely, one that will meet all process goals and allow for wide control bounds. The process time delay, that is, the speed with which the system reaches a new steady state, is an important parameter in monitoring and controlling a process. In the future, instrument selection and point-of-variable measurement, now done using the steady-state results reported here, will be reviewed and modified as necessary based on this dynamic method of analysis

  5. An Improvement in Thermal Modelling of Automated Tape Placement Process

    International Nuclear Information System (INIS)

    The thermoplastic tape placement process offers the possibility of manufacturing large laminated composite parts with all kinds of geometries (e.g., doubly curved). This process is based on the fusion bonding of a thermoplastic tape on a substrate. It has received growing interest in recent years because of its non-autoclave capabilities. In order to control and optimize the quality of the manufactured part, we need to predict the temperature field throughout the processing of the laminate. In this work, we focus on a thermal model of this process which takes into account the imperfect bonding existing between the different layers of the substrate by introducing thermal contact resistances into the model. This study builds on experimental results which show that the value of the thermal resistance evolves with the temperature and pressure applied on the material.

  6. An Improvement in Thermal Modelling of Automated Tape Placement Process

    Science.gov (United States)

    Barasinski, Anaïs; Leygue, Adrien; Soccard, Eric; Poitou, Arnaud

    2011-01-01

    The thermoplastic tape placement process offers the possibility of manufacturing large laminated composite parts with all kinds of geometries (e.g., doubly curved). This process is based on the fusion bonding of a thermoplastic tape on a substrate. It has received growing interest in recent years because of its non-autoclave capabilities. In order to control and optimize the quality of the manufactured part, we need to predict the temperature field throughout the processing of the laminate. In this work, we focus on a thermal model of this process which takes into account the imperfect bonding existing between the different layers of the substrate by introducing thermal contact resistances into the model. This study builds on experimental results which show that the value of the thermal resistance evolves with the temperature and pressure applied on the material.

  7. MO-G-BRE-03: Automated Continuous Monitoring of Patient Setup with Second-Check Independent Image Registration

    International Nuclear Information System (INIS)

    Purpose: To create a non-supervised quality assurance program to monitor image-based patient setup. The system acts as a secondary check by independently computing shifts and rotations, interfaces with Varian's database to verify the therapist's work, and warns against sub-optimal setups. Methods: Temporary digitally-reconstructed radiographs (DRRs) and OBI radiographic image files created by Varian's treatment console during patient setup are intercepted and used as input to an independent registration module, customized for accuracy, that determines the optimal rotations and shifts. To deal with the poor quality of OBI images, a histogram equalization of the live images to their DRR counterparts is performed as a pre-processing step. A search for the most sensitive metric was performed by plotting search spaces subject to various translations, and convergence analysis was applied to ensure the optimizer finds the global minimum. The final system configuration uses the NCC metric with 150 histogram bins and a one-plus-one optimizer running for 2000 iterations, with customized scales for translations and rotations, in a multi-stage optimization setup that first corrects translations and subsequently rotations. Results: The system was installed clinically to monitor and provide almost real-time feedback on patient positioning. Over a 2-month period, uncorrected pitch values had a mean of 0.016° with a standard deviation of 1.692°, and couch rotations were −0.090° ± 1.547°. The couch shifts were −0.157 ± 0.466 cm vertically, 0.045 ± 0.286 cm laterally and 0.084 ± 0.501 cm longitudinally. Uncorrected pitch angles were the most common source of discrepancies. Large variations in the pitch angles were correlated with patient motion inside the mask. Conclusion: A system for automated quality assurance of the therapist's registration was designed and tested in clinical practice. The approach complements the clinical software's automated registration in
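
    A comparable 2D DRR-to-OBI registration can be sketched with SimpleITK, mirroring the configuration named above (histogram matching as pre-processing, a correlation metric, a one-plus-one evolutionary optimizer with 2000 iterations). This is an illustration under assumed file names, not the clinical system's code:

        import SimpleITK as sitk

        fixed = sitk.ReadImage("drr.mha", sitk.sitkFloat32)        # hypothetical DRR
        moving = sitk.ReadImage("obi_live.mha", sitk.sitkFloat32)  # hypothetical OBI image

        # Pre-processing: match the live image's histogram to the DRR.
        moving = sitk.HistogramMatching(moving, fixed)

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsCorrelation()                               # normalized cross-correlation
        reg.SetOptimizerAsOnePlusOneEvolutionary(numberOfIterations=2000)
        reg.SetOptimizerScalesFromPhysicalShift()
        reg.SetInitialTransform(
            sitk.CenteredTransformInitializer(
                fixed, moving, sitk.Euler2DTransform(),
                sitk.CenteredTransformInitializerFilter.GEOMETRY),
            inPlace=False)
        reg.SetInterpolator(sitk.sitkLinear)

        tx = reg.Execute(fixed, moving)
        print(tx.GetParameters())   # rotation [rad] and shifts, for the second check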

  8. MIR-ATR sensor for process monitoring

    International Nuclear Information System (INIS)

    A mid-infrared attenuated total reflectance (MIR-ATR) sensor has been developed for chemical reaction monitoring. The optical setup of the compact and low-priced sensor consists of an IR emitter as light source, a zinc selenide (ZnSe) ATR prism as boundary to the process, and four thermopile detectors, each equipped with an optical bandpass filter. The practical applicability was tested during esterification of ethanol and formic acid to ethyl formate and water as a model reaction with subsequent distillation. For reference analysis, a Fourier transform mid-infrared (FT-MIR) spectrometer with diamond ATR module was applied. On-line measurements using the MIR-ATR sensor and the FT-MIR spectrometer were performed in a bypass loop. The sensor was calibrated by multiple linear regression in order to link the measured absorbance in the four optical channels to the analyte concentrations. The analytical potential of the MIR-ATR sensor was demonstrated by simultaneous real-time monitoring of all four chemical substances involved in the esterification and distillation process. The temporal courses of the sensor signals are in accordance with the concentration values achieved by the commercial FT-MIR spectrometer. The standard errors of prediction for ethanol, formic acid, ethyl formate, and water were 0.38 mol L−1, 0.48 mol L−1, 0.38 mol L−1, and 1.12 mol L−1, respectively. A procedure based on MIR spectra is presented to simulate the response characteristics of the sensor if the transmission ranges of the filters are varied. Using this tool, analyte-specific bandpass filters for a particular chemical reaction can be identified. By exchanging the optical filters, the sensor can be adapted to a wide range of processes in the chemical, pharmaceutical, and beverage industries. (paper)
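
    The multiple-linear-regression calibration step amounts to an ordinary least-squares fit from the four channel absorbances to each analyte concentration. A sketch with synthetic data (the sensitivity matrix is made up):

        import numpy as np

        # Synthetic calibration set: rows = samples, columns = 4 optical channels.
        A = np.random.rand(30, 4)                      # absorbances, made up
        K = np.array([[1.2, 0.1, 0.3, 0.05],           # made-up channel sensitivities
                      [0.2, 1.1, 0.1, 0.40],
                      [0.3, 0.2, 0.9, 0.10],
                      [0.1, 0.4, 0.2, 1.30]])
        C = A @ K.T                                    # "known" concentrations [mol/L]

        # Calibrate: least-squares coefficients mapping absorbance -> concentration.
        coef, *_ = np.linalg.lstsq(A, C, rcond=None)

        # Predict all four concentrations for a new absorbance vector.
        a_new = np.array([0.31, 0.42, 0.18, 0.25])
        print(a_new @ coef)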

  9. Performance of Three Mode-Meter Block-Processing Algorithms for Automated Dynamic Stability Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Trudnowski, Daniel J.; Pierre, John W.; Zhou, Ning; Hauer, John F.; Parashar, Manu

    2008-05-31

    The frequency and damping of electromechanical modes offer considerable insight into the dynamic stability properties of a power system. The performance properties of three block-processing algorithms are demonstrated and examined from the perspective of near real-time automated stability assessment. The algorithms are: the extended modified Yule Walker (YW); the extended modified Yule Walker with Spectral analysis (YWS); and the numerical state-space subspace system identification (N4SID) algorithm. The YW and N4SID have been introduced in previous publications, while the YWS is introduced here. Issues addressed include: stability assessment requirements; automated selection of subsets of identified modes; using the algorithms in an automated format; data assumptions and quality; and expected algorithm estimation performance.
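
    The general flavor of such block-processing mode estimation can be illustrated with a generic least-squares AR fit whose discrete-time poles are converted to modal frequency and damping; this is a textbook sketch, not the YW, YWS or N4SID implementations examined in the paper:

        import numpy as np

        def ar_modes(y, order, dt):
            """Fit y[n] = sum_k a_k y[n-k] by least squares; map poles to modes."""
            N = len(y)
            X = np.column_stack([y[order - k - 1:N - k - 1] for k in range(order)])
            a, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
            poles = np.roots(np.concatenate(([1.0], -a)))   # z-plane poles
            s = np.log(poles) / dt                          # continuous-time poles
            freq = np.abs(s.imag) / (2 * np.pi)             # modal frequency [Hz]
            damping = -s.real / np.abs(s)                   # damping ratio
            return freq, damping

        # Synthetic ringdown: a 0.4 Hz mode with 5% damping, sampled at 10 Hz.
        dt = 0.1
        t = np.arange(0, 60, dt)
        wn = 2 * np.pi * 0.4
        y = np.exp(-0.05 * wn * t) * np.cos(wn * np.sqrt(1 - 0.05 ** 2) * t)
        print(ar_modes(y, order=4, dt=dt))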

  10. 19 CFR 24.25 - Statement processing and Automated Clearinghouse.

    Science.gov (United States)

    2010-04-01

    ... of entry, but not a Saturday, Sunday or holiday). (2) Customs shall provide a preliminary statement... statement. This presentation must be made within 10 working days after entry of the merchandise. If a filer... processing within 10 working days after the date of entry. It is the responsibility of the ABI filer...

  11. Automation System in Rare Earths Countercurrent Extraction Processes

    Institute of Scientific and Technical Information of China (English)

    贾江涛; 严纯华; 廖春生; 吴声; 王明文; 李标国

    2001-01-01

    Based on countercurrent extraction theory for the optimized design and simulation of rare earth separation processes, the selection of detecting points (stages) and on-line analysis for elements, the simulation of open-loop response and its response speed, and the diagnosis and regulative prescriptions for running the solvent extraction cascades were studied.

  12. Automated Braille production from word-processed documents.

    Science.gov (United States)

    Blenkhorn, P; Evans, G

    2001-03-01

    This paper describes a novel method for automatically generating Braille documents from word-processed (Microsoft Word) documents. In particular it details how, by using the Word Object Model, the translation system can map the layout information (format) in the print document into an appropriate Braille equivalent.
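
    The general idea of reading layout information through the Word Object Model can be sketched with the pywin32 COM bindings; the mapping rules of the actual translation system are not reproduced here, and the document path is hypothetical:

        import win32com.client

        word = win32com.client.Dispatch("Word.Application")
        doc = word.Documents.Open(r"C:\docs\sample.docx")   # hypothetical path

        for para in doc.Paragraphs:
            rng = para.Range
            # Layout information exposed by the object model - the inputs a
            # Braille formatter would map to Braille formatting conventions.
            print(rng.ParagraphFormat.Alignment,
                  rng.Font.Bold,
                  para.Style.NameLocal,
                  rng.Text.strip())

        doc.Close(False)
        word.Quit()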

  13. Emergency healthcare process automation using mobile computing and cloud services.

    Science.gov (United States)

    Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G

    2012-10-01

    Emergency care is basically concerned with the provision of pre-hospital and in-hospital medical and/or paramedical services, and it typically involves a wide variety of interdependent and distributed activities that can be interconnected to form emergency care processes within and between Emergency Medical Service (EMS) agencies and hospitals. Hence, in developing an information system for emergency care processes, it is essential to support individual process activities and to satisfy collaboration and coordination needs by providing ready access to patient and operational information regardless of location and time. Filling this information gap by enabling the provision of the right information, to the right people, at the right time poses new challenges, including the specification of a common information format, the interoperability among heterogeneous institutional information systems, or the development of new, ubiquitous trans-institutional systems. This paper is concerned with the development of integrated computer support for emergency care processes by evolving and cross-linking institutional healthcare systems. To this end, an integrated EMS cloud-based architecture has been developed that allows authorized users to access emergency case information in standardized document form, as proposed by the Integrating the Healthcare Enterprise (IHE) profile, uses the Organization for the Advancement of Structured Information Standards (OASIS) standard Emergency Data Exchange Language (EDXL) Hospital Availability Exchange (HAVE) for exchanging operational data with hospitals, and incorporates an intelligent module that supports triaging and selecting the most appropriate ambulances and hospitals for each case. PMID:22205383

  14. Lyophilization: a useful approach to the automation of analytical processes?

    OpenAIRE

    de Castro, M. D. Luque; Izquierdo, A

    1990-01-01

    An overview of the state-of-the-art in the use of lyophilization for the pretreatment of samples and standards prior to their storage and/or preconcentration is presented. The different analytical applications of this process are dealt with according to the type of material (reagent, standard, samples) and matrix involved.

  15. What do information reuse and automated processing require in engineering design? Semantic process

    Directory of Open Access Journals (Sweden)

    Ossi Nykänen

    2011-12-01

    Full Text Available Purpose: The purpose of this study is to characterize, analyze, and demonstrate a machine-understandable semantic process for validating, integrating, and processing technical design information. This establishes both a vision and tools for information reuse and semi-automatic processing in engineering design projects, including virtual machine laboratory applications with generated components. Design/methodology/approach: The process model has been developed iteratively in terms of action research, constrained by the existing technical design practices and assumptions (design documents, expert feedback), available technologies (pre-studies and experiments with scripting and pipeline tools), benchmarking with other process models and methods (notably the RUP and DITA), and formal requirements (computability and the critical information paths for the generated applications). In practice, the work includes both quantitative and qualitative components. Findings: Technical design processes may be greatly enhanced in terms of semantic process thinking, by enriching design information and by automating information validation and transformation tasks. Contemporary design information, however, is mainly intended for human consumption, and needs to be explicitly enriched with the currently missing data and interfaces. In practice, this may require acknowledging the role of a technical information or knowledge engineer, to lead the development of the semantic design information process in a design organization. There is also a trade-off between machine-readability and system complexity that needs to be studied further, both empirically and in theory. Research limitations/implications: The conceptualization of the semantic process is essentially an abstraction based on the idea of progressive design. While this effectively allows implementing semantic processes with, e.g., pipeline technologies, the abstraction is valid only when technical design is organized into

  16. Building Automation Systems.

    Science.gov (United States)

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  17. Measures and mechanisms for process monitoring in evolving business networks

    OpenAIRE

    Comuzzi, M.; Vonk, J.; Grefen, P.

    2012-01-01

    The literature on monitoring of cross-organizational processes, executed within business networks, considers monitoring only in the network formation phase, since network establishment determines what can be monitored during process execution. In particular, the impact of evolution in such networks on monitoring is not considered. When a business network evolves, e.g. contracts are introduced, updated, or dropped, or actors join or leave the network, the monitoring requirements of the network...

  18. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules: Final Subcontract Report, April 1998 - April 2002

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Miller, D. C.; Moore, S. B.; Hogan, S. J.

    2002-11-01

    This report describes the automated systems developed for PV module assembly and testing processes after lamination. These processes are applicable to a broad range of module types, including those made with wafer-based and thin-film solar cells. Survey data and input from module manufacturers gathered during site visits were used to define system capabilities and process specifications. Spire completed mechanical, electrical, and software engineering for four automation systems: a module edge trimming system, the SPI-TRIM 350; an edge sealing and framing system, the SPI-FRAMER 350; an integrated module testing system, the SPI-MODULE QA 350; and a module buffer storage system, the SPI-BUFFER 350. A fifth system for junction-box installation, the SPI-BOXER 350, was nearly completed during the program. A new-size solar simulator, the SPI-SUN SIMULATOR 350i, was designed as part of the SPI-MODULE QA 350. This simulator occupies minimal production floor space, and its test area is large enough to handle most production modules. The automated systems developed in this program are designed for integration to create automated production lines.

  19. Application of automation and information systems to forensic genetic specimen processing.

    Science.gov (United States)

    Leclair, Benoît; Scholl, Tom

    2005-03-01

    During the last 10 years, the introduction of PCR-based DNA typing technologies in forensic applications has been highly successful. This technology has become pervasive throughout forensic laboratories and it continues to grow in prevalence. For many criminal cases, it provides the most probative evidence. Criminal genotype data banking and victim identification initiatives that follow mass-fatality incidents have benefited the most from the introduction of automation for sample processing and data analysis. Attributes of offender specimens including large numbers, high quality and identical collection and processing are ideal for the application of laboratory automation. The magnitude of kinship analysis required by mass-fatality incidents necessitates the application of computing solutions to automate the task. More recently, the development activities of many forensic laboratories are focused on leveraging experience from these two applications to casework sample processing. The trend toward increased prevalence of forensic genetic analysis will continue to drive additional innovations in high-throughput laboratory automation and information systems.

  20. Automation of the Process to Obtain UF4 Powders

    International Nuclear Information System (INIS)

    This work presents a preliminary analysis of the control system to be implemented in the UF4 powder production plant. The work was carried out in the electronics laboratory and involved configuring the devices (PLC, temperature controllers, etc.) and setting up the communications using the proper protocol. A study of the logic for the first part of the UF6 conversion process, the evaporation, is also presented; this study is used to define the methodology to follow in a future PLC program.

  1. Cassini's Maneuver Automation Software (MAS) Process: How to Successfully Command 200 Navigation Maneuvers

    Science.gov (United States)

    Yang, Genevie Velarde; Mohr, David; Kirby, Charles E.

    2008-01-01

    To keep Cassini on its complex trajectory, more than 200 orbit trim maneuvers (OTMs) have been planned from July 2004 to July 2010. With only a few days between many of these OTMs, the operations process of planning and executing the necessary commands had to be automated. The resulting Maneuver Automation Software (MAS) process minimizes the workforce required for, and maximizes the efficiency of, the maneuver design and uplink activities. The MAS process is a well-organized and logically constructed interface between Cassini's Navigation (NAV), Spacecraft Operations (SCO), and Ground Software teams. Upon delivery of an orbit determination (OD) from NAV, the MAS process can generate a maneuver design and all related uplink and verification products within 30 minutes. To date, all 112 OTMs executed by the Cassini spacecraft have been successful. MAS was even used to successfully design and execute a maneuver while the spacecraft was in safe mode.

  2. Process monitoring for reprocessing plant safeguards: a summary review

    International Nuclear Information System (INIS)

    Process monitoring is a term typically associated with a detailed look at plant operating data to determine plant status. Process monitoring has been generally associated with operational control of plant processes. Recently, process monitoring has been given new attention for a possible role in international safeguards. International Safeguards Project Office (ISPO) Task C.59 has the goal to identify specific roles for process monitoring in international safeguards. As the preliminary effort associated with this task, a review of previous efforts in process monitoring for safeguards was conducted. Previous efforts mentioned concepts and a few specific applications. None were comprehensive in addressing all aspects of a process monitoring application for safeguards. This report summarizes the basic elements that must be developed in a comprehensive process monitoring application for safeguards. It then summarizes the significant efforts that have been documented in the literature with respect to the basic elements that were addressed

  3. AUTOMATED SYSTEM OF DATA PROCESSING WITH THE IMPLEMENTATION OF RATING TECHNOLOGY OF TEACHING

    OpenAIRE

    О. И. Дзювина; К. Е. Глинчиков

    2014-01-01

    Rating technology of teaching enables independent and individual work by students and increases their motivation. Purpose: to increase the efficiency of data processing with the implementation of rating technology of teaching. Method: analysis, synthesis, experiment. Results: an automated data processing system was developed for the implementation of rating technology of teaching. Practical implication: education.

  4. AUTOMATED SYSTEM OF DATA PROCESSING WITH THE IMPLEMENTATION OF RATING TECHNOLOGY OF TEACHING

    Directory of Open Access Journals (Sweden)

    О. И. Дзювина

    2014-01-01

    Full Text Available Rating technology of teaching enables independent and individual work by students and increases their motivation. Purpose: to increase the efficiency of data processing with the implementation of rating technology of teaching. Method: analysis, synthesis, experiment. Results: an automated data processing system was developed for the implementation of rating technology of teaching. Practical implication: education.

  5. An image-processing program for automated counting

    Science.gov (United States)

    Cunningham, D.J.; Anderson, W.H.; Anthony, R.M.

    1996-01-01

    An image-processing program developed by the National Institutes of Health, IMAGE, was modified in a cooperative project between remote sensing specialists at the Ohio State University Center for Mapping and scientists at the Alaska Science Center to facilitate estimating the numbers of black brant (Branta bernicla nigricans) in flocks at Izembek National Wildlife Refuge. The modified program, DUCK HUNT, runs on Apple computers. Modifications provide users with a pull-down menu that optimizes image quality; identifies objects of interest (e.g., brant) by spectral, morphometric, and spatial parameters defined interactively by users; counts and labels objects of interest; and produces summary tables. Images from digitized photography, videography, and high-resolution digital photography have been used with this program to count various species of waterfowl.

  6. Quantitative and Qualitative Analysis of Aconitum Alkaloids in Raw and Processed Chuanwu and Caowu by HPLC in Combination with Automated Analytical System and ESI/MS/MS

    Directory of Open Access Journals (Sweden)

    Aimin Sun

    2012-01-01

    Full Text Available HPLC in combination with an automated analytical system and ESI/MS/MS was used to analyze aconitine (A), mesaconitine (MA), hypaconitine (HA), and their benzoyl analogs in the Chinese herbs Caowu and Chuanwu. First, an HPLC method was developed and validated to determine A, MA, and HA in raw and processed Caowu and Chuanwu. Then an automated analytical system and ESI/MS/MS were applied to analyze these alkaloids and their semi-hydrolyzed products. The results obtained from the automated analytical system are identical to those from ESI/MS/MS, which indicates that the method is a convenient and rapid tool for the qualitative analysis of herbal preparations. Furthermore, HA was little hydrolyzed by heating processes and thus might account more for the toxicity of processed aconites. Hence, HA could be used as an indicator when one alkaloid is required as a reference to monitor the quality of raw and processed Chuanwu and Caowu. In addition, raw and processed Chuanwu and Caowu can be distinguished by monitoring the ratio of A and MA to HA.

  7. Monitoring individual cow udder health in automated milking systems using online somatic cell counts.

    Science.gov (United States)

    Sørensen, L P; Bjerring, M; Løvendahl, P

    2016-01-01

    This study presents and validates a detection and monitoring model for mastitis based on automated frequent sampling of online cell counts (OCC). Initially, data were filtered and adjusted for sensor drift and skewed distribution using ln-transformation. Acceptable data were passed on to a time-series model using double exponential smoothing to estimate level and trend at cow level. The OCC levels and trends were converted to a continuous (0-1) scale, termed elevated mastitis risk (EMR), where values close to zero indicate healthy cow status and values close to 1 indicate high risk of mastitis. Finally, a feedback loop was included to dynamically request a time to next sample, based on the latest EMR values or errors in the raw data stream. The estimated EMR values were used to issue 2 types of alerts: new and (on-going) intramammary infection (IMI) alerts. The new alerts were issued when the EMR values exceeded a threshold, and the IMI alerts were issued for subsequent alerts. New alerts were only issued after the EMR had been below the threshold for at least 8 d. The detection model was evaluated using time-window analysis and commercial herd data (6 herds, 595,927 milkings) at different sampling intensities. Recorded treatments of mastitis were used as the gold standard. Significantly higher EMR values were detected in treated than in contemporary untreated cows. The proportion of detected mastitis cases using new alerts was between 28.0 and 43.1% and highest for a fixed sampling scheme aiming at 24 h between measurements. This was higher for IMI alerts, between 54.6 and 89.0%, and highest when all available measurements were used. The lowest false-alert rate of 6.5 per 1,000 milkings was observed when all measurements were used. The results showed that a dynamic sampling scheme with a default value of 24 h between measurements gave only a small reduction in the proportion of detected mastitis treatments, which remained at 88.5%. It was concluded that filtering of raw data
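
    The level-and-trend tracking at the core of the model is classical double exponential (Holt) smoothing. A minimal sketch on ln-transformed counts, with an illustrative logistic squashing to the 0-1 EMR scale (the paper's exact conversion and thresholds are not reproduced):

        import numpy as np

        def holt_smooth(x, alpha=0.3, beta=0.1):
            """Double exponential smoothing: track level and trend."""
            level, trend = x[0], 0.0
            states = []
            for v in x[1:]:
                prev = level
                level = alpha * v + (1 - alpha) * (level + trend)
                trend = beta * (level - prev) + (1 - beta) * trend
                states.append((level, trend))
            return states

        occ = np.exp(np.random.normal(4.0, 0.3, 200))   # synthetic online cell counts
        states = holt_smooth(np.log(occ))

        # Illustrative mapping of level + trend to a 0-1 risk score.
        emr = [1 / (1 + np.exp(-(lv + 24 * tr - 5.0))) for lv, tr in states]
        print(max(emr))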

  8. UNICOS CPC6: Automated Code Generation for Process Control Applications

    CERN Document Server

    Fernandez Adiego, B; Prieto Barreiro, I

    2011-01-01

    The Continuous Process Control package (CPC) is one of the components of the CERN Unified Industrial Control System framework (UNICOS) [1]. As a part of this framework, UNICOS-CPC provides a well defined library of device types, a methodology and a set of tools to design and implement industrial control applications. The new CPC version uses the software factory UNICOS Application Builder (UAB) [2] to develop CPC applications. The CPC component is composed of several platform-oriented plug-ins (PLCs and SCADA) describing the structure and the format of the generated code. It uses a resource package where both the library of device types and the generated file syntax are defined. The UAB core is the generic part of this software: it dynamically discovers and calls the different plug-ins and provides the required common services. In this paper the UNICOS CPC6 package is introduced. It is composed of several plug-ins: the Instance generator and the Logic generator for both Siemens and Schneider PLCs, the SCADA g...

  9. Automated Coronal Loop Identification Using Digital Image Processing Techniques

    Science.gov (United States)

    Lee, Jong K.; Gary, G. Allen; Newman, Timothy S.

    2003-01-01

    The results of a master's thesis project on computer algorithms for the automatic identification of optically-thin, 3-dimensional solar coronal loop centers from extreme ultraviolet and X-ray 2-dimensional images will be presented. These center splines are proxies for the associated magnetic field lines. The project addresses pattern recognition problems in which there are no unique shapes or edges and in which photon and detector noise heavily influence the images. The study explores extraction techniques using: (1) linear feature recognition of local patterns (related to the inertia-tensor concept), (2) parametric space via the Hough transform, and (3) topological adaptive contours (snakes) that constrain curvature and continuity, as possible candidates for digital loop detection schemes. We have developed synthesized images of the coronal loops to test the various loop identification algorithms. Since the topology of these solar features is dominated by the magnetic field structure, a first-order magnetic field approximation using multiple dipoles provides a priori information for the identification process. Results from both synthesized and solar images will be presented.

  10. Laser materials processing of complex components. From reverse engineering via automated beam path generation to short process development cycles.

    Science.gov (United States)

    Görgl, R.; Brandstätter, E.

    2016-03-01

    The article presents an overview of what is possible nowadays in the field of laser materials processing. The state of the art in the complete process chain is shown, starting with the generation of a specific component's CAD data and continuing with the automated motion path generation for the laser head carried by a CNC or robot system. Application examples from laser welding, laser cladding and additive laser manufacturing are given.

  11. Manual of process automation. On-line control systems for devices in the process technology. 3. tot. rev. and enl. ed.; Handbuch der Prozessautomatisierung. Prozessleittechnik fuer verfahrenstechnische Anlagen

    Energy Technology Data Exchange (ETDEWEB)

    Maier, U.; Frueh, K.F. (eds.)

    2004-07-01

    This is a reference manual for engineers who need answers to automation problems in chemical engineering. Some current subjects have been newly introduced to complement the information. The following chapters are new or have been rewritten by new authors: Internet and intranet technologies; Overview of process-related functions; Industrial control: problems and solutions; Model-based predictive control (MPC); Report archive analysis; Control Loop Performance Monitoring (CPM); Automation structures; Explosion protection; Remote I/O; Integration of intelligent field equipment in process control systems (PLS); Weighing and filling techniques; Plant safety; Integrated maintenance - structures and strategies. The other chapters have been revised and updated as well. (orig.) [German original, in translation:] The basic concept of the manual is unchanged: it serves as a reference work for engineers who, in various fields of activity, must deal with questions of automating process engineering plants. Some topics were newly included owing to their topicality and to round off the range of subjects. The chapters listed above are completely new or substantially extended with new authors; the remaining chapters were updated and in some cases substantially reworked. (orig.)

  12. An agent-based service-oriented integration architecture for chemical process automation

    Institute of Scientific and Technical Information of China (English)

    Na Luo; Weimin Zhong; Feng Wan; Zhencheng Ye; Feng Qian

    2015-01-01

    In reality, traditional process control systems built upon centralized and hierarchical structures respond weakly to change and are easily shut down by a single failure. Aiming at these problems, a new agent-based service-oriented integration architecture was proposed for chemical process automation systems. Web services were dynamically orchestrated on the internet and agent behaviors were built into them. Data analysis, modeling, optimization, control, fault diagnosis and so on were encapsulated into different web services. Agents were used for service composition by negotiation. A prototype system for poly(ethylene terephthalate) process automation was used as the case study to demonstrate the validity of the integration.

  13. Automation of NLO processes and decays and POWHEG matching in WHIZARD

    CERN Document Server

    Reuter, J; Hoang, A; Kilian, W; Stahlhofen, M; Teubner, T; Weiss, C

    2016-01-01

    We give a status report on the automation of next-to-leading order processes within the Monte Carlo event generator WHIZARD, using GoSam and OpenLoops as providers of one-loop matrix elements. To deal with divergences, WHIZARD uses automated FKS subtraction, and the phase space for singular regions is generated automatically. NLO examples for both scattering and decay processes, with a focus on e+e- processes, are shown. Also, first NLO studies of observables for collisions of polarized lepton beams, e.g. at the ILC, will be presented. Furthermore, the automatic matching of the fixed-order NLO amplitudes with emissions from the parton shower within the POWHEG formalism inside WHIZARD will be discussed. We also present results for top pairs at threshold in lepton collisions, including matching between a resummed threshold calculation and fixed-order NLO. This allows the investigation of more exclusive differential observables.

  14. Automation of NLO processes and decays and POWHEG matching in WHIZARD

    Energy Technology Data Exchange (ETDEWEB)

    Reuter, Juergen; Chokoufe, Bijan [DESY, Hamburg (Germany). Theory Group; Hoang, Andre [Vienna Univ. (Austria). Faculty of Physics; Vienna Univ. (Austria). Erwin Schroedinger International Inst. for Mathematical Physics; Kilian, Wolfgang [Siegen Univ. (Germany); Stahlhofen, Maximilian [Mainz Univ. (Germany). PRISMA Cluster of Excellence; DESY, Hamburg (Germany). Theory Group; Teubner, Thomas [Liverpool Univ. (United Kingdom). Dept. of Mathematical Sciences; Weiss, Christian [DESY, Hamburg (Germany). Theory Group; Siegen Univ. (Germany)

    2016-03-15

    We give a status report on the automation of next-to-leading order processes within the Monte Carlo event generator WHIZARD, using GoSam and OpenLoops as providers of one-loop matrix elements. To deal with divergences, WHIZARD uses automated FKS subtraction, and the phase space for singular regions is generated automatically. NLO examples for both scattering and decay processes, with a focus on e+e- processes, are shown. Also, first NLO studies of observables for collisions of polarized lepton beams, e.g. at the ILC, will be presented. Furthermore, the automatic matching of the fixed-order NLO amplitudes with emissions from the parton shower within the POWHEG formalism inside WHIZARD will be discussed. We also present results for top pairs at threshold in lepton collisions, including matching between a resummed threshold calculation and fixed-order NLO. This allows the investigation of more exclusive differential observables.

  15. Comprehensive automation and monitoring of MV grids as the key element of improvement of energy supply reliability and continuity

    Directory of Open Access Journals (Sweden)

    Stanisław Kubacki

    2012-03-01

    Full Text Available The paper presents the issue of comprehensive automation and monitoring of medium voltage (MV) grids as a key element of the Smart Grid concept. The existing condition of MV grid control and monitoring is discussed, and the concept of a solution which will provide the possibility of remote automatic grid reconfiguration and ensure full grid observability from the dispatching system level is introduced. Automation of MV grid switching is discussed in detail, with the aim of isolating a faulty line section and supplying electricity at the time of the failure to the largest possible number of recipients. An example of such automation controls' operation is also presented. The paper's second part presents the key role of the quick fault location function and the possibility of the MV grid's remote reconfiguration for improving power supply reliability (SAIDI and SAIFI indices). It is also shown how increasing the number of points fitted with faulted circuit indicators and with switches remotely controlled from the dispatch system may reduce SAIDI and SAIFI indices across ENERGA-OPERATOR SA divisions.
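
    SAIDI and SAIFI are the standard reliability indices (system average interruption duration and frequency per customer served, as defined in IEEE Std 1366). Their computation from outage records is simple; the figures below are made up:

        # Each outage: (customers_interrupted, duration_minutes).
        outages = [(1200, 95), (300, 40), (4500, 180), (80, 15)]
        customers_served = 50_000

        saifi = sum(n for n, _ in outages) / customers_served       # interruptions/customer
        saidi = sum(n * d for n, d in outages) / customers_served   # minutes/customer
        print(f"SAIFI = {saifi:.3f}, SAIDI = {saidi:.1f} min per customer")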

  16. FLAME MONITORING IN POWER STATION BOILERS USING IMAGE PROCESSING

    Directory of Open Access Journals (Sweden)

    K. Sujatha

    2012-05-01

    Full Text Available Combustion quality in power station boilers plays an important role in minimizing flue gas emissions. In the present work, various intelligent schemes to infer the flue gas emissions by monitoring the flame colour at the furnace of the boiler are proposed. Flame image monitoring involves capturing the flame video over a period of time and measuring various parameters: carbon dioxide (CO2), excess oxygen (O2), nitrogen dioxide (NOx), sulphur dioxide (SOx) and carbon monoxide (CO) emissions, plus the flame temperature at the core of the fireball, the air/fuel ratio and the combustion quality. The higher the quality of combustion, the lower the flue gas emissions at the exhaust. The flame video was captured using an infrared camera and then split up into frames for further analysis. The video splitter is used for progressive extraction of the flame images from the video. The images of the flame are then pre-processed to reduce noise. The conventional classification and clustering techniques include the Euclidean distance (L2 norm) classifier. The intelligent classifiers include the Radial Basis Function network (RBF), the Back Propagation Algorithm (BPA) and a parallel architecture with RBF and BPA (PRBFBPA). The results of the validation are supported with the above-mentioned performance measures, whose values are in the optimal range. The values of the temperatures, combustion quality, SOx, NOx, CO, CO2 concentrations, and air and fuel supplied corresponding to the images were obtained, thereby indicating the necessary control action taken to increase or decrease the air supply so as to ensure complete combustion. In this work, by continuously monitoring the flame images, combustion quality was inferred (complete/partial/incomplete combustion) and the air/fuel ratio can be automatically varied. Moreover, in the existing set-up, measurements like NOx, CO and CO2 are inferred from the samples that are collected periodically or by

  17. An engineered approach to stem cell culture: automating the decision process for real-time adaptive subculture of stem cells.

    Directory of Open Access Journals (Sweden)

    Dai Fei Elmer Ker

    Full Text Available Current cell culture practices are dependent upon human operators and remain laborious and highly subjective, resulting in large variations and inconsistent outcomes, especially when using visual assessments of cell confluency to determine the appropriate time to subculture cells. Although efforts to automate cell culture with robotic systems are underway, the majority of such systems still require human intervention to determine when to subculture. Thus, it is necessary to accurately and objectively determine the appropriate time for cell passaging. Optimal stem cell culturing that maintains cell pluripotency while maximizing cell yields will be especially important for efficient, cost-effective stem cell-based therapies. Toward this goal we developed a real-time computer vision-based system that monitors the degree of cell confluency with a precision of 0.791±0.031 and recall of 0.559±0.043. The system consists of an automated phase-contrast time-lapse microscope and a server. Multiple dishes are sequentially imaged and the data is uploaded to the server that performs computer vision processing, predicts when cells will exceed a pre-defined threshold for optimal cell confluency, and provides a Web-based interface for remote cell culture monitoring. Human operators are also notified via text messaging and e-mail 4 hours prior to reaching this threshold and immediately upon reaching this threshold. This system was successfully used to direct the expansion of a paradigm stem cell population, C2C12 cells. Computer-directed and human-directed control subcultures required 3 serial cultures to achieve the theoretical target cell yield of 50 million C2C12 cells and showed no difference for myogenic and osteogenic differentiation. This automated vision-based system has potential as a tool toward adaptive real-time control of subculturing, cell culture optimization and quality assurance/quality control, and it could be integrated with current and
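
    The core confluency measurement can be pictured as the foreground fraction of a segmented phase-contrast frame. A toy sketch with scikit-image (texture-based thresholding as a crude stand-in for the paper's trained vision pipeline; file name and threshold are assumptions):

        from skimage import io, filters, morphology

        frame = io.imread("phase_contrast_frame.png", as_gray=True)  # hypothetical image

        # Cells in phase contrast show strong local gradients; threshold the
        # gradient magnitude as a crude foreground detector.
        edges = filters.sobel(frame)
        mask = morphology.binary_closing(edges > filters.threshold_otsu(edges),
                                         morphology.disk(3))

        confluency = mask.mean()      # fraction of the frame covered by cells
        if confluency > 0.8:          # assumed pre-defined subculture threshold
            print(f"Confluency {confluency:.0%}: notify operator, schedule passage")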

  18. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-5Z. The magnesium hydroxide process control software Rev 0 is being updated to include control programming for a second hot plate. The process control programming was performed by the system administrator. Software testing for the additional hot plate was performed per PFP Job Control Work Package 2Z-00-1703. The software testing was verified by Quality Control to comply with OSD-Z-184-00044, Magnesium Hydroxide Precipitation Process

  19. An Advanced Pre-Processing Pipeline to Improve Automated Photogrammetric Reconstructions of Architectural Scenes

    Directory of Open Access Journals (Sweden)

    Marco Gaiani

    2016-02-01

    Full Text Available Automated image-based 3D reconstruction methods are more and more flooding our 3D modeling applications. Fully automated solutions give the impression that from a sample of randomly acquired images we can derive quite impressive visual 3D models. Although the level of automation is reaching very high standards, image quality is a fundamental prerequisite for producing successful and photo-realistic 3D products, in particular when dealing with large datasets of images. This article presents an efficient pipeline based on color enhancement, image denoising, color-to-gray conversion and image content enrichment. The pipeline stems from an analysis of various state-of-the-art algorithms and aims to adjust the most promising methods, giving solutions to typical failure causes. The assessment evaluation shows how effective image pre-processing, which considers the entire image dataset, can improve the automated orientation procedure and dense 3D point cloud reconstruction, even in the case of poor-texture scenarios.
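
    A pipeline of this shape can be sketched with OpenCV; the specific operators below (non-local-means denoising, CLAHE contrast enhancement, luminance-based gray conversion) are common stand-ins, not necessarily the article's tuned algorithms, and the file names are made up:

        import cv2

        img = cv2.imread("facade_0001.jpg")                  # hypothetical input photo

        # 1. Denoise while preserving edges.
        den = cv2.fastNlMeansDenoisingColored(img, None, 5, 5, 7, 21)

        # 2. Contrast enhancement: CLAHE on the L channel in Lab space.
        l, a, b = cv2.split(cv2.cvtColor(den, cv2.COLOR_BGR2LAB))
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        enh = cv2.cvtColor(cv2.merge((clahe.apply(l), a, b)), cv2.COLOR_LAB2BGR)

        # 3. Color-to-gray conversion for the feature extraction stage.
        gray = cv2.cvtColor(enh, cv2.COLOR_BGR2GRAY)
        cv2.imwrite("facade_0001_pre.png", gray)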

  20. Automated system for monitoring groundwater levels at an experimental low-level waste disposal site

    International Nuclear Information System (INIS)

    One of the major problems with disposing of low-level solid wastes in the eastern United States is the potential for water-waste interactions and leachate migration. To monitor groundwater fluctuations and the frequency with which groundwater comes into contact with a group of experimental trenches, work at Oak Ridge National Laboratory's Engineered Test Facility (ETF) has employed a network of water level recorders that feed information from 15 on-site wells to a centralized data recording system. The purpose of this report is to describe the monitoring system being used and to document the computer programs that have been developed to process the data. Included in this report are data based on more than 2 years of water level information for ETF wells 1 through 12 and more than 6 months of data from all 15 wells. The data thus reflect both long-term trends and a large number of short-term responses to individual storm events. The system was designed to meet the specific needs of the ETF, but the hardware and computer routines have generic application to a variety of groundwater monitoring situations. 5 references

  1. Total Column Greenhouse Gas Monitoring in Central Munich: Automation and Measurements

    Science.gov (United States)

    Chen, Jia; Heinle, Ludwig; Paetzold, Johannes C.; Le, Long

    2016-04-01

    It is challenging to use in-situ surface measurements of CO2 and CH4 to derive emission fluxes in urban regions. Surface concentrations typically have high variance due to the influence of nearby sources, and they are strongly modulated by mesoscale transport phenomena that are difficult to simulate in atmospheric models. The integrated amount of a tracer through the whole atmosphere is a direct measure of the mass loading of the atmosphere given by emissions. Column measurements are insensitive to vertical redistribution of tracer mass, e.g. due to growth of the planetary boundary layer, and are also less influenced by nearby point sources, whose emissions are concentrated in a thin layer near the surface. Column observations are more compatible with the scale of atmospheric models and hence provide stronger constraints for inverse modeling. In Munich we are aiming at establishing a regional sensor network with differential column measurements, i.e. total column measurements of CO2 and CH4 inside and outside of the city. The inner-city station is equipped with a compact solar-tracking Fourier transform spectrometer (Bruker EM27/SUN) on the campus of Technische Universität München, and our measurements started in Aug. 2015. Measurements over the seasons will be shown, as well as preliminary emission studies using these observations. To deploy the compact spectrometers for stationary monitoring of urban emissions, an automatic protection and control system is mandatory, and building one is a challenging task. It must allow solar measurements whenever the sun is out and reliably protect the instrument when it starts to rain. We have developed a simplified and highly reliable concept for the enclosure, aiming for a fully automated data collection station without the need for local human interaction. Furthermore, we are validating and combining the OCO-2 satellite-based measurements with our ground-based measurements. For this purpose, we have developed a software tool that

  2. Test/score/report: Simulation techniques for automating the test process

    Science.gov (United States)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler tasks. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report the results, along with a go/no-go recommendation, to the test team. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc. Most tests are accomplished using a certain degree of automation and test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries. When this capability is fully operational it should greatly reduce the time necessary

  3. Radioactivity monitoring and data processing on the SHMU

    International Nuclear Information System (INIS)

    The radiation monitoring network in the Slovak Republic is presented. The data are collected and processed at the Slovak Hydro-Meteorological Institute. The data from 21 monitored sites are sent to the Slovak Centre of the Radiation Monitoring Network, to the Nuclear Regulatory Authority of the Slovak Republic, to the Bohunice NPP, as well as to Austria and other national monitoring centres. Since January 1999, data from the Slovak Army monitoring network, consisting of 11 measurement sites, have also been received. The process of data processing is described

  4. Analysis of the thoracic aorta using a semi-automated post processing tool

    Energy Technology Data Exchange (ETDEWEB)

    Entezari, Pegah, E-mail: p-entezari@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Kino, Aya, E-mail: ayakino@gmail.com [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Honarmand, Amir R., E-mail: arhonarmand@yahoo.com [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Galizia, Mauricio S., E-mail: maugalizia@yahoo.com.br [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yang, Yan, E-mail: yyang@vitalimages.com [Vital images Inc, Minnetonka, MN (United States); Collins, Jeremy, E-mail: collins@fsm.northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yaghmai, Vahid, E-mail: vyaghmai@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Carr, James C., E-mail: jcarr@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States)

    2013-09-15

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA-compliant study was approved by our IRB. Transaxial maximum outer-wall-to-outer-wall diameters were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of the right pulmonary artery, proximal aortic arch (PROX) immediately proximal to the innominate artery, distal aortic arch (DIST) immediately distal to the left subclavian artery, and descending aorta (DESC) at the level of the diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to the intraclass correlation coefficient (ICC) and Bland–Altman plots. The number of cases requiring manual contouring or center-line adjustment for the semi-automated method, and the post-processing time for each method, were recorded. Results: The mean difference between the semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of the center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to do the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels with the highest and lowest numbers of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is
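
    A minimal sketch of the Bland–Altman agreement computation used to compare two measurement methods, assuming numpy; the sample values below are synthetic, not study data.

```python
import numpy as np

def bland_altman(manual_mm, auto_mm):
    """Mean bias and 95% limits of agreement between two methods."""
    manual = np.asarray(manual_mm, dtype=float)
    auto = np.asarray(auto_mm, dtype=float)
    diff = auto - manual
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Synthetic example: semi-automated vs manual diameters (mm)
manual = [34.1, 36.8, 33.0, 40.2, 37.5]
auto = [34.6, 37.1, 33.4, 41.0, 37.2]
bias, (lo, hi) = bland_altman(manual, auto)
print(f"bias {bias:.2f} mm, 95% LoA [{lo:.2f}, {hi:.2f}] mm")
```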

  5. 'Au.Raex': An Automated, Long Lasting Exposimeter for Monitoring Persons with Increased Radon-Exposure

    International Nuclear Information System (INIS)

    Within this framework, the automated radon exposimeter 'au.raex' improves the long-established method of radon exposure measurement using nuclear track detectors in a decisive way. Unlike conventional nuclear track exposimeters, this radon measurement is switchable. By movement recognition, the exposure is automatically constrained to the period in which the device is actually worn, and the exposure time is captured automatically. Despite these advantages, au.raex is comfortable to wear: it has roughly the dimensions of a cigarette box. Used as a time-controlled ambient exposimeter, it captures radon exposure only during relevant, defined periods. The timing control has been implemented in the form of a complete calendar; on and off times can thus be defined separately for each weekday, as well as public holidays and holiday periods during which the detector, as an exception, remains completely closed. Data evaluation and programming are performed via the USB port and software on a computer. The switchability of the measurement is achieved by a movable slide at a small distance above the detector film. Both the movement- and time-dependent control of the closure are optimized for low energy consumption: 'au.raex' is applicable for measurement campaigns lasting several years, without the need to charge the device or perform further maintenance. Calibration as well as practical testing of 'au.raex' were performed by the Radon Laboratory of the Karlsruhe Institute of Technology (KIT) using their own nuclear track films and evaluation process. To validate the operation of the instrument, measurements are to be performed on persons with known increased radon exposure.(author)

  6. Laboratory automation and LIMS in forensics

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Anders Johannes; Morling, Niels

    2013-01-01

    Implementation of laboratory automation and LIMS in a forensic laboratory enables the laboratory, to standardize sample processing. Automated liquid handlers can increase throughput and eliminate manual repetitive pipetting operations, known to result in occupational injuries to the technical staff....... Furthermore, implementation of automated liquid handlers reduces the risk of sample misplacement. A LIMS can efficiently control the sample flow through the laboratory and manage the results of the conducted tests for each sample. Integration of automated liquid handlers with a LIMS provides the laboratory...... with the tools required for setting up automated production lines of complex laboratory processes and monitoring the whole process and the results. Combined, this enables processing of a large number of samples. Selection of the best automated solution for an individual laboratory should be based on user...

  7. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study

    Science.gov (United States)

    Johansen, Ayna; Brendryen, Håvar

    2016-01-01

    Background eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. Objective We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist’s support of a working alliance, internalization of motivation, and managing lapses. Methods We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several “counseling sessions” about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. Results The program supports the user’s working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. Conclusions A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective. PMID:27354373

  8. Assessment Study on Sensors and Automation in the Industries of the Future. Reports on Industrial Controls, Information Processing, Automation, and Robotics

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Bonnie [Adventium Labs; Boddy, Mark [Adventium Labs; Doyle, Frank [Univ. of California, Santa Barbara, CA (United States); Jamshidi, Mo [Univ. of New Mexico, Albuquerque, NM (United States); Ogunnaike, Tunde [Univ. of Delaware, Newark, DE (United States)

    2004-11-01

    This report presents the results of an expert study to identify research opportunities for Sensors & Automation, a sub-program of the U.S. Department of Energy (DOE) Industrial Technologies Program (ITP). The research opportunities are prioritized by realizable energy savings. The study encompasses the technology areas of industrial controls, information processing, automation, and robotics. These areas have been central areas of focus of many Industries of the Future (IOF) technology roadmaps. This report identifies opportunities for energy savings as a direct result of advances in these areas and also recognizes indirect means of achieving energy savings, such as product quality improvement, productivity improvement, and reduction of recycle.

  9. Development of a Fully Automated, GPS Based Monitoring System for Disaster Prevention and Emergency Preparedness: PPMS+RT

    OpenAIRE

    Anna Szostak-Chrzanowski; Adam Chrzanowski; Don Kim; Jason Bond

    2007-01-01

    The increasing number of structural collapses, slope failures and other natural disasters has led to a demand for new sensors, sensor integration techniques and data processing strategies for deformation monitoring systems. In order to meet extraordinary accuracy requirements for displacement detection in recent deformation monitoring projects, research has been devoted to integrating the Global Positioning System (GPS) as a monitoring sensor. Although GPS has been used for monitoring purposes w...

  10. A chemical sensor and biosensor based totally automated water quality monitor for extended space flight: Step 1

    Science.gov (United States)

    Smith, Robert S.

    1993-01-01

    The result of a literature search to consider what technologies should be represented in a totally automated water quality monitor for extended space flight is presented. It is the result of the first summer in a three-year JOVE project. The next step will be to build a test platform at the author's school, St. John Fisher College. This will involve undergraduates in NASA-related research. The test flow injection analysis system will be used to test the detection limits of sensors and the performance of sensors in groups. Sensor companies and research groups will be encouraged to produce sensors which are not currently available and are needed for this project.

  11. Research on the Correlation Between Oil Monitoring and Vibration Monitoring in Information Collecting and Processing Monitoring

    Institute of Scientific and Technical Information of China (English)

    ZHAO Xin-ze; YAN Xin-ping; ZHAO Chun-hong; GAO Xiao-hong; XIAO Han-liang

    2004-01-01

    Oil monitoring and vibration monitoring are two principal techniques for mechanical fault diagnosis and condition monitoring at present. They monitor the mechanical condition by different approaches; nevertheless, oil and vibration monitoring are related in information collecting and processing. In the same mechanical system, the information obtained from the same information source can be described with the same expression form. The expressions are constituted of a structure matrix, a relative matrix and a system matrix. For oil and vibration monitoring, the information source is correlated, while the collection is independent and complementary. Oil monitoring and vibration monitoring also share the same processing method when they yield their information. This research has provided a reasonable and useful approach to combining oil monitoring and vibration monitoring.

  12. Opportunities for Automated Demand Response in California’s Dairy Processing Industry

    Energy Technology Data Exchange (ETDEWEB)

    Homan, Gregory K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    During periods of peak electrical demand on the energy grid or when there is a shortage of supply, the stability of the grid may be compromised or the cost of supplying electricity may rise dramatically, respectively. Demand response programs are designed to mitigate the severity of these problems and improve reliability by reducing the demand on the grid during such critical times. In 2010, the Demand Response Research Center convened a group of industry experts to suggest potential industries that would be good demand response program candidates for further review. The dairy industry was suggested due to the perception that the industry had suitable flexibility and automatic controls in place. The purpose of this report is to provide an initial description of the industry with regard to demand response potential, specifically automated demand response. This report qualitatively describes the potential for participation in demand response and automated demand response by dairy processing facilities in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use. Typical process equipment and controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Two case studies of demand response at dairy facilities in California and across the country are reviewed. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  13. Validation of the process control system of an automated large scale manufacturing plant.

    Science.gov (United States)

    Neuhaus, H; Kremers, H; Karrer, T; Traut, R H

    1998-02-01

    The validation procedure for the process control system of a plant for the large scale production of human albumin from plasma fractions is described. A validation master plan is developed, defining the system and elements to be validated, the interfaces with other systems with the validation limits, a general validation concept and supporting documentation. Based on this master plan, the validation protocols are developed. For the validation, the system is subdivided into a field level, which is the equipment part, and an automation level. The automation level is further subdivided into sections according to the different software modules. Based on a risk categorization of the modules, the qualification activities are defined. The test scripts for the different qualification levels (installation, operational and performance qualification) are developed according to a previously performed risk analysis.

  14. Environmetrics of synfuels. I. Processing the automated PDP-11 data components for the UMD gasifier facility

    Energy Technology Data Exchange (ETDEWEB)

    Strand, R.H.; Farrell, M.P.; Gudmundson, C.W.; Birchfield, T.K.; Casada, S.S.; Vansuch, M.E.

    1981-01-01

    This report summarizes the techniques and procedures used to handle automated data collected at the University of Minnesota-Duluth (UMD) campus coal gasification facility. This facility, which is partially funded by the Department of Energy, is being evaluated by scientists at Oak Ridge National Laboratory (ORNL) for its potential health and environmental effects. Automatically collected data and manually collected sample results are used for this assessment. A data management project at ORNL handles these and other UMD data for the Gasifiers in Industry Program (GIIP). Specifically, this report documents the procedures developed within the data management project for handling two categories of automated data: (1) process and (2) environmental. The examples included use actual data from the first one and a half years of gasifier operation.

  15. An Empirical Study on the Impact of Automation on the Requirements Analysis Process

    Institute of Scientific and Technical Information of China (English)

    Giuseppe Lami; Robert W. Ferguson

    2007-01-01

    Requirements analysis is an important phase in a software project. The analysis is often performed in an informal way by specialists who review documents looking for ambiguities, technical inconsistencies and incomplete parts. Automation is still far from being applied in requirements analyses, above all because natural languages are informal and thus difficult to treat automatically. There are only a few tools that can analyse texts. One of them, called QuARS, was developed by the Istituto di Scienza e Tecnologie dell'Informazione and can analyse texts in terms of ambiguity. This paper describes how QuARS was used in a formal empirical experiment to assess the impact, in terms of effectiveness and efficacy, of automation in the requirements review process of a software company.
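
    A toy Python sketch in the spirit of such lexical ambiguity detection (not QuARS itself, whose linguistic resources are far richer); the term list is purely illustrative.

```python
import re

# Toy lexicon of ambiguity indicators; QuARS relies on far richer
# linguistic resources -- these terms are illustrative only.
VAGUE_TERMS = {"adequate", "appropriate", "as needed", "fast",
               "flexible", "user-friendly", "if possible"}

def flag_ambiguities(requirement):
    """Return the vague terms found in one requirement sentence."""
    words = re.findall(r"[a-z-]+", requirement.lower())
    text = " " + " ".join(words) + " "
    return sorted(t for t in VAGUE_TERMS if f" {t} " in text)

print(flag_ambiguities(
    "The system shall respond fast and provide adequate logging."))
# -> ['adequate', 'fast']
```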

  16. Radiation Monitoring System in Advanced Spent Fuel Conditioning Process Facility

    International Nuclear Information System (INIS)

    The Advanced Spent Fuel Conditioning Process (ACP) is under development for effective management of spent fuel by converting UO2 into U-metal. For demonstration of this process, a new α-γ type hot cell was built in the IMEF basement. To secure against radiation hazards, this facility needs a radiation monitoring system that observes the entire operating area in front of the hot cell and the service area behind it. This system consists of 7 parts: an Area Monitor for γ-rays, a Room Air Monitor for particulate and iodine in both areas, a Hot Cell Monitor for the high radiation inside the hot cell and the rear door interlock, a Duct Monitor for particulate in the outlet ventilation, an Iodine Monitor for iodine in the outlet duct, CCTV for watching workers and material movement, and a Server for management of the whole monitoring system. After installation and testing, the radiation monitoring system is expected to assist the successful ACP demonstration

  17. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meier, David E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Coble, Jamie B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Jordan, David V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mcdonald, Luther W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Forrester, Joel B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schwantes, Jon M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Unlu, Kenan [Pennsylvania State Univ., University Park, PA (United States); Landsberger, Sheldon [Univ. of Texas, Austin, TX (United States); Bender, Sarah [Pennsylvania State Univ., University Park, PA (United States); Dayman, Kenneth J. [Univ. of Texas, Austin, TX (United States); Reilly, Dallas D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2013-09-01

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of “… (minimization of) the risks of nuclear proliferation and terrorism.” The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing “normal” process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments have been performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.
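
    A minimal sketch of the multivariate-analysis idea: fit a principal-component model on spectra recorded under "normal" process conditions and score new spectra by their residual. This is a generic numpy illustration with synthetic data assumed, not PNNL's algorithm.

```python
import numpy as np

def fit_pca(normal_spectra, n_components=5):
    """Fit a PCA model on count-normalized 'normal' spectra
    (rows = spectra, columns = energy channels)."""
    X = normal_spectra / normal_spectra.sum(axis=1, keepdims=True)
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def q_statistic(spectrum, mean, components):
    """Squared residual after projection onto the 'normal' subspace;
    a large Q flags a spectrum that no longer matches normal
    process conditions."""
    x = spectrum / spectrum.sum() - mean
    residual = x - components.T @ (components @ x)
    return float(residual @ residual)
```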

  18. Power up your plant - An introduction to integrated process and power automation

    Energy Technology Data Exchange (ETDEWEB)

    Vasel, Jeffrey

    2010-09-15

    This paper discusses how a single integrated system can increase energy efficiency, improve plant uptime, and lower life cycle costs. Integrated Process and Power Automation is a new system integration architecture and power strategy that addresses the needs of the process and power generation industries. The architecture is based on Industrial Ethernet standards such as IEC 61850 and Profinet as well as Fieldbus technologies. The energy efficiency gains from integration are discussed in a power generation use case. A power management system success story from a major oil and gas company, Petrobras, is also discussed.

  19. Proceedings of the third International Federation of Automatic Control symposium on automation in mining, mineral and metal processing

    Energy Technology Data Exchange (ETDEWEB)

    O' Shea, J.; Polis, M. (eds.)

    1980-01-01

    Sixty-four papers covering many aspects of automation in mining, mineral and metal processing are presented. Opening and concluding remarks are also given. Fourteen papers are individually abstracted.

  20. Monitoring of sediment transport processes using tracer stones

    Science.gov (United States)

    Redtenbacher, Matthias; Harb, Gabriele; Barbas, Teresa; Schneider, Josef

    2014-05-01

    In the last decades the vulnerability of our civilization to damaging geomorphological events like debris flows and exceptional floods has increased. The reasons are, on the one hand, that the global hydrological cycle became more intense during the recent past and, on the other hand, that the material assets of the population increased. Risk prevention, risk analysis and forecast methods have thus become more important. Geomorphological processes are often not easy to analyse. To get information about the probability and the consequences of these increasing events, it is necessary to analyse the availability of sediments in the catchment area, the erosion processes of the sediment and the transport of the sediments along torrents. The ClimCatch project, which started in April 2012, investigates the torrential sediment transport processes in a non-glaciated Alpine valley in Austria and the related natural hazards under the viewpoint of on-going climate change. Due to an extreme precipitation event in 2011, debris-flow-like discharges occurred in this catchment, and since then the sediment sources there have been highly erodible. The aims of the project are to derive a quantitative sediment budget model, including geomorphic process domains, determining sediment transport in the river system and measuring bed-load output, among others. To quantify river sediment dynamics several different methodologies are applied within the project. Discharge and sediment transport measurement as well as hydrological stations are installed in the catchment area. Aggradation and erosion are analysed by means of laser scanning technology in the sediment storage basin which is located at the outlet of the catchment. The observation and measurement of the sediment transport is performed by the application of radio telemetry stones and colour tracer stones. Line pebble counting, automated grain size determination using photographs and sieving on-site is performed to get qualitative sediment

  1. An Integrated Solution for both Monitoring and Controlling for Automization Using Wireless Sensor Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    M Gnana Seelan

    2013-02-01

    Full Text Available Temperature monitoring plays a major role in controlling temperature according to varied conditions. This process is common in all critical areas like data centres, server rooms, grid rooms and other data-communication equipped rooms. It is mandatory for each organization/industry to adopt such a process, as most of the critical data reside in the data centre along with the network infrastructure, where various electronic, electrical and mechanical devices are involved in data transmission. These devices depend very much on environmental factors such as temperature, moisture, humidity etc., and also emit heat in the form of thermal energy when they are functional. To remove this heat, the server/data centre room(s) are equipped with multiple (distributed) air-conditioning (AC) systems to provide a cooling environment and maintain the temperature level of the room. The proposed paper is a study of the automation of monitoring and controlling temperature as per desired requirements with a WSN network

  2. Microprocessor-based integration of microfluidic control for the implementation of automated sensor monitoring and multithreaded optimization algorithms.

    Science.gov (United States)

    Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov

    2015-08-01

    Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits. PMID:26227212
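
    For illustration, a minimal generic simulated-annealing sketch of the kind such optimization protocols rely on; the toy objective and neighbor function are assumptions, not the paper's calibration routine.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995,
                        steps=5000):
    """Generic simulated annealing: occasionally accept worse moves,
    with a probability that shrinks as the temperature decays."""
    x, fx, t = x0, cost(x0), t0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbor(x)
        fy = cost(y)
        if fy < fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy use: tune a single setpoint against a surrogate objective
best, err = simulated_annealing(
    cost=lambda v: (v - 0.42) ** 2,
    neighbor=lambda v: v + random.uniform(-0.05, 0.05),
    x0=0.0)
```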

  4. Computer-based diagnostic monitoring to enhance the human-machine interface of complex processes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, I.S.

    1992-02-01

    There is a growing interest in introducing an automated, on-line, diagnostic monitoring function into the human-machine interfaces (HMIs) or control rooms of complex process plants. The design of such a system should be properly integrated with other HMI systems in the control room, such as the alarms system or the Safety Parameter Display System (SPDS). This paper provides a conceptual foundation for the development of a Plant-wide Diagnostic Monitoring System (PDMS), along with functional requirements for the system and other advanced HMI systems. Insights are presented into the design of an efficient and robust PDMS, which were gained from a critical review of various methodologies developed in the nuclear power industry, the chemical process industry, and the space technological community.

  5. An Automated System for the Detection of Stratified Squamous Epithelial Cancer Cell Using Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    Ram Krishna Kumar

    2013-06-01

    Full Text Available Early detection of cancer is a difficult problem, and if the disease is not detected in its starting phase it can be fatal. Current medical procedures used to diagnose cancer in body parts are time-consuming and require considerable laboratory work. This work is an endeavor toward the possible recognition of cancer cells in the body. The process consists of taking an image of the affected area and digitally processing the images to obtain a morphological pattern that differentiates normal cells from cancer cells. The technique differs from visual inspection and the biopsy process. Image processing enables the visualization of cellular structure with substantial resolution. The aim of the work is to exploit differences in cellular organization between cancerous and normal tissue using image processing techniques, thus allowing for automated, fast and accurate diagnosis.
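
    A minimal sketch of the segmentation-plus-morphology idea, assuming OpenCV; the thresholds and size limits are illustrative, not clinically validated values.

```python
import cv2
import numpy as np

def cell_features(gray):
    """Segment cells in a grayscale micrograph and compute simple
    morphological descriptors; parameter values are illustrative."""
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    _, mask = cv2.threshold(blur, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    feats = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < 50:  # discard specks
            continue
        perim = cv2.arcLength(c, True)
        # Irregular (potentially malignant) nuclei have low circularity
        circularity = 4 * np.pi * area / (perim * perim + 1e-9)
        feats.append({"area": area, "circularity": circularity})
    return feats
```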

  6. LIKELIHOOD ESTIMATION OF PARAMETERS USING SIMULTANEOUSLY MONITORED PROCESSES

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Ditlevsen, Ove Dalager

    2004-01-01

    The topic is maximum likelihood inference from several simultaneously monitored response processes of a structure to obtain knowledge about the parameters of other not monitored but important response processes when the structure is subject to some Gaussian load field in space and time...

  7. Research on machine vision system of monitoring injection molding processing

    Science.gov (United States)

    Bai, Fan; Zheng, Huifeng; Wang, Yuebing; Wang, Cheng; Liao, Si'an

    2016-01-01

    With the wide development of the injection molding process, an embedded monitoring system based on machine vision has been developed to automatically monitor abnormalities in injection molding processing. First, the hardware system and embedded software system were designed. Then camera calibration was carried out to establish an accurate model of the camera and correct distortion. Next, a segmentation algorithm was applied to extract the monitored objects of the injection molding process system. The realization procedure of the system included initialization, process monitoring and product detail detection. Finally, the experimental results were analyzed, including the detection rates for various kinds of abnormality. The system realizes multi-zone monitoring and product detail detection of the injection molding process with high accuracy and good stability.

  8. Automated analyser for monitoring trace amounts of volatile chloro-organic compounds in recirculated industrial water

    OpenAIRE

    Elżbieta Przyk; Jacek Namieśnik; Wojciech Chrzanowski; Andrzej Wasik; Wacław Janicki

    2002-01-01

    An automated analyser of volatile chloro-organic compounds in water was constructed and tested using standard mixtures of dichloromethane and dichloroethane. It was based on continuous, countercurrent gas stripping of the liquid sample followed by periodic trapping of the analytes on two traps alternately connected to the bubbler outlet, and thermal desorption. When one trap performed adsorption, the other underwent desorption and cooling. Analytes were detected by an ECD detector. Integratio...

  9. Automated longitudinal monitoring of in vivo protein aggregation in neurodegenerative disease C. elegans models

    OpenAIRE

    Cornaglia, Matteo; Krishnamani, Gopalan; Mouchiroud, Laurent; Sorrentino, Vincenzo; Lehnert, Thomas; Auwerx, Johan; Gijs, Martin A. M.

    2016-01-01

    Background While many biological studies can be performed on cell-based systems, the investigation of molecular pathways related to complex human dysfunctions – e.g. neurodegenerative diseases – often requires long-term studies in animal models. The nematode Caenorhabditis elegans represents one of the best model organisms for many of these tests and, therefore, versatile and automated systems for accurate time-resolved analyses on C. elegans are becoming highly desirable tools in the field. ...

  10. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual efforts in library routines, covering collection, storage, administration, processing, preservation, communication, etc.

  11. A fuzzy model for processing and monitoring vital signs in ICU patients

    Directory of Open Access Journals (Sweden)

    Valentim Ricardo AM

    2011-08-01

    Full Text Available Abstract Background The area of hospital automation has been the subject of much research, addressing relevant issues which can be automated, such as: management and control (electronic medical records, scheduling appointments, hospitalization, among others); communication (tracking patients, staff and materials); development of medical, hospital and laboratory equipment; monitoring (patients, staff and materials); and aid to medical diagnosis (according to each speciality). Methods In this context, this paper presents a fuzzy model for helping medical diagnosis of Intensive Care Unit (ICU) patients whose vital signs are monitored through a multiparameter heart screen. Intelligent systems techniques were used in the acquisition and processing (sorting, transforming, among others) of data into useful information, conducting a pre-diagnosis and providing, when necessary, alert signs to the medical staff. Conclusions The use of fuzzy logic in the medical area can be very useful if seen as a tool to assist specialists. This paper presented a fuzzy model able to monitor and classify the condition of the vital signs of hospitalized patients, sending alerts according to the pre-diagnosis performed, helping the medical diagnosis.
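
    A toy sketch of fuzzy classification of a single vital sign with triangular membership functions; the breakpoints are illustrative, not the paper's clinical rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def heart_rate_alert(hr_bpm):
    """Toy rule base -- breakpoints are illustrative, not clinical."""
    low = tri(hr_bpm, 20, 40, 60)
    normal = tri(hr_bpm, 50, 75, 100)
    high = tri(hr_bpm, 90, 140, 220)
    # Max-membership defuzzification into an alert label
    grades = {"bradycardia": low, "normal": normal, "tachycardia": high}
    return max(grades, key=grades.get), grades

print(heart_rate_alert(112))  # leans toward 'tachycardia'
```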

  12. Implications of critical chain methodology for business process flexible automation projects in economic organizations

    Directory of Open Access Journals (Sweden)

    Paul BRUDARU

    2009-12-01

    Full Text Available Business process flexible automation projects involve the use of methods and technologies from the Business Process Management (BPM) area that aim at increasing the agility of organizations in changing their business processes as a response to environmental changes. BPM-type projects are a mix between process improvement projects and software development, which implies a high complexity in managing them. The successful implementation of these projects involves overcoming inherent problems such as delays in project activities, multi-tasking and lack of focus, which cannot be solved by traditional project management tools. An approach which takes account of the difficulties of BPM projects is the critical chain methodology. Using the critical chain method provides the methodological foundation necessary for the successful completion of BPM-type projects.

  13. A Scheme for Automation of Telecom Data Processing for Business Application

    CERN Document Server

    Nair, T R Gopalakrishnan; V., Suma; Maharajan, Ezhilarasan

    2012-01-01

    As the telecom industry is witnessing large-scale growth, one of the major challenges in the domain is the analysis and processing of telecom transactional data, which are generated in large volumes by embedded-system communication controllers having various functions. This paper deals with the analysis of such raw data files, which are made up of sequences of tokens. It also depicts the method by which the files are parsed to extract the information, leading to final storage in predefined database tables. The parser is capable of reading the file in a line-structured way and storing the tokens into the predefined database tables. The whole process is automated using the SSIS tools available in SQL Server. A log table is maintained at each step of the process, which enables tracking of the file for any risk mitigation. It can extract, transform and load data, resulting in the processing.

  14. Automated work-flow for processing high-resolution direct infusion electrospray ionization mass spectral fingerprints

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    2007-01-01

    an automated data processing pipeline to compare large numbers of fingerprint spectra from direct infusion experiments analyzed by high-resolution MS. We describe some of the intriguing problems that have to be addressed, starting with the conversion and pre-processing of the raw data to the final data......The use of mass spectrometry (MS) is pivotal in analyses of the metabolome and presents a major challenge for subsequent data processing. While the last few years have given new high-performance instruments, there has not been a comparable development in data processing. In this paper we discuss...... of the preprocessing is applicable to cluster-, discriminant analysis, and related multivariate methods applied directly to mass spectra from direct infusion analysis of crude extracts. This is done to find the relationship between several terverticillate Penicillium species and identify the ions responsible
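
    A minimal sketch of one common pre-processing step for direct-infusion fingerprints: binning each spectrum into a fixed-length vector so that spectra become rows of a data matrix for cluster or discriminant analysis. Bin range and width are assumptions, not the authors' settings.

```python
import numpy as np

def bin_spectrum(mz, intensity, mz_min=100.0, mz_max=1000.0, width=1.0):
    """Reduce one centroided spectrum to a fixed-length vector by
    summing intensities in uniform m/z bins."""
    edges = np.arange(mz_min, mz_max + width, width)
    idx = np.digitize(mz, edges) - 1
    vec = np.zeros(len(edges) - 1)
    ok = (idx >= 0) & (idx < len(vec))
    np.add.at(vec, idx[ok], np.asarray(intensity)[ok])
    return vec / (vec.sum() or 1.0)  # total-ion normalization

# Stacking such vectors row-wise yields the matrix that cluster- and
# discriminant-analysis methods operate on.
```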

  15. Annotated bibliography of films in automation, data processing, and computer science

    CERN Document Server

    Soloman, Martin B Jr

    2015-01-01

    With the rapid development of computer science and the expanding use of computers in all facets of American life, there has been made available a wide range of instructional and informational films on automation, data processing, and computer science. Here is the first annotated bibliography of these and related films, gathered from industrial, institutional, and other sources.This bibliography annotates 244 films, alphabetically arranged by title, with a detailed subject index. Information is also provided concerning the intended audience, rental-purchase data, ordering procedures, and such s

  16. Instrumentation, Field Network and Process Automation for the Cryogenic System of the LHC Test String

    OpenAIRE

    Suraci, A.; Bager, T.; Balle, Ch.; Blanco, E.; Casas, J.; Gomes, P.; Pelletier, S.; Serio, L.; Vauthier, N.

    2001-01-01

    CERN is now setting up String 2, a full-size prototype of a regular cell of the LHC arc. It is composed of two quadrupole magnets, six dipole magnets, and a separate cryogenic distribution line (QRL) for the supply and recovery of the cryogen. An electrical feed box (DFB), with up to 38 High Temperature Superconducting (HTS) leads, powers the magnets. About 700 sensors and actuators are distributed along four Profibus DP and two Profibus PA field buses. The process automation is handled by two contro...

  17. Web-based execution of graphical work-flows: a modular platform for multifunctional scientific process automation

    International Nuclear Information System (INIS)

    The Passerelle process automation suite offers a fundamentally modular solution platform, based on a layered integration of several best-of-breed technologies. It has been successfully applied by Synchrotron Soleil as the sequencer for data acquisition and control processes on its beamlines, integrated with TANGO as a control bus and GlobalScreen(TM) as the SCADA package. Since last year it has been used as the graphical work-flow component for the development of an Eclipse-based Data Analysis Work Bench at ESRF. The top layer of Passerelle exposes an actor-based development paradigm, based on the Ptolemy framework (UC Berkeley). Actors provide explicit reusability and strong decoupling, combined with an inherently concurrent execution model. Actor libraries exist for TANGO integration, web services, database operations, flow control, rules-based analysis, mathematical calculations, launching external scripts, etc. Passerelle's internal architecture is based on OSGi, the major Java framework for modular service-based applications. A large set of modules exists that can be recombined as desired to obtain different features and deployment models. Besides desktop versions of the Passerelle work-flow workbench, there is also the Passerelle Manager. It is a secured web application, including a graphical editor, for centralized design, execution, management and monitoring of process flows, integrating standard Java Enterprise services with OSGi. We will present the internal technical architecture, some interesting application cases and the lessons learnt. (authors)

  18. Processing of the WLCG monitoring data using NoSQL

    Science.gov (United States)

    Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.
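
    A minimal sketch of the kind of aggregation such NoSQL processing performs, using pymongo against MongoDB; the database, collection and field names (and ISO-string dates) are hypothetical, not the Experiment Dashboard schema.

```python
from pymongo import MongoClient

# Hypothetical schema: one document per job, with 'site' and an ISO
# date string in 'finished'. Names are illustrative only.
client = MongoClient("mongodb://localhost:27017")
jobs = client["wlcg_monitoring"]["jobs"]

# Completed-job counts per site for one day, computed inside the store
pipeline = [
    {"$match": {"finished": {"$gte": "2014-06-01", "$lt": "2014-06-02"}}},
    {"$group": {"_id": "$site", "done": {"$sum": 1}}},
    {"$sort": {"done": -1}},
]
for row in jobs.aggregate(pipeline):
    print(row["_id"], row["done"])
```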

  19. Processing of the WLCG monitoring data using NoSQL

    International Nuclear Information System (INIS)

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  20. Development and testing of an automated High-resolution InSAR volcano-monitoring system in the MED-SUV project

    Science.gov (United States)

    Chowdhury, Tanvir Ahmed; Minet, Christian; Fritz, Thomas; Rodriguez Gonzalez, Fernando

    2015-04-01

    Volcanic unrest, which produces a variety of geological and hydrological hazards, is difficult to predict, so it is important to monitor volcanoes continuously. The monitoring of active volcanoes requires the reliable measurement of surface deformation before, during and after volcanic activity. Besides improving the understanding of the geophysical processes underlying the volcanic systems of Vesuvius/Campi Flegrei and Mt. Etna, one of the main goals of the MED-SUV (MEDiterranean SUpersite Volcanoes) project is to design a system for automatically monitoring ground deformation over active volcanoes. Space-borne synthetic aperture radar (SAR) interferometry (InSAR), persistent scatterer interferometry (PSI) and the small baseline subset algorithm (SBAS) provide powerful tools for observing surface changes with millimeter accuracy. All the mentioned techniques address the challenges by exploiting medium to large SAR image stacks. The generation of interferometric products constitutes a major effort in terms of processing and planning. It requires a high degree of automation, robustness and quality control of the overall process. As a consequence of these requirements and constraints, the Integrated Wide Area Processor (IWAP) developed at DLR is introduced in the framework of a remote sensing task of the MED-SUV project. The IWAP has been conceived and designed to optimize the processing workflow in order to minimize the processing time. Moreover, a quality control concept has been developed and integrated into the workflow. The IWAP is structured into three parts: (i) preparation of an order file containing some configuration parameters, which invokes the processor; (ii) manual interactions performed by the operator, upon request from the processor, by means of visual interfaces; (iii) analysis of the final product supported by extensive product visualization. This visualization supports the interpretation of the results without the need of

  1. Quality control of environmental radiation monitoring process

    International Nuclear Information System (INIS)

    This report summarizes the analytical results of the Environmental Monitoring Program of the Centro de Desenvolvimento da Tecnologia Nuclear (CDTN) for the period January 2003 to September 2003. A statistical treatment using control charts for periodicity and trend analysis according to temporal variation is also carried out. Moreover, a comparison of radioactive and stable element concentrations with the derived and intake limits for ingestion and inhalation recommended by the Comissao Nacional de Energia Nuclear, the Fundacao Estadual do Meio Ambiente (FEAM) and the Instituto Brasileiro do Meio Ambiente (IBAMA) is performed. The results comply with those recommended by the legislation. (author)

  2. An improved, computer-based, on-line gamma monitor for plutonium anion exchange process control

    International Nuclear Information System (INIS)

    An improved, low-cost, computer-based system has replaced a previously developed on-line gamma monitor. Both instruments continuously profile uranium, plutonium, and americium in the nitrate anion exchange process used to recover and purify plutonium at the Los Alamos Plutonium Facility. The latest system incorporates a personal computer that provides full-feature multichannel analyzer (MCA) capabilities by means of a single-slot, plug-in integrated circuit board. In addition to controlling all MCA functions, the computer program continuously corrects for gain shift and performs all other data processing functions. This Plutonium Recovery Operations Gamma Ray Energy Spectrometer System (PROGRESS) provides on-line process operational data essential for efficient operation. By identifying abnormal conditions in real time, it allows operators to take corrective actions promptly. The decision-making capability of the computer will be of increasing value as we implement automated process-control functions in the future. 4 refs., 6 figs
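
    A minimal numpy sketch of one way to correct gain shift: track the centroid of a known reference peak and re-anchor it to its nominal channel. The windowing and interpolation choices are assumptions for illustration, not the PROGRESS implementation.

```python
import numpy as np

def gain_correct(spectrum, ref_channel, window=15):
    """Rescale the channel axis so a known reference peak sits back at
    ref_channel; a crude stand-in for automatic gain-shift correction.
    Assumes the peak stays within +/- window channels of nominal."""
    counts = np.asarray(spectrum, dtype=float)
    lo, hi = ref_channel - window, ref_channel + window
    seg = counts[lo:hi]
    # Intensity-weighted centroid of the drifted reference peak
    centroid = (seg * np.arange(lo, hi)).sum() / seg.sum()
    gain = ref_channel / centroid
    # Resample the spectrum onto the corrected channel axis
    src = np.arange(len(counts)) * gain
    return np.interp(np.arange(len(counts)), src, counts)
```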

  3. Subsea flow assurance and process monitoring via gamma radiation

    International Nuclear Information System (INIS)

    Condition monitoring and process control with the use of gamma radiation is considered to be the most reliable detection principle for a wide range of applications throughout the oil and gas industries, from measuring mechanical integrity to dynamic process fluid monitoring. The growing numbers of advanced subsea processing projects and pipeline flow assurance studies currently adds an increasing number of subsea applications to the radiation measurement reference list (author) (ml)

  4. Monitoring Assertion-Based Business Processes

    NARCIS (Netherlands)

    Aiello, Marco; Lazovik, Alexander

    2006-01-01

    Business processes that span organizational borders describe the interaction between multiple parties working towards a common objective. They also express business rules that govern the behavior of the process and account for expressing changes reflecting new business objectives and new market situ

  5. Application of PLC’s for Automation of Processes in Industries

    Directory of Open Access Journals (Sweden)

    Rahul Pawar

    2016-06-01

    Full Text Available Several industries utilize sequential industrial processes which are repetitive in nature. For such processes, industries have had to depend upon relays, stepping drums, timers and controls, and considerable difficulty is experienced in the reprogramming necessitated by changes in the nature of production. Often the whole system has to be scrapped and redesigned. To overcome these problems the PLC control system was introduced. The PLC can be described as a control ladder comprising a sequence program. A PLC sequence program consists of normally open and normally closed contacts connected in parallel or in series. It also has relay coils, which turn ON and OFF as the state of these contacts changes. In this paper, all aspects of these powerful and versatile tools and their application to process automation are discussed
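
    A conceptual Python sketch of a PLC scan cycle, modeling rungs of normally-open/normally-closed contacts as boolean expressions; illustrative only, since real ladder logic executes on the controller, not in Python.

```python
# Minimal software model of a PLC scan cycle: read inputs, evaluate
# ladder "rungs" as boolean expressions, write outputs each scan.

def scan(inputs, state):
    """One scan: NO contact = plain value, NC contact = negation."""
    rungs = {
        # Motor seal-in rung: (start OR motor) AND NOT stop
        "motor": (inputs["start"] or state["motor"]) and not inputs["stop"],
        # Alarm rung: energized when the overload NC contact drops out
        "alarm": not inputs["overload_ok"],
    }
    state.update(rungs)
    return state

state = {"motor": False, "alarm": False}
state = scan({"start": True, "stop": False, "overload_ok": True}, state)
print(state)  # {'motor': True, 'alarm': False}
```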

  6. Optical sensors for process monitoring in biotechnology

    Science.gov (United States)

    Ploetz, F.; Schelp, Carsten; Anders, K.; Eberhardt, F.; Scheper, Thomas-Helmut; Bueckmann, F.

    1991-09-01

    The development and application of various optical sensors are presented. Among these are optical sensors (optrodes) with immobilized enzymes, coenzymes, and labeled antibodies. The NADH formation of coenzyme-dependent enzymes was used for the detection of lactate, pyruvate, mannitol, ethanol, and formate. An enzyme optrode based on a pH optrode as a transducer for the monitoring of urea and penicillin in fermentation media was developed. For preparing an oxygen optrode, oxygen-sensitive fluorophores were entrapped in a gas-permeable silicone matrix attached to the distal end of a bifurcated fiber-optic waveguide bundle. By labeling immunocomponents with fluorophores, or with enzymes which generate fluorophores or chromophores, immunoreactions were observed by optical sensors.

  7. Automated Formosat Image Processing System for Rapid Response to International Disasters

    Science.gov (United States)

    Cheng, M. C.; Chou, S. C.; Chen, Y. C.; Chen, B.; Liu, C.; Yu, S. J.

    2016-06-01

    FORMOSAT-2, Taiwan's first remote sensing satellite, was successfully launched in May of 2004 into a Sun-synchronous orbit at 891 kilometers of altitude. With its daily revisit feature, the 2-m panchromatic and 8-m multi-spectral resolution images captured have been used for research and operations in various societal benefit areas. This paper details the orchestration of tasks conducted in different institutions in Taiwan in the efforts responding to international disasters. The institutions involved include its space agency, the National Space Organization (NSPO), the Center for Satellite Remote Sensing Research of National Central University, the GIS Center of Feng-Chia University, and the National Center for High-performance Computing. Since each institution has its own mandate, the coordinated tasks ranged from receiving emergency observation requests, scheduling and tasking of satellite operation, and downlink to ground stations, to image processing including data injection, ortho-rectification, and delivery of image products. With the lessons learned from working with international partners, the FORMOSAT Image Processing System has been extensively automated and streamlined with the goal of shortening the time between request and delivery in an efficient manner. The integrated team has developed an Application Interface to its system platform that provides functions of search in the archive catalogue, request of data services, mission planning, inquiry of service status, and image download. This automated system enables timely image acquisition and substantially increases the value of the data products. An example outcome of these efforts, in a recent response supporting Sentinel Asia during the Nepal Earthquake, is demonstrated herein.

  8. A Noble Approach of Process Automation in Galvanized Nut, Bolt Manufacturing Industry

    Directory of Open Access Journals (Sweden)

    Akash Samanta

    2012-05-01

    "Corrosion costs money": the Battelle Columbus Institute estimates that corrosion costs Americans more than $220 billion annually, about 4.3% of the gross national product [1]. Nowadays, due to increasing pollution, the rate of corrosion is also increasing day by day, mainly in India; so, to save steel structures, galvanizing is the best and simplest solution. For this reason galvanizing industries have been growing since the mid-1700s. Galvanizing is a controlled metallurgical combination of zinc and steel that can provide corrosion resistance in a wide variety of environments. In fact, the corrosion resistance factor of galvanized metal can be some 70 to 80 times greater than that of the base metal. Keeping in mind the importance of this industry, a novel approach to process automation in a galvanized nut-bolt manufacturing plant is presented here, as nuts and bolts are the prime ingredients of any structure. In this paper the main objectives of any industry, like survival, profit maximization, profit satisfying and sales growth, are addressed. Furthermore, environmental aspects, i.e. pollution control and energy saving, are also considered. The whole automation process is implemented using a programmable logic controller (PLC), which has a number of unique advantages: it is faster and more reliable, requires less maintenance, and is reprogrammable. The whole system has been designed and tested using a GE FANUC PLC.

  9. An automated and integrated framework for dust storm detection based on ogc web processing services

    Science.gov (United States)

    Xiao, F.; Shea, G. Y. K.; Wong, M. S.; Campbell, J.

    2014-11-01

    Dust storms are known to have adverse effects on public health. Atmospheric dust loading is also one of the major uncertainties in global climatic modelling, as it is known to have a significant impact on the radiation budget and atmospheric stability. The complexity of building scientific dust storm models is coupled with advances in scientific computation, ongoing computing platform development, and the development of heterogeneous Earth Observation (EO) networks. It is a challenging task to develop an integrated and automated scheme for dust storm detection that combines Geo-Processing frameworks, scientific models and EO data to enable dust storm detection and tracking in a dynamic and timely manner. This study develops an automated and integrated framework for dust storm detection and tracking based on the Web Processing Service (WPS) standard initiated by the Open Geospatial Consortium (OGC). The presented WPS framework consists of an EO data retrieval component, a dust storm detecting and tracking component, and a service chain orchestration engine. The EO data processing component is implemented based on the OPeNDAP standard. The dust storm detecting and tracking component combines three earth scientific models: the SBDART model (for computing the aerosol optical depth (AOT) of dust particles), the WRF model (for simulating meteorological parameters) and the HYSPLIT model (for simulating dust storm transport processes). The service chain orchestration engine is implemented based on the Business Process Execution Language for Web Services (BPEL4WS) using open-source software. The output results, including the horizontal and vertical AOT distribution of dust particles as well as their transport paths, were represented using KML/XML and displayed in Google Earth. A serious dust storm, which occurred over East Asia from 26 to 28 Apr 2012, is used to test the applicability of the proposed WPS framework. Our aim here is to solve a specific instance of a complex EO data
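
    Invoking such a WPS-orchestrated chain from a client could look roughly like the following Python sketch using OWSLib; the endpoint URL, process identifier and input names are hypothetical stand-ins for the dust storm service described above:

        from owslib.wps import WebProcessingService, monitorExecution

        # Hypothetical endpoint and process id, for illustration only
        wps = WebProcessingService('http://example.org/wps')
        execution = wps.execute('DustStormDetection',
                                inputs=[('startDate', '2012-04-26'),
                                        ('endDate', '2012-04-28'),
                                        ('region', 'EastAsia')])
        monitorExecution(execution, sleepSecs=10)   # poll until the chain completes
        for out in execution.processOutputs:
            print(out.identifier, out.reference)    # e.g. URL of the KML result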

  10. An Automated Capacitance-Based Fuel Level Monitoring System for Networked Tanks

    Directory of Open Access Journals (Sweden)

    Oke Alice O

    2015-08-01

    The making of an effective fuel measuring system has been a great challenge in the Nigerian industry, as various oil organizations run into different problems ranging from fire outbreaks, oil pilfering and oil spillage to some other negative effects. The use of a meter rule or long rod at most petrol filling stations for assessing the quantity of fuel in a tank is inefficient, stressful, dangerous and almost impossible in a networked environment. This archaic method does not provide a good reorder date and does not give a good inventory. As such there is a need to automate the system by providing real-time measurement of the fuel storage device to meet the demand of the customers. In this paper, a system was designed to sense the level of fuel in networked tanks using a capacitive sensor controlled by an ATmega328 Arduino microcontroller. The result was transmitted in both digital and analogue form through radio-frequency transmission using XBee and interfaced to a computer system for notification of fuel level and refill operations. This enables consumption control, cost analysis and tax accounting for fuel purchases
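
    The capacitance-to-level conversion such a probe relies on is linear for a uniform probe, because fuel (higher permittivity than air) raises the capacitance in proportion to the wetted height; a minimal sketch, assuming calibrated empty and full readings:

        def fuel_level(c_measured, c_empty, c_full, tank_height):
            """Interpolate between dry and fully submerged capacitance readings."""
            frac = (c_measured - c_empty) / (c_full - c_empty)
            return max(0.0, min(1.0, frac)) * tank_height

        # Illustrative calibration: 100 pF empty, 180 pF full, 2.0 m tank.
        # A 120 pF reading then corresponds to 0.5 m of fuel.
        print(fuel_level(120e-12, 100e-12, 180e-12, 2.0))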

  11. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing

    Science.gov (United States)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo

    2016-04-01

    The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both the position and the depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) over 2D acquisitions are well known. Nonetheless, the amount of data acquired in such 3D acquisitions does not usually allow processing and interpretation to be completed directly in the field in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low impact mini-trench" technique (addressed in the ITU (International Telecommunication Union) L.83 recommendation) requires that non-destructive mapping of buried services enhance its productivity to match the improvements of new digging equipment. Nowadays, multi-antenna and multi-pass GPR acquisitions call for new processing techniques that can obtain high-quality subsurface images while taking full advantage of 3D data: the development of a fully automated, real-time 3D GPR processing system plays a key role in overall optical network deployment profitability. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets and which can be applied in real time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency). The proposed processing
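
    One building block of any automated GPR processing chain of this kind is background (mean-trace) removal, which suppresses the horizontal ringing that masks the hyperbolic reflections of buried utilities; a minimal, hedged sketch (the trace layout and the mean-trace method are generic choices, not the authors' scheme):

        import numpy as np

        def remove_background(bscan):
            """bscan: 2D array of shape (n_traces, n_samples).
            Subtract the average trace from every trace."""
            return bscan - bscan.mean(axis=0, keepdims=True)

        bscan = np.random.randn(500, 256) + 0.5       # toy B-scan with a DC background
        clean = remove_background(bscan)
        print(abs(clean.mean(axis=0)).max() < 1e-12)  # True: background removed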

  12. Monitoring of steam sterilization processes in the dental office

    NARCIS (Netherlands)

    J.P.C.M. van Doornmalen; A.G.M. Rietmeijer; A.J. Feilzer; K. Kopinga

    2013-01-01

    In dental offices steam sterilization is used to prevent infection of staff and patients. The necessity of sterilization is obvious. To ensure effective sterilization processes, each load has to be monitored. Based on literature and standards, a state-of-the-art concept of every-load monitoring is described.

  13. [Fetal ECG monitoring system based on MCU processing].

    Science.gov (United States)

    Hu, Gang; Chen, Wei; Xie, Xicheng; Zhang, Hao

    2004-12-01

    In order to monitor the fetus during labor, the signal characteristics of the fetal scalp electrode were researched. An adaptive algorithm and peak-to-peak detection are adopted in the signal processing, and an adaptive gain control method is used to eliminate disturbance from baseline shift. A fetal ECG monitoring system was designed on the basis of a C8051F020 MCU.
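
    As a hedged illustration of the peak-to-peak detection step (not the C8051F020 firmware itself), R-peaks and heart rate can be extracted from a digitized trace along these lines; the sample rate, threshold and toy signal are assumptions:

        import numpy as np
        from scipy.signal import find_peaks

        fs = 1000                                  # Hz, assumed sample rate
        t = np.arange(0, 10, 1 / fs)
        ecg = np.sin(2 * np.pi * 2.2 * t) ** 15    # toy trace with sharp ~2.2 Hz peaks

        peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.25 * fs))
        rr = np.diff(peaks) / fs                   # R-R intervals, seconds
        print(round(60.0 / rr.mean()))             # heart rate, ~132 bpm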

  14. An alarm filtering system for an automated process: a multiple-agent approach

    International Nuclear Information System (INIS)

    Nowadays, the supervision of industrial installations is more and more complex, involving the automation of their control. A malfunction generates an avalanche of alarms. The operator, in charge of the supervision, must face the incident and execute the right actions to recover a normal situation. Generally, he is drowned under the great number of alarms. Our aim, in the frame of this research, is to develop an alarm filtering system for an automated metro line, to help the operator find the main alarm responsible for the malfunction. Our work is divided into two parts, both dealing with the study and development of an alarm filtering system but using two different approaches. The first part was developed in the frame of the SARA project (an operator assistance system for an automated metro line), an expert system prototype helping the operators of a command center. In this part, a centralized approach was used, representing the events with a single event graph and using a global procedure to perform diagnosis. This approach has shown its limits. In the second part of our work, we considered distributed artificial intelligence (DAI) techniques, and more especially the multi-agent approach. The multi-agent approach was motivated by the natural distribution of the metro line equipment and by the fact that each piece of equipment has its own local control and knowledge. Thus, each piece of equipment was considered as an autonomous agent. Through agent cooperation, the system is able to determine the main alarm and the faulty equipment responsible for the incident. A prototype, written in SPIRAL (a tool for knowledge-based systems), is running on a workstation. This prototype has allowed the concretization and validation of our multi-agent approach. (author)
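
    A toy Python sketch makes the agent cooperation concrete: each equipment agent knows its upstream dependencies, and an alarm is retained as the main alarm only if no upstream equipment is also alarming. The topology and protocol are illustrative, not those of the SARA prototype:

        DEPENDS_ON = {                # downstream equipment -> upstream causes
            "track_circuit": ["power_substation"],
            "signal": ["track_circuit"],
            "train": ["signal", "power_substation"],
        }

        def root_alarms(alarming):
            """Keep only alarms not explained by an alarming upstream equipment."""
            roots = set()
            for eq in alarming:
                if not any(u in alarming for u in DEPENDS_ON.get(eq, [])):
                    roots.add(eq)     # no alarming cause upstream: candidate root
            return roots

        print(root_alarms({"train", "signal", "track_circuit", "power_substation"}))
        # {'power_substation'}: the main alarm to present to the operator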

  15. Automated microfluidic platform of bead-based electrochemical immunosensor integrated with bioreactor for continual monitoring of cell secreted biomarkers

    Science.gov (United States)

    Riahi, Reza; Shaegh, Seyed Ali Mousavi; Ghaderi, Masoumeh; Zhang, Yu Shrike; Shin, Su Ryon; Aleman, Julio; Massa, Solange; Kim, Duckjin; Dokmeci, Mehmet Remzi; Khademhosseini, Ali

    2016-04-01

    There is an increasing interest in developing microfluidic bioreactors and organs-on-a-chip platforms combined with sensing capabilities for continual monitoring of cell-secreted biomarkers. Conventional approaches such as ELISA and mass spectrometry cannot satisfy the needs of continual monitoring as they are labor-intensive and not easily integrable with low-volume bioreactors. This paper reports on the development of an automated microfluidic bead-based electrochemical immunosensor for in-line measurement of cell-secreted biomarkers. For the operation of the multi-use immunosensor, disposable magnetic microbeads were used to immobilize biomarker-recognition molecules. Microvalves were further integrated in the microfluidic immunosensor chip to achieve programmable operations of the immunoassay including bead loading and unloading, binding, washing, and electrochemical sensing. The platform allowed convenient integration of the immunosensor with liver-on-chips to carry out continual quantification of biomarkers secreted from hepatocytes. Transferrin and albumin production was monitored during a 5-day hepatotoxicity assessment in which human primary hepatocytes cultured in the bioreactor were treated with acetaminophen. Taken together, our unique microfluidic immunosensor provides a new platform for in-line detection of biomarkers in low volumes and long-term in vitro assessments of cellular functions in microfluidic bioreactors and organs-on-chips.

  16. An Automated Electronic Tongue for In-Situ Quick Monitoring of Trace Heavy Metals in Water Environment

    Science.gov (United States)

    Cai, Wei; Li, Yi; Gao, Xiaoming; Guo, Hongsun; Zhao, Huixin; Wang, Ping

    2009-05-01

    An automated electronic tongue instrumentation has been developed for in-situ determination of trace heavy metal concentrations in the water environment. The electronic tongue contains two main parts. The sensor part consists of a silicon-based Hg-coated Au microelectrode array (MEA) for the detection of Zn(II), Cd(II), Pb(II) and Cu(II), and a multiple light-addressable potentiometric sensor (MLAPS) for the detection of Fe(III) and Cr(VI). The control part employs pumps, valves and tubes to enable the pick-up and pretreatment of aqueous samples. The electronic tongue achieved detection of the six metals mentioned above at the part-per-billion (ppb) level without manual operation. This instrumentation will have wide application in the quick monitoring and prediction of heavy metal pollution in lakes and oceans.

  17. ONLINE WATER MONITORING UTILIZING AN AUTOMATED MICROARRAY BIOSENSOR INSTRUMENT - PHASE I

    Science.gov (United States)

    Constellation Technology Corporation (Constellation) proposes the use of an integrated recovery and detection system for online water supply monitoring.  The integrated system is designed to efficiently capture and recover pathogens such as bacteria, viruses, parasites, an...

  18. Adaptive Soa Stack-Based Business Process Monitoring Platform

    Directory of Open Access Journals (Sweden)

    Przemysław Dadel

    2014-01-01

    Executable business processes that formally describe company activities are well placed in the SOA environment, as they allow for declarative organization of high-level system logic. However, for both technical and non-technical users to fully benefit from that element of abstraction, appropriate business process monitoring systems are required, and existing solutions remain unsatisfactory. The paper discusses the problem of business process monitoring in the context of the service orientation paradigm in order to propose an architectural solution and provide an implementation of a system for business process monitoring that alleviates the shortcomings of the existing solutions. Various platforms are investigated to obtain a broader view of the monitoring problem and to gather functional and non-functional requirements. These requirements constitute input for the further analysis and the system design. The monitoring software is then implemented and evaluated according to the specified criteria. An extensible business process monitoring system was designed and built on top of OSGiMM, a dynamic, event-driven, configurable communications layer that provides real-time monitoring capabilities for various types of resources. The system was tested against the stated functional requirements and its implementation provides a starting point for further work. It is concluded that providing a uniform business process monitoring solution that satisfies a wide range of users and business process platform vendors is a difficult endeavor. It is furthermore reasoned that only an extensible, open-source monitoring platform built on top of a scalable communication core has a chance to address all the stated and future requirements.

  19. Signal Processing Methods Monitor Cranial Pressure

    Science.gov (United States)

    2010-01-01

    Dr. Norden Huang, of Goddard Space Flight Center, invented a set of algorithms (called the Hilbert-Huang Transform, or HHT) for analyzing nonlinear and nonstationary signals that developed into a user-friendly signal processing technology for analyzing time-varying processes. At an auction managed by Ocean Tomo Federal Services LLC, licenses of 10 U.S. patents and 1 domestic patent application related to HHT were sold to DynaDx Corporation, of Mountain View, California. DynaDx is now using the licensed NASA technology for medical diagnosis and prediction of brain blood flow-related problems, such as stroke, dementia, and traumatic brain injury.
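
    The HHT couples empirical mode decomposition with Hilbert spectral analysis; as a hedged sketch of the second stage only (not NASA's patented implementation), the instantaneous frequency of an oscillatory signal can be computed with SciPy:

        import numpy as np
        from scipy.signal import hilbert

        fs = 100.0                                   # Hz, assumed sample rate
        t = np.arange(0, 10, 1 / fs)
        x = np.cos(2 * np.pi * (1.0 + 0.1 * t) * t)  # chirp: frequency rises with time

        phase = np.unwrap(np.angle(hilbert(x)))      # phase of the analytic signal
        inst_freq = np.diff(phase) * fs / (2 * np.pi)
        print(inst_freq[500:505].round(2))           # ~2 Hz near t = 5 s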

  20. Honeywell Modular Automation System Computer Software Documentation for the Magnesium Hydroxide Precipitation Process

    International Nuclear Information System (INIS)

    The purpose of this Computer Software Document (CSWD) is to provide configuration control of the Honeywell Modular Automation System (MAS) in use at the Plutonium Finishing Plant (PFP) for the Magnesium Hydroxide Precipitation Process in Rm 230C/234-52. This CSWD describes the hardware and the PFP/FFS-developed software for control of the Magnesium Hydroxide Precipitation process located in room 230, 234-52. The Honeywell and PlantScape software generate limited configuration reports for the developed control software. These reports are described in the following section and are attached as addenda. This plan applies to the PFP Engineering Manager, Thermal Stabilization Cognizant Engineers, Solutions Stabilization Engineers, and the Shift Technical Advisors responsible for the Honeywell MAS software/hardware and administration of the Honeywell system

  1. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    Science.gov (United States)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created using dedicated structures such as a static random-access memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as input for choosing locations for critical-dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact-modeling simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to an N-well lying too close to a P-well. Following this example, for process monitoring and variability analyses we extensively used this method to analyze transistor gates having different shapes. In addition, an analysis of a large area of a high-density standard cell library was done. Another set of monitoring, focused on a high-density SRAM array, is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified asymmetric transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted based on contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data demonstrate the successful in-field implementation of our methodology as a useful process monitoring method.

  2. MAC3 Evaluation: Monitoring Process, Documenting Outcomes

    Science.gov (United States)

    Korey, Jane

    2010-01-01

    The role of evaluation is to determine whether a project achieves what it sets out to do. Using a strategy often referred to as "backwards planning" or "backwards research design," the evaluation process operationalizes project goals and then, asking the question "What would success look like?" identifies measurable indices of success (Friedman,…

  3. 10 CFR 74.53 - Process monitoring.

    Science.gov (United States)

    2010-01-01

    ... estimated measurement standard deviation greater than five percent that is either input or output material... differences greater than three times the estimated standard deviation of the process difference estimator and... times the estimated standard error of the inventory difference estimator; (2) Evaluate material...

  4. APACS: Monitoring and diagnosis of complex processes

    International Nuclear Information System (INIS)

    This paper describes APACS - a new framework for a system that detects, predicts and identifies faults in industrial processes. The APACS framework provides a structure in which a heterogeneous set of programs can share a common view of the problem and a common model of the domain. (author). 17 refs, 2 figs

  5. CMS Tracker Integration: monitoring the process quality

    International Nuclear Information System (INIS)

    The CMS experiment at LHC features the largest Silicon Strip Tracker (SST) ever built. This detector is composed of about 15000 modules, so the potential problems of the system come from its complexity. This article covers the tests performed during the tracker integration, describing their motivations in terms of process quality assurance

  6. FY09 PROGRESS: MULTI-ISOTOPE PROCESS (MIP) MONITOR

    Energy Technology Data Exchange (ETDEWEB)

    Schwantes, Jon M.; Orton, Christopher R.; Fraga, Carlos G.; Christensen, Richard; Laspe, Amy R.; Ward, Rebecca M.

    2009-10-18

    Model and experimental estimates of the Multi-Isotope Process Monitor performance for determining burnup after dissolution and acid concentration during solvent extraction steps during reprocessing of spent nuclear fuel are presented.

  7. Conflict Monitoring in Dual Process Theories of Thinking

    Science.gov (United States)

    De Neys, Wim; Glumicic, Tamara

    2008-01-01

    Popular dual process theories have characterized human thinking as an interplay between an intuitive-heuristic and a demanding-analytic reasoning process. Although monitoring the output of the two systems for conflict is crucial to avoid decision-making errors, there are some widely different views on the efficiency of the process. Kahneman…

  8. A log mining approach for process monitoring in SCADA

    NARCIS (Netherlands)

    Hadziosmanovic, Dina; Bolzoni, Damiano; Hartel, Pieter

    2012-01-01

    SCADA (Supervisory Control and Data Acquisition) systems are used for controlling and monitoring industrial processes. We propose a methodology to systematically identify potential process-related threats in SCADA. Process-related threats take place when an attacker gains user access rights and performs actions that look legitimate but are intended to disrupt the industrial process.

  9. Access Control for Monitoring System-Spanning Business Processes

    NARCIS (Netherlands)

    Bassil, S.; Reichert, M.U.; Bobrik, R.; Bauer, Th.

    2007-01-01

    Integrated process support is highly desirable in environments where data related to a particular (business) process are scattered over distributed and heterogeneous information systems (IS). A process monitoring component is a much-needed module in order to provide an integrated view on all these IS.

  10. Ultrasonic flow measurements for irrigation process monitoring

    Science.gov (United States)

    Ziani, Elmostafa; Bennouna, Mustapha; Boissier, Raymond

    2004-02-01

    This paper presents the state of the art of the general principles of liquid flow measurement by the ultrasonic method, and problems of flow measurement. We present an ultrasonic flowmeter designed according to the smart-sensor concept for the measurement of irrigation water flowing through pipelines or open channels, using the ultrasonic transit-time approach. The new flowmeter works on the principle of measuring the time delay difference between sound pulses transmitted upstream and downstream in the flowing liquid. The speed of sound in the flowing medium is eliminated as a variable because the flowrate calculations are based on the reciprocals of the transmission times. The transit-time difference is digitally measured by means of suitable, microprocessor-controlled logic. This type of ultrasonic flowmeter will be widely used in industry and water management; it is well studied in this work, followed by some experimental results. For pressurized channels, we use one pair of ultrasonic transducers arranged in proper positions and directions on the pipe; in this case, to determine the liquid velocity, a real-time on-line analysis taking into account the geometry of the hydraulic system is applied to the obtained ultrasonic data. In open channels, we use a single pair or two pairs of ultrasonic emitter-receivers according to the desired performance. Finally, the goals of this work are to integrate the smart sensor into irrigation system monitoring in order to evaluate potential advantages and demonstrate its performance, and, on the other hand, to understand and use the ultrasonic approach for determining flow characteristics and improving flow measurements by reducing errors caused by disturbances of the flow profiles.
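
    The reciprocal-time computation the abstract refers to can be written down directly: for an acoustic path of length L inclined at angle theta to the flow, v = L / (2 cos theta) * (1/t_down - 1/t_up), so the speed of sound cancels. A short sketch with illustrative values:

        import math

        def flow_velocity(t_down, t_up, path_length, angle_deg):
            """Axial flow velocity from upstream/downstream transit times."""
            theta = math.radians(angle_deg)
            return path_length / (2 * math.cos(theta)) * (1 / t_down - 1 / t_up)

        # Round-trip check: 0.2 m path at 45 degrees in water (c ~ 1480 m/s), v = 1 m/s
        L, th, c, v = 0.2, 45.0, 1480.0, 1.0
        t_d = L / (c + v * math.cos(math.radians(th)))
        t_u = L / (c - v * math.cos(math.radians(th)))
        print(round(flow_velocity(t_d, t_u, L, th), 6))   # recovers ~1.0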

  11. a Semi-Automated Point Cloud Processing Methodology for 3d Cultural Heritage Documentation

    Science.gov (United States)

    Kıvılcım, C. Ö.; Duran, Z.

    2016-06-01

    The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. On the other hand, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation has become an active area of research; however, fully automated systems in cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We present the contribution of our methodology, which we implemented in an open-source software environment, using the example project of a 16th-century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.

  12. A METHOD OF COMPLEX AUTOMATED MONITORING OF UKRAINIAN POWER ENERGY SYSTEM OBJECTS TO INCREASE ITS OPERATION SAFETY

    Directory of Open Access Journals (Sweden)

    Ye.I. Sokol

    2016-05-01

    The paper describes an algorithm for the complex automated monitoring of Ukraine's power energy system, aimed at ensuring the safety of its personnel and equipment. This monitoring involves the use of unmanned aerial vehicles (UAVs) for planned and unplanned registration of the status of power transmission lines (PTL) and high-voltage substations (HVS). It is assumed that unscheduled overflights will be made in emergency situations on power lines. With the help of the UAV, pictures of transmission lines and HVSs will be recorded from the air in the optical and infrared ranges, and the strengths of the electric field (EF) and magnetic field (MF) will be measured along the flight route. Specially developed software allows the recorded pictures to be compared with pre-flight etalon patterns corresponding to normal operation of the investigated transmission lines and HVSs. Such reference patterns, together with the experimentally obtained maps of the HVS's protective grounding, will be summarized in a single document, a passport of the HVS and PTL. This passport must also contain the measured and calculated values of EF and MF strength levels in the places where staff of the power facilities stay, as well as the layout of the equipment most vulnerable to the effects of electromagnetic interference. If necessary, as part of ongoing monitoring, recommendations will be given on the design and location of electromagnetic screens reducing the levels of electromagnetic interference, as well as on the location of lightning rods reducing the probability of lightning attachment to the objects. The paper presents the analytic expressions which formed the basis of the developed software for calculation of the EF strength in the vicinity of power lines. This software will be used as a base for UAV navigation along the transmission lines, as well as to detect violations in the transmission lines' operation. Comparison of the distributions of EF strength calculated with the help of the elaborated software with the known
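
    As a hedged sketch of the kind of analytic expression such EF-calculation software builds on (superposition over all phase conductors and their ground images follows the same pattern), the field of a single infinite line charge is E = lambda / (2 pi eps0 r):

        import math

        EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m

        def line_charge_field(lambda_c_per_m, r_m):
            """Field magnitude of an infinite line charge at distance r."""
            return lambda_c_per_m / (2 * math.pi * EPS0 * r_m)

        # Illustrative value: 1 uC/m of line charge, observer at 10 m
        print(round(line_charge_field(1e-6, 10.0)), "V/m")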

  13. Metrology Sampling Strategies for Process Monitoring Applications

    KAUST Repository

    Vincent, Tyrone L.

    2011-11-01

    Shrinking process windows in very large scale integration semiconductor manufacturing have already necessitated the development of control systems capable of addressing sub-lot-level variation. Within-wafer control is the next milestone in the evolution of advanced process control from lot-based and wafer-based control. In order to adequately comprehend and control within-wafer spatial variation, inline measurements must be performed at multiple locations across the wafer. At the same time, economic pressures prompt a reduction in metrology, for both capital and cycle-time reasons. This paper explores the use of modeling and minimum-variance prediction as a method to select the sites for measurement on each wafer. The models are developed using the standard statistical tools of principal component analysis and canonical correlation analysis. The proposed selection method is validated using real manufacturing data, and results indicate that it is possible to significantly reduce the number of measurements with little loss in the information obtained for the process control systems. © 2011 IEEE.
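
    A hedged sketch of the minimum-variance idea under a Gaussian spatial model: greedily choose the sites whose measurement removes the most predicted variance at the remaining sites. The synthetic covariance below stands in for the models the paper builds with principal component and canonical correlation analysis:

        import numpy as np

        rng = np.random.default_rng(0)
        xy = rng.uniform(0, 1, (49, 2))                     # candidate wafer sites
        d = np.linalg.norm(xy[:, None] - xy[None, :], axis=2)
        cov = np.exp(-(d / 0.3) ** 2)                       # smooth spatial covariance

        def greedy_sites(cov, k, noise=1e-6):
            chosen, post = [], cov.copy()
            for _ in range(k):
                # variance removed across the wafer if site j is measured
                score = [np.sum(post[:, j] ** 2) / (post[j, j] + noise)
                         if j not in chosen else -np.inf
                         for j in range(len(post))]
                j = int(np.argmax(score))
                chosen.append(j)
                # Gaussian conditioning update after observing site j
                post = post - np.outer(post[:, j], post[j, :]) / (post[j, j] + noise)
            return chosen

        print(greedy_sites(cov, 5))   # indices of five informative sites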

  14. Deriving robust distributed business processes with automated transformations of fallible component processes

    NARCIS (Netherlands)

    Wang, Lei; Ferreira Pires, Luis; Sinderen, van Marten J.; Wombacher, Andreas; Chi, Chi-Hung

    2015-01-01

    Due to the possibility of system crashes and network failures, the design of robust interactions for collaborative business processes is a challenge. If a process changes state, it sends messages to other relevant processes to inform them about this change. However, server crashes and network failures may prevent these messages from being delivered.

  15. Spatial monitoring of groundwater drawdown and rebound associated with quarry dewatering using automated time-lapse electrical resistivity tomography and distribution guided clustering

    OpenAIRE

    Chambers, J. E.; Meldrum, P.I.; P. B. Wilkinson; Ward, W.; Jackson, C; Matthews, B.; Joel, P; Kuras, O.; Bai, L.; S. Uhlemann; Gunn, D

    2015-01-01

    Dewatering systems used for mining and quarrying operations often result in highly artificial and complex groundwater conditions, which can be difficult to characterise and monitor using borehole point sampling approaches. Here automated time-lapse electrical resistivity tomography (ALERT) is considered as a means of monitoring subsurface groundwater dynamics associated with changes in the dewatering regime in an operational sand and gravel quarry. We considered two scenarios: the first was u...

  16. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots and over 100 PB of storage space on disk or tape. Monitoring the status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed worldwide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by HammerCloud, to automatic exclusion from production or analysis activities.

  17. Automated measurement and monitoring of bioprocesses: key elements of the M(3)C strategy.

    Science.gov (United States)

    Sonnleitner, Bernhard

    2013-01-01

    The state-of-routine monitoring items established in the bioprocess industry as well as some important state-of-the-art methods are briefly described and the potential pitfalls discussed. Among those are physical and chemical variables such as temperature, pressure, weight, volume, mass and volumetric flow rates, pH, redox potential, gas partial pressures in the liquid and molar fractions in the gas phase, infrared spectral analysis of the liquid phase, and calorimetry over an entire reactor. Classical as well as new optical versions are addressed. Biomass and bio-activity monitoring (as opposed to "measurement") via turbidity, permittivity, in situ microscopy, and fluorescence are critically analyzed. Some new(er) instrumental analytical tools, interfaced to bioprocesses, are explained. Among those are chromatographic methods, mass spectrometry, flow and sequential injection analyses, field flow fractionation, capillary electrophoresis, and flow cytometry. This chapter surveys the principles of monitoring rather than compiling instruments.

  18. Modelling and automation of the process of phosphate ion removal from waste waters

    Directory of Open Access Journals (Sweden)

    L. Lupa

    2008-03-01

    Phosphate removal from waste waters has become an environmental necessity, since these phosphates stimulate the growth of aquatic plants and plankton and contribute to the eutrophication process in general. The physicochemical methods of phosphate ion removal are the most effective and reliable. This paper presents studies on the process of phosphate ion removal from waste waters resulting from the fertiliser industry, using the method of co-precipitation with iron salts and with calcium hydroxide as the neutralizing agent. The optimal process conditions were established as those that allow achievement of a maximum degree of separation of the phosphate ions. The precipitate resulting from the co-precipitation process was analysed to determine its chemical composition and to establish its thermal and structural stability, and also to establish in which form the phosphate ions in the formed precipitate can be found. Based on these considerations, the experimental data obtained in the process of phosphate ion removal from waste waters were analysed mathematically, and equations for the dependence of the degree of phosphate separation and residual concentration on the main parameters of the process were formulated. Finally, an automated scheme for phosphate ion removal from waste waters by co-precipitation is presented.

  19. Effects of Secondary Task Modality and Processing Code on Automation Trust and Utilization During Simulated Airline Luggage Screening

    Science.gov (United States)

    Phillips, Rachel; Madhavan, Poornima

    2010-01-01

    The purpose of this research was to examine the impact of environmental distractions on human trust in and utilization of automation during the process of visual search. Participants performed a computer-simulated airline luggage screening task with the assistance of a 70% reliable automated decision aid (called DETECTOR), both with and without environmental distractions. The distraction was implemented as a secondary task in either a competing modality (visual) or a non-competing modality (auditory). The secondary task's processing code competed either with the luggage screening task (spatial code) or with the automation's textual directives (verbal code). We measured participants' system trust, perceived reliability of the system (when a target weapon was present and absent), compliance, reliance, and confidence when agreeing and disagreeing with the system under both distracted and undistracted conditions. Results revealed that system trust was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Perceived reliability of the system (when the target was present) was significantly higher when the secondary task was visual rather than auditory. Compliance with the aid increased in all conditions except the auditory-verbal condition, where it decreased. Similar to the pattern for trust, reliance on the automation was lower in the visual-spatial and auditory-verbal conditions than in the visual-verbal and auditory-spatial conditions. Confidence when agreeing with the system decreased with the addition of any kind of distraction; however, confidence when disagreeing increased with the addition of an auditory secondary task but decreased with the addition of a visual task. A model was developed to represent the research findings and demonstrate the relationship between secondary task modality, processing code, and automation use. Results suggest that the nature of environmental distractions influences trust in and utilization of automation.

  20. Intelligent Production Monitoring and Control based on Three Main Modules for Automated Manufacturing Cells in the Automotive Industry

    Science.gov (United States)

    Berger, Ulrich; Kretzschmann, Ralf; Algebra, A. Vargas Veronica

    2008-06-01

    The automotive industry is distinguished by regionalization and customization of products. As a consequence, the diversity of products will increase while lot sizes decrease. Thus, more product types will be handled along the process chain and common production paradigms will fail. Although Rapid Manufacturing (RM) methodology will be used for producing small individual lot sizes, new solutions for joining and assembling these components are needed. On the other hand, the non-availability of existing operational knowledge and the absence of dynamic and explicit knowledge retrieval minimize the achievement of on-demand capabilities. Thus, in this paper, an approach for an Intelligent Production System is introduced. The concept is based on three interlinked main modules: a Technology Data Catalogue (TDC) based on an ontology system, an Automated Scheduling Processor (ASP) based on graph theory, and a central Programmable Automation Controller (PAC) for real-time sensor/actuator communication. The concept is being implemented in a laboratory set-up with several assembly and joining processes and will be experimentally validated in some research and development projects.

  1. "SmartMonitor"--an intelligent security system for the protection of individuals and small properties with the possibility of home automation.

    Science.gov (United States)

    Frejlichowski, Dariusz; Gościewska, Katarzyna; Forczmański, Paweł; Hofman, Radosław

    2014-01-01

    "SmartMonitor" is an intelligent security system based on image analysis that combines the advantages of alarm, video surveillance and home automation systems. The system is a complete solution that automatically reacts to every learned situation in a pre-specified way and has various applications, e.g., home and surrounding protection against unauthorized intrusion, crime detection or supervision over ill persons. The software is based on well-known and proven methods and algorithms for visual content analysis (VCA) that were appropriately modified and adopted to fit specific needs and create a video processing model which consists of foreground region detection and localization, candidate object extraction, object classification and tracking. In this paper, the "SmartMonitor" system is presented along with its architecture, employed methods and algorithms, and object analysis approach. Some experimental results on system operation are also provided. In the paper, focus is put on one of the aforementioned functionalities of the system, namely supervision over ill persons. PMID:24905854

  2. "SmartMonitor"--an intelligent security system for the protection of individuals and small properties with the possibility of home automation.

    Science.gov (United States)

    Frejlichowski, Dariusz; Gościewska, Katarzyna; Forczmański, Paweł; Hofman, Radosław

    2014-06-05

    "SmartMonitor" is an intelligent security system based on image analysis that combines the advantages of alarm, video surveillance and home automation systems. The system is a complete solution that automatically reacts to every learned situation in a pre-specified way and has various applications, e.g., home and surrounding protection against unauthorized intrusion, crime detection or supervision over ill persons. The software is based on well-known and proven methods and algorithms for visual content analysis (VCA) that were appropriately modified and adopted to fit specific needs and create a video processing model which consists of foreground region detection and localization, candidate object extraction, object classification and tracking. In this paper, the "SmartMonitor" system is presented along with its architecture, employed methods and algorithms, and object analysis approach. Some experimental results on system operation are also provided. In the paper, focus is put on one of the aforementioned functionalities of the system, namely supervision over ill persons.

  3. Remote monitoring field trial. Application to automated air sampling. Report on Task FIN-E935 of the Finnish Support Programme to IAEA Safeguards

    International Nuclear Information System (INIS)

    An automated air sampling station has recently been developed by the Radiation and Nuclear Safety Authority (STUK). The station is furnished with equipment that allows comprehensive remote monitoring of the station and the data. Under the Finnish Support Programme to IAEA Safeguards, STUK and Sandia National Laboratories (SNL) established a field trial to demonstrate the use of remote monitoring technologies. STUK provided means for real-time radiation monitoring and sample authentication, whereas SNL delivered means for authenticated surveillance of the equipment and its location. The field trial showed that remote monitoring can be carried out using simple means, although advanced facilities are needed for comprehensive surveillance. Authenticated measurement data could be reliably transferred from the monitoring site to headquarters without the presence of authorized personnel at the monitoring site. The operation of the station and the remote monitoring system was reliable. (orig.)

  4. Prototypic automated continuous recreational water quality monitoring of nine Chicago beaches

    Science.gov (United States)

    Dawn Shively,; Nevers, Meredith; Cathy Breitenbach,; Phanikumar, Mantha S.; Kasia Przybyla-Kelly,; Ashley M. Spoljaric,; Richard L. Whitman,

    2015-01-01

    Predictive empirical modeling is used in many locations worldwide as a rapid, alternative recreational water quality management tool to eliminate the delayed notifications associated with traditional fecal indicator bacteria (FIB) culturing (referred to as the persistence model, PM) and to prevent errors in releasing swimming advisories. The goal of this study was to develop a fully automated water quality management system for multiple beaches using predictive empirical models (EMs) and state-of-the-art technology. Many recent EMs rely on samples or data collected manually, which adds to analysis time and increases the burden on the beach manager. In this study, data from water quality buoys and weather stations were transmitted through cellular telemetry to a web hosting service. An executable program simultaneously retrieved and aggregated data for the regression equations and calculated the EM results each morning at 9:30 AM; results were transferred through an RSS feed to a website, mapped to each beach, and received by the lifeguards to be posted at the beach. Models were initially developed for five beaches, but by the third year, 21 beaches were managed using refined and validated modeling systems. The adjusted R2 values of the regressions relating Escherichia coli to hydrometeorological variables for the EMs were greater than those for the PMs, and ranged from 0.220 to 0.390 (2011) and 0.103 to 0.381 (2012). Validation results in 2013 revealed reduced predictive capabilities; however, three of the originally modeled beaches showed improvement in 2013 compared to 2012. The EMs generally showed higher accuracy and specificity than the PMs, and sensitivity was low for both approaches. In 2012, EM accuracy was 70–97%; specificity, 71–100%; and sensitivity, 0–64%; in 2013, accuracy was 68–97%; specificity, 73–100%; and sensitivity, 0–36%. Factors that may have affected model capabilities include instrument malfunction, non-point source inputs, and sparse
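
    A hedged sketch of the empirical-modeling core: regress log-transformed E. coli on hydrometeorological predictors, then issue an advisory when the prediction exceeds a standard threshold. The data, predictors and coefficients are synthetic stand-ins, not the Chicago models:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        n = 200
        X = np.column_stack([
            rng.uniform(0, 30, n),    # turbidity
            rng.uniform(10, 30, n),   # water temperature
            rng.uniform(0, 50, n),    # rainfall over the previous 24 h
        ])
        y = 0.04 * X[:, 0] + 0.02 * X[:, 1] + 0.03 * X[:, 2] \
            + rng.normal(0, 0.3, n)   # synthetic log10 E. coli, CFU/100 ml

        model = LinearRegression().fit(X, y)
        pred = model.predict([[25.0, 22.0, 40.0]])[0]
        print("advisory" if pred > np.log10(235) else "no advisory")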

  5. Automated and Scalable Data Reduction in the SOFIA Data Processing System

    Science.gov (United States)

    Krzaczek, R.; Shuping, R.; Charcos-Llorens, M.; Alles, R.; Vacca, W.

    2015-09-01

    In order to provide suitable data products to general investigators and other end users in a timely manner, the Stratospheric Observatory for Infrared Astronomy (SOFIA) has developed a framework supporting the automated execution of data processing pipelines for the various instruments, called the Data Processing System (DPS); see Shuping et al. (2014) for an overview. The primary requirement is to process all data collected from a flight within eight hours, allowing data quality assessments and inspections to be made the following day. The raw data collected during a flight require processing by a number of different software packages and tools unique to each combination of instrument and mode of operation, much of it developed in-house, in order to create data products for use by investigators and other end users. The requirement to deliver these data products in a consistent, predictable, and performant manner presents a significant challenge for the observatory. Herein we present aspects of the DPS that help to achieve these goals. We discuss how it supports data reduction software written in a variety of languages and environments, its support for new versions and live upgrades to that software and other necessary resources (e.g., calibrations), and its accommodation of sudden processing loads through the addition (and eventual removal) of computing resources, and we close with an observation of the performance achieved in the first two observing cycles of SOFIA.

  6. Analysis of irradiated U-7wt%Mo dispersion fuel microstructures using automated image processing

    Science.gov (United States)

    Collette, R.; King, J.; Buesch, C.; Keiser, D. D.; Williams, W.; Miller, B. D.; Schulthess, J.

    2016-07-01

    The High Performance Research Reactor Fuel Development (HPPRFD) program is responsible for developing low enriched uranium (LEU) fuel substitutes for high performance reactors fueled with highly enriched uranium (HEU) that have not yet been converted to LEU. The uranium-molybdenum (U-Mo) fuel system was selected for this effort. In this study, fission gas pore segmentation was performed on U-7wt%Mo dispersion fuel samples at three separate fission densities using an automated image processing interface developed in MATLAB. Pore size distributions were attained that showed both expected and unexpected fission gas behavior. In general, it proved challenging to identify any dominant trends when comparing fission bubble data across samples from different fuel plates due to varying compositions and fabrication techniques. The results exhibited fair agreement with the fission density vs. porosity correlation developed by the Russian reactor conversion program.
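
    A hedged sketch of the pore-segmentation step using scikit-image (the study's interface was built in MATLAB; the synthetic micrograph and Otsu threshold are illustrative choices):

        import numpy as np
        from skimage.filters import threshold_otsu
        from skimage.measure import label, regionprops

        rng = np.random.default_rng(2)
        img = rng.normal(0.7, 0.05, (256, 256))   # bright fuel matrix
        img[100:120, 100:130] = 0.2               # dark fission gas "pore"
        img[30:40, 200:210] = 0.25                # a second pore

        mask = img < threshold_otsu(img)          # pores are darker than the matrix
        regions = regionprops(label(mask))
        print(len(regions), max(r.area for r in regions))  # pore count, largest area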

  7. An automated data processing method dedicated to 3D ultrasonic non destructive testing of composite pieces

    International Nuclear Information System (INIS)

    State-of-the-art non-destructive testing using ultrasound is based on the evaluation of C-scan images, which is done mainly visually. The development of the new Sampling Phased Array (SPA) technique by IZFP Fraunhofer provides fast three-dimensional reconstruction of inner object structures. This new inspection technique is to be complemented with fully or semi-automated evaluation of the ultrasonic data, providing maximum support to the operator. In this contribution we present a processing method for SPA ultrasonic data; the main focus of this paper is on speckle noise reduction. The evaluation method is applied to carbon fibre composite, where it demonstrates robust and successful performance in the recognition of defects.
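
    One common speckle-reduction step is median filtering; a minimal, hedged sketch on a toy 2D ultrasonic image (the abstract does not specify the authors' actual filter):

        import numpy as np
        from scipy.ndimage import median_filter

        rng = np.random.default_rng(3)
        image = np.zeros((128, 128))
        image[60:64, :] = 1.0                                 # planar "defect" indication
        speckled = image + rng.exponential(0.1, image.shape)  # speckle-like noise

        denoised = median_filter(speckled, size=5)            # 5x5 median window
        print(denoised.std() < speckled.std())                # True: noise suppressed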

  8. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    Science.gov (United States)

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is largely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of the whole ductal tree as well as of the terminal end buds. It allows both growth parameters and fine morphological glandular structures to be measured accurately and objectively. Mammary gland elongation was characterized by two parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software environments and thus largely used by scientists studying rodent mammary gland morphology.

  9. Automated Planning of Science Products Based on Nadir Overflights and Alerts for Onboard and Ground Processing

    Science.gov (United States)

    Chien, Steve A.; McLaren, David A.; Rabideau, Gregg R.; Mandl, Daniel; Hengemihle, Jerry

    2013-01-01

    A set of automated planning algorithms is the current operations baseline approach for the Intelligent Payload Module (IPM) of the proposed Hyperspectral Infrared Imager (HyspIRI) mission. For this operations concept, there are only local (e.g. non-depletable) operations constraints, such as real-time downlink and onboard memory, and the forward-sweeping algorithm is optimal for determining which science products should be generated onboard and on the ground based on geographical overflights, science priorities, alerts, requests, and onboard and ground processing constraints. This automated planning approach was developed for the HyspIRI IPM concept. The HyspIRI IPM is proposed to use an X-band Direct Broadcast (DB) capability that would enable data to be delivered to ground stations virtually as they are acquired. However, the HyspIRI VSWIR and TIR instruments will produce approximately 1 Gbps of data, while the DB capability is 15 Mbps, an approximately 60X oversubscription. To address this mismatch, this innovation determines which data to downlink based both on the type of surface the spacecraft is overflying and on onboard processing of the data to detect events. For example, when the spacecraft is overflying polar regions, it might downlink a snow/ice product. Additionally, the onboard software will search for thermal signatures indicative of a volcanic event or wildfire and downlink summary information (extent, spectra) when detected, thereby reducing data volume. The planning system described above automatically generated the IPM mission plan based on requested products, the overflight regions, and available resources.
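
    A hedged sketch of the selection problem at its simplest: fill a per-pass downlink budget with candidate products in priority order. The forward-sweeping algorithm in the paper additionally handles the time ordering of overflights and onboard memory; every number and name below is illustrative:

        products = [
            # (name, priority, size in Mb)
            ("volcano_alert_summary", 10, 5),
            ("snow_ice_product", 6, 400),
            ("coastal_scene", 4, 900),
            ("full_spectral_cube", 2, 5000),
        ]

        def select_products(products, budget_mb):
            plan, used = [], 0
            for name, priority, size in sorted(products, key=lambda p: -p[1]):
                if used + size <= budget_mb:   # greedy fill by priority
                    plan.append(name)
                    used += size
            return plan, used

        print(select_products(products, budget_mb=1000))
        # (['volcano_alert_summary', 'snow_ice_product'], 405)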

  10. Multimedia abstract generation of intensive care data: the automation of clinical processes through AI methodologies.

    Science.gov (United States)

    Jordan, Desmond; Rose, Sydney E

    2010-04-01

    Medical errors arising from communication failures are enormous during the perioperative period of cardiac surgical patients. As caregivers change shifts or surgical patients change location within the hospital, key information is lost or misconstrued. After a baseline cognitive study of information needs and caregiver workflow, we implemented an advanced clinical decision support tool of intelligent agents, medical logic modules, and text generators called the "Inference Engine" to summarize an individual patient's raw medical data elements into procedural milestones, illness severity, and care therapies. The system generates two displays: 1) for the continuum of care, multimedia abstract generation of intensive care data (MAGIC), an expert system that automatically generates a physician briefing of a cardiac patient's operative course in a multimodal format; and 2) for an isolated point in time, the "Inference Engine", a system that provides a real-time, high-level, summarized depiction of a patient's clinical status. In our studies, system accuracy and efficacy were judged against clinician performance in the workplace. To test the automated physician briefing, MAGIC, the patient's intraoperative course was reviewed in the intensive care unit before patient arrival. It was then judged against the actual physician briefing and against that given in a cohort of patients where the system was not used. To test the real-time representation of the patient's clinical status, system inferences were judged against clinician decisions. Changes in workflow and situational awareness were assessed by questionnaires and process evaluation. MAGIC provides 200% more information, twice the accuracy, and enhanced situational awareness. This study demonstrates that the automation of clinical processes through AI methodologies yields positive results.

  11. Monitoring of Welding Processes with Application of Artificial Neural Networks

    OpenAIRE

    Чвертко, Євгенія Петрівна; Пірумов, Андрій Євгенович; Шевченко, Микола Віталійович

    2014-01-01

    The paper presents a summary of the development of monitoring systems for processes involving heating of the filler material and/or base metal by electric current, with periodic short-circuiting of the welding circuit. The processes investigated were MAG welding, underwater flux-cored welding, and flash-butt welding. Details of the experiments and of the primary data processing procedures, based on statistical analysis methods, are described, the aim of primary processing being to obtain informativ...

  12. Development of an automated system for continuous monitoring of powered roof support in longwall panel

    Institute of Scientific and Technical Information of China (English)

    ATUL Kumar; DHEERAJ Kumar; SINGH U.K.; GUPTA P.S.

    2010-01-01

    This paper describes the development of an intrinsically safe system for continuous monitoring of the load on, and convergence of, powered roof supports installed at longwall faces. The system was developed for monitoring the behavior of a powered support in a mechanized longwall sublevel caving face. The logging system can be programmed to log data from the sensors at intervals ranging from 16 h down to 1 ms, recording variations in the hydraulic pressures in the legs and the convergence of the support during progressive face advance. For recording dynamic loads, the data logger can be programmed to start fast logging, say at 10 ms intervals, when the pressure in a leg reaches a pre-specified threshold value, and to continue fast logging until the pressure drops below this threshold, at which point fast logging stops automatically.
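
    The fast-logging trigger described here is essentially a threshold switch on the sampling rate. A minimal sketch of that logic follows; the simulated pressure signal, interval values, and threshold are assumptions for illustration only.

        # Sketch of threshold-triggered fast logging, as described above.
        import math

        NORMAL_INTERVAL_S = 60.0   # routine logging interval (assumed)
        FAST_INTERVAL_S = 0.010    # 10 ms fast logging
        THRESHOLD_MPA = 35.0       # pre-specified leg-pressure threshold (assumed)

        def simulated_leg_pressure(t):
            # Illustrative stand-in for the hydraulic pressure channel (MPa).
            return 30.0 + 10.0 * math.sin(t / 50.0)

        def logging_loop(n_samples):
            log, t = [], 0.0
            for _ in range(n_samples):
                p = simulated_leg_pressure(t)
                log.append((t, p))
                # Fast logging while the pressure is at or above the threshold;
                # it stops automatically once the pressure drops below it.
                t += FAST_INTERVAL_S if p >= THRESHOLD_MPA else NORMAL_INTERVAL_S
            return log

        print(len(logging_loop(1000)))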

  13. Automated Grid Monitoring for the LHCb Experiment Through HammerCloud

    CERN Document Server

    Dice, Bradley

    2015-01-01

    The HammerCloud system is used by CERN IT to monitor the status of the Worldwide LHC Computing Grid (WLCG). HammerCloud automatically submits jobs to WLCG computing resources, closely replicating the workflow of Grid users (e.g. physicists analyzing data). This allows computation nodes and storage resources to be monitored, software to be tested (somewhat like continuous integration), and new sites to be stress tested with a heavy job load before commissioning. The HammerCloud system has been in use for ATLAS and CMS experiments for about five years. This summer's work involved porting the HammerCloud suite of tools to the LHCb experiment. The HammerCloud software runs functional tests and provides data visualizations. HammerCloud's LHCb variant is written in Python, using the Django web framework and Ganga/DIRAC for job management.

  14. The AAL project: Automated monitoring and intelligent AnaLysis for the ATLAS data taking infrastructure

    CERN Document Server

    Magnoni, L; The ATLAS collaboration; Kazarov, A

    2011-01-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for filtering and transferring ATLAS experimental data from the detectors to the mass storage system. It relies on a large, distributed computing environment with thousands of computing nodes and thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting, and operational monitoring. During data-taking runs, streams of messages sent by applications via the message reporting system, together with data published by applications via information services, are the main sources of knowledge about the correctness of running operations. The huge flow of data produced (at an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful in...

  15. The AAL project: automated monitoring and intelligent analysis for the ATLAS data taking infrastructure

    CERN Document Server

    Kazarov, A; The ATLAS collaboration; Magnoni, L

    2011-01-01

    The Trigger and Data Acquisition (TDAQ) system of the ATLAS experiment at CERN is the infrastructure responsible for filtering and transferring ATLAS experimental data from the detectors to the mass storage system. It relies on a large, distributed computing environment with thousands of computing nodes and thousands of applications running concurrently. In such a complex environment, information analysis is fundamental for controlling application behavior, error reporting, and operational monitoring. During data-taking runs, streams of messages sent by applications via the message reporting system, together with data published by applications via information services, are the main sources of knowledge about the correctness of running operations. The huge flow of data produced (at an average rate of O(1-10 kHz)) is constantly monitored by experts to detect problems or misbehavior. This requires strong competence and experience in understanding and discovering problems and root causes, and often the meaningful in...

  16. A Review for Model Plant Mismatch Measures in Process Monitoring

    Institute of Scientific and Technical Information of China (English)

    王洪; 谢磊; 宋执环

    2012-01-01

    A model is usually necessary for the design of a control loop. Due to simplification and unknown dynamics, model-plant mismatch is inevitable in the control loop. In process monitoring, detection of mismatch and evaluation of its influence are needed. In this paper, several mismatch measures are presented, based on different model descriptions. They are categorized into groups from different perspectives, and their potential in detection and diagnosis is evaluated. Two case studies, on a mixing process and a distillation process, demonstrate the efficacy of the mismatch-monitoring framework.

  17. Nonlinear Statistical Process Monitoring and Fault Detection Using Kernel ICA

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xi; YAN Wei-wu; ZHAO Xu; SHAO Hui-he

    2007-01-01

    A novel nonlinear process monitoring and fault detection method based on kernel independent component analysis (ICA) is proposed. The kernel ICA method is a two-phase algorithm: whitened kernel principal component analysis (KPCA) followed by ICA. KPCA spheres the data and makes the data structure as linearly separable as possible by virtue of an implicit nonlinear mapping determined by the kernel. ICA then seeks projection directions in the KPCA-whitened space that make the distribution of the projected data as non-Gaussian as possible. Application to a simulated fluid catalytic cracking unit (FCCU) process indicates that the proposed monitoring method based on kernel ICA can effectively capture nonlinear relationships among process variables, and that it significantly outperforms monitoring methods based on ICA or KPCA alone.
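
    As a rough illustration of the two-phase scheme (KPCA whitening followed by ICA), the sketch below chains scikit-learn's KernelPCA and FastICA and flags samples whose independent components deviate strongly from the training distribution. It is a simplified stand-in for the paper's method; the kernel choice, component counts, synthetic data, and the simple I2-statistic threshold are assumptions.

        import numpy as np
        from sklearn.decomposition import KernelPCA, FastICA

        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(500, 6))                  # stand-in for normal operation
        X_test = np.vstack([rng.normal(size=(50, 6)),
                            rng.normal(loc=3.0, size=(5, 6))])  # last rows: simulated fault

        # Phase 1: KPCA spheres the data via an implicit nonlinear (RBF) mapping.
        kpca = KernelPCA(n_components=4, kernel="rbf", gamma=0.1)
        Z_train = kpca.fit_transform(X_train)
        scale = Z_train.std(axis=0)
        Z_train = Z_train / scale                            # whiten the kernel scores

        # Phase 2: ICA finds maximally non-Gaussian directions in the whitened space.
        ica = FastICA(n_components=4, random_state=0)
        S_train = ica.fit_transform(Z_train)

        # Simple I^2 monitoring statistic with an empirical 99% control limit.
        i2_train = (S_train ** 2).sum(axis=1)
        limit = np.percentile(i2_train, 99)

        S_test = ica.transform(kpca.transform(X_test) / scale)
        i2_test = (S_test ** 2).sum(axis=1)
        print("flagged samples:", np.nonzero(i2_test > limit)[0])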

  18. Automated Wildlife Monitoring Using Self-Configuring Sensor Networks Deployed in Natural Habitats

    OpenAIRE

    Trifa, Vlad; Girod, Lewis; Travis C. Collier; Blumstein, Daniel; Taylor, C E

    2007-01-01

    To understand the complex interactions among animals within an ecosystem, biologists need to be able to track their location and social interactions. There are a variety of factors that make this difficult. We propose using adaptive, embedded networked sensing technologies to develop an efficient means for wildlife monitoring. This paper surveys our research; we demonstrate how a self-organizing system can efficiently conduct real-time acoustic source detection and localization using distribu...

  19. Automation and data processing with the Immucor Galileo® system in a university blood bank

    OpenAIRE

    Wittmann, Georg; Frank, Josef; Schramm, Wolfgang; Spannagl, Michael

    2007-01-01

    Background: The implementation of automated techniques improves the workflow and quality of immunohematological results. The workflows of our university blood bank were reviewed during the implementation of an automated immunohematological testing system. Methods: The work impact of blood grouping and subgrouping, cross-matching, and antibody search using the Immucor Galileo system was compared to the previously used standard manual and semi-automated methods. Results: The redesign of our workflo...

  20. Integrating and automating the software environment for the Beam and Radiation Monitoring for CMS

    CERN Document Server

    Filyushkina, Olga; Juslin, J

    2010-01-01

    This paper describes the real-time online visualization framework used by the Beam and Radiation Monitoring group at the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider, CERN. The purpose of the visualization framework is to provide real-time diagnostics of beam conditions, which defines the set of requirements to be met by the framework. Those requirements include data quality assurance, vital safety issues, low latency, data caching, etc. The framework is written in the Java programming language and based on JDataViewer, a plotting package developed at CERN. At present the framework is used by the Beam and Radiation Monitoring, Pixel, and Tracker groups, the Run Field Manager, and others. It contributed to real-time data analysis during the 2009-2010 runs as a stable monitoring tool. The displays reflect the beam conditions in real time with low latency; thus it is the first place at the CMS detector where the beam collisions are observed.

  1. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    Energy Technology Data Exchange (ETDEWEB)

    Sudowe, Ralf [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program and Health Physics Dept.; Roman, Audrey [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program; Dailey, Ashlee [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program; Go, Elaine [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program

    2013-07-18

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available require a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive and not well suited for situations in which rapid sample analysis is required and/or a large number of samples needs to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as mass loading, flow rate, and resin performance, is being studied. In addition, the potential to automate these procedures on a robotic platform is being evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of a possible synergistic effect in the presence of iron.

  2. Advanced Process Monitoring Techniques for Safeguarding Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Orton, Christopher R.; Bryan, Samuel A.; Schwantes, Jon M.; Levitskaia, Tatiana G.; Fraga, Carlos G.; Peper, Shane M.

    2010-11-30

    The International Atomic Energy Agency (IAEA) has established international safeguards standards for fissionable material at spent fuel reprocessing plants to ensure that significant quantities of weapons-grade nuclear material are not diverted from these facilities. For large-throughput nuclear facilities, it is difficult to satisfy the IAEA safeguards accountancy goal for detection of abrupt diversion. Currently, methods to verify material control and accountancy (MC&A) at these facilities require time-consuming and resource-intensive destructive assay (DA). Leveraging new on-line nondestructive assay (NDA) process monitoring techniques in conjunction with the traditional and highly precise DA methods may provide an additional measure of nuclear material accountancy, potentially resulting in a more timely, cost-effective, and resource-efficient means of safeguards verification at such facilities. By monitoring process control measurements (e.g., flow rates, temperatures, or concentrations of reagents, products, or wastes), abnormal plant operations can be detected. Pacific Northwest National Laboratory (PNNL) is developing on-line NDA process monitoring technologies, including both the Multi-Isotope Process (MIP) Monitor and a spectroscopy-based monitoring system, to potentially reduce the time and resource burden associated with current techniques. The MIP Monitor uses gamma spectroscopy and multivariate analysis to identify off-normal conditions in process streams. The spectroscopic monitor continuously measures chemical compositions of the process streams, including actinide metal ions (U, Pu, Np), selected fission products, and major cold flowsheet chemicals, using UV-Vis, near-IR, and Raman spectroscopy. This paper provides an overview of our methods and reports our ongoing efforts to develop and demonstrate the technologies.
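
    The pattern-recognition step of a monitor like the MIP can be approximated with standard multivariate analysis: fit a principal component model to spectra from normal operation and flag spectra whose Hotelling T2 statistic exceeds a control limit. The sketch below is an illustrative reduction of that idea, not PNNL's implementation; the synthetic spectra and the 99% limit are assumptions.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        channels = 128
        normal = rng.poisson(lam=50, size=(200, channels)).astype(float)  # training spectra
        test = rng.poisson(lam=50, size=(10, channels)).astype(float)
        test[-1, 40:50] += 80.0                 # simulated off-normal isotope peak

        pca = PCA(n_components=5).fit(normal)

        def t2(spectra):
            # Hotelling T^2: squared PCA scores scaled by component variance.
            scores = pca.transform(spectra)
            return ((scores ** 2) / pca.explained_variance_).sum(axis=1)

        limit = np.percentile(t2(normal), 99)   # empirical 99% control limit
        print("off-normal spectra:", np.nonzero(t2(test) > limit)[0])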

  3. Sensor-model prediction, monitoring and in-situ control of liquid RTM advanced fiber architecture composite processing

    Science.gov (United States)

    Kranbuehl, D.; Kingsley, P.; Hart, S.; Loos, A.; Hasko, G.; Dexter, B.

    In-situ frequency-dependent electromagnetic sensors (FDEMS) and the Loos resin transfer model have been used to select and control the processing properties of an epoxy resin during liquid pressure RTM impregnation and cure. Once correlated with viscosity and degree of cure, the FDEMS sensor monitors, and the RTM processing model predicts, the reaction advancement of the resin, its viscosity, and the impregnation of the fabric. This provides a direct means of predicting, monitoring, and controlling the liquid RTM process in-situ in the mold throughout the fabrication process, including the effects of time, temperature, vacuum, and pressure. Most importantly, the FDEMS sensor-model system has been developed to make intelligent decisions, thereby automating the liquid RTM process and removing the need for operator direction.

  4. Monitoring and analysis of air emissions based on condition models derived from process history

    Directory of Open Access Journals (Sweden)

    M. Liukkonen

    2016-12-01

    Evaluation of online information on operating conditions is necessary when reducing air emissions in energy plants. In this respect, automated monitoring and control are of primary concern, particularly in biomass combustion. As monitoring of emissions in power plants becomes ever more challenging because of low-grade fuels and fuel mixtures, new monitoring applications are needed to extract essential information from the large amount of measurement data. The management of emissions in energy boilers lacks economically efficient, fast, and competent computational systems that could support decision-making regarding the improvement of emission efficiency. In this paper, a novel emission monitoring platform based on the self-organizing map method is presented. The system is capable not only of visualizing the prevailing status of the process and detecting problem situations (i.e., increased emission release rates), but also of analyzing these situations automatically and presenting factors potentially affecting them. The system is demonstrated using measurement data from an industrial circulating fluidized bed boiler fired by forest residue as the primary fuel and coal as the supporting fuel.
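
    A self-organizing map (SOM) reduces high-dimensional measurement vectors to a grid of prototype units; a sample that lands far from its best-matching unit (a high quantization error) indicates an unusual operating condition. The numpy sketch below illustrates that general idea only and is not the platform described in the paper; the grid size, learning schedule, synthetic data, and error threshold are assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.normal(size=(400, 8))                 # stand-in for boiler measurements

        # Tiny self-organizing map: a 6x6 grid of prototype vectors.
        grid = np.stack(np.meshgrid(np.arange(6), np.arange(6)), axis=-1).reshape(-1, 2)
        W = rng.normal(size=(36, 8))

        for epoch in range(20):
            lr = 0.5 * (1 - epoch / 20)               # decaying learning rate
            sigma = 3.0 * (1 - epoch / 20) + 0.5      # decaying neighborhood radius
            for x in X:
                bmu = np.argmin(((W - x) ** 2).sum(axis=1))   # best-matching unit
                d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)    # grid distance to BMU
                h = np.exp(-d2 / (2 * sigma ** 2))            # neighborhood function
                W += lr * h[:, None] * (x - W)

        def quantization_error(x):
            return np.sqrt(((W - x) ** 2).sum(axis=1).min())

        errs = np.array([quantization_error(x) for x in X])
        limit = np.percentile(errs, 99)
        suspect = quantization_error(rng.normal(loc=4.0, size=8))  # simulated upset
        print("problem situation detected:", suspect > limit)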

  5. A new highly automated sputter equipment for in situ investigation of deposition processes with synchrotron radiation

    Energy Technology Data Exchange (ETDEWEB)

    Doehrmann, Ralph; Botta, Stephan; Buffet, Adeline; Santoro, Gonzalo; Schlage, Kai; Schwartzkopf, Matthias; Risch, Johannes F. H.; Mannweiler, Roman; Roth, Stephan V. [DESY, Deutsches Elektronen-Synchrotron, Notkestrasse 85, D-22607 Hamburg (Germany); Bommel, Sebastian [DESY, Deutsches Elektronen-Synchrotron, Notkestrasse 85, D-22607 Hamburg (Germany); Institut fuer Physik, Humboldt-Universitaet zu Berlin, Newtonstr. 15, D-12489 Berlin (Germany); Brunner, Simon; Metwalli, Ezzeldin; Mueller-Buschbaum, Peter [Lehrstuhl fuer Funktionelle Materialien, Physik-Department, Technische Universitaet Muenchen, James-Franck-Str. 1, D-85748 Garching (Germany)

    2013-04-15

    HASE (Highly Automated Sputter Equipment) is a new mobile setup developed to investigate deposition processes with synchrotron radiation. HASE is based on an ultra-high-vacuum sputter deposition chamber equipped with an in-vacuum sample pick-and-place robot. This enables fast and reliable sample changes without breaking vacuum and helps to save valuable measurement time, which is essential for experiments at synchrotron sources like PETRA III at DESY. An advantageous arrangement of several sputter guns, mounted on a rotatable flange, makes it possible to sputter under different deposition angles or to sputter different materials onto the same substrate. The chamber is also equipped with a modular sample stage, which allows the integration of different sample environments, such as a sample heating and cooling device. The design of HASE is unique in its flexibility. The combination of several sputtering methods (standard deposition, glancing-angle deposition, and high-pressure sputter deposition) with heating and cooling of the sample, the large exit windows, and the degree of automation facilitates many different grazing-incidence X-ray scattering experiments, such as grazing-incidence small- and wide-angle X-ray scattering, in one setup. In this paper we describe in detail the design and performance of the new equipment and present the installation of the HASE apparatus at the Micro and Nano focus X-ray Scattering beamline (MiNaXS) at PETRA III. Furthermore, we describe the measurement options and present some selected results. The HASE setup has been successfully commissioned and is now available for users.

  6. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  7. Automated modal tracking and fatigue assessment of a wind turbine based on continuous dynamic monitoring

    Directory of Open Access Journals (Sweden)

    Oliveira Gustavo

    2015-01-01

    The paper describes the implementation of a dynamic monitoring system on a 2.0 MW onshore wind turbine. The system is composed of two components aimed at structural integrity and fatigue assessment. The first component enables continuous tracking of the modal characteristics of the wind turbine (natural frequency values, modal damping ratios, and mode shapes) in order to detect abnormal deviations of these properties, which may be caused by the occurrence of structural damage. The second component allows estimation of the remaining fatigue lifetime of the structure based on the analysis of the measured cycles of structural vibration.
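
    In its simplest form, continuous modal tracking reduces to estimating spectral peaks from acceleration records and following their drift over time. The sketch below does exactly that with scipy's Welch estimator and peak finder; it is a toy stand-in for the operational modal analysis used in such systems, and the synthetic signal, sampling rate, and tolerance band are assumptions.

        import numpy as np
        from scipy.signal import welch, find_peaks

        fs = 100.0                                    # sampling rate (Hz), assumed
        t = np.arange(0, 600, 1 / fs)
        rng = np.random.default_rng(3)
        # Synthetic tower acceleration: one dominant mode near 0.35 Hz plus noise.
        accel = np.sin(2 * np.pi * 0.35 * t) + 0.5 * rng.normal(size=t.size)

        f, pxx = welch(accel, fs=fs, nperseg=4096)
        peaks, _ = find_peaks(pxx, height=0.1 * pxx.max())
        tracked = f[peaks[np.argmax(pxx[peaks])]]     # strongest spectral peak

        REFERENCE_HZ = 0.35                           # baseline natural frequency, assumed
        if abs(tracked - REFERENCE_HZ) > 0.05:        # tolerance band, assumed
            print(f"abnormal deviation: {tracked:.3f} Hz")
        else:
            print(f"mode tracked at {tracked:.3f} Hz")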

  8. Automating the Mapping Process of Traditional Malay Textile Knowledge Model with the Core Ontology

    Directory of Open Access Journals (Sweden)

    Syerina A.M. Nasir

    2011-01-01

    Problem statement: The wave of ontology has spread rapidly in the cultural heritage domain. The impact can be seen in the growing number of cultural heritage web information systems, available textile ontologies, and harmonization work with the core ontology, CIDOC CRM. The aim of this study is to provide a basis for common views in automating the process of mapping between the revised TMT Knowledge Model and CIDOC CRM. Approach: Manual mapping was conducted to find similar or overlapping concepts that could be aligned to each other in order to achieve ontology similarity. This was done after the TMT Knowledge Model had undergone a transformation process to match the CIDOC CRM structure. Results: Although several problems were encountered during the mapping process, the result gives an immediate view of the classes that were found to be easily mapped between the two models. Conclusion/Recommendations: Future research will focus on the construction of a Batik Heritage Ontology using the mapping results obtained in this study. Further testing, evaluation, and refinement using real collections of cultural artifacts within museums will also be conducted in the near future.

  9. Current good manufacturing practice in plant automation of biological production processes.

    Science.gov (United States)

    Dorresteijn, R C; Wieten, G; van Santen, P T; Philippi, M C; de Gooijer, C D; Tramper, J; Beuvery, E C

    1997-01-01

    The production of biologicals is subject to strict governmental regulations. These are drawn up in current good manufacturing practices (cGMP), among others by the U.S. Food and Drug Administration. To implement cGMP in a production facility, plant automation becomes an essential tool. For this purpose, Manufacturing Execution Systems (MES) have been developed that control all operations inside a production facility. The introduction of these recipe-driven control systems, which follow the ISA S88 standards for batch processes, has made it possible to implement cGMP regulations in the control strategy of biological production processes. In addition, an MES offers features such as stock management, planning and routing tools, process-dependent control, implementation of software sensors and predictive models, application of historical data, and on-line statistical techniques for trend analysis and detection of instrumentation failures. This paper focuses on the development of new production strategies in which cGMP guidelines are an essential part.

  10. Automated One-loop Computation in Quarkonium Process within NRQCD Framework

    CERN Document Server

    Feng, Feng

    2013-01-01

    In recent decades, it has been realized that next-to-leading-order corrections may become very important, and sometimes requisite, for some processes involving quarkonium production or decay, e.g., $e^+e^- \to J/\psi + \eta_c$ and $J/\psi \to 3\gamma$. In this article, we review the basic steps for performing automated one-loop computations in quarkonium processes within the Non-relativistic Quantum Chromodynamics (NRQCD) factorization framework, and we give an introduction to related public tools or packages and their usage at each step. We start from generating Feynman diagrams and amplitudes with \textsc{FeynArts} for the quarkonium process, performing Dirac and color algebra simplifications using \textsc{FeynCalc} and \textsc{FeynCalcFormLink}, then doing partial fractioning on the linearly dependent propagators with \textsc{APart}, and finally reducing the tensor integrals (TI) into scalar integrals (SI) or master integrals (MI) using the integration-by-parts (IBP) method with the help of \textsc{F...

  11. Automated process parameters tuning for an injection moulding machine with soft computing

    Institute of Scientific and Technical Information of China (English)

    Peng ZHAO; Jian-zhong FU; Hua-min ZHOU; Shu-biao CUI

    2011-01-01

    In injection moulding production, the tuning of process parameters is a challenging job which relies heavily on the experience of skilled operators. In this paper, taking into consideration operator assessment during moulding trials, a novel intelligent model for the automated tuning of process parameters is proposed, consisting of case-based reasoning (CBR), empirical model (EM), and fuzzy logic (FL) methods. CBR and EM are used to imitate the recall and intuitive thoughts of skilled operators, respectively, while FL is adopted to simulate the skilled operator's optimization thoughts. First, CBR is used to set up the initial process parameters. If CBR fails, EM is employed to calculate the initial parameters. Next, a moulding trial is performed using the initial parameters. Then FL is adopted to optimize these parameters and correct defects repeatedly until the moulded part is found to be satisfactory; a sketch of this loop is given below. Based on the above methodologies, intelligent software was developed and embedded in the controller of an injection moulding machine. Experimental results show that the intelligent software can be effectively used in practical production, and it greatly reduces dependence on the experience of the operators.
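
    The tuning flow (CBR first, EM as fallback, then an FL correction loop driven by defect assessment) can be summarized in a few lines of control logic. The sketch below is schematic and not the paper's implementation: the case library, empirical model, trial, and fuzzy correction are reduced to invented toy stand-ins.

        # Toy sketch of the CBR -> EM -> FL tuning loop; all functions below are
        # invented stand-ins, not the paper's implementation.
        def empirical_model(spec):
            # EM fallback: a crude initial guess from the part specification.
            return {"injection_pressure": 80.0, "melt_temp": spec["melt_temp"]}

        def run_trial(params):
            # Stand-in for a moulding trial plus operator defect assessment.
            p = params["injection_pressure"]
            if p > 100.0:
                return ["flash"]
            if p < 70.0:
                return ["short_shot"]
            return []

        def fuzzy_adjust(params, defects):
            # Stand-in for the fuzzy-logic correction rules.
            step = -5.0 if "flash" in defects else 5.0
            return {**params, "injection_pressure": params["injection_pressure"] + step}

        def tune(spec, case_library=()):
            params = next((c["params"] for c in case_library), None)  # CBR: reuse a case
            if params is None:
                params = empirical_model(spec)                        # EM if CBR fails
            while (defects := run_trial(params)):                     # trial + assessment
                params = fuzzy_adjust(params, defects)                # FL correction
            return params

        print(tune({"melt_temp": 230.0}))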

  12. Knowledge management and process monitoring of pharmaceutical processes in the quality by design paradigm.

    Science.gov (United States)

    Rathore, Anurag S; Bansal, Anshuman; Hans, Jaspinder

    2013-01-01

    Pharmaceutical processes are complex and highly variable in nature. The complexity and variability associated with these processes result in inconsistent and sometimes unpredictable process outcomes. To deal with the complexity and understand the causes of variability in these processes, in-depth knowledge and thorough understanding of the process and the various factors affecting the process performance become critical. This makes knowledge management and process monitoring an indispensable part of the process improvement efforts for any pharmaceutical organization. PMID:23275947

  13. Knowledge management and process monitoring of pharmaceutical processes in the quality by design paradigm.

    Science.gov (United States)

    Rathore, Anurag S; Bansal, Anshuman; Hans, Jaspinder

    2013-01-01

    Pharmaceutical processes are complex and highly variable in nature. The complexity and variability associated with these processes result in inconsistent and sometimes unpredictable process outcomes. To deal with the complexity and understand the causes of variability in these processes, in-depth knowledge and thorough understanding of the process and the various factors affecting the process performance become critical. This makes knowledge management and process monitoring an indispensable part of the process improvement efforts for any pharmaceutical organization.

  14. Laboratory support for the didactic process of engineering processes automation at the Faculty of Mechanical Engineering

    Directory of Open Access Journals (Sweden)

    G. Wszołek

    2006-02-01

    Purpose: The scope of the paper is to present the effects of creating laboratory support for the didactic process of automatic control of engineering processes. Design/methodology/approach: The discussed laboratory framework is a complex system, flexible in terms of further development, operating on four basic levels: a rudimentary level serving general introductory classes on the subject; an advanced level suitable for specialization classes; hardware and software for individual or team assignments completed in the course of self-study, semester projects, and BSc and MSc theses; and a sophisticated level designed for PhD and DSc research workers. Findings: Close cooperation with industry and practical implementation of joint research projects play a crucial role in the functioning of the laboratory framework. Practical implications: The education of modern engineers and Masters of Science in automatic control and robotics is a challenging task which may be successfully accomplished only if faced with industrial reality. Continuously advancing industrial companies demand graduates who can quickly adjust to the workflow and who can instantly utilize the knowledge and skills acquired in the complex, interdisciplinary field of mechatronics. Originality/value: The discussed laboratory framework successfully couples software and hardware, providing a complex yet flexible system open to further development, enabling teaching and research into the design and operation of modern control systems, both by means of virtual construction and testing in simulation programs and on real industrial structures configured in laboratory workstations.

  15. An intelligent path to quality—process monitoring and control

    Science.gov (United States)

    Wen, Sheree

    1991-01-01

    The potential application of process monitoring and control in various industries is vast, and its impact major. With computer-aided design and manufacturing, processes and tools can be managed beyond the reach of human hands. In the current environment, where computing power is increasing on an exponential scale and application software has been developing like blossoming spring flowers, material manufacturers can easily capitalize on these advantages. However, major research is still needed in the in-situ sensing of material properties and of the processing environment during fabrication, and in adaptive control schemes that feed these parameters back to the process controllers.

  16. Porosity of additive manufacturing parts for process monitoring

    International Nuclear Information System (INIS)

    Some metal additive manufacturing processes can produce parts with internal porosity, either intentionally (with careful selection of the process parameters) or unintentionally (if the process is not well-controlled.) Material porosity is undesirable for aerospace parts - since porosity could lead to premature failure - and desirable for some biomedical implants, since surface-breaking pores allow for better integration with biological tissue. Changes in a part's porosity during an additive manufacturing build may also be an indication of an undesired change in the process. We are developing an ultrasonic sensor for detecting changes in porosity in metal parts during fabrication on a metal powder bed fusion system, for use as a process monitor. This paper will describe our work to develop an ultrasonic-based sensor for monitoring part porosity during an additive build, including background theory, the development and detailed characterization of reference additive porosity samples, and a potential design for in-situ implementation

  17. Porosity of additive manufacturing parts for process monitoring

    Science.gov (United States)

    Slotwinski, J. A.; Garboczi, E. J.

    2014-02-01

    Some metal additive manufacturing processes can produce parts with internal porosity, either intentionally (with careful selection of the process parameters) or unintentionally (if the process is not well-controlled.) Material porosity is undesirable for aerospace parts - since porosity could lead to premature failure - and desirable for some biomedical implants, since surface-breaking pores allow for better integration with biological tissue. Changes in a part's porosity during an additive manufacturing build may also be an indication of an undesired change in the process. We are developing an ultrasonic sensor for detecting changes in porosity in metal parts during fabrication on a metal powder bed fusion system, for use as a process monitor. This paper will describe our work to develop an ultrasonic-based sensor for monitoring part porosity during an additive build, including background theory, the development and detailed characterization of reference additive porosity samples, and a potential design for in-situ implementation.

  18. Automated Inventory and Monitoring of the ALICE HLT Cluster Resources with the SysMES Framework

    Science.gov (United States)

    Ulrich, J.; Lara, C.; Haaland, Ø.; Böttger, S.; Röhrich, D.; Kebschull, U.

    2012-12-01

    The High-Level-Trigger (HLT) cluster of the ALICE experiment is a computer cluster with about 200 nodes and 20 infrastructure machines. In its current state, the cluster consists of nearly 10 different configurations of nodes in terms of installed hardware, software, and network structure. In such a heterogeneous environment with a distributed application, information about the actual configuration of the nodes is needed to automatically distribute and adjust the application accordingly. An inventory database provides a unified interface to such information. To be useful, the data in the inventory has to be up to date, complete, and consistent. Manual maintenance of such databases is error-prone, and data tends to become outdated. The inventory module of the ALICE HLT cluster overcomes these drawbacks by automatically updating the actual state periodically and, in contrast to existing solutions, it allows the definition of a target state for each node. A target state can simply be a fully operational state, i.e., a state without malfunctions, or a dedicated configuration of the node. The target state is then compared to the actual state to detect deviations and malfunctions which could induce severe problems when running the application. The inventory module of the ALICE HLT cluster has been integrated into the monitoring and management framework SysMES in order to use existing functionality such as transactionality and the monitoring infrastructure. Additionally, SysMES allows detected problems to be solved automatically via its rule system. To describe the heterogeneous environment with all its specifics, such as custom hardware, the inventory module uses an object-oriented model based on the Common Information Model. The inventory module provides an automatically updated actual state of the cluster, detects discrepancies between the actual and the target state, and is able to solve detected problems automatically. This contribution presents the current implementation.
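
    The core inventory idea (compare a periodically refreshed actual state against a per-node target state and report deviations) is easy to sketch. The code below is a generic illustration, not the SysMES data model; the node names and attributes are invented for the example.

        # Sketch of target-state vs. actual-state comparison for cluster nodes.
        # Node names and attributes are illustrative assumptions.
        def find_deviations(target, actual):
            """Yield (node, attribute, expected, found) for every mismatch."""
            for node, spec in target.items():
                state = actual.get(node, {})
                for attr, expected in spec.items():
                    found = state.get(attr)
                    if found != expected:
                        yield node, attr, expected, found

        target = {"node042": {"kernel": "2.6.32", "ib_link": "up", "hlt_sw": "v5.3"}}
        actual = {"node042": {"kernel": "2.6.32", "ib_link": "down", "hlt_sw": "v5.3"}}

        for node, attr, expected, found in find_deviations(target, actual):
            # In SysMES, a detected deviation would trigger a rule to resolve it.
            print(f"{node}: {attr} expected {expected!r}, found {found!r}")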

  19. Atmosphere Processing Module Automation and Catalyst Durability Analysis for Mars ISRU Pathfinder

    Science.gov (United States)

    Petersen, Elspeth M.

    2016-01-01

    The Mars In-Situ Resource Utilization Pathfinder was designed to create fuel, using components found in the planet's atmosphere and regolith, for an ascent vehicle that would return a potential sample return or crew return vehicle from Mars. The Atmosphere Processing Module (APM), a subunit of the pathfinder, uses cryocoolers to isolate and collect carbon dioxide from Mars simulant gas. The carbon dioxide is fed with hydrogen into a Sabatier reactor, where methane is produced. The APM is currently undergoing the final stages of testing at Kennedy Space Center prior to process integration testing with the other subunits of the pathfinder. The automation software for the APM cryocoolers was tested and found to perform nominally. The catalyst used for the Sabatier reactor was investigated to determine the factors contributing to catalyst failure. The results from the catalyst testing require further analysis, but it appears that the rapid change in temperature during reactor start-up, or the elevated operating temperature, is responsible for the changes observed in the catalyst.

  20. Achieving mask order processing automation, interoperability and standardization based on P10

    Science.gov (United States)

    Rodriguez, B.; Filies, O.; Sadran, D.; Tissier, Michel; Albin, D.; Stavroulakis, S.; Voyiatzis, E.

    2007-02-01

    Last year the MUSCLE (Masks through User's Supply Chain: Leadership by Excellence) project was presented; here we report on its progress. A key process in mask supply chain management is the exchange of technical information for ordering masks. This process is large, complex, company-specific, and error-prone, and it leads to longer cycle times and higher costs due to missing or wrong inputs. Its automation and standardization could produce significant benefits. We need to agree on a standard for mandatory and optional parameters, as well as a common way to describe parameters when ordering. A system was created to improve performance in terms of Key Performance Indicators (KPIs) such as cycle time and cost of production. This tool allows us to evaluate and measure the effect of individual factors, as well as the effect of implementing the improvements of the complete project. Next, a benchmark study and a gap analysis were performed. These studies show the feasibility of standardization, as there is a large overlap in requirements. We find that the SEMI P10 standard needs enhancements: a format supporting the standard is required, and XML offers the ability to describe P10 in a flexible way. Beyond using XML for P10, the semantics of the mask order should also be addressed. A system design and requirements for a reference implementation of a P10-based management system are presented, covering a mechanism for evolution and version management and a design for P10 editing and data validation.

  1. FY 2009 Progress: Process Monitoring Technology Demonstration at PNNL

    Energy Technology Data Exchange (ETDEWEB)

    Arrigo, Leah M.; Christensen, Ronald N.; Fraga, Carlos G.; Liezers, Martin; Peper, Shane M.; Thomas, Elizabeth M.; Bryan, Samuel A.; Douglas, Matthew; Laspe, Amy R.; Lines, Amanda M.; Peterson, James M.; Ward, Rebecca M.; Casella, Amanda J.; Duckworth, Douglas C.; Levitskaia, Tatiana G.; Orton, Christopher R.; Schwantes, Jon M.

    2009-12-01

    Pacific Northwest National Laboratory (PNNL) is developing and demonstrating three technologies designed to assist in the monitoring of reprocessing facilities in near-real time. These technologies include 1) a multi-isotope process monitor (MIP), 2) a spectroscopy-based monitor that uses UV-Vis-NIR (ultraviolet-visible-near infrared) and Raman spectrometers, and 3) an electrochemically modulated separations approach (EMS). The MIP monitor uses gamma spectroscopy and pattern recognition software to identify off-normal conditions in process streams. The UV-Vis-NIR and Raman spectroscopic monitoring continuously measures chemical compositions of the process streams including actinide metal ions (uranium, plutonium, neptunium), selected fission products, and major cold flow sheet chemicals. The EMS approach provides an on-line means for separating and concentrating elements of interest out of complex matrices prior to detection via nondestructive assay by gamma spectroscopy or destructive analysis with mass spectrometry. A general overview of the technologies and ongoing demonstration results are described in this report.

  2. A Permanent Automated Real-Time Passive Acoustic Monitoring System for Bottlenose Dolphin Conservation in the Mediterranean Sea.

    Directory of Open Access Journals (Sweden)

    Marco Brunoldi

    Within the framework of the EU Life+ project named LIFE09 NAT/IT/000190 ARION, a permanent automated real-time passive acoustic monitoring system for the improvement of the conservation status of the transient and resident population of bottlenose dolphin (Tursiops truncatus) has been implemented and installed in the Portofino Marine Protected Area (MPA), Ligurian Sea. The system is able to detect the simultaneous presence of dolphins and boats in the area and to give their position in real time. This information is used to prevent collisions by diffusing warning messages to all the categories involved (tourists, professional fishermen and so on). The system consists of two gps-synchronized acoustic units, based on a particular type of marine buoy (elastic beacon), deployed about 1 km off the Portofino headland. Each one is equipped with a four-hydrophone array and an onboard acquisition system which can record the typical social communication whistles emitted by the dolphins and the sound emitted by boat engines. Signals are pre-filtered, digitized and then broadcast to the ground station via wi-fi. The raw data are elaborated to get the direction of the acoustic target to each unit, and hence the position of dolphins and boats in real time by triangulation.

  3. A Permanent Automated Real-Time Passive Acoustic Monitoring System for Bottlenose Dolphin Conservation in the Mediterranean Sea.

    Science.gov (United States)

    Brunoldi, Marco; Bozzini, Giorgio; Casale, Alessandra; Corvisiero, Pietro; Grosso, Daniele; Magnoli, Nicodemo; Alessi, Jessica; Bianchi, Carlo Nike; Mandich, Alberta; Morri, Carla; Povero, Paolo; Wurtz, Maurizio; Melchiorre, Christian; Viano, Gianni; Cappanera, Valentina; Fanciulli, Giorgio; Bei, Massimiliano; Stasi, Nicola; Taiuti, Mauro

    2016-01-01

    Within the framework of the EU Life+ project named LIFE09 NAT/IT/000190 ARION, a permanent automated real-time passive acoustic monitoring system for the improvement of the conservation status of the transient and resident population of bottlenose dolphin (Tursiops truncatus) has been implemented and installed in the Portofino Marine Protected Area (MPA), Ligurian Sea. The system is able to detect the simultaneous presence of dolphins and boats in the area and to give their position in real time. This information is used to prevent collisions by diffusing warning messages to all the categories involved (tourists, professional fishermen and so on). The system consists of two gps-synchronized acoustic units, based on a particular type of marine buoy (elastic beacon), deployed about 1 km off the Portofino headland. Each one is equipped with a four-hydrophone array and an onboard acquisition system which can record the typical social communication whistles emitted by the dolphins and the sound emitted by boat engines. Signals are pre-filtered, digitized and then broadcast to the ground station via wi-fi. The raw data are elaborated to get the direction of the acoustic target to each unit, and hence the position of dolphins and boats in real time by triangulation. PMID:26789265
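
    Positioning by triangulation from two bearing measurements, as described above, reduces to intersecting two lines with known origins and directions. The sketch below shows the basic geometry; the unit coordinates and bearings are invented for the example.

        import numpy as np

        def triangulate(p1, bearing1, p2, bearing2):
            """Intersect two bearing lines (angles in radians, measured from
            east, counterclockwise) from known unit positions p1 and p2."""
            d1 = np.array([np.cos(bearing1), np.sin(bearing1)])
            d2 = np.array([np.cos(bearing2), np.sin(bearing2)])
            # Solve p1 + t1*d1 == p2 + t2*d2 for t1, t2.
            A = np.column_stack([d1, -d2])
            t = np.linalg.solve(A, np.array(p2) - np.array(p1))
            return np.array(p1) + t[0] * d1

        # Two buoys about 1 km apart, both hearing the same whistle.
        print(triangulate((0.0, 0.0), np.deg2rad(60), (1000.0, 0.0), np.deg2rad(120)))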

  4. Monitoring of bone regeneration process by means of texture analysis

    International Nuclear Information System (INIS)

    An image analysis method is proposed for monitoring the regeneration of the tibial bone. For this purpose, 130 digitized radiographs of 13 patients, who had undergone tibial lengthening by the Ilizarov method, were studied. For each patient, 10 radiographs, taken at an equal number of successive postoperative time points, were available. Employing available software, 3 Regions Of Interest (ROIs), corresponding to the (a) upper, (b) central, and (c) lower aspects of the gap where bone regeneration was expected to occur, were determined on each radiograph. Employing custom-developed algorithms, (i) a number of textural features were generated from each of the ROIs, and (ii) a texture-feature-based regression model was designed for the quantitative monitoring of the bone regeneration process. Statistically significant differences were found, and the regression model achieved a good fit to the data (R2 = 0.9, p < 0.001). The suggested method may contribute to the monitoring of the tibial bone regeneration process.
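
    Textural features of the kind used here are commonly derived from gray-level co-occurrence matrices (GLCMs). The sketch below computes a few such features for a region of interest with scikit-image; it illustrates the general approach only, and the synthetic ROI and feature choice are assumptions rather than the paper's algorithms.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        rng = np.random.default_rng(4)
        roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)  # stand-in for a gap ROI

        # Co-occurrence matrix at distance 1 pixel, horizontal direction.
        glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256,
                            symmetric=True, normed=True)

        features = {prop: graycoprops(glcm, prop)[0, 0]
                    for prop in ("contrast", "homogeneity", "energy", "correlation")}
        print(features)  # inputs to a regression model tracking regeneration over time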

  5. Facility Effluent Monitoring Plan for the 325 Radiochemical Processing Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Shields, K.D.; Ballinger, M.Y.

    1999-04-02

    This Facility Effluent Monitoring Plan (FEMP) has been prepared for the 325 Building Radiochemical Processing Laboratory (RPL) at the Pacific Northwest National Laboratory (PNNL) to meet the requirements in DOE Order 5400.1, "General Environmental Protection Programs." This FEMP has been prepared for the RPL primarily because it has a "major" (potential to emit >0.1 mrem/yr) emission point for radionuclide air emissions according to the annual National Emission Standards for Hazardous Air Pollutants (NESHAP) assessment performed. This section summarizes the airborne and liquid effluents and the inventory-based NESHAP assessment for the facility. The complete monitoring plan includes characterization of effluent streams, monitoring/sampling design criteria, a description of the monitoring systems and sample analysis, and quality assurance requirements. The RPL at PNNL houses radiochemistry research, radioanalytical services, radiochemical process development, and hazardous and radioactive mixed waste treatment activities. The laboratories and specialized facilities enable work ranging from that with nonradioactive materials to work with picogram to kilogram quantities of fissionable materials and up to megacurie quantities of other radionuclides. The special facilities within the building include two shielded hot-cell areas that provide for process development or analytical chemistry work with highly radioactive materials, and a waste treatment facility for processing hazardous, mixed radioactive, low-level radioactive, and transuranic wastes generated by PNNL activities.

  6. Monitoring sodium in commercially processed foods from stores and restaurants

    Science.gov (United States)

    Most of the sodium we eat comes from commercially processed foods from stores and restaurants. Sodium reduction in these foods is a key component of several recent public health efforts. Agricultural Research Service (ARS) of USDA, CDC and FDA have launched a collaborative program to monitor sodium ...

  7. Process control and recovery in the Link Monitor and Control Operator Assistant

    Science.gov (United States)

    Lee, Lorrine; Hill, Randall W., Jr.

    1993-01-01

    This paper describes our approach to providing process control and recovery functions in the Link Monitor and Control Operator Assistant (LMCOA). The focus of the LMCOA is to provide semi-automated monitoring and control to support station operations in the Deep Space Network. The LMCOA will be demonstrated with precalibration operations for Very Long Baseline Interferometry on a 70-meter antenna. Precalibration, the task of setting up the equipment to support a communications link with a spacecraft, is a manual, time-consuming, and error-prone process. One problem with the current system is that it does not provide explicit feedback about the effects of control actions. The LMCOA uses a Temporal Dependency Network (TDN) to represent an end-to-end sequence of operational procedures and a Situation Manager (SM) module to provide process control, diagnosis, and recovery functions. The TDN is a directed network representing precedence, parallelism, precondition, and postcondition constraints. The SM maintains an internal model of the expected and actual states of the subsystems in order to determine whether each control action executed successfully and to provide feedback to the user. The LMCOA is implemented on a NeXT workstation using Objective-C, Interface Builder, and the C Language Integrated Production System.
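
    A Temporal Dependency Network of this kind can be modeled as a directed acyclic graph whose nodes carry precondition and postcondition checks and are executed in topological order. The sketch below is a generic illustration of that structure, not the LMCOA implementation; the precalibration steps and state flags are invented.

        from graphlib import TopologicalSorter

        # Each step: (preconditions, postconditions). State is a set of flags.
        steps = {
            "configure_receiver": ((), ("receiver_ready",)),
            "configure_recorder": ((), ("recorder_ready",)),
            "start_calibration": (("receiver_ready", "recorder_ready"), ("calibrated",)),
        }
        # Precedence constraints: node -> set of predecessor nodes.
        dependencies = {"start_calibration": {"configure_receiver", "configure_recorder"}}

        state = set()
        for step in TopologicalSorter(dependencies).static_order():
            pre, post = steps[step]
            if not set(pre) <= state:             # situation-manager style check
                print(f"{step}: preconditions {pre} not met, initiating recovery")
                break
            state.update(post)                    # expected effects of the action
            print(f"{step}: ok, state now {sorted(state)}")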

  8. Generation and monitoring of a discrete stable random process

    CERN Document Server

    Hopcraft, K I; Matthews, J O

    2002-01-01

    A discrete stochastic process with a stationary power-law distribution is obtained from a death-multiple immigration population model. Emigrations from the population form a random series of events which are monitored by a counting process with finite dynamic range and response time. It is shown that the power-law behaviour of the population is manifested in the intermittent behaviour of the series of events. (Letter to the editor.)
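
    The flavor of such a model can be captured with a small Monte Carlo sketch: a population subject to deaths at a rate proportional to its size, and immigration events that add multiple individuals at once. The rates, the geometric batch distribution, and the run length below are assumptions chosen for illustration, not the parameters of the paper.

        import numpy as np

        rng = np.random.default_rng(5)
        n, t, t_end = 10, 0.0, 1000.0
        death_rate, immigration_rate = 1.0, 0.5
        sizes = []

        # Gillespie-style simulation of a death-multiple immigration process.
        while t < t_end:
            total = death_rate * n + immigration_rate
            t += rng.exponential(1.0 / total)
            if rng.random() < death_rate * n / total:
                n -= 1                             # single death (emigration event)
            else:
                n += rng.geometric(0.3)            # immigration of multiple individuals
            sizes.append(n)

        sizes = np.array(sizes)
        print("mean population:", sizes.mean(), "max:", sizes.max())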

  9. Implementation of automated, on-line fatigue monitoring in a boiling water reactor

    International Nuclear Information System (INIS)

    A workstation-based, on-line fatigue monitoring system for tracking fatigue usage, applied to an operating Japanese boiling water reactor (BWR), Tsuruga Unit 1, is described. The system uses the influence function approach and the rainflow cycle counting methodology, operates on a workstation computer, and determines component stresses using temperature, pressure, and flow rate data that are made available via signal taps from previously existing plant sensors. Using plant-unique influence functions developed specifically for the feedwater nozzle location, the system calculates stresses as a function of time and computes the fatigue usage. The analysis method used to compute fatigue usage complies with MITI Code Notification No. 501. Fatigue values are saved automatically to files at times defined by the user for later use. Of particular note, this paper describes some of the details involved in implementing such a system from the utility perspective: installation details are given, along with the reasons such a system was chosen for implementation. Fatigue results for an entire fuel cycle are presented and compared to assumed design-basis events to confirm that actual plant thermal duty is significantly less severe than originally estimated in the design-basis stress report. Although the system is specifically set up to address fatigue duty for the feedwater nozzle location, a generic shell structure was implemented so that other components can be added at a future time without software modifications. As a result, the system provides the technical basis to more accurately evaluate actual reactor conditions, as well as justification for plant life extension.
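
    Rainflow counting plus a cumulative damage rule is the core of such fatigue tracking: each counted stress cycle contributes a fraction of the allowable cycles at its range. The sketch below uses the third-party rainflow package (assumed installed) and an invented stand-in for the design fatigue curve; it illustrates the calculation pattern only, not the code-compliant method described in the paper.

        import rainflow   # third-party rainflow-counting package, assumed installed

        # Synthetic stress history at a nozzle location (MPa); illustrative only.
        stress = [0, 120, 40, 180, -60, 90, 10, 150, 0]

        def allowed_cycles(stress_range):
            # Illustrative stand-in for a design fatigue (S-N) curve, not code data.
            return 1e12 / max(stress_range, 1.0) ** 3

        # Rainflow counting yields (range, count) pairs; Miner's rule sums the
        # ratio of counted cycles to allowable cycles at each range.
        usage = sum(count / allowed_cycles(rng)
                    for rng, count in rainflow.count_cycles(stress))
        print(f"cumulative fatigue usage factor: {usage:.2e}")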

  10. Advances in statistical monitoring of complex multivariate processes with applications in industrial process control

    CERN Document Server

    Kruger, Uwe

    2012-01-01

    The development and application of multivariate statistical techniques in process monitoring has gained substantial interest over the past two decades in academia and industry alike.  Initially developed for monitoring and fault diagnosis in complex systems, such techniques have been refined and applied in various engineering areas, for example mechanical and manufacturing, chemical, electrical and electronic, and power engineering.  The recipe for the tremendous interest in multivariate statistical techniques lies in its simplicity and adaptability for developing monitoring applica

  11. Batch process monitoring based on multilevel ICA-PCA

    Institute of Scientific and Technical Information of China (English)

    Zhi-qiang GE; Zhi-huan SONG

    2008-01-01

    In this paper, we describe a new batch process monitoring method based on multilevel independent component analysis and principal component analysis (MLICA-PCA). Unlike the conventional multi-way principal component analysis (MPCA) method, MLICA-PCA provides a separated interpretation for multilevel batch process data. Batch process data are partitioned into two levels: the within-batch level and the between-batch level. In each level, the Gaussian and non-Gaussian components of process information can be separately extracted. I2, T2, and SPE statistics are individually built and monitored. The new method facilitates fault diagnosis: since the two variation levels are decomposed, the variables responsible for faults in each level can be identified and interpreted more easily. A case study of the DuPont benchmark process showed that the proposed method was more efficient and interpretable in fault detection and diagnosis, compared to the alternative batch process monitoring method.

  12. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    Science.gov (United States)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

    A fully automated optimization process is presented for the design of ducted propellers under open-water conditions, including 3D geometry modeling, meshing, optimization algorithms, and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study was carried out for validation. Numerical simulations and open-water tests were performed and showed that the optimized ducted propeller improves hydrodynamic performance as predicted.

  13. Lunar surface mining for automated acquisition of helium-3: Methods, processes, and equipment

    Science.gov (United States)

    Li, Y. T.; Wittenberg, L. J.

    1992-01-01

    In this paper, several techniques considered for mining and processing the regolith on the lunar surface are presented. These techniques have been proposed and evaluated based primarily on the following criteria: (1) mining operations should be relatively simple; (2) mineral processing procedures should be few and relatively easy; (3) the transfer of tonnages of regolith on the Moon should be minimized; (4) operations outside the lunar base should be readily automated; (5) all equipment should be maintainable; and (6) the economic benefit should be sufficient for commercial exploitation. The economic benefits are not addressed in this paper; however, the energy benefits have been estimated to be between 250 and 350 times the mining energy. A mobile mining scheme is proposed that meets most of the mining objectives. This concept uses a bucket-wheel excavator for excavating the regolith, several mechanical electrostatic separators for beneficiation of the regolith, a fast-moving fluidized bed reactor to heat the particles, and a palladium diffuser to separate H2 from the other solar wind gases. At the final stage of the miner, the regolith tailings are deposited directly into the ditch behind the miner, and cylinders of the valuable solar wind gases are transported to a central gas processing facility. During the production of He-3, large quantities of valuable H2, H2O, CO, CO2, and N2 are produced for utilization at the lunar base. For larger production of He-3, the use of multiple miners is recommended rather than increasing their size: multiple miners permit operations at more sites and provide redundancy in case of equipment failure.

  14. Quantitative and Qualitative Analysis of Aconitum Alkaloids in Raw and Processed Chuanwu and Caowu by HPLC in Combination with Automated Analytical System and ESI/MS/MS

    OpenAIRE

    Aimin Sun; Bo Gao; Xueqing Ding; Chi-Ming Huang; Paul Pui-Hay But

    2012-01-01

    HPLC in combination with automated analytical system and ESI/MS/MS was used to analyze aconitine (A), mesaconitine (MA), hypaconitine (HA), and their benzoyl analogs in the Chinese herbs Caowu and Chuanwu. First, an HPLC method was developed and validated to determine A, MA, and HA in raw and processed Caowu and Chuanwu. Then an automated analytical system and ESI/MS/MS were applied to analyze these alkaloids and their semihydrolyzed products. The results obtained from automated analytical sy...

  15. AUTOMATED CALCULATIONS OF THE DYNAMICS OF TURBOGENERATOR ELECTROMAGNETIC PROCESSES IN SOFTWARE ENVIRONMENT FEMM

    Directory of Open Access Journals (Sweden)

    V.I. Milykh

    2015-12-01

    Attention is paid to the popular FEMM (Finite Element Method Magnetics) program, which is effective in numerical calculations of the magnetic fields of electrical machines. The principles of automated calculations providing analysis of the dynamics of electromagnetic processes in turbo-generators are presented. This is realized as a script in the algorithmic language Lua, integrated with FEMM. The temporal functions of electromagnetic quantities are obtained by multi-position calculations of the magnetic field, with the field rotating together with the turbo-generator rotor. The developed program is universal in terms of the geometry and dimensions of turbo-generators, as well as their operating modes, and requires a minimum of input data in numerical form. This paper shows the extraction of discrete temporal functions: the magnetic flux linkage of the phase stator winding; the forces acting on the current-carrying and ferromagnetic elements of the structure; the magnetic induction at fixed points; and the electromagnetic moment. This list can be expanded as part of the created program, and the use of the program can be extended to other types of electrical machines. A full period of variation of any of the functions is obtained by rotating the rotor through 60°.
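
    The multi-position scheme (solve the field, sample quantities, rotate the rotor one step, repeat over a period) reduces to a simple loop. The sketch below shows only that loop structure; solve_field and flux_linkage are hypothetical toy stand-ins for the FEMM scripting calls, not FEMM's actual API, and the 1° step over a 60° period is taken from the abstract.

        import math

        STEP_DEG = 1.0      # rotor step per field solution (assumed)
        PERIOD_DEG = 60.0   # one period of the temporal functions (see abstract)

        def solve_field(model, angle_deg):
            # Placeholder for a magnetostatic solve at the given rotor angle.
            return {"angle": math.radians(angle_deg)}

        def flux_linkage(solution):
            # Toy stand-in for post-processing the solved field; a real script
            # would integrate the vector potential over the winding region.
            return math.cos(3 * solution["angle"])

        def run_multi_position(model="turbogenerator"):
            samples = []
            angle = 0.0
            while angle < PERIOD_DEG:
                sol = solve_field(model, angle)          # solve at this position
                samples.append((angle, flux_linkage(sol)))
                angle += STEP_DEG                        # rotate rotor and repeat
            return samples                               # discrete temporal function

        print(run_multi_position()[:3])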

  16. Methodology on Investigating the Influences of Automated Material Handling System in Automotive Assembly Process

    Science.gov (United States)

    Saffar, Seha; Azni Jafar, Fairul; Jamaludin, Zamberi

    2016-02-01

A case study was selected as the method to collect data in an actual industry situation. The study aimed to assess the influence of an automated material handling system in the automotive industry by proposing a new design of an integration system through simulation and analyzing the significant effects and influence of the system. The approach uses CAD software (Delmia & Quest) as its main tool. Preliminary data gathering in phase 1 collects all related data from the actual industry situation and is expected to produce guidelines and limitations for designing the new integration system. In phase 2, a design concept will be developed using 10 principles of design consideration for manufacturing. A full factorial design will be used as the design of experiment in order to compare the measured performance of the integration system with the current system in the case study. From the results of the experiment, an ANOVA will be performed to study the measured performance. Thus, it is expected that the influence of the improvements made to the system can be seen.

  17. Automated continuous monitoring of inorganic and total mercury in wastewater and other waters by flow-injection analysis and cold-vapour atomic absorption spectrometry

    OpenAIRE

    Birnie, S. E.

    1988-01-01

    An automated continuous monitoring system for the determination of inorganic and total mercury by flow-injection analysis followed by cold-vapour atomic absorption spectrometry is described. The method uses a typical flow-injection manifold where digestion and reduction of the injected sample takes place. Mercury is removed by aeration from the flowing stream in a specially designed air-liquid separator and swept into a silica cell for absorption measurement at a wavelength of 253.7 nm. A cal...
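
The record is truncated at what is evidently the calibration step. As a generic illustration of how such an instrument is calibrated, the sketch below fits a straight line to absorbance readings of mercury standards and inverts it to report unknown samples; all concentrations and absorbances here are invented for the example.

```python
import numpy as np

std_conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0])           # Hg standards, ug/L (assumed)
std_abs = np.array([0.001, 0.024, 0.049, 0.101, 0.248])  # absorbance at 253.7 nm

slope, intercept = np.polyfit(std_conc, std_abs, 1)      # linear (Beer-Lambert) range

def hg_concentration(absorbance):
    """Invert the calibration line to estimate Hg concentration (ug/L)."""
    return (absorbance - intercept) / slope

print(f"sample: {hg_concentration(0.073):.2f} ug/L")
```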

  18. Post-Lamination Manufacturing Process Automation for Photovoltaic Modules; Annual Technical Progress Report: 15 June 1999--14 July 2000

    Energy Technology Data Exchange (ETDEWEB)

    Nowlan, M. J.; Murach, J. M.; Sutherland, S. F.; Lewis, E. R.; Hogan, S. J.

    2000-09-29

    Spire is addressing the PVMaT project goals of photovoltaic (PV) module cost reduction and improved module manufacturing process technology. New cost-effective automation processes are being developed for post-lamination PV module assembly, where post-lamination is defined as the processes after the solar cells are encapsulated. These processes apply to both crystalline and thin-film solar cell modules. Four main process areas are being addressed: (1) Module buffer storage and handling between steps; (2) Module edge trimming, edge sealing, and framing; (3) Junction-box installation; and (4) Testing for module performance, electrical isolation, and ground-path continuity.

  19. Acoustic monitoring of a fluidized bed coating process

    DEFF Research Database (Denmark)

    Naelapaa, Kaisa; Veski, Peep; Pedersen, Joan G.;

    2007-01-01

...point perspective. The acoustic monitoring has the potential of summarising the commonly used means to monitor the coating process. The best partial least squares (PLS) regressions, obtained by the high frequency accelerometer, showed for the release a correlation coefficient of 0.92 and a root mean square error of prediction (RMSEP) of 5.84% (31-82.8%), and for the estimated amount of film applied a correlation coefficient of 0.95 and RMSEP of 0.52% (0.6-6%). The results of the preliminary investigation are considered promising. There is however a need for further investigations on sampling...
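
To make the quoted statistics concrete, the sketch below scores a PLS regression from acoustic features to the applied film amount using the same two figures the record reports, the correlation coefficient and the RMSEP. The data are synthetic placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))                             # 60 runs x 200 acoustic features
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)  # film amount proxy

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()

rmsep = np.sqrt(np.mean((y_te - y_hat) ** 2))  # root mean square error of prediction
r = np.corrcoef(y_te, y_hat)[0, 1]
print(f"RMSEP = {rmsep:.3f}, r = {r:.2f}")
```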

  20. Rapid Automated Treatment Planning Process to Select Breast Cancer Patients for Active Breathing Control to Achieve Cardiac Dose Reduction

    International Nuclear Information System (INIS)

Purpose: To evaluate a rapid automated treatment planning process for the selection of patients with left-sided breast cancer for a moderate deep inspiration breath-hold (mDIBH) technique using active breathing control (ABC); and to determine the dose reduction to the left anterior descending coronary artery (LAD) and the heart using mDIBH. Methods and Materials: Treatment plans were generated using an automated method for patients undergoing left-sided breast radiotherapy (n = 53) with two-field tangential intensity-modulated radiotherapy. All patients with unfavorable cardiac anatomy, defined as having >10 cm3 of the heart receiving 50% of the prescribed dose (V50) on the free-breathing automated treatment plan, underwent repeat scanning on a protocol using a mDIBH technique and ABC. The doses to the LAD and heart were compared between the free-breathing and mDIBH plans. Results: The automated planning process required approximately 9 min to generate a breast intensity-modulated radiotherapy plan. Using the dose–volume criteria, 20 of the 53 patients were selected for ABC. Significant differences were found between the free-breathing and mDIBH plans for the heart V50 (29.9 vs. 3.7 cm3), mean heart dose (317 vs. 132 cGy), mean LAD dose (2,047 vs. 594 cGy), and maximal dose to 0.2 cm3 of the LAD (4,155 vs. 1,507 cGy; all differences significant). Seventeen of the 20 selected patients achieved a reduced heart V50 using the mDIBH technique; the 3 patients with an insufficient breath-hold threshold did not achieve a reduced heart V50. Conclusions: A rapid automated treatment planning process can be used to select patients who will benefit most from mDIBH. For selected patients with unfavorable cardiac anatomy, the mDIBH technique using ABC can significantly reduce the dose to the LAD and heart, potentially reducing the cardiac risks.

  1. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    Science.gov (United States)

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately but throughput is still a major issue and automation is very essential. The throughput is limited, both in terms of sample preparation as well as analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome-aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and functional evolution of the classical DCA towards increasing critically needed throughput.

3. Analysis of the Optimal Duration of Behavioral Observations Based on an Automated Continuous Monitoring System in Tree Swallows (Tachycineta bicolor): Is One Hour Good Enough?

    Directory of Open Access Journals (Sweden)

    Ádám Z Lendvai

Studies of animal behavior often rely on human observation, which introduces a number of limitations on sampling. Recent developments in automated logging of behaviors make it possible to circumvent some of these problems. Once verified for efficacy and accuracy, these automated systems can be used to determine optimal sampling regimes for behavioral studies. Here, we used a radio-frequency identification (RFID) system to quantify parental effort in a bi-parental songbird species: the tree swallow (Tachycineta bicolor). We found that the accuracy of the RFID monitoring system was similar to that of video-recorded behavioral observations for quantifying parental visits. Using RFID monitoring, we also quantified the optimum duration of sampling periods for male and female parental effort by looking at the relationship between nest visit rates estimated from sampling periods with different durations and the total visit numbers for the day. The optimum sampling duration (the shortest observation time that explained the most variation in total daily visits per unit time) was 1 h for both sexes. These results show that RFID and other automated technologies can be used to quantify behavior when human observation is constrained, and the information from these monitoring technologies can be useful for evaluating the efficacy of human observation methods.
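
The duration analysis described above can be reproduced in outline: estimate visit rates from windows of increasing length, correlate each with the daily total, and take the shortest window that explains most of the variance. The sketch uses simulated visit counts, since the RFID logs are not part of the record.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nests, minutes = 40, 14 * 60                      # a 14-h observation day
visits = rng.poisson(0.2, size=(n_nests, minutes))  # per-minute visit counts per nest
daily_total = visits.sum(axis=1)

for window in (15, 30, 60, 120, 240):               # candidate durations, minutes
    rate = visits[:, :window].sum(axis=1) / window  # visits per minute in the window
    r = np.corrcoef(rate, daily_total)[0, 1]
    print(f"{window:>4}-min window: r = {r:.2f}")
```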

  4. The Error Monitoring and Processing System in Alcohol Use

    Directory of Open Access Journals (Sweden)

    Menizibeya O. Welcome

    2010-10-01

Background: Current data suggest that alcohol might play a significant role in error commission. Error commission is related to the functions of the Error Monitoring and Processing System (EMPS) located in the substantia nigra of the midbrain, the basal ganglia and the cortex of the forebrain. The main components of the EMPS are the dopaminergic system and the anterior cingulate cortex. Although recent data show that alcohol disrupts the EMPS, the ways in which alcohol affects this system are poorly understood. Aims & Objectives: We reviewed recent data that suggest the indirect effect of alcohol use on error commission. Methods / Study Design: Databases were searched for relevant literature using the following keyword combination: Alcohol AND Error Commission (OR Processing, Monitoring, Correction, Detection). Literature was searched in scientific databases (Medline, DOAJ, Embase) from 1940 to August 2010 and on journal websites (Psychophysiology, Neuroscience and Trends in Neuroscience). Manual book search, including library information, was included in the data collection process. Other additional information was searched through Google. Results / Findings: Blood and brain glucose levels play a vital role in error commission, and are related to error commission, monitoring and processing through the modulation of the activity of the dopaminergic system. To summarize our findings, we suggest the Alcohol-Related Glucose-Dependent System of Error Monitoring and Processing (ARGD-EMPS) hypothesis, which holds that disruption of the EMPS is related to the competency of glucose homeostasis regulation, which in turn may determine the dopamine level as a major component of the EMPS. The ARGD-EMPS hypothesis explains the general processes and mechanism of alcohol-related disruption of the EMPS. Conclusion: Alcohol may indirectly disrupt the EMPS by affecting dopamine level through disorders in blood glucose homeostasis regulation. The

  5. Fiber optic sensors for process monitoring of composite aerospace structures

    Science.gov (United States)

    Menendez Martin, Jose M.; Munoz-Esquer, Pedro; Rodriguez-Lence, Fernando; Guemes, J. Alfredo

    2002-07-01

There are currently many software tools available for modeling the processing of composite materials, which help designers evaluate process constraints and the feasibility of different concepts. Nevertheless, several manufacturing tests are still required to adjust the control parameters before production may start. Real-time monitoring is the only way to validate the numerical results and to gain deeper knowledge of the process evolution. The final objective would be a closed loop known as 'Intelligent Material Processing': process model - in-situ sensors - predictive control, able to react in real time to small disturbances, adapting the process parameters for optimal results. This paper concentrates on sensor development for two aerospace processes, autoclave curing and RTM, and it presents the results obtained on a real aircraft structural part, a five-meter-diameter frame for the fuselage of the Airbus A380. An optical fiber system has been implemented to monitor the movement of the resin flow front during the injection and the internal residual strains. The procedure has the advantage of being very robust, and it may be used for parts of complex geometry. The feasibility of the procedure in an industrial environment has been demonstrated; the results are being used to refine the data on the material properties, such as the preform permeability, and to improve the process control.

  6. Toward Real-Time Continuous, Automated Hydrogeophysical Monitoring of Aquifer Storage and Recovery: Results of a Pilot-Scale Experiment, Charleston, South Carolina

    Science.gov (United States)

    Day-Lewis, F. D.; Singha, K.; Versteeg, R. J.; Johnson, C. D.; Petkewich, M. D.; Richardson, A.; Rowe, T.; Lane, J. W.

    2005-12-01

    Aquifer storage and recovery (ASR) is used increasingly as a water-resources management tool, particularly in arid and coastal areas. ASR involves subsurface freshwater injection and storage during periods of water surplus and subsequent extraction during periods of water deficit or high demand. In coastal areas, injection into brackish-to-saline aquifers creates freshwater zones, the shapes and extents of which are controlled by aquifer heterogeneity and ground-water flow. ASR efficiency is limited by a lack of information about (1) the spatial and temporal distribution of injected freshwater and (2) possible degradation of aquifer properties resulting from injections. Without such knowledge, ASR managers cannot optimize injection and extraction schemes, nor can they predict or prevent breakthrough of brackish water at pumping wells. In this study, we examine the potential of hydrogeophysical monitoring as a management tool for ASR operations. In August-September 2005, time-lapse electrical resistivity tomography (ERT), combined with conventional chemical and hydraulic sampling, was conducted during a pilot-scale ASR experiment in an Atlantic Coastal Plain aquifer in Charleston, SC. The field site consists of 4 wells including three observation wells arranged symmetrically around a central injection/extraction well at radial distances of about 9 m. The wells are 140-155 m deep. Sand and limestone sections of the Santee Limestone/Black Mingo aquifer served as target zones for injection, storage, recovery, and ERT monitoring. We acquired time-lapse ERT data sets every 2.5 hours during 120 hours of injection, 48 hours of quiescent storage, and 96 hours of extraction. A key aspect of this work was the use of an autonomous remote monitoring system developed by Idaho National Laboratory (INL), which controls data collection, automated data upload to a central server, and parsing of the data into a relational database. In addition, this system provides a web interface

  7. Integratable high temperature ultrasonic transducers for NDT of metals and industrial process monitoring

    International Nuclear Information System (INIS)

Thick (> 40 μm) piezoelectric ceramic films have been successfully deposited on metallic substrates by a sol-gel spray technique as high-temperature ultrasonic transducers (HTUTs). Our novel approach focuses on fabrication techniques for these HTUTs at the test site with handheld equipment and no furnaces. These HTUTs can be integrated onto large metallic structures such as pipes and molds for real-time, on-line automated NDT and process monitoring at the sensor location. The characteristics of these ultrasonic transducers are that they (1) can be fabricated directly onto the desired planar or curved metallic substrate, such as a large pipe, at the NDT site; (2) do not need couplant; (3) can operate in pulse/echo mode with a signal-to-noise ratio of more than 30 dB at 10 MHz; and (4) can operate at more than 400 °C. These HTUTs can be made onto thin metallic membranes as flexible transducers that can be wrapped around samples with cylindrical surfaces for NDT applications. The capability of these thick-film UTs for NDT applications at temperatures up to 440 °C and for real-time, non-intrusive and nondestructive process monitoring of polymer injection molding has been demonstrated. (author)
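
The 30 dB figure quoted above is an amplitude ratio; a two-line check (amplitudes assumed):

```python
import math

def snr_db(signal_amplitude, noise_amplitude):
    """Signal-to-noise ratio in dB for amplitude quantities."""
    return 20.0 * math.log10(signal_amplitude / noise_amplitude)

print(snr_db(1.0, 0.03))  # ~30.5 dB, the level reported for pulse/echo at 10 MHz
```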

  8. Data Pre-Processing for Label-Free Multiple Reaction Monitoring (MRM) Experiments

    Directory of Open Access Journals (Sweden)

    Lisa M. Chung

    2014-06-01

Multiple Reaction Monitoring (MRM) conducted on a triple quadrupole mass spectrometer allows researchers to quantify the expression levels of a set of target proteins. Each protein is often characterized by several unique peptides that can be detected by monitoring predetermined fragment ions, called transitions, for each peptide. Concatenating large numbers of MRM transitions into a single assay enables simultaneous quantification of hundreds of peptides and proteins. In recognition of the important role that MRM can play in hypothesis-driven research and its increasing impact on clinical proteomics, targeted proteomics such as MRM was recently selected as the Nature Method of the Year. However, there are many challenges in MRM applications, especially data pre-processing, where many steps still rely on manual inspection of each observation in practice. In this paper, we discuss an analysis pipeline to automate MRM data pre-processing. This pipeline includes data quality assessment across replicated samples, outlier detection, identification of inaccurate transitions, and data normalization. We demonstrate the utility of our pipeline through its applications to several real MRM data sets.
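
A minimal sketch of two of the pipeline stages named above, median normalization across replicate runs and a simple per-transition outlier flag; the data layout (runs x transitions) and thresholds are assumptions, not the paper's choices.

```python
import numpy as np

rng = np.random.default_rng(2)
intensity = rng.lognormal(mean=10.0, sigma=1.0, size=(6, 50))  # 6 runs x 50 transitions

# align the run medians on the log scale (removes per-run loading differences)
log_i = np.log2(intensity)
log_i -= np.median(log_i, axis=1, keepdims=True)

# flag transitions whose spread across replicates is unusually high
spread = log_i.std(axis=0)
suspect = np.where(spread > spread.mean() + 3 * spread.std())[0]
print("suspect transitions:", suspect)
```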

  9. Implementation of automated macro after develop inspection in a production lithography process

    Science.gov (United States)

    Yanof, Arnold W.; Plachecki, Vincent E.; Fischer, Frank W.; Cusacovich, Marcelo; Nelson, Chris; Merrill, Mark A.

    2000-06-01

...impossibility of accurate classification and recording of defect types, locations, and layer of occurrence. In this paper, we discuss a pilot implementation of an automated macro inspection system at Motorola, Inc., which has enabled the early detection and containment of significant photolithography defects. We show a variety of different types of defects that have been effectively detected and identified by this system during production usage. We introduce a methodology for determining the automated tool's ability to discriminate between the defect signal and process noise. We indicate the potential for defect database analysis and identification of maverick product. Based upon the pilot experience, we discuss the parameters of a cost/benefit analysis of full implementation. The costs involve tool cost, additional wafer dispositions, and the engineering costs of recipe management. The most tangible measurable benefit is the saved revenue from scrapped wafers. An analysis of risk also shows a major reduction due to improved detection, as well as reduced occurrence because of better containment. This reduction of risk extends both to the customer -- in terms of field failures, OTD, and maverick product -- and to the production facility -- in terms of major scrap incidents, forced inking at probe, redo, and containment.

  10. A Preliminary Study of the Configuration Modes and Functions of Hydropower Plant Electrical Automation Monitoring Systems

    Institute of Scientific and Technical Information of China (English)

    马艳冰

    2014-01-01

This paper analyses the configuration modes of hydropower plant electrical automation monitoring and examines the functions of the hydropower plant electrical automation monitoring system, in order to provide a reference for related work.

  11. Signal processing methodologies for an acoustic fetal heart rate monitor

    Science.gov (United States)

    Pretlow, Robert A., III; Stoughton, John W.

    1992-01-01

    Research and development is presented of real time signal processing methodologies for the detection of fetal heart tones within a noise-contaminated signal from a passive acoustic sensor. A linear predictor algorithm is utilized for detection of the heart tone event and additional processing derives heart rate. The linear predictor is adaptively 'trained' in a least mean square error sense on generic fetal heart tones recorded from patients. A real time monitor system is described which outputs to a strip chart recorder for plotting the time history of the fetal heart rate. The system is validated in the context of the fetal nonstress test. Comparisons are made with ultrasonic nonstress tests on a series of patients. Comparative data provides favorable indications of the feasibility of the acoustic monitor for clinical use.
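
The linear predictor named above is commonly realized as a least-mean-square (LMS) adaptive filter. Below is a minimal sketch with a synthetic burst-like tone standing in for recorded heart tones; filter order and step size are assumptions.

```python
import numpy as np

def lms_predictor(x, order=8, mu=0.01):
    """One-step LMS linear predictor; returns predictions and error signal."""
    w = np.zeros(order)
    y_hat = np.zeros_like(x)
    for n in range(order, len(x)):
        window = x[n - order:n][::-1]   # most recent samples first
        y_hat[n] = w @ window
        err = x[n] - y_hat[n]
        w += 2 * mu * err * window      # LMS weight update
    return y_hat, x - y_hat

fs = 1000
t = np.arange(0, 2, 1 / fs)
tone = np.sin(2 * np.pi * 70 * t) * (np.sin(2 * np.pi * 2.3 * t) > 0.9)  # tone bursts
x = tone + 0.3 * np.random.default_rng(3).normal(size=t.size)
pred, err = lms_predictor(x)
# events are declared where the short-term energy of the prediction rises
# above a threshold; heart rate then follows from the event spacing
```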

  12. Multivariate Statistical Process Monitoring Using Robust Nonlinear Principal Component Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHAO Shijian; XU Yongmao

    2005-01-01

The principal component analysis (PCA) algorithm is widely applied in a diverse range of fields for performance assessment, fault detection, and diagnosis. However, in the presence of noise and gross errors, nonlinear PCA (NLPCA) using autoassociative bottleneck neural networks is so sensitive that the obtained model differs significantly from the underlying system. In this paper, a robust version of NLPCA is introduced by replacing the commonly used mean-squared-error criterion with a mean log squared error. This is followed by a concise analysis of the corresponding training method. A novel multivariate statistical process monitoring (MSPM) scheme incorporating the proposed robust NLPCA technique is then investigated and its efficiency is assessed through application to an industrial fluidized catalytic cracking plant. The results demonstrate that, compared with NLPCA, the proposed approach can effectively reduce the number of false alarms and is, hence, expected to better monitor real-world processes.
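
The robustness modification described above can be shown in isolation: replace the mean squared error with a mean log squared error, which grows slowly for gross errors and therefore down-weights outliers during training. The exact form and scaling used in the paper are not given in the record; the log(1 + r^2)-style variant below is an assumption.

```python
import numpy as np

def mse(residual):
    return np.mean(residual ** 2)

def mean_log_squared_error(residual, eps=1.0):
    # assumed log(eps + r^2) form; the paper's exact scaling may differ
    return np.mean(np.log(eps + residual ** 2))

r = np.array([0.1, -0.2, 0.05, 8.0])  # the last residual is a gross error
print(mse(r))                          # ~16: dominated by the outlier
print(mean_log_squared_error(r))       # ~1.1: the outlier contributes only log(65)
```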

  13. A new versatile in-process monitoring system for milling

    CERN Document Server

    Ritou, Mathieu; Furet, Benoît; Hascoët, Jean-Yves

    2013-01-01

Tool condition monitoring (TCM) systems can improve productivity and ensure workpiece quality, yet there is a lack of reliable TCM solutions for small-batch or one-off manufacturing of industrial parts. TCM methods that include the characteristics of the cut seem to be particularly suitable for these demanding applications. In the first section of this paper, three process-based indicators are retrieved from the literature dealing with TCM. They are analysed using a cutting force model, and experiments are carried out in industrial conditions. Specific transient cuts encountered during the machining of the test part reveal the indicators to be unreliable. Consequently, in the second section, a versatile in-process monitoring method is suggested. Based on experiments carried out under a range of different cutting conditions, an adequate indicator is proposed: the relative radial eccentricity of the cutters is estimated at each instant and characterizes the tool state. It is then compared with the previo...

  14. System and process for pulsed multiple reaction monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Belov, Mikhail E

    2013-05-17

A new pulsed multiple reaction monitoring process and system are disclosed that use a pulsed ion injection mode in conjunction with triple-quadrupole instruments. The pulsed injection mode approach reduces background ion noise at the detector, increases the amplitude of the ion signal, and provides a unity duty cycle, yielding a significant sensitivity increase for reliable quantitation of proteins/peptides present at attomole levels in highly complex biological mixtures.

  15. Automated Thermal Image Processing for Detection and Classification of Birds and Bats - FY2012 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Duberstein, Corey A.; Matzner, Shari; Cullinan, Valerie I.; Virden, Daniel J.; Myers, Joshua R.; Maxwell, Adam R.

    2012-09-01

    Surveying wildlife at risk from offshore wind energy development is difficult and expensive. Infrared video can be used to record birds and bats that pass through the camera view, but it is also time consuming and expensive to review video and determine what was recorded. We proposed to conduct algorithm and software development to identify and to differentiate thermally detected targets of interest that would allow automated processing of thermal image data to enumerate birds, bats, and insects. During FY2012 we developed computer code within MATLAB to identify objects recorded in video and extract attribute information that describes the objects recorded. We tested the efficiency of track identification using observer-based counts of tracks within segments of sample video. We examined object attributes, modeled the effects of random variability on attributes, and produced data smoothing techniques to limit random variation within attribute data. We also began drafting and testing methodology to identify objects recorded on video. We also recorded approximately 10 hours of infrared video of various marine birds, passerine birds, and bats near the Pacific Northwest National Laboratory (PNNL) Marine Sciences Laboratory (MSL) at Sequim, Washington. A total of 6 hours of bird video was captured overlooking Sequim Bay over a series of weeks. An additional 2 hours of video of birds was also captured during two weeks overlooking Dungeness Bay within the Strait of Juan de Fuca. Bats and passerine birds (swallows) were also recorded at dusk on the MSL campus during nine evenings. An observer noted the identity of objects viewed through the camera concurrently with recording. These video files will provide the information necessary to produce and test software developed during FY2013. The annotation will also form the basis for creation of a method to reliably identify recorded objects.

  16. The Investigation of Monitoring Systems for SMAW Processes

    Directory of Open Access Journals (Sweden)

    Ahmed Samir Hamza

    2009-01-01

Monitoring weld quality is increasingly important because it enables great financial savings, especially in manufacturing, where defective welds lead to losses in production and necessitate time-consuming and expensive repair. This research deals with the monitoring and controllability of the fusion arc welding process using an Artificial Neural Network (ANN) model. The effect of weld parameters on weld quality was studied using the experimental results obtained from welding a non-galvanized steel plate (ASTM BN 1323) of 6 mm thickness under different weld parameters (current, voltage, and travel speed) monitored by electronic systems, followed by destructive (tensile and bending) and non-destructive (hardness on the HAZ) tests to investigate the quality of the weld specimens. The experimental results are then processed through the ANN model to control the welding process and predict the level of quality for different welding conditions. It has been deduced that the welding conditions (current, voltage, and travel speed) are dominant factors affecting weld quality and strength. We also found that, for a given welding condition, there was an optimum weld travel speed that yields optimum weld quality. The system supports quality control procedures and welding productivity without requiring periodic destructive mechanical tests on dozens of samples.
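
As an illustrative stand-in for the ANN described above (architecture and data are assumptions, not the paper's), a small multilayer perceptron can map (current, voltage, travel speed) to a weld-quality score:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)
X = rng.uniform([90, 18, 2], [160, 28, 8], size=(200, 3))  # current A, voltage V, speed mm/s
heat_input = X[:, 0] * X[:, 1] / X[:, 2]                   # J/mm, up to an efficiency factor
quality = -((heat_input - 600.0) ** 2) / 1e5 + rng.normal(scale=0.05, size=200)

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                     random_state=0).fit(X, quality)
print(model.predict([[120.0, 22.0, 4.0]]))  # predicted quality at one setting
```

The synthetic quality score peaks at one heat input, mirroring the paper's observation that, for a given current and voltage, there is an optimum travel speed.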

  17. Real-Time Monitoring of Psychotherapeutic Processes: Concept and Compliance

    Science.gov (United States)

    Schiepek, Günter; Aichhorn, Wolfgang; Gruber, Martin; Strunk, Guido; Bachler, Egon; Aas, Benjamin

    2016-01-01

    Objective: The feasibility of a high-frequency real-time monitoring approach to psychotherapy is outlined and tested for patients' compliance to evaluate its integration to everyday practice. Criteria concern the ecological momentary assessment, the assessment of therapy-related cognitions and emotions, equidistant time sampling, real-time nonlinear time series analysis, continuous participative process control by client and therapist, and the application of idiographic (person-specific) surveys. Methods: The process-outcome monitoring is technically realized by an internet-based device for data collection and data analysis, the Synergetic Navigation System. Its feasibility is documented by a compliance study on 151 clients treated in an inpatient and a day-treatment clinic. Results: We found high compliance rates (mean: 78.3%, median: 89.4%) amongst the respondents, independent of the severity of symptoms or the degree of impairment. Compared to other diagnoses, the compliance rate was lower in the group diagnosed with personality disorders. Conclusion: The results support the feasibility of high-frequency monitoring in routine psychotherapy settings. Daily collection of psychological surveys allows for the assessment of highly resolved, equidistant time series data which gives insight into the nonlinear qualities of therapeutic change processes (e.g., pattern transitions, critical instabilities). PMID:27199837

  18. Real-Time Monitoring of Psychotherapeutic Processes: Concept and Compliance

    Directory of Open Access Journals (Sweden)

    Guenter Karl Schiepek

    2016-05-01

Objective: The feasibility of a high-frequency real-time monitoring approach to psychotherapy is outlined and tested for patients' compliance to evaluate its integration into everyday practice. Criteria concern the ecological momentary assessment, the assessment of therapy-related cognitions and emotions, equidistant time sampling, real-time nonlinear time series analysis, continuous participative process control by client and therapist, and the application of idiographic (person-specific) surveys. Methods: The process-outcome monitoring is technically realized by an internet-based device for data collection and data analysis, the Synergetic Navigation System. Its feasibility is documented by a compliance study on 151 clients treated in an inpatient and a day-treatment clinic. Results: We found high compliance rates (mean: 78.3%, median: 89.4%) amongst the respondents, independent of the severity of symptoms or the degree of impairment. Compared to other diagnoses, the compliance rate was lower in the group diagnosed with personality disorders. Conclusion: The results support the feasibility of high-frequency monitoring in routine psychotherapy settings. Daily collection of psychological surveys allows for assessment of highly resolved, equidistant time series data which gives insight into the nonlinear qualities of therapeutic change processes (e.g., pattern transitions, critical instabilities).

  19. Ultrasonic monitoring of material processing using clad buffer rod sensors

    Science.gov (United States)

    Ramos Franca, Demartonne

Ultrasonic sensors and techniques are developed for in-line monitoring of polymer extrusion, cleanliness of molten metals and liquid flow speed at elevated temperature. Pulse-echo mode is used for the first two processes, while through-transmission mode is applied in the third. The ultrasonic probe consists of high-performance clad buffer rods with different dimensions to thermally isolate the commercial ultrasonic transducer from materials at high temperature. The clad buffer rods are made of steel, polymer and ceramic. Steel clad buffer rods are introduced for in-line monitoring of polymer extrusion processes. Owing to its superior performance in pulse-echo mode, for the first time such a probe is installed and performs ultrasonic monitoring in the die of a co-extrusion machine and in the barrel section of a twin-screw extruder. It can reveal a variety of information relevant to process parameters, such as polymer layer thickness, interface location and adhesion quality, stability, or polymer composition change. For the ultrasonic monitoring of polymer processes, probes with acoustic impedance that matches that of the processed polymer may offer certain advantages such as quantitative viscoelastic evaluation; thus high-temperature polymer clad buffer rods, in particular of PEEK, are developed. It is demonstrated that this new probe exhibits unique advantages for in-line monitoring of the cure of epoxies and the polymer extrusion process. Long steel clad buffer rods with a spherical focus lens machined at the probing end are proposed for cleanliness evaluation of molten metals. The potential of this focusing probe is demonstrated by means of high-resolution imaging and particle detection in molten zinc at temperatures higher than 600°C, using a single probe operated in pulse-echo mode. A contrapropagating ultrasonic flowmeter employing steel clad buffer rods is devised to operate at high temperature. It is demonstrated that these rods guide ultrasonic signals
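
The contrapropagating flowmeter mentioned above rests on the standard transit-time relation (geometry values assumed): sound travels faster downstream than upstream, and the reciprocal-time difference is proportional to the mean flow speed.

```python
import math

def flow_speed(t_down, t_up, path_len, angle_deg):
    """Mean flow speed (m/s) from downstream/upstream transit times (s),
    acoustic path length (m) and path angle to the flow axis (deg)."""
    return (path_len / (2.0 * math.cos(math.radians(angle_deg)))) * \
           (1.0 / t_down - 1.0 / t_up)

# example: 0.10 m path at 45 deg with a fraction of a microsecond of asymmetry
print(flow_speed(66.2e-6, 66.9e-6, 0.10, 45.0))  # ~11 m/s
```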

  20. Translation Expert (TranslationQ & RevisionQ): Automated translation process with real-time feedback & evaluation/ revision with PIE

    OpenAIRE

    Steurs, Frieda; Segers, Winibert; Kockaert, Hendrik

    2015-01-01

This paper reports on an experiment working with a new evaluation technique for translator training. Organizing high-level translation classes in a master in translation involves intensive assessment of the work delivered by the students. The evaluation has to be precise, professional, and at...

  1. A Generic Framework for Systematic Design of Process Monitoring and Control System for Crystallization Processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Meisler, Kresten Troelstrup; Sin, Gürkan;

    2012-01-01

A generic framework for systematic design of a process monitoring and control system for crystallization processes has been developed in order to obtain the desired end-product properties, notably the crystal size distribution (CSD). The design framework contains a generic crystallizer modelling tool-box, a tool for design of operational policies, as well as a tool for design of process monitoring and control systems. Through this framework, it is possible for a wide range of crystallization processes to generate the necessary problem-system specific model, the necessary operational policy, and a Process Analytical Technology (PAT) system design including implementation of monitoring tools and control strategies in order to produce a desired product with its corresponding target properties. Application of the framework is highlighted through a case study involving the system potassium dihydrogen...

  2. Analysis of dip coating processing parameters by double optical monitoring.

    Science.gov (United States)

    Horowitz, Flavio; Michels, Alexandre F

    2008-05-01

Double optical monitoring is applied to determine the influence of the main process parameters on the formation of sulfated zirconia and self-assembled mesoporous silica sol-gel films by dip coating. In addition, we analyze, for the first time to the best of our knowledge, the influence of withdrawal speed, temperature, and relative humidity on refractive-index and physical thickness variations (uncertainties of +/-0.005 and +/-7 nm) during the process. Results provide insight into the controlled production of single and multilayer films from complex fluids by dip coating. PMID:18449244

  3. Safety. [requirements for software to monitor and control critical processes

    Science.gov (United States)

    Leveson, Nancy G.

    1991-01-01

Software requirements, design, implementation, verification and validation, and especially management are affected by the need to produce safe software. This paper discusses the changes in the software life cycle that are necessary to ensure that software will execute without resulting in unacceptable risk. Software is being used increasingly to monitor and control safety-critical processes in which a run-time failure or error could result in unacceptable losses such as death, injury, loss of property, or environmental harm. Examples of such processes may be found in transportation, energy, aerospace, basic industry, medicine, and defense systems.

  4. Imaging 3D strain field monitoring during hydraulic fracturing processes

    Science.gov (United States)

    Chen, Rongzhang; Zaghloul, Mohamed A. S.; Yan, Aidong; Li, Shuo; Lu, Guanyi; Ames, Brandon C.; Zolfaghari, Navid; Bunger, Andrew P.; Li, Ming-Jun; Chen, Kevin P.

    2016-05-01

In this paper, we present a distributed fiber optic sensing scheme to study 3D strain fields inside concrete cubes during the hydraulic fracturing process. Optical fibers embedded in concrete were used to monitor the 3D strain field build-up under external hydraulic pressures. High-spatial-resolution strain fields were interrogated via in-fiber Rayleigh backscattering with 1-cm spatial resolution using optical frequency domain reflectometry. The fiber optic sensor scheme presented in this paper provides scientists and engineers with a unique laboratory tool for understanding hydraulic fracturing processes in various rock formations and their impact on the environment.

  5. Conception through build of an automated liquids processing system for compound management in a low-humidity environment.

    Science.gov (United States)

    Belval, Richard; Alamir, Ab; Corte, Christopher; DiValentino, Justin; Fernandes, James; Frerking, Stuart; Jenkins, Derek; Rogers, George; Sanville-Ross, Mary; Sledziona, Cindy; Taylor, Paul

    2012-12-01

Boehringer Ingelheim's Automated Liquids Processing System (ALPS) in Ridgefield, Connecticut, was built to accommodate all compound solution-based operations following dissolution in neat DMSO. Process analysis resulted in the design of two nearly identical conveyor-based subsystems, each capable of executing 1400 × 384-well plate or punch tube replicates per batch. The two parallel-positioned subsystems are capable of independent execution or can alternatively be executed as a unified system for more complex or higher-throughput processes. Primary ALPS functions include creation of high-throughput screening plates, concentration-response plates, and reformatted master stock plates (e.g., 384-well plates from 96-well plates). Integrated operations included centrifugation, unsealing/piercing, broadcast diluent addition, barcode print/application, compound transfer/mix via disposable pipette tips, and plate sealing. ALPS key features included instrument pooling for increased capacity or fail-over situations, programming constructs to associate one source plate to an array of replicate plates, and stacked collation of completed plates. Due to the hygroscopic nature of DMSO, ALPS was designed to operate within a 10% relative humidity environment. The activities described are the collaborative efforts that contributed to the specification, build, delivery, and acceptance testing between Boehringer Ingelheim Pharmaceuticals, Inc. and the automation integration vendor, Thermo Scientific Laboratory Automation (Burlington, ON, Canada).

  6. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  7. In-process monitoring of powder metal component production

    Science.gov (United States)

    Rehbein, D. K.; Foley, J. C.; Osborne, M. G.; Lograsso, B. K.

    2000-05-01

The goal of nondestructive evaluation is to determine the usefulness and quality of a component in such a manner as to leave it suitable for subsequent use. In many manufacturing processes, it is desirable to test the quality of the component being produced before production is completed, in order to minimize the cost of rejected parts. This paper examines the in-situ monitoring of two processes used in the manufacture of powder metallurgy parts: (1) production of green-state components by Powder Injection Molding (PIM) and (2) sintering of green-state components. In both cases, ultrasonic waves are used to monitor and characterize the quality of the part during processing. In the PIM process, the effect of changes in various machine operating parameters is reflected in the ultrasonic signals investigated, resulting in detection of processing under non-optimal conditions. Changes in the acoustic properties of the component during sintering can be detected in-situ, thereby allowing a judgement of the completeness of the sintering. The PIM portion of the work was funded through the SBIR program of the U.S. Army Research Laboratory. The sintering work was funded through the Ames Laboratory directed research and development grant.

  8. Land Cover and Land Use Classification with TWOPAC: towards Automated Processing for Pixel- and Object-Based Image Classification

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2012-09-01

We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework, named TWOPAC (TWinned Object and Pixel based Automated classification Chain), enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information, with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also allows classification based on object-based characteristics. Classification is based on a Decision Tree approach (DT), for which the well-known C5.0 code has been implemented, which builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier based on a C5.0-retrieved ascii-file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in preferably a short processing time, standardized interfaces for process control, Web Processing Services (WPS), as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC's functionality to process geospatial raster or vector data via web resources (server, network) enables TWOPAC's usability independent of any commercial client or desktop software and allows for large-scale data processing on servers. Furthermore, the components of TWOPAC were built using open source code components and are implemented as a plug-in for Quantum GIS software for easy handling of the classification process from the user's perspective.
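
TWOPAC wraps the proprietary C5.0 learner; a close open-source analog is a scikit-learn decision tree with the entropy criterion, shown here on synthetic pixel spectra together with the sample-based accuracy assessment the record mentions. Band values, the labeling rule, and tree depth are all invented for the example.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
bands = rng.uniform(0.0, 1.0, size=(500, 6))            # 6 spectral bands per pixel
labels = (bands[:, 3] - bands[:, 2] > 0.1).astype(int)  # NDVI-like vegetation rule

clf = DecisionTreeClassifier(criterion="entropy", max_depth=5, random_state=0)
print(cross_val_score(clf, bands, labels, cv=5).mean())  # sample-based accuracy
```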

  9. SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS

    Data.gov (United States)

National Aeronautics and Space Administration — Biomass monitoring,...
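
Only the title of this record survives, so the sketch below illustrates the general technique it names rather than the paper's method: fit a Gaussian process to an in-control window of a seasonal, biomass-like series, then flag later observations falling outside the predictive interval. Data, kernel, and threshold are all assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ExpSineSquared, WhiteKernel

t = np.arange(100, dtype=float)[:, None]
clean = np.sin(2 * np.pi * t.ravel() / 24.0)  # 24-step seasonal cycle
clean[80:] -= 1.5                             # abrupt change (e.g., a disturbance)
obs = clean + 0.1 * np.random.default_rng(6).normal(size=100)

kernel = ExpSineSquared(length_scale=1.0, periodicity=24.0,
                        periodicity_bounds="fixed") + WhiteKernel(0.01)
gp = GaussianProcessRegressor(kernel=kernel).fit(t[:60], obs[:60])  # pre-change window
mean, std = gp.predict(t, return_std=True)

changed = np.abs(obs - mean) > 4 * std        # outside the predictive interval
alarms = np.flatnonzero(changed)
print("first flagged time step:", alarms[0] if alarms.size else "none")
```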

  10. A systematic framework for design of process monitoring and control (PAT) systems for crystallization processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist;

    2013-01-01

A generic computer-aided framework for systematic design of a process monitoring and control system for crystallization processes has been developed to study various aspects of crystallization operations. The systematic design framework contains a generic crystallizer modelling toolbox, a tool for...

  11. Automation of management processes as a factor in the emergence of the jobs of the future

    Directory of Open Access Journals (Sweden)

    Veretehin Vladislav Vadimovich

    2016-03-01

The article reviews research by domestic and foreign organizations modelling the demand for and supply of professions on the labour market. It determines that most management functions are being transferred to automated systems, robots and machines. The article presents a table listing professions that are being replaced by automated object-management systems and a table listing professions that are being replaced by automated document-management systems. It defines the professions that will be in demand 'tomorrow' (by 2020) and 'the day after tomorrow' (after 2020), the 'retired' professions, and the professions of the future, and describes the professional skills required for the jobs of the future. Based on research by the Agency for Strategic Initiatives and the Moscow School of Management SKOLKOVO, the top 10 most in-demand professions of the future are identified, the need to replace old professions is noted, and a table of the jobs of the future is compiled according to the SKOLKOVO research. The efficiency of transferring management functions to automated systems is considered.

  12. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi

    2016-01-29

Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring detection tools: the conventional PCA-based monitoring indices, Hotelling's T2 and Q, and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performance of the proposed methods was compared with that of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
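
A hedged sketch of the combination the abstract describes, using the Q (squared prediction error) statistic as the example: compute Q from a PCA model of in-control data, then smooth it with an EWMA so that small, persistent shifts accumulate into an alarm. Dimensions, the injected shift, and the smoothing factor are illustrative, not the paper's settings.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
train = rng.normal(size=(500, 10))       # in-control data
test = rng.normal(size=(200, 10))
test[100:, 0] += 1.5                     # small-to-moderate mean shift to detect

pca = PCA(n_components=3).fit(train)

def q_statistic(X, model):
    """Squared prediction error of each sample against the PCA model."""
    recon = model.inverse_transform(model.transform(X))
    return ((X - recon) ** 2).sum(axis=1)

lam = 0.2                                 # EWMA smoothing factor
q = q_statistic(test, pca)
ewma = np.empty_like(q)
ewma[0] = q_statistic(train, pca).mean()  # start at the in-control level
for i in range(1, len(q)):
    ewma[i] = lam * q[i] + (1 - lam) * ewma[i - 1]
# an alarm is raised when ewma exceeds a control limit estimated from the
# training data, e.g. mean + 3 standard deviations of the smoothed statistic
```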

  13. Prototype development of filter monitor for 131I processing plant

    International Nuclear Information System (INIS)

Iodine-131 (131I) is used extensively in nuclear medicine because of its short half-life and useful beta emission. The Isotope Production and Applications Division (IP and AD) of BARC produces 131I in its processing plant. Charcoal filters capable of extracting high levels of radioactive iodine and particulates from the suction flow are installed in the plant. The radioactive iodine is fully removed and deposited onto activated charcoal impregnated with potassium iodide. These charcoal filters become saturated over a period of use and need to be replaced with fresh ones. A 5-channel filter monitor for online measurement of the radiation level of trapped 131I on the charcoal filter is being developed by IP and AD, BARC. The unavailability of this type of instrument motivated this development. The current paper deals with a prototype filter monitor developed with a single detector. Results demonstrating the functionality of the system are presented. (author)

  14. Adaptive Local Outlier Probability for Dynamic Process Monitoring

    Institute of Scientific and Technical Information of China (English)

    Yuxin Ma; Hongbo Shi; Mengling Wang

    2014-01-01

Complex industrial processes often have multiple operating modes and present time-varying behavior. The data in one mode may follow specific Gaussian or non-Gaussian distributions. In this paper, a numerically efficient moving window local outlier probability algorithm is proposed. Its key feature is the capability to handle complex data distributions and incursive operating condition changes, including slow dynamic variations and instant mode shifts. First, a two-step adaptation approach is introduced and some designed updating rules are applied to keep the monitoring model up-to-date. Then, a semi-supervised monitoring strategy is developed with an updating switch rule to deal with mode changes. Based on local probability models, the algorithm has a superior ability in detecting faulty conditions and fast adapting to slow variations and new operating modes. Finally, the utility of the proposed method is demonstrated with a numerical example and a non-isothermal continuous stirred tank reactor.
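
The record's algorithm is not reproduced here, but its moving-window idea can be illustrated with a far simpler analog: estimate a local Gaussian model from the last w samples and convert each new sample's deviation into an outlier probability via the normal CDF.

```python
import numpy as np
from math import erf, sqrt

def window_outlier_prob(x, w=50):
    """Outlier probability of each sample w.r.t. the preceding window."""
    probs = np.zeros_like(x)
    for i in range(w, len(x)):
        ref = x[i - w:i]
        z = abs(x[i] - ref.mean()) / (ref.std() + 1e-9)
        probs[i] = erf(z / sqrt(2.0))   # P(|Z| <= z) for a unit normal
    return probs

rng = np.random.default_rng(8)
x = np.concatenate([rng.normal(0, 1, 300),     # normal operation
                    rng.normal(4, 1, 5),       # brief fault
                    rng.normal(0.5, 1, 300)])  # slow shift to a new mode
p = window_outlier_prob(x)
print("first flagged samples:", np.where(p > 0.999)[0][:5])
```

Because the window slides, the local model re-centers on the new mode after the shift, which is the adaptive behavior the paper formalizes with its updating rules.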

  15. Nonintrusive Monitoring and Control of Metallurgical Processes by Acoustic Measurements

    Science.gov (United States)

    Yu, Hao-Ling; Khajavi, Leili Tafaghodi; Barati, Mansoor

    2011-06-01

The feasibility of developing a new online monitoring technique based on the characteristic acoustic response of gas bubbles in a liquid has been investigated. The method is intended to monitor the chemistry of the liquid through its relation to the bubble sound frequency. A low-temperature model consisting of water and alcohol mixtures was established, and the frequency of bubbles rising under varying concentrations of methanol was measured. It was shown that the frequency of the sound created by bubble pulsation varies with the percentage of alcohol in water. The frequency drops sharply with the increase in methanol content up to 20 wt pct, after which the decrease is gradual. Surface tension seems to be a critical liquid property affecting the sound frequency through its two-fold effect on the bubble size and the pulsation domain. The dependence of the frequency on the liquid composition suggests the feasibility of developing an acoustic-based technique for process control purposes.
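
The physics linking bubble sound to the liquid is conventionally captured by the Minnaert resonance relation (the paper's exact model is not given in the record): f = (1 / (2*pi*a)) * sqrt(3*gamma*p0 / rho). Surface tension enters indirectly by setting the bubble radius at detachment.

```python
import math

def minnaert_frequency(radius_m, density, pressure=101325.0, gamma=1.4):
    """Resonance frequency (Hz) of a gas bubble of given radius in a liquid."""
    return math.sqrt(3.0 * gamma * pressure / density) / (2.0 * math.pi * radius_m)

# a 2 mm bubble: water vs. an approximate water-methanol mixture density
for rho in (998.0, 950.0):
    print(rho, round(minnaert_frequency(2e-3, rho)), "Hz")
```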

  16. Self-tuning process monitoring system for process-based product

    Energy Technology Data Exchange (ETDEWEB)

    Hillaire, R. [Sandia National Labs., Livermore, CA (United States); Loucks, C. [Sandia National Labs., Albuquerque, NM (United States)

    1998-02-01

    The hidden qualities of a product are often revealed in the process. Subsurface material damage, surface cracks, and unusual burr formation can occur during a poorly controlled machining process. Standard post process inspection is costly and may not reveal these conditions. However, by monitoring the proper process parameters, these conditions are readily detectable without incurring the cost of post process inspection. In addition, many unforeseen process anomalies may be detected using an advanced process monitoring system. This work created a process monitoring system for milling machines which mapped the forces, power, vibration, and acoustic emissions generated during a cutting cycle onto a 3D model of the part being machined. The hyperpoint overlay can be analyzed and visualized with VRML (Virtual Reality Modeling Language). Once the Process Monitoring System is deployed, detailed inspection may be significantly reduced or eliminated. The project deployed a Pro-Engineer to VRML model conversion routine, advanced visualization interface, tool path transformation with mesh generation routine, hyperpoint overlay routine, stable sensor array, sensor calibration routine, and machine calibration methodology. The technology created in this project can help validate production of WR (War Reserve) components by generating process signatures for products, processes, and lot runs. The signatures of each product can be compared across all products made within and across lot runs to determine if the processes that produced the product are consistently providing superior quality. Furthermore, the qualities of the processes are visibly apparent, since the part model is overlaid with process data. The system was evaluated on three different part productions.

  17. Shoe-String Automation

    Energy Technology Data Exchange (ETDEWEB)

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  18. Novel methods for process monitoring and quality control using mass spectrometry

    Science.gov (United States)

    McClennen, William Herbert

    1999-11-01

    Both development and production of polymeric materials require sophisticated and versatile methods for process monitoring and quality control. This dissertation describes simplified methods of sample introduction to facilitate the use of mass spectrometry (MS) in monitoring chemical processes. The three general methods examined in this thesis are pyrolysis-gas chromatography (Py-GC), direct vapor sampling GC and direct laser Py-MS. The first of the two chapters on Py-GC utilizes an earlier Curie-point Py inlet design to analyze the oxygen containing compounds in coal-derived liquids by flash vaporization GUMS. Based on the capabilities and limitations of the older pyrolysis inlet, the second chapter describes a novel Curie-point Py-GC inlet characterized by a micro-volume pyrolysis/desorption chamber and a built-in split for more efficient transfer of products to the capillary GC column. Applications of the patented inlet to the analysis of various polymers and low volatility additives feature microgram sample size and direct sampling techniques. An automated vapor sampling (VS) inlet was developed for rapid, repetitive introduction of ambient vapors and gases into the inlet of a capillary GC column with a sub-ambient pressure detector. Two chapters describe applications of VS/GC/MS with short GC columns for on-line monitoring of combustion and thermal desorption processes. In the first formaldehyde vapors are monitored in wood combustion gases at concentrations as low as 1 ppm with a 30 s sampling interval. Other compounds simultaneously determined included ethenone, propylene, propyne, and acetaldehyde. The second VS chapter describes the use of GC/MS and GC/MSn (tandem MS) system to monitor the evolution of alkylbenzenes and polycyclic aromatic hydrocarbons during thermal desorption of coal tar contaminated soils with detection limits in the low ppb range. Direct laser Py-MS is used to analyze several experimental compounded rubbers containing a variety of

  19. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  20. Process monitoring for intelligent manufacturing processes - Methodology and application to Robot Assisted Polishing

    DEFF Research Database (Denmark)

    Pilny, Lukas

    variation in surface roughness was indirectly monitored through identified KPVs in terms of Acoustic Emission (AE), friction forces and power consumption during polishing. A dedicated polishing arm with integrated strain gauge based force sensors and a miniature AE sensor was developed, enabling in...... of six consecutive steps for identification of product Vital Quality Characteristics (VQCs) and Key Process Variables (KPVs), selection and characterization of sensors, optimization of sensors placement, validation of the monitoring solutions, definition of the reference manufacturing performance...... of process monitoring and control strategy for automatic process End Point Detection (EPD) and on the machine surface characterization in Robot Assisted Polishing (RAP) with oscillating tool. VQCs were identified in terms of surface roughness, defects and gloss. Polishing progression in terms of relative...

  1. Unsupervised process monitoring and fault diagnosis with machine learning methods

    CERN Document Server

    Aldrich, Chris

    2013-01-01

    This unique text/reference describes in detail the latest advances in unsupervised process monitoring and fault diagnosis with machine learning methods. Abundant case studies throughout the text demonstrate the efficacy of each method in real-world settings. The broad coverage examines such cutting-edge topics as the use of information theory to enhance unsupervised learning in tree-based methods, the extension of kernel methods to multiple kernel learning for feature extraction from data, and the incremental training of multilayer perceptrons to construct deep architectures for enhanced data
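
    As a concrete taste of this area (not an example from the book itself): a minimal unsupervised monitoring scheme fits a PCA model to in-control data and flags new samples whose reconstruction error (the SPE/Q statistic) exceeds an empirical control limit. The data and the 99% limit below are illustrative.

        # Unsupervised fault detection via PCA reconstruction error (SPE/Q statistic).
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        normal = rng.normal(size=(500, 10))        # in-control training data
        pca = PCA(n_components=3).fit(normal)

        def spe(x):
            recon = pca.inverse_transform(pca.transform(x))
            return ((x - recon) ** 2).sum(axis=1)  # squared reconstruction error

        limit = np.percentile(spe(normal), 99)     # empirical 99% control limit
        new = rng.normal(size=(20, 10))
        new[10:] += 3.0                            # inject a fault into the last samples
        print(spe(new) > limit)                    # alarms, mostly on the faulty samples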

  2. ["Veille sanitaire": tools, functions, process of healthcare monitoring in France].

    Science.gov (United States)

    Eilstein, D; Salines, G; Desenclos, J-C

    2012-10-01

    In France, the term "veille sanitaire" is widely used to designate healthcare monitoring. It contains, however, a set of concepts that are not shared equally by the entire scientific community. The same is true for the activities that are part of it, even if some (surveillance, for example) are already well defined. Concepts such as "observation", "vigilance", and "alert", for example, are not always clear. Furthermore, the use of these words in everyday language maintains this ambiguity. Thus, it seemed necessary to recall these definitions as already used in the literature or legislative texts and to make alternative suggestions. This formalization cannot be carried out without thinking about the structure of "veille sanitaire" and its components. Proposals are provided, bringing out the concepts of formatted and non-formatted "veille" (monitoring). The definitions, functions, methods and tools, and processes of these two components are outlined here, as well as the cooperative relationship they sustain. The authors have attempted to provide the scientific community with a reference framework useful for exchanging information to promote research and methodological development dedicated to this public health application of epidemiology.

  3. Design and Application of Electric Power Automation Equipment Online Monitoring System

    Institute of Scientific and Technical Information of China (English)

    刘晖

    2016-01-01

    Equipment maintenance management in current electric power dispatching automation systems is mainly limited to configuration management and cannot provide timely, effective early warning of unexpected events. A dispatching automation equipment status online monitoring system was therefore designed and applied. The system adopts a modular structure and provides real-time monitoring, alarming, remote maintenance, diagnosis, and other functions. After application at the Chaohu Power Supply Company of the Anhui Electric Power Company, the system realized real-time monitoring and diagnosis of the operating condition of automation equipment, detecting faults promptly and raising alarms through multiple channels so that maintenance personnel can handle faults in time, effectively ensuring the safe, stable, and reliable operation of the dispatching automation equipment.

  4. Rapid Automated Treatment Planning Process to Select Breast Cancer Patients for Active Breathing Control to Achieve Cardiac Dose Reduction

    Energy Technology Data Exchange (ETDEWEB)

    Wang Wei; Purdie, Thomas G. [Radiation Medicine Program, Princess Margaret Hospital, University Health Network, Toronto, ON (Canada); Department of Radiation Oncology, University of Toronto, Toronto, ON (Canada); Rahman, Mohammad; Marshall, Andrea [Radiation Medicine Program, Princess Margaret Hospital, University Health Network, Toronto, ON (Canada); Liu Feifei [Radiation Medicine Program, Princess Margaret Hospital, University Health Network, Toronto, ON (Canada); Department of Radiation Oncology, University of Toronto, Toronto, ON (Canada); Fyles, Anthony, E-mail: anthony.fyles@rmp.uhn.on.ca [Radiation Medicine Program, Princess Margaret Hospital, University Health Network, Toronto, ON (Canada); Department of Radiation Oncology, University of Toronto, Toronto, ON (Canada)

    2012-01-01

    Purpose: To evaluate a rapid automated treatment planning process for the selection of patients with left-sided breast cancer for a moderate deep inspiration breath-hold (mDIBH) technique using active breathing control (ABC); and to determine the dose reduction to the left anterior descending coronary artery (LAD) and the heart using mDIBH. Methods and Materials: Treatment plans were generated using an automated method for patients undergoing left-sided breast radiotherapy (n = 53) with two-field tangential intensity-modulated radiotherapy. All patients with unfavorable cardiac anatomy, defined as having >10 cm³ of the heart receiving 50% of the prescribed dose (V₅₀) on the free-breathing automated treatment plan, underwent repeat scanning on a protocol using a mDIBH technique and ABC. The doses to the LAD and heart were compared between the free-breathing and mDIBH plans. Results: The automated planning process required approximately 9 min to generate a breast intensity-modulated radiotherapy plan. Using the dose-volume criteria, 20 of the 53 patients were selected for ABC. Significant differences were found between the free-breathing and mDIBH plans for the heart V₅₀ (29.9 vs. 3.7 cm³), mean heart dose (317 vs. 132 cGy), mean LAD dose (2,047 vs. 594 cGy), and maximal dose to 0.2 cm³ of the LAD (4,155 vs. 1,507 cGy; all p < .001). Of the 17 patients who had a breath-hold threshold of ≥0.8 L, 14 achieved a ≥90% reduction in the heart V₅₀ using the mDIBH technique. The 3 patients who had a breath-hold threshold <0.8 L achieved a lower, but still significant, reduction in the heart V₅₀. Conclusions: A rapid automated treatment planning process can be used to select patients who will benefit most from mDIBH. For selected patients with unfavorable cardiac anatomy, the mDIBH technique using ABC can significantly reduce the dose to the LAD and heart, potentially reducing the cardiac risks.

  5. Online monitoring and control of the biogas process

    Energy Technology Data Exchange (ETDEWEB)

    Boe, K.

    2006-07-01

    The demand for online monitoring and control of the biogas process is increasing, since a better monitoring and control system can improve process stability and enhance process performance, improving the economy of biogas plants. A number of parameters in both the liquid and the gas phase have been suggested as process indicators. These include gas production, pH, alkalinity, volatile fatty acids (VFA) and hydrogen. Of these, VFA is the most widely recognised as a direct, relevant measure of stability, and individual, rather than collective, VFA concentrations are recognised as providing significantly more information for diagnosis. However, classic on-line measurement is based on filtration, which suffers from fouling, especially in particulate or slurry wastes. In this project, a new online VFA monitoring system has been developed using gas-phase VFA extraction to avoid sample filtration. The liquid sample is pumped into a sampling chamber, acidified, salted and heated to extract VFA into the gas phase before analysis by GC-FID. This allows easy application to manure. Sampling and analysis time varies from 25 to 40 min depending on the washing duration. The sampling frequency is fast enough for the dynamics of a manure digester, which are in the range of several hours. The system has been validated over more than 6 months and has shown good agreement with offline VFA measurement. The response of this sensor was compared with other process parameters such as biogas production, pH and dissolved hydrogen during overload situations in a laboratory-scale digester, to investigate the suitability of each measure as a process indicator. VFA was most reliable for indicating process imbalance, and propionate was most persistent. However, when coupling the online VFA monitoring with a simple controller for automatically regulating the propionate level in a digester, it was found that propionate decreased so slowly that the biogas production fluctuated. Therefore, it is more

  6. The european primary care monitor: structure, process and outcome indicators

    Directory of Open Access Journals (Sweden)

    Wilson Andrew

    2010-10-01

    Background: Scientific research has provided evidence on the benefits of well-developed primary care systems. The relevance of some of this research for the European situation is limited. There is currently a lack of up-to-date, comprehensive and comparable information on variation in the development of primary care, and a lack of knowledge of structures and strategies conducive to strengthening primary care in Europe. The EC-funded project Primary Health Care Activity Monitor for Europe (PHAMEU) aims to fill this gap by developing a Primary Care Monitoring System (PC Monitor) for application in 31 European countries. This article describes the development of the indicators of the PC Monitor, which will make it possible to create an alternative model for holistic analyses of primary care. Methods: A systematic review of the primary care literature published between 2003 and July 2008 was carried out. This resulted in an overview of: (1) the dimensions of primary care and their relevance to outcomes at (primary) health system level; (2) essential features per dimension; (3) applied indicators to measure the features of primary care dimensions. The indicators were evaluated by the project team against criteria of relevance, precision, flexibility, and discriminating power. The resulting indicator set was evaluated on its suitability for Europe-wide comparison of primary care systems by a panel of primary care experts from various European countries (representing a variety of primary care systems). Results: The developed PC Monitor approaches primary care in Europe as a multidimensional concept. It describes the key dimensions of primary care systems at three levels: structure, process, and outcome. On the structure level, it includes indicators for governance, economic conditions, and workforce development. On the process level, indicators describe access, comprehensiveness, continuity, and coordination of primary care services. On the outcome level, indicators

  7. Multi-Source Data Processing Middleware for Land Monitoring within a Web-Based Spatial Data Infrastructure for Siberia

    Directory of Open Access Journals (Sweden)

    Christiane Schmullius

    2013-06-01

    Land monitoring is a key issue in Earth system sciences to study environmental changes. To generate knowledge about change, e.g., to decrease uncertainty in the results and build confidence in land change monitoring, multiple information sources are needed. Earth observation (EO) satellites and in situ measurements are available for operational monitoring of the land surface. As the availability of well-prepared geospatial time-series data for environmental research is limited, user-dependent processing steps with respect to data sources and formats pose additional challenges. In most cases, it is possible to support science with spatial data infrastructures (SDI) and services to provide such data in a processed format. A data processing middleware is proposed as a technical solution to improve interdisciplinary research using multi-source time-series data and standardized data acquisition, pre-processing, updating and analyses. This solution is being implemented within the Siberian Earth System Science Cluster (SIB-ESS-C), which combines various sources of EO data, climate data and analytical tools. The development of this SDI is based on the definition of automated and on-demand tools for data searching, ordering and processing, implemented along with standard-compliant web services. These tools, consisting of a user-friendly download, analysis and interpretation infrastructure, are available within SIB-ESS-C for operational use.

  8. Estimation of urinary stone composition by automated processing of CT images

    CERN Document Server

    Chevreau, Grégoire; Conort, Pierre; Renard-Penna, Raphaëlle; Mallet, Alain; Daudon, Michel; Mozer, Pierre; 10.1007/s00240-009-0195-3

    2009-01-01

    The objective of this article was to develop an automated tool for routine clinical practice to estimate urinary stone composition from CT images based on the density of all constituent voxels. A total of 118 stones for which the composition had been determined by infrared spectroscopy were placed in a helical CT scanner. Standard, low-dose and high-dose acquisitions were performed. All voxels constituting each stone were automatically selected. A dissimilarity index evaluating variations of density around each voxel was created in order to minimize partial volume effects: stone composition was established on the basis of the voxel density of homogeneous zones. Stone composition was determined in 52% of cases. Sensitivities for each compound were: uric acid: 65%, struvite: 19%, cystine: 78%, carbapatite: 33.5%, calcium oxalate dihydrate: 57%, calcium oxalate monohydrate: 66.5%, brushite: 75%. Low-dose acquisition did not lower the performance (P < 0.05). This entirely automated approach eliminat...
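
    A hedged sketch of the approach described above: score each stone voxel by the local variation of density around it (a stand-in for the paper's dissimilarity index), keep only homogeneous voxels to limit partial-volume effects, and compare their mean density against reference ranges. The window size, threshold, and Hounsfield ranges are illustrative assumptions, not the paper's values.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def classify_stone(hu_volume, stone_mask, ranges, max_dissimilarity=30.0):
            """hu_volume: 3D array of CT densities; stone_mask: 3D bool array;
            ranges: e.g. {"uric acid": (300, 500), "brushite": (1500, 1900)} (illustrative)."""
            hu = hu_volume.astype(float)
            mean = uniform_filter(hu, size=3)
            mean_sq = uniform_filter(hu ** 2, size=3)
            local_std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))   # local density variation
            homogeneous = stone_mask & (local_std < max_dissimilarity)  # suppress partial volume
            density = hu[homogeneous].mean()
            for name, (lo, hi) in ranges.items():
                if lo <= density < hi:
                    return name, density
            return "undetermined", density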

  9. Technologies for the Fast Set-Up of Automated Assembly Processes

    DEFF Research Database (Denmark)

    Krüger, Norbert; Ude, Ales; Petersen, Henrik Gordon;

    2014-01-01

    In this article, we describe technologies facilitating the set-up of automated assembly solutions which have been developed in the context of the IntellAct project (2011–2014). Tedious procedures are currently still required to establish such robot solutions. This hinders especially the automatio...... work on tele-operation, dexterous grasping, pose estimation and learning of control strategies. The prototype developed in IntellAct is at a TRL4 (corresponding to ‘demonstration in lab environment’)....

  10. A Comparison of Natural Language Processing Methods for Automated Coding of Motivational Interviewing.

    Science.gov (United States)

    Tanana, Michael; Hallgren, Kevin A; Imel, Zac E; Atkins, David C; Srikumar, Vivek

    2016-06-01

    Motivational interviewing (MI) is an efficacious treatment for substance use disorders and other problem behaviors. Studies on MI fidelity and mechanisms of change typically use human raters to code therapy sessions, which requires considerable time, training, and financial costs. Natural language processing techniques have recently been utilized for coding MI sessions using machine learning techniques, rather than human coders, and preliminary results have suggested these methods hold promise. The current study extends this previous work by introducing two natural language processing models for automatically coding MI sessions via computer. The two models differ in the way they semantically represent session content, utilizing either (1) simple discrete sentence features (DSF model) or (2) more complex recursive neural networks (RNN model). Utterance- and session-level predictions from these models were compared to ratings provided by human coders using a large sample of MI sessions (N = 341 sessions; 78,977 clinician and client talk turns) from 6 MI studies. Results show that the DSF model generally had slightly better performance than the RNN model. The DSF model had "good" or higher utterance-level agreement with human coders (Cohen's kappa > 0.60) for open and closed questions, affirm, giving information, and follow/neutral (all therapist codes); considerably higher agreement was obtained for session-level indices, and many estimates were competitive with human-to-human agreement. However, there was poor agreement for client change talk, client sustain talk, and therapist MI-inconsistent behaviors. Natural language processing methods provide accurate representations of human-derived behavioral codes and could offer substantial improvements to the efficiency and scale with which MI mechanisms-of-change research and fidelity monitoring are conducted.
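
    Not the authors' DSF model itself, but a minimal stand-in showing the evaluation pattern: code utterances with simple bag-of-words features and a linear classifier, then report utterance-level agreement with human codes via Cohen's kappa. The toy utterances and labels are invented.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import cohen_kappa_score
        from sklearn.pipeline import make_pipeline

        utterances = ["how do you feel about that",
                      "you said you want to cut down",
                      "tell me more about your week",
                      "it sounds like this matters to you"]
        human_codes = ["open_question", "reflection", "open_question", "reflection"]

        model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
        model.fit(utterances, human_codes)
        predicted = model.predict(utterances)
        # In practice, evaluate on held-out sessions rather than the training data.
        print(cohen_kappa_score(human_codes, predicted))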

  11. Simplified Monitoring System

    CERN Document Server

    Jelinskas, Adomas

    2013-01-01

    This project can be considered as a model for simplified grid monitoring. In particular, I was creating a specific monitoring instance which can be easily set up on a machine and, depending on input information, automatically start monitoring services using the Nagios software application. I had to automate the set-up process and configuration of the monitoring system in order for the user to use it easily. I developed a script which automatically sets up the monitoring system, configures it and starts monitoring. I put the script, files and instructions in the repository 'https://git.cern.ch/web/?p=cosmic.git;a=summary' under the sub-directory called SNCG.
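
    The repository's script is not reproduced here; the sketch below only illustrates the general pattern of such automation: generate a minimal Nagios host/service definition from input data and reload Nagios. The template, paths, and reload command are common defaults, assumed rather than taken from the project.

        import subprocess
        import textwrap
        from pathlib import Path

        TEMPLATE = textwrap.dedent("""\
            define host {{
                use        linux-server
                host_name  {host}
                address    {address}
            }}

            define service {{
                use                  generic-service
                host_name            {host}
                service_description  PING
                check_command        check_ping!100.0,20%!500.0,60%
            }}
        """)

        def add_monitored_host(host, address, conf_dir="/usr/local/nagios/etc/objects"):
            cfg = Path(conf_dir) / f"{host}.cfg"
            cfg.write_text(TEMPLATE.format(host=host, address=address))
            subprocess.run(["systemctl", "reload", "nagios"], check=True)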

  12. Process Model of Quality Cost Monitoring for Small and Medium Wood-Processing Enterprises

    Directory of Open Access Journals (Sweden)

    Denis Jelačić

    2016-01-01

    Quality is not only a technical category, and a quality management system is not focused only on product quality. Quality and costs are closely interlinked. The paper deals with quality cost monitoring in small and medium wood-processing enterprises (SMEs) in Slovakia, and also presents the results of a questionnaire survey. The empirical study aims to determine the level of understanding and the level of implementation of quality cost monitoring in wood-processing SMEs in Slovakia. The research is based on the PAF model. A suitable model for quality cost monitoring is also proposed in the paper, based on the research results, with guidelines for using the methods of Activity-Based Costing. The empirical study is focused on SMEs, which make up 99.8% of all companies in the branch, and where quality cost monitoring often works as a latent management subsystem. SME managers use indicators for monitoring process performance and production quality, but they usually do not develop a separate framework for measuring and evaluating quality costs.

  13. Thermographic process monitoring in powderbed based additive manufacturing

    International Nuclear Information System (INIS)

    Selective Laser Melting is utilized to build metallic parts directly from CAD data by solidification of thin powder layers through application of a fast scanning laser beam. In this study, layerwise monitoring of the temperature distribution is used to gather information about the process stability and the resulting part quality. The heat distribution varies with different kinds of parameters, including scan vector length, laser power, layer thickness and inter-part distance in the job layout, which in turn influence the resulting part quality. By integration of an off-axis mounted uncooled thermal detector, the solidification as well as the layer deposition are monitored and evaluated. Errors in the generation of new powder layers usually result in a locally varying layer thickness that may cause poor part quality. For effect quantification, the locally applied layer thickness is determined by evaluating the heat-up of the newly deposited powder. During the solidification process, space- and time-resolved data are used to characterize the zone of elevated temperatures and to derive locally varying heat dissipation properties. Potential quality indicators are evaluated and correlated to the resulting part quality: thermal diffusivity is derived from a simplified heat dissipation model and evaluated for every pixel and cool-down phase of a layer. This allows the quantification of expected material homogeneity properties. Maximum temperature and time above certain temperatures are measured in order to detect hot spots or delamination issues that may cause a process breakdown. Furthermore, a method for quantification of sputter activity is presented, since high sputter activity indicates unstable melt dynamics and can be used to identify parameter drifts, improper atmospheric conditions or material binding errors. The resulting surface structure after solidification complicates temperature determination on the one hand, but enables the detection of potential surface defects on the other.
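
    An illustrative reduction of a layer's thermal image stack to per-pixel quality indicators in the spirit described above: maximum temperature and time above a threshold (hot spots, delamination), plus a crude exponential cool-down rate standing in for the paper's simplified heat dissipation model. Shapes and the threshold are assumptions.

        import numpy as np

        def layer_indicators(stack, dt, t_threshold):
            """stack: (frames, rows, cols) temperatures during one layer's cool-down."""
            t_max = stack.max(axis=0)
            time_above = (stack > t_threshold).sum(axis=0) * dt
            peak_idx = stack.argmax(axis=0)
            elapsed = (stack.shape[0] - 1 - peak_idx) * dt   # time from peak to last frame
            t_end = stack[-1]
            # crude per-pixel cool-down rate: k from T_end = T_max * exp(-k * elapsed)
            cool_rate = (np.log(np.maximum(t_max, 1e-6) / np.maximum(t_end, 1e-6))
                         / np.maximum(elapsed, dt))
            return t_max, time_above, cool_rate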

  14. An Information System to Support and Monitor Clinical Trial Process

    Directory of Open Access Journals (Sweden)

    Daniela Luzi

    2013-01-01

    The demand for transparency of clinical research results, the need to accelerate the process of transferring innovation into daily medical practice, as well as assuring patient safety and product efficacy, make it necessary to extend the functionality of traditional trial registries. These new systems should combine different functionalities to track the information exchange, support collaborative work, manage regulatory documents and monitor the entire clinical investigation (CIV) lifecycle. This is the approach used to develop MEDIS, a Medical Device Information System, described in this paper from the perspective of the business process and the underlying architecture. Moreover, MEDIS was designed on the basis of Health Level 7 (HL7) v.3 standards and methodology to make it interoperable with similar registries, but also to facilitate information exchange between different health information systems.

  15. Virtual instrument for monitoring process of brush plating

    Institute of Scientific and Technical Information of China (English)

    JING Xue-dong; XU Bin-shi; WANG Cheng-tao; ZHU Sheng; DONG Shi-yun

    2004-01-01

    A virtual instrument (VI) was developed to monitor the technological parameters in the process of brush plating, including coating thickness, brush-plating current, current density, deposition rate, and brush-plating voltage. Meanwhile, two approaches were presented to improve the measurement accuracy of coating thickness. One eliminates random interference by moving-average filtering, while the other accurately calculates the quantity of electricity consumed by rectangular integration. With these two approaches, the coating thickness can be measured in real time with higher accuracy than with the voltage-frequency conversion method. During plating, all the technological parameters are displayed visually on the front panel of the VI. Once the brush-plating current or current density overruns its limit value, or when the coating thickness reaches the target value, the virtual instrument raises an alarm. With this VI, solution consumption can be decreased and operating efficiency is improved.
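
    A sketch of the two accuracy measures just described: a moving-average filter suppresses random interference on the sampled brush-plating current, and rectangular integration of the smoothed current yields the consumed charge, from which thickness follows via a Faraday-type proportionality. The constant k (thickness per unit charge density) is an illustrative placeholder, not a value from the paper.

        import numpy as np

        def coating_thickness(current_a, dt_s, area_cm2, k_um_per_c_per_cm2=0.05, window=5):
            kernel = np.ones(window) / window
            smoothed = np.convolve(current_a, kernel, mode="same")  # moving-average filtering
            charge_c = smoothed.sum() * dt_s                        # rectangular integration
            return k_um_per_c_per_cm2 * charge_c / area_cm2         # thickness in micrometres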

  16. Process Diagnostics and Monitoring Using the Multipole Resonance Probe (MRP)

    Science.gov (United States)

    Harhausen, J.; Awakowicz, P.; Brinkmann, R. P.; Foest, R.; Lapke, M.; Musch, T.; Mussenbrock, T.; Oberrath, J.; Ohl, A.; Rolfes, I.; Schulz, Ch.; Storch, R.; Styrnoll, T.

    2011-10-01

    In this contribution we present the application of the MRP in an industrial plasma ion assisted deposition (PIAD) chamber (Leybold optics SYRUS-pro). The MRP is a novel plasma diagnostic which is suitable for an industrial environment - which means that the proposed method is robust, calibration free, and economical, and can be used for ideal and reactive plasmas alike. In order to employ the MRP as process diagnostics we mounted the probe on a manipulator to obtain spatially resolved information on the electron density and temperature. As monitoring tool the MRP is installed at a fixed position. Even during the deposition process it provides stable measurement results while other diagnostic methods, e.g. the Langmuir probe, may suffer from dielectric coatings. Funded by the German Ministry for Education and Research (BMBF, Fkz. 13N10462).

  17. Optical sensors for process control and emissions monitoring in industry

    Energy Technology Data Exchange (ETDEWEB)

    S. W. Alendorf; D. K. Ottensen; D. W. Hahn; T. J. Kulp; U. B. Goers

    1999-01-01

    Sandia National Laboratories has a number of ongoing projects developing optical sensors for industrial environments. Laser-based sensors can be attractive for relatively harsh environments where extractive sampling is difficult, inaccurate, or impractical. Tools developed primarily for laboratory research can often be adapted for the real world and applied to problems far from their original uses. Spectroscopic techniques, appropriately selected, have the potential to impact the bottom line of a number of industries and industrial processes. In this paper the authors discuss three such applications: a laser-based instrument for process control in steelmaking, a laser-induced breakdown method for hazardous metal detection in process streams, and a laser-based imaging sensor for evaluating surface cleanliness. Each has the potential to provide critical, process-related information in a real-time, continuous manner. These sensor techniques encompass process control applications and emissions monitoring for pollution prevention. They also span the range from a field-tested pre-commercial prototype to laboratory instrumentation. Finally, these sensors employ a wide range of sophistication in both the laser source and associated analytical spectroscopy. In the ultimate applications, however, many attributes of the sensors are in common, such as the need for robust operation and hardening for harsh industrial environments.

  18. The Development of Automated Detection Techniques for Passive Acoustic Monitoring as a Tool for Studying Beaked Whale Distribution and Habitat Preferences in the California Current Ecosystem

    Science.gov (United States)

    Yack, Tina M.

    The objectives of this research were to test available automated detection methods for passive acoustic monitoring and integrate the best available method into standard marine mammal monitoring protocols for ship-based surveys. The goal of the first chapter was to evaluate the performance and utility of PAMGUARD 1.0 Core software for use in automated detection of marine mammal acoustic signals during towed array surveys. Three different detector configurations of PAMGUARD were compared. These automated detection algorithms were evaluated by comparing them to the results of manual detections made by an experienced bio-acoustician (author TMY). This study provides the first detailed comparisons of PAMGUARD automated detection algorithms to manual detection methods. The results of these comparisons clearly illustrate the utility of automated detection methods for odontocete species. Results of this work showed that the majority of whistles and click events can be reliably detected using PAMGUARD software. The second chapter moves beyond automated detection to examine and test automated classification algorithms for beaked whale species. Beaked whales are notoriously elusive and difficult to study, especially using visual survey methods. The purpose of the second chapter was to test, validate, and compare algorithms for detection of beaked whales in acoustic line-transect survey data. Using data collected at sea from the PAMGUARD classifier developed in Chapter 2, it was possible to measure the clicks from visually verified Baird's beaked whale encounters and use this data to develop classifiers that could discriminate Baird's beaked whales from other beaked whale species in future work. Echolocation clicks from Baird's beaked whales, Berardius bairdii, were recorded during combined visual and acoustic shipboard surveys of cetacean populations in the California Current Ecosystem (CCE) and with autonomous, long-term recorders at four different sites in the Southern

  19. Distributed multisensor processing, decision making, and control under constrained resources for remote health and environmental monitoring

    Science.gov (United States)

    Talukder, Ashit; Sheikh, Tanwir; Chandramouli, Lavanya

    2004-04-01

    Previous field-deployable distributed sensing systems for health/biomedical applications and environmental sensing have been designed for data collection and data transmission at pre-set intervals, rather than for on-board processing. These previous sensing systems lack autonomous capabilities and have limited lifespans. We propose the use of an integrated machine learning architecture, with automated planning-scheduling and resource management capabilities, that can be used for a variety of autonomous sensing applications with very limited computing, power, and bandwidth resources. We lay out general solutions for efficient processing in a multi-tiered (three-tier) machine learning framework that is suited for remote, mobile sensing systems. Novel dimensionality reduction techniques that are designed for classification are used to compress each individual sensor's data and pass only relevant information to the mobile multisensor fusion module (second tier). Statistical classifiers that are capable of handling missing/partial sensory data due to sensor failure or power loss are used to detect critical events and pass the information to the third tier (central server) for manual analysis and/or analysis by advanced pattern recognition techniques. Genetic optimisation algorithms are used to control the system in the presence of dynamic events, and also ensure that system requirements (i.e. minimum life of the system) are met. This tight integration of control optimisation and machine learning algorithms results in a highly efficient sensor network with intelligent decision making capabilities. The applicability of our technology in remote health monitoring and environmental monitoring is shown. Other uses of our solution are also discussed.
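
    A hedged sketch of the three-tier idea: each sensor compresses its own stream (tier 1), a fusion classifier that tolerates missing sensor data detects critical events (tier 2), and positives are escalated to a central server (tier 3). The components below are generic stand-ins, not the paper's custom algorithms.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.impute import SimpleImputer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(1)
        raw = {s: rng.normal(size=(200, 32)) for s in ("ecg", "accel", "temp")}  # toy streams
        labels = rng.integers(0, 2, size=200)                                    # toy event labels

        # Tier 1: per-sensor dimensionality reduction before transmission.
        compressors = {s: PCA(n_components=4).fit(x) for s, x in raw.items()}
        features = np.hstack([compressors[s].transform(raw[s]) for s in raw])
        features[rng.random(features.shape) < 0.1] = np.nan   # simulate sensor dropouts

        # Tier 2: fusion classifier made robust to missing values via imputation.
        detector = make_pipeline(SimpleImputer(), LogisticRegression()).fit(features, labels)

        # Tier 3: only detected critical events would be sent to the central server.
        critical = detector.predict(features)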

  20. Update on scribe–cleave–passivate (SCP) slim edge technology for silicon sensors: Automated processing and radiation resistance

    Energy Technology Data Exchange (ETDEWEB)

    Fadeyev, V., E-mail: fadeyev@ucsc.edu [Santa Cruz Institute for Particle Physics, University of California, Santa Cruz, CA 95064 (United States); Ely, S.; Galloway, Z.; Ngo, J.; Parker, C.; Sadrozinski, H.F.-W. [Santa Cruz Institute for Particle Physics, University of California, Santa Cruz, CA 95064 (United States); Christophersen, M.; Phlips, B.F. [U.S. Naval Research Laboratory, Code 7654, 4555 Overlook Avenue, Southwest Washington, DC 20375 (United States); Pellegrini, G.; Rafi, J.M.; Quirion, D. [Instituto de Microelectrónica de Barcelona, IMB-CNM-CSIC, Bellaterra, Barcelona (Spain); Dalla Betta, G.-F. [INFN and University of Trento, Via Sommarive, 14, 38123 Povo di Trento (Italy); Boscardin, M. [Fondazione Bruno Kessler, Via Sommarive, 18, 38123 Povo di Trento (Italy); Casse, G. [Department of Physics, University of Liverpool, O. Lodge Laboratory, Oxford Street, Liverpool L69 7ZE (United Kingdom); Gorelov, I.; Hoeferkamp, M.; Metcalfe, J.; Seidel, S. [Department of Physics and Astronomy, University of New Mexico, MSC 07 4220, 1919 Lomas Boulevard NE, Albuquerque, NM 87131 (United States); Gaubas, E.; Ceponis, T. [Institute of Applied Research, Vilnius University, Sauletekio 9, LT-10222 Vilnius (Lithuania); and others

    2014-11-21

    We pursue scribe–cleave–passivate (SCP) technology for making “slim edge” sensors. The goal is to reduce the inactive region at the periphery of the devices while maintaining their performance. In this paper we report on two aspects of the current efforts. The first one involves fabrication options for mass production. We describe the automated cleaving tests and a simplified version of SCP post-processing of n-type devices. Another aspect is the radiation resistance of the passivation. We report on the radiation tests of n- and p-type devices with protons and neutrons.

  1. Process monitoring of additive manufacturing by using optical tomography

    International Nuclear Information System (INIS)

    Parts fabricated by means of additive manufacturing are usually of complex shape, and owing to the fabrication procedure by selective laser melting (SLM), potential defects and inaccuracies are often very small in lateral size. Therefore, an adequate quality inspection of such parts is rather challenging and non-destructive techniques (NDT) are difficult to realize, but considerable efforts are necessary in order to ensure the quality of SLM parts, especially those used for aerospace components. Thus, MTU Aero Engines is currently focusing on the development of an Online Process Control system which monitors and documents the complete welding process during the SLM fabrication procedure. A high-resolution camera system is used to obtain images, from which tomographic data for a 3D analysis of SLM parts are processed. From the analysis, structural irregularities and structural disorder resulting from any possible erroneous melting process become visible and may be located anywhere within the 3D structure. Results of our optical tomography (OT) method as obtained on real defects are presented.

  2. Geoscientific process monitoring with positron emission tomography (GeoPET)

    Science.gov (United States)

    Kulenkampff, Johannes; Gründig, Marion; Zakhnini, Abdelhamid; Lippmann-Pipke, Johanna

    2016-08-01

    Transport processes in geomaterials can be observed with input-output experiments, which yield no direct information on the impact of heterogeneities, or they can be assessed by model simulations based on structural imaging using µ-CT. Positron emission tomography (PET) provides an alternative experimental observation method which directly and quantitatively yields the spatio-temporal distribution of tracer concentration. Process observation with PET benefits from its extremely high sensitivity together with a resolution that is acceptable in relation to standard drill core sizes. We strongly recommend applying high-resolution PET scanners in order to achieve a resolution on the order of 1 mm. We discuss the particularities of PET applications in geoscientific experiments (GeoPET), which essentially are due to high material density. Although PET is rather insensitive to matrix effects, mass attenuation and Compton scattering have to be corrected thoroughly in order to derive quantitative values. Examples of process monitoring of advection and diffusion processes with GeoPET illustrate the procedure and the experimental conditions, as well as the benefits and limits of the method.

  3. Process monitoring of additive manufacturing by using optical tomography

    Science.gov (United States)

    Zenzinger, Guenter; Bamberg, Joachim; Ladewig, Alexander; Hess, Thomas; Henkel, Benjamin; Satzger, Wilhelm

    2015-03-01

    Parts fabricated by means of additive manufacturing are usually of complex shape, and owing to the fabrication procedure by selective laser melting (SLM), potential defects and inaccuracies are often very small in lateral size. Therefore, an adequate quality inspection of such parts is rather challenging and non-destructive techniques (NDT) are difficult to realize, but considerable efforts are necessary in order to ensure the quality of SLM parts, especially those used for aerospace components. Thus, MTU Aero Engines is currently focusing on the development of an Online Process Control system which monitors and documents the complete welding process during the SLM fabrication procedure. A high-resolution camera system is used to obtain images, from which tomographic data for a 3D analysis of SLM parts are processed. From the analysis, structural irregularities and structural disorder resulting from any possible erroneous melting process become visible and may be located anywhere within the 3D structure. Results of our optical tomography (OT) method as obtained on real defects are presented.

  4. Process monitoring of additive manufacturing by using optical tomography

    Energy Technology Data Exchange (ETDEWEB)

    Zenzinger, Guenter, E-mail: guenter.zenzinger@mtu.de; Bamberg, Joachim; Ladewig, Alexander, E-mail: alexander.ladewig@mtu.de; Hess, Thomas; Henkel, Benjamin; Satzger, Wilhelm [MTU Aero Engines AG, Dachauerstrasse 665, 80995 Munich (Germany)

    2015-03-31

    Parts fabricated by means of additive manufacturing are usually of complex shape, and owing to the fabrication procedure by selective laser melting (SLM), potential defects and inaccuracies are often very small in lateral size. Therefore, an adequate quality inspection of such parts is rather challenging and non-destructive techniques (NDT) are difficult to realize, but considerable efforts are necessary in order to ensure the quality of SLM parts, especially those used for aerospace components. Thus, MTU Aero Engines is currently focusing on the development of an Online Process Control system which monitors and documents the complete welding process during the SLM fabrication procedure. A high-resolution camera system is used to obtain images, from which tomographic data for a 3D analysis of SLM parts are processed. From the analysis, structural irregularities and structural disorder resulting from any possible erroneous melting process become visible and may be located anywhere within the 3D structure. Results of our optical tomography (OT) method as obtained on real defects are presented.

  5. Spectral induced polarization for monitoring electrokinetic remediation processes

    Science.gov (United States)

    Masi, Matteo; Losito, Gabriella

    2015-12-01

    Electrokinetic remediation is an emerging technology for extracting heavy metals from contaminated soils and sediments. This method uses a direct or alternating electric field to induce the transport of contaminants toward the electrodes. The electric field also produces pH variations, sorption/desorption and precipitation/dissolution of species in the porous medium during remediation. Since heavy metal mobility is pH-dependent, accurate control of pH inside the material is required in order to enhance the removal efficiency. The common approach for monitoring the remediation process, both in the laboratory and in the field, is the chemical analysis of samples collected from discrete locations. The purpose of this study is the evaluation of Spectral Induced Polarization as an alternative method for monitoring geochemical changes in the contaminated mass during remediation. The advantage of this technique applied at field scale is to offer higher-resolution mapping of the remediation site and lower cost compared to the conventional sampling procedure. We carried out laboratory-scale electrokinetic remediation experiments on fine-grained marine sediments contaminated by heavy metals and made Spectral Induced Polarization measurements before and after each treatment. Measurements were done in the frequency range 10⁻³ to 10³ Hz. By deconvolution of the spectra using the Debye Decomposition method, we obtained the mean relaxation time and total chargeability. The main finding of this work is that a linear relationship exists, with good agreement, between the local total chargeability and pH. The observed behaviour of chargeability is interpreted as a direct consequence of the alteration of the zeta potential of the sediment particles due to pH changes. Such a relationship has significant value for the interpretation of induced polarization data, allowing the use of this technique for monitoring electrokinetic remediation at field scale.
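
    A generic formulation of the Debye decomposition used above (not the authors' code): the normalised complex spectrum is fitted as a superposition of Debye relaxations on a fixed log-spaced grid of relaxation times via non-negative least squares; total chargeability is the sum of the fitted weights, and the mean relaxation time is their log-weighted mean. The grid limits are illustrative.

        import numpy as np
        from scipy.optimize import nnls

        def debye_decomposition(freqs_hz, spectrum, n_tau=30):
            """spectrum: complex normalised response rho(w)/rho0 - 1 at freqs_hz."""
            omega = 2 * np.pi * freqs_hz
            taus = np.logspace(-4, 4, n_tau)
            iwt = 1j * omega[:, None] * taus           # shape (n_freq, n_tau)
            kernel = -iwt / (1 + iwt)                  # Debye response per grid point
            A = np.vstack([kernel.real, kernel.imag])  # stack real/imag for real-valued NNLS
            b = np.concatenate([spectrum.real, spectrum.imag])
            m, _ = nnls(A, b)                          # non-negative partial chargeabilities
            total_chargeability = m.sum()
            mean_tau = np.exp((m * np.log(taus)).sum() / max(m.sum(), 1e-12))
            return total_chargeability, mean_tau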

  6. Automated Characterization of Spent Fuel through the Multi-Isotope Process (MIP) Monitor

    Energy Technology Data Exchange (ETDEWEB)

    Coble, Jamie B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Orton, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Schwantes, Jon M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2012-07-31

    This research developed an algorithm for characterizing spent nuclear fuel (SNF) samples based on simulated gamma spectra. The gamma spectra for a variety of light water reactor fuels typical of those found in the United States were simulated. Fuel nuclide concentrations were simulated in ORIGEN-ARP for 1296 fuel samples with a variety of reactor designs, initial enrichments, burnups, and cooling times. The results of the ORIGEN-ARP simulation were then input to SYNTH to simulate the gamma spectrum for each sample. These spectra were evaluated with partial least squares (PLS)-based multivariate analysis methods to characterize the fuel according to reactor type (pressurized or boiling water reactor), enrichment, burnup, and cooling time. Characterizing some of the features in series, by using previously estimated features in the prediction, greatly improves the performance. By first classifying the spent fuel reactor type and then using type-specific models, the prediction error for enrichment, burnup, and cooling time improved by a factor of two to four. For some features, the prediction was further improved by including additional information, such as including the predicted burnup in the estimation of cooling time. The optimal prediction flow was determined based on the simulated data. A PLS discriminant analysis model was developed which perfectly classified SNF reactor type. Burnup was predicted within 0.1% root mean squared percent error (RMSPE) and both cooling time and initial enrichment within approximately 2% RMSPE.
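
    A hedged sketch of the cascaded prediction flow described above, with synthetic arrays standing in for the simulated spectra: classify reactor type first (PLS-DA by thresholding a PLS score), fit type-specific PLS regressions for burnup, then reuse the predicted burnup as an extra input to the cooling-time model.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(2)
        spectra = rng.normal(size=(300, 1024))   # stand-ins for SYNTH gamma spectra
        reactor = rng.integers(0, 2, size=300)   # 0 = PWR, 1 = BWR
        burnup = rng.uniform(10, 60, size=300)
        cooling = rng.uniform(1, 20, size=300)

        # Step 1: PLS discriminant analysis for reactor type (threshold the PLS score).
        clf = PLSRegression(n_components=5).fit(spectra, reactor)
        is_bwr = clf.predict(spectra).ravel() > 0.5

        # Step 2: type-specific PLS regression models for burnup.
        models = {t: PLSRegression(n_components=8).fit(spectra[is_bwr == t], burnup[is_bwr == t])
                  for t in (False, True)}
        bu_hat = np.where(is_bwr,
                          models[True].predict(spectra).ravel(),
                          models[False].predict(spectra).ravel())

        # Step 3: cooling time estimated from the spectra plus the predicted burnup.
        ct_model = PLSRegression(n_components=8).fit(np.column_stack([spectra, bu_hat]), cooling)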

  7. Non-destructive monitoring of curing process in precast concrete

    International Nuclear Information System (INIS)

    Currently, the use of precast concrete elements has gained importance because it offers many advantages over site-cast concrete. A disadvantage of site-cast concrete is that its properties vary according to the manufacturing method, the environment and even the operator who carried out the mixing, pouring and implementation of the concrete. Precast concrete elements are manufactured in a controlled environment (typically referred to as a precast plant) and this reduces the shrinkage and creep. One of the key properties of precast concrete is the capability to gain compressive strength rapidly under the appropriate conditions. The compressive strength determines if the precast can be stripped from the form or manipulated. This parameter is measured using destructive testing over cylindrical or cubic samples. The quality control of precast is derived from the fracture suffered by these elements, resulting in a 'pass or fail' evaluation. In most cases, the solution to this problem is to allow the material to cure for a few hours until it acquires sufficient strength to handle the precast element. The focus of this paper is the description of the research project 'CUREND'. This project aims to design a non-destructive methodology to monitor the curing process in precast concrete. The monitoring will be performed using wireless sensor networks.

  8. Non-destructive monitoring of curing process in precast concrete

    Science.gov (United States)

    Aparicio, S.; Ranz, J.; Fernández, R.; Albert, V.; Fuente, J. V.; Hernández, M. G.

    2012-12-01

    Currently, the use of precast concrete elements has gained importance because it offers many advantages over site-cast concrete. A disadvantage of site-cast concrete is that its properties vary according to the manufacturing method, the environment and even the operator who carried out the mixing, pouring and implementation of the concrete. Precast concrete elements are manufactured in a controlled environment (typically referred to as a precast plant) and this reduces the shrinkage and creep. One of the key properties of precast concrete is the capability to gain compressive strength rapidly under the appropriate conditions. The compressive strength determines if the precast can be stripped from the form or manipulated. This parameter is measured using destructive testing over cylindrical or cubic samples. The quality control of precast is derived from the fracture suffered by these elements, resulting in a "pass or fail" evaluation. In most cases, the solution to this problem is to allow the material to cure for a few hours until it acquires sufficient strength to handle the precast element. The focus of this paper is the description of the research project "CUREND". This project aims to design a non-destructive methodology to monitor the curing process in precast concrete. The monitoring will be performed using wireless sensor networks.

  9. Development and implementation of an automatic integration system for fibre optic sensors in the braiding process with the objective of online-monitoring of composite structures

    Science.gov (United States)

    Hufenbach, W.; Gude, M.; Czulak, A.; Kretschmann, Martin

    2014-04-01

    Increasing economic, political and ecological pressure is leading to a steadily rising share of modern processing and manufacturing processes for fibre-reinforced polymers in industrial batch production. Component weights below a level achievable by classic construction materials, which lead to a reduced energy and cost balance during product lifetime, justify the higher fabrication costs. However, complex quality control and failure prediction slow down the substitution by composite materials. High-resolution fibre-optic sensors (FOS), due to their small diameter, high measuring point density and simple handling, show high potential for automated sensor integration in manufacturing processes, and therefore for online monitoring of composite products manufactured at industrial scale. Integrated sensors can be used to monitor manufacturing processes and part tests as well as the component structure during the product life cycle, which simplifies quality control during production and allows the optimization of individual manufacturing processes [1;2]. Furthermore, detailed failure analyses lead to an enhanced understanding of the failure processes appearing in composite materials. This leads to less scrap and to products of higher value and longer product life cycle, whereby costs, material and energy are saved. This work shows an automation approach for FOS integration in the braiding process. For that purpose, a braiding wheel has been supplemented with an appliance for automatic sensor application, which has been used to manufacture preforms of high-pressure composite vessels with FOS networks integrated between the fibre layers. All subsequent manufacturing processes (vacuum infiltration, curing) and component tests (quasi-static pressure test, programmed delamination) were monitored with the help of the integrated sensor networks. Keywords: SHM, high-pressure composite vessel, braiding, automated sensor integration, pressure test, quality control, optic

  10. Kohonen Self-Organizing Maps in Validity Maintenance for Automated Scoring of Constructed Response.

    Science.gov (United States)

    Williamson, David M.; Bejar, Isaac I.

    As the automated scoring of constructed responses reaches operational status, monitoring the scoring process becomes a primary concern, particularly if automated scoring is intended to operate completely unassisted by humans. Using actual candidate selections from the Architectural Registration Examination (n=326), this study uses Kohonen…

  11. Monitoring Biological Modes in a Bioreactor Process by Computer Simulation

    Directory of Open Access Journals (Sweden)

    Samia Semcheddine

    2015-12-01

    This paper deals with the general framework of fermentation system modeling and monitoring, focusing on the fermentation of Escherichia coli. Our main objective is to develop an algorithm for the online detection of acetate production during the culture of recombinant proteins. The analysis of the fermentation process shows that it behaves like a hybrid dynamic system with commutation, since it can be represented by 5 nonlinear models. We present a strategy of fault detection based on residual generation for detecting the different actual biological modes. The residual generation is based on nonlinear analytical redundancy relations. The simulation results show that the several modes that occur during the bacterial cultivation can be detected from residuals using a nonlinear dynamic model and reduced instrumentation.
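
    A generic sketch of residual-based mode detection (the models below are arbitrary stand-ins, not the paper's five nonlinear E. coli models): each candidate model predicts the next measurement, the smallest residual indicates the active biological mode, and a residual that no model can explain raises a fault flag.

        models = {
            "growth_on_glucose":   lambda x: x + 0.05 * x * (1 - x / 10.0),
            "acetate_production":  lambda x: x + 0.02 * x,
            "acetate_consumption": lambda x: x - 0.03 * x,
        }

        def detect_mode(x_prev, x_meas, threshold=0.5):
            residuals = {name: abs(x_meas - f(x_prev)) for name, f in models.items()}
            best = min(residuals, key=residuals.get)
            fault = residuals[best] > threshold   # no candidate mode explains the data
            return best, residuals[best], fault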

  12. Fiber-coupled THz spectroscopy for monitoring polymeric compounding processes

    Science.gov (United States)

    Vieweg, N.; Krumbholz, N.; Hasek, T.; Wilk, R.; Bartels, V.; Keseberg, C.; Pethukhov, V.; Mikulics, M.; Wetenkamp, L.; Koch, M.

    2007-06-01

    We present a compact, robust, and transportable fiber-coupled THz system for inline monitoring of polymeric compounding processes in an industrial environment. The system is built on a 90 cm × 90 cm shock-absorbing optical bench. A sealed metal box protects the system against dust and mechanical disturbances. A closed-loop controller unit is used to ensure optimum coupling of the laser beam into the fiber. In order to build efficient and stable fiber-coupled antennas, we glue the fibers directly onto photoconductive switches. Thus, the antenna performance is very stable, and the antennas are protected from dust and from misalignment by vibrations. We discuss fabrication details and antenna performance. First spectroscopic data obtained with this system are presented.

  13. Opportunities for Process Monitoring Techniques at Delayed Access Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Curtis, Michael M.; Gitau, Ernest TN; Johnson, Shirley J.; Schanfein, Mark; Toomey, Christopher

    2013-09-20

    Except for specific cases where the International Atomic Energy Agency (IAEA) maintains a continuous presence at a facility (such as the Japanese Rokkasho Reprocessing Plant), there is always a period of time or delay between the moment a State is notified or aware of an upcoming inspection, and the time the inspector actually enters the material balance area or facility. Termed by the authors as “delayed access,” this period of time between inspection notice and inspector entrance to a facility poses a concern. Delayed access also has the potential to reduce the effectiveness of measures applied as part of the Safeguards Approach for a facility (such as short-notice inspections). This report investigates the feasibility of using process monitoring to address safeguards challenges posed by delayed access at a subset of facility types.

  14. New vision technology for multidimensional quality monitoring of food processes

    DEFF Research Database (Denmark)

    Dissing, Bjørn Skovlund

    the high throughput needs for quality control, process control and monitoring. In this Ph.D. project the possibilities provided by spectroscopic imaging and chemometrics have been utilized to improve the analysis and understanding of different food products. The work is presented in seven papers and two......Spectroscopy and spectral imaging in combination with multivariate data analysis and machine learning techniques have proven to be an outstanding tool for rapid analysis of different products. This may be utilized in various industries, but especially rapid assessment of food products in food...... research and industry is of importance in this thesis. The non-invasive spectroscopic imaging techniques are able to measure individual food components simultaneously in situ in the food matrix while pattern recognition techniques effectively are able to extract the quantitative information from the vast...

  15. Compact Raman instrumentation for process and environmental monitoring

    Science.gov (United States)

    Carrabba, Michael M.; Spencer, Kevin M.; Rauh, R. D.

    1991-04-01

    Raman spectroscopy is a powerful noninvasive tool for elucidating chemical structure. Like infrared spectroscopy, it has many potential practical applications, such as process monitoring, environmental sensing, clinical analysis, forensic identification, and as a detector for use with analytical instruments. Until recently, however, Raman has been considered mainly in the context of basic research. The present generation of high performance Raman instruments tend to be large, complex and expensive, and thus have been of primary interest only to specialists in the field. This paper will discuss the development of a compact Raman spectrometer system consisting of a diode laser, fiber optics of excitation and collection, and a compact spectrograph with charge coupled device (CCD) detection.

  16. A computer based, automated analysis of process and outcomes of diabetic care in 23 GP practices.

    LENUS (Irish Health Repository)

    Hill, F

    2012-02-01

    The predicted prevalence of diabetes in Ireland by 2015 is 190,000. Structured diabetes care in general practice has outcomes equivalent to secondary care, and good diabetes care has been shown to be associated with the use of electronic healthcare records (EHRs). This automated analysis of EHRs in 23 practices took 10 minutes per practice, compared with 15 hours per practice for manual searches. Data was extracted for 1901 type II diabetics. There was valid data for >80% of patients for 6 of the 9 key indicators in the previous year. 543 (34%) had a HbA1c > 7.5%, 142 (9%) had a total cholesterol > 6 mmol/l, 83 (6%) had an LDL cholesterol > 4 mmol/l, 367 (22%) had triglycerides > 2.2 mmol/l and 162 (10%) had blood pressure > 160/100 mmHg. Data quality and key indicators of care compare well with manual audits in Ireland and the U.K. Electronic healthcare records and automated audits should be a feature of all chronic disease management programs.
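
    A hedged sketch of the kind of automated indicator audit described above, using the thresholds quoted in the abstract; the column names are illustrative rather than the practices' actual EHR schema, and the blood pressure rule (either limit breached) is an assumption.

        import pandas as pd

        THRESHOLDS = {"hba1c": 7.5, "total_chol": 6.0, "ldl": 4.0, "trig": 2.2}

        def audit(records: pd.DataFrame) -> pd.Series:
            """records: one row per patient, with the illustrative columns below."""
            flags = {col: int((records[col] > limit).sum())
                     for col, limit in THRESHOLDS.items()}
            flags["bp"] = int(((records["sbp"] > 160) | (records["dbp"] > 100)).sum())
            return pd.Series(flags)   # count of patients breaching each indicator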

  17. APPLICATION OF ABSORPTION SPECTROSCOPY TO ACTINIDE PROCESS ANALYSIS AND MONITORING

    Energy Technology Data Exchange (ETDEWEB)

    Lascola, R.; Sharma, V.

    2010-06-03

    The characteristic strong colors of aqueous actinide solutions form the basis of analytical techniques for actinides based on absorption spectroscopy. Colorimetric measurements of samples from processing activities have been used for at least half a century. This seemingly mature technology has been recently revitalized by developments in chemometric data analysis. Where reliable measurements could formerly only be obtained under well-defined conditions, modern methods are robust with respect to variations in acidity, concentration of complexants and spectral interferents, and temperature. This paper describes two examples of the use of process absorption spectroscopy for Pu analysis at the Savannah River Site, in Aiken, SC. In one example, custom optical filters allow accurate colorimetric measurements of Pu in a stream with rapid nitric acid variation. The second example demonstrates simultaneous measurement of Pu and U by chemometric treatment of absorption spectra. The paper concludes with a description of the use of these analyzers to supplement existing technologies in nuclear materials monitoring in processing, reprocessing, and storage facilities.
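    In equation form (a generic sketch of the underlying calibration, not the analyzers' actual model): single-component colorimetry follows the Beer-Lambert law, and a simultaneous multi-component measurement such as Pu/U generalizes it to a linear mixture solved by least squares,

      A(\lambda) = \varepsilon(\lambda)\, l\, c,
      \qquad
      A(\lambda_j) = l \sum_i \varepsilon_i(\lambda_j)\, c_i
      \;\Longleftrightarrow\;
      \mathbf{A} = l\, \mathbf{E}\, \mathbf{c},
      \qquad
      \hat{\mathbf{c}} = \tfrac{1}{l}\, (\mathbf{E}^{\top}\mathbf{E})^{-1} \mathbf{E}^{\top} \mathbf{A},

    where A is the measured absorbance vector, E the matrix of molar absorptivity spectra, l the path length and c the concentration vector. Chemometric methods such as PLS extend this least-squares step so that it remains robust to correlated interferents, acidity and temperature variation, as described above.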

  18. Simplified Automated Image Analysis for Detection and Phenotyping of Mycobacterium tuberculosis on Porous Supports by Monitoring Growing Microcolonies

    OpenAIRE

    den Hertog, Alice L.; Dennis W Visser; Ingham, Colin J.; Frank H A G Fey; Paul R Klatser; Anthony, Richard M.

    2010-01-01

    BACKGROUND: Even with the advent of nucleic acid (NA) amplification technologies, the culture of mycobacteria for diagnostic and other applications remains of critical importance. Notably, microscopic observation drug susceptibility testing (MODS), as opposed to traditional culture on solid media or automated liquid culture, has shown potential to both speed up and increase the provision of mycobacterial culture in high-burden settings. METHODS: Here we explore the growth of Mycobacterium tubercul...

  19. A Case Study Improvement of a Testing Process by Combining Lean Management, Industrial Engineering and Automation Methods

    Directory of Open Access Journals (Sweden)

    Simon Withers

    2013-07-01

    Increasingly competitive market environments have forced not only large manufacturers but also small and medium-sized enterprises (SMEs) to look for means to improve their operations in order to increase competitive strength. This paper presents the adaptation and adoption, by a UK SME engineering service organisation, of lean management, industrial engineering and automation methods developed within larger organisations. This SME sought to improve the overall performance of one of its core testing processes. An exploratory analysis, based on the lean management concept of “value added” and the work measurement technique of “time study”, was carried out in order to understand the current performance of a testing process for gas turbine fuel flow dividers. A design for the automation of some operations of the testing process was then followed as an approach to reduce non-value-added activities and improve the overall efficiency of the testing process. The overall testing time was reduced from 12.41 to 7.93 hours (36.09 percent), while the man hours and non-value-added time were also reduced, from 23.91 to 12.94 hours (45.87 percent) and from 11.08 to 6.69 hours (39.67 percent) respectively. This resulted in an increase in process efficiency, in terms of man hours, from 51.91 to 61.28 percent. The contribution of this paper resides in presenting a case study that can be used as a guiding reference for managers and engineers undertaking similar improvement projects in their organisations.
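    The quoted reductions can be verified directly (the last-digit differences against the quoted 45.87 and 39.67 presumably reflect rounding in the source figures):

      # Quick check of the percentage reductions quoted above.
      def reduction(before, after):
          return 100.0 * (before - after) / before

      print(f"testing time:    {reduction(12.41, 7.93):.2f}%")   # -> 36.10 (quoted: 36.09)
      print(f"man hours:       {reduction(23.91, 12.94):.2f}%")  # -> 45.88 (quoted: 45.87)
      print(f"non-value-added: {reduction(11.08, 6.69):.2f}%")   # -> 39.62 (quoted: 39.67)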

  20. Automation of disbond detection in aircraft fuselage through thermal image processing

    Science.gov (United States)

    Prabhu, D. R.; Winfree, W. P.

    1992-01-01

    A procedure for interpreting thermal images obtained during the nondestructive evaluation of aircraft bonded joints is presented. The procedure operates on time-derivative thermal images and produces a disbond image with disbonds highlighted. The size of the 'black clusters' in the output disbond image is a quantitative measure of disbond size. The procedure is illustrated using simulation data as well as data obtained through experimental testing of fabricated samples and aircraft panels. Good results are obtained; except in pathological cases, 'false calls' in the cases studied appeared only as noise in the output disbond image, which was easily filtered out. The thermal detection technique, coupled with an automated image interpretation capability, will be a very fast and effective method for inspecting bonded joints in an aircraft structure.
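    A minimal Python sketch of the cluster-sizing idea; the threshold, minimum-size filter and input image are placeholders, not the paper's values:

      # Sketch: size the dark clusters in a binarized disbond image via connected
      # components. Threshold and noise filter values are illustrative.
      import numpy as np
      from scipy import ndimage

      disbond_img = np.random.rand(256, 256)        # stand-in for the output disbond image
      binary = disbond_img < 0.05                   # 'black clusters' = low-intensity pixels
      labels, n = ndimage.label(binary)             # connected-component labelling
      sizes = ndimage.sum(binary, labels, range(1, n + 1))  # pixel count per cluster
      sizes = sizes[sizes > 10]                     # small clusters treated as noise and dropped
      print(f"{len(sizes)} candidate disbonds, areas (px): {sizes}")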

  1. Advanced modelling, monitoring, and process control of bioconversion systems

    Science.gov (United States)

    Schmitt, Elliott C.

    Production of fuels and chemicals from lignocellulosic biomass is an increasingly important area of research and industrialization throughout the world. In order to be competitive with fossil-based fuels and chemicals, maintaining cost-effectiveness is critical. Advanced process control (APC) and optimization methods could significantly reduce operating costs in the biorefining industry. Two reasons APC has previously proven challenging to implement for bioprocesses are the lack of suitable online sensor technology for key system components, and the strongly nonlinear first-principles models required to predict bioconversion behavior. To overcome these challenges, batch fermentations with the acetogen Moorella thermoacetica were monitored with Raman spectroscopy for the conversion of real lignocellulosic hydrolysates, and a kinetic model for the conversion of synthetic sugars was developed. Raman spectroscopy was shown to be effective in monitoring the fermentation of sugarcane bagasse and sugarcane straw hydrolysate, where univariate models predicted acetate concentrations with a root mean square error of prediction (RMSEP) of 1.9 and 1.0 g L-1 for bagasse and straw, respectively. Multivariate partial least squares (PLS) models were employed to predict acetate, xylose, glucose, and total sugar concentrations for both hydrolysate fermentations. The PLS models were more robust than the univariate models, and yielded a percent error of approximately 5% for both sugarcane bagasse and sugarcane straw. In addition, a screening technique was discussed for improving Raman spectra of hydrolysate samples prior to collecting fermentation data. Furthermore, a mechanistic model was developed to predict batch fermentation of synthetic glucose, xylose, and a mixture of the two sugars to acetate. The models accurately described the bioconversion process with an RMSEP of approximately 1 g L-1 for each model and provided insights into how kinetic parameters changed during dual-substrate ...
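    A sketch of the multivariate calibration step with scikit-learn; the spectra below are simulated placeholders, and five latent variables is an arbitrary choice, not the study's:

      # Sketch of a PLS model mapping Raman spectra to acetate concentration,
      # scored by RMSEP on held-out samples. Data are simulated placeholders.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(120, 900))       # 120 spectra x 900 wavenumber channels
      y = rng.uniform(0, 30, size=120)      # acetate concentration, g/L

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
      rmsep = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))
      print(f"RMSEP: {rmsep:.2f} g/L")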

  2. Technical study for the automation and control of processes of the chemical processing plant for liquid radioactive waste at Racso Nuclear Center

    International Nuclear Information System (INIS)

    The purpose of this study is to introduce the development of an automation and control system in a chemical processing plant for liquid radioactive waste of low and medium activity. The control system established for the chemical processing plant at the RACSO Nuclear Center is described. It is an on-off sequential system with feedback; this type of control has been chosen according to the volumes to be treated at the plant, as processing is carried out in batches. The system is governed by a modular programmable logic controller (PLC) with a minimum of 24 digital inputs, 16 digital outputs, one analog input and one analog output. The digital inputs and outputs serve the level sensors of the tanks and the control of the solenoid-type electrovalves; the analog input and output serve the pH control. The overall system has been divided into three control loops, and the loops considered for the operation of the plant are described; the plant has storage, conditioning, processing and clarifying tanks. National Instruments' Lookout software has been used for simulation, constituting an important tool not only in the design phase but also in operation, since this software will be used as the SCADA system. Finally, the advantages and benefits of this automation system are analyzed: radiation doses received by occupationally exposed workers are reduced, and the reliability of the operation of the system is increased. (authors)
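    A minimal sketch, in Python rather than PLC ladder logic, of the kind of on-off sequential batch control described here; tag names, the pH band and sensor polarities are illustrative assumptions, not the plant's actual configuration:

      # Minimal sketch of on-off sequential batch logic (Python stand-in for PLC
      # code). Tag names, pH band and sensor semantics are illustrative; a real
      # sequencer would latch each step rather than re-derive it every scan.
      def batch_scan(level_high, level_low, ph):
          """One scan cycle of a simplified fill -> neutralize -> drain sequence.

          level_high: True when liquid has reached the high-level sensor
          level_low:  True while liquid still covers the low-level sensor
          """
          out = {"inlet_valve": False, "reagent_valve": False, "drain_valve": False}
          if not level_high:
              out["inlet_valve"] = True       # fill until the high-level sensor trips
          elif not (6.0 <= ph <= 9.0):
              out["reagent_valve"] = True     # dose reagent while pH is out of band
          elif level_low:
              out["drain_valve"] = True       # drain the treated batch
          return out

      print(batch_scan(level_high=False, level_low=False, ph=4.2))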

  3. A high throughput MATLAB program for automated force-curve processing using the AdG polymer model.

    Science.gov (United States)

    O'Connor, Samantha; Gaddis, Rebecca; Anderson, Evan; Camesano, Terri A; Burnham, Nancy A

    2015-02-01

    Research in understanding biofilm formation is dependent on accurate and representative measurements of the steric forces related to brushes on bacterial surfaces. A MATLAB program to analyze force curves from an AFM efficiently, accurately, and with minimal user bias has been developed. The analysis is based on a modified version of the Alexander and de Gennes (AdG) polymer model, which is a function of equilibrium polymer brush length, probe radius, temperature, separation distance, and a density variable. Automating the analysis reduces the amount of time required to process 100 force curves from several days to less than 2 min. The use of this program to crop and fit force curves to the AdG model will allow researchers to ensure proper processing of large amounts of experimental data and reduce the time required for analysis and comparison of data, thereby enabling higher-quality results in a shorter period of time. PMID: 25448021
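    A sketch of such a fit in Python with SciPy, using the exponential approximation of the AdG brush force that is common in AFM work, F(D) = 50 kB T R L0 G^(3/2) exp(-2 pi D / L0); whether this matches the paper's exact modified form is an assumption, and the data are synthetic:

      # Fit synthetic steric force curves with the exponential AdG approximation.
      # Probe radius, noise level and starting guesses are all illustrative.
      import numpy as np
      from scipy.optimize import curve_fit

      KB_T = 4.11e-21          # thermal energy at ~298 K, J
      R = 20e-9                # probe radius, m (assumed)

      def adg_force(D, L0, gamma):
          """Steric force (N) vs separation D (m); L0 = brush length, gamma = grafting density (m^-2)."""
          return 50.0 * KB_T * R * L0 * gamma ** 1.5 * np.exp(-2.0 * np.pi * D / L0)

      D = np.linspace(1e-9, 60e-9, 200)
      F = adg_force(D, 25e-9, 1e17) * (1 + 0.05 * np.random.default_rng(1).normal(size=D.size))
      (L0_fit, gamma_fit), _ = curve_fit(adg_force, D, F, p0=(20e-9, 5e16))
      print(f"L0 = {L0_fit * 1e9:.1f} nm, grafting density = {gamma_fit:.2e} m^-2")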

  5. Automated data processing of {1H-decoupled} 13C MR spectra acquired from human brain in vivo

    Science.gov (United States)

    Shic, Frederick; Ross, Brian

    2003-06-01

    In clinical 13C infusion studies, broadband excitation of 200 ppm of the human brain yields 13C MR spectra with a time resolution of 2-5 min and generates up to 2000 metabolite peaks over 2 h. We describe a fast, automated, observer-independent technique for processing {1H-decoupled} 13C spectra. Quantified 13C spectroscopic signals before and after the administration of [1-13C]glucose and/or [1-13C]acetate in human subjects are determined. Stepwise improvements of data processing are illustrated by examples of normal and pathological results. Variation in the analysis of individual 13C resonances ranged between 2 and 14%. Using this method it is possible to reliably identify subtle metabolic effects of brain disease, including Alzheimer's disease and epilepsy.

  6. A fully automated meltwater monitoring and collection system for spatially distributed isotope analysis in snowmelt-dominated catchments

    Science.gov (United States)

    Rücker, Andrea; Boss, Stefan; Von Freyberg, Jana; Zappa, Massimiliano; Kirchner, James

    2016-04-01

    In many mountainous catchments the seasonal snowpack stores a significant volume of water, which is released as streamflow during the melting period. The predicted change in future climate will bring new challenges in water resource management in snow-dominated headwater catchments and their receiving lowlands. To improve predictions of hydrologic extreme events, particularly summer droughts, it is important to characterize the relationship between winter snowpack and summer (low) flows in such areas (e.g., Godsey et al., 2014). In this context, stable water isotopes (18O, 2H) are a powerful tool for fingerprinting the sources of streamflow and tracing water flow pathways. For this reason, we have established an isotope sampling network in the Alptal catchment (46.4 km2) in central Switzerland as part of the SREP-Drought project (Snow Resources and the Early Prediction of hydrological DROUGHT in mountainous streams). Samples of precipitation (daily), snow cores (weekly) and runoff (daily) are analyzed for their isotopic signature in a regular cycle. Precipitation is also sampled along a horizontal transect at the valley bottom and along an elevational transect. Additionally, the analysis of snow meltwater is of importance. As the collection of snow meltwater samples in mountainous terrain is often impractical, we have developed a fully automatic snow lysimeter system, which measures meltwater volume and collects samples for isotope analysis at daily intervals. The system consists of three lysimeters built from Decagon ECRN-100 high-resolution rain gauges as the standard component that allows monitoring of meltwater flow. Each lysimeter leads the meltwater into a 10-liter container that is automatically sampled and then emptied daily. These water samples are retrieved regularly and afterwards analyzed for their isotopic composition in the lab. Snowmelt events as well as system status can be monitored in real time. In our presentation we describe the automatic snow lysimeter ...

  7. A fully automated health-care monitoring at home without attachment of any biological sensors and its clinical evaluation.

    Science.gov (United States)

    Motoi, Kosuke; Ogawa, Mitsuhiro; Ueno, Hiroshi; Kuwae, Yutaka; Ikarashi, Akira; Yuji, Tadahiko; Higashi, Yuji; Tanaka, Shinobu; Fujimoto, Toshiro; Asanoi, Hidetsugu; Yamakoshi, Ken-ichi

    2009-01-01

    Daily monitoring of health condition is important for an effective scheme of early diagnosis, treatment and prevention of lifestyle-related diseases such as adiposis, diabetes, cardiovascular disease and other diseases. Commercially available devices for health care monitoring at home are cumbersome in terms of the self-attachment of biological sensors and self-operation of the devices. From this viewpoint, we have been developing a non-conscious physiological monitor installed in a bath, a lavatory and a bed for home health care, and have evaluated its measurement accuracy by simultaneous recordings from biological sensors directly attached to the body surface. In order to investigate its applicability to health condition monitoring, we have further developed a new monitoring system that can automatically monitor and store health condition data. In this study, in an evaluation of 3 patients with cardiac infarction or sleep apnea syndrome, health conditions such as body and excretion weight in the toilet and apnea and hypopnea during sleep were successfully monitored, indicating that the system appears useful for monitoring health condition during daily living.

  8. Cloud-based CT dose monitoring using the DICOM-structured report. Fully automated analysis in regard to national diagnostic reference levels

    Energy Technology Data Exchange (ETDEWEB)

    Boos, J.; Rubbert, C.; Heusch, P.; Lanzman, R.S.; Aissa, J.; Antoch, G.; Kroepil, P. [Univ. Duesseldorf (Germany). Dept. of Diagnostic and Interventional Radiology]; Meineke, A. [Cerner Health Services, Idstein (Germany)]

    2016-03-15

    To implement automated CT dose data monitoring using the DICOM structured report (DICOM-SR) in order to monitor dose-related CT data with regard to national diagnostic reference levels (DRLs). Materials and Methods: We used a novel in-house co-developed software tool based on the DICOM-SR to automatically monitor dose-related data from CT examinations. The DICOM-SR for each CT examination performed between 09/2011 and 03/2015 was automatically anonymized and sent from the CT scanners to a cloud server. Data were automatically analyzed in accordance with body region, patient age and the corresponding DRL for the volumetric computed tomography dose index (CTDIvol) and dose-length product (DLP). Results: Data from 36,523 examinations (131,527 scan series) performed on three different CT scanners and one PET/CT were analyzed. The overall mean CTDIvol and DLP were 51.3% and 52.8% of the national DRLs, respectively. CTDIvol and DLP reached 43.8% and 43.1% of the corresponding national DRLs for abdominal CT (n = 10,590), 66.6% and 69.6% for cranial CT (n = 16,098), and 37.8% and 44.0% for chest CT (n = 10,387), respectively. Overall, the CTDIvol exceeded national DRLs in 1.9% of the examinations, while the DLP exceeded national DRLs in 2.9% of the examinations. Between different CT protocols for the same body region, radiation exposure varied by up to 50% of the DRLs. Conclusion: The implemented cloud-based CT dose monitoring based on the DICOM-SR enables automated benchmarking against national DRLs. Overall, the local dose exposure from CT reached approximately 50% of these DRLs, indicating that updating of the DRLs as well as protocol-specific DRLs are desirable. The cloud-based approach enables multi-center dose monitoring and offers great potential to further optimize radiation exposure in radiology departments.
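    A sketch of the benchmarking step in Python; the DRL values and record layout below are placeholders, not the actual national reference levels:

      # Express each dose record as a percentage of its reference level and flag
      # exceedances. DRL numbers and field names are placeholders.
      DRL = {"abdomen": {"ctdi_vol": 20.0, "dlp": 900.0},
             "head":    {"ctdi_vol": 60.0, "dlp": 950.0}}

      exams = [{"region": "abdomen", "ctdi_vol": 9.1, "dlp": 410.0},
               {"region": "head",    "ctdi_vol": 41.0, "dlp": 980.0}]

      for e in exams:
          ref = DRL[e["region"]]
          pct_ctdi = 100.0 * e["ctdi_vol"] / ref["ctdi_vol"]
          pct_dlp = 100.0 * e["dlp"] / ref["dlp"]
          flag = " EXCEEDS DRL" if max(pct_ctdi, pct_dlp) > 100.0 else ""
          print(f"{e['region']}: CTDIvol {pct_ctdi:.0f}%, DLP {pct_dlp:.0f}% of DRL{flag}")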

  9. Application of Monitoring Automation Design in a Broadcast Transmitting Station

    Institute of Scientific and Technical Information of China (English)

    李俊学; 郝渝

    2011-01-01

    The monitoring automation system of the Shaanxi broadcast transmitting station adopts a three-level cascaded tree structure for signal acquisition and transmission; the characteristic points of its indicator system and the advantages and disadvantages of the system are analyzed. Design methods for applying computer technology, automatic control theory and network transmission technology to a broadcast transmitting station are presented. These methods offer a useful reference for the automated upgrading of similar node-based systems, such as communication or logistics networks.

  10. Operational Ship Monitoring System Based on Synthetic Aperture Radar Processing

    Directory of Open Access Journals (Sweden)

    Antonio Tabasco

    2009-08-01

    This paper presents a Ship Monitoring System (SIMONS) working with Synthetic Aperture Radar (SAR) images. It is able to infer ship detection and classification information and merge the results with other input channels, such as polls from the Automatic Identification System (AIS). Two main stages can be identified, namely SAR processing and data dissemination. The former has three independent modules, related to Coastline Detection (CD), Ship Detection (SD) and Ship Classification (SC). The latter is handled via an advanced web interface compliant with the open standards defined by the Open Geospatial Consortium (OGC). SIMONS has been designed to be a modular, unsupervised and reliable system that meets Near-Real-Time (NRT) delivery requirements. From data ingestion to product delivery, the processing chain is fully automatic, accepting ERS and ENVISAT formats. SIMONS has been developed by GMV Aerospace, S.A. with three main goals: (1) to limit the dependence on the ancillary information provided by systems such as AIS; (2) to achieve the maximum level of automatism and restrict human manipulation; (3) to limit the error sources and their propagation. Spanish authorities have validated SIMONS. The results have been satisfactory and have confirmed that the system is useful for improving decision making. For single-polarization images with a resolution of 30 m, SIMONS permits the location of ships larger than 40 m with a classification ratio of around 50% positive matches. These values are expected to improve with SAR data from new sensors. In the paper, the performance of the SD and SC modules is assessed by cross-checking SAR data with AIS reports.
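    For illustration, a generic adaptive-threshold (CFAR-style) detector over a SAR amplitude image; this is a common approach to the SD stage, not necessarily the algorithm SIMONS implements:

      # Generic adaptive-threshold ship detector on a SAR amplitude image: a
      # CFAR-style sketch with synthetic clutter; window size and threshold
      # factor are tunable guards, not SIMONS parameters.
      import numpy as np
      from scipy import ndimage

      img = np.random.gamma(1.0, 1.0, size=(512, 512))   # stand-in for sea clutter
      img[200:204, 300:310] += 25.0                      # synthetic bright target

      bg_mean = ndimage.uniform_filter(img, size=51)     # local clutter mean estimate
      bg_var = ndimage.uniform_filter(img ** 2, size=51) - bg_mean ** 2
      bg_std = np.sqrt(np.clip(bg_var, 0.0, None))
      detections = img > bg_mean + 5.0 * bg_std          # pixels far above local clutter
      labels, n_targets = ndimage.label(detections)
      print(f"{n_targets} candidate targets")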

  11. Modernization of existing power plants. Progress in automation and process control/observation

    International Nuclear Information System (INIS)

    Numerous power plants are now getting on in years, and their owners have to face the question 'new plant or upgrade?'. Experience in the past few years has shown that in many cases modernization/upgrading of an existing plant is a more favorable option than building a complete new power plant. Advantages such as lower capital investment costs and the avoidance of licensing risks for new plants constitute important motives for choosing the upgrade option in numerous power plant modernization projects. The defined objective here is to ensure the units' operating capability for another 20 to 25 years, sometimes supplemented by meticulous compliance with current environmental impact legislation. Another cogent argument emerges from automation engineering advances in modern-day control systems, which make an effective contribution to meeting upgrading objectives like: equipment/material-friendly operation, extended useful lifetime, enhanced plant reliability, enhanced plant availability, improved plant efficiency, optimized staffing levels, enhanced cost-effectiveness, and compliance with today's international standards. In this context special attention is paid to the economic aspects and to the increase of plant availability. (author). 6 figs

  12. Online flow cytometry for monitoring apoptosis in mammalian cell cultures as an application for process analytical technology.

    Science.gov (United States)

    Kuystermans, Darrin; Avesh, Mohd; Al-Rubeai, Mohamed

    2016-05-01

    Apoptosis is the main driver of cell death in bioreactor suspension cell cultures during the production of biopharmaceuticals from animal cell lines. It is known that apoptosis also affects the quality and quantity of the expressed recombinant protein, which has raised the importance of studying apoptosis for implementing culture optimization strategies. The work here describes a novel approach to obtaining near-real-time data on the proportions of viable, early apoptotic, late apoptotic and necrotic cell populations in a suspension CHO culture, using automated sample preparation in conjunction with flow cytometry. The resulting online flow cytometry data can track the progression of apoptotic events in culture, aligning with analogous manual methodologies and giving similar results. The near-real-time apoptosis data obtained are a significant improvement in monitoring capabilities and can lead to improved control strategies and research data on complex biological systems in bioreactor cultures, in both academic and industrial settings focused on process analytical technology applications.

  13. Miniaturized, Multi-Analyte Sensor Array for the Automated Monitoring of Major Atmospheric Constituents in Spacecraft Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — InnoSense LLC (ISL) proposes to develop a miniaturized, multi-analyte sensor for near real-time monitoring of analytes in the spacecraft environment. The proposed...

  14. Miniaturized, Multi-Analyte Sensor Array for the Automated Monitoring of Major Atmospheric Constituents in Spacecraft Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the Phase II SBIR project is to develop a prototype sensor system to detect gaseous analytes in support of the spacecraft environmental monitoring...

  15. Evaluating an Automated Approach for Monitoring Forest Disturbances in the Pacific Northwest from Logging, Fire and Insect Outbreaks with Landsat Time Series Data

    Directory of Open Access Journals (Sweden)

    Christopher S. R. Neigh

    2014-12-01

    Forests are the largest aboveground sink for atmospheric carbon (C), and understanding how they change through time is critical to reducing our C-cycle uncertainties. We investigated a strong decline in the Normalized Difference Vegetation Index (NDVI) from 1982 to 1991 in Pacific Northwest forests, observed with the National Oceanic and Atmospheric Administration's (NOAA) series of Advanced Very High Resolution Radiometers (AVHRRs). To understand the causal factors of this decline, we evaluated an automated classification method developed for Landsat time series stacks (LTSS) to map forest change. This method included: (1) multiple disturbance index thresholds; and (2) a spectral trajectory-based image analysis with multiple confidence thresholds. We produced 48 maps and verified their accuracy with air photos, Monitoring Trends in Burn Severity data and insect aerial detection survey data. Area-based accuracy estimates for change in forest cover resulted in producer's and user's accuracies of 0.21 ± 0.06 to 0.38 ± 0.05 for insect disturbance, 0.23 ± 0.07 to 1 ± 0 for burned area and 0.74 ± 0.03 to 0.76 ± 0.03 for logging. We believe that accuracy was low for insect disturbance because the air photo reference data were temporally sparse, hence missing some outbreaks, and the annual anniversary time step is not dense enough to track defoliation and progressive stand mortality. Producer's and user's accuracy for burned area was low due to the temporally abrupt nature of fire and harvest, with a similar response of spectral indices between the disturbance index and the normalized burn ratio. We conclude that the spectral trajectory approach also captures multi-year stress that could be caused by climate, acid deposition, pathogens, partial harvest, thinning, etc. Our study focused on understanding the transferability of previously successful methods to new ecosystems, and found that this automated method does not perform with the same accuracy in Pacific ...
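    As an illustration of the spectral indices involved, a Python sketch computing the normalized burn ratio (NBR) on two dates and thresholding the difference; the reflectance values and the 0.27 threshold are illustrative, not the paper's calibrated settings:

      # dNBR change sketch: NBR = (NIR - SWIR2) / (NIR + SWIR2), differenced
      # between two dates and thresholded per pixel. Inputs are simulated.
      import numpy as np

      def nbr(nir, swir2):
          return (nir - swir2) / (nir + swir2 + 1e-9)

      rng = np.random.default_rng(0)
      nir_t0 = rng.uniform(0.2, 0.4, (100, 100))    # pre-disturbance reflectance
      swir2_t0 = rng.uniform(0.05, 0.15, (100, 100))
      nir_t1, swir2_t1 = nir_t0 * 0.6, swir2_t0 * 1.8   # simulate a disturbance
      dnbr = nbr(nir_t0, swir2_t0) - nbr(nir_t1, swir2_t1)
      disturbed = dnbr > 0.27                       # illustrative dNBR threshold
      print(f"disturbed fraction: {disturbed.mean():.2%}")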

  16. LOCALIZATION OF PALM DORSAL VEIN PATTERN USING IMAGE PROCESSING FOR AUTOMATED INTRA-VENOUS DRUG NEEDLE INSERTION

    Directory of Open Access Journals (Sweden)

    Mrs. Kavitha. R,

    2011-06-01

    The vein pattern in palms is a random mesh of interconnected and intertwining blood vessels. This project applies the vein detection concept to automate the drug delivery process. It deals with extracting palm dorsal vein structures, which is a key procedure for selecting the optimal drug needle insertion point. Grayscale images obtained from a low-cost IR webcam are poor in contrast and usually noisy, which makes effective vein segmentation a great challenge. Here a new vein image segmentation method is introduced, based on enhancement techniques, which resolves the conflict between poor-contrast vein images and good-quality image segmentation. A Gaussian filter is used to remove the high-frequency noise in the image. The ultimate goal is to identify venous bifurcations and determine the insertion point for the needle between their branches.
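    A minimal OpenCV sketch of an enhancement-based segmentation chain of this kind (contrast enhancement, Gaussian denoising, adaptive thresholding); parameter values are illustrative, not those of the paper:

      # Enhancement-plus-segmentation sketch for low-contrast IR vein images.
      # The input frame is a random stand-in; parameters are illustrative.
      import cv2
      import numpy as np

      img = np.random.randint(0, 255, (240, 320), dtype=np.uint8)  # stand-in IR frame
      clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
      enhanced = clahe.apply(img)                                  # lift the poor IR contrast
      smoothed = cv2.GaussianBlur(enhanced, (5, 5), 0)             # suppress high-frequency noise
      veins = cv2.adaptiveThreshold(smoothed, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                    cv2.THRESH_BINARY_INV, 15, 5)  # dark vein ridges -> foreground
      print(f"vein pixels: {int(np.count_nonzero(veins))}")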

  17. Use of a robotic manipulator in the simulation of the automation of a calibration process of dosemeters

    International Nuclear Information System (INIS)

    The development of a system based on a robotic manipulator that simulates the operating sequence of a dosemeter calibration process is presented. In this process, the positions of the dosemeters and the calibrator are monitored by means of an articulated robot arm, which executes the movement sequences and makes decisions based on information from external sensors. (Author)

  18. 76 FR 52581 - Automated Data Processing and Information Retrieval System Requirements

    Science.gov (United States)

    2011-08-23

    ... and Information Retrieval System Requirements AGENCY: Food and Nutrition Service, USDA. ACTION... automatic data processing (ADP) and information retrieval system, including the evaluation of data from... data processing and information retrieval system and to provide clarifications and updates which...

  19. Automated three-dimensional detection and classification of living organisms using digital holographic microscopy with partial spatial coherent source: application to the monitoring of drinking water resources.

    Science.gov (United States)

    El Mallahi, Ahmed; Minetti, Christophe; Dubois, Frank

    2013-01-01

    In this paper, we investigate the use of a digital holographic microscope working with partially spatially coherent illumination for the automated detection and classification of living organisms. A robust automatic method based on the computation of propagating matrices is proposed to detect the 3D positions of organisms. We apply this procedure to the evaluation of drinking water resources by developing a classification process to identify parasitic protozoan Giardia lamblia cysts among two other similar organisms. By selecting textural features from the quantitative optical phase instead of morphological ones, a robust classifier is built, providing a new method for the unambiguous detection of Giardia lamblia cysts, which present a critical contamination risk.
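    For context, numerical refocusing in digital holography is commonly done with the angular-spectrum method, one way the propagating matrices mentioned above can be realized; the paper's exact scheme is not specified, so this Python sketch is generic:

      # Angular-spectrum propagation: refocus a complex hologram field to depth z.
      # The input field, wavelength and pixel pitch are placeholder values.
      import numpy as np

      def propagate(field, z, wavelength, dx):
          """Propagate a complex field by distance z (meters); dx = pixel pitch."""
          ny, nx = field.shape
          fx = np.fft.fftfreq(nx, d=dx)
          fy = np.fft.fftfreq(ny, d=dx)
          FX, FY = np.meshgrid(fx, fy)
          arg = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
          kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))   # evanescent terms suppressed
          return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

      hologram = np.ones((256, 256), dtype=complex)          # stand-in recorded field
      refocused = propagate(hologram, z=50e-6, wavelength=532e-9, dx=0.5e-6)
      print(refocused.shape)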

  20. Towards automated processing of the right of access in inter-organizational Web Service compositions

    DEFF Research Database (Denmark)

    Herkenhöner, Ralph; De Meer, Hermann; Jensen, Meiko;

    2010-01-01

    Enforcing the right of access to personal data usually is a long-running process between a data subject and an organization that processes personal data. As of today, this task is commonly realized using a manual process based on postal communication or personal attendance and ends up conflicting...