WorldWideScience

Sample records for automated fault extraction

  1. Automated fault extraction and classification using 3-D seismic data for the Ekofisk field development

    Energy Technology Data Exchange (ETDEWEB)

    Signer, C.; Nickel, M.; Randen, T.; Saeter, T.; Soenneland, H.H.

    1998-12-31

    Mapping of fractures is important for the prediction of fluid flow in many reservoir types. The fluid flow depends mainly on the efficiency of the reservoir seals. Improved spatial mapping of the open and closed fracture systems will allow a better prediction of the fluid flow pattern. The primary objective of this paper is to present fracture characterization at the reservoir scale combined with seismic facies mapping. The complexity of the giant Ekofisk field on the Norwegian continental shelf provides an ideal framework for testing the validity and the applicability of an automated seismic fault and fracture detection and mapping tool. The mapping of the faults can be based on seismic attribute grids, which means that attribute responses related to faults are extracted along key horizons that were interpreted in the reservoir interval. 3 refs., 3 figs.

  2. Automated extraction of faults and porous reservoir bodies. Examples from the Vallhall Field

    Energy Technology Data Exchange (ETDEWEB)

    Barkved, Olav Inge; Whitman, Doug; Kunz, Tim

    1998-12-31

    The Norwegian Valhall field is located 250 km south-west of Stavanger. Production is primarily from the highly porous and fractured chalk of the Tor formation. Fractures evidently play a significant role in enhancing flow properties, as production rates are significantly higher than expected from matrix permeability alone. The fractures are primarily tectonically induced and related to faulting. Syn-depositional faulting is believed to be a controlling factor in the reservoir thickness variations observed across the field. Due to the low acoustic contrast and weak appearance of the highly porous chalk, direct evidence of faulting in well bore logs is limited. The seismic data quality in the most central area of the field is very poor due to Tertiary gas charging, but in the flank areas of the field the quality is excellent. 1 ref., 5 figs.

  3. Semi-automated fault system extraction and displacement analysis of an excavated oyster reef using high-resolution laser scanned data

    Science.gov (United States)

    Molnár, Gábor; Székely, Balázs; Harzhauser, Mathias; Djuricic, Ana; Mandic, Oleg; Dorninger, Peter; Nothegger, Clemens; Exner, Ulrike; Pfeifer, Norbert

    2015-04-01

    In this contribution we present a semi-automated method for reconstructing the brittle deformation field of an excavated Miocene oyster reef in Stetten, Korneuburg Basin, Lower Austria. Oyster shells up to 80 cm in size were scattered in a shallow estuarine bay, forming a continuous and almost isochronous layer as a consequence of a catastrophic event in the Miocene. This shell bed was preserved by burial under several hundred meters of sandy to silty sediments. Later the layers were tilted westward and uplifted, and erosion almost exhumed them. An excavation revealed a 27 by 17 meter area of the oyster-covered layer. During the tectonic processes the sediment volume suffered brittle deformation. Faults, mostly NW-SE striking and with normal components of a few centimeters, affected the oyster-covered volume, dissecting many shells as well as the surrounding matrix. The faults and the displacements along them can typically be traced for several meters across the site, and as the fossil oysters are broken and their parts displaced by the faulting, along some faults it is possible to follow these displacements in 3D. In order to quantify these varying displacements and to map the undulating fault traces, high-resolution scanning of the excavated and cleaned surface of the oyster bed was carried out using a terrestrial laser scanner. The resulting point clouds were co-georeferenced at mm accuracy, and a 1 mm resolution 3D point cloud of the surface was created. As the faults are well represented in the point cloud, this enables us to measure the dislocations of the dissected shell parts along the fault lines. We used a semi-automatic method to quantify these dislocations. First we manually digitized the fault lines in 2D as an initial model. In the next step we estimated the vertical (i.e. perpendicular to the layer) component of the dislocation along these fault lines by comparing the elevations on the two sides of the faults with moving averaging windows.
To estimate the strike
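    The moving-window elevation comparison described above can be sketched as follows. This is a hypothetical simplification (a uniform elevation grid, a fault trace given as one column per row, invented window parameters), not the authors' implementation:

```python
import numpy as np

def vertical_throw(dem, fault_cols, half_win=5, gap=2):
    """Estimate vertical offset across a roughly north-south fault trace.

    dem        -- 2D elevation grid (rows x cols)
    fault_cols -- for each row, the column index of the fault trace
    half_win   -- averaging-window width on each side of the fault
    gap        -- columns skipped next to the trace to avoid damaged cells
    """
    throws = []
    for r, c in enumerate(fault_cols):
        left = dem[r, max(0, c - gap - half_win):c - gap]
        right = dem[r, c + gap + 1:c + gap + 1 + half_win]
        if left.size and right.size:
            # averaging suppresses the mm-scale roughness of the shell bed
            throws.append(right.mean() - left.mean())
    return np.array(throws)

# Synthetic surface with a 10 cm normal step at column 50
dem = np.zeros((20, 100))
dem[:, 50:] -= 0.10
offsets = vertical_throw(dem, fault_cols=[50] * 20)
print(offsets.mean())  # -0.10
```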

  4. Automated metadata extraction

    OpenAIRE

    Migletz, James J.

    2008-01-01

    Metadata is data that describes data. There are many computer forensic uses of metadata, and being able to extract metadata automatically has positive forensic implications. This thesis presents a new technique for batch processing disk images and automatically extracting metadata from files and file contents. The technique is embodied in a program called fiwalk that has a plug-in architecture allowing new metadata extractors to be readily incorporated. Output from fiwalk can be provided ...

  5. Automated Water Extraction Index

    DEFF Research Database (Denmark)

    Feyisa, Gudina Legese; Meilby, Henrik; Fensholt, Rasmus;

    2014-01-01

    Classifying surface cover types and analyzing changes are among the most common applications of remote sensing. One of the most basic classification tasks is to distinguish water bodies from dry land surfaces. Landsat imagery is among the most widely used sources of data in remote sensing of water...... resources; and although several techniques of surface water extraction using Landsat data are described in the literature, their application is constrained by low accuracy in various situations. Besides, with the use of techniques such as single band thresholding and two-band indices, identifying...... an appropriate threshold yielding the highest possible accuracy is a challenging and time consuming task, as threshold values vary with location and time of image acquisition. The purpose of this study was therefore to devise an index that consistently improves water extraction accuracy in the presence...
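    For reference, the two AWEI formulations for Landsat reflectance bands can be sketched as below. The coefficients are reproduced from the published index as commonly cited and should be checked against the original paper before use:

```python
import numpy as np

def awei_nsh(green, nir, swir1, swir2):
    """'No shadow' variant, for scenes without significant shadow."""
    return 4 * (green - swir1) - (0.25 * nir + 2.75 * swir2)

def awei_sh(blue, green, nir, swir1, swir2):
    """'Shadow' variant, intended to also suppress shadow pixels."""
    return blue + 2.5 * green - 1.5 * (nir + swir1) - 0.25 * swir2

# Pixels with a positive index value are classified as water,
# which removes the per-scene threshold search the abstract mentions.
green = np.array([0.06, 0.10])   # dry-land pixel, water pixel
nir = np.array([0.30, 0.04])
swir1 = np.array([0.25, 0.02])
swir2 = np.array([0.20, 0.01])
print(awei_nsh(green, nir, swir1, swir2) > 0)  # [False  True]
```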

  6. Software fault tree analysis of an automated control system device written in Ada

    OpenAIRE

    Winter, Mathias William.

    1995-01-01

    Software Fault Tree Analysis (SFTA) is a technique used to analyze software for faults that could lead to hazardous conditions in systems which contain software components. Previous thesis works have developed three Ada-based, semi-automated software analysis tools, the Automated Code Translation Tool (ACm) an Ada statement template generator, the Fault Tree Editor (Fm) a graphical fault tree editor, and the Fault Isolator (Fl) an automated software fault tree isolator. These previous works d...

  7. Automated Extraction of Flow Features

    Science.gov (United States)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation, including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, re-circulation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
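    As a minimal illustration of the kind of automated detection the abstract calls for, vortical and re-circulating regions in a 2D velocity field can be flagged from the vorticity. This sketch assumes a uniform grid and is not the tool described in the record:

```python
import numpy as np

def vorticity_2d(u, v, dx=1.0, dy=1.0):
    """z-vorticity of a 2D velocity field sampled on a uniform grid."""
    dvdx = np.gradient(v, dx, axis=1)   # dv/dx across columns
    dudy = np.gradient(u, dy, axis=0)   # du/dy across rows
    return dvdx - dudy

# Solid-body rotation: u = -y, v = x  ->  vorticity = 2 everywhere,
# so a simple magnitude threshold would flag the whole region as vortical.
y, x = np.mgrid[-1:1:21j, -1:1:21j]
w = vorticity_2d(-y, x, dx=0.1, dy=0.1)
print(np.allclose(w, 2.0))  # True
```

    A production feature extractor would apply a criterion like this (or a more robust one) frame by frame during the solve, reporting only the flagged regions to the engineer.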

  8. Automated Fault Detection for DIII-D Tokamak Experiments

    International Nuclear Information System (INIS)

    An automated fault detection software system has been developed and was used during 1999 DIII-D plasma operations. The Fault Identification and Communication System (FICS) executes automatically after every plasma discharge to check dozens of subsystems for proper operation and communicates the test results to the tokamak operator. This system is now used routinely during DIII-D operations and has led to an increase in tokamak productivity

  9. Automated Extraction of DNA from clothing

    OpenAIRE

    Stangegaard, Michael; Hjort, Benjamin Benn; Nøhr Hansen, Thomas; Hansen, Anders Johannes; Morling, Niels

    2011-01-01

    Presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. We have compared three automated DNA extraction methods based on magnetic beads with a manual method with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable DNA profiles.

  10. Automated Extraction of DNA from clothing

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hjort, Benjamin Benn; Nøhr Hansen, Thomas;

    2011-01-01

    Presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. We have compared three automated DNA extraction methods based on magnetic beads with a manual method with the aim of reducing the...... amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable DNA profiles....

  11. Automated Feature Extraction from Hyperspectral Imagery Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed activities will result in the development of a novel hyperspectral feature-extraction toolkit that will provide a simple, automated, and accurate...

  12. Optimization-based Method for Automated Road Network Extraction

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, D

    2001-09-18

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  13. Optimization-based Method for Automated Road Network Extraction

    International Nuclear Information System (INIS)

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper reviews the literature on road extraction and describes a study of an optimization-based method for automated road network extraction.

  14. Automatic fault extraction using a modified ant-colony algorithm

    International Nuclear Information System (INIS)

    The basis of automatic fault extraction is seismic attributes, such as the coherence cube, in which a fault is identified by minimum values. The biggest challenge in automatic fault extraction is noise, including noise in the seismic data itself. However, a fault has better spatial continuity in a certain direction, which makes it quite different from noise. Considering this characteristic, a modified ant-colony algorithm is introduced for automatic fault identification and tracking, where the gradient direction and direction consistency are used as constraints. Numerical model tests show that this method is feasible and effective for automatic fault extraction and noise suppression. Application to field data further illustrates its validity and superiority. (paper)
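    The role of the direction-consistency constraint can be illustrated with a greatly simplified greedy tracker on a coherence slice. This is a hypothetical sketch of the idea, not the paper's ant-colony algorithm; the slice, the fault geometry, and the step bound are all invented:

```python
import numpy as np

def track_fault(coh, start_col, max_dev=1):
    """Greedily trace a low-coherence lineament down a coherence slice.

    At each row the tracker may move at most `max_dev` columns sideways.
    This bound is the direction-consistency constraint: a spatially
    continuous fault stays trackable, while isolated noise minima that
    jump around the slice are never followed.
    """
    rows, cols = coh.shape
    path = [start_col]
    for r in range(1, rows):
        c = path[-1]
        lo, hi = max(0, c - max_dev), min(cols, c + max_dev + 1)
        path.append(lo + int(np.argmin(coh[r, lo:hi])))
    return path

# Synthetic slice: high coherence everywhere, one dipping low-coherence fault
rng = np.random.default_rng(0)
coh = 1.0 - 0.05 * rng.random((30, 40))
true_path = [5 + r // 3 for r in range(30)]
for r, c in enumerate(true_path):
    coh[r, c] = 0.1
print(track_fault(coh, 5) == true_path)  # True
```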

  15. Fault tolerant strategies for automated operation of nuclear reactors

    International Nuclear Information System (INIS)

    This paper introduces an automatic control system incorporating a number of verification, validation, and command generation tasks within a fault-tolerant architecture. The integrated system utilizes recent methods of artificial intelligence such as neural networks and fuzzy logic control. Furthermore, advanced signal processing and nonlinear control methods are also included in the design. The primary goal is to create an on-line capability to validate signals, analyze plant performance, and verify the consistency of commands before control decisions are finalized. The application of this approach to the automated startup of the Experimental Breeder Reactor-II (EBR-II) is performed using a validated nonlinear model. The simulation results show that the advanced concepts have the potential to improve plant availability and safety.

  16. Automation of the Tritium Extraction Facility

    International Nuclear Information System (INIS)

    The US Department of Energy has determined its future requirements for tritium will be met using the existing reactors of the Tennessee Valley Authority. Tritium Producing Burnable Absorber Rods (TPBARs) will replace the existing burnable absorber rods in the reactor core to beneficially use excess neutrons to create the tritium. The irradiated TPBARs will be shipped from the reactor to a new facility at the Savannah River Site. This new facility, the Tritium Extraction Facility (TEF), will receive the shipments from the reactor, store the TPBARs, prepare the TPBARs for tritium extraction, extract the tritium, and package the waste for disposal. The high level of gamma radiation emitted from the TPBARs will preclude human contact. Automation and remote handling will be used to accomplish the required operations, while minimizing radiation exposure to workers

  17. Automated fault-management in a simulated spaceflight micro-world

    Science.gov (United States)

    Lorenz, Bernd; Di Nocera, Francesco; Rottger, Stefan; Parasuraman, Raja

    2002-01-01

    BACKGROUND: As human spaceflight missions extend in duration and distance from Earth, a self-sufficient crew will bear far greater onboard responsibility and authority for mission success. This will increase the need for automated fault management (FM). Human factors issues in the use of such systems include maintenance of cognitive skill, situational awareness (SA), trust in automation, and workload. This study examined the human performance consequences of operator use of intelligent FM support in interaction with an autonomous, space-related, atmospheric control system. METHODS: An expert system representing a model-based reasoning agent supported operators at a low level of automation (LOA) by a computerized fault finding guide, at a medium LOA by an automated diagnosis and recovery advisory, and at a high LOA by automated diagnosis and recovery implementation, subject to operator approval or veto. Ten percent of the experimental trials involved complete failure of FM support. RESULTS: Benefits of automation were reflected in more accurate diagnoses, shorter fault identification times, and reduced subjective operator workload. Unexpectedly, fault identification times deteriorated more at the medium than at the high LOA during automation failure. Analyses of information sampling behavior showed that offloading operators from recovery implementation during reliable automation enabled operators at high LOA to engage in fault assessment activities. CONCLUSIONS: The potential threat to SA imposed by high-level automation, in which decision advisories are automatically generated, need not inevitably be counteracted by choosing a lower LOA. Instead, freeing operator cognitive resources by automatic implementation of recovery plans at a higher LOA can promote better fault comprehension, so long as the automation interface is designed to support efficient information sampling.

  18. Recent advances in automated system model extraction (SME)

    International Nuclear Information System (INIS)

    In this paper we present two different techniques for automated extraction of system models from FEA models. We discuss two different algorithms: for (i) automated N-DOF SME for electrostatically actuated MEMS and (ii) automated N-DOF SME for MEMS inertial sensors. We will present case studies for the two different algorithms presented

  19. Automated Fluid Feature Extraction from Transient Simulations

    Science.gov (United States)

    Haimes, Robert

    2000-01-01

    In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools like iso-surfaces, cuts and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one 'snap-shot' of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments like pV3). And methods must be developed to abstract the feature and display it in a manner that physically makes sense.

  20. Automated Extraction of Secondary Flow Features

    Science.gov (United States)

    Dorney, Suzanne M.; Haimes, Robert

    2005-01-01

    The use of Computational Fluid Dynamics (CFD) has become standard practice in the design and development of the major components used for air and space propulsion. To aid in the post-processing and analysis phase of CFD, many researchers now use automated feature extraction utilities. These tools can be used to detect the existence of such features as shocks, vortex cores, and separation and re-attachment lines. The existence of secondary flow is another feature of significant importance to CFD engineers. Although the concept of secondary flow is relatively well understood, there is no commonly accepted mathematical definition for secondary flow. This paper will present a definition for secondary flow and one approach for automatically detecting and visualizing secondary flow.

  1. Integrated Automation System for Rare Earth Countercurrent Extraction Process

    Institute of Scientific and Technical Information of China (English)

    柴天佑; 杨辉

    2004-01-01

    The low level of automation in industrial rare-earth extraction processes results in high production costs, inconsistent product quality and great consumption of resources in China. An integrated automation system for the extraction process of rare earth is proposed to realize optimal product indices, such as product purity, recycle rate and output. The optimal control strategy for output component, and the structure and function of the two-graded integrated automation system, composed of the process management grade and the process control grade, are discussed. This system has been successfully applied to a HAB yttrium extraction production process and was found to provide optimal control, optimal operation, optimal management and remarkable benefits.

  2. PCA Fault Feature Extraction in Complex Electric Power Systems

    Directory of Open Access Journals (Sweden)

    ZHANG, J.

    2010-08-01

    The electric power system is one of the most complex artificial systems in the world. Its complexity is determined by its characteristics of constitution, configuration, operation, organization, etc. Faults in electric power systems cannot be completely avoided. When an electric power system passes from a normal state to a failure or abnormal state, its electric quantities (currents, voltages, angles, etc.) may change significantly. Our research indicates that the variable with the biggest coefficient in the principal component usually corresponds to the fault. Therefore, utilizing real-time measurements from phasor measurement units and based on principal component analysis, we have successfully extracted the distinct features of the fault component. Of course, because of the complexity of the different types of faults in electric power systems, there still exist enormous problems that need close and intensive study.
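    The loading-inspection idea can be sketched with plain PCA on a measurement matrix. The channel layout and the fault signature below are invented for illustration; the abstract's claim is only that the channel dominating the first principal component tends to correspond to the fault:

```python
import numpy as np

def locate_fault_pca(X):
    """Index of the channel with the largest absolute coefficient (loading)
    in the first principal component of the measurement matrix.

    X -- (samples x channels); channels are assumed to be on comparable
    scales, so the data are only centered, not rescaled.
    """
    Xc = X - X.mean(axis=0)
    # Rows of Vt are the principal-component loadings of the centered data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return int(np.argmax(np.abs(Vt[0])))

rng = np.random.default_rng(1)
X = rng.normal(0.0, 0.01, (200, 5))       # five channels of measurement noise
X[100:, 3] += np.linspace(0.0, 1.0, 100)  # a fault transient on channel 3
print(locate_fault_pca(X))  # 3
```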

  3. Automated valve condition classification of a reciprocating compressor with seeded faults: experimentation and validation of classification strategy

    International Nuclear Information System (INIS)

    This paper deals with automatic valve condition classification of a reciprocating compressor with seeded faults. The seeded faults are considered based on observation of valve faults in practice. They include the misplacement of valve and spring plates, incorrect tightness of the bolts for the valve cover or valve seat, softening of the spring plate, and cracked or broken spring plates or valve plates. The seeded faults represent various stages of machine health condition, and it is crucial to be able to correctly classify the conditions so that preventative maintenance can be performed before catastrophic breakdown of the compressor occurs. Considering the non-stationary characteristics of the system, time–frequency analysis techniques are applied to obtain the vibration spectrum as time develops. A data reduction algorithm is subsequently employed to extract the fault features from the formidable amount of time–frequency data, and finally the probabilistic neural network is utilized to automate the classification process without the intervention of human experts. This study shows that the use of modification indices, as opposed to the original indices, greatly reduces the classification error, from about 80% down to about 20% misclassification for the 15 fault cases. Correct condition classification can be further enhanced if the use of similar fault cases is avoided. It is shown that a 6.67% classification error is achievable when using the short-time Fourier transform and the mean variation method for the case of seven seeded faults with 10 training samples used. A stunning 100% correct classification can even be realized when the neural network is well trained with 30 training samples being used.

  4. Automated valve condition classification of a reciprocating compressor with seeded faults: experimentation and validation of classification strategy

    Science.gov (United States)

    Lin, Yih-Hwang; Liu, Huai-Sheng; Wu, Chung-Yung

    2009-09-01

    This paper deals with automatic valve condition classification of a reciprocating compressor with seeded faults. The seeded faults are considered based on observation of valve faults in practice. They include the misplacement of valve and spring plates, incorrect tightness of the bolts for the valve cover or valve seat, softening of the spring plate, and cracked or broken spring plates or valve plates. The seeded faults represent various stages of machine health condition, and it is crucial to be able to correctly classify the conditions so that preventative maintenance can be performed before catastrophic breakdown of the compressor occurs. Considering the non-stationary characteristics of the system, time-frequency analysis techniques are applied to obtain the vibration spectrum as time develops. A data reduction algorithm is subsequently employed to extract the fault features from the formidable amount of time-frequency data, and finally the probabilistic neural network is utilized to automate the classification process without the intervention of human experts. This study shows that the use of modification indices, as opposed to the original indices, greatly reduces the classification error, from about 80% down to about 20% misclassification for the 15 fault cases. Correct condition classification can be further enhanced if the use of similar fault cases is avoided. It is shown that a 6.67% classification error is achievable when using the short-time Fourier transform and the mean variation method for the case of seven seeded faults with 10 training samples used. A stunning 100% correct classification can even be realized when the neural network is well trained with 30 training samples being used.
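    The final classification stage the abstract describes, a probabilistic neural network, is essentially a Gaussian-kernel density vote per class. The sketch below shows that core with invented toy features (the paper's STFT and mean-variation preprocessing is omitted):

```python
import numpy as np

def pnn_classify(train_X, train_y, x, sigma=0.1):
    """Probabilistic neural network: each class votes with the average
    Gaussian-kernel density of its training samples evaluated at x."""
    best, best_score = None, -1.0
    for cls in np.unique(train_y):
        d = train_X[train_y == cls] - x
        score = np.mean(np.exp(-np.sum(d * d, axis=1) / (2 * sigma ** 2)))
        if score > best_score:
            best, best_score = int(cls), score
    return best

# Toy two-class "valve condition" features
train_X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
train_y = np.array([0, 0, 1, 1])
print(pnn_classify(train_X, train_y, np.array([0.05, 0.02])))  # 0
```

    No iterative training is needed, which is one reason PNNs suit small seeded-fault data sets: each labeled sample simply becomes a kernel center.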

  5. Machine fault feature extraction based on intrinsic mode functions

    International Nuclear Information System (INIS)

    This work employs empirical mode decomposition (EMD) to decompose raw vibration signals into intrinsic mode functions (IMFs) that represent the oscillatory modes generated by the components that make up the mechanical systems generating the vibration signals. The motivation here is to develop vibration signal analysis programs that are self-adaptive and that can detect machine faults at the earliest onset of deterioration. The change in velocity of the amplitude of some IMFs over a particular unit time will increase when the vibration is stimulated by a component fault. Therefore, the amplitude acceleration energy in the intrinsic mode functions is proposed as an indicator of the impulsive features that are often associated with mechanical component faults. The periodicity of the amplitude acceleration energy for each IMF is extracted by spectrum analysis. A spectrum amplitude index is introduced as a method to select the optimal result. A comparison study of the method proposed here and some well-established techniques for detecting machinery faults is conducted through the analysis of both gear and bearing vibration signals. The results indicate that the proposed method has superior capability to extract machine fault features from vibration signals
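    The amplitude-acceleration-energy indicator can be sketched for a single IMF-like signal. The EMD step itself is omitted (it would need an EMD implementation, e.g. the third-party PyEMD package); the envelope here comes from an FFT-based analytic signal, and the test signal is invented:

```python
import numpy as np

def envelope(x):
    """Amplitude envelope via the FFT-based analytic signal (even-length x)."""
    n = len(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0
    h[1:n // 2] = 2.0          # double positive frequencies, drop negative ones
    return np.abs(np.fft.ifft(np.fft.fft(x) * h))

def amp_accel_energy(imf, fs, win):
    """Energy, per window, of the second difference ('acceleration') of the
    envelope. Impulsive component faults make this indicator spike."""
    accel = np.diff(envelope(imf), n=2) * fs ** 2
    n = len(accel) // win
    return np.array([np.sum(accel[i * win:(i + 1) * win] ** 2) for i in range(n)])

# A smooth 50 Hz "IMF" vs. the same tone with a sharp amplitude impulse
fs = 1000
t = np.arange(0, 1, 1 / fs)
clean = np.sin(2 * np.pi * 50 * t)
faulty = clean * (1 + 5 * np.exp(-((t - 0.5) * 200) ** 2))
print(amp_accel_energy(faulty, fs, 100).max() > amp_accel_energy(clean, fs, 100).max())  # True
```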

  6. Automated feature extraction and classification from image sources

    Science.gov (United States)

    U.S. Geological Survey

    1995-01-01

    The U.S. Department of the Interior, U.S. Geological Survey (USGS), and Unisys Corporation have completed a cooperative research and development agreement (CRADA) to explore automated feature extraction and classification from image sources. The CRADA helped the USGS define the spectral and spatial resolution characteristics of airborne and satellite imaging sensors necessary to meet base cartographic and land use and land cover feature classification requirements and help develop future automated geographic and cartographic data production capabilities. The USGS is seeking a new commercial partner to continue automated feature extraction and classification research and development.

  7. FADES: A tool for automated fault analysis of complex systems

    International Nuclear Information System (INIS)

    FADES is an expert system for performing fault analyses on complex connected systems. By using a graphical editor to draw components and link them together, the FADES system allows the analyst to describe a given system. The knowledge base created is used to qualitatively simulate the system behaviour. By inducing all possible component failures in the system and determining their effects, a set of facts is built up. These facts are then used to create fault trees or FMEA tables. The facts may also be used for explanation purposes and to generate diagnostic rules, allowing system instrumentation to be optimised. The prototype system has been built and tested and is presently undergoing testing by users. All comments from these trials will be used to tailor the system to the requirements of the user so that the end product performs the exact task required.

  8. Automated forensic extraction of encryption keys using behavioural analysis

    OpenAIRE

    Owen, Gareth

    2012-01-01

    In this paper we describe a technique for automatic algorithm identification and information extraction from unknown binaries. We emulate the binary using PyEmu forcing complete code coverage whilst simultaneously examining its behavior. Our behavior matcher then identifies specific algorithmic behavior and extracts information. We demonstrate the use of this technique for automated extraction of encryption keys from an unseen program with no prior knowledge about its implementation. Our tech...

  9. Automated Extraction of Free-Text from Pathology Reports

    OpenAIRE

    Currie*, Anne-Marie; Fricke, Travis; Gawne, Agnes; Johnston, Ric; Liu, John; Stein, Barbara

    2006-01-01

    Manually populating a cancer registry from free-text pathology reports is labor intensive and costly. This poster describes a method of automated text extraction to improve the efficiency of this process and reduce cost. FineTooth, a software company, provides an automated service to the Fred Hutchinson Cancer Research Center (FHCRC) to help populate their breast and prostate cancer clinical research database by electronically abstracting over 80 data fields from pathology text reports.

  10. Automated extraction of change information from multispectral satellite imagery

    International Nuclear Information System (INIS)

    Given the expected technical improvements in spatial and spectral resolution, satellite imagery could increasingly provide a basis for complex information systems for recognizing and monitoring even small-scale and short-term structural features of interest within nuclear facilities, for instance construction of buildings, plant expansion, changes of operational status, underground activities, etc. The analysis of large volumes of multisensor satellite data will then definitely require a high degree of automation in (pre-)processing, analysis and interpretation in order to extract the features of interest. Against this background, the present paper focuses on the automated extraction of change information from multispectral satellite imagery.

  11. Technology Corner: Automated Data Extraction Using Facebook

    Directory of Open Access Journals (Sweden)

    Nick Flor

    2012-06-01

    Because of Facebook's popularity, law enforcement agents often use it as a key source of evidence. But as with many digital user trails, there can be a large amount of data to extract for analysis. In this paper, we explore the basics of extracting data programmatically from a user's Facebook account via a Web app. A data extraction app requests data using the Facebook Graph API, and Facebook returns a JSON object containing the data. Before an app can access a user's Facebook data, the user must log into Facebook and give permission. Thus, this approach is limited to situations where users give consent to the data extraction.
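    The request/response pattern the abstract describes can be sketched as follows. The endpoint shape follows the Graph API convention of node-plus-fields queries; the token, field names, and sample body below are illustrative placeholders, not a tested call against the live API:

```python
import json
from urllib.parse import urlencode

GRAPH = "https://graph.facebook.com"

def build_request(user_id, fields, token):
    """URL for a Graph API read; the access token exists only because the
    user logged in and granted permission, as the abstract notes."""
    qs = urlencode({"fields": ",".join(fields), "access_token": token})
    return f"{GRAPH}/{user_id}?{qs}"

def parse_response(body):
    """Facebook answers with a JSON object keyed by the requested fields."""
    return json.loads(body)

url = build_request("me", ["id", "name", "posts"], "EAAB-placeholder-token")
# urllib.request.urlopen(url).read() would return a body such as:
sample = '{"id": "1001", "name": "A. User"}'
print(parse_response(sample)["name"])  # A. User
```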

  12. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior - failure modes and fault propagation - in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work attempts to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA), by querying a representation of the system in a SysML model. This work builds on the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and restructured it into an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that traverses the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as fault trees, to efficiently portray system behavior and to depend less on the intuition of fault management engineers in ensuring complete examination of off-nominal behavior.
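The kind of model traversal described can be illustrated with a toy sketch. The component names, the "affects" relationship, and the FMEA row layout below are invented for illustration; they are not the SMAP SysML model or its actual schema:

```python
# Hypothetical miniature system model: components, their failure modes, and
# "affects" links along which effects propagate.
system = {
    "battery":   {"modes": ["cell short"],       "affects": ["power bus"]},
    "power bus": {"modes": ["overvoltage"],      "affects": ["radio"]},
    "radio":     {"modes": ["loss of downlink"], "affects": []},
}

def downstream(component, model):
    """Collect every component reachable along 'affects' links (the effect chain)."""
    seen, queue = [], list(model[component]["affects"])
    while queue:
        c = queue.pop(0)
        if c not in seen:
            seen.append(c)
            queue.extend(model[c]["affects"])
    return seen

def build_fmea(model):
    """Emit one FMEA row per (component, failure mode) pair."""
    return [{"item": comp, "failure_mode": mode, "effects": downstream(comp, model)}
            for comp, info in model.items() for mode in info["modes"]]

rows = build_fmea(system)
for row in rows:
    print(row)
```

A real implementation would walk SysML elements via a modeling-tool API rather than a dictionary, but the traversal logic is the same shape.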

  13. An automated and simple method for brain MR image extraction

    OpenAIRE

    Zhu Zixin; Liu Jiafeng; Zhang Haiyan; Li Haiyun

    2011-01-01

    Abstract Background The extraction of brain tissue from magnetic resonance head images, is an important image processing step for the analyses of neuroimage data. The authors have developed an automated and simple brain extraction method using an improved geometric active contour model. Methods The method uses an improved geometric active contour model which can not only solve the boundary leakage problem but also is less sensitive to intensity inhomogeneity. The method defines the initial fu...

  14. Automated vasculature extraction from placenta images

    Science.gov (United States)

    Almoussa, Nizar; Dutra, Brittany; Lampe, Bryce; Getreuer, Pascal; Wittman, Todd; Salafia, Carolyn; Vese, Luminita

    2011-03-01

    Recent research in perinatal pathology argues that analyzing properties of the placenta may reveal important information on how certain diseases progress. One important property is the structure of the placental blood vessels, which supply a fetus with all of its oxygen and nutrition. An essential step in the analysis of the vascular network pattern is the extraction of the blood vessels, which has only been done manually through a costly and time-consuming process. There is no existing method to automatically detect placental blood vessels; in addition, the large variation in the shape, color, and texture of the placenta makes it difficult to apply standard edge-detection algorithms. We describe a method to automatically detect and extract blood vessels from a given image by using image processing techniques and neural networks. We evaluate several local features for every pixel, in addition to a novel modification to an existing road detector. Pixels belonging to blood vessel regions have recognizable responses; hence, we use an artificial neural network to identify the pattern of blood vessels. A set of images where blood vessels are manually highlighted is used to train the network. We then apply the neural network to recognize blood vessels in new images. The network is effective in capturing the most prominent vascular structures of the placenta.

  15. Automated Image Registration Using Morphological Region of Interest Feature Extraction

    Science.gov (United States)

    Plaza, Antonio; LeMoigne, Jacqueline; Netanyahu, Nathan S.

    2005-01-01

    With the recent explosion in the amount of remotely sensed imagery and the corresponding interest in temporal change detection and modeling, image registration has become increasingly important as a necessary first step in the integration of multi-temporal and multi-sensor data for applications such as the analysis of seasonal and annual global climate changes, as well as land use/cover changes. The task of image registration can be divided into two major components: (1) the extraction of control points or features from images; and (2) the search among the extracted features for the matching pairs that represent the same feature in the images to be matched. Manual control feature extraction can be subjective and extremely time consuming, and often results in few usable points. Automated feature extraction is a solution to this problem, where desired target features are invariant, and represent evenly distributed landmarks such as edges, corners and line intersections. In this paper, we develop a novel automated registration approach based on the following steps. First, a mathematical morphology (MM)-based method is used to obtain a scale-orientation morphological profile at each image pixel. Next, a spectral dissimilarity metric such as the spectral information divergence is applied for automated extraction of landmark chips, followed by an initial approximate matching. This initial condition is then refined using a hierarchical robust feature matching (RFM) procedure. Experimental results reveal that the proposed registration technique offers a robust solution in the presence of seasonal changes and other interfering factors. Keywords: automated image registration; multi-temporal imagery; mathematical morphology; robust feature matching.

  16. Applications of the Automated SMAC Modal Parameter Extraction Package

    International Nuclear Information System (INIS)

    An algorithm known as SMAC (Synthesize Modes And Correlate), based on principles of modal filtering, has been in development for a few years. The new capabilities of the automated version are demonstrated on test data from a complex shell/payload system. Examples of extractions from impact and shaker data are shown. The automated algorithm extracts 30 to 50 modes in the bandwidth from each column of the frequency response function matrix. Examples of the synthesized Mode Indicator Functions (MIFs) compared with the actual MIFs show the accuracy of the technique. A data set for one input and 170 accelerometer outputs can typically be reduced in an hour. Application to a test with some complex modes is also demonstrated

  17. Feature evaluation and extraction based on neural network in analog circuit fault diagnosis

    Institute of Scientific and Technical Information of China (English)

    Yuan Haiying; Chen Guangju; Xie Yongle

    2007-01-01

    Choosing the right characteristic parameters is the key to fault diagnosis in analog circuits. Feature evaluation and extraction methods based on neural networks are presented. Evaluation of circuit feature parameters is realized through neural network training results; the network's superior nonlinear mapping capability is well suited to extracting fault features, which are subsequently normalized and compressed. Neural-network-based feature extraction effectively transfers the complex classification problem of fault pattern recognition in analog circuits into the feature processing stage, which improves diagnosis efficiency. A fault diagnosis example validates the method.

  18. Critical Evaluation of Validation Rules Automated Extraction from Data

    Directory of Open Access Journals (Sweden)

    David Pejcoch

    2014-10-01

    Full Text Available The goal of this article is to critically evaluate the possibility of automatically extracting rules that could later be used within a Data Quality Management process for validating records newly arriving in an information system. For the practical demonstration, the 4FT-Miner procedure implemented in the LISpMiner system was chosen. A motivation for this task is the potential simplification of projects focused on Data Quality Management. The article first critically evaluates the possibility of fully automated extraction, with the aim of identifying the strengths and weaknesses of this approach compared to its alternative, in which at least some a priori knowledge is available. As a result of the practical implementation, the article provides the design of a recommended process that could serve as a guideline for future projects. The questions of how to store and maintain the extracted rules, and how to integrate them with existing tools supporting Data Quality Management, are also discussed.

  19. Automated RNA Extraction and Purification for Multiplexed Pathogen Detection

    Energy Technology Data Exchange (ETDEWEB)

    Bruzek, Amy K.; Bruckner-Lea, Cindy J.

    2005-01-01

    Pathogen detection has become an extremely important part of our nation's defense in this post 9/11 world where the threat of bioterrorist attacks are a grim reality. When a biological attack takes place, response time is critical. The faster the biothreat is assessed, the faster countermeasures can be put in place to protect the health of the general public. Today some of the most widely used methods for detecting pathogens are either time consuming or not reliable [1]. Therefore, a method that can detect multiple pathogens that is inherently reliable, rapid, automated and field portable is needed. To that end, we are developing automated fluidics systems for the recovery, cleanup, and direct labeling of community RNA from suspect environmental samples. The advantage of using RNA for detection is that there are multiple copies of mRNA in a cell, whereas there are normally only one or two copies of DNA [2]. Because there are multiple copies of mRNA in a cell for highly expressed genes, no amplification of the genetic material may be necessary, and thus rapid and direct detection of only a few cells may be possible [3]. This report outlines the development of both manual and automated methods for the extraction and purification of mRNA. The methods were evaluated using cell lysates from Escherichia coli 25922 (nonpathogenic), Salmonella typhimurium (pathogenic), and Shigella spp (pathogenic). Automated RNA purification was achieved using a custom sequential injection fluidics system consisting of a syringe pump, a multi-port valve and a magnetic capture cell. mRNA was captured using silica coated superparamagnetic beads that were trapped in the tubing by a rare earth magnet. RNA was detected by gel electrophoresis and/or by hybridization of the RNA to microarrays. The versatility of the fluidics systems and the ability to automate these systems allows for quick and easy processing of samples and eliminates the need for an experienced operator.

  20. Arduino-based automation of a DNA extraction system.

    Science.gov (United States)

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies to detect infectious diseases with the molecular genetic method. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic bead, which is part of a portable molecular genetic test system. This DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators provide a sequence of steps in the DNA extraction process, such as transporting, mixing, and washing for the gene specimen, magnetic bead, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delay, and input-output manipulation. It drives the stepper motor with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish the motor's reference position, and hard limit checking to prevent any over-travelling. The proposed system was implemented and its functionality was investigated, especially regarding positioning accuracy and velocity profile. PMID:26409535
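The host/controller split described above (a PC application compiles a G code sequence, the controller executes motion and delay steps) can be sketched as follows. The two-command G-code dialect, its syntax, and the simulated controller state are assumptions for illustration; the abstract does not specify the actual command set:

```python
def parse_gcode(program):
    """Parse a tiny, hypothetical G-code dialect:
    G1 Xn -> move the axis to position n; G4 Pn -> dwell for n milliseconds."""
    steps = []
    for line in program.strip().splitlines():
        words = line.split()
        if words[0] == "G1":
            steps.append(("move", float(words[1][1:])))
        elif words[0] == "G4":
            steps.append(("dwell", float(words[1][1:])))
    return steps

def execute(steps):
    """Simulate the controller: track axis position and accumulated dwell time."""
    position, dwell_ms = 0.0, 0.0
    for op, value in steps:
        if op == "move":
            position = value
        else:
            dwell_ms += value
    return position, dwell_ms

program = """
G1 X10.5
G4 P500
G1 X0
"""
print(execute(parse_gcode(program)))  # → (0.0, 500.0)
```

The real controller additionally ramps the stepper through a smooth acceleration profile, performs homing to establish the reference position, and checks hard limits; those details are omitted here.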

  1. Evaluation of Four Automated Protocols for Extraction of DNA from FTA Cards

    OpenAIRE

    Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura; Frank-Hansen, Rune; Poulsen, Lena; Hansen, Anders J.; Morling, Niels

    2013-01-01

    Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cell...

  2. Seismicity on Basement Faults Induced by Simultaneous Fluid Injection-Extraction

    Science.gov (United States)

    Chang, Kyung Won; Segall, Paul

    2016-08-01

    Large-scale carbon dioxide (CO2) injection into geological formations increases pore pressure, potentially inducing seismicity on critically stressed faults by reducing the effective normal stress. In addition, poroelastic expansion of the reservoir alters stresses, both within and around the formation, which may trigger earthquakes without direct pore-pressure diffusion. One possible solution to mitigate injection-induced earthquakes is to simultaneously extract pre-existing pore fluids from the target reservoir. To examine the feasibility of the injection-extraction strategy, we compute the spatiotemporal change in Coulomb stress on basement normal faults, including: (1) the change in poroelastic stresses, Δτ_s + fΔσ_n, where Δτ_s and Δσ_n are the changes in shear and normal stress, respectively, and (2) the change in pore pressure, fΔp. Using the model of Dieterich (J. Geophys. Res. Solid Earth 99(B2):2601-2618, 1994), we estimate the seismicity rate on basement fault zones. Fluid extraction reduces direct pore-pressure diffusion into conductive faults, generally reducing the risk of induced seismicity. Limited diffusion into/from sealing faults results in negligible pore-pressure changes within them. However, fluid extraction can cause enhanced seismicity rates on deep normal faults near the injector, as well as on shallow normal faults near the producer, by poroelastic stressing. Changes in seismicity rate driven by the poroelastic response to fluid injection-extraction depend on fault geometry, well operations, and the background stressing rate.
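The two-term Coulomb stress decomposition in the abstract reduces to a one-line computation. The friction coefficient value and the tension-positive sign convention below are common assumptions, not values taken from the paper:

```python
def coulomb_stress_change(d_tau_s, d_sigma_n, d_p, f=0.6):
    """ΔCFS = (Δτ_s + f·Δσ_n) + f·Δp, with tension-positive normal stress.

    Term (1) is the poroelastic stress change, term (2) the pore-pressure
    effect; f = 0.6 is a commonly assumed friction coefficient.  Positive
    ΔCFS moves the fault toward failure."""
    poroelastic = d_tau_s + f * d_sigma_n   # term (1)
    pore_pressure = f * d_p                 # term (2)
    return poroelastic + pore_pressure

# Injection-side example (values in MPa, invented): small poroelastic
# changes plus a pore-pressure rise push the fault toward failure.
print(round(coulomb_stress_change(0.02, -0.01, 0.1), 3))  # → 0.074
```

Extraction would enter with a negative Δp, which is why it generally reduces the pore-pressure term even while poroelastic stressing can still raise ΔCFS on some faults.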

  4. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Full Text Available Abstract Background To search for chemical structures in research articles, diagrams or text representing molecules need to be translated into a standard chemical file format compatible with cheminformatic search engines. However, chemical information in research articles is often presented only as analog diagrams of chemical structures embedded in digital raster images. Several software systems have been developed to automate the analog-to-digital conversion of chemical structure diagrams in scientific articles, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results This paper provides a critical review of these systems and also reports our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams from research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be run independently, in sequence, from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms the other systems on several sets of sample images from diverse sources, in terms of the rate of correct outputs and the accuracy of extracting molecular substructure patterns. Conclusion The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links

  5. Faults

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Through the study of faults and their effects, much can be learned about the size and recurrence intervals of earthquakes. Faults also teach us about crustal...

  6. Automated extraction of knowledge for model-based diagnostics

    Science.gov (United States)

    Gonzalez, Avelino J.; Myler, Harley R.; Towhidnejad, Massood; Mckenzie, Frederic D.; Kladke, Robin R.

    1990-01-01

    The concept of accessing computer-aided design (CAD) databases and automatically extracting a process model is investigated as a possible source for the generation of knowledge bases for model-based reasoning systems. The resulting system, referred to as automated knowledge generation (AKG), uses an object-oriented programming structure and constraint techniques, as well as an internal database of component descriptions, to generate a frame-based structure that describes the model. The procedure is designed to be general enough to be easily coupled to CAD systems whose databases can provide label and connectivity data for the drawn system. The AKG system is capable of defining knowledge bases in the formats required by various model-based reasoning tools.

  7. A semi-automated protocol for Archaea DNA extraction from stools.

    OpenAIRE

    Khelaifia, Saber; Ramonet, Pierre-Yves; Bedotto Buffet, Marielle; Drancourt, Michel

    2013-01-01

    BACKGROUND: The PCR-based detection of archaea DNA in human specimens relies on efficient DNA extraction. We previously designed one such protocol involving only manual steps. In an effort to reduce the workload involved, we compared this manual protocol to semi-automated and automated protocols for archaea DNA extraction from human specimens. FINDINGS: We tested 110 human stool specimens using each protocol. An automated protocol using the EZ1 Advanced XL extractor with the V 1.066069118 Qia...

  8. Fault feature extraction of rolling bearing based on an improved cyclical spectrum density method

    Science.gov (United States)

    Li, Min; Yang, Jianhong; Wang, Xiaojing

    2015-11-01

    The traditional cyclical spectrum density (CSD) method is widely used to analyze fault signals of rolling bearings, but all modulation frequencies are demodulated in the cyclic frequency spectrum, which makes recognizing the bearing fault type difficult. Therefore, a new CSD method based on kurtosis (CSDK) is proposed. The kurtosis value of each cyclic frequency is used to measure the modulation capability of that cyclic frequency: the larger the kurtosis value, the stronger the modulation capability. The kurtosis values are therefore used as weight coefficients when accumulating all cyclic frequencies to extract fault features. Compared with the traditional method, CSDK reduces the interference of harmonic frequencies with the fault frequency, making fault characteristics distinct from the background noise. To validate the effectiveness of the method, experiments were performed on a simulated signal, the fault signal of a bearing outer race on a test bed, and a signal gathered from a bearing of a blast furnace belt cylinder. Experimental results show that CSDK outperforms both the resonance demodulation method and the traditional CSD in extracting fault features and recognizing degradation trends. The proposed method provides a new solution to fault diagnosis in bearings.
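The kurtosis-weighted accumulation idea can be sketched as follows, assuming a precomputed |CSD| magnitude matrix (computing the cyclic spectral density itself is omitted, and the synthetic rows below are invented stand-ins for measured slices):

```python
import numpy as np

def kurtosis(x):
    """Sample kurtosis (Pearson's definition; about 3.0 for a Gaussian)."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    s2 = ((x - m) ** 2).mean()
    return float(((x - m) ** 4).mean() / s2 ** 2)

def csdk(cyclic_spectrum):
    """Kurtosis-weighted accumulation over cyclic frequencies.

    Rows of `cyclic_spectrum` are |CSD| slices at each cyclic frequency α,
    columns are spectral frequencies f.  Impulsive (strongly modulated)
    rows get large kurtosis, hence large weights in the sum."""
    weights = np.array([kurtosis(row) for row in cyclic_spectrum])
    return weights @ cyclic_spectrum

rng = np.random.default_rng(0)
flat = rng.uniform(0.9, 1.1, size=(3, 64))     # weakly modulated slices
peaky = np.ones(64)
peaky[10] = 20.0                               # one impulsive, modulated slice
S = np.vstack([flat, peaky])

w = np.array([kurtosis(r) for r in S])
print(int(w.argmax()))        # the impulsive row gets the largest weight → 3
print(int(csdk(S).argmax()))  # accumulated spectrum peaks at the fault bin → 10
```

The weighting suppresses the flat (harmonic/noise-like) slices relative to the impulsive one, which is the mechanism the abstract credits for separating fault characteristics from background noise.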

  9. Automated Dsm Extraction from Uav Images and Performance Analysis

    Science.gov (United States)

    Rhee, S.; Kim, T.

    2015-08-01

    As technology evolves, unmanned aerial vehicle (UAV) imagery is being used for applications ranging from simple image acquisition to complicated ones such as 3D spatial information extraction. Spatial information is usually provided in the form of a DSM or a point cloud, so it is important to generate very dense tie points automatically from stereo images. In this paper, we apply a stereo matching technique developed for satellite/aerial images to UAV images, propose processing steps for automated DSM generation, and analyse the feasibility of DSM generation. For DSM generation from UAV images, firstly, the exterior orientation parameters (EOPs) for each dataset were adjusted. Secondly, optimum matching pairs were determined. Thirdly, stereo image matching was performed for each pair. The matching algorithm is based on grey-level correlation of pixels applied along epipolar lines. Finally, the match results were merged into a single result and the final DSM was produced. The generated DSM was compared with a reference DSM from Lidar; overall accuracy was 1.5 m in NMAD. However, several problems remain to be solved, including obtaining precise EOPs and handling occlusion and image blurring. More effective interpolation techniques also need to be developed.

  10. Application of Waveform Factors in Extracting Fault Trend of Rotary Machines

    Institute of Scientific and Technical Information of China (English)

    YE Yu-gang; ZUO Yun-bo; HUANG Xiao-bin

    2009-01-01

    Vibration intensity and non-dimensional amplitude parameters are often used to extract the fault trend of rotary machines. However, these are energy-related parameters and cannot describe the fault trend under varying loads and conditions, or when the change in the vibration signal is too slight. For this reason, three non-dimensional parameters are presented: the waveform repeatability factor, the waveform jumping factor, and the waveform similarity factor, jointly called waveform factors. They are based on statistical analysis of the waveform and are sensitive to changes in the signal waveform. When used as an instrumentation technique to extract the fault trend of rotary machines, they reflect the fault trend better than vibration intensity, peak amplitude, or the margin index.

  11. PCA Fault Feature Extraction in Complex Electric Power Systems

    OpenAIRE

    ZHANG, J.; Z. Wang; Zhang, Y.; J. MA

    2010-01-01

    Electric power systems are among the most complex artificial systems in the world, their complexity being determined by their constitution, configuration, operation, and organization, among other characteristics. Faults in electric power systems cannot be completely avoided. When an electric power system passes from the normal state to a faulty or abnormal one, its electric quantities (currents, voltages, angles, etc.) may change significantly. Our research indicates that the variable with the biggest coeffic...
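The PCA step the abstract alludes to (identifying the measured variable with the largest loading on the dominant principal component) can be sketched with synthetic data; the channel names and the fault scenario below are invented for illustration:

```python
import numpy as np

# Synthetic snapshot matrix: rows are time samples, columns are hypothetical
# electric quantities (bus voltage, line current, phase angle).  A "fault"
# is simulated as a large variance swing in the current channel.
rng = np.random.default_rng(1)
n = 200
voltage = 1.0 + 0.01 * rng.standard_normal(n)
current = 0.5 + 1.0 * rng.standard_normal(n)    # dominant variance: the fault
angle = 0.2 + 0.01 * rng.standard_normal(n)
X = np.column_stack([voltage, current, angle])

# PCA via eigendecomposition of the sample covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
pc1 = eigvecs[:, -1]                            # first principal component

# The variable with the largest |loading| on PC1 carries the fault signature.
print(int(np.argmax(np.abs(pc1))))  # → 1 (the current channel)
```

Because the faulted channel dominates the total variance, the first principal component aligns almost entirely with it, which is what makes its loading a usable fault feature.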

  12. Extraction of fault component from abnormal sound in diesel engines using acoustic signals

    Science.gov (United States)

    Dayong, Ning; Changle, Sun; Yongjun, Gong; Zengmeng, Zhang; Jiaoyi, Hou

    2016-06-01

    In this paper, a method for extracting fault components from abnormal acoustic signals and automatically diagnosing diesel engine faults is presented. The method, named the dislocation superimposed method (DSM), is based on the improved random decrement technique (IRDT), a differential function (DF), and correlation analysis (CA). The aim of DSM is to linearly superimpose multiple segments of the abnormal acoustic signal, exploiting the waveform similarity of the fault components. The sample point at which the abnormal sound first appears is used as the starting position of each segment. In this study, the abnormal sound was of the impact fault type; thus, a starting-position search method based on gradient variance was adopted. A coefficient of similarity between two equal-sized signals is presented; by comparing against this similarity measure, the extracted fault component can be judged automatically. The results show that the method is capable of accurately extracting the fault component from abnormal acoustic signals induced by impact-type faults, and that the extracted component can be used to identify the fault type.
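The abstract does not define its "coefficient of similar degree"; a plain Pearson correlation between two equal-length segments is one plausible reading, sketched here with an invented impact-like waveform:

```python
import numpy as np

def similarity_degree(x, y):
    """Similarity coefficient between two equal-length segments, taken here
    as the Pearson correlation (the paper's exact definition may differ)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

t = np.linspace(0.0, 1.0, 500)
burst = np.exp(-30 * t) * np.sin(2 * np.pi * 80 * t)   # decaying impact-like waveform
noisy = burst + 0.05 * np.sin(2 * np.pi * 7 * t)       # same component + interference

print(round(similarity_degree(burst, noisy), 2))
```

Two segments that contain the same impact waveform score near 1 even with added interference, which is the property the automatic judgment step relies on.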

  13. The Orion GN and C Data-Driven Flight Software Architecture for Automated Sequencing and Fault Recovery

    Science.gov (United States)

    King, Ellis; Hart, Jeremy; Odegard, Ryan

    2010-01-01

    The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system, and to mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented, which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation, and recovery interactions with the automation software is presented and discussed as a forward work item.

  14. Histogram analysis with automated extraction of brain-tissue region from whole-brain CT images

    OpenAIRE

    Kondo, Masatoshi; Yamashita, Koji; Yoshiura, Takashi; Hiwatash, Akio; Shirasaka, Takashi; Arimura, Hisao; Nakamura, Yasuhiko; Honda, Hiroshi

    2015-01-01

    We studied whether automated extraction of the brain-tissue region from CT images is useful for histogram analysis of the brain-tissue region. We used the CT images of 11 patients and developed an automatic brain-tissue extraction algorithm. We evaluated the similarity index of this automated extraction method relative to manual extraction, and we compared the mean CT number of all extracted pixels and the kurtosis and skewness of the distribution of CT numbers of all ext...

  15. Evaluation of Four Automated Protocols for Extraction of DNA from FTA Cards

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura;

    2013-01-01

    Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction … from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore, we demonstrated that it was possible to successfully extract sufficient DNA for STR profiling from previously processed FTA card pieces that had been stored at 4 °C for up to 1 year. This showed that rare or precious FTA card samples may be saved for future analyses even though some DNA was already …

  16. A COMPARISON OF AUTOMATED AND TRADITIONAL METHODS FOR THE EXTRACTION OF ARSENICALS FROM FISH

    Science.gov (United States)

    An automated extractor employing accelerated solvent extraction (ASE) has been compared with a traditional sonication method of extraction for the extraction of arsenicals from fish tissue. Four different species of fish and a standard reference material, DORM-2, were subjected t...

  17. AUTOMATED SOLID PHASE EXTRACTION GC/MS FOR ANALYSIS OF SEMIVOLATILES IN WATER AND SEDIMENTS

    Science.gov (United States)

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line sampl...

  18. Auditory-model-based Feature Extraction Method for Mechanical Faults Diagnosis

    Institute of Scientific and Technical Information of China (English)

    LI Yungong; ZHANG Jinping; DAI Li; ZHANG Zhanyi; LIU Jie

    2010-01-01

    It is well known that the human auditory system possesses remarkable capabilities to analyze and identify signals. It would therefore be significant to build an auditory model based on the mechanisms of the human auditory system, which may improve mechanical signal analysis and enrich the methods of mechanical fault feature extraction. However, the existing methods are all based on explicit mathematical or physical formulations and have shortcomings in distinguishing different faults, in stability, and in suppressing disturbance noise. To improve feature extraction performance, an auditory model, the early auditory (EA) model, is introduced here for the first time. This model transforms a time-domain signal into an auditory spectrum via bandpass filtering, nonlinear compression, and lateral inhibition, simulating the principles of the human auditory system. The EA model is built with a Gammatone filterbank as the basilar membrane. According to the characteristics of vibration signals, a method is proposed for determining the parameters of the EA model's inner-hair-cell stage. The performance of the EA model is evaluated through experiments on four rotor faults: misalignment, rotor-to-stator rubbing, oil-film whirl, and pedestal looseness. The results show that the auditory spectrum, the output of the EA model, can effectively distinguish different faults with satisfactory stability and can suppress disturbance noise. It is therefore feasible to apply the auditory model, as a new method, to feature extraction for mechanical fault diagnosis.
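As a rough illustration of the bandpass → compress → inhibit pipeline, consider the toy sketch below. It is not the paper's Gammatone-based EA model: the FFT band split, cube-root compression, and first-difference "inhibition" are simplifying assumptions:

```python
import numpy as np

def early_auditory_spectrum(signal, n_bands=16):
    """Toy EA-style pipeline: band energies via an FFT split, then
    nonlinear compression, then a crude lateral-inhibition step."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    edges = np.linspace(0, len(spectrum), n_bands + 1, dtype=int)
    band_energy = np.array([spectrum[a:b].sum()
                            for a, b in zip(edges[:-1], edges[1:])])
    compressed = np.cbrt(band_energy)              # nonlinear compression
    inhibited = np.diff(compressed, prepend=0.0)   # difference across channels
    return np.maximum(inhibited, 0.0)              # half-wave rectification

fs = 8000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1000 * t)                # 1 kHz test tone
out = early_auditory_spectrum(tone)
print(int(out.argmax()))                           # band containing the tone
```

Even this crude version sharpens the spectral peak: the inhibition step keeps only the rising edge of the band energy profile, suppressing broadband background relative to the tonal component.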

  19. Performance and Analysis of the Automated Semantic Object and Spatial Relationships Extraction in Traffic Images

    OpenAIRE

    Wang Hui Hui

    2013-01-01

    Extraction and representation of the semantics of spatial relations among objects is important, as it conveys key information about an image and increases confidence in image understanding, which in turn enables richer querying and retrieval facilities. This paper discusses the performance of the proposed automated extraction of semantic information about object spatial relationships. Experiments have been conducted to demonstrate that the proposed automated object spatial relations...

  20. Automated extraction protocol for quantification of SARS-Coronavirus RNA in serum: an evaluation study

    OpenAIRE

    Lui Wing-bong; Chung Grace TY; Jin Yongjie; Chiu Rossa WK; Chan Anthony TC; Lim Wilina; Dennis Lo YM

    2006-01-01

    Abstract Background We have previously developed a test for the diagnosis and prognostic assessment of the severe acute respiratory syndrome (SARS) based on the detection of the SARS-coronavirus RNA in serum by real-time quantitative reverse transcriptase polymerase chain reaction (RT-PCR). In this study, we evaluated the feasibility of automating the serum RNA extraction procedure in order to increase the throughput of the assay. Methods An automated nucleic acid extraction platform using th...

  1. Extraction Error Modeling and Automated Model Debugging in High-Performance Low Power Custom Designs

    OpenAIRE

    Yang, Yu-Shen; Veneris, Andreas; Thadikaran, Paul; Venkataraman, Srikanth

    2005-01-01

    Test model generation is common in the design cycle of custom-made, high-performance, low-power designs targeted for high-volume production. Logic extraction, a key step in test model generation, produces a logic-level netlist from the transistor-level representation. This semi-automated process is error prone. This paper analyzes typical extraction errors applicable to clocking schemes seen in high-performance designs today. An automated debugging solution for these errors in des...

  2. A fully automated liquid–liquid extraction system utilizing interface detection

    OpenAIRE

    Jeffrey Pan; Robert Schmitt; Eugene Maslana

    2000-01-01

    The development of the Abbott Liquid-Liquid Extraction Station was a result of the need for an automated system to perform aqueous extraction on large sets of newly synthesized organic compounds used for drug discovery. The system utilizes a cylindrical laboratory robot to shuttle sample vials between two loading racks, two identical extraction stations, and a centrifuge. Extraction is performed by detecting the phase interface (by difference in refractive index) of the moving column of fluid...

  3. Reliable Fault Classification of Induction Motors Using Texture Feature Extraction and a Multiclass Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Jia Uddin

    2014-01-01

    This paper proposes a method for reliable fault detection and classification in induction motors using two-dimensional (2D) texture features and a multiclass support vector machine (MCSVM). The proposed model first converts time-domain vibration signals into 2D gray images whose texture exhibits repetitive patterns, and extracts texture features by generating the dominant neighborhood structure (DNS) map. Principal component analysis (PCA) is then used to reduce the dimensionality of the feature vector containing the extracted texture features, since a high-dimensional feature vector can degrade classification performance; this yields an effective feature vector of discriminative fault features for diagnosis. Finally, the proposed approach uses one-against-all (OAA) multiclass support vector machines (MCSVMs) to identify induction motor failures, with a Gaussian radial basis function kernel to handle nonlinear fault features. Experimental results demonstrate that the proposed approach outperforms three state-of-the-art fault diagnosis algorithms in fault classification accuracy, yielding an average classification accuracy of 100% even in noisy environments.
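The first two stages the abstract outlines, converting a time-domain signal into a 2D gray image and reducing feature dimensionality with PCA, can be sketched in a few lines of NumPy. The image size and the SVD-based PCA are illustrative choices; the DNS texture map and the OAA MCSVM classifier are not reproduced here:

```python
import numpy as np

def signal_to_gray_image(x, size=64):
    """Reshape the first size*size samples of a vibration signal into an
    8-bit gray image whose texture reflects the signal's periodicities."""
    img = np.asarray(x[: size * size], dtype=float).reshape(size, size)
    img = (img - img.min()) / (img.max() - img.min() + 1e-12)  # scale to [0, 1]
    return np.round(255.0 * img).astype(np.uint8)

def pca_reduce(features, n_components):
    """Project row-wise feature vectors onto their top principal components."""
    centered = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T
```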

  4. Automated Fault Diagnostics, Prognostics, and Recovery in Spacecraft Power Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault detection and isolation (FDI) in spacecraft's electrical power system (EPS) has always received special attention. However, the power systems health...

  5. Manifold Learning with Self-Organizing Mapping for Feature Extraction of Nonlinear Faults in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    Lin Liang

    2015-01-01

    A new method for automatically extracting low-dimensional features with a self-organizing mapping manifold is proposed for detecting nonlinear faults in rotating machinery (such as rubbing and pedestal looseness). In the phase space reconstructed from a single vibration signal, a self-organizing map (SOM) with an expectation-maximization iteration algorithm is used to divide the local neighborhoods adaptively, without manual intervention. The local tangent space alignment algorithm is then adopted to compress the high-dimensional phase space into a low-dimensional feature space. The proposed method combines the strengths of manifold learning in low-dimensional feature extraction with the adaptive neighborhood construction of the SOM, and can extract intrinsic fault features of interest in a two-dimensional projection space. To evaluate its performance, the Lorenz system was simulated and data from rotating machinery with nonlinear faults were acquired for testing. Compared with holospectrum approaches, the results show that the proposed method is superior in identifying faults and effective for condition monitoring of rotating machinery.

  6. A Feature Extraction Method Based on Information Theory for Fault Diagnosis of Reciprocating Machinery

    Directory of Open Access Journals (Sweden)

    Huaqing Wang

    2009-04-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. Symptom parameter waves are obtained in the time domain from the vibration signals, and an information wave is derived from them using information theory. A new way of determining the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. The proposed method is compared with conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical diagnosis examples for a rolling element bearing used in a diesel engine verify the effectiveness of the proposed method: the faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while they are difficult to detect with either of the other two techniques.

  7. Automated serial extraction of DNA and RNA from biobanked tissue specimens

    OpenAIRE

    Mathot, Lucy; Wallin, Monica; Sjöblom, Tobias

    2013-01-01

    Background: With increasing biobanking of biological samples, methods for large scale extraction of nucleic acids are in demand. The lack of such techniques designed for extraction from tissues results in a bottleneck in downstream genetic analyses, particularly in the field of cancer research. We have developed an automated procedure for tissue homogenization and extraction of DNA and RNA into separate fractions from the same frozen tissue specimen. A purpose developed magnetic bead based te...

  8. Automated Information Extraction of Key Trial Design Elements from Clinical Trial Publications

    OpenAIRE

    de Bruijn, Berry; Carini, Simona; Kiritchenko, Svetlana; Martin, Joel; Sim, Ida

    2008-01-01

    Clinical trials are one of the most valuable sources of scientific evidence for improving the practice of medicine. The Trial Bank project aims to improve structured access to trial findings by including formalized trial information into a knowledge base. Manually extracting trial information from published articles is costly, but automated information extraction techniques can assist. The current study highlights a single architecture to extract a wide array of information elements from full...

  9. Extracting invariable fault features of rotating machines with multi-ICA networks

    Institute of Scientific and Technical Information of China (English)

    焦卫东; 杨世锡; 吴昭同

    2003-01-01

    This paper proposes novel multi-layer neural networks based on independent component analysis (ICA) for extracting fault-mode features. Using ICA, invariant features embedded in multi-channel vibration measurements under different operating conditions (rotating speed and/or load) can be captured together. Stable MLP classifiers insensitive to variations in operating conditions are thus constructed. The successful results of selected experiments indicate the great potential of ICA in health condition monitoring of rotating machines.
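A deflationary FastICA iteration with a tanh nonlinearity is one common way to implement the ICA step such networks rely on; the sketch below is a generic textbook version, not the authors' multi-layer architecture:

```python
import numpy as np

def fast_ica(X, n_components, max_iter=200, tol=1e-6, seed=0):
    """Minimal deflationary FastICA with a tanh nonlinearity.

    X holds one mixed signal per row; returns the estimated independent
    components (up to sign and scale)."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    Z = E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ X      # whitened data
    rng = np.random.default_rng(seed)
    W = np.zeros((n_components, Z.shape[0]))
    for i in range(n_components):
        w = rng.normal(size=Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(max_iter):
            wx = w @ Z
            g = np.tanh(wx)
            # FastICA fixed-point update for the logcosh contrast
            w_new = (Z * g).mean(axis=1) - (1.0 - g ** 2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)       # decorrelate from found rows
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < tol
            w = w_new
            if converged:
                break
        W[i] = w
    return W @ Z
```

On a two-channel mixture of a sine and a sawtooth, the recovered components closely match the sources up to sign and scale.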

  10. Manifold Learning with Self-Organizing Mapping for Feature Extraction of Nonlinear Faults in Rotating Machinery

    OpenAIRE

    Lin Liang; Fei Liu; Maolin Li; Guanghua Xu

    2015-01-01

    A new method for extracting the low-dimensional feature automatically with self-organization mapping manifold is proposed for the detection of rotating mechanical nonlinear faults (such as rubbing, pedestal looseness). Under the phase space reconstructed by single vibration signal, the self-organization mapping (SOM) with expectation maximization iteration algorithm is used to divide the local neighborhoods adaptively without manual intervention. After that, the local tangent space alignment ...

  11. Acoustic diagnosis of mechanical fault feature based on reference signal frequency domain semi-blind extraction

    OpenAIRE

    Zeguang YI; Pan, Nan; Liu, Feng

    2015-01-01

    To address fault diagnosis problems caused by complex machinery parts, severe background noise, and the limitations of traditional blind signal processing algorithms when applied to mechanical acoustic signals, an acoustic fault diagnosis method based on reference-signal frequency-domain semi-blind extraction is proposed. Key technologies are introduced: based on a frequency-domain blind deconvolution algorithm, the artificial fish swarm algorithm, which is well suited to global optimization, is ...

  12. Spatial resolution requirements for automated cartographic road extraction

    Science.gov (United States)

    Benjamin, S.; Gaydos, L.

    1990-01-01

    Ground resolution requirements for detection and extraction of road locations in a digitized large-scale photographic database were investigated. A color infrared photograph of Sunnyvale, California was scanned, registered to a map grid, and spatially degraded to 1- to 5-metre resolution pixels. Road locations in each data set were extracted using a combination of image processing and CAD programs. These locations were compared to a photointerpretation of road locations to determine a preferred pixel size for the extraction method. Based on road pixel omission error computations, a 3-metre pixel resolution appears to be the best choice for this extraction method. -Authors
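The spatial degradation step used in such resolution studies can be approximated by block-averaging pixels; a minimal sketch, assuming square, non-overlapping blocks:

```python
import numpy as np

def degrade_resolution(image, factor):
    """Simulate coarser pixels by averaging non-overlapping factor x factor blocks."""
    h, w = image.shape
    h, w = h - h % factor, w - w % factor            # crop to whole blocks
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))
```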

  13. Improving Access to Archival Collections with Automated Entity Extraction

    OpenAIRE

    Kyle Banerjee; Max Johnson

    2015-01-01

    The complexity and diversity of archival resources make constructing rich metadata records time consuming and expensive, which in turn limits access to these valuable materials. However, significant automation of the metadata creation process would dramatically reduce the cost of providing access points, improve access to individual resources, and establish connections between resources that would otherwise remain unknown. Using a case study at Oregon Health & Science University as a len...

  14. RFI detection by automated feature extraction and statistical analysis

    OpenAIRE

    Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan

    2006-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorit...

  15. Automated serial extraction of DNA and RNA from biobanked tissue specimens

    Science.gov (United States)

    2013-01-01

    Background With increasing biobanking of biological samples, methods for large scale extraction of nucleic acids are in demand. The lack of such techniques designed for extraction from tissues results in a bottleneck in downstream genetic analyses, particularly in the field of cancer research. We have developed an automated procedure for tissue homogenization and extraction of DNA and RNA into separate fractions from the same frozen tissue specimen. A purpose-developed, magnetic-bead-based technology to serially extract both DNA and RNA from tissues was automated on a Tecan Freedom Evo robotic workstation. Results 864 fresh-frozen human normal and tumor tissue samples from breast and colon were serially extracted in batches of 96 samples. Yields and quality of DNA and RNA were determined. The DNA was evaluated in several downstream analyses, and the stability of RNA was determined after 9 months of storage. The extracted DNA performed consistently well in processes including PCR-based STR analysis, HaloPlex selection and deep sequencing on an Illumina platform, and gene copy number analysis using microarrays. The RNA has performed well in RT-PCR analyses and maintains integrity upon storage. Conclusions The technology described here enables the processing of many tissue samples simultaneously with a high quality product and a time and cost reduction for the user. This reduces the sample preparation bottleneck in cancer research. The open automation format also enables integration with upstream and downstream devices for automated sample quantitation or storage. PMID:23957867

  16. Automated Algorithm for Extraction of Wetlands from IRS Resourcesat Liss III Data

    Science.gov (United States)

    Subramaniam, S.; Saxena, M.

    2011-09-01

    Wetlands play a significant role in maintaining the ecological balance of both biotic and abiotic life in coastal and inland environments. Understanding their occurrence and the spatial extent of change in wetland environments is therefore very important, and both can be monitored using satellite remote sensing. The extraction of wetland features from remote sensing data has so far been carried out with visual or hybrid digital analysis techniques, which are time consuming. To monitor wetlands and their features at the national or state level, an automated technique for extracting wetland features is needed. A knowledge-based algorithm using a hierarchical decision tree approach has been developed for the automated extraction of wetland features such as surface water spread, wet area, turbidity, and wet vegetation (including aquatic vegetation) for the pre- and post-monsoon periods. The results obtained for Chhattisgarh, India using the automated technique were found to be satisfactory when compared with hybrid digital/visual analysis.
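A hierarchical decision tree over per-pixel spectral indices might look like the following sketch. The indices (NDWI, NDVI, a turbidity index) and every threshold here are hypothetical placeholders, since the abstract does not state the actual rules:

```python
def classify_wetland_pixel(ndwi, ndvi, turbidity_index):
    """Hierarchical threshold rules for one pixel.

    All class names and thresholds are illustrative assumptions, not the
    paper's knowledge base."""
    if ndwi > 0.3:                                   # strong water signal
        return "turbid water" if turbidity_index > 0.5 else "clear water"
    if ndwi > 0.0:                                   # moist surface
        return "wet vegetation" if ndvi > 0.3 else "wet area"
    return "non-wetland"
```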

  17. Automation System in Rare Earths Countercurrent Extraction Processes

    Institute of Scientific and Technical Information of China (English)

    贾江涛; 严纯华; 廖春生; 吴声; 王明文; 李标国

    2001-01-01

    Based on countercurrent extraction theory for the optimized design and simulation of rare earth separation processes, the selection of detection points (stages) with on-line elemental analysis, the simulation of open-loop response and its response speed, and the diagnosis of and regulatory prescriptions for running solvent extraction cascades were studied.

  18. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications.

    Science.gov (United States)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik; Rand, Kasper D; Honoré Hansen, Steen; Petersen, Nickolaj Jacob

    2016-07-01

    The current work describes the implementation of electro membrane extraction (EME) in an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to an HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to an LC-MS system for further analysis. Various parameters affecting extraction efficacy were investigated, including syringe fill strokes, syringe pull-up volume, pull-up delay, and the volume in the sample vial. The system was optimized for soft extraction of analytes and high sample throughput. It was further demonstrated that by flushing the EME syringe with acidic wash buffer and reversing the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD), and owing to the high extraction speed of EME, a complete analytical workflow of purification, separation, and analysis could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME autosampler a powerful tool for a wide range of applications where high-throughput extractions are required before sample analysis. PMID:27237618

  19. Automated Training Sample Extraction for Global Land Cover Mapping

    OpenAIRE

    Julien Radoux; Céline Lamarche; Eric Van Bogaert; Sophie Bontemps; Carsten Brockmann; Pierre Defourny

    2014-01-01

    Land cover is one of the essential climate variables of the ESA Climate Change Initiative (CCI). In this context, the Land Cover CCI (LC CCI) project aims at building global land cover maps suitable for climate modeling based on Earth observation by satellite sensors. The challenge is to generate a set of successive maps that are both accurate and consistent over time. To do so, operational methods for the automated classification of optical images are investigated. The pr...

  20. Fault feature extraction and enhancement of rolling element bearing in varying speed condition

    Science.gov (United States)

    Ming, A. B.; Zhang, W.; Qin, Z. Y.; Chu, F. L.

    2016-08-01

    In engineering applications, variable load usually varies the shaft speed, which degrades the efficacy of diagnostic methods based on the assumption of constant speed. Investigating diagnostic methods suitable for varying speed conditions is therefore significant for bearing fault diagnosis. In this context, a novel fault feature extraction and enhancement procedure is proposed that combines iterative envelope analysis with a low-pass filtering operation. First, based on an analytical model of the collected vibration signal, the envelope signal is theoretically calculated and the iterative envelope analysis is improved for the varying speed condition. Then, a feature enhancement procedure is performed by applying a low-pass filter to the temporal envelope obtained by the iterative envelope analysis. Finally, the temporal envelope signal is transformed to the angular domain by computed order tracking, and the fault feature is extracted from the squared envelope spectrum. Simulations and experiments were used to validate the theoretical analysis and the proposed procedure. It is shown that the computed order tracking method should be applied to the envelope of the signal in order to avoid energy spreading and amplitude distortion. Compared with feature enhancement by the fast kurtogram and the corresponding optimal band-pass filtering, the proposed method can efficiently extract the fault character under varying speed with less amplitude attenuation. Furthermore, since it does not involve center frequency estimation, the proposed method is more concise for engineering applications.
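The constant-speed core of envelope analysis, an FFT-based Hilbert envelope followed by a squared envelope spectrum, can be sketched as below. The iterative envelope analysis, low-pass enhancement, and computed order tracking of the paper are not reproduced; this is the standard baseline they build on:

```python
import numpy as np

def envelope(x):
    """Envelope via the analytic signal (FFT-based Hilbert transform)."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0                          # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.abs(np.fft.ifft(np.fft.fft(x) * h))

def squared_envelope_spectrum(x, fs):
    """Amplitude spectrum of the squared envelope (DC removed); a bearing
    fault appears as a peak at its characteristic fault frequency."""
    se = envelope(x) ** 2
    se -= se.mean()
    return np.fft.rfftfreq(len(x), 1.0 / fs), np.abs(np.fft.rfft(se))
```

For an amplitude-modulated carrier, the spectrum peaks at the modulation frequency.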

  1. Automated microfluidic DNA/RNA extraction with both disposable and reusable components

    International Nuclear Information System (INIS)

    An automated microfluidic nucleic acid extraction system was fabricated with a multilayer polydimethylsiloxane (PDMS) structure that consists of sample wells, microvalves, a micropump and a disposable microfluidic silica cartridge. Both the microvalves and micropump structures were fabricated in a single layer and are operated pneumatically using a 100 µm PDMS membrane. To fabricate the disposable microfluidic silica cartridge, two-cavity structures were made in a PDMS replica to fit the stacked silica membranes. A handheld controller for the microvalves and pumps was developed to enable system automation. With purified ribonucleic acid (RNA), whole blood and E. coli samples, the automated microfluidic nucleic acid extraction system was validated with a guanidine-based solid phase extraction procedure. An extraction efficiency of ∼90% for deoxyribonucleic acid (DNA) and ∼54% for RNA was obtained in 12 min from whole blood and E. coli samples, respectively. In addition, the same quantity and quality of extracted DNA was confirmed by polymerase chain reaction (PCR) amplification. The PCR also presented the appropriate amplification and melting profiles. Automated, programmable fluid control and physical separation of the reusable components and the disposable components significantly decrease the assay time and manufacturing cost and increase the flexibility and compatibility of the system with downstream components

  2. Automated microfluidic DNA/RNA extraction with both disposable and reusable components

    Science.gov (United States)

    Kim, Jungkyu; Johnson, Michael; Hill, Parker; Sonkul, Rahul S.; Kim, Jongwon; Gale, Bruce K.

    2012-01-01

    An automated microfluidic nucleic acid extraction system was fabricated with a multilayer polydimethylsiloxane (PDMS) structure that consists of sample wells, microvalves, a micropump and a disposable microfluidic silica cartridge. Both the microvalves and micropump structures were fabricated in a single layer and are operated pneumatically using a 100 µm PDMS membrane. To fabricate the disposable microfluidic silica cartridge, two-cavity structures were made in a PDMS replica to fit the stacked silica membranes. A handheld controller for the microvalves and pumps was developed to enable system automation. With purified ribonucleic acid (RNA), whole blood and E. coli samples, the automated microfluidic nucleic acid extraction system was validated with a guanidine-based solid phase extraction procedure. An extraction efficiency of ~90% for deoxyribonucleic acid (DNA) and ~54% for RNA was obtained in 12 min from whole blood and E. coli samples, respectively. In addition, the same quantity and quality of extracted DNA was confirmed by polymerase chain reaction (PCR) amplification. The PCR also presented the appropriate amplification and melting profiles. Automated, programmable fluid control and physical separation of the reusable components and the disposable components significantly decrease the assay time and manufacturing cost and increase the flexibility and compatibility of the system with downstream components.

  3. Feature Extraction and Selection Strategies for Automated Target Recognition

    Science.gov (United States)

    Greene, W. Nicholas; Zhang, Yuhan; Lu, Thomas T.; Chao, Tien-Hsin

    2010-01-01

    Several feature extraction and selection methods for an existing automatic target recognition (ATR) system using JPL's Grayscale Optical Correlator (GOC) and Optimal Trade-Off Maximum Average Correlation Height (OT-MACH) filter were tested using MATLAB. The ATR system is composed of three stages: a cursory region-of-interest (ROI) search using the GOC and OT-MACH filter, a feature extraction and selection stage, and a final classification stage. Feature extraction and selection transform potential target data into more useful forms and select important subsets of that data which may aid in detection and classification. The strategies tested were built around two popular extraction methods: principal component analysis (PCA) and independent component analysis (ICA). Performance was measured by the classification accuracy and free-response receiver operating characteristic (FROC) output of a support vector machine (SVM) and a neural network (NN) classifier.

  4. Feature Extraction Using Discrete Wavelet Transform for Gear Fault Diagnosis of Wind Turbine Gearbox

    Directory of Open Access Journals (Sweden)

    Rusmir Bajric

    2016-01-01

    Vibration diagnosis is one of the most common techniques for evaluating the condition of wind turbines equipped with gearboxes, and the gearbox is one of the key components of a wind turbine drivetrain. Owing to the stochastic operation of wind turbines, the gearbox shaft rotating speed varies over a wide range, which limits the application of traditional vibration signal processing techniques such as the fast Fourier transform. This paper investigates a new approach to fault diagnosis of high-speed-shaft gears in wind turbines using the discrete wavelet transform and time synchronous averaging (TSA). First, the vibration signals are decomposed into a series of subband signals using the multiresolution analysis property of the discrete wavelet transform. Then, 22 condition indicators are extracted from the TSA signal, the residual signal, and the difference signal. Through a case study, the approach reveals the most relevant vibration-based condition indicators for diagnosing high-speed-shaft gear spalling faults and their ability to track fault degradation. It is also shown that the proposed approach enhances gearbox fault diagnosis in wind turbines. The approach was programmed in the MATLAB environment using data acquired on a 2 MW wind turbine.
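The wavelet decomposition and condition-indicator steps can be illustrated with a hand-rolled Haar DWT. The paper's wavelet choice and full set of 22 indicators are not specified in the abstract; RMS, kurtosis, and crest factor below are common examples:

```python
import numpy as np

def haar_dwt(x):
    """One Haar DWT level: returns (approximation, detail) coefficients."""
    x = x[: len(x) - len(x) % 2]                     # drop a trailing odd sample
    return (x[0::2] + x[1::2]) / np.sqrt(2.0), (x[0::2] - x[1::2]) / np.sqrt(2.0)

def wavedec(x, levels):
    """Multilevel Haar decomposition, returned as [aN, dN, ..., d1]."""
    a = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        a, d = haar_dwt(a)
        details.append(d)
    return [a] + details[::-1]

def condition_indicators(band):
    """Three common vibration condition indicators for one subband."""
    rms = np.sqrt(np.mean(band ** 2))
    kurtosis = np.mean((band - band.mean()) ** 4) / (band.var() ** 2 + 1e-12)
    crest = np.max(np.abs(band)) / (rms + 1e-12)
    return rms, kurtosis, crest
```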

  5. Automated spectral extraction for high multiplexing MOS and IFU observations

    CERN Document Server

    Scodeggio, M; Garilli, B; Lefèvre, O; Vettolani, G

    2001-01-01

    VIMOS' main distinguishing characteristic is its very high multiplex capability: in MOS mode up to 800 spectra can be acquired simultaneously, while the Integral Field Unit produces 6400 spectra for integral field spectroscopy of an area approximately 1x1 arcmin in size. To successfully exploit the capabilities of such an instrument, the analysis of the very large volume of data it produces must be expedited as much as possible, automating the basic data reduction and the related bookkeeping almost completely. The VIMOS Data Reduction Software (DRS) has been designed specifically to satisfy these two requirements. Complete automation is achieved using a series of auxiliary tables that store all the input information needed by the data reduction procedures and all the output information they produce. We expect to achieve a satisfactory data reduction for more than 90% of the input spectra, while some level of human intervention might be required for a small fraction of th...

  6. Automated rapid finite fault inversion for megathrust earthquakes: Application to the Maule (2010), Iquique (2014) and Illapel (2015) great earthquakes

    Science.gov (United States)

    Benavente, Roberto; Cummins, Phil; Dettmer, Jan

    2016-04-01

    Rapid estimation of the spatial and temporal rupture characteristics of large megathrust earthquakes by finite fault inversion is important for disaster mitigation. For example, estimates of the spatio-temporal evolution of rupture can be used to evaluate population exposure to tsunami waves and ground shaking soon after an event, providing more accurate predictions than point source approximations. In addition, rapid inversion results can reveal seismic source complexity and guide more detailed subsequent studies. This work develops a method to rapidly estimate the slip distribution of megathrust events while reducing subjective parameter choices through automation. The method is simple yet robust, and we show that it provides excellent preliminary rupture models as soon as 30 minutes after three great earthquakes in the South American subduction zone. Timing may differ somewhat in other regions depending on seismic station coverage, but the method can be applied to any subduction zone. The inversion is based on W-phase data, which are rapidly and widely available and of low amplitude, avoiding clipping at close stations for large events. In addition, prior knowledge of the slab geometry (e.g., SLAB 1.0) is applied, and rapid W-phase point source information (time delay and centroid location) is used to constrain the fault geometry and extent. Since linearization by the multiple time window (MTW) parametrization requires regularization, objective smoothing is achieved with the discrepancy principle in two fully automated steps: first, the residuals are estimated assuming unknown noise levels; second, a solution is sought that fits the data to the noise level. The MTW scheme is applied with positivity constraints, and a solution is obtained with an efficient non-negative least squares solver. Systematic application of the algorithm to the Maule (2010), Iquique (2014) and Illapel (2015) events illustrates that rapid finite fault inversion with
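The regularized non-negative inversion the abstract describes can be sketched with SciPy's NNLS solver and a simple discrepancy-principle sweep over the smoothing weight. The matrices, the first-difference smoother, and the lambda grid below are illustrative, not the authors' W-phase setup:

```python
import numpy as np
from scipy.optimize import nnls

def regularized_slip_inversion(G, d, L, noise_level, lambdas):
    """Non-negative MTW-style inversion, smoothing weight picked by the
    discrepancy principle: keep the strongest smoothing whose data misfit
    still stays at (or below) the assumed noise level."""
    best = None
    for lam in sorted(lambdas):                      # weakest smoothing first
        A = np.vstack([G, lam * L])                  # augmented least-squares system
        b = np.concatenate([d, np.zeros(L.shape[0])])
        m, _ = nnls(A, b)                            # enforces m >= 0
        if np.linalg.norm(G @ m - d) <= noise_level:
            best = m
        else:
            break                                    # misfit now exceeds noise level
    return best
```

The returned model is non-negative by construction and fits the data no worse than the assumed noise level.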

  7. Instrumentation and monitoring for pillar extraction in a deep, faulted uranium mine

    International Nuclear Information System (INIS)

    A rock mechanics instrumentation and monitoring program was implemented during pillar extraction at Gulf Mineral Resources' Mt. Taylor Mine, the deepest uranium mine in the U.S. Three types of monitoring were employed: (1) drift convergence around stopes (using a portable tube extensometer), (2) stress changes in pillars (using vibrating wire stressmeters in horizontal boreholes), and (3) load changes in haulageways under stopes (using vibrating wire load cells in jack stands). Results include convergence-time graphs, convergence contour maps, stress-time graphs, a stress increase contour map, load-time graphs, and abutment load limit maps. Major factors influencing results include (1) the size of pillars around the extraction area, and (2) the presence and orientation of a major fault adjacent to one extraction area. These factors, combined with pillar extraction, produced evidence of load transfer over 120 m from a stope measuring only 55 m x 45 m. Vertical stress increases of 7 to 17 MPa (1000 to 2500 psi) within 60 m of the stope were obtained using the stressmeter manufacturer's method. The fault apparently prevented load transfer in the down-dip direction but increased load transfer in the up-dip direction. Convergence rates of up to 1.3 cm/day (0.5 in/day) were measured at two stations. Instruments were generally reliable despite adverse underground conditions

  8. Automated pattern extraction and three-dimensional construction

    International Nuclear Information System (INIS)

    Computed tomography equipment provides cross-sectional images of human bodies. From these two-dimensional images, doctors can extract useful information for diagnosis. The feasibility of contour extraction and three-dimensional construction of affected regions (dilated ventricles) has been investigated on the basis of brain CT images. The perspectives of the extracted three-dimensional objects are dynamically displayed on a three-dimensional graphic display system, and the following conclusions have been obtained. (1) Three-dimensional images provide helpful information for recognizing the shapes of affected regions of a brain. (2) The volume and cross-sectional area of the object of interest are available from this system. In particular, the volumetric ratio of dilated ventricles to cranium has the possibility of becoming a better measure than the conventional one. (author)

  9. Extraction of Prostatic Lumina and Automated Recognition for Prostatic Calculus Image Using PCA-SVM

    OpenAIRE

    D. Joshua Liao; Yusheng Huang; Xiaofen Xing; Hua Wang; Jian Liu; Hui Xiao; Zhuocai Wang; Xiaojun Ding; Xiangmin Xu

    2011-01-01

    Identification of prostatic calculi is an important basis for determining the tissue origin. Computation-assisted diagnosis of prostatic calculi may have promising potential but is currently still little studied. We studied the extraction of prostatic lumina and automated recognition of calculus images. Extraction of lumina from prostate histology images was based on local entropy and Otsu thresholding; recognition used PCA-SVM based on the texture features of prostatic calculi. The SVM cla...

  10. Automated Boundary-Extraction and Region-Growing Techniques Applied to Solar Magnetograms

    Science.gov (United States)

    McAteer, R. T. James; Gallagher, Peter; Ireland, Jack; Young, C. Alex

    2005-01-01

    We present an automated approach to active region extraction from full disc MDI longitudinal magnetograms. This uses a region-growing technique in conjunction with boundary extraction to define a number of enclosed contours as belonging to separate regions of magnetic significance on the solar disc. This provides an objective definition of active regions and areas of plage on the Sun. A number of parameters relating to the flare potential of each region are discussed.

  11. Natural Environment Modeling and Fault-Diagnosis for Automated Agricultural Vehicle

    DEFF Research Database (Denmark)

    Blas, Morten Rufus; Blanke, Mogens

    2008-01-01

    This paper presents results for an automatic navigation system for agricultural vehicles. The system uses stereo-vision, inertial sensors and GPS. Special emphasis has been placed on modeling the natural environment in conjunction with a fault-tolerant navigation system. The results are exemplified...

  12. Discovering Indicators of Successful Collaboration Using Tense: Automated Extraction of Patterns in Discourse

    Science.gov (United States)

    Thompson, Kate; Kennedy-Clark, Shannon; Wheeler, Penny; Kelly, Nick

    2014-01-01

    This paper describes a technique for locating indicators of success within the data collected from complex learning environments, proposing an application of e-research to access learner processes and measure and track group progress. The technique combines automated extraction of tense and modality via parts-of-speech tagging with a visualisation…

  13. Automated extraction protocol for quantification of SARS-Coronavirus RNA in serum: an evaluation study

    Directory of Open Access Journals (Sweden)

    Lui Wing-bong

    2006-02-01

    Background: We have previously developed a test for the diagnosis and prognostic assessment of the severe acute respiratory syndrome (SARS) based on the detection of the SARS-coronavirus RNA in serum by real-time quantitative reverse transcriptase polymerase chain reaction (RT-PCR). In this study, we evaluated the feasibility of automating the serum RNA extraction procedure in order to increase the throughput of the assay. Methods: An automated nucleic acid extraction platform using the MagNA Pure LC instrument (Roche Diagnostics) was evaluated. We developed a modified protocol in compliance with the recommended biosafety guidelines from the World Health Organization, based on the use of the MagNA Pure total nucleic acid large volume isolation kit, for the extraction of SARS-coronavirus RNA. The modified protocol was compared with a column-based extraction kit (QIAamp viral RNA mini kit, Qiagen) for quantitative performance, analytical sensitivity and precision. Results: The newly developed automated protocol was shown to be free from carry-over contamination and to have comparable performance with other standard protocols and kits designed for the MagNA Pure LC instrument. However, the automated method was found to be less sensitive, less precise and led to consistently lower serum SARS-coronavirus concentrations when compared with the column-based extraction method. Conclusion: As the diagnostic efficiency and prognostic value of the serum SARS-CoV RNA RT-PCR test is critically associated with the analytical sensitivity and quantitative performance contributed both by the RNA extraction and RT-PCR components of the test, we recommend the use of the column-based manual RNA extraction method.

  14. Towards automated support for extraction of reusable components

    Science.gov (United States)

    Abd-El-hafiz, S. K.; Basili, Victor R.; Caldiera, Gianluigi

    1992-01-01

    A cost effective introduction of software reuse techniques requires the reuse of existing software developed in many cases without aiming at reusability. This paper discusses the problems related to the analysis and reengineering of existing software in order to reuse it. We introduce a process model for component extraction and focus on the problem of analyzing and qualifying software components which are candidates for reuse. A prototype tool for supporting the extraction of reusable components is presented. One of the components of this tool aids in understanding programs and is based on the functional model of correctness. It can assist software engineers in the process of finding correct formal specifications for programs. A detailed description of this component and an example to demonstrate a possible operational scenario are given.

  15. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods

    OpenAIRE

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N.; Hoflund, Anders; Mogensen, Helle S; Hansen, Anders J.; Morling, Niels

    2013-01-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (Pr...

  16. Fault Tolerant Modular Linear Motor for Safe-Critical Automated Industrial Applications

    Directory of Open Access Journals (Sweden)

    Loránd SZABÓ

    2009-05-01

    In various safety-critical industrial, medical and defence applications, translational movements are performed by linear motors. In such applications both the motor and its power converter should be fault tolerant. To fulfil this requirement, redesigned motor structures with novel phase connections must be used. In the paper a modular double salient permanent magnet linear motor is studied. Its phases are split into independent channels. The study of the fault tolerant capability of the linear motor was performed via co-simulation, using the Flux-to-Simulink technology. The conclusions of the paper could help users select the optimal linear motor topology for a given application, as a function of the required mean traction force and its acceptable ripples.

  17. Multispectral Image Road Extraction Based Upon Automated Map Conflation

    Science.gov (United States)

    Chen, Bin

    Road network extraction from remotely sensed imagery enables many important and diverse applications such as vehicle tracking, drone navigation, and intelligent transportation studies. There are, however, a number of challenges to road detection from an image. Road pavement material, width, direction, and topology vary across a scene. Complete or partial occlusions caused by nearby buildings, trees, and the shadows cast by them, make maintaining road connectivity difficult. The problems posed by occlusions are exacerbated with the increasing use of oblique imagery from aerial and satellite platforms. Further, common objects such as rooftops and parking lots are made of materials similar or identical to road pavements. This problem of common materials is a classic case of a single land cover material existing for different land use scenarios. This work addresses these problems in road extraction from geo-referenced imagery by leveraging the OpenStreetMap digital road map to guide image-based road extraction. The crowd-sourced cartography has the advantages of worldwide coverage that is constantly updated. The derived road vectors follow only roads and so can serve to guide image-based road extraction with minimal confusion from occlusions and changes in road material. On the other hand, the vector road map has no information on road widths and misalignments between the vector map and the geo-referenced image are small but nonsystematic. Properly correcting misalignment between two geospatial datasets, also known as map conflation, is an essential step. A generic framework requiring minimal human intervention is described for multispectral image road extraction and automatic road map conflation. The approach relies on the road feature generation of a binary mask and a corresponding curvilinear image. A method for generating the binary road mask from the image by applying a spectral measure is presented. The spectral measure, called anisotropy-tunable distance (ATD

  18. Plasmid purification by phenol extraction from guanidinium thiocyanate solution: development of an automated protocol.

    Science.gov (United States)

    Fisher, J A; Favreau, M B

    1991-05-01

    We have developed a novel plasmid isolation procedure and have adapted it for use on an automated nucleic acid extraction instrument. The protocol is based on the finding that phenol extraction of a 1 M guanidinium thiocyanate solution at pH 4.5 efficiently removes genomic DNA from the aqueous phase, while supercoiled plasmid DNA is retained in the aqueous phase. S1 nuclease digestion of the removed genomic DNA shows that it has been denatured, which presumably confers solubility in the organic phase. The complete automated protocol for plasmid isolation involves pretreatment of bacterial cells successively with lysozyme, RNase A, and proteinase K. Following these digestions, the solution is extracted twice with a phenol/chloroform/water mixture and once with chloroform. Purified plasmid is then collected by isopropanol precipitation. The purified plasmid is essentially free of genomic DNA, RNA, and protein and is a suitable substrate for DNA sequencing and other applications requiring highly pure supercoiled plasmid. PMID:1713749

  19. Automated extraction of metastatic liver cancer regions from abdominal contrast CT images

    International Nuclear Information System (INIS)

    In this paper, automated extraction of metastatic liver cancer regions from abdominal contrast X-ray CT images is investigated. Because cases of metastatic liver cancer are increasing even in Japan due to the recent Europeanization and Americanization of Japanese eating habits, the development of a system for their computer-aided diagnosis is strongly expected. Our automated extraction procedure consists of the following four steps: liver region extraction, density transformation for enhancement of cancer regions, segmentation to obtain candidate cancer regions, and reduction of false positives by shape features. Parameter values used in each step of the procedure are decided based on the density and shape features of typical metastatic liver cancers. In experiments on 20 clinical cases of metastatic liver tumors, it is shown that 56% of true cancers can be detected successfully from CT images by the proposed procedure. (author)
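The threshold-segment-filter pattern behind the four steps can be sketched on a synthetic image. The threshold, area cutoff, and roundness criteria below are illustrative assumptions, not the paper's parameter values, and the liver-region extraction step is omitted.

```python
import numpy as np
from scipy import ndimage

# Synthetic "CT slice": dark circular lesion plus a dark elongated artefact
# on a brighter background, with mild noise.
img = np.full((128, 128), 120.0)
yy, xx = np.mgrid[:128, :128]
img[(yy - 40) ** 2 + (xx - 40) ** 2 < 15 ** 2] = 60.0   # round lesion
img[60:62, 20:110] = 60.0                               # elongated artefact
img += np.random.default_rng(1).normal(0.0, 2.0, img.shape)

# Density threshold yields candidate regions (steps 2-3 in miniature).
candidates = img < 90.0
labels, n = ndimage.label(candidates)

# Step 4: reduce false positives with shape features. Round blobs have a
# bounding-box fill ratio near pi/4 and an aspect ratio near 1; thin
# elongated artefacts fail the aspect-ratio test.
kept = []
for region in ndimage.find_objects(labels):
    mask = labels[region] > 0
    area = int(mask.sum())
    h, w = mask.shape
    fill = area / (h * w)
    aspect = max(h, w) / min(h, w)
    if area > 50 and fill > 0.5 and aspect < 3:
        kept.append(region)

print(len(kept))  # → 1  (the round lesion survives; the line artefact is rejected)
```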

  20. Automated extraction of lexical meanings from Polish corpora: potentialities and limitations

    Directory of Open Access Journals (Sweden)

    Maciej Piasecki

    2015-11-01

    Large corpora are often consulted by linguists as a knowledge source with respect to lexicon, morphology or syntax. However, there are also several methods of automated extraction of semantic properties of language units from corpora. In the paper we focus on emerging potentialities of these methods, as well as on their identified limitations. Evidence that can be collected from corpora is confronted with the existing models of formalised description of lexical meanings. Two basic paradigms of lexical semantics extraction are briefly described. Their properties are analysed on the basis of several experiments performed on Polish corpora. Several potential applications of the methods, including a system supporting the expansion of a Polish wordnet, are discussed. Finally, perspectives on potential further development are discussed.

  1. Automatic fault feature extraction of mechanical anomaly on induction motor bearing using ensemble super-wavelet transform

    Science.gov (United States)

    He, Wangpeng; Zi, Yanyang; Chen, Binqiang; Wu, Feng; He, Zhengjia

    2015-03-01

    Mechanical anomaly is a major failure type of induction motor. It is of great value to detect the resulting fault feature automatically. In this paper, an ensemble super-wavelet transform (ESW) is proposed for investigating vibration features of motor bearing faults. The ESW is put forward based on the combination of tunable Q-factor wavelet transform (TQWT) and Hilbert transform such that fault feature adaptability is enabled. Within ESW, a parametric optimization is performed on the measured signal to obtain a quality TQWT basis that best demonstrates the hidden fault feature. TQWT is introduced as it provides a vast wavelet dictionary with time-frequency localization ability. The parametric optimization is guided according to the maximization of the fault feature ratio, which is a new quantitative measure of periodic fault signatures. The fault feature ratio is derived from digital Hilbert demodulation analysis with an insightful quantitative interpretation. The output of ESW on the measured signal is a selected wavelet scale with indicated fault features. It is verified via numerical simulations that ESW can match the oscillatory behavior of signals without parameters being artificially specified. The proposed method is applied to two engineering cases, signals of which were collected from a wind turbine and a steel temper mill, to verify its effectiveness. The processed results demonstrate that the proposed method is more effective in extracting weak fault features of induction motor bearings compared with the Fourier transform, direct Hilbert envelope spectrum, different wavelet transforms and spectral kurtosis.
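The Hilbert-demodulation step underlying the fault feature ratio can be illustrated on its own: the envelope spectrum of a simulated bearing-fault signal concentrates energy at the fault characteristic frequency. The sampling rate, fault frequency, and resonance frequency below are invented for the demonstration and are not taken from the paper.

```python
import numpy as np
from scipy.signal import hilbert

fs = 12_000
t = np.arange(0, 1.0, 1 / fs)
f_fault, f_res = 100.0, 3_000.0   # illustrative fault and resonance frequencies

# Repetitive impulses at f_fault exciting a high-frequency resonance, in noise.
sig = np.zeros_like(t)
for k in range(int(f_fault)):
    t0 = k / f_fault
    idx = t >= t0
    sig[idx] += 2.0 * np.exp(-800 * (t[idx] - t0)) * np.sin(2 * np.pi * f_res * (t[idx] - t0))
sig += 0.5 * np.random.default_rng(2).normal(size=t.size)

# Hilbert demodulation: the envelope spectrum concentrates the periodic
# impact energy at f_fault and its harmonics.
env = np.abs(hilbert(sig))
spec = np.abs(np.fft.rfft(env - env.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

band = (freqs > 10) & (freqs < 500)
peak = freqs[band][np.argmax(spec[band])]
```

A fault-feature-ratio style measure could then compare the spectral energy at `peak` and its harmonics against the rest of the envelope spectrum.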

  2. Sparse representation based on local time-frequency template matching for bearing transient fault feature extraction

    Science.gov (United States)

    He, Qingbo; Ding, Xiaoxi

    2016-05-01

    The transients caused by the localized fault are important measurement information for bearing fault diagnosis. Thus it is crucial to extract the transients from the bearing vibration or acoustic signals that are always corrupted by a large amount of background noise. In this paper, an iterative transient feature extraction approach is proposed based on time-frequency (TF) domain sparse representation. The approach is realized by presenting a new method, called local TF template matching. In this method, the TF atoms are constructed based on the TF distribution (TFD) of the Morlet wavelet bases, and local TF templates are formulated from the TF atoms for the matching process. The instantaneous frequency (IF) ridge calculated from the TFD of an analyzed signal provides the frequency parameter values for the TF atoms as well as an effective template matching path on the TF plane. In each iteration, local TF templates are correlated with the TFD of the analyzed signal along the IF ridge tube to identify the optimum parameters of the transient wavelet model. With this iterative procedure, transients can be extracted in the TF domain from measured signals one by one. The final signal can be synthesized by combining the extracted TF atoms and the phase of the raw signal. The local TF template matching builds an effective TF matching-based sparse representation approach with the merit of preserving the native pulse waveform structure of transients. The effectiveness of the proposed method is verified by practical defective bearing signals. Comparison results also show that the proposed method is superior to traditional methods in transient feature extraction.

  3. Enabling swifter operator response in the event of a fault initiation through adaptive automation

    OpenAIRE

    Kleij, R. van der; Brake, G.M. te; Broek, J. van den

    2015-01-01

    The increasing size and operational complexity of Dynamic Positioning (DP) platforms and the continuous increase in number of DP incidents has driven the need to further improve the safety and reliability of DP operations. A large portion of so-called ‘operator error’ is explained by increasing automation of operator tasks, pushing bridge teams into a more and more passive supervisory role, a role for which humans are not very well suited. For instance, a supervisory role may undermine the te...

  4. Semi-automated extraction of brain contours from MRI

    International Nuclear Information System (INIS)

    We compared brain volumes computed by trained operators with those obtained using the BRAIN-MAP algorithm, which automatically extracts the contours of the brain from gradient-echo magnetic resonance images. The images of 19 subjects randomly selected from a group of normals and a group of patients with dementia were included. BRAIN-MAP found brain perimeters that were on average ca. 3% tighter than those obtained by two experienced operators. Between-operator and within-operator reproducibility of the analyses were also estimated and found to be (-0.13 ± 0.51)% and (-0.63 ± 0.08)%, respectively. Replicate volume measurement by the computer alone provided a reproducibility of (0.44 ± 0.46)%. (orig.)

  5. Automated Smiley Face Extraction Based on Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Md Alamgir Hossain

    2012-07-01

    Facial expression analysis has attracted tremendous attention in the area of computer vision because it plays a prime role in the domain of human-machine communication. Smiling expressions are generated by contractions of facial muscles, which result in temporally deformed facial features such as eyelids, eyebrows, nose, lips, and skin texture. These are evaluated by three characteristics: the portions of the face that take part in the facial action, the intensity of the facial action, and the dynamics of the facial action. In this paper we propose a real-time, accurate, and robust smile detection method based on a genetic algorithm. We generate a leaf matrix to extract the target expression. Finally, we compared our methodology with the Smile Shutter function of Sony cameras and achieved better performance than Sony on slight smiles.

  6. An Analytical Model for Assessing Stability of Pre-Existing Faults in Caprock Caused by Fluid Injection and Extraction in a Reservoir

    Science.gov (United States)

    Wang, Lei; Bai, Bing; Li, Xiaochun; Liu, Mingze; Wu, Haiqing; Hu, Shaobin

    2016-07-01

    Induced seismicity and fault reactivation associated with fluid injection and depletion have been reported in hydrocarbon, geothermal, and waste fluid injection fields worldwide. Here, we establish an analytical model to assess fault reactivation surrounding a reservoir during fluid injection and extraction that considers the stress concentrations at the fault tips and the effects of fault length. In this model, induced stress analysis in a full-space under the plane strain condition is implemented based on Eshelby's theory of inclusions in terms of a homogeneous, isotropic, and poroelastic medium. The stress intensity factor concept in linear elastic fracture mechanics is adopted as an instability criterion for pre-existing faults in surrounding rocks. To characterize the fault reactivation caused by fluid injection and extraction, we define a new index, the "fault reactivation factor" η, which can be interpreted as an index of fault stability in response to fluid pressure changes per unit within a reservoir resulting from injection or extraction. The critical fluid pressure change within a reservoir is also determined by the superposition principle using the in situ stress surrounding a fault. Our parameter sensitivity analyses show that the fault reactivation tendency is strongly sensitive to fault location, fault length, fault dip angle, and Poisson's ratio of the surrounding rock. Our case study demonstrates that the proposed model focuses on the mechanical behavior of the whole fault, unlike conventional methodologies. The proposed method can be applied to engineering cases related to injection and depletion within a reservoir owing to its efficient computational implementation.
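The idea of judging fault stability through a stress-intensity criterion can be sketched with textbook fracture-mechanics forms. The Coulomb excess-shear expression and K_II = Δτ·sqrt(π·a) for a uniformly loaded crack are standard illustrations, not the paper's exact equations, and all numerical values are invented.

```python
import math

def mode_ii_sif(tau, sigma_n, pore_pressure, mu_f, half_length):
    """Mode-II stress intensity factor (Pa*sqrt(m)) from the shear stress in
    excess of frictional resistance on a fault of given half-length."""
    tau_excess = tau - mu_f * (sigma_n - pore_pressure)
    return max(tau_excess, 0.0) * math.sqrt(math.pi * half_length)

# Injection raises pore pressure, lowering frictional resistance on the fault,
# so the same shear load produces a larger stress intensity at the fault tips.
k_before = mode_ii_sif(tau=20e6, sigma_n=50e6, pore_pressure=10e6,
                       mu_f=0.6, half_length=500.0)
k_after = mode_ii_sif(tau=20e6, sigma_n=50e6, pore_pressure=25e6,
                      mu_f=0.6, half_length=500.0)
print(k_before < k_after)  # → True
```

Comparing such a stress intensity against a critical value plays the role of the instability criterion; the paper's η additionally normalizes the response by the fluid pressure change per unit.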

  7. An automated approach for extracting Barrier Island morphology from digital elevation models

    Science.gov (United States)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depends on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we can first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The automated approach introduced here extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach to extracting dune toe, dune crest, and dune heel based upon relative relief is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and the identification of features is based on the calculated RR values. The RR approach out-performed contemporary approaches and represents a fast objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel are spatially continuous, which is important because dune morphology is likely naturally variable alongshore.
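A minimal sketch of computing relative relief across multiple window sizes on a 1-D elevation transect. The window sizes and the triangular test profile are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from scipy import ndimage

def relative_relief(z, windows=(3, 5, 9)):
    """Mean relative relief RR = (z - local_min) / (local_max - local_min),
    averaged over several window sizes (the multi-scale step)."""
    rr = np.zeros_like(z, dtype=float)
    for w in windows:
        lo = ndimage.minimum_filter1d(z, size=w)
        hi = ndimage.maximum_filter1d(z, size=w)
        span = np.where(hi > lo, hi - lo, 1.0)   # avoid division by zero on flats
        rr += (z - lo) / span
    return rr / len(windows)

# Idealized triangular dune profile: RR is ~1 at the crest (a local maximum
# at every scale) and ~0.5 on the planar slopes, so thresholding the averaged
# RR separates crest candidates from the surrounding beach and back-barrier.
x = np.linspace(0.0, 1.0, 200)
z = 2.0 - 4.0 * np.abs(x - 0.5)
rr = relative_relief(z)
crest = int(np.argmax(z))
```

On a real DEM the same per-pixel computation runs in 2-D with square windows, and the dune toe and heel appear as transitions in the averaged RR rather than as pre-defined continuous lines.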

  8. Weak transient fault feature extraction based on an optimized Morlet wavelet and kurtosis

    Science.gov (United States)

    Qin, Yi; Xing, Jianfeng; Mao, Yongfang

    2016-08-01

    Aimed at solving the key problem in weak transient detection, the present study proposes a new transient feature extraction approach using the optimized Morlet wavelet transform, kurtosis index and soft-thresholding. Firstly, a fast optimization algorithm based on the Shannon entropy is developed to obtain the optimized Morlet wavelet parameter. Compared to the existing Morlet wavelet parameter optimization algorithm, this algorithm has lower computation complexity. After performing the optimized Morlet wavelet transform on the analyzed signal, the kurtosis index is used to select the characteristic scales and obtain the corresponding wavelet coefficients. From the time-frequency distribution of the periodic impulsive signal, it is found that the transient signal can be reconstructed by the wavelet coefficients at several characteristic scales, rather than the wavelet coefficients at just one characteristic scale, so as to improve the accuracy of transient detection. Due to the noise influence on the characteristic wavelet coefficients, the adaptive soft-thresholding method is applied to denoise these coefficients. With the denoised wavelet coefficients, the transient signal can be reconstructed. The proposed method was applied to the analysis of two simulated signals, and the diagnosis of a rolling bearing fault and a gearbox fault. The superiority of the method over the fast kurtogram method was verified by the results of simulation analysis and real experiments. It is concluded that the proposed method is extremely suitable for extracting the periodic impulsive feature from strong background noise.
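The kurtosis-based scale selection and adaptive soft-thresholding steps can be sketched as follows. A plain complex-Morlet CWT stands in for the paper's entropy-optimized Morlet transform, and the signal parameters and universal threshold are illustrative assumptions.

```python
import numpy as np
from scipy.stats import kurtosis

def morlet_cwt(sig, scales, w0=6.0):
    """Minimal complex-Morlet CWT by direct convolution (illustrative)."""
    out = np.empty((len(scales), sig.size), dtype=complex)
    for i, s in enumerate(scales):
        n = np.arange(-4 * s, 4 * s + 1)
        wav = np.exp(1j * w0 * n / s) * np.exp(-0.5 * (n / s) ** 2) / np.sqrt(s)
        out[i] = np.convolve(sig, wav, mode='same')
    return out

fs = 2000
t = np.arange(0, 1, 1 / fs)

# Simulated periodic impacts: decaying 300 Hz bursts every 0.1 s, in noise.
sig = np.zeros_like(t)
for t0 in np.arange(0.05, 1.0, 0.1):
    idx = t >= t0
    sig[idx] += 3.0 * np.exp(-300 * (t[idx] - t0)) * np.sin(2 * np.pi * 300 * (t[idx] - t0))
noisy = sig + 0.4 * np.random.default_rng(3).normal(size=t.size)

scales = np.arange(2, 40)
coeffs = morlet_cwt(noisy, scales)

# Kurtosis index: select the scales where the coefficients are most impulsive.
k = kurtosis(np.abs(coeffs), axis=1)
best = np.argsort(k)[-3:]          # keep several characteristic scales

# Soft-threshold the characteristic-scale coefficients, then reconstruct
# by summing the denoised real parts.
recon = np.zeros_like(t)
for i in best:
    c = coeffs[i].real
    sigma = np.median(np.abs(c)) / 0.6745        # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(c.size))    # universal threshold
    recon += np.sign(c) * np.maximum(np.abs(c) - thr, 0.0)
```

Using several characteristic scales rather than one mirrors the paper's observation that the transient is better reconstructed from multiple scales; the proper inverse-CWT weighting is omitted here for brevity.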

  9. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N;

    2013-01-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors...... that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit...... either with the sample pre-treatment recommended by Qiagen or an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex) with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480...

  10. Automated photocurrent and bussing extraction for dose-rate rail span collapse simulations

    International Nuclear Information System (INIS)

    Computer-aided rail span collapse simulations require radiation-induced photocurrent partitioning among proximal physical collection regions and electrical contacts to the power distribution network. In this paper, the authors present a new simulation approach incorporating simple geometric rules for current division inside a contiguous region along with the automated extraction of the power distribution network. Experimental results to verify the current division algorithms are also presented. Pixel-plane and scan-line techniques used for the automated extraction of the power distribution network are described. For simulation of the circuit, we have used a simulator employing conjugate-gradient algorithms. A post-simulation processor maps the actual supply rails onto the layout itself for easy identification of critical sub-circuits.

  11. An integrated approach for automating validation of extracted ion chromatographic peaks

    OpenAIRE

    Nelson, William D.; Viele, Kert; Lynn, Bert C.

    2008-01-01

    Summary: Accurate determination of extracted ion chromatographic peak areas in isotope-labeled quantitative proteomics is difficult to automate. Manual validation of identified peaks is typically required. We have integrated a peak confidence scoring algorithm into existing tools which are compatible with analysis pipelines based on the standards from the Institute for Systems Biology. This algorithm automatically excludes incorrectly identified peaks, improving the accuracy of the final prot...

  12. Designing an automated prototype tool for preservation quality metadata extraction for ingest into digital repository

    OpenAIRE

    Dobreva, M.; Kim, Y; Ross, S

    2008-01-01

    We present a viable framework for the automated extraction of preservation quality metadata, which is adjusted to meet the needs of, ingest to digital repositories. It has three distinctive features: wide coverage, specialisation and emphasis on quality. Wide coverage is achieved through the use of a distributed system of tool repositories, which helps to implement it over a broad range of document object types. Specialisation is maintained through the selection of the most appropriate metada...

  13. An automated system for liquid-liquid extraction in monosegmented flow analysis

    OpenAIRE

    Facchin, Ileana; Jarbas J. R. Rohwedder; Pasquini, Celio

    1997-01-01

    An automated system to perform liquid-liquid extraction in monosegmented flow analysis is described. The system is controlled by a microcomputer that can track the localization of the aqueous monosegmented sample in the manifold. Optical switches are employed to sense the gas-liquid interface of the air bubbles that define the monosegment. The logical level changes, generated by the switches, are flagged by the computer through a home-made interface that also contains the analogue-to-digital ...

  14. Automation of Extraction Chromatograhic and Ion Exchange Separations for Radiochemical Analysis and Monitoring

    International Nuclear Information System (INIS)

    Radiochemical analysis, complete with the separation of radionuclides of interest from the sample matrix and from other interfering radionuclides, is often an essential step in the determination of the radiochemical composition of a nuclear sample or process stream. Although some radionuclides can be determined nondestructively by gamma spectroscopy, where the gamma rays penetrate significant distances in condensed media and the gamma ray energies are diagnostic for specific radionuclides, other radionuclides that may be of interest emit only alpha or beta particles. For these, samples must be taken for destructive analysis and radiochemical separations are required. For process monitoring purposes, the radiochemical separation and detection methods must be rapid so that the results will be timely. These results could be obtained by laboratory analysis or by radiochemical process analyzers operating on-line or at-site. In either case, there is a need for automated radiochemical analysis methods to provide speed, throughput, safety, and consistent analytical protocols. Classical methods of separation used during the development of nuclear technologies, namely manual precipitations, solvent extractions, and ion exchange, are slow and labor intensive. Fortunately, the convergence of digital instrumentation for preprogrammed fluid manipulation and the development of new separation materials for column-based isolation of radionuclides has enabled the development of automated radiochemical analysis methodology. The primary means for separating radionuclides in solution are liquid-liquid extraction and ion exchange. These processes are well known and have been reviewed in the past [1]. Ion exchange is readily employed in column formats. Liquid-liquid extraction can also be implemented on column formats using solvent-impregnated resins as extraction chromatographic materials. The organic liquid extractant is immobilized in the pores of a microporous polymer material. Under

  15. Automated method for determination of uranium in kerosene-amine sulphate extracts

    International Nuclear Information System (INIS)

    An automated method is described for the determination of uranium(VI) that has been extracted into a trialkylamine in kerosene or a similar diluent from sulphuric acid leach liquor. The method uses the continuous segmented-flow technique and can be set up with commercial components. Discrimination against interference from other ions, especially sulphate, should be adequate for most purposes. Uranium concentrations of 0.5 to 5 g per litre of extract can be determined at a rate of 60 samples per hour. Minor modifications permit extension of this range to lower concentrations. (author)

  16. Analysis of Automated Modern Web Crawling and Testing Tools and Their Possible Employment for Information Extraction

    Directory of Open Access Journals (Sweden)

    Tomas Grigalis

    2012-04-01

    Full Text Available The World Wide Web has become an enormous repository of data. Extracting, integrating and reusing this kind of data has a wide range of applications, including meta-searching, comparison shopping, business intelligence tools and security analysis of information in websites. However, reaching information in modern WEB 2.0 web pages is a difficult task: the HTML tree is often modified dynamically by JavaScript code, new data are added by asynchronous requests to the web server, and elements are positioned with the help of cascading style sheets. The article reviews automated web testing tools for information extraction tasks. Article in Lithuanian

  17. Evaluation of automated nucleic acid extraction methods for virus detection in a multicenter comparative trial

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Bruun; Uttenthal, Åse; Hakhverdyan, M.; Belak, S.; Wakeley, P. R.; Reid, S. M.; Ebert, K.; King, D. P.

    2009-01-01

    Five European veterinary laboratories participated in an exercise to compare the performance of nucleic acid extraction robots. Identical sets of coded samples were prepared using serial dilutions of bovine viral diarrhoea virus (BVDV) from serum and cell culture propagated material. Each...... laboratory extracted nucleic acid from this panel using available robotic equipment (12 separate instruments, comprising 8 different models), after which the processed samples were frozen and sent to a single laboratory for subsequent testing by real-time RT-PCR. In general, there was good concordance...... between the results obtained for the different automated extraction platforms. In particular, the limit of detection was identical for 9/12 and 8/12 best performing robots (using dilutions of BVDV infected-serum and cell culture material, respectively), which was similar to a manual extraction method used...

  18. Bearing Fault Diagnosis Based on Multiscale Permutation Entropy and Support Vector Machine

    OpenAIRE

    Jian-Jiun Ding; Chun-Chieh Wang; Chiu-Wen Wu; Po-Hung Wu; Shuen-De Wu

    2012-01-01

    Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification for the extracted features. In this paper, multiscale permutation entropy (MPE) was introduced for feature extraction from faulty bearing vibration signals. After extracting feature vectors by MPE, the support vector machine (SVM) was applied to automate the fault diagnosis procedure. Simulation results demonstr...
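
    The feature-extraction step described above can be sketched in outline. The following is a minimal, illustrative implementation of multiscale permutation entropy (coarse-grain the signal at each scale, then compute Bandt-Pompe permutation entropy); the SVM classification stage is omitted, and the embedding order, delay and scales below are arbitrary assumptions, not the authors' settings.

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy (Bandt-Pompe) of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        # ordinal pattern of the embedded vector
        pat = tuple(np.argsort(x[i:i + (order - 1) * delay + 1:delay]))
        counts[pat] = counts.get(pat, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n
    # normalize by log(order!) so the result lies in [0, 1]
    return -np.sum(p * np.log(p)) / np.log(factorial(order))

def multiscale_pe(x, scales=(1, 2, 3), order=3):
    """Coarse-grain the signal at each scale, then compute PE per scale."""
    x = np.asarray(x, dtype=float)
    feats = []
    for s in scales:
        m = len(x) // s
        cg = x[:m * s].reshape(m, s).mean(axis=1)  # non-overlapping means
        feats.append(permutation_entropy(cg, order=order))
    return feats
```

    A monotonic signal yields entropy 0 (a single ordinal pattern), while broadband noise approaches 1; the resulting feature vector over several scales is what would be fed to the classifier.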

  19. Adaptive Redundant Lifting Wavelet Transform Based on Fitting for Fault Feature Extraction of Roller Bearings

    Directory of Open Access Journals (Sweden)

    Huaqing Wang

    2012-03-01

    Full Text Available A least-squares method based on data fitting is proposed to construct a new lifting wavelet; combined with a nonlinear scheme and a redundant algorithm, the adaptive redundant lifting wavelet transform based on fitting is first presented in this paper. By varying the combination of basis function, sample number and dimension of the basis function, a total of nine wavelets with different characteristics are constructed, which are respectively adopted to perform redundant lifting wavelet transforms on the low-frequency approximation signal at each layer. The normalized lp norms of the node signals obtained through decomposition are then calculated to adaptively determine the optimal wavelet for each decomposed approximation signal. Next, a subsection power spectrum analysis of the original signal is used to choose the node signal for single-branch reconstruction and demodulation. Experimental signals and engineering signals are used to verify the method, and the results show that bearing faults can be diagnosed more effectively by the method presented here than by either spectrum analysis or demodulation analysis alone. Meanwhile, compared with the symmetrical wavelets constructed with the Lagrange interpolation algorithm, the asymmetrical wavelets constructed by data fitting are more suitable for feature extraction from fault signals of roller bearings.
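
    The norm-based wavelet selection can be illustrated with a toy sketch. Below, a normalized lp norm with p < 1 (a common sparsity surrogate; the exact norm definition and whether the paper maximizes or minimizes it are not given in the abstract) scores each candidate wavelet's detail signal, and the sparsest, i.e. most impulsive, result is selected, impulsiveness being the usual signature of a localized bearing fault.

```python
import numpy as np

def normalized_lp_norm(x, p=0.5):
    """Sparsity score: lp 'norm' (p < 1) of the l2-normalized signal.
    Smaller values indicate a sparser (more impulsive) signal."""
    x = np.abs(np.asarray(x, dtype=float))
    x = x / (np.linalg.norm(x) + 1e-12)   # scale-invariant
    return np.sum(x ** p) ** (1.0 / p)

def pick_optimal(candidates, p=0.5):
    """candidates: dict mapping wavelet name -> detail signal.
    Returns the name of the sparsest decomposition and all scores."""
    scores = {k: normalized_lp_norm(v, p) for k, v in candidates.items()}
    return min(scores, key=scores.get), scores
```

    A spiky detail signal (isolated impulses) scores lower than a smooth oscillation of the same energy, so the wavelet that concentrates the fault impacts is the one chosen.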

  20. An initial exploration of ethical research practices regarding automated data extraction from online social media user profiles

    OpenAIRE

    Alim, Sophia

    2014-01-01

    The popularity of social media, especially online social networks, has led to the availability of potentially rich sources of data, which researchers can use for extraction via automated means. However, the process of automated extraction from user profiles results in a variety of ethical considerations and challenges for researchers. This paper examines this question further, surveying researchers to gain information regarding their experiences of, and thoughts about, the challenges to ethic...

  1. Automated and Manual Methods of DNA Extraction for Aspergillus fumigatus and Rhizopus oryzae Analyzed by Quantitative Real-Time PCR

    OpenAIRE

    Francesconi, Andrea; Kasai, Miki; Harrington, Susan M.; Beveridge, Mara G; Petraitiene, Ruta; Petraitis, Vidmantas; Schaufele, Robert L.; Walsh, Thomas J.

    2008-01-01

    Quantitative real-time PCR (qPCR) may improve the detection of fungal pathogens. Extraction of DNA from fungal pathogens is fundamental to optimization of qPCR; however, the loss of fungal DNA during the extraction process is a major limitation to molecular diagnostic tools for pathogenic fungi. We therefore studied representative automated and manual extraction methods for Aspergillus fumigatus and Rhizopus oryzae. Both were analyzed by qPCR for their ability to extract DNA from propagules a...

  2. Automated pressurized injection system for the separation of actinides by extraction chromatography

    International Nuclear Information System (INIS)

    This article describes a novel separation scheme developed for an automated system to efficiently separate actinides in individual fractions. The automated pressurized injection (PI) system developed allows precise collection of high-purity actinide fractions (≥99 %) at elevated flow rates (15-30 mL min-1) using two extraction chromatographic TEVA and DGA resins. This system is sufficiently robust to enable the use of highly viscous acid media, limit acid corrosion, and tolerate the large amounts of gases generated by redox reactions of some of the reagents. The PI system was successfully applied to the separation of actinides in individual fractions (recovery yield ≥97 % for Th, U, Np, Pu, and Am) and shows the absence of cross contamination even with highly concentrated actinide solutions. The methodology was also applied to the measurement of actinides in large spiked soil samples. (author)

  3. Automated extraction of natural drainage density patterns for the conterminous United States through high performance computing

    Science.gov (United States)

    Stanislawski, Larry V.; Falgout, Jeff T.; Buttenfield, Barbara P.

    2015-01-01

    Hydrographic networks form an important data foundation for cartographic base mapping and for hydrologic analysis. Drainage density patterns for these networks can be derived to characterize local landscape, bedrock and climate conditions, and further inform hydrologic and geomorphological analysis by indicating areas where too few headwater channels have been extracted. But natural drainage density patterns are not consistently available in existing hydrographic data for the United States because compilation and capture criteria historically varied, along with climate, during the period of data collection over the various terrain types throughout the country. This paper demonstrates an automated workflow that is being tested in a high-performance computing environment by the U.S. Geological Survey (USGS) to map natural drainage density patterns at the 1:24,000-scale (24K) for the conterminous United States. Hydrographic network drainage patterns may be extracted from elevation data to guide corrections for existing hydrographic network data. The paper describes three stages in this workflow including data pre-processing, natural channel extraction, and generation of drainage density patterns from extracted channels. The workflow is concurrently implemented by executing procedures on multiple subbasin watersheds within the U.S. National Hydrography Dataset (NHD). Pre-processing defines parameters that are needed for the extraction process. Extraction proceeds in standard fashion: filling sinks, developing flow direction and weighted flow accumulation rasters. Drainage channels with assigned Strahler stream order are extracted within a subbasin and simplified. Drainage density patterns are then estimated with 100-meter resolution and subsequently smoothed with a low-pass filter. The extraction process is found to be of better quality in higher slope terrains. Concurrent processing through the high performance computing environment is shown to facilitate and refine ...
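
    The core of the extraction stage (flow direction plus flow accumulation) can be sketched as follows. This is a minimal, single-threaded D8 illustration on a small in-memory grid, not the USGS production workflow: sink filling, accumulation weighting, Strahler ordering and the HPC parallelization over subbasins are all omitted.

```python
import numpy as np

D8 = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_flow_accumulation(dem):
    """Each cell drains to its steepest-descent neighbour (D8 rule).
    Accumulation counts the cell itself plus all upstream cells."""
    dem = np.asarray(dem, dtype=float)
    rows, cols = dem.shape
    target = {}  # flow target of each cell; pits/edge lows have none
    for r in range(rows):
        for c in range(cols):
            best, drop = None, 0.0
            for dr, dc in D8:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    dist = (dr * dr + dc * dc) ** 0.5  # diagonal = sqrt(2)
                    d = (dem[r, c] - dem[rr, cc]) / dist
                    if d > drop:
                        best, drop = (rr, cc), d
            if best is not None:
                target[(r, c)] = best
    acc = np.ones((rows, cols))  # each cell contributes itself
    # process cells from highest to lowest so upstream totals are final
    order = sorted(((dem[r, c], r, c) for r in range(rows) for c in range(cols)),
                   reverse=True)
    for _, r, c in order:
        t = target.get((r, c))
        if t is not None:
            acc[t] += acc[r, c]
    return acc
```

    On a plane tilted toward the right edge, every row drains rightward and the accumulation grows along the flow path; channels would then be thresholded from this raster.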

  4. Accelerated solvent extraction (ASE) - a fast and automated technique with low solvent consumption for the extraction of solid samples (T12)

    International Nuclear Information System (INIS)

    Full text: Accelerated solvent extraction (ASE) is a modern extraction technique that significantly streamlines sample preparation. A common organic solvent, or water, is used as the extraction solvent at elevated temperature and pressure to increase extraction speed and efficiency. The entire extraction process is fully automated and performed within 15 minutes with a solvent consumption of 18 ml for a 10 g sample. For many matrices and for a variety of solutes, ASE has proven to be equivalent or superior to sonication, Soxhlet, and reflux extraction techniques while requiring less time, solvent and labor. ASE was first applied for the extraction of environmental hazards from solid matrices. Within a very short time ASE was approved by the U.S. EPA for the extraction of BNAs, PAHs, PCBs, pesticides, herbicides, TPH, and dioxins from solid samples in method 3545. Especially for the extraction of dioxins, the extraction time with ASE is reduced to 20 minutes in comparison to 18 h using Soxhlet. In food analysis, ASE is used for the extraction of pesticide and mycotoxin residues from fruits and vegetables, for fat determination and for the extraction of vitamins. Time-consuming and solvent-intensive methods for the extraction of additives from polymers, as well as for the extraction of marker compounds from herbal supplements, can be performed with higher efficiency using ASE. For the analysis of chemical weapons, the extraction process and sample clean-up, including derivatization, can be automated and combined with GC-MS using an online ASE-APEC-GC system. (author)

  5. Dynamic electromembrane extraction: Automated movement of donor and acceptor phases to improve extraction efficiency.

    Science.gov (United States)

    Asl, Yousef Abdossalami; Yamini, Yadollah; Seidi, Shahram; Amanzadeh, Hatam

    2015-11-01

    In the present research, dynamic electromembrane extraction (DEME) was introduced for the first time for extraction and determination of ionizable species from different biological matrices. The setup proposed for DEME provides an efficient, stable, and reproducible method to increase extraction efficiency. This setup consists of a piece of hollow fiber mounted inside a glass flow cell by means of two plastics connector tubes. In this dynamic system, an organic solvent is impregnated into the pores of hollow fiber as supported liquid membrane (SLM); an aqueous acceptor solution is repeatedly pumped into the lumen of hollow fiber by a syringe pump whereas a peristaltic pump is used to move sample solution around the mounted hollow fiber into the flow cell. Two platinum electrodes connected to a power supply are used during extractions which are located into the lumen of the hollow fiber and glass flow cell, respectively. The method was applied for extraction of amitriptyline (AMI) and nortriptyline (NOR) as model analytes from biological fluids. Effective parameters on DEME of the model analytes were investigated and optimized. Under optimized conditions, the calibration curves were linear in the range of 2.0-100μgL(-1) with coefficient of determination (r(2)) more than 0.9902 for both of the analytes. The relative standard deviations (RSD %) were less than 8.4% based on four replicate measurements. LODs less than 1.0μgL(-1) were obtained for both AMI and NOR. The preconcentration factors higher than 83-fold were obtained for the extraction of AMI and NOR in various biological samples. PMID:26455283
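
    The figures of merit quoted above follow standard analytical-chemistry definitions, which can be computed as below. This is a generic sketch (slope and intercept by least squares, LOD as k·s_blank/slope with k = 3, preconcentration factor as the acceptor-to-donor concentration ratio); the abstract does not give the authors' exact formulas, which may differ.

```python
import numpy as np

def calibration(conc, signal):
    """Least-squares calibration line; returns slope, intercept, r2."""
    conc = np.asarray(conc, dtype=float)
    signal = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(conc, signal, 1)
    fit = slope * conc + intercept
    ss_res = np.sum((signal - fit) ** 2)
    ss_tot = np.sum((signal - signal.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot

def limit_of_detection(blank_sd, slope, k=3.0):
    """Conventional LOD: k times the blank standard deviation over the slope."""
    return k * blank_sd / slope

def preconcentration_factor(c_acceptor, c_donor):
    """Ratio of analyte concentration in the acceptor to the original sample."""
    return c_acceptor / c_donor
```
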

  6. A new method for transmitting fault information in distribution automation systems

    Institute of Scientific and Technical Information of China (English)

    张益嘉

    2015-01-01

    The distribution automation system of the Nanning Power Supply Bureau has been formally put into operation. The system is a major advance for 10 kV line fault location and isolation, but the time lag in exchanging information between the distribution automation system and repair personnel seriously affects repair efficiency and the effective use of the system. The fault-information forwarding method studied in this paper uses the forwarding interface of the distribution automation system to push fault information directly to the repair personnel's mobile devices (mobile phones), which plays an important role in shortening the customers' average interruption time and improving power supply reliability.

  7. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation however is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO® 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch® Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination. PMID:22310206

  8. Automated solid-phase extraction of herbicides from water for gas chromatographic-mass spectrometric analysis

    Science.gov (United States)

    Meyer, M.T.; Mills, M.S.; Thurman, E.M.

    1993-01-01

    An automated solid-phase extraction (SPE) method was developed for the pre-concentration of chloroacetanilide and triazine herbicides, and two triazine metabolites from 100-ml water samples. Breakthrough experiments for the C18 SPE cartridge show that the two triazine metabolites are not fully retained and that increasing flow-rate decreases their retention. Standard curve r2 values of 0.998-1.000 for each compound were consistently obtained and a quantitation level of 0.05 µg/l was achieved for each compound tested. More than 10,000 surface and ground water samples have been analyzed by this method.

  9. Automating the Extraction of Model-Based Software Product Lines from Model Variants

    OpenAIRE

    Martinez, Jabier; Ziadi, Tewfik; Klein, Jacques; Le Traon, Yves

    2015-01-01

    International audience We address the problem of automating 1) the analysis of existing similar model variants and 2) migrating them into a software product line. Our approach, named MoVa2PL, considers the identification of variability and commonality in model variants, as well as the extraction of a CVL-compliant Model-based Software Product Line (MSPL) from the features identified on these variants. MoVa2PL builds on a generic representation of models making it suitable to any MOF-based ...

  10. Field-scale validation of an automated soil nitrate extraction and measurement system

    OpenAIRE

    Sibley, K.J.; Astatkie, T.; Brewster, G.; Struik, P.C.; Adsett, J.F.; Pruski, K.

    2009-01-01

    One of the many gaps that precision agriculture technologies still need to fill is the lack of an economic, automated, on-the-go mapping system that can be used to obtain intensive and accurate ‘real-time’ data on the levels of nitrate nitrogen (NO3–N) in the soil. A soil nitrate mapping system (SNMS) has been developed to provide a way to collect such data. This study was done to provide extensive field-scale validation testing of the system’s nitrate extraction and measurement su...

  11. Automated extraction of DNA and PCR setup using a Tecan Freedom EVO® liquid handler

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Frøslev, Tobias G.; Frank-Hansen, Rune;

    2009-01-01

    We have implemented and validated automated methods for DNA extraction and PCR setup developed for a Tecan Freedom EVO® liquid handler mounted with a Te-MagS™ magnetic separation device. The DNA was extracted using the Qiagen MagAttract® DNA Mini M48 kit. The DNA was amplified using Amp...

  12. Strategies for Medical Data Extraction and Presentation Part 3: Automated Context- and User-Specific Data Extraction.

    Science.gov (United States)

    Reiner, Bruce

    2015-08-01

    In current medical practice, data extraction is limited by a number of factors including lack of information system integration, manual workflow, excessive workloads, and lack of standardized databases. The combined limitations result in clinically important data often being overlooked, which can adversely affect clinical outcomes through the introduction of medical error, diminished diagnostic confidence, excessive utilization of medical services, and delays in diagnosis and treatment planning. Current technology development is largely inflexible and static in nature, which adversely affects functionality and usage among the diverse and heterogeneous population of end users. In order to address existing limitations in medical data extraction, alternative technology development strategies need to be considered which incorporate the creation of end user profile groups (to account for occupational differences among end users), customization options (accounting for individual end user needs and preferences), and context specificity of data (taking into account both the task being performed and data subject matter). Creation of the proposed context- and user-specific data extraction and presentation templates offers a number of theoretical benefits including automation and improved workflow, completeness in data search, ability to track and verify data sources, creation of computerized decision support and learning tools, and establishment of data-driven best practice guidelines. PMID:25833768

  13. Automated extraction of DNA and PCR setup using a Tecan Freedom EVO® liquid handler

    DEFF Research Database (Denmark)

    Frøslev, Tobias Guldberg; Hansen, Anders Johannes; Stangegaard, Michael;

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO® liquid handler mounted with the TeMagS magnetic separation device. The methods were validated for accredited, forensic genetic work according to ISO 17025 using the Qiagen Mag....... With the Identifiler kit, the number of full DNA profiles was approximately 20% higher with DNA prepared with the robot compared to that obtained with DNA prepared manually with the Chelex method. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for...... accredited, forensic genetic DNA typing can be implemented on a simple robot leading to the reduction of manual work as well as increased quality and throughput....

  14. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with the rapid cross-section generation the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of the cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
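
    The cross-section generation step can be illustrated with a small sketch: walk the centerline, build a transect perpendicular to the local direction at each interior vertex, and sample DEM elevations along it. This is a simplified stand-in for the ArcGIS tool (no geodatabase I/O, nearest-neighbour sampling instead of interpolation, and no overlap correction à la COCoA); transect width and point count are arbitrary choices.

```python
import numpy as np

def cross_sections(centerline, dem, cell=1.0, half_width=3.0, n_pts=7):
    """For each interior vertex of a polyline centerline, build a transect
    perpendicular to the local direction and sample DEM elevations by
    nearest-neighbour lookup (row = y/cell, col = x/cell)."""
    pts = np.asarray(centerline, dtype=float)
    sections = []
    for i in range(1, len(pts) - 1):
        d = pts[i + 1] - pts[i - 1]          # local direction (central difference)
        d = d / np.linalg.norm(d)
        normal = np.array([-d[1], d[0]])     # 90-degree rotation
        offsets = np.linspace(-half_width, half_width, n_pts)
        xy = pts[i] + offsets[:, None] * normal
        rc = np.round(xy[:, ::-1] / cell).astype(int)   # (x, y) -> (row, col)
        rc[:, 0] = np.clip(rc[:, 0], 0, dem.shape[0] - 1)
        rc[:, 1] = np.clip(rc[:, 1], 0, dem.shape[1] - 1)
        z = dem[rc[:, 0], rc[:, 1]]
        sections.append((xy, z))
    return sections
```

    On a synthetic V-shaped valley the sampled station elevations reproduce the channel profile; each (xy, z) pair is what would be exported to the hydraulic model.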

  15. Modelling and representation issues in automated feature extraction from aerial and satellite images

    Science.gov (United States)

    Sowmya, Arcot; Trinder, John

    New digital systems for the processing of photogrammetric and remote sensing images have led to new approaches to information extraction for mapping and Geographic Information System (GIS) applications, with the expectation that data can become more readily available at a lower cost and with greater currency. Demands for mapping and GIS data are increasing as well for environmental assessment and monitoring. Hence, researchers from the fields of photogrammetry and remote sensing, as well as computer vision and artificial intelligence, are bringing together their particular skills for automating these tasks of information extraction. The paper will review some of the approaches used in knowledge representation and modelling for machine vision, and give examples of their applications in research for image understanding of aerial and satellite imagery.

  16. High dimension feature extraction based visualized SOM fault diagnosis method and its application in p-xylene oxidation process☆

    Institute of Scientific and Technical Information of China (English)

    Ying Tian; Wenli Du; Feng Qian

    2015-01-01

    Purified terephthalic acid (PTA) is an important chemical raw material. P-xylene (PX) is transformed to terephthalic acid (TA) through an oxidation process and TA is refined to produce PTA. The PX oxidation reaction is a complex process involving a three-phase reaction of gas, liquid and solid. To monitor the process and to improve the product quality, as well as to visualize the fault type clearly, a fault diagnosis method based on the self-organizing map (SOM) and a high dimensional feature extraction method, local tangent space alignment (LTSA), is proposed. In this method, LTSA can reduce the dimension and keep the topology information simultaneously, and SOM distinguishes the various states on the output map. Monitoring results of the PX oxidation reaction process indicate that LTSA–SOM can well detect and visualize the fault type.
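
    The SOM half of the pipeline can be sketched in a few lines of NumPy. The LTSA dimension-reduction step is assumed to have already produced low-dimensional feature vectors; the grid size, learning-rate and neighbourhood schedules below are arbitrary illustrations, not the authors' settings.

```python
import numpy as np

def train_som(X, grid=(4, 4), iters=500, lr0=0.5, sigma0=1.5, seed=0):
    """Tiny self-organizing map: grid weights are pulled toward random
    samples, with a Gaussian neighbourhood that shrinks over time."""
    rng = np.random.default_rng(seed)
    gy, gx = grid
    W = rng.normal(size=(gy, gx, X.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(gy), np.arange(gx), indexing="ij"), -1)
    for t in range(iters):
        x = X[rng.integers(len(X))]
        dist = np.linalg.norm(W - x, axis=2)
        bmu = np.unravel_index(np.argmin(dist), dist.shape)  # best-matching unit
        frac = t / iters
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2) / (2 * sigma ** 2))
        W += lr * g[..., None] * (x - W)
    return W

def bmu_map(W, X):
    """Map each sample to its best-matching unit on the output grid."""
    return [np.unravel_index(np.argmin(np.linalg.norm(W - x, axis=2)), W.shape[:2])
            for x in X]
```

    After training, samples from distinct process states land on different regions of the map, which is what makes the fault type visualizable.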

  17. Evaluation of Automated and Manual Commercial DNA Extraction Methods for Recovery of Brucella DNA from Suspensions and Spiked Swabs

    OpenAIRE

    Dauphin, Leslie A.; Hutchins, Rebecca J.; Bost, Liberty A.; Bowen, Michael D.

    2009-01-01

    This study evaluated automated and manual commercial DNA extraction methods for their ability to recover DNA from Brucella species in phosphate-buffered saline (PBS) suspension and from spiked swab specimens. Six extraction methods, representing several of the methodologies which are commercially available for DNA extraction, as well as representing various throughput capacities, were evaluated: the MagNA Pure Compact and the MagNA Pure LC instruments, the IT 1-2-3 DNA sample purification kit...

  18. The BUME method: a novel automated chloroform-free 96-well total lipid extraction method for blood plasma

    OpenAIRE

    Löfgren, Lars; Ståhlman, Marcus; Forsberg, Gun-Britt; Saarinen, Sinikka; Nilsson, Ralf; Göran I Hansson

    2012-01-01

    Lipid extraction from biological samples is a critical and often tedious preanalytical step in lipid research. Primarily on the basis of automation criteria, we have developed the BUME method, a novel chloroform-free total lipid extraction method for blood plasma compatible with standard 96-well robots. In only 60 min, 96 samples can be automatically extracted with lipid profiles of commonly analyzed lipid classes almost identically and with absolute recoveries similar or better to what is ob...

  19. A Novel Characteristic Frequency Bands Extraction Method for Automatic Bearing Fault Diagnosis Based on Hilbert Huang Transform

    Directory of Open Access Journals (Sweden)

    Xiao Yu

    2015-11-01

    Full Text Available Because roller element bearings (REBs) failures cause unexpected machinery breakdowns, their fault diagnosis has attracted considerable research attention. Established fault feature extraction methods focus on statistical characteristics of the vibration signal, an approach that loses sight of the continuous waveform features. Considering this weakness, this article proposes a novel feature extraction method for frequency bands, named Window Marginal Spectrum Clustering (WMSC), to select salient features from the marginal spectrum of vibration signals by the Hilbert-Huang Transform (HHT). In WMSC, a sliding window is used to divide an entire HHT marginal spectrum (HMS) into window spectrums, following which the Rand Index (RI) criterion of the clustering method is used to evaluate each window. The windows returning higher RI values are selected to construct characteristic frequency bands (CFBs). Next, a hybrid REBs fault diagnosis scheme is constructed, termed after its elements HHT-WMSC-SVM (support vector machines). The effectiveness of HHT-WMSC-SVM is validated by running a series of experiments on REBs defect datasets from the Bearing Data Center of Case Western Reserve University (CWRU). The test results demonstrate three major advantages of the novel method. First, the fault classification accuracy of the HHT-WMSC-SVM model is higher than that of HHT-SVM and ST-SVM, a method that combines statistical characteristics with SVM. Second, with Gauss white noise added to the original REBs defect dataset, the HHT-WMSC-SVM model maintains high classification accuracy, while the classification accuracy of the ST-SVM and HHT-SVM models is significantly reduced. Third, fault classification accuracy by HHT-WMSC-SVM can exceed 95% under a Pmin range of 500-800 and an m range of 50-300 for the REBs defect dataset, adding Gauss white noise at Signal Noise Ratio (SNR) = 5. Experimental results indicate that the proposed WMSC method yields a high REBs fault ...
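
    Two of the building blocks named above are easy to show concretely: splitting a marginal spectrum into sliding windows, and the (unadjusted) Rand index that scores how well a clustering over a window's features agrees with a reference partition. The HHT itself, the clustering step and the SVM are omitted; the window length and step here are arbitrary.

```python
from itertools import combinations

def rand_index(labels_a, labels_b):
    """Fraction of sample pairs on which two partitions agree
    (both place the pair together, or both place it apart)."""
    pairs = list(combinations(range(len(labels_a)), 2))
    agree = 0
    for i, j in pairs:
        same_a = labels_a[i] == labels_a[j]
        same_b = labels_b[i] == labels_b[j]
        agree += same_a == same_b
    return agree / len(pairs)

def window_spectra(spectrum, win, step):
    """Slide a window over a marginal spectrum, yielding (start, band) pairs;
    each band is a candidate characteristic frequency band."""
    return [(s, spectrum[s:s + win])
            for s in range(0, len(spectrum) - win + 1, step)]
```

    Windows whose clustering matches the known fault partition (high RI) would be kept as CFBs and their features passed to the classifier.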

  20. An energy minimization approach to automated extraction of regular building footprints from airborne LiDAR data

    OpenAIRE

    He, Y; Zhang, C; Fraser, C. S.

    2014-01-01

    This paper presents an automated approach to the extraction of building footprints from airborne LiDAR data based on energy minimization. Automated 3D building reconstruction in complex urban scenes has been a long-standing challenge in photogrammetry and computer vision. Building footprints constitute a fundamental component of a 3D building model and they are useful for a variety of applications. Airborne LiDAR provides large-scale elevation representation of urban scene and as suc...

  1. Automating the Extraction of Metadata from Archaeological Data Using iRods Rules

    Directory of Open Access Journals (Sweden)

    David Walling

    2011-10-01

    Full Text Available The Texas Advanced Computing Center and the Institute for Classical Archaeology at the University of Texas at Austin developed a method that uses iRods rules and a Jython script to automate the extraction of metadata from digital archaeological data. The first step was to create a record-keeping system to classify the data. The record-keeping system employs file and directory hierarchy naming conventions designed specifically to maintain the relationship between the data objects and map the archaeological documentation process. The metadata implicit in the record-keeping system is automatically extracted upon ingest, combined with additional sources of metadata, and stored alongside the data in the iRods preservation environment. This method enables a more organized workflow for the researchers, helps them archive their data close to the moment of data creation, and avoids error prone manual metadata input. We describe the types of metadata extracted and provide technical details of the extraction process and storage of the data and metadata.
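
    The record-keeping idea, metadata implicit in file and directory naming conventions, can be sketched with a toy parser. The convention below (project/site/trench/type_id.ext) is invented for illustration; the actual iRods rules and the project's real naming scheme are not given in the abstract.

```python
import re

# Hypothetical naming convention (illustration only):
#   <project>/<site>/trench<NN>/<doctype>_<objid>.<ext>
PATTERN = re.compile(
    r"(?P<project>[^/]+)/(?P<site>[^/]+)/trench(?P<trench>\d+)/"
    r"(?P<doctype>[a-z]+)_(?P<objid>\d+)\.(?P<ext>\w+)$")

def extract_metadata(path):
    """Return the metadata implicit in a conforming path, else None."""
    m = PATTERN.search(path)
    return m.groupdict() if m else None
```

    On ingest, such a parser would run automatically and the resulting key-value pairs would be stored alongside the object, avoiding error-prone manual metadata input.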

  2. Automated Extraction Of Associations Between Methylated Genes and Diseases From Biomedical Literature

    KAUST Repository

    Bin Res, Arwa A.

    2012-12-01

    Associations between methylated genes and diseases have been investigated in several studies, and it is critical to have such information available for better understanding of diseases and clinical decisions. However, such information is scattered across a large number of electronic publications and it is difficult to search for it manually. Therefore, the goal of the project is to develop a machine learning model that can efficiently extract such information. Twelve machine learning algorithms were applied and compared in application to this problem based on three approaches that involve: document-term frequency matrices, position weight matrices, and a hybrid approach that uses the combination of the previous two. The best results were obtained with the hybrid approach using a random forest model that, in a 10-fold cross-validation, achieved F-score and accuracy of nearly 85% and 84%, respectively. On a completely separate testing set, F-score and accuracy of 89% and 88%, respectively, were obtained. Based on this model, we developed a tool that automates the extraction of associations between methylated genes and diseases from electronic text. Our study contributed an efficient method for extracting specific types of associations from free text and the methodology developed here can be extended to other similar association extraction problems.
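
    The first of the three feature representations, the document-term frequency matrix, is simple to construct; a stdlib-only sketch is shown below. The position-weight-matrix and hybrid features, and the random forest classifier itself, are left out, and tokenization here is naive whitespace splitting rather than whatever the authors used.

```python
from collections import Counter

def doc_term_matrix(docs):
    """Build a document-term frequency matrix over a shared, sorted vocabulary.
    Rows correspond to documents, columns to vocabulary terms."""
    tokenized = [d.lower().split() for d in docs]
    vocab = sorted({t for doc in tokenized for t in doc})
    rows = []
    for doc in tokenized:
        counts = Counter(doc)
        rows.append([counts.get(t, 0) for t in vocab])
    return vocab, rows
```

    Each row would then serve as the feature vector for one sentence or abstract when training the association classifier.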

  3. Automated Extraction of the Archaeological Tops of Qanat Shafts from VHR Imagery in Google Earth

    Directory of Open Access Journals (Sweden)

    Lei Luo

    2014-12-01

    Full Text Available Qanats in northern Xinjiang of China provide valuable information for agriculturists and anthropologists who seek a fundamental understanding of the distribution of qanat water supply systems with regard to water resource utilization, the development of oasis agriculture, and eventually climate change. Only the tops of qanat shafts (TQSs), indicating the course of the qanats, can be observed from space, and their circular archaeological traces can also be seen in very high resolution imagery in Google Earth. The small size of the TQSs, vast search regions, and degraded features make manually extracting them from remote sensing images difficult and costly. This paper proposes an automated TQS extraction method that applies mathematical morphological processing before an edge detection module within the circular Hough transform approach. The accuracy assessment criteria for the proposed method are: (i) extraction percentage (E = 95.9%), branch factor (B = 0) and quality percentage (Q = 95.9%) in Site 1; and (ii) extraction percentage (E = 83.4%), branch factor (B = 0.058) and quality percentage (Q = 79.5%) in Site 2. Compared with the standard circular Hough transform, the quality percentages (Q) of our proposed method improved from 86.3% and 65.8% to 95.9% and 79.5% in test sites 1 and 2, respectively. The results demonstrate that wide-area discovery and mapping can be performed much more effectively with the proposed method.
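The three criteria match the standard completeness/branch-factor/quality definitions used in linear- and point-feature extraction. Assuming those definitions (E = TP/(TP+FN), B = FP/TP, Q = TP/(TP+FP+FN)), they can be computed as:

```python
def extraction_metrics(tp, fp, fn):
    """Extraction percentage E, branch factor B and quality percentage Q
    from true-positive, false-positive and missed-detection counts."""
    E = 100.0 * tp / (tp + fn)       # completeness w.r.t. the reference set
    B = fp / tp                      # false extractions per true extraction
    Q = 100.0 * tp / (tp + fp + fn)  # overall quality
    return E, B, Q
```

Plugging in counts consistent with Site 2 (e.g. 1000 true positives, 58 false positives, 199 misses) reproduces E = 83.4%, B = 0.058 and Q close to 79.5%, matching the figures in the abstract.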

  4. A novel validation algorithm allows for automated cell tracking and the extraction of biologically meaningful parameters.

    Science.gov (United States)

    Rapoport, Daniel H; Becker, Tim; Madany Mamlouk, Amir; Schicktanz, Simone; Kruse, Charli

    2011-01-01

    Automated microscopy is currently the only method for non-invasive, label-free observation of complex multi-cellular processes such as cell migration, the cell cycle, and cell differentiation. Extracting biological information from a time series of micrographs requires each cell to be recognized and followed through sequential microscopic snapshots. Although recent attempts to automate this process have resulted in ever-improving cell detection rates, manual identification of identical cells is still the most reliable technique. However, its tedious and subjective nature has prevented tracking from becoming a standardized tool for the investigation of cell cultures. Here, we present a novel method to accomplish automated cell tracking with a reliability comparable to manual tracking. Previously, automated cell tracking could not rival the reliability of manual tracking because, in contrast to the human way of solving this task, none of the algorithms had an independent quality control mechanism: they lacked validation. Thus, instead of trying to improve the cell detection or tracking rates, we proceeded from the idea of automatically inspecting the tracking results and accepting only those of high trustworthiness, while rejecting all other results. This validation algorithm works independently of the quality of cell detection and tracking through a systematic search for tracking errors. It is based only on very general assumptions about the spatiotemporal contiguity of cell paths. While traditional tracking often aims to yield genealogic information about single cells, the natural outcome of a validated cell tracking algorithm turns out to be a set of complete, but often unconnected, cell paths, i.e. records of cells from mitosis to mitosis. This is a consequence of the fact that the validation algorithm takes complete paths as the unit of rejection/acceptance. The resulting set of complete paths can be used to automatically extract important biological parameters.
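The "spatiotemporal contiguity" assumption can be illustrated with a toy path check; the tuple format, frame rule and displacement threshold below are hypothetical stand-ins, not the paper's actual criteria:

```python
def is_valid_path(path, max_step=25.0):
    """Accept a candidate cell path only if it is spatiotemporally
    contiguous: frames strictly consecutive and displacement bounded.
    path: list of (frame, x, y) tuples. Illustrative only."""
    for (t0, x0, y0), (t1, x1, y1) in zip(path, path[1:]):
        if t1 != t0 + 1:          # missing frame -> reject the whole path
            return False
        if ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 > max_step:
            return False          # implausible jump -> reject
    return True
```

The key design point carried over from the paper is that rejection applies to the complete path, never to individual segments.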

  5. A novel validation algorithm allows for automated cell tracking and the extraction of biologically meaningful parameters.

    Directory of Open Access Journals (Sweden)

    Daniel H Rapoport

    Full Text Available Automated microscopy is currently the only method for non-invasive, label-free observation of complex multi-cellular processes such as cell migration, the cell cycle, and cell differentiation. Extracting biological information from a time series of micrographs requires each cell to be recognized and followed through sequential microscopic snapshots. Although recent attempts to automate this process have resulted in ever-improving cell detection rates, manual identification of identical cells is still the most reliable technique. However, its tedious and subjective nature has prevented tracking from becoming a standardized tool for the investigation of cell cultures. Here, we present a novel method to accomplish automated cell tracking with a reliability comparable to manual tracking. Previously, automated cell tracking could not rival the reliability of manual tracking because, in contrast to the human way of solving this task, none of the algorithms had an independent quality control mechanism: they lacked validation. Thus, instead of trying to improve the cell detection or tracking rates, we proceeded from the idea of automatically inspecting the tracking results and accepting only those of high trustworthiness, while rejecting all other results. This validation algorithm works independently of the quality of cell detection and tracking through a systematic search for tracking errors. It is based only on very general assumptions about the spatiotemporal contiguity of cell paths. While traditional tracking often aims to yield genealogic information about single cells, the natural outcome of a validated cell tracking algorithm turns out to be a set of complete, but often unconnected, cell paths, i.e. records of cells from mitosis to mitosis. This is a consequence of the fact that the validation algorithm takes complete paths as the unit of rejection/acceptance. The resulting set of complete paths can be used to automatically extract important biological parameters.

  6. Feature extraction and recognition for rolling element bearing fault utilizing short-time Fourier transform and non-negative matrix factorization

    Science.gov (United States)

    Gao, Huizhong; Liang, Lin; Chen, Xiaoguang; Xu, Guanghua

    2015-01-01

    Due to the non-stationary characteristics of vibration signals acquired from rolling element bearing faults, time-frequency analysis is often applied to describe the local information of these unstable signals. However, the resulting high-dimensional feature matrix is difficult to classify directly, because its dimensionality is too large for many classifiers. This paper combines the concepts of time-frequency distribution (TFD) and non-negative matrix factorization (NMF), and proposes a novel TFD matrix factorization method to enhance the representation and identification of bearing faults. In this method, the TFD of a vibration signal is first computed with the short-time Fourier transform (STFT) to describe the localized faults. Then, a supervised NMF mapping is adopted to extract the fault features from the TFD. Meanwhile, the fault samples can be clustered and recognized automatically by using the clustering property of NMF. The proposed method takes advantage of NMF's parts-based representation and adaptive clustering, and the localized fault features of interest can be extracted as well. To evaluate the performance of the proposed method, experiments on nine kinds of bearing faults were performed on a test bench. The proposed method can effectively identify fault severity and different fault types. Moreover, in comparison with an artificial neural network (ANN), NMF yields a mean accuracy of 99.3%, much higher than that of the ANN. This research presents a simple and practical solution to the fault diagnosis problem of rolling element bearings in high-dimensional feature spaces.
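The STFT stage can be sketched in a few lines as a Hann-windowed magnitude spectrogram. This is a naive O(N²) DFT for illustration only; a real implementation would call an FFT library, and the window/hop sizes here are arbitrary:

```python
import cmath, math

def stft_magnitude(signal, win_len=64, hop=32):
    """Hann-windowed short-time Fourier transform magnitudes (naive DFT).
    Returns a list of frames, each a list of win_len//2 bin magnitudes."""
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        # apply a Hann window to the current segment
        seg = [signal[start + n] * (0.5 - 0.5 * math.cos(2 * math.pi * n / win_len))
               for n in range(win_len)]
        # magnitude of each positive-frequency DFT bin
        frames.append([abs(sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win_len)
                               for n in range(win_len)))
                       for k in range(win_len // 2)])
    return frames  # time x frequency magnitude matrix
```

The resulting time-frequency matrix is what the supervised NMF step would then factorize into parts-based fault features.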

  7. Feature Extraction and Recognition for Rolling Element Bearing Fault Utilizing Short-Time Fourier Transform and Non-negative Matrix Factorization

    Institute of Scientific and Technical Information of China (English)

    GAO Huizhong; LIANG Lin; CHEN Xiaoguang; XU Guanghua

    2015-01-01

    Due to the non-stationary characteristics of vibration signals acquired from rolling element bearing faults, time-frequency analysis is often applied to describe the local information of these unstable signals. However, the resulting high-dimensional feature matrix is difficult to classify directly, because its dimensionality is too large for many classifiers. This paper combines the concepts of time-frequency distribution (TFD) and non-negative matrix factorization (NMF), and proposes a novel TFD matrix factorization method to enhance the representation and identification of bearing faults. In this method, the TFD of a vibration signal is first computed with the short-time Fourier transform (STFT) to describe the localized faults. Then, a supervised NMF mapping is adopted to extract the fault features from the TFD. Meanwhile, the fault samples can be clustered and recognized automatically by using the clustering property of NMF. The proposed method takes advantage of NMF's parts-based representation and adaptive clustering, and the localized fault features of interest can be extracted as well. To evaluate the performance of the proposed method, experiments on nine kinds of bearing faults were performed on a test bench. The proposed method can effectively identify fault severity and different fault types. Moreover, in comparison with an artificial neural network (ANN), NMF yields a mean accuracy of 99.3%, much higher than that of the ANN. This research presents a simple and practical solution to the fault diagnosis problem of rolling element bearings in high-dimensional feature spaces.

  8. Structure and organization of automation subsystem for control of beam extraction from a fast-cycling synchrotron

    International Nuclear Information System (INIS)

    The status of development of an automation subsystem for control of beam extraction from the Erevan synchrotron is described. The hardware complex of the subsystem contains the RPT-80 microcomputer, seven units of automated control for the beam extraction channel, a timer unit for synchronization of the accelerator output devices, a unit for monitoring status signals, an ADS, an interface with the synchrotron, and a commutation line between the RPT-80 and the host ES1010 computer. As a result of pilot operation, the beam energy spread instability has been reduced by a factor of 15. 5 refs.; 1 fig.

  9. Automated multisyringe stir bar sorptive extraction using robust montmorillonite/epoxy-coated stir bars.

    Science.gov (United States)

    Ghani, Milad; Saraji, Mohammad; Maya, Fernando; Cerdà, Víctor

    2016-05-01

    Herein we present a simple, rapid and low-cost strategy for the preparation of robust stir bar coatings based on the combination of montmorillonite with epoxy resin. The composite stir bar was implemented in a novel automated multisyringe stir bar sorptive extraction system (MS-SBSE), and applied to the extraction of four chlorophenols (4-chlorophenol, 2,4-dichlorophenol, 2,4,6-trichlorophenol and pentachlorophenol) as model compounds, followed by high performance liquid chromatography-diode array detection. The different experimental parameters of the MS-SBSE, such as sample volume, selection of the desorption solvent, desorption volume, desorption time, sample solution pH, salt effect and extraction time, were studied. Under the optimum conditions, the detection limits were between 0.02 and 0.34 μg L(-1). Relative standard deviations (RSD) of the method for the analytes ranged from 3.5% to 4.1% (intra-day RSD, at the 10 μg L(-1) concentration level) and from 3.9% to 4.3% (inter-day RSD, at the 50 μg L(-1) concentration level). Batch-to-batch reproducibility for three different stir bars was 4.6-5.1%. The enrichment factors were between 30 and 49. In order to investigate the capability of the developed technique for real sample analysis, well water, wastewater and leachates from a solid waste treatment plant were satisfactorily analyzed. PMID:27062720

  10. Automated extraction and classification of time-frequency contours in humpback vocalizations.

    Science.gov (United States)

    Ou, Hui; Au, Whitlow W L; Zurk, Lisa M; Lammers, Marc O

    2013-01-01

    A time-frequency contour extraction and classification algorithm was created to analyze humpback whale vocalizations. The algorithm automatically extracted contours of whale vocalization units by searching for gray-level discontinuities in the spectrogram images. The unit-to-unit similarity was quantified by cross-correlating the contour lines. A library of distinctive humpback units was then generated by applying an unsupervised, cluster-based learning algorithm. The purpose of this study was to provide a fast and automated feature selection tool to describe the vocal signatures of animal groups. This approach could benefit a variety of applications such as species description, identification, and the evolution of song structures. The algorithm was tested on humpback whale song data recorded at various locations in Hawaii from 2002 to 2003. Results presented in this paper showed a low probability of false alarm (0%-4%) under noisy environments with small boats and snapping shrimp. The classification algorithm was tested on a controlled set of 30 units forming six unit types, and all the units were correctly classified. In a case study on humpback data collected in the Auau Channel, Hawaii, in 2002, the algorithm extracted 951 units, which were classified into 12 distinctive types. PMID:23297903

  11. Automated DEM extraction in digital aerial photogrammetry: precisions and validation for mass movement monitoring

    Directory of Open Access Journals (Sweden)

    A. Pesci

    2005-06-01

    Full Text Available Automated procedures for photogrammetric image processing and Digital Elevation Model (DEM) extraction yield high-precision terrain models in a short time, reducing manual editing; their accuracy is strictly related to image quality and terrain features. After an analysis of the performance of the Digital Photogrammetric Workstation (DPW 770 Helava), the paper compares DEMs derived from different surveys and registered in the same reference system. In the case of stable areas, the distribution of height residuals and their mean and standard deviation values indicate that the theoretical accuracy is achievable automatically when the terrain is characterized by regular morphology. Steep slopes, corrugated surfaces, vegetation and shadows can degrade results even if manual editing procedures are applied. The comparison of multi-temporal DEMs on unstable areas allows the monitoring of surface deformation and morphological changes.

  12. Abbott RealTime HIV-1 m2000rt viral load testing: Manual extraction versus the automated m2000sp extraction

    OpenAIRE

    Scott, Lesley E.; Crump, John A.; Msuya, Emma; Morrissey, Anne B.; Venter, Willem F.; Stevens, Wendy S.

    2010-01-01

    The Abbott RealTime HIV-1 assay is a real-time nucleic acid amplification assay available for HIV-1 viral load quantitation. The assay has a platform for automated extraction of viral RNA from plasma or dried blood spot samples, and an amplification platform with real-time fluorescent detection. Overall, this study found no clinically relevant differences in viral load between manually and automatically extracted samples.

  13. Automated extraction of precise protein expression patterns in lymphoma by text mining abstracts of immunohistochemical studies

    Directory of Open Access Journals (Sweden)

    Jia-Fu Chang

    2013-01-01

    Full Text Available Background: In general, surgical pathology reviews report protein expression by tumors in a semi-quantitative manner, that is, -, -/+, +/-, +. At the same time, the experimental pathology literature provides multiple examples of precise expression levels determined by immunohistochemical (IHC) tissue examination of populations of tumors. Natural language processing (NLP) techniques enable the automated extraction of such information through text mining. We propose establishing a database linking quantitative protein expression levels with specific tumor classifications through NLP. Materials and Methods: Our method takes advantage of typical forms of representing experimental findings in terms of percentages of protein expression manifest by the tumor population under study. Characteristically, percentages are represented straightforwardly with the % symbol or as the number of positive findings out of the total population. Such text is readily recognized using regular expressions and templates, permitting extraction of sentences containing these forms for further analysis using grammatical structures and rule-based algorithms. Results: Our pilot study is limited to the extraction of such information related to lymphomas. We achieved a satisfactory level of retrieval, as reflected in scores of 69.91% precision and 57.25% recall, with an F-score of 62.95%. In addition, we demonstrate the utility of a web-based curation tool for confirming and correcting our findings. Conclusions: The experimental pathology literature represents a rich source of pathobiological information, which has been relatively underutilized. There has been a combinatorial explosion of knowledge within the pathology domain, as represented by increasing numbers of immunophenotypes and disease subclassifications. NLP techniques support practical text mining for extracting this knowledge and organizing it in forms appropriate for pathology decision support systems.
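The two surface forms described, an explicit "%" figure and an "N of M" count, are the regular part of the pipeline and can be captured with plain regular expressions. The patterns and function below are illustrative stand-ins, not the paper's actual templates:

```python
import re

# "85%" or "63.5 %" style figures
PCT = re.compile(r'(\d+(?:\.\d+)?)\s*%')
# "17 of 20" / "17 of the 20" style positive-count expressions
RATIO = re.compile(r'(\d+)\s+of\s+(?:the\s+)?(\d+)')

def expression_percentages(sentence):
    """Pull candidate expression-level figures from an abstract sentence,
    normalizing 'N of M' counts to percentages."""
    values = [float(m) for m in PCT.findall(sentence)]
    values += [100.0 * int(a) / int(b) for a, b in RATIO.findall(sentence)]
    return values
```

In the actual system, sentences matched this way would then pass to the grammatical and rule-based stages to link each figure to a protein and a tumor class.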

  14. Automated concept-level information extraction to reduce the need for custom software and rules development

    Science.gov (United States)

    Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D

    2011-01-01

    Objective Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. Materials and methods A ‘learn by example’ approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Results Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks. Discussion With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach for more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Conclusion Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation is available for download. PMID:21697292
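For reference, the F-measure reported here is the harmonic mean of precision and recall. With precision at the reported 0.90 and recall around 0.77 (an illustrative value, since exact recall is not quoted), F works out to about 0.83:

```python
def f_measure(precision, recall):
    """F1 score: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)
```

Because the harmonic mean is dominated by the smaller operand, the lagging recall noted in the abstract caps the F-measure even when precision is near 0.90.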

  15. Feature Extraction Method for High Impedance Ground Fault Localization in Radial Power Distribution Networks

    DEFF Research Database (Denmark)

    Jensen, Kåre Jean; Munk, Steen M.; Sørensen, John Aasted

    1998-01-01

    A new approach to the localization of high impedance ground faults in compensated radial power distribution networks is presented. The total size of such networks is often very large, and a major part of their monitoring is carried out manually. The increasing complexity of industrial processes and communication systems leads to demands for improved monitoring of power distribution networks so that the quality of power delivery can be kept at a controlled level. The ground fault localization method for each feeder in a network is based on the centralized frequency broadband measurement of three-phase voltages and currents. The method consists of a feature extractor, based on a grid description of the feeder by impulse responses, and a neural network for ground fault localization. The emphasis of this paper is the feature extractor and the detection of the time instance of a ground fault...

  16. A technological system for fully automated extraction of quite thin flat strata with partial back filling of the worked space

    Energy Technology Data Exchange (ETDEWEB)

    Sapitskiy, K.F.; Bondarenko, Yu.V.; Gomal, I.I.; Ivashchenko, V.D.; Nosach, A.K.

    1983-01-01

    A new system is proposed for fully automated extraction of very thin flat strata with control of the roof by rock rubble strips. Results of test-bench-based analytical investigations of a backfilling scraper grader are cited. The additional tractive force required for forming the rubble strips is determined.

  17. A Wavelet Based Multiscale Weighted Permutation Entropy Method for Sensor Fault Feature Extraction and Identification

    OpenAIRE

    Qiaoning Yang; Jianlin Wang

    2016-01-01

    Sensors are the core modules in signal perception and measurement applications. Owing to harsh external environments, aging, and other factors, sensors are prone to failure and unreliability. In this paper, three kinds of common single-sensor faults, bias, drift, and stuck-at, are investigated, and a fault diagnosis method based on wavelet permutation entropy is proposed. It takes advantage of the multiresolution ability of the wavelet transform and the internal structure complexity measure of permutation entropy.
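The core quantity, permutation entropy, is well defined independently of this paper: the Shannon entropy of the ordinal patterns occurring in the signal. A minimal single-scale sketch is below; the wavelet-based multiscale weighted variant adds a wavelet decomposition and pattern weighting on top of this:

```python
import math
from collections import Counter

def permutation_entropy(x, m=3, normalize=True):
    """Shannon entropy of length-m ordinal patterns of a 1-D signal
    (ties broken by index). Normalized to [0, 1] by log2(m!)."""
    patterns = Counter(
        tuple(sorted(range(m), key=lambda i: x[t + i]))
        for t in range(len(x) - m + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log2(c / total) for c in patterns.values())
    return h / math.log2(math.factorial(m)) if normalize else h
```

A stuck-at sensor produces a constant (hence perfectly ordered) output with entropy near zero, while a healthy noisy signal yields a higher value, which is the structural contrast the diagnosis exploits.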

  18. Development of an automated sequential injection on-line solvent extraction-back extraction procedure as demonstrated for the determination of cadmium with detection by electrothermal atomic absorption spectrometry

    DEFF Research Database (Denmark)

    Wang, Jianhua; Hansen, Elo Harald

    2002-01-01

    An automated sequential injection (SI) on-line solvent extraction-back extraction separation/preconcentration procedure is described. Demonstrated for the assay of cadmium by electrothermal atomic absorption spectrometry (ETAAS), the analyte is initially complexed with ammonium...

  19. Automated data extraction from in situ protein stable isotope probing studies

    Energy Technology Data Exchange (ETDEWEB)

    Slysz, Gordon W.; Steinke, Laurey A.; Ward, David M.; Klatt, Christian G.; Clauss, Therese RW; Purvine, Samuel O.; Payne, Samuel H.; Anderson, Gordon A.; Smith, Richard D.; Lipton, Mary S.

    2014-01-27

    Protein stable isotope probing (protein-SIP) has strong potential for revealing key metabolizing taxa in complex microbial communities. While most protein-SIP work to date has been performed under controlled laboratory conditions to allow extensive isotope labeling of the target organism, a key application will be in situ studies of microbial communities under conditions that result in small degrees of partial labeling. One hurdle restricting large scale in situ protein-SIP studies is the lack of algorithms and software for automated data processing of the massive data sets resulting from such studies. In response, we developed Stable Isotope Probing Protein Extraction Resources software (SIPPER) and applied it for large scale extraction and visualization of data from short term (3 h) protein-SIP experiments performed in situ on Yellowstone phototrophic bacterial mats. Several metrics incorporated into the software allow it to support exhaustive analysis of the complex composite isotopic envelope observed as a result of low amounts of partial label incorporation. SIPPER also enables the detection of labeled molecular species without the need for any prior identification.

  20. Streamlining DNA barcoding protocols: automated DNA extraction and a new cox1 primer in arachnid systematics.

    Directory of Open Access Journals (Sweden)

    Nina Vidergar

    Full Text Available BACKGROUND: DNA barcoding is a popular tool in taxonomic and phylogenetic studies, but for most animal lineages protocols for obtaining the barcoding sequence--mitochondrial cytochrome c oxidase subunit I (cox1, also known as CO1)--are not standardized. Our aim was to explore an optimal strategy for arachnids, focusing on the most species-rich lineage, spiders, by (1) improving an automated DNA extraction protocol, (2) testing the performance of commonly used primer combinations, and (3) developing a new cox1 primer suitable for more efficient alignment and phylogenetic analyses. METHODOLOGY: We used exemplars of 15 species from all major spider clades, processed a range of spider tissues of varying size and quality, optimized genomic DNA extraction using the MagMAX Express magnetic particle processor--an automated high-throughput DNA extraction system--and tested cox1 amplification protocols emphasizing the standard barcoding region using ten routinely employed primer pairs. RESULTS: The best results were obtained with the commonly used Folmer primers (LCO1490/HCO2198) that capture the standard barcode region, and with the C1-J-2183/C1-N-2776 primer pair that amplifies its extension. However, C1-J-2183 is designed too close to HCO2198 for well-interpreted, continuous sequence data, and in practice the resulting sequences from the two primer pairs rarely overlap. We therefore designed a new forward primer, C1-J-2123, 60 base pairs upstream of the C1-J-2183 binding site. The success rate of this new primer (93%) matched that of C1-J-2183. CONCLUSIONS: The use of C1-J-2123 allows full, indel-free overlap of sequences obtained with the standard Folmer primers and with the C1-J-2123 primer pair. Our preliminary tests suggest that in addition to spiders, C1-J-2123 will also perform well in other arachnids and several other invertebrates. We provide optimal PCR protocols for these primer sets, and recommend using them for systematic efforts beyond DNA barcoding.

  1. Automated Extraction and Mapping for Desert Wadis from Landsat Imagery in Arid West Asia

    Directory of Open Access Journals (Sweden)

    Yongxue Liu

    2016-03-01

    Full Text Available Wadis, ephemeral dry rivers in arid desert regions that contain water only in the rainy season, are often manifested as braided linear channels and are of vital importance for local hydrological environments and regional hydrological management. Conventional methods for effectively delineating wadis from heterogeneous backgrounds are limited for the following reasons: (1) the occurrence of numerous morphological irregularities, which disqualifies methods based on physical shape; (2) inconspicuous spectral contrast with backgrounds, resulting in frequent false alarms; and (3) the extreme complexity of wadi systems, with numerous tiny tributaries characterized by spectral anisotropy, resulting in a conflict between global and local accuracy. To overcome these difficulties, an automated method for extracting wadis (AMEW) from Landsat-8 Operational Land Imagery (OLI) was developed to take advantage of the complementarity between Water Indices (WIs), a technique of mathematically combining different bands to enhance water bodies and suppress backgrounds, and morphological image processing involving multi-scale Gaussian matched filtering and local adaptive threshold segmentation. Evaluation of the AMEW was carried out in representative areas deliberately selected from Jordan, SW Arabian Peninsula, in order to ensure a rigorous assessment. Experimental results indicate that the AMEW achieved considerably higher accuracy than other effective extraction methods in terms of visual inspection and statistical comparison, with an overall accuracy of up to 95.05% for the entire area. In addition, the AMEW (based on the New Water Index (NWI)) achieved higher accuracy than other methods (the maximum likelihood classifier and the support vector machine classifier) used for bulk wadi extraction.
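The local adaptive threshold segmentation step can be sketched as a mean-comparison binarization: each pixel is compared to the mean of its local window rather than to one global cutoff. Window size and offset below are arbitrary illustrative choices, not the AMEW's actual parameters:

```python
def local_adaptive_threshold(img, win=3, offset=0.0):
    """Binarize a 2-D image (list of rows) by comparing each pixel to
    the mean of its win x win neighborhood, clipped at the borders."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    r = win // 2
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - r), min(h, y + r + 1))
                    for xx in range(max(0, x - r), min(w, x + r + 1))]
            mean = sum(vals) / len(vals)
            out[y][x] = 1 if img[y][x] > mean + offset else 0
    return out
```

Because the threshold adapts to the neighborhood, thin bright channel responses survive even where the global brightness varies, which is exactly the global-versus-local accuracy conflict the method targets.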

  2. Design and development of an automated D.C. ground fault detection and location system for Cirus

    International Nuclear Information System (INIS)

    Full text: The original design of the Cirus safety system provided for automatic detection of ground faults in the class I D.C. power supply system and their annunciation, followed by a delayed reactor trip. Identification of a faulty section had to be done manually by switching off the various sections one at a time, thus requiring considerable shutdown time. Since the class I power supply feeds the safety control system, quick detection and location of ground faults in this supply is necessary, as these faults have the potential to bypass safety interlocks; hence the need for a new system for automatic location of a faulty section. Since such systems are not readily available in the market, in-house efforts were made to design and develop a plant-specific system, which has been installed and commissioned.

  3. Modeling interseismic deformation field of North Tehran Fault extracted from precise leveling observation

    Science.gov (United States)

    Amighpey, Masoome; Voosoghi, Behzad; Arabi, Siyavash

    2016-06-01

    The North Tehran Fault (NTF) stands out as a major active thrust fault running for approximately 110 km north of Tehran, the capital province of Iran. It has been the source of several major historical earthquakes, including those in 958, 1665, and 1830. In this paper, interseismic strain accumulation on the NTF was investigated using precise leveling measurements obtained over the time frame 1997-2005. The relationship between the surface deformation field and interseismic deformation models was evaluated using simulated annealing optimization in a Bayesian framework. The results show that the NTF follows an elastic dislocation model, creeping at a rate of 2.5 ± 0.06 mm/year in the eastern part and 6.2 ± 0.04 mm/year in the western part. Moreover, the locking depth of the fault was evaluated to be ± 1.1 km in the eastern part and 1.3 ± 0.2 km in the western part.

  4. Semi-automated procedures for shoreline extraction using single RADARSAT-1 SAR image

    Science.gov (United States)

    Al Fugura, A.'kif; Billa, Lawal; Pradhan, Biswajeet

    2011-12-01

    Coastline identification is important for surveying and mapping. The coastline serves as the basic point of reference and is used on nautical charts for navigation purposes. Its delineation has become crucial and more important in the wake of the many recent earthquakes and tsunamis, which have resulted in the complete change and redrawing of some shorelines. In a tropical country like Malaysia, the presence of cloud cover hinders the application of optical remote sensing data. In this study, a semi-automated technique and procedures are presented for shoreline delineation from RADARSAT-1 imagery. A scene of RADARSAT-1 satellite imagery was processed using an enhanced filtering technique to identify and extract the shoreline coast of Kuala Terengganu, Malaysia. RADARSAT imagery has many advantages over optical data because of its ability to penetrate cloud cover and its night sensing capabilities. First, speckle was removed from the image using a Lee sigma filter, which reduces random noise, enhances the image, and helps discriminate the boundary between land and water. The results showed an accurate and improved extraction and delineation of the entire coastline of Kuala Terengganu. The study demonstrated the reliability of the image-averaging filter in reducing random noise over the sea surface, especially near the shoreline. It enhanced land-water boundary differentiation, enabling better delineation of the shoreline. Overall, the developed techniques showed the potential of radar imagery for accurate shoreline mapping and will be useful for monitoring shoreline changes during high and low tides as well as shoreline erosion in a tropical country like Malaysia.
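The sigma-filter idea, averaging only those neighbors whose values are statistically consistent with the center pixel, can be sketched as follows. The window size and sigma value are illustrative, and production SAR despeckling uses the full Lee sigma formulation rather than this simplified range test:

```python
def lee_sigma_filter(img, win=3, sigma=0.2):
    """Simplified sigma filter for a 2-D image (list of rows): replace
    each pixel with the mean of window neighbors lying within +/- two
    sigma (relative, as for multiplicative speckle) of its value."""
    h, w = len(img), len(img[0])
    r = win // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            c = img[y][x]
            lo, hi = c * (1 - 2 * sigma), c * (1 + 2 * sigma)
            vals = [img[yy][xx]
                    for yy in range(max(0, y - r), min(h, y + r + 1))
                    for xx in range(max(0, x - r), min(w, x + r + 1))
                    if lo <= img[yy][xx] <= hi]
            out[y][x] = sum(vals) / len(vals) if vals else c
    return out
```

Excluding out-of-range neighbors is what preserves the land-water step edge while still averaging away speckle on either side of it.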

  5. Automated Tongue Feature Extraction for ZHENG Classification in Traditional Chinese Medicine

    Directory of Open Access Journals (Sweden)

    Ratchadaporn Kanawong

    2012-01-01

    Full Text Available ZHENG, the Traditional Chinese Medicine syndrome, is an integral and essential part of Traditional Chinese Medicine theory. It defines the theoretical abstraction of the symptom profiles of individual patients and is thus used as a guideline in disease classification in Chinese medicine. For example, patients suffering from gastritis may be classified as Cold or Hot ZHENG, whereas patients with different diseases may be classified under the same ZHENG. Tongue appearance is a valuable diagnostic tool for determining ZHENG in patients. In this paper, we explore new modalities for the clinical characterization of ZHENG using various supervised machine learning algorithms. We propose a novel color-space-based feature set, which can be extracted from tongue images of clinical patients to build an automated ZHENG classification system. Given that Chinese medical practitioners usually observe the tongue color and coating to determine a ZHENG type and to diagnose different stomach disorders including gastritis, we propose using machine-learning techniques to establish the relationship between the tongue image features and ZHENG by learning through examples. The experimental results obtained over a set of 263 gastritis patients, most of whom suffered from Cold or Hot ZHENG, and a control group of 48 healthy volunteers demonstrate the excellent performance of our proposed system.
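    As an illustration of the kind of color-space features such a system might feed to a supervised classifier (the paper's exact feature set is not reproduced here; the per-channel moments and chromatic ratios below are generic stand-ins), a minimal sketch:

```python
import numpy as np

def tongue_color_features(rgb):
    """Generic colour-statistics feature vector for a tongue image
    (H x W x 3 array, values in [0, 1]).  Returns per-channel mean and
    standard deviation plus two simple chromatic ratios."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-9
    feats = []
    for ch in (r, g, b):
        feats += [ch.mean(), ch.std()]
    feats.append((r / (g + eps)).mean())   # redness vs. greenness
    feats.append((b / (r + eps)).mean())   # bluish/purple tint indicator
    return np.array(feats)
```

    A vector like this would then be passed, together with expert-assigned ZHENG labels, to any standard supervised learner.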

  6. Neuron Image Analyzer: Automated and Accurate Extraction of Neuronal Data from Low Quality Images.

    Science.gov (United States)

    Kim, Kwang-Min; Son, Kilho; Palmore, G Tayhas R

    2015-01-01

    Image analysis software is an essential tool used in neuroscience and neural engineering to evaluate changes in neuronal structure following extracellular stimuli. Both manual and automated methods in current use are severely inadequate at detecting and quantifying changes in neuronal morphology when the images analyzed have a low signal-to-noise ratio (SNR). This inadequacy derives from the fact that these methods often include data from non-neuronal structures or artifacts by simply tracing pixels with high intensity. In this paper, we describe Neuron Image Analyzer (NIA), a novel algorithm that overcomes these inadequacies by employing a Laplacian of Gaussian filter and graphical models (i.e., a Hidden Markov Model and a Fully Connected Chain Model) to specifically extract relational pixel information corresponding to neuronal structures (i.e., soma, neurite). As such, NIA, which is based on a vector representation, is less likely to detect false signals (i.e., non-neuronal structures) or generate artifact signals (i.e., deformation of original structures) than current image analysis algorithms that are based on a raster representation. We demonstrate that NIA enables precise quantification of neuronal processes (e.g., length and orientation of neurites) in low quality images, with a significant increase in the accuracy of detecting neuronal changes post-stimulation. PMID:26593337
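    The Laplacian of Gaussian filtering step mentioned above can be illustrated with SciPy. This is only a sketch of the filter itself, not of NIA's graphical models; `soma_response` is a hypothetical helper name, and the sign flip simply makes bright blobs (soma-like structures) appear as maxima.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def soma_response(img, sigma=3.0):
    """Blob response via Laplacian of Gaussian: bright, roughly circular
    structures give strongly negative LoG values at their centres, so
    the sign is flipped to turn blobs into response maxima."""
    return -gaussian_laplace(img.astype(float), sigma=sigma)
```

    On a synthetic Gaussian blob, the response peaks exactly at the blob centre, which is what makes the filter useful for localizing soma-like structures in noisy images.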

  7. A multi-atlas based method for automated anatomical rat brain MRI segmentation and extraction of PET activity.

    Directory of Open Access Journals (Sweden)

    Sophie Lancelot

    Full Text Available INTRODUCTION: Preclinical in vivo imaging requires precise and reproducible delineation of brain structures. Manual segmentation is time consuming and operator dependent. Automated segmentation, as usually performed via single atlas registration, fails to account for anatomo-physiological variability. We present, evaluate, and make available a multi-atlas approach for automatically segmenting rat brain MRI and extracting PET activities. METHODS: High-resolution 7T 2DT2 MR images of 12 Sprague-Dawley rat brains were manually segmented into 27-VOI label volumes using detailed protocols. Automated methods were developed with 7/12 atlas datasets, i.e. the MRIs and their associated label volumes. MRIs were registered to a common space, where an MRI template and a maximum probability atlas were created. Three automated methods were tested: (1) registering individual MRIs to the template and using a single atlas (SA); (2) using the maximum probability atlas (MP); and (3) registering the MRIs from the multi-atlas dataset to an individual MRI, propagating the label volumes and fusing them in individual MRI space (propagation & fusion, PF). Evaluation was performed on the five remaining rats, which additionally underwent [18F]FDG PET. Automated and manual segmentations were compared for morphometric performance (assessed by comparing volume bias and Dice overlap index) and functional performance (evaluated by comparing extracted PET measures). RESULTS: Only the SA method showed volume bias. Dice indices were significantly different between methods (PF>MP>SA). PET regional measures were more accurate with multi-atlas methods than with the SA method. CONCLUSIONS: Multi-atlas methods outperform SA for automated anatomical brain segmentation and PET measure extraction. They perform comparably to manual segmentation for FDG-PET quantification. Multi-atlas methods are suitable for rapid, reproducible VOI analyses.
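    The fusion stage of the propagation & fusion (PF) idea can be sketched as a simple per-voxel majority vote over the label volumes propagated from the individual atlases. The published method's registration and fusion machinery is more sophisticated; this is only the voting core, under the assumption of integer label volumes already resampled into a common subject space.

```python
import numpy as np

def fuse_labels(label_volumes):
    """Majority-vote fusion: given several integer label volumes
    (one per atlas, identical shape), return the per-voxel winner."""
    stack = np.stack(label_volumes)          # shape (n_atlases, ...)
    n_labels = int(stack.max()) + 1
    # count votes per label at each voxel, then pick the winner
    counts = np.stack([(stack == l).sum(axis=0) for l in range(n_labels)])
    return counts.argmax(axis=0)
```

    Ties are resolved in favour of the lower label index, one of several reasonable conventions.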

  8. An Automated Graphical User Interface based System for the Extraction of Retinal Blood Vessels using Kirsch’s Template

    OpenAIRE

    Joshita Majumdar; Souvik Tewary; Shreyosi Chakraborty; Debasish Kundu; Sudipta Ghosh; Sauvik Das Gupta

    2015-01-01

    The assessment of blood vessel networks plays an important role in a variety of medical disorders. The diagnosis of Diabetic Retinopathy (DR) and its repercussions, including micro aneurysms, haemorrhages, hard exudates and cotton wool spots, is one such field. This study aims to develop an automated system for the extraction of blood vessels from retinal images by employing Kirsch's Templates in a MATLAB based Graphical User Interface (GUI). Here, an RGB or grey image of the retina (Fundus Phot...
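    Kirsch's templates are a classic family of eight 3x3 compass kernels; taking the maximum response over all eight orientations highlights edges such as vessel boundaries. A minimal sketch (the kernel values and ring-rotation scheme are the standard textbook ones; the paper's preprocessing, thresholding, and GUI layers are omitted):

```python
import numpy as np
from scipy.ndimage import convolve

def kirsch_edge_response(img):
    """Maximum response over the eight Kirsch compass templates.
    Each template places three 5s on one side of the 3x3 ring and
    -3s elsewhere; rotating the ring yields the eight orientations."""
    # border positions of a 3x3 kernel, listed clockwise
    ring_idx = [(0, 0), (0, 1), (0, 2), (1, 2),
                (2, 2), (2, 1), (2, 0), (1, 0)]
    base = [5, 5, 5, -3, -3, -3, -3, -3]
    responses = []
    for r in range(8):
        k = np.zeros((3, 3))
        for (i, j), v in zip(ring_idx, np.roll(base, r)):
            k[i, j] = v
        responses.append(convolve(img.astype(float), k, mode="nearest"))
    return np.max(responses, axis=0)
```

    Because every kernel's coefficients sum to zero, flat regions give zero response, while step edges (such as a vessel against the retinal background) respond strongly.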

  9. ESR studies on quartz extracted from shallow fault gouges related to the Ms 8.0 Wenchuan earthquake, China - implications for ESR signal resetting in Quaternary faults

    OpenAIRE

    Liu, Chun-Ru; Yin, Gong-Ming; Zhou, Yong-Sheng; Gao, Lu; Han, Fei; Li, Jian-ping

    2014-01-01

    ESR dating of the most recent fault activity through quartz signal measurement is based on the assumption that the ESR signal was reset to zero during the faulting event. However, several laboratory experiments have implied that only partial zeroing of quartz ESR signals is possible. In order to verify whether the signal resetting could be complete under natural conditions, we analyzed quartz recovered from fault gouges after the 2008 Ms 8.0 Wenchuan earthquake. The quartz E' and Al cent...

  10. Extraction of Citrus Hystrix D.C. (Kaffir Lime) Essential Oil Using Automated Steam Distillation Process: Analysis of Volatile Compounds

    International Nuclear Information System (INIS)

    An automated steam distillation was successfully used to extract volatiles from Citrus hystrix D.C. (Kaffir lime) peels. The automated steam distillation, integrated with robust temperature control, can commercially produce large amounts of essential oil with an efficient heating system. The objective of this study was to quantify the oil production rate of the automated steam distillation and to analyze the composition of volatiles in Kaffir lime peel oil under controlled and uncontrolled temperature conditions. In the experiments, oil extraction from Kaffir lime peels took less than 3 hours, and the oil yield was 13.4 % higher than under uncontrolled temperature. The identified major compounds of Kaffir lime peel oil were sabinene, β-pinene, limonene, α-pinene, camphene, myrcene, terpinen-4-ol, α-terpineol, linalool, terpinolene and citronellal, which are considered to have good organoleptic quality. In contrast, under uncontrolled temperature, oil analysis revealed that some important volatile compounds, such as terpinolene, linalool and terpinen-4-ol, were absent due to thermal degradation caused by fast heating of the extracted material. (author)

  11. CENTRIFUGAL LABTUBE FOR FULLY AUTOMATED DNA EXTRACTION & LAMP AMPLIFICATION BASED ON AN INTEGRATED, LOW-COST HEATING SYSTEM

    OpenAIRE

    Hoehl, Melanie Margarete; Weibert, Michael; Paust, Nils; Zengerle, Roland; Slocum, Alexander H.; Steigert, Juergen

    2013-01-01

    In this paper, we introduce a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA-extraction platform (LabTube). We demonstrate fully automated, fully closed extraction of as little as 100 DNA copies of verotoxin-producing (VTEC) Escherichia coli lysate in water, milk and apple juice in a standard laboratory centrifuge, followed by subsequent automatic LAMP amplification with an overall time-to-result of 1.5 hrs. The ...

  12. A New Object-Based Method for Automated Extraction of Urban Objects from Airborne Sensors Data

    Science.gov (United States)

    Moussa, A.; El-Sheimy, N.

    2012-07-01

    The classification of urban objects such as buildings, trees and roads from airborne sensor data is an essential step in numerous mapping and modelling applications. The automation of this step is greatly needed, as manual processing is costly and time consuming. The increasing availability of airborne sensor data such as aerial imagery and LIDAR data offers new opportunities to develop more robust approaches for automatic classification. These approaches should integrate data sources with different characteristics to exceed the accuracy achieved using any individual data source. The approach presented in this paper fuses aerial image data with single-return LIDAR data to extract buildings and trees for an urban area. Object-based analysis is adopted to segment the entire DSM data into objects based on height variation. These objects are preliminarily classified into buildings, trees, and ground. This primary classification is used to compute the height to ground for each object to help improve the accuracy of the second phase of classification. The overlapping perspective aerial images are used to build an ortho-photo and derive a vegetation index value for each object. The second phase of classification is performed based on the height to ground and the vegetation index of each object. The proposed approach has been tested using three areas in the centre of the city of Vaihingen provided by the ISPRS test project on urban classification and 3D building reconstruction. These areas contain historic buildings with rather complex shapes, a few high-rising residential buildings surrounded by trees, and a purely residential area with small detached houses. The results of the proposed approach are presented based on a reference solution for evaluation purposes. The classification evaluation exhibits highly successful classification results for the buildings class. The proposed approach follows the exact boundary of trees based on LIDAR data
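    The second-phase decision rule described above (height to ground plus per-object vegetation index) can be caricatured as a two-cue classifier. The thresholds below are illustrative placeholders, not the paper's calibrated values, and the function name is hypothetical.

```python
def classify_object(height_above_ground, veg_index,
                    min_elev=2.5, ndvi_cut=0.3):
    """Toy two-cue rule: an elevated, vegetated object is a tree;
    an elevated, non-vegetated object is a building; everything
    else is ground.  Thresholds are illustrative only."""
    if height_above_ground < min_elev:
        return "ground"
    return "tree" if veg_index > ndvi_cut else "building"
```

    The value of computing height *to ground* (rather than raw elevation) is visible here: the rule stays the same whether the object sits on a hillside or in a valley.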

  13. A novel automated device for rapid nucleic acid extraction utilizing a zigzag motion of magnetic silica beads.

    Science.gov (United States)

    Yamaguchi, Akemi; Matsuda, Kazuyuki; Uehara, Masayuki; Honda, Takayuki; Saito, Yasunori

    2016-02-01

    We report a novel automated device for nucleic acid extraction, which consists of a mechanical control system and a disposable cassette. The cassette is composed of a bottle, a capillary tube, and a chamber. After sample injection in the bottle, the sample is lysed, and nucleic acids are adsorbed on the surface of magnetic silica beads. These magnetic beads are transported and vibrated through the washing reagents in the capillary tube under the control of the mechanical control system, and thus the nucleic acid is purified without centrifugation. The purified nucleic acid is automatically extracted in 3 min for the polymerase chain reaction (PCR). The nucleic acid extraction is dependent on the transport speed and the vibration frequency of the magnetic beads, and optimizing these two parameters provided better PCR efficiency than the conventional manual procedure. There was no difference between the detection limits of our novel device and that of the conventional manual procedure. We have already developed the droplet-PCR machine, which can amplify and detect specific nucleic acids rapidly and automatically. Connecting the droplet-PCR machine to our novel automated extraction device enables PCR analysis within 15 min, and this system can be made available for point-of-care testing in clinics as well as general hospitals. PMID:26772121

  14. Time-frequency manifold for nonlinear feature extraction in machinery fault diagnosis

    Science.gov (United States)

    He, Qingbo

    2013-02-01

    Time-frequency features are beneficial for representing non-stationary signals and thus for effective machinery fault diagnosis. The time-frequency distribution (TFD) is a major tool to reveal the synthetic time-frequency pattern. However, the TFD also faces noise corruption and high-dimensionality issues in engineering applications. This paper proposes a novel nonlinear time-frequency feature based on a time-frequency manifold (TFM) technique. The new TFM feature is generated mainly by applying manifold learning to the TFDs in a reconstructed phase space. It combines the non-stationary information and the nonlinear information of the analyzed signals, and hence exhibits valuable properties. Specifically, the new feature is a quantitative low-dimensional representation that reveals the intrinsic time-frequency pattern related to machinery health, and it can effectively overcome the effects of noise and condition variance in sampled signals. The effectiveness and merits of the proposed TFM feature are confirmed by case studies on gear wear diagnosis, bearing defect identification and defect severity evaluation. Results show the value and potential of the new feature in machinery fault pattern representation and classification.

  15. Mixed-mode isolation of triazine metabolites from soil and aquifer sediments using automated solid-phase extraction

    Science.gov (United States)

    Mills, M.S.; Thurman, E.M.

    1992-01-01

    Reversed-phase isolation and ion-exchange purification were combined in the automated solid-phase extraction of two polar s-triazine metabolites, 2-amino-4-chloro-6-(isopropylamino)-s-triazine (deethylatrazine) and 2-amino-4-chloro-6-(ethylamino)-s-triazine (deisopropylatrazine), from clay-loam and silt-loam soils and sandy aquifer sediments. First, methanol/water (4/1, v/v) soil extracts were transferred to an automated workstation following evaporation of the methanol phase for the rapid reversed-phase isolation of the metabolites on an octadecyl resin (C18). The retention of the triazine metabolites on C18 decreased substantially when trace methanol concentrations (1%) remained. Furthermore, the retention on C18 increased with decreasing aqueous solubility and increasing alkyl-chain length of the metabolites and parent herbicides, indicating a reversed-phase interaction. The analytes were eluted with ethyl acetate, which left much of the soil organic-matter impurities on the resin. Second, the small-volume organic eluate was purified on an anion-exchange resin (0.5 mL/min) to extract the remaining soil pigments that could foul the ion source of the GC/MS system. Recoveries of the analytes were 75%, using deuterated atrazine as a surrogate, and were comparable to recoveries by Soxhlet extraction. The detection limit was 0.1 µg/kg with a coefficient of variation of 15%. The ease and efficiency of this automated method make it a viable, practical technique for studying triazine metabolites in the environment.

  16. The ValleyMorph Tool: An automated extraction tool for transverse topographic symmetry (T-) factor and valley width to valley height (Vf-) ratio

    Science.gov (United States)

    Daxberger, Heidi; Dalumpines, Ron; Scott, Darren M.; Riller, Ulrich

    2014-09-01

    In tectonically active regions on Earth, shallow-crustal deformation associated with seismic hazards may pose a threat to human life and property. The study of landform development, such as analysis of the valley width to valley height ratio (Vf-ratio) and the Transverse Topographic Symmetry Factor (T-factor), which delineates drainage basin symmetry, can be used as a relative measure of tectonic activity along fault-bound mountain fronts. The fast evolution of digital elevation models (DEMs) provides an ideal base for remotely sensed tectonomorphic studies of large areas using Geographical Information Systems (GIS). However, manual extraction of the above-mentioned morphologic parameters is tedious and very time consuming. Moreover, basic GIS software suites do not provide the necessary built-in functions. Therefore, we present a newly developed, Python-based, ESRI ArcGIS compatible tool and stand-alone script, the ValleyMorph Tool. This tool facilitates automated extraction of the Vf-ratio and T-factor data for large regions. Using a digital elevation raster and watershed polygon files as input, the tool provides output in the form of several ArcGIS data tables and shapefiles, ideal for further data manipulation and computation. The coding enables easy application among the ArcGIS user community and code conversion to earlier ArcGIS versions. The ValleyMorph Tool is easy to use due to a simple graphical user interface. The tool was tested for the southern Central Andes using a total of 3366 watersheds.
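    The two indices the tool extracts follow standard definitions in the tectonic geomorphology literature (the Vf-ratio of Bull and McFadden; the T-factor of Cox). A direct transcription, assuming the usual variable naming; the function names themselves are illustrative:

```python
def vf_ratio(vfw, eld, erd, esc):
    """Valley-floor width to valley-height ratio:
        Vf = 2*Vfw / ((Eld - Esc) + (Erd - Esc))
    vfw      : width of the valley floor
    eld, erd : elevations of the left and right valley divides
    esc      : elevation of the valley floor
    Low Vf indicates a V-shaped, actively incising valley;
    high Vf indicates a broad, U-shaped valley."""
    return 2.0 * vfw / ((eld - esc) + (erd - esc))

def t_factor(da, dd):
    """Transverse Topographic Symmetry factor T = Da / Dd, where Da is
    the distance from the basin midline to the active channel and Dd
    the distance from the midline to the basin divide.  T = 0 for a
    perfectly symmetric basin, approaching 1 with growing asymmetry."""
    return da / dd
```

    For example, a 100 m wide valley floor at 300 m elevation between divides at 500 m and 600 m gives Vf = 200/500 = 0.4, a moderately incised valley.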

  17. Development of a relatively cheap and simple automated separation system for a routine separation procedure based on extraction chromatography

    International Nuclear Information System (INIS)

    A robust analytical method has been developed in our laboratory for the separation of radionuclides by means of extraction chromatography using an automated separation system. The proposed method is both cheap and simple and provides rapid and accurate separation of the elements of interest. The automated separation system enables a shorter separation time by maintaining a constant flow rate of solution and by avoiding clogging or bubbling in the chromatographic column. The present separation method was tested with two types of samples (water and urine) using UTEVA-, TRU- and Sr-specific resins for the separation of U, Th, Am, Pu and Sr. The total separation time for one radionuclide ranged from 60 to 100 min, with the separation yield ranging from 68 to 98% depending on the elements separated. We used ICP-QMS, a multi-low-level counter and alpha spectroscopy to measure the corresponding elements. (author)

  18. Semi-automated extraction of landslides in Taiwan based on SPOT imagery and DEMs

    Science.gov (United States)

    Eisank, Clemens; Hölbling, Daniel; Friedl, Barbara; Chen, Yi-Chin; Chang, Kang-Tsung

    2014-05-01

    The vast availability and improved quality of optical satellite data and digital elevation models (DEMs), as well as the need for complete and up-to-date landslide inventories at various spatial scales, have fostered the development of semi-automated landslide recognition systems. Among the approaches tested for designing such systems, object-based image analysis (OBIA) has stood out as a highly promising methodology. OBIA offers a flexible, spatially enabled framework for effective landslide mapping. Most object-based landslide mapping systems, however, have been tailored to specific, mainly small-scale study areas or even to single landslides only. Even though reported mapping accuracies tend to be higher than for pixel-based approaches, accuracy values are still relatively low and depend on the particular study. There is still room to improve the applicability and objectivity of object-based landslide mapping systems. The presented study aims at developing a knowledge-based landslide mapping system implemented in an OBIA environment, i.e. Trimble eCognition. In comparison to previous knowledge-based approaches, the classification of segmentation-derived multi-scale image objects relies on digital landslide signatures. These signatures hold the common operational knowledge on digital landslide mapping, as reported by 25 Taiwanese landslide experts during personal semi-structured interviews. Specifically, the signatures include information on commonly used data layers, spectral and spatial features, and feature thresholds. The signatures guide the selection and implementation of mapping rules that were finally encoded in Cognition Network Language (CNL). Multi-scale image segmentation is optimized by using the improved Estimation of Scale Parameter (ESP) tool. The approach described above is developed and tested for mapping landslides in a sub-region of the Baichi catchment in Northern Taiwan based on SPOT imagery and a high-resolution DEM. An object

  19. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications

    DEFF Research Database (Denmark)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik;

    2016-01-01

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to a HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to a LC-MS system for further analysis. Various parameters affecting extraction efficacy were investigated, including syringe fill strokes, syringe pull up volume, pull up delay and volume in the sample vial. The system was ..., <10%; R(2), 0.994) and finally, the EME-autosampler was used to analyze in vitro conversion of methadone into its main metabolite by rat liver microsomes and for demonstrating the potential of known CYP3A4 inhibitors to prevent metabolism of methadone. By making use of the high extraction speed of EME ...

  20. Fully automated synthesis of 11C-acetate as tumor PET tracer by simple modified solid-phase extraction purification

    International Nuclear Information System (INIS)

    Introduction: Automated synthesis of 11C-acetate (11C-AC), the most commonly used radioactive fatty acid tracer, was performed by a simple, rapid, and modified solid-phase extraction (SPE) purification. Methods: Automated synthesis of 11C-AC was implemented by a carboxylation reaction of MeMgBr with 11C-CO2 on a polyethylene Teflon loop ring, followed by acidic hydrolysis with acid and an SCX cartridge, and purification on SCX, AG11A8 and C18 SPE cartridges using a commercially available 11C-tracer synthesizer. Quality control tests and animal positron emission tomography (PET) imaging were also carried out. Results: A high and reproducible decay-uncorrected radiochemical yield of (41.0±4.6)% (n=10) was obtained from 11C-CO2 within a total synthesis time of about 8 min. The radiochemical purity of 11C-AC was over 95% by high-performance liquid chromatography (HPLC) analysis. Quality control testing and PET imaging showed that the 11C-AC injection produced by the simple SPE procedure was safe and efficient, and in agreement with the current Chinese radiopharmaceutical quality control guidelines. Conclusion: The novel, simple, and rapid method is readily adapted to the fully automated synthesis of 11C-AC on several existing commercial synthesis modules. The method can be used routinely to produce 11C-AC for preclinical and clinical studies with PET imaging. - Highlights: • A fully automated synthesis of 11C-acetate by simple modified solid-phase extraction purification has been developed. • Typical non-decay-corrected yields were (41.0±4.6)% (n=10). • Radiochemical purity was determined by radio-HPLC analysis on a C18 column using a gradient program, instead of an expensive organic acid column or anion column. • QC testing (RCP>99%)

  1. Automated Sampling and Extraction of Krypton from Small Air Samples for Kr-85 Measurement Using Atom Trap Trace Analysis

    International Nuclear Information System (INIS)

    Atom-Trap-Trace-Analysis (ATTA) provides the capability of measuring the Krypton-85 concentration in microlitre amounts of krypton extracted from air samples of about 1 litre. This sample size is sufficiently small to allow for a range of applications, including on-site spot sampling and continuous sampling over periods of several hours. All samples can be easily handled and transported to an off-site laboratory for ATTA measurement, or stored and analyzed on demand. Bayesian sampling methodologies can be applied by blending samples for bulk measurement and performing in-depth analysis as required. A prerequisite for measurement is the extraction of a pure krypton fraction from the sample. This paper introduces an extraction unit able to isolate the krypton in small ambient air samples with high speed, high efficiency and in a fully automated manner, using a combination of cryogenic distillation and gas chromatography. Air samples are collected using an automated smart sampler developed in-house to achieve a constant sampling rate over adjustable time periods ranging from 5 minutes to 3 hours per sample. The smart sampler can be deployed in the field and operate on battery for one week to take up to 60 air samples. This high flexibility of sampling and the fast, robust sample preparation are a valuable tool for research and for the application of Kr-85 measurements to novel Safeguards procedures. (author)

  2. Comparative Evaluation of a Commercially Available Automated System for Extraction of Viral DNA from Whole Blood: Application to Monitoring of Epstein-Barr Virus and Cytomegalovirus Load ▿

    OpenAIRE

    Pillet, Sylvie; Bourlet, Thomas; Pozzetto, Bruno

    2009-01-01

    The NucliSENS easyMAG automated system was compared to the column-based Qiagen method for Epstein-Barr virus (EBV) or cytomegalovirus (CMV) DNA extraction from whole blood before viral load determination using the corresponding R-gene amplification kits. Both extraction techniques exhibited a total agreement of 81.3% for EBV and 87.2% for CMV.

  3. Automated extraction of skeletal muscles from torso X-ray CT images based on anatomical positional information between skeleton and skeletal muscles

    International Nuclear Information System (INIS)

    We propose an automated approach to extract skeletal muscles from torso X-ray CT images. It transforms the 3-D anatomy into 2-D stretched images, simplifying the anatomical relationships used to obtain pathognomonic points. The experimental results show that the proposed method was effective in extracting skeletal muscles. (author)

  4. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune;

    2011-01-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the ... automated protocols allowed for extraction and addition of PCR master mix for 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid ...

  5. Feature Extraction Using Discrete Wavelet Transform for Gear Fault Diagnosis of Wind Turbine Gearbox

    DEFF Research Database (Denmark)

    Bajric, Rusmir; Zuber, Ninoslav; Skrimpas, Georgios Alexandros;

    2016-01-01

    ... vibration signals are decomposed into a series of subband signals with the use of the multiresolution analysis property of the discrete wavelet transform. Then, 22 condition indicators are extracted from the TSA signal, residual signal, and difference signal. Through the case study analysis, a new approach ... The approach presented in this paper was programmed in the Matlab environment using data acquired on a 2 MW wind turbine.
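    The subband-decomposition plus condition-indicator idea can be sketched with a one-level Haar DWT in NumPy. The paper works in Matlab and extracts 22 indicators; only three common ones (RMS, kurtosis, crest factor) are shown here, and the function names are illustrative.

```python
import numpy as np

def haar_dwt(x):
    """One level of the Haar discrete wavelet transform: returns the
    (approximation, detail) subband signals at half the sample rate."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:          # drop a trailing sample if the length is odd
        x = x[:-1]
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def condition_indicators(x):
    """A few classic condition indicators computed on a subband signal."""
    x = np.asarray(x, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))
    kurt = np.mean((x - x.mean()) ** 4) / (x.var() ** 2 + 1e-12)
    crest = np.max(np.abs(x)) / (rms + 1e-12)
    return {"rms": rms, "kurtosis": kurt, "crest": crest}
```

    Repeating `haar_dwt` on the approximation band yields the multiresolution series of subbands from which indicators like these are collected into a feature vector.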

  6. Automated Extraction of Tree Adjoining Grammars from a Treebank for Vietnamese

    OpenAIRE

    Le-Hong, Phuong; Nguyen, Thi Minh Huyen; Nguyen, Phuong Thai; Phan, Thi Ha

    2010-01-01

    In this paper, we present a system that automatically extracts lexicalized tree adjoining grammars (LTAG) from treebanks. We first discuss the extraction algorithms in detail and compare them to previous work. We then report the first LTAG extraction result for Vietnamese, using a recently released Vietnamese treebank. The implementation of an open-source, language-independent system for automatic extraction of LTAG grammars is also discussed.

  7. Sequential automated fusion/extraction chromatography methodology for the dissolution of uranium in environmental samples for mass spectrometric determination

    International Nuclear Information System (INIS)

    An improved methodology has been developed, based on dissolution by automated fusion followed by extraction chromatography, for the detection and quantification of uranium in environmental matrices by mass spectrometry. A rapid fusion protocol (2/LiBr melts were used. The use of an M4 fusion unit also improved repeatability in sample preparation over muffle furnace fusion. Instrumental issues originating from the presence of high salt concentrations in the digestate after lithium metaborate fusion were also mitigated using an extraction chromatography (EXC) protocol aimed at removing lithium and interfering matrix constituents prior to the elution of uranium. The sequential methodology, which can be performed simultaneously on three samples, requires less than 20 min per sample for fusion and separation. It was successfully coupled to inductively coupled plasma mass spectrometry (ICP-MS), achieving detection limits below 100 pg kg-1 for 5-300 mg of sample.

  8. Field-scale validation of an automated soil nitrate extraction and measurement system

    NARCIS (Netherlands)

    Sibley, K.J.; Astatkie, T.; Brewster, G.; Struik, P.C.; Adsett, J.F.; Pruski, K.

    2009-01-01

    One of the many gaps that need to be filled by precision agriculture technologies is the availability of an economic, automated, on-the-go mapping system that can be used to obtain intensive and accurate 'real-time' data on the levels of nitrate nitrogen (NO3-N) in the soil. A soil nitrate mapping

  9. INVESTIGATION OF ARSENIC SPECIATION ON DRINKING WATER TREATMENT MEDIA UTILIZING AUTOMATED SEQUENTIAL CONTINUOUS FLOW EXTRACTION WITH IC-ICP-MS DETECTION

    Science.gov (United States)

    Three treatment media, used for the removal of arsenic from drinking water, were sequentially extracted using 10mM MgCl2 (pH 8), 10mM NaH2PO4 (pH 7) followed by 10mM (NH4)2C2O4 (pH 3). The media were extracted using an on-line automated continuous extraction system which allowed...

  10. Automated on-line liquid-liquid extraction system for temporal mass spectrometric analysis of dynamic samples.

    Science.gov (United States)

    Hsieh, Kai-Ta; Liu, Pei-Han; Urban, Pawel L

    2015-09-24

    Most real samples cannot be infused directly into mass spectrometers because they could contaminate delicate parts of the ion source and ion guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid-liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. Each extraction cycle takes 17 min, and the system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053-2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we implemented this device in chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded over 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h(-1)). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet of Things (IoT) to the chemistry laboratory environment. PMID:26423626
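
    The exponential release fitting mentioned above can be done without a curve-fitting library. A minimal sketch, assuming the one-parameter form C(t) = C_inf(1 - e^(-kt)) and using synthetic (hypothetical) concentration data; the reported ibuprofen rate k = 0.43 h^-1 is reused only as a demonstration value:

```python
import math

def fit_release_rate(times, conc, c_inf):
    """Fit C(t) = c_inf * (1 - exp(-k*t)) by log-linearisation:
    y = -ln(1 - C/c_inf) is linear in t with slope k, so the
    one-parameter least-squares solution is k = sum(t*y)/sum(t*t)."""
    y = [-math.log(1.0 - c / c_inf) for c in conc]
    return sum(t * yi for t, yi in zip(times, y)) / sum(t * t for t in times)

# Synthetic half-hourly dissolution profile over 10 h; c_inf is hypothetical.
k_true, c_inf = 0.43, 2.0
times = [0.5 * i for i in range(1, 21)]
conc = [c_inf * (1.0 - math.exp(-k_true * t)) for t in times]
k_fit = fit_release_rate(times, conc, c_inf)
```

    On noisy measured data one would use nonlinear least squares instead, but the log-linear form shows why a single rate constant summarizes the whole profile.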

  11. Comparative Assessment of Automated Nucleic Acid Sample Extraction Equipment for Biothreat Agents

    OpenAIRE

    Kalina, Warren Vincent; Douglas, Christina Elizabeth; Coyne, Susan Rajnik; Minogue, Timothy Devin

    2014-01-01

    Magnetic beads offer superior impurity removal and nucleic acid selection over older extraction methods. The performances of nucleic acid extraction of biothreat agents in blood or buffer by easyMAG, MagNA Pure, EZ1 Advanced XL, and Nordiag Arrow were evaluated. All instruments showed excellent performance in blood; however, the easyMAG had the best precision and versatility.

  12. Development and application of reversed phase liquid chromatography based techniques for automated purification of biologically active ingredients from plant extracts and for characterization of plant extracts and environmental pollutants

    OpenAIRE

    Mahsunah, Anis H.

    2006-01-01

    Automated preparative HPLC purification systems are an important and useful technology in pharmaceutical and chemical development. The systems have been applied to high-throughput purification of products from combinatorial compound synthesis for drug discovery, single compound isolation for further structure elucidation and activity screening, as well as fractionation of active compounds from plant extracts. Fraction collection in automated HPLC purification system can be triggered by le...

  13. Automated Extraction and Amplification for Direct Detection of Mycobacterium tuberculosis Complex in Various Clinical Samples▿

    OpenAIRE

    Simonnet, Christine; Lacoste, Vincent; Drogoul, Anne Sophie; Rastogi, Nalin

    2011-01-01

    With the incidence of culture-positive tuberculosis (TB) cases at 25.3 per 100,000 and a 25% rate of TB/HIV coinfection, the TB incidence in French Guiana is the highest of all French regions. In this context, there is an urgent need for simple, automated systems for molecular diagnosis of TB that can be adapted to small laboratories. Introduction of a nucleic acid amplification test in a routine clinical laboratory is an additional expense, and its cost-effectiveness and clinical utility need to b...

  14. Toward automated parasitic extraction of silicon photonics using layout physical verifications

    Science.gov (United States)

    Ismail, Mohamed; El Shamy, Raghi S.; Madkour, Kareem; Hammouda, Sherif; Swillam, Mohamed A.

    2016-08-01

    A physical verification flow for the layout of silicon photonic circuits is proposed. Simple empirical models are developed to estimate the bend power loss and coupled power in photonic integrated circuits fabricated on standard SOI wafers. These models are applied in the physical verification flow of the circuit layout to verify reliable fabrication using any electronic design automation tool. The models agree well with electromagnetic solvers, are closed form, and circumvent the need for any EM solver in the verification process, dramatically reducing verification time.

  15. Automated Feature Extraction In Cotton Leaf Images By Graph Cut Approach

    Directory of Open Access Journals (Sweden)

    Prashant R. Rothe

    2014-12-01

    Full Text Available Feature extraction and representation is a decisive step for a pattern recognition system. How to extract ideal features that reflect the inherent content of images as completely as possible is still a challenging problem in computer vision. The goal of feature extraction is therefore to improve the effectiveness and efficiency of analysis and classification. In this work we present a graph-cut-based approach for the segmentation of images of diseased cotton leaves. We extract color layout descriptors which can be used for classification. The diseases considered are Bacterial Blight, Myrothecium and Alternaria. The testing samples of the images were gathered from CICR Nagpur and cotton fields in Buldhana and Nagpur districts.

  16. Automated Large Scale Parameter Extraction of Road-Side Trees Sampled by a Laser Mobile Mapping System

    Science.gov (United States)

    Lindenbergh, R. C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-08-01

    In urbanized Western Europe trees are considered an important component of the built-up environment, which also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including notable roadside trees. Indeed, at, say, 50 km/h such systems collect point clouds consisting of half a million points per 100 m. Methods exist that extract tree parameters from relatively small patches of such data, but a remaining challenge is to operationally extract roadside tree parameters at the regional level. For this purpose a workflow is presented in which the input point clouds are consecutively downsampled, retiled, classified, segmented into individual trees and upsampled to enable automated extraction of tree location, tree height, canopy diameter and trunk diameter at breast height (DBH). The workflow is implemented to work on a laser mobile mapping data set sampling 100 km of road in Sachsen, Germany, and is tested on a 7 km stretch of road. Along this road, the method detected 315 trees that were considered well detected, plus 56 clusters of tree points in which no individual trees could be identified. Using voxels, the data volume could be reduced by about 97 % in a default scenario. Processing the results of this scenario took ~2500 seconds, corresponding to about 10 km/h, which approaches but is still below the acquisition rate, estimated at 50 km/h.
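
    The voxel-based data reduction step mentioned above can be sketched in a few lines. This is a simplified stand-in for the paper's implementation: one centroid is kept per occupied cubic voxel (the cell size and the point cloud are hypothetical):

```python
def voxel_downsample(points, voxel_size):
    """Group 3D points into cubic voxels and keep one centroid per
    occupied voxel, a common way to thin mobile-mapping point clouds
    before classification and segmentation."""
    cells = {}
    for p in points:
        key = tuple(int(c // voxel_size) for c in p)
        cells.setdefault(key, []).append(p)
    return [tuple(sum(axis) / len(pts) for axis in zip(*pts))
            for pts in cells.values()]

# Hypothetical dense patch: 10,000 points on a 1 m x 1 m plane,
# thinned with 10 cm voxels (~99% reduction for this synthetic patch)
cloud = [(i * 0.01, j * 0.01, 0.0) for i in range(100) for j in range(100)]
thinned = voxel_downsample(cloud, voxel_size=0.1)
```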

  17. Characterization and Application of Superlig 620 Solid Phase Extraction Resin for Automated Process Monitoring of 90Sr

    International Nuclear Information System (INIS)

    Characterization of SuperLig® 620 solid phase extraction resin was performed in order to develop an automated on-line process monitor for 90Sr. The main focus was on strontium separation from barium, with the goal of developing an automated separation process for 90Sr in high-level wastes. High-level waste contains significant 137Cs activity, of which 137mBa is of great concern as an interference to the quantification of strontium. In addition barium, yttrium and plutonium were studied as potential interferences to strontium uptake and detection. A number of complexants were studied in a series of batch Kd experiments, as SuperLig® 620 was not previously known to elute strontium in typical mineral acids. The optimal separation was found using a 2 M nitric acid load solution with a strontium elution step of ∼0.49 M ammonium citrate and a barium elution step of ∼1.8 M ammonium citrate. 90Sr quantification of Hanford high-level tank waste was performed on a sequential injection analysis microfluidics system coupled to a flow-cell detector. The results of the on-line procedure are compared to standard radiochemical techniques in this paper.

  18. Characterization and application of SuperLigR 620 solid phase extraction resin for automated process monitoring of 90Sr

    International Nuclear Information System (INIS)

    Characterization of SuperLig® 620 solid phase extraction resin was performed in order to develop an automated on-line process monitor for 90Sr. The main focus was on strontium separation from barium, with the goal of developing an automated separation process for 90Sr in high-level wastes. High-level waste contains significant 137Cs activity, of which 137mBa is of great concern as an interference to the quantification of strontium. In addition barium, yttrium and plutonium were studied as potential interferences to strontium uptake and detection. A number of complexants were studied in a series of batch Kd experiments, as SuperLig® 620 was not previously known to elute strontium in typical mineral acids. The optimal separation was found using a 2 M nitric acid load solution with a strontium elution step of ∼0.49 M ammonium citrate and a barium elution step of ∼1.8 M ammonium citrate. 90Sr quantification of Hanford high-level tank waste was performed on a sequential injection analysis microfluidics system coupled to a flow-cell detector. The results of the on-line procedure are compared to standard radiochemical techniques in this paper. (author)

  19. Automated Manual Transmission Fault Diagnosis Based on Hybrid System Characteristics

    Institute of Scientific and Technical Information of China (English)

    彭建鑫; 刘海鸥; 陈慧岩

    2012-01-01

    Automated manual transmission (AMT) fault diagnosis based on hybrid system characteristics is studied. Hybrid automata are used to model the AMT automatic transmission system, and the consistency between system behavior trajectories and the hybrid system model is analyzed. The definition of faults in the AMT automatic transmission system is examined, and the fault behavior of the AMT hybrid system is defined through the hybrid system model. The fault diagnosability of the hybrid system is then studied on the basis of this fault behavior. The relationships among faults, fault diagnosability, diagnostic accuracy, and measurable system state variables are explained through the system trajectories, and the concept of fault entropy is used to describe the degree of fault diagnosability. The behavior of the AMT hybrid system is analyzed and classified from the perspectives of system function, model trajectories, and control cycle in order to propose a fault diagnosis strategy for the AMT automatic transmission system. The strategy and its diagnostic algorithm were ported to a real vehicle platform, and extensive vehicle mileage verified the strategy's correctness and real-time performance.
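
    The abstract does not spell out its fault-entropy formula. One plausible reading, treating fault entropy as the Shannon entropy of a probability distribution over indistinguishable fault candidates (our assumption, not the authors' definition), can be sketched as:

```python
import math

def fault_entropy(probabilities):
    """Shannon entropy (in bits) of a distribution over fault
    candidates: 0 when one fault is fully isolated (completely
    diagnosable), maximal when all candidates are equally likely."""
    assert abs(sum(probabilities) - 1.0) < 1e-9
    return -sum(p * math.log2(p) for p in probabilities if p > 0.0)

h_certain = fault_entropy([1.0, 0.0, 0.0, 0.0])   # fault isolated
h_uniform = fault_entropy([0.25] * 4)             # nothing distinguishable
```

    Under this reading, a lower fault entropy corresponds to a higher degree of diagnosability.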

  20. Screening for Anabolic Steroids in Urine of Forensic Cases Using Fully Automated Solid Phase Extraction and LC–MS-MS

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards... and solid phase extraction followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and occurrence of exogenous anabolic steroids... steroids. Only seven different steroids including testosterone were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse...

  1. Technical Note: Semi-automated effective width extraction from time-lapse RGB imagery of a remote, braided Greenlandic river

    Science.gov (United States)

    Gleason, C. J.; Smith, L. C.; Finnegan, D. C.; LeWinter, A. L.; Pitcher, L. H.; Chu, V. W.

    2015-06-01

    River systems in remote environments are often challenging to monitor and understand where traditional gauging apparatus is difficult to install or where safety concerns prohibit field measurements. In such cases, remote sensing, especially terrestrial time-lapse imaging platforms, offers a means to better understand these fluvial systems. One such environment is found at the proglacial Isortoq River in southwestern Greenland, a river whose constantly shifting floodplain and remote Arctic location make gauging and in situ measurements all but impossible. In order to derive relevant hydraulic parameters for this river, two true color (RGB) cameras were installed in July 2011, and by September 2012 they had collected over 10,000 half-hourly time-lapse images of the river. Existing approaches for extracting hydraulic parameters from RGB imagery require manual or supervised classification of images into water and non-water areas, a task that was impractical for the volume of data in this study. As such, automated image filters were developed that removed images with environmental obstacles (e.g., shadows, sun glint, snow) from the processing stream. Further image filtering was accomplished via a novel automated histogram similarity filtering process. This similarity filtering allowed successful (mean accuracy 79.6 %) supervised classification of filtered images from training data collected from just 10 % of those images. Effective width, a hydraulic parameter highly correlated with discharge in braided rivers, was extracted from these classified images, producing a hydrograph proxy for the Isortoq River between 2011 and 2012. This hydrograph proxy shows agreement with historic flooding observed in other parts of Greenland in July 2012 and offers promise that the imaging platform and processing methodology presented here will be useful for future monitoring studies of remote rivers.
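
    The histogram similarity filtering described above can be illustrated with a simple histogram-intersection measure. This is a common similarity choice, not necessarily the authors' exact metric, and the reference image and threshold below are hypothetical:

```python
def grey_histogram(pixels, bins=16):
    """Normalised grey-level histogram of 8-bit pixel values."""
    counts = [0] * bins
    for v in pixels:
        counts[min(v * bins // 256, bins - 1)] += 1
    return [c / len(pixels) for c in counts]

def histogram_intersection(h1, h2):
    """Sum of bin-wise minima: 1.0 for identical histograms, 0.0 for disjoint."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def keep_image(pixels, reference_hist, threshold=0.8):
    """Pass an image on to supervised classification only if its
    histogram resembles a clean-scene reference closely enough."""
    return histogram_intersection(grey_histogram(pixels), reference_hist) >= threshold

ref = grey_histogram(list(range(256)))   # hypothetical clean reference frame
dark = [0] * 256                         # e.g. a heavily shadowed frame
```

    Filtering on histogram similarity lets one clean training set generalize to thousands of similar frames, which is the leverage reported above.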

  2. Solid-Phase Extraction of Polar Compounds from Water

    Science.gov (United States)

    Sauer, Richard; Rutz, Jeffrey; Schultz, John

    2005-01-01

    A solid-phase extraction (SPE) process has been developed for removing alcohols, carboxylic acids, aldehydes, ketones, amines, and other polar organic compounds from water. This process can be either a subprocess of a water-reclamation process or a means of extracting organic compounds from water samples for gas-chromatographic analysis. This SPE process is an attractive alternative to an Environmental Protection Agency liquid-liquid extraction process that generates some pollution and does not work in a microgravitational environment. In this SPE process, one forces a water sample through a resin bed by use of positive pressure on the upstream side and/or suction on the downstream side, thereby causing organic compounds from the water to be adsorbed onto the resin. If gas-chromatographic analysis is to be done, the resin is dried by use of a suitable gas, then the adsorbed compounds are extracted from the resin by use of a solvent. Unlike the liquid-liquid process, the SPE process works in both microgravity and Earth gravity. In comparison with the liquid-liquid process, the SPE process is more efficient, extracts a wider range of organic compounds, generates less pollution, and costs less.

  3. Automated extraction of urban trees from mobile LiDAR point clouds

    Science.gov (United States)

    Fan, W.; Chenglu, W.; Jonathan, L.

    2016-03-01

    This paper presents an automatic algorithm to localize and extract urban trees from mobile LiDAR point clouds. First, in order to reduce the number of points to be processed, ground points are filtered out of the raw point clouds, and the remaining non-ground points are segmented into supervoxels. Then, a novel localization method is proposed to locate the urban trees accurately. Next, a localization-guided segmentation method is proposed to obtain individual objects. Finally, features of the objects are extracted, and the feature vectors are classified by random forests trained on manually labeled objects. The proposed method has been tested on a point cloud dataset, and the results show that it extracts urban trees efficiently.
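
    The first step above, separating ground from non-ground points, is commonly done with a lowest-point-per-cell test. A minimal sketch under that assumption (the cell size, tolerance, and scene below are hypothetical, not the paper's exact method):

```python
def split_ground(points, cell=1.0, height_tol=0.2):
    """First-pass ground filter: a point is labelled 'ground' if it lies
    within height_tol of the lowest point in its horizontal grid cell."""
    lowest = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        lowest[key] = min(z, lowest.get(key, z))
    ground, above = [], []
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        (ground if z - lowest[key] <= height_tol else above).append((x, y, z))
    return ground, above

# Hypothetical scene: a flat 5 m x 5 m ground patch plus one vertical trunk
scene = [(x * 0.5, y * 0.5, 0.0) for x in range(10) for y in range(10)]
scene += [(2.0, 2.0, 0.5 * k) for k in range(1, 11)]
ground, non_ground = split_ground(scene)
```

    Only the non-ground points would then go on to supervoxel segmentation and classification.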

  4. Development of an automated extraction method for liver tumors in three dimensional multiphase multislice CT images

    International Nuclear Information System (INIS)

    This paper proposes a tumor detection method using four-phase three-dimensional (3D) CT images of livers, i.e. non-contrast, early, portal, and late phase images. The method extracts liver regions from the four phase images and enhances tumors in the livers using a 3D adaptive convergence index filter. It then detects local maximum points and extracts tumor candidates by a region growing method. Subsequently, several features of the candidates are measured and each candidate is classified as true tumor or normal tissue based on Mahalanobis distances. The above processes, except liver region extraction, are applied to the four phase images independently, and the four resultant images are integrated into one. We applied the proposed method to 3D abdominal CT images of ten patients obtained with a multi-detector row CT scanner and confirmed a tumor detection rate of 100% without false positives, which is quite promising. (author)
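
    Classification by Mahalanobis distance, as used above, reduces to a variance-weighted Euclidean distance when the feature covariance is taken as diagonal. A minimal sketch under that simplifying assumption (the paper may use full covariance matrices; all feature values below are hypothetical):

```python
def mahalanobis_diag(x, mean, var):
    """Mahalanobis distance under a diagonal covariance: each feature
    deviation is scaled by that feature's variance."""
    return sum((xi - mi) ** 2 / vi for xi, mi, vi in zip(x, mean, var)) ** 0.5

def classify(candidate, classes):
    """Assign the class (e.g. 'tumor' vs 'normal') whose model is nearest."""
    return min(classes, key=lambda c: mahalanobis_diag(candidate, *classes[c]))

# Hypothetical 2-feature class models: (mean vector, variance vector)
classes = {
    "tumor":  ((0.8, 0.3), (0.01, 0.01)),
    "normal": ((0.2, 0.7), (0.04, 0.04)),
}
label = classify((0.75, 0.35), classes)
```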

  5. Evaluation of Three Automated Nucleic Acid Extraction Systems for Identification of Respiratory Viruses in Clinical Specimens by Multiplex Real-Time PCR

    OpenAIRE

    2014-01-01

    A total of 84 nasopharyngeal swab specimens were collected from 84 patients. Viral nucleic acid was extracted by three automated extraction systems: QIAcube (Qiagen, Germany), EZ1 Advanced XL (Qiagen), and MICROLAB Nimbus IVD (Hamilton, USA). Fourteen RNA viruses and two DNA viruses were detected using the Anyplex II RV16 Detection kit (Seegene, Republic of Korea). The EZ1 Advanced XL system demonstrated the best analytical sensitivity for all the three viral strains. The nucleic acids extrac...

  6. An automated algorithm for extracting road edges from terrestrial mobile LiDAR data

    Science.gov (United States)

    Kumar, Pankaj; McElhinney, Conor P.; Lewis, Paul; McCarthy, Timothy

    2013-11-01

    Terrestrial mobile laser scanning systems provide rapid and cost effective 3D point cloud data which can be used for extracting features such as the road edge along a route corridor. This information can assist road authorities in carrying out safety risk assessment studies along road networks. The knowledge of the road edge is also a prerequisite for the automatic estimation of most other road features. In this paper, we present an algorithm which has been developed for extracting left and right road edges from terrestrial mobile LiDAR data. The algorithm is based on a novel combination of two modified versions of the parametric active contour or snake model. The parameters involved in the algorithm are selected empirically and are fixed for all the road sections. We have developed a novel way of initialising the snake model based on the navigation information obtained from the mobile mapping vehicle. We tested our algorithm on different types of road sections representing rural, urban and national primary road sections. The successful extraction of road edges from these multiple road section environments validates our algorithm. These findings and knowledge provide valuable insights as well as a prototype road edge extraction tool-set, for both national road authorities and survey companies.

  7. Semi-automated Extraction of a Wide-Coverage Type-Logical Grammar for French

    OpenAIRE

    Moot, Richard

    2010-01-01

    The paper describes the development of a wide-coverage type-logical grammar for French, which has been extracted from the Paris 7 treebank and received a significant amount of manual verification and cleanup. The resulting treebank is evaluated using a supertagger and performs at a level comparable to the best supertagging results for English.

  8. Technical note: New applications for on-line automated solid phase extraction

    OpenAIRE

    MacFarlane, John D.

    1997-01-01

    This technical note explains the disadvantages of manual solid phase extraction (SPE) techniques and the benefits to be gained with automatic systems. The note reports on a number of general and highly specific applications using the Sample Preparation Unit OSP-2A.

  9. Using mobile laser scanning data for automated extraction of road markings

    Science.gov (United States)

    Guan, Haiyan; Li, Jonathan; Yu, Yongtao; Wang, Cheng; Chapman, Michael; Yang, Bisheng

    2014-01-01

    A mobile laser scanning (MLS) system allows direct collection of accurate 3D point information in unprecedented detail at highway speeds and at less than traditional survey costs, which serves the fast growing demands of transportation-related road surveying including road surface geometry and road environment. As one type of road feature in traffic management systems, road markings on paved roadways have important functions in providing guidance and information to drivers and pedestrians. This paper presents a stepwise procedure to recognize road markings from MLS point clouds. To improve computational efficiency, we first propose a curb-based method for road surface extraction. This method first partitions the raw MLS data into a set of profiles according to vehicle trajectory data, and then extracts small height jumps caused by curbs in the profiles via slope and elevation-difference thresholds. Next, points belonging to the extracted road surface are interpolated into a geo-referenced intensity image using an extended inverse-distance-weighted (IDW) approach. Finally, we dynamically segment the geo-referenced intensity image into road-marking candidates with multiple thresholds that correspond to different ranges determined by point-density appropriate normality. A morphological closing operation with a linear structuring element is finally used to refine the road-marking candidates by removing noise and improving completeness. This road-marking extraction algorithm is comprehensively discussed in the analysis of parameter sensitivity and overall performance. An experimental study performed on a set of road markings with ground-truth shows that the proposed algorithm provides a promising solution to the road-marking extraction from MLS data.
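
    The interpolation step above builds on the basic inverse-distance-weighted estimator; the paper's extended variant is not detailed in this abstract, but the core idea can be sketched as follows (sample values are hypothetical):

```python
def idw(samples, qx, qy, power=2.0):
    """Basic inverse-distance-weighted estimate of a value at (qx, qy)
    from (x, y, value) samples; each sample's weight is distance^-power."""
    num = den = 0.0
    for x, y, v in samples:
        d2 = (x - qx) ** 2 + (y - qy) ** 2
        if d2 == 0.0:
            return v                      # query coincides with a sample
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical laser intensity samples along a scan line
samples = [(0.0, 0.0, 10.0), (2.0, 0.0, 20.0)]
mid = idw(samples, 1.0, 0.0)   # equidistant samples -> plain average
```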

  10. Progress in automated extraction and purification of in situ 14C from quartz: Results from the Purdue in situ 14C laboratory

    Science.gov (United States)

    Lifton, Nathaniel; Goehring, Brent; Wilson, Jim; Kubley, Thomas; Caffee, Marc

    2015-10-01

    Current extraction methods for in situ 14C from quartz [e.g., Lifton et al., (2001), Pigati et al., (2010), Hippe et al., (2013)] are time-consuming and repetitive, making them an attractive target for automation. We report on the status of in situ 14C extraction and purification systems originally automated at the University of Arizona that have now been reconstructed and upgraded at the Purdue Rare Isotope Measurement Laboratory (PRIME Lab). The Purdue in situ 14C laboratory builds on the flow-through extraction system design of Pigati et al. (2010), automating most of the procedure by retrofitting existing valves with external servo-controlled actuators, regulating the pressure of research purity O2 inside the furnace tube via a PID-based pressure controller in concert with an inlet mass flow controller, and installing an automated liquid N2 distribution system, all driven by LabVIEW® software. A separate system for cryogenic CO2 purification, dilution, and splitting is also fully automated, ensuring a highly repeatable process regardless of the operator. We present results from procedural blanks and an intercomparison material (CRONUS-A), as well as results of experiments to increase the amount of material used in extraction, from the standard 5 g to 10 g or above. Results thus far are quite promising, with procedural blanks comparable to previous work and significant improvements in reproducibility for CRONUS-A measurements. The latter analyses also demonstrate the feasibility of quantitative extraction of in situ 14C from sample masses up to 10 g. Our lab is now analyzing unknowns routinely, but lowering overall blank levels is the focus of ongoing research.
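
    The PID-based pressure regulation mentioned above can be sketched with a textbook discrete PID loop driving a toy leaky chamber. All gains and plant constants here are hypothetical, chosen only to give a stable illustration, not taken from the actual system:

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*sum(e*dt) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement, dt=1.0):
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy plant: chamber pressure rises with the valve output u and leaks at
# a constant rate; the integral term cancels the steady leak offset.
pid = PID(kp=0.4, ki=0.05, kd=0.0, setpoint=1.0)
pressure, leak = 0.0, 0.1
for _ in range(200):
    pressure += pid.update(pressure) - leak
```

    With proportional control alone the constant leak would leave a steady-state offset; the integral term is what drives the pressure all the way to the setpoint.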

  11. Linearly Supporting Feature Extraction For Automated Estimation Of Stellar Atmospheric Parameters

    CERN Document Server

    Li, Xiangru; Comte, Georges; Luo, Ali; Zhao, Yongheng; Wang, Yongjun

    2015-01-01

    We describe a scheme to extract linearly supporting (LSU) features from stellar spectra to automatically estimate the atmospheric parameters Teff, log g, and [Fe/H]. "Linearly supporting" means that the atmospheric parameters can be accurately estimated from the extracted features through a linear model. The successive steps of the process are as follows: first, decompose the spectrum using a wavelet packet (WP) and represent it by the derived decomposition coefficients; second, detect representative spectral features from the decomposition coefficients using the proposed method Least Absolute Shrinkage and Selection Operator (LARS)_bs; third, estimate the atmospheric parameters Teff, log g, and [Fe/H] from the detected features using a linear regression method. One prominent characteristic of this scheme is its ability to evaluate quantitatively the contribution of each detected feature to the atmospheric parameter estimate and also to trace back the physical significance of that feature. Th...
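
    The "linearly supporting" idea, selecting a few coefficients from which a parameter follows via a linear model, can be illustrated with a simple correlation-ranked selection. This is a deliberate simplification standing in for the paper's LARS-based detector, with hypothetical coefficient data:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def best_feature(X, y):
    """Rank candidate features (columns of X) by |correlation| with the
    target parameter; a crude stand-in for LARS-style selection."""
    cols = list(zip(*X))
    return max(range(len(cols)), key=lambda j: abs(pearson(cols[j], y)))

# Hypothetical decomposition coefficients: column 2 carries the
# Teff-like signal (y = 2 * column 2), the other columns are noise-like.
X = [[0.1, 5.0, 1.0], [0.9, 4.0, 2.0], [0.2, 6.0, 3.0], [0.8, 5.5, 4.0]]
y = [2.0, 4.0, 6.0, 8.0]
j = best_feature(X, y)
```

    A final linear regression on the selected coefficients would then produce the parameter estimate, mirroring the third step above.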

  12. Automated Extraction of Geospatial Features from Satellite Imagery: Computer Vision Oriented Plane Surveying

    Directory of Open Access Journals (Sweden)

    Usman Babawuro

    2012-11-01

    Full Text Available The paper explores and assesses the potential uses of high-resolution satellite imagery and digital image processing algorithms for the automatic detection and extraction of geospatial features, farmlands, for the purpose of statutory plane surveying tasks. The satellite imagery was georectified to provide the planar surface necessary for morphometric assessments, followed by integrated image processing algorithms. Precisely, the Canny edge algorithm followed by morphological closing, as well as the Hough transform for extracting lines of features, was used. The algorithms were tested using QuickBird satellite imagery and in all cases we obtained encouraging results. This shows that computer vision and image processing using high-resolution satellite imagery could be used for cadastral purposes, where property boundaries are needed and used for compensation and other statutory surveying functions. The accuracy of the delineated boundaries, estimated from the error matrix, is 73.33%.
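
    The line-extraction stage above, a Hough transform applied to edge pixels, can be illustrated with a minimal voting accumulator. This is a teaching sketch of the standard rho-theta Hough transform, not the paper's code, and the edge pixels below are hypothetical:

```python
import math

def hough_best_line(edge_points, theta_steps=180):
    """Vote in (rho, theta) space: each edge pixel votes for every line
    through it, and the strongest bin is the dominant boundary line."""
    acc = {}
    for x, y in edge_points:
        for i in range(theta_steps):
            theta = math.pi * i / theta_steps
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(rho, i)] = acc.get((rho, i), 0) + 1
    (rho, i), votes = max(acc.items(), key=lambda kv: kv[1])
    return rho, math.degrees(math.pi * i / theta_steps), votes

# Hypothetical edge pixels along a vertical field boundary at x = 5
edges = [(5, y) for y in range(20)]
rho, theta_deg, votes = hough_best_line(edges)
```

    In practice the input would be the binary edge map produced by Canny detection and morphological closing, and several strong bins would be kept, one per boundary segment.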

  13. Quantification of lung tumor rotation with automated landmark extraction using orthogonal cine MRI images

    Science.gov (United States)

    Paganelli, Chiara; Lee, Danny; Greer, Peter B.; Baroni, Guido; Riboldi, Marco; Keall, Paul

    2015-09-01

    The quantification of tumor motion in sites affected by respiratory motion is of primary importance to improve treatment accuracy. To account for motion, different studies analyzed the translational component only, without focusing on the rotational component, which was quantified in a few studies on the prostate with implanted markers. The aim of our study was to propose a tool able to quantify lung tumor rotation without the use of internal markers, thus providing accurate motion detection close to critical structures such as the heart or liver. Specifically, we propose the use of an automatic feature extraction method in combination with the acquisition of fast orthogonal cine MRI images of nine lung patients. As a preliminary test, we evaluated the performance of the feature extraction method by applying it on regions of interest around (i) the diaphragm and (ii) the tumor and comparing the estimated motion with that obtained by (i) the extraction of the diaphragm profile and (ii) the segmentation of the tumor, respectively. The results confirmed the capability of the proposed method in quantifying tumor motion. Then, a point-based rigid registration was applied to the extracted tumor features between all frames to account for rotation. The median lung rotation values were −0.6 ± 2.3° and −1.5 ± 2.7° in the sagittal and coronal planes respectively, confirming the need to account for tumor rotation along with translation to improve radiotherapy treatment.
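
    Recovering a rotation angle from matched landmark points, the point-based rigid registration step above, has a closed-form solution in 2D. A minimal sketch on synthetic landmarks (the points are hypothetical; the coronal median of −1.5° is reused only as a demonstration value):

```python
import math

def rotation_angle(src, dst):
    """Closed-form 2D rigid registration: centre both point sets, then
    atan2 of the summed cross/dot products gives the least-squares
    rotation (in degrees) taking src onto dst."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    cross = dot = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs, ys, xd, yd = xs - csx, ys - csy, xd - cdx, yd - cdy
        cross += xs * yd - ys * xd
        dot += xs * xd + ys * yd
    return math.degrees(math.atan2(cross, dot))

# Synthetic landmarks rotated by -1.5 degrees
a = math.radians(-1.5)
src = [(1.0, 0.0), (0.0, 2.0), (3.0, 1.0), (2.0, 4.0)]
dst = [(x * math.cos(a) - y * math.sin(a), x * math.sin(a) + y * math.cos(a))
       for x, y in src]
angle = rotation_angle(src, dst)
```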

  14. Quantification of lung tumor rotation with automated landmark extraction using orthogonal cine MRI images

    International Nuclear Information System (INIS)

    The quantification of tumor motion in sites affected by respiratory motion is of primary importance to improve treatment accuracy. To account for motion, different studies analyzed the translational component only, without focusing on the rotational component, which was quantified in a few studies on the prostate with implanted markers. The aim of our study was to propose a tool able to quantify lung tumor rotation without the use of internal markers, thus providing accurate motion detection close to critical structures such as the heart or liver. Specifically, we propose the use of an automatic feature extraction method in combination with the acquisition of fast orthogonal cine MRI images of nine lung patients. As a preliminary test, we evaluated the performance of the feature extraction method by applying it on regions of interest around (i) the diaphragm and (ii) the tumor and comparing the estimated motion with that obtained by (i) the extraction of the diaphragm profile and (ii) the segmentation of the tumor, respectively. The results confirmed the capability of the proposed method in quantifying tumor motion. Then, a point-based rigid registration was applied to the extracted tumor features between all frames to account for rotation. The median lung rotation values were −0.6 ± 2.3° and −1.5 ± 2.7° in the sagittal and coronal planes respectively, confirming the need to account for tumor rotation along with translation to improve radiotherapy treatment. (paper)

  15. AUTOMATED ALGORITHM FOR EXTRACTION OF WETLANDS FROM IRS RESOURCESAT LISS III DATA

    OpenAIRE

    Subramaniam, S; Saxena, M

    2012-01-01

    Wetlands play significant role in maintaining the ecological balance of both biotic and abiotic life in coastal and inland environments. Hence, understanding of their occurrence, spatial extent of change in wetland environment is very important and can be monitored using satellite remote sensing technique. The extraction of wetland features using remote sensing has so far been carried out using visual/ hybrid digital analysis techniques, which is time consuming. To monitor the wetlan...

  16. Automated DNA extraction platforms offer solutions to challenges of assessing microbial biofouling in oil production facilities

    OpenAIRE

    Oldham, Athenia L.; Drilling, Heather S; Stamps, Blake W; Stevenson, Bradley S.; Kathleen E. Duncan

    2012-01-01

    The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Produ...

  17. Application of automated image analysis to the identification and extraction of recyclable plastic bottles

    Institute of Scientific and Technical Information of China (English)

    Edgar SCAVINO; Dzuraidah Abdul WAHAB; Aini HUSSAIN; Hassan BASRI; Mohd Marzuki MUSTAFA

    2009-01-01

    An experimental machine vision apparatus was used to identify and extract recyclable plastic bottles from a conveyor belt. Color images were taken with a commercially available Webcam, and the recognition was performed by our homemade software, based on the shape and dimensions of object images. The software was able to manage multiple bottles in a single image and was additionally extended to cases involving touching bottles. The identification was fulfilled by comparing the set of measured features with an existing database and meanwhile integrating various recognition techniques such as minimum distance in the feature space, self-organized maps, and neural networks. The recognition system was tested on a set of 50 different bottles and has so far provided an accuracy of about 97% in bottle identification. The extraction of the bottles was performed by means of a pneumatic arm, which was activated according to the plastic type: polyethylene-terephthalate (PET) bottles were left on the conveyor belt, while non-PET bottles were extracted. The software was designed to provide the best compromise between reliability and speed for real-time applications in view of the commercialization of the system at existing recycling plants.
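    The minimum-distance classification mentioned in the abstract can be sketched in a few lines: the measured feature vector is assigned to the class whose stored prototype is nearest in feature space. The class names and feature values below are hypothetical placeholders, since the paper's actual feature database is not given here.

```python
import numpy as np

# Hypothetical feature database: (height_mm, width_mm, fill ratio) per class.
database = {
    "PET_500ml":  np.array([205.0, 65.0, 0.62]),
    "PET_1500ml": np.array([310.0, 88.0, 0.58]),
    "HDPE_milk":  np.array([240.0, 95.0, 0.71]),
}

def classify(measured, db, scale=None):
    """Minimum-distance classification: return the class whose stored
    feature vector is nearest to the measured one."""
    if scale is None:
        scale = np.ones_like(measured)   # optional per-feature normalisation
    best, best_d = None, float("inf")
    for label, ref in db.items():
        dist = np.linalg.norm((measured - ref) / scale)
        if dist < best_d:
            best, best_d = label, dist
    return best

label = classify(np.array([208.0, 64.0, 0.60]), database)
```

    In a full system this rule would be combined with the other classifiers mentioned (self-organized maps, neural networks), e.g. by majority vote.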

  18. Automated Extraction of Buildings and Roads in a Graph Partitioning Framework

    Science.gov (United States)

    Ok, A. O.

    2013-10-01

    This paper presents an original unsupervised framework to identify regions belonging to buildings and roads from monocular very high resolution (VHR) satellite images. The proposed framework consists of three main stages. In the first stage, we extract information only related to building regions using shadow evidence and probabilistic fuzzy landscapes. Firstly, the shadow areas cast by building objects are detected and the directional spatial relationship between buildings and their shadows is modelled with the knowledge of illumination direction. Thereafter, each shadow region is handled separately and initial building regions are identified by iterative graph-cuts designed in a two-label partitioning. The second stage of the framework automatically classifies the image into four classes: building, shadow, vegetation, and others. In this step, the previously labelled building regions as well as the shadow and vegetation areas are involved in a four-label graph optimization performed in the entire image domain to achieve the unsupervised classification result. The final stage aims to extend this classification to five classes in which the class road is involved. For that purpose, we extract the regions that might belong to road segments and utilize that information in a final graph optimization. This final stage eventually characterizes the regions belonging to buildings and roads. Experiments performed on seven test images selected from GeoEye-1 VHR datasets show that the presented approach has the ability to extract the regions belonging to buildings and roads in a single graph theory framework.

  19. Evaluation of an Automated Information Extraction Tool for Imaging Data Elements to Populate a Breast Cancer Screening Registry.

    Science.gov (United States)

    Lacson, Ronilda; Harris, Kimberly; Brawarsky, Phyllis; Tosteson, Tor D; Onega, Tracy; Tosteson, Anna N A; Kaye, Abby; Gonzalez, Irina; Birdwell, Robyn; Haas, Jennifer S

    2015-10-01

    Breast cancer screening is central to early breast cancer detection. Identifying and monitoring process measures for screening is a focus of the National Cancer Institute's Population-based Research Optimizing Screening through Personalized Regimens (PROSPR) initiative, which requires participating centers to report structured data across the cancer screening continuum. We evaluate the accuracy of automated information extraction of imaging findings from radiology reports, which are available as unstructured text. We present prevalence estimates of imaging findings for breast imaging received by women who obtained care in a primary care network participating in PROSPR (n = 139,953 radiology reports) and compare automatically extracted data elements to a "gold standard" based on manual review for a validation sample of 941 randomly selected radiology reports, including mammograms, digital breast tomosynthesis, ultrasound, and magnetic resonance imaging (MRI). The prevalence of imaging findings varies by data element and modality (e.g., suspicious calcification noted in 2.6% of screening mammograms, 12.1% of diagnostic mammograms, and 9.4% of tomosynthesis exams). In the validation sample, the accuracy of identifying imaging findings, including suspicious calcifications, masses, and architectural distortion (on mammogram and tomosynthesis); masses, cysts, non-mass enhancement, and enhancing foci (on MRI); and masses and cysts (on ultrasound), ranges from 0.8 to 1.0 for recall, precision, and F-measure. Information extraction tools can be used for accurate documentation of imaging findings as structured data elements from text reports for a variety of breast imaging modalities. These data can be used to populate screening registries to help elucidate more effective breast cancer screening processes. PMID:25561069
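    The recall/precision/F-measure comparison against a manually reviewed gold standard can be reproduced with a few lines of set arithmetic. The report IDs and findings below are invented for illustration; the study's data are of course far larger.

```python
def precision_recall_f(extracted, gold):
    """Compare automatically extracted findings against a manually
    reviewed gold standard (both: sets of (report_id, finding) pairs)."""
    tp = len(extracted & gold)                      # true positives
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f

# Toy validation sample (hypothetical report IDs and findings)
gold = {(1, "mass"), (2, "calcification"), (3, "cyst"), (4, "mass")}
auto = {(1, "mass"), (2, "calcification"), (4, "mass"), (5, "mass")}
p, r, f = precision_recall_f(auto, gold)
```

    Here the tool finds 3 of the 4 gold findings and adds one false positive, giving precision, recall and F-measure of 0.75 each.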

  20. An Automated Graphical User Interface based System for the Extraction of Retinal Blood Vessels using Kirsch’s Template

    Directory of Open Access Journals (Sweden)

    Joshita Majumdar

    2015-06-01

    Full Text Available The assessment of blood vessel networks plays an important role in a variety of medical disorders. The diagnosis of Diabetic Retinopathy (DR) and its repercussions, including microaneurysms, haemorrhages, hard exudates and cotton wool spots, is one such field. This study aims to develop an automated system for the extraction of blood vessels from retinal images by employing Kirsch’s Templates in a MATLAB based Graphical User Interface (GUI). Here, an RGB or grey image of the retina (fundus photography) is used to obtain the traces of blood vessels. We have incorporated a range of threshold values for the blood vessel extraction, which provides the user with greater flexibility and ease. This paper also deals with the more generalized implementation of various MATLAB functions present in the image processing toolbox of MATLAB to create a basic image processing editor with different features like noise addition and removal, image cropping, resizing and rotation, histogram adjustment, and separate viewing of the red, green and blue components of a colour image along with brightness control. We have combined both Kirsch’s Template and various MATLAB algorithms to obtain enhanced images which would allow the ophthalmologist to edit and intensify the images as per his/her requirement for diagnosis. Even a non-technical person can manage to identify severe discrepancies because of its user friendly appearance. The GUI contains commonly used English labels, viz. Load, Colour Contrast Panel, Image Clarity, etc., that can be easily understood. It is an attempt to incorporate the maximum number of image processing techniques under one GUI to obtain higher performance. It would also provide a cost effective solution towards obtaining high definition and resolution images of the blood-vessel-extracted retina in economically backward regions where costly machines like OCT (Optical Coherence Tomography), MRI (Magnetic Resonance
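    Kirsch's template method convolves the image with eight rotated compass kernels (a ring of three 5s and five −3s) and keeps the maximum response, which is then thresholded. A minimal NumPy sketch on a toy image follows; it illustrates the standard Kirsch masks only, not the paper's MATLAB GUI.

```python
import numpy as np

def kirsch_response(img):
    """Maximum response over the eight Kirsch compass kernels
    (pure-NumPy 3x3 correlation with edge padding)."""
    # Clockwise ring of the eight neighbour offsets in a 3x3 window
    ring = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    base = [5, 5, 5, -3, -3, -3, -3, -3]   # rotated to give the 8 kernels
    pad = np.pad(img.astype(float), 1, mode="edge")
    H, W = img.shape
    best = np.full(img.shape, -np.inf)
    for k in range(8):
        weights = np.roll(base, k)
        resp = np.zeros(img.shape)
        for (r, c), w in zip(ring, weights):
            resp += w * pad[r:r + H, c:c + W]
        best = np.maximum(best, resp)
    return best

# Toy "vessel": a bright vertical line on a dark background
img = np.zeros((7, 7))
img[:, 3] = 1.0
edges = kirsch_response(img) > 0
```

    Varying the threshold (here simply 0) corresponds to the range of threshold values the GUI exposes to the user.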

  1. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples.

    Science.gov (United States)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels

    2011-04-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was set up in 96-well microtiter plates. The methods were validated for the kits: AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for DNA extraction and PCR master mix addition for 96 samples within 3.5 h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler, reducing manual work and increasing quality and throughput. PMID:21609694

  2. An energy minimization approach to automated extraction of regular building footprints from airborne LiDAR data

    Science.gov (United States)

    He, Y.; Zhang, C.; Fraser, C. S.

    2014-08-01

    This paper presents an automated approach to the extraction of building footprints from airborne LiDAR data based on energy minimization. Automated 3D building reconstruction in complex urban scenes has been a long-standing challenge in photogrammetry and computer vision. Building footprints constitute a fundamental component of a 3D building model and they are useful for a variety of applications. Airborne LiDAR provides a large-scale elevation representation of the urban scene and as such is an important data source for object reconstruction in spatial information systems. However, LiDAR points on building edges often exhibit a jagged pattern, due either to occlusion from neighbouring objects, such as overhanging trees, or to the nature of the data itself, including unavoidable noise and irregular point distributions. Explicit 3D reconstruction may thus result in irregular or incomplete building polygons. In the presented work, a vertex-driven Douglas-Peucker method is developed to generate polygonal hypotheses from points forming initial building outlines. An energy function is adopted to examine and evaluate each hypothesis and the optimal polygon is determined through energy minimization. The energy minimization also plays a key role in bridging gaps, where the building outlines are ambiguous due to insufficient LiDAR points. In formulating the energy function, hard constraints such as parallelism and perpendicularity of building edges are imposed, and local and global adjustments are applied. The developed approach has been extensively tested and evaluated on datasets with varying point cloud density over different terrain types. Results are presented and analysed. The successful reconstruction of building footprints, of varying structural complexity, along with a quantitative assessment employing accurate reference data, demonstrate the practical potential of the proposed approach.
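    The abstract does not spell out the vertex-driven variant, but the classic Douglas-Peucker simplification it builds on is easy to sketch: recursively keep the vertex farthest from the chord between the endpoints whenever its distance exceeds a tolerance. A generic NumPy version (not the authors' implementation):

```python
import numpy as np

def douglas_peucker(points, tol):
    """Classic recursive Douglas-Peucker polyline simplification (2D)."""
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points
    a, b = points[0], points[-1]
    ab = b - a
    norm_ab = np.linalg.norm(ab)

    def dist(p):
        # Perpendicular distance from p to the line through a and b
        if norm_ab == 0.0:
            return np.linalg.norm(p - a)
        return abs(ab[0] * (p - a)[1] - ab[1] * (p - a)[0]) / norm_ab

    d = [dist(p) for p in points[1:-1]]
    i = int(np.argmax(d)) + 1
    if d[i - 1] > tol:                      # split at the farthest vertex
        left = douglas_peucker(points[:i + 1], tol)
        right = douglas_peucker(points[i:], tol)
        return np.vstack([left[:-1], right])
    return np.vstack([a, b])                # all vertices within tolerance

# A jagged outline collapses to its endpoints under a coarse tolerance
outline = [(0, 0), (1, 0.1), (2, -0.1), (3, 0.05), (4, 0)]
simplified = douglas_peucker(outline, tol=0.5)
```

    In the paper's setting, the retained vertices form the polygonal hypotheses that the energy function then scores.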

  3. Fault management for data systems

    Science.gov (United States)

    Boyd, Mark A.; Iverson, David L.; Patterson-Hine, F. Ann

    1993-01-01

    Issues related to automating the process of fault management (fault diagnosis and response) for data management systems are considered. Substantial benefits are to be gained by successful automation of this process, particularly for large, complex systems. The use of graph-based models to develop a computer assisted fault management system is advocated. The general problem is described and the motivation behind choosing graph-based models over other approaches for developing fault diagnosis computer programs is outlined. Some existing work in the area of graph-based fault diagnosis is reviewed, and a new fault management method which was developed from existing methods is offered. Our method is applied to an automatic telescope system intended as a prototype for future lunar telescope programs. Finally, an application of our method to general data management systems is described.
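    One simple graph-based formulation, in the spirit of the abstract though not the authors' exact models: represent fault propagation as a digraph and report the components whose propagation cone explains all observed symptoms. The component names below are hypothetical, loosely inspired by the telescope example.

```python
from collections import defaultdict

# Hypothetical fault-propagation digraph: edge u -> v means
# "a fault in u can produce a fault/symptom in v".
edges = [
    ("power_supply", "controller"), ("controller", "telescope_drive"),
    ("controller", "data_link"), ("sensor", "data_link"),
]
graph = defaultdict(list)
for u, v in edges:
    graph[u].append(v)

def reachable(node):
    """All components a fault at `node` can propagate to (iterative DFS)."""
    seen, stack = set(), [node]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(graph[n])
    return seen

def candidate_causes(symptoms, components):
    """Components whose propagation cone covers every observed symptom."""
    return [c for c in components if symptoms <= reachable(c)]

components = {"power_supply", "controller", "sensor",
              "telescope_drive", "data_link"}
causes = candidate_causes({"telescope_drive", "data_link"}, components)
```

    Real diagnosis engines add probabilities, sensors, and repair actions on top of this reachability core, but the cone-covering idea is the same.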

  4. High-Level Analogue Fault Simulation Using Linear and Non-Linear Models

    Directory of Open Access Journals (Sweden)

    I. Bell

    1999-12-01

    Full Text Available A novel method for analogue high-level fault simulation (HLFS) using linear and non-linear high-level fault models is presented. Our approach uses automated fault model synthesis and automated model selection for fault simulation. A speed-up compared with transistor-level fault simulation can be achieved, whilst retaining both behavioural and fault coverage accuracy. The suggested method was verified in detail using short faults in a 10k state variable bandpass filter.

  5. Pedestrian detection in thermal images: An automated scale based region extraction with curvelet space validation

    Science.gov (United States)

    Lakshmi, A.; Faheema, A. G. J.; Deodhare, Dipti

    2016-05-01

    Pedestrian detection is a key problem in night vision processing, with dozens of applications that will positively impact the performance of autonomous systems. Despite significant progress, our study shows that the performance of state-of-the-art thermal image pedestrian detectors still has much room for improvement. The purpose of this paper is to overcome the challenge faced by thermal image pedestrian detectors which employ intensity-based Region Of Interest (ROI) extraction followed by feature-based validation. The most striking disadvantage faced by the first module, ROI extraction, is the failed detection of cloth-insulated parts. To overcome this setback, this paper employs an algorithm and a principle of region-growing pursuit tuned to the scale of the pedestrian. The statistics subtended by the pedestrian vary drastically with scale, and a deviation-from-normality approach facilitates scale detection. Further, the paper offers an adaptive mathematical threshold to resolve the problem of subtracting the background while still extracting cloth-insulated parts. The inherent false positives of the ROI extraction module are limited by the choice of good features in the pedestrian validation step. One such feature is the curvelet feature, which has found extensive use in optical images but has as yet no reported results in thermal images. This has been used to arrive at a pedestrian detector with a reduced false positive rate. This work is the first venture made to scrutinize the utility of curvelets for characterizing pedestrians in thermal images. An attempt has also been made to improve the speed of curvelet transform computation. The classification task is realized through the use of the well-known methodology of Support Vector Machines (SVMs).
The proposed method is substantiated with qualified evaluation methodologies that permit us to carry out probing and informative comparisons across state-of-the-art features, including deep learning methods, with six

  6. Automated DICOM metadata and volumetric anatomical information extraction for radiation dosimetry

    Science.gov (United States)

    Papamichail, D.; Ploussi, A.; Kordolaimi, S.; Karavasilis, E.; Papadimitroulas, P.; Syrgiamiotis, V.; Efstathopoulos, E.

    2015-09-01

    Patient-specific dosimetry calculations based on simulation techniques have as a prerequisite the modeling of the modality system and the creation of voxelized phantoms. This procedure requires knowledge of the scanning parameters and patient information included in a DICOM file, as well as image segmentation. However, the extraction of this information is complicated and time-consuming. The objective of this study was to develop a simple graphical user interface (GUI) to (i) automatically extract metadata from every slice image of a DICOM file in a single query and (ii) interactively specify the regions of interest (ROI) without explicit access to the radiology information system. The user-friendly application was developed in the Matlab environment. The user can select a series of DICOM files and manage their text and graphical data. The metadata are automatically formatted and presented to the user as a Microsoft Excel file. The volumetric maps are formed by interactively specifying the ROIs and by assigning a specific value to every ROI. The result is stored in DICOM format for data and trend analysis. The developed GUI is easy and fast to use and constitutes a very useful tool for individualized dosimetry. One of the future goals is to incorporate remote access to a PACS server.

  7. Detecting and extracting clusters in atom probe data: A simple, automated method using Voronoi cells

    Energy Technology Data Exchange (ETDEWEB)

    Felfer, P., E-mail: peter.felfer@sydney.edu.au [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Ceguerra, A.V., E-mail: anna.ceguerra@sydney.edu.au [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Ringer, S.P., E-mail: simon.ringer@sydney.edu.au [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Cairney, J.M., E-mail: julie.cairney@sydney.edu.au [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia)

    2015-03-15

    The analysis of the formation of clusters in solid solutions is one of the most common uses of atom probe tomography. Here, we present a method where we use the Voronoi tessellation of the solute atoms and its geometric dual, the Delaunay triangulation to test for spatial/chemical randomness of the solid solution as well as extracting the clusters themselves. We show how the parameters necessary for cluster extraction can be determined automatically, i.e. without user interaction, making it an ideal tool for the screening of datasets and the pre-filtering of structures for other spatial analysis techniques. Since the Voronoi volumes are closely related to atomic concentrations, the parameters resulting from this analysis can also be used for other concentration based methods such as iso-surfaces. - Highlights: • Cluster analysis of atom probe data can be significantly simplified by using the Voronoi cell volumes of the atomic distribution. • Concentration fields are defined on a single atomic basis using Voronoi cells. • All parameters for the analysis are determined by optimizing the separation probability of bulk atoms vs clustered atoms.
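    The abstract's key observation, that Voronoi cell volumes are closely related to local atomic concentration, can be illustrated with a close proxy: the nearest-neighbour (NN) distance. Clustered solute atoms have NN distances far below the expectation for a spatially random (Poisson) arrangement, which is the basis of the randomness test. The 2D synthetic sketch below is illustrative only, not the authors' implementation, and the 0.25 separation factor is an arbitrary stand-in for their optimized threshold.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic 2D "solute atoms": a random background plus one tight cluster
background = rng.random((200, 2))
cluster = 0.5 + 0.01 * rng.standard_normal((30, 2))
pts = np.vstack([background, cluster])

def nn_dist(points):
    """Nearest-neighbour distance of every point (brute force).
    The NN distance is a close proxy for the local Voronoi cell size."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1)

d = nn_dist(pts)
lam = len(pts)                      # point intensity on the unit square
expected = 0.5 / np.sqrt(lam)       # mean NN distance of a 2D Poisson process
clustered = d < 0.25 * expected     # crude, illustrative separation threshold
```

    In the paper this separation is done on Voronoi volumes directly, with the threshold chosen automatically by optimizing the bulk-vs-cluster separation probability.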

  8. Automated feature extraction and spatial organization of seafloor pockmarks, Belfast Bay, Maine, USA

    Science.gov (United States)

    Andrews, B.D.; Brothers, L.L.; Barnhardt, W.A.

    2010-01-01

    Seafloor pockmarks occur worldwide and may represent millions of m3 of continental shelf erosion, but few numerical analyses of their morphology and spatial distribution exist. We introduce a quantitative definition of pockmark morphology and, based on this definition, propose a three-step geomorphometric method to identify and extract pockmarks from high-resolution swath bathymetry. We apply this GIS-implemented approach to 25 km2 of bathymetry collected in the Belfast Bay, Maine (USA) pockmark field. Our model extracted 1767 pockmarks and found a linear pockmark depth-to-diameter ratio field-wide. Mean pockmark depth is 7.6 m and mean diameter is 84.8 m. Pockmark distribution is non-random, and nearly half of the field's pockmarks occur in chains. The most prominent chains are oriented semi-normal to the steepest gradient in Holocene sediment thickness. A descriptive model yields field-wide spatial statistics indicating that pockmarks are distributed in non-random clusters. Results enable quantitative comparison of pockmarks in fields worldwide as well as similar concave features, such as impact craters, dolines, or salt pools.

  9. Detecting and extracting clusters in atom probe data: A simple, automated method using Voronoi cells

    International Nuclear Information System (INIS)

    The analysis of the formation of clusters in solid solutions is one of the most common uses of atom probe tomography. Here, we present a method where we use the Voronoi tessellation of the solute atoms and its geometric dual, the Delaunay triangulation to test for spatial/chemical randomness of the solid solution as well as extracting the clusters themselves. We show how the parameters necessary for cluster extraction can be determined automatically, i.e. without user interaction, making it an ideal tool for the screening of datasets and the pre-filtering of structures for other spatial analysis techniques. Since the Voronoi volumes are closely related to atomic concentrations, the parameters resulting from this analysis can also be used for other concentration based methods such as iso-surfaces. - Highlights: • Cluster analysis of atom probe data can be significantly simplified by using the Voronoi cell volumes of the atomic distribution. • Concentration fields are defined on a single atomic basis using Voronoi cells. • All parameters for the analysis are determined by optimizing the separation probability of bulk atoms vs clustered atoms

  10. A novel approach for automated shoreline extraction from remote sensing images using low level programming

    Science.gov (United States)

    Rigos, Anastasios; Vaiopoulos, Aristidis; Skianis, George; Tsekouras, George; Drakopoulos, Panos

    2015-04-01

    Tracking coastline changes is a crucial task in the context of coastal management, and synoptic remotely sensed data have become an essential tool for this purpose. In this work, and within the framework of the BeachTour project, we introduce a new method for shoreline extraction from high resolution satellite images. It was applied to two images taken by the WorldView-2 satellite (7 channels, 2 m resolution) during July 2011 and August 2014. The location is the well-known tourist destination of Laganas beach, spanning 5 km along the southern part of Zakynthos Island, Greece. The atmospheric correction was performed with the ENVI FLAASH procedure and the final images were validated against hyperspectral field measurements. Using three channels (CH2 = blue, CH3 = green and CH7 = near infrared), the Modified Redness Index image was calculated according to: MRI = (CH7)²/[CH2 × (CH3)³]. MRI has the property that its value keeps increasing as the water becomes shallower. This is followed by an abrupt reduction trend at the location of the wet sand up to the point where the dry shore face begins. After that it remains low-valued throughout the beach zone. Images based on this index were used for the shoreline extraction process, which included the following steps: a) On the MRI based image, only an area near the shoreline was kept (this process is known as image masking). b) On the masked image the Canny edge detector was applied. c) Of all the edges discovered in step (b), only the biggest was kept. d) If the line revealed in step (c) was unacceptable, i.e. it did not define the shoreline or defined only part of it, then either more than one area from step (c) was kept, or the pixel values of the MRI image were bound in a particular interval [B_low, B_high] and only the ones belonging to this interval were kept. Then, steps (a)-(d) were repeated.
Using this method, which is still under development, we were able to extract the shoreline position and reveal its changes during the 3-year period.
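    The index computation and the interval-bounding step can be sketched directly in NumPy. The pixel values below are invented, chosen so the toy 1D profile mimics the described deep-water to dry-beach behaviour; this is an illustration of the formula, not the project's processing chain.

```python
import numpy as np

def modified_redness_index(ch2_blue, ch3_green, ch7_nir, eps=1e-12):
    """MRI = NIR^2 / (blue * green^3), computed per pixel."""
    return ch7_nir.astype(float) ** 2 / (
        ch2_blue.astype(float) * ch3_green.astype(float) ** 3 + eps)

def band_mask(mri, low, high):
    """Keep only pixels whose MRI value falls in [low, high] --
    the interval-bounding fallback of the extraction loop."""
    return (mri >= low) & (mri <= high)

# Toy 1D shore profile: deep water -> shallow water -> wet sand -> dry beach
blue  = np.array([80, 70, 60, 50, 55, 60], dtype=float)
green = np.array([90, 80, 70, 60, 70, 80], dtype=float)
nir   = np.array([ 5, 12, 30, 45, 20, 15], dtype=float)
mri = modified_redness_index(blue, green, nir)
```

    As described, the index rises toward shallow water, peaks at the wet-sand transition (index 3 here), and drops again across the beach zone.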

  11. Acquisition of data for plasma simulation by automated extraction of terminology from article abstracts

    International Nuclear Information System (INIS)

    Computer simulation of burning plasmas, as well as computational plasma modeling in image processing, requires a number of accurate data in addition to a relevant model framework. To this end, it is very important to recognize, obtain and evaluate data relevant for such a simulation from the literature. This work focuses on the simultaneous search of relevant data across various online databases, the extraction of cataloguing and numerical information, and the automatic recognition of specific terminology in the retrieved text. The concept is illustrated with the particular terminology of Atomic and Molecular data relevant to edge plasma simulation. The IAEA search engine GENIE and the NIFS search engine Joint Search 2 are compared and discussed. Accurate modeling of the imaged object is considered to be the ultimate challenge in improving the resolution limits of plasma imaging. (author)

  12. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    Directory of Open Access Journals (Sweden)

    Elena Ordoñez

    2013-01-01

    Full Text Available Objective. The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours and automated DNA extraction, for fetal sex determination in maternal plasma. Methods. A total of 158 blood samples were collected, using EDTA-K tubes, from women in their 1st trimester of pregnancy. Samples were kept at 4°C for at least 24 hours before processing. An automated DNA extraction was evaluated, and its efficiency was compared with a standard manual procedure. The SRY marker was used to quantify cfDNA by real-time PCR. Results. Although lower cfDNA amounts were obtained by automated DNA extraction (mean 107.35 GE/mL versus 259.43 GE/mL), the SRY sequence was successfully detected in all 108 samples from pregnancies with male fetuses. Conclusion. We successfully evaluated the suitability of standard blood tubes for the collection of maternal blood and assessed samples to be suitable for analysis at least 24 hours later. This would allow shipping to a central reference laboratory almost from anywhere in Europe.

  13. Automated extraction of absorption features from Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and Geophysical and Environmental Research Imaging Spectrometer (GERIS) data

    Science.gov (United States)

    Kruse, Fred A.; Calvin, Wendy M.; Seznec, Olivier

    1988-01-01

    Automated techniques were developed for the extraction and characterization of absorption features from reflectance spectra. The absorption feature extraction algorithms were successfully tested on laboratory, field, and aircraft imaging spectrometer data. A suite of laboratory spectra of the most common minerals was analyzed and absorption band characteristics tabulated. A prototype expert system was designed, implemented, and successfully tested to allow identification of minerals based on the extracted absorption band characteristics. AVIRIS spectra for a site in the northern Grapevine Mountains, Nevada, have been characterized and the minerals sericite (fine grained muscovite) and dolomite were identified. The minerals kaolinite, alunite, and buddingtonite were identified and mapped for a site at Cuprite, Nevada, using the feature extraction algorithms on the new Geophysical and Environmental Research 64 channel imaging spectrometer (GERIS) data. The feature extraction routines (written in FORTRAN and C) were interfaced to the expert system (written in PROLOG) to allow both efficient processing of numerical data and logical spectrum analysis.
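    Absorption-feature extraction of this kind typically starts with continuum removal, dividing the spectrum by its upper convex hull, before the band position and depth are measured. The self-contained sketch below uses a synthetic spectrum and a monotone-chain upper hull; it illustrates the standard technique, not the original FORTRAN/C routines.

```python
import numpy as np

def continuum_removed(wl, refl):
    """Divide a reflectance spectrum by its upper convex hull
    (the 'continuum'), the usual first step before measuring
    absorption-band position and depth."""
    hull = []                                  # upper hull by monotone chain
    for i in range(len(wl)):
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            # pop hull[-1] if it lies on or below the chord hull[-2] -> point i
            if (x2 - x1) * (refl[i] - y1) - (y2 - y1) * (wl[i] - x1) >= 0:
                hull.pop()
            else:
                break
        hull.append((wl[i], refl[i]))
    hx, hy = zip(*hull)
    continuum = np.interp(wl, hx, hy)
    return refl / continuum

# Synthetic spectrum: flat continuum with one Gaussian absorption at 2200 nm
wl = np.linspace(2000, 2400, 81)
refl = 0.8 - 0.3 * np.exp(-((wl - 2200) / 40.0) ** 2)
cr = continuum_removed(wl, refl)
band_center = wl[np.argmin(cr)]   # absorption-band position
band_depth = 1.0 - cr.min()       # absorption-band depth
```

    Tabulating band center, depth and width per mineral is exactly the kind of characterization the prototype expert system then reasons over.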

  14. Analysis of the influence of tectonics on the evolution valley network based on the SRTM DEM and the relationship of automatically extracted lineaments and the tectonic faults, Jemma River basin, Ethiopia

    Science.gov (United States)

    Kusák, Michal

    2016-04-01

    The Ethiopian Highland is a good example of a high-plateau landscape formed by a combination of tectonic uplift and episodic volcanism (Kazmin, 1975; Pik et al., 2003; Gani et al., 2009). Deeply incised gorges indicate active fluvial erosion, which leads to instabilities of over-steepened slopes. In this study we focus on the Jemma River basin, a left tributary of the Abay (Blue Nile), to assess the influence of neotectonics on the evolution of its river and valley network. Tectonic lineaments, the shape of valley networks, the direction of river courses and the intensity of fluvial erosion were compared in six subregions which were delineated beforehand by means of morphometric analysis. The influence of tectonics on the valley network is low in the older, deep and wide canyons and on the high plateau covered with Tertiary lava flows, while it is high in the younger, upper parts of the canyons. Furthermore, the coincidence of the valley network with the tectonic lineaments differs among the subregions. Fluvial erosion along the main tectonic zones (NE-SW direction) opened the way for backward erosion to reach far-distant areas in the E. This tectonic zone also separates older areas in the W from the subregions of youngest landscape evolution in the E, next to the Rift Valley. We studied the functions that can automatically extract lineaments in the programs ArcGIS 10.1 and PCI Geomatica, together with the values of their input parameters and the influence of these parameters on the final shape and number of lineaments. A map of automatically extracted lineaments was created and compared with 1) the tectonic faults mapped by the Geological Survey of Ethiopia (1996); and 2) the lineaments based on visual interpretation by the author. The comparison of the lineaments extracted automatically in GIS with those interpreted visually by the author proves that both sets of lineaments have the same azimuth (NE-SW), the same direction as the orientation of the rift. But the mapping of lineaments by automated

  15. Automated 3D Particle Field Extraction and Tracking System Using Digital in-line Holography

    Directory of Open Access Journals (Sweden)

    Hesham Eldeeb

    2006-01-01

    Full Text Available Digital holography for 3D particle field extraction and tracking is an active research topic. It has great application in the characterization of micro-scale structures in microelectromechanical systems (MEMS) with high resolution and accuracy. The in-line configuration is studied here as the fundamental structure of a digital holography system. The digital holographic approach not only eliminates wet chemical processing and mechanical scanning, but also enables the use of complex amplitude information inaccessible by optical reconstruction, thereby allowing flexible reconstruction algorithms to achieve optimization of specific information. However, owing to the inherently low pixel resolution of solid-state imaging sensors, digital holography gives poor depth resolution for images. This problem severely impairs the usefulness of digital holography, especially in densely populated particle fields. This study describes a system that significantly improves particle axial-location accuracy by exploiting the reconstructed complex amplitude information, compared with numerical reconstruction schemes that merely mimic traditional optical reconstruction. Theoretical analysis and experimental results demonstrate that the in-line configuration is advantageous in enhancing system performance: greater flexibility of the system, higher lateral resolution and lower speckle noise can be achieved.
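    Numerical refocusing of an in-line hologram is commonly implemented with the angular-spectrum method: multiply the field's 2D Fourier transform by the free-space transfer function and invert. The NumPy sketch below is a generic illustration (not the paper's system) showing that forward-and-back propagation of a well-sampled field is lossless within the non-evanescent band; the parameters are invented.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z (metres) using
    the angular-spectrum transfer function; dx is the pixel pitch."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))  # cut evanescent waves
    H = np.exp(1j * kz * z)                         # transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Round trip: propagate a Gaussian "particle" forward, then back again
n, dx, wl = 128, 2e-6, 0.5e-6
x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
field = np.exp(-(X**2 + Y**2) / (20e-6) ** 2)
forward = angular_spectrum_propagate(field, wl, dx, 100e-6)
back = angular_spectrum_propagate(forward, wl, dx, -100e-6)
```

    Scanning z and locating the sharpest reconstruction per particle is the usual route to axial localization; the paper's contribution is to exploit the complex amplitude at that step rather than intensity alone.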

  16. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe the structure of a discontinuous rock mass, to perform Discrete Fracture Network and DEM modelling, or to provide input for rock mass classification or equivalent-continuum estimates of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to the scale-dependent variability of fracture patterns and the difficult accessibility of large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow fast and accurate acquisition of dense 3D point clouds, which has promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision of algorithm parameters, which can be difficult to assess. To overcome this problem, we developed an original Matlab tool allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploratory data analysis. The identification and geometrical characterization of discontinuity features proceeds in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms, optimized for the point cloud accuracy and a specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and
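
The coplanar-surface step can be illustrated in a few lines: for each point, PCA of its k nearest neighbours yields a candidate facet normal (the eigenvector of the smallest eigenvalue) and a planarity score. This is a simplified stand-in for the Matlab tool described above, with brute-force neighbour search and hypothetical data:

```python
import numpy as np

def facet_normals(points, k=8):
    """Estimate a unit normal and planarity score for each point of a 3D
    cloud from the PCA of its k nearest neighbours (brute-force search)."""
    pts = np.asarray(points, float)
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
    nn = np.argsort(d2, axis=1)[:, :k]            # includes the point itself
    normals, planarity = [], []
    for idx in nn:
        nbhd = pts[idx] - pts[idx].mean(0)
        w, v = np.linalg.eigh(nbhd.T @ nbhd)      # eigenvalues ascending
        normals.append(v[:, 0])                   # smallest-eigenvalue axis
        planarity.append(1.0 - w[0] / max(w.sum(), 1e-12))
    return np.array(normals), np.array(planarity)

# Hypothetical flat facet lying in the z = 0 plane, plus mild noise
rng = np.random.default_rng(1)
xy = rng.uniform(0, 1, size=(60, 2))
cloud = np.column_stack([xy, 1e-4 * rng.standard_normal(60)])
n, p = facet_normals(cloud, k=10)
```

Points whose planarity is high and whose normals cluster in orientation space would then be grouped into discontinuity sets, e.g. by the kernel density estimation step the abstract mentions.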

  17. Automated Classification of L/R Hand Movement EEG Signals using Advanced Feature Extraction and Machine Learning

    Directory of Open Access Journals (Sweden)

    Mohammad H. Alomari

    2013-07-01

    Full Text Available In this paper, we propose an automated computer platform for classifying Electroencephalography (EEG) signals associated with left and right hand movements, using a hybrid system that combines advanced feature extraction techniques and machine learning algorithms. EEG represents brain activity through electrical voltage fluctuations along the scalp, and a Brain-Computer Interface (BCI) is a device that enables the use of the brain’s neural activity to communicate with others or to control machines, artificial limbs, or robots without direct physical movements. In this work, we sought the feature extraction method that best differentiates left and right executed fist movements across various classification algorithms. The EEG dataset used in this research was created and contributed to PhysioNet by the developers of the BCI2000 instrumentation system. Data were preprocessed using the EEGLAB MATLAB toolbox, and artifact removal was done using AAR. Data were epoched on the basis of Event-Related (De)Synchronization (ERD/ERS) and movement-related cortical potential (MRCP) features. Mu/beta rhythms were isolated for the ERD/ERS analysis and delta rhythms were isolated for the MRCP analysis. The Independent Component Analysis (ICA) spatial filter was applied to related channels for noise reduction and isolation of both artifactually and neurally generated EEG sources. The final feature vector included the ERD, ERS, and MRCP features in addition to the mean, power and energy of the activations of the resulting Independent Components (ICs) of the epoched feature datasets. The datasets were fed into two machine-learning algorithms: Neural Networks (NNs) and Support Vector Machines (SVMs). Intensive experiments were carried out, and optimum classification performances of 89.8% and 97.1% were obtained using NNs and SVMs, respectively. This research shows that this method of feature extraction
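
As a toy illustration of the pipeline's final stage (band-power features feeding a classifier), the sketch below computes mu-band (8-13 Hz) power on synthetic trials and fits a minimal nearest-centroid classifier. The paper itself uses richer ERD/ERS, MRCP and ICA features with NNs and SVMs, so everything here, including the signal model, is a simplified stand-in:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean FFT power of `signal` in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def nearest_centroid_fit(X, y):
    """Per-class mean feature vectors."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(model, X):
    classes = sorted(model)
    d = np.stack([np.linalg.norm(X - model[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)]

# Synthetic "left vs right" trials whose mu-band power differs by class
fs = 160
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(2)
def trial(amp):
    return amp * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
X = np.array([[band_power(trial(a), fs, 8, 13)] for a in [1.0] * 20 + [0.2] * 20])
y = np.array([0] * 20 + [1] * 20)
model = nearest_centroid_fit(X, y)
acc = (nearest_centroid_predict(model, X) == y).mean()
```

The sampling rate of 160 Hz matches the BCI2000/PhysioNet recordings mentioned in the abstract; the class separation here is artificially clean.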

  18. Automated visual inspection of brake shoe wear

    Science.gov (United States)

    Lu, Shengfang; Liu, Zhen; Nan, Guo; Zhang, Guangjun

    2015-10-01

    With the rapid development of high-speed railways, automated fault inspection is necessary to ensure the safety of train operation. Visual technology is receiving increasing attention in trouble detection and maintenance. For a linear CCD camera, image alignment is the first step in fault detection. To increase the speed of image processing, an improved scale invariant feature transform (SIFT) method is presented. The image is divided into multiple levels of different resolution, and features are extracted from the lowest resolution upward until sufficient SIFT key points are obtained. At that level, the image is registered and aligned quickly. In the inspection stage, we devote our efforts to finding faults of the brake shoe, one of the key components of the brake system on electrical multiple unit (EMU) trains, whose pre-warning on wear limitation is very important in fault detection. In this paper, we propose an automatic inspection approach to detect brake shoe faults. First, we use multi-resolution pyramid template matching to locate the brake shoe quickly. Then, we employ the Hough transform to detect the circles of the bolts in the brake region. Owing to the rigid structure, we can then identify whether the brake shoe has a fault. Experiments demonstrate that the proposed approach performs well and can meet the needs of practical applications.
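
The coarse-to-fine idea behind both the SIFT pyramid and the template-matching stage can be sketched as follows: match at a downsampled level first, then refine only in a small full-resolution window. This is an illustrative NumPy implementation using normalized cross-correlation on synthetic data, not the production system described above:

```python
import numpy as np

def downsample(img):
    """Halve resolution by 2x2 block averaging."""
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def match_ncc(img, tpl):
    """Best (row, col) of tpl in img by normalized cross-correlation."""
    th, tw = tpl.shape
    tz = (tpl - tpl.mean()) / (tpl.std() + 1e-12)
    best, pos = -np.inf, (0, 0)
    for r in range(img.shape[0] - th + 1):
        for c in range(img.shape[1] - tw + 1):
            win = img[r:r + th, c:c + tw]
            wz = (win - win.mean()) / (win.std() + 1e-12)
            score = (tz * wz).mean()
            if score > best:
                best, pos = score, (r, c)
    return pos

def pyramid_match(img, tpl):
    """Coarse-to-fine: match at half resolution, refine at full resolution."""
    r, c = match_ncc(downsample(img), downsample(tpl))
    r, c = 2 * r, 2 * c
    th, tw = tpl.shape
    r0, c0 = max(r - 2, 0), max(c - 2, 0)
    r1 = min(r + 3 + th, img.shape[0])
    c1 = min(c + 3 + tw, img.shape[1])
    dr, dc = match_ncc(img[r0:r1, c0:c1], tpl)
    return r0 + dr, c0 + dc

rng = np.random.default_rng(3)
scene = rng.uniform(size=(64, 64))
tpl = scene[20:36, 30:46].copy()   # ground-truth position (20, 30)
loc = pyramid_match(scene, tpl)
```

Matching the 2x-downsampled level first shrinks the brute-force search by roughly a factor of sixteen; only a handful of candidate offsets are re-scored at full resolution.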

  19. Performance verification of the Maxwell 16 Instrument and DNA IQ Reference Sample Kit for automated DNA extraction of known reference samples.

    Science.gov (United States)

    Krnajski, Z; Geering, S; Steadman, S

    2007-12-01

    Advances in automation have been made for a number of processes conducted in the forensic DNA laboratory. However, because most robotic systems are designed for high-throughput laboratories batching large numbers of samples, smaller laboratories are left with a limited number of cost-effective options for employing automation. The Maxwell 16 Instrument and DNA IQ Reference Sample Kit marketed by Promega are designed for rapid, automated purification of DNA extracts from sample sets consisting of sixteen or fewer samples. Because the system is based on DNA capture by paramagnetic particles with maximum binding capacity, it is designed to generate extracts with yield consistency. The studies herein enabled evaluation of STR profile concordance, consistency of yield, and cross-contamination performance for the Maxwell 16 Instrument. Results indicate that the system performs suitably for streamlining the process of extracting known reference samples generally used for forensic DNA analysis and has many advantages in a small or moderate-sized laboratory environment. PMID:25869266

  20. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    OpenAIRE

    Elena Ordoñez; Laura Rueda; M. Paz Cañadas; Carme Fuster; Vincenzo Cirigliano

    2013-01-01

    Objective. The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours and automated DNA extraction, for fetal sex determination in maternal plasma. Methods. A total of 15...

  1. AUTOMATED ANALYSIS OF AQUEOUS SAMPLES CONTAINING PESTICIDES, ACIDIC/BASIC/NEUTRAL SEMIVOLATILES AND VOLATILE ORGANIC COMPOUNDS BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GC/MS

    Science.gov (United States)

    Data are presented on the development of a new automated system combining solid-phase extraction (SPE) with gas chromatography/mass spectrometry (GC/MS) for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...

  2. Diagnosis and fault-tolerant control

    CERN Document Server

    Blanke, Mogens; Lunze, Jan; Staroswiecki, Marcel

    2016-01-01

    Fault-tolerant control aims at a gradual shutdown response in automated systems when faults occur. It satisfies the industrial demand for enhanced availability and safety, in contrast to traditional reactions to faults, which bring about sudden shutdowns and loss of availability. The book presents effective model-based analysis and design methods for fault diagnosis and fault-tolerant control. Architectural and structural models are used to analyse the propagation of a fault through the process, to test fault detectability and to find the redundancies in the process that can be used to ensure fault tolerance. It also introduces design methods suitable for diagnostic systems and fault-tolerant controllers, both for continuous processes described by analytical models and for discrete-event systems represented by automata. The book is suitable for engineering students, engineers in industry and researchers who wish to get an overview of the variety of approaches to process diagnosis and fault-tolerant contro...

  3. Satellite mapping and automated feature extraction: Geographic information system-based change detection of the Antarctic coast

    Science.gov (United States)

    Kim, Kee-Tae

    Declassified Intelligence Satellite Photograph (DISP) data are important resources for measuring the geometry of the coastline of Antarctica. Using state-of-the-art digital imaging technology and bundle block triangulation based on tie points and control points derived from a RADARSAT-1 Synthetic Aperture Radar (SAR) image mosaic and the Ohio State University (OSU) Antarctic digital elevation model (DEM), the individual DISP images were accurately assembled into a map-quality mosaic of Antarctica as it appeared in 1963. The new map is one of the important benchmarks for gauging the response of the Antarctic coastline to changing climate. Design of an automated coastline extraction algorithm is the second theme of this dissertation. At the pre-processing stage, adaptive neighborhood filtering was used to remove film-grain noise while preserving edge features. At the segmentation stage, an adaptive Bayesian approach to image segmentation was used to split the DISP imagery into its homogeneous regions, in which the fuzzy c-means clustering (FCM) technique and a Gibbs random field (GRF) model were introduced to estimate the conditional and prior probability density functions. A Gaussian mixture model was used to estimate reliable initial values for the FCM technique. At the post-processing stage, image object formation and labeling, removal of noisy image objects, and vectorization algorithms were sequentially applied to the segmented images to extract a vector representation of coastlines. Results were presented that demonstrate the effectiveness of the algorithm in segmenting the DISP data. For cloud-covered and low-contrast scenes, manual editing was carried out based on intermediate image processing and visual inspection in comparison with old paper maps. Through a geographic information system (GIS), the derived DISP coastline data were integrated with earlier and later data to assess continental-scale changes in the Antarctic coast. Computing the area of
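
The segmentation stage combines FCM clustering with a GRF prior; the clustering component alone is easy to sketch. Below is textbook fuzzy c-means on hypothetical 1-D grey levels, without the Gibbs random field or Gaussian-mixture initialization used in the dissertation:

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=50, seed=0):
    """Basic fuzzy c-means. Returns (centers, U), where U[i, j] is the
    membership degree of sample i in cluster j (rows sum to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.uniform(size=(len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    p = 2.0 / (m - 1.0)
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]     # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-p)                                  # standard FCM update
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# Two hypothetical grey-level populations standing in for image regions
rng = np.random.default_rng(4)
X = np.concatenate([np.full(50, 0.2), np.full(50, 0.8)])[:, None]
X = X + 0.01 * rng.standard_normal(X.shape)
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)
```

In the dissertation's scheme, the soft memberships would feed the Bayesian/GRF labeling rather than being hardened directly by `argmax` as here.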

  4. Soft Fault Diagnosis for Analog Circuits Based on Slope Fault Feature and BP Neural Networks

    Institute of Scientific and Technical Information of China (English)

    HU Mei; WANG Hong; HU Geng; YANG Shiyuan

    2007-01-01

    Fault diagnosis is very important for the development and maintenance of safe and reliable electronic circuits and systems. This paper describes an approach to soft fault diagnosis for analog circuits based on a slope fault feature and back-propagation neural networks (BPNN). The approach uses the voltage relation function between two nodes as the fault feature; for linear analog circuits this function is linear, so its slope is invariant with respect to excitation and can serve as the fault feature. A unified fault feature is thus extracted for both hard faults (open or short faults) and soft faults (parametric faults). Unlike other NN-based diagnosis methods that use node voltages or frequency responses as fault features, the reported BPNN is trained on the extracted feature vectors; the slope features are calculated by simulating only once per component, and the trained BPNN can diagnose all soft faults of the component. Experiments show that the approach is promising.
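
The slope feature itself is easy to illustrate: for a linear circuit, the voltage at one node is a linear function of the voltage at another, and the fitted slope does not depend on the excitation level, only on component values. The sketch below uses a simple two-resistor divider as a hypothetical circuit under test (not one of the paper's benchmark circuits):

```python
import numpy as np

def node_voltages(vin, r1, r2):
    """Voltage divider: node 1 is the source, node 2 the R1/R2 junction."""
    return vin, vin * r2 / (r1 + r2)

def slope_feature(r1, r2, excitations=(1.0, 2.0, 5.0)):
    """Least-squares slope of V2 vs V1 over several excitation levels."""
    v1, v2 = np.array([node_voltages(v, r1, r2) for v in excitations]).T
    return np.polyfit(v1, v2, 1)[0]

nominal = slope_feature(1e3, 1e3)           # healthy circuit: slope 0.5
soft_fault = slope_feature(1e3, 1.5e3)      # R2 drifted +50% (parametric)
short_fault = slope_feature(1e3, 1e-3)      # R2 nearly shorted (hard fault)
```

Both the parametric drift and the hard short move the slope away from its nominal value, which is what lets a single feature type cover both fault classes, as the abstract claims.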

  5. Automated middle hepatic vessel extraction method using electronic atlas and line enhancement filter on non-contrast torso X-ray CT images

    International Nuclear Information System (INIS)

    Classification of the liver region into the Couinaud segments provides significant information for a computer-aided diagnostic system to localize the position of lesions in the liver. Hepatic vessels provide essential information for this classification. However, automated segmentation and classification of hepatic vessels are difficult in non-contrast CT images owing to the low contrast between hepatic vessels and liver tissue. In this paper, we propose an automated scheme for extracting the middle hepatic vein (MHV), and we employ this scheme to classify the liver region into right and left lobes. We applied our method to 22 non-contrast X-ray CT images, all of which were normal liver cases. The MHV extraction results were evaluated using three parameters based on the volume ratio to the correct liver region. The results show that hepatic vessels extracted using the proposed method were satisfactory in 41% (9/22) of cases. (author)
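
Line enhancement filters of the kind named in the title typically respond to the eigenvalues of the Hessian of the smoothed image. The sketch below is a simplified single-scale 2-D version on a synthetic image; the paper's filter operates on 3-D CT data and is not reproduced here:

```python
import numpy as np

def line_filter_2d(img, sigma=2.0):
    """Bright-line enhancement from Hessian eigenvalues: a simplified,
    single-scale cousin of vesselness-style line filters."""
    # Gaussian smoothing by separable 1-D convolution
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    g /= g.sum()
    sm = np.apply_along_axis(lambda v: np.convolve(v, g, mode="same"), 0, img)
    sm = np.apply_along_axis(lambda v: np.convolve(v, g, mode="same"), 1, sm)
    gy, gx = np.gradient(sm)
    gyy, gyx = np.gradient(gy)
    gxy, gxx = np.gradient(gx)
    # Smaller eigenvalue of the 2x2 Hessian [[gxx, gxy], [gxy, gyy]]
    tr, det = gxx + gyy, gxx * gyy - gxy * gyx
    disc = np.sqrt(np.maximum(tr ** 2 / 4 - det, 0))
    lam2 = tr / 2 - disc          # strongly negative along bright ridges
    return np.maximum(-lam2, 0)

# Synthetic bright vertical "vessel" on a dark background
img = np.zeros((40, 40))
img[:, 20] = 1.0
resp = line_filter_2d(img)
```

Along a bright ridge the Hessian has one strongly negative eigenvalue across the line and one near zero along it, so the clipped negative eigenvalue peaks on the line centerline.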

  6. Development of an automated sequential injection on-line solvent extraction-back extraction procedure as demonstrated for the determination of cadmium with detection by electrothermal atomic absorption spectrometry

    DEFF Research Database (Denmark)

    Wang, Jianhua; Hansen, Elo Harald

    2002-01-01

    An automated sequential injection (SI) on-line solvent extraction-back extraction separation/preconcentration procedure is described. Demonstrated for the assay of cadmium by electrothermal atomic absorption spectrometry (ETAAS), the analyte is initially complexed with ammonium pyrrolidinedithiocarbamate (APDC) in citrate buffer and the chelate is extracted into isobutyl methyl ketone (IBMK), which is separated from the aqueous phase by means of a newly designed dual-conical gravitational phase separator. A metered amount of the organic eluate is aspirated and stored in the PTFE holding coil (HC) of the SI-system. Afterwards, it is dispensed and mixed with an aqueous back extractant of dilute nitric acid containing Hg(II) ions as stripping agent, thereby facilitating a rapid metal-exchange reaction with the APDC ligand and transfer of the Cd into the aqueous phase. The aqueous phase is

  7. Determination of Low Concentrations of Acetochlor in Water by Automated Solid-Phase Extraction and Gas Chromatography with Mass-Selective Detection

    Science.gov (United States)

    Lindley, C.E.; Stewart, J.T.; Sandstrom, M.W.

    1996-01-01

    A sensitive and reliable gas chromatographic/mass spectrometric (GC/MS) method for determining acetochlor in environmental water samples was developed. The method involves automated extraction of the herbicide from a filtered 1 L water sample through a C18 solid-phase extraction column, elution from the column with hexane-isopropyl alcohol (3 + 1), and concentration of the extract with nitrogen gas. The herbicide is quantitated by capillary-column GC/MS with selected-ion monitoring of 3 characteristic ions. The single-operator method detection limit for reagent water samples is 0.0015 μg/L. Mean recoveries ranged from about 92 to 115% for 3 water matrixes fortified at 0.05 and 0.5 μg/L. Average single-operator precision, over the course of 1 week, was better than 5%.

  8. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry

    International Nuclear Information System (INIS)

    Highlights: • A fully automated flow-based modified-BCR extraction method was developed to evaluate the extractable As of soil. • The MSFIA–HG-AFS system included a UV photo-oxidation step for organic species degradation. • The accuracy and precision of the proposed method were found satisfactory. • The analysis time can be reduced by up to a factor of eight using the proposed flow-based BCR method. • The labile As (F1 + F2) was <50% of total As in soil samples from As-contaminated-mining zones. - Abstract: A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or exchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L−1 for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013–0.800, 0.011–0.900 and 0.079–1.400 mg L−1 for F1, F2, and F3, respectively. The precision of the automated MSFIA–HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L−1 As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural soil samples from
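
The reported LODs follow from the calibration function; a common convention is LOD = 3 x (standard deviation of the blank) / slope. Here is a sketch with hypothetical calibration numbers, not the paper's data:

```python
import numpy as np

def calibration(conc, signal):
    """Least-squares slope and intercept of a linear calibration curve."""
    slope, intercept = np.polyfit(conc, signal, 1)
    return slope, intercept

def lod_3sigma(blank_signals, slope):
    """Detection limit as 3 x standard deviation of the blank / slope."""
    return 3.0 * np.std(blank_signals, ddof=1) / slope

# Hypothetical As calibration: concentration (ug/L) vs fluorescence counts
conc = np.array([0, 50, 100, 200, 400])
sig = np.array([2.0, 103.0, 201.0, 405.0, 801.0])
blanks = np.array([1.2, 2.5, 1.9, 2.2, 1.6, 2.4])
slope, intercept = calibration(conc, sig)
lod = lod_3sigma(blanks, slope)
```

The same calculation, repeated per fraction with each fraction's own calibration, is how per-fraction LODs such as those quoted above are typically derived.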

  9. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rosas-Castor, J.M. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Portugal, L.; Ferrer, L. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Guzmán-Mar, J.L.; Hernández-Ramírez, A. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Cerdà, V. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinojosa-Reyes, L., E-mail: laura.hinojosary@uanl.edu.mx [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico)

    2015-05-18

    Highlights: • A fully automated flow-based modified-BCR extraction method was developed to evaluate the extractable As of soil. • The MSFIA–HG-AFS system included a UV photo-oxidation step for organic species degradation. • The accuracy and precision of the proposed method were found satisfactory. • The analysis time can be reduced by up to a factor of eight using the proposed flow-based BCR method. • The labile As (F1 + F2) was <50% of total As in soil samples from As-contaminated-mining zones. - Abstract: A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or exchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L−1 for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013–0.800, 0.011–0.900 and 0.079–1.400 mg L−1 for F1, F2, and F3, respectively. The precision of the automated MSFIA–HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L−1 As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural

  10. A method for improving the reliability of distributed feeder automation fault location

    Institute of Scientific and Technical Information of China (English)

    张伟; 徐士华

    2013-01-01

    To improve the reliability of distributed feeder automation fault location, a distribution network switch group model is established. Based on the switch group model, the failure process when a switch fails to open is studied, and fault handling principles based on logic operations are proposed. By supplementing the single-shot reclosing principle, extension of the fault isolation scope is effectively avoided when a switch fails to open. Through analysis of switches failing to close, a second-reclosing procedure is proposed that likewise avoids extending the fault isolation scope. The fault handling process is studied for three cases of communication failure with adjacent switches, and a uniform fault determination method is given. For switch protection signal failures, incidental interference and permanent failure are analyzed separately; the analysis shows that incidental interference has no effect on fault isolation, while permanent failure expands the scope of fault isolation. Case studies demonstrate the feasibility and effectiveness of the proposed approaches.
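
The abstract reduces fault determination to logical operations over switch groups. As a deliberately simplified stand-in, the sketch below locates and isolates a fault on a single radial feeder from fault-current indicator flags; the topology, naming, and logic here are illustrative assumptions, not the paper's method:

```python
def locate_fault_section(switch_flags):
    """Radial feeder fault location from fault-current flags: switches are
    ordered from the substation outward; flags are True where fault current
    was seen. The faulted section lies just downstream of the last True."""
    last_seen = -1
    for i, saw_fault in enumerate(switch_flags):
        if saw_fault:
            last_seen = i
        elif last_seen >= 0:
            break  # first switch past the fault
    if last_seen < 0:
        return None  # no fault current anywhere on this feeder
    return (last_seen, last_seen + 1)  # section between these positions

def isolate(switch_flags):
    """Switches to open: the pair bounding the faulted section (or the last
    switch alone if the fault lies beyond the end of the feeder)."""
    section = locate_fault_section(switch_flags)
    if section is None:
        return []
    upstream, downstream = section
    return [s for s in (upstream, downstream) if s < len(switch_flags)]

# Fault between switch 2 and switch 3 of a five-switch feeder
flags = [True, True, True, False, False]
```

The distributed scheme in the paper reaches the same kind of decision by exchanging flags between adjacent switches instead of evaluating them centrally, which is what makes missing or corrupted neighbour communication the interesting failure case.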

  11. An intelligent distributed feeder automation fault judgment method

    Institute of Scientific and Technical Information of China (English)

    张伟

    2013-01-01

    Fault judgment criteria independent of the operation mode are proposed, so that faults can be removed quickly without re-tuning device parameters when the operation mode changes. A distribution network switch group model is established; based on this model and relying on fault currents and active power direction, the criteria are converted into a series of logical operations. A method for computing switch logic values and switch group logic values is presented, and the switch group logic values are used to derive the action logic value of each switch. Using this method, open-loop and closed-loop distribution network operation modes are analyzed in detail. For temporary faults, a reclosing mechanism triggered by loss of voltage on one side of the switch is introduced; it does not depend on communication and effectively avoids expanding the fault isolation scope under abnormal communication conditions. A grid-shaped distribution network with four supply sources and 15 sectionalizing switches, subject to two simultaneous faults, is given as an example and demonstrates the feasibility of the proposed method.

  12. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  13. Fault Estimation

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, H.

    2002-01-01

    This paper presents a range of optimization-based approaches to fault diagnosis. A variety of fault diagnosis problems are reformulated in the so-called standard problem setup introduced in the literature on robust control. Once the standard problem formulations are given, the fault diagnosis problems can be solved by standard optimization techniques. The proposed methods include: (1) fault diagnosis (fault estimation (FE)) for systems with model uncertainties; (2) FE for systems with parametric faults, and (3) FE for a class of nonlinear systems.

  14. Fault Tree Generation and Augmentation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault Management (FM) is one of the key components of system autonomy. In order to guarantee FM effectiveness and control the cost, tools are required to automate...

  15. Automation of radiochemical analysis by flow injection techniques. Am-Pu separation using a TRU-resin™ sorbent extraction column

    International Nuclear Information System (INIS)

    A rapid automated flow injection analysis (FIA) procedure was developed for efficient separation of Am and Pu from each other and from interfering matrix and radionuclide components using a TRU-resin™ column. Selective Pu elution is enabled via on-column reduction. The separation was developed using on-line radioactivity detection. After the separation had been developed, fraction collection was used to obtain the separated fractions. In this manner, a FIA instrument functions as an automated separation workstation capable of unattended operation. (author)

  16. An Evaluation of the SCSN Moment Tensor Solutions: Robustness of the M_w Magnitude Scale, Style of Faulting, and Automation of the Method

    OpenAIRE

    Clinton, John F.; Hauksson, Egill; Solanki, Kalpesh

    2006-01-01

    We have generated moment tensor solutions and moment magnitudes (M_w) for >1700 earthquakes of local magnitude (M_L) >3.0 that occurred from September 1999 to November 2005 in southern California. The method is running as an automated real-time component of the Southern California Seismic Network (SCSN), with solutions available within 12 min of event nucleation. For local events, the method can reliably obtain good-quality solutions for M_w with M_L >3.5, and for the moment tensor for events...
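
Moment magnitude is derived from the scalar seismic moment of each tensor solution. A standard form of the relation (the Hanks-Kanamori formula in its IASPEI-standardized version, not specific to the SCSN implementation) is:

```python
import math

def moment_magnitude(m0_newton_meters):
    """Moment magnitude Mw from scalar seismic moment M0 in N m:
    Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

# A scalar moment of ~1.1e17 N m corresponds to roughly Mw 5.3
mw = moment_magnitude(1.1e17)
```

Because Mw comes from the long-period moment rather than short-period amplitudes, it saturates far less than M_L for larger events, which is one reason automated moment tensor catalogs report it.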

  17. Observer-based Fault Detection and Isolation for Nonlinear Systems

    OpenAIRE

    Lootsma, T.F.

    2001-01-01

    With the rise of automation, the need for fault detection, isolation and reconfiguration inevitably increases. Interest in fault detection and isolation (FDI) for nonlinear systems has grown significantly in recent years. The design of FDI is motivated by the need for knowledge about occurring faults in fault-tolerant control systems (FTC systems). The idea of FTC systems is to detect, isolate, and handle faults in such a way that the systems can still perform in a required manner. One prefers...

  18. Brittle Faults

    Czech Academy of Sciences Publication Activity Database

    Caine, J.; Choudhuri, M.; Bose, N.; Mukherjee, S.; Misra, A.A.; Mathew, G.; Salvi, D.; Toro, B.; Pratt, B.R.; Dasgupta, S.; Nováková, Lucie

    Amsterdam: Elsevier, 2015 - (Mukherjee, S.), pp. 79-106. ISBN 978-0-12-420152-1 Institutional support: RVO:67985891 Keywords: brittle shear zone * brittle tectonics * conjugate faults * faults * kinematic indicators Subject RIV: DC - Seismology, Volcanology, Earth Structure

  19. Zipper Faults

    Science.gov (United States)

    Platt, J. P.; Passchier, C. W.

    2015-12-01

    Intersecting simultaneously active pairs of faults with different orientations and opposing slip sense ("conjugate faults") present geometrical and kinematic problems. Such faults rarely offset each other, even when they have displacements of many km. A simple solution to the problem is that the two faults merge, either zippering up or unzippering, depending on the relationship between the angle of intersection and the slip senses. A widely recognized example of this is the so-called blind front developed in some thrust belts, where a backthrust branches off a decollement surface at depth. The decollement progressively unzippers, so that its hanging wall becomes the hanging wall of the backthrust, and its footwall becomes the footwall of the active decollement. The opposite situation commonly arises in core complexes, where conjugate low-angle normal faults merge to form a single detachment; in this case the two faults zipper up. Analogous situations may arise for conjugate pairs of strike-slip faults. We present kinematic and geometrical analyses of the Garlock and San Andreas faults in California, the Najd fault system in Saudi Arabia, the North and East Anatolian faults, the Karakoram and Altyn Tagh faults in Tibet, and the Tonale and Giudicarie faults in the southern Alps, all of which appear to have undergone zippering over distances of several tens to hundreds of km. The zippering process may produce complex and significant patterns of strain and rotation in the surrounding rocks, particularly if the angle between the zippered faults is large. A zippering fault may be inactive during active movement on the intersecting faults, or it may have a slip rate that differs from either fault. Intersecting conjugate ductile shear zones behave in the same way on outcrop and micro-scales.

  20. Automated flow-based anion-exchange method for high-throughput isolation and real-time monitoring of RuBisCO in plant extracts.

    Science.gov (United States)

    Suárez, Ruth; Miró, Manuel; Cerdà, Víctor; Perdomo, Juan Alejandro; Galmés, Jeroni

    2011-06-15

    In this work, a miniaturized, completely enclosed multisyringe-flow system is proposed for high-throughput purification of RuBisCO from Triticum aestivum extracts. The automated method capitalizes on the uptake of the target protein at 4°C onto Q-Sepharose Fast Flow strong anion-exchanger packed in a cylindrical microcolumn (105 × 4 mm) followed by a stepwise ionic-strength gradient elution (0-0.8 mol/L NaCl) to eliminate concomitant extract components and retrieve highly purified RuBisCO. The manifold is furnished downstream with a flow-through diode-array UV/vis spectrophotometer for real-time monitoring of the column effluent at the protein-specific wavelength of 280 nm to detect the elution of RuBisCO. Quantitation of RuBisCO and total soluble proteins in the eluate fractions was undertaken using polyacrylamide gel electrophoresis (PAGE) and the spectrophotometric Bradford assay, respectively. A comprehensive investigation of the effect of distinct concentration gradients on the isolation of RuBisCO, and of the experimental conditions (namely, type of resin, column dimensions and mobile-phase flow rate) upon column capacity and analyte breakthrough, was carried out. The assembled set-up was used to critically ascertain the efficiency of preliminary batchwise pre-treatments of crude plant extracts (viz., polyethylene glycol (PEG) precipitation, ammonium sulphate precipitation and sucrose gradient centrifugation) in terms of RuBisCO purification and absolute recovery prior to automated anion-exchange column separation. Under the optimum physical and chemical conditions, the flow-through column system is able to admit crude plant extracts and gives rise to RuBisCO purification yields better than 75%, which might be increased up to 96 ± 9% with a prior PEG fractionation followed by a sucrose gradient step. PMID:21641435

  1. Automated Contingency Management for Advanced Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Automated Contingency Management (ACM), or the ability to confidently and autonomously adapt to fault conditions with the goal of still achieving mission...

  2. Determination of amlodipine in human plasma using automated online solid-phase extraction HPLC-tandem mass spectrometry: application to a bioequivalence study of Chinese volunteers.

    Science.gov (United States)

    Shentu, Jianzhong; Fu, Lizhi; Zhou, Huili; Hu, Xing Jiang; Liu, Jian; Chen, Junchun; Wu, Guolan

    2012-11-01

    An automated method (XLC-MS/MS) that uses online solid-phase extraction coupled with HPLC-tandem mass spectrometry was reported here for the first time to quantify amlodipine in human plasma. Automated pre-purification of plasma was performed using 10 mm × 2 mm HySphere C8 EC-SE online solid-phase extraction cartridges. After being eluted from the cartridge, the analyte and the internal standard were separated by HPLC and detected by tandem mass spectrometry. Mass spectrometric detection was achieved in the multiple reaction monitoring mode using a quadrupole tandem mass spectrometer in the positive electrospray ionization mode. The XLC-MS/MS method was validated and yielded excellent specificity. The calibration curve ranged from 0.10 to 10.22 ng/mL, and both the intra- and inter-day precision and accuracy values were within 8%. This method proved to be less laborious and was faster per analysis (high-throughput) than offline sample preparation methods. This method has been successfully applied in clinical pharmacokinetic and bioequivalence analyses. PMID:22770846

  3. Automated extraction method for the center line of spinal canal and its application to the spinal curvature quantification in torso X-ray CT images

    Science.gov (United States)

    Hayashi, Tatsuro; Zhou, Xiangrong; Chen, Huayue; Hara, Takeshi; Miyamoto, Kei; Kobayashi, Tatsunori; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Fujita, Hiroshi

    2010-03-01

X-ray CT images have been widely used in clinical routine in recent years. Images from a modern CT scanner show the details of various organs and tissues, which means that many organs and tissues can be interpreted simultaneously on CT images. However, CT image interpretation requires a lot of time and effort, so support for interpreting CT images based on image-processing techniques is expected. The interpretation of spinal curvature is important for clinicians because spinal curvature is associated with various spinal disorders. We propose a quantification scheme for spinal curvature based on the center line of the spinal canal on CT images. The proposed scheme consists of four steps: (1) automated extraction of the skeletal region based on CT number thresholding; (2) automated extraction of the center line of the spinal canal; (3) generation of the median plane image of the spine, reformatted based on the spinal canal; and (4) quantification of the spinal curvature. The proposed scheme was applied to 10 cases and compared with the Cobb angle that is commonly used by clinicians. We found a high correlation between values obtained by the proposed (vector) method and the Cobb angle (95% confidence interval for lumbar lordosis: 0.81-0.99). The proposed method also provides reproducible results (inter- and intra-observer variability within 2°). These experimental results suggest that the proposed method is effective for quantifying spinal curvature on CT images.
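
    The quantification in step (4) can be illustrated with a toy vector-based angle measure: the bending angle is taken between the direction vectors of adjacent center-line segments. This is a simplified sketch under assumed conventions (the three-point reduction and point names are illustrative, not the authors' exact formulation):

    ```python
    import numpy as np

    def bend_angle_deg(p_top, p_mid, p_bottom):
        """Angle (degrees) between the two center-line segments
        p_top->p_mid and p_mid->p_bottom; 0 deg means a straight canal."""
        v1 = np.asarray(p_mid, float) - np.asarray(p_top, float)
        v2 = np.asarray(p_bottom, float) - np.asarray(p_mid, float)
        cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
        # clip guards against tiny floating-point overshoot outside [-1, 1]
        return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))
    ```

    A straight canal gives 0°, while a right-angle bend between segments gives 90°; in practice the points would be sampled from the extracted spinal-canal center line.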

  4. Fault detection in reciprocating compressor valves under varying load conditions

    Science.gov (United States)

    Pichler, Kurt; Lughofer, Edwin; Pichler, Markus; Buchegger, Thomas; Klement, Erich Peter; Huschenbett, Matthias

    2016-03-01

This paper presents a novel approach for detecting cracked or broken reciprocating compressor valves under varying load conditions. The main idea is that the time-frequency representation of vibration measurement data will show typical patterns depending on the fault state. The problem is to detect these patterns reliably. For the detection task, we make a detour via the two-dimensional autocorrelation. The autocorrelation emphasizes the patterns and reduces noise effects, which makes it easier to define appropriate features. After feature extraction, classification is done using logistic regression and support vector machines. The method's performance is validated by analyzing real-world measurement data. The results show very high detection accuracy while keeping false alarm rates at a very low level for different compressor loads, thus achieving a load-independent method. The proposed approach is, to the best of our knowledge, the first automated method for reciprocating compressor valve fault detection that can handle varying load conditions.
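
    The detour via the two-dimensional autocorrelation can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the spectrogram parameters and the feature statistics are assumptions, and the downstream logistic-regression/SVM classifier is omitted.

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    def autocorr2d(S):
        """Normalized circular 2-D autocorrelation of a time-frequency map,
        computed via the FFT; repeating patterns reinforce each other while
        broadband noise averages out."""
        S = S - S.mean()
        F = np.fft.fft2(S)
        ac = np.fft.ifft2(F * np.conj(F)).real
        ac /= ac[0, 0]                  # normalize zero-lag energy to 1.0
        return np.fft.fftshift(ac)      # move zero lag to the center

    def valve_features(x, fs):
        """Toy feature vector from the autocorrelated spectrogram (the choice
        of statistics is illustrative, not the paper's feature set)."""
        _, _, S = spectrogram(x, fs=fs, nperseg=256)
        ac = autocorr2d(S)
        return np.array([ac.std(), np.abs(ac).mean(), S.std() / (S.mean() + 1e-12)])
    ```

    A classifier such as scikit-learn's `LogisticRegression` or `SVC` would then be trained on such feature vectors from healthy and faulty recordings, as in the paper.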

  5. Demonstration and validation of automated agricultural field extraction from multi-temporal Landsat data for the majority of United States harvested cropland

    Science.gov (United States)

    Yan, L.; Roy, D. P.

    2014-12-01

The spatial distribution of agricultural fields is a fundamental description of rural landscapes. The location and extent of fields are important for establishing the area of land utilized for agricultural yield prediction, resource allocation, and economic planning, and may be indicative of the degree of agricultural capital investment, mechanization, and labor intensity. To date, field objects have not been extracted from satellite data over large areas because of computational constraints, the complexity of the extraction task, and because consistently processed data of appropriate resolution have not been available or affordable. A recently published automated methodology to extract agricultural crop fields from weekly 30 m Web Enabled Landsat Data (WELD) time series was refined and applied to 14 states that cover 70% of harvested U.S. cropland (USDA 2012 Census). The methodology was applied to 2010 combined weekly Landsat 5 and 7 WELD data. The field extraction and quantitative validation results are presented for the following 14 states: Iowa, North Dakota, Illinois, Kansas, Minnesota, Nebraska, Texas, South Dakota, Missouri, Indiana, Ohio, Wisconsin, Oklahoma and Michigan (sorted by area of harvested cropland). These states include the top 11 U.S. states by harvested cropland area. Implications and recommendations for systematic application to global coverage Landsat data are discussed.

  6. A Comprehensive Automated 3D Approach for Building Extraction, Reconstruction, and Regularization from Airborne Laser Scanning Point Clouds

    OpenAIRE

    Norbert Pfeifer; Peter Dorninger

    2008-01-01

    Three dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows do exist. They are either based on photogrammetry or on LiDAR or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are...

  7. A Comprehensive Automated 3D Approach for Building Extraction, Reconstruction, and Regularization from Airborne Laser Scanning Point Clouds

    Directory of Open Access Journals (Sweden)

    Norbert Pfeifer

    2008-11-01

    Full Text Available Three-dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows do exist. They are based on photogrammetry, on LiDAR, or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and finally, building quality analysis. Commercial software tools for building modeling generally require a high degree of human interaction, and most automated approaches described in the literature stress the steps of such a workflow individually. In this article, we propose a comprehensive approach for automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it is based on a reliable 3D segmentation algorithm, detecting planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects.
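
    The planar-face detection at the core of such a workflow can be sketched with a single-plane RANSAC fit. This is a minimal illustration under assumed parameters (iteration count and inlier tolerance are arbitrary), not the authors' segmentation algorithm, which detects many planes robustly:

    ```python
    import numpy as np

    def ransac_plane(points, n_iter=300, tol=0.05, rng=None):
        """Fit one dominant plane (unit normal n, offset d with n.p + d = 0)
        to an (N, 3) point cloud by RANSAC; returns the model and inlier mask."""
        rng = np.random.default_rng(rng)
        best_inliers = np.zeros(len(points), dtype=bool)
        best_model = None
        for _ in range(n_iter):
            sample = points[rng.choice(len(points), 3, replace=False)]
            n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
            norm = np.linalg.norm(n)
            if norm < 1e-12:
                continue  # degenerate (collinear) sample, skip
            n /= norm
            d = -n @ sample[0]
            inliers = np.abs(points @ n + d) < tol
            if inliers.sum() > best_inliers.sum():
                best_inliers, best_model = inliers, (n, d)
        return best_model, best_inliers
    ```

    A full segmentation would repeatedly extract the dominant plane, remove its inliers, and recurse on the remainder; the paper's region-growing segmentation is considerably more refined than this sketch.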

  8. Automated Building Extraction from High-Resolution Satellite Imagery in Urban Areas Using Structural, Contextual, and Spectral Information

    Directory of Open Access Journals (Sweden)

    Jin Xiaoying

    2005-01-01

    Full Text Available High-resolution satellite imagery provides an important new data source for building extraction. We demonstrate an integrated strategy for identifying buildings in 1-meter resolution satellite imagery of urban areas. Buildings are extracted using structural, contextual, and spectral information. First, a series of geodesic opening and closing operations are used to build a differential morphological profile (DMP that provides image structural information. Building hypotheses are generated and verified through shape analysis applied to the DMP. Second, shadows are extracted using the DMP to provide reliable contextual information to hypothesize position and size of adjacent buildings. Seed building rectangles are verified and grown on a finely segmented image. Next, bright buildings are extracted using spectral information. The extraction results from the different information sources are combined after independent extraction. Performance evaluation of the building extraction on an urban test site using IKONOS satellite imagery of the City of Columbia, Missouri, is reported. With the combination of structural, contextual, and spectral information, of the building areas are extracted with a quality percentage .
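
    The structural step can be imitated with a much-simplified differential morphological profile. Note the assumptions: plain grey-scale openings/closings from `scipy.ndimage` are used in place of the geodesic (reconstruction-based) operators in the paper, and the scales are arbitrary:

    ```python
    import numpy as np
    from scipy import ndimage

    def morphological_profile(img, sizes=(3, 7, 11)):
        """Simplified differential morphological profile: differences of
        openings (bright structures removed at each scale) and closings
        (dark structures filled at each scale) over increasing SE sizes."""
        profile = []
        prev_o = prev_c = img.astype(float)
        for s in sizes:
            o = ndimage.grey_opening(img, size=s).astype(float)
            c = ndimage.grey_closing(img, size=s).astype(float)
            profile.append(prev_o - o)   # bright detail lost at this scale
            profile.append(c - prev_c)   # dark detail filled at this scale
            prev_o, prev_c = o, c
        return np.stack(profile)
    ```

    A bright blob shows its strongest differential response at the first scale whose structuring element exceeds the blob's size, which is what makes the profile useful for hypothesizing building footprints of different sizes.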

  9. Development of an automated batch-type solid-liquid extraction apparatus and extraction of Zr, Hf, and Th by triisooctylamine from HCl solutions for chemistry of element 104, Rf

    International Nuclear Information System (INIS)

    Solid-liquid extraction of the group 4 elements Zr and Hf, which are homologues of Rf (Z = 104), and Th, a pseudo homologue, by triisooctylamine (TIOA) from HCl solutions was performed by batch method. After examining the time required to reach extraction equilibrium for these elements in various concentrations of TIOA and HCl, we investigated in detail variations in the distribution coefficients (Kd) with TIOA and HCl concentrations. The Kd values of Zr and Hf increased with increasing the HCl and TIOA concentrations, suggesting an increase in the abundance of the anionic chloride complexes of Zr and Hf. On the other hand, the Kd values of Th were low in all the HCl concentrations studied, implying that Th does not form anionic species dominantly. We developed a new automated batch-type solid-liquid extraction apparatus for repetitive experiments on transactinide elements. Using this apparatus, we performed solid-liquid extraction employing the radioactive nuclides 89mZr and 175Hf produced by nuclear reactions and transported continuously from the nuclear reaction chamber by the He/KCl gas-jet system. It was found that the distribution behaviors in 7-11 M HCl are almost constant in the time range 10-120 s, and the Kd values are consistent with those obtained in the above manual experiment. This result suggests that the chemical reactions in the extraction process reach equilibrium within 10 s for Zr and Hf under the present experimental conditions. It took about 35 s for the extraction using the apparatus. These results indicate the applicability of the present extraction using the developed apparatus to 261Rf (T1/2 = 68 s) experiments.

  10. Development of an automated batch-type solid-liquid extraction apparatus and extraction of Zr, Hf, and Th by triisooctylamine from HCl solutions for chemistry of element 104, Rf

    Energy Technology Data Exchange (ETDEWEB)

    Kasamatsu, Yoshitaka; Kino, Aiko; Yokokita, Takuya [Osaka Univ. (Japan). Graduate School of Science; and others

    2015-07-01

    Solid-liquid extraction of the group 4 elements Zr and Hf, which are homologues of Rf (Z = 104), and Th, a pseudo homologue, by triisooctylamine (TIOA) from HCl solutions was performed by batch method. After examining the time required to reach extraction equilibrium for these elements in various concentrations of TIOA and HCl, we investigated in detail variations in the distribution coefficients (Kd) with TIOA and HCl concentrations. The Kd values of Zr and Hf increased with increasing the HCl and TIOA concentrations, suggesting an increase in the abundance of the anionic chloride complexes of Zr and Hf. On the other hand, the Kd values of Th were low in all the HCl concentrations studied, implying that Th does not form anionic species dominantly. We developed a new automated batch-type solid-liquid extraction apparatus for repetitive experiments on transactinide elements. Using this apparatus, we performed solid-liquid extraction employing the radioactive nuclides 89mZr and 175Hf produced by nuclear reactions and transported continuously from the nuclear reaction chamber by the He/KCl gas-jet system. It was found that the distribution behaviors in 7-11 M HCl are almost constant in the time range 10-120 s, and the Kd values are consistent with those obtained in the above manual experiment. This result suggests that the chemical reactions in the extraction process reach equilibrium within 10 s for Zr and Hf under the present experimental conditions. It took about 35 s for the extraction using the apparatus. These results indicate the applicability of the present extraction using the developed apparatus to 261Rf (T1/2 = 68 s) experiments.

  11. LOW-COST BACTERIAL DETECTION SYSTEM FOR FOOD SAFETY BASED ON AUTOMATED DNA EXTRACTION, AMPLIFICATION AND READOUT

    OpenAIRE

    Hoehl, Melanie Margarete; Bocholt, Eva Schulte; Karippai, Nobu; Zengerle, Roland; Steigert, Juergen; Slocum, Alexander H.

    2013-01-01

    To ensure food, medical and environmental safety and quality, rapid, low-cost and easy-to-use detection methods are desirable. Here, the LabSystem is introduced for integrated, automated DNA purification and amplification. It consists of a disposable, centrifugally-driven DNA purification platform (LabTube) and the subsequent amplification in a low-cost UV/vis-reader (LabReader). In this paper, food safety was chosen as the first sample application with pathogenic verotoxin-producing (VTEC) E...

  12. Development of a semi-automated system for the extraction of the organic phase in the production of Na99mTcO4

    International Nuclear Information System (INIS)

    The sodium pertechnetate, produced in the Plant of Radioisotopes Production is obtained by a solvent extraction method using methylethylketone. The organic phase that contains the pertechnetate ion is separated from the aqueous phase with a ceiling. This manipulation is a critical stage to obtain a better performance of the organic phase extraction. The current system is manual and has restrictions for the position of the jib, increasing the probability of dragging 99Mo from the aqueous phase to the chromatographic column. In order to improve the performance of this process, we have designed and implemented a semi-automated system for the extraction of the organic phase. In this new system we obtained a higher level of the organic phase which eliminates the restriction of jib position and, as a consequence, the probability of dragging 99Mo to the chromatographic column is decreased considerably. In this work we describe the experimental details of the implementation of this system and the results of the preliminary test. (orig.)

  13. Comparison of automated nucleic acid extraction methods for the detection of cytomegalovirus DNA in fluids and tissues

    OpenAIRE

    Waggoner, Jesse J.; Pinsky, Benjamin A.

    2014-01-01

    Testing for cytomegalovirus (CMV) DNA is increasingly being used for specimen types other than plasma or whole blood. However, few studies have investigated the performance of different nucleic acid extraction protocols in such specimens. In this study, CMV extraction using the Cell-free 1000 and Pathogen Complex 400 protocols on the QIAsymphony Sample Processing (SP) system were compared using bronchoalveolar lavage fluid (BAL), tissue samples, and urine. The QIAsymphonyAssay Set-up (AS) sys...

  14. Graphical User Interface Aided Online Fault Diagnosis of Electric Motor - DC motor case study

    OpenAIRE

    POSTALCIOGLU OZGEN, S.

    2009-01-01

    This paper presents graphical user interface (GUI) aided online fault diagnosis for a DC motor. The aim of the research is to prevent system faults. Online fault diagnosis has been studied. The fault diagnosis design has two main levels: Level 1 comprises a traditional control loop; Level 2 contains knowledge-based fault diagnosis. The fault diagnosis technique contains a feature extraction module, a feature cluster module and a fault decision module. Wavelet analysis has been used for the feature extract...

  15. Fault Management Techniques in Human Spaceflight Operations

    Science.gov (United States)

    O'Hagan, Brian; Crocker, Alan

    2006-01-01

    This paper discusses human spaceflight fault management operations. Fault detection and response capabilities available in the current US human spaceflight programs, Space Shuttle and International Space Station, are described while emphasizing system design impacts on operational techniques and constraints. Preflight and inflight processes along with products used to anticipate, mitigate and respond to failures are introduced. Examples of operational products used to support failure responses are presented. Possible improvements in the state of the art, as well as prioritization and success criteria for their implementation, are proposed. This paper describes how the architecture of a command and control system impacts operations in areas such as the required fault response times, automated vs. manual fault responses, use of workarounds, etc. The architecture includes the use of redundancy at the system and software function level, software capabilities, use of intelligent or autonomous systems, number and severity of software defects, etc. This in turn drives which Caution and Warning (C&W) events should be annunciated, C&W event classification, operator display designs, crew training, flight control team training, and procedure development. Other factors impacting operations are the complexity of a system, the skills needed to understand and operate a system, and the use of commonality vs. optimized solutions for software and responses. Fault detection, annunciation, safing responses, and recovery capabilities are explored using real examples to uncover underlying philosophies and constraints. These factors directly impact operations in that the crew and flight control team need to understand what happened, why it happened, what the system is doing, and what, if any, corrective actions they need to perform. If a fault results in multiple C&W events, or if several faults occur simultaneously, the root cause(s) of the fault(s), as well as their vehicle-wide impacts, must be

  16. A fully automated system for analysis of pesticides in water: on-line extraction followed by liquid chromatography-tandem photodiode array/postcolumn derivatization/fluorescence detection.

    Science.gov (United States)

    Patsias, J; Papadopoulou-Mourkidou, E

    1999-01-01

    A fully automated system for on-line solid phase extraction (SPE) followed by high-performance liquid chromatography (HPLC) with tandem detection with a photodiode array detector and a fluorescence detector (after postcolumn derivatization) was developed for analysis of many chemical classes of pesticides and their major conversion products in aquatic systems. An automated on-line-SPE system (Prospekt) operated with reversed-phase cartridges (PRP-1) extracts analytes from 100 mL acidified (pH = 3) filtered water sample. On-line HPLC analysis is performed with a 15 cm C18 analytical column eluted with a mobile phase of phosphate (pH = 3)-acetonitrile in 25 min linear gradient mode. Solutes are detected by tandem diode array/derivatization/fluorescence detection. The system is controlled and monitored by a single computer operated with Millenium software. Recoveries of most analytes in samples fortified at 1 microgram/L are > 90%, with relative standard deviation values of < 5%. For a few very polar analytes, mostly N-methylcarbamoyloximes (i.e., aldicarb sulfone, methomyl, and oxamyl), recoveries are < 20%. However, for these compounds, as well as for the rest of the N-methylcarbamates except for aldicarb sulfoxide and butoxycarboxim, the limits of detection (LODs) are 0.005-0.05 microgram/L. LODs for aldicarb sulfoxide and butoxycarboxim are 0.2 and 0.1 microgram/L, respectively. LODs for the rest of the analytes except 4-nitrophenol, bentazone, captan, decamethrin, and MCPA are 0.05-0.1 microgram/L. LODs for the latter compounds are 0.2-1.0 microgram/L. The system can be operated unattended. PMID:10444834

  17. Comparison of Boiling and Robotics Automation Method in DNA Extraction for Metagenomic Sequencing of Human Oral Microbes

    Science.gov (United States)

    Shinozaki, Natsuko; Ye, Bin; Tsuboi, Akito; Nagasaki, Masao; Yamashita, Riu

    2016-01-01

    The rapid improvement of next-generation sequencing performance now enables us to analyze huge sample sets with more than ten thousand specimens. However, DNA extraction can still be a limiting step in such metagenomic approaches. In this study, we analyzed human oral microbes to compare the performance of three DNA extraction methods: PowerSoil (a method widely used in this field), QIAsymphony (a robotics method), and a simple boiling method. Dental plaque was initially collected from three volunteers in the pilot study and then expanded to 12 volunteers in the follow-up study. Bacterial flora was estimated by sequencing the V4 region of 16S rRNA, followed by species-level profiling. Our results indicate that the efficiency of PowerSoil and QIAsymphony was comparable to the boiling method. Therefore, the boiling method may be a promising alternative because of its simplicity, cost effectiveness, and short handling time. Moreover, this method was reliable for estimating bacterial species and could be used in the future to examine the correlation between oral flora and health status. Nevertheless, differences in the efficiency of DNA extraction for various bacterial species were observed among the three methods. Based on these findings, there is no “gold standard” for DNA extraction. In future, we suggest that the DNA extraction method should be selected on a case-by-case basis considering the aims and specimens of the study. PMID:27104353

  18. Advanced Ground Systems Maintenance Functional Fault Models For Fault Isolation Project

    Science.gov (United States)

    Perotti, Jose M. (Compiler)

    2014-01-01

    This project implements functional fault models (FFM) to automate the isolation of failures during ground systems operations. FFMs will also be used to recommend sensor placement to improve fault isolation capabilities. The project enables the delivery of system health advisories to ground system operators.

  19. Automated in-line extraction of uranium(VI) from raffinate streams with on-line detection by cathodic stripping voltammetry

    International Nuclear Information System (INIS)

    An automated method for on-site monitoring of uranium(VI) in raffinate streams originating from nuclear fuel reprocessing plants is described. An in-line stripping procedure (based on liquid/liquid extraction) was developed to extract U(VI) from this stream, a solvent mixture of 20% tributyl phosphate and nitric acid in kerosene, into an aqueous sodium sulfate solution. Degradation products in the solvent mixture, especially dibutyl phosphate, give rise to very strong complexes and are responsible for moderate but constant U(VI) recoveries (nearly 50%). Optimal conditions for in-line stripping comprise a mixing ratio of extractant (0.5 M sodium sulfate in water)/solvent mixture of nearly 3 and a pumping rate of nearly 0.4 mL min-1 of the solvent mixture. U(VI) was determined by on-line cathodic stripping voltammetry (CSV), preceded by adsorptive collection of the U(VI) as an oxine complex onto a hanging mercury drop electrode. Quantities of 1-2 mL of the aqueous extract were pumped into the voltammetric cell and diluted (1/5 to 1/10) with a background electrolyte containing 0.1 M PIPES buffer, 2 x 10-4 M oxine, 10-4 M EDTA, and 0.2 M hydrazine hydrate (pH 9.0). The CSV peak for U(VI) was obtained at -0.68 V with a detection limit of 20 nM in the raffinate stream using an adsorption time of 120 s. 11 refs., 7 figs., 1 tab

  20. A METHOD FOR AUTOMATED ANALYSIS OF 10 ML WATER SAMPLES CONTAINING ACIDIC, BASIC, AND NEUTRAL SEMIVOLATILE COMPOUNDS LISTED IN USEPA METHOD 8270 BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GAS CHROMATOGRAPHY/MASS SPECTROMETRY

    Science.gov (United States)

    Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...

  1. LFI: A Practical and General Library-Level Fault Injector

    OpenAIRE

    MARINESCU Paul; Candea, George

    2009-01-01

    Fault injection, a critical aspect of testing robust systems, is often overlooked in the development of general-purpose software. We believe this is due to the absence of easy-to-use tools and to the extensive manual labor required to perform fault injection tests. This paper introduces LFI (Library Fault Injector), a tool that automates the preparation of fault scenarios and their injection at the boundary between shared libraries and applications. LFI extends prior work by automatically pro...

  2. Fault detection and diagnosis for complex multivariable processes using neural networks

    International Nuclear Information System (INIS)

    the complex input-output mapping performed by a network, and are in general difficult to obtain. Statistical techniques and relationships between fuzzy systems and standard radial basis function networks have been exploited to prune a trained network and to extract qualitative rules that explain the network operation for fault diagnosis. Pruning the networks improved the fault classification, while offering simple qualitative rules on process behaviour. Automation of the pruning procedure introduced flexibility and ease of application of the methods. (author)

  3. Comparison of automated nucleic acid extraction methods for the detection of cytomegalovirus DNA in fluids and tissues

    Directory of Open Access Journals (Sweden)

    Jesse J. Waggoner

    2014-04-01

    Full Text Available Testing for cytomegalovirus (CMV) DNA is increasingly being used for specimen types other than plasma or whole blood. However, few studies have investigated the performance of different nucleic acid extraction protocols in such specimens. In this study, CMV extraction using the Cell-free 1000 and Pathogen Complex 400 protocols on the QIAsymphony Sample Processing (SP) system was compared using bronchoalveolar lavage fluid (BAL), tissue samples, and urine. The QIAsymphony Assay Set-up (AS) system was used to assemble reactions using artus CMV PCR reagents and amplification was carried out on the Rotor-Gene Q. Samples from 93 patients previously tested for CMV DNA and negative samples spiked with CMV AD-169 were used to evaluate assay performance. The Pathogen Complex 400 protocol yielded the following results: BAL, sensitivity 100% (33/33), specificity 87% (20/23); tissue, sensitivity 100% (25/25), specificity 100% (20/20); urine, sensitivity 100% (21/21), specificity 100% (20/20). Cell-free 1000 extraction gave comparable results for BAL and tissue; however, for urine, the sensitivity was 86% (18/21) and specimen quantitation was inaccurate. Comparative studies of different extraction protocols and DNA detection methods in body fluids and tissues are needed, as assays optimized for blood or plasma will not necessarily perform well on other specimen types.
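
    The percentages quoted above follow directly from the true/false counts in parentheses; as a quick check, using the plain textbook definitions (nothing protocol-specific):

    ```python
    def sensitivity(tp: int, fn: int) -> float:
        """True-positive rate: detected positives / all true positives."""
        return tp / (tp + fn)

    def specificity(tn: int, fp: int) -> float:
        """True-negative rate: correct negatives / all true negatives."""
        return tn / (tn + fp)

    # BAL under the Pathogen Complex 400 protocol: 33/33 positives detected,
    # 20 of 23 negatives called correctly
    print(sensitivity(33, 0), round(specificity(20, 3), 2))  # prints: 1.0 0.87
    ```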

  4. An automated flow injection system for metal determination by flame atomic absorption spectrometry involving on-line fabric disk sorptive extraction technique.

    Science.gov (United States)

    Anthemidis, A; Kazantzi, V; Samanidou, V; Kabir, A; Furton, K G

    2016-08-15

    A novel flow injection-fabric disk sorptive extraction (FI-FDSE) system was developed for automated determination of trace metals. The platform was based on a minicolumn packed with sol-gel coated fabric media in the form of disks, incorporated into an on-line solid-phase extraction system, coupled with flame atomic absorption spectrometry (FAAS). This configuration provides minor backpressure, resulting in high loading flow rates and shorter analytical cycles. The potentials of this technique were demonstrated for trace lead and cadmium determination in environmental water samples. The applicability of different sol-gel coated FPSE media was investigated. The on-line formed complex of metal with ammonium pyrrolidine dithiocarbamate (APDC) was retained onto the fabric surface and methyl isobutyl ketone (MIBK) was used to elute the analytes prior to atomization. For a 90 s preconcentration time, enrichment factors of 140 and 38 and detection limits (3σ) of 1.8 and 0.4 μg L(-1) were achieved for lead and cadmium determination, respectively, with a sampling frequency of 30 h(-1). The accuracy of the proposed method was estimated by analyzing standard reference materials and spiked water samples. PMID:27260436
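
    The enrichment factors and 3σ detection limits quoted above are standard figures of merit and can be computed as follows. This is a generic sketch; the blank readings below are made-up illustrative numbers, not the paper's raw data:

    ```python
    import numpy as np

    def detection_limit_3sigma(blank_signals, calib_slope):
        """IUPAC-style LOD: 3 x the standard deviation of replicate blank
        signals, divided by the calibration slope (signal per conc. unit)."""
        return 3.0 * np.std(blank_signals, ddof=1) / calib_slope

    def enrichment_factor(slope_preconcentrated, slope_direct):
        """Ratio of calibration slopes with and without on-line preconcentration."""
        return slope_preconcentrated / slope_direct

    # hypothetical replicate blank absorbances and slopes:
    blanks = [0.010, 0.012, 0.011, 0.009, 0.013]
    print(detection_limit_3sigma(blanks, calib_slope=0.05))
    print(enrichment_factor(7.0, 0.05))  # ≈ 140, cf. the lead value quoted above
    ```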

  5. Automated Extraction and Quantification of Human Cytomegalovirus DNA in Whole Blood by Real-Time PCR Assay

    OpenAIRE

    Mengelle, C.; Sandres-Sauné, K.; Pasquier, C; Rostaing, L.; Mansuy, J.-M.; Marty, M.; Da Silva, I.; Attal, M.; Massip, P.; Izopet, J.

    2003-01-01

    The measurement of human cytomegalovirus (HCMV) DNA in blood is becoming the standard method for monitoring HCMV infection in immune-suppressed and unsuppressed patients. As various blood compartments can be used, we have compared the HCMV DNA measured in whole blood (WB), peripheral blood leukocytes (PBL), and plasma by real-time PCR. We tested 286 samples: HCMV DNA was extracted automatically from WB and PBL with the MagNA Pure instrument (Roche Molecular Biochemicals) and manually from pla...

  6. Effective Semi-Automated Extraction of Intact Mitochondria from Solid Tissues Using Gentle Mechanical Homogenization and Pressure Cycling Technology

    OpenAIRE

    Carlson, G.; Freeman, E.; Ivanov, A.R.; A. Lazarev; Gross, V. S.

    2011-01-01

    Impaired mitochondrial function has been linked to many diseases, such as stroke, heart disease, cancer, Type II diabetes and Parkinson's disease. Mitochondria-enriched preparations are needed for proteomic and metabolomic studies that may provide crucial insights into tissue-specific mitochondrial function and dysfunction, and answer fundamental questions of cell energetic and oxidative stress. Mitochondria extractions from whole tissue samples are typically performed using Potter-Elvehjem h...

  7. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco;

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavior … of two IGBT modules rated at 1.7 kV/1 kA and 1.7 kV/1.4 kA …

  8. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one. PMID:26374396

  9. Automated chromatographic system with polarimetric detection laser applied in the control of fermentation processes and seaweed extracts characterization

    International Nuclear Information System (INIS)

    Applications and innovations of chromatographic and polarimetric systems are presented, together with methodologies for measuring the input molasses and the resulting product of an alcohol fermentation process and for evaluating a honey fermentation process used to obtain a beverage native to the Yucatan region. The composition of optically active substances in seaweed, of interest to the pharmaceutical industry, was also assessed. The findings provide alternative measurements for raw materials and products of the sugar, beekeeping and pharmaceutical industries. Liquid chromatography with automated polarimetric detection reduces measurement times to 15 min, making it comparable to high-resolution chromatography while significantly reducing operating costs. The chromatography system with polarimetric detection (SCDP) includes new standard-size columns designed by the authors, which allow samples with volumes up to 1 mL to be processed and reduce measurement time to 15 min, decreasing the sample volume five-fold and halving the measurement time. The determination of analyte concentrations from the chromatogram peaks obtained with the different columns was evaluated, including the uncertainty of the measurements. The improvements to a data acquisition program (ADQUIPOL v.2.0) and new programs for the preparation of chromatograms (CROMAPOL V.1.0 and CROMAPOL V.1.2) provide important benefits: they considerably reduce the time needed to process the results and can be applied to other chromatography systems with appropriate adjustments. (Author)

  10. Automated extraction and assessment of functional features of areal measured microstructures using a segmentation-based evaluation method

    International Nuclear Information System (INIS)

    In addition to currently available surface parameters, according to ISO 4287:2010 and ISO 25178-2:2012—which are defined particularly for stochastic surfaces—a universal evaluation procedure is provided for geometrical, well-defined, microstructured surfaces. Since several million features (like diameters, depths, etc) are present on microstructured surfaces, segmentation techniques are used to automate the feature-based dimensional evaluation. By applying an additional extended 3D evaluation after the segmentation and classification procedure, the accuracy of the evaluation is improved compared to the direct evaluation of segments, and additional functional parameters can be derived. Advantages of the extended segmentation-based evaluation method include not only the ability to evaluate the manufacturing process statistically (e.g. by capability indices, according to ISO 21747:2007 and ISO 3534-2:2013) and to derive statistically reliable values for the correction of microstructuring processes but also the direct re-use of the evaluated parameter (including its statistical distribution) in simulations for the calculation of probabilities with respect to the functionality of the microstructured surface. The practical suitability of this method is demonstrated using examples of microstructures for the improvement of sliding and ink transfer for printing machines. (paper)
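The capability indices mentioned (in the sense of ISO 21747) are straightforward to compute once a feature such as a structure depth has been measured across many segments. A hedged sketch with invented specification limits and data:

```python
import statistics

def cp(usl, lsl, sigma):
    """Process capability: spec width over 6-sigma process spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mu, sigma):
    """Capability accounting for centering of the process mean."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Illustrative structure depths in micrometres, spec 49.5-50.5 um
depths = [49.8, 50.1, 50.0, 49.9, 50.2, 50.0, 49.95, 50.05]
mu = statistics.mean(depths)
sigma = statistics.stdev(depths)
cap = cp(50.5, 49.5, sigma)
cap_k = cpk(50.5, 49.5, mu, sigma)
```

For this perfectly centered toy sample, Cp and Cpk coincide; a real evaluation would feed in the per-feature statistics produced by the segmentation step.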

  11. Monitoring and Fault Diagnosis for Batch Process Based on Feature Extract in Fisher Subspace%基于Fisher子空间特征提取的间歇过程监控和故障诊断

    Institute of Scientific and Technical Information of China (English)

    赵旭; 阎威武; 邵惠鹤

    2006-01-01

    Multivariate statistical process control methods have been widely used in biochemical industries. Batch processes are usually monitored by the method of multi-way principal component analysis (MPCA). In this article, a new batch process monitoring and fault diagnosis method based on feature extraction in the Fisher subspace is proposed. The feature vector and the feature direction are extracted by projecting the high-dimension process data onto the low-dimension Fisher space. The similarity of the feature vector between the current and the reference batch is calculated for on-line process monitoring, and the contribution plot of weights in the feature direction is calculated for fault diagnosis. The approach overcomes the need for estimating or filling in the unknown portion of the process variable trajectories from the current time to the end of the batch. Simulation results on the benchmark model of the penicillin fermentation process demonstrate that, in comparison to the MPCA method, the proposed method is more accurate and efficient for process monitoring and fault diagnosis.
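As a rough illustration of the Fisher-subspace idea (a generic two-class Fisher discriminant on synthetic data, not the paper's algorithm), the discriminant direction can be obtained from the within-class scatter, process data projected onto it, and the magnitude of the weights read off as a contribution plot for diagnosis:

```python
import numpy as np

rng = np.random.default_rng(42)
normal = rng.normal(size=(50, 5))                                   # reference batches
faulty = rng.normal(size=(50, 5)) + np.array([2.0, 0, 0, 1.0, 0])   # variable 0 shifted most

# Two-class Fisher direction: w ~ Sw^-1 (mean_normal - mean_faulty)
m_n, m_f = normal.mean(axis=0), faulty.mean(axis=0)
Sw = np.cov(normal.T) + np.cov(faulty.T)    # within-class scatter
w = np.linalg.solve(Sw, m_n - m_f)
w /= np.linalg.norm(w)

proj_normal = normal @ w                    # low-dimension feature values
proj_faulty = faulty @ w
contribution = np.abs(w)                    # weights used as a contribution plot
```

The faulty batches separate from the reference batches along `w`, and the largest weight points at the shifted variable, which is the diagnostic use of the contribution plot described above.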

  12. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. Automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system.
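For a concrete sense of the evaluation step such tools automate, the top-event probability of a small tree, TOP = (A AND B) OR C with independent basic events, can be computed gate by gate; the basic-event probabilities below are illustrative only:

```python
def and_gate(*probs):
    """AND gate: all independent inputs must fail."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """OR gate: at least one independent input fails."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Illustrative basic-event failure probabilities
p_a, p_b, p_c = 1e-3, 2e-3, 5e-5
p_top = or_gate(and_gate(p_a, p_b), p_c)
```

Automated construction would generate the gate structure from the system description; the numerical evaluation itself is this simple bottom-up pass.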

  13. Fully automated analysis of beta-lactams in bovine milk by online solid phase extraction-liquid chromatography-electrospray-tandem mass spectrometry.

    Science.gov (United States)

    Kantiani, Lina; Farré, Marinella; Sibum, Martin; Postigo, Cristina; López de Alda, Miren; Barceló, Damiá

    2009-06-01

    A fully automated method for the detection of beta-lactam antibiotics, including six penicillins (amoxicillin, ampicillin, cloxacillin, dicloxacillin, oxacillin, and penicillin G) and four cephalosporins (cefazolin, ceftiofur, cefoperazone, and cefalexin) in bovine milk samples has been developed. The outlined method is based on online solid-phase extraction-liquid chromatography/electrospray-tandem mass spectrometry (SPE-LC/ESI-MS-MS). Target compounds were concentrated from 500 microL of centrifuged milk samples using an online SPE procedure with C18 HD cartridges. Target analytes were eluted with a gradient mobile phase (water + 0.1% formic acid/methanol + 0.1% formic acid) at a flow rate of 0.7 mL/min. Chromatographic separation was achieved within 10 min using a C-12 reversed phase analytical column. For unequivocal identification and confirmation, two multiple reaction monitoring (MRM) transitions were acquired for each analyte in the positive electrospray ionization mode (ESI(+)). Method limits of detection (LODs) in milk were well below the maximum residue limits (MRLs) set by the European Union for all compounds. Limits of quantification in milk were between 0.09 ng/mL and 1.44 ng/mL. The developed method was validated according to EU's requirements, and accuracy results ranged from 80 to 116%. Finally, the method was applied to the analysis of twenty real samples previously screened by the inhibition of microbial growth test Eclipse 100. This new developed method offers high sensitivity and accuracy of results, minimum sample pre-treatment, and uses for the first time an automated online SPE offering a high throughput analysis. Because of all these characteristics, the proposed method is applicable and could be deemed necessary within the field of food control and safety. PMID:19402673

  14. Fault Current Characteristics of the DFIG under Asymmetrical Fault Conditions

    OpenAIRE

    Fan Xiao; Zhe Zhang; Xianggen Yin

    2015-01-01

    During non-severe fault conditions, crowbar protection is not activated and the rotor windings of a doubly-fed induction generator (DFIG) are excited by the AC/DC/AC converter. Meanwhile, under asymmetrical fault conditions, the electrical variables oscillate at twice the grid frequency in synchronous dq frame. In the engineering practice, notch filters are usually used to extract the positive and negative sequence components. In these cases, the dynamic response of a rotor-side converter (RS...

  15. Automated Extraction of Cranial Landmarks from Computed Tomography Data using a Combined Method of Knowledge and Pattern Based Approaches

    Directory of Open Access Journals (Sweden)

    Roshan N. RAJAPAKSE

    2016-03-01

    Accurate identification of anatomical structures from medical imaging data is a significant and critical function in the medical domain. Past studies in this context have mainly utilized two approaches: knowledge-based and learning-based methods. Further, most previously reported studies have focused on identification of landmarks from lateral X-ray Computed Tomography (CT) data, particularly in the field of orthodontics. However, this study focused on extracting cranial landmarks from large sets of cross-sectional CT slices using a combined method of the two aforementioned approaches. The proposed method is centered mainly on template data sets, which were created using the actual contour patterns extracted from CT cases for each of the landmarks in consideration. Firstly, these templates were used to devise rules, which are characteristic of the knowledge-based method. Secondly, the same template sets were employed to perform template matching, related to the learning-methodologies approach. The proposed method was tested on two landmarks, the Dorsum sellae and the Pterygoid plate, using CT cases of 5 subjects. The results indicate that, out of the 10 tests, the output images were within the expected range (desired accuracy) in 7 instances and the acceptable range (near accuracy) in 2 instances, thus verifying the effectiveness of the combined template-set-centric approach proposed in this study.
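The template-matching half of such an approach is often implemented as normalized cross-correlation of a template over the image; a toy sketch on synthetic arrays (not CT data, and only one of many possible matching criteria):

```python
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation of two equally sized arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(t)
    return float((p * t).sum() / denom) if denom else 0.0

def best_match(image, template):
    """Slide the template over the image; return the top-left of the best match."""
    th, tw = template.shape
    scores = {}
    for i in range(image.shape[0] - th + 1):
        for j in range(image.shape[1] - tw + 1):
            scores[(i, j)] = ncc(image[i:i + th, j:j + tw], template)
    return max(scores, key=scores.get)

image = np.zeros((8, 8))
image[3:5, 4:6] = [[1, 2], [3, 4]]          # embedded landmark pattern
template = np.array([[1.0, 2.0], [3.0, 4.0]])
loc = best_match(image, template)
```

A knowledge-based stage would then restrict or re-rank candidate locations using anatomical rules, which is the combination the abstract describes.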

  16. New insights on Southern Coyote Creek Fault and Superstition Hills Fault

    Science.gov (United States)

    van Zandt, A. J.; Mellors, R. J.; Rockwell, T. K.; Burgess, M. K.; O'Hare, M.

    2007-12-01

    Recent field work has confirmed an extension of the southern Coyote Creek (CCF) branch of the San Jacinto fault in the western Salton trough. The fault marks the western edge of an area of subsidence caused by groundwater extraction, and field measurements suggest that recent strike-slip motion has occurred on this fault as well. We attempt to determine whether this fault connects at depth with the Superstition Hills fault (SHF) to the southeast by modeling observed surface deformation between the two faults measured by InSAR. Stacked ERS (descending) InSAR data from 1992 to 2000 is initially modeled using a finite fault in an elastic half-space. Observed deformation along the SHF and Elmore Ranch fault is modeled assuming shallow (< 5 km) creep. We test various models to explain surface deformation between the two faults.
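Surface deformation across strike-slip faults like these is often first modeled with a screw dislocation in an elastic half-space, where the fault-parallel velocity profile is v(x) = (s/π)·arctan(x/D) for slip rate s and locking depth D. A toy profile with illustrative parameters (not values from this study):

```python
import math

def fault_parallel_velocity(x_km, slip_rate_mm_yr, locking_depth_km):
    """Screw-dislocation (elastic half-space) surface velocity profile."""
    return (slip_rate_mm_yr / math.pi) * math.atan(x_km / locking_depth_km)

# Illustrative: 20 mm/yr slip rate, 5 km locking depth
profile = [fault_parallel_velocity(x, 20.0, 5.0) for x in (-50, -5, 0, 5, 50)]
```

The profile is antisymmetric about the fault trace, which is the signature one looks for in stacked InSAR line-of-sight deformation before fitting a full finite-fault model.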

  17. Automated and quantitative headspace in-tube extraction for the accurate determination of highly volatile compounds from wines and beers.

    Science.gov (United States)

    Zapata, Julián; Mateo-Vivaracho, Laura; Lopez, Ricardo; Ferreira, Vicente

    2012-03-23

    An automatic headspace in-tube extraction (ITEX) method for the accurate determination of acetaldehyde, ethyl acetate, diacetyl and other volatile compounds from wine and beer has been developed and validated. Method accuracy is based on the nearly quantitative transference of volatile compounds from the sample to the ITEX trap. For achieving that goal most methodological aspects and parameters have been carefully examined. The vial and sample sizes and the trapping materials were found to be critical due to the pernicious saturation effects of ethanol. Small 2 mL vials containing very small amounts of sample (20 μL of 1:10 diluted sample) and a trap filled with 22 mg of Bond Elut ENV resins could guarantee a complete trapping of sample vapors. The complete extraction requires 100 × 0.5 mL pumping strokes at 60 °C and takes 24 min. Analytes are further desorbed at 240 °C into the GC injector under a 1:5 split ratio. The proportion of analytes finally transferred to the trap ranged from 85 to 99%. The validation of the method showed satisfactory figures of merit. Determination coefficients were better than 0.995 in all cases and good repeatability was also obtained (better than 7% in all cases). Reproducibility was better than 8.3% except for acetaldehyde (13.1%). Detection limits were below the odor detection thresholds of these target compounds in wine and beer and well below the normal ranges of occurrence. Recoveries were not significantly different to 100%, except in the case of acetaldehyde. In such a case it could be determined that the method is not able to break some of the adducts that this compound forms with sulfites. However, such problem was avoided after incubating the sample with glyoxal. The method can constitute a general and reliable alternative for the analysis of very volatile compounds in other difficult matrixes. PMID:22340891

  18. miRSel: Automated extraction of associations between microRNAs and genes from the biomedical literature

    Directory of Open Access Journals (Sweden)

    Zimmer Ralf

    2010-03-01

    Background MicroRNAs have been discovered as important regulators of gene expression. To identify the target genes of microRNAs, several databases and prediction algorithms have been developed. Only a few experimentally confirmed microRNA targets are available in databases. Many of the microRNA targets stored in databases were derived from large-scale experiments that are considered not very reliable. We propose to use text mining of publication abstracts for extracting microRNA-gene associations, including microRNA-target relations, to complement current repositories. Results The microRNA-gene association database miRSel combines text-mining results with existing databases and computational predictions. Text mining enables the reliable extraction of microRNA, gene and protein occurrences as well as their relationships from texts. Thereby, we increased the number of human, mouse and rat miRNA-gene associations by at least three-fold as compared to e.g. TarBase, a resource for miRNA-gene associations. Conclusions Our database miRSel offers the currently largest collection of literature-derived miRNA-gene associations. Comprehensive collections of miRNA-gene associations are important for the development of miRNA target prediction tools and the analysis of regulatory networks. miRSel is updated daily and can be queried using a web-based interface via microRNA identifiers, gene and protein names, PubMed queries as well as gene ontology (GO) terms. miRSel is freely available online at http://services.bio.ifi.lmu.de/mirsel.
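The simplest building block of such literature mining is sentence-level co-occurrence: detect miRNA and gene mentions and record the pairs. The sketch below is a deliberately crude illustration (the regex patterns and stoplist are invented, not miRSel's actual pipeline):

```python
import re
from itertools import product

MIRNA = re.compile(r'\b(?:hsa-)?miR-\d+[a-z]?\b', re.IGNORECASE)
GENE = re.compile(r'\b[A-Z][A-Z0-9]{2,5}\b')   # crude gene-symbol heuristic

def cooccurrences(sentence):
    """Return sorted (miRNA, gene) pairs co-occurring in one sentence."""
    mirnas = set(MIRNA.findall(sentence))
    genes = set(GENE.findall(sentence)) - {'DNA', 'RNA', 'PCR'}  # tiny stoplist
    return sorted(product(sorted(mirnas), sorted(genes)))

pairs = cooccurrences("We show that miR-21 directly targets PTEN in tumors.")
```

Real systems add dictionary-based named-entity recognition, disambiguation and relation classification on top of this, which is where the reported reliability comes from.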

  19. Observer-based Fault Detection and Isolation for Nonlinear Systems

    DEFF Research Database (Denmark)

    Lootsma, T.F.

    With the rise in automation, the increase in fault detection and isolation & reconfiguration is inevitable. Interest in fault detection and isolation (FDI) for nonlinear systems has grown significantly in recent years. The design of FDI is motivated by the need for knowledge about occurring faults in fault-tolerant control systems (FTC systems). The idea of FTC systems is to detect, isolate, and handle faults in such a way that the systems can still perform in a required manner. One prefers reduced performance after occurrence of a fault to the shut-down of (sub-)systems. Hence, the idea of fault-tolerance can be applied to ordinary industrial processes that are not categorized as high-risk applications, but where high availability is desirable. The quality of fault-tolerant control is totally dependent on the quality of the underlying algorithms. They detect possible faults, and later reconfigure …

  20. Diagnosis and Fault-tolerant Control, 2nd edition

    DEFF Research Database (Denmark)

    Blanke, Mogens; Kinnaert, Michel; Lunze, Jan; Starosweicki, Marcel

    Fault-tolerant control aims at a graceful degradation of the behaviour of automated systems in case of faults. It satisfies the industrial demand for enhanced availability and safety, in contrast to traditional reactions to faults that bring about sudden shutdowns and loss of availability. The book presents effective model-based analysis and design methods for fault diagnosis and fault-tolerant control. Architectural and structural models are used to analyse the propagation of the fault through the process, to test the fault detectability and to find the redundancies in the process that can be used to ensure fault tolerance. Design methods for diagnostic systems and fault-tolerant controllers are presented for processes that are described by analytical models, by discrete-event models or that can be dealt with as quantised systems. Five case studies on pilot processes show the applicability of …

  1. EXTRACT

    DEFF Research Database (Denmark)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra;

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, manual sample annotation is a highly labor-intensive process and requires familiarity with the terminologies used. We have … organism, tissue and disease terms. The evaluators in the BioCreative V Interactive Annotation Task found the system to be intuitive, useful, well documented and sufficiently accurate to be helpful in spotting relevant text passages and extracting organism and environment terms. Comparison of fully manual and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed. Database URL: https://extract.hcmr.gr/

  2. Intelligent Fault Diagnosis in Lead-zinc Smelting Process

    Institute of Scientific and Technical Information of China (English)

    Wei-Hua Gui; Chun-Hua Yang; Jing Teng

    2007-01-01

    According to the fault characteristics of the imperial smelting process (ISP), a novel intelligent integrated fault diagnostic system is developed. In the system, fuzzy neural networks are utilized to extract fault symptoms and an expert system is employed for effective fault diagnosis of the process. Furthermore, fuzzy abductive inference is introduced to diagnose multiple faults. The feasibility of the proposed system is demonstrated through a pilot plant case study.

  3. A new adaptive algorithm for automated feature extraction in exponentially damped signals for health monitoring of smart structures

    Science.gov (United States)

    Qarib, Hossein; Adeli, Hojjat

    2015-12-01

    In this paper the authors introduce a new adaptive signal processing technique for feature extraction and parameter estimation in noisy exponentially damped signals. The iterative 3-stage method is based on the adroit integration of the strengths of parametric and nonparametric methods such as the multiple signal classification, matrix pencil, and empirical mode decomposition algorithms. The first stage is a new adaptive filtration or noise removal scheme. The second stage is a hybrid parametric-nonparametric signal parameter estimation technique based on an output-only system identification technique. The third stage is optimization of the estimated parameters using a combination of the primal-dual path-following interior point algorithm and the genetic algorithm. The methodology is evaluated using a synthetic signal and a signal obtained experimentally from transverse vibrations of a steel cantilever beam. The method is successful in estimating the frequencies accurately, and it also estimates the damping exponents. The proposed adaptive filtration method does not include any frequency domain manipulation. Consequently, the time domain signal is not affected as a result of frequency domain and inverse transformations.
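A bare-bones matrix pencil estimate shows how frequency and damping can be pulled from an exponentially damped signal; this toy, noiseless stand-in assumes a single damped cosine and is not the paper's 3-stage adaptive method:

```python
import numpy as np

def matrix_pencil(y, dt, pencil=None):
    """Estimate (frequency, damping) of the dominant damped sinusoid in y."""
    n = len(y)
    L = pencil or n // 3
    Y = np.array([y[i:i + L + 1] for i in range(n - L)])   # Hankel matrix
    Y0, Y1 = Y[:, :-1], Y[:, 1:]
    z = np.linalg.eigvals(np.linalg.pinv(Y0) @ Y1)         # signal poles
    z = z[np.argsort(-np.abs(z))][:2]                      # keep dominant pair
    s = np.log(z) / dt                                     # continuous-time poles
    freq = abs(s.imag[0]) / (2 * np.pi)
    damping = -s.real[0]
    return freq, damping

dt = 0.001
t = np.arange(0, 0.4, dt)
y = np.exp(-2.0 * t) * np.cos(2 * np.pi * 5.0 * t)   # 5 Hz, damping 2 s^-1
f_est, d_est = matrix_pencil(y, dt)
```

On noisy data one would first apply the denoising stage the abstract describes; the pencil parameter L trades noise robustness against resolution.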

  4. Automatic location of short circuit faults

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [VTT Energy, Espoo (Finland); Hakola, T.; Antila, E. [ABB Power Oy (Finland); Seppaenen, M. [North-Carelian Power Company (Finland)

    1998-08-01

    In this chapter, the automatic location of short circuit faults on medium voltage distribution lines, based on the integration of computer systems of medium voltage distribution network automation is discussed. First the distribution data management systems and their interface with the substation telecontrol, or SCADA systems, is studied. Then the integration of substation telecontrol system and computerized relay protection is discussed. Finally, the implementation of the fault location system is presented and the practical experience with the system is discussed

  5. Automatic location of short circuit faults

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [VTT Energy, Espoo (Finland); Hakola, T.; Antila, E. [ABB Power Oy, Helsinki (Finland); Seppaenen, M. [North-Carelian Power Company (Finland)

    1996-12-31

    In this presentation, the automatic location of short circuit faults on medium voltage distribution lines, based on the integration of computer systems of medium voltage distribution network automation is discussed. First the distribution data management systems and their interface with the substation telecontrol, or SCADA systems, is studied. Then the integration of substation telecontrol system and computerised relay protection is discussed. Finally, the implementation of the fault location system is presented and the practical experience with the system is discussed

  6. Development and validation of an automated liquid-liquid extraction GC/MS method for the determination of THC, 11-OH-THC, and free THC-carboxylic acid (THC-COOH) from blood serum.

    Science.gov (United States)

    Purschke, Kirsten; Heinl, Sonja; Lerch, Oliver; Erdmann, Freidoon; Veit, Florian

    2016-06-01

    The analysis of Δ(9)-tetrahydrocannabinol (THC) and its metabolites 11-hydroxy-Δ(9)-tetrahydrocannabinol (11-OH-THC) and 11-nor-9-carboxy-Δ(9)-tetrahydrocannabinol (THC-COOH) from blood serum is a routine task in forensic toxicology laboratories. For examination of consumption habits, the concentration of the phase I metabolite THC-COOH is used. Recommendations for the interpretation of analysis values in medical-psychological assessments (regranting of driver's licenses, Germany) include threshold values for the free, unconjugated THC-COOH. Using a fully automated two-step liquid-liquid extraction, THC, 11-OH-THC, and free, unconjugated THC-COOH were extracted from blood serum, silylated with N-methyl-N-(trimethylsilyl) trifluoroacetamide (MSTFA), and analyzed by GC/MS. The automation was carried out by an x-y-z sample robot equipped with modules for shaking, centrifugation, and solvent evaporation. This method was based on a previously developed manual sample preparation method. Validation guidelines of the Society of Toxicological and Forensic Chemistry (GTFCh) were fulfilled for both methods; the focus of this article is the automated one. Limits of detection and quantification were 0.3 and 0.6 μg/L for THC, 0.1 and 0.8 μg/L for 11-OH-THC, and 0.3 and 1.1 μg/L for THC-COOH, when extracting only 0.5 mL of blood serum. Therefore, the required limit of quantification for THC of 1 μg/L in driving-under-the-influence-of-cannabis cases in Germany (and other countries) can be reached and the method can be employed in that context. Real and external control samples were analyzed, and a round robin test was passed successfully. To date, the method is employed in the Institute of Legal Medicine in Giessen, Germany, in daily routine. Automation helps in avoiding errors during sample preparation and reduces the workload of the laboratory personnel. Due to its flexibility, the analysis system can be employed for other liquid-liquid extractions as

  7. Fault tolerance and reliability in integrated ship control

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Izadi-Zamanabadi, Roozbeh; Schiøler, Henrik

    2002-01-01

    Various strategies for achieving fault tolerance in large scale control systems are discussed. The positive and negative impacts of distribution through network communication are presented. The ATOMOS framework for standardized reliable marine automation is presented along with the corresponding...

  8. Debug automation from pre-silicon to post-silicon

    CERN Document Server

    Dehbashi, Mehdi

    2015-01-01

    This book describes automated debugging approaches for the bugs and faults which appear at different abstraction levels of a hardware system. The authors employ a transaction-based debug approach to systems at the transaction level, asserting the correct relation of transactions. The automated debug approach for design bugs finds the potential fault candidates at RTL and gate level of a circuit. Debug techniques for logic bugs and synchronization bugs are demonstrated, enabling readers to localize the most difficult bugs. Debug automation for electrical faults (delay faults) finds the potentially failing speedpaths in a circuit at gate level. The various debug approaches described achieve high diagnosis accuracy and reduce the debugging time, shortening the IC development cycle and increasing the productivity of designers. Describes a unified framework for debug automation used at both pre-silicon and post-silicon stages; Provides approaches for debug automation of a hardware system at different levels of ...

  9. Analysis of betamethasone in rat plasma using automated solid-phase extraction coupled with liquid chromatography-tandem mass spectrometry. Determination of plasma concentrations in rat following oral and intravenous administration.

    Science.gov (United States)

    Tamvakopoulos, C S; Neugebauer, J M; Donnelly, M; Griffin, P R

    2002-09-01

    A method is described for the determination of betamethasone in rat plasma by liquid chromatography-tandem mass spectrometry (LC-MS-MS). The analyte was recovered from plasma by solid-phase extraction and subsequently analyzed by LC-MS-MS. A Packard Multiprobe II, an automated liquid handling system, was employed for the preparation and extraction of a 96-well plate containing unknown plasma samples, standards and quality control samples in an automated fashion. Prednisolone, a structurally related steroid, was used as an internal standard. Using the described approach, a limit of quantitation of 2 ng/ml was achieved with a 50 microl aliquot of rat plasma. The described level of sensitivity allowed the determination of betamethasone concentrations and subsequent measurement of kinetic parameters of betamethasone in rat. Combination of automated plasma extraction and the sensitivity and selectivity of LC-MS-MS offers a valuable alternative to the methodologies currently used for the quantitation of steroids in biological fluids. PMID:12137997

  10. Fault Current Characteristics of the DFIG under Asymmetrical Fault Conditions

    Directory of Open Access Journals (Sweden)

    Fan Xiao

    2015-09-01

    During non-severe fault conditions, crowbar protection is not activated and the rotor windings of a doubly-fed induction generator (DFIG) are excited by the AC/DC/AC converter. Meanwhile, under asymmetrical fault conditions, the electrical variables oscillate at twice the grid frequency in the synchronous dq frame. In engineering practice, notch filters are usually used to extract the positive and negative sequence components. In these cases, the dynamic response of the rotor-side converter (RSC) and the notch filters have a large influence on the fault current characteristics of the DFIG. In this paper, the influence of the notch filters on the proportional-integral (PI) parameters is discussed and simplified calculation models of the rotor current are established. Then, the dynamic performance of the stator flux linkage under asymmetrical fault conditions is also analyzed. Based on this, the fault characteristics of the stator current under asymmetrical fault conditions are studied and the corresponding analytical expressions of the stator fault current are obtained. Finally, digital simulation results validate the analytical results. The research results are helpful for meeting the requirements of practical short-circuit calculations and the construction of a relaying protection system for a power grid with penetration of DFIGs.
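In the synchronous dq frame an asymmetrical fault appears as a ripple at twice the grid frequency riding on a DC (positive-sequence) component, and a notch filter at 2·f_grid recovers the DC part. A toy biquad notch on synthetic data (illustrative coefficients and signal, not the paper's converter model):

```python
import math

def notch_filter(x, f0, fs, r=0.95):
    """Second-order IIR notch at f0 (Hz) for sample rate fs (Hz)."""
    w0 = 2 * math.pi * f0 / fs
    b0, b1, b2 = 1.0, -2.0 * math.cos(w0), 1.0      # zeros on the unit circle at w0
    a1, a2 = -2.0 * r * math.cos(w0), r * r         # poles just inside, radius r
    gain = (b0 + b1 + b2) / (1.0 + a1 + a2)
    b0, b1, b2 = b0 / gain, b1 / gain, b2 / gain    # normalize DC gain to 1
    y, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for v in x:
        out = b0 * v + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x1, x2, y1, y2 = v, x1, out, y1
        y.append(out)
    return y

fs, f_grid = 10000, 50
# dq-frame quantity: 1.0 DC component plus a 2*f_grid negative-sequence ripple
sig = [1.0 + 0.5 * math.cos(2 * math.pi * 2 * f_grid * n / fs) for n in range(4000)]
filtered = notch_filter(sig, 2 * f_grid, fs)
```

The pole radius r sets the trade-off the abstract alludes to: a narrower notch (r closer to 1) rejects the ripple more selectively but responds more slowly, which interacts with the PI tuning of the RSC.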

  11. Comparative Evaluation of Three Automated Systems for DNA Extraction in Conjunction with Three Commercially Available Real-Time PCR Assays for Quantitation of Plasma Cytomegalovirus DNAemia in Allogeneic Stem Cell Transplant Recipients▿

    OpenAIRE

    Bravo, Dayana; Clari, María Ángeles; Costa, Elisa; Muñoz-Cobo Liria, Beatriz; Solano Vercet, Carlos; José Remigia, María; Navarro, David

    2011-01-01

    Limited data are available on the performance of different automated extraction platforms and commercially available quantitative real-time PCR (QRT-PCR) methods for the quantitation of cytomegalovirus (CMV) DNA in plasma. We compared the performance characteristics of the Abbott mSample preparation system DNA kit on the m24 SP instrument (Abbott), the High Pure viral nucleic acid kit on the COBAS AmpliPrep system (Roche), and the EZ1 Virus 2.0 kit on the BioRobot EZ1 extraction platform (Qia...

  12. Fault trees

    International Nuclear Information System (INIS)

    Fault trees are a method of deductive analysis and a means of graphic representation of the reliability and security of systems. The principles of the method are set out and the main points illustrated by many examples of electrical systems, fluids, and mechanical systems as well as everyday occurrences. In addition, some advice is given on the use of the method

  13. Machine Fault Signature Analysis

    OpenAIRE

    Mulchandani, K. B.; A.K. Wadhwani; Pratesh Jayaswal

    2008-01-01

    The objective of this paper is to present recent developments in the field of machine fault signature analysis with particular regard to vibration analysis. The different types of faults that can be identified from the vibration signature analysis are, for example, gear fault, rolling contact bearing fault, journal bearing fault, flexible coupling faults, and electrical machine fault. It is not the intention of the authors to attempt to provide a detailed coverage of all the faults while deta...

  14. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time, designed and implemented, software- and hardware-oriented house automation research project, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  15. Automated solvent concentrator

    Science.gov (United States)

    Griffith, J. S.; Stuart, J. L.

    1976-01-01

    Designed for automated drug identification system (AUDRI), device increases concentration by 100. Sample is first filtered, removing particulate contaminants and reducing water content of sample. Sample is extracted from filtered residue by specific solvent. Concentrator provides input material to analysis subsystem.

  16. Determination of talinolol in human plasma using automated on-line solid phase extraction combined with atmospheric pressure chemical ionization tandem mass spectrometry.

    Science.gov (United States)

    Bourgogne, Emmanuel; Grivet, Chantal; Hopfgartner, Gérard

    2005-06-01

    A specific LC-MS/MS assay was developed for the automated determination of talinolol in human plasma, using an on-line solid-phase extraction system (Prospekt 2) combined with atmospheric pressure chemical ionization (APCI) tandem mass spectrometry. The method involved simple precipitation of plasma proteins with perchloric acid (containing propranolol as the internal standard, IS) and injection of the supernatant onto a C8 End Capped (10 mm x 2 mm) cartridge without any evaporation step. Using the back-flush mode, the analytes were transferred onto an analytical column (XTerra C18, 50 mm x 4.6 mm) for chromatographic separation and mass spectrometric detection. One particularity of the assay is that the SPE cartridge is used as a column-switching device rather than as a disposable SPE cartridge; the same cartridge could therefore be used more than 28 times, significantly reducing the analysis cost. APCI ionization was selected to overcome any potential matrix suppression effects because the analyte and IS co-eluted. The mean accuracy and precision in the concentration range 2.5-200 ng/mL were found to be 103% and 7.4%, respectively, as assessed from QC samples during the validation phase of the assay. The lower limit of quantification was 2.5 ng/mL, using a 250 microL plasma aliquot. The LC-MS/MS method provided the requisite selectivity, sensitivity, robustness, accuracy and precision to assess the pharmacokinetics of the compound in several hundred human plasma samples. PMID:15866498

  17. Automated solid-phase extraction for the determination of polybrominated diphenyl ethers and polychlorinated biphenyls in serum--application on archived Norwegian samples from 1977 to 2003.

    Science.gov (United States)

    Thomsen, Cathrine; Liane, Veronica Horpestad; Becher, Georg

    2007-02-01

    An analytical method comprised of automated solid-phase extraction and determination using gas chromatography mass spectrometry (single quadrupole) has been developed for the determination of 12 polybrominated diphenyl ethers (PBDEs), 26 polychlorinated biphenyls (PCBs), two organochlorine compounds (OCs) (hexachlorobenzene and octachlorostyrene) and two brominated phenols (pentabromophenol, and tetrabromobisphenol-A (TBBP-A)). The analytes were extracted using a sorbent of polystyrene-divinylbenzene and an additional clean-up was performed on a sulphuric acid-silica column to remove lipids. The method has been validated by spiking horse serum at five levels. The mean accuracy given as recovery relative to internal standards was 95%, 99%, 93% and 109% for the PBDEs PCBs, OCs and brominated phenols, respectively. The mean repeatability given as RSDs was respectively 6.9%, 8.7%, 7.5% and 15%. Estimated limits of detection (S/N=3) were in the range 0.2-1.8 pg/g serum for the PBDEs and phenols, and from 0.1 pg/g to 56 pg/g serum for the PCBs and OCs. The validated method has been used to investigate the levels of PBDEs and PCBs in 21 pooled serum samples from the general Norwegian population. In serum from men (age 40-50 years) the sum of seven PBDE congeners (IUPAC No. 28, 47, 99, 100, 153, 154 and 183) increased from 1977 (0.5 ng/g lipids) to 1998 (4.8 ng/g lipids). From 1999 to 2003 the concentration of PBDEs seems to have stabilised. On the other hand, the sum of five PCBs (IUPAC No. 101, 118, 138, 153 and 180) in these samples decreased steadily from 1977 (666 ng/g lipids) to 2003 (176 ng/g lipids). Tetrabromobisphenol-A and BDE-209 were detected in almost all samples, but no similar temporal trends to that seen for the PBDEs were observed for these compounds, which might be due to the short half-lives of these brominated flame retardants (FR) in humans. PMID:17023223
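
The abstract reports limits of detection estimated at S/N = 3. A minimal sketch of that convention: the LOD is the analyte amount whose signal is k times the baseline noise standard deviation, given the calibration slope. The numbers below are hypothetical, not taken from the paper:

```python
def limit_of_detection(noise_sd, slope, k=3.0):
    """LOD as the analyte amount giving a signal k times the baseline
    noise standard deviation (k=3 corresponds to S/N = 3)."""
    return k * noise_sd / slope

# hypothetical calibration: 0.9 signal units per pg/g, noise sd 0.15
lod = limit_of_detection(noise_sd=0.15, slope=0.9)  # in pg/g
```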

  18. Bearing fault detection using motor current signal analysis based on wavelet packet decomposition and Hilbert envelope

    Directory of Open Access Journals (Sweden)

    Imaouchen Yacine

    2015-01-01

    Full Text Available To detect rolling element bearing defects, many researches have been focused on Motor Current Signal Analysis (MCSA using spectral analysis and wavelet transform. This paper presents a new approach for rolling element bearings diagnosis without slip estimation, based on the wavelet packet decomposition (WPD and the Hilbert transform. Specifically, the Hilbert transform first extracts the envelope of the motor current signal, which contains bearings fault-related frequency information. Subsequently, the envelope signal is adaptively decomposed into a number of frequency bands by the WPD algorithm. Two criteria based on the energy and correlation analyses have been investigated to automate the frequency band selection. Experimental studies have confirmed that the proposed approach is effective in diagnosing rolling element bearing faults for improved induction motor condition monitoring and damage assessment.
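
The envelope-extraction step described above can be sketched with an FFT-based analytic signal (the standard Hilbert-transform construction, equivalent to `scipy.signal.hilbert`); a carrier amplitude-modulated at a fault frequency stands in for the motor current, and the envelope recovers the modulation. This is a generic illustration, not the paper's code:

```python
import numpy as np

def envelope(x):
    """Amplitude envelope via the analytic signal
    (FFT-based Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

# bearing-fault-like signal: carrier amplitude-modulated at a
# hypothetical 7 Hz defect frequency; the envelope isolates it
fs = 2000
t = np.arange(0, 1.0, 1 / fs)
fault_mod = 1.0 + 0.3 * np.cos(2 * np.pi * 7 * t)   # 7 Hz modulation
x = fault_mod * np.cos(2 * np.pi * 400 * t)          # 400 Hz carrier
env = envelope(x)
```

A spectrum of `env` would then show a peak at 7 Hz, which band selection (by WPD in the paper) makes easier to detect.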

  19. Validation of an automated solid-phase extraction method for the analysis of 23 opioids, cocaine, and metabolites in urine with ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Ramírez Fernández, María del Mar; Van Durme, Filip; Wille, Sarah M R; di Fazio, Vincent; Kummer, Natalie; Samyn, Nele

    2014-06-01

    The aim of this work was to automate a sample preparation procedure extracting morphine, hydromorphone, oxymorphone, norcodeine, codeine, dihydrocodeine, oxycodone, 6-monoacetyl-morphine, hydrocodone, ethylmorphine, benzoylecgonine, cocaine, cocaethylene, tramadol, meperidine, pentazocine, fentanyl, norfentanyl, buprenorphine, norbuprenorphine, propoxyphene, methadone and 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine from urine samples. Samples were extracted by solid-phase extraction (SPE) with cation-exchange cartridges using a TECAN Freedom Evo 100 base robotic system, including a hydrolysis step prior to extraction when required. Block modules were carefully selected in order to use the same consumable material as in manual procedures, reducing cost and/or manual sample transfers. Moreover, the present configuration included pressure-monitored pipetting, increasing pipetting accuracy and detecting sampling errors. The compounds were then separated in a chromatographic run of 9 min using a BEH Phenyl analytical column on an ultra-performance liquid chromatography-tandem mass spectrometry system. Optimization of the SPE was performed with different wash conditions and elution solvents. Intra- and inter-day relative standard deviations (RSDs) were within ±15% and bias was within ±15% for most of the compounds. Recovery was >69% (RSD heroin, buprenorphine and methadone, offering fast and reliable results. Automation resulted in improved precision and accuracy, and minimal operator intervention, leading to safer sample handling and less time-consuming procedures. PMID:24790061
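
The bias and RSD figures quoted in validation studies like this one follow a standard computation over replicate QC measurements; a minimal sketch with hypothetical replicate values at a 50 ng/mL nominal level:

```python
from statistics import mean, stdev

def bias_and_rsd(measured, nominal):
    """Accuracy (bias, %) and precision (RSD, %) of replicate
    QC measurements against a nominal concentration."""
    m = mean(measured)
    bias_pct = 100.0 * (m - nominal) / nominal
    rsd_pct = 100.0 * stdev(measured) / m  # sample standard deviation
    return bias_pct, rsd_pct

# hypothetical QC replicates at a 50 ng/mL nominal level
bias, rsd = bias_and_rsd([48.0, 52.0, 50.0, 49.0, 51.0], 50.0)
```

Both values would then be checked against acceptance criteria such as the ±15% cited in the abstract.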

  20. Method of Fault Area & Section Location for Non-solidly Earthed Distribution System

    Institute of Scientific and Technical Information of China (English)

    ZHENG Guping; JIANG Chao; LI Gang; QI Zheng; YANG Yihan

    2012-01-01

    Medium-voltage distribution networks in China mainly use overhead lines, and most of them are small-current (non-solidly earthed) systems, in which single-phase-to-earth faults account for over 80% of all failures in the power grid. Fault monitoring is one of the main functions of distribution automation, so the new generation of power distribution automation systems in China should thoroughly solve the problem of locating small-current grounding faults.

  1. Automated on-line solid phase extraction coupled to HPLC-APCI-MS detection as a versatile tool for the analysis of phenols in water samples

    International Nuclear Information System (INIS)

    In this work a liquid chromatography-atmospheric pressure chemical ionization-mass spectrometry (HPLC-APCI-MS) technique was developed for the determination of phenols and anilines in waste water samples. All relevant parameters were optimized for liquid chromatographic (LC) separation and mass spectrometric (MS) detection. Mass spectrometric detection was used in either negative ionization (NI) or positive ionization (PI) mode, which was depending on the physicochemical properties of the analyte. For screening analysis, full scan mode (SCAN) was used, while selected ion monitoring (SIM) mode of acquisition was used for maximum sensitivity. The optimal interface parameters and solvent compositions were evaluated, which mainly determined the ionization of analytes thus strongly influencing the sensitivity. The quasi-molecular ions were the most abundant signals both for phenols ([M-H]- in NI) and for anilines ([M+H]+ in PI). In general, fragmentation was hardly observed for one-ring phenols. Only fragmentation due to neutral losses of NO, HCl, NH3, CO2, CHO or CO from the functional groups were obtained via collision induced dissociation (CID) in a single quadrupole mass spectrometer. A further source of structural information was the relative intensity of positive and negative ions for one analyte: Only in the case of para-methyl substituted phenols, detection was also possible in positive ionization mode with reasonable sensitivity. In contrast to the phenols, anilines offered somewhat higher structural information due to increased fragmentation through CID, when detected in the positive ionization mode. The main goal of this work was the development of a method for the determination of US EPA priority phenols in different environmental matrices. As highest sensitivity and selectivity was required for this task, a preconcentrating step was necessary, and consequently solid phase extraction (SPE) was coupled on-line to HPLC-APCI-MS. 
The optimized method allowed the

  2. Interactive Fault Localization Using Test Information

    Institute of Scientific and Technical Information of China (English)

    Dan Hao; Lu Zhang; Tao Xie; Hong Mei; Jia-Su Sun

    2009-01-01

    Debugging is a time-consuming task in software development. Although various automated approaches have been proposed, they are not effective enough. On the other hand, in manual debugging, developers have difficulty in choosing breakpoints. To address these problems and help developers locate faults effectively, we propose an interactive fault-localization framework, combining the benefits of automated approaches and manual debugging. Before the fault is found, this framework continuously recommends checking points based on statements' suspicions, which are calculated according to the execution information of test cases and the feedback information from the developer at earlier checking points. Then we propose a naive approach, which is an initial implementation of this framework. However, with this naive approach or manual debugging, developers' wrong estimation of whether the faulty statement is executed before the checking point (breakpoint) may make the debugging process fail. So we propose another, robust approach based on this framework, handling cases where developers make mistakes during the fault-localization process. We performed two experimental studies and the results show that the two interactive approaches are quite effective compared with existing fault-localization approaches. Moreover, the robust approach can help developers find faults when they make wrong estimations at some checking points.
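
The statement suspicions mentioned above are computed from test-execution spectra. As an illustrative stand-in (the paper's own metric differs), the classic Tarantula formula ranks statements by how disproportionately failing tests cover them:

```python
def tarantula(stmt_coverage, test_results):
    """Spectrum-based suspiciousness per statement:
    (failed_cov/total_failed) /
    (failed_cov/total_failed + passed_cov/total_passed)."""
    total_failed = sum(1 for r in test_results if not r)
    total_passed = len(test_results) - total_failed
    scores = {}
    for stmt, covered in stmt_coverage.items():
        f = sum(1 for c, r in zip(covered, test_results) if c and not r)
        p = sum(1 for c, r in zip(covered, test_results) if c and r)
        fr = f / total_failed if total_failed else 0.0
        pr = p / total_passed if total_passed else 0.0
        scores[stmt] = fr / (fr + pr) if (fr + pr) else 0.0
    return scores

# 4 tests (True = pass); statement s2 is covered only by the failure
results = [True, True, True, False]
coverage = {"s1": [1, 1, 1, 1], "s2": [0, 0, 0, 1], "s3": [1, 1, 0, 0]}
scores = tarantula(coverage, results)
```

Statements with the highest scores would be recommended first as checking points; developer feedback at each point then reweights the remaining candidates.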

  3. Automated diagnosis of rolling bearings using MRA and neural networks

    Science.gov (United States)

    Castejón, C.; Lara, O.; García-Prada, J. C.

    2010-01-01

    Any industry needs an efficient predictive plan in order to optimize the management of resources and improve the economy of the plant by reducing unnecessary costs and increasing the level of safety. A great percentage of breakdowns in productive processes are caused by bearings. They begin to deteriorate from early stages of their functional life, also called the incipient level. This manuscript develops an automated diagnosis of rolling bearings based on the analysis and classification of signature vibrations. The novelty of this work is the application of the methodology proposed for data collected from a quasi-real industrial machine, where rolling bearings support the radial and axial loads the bearings are designed for. Multiresolution analysis (MRA) is used in a first stage in order to extract the most interesting features from signals. Features will be used in a second stage as inputs of a supervised neural network (NN) for classification purposes. Experimental results carried out in a real system show the soundness of the method which detects four bearing conditions (normal, inner race fault, outer race fault and ball fault) in a very incipient stage.
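
The MRA feature-extraction stage can be sketched with a one-level orthonormal Haar transform applied recursively; the relative energy in each band is a typical feature vector fed to a classifier. This is a generic sketch of the technique, not the authors' exact wavelet or feature set:

```python
import numpy as np

def haar_level(x):
    """One level of the orthonormal Haar transform:
    approximation and detail coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def mra_energy_features(x, levels=3):
    """Relative energy per detail band plus the final approximation,
    usable as inputs to a neural-network classifier."""
    total = float(np.sum(x ** 2))
    feats = []
    a = x
    for _ in range(levels):
        a, d = haar_level(a)
        feats.append(float(np.sum(d ** 2)) / total)
    feats.append(float(np.sum(a ** 2)) / total)
    return feats

rng = np.random.default_rng(0)
sig = rng.standard_normal(1024)  # stand-in for a vibration record
feats = mra_energy_features(sig, levels=3)
```

Because the Haar transform is orthonormal, the relative energies sum to one, so the features are already normalized across records of different amplitude.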

  4. Machine Fault Signature Analysis

    Directory of Open Access Journals (Sweden)

    K. B. Mulchandani

    2008-03-01

    Full Text Available The objective of this paper is to present recent developments in the field of machine fault signature analysis with particular regard to vibration analysis. The different types of faults that can be identified from the vibration signature analysis are, for example, gear fault, rolling contact bearing fault, journal bearing fault, flexible coupling faults, and electrical machine fault. It is not the intention of the authors to attempt to provide a detailed coverage of all the faults while detailed consideration is given to the subject of the rolling element bearing fault signature analysis.

  5. An Integrated Framework of Drivetrain Degradation Assessment and Fault Localization for Offshore Wind Turbines

    Directory of Open Access Journals (Sweden)

    Jay Lee

    2013-01-01

    Full Text Available As wind energy proliferates in onshore and offshore applications, it has become significantly important to predict wind turbine downtime and maintain operation uptime to ensure maximal yield. Two types of data systems have been widely adopted for monitoring turbine health condition: supervisory control and data acquisition (SCADA and condition monitoring system (CMS. Provided that research and development have focused on advancing analytical techniques based on these systems independently, an intelligent model that associates information from both systems is necessary and beneficial. In this paper, a systematic framework is designed to integrate CMS and SCADA data and assess drivetrain degradation over its lifecycle. Information reference and advanced feature extraction techniques are employed to procure heterogeneous health indicators. A pattern recognition algorithm is used to model baseline behavior and measure deviation of current behavior, where a Self-organizing Map (SOM and minimum quantization error (MQE method is selected to achieve degradation assessment. Eventually, the computation and ranking of component contribution to the detected degradation offers component-level fault localization. When validated and automated by various applications, the approach is able to incorporate diverse data resources and output actionable information to advise predictive maintenance with precise fault information. The approach is validated on a 3 MW offshore turbine, where an incipient fault is detected well before existing system shuts down the unit. A radar chart is used to illustrate the fault localization result.
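
The MQE health indicator named above reduces to the distance between a current feature vector and its best-matching unit in a baseline SOM codebook; a minimal numpy sketch with a hypothetical codebook:

```python
import numpy as np

def mqe(sample, codebook):
    """Minimum quantization error: distance from a feature vector
    to its best-matching unit in the baseline SOM codebook."""
    dists = np.linalg.norm(codebook - sample, axis=1)
    return float(dists.min())

# hypothetical codebook (e.g., SOM weights trained on healthy data)
codebook = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
healthy = np.array([0.05, 0.0])    # near a learned healthy state
degraded = np.array([3.0, 3.0])    # far from all healthy states
```

A rising MQE trend over time indicates drift from the healthy baseline; component-wise contributions to the winning distance give the fault-localization ranking described in the abstract.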

  6. Automated diagnostics of electronic systems

    OpenAIRE

    Drees, Arto

    2014-01-01

    This thesis was commissioned by Nokia Networks as a part of a wider ongoing quality project. The main objective for this thesis and the quality project was to improve diagnostic accuracy on a certain base station product by targeting the most misdiagnosed faults and to reduce unnecessary component replacement. To achieve the objective, an early version of an automated diagnostic tool was developed. The tool was developed using Microsoft Visual Studio and C# programming language. The tool ...

  7. Fault-tolerant building-block computer study

    Science.gov (United States)

    Rennels, D. A.

    1978-01-01

    Ultra-reliable core computers are required for improving the reliability of complex military systems. Such computers can provide reliable fault diagnosis, failure circumvention, and, in some cases serve as an automated repairman for their host systems. A small set of building-block circuits which can be implemented as single very large integration devices, and which can be used with off-the-shelf microprocessors and memories to build self checking computer modules (SCCM) is described. Each SCCM is a microcomputer which is capable of detecting its own faults during normal operation and is described to communicate with other identical modules over one or more Mil Standard 1553A buses. Several SCCMs can be connected into a network with backup spares to provide fault-tolerant operation, i.e. automated recovery from faults. Alternative fault-tolerant SCCM configurations are discussed along with the cost and reliability associated with their implementation.
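
Fault-tolerant operation with backup spares typically rests on redundancy and voting. As a minimal sketch of the general idea (not the SCCM design itself), a bitwise 2-of-3 majority vote masks a single faulty copy of a data word:

```python
def vote(a, b, c):
    """Bitwise 2-of-3 majority vote: each output bit follows the
    value held by at least two of the three inputs, masking any
    single faulty copy."""
    return (a & b) | (a & c) | (b & c)

# one corrupted copy is outvoted by the two good ones
word = 0b1011
corrupted = 0b0001
voted = vote(word, word, corrupted)
```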

  8. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual effort in library routines, supporting collection management, storage, administration, processing, preservation, communication, etc.

  9. NETWORK FAULT DIAGNOSIS USING DATA MINING CLASSIFIERS

    Directory of Open Access Journals (Sweden)

    Eleni Rozaki

    2015-04-01

    Full Text Available Mobile networks are under more pressure than ever before because of the increasing number of smartphone users and the number of people relying on mobile data networks. With larger numbers of users, the issue of service quality has become more important for network operators. Identifying faults in mobile networks that reduce the quality of service must be found within minutes so that problems can be addressed and networks returned to optimised performance. In this paper, a method of automated fault diagnosis is presented using decision trees, rules and Bayesian classifiers for visualization of network faults. Using data mining techniques the model classifies optimisation criteria based on the key performance indicators metrics to identify network faults supporting the most efficient optimisation decisions. The goal is to help wireless providers to localize the key performance indicator alarms and determine which Quality of Service factors should be addressed first and at which locations.
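
A learned decision tree over KPI metrics ultimately executes as nested threshold rules. The toy rule set below illustrates that shape only; the KPI names and thresholds are hypothetical, not taken from the paper:

```python
def classify_kpi(kpi):
    """Toy threshold rules in the shape of a learned decision tree.
    KPI names and cut-offs are illustrative assumptions."""
    if kpi["drop_rate"] > 2.0:          # dropped-call rate, %
        return "congestion_fault"
    if kpi["rssi_dbm"] < -105:          # received signal strength
        return "coverage_fault"
    if kpi["handover_success"] < 0.9:   # handover success ratio
        return "handover_fault"
    return "normal"

alarms = [
    {"drop_rate": 3.1, "rssi_dbm": -90, "handover_success": 0.97},
    {"drop_rate": 0.4, "rssi_dbm": -110, "handover_success": 0.95},
    {"drop_rate": 0.3, "rssi_dbm": -80, "handover_success": 0.99},
]
labels = [classify_kpi(k) for k in alarms]
```

In the paper's setting, the tree structure and thresholds would be induced from labelled KPI data rather than written by hand, and the resulting labels prioritized by location.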

  10. Process automation

    International Nuclear Information System (INIS)

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  11. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck;

    2013-01-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and short ultra-high-performance liquid chromatography–tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids......, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96 % of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples......-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C18 column using a 6.5 min 0.1 % ammonia (25...

  12. Fault slip distribution and fault roughness

    Science.gov (United States)

    Candela, Thibault; Renard, François; Schmittbuhl, Jean; Bouchon, Michel; Brodsky, Emily E.

    2011-11-01

    We present analysis of the spatial correlations of seismological slip maps and fault topography roughness, illuminating their identical self-affine exponent. Though the complexity of the coseismic spatial slip distribution can be intuitively associated with geometrical or stress heterogeneities along the fault surface, this has never been demonstrated. Based on new measurements of fault surface topography and on statistical analyses of kinematic inversions of slip maps, we propose a model, which quantitatively characterizes the link between slip distribution and fault surface roughness. Our approach can be divided into two complementary steps: (i) Using a numerical computation, we estimate the influence of fault roughness on the frictional strength (pre-stress). We model a fault as a rough interface where elastic asperities are squeezed. The Hurst exponent ?, characterizing the self-affinity of the frictional strength field, approaches ?, where ? is the roughness exponent of the fault surface in the direction of slip. (ii) Using a quasi-static model of fault propagation, which includes the effect of long-range elastic interactions and spatial correlations in the frictional strength, the spatial slip correlation is observed to scale as ?, where ? represents the Hurst exponent of the slip distribution. Under the assumption that the origin of the spatial fluctuations in frictional strength along faults is the elastic squeeze of fault asperities, we show that self-affine geometrical properties of fault surface roughness control slip correlations and that ?. Given that ? for a wide range of faults (various accumulated displacement, host rock and slip movement), we predict that ?. Even if our quasi-static fault model is more relevant for creeping faults, the spatial slip correlations observed are consistent with those of seismological slip maps. A consequence is that the self-affinity property of slip roughness may be explained by fault geometry without considering

  13. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou

    2013-03-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and short ultra-high-performance liquid chromatography-tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1 % ammonia (25 %) in water/0.1 % ammonia (25 %) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96 % of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting. PMID:23292043

  14. Preliminaries of probabilistic hierarchical fault detection

    Czech Academy of Sciences Publication Activity Database

    Jirsa, Ladislav; Pavelková, Lenka; Dedecius, Kamil

    Prague: Institute of Information Theory and Automation, 2013 - (Guy, T.; Kárný, M.) ISBN 978-80-903834-8-7. [The 3rd International Workshop on Scalable Decision Making: Uncertainty, Imperfection, Deliberation held in conjunction with ECML/PKDD 2013. Prague (CZ), 23.09.2013-23.09.2013] R&D Projects: GA MŠk 7D12004; GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Fault detection * FDI * probabilistic logic * system health Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2013/AS/jirsa-preliminaries of probabilistic hierarchical fault detection.pdf

  15. Evaluating Fault Management Operations Concepts for Next-Generation Spacecraft: What Eye Movements Tell Us

    Science.gov (United States)

    Hayashi, Miwa; Ravinder, Ujwala; McCann, Robert S.; Beutter, Brent; Spirkovska, Lily

    2009-01-01

    Performance enhancements associated with selected forms of automation were quantified in a recent human-in-the-loop evaluation of two candidate operational concepts for fault management on next-generation spacecraft. The baseline concept, called Elsie, featured a full suite of "soft" fault management interfaces. However, operators were forced to diagnose malfunctions with minimal assistance from the standalone caution and warning (C&W) system. The other concept, called Besi, incorporated a more capable C&W system with an automated fault diagnosis capability. Results from analyses of participants' eye movements indicate that the greatest empirical benefit of the automation stemmed from eliminating the need for text processing on cluttered, text-rich displays.

  16. TecLines: A MATLAB-Based Toolbox for Tectonic Lineament Analysis from Satellite Images and DEMs, Part 1: Line Segment Detection and Extraction

    OpenAIRE

    Mehdi Rahnama; Richard Gloaguen

    2014-01-01

    Geological structures, such as faults and fractures, appear as image discontinuities or lineaments in remote sensing data. Geologic lineament mapping is a very important issue in geo-engineering, especially for construction site selection, seismic, and risk assessment, mineral exploration and hydrogeological research. Classical methods of lineaments extraction are based on semi-automated (or visual) interpretation of optical data and digital elevation models. We developed a freely available M...

  17. Research on the Fault Coefficient in Complex Electrical Engineering

    Directory of Open Access Journals (Sweden)

    Yi Sun

    2015-08-01

    Full Text Available Fault detection and isolation in complex systems are research hotspots and frontier problems in the reliability engineering field. Fault identification can be regarded as a procedure of excavating key characteristics from massive failure data, then classifying and identifying fault samples. In this paper, based on the fundamentals of fault-coefficient feature extraction, we discuss the fault coefficient feature in complex electrical engineering in detail. For general fault types in a complex power system, even in the presence of strong white Gaussian stochastic interference, the fault coefficient feature remains accurate and reliable. The results of a comparative analysis of noise influence also demonstrate the strong anti-interference ability and great redundancy of the fault coefficient feature in complex electrical engineering.

  18. Space Station Initial Operational Concept (IOC) operations and safety view - Automation and robotics for Space Station

    Science.gov (United States)

    Bates, William V., Jr.

    1989-01-01

    The automation and robotics requirements for the Space Station Initial Operational Concept (IOC) are discussed. The number of tasks to be performed by an eight-person crew, the need for an automated or directed fault analysis capability, and ground support requirements are considered. Issues important in determining the role of automation for the IOC are listed.

  19. Quantification of five compounds with heterogeneous physicochemical properties (morphine, 6-monoacetylmorphine, cyamemazine, meprobamate and caffeine) in 11 fluids and tissues, using automated solid-phase extraction and gas chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Bévalot, Fabien; Bottinelli, Charline; Cartiser, Nathalie; Fanton, Laurent; Guitton, Jérôme

    2014-06-01

    An automated solid-phase extraction (SPE) protocol followed by gas chromatography coupled with tandem mass spectrometry was developed for quantification of caffeine, cyamemazine, meprobamate, morphine and 6-monoacetylmorphine (6-MAM) in 11 biological matrices [blood, urine, bile, vitreous humor, liver, kidney, lung, skeletal muscle, brain, adipose tissue and bone marrow (BM)]. The assay was validated for linearity, within- and between-day precision and accuracy, limits of quantification, selectivity, extraction recovery (ER), sample dilution and autosampler stability on BM. For the other matrices, partial validation was performed (limits of quantification, linearity, within-day precision, accuracy, selectivity and ER). The lower limits of quantification were 12.5 ng/mL (ng/g) for 6-MAM, morphine and cyamemazine, 100 ng/mL (ng/g) for meprobamate and 50 ng/mL (ng/g) for caffeine. Analysis of real-case samples demonstrated the performance of the assay in forensic toxicology for investigating challenging cases in which, for example, blood is not available or analysis of alternative matrices could be relevant. The SPE protocol was also assessed as an extraction procedure that could target other relevant analytes of interest. The extraction procedure was applied to 12 molecules of forensic interest with various physicochemical properties (alimemazine, alprazolam, amitriptyline, citalopram, cocaine, diazepam, levomepromazine, nordazepam, tramadol, venlafaxine, pentobarbital and phenobarbital). All drugs could be detected at therapeutic concentrations in blood and in the alternative matrices. PMID:24790060

  20. Operator Performance Evaluation of Fault Management Interfaces for Next-Generation Spacecraft

    Science.gov (United States)

    Hayashi, Miwa; Ravinder, Ujwala; Beutter, Brent; McCann, Robert S.; Spirkovska, Lilly; Renema, Fritz

    2008-01-01

    In the cockpit of NASA's next generation of spacecraft, most vehicle commanding will be carried out via electronic interfaces instead of hard cockpit switches. Checklists will also be displayed and completed on electronic procedure viewers rather than on paper. Transitioning to electronic cockpit interfaces opens up opportunities for more automated assistance, including automated root-cause diagnosis capability. The paper reports an empirical study evaluating two potential concepts for fault management interfaces incorporating two different levels of automation. The operator performance benefits produced by automation were assessed. Also, some design recommendations for spacecraft fault management interfaces are discussed.

  1. Optimal solutions for protection, control and automation in hydroelectric power systems

    International Nuclear Information System (INIS)

    Fault statistics and a poll of electricity network companies show that incorrect operation of protection, control and automation equipment contributes substantially to undelivered energy. Yet there is little focus on performing fault analyses and registering such faults in FASIT (a Norwegian system for registration of faults and interruptions). This is especially true of the 1-22 kV distribution network, which is where the potential for reducing undelivered energy by introducing various automatic means is greatest

  2. Summary: beyond fault trees to fault graphs

    International Nuclear Information System (INIS)

    Fault Graphs are the natural evolutionary step over a traditional fault-tree model. A Fault Graph is a failure-oriented directed graph with logic connectives that allows cycles. We intentionally construct the Fault Graph to trace the piping and instrumentation drawing (P and ID) of the system, but with logical AND and OR conditions added. Then we evaluate the Fault Graph with computer codes based on graph-theoretic methods. Fault Graph computer codes are based on graph concepts, such as path set (a set of nodes traveled on a path from one node to another) and reachability (the complete set of all possible paths between any two nodes). These codes are used to find the cut-sets (any minimal set of component failures that will fail the system) and to evaluate the system reliability
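
    The path-set and cut-set concepts described above can be illustrated with a minimal sketch (this is not code from the cited work; the graph, component names and brute-force search are purely illustrative): the minimal cut sets of a success graph are the minimal sets of failed components that disconnect the source node from the sink node.

```python
from itertools import combinations

def connected(edges, src, dst, failed):
    """Depth-first search: can src still reach dst once 'failed' components are removed?"""
    frontier, seen = [src], {src}
    while frontier:
        node = frontier.pop()
        if node == dst:
            return True
        for a, b in edges:
            if a == node and b not in failed and b not in seen:
                seen.add(b)
                frontier.append(b)
    return False

def minimal_cut_sets(components, edges, src, dst):
    """Brute force: minimal sets of failed components that disconnect src from dst."""
    cuts = []
    for r in range(1, len(components) + 1):
        for cand in map(set, combinations(components, r)):
            if connected(edges, src, dst, cand):
                continue
            if not any(c <= cand for c in cuts):  # keep only minimal sets
                cuts.append(cand)
    return cuts

# Two parallel pumps A and B feeding a single valve C: the system fails if C
# fails, or if both A and B fail.
edges = [("S", "A"), ("S", "B"), ("A", "C"), ("B", "C"), ("C", "T")]
cuts = minimal_cut_sets(["A", "B", "C"], edges, "S", "T")
```

    Real fault-graph codes use far more efficient reachability algorithms; the exhaustive search above only demonstrates the definition.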

  3. Programmable automation systems in PSA

    International Nuclear Information System (INIS)

    The Finnish safety authority (STUK) requires plant-specific PSAs, and quantitative safety goals are set at different levels. Reliability analysis is more problematic when critical safety functions are realized by programmable automation systems: conventional modeling techniques do not necessarily apply to the analysis of these systems, and quantification seems to be impossible. However, it is important to analyze the contribution of programmable automation systems to plant safety, and PSA is the only method offering a system-analytical view of safety. This report discusses the applicability of PSA methodology (fault tree analyses, failure modes and effects analyses) to the analysis of programmable automation systems, including the problem of how to decompose such systems for reliability modeling purposes. In addition to the qualitative analysis and structural reliability modeling issues, the possibility of evaluating failure probabilities of programmable automation systems is considered. One solution to the quantification issue is the use of expert judgements; the principles for applying expert judgements are discussed and a framework for applying them is outlined. Further, the impacts of subjective estimates on the interpretation of PSA results are discussed. (orig.) (13 refs.)

  4. Fault tree handbook

    International Nuclear Information System (INIS)

    This handbook describes a methodology for reliability analysis of complex systems such as those which comprise the engineered safety features of nuclear power generating stations. After an initial overview of the available system analysis approaches, the handbook focuses on a description of the deductive method known as fault tree analysis. The following aspects of fault tree analysis are covered: basic concepts for fault tree analysis; basic elements of a fault tree; fault tree construction; probability, statistics, and Boolean algebra for the fault tree analyst; qualitative and quantitative fault tree evaluation techniques; and computer codes for fault tree evaluation. Also discussed are several example problems illustrating the basic concepts of fault tree construction and evaluation
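
    As a hedged illustration of the basic concepts the handbook covers (gates, basic events, quantitative evaluation), and not an excerpt from the handbook itself, a small fault tree of AND/OR gates can be evaluated exactly by enumerating basic-event states:

```python
from itertools import product

def evaluate(node, state):
    """Evaluate a fault-tree node for one truth assignment of basic events."""
    if isinstance(node, str):          # leaf: a basic event name
        return state[node]
    op, *children = node               # gate: ("AND", ...) or ("OR", ...)
    values = [evaluate(c, state) for c in children]
    return all(values) if op == "AND" else any(values)

def top_event_probability(tree, probs):
    """Exact top-event probability by full state enumeration. Feasible only for
    small trees; production codes use minimal cut sets or decision diagrams."""
    events = sorted(probs)
    total = 0.0
    for states in product([False, True], repeat=len(events)):
        assignment = dict(zip(events, states))
        if evaluate(tree, assignment):
            weight = 1.0
            for e in events:
                weight *= probs[e] if assignment[e] else 1.0 - probs[e]
            total += weight
    return total

# TOP = A OR (B AND C): P = 0.1 + 0.2*0.3 - 0.1*0.2*0.3 = 0.154
tree = ("OR", "A", ("AND", "B", "C"))
p_top = top_event_probability(tree, {"A": 0.1, "B": 0.2, "C": 0.3})
```

    The nested-tuple encoding and event names are assumptions for the sketch; the inclusion-exclusion check in the comment confirms the enumeration.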

  5. Application of support vector machine based on pattern spectrum entropy in fault diagnostics of rolling element bearings

    International Nuclear Information System (INIS)

    This paper presents a novel pattern classification approach for fault diagnostics of rolling element bearings, which combines morphological multi-scale analysis with 'one to others' support vector machine (SVM) classifiers. The morphological pattern spectrum describes the shape characteristics of the inspected signal based on the morphological opening operation with multi-scale structuring elements. The pattern spectrum entropy and the barycenter scale location of the spectrum curve are extracted as feature vectors representing different faults of the bearing; these are more effective and representative than the kurtosis and the enveloping demodulation spectrum. The 'one to others' SVM algorithm is adopted to distinguish six kinds of fault signals measured on the experimental test rig under eight different working conditions. The recognition results of the SVM are more precise than those of an artificial neural network, even when training samples are few. The combination of morphological pattern spectrum parameters and the 'one to others' multi-class SVM algorithm is suitable for on-line automated fault diagnosis of rolling element bearings. This application is promising and well worth exploiting.
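
    The pattern spectrum entropy feature mentioned above can be sketched as follows. This is not the authors' implementation; the 1-D flat structuring element, toy signal and normalization are illustrative assumptions. PS(k) measures the signal area removed as the opening scale grows, and the entropy of the normalized spectrum summarizes the shape content.

```python
import math

def opening(signal, k):
    """Flat 1-D morphological opening: at each index, the maximum over all
    length-k windows containing it of that window's minimum."""
    n = len(signal)
    mins = [min(signal[j:j + k]) for j in range(n - k + 1)]
    return [max(mins[j] for j in range(max(0, i - k + 1), min(i, n - k) + 1))
            for i in range(n)]

def pattern_spectrum(signal, k_max):
    """PS(k) = area removed when the opening scale grows from k to k+1."""
    areas = [sum(opening(signal, k)) for k in range(1, k_max + 2)]
    return [areas[k] - areas[k + 1] for k in range(k_max)]

def spectrum_entropy(ps):
    """Shannon entropy of the normalized pattern spectrum."""
    total = sum(ps)
    probs = [v / total for v in ps if v > 0]
    return -sum(p * math.log(p) for p in probs)

# A width-2 pulse of height 5 and a width-3 pulse of height 7: the spectrum
# separates their contributions by scale.
sig = [0, 5, 5, 0, 0, 7, 7, 7, 0]
ps = pattern_spectrum(sig, 3)        # area removed at scales 1->2, 2->3, 3->4
entropy = spectrum_entropy(ps)
```

    Because opening is anti-extensive and decreasing in the structuring-element size, every PS(k) is non-negative and the spectrum behaves like a scale histogram.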

  6. Fault Tolerant Control Systems

    DEFF Research Database (Denmark)

    Bøgh, S.A.

    from this study highlights requirements for a dedicated software environment for fault tolerant control systems design. The second detailed study addressed the detection of a fault event and determination of the failed component. A variety of algorithms were compared, based on two fault scenarios in...... faults, but also that the research field still misses a systematic approach to handle realistic problems such as low sampling rate and nonlinear characteristics of the system. The thesis contributed with methods to detect both faults and specifically with a novel algorithm for the actuator fault...... detection that is superior in terms of performance and complexity to the other algorithms in the comparative study....

  7. Fault Tolerant Feedback Control

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, H.

    2001-01-01

    An architecture for fault tolerant feedback controllers based on the Youla parameterization is suggested. It is shown that the Youla parameterization will give a residual vector directly in connection with the fault diagnosis part of the fault tolerant feedback controller. It turns out that there...... is a separation between the feedback controller and the fault tolerant part. The closed loop feedback properties are handled by the nominal feedback controller and the fault tolerant part is handled by the design of the Youla parameter. The design of the fault tolerant part will not affect the...... design of the nominal feedback controller....

  8. Rapid and automated analysis of aflatoxin M1 in milk and dairy products by online solid phase extraction coupled to ultra-high-pressure-liquid-chromatography tandem mass spectrometry.

    Science.gov (United States)

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Pagano, Imma; Russo, Mariateresa; Rastrelli, Luca

    2016-01-01

    This study reports a fast and automated analytical procedure for the analysis of aflatoxin M1 (AFM1) in milk and dairy products. The method is based on simultaneous protein precipitation and AFM1 extraction by salt-induced liquid-liquid extraction (SI-LLE), followed by online solid-phase extraction (online SPE) coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for automatic pre-concentration, clean-up and sensitive, selective determination of AFM1. The main parameters affecting the extraction efficiency and accuracy of the analytical method were studied in detail. Under optimal conditions, acetonitrile and NaCl were used as the extraction/denaturant solvent and salting-out agent in SI-LLE, respectively. After centrifugation, the organic phase (acetonitrile) was diluted with water (1:9 v/v) and purified (1 mL) on an online C18 cartridge coupled with a UHPLC column. Finally, selected reaction monitoring (SRM) acquisition mode was applied to the detection of AFM1. Validation studies were carried out on different dairy products (whole and skimmed cow milk, yogurt, goat milk, and powdered infant formula), providing method quantification limits about 25 times lower than the AFM1 maximum levels permitted by EU regulation 1881/2006 in milk and dairy products for direct human consumption. Recoveries (86-102%) and repeatability (RSD [...]) were satisfactory, and no significant matrix effects were observed in the different milk and dairy products studied. The proposed method improves the performance of AFM1 analysis in milk samples, as AFM1 determination is performed with a higher degree of accuracy than conventional methods. Other advantages are the reduced sample preparation, time and cost of the analysis, enabling a high sample throughput that meets current concerns of food safety and public health protection. PMID:26589945

  9. Automated underwater object classification using optical imagery

    OpenAIRE

    Shihavuddin, A.S.M.

    2014-01-01

    This thesis addresses the problem of automated underwater optical image characterization. Remote underwater optical sensing allows the collection and storage of vast amounts of data for which manual classification may take months. Supervised automated classification of such datasets can save time and resources and can also enable extraction of valuable information related to marine and geological research.

  10. Automated extraction of DNA from reference samples from various types of biological materials on the Qiagen BioRobot EZ1 Workstation

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Jørgensen, Mads; Hansen, Anders Johannes;

    2009-01-01

    We have validated and implemented a protocol for DNA extraction from various types of biological materials using a Qiagen BioRobot EZ1 Workstation. The sample materials included whole blood, blood from deceased, buccal cells on Omni swabs and FTA Cards, blood on FTA Cards and cotton swabs, and...... muscle biopsies. The DNA extraction was validated according to EN/ISO 17025 for the STR kits AmpFlSTR® Identifiler® and AmpFlSTR® Yfiler® (Applied Biosystems). Of 298 samples extracted, 11 (4%) did not yield acceptable results. In conclusion, we have demonstrated that extraction of DNA from various types...... of biological material can be performed quickly and without the use of hazardous chemicals, and that the DNA may be successfully STR typed according to the requirements of forensic genetic investigations accredited according to EN/ISO 17025...

  11. An automatic fault management model for distribution networks

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Haenninen, S. [VTT Energy, Espoo (Finland); Seppaenen, M. [North-Carelian Power Co (Finland); Antila, E.; Markkila, E. [ABB Transmit Oy (Finland)

    1998-08-01

    An automatic computer model, called the FI/FL-model, for fault location, fault isolation and supply restoration is presented. The model works as an integrated part of the substation SCADA, the AM/FM/GIS system and the medium voltage distribution network automation systems. In the model, three different techniques are used for fault location. First, by comparing the measured fault current to the computed one, an estimate for the fault distance is obtained. This information is then combined with the data obtained from the fault indicators in the line branching points in order to find the actual fault point. As a third technique, in the absence of better fault location data, statistical information on line section fault frequencies can also be used. Fuzzy logic is used to combine the different fault location information, yielding probability weights for the fault being located in different line sections. Once the faulty section is identified, it is automatically isolated by remote control of line switches, and the supply is then restored to the remaining parts of the network. If needed, reserve connections from other adjacent feeders can also be used. During the restoration process, the technical constraints of the network are checked, among them the load-carrying capacity of line sections, voltage drop and the settings of relay protection. If there are several possible network topologies, the model selects the technically best alternative. The FI/FL-model has been in trial use at two substations of the North-Carelian Power Company since November 1996. This chapter lists the practical experiences during the test use period. Also the benefits of this kind of automation are assessed and future developments are outlined
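
    The fuzzy combination of the three fault-location evidence sources can be sketched as below. This is not the FI/FL-model's actual fusion rule; the product t-norm, section names and membership grades are illustrative assumptions only.

```python
def combine_fault_evidence(memberships):
    """memberships maps each line section to its membership grades from the
    three evidence sources (fault-distance estimate, fault indicators,
    section fault statistics). Grades are fused with the algebraic-product
    t-norm and normalized into probability weights."""
    fused = {}
    for section, grades in memberships.items():
        weight = 1.0
        for g in grades:
            weight *= g
        fused[section] = weight
    total = sum(fused.values())
    return {s: (w / total if total else 0.0) for s, w in fused.items()}

# Section S1 is supported by all three evidence sources, so it should
# receive the largest probability weight.
weights = combine_fault_evidence({
    "S1": [0.8, 0.9, 0.5],
    "S2": [0.6, 0.2, 0.5],
    "S3": [0.1, 0.1, 0.5],
})
```

    Other t-norms (minimum, bounded product) would be equally valid fusion choices; the normalization step is what turns fused grades into the per-section probability weights the abstract describes.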

  12. Fault zone hydrogeology

    Science.gov (United States)

    Bense, V. F.; Gleeson, T.; Loveless, S. E.; Bour, O.; Scibek, J.

    2013-12-01

    Deformation along faults in the shallow crust strongly influences fluid flow, yet evaluating the impact of faults on fluid flow patterns remains a challenge and requires a multidisciplinary research effort of structural geologists and hydrogeologists. However, we find that these disciplines often use different methods with little interaction between them. In this review, we document the current multi-disciplinary understanding of fault zone hydrogeology. We discuss surface and subsurface observations from diverse rock types, from unlithified and lithified clastic sediments through to carbonate, crystalline, and volcanic rocks. For each rock type, we evaluate geological deformation mechanisms, hydrogeologic observations and conceptual models of fault zone hydrogeology. Outcrop observations indicate that fault zones commonly have a permeability structure suggesting they should act as complex conduit-barrier systems in which along-fault flow is encouraged and across-fault flow is impeded. Hydrogeological observations of fault zones reported in the literature show a broad qualitative agreement with outcrop-based conceptual models of fault zone hydrogeology. Nevertheless, the specific impact of a particular fault permeability structure on fault zone hydrogeology can only be assessed when the hydrogeological context of the fault zone is considered, not from outcrop observations alone. To gain a more integrated, comprehensive understanding of fault zone hydrogeology, we foresee numerous synergistic opportunities and challenges for the disciplines of structural geology and hydrogeology to co-evolve and address remaining challenges by co-locating study areas, sharing approaches and fusing data, developing conceptual models from hydrogeologic data, numerical modeling, and training interdisciplinary scientists.

  13. Graphical User Interface Aided Online Fault Diagnosis of Electric Motor - DC motor case study

    Directory of Open Access Journals (Sweden)

    POSTALCIOGLU OZGEN, S.

    2009-10-01

    This paper presents a graphical user interface (GUI) aided online fault diagnosis system for a DC motor. The aim of the research is to prevent system faults. The fault diagnosis design has two main levels: Level 1 comprises a traditional control loop; Level 2 contains knowledge-based fault diagnosis. The fault diagnosis technique contains a feature extraction module, a feature cluster module and a fault decision module. Wavelet analysis is used for the feature extraction module, and fuzzy clustering is applied for the feature cluster module. Fault effects on the system are examined using statistical analysis. The technique provides fault detection, identification and halting of the system. When a fault is detected, the GUI opens and shows the measurement value, fault time and fault type, giving the personnel information about the system. As seen from the simulation results, faults can be detected and identified as soon as they appear. In summary, if the system has a fault diagnosis structure, dangerous situations can be avoided.

  14. Planetary Gearbox Fault Diagnosis Using Envelope Manifold Demodulation

    Directory of Open Access Journals (Sweden)

    Weigang Wen

    2016-01-01

    The key issue in planetary gear fault diagnosis is to extract dependable fault characteristics from the noisy vibration signal of the planetary gearbox. To address this critical problem, an envelope manifold demodulation method is proposed for planetary gear fault detection. The method combines complex wavelets, manifold learning, and frequency spectrograms to implement planetary gear fault characteristic extraction. The vibration signal of the planetary gear is demodulated by wavelet enveloping, and the envelope energy is adopted as an indicator to select the meshing frequency band. Manifold learning is utilized to reduce the effect of noise within the meshing frequency band, and the fault characteristic frequency of the planetary gear is shown by the spectrogram. A planetary gearbox model and test rig were established and experiments with planet gear faults were conducted for verification. All experimental results demonstrate the method's effectiveness and reliability.

  15. Faults, Images

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Through the study of faults and their effects, much can be learned about the size and recurrence intervals of earthquakes. Faults also teach us about crustal...

  16. Fault detection and isolation in systems with parametric faults

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, Hans Henrik

    1999-01-01

    The problem of fault detection and isolation of parametric faults is considered in this paper. Parametric faults are associated with internal parameter variations in the dynamical system. A fault detection and isolation method for parametric faults is formulated...

  17. Detection of Fault Location in Transmission Lines using Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Shilpi Sahu

    2013-09-01

    This paper presents a technique to detect the location of different faults on transmission lines for quick and reliable operation of protection schemes. A simulation is developed in MATLAB to generate the fundamental component of the transient voltage and current simultaneously in both the time and frequency domains. One cycle of the waveform, covering pre-fault and post-fault information, is extracted for analysis. The discrete wavelet transform (DWT) is used for data preprocessing and is applied to decompose the fault transients, because of its ability to extract information from the transient signal in both the time and frequency domains simultaneously. MATLAB software is used to simulate different operating and fault conditions on a high voltage transmission line, namely single phase to ground fault, line to line fault, double line to ground fault and three phase short circuit.
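
    The core idea, that wavelet detail coefficients localize a fault transient in time, can be shown with a minimal single-level Haar DWT sketch. This is not the paper's MATLAB implementation; the synthetic waveform and the jump position are illustrative assumptions.

```python
import math

def haar_dwt(signal):
    """Single-level Haar DWT: pairwise averages (approximation) and
    pairwise differences (detail), each scaled by 1/sqrt(2)."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        approx.append((signal[i] + signal[i + 1]) / math.sqrt(2))
        detail.append((signal[i] - signal[i + 1]) / math.sqrt(2))
    return approx, detail

def locate_transient(signal):
    """Sample index of the largest-magnitude detail coefficient, where an
    abrupt change such as a fault inception shows up."""
    _, detail = haar_dwt(signal)
    k = max(range(len(detail)), key=lambda i: abs(detail[i]))
    return 2 * k

# A smooth waveform with an abrupt amplitude jump at sample 41: the detail
# coefficient covering samples 40-41 dominates all others.
sig = [math.sin(0.2 * i) + (5.0 if i >= 41 else 0.0) for i in range(64)]
fault_index = locate_transient(sig)
```

    Real protection schemes use deeper decompositions and smoother mother wavelets (e.g. Daubechies), but the localization principle is the same.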

  18. Software fault tolerance

    OpenAIRE

    Kazinov, Tofik Hasanaga; Mostafa, Jalilian Shahrukh

    2009-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. This paper surveys various software fault tolerance techniques and methodologies. They fall into two groups: single-version and multi-version software fault tolerance techniques. It is expected that software fault tolerance research will benefit from this research...

  19. Software fault tolerance

    OpenAIRE

    Strigini, Lorenzo

    1990-01-01

    Software design faults are a cause of major concern, and their relative importance is growing as techniques for tolerating hardware faults gain wider acceptance. The application of fault tolerance to design faults is both increasing, in particular in some life-critical applications, and controversial, due to the imperfect state of knowledge about it. This paper surveys the existing applications and research results, to help the reader form an initial picture of the existing possibilities, and...

  20. Fault diagnosis of nuclear equipment based on artificial immune system

    International Nuclear Information System (INIS)

    As nuclear equipment is complicated and specialized, this paper puts forward a novel fault diagnosis method for nuclear equipment based on the artificial immune system: normal behavior is modeled with a negative-selection algorithm, and faults are further identified with a clonal-variation algorithm. Features are extracted from signals sampled on rotating machinery, and the result is input to the AIS model. Simulation results show that the model can identify each fault type successfully. (authors)
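
    The negative-selection step can be sketched in a few lines. This is not the authors' model; the 1-D feature values, grid of candidate detectors and matching radius are illustrative assumptions. Detectors that match any healthy ("self") sample are censored, and a surviving detector firing on a new sample marks it as non-self, i.e. a fault.

```python
def generate_detectors(self_samples, candidates, radius):
    """Negative selection: keep only candidate detectors that do NOT lie
    within `radius` of any normal ('self') sample."""
    return [c for c in candidates
            if all(abs(c - s) > radius for s in self_samples)]

def is_fault(sample, detectors, radius):
    """A sample matched by any surviving detector is non-self, i.e. a fault."""
    return any(abs(sample - d) <= radius for d in detectors)

# Healthy feature values cluster in [0, 1]; candidate detectors tile [0, 10].
self_samples = [0.1, 0.4, 0.5, 0.9]
candidates = [i * 0.5 for i in range(21)]
detectors = generate_detectors(self_samples, candidates, radius=0.6)
```

    A healthy reading (e.g. 0.5) falls in the censored region and triggers no detector, while an abnormal reading (e.g. 8.0) is matched by a surviving detector; the clonal-variation stage described in the abstract would then refine the matching detectors.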

  1. Faults Classification Scheme for Three Phase Induction Motor

    OpenAIRE

    Mohammed Obaid Mustafa; George Nikolakopoulos; Thomas Gustafsson

    2014-01-01

    In every kind of industrial application, fault detection and diagnosis for induction motors is of paramount importance. Fault diagnosis and detection minimize downtime and improve the reliability and availability of the systems. In this article, a fault classification algorithm based on a robust linear discrimination scheme, for the case of a squirrel-cage three phase induction motor, will be presented. The suggested scheme is based on a novel feature extraction...

  2. Fault tolerant computing systems

    International Nuclear Information System (INIS)

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)

  3. Performance based fault diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    2002-01-01

    Different aspects of fault detection and fault isolation in closed-loop systems are considered. It is shown that using the standard setup known from feedback control, it is possible to formulate fault diagnosis problems based on a performance index in this general standard setup. It is also shown...

  4. Fault recovery characteristics of the fault tolerant multi-processor

    Science.gov (United States)

    Padilla, Peter A.

    1990-01-01

    The fault handling performance of the fault tolerant multiprocessor (FTMP) was investigated. Fault handling errors detected during fault injection experiments were characterized. In these fault injection experiments, the FTMP disabled a working unit instead of the faulted unit once every 500 faults, on the average. System design weaknesses allow active faults to exercise a part of the fault management software that handles byzantine or lying faults. It is pointed out that these weak areas in the FTMP's design increase the probability that, for any hardware fault, a good LRU (line replaceable unit) is mistakenly disabled by the fault management software. It is concluded that fault injection can help detect and analyze the behavior of a system in the ultra-reliable regime. Although fault injection testing cannot be exhaustive, it has been demonstrated that it provides a unique capability to unmask problems and to characterize the behavior of a fault-tolerant system.

  5. Enantioselective determination of methylphenidate and ritalinic acid in whole blood from forensic cases using automated solid-phase extraction and liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Thomsen, Ragnar; B. Rasmussen, Henrik; Linnet, Kristian;

    2012-01-01

    A chiral liquid chromatography tandem mass spectrometry (LC–MS-MS) method was developed and validated for quantifying methylphenidate and its major metabolite ritalinic acid in blood from forensic cases. Blood samples were prepared in a fully automated system by protein precipitation followed by...... methylphenidate was not determined to be related to the cause of death, the femoral blood concentration of d-methylphenidate ranged from 5 to 58 ng/g, and from undetected to 48 ng/g for l-methylphenidate (median d/l-ratio 5.9). Ritalinic acid was present at concentrations 10–20 times higher with roughly equal...... amounts of the d- and l-forms. In blood from 10 living subjects that were not suspected of being intoxicated by methylphenidate, the concentration ranges and patterns were similar to those of the postmortem cases. Thus, methylphenidate does not appear to undergo significant postmortem redistribution....

  6. A Fault Alarm and Diagnosis Method Based on Sensitive Parameters and Support Vector Machine

    Science.gov (United States)

    Zhang, Jinjie; Yao, Ziyun; Lv, Zhiquan; Zhu, Qunxiong; Xu, Fengtian; Jiang, Zhinong

    2015-08-01

    The extraction of fault features and diagnostic techniques for reciprocating compressors are currently hot research topics in the field of reciprocating machinery fault diagnosis. A large number of feature extraction and classification methods have been widely applied in related research, but practical fault alarming and diagnostic accuracy have not been effectively improved. Developing feature extraction and classification methods that meet the requirements of typical fault alarming and automatic diagnosis in practical engineering is an urgent task. The typical mechanical faults of reciprocating compressors are presented in the paper, and existing data from an online monitoring system are used to extract fault feature parameters, 15 types in total; the sensitive connection between faults and feature parameters is clarified using the distance evaluation technique, and sensitive characteristic parameters of different faults are obtained. On this basis, a method based on fault feature parameters and support vector machines (SVM) is developed and applied to practical fault diagnosis. Improved early fault warning capability is demonstrated by experiments and practical fault cases, and automatic SVM classification of fault alarm data achieves better diagnostic accuracy.
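
    A simplified form of the distance evaluation technique used to rank feature sensitivity can be sketched as below. This is not the paper's exact criterion; the score definition (between-class centre distance over within-class scatter) and the toy data are illustrative assumptions.

```python
from itertools import combinations
from statistics import mean

def sensitivity(class_values):
    """class_values: one list of feature values per fault class.
    Score = mean distance between class centres divided by mean within-class
    scatter; larger scores mark more fault-sensitive features."""
    centres = [mean(c) for c in class_values]
    within = mean(mean(abs(v - ctr) for v in c)
                  for c, ctr in zip(class_values, centres))
    between = mean(abs(a - b) for a, b in combinations(centres, 2))
    return between / within if within else float("inf")

# A feature that cleanly separates two fault classes versus a noisy one:
# the separating feature should receive a much higher sensitivity score.
separating = [[1.0, 1.1, 0.9], [5.0, 5.1, 4.9]]
noisy = [[1.0, 4.0], [2.0, 5.0]]
```

    Ranking all 15 feature parameters by such a score, then feeding only the top-ranked ones to the SVM, mirrors the selection step the abstract describes.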

  7. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions... The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows...

  8. Analysis of fault using microcomputer protection by symmetrical component method

    Directory of Open Access Journals (Sweden)

    Ashish Choubey

    2012-09-01

    To enhance power supply reliability at user terminals, a distribution system must avoid repeated interference from a fault by rapidly completing automatic fault identification, location and isolation, followed by network reconfiguration until supply is restored to the non-faulted sections; for this purpose, a microprocessor-based relay protection device has been developed. As fault component theory is widely used in microcomputer protection, and the fault component exists in the fault component network, it is necessary to build up the fault component network when a short circuit fault emerges and to draw the current and voltage phasor diagram at the fault point. To understand microcomputer protection based on the symmetrical component principle, we obtain the sequence currents and sequence voltages according to the concept of symmetrical components. Distribution lines supply power directly to users, so the reliability of their operation determines the quality and level of the electricity supply. In recent decades, thanks to the tireless efforts of scientists and technicians, relay protection technology and its equipment application level have been greatly improved, but domestically produced protection devices are still based on outdated computer hardware. Their software is difficult to maintain and short-lived, their interfaces to factory automation systems are weak, and their network communication cannot meet actual requirements. The protection configuration and device manufacturing processes also remain to be improved.
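
    The symmetrical-component decomposition the abstract relies on is the standard Fortescue transform, sketched below (the code is illustrative, not from the cited device; the balanced test phasors are an assumption). With the rotation operator a = e^(j2π/3), the zero-, positive- and negative-sequence components follow directly from the three phase phasors.

```python
import cmath

A = cmath.exp(2j * cmath.pi / 3)  # the 120-degree rotation operator 'a'

def symmetrical_components(va, vb, vc):
    """Decompose three phase phasors into zero-, positive- and
    negative-sequence components (Fortescue transform)."""
    v0 = (va + vb + vc) / 3
    v1 = (va + A * vb + A**2 * vc) / 3
    v2 = (va + A**2 * vb + A * vc) / 3
    return v0, v1, v2

# A balanced three-phase set (1 per unit, phases 0, -120, +120 degrees)
# has only a positive-sequence component.
v0, v1, v2 = symmetrical_components(1 + 0j, A**2, A)
```

    Unbalanced faults produce nonzero negative- and zero-sequence components, which is what microcomputer protection uses to classify the fault type.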

  9. New low voltage (LV) distribution automation system

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed, M.M.; Sulaiman, M. [National Technical Univ. College of Malaysia, Melaka (Malaysia)

    2007-07-01

    The challenge of supplying non-interrupted power from an electrical distribution system experiencing an electrical fault was discussed. Typically, a team of electricians is sent to the fault area to solve the problem. Since this is both time consuming and expensive, a new method called the distribution automation system (DAS) has been proposed to address this challenge. The DAS is aimed at low voltage (LV) distribution systems. Under this newly developed DAS, only the consumer where the fault occurs will be affected. The automated system identifies the exact location of the fault and isolates the consumer from the rest of the power distribution system; the consumer is reconnected to the system only after fault clearance. The system remotely operates and controls the equipment connected at the substation and along the distribution line/zone/pole. Linking is done by a power line communication (PLC) system with the help of a Remote Terminal Unit (RTU) and a Supervisory Control and Data Acquisition (SCADA) system, which improves the ability to monitor various equipment at the substation and at the consumer location. 5 refs., 1 tab., 18 figs.

  10. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  11. Information Based Fault Diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2008-01-01

    Fault detection and isolation (FDI) of parametric faults in dynamic systems will be considered in this paper. An active fault diagnosis (AFD) approach is applied. The fault diagnosis will be investigated with respect to different information levels from the external inputs to the systems. These inputs are disturbance inputs, reference inputs and auxiliary inputs. The diagnosis of the system is derived by an evaluation of the signature from the inputs in the residual outputs. The changes of the signatures from the external inputs are used for detection and isolation of the parametric faults.

  13. Rapid analysis of three β-agonist residues in food of animal origin by automated on-line solid-phase extraction coupled to liquid chromatography and tandem mass spectrometry.

    Science.gov (United States)

    Mi, Jiebo; Li, Shujing; Xu, Hong; Liang, Wei; Sun, Tao

    2014-09-01

    An automated online solid-phase extraction with liquid chromatography and tandem mass spectrometry method was developed and validated for the detection of clenbuterol, salbutamol, and ractopamine in food of animal origin. Samples from the food matrix were pretreated on an Oasis MCX online solid-phase extraction cartridge for <5 min after acid hydrolysis for 30 min. The peak-focusing mode was used to elute the target compounds directly onto a C18 column. Chromatographic separation was achieved under gradient conditions using a mobile phase composed of acetonitrile/0.1% formic acid in aqueous solution. Each analyte was detected in two multiple reaction monitoring transitions via an electrospray ionization source in positive mode. The relative standard deviations ranged from 2.6 to 10.5%, and recovery was between 76.7 and 107.2% at all quality control levels. The limits of quantification of the three β-agonists were in the range of 0.024-0.29 μg/kg in pork, sausage, and milk powder. This newly developed method offers high sensitivity and minimal sample pretreatment for the high-throughput analysis of β-agonist residues.

  14. Fault isolability conditions for linear systems with additive faults

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    2006-01-01

    In this paper, we shall show that an unlimited number of additive single faults can be isolated under mild conditions if a general isolation scheme is applied. Multiple faults are also covered. The approach is algebraic and is based on a set representation of faults, where all faults within a set can occur simultaneously, whereas faults belonging to different fault sets appear disjoint in time. The proposed fault detection and isolation (FDI) scheme consists of three steps. A fault detection (FD) step is followed by a fault set isolation (FSI) step, in which the fault set wherein the faults have occurred is isolated. The last step is a fault isolation (FI) of the faults occurring in that specific fault set, i.e., equivalent to the standard FI step.
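
The three-step scheme (FD, then FSI, then FI within the isolated set) can be illustrated with a toy signature-matching step. The residual signatures and threshold below are hypothetical, chosen only to show the mechanics, not taken from the paper:

```python
# Hypothetical binary signatures: which residuals react to each fault set.
SIGNATURES = {
    "fault_set_1": (1, 0, 1),
    "fault_set_2": (0, 1, 1),
    "fault_set_3": (1, 1, 0),
}

def detect(residuals, threshold=0.5):
    """FD step: declare a fault if any residual exceeds the threshold."""
    return any(abs(r) > threshold for r in residuals)

def isolate_fault_set(residuals, threshold=0.5):
    """FSI step: match the thresholded residual pattern to a known signature."""
    signature = tuple(int(abs(r) > threshold) for r in residuals)
    for name, sig in SIGNATURES.items():
        if sig == signature:
            return name
    return None  # no matching set; the FI step within the set would follow
```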

  15. High quality DNA obtained with an automated DNA extraction method with 70+ year old formalin-fixed celloidin-embedded (FFCE) blocks from the indiana medical history museum

    OpenAIRE

    Niland, Erin E; McGuire, Audrey; Cox, Mary H; Sandusky, George E

    2012-01-01

    DNA and RNA have been used as markers of tissue quality and integrity throughout the last few decades. In this research study, genomic quality DNA of kidney, liver, heart, lung, spleen, and brain were analyzed in tissues from post-mortem patients and surgical cancer cases spanning the past century. DNA extraction was performed on over 180 samples from: 70+ year old formalin-fixed celloidin-embedded (FFCE) tissues, formalin-fixed paraffin-embedded (FFPE) tissue samples from surgical cases and ...

  16. Multiresidue trace analysis of pharmaceuticals, their human metabolites and transformation products by fully automated on-line solid-phase extraction-liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    García-Galán, María Jesús; Petrovic, Mira; Rodríguez-Mozaz, Sara; Barceló, Damià

    2016-09-01

    A novel, fully automated analytical methodology based on dual-column liquid chromatography coupled to tandem mass spectrometry (LC-LC-MS(2)) has been developed and validated for the analysis of 12 pharmaceuticals and 20 metabolites and transformation products (TPs) in different types of water (influent and effluent wastewaters and surface water). Two LC columns were used - one for pre-concentration of the sample and the second for separation and analysis - so that water samples were injected directly into the chromatographic system. Besides the many advantages of the methodology, such as minimization of the required sample volume and of its manipulation, compounds ionizing in positive and in negative mode could be analyzed simultaneously without compromising sensitivity. A comparative study of different mobile phases, gradients and LC pre-concentration columns was carried out to obtain the best analytical performance. The method limits of detection (MLODs) achieved were in the low ng L(-1) range for all the compounds. The method was successfully applied to study the presence of the target analytes in different wastewater and surface water samples collected near the city of Girona (Catalonia, Spain). Data on the environmental presence and fate of pharmaceutical metabolites and TPs are still scarce, highlighting the relevance of the developed methodology.

  17. Development and Test of Methods for Fault Detection and Isolation

    DEFF Research Database (Denmark)

    Jørgensen, R.B.

    Almost all industrial systems are automated to ensure optimal production, both in relation to energy consumption and to the safety of equipment and humans. All working parts are individually subject to faults, which can lead to unacceptable economic loss or injury to people. This thesis deals with a...

  18. A Scrutiny of Automated Healthcare System with SFT

    Directory of Open Access Journals (Sweden)

    Jigna B. Prajapati

    2011-12-01

    Full Text Available In today’s techno-savvy world, automation is an important and contemporary issue. Automated systems have been widely used in industry, home appliances, automobiles, undersea exploration, space and healthcare over the past decade. The accuracy of robots makes any system more acceptable; here, robots assist us in managing patients’ health. We always expect such a system to work under any situation, yet the development of robotic software is a complex and error-prone process. Most complex systems contain software, and system failures triggered by software faults can provide lessons for software development practices and software quality assurance; such faults must be identified and removed as early as possible. The interrelationship between software faults and failures is quite intricate, and obtaining a meaningful characterization of it is difficult. Towards this characterization, we have investigated and classified failures observed in a robotic system. In this paper, we describe the process used in our study for tracking faults, present the different types of faults, their impact and their classification, and propose a way to manage them. We then propose fault tolerance techniques as a single framework for coping with the different faults.

  19. Intelligent Automated Diagnosis of Client Device Bottlenecks in Private Clouds

    CERN Document Server

    Widanapathirana, C; Sekercioglu, Y A; Ivanovich, M; Fitzpatrick, P; 10.1109/UCC.2011.42

    2012-01-01

    We present an automated solution for rapid diagnosis of client device problems in private cloud environments: the Intelligent Automated Client Diagnostic (IACD) system. Clients are diagnosed with the aid of Transmission Control Protocol (TCP) packet traces, by (i) observation of anomalous artifacts occurring as a result of each fault and (ii) subsequent use of the inference capabilities of soft-margin Support Vector Machine (SVM) classifiers. The IACD system features a modular design and is extendible to new faults, with detection capability unaffected by the TCP variant used at the client. Experimental evaluation of the IACD system in a controlled environment demonstrated an overall diagnostic accuracy of 98%.

  20. Transformer Internal Faults Simulation

    Directory of Open Access Journals (Sweden)

    KOOCHAKI, A.

    2008-06-01

    Full Text Available This paper presents a novel method of modeling internal faults in a power transformer. The method leads to a model which is compatible with commercial phasor-based software packages. Consequently, it enables calculation of fault currents in any branch of the network due to a winding fault of a power transformer. These currents can be used for evaluation of protective relays' performance and can lead to better setting of protective functions.

  1. Automated workstation for MT-25 microtron operator

    International Nuclear Information System (INIS)

    The goal of this paper is to create the ARM MT-25 automated workstation in order to help the microtron operator in the preparation, start-up, technical fault-finding and adjustment, operation, shutdown and maintenance processes. The ARM MT-25 requirements are a personal computer compatible with the IBM PC XT/AT with at least 640 KB of RAM, a medium-resolution graphics display, a hard disk and the DOS 3.3 operating system. 4 refs.; 8 figs

  2. Research of Gear Fault Detection in Morphological Wavelet Domain

    Science.gov (United States)

    Hong, Shi; Fang-jian, Shan; Bo, Cong; Wei, Qiu

    2016-02-01

    For extracting mutation information from gear fault signals and achieving a valid fault diagnosis, a gear fault diagnosis method based on the morphological mean wavelet transform was designed. The morphological mean wavelet transform is a linear wavelet in the framework of morphological wavelets. Decomposing a gear fault signal with this transform produces signal synthesis operators and detail synthesis operators. The signal synthesis operator closely follows the original signal, while the detail synthesis operator contains the fault impact signal or interference signal, which can thus be captured. The simulation experiment results indicate that, compared with the Fourier transform, the morphological mean wavelet transform method can perform time-frequency analysis of the original signal and effectively capture the positions where impact signals appear; and compared with the traditional linear wavelet transform, it has a simple structure, is easy to realize, is sensitive to local signal extrema and has high denoising ability, so it is better suited to real-time gear fault detection.
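
A minimal sketch of one level of a mean wavelet decomposition of the kind the abstract describes (the abstract notes it is a linear wavelet): the approximation is the pairwise mean and the detail is the pairwise half-difference, so impacts concentrate in the detail channel. This is an illustration of the idea, not the authors' exact implementation:

```python
import numpy as np

def mean_wavelet_decompose(x):
    """One-level mean-wavelet analysis (assumes even-length input)."""
    x = np.asarray(x, dtype=float)
    even, odd = x[0::2], x[1::2]
    approx = (even + odd) / 2.0   # smooth trend ("signal synthesis")
    detail = (even - odd) / 2.0   # impacts/mutations concentrate here
    return approx, detail

def mean_wavelet_reconstruct(approx, detail):
    """Exact inverse of the decomposition above."""
    x = np.empty(2 * len(approx))
    x[0::2] = approx + detail
    x[1::2] = approx - detail
    return x
```

Perfect reconstruction holds, so detail coefficients can be thresholded for denoising without losing the trend.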

  3. Support Vector Machine for mechanical faults classification

    Institute of Scientific and Technical Information of China (English)

    JIANG Zhi-qiang; FU Han-guang; LI Ling-jun

    2005-01-01

    Support Vector Machine (SVM) is a machine learning algorithm based on Statistical Learning Theory (SLT), which can achieve good classification with few learning samples. SVM represents a new approach to pattern classification and has been shown to be particularly successful in many fields such as image identification and face recognition. It also provides a new method for developing intelligent fault diagnosis. This paper presents an SVM-based approach for fault diagnosis of rolling bearings. Experiments were conducted with bearing vibration signals; the signals acquired from the bearings were used directly in the calculation, without preprocessing to extract features. Compared with the Artificial Neural Network (ANN) based method, the SVM-based method has desirable advantages. A multi-fault SVM classifier built from binary classifiers is also constructed for gear faults in this paper. Further experiments with gear fault samples showed that the multi-fault SVM classifier has good classification ability and high efficiency in mechanical systems, making it suitable for online diagnosis of mechanical systems.

  4. Determination of etoricoxib in human plasma using automated on-line solid-phase extraction coupled with LC-APCI/MS/MS

    Directory of Open Access Journals (Sweden)

    Sérgio Luiz Dalmora

    2008-01-01

    Full Text Available A liquid chromatography-tandem mass spectrometry method with atmospheric pressure chemical ionization (LC-APCI/MS/MS) was validated for the determination of etoricoxib in human plasma, using antipyrine as the internal standard, following on-line solid-phase extraction. The method was performed on a Luna C18 column, and the mobile phase consisted of acetonitrile:water (95:5, v/v)/ammonium acetate (pH 4.0; 10 mM), run at a flow rate of 0.6 mL/min. The method was linear in the range of 1-5000 ng/mL (r² > 0.99). The lower limit of quantitation was 1 ng/mL. The recoveries were within 93.72-96.18%. Moreover, method validation demonstrated acceptable results for the precision, accuracy and stability studies.

  5. Automated microdialysis-based system for in situ microsampling and investigation of lead bioavailability in terrestrial environments under physiologically based extraction conditions.

    Science.gov (United States)

    Rosende, María; Magalhães, Luis M; Segundo, Marcela A; Miró, Manuel

    2013-10-15

    In situ automatic microdialysis sampling under batch-flow conditions is herein proposed for the first time for expedient assessment of the kinetics of lead bioaccessibility/bioavailability in contaminated and agricultural soils, exploiting the harmonized physiologically based extraction test (UBM). Capitalizing upon a concentric microdialysis probe immersed in synthetic gut fluids, the miniaturized flow system is harnessed for continuous monitoring of lead transfer across the permselective microdialysis membrane to mimic the diffusive transport of metal species through the epithelium of the stomach and of the small intestine. Moreover, the addition of the UBM gastrointestinal fluid surrogates at a specified time frame is fully mechanized. Distinct microdialysis probe configurations and membrane types were investigated in detail to ensure passive sampling under steady-state dialytic conditions for lead. Using a 3-cm-long polysulfone membrane with an average molecular weight cutoff of 30 kDa in a concentric probe and a perfusate flow rate of 2.0 μL min(-1), microdialysis relative recoveries in the gastric phase were close to 100%, thereby omitting the need for probe calibration. The automatic leaching method was validated in terms of bias in the analysis of four soils with different physicochemical properties, containing a wide range of lead content (16 ± 3 to 1216 ± 42 mg kg(-1)), using mass balance assessment as a quality control tool. No significant differences between the mass balance and the total lead concentration in the suite of analyzed soils were encountered (α = 0.05). Our finding that extraction of soil-borne lead for merely one hour in the gastrointestinal phase suffices for assessment of the bioavailable fraction, as a result of the fast immobilization of lead species at near-neutral conditions, would assist in providing risk assessment data from the UBM test on short notice.

  6. Series Arc Fault Detection Algorithm Based on Autoregressive Bispectrum Analysis

    Directory of Open Access Journals (Sweden)

    Kai Yang

    2015-10-01

    Full Text Available Arc faults are one of the most critical causes of electrical fires. Due to the diversity, randomness and concealment of arc faults in low-voltage circuits, it is difficult for general methods to protect all loads from series arc faults. Analysis of many series arc faults shows that a large number of high-frequency signals are generated in the circuit. These signals are easily affected by Gaussian noise, which is difficult to eliminate because of frequency aliasing. Thus, a novel detection algorithm is developed in this paper to accurately detect series arc faults. First, an autoregressive model of the mixed high-frequency signals is built. Then, autoregressive bispectrum analysis is introduced to analyze common series arc fault features. The phase information of the arc fault signal is preserved by this method, and the influence of Gaussian noise is restrained effectively. Afterwards, several features, including characteristic frequency, fluctuation of phase angles, diffused distribution and incremental numbers of bispectrum peaks, are extracted for recognizing arc faults. Finally, a least squares support vector machine is used to accurately identify series arc faults from the load states based on these frequency features of the bispectrum. The validity of the algorithm is experimentally verified, achieving an arc fault detection rate above 97%.
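
The first stage of such a pipeline, fitting an autoregressive (AR) model to the measured high-frequency signal, can be sketched with the classical Yule-Walker equations; the AR coefficients would then feed the bispectrum estimate. This is a generic illustration, not the paper's implementation:

```python
import numpy as np

def yule_walker(x, order):
    """Estimate AR coefficients a such that x[t] ~ sum_k a[k] * x[t-1-k]."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    # Biased autocovariance estimates r[0..order]
    r = np.array([x[:n - k] @ x[k:] for k in range(order + 1)]) / n
    # Toeplitz system R a = r[1:]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, r[1:])
```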

  7. Stator Interturn Fault Detection in Permanent-Magnet Machines Using PWM Ripple Current Measurement

    OpenAIRE

    Sen, B.; Wang, J.

    2016-01-01

    This paper proposes a novel method of interturn fault detection based on measurement of pulsewidth modulation (PWM) ripple current. The method uses the ripple current generated by the switching inverter as a means to detect interturn fault. High-frequency (HF) impedance behavior of healthy and faulted windings is analyzed and modeled, and ripple current signature due to interturn faults is quantified. A simple analog circuit is designed to extract the PWM ripple current via a bandpass (BP) fi...
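
The paper uses an analog bandpass circuit to extract the PWM ripple; digitally, the same isolation step can be mimicked with an FFT-domain bandpass. The sample rate and band edges below are assumptions for illustration:

```python
import numpy as np

def bandpass_fft(signal, fs, f_lo, f_hi):
    """Zero out all spectral content outside [f_lo, f_hi] Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return np.fft.irfft(spectrum * mask, n=len(signal))
```

Applied to a phase-current record, this passes the switching-frequency ripple while rejecting the fundamental, so the ripple signature can be compared between healthy and faulted windings.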

  8. Fault Detection and Isolation of Wind Energy Conversion Systems using Recurrent Neural Networks

    OpenAIRE

    N. Talebi; M.A. Sadrnia; A. Darabi

    2014-01-01

    Reliability of Wind Energy Conversion Systems (WECSs) is critically important for extracting the maximum amount of available wind energy. In order to accurately study WECSs during the occurrence of faults and to explore the impact of faults on each component of a WECS, a detailed model is required in which the mechanical and electrical parts of the WECS are properly involved. In addition, a Fault Detection and Isolation System (FDIS) is required by which occurring faults can be diagnosed at the appropr...

  9. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  10. Fault Locating, Prediction and Protection (FLPPS)

    Energy Technology Data Exchange (ETDEWEB)

    Yinger, Robert, J.; Venkata, S., S.; Centeno, Virgilio

    2010-09-30

    One of the main objectives of this DOE-sponsored project was to reduce customer outage time. Fault location, prediction, and protection are the most important aspects of fault management for the reduction of outage time. In the past most of the research and development on power system faults in these areas has focused on transmission systems, and it is not until recently with deregulation and competition that research on power system faults has begun to focus on the unique aspects of distribution systems. This project was planned with three Phases, approximately one year per phase. The first phase of the project involved an assessment of the state-of-the-art in fault location, prediction, and detection as well as the design, lab testing, and field installation of the advanced protection system on the SCE Circuit of the Future located north of San Bernardino, CA. The new feeder automation scheme, with vacuum fault interrupters, will limit the number of customers affected by the fault. Depending on the fault location, the substation breaker might not even trip. Through the use of fast communications (fiber) the fault locations can be determined and the proper fault interrupting switches opened automatically. With knowledge of circuit loadings at the time of the fault, ties to other circuits can be closed automatically to restore all customers except the faulted section. This new automation scheme limits outage time and increases reliability for customers. The second phase of the project involved the selection, modeling, testing and installation of a fault current limiter on the Circuit of the Future. While this project did not pay for the installation and testing of the fault current limiter, it did perform the evaluation of the fault current limiter and its impacts on the protection system of the Circuit of the Future. After investigation of several fault current limiters, the Zenergy superconducting, saturable core fault current limiter was selected for

  12. NASA Space Flight Vehicle Fault Isolation Challenges

    Science.gov (United States)

    Bramon, Christopher; Inman, Sharon K.; Neeley, James R.; Jones, James V.; Tuttle, Loraine

    2016-01-01

    The Space Launch System (SLS) is the new NASA heavy lift launch vehicle and is scheduled for its first mission in 2017. The goal of the first mission, which will be uncrewed, is to demonstrate the integrated system performance of the SLS rocket and spacecraft before a crewed flight in 2021. SLS has many of the same logistics challenges as any other large scale program. Common logistics concerns for SLS include integration of discrete programs geographically separated, multiple prime contractors with distinct and different goals, schedule pressures and funding constraints. However, SLS also faces unique challenges. The new program is a confluence of new hardware and heritage, with heritage hardware constituting seventy-five percent of the program. This unique approach to design makes logistics concerns such as testability of the integrated flight vehicle especially problematic. The cost of fully automated diagnostics can be completely justified for a large fleet, but not so for a single flight vehicle. Fault detection is mandatory to assure the vehicle is capable of a safe launch, but fault isolation is another issue. SLS has considered various methods for fault isolation which can provide a reasonable balance between adequacy, timeliness and cost. This paper will address the analyses and decisions the NASA Logistics engineers are making to mitigate risk while providing a reasonable testability solution for fault isolation.

  13. An automated SPE-based high-yield synthesis of [11C]acetate and [11C]palmitate: no liquid–liquid extraction, solvent evaporation or distillation required

    International Nuclear Information System (INIS)

    Introduction: An automated method is described for the rapid and high-yield synthesis of two of the most commonly used radioactive fatty acids: [11C]acetate and [11C]palmitate. Methods: Reaction of [11C]CO2 with the respective Grignard reagents in diethyl ether solution proceeded for 2 min at 40°C. Quenching of the reaction and liberation of nonreacted [11C]CO2 occurred upon addition of a fourfold molar excess of aqueous 0.1 M HCl (acetate) or nonaqueous HCl/Et2O (palmitate). Labeled products were then purified by adsorption to an Alumina-N Sep-Pak Plus cartridge and eluted with either aqueous NaH2PO4 solution (acetate) or 100% ethanol (palmitate). Results: High-performance liquid chromatography analysis confirmed that the radiochemical purity of each product was >98%, and decay-corrected radiochemical yields averaged 33% for [11C]palmitate and 40% for [11C]acetate. Conclusion: The method requires no liquid–liquid extraction, solvent evaporation or distillation capabilities and can be readily adapted to existing radiosynthesis modules.

  14. Determination of propoxur in environmental samples by automated solid-phase extraction followed by flow-injection analysis with tris(2,2'-bipyridyl)ruthenium(II) chemiluminescence detection

    International Nuclear Information System (INIS)

    A sensitive method for the analysis of propoxur in environmental samples has been developed. It involves an automated solid-phase extraction (SPE) procedure using a Gilson Aspec XLi and flow-injection analysis (FI) with chemiluminescence (CL) detection. The FI-CL system relies on the photolysis of propoxur by irradiation using a low-pressure mercury lamp (main spectral line 254 nm). The resultant methylamine is subsequently detected by CL using tris(2,2'-bipyridyl)ruthenium(III), which is generated on-line by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. The linear concentration range of application was 0.05-5 μg mL-1 of propoxur, with a detection limit of 5 ng mL-1. The repeatability was 0.82%, expressed as relative standard deviation (n = 10), and the reproducibility, studied on 5 consecutive days, was 2.1%. The sample throughput was 160 injections per hour. Propoxur residues below ng mL-1 levels could be determined in environmental water samples when an SPE preconcentration device was coupled on-line with the FI system. This SPE-FI-CL arrangement provides a detection limit as low as 5 ng L-1 using only 500 mL of sample. In the analysis of fruits and vegetables, the detection limit was about 10 μg kg-1.

  15. Characterization of leaky faults

    International Nuclear Information System (INIS)

    Leaky faults provide a flow path for fluids to move underground. It is very important to characterize such faults in various engineering projects. The purpose of this work is to develop mathematical solutions for this characterization. The flow of water in an aquifer system and the flow of air in the unsaturated fault-rock system were studied. If the leaky fault cuts through two aquifers, characterization of the fault can be achieved by pumping water from one of the aquifers, which are assumed to be horizontal and of uniform thickness. Analytical solutions have been developed for two cases of either a negligibly small or a significantly large drawdown in the unpumped aquifer. Some practical methods for using these solutions are presented. 45 refs., 72 figs., 11 tabs

  16. Pressure-driven mesofluidic platform integrating automated on-chip renewable micro-solid-phase extraction for ultrasensitive determination of waterborne inorganic mercury.

    Science.gov (United States)

    Portugal, Lindomar A; Laglera, Luis M; Anthemidis, Aristidis N; Ferreira, Sérgio L C; Miró, Manuel

    2013-06-15

    A dedicated pressure-driven mesofluidic platform incorporating on-chip sample clean-up and analyte preconcentration is herein reported for expedient determination of trace-level concentrations of waterborne inorganic mercury. Capitalizing upon the Lab-on-a-Valve (LOV) concept, the mesofluidic device integrates on-chip micro-solid-phase extraction (μSPE) in automatic disposable mode, followed by chemical vapor generation and gas-liquid separation prior to in-line atomic fluorescence spectrometric detection. In contrast to prevailing chelating sorbents for Hg(II), bare poly(divinylbenzene-N-vinylpyrrolidone) copolymer sorptive beads were resorted to for efficient uptake of Hg(II) in hydrochloric acid milieu (pH = 2.3), without the need for metal derivatization or for readjusting previously acidified water samples to near-neutral pH for preservation. Experimental variables influencing the sorptive uptake and retrieval of target species and the evolvement of elemental mercury within the miniaturized integrated reaction chamber/gas-liquid separator were investigated in detail. Using merely <10 mg of sorbent, the limits of detection and quantification at the 3s(blank) and 10s(blank) levels, respectively, for a sample volume of 3 mL were 12 and 42 ng L(-1) Hg(II), with a dynamic range extending up to 5.0 μg L(-1). The proposed mesofluidic platform copes with the requirements of regulatory bodies (US-EPA, WHO, EU Commission) for drinking water quality and surface waters, which endorse maximum allowed concentrations of mercury spanning from 0.07 to 6.0 μg L(-1). As demonstrated with the analysis of aqueous samples of varying matrix complexity, the LOV approach afforded reliable results, with relative recoveries of 86-107% and intermediate precision down to 9% in the renewable μSPE format.
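
The 3s(blank)/10s(blank) criterion quoted above is the standard way detection and quantification limits are derived from the blank's standard deviation and the calibration slope. A worked sketch with illustrative numbers (not the paper's raw data):

```python
def detection_limits(blank_sd, slope):
    """LOD = 3*s(blank)/slope, LOQ = 10*s(blank)/slope (concentration units)."""
    lod = 3 * blank_sd / slope
    loq = 10 * blank_sd / slope
    return lod, loq

# Illustrative values: blank signal SD of 0.004 a.u., slope of 1.0 a.u. per ng/L
lod, loq = detection_limits(0.004, 1.0)
```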

  17. Automated Periodontal Diseases Classification System

    OpenAIRE

    Aliaa A. A. Youssif; Abeer Saad Gawish; Mohammed Elsaid Moussa

    2012-01-01

    This paper presents an efficient and innovative system for automated classification of periodontal diseases. The strength of our technique lies in the fact that it incorporates knowledge from the patients' clinical data along with the features automatically extracted from the Haematoxylin and Eosin (H&E) stained microscopic images. Our system uses image processing techniques based on color deconvolution, morphological operations, and watershed transforms for epithelium & connective tissue se...

  18. Fault kinematic and Mesozoic paleo-stress evolution of the Hoop fault complex, Barents Sea

    Science.gov (United States)

    Etchebes, Marie; Athmer, Wiebke; Stueland, Eirik; Robertson, Sarah C.; Bounaim, Aicha; Steckhan, Dirk; Hellem Boe, Trond; Brenna, Trond; Sonneland, Lars; Reidar Granli, John

    2016-04-01

    The Hoop fault complex is an extensional fault system characterized by a series of multiscale half- and full-grabens trending NNE-SSW, NE-SW and E-W, and transfer zones striking ENE-WSW. In a joint collaboration between OMV Norge and Schlumberger Stavanger Research, the tectonic history of the Hoop area was assessed. A dense fault network was extracted from 3D seismic data using a novel workflow for mapping large and complex fault systems. The characterization of the fault systems was performed by integrating observations from (1) fault plane analysis, (2) geometrical shapes and crosscutting relationships of the different fault sets, (3) time-thickness maps, and (4) the relative timing of the tectonic events established on key seismic lines orthogonal to the main fault strike azimuths. At least four successive extensional tectonic events affecting the Hoop fault complex have been identified in the Mesozoic. The first is an Upper Triassic extensional event with an E-W trending maximum horizontal paleo-stress direction (Phase 1). This event created new accommodation space in the form of a set of full-grabens. The grabens were orthogonally crosscut during the Middle Jurassic by a set of NNE-SSW striking grabens and half-grabens (Phase 2). Phase 3 was inferred from a set of E-W striking reactivated normal faults sealed by the Upper Jurassic-Lower Cretaceous sequence. In the Lower Cretaceous, the general trend of the maximum horizontal paleo-stress axis of Phase 2 rotated clockwise from NNE-SSW to NE-SW (Phase 4). This stress rotation induced the reactivation of the Phase 2 and Phase 3 normal fault sets, producing west-dipping half-graben/tilt-block systems and transtensional fault zones. The reconstructed paleo-stress history agrees with the Mesozoic regional-scale tectonic events published for the Atlantic-Arctic region. This implies that the Hoop fault complex is the result of far-field forces

  19. Research on Fault Feature Extraction of Centrifugal Compressors Based on the Wavelet Packet Method

    Institute of Scientific and Technical Information of China (English)

    史生霖; 陈长征; 李延斌; 张晶

    2013-01-01

    The fault diagnosis of centrifugal compressors is an important aspect of mechanical fault detection. The wavelet packet analysis method can extract the faint, non-stationary vibration signals emitted by machine faults and fully reflect the fault information. It therefore provides an accurate and efficient method for the fault diagnosis of centrifugal compressors.

  20. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  1. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    Science.gov (United States)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2015-01-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 15% for positive control, 4% for ambient samples) and reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution & Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, roadside, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. The correlation may also suggest a mechanistic explanation (oxidative stress) for observed PM2.5 mass-health associations. The heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the
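    The 3s(blank) and 10s(blank) conventions used above for the detection and quantification limits can be sketched in a few lines. The blank readings below are invented for illustration; they are not the study's data, and the paper's 12 and 42 ng L(-1) figures are these multiples expressed in concentration units through the method's calibration.

```python
import statistics

# Hypothetical blank-channel DTT consumption rates (nmol/min); invented
# illustrative numbers, not measurements from the study.
blanks = [0.052, 0.055, 0.049, 0.053, 0.051, 0.050, 0.054]

s_blank = statistics.stdev(blanks)  # sample standard deviation of the blanks
lod = 3 * s_blank                   # limit of detection, 3s(blank) convention
loq = 10 * s_blank                  # limit of quantification, 10s(blank)

print(f"LOD = {lod:.4f} nmol/min, LOQ = {loq:.4f} nmol/min")
```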

  2. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    Directory of Open Access Journals (Sweden)

    T. Fang

    2014-07-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 12% for standards, 4% for ambient samples) and reasonably low limit of detection (0.31 nmol min−1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution and Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, road-side, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. However, the greater heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.

  3. A probabilistic method to diagnose faults of air handling units

    Science.gov (United States)

    Dey, Debashis

    The air handling unit (AHU) is one of the most extensively used pieces of equipment in large commercial buildings. This device is typically customized and lacks quality system integration, which can result in hardware failures and controller errors. Air handling unit Performance Assessment Rules (APAR) is a fault detection tool that uses a set of expert rules derived from mass and energy balances to detect faults in air handling units. APAR is computationally simple enough that it can be embedded in commercial building automation and control systems, and it relies only upon sensor data and control signals that are commonly available in these systems. Although APAR has many advantages over other methods (for example, no training data are required and it is easy to implement commercially), most of the time it is unable to provide a diagnosis of the faults. For instance, a fault in a temperature sensor could be a fixed bias, a drifting bias, an inappropriate location, or a complete failure. Similarly, a fault in the mixing box can be a leaking or stuck return or outdoor-air damper. In addition, when multiple rules are satisfied, the list of candidate faults grows. A rule-based fault detection system alone has no proper way to arrive at the correct diagnosis. To overcome this limitation, we propose the Bayesian belief network (BBN) as a diagnostic tool. A BBN can simulate the diagnostic thinking of FDD experts in a probabilistic way. In this study we developed a new way to detect and diagnose faults in AHUs by combining APAR rules with a Bayesian belief network. The Bayesian belief network is used as a decision support tool for the rule-based expert system. A BBN is highly capable of prioritizing faults when multiple rules are satisfied simultaneously. It can also draw on previous AHU operating conditions and maintenance records to provide a proper diagnosis. The proposed model is validated with real-time measured data of a campus building at the University of Texas at San Antonio (UTSA). The results show that the BBN is correctly able to
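    The way a belief network can prioritize fault hypotheses when several rules fire simultaneously can be sketched with a naive-Bayes style update. The fault hypotheses, priors, and rule likelihoods below are invented for illustration and do not reproduce the dissertation's network structure or parameters.

```python
# Toy posterior ranking of AHU fault hypotheses given that two APAR-style
# rules have fired. All numbers are invented for illustration.
priors = {
    "stuck_outdoor_damper": 0.05,
    "temp_sensor_bias":     0.10,
    "valve_leak":           0.02,
    "no_fault":             0.83,
}
# P(rule fires | fault hypothesis) for the two observed rules.
likelihoods = {
    "stuck_outdoor_damper": {"rule_mixbox": 0.90, "rule_supply_temp": 0.40},
    "temp_sensor_bias":     {"rule_mixbox": 0.30, "rule_supply_temp": 0.85},
    "valve_leak":           {"rule_mixbox": 0.10, "rule_supply_temp": 0.70},
    "no_fault":             {"rule_mixbox": 0.01, "rule_supply_temp": 0.01},
}
observed = ["rule_mixbox", "rule_supply_temp"]

# Multiply each prior by the likelihood of every observed rule, then
# normalize to obtain a posterior over the fault hypotheses.
unnorm = {}
for fault, p in priors.items():
    for rule in observed:
        p *= likelihoods[fault][rule]
    unnorm[fault] = p
z = sum(unnorm.values())
posterior = {f: p / z for f, p in unnorm.items()}
ranked = sorted(posterior, key=posterior.get, reverse=True)
print(ranked[0], round(posterior[ranked[0]], 3))
```

    The posterior concentrates on the hypothesis that explains both fired rules at once, which is exactly the prioritization a flat rule list cannot provide.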

  4. Automated synthetic scene generation

    Science.gov (United States)

    Givens, Ryan N.

    Physics-based simulations generate synthetic imagery to help organizations anticipate system performance of proposed remote sensing systems. However, manually constructing synthetic scenes which are sophisticated enough to capture the complexity of real-world sites can take days to months depending on the size of the site and desired fidelity of the scene. This research, sponsored by the Air Force Research Laboratory's Sensors Directorate, successfully developed an automated approach to fuse high-resolution RGB imagery, lidar data, and hyperspectral imagery and then extract the necessary scene components. The method greatly reduces the time and money required to generate realistic synthetic scenes and developed new approaches to improve material identification using information from all three of the input datasets.

  5. Manufacturing and automation

    OpenAIRE

    Ernesto Córdoba Nieto

    2010-01-01

    The article presents concepts and definitions from different sources concerning automation. The work approaches automation by virtue of the author’s experience in manufacturing production; why and how automation projects are embarked upon is considered. Technological reflection regarding the progressive advances or stages of automation in the production area is stressed. Coriat and Freyssenet’s thoughts about and approaches to the problem of automation and its current state are taken and e...

  6. Fuzzy classifier for fault diagnosis in analog electronic circuits.

    Science.gov (United States)

    Kumar, Ashwani; Singh, A P

    2013-11-01

    Many studies have presented different approaches for fault diagnosis with fault models having ±50% variation in the component values in analog electronic circuits. There is still a need for approaches which provide fault diagnosis with component-value variations below ±50%. A new single and multiple fault diagnosis technique for soft faults in analog electronic circuits using a fuzzy classifier is proposed in this paper. The technique uses the simulation-before-test (SBT) approach, analyzing the frequency response of the analog circuit under faulty and fault-free conditions. Three signature parameters of the frequency response (the peak gain, and the frequency and phase associated with the peak gain) are observed and extracted such that they give unique values for the faulty and fault-free configurations of the circuit. Single and double fault models with component variations from ±10% to ±50% are considered. Along with classifying the faults, the fuzzy classifier gives the estimated component value under faulty and fault-free conditions. The proposed method is validated using simulated data and real-time data for a benchmark analog circuit. A comparative analysis is also presented for both validations. PMID:23849881
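    The three-parameter signature described above can be sketched as a frequency sweep over a circuit model: evaluate the transfer function over a grid, locate the gain peak, and read off the associated frequency and phase. The second-order band-pass section and component values here are assumed for illustration; they are not the paper's benchmark circuit.

```python
import cmath
import math

# Assumed series-RLC band-pass stage (output taken across R).
R, L, C = 1e3, 10e-3, 100e-9  # ohms, henries, farads (illustrative values)

def h(w):
    """Transfer function H(jw) of the band-pass section."""
    return (1j * w * R * C) / (1 - w * w * L * C + 1j * w * R * C)

# Sweep 100 Hz .. 50 kHz and extract the three signature parameters.
freqs = [2 * math.pi * f for f in range(100, 50001, 100)]  # rad/s grid
gains = [abs(h(w)) for w in freqs]
k = max(range(len(gains)), key=gains.__getitem__)

peak_gain = gains[k]                                  # signature 1
peak_freq_hz = freqs[k] / (2 * math.pi)               # signature 2
peak_phase_deg = math.degrees(cmath.phase(h(freqs[k])))  # signature 3
print(peak_gain, peak_freq_hz, peak_phase_deg)
```

    Under a component fault (say R shifted by ±10% to ±50%), the same sweep yields a shifted (gain, frequency, phase) triple, which is what the fuzzy classifier consumes.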

  7. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool developed to analyse the structural model of automated systems in order to identify redundant information, which is then utilized for fault detection and isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri...

  8. Fault Analysis in Cryptography

    CERN Document Server

    Joye, Marc

    2012-01-01

    In the 1970s researchers noticed that radioactive particles produced by elements naturally present in packaging material could cause bits to flip in sensitive areas of electronic chips. Research into the effect of cosmic rays on semiconductors, an area of particular interest in the aerospace industry, led to methods of hardening electronic devices designed for harsh environments. Ultimately various mechanisms for fault creation and propagation were discovered, and in particular it was noted that many cryptographic algorithms succumb to so-called fault attacks. Preventing fault attacks without

  9. Latest Progress of Fault Detection and Localization in Complex Electrical Engineering

    Science.gov (United States)

    Zhao, Zheng; Wang, Can; Zhang, Yagang; Sun, Yi

    2014-01-01

    In research on complex electrical engineering, efficient fault detection and localization schemes are essential to quickly detect and locate faults so that appropriate and timely corrective, mitigating, and maintenance actions can be taken. In this paper, under the current measurement precision of PMUs, we put forward a new type of fault detection and localization technology based on fault factor feature extraction. Extensive simulation experiments indicate that, despite disturbances of white Gaussian stochastic noise, the fault detection and localization results based on the fault factor feature extraction principle remain accurate and reliable, which also shows that the technology has strong anti-interference ability and great redundancy.

  10. Detecting Faults By Use Of Hidden Markov Models

    Science.gov (United States)

    Smyth, Padhraic J.

    1995-01-01

    Frequency of false alarms reduced. Faults in a complicated dynamic system (e.g., antenna-aiming system, telecommunication network, or human heart) detected automatically by method of automated, continuous monitoring. Obtains time-series data by sampling multiple sensor outputs at discrete intervals of time and processes data via algorithm determining whether system in normal or faulty state. Algorithm implements, among other things, a hidden first-order temporal Markov model of states of system. Mathematical model of dynamics of system not needed. Present method is "prior" method mentioned in "Improved Hidden-Markov-Model Method of Detecting Faults" (NPO-18982).
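    The monitoring scheme described above can be sketched as forward filtering of a two-state hidden Markov model over quantized sensor readings: at each sample the belief is propagated through the transition model and reweighted by the emission probability. The state names and all probabilities below are invented for illustration, not taken from the cited work.

```python
# Two-state (normal/faulty) first-order HMM filtered over a discretized
# sensor stream. Probabilities are illustrative only.
states = ("normal", "faulty")
start = {"normal": 0.99, "faulty": 0.01}
trans = {"normal": {"normal": 0.98, "faulty": 0.02},
         "faulty": {"normal": 0.05, "faulty": 0.95}}
# Observations are quantized sensor residuals: "low" (nominal) or "high".
emit = {"normal": {"low": 0.95, "high": 0.05},
        "faulty": {"low": 0.20, "high": 0.80}}

def filter_states(obs):
    """Forward-algorithm filtering: P(state | observations so far)."""
    belief = {s: start[s] * emit[s][obs[0]] for s in states}
    z = sum(belief.values())
    belief = {s: p / z for s, p in belief.items()}
    history = [dict(belief)]
    for o in obs[1:]:
        # Predict through the transition model, then weight by the emission.
        belief = {s: sum(belief[q] * trans[q][s] for q in states) * emit[s][o]
                  for s in states}
        z = sum(belief.values())
        belief = {s: p / z for s, p in belief.items()}
        history.append(dict(belief))
    return history

run = filter_states(["low", "low", "low", "high", "high", "high"])
print(round(run[0]["faulty"], 4), round(run[-1]["faulty"], 4))
```

    The sticky transition probabilities are what suppress false alarms: a single anomalous sample barely moves the belief, while a run of them drives it decisively toward the faulty state.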

  11. Fuzzy fault diagnostic system based on fault tree analysis

    OpenAIRE

    Yang, Zong Xiao; Suzuki, Kazuhiko; Shimada, Yukiyasu; Sayama, Hayatoshi

    1995-01-01

    A method is presented for process fault diagnosis using information from fault tree analysis and uncertainty/imprecision of data. Fault tree analysis, which has been used as a method of system reliability/safety analysis, provides a procedure for identifying failures within a process. A fuzzy fault diagnostic system is constructed which uses the fuzzy fault tree analysis to represent a knowledge of the causal relationships in process operation and control system. The proposed method is applie...

  12. Development of Asset Fault Signatures for Prognostic and Health Management in the Nuclear Industry

    Energy Technology Data Exchange (ETDEWEB)

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2014-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: Diagnostic Advisor, Asset Fault Signature (AFS) Database, Remaining Useful Life Advisor, and Remaining Useful Life Database. This paper focuses on the development of asset fault signatures to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features, based on technical examinations, that can be used to detect a specific fault type. At the most basic level, fault signatures are comprised of an asset type, a fault type, and a set of one or more fault features (symptoms) that are indicative of the specified fault. The AFS Database is populated with asset fault signatures via a content development exercise that is based on the results of intensive technical research and on the knowledge and experience of technical experts. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.

  13. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.;

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...
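    The matrix formulation of FMEA mentioned above can be sketched as Boolean failure propagation: a matrix maps component failure modes to system-level effects, so the effects of any fault set follow from a Boolean matrix-vector product. The modes, effects, and matrix entries below are invented for illustration and are not the module's actual models.

```python
# Boolean failure-propagation sketch. M[i][j] is True when failure mode j
# produces system-level effect i. All names/entries are illustrative.
modes   = ["pump_stuck", "sensor_drift", "valve_leak"]
effects = ["low_flow", "bad_reading", "pressure_drop"]
M = [[True,  False, True ],   # low_flow
     [False, True,  False],   # bad_reading
     [True,  False, True ]]   # pressure_drop

def propagate(fault_vector):
    """Boolean matrix-vector product: which effects a given fault set causes."""
    return [any(m and f for m, f in zip(row, fault_vector)) for row in M]

active = [True, False, False]           # pump_stuck only
observed = propagate(active)
print([e for e, on in zip(effects, observed) if on])
```

    Inverting this relation (which fault sets explain an observed effect pattern) is what yields the decision tables for fault handling mentioned in the abstract.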

  14. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedy actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  15. Development of a fully automated sequential injection solid-phase extraction procedure coupled to liquid chromatography to determine free 2-hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid in human urine

    International Nuclear Information System (INIS)

    2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by the monitoring of the UV filters by liquid chromatography-ultraviolet spectrophotometry detection (LC-UV). In order to improve not only the sensitivity and selectivity, but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The application of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE process for BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Due to the matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL-1, respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers that had applied a sunscreen cosmetic containing both UV filters.
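    The standard-addition quantification mentioned above can be sketched numerically: the sample extract is spiked with known analyte amounts, a line is fitted to signal versus added concentration, and the unknown concentration is the magnitude of the x-intercept (intercept divided by slope). The spike levels and signals below are invented, not the study's measurements.

```python
# Hypothetical standard-addition series for one urine extract (illustrative
# numbers only; units ng/mL for spikes, arbitrary area units for signals).
spikes = [0.0, 50.0, 100.0, 150.0]
signals = [120.0, 245.0, 372.0, 496.0]

# Ordinary least-squares line through (spike, signal).
n = len(spikes)
mx = sum(spikes) / n
my = sum(signals) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(spikes, signals))
         / sum((x - mx) ** 2 for x in spikes))
intercept = my - slope * mx

# Extrapolate to zero signal: unknown concentration = intercept / slope.
c_unknown = intercept / slope
print(round(c_unknown, 1))
```

    Because the calibration is built inside the sample matrix itself, the matrix effects noted in the abstract cancel out of the slope.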

  16. Development of a fully automated sequential injection solid-phase extraction procedure coupled to liquid chromatography to determine free 2-hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid in human urine

    Energy Technology Data Exchange (ETDEWEB)

    Leon, Zacarias; Chisvert, Alberto; Balaguer, Angel [Departamento de Quimica Analitica, Facultad de Quimica, Universitat de Valencia, Doctor Moliner 50, 46100 Burjassot, Valencia (Spain); Salvador, Amparo, E-mail: amparo.salvador@uv.es [Departamento de Quimica Analitica, Facultad de Quimica, Universitat de Valencia, Doctor Moliner 50, 46100 Burjassot, Valencia (Spain)

    2010-04-07

    2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by the monitoring of the UV filters by liquid chromatography-ultraviolet spectrophotometry detection (LC-UV). In order to improve not only the sensitivity and selectivity, but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The application of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE process for BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Due to the matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL-1, respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers that had applied a sunscreen cosmetic containing both UV filters.

  17. Fault Tolerant Software Architectures

    OpenAIRE

    Saridakis, Titos; Issarny, Valérie

    1998-01-01

    Coping explicitly with failures during the conception and design of software significantly complicates the designer's job. This design complexity leads to software descriptions that are difficult to understand and that must undergo many simplifications before reaching a first functioning version. To support the systematic development of complex, fault-tolerant software, this paper proposes a layered framework for the analysis of the fault tolerance properties of software, where the top-most lay...

  18. Cable-fault locator

    Science.gov (United States)

    Cason, R. L.; Mcstay, J. J.; Heymann, A. P., Sr.

    1979-01-01

    Inexpensive system automatically indicates location of short-circuited section of power cable. Monitor does not require that cable be disconnected from its power source or that test signals be applied. Instead, ground-current sensors are installed in manholes or at other selected locations along cable run. When fault occurs, sensors transmit information about fault location to control center. Repair crew can be sent to location and cable can be returned to service with minimum of downtime.

  19. Fault Locating in HVDC Transmission Lines Using Generalized Regression Neural Network and Random Forest Algorithm

    Directory of Open Access Journals (Sweden)

    M. Farshad

    2013-09-01

    This paper presents a novel method based on machine learning strategies for fault locating in high-voltage direct current (HVDC) transmission lines. In the proposed fault-location method, only post-fault voltage signals measured at one terminal are used for feature extraction. Because of the high dimension of the input feature vectors, two different estimators, the generalized regression neural network (GRNN) and the random forest (RF) algorithm, are examined to find the relation between the features and the fault location. The results of evaluation using training and test patterns, obtained by simulating various fault types in a long overhead transmission line with different fault locations, fault resistances, and pre-fault current values, indicate the efficiency and acceptable accuracy of the proposed approach.
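    A GRNN is essentially Gaussian-kernel-weighted regression over the stored training patterns, which can be sketched briefly. The feature vectors, fault distances, and kernel width below are synthetic stand-ins, not the paper's simulated HVDC patterns.

```python
import math

# Synthetic training set: 2-D feature vectors (stand-ins for features derived
# from one-terminal post-fault voltage signals) paired with fault distances.
train_x = [[0.1, 0.9], [0.3, 0.7], [0.5, 0.5], [0.7, 0.3], [0.9, 0.1]]
train_y = [10.0, 30.0, 50.0, 70.0, 90.0]   # fault distance, km (synthetic)

def grnn_predict(x, sigma=0.15):
    """GRNN output: kernel-weighted average of training targets."""
    weights = [math.exp(-sum((a - b) ** 2 for a, b in zip(x, xi))
                        / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

print(round(grnn_predict([0.4, 0.6]), 1))
```

    The single smoothing parameter sigma is the only quantity to tune, which is one reason GRNNs are attractive for high-dimensional pattern sets like the ones described above.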

  20. FTMP (Fault Tolerant Multiprocessor) programmer's manual

    Science.gov (United States)

    Feather, F. E.; Liceaga, C. A.; Padilla, P. A.

    1986-01-01

    The Fault Tolerant Multiprocessor (FTMP) computer system was constructed using the Rockwell/Collins CAPS-6 processor. It is installed in the Avionics Integration Research Laboratory (AIRLAB) of NASA Langley Research Center. It is hosted by AIRLAB's System 10, a VAX 11/750, for the loading of programs and experimentation. The FTMP support software includes a cross compiler for a high-level language called the Automated Engineering Design (AED) System, an assembler for the CAPS-6 processor assembly language, and a linker. Access to this support software is through an automated remote access facility on the VAX which relieves the user of the burden of learning how to use the IBM 4381. This manual is a compilation of information about the FTMP support environment. It explains the FTMP software and support environment along with many of the finer points of running programs on FTMP. This will be helpful to the researcher trying to run an experiment on FTMP and even to the person probing FTMP with fault injections. Much of the information in this manual can be found in other sources; we are only attempting to bring together the basic points in a single source. If the reader should need points clarified, there is a list of support documentation in the back of this manual.

  1. Vipava fault (Slovenia)

    Directory of Open Access Journals (Sweden)

    Ladislav Placer

    2008-06-01

    During mapping of the already completed Razdrto – Senožeče section of the motorway and geologic surveying of construction operations on the trunk road between Razdrto and Vipava, in the northwestern part of the External Dinarides on the southwestern slope of Mt. Nanos (called Rebrnice), a steep NW-SE striking fault was recognized, situated between the Predjama and Raša faults. The fault was named the Vipava fault after the town of Vipava. An analysis of subrecent gravitational slips at Rebrnice indicates that they were probably associated with the activity of this fault. Unpublished results of a repeated levelling line along the regional road passing across the Vipava fault zone suggest that it may be active at present. It would be meaningful to verify this by appropriate geodetic measurements and to study the actual gravitational slips at Rebrnice. The association between tectonics and gravitational slips in this and in similar extreme cases in the Alps and Dinarides points to the need for comprehensive study of geologic processes.

  2. Transformer fault diagnosis using continuous sparse autoencoder

    OpenAIRE

    Wang, Lukun; Zhao, Xiaoying; Pei, Jiangnan; Tang, Gongyou

    2016-01-01

    This paper proposes a novel continuous sparse autoencoder (CSAE) which can be used in unsupervised feature learning. The CSAE adds Gaussian stochastic unit into activation function to extract features of nonlinear data. In this paper, CSAE is applied to solve the problem of transformer fault recognition. Firstly, based on dissolved gas analysis method, IEC three ratios are calculated by the concentrations of dissolved gases. Then IEC three ratios data is normalized to reduce data singularity ...

  3. Manufacturing and automation

    Directory of Open Access Journals (Sweden)

    Ernesto Córdoba Nieto

    2010-04-01

    Full Text Available The article presents concepts and definitions from different sources concerning automation. The work approaches automation by virtue of the author’s experience in manufacturing production; why and how automation projects are embarked upon is considered. Technological reflection regarding the progressive advances or stages of automation in the production area is stressed. Coriat and Freyssenet’s thoughts about and approaches to the problem of automation and its current state are taken and examined, especially that referring to the problem’s relationship with reconciling the level of automation with the flexibility and productivity demanded by competitive, worldwide manufacturing.

  4. A morphogram with the optimal selection of parameters used in morphological analysis for enhancing the ability in bearing fault diagnosis

    International Nuclear Information System (INIS)

    Morphological analysis is a signal processing method that extracts the local morphological features of a signal by intersecting it with a structuring element (SE). When a bearing suffers from a localized fault, an impulse-type cyclic signal is generated. The amplitude and the cyclic time interval of the impacts could reflect the health status of the inspected bearing and the cause of defects, respectively. In this paper, an enhanced morphological analysis called ‘morphogram’ is presented for extracting the cyclic impacts caused by a certain bearing fault. Based on the theory of morphology, the morphogram is realized by simple mathematical operators, including Minkowski addition and subtraction. The morphogram is able to detect all possible fault intervals. The most likely fault-interval-based construction index (CI) is maximized to establish the optimal range of the flat SE for the extraction of bearing fault cyclic features so that the type and cause of bearing faults can be easily determined in the time domain. The morphogram has been validated by simulated bearing fault signals, real bearing fault signals collected from a laboratory rotary machine and an industrial bearing fault signal. The results show that the morphogram is able to detect all possible bearing fault intervals. Based on the most likely bearing fault interval shown on the morphogram, the CI is effective in determining the optimal parameters of the flat SE for the extraction of bearing fault cyclic features for bearing fault diagnosis. (paper)
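The flat-SE operations named in this abstract can be sketched on a 1-D signal. The code below is a minimal illustration, not the paper's morphogram algorithm: it implements dilation (Minkowski addition) and erosion (Minkowski subtraction) with a flat structuring element, then uses the closing-minus-opening difference, one common way to emphasise impulse-type features narrower than the SE. The synthetic signal, SE length and random seed are arbitrary choices for the example.

```python
import numpy as np

def dilate(x, n):
    """Flat-SE dilation (Minkowski addition): max over a length-n window."""
    pad = n // 2
    xp = np.pad(x, (pad, n - 1 - pad), mode="edge")
    return np.array([xp[i:i + n].max() for i in range(len(x))])

def erode(x, n):
    """Flat-SE erosion (Minkowski subtraction): min over a length-n window."""
    pad = n // 2
    xp = np.pad(x, (pad, n - 1 - pad), mode="edge")
    return np.array([xp[i:i + n].min() for i in range(len(x))])

def impulse_feature(x, n):
    """Closing minus opening: non-negative everywhere, and large where the
    signal contains impulse-type features narrower than the SE."""
    closing = erode(dilate(x, n), n)
    opening = dilate(erode(x, n), n)
    return closing - opening

# Synthetic bearing-like signal: an impact every 200 samples, buried in noise
rng = np.random.default_rng(0)
sig = 0.1 * rng.standard_normal(2000)
sig[::200] += 1.0
feat = impulse_feature(sig, 9)
print(float(feat[200]) > float(feat[100]))   # response peaks at the impacts
```

The cyclic interval of the extracted impacts (here 200 samples) is the quantity the morphogram scans over to identify the bearing fault interval.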

  5. A System for Fault Management and Fault Consequences Analysis for NASA's Deep Space Habitat

    Science.gov (United States)

    Colombano, Silvano; Spirkovska, Liljana; Baskaran, Vijaykumar; Aaseng, Gordon; McCann, Robert S.; Ossenfort, John; Smith, Irene; Iverson, David L.; Schwabacher, Mark

    2013-01-01

    NASA's exploration program envisions the utilization of a Deep Space Habitat (DSH) for human exploration of the space environment in the vicinity of Mars and/or asteroids. Communication latencies with ground control of as long as 20+ minutes make it imperative that DSH operations be highly autonomous, as any telemetry-based detection of a systems problem on Earth could well occur too late to assist the crew with the problem. A DSH-based development program has been initiated to develop and test the automation technologies necessary to support highly autonomous DSH operations. One such technology is a fault management tool to support performance monitoring of vehicle systems operations and to assist with real-time decision making in connection with operational anomalies and failures. Toward that end, we are developing Advanced Caution and Warning System (ACAWS), a tool that combines dynamic and interactive graphical representations of spacecraft systems, systems modeling, automated diagnostic analysis and root cause identification, system and mission impact assessment, and mitigation procedure identification to help spacecraft operators (both flight controllers and crew) understand and respond to anomalies more effectively. In this paper, we describe four major architecture elements of ACAWS: Anomaly Detection, Fault Isolation, System Effects Analysis, and Graphical User Interface (GUI), and how these elements work in concert with each other and with other tools to provide fault management support to both the controllers and crew. We then describe recent evaluations and tests of ACAWS on the DSH testbed. The results of these tests support the feasibility and strength of our approach to failure management automation and enhanced operational autonomy.

  6. Automated diagnostics scoping study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Quadrel, R.W.; Lash, T.A.

    1994-06-01

    The objective of the Automated Diagnostics Scoping Study was to investigate the needs for diagnostics in building operation and to examine some of the current technologies in automated diagnostics that can address these needs. The study was conducted in two parts. In the needs analysis, the authors interviewed facility managers and engineers at five building sites. In the technology survey, they collected published information on automated diagnostic technologies in commercial and military applications as well as on technologies currently under research. The following describe key areas that the authors identify for the research, development, and deployment of automated diagnostic technologies: tools and techniques to aid diagnosis during building commissioning, especially those that address issues arising from integrating building systems and diagnosing multiple simultaneous faults; technologies to aid diagnosis for systems and components that are unmonitored or unalarmed; automated capabilities to assist cause-and-effect exploration during diagnosis; inexpensive, reliable sensors, especially those that expand the current range of sensory input; technologies that aid predictive diagnosis through trend analysis; integration of simulation and optimization tools with building automation systems to optimize control strategies and energy performance; integration of diagnostic, control, and preventive maintenance technologies. By relating existing technologies to perceived and actual needs, the authors reached some conclusions about the opportunities for automated diagnostics in building operation. Some of a building operator's needs can be satisfied by off-the-shelf hardware and software. Other needs are not so easily satisfied, suggesting directions for future research. Their conclusions and suggestions are offered in the final section of this study.

  7. Automated Feature Extraction from Hyperspectral Imagery Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to NASA Topic S7.01, Visual Learning Systems, Inc. (VLS) will develop a novel hyperspectral plug-in toolkit for its award winning Feature Analyst®...

  8. Collection and analysis of existing information on applicability of investigation methods for estimation of beginning age of faulting in present faulting pattern

    International Nuclear Information System (INIS)

    In the field of R and D programs for the geological disposal of high-level radioactive waste, it is of great importance to develop a set of investigation and analysis techniques for the assessment of long-term geosphere stability over geological time, meaning that changes of the geological environment will not significantly impact the long-term safety of a geological disposal system. In the Japanese archipelago, crustal movements are so active that uplift and subsidence have been remarkable in the recent several hundreds of thousands of years. Therefore, it is necessary to assess long-term geosphere stability taking into account topographic change caused by crustal movements. One of the factors for topographic change is the movement of an active fault, which is a geological process that releases strain accumulated by plate motion. The beginning age of faulting in the present faulting pattern suggests the beginning age of neotectonic activities around the active fault, and also provides basic information for identifying the stage of geomorphic development of mountains. Therefore, the age of faulting in the present faulting pattern is important information for estimating future topographic change in the mountain regions of Japan. In this study, existing information related to methods for estimating the beginning age of faulting in the present faulting pattern of active faults was collected and reviewed. The principle of each method, points to note and technical know-how in applying the methods, data uncertainty, and so on were extracted from the existing information. Based on this extracted information, task-flows indicating the working process for estimating the beginning age of faulting of an active fault were illustrated for each method. Additionally, a distribution map of the beginning age (with accuracy) of faulting in the present faulting pattern of active faults was illustrated. (author)

  9. Development of an automated scoring system for plant comet assay

    Directory of Open Access Journals (Sweden)

    Bertrand Pourrut

    2015-05-01

    - nucleus density: increasing the density of nuclei is important for increasing scoring reliability (Sharma et al., 2012). In conclusion, increasing plant nucleus extraction yield and automated scoring of nuclei do represent big challenges. However, our promising preliminary results open up the perspective of automated high-throughput scoring of plant nuclei.

  10. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  11. Configuration Management Automation (CMA)

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  12. Fault tolerant model predictive control of open channels

    OpenAIRE

    Horváth, Klaudia; Blesa Izquierdo, Joaquim; Duviella, Eric; Chuquet, Karine

    2014-01-01

    Automated control of water systems (irrigation canals, navigation canals, rivers, etc.) relies on measured data. In the case of a feedback controller, the control action is calculated directly from the on-line measured data. If the measured data is corrupted, the calculated control action will have a different effect than desired. Therefore, it is crucial that the feedback controller receives good-quality measurement data. On-line fault detection techniques can be applied in order to dete...

  13. Image segmentation for automated dental identification

    Science.gov (United States)

    Haj Said, Eyad; Nassar, Diaa Eldin M.; Ammar, Hany H.

    2006-02-01

    Dental features are among the few biometric identifiers that qualify for postmortem identification; therefore, the creation of an Automated Dental Identification System (ADIS) with goals and objectives similar to those of the Automated Fingerprint Identification System (AFIS) has received increased attention. As a part of ADIS, teeth segmentation from dental radiograph films is an essential step in the identification process. In this paper, we introduce a fully automated approach for teeth segmentation with the goal of extracting at least one tooth from the dental radiograph film. We evaluate our approach on theoretical and empirical bases, and we compare its performance with the performance of other approaches introduced in the literature. The results show that our approach exhibits the lowest failure rate and the highest optimality among all fully automated approaches introduced in the literature.

  14. Validated Fault Tolerant Architectures for Space Station

    Science.gov (United States)

    Lala, Jaynarayan H.

    1990-01-01

    Viewgraphs on validated fault tolerant architectures for space station are presented. Topics covered include: fault tolerance approach; advanced information processing system (AIPS); and fault tolerant parallel processor (FTPP).

  15. Active Fault Isolation in MIMO Systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2014-01-01

    Active fault isolation of parametric faults in closed-loop MIMO systems is considered in this paper. The fault isolation consists of two steps. The first step is group-wise fault isolation: here, a group of faults is isolated from other possible faults in the system. The isolation is based directly on the input/output signals applied for the fault detection, and it is guaranteed that the fault group includes the fault that has occurred in the system. The second step is individual fault isolation within the fault group. Both types of isolation are obtained by applying dedicated...

  16. Transient Fault Locating Method Based on Line Voltage and Zero-mode Current in Non-solidly Earthed Network

    Institute of Scientific and Technical Information of China (English)

    ZHANG Linli; XU Bingyin; XUE Yongduan; GAO Houlei

    2012-01-01

    Non-solidly earthed systems are widely used in medium-voltage distribution networks at home and abroad. Fault location, especially for the single phase-to-earth fault, is very difficult because the fault current is very weak and the fault arc is intermittent. Although several methods have been developed, the problem of fault location has not yet been resolved very well. A new fault location method based on the transient components of line voltage and zero-mode current is presented in this paper, which can realize fault section location through the feeder automation (FA) system. The line voltage signal can be obtained conveniently without requiring any additional equipment. This method is based on transient information and is not affected by the arc suppression coil.
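The zero-mode current this method relies on is the zero-sequence component i0 = (ia + ib + ic)/3, which is (ideally) zero on a balanced feeder and becomes non-zero at a phase-to-earth fault. The sketch below illustrates only that detection principle on a synthetic signal; the sampling rate, fault transient shape and threshold are invented for the example and are not taken from the paper.

```python
import numpy as np

def zero_mode(ia, ib, ic):
    """Zero-mode (zero-sequence) current: i0 = (ia + ib + ic) / 3."""
    return (ia + ib + ic) / 3.0

fs = 10_000                       # sampling rate in Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)
w = 2 * np.pi * 50                # 50 Hz system
# Balanced phase currents: their sum, hence i0, is zero before the fault
ia = np.sin(w * t)
ib = np.sin(w * t - 2 * np.pi / 3)
ic = np.sin(w * t + 2 * np.pi / 3)
# Invented single phase-to-earth fault on phase A at t = 50 ms: a weak
# damped high-frequency transient plus a small steady earth current
after = t >= 0.05
fault = after * (0.2 * np.exp(-200 * (t - 0.05)) * np.sin(2 * np.pi * 800 * (t - 0.05))
                 + 0.05 * np.sin(w * t))
i0 = zero_mode(ia + 3 * fault, ib, ic)
# Fault inception: first sample where |i0| exceeds a small threshold
inception = float(t[np.argmax(np.abs(i0) > 0.01)])
print(f"fault detected at {inception * 1000:.1f} ms")
```

In a feeder automation system, comparing such transient zero-mode signatures recorded at successive switches is what allows the faulted section to be bracketed.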

  17. General Monte Carlo reliability simulation code including common mode failures and HARP fault/error-handling

    Science.gov (United States)

    Platt, M. E.; Lewis, E. E.; Boehm, F.

    1991-01-01

    A Monte Carlo Fortran computer program was developed that uses two variance reduction techniques for computing system reliability, applicable to solving very large, highly reliable fault-tolerant systems. The program is consistent with the hybrid automated reliability predictor (HARP) code, which employs behavioral decomposition and complex fault/error-handling models. This new capability, called MC-HARP, efficiently solves reliability models with non-constant failure rates (Weibull). Common mode failure modeling is also included.
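The combination of Monte Carlo simulation with non-constant (Weibull) failure rates can be illustrated on a small redundancy model. The sketch below is not MC-HARP: it estimates the reliability of a generic 2-of-3 (triple modular redundancy) arrangement with Weibull component lifetimes and checks the estimate against the closed form; the shape, scale and mission-time parameters are arbitrary.

```python
import numpy as np

def simulate_tmr_reliability(t_mission, shape, scale, trials=100_000, seed=1):
    """Monte Carlo estimate of 2-of-3 (TMR) system reliability with Weibull
    component lifetimes: the system survives while at least 2 of its 3
    components are alive, so it fails at the 2nd component failure."""
    rng = np.random.default_rng(seed)
    lifetimes = scale * rng.weibull(shape, size=(trials, 3))
    system_life = np.sort(lifetimes, axis=1)[:, 1]   # second-smallest lifetime
    return float(np.mean(system_life > t_mission))

# Closed form for comparison: component R(t) = exp(-(t/scale)**shape),
# and 2-of-3 system reliability = 3 R^2 (1 - R) + R^3
R = np.exp(-(0.5 / 1.0) ** 2.0)
analytic = 3 * R**2 * (1 - R) + R**3
estimate = simulate_tmr_reliability(0.5, shape=2.0, scale=1.0)
print(round(float(analytic), 4), round(estimate, 4))
```

Variance reduction techniques, as used in MC-HARP, become essential when the failure probability is so small that plain sampling like this would need impractically many trials.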

  18. Temporal data mining for root-cause analysis of machine faults in automotive assembly lines

    OpenAIRE

    Laxman, Srivatsan; Shadid, Basel; Sastry, P. S.; Unnikrishnan, K. P.

    2009-01-01

    Engine assembly is a complex and heavily automated distributed-control process, with large amounts of faults data logged everyday. We describe an application of temporal data mining for analyzing fault logs in an engine assembly plant. Frequent episode discovery framework is a model-free method that can be used to deduce (temporal) correlations among events from the logs in an efficient manner. In addition to being theoretically elegant and computationally efficient, frequent episodes are als...

  19. An Effective Strategy to Build Up a Balanced Test Suite for Spectrum-Based Fault Localization

    OpenAIRE

    Ning Li; Rui Wang; Yu-li Tian; Wei Zheng

    2016-01-01

    During past decades, many automated software faults diagnosis techniques including Spectrum-Based Fault Localization (SBFL) have been proposed to improve the efficiency of software debugging activity. In the field of SBFL, suspiciousness calculation is closely related to the number of failed and passed test cases. Studies have shown that the ratio of the number of failed and passed test case has more significant impact on the accuracy of SBFL than the total number of test cases, and a balance...
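Suspiciousness in SBFL is computed per program element from the counts of failed and passed test cases that cover it. As a concrete example of such a metric, the sketch below uses the Ochiai formula, a standard choice in the SBFL literature and not necessarily the one used in this paper, on toy coverage data:

```python
import math

def ochiai(ef, ep, total_failed):
    """Ochiai suspiciousness: ef / sqrt(total_failed * (ef + ep)), where
    ef/ep are the numbers of failed/passed tests covering the element."""
    denom = math.sqrt(total_failed * (ef + ep))
    return ef / denom if denom else 0.0

# Toy spectra: each row is (coverage of statements s0..s2, test passed?);
# s2 is the seeded fault in this example.
spectra = [
    ([1, 1, 0], True),
    ([1, 0, 1], False),
    ([1, 1, 1], False),
    ([1, 1, 0], True),
]
total_failed = sum(1 for _, ok in spectra if not ok)
scores = []
for s in range(3):
    ef = sum(cov[s] for cov, ok in spectra if not ok)
    ep = sum(cov[s] for cov, ok in spectra if ok)
    scores.append(ochiai(ef, ep, total_failed))
print([round(x, 3) for x in scores])   # the faulty statement s2 ranks first
```

Because ef and ep enter the formula directly, skewing the ratio of failed to passed tests shifts every score, which is why test-suite balance matters for ranking accuracy.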

  20. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  1. Fault-Tolerant Control using Adaptive Time-Frequency Method in Bearing Fault Detection for DFIG Wind Energy System

    Directory of Open Access Journals (Sweden)

    Suratsavadee Koonlaboon KORKUA

    2015-02-01

    Full Text Available With the advances in power electronic technology, doubly-fed induction generators (DFIG have increasingly drawn the interest of the wind turbine industry. To ensure the reliable operation and power quality of wind power systems, the fault-tolerant control for DFIG is studied in this paper. The fault-tolerant controller is designed to maintain an acceptable level of performance during bearing fault conditions. Based on measured motor current data, an adaptive statistical time-frequency method is then used to detect the fault occurrence in the system; the controller then compensates for faulty conditions. The feature vectors, including frequency components located in the neighborhood of the characteristic fault frequencies, are first extracted and then used to estimate the next sampling stator side current, in order to better perform the current control. Early fault detection, isolation and successful reconfiguration would be very beneficial in a wind energy conversion system. The feasibility of this fault-tolerant controller has been proven by means of mathematical modeling and digital simulation based on Matlab/Simulink. The simulation results of the generator output show the effectiveness of the proposed fault-tolerant controller.
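The characteristic bearing fault frequencies mentioned above follow from bearing geometry via standard kinematic formulas; feature vectors are then extracted around these frequencies. A sketch, with made-up geometry values for illustration:

```python
import math

def bearing_fault_frequencies(fr, n_balls, d_ball, d_pitch, contact_deg=0.0):
    """Characteristic bearing defect frequencies (Hz) from geometry.
    fr: shaft rotation frequency (Hz); n_balls: number of rolling elements;
    d_ball, d_pitch: rolling-element and pitch diameters (same unit)."""
    r = (d_ball / d_pitch) * math.cos(math.radians(contact_deg))
    return {
        "BPFO": n_balls / 2 * fr * (1 - r),               # outer-race defect
        "BPFI": n_balls / 2 * fr * (1 + r),               # inner-race defect
        "FTF": fr / 2 * (1 - r),                          # cage (train) frequency
        "BSF": d_pitch / (2 * d_ball) * fr * (1 - r**2),  # rolling-element spin
    }

# Invented geometry: 30 Hz shaft, 9 balls of 7.9 mm on a 38.5 mm pitch circle
freqs = bearing_fault_frequencies(30.0, 9, 7.9, 38.5)
print({k: round(v, 1) for k, v in freqs.items()})
```

A time-frequency detector then looks for energy concentrated in the neighborhood of these frequencies in the measured current or vibration spectrum.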

  2. Mapping the hydrogeology of faults

    International Nuclear Information System (INIS)

    Faults have been characterized both as barriers and as conduits to fluid flow. Reconciling these paradoxical views of the hydrogeological significance of faults is of critical importance in evaluating how faults should be considered in performance assessments of radioactive waste disposal systems. A principal cause of uncertainty in the long-term movement of fluid at Sellafield, the site being investigated by Nirex as a potential location for a deep repository, concerns the hydrogeological role of faults and their related fracture systems. When considering the hydraulic importance of faults, the volume of rock through which damage has occurred should be investigated, rather than just the fault itself.

  3. Fault location on power networks

    CERN Document Server

    Saha, Murari Mohan

    2009-01-01

    Fault Location on Power Lines enables readers to pinpoint the location of a fault on power lines following a disturbance. The nine chapters are organised according to the design of different locators. The authors do not simply refer the reader to manufacturers' documentation, but instead have compiled detailed information to allow for in-depth comparison. Fault Location on Power Lines describes basic algorithms used in fault locators, focusing on fault location on overhead transmission lines, but also covering fault location in distribution networks. An application of artificial intelligence i

  4. Automated quantification and analysis of mandibular asymmetry

    DEFF Research Database (Denmark)

    Darvann, T. A.; Hermann, N. V.; Larsen, P.; Ólafsdóttir, Hildur; Hansen, I. V.; Hove, H. D.; Christensen, L.; Rueckert, D.; Kreiborg, S.

    We present an automated method of spatially detailed 3D asymmetry quantification in mandibles extracted from CT and apply it to a population of infants with unilateral coronal synostosis (UCS). An atlas-based method employing non-rigid registration of surfaces is used for determining deformation...

  5. Automated Characterization Of Vibrations Of A Structure

    Science.gov (United States)

    Bayard, David S.; Yam, Yeung; Mettler, Edward; Hadaegh, Fred Y.; Milman, Mark H.; Scheid, Robert E.

    1992-01-01

    Automated method of characterizing dynamical properties of large flexible structure yields estimates of modal parameters used by robust control system to stabilize structure and minimize undesired motions. Based on extraction of desired modal and control-design data from responses of structure to known vibrational excitations. Applicable to terrestrial structures where vibrations are important - aircraft, buildings, bridges, cranes, and drill strings.

  6. Cable fault locator research

    Science.gov (United States)

    Cole, C. A.; Honey, S. K.; Petro, J. P.; Phillips, A. C.

    1982-07-01

    Cable fault location and the construction of four field test units are discussed. Swept frequency sounding of mine cables with RF signals was the technique most thoroughly investigated. The swept frequency technique is supplemented with a form of moving target indication to provide a method for locating the position of a technician along a cable and relative to a suspected fault. Separate, more limited investigations involved high voltage time domain reflectometry and acoustical probing of mine cables. Particular areas of research included microprocessor-based control of the swept frequency system, a microprocessor based fast Fourier transform for spectral analysis, and RF synthesizers.

  7. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.

  8. Fault Tolerant Computer Architecture

    CERN Document Server

    Sorin, Daniel

    2009-01-01

    For many years, most computer architects have pursued one primary goal: performance. Architects have translated the ever-increasing abundance of ever-faster transistors provided by Moore's law into remarkable increases in performance. Recently, however, the bounty provided by Moore's law has been accompanied by several challenges that have arisen as devices have become smaller, including a decrease in dependability due to physical faults. In this book, we focus on the dependability challenge and the fault tolerance solutions that architects are developing to overcome it. The two main purposes

  9. Study on Fault Current of DFIG during Slight Fault Condition

    OpenAIRE

    Xiangping Kong; Zhe Zhang; Xianggen Yin; Zhenxing Li

    2013-01-01

    In order to ensure the safety of DFIG when severe fault happens, crowbar protection is adopted. But during slight fault condition, the crowbar protection will not trip, and the DFIG is still excited by AC-DC-AC converter. In this condition, operation characteristics of the converter have large influence on the fault current characteristics of DFIG. By theoretical analysis and digital simulation, the fault current characteristics of DFIG during slight voltage dips are studied. And the influenc...

  10. Shoe-String Automation

    Energy Technology Data Exchange (ETDEWEB)

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  11. Repeated extraction of DNA from FTA cards

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Ferrero, Laura; Børsting, Claus;

    2011-01-01

    Extraction of DNA using magnetic bead based techniques on automated DNA extraction instruments provides a fast, reliable and reproducible method for DNA extraction from various matrices. However, the yield of extracted DNA from FTA-cards is typically low. Here, we demonstrate that it is possible...

  12. Fault Management Assistant (FMA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — S&K Aerospace (SKA) proposes to develop the Fault Management Assistant (FMA) to aid project managers and fault management engineers in developing better and...

  13. The extraction of near-field deformation features along the fault zone based on PS-InSAR survey (基于PS-InSAR技术的断裂带近场变形特征提取)

    Institute of Scientific and Technical Information of China (English)

    李凌婧; 姚鑫; 张永双; 王桂杰; 郭长宝

    2015-01-01

    Based on PS-InSAR (Persistent Scatterer Interferometric Synthetic Aperture Radar) technology and using L-band data, the authors conducted a survey of near-field deformation around the Bamei-Daofu section of the Xianshuihe active fault from 2007 to 2011 and, based on analysis in combination with other materials, inferred some complex fault near-field deformation information: ① the deformation velocity of the northern section is larger than that of the southern section, and the velocities on the two sides of the fault differ somewhat from each other: the velocity of the SW wall is larger than that of the NE wall, the velocity difference in the far-field is more significant, and the velocity in the near-field is feeble; ② in the area close to the active fault zone, the PS (Persistent Scatterer) point deformation velocities are mainly comparatively small negative and positive values, reflecting surface ascent and suggesting that the location consists mainly of wetland, exposed points of ground water, banks and gullies. It is inferred that these phenomena are attributable to surface bulging and deformation caused by climate warming, glacier melting and the rise of the ground water level, to the uplift tendency of wetland resulting from seasonal frost heaving, and to a certain expansibility of the cataclastic rock and soil near the fault zone; ③ the uplift deformation around the Zhonggu-Bamei section results from thrust movement near the Xianshuihe fault, and the ductile shear zone absorbs and coordinates the entire block deformation; ④ high-deformation PS blocks reflect slope gravity deformation, especially in the Daofu-Shonglinkou and Qianning basin-Longdengba sections, revealing the geohazard effects of the fault; ⑤ the precise PS-InSAR results show that the deformation of the fault is complex and differs remarkably between sections, periods and tectonic locations, so the movement cannot simply be considered overall translation or elevation-subsidence with the fault zone as the boundary.

  14. Improving Multiple Fault Diagnosability using Possible Conflicts

    Data.gov (United States)

    National Aeronautics and Space Administration — Multiple fault diagnosis is a difficult problem for dynamic systems. Due to fault masking, compensation, and relative time of fault occurrence, multiple faults can...

  15. Investigating multiple fault rupture at the Salar del Carmen segment of the Atacama Fault System (northern Chile): Fault scarp morphology and knickpoint analysis

    Science.gov (United States)

    Ewiak, Oktawian; Victor, Pia; Oncken, Onno

    2015-02-01

    This study presents a new geomorphological approach to investigate the past activity and potential seismic hazard of upper crustal faults at the Salar del Carmen segment of the Atacama Fault System in the northern Chile forearc. Our contribution is based on the analysis of a large set of topographic profiles and allows extrapolating fault analysis from a few selected locations to distances of kilometers along strike of the fault. We detected subtle changes in the fault scarp geometry which may represent the number of paleoearthquakes experienced by the structure and extracted the cumulative and last incremental displacement along strike of the investigated scarps. We also tested the potential of knickpoints in channels crossing the fault scarps as markers for repeated fault rupture and proxies for seismic displacement. The number of paleoearthquakes derived from our analysis is 2-3, well in agreement with recent paleoseismological investigations, which suggest 2-3 earthquakes (Mw = 6.5-6.7) at the studied segments. Knickpoints record the number of events for about 55% of the analyzed profile pairs. Only a few knickpoints represent the full seismic displacement, while most retain only a fraction of the displacement. The along-strike displacement distributions suggest fault growth from the center toward the tips and linkage of individual ruptures. Our approach also improves the estimation of paleomagnitudes in the case of multiple fault rupture by allowing us to quantify the last increment of displacement separately. Paleomagnitudes calculated from total segment length and the last increment of displacement (Mw = 6.5-7.1) are in agreement with paleoseismological results.
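Paleomagnitudes are commonly estimated from rupture dimensions with empirical regressions. As an illustration only, the sketch below uses the widely cited Wells & Coppersmith (1994) all-slip-type surface-rupture-length relation, which may differ from the regression the authors applied:

```python
import math

def mw_from_srl(srl_km):
    """Moment magnitude from surface rupture length (km) using the
    Wells & Coppersmith (1994) all-slip-type regression:
    Mw = 5.08 + 1.16 * log10(SRL)."""
    return 5.08 + 1.16 * math.log10(srl_km)

# A hypothetical ~30 km segment rupturing end to end:
print(round(mw_from_srl(30.0), 2))
```

For a segment tens of kilometers long, such relations yield magnitudes in the Mw 6.5-7 range, consistent with the estimates quoted in the abstract.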

  16. Transformer fault diagnosis using continuous sparse autoencoder.

    Science.gov (United States)

    Wang, Lukun; Zhao, Xiaoying; Pei, Jiangnan; Tang, Gongyou

    2016-01-01

    This paper proposes a novel continuous sparse autoencoder (CSAE) which can be used in unsupervised feature learning. The CSAE adds a Gaussian stochastic unit into the activation function to extract features of nonlinear data. In this paper, CSAE is applied to solve the problem of transformer fault recognition. Firstly, based on the dissolved gas analysis method, the IEC three ratios are calculated from the concentrations of dissolved gases. Then the IEC three-ratio data is normalized to reduce data singularity and improve training speed. Secondly, a deep belief network is established from two layers of CSAE and one layer of back propagation (BP) network. Thirdly, CSAE is adopted for unsupervised training to obtain features. Then the BP network is used for supervised training to identify the transformer fault. Finally, experimental data from the IEC TC 10 dataset are used to illustrate the effectiveness of the presented approach. Comparative experiments clearly show that CSAE can extract features from the original data, and achieve a superior correct differentiation rate on transformer fault diagnosis. PMID:27119052
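The IEC three-ratio step can be sketched directly: the ratios are C2H2/C2H4, CH4/H2 and C2H4/C2H6, computed from dissolved-gas concentrations. In the sketch below the gas values are toy numbers, and the min-max scaling is a simple stand-in since the paper's exact normalization is not given here:

```python
import numpy as np

def iec_three_ratios(h2, ch4, c2h6, c2h4, c2h2):
    """IEC 60599 three ratios from dissolved-gas concentrations (ppm):
    R1 = C2H2/C2H4, R2 = CH4/H2, R3 = C2H4/C2H6."""
    eps = 1e-12                       # guard against division by zero
    return np.array([c2h2 / (c2h4 + eps),
                     ch4 / (h2 + eps),
                     c2h4 / (c2h6 + eps)])

# Toy DGA samples (ppm), ordered [H2, CH4, C2H6, C2H4, C2H2]
samples = np.array([
    [100.0, 120.0, 60.0, 50.0, 1.0],
    [200.0, 30.0, 15.0, 80.0, 90.0],
])
ratios = np.array([iec_three_ratios(*s) for s in samples])
# Min-max normalisation per ratio (a stand-in for the paper's step)
lo, hi = ratios.min(axis=0), ratios.max(axis=0)
normalized = (ratios - lo) / (hi - lo)
print(normalized.round(3))
```

The normalized three-ratio vectors are what would be fed to the CSAE layers for unsupervised feature learning.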

  17. Causes of automotive turbocharger faults

    OpenAIRE

    Jan FILIPCZYK

    2013-01-01

    This paper presents the results of examinations of turbocharger damage. The analysis of the causes of faults in 100 engines with turbochargers of cars, buses and trucks has been carried out. The incidence and structure of turbocharged engine faults have been compared to the causes of faults of naturally aspirated engines. For each case of fault, the cause of damage, the possibility of early detection, the time between overhauls and the impact on engine operation were analyzed as well. The re...

  18. Electromagnetic Transient Response Analysis of DFIG under Cascading Grid Faults Considering Phase Angle Jumps

    DEFF Research Database (Denmark)

    Wang, Yun; Wu, Qiuwei

    2014-01-01

    This paper analyzes the electromagnetic transient response characteristics of the DFIG under symmetrical and asymmetrical cascading grid fault conditions, considering the phase angle jump of the grid. By deriving the dynamic equations of the DFIG under multiple constraints on balanced and unbalanced...... conditions, phase angle jumps, the interval of cascading faults, and electromagnetic transient characteristics, the principle of the DFIG response under cascading voltage faults can be extracted. The influence of the grid angle jump on the transient characteristics of the DFIG is analyzed and the electromagnetic response...

  19. Extracting Tag Hierarchies

    OpenAIRE

    Tibély, Gergely; Pollner, Péter; Vicsek, Tamás; Palla, Gergely

    2013-01-01

    Tagging items with descriptive annotations or keywords is a very natural way to compress and highlight information about the properties of the given entity. Over the years several methods have been proposed for extracting a hierarchy between the tags for systems with a "flat", egalitarian organization of the tags, which is very common when the tags correspond to free words given by numerous independent people. Here we present a complete framework for automated tag hierarchy extraction based o...

  20. Network Power Fault Detection

    OpenAIRE

    Siviero, Claudio

    2013-01-01

    Network power fault detection. At least one first network device is instructed to temporarily disconnect from a power supply path of a network, and at least one characteristic of the power supply path of the network is measured at a second network device connected to the network while the at least one first network device is temporarily disconnected from the network

  1. Fault detection and fault-tolerant control using sliding modes

    CERN Document Server

    Alwi, Halim; Tan, Chee Pin

    2011-01-01

    ""Fault Detection and Fault-tolerant Control Using Sliding Modes"" is the first text dedicated to showing the latest developments in the use of sliding-mode concepts for fault detection and isolation (FDI) and fault-tolerant control in dynamical engineering systems. It begins with an introduction to the basic concepts of sliding modes to provide a background to the field. This is followed by chapters that describe the use and design of sliding-mode observers for FDI using robust fault reconstruction. The development of a class of sliding-mode observers is described from first principles throug

  2. Simultaneous-Fault Diagnosis of Gearboxes Using Probabilistic Committee Machine

    Directory of Open Access Journals (Sweden)

    Jian-Hua Zhong

    2016-02-01

    Full Text Available This study combines signal de-noising, feature extraction, two pairwise-coupled relevance vector machines (PCRVMs) and particle swarm optimization (PSO) for parameter optimization to form an intelligent diagnostic framework for gearbox fault detection. Firstly, the noises of sensor signals are de-noised by using the wavelet threshold method to lower the noise level. Then, the Hilbert-Huang transform (HHT) and energy pattern calculation are applied to extract the fault features from de-noised signals. After that, an eleven-dimension vector, which consists of the energies of nine intrinsic mode functions (IMFs), maximum value of HHT marginal spectrum and its corresponding frequency component, is obtained to represent the features of each gearbox fault. The two PCRVMs serve as two different fault detection committee members, and they are trained by using vibration and sound signals, respectively. The individual diagnostic result from each committee member is then combined by applying a new probabilistic ensemble method, which can improve the overall diagnostic accuracy and increase the number of detectable faults as compared to individual classifiers acting alone. The effectiveness of the proposed framework is experimentally verified by using test cases. The experimental results show the proposed framework is superior to existing single classifiers in terms of diagnostic accuracies for both single- and simultaneous-faults in the gearbox.
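
    The eleven-dimension feature vector described above (nine IMF energies plus the marginal-spectrum peak and its frequency) can be sketched as follows. A plain FFT magnitude spectrum stands in for the HHT marginal spectrum, and the IMFs are synthetic placeholders; a real pipeline would obtain IMFs via empirical mode decomposition (e.g. PyEMD) and apply the Hilbert transform:

```python
import numpy as np

def feature_vector(imfs, signal, fs):
    """11-D feature vector in the spirit of the abstract: energies of nine
    IMFs, plus the spectrum peak magnitude and its frequency. The FFT
    magnitude used here is only a stand-in for the HHT marginal spectrum."""
    energies = [float(np.sum(imf ** 2)) for imf in imfs[:9]]
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    k = int(np.argmax(spec))
    return np.array(energies + [spec[k], freqs[k]])

# Toy example: nine synthetic "IMFs" and a 50 Hz-dominated signal
rng = np.random.default_rng(0)
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
imfs = [rng.standard_normal(t.size) * (0.5 ** i) for i in range(9)]
fv = feature_vector(imfs, signal, fs)
print(fv.shape)  # (11,); fv[-1] is the dominant frequency (~50 Hz here)
```

    Each gearbox fault class would contribute such vectors, which then train the two PCRVM committee members.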

  3. Fault Diagnosis of Batch Reactor Using Machine Learning Methods

    Directory of Open Access Journals (Sweden)

    Sujatha Subramanian

    2014-01-01

    Full Text Available Fault diagnosis of a batch reactor gives the early detection of faults and minimizes the risk of thermal runaway. It provides superior performance and helps to improve safety and consistency. It has become more vital in this technical era. In this paper, a support vector machine (SVM) is used to estimate the heat release (Qr) of the batch reactor in both normal and faulty conditions. The signature of the residual, which is obtained from the difference between nominal and estimated faulty Qr values, characterizes the different natures of faults occurring in the batch reactor. Appropriate statistical and geometric features are extracted from the residual signature, and the total number of features is reduced using an SVM attribute selection filter and principal component analysis (PCA). Artificial neural network (ANN) classifiers like the multilayer perceptron (MLP), radial basis function (RBF), and Bayes net are used to classify the different types of faults from the reduced features. The comparative study shows that the proposed method for fault diagnosis, with a limited number of features extracted from only one estimated parameter (Qr), is more efficient and faster at diagnosing the typical faults.

  4. Simultaneous-Fault Diagnosis of Gearboxes Using Probabilistic Committee Machine.

    Science.gov (United States)

    Zhong, Jian-Hua; Wong, Pak Kin; Yang, Zhi-Xin

    2016-01-01

    This study combines signal de-noising, feature extraction, two pairwise-coupled relevance vector machines (PCRVMs) and particle swarm optimization (PSO) for parameter optimization to form an intelligent diagnostic framework for gearbox fault detection. Firstly, the noises of sensor signals are de-noised by using the wavelet threshold method to lower the noise level. Then, the Hilbert-Huang transform (HHT) and energy pattern calculation are applied to extract the fault features from de-noised signals. After that, an eleven-dimension vector, which consists of the energies of nine intrinsic mode functions (IMFs), maximum value of HHT marginal spectrum and its corresponding frequency component, is obtained to represent the features of each gearbox fault. The two PCRVMs serve as two different fault detection committee members, and they are trained by using vibration and sound signals, respectively. The individual diagnostic result from each committee member is then combined by applying a new probabilistic ensemble method, which can improve the overall diagnostic accuracy and increase the number of detectable faults as compared to individual classifiers acting alone. The effectiveness of the proposed framework is experimentally verified by using test cases. The experimental results show the proposed framework is superior to existing single classifiers in terms of diagnostic accuracies for both single- and simultaneous-faults in the gearbox. PMID:26848665

  5. Road Features Extraction Using Terrestrial Mobile Laser Scanning System

    OpenAIRE

    Kumar, Pankaj

    2012-01-01

    In this thesis, we present the experimental research and key contributions we have made in the field of road feature extraction from LiDAR data. We detail the development of three automated algorithms for the extraction of road features from terrestrial mobile LiDAR data. LiDAR data is a rich source of 3D geo-referenced information whose volume and scale have inhibited the development of automated algorithms. Automated feature extraction algorithms enable the wider geospatia...

  6. Enhancing Seismic Calibration Research Through Software Automation

    Energy Technology Data Exchange (ETDEWEB)

    Ruppert, S; Dodge, D; Elliott, A; Ganzberger, M; Hauk, T; Matzel, E; Ryall, F

    2004-07-09

    unpredictable event observations. Even partial automation of this second tier, through development of prototype tools to extract observations and make many thousands of scientific measurements, has significantly increased the efficiency of the scientists who construct and validate integrated calibration surfaces. This achieved gain in efficiency and quality control is likely to continue and even accelerate through continued application of information science and scientific automation. Data volume and calibration research requirements have increased by several orders of magnitude over the past decade. Whereas it was possible for individual researchers to download individual waveforms and make time-consuming measurements event by event in the past, with the Terabytes of data available today, a software automation framework must exist to efficiently populate and deliver quality data to the researcher. This framework must also simultaneously provide the researcher with robust measurement and analysis tools that can handle and extract groups of events effectively and isolate the researcher from the now onerous task of database management and metadata collection necessary for validation and error analysis. We have succeeded in automating many of the collection, parsing, reconciliation and extraction tasks, individually. Several software automation prototypes have been produced and have resulted in demonstrated gains in efficiency of producing scientific data products. Future software automation tasks will continue to leverage database and information management technologies in addressing additional scientific calibration research tasks.

  7. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  8. Fault tolerant architecture for artificial olfactory system

    International Nuclear Information System (INIS)

    In this paper, a novel architecture is proposed to cover and mask faults that occur in the sensing unit of an artificial olfactory system. The proposed architecture is able to tolerate failures in the sensors of the array, and the faults that occur are masked. By extracting the correct results from the output of the sensors, the proposed architecture can provide quality of service for the data generated from the sensor array. The results of various evaluations and analyses show that the proposed architecture has acceptable performance in comparison with the classic form of the sensor array in gas identification. According to the results, achieving high odor discrimination based on the suggested architecture is possible. (paper)
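
    One simple way to realize the kind of fault masking the abstract describes is consensus voting across the redundant sensor array. The median/MAD rule below is an illustrative assumption, not the paper's actual architecture:

```python
import numpy as np

def mask_faulty_readings(readings, tol=3.0):
    """Mask sensor readings that deviate strongly from the array consensus.
    Uses a median/MAD outlier rule as a stand-in for the paper's masking
    architecture; returns the fused value and a per-sensor 'good' mask."""
    r = np.asarray(readings, dtype=float)
    med = np.median(r)
    mad = np.median(np.abs(r - med)) or 1e-9   # avoid division by zero
    good = np.abs(r - med) / mad <= tol
    return r[good].mean(), good

# Eight gas sensors, one stuck at a wildly wrong value (index 4)
readings = [2.1, 2.0, 2.2, 2.05, 9.9, 2.15, 1.95, 2.1]
value, good = mask_faulty_readings(readings)
print(round(value, 2), list(good))  # ~2.08, sensor 4 masked out
```

    The fused value is computed only from sensors that agree with the consensus, so a stuck or drifting sensor no longer corrupts the odor-discrimination stage downstream.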

  9. Landscape response to normal fault growth and linkage in the Southern Apennines, Italy.

    Science.gov (United States)

    Roda-Boluda, Duna; Whittaker, Alex

    2016-04-01

    It is now well-established that landscape can record spatial and temporal variations in tectonic rates. However, decoding this information to extract detailed histories of fault growth is often a complex problem that requires careful integration of tectonic and geomorphic data sets. Here, we present new data addressing both normal fault evolution and coupled landscape response for two normal faults in the Southern Apennines: the Vallo di Diano and East Agri faults. By integrating published constraints with new data, we show that these faults have total throws of up to 2100 m, and Holocene throw rates of up to 1 mm/yr at their maximum. We demonstrate that geomorphology is effectively recording tectonics, with relief, channel and catchment slopes varying along fault strike as normal fault activity does. Therefore, valuable information about fault growth and interaction can be extracted from their geomorphic expression. We use the spatial distribution of knickpoints on the footwall channels to infer two episodes of base level change, which can be associated with distinct fault interaction events. From our detailed fault throw profiles, we reconstruct the amount of throw accumulated after each of these events, and the segments involved in each, and we use slip rate enhancement factors derived from fault interaction theory to estimate the magnitude of the tectonic perturbation in each case. From this approach, we are able to reconstruct pre-linkage throw rates, and we estimate that fault linkage events likely took place 0.7 ± 0.2 Ma and 1.9 ± 0.6 Ma in the Vallo di Diano fault, and 1.1 ± 0.1 and 2.3 ± 0.9 Ma in the East Agri fault. Our study suggests that both faults started their activity at 3.6 ± 0.5 Ma. These fault linkage scenarios are consistent with the knickpoint heights, and may relate to soft-linkage interaction with the Southern Apennines normal fault array, the existence of which has been the subject of considerable debate. Our combined geomorphic and

  10. Statistical method for the determination of equivalence of automated test procedures

    OpenAIRE

    Norman Wiggins; Gorko, Mary A.; Jennifer Llewelyn; K. Rick Lung

    2003-01-01

    In the development of test methods for solid dosage forms, manual test procedures for assay and content uniformity often precede the development of automated test procedures. Since the mode of extraction for automated test methods is often slightly different from that of the manual test method, additional validation of an automated test method is usually required. In addition to compliance with validation guidelines, developers of automated test methods are often asked to demonstrate equivale...

  11. Nonlinear fault diagnosis method based on kernel principal component analysis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Zhang Chunkai; Shao Huihe

    2005-01-01

    To ensure that a system runs in working order, the detection and diagnosis of faults play an important role in industrial processes. This paper proposes a nonlinear fault diagnosis method based on kernel principal component analysis (KPCA). In the proposed method, using the essential information of the nonlinear system extracted by KPCA, we construct a KPCA model of the nonlinear system under normal working conditions. New data are then projected onto the KPCA model. When new data are incompatible with the KPCA model, it can be concluded that the nonlinear system is out of normal working condition. The proposed method was applied to fault diagnosis on rolling bearings. Simulation results show that the proposed method provides an effective approach for fault detection and diagnosis of nonlinear systems.
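
    The scheme described above (fit KPCA on normal data, flag new samples that do not fit the model) can be sketched with NumPy. The RBF kernel, component count, and quantile threshold below are illustrative assumptions; the monitoring statistic is the feature-space reconstruction error (SPE):

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KPCAMonitor:
    """Minimal KPCA process monitor: fit on normal operating data, then
    flag samples whose feature-space reconstruction error is abnormal."""

    def __init__(self, gamma=0.5, n_comp=3):
        self.gamma, self.n_comp = gamma, n_comp

    def fit(self, X):
        self.X = np.asarray(X, float)
        n = len(self.X)
        K = rbf_kernel(self.X, self.X, self.gamma)
        J = np.ones((n, n)) / n
        Kc = K - J @ K - K @ J + J @ K @ J        # double-center the kernel
        vals, vecs = np.linalg.eigh(Kc)
        order = np.argsort(vals)[::-1][: self.n_comp]
        self.A = vecs[:, order] / np.sqrt(vals[order])  # unit-norm components
        self.K, self.J = K, J
        self.spe_limit = np.quantile(self._spe(self.X), 0.99)
        return self

    def _spe(self, Xnew):
        Xnew = np.asarray(Xnew, float)
        n, m = len(self.X), len(Xnew)
        Kt = rbf_kernel(Xnew, self.X, self.gamma)
        Jt = np.ones((m, n)) / n
        Ktc = Kt - Jt @ self.K - Kt @ self.J + Jt @ self.K @ self.J
        T = Ktc @ self.A                           # scores on retained PCs
        phi_norm = 1.0 - 2.0 * Kt.mean(1) + self.K.mean()  # ||phi_c(x)||^2
        return phi_norm - (T ** 2).sum(1)

    def is_faulty(self, Xnew):
        return self._spe(Xnew) > self.spe_limit

rng = np.random.default_rng(2)
normal = rng.normal(0, 0.3, (200, 2))   # normal operating data
faulty = rng.normal(3, 0.3, (20, 2))    # shifted "faulty" data
mon = KPCAMonitor().fit(normal)
print(mon.is_faulty(faulty).mean())     # close to 1.0: faults are flagged
```

    Samples that are incompatible with the normal-condition model land far from the retained kernel components, so their SPE exceeds the threshold learned from normal data.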

  12. Bearing fault diagnosis based on spectrum images of vibration signals

    International Nuclear Information System (INIS)

    Bearing fault diagnosis has been a challenge in the monitoring activities of rotating machinery, and it is receiving more and more attention. Conventional fault diagnosis methods usually extract features from the waveforms or spectra of vibration signals in order to correctly classify faults. In this paper, a novel feature in the form of images is presented, namely the analysis of the spectrum images of vibration signals. The spectrum images are simply obtained by fast Fourier transformation. Such images are processed with two-dimensional principal component analysis (2DPCA) to reduce the dimensions, and then a minimum distance method is applied to classify the faults of bearings. The effectiveness of the proposed method is verified with experimental data. (paper)
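
    The pipeline above (FFT spectrum images, 2DPCA reduction, minimum distance classification) can be sketched end to end. How the 1-D spectrum is folded into an image, and all sizes and frequencies below, are illustrative assumptions:

```python
import numpy as np

def spectrum_image(sig, rows=8):
    """Fold the FFT magnitude spectrum of a signal into a 2-D 'spectrum
    image' (one simple way to form the images the abstract works with)."""
    spec = np.abs(np.fft.rfft(sig))[1:]            # drop the DC bin
    cols = len(spec) // rows
    return spec[: rows * cols].reshape(rows, cols)

def fit_2dpca(images, k=3):
    """2DPCA: top-k eigenvectors of the image covariance matrix G."""
    mean = images.mean(axis=0)
    G = sum((A - mean).T @ (A - mean) for A in images) / len(images)
    vals, vecs = np.linalg.eigh(G)
    return vecs[:, np.argsort(vals)[::-1][:k]]     # projection axes

def nearest_class(feat, class_means):
    """Minimum distance classifier over class-mean feature matrices."""
    return int(np.argmin([np.linalg.norm(feat - m) for m in class_means]))

rng = np.random.default_rng(3)
t = np.arange(0, 1, 1 / 1024)
make = lambda f: np.sin(2 * np.pi * f * t) + 0.2 * rng.standard_normal(t.size)
classA = np.array([spectrum_image(make(60)) for _ in range(10)])    # "fault A"
classB = np.array([spectrum_image(make(180)) for _ in range(10)])   # "fault B"
W = fit_2dpca(np.concatenate([classA, classB]))
means = [(classA @ W).mean(0), (classB @ W).mean(0)]
test = spectrum_image(make(180)) @ W
print(nearest_class(test, means))  # 1: recognized as the 180 Hz class
```

    Projecting each image with `W` shrinks every row from 64 columns to 3, which is the dimensionality reduction that makes the minimum distance rule workable.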

  13. Automated Periodontal Diseases Classification System

    Directory of Open Access Journals (Sweden)

    Aliaa A. A. Youssif

    2012-01-01

    Full Text Available This paper presents an efficient and innovative system for automated classification of periodontal diseases. The strength of our technique lies in the fact that it incorporates knowledge from the patients' clinical data, along with the features automatically extracted from the Haematoxylin and Eosin (H&E) stained microscopic images. Our system uses image processing techniques based on color deconvolution, morphological operations, and watershed transforms for epithelium & connective tissue segmentation, nuclear segmentation, and extraction of the microscopic immunohistochemical features for the nuclei, dilated blood vessels & collagen fibers. Feedforward backpropagation artificial neural networks are used for the classification process. We report 100% classification accuracy in correctly identifying the different periodontal diseases observed in our 30-sample dataset.

  14. Analysis of fault using microcomputer protection by symmetrical component method

    Directory of Open Access Journals (Sweden)

    Mr. Ashish Choubey

    2012-09-01

    Full Text Available To enhance power supply reliability for user terminals in the distribution system, to avoid repeated interference by the fault, and to rapidly complete automatic identification, positioning, automatic fault isolation and network reconfiguration until the resumption of supply to the non-fault section, a microprocessor-based relay protection device has been developed. As fault component theory is widely used in microcomputer protection, and the fault component exists in the fault component network, it is necessary to build up the fault component network when a short circuit fault emerges and to draw the current and voltage component phasor diagram at the fault point. In order to understand microcomputer protection based on the symmetrical component principle, we obtained the sequence currents and sequence voltages according to the concept of symmetrical components. Distribution lines supply power directly to users, so the reliability of their operation determines the quality and level of the electricity supply. In recent decades, thanks to the tireless efforts of scientists and technicians, relay protection technology and the application level of equipment have been greatly improved, but domestically produced computer hardware protection devices are still outdated systems. Software development suffers from maintenance difficulties and short survival times. The factory automation system interface functions are weak, and network communication cannot meet actual requirements. The protection principle configuration and the device manufacturing process also need to be improved.
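
    The sequence currents and voltages mentioned above come from the Fortescue (symmetrical component) transform. A minimal numerical sketch, with an illustrative balanced input:

```python
import numpy as np

def symmetrical_components(va, vb, vc):
    """Decompose three phase phasors into zero-, positive- and
    negative-sequence components (Fortescue transform)."""
    a = np.exp(2j * np.pi / 3)                # 120-degree rotation operator
    A = np.array([[1, 1,      1],
                  [1, a,      a ** 2],
                  [1, a ** 2, a]]) / 3
    return A @ np.array([va, vb, vc])

# Balanced positive-sequence set: only the positive component should remain
a = np.exp(2j * np.pi / 3)
v0, v1, v2 = symmetrical_components(1 + 0j, a ** 2, a)
print(np.round([abs(v0), abs(v1), abs(v2)], 6))  # [0. 1. 0.]
```

    Under an unbalanced (faulted) condition, the zero- and negative-sequence magnitudes become nonzero, which is exactly what fault-component-based microcomputer protection keys on.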

  15. Automatic software fault localization based on artificial bee colony

    Institute of Scientific and Technical Information of China (English)

    Linzhi Huang; Jun Ai

    2015-01-01

    Software debugging accounts for a vast majority of the financial and time costs in software development and maintenance. Thus, approaches to software fault localization that can help automate the debugging process have become a hot topic in the field of software engineering. Given the great demand for software fault localization, an approach based on the artificial bee colony (ABC) algorithm is proposed to be integrated with other related techniques. In this process, the source program is initially instrumented after analyzing the dependence information. The test case sets are then compiled and run on the instrumented program, and execution results are input to the ABC algorithm. The algorithm can determine the largest fitness value and best food source by calculating the average fitness of the employed bees in the iterative process. The program unit with the highest suspicion score corresponding to the best test case set is regarded as the final fault localization. Experiments are conducted with the TCAS program in the Siemens suite. Results demonstrate that the proposed fault localization method is effective and efficient. The ABC algorithm can efficiently avoid the local optimum, and ensure the validity of the fault location to a larger extent.
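
    The search engine at the core of this approach is the artificial bee colony algorithm. A bare-bones sketch of its employed/onlooker/scout phases, run here on a toy minimization problem rather than on suspiciousness scores of program units:

```python
import numpy as np

def abc_minimize(f, bounds, n_bees=20, limit=10, iters=100, seed=4):
    """Minimal artificial bee colony optimizer. The paper couples this
    search strategy to fault-localization fitness; here f is a toy cost."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    dim = len(lo)
    X = rng.uniform(lo, hi, (n_bees, dim))        # food sources
    fit = np.array([f(x) for x in X])
    trials = np.zeros(n_bees)
    for _ in range(iters):
        probs = fit.max() - fit + 1e-12           # lower cost -> higher prob
        probs /= probs.sum()
        for phase in range(2):                    # employed, then onlooker
            picks = (range(n_bees) if phase == 0
                     else rng.choice(n_bees, n_bees, p=probs))
            for i in picks:
                k, j = rng.integers(n_bees), rng.integers(dim)
                cand = X[i].copy()                # perturb toward a partner
                cand[j] += rng.uniform(-1, 1) * (X[i][j] - X[k][j])
                cand = np.clip(cand, lo, hi)
                fc = f(cand)
                if fc < fit[i]:
                    X[i], fit[i], trials[i] = cand, fc, 0
                else:
                    trials[i] += 1
        for i in np.where(trials > limit)[0]:     # scout phase: abandon
            X[i] = rng.uniform(lo, hi)
            fit[i], trials[i] = f(X[i]), 0
    best = fit.argmin()
    return X[best], fit[best]

x, fx = abc_minimize(lambda v: ((v - 3) ** 2).sum(), [(-10, 10)] * 2)
print(np.round(x, 2))  # near [3, 3], the minimum of the toy cost
```

    The scout phase is what gives ABC its ability to escape local optima, which the abstract credits for the validity of the resulting fault locations.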

  16. Automated stopcock actuator

    OpenAIRE

    Vandehey, N. T.; O'Neil, J.P.

    2015-01-01

    Introduction We have developed a low-cost stopcock valve actuator for radiochemistry automation built using a stepper motor and an Arduino, an open-source single-board microcontroller. The controller hardware can be programmed to run by serial communication or via two 5–24 V digital lines for simple integration into any automation control system. This valve actuator allows for automated use of a single, disposable stopcock, providing a number of advantages over stopcock manifold systems ...

  17. The Adaptive Automation Design

    OpenAIRE

    Calefato, Caterina; Montanari, Roberto; TESAURI, Francesco

    2008-01-01

    After considering the positive effects of adaptive automation implementation, this chapter focuses on two partly overlapping phenomena: on the one hand, the role of trust in automation is considered, particularly as to the effects of overtrust and mistrust in automation's reliability; on the other hand, long-term lack of exercise on specific operation may lead users to skill deterioration. As a future work, it will be interesting and challenging to explore the conjunction of adaptive automati...

  18. Service functional test automation

    OpenAIRE

    Hillah, Lom Messan; Maesano, Ariele-Paolo; Rosa, Fabio; Maesano, Libero; Lettere, Marco; Fontanelli, Riccardo

    2015-01-01

    This paper presents the automation of the functional test of services (black-box testing) and services architectures (grey-box testing) that has been developed by the MIDAS project and is accessible on the MIDAS SaaS. In particular, the paper illustrates the solutions of tough functional test automation problems such as: (i) the configuration of the automated test execution system against large and complex services architectures, (ii) the constraint-based test input generation, (iii) the spec...

  19. Fault intersections along the Hosgri Fault Zone, Central California

    Science.gov (United States)

    Watt, J. T.; Johnson, S. Y.; Langenheim, V. E.

    2011-12-01

    It is well-established that stresses concentrate at fault intersections or bends when subjected to tectonic loading, making focused studies of these areas particularly important for seismic hazard analysis. In addition, detailed fault models can be used to investigate how slip on one fault might transfer to another during an earthquake. We combine potential-field, high-resolution seismic-reflection, and multibeam bathymetry data with existing geologic and seismicity data to investigate the fault geometry and connectivity of the Hosgri, Los Osos, and Shoreline faults offshore of San Luis Obispo, California. The intersection of the Hosgri and Los Osos faults in Estero Bay is complex. The offshore extension of the Los Osos fault, as imaged with multibeam and high-resolution seismic data, is characterized by a west-northwest-trending zone (1-3 km wide) of near vertical faulting. Three distinct strands (northern, central, and southern) are visible on shallow seismic reflection profiles. The steep dip combined with dramatic changes in reflection character across mapped faults within this zone suggests horizontal offset of rock units and argues for predominantly strike-slip motion, however, the present orientation of the fault zone suggests oblique slip. As the Los Osos fault zone approaches the Hosgri fault, the northern and central strands become progressively more northwest-trending in line with the Hosgri fault. The northern strand runs subparallel to the Hosgri fault along the edge of a long-wavelength magnetic anomaly, intersecting the Hosgri fault southwest of Point Estero. Geophysical modeling suggests the northern strand dips 70° to the northeast, which is in agreement with earthquake focal mechanisms that parallel this strand. The central strand bends northward and intersects the Hosgri fault directly west of Morro Rock, corresponding to an area of compressional deformation visible in shallow seismic-reflection profiles. The southern strand of the Los Osos

  20. Fault Scarp Offsets and Fault Population Analysis on Dione

    Science.gov (United States)

    Tarlow, S.; Collins, G. C.

    2010-12-01

    Cassini images of Dione show several fault zones cutting through the moon’s icy surface. We have measured the displacement and length of 271 faults, and estimated the strain occurring in 6 different fault zones. These measurements allow us to quantify the total amount of surface strain on Dione as well as constrain what processes might have caused these faults to form. Though we do not have detailed topography across fault scarps on Dione, we can use their projected size on the camera plane to estimate their heights, assuming a reasonable surface slope. Starting with high resolution images of Dione obtained by the Cassini ISS, we marked points from the top to the bottom of each fault scarp to measure the fault’s projected displacement and its orientation along strike. Line and sample information for the measurements were then processed through ISIS to derive latitude/longitude information and pixel dimensions. We then calculate the three dimensional orientation of a vector running from the bottom to the top of the fault scarp, assuming a 45 degree angle with respect to the surface, and project this vector onto the spacecraft camera plane. This projected vector gives us a correction factor to estimate the actual vertical displacement of the fault scarp. This process was repeated many times for each fault, to show variations of displacement along the length of the fault. To compare each fault to its neighbors and see how strain was accommodated across a population of faults, we divided the faults into fault zones, and created new coordinate systems oriented along the central axis of each fault zone. We could then quantify the amount of fault overlap and add the displacement of overlapping faults to estimate the amount of strain accommodated in each zone. Faults in the southern portion of Padua have a strain of 0.031 ± 0.0097, central Padua exhibits a strain of 0.032 ± 0.012, and faults in northern Padua have a strain of 0.025 ± 0.0080. The western faults of

  1. Abnormal fault-recovery characteristics of the fault-tolerant multiprocessor uncovered using a new fault-injection methodology

    Science.gov (United States)

    Padilla, Peter A.

    1991-03-01

    An investigation was made in AIRLAB of the fault handling performance of the Fault Tolerant MultiProcessor (FTMP). Fault handling errors detected during fault injection experiments were characterized. In these fault injection experiments, the FTMP disabled a working unit instead of the faulted unit once in every 500 faults, on the average. System design weaknesses allow active faults to exercise a part of the fault management software that handles Byzantine or lying faults. Byzantine faults behave such that the faulted unit points to a working unit as the source of errors. The design's problems involve: (1) the design and interface between the simplex error detection hardware and the error processing software, (2) the functional capabilities of the FTMP system bus, and (3) the communication requirements of a multiprocessor architecture. These weak areas in the FTMP's design increase the probability that, for any hardware fault, a good line replacement unit (LRU) is mistakenly disabled by the fault management software.

  2. Automated Weather Observing System

    Data.gov (United States)

    Department of Transportation — The Automated Weather Observing System (AWOS) is a suite of sensors, which measure, collect, and disseminate weather data to help meteorologists, pilots, and flight...

  3. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. PMID:26065792

  4. Automated cloning methods

    International Nuclear Information System (INIS)

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  5. Sparsity-based algorithm for detecting faults in rotating machines

    Science.gov (United States)

    He, Wangpeng; Ding, Yin; Zi, Yanyang; Selesnick, Ivan W.

    2016-05-01

    This paper addresses the detection of periodic transients in vibration signals so as to detect faults in rotating machines. For this purpose, we present a method to estimate periodic-group-sparse signals in noise. The method is based on the formulation of a convex optimization problem. A fast iterative algorithm is given for its solution. A simulated signal is formulated to verify the performance of the proposed approach for periodic feature extraction. The detection performance of comparative methods is compared with that of the proposed approach via RMSE values and receiver operating characteristic (ROC) curves. Finally, the proposed approach is applied to single-fault diagnosis of a locomotive bearing and compound-fault diagnosis of motor bearings. The results show that the proposed approach can effectively detect and extract the useful features of bearing outer race and inner race defects.
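
    The key idea (transients recur with a known period, so sparsity is imposed on groups of samples rather than on individual samples) can be illustrated with a crude, non-iterative stand-in. The phase-group shrinkage below is not the paper's convex estimator, which uses overlapping periodic groups and a fast iterative solver; it only shows why grouping by period suppresses off-transient noise:

```python
import numpy as np

def periodic_group_threshold(y, period, lam):
    """Toy periodic-group shrinkage: samples sharing the same phase offset
    across periods form a group, and each whole group is soft-thresholded
    by its joint energy."""
    n = len(y)
    x = np.zeros(n)
    for p in range(period):
        idx = np.arange(p, n, period)          # one phase-group of samples
        g = y[idx]
        norm = np.linalg.norm(g)
        if norm > lam:
            x[idx] = g * (1 - lam / norm)      # group soft-threshold
    return x

rng = np.random.default_rng(5)
n, period = 600, 100
clean = np.zeros(n)
for k in range(0, n, period):                  # transient once per period
    clean[k : k + 8] = np.hanning(8) * 3.0
noisy = clean + 0.3 * rng.standard_normal(n)
est = periodic_group_threshold(noisy, period, lam=1.0)
err_before = np.linalg.norm(noisy - clean)
err_after = np.linalg.norm(est - clean)
print(err_after < err_before)  # True: the periodic transients survive
```

    Groups aligned with the fault transients carry energy in every period and pass the threshold; phase offsets that contain only noise are zeroed, which is the mechanism behind the bearing-defect feature extraction described above.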

  6. Managing Fault Management Development

    Science.gov (United States)

    McDougal, John M.

    2010-01-01

    As the complexity of space missions grows, development of Fault Management (FM) capabilities is an increasingly common driver for significant cost overruns late in the development cycle. FM issues and the resulting cost overruns are rarely caused by a lack of technology, but rather by a lack of planning and emphasis by project management. A recent NASA FM Workshop brought together FM practitioners from a broad spectrum of institutions, mission types, and functional roles to identify the drivers underlying FM overruns and recommend solutions. They identified a number of areas in which increased program and project management focus can be used to control FM development cost growth. These include up-front planning for FM as a distinct engineering discipline; managing different, conflicting, and changing institutional goals and risk postures; ensuring the necessary resources for a disciplined, coordinated approach to end-to-end fault management engineering; and monitoring FM coordination across all mission systems.

  7. Seismic Fault Preserving Diffusion

    CERN Document Server

    Lavialle, Olivier; Germain, Christian; Donias, Marc; Guillon, Sebastien; Keskes, Naamen; Berthoumieu, Yannick

    2007-01-01

    This paper focuses on the denoising and enhancement of 3-D reflection seismic data. We propose a pre-processing step based on nonlinear diffusion filtering that leads to better detection of seismic faults. Nonlinear diffusion approaches are based on the definition of a partial differential equation that allows us to simplify the images without blurring relevant details or discontinuities. By computing the structure tensor, which provides information on the local orientation of the geological layers, we drive the diffusion along these layers using a new approach called SFPD (Seismic Fault Preserving Diffusion). In SFPD, the eigenvalues of the tensor are fixed according to a confidence measure that takes into account the regularity of the local seismic structure. Results on both synthetic and real 3-D blocks show the efficiency of the proposed approach.
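
    As a minimal 2-D sketch of the structure-tensor step that this kind of filter builds on (function names are illustrative, and a box filter stands in for the Gaussian smoothing typically used; the 3-D case is analogous):

```python
import numpy as np

def _box_smooth(a, r=2):
    """Separable box smoothing, a cheap stand-in for Gaussian smoothing."""
    k = np.ones(2 * r + 1) / (2 * r + 1)
    a = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, a)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, a)

def layer_orientation(img, r=2):
    """Local orientation from the 2-D structure tensor.

    Returns, per pixel, the angle of the dominant eigenvector of the
    smoothed structure tensor, i.e. the direction across the layering,
    which a diffusion filter can then be steered relative to.
    """
    gy, gx = np.gradient(img.astype(float))   # derivatives along rows, cols
    Jxx = _box_smooth(gx * gx, r)
    Jyy = _box_smooth(gy * gy, r)
    Jxy = _box_smooth(gx * gy, r)
    # angle of the eigenvector belonging to the largest eigenvalue
    return 0.5 * np.arctan2(2 * Jxy, Jxx - Jyy)
```

    SFPD then fixes the diffusion-tensor eigenvalues from a confidence measure; this sketch only recovers the orientation field along which the diffusion is driven.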

  8. Seismic Fault Preserving Diffusion

    OpenAIRE

    Lavialle, Olivier; Pop, Sorin; Germain, Christian; Donias, Marc; Guillon, Sebastien; Keskes, Naamen; Berthoumieu, Yannick

    2007-01-01

    This paper focuses on the denoising and enhancement of 3-D reflection seismic data. We propose a pre-processing step based on nonlinear diffusion filtering that leads to better detection of seismic faults. Nonlinear diffusion approaches are based on the definition of a partial differential equation that allows us to simplify the images without blurring relevant details or discontinuities. Computing the structure tensor which provides information on the local orientation of the geological...

  9. Design of neuro fuzzy fault tolerant control using an adaptive observer

    International Nuclear Information System (INIS)

    New methodologies and concepts are being developed in control theory to meet the ever-increasing demands of industrial applications. Fault detection and diagnosis of technical processes have become important in the course of progressive automation in the operation of groups of electric drives. When a group of electric drives is under operation, fault tolerant control becomes complicated. For multiple motors in operation, fault detection and diagnosis might prove to be difficult. Estimation of all states and parameters of all drives is necessary to analyze the actuator and sensor faults. To maintain system reliability, detection and isolation of failures should be performed quickly and accurately, and hardware should be properly integrated. A Luenberger full-order observer can be used to estimate all the states in the system for the detection of actuator and sensor failures. Because the Luenberger observer does not track system parameter variations, state estimation becomes inaccurate when the drive parameters vary. Consequently, the estimation performance deteriorates, making ordinary state observers unsuitable for fault detection. Therefore an adaptive observer, which can estimate the system states and parameters and detect faults simultaneously, is designed in this paper. For a group of DC drives, some drives may show parameter variations depending on load torque, friction, etc., while others may not. So, estimation of all states and parameters of all drives is carried out using an adaptive observer. If there is any deviation from the estimated values, it is understood that a fault has occurred; the nature of the fault, whether a sensor fault or an actuator fault, is determined by the neuro fuzzy network, and the fault tolerant control is reconfigured. 
Experimental results with neuro fuzzy system using adaptive observer-based fault tolerant control are good, so as
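
    As a minimal discrete-time sketch of the observer idea (the paper's observer is adaptive and also tracks parameters; this fixed-gain Luenberger version, with invented names, only shows how the output error drives estimation and, when it persists, serves as a fault residual):

```python
import numpy as np

def simulate_luenberger(A, B, C, L, u, x0, xhat0):
    """Run a plant x[k+1] = A x + B u alongside a Luenberger observer
    xhat[k+1] = A xhat + B u + L (y - C xhat), where y = C x.

    Returns the estimation-error norms per step; a gap between y and
    C xhat that does not decay is the residual a fault detector
    would monitor.
    """
    x = np.array(x0, dtype=float)
    xhat = np.array(xhat0, dtype=float)
    errs = []
    for uk in u:
        y = C @ x                                       # measured output
        x = A @ x + B * uk                              # plant step
        xhat = A @ xhat + B * uk + L * (y - C @ xhat)   # observer step
        errs.append(np.linalg.norm(x - xhat))
    return errs
```

    With A - L C chosen stable, the error dynamics e[k+1] = (A - L C) e[k] contract to zero in the fault-free case, so any sustained residual indicates an actuator or sensor fault (or, as the abstract notes, an unmodeled parameter change).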

  10. A PC based time domain reflectometer for space station cable fault isolation

    Science.gov (United States)

    Pham, Michael; McClean, Marty; Hossain, Sabbir; Vo, Peter; Kouns, Ken

    1994-01-01

    Significant problems are faced by astronauts on orbit aboard the Space Station when trying to locate electrical faults in multi-segment avionics and communication cables. These problems necessitate the development of an automated portable device that will detect and locate cable faults using the pulse-echo technique known as Time Domain Reflectometry. A breadboard time domain reflectometer (TDR) circuit board was designed and developed at NASA-JSC. The TDR board works in conjunction with a GRiD laptop computer to automate the fault detection and isolation process. A software program was written to automatically display the nature and location of any possible faults. The breadboard system can isolate open circuit and short circuit faults to within two feet in a typical space station cable configuration. Follow-on efforts planned for 1994 will produce a compact, portable prototype Space Station TDR capable of automated switching in multi-conductor cables for high fidelity evaluation. This device has many possible commercial applications, including commercial and military aircraft avionics, cable TV, telephone, communication, information and computer network systems. This paper describes the principle of time domain reflectometry and the methodology for on-orbit avionics utility distribution system repair, utilizing the newly developed device called the Space Station Time Domain Reflectometer (SSTDR).
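
    The pulse-echo arithmetic behind TDR is simple: the reflected edge returns after a round trip, so the fault sits at half the delay times the propagation velocity. A sketch (the 0.66 velocity factor is a typical coax value, not a Space Station cable specification):

```python
def tdr_fault_distance(delay_s, velocity_factor=0.66, c=2.998e8):
    """Distance (m) to a cable fault from a TDR echo delay (s).

    The pulse travels to the fault and back, so the one-way distance
    is v * t / 2, with v the cable's propagation velocity (velocity
    factor times the speed of light in vacuum).
    """
    return velocity_factor * c * delay_s / 2.0
```

    For example, an echo delay of 100 ns places the fault roughly 9.9 m down the cable; the polarity of the reflection distinguishes an open circuit from a short.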

  11. Seismic Hazard and Fault Length

    Science.gov (United States)

    Black, N. M.; Jackson, D. D.; Mualchin, L.

    2005-12-01

    If mx is the largest earthquake magnitude that can occur on a fault, then what is mp, the largest magnitude that should be expected during the planned lifetime of a particular structure? Most approaches to these questions rely on an estimate of the Maximum Credible Earthquake, obtained by regression (e.g. Wells and Coppersmith, 1994) of fault length (or area) against magnitude. Our work differs in two ways. First, we modify the traditional approach to measuring fault length to allow for hidden fault complexity and multi-fault rupture. Second, we use a magnitude-frequency relationship to calculate the largest magnitude expected to occur within a given time interval. Often fault length is poorly defined and multiple faults rupture together in a single event. Therefore, we need to expand the definition of a mapped fault length to obtain a more accurate estimate of the maximum magnitude. In previous work, we compared fault length vs. rupture length for post-1975 earthquakes in Southern California. There, we found that mapped fault length and rupture length are often unequal, and in several cases rupture broke beyond the previously mapped fault traces. To expand the geologic definition of fault length we outlined several guidelines: 1) if a fault truncates at young Quaternary alluvium, the fault line should be inferred underneath the younger sediments; 2) faults striking within 45° of one another should be treated as a continuous fault line; and 3) a step-over can link together faults no more than 5 km apart. These definitions were applied to fault lines in Southern California. For example, many of the along-strike fault lines in the Mojave Desert are treated as a single fault trending from the Pinto Mountain fault to the Garlock fault. In addition, the Rose Canyon and Newport-Inglewood faults are treated as a single fault line. 
We used these more generous fault lengths, and the Wells and Coppersmith regression, to estimate the maximum magnitude (mx) for the major faults in
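
    The regression step has the form M = a + b log10(L). A sketch using the Wells and Coppersmith (1994) strike-slip surface-rupture-length coefficients, quoted from memory and worth checking against the published tables before any real use:

```python
import math

def wells_coppersmith_magnitude(rupture_length_km, a=5.16, b=1.12):
    """Moment magnitude from rupture length via M = a + b * log10(L).

    Default coefficients are (approximately) the Wells & Coppersmith
    (1994) strike-slip surface-rupture-length regression; other fault
    types and the area-based regressions use different coefficients.
    """
    return a + b * math.log10(rupture_length_km)
```

    With the expanded fault definitions above, L becomes the combined length of the linked traces rather than a single mapped segment, which raises the estimated maximum magnitude.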

  12. Fault Creep on the Hayward Fault, CA: Implications for Fault Properties and Patterns of Moment Release

    Science.gov (United States)

    Malservisi, R.; Furlong, K. P.; Gans, C.

    2001-12-01

    The seismic risk associated with creeping faults such as the Hayward fault (San Francisco Bay Area, CA) will depend on the rate of moment accumulation (slip deficit) on the fault plane, on the specific geometry of locked and free portions of the fault, and on the interactions between the fault zone and the surrounding lithosphere. Using a visco-elastic finite-element model, we have investigated fault zone geometries and physical characteristics that produce the observed surface creep on the Hayward fault, driven by far-field plate motions. This differs from most previous analyses in that we do not explicitly specify fault creep on fault patches, but rather allow the rheology, geometry, and mechanics of the fault system to determine patterns of fault creep. Our model results show that for models that match the observed surface creep data, there is a smooth transition in creep rate from regions free to creep to locked patches. This behavior leads to "creepable" (low friction) areas that accumulate a high slip deficit as compared to other low friction segments of the fault. Interestingly, a comparison of the creep pattern from our results with Hayward fault microseismicity indicates that events cluster in the locked areas and in transition zones, the "creepable" regions with a high creep-velocity gradient. Furthermore, seismicity seems to be more diffuse around the fault plane in the locked and transition zones than in the "creepable" areas with relatively high creep rates. Although the total amount of seismic moment stored on the Hayward fault does not differ significantly between our model and previous ones, and thus the potential earthquake magnitudes are similar in all creep models, there is a difference in the location of fault patches with significant slip deficit. Additionally, since in our models there are regions free to creep that still accumulate a high slip deficit, energy release during rupture may vary among the models. 
That is, if the velocity of

  13. Research on the Comprehensive Demodulation of Gear Tooth Crack Early Fault

    Institute of Scientific and Technical Information of China (English)

    CUI Lingli; DING Fang; GAO Lixin; ZHANG Jianyu

    2006-01-01

    The components of a gear vibration signal are very complex. When a localized tooth defect such as a tooth crack is present, the engagement of the cracked tooth will induce an impulsive change with comparatively low energy into the gear mesh signal and the background noise. This paper presents a new comprehensive demodulation method that combines amplitude envelope demodulation and phase demodulation to extract early gear-crack faults. A mathematical model of a gear vibration signal containing a crack fault is put forward. Simulation results based on this model show that the new comprehensive demodulation method is more effective in finding faults and judging fault severity than the conventional single amplitude demodulation in use at present.
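
    Both demodulations can be read off the analytic signal. A minimal FFT-based sketch of that construction (a generic Hilbert-transform recipe, not necessarily the authors' implementation):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal z = x + i * hilbert(x) via the FFT:
    zero the negative frequencies and double the positive ones."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def envelope_and_phase(x):
    """Amplitude envelope and unwrapped instantaneous phase,
    the two quantities a comprehensive demodulation combines."""
    z = analytic_signal(x)
    return np.abs(z), np.unwrap(np.angle(z))
```

    For a healthy gear the envelope is nearly constant and the phase nearly linear; a cracked tooth shows up as a localized bump in the envelope and a ripple in the phase once per revolution.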

  14. Literature classification for semi-automated updating of biological knowledgebases

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Kudahl, Ulrich Johan; Winther, Ole;

    2013-01-01

    abstracts yielded classification accuracy of 0.95, thus showing significant value in support of data extraction from the literature. Conclusion: We here propose a conceptual framework for semi-automated extraction of epitope data embedded in scientific literature using principles from text mining and...

  15. Automated addition of Chelex solution to tubes containing trace items

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Thomas Møller; Hansen, Anders Johannes; Morling, Niels

    2011-01-01

    Extraction of DNA from trace items for forensic genetic DNA typing using a manual Chelex-based extraction protocol requires addition of Chelex solution to sample tubes containing trace items. Automated addition of Chelex solution may be hampered by the high viscosity of the solution and fast...

  16. Fault Tolerant Quantum Filtering and Fault Detection for Quantum Systems

    OpenAIRE

    Gao, Qing; Dong, Daoyi; Petersen, Ian R.

    2015-01-01

    This paper aims to determine the fault tolerant quantum filter and fault detection equation for a class of open quantum systems coupled to a laser field that is subject to stochastic faults. In order to analyze this class of open quantum systems, we propose a quantum-classical Bayesian inference method based on the definition of a so-called quantum-classical conditional expectation. It is shown that the proposed Bayesian inference approach provides a convenient tool to simultaneously derive t...

  17. Library Automation Style Guide.

    Science.gov (United States)

    Gaylord Bros., Liverpool, NY.

    This library automation style guide lists specific terms and names often used in the library automation industry. The terms and/or acronyms are listed alphabetically and each is followed by a brief definition. The guide refers to the "Chicago Manual of Style" for general rules, and a notes section is included for the convenience of individual…

  18. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and support

  19. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  20. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents, and processes and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.