WorldWideScience

Sample records for automated fault extraction

  1. Automated fault extraction and classification using 3-D seismic data for the Ekofisk field development

    Energy Technology Data Exchange (ETDEWEB)

    Signer, C.; Nickel, M.; Randen, T.; Saeter, T.; Soenneland, H.H.

    1998-12-31

    Mapping of fractures is important for the prediction of fluid flow in many reservoir types. The fluid flow depends mainly on the efficiency of the reservoir seals. Improved spatial mapping of the open and closed fracture systems will allow a better prediction of the fluid flow pattern. The primary objective of this paper is to present fracture characterization at the reservoir scale combined with seismic facies mapping. The complexity of the giant Ekofisk field on the Norwegian continental shelf provides an ideal framework for testing the validity and the applicability of an automated seismic fault and fracture detection and mapping tool. The mapping of the faults can be based on seismic attribute grids, which means that attribute responses related to faults are extracted along key horizons interpreted in the reservoir interval. 3 refs., 3 figs.

  2. Automated extraction of faults and porous reservoir bodies. Examples from the Valhall Field

    Energy Technology Data Exchange (ETDEWEB)

    Barkved, Olav Inge; Whitman, Doug; Kunz, Tim

    1998-12-31

    The Norwegian Valhall field is located 250 km south-west of Stavanger. The production is primarily from the highly porous and fractured chalk of the Tor formation. Fractures evidently play a significant role in enhancing flow properties, as production rates are significantly higher than expected from matrix permeability alone. The fractures are primarily tectonically induced and related to faulting. Syn-depositional faulting is believed to be a controlling factor on the reservoir thickness variations observed across the field. Due to the low acoustic contrast and weak appearance of the highly porous chalk, direct evidence of faulting in well bore logs is limited. The seismic data quality in the most central area of the field is very poor due to Tertiary gas charging, but in the flank areas of the field the quality is excellent. 1 ref., 5 figs.

  3. Semi-automated fault system extraction and displacement analysis of an excavated oyster reef using high-resolution laser scanned data

    Science.gov (United States)

    Molnár, Gábor; Székely, Balázs; Harzhauser, Mathias; Djuricic, Ana; Mandic, Oleg; Dorninger, Peter; Nothegger, Clemens; Exner, Ulrike; Pfeifer, Norbert

    2015-04-01

    In this contribution we present a semi-automated method for reconstructing the brittle deformation field of an excavated Miocene oyster reef in Stetten, Korneuburg Basin, Lower Austria. Oyster shells up to 80 cm in size were scattered in a shallow estuarine bay, forming a continuous and almost isochronous layer as a consequence of a catastrophic event in the Miocene. This shell bed was preserved by burial under several hundred meters of sandy to silty sediments. Later the layers were tilted westward and uplifted, and erosion almost exhumed them. An excavation revealed a 27 by 17 meter area of the oyster-covered layer. During the tectonic processes the sediment volume suffered brittle deformation. Faults, mostly NW-SE striking and with normal components of a few centimeters, affected the oyster-covered volume, dissecting many shells as well as the surrounding matrix. Fault traces and the displacements along them can typically be followed for several meters across the site, and because fossil oysters are broken and their parts offset by the faulting, along some faults it is possible to follow these displacements in 3D. In order to quantify these varying displacements and to map the undulating fault traces, high-resolution scanning of the excavated and cleaned surface of the oyster bed was carried out using a terrestrial laser scanner. The resulting point clouds were co-georeferenced at mm accuracy and a 1 mm resolution 3D point cloud of the surface was created. As the faults are well represented in the point cloud, this enables us to measure the dislocations of the dissected shell parts along the fault lines. We used a semi-automatic method to quantify these dislocations. First we manually digitized the fault lines in 2D as an initial model. In the next step we estimated the vertical (i.e., perpendicular to the layer) component of the dislocation along these fault lines by comparing the elevations on the two sides of the faults with moving averaging windows. To estimate the strike...

  4. Automated Water Extraction Index

    DEFF Research Database (Denmark)

    Feyisa, Gudina Legese; Meilby, Henrik; Fensholt, Rasmus

    2014-01-01

    Classifying surface cover types and analyzing changes are among the most common applications of remote sensing. One of the most basic classification tasks is to distinguish water bodies from dry land surfaces. Landsat imagery is among the most widely used sources of data in remote sensing of water resources; and although several techniques of surface water extraction using Landsat data are described in the literature, their application is constrained by low accuracy in various situations. Besides, with the use of techniques such as single band thresholding and two-band indices, identifying an appropriate threshold yielding the highest possible accuracy is a challenging and time consuming task, as threshold values vary with location and time of image acquisition. The purpose of this study was therefore to devise an index that consistently improves water extraction accuracy in the presence...
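
    As an illustrative aside (not part of the record), the AWEI variants reduce to simple band arithmetic on Landsat reflectance; a minimal numpy sketch is given below. The coefficients follow Feyisa et al. (2014) as commonly cited, and the band names and the zero threshold are assumptions to verify against the paper.

      import numpy as np

      def awei(blue, green, nir, swir1, swir2, shadow=False):
          """Automated Water Extraction Index from Landsat reflectance bands.

          shadow=False gives the 'non-shadow' variant; shadow=True gives the
          variant intended for scenes with shadow and dark surfaces.
          """
          if shadow:
              return blue + 2.5 * green - 1.5 * (nir + swir1) - 0.25 * swir2
          return 4.0 * (green - swir1) - (0.25 * nir + 2.75 * swir2)

      # Example: water mask with the commonly used threshold of 0.
      # water_mask = awei(blue, green, nir, swir1, swir2) > 0.0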

  5. Hidden Markov Model Based Automated Fault Localization for Integration Testing

    OpenAIRE

    Ge, Ning; NAKAJIMA, SHIN; Pantel, Marc

    2013-01-01

    Integration testing is an expensive activity in software testing, especially for fault localization in complex systems. Model-based diagnosis (MBD) provides various benefits in terms of scalability and robustness. In this work, we propose a novel MBD approach for automated fault localization in integration testing. Our method is based on a Hidden Markov Model (HMM), an abstraction of the system's components used to simulate component behaviour. The core of this metho...
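
    As a hedged illustration of the HMM machinery this record relies on (not the authors' localization procedure), the forward algorithm below scores how well an observed integration-test trace fits a component's behavioural model; comparing such scores across component models is one simple way to rank fault suspects.

      import numpy as np

      def forward_log_likelihood(obs, start_p, trans_p, emit_p):
          """Log-likelihood of a discrete observation sequence under an HMM.

          obs     : sequence of observation symbol indices
          start_p : (S,) initial state probabilities
          trans_p : (S, S) state transition matrix
          emit_p  : (S, O) emission probabilities
          Note: no scaling is applied, so very long traces may underflow.
          """
          alpha = start_p * emit_p[:, obs[0]]
          for symbol in obs[1:]:
              alpha = (alpha @ trans_p) * emit_p[:, symbol]
          return float(np.log(alpha.sum()))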

  6. Automated DNA extraction from pollen in honey.

    Science.gov (United States)

    Guertler, Patrick; Eicheldinger, Adelina; Muschler, Paul; Goerlich, Ottmar; Busch, Ulrich

    2014-04-15

    In recent years, honey has become a subject of DNA analysis due to potential risks posed by microorganisms, allergens or genetically modified organisms. However, so far only a few DNA extraction procedures are available, and they are mostly time-consuming and laborious. Therefore, we developed an automated method for extracting DNA from pollen in honey, based on a CTAB buffer-based DNA extraction using the Maxwell 16 instrument and the Maxwell 16 FFS Nucleic Acid Extraction System, Custom-Kit. We altered several components and extraction parameters and compared the optimised method with a manual CTAB buffer-based DNA isolation method. The automated DNA extraction was faster and resulted in higher DNA yield and sufficient DNA purity. Real-time PCR results obtained after automated DNA extraction are comparable to results after manual DNA extraction. No PCR inhibition was observed. The applicability of this method was further confirmed by successful analysis of different routine honey samples.

  7. Automated Extraction of DNA from clothing

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hjort, Benjamin Benn; Nøhr Hansen, Thomas;

    2011-01-01

    Presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. We have compared three automated DNA extraction methods based on magnetic beads with a manual method with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable DNA profiles.

  9. Automated Feature Extraction from Hyperspectral Imagery Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed activities will result in the development of a novel hyperspectral feature-extraction toolkit that will provide a simple, automated, and accurate...

  10. Seismicity and faulting attributable to fluid extraction

    Science.gov (United States)

    Yerkes, R.F.; Castle, R.O.

    1976-01-01

    The association between fluid injection and seismicity has been well documented and widely publicized. Less well known, but probably equally widespread, are faulting and shallow seismicity attributable solely to fluid extraction, particularly in association with petroleum production. Two unequivocal examples of seismicity and faulting associated with fluid extraction in the United States are the Goose Creek, Texas oil field event of 1925 (involving surface rupture) and the Wilmington, California oil field events (involving subsurface rupture) of 1947, 1949, 1951 (2), 1955, and 1961. Six additional cases of intensity I-VII earthquakes (M ...). Small earthquakes in the Eloy-Picacho area of Arizona may be attributable to withdrawal of groundwater, but their relation to widespread fissuring is enigmatic. The clearest example of extraction-induced seismicity outside of North America is the 1951 series of earthquakes associated with gas production from the Po River delta near Caviga, Italy. Faulting and seismicity associated with fluid extraction are attributed to differential compaction at depth caused by reduction of reservoir fluid pressure and an attendant increase in effective stress. Surface and subsurface measurements and theoretical and model studies show that differential compaction leads not only to differential subsidence and centripetally directed horizontal displacements, but to changes in both vertical- and horizontal-strain regimes. Study of well-documented examples indicates that the occurrence and nature of faulting and seismicity associated with compaction are functions chiefly of: (1) the pre-exploitation strain regime, and (2) the magnitude of contractional horizontal strain centered over the compacting materials relative to that of the surrounding annulus of extensional horizontal strain. The examples cited include natural systems strained only by extraction of fluids, as well as some subsequently subjected to injection. Faulting and seismicity have...

  11. Automated information extraction from web APIs documentation

    OpenAIRE

    Ly, Papa Alioune; Pedrinaci, Carlos; Domingue, John

    2012-01-01

    A fundamental characteristic of Web APIs is the fact that, de facto, providers hardly follow any standard practices while implementing, publishing, and documenting their APIs. As a consequence, the discovery and use of these services by third parties is significantly hampered. In order to achieve further automation while exploiting Web APIs we present an approach for automatically extracting relevant technical information from the Web pages documenting them. In particular we have devised two ...

  12. Operations management system advanced automation: Fault detection isolation and recovery prototyping

    Science.gov (United States)

    Hanson, Matt

    1990-01-01

    The purpose of this project is to address the global fault detection, isolation and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies in addition to traditional software methods to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.

  13. Repetitive transients extraction algorithm for detecting bearing faults

    Science.gov (United States)

    He, Wangpeng; Ding, Yin; Zi, Yanyang; Selesnick, Ivan W.

    2017-02-01

    Rolling-element bearing vibrations are random and cyclostationary. This paper addresses the problem of noise reduction with simultaneous component extraction in vibration signals for fault diagnosis of bearings. The observed vibration signal is modeled as a summation of two components contaminated by noise, each composed of repetitive transients. To extract the two components simultaneously, an approach based on solving an optimization problem is proposed. The problem adopts a convex sparsity-based regularization scheme for decomposition, and non-convex regularization is used to further promote sparsity while preserving global convexity. A synthetic example is presented to illustrate the performance of the proposed approach for repetitive feature extraction. The performance and effectiveness of the proposed method are further demonstrated by application to compound-fault and single-fault diagnosis of a locomotive bearing. The results show that the proposed approach can effectively extract the features of outer and inner race defects.
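
    The record describes a convex problem with non-convex penalties; purely as a sketch of the underlying sparsity-regularized denoising idea (not the authors' exact formulation), a basic l1-regularized least-squares problem solved by iterative soft thresholding (ISTA) looks like this:

      import numpy as np

      def soft_threshold(x, t):
          return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

      def ista(y, A, lam=0.1, n_iter=200):
          """Solve min_x 0.5*||y - A x||^2 + lam*||x||_1 by ISTA."""
          L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              x = soft_threshold(x + A.T @ (y - A @ x) / L, lam / L)
          return x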

  14. Integrated Automation System for Rare Earth Countercurrent Extraction Process

    Institute of Scientific and Technical Information of China (English)

    柴天佑; 杨辉

    2004-01-01

    The low automation level of industrial rare-earth extraction processes results in high production cost, inconsistent product quality and high consumption of resources in China. An integrated automation system for the rare earth extraction process is proposed to realize optimal product indices, such as product purity, recycle rate and output. The optimal control strategy for the output component, and the structure and function of the two-graded integrated automation system composed of the process management grade and the process control grade, are discussed. This system has been successfully applied to a HAB yttrium extraction production process and was found to provide optimal control, optimal operation, optimal management and remarkable benefits.

  15. PCA Fault Feature Extraction in Complex Electric Power Systems

    Directory of Open Access Journals (Sweden)

    ZHANG, J.

    2010-08-01

    Electric power systems are among the most complex artificial systems in the world. The complexity is determined by their characteristics of constitution, configuration, operation, organization, etc. Faults in electric power systems cannot be completely avoided. When an electric power system moves from the normal state to a failure or abnormal state, its electric quantities (currents, voltages, angles, etc.) may change significantly. Our research indicates that the variable with the biggest coefficient in the principal component usually corresponds to the fault. Therefore, utilizing real-time measurements from phasor measurement units and based on principal component analysis, we have successfully extracted the distinct features of the fault component. Of course, because of the complexity of the different types of faults in electric power systems, there still exist many problems that need close and intensive study.
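
    A minimal sketch of the stated idea, standardizing synchronized PMU measurements, running PCA, and flagging the variable with the largest-magnitude loading in the leading principal component, is given below (illustrative only, not the authors' full procedure):

      import numpy as np

      def suspect_variable(X, names):
          """X: (n_samples, n_variables) PMU measurements (currents, voltages,
          angles). Returns the variable whose loading in the first principal
          component has the largest magnitude."""
          Z = (X - X.mean(axis=0)) / X.std(axis=0)
          _, _, Vt = np.linalg.svd(Z, full_matrices=False)
          pc1 = Vt[0]          # loadings of the first principal component
          return names[int(np.argmax(np.abs(pc1)))]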

  16. Automated Bearing Fault Diagnosis Using 2D Analysis of Vibration Acceleration Signals under Variable Speed Conditions

    Directory of Open Access Journals (Sweden)

    Sheraz Ali Khan

    2016-01-01

    Traditional fault diagnosis methods of bearings detect characteristic defect frequencies in the envelope power spectrum of the vibration signal. These defect frequencies depend upon the inherently nonstationary shaft speed. Time-frequency and subband signal analysis of vibration signals has been used to deal with random variations in speed, whereas design variations require retraining a new instance of the classifier for each operating speed. This paper presents an automated approach for fault diagnosis in bearings based upon the 2D analysis of vibration acceleration signals under variable speed conditions. Images created from the vibration signals exhibit unique textures for each fault, which show minimal variation with shaft speed. Microtexture analysis of these images is used to generate distinctive fault signatures for each fault type, which can be used to detect those faults at different speeds. A k-nearest neighbor classifier trained using fault signatures generated for one operating speed is used to detect faults at all the other operating speeds. The proposed approach is tested on the bearing fault dataset of Case Western Reserve University, and the results are compared with those of a spectrum imaging-based approach.

  17. Automated Fault Localization for Service-Oriented Software Systems

    NARCIS (Netherlands)

    Chen, C.

    2015-01-01

    In this thesis, we have focused on applying Spectrum-based Fault Localization (SFL) to diagnose Service-Oriented Systems at runtime. We reused a framework-based online monitoring technique to obtain the service transaction information. We devised a three-phased oracle and combined this with monitori

  18. Automated Generation of Fault Management Artifacts from a Simple System Model

    Science.gov (United States)

    Kennedy, Andrew K.; Day, John C.

    2013-01-01

    Our understanding of off-nominal behavior (failure modes and fault propagation) in complex systems is often based purely on engineering intuition; specific cases are assessed in an ad hoc fashion as a (fallible) fault management engineer sees fit. This work is an attempt to provide a more rigorous approach to this understanding and assessment by automating the creation of a fault management artifact, the Failure Modes and Effects Analysis (FMEA), by querying a representation of the system in a SysML model. This work builds on the previous development of an off-nominal behavior model for the upcoming Soil Moisture Active-Passive (SMAP) mission at the Jet Propulsion Laboratory. We further developed the previous system model to more fully incorporate the ideas of State Analysis, and restructured it into an organizational hierarchy that models the system as layers of control systems while also incorporating the concept of "design authority". We present software that was developed to traverse the elements and relationships in this model to automatically construct an FMEA spreadsheet. We further discuss extending this model to automatically generate other typical fault management artifacts, such as fault trees, to efficiently portray system behavior and to depend less on the intuition of fault management engineers to ensure complete examination of off-nominal behavior.

  19. Technology Corner: Automated Data Extraction Using Facebook

    Directory of Open Access Journals (Sweden)

    Nick Flor

    2012-06-01

    Because of Facebook's popularity, law enforcement agents often use it as a key source of evidence. But like many user digital trails, there can be a large amount of data to extract for analysis. In this paper, we explore the basics of extracting data programmatically from a user's Facebook account via a Web app. A data extraction app requests data using the Facebook Graph API, and Facebook returns a JSON object containing the data. Before an app can access a user's Facebook data, the user must log into Facebook and give permission. Thus, this approach is limited to situations where users give consent to the data extraction.
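
    A hedged sketch of the request/JSON pattern described follows; the endpoint path, field names and version string are assumptions to check against current Graph API documentation, and a valid user access token obtained with consent is required.

      import requests

      def fetch_user_posts(access_token, api_version="v2.0"):
          """Request a consenting user's recent posts from the Graph API.

          Facebook returns a JSON object; the 'data' list holds the posts, and
          paging cursors (when present) allow walking further back in time.
          """
          url = f"https://graph.facebook.com/{api_version}/me/posts"
          params = {"access_token": access_token,
                    "fields": "id,message,created_time"}
          resp = requests.get(url, params=params)
          resp.raise_for_status()
          return resp.json().get("data", [])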

  20. Path Searching Based Fault Automated Recovery Scheme for Distribution Grid with DG

    Science.gov (United States)

    Xia, Lin; Qun, Wang; Hui, Xue; Simeng, Zhu

    2016-12-01

    Applying the path-searching method based on distribution network topology in setting software has proved effective, and a path-searching method that includes DG power sources is also applicable to the automatic generation and division of planned islands after a fault. This paper applies the path-searching algorithm to the automatic division of planned islands after faults: starting from the fault-isolation switch and ending at each power source, and according to the line loads traversed by the search path and the important loads integrated along the optimized path, an optimized division scheme of planned islands is formed in which each DG serves as the power source and is balanced against the local important loads. Finally, the COBASE software and the distribution network automation software in use are applied to illustrate the effectiveness of the proposed automatic restoration scheme.
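
    A toy sketch of the path-search idea, breadth-first search outward from the DG over the post-isolation feeder topology, accumulating load until the DG capacity is reached, is given below. It is illustrative only; the scheme in the record additionally weights important loads and runs inside the utility's automation software.

      from collections import deque

      def plan_island(adjacency, loads, dg_bus, dg_capacity, isolated_buses=()):
          """Greedy island formation around a DG after fault isolation.

          adjacency      : dict bus -> list of connected buses
          loads          : dict bus -> load in kW
          dg_bus         : bus where the distributed generator connects
          dg_capacity    : available DG output in kW
          isolated_buses : buses cut off by the fault-isolation switches
          """
          island, served = set(), 0.0
          queue = deque([dg_bus])
          while queue:
              bus = queue.popleft()
              if bus in island or bus in isolated_buses:
                  continue
              if served + loads.get(bus, 0.0) > dg_capacity:
                  continue            # skip buses the DG cannot carry
              island.add(bus)
              served += loads.get(bus, 0.0)
              queue.extend(adjacency.get(bus, []))
          return island, served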

  1. Toward the automation of road networks extraction processes

    Science.gov (United States)

    Leymarie, Frederic; Boichis, Nicolas; Airault, Sylvain; Jamet, Olivier

    1996-12-01

    Syseca and IGN are working on various steps in the ongoing march from digital photogrammetry to the semi-automation and ultimately the full automation of data manipulation, i.e., capture and analysis. The immediate goals are to reduce the production costs and the data availability delays. Within this context, we have tackled the distinctive problem of 'automated road network extraction.' The methodology adopted is to first study semi-automatic solutions which probably increase the global efficiency of human operators in topographic data capture; in a second step, automatic solutions are designed based upon the gained experience. We report on different (semi-)automatic solutions for the road following algorithm. One key aspect of our method is to have the stages of 'detection' and 'geometric recovery' cooperate while remaining distinct. 'Detection' is based on a local (texture) analysis of the image, while 'geometric recovery' is concerned with the extraction of 'road objects' for both monocular and stereo information. 'Detection' is a low-level visual process, 'reasoning' directly at the level of image intensities, while the mid-level visual process, 'geometric recovery', uses contextual knowledge about roads, both generic, e.g. parallelism of borders, and specific, e.g. using previously extracted road segments and disparities. We then pursue our 'march' by reporting on steps we are exploring toward full automation. We have in particular made attempts at tackling the automation of the initialization step to start searching in a valid direction.

  2. Automated vasculature extraction from placenta images

    Science.gov (United States)

    Almoussa, Nizar; Dutra, Brittany; Lampe, Bryce; Getreuer, Pascal; Wittman, Todd; Salafia, Carolyn; Vese, Luminita

    2011-03-01

    Recent research in perinatal pathology argues that analyzing properties of the placenta may reveal important information on how certain diseases progress. One important property is the structure of the placental blood vessels, which supply a fetus with all of its oxygen and nutrition. An essential step in the analysis of the vascular network pattern is the extraction of the blood vessels, which has only been done manually through a costly and time-consuming process. There is no existing method to automatically detect placental blood vessels; in addition, the large variation in the shape, color, and texture of the placenta makes it difficult to apply standard edge-detection algorithms. We describe a method to automatically detect and extract blood vessels from a given image by using image processing techniques and neural networks. We evaluate several local features for every pixel, in addition to a novel modification to an existing road detector. Pixels belonging to blood vessel regions have recognizable responses; hence, we use an artificial neural network to identify the pattern of blood vessels. A set of images where blood vessels are manually highlighted is used to train the network. We then apply the neural network to recognize blood vessels in new images. The network is effective in capturing the most prominent vascular structures of the placenta.

  3. Automated sea floor extraction from underwater video

    Science.gov (United States)

    Kelly, Lauren; Rahmes, Mark; Stiver, James; McCluskey, Mike

    2016-05-01

    Ocean floor mapping using video is a method to simply and cost-effectively record large areas of the seafloor. Obtaining visual and elevation models has noteworthy applications in search and recovery missions. Hazards to navigation are abundant and pose a significant threat to the safety, effectiveness, and speed of naval operations and commercial vessels. This project's objective was to develop a workflow to automatically extract metadata from marine video and create image optical and elevation surface mosaics. Three developments made this possible. First, optical character recognition (OCR) by means of two-dimensional correlation, using a known character set, allowed for the capture of metadata from image files. Second, exploiting the image metadata (i.e., latitude, longitude, heading, camera angle, and depth readings) allowed for the determination of location and orientation of the image frame in mosaic. Image registration improved the accuracy of mosaicking. Finally, overlapping data allowed us to determine height information. A disparity map was created using the parallax from overlapping viewpoints of a given area and the relative height data was utilized to create a three-dimensional, textured elevation map.
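
    The OCR step described is plain two-dimensional correlation against a known character set; a minimal sketch using normalized cross-correlation is shown below, where the character templates are assumed to be pre-cut grayscale arrays of the same size as the glyph.

      import numpy as np

      def match_character(glyph, templates):
          """Return the label of the template with the highest normalized
          cross-correlation against the extracted glyph image."""
          g = (glyph - glyph.mean()) / (glyph.std() + 1e-9)
          best_label, best_score = None, -np.inf
          for label, template in templates.items():
              t = (template - template.mean()) / (template.std() + 1e-9)
              score = float((g * t).mean())   # glyph and template share a size
              if score > best_score:
                  best_label, best_score = label, score
          return best_label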

  4. NEW METHOD FOR WEAK FAULT FEATURE EXTRACTION BASED ON SECOND GENERATION WAVELET TRANSFORM AND ITS APPLICATION

    Institute of Scientific and Technical Information of China (English)

    Duan Chendong; He Zhengjia; Jiang Hongkai

    2004-01-01

    A new time-domain analysis method that uses the second generation wavelet transform (SGWT) for weak fault feature extraction is proposed. To extract incipient fault features, a biorthogonal wavelet with impact characteristics is constructed using the SGWT. The detail signal of the SGWT is processed with a sliding window devised on the basis of the rotating operation cycle, and the modulus maximum is extracted from each window, so that fault features in the time domain are highlighted. To further analyze the cause of the fault, a wavelet packet transform based on the SGWT is used to process the vibration data again. By calculating the energy of each frequency band, the energy distribution features of the signal are obtained. Then, taking account of the fault features and the energy distribution, the cause of the fault is determined. An early impact-rub fault caused by axis misalignment and rotor imbalance was successfully detected using this method in an oil refinery.

  5. Deep Fault Recognizer: An Integrated Model to Denoise and Extract Features for Fault Diagnosis in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    Xiaojie Guo

    2016-12-01

    Fault diagnosis in rotating machinery is significant for avoiding serious accidents; thus, an accurate and timely diagnosis method is necessary. With breakthroughs in deep learning algorithms, intelligent methods such as the deep belief network (DBN) and the deep convolutional neural network (DCNN) have been developed with satisfactory performance for machinery fault diagnosis. However, only a few of these methods properly deal with the noise that exists in practical situations, and the denoising methods require extensive professional experience. Accordingly, rethinking fault diagnosis methods based on deep architectures is essential. Hence, this study proposes an automatic denoising and feature extraction method that inherently considers spatial and temporal correlations. An integrated deep fault recognizer model based on the stacked denoising autoencoder (SDAE) is applied both to denoise random noise in the raw signals and to represent fault features for fault pattern diagnosis of rolling bearing and gearbox faults, and it is trained in a greedy layer-wise fashion. Finally, the experimental validation demonstrates that the proposed method has better diagnosis accuracy than the DBN, particularly in the presence of noise, with an advantage of approximately 7% in fault diagnosis accuracy.

  6. Feature evaluation and extraction based on neural network in analog circuit fault diagnosis

    Institute of Scientific and Technical Information of China (English)

    Yuan Haiying; Chen Guangju; Xie Yongle

    2007-01-01

    Choosing the right characteristic parameters is the key to fault diagnosis in analog circuits. Feature evaluation and extraction methods based on neural networks are presented. Parameter evaluation of circuit features is realized from neural network training results; the network's superior nonlinear mapping capability is well suited to extracting fault features, which are subsequently normalized and compressed. The complex classification problem of fault pattern recognition in analog circuits is thus effectively transferred to the feature processing stage by neural-network-based feature extraction, which improves diagnosis efficiency. A fault diagnosis example validates this method.

  7. Critical Evaluation of Validation Rules Automated Extraction from Data

    Directory of Open Access Journals (Sweden)

    David Pejcoch

    2014-10-01

    The goal of this article is to critically evaluate the possibility of automatically extracting the kind of rules that could later be used within a Data Quality Management process for validation of records newly arriving in an information system. For practical demonstration, the 4FT-Miner procedure implemented in the LISpMiner System was chosen. The motivation for this task is the potential simplification of projects focused on Data Quality Management. This article first critically evaluates the possibility of fully automated extraction, with the aim of identifying strengths and weaknesses of this approach in comparison to its alternative, when at least some a priori knowledge is available. As a result of a practical implementation, this article provides the design of a recommended process that can be used as a guideline for future projects. The question of how to store and maintain extracted rules and how to integrate them with existing tools supporting Data Quality Management is also discussed.

  8. Arduino-based automation of a DNA extraction system.

    Science.gov (United States)

    Kim, Kyung-Won; Lee, Mi-So; Ryu, Mun-Ho; Kim, Jong-Won

    2015-01-01

    There have been many studies on detecting infectious diseases with molecular genetic methods. This study presents an automation process for a DNA extraction system based on microfluidics and magnetic beads, which is part of a portable molecular genetic test system. The DNA extraction system consists of a cartridge with chambers, syringes, four linear stepper actuators, and a rotary stepper actuator. The actuators provide a sequence of steps in the DNA extraction process, such as transporting, mixing, and washing of the gene specimen, magnetic beads, and reagent solutions. The proposed automation system consists of a PC-based host application and an Arduino-based controller. The host application compiles a G code sequence file and interfaces with the controller to execute the compiled sequence. The controller executes stepper motor axis motion, time delay, and input-output manipulation. It drives the stepper motors with an open library, which provides a smooth linear acceleration profile. The controller also provides a homing sequence to establish the motor's reference position, and hard limit checking to prevent any over-travelling. The proposed system was implemented and its functionality was investigated, especially regarding positioning accuracy and velocity profile.

  9. ACME, a GIS tool for Automated Cirque Metric Extraction

    Science.gov (United States)

    Spagnolo, Matteo; Pellitero, Ramon; Barr, Iestyn D.; Ely, Jeremy C.; Pellicer, Xavier M.; Rea, Brice R.

    2017-02-01

    Regional scale studies of glacial cirque metrics provide key insights on the (palaeo)environment related to the formation of these erosional landforms. The growing availability of high resolution terrain models means that more glacial cirques can be identified and mapped in the future. However, the extraction of their metrics still largely relies on time consuming manual techniques or on the combination of more or less obsolete GIS tools. In this paper, a newly coded toolbox is provided for the automated, and comparatively quick, extraction of 16 key glacial cirque metrics, including length, width, circularity, planar and 3D area, elevation, slope, aspect, plan closure and hypsometry. The set of tools, named ACME (Automated Cirque Metric Extraction), is coded in Python, runs in one of the most commonly used GIS packages (ArcGIS) and has a user-friendly interface. A polygon layer of mapped cirques is required for all metrics, while a Digital Terrain Model and a point layer of cirque threshold midpoints are needed to run some of the tools. Results from ACME are comparable to those from other techniques and can be obtained rapidly, allowing large cirque datasets to be analysed and potentially important regional trends highlighted.
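
    ACME itself is an ArcGIS/arcpy toolbox; purely as an illustration of the simpler planar metrics it reports, the shapely sketch below computes area, perimeter, a 4*pi*A/P^2 circularity, and length/width from the minimum rotated bounding rectangle. These metric definitions are generic assumptions, not ACME's published formulas.

      import math
      from shapely.geometry import Polygon

      def planar_cirque_metrics(outline_coords):
          """Planar metrics for one mapped cirque outline (list of (x, y) pairs)."""
          poly = Polygon(outline_coords)
          area, perimeter = poly.area, poly.length
          circularity = 4.0 * math.pi * area / perimeter ** 2   # 1.0 for a circle
          # Length/width proxies from the minimum rotated bounding rectangle.
          xs, ys = poly.minimum_rotated_rectangle.exterior.coords.xy
          edges = [math.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i]) for i in range(2)]
          return {"area": area, "perimeter": perimeter, "circularity": circularity,
                  "length": max(edges), "width": min(edges)}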

  10. Automated feature extraction for 3-dimensional point clouds

    Science.gov (United States)

    Magruder, Lori A.; Leigh, Holly W.; Soderlund, Alexander; Clymer, Bradley; Baer, Jessica; Neuenschwander, Amy L.

    2016-05-01

    Light detection and ranging (LIDAR) technology offers the capability to rapidly capture high-resolution, 3-dimensional surface data with centimeter-level accuracy for a large variety of applications. Due to the foliage-penetrating properties of LIDAR systems, these geospatial data sets can detect ground surfaces beneath trees, enabling the production of high-fidelity bare earth elevation models. Precise characterization of the ground surface allows for identification of terrain and non-terrain points within the point cloud, and facilitates further discernment between natural and man-made objects based solely on structural aspects and relative neighboring parameterizations. A framework is presented here for automated extraction of natural and man-made features that does not rely on coincident ortho-imagery or point RGB attributes. The TEXAS (Terrain EXtraction And Segmentation) algorithm is used first to generate a bare earth surface from a lidar survey, which is then used to classify points as terrain or non-terrain. Further classifications are assigned at the point level by leveraging local spatial information. Similarly classed points are then clustered together into regions to identify individual features. Descriptions of the spatial attributes of each region are generated, resulting in the identification of individual tree locations, forest extents, building footprints, and 3-dimensional building shapes, among others. Results of the fully-automated feature extraction algorithm are then compared to ground truth to assess completeness and accuracy of the methodology.

  11. Fault Tolerant Control System Design Using Automated Methods from Risk Analysis

    DEFF Research Database (Denmark)

    Blanke, M.

    Fault tolerant controls have the ability to be resilient to simple faults in control loop components.

  12. Seismicity on Basement Faults Induced by Simultaneous Fluid Injection-Extraction

    Science.gov (United States)

    Chang, Kyung Won; Segall, Paul

    2016-08-01

    Large-scale carbon dioxide (CO2) injection into geological formations increases pore pressure, potentially inducing seismicity on critically stressed faults by reducing the effective normal stress. In addition, poroelastic expansion of the reservoir alters stresses, both within and around the formation, which may trigger earthquakes without direct pore-pressure diffusion. One possible solution to mitigate injection-induced earthquakes is to simultaneously extract pre-existing pore fluids from the target reservoir. To examine the feasibility of the injection-extraction strategy, we compute the spatiotemporal change in Coulomb stress on basement normal faults, including: (1) the change in poroelastic stresses, Δτ_s + fΔσ_n, where Δτ_s and Δσ_n are the changes in shear and normal stress, respectively, and (2) the change in pore pressure, fΔp. Using the model of Dieterich (J. Geophys. Res. Solid Earth 99(B2):2601-2618, 1994), we estimate the seismicity rate on basement fault zones. Fluid extraction reduces direct pore-pressure diffusion into conductive faults, generally reducing the risk of induced seismicity. Limited diffusion into or from sealing faults results in negligible pore-pressure changes within them. However, fluid extraction can cause enhanced seismicity rates on deep normal faults near the injector, as well as on shallow normal faults near the producer, through poroelastic stressing. The change in seismicity rate driven by the poroelastic response to fluid injection-extraction depends on fault geometry, well operations, and the background stressing rate.
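
    For reference, the Coulomb stress change assembled in this record combines the poroelastic terms with the pore-pressure term; a one-line helper makes the bookkeeping explicit. The friction coefficient and the sign convention (normal stress positive in unclamping) are assumptions to check against the paper.

      def coulomb_stress_change(d_tau_s, d_sigma_n, d_p, f=0.6):
          """Change in Coulomb stress resolved on a fault plane.

          d_tau_s   : change in shear stress (Pa)
          d_sigma_n : change in normal stress, positive for unclamping (Pa)
          d_p       : change in pore pressure within the fault (Pa)
          f         : friction coefficient (0.6 is a typical assumed value)
          """
          return d_tau_s + f * d_sigma_n + f * d_p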

  13. Automated Feature Extraction of Foredune Morphology from Terrestrial Lidar Data

    Science.gov (United States)

    Spore, N.; Brodie, K. L.; Swann, C.

    2014-12-01

    Foredune morphology is often described in storm impact prediction models using the elevation of the dune crest and dune toe and compared with maximum runup elevations to categorize the storm impact and predicted responses. However, these parameters do not account for other foredune features that may make them more or less erodible, such as alongshore variations in morphology, vegetation coverage, or compaction. The goal of this work is to identify other descriptive features that can be extracted from terrestrial lidar data that may affect the rate of dune erosion under wave attack. Daily, mobile-terrestrial lidar surveys were conducted during a 6-day nor'easter (Hs = 4 m in 6 m water depth) along 20km of coastline near Duck, North Carolina which encompassed a variety of foredune forms in close proximity to each other. This abstract will focus on the tools developed for the automated extraction of the morphological features from terrestrial lidar data, while the response of the dune will be presented by Brodie and Spore as an accompanying abstract. Raw point cloud data can be dense and is often under-utilized due to time and personnel constraints required for analysis, since many algorithms are not fully automated. In our approach, the point cloud is first projected into a local coordinate system aligned with the coastline, and then bare earth points are interpolated onto a rectilinear 0.5 m grid creating a high resolution digital elevation model. The surface is analyzed by identifying features along each cross-shore transect. Surface curvature is used to identify the position of the dune toe, and then beach and berm morphology is extracted shoreward of the dune toe, and foredune morphology is extracted landward of the dune toe. Changes in, and magnitudes of, cross-shore slope, curvature, and surface roughness are used to describe the foredune face and each cross-shore transect is then classified using its pre-storm morphology for storm-response analysis.

  14. The Rolling Bearing Fault Feature Extraction Based on the LMD and Envelope Demodulation

    Directory of Open Access Journals (Sweden)

    Jun Ma

    2015-01-01

    Since the working process of rolling bearings is a complex and nonstationary dynamic process, the common time and frequency characteristics of vibration signals are submerged in noise. Extracting the fault feature from the vibration signal is therefore the key to fault diagnosis. A fault feature extraction method for rolling bearings based on local mean decomposition (LMD) and envelope demodulation is proposed. Firstly, the original vibration signal is decomposed by LMD to obtain a series of production functions (PFs). Envelope demodulation analysis is then applied to each PF component. Finally, a Fourier transform is performed on the demodulated signals and the failure condition is judged according to the dominant frequency of the spectrum. The results show that the proposed method can correctly extract the fault characteristics to diagnose faults.
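
    The envelope demodulation applied to each PF component is standard Hilbert-transform demodulation; a scipy sketch, independent of the LMD decomposition itself, is:

      import numpy as np
      from scipy.signal import hilbert

      def envelope_spectrum(pf, fs):
          """Hilbert envelope spectrum of one PF component.

          Returns the frequency axis and envelope amplitude spectrum; a bearing
          fault shows up as a dominant peak at its characteristic frequency.
          """
          envelope = np.abs(hilbert(pf))
          envelope -= envelope.mean()                   # remove the DC offset
          spectrum = np.abs(np.fft.rfft(envelope)) / len(pf)
          freqs = np.fft.rfftfreq(len(pf), d=1.0 / fs)
          return freqs, spectrum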

  15. Automated blood vessel extraction using local features on retinal images

    Science.gov (United States)

    Hatanaka, Yuji; Samo, Kazuki; Tajima, Mikiya; Ogohara, Kazunori; Muramatsu, Chisako; Okumura, Susumu; Fujita, Hiroshi

    2016-03-01

    An automated blood vessel extraction method using high-order local autocorrelation (HLAC) on retinal images is presented. Although many blood vessel extraction methods based on contrast have been proposed, a technique based on the relation of neighboring pixels has not been published. HLAC features are shift-invariant; therefore, we applied HLAC features to retinal images. However, HLAC features are not robust to rotated images, so the method was improved by adding HLAC features computed on a polar-transformed image. The blood vessels were classified using an artificial neural network (ANN) with HLAC features computed using 105 mask patterns as input. To improve performance, a second ANN (ANN2) was constructed using the green component of the color retinal image and the four output values of the ANN, a Gabor filter, a double-ring filter and a black-top-hat transformation. The retinal images used in this study were obtained from the "Digital Retinal Images for Vessel Extraction" (DRIVE) database. The ANN using HLAC features output high (white) values in the blood vessel regions and could also extract blood vessels with low contrast. The outputs were evaluated using the area under the curve (AUC) based on receiver operating characteristic (ROC) analysis. The AUC of ANN2 was 0.960 in our study. The result can be used for the quantitative analysis of blood vessels.

  16. GPU Accelerated Automated Feature Extraction From Satellite Images

    Directory of Open Access Journals (Sweden)

    K. Phani Tejaswi

    2013-04-01

    The availability of large volumes of remote sensing data insists on a higher degree of automation in feature extraction, making it a need of the hour. Fusing data from multiple sources, such as panchromatic, hyperspectral and LiDAR sensors, enhances the probability of identifying and extracting features such as buildings, vegetation or bodies of water by using a combination of spectral and elevation characteristics. Utilizing the aforementioned features in remote sensing is impracticable in the absence of automation. While efforts are underway to reduce human intervention in data processing, this attempt alone may not suffice. The huge quantum of data that needs to be processed entails accelerated processing to be enabled. GPUs, which were originally designed to provide efficient visualization, are being massively employed for computation-intensive parallel processing environments. Image processing in general, and hence automated feature extraction, is highly computation intensive, where performance improvements have a direct impact on societal needs. In this context, an algorithm has been formulated for automated feature extraction from a panchromatic or multispectral image based on image processing techniques. Two Laplacian of Gaussian (LoG) masks were applied on the image individually, followed by detection of zero crossing points and extraction of the pixels based on their standard deviation with the surrounding pixels. The two extracted images with different LoG masks were combined together, which resulted in an image with the extracted features and edges. Finally, the user is at liberty to apply the image smoothing step depending on the noise content in the extracted image. The image is passed through a hybrid median filter to remove the salt and pepper noise from the image. This paper discusses the aforesaid algorithm for automated feature extraction, the necessity of deployment of GPUs for the same, system-level challenges, and quantifies the benefits of integrating GPUs in such an environment. The ...
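
    The per-pixel pipeline described (two LoG masks, zero-crossing detection, a local standard-deviation test) can be prototyped on the CPU with scipy before any GPU port; the kernel sizes and threshold below are illustrative assumptions, and generic_filter is slow and used here only for clarity.

      import numpy as np
      from scipy import ndimage

      def log_edge_features(image, sigmas=(1.0, 2.0), std_thresh=5.0):
          """Laplacian-of-Gaussian feature/edge mask for a panchromatic band."""
          image = image.astype(float)
          combined = np.zeros(image.shape, dtype=bool)
          local_std = ndimage.generic_filter(image, np.std, size=3)
          for sigma in sigmas:
              log = ndimage.gaussian_laplace(image, sigma=sigma)
              # Horizontal sign changes mark zero crossings of the LoG response.
              zc = np.zeros(image.shape, dtype=bool)
              zc[:, :-1] = np.sign(log)[:, :-1] != np.sign(log)[:, 1:]
              combined |= zc & (local_std > std_thresh)
          return combined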

  17. Automated extraction of chemical structure information from digital raster images

    Directory of Open Access Journals (Sweden)

    Shedden Kerby A

    2009-02-01

    Background: To search for chemical structures in research articles, diagrams or text representing molecules need to be translated to a standard chemical file format compatible with cheminformatic search engines. Nevertheless, chemical information contained in research articles is often referenced as analog diagrams of chemical structures embedded in digital raster images. To automate analog-to-digital conversion of chemical structure diagrams in scientific research articles, several software systems have been developed, but their algorithmic performance and utility in cheminformatic research have not been investigated. Results: This paper aims to provide critical reviews of these systems and also reports our recent development of ChemReader, a fully automated tool for extracting chemical structure diagrams from research articles and converting them into standard, searchable chemical file formats. Basic algorithms for recognizing lines and letters representing bonds and atoms in chemical structure diagrams can be independently run in sequence from a graphical user interface, and the algorithm parameters can be readily changed, to facilitate additional development specifically tailored to a chemical database annotation scheme. Compared with existing software programs such as OSRA, Kekule, and CLiDE, our results indicate that ChemReader outperforms other software systems on several sets of sample images from diverse sources in terms of the rate of correct outputs and the accuracy of extracting molecular substructure patterns. Conclusion: The availability of ChemReader as a cheminformatic tool for extracting chemical structure information from digital raster images allows research and development groups to enrich their chemical structure databases by annotating the entries with published research articles. Based on its stable performance and high accuracy, ChemReader may be sufficiently accurate for annotating the chemical database with links...

  18. Weak Fault Feature Extraction of Rolling Bearings Based on an Improved Kurtogram.

    Science.gov (United States)

    Chen, Xianglong; Feng, Fuzhou; Zhang, Bingzhi

    2016-09-13

    Kurtograms have been verified to be an efficient tool in bearing fault detection and diagnosis because of their superiority in extracting transient features. However, the short-time Fourier Transform is insufficient in time-frequency analysis and kurtosis is deficient in detecting cyclic transients. Those factors weaken the performance of the original kurtogram in extracting weak fault features. Correlated Kurtosis (CK) is then designed, as a more effective solution, in detecting cyclic transients. Redundant Second Generation Wavelet Packet Transform (RSGWPT) is deemed to be effective in capturing more detailed local time-frequency description of the signal, and restricting the frequency aliasing components of the analysis results. The authors in this manuscript, combining the CK with the RSGWPT, propose an improved kurtogram to extract weak fault features from bearing vibration signals. The analysis of simulation signals and real application cases demonstrate that the proposed method is relatively more accurate and effective in extracting weak fault features.
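
    Correlated Kurtosis rewards transients that repeat with a known period T (in samples); the definition sketched below follows the form used in the maximum correlated kurtosis deconvolution literature and is an assumption about the exact variant used in this record.

      import numpy as np

      def correlated_kurtosis(x, period, m_shifts=1):
          """CK_M(T) = sum_n (prod_{m=0..M} x[n - m*T])^2 / (sum_n x[n]^2)^(M+1)."""
          x = np.asarray(x, dtype=float)
          prod = x.copy()
          for m in range(1, m_shifts + 1):
              shifted = np.roll(x, m * period)
              shifted[: m * period] = 0.0     # zero-pad instead of wrapping around
              prod = prod * shifted
          return float(np.sum(prod ** 2) / np.sum(x ** 2) ** (m_shifts + 1))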

  19. Weak Fault Feature Extraction of Rolling Bearings Based on an Improved Kurtogram

    Directory of Open Access Journals (Sweden)

    Xianglong Chen

    2016-09-01

    Kurtograms have been verified to be an efficient tool in bearing fault detection and diagnosis because of their superiority in extracting transient features. However, the short-time Fourier Transform is insufficient in time-frequency analysis and kurtosis is deficient in detecting cyclic transients. Those factors weaken the performance of the original kurtogram in extracting weak fault features. Correlated Kurtosis (CK) is then designed, as a more effective solution, in detecting cyclic transients. Redundant Second Generation Wavelet Packet Transform (RSGWPT) is deemed to be effective in capturing more detailed local time-frequency description of the signal, and restricting the frequency aliasing components of the analysis results. The authors in this manuscript, combining the CK with the RSGWPT, propose an improved kurtogram to extract weak fault features from bearing vibration signals. The analysis of simulation signals and real application cases demonstrate that the proposed method is relatively more accurate and effective in extracting weak fault features.

  20. Faults

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Through the study of faults and their effects, much can be learned about the size and recurrence intervals of earthquakes. Faults also teach us about crustal...

  1. Automated Training Sample Extraction for Global Land Cover Mapping

    Directory of Open Access Journals (Sweden)

    Julien Radoux

    2014-05-01

    Land cover is one of the essential climate variables of the ESA Climate Change Initiative (CCI). In this context, the Land Cover CCI (LC CCI) project aims at building global land cover maps suitable for climate modeling based on Earth observation by satellite sensors. The challenge is to generate a set of successive maps that are both accurate and consistent over time. To do so, operational methods for the automated classification of optical images are investigated. The proposed approach consists of a locally trained classification using an automated selection of training samples from existing, but outdated, land cover information. Combinations of local extraction (based on spatial criteria) and self-cleaning of training samples (based on spectral criteria) are quantitatively assessed. Two large study areas, one in Eurasia and the other in South America, are considered. The proposed morphological cleaning of the training samples leads to higher accuracies than the statistical outlier removal in the spectral domain. An optimal neighborhood has been identified for the local sample extraction. The results are coherent for the two test areas, showing an improvement of the overall accuracy compared with the original reference datasets and a significant reduction of macroscopic errors. More importantly, the proposed method partly controls the reliability of existing land cover maps as sources of training samples for supervised classification.

  2. Fault Feature Extraction of Rolling Bearing Based on an Improved Cyclical Spectrum Density Method

    Institute of Scientific and Technical Information of China (English)

    LI Min; YANG Jianhong; WANG Xiaojing

    2015-01-01

    The traditional cyclical spectrum density (CSD) method is widely used to analyze the fault signals of rolling bearings. All modulation frequencies are demodulated in the cyclic frequency spectrum. Consequently, recognizing the bearing fault type is difficult. Therefore, a new CSD method based on kurtosis (CSDK) is proposed. The kurtosis value of each cyclic frequency is used to measure the modulation capability of that cyclic frequency. When the kurtosis value is large, the modulation capability is strong. Thus, the kurtosis value is regarded as the weight coefficient to accumulate all cyclic frequencies to extract fault features. Compared with the traditional method, CSDK can reduce the interference of harmonic frequency in the fault frequency, which makes fault characteristics distinct from background noise. To validate the effectiveness of the method, experiments are performed on a simulation signal, the fault signal of a bearing outer race in the test bed, and a signal gathered from the bearing of a blast furnace belt cylinder. Experimental results show that the CSDK is better than the resonance demodulation method and the CSD in extracting fault features and recognizing degradation trends. The proposed method provides a new solution to fault diagnosis in bearings.

  3. Application of Waveform Factors in Extracting Fault Trend of Rotary Machines

    Institute of Scientific and Technical Information of China (English)

    YE Yu-gang; ZUO Yun-bo; HUANG Xiao-bin

    2009-01-01

    Vibration intensity and non-dimensional amplitude parameters are often used to extract the fault trend of rotary machines. However, they are parameters related to energy and cannot describe the fault trend when the load and operating conditions vary or when the change in the vibration signal is too slight. For this reason, three non-dimensional parameters are presented, namely the waveform repeatability factor, the waveform jumping factor and the waveform similarity factor, jointly called waveform factors, which are based on statistical analysis of the waveform and are sensitive to changes in the signal waveform. When they are used to extract the fault trend of rotary machines as an instrumentation technique, they reflect the fault trend better than vibration intensity, peak amplitude and the margin index.

  4. A Fault Feature Extraction Method for Rolling Bearing Based on Pulse Adaptive Time-Frequency Transform

    Directory of Open Access Journals (Sweden)

    Jinbao Yao

    2016-01-01

    The shock pulse method is a widely used technique for condition monitoring of rolling bearings. However, it may cause erroneous diagnoses in the presence of strong background noise or other shock sources. To overcome this shortcoming, a pulse adaptive time-frequency transform method is proposed to extract the fault features of a damaged rolling bearing. The method arranges the rolling bearing shock pulses extracted by the shock pulse method in time order and takes the reciprocal of the time interval between the pulse at any moment and each other pulse as the instantaneous frequency components at that moment. It then visually displays the variation of each instantaneous frequency after a plane transformation of the instantaneous frequency components, realizes the time-frequency transform of the shock pulse sequence through amplitude-relevancy processing in the time-frequency domain, and highlights the fault feature frequencies through effective instantaneous frequency extraction, so as to extract the fault features of the damaged rolling bearing. The results of simulation and application show that the proposed method can suppress noise well, highlight the fault feature frequencies, and avoid erroneous diagnosis, so it is an effective fault feature extraction method for rolling bearings with high time-frequency resolution.

  5. Evaluation of Four Automated Protocols for Extraction of DNA from FTA Cards

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura

    2013-01-01

    Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cells. With the QIAamp DNA Investigator and QIAsymphony DNA Investigator kits, it was possible to extract DNA from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore...

  7. The Fault Feature Extraction of Rolling Bearing Based on EMD and Difference Spectrum of Singular Value

    Directory of Open Access Journals (Sweden)

    Te Han

    2016-01-01

    Full Text Available Nowadays, the fault diagnosis of rolling bearings in aeroengines is based on the vibration signal measured on the casing instead of on the bearing block. However, the vibration signal of the bearing is often covered by a series of complex components caused by other structures (rotor, gears). Therefore, when a bearing fails, it is not certain that the fault feature can be extracted from the vibration signal on the casing. In order to solve this problem, a novel fault feature extraction method for rolling bearings based on empirical mode decomposition (EMD) and the difference spectrum of singular values is proposed in this paper. Firstly, the vibration signal is decomposed by EMD. Next, the difference spectrum of singular values method is applied. The study finds that each peak on the difference spectrum corresponds to a component in the original signal. According to the peaks on the difference spectrum, the component signal of the bearing fault can be reconstructed. To validate the proposed method, bearing fault data collected on the casing are analyzed. The results indicate that the proposed rolling bearing diagnosis method can accurately extract the fault feature that is submerged in other component signals and noise.
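
    The EMD step is omitted below (it would typically rely on a separate EMD package); the sketch shows only the singular value difference spectrum part, applied to a Hankel (trajectory) matrix built from one decomposed component. All parameter choices are assumptions, not the authors' settings.

      import numpy as np
      from scipy.linalg import hankel

      def svd_difference_spectrum_denoise(x, n_rows=200):
          # Build a Hankel (trajectory) matrix from the signal (requires len(x) > n_rows).
          n_cols = len(x) - n_rows + 1
          H = hankel(x[:n_rows], x[n_rows - 1:])
          U, s, Vt = np.linalg.svd(H, full_matrices=False)
          # Difference spectrum of the singular values; its largest peak marks the
          # boundary between the dominant component and the rest of the mixture.
          diff = -np.diff(s)
          k = int(np.argmax(diff)) + 1             # keep components up to the largest peak
          H_rec = (U[:, :k] * s[:k]) @ Vt[:k]
          # Average along anti-diagonals to map the reduced matrix back to a 1-D signal.
          rec = np.array([np.mean(np.diag(H_rec[:, ::-1], d))
                          for d in range(n_cols - 1, -n_rows, -1)])
          return rec, diff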

  8. A new rolling bearing fault diagnosis method based on GFT impulse component extraction

    Science.gov (United States)

    Ou, Lu; Yu, Dejie; Yang, Hanjian

    2016-12-01

    Periodic impulses are vital indicators of rolling bearing faults. The extraction of impulse components from rolling bearing vibration signals is of great importance for fault diagnosis. In this paper, vibration signals are taken as path graph signals from a manifold perspective, and the Graph Fourier Transform (GFT) of the vibration signal is investigated in the graph spectrum domain; both ideas are thereby introduced into vibration signal analysis. To extract the impulse components efficiently, a new adjacency weight matrix is defined, and then the GFTs of the impulse component and the harmonic component in rolling bearing vibration signals are analyzed. Furthermore, as the GFT graph spectrum of the impulse component is mainly concentrated in the high-order region, a new rolling bearing fault diagnosis method based on GFT impulse component extraction is proposed. In the proposed method, the GFT of a vibration signal is first performed, and its graph spectrum coefficients in the high-order region are extracted to reconstruct different impulse components. Next, the Hilbert envelope spectra of these impulse components are calculated, and the envelope spectrum values at the fault characteristic frequency are arranged in order. The envelope spectrum with the maximum value at the fault characteristic frequency is selected as the final result, from which the rolling bearing fault can be diagnosed. Finally, an index KR, which is the product of the kurtosis and the Hilbert envelope spectrum fault feature ratio of the extracted impulse component, is put forward to measure the performance of the proposed method. Simulations and experiments are used to demonstrate the feasibility and effectiveness of the proposed method.
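
    A minimal sketch of a GFT on a path graph with unit edge weights (the paper defines its own adjacency weight matrix, which is not reproduced here). Because the Laplacian eigendecomposition is O(n^3), this is meant for short signal frames; the keep_ratio cutoff is an illustrative assumption.

      import numpy as np

      def graph_fourier_transform(x):
          # Treat the vibration frame as a signal on a path graph: sample i is a
          # vertex connected to samples i-1 and i+1 with unit weights.
          n = len(x)
          W = np.zeros((n, n))
          idx = np.arange(n - 1)
          W[idx, idx + 1] = W[idx + 1, idx] = 1.0
          L = np.diag(W.sum(axis=1)) - W           # combinatorial graph Laplacian
          eigvals, eigvecs = np.linalg.eigh(L)     # graph frequencies and Fourier basis
          coeffs = eigvecs.T @ x                   # graph spectrum of the signal
          return eigvals, eigvecs, coeffs

      def reconstruct_high_order(x, keep_ratio=0.3):
          # Keep only the high-order (large-eigenvalue) part of the graph spectrum,
          # where the impulsive component concentrates, and transform back.
          eigvals, eigvecs, coeffs = graph_fourier_transform(x)
          cutoff = int((1.0 - keep_ratio) * len(coeffs))
          coeffs[:cutoff] = 0.0
          return eigvecs @ coeffs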

  9. Automatic extraction of faults and fractal analysis from remote sensing data

    Directory of Open Access Journals (Sweden)

    R. Gloaguen

    2007-01-01

    Full Text Available Object-based classification is a promising technique for image classification. Unlike pixel-based methods, which only use the measured radiometric values, object-based techniques can also use shape and context information of scene textures. These extra degrees of freedom provided by the objects allow the automatic identification of geological structures. In this article, we present an evaluation of object-based classification in the context of extraction of geological faults. Digital elevation models and radar data of an area near Lake Magadi (Kenya) have been processed. We then determine the statistics of the fault populations. The fractal dimensions of the fault populations are similar to fractal dimensions measured directly on remote sensing images of the study area using power spectra (PSD) and variograms. These methods allow unbiased statistics of faults and help us to understand the evolution of the fault systems in extensional domains. Furthermore, the direct analysis of image texture is a good indicator of the fault statistics and allows us to classify the intensity and type of deformation. We propose that extensional fault networks can be modeled by iterated function systems (IFS).
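
    As a companion to the fault statistics described above, the sketch below estimates the fractal dimension of an extracted binary fault mask by box counting. It is a generic illustration under simple assumptions (non-empty mask, power-of-two box sizes), not the PSD/variogram procedure used in the paper.

      import numpy as np

      def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32, 64)):
          # mask: 2-D boolean array, True where a fault trace was extracted.
          counts = []
          for s in box_sizes:
              # Trim the mask so it tiles exactly into s-by-s boxes, then count
              # boxes that contain at least one fault pixel.
              h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
              tiles = mask[:h, :w].reshape(h // s, s, w // s, s)
              counts.append(np.sum(tiles.any(axis=(1, 3))))
          # The fractal dimension is the slope of log(count) versus log(1/size).
          slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
          return slope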

  10. Auditory-model-based Feature Extraction Method for Mechanical Faults Diagnosis

    Institute of Scientific and Technical Information of China (English)

    LI Yungong; ZHANG Jinping; DAI Li; ZHANG Zhanyi; LIU Jie

    2010-01-01

    It is well known that the human auditory system possesses remarkable capabilities to analyze and identify signals. It would therefore be significant to build an auditory model based on the mechanism of the human auditory system, which may improve the effectiveness of mechanical signal analysis and enrich the methods of mechanical fault feature extraction. However, existing methods are all based on explicit mathematical or physical formulations and have shortcomings in distinguishing different faults, in stability, and in suppressing disturbance noise. To improve feature extraction performance, an auditory model, the early auditory (EA) model, is introduced for the first time. This auditory model transforms the time-domain signal into an auditory spectrum via bandpass filtering, nonlinear compression, and lateral inhibition, simulating the principle of the human auditory system. The EA model is developed with a Gammatone filterbank as the basilar membrane. According to the characteristics of vibration signals, a method is proposed for determining the parameters of the inner hair cell model of the EA model. The performance of the EA model is evaluated through experiments on four rotor faults, including misalignment, rotor-to-stator rubbing, oil film whirl, and pedestal looseness. The results show that the auditory spectrum, the output of the EA model, can effectively distinguish different faults with satisfactory stability and has the ability to suppress disturbance noise. It is therefore feasible to apply the auditory model as a new and effective method of feature extraction for mechanical fault diagnosis.

  11. Automated Fault Diagnostics, Prognostics, and Recovery in Spacecraft Power Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault detection and isolation (FDI) in spacecraft's electrical power system (EPS) has always received special attention. However, the power systems health management...

  12. Fault Diagnosis Technology Applied in Metering Automation Systems

    Institute of Scientific and Technical Information of China (English)

    危阜胜; 肖勇; 陈锐民

    2013-01-01

    In view of the development of remote fault diagnosis technology and metering automation systems at home and abroad, and of the fault maintenance problems faced by metering automation terminals, this paper proposes a solution for remote fault diagnosis of metering automation terminals in field operation and presents an independently developed expert system for remote fault diagnosis of such terminals. The system realizes, for the first time, remote fault diagnosis of large numbers of metering automation terminals in the field.

  13. Reliable Fault Classification of Induction Motors Using Texture Feature Extraction and a Multiclass Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Jia Uddin

    2014-01-01

    Full Text Available This paper proposes a method for the reliable fault detection and classification of induction motors using two-dimensional (2D) texture features and a multiclass support vector machine (MCSVM). The proposed model first converts time-domain vibration signals to 2D gray images, resulting in texture (or repetitive) patterns, and extracts texture features by generating the dominant neighborhood structure (DNS) map. Principal component analysis (PCA) is then used to reduce the dimensionality of the feature vector containing the extracted texture features, because a high-dimensional feature vector can degrade classification performance; in this way an effective feature vector of discriminative fault features is configured for diagnosis. Finally, the proposed approach utilizes one-against-all (OAA) multiclass support vector machines (MCSVMs) to identify induction motor failures. In this study, the Gaussian radial basis function kernel cooperates with the OAA MCSVMs to deal with nonlinear fault features. Experimental results demonstrate that the proposed approach outperforms three state-of-the-art fault diagnosis algorithms in terms of fault classification accuracy, yielding an average classification accuracy of 100% even in noisy environments.
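
    A rough sketch of the overall pipeline under simplifying assumptions: raw image pixels stand in for the DNS texture map, and scikit-learn's one-vs-rest wrapper plays the role of the OAA MCSVM. Frame length, PCA dimension and kernel settings are illustrative.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.multiclass import OneVsRestClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      def signal_to_gray_image(x, size=64):
          # Reshape a time-domain vibration frame (len(x) >= size*size assumed) into a
          # square gray image scaled to 0-255, so repetitive structure appears as texture.
          frame = x[:size * size].reshape(size, size)
          frame = (frame - frame.min()) / (frame.max() - frame.min() + 1e-12)
          return (frame * 255).astype(np.uint8)

      def build_classifier():
          # Hypothetical training data: X holds flattened images, y integer fault labels.
          return make_pipeline(
              StandardScaler(),
              PCA(n_components=30),                    # reduce the texture feature dimension
              OneVsRestClassifier(SVC(kernel="rbf")),  # one-against-all RBF-kernel SVMs
          )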

  14. Manifold Learning with Self-Organizing Mapping for Feature Extraction of Nonlinear Faults in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    Lin Liang

    2015-01-01

    Full Text Available A new method for automatically extracting low-dimensional features with a self-organizing mapping manifold is proposed for the detection of nonlinear faults in rotating machinery (such as rubbing or pedestal looseness). In the phase space reconstructed from a single vibration signal, self-organizing mapping (SOM) with an expectation-maximization iteration algorithm is used to divide the local neighborhoods adaptively without manual intervention. After that, the local tangent space alignment algorithm is adopted to compress the high-dimensional phase space into a low-dimensional feature space. The proposed method takes advantage of manifold learning for low-dimensional feature extraction and of the adaptive neighborhood construction of SOM, and can extract intrinsic fault features of interest in a two-dimensional projection space. To evaluate the performance of the proposed method, the Lorenz system was simulated and test data from rotating machinery with nonlinear faults were acquired. Compared with holospectrum approaches, the results reveal that the proposed method is superior in identifying faults and effective for rotating machinery condition monitoring.
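
    A reduced sketch using scikit-learn's LTSA implementation on a time-delay phase space reconstruction; the SOM-based adaptive neighborhood construction of the paper is replaced by a fixed neighborhood size, and the embedding and delay parameters are illustrative (use a short, decimated segment to keep the point cloud small).

      import numpy as np
      from sklearn.manifold import LocallyLinearEmbedding

      def phase_space_embedding(x, dim=10, delay=5):
          # Time-delay embedding: each row is one point in the dim-dimensional phase space.
          n_points = len(x) - (dim - 1) * delay
          return np.column_stack([x[i * delay: i * delay + n_points] for i in range(dim)])

      def ltsa_features(x, n_neighbors=12):
          # Compress the reconstructed phase space into a two-dimensional feature space
          # with local tangent space alignment (fixed neighborhood size instead of SOM).
          points = phase_space_embedding(x)
          ltsa = LocallyLinearEmbedding(n_neighbors=n_neighbors, n_components=2, method="ltsa")
          return ltsa.fit_transform(points)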

  15. Feature extraction of induction motor stator fault based on particle swarm optimization and wavelet packet

    Institute of Scientific and Technical Information of China (English)

    WANG Pan-pan; SHI Li-ping; HU Yong-jun; MIAO Chang-xin

    2012-01-01

    To effectively extract the interturn short circuit fault features of an induction motor from the stator current signal, a novel feature extraction method based on the bare-bones particle swarm optimization (BBPSO) algorithm and the wavelet packet was proposed. First, according to the maximum inner product between the current signal and the cosine basis functions, this method precisely estimates the waveform parameters of the fundamental component using the powerful global search capability of the BBPSO, which allows the fundamental component to be eliminated without affecting the other harmonic components. Then, the harmonic components of the residual current signal are decomposed into a series of frequency bands by the wavelet packet to extract the interturn short circuit fault features of the induction motor. Finally, the results of simulation and laboratory tests demonstrated the effectiveness of the proposed method.
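
    A sketch of the wavelet packet part only, assuming the BBPSO step has already removed the fundamental component and left a residual current signal; the wavelet, decomposition level and relative-energy feature are illustrative choices rather than the authors' settings.

      import numpy as np
      import pywt

      def wavelet_packet_band_energies(residual, wavelet="db4", level=3):
          # Decompose the residual current (fundamental removed) into 2**level frequency
          # bands with a wavelet packet and return the relative energy of each band as
          # the fault feature vector.
          wp = pywt.WaveletPacket(data=residual, wavelet=wavelet, mode="symmetric", maxlevel=level)
          nodes = wp.get_level(level, order="freq")
          energies = np.array([np.sum(np.square(node.data)) for node in nodes])
          return energies / (energies.sum() + 1e-12)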

  16. Automated extraction of DNA and PCR setup using a Tecan Freedom EVO® liquid handler

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Frøslev, Tobias G.; Frank-Hansen, Rune

    2009-01-01

    We have implemented and validated automated methods for DNA extraction and PCR setup developed for a Tecan Freedom EVO® liquid handler mounted with a Te-MagS™ magnetic separation device. The DNA was extracted using the Qiagen MagAttract® DNA Mini M48 kit. The DNA was amplified using Amp...

  17. An adaptive morphological impulses extraction method and its application to fault diagnosis

    Institute of Scientific and Technical Information of China (English)

    He Wei; Jiang Zhinong; Gao Jinji; Wang Hui

    2010-01-01

    An adaptive morphological impulses extraction method (AMIE) for bearing fault diagnosis is proposed. This method uses the morphological closing operation with a flat structuring element (SE) to extract impulsive features from vibration signals with strong background noise. To optimize the flat SE, firstly, a theoretical study is carried out to investigate the effects of the length of the flat SE. Then, based on the theoretical findings, an adaptive algorithm for flat SE optimization is proposed. The AMIE method is tested on a simulated signal and on bearing vibration signals. The test results show that this method is effective and robust in extracting impulsive features.
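
    A simplified sketch of flat-SE morphological closing with an adaptive SE length; kurtosis maximization is used here as a stand-in selection criterion and is not necessarily the optimization rule derived in the paper, and the candidate SE lengths are assumptions.

      import numpy as np
      from scipy.ndimage import grey_closing
      from scipy.stats import kurtosis

      def adaptive_morphological_impulses(x, se_lengths=range(3, 61, 2)):
          # Morphological closing with a flat structuring element of each candidate
          # length; keep the output whose impulsiveness (kurtosis) is largest.
          best_len, best_kurt, best_out = None, -np.inf, None
          for length in se_lengths:
              closed = grey_closing(x, size=length)
              k = kurtosis(closed)
              if k > best_kurt:
                  best_len, best_kurt, best_out = length, k, closed
          return best_out, best_len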

  18. Automated serial extraction of DNA and RNA from biobanked tissue specimens

    OpenAIRE

    Mathot, Lucy; Wallin, Monica; Sjöblom, Tobias

    2013-01-01

    Background: With increasing biobanking of biological samples, methods for large scale extraction of nucleic acids are in demand. The lack of such techniques designed for extraction from tissues results in a bottleneck in downstream genetic analyses, particularly in the field of cancer research. We have developed an automated procedure for tissue homogenization and extraction of DNA and RNA into separate fractions from the same frozen tissue specimen. A purpose developed magnetic bead based te...

  19. Evaluation of four automated protocols for extraction of DNA from FTA cards.

    Science.gov (United States)

    Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura; Frank-Hansen, Rune; Poulsen, Lena; Hansen, Anders J; Morling, Niels

    2013-10-01

    Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cells. With the QIAamp DNA Investigator and QIAsymphony DNA Investigator kits, it was possible to extract DNA from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore, we demonstrated that it was possible to successfully extract sufficient DNA for STR profiling from previously processed FTA card pieces that had been stored at 4 °C for up to 1 year. This showed that rare or precious FTA card samples may be saved for future analyses even though some DNA was already extracted from the FTA cards.

  20. A Feature Extraction Method Based on Information Theory for Fault Diagnosis of Reciprocating Machinery

    Science.gov (United States)

    Wang, Huaqing; Chen, Peng

    2009-01-01

    This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to. PMID:22574021

  1. A Feature Extraction Method Based on Information Theory for Fault Diagnosis of Reciprocating Machinery

    Directory of Open Access Journals (Sweden)

    Huaqing Wang

    2009-04-01

    Full Text Available This paper proposes a feature extraction method based on information theory for fault diagnosis of reciprocating machinery. A method to obtain symptom parameter waves is defined in the time domain using the vibration signals, and an information wave is presented based on information theory, using the symptom parameter waves. A new way to determine the difference spectrum of envelope information waves is also derived, by which the feature spectrum can be extracted clearly and machine faults can be effectively differentiated. This paper also compares the proposed method with the conventional Hilbert-transform-based envelope detection and with a wavelet analysis technique. Practical examples of diagnosis for a rolling element bearing used in a diesel engine are provided to verify the effectiveness of the proposed method. The verification results show that the bearing faults that typically occur in rolling element bearings, such as outer-race, inner-race, and roller defects, can be effectively identified by the proposed method, while these bearing faults are difficult to detect using either of the other techniques it was compared to.

  2. Self adaptive multi-scale morphology AVG-Hat filter and its application to fault feature extraction for wheel bearing

    Science.gov (United States)

    Deng, Feiyue; Yang, Shaopu; Tang, Guiji; Hao, Rujiang; Zhang, Mingliang

    2017-04-01

    Wheel bearings are essential mechanical components of trains, and fault detection of the wheel bearing is of great significance for avoiding economic loss and casualties. However, under realistic operating conditions, detection and extraction of fault features hidden in the heavy noise of the vibration signal is a challenging task. Therefore, a novel method called the adaptive multi-scale AVG-Hat morphology filter (MF) is proposed to solve it. The morphology AVG-Hat operator not only suppresses the interference of strong background noise greatly, but also enhances the ability to extract fault features. The improved envelope spectrum sparsity (IESS) is proposed as a new evaluation index to select the optimal filtered signal processed by the multi-scale AVG-Hat MF; it provides a comprehensive evaluation of the intensity of the fault impulses relative to the background noise. The weighted coefficients of the different scale structural elements (SEs) in the multi-scale MF are adaptively determined by the particle swarm optimization (PSO) algorithm. The effectiveness of the method is validated by analyzing real wheel bearing fault vibration signals (e.g. outer race fault, inner race fault and rolling element fault). The results show that the proposed method extracts fault features more effectively than the multi-scale combined morphological filter (CMF) and multi-scale morphology gradient filter (MGF) methods.

  3. Extracting invariable fault features of rotating machines with multi-ICA networks

    Institute of Scientific and Technical Information of China (English)

    焦卫东; 杨世锡; 吴昭同

    2003-01-01

    This paper proposes novel multi-layer neural networks based on independent component analysis (ICA) for feature extraction of fault modes. By the use of ICA, invariable features embedded in multi-channel vibration measurements under different operating conditions (rotating speed and/or load) can be captured together. Thus, stable MLP classifiers insensitive to the variation of operating conditions are constructed. The successful results achieved in selected experiments indicate the great potential of ICA in health condition monitoring of rotating machines.
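
    A minimal sketch of the ICA stage using scikit-learn's FastICA (the multi-layer network and MLP classifier of the paper are not reproduced); the array shapes and component count are assumptions.

      import numpy as np
      from sklearn.decomposition import FastICA

      def extract_invariant_components(channels, n_components=3):
          # channels: array of shape (n_samples, n_channels) holding simultaneous
          # multi-channel vibration measurements. ICA separates them into statistically
          # independent components; features computed on these components are less
          # sensitive to speed/load changes than features on the raw channels.
          ica = FastICA(n_components=n_components, random_state=0)
          sources = ica.fit_transform(channels)    # shape (n_samples, n_components)
          return sources, ica.mixing_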

  5. Acoustic diagnosis of mechanical fault feature based on reference signal frequency domain semi-blind extraction

    Directory of Open Access Journals (Sweden)

    Zeguang YI

    2015-08-01

    Full Text Available Aiming at the fault diagnosis problems caused by complex machinery parts, serious background noise and the limitations of traditional blind signal processing algorithms when applied to mechanical acoustic signals, a fault acoustic diagnosis method based on reference-signal frequency-domain semi-blind extraction is proposed. The key technologies are as follows: based on a frequency-domain blind deconvolution algorithm, the artificial fish swarm algorithm, which is well suited to global optimization, is used to construct improved multi-scale morphological filters applicable to mechanical failures in order to weaken the background noise; a reference signal is built from the structural parameters of the machine parts, and blind separation of the complex components is carried out segment by segment on the denoised signals using a reference-signal-unit semi-blind extraction algorithm; the improved KL distance between complex independent components is then employed as the distance measure to resolve the permutation ambiguity, and finally the mechanical fault characteristic signals are extracted and separated. Results of actual acoustic diagnosis of a rolling bearing fault in a sound field environment prove the effectiveness of the algorithm.

  6. Automation System in Rare Earths Countercurrent Extraction Processes

    Institute of Scientific and Technical Information of China (English)

    贾江涛; 严纯华; 廖春生; 吴声; 王明文; 李标国

    2001-01-01

    Based on countercurrent extraction theory for the optimized design and simulation of rare earth separation processes, the selection of the detecting points (stages) and on-line analysis of elements, the simulation of the open-loop response and its response speed, and the diagnosis and regulation prescriptions for running solvent extraction cascades were studied.

  7. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications.

    Science.gov (United States)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik; Rand, Kasper D; Honoré Hansen, Steen; Petersen, Nickolaj Jacob

    2016-07-05

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to an HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to an LC-MS system for further analysis. Various parameters affecting extraction efficacy were investigated, including syringe fill strokes, syringe pull-up volume, pull-up delay and volume in the sample vial. The system was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME syringe with acidic wash buffer and reversing the applied electric potential, carry-over between samples can be reduced to below 1%. The performance of the system was characterized in terms of precision (RSD), and owing to the high extraction speed of EME, a complete analytical workflow of purification, separation, and analysis of a sample could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME autosampler a powerful tool for a wide range of applications where high-throughput extractions are required before sample analysis.

  8. Automated microfluidic DNA/RNA extraction with both disposable and reusable components

    Science.gov (United States)

    Kim, Jungkyu; Johnson, Michael; Hill, Parker; Sonkul, Rahul S.; Kim, Jongwon; Gale, Bruce K.

    2012-01-01

    An automated microfluidic nucleic acid extraction system was fabricated with a multilayer polydimethylsiloxane (PDMS) structure that consists of sample wells, microvalves, a micropump and a disposable microfluidic silica cartridge. Both the microvalves and micropump structures were fabricated in a single layer and are operated pneumatically using a 100 µm PDMS membrane. To fabricate the disposable microfluidic silica cartridge, two-cavity structures were made in a PDMS replica to fit the stacked silica membranes. A handheld controller for the microvalves and pumps was developed to enable system automation. With purified ribonucleic acid (RNA), whole blood and E. coli samples, the automated microfluidic nucleic acid extraction system was validated with a guanidine-based solid phase extraction procedure. An extraction efficiency of ~90% for deoxyribonucleic acid (DNA) and ~54% for RNA was obtained in 12 min from whole blood and E. coli samples, respectively. In addition, the same quantity and quality of extracted DNA were confirmed by polymerase chain reaction (PCR) amplification. The PCR also presented the appropriate amplification and melting profiles. Automated, programmable fluid control and physical separation of the reusable components and the disposable components significantly decrease the assay time and manufacturing cost and increase the flexibility and compatibility of the system with downstream components.

  9. Natural Environment Modeling and Fault-Diagnosis for Automated Agricultural Vehicle

    DEFF Research Database (Denmark)

    Blas, Morten Rufus; Blanke, Mogens

    2008-01-01

    This paper presents results for an automatic navigation system for agricultural vehicles. The system uses stereo-vision, inertial sensors and GPS. Special emphasis has been placed on modeling the natural environment in conjunction with a fault-tolerant navigation system. The results are exemplified...

  10. Feature Extraction and Selection Strategies for Automated Target Recognition

    Science.gov (United States)

    Greene, W. Nicholas; Zhang, Yuhan; Lu, Thomas T.; Chao, Tien-Hsin

    2010-01-01

    Several feature extraction and selection methods for an existing automatic target recognition (ATR) system using JPL's Grayscale Optical Correlator (GOC) and Optimal Trade-Off Maximum Average Correlation Height (OT-MACH) filter were tested using MATLAB. The ATR system is composed of three stages: a cursory region-of-interest (ROI) search using the GOC and OT-MACH filter, a feature extraction and selection stage, and a final classification stage. Feature extraction and selection concerns transforming potential target data into more useful forms as well as selecting important subsets of that data which may aid in detection and classification. The strategies tested were built around two popular extraction methods: Principal Component Analysis (PCA) and Independent Component Analysis (ICA). Performance was measured based on the classification accuracy and free-response receiver operating characteristic (FROC) output of a support vector machine (SVM) and a neural net (NN) classifier.

  11. Fault feature extraction and enhancement of rolling element bearing in varying speed condition

    Science.gov (United States)

    Ming, A. B.; Zhang, W.; Qin, Z. Y.; Chu, F. L.

    2016-08-01

    In engineering applications, the variability of the load usually varies the shaft speed, which degrades the efficacy of diagnostic methods based on the hypothesis of constant-speed analysis. Therefore, the investigation of diagnostic methods suitable for the varying speed condition is significant for bearing fault diagnosis. In this paper, a novel fault feature extraction and enhancement procedure is proposed that combines iterative envelope analysis with a low-pass filtering operation. At first, based on the analytical model of the collected vibration signal, the envelope signal is theoretically calculated and the iterative envelope analysis is improved for the varying speed condition. Then, a feature enhancement procedure is performed by applying a low-pass filter to the temporal envelope obtained by the iterative envelope analysis. Finally, the temporal envelope signal is transformed to the angular domain by computed order tracking and the fault feature is extracted from the squared envelope spectrum. Simulations and experiments are used to validate the efficacy of the theoretical analysis and the proposed procedure. It is shown that computed order tracking should be applied to the envelope of the signal in order to avoid energy spreading and amplitude distortion. Compared with the feature enhancement method based on the fast kurtogram and the corresponding optimal band-pass filtering, the proposed method can efficiently extract the fault characteristics in the varying speed condition with less amplitude attenuation. Furthermore, since it does not involve center frequency estimation, the proposed method is more concise for engineering applications.
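
    A sketch of the envelope-plus-low-pass step under simplifying assumptions: a single Hilbert envelope pass rather than the full iterative analysis, and no order tracking; the filter order and cutoff frequency are illustrative.

      import numpy as np
      from scipy.signal import hilbert, butter, filtfilt

      def enhanced_envelope(x, fs, cutoff_hz=500.0):
          # Temporal envelope via the Hilbert transform (one pass of the envelope
          # analysis), followed by the low-pass filtering step that suppresses
          # high-frequency residue before the angular-domain resampling.
          env = np.abs(hilbert(x))
          b, a = butter(4, cutoff_hz / (fs / 2.0), btype="low")
          return filtfilt(b, a, env)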

  12. Feature Extraction Using Discrete Wavelet Transform for Gear Fault Diagnosis of Wind Turbine Gearbox

    Directory of Open Access Journals (Sweden)

    Rusmir Bajric

    2016-01-01

    Full Text Available Vibration diagnosis is one of the most common techniques for condition evaluation of wind turbines equipped with a gearbox. On the other hand, the gearbox is one of the key components of the wind turbine drivetrain. Due to the stochastic operation of wind turbines, the gearbox shaft rotating speed changes over a wide range, which limits the application of traditional vibration signal processing techniques such as the fast Fourier transform. This paper investigates a new approach for wind turbine high-speed-shaft gear fault diagnosis using the discrete wavelet transform and time synchronous averaging (TSA). First, the vibration signals are decomposed into a series of subband signals using the multiresolution property of the discrete wavelet transform. Then, 22 condition indicators are extracted from the TSA signal, residual signal, and difference signal. Through the case study analysis, the new approach reveals the most relevant vibration-based condition indicators for high-speed-shaft gear spalling fault diagnosis and their ability to track fault degradation progression. It is also shown that the proposed approach enhances gearbox fault diagnosis ability in wind turbines. The approach presented in this paper was programmed in the Matlab environment using data acquired on a 2 MW wind turbine.

  13. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    Science.gov (United States)

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A.G.; Sellergren, Börje; Reubsaet, Léon

    2017-01-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. Particularly low sample volumes were permitted using the automated extraction within a method which was time-efficient, thereby demonstrating the potential of such a strategy in a clinical setting. PMID:28303910

  15. Impulse feature extraction method for machinery fault detection using fusion sparse coding and online dictionary learning

    Directory of Open Access Journals (Sweden)

    Deng Sen

    2015-04-01

    Full Text Available Impulse components in vibration signals are important fault features of complex machines. The sparse coding (SC) algorithm has been introduced as an impulse feature extraction method, but it cannot guarantee satisfactory performance when processing vibration signals with heavy background noise. In this paper, a method based on fusion sparse coding (FSC) and online dictionary learning is proposed to extract impulses efficiently. Firstly, a fusion scheme combining different sparse coding algorithms is presented to ensure higher reconstruction accuracy. Then, an improved online dictionary learning method using the FSC scheme is established to obtain a redundant dictionary that can capture specific features of the training samples and reconstruct a sparse approximation of the vibration signals. Simulation shows that this method performs well in solving for sparse coefficients and training the redundant dictionary compared with other methods. Lastly, the proposed method is applied to aircraft engine rotor vibration signals. Compared with other feature extraction approaches, our method can extract impulse features accurately and efficiently from heavily noisy vibration signals, which provides significant support for machinery fault detection and diagnosis.
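
    A rough sketch using scikit-learn's online (mini-batch) dictionary learning with plain OMP coding in place of the fusion sparse coding scheme; the patch length, number of atoms and sparsity level are illustrative assumptions.

      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning

      def sparse_impulse_approximation(x, patch_len=64, n_atoms=32):
          # Slice the vibration signal into overlapping patches, learn a redundant
          # dictionary online, and reconstruct each patch from a few atoms; the sparse
          # approximation retains the impulses and discards most of the noise.
          patches = np.array([x[i:i + patch_len]
                              for i in range(0, len(x) - patch_len, patch_len // 2)])
          dico = MiniBatchDictionaryLearning(n_components=n_atoms,
                                             transform_algorithm="omp",
                                             transform_n_nonzero_coefs=5,
                                             random_state=0)
          codes = dico.fit_transform(patches)
          return codes @ dico.components_          # sparse reconstruction of each patch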

  16. Fault Tolerant Modular Linear Motor for Safe-Critical Automated Industrial Applications

    Directory of Open Access Journals (Sweden)

    Loránd SZABÓ

    2009-05-01

    Full Text Available In various safety-critical industrial, medical and defence applications the translational movements are performed by linear motors. In such applications both the motor and its power converter should be fault tolerant. To fulfil this requirement, redesigned motor structures with novel phase connections must be used. In the paper a modular double salient permanent magnet linear motor is studied. Its phases are split into independent channels. The study of the fault-tolerant capability of the linear motor was performed via co-simulation, using the Flux-to-Simulink technology. The conclusions of the paper can help users select the optimal linear motor topology for their particular application, as a function of the required mean traction force and its acceptable ripple.

  17. Feature Extraction Method of Rolling Bearing Fault Signal Based on EEMD and Cloud Model Characteristic Entropy

    Directory of Open Access Journals (Sweden)

    Long Han

    2015-09-01

    Full Text Available The randomness and fuzziness that exist in rolling bearings when faults occur result in uncertainty in the acquired signals and reduce the accuracy of signal feature extraction. To solve this problem, this study proposes a new method in which cloud model characteristic entropy (CMCE) is set as the signal characteristic eigenvalue. This approach overcomes the parameter-selection difficulties of traditional entropy measures when solving uncertainty problems. First, the acoustic emission signals collected in experiments under normal and damaged rolling bearing states are decomposed via ensemble empirical mode decomposition. The mutual information method is then used to select the sensitive intrinsic mode functions that reflect the signal characteristics, so as to reconstruct the signal and eliminate noise interference. Subsequently, CMCE is set as the eigenvalue of the reconstructed signal. Finally, comparative experiments between sample entropy, root mean square and CMCE show that CMCE can better represent the characteristic information of the fault signal.

  18. Automated extraction protocol for quantification of SARS-Coronavirus RNA in serum: an evaluation study

    Directory of Open Access Journals (Sweden)

    Lui Wing-bong

    2006-02-01

    Full Text Available Abstract Background We have previously developed a test for the diagnosis and prognostic assessment of the severe acute respiratory syndrome (SARS) based on the detection of the SARS-coronavirus RNA in serum by real-time quantitative reverse transcriptase polymerase chain reaction (RT-PCR). In this study, we evaluated the feasibility of automating the serum RNA extraction procedure in order to increase the throughput of the assay. Methods An automated nucleic acid extraction platform using the MagNA Pure LC instrument (Roche Diagnostics) was evaluated. We developed a modified protocol in compliance with the recommended biosafety guidelines from the World Health Organization based on the use of the MagNA Pure total nucleic acid large volume isolation kit for the extraction of SARS-coronavirus RNA. The modified protocol was compared with a column-based extraction kit (QIAamp viral RNA mini kit, Qiagen) for quantitative performance, analytical sensitivity and precision. Results The newly developed automated protocol was shown to be free from carry-over contamination and have comparable performance with other standard protocols and kits designed for the MagNA Pure LC instrument. However, the automated method was found to be less sensitive, less precise and led to consistently lower serum SARS-coronavirus concentrations when compared with the column-based extraction method. Conclusion As the diagnostic efficiency and prognostic value of the serum SARS-CoV RNA RT-PCR test is critically associated with the analytical sensitivity and quantitative performance contributed both by the RNA extraction and RT-PCR components of the test, we recommend the use of the column-based manual RNA extraction method.

  19. RFI detection by automated feature extraction and statistical analysis

    Science.gov (United States)

    Winkel, B.; Kerp, J.; Stanko, S.

    2007-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorithm which performs a two-dimensional baseline fit in the time-frequency domain, searching automatically for RFI signals superposed on the spectral data. We demonstrate that the software operates successfully on computer-generated RFI data as well as on real DFFT data recorded at the Effelsberg 100-m telescope. At 21-cm wavelength RFI signals can be identified down to the 4σ_rms level. A statistical analysis of all RFI events detected in our observational data revealed that: (1) mean signal strength is comparable to the astronomical line emission of the Milky Way, (2) interferences are polarised, (3) electronic devices in the neighbourhood of the telescope contribute significantly to the RFI radiation. We also show that the radiometer equation is no longer fulfilled in the presence of RFI signals.
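
    A minimal sketch of the two-dimensional baseline fit and sigma-clipping idea on a time-frequency dynamic spectrum; the polynomial order and threshold are assumptions for illustration, not the toolbox's actual algorithm.

      import numpy as np

      def flag_rfi(dynamic_spectrum, order=2, n_sigma=4.0):
          # dynamic_spectrum: 2-D array (time x frequency). Fit a low-order 2-D
          # polynomial baseline, then flag pixels deviating from it by more than
          # n_sigma times the robust scatter of the residuals.
          nt, nf = dynamic_spectrum.shape
          t, f = np.meshgrid(np.linspace(-1, 1, nt), np.linspace(-1, 1, nf), indexing="ij")
          # Design matrix with all monomials t**i * f**j up to the given total degree.
          terms = [t**i * f**j for i in range(order + 1) for j in range(order + 1 - i)]
          A = np.column_stack([term.ravel() for term in terms])
          coeffs, *_ = np.linalg.lstsq(A, dynamic_spectrum.ravel(), rcond=None)
          baseline = (A @ coeffs).reshape(nt, nf)
          residual = dynamic_spectrum - baseline
          sigma = 1.4826 * np.median(np.abs(residual - np.median(residual)))
          return residual > n_sigma * sigma        # boolean RFI mask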

  20. RFI detection by automated feature extraction and statistical analysis

    CERN Document Server

    Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan

    2006-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorithm which performs a two-dimensional baseline fit in the time-frequency domain, searching automatically for RFI signals superposed on the spectral data. We demonstrate, that the software operates successfully on computer-generated RFI data as well as on real DFFT data recorded at the Effelsberg 100-m telescope. At 21-cm wavelength RFI signals can be identified down to the 4-sigma level. A statistical analysis of all RFI events detected in our observational data revealed that: (1) mean signal strength is comparable to the a...

  1. Automated extraction improves multiplex molecular detection of infection in septic patients.

    Directory of Open Access Journals (Sweden)

    Benito J Regueiro

    Full Text Available Sepsis is one of the leading causes of morbidity and mortality in hospitalized patients worldwide. Molecular technologies for rapid detection of microorganisms in patients with sepsis have only recently become available. The LightCycler SeptiFast Test MGRADE (Roche Diagnostics GmbH) is a multiplex PCR analysis able to detect DNA of the 25 most frequent pathogens in bloodstream infections. The time and labor saved while avoiding excessive laboratory manipulation is the rationale for selecting the automated MagNA Pure compact nucleic acid isolation kit-I (Roche Applied Science, GmbH) as an alternative to conventional SeptiFast extraction. For the purposes of this study, we evaluated the automated extraction in order to demonstrate the feasibility of automation. Finally, a prospective observational study was done using 106 clinical samples obtained from 76 patients in our ICU. Both extraction methods were used in parallel to test the samples. When molecular detection test results using both manual and automated extraction were compared with the data from blood cultures obtained at the same time, the results show that SeptiFast with the alternative MagNA Pure compact extraction not only shortens the complete workflow to 3.57 hrs., but also increases the sensitivity of the molecular assay for detecting infection as defined by positive blood culture confirmation.

  2. Automated extraction of lexical meanings from Polish corpora: potentialities and limitations

    Directory of Open Access Journals (Sweden)

    Maciej Piasecki

    2015-11-01

    Full Text Available Automated extraction of lexical meanings from Polish corpora: potentialities and limitations Large corpora are often consulted by linguists as a knowledge source with respect to lexicon, morphology or syntax. However, there are also several methods of automated extraction of semantic properties of language units from corpora. In the paper we focus on emerging potentialities of these methods, as well as on their identified limitations. Evidence that can be collected from corpora is confronted with the existing models of formalised description of lexical meanings. Two basic paradigms of lexical semantics extraction are briefly described. Their properties are analysed on the basis of several experiments performed on Polish corpora. Several potential applications of the methods, including a system supporting expansion of a Polish wordnet, are discussed. Finally, perspectives on the potential further development are discussed.

  3. Automated DNA extraction of single dog hairs without roots for mitochondrial DNA analysis.

    Science.gov (United States)

    Bekaert, Bram; Larmuseau, Maarten H D; Vanhove, Maarten P M; Opdekamp, Anouschka; Decorte, Ronny

    2012-03-01

    Dogs are intensely integrated in human social life and their shed hairs can play a major role in forensic investigations. The overall aim of this study was to validate a semi-automated extraction method for mitochondrial DNA analysis of telogenic dog hairs. Extracted DNA was amplified with a 95% success rate from 43 samples using two new experimental designs in which the mitochondrial control region was amplified as a single large (± 1260 bp) amplicon or as two individual amplicons (HV1 and HV2; ± 650 and 350 bp) with tailed-primers. The results prove that the extraction of dog hair mitochondrial DNA can easily be automated to provide sufficient DNA yield for the amplification of a forensically useful long mitochondrial DNA fragment or alternatively two short fragments with minimal loss of sequence in case of degraded samples.

  4. Multispectral Image Road Extraction Based Upon Automated Map Conflation

    Science.gov (United States)

    Chen, Bin

    Road network extraction from remotely sensed imagery enables many important and diverse applications such as vehicle tracking, drone navigation, and intelligent transportation studies. There are, however, a number of challenges to road detection from an image. Road pavement material, width, direction, and topology vary across a scene. Complete or partial occlusions caused by nearby buildings, trees, and the shadows cast by them, make maintaining road connectivity difficult. The problems posed by occlusions are exacerbated with the increasing use of oblique imagery from aerial and satellite platforms. Further, common objects such as rooftops and parking lots are made of materials similar or identical to road pavements. This problem of common materials is a classic case of a single land cover material existing for different land use scenarios. This work addresses these problems in road extraction from geo-referenced imagery by leveraging the OpenStreetMap digital road map to guide image-based road extraction. The crowd-sourced cartography has the advantages of worldwide coverage that is constantly updated. The derived road vectors follow only roads and so can serve to guide image-based road extraction with minimal confusion from occlusions and changes in road material. On the other hand, the vector road map has no information on road widths and misalignments between the vector map and the geo-referenced image are small but nonsystematic. Properly correcting misalignment between two geospatial datasets, also known as map conflation, is an essential step. A generic framework requiring minimal human intervention is described for multispectral image road extraction and automatic road map conflation. The approach relies on the road feature generation of a binary mask and a corresponding curvilinear image. A method for generating the binary road mask from the image by applying a spectral measure is presented. The spectral measure, called anisotropy-tunable distance (ATD

  5. The Hybrid KICA-GDA-LSSVM Method Research on Rolling Bearing Fault Feature Extraction and Classification

    Directory of Open Access Journals (Sweden)

    Jiyong Li

    2015-01-01

    Full Text Available Rolling element bearings are widely used in high-speed rotating machinery; thus a proper monitoring and fault diagnosis procedure to avoid major machine failures is necessary. As feature extraction and classification based on vibration signals are important in condition monitoring, and superfluous features may degrade classification performance, independent features need to be extracted; therefore an LSSVM (least squares support vector machine) based on hybrid KICA-GDA (kernel independent component analysis-generalized discriminant analysis) is presented in this study. A new method named sensitive subband feature set design (SSFD) based on the wavelet packet is also presented; using the proposed variance differential spectrum method, the sensitive subbands are selected. Firstly, independent features are obtained by KICA and the feature redundancy is reduced. Secondly, the feature dimension is reduced by GDA. Finally, the projected feature is classified by the LSSVM. The whole paper aims to classify the feature vectors extracted from the time series and the magnitude of spectral analysis and to discriminate the state of the rolling element bearings by virtue of the multiclass LSSVM. Experimental results from two different fault-seeded bearing tests show good performance of the proposed method.

  6. Highly efficient automated extraction of DNA from old and contemporary skeletal remains.

    Science.gov (United States)

    Zupanič Pajnič, Irena; Debska, Magdalena; Gornjak Pogorelc, Barbara; Vodopivec Mohorčič, Katja; Balažic, Jože; Zupanc, Tomaž; Štefanič, Borut; Geršak, Ksenija

    2016-01-01

    We optimised the automated extraction of DNA from old and contemporary skeletal remains using the AutoMate Express system and the PrepFiler BTA kit. 24 Contemporary and 25 old skeletal remains from WWII were analysed. For each skeleton, extraction using only 0.05 g of powder was performed according to the manufacturer's recommendations (no demineralisation - ND method). Since only 32% of full profiles were obtained from aged and 58% from contemporary casework skeletons, the extraction protocol was modified to acquire higher quality DNA and genomic DNA was obtained after full demineralisation (FD method). The nuclear DNA of the samples was quantified using the Investigator Quantiplex kit and STR typing was performed using the NGM kit to evaluate the performance of tested extraction methods. In the aged DNA samples, 64% of full profiles were obtained using the FD method. For the contemporary skeletal remains the performance of the ND method was closer to the FD method compared to the old skeletons, giving 58% of full profiles with the ND method and 71% of full profiles using the FD method. The extraction of DNA from only 0.05 g of bone or tooth powder using the AutoMate Express has proven highly successful in the recovery of DNA from old and contemporary skeletons, especially with the modified FD method. We believe that the results obtained will contribute to the possibilities of using automated devices for extracting DNA from skeletal remains, which would shorten the procedures for obtaining high-quality DNA from skeletons in forensic laboratories.

  7. An automated approach for extracting Barrier Island morphology from digital elevation models

    Science.gov (United States)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depends on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we can first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The automated approach introduced here extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach to extracting dune toe, dune crest, and dune heel based upon relative relief is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and the identification of features is based on the calculated RR values. The RR approach out-performed contemporary approaches and represents a fast objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel are spatially continuous, which is important because dune morphology is likely naturally variable alongshore.
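
    A compact sketch of the relative relief computation averaged over several window sizes; the window sizes are illustrative, and the rules that delineate the dune toe, crest and heel from the resulting RR surface are not reproduced here.

      import numpy as np
      from scipy.ndimage import maximum_filter, minimum_filter

      def mean_relative_relief(dem, window_sizes=(5, 11, 21)):
          # Relative relief of each DEM cell within a moving window:
          # RR = (z - local_min) / (local_max - local_min); averaging RR over several
          # window sizes gives the multi-scale surface used to pick morphological features.
          rr_stack = []
          for w in window_sizes:
              zmin = minimum_filter(dem, size=w)
              zmax = maximum_filter(dem, size=w)
              rr_stack.append((dem - zmin) / (zmax - zmin + 1e-12))
          return np.mean(rr_stack, axis=0)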

  8. An Analytical Model for Assessing Stability of Pre-Existing Faults in Caprock Caused by Fluid Injection and Extraction in a Reservoir

    Science.gov (United States)

    Wang, Lei; Bai, Bing; Li, Xiaochun; Liu, Mingze; Wu, Haiqing; Hu, Shaobin

    2016-07-01

    Induced seismicity and fault reactivation associated with fluid injection and depletion were reported in hydrocarbon, geothermal, and waste fluid injection fields worldwide. Here, we establish an analytical model to assess fault reactivation surrounding a reservoir during fluid injection and extraction that considers the stress concentrations at the fault tips and the effects of fault length. In this model, induced stress analysis in a full-space under the plane strain condition is implemented based on Eshelby's theory of inclusions in terms of a homogeneous, isotropic, and poroelastic medium. The stress intensity factor concept in linear elastic fracture mechanics is adopted as an instability criterion for pre-existing faults in surrounding rocks. To characterize the fault reactivation caused by fluid injection and extraction, we define a new index, the "fault reactivation factor" η, which can be interpreted as an index of fault stability in response to fluid pressure changes per unit within a reservoir resulting from injection or extraction. The critical fluid pressure change within a reservoir is also determined by the superposition principle using the in situ stress surrounding a fault. Our parameter sensitivity analyses show that the fault reactivation tendency is strongly sensitive to fault location, fault length, fault dip angle, and Poisson's ratio of the surrounding rock. Our case study demonstrates that the proposed model focuses on the mechanical behavior of the whole fault, unlike the conventional methodologies. The proposed method can be applied to engineering cases related to injection and depletion within a reservoir owing to its efficient computational codes implementation.

  9. Fault feature extraction of gearbox by using overcomplete rational dilation discrete wavelet transform on signals measured from vibration sensors

    Science.gov (United States)

    Chen, Binqiang; Zhang, Zhousuo; Sun, Chuang; Li, Bing; Zi, Yanyang; He, Zhengjia

    2012-11-01

    Gearbox fault diagnosis is very important for preventing catastrophic accidents. Vibration signals of gearboxes measured by sensors are useful and dependable, as they carry key information related to mechanical faults in gearboxes. Effective signal processing techniques are needed to extract the fault features contained in the collected gearbox vibration signals. The overcomplete rational dilation discrete wavelet transform (ORDWT) enjoys attractive properties such as better shift-invariance, adjustable time-frequency distributions and flexible wavelet atoms of tunable oscillation in comparison with the classical dyadic wavelet transform (DWT). Due to these advantages, ORDWT is presented as a versatile tool that can be adapted to the analysis of gearbox fault features of different types, especially the non-stationary and transient characteristics of the signals. Aiming to extract the various types of fault features encountered in gearbox fault diagnosis, a fault feature extraction technique based on ORDWT is proposed in this paper. In the proposed technique, ORDWT is used as the pre-processing decomposition tool, and a corresponding post-processing method is combined with it to extract the fault feature of a specific type. For extracting periodic impulses in the signal, an impulse matching algorithm is presented; ORDWT bases of varied time-frequency distributions and oscillatory natures are adopted, and an improved signal impulsiveness measure derived from kurtosis is developed for choosing optimal ORDWT bases that match the hidden periodic impulses. For demodulation purposes, an improved instantaneous time-frequency spectrum (ITFS), based on the combination of ORDWT and the Hilbert transform, is presented. For signal denoising applications, ORDWT is enhanced by a neighboring coefficient shrinkage strategy as well as a subband selection step to reveal the buried transient vibration contents. The
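
    The ORDWT itself is not available in common Python signal-processing packages, so the sketch below substitutes an ordinary dyadic DWT (PyWavelets) to illustrate only the kurtosis-driven subband selection idea mentioned above; the wavelet, decomposition level and synthetic impact signal are arbitrary choices.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis

def pick_impulsive_band(signal, wavelet='db8', level=5):
    """Rank DWT subbands by kurtosis; high kurtosis suggests periodic impacts.

    A dyadic DWT stands in for the ORDWT of the paper; only the band-selection
    idea (a kurtosis-like impulsiveness measure) is illustrated here.
    """
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    scores = [kurtosis(c, fisher=False) for c in coeffs[1:]]          # detail bands only
    best = int(np.argmax(scores)) + 1
    # keep only the most impulsive detail band and reconstruct
    kept = [c if i == best else np.zeros_like(c) for i, c in enumerate(coeffs)]
    return pywt.waverec(kept, wavelet), scores

fs = 12_000
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 30 * t) + 0.5 * np.random.randn(fs)
sig[::400] += 3.0                                                     # fake gear-mesh impacts
denoised, band_scores = pick_impulsive_band(sig)
```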

  10. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N

    2013-01-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit either with the sample pre-treatment recommended by Qiagen or an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex) with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480 samples were...

  11. On- and off-fault coseismic surface deformation associated with the September 2013 M7.7 Balochistan, Pakistan earthquake measured from mapping and automated pixel correlation

    Science.gov (United States)

    Gold, R. D.; Reitman, N. G.; Briggs, R. W.; Barnhart, W. D.; Hayes, G. P.

    2014-12-01

    The 24 September 2013 Mw7.7 Balochistan, Pakistan earthquake ruptured a ~200 km-long stretch of the Hoshab fault in southern Pakistan. We remotely measured the coseismic surface deformation field using high-resolution (0.5 m) pre- and post-event satellite imagery. We measured ~300 near-field (0-10 m from fault) laterally offset piercing points (streams, terrace risers, roads, etc.) and find peak left-lateral offsets of ~12-15 m. We characterized the far-field (0-10 km from fault) displacement field using manual (~250 measurements) and automated image cross-correlation methods (e.g., pixel tracking) and find peak displacement values of ~16 m, which commonly exceed the on-fault displacement magnitudes. Our preliminary observations suggest the following: (1) coseismic surface displacement typically increases with distance away from the surface trace of the fault (e.g., highest displacement values in the far field), (2) for certain locations along the fault rupture, as little as 50% of the coseismic displacement field occurred in the near-field; and (3) the magnitudes of individual displacements are inversely correlated to the width of the surface rupture zone (e.g., largest displacements where the fault zone is narrowest). This analysis highlights the importance of identifying field study sites spanning fault sections with narrow deformation zones in order to capture the entire deformation field. For regions of distributed deformation, these results would predict that geologic slip rate studies underestimate a fault's complete slip rate.
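
    The far-field pixel-tracking step can be illustrated with a simple patch-wise phase correlation, a stand-in for the image cross-correlation software actually used in the study; the patch size, step and upsampling factor below are arbitrary.

```python
import numpy as np
from skimage.registration import phase_cross_correlation

def patch_offsets(pre, post, patch=64, step=64, upsample=16):
    """Estimate per-patch displacements (in pixels) between co-registered pre- and
    post-event images by phase correlation."""
    rows = []
    for i in range(0, pre.shape[0] - patch, step):
        for j in range(0, pre.shape[1] - patch, step):
            shift, _, _ = phase_cross_correlation(
                pre[i:i + patch, j:j + patch],
                post[i:i + patch, j:j + patch],
                upsample_factor=upsample)
            rows.append((i, j, shift[0], shift[1]))   # (row, col, row_shift, col_shift)
    return np.array(rows)

# Multiply the pixel offsets by the ground sample distance (0.5 m imagery here)
# to convert the displacement field to metres.
```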

  12. Time-frequency manifold sparse reconstruction: A novel method for bearing fault feature extraction

    Science.gov (United States)

    Ding, Xiaoxi; He, Qingbo

    2016-12-01

    In this paper, a novel transient signal reconstruction method, called time-frequency manifold (TFM) sparse reconstruction, is proposed for bearing fault feature extraction. This method introduces image sparse reconstruction into the TFM analysis framework. According to the excellent denoising performance of TFM, a more effective time-frequency (TF) dictionary can be learned from the TFM signature by image sparse decomposition based on orthogonal matching pursuit (OMP). Then, the TF distribution (TFD) of the raw signal in a reconstructed phase space would be re-expressed with the sum of learned TF atoms multiplied by corresponding coefficients. Finally, one-dimensional signal can be achieved again by the inverse process of TF analysis (TFA). Meanwhile, the amplitude information of the raw signal would be well reconstructed. The proposed technique combines the merits of the TFM in denoising and the atomic decomposition in image sparse reconstruction. Moreover, the combination makes it possible to express the nonlinear signal processing results explicitly in theory. The effectiveness of the proposed TFM sparse reconstruction method is verified by experimental analysis for bearing fault feature extraction.
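
    A toy version of the sparse re-expression step, using scikit-learn's orthogonal matching pursuit with a hand-built dictionary of damped sinusoids instead of the TF atoms learned from the TFM signature in the paper; the atom grid and sparsity level are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

# Dictionary of damped-sinusoid atoms (a rough proxy for learned transient atoms).
n = 512
t = np.arange(n)
freqs = np.linspace(0.02, 0.4, 60)          # normalised frequencies (assumed grid)
decays = (0.005, 0.02, 0.05)
atoms = [np.exp(-d * t) * np.sin(2 * np.pi * f * t) for f in freqs for d in decays]
D = np.array(atoms).T
D /= np.linalg.norm(D, axis=0)

# Noisy transient to re-express sparsely.
clean = np.exp(-0.02 * t) * np.sin(2 * np.pi * 0.1 * t)
noisy = clean + 0.3 * np.random.randn(n)

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5, fit_intercept=False).fit(D, noisy)
reconstruction = D @ omp.coef_              # sparse estimate of the transient
```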

  13. Weak transient fault feature extraction based on an optimized Morlet wavelet and kurtosis

    Science.gov (United States)

    Qin, Yi; Xing, Jianfeng; Mao, Yongfang

    2016-08-01

    Aimed at solving the key problem in weak transient detection, the present study proposes a new transient feature extraction approach using the optimized Morlet wavelet transform, kurtosis index and soft-thresholding. Firstly, a fast optimization algorithm based on the Shannon entropy is developed to obtain the optimized Morlet wavelet parameter. Compared to the existing Morlet wavelet parameter optimization algorithm, this algorithm has lower computation complexity. After performing the optimized Morlet wavelet transform on the analyzed signal, the kurtosis index is used to select the characteristic scales and obtain the corresponding wavelet coefficients. From the time-frequency distribution of the periodic impulsive signal, it is found that the transient signal can be reconstructed by the wavelet coefficients at several characteristic scales, rather than the wavelet coefficients at just one characteristic scale, so as to improve the accuracy of transient detection. Due to the noise influence on the characteristic wavelet coefficients, the adaptive soft-thresholding method is applied to denoise these coefficients. With the denoised wavelet coefficients, the transient signal can be reconstructed. The proposed method was applied to the analysis of two simulated signals, and the diagnosis of a rolling bearing fault and a gearbox fault. The superiority of the method over the fast kurtogram method was verified by the results of simulation analysis and real experiments. It is concluded that the proposed method is extremely suitable for extracting the periodic impulsive feature from strong background noise.
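
    A rough sketch of the scale-selection and soft-thresholding ideas, using SciPy's Morlet CWT (note that scipy.signal.cwt is deprecated in recent SciPy releases); the fixed wavelet parameter, the universal-threshold rule and the crude row-sum "reconstruction" are simplifications of the paper's optimized procedure.

```python
import numpy as np
from scipy.signal import cwt, morlet2
from scipy.stats import kurtosis

def kurtosis_guided_transient(sig, widths=None, w=6.0, n_scales=3):
    """Morlet CWT + kurtosis-based scale selection + soft thresholding (rough sketch)."""
    if widths is None:
        widths = np.arange(1, 64)
    coeffs = cwt(sig, morlet2, widths, w=w)                 # complex, shape (n_widths, N)
    k = kurtosis(np.abs(coeffs), axis=1, fisher=False)      # impulsiveness of each scale
    keep = np.argsort(k)[-n_scales:]                        # characteristic scales
    rows = coeffs[keep].real
    # universal threshold with a robust (MAD-based) noise estimate
    thr = np.median(np.abs(rows)) / 0.6745 * np.sqrt(2.0 * np.log(rows.size))
    rows = np.sign(rows) * np.maximum(np.abs(rows) - thr, 0.0)   # soft thresholding
    return rows.sum(axis=0)
```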

  14. Extraction of prostatic lumina and automated recognition for prostatic calculus image using PCA-SVM.

    Science.gov (United States)

    Wang, Zhuocai; Xu, Xiangmin; Ding, Xiaojun; Xiao, Hui; Huang, Yusheng; Liu, Jian; Xing, Xiaofen; Wang, Hua; Liao, D Joshua

    2011-01-01

    Identification of prostatic calculi is an important basis for determining the tissue origin. Computer-assisted diagnosis of prostatic calculi may have promising potential but has so far received little study. We studied the extraction of prostatic lumina and the automated recognition of calculus images. Lumina were extracted from prostate histology images based on local entropy and Otsu thresholding; recognition was performed with PCA-SVM using the texture features of the prostatic calculus. The SVM classifier showed an average runtime of 0.1432 s, an average training accuracy of 100%, an average test accuracy of 93.12%, a sensitivity of 87.74%, and a specificity of 94.82%. We concluded that the algorithm, based on texture features and PCA-SVM, can easily recognize the concentric structure and visual features. Therefore, this method is effective for the automated recognition of prostatic calculi.
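
    The classifier side of such a pipeline reduces to a standard PCA-SVM combination; the sketch below uses scikit-learn with random stand-in feature vectors, since the texture features and image preprocessing of the paper are not reproduced here.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# X: one row of texture features per lumen image, y: calculus vs. non-calculus labels.
# Random data stands in for the real texture features described above.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))
y = rng.integers(0, 2, size=200)

clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel='rbf', C=1.0))
print(cross_val_score(clf, X, y, cv=5).mean())
```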

  15. Automated renal histopathology: digital extraction and quantification of renal pathology

    Science.gov (United States)

    Sarder, Pinaki; Ginley, Brandon; Tomaszewski, John E.

    2016-03-01

    The branch of pathology concerned with excess blood serum proteins being excreted in the urine pays particular attention to the glomerulus, a small intertwined bunch of capillaries located at the beginning of the nephron. Normal glomeruli allow a moderate amount of blood proteins to be filtered; proteinuric glomeruli allow a large amount of blood proteins to be filtered. Diagnosis of proteinuric diseases requires time-intensive manual examination of the structural compartments of the glomerulus from renal biopsies. Pathological examination includes cellularity of individual compartments, Bowman's and luminal space segmentation, cellular morphology, glomerular volume, capillary morphology, and more. Long examination times may lead to increased diagnosis time and/or lead to reduced precision of the diagnostic process. Automatic quantification holds strong potential to reduce renal diagnostic time. We have developed a computational pipeline capable of automatically segmenting relevant features from renal biopsies. Our method first segments glomerular compartments from renal biopsies by isolating regions with high nuclear density. Gabor texture segmentation is used to accurately define glomerular boundaries. Bowman's and luminal spaces are segmented using morphological operators. Nuclei structures are segmented using color deconvolution, morphological processing, and bottleneck detection. Average computation time of feature extraction for a typical biopsy, comprising ~12 glomeruli, is ~69 s using an Intel(R) Core(TM) i7-4790 CPU, and is ~65X faster than manual processing. Using images from rat renal tissue samples, automatic glomerular structural feature estimation was reproducibly demonstrated for 15 biopsy images, which contained 148 individual glomeruli images. The proposed method holds immense potential to enhance information available while making clinical diagnoses.
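
    The nuclei-segmentation step can be approximated with stain (colour) deconvolution plus simple morphology, as sketched below with scikit-image; the H&E stain assumption, threshold choice and object-size limits are illustrative, not the paper's parameters.

```python
from skimage import color, filters, morphology, measure

def nuclei_mask(rgb_image):
    """Rough nuclei segmentation by colour deconvolution, in the spirit of the
    pipeline above (assumes H&E-style staining; thresholds are illustrative)."""
    hed = color.rgb2hed(rgb_image)                 # separate haematoxylin/eosin/DAB
    h = hed[..., 0]                                # haematoxylin channel ~ nuclei
    mask = h > filters.threshold_otsu(h)
    mask = morphology.remove_small_objects(mask, min_size=30)
    mask = morphology.binary_closing(mask, morphology.disk(2))
    return mask, measure.label(mask)               # binary mask and labelled nuclei
```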

  16. Analysis of Automated Modern Web Crawling and Testing Tools and Their Possible Employment for Information Extraction

    Directory of Open Access Journals (Sweden)

    Tomas Grigalis

    2012-04-01

    Full Text Available The World Wide Web has become an enormous repository of data. Extracting, integrating and reusing this kind of data has a wide range of applications, including meta-searching, comparison shopping, business intelligence tools and security analysis of information in websites. However, reaching information in modern Web 2.0 pages is a difficult task: the HTML tree is often modified dynamically by various JavaScript codes, new data are added by asynchronous requests to the web server, and elements are positioned with the help of cascading style sheets. The article reviews automated web testing tools for information extraction tasks. Article in Lithuanian.

  17. An advanced distributed automated extraction of drainage network model on high-resolution DEM

    Science.gov (United States)

    Mao, Y.; Ye, A.; Xu, J.; Ma, F.; Deng, X.; Miao, C.; Gong, W.; Di, Z.

    2014-07-01

    A high-resolution, high-accuracy drainage network map is a prerequisite for simulating the water cycle in land surface hydrological models. The objective of this study was to develop a new automated drainage network extraction model that can produce a high-precision, continuous drainage network from a high-resolution DEM (Digital Elevation Model). Extracting a drainage network from a high-resolution DEM demands substantial computing resources, and the conventional GIS method often cannot complete the computation for large basins because the number of grid cells is too large. To reduce the computation time, an advanced distributed automated extraction of drainage network model (Adam) is proposed in this study. The Adam model has two features: (1) it searches upward from the basin outlet instead of performing sink filling, and (2) it divides sub-basins on a low-resolution DEM and then extracts the drainage network on the high-resolution DEM of each sub-basin. The case study used elevation data from the Shuttle Radar Topography Mission (SRTM) at 3 arc-second resolution in the Zhujiang River basin, China. The results show that the Adam model can dramatically reduce the computation time. The extracted drainage network was continuous and more accurate than HydroSHEDS (Hydrological data and maps based on Shuttle Elevation Derivatives at multiple Scales).
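
    A toy illustration of the outlet-upward idea (not the Adam implementation): compute D8 steepest-descent flow directions and then collect a basin by breadth-first search upstream from the outlet. The pure-Python loops and the absence of any sub-basin partitioning are deliberate simplifications.

```python
import numpy as np
from collections import deque

# 8-neighbour offsets for D8 flow routing.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_downstream(dem):
    """Steepest-descent (D8) downstream neighbour of every cell; (-1, -1) marks pits."""
    nr, nc = dem.shape
    down = np.full((nr, nc, 2), -1, dtype=int)
    for i in range(nr):
        for j in range(nc):
            drops = [(dem[i, j] - dem[i + di, j + dj], (i + di, j + dj))
                     for di, dj in OFFSETS
                     if 0 <= i + di < nr and 0 <= j + dj < nc]
            drop, cell = max(drops)
            if drop > 0:
                down[i, j] = cell
    return down

def upstream_basin(dem, outlet):
    """Breadth-first search upward from the outlet: a cell joins the basin if its
    D8 downstream neighbour is already in the basin."""
    down = d8_downstream(dem)
    nr, nc = dem.shape
    basin = np.zeros((nr, nc), dtype=bool)
    q = deque([outlet])
    basin[outlet] = True
    while q:
        cell = q.popleft()
        for di, dj in OFFSETS:
            i, j = cell[0] + di, cell[1] + dj
            if 0 <= i < nr and 0 <= j < nc and not basin[i, j] and tuple(down[i, j]) == cell:
                basin[i, j] = True
                q.append((i, j))
    return basin
```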

  18. Automated information extraction of key trial design elements from clinical trial publications.

    Science.gov (United States)

    de Bruijn, Berry; Carini, Simona; Kiritchenko, Svetlana; Martin, Joel; Sim, Ida

    2008-11-06

    Clinical trials are one of the most valuable sources of scientific evidence for improving the practice of medicine. The Trial Bank project aims to improve structured access to trial findings by including formalized trial information into a knowledge base. Manually extracting trial information from published articles is costly, but automated information extraction techniques can assist. The current study highlights a single architecture to extract a wide array of information elements from full-text publications of randomized clinical trials (RCTs). This architecture combines a text classifier with a weak regular expression matcher. We tested this two-stage architecture on 88 RCT reports from 5 leading medical journals, extracting 23 elements of key trial information such as eligibility rules, sample size, intervention, and outcome names. Results prove this to be a promising avenue to help critical appraisers, systematic reviewers, and curators quickly identify key information elements in published RCT articles.

  19. Early fault feature extraction of rolling bearing based on ICD and tunable Q-factor wavelet transform

    Science.gov (United States)

    Li, Yongbo; Liang, Xihui; Xu, Minqiang; Huang, Wenhu

    2017-03-01

    When a fault occurs in a bearing, the measured fault signals contain both a high-Q-factor oscillation component and a low-Q-factor periodic impact component. The tunable Q-factor wavelet transform (TQWT) is an improvement on the traditional single-Q-factor wavelet transform and is very suitable for separating the low-Q-factor component from the high-Q-factor component. However, the accuracy of its decomposition depends heavily on the selection of the Q-factors, and no simple yet effective method for selecting them with sufficient accuracy has been reported. This study aims to develop a strategy for diagnosing early faults of rolling bearings. In this paper, a characteristic frequency ratio (CFR) is used to optimize the Q-factors of the TQWT (OTQWT). However, direct application of OTQWT struggles to extract fault signatures at an early stage because of weak fault symptoms and strong noise. A strategy combining intrinsic characteristic-scale decomposition (ICD) and TQWT is therefore proposed; ICD offers significant advantages in computational efficiency and alleviation of mode mixing. The effectiveness of the proposed strategy is tested with both simulated and experimental vibration signals, and comparisons are conducted against other methods such as envelope demodulation and EEMD-TQWT. Results show that the proposed method has superior performance in extracting fault features of defective bearings at an early stage.

  20. Feature Extraction and Selection Scheme for Intelligent Engine Fault Diagnosis Based on 2DNMF, Mutual Information, and NSGA-II

    Directory of Open Access Journals (Sweden)

    Peng-yuan Liu

    2016-01-01

    Full Text Available A novel feature extraction and selection scheme is presented for intelligent engine fault diagnosis by utilizing two-dimensional nonnegative matrix factorization (2DNMF), mutual information, and the nondominated sorting genetic algorithm II (NSGA-II). Experiments are conducted on an engine test rig, in which eight different engine operating conditions including one normal condition and seven fault conditions are simulated, to evaluate the presented feature extraction and selection scheme. In the feature extraction phase, the S transform technique is first utilized to convert the engine vibration signals to the time-frequency domain, which can provide richer information on engine operating conditions. Then a novel feature extraction technique, named two-dimensional nonnegative matrix factorization, is employed for characterizing the time-frequency representations. In the feature selection phase, a hybrid filter and wrapper scheme based on mutual information and NSGA-II is utilized to acquire a compact feature subset for engine fault diagnosis. Experimental results obtained with three different classifiers demonstrate that the proposed feature extraction and selection scheme can achieve a very satisfying classification performance with fewer features for engine fault diagnosis.

  1. An advanced distributed automated extraction of drainage network model on high-resolution DEM

    Directory of Open Access Journals (Sweden)

    Y. Mao

    2014-07-01

    distributed automated extraction of drainage network model (Adam) was proposed in the study. The Adam model has two features: (1) searching upward from the outlet of the basin instead of sink filling, and (2) dividing sub-basins on a low-resolution DEM and then extracting the drainage network on the sub-basins of the high-resolution DEM. The case study used elevation data of the Shuttle Radar Topography Mission (SRTM) at 3 arc-second resolution in the Zhujiang River basin, China. The results show the Adam model can dramatically reduce the computation time. The extracted drainage network was continuous and more accurate than HydroSHEDS (Hydrological data and maps based on Shuttle Elevation Derivatives at multiple Scales).

  2. Extraction, identification, and functional characterization of a bioactive substance from automated compound-handling plastic tips.

    Science.gov (United States)

    Watson, John; Greenough, Emily B; Leet, John E; Ford, Michael J; Drexler, Dieter M; Belcastro, James V; Herbst, John J; Chatterjee, Moneesh; Banks, Martyn

    2009-06-01

    Disposable plastic labware is ubiquitous in contemporary pharmaceutical research laboratories. Plastic labware is routinely used for chemical compound storage and during automated liquid-handling processes that support assay development, high-throughput screening, structure-activity determinations, and liability profiling. However, there is little information available in the literature on the contaminants released from plastic labware upon DMSO exposure and their resultant effects on specific biological assays. The authors report here the extraction, by simple DMSO washing, of a biologically active substance from one particular size of disposable plastic tips used in automated compound handling. The active contaminant was identified as erucamide ((Z)-docos-13-enamide), a long-chain mono-unsaturated fatty acid amide commonly used in plastics manufacturing, by gas chromatography/mass spectrometry analysis of the DMSO-extracted material. Tip extracts prepared in DMSO, as well as a commercially obtained sample of erucamide, were active in a functional bioassay of a known G-protein-coupled fatty acid receptor. A sample of a different disposable tip product from the same vendor did not release detectable erucamide following solvent extraction, and DMSO extracts prepared from this product were inactive in the receptor functional assay. These results demonstrate that solvent-extractable contaminants from some plastic labware used in the contemporary pharmaceutical research and development (R&D) environment can be introduced into physical and biological assays during routine compound management liquid-handling processes. These contaminants may further possess biological activity and are therefore a potential source of assay-specific confounding artifacts.

  3. Automated Kinematic Extraction of Wing and Body Motions of Free Flying Diptera

    Science.gov (United States)

    Kostreski, Nicholas I.

    In the quest to understand the forces generated by micro aerial systems powered by oscillating appendages, it is necessary to study the kinematics that generate those forces. Automated and manual tracking techniques were developed to extract the complex wing and body motions of dipteran insects, ideal micro aerial systems, in free flight. Video sequences were captured by three high speed cameras (7500 fps) oriented orthogonally around a clear flight test chamber. Synchronization and image-based triggering were made possible by an automated triggering circuit. A multi-camera calibration was implemented using image-based tracking techniques. Three-dimensional reconstructions of the insect were generated from the 2-D images by shape from silhouette (SFS) methods. An intensity based segmentation of the wings and body was performed using a mixture of Gaussians. In addition to geometric and cost based filtering, spectral clustering was also used to refine the reconstruction and Principal Component Analysis (PCA) was performed to find the body roll axis and wing-span axes. The unobservable roll state of the cylindrically shaped body was successfully estimated by combining observations of the wing kinematics with a wing symmetry assumption. Wing pitch was determined by a ray tracing technique to compute and minimize a point-to-line cost function. Linear estimation with assumed motion models was accomplished by discrete Kalman filtering the measured body states. Generative models were developed for different species of diptera for model based tracking, simulation, and extraction of inertial properties. Manual and automated tracking results were analyzed and insect flight simulation videos were developed to quantify ground truth errors for an assumed model. The results demonstrated the automated tracker to have comparable performance to a human digitizer, though manual techniques displayed superiority during aggressive maneuvers and image blur. Both techniques demonstrated
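
    The body-state smoothing step mentioned above is a textbook discrete Kalman filter; the sketch below assumes a constant-velocity motion model with guessed noise levels and tracks a single measured coordinate, which is a simplification of the full state estimation described in the record.

```python
import numpy as np

def kalman_constant_velocity(measurements, dt=1 / 7500, q=1e-3, r=1e-2):
    """Minimal discrete Kalman filter with a constant-velocity motion model.
    Process noise q, measurement noise r and the 7500 fps frame interval are assumptions."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition: [position, velocity]
    H = np.array([[1.0, 0.0]])                   # only position is measured
    Q = q * np.eye(2)
    R = np.array([[r]])
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    filtered = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        filtered.append(x[0, 0])
    return np.array(filtered)
```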

  4. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods.

    Science.gov (United States)

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N; Hoflund, Anders; Mogensen, Helle S; Hansen, Anders J; Morling, Niels

    2013-05-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit either with the sample pre-treatment recommended by Qiagen or an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex) with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480 samples were processed. The highest DNA recovery was obtained with the PrepFiler Express kit on an AutoMate Express while the lowest DNA recovery was obtained using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen. Extraction using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen resulted in the lowest percentage of PCR inhibition (0%) while extraction using manual Chelex resulted in the highest percentage of PCR inhibition (51%). The largest number of reportable STR-profiles was obtained with DNA from samples extracted with the PrepFiler Express kit (75%) while the lowest number was obtained with DNA from samples extracted using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen (41%).

  5. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation however is not easy: The automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®)150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®)Forensic DNA Purification Kit (Invitrogen), the PrepFiler™Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination.

  6. Dynamic electromembrane extraction: Automated movement of donor and acceptor phases to improve extraction efficiency.

    Science.gov (United States)

    Asl, Yousef Abdossalami; Yamini, Yadollah; Seidi, Shahram; Amanzadeh, Hatam

    2015-11-06

    In the present research, dynamic electromembrane extraction (DEME) was introduced for the first time for extraction and determination of ionizable species from different biological matrices. The setup proposed for DEME provides an efficient, stable, and reproducible method to increase extraction efficiency. This setup consists of a piece of hollow fiber mounted inside a glass flow cell by means of two plastics connector tubes. In this dynamic system, an organic solvent is impregnated into the pores of hollow fiber as supported liquid membrane (SLM); an aqueous acceptor solution is repeatedly pumped into the lumen of hollow fiber by a syringe pump whereas a peristaltic pump is used to move sample solution around the mounted hollow fiber into the flow cell. Two platinum electrodes connected to a power supply are used during extractions which are located into the lumen of the hollow fiber and glass flow cell, respectively. The method was applied for extraction of amitriptyline (AMI) and nortriptyline (NOR) as model analytes from biological fluids. Effective parameters on DEME of the model analytes were investigated and optimized. Under optimized conditions, the calibration curves were linear in the range of 2.0-100μgL(-1) with coefficient of determination (r(2)) more than 0.9902 for both of the analytes. The relative standard deviations (RSD %) were less than 8.4% based on four replicate measurements. LODs less than 1.0μgL(-1) were obtained for both AMI and NOR. The preconcentration factors higher than 83-fold were obtained for the extraction of AMI and NOR in various biological samples.

  7. Automation of lidar-based hydrologic feature extraction workflows using GIS

    Science.gov (United States)

    Borlongan, Noel Jerome B.; de la Cruz, Roel M.; Olfindo, Nestor T.; Perez, Anjillyn Mae C.

    2016-10-01

    With the advent of LiDAR technology, higher-resolution datasets have become available for use in different remote sensing and GIS applications. One significant application of LiDAR datasets in the Philippines is resource feature extraction. Feature extraction using LiDAR datasets requires complex and repetitive workflows that can take researchers a long time to execute and supervise manually. The Development of the Philippine Hydrologic Dataset for Watersheds from LiDAR Surveys (PHD), a project under the Nationwide Detailed Resources Assessment Using LiDAR (Phil-LiDAR 2) program, created a set of scripts, the PHD Toolkit, to automate the processes and workflows necessary for hydrologic feature extraction, specifically Streams and Drainages, Irrigation Network, and Inland Wetlands, using LiDAR datasets. These scripts are written in Python and can be added to the ArcGIS® environment as a toolbox. The toolkit is currently being used as an aid for researchers in hydrologic feature extraction by simplifying the workflows, eliminating human errors when providing the inputs, and providing quick and easy-to-use tools for repetitive tasks. This paper discusses the actual implementation of different workflows developed by Phil-LiDAR 2 Project 4 in Streams, Irrigation Network and Inland Wetlands extraction.

  8. Study on Fault Location in Distribution Network Automation

    Institute of Scientific and Technical Information of China (English)

    杨刘一; 苏海滨; 张庆辉; 杨振赢

    2014-01-01

    In response to the problems of existing distribution automation (DA) fault location algorithms, namely inaccurate positioning and poor fault tolerance, this paper offers a method for FTU-based fault judgment using a genetic algorithm (GA), and analyzes the encoding scheme and the fitness function. The method is verified by calculation and simulation on an example double-source feeder line, and the results indicate that the proposed algorithm improves both accuracy and fault tolerance.
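
    As a hedged illustration of a GA-based FTU fault-location scheme (not the paper's exact encoding or fitness), the sketch below assumes a single radial feeder on which an FTU reports overcurrent only if a fault lies downstream of it, and searches for the fault-section vector that best explains the reported flags.

```python
import numpy as np

rng = np.random.default_rng(1)
N_SECTIONS = 8
REPORTED = np.array([1, 1, 1, 0, 0, 0, 0, 0])   # FTU overcurrent flags (toy single feeder)

def expected_flags(fault_vec):
    """On a single radial feeder, FTU i sees overcurrent iff some section j >= i is faulted."""
    return (np.cumsum(fault_vec[::-1])[::-1] > 0).astype(int)

def fitness(fault_vec, w=0.5):
    mismatch = np.abs(REPORTED - expected_flags(fault_vec)).sum()
    return -(mismatch + w * fault_vec.sum())    # reward few mismatches and few faulted sections

def genetic_search(pop_size=40, generations=60, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, N_SECTIONS))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        # tournament selection
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        parents = pop[np.where(scores[idx[:, 0]] >= scores[idx[:, 1]], idx[:, 0], idx[:, 1])]
        # single-point crossover on consecutive parent pairs
        cuts = rng.integers(1, N_SECTIONS, size=pop_size // 2)
        children = parents.copy()
        for k, c in enumerate(cuts):
            children[2 * k, c:] = parents[2 * k + 1, c:]
            children[2 * k + 1, c:] = parents[2 * k, c:]
        # bit-flip mutation
        flips = rng.random(children.shape) < p_mut
        pop = np.where(flips, 1 - children, children)
    return pop[np.argmax([fitness(ind) for ind in pop])]

print(genetic_search())   # should flag a fault in the third section (index 2)
```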

  9. CHANNEL MORPHOLOGY TOOL (CMT): A GIS-BASED AUTOMATED EXTRACTION MODEL FOR CHANNEL GEOMETRY

    Energy Technology Data Exchange (ETDEWEB)

    JUDI, DAVID [Los Alamos National Laboratory; KALYANAPU, ALFRED [Los Alamos National Laboratory; MCPHERSON, TIMOTHY [Los Alamos National Laboratory; BERSCHEID, ALAN [Los Alamos National Laboratory

    2007-01-17

    This paper describes an automated Channel Morphology Tool (CMT) developed in ArcGIS 9.1 environment. The CMT creates cross-sections along a stream centerline and uses a digital elevation model (DEM) to create station points with elevations along each of the cross-sections. The generated cross-sections may then be exported into a hydraulic model. Along with the rapid cross-section generation the CMT also eliminates any cross-section overlaps that might occur due to the sinuosity of the channels using the Cross-section Overlap Correction Algorithm (COCoA). The CMT was tested by extracting cross-sections from a 5-m DEM for a 50-km channel length in Houston, Texas. The extracted cross-sections were compared directly with surveyed cross-sections in terms of the cross-section area. Results indicated that the CMT-generated cross-sections satisfactorily matched the surveyed data.
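
    Sampling station-point elevations along a cross-section is the core DEM operation of such a tool; a bare-bones analogue (with no overlap correction) is sketched below using bilinear interpolation, with the station spacing and half-width as arbitrary inputs.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def cross_section(dem, center_rc, direction, half_width=50, spacing=1.0):
    """Sample DEM elevations along a cross-section perpendicular to a channel centreline.

    center_rc: (row, col) of the centreline station; direction: unit vector along the
    centreline (in row/col units). A loose analogue of CMT station-point generation."""
    normal = np.array([-direction[1], direction[0]])                 # perpendicular to flow
    offsets = np.arange(-half_width, half_width + spacing, spacing)
    rows = center_rc[0] + offsets * normal[0]
    cols = center_rc[1] + offsets * normal[1]
    z = map_coordinates(dem, [rows, cols], order=1, mode='nearest')  # bilinear sampling
    return offsets, z                                                # station offsets, elevations
```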

  10. High dimension feature extraction based visualized SOM fault diagnosis method and its application in p-xylene oxidation process

    Institute of Scientific and Technical Information of China (English)

    Ying Tian; Wenli Du; Feng Qian

    2015-01-01

    Purified terephthalic acid (PTA) is an important chemical raw material. P-xylene (PX) is transformed to terephthalic acid (TA) through an oxidation process, and TA is refined to produce PTA. The PX oxidation reaction is a complex process involving a three-phase reaction of gas, liquid and solid. To monitor the process and improve the product quality, as well as to visualize the fault type clearly, a fault diagnosis method based on the self-organizing map (SOM) and a high-dimensional feature extraction method, local tangent space alignment (LTSA), is proposed. In this method, LTSA reduces the dimension while keeping the topology information, and the SOM distinguishes the various states on the output map. Monitoring results of the PX oxidation reaction process indicate that LTSA–SOM can effectively detect and visualize the fault type.
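
    A rough analogue of the LTSA-SOM combination, using scikit-learn's LTSA implementation and the third-party minisom package (assumed installed via pip); the toy Gaussian clusters stand in for the process measurements of the PX oxidation data.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from minisom import MiniSom   # third-party package, assumed available (pip install minisom)

# Toy process data: three operating states, 20 measured variables each.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=m, size=(100, 20)) for m in (0.0, 1.5, 3.0)])

# LTSA keeps local geometry while reducing dimension.
ltsa = LocallyLinearEmbedding(method='ltsa', n_neighbors=12, n_components=3)
Z = ltsa.fit_transform(X)

# A small SOM maps each reduced sample to a grid cell; distinct operating states
# (normal vs. fault types) should occupy distinct regions of the output map.
som = MiniSom(8, 8, Z.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(Z, 2000)
cells = np.array([som.winner(z) for z in Z])
```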

  11. A Novel Characteristic Frequency Bands Extraction Method for Automatic Bearing Fault Diagnosis Based on Hilbert Huang Transform

    Directory of Open Access Journals (Sweden)

    Xiao Yu

    2015-11-01

    Full Text Available Because failures of roller element bearings (REBs) cause unexpected machinery breakdowns, their fault diagnosis has attracted considerable research attention. Established fault feature extraction methods focus on statistical characteristics of the vibration signal, an approach that loses sight of the continuous waveform features. Considering this weakness, this article proposes a novel feature extraction method for frequency bands, named Window Marginal Spectrum Clustering (WMSC), to select salient features from the marginal spectrum of vibration signals obtained by the Hilbert–Huang Transform (HHT). In WMSC, a sliding window is used to divide an entire HHT marginal spectrum (HMS) into window spectra, following which the Rand Index (RI) criterion of the clustering method is used to evaluate each window. The windows returning higher RI values are selected to construct characteristic frequency bands (CFBs). Next, a hybrid REBs fault diagnosis is constructed, termed by its elements HHT-WMSC-SVM (support vector machines). The effectiveness of HHT-WMSC-SVM is validated by running a series of experiments on REBs defect datasets from the Bearing Data Center of Case Western Reserve University (CWRU). The test results evidence three major advantages of the novel method. First, the fault classification accuracy of the HHT-WMSC-SVM model is higher than that of HHT-SVM and ST-SVM, a method that combines statistical characteristics with SVM. Second, with Gaussian white noise added to the original REBs defect dataset, the HHT-WMSC-SVM model maintains high classification accuracy, while the classification accuracy of the ST-SVM and HHT-SVM models is significantly reduced. Third, fault classification accuracy by HHT-WMSC-SVM can exceed 95% under a Pmin range of 500–800 and an m range of 50–300 for the REBs defect dataset with Gaussian white noise added at a Signal Noise Ratio (SNR) of 5. Experimental results indicate that the proposed WMSC method yields a high REBs fault

  12. Automated Device for Asynchronous Extraction of RNA, DNA, or Protein Biomarkers from Surrogate Patient Samples.

    Science.gov (United States)

    Bitting, Anna L; Bordelon, Hali; Baglia, Mark L; Davis, Keersten M; Creecy, Amy E; Short, Philip A; Albert, Laura E; Karhade, Aditya V; Wright, David W; Haselton, Frederick R; Adams, Nicholas M

    2016-12-01

    Many biomarker-based diagnostic methods are inhibited by nontarget molecules in patient samples, necessitating biomarker extraction before detection. We have developed a simple device that purifies RNA, DNA, or protein biomarkers from complex biological samples without robotics or fluid pumping. The device design is based on functionalized magnetic beads, which capture biomarkers and remove background biomolecules by magnetically transferring the beads through processing solutions arrayed within small-diameter tubing. The process was automated by wrapping the tubing around a disc-like cassette and rotating it past a magnet using a programmable motor. This device recovered biomarkers at ~80% of the operator-dependent extraction method published previously. The device was validated by extracting biomarkers from a panel of surrogate patient samples containing clinically relevant concentrations of (1) influenza A RNA in nasal swabs, (2) Escherichia coli DNA in urine, (3) Mycobacterium tuberculosis DNA in sputum, and (4) Plasmodium falciparum protein and DNA in blood. The device successfully extracted each biomarker type from samples representing low levels of clinically relevant infectivity (i.e., 7.3 copies/µL of influenza A RNA, 405 copies/µL of E. coli DNA, 0.22 copies/µL of TB DNA, 167 copies/µL of malaria parasite DNA, and 2.7 pM of malaria parasite protein).

  13. Automated extraction of DNA and PCR setup using a Tecan Freedom EVO® liquid handler

    DEFF Research Database (Denmark)

    Frøslev, Tobias Guldberg; Hansen, Anders Johannes; Stangegaard, Michael

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO® liquid handler mounted with the TeMagS magnetic separation device. The methods were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract® DNA Mini M48 kit from fresh, whole blood and blood from deceased. The methods were simplified by returning the DNA extracts to the original tubes, reducing the risk of misplacing samples. The original tubes that had contained the samples were washed with 700 µl Milli-Q water prior to the return of the DNA extracts. The PCR setup protocols were designed for 96-well microtiter plates. The methods were validated for the kits: AmpFlSTR® Identifiler® and Y-filer® (Applied Biosystems), GenePrint® FFFL and PowerPlex® Y (Promega). Within 3.5 hours, 96 samples were extracted and PCR master mix was added...

  14. Automating the Extraction of Metadata from Archaeological Data Using iRods Rules

    Directory of Open Access Journals (Sweden)

    David Walling

    2011-10-01

    Full Text Available The Texas Advanced Computing Center and the Institute for Classical Archaeology at the University of Texas at Austin developed a method that uses iRods rules and a Jython script to automate the extraction of metadata from digital archaeological data. The first step was to create a record-keeping system to classify the data. The record-keeping system employs file and directory hierarchy naming conventions designed specifically to maintain the relationship between the data objects and map the archaeological documentation process. The metadata implicit in the record-keeping system is automatically extracted upon ingest, combined with additional sources of metadata, and stored alongside the data in the iRods preservation environment. This method enables a more organized workflow for the researchers, helps them archive their data close to the moment of data creation, and avoids error prone manual metadata input. We describe the types of metadata extracted and provide technical details of the extraction process and storage of the data and metadata.

  15. Automated Extraction Of Associations Between Methylated Genes and Diseases From Biomedical Literature

    KAUST Repository

    Bin Res, Arwa A.

    2012-12-01

    Associations between methylated genes and diseases have been investigated in several studies, and it is critical to have such information available for better understanding of diseases and clinical decisions. However, such information is scattered in a large number of electronic publications and it is difficult to search for it manually. Therefore, the goal of the project is to develop a machine learning model that can efficiently extract such information. Twelve machine learning algorithms were applied and compared on this problem using three approaches: document-term frequency matrices, position weight matrices, and a hybrid approach that combines the previous two. The best results were obtained by the hybrid approach with a random forest model that, in a 10-fold cross-validation, achieved F-score and accuracy of nearly 85% and 84%, respectively. On a completely separate testing set, F-score and accuracy of 89% and 88%, respectively, were obtained. Based on this model, we developed a tool that automates extraction of associations between methylated genes and diseases from electronic text. Our study contributed an efficient method for extracting specific types of associations from free text, and the methodology developed here can be extended to other similar association extraction problems.
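
    The document-term approach mentioned above can be approximated with a TF-IDF representation feeding a random forest; the toy sentences, labels and hyperparameters below are illustrative only and are not taken from the thesis.

```python
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Toy sentences; labels mark whether a methylated-gene/disease association is asserted.
sentences = [
    "Hypermethylation of MLH1 was associated with sporadic colorectal cancer.",
    "MGMT promoter methylation predicted response in glioblastoma patients.",
    "The weather was unusually warm during the sampling campaign.",
    "No methylation changes were detected in the control cohort.",
] * 25
labels = [1, 1, 0, 0] * 25

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=2),
                    RandomForestClassifier(n_estimators=200, random_state=0))
print(cross_val_score(clf, sentences, labels, cv=5).mean())
```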

  16. Automated Extraction of the Archaeological Tops of Qanat Shafts from VHR Imagery in Google Earth

    Directory of Open Access Journals (Sweden)

    Lei Luo

    2014-12-01

    Full Text Available Qanats in northern Xinjiang of China provide valuable information for agriculturists and anthropologists who seek fundamental understanding of the distribution of qanat water supply systems with regard to water resource utilization, the development of oasis agriculture, and eventually climate change. Only the tops of qanat shafts (TQSs), indicating the course of the qanats, can be observed from space, and their circular archaeological traces can also be seen in very high resolution imagery in Google Earth. The small size of the TQSs, vast search regions, and degraded features make manually extracting them from remote sensing images difficult and costly. This paper proposes an automated TQS extraction method that adopts mathematical morphological processing methods before an edge detecting module is used in the circular Hough transform approach. The accuracy assessment criteria for the proposed method include: (i) extraction percentage (E = 95.9%), branch factor (B = 0) and quality percentage (Q = 95.9%) in Site 1; and (ii) extraction percentage (E = 83.4%), branch factor (B = 0.058) and quality percentage (Q = 79.5%) in Site 2. Compared with the standard circular Hough transform, the quality percentages (Q) of our proposed method were improved to 95.9% and 79.5% from 86.3% and 65.8% in test sites 1 and 2, respectively. The results demonstrate that wide-area discovery and mapping can be performed much more effectively based on our proposed method.
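
    The circular Hough transform stage can be sketched with scikit-image as below; the synthetic rings, the radius range, and the omission of the paper's morphological preprocessing are simplifications.

```python
import numpy as np
from skimage import draw, feature, transform

# Synthetic scene with two bright rings standing in for the tops of qanat shafts.
img = np.zeros((200, 200))
for (r, c, rad) in [(60, 60, 12), (140, 120, 9)]:
    rr, cc = draw.circle_perimeter(r, c, rad)
    img[rr, cc] = 1.0

edges = feature.canny(img, sigma=1.0)                   # edge-detecting module
radii = np.arange(6, 16)
hspaces = transform.hough_circle(edges, radii)          # circular Hough transform
_, cx, cy, found_radii = transform.hough_circle_peaks(hspaces, radii, total_num_peaks=2)
print(list(zip(cy, cx, found_radii)))                   # detected (row, col, radius) triples
```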

  17. Hybrid curation of gene–mutation relations combining automated extraction and crowdsourcing

    Science.gov (United States)

    Burger, John D.; Doughty, Emily; Khare, Ritu; Wei, Chih-Hsuan; Mishra, Rajashree; Aberdeen, John; Tresner-Kirsch, David; Wellner, Ben; Kann, Maricel G.; Lu, Zhiyong; Hirschman, Lynette

    2014-01-01

    Background: This article describes capture of biological information using a hybrid approach that combines natural language processing to extract biological entities and crowdsourcing with annotators recruited via Amazon Mechanical Turk to judge correctness of candidate biological relations. These techniques were applied to extract gene–mutation relations from biomedical abstracts with the goal of supporting production scale capture of gene–mutation–disease findings as an open source resource for personalized medicine. Results: The hybrid system could be configured to provide good performance for gene–mutation extraction (precision ∼82%; recall ∼70% against an expert-generated gold standard) at a cost of $0.76 per abstract. This demonstrates that crowd labor platforms such as Amazon Mechanical Turk can be used to recruit quality annotators, even in an application requiring subject matter expertise; aggregated Turker judgments for gene–mutation relations exceeded 90% accuracy. Over half of the precision errors were due to mismatches against the gold standard hidden from annotator view (e.g. incorrect EntrezGene identifier or incorrect mutation position extracted), or incomplete task instructions (e.g. the need to exclude nonhuman mutations). Conclusions: The hybrid curation model provides a readily scalable, cost-effective approach to curation, particularly if coupled with expert human review to filter precision errors. We plan to generalize the framework and make it available as open source software. Database URL: http://www.mitre.org/publications/technical-papers/hybrid-curation-of-gene-mutation-relations-combining-automated PMID:25246425

  18. Bearing Fault Diagnosis Based on Multiscale Permutation Entropy and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Jian-Jiun Ding

    2012-07-01

    Full Text Available Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification for the extracted features. In this paper, multiscale permutation entropy (MPE) was introduced for feature extraction from faulty bearing vibration signals. After extracting feature vectors by MPE, the support vector machine (SVM) was applied to automate the fault diagnosis procedure. Simulation results demonstrated that the proposed method is a very powerful algorithm for bearing fault diagnosis and has much better performance than the methods based on single-scale permutation entropy (PE) and multiscale entropy (MSE).
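
    Permutation entropy at multiple coarse-graining scales is straightforward to compute directly; a minimal implementation is sketched below (the embedding dimension, delay and scale range are conventional choices, not necessarily those of the paper).

```python
import numpy as np
from math import factorial

def permutation_entropy(x, m=3, delay=1):
    """Normalised permutation entropy of a 1-D signal with embedding dimension m."""
    n = len(x) - (m - 1) * delay
    patterns = np.array([np.argsort(x[i:i + m * delay:delay]) for i in range(n)])
    codes = patterns @ (m ** np.arange(m))          # encode each ordinal pattern as an integer
    _, counts = np.unique(codes, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum() / np.log(factorial(m))

def multiscale_permutation_entropy(x, m=3, max_scale=10):
    """Coarse-grain the signal at scales 1..max_scale and compute PE at each scale."""
    mpe = []
    for s in range(1, max_scale + 1):
        n = (len(x) // s) * s
        coarse = x[:n].reshape(-1, s).mean(axis=1)  # non-overlapping averaging
        mpe.append(permutation_entropy(coarse, m=m))
    return np.array(mpe)

# The resulting MPE vector is the feature that would be fed to the SVM classifier.
```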

  19. Automated multisyringe stir bar sorptive extraction using robust montmorillonite/epoxy-coated stir bars.

    Science.gov (United States)

    Ghani, Milad; Saraji, Mohammad; Maya, Fernando; Cerdà, Víctor

    2016-05-01

    Herein we present a simple, rapid and low cost strategy for the preparation of robust stir bar coatings based on the combination of montmorillonite with epoxy resin. The composite stir bar was implemented in a novel automated multisyringe stir bar sorptive extraction system (MS-SBSE), and applied to the extraction of four chlorophenols (4-chlorophenol, 2,4-dichlorophenol, 2,4,6-trichlorophenol and pentachlorophenol) as model compounds, followed by high performance liquid chromatography-diode array detection. The different experimental parameters of the MS-SBSE, such as sample volume, selection of the desorption solvent, desorption volume, desorption time, sample solution pH, salt effect and extraction time were studied. Under the optimum conditions, the detection limits were between 0.02 and 0.34μgL(-1). Relative standard deviations (RSD) of the method for the analytes at 10μgL(-1) concentration level ranged from 3.5% to 4.1% (as intra-day RSD) and from 3.9% to 4.3% (as inter-day RSD at 50μgL(-1) concentration level). Batch-to-batch reproducibility for three different stir bars was 4.6-5.1%. The enrichment factors were between 30 and 49. In order to investigate the capability of the developed technique for real sample analysis, well water, wastewater and leachates from a solid waste treatment plant were satisfactorily analyzed.

  20. Sequential Chromospheric Brightening: An Automated Approach to Extracting Physics from Ephemeral Brightening

    CERN Document Server

    Kirk, Michael S; Jackiewicz, Jason; McAteer, R T James; McNamara, Bernie J

    2012-01-01

    We make a comparison between small scale chromospheric brightenings and energy release processes through examining the temporal evolution of sequential chromospheric brightenings (SCBs), derive propagation velocities, and propose a connection of the small-scale features to solar flares. Our automated routine detects and distinguishes three separate types of brightening regularly observed in the chromosphere: plage, flare ribbon, and point brightenings. By studying their distinct dynamics, we separate out the flare-associated bright points commonly known as SCBs and identify a propagating Moreton wave. Superimposing our detections on complementary off-band images, we extract a Doppler velocity measurement beneath the point brightening locations. Using these dynamic measurements, we put forward a connection between point brightenings, the erupting flare, and overarching magnetic loops. A destabilization of the pre-flare loop topology by the erupting flare directly leads to the SCBs observed.

  1. Feature Extraction Method for High Impedance Ground Fault Localization in Radial Power Distribution Networks

    DEFF Research Database (Denmark)

    Jensen, Kåre Jean; Munk, Steen M.; Sørensen, John Aasted

    1998-01-01

    processes and communication systems lead to demands for improved monitoring of power distribution networks so that the quality of power delivery can be kept at a controlled level. The ground fault localization method for each feeder in a network is based on the centralized frequency broadband measurement of three-phase voltages and currents. The method consists of a feature extractor, based on a grid description of the feeder by impulse responses, and a neural network for ground fault localization. The emphasis of this paper is the feature extractor, and the detection of the time instant of a ground fault...

  2. Automated extraction of precise protein expression patterns in lymphoma by text mining abstracts of immunohistochemical studies

    Directory of Open Access Journals (Sweden)

    Jia-Fu Chang

    2013-01-01

    Full Text Available Background: In general, surgical pathology reviews report protein expression by tumors in a semi-quantitative manner, that is, -, -/+, +/-, +. At the same time, the experimental pathology literature provides multiple examples of precise expression levels determined by immunohistochemical (IHC) tissue examination of populations of tumors. Natural language processing (NLP) techniques enable the automated extraction of such information through text mining. We propose establishing a database linking quantitative protein expression levels with specific tumor classifications through NLP. Materials and Methods: Our method takes advantage of typical forms of representing experimental findings in terms of percentages of protein expression manifest by the tumor population under study. Characteristically, percentages are represented straightforwardly with the % symbol or as the number of positive findings of the total population. Such text is readily recognized using regular expressions and templates permitting extraction of sentences containing these forms for further analysis using grammatical structures and rule-based algorithms. Results: Our pilot study is limited to the extraction of such information related to lymphomas. We achieved a satisfactory level of retrieval as reflected in scores of 69.91% precision and 57.25% recall with an F-score of 62.95%. In addition, we demonstrate the utility of a web-based curation tool for confirming and correcting our findings. Conclusions: The experimental pathology literature represents a rich source of pathobiological information, which has been relatively underutilized. There has been a combinatorial explosion of knowledge within the pathology domain as represented by increasing numbers of immunophenotypes and disease subclassifications. NLP techniques support practical text mining techniques for extracting this knowledge and organizing it in forms appropriate for pathology decision support systems.
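
    The template/regular-expression stage can be illustrated with two toy patterns for the surface forms described above ("85%" and "17 of 20 cases"); the patterns and the marker heuristic are illustrative only and are not the rules used in the study.

```python
import re

# Two common surface forms for expression rates in IHC abstracts:
#   "CD30 was positive in 85% of cases"   and   "expressed in 17 of 20 cases"
PERCENT = re.compile(r'(?P<marker>[A-Z][A-Za-z0-9-]+)[^.]*?\b(?P<pct>\d{1,3})\s?%')
FRACTION = re.compile(r'(?P<marker>[A-Z][A-Za-z0-9-]+)[^.]*?\b(?P<num>\d+)\s+of\s+(?P<den>\d+)\s+cases')

def expression_mentions(sentence):
    """Return (marker, percentage) pairs found in one sentence."""
    hits = [(m['marker'], float(m['pct'])) for m in PERCENT.finditer(sentence)]
    hits += [(m['marker'], 100.0 * int(m['num']) / int(m['den']))
             for m in FRACTION.finditer(sentence)]
    return hits

print(expression_mentions("CD30 was positive in 85% of the examined lymphomas."))
print(expression_mentions("BCL2 expression was seen in 17 of 20 cases."))
```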

  3. BLINKER: Automated Extraction of Ocular Indices from EEG Enabling Large-Scale Analysis.

    Science.gov (United States)

    Kleifges, Kelly; Bigdely-Shamlo, Nima; Kerick, Scott E; Robbins, Kay A

    2017-01-01

    Electroencephalography (EEG) offers a platform for studying the relationships between behavioral measures, such as blink rate and duration, with neural correlates of fatigue and attention, such as theta and alpha band power. Further, the existence of EEG studies covering a variety of subjects and tasks provides opportunities for the community to better characterize variability of these measures across tasks and subjects. We have implemented an automated pipeline (BLINKER) for extracting ocular indices such as blink rate, blink duration, and blink velocity-amplitude ratios from EEG channels, EOG channels, and/or independent components (ICs). To illustrate the use of our approach, we have applied the pipeline to a large corpus of EEG data (comprising more than 2000 datasets acquired at eight different laboratories) in order to characterize variability of certain ocular indicators across subjects. We also investigate dependence of ocular indices on task in a shooter study. We have implemented our algorithms in a freely available MATLAB toolbox called BLINKER. The toolbox, which is easy to use and can be applied to collections of data without user intervention, can automatically discover which channels or ICs capture blinks. The tools extract blinks, calculate common ocular indices, generate a report for each dataset, dump labeled images of the individual blinks, and provide summary statistics across collections. Users can run BLINKER as a script or as a plugin for EEGLAB. The toolbox is available at https://github.com/VisLab/EEG-Blinks. User documentation and examples appear at http://vislab.github.io/EEG-Blinks/.

  5. Streamlining DNA barcoding protocols: automated DNA extraction and a new cox1 primer in arachnid systematics.

    Directory of Open Access Journals (Sweden)

    Nina Vidergar

    Full Text Available BACKGROUND: DNA barcoding is a popular tool in taxonomic and phylogenetic studies, but for most animal lineages protocols for obtaining the barcoding sequences--mitochondrial cytochrome C oxidase subunit I (cox1, AKA CO1)--are not standardized. Our aim was to explore an optimal strategy for arachnids, focusing on the species-richest lineage, spiders, by (1) improving an automated DNA extraction protocol, (2) testing the performance of commonly used primer combinations, and (3) developing a new cox1 primer suitable for more efficient alignment and phylogenetic analyses. METHODOLOGY: We used exemplars of 15 species from all major spider clades, processed a range of spider tissues of varying size and quality, optimized genomic DNA extraction using the MagMAX Express magnetic particle processor-an automated high throughput DNA extraction system-and tested cox1 amplification protocols emphasizing the standard barcoding region using ten routinely employed primer pairs. RESULTS: The best results were obtained with the commonly used Folmer primers (LCO1490/HCO2198) that capture the standard barcode region, and with the C1-J-2183/C1-N-2776 primer pair that amplifies its extension. However, C1-J-2183 is designed too close to HCO2198 for well-interpreted, continuous sequence data, and in practice the resulting sequences from the two primer pairs rarely overlap. We therefore designed a new forward primer C1-J-2123 60 base pairs upstream of the C1-J-2183 binding site. The success rate of this new primer (93%) matched that of C1-J-2183. CONCLUSIONS: The use of C1-J-2123 allows full, indel-free overlap of sequences obtained with the standard Folmer primers and with C1-J-2123 primer pair. Our preliminary tests suggest that in addition to spiders, C1-J-2123 will also perform in other arachnids and several other invertebrates. We provide optimal PCR protocols for these primer sets, and recommend using them for systematic efforts beyond DNA barcoding.

  6. Development of automated extraction method of biliary tract from abdominal CT volumes based on local intensity structure analysis

    Science.gov (United States)

    Koga, Kusuto; Hayashi, Yuichiro; Hirose, Tomoaki; Oda, Masahiro; Kitasaka, Takayuki; Igami, Tsuyoshi; Nagino, Masato; Mori, Kensaku

    2014-03-01

    In this paper, we propose an automated biliary tract extraction method from abdominal CT volumes. The biliary tract is the path by which bile is transported from the liver to the duodenum. No method has been reported for the automated extraction of the biliary tract from common contrast CT volumes. Our method consists of three steps: (1) extraction of extrahepatic bile duct (EHBD) candidate regions, (2) extraction of intrahepatic bile duct (IHBD) candidate regions, and (3) combination of these candidate regions. The IHBD has linear structures, and its intensities are low in CT volumes. We use a dark linear structure enhancement (DLSE) filter based on a local intensity structure analysis method using the eigenvalues of the Hessian matrix for the IHBD candidate region extraction. The EHBD region is extracted using a thresholding process and a connected component analysis. In the combination process, we connect the IHBD candidate regions to each EHBD candidate region and select a bile duct region from the connected candidate regions. We applied the proposed method to 22 CT volumes. The average Dice coefficient of the extraction results was 66.7%.
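
    The IHBD enhancement step is based on the eigenvalues of the Hessian matrix. A minimal sketch of that idea is given below, using scikit-image's Sato tubeness filter with black_ridges=True as a stand-in for the authors' DLSE filter, followed by Otsu thresholding and connected-component labelling; the synthetic volume, scales and threshold are illustrative assumptions.

```python
# Sketch of Hessian-based enhancement of dark tubular structures (a stand-in for
# the paper's DLSE filter), followed by simple thresholding and connected components.
import numpy as np
from skimage.filters import sato, threshold_otsu
from skimage.measure import label

def enhance_dark_tubes(volume, sigmas=(1, 2, 3)):
    """Multi-scale tubeness response for dark, line-like structures (e.g. bile ducts)."""
    return sato(volume, sigmas=sigmas, black_ridges=True)

def candidate_regions(volume):
    tubeness = enhance_dark_tubes(volume.astype(float))
    mask = tubeness > threshold_otsu(tubeness)   # keep strong line-like responses
    return label(mask)                           # connected candidate components

# Example on a synthetic volume containing one dark "duct" along the x-axis.
vol = np.full((40, 40, 40), 100.0)
vol[20, 20, :] = 20.0                            # dark linear structure
labels = candidate_regions(vol)
print("candidate components:", labels.max())
```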

  7. Automated Extraction and Mapping for Desert Wadis from Landsat Imagery in Arid West Asia

    Directory of Open Access Journals (Sweden)

    Yongxue Liu

    2016-03-01

    Full Text Available Wadis, ephemeral dry rivers in arid desert regions that contain water in the rainy season, are often manifested as braided linear channels and are of vital importance for local hydrological environments and regional hydrological management. Conventional methods for effectively delineating wadis from heterogeneous backgrounds are limited for the following reasons: (1) the occurrence of numerous morphological irregularities which disqualify methods based on physical shape; (2) inconspicuous spectral contrast with backgrounds, resulting in frequent false alarms; and (3) the extreme complexity of wadi systems, with numerous tiny tributaries characterized by spectral anisotropy, resulting in a conflict between global and local accuracy. To overcome these difficulties, an automated method for extracting wadis (AMEW) from Landsat-8 Operational Land Imager (OLI) imagery was developed in order to take advantage of the complementarity between Water Indices (WIs), a technique of mathematically combining different bands to enhance water bodies and suppress backgrounds, and image processing technologies in the morphological field involving multi-scale Gaussian matched filtering and a local adaptive threshold segmentation. Evaluation of the AMEW was carried out in representative areas deliberately selected from Jordan, SW Arabian Peninsula, in order to ensure a rigorous assessment. Experimental results indicate that the AMEW achieved considerably higher accuracy than other effective extraction methods in terms of visual inspection and statistical comparison, with an overall accuracy of up to 95.05% for the entire area. In addition, the AMEW (based on the New Water Index, NWI) achieved higher accuracy than other methods (the maximum likelihood classifier and the support vector machine classifier) used for bulk wadi extraction.
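
    A minimal sketch of the AMEW's general recipe follows, under simplifying assumptions: a water index (here McFeeters' NDWI rather than the paper's NWI), multi-scale Gaussian smoothing as a crude surrogate for matched filtering, and a local adaptive threshold. The band arrays, scales and offset are placeholders, not the published parameters.

```python
# Sketch of the general water-index + filtering + adaptive-threshold idea, under
# simplifying assumptions (NDWI instead of the paper's NWI; Gaussian smoothing as
# a crude matched filter). Band arrays are assumed co-registered reflectances.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.filters import threshold_local

def ndwi(green, nir, eps=1e-6):
    return (green - nir) / (green + nir + eps)

def wadi_mask(green, nir, scales=(1, 2, 4), block_size=51):
    index = ndwi(green, nir)
    # Crude multi-scale filtering: take the strongest smoothed response per pixel.
    response = np.max([gaussian_filter(index, s) for s in scales], axis=0)
    # Local adaptive threshold copes with heterogeneous desert backgrounds.
    return response > threshold_local(response, block_size, offset=-0.02)

# Example with random reflectance bands (placeholders for Landsat-8 OLI data).
rng = np.random.default_rng(0)
green, nir = rng.random((2, 200, 200))
print("wadi pixels:", wadi_mask(green, nir).sum())
```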

  8. Californian demonstration and validation of automated agricultural field extraction from multi-temporal Landsat data

    Science.gov (United States)

    Yan, L.; Roy, D. P.

    2013-12-01

    The spatial distribution of agricultural fields is a fundamental description of rural landscapes, and the location and extent of fields are important to establish the area of land utilized for agricultural yield prediction, resource allocation, and economic planning. To date, field objects have not been extracted from satellite data over large areas because of computational constraints and because consistently processed appropriate resolution data have not been available or affordable. We present a fully automated computational methodology to extract agricultural fields from 30m Web Enabled Landsat data (WELD) time series and results for approximately 250,000 square kilometers (eleven 150 x 150 km WELD tiles) encompassing all the major agricultural areas of California. The extracted fields, including rectangular, circular, and irregularly shaped fields, are evaluated by comparison with manually interpreted Landsat field objects. Validation results are presented in terms of standard confusion matrix accuracy measures and also the degree of field object over-segmentation, under-segmentation, fragmentation and shape distortion. The apparent success of the presented field extraction methodology is due to several factors. First, the use of multi-temporal Landsat data, as opposed to single Landsat acquisitions, which allows crop rotations and inter-annual variability in the state of the vegetation to be accommodated and provides more opportunities for cloud-free, non-missing and atmospherically uncontaminated surface observations. Second, the adoption of an object-based approach, namely the variational region-based geometric active contour method, which enables robust segmentation with only a small number of parameters and requires no training data collection. Third, the use of a watershed algorithm to decompose connected segments belonging to multiple fields into coherent isolated field segments and a geometry-based algorithm to detect and associate parts of
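
    The segmentation core can be sketched with off-the-shelf components: a region-based active contour (here scikit-image's morphological Chan-Vese, a simplified relative of the paper's variational geometric active contour) followed by a watershed split of touching segments. The synthetic image and parameters are illustrative only.

```python
# Sketch of region-based active contour segmentation followed by a watershed split
# of merged segments; a simplified stand-in for the paper's variational geometric
# active contour applied to multi-temporal WELD composites.
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import morphological_chan_vese, watershed
from skimage.feature import peak_local_max

def segment_fields(image, iterations=100):
    # Region-based geometric active contour: needs no training data.
    field_mask = morphological_chan_vese(image, iterations,
                                         init_level_set="checkerboard").astype(bool)
    # (Depending on the data, the mask may need inverting so fields are foreground.)
    distance = ndi.distance_transform_edt(field_mask)
    coords = peak_local_max(distance, min_distance=20,
                            labels=ndi.label(field_mask)[0])
    seeds = np.zeros(distance.shape, dtype=bool)
    seeds[tuple(coords.T)] = True
    markers, _ = ndi.label(seeds)
    # Watershed on the distance transform splits touching field segments.
    return watershed(-distance, markers, mask=field_mask)

# Example on a synthetic "NDVI-like" image with two bright rectangular fields.
img = np.zeros((120, 120))
img[10:50, 10:60] = 1.0
img[60:110, 40:100] = 1.0
labels = segment_fields(img + 0.05 * np.random.randn(*img.shape))
print("field segments:", labels.max())
```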

  9. A new automated spectral feature extraction method and its application in spectral classification and defective spectra recovery

    Science.gov (United States)

    Wang, Ke; Guo, Ping; Luo, A.-Li

    2017-03-01

    Spectral feature extraction is a crucial procedure in automated spectral analysis. This procedure starts from the spectral data and produces informative and non-redundant features, facilitating the subsequent automated processing and analysis with machine-learning and data-mining techniques. In this paper, we present a new automated feature extraction method for astronomical spectra, with application in spectral classification and defective spectra recovery. The basic idea of our approach is to train a deep neural network to extract features of spectra with different levels of abstraction in different layers. The deep neural network is trained with a fast layer-wise learning algorithm in an analytical way without any iterative optimization procedure. We evaluate the performance of the proposed scheme on real-world spectral data. The results demonstrate that our method is superior in overall performance, with a computational cost significantly lower than that of other methods. The proposed method can be regarded as a valid new general-purpose feature extraction method for various tasks in spectral data analysis.
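
    The paper's analytical layer-wise training rule is not reproduced here; one common non-iterative scheme of this kind fixes random hidden weights and solves the readout in closed form by least squares. The sketch below illustrates that general idea on placeholder "spectra" and should not be read as the authors' exact algorithm.

```python
# Sketch of one analytical (non-iterative) layer-wise training scheme: a random
# hidden projection plus a closed-form least-squares readout. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def train_layer(X, n_hidden=128, targets=None):
    """Return (W, b, beta): random hidden weights and an analytically solved readout."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                     # hidden features of the spectra
    if targets is None:
        targets = X                            # autoencoder-style reconstruction target
    beta = np.linalg.pinv(H) @ targets         # closed-form least squares, no iterations
    return W, b, beta

def extract_features(X, W, b):
    return np.tanh(X @ W + b)

# Example: 200 fake "spectra" with 500 flux bins each.
X = rng.normal(size=(200, 500))
W, b, beta = train_layer(X)
features = extract_features(X, W, b)
print(features.shape, "reconstruction MSE:", np.mean((features @ beta - X) ** 2))
```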

  10. Automated centreline extraction of neuronal dendrite from optical microscopy image stacks

    Science.gov (United States)

    Xiao, Liang; Zhang, Fanbiao

    2010-11-01

    In this work we present a novel vision-based pipeline for automated skeleton detection and centreline extraction of neuronal dendrites from optical microscopy image stacks. The proposed pipeline is an integrated solution that merges image stack pre-processing, seed point detection, a ridge traversal procedure, minimum spanning tree optimization and tree trimming into a unified framework to deal with this challenging problem. In image stack preprocessing, we first apply a curvelet transform based shrinkage and cycle spinning technique to remove the noise. This is followed by an adaptive threshold method to compute the neuronal object segmentation, and a 3D distance transformation is performed to get the distance map. According to the eigenvalues and eigenvectors of the Hessian matrix, the skeleton seed points are detected. Starting from the seed points, the initial centrelines are obtained using a ridge traversal procedure. After that, we use a minimum spanning tree to organize the geometrical structure of the skeleton points, and then we use graph trimming post-processing to compute the final centreline. Experimental results on different datasets demonstrate that our approach has high reliability and good robustness, and requires less user interaction.
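
    The minimum-spanning-tree step that organizes the skeleton points can be sketched with SciPy: build a k-nearest-neighbour distance graph over the seed points and extract its MST. The random points below are placeholders for Hessian-detected ridge points.

```python
# Sketch of the minimum-spanning-tree step that links skeleton seed points into a
# connected centreline graph, using a k-nearest-neighbour distance graph.
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

def skeleton_mst(points, k=6):
    """Return the MST (sparse matrix of edge lengths) over 3-D skeleton points."""
    tree = cKDTree(points)
    dists, idx = tree.query(points, k=k + 1)          # first neighbour is the point itself
    rows = np.repeat(np.arange(len(points)), k)
    cols = idx[:, 1:].ravel()
    weights = dists[:, 1:].ravel()
    graph = csr_matrix((weights, (rows, cols)), shape=(len(points), len(points)))
    return minimum_spanning_tree(graph)

points = np.random.rand(500, 3)                       # placeholder seed points
mst = skeleton_mst(points)
print("edges in centreline tree:", mst.nnz)
```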

  11. A novel automated device for rapid nucleic acid extraction utilizing a zigzag motion of magnetic silica beads

    Energy Technology Data Exchange (ETDEWEB)

    Yamaguchi, Akemi [Graduate School of Science and Technology, Shinshu University, Nagano (Japan); Core Technology Development Center, Seiko Epson Corporation, Suwa (Japan); Matsuda, Kazuyuki, E-mail: kmatsuda@shinshu-u.ac.jp [Department of Laboratory Medicine, Shinshu University Hospital, Matsumoto (Japan); Uehara, Masayuki [Core Technology Development Center, Seiko Epson Corporation, Suwa (Japan); Honda, Takayuki [Department of Laboratory Medicine, Shinshu University Hospital, Matsumoto (Japan); Saito, Yasunori [Institute of Engineering, Academic Assembly, Shinshu University, Nagano (Japan)

    2016-02-04

    We report a novel automated device for nucleic acid extraction, which consists of a mechanical control system and a disposable cassette. The cassette is composed of a bottle, a capillary tube, and a chamber. After sample injection in the bottle, the sample is lysed, and nucleic acids are adsorbed on the surface of magnetic silica beads. These magnetic beads are transported and are vibrated through the washing reagents in the capillary tube under the control of the mechanical control system, and thus, the nucleic acid is purified without centrifugation. The purified nucleic acid is automatically extracted in 3 min for the polymerase chain reaction (PCR). The nucleic acid extraction is dependent on the transport speed and the vibration frequency of the magnetic beads, and optimizing these two parameters provided better PCR efficiency than the conventional manual procedure. There was no difference between the detection limits of our novel device and that of the conventional manual procedure. We have already developed the droplet-PCR machine, which can amplify and detect specific nucleic acids rapidly and automatically. Connecting the droplet-PCR machine to our novel automated extraction device enables PCR analysis within 15 min, and this system can be made available as a point-of-care test in clinics as well as general hospitals. - Highlights: • Automatic nucleic acid extraction is performed in 3 min. • Zigzag motion of magnetic silica beads yields rapid and efficient extraction. • The present device provides better performance than the conventional procedure.

  12. Automated feature extraction by combining polarimetric SAR and object-based image analysis for monitoring of natural resource exploitation

    OpenAIRE

    Plank, Simon; Mager, Alexander; Schöpfer, Elisabeth

    2015-01-01

    An automated feature extraction procedure based on the combination of a pixel-based unsupervised classification of polarimetric synthetic aperture radar data (PolSAR) and an object-based post-classification is presented. High resolution SpotLight dual-polarimetric (HH/VV) TerraSAR-X imagery acquired over the Doba basin, Chad, is used for method development and validation. In an iterative training procedure the best suited polarimetric speckle filter, processing parameters for the following en...

  13. Rapid and Semi-Automated Extraction of Neuronal Cell Bodies and Nuclei from Electron Microscopy Image Stacks

    Science.gov (United States)

    Holcomb, Paul S.; Morehead, Michael; Doretto, Gianfranco; Chen, Peter; Berg, Stuart; Plaza, Stephen; Spirou, George

    2016-01-01

    Connectomics—the study of how neurons wire together in the brain—is at the forefront of modern neuroscience research. However, many connectomics studies are limited by the time and precision needed to correctly segment large volumes of electron microscopy (EM) image data. We present here a semi-automated segmentation pipeline using freely available software that can significantly decrease segmentation time for extracting both nuclei and cell bodies from EM image volumes. PMID:27259933

  14. A New Object-Based Method for Automated Extraction of Urban Objects from Airborne Sensors Data

    Science.gov (United States)

    Moussa, A.; El-Sheimy, N.

    2012-07-01

    The classification of urban objects such as buildings, trees and roads from airborne sensors data is an essential step in numerous mapping and modelling applications. The automation of this step is greatly needed as the manual processing is costly and time-consuming. The increasing availability of airborne sensors data such as aerial imagery and LIDAR data offers new opportunities to develop more robust approaches for automatic classification. These approaches should integrate these data sources that have different characteristics to exceed the accuracy achieved using any individual data source. The approach presented in this paper fuses aerial image data with single return LIDAR data to extract buildings and trees for an urban area. Object-based analysis is adopted to segment the entire DSM data into objects based on height variation. These objects are preliminarily classified into buildings, trees, and ground. This primary classification is used to compute the height to ground for each object to help improve the accuracy of the second phase of classification. The overlapping perspective aerial images are used to build an ortho-photo to derive a vegetation index value for each object. The second phase of classification is performed based on the height to ground and the vegetation index of each object. The proposed approach has been tested using three areas in the centre of the city of Vaihingen provided by the ISPRS test project on urban classification and 3D building reconstruction. These areas contain historic buildings with rather complex shapes, a few high-rise residential buildings surrounded by trees, and a purely residential area with small detached houses. The results of the proposed approach are presented based on a reference solution for evaluation purposes. The classification evaluation exhibits highly successful results for the buildings class. The proposed approach follows the exact boundary of trees based on LIDAR data
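
    The second classification phase can be sketched as a simple per-object rule combining height above ground with a vegetation index; the thresholds below are illustrative assumptions, not values from the paper.

```python
# Sketch of a second-phase rule-based labelling: combine per-object height above
# ground (from the LIDAR/DSM) with a vegetation index (from the ortho-photo).
# Thresholds are illustrative assumptions.
import numpy as np

def classify_objects(height_above_ground, vegetation_index,
                     min_elevated=2.0, veg_threshold=0.3):
    """Return an array of labels: 'ground', 'tree' or 'building' per object."""
    labels = np.full(height_above_ground.shape, "ground", dtype=object)
    elevated = height_above_ground > min_elevated
    labels[elevated & (vegetation_index >= veg_threshold)] = "tree"
    labels[elevated & (vegetation_index < veg_threshold)] = "building"
    return labels

# Example with five segmented objects.
h = np.array([0.1, 6.0, 9.5, 3.2, 0.4])      # metres above ground per object
vi = np.array([0.05, 0.55, 0.10, 0.62, 0.4])  # NDVI-like index per object
print(classify_objects(h, vi))                # ['ground' 'tree' 'building' 'tree' 'ground']
```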

  15. A novel automated device for rapid nucleic acid extraction utilizing a zigzag motion of magnetic silica beads.

    Science.gov (United States)

    Yamaguchi, Akemi; Matsuda, Kazuyuki; Uehara, Masayuki; Honda, Takayuki; Saito, Yasunori

    2016-02-04

    We report a novel automated device for nucleic acid extraction, which consists of a mechanical control system and a disposable cassette. The cassette is composed of a bottle, a capillary tube, and a chamber. After sample injection in the bottle, the sample is lysed, and nucleic acids are adsorbed on the surface of magnetic silica beads. These magnetic beads are transported and are vibrated through the washing reagents in the capillary tube under the control of the mechanical control system, and thus, the nucleic acid is purified without centrifugation. The purified nucleic acid is automatically extracted in 3 min for the polymerase chain reaction (PCR). The nucleic acid extraction is dependent on the transport speed and the vibration frequency of the magnetic beads, and optimizing these two parameters provided better PCR efficiency than the conventional manual procedure. There was no difference between the detection limits of our novel device and that of the conventional manual procedure. We have already developed the droplet-PCR machine, which can amplify and detect specific nucleic acids rapidly and automatically. Connecting the droplet-PCR machine to our novel automated extraction device enables PCR analysis within 15 min, and this system can be made available as a point-of-care test in clinics as well as general hospitals.

  16. 2D-HIDDEN MARKOV MODEL FEATURE EXTRACTION STRATEGY OF ROTATING MACHINERY FAULT DIAGNOSIS

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    A new feature extraction method based on a 2D hidden Markov model (HMM) is proposed, and a time index and a frequency index are introduced to represent the new features. The feature extraction strategy is tested on experimental data collected from a Bently rotor experiment system. The results show that this methodology is very effective at extracting features of vibration signals during rotor speed-up and can be extended to other non-stationary signal analysis fields in the future.

  17. Highly integrated flow assembly for automated dynamic extraction and determination of readily bioaccessible chromium(VI) in soils exploiting carbon nanoparticle-based solid-phase extraction

    Energy Technology Data Exchange (ETDEWEB)

    Rosende, Maria; Miro, Manuel; Cerda, Victor [University of the Balearic Islands, Department of Chemistry, Palma de Mallorca (Spain); Segundo, Marcela A.; Lima, Jose L.F.C. [University of Porto, REQUIMTE, Department of Chemistry, Faculty of Pharmacy, Porto (Portugal)

    2011-06-15

    An automated dynamic leaching test integrated in a portable flow-based setup is herein proposed for reliable determination of readily bioaccessible Cr(VI) under worst-case scenarios in soils containing varying levels of contamination. The manifold is devised to accommodate bi-directional flow extraction followed by processing of extracts via either in-line clean-up/preconcentration using multi-walled carbon nanotubes or automatic dilution at will, along with Cr(VI) derivatization and flow-through spectrophotometric detection. The magnitude of readily mobilizable Cr(VI) pools was ascertained by resorting to water extraction as promulgated by current standard leaching tests. The role of carbon nanomaterials for the uptake of Cr(VI) in soil leachates and the configuration of the packed column integrated in the flow manifold were investigated in detail. The analytical performance of the proposed system for in vitro bioaccessibility tests was evaluated in chromium-enriched soils at environmentally relevant levels and in a standard reference soil material (SRM 2701) with a certified value of total hexavalent chromium. The automated method was proven to afford unbiased assessment of water-soluble Cr(VI) in soils as a result of the minimization of the chromium species transformation. By combination of the kinetic leaching profile and a first-order leaching model, the water-soluble Cr(VI) fraction in soils was determined in merely 6 h against >24 h taken in batchwise steady-state standard methods. (orig.)

  18. Metal-organic framework mixed-matrix disks: Versatile supports for automated solid-phase extraction prior to chromatographic separation.

    Science.gov (United States)

    Ghani, Milad; Font Picó, Maria Francesca; Salehinia, Shima; Palomino Cabello, Carlos; Maya, Fernando; Berlier, Gloria; Saraji, Mohammad; Cerdà, Víctor; Turnes Palomino, Gemma

    2017-03-10

    We present for the first time the application of metal-organic framework (MOF) mixed-matrix disks (MMD) for the automated flow-through solid-phase extraction (SPE) of environmental pollutants. Zirconium terephthalate UiO-66 and UiO-66-NH2 MOFs with different sizes (90, 200 and 300 nm) have been incorporated into mechanically stable polyvinylidene difluoride (PVDF) disks. The performance of the MOF-MMDs for automated SPE of seven substituted phenols prior to HPLC analysis has been evaluated using the sequential injection analysis technique. MOF-MMDs enabled the simultaneous extraction of phenols with the concomitant size exclusion of molecules of larger size. The best extraction performance was obtained using a MOF-MMD containing 90 nm UiO-66-NH2 crystals. Using the selected MOF-MMD, detection limits ranging from 0.1 to 0.2 μg L(-1) were obtained. Relative standard deviations ranged from 3.9 to 5.3% intra-day and from 4.7 to 5.7% inter-day. Membrane batch-to-batch reproducibility was from 5.2 to 6.4%. Three different groundwater samples were analyzed with the proposed method using MOF-MMDs, obtaining recoveries ranging from 90 to 98% for all tested analytes.

  19. Analysis of halogenated and priority pesticides at different concentration levels. Automated SPE extraction followed by isotope dilution-GC/MS

    Energy Technology Data Exchange (ETDEWEB)

    Planas, C.; Saulo, J.; Rivera, J.; Caixach, J. [Institut Investigacions Quimiques i Ambientals (IIQAB-CSIC), Barcelona (Spain)

    2004-09-15

    In this work, automatic SPE extraction of 16 pesticides and metabolites with the automated Power-Prep™ system is evaluated at different concentration levels using polymeric (ENV+) and C18 sorbent phases. The method was optimised by comparing recoveries obtained using different eluting solvents. The optimised procedure was then applied to spiked water samples at concentration levels of 0.1 μg/L (quality standard for individual pesticides in drinking water) and 0.02 μg/L (close to the detection limit of most pesticides).

  20. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications

    DEFF Research Database (Denmark)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik

    2016-01-01

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to a HTC PAL autosampler syringe. As the autosampler drew sample solution,...

  1. Automated on-line liquid–liquid extraction system for temporal mass spectrometric analysis of dynamic samples

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, Kai-Ta; Liu, Pei-Han [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Urban, Pawel L. [Department of Applied Chemistry, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China); Institute of Molecular Science, National Chiao Tung University, 1001 University Rd, Hsinchu, 300, Taiwan (China)

    2015-09-24

    Most real samples cannot directly be infused into mass spectrometers because they could contaminate delicate parts of the ion source and ion guides, or cause ion suppression. Conventional sample preparation procedures limit the temporal resolution of analysis. We have developed an automated liquid–liquid extraction system that enables unsupervised repetitive treatment of dynamic samples and instantaneous analysis by mass spectrometry (MS). It incorporates inexpensive open-source microcontroller boards (Arduino and Netduino) to guide the extraction and analysis process. The duration of each extraction cycle is 17 min. The system enables monitoring of dynamic processes over many hours. The extracts are automatically transferred to the ion source incorporating a Venturi pump. Operation of the device has been characterized (repeatability, RSD = 15%, n = 20; concentration range for ibuprofen, 0.053–2.000 mM; LOD for ibuprofen, ∼0.005 mM; including extraction and detection). To exemplify its usefulness in real-world applications, we applied this device to chemical profiling of a pharmaceutical formulation dissolution process. Temporal dissolution profiles of commercial ibuprofen and acetaminophen tablets were recorded over 10 h. The extraction-MS datasets were fitted with exponential functions to characterize the rates of release of the main and auxiliary ingredients (e.g. ibuprofen, k = 0.43 ± 0.01 h⁻¹). The electronic control unit of this system interacts with the operator via touch screen, internet, voice, and short text messages sent to a mobile phone, which is helpful when launching long-term (e.g. overnight) measurements. Due to these interactive features, the platform brings the concept of the Internet-of-Things (IoT) to the chemistry laboratory environment. - Highlights: • Mass spectrometric analysis normally requires sample preparation. • Liquid–liquid extraction can isolate analytes from complex matrices. • The proposed system automates
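
    The exponential fitting of the dissolution profiles can be sketched with scipy.optimize.curve_fit on a first-order release model C(t) = C_inf(1 - e^(-kt)); the time series below is synthetic, with the reported ibuprofen rate constant used only to generate the placeholder data.

```python
# Sketch of fitting a first-order release profile C(t) = C_inf * (1 - exp(-k t))
# to extraction-MS time points, as used to estimate release rate constants.
# The data are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

def release(t, c_inf, k):
    return c_inf * (1.0 - np.exp(-k * t))

t = np.linspace(0, 10, 36)                    # hours (roughly one point per 17 min cycle)
true_c, true_k = 1.8, 0.43                    # mM, 1/h (illustrative values)
conc = release(t, true_c, true_k) + 0.05 * np.random.randn(t.size)

(c_inf, k), cov = curve_fit(release, t, conc, p0=(1.0, 0.1))
k_err = np.sqrt(np.diag(cov))[1]
print(f"k = {k:.2f} ± {k_err:.2f} 1/h")
```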

  2. Automated extraction of 11-nor-delta9-tetrahydrocannabinol carboxylic acid from urine samples using the ASPEC XL solid-phase extraction system.

    Science.gov (United States)

    Langen, M C; de Bijl, G A; Egberts, A C

    2000-09-01

    The analysis of 11-nor-delta9-tetrahydrocannabinol-carboxylic acid (THCCOOH, the major metabolite of cannabis) in urine with gas chromatography and mass spectrometry (GC-MS) and solid-phase extraction (SPE) sample preparation is well documented. Automated SPE sample preparation of THCCOOH in urine, although potentially advantageous, is to our knowledge poorly investigated. The objective of the present study was to develop and validate an automated SPE sample-preparation step using the ASPEC XL suited for GC-MS confirmation analysis of THCCOOH in urine drug control. The recoveries showed that it was not possible to transfer the protocol for the manual SPE procedure with the vacuum manifold to the ASPEC XL without loss of recovery. Making the sample more lipophilic by adding 1 mL of 2-propanol to the urine sample after hydrolysis in order to overcome the problem of surface adsorption of THCCOOH led to an extraction efficiency (77%) comparable to that reached with the vacuum manifold (84%). The reproducibility of the automated SPE procedure was better (coefficient of variation 5%) than that of the manual procedure (coefficient of variation 12%). The limit of detection was 1 ng/mL, and the limit of quantitation was 4 ng/mL. Precision at the 12.5-ng/mL level was as follows: mean, 12.4 ng/mL; coefficient of variation, 3.0%. Potential carryover was evaluated, but a carryover effect could not be detected. It was concluded that the proposed method is suited for GC-MS confirmation urinalysis of THCCOOH for prisons and detoxification centers.

  3. Field-scale validation of an automated soil nitrate extraction and measurement system

    NARCIS (Netherlands)

    Sibley, K.J.; Astatkie, T.; Brewster, G.; Struik, P.C.; Adsett, J.F.; Pruski, K.

    2009-01-01

    One of the many gaps that needs to be solved by precision agriculture technologies is the availability of an economic, automated, on-the-go mapping system that can be used to obtain intensive and accurate ‘real-time’ data on the levels of nitrate nitrogen (NO3–N) in the soil. A soil nitrate mapping

  4. Screening for Anabolic Steroids in Urine of Forensic Cases Using Fully Automated Solid Phase Extraction and LC–MS-MS

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards...... and solid phase extraction followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and occurrence of exogenous anabolic steroids....... Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54% and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic...

  5. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    Science.gov (United States)

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards and solid phase extraction followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54% and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids including testosterone were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse were seen in the majority of cases. The method presented serves as a fast and automated screening procedure, proving the suitability of LC-MS-MS for analyzing anabolic steroids.

  6. INVESTIGATION OF ARSENIC SPECIATION ON DRINKING WATER TREATMENT MEDIA UTILIZING AUTOMATED SEQUENTIAL CONTINUOUS FLOW EXTRACTION WITH IC-ICP-MS DETECTION

    Science.gov (United States)

    Three treatment media, used for the removal of arsenic from drinking water, were sequentially extracted using 10mM MgCl2 (pH 8), 10mM NaH2PO4 (pH 7) followed by 10mM (NH4)2C2O4 (pH 3). The media were extracted using an on-line automated continuous extraction system which allowed...

  7. Auto-OBSD: Automatic parameter selection for reliable Oscillatory Behavior-based Signal Decomposition with an application to bearing fault signature extraction

    Science.gov (United States)

    Huang, Huan; Baddour, Natalie; Liang, Ming

    2017-03-01

    Bearing signals are often contaminated by in-band interferences and random noise. Oscillatory Behavior-based Signal Decomposition (OBSD) is a new technique which decomposes a signal according to its oscillatory behavior, rather than frequency or scale. Due to the low oscillatory transients of bearing fault-induced signals, the OBSD can be used to effectively extract bearing fault signatures from a blurred signal. However, the quality of the result highly relies on the selection of method-related parameters. Such parameters are often subjectively selected and a systematic approach has not been reported in the literature. As such, this paper proposes a systematic approach to automatic selection of OBSD parameters for reliable extraction of bearing fault signatures. The OBSD utilizes the idea of Morphological Component Analysis (MCA) that optimally projects the original signal to low oscillatory wavelets and high oscillatory wavelets established via the Tunable Q-factor Wavelet Transform (TQWT). In this paper, the effects of the selection of each parameter on the performance of the OBSD for bearing fault signature extraction are investigated. It is found that some method-related parameters can be fixed at certain values due to the nature of bearing fault-induced impulses. To adaptively tune the remaining parameters, index-guided parameter selection algorithms are proposed. A Convergence Index (CI) is proposed and a CI-guided self-tuning algorithm is developed to tune the convergence-related parameters, namely, penalty factor and number of iterations. Furthermore, a Smoothness Index (SI) is employed to measure the effectiveness of the extracted low oscillatory component (i.e. bearing fault signature). It is shown that a minimum SI implies an optimal result with respect to the adjustment of relevant parameters. Thus, two SI-guided automatic parameter selection algorithms are also developed to specify two other parameters, i.e., Q-factor of high-oscillatory wavelets and
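
    A smoothness index of the kind used to guide the parameter search can be sketched as follows; one common definition in the bearing-diagnosis literature is the ratio of the geometric mean to the arithmetic mean of the signal envelope (a smoother envelope gives a value closer to 1). The definition and signals below are illustrative, not the Auto-OBSD code.

```python
# Sketch of a smoothness index (SI) computed on the Hilbert envelope of a signal;
# a lower SI indicates a more impulsive (fault-like) envelope. Generic illustration only.
import numpy as np
from scipy.signal import hilbert

def smoothness_index(x, eps=1e-12):
    envelope = np.abs(hilbert(x)) + eps
    geometric_mean = np.exp(np.mean(np.log(envelope)))
    return geometric_mean / np.mean(envelope)

fs = 20_000
t = np.arange(0, 1, 1 / fs)
noise = np.random.randn(t.size)                        # smooth envelope, SI near 1
impulses = noise + 8 * (np.sin(2 * np.pi * 3000 * t) *
                        (np.mod(t, 0.01) < 0.001))     # repetitive fault-like impulses
print("SI noise:", smoothness_index(noise), "SI impulsive:", smoothness_index(impulses))
```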

  8. Direct Sampling and Analysis from Solid Phase Extraction Cards using an Automated Liquid Extraction Surface Analysis Nanoelectrospray Mass Spectrometry System

    Energy Technology Data Exchange (ETDEWEB)

    Walworth, Matthew J [ORNL]; ElNaggar, Mariam S [ORNL]; Stankovich, Joseph J [ORNL]; Witkowski II, Charles E. [Protein Discovery, Inc.]; Norris, Jeremy L [ORNL]; Van Berkel, Gary J [ORNL]

    2011-01-01

    Direct liquid extraction based surface sampling, a technique previously demonstrated with continuous flow and autonomous pipette liquid microjunction surface sampling probes, has recently been implemented as the Liquid Extraction Surface Analysis (LESA) mode on the commercially available Advion NanoMate chip-based infusion nanoelectrospray ionization system. In the present paper, the LESA mode was applied to the analysis of 96-well format custom solid phase extraction (SPE) cards, with each well consisting of either a 1 or 2 mm diameter monolithic hydrophobic stationary phase. These substrate wells were conditioned, loaded with either single or multi-component aqueous mixtures, and read out using the LESA mode of a TriVersa NanoMate or a Nanomate 100 coupled to an ABI/Sciex 4000QTRAPTM hybrid triple quadrupole/linear ion trap mass spectrometer and a Thermo LTQ XL linear ion trap mass spectrometer. Extraction conditions, including extraction/nanoESI solvent composition, volume, and dwell times, were optimized in the analysis of targeted compounds. Limit of detection and quantitation as well as analysis reproducibility figures of merit were measured. Calibration data was obtained for propranolol using a deuterated internal standard which demonstrated linearity and reproducibility. A 10x increase in signal and cleanup of micromolar Angiotensin II from a concentrated salt solution was demonstrated. Additionally, a multicomponent herbicide mixture at ppb concentration levels was analyzed using MS3 spectra for compound identification in the presence of isobaric interferences.

  9. Analog integrated circuit design automation placement, routing and parasitic extraction techniques

    CERN Document Server

    Martins, Ricardo; Horta, Nuno

    2017-01-01

    This book introduces readers to a variety of tools for analog layout design automation. After discussing the placement and routing problem in electronic design automation (EDA), the authors overview a variety of automatic layout generation tools, as well as the most recent advances in analog layout-aware circuit sizing. The discussion includes different methods for automatic placement (a template-based Placer and an optimization-based Placer), a fully-automatic Router and an empirical-based Parasitic Extractor. The concepts and algorithms of all the modules are thoroughly described, enabling readers to reproduce the methodologies, improve the quality of their designs, or use them as starting point for a new tool. All the methods described are applied to practical examples for a 130nm design process, as well as placement and routing benchmark sets. Introduces readers to hierarchical combination of Pareto fronts of placements; Presents electromigration-aware routing with multilayer multiport terminal structures...

  10. An Automated Approach to Extracting River Bank Locations from Aerial Imagery Using Image Texture

    Science.gov (United States)

    2015-11-04

    …consuming and labour intensive, and the quality is dependent on the individual doing the task. This paper describes a quick and fully automated method for…

  11. Semi-automated solid-phase extraction method for studying the biodegradation of ochratoxin A by human intestinal microbiota.

    Science.gov (United States)

    Camel, Valérie; Ouethrani, Minale; Coudray, Cindy; Philippe, Catherine; Rabot, Sylvie

    2012-04-15

    A simple and rapid semi-automated solid-phase extraction (SPE) method has been developed for the analysis of ochratoxin A in aqueous matrices related to biodegradation experiments (namely digestive contents and faecal excreta), with a view to using this method to follow OTA biodegradation by human intestinal microbiota. The influence of extraction parameters that could affect semi-automated SPE efficiency was studied using C18-silica as the sorbent and water as the simplest matrix, before the method was applied to the matrices of interest. Conditions finally retained were as follows: 5-mL aqueous samples (pH 3) containing an organic modifier (20% ACN) were applied to 100-mg cartridges. After drying (9 mL of air), the cartridge was rinsed with 5 mL of H(2)O/ACN (80:20, v/v), before eluting the compounds with 3 × 1 mL of MeOH/THF (10:90, v/v). Acceptable recoveries and limits of quantification could be obtained considering the complexity of the investigated matrices and the low volumes sampled; this method was also suitable for the analysis of ochratoxin B in faecal extracts. Applicability of the method is illustrated by preliminary results of ochratoxin A biodegradation studies by human intestinal microbiota under simple in vitro conditions. Interestingly, partial degradation of ochratoxin A was observed, with efficiencies ranging from 14% to 47% after 72 h of incubation. In addition, three phase I metabolites could be identified using high resolution mass spectrometry, namely ochratoxin α, open ochratoxin A and ochratoxin B.

  12. Robust Text Extraction for Automated Processing of Multi-Lingual Personal Identity Documents

    Directory of Open Access Journals (Sweden)

    Pushpa B R

    2016-04-01

    Full Text Available Text extraction is a technique to extract the textual portion from a non-textual background such as images. It plays an important role in deciphering valuable information from images. Variation in text size, font, orientation, alignment, contrast, etc. makes the task of text extraction challenging. Existing text extraction methods focus on certain regions of interest, and characteristics such as noise, blur, distortion and variations in fonts make text extraction difficult. This paper proposes a technique to extract textual characters from scanned personal identity document images. Current procedures keep track of user records manually and thus lead to inefficient practices that demand substantial time and human resources. The proposed methodology digitizes personal identity documents and eliminates the need for a large portion of the manual work involved in existing data entry and verification procedures. The proposed method has been tested extensively with large datasets of varying sizes and image qualities. The results obtained indicate high accuracy in the extraction of important textual features from the document images.

  13. Toward automated parasitic extraction of silicon photonics using layout physical verifications

    Science.gov (United States)

    Ismail, Mohamed; El Shamy, Raghi S.; Madkour, Kareem; Hammouda, Sherif; Swillam, Mohamed A.

    2016-08-01

    A physical verification flow of the layout of silicon photonic circuits is suggested. Simple empirical models are developed to estimate the bend power loss and coupled power in photonic integrated circuits fabricated using standard SOI wafers. These models are utilized in a physical verification flow of the circuit layout to verify reliable fabrication using any electronic design automation tool. The models are accurate when compared with electromagnetic solvers. The models are closed form and circumvent the need to utilize any EM solver for the verification process, which dramatically reduces verification time.

  14. Automated Large Scale Parameter Extraction of Road-Side Trees Sampled by a Laser Mobile Mapping System

    Science.gov (United States)

    Lindenbergh, R. C.; Berthold, D.; Sirmacek, B.; Herrero-Huerta, M.; Wang, J.; Ebersbach, D.

    2015-08-01

    In urbanized Western Europe trees are considered an important component of the built-up environment. This also means that there is an increasing demand for tree inventories. Laser mobile mapping systems provide an efficient and accurate way to sample the 3D road surroundings, including roadside trees. Indeed, at, say, 50 km/h such systems collect point clouds consisting of half a million points per 100 m. Methods exist that extract tree parameters from relatively small patches of such data, but a remaining challenge is to operationally extract roadside tree parameters at the regional level. For this purpose a workflow is presented as follows: the input point clouds are consecutively downsampled, retiled, classified, segmented into individual trees and upsampled to enable automated extraction of tree location, tree height, canopy diameter and trunk diameter at breast height (DBH). The workflow is implemented to work on a laser mobile mapping data set sampling 100 km of road in Sachsen, Germany, and is tested on a 7 km stretch of road. Along this road, the method detected 315 well-detected trees and 56 clusters of tree points in which no individual trees could be identified. Using voxels, the data volume could be reduced by about 97 % in a default scenario. Processing this scenario took ~2500 seconds, corresponding to about 10 km/h, which approaches but is still below the estimated acquisition rate of 50 km/h.
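
    The voxel-based data reduction can be sketched in a few lines of NumPy: keep one averaged point per occupied voxel. The voxel size and the random cloud below are placeholders; the rest of the workflow (retiling, classification, per-tree parameter extraction) is not reproduced.

```python
# Sketch of voxel-grid downsampling of a mobile-mapping point cloud: all points
# that fall into the same voxel are replaced by their centroid.
import numpy as np

def voxel_downsample(points, voxel_size=0.2):
    """Average all points that fall into the same voxel of edge length voxel_size (m)."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0, return_inverse=True, return_counts=True)
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)          # accumulate point coordinates per voxel
    return sums / counts[:, None]

cloud = np.random.rand(500_000, 3) * [100.0, 10.0, 15.0]   # placeholder roadside cloud
reduced = voxel_downsample(cloud)
print(f"kept {reduced.shape[0]} of {cloud.shape[0]} points "
      f"({100 * (1 - reduced.shape[0] / cloud.shape[0]):.1f}% reduction)")
```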

  15. Semi-automated identification and extraction of geomorphological features using digital elevation data

    NARCIS (Netherlands)

    Seijmonsbergen, A.C.; Hengl, T.; Anders, N.S.; Smith, M.J.; Paron, P.; Griffiths, J.S.

    2011-01-01

    Geomorphological maps that are automatically extracted from digital elevation data are gradually replacing classical geomorphological maps. Commonly, digital mapping projects are based upon statistical techniques, object-based protocols or both. In addition to digital elevation data, expert knowledg

  16. Technical Note: Semi-automated effective width extraction from time-lapse RGB imagery of a remote, braided Greenlandic river

    Science.gov (United States)

    Gleason, C. J.; Smith, L. C.; Finnegan, D. C.; LeWinter, A. L.; Pitcher, L. H.; Chu, V. W.

    2015-06-01

    River systems in remote environments are often challenging to monitor and understand where traditional gauging apparatus are difficult to install or where safety concerns prohibit field measurements. In such cases, remote sensing, especially terrestrial time-lapse imaging platforms, offer a means to better understand these fluvial systems. One such environment is found at the proglacial Isortoq River in southwestern Greenland, a river with a constantly shifting floodplain and remote Arctic location that make gauging and in situ measurements all but impossible. In order to derive relevant hydraulic parameters for this river, two true color (RGB) cameras were installed in July 2011, and these cameras collected over 10 000 half hourly time-lapse images of the river by September of 2012. Existing approaches for extracting hydraulic parameters from RGB imagery require manual or supervised classification of images into water and non-water areas, a task that was impractical for the volume of data in this study. As such, automated image filters were developed that removed images with environmental obstacles (e.g., shadows, sun glint, snow) from the processing stream. Further image filtering was accomplished via a novel automated histogram similarity filtering process. This similarity filtering allowed successful (mean accuracy 79.6 %) supervised classification of filtered images from training data collected from just 10 % of those images. Effective width, a hydraulic parameter highly correlated with discharge in braided rivers, was extracted from these classified images, producing a hydrograph proxy for the Isortoq River between 2011 and 2012. This hydrograph proxy shows agreement with historic flooding observed in other parts of Greenland in July 2012 and offers promise that the imaging platform and processing methodology presented here will be useful for future monitoring studies of remote rivers.
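
    The histogram-similarity filtering step can be sketched as follows: each image's grey-level histogram is compared with a reference histogram built from accepted images, and frames below a similarity threshold are dropped. The correlation metric and threshold are assumptions, not necessarily those used in the paper.

```python
# Sketch of a histogram-similarity filter: images whose grey-level histograms
# correlate poorly with a reference histogram (from known-good images) are dropped
# before classification.
import numpy as np

def histogram(image, bins=64):
    h, _ = np.histogram(image, bins=bins, range=(0, 255), density=True)
    return h

def similarity(h1, h2):
    """Pearson correlation between two normalized histograms."""
    return float(np.corrcoef(h1, h2)[0, 1])

def passes_filter(image, reference_hist, threshold=0.8):
    return similarity(histogram(image), reference_hist) >= threshold

# Example: reference from a clean synthetic frame; test a darker (e.g. shadowed) frame.
rng = np.random.default_rng(1)
clean = rng.normal(128, 30, (480, 640)).clip(0, 255)
shadowed = rng.normal(60, 15, (480, 640)).clip(0, 255)
ref = histogram(clean)
print(passes_filter(clean, ref), passes_filter(shadowed, ref))   # True, False
```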

  17. The ESO-LV project - Automated parameter extraction for 16000 ESO/Uppsala galaxies

    NARCIS (Netherlands)

    Lauberts, Andris; Valentijn, Edwin A.

    1987-01-01

    A program to extract photometric and morphological parameters of the galaxies in the ESO/Uppsala survey (Lauberts and Valentijn, 1982) is discussed. The completeness and accuracy of the survey are evaluated and compared with other surveys. The parameters obtained in the program are listed.

  18. Progress in automated extraction and purification of in situ ¹⁴C from quartz: Results from the Purdue in situ ¹⁴C laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Lifton, Nathaniel, E-mail: nlifton@purdue.edu [Department of Earth, Atmospheric, and Planetary Sciences, Purdue University, 550 Stadium Mall Drive, West Lafayette, IN 47907 (United States); Department of Physics and Astronomy and Purdue Rare Isotope Measurement Laboratory (PRIME Lab), Purdue University, 525 Northwestern Avenue, West Lafayette, IN 47907 (United States); Goehring, Brent, E-mail: bgoehrin@tulane.edu [Department of Earth, Atmospheric, and Planetary Sciences, Purdue University, 550 Stadium Mall Drive, West Lafayette, IN 47907 (United States); Wilson, Jim, E-mail: jim.wilson@aeonlaboratories.com [Aeon Laboratories, LLC, 5835 North Genematas Drive, Tucson, AZ 85704 (United States); Kubley, Thomas [Department of Physics and Astronomy and Purdue Rare Isotope Measurement Laboratory (PRIME Lab), Purdue University, 525 Northwestern Avenue, West Lafayette, IN 47907 (United States); Caffee, Marc [Department of Earth, Atmospheric, and Planetary Sciences, Purdue University, 550 Stadium Mall Drive, West Lafayette, IN 47907 (United States); Department of Physics and Astronomy and Purdue Rare Isotope Measurement Laboratory (PRIME Lab), Purdue University, 525 Northwestern Avenue, West Lafayette, IN 47907 (United States)

    2015-10-15

    Current extraction methods for in situ ¹⁴C from quartz [e.g., Lifton et al. (2001), Pigati et al. (2010), Hippe et al. (2013)] are time-consuming and repetitive, making them an attractive target for automation. We report on the status of in situ ¹⁴C extraction and purification systems originally automated at the University of Arizona that have now been reconstructed and upgraded at the Purdue Rare Isotope Measurement Laboratory (PRIME Lab). The Purdue in situ ¹⁴C laboratory builds on the flow-through extraction system design of Pigati et al. (2010), automating most of the procedure by retrofitting existing valves with external servo-controlled actuators, regulating the pressure of research purity O₂ inside the furnace tube via a PID-based pressure controller in concert with an inlet mass flow controller, and installing an automated liquid N₂ distribution system, all driven by LabView® software. A separate system for cryogenic CO₂ purification, dilution, and splitting is also fully automated, ensuring a highly repeatable process regardless of the operator. We present results from procedural blanks and an intercomparison material (CRONUS-A), as well as results of experiments to increase the amount of material used in extraction, from the standard 5 g to 10 g or above. Results thus far are quite promising with procedural blanks comparable to previous work and significant improvements in reproducibility for CRONUS-A measurements. The latter analyses also demonstrate the feasibility of quantitative extraction of in situ ¹⁴C from sample masses up to 10 g. Our lab is now analyzing unknowns routinely, but lowering overall blank levels is the focus of ongoing research.

  19. Comprehensive automation of the solid phase extraction gas chromatographic mass spectrometric analysis (SPE-GC/MS) of opioids, cocaine, and metabolites from serum and other matrices.

    Science.gov (United States)

    Lerch, Oliver; Temme, Oliver; Daldrup, Thomas

    2014-07-01

    The analysis of opioids, cocaine, and metabolites from blood serum is a routine task in forensic laboratories. Commonly, the employed methods include many manual or partly automated steps like protein precipitation, dilution, solid phase extraction, evaporation, and derivatization preceding a gas chromatography (GC)/mass spectrometry (MS) or liquid chromatography (LC)/MS analysis. In this study, a comprehensively automated method was developed from a validated, partly automated routine method. This was possible by replicating method parameters on the automated system. Only marginal optimization of parameters was necessary. The automation relying on an x-y-z robot after manual protein precipitation includes the solid phase extraction, evaporation of the eluate, derivatization (silylation with N-methyl-N-trimethylsilyltrifluoroacetamide, MSTFA), and injection into a GC/MS. A quantitative analysis of almost 170 authentic serum samples and more than 50 authentic samples of other matrices like urine, different tissues, and heart blood on cocaine, benzoylecgonine, methadone, morphine, codeine, 6-monoacetylmorphine, dihydrocodeine, and 7-aminoflunitrazepam was conducted with both methods proving that the analytical results are equivalent even near the limits of quantification (low ng/ml range). To our best knowledge, this application is the first one reported in the literature employing this sample preparation system.

  20. Rhythmic brushstrokes distinguish van Gogh from his contemporaries: findings via automated brushstroke extraction.

    Science.gov (United States)

    Li, Jia; Yao, Lei; Hendriks, Ella; Wang, James Z

    2012-06-01

    Art historians have long observed the highly characteristic brushstroke styles of Vincent van Gogh and have relied on discerning these styles for authenticating and dating his works. In our work, we compared van Gogh with his contemporaries by statistically analyzing a massive set of automatically extracted brushstrokes. A novel extraction method is developed by exploiting an integration of edge detection and clustering-based segmentation. Evidence substantiates that van Gogh's brushstrokes are strongly rhythmic. That is, regularly shaped brushstrokes are tightly arranged, creating a repetitive and patterned impression. We also found that the traits that distinguish van Gogh's paintings in different time periods of his development are all different from those distinguishing van Gogh from his peers. This study confirms that the combined brushwork features identified as special to van Gogh are consistently held throughout his French periods of production (1886-1890).
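
    The two ingredients named for the extractor, edge detection and clustering-based segmentation, can be sketched with standard tools (Canny edges plus k-means on colour); this is a simplified stand-in for the authors' method, applied to a placeholder image patch.

```python
# Sketch combining edge detection with clustering-based segmentation, the two
# ingredients named for the brushstroke extractor; a simplified stand-in only.
import numpy as np
from skimage import color, feature
from sklearn.cluster import KMeans

def rough_brushstroke_map(rgb_image, n_clusters=6):
    gray = color.rgb2gray(rgb_image)
    edges = feature.canny(gray, sigma=1.5)                 # candidate stroke boundaries
    # Cluster pixels by colour to group paint regions between edges.
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(
        rgb_image.reshape(-1, 3)).reshape(gray.shape)
    labels[edges] = -1                                     # mark boundaries separately
    return labels

img = np.random.rand(64, 64, 3)                            # placeholder painting patch
segments = rough_brushstroke_map(img)
print("regions + boundary label:", np.unique(segments).size)
```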

  1. Evaluation of automated nucleic acid extraction methods for virus detection in a multicenter comparative trial

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Bruun; Uttenthal, Åse; Hakhverdyan, M.;

    2009-01-01

    Five European veterinary laboratories participated in an exercise to compare the performance of nucleic acid extraction robots. Identical sets of coded samples were prepared using serial dilutions of bovine viral diarrhoea virus (BVDV) from serum and cell culture propagated material. Each...... for comparison. The remaining equipment and protocols used were less sensitive, in an extreme case for serum, by a factor of 1000. There was no evidence for cross-contamination of RNA template in any of the negative samples included in these panels. These results are not intended to replace local optimisation...... and validation, but provide reassurance to laboratories to indicate that the best performing optimised nucleic acid extraction systems can have similar performance....

  2. Automating identification of avian vocalizations using time-frequency information extracted from the Gabor transform.

    Science.gov (United States)

    Connor, Edward F; Li, Shidong; Li, Steven

    2012-07-01

    Based on the Gabor transform, a metric is developed and applied to automatically identify bird species from a sample of 568 digital recordings of songs/calls from 67 species of birds. The Gabor frequency-amplitude spectrum and the Gabor time-amplitude profile are proposed as a means to characterize the frequency and time patterns of a bird song. An approach based on template matching where unknown song clips are compared to a library of known song clips is used. After adding noise to simulate the background environment and using an adaptive high-pass filter to de-noise the recordings, the successful identification rate exceeded 93% even at signal-to-noise ratios as low as 5 dB. Bird species whose songs/calls were dominated by low frequencies were more difficult to identify than species whose songs were dominated by higher frequencies. The results suggest that automated identification may be practical if comprehensive libraries of recordings that encompass the vocal variation within species can be assembled.
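
    The Gabor transform is equivalent to a short-time Fourier transform with a Gaussian window, so the frequency-amplitude spectrum and a simple template match can be sketched with SciPy as below. The correlation score and synthetic "call" are illustrative, not the paper's exact metric.

```python
# Sketch of a Gabor-transform representation (STFT with a Gaussian window) and a
# simple correlation-based match of frequency-amplitude profiles.
import numpy as np
from scipy.signal import stft

def gabor_frequency_profile(x, fs, window_std=64, nperseg=512):
    f, t, Z = stft(x, fs=fs, window=("gaussian", window_std), nperseg=nperseg)
    return f, np.abs(Z).mean(axis=1)          # frequency-amplitude spectrum

def match_score(profile, template):
    return float(np.corrcoef(profile, template)[0, 1])

# Example: identify a synthetic 3 kHz "call" against two templates.
fs = 22_050
t = np.arange(0, 1, 1 / fs)
clip = np.sin(2 * np.pi * 3000 * t) + 0.5 * np.random.randn(t.size)
_, profile = gabor_frequency_profile(clip, fs)
_, template_a = gabor_frequency_profile(np.sin(2 * np.pi * 3000 * t), fs)
_, template_b = gabor_frequency_profile(np.sin(2 * np.pi * 800 * t), fs)
print("best match:",
      "A" if match_score(profile, template_a) > match_score(profile, template_b) else "B")
```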

  3. Quantification of lung tumor rotation with automated landmark extraction using orthogonal cine MRI images

    Science.gov (United States)

    Paganelli, Chiara; Lee, Danny; Greer, Peter B.; Baroni, Guido; Riboldi, Marco; Keall, Paul

    2015-09-01

    The quantification of tumor motion in sites affected by respiratory motion is of primary importance to improve treatment accuracy. To account for motion, different studies analyzed the translational component only, without focusing on the rotational component, which was quantified in a few studies on the prostate with implanted markers. The aim of our study was to propose a tool able to quantify lung tumor rotation without the use of internal markers, thus providing accurate motion detection close to critical structures such as the heart or liver. Specifically, we propose the use of an automatic feature extraction method in combination with the acquisition of fast orthogonal cine MRI images of nine lung patients. As a preliminary test, we evaluated the performance of the feature extraction method by applying it on regions of interest around (i) the diaphragm and (ii) the tumor and comparing the estimated motion with that obtained by (i) the extraction of the diaphragm profile and (ii) the segmentation of the tumor, respectively. The results confirmed the capability of the proposed method in quantifying tumor motion. Then, a point-based rigid registration was applied to the extracted tumor features between all frames to account for rotation. The median lung rotation values were -0.6 ± 2.3° and -1.5 ± 2.7° in the sagittal and coronal planes, respectively, confirming the need to account for tumor rotation along with translation to improve radiotherapy treatment.
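
    The point-based rigid registration between frames has a closed-form (Kabsch/SVD) solution; the sketch below recovers the in-plane rotation angle from matched feature points, which are assumed to be given.

```python
# Sketch of point-based rigid registration between matched feature points of two
# cine-MRI frames, solved in closed form (Kabsch/SVD); returns the rotation angle.
import numpy as np

def rigid_rotation_deg(points_a, points_b):
    """Rotation (degrees) best aligning points_a to points_b (2-D, matched rows)."""
    a = points_a - points_a.mean(axis=0)
    b = points_b - points_b.mean(axis=0)
    U, _, Vt = np.linalg.svd(a.T @ b)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return float(np.degrees(np.arctan2(R[1, 0], R[0, 0])))

# Example: rotate a synthetic tumour outline by -1.5 degrees and recover the angle.
theta = np.radians(-1.5)
rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
frame0 = np.random.rand(50, 2) * 30
frame1 = frame0 @ rot.T + [0.7, -0.4]         # rotation plus translation
print(f"estimated rotation: {rigid_rotation_deg(frame0, frame1):.2f} deg")
```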

  4. Exploratory normalized difference water indices for semi-automated extraction of Antarctic lake features

    Science.gov (United States)

    Jawak, Shridhar D.; Luis, Alvarinho J.

    2016-05-01

    This work presents several normalized difference water indices (NDWI) to delineate lakes in the Schirmacher Oasis, East Antarctica, using very high resolution WorldView-2 (WV-2) satellite imagery. The Schirmacher Oasis region hosts a number of fresh as well as saline water lakes, including epishelf lakes and ice-free or landlocked lakes, which may be completely frozen, semi-frozen, or ice-free. Detecting all these lake types distinctly on satellite imagery was the major challenge, as their spectral characteristics were identical to those of other land cover targets. The pixel-based multiband spectral index approach is widely used because of its simplicity and comparatively low processing time. In the present study, semi-automatic extraction of lakes in this cryospheric region was carried out by designing specific spectral indices. The study applied a number of existing spectral indices to extract lakes, but none delivered satisfactory results, and hence we modified the NDWI. The potential of the newly added bands in WV-2 satellite imagery was explored by developing spectral indices comprising the Yellow (585 - 625 nm) band in combination with the Blue (450 - 510 nm), Coastal (400 - 450 nm) and Green (510 - 580 nm) bands. For extraction of frozen lakes, the Yellow and near-infrared 2 (NIR2) band pair and the Yellow and Green band pair worked well, whereas for ice-free lake extraction a combination of the Blue and Coastal bands yielded appreciable results when compared with manually digitized data. The results suggest that the modified NDWI approach yielded a bias error varying from 1 to 34 m².
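
    The indices explored all share the normalized-difference form (B1 - B2)/(B1 + B2); the sketch below computes a Yellow/NIR2 pair for frozen lakes and a Coastal/Blue pair for ice-free lakes. The exact band ordering, pairings and thresholds used in the paper may differ; the arrays are assumed WV-2 reflectance bands.

```python
# Sketch of normalized-difference water indices of the general form
# (B1 - B2) / (B1 + B2). Band pairings and the threshold are illustrative.
import numpy as np

def nd_index(b1, b2, eps=1e-6):
    return (b1 - b2) / (b1 + b2 + eps)

def lake_masks(coastal, blue, green, yellow, nir2, threshold=0.3):
    frozen = nd_index(yellow, nir2) > threshold      # frozen/semi-frozen lakes
    ice_free = nd_index(coastal, blue) > threshold   # open-water lakes
    return frozen, ice_free

# Example with random reflectance placeholders for five WV-2 bands.
bands = np.random.rand(5, 256, 256)
frozen, ice_free = lake_masks(*bands)
print("frozen-lake pixels:", frozen.sum(), "ice-free-lake pixels:", ice_free.sum())
```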

  5. Application of automated image analysis to the identification and extraction of recyclable plastic bottles

    Institute of Scientific and Technical Information of China (English)

    Edgar SCAVINO; Dzuraidah Abdul WAHAB; Aini HUSSAIN; Hassan BASRI; Mohd Marzuki MUSTAFA

    2009-01-01

    An experimental machine vision apparatus was used to identify and extract recyclable plastic bottles from a conveyor belt. Color images were taken with a commercially available webcam, and recognition was performed by our in-house software based on the shape and dimensions of the object images. The software was able to manage multiple bottles in a single image and was additionally extended to cases involving touching bottles. Identification was fulfilled by comparing the set of measured features with an existing database while integrating various recognition techniques such as minimum distance in the feature space, self-organized maps, and neural networks. The recognition system was tested on a set of 50 different bottles and has so far provided an accuracy of about 97% for bottle identification. Extraction of the bottles was performed by means of a pneumatic arm, which was activated according to the plastic type; polyethylene-terephthalate (PET) bottles were left on the conveyor belt, while non-PET bottles were extracted. The software was designed to provide the best compromise between reliability and speed for real-time applications, in view of the commercialization of the system at existing recycling plants.

  6. Evaluation of an Automated Information Extraction Tool for Imaging Data Elements to Populate a Breast Cancer Screening Registry.

    Science.gov (United States)

    Lacson, Ronilda; Harris, Kimberly; Brawarsky, Phyllis; Tosteson, Tor D; Onega, Tracy; Tosteson, Anna N A; Kaye, Abby; Gonzalez, Irina; Birdwell, Robyn; Haas, Jennifer S

    2015-10-01

    Breast cancer screening is central to early breast cancer detection. Identifying and monitoring process measures for screening is a focus of the National Cancer Institute's Population-based Research Optimizing Screening through Personalized Regimens (PROSPR) initiative, which requires participating centers to report structured data across the cancer screening continuum. We evaluate the accuracy of automated information extraction of imaging findings from radiology reports, which are available as unstructured text. We present prevalence estimates of imaging findings for breast imaging received by women who obtained care in a primary care network participating in PROSPR (n = 139,953 radiology reports) and compare automatically extracted data elements to a "gold standard" based on manual review for a validation sample of 941 randomly selected radiology reports, including mammograms, digital breast tomosynthesis, ultrasound, and magnetic resonance imaging (MRI). The prevalence of imaging findings varies by data element and modality (e.g., suspicious calcification noted in 2.6% of screening mammograms, 12.1% of diagnostic mammograms, and 9.4% of tomosynthesis exams). In the validation sample, the accuracy of identifying imaging findings, including suspicious calcifications, masses, and architectural distortion (on mammogram and tomosynthesis); masses, cysts, non-mass enhancement, and enhancing foci (on MRI); and masses and cysts (on ultrasound), ranges from 0.8 to 1.0 for recall, precision, and F-measure. Information extraction tools can be used for accurate documentation of imaging findings as structured data elements from text reports for a variety of breast imaging modalities. These data can be used to populate screening registries to help elucidate more effective breast cancer screening processes.
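
    For reference, the validation metrics quoted above reduce to simple counts of agreement with the manually reviewed gold standard. The sketch below shows that arithmetic for one hypothetical finding flag; the example vectors are invented.

      def precision_recall_f1(extracted, gold):
          """Compare automatically extracted binary flags against a manual gold standard."""
          tp = sum(1 for e, g in zip(extracted, gold) if e and g)
          fp = sum(1 for e, g in zip(extracted, gold) if e and not g)
          fn = sum(1 for e, g in zip(extracted, gold) if not e and g)
          precision = tp / (tp + fp) if tp + fp else 0.0
          recall = tp / (tp + fn) if tp + fn else 0.0
          f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
          return precision, recall, f1

      # e.g. suspicious-calcification flags for six reports (invented data)
      print(precision_recall_f1([1, 0, 1, 1, 0, 0], [1, 0, 1, 0, 0, 0]))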

  7. Development and validation of an automated extraction method (accelerated solvent extraction) and a reverse-phase HPLC analysis method for assay of ivermectin in a meat-based chewable formulation.

    Science.gov (United States)

    Abend, Andreas M; Chung, Le; McCollum, David G; Wuelfing, W Peter

    2003-04-10

    A new method for monitoring ivermectin content in HEARTGARD CHEWABLES has been developed and validated. The method consists of the automated extraction of ivermectin from the meat-based formulation under conditions of elevated temperature and pressure (accelerated solvent extraction, ASE) and determination of the active by reverse-phase high-performance liquid chromatography (HPLC). The method resolves both active species of ivermectin (components H(2)B(1a) and H(2)B(1b)) from the formulation matrix.

  8. Fault Diagnosis of Building Automation System Based on Expert System

    Institute of Scientific and Technical Information of China (English)

    孟祥朋; 李决龙; 张炎文

    2011-01-01

    Aiming at the increasing difficulty of fault diagnosis in Building Automation Systems (BAS), expert system theory and fault tree analysis are integrated to form a fault diagnosis expert system for BAS. An expression method integrating frames and rules is put forward, and the corresponding knowledge base is constructed. A reasoning mode combining frame-rule reasoning with non-deterministic reasoning is designed to improve efficiency and the usability of the conclusions. The implementation of the diagnosis expert system and experimental results are presented to validate its feasibility.
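
    The frame-plus-rule idea can be pictured with a toy knowledge base in which each rule fires when its symptom set is observed and carries a certainty factor. This is an illustrative sketch only; the symptoms, faults and certainty values are invented and do not come from the paper.

      RULES = [
          ({"supply_temp_high", "compressor_off"}, ("chiller fault", 0.9)),
          ({"supply_temp_high"},                   ("sensor drift", 0.5)),
          ({"reading_frozen"},                     ("sensor fault", 0.8)),
      ]

      def diagnose(observed_symptoms):
          """Fire every rule whose condition set is satisfied; rank conclusions by certainty."""
          hits = [conclusion for condition, conclusion in RULES if condition <= observed_symptoms]
          return sorted(hits, key=lambda c: -c[1])

      print(diagnose({"supply_temp_high", "compressor_off"}))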

  9. Automated identification and geometrical features extraction of individual trees from Mobile Laser Scanning data in Budapest

    Science.gov (United States)

    Koma, Zsófia; Székely, Balázs; Folly-Ritvay, Zoltán; Skobrák, Ferenc; Koenig, Kristina; Höfle, Bernhard

    2016-04-01

    Mobile Laser Scanning (MLS) is an evolving operational measurement technique for the urban environment, providing large amounts of high resolution information about trees, street features and pole-like objects on street sides or near motorways. In this study we investigate a robust segmentation method to extract individual trees automatically in order to build an object-based tree database system. We focused on the large urban parks in Budapest (Margitsziget and Városliget; KARESZ project), which contain a large diversity of tree species. The MLS data consisted of high density point clouds with 1-8 cm mean absolute accuracy at 80-100 m distance from the streets. The robust segmentation method contains the following steps: the ground points are determined first. As a second step, cylinders are fitted in a vertical slice at 1-1.5 m relative height above ground, which is used to determine the potential location of each single tree trunk and cylinder-like object. Finally, residual values are calculated as the deviation of each point from a vertically expanded fitted cylinder; these residual values are used to separate cylinder-like objects from individual trees. After successful parameterization, the model parameters and the corresponding residual values of the fitted objects are extracted and imported into the tree database. Additionally, geometric features are calculated for each segmented individual tree, such as crown base, crown width, crown length, trunk diameter and volume of the individual tree. In the case of incompletely scanned trees, the extraction of geometric features is based on fitted circles. The result of the study is a tree database containing detailed information about urban trees, which can be a valuable dataset for ecologists, city planners, and planting and mapping purposes. Furthermore, the established database will be the initial point for classifying trees into single species. MLS data used in this project had been measured in the framework of
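
    The slice-based trunk detection step can be illustrated with an algebraic (Kasa) least-squares circle fit to the x/y coordinates of points falling 1-1.5 m above ground. This is a hedged sketch, not the authors' code; the input arrays and slice bounds are assumptions.

      import numpy as np

      def fit_circle(xy):
          """Centre and radius of the best-fit circle through an Nx2 array of slice points."""
          x, y = xy[:, 0], xy[:, 1]
          A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
          b = x ** 2 + y ** 2
          (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
          return (cx, cy), np.sqrt(c + cx ** 2 + cy ** 2)

      def trunk_slice(points, ground_z, zmin=1.0, zmax=1.5):
          """x/y coordinates of points whose height above ground lies in the trunk slice."""
          h = points[:, 2] - ground_z
          return points[(h >= zmin) & (h <= zmax), :2]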

  10. Pedestrian detection in thermal images: An automated scale based region extraction with curvelet space validation

    Science.gov (United States)

    Lakshmi, A.; Faheema, A. G. J.; Deodhare, Dipti

    2016-05-01

    Pedestrian detection is a key problem in night vision processing, with numerous applications that will positively impact the performance of autonomous systems. Despite significant progress, our study shows that the performance of state-of-the-art thermal image pedestrian detectors still has much room for improvement. The purpose of this paper is to overcome the challenges faced by thermal image pedestrian detectors, which employ intensity based Region Of Interest (ROI) extraction followed by feature based validation. The most striking disadvantage faced by the first module, ROI extraction, is the failure to detect cloth-insulated parts. To overcome this setback, this paper employs a region growing pursuit algorithm tuned to the scale of the pedestrian. The statistics subtended by the pedestrian vary drastically with scale, and a deviation-from-normality approach facilitates scale detection. Further, the paper offers an adaptive mathematical threshold to resolve the problem of subtracting the background while also extracting cloth-insulated parts. The inherent false positives of the ROI extraction module are limited by the choice of good features in the pedestrian validation step. One such feature is the curvelet feature, which has been used extensively in optical images but has as yet no reported results in thermal images. It has been used here to arrive at a pedestrian detector with a reduced false positive rate. This work is the first venture made to scrutinize the utility of curvelets for characterizing pedestrians in thermal images. An attempt has also been made to improve the speed of the curvelet transform computation. The classification task is realized through the use of the well known methodology of Support Vector Machines (SVMs). The proposed method is substantiated with qualified evaluation methodologies that permit us to carry out probing and informative comparisons across state-of-the-art features, including deep learning methods, with six

  11. Automated DICOM metadata and volumetric anatomical information extraction for radiation dosimetry

    Science.gov (United States)

    Papamichail, D.; Ploussi, A.; Kordolaimi, S.; Karavasilis, E.; Papadimitroulas, P.; Syrgiamiotis, V.; Efstathopoulos, E.

    2015-09-01

    Patient-specific dosimetry calculations based on simulation techniques have as a prerequisite the modeling of the modality system and the creation of voxelized phantoms. This procedure requires knowledge of the scanning parameters and patient information included in a DICOM file, as well as image segmentation. However, the extraction of this information is complicated and time-consuming. The objective of this study was to develop a simple graphical user interface (GUI) to (i) automatically extract metadata from every slice image of a DICOM file in a single query and (ii) interactively specify the regions of interest (ROI) without explicit access to the radiology information system. The user-friendly application was developed in the Matlab environment. The user can select a series of DICOM files and manage their text and graphical data. The metadata are automatically formatted and presented to the user as a Microsoft Excel file. The volumetric maps are formed by interactively specifying the ROIs and by assigning a specific value to every ROI. The result is stored in DICOM format for data and trend analysis. The developed GUI is easy to use, fast, and constitutes a very useful tool for individualized dosimetry. One of the future goals is to incorporate remote access to a PACS server.

  12. Automated bare earth extraction technique for complex topography in light detection and ranging surveys

    Science.gov (United States)

    Stevenson, Terry H.; Magruder, Lori A.; Neuenschwander, Amy L.; Bradford, Brian

    2013-01-01

    Bare earth extraction is an important component of light detection and ranging (LiDAR) data analysis in terms of terrain classification. The challenge in providing accurate digital surface models is augmented when there is diverse topography within the data set or complex combinations of vegetation and built structures. Few existing algorithms can handle substantial terrain diversity without significant editing or user interaction. This effort presents a newly developed methodology that provides a flexible, adaptable tool capable of integrating multiple LiDAR data attributes for an accurate terrain assessment. The terrain extraction and segmentation (TEXAS) approach uses a third-order spatial derivative for each point in the digital surface model to determine the curvature of the terrain rather than relying solely on the slope. The utilization of the curvature has been shown to successfully preserve ground points in areas of steep terrain, as they typically exhibit low curvature. Within the framework of TEXAS, the contiguous sets of points with low curvatures are grouped into regions using an edge-based segmentation method. The process does not require any user inputs and is completely data driven. This technique was tested on a variety of existing LiDAR surveys, each with varying levels of topographic complexity.

  13. Detecting and extracting clusters in atom probe data: A simple, automated method using Voronoi cells

    Energy Technology Data Exchange (ETDEWEB)

    Felfer, P., E-mail: peter.felfer@sydney.edu.au [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Ceguerra, A.V., E-mail: anna.ceguerra@sydney.edu.au [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Ringer, S.P., E-mail: simon.ringer@sydney.edu.au [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Cairney, J.M., E-mail: julie.cairney@sydney.edu.au [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia)

    2015-03-15

    The analysis of the formation of clusters in solid solutions is one of the most common uses of atom probe tomography. Here, we present a method where we use the Voronoi tessellation of the solute atoms and its geometric dual, the Delaunay triangulation, to test for spatial/chemical randomness of the solid solution as well as to extract the clusters themselves. We show how the parameters necessary for cluster extraction can be determined automatically, i.e. without user interaction, making it an ideal tool for the screening of datasets and the pre-filtering of structures for other spatial analysis techniques. Since the Voronoi volumes are closely related to atomic concentrations, the parameters resulting from this analysis can also be used for other concentration based methods such as iso-surfaces. - Highlights: • Cluster analysis of atom probe data can be significantly simplified by using the Voronoi cell volumes of the atomic distribution. • Concentration fields are defined on a single atomic basis using Voronoi cells. • All parameters for the analysis are determined by optimizing the separation probability of bulk atoms vs clustered atoms.
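
    A minimal sketch of the underlying idea, assuming only an array of solute atom positions: compute the Voronoi cell volume of each atom and flag unusually small cells as candidate cluster atoms. The percentile cut-off is illustrative; the paper determines its parameters automatically rather than by a fixed percentile.

      import numpy as np
      from scipy.spatial import Voronoi, ConvexHull

      def voronoi_volumes(points):
          """Volume of each bounded Voronoi cell; NaN for cells that touch the hull."""
          vor = Voronoi(points)
          vols = np.full(len(points), np.nan)
          for i, region_idx in enumerate(vor.point_region):
              region = vor.regions[region_idx]
              if region and -1 not in region:          # skip unbounded cells
                  vols[i] = ConvexHull(vor.vertices[region]).volume
          return vols

      solute_xyz = np.random.default_rng(1).uniform(0.0, 10.0, size=(500, 3))  # placeholder positions
      vols = voronoi_volumes(solute_xyz)
      candidate_cluster_atoms = vols < np.nanpercentile(vols, 10)              # illustrative cut-off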

  14. Fully automated extraction and analysis of surface Urban Heat Island patterns from moderate resolution satellite images

    Science.gov (United States)

    Keramitsoglou, I.; Kiranoudis, C. T.

    2012-04-01

    Comparison of thermal patterns across different cities is hampered by the lack of an appropriate methodology to extract and characterize the patterns. Moreover, the urban climate community has paid increasing attention to assessing the magnitude and dynamics of the surface Urban Heat Island (SUHI) effect and to identifying the environmental impacts of large cities and "megacities". Motivated by this need, we propose an innovative object-based image analysis procedure to extract thermal patterns for the quantitative analysis of satellite-derived land surface temperature (LST) maps. The spatial and thermal attributes associated with these objects are then calculated and used for analyses of the intensity, position and spatial extent of SUHIs. The output eventually builds up and populates a database with comparable and consistent attributes, allowing comparisons between cities as well as urban climate studies. The methodology is demonstrated over the Greater Athens Area, Greece, with more than 3000 LST images acquired by MODIS over a decade being analyzed. The approach can potentially be applied to current and future (e.g. Sentinel-3) level-2 satellite-derived land surface temperature maps of 1 km spatial resolution acquired over continental and coastal cities.
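
    A simplified sketch of the object-extraction idea, under the assumption that the land surface temperature map is available as a 2D array: pixels warmer than the scene median by a fixed margin are grouped into connected thermal objects and summarized by extent, intensity and position. The 3 K margin is an illustrative choice, not the rule used in the paper.

      import numpy as np
      from scipy import ndimage

      def extract_thermal_objects(lst, margin=3.0):
          """Label contiguous hot regions in a land-surface-temperature grid and collect attributes."""
          hot = lst > (np.nanmedian(lst) + margin)
          labels, n = ndimage.label(hot)
          objects = []
          for i in range(1, n + 1):
              mask = labels == i
              objects.append({
                  "pixels": int(mask.sum()),                 # spatial extent
                  "mean_lst": float(lst[mask].mean()),       # thermal intensity
                  "centroid": ndimage.center_of_mass(mask),  # position
              })
          return objects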

  15. Evaluation of Sample Stability and Automated DNA Extraction for Fetal Sex Determination Using Cell-Free Fetal DNA in Maternal Plasma

    Directory of Open Access Journals (Sweden)

    Elena Ordoñez

    2013-01-01

    Objective. The detection of paternally inherited sequences in maternal plasma, such as the SRY gene for fetal sexing or RHD for fetal blood group genotyping, is becoming part of the daily routine in diagnostic laboratories. Due to the low percentage of fetal DNA, it is crucial to ensure sample stability and the efficiency of DNA extraction. We evaluated blood stability at 4°C for at least 24 hours and automated DNA extraction for fetal sex determination in maternal plasma. Methods. A total of 158 blood samples were collected, using EDTA-K tubes, from women in their 1st trimester of pregnancy. Samples were kept at 4°C for at least 24 hours before processing. An automated DNA extraction was evaluated, and its efficiency was compared with a standard manual procedure. The SRY marker was used to quantify cfDNA by real-time PCR. Results. Although lower cfDNA amounts were obtained by automated DNA extraction (mean 107.35 GE/mL versus 259.43 GE/mL), the SRY sequence was successfully detected in all 108 samples from pregnancies with male fetuses. Conclusion. We successfully evaluated the suitability of standard blood tubes for the collection of maternal blood and assessed samples to be suitable for analysis at least 24 hours later. This would allow shipping to a central reference laboratory from almost anywhere in Europe.

  16. Analysis of the influence of tectonics on the evolution valley network based on the SRTM DEM and the relationship of automatically extracted lineaments and the tectonic faults, Jemma River basin, Ethiopia

    Science.gov (United States)

    Kusák, Michal

    2016-04-01

    The Ethiopian Highlands are a good example of a high plateau landscape formed by a combination of tectonic uplift and episodic volcanism (Kazmin, 1975; Pik et al., 2003; Gani et al., 2009). Deeply incised gorges indicate active fluvial erosion, which leads to instabilities of over-steepened slopes. In this study we focus on the Jemma River basin, a left tributary of the Abay (Blue Nile), to assess the influence of neotectonics on the evolution of its river and valley network. Tectonic lineaments, the shape of valley networks, the direction of river courses and the intensity of fluvial erosion were compared in six subregions which were delineated beforehand by means of morphometric analysis. The influence of tectonics on the valley network is low in the older deep and wide canyons and on the high plateau covered with Tertiary lava flows, while it is high in the younger upper parts of the canyons. Furthermore, the coincidence of the valley network with the tectonic lineaments differs between the subregions. Fluvial erosion along the main tectonic zone (NE-SW direction) opened the way for backward erosion to reach far distant areas in the east. This tectonic zone also separates older areas in the west from the subregions with the youngest landscape evolution in the east, next to the Rift Valley. We studied the functions that automatically extract lineaments in the programs ArcGIS 10.1 and PCI Geomatica, together with the values of their input parameters and the influence of these parameters on the final shape and number of lineaments. A map of automatically extracted lineaments was created and compared with 1) the tectonic faults by the Geology Survey of Ethiopia (1996); and 2) the lineaments based on visual interpretation by the author. The comparison of the lineaments obtained by automated extraction in GIS and by visual interpretation by the author shows that both sets of lineaments have the same azimuth (NE-SW), the same direction as the orientation of the rift. However, the mapping of lineaments by automated

  17. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    Science.gov (United States)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

    Dried blood spots offer many advantages as a sample format including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion by use of the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified cross a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.

  18. Automated Control of the Organic and Inorganic Composition of Aloe vera Extracts Using (1)H NMR Spectroscopy.

    Science.gov (United States)

    Monakhova, Yulia B; Randel, Gabriele; Diehl, Bernd W K

    2016-09-01

    The recent classification of Aloe vera whole-leaf extract by the International Agency for Research on Cancer as a possible carcinogen to humans, as well as the continuous adulteration of authentic A. vera material, has generated renewed interest in controlling A. vera. The existing NMR spectroscopic method for the analysis of A. vera, which is based on a routine developed at Spectral Service, was extended. Apart from aloverose, glucose, malic acid, lactic acid, citric acid, whole-leaf material (WLM), acetic acid, fumaric acid, sodium benzoate, and potassium sorbate, the quantification of Mg(2+), Ca(2+), and fructose is possible with the addition of a Cs-EDTA solution to the sample. The proposed methodology was automated, including phasing, baseline correction, deconvolution (based on the Lorentzian function), integration, quantification, and reporting. The NMR method was applied to 41 A. vera preparations in the form of liquid A. vera juice and solid A. vera powder. The advantages of the new NMR methodology over the previous method are discussed. Correlation between the new and standard NMR methodologies was significant for aloverose, glucose, malic acid, lactic acid, citric acid, and WLM (P vera.

  19. Automated Breast Cancer Diagnosis based on GVF-Snake Segmentation, Wavelet Features Extraction and Neural Network Classification

    Directory of Open Access Journals (Sweden)

    Abderrahim Sebri

    2007-01-01

    Breast cancer accounts for the second most cancer diagnoses among women and the second most cancer deaths in the world. In fact, more than 11000 women die each year, all over the world, because of this disease. Automatic breast cancer diagnosis is an important goal of medical informatics research. Some research has aimed to automate diagnosis at the mammographic stage, while other work has treated the problem at the cytological stage. In this work, we describe the current state of the ongoing BC automated diagnosis research program. It is a software system that provides expert diagnosis of breast cancer based on three steps of cytological image analysis. The first step is based on segmentation using an active contour for cell tracking and isolation of the nucleus in the studied image. Then, textural features are extracted from this nucleus using wavelet transforms to characterize the image by its texture, so that malignant texture can be differentiated from benign texture, on the assumption that tumoral texture differs from the texture of other kinds of tissue. Finally, the obtained features are introduced as the input vector of a Multi-Layer Perceptron (MLP) to classify the images into malignant and benign ones.
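
    The texture and classification stages can be sketched as follows, assuming a segmented grey-level nucleus image as input; the wavelet family, decomposition level and network size are illustrative choices, not those used by the authors.

      import numpy as np
      import pywt
      from sklearn.neural_network import MLPClassifier

      def wavelet_energy_features(nucleus_img, wavelet="db2", level=2):
          """Energy of each detail sub-band of a 2D wavelet decomposition as a texture descriptor."""
          coeffs = pywt.wavedec2(nucleus_img, wavelet, level=level)
          feats = []
          for detail in coeffs[1:]:                          # (cH, cV, cD) triplet per level
              feats.extend(float(np.sum(c ** 2)) for c in detail)
          return np.asarray(feats)

      # X: one feature vector per nucleus image, y: 0 = benign, 1 = malignant (hypothetical data)
      # clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)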

  20. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    Science.gov (United States)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimates of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to the scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploratory data analysis. The identification and geometrical characterization of discontinuity features is divided into steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and
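
    The first step (coplanar surface identification) can be pictured with per-point normals and a planarity score obtained from PCA of the k nearest neighbours. The sketch below is a rough stand-in under stated assumptions (a plain Nx3 array and a fixed k), not the optimized algorithm described above.

      import numpy as np
      from scipy.spatial import cKDTree

      def knn_pca_normals(points, k=20):
          """Unit normals and a simple planarity score (1 - lambda_min/lambda_max) per 3D point."""
          tree = cKDTree(points)
          _, idx = tree.query(points, k=k)
          normals = np.empty_like(points)
          planarity = np.empty(len(points))
          for i, neighbours in enumerate(idx):
              centred = points[neighbours] - points[neighbours].mean(axis=0)
              eigval, eigvec = np.linalg.eigh(centred.T @ centred)  # ascending eigenvalues
              normals[i] = eigvec[:, 0]                             # smallest eigenvector = plane normal
              planarity[i] = 1.0 - eigval[0] / max(eigval[2], 1e-12)
          return normals, planarity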

  1. Development of an automated method for Folin-Ciocalteu total phenolic assay in artichoke extracts.

    Science.gov (United States)

    Yoo, Kil Sun; Lee, Eun Jin; Leskovar, Daniel; Patil, Bhimanagouda S

    2012-12-01

    We developed a fully automatic, consistent, and fast system to run the Folin-Ciocalteu (F-C) total phenolic assay in artichoke extract samples. The system uses 2 high performance liquid chromatography (HPLC) pumps, an autosampler, a column heater, a UV/Vis detector, and a data collection system. To test the system, a pump delivered 10-fold diluted F-C reagent solution at a rate of 0.7 mL/min, and 0.4 g/mL sodium carbonate at a rate of 2.1 mL/min. The autosampler injected 10 μL per 1.2 min, which was mixed with the F-C reagent and heated to 65 °C while it passed through the column heater. The heated reactant was mixed with sodium carbonate, and color intensity was measured by the detector at 600 nm. The data collection system recorded the color intensity, and the peak area of each sample was calculated as the concentration of the total phenolic content, expressed in μg/mL as either chlorogenic acid or gallic acid. This new method had superb repeatability (0.7% CV) and a high correlation with both the manual method (r(2) = 0.93) and the HPLC method (r(2) = 0.78). Ascorbic acid and quercetin showed variable antioxidant activity, but sugars did not. This method can be efficiently applied to research that requires testing large numbers of samples for antioxidant capacity with speed and accuracy.

  2. Automated Classification of L/R Hand Movement EEG Signals using Advanced Feature Extraction and Machine Learning

    Directory of Open Access Journals (Sweden)

    Mohammad H. Alomari

    2013-07-01

    In this paper, we propose an automated computer platform for classifying Electroencephalography (EEG) signals associated with left and right hand movements using a hybrid system that combines advanced feature extraction techniques and machine learning algorithms. EEG represents brain activity as electrical voltage fluctuations along the scalp, and a Brain-Computer Interface (BCI) is a device that enables the use of the brain's neural activity to communicate with others or to control machines, artificial limbs, or robots without direct physical movements. In our research work, we aspired to find the best feature extraction method that enables the differentiation between left and right executed fist movements through various classification algorithms. The EEG dataset used in this research was created and contributed to PhysioNet by the developers of the BCI2000 instrumentation system. Data were preprocessed using the EEGLAB MATLAB toolbox, and artifact removal was done using AAR. Data were epoched on the basis of Event-Related (De)Synchronization (ERD/ERS) and movement-related cortical potential (MRCP) features. Mu/beta rhythms were isolated for the ERD/ERS analysis and delta rhythms were isolated for the MRCP analysis. An Independent Component Analysis (ICA) spatial filter was applied on related channels for noise reduction and isolation of both artifactually and neurally generated EEG sources. The final feature vector included the ERD, ERS, and MRCP features in addition to the mean, power and energy of the activations of the resulting Independent Components (ICs) of the epoched feature datasets. The datasets were fed into two machine-learning algorithms: Neural Networks (NNs) and Support Vector Machines (SVMs). Intensive experiments were carried out, and optimum classification performances of 89.8% and 97.1% were obtained using NN and SVM, respectively. This research shows that this method of feature extraction
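
    One piece of such a pipeline, the band-power feature step followed by SVM classification, is sketched below under simple assumptions (a fixed sampling rate, mu/beta band edges, and epochs shaped channels x samples); the ICA, AAR and MRCP stages are not shown.

      import numpy as np
      from scipy.signal import welch
      from sklearn.svm import SVC

      def band_power(epoch, fs=160.0, band=(8.0, 30.0)):
          """Mean mu/beta band power per channel for one (channels x samples) epoch."""
          f, pxx = welch(epoch, fs=fs, nperseg=min(256, epoch.shape[-1]), axis=-1)
          sel = (f >= band[0]) & (f <= band[1])
          return pxx[:, sel].mean(axis=-1)

      # X = np.array([band_power(ep) for ep in epochs]); y = labels (0 = left, 1 = right)  -- hypothetical
      # clf = SVC(kernel="rbf").fit(X, y)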

  3. Fault Tolerant Control Systems

    DEFF Research Database (Denmark)

    Bøgh, S.A.

    This thesis considered the development of fault tolerant control systems. The focus was on the category of automated processes that do not necessarily comprise a high number of identical sensors and actuators to maintain safe operation, but still have a potential for improving immunity to component ... and isolation, remedial action decision, and reconfiguration. The integration of these modules in software was considered. The general methodology covered the analysis, design, and implementation of fault tolerant control systems on an overall level. Two detailed studies were presented, one on fault detection ..., as for example a variable being zero, low or high. Examples were given that illustrate how such models can be established by simple means, and yet provide important information when combined into a complete system. A special achievement was a method to determine how control loops behave in case of faults ...

  4. Automated oral cancer identification using histopathological images: a hybrid feature extraction paradigm.

    Science.gov (United States)

    Krishnan, M Muthu Rama; Venkatraghavan, Vikram; Acharya, U Rajendra; Pal, Mousumi; Paul, Ranjan Rashmi; Min, Lim Choo; Ray, Ajoy Kumar; Chatterjee, Jyotirmoy; Chakraborty, Chandan

    2012-02-01

    Oral cancer (OC) is the sixth most common cancer in the world. In India it is the most common malignant neoplasm. Histopathological images have widely been used in the differential diagnosis of normal, oral precancerous (oral sub-mucous fibrosis (OSF)) and cancer lesions. However, this technique is limited by subjective interpretation and less accurate diagnosis. The objective of this work is to improve the classification accuracy based on textural features in the development of a computer assisted screening of OSF. The approach introduced here is to grade the histopathological tissue sections into normal, OSF without Dysplasia (OSFWD) and OSF with Dysplasia (OSFD), which would help the oral onco-pathologists to screen subjects rapidly. The biopsy sections are stained with H&E. The optical density of the pixels in the light microscopic images is recorded and represented as a matrix quantized as integers from 0 to 255 for each fundamental color (Red, Green, Blue), resulting in an M×N×3 matrix of integers. Depending on the normal or OSF condition, the image has various granular structures, which are self-similar patterns at different scales termed "texture". We have extracted these textural changes using Higher Order Spectra (HOS), Local Binary Pattern (LBP), and Laws Texture Energy (LTE) from the histopathological images (normal, OSFWD and OSFD). These feature vectors were fed to five different classifiers: Decision Tree (DT), Sugeno Fuzzy, Gaussian Mixture Model (GMM), K-Nearest Neighbor (K-NN), Radial Basis Probabilistic Neural Network (RBPNN) to select the best classifier. Our results show that the combination of texture and HOS features coupled with the Fuzzy classifier resulted in 95.7% accuracy, with sensitivity and specificity of 94.5% and 98.8%, respectively. Finally, we have proposed a novel integrated index called Oral Malignancy Index (OMI) using the HOS, LBP, LTE features, to diagnose benign or malignant tissues using just one number. We hope that this OMI can
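
    The LBP stage, for instance, reduces each grey-level patch to a histogram of local binary patterns. The sketch below uses scikit-image under assumed parameter choices (radius, number of sampling points and bin count are illustrative).

      import numpy as np
      from skimage.feature import local_binary_pattern

      def lbp_histogram(gray_patch, radius=1, n_points=8):
          """Uniform-LBP histogram of a grey-level patch, used as a texture feature vector."""
          lbp = local_binary_pattern(gray_patch, n_points, radius, method="uniform")
          n_bins = n_points + 2                            # uniform patterns plus the "other" bin
          hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
          return hist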

  5. Diagnosis and fault-tolerant control

    CERN Document Server

    Blanke, Mogens; Lunze, Jan; Staroswiecki, Marcel

    2016-01-01

    Fault-tolerant control aims at a gradual shutdown response in automated systems when faults occur. It satisfies the industrial demand for enhanced availability and safety, in contrast to traditional reactions to faults, which bring about sudden shutdowns and loss of availability. The book presents effective model-based analysis and design methods for fault diagnosis and fault-tolerant control. Architectural and structural models are used to analyse the propagation of the fault through the process, to test the fault detectability and to find the redundancies in the process that can be used to ensure fault tolerance. It also introduces design methods suitable for diagnostic systems and fault-tolerant controllers for continuous processes that are described by analytical models and for discrete-event systems represented by automata. The book is suitable for engineering students, engineers in industry and researchers who wish to get an overview of the variety of approaches to process diagnosis and fault-tolerant control...

  6. Automation of DNA and miRNA co-extraction for miRNA-based identification of human body fluids and tissues.

    Science.gov (United States)

    Kulstein, Galina; Marienfeld, Ralf; Miltner, Erich; Wiegand, Peter

    2016-10-01

    In recent years, microRNA (miRNA) analysis has come into focus in the field of forensic genetics. Yet, no standardized and recommendable protocols for the co-isolation of miRNA and DNA from forensically relevant samples have been developed so far. Hence, this study evaluated the performance of an automated Maxwell® 16 System-based strategy (Promega) for co-extraction of DNA and miRNA from forensically relevant (blood and saliva) samples compared to (semi-)manual extraction methods. Three procedures were compared on the basis of recovered quantity of DNA and miRNA (as determined by real-time PCR and Bioanalyzer), miRNA profiling (shown by Cq values and extraction efficiency), STR profiles, duration, contamination risk and handling. All in all, the results highlight that the automated co-extraction procedure yielded the highest miRNA and DNA amounts from saliva and blood samples compared to both (semi-)manual protocols. Also, for aged and genuine samples of forensically relevant traces, the miRNA and DNA yields were sufficient for subsequent downstream analysis. Furthermore, the strategy allows miRNA extraction only in cases where it is relevant to obtain additional information about the sample type. Besides, this system enables flexible sample throughput and labor-saving sample processing with reduced risk of cross-contamination.

  7. Rough Set Theory Based Approach for Fault Diagnosis Rule Extraction of Distribution System

    Institute of Scientific and Technical Information of China (English)

    周永勇; 周湶; 刘佳宾

    2008-01-01

    As the first step of service restoration in a distribution system, rapid fault diagnosis is a significant task for reducing power outage time, decreasing outage loss, and subsequently improving service reliability and safety. This paper analyzes a fault diagnosis approach using rough set theory, in which reducing the decision table of the data set is the main calculation-intensive task. Aiming at this reduction problem, a heuristic reduction algorithm based on attribute length and frequency is proposed. At the same time, the corresponding value reduction method is proposed in order to accomplish the reduction and the extraction of diagnosis rules. Meanwhile, a Euclidean matching method is introduced to resolve conflicts among the extracted rules when some information is lacking. The principle of the whole algorithm is clear, and the diagnostic rules distilled from the reduction are concise. Moreover, it requires less calculation on the specific discernibility matrix, and thus avoids the corresponding NP-hard problem. The whole process is realized in MATLAB. A simulation example shows that the method has a fast calculation speed and that the extracted rules reflect the fault characteristics in a concise form. The rule database, formed by different reductions of the decision table, can diagnose single and multiple faults efficiently and gives satisfactory results even when the available information is incomplete. The proposed method has good error-tolerance capability and potential for on-line fault diagnosis.

  8. Submicrometric Magnetic Nanoporous Carbons Derived from Metal-Organic Frameworks Enabling Automated Electromagnet-Assisted Online Solid-Phase Extraction.

    Science.gov (United States)

    Frizzarin, Rejane M; Palomino Cabello, Carlos; Bauzà, Maria Del Mar; Portugal, Lindomar A; Maya, Fernando; Cerdà, Víctor; Estela, José M; Turnes Palomino, Gemma

    2016-07-19

    We present the first application of submicrometric magnetic nanoporous carbons (μMNPCs) as sorbents for automated solid-phase extraction (SPE). Small zeolitic imidazolate framework-67 crystals are obtained at room temperature and directly carbonized under an inert atmosphere to obtain submicrometric nanoporous carbons containing magnetic cobalt nanoparticles. The μMNPCs have a high contact area, high stability, and their preparation is simple and cost-effective. The prepared μMNPCs are exploited as sorbents in a microcolumn format in a sequential injection analysis (SIA) system with online spectrophotometric detection, which includes a specially designed three-dimensional (3D)-printed holder containing an automatically actuated electromagnet. The combined action of permanent magnets and an automatically actuated electromagnet enabled the movement of the solid bed of particles inside the microcolumn, preventing their aggregation, increasing the versatility of the system, and increasing the preconcentration efficiency. The method was optimized using a full factorial design and Doehlert Matrix. The developed system was applied to the determination of anionic surfactants, exploiting the retention of the ion-pairs formed with Methylene Blue on the μMNPC. Using sodium dodecyl sulfate as a model analyte, quantification was linear from 50 to 1000 μg L(-1), and the detection limit was equal to 17.5 μg L(-1), the coefficient of variation (n = 8; 100 μg L(-1)) was 2.7%, and the analysis throughput was 13 h(-1). The developed approach was applied to the determination of anionic surfactants in water samples (natural water, groundwater, and wastewater), yielding recoveries of 93% to 110% (95% confidence level).

  9. Development of an automated sequential injection on-line solvent extraction-back extraction procedure as demonstrated for the determination of cadmium with detection by electrothermal atomic absorption spectrometry

    DEFF Research Database (Denmark)

    Wang, Jianhua; Hansen, Elo Harald

    2002-01-01

    An automated sequential injection (SI) on-line solvent extraction-back extraction separation/preconcentration procedure is described. Demonstrated for the assay of cadmium by electrothermal atomic absorption spectrometry (ETAAS), the analyte is initially complexed with ammonium pyrrolidinedithiocarbamate (APDC) in citrate buffer and the chelate is extracted into isobutyl methyl ketone (IBMK), which is separated from the aqueous phase by means of a newly designed dual-conical gravitational phase separator. A metered amount of the organic eluate is aspirated and stored in the PTFE holding coil (HC) of the SI system. Afterwards, it is dispensed and mixed with an aqueous back extractant of dilute nitric acid containing Hg(II) ions as stripping agent, thereby facilitating a rapid metal-exchange reaction with the APDC ligand and transfer of the Cd into the aqueous phase. The aqueous phase is separated

  10. Automated extraction of DNA from blood and PCR setup using a Tecan Freedom EVO liquid handler for forensic genetic STR typing of reference samples

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune

    2011-01-01

    We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes, minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was set up in 96-well microtiter plates. The methods were validated for the kits: AmpFlSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI...

  11. Determination of Low Concentrations of Acetochlor in Water by Automated Solid-Phase Extraction and Gas Chromatography with Mass-Selective Detection

    Science.gov (United States)

    Lindley, C.E.; Stewart, J.T.; Sandstrom, M.W.

    1996-01-01

    A sensitive and reliable gas chromatographic/mass spectrometric (GC/MS) method for determining acetochlor in environmental water samples was developed. The method involves automated extraction of the herbicide from a filtered 1 L water sample through a C18 solid-phase extraction column, elution from the column with hexane-isopropyl alcohol (3 + 1), and concentration of the extract with nitrogen gas. The herbicide is quantitated by capillary-column GC/MS with selected-ion monitoring of 3 characteristic ions. The single-operator method detection limit for reagent water samples is 0.0015 μg/L. Mean recoveries ranged from about 92 to 115% for 3 water matrixes fortified at 0.05 and 0.5 μg/L. Average single-operator precision, over the course of 1 week, was better than 5%.

  12. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rosas-Castor, J.M. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Portugal, L.; Ferrer, L. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Guzmán-Mar, J.L.; Hernández-Ramírez, A. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Cerdà, V. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinojosa-Reyes, L., E-mail: laura.hinojosary@uanl.edu.mx [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico)

    2015-05-18

    Highlights: • A fully automated flow-based modified-BCR extraction method was developed to evaluate the extractable As of soil. • The MSFIA–HG-AFS system included a UV photo-oxidation step for organic species degradation. • The accuracy and precision of the proposed method were found to be satisfactory. • The analysis time can be reduced by up to a factor of eight by using the proposed flow-based BCR method. • The labile As (F1 + F2) was <50% of total As in soil samples from As-contaminated mining zones. - Abstract: A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or interchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L(-1) for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013-0.800, 0.011-0.900 and 0.079-1.400 mg L(-1) for F1, F2, and F3, respectively. The precision of the automated MSFIA–HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L(-1) As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural

  13. Fault Tree Generation and Augmentation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault Management (FM) is one of the key components of system autonomy. In order to guarantee FM effectiveness and control the cost, tools are required to automate...

  14. Fault Estimation

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, H.

    2002-01-01

    This paper presents a range of optimization based approaches to fault diagnosis. A variety of fault diagnosis problems are reformulated in the so-called standard problem setup introduced in the literature on robust control. Once the standard problem formulations are given, the fault diagnosis problems can be solved by standard optimization techniques. The proposed methods include: (1) fault diagnosis (fault estimation (FE)) for systems with model uncertainties; (2) FE for systems with parametric faults, and (3) FE for a class of nonlinear systems.

  15. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry.

    Science.gov (United States)

    Rosas-Castor, J M; Portugal, L; Ferrer, L; Guzmán-Mar, J L; Hernández-Ramírez, A; Cerdà, V; Hinojosa-Reyes, L

    2015-05-18

    A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or interchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L(-1) for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013-0.800, 0.011-0.900 and 0.079-1.400 mg L(-1) for F1, F2, and F3, respectively. The precision of the automated MSFIA-HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L(-1) As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural soil samples from an arsenic-contaminated mining zone to evaluate its extractability. The frequency of analysis of the proposed method was eight times higher than that of the conventional BCR method (6 vs 48 h), and the kinetics of lixiviation were established for each fraction.

  16. Automated mini-column solid-phase extraction cleanup for high-throughput analysis of chemical contaminants in foods by low-pressure gas chromatography – tandem mass spectrometry

    Science.gov (United States)

    This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...

  17. A filter paper-based microdevice for low-cost, rapid, and automated DNA extraction and amplification from diverse sample types.

    Science.gov (United States)

    Gan, Wupeng; Zhuang, Bin; Zhang, Pengfei; Han, Junping; Li, Cai-Xia; Liu, Peng

    2014-10-07

    A plastic microfluidic device that integrates a filter disc as a DNA capture phase was successfully developed for low-cost, rapid and automated DNA extraction and PCR amplification from various raw samples. The microdevice was constructed by sandwiching a piece of Fusion 5 filter, as well as a PDMS (polydimethylsiloxane) membrane, between two PMMA (poly(methyl methacrylate)) layers. An automated DNA extraction from 1 μL of human whole blood can be finished on the chip in 7 minutes by sequentially aspirating NaOH, HCl, and water through the filter. The filter disc containing extracted DNA was then taken out directly for PCR. On-chip DNA purification from 0.25-1 μL of human whole blood yielded 8.1-21.8 ng of DNA, higher than those obtained using QIAamp® DNA Micro kits. To realize DNA extraction from raw samples, an additional sample loading chamber containing a filter net with an 80 μm mesh size was designed in front of the extraction chamber to accommodate sample materials. Real-world samples, including whole blood, dried blood stains on Whatman® 903 paper, dried blood stains on FTA™ cards, buccal swabs, saliva, and cigarette butts, can all be processed in the system in 8 minutes. In addition, multiplex amplification of 15 STR (short tandem repeat) loci and Sanger-based DNA sequencing of the 520 bp GJB2 gene were accomplished from the filters that contained extracted DNA from blood. To further prove the feasibility of integrating this extraction method with downstream analyses, "in situ" PCR amplifications were successfully performed in the DNA extraction chamber following DNA purification from blood and blood stains without DNA elution. Using a modified protocol to bond the PDMS and PMMA, our plastic PDMS devices withstood the PCR process without any leakage. This study represents a significant step towards the practical application of on-chip DNA extraction methods, as well as the development of fully integrated genetic analytical systems.

  18. A fully automated method for simultaneous determination of aflatoxins and ochratoxin A in dried fruits by pressurized liquid extraction and online solid-phase extraction cleanup coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca

    2015-04-01

    According to current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of the highly toxic and widespread mycotoxins aflatoxins (AFs) and ochratoxin A (OTA) in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE), with aqueous methanol (30%) at 110 °C, of the slurried dried fruit and online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AFs and OTA maximum levels imposed by EU regulation in dried fruit for direct human consumption. Also, recoveries (83-103%) and repeatability (RSD < 8, n = 3) meet the performance criteria required by EU regulation for the determination of the levels of mycotoxins in foodstuffs. The main advantage of the proposed method is the full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results.

  19. Fault detection in reciprocating compressor valves under varying load conditions

    Science.gov (United States)

    Pichler, Kurt; Lughofer, Edwin; Pichler, Markus; Buchegger, Thomas; Klement, Erich Peter; Huschenbett, Matthias

    2016-03-01

    This paper presents a novel approach for detecting cracked or broken reciprocating compressor valves under varying load conditions. The main idea is that the time-frequency representation of vibration measurement data will show typical patterns depending on the fault state. The problem is to detect these patterns reliably. For the detection task, we make a detour via the two-dimensional autocorrelation. The autocorrelation emphasizes the patterns and reduces noise effects, which makes it easier to define appropriate features. After feature extraction, classification is done using logistic regression and support vector machines. The method's performance is validated by analyzing real-world measurement data. The results show a very high detection accuracy while keeping the false alarm rates at a very low level for different compressor loads, thus achieving a load-independent method. The proposed approach is, to the best of our knowledge, the first automated method for reciprocating compressor valve fault detection that can handle varying load conditions.
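
    The processing chain described above can be sketched in a few lines, assuming raw vibration signals and fault labels are available (the variable names and summary features are illustrative, not the authors' implementation): a spectrogram as the time-frequency representation, a 2-D autocorrelation to emphasise the repeating patterns, simple summary features, and a logistic-regression classifier.

      import numpy as np
      from scipy.signal import spectrogram, fftconvolve
      from sklearn.linear_model import LogisticRegression

      def valve_features(signal, fs):
          """Summary features of the 2-D autocorrelation of the spectrogram magnitude."""
          _, _, sxx = spectrogram(signal, fs=fs, nperseg=256)
          sxx = sxx - sxx.mean()
          acorr = fftconvolve(sxx, sxx[::-1, ::-1], mode="same")   # 2-D autocorrelation
          acorr /= np.abs(acorr).max()
          return np.array([acorr.mean(), acorr.std(), np.median(acorr)])

      # X = np.vstack([valve_features(sig, fs) for sig in signals]); y = fault labels  -- hypothetical
      # clf = LogisticRegression().fit(X, y)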

  20. Automated extraction method for the center line of spinal canal and its application to the spinal curvature quantification in torso X-ray CT images

    Science.gov (United States)

    Hayashi, Tatsuro; Zhou, Xiangrong; Chen, Huayue; Hara, Takeshi; Miyamoto, Kei; Kobayashi, Tatsunori; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Fujita, Hiroshi

    2010-03-01

    X-ray CT images have been widely used in clinical routine in recent years, and images scanned by a modern CT scanner show the details of various organs and tissues, which can therefore be interpreted simultaneously. However, CT image interpretation requires a lot of time and effort, so support for interpreting CT images based on image-processing techniques is expected. Interpretation of spinal curvature is important for clinicians because spinal curvature is associated with various spinal disorders. We propose a quantification scheme for spinal curvature based on the center line of the spinal canal on CT images. The proposed scheme consists of four steps: (1) automated extraction of the skeletal region based on CT number thresholding; (2) automated extraction of the center line of the spinal canal; (3) generation of the median plane image of the spine, reformatted based on the spinal canal; and (4) quantification of the spinal curvature. The proposed scheme was applied to 10 cases and compared with the Cobb angle that is commonly used by clinicians. We found a high correlation (95% confidence interval for lumbar lordosis: 0.81-0.99) between values obtained by the proposed (vector) method and the Cobb angle. The proposed method also provided reproducible results (inter- and intra-observer variability within 2°). These experimental results suggest that the proposed method is effective for quantifying spinal curvature on CT images.
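
    A small worked sketch of step (4) only, assuming the center line of the spinal canal has already been extracted as an ordered list of points in the median plane; the Cobb-like angle is taken here between the tangents at the two ends of the curve, an illustrative simplification rather than the authors' exact formulation.

      import numpy as np

      def cobb_like_angle(centerline):
          """Angle (degrees) between the tangents at both ends of the curve."""
          pts = np.asarray(centerline, dtype=float)
          t_top = pts[4] - pts[0]        # smoothed tangent at the upper end
          t_bot = pts[-1] - pts[-5]      # smoothed tangent at the lower end
          c = np.dot(t_top, t_bot) / (np.linalg.norm(t_top) * np.linalg.norm(t_bot))
          return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

      # synthetic lordotic curve: 200 mm of spine with a sinusoidal deviation
      z = np.linspace(0.0, 200.0, 50)
      x = 30.0 * np.sin(z / 200.0 * np.pi)
      print(f"Cobb-like angle: {cobb_like_angle(np.c_[x, z]):.1f} deg")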

  1. Fault Restoration Based on Relay Protection and Distribution Automation for Distribution Systems

    Institute of Scientific and Technical Information of China (English)

    门强

    2016-01-01

    Relay protection of the distribution network, coordinated with distribution automation, can to a certain extent ensure the safety of the power supply circuit. Based on an analysis of how distribution network faults are handled at the Baoji Power Supply Company of the Shaanxi Province Electric Power Company, this paper argues that power utilities in China should continuously optimize their distribution equipment according to local conditions so as to better safeguard the reliability of the power supply for consumers.

  2. Automated Extraction of Inundated Areas from Multi-Temporal Dual-Polarization RADARSAT-2 Images of the 2011 Central Thailand Flood

    Directory of Open Access Journals (Sweden)

    Pisut Nakmuenwai

    2017-01-01

    This study examines a novel extraction method for SAR imagery data of widespread flooding, particularly in the Chao Phraya river basin of central Thailand, where flooding occurs almost every year. Because the 2011 flood was among the largest events and of a long duration, a large number of satellites observed it, and imagery data are available. At that time, RADARSAT-2 data were mainly used by the Thai government to extract the affected areas, whereas ThaiChote-1 imagery data were also used as optical supporting data. In this study, the same data were employed in a somewhat different and more detailed manner. Multi-temporal dual-polarized RADARSAT-2 images were used to classify water areas using a clustering-based thresholding technique, neighboring valley-emphasis, to establish an automated extraction system. The technique is designed to improve classification speed and efficiency: it selects specific water references throughout the study area to estimate local threshold values and then averages them by an area weight to obtain the threshold value for the entire area. The extracted results were validated using high-resolution optical images from the GeoEye-1 and ThaiChote-1 satellites and water elevation data from gaging stations.
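
    A compact sketch of the thresholding idea, assuming the SAR backscatter image is available as a NumPy array: a (neighboring) valley-emphasis criterion, i.e. the Otsu between-class variance weighted by the histogram mass around the candidate threshold, plus the area-weighted averaging of local thresholds mentioned above. The bin count, neighborhood width and patch handling are illustrative, not the operational RADARSAT-2 settings.

      import numpy as np

      def valley_emphasis_threshold(image, bins=256, neighborhood=2):
          """Between-class variance weighted by (1 - histogram mass near t)."""
          hist, edges = np.histogram(image.ravel(), bins=bins)
          p = hist / hist.sum()
          omega = np.cumsum(p)                       # class-1 probability
          mu = np.cumsum(p * np.arange(bins))        # class-1 cumulative mean
          with np.errstate(divide="ignore", invalid="ignore"):
              sigma_b = (mu[-1] * omega - mu) ** 2 / (omega * (1.0 - omega))
          sigma_b = np.nan_to_num(sigma_b)
          kernel = np.ones(2 * neighborhood + 1)
          p_neigh = np.convolve(p, kernel, mode="same")   # valley weight
          k = int(np.argmax((1.0 - p_neigh) * sigma_b))
          return edges[k + 1]

      def area_weighted_threshold(local_patches):
          """Average the local thresholds, weighted by patch area in pixels."""
          th = np.array([valley_emphasis_threshold(p) for p in local_patches])
          areas = np.array([p.size for p in local_patches], dtype=float)
          return float(np.sum(th * areas) / areas.sum())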

  3. Demonstration and validation of automated agricultural field extraction from multi-temporal Landsat data for the majority of United States harvested cropland

    Science.gov (United States)

    Yan, L.; Roy, D. P.

    2014-12-01

    The spatial distribution of agricultural fields is a fundamental description of rural landscapes, and the location and extent of fields are important for establishing the area of land utilized for agricultural yield prediction, resource allocation and economic planning, and may be indicative of the degree of agricultural capital investment, mechanization, and labor intensity. To date, field objects have not been extracted from satellite data over large areas because of computational constraints, the complexity of the extraction task, and because consistently processed appropriate resolution data have not been available or affordable. A recently published automated methodology to extract agricultural crop fields from weekly 30 m Web Enabled Landsat data (WELD) time series was refined and applied to 14 states that cover 70% of harvested U.S. cropland (USDA 2012 Census). The methodology was applied to 2010 combined weekly Landsat 5 and 7 WELD data. The field extraction and quantitative validation results are presented for the following 14 states: Iowa, North Dakota, Illinois, Kansas, Minnesota, Nebraska, Texas, South Dakota, Missouri, Indiana, Ohio, Wisconsin, Oklahoma and Michigan (sorted by area of harvested cropland). These states include the top 11 U.S. states by harvested cropland area. Implications and recommendations for systematic application to global coverage Landsat data are discussed.

  4. A Comprehensive Automated 3D Approach for Building Extraction, Reconstruction, and Regularization from Airborne Laser Scanning Point Clouds.

    Science.gov (United States)

    Dorninger, Peter; Pfeifer, Norbert

    2008-11-17

    Three dimensional city models are necessary for supporting numerous management applications. For the determination of city models for visualization purposes, several standardized workflows do exist. They are either based on photogrammetry or on LiDAR or on a combination of both data acquisition techniques. However, the automated determination of reliable and highly accurate city models is still a challenging task, requiring a workflow comprising several processing steps. The most relevant are building detection, building outline generation, building modeling, and finally, building quality analysis. Commercial software tools for building modeling require, generally, a high degree of human interaction and most automated approaches described in literature stress the steps of such a workflow individually. In this article, we propose a comprehensive approach for automated determination of 3D city models from airborne acquired point cloud data. It is based on the assumption that individual buildings can be modeled properly by a composition of a set of planar faces. Hence, it is based on a reliable 3D segmentation algorithm, detecting planar faces in a point cloud. This segmentation is of crucial importance for the outline detection and for the modeling approach. We describe the theoretical background, the segmentation algorithm, the outline detection, and the modeling approach, and we present and discuss several actual projects.
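
    The workflow hinges on a reliable 3D segmentation of the point cloud into planar faces. The sketch below shows the general kind of step involved, using a greedy RANSAC plane fit on an N x 3 point array; the distance tolerance, iteration counts and peeling strategy are illustrative assumptions and not the authors' segmentation algorithm.

      import numpy as np

      def ransac_plane(points, dist_tol=0.05, iters=500, seed=0):
          """Boolean inlier mask of the best single plane found by RANSAC."""
          rng = np.random.default_rng(seed)
          best = np.zeros(len(points), dtype=bool)
          for _ in range(iters):
              p1, p2, p3 = points[rng.choice(len(points), size=3, replace=False)]
              n = np.cross(p2 - p1, p3 - p1)
              norm = np.linalg.norm(n)
              if norm < 1e-9:                 # degenerate (collinear) sample
                  continue
              n /= norm
              inliers = np.abs(points @ n - np.dot(n, p1)) < dist_tol
              if inliers.sum() > best.sum():
                  best = inliers
          return best

      def segment_planes(points, min_points=200, max_planes=25):
          """Greedily peel planar segments (e.g. roof faces) off the cloud."""
          remaining = np.ones(len(points), dtype=bool)
          segments = []
          for _ in range(max_planes):
              idx = np.flatnonzero(remaining)
              if idx.size < min_points:
                  break
              inliers = ransac_plane(points[idx])
              if inliers.sum() < min_points:
                  break
              segments.append(idx[inliers])
              remaining[idx[inliers]] = False
          return segments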

  6. Quantitative radiology: automated measurement of polyp volume in computed tomography colonography using Hessian matrix-based shape extraction and volume growing

    Science.gov (United States)

    Epstein, Mark L.; Obara, Piotr R.; Chen, Yisong; Liu, Junchi; Zarshenas, Amin; Makkinejad, Nazanin; Dachman, Abraham H.

    2015-01-01

    Background: Current measurement of the single longest dimension of a polyp is subjective and has variations among radiologists. Our purpose was to develop a computerized measurement of polyp volume in computed tomography colonography (CTC). Methods: We developed a 3D automated scheme for measuring polyp volume at CTC. Our scheme consisted of segmentation of the colon wall to confine polyp segmentation to the colon wall, extraction of a highly polyp-like seed region based on the Hessian matrix, a 3D volume growing technique under the minimum surface expansion criterion for segmentation of polyps, and sub-voxel refinement and surface smoothing for obtaining a smooth polyp surface. Our database consisted of 30 polyp views (15 polyps) in CTC scans from 13 patients. Each patient was scanned in the supine and prone positions. Polyp sizes measured in optical colonoscopy (OC) ranged from 6-18 mm with a mean of 10 mm. A radiologist outlined polyps in each slice and calculated volumes by summation of volumes in each slice. The measurement study was repeated 3 times at least 1 week apart to minimize a memory-effect bias. We used the mean volume of the three studies as “gold standard”. Results: Our measurement scheme yielded a mean polyp volume of 0.38 cc (range, 0.15-1.24 cc), whereas the mean “gold standard” manual volume was 0.40 cc (range, 0.15-1.08 cc). The “gold-standard” manual and computer volumetric measurements reached excellent agreement (intra-class correlation coefficient =0.80), with no statistically significant difference [P (F≤f) =0.42]. Conclusions: We developed an automated scheme for measuring polyp volume at CTC based on Hessian matrix-based shape extraction and volume growing. Polyp volumes obtained by our automated scheme agreed excellently with “gold standard” manual volumes. Our fully automated scheme can efficiently provide accurate polyp volumes for radiologists; thus, it would help radiologists improve the accuracy and efficiency of polyp volume measurement.
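
    A sketch of the Hessian-based seed extraction idea only, assuming a 3D CT sub-volume as a NumPy array: Gaussian second derivatives give the Hessian at every voxel, and voxels whose three eigenvalues are all strongly negative (bright blob/cap-like shapes) form the seed mask. The scale and cut-off are illustrative assumptions; the colon-wall segmentation, volume growing and sub-voxel refinement steps are not shown.

      import numpy as np
      from scipy import ndimage

      def hessian_eigenvalues(volume, sigma=2.0):
          """Eigenvalues (ascending) of the Gaussian-smoothed Hessian per voxel."""
          H = np.empty(volume.shape + (3, 3))
          for i in range(3):
              for j in range(3):
                  order = [0, 0, 0]
                  order[i] += 1
                  order[j] += 1
                  H[..., i, j] = ndimage.gaussian_filter(volume, sigma, order=order)
          return np.linalg.eigvalsh(H)

      def polyp_seed_mask(volume, sigma=2.0, cutoff=-0.05):
          """Voxels where all three eigenvalues are strongly negative."""
          ev = hessian_eigenvalues(volume.astype(float), sigma)
          threshold = cutoff * np.abs(volume).max()
          return np.all(ev < threshold, axis=-1)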

  7. Diagnosis Method for Analog Circuit Hard fault and Soft Fault

    Directory of Open Access Journals (Sweden)

    Baoru Han

    2013-09-01

    The traditional BP neural network suffers from slow convergence, easily falls into local minima, and can oscillate during learning. This paper introduces a hard- and soft-fault diagnosis method for tolerance analog circuits based on a BP neural network with an adaptive learning rate and an additional momentum term. First, the tolerance analog circuit is simulated with the OrCAD/PSpice circuit simulation software and the fault waveform data are extracted automatically and accurately by a MATLAB program. Second, the neural network is trained with the adaptive learning rate and momentum BP algorithm and then applied to hard- and soft-fault diagnosis of the analog circuit. With a shorter training time, high precision and global convergence, the method effectively reduces misjudged and missed faults, improving both the accuracy and the speed of fault diagnosis.
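
    A minimal sketch of the training rule referred to above: plain gradient descent extended with an additional momentum term and an adaptive learning rate that grows while the error keeps falling and shrinks (with the step rejected) when the error jumps. The forward_backward callback, the constants and the update policy are illustrative assumptions, not the paper's exact algorithm.

      import numpy as np

      def train_bp(forward_backward, w, epochs=500, lr=0.01, momentum=0.9,
                   inc=1.05, dec=0.7, max_err_ratio=1.04):
          """forward_backward(w) -> (error, gradient); returns trained weights."""
          velocity = np.zeros_like(w)
          prev_err, _ = forward_backward(w)
          for _ in range(epochs):
              err, grad = forward_backward(w)
              if err > prev_err * max_err_ratio:
                  lr *= dec               # error grew too much: shrink the step
                  velocity[:] = 0.0       # and drop the accumulated momentum
              else:
                  lr *= inc               # error stable or falling: grow the step
                  velocity = momentum * velocity - lr * grad
                  w = w + velocity
                  prev_err = err
          return w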

  8. Automated Building Extraction from High-Resolution Satellite Imagery in Urban Areas Using Structural, Contextual, and Spectral Information

    Directory of Open Access Journals (Sweden)

    Curt H. Davis

    2005-08-01

    High-resolution satellite imagery provides an important new data source for building extraction. We demonstrate an integrated strategy for identifying buildings in 1-meter resolution satellite imagery of urban areas. Buildings are extracted using structural, contextual, and spectral information. First, a series of geodesic opening and closing operations are used to build a differential morphological profile (DMP) that provides image structural information. Building hypotheses are generated and verified through shape analysis applied to the DMP. Second, shadows are extracted using the DMP to provide reliable contextual information to hypothesize the position and size of adjacent buildings. Seed building rectangles are verified and grown on a finely segmented image. Next, bright buildings are extracted using spectral information. The extraction results from the different information sources are combined after independent extraction. Performance evaluation of the building extraction on an urban test site using IKONOS satellite imagery of the City of Columbia, Missouri, is reported. With the combination of structural, contextual, and spectral information, 72.7% of the building areas are extracted with a quality percentage of 58.8%.
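
    A sketch of how a differential morphological profile (DMP) of the kind used in the first stage can be computed with scikit-image: openings and closings by reconstruction at increasing structuring-element radii, followed by differences between successive levels. The radii are illustrative; the shape analysis of the profile and the later fusion steps are not shown, and a float-valued panchromatic band is assumed as input.

      import numpy as np
      from skimage.morphology import disk, erosion, dilation, reconstruction

      def differential_morphological_profile(image, radii=(2, 4, 8, 16)):
          """Stack of |differences| between successive profile levels."""
          openings, closings = [], []
          for r in radii:
              se = disk(r)
              # opening by reconstruction: erode, then reconstruct by dilation
              openings.append(reconstruction(erosion(image, se), image,
                                             method="dilation"))
              # closing by reconstruction: dilate, then reconstruct by erosion
              closings.append(reconstruction(dilation(image, se), image,
                                             method="erosion"))
          dmp = []
          for profile in (openings, closings):
              levels = [image] + profile
              dmp += [np.abs(levels[i + 1] - levels[i]) for i in range(len(profile))]
          return np.stack(dmp)               # one band per scale step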

  9. Research on Fault Feature Extraction Method for Water-Powered Rod-less Pumping Unit

    Institute of Scientific and Technical Information of China (English)

    吕俊燕; 朱春梅

    2015-01-01

    In view of the shortcomings of common signal processing algorithms in handling early, weak and compound fault signals, a wavelet-transform-based method is put forward, focusing on the combination of the lifting wavelet transform and the LE manifold learning algorithm for fault feature extraction. The combination alleviates the poor noise robustness of LE and strengthens the advantages of manifold learning in signal processing. Verification on a fault simulation test rig for a water-powered rod-less pumping unit shows that the method can accurately extract the fault features of the pumping unit, classify and identify faults of the pumping system, and provide a basis for subsequent analysis.
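
    A sketch of the two-stage idea, assuming LE denotes Laplacian Eigenmaps as is common in this literature: wavelet-domain energy features per vibration segment, followed by a low-dimensional manifold embedding. pywt's standard DWT stands in for the lifting wavelet transform and scikit-learn's SpectralEmbedding for the LE step; both substitutions, and all settings, are assumptions.

      import numpy as np
      import pywt
      from sklearn.manifold import SpectralEmbedding

      def band_energies(segment, wavelet="db4", level=4):
          """Relative energy of each wavelet sub-band of one vibration segment."""
          coeffs = pywt.wavedec(segment, wavelet, level=level)
          e = np.array([np.sum(c ** 2) for c in coeffs])
          return e / e.sum()

      def manifold_features(segments, n_components=2, n_neighbors=10):
          """Wavelet features -> low-dimensional embedding, one row per segment."""
          X = np.vstack([band_energies(s) for s in segments])
          embedder = SpectralEmbedding(n_components=n_components,
                                       affinity="nearest_neighbors",
                                       n_neighbors=n_neighbors)
          return embedder.fit_transform(X)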

  10. Fault Management Techniques in Human Spaceflight Operations

    Science.gov (United States)

    O'Hagan, Brian; Crocker, Alan

    2006-01-01

    This paper discusses human spaceflight fault management operations. Fault detection and response capabilities available in current US human spaceflight programs Space Shuttle and International Space Station are described while emphasizing system design impacts on operational techniques and constraints. Preflight and inflight processes along with products used to anticipate, mitigate and respond to failures are introduced. Examples of operational products used to support failure responses are presented. Possible improvements in the state of the art, as well as prioritization and success criteria for their implementation are proposed. This paper describes how the architecture of a command and control system impacts operations in areas such as the required fault response times, automated vs. manual fault responses, use of workarounds, etc. The architecture includes the use of redundancy at the system and software function level, software capabilities, use of intelligent or autonomous systems, number and severity of software defects, etc. This in turn drives which Caution and Warning (C&W) events should be annunciated, C&W event classification, operator display designs, crew training, flight control team training, and procedure development. Other factors impacting operations are the complexity of a system, skills needed to understand and operate a system, and the use of commonality vs. optimized solutions for software and responses. Fault detection, annunciation, safing responses, and recovery capabilities are explored using real examples to uncover underlying philosophies and constraints. These factors directly impact operations in that the crew and flight control team need to understand what happened, why it happened, what the system is doing, and what, if any, corrective actions they need to perform. If a fault results in multiple C&W events, or if several faults occur simultaneously, the root cause(s) of the fault(s), as well as their vehicle-wide impacts, must be

  11. Development of an automated batch-type solid-liquid extraction apparatus and extraction of Zr, Hf, and Th by triisooctylamine from HCl solutions for chemistry of element 104, Rf

    Energy Technology Data Exchange (ETDEWEB)

    Kasamatsu, Yoshitaka; Kino, Aiko; Yokokita, Takuya [Osaka Univ. (Japan). Graduate School of Science; and others

    2015-07-01

    Solid-liquid extraction of the group 4 elements Zr and Hf, which are homologues of Rf (Z = 104), and Th, a pseudo homologue, by triisooctylamine (TIOA) from HCl solutions was performed by batch method. After examining the time required to reach extraction equilibrium for these elements in various concentrations of TIOA and HCl, we investigated in detail variations in the distribution coefficients (K{sub d}) with TIOA and HCl concentrations. The K{sub d} values of Zr and Hf increased with increasing the HCl and TIOA concentrations, suggesting an increase in the abundance of the anionic chloride complexes of Zr and Hf. On the other hand, the K{sub d} values of Th were low in all the HCl concentrations studied, implying that Th does not form anionic species dominantly. We developed a new automated batch-type solid-liquid extraction apparatus for repetitive experiments on transactinide elements. Using this apparatus, we performed solid-liquid extraction employing the radioactive nuclides {sup 89m}Zr and {sup 175}Hf produced by nuclear reactions and transported continuously from the nuclear reaction chamber by the He/KCl gas-jet system. It was found that the distribution behaviors in 7-11 M HCl are almost constant in the time range 10-120 s, and the K{sub d} values are consistent with those obtained in the above manual experiment. This result suggests that the chemical reactions in the extraction process reach equilibrium within 10 s for Zr and Hf under the present experimental conditions. It took about 35 s for the extraction using the apparatus. These results indicate the applicability of the present extraction using the developed apparatus to {sup 261}Rf (T{sub 1/2} = 68 s) experiments.

  12. Gearbox fault diagnosis based on hybrid feature extraction and wavelet neural network

    Institute of Scientific and Technical Information of China (English)

    鲁艳军; 陈汉新; 贺文杰; 尚云飞; 陈绪兵

    2011-01-01

    A new method of fault diagnosis for gearboxes based on hybrid feature extraction and a wavelet neural network (WNN) is proposed in this paper. Time domain analysis, wavelet packet decomposition and wavelet decomposition were applied to extract fault feature information from vibration signals collected from the gearbox. The extracted feature values were used as the feature input vector of the WNN. The scale parameters, translation parameters, weight values and threshold values in the WNN structure were optimized by the traditional back-propagation (BP) algorithm. Three gear fault modes were simulated with different crack sizes in the experiment. The effectiveness and reliability of the presented fault diagnosis method were demonstrated through identification and classification of the fault modes.

  13. Automated detection of feeding strikes by larval fish using continuous high-speed digital video: a novel method to extract quantitative data from fast, sparse kinematic events.

    Science.gov (United States)

    Shamur, Eyal; Zilka, Miri; Hassner, Tal; China, Victor; Liberzon, Alex; Holzman, Roi

    2016-06-01

    Using videography to extract quantitative data on animal movement and kinematics constitutes a major tool in biomechanics and behavioral ecology. Advanced recording technologies now enable acquisition of long video sequences encompassing sparse and unpredictable events. Although such events may be ecologically important, analysis of sparse data can be extremely time-consuming and potentially biased; data quality is often strongly dependent on the training level of the observer and subject to contamination by observer-dependent biases. These constraints often limit our ability to study animal performance and fitness. Using long videos of foraging fish larvae, we provide a framework for the automated detection of prey acquisition strikes, a behavior that is infrequent yet critical for larval survival. We compared the performance of four video descriptors and their combinations against manually identified feeding events. For our data, the best single descriptor provided a classification accuracy of 77-95% and detection accuracy of 88-98%, depending on fish species and size. Using a combination of descriptors improved the accuracy of classification by ∼2%, but did not improve detection accuracy. Our results indicate that the effort required by an expert to manually label videos can be greatly reduced to examining only the potential feeding detections in order to filter false detections. Thus, using automated descriptors reduces the amount of manual work needed to identify events of interest from weeks to hours, enabling the assembly of an unbiased large dataset of ecologically relevant behaviors.

  14. Automated diagnosis of congestive heart failure using dual tree complex wavelet transform and statistical features extracted from 2s of ECG signals.

    Science.gov (United States)

    Sudarshan, Vidya K; Acharya, U Rajendra; Oh, Shu Lih; Adam, Muhammad; Tan, Jen Hong; Chua, Chua Kuang; Chua, Kok Poo; Tan, Ru San

    2017-02-07

    Identification of alarming features in the electrocardiogram (ECG) signal is extremely significant for the prediction of congestive heart failure (CHF). ECG signal analysis carried out using computer-aided techniques can speed up the diagnosis process and aid in the proper management of CHF patients. Therefore, in this work, a dual tree complex wavelet transform (DTCWT)-based methodology is proposed for automated identification of ECG signals exhibiting CHF from normal. In the experiment, we performed a DTCWT on ECG segments of 2 s duration up to six levels to obtain the coefficients. From these DTCWT coefficients, statistical features are extracted and ranked using Bhattacharyya, entropy, minimum redundancy maximum relevance (mRMR), receiver-operating characteristic (ROC), Wilcoxon, t-test and reliefF methods. Ranked features are subjected to k-nearest neighbor (KNN) and decision tree (DT) classifiers for automated differentiation of CHF and normal ECG signals. We achieved 99.86% accuracy, 99.78% sensitivity and 99.94% specificity in the identification of CHF-affected ECG signals using 45 features. The proposed method is able to detect CHF patients accurately using only 2 s of ECG signal and hence provides sufficient time for clinicians to further investigate the severity of CHF and treatments.
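
    A sketch of the classification stage under simplifying assumptions: statistical features from wavelet coefficients of 2 s ECG segments, a t-test-based ranking (one of the several ranking criteria listed above), and a k-NN classifier. pywt's real-valued wavedec is used as a stand-in for the dual tree complex wavelet transform, and the wavelet, feature set and k are illustrative.

      import numpy as np
      import pywt
      from scipy import stats
      from sklearn.neighbors import KNeighborsClassifier

      def segment_features(segment, wavelet="db6", level=6):
          """Mean/std/skewness/kurtosis of every decomposition sub-band."""
          feats = []
          for c in pywt.wavedec(segment, wavelet, level=level):
              feats += [np.mean(c), np.std(c), stats.skew(c), stats.kurtosis(c)]
          return np.array(feats)

      def rank_by_ttest(X, y, keep=20):
          """Indices of the `keep` features with the largest |t| between classes."""
          t, _ = stats.ttest_ind(X[y == 0], X[y == 1], axis=0, equal_var=False)
          return np.argsort(-np.abs(t))[:keep]

      def train_chf_classifier(segments, labels, keep=20, k=3):
          X = np.vstack([segment_features(s) for s in segments])
          y = np.asarray(labels)                  # 0 = normal, 1 = CHF
          idx = rank_by_ttest(X, y, keep)
          clf = KNeighborsClassifier(n_neighbors=k).fit(X[:, idx], y)
          return clf, idx                         # reuse idx at prediction time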

  15. An Investigation of the Relationship Between Automated Machine Translation Evaluation Metrics and User Performance on an Information Extraction Task

    Science.gov (United States)

    2007-01-01

    Doctoral dissertation by Calandra Rilette Tate, Doctor of Philosophy, 2007; dissertation directed by Professor Eric V. Slud.

  16. A simple micro-extraction plate assay for automated LC-MS/MS analysis of human serum 25-hydroxyvitamin D levels.

    Science.gov (United States)

    Geib, Timon; Meier, Florian; Schorr, Pascal; Lammert, Frank; Stokes, Caroline S; Volmer, Dietrich A

    2015-01-01

    This short application note describes a simple and automated assay for determination of 25-hydroxyvitamin D (25(OH)D) levels in very small volumes of human serum. It utilizes commercial 96-well micro-extraction plates with commercial 25(OH)D isotope calibration and quality control kits. Separation was achieved using a pentafluorophenyl liquid chromatography column followed by multiple reaction monitoring-based quantification on an electrospray triple quadrupole mass spectrometer. Emphasis was placed on providing a simple assay that can be rapidly established in non-specialized laboratories within days, without the need for laborious and time consuming sample preparation steps, advanced calibration or data acquisition routines. The analytical figures of merit obtained from this assay compared well to established assays. To demonstrate the applicability, the assay was applied to analysis of serum samples from patients with chronic liver diseases and compared to results from a routine clinical immunoassay.

  17. RNA extracted from blood samples with a rapid automated procedure is fit for molecular diagnosis or minimal residual disease monitoring in patients with a variety of malignant blood disorders.

    Science.gov (United States)

    Bechlian, Didier; Honstettre, Amélie; Terrier, Michèle; Brest, Christelle; Malenfant, Carine; Mozziconacci, Marie-Joëlle; Chabannon, Christian

    2009-06-01

    Scientific studies in oncology, cancer diagnosis, and monitoring of tumor response to therapeutics currently rely on a growing amount of clinico-pathological information, which often includes molecular analyses. The quality of these analyses depends on both pre-analytical and analytical factors and often involves the extraction of DNA and/or RNA from human tissues and cells, where the quality and quantity of the obtained nucleic acids are of utmost importance. The use of automated techniques presents several advantages over manual techniques, such as reducing technical time and thus cost, and facilitating standardization. The purpose of this study was to validate an automated technique for RNA extraction from cells of patients treated for various malignant blood diseases. A well-established manual technique was compared to an automated technique for extracting RNA from blood samples drawn for the molecular diagnosis of a variety of leukemic diseases or for monitoring of minimal residual disease. The quality of the RNA was evaluated by real-time quantitative RT-PCR (RQ-PCR) analysis of the Abelson gene transcript. The results show that both techniques produce RNA of comparable quality and quantity, suggesting that an automated technique can be substituted for the manual reference technique used in the daily routine of a molecular pathology laboratory involved in minimal residual disease monitoring. The increased costs of reagents and disposables used for automated techniques can be compensated by a decrease in human resources.

  18. Fault detection and diagnosis for complex multivariable processes using neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Weerasinghe, M

    1998-06-01

    the complex input-output mapping performed by a network, and are in general difficult to obtain. Statistical techniques and relationships between fuzzy systems and standard radial basis function networks have been exploited to prune a trained network and to extract qualitative rules that explain the network operation for fault diagnosis. Pruning the networks improved the fault classification, while offering simple qualitative rules on process behaviour. Automation of the pruning procedure introduced flexibility and ease of application of the methods. (author)

  19. A method of extracting rotor fault features based on lifting wavelet denoising and LMD

    Institute of Scientific and Technical Information of China (English)

    陈勇; 孙虎儿; 王志武; 苏飞

    2013-01-01

    A method of extracting rotor fault features based on lifting wavelet denoising and local mean decomposition (LMD) is put forward. LMD performs well in analyzing nonlinear and nonstationary signals but is sensitive to noise. To eliminate the influence of noise on the LMD decomposition, the lifting wavelet is first applied to denoise the original signal, and the denoised signal is then decomposed by LMD. Useful PF components are selected for spectral analysis, from which the rotor fault features are extracted. A simulation test and a rotor fault feature extraction experiment demonstrate the effectiveness of the method in extracting rotor fault features.

  20. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  1. Comparison of Boiling and Robotics Automation Method in DNA Extraction for Metagenomic Sequencing of Human Oral Microbes.

    Science.gov (United States)

    Yamagishi, Junya; Sato, Yukuto; Shinozaki, Natsuko; Ye, Bin; Tsuboi, Akito; Nagasaki, Masao; Yamashita, Riu

    2016-01-01

    The rapid improvement of next-generation sequencing performance now enables us to analyze huge sample sets with more than ten thousand specimens. However, DNA extraction can still be a limiting step in such metagenomic approaches. In this study, we analyzed human oral microbes to compare the performance of three DNA extraction methods: PowerSoil (a method widely used in this field), QIAsymphony (a robotics method), and a simple boiling method. Dental plaque was initially collected from three volunteers in the pilot study and then expanded to 12 volunteers in the follow-up study. Bacterial flora was estimated by sequencing the V4 region of 16S rRNA following species-level profiling. Our results indicate that the efficiency of PowerSoil and QIAsymphony was comparable to the boiling method. Therefore, the boiling method may be a promising alternative because of its simplicity, cost effectiveness, and short handling time. Moreover, this method was reliable for estimating bacterial species and could be used in the future to examine the correlation between oral flora and health status. Despite this, differences in the efficiency of DNA extraction for various bacterial species were observed among the three methods. Based on these findings, there is no "gold standard" for DNA extraction. In future, we suggest that the DNA extraction method should be selected on a case-by-case basis considering the aims and specimens of the study.

  2. Automated Identification of the Heart Wall Throughout the Entire Cardiac Cycle Using Optimal Cardiac Phase for Extracted Features

    Science.gov (United States)

    Takahashi, Hiroki; Hasegawa, Hideyuki; Kanai, Hiroshi

    2011-07-01

    In most methods for evaluation of cardiac function based on echocardiography, the heart wall is currently identified manually by an operator. However, this task is very time-consuming and suffers from inter- and intraobserver variability. The present paper proposes a method that uses multiple features of ultrasonic echo signals for automated identification of the heart wall region throughout an entire cardiac cycle. In addition, the optimal cardiac phase to select a frame of interest, i.e., the frame for the initiation of tracking, was determined. The heart wall region at the frame of interest in this cardiac phase was identified by the expectation-maximization (EM) algorithm, and heart wall regions in the following frames were identified by tracking each point classified in the initial frame as the heart wall region using the phased tracking method. The results for two subjects indicate the feasibility of the proposed method in the longitudinal axis view of the heart.

  3. Semi-automated extraction of microbial DNA from feces for qPCR and phylogenetic microarray analysis

    NARCIS (Netherlands)

    Nylund, L.; Heilig, G.H.J.; Salminen, S.; Vos, de W.M.; Satokari, R.M.

    2010-01-01

    The human gastrointestinal tract (GI-tract) harbors a complex microbial ecosystem, largely composed of so far uncultured species, which can be detected only by using techniques such as PCR and by different hybridization techniques including phylogenetic microarrays. Manual DNA extraction from feces

  4. Parameter Extraction for PSpice Models by means of an Automated Optimization Tool – An IGBT model Study Case

    DEFF Research Database (Denmark)

    Suárez, Carlos Gómez; Reigosa, Paula Diaz; Iannuzzo, Francesco;

    2016-01-01

    An original tool for parameter extraction of PSpice models has been released, enabling a simple parameter identification. A physics-based IGBT model is used to demonstrate that the optimization tool is capable of generating a set of parameters which predicts the steady-state and switching behavio...

  5. Comparison of automated nucleic acid extraction methods for the detection of cytomegalovirus DNA in fluids and tissues

    Directory of Open Access Journals (Sweden)

    Jesse J. Waggoner

    2014-04-01

    Testing for cytomegalovirus (CMV) DNA is increasingly being used for specimen types other than plasma or whole blood. However, few studies have investigated the performance of different nucleic acid extraction protocols in such specimens. In this study, CMV extraction using the Cell-free 1000 and Pathogen Complex 400 protocols on the QIAsymphony Sample Processing (SP) system was compared using bronchoalveolar lavage fluid (BAL), tissue samples, and urine. The QIAsymphony Assay Set-up (AS) system was used to assemble reactions using artus CMV PCR reagents and amplification was carried out on the Rotor-Gene Q. Samples from 93 patients previously tested for CMV DNA and negative samples spiked with CMV AD-169 were used to evaluate assay performance. The Pathogen Complex 400 protocol yielded the following results: BAL, sensitivity 100% (33/33), specificity 87% (20/23); tissue, sensitivity 100% (25/25), specificity 100% (20/20); urine, sensitivity 100% (21/21), specificity 100% (20/20). Cell-free 1000 extraction gave comparable results for BAL and tissue; however, for urine, the sensitivity was 86% (18/21) and specimen quantitation was inaccurate. Comparative studies of different extraction protocols and DNA detection methods in body fluids and tissues are needed, as assays optimized for blood or plasma will not necessarily perform well on other specimen types.

  6. Regenerable immuno-biochip for screening ochratoxin A in green coffee extract using an automated microarray chip reader with chemiluminescence detection.

    Science.gov (United States)

    Sauceda-Friebe, Jimena C; Karsunke, Xaver Y Z; Vazac, Susanna; Biselli, Scarlett; Niessner, Reinhard; Knopp, Dietmar

    2011-03-18

    Ochratoxin A (OTA) can contaminate foodstuffs in the ppb to ppm range and once formed, it is difficult to remove. Because of its toxicity and potential risks to human health, the need exists for rapid, efficient detection methods that comply with legal maximum residual limits. In this work we have synthesized an OTA conjugate functionalized with a water-soluble peptide for covalent immobilization on a glass biochip by means of contact spotting. The chip was used for OTA determination with an indirect competitive immunoassay format with flow-through reagent addition and chemiluminescence detection, carried out with the stand-alone automated Munich Chip Reader 3 (MCR 3) platform. A buffer model and real green coffee extracts were used for this purpose. At the present, covalent conjugate immobilization allowed for at least 20 assay-regeneration cycles of the biochip surface. The total analysis time for a single sample, including measurement and surface regeneration, was 12 min and the LOQ of OTA in green coffee extract was 0.3 μg L(-1) which corresponds to 7 μg kg(-1).

  7. High quality DNA obtained with an automated DNA extraction method with 70+ year old formalin-fixed celloidin-embedded (FFCE) blocks from the indiana medical history museum.

    Science.gov (United States)

    Niland, Erin E; McGuire, Audrey; Cox, Mary H; Sandusky, George E

    2012-01-01

    DNA and RNA have been used as markers of tissue quality and integrity throughout the last few decades. In this research study, genomic-quality DNA of kidney, liver, heart, lung, spleen, and brain was analyzed in tissues from post-mortem patients and surgical cancer cases spanning the past century. DNA extraction was performed on over 180 samples from: 70+ year old formalin-fixed celloidin-embedded (FFCE) tissues; formalin-fixed paraffin-embedded (FFPE) tissue samples from surgical cases and post-mortem cases from the 1970's, 1980's, 1990's, and 2000's; tissues fixed in 10% neutral buffered formalin and stored in 70% ethanol from the 1990's; 70+ year old tissues fixed in unbuffered formalin of various concentrations; and fresh tissue as a control. To extract DNA from FFCE samples and ethanol-soaked samples, a modified standard operating procedure was used in which all tissues were homogenized, digested with a proteinase K solution for a long period of time (24-48 hours), and DNA was extracted using the Autogen Flexstar automated extraction machine. To extract DNA from FFPE, all tissues were soaked in xylene to remove the paraffin from the tissue prior to digestion, and FFPE tissues were not homogenized. The results were as follows: celloidin-embedded and paraffin-embedded tissues yielded the highest DNA concentration and greatest DNA quality, while tissues fixed in unbuffered formalin of various concentrations and long-term formalin/ethanol-stored tissues yielded the lowest DNA concentration and quality of the tissues tested. The average DNA yield for the various fixatives was: 367.77 μg/mL FFCE, 590.7 μg/mL FFPE, 53.74 μg/mL formalin-fixed/70% ethanol-stored and 33.2 μg/mL unbuffered formalin tissues. The average OD readings for FFCE, FFPE, formalin-fixed/70% ethanol-stored tissues, and tissues fixed in unbuffered formalin were 1.86, 1.87, 1.43, and 1.48 respectively. The results show that usable DNA can be extracted from tissue fixed in formalin and embedded in celloidin or paraffin.

  8. Fault diagnosis

    Science.gov (United States)

    Abbott, Kathy

    1990-01-01

    The objective of the research in this area of fault management is to develop and implement a decision aiding concept for diagnosing faults, especially faults which are difficult for pilots to identify, and to develop methods for presenting the diagnosis information to the flight crew in a timely and comprehensible manner. The requirements for the diagnosis concept were identified by interviewing pilots, analyzing actual incident and accident cases, and examining psychology literature on how humans perform diagnosis. The diagnosis decision aiding concept developed based on those requirements takes abnormal sensor readings as input, as identified by a fault monitor. Based on these abnormal sensor readings, the diagnosis concept identifies the cause or source of the fault and all components affected by the fault. This concept was implemented for diagnosis of aircraft propulsion and hydraulic subsystems in a computer program called Draphys (Diagnostic Reasoning About Physical Systems). Draphys is unique in two important ways. First, it uses models of both functional and physical relationships in the subsystems. Using both models enables the diagnostic reasoning to identify the fault propagation as the faulted system continues to operate, and to diagnose physical damage. Draphys also reasons about behavior of the faulted system over time, to eliminate possibilities as more information becomes available, and to update the system status as more components are affected by the fault. The crew interface research is examining display issues associated with presenting diagnosis information to the flight crew. One study examined issues for presenting system status information. One lesson learned from that study was that pilots found fault situations to be more complex if they involved multiple subsystems. Another was pilots could identify the faulted systems more quickly if the system status was presented in pictorial or text format. Another study is currently under way to
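
    The reasoning style described here (propagating a fault through modeled relationships and ranking candidate sources by how well they explain the abnormal sensors) can be illustrated with a toy component graph; the graph, the component names and the scoring rule below are invented purely for the example and are not Draphys or its models.

      from collections import deque

      # component -> components it feeds (toy functional-dependency model)
      FEEDS = {
          "fuel_pump":   ["engine_left"],
          "engine_left": ["hyd_pump_a", "generator_1"],
          "hyd_pump_a":  ["actuator_1", "actuator_2"],
      }
      ALL = set(FEEDS) | {c for fed in FEEDS.values() for c in fed}

      def downstream(source):
          """All components reachable from `source`, including itself."""
          seen, queue = {source}, deque([source])
          while queue:
              for nxt in FEEDS.get(queue.popleft(), []):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return seen

      def rank_candidates(abnormal):
          """Prefer sources covering the abnormal set with the least overreach."""
          abnormal = set(abnormal)
          scored = []
          for comp in ALL:
              reach = downstream(comp)
              scored.append((len(abnormal & reach), -len(reach - abnormal), comp))
          return [c for covered, _, c in sorted(scored, reverse=True) if covered]

      # hydraulic pump and both actuators abnormal -> hyd_pump_a ranked first
      print(rank_candidates({"hyd_pump_a", "actuator_1", "actuator_2"}))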

  9. Automated extraction and assessment of functional features of areal measured microstructures using a segmentation-based evaluation method

    Science.gov (United States)

    Hartmann, Wito; Loderer, Andreas

    2014-10-01

    In addition to the currently available surface parameters according to ISO 4287:2010 and ISO 25178-2:2012, which are defined particularly for stochastic surfaces, a universal evaluation procedure is provided for geometrically well-defined, microstructured surfaces. Since several million features (such as diameters, depths, etc.) are present on microstructured surfaces, segmentation techniques are used to automate the feature-based dimensional evaluation. By applying an additional extended 3D evaluation after the segmentation and classification procedure, the accuracy of the evaluation is improved compared to the direct evaluation of segments, and additional functional parameters can be derived. Advantages of the extended segmentation-based evaluation method include not only the ability to evaluate the manufacturing process statistically (e.g. by capability indices according to ISO 21747:2007 and ISO 3534-2:2013) and to derive statistically reliable values for the correction of microstructuring processes, but also the direct re-use of the evaluated parameters (including their statistical distributions) in simulations for the calculation of probabilities with respect to the functionality of the microstructured surface. The practical suitability of this method is demonstrated using examples of microstructures for the improvement of sliding and ink transfer for printing machines.
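
    A small worked example of the statistical evaluation mentioned above: process capability indices computed from the distribution of one extracted feature. The specification limits and the simulated diameter values are made up purely for illustration.

      import numpy as np

      def capability_indices(values, lsl, usl):
          """Cp and Cpk from a sample and lower/upper specification limits."""
          mu, sigma = np.mean(values), np.std(values, ddof=1)
          cp = (usl - lsl) / (6.0 * sigma)
          cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
          return cp, cpk

      rng = np.random.default_rng(1)
      diameters_um = rng.normal(loc=50.2, scale=0.8, size=500)   # simulated feature
      cp, cpk = capability_indices(diameters_um, lsl=47.0, usl=53.0)
      print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")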

  10. Warehouse automation

    OpenAIRE

    Pogačnik, Jure

    2017-01-01

    An automated high-bay warehouse is commonly used for storing a large number of materials with high throughput. In an automated warehouse, pallet movements are mainly performed by a number of automated devices such as conveyor systems, trolleys, and stacker cranes. From the introduction of the material into the automated warehouse system to its dispatch, the system requires no operator input or intervention, since all material movements are done automatically. This allows the automated warehouse to op...

  11. Monitoring and Fault Diagnosis for Batch Process Based on Feature Extract in Fisher Subspace%基于Fisher子空间特征提取的间歇过程监控和故障诊断

    Institute of Scientific and Technical Information of China (English)

    赵旭; 阎威武; 邵惠鹤

    2006-01-01

    Multivariate statistical process control methods have been widely used in the biochemical industries, and batch processes are usually monitored by multi-way principal component analysis (MPCA). In this article, a new batch process monitoring and fault diagnosis method based on feature extraction in the Fisher subspace is proposed. The feature vector and the feature direction are extracted by projecting the high-dimensional process data onto the low-dimensional Fisher space. The similarity of the feature vector between the current and the reference batch is calculated for on-line process monitoring, and the contribution plot of weights in the feature direction is calculated for fault diagnosis. The approach overcomes the need for estimating or filling in the unknown portion of the process variable trajectories from the current time to the end of the batch. Simulation results on the benchmark model of the penicillin fermentation process demonstrate that, in comparison to the MPCA method, the proposed method is more accurate and efficient for process monitoring and fault diagnosis.
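
    A sketch of the monitoring idea under stated assumptions: batch data are unfolded to one row per batch, a Fisher (linear discriminant) subspace is fitted on reference batches, and a running batch is scored by the similarity of its projected feature vector to the mean normal feature vector. scikit-learn's LinearDiscriminantAnalysis stands in for the Fisher-subspace step; the similarity measure and array shapes are illustrative.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      def fit_fisher_subspace(X_ref, y_ref, n_components=2):
          """X_ref: unfolded reference batches (batches x variables),
          y_ref: labels (e.g. 0 = normal, 1..k = known fault classes)."""
          return LinearDiscriminantAnalysis(n_components=n_components).fit(X_ref, y_ref)

      def similarity_to_normal(lda, X_ref, y_ref, x_new, normal_label=0):
          """Cosine similarity of the new batch to the mean normal batch in the
          Fisher subspace; a low value flags a possible fault."""
          f_ref = lda.transform(X_ref[y_ref == normal_label]).mean(axis=0)
          f_new = lda.transform(x_new.reshape(1, -1))[0]
          return float(np.dot(f_ref, f_new) /
                       (np.linalg.norm(f_ref) * np.linalg.norm(f_new)))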

  12. Automation of Silica Bead-based Nucleic Acid Extraction on a Centrifugal Lab-on-a-Disc Platform

    Science.gov (United States)

    Kinahan, David J.; Mangwanya, Faith; Garvey, Robert; Chung, Danielle WY; Lipinski, Artur; Julius, Lourdes AN; King, Damien; Mohammadi, Mehdi; Mishra, Rohit; Al-Ofi, May; Miyazaki, Celina; Ducrée, Jens

    2016-10-01

    We describe a centrifugal microfluidic ‘Lab-on-a-Disc’ (LoaD) technology for DNA purification towards eventual integration into a Sample-to-Answer platform for detection of the pathogen Escherichia coli O157:H7 from food samples. For this application, we use a novel microfluidic architecture which combines ‘event-triggered’ dissolvable film (DF) valves with a reaction chamber gated by a centrifugo-pneumatic siphon valve (CPSV). This architecture permits comprehensive flow control by simple changes in the speed of the platform's innate spindle motor. Even before method optimisation, characterisation by DNA fluorescence reveals an extraction efficiency of 58%, which is close to that of commercial spin columns.

  13. Cytomegalovirus DNA quantification using an automated platform for nucleic acid extraction and real-time PCR assay setup.

    Science.gov (United States)

    Forman, Michael; Wilson, Andy; Valsamakis, Alexandra

    2011-07-01

    Analytical performance characteristics of the QIAsymphony RGQ system with artus cytomegalovirus (CMV) reagents were determined. Measurable range spanned 2.0 to ≥ 7.0 log(10) copies/ml. The detection limit was 23 copies/ml. Intrarun and interrun coefficients of variation were ≤ 2.1% at 3.0 and 5.0 log(10) copies/ml. In clinical specimens, RGQ values were ~0.2 log(10) copies/ml higher than those in an assay using a BioRobot M48 extraction/manual reaction setup/7500 Real-Time PCR instrument. No cross-contamination was observed.

  14. Automated Extraction of Cranial Landmarks from Computed Tomography Data using a Combined Method of Knowledge and Pattern Based Approaches

    Directory of Open Access Journals (Sweden)

    Roshan N. RAJAPAKSE

    2016-03-01

    Accurate identification of anatomical structures from medical imaging data is a significant and critical function in the medical domain. Past studies in this context have mainly utilized two approaches: knowledge-based and learning-based methods. Further, most previously reported studies have focused on identification of landmarks from lateral X-ray Computed Tomography (CT) data, particularly in the field of orthodontics. This study, in contrast, focused on extracting cranial landmarks from large sets of cross-sectional CT slices using a combination of the two aforementioned approaches. The proposed method is centered mainly on template data sets, which were created using the actual contour patterns extracted from CT cases for each of the landmarks in consideration. Firstly, these templates were used to devise rules, which is characteristic of the knowledge-based method. Secondly, the same template sets were employed to perform template matching, related to the learning-methodologies approach. The proposed method was tested on two landmarks, the Dorsum sellae and the Pterygoid plate, using CT cases of 5 subjects. The results indicate that, out of the 10 tests, the output images were within the expected range (desired accuracy) in 7 instances and within the acceptable range (near accuracy) in 2 instances, thus verifying the effectiveness of the combined, template-set-centric approach proposed in this study.
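
    A sketch of the template-matching half of the combined approach: normalized cross-correlation of a landmark template against a single CT slice with scikit-image. Building the template sets from contour patterns and the rule-based (knowledge) screening are outside this sketch, and the score threshold is an illustrative assumption.

      import numpy as np
      from skimage.feature import match_template

      def locate_landmark(ct_slice, template, min_score=0.6):
          """(row, col) of the best match, or None if the correlation is weak."""
          response = match_template(ct_slice, template, pad_input=True)
          r, c = np.unravel_index(np.argmax(response), response.shape)
          return (r, c) if response[r, c] >= min_score else None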

  15. Intelligent Fault Diagnosis in Lead-zinc Smelting Process

    Institute of Scientific and Technical Information of China (English)

    Wei-Hua Gui; Chun-Hua Yang; Jing Teng

    2007-01-01

    According to the fault characteristics of the Imperial Smelting Process (ISP), a novel intelligent integrated fault diagnostic system is developed. In the system, fuzzy neural networks are utilized to extract fault symptoms and an expert system is employed for effective fault diagnosis of the process. Furthermore, fuzzy abductive inference is introduced to diagnose multiple faults. The feasibility of the proposed system is demonstrated through a pilot plant case study.

  16. Automatic location of short circuit faults

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [VTT Energy, Espoo (Finland); Hakola, T.; Antila, E. [ABB Power Oy (Finland); Seppaenen, M. [North-Carelian Power Company (Finland)

    1998-08-01

    In this chapter, the automatic location of short circuit faults on medium voltage distribution lines, based on the integration of computer systems of medium voltage distribution network automation is discussed. First the distribution data management systems and their interface with the substation telecontrol, or SCADA systems, is studied. Then the integration of substation telecontrol system and computerized relay protection is discussed. Finally, the implementation of the fault location system is presented and the practical experience with the system is discussed

  17. Automatic location of short circuit faults

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [VTT Energy, Espoo (Finland); Hakola, T.; Antila, E. [ABB Power Oy, Helsinki (Finland); Seppaenen, M. [North-Carelian Power Company (Finland)

    1996-12-31

    In this presentation, the automatic location of short circuit faults on medium voltage distribution lines, based on the integration of computer systems of medium voltage distribution network automation is discussed. First the distribution data management systems and their interface with the substation telecontrol, or SCADA systems, is studied. Then the integration of substation telecontrol system and computerised relay protection is discussed. Finally, the implementation of the fault location system is presented and the practical experience with the system is discussed

  18. Debug automation from pre-silicon to post-silicon

    CERN Document Server

    Dehbashi, Mehdi

    2015-01-01

    This book describes automated debugging approaches for the bugs and the faults which appear in different abstraction levels of a hardware system. The authors employ a transaction-based debug approach to systems at the transaction-level, asserting the correct relation of transactions. The automated debug approach for design bugs finds the potential fault candidates at RTL and gate-level of a circuit. Debug techniques for logic bugs and synchronization bugs are demonstrated, enabling readers to localize the most difficult bugs. Debug automation for electrical faults (delay faults) finds the potentially failing speedpaths in a circuit at gate-level. The various debug approaches described achieve high diagnosis accuracy and reduce the debugging time, shortening the IC development cycle and increasing the productivity of designers. Describes a unified framework for debug automation used at both pre-silicon and post-silicon stages; Provides approaches for debug automation of a hardware system at different levels of ...

  19. Fault tolerance and reliability in integrated ship control

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Izadi-Zamanabadi, Roozbeh; Schiøler, Henrik

    2002-01-01

    Various strategies for achieving fault tolerance in large scale control systems are discussed. The positive and negative impacts of distribution through network communication are presented. The ATOMOS framework for standardized reliable marine automation is presented along with the corresponding...

  20. miRSel: Automated extraction of associations between microRNAs and genes from the biomedical literature

    Directory of Open Access Journals (Sweden)

    Zimmer Ralf

    2010-03-01

    Background: MicroRNAs have been discovered as important regulators of gene expression. To identify the target genes of microRNAs, several databases and prediction algorithms have been developed, but only a few experimentally confirmed microRNA targets are available in databases, and many of the microRNA targets stored in databases were derived from large-scale experiments that are considered not very reliable. We propose to use text mining of publication abstracts for extracting microRNA-gene associations, including microRNA-target relations, to complement current repositories. Results: The microRNA-gene association database miRSel combines text-mining results with existing databases and computational predictions. Text mining enables the reliable extraction of microRNA, gene and protein occurrences as well as their relationships from texts. Thereby, we increased the number of human, mouse and rat miRNA-gene associations by at least three-fold as compared to, e.g., TarBase, a resource for miRNA-gene associations. Conclusions: Our database miRSel offers the currently largest collection of literature-derived miRNA-gene associations. Comprehensive collections of miRNA-gene associations are important for the development of miRNA target prediction tools and the analysis of regulatory networks. miRSel is updated daily and can be queried using a web-based interface via microRNA identifiers, gene and protein names, PubMed queries as well as gene ontology (GO) terms. miRSel is freely available online at http://services.bio.ifi.lmu.de/mirsel.
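
    A minimal sketch of the kind of extraction step involved: microRNA mentions found with a regular expression, gene mentions matched against a dictionary, and sentence-level co-occurrences recorded as candidate miRNA-gene associations. The pattern, the toy gene list and the sentence splitter are illustrative simplifications of a text-mining pipeline, not miRSel's actual rules.

      import re
      from itertools import product

      MIRNA_RE = re.compile(r"\b(?:hsa-|mmu-)?(?:miR|let)-\d+[a-z]?(?:-[35]p)?\b",
                            re.IGNORECASE)
      GENE_DICT = {"PTEN", "TP53", "VEGFA"}          # toy gene dictionary

      def mirna_gene_pairs(abstract):
          """Candidate (miRNA, gene) pairs co-occurring in one sentence."""
          pairs = set()
          for sentence in re.split(r"(?<=[.!?])\s+", abstract):
              mirnas = set(MIRNA_RE.findall(sentence))
              genes = {g for g in GENE_DICT if re.search(rf"\b{g}\b", sentence)}
              pairs.update(product(mirnas, genes))
          return pairs

      text = ("We show that miR-21 directly targets PTEN. "
              "In contrast, let-7a did not affect VEGFA expression.")
      print(mirna_gene_pairs(text))   # {('miR-21', 'PTEN'), ('let-7a', 'VEGFA')}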

  1. Planetary Gearbox Fault Diagnosis Using Envelope Manifold Demodulation

    OpenAIRE

    Weigang Wen; Gao, Robert X.; Weidong Cheng

    2016-01-01

    The important issue in planetary gear fault diagnosis is to extract the dependable fault characteristics from the noisy vibration signal of planetary gearbox. To address this critical problem, an envelope manifold demodulation method is proposed for planetary gear fault detection in the paper. This method combines complex wavelet, manifold learning, and frequency spectrogram to implement planetary gear fault characteristic extraction. The vibration signal of planetary gear is demodulated by w...

  2. A new GIS-based model for automated extraction of Sand Dune encroachment case study: Dakhla Oases, western desert of Egypt

    Directory of Open Access Journals (Sweden)

    M. Ghadiry

    2012-06-01

    Full Text Available The movements of sand dunes are considered a threat to roads, irrigation networks, water resources, urban areas, agriculture and infrastructure. The main objectives of this study are to develop a new GIS-based model for automated extraction of sand dune encroachment using remote sensing data and to assess the rate of sand dune movement. To monitor and assess the movement of sand dunes in the Dakhla oases area, multi-temporal satellite images and a GIS model developed with a Python script in ArcGIS were used. The satellite images (SPOT images, 1995 and 2007) were geo-rectified using Erdas Imagine. Image subtraction was performed using Spatial Analyst in ArcGIS; the result of the image subtraction yields the sand dune movement between the two dates. The raster and vector shapes of sand dune migration were automatically extracted using Spatial Analyst tools. The frontiers of individual dunes were measured at different dates and movement rates were analyzed in GIS. ModelBuilder in ArcGIS was used to create a user-friendly tool; the custom-built model window is easy to handle by any user who wishes to adapt the model to their work. It was found that the rate of sand dune movement ranged between 3 and 9 m per year: the majority of sand dunes had a movement rate between 0 and 6 m and very few dunes had a movement rate between 6 and 9 m. Integrating remote sensing and GIS provided the necessary information for determining the minimum, maximum, mean, rate and area of sand dune migration.
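
    The core change-detection step can be sketched outside ArcGIS as well. The following is a minimal NumPy sketch, not the authors' ModelBuilder tool; the binary sand masks and the 10 m pixel size are assumptions made for illustration.

```python
# Minimal sketch (assumption, not the authors' ArcGIS ModelBuilder tool): detecting
# sand-dune movement by subtracting two co-registered rasters, where each raster is a
# binary "sand" mask (1 = sand, 0 = background) loaded as a NumPy array.
import numpy as np

def dune_movement(sand_1995: np.ndarray, sand_2007: np.ndarray, pixel_size_m: float = 10.0):
    """Return masks of advance/retreat pixels and the encroached area in square metres."""
    advance = (sand_2007 == 1) & (sand_1995 == 0)   # pixels the dunes moved onto
    retreat = (sand_1995 == 1) & (sand_2007 == 0)   # pixels the dunes vacated
    area_advance = advance.sum() * pixel_size_m ** 2
    return advance, retreat, area_advance

# Tiny synthetic example; real inputs would come from the classified SPOT scenes.
a = np.zeros((5, 5), dtype=int); a[:, :2] = 1
b = np.zeros((5, 5), dtype=int); b[:, :3] = 1       # dune front advanced one column
_, _, area = dune_movement(a, b)
print(area)                                         # 500.0 m^2 for the assumed 10 m pixels
```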

  3. Analysis of trace contamination of phthalate esters in ultrapure water using a modified solid-phase extraction procedure and automated thermal desorption-gas chromatography/mass spectrometry.

    Science.gov (United States)

    Liu, Hsu-Chuan; Den, Walter; Chan, Shu-Fei; Kin, Kuan Tzu

    2008-04-25

    The present study aimed to develop a procedure, modified from the conventional solid-phase extraction (SPE) method, for the analysis of trace concentrations of phthalate esters in industrial ultrapure water (UPW). The proposed procedure allows a UPW sample to be drawn through a sampling tube containing a hydrophobic sorbent (Tenax TA) to concentrate the aqueous phthalate esters. The solid trap was then demoisturized by two-stage gas drying before being subjected to thermal desorption and analysis by gas chromatography-mass spectrometry. This process eliminates the solvent extraction step required by the conventional SPE method and permits automation of the analytical procedure for high-volume analyses. Several important parameters, including desorption temperature and duration, packing quantity and demoisturizing procedure, were optimized in this study based on the analytical sensitivity for a standard mixture containing five different phthalate esters. The method detection limits for the five phthalate esters were between 36 ng l(-1) and 95 ng l(-1) and recovery rates between 15% and 101%. Dioctyl phthalate (DOP) was not recovered adequately because the compound was both poorly adsorbed and poorly desorbed on and off the Tenax TA sorbent. Furthermore, analyses of material leaching from poly(vinyl chloride) (PVC) tubes as well as of the actual water samples showed that di-n-butyl phthalate (DBP) and di(2-ethylhexyl) phthalate (DEHP) were the common contaminants detected in PVC-contaminated UPW and in the actual UPW, as well as in tap water. The reduction of DEHP in the production processes of actual UPW was clearly observed; however, a DEHP concentration of 0.20 microg l(-1) at the point of use was still quantified, suggesting that the contamination of phthalate esters could present a barrier to future cleanliness requirements for UPW. The work demonstrated that the proposed modified SPE procedure provided an effective method for rapid analysis and contamination

  4. High sensitivity measurements of active oxysterols with automated filtration/filter backflush-solid phase extraction-liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Roberg-Larsen, Hanne; Strand, Martin Frank; Grimsmo, Anders; Olsen, Petter Angell; Dembinski, Jennifer L; Rise, Frode; Lundanes, Elsa; Greibrokk, Tyge; Krauss, Stefan; Wilson, Steven Ray

    2012-09-14

    Oxysterols are important in numerous biological processes, including cell signaling. Here we present an automated filtration/filter backflush-solid phase extraction-liquid chromatography-tandem mass spectrometry (AFFL-SPE-LC-MS/MS) method for determining 24-hydroxysterol and the isomers 25-hydroxycholesterol and 22S-hydroxycholesterol that enables simplified sample preparation, high sensitivity (~25 pg/mL cell lysis sample) and low sample variability. Only one sample transfer step was required for the entire process of cell lysis, derivatization and determination of selected oxysterols. During the procedure, autoxidation of cholesterol, a potential/common problem using standard analytical methods, was found to be negligible. The reversed phase AFFL-SPE-LC-MS/MS method utilizing a 1 mm inner diameter column was validated, and used to determine levels of the oxysterol analytes in mouse fibroblast cell lines SSh-LII and NIH-3T3, and human cancer cell lines, BxPC3, HCT-15 and HCT-116. In BxPC3 cells, the AFFL-SPE-LC-MS/MS method was used to detect significant differences in 24S-OHC levels between vimentin+ and vimentin- heterogeneous sub-populations. The methodology also allowed monitoring of significant alterations in 24S-OHC levels upon delivery of the Hedgehog (Hh) antagonist MS-0022 in HCT-116 colorectal carcinoma cell lines.

  5. Fault feature extraction of gearboxes based on multifractal detrended fluctuation analysis

    Institute of Scientific and Technical Information of China (English)

    林近山; 陈前

    2013-01-01

    Gearbox fault data are usually characterized by nonstationarity and multiple scaling behaviors, so detrended fluctuation analysis (DFA) often fails to uncover their underlying dynamical mechanism. Multifractal DFA (MF-DFA) is an extension of DFA that can effectively reveal the dynamical mechanism hidden in nonstationary data with multiple scaling behaviors. First, MF-DFA was used to compute the multifractal singularity spectrum of gearbox fault data. Four characteristic parameters, namely the multifractal spectrum width, the maximum singularity exponent, the minimum singularity exponent and the singularity exponent corresponding to the extremum of the multifractal spectrum, have clear physical meaning; they express the underlying dynamical mechanism of gearbox fault data and can be employed as fault features. Consequently, a novel method for feature extraction of gearbox fault data was proposed based on MF-DFA. The proposed method, together with DFA, was used to separate the normal, slight-worn, medium-worn and broken-tooth vibration data from a four-speed motorcycle gearbox. The results show that the proposed method overcomes the deficiencies of DFA, is sensitive to small changes in gearbox fault conditions, can completely separate fault patterns that are close to each other, and is a feasible method for feature extraction of gearbox fault data.
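
    The four characteristic parameters named above can be read directly off the singularity spectrum. A minimal sketch follows, assuming the spectrum arrays alpha and f(alpha) have already been produced by an MF-DFA routine that is not reproduced here.

```python
# Minimal sketch: deriving the four MF-DFA fault features named in the abstract from a
# precomputed singularity spectrum (alpha, f(alpha)); the MF-DFA computation itself is assumed.
import numpy as np

def mfdfa_features(alpha: np.ndarray, f_alpha: np.ndarray) -> dict:
    return {
        "alpha_min": float(alpha.min()),                 # minimum singularity exponent
        "alpha_max": float(alpha.max()),                 # maximum singularity exponent
        "width": float(alpha.max() - alpha.min()),       # multifractal spectrum width
        "alpha_peak": float(alpha[np.argmax(f_alpha)]),  # exponent at the spectrum extremum
    }
```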

  6. Fault feature extraction method for rolling element bearings based on LMD and MED

    Institute of Scientific and Technical Information of China (English)

    周士帅; 窦东阳; 薛斌

    2016-01-01

    The vibration signals collected from mechanical systems consist of cyclic impulse responses, deterministic components and noise. The fault features of rolling bearings are usually so weak that they are overwhelmed by these components, making fault identification difficult; rolling element faults, in particular, are harder to recognize than inner race and outer race faults. Therefore, the key problem in fault diagnosis of rolling bearings is to extract the weak fault features from a strong noisy background. In this paper, a method combining local mean decomposition (LMD) and minimum entropy deconvolution (MED) is developed to extract the features of weak fault signals masked by strong noise and strong deterministic components. First, LMD is used to adaptively decompose the original signal into a set of product functions (PFs), each being the product of an amplitude envelope signal and a frequency-modulated signal. MED is then applied to the first four PF components to amplify the fault impulse features, and finally envelope analysis is performed on the MED-processed signal. In a case study of a rolling element fault of a rolling bearing under strong background noise, the ratio of the peak at the fault characteristic frequency to the mean of all peaks within 200 Hz in the output spectrum increased by 96.4% compared with the original signal, and the signal-to-noise ratio improved by 18.3%, so the fault features were successfully extracted. The study can serve as a reference for bearing fault identification and diagnosis in strong-noise environments.
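
    The final envelope-analysis step can be sketched in a few lines. The snippet below assumes the LMD and MED stages have been carried out elsewhere and that `pf_med` is one of the first, MED-filtered PF components; the peak near the rolling-element fault frequency would then be inspected in the returned spectrum.

```python
# Minimal sketch of the envelope-spectrum step applied to an MED-filtered PF component.
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(pf_med: np.ndarray, fs: float):
    envelope = np.abs(hilbert(pf_med))          # Hilbert envelope of the filtered PF
    envelope -= envelope.mean()                 # remove DC so low-frequency peaks stand out
    spectrum = np.abs(np.fft.rfft(envelope)) / len(envelope)
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return freqs, spectrum
```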

  7. Development and validation of an automated liquid-liquid extraction GC/MS method for the determination of THC, 11-OH-THC, and free THC-carboxylic acid (THC-COOH) from blood serum.

    Science.gov (United States)

    Purschke, Kirsten; Heinl, Sonja; Lerch, Oliver; Erdmann, Freidoon; Veit, Florian

    2016-06-01

    The analysis of Δ(9)-tetrahydrocannabinol (THC) and its metabolites 11-hydroxy-Δ(9)-tetrahydrocannabinol (11-OH-THC) and 11-nor-9-carboxy-Δ(9)-tetrahydrocannabinol (THC-COOH) from blood serum is a routine task in forensic toxicology laboratories. For the examination of consumption habits, the concentration of the phase I metabolite THC-COOH is used. Recommendations for the interpretation of analysis values in medical-psychological assessments (regranting of driver's licenses, Germany) include threshold values for the free, unconjugated THC-COOH. Using a fully automated two-step liquid-liquid extraction, THC, 11-OH-THC, and free, unconjugated THC-COOH were extracted from blood serum, silylated with N-methyl-N-(trimethylsilyl) trifluoroacetamide (MSTFA), and analyzed by GC/MS. The automation was carried out by an x-y-z sample robot equipped with modules for shaking, centrifugation, and solvent evaporation. This method was based on a previously developed manual sample preparation method. Validation guidelines of the Society of Toxicological and Forensic Chemistry (GTFCh) were fulfilled for both methods, with the focus of this article being the automated one. Limits of detection and quantification were 0.3 and 0.6 μg/L for THC, 0.1 and 0.8 μg/L for 11-OH-THC, and 0.3 and 1.1 μg/L for THC-COOH, when extracting only 0.5 mL of blood serum. Therefore, the required limit of quantification for THC of 1 μg/L in driving under the influence of cannabis cases in Germany (and other countries) can be reached and the method can be employed in that context. Real and external control samples were analyzed, and a round robin test was passed successfully. To date, the method is employed in daily routine at the Institute of Legal Medicine in Giessen, Germany. Automation helps to avoid errors during sample preparation and reduces the workload of the laboratory personnel. Due to its flexibility, the analysis system can be employed for other liquid-liquid extractions as

  8. Accounting Automation

    OpenAIRE

    Laynebaril1

    2017-01-01

    Accounting Automation. "Accounting Automation" Please respond to the following: Imagine you are a consultant hired to convert a manual accounting system to an automated system. Suggest the key advantages and disadvantages of automating a manual accounting system. Identify the most important step in the conversion process. Provide a rationale for your response. ...

  9. EXTRACT

    DEFF Research Database (Denmark)

    Pafilis, Evangelos; Buttigieg, Pier Luigi; Ferrell, Barbra

    2016-01-01

    The microbial and molecular ecology research communities have made substantial progress on developing standards for annotating samples with environment metadata. However, sample manual annotation is a highly labor intensive process and requires familiarity with the terminologies used. We have the...... and text-mining-assisted curation revealed that EXTRACT speeds up annotation by 15-25% and helps curators to detect terms that would otherwise have been missed.Database URL: https://extract.hcmr.gr/....

  10. Concepts and Methods in Fault-tolerant Control

    DEFF Research Database (Denmark)

    Blanke, Mogens; Staroswiecki, M.; Wu, N.E.

    2001-01-01

    Faults in automated processes will often cause undesired reactions and shut-down of a controlled plant, and the consequences could be damage to technical parts of the plant, to personnel or the environment. Fault-tolerant control combines diagnosis with control methods to handle faults in an intelligent way. The aim is to prevent simple faults from developing into serious failures and hence increase plant availability and reduce the risk of safety hazards. Fault-tolerant control merges several disciplines into a common framework to achieve these goals. The desired features are obtained through on...

  11. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that point to the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of the systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  12. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating a house's electricity and providing a security system that detects the presence of unexpected behavior.

  13. Cooperative Human-Machine Fault Diagnosis

    Science.gov (United States)

    Remington, Roger; Palmer, Everett

    1987-02-01

    Current expert system technology does not permit complete automatic fault diagnosis; significant levels of human intervention are still required. This requirement dictates a need for a division of labor that recognizes the strengths and weaknesses of both human and machine diagnostic skills. Relevant findings from the literature on human cognition are combined with the results of reviews of aircrew performance with highly automated systems to suggest how the interface of a fault diagnostic expert system can be designed to assist human operators in verifying machine diagnoses and guiding interactive fault diagnosis. It is argued that the needs of the human operator should play an important role in the design of the knowledge base.

  14. Fault Current Characteristics of the DFIG under Asymmetrical Fault Conditions

    Directory of Open Access Journals (Sweden)

    Fan Xiao

    2015-09-01

    Full Text Available During non-severe fault conditions, crowbar protection is not activated and the rotor windings of a doubly-fed induction generator (DFIG) are excited by the AC/DC/AC converter. Meanwhile, under asymmetrical fault conditions, the electrical variables oscillate at twice the grid frequency in the synchronous dq frame. In engineering practice, notch filters are usually used to extract the positive and negative sequence components. In these cases, the dynamic response of the rotor-side converter (RSC) and the notch filters have a large influence on the fault current characteristics of the DFIG. In this paper, the influence of the notch filters on the proportional-integral (PI) parameters is discussed and simplified calculation models of the rotor current are established. Then, the dynamic performance of the stator flux linkage under asymmetrical fault conditions is also analyzed. Based on this, the fault characteristics of the stator current under asymmetrical fault conditions are studied and the corresponding analytical expressions of the stator fault current are obtained. Finally, digital simulation results validate the analytical results. The research results help meet the requirements of practical short-circuit calculation and of constructing relaying protection systems for power grids with DFIG penetration.
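
    As an illustration of the sequence-extraction idea mentioned above, the sketch below uses SciPy's standard notch filter as a stand-in for the filters discussed in the paper; the grid frequency and controller sampling rate are assumptions, and the paper's RSC model is not reproduced.

```python
# Minimal sketch (assumption, not the paper's model): a notch filter at twice the grid
# frequency, as commonly used to strip the double-frequency oscillation from dq-frame
# quantities and recover the positive-sequence component under asymmetrical faults.
import numpy as np
from scipy.signal import iirnotch, filtfilt

f_grid = 50.0                                   # grid frequency in Hz (assumed)
fs = 10_000.0                                   # controller sampling rate in Hz (assumed)
b, a = iirnotch(w0=2 * f_grid, Q=10.0, fs=fs)   # notch centred at 100 Hz

def positive_sequence_dq(signal_dq: np.ndarray) -> np.ndarray:
    """Remove the 2*f_grid oscillation from a dq-frame signal (zero-phase filtering)."""
    return filtfilt(b, a, signal_dq)
```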

  15. Automated 96-well solid phase extraction and hydrophilic interaction liquid chromatography-tandem mass spectrometric method for the analysis of cetirizine (ZYRTEC) in human plasma--with emphasis on method ruggedness.

    Science.gov (United States)

    Song, Qi; Junga, Heiko; Tang, Yong; Li, Austin C; Addison, Tom; McCort-Tipton, Melanie; Beato, Brian; Naidong, Weng

    2005-01-05

    A high-throughput bioanalytical method based on automated sample transfer, automated solid phase extraction, and hydrophilic interaction liquid chromatography-tandem mass spectrometry (HILIC-MS/MS) analysis, has been developed for the determination of cetirizine, a selective H(1)-receptor antagonist. Deuterated cetirizine (cetirizine-d(8)) was synthesized as described and was used as the internal standard. Samples were transferred into 96-well plates using an automated sample handling system. Automated solid phase extraction was carried out using a 96-channel programmable liquid-handling workstation. Solid phase extraction 96-well plate on polymer sorbent (Strata X) was used to extract the analyte. The extracted samples were injected onto a Betasil silica column (50 x 3, 5 microm) using a mobile phase of acetonitrile-water-acetic acid-trifluroacetic acid (93:7:1:0.025, v/v/v/v) at a flow rate of 0.5 ml/min. The chromatographic run time is 2.0 min per injection, with retention time of cetirizine and cetirizine-d(8) both at 1.1 min. The system consisted of a Shimadzu HPLC system and a PE Sciex API 3000 or API 4000 tandem mass spectrometer with (+) ESI. The method has been validated over the concentration range of 1.00-1000 ng/ml cetirizine in human plasma, based on a 0.10-ml sample size. The inter-day precision and accuracy of the quality control (QC) samples demonstrated <3.0% relative standard deviation (R.S.D.) and <6.0% relative error (RE). Stability of cetirizine in stock solution, in plasma, and in reconstitution solution was established. The absolute extraction recovery was 85.8%, 84.5%, and 88.0% at 3, 40, and 800 ng/ml, respectively. The recovery for the internal standard was 84.1%. No adverse matrix effects were noticed for this assay. The automation of the sample preparation steps not only increased the analysis throughput, but also increased method ruggedness. The use of a stable isotope-labeled internal standard further improved the method ruggedness

  16. Faults Discovery By Using Mined Data

    Science.gov (United States)

    Lee, Charles

    2005-01-01

    Fault discovery in complex systems consists of model-based reasoning, fault tree analysis, rule-based inference methods, and other approaches. Model-based reasoning builds models for the systems either from mathematical formulations or from experimental models. Fault tree analysis shows the possible causes of a system malfunction by enumerating the suspect components and their respective failure modes that may have induced the problem. Rule-based inference builds the model from expert knowledge. These models and methods have one thing in common: they presume some prior conditions. Complex systems often use fault trees to analyze faults. Fault diagnosis, when an error occurs, is performed by engineers and analysts through extensive examination of all data gathered during the mission. The International Space Station (ISS) control center operates on data fed back from the system, and decisions are made based on threshold values using fault trees. Since these decision-making tasks are safety critical and must be done promptly, the engineers who manually analyze the data face a time challenge. To automate this process, this paper presents an approach that uses decision trees to discover faults from data in real time and captures the contents of fault trees as the initial state of the trees.
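
    The idea of learning threshold-style rules from telemetry can be sketched with a standard decision-tree learner. The snippet below is a minimal illustration, not the ISS tooling; the feature names and the synthetic data are hypothetical.

```python
# Minimal sketch: learning a decision tree from historical telemetry so that
# threshold-style fault rules are discovered from data rather than hand-built.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                   # e.g. temperature, pressure, current
y = (X[:, 0] > 1.0).astype(int)                 # synthetic "fault" label for illustration

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["temp", "press", "curr"]))  # human-readable rules
```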

  17. Improving the software fault localization process through testability information

    OpenAIRE

    Gonzalez-Sanchez, A.; Abreu, R.; Gross, H; Van Gemund, A.

    2010-01-01

    When failures occur during software testing, automated software fault localization helps to diagnose their root causes and identify the defective components of a program to support debugging. Diagnosis is carried out by selecting test cases in such way that their pass or fail information will narrow down the set of fault candidates, and, eventually, pinpoint the root cause. An essential ingredient of effective and efficient fault localization is the knowledge about the intermittency of occurr...

  18. Bearing fault detection using motor current signal analysis based on wavelet packet decomposition and Hilbert envelope

    Directory of Open Access Journals (Sweden)

    Imaouchen Yacine

    2015-01-01

    Full Text Available To detect rolling element bearing defects, much research has focused on Motor Current Signal Analysis (MCSA) using spectral analysis and wavelet transforms. This paper presents a new approach for rolling element bearing diagnosis without slip estimation, based on wavelet packet decomposition (WPD) and the Hilbert transform. Specifically, the Hilbert transform first extracts the envelope of the motor current signal, which contains the bearing fault-related frequency information. Subsequently, the envelope signal is adaptively decomposed into a number of frequency bands by the WPD algorithm. Two criteria based on energy and correlation analyses have been investigated to automate the frequency band selection. Experimental studies have confirmed that the proposed approach is effective in diagnosing rolling element bearing faults for improved induction motor condition monitoring and damage assessment.
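
    The two signal-processing stages (envelope extraction, then wavelet packet banding) can be sketched as follows. This is a minimal illustration of the general technique, not the paper's implementation; the wavelet, decomposition level and the energy criterion shown are assumptions.

```python
# Minimal sketch: Hilbert envelope of the stator current followed by a wavelet packet
# decomposition whose per-band energies can guide frequency-band selection.
import numpy as np
import pywt
from scipy.signal import hilbert

def envelope_band_energies(current: np.ndarray, wavelet: str = "db4", level: int = 3):
    envelope = np.abs(hilbert(current))                    # demodulate the stator current
    wp = pywt.WaveletPacket(envelope, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="freq")              # sub-bands in frequency order
    energies = np.array([np.sum(np.square(n.data)) for n in nodes])
    return energies / energies.sum()                       # normalised band energies
```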

  19. Feature extraction and fusion for thruster faults of an AUV with random disturbance

    Institute of Scientific and Technical Information of China (English)

    张铭钧; 殷宝吉; 刘维新; 王玉甲

    2015-01-01

    The correctness of fault diagnosis results for the thrusters of an autonomous underwater vehicle (AUV) is frequently affected by random disturbances caused by the internal noise of underwater sensors. To reduce this influence, two feature extraction methods, which extract fault features from the wavelet approximate component of the longitudinal velocity and from the changing rate of the control voltage, and a feature fusion method with normalization were proposed. The wavelet approximate component was obtained by wavelet reconstruction of the scale coefficients from the wavelet decomposition of the longitudinal velocity, and the changing rate was acquired by differentiating the control voltage. Two kinds of fault features were extracted separately from the wavelet approximate component and from the changing rate of the control voltage, based on a modified Bayes classification algorithm. The two kinds of fault features were then fused based on evidence theory, and the fusion result was normalized. The effectiveness of the proposed methods was verified by pool experiments with an AUV prototype.
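
    The two raw feature signals described above can be sketched briefly. This is a minimal illustration under assumed wavelet settings; the modified Bayes classification and the evidence-theory fusion steps are not reproduced here.

```python
# Minimal sketch: the wavelet approximate component of the longitudinal velocity and the
# changing rate of the control voltage, the two signals from which fault features are drawn.
import numpy as np
import pywt

def wavelet_approximation(velocity: np.ndarray, wavelet: str = "db4", level: int = 3):
    coeffs = pywt.wavedec(velocity, wavelet, level=level)
    coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]   # keep only the scale coefficients
    return pywt.waverec(coeffs, wavelet)[: len(velocity)]

def control_voltage_rate(voltage: np.ndarray, dt: float):
    return np.gradient(voltage, dt)                                 # changing rate of the control voltage
```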

  20. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of realworld process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism...

  1. Nonlinear RAT Feature Extraction and Application in Fault Detection

    Institute of Scientific and Technical Information of China (English)

    赵竞雄; 王晓菊

    2014-01-01

    The average mutual information algorithm and the false nearest neighbors algorithm are used to calculate the optimal parameters for phase space reconstruction. On the basis of the recurrence plot, the ratio of the recurrence rate to the determinism, called RAT in this paper, is proposed as a new nonlinear recurrence feature, and its algorithm is described in detail. Fault diagnosis experiments were carried out on three typical turbine engine faults involving the gas compressor, the fuel supply system and the combustion chamber. Simulation results show that the RAT feature can effectively realize clustering and diagnosis of engine faults for the three fault classes, with a fault diagnosis accuracy of 95.7%, demonstrating superior diagnostic performance and strong engineering value.
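
    A minimal sketch of the RAT quantity follows, assuming the embedding dimension and delay have already been chosen with the average-mutual-information and false-nearest-neighbour procedures. The threshold, minimum line length and the simple O(n^2) recurrence matrix are choices of this sketch, and the line of identity is not excluded for brevity.

```python
# Minimal sketch of the RAT feature: recurrence rate divided by determinism of a recurrence plot.
import numpy as np

def rat_feature(x: np.ndarray, dim: int = 3, tau: int = 2, eps: float = 0.2, lmin: int = 2) -> float:
    # phase-space reconstruction (suitable for short records only)
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])
    # recurrence matrix: points closer than eps are recurrent
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    rp = (dists < eps).astype(int)
    rr = rp.mean()                                        # recurrence rate
    # determinism: fraction of recurrence points on diagonal lines of length >= lmin
    # (a production implementation would exclude the main diagonal)
    diag_points, all_points = 0, rp.sum()
    for k in range(-(n - 1), n):
        run = 0
        for v in list(np.diagonal(rp, offset=k)) + [0]:   # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    det = diag_points / all_points if all_points else 0.0
    return rr / det if det else np.inf                    # RAT = RR / DET
```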

  2. Automated Contingency Management for Advanced Propulsion Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Automated Contingency Management (ACM), or the ability to confidently and autonomously adapt to fault conditions with the goal of still achieving mission objectives,...

  3. Data Driven Fault Tolerant Control: A Subspace Approach

    NARCIS (Netherlands)

    Dong, J.

    2009-01-01

    The main stream research on fault detection and fault tolerant control has been focused on model based methods. As far as a model is concerned, changes therein due to faults have to be extracted from measured data. Generally speaking, existing approaches process measured inputs and outputs either by

  4. Automated diagnosis of rolling bearings using MRA and neural networks

    Science.gov (United States)

    Castejón, C.; Lara, O.; García-Prada, J. C.

    2010-01-01

    Any industry needs an efficient predictive plan in order to optimize the management of resources and improve the economy of the plant by reducing unnecessary costs and increasing the level of safety. A great percentage of breakdowns in productive processes are caused by bearings. They begin to deteriorate from early stages of their functional life, also called the incipient level. This manuscript develops an automated diagnosis of rolling bearings based on the analysis and classification of signature vibrations. The novelty of this work is the application of the methodology proposed for data collected from a quasi-real industrial machine, where rolling bearings support the radial and axial loads the bearings are designed for. Multiresolution analysis (MRA) is used in a first stage in order to extract the most interesting features from signals. Features will be used in a second stage as inputs of a supervised neural network (NN) for classification purposes. Experimental results carried out in a real system show the soundness of the method which detects four bearing conditions (normal, inner race fault, outer race fault and ball fault) in a very incipient stage.
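
    The pipeline described above can be sketched compactly. The snippet below is a minimal illustration, not the authors' implementation: discrete wavelet log-energies stand in for the MRA features, an MLP classifier for the supervised neural network, and the vibration records and labels are placeholders.

```python
# Minimal sketch: multiresolution (wavelet) band features feeding a small neural network
# that classifies the four bearing conditions (normal, inner race, outer race, ball).
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def mra_features(signal: np.ndarray, wavelet: str = "db6", level: int = 4) -> np.ndarray:
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.log(np.sum(np.square(c)) + 1e-12) for c in coeffs])   # log-energy per band

rng = np.random.default_rng(0)
signals = rng.normal(size=(80, 1024))             # placeholder vibration records
labels = rng.integers(0, 4, size=80)              # placeholder condition labels
X = np.array([mra_features(s) for s in signals])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, labels)
```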

  5. Faint Fault Feature Extraction of Hydraulic Pump Based on Adaptive EEMD-Enhancement Factor

    Institute of Scientific and Technical Information of China (English)

    王余奎; 李洪儒; 许葆华

    2014-01-01

    To address the difficulty of extracting the faint fault features of swash-plate axial piston pumps, a feature extraction method based on adaptive EEMD and an enhancement factor is presented. The fault signal is decomposed into a group of intrinsic mode functions (IMFs) with ensemble empirical mode decomposition (EEMD), and a synthetic signal is formed by using the enhancement factor of each IMF as its weight, so as to highlight the fault characteristics and suppress unrelated components. The synthetic signal is decomposed again with EEMD, and a reconstructed signal is built from the IMFs that best represent the fault information, selected according to their sensitivity factor. The envelope spectrum of the reconstructed signal is obtained by applying the Hilbert transform, and the actual fault is diagnosed by analyzing this envelope spectrum. The validity and superiority of the method are demonstrated by the analysis of a simulated signal and measured data from a hydraulic pump.

  6. Fault feature extraction of automotive engine camshaft bearing loosening

    Institute of Scientific and Technical Information of China (English)

    喻菲菲; 杜灿谊

    2015-01-01

    A camshaft bearing loosening fault in an automotive engine causes impact forces due to uneven stress on the camshaft, so abnormal vibration and noise appear on the engine body surface near the camshaft. In a test of an engine with a camshaft bearing loosening fault, the vibration acceleration signal of the cylinder head was collected and analyzed in the time domain, where periodic shocks could be seen. Wavelet packet decomposition was then used to decompose the signal, and demodulation analysis was applied to the higher frequency bands; the fault excitation frequency appeared very clearly in the demodulation spectrum. The analysis results show that a camshaft bearing loosening fault produces mid-to-high-frequency modulation, and the corresponding fault features can be extracted by combining time domain analysis, frequency spectrum analysis, and wavelet packet decomposition with demodulation analysis.

  7. Interactive Fault Localization Using Test Information

    Institute of Scientific and Technical Information of China (English)

    Dan Hao; Lu Zhang; Tao Xie; Hong Mei; Jia-Su Sun

    2009-01-01

    Debugging is a time-consuming task in software development. Although various automated approaches have been proposed, they are not effective enough. On the other hand, in manual debugging, developers have difficulty in choosing breakpoints. To address these problems and help developers locate faults effectively, we propose an interactive fault-localization framework, combining the benefits of automated approaches and manual debugging. Before the fault is found, this framework continuously recommends checking points based on statements' suspicions, which are calculated according to the execution information of test cases and the feedback information from the developer at earlier checking points. We then propose a naive approach, which is an initial implementation of this framework. However, with this naive approach or with manual debugging, developers' wrong estimation of whether the faulty statement is executed before the checking point (breakpoint) may make the debugging process fail. So we propose another, robust approach based on this framework, handling cases where developers make mistakes during the fault-localization process. We performed two experimental studies and the results show that the two interactive approaches are quite effective compared with existing fault-localization approaches. Moreover, the robust approach can help developers find faults when they make wrong estimations at some checking points.

  8. Machine Fault Signature Analysis

    Directory of Open Access Journals (Sweden)

    Pratesh Jayaswal

    2008-01-01

    Full Text Available The objective of this paper is to present recent developments in the field of machine fault signature analysis, with particular regard to vibration analysis. The different types of faults that can be identified from vibration signature analysis include, for example, gear faults, rolling contact bearing faults, journal bearing faults, flexible coupling faults, and electrical machine faults. It is not the intention of the authors to provide detailed coverage of all of these faults; detailed consideration is instead given to rolling element bearing fault signature analysis.

  9. Radar Determination of Fault Slip and Location in Partially Decorrelated Images

    Science.gov (United States)

    Parker, Jay; Glasscoe, Margaret; Donnellan, Andrea; Stough, Timothy; Pierce, Marlon; Wang, Jun

    2016-09-01

    Faced with the challenge of thousands of frames of radar interferometric images, automated feature extraction promises to spur data understanding and highlight geophysically active land regions for further study. We have developed techniques for automatically determining surface fault slip and location using deformation images from the NASA Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR), which is similar to satellite-based SAR but has more mission flexibility and higher resolution (pixels are approximately 7 m). This radar interferometry provides a highly sensitive method, clearly indicating faults slipping at levels of 10 mm or less. But interferometric images are subject to decorrelation between revisit times, creating spots of bad data in the image. Our method begins with freely available data products from the UAVSAR mission, chiefly unwrapped interferograms, coherence images, and flight metadata. The computer vision techniques we use assume no data gaps or holes; so a preliminary step detects and removes spots of bad data and fills these holes by interpolation and blurring. Detected and partially validated surface fractures from earthquake main shocks, aftershocks, and aseismic-induced slip are shown for faults in California, including El Mayor-Cucapah (M7.2, 2010), the Ocotillo aftershock (M5.7, 2010), and South Napa (M6.0, 2014). Aseismic slip is detected on the San Andreas Fault from the El Mayor-Cucapah earthquake, in regions of highly patterned partial decorrelation. Validation is performed by comparing slip estimates from two interferograms with published ground truth measurements.

  10. An Integrated Framework of Drivetrain Degradation Assessment and Fault Localization for Offshore Wind Turbines

    Directory of Open Access Journals (Sweden)

    Jay Lee

    2013-01-01

    Full Text Available As wind energy proliferates in onshore and offshore applications, it has become significantly important to predict wind turbine downtime and maintain operational uptime to ensure maximal yield. Two types of data systems have been widely adopted for monitoring turbine health condition: supervisory control and data acquisition (SCADA) and condition monitoring systems (CMS). Given that research and development have focused on advancing analytical techniques based on these systems independently, an intelligent model that associates information from both systems is necessary and beneficial. In this paper, a systematic framework is designed to integrate CMS and SCADA data and assess drivetrain degradation over its lifecycle. Information referencing and advanced feature extraction techniques are employed to procure heterogeneous health indicators. A pattern recognition algorithm is used to model baseline behavior and measure the deviation of current behavior, where a Self-Organizing Map (SOM) and minimum quantization error (MQE) method is selected to achieve degradation assessment. Eventually, the computation and ranking of component contributions to the detected degradation offers component-level fault localization. When validated and automated by various applications, the approach is able to incorporate diverse data resources and output actionable information to advise predictive maintenance with precise fault information. The approach is validated on a 3 MW offshore turbine, where an incipient fault is detected well before the existing system shuts down the unit. A radar chart is used to illustrate the fault localization result.
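
    The SOM/MQE degradation-assessment idea can be sketched briefly. The snippet below uses the third-party `minisom` package as a stand-in for the authors' implementation; the grid size, training iterations and feature matrices are assumptions.

```python
# Minimal sketch: fit a Self-Organizing Map on healthy-condition features, then use the
# minimum quantization error (MQE) of new samples as a degradation indicator.
import numpy as np
from minisom import MiniSom

def fit_baseline_som(healthy_features: np.ndarray, grid: int = 8, iters: int = 5000) -> MiniSom:
    som = MiniSom(grid, grid, healthy_features.shape[1], sigma=1.0, learning_rate=0.5, random_seed=0)
    som.train_random(healthy_features, iters)      # model baseline (healthy) behaviour
    return som

def mqe(som: MiniSom, sample: np.ndarray) -> float:
    """Distance of a new sample to its best-matching unit; larger MQE = more degradation."""
    bmu = som.winner(sample)
    return float(np.linalg.norm(sample - som.get_weights()[bmu]))
```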

  11. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V.; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual effort in library routines and supports collection, storage, administration, processing, preservation and communication.

  12. Bearing Fault Diagnosis Using a Novel Classifier Ensemble Based on Lifting Wavelet Packet Transforms and Sample Entropy

    Directory of Open Access Journals (Sweden)

    Lei Zhang

    2016-01-01

    Full Text Available In order to improve fault detection accuracy for rolling bearings, an automated fault diagnosis system is presented based on the lifting wavelet packet transform (LWPT), sample entropy (SampEn), and a classifier ensemble. Bearing vibration signals are firstly decomposed into different frequency subbands through a three-level LWPT, resulting in a total of 8 frequency-band signals at the third layer of the LWPT decomposition tree. The SampEns of all 8 components are then calculated as feature vectors. Such a feature extraction paradigm is expected to depict the complexity, irregularity, and nonstationarity of bearing vibrations. Moreover, a novel classifier ensemble is proposed to alleviate the effect of initial parameters on the performance of member classifiers and to improve classification effectiveness. Experiments were conducted on electric motor bearings considering various sets of fault categories and fault severity levels. Experimental results demonstrate that the proposed diagnosis system can effectively improve bearing fault recognition accuracy and stability in comparison with diagnosis methods based on a single classifier.
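
    The feature-extraction stage can be sketched as follows. An ordinary wavelet packet transform stands in for the lifting implementation, and the sample-entropy parameters are assumptions; the classifier ensemble itself is not shown.

```python
# Minimal sketch: three-level wavelet packet transform -> 8 sub-band signals -> sample
# entropy of each sub-band as an 8-dimensional feature vector.
import numpy as np
import pywt

def sample_entropy(x: np.ndarray, m: int = 2, r_factor: float = 0.2) -> float:
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    def count(mm):
        templates = np.array([x[i : i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=-1)
        return (np.sum(d <= r) - len(templates)) / 2.0      # matching pairs, excluding self-matches
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def wpt_sampen_features(signal: np.ndarray, wavelet: str = "db4", level: int = 3) -> np.ndarray:
    wp = pywt.WaveletPacket(signal, wavelet=wavelet, maxlevel=level)
    return np.array([sample_entropy(node.data) for node in wp.get_level(level, order="freq")])
```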

  13. A System for Automated Extraction of Astronomical English Terms

    Institute of Scientific and Technical Information of China (English)

    余恒; 崔辰州; 张晖

    2015-01-01

    Standardized Chinese translations of scientific terms are important for scientific research as well as for science communication. Identifying new English terms in time is a basic requirement for standardized translation. In this paper we introduce a system designed for automated extraction of astronomical English terms from scientific publications. The system combines several techniques, e.g., script filtering, automatic term recognition, and regular-expression matching. It can automatically trace updates of the arXiv paper database, analyze the contents of papers, and generate lists of candidate new terms. By using the system, the China National Committee for Terms in Sciences and Technologies can focus on deciding the Chinese translations of terms instead of spending time on term collection. We expect the system to contribute substantially to the standardization of Chinese translations of astronomical English terms in the near future and to promote other standardization activities in astronomy.

  14. Automation or De-automation

    Science.gov (United States)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  15. Bearing initial fault feature extraction via sparse representation based on dictionary learning

    Institute of Scientific and Technical Information of China (English)

    余发军; 周凤星; 严保康

    2016-01-01

    When an initial fault occurs in the rolling bearings of low-speed, heavy-duty machinery, the impulse component that reflects the fault feature in the vibration signal is difficult to extract, because it is relatively weak and easily corrupted by strong background noise. Sparse representation can be used to extract the impulse component from a vibration signal, but it is difficult to construct an accurate dictionary that matches the impulse component, since the operating conditions of the bearing are not stable. Hence, a method for extracting the initial fault feature based on dictionary learning is proposed. First, an adaptive dictionary is obtained with an improved K-SVD dictionary-learning algorithm. Then, the Orthogonal Matching Pursuit (OMP) algorithm is used for sparse decomposition of the vibration signal, and the kurtosis of the approximation signal is calculated at each iteration. Finally, the approximation signal with the maximal kurtosis is reconstructed and analyzed with the envelope spectrum to diagnose the fault type. Test results on simulated data and bearing vibration signals demonstrate that the proposed method extracts the feature component more accurately than other methods and meets the demands of real-time bearing condition monitoring.
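
    The dictionary-learning-plus-OMP idea can be sketched with off-the-shelf tools. In the snippet below, scikit-learn's MiniBatchDictionaryLearning stands in for the K-SVD step and its OMP transform for the sparse decomposition; the patch length and sparsity level are assumptions, and kurtosis is used only as a rough indicator of how well the impulses were captured.

```python
# Minimal sketch: learn an adaptive dictionary from signal patches, sparse-code them with
# OMP, and score the sparse reconstruction by its kurtosis (impulsiveness).
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import MiniBatchDictionaryLearning

def sparse_impulse_reconstruction(signal: np.ndarray, patch_len: int = 64, n_atoms: int = 32):
    # slice the 1-D vibration signal into non-overlapping patches (training samples)
    n_patches = len(signal) // patch_len
    patches = signal[: n_patches * patch_len].reshape(n_patches, patch_len)
    dico = MiniBatchDictionaryLearning(n_components=n_atoms, transform_algorithm="omp",
                                       transform_n_nonzero_coefs=5, random_state=0)
    code = dico.fit(patches).transform(patches)            # learn dictionary, then sparse-code
    recon = (code @ dico.components_).reshape(-1)          # sparse approximation of the signal
    return recon, float(kurtosis(recon))                   # higher kurtosis -> stronger impulses
```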

  16. Application of Fault Tree Analysis and Fuzzy Neural Networks to Fault Diagnosis in the Internet of Things (IoT for Aquaculture

    Directory of Open Access Journals (Sweden)

    Yingyi Chen

    2017-01-01

    Full Text Available In the Internet of Things (IoT), equipment used for aquaculture is often deployed in outdoor ponds located in remote areas. Faults occur frequently in these tough environments, and the staff generally lack professional knowledge and pay little attention to these areas. Once faults happen, expert personnel must carry out maintenance outdoors. Therefore, this study presents an intelligent method for fault diagnosis based on fault tree analysis and a fuzzy neural network. In the proposed method, first, the fault tree presents a logic structure of fault symptoms and faults. Second, rules extracted from the fault trees avoid duplication and redundancy. Third, the fuzzy neural network is applied to train the mapping between fault symptoms and faults. In the aquaculture IoT, one fault can cause various fault symptoms, and one symptom can be caused by a variety of faults. Four fault relationships are obtained. Results show that one symptom-to-one fault, two symptoms-to-two faults, and two symptoms-to-one fault relationships can be rapidly diagnosed with high precision, while the one symptom-to-two faults pattern performs less well but is still worth researching. This model implements diagnosis for most kinds of faults in the aquaculture IoT.

  17. Application of Fault Tree Analysis and Fuzzy Neural Networks to Fault Diagnosis in the Internet of Things (IoT) for Aquaculture.

    Science.gov (United States)

    Chen, Yingyi; Zhen, Zhumi; Yu, Huihui; Xu, Jing

    2017-01-14

    In the Internet of Things (IoT), equipment used for aquaculture is often deployed in outdoor ponds located in remote areas. Faults occur frequently in these tough environments, and the staff generally lack professional knowledge and pay little attention to these areas. Once faults happen, expert personnel must carry out maintenance outdoors. Therefore, this study presents an intelligent method for fault diagnosis based on fault tree analysis and a fuzzy neural network. In the proposed method, first, the fault tree presents a logic structure of fault symptoms and faults. Second, rules extracted from the fault trees avoid duplication and redundancy. Third, the fuzzy neural network is applied to train the mapping between fault symptoms and faults. In the aquaculture IoT, one fault can cause various fault symptoms, and one symptom can be caused by a variety of faults. Four fault relationships are obtained. Results show that one symptom-to-one fault, two symptoms-to-two faults, and two symptoms-to-one fault relationships can be rapidly diagnosed with high precision, while the one symptom-to-two faults pattern performs less well but is still worth researching. This model implements diagnosis for most kinds of faults in the aquaculture IoT.

  18. Design of Fault Diagnosis Observer for HAGC System on Strip Rolling Mill

    Institute of Scientific and Technical Information of China (English)

    DONG Min; LIU Cai

    2006-01-01

    By building a mathematical model for the HAGC (hydraulic automatic gauge control) system of a strip rolling mill, treating faults as unknown inputs induced by model uncertainty, and analyzing fault directions, an unknown input fault diagnosis observer group was designed. Fault detection and isolation were realized by making the observer residuals robust to specific faults but sensitive to other faults. Sufficient existence conditions and the design of the observers are given in detail. Diagnosis observer parameters for the servo valve, cylinder, roller and rolling mill body were obtained respectively. The effectiveness of this diagnosis method was proved by simulations with actual data.

  19. Full Vector LMD Energy Entropy in Gear Fault Feature Extraction

    Institute of Scientific and Technical Information of China (English)

    王洪明; 郝旺身; 韩捷; 董辛旻; 郝伟; 欧阳贺龙

    2015-01-01

    Gear fault signals are nonlinear and nonstationary; when a gear fault occurs, the energy structure of the signal changes, so that the energy differs across frequency bands. The traditional method uses local mean decomposition (LMD) to extract the energy entropy of the vibration signal and takes the energy entropy as the criterion for fault classification, relying on a single sensor as the information source, which easily leads to misdiagnosis or missed diagnosis. The full vector LMD energy entropy method fuses the rotational energy of two-channel homologous information and can reduce the misdiagnosis rate. Four gear states, namely normal, tooth root crack, broken tooth and missing tooth, were simulated experimentally, and the results verify that full vector LMD energy entropy used as the fault feature achieves good fault classification.

  20. Automated Cooperative Trajectories

    Science.gov (United States)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  1. Direct DNA isolation from solid biological sources without pretreatments with proteinase-K and/or homogenization through automated DNA extraction.

    Science.gov (United States)

    Ki, Jang-Seu; Chang, Ki Byum; Roh, Hee June; Lee, Bong Youb; Yoon, Joon Yong; Jang, Gi Young

    2007-03-01

    Genomic DNA from solid biomaterials was directly isolated with an automated DNA extractor, which was based on magnetic bead technology with a bore-mediated grinding (BMG) system. The movement of the bore broke down the solid biomaterials, mixed crude lysates thoroughly with reagents to isolate the DNA, and carried the beads to the next step. The BMG system was suitable for the mechanical homogenization of the solid biomaterials and valid as an automated system for purifying the DNA from the solid biomaterials without the need for pretreatment or disruption procedures prior to the application of the solid biomaterials.

  2. Evaluating Fault Management Operations Concepts for Next-Generation Spacecraft: What Eye Movements Tell Us

    Science.gov (United States)

    Hayashi, Miwa; Ravinder, Ujwala; McCann, Robert S.; Beutter, Brent; Spirkovska, Lily

    2009-01-01

    Performance enhancements associated with selected forms of automation were quantified in a recent human-in-the-loop evaluation of two candidate operational concepts for fault management on next-generation spacecraft. The baseline concept, called Elsie, featured a full-suite of "soft" fault management interfaces. However, operators were forced to diagnose malfunctions with minimal assistance from the standalone caution and warning system. The other concept, called Besi, incorporated a more capable C&W system with an automated fault diagnosis capability. Results from analyses of participants' eye movements indicate that the greatest empirical benefit of the automation stemmed from eliminating the need for text processing on cluttered, text-rich displays.

  3. Characterization of eleutheroside B metabolites derived from an extract of Acanthopanax senticosus Harms by high-resolution liquid chromatography/quadrupole time-of-flight mass spectrometry and automated data analysis.

    Science.gov (United States)

    Lu, Fang; Sun, Qiang; Bai, Yun; Bao, Shunru; Li, Xuzhao; Yan, Guangli; Liu, Shumin

    2012-10-01

    We elucidated the structure and metabolite profile of eleutheroside B, a component derived from the extract of Acanthopanax senticosus Harms, after oral administration of the extract in rats. Samples of rat plasma were collected and analyzed by selective high-resolution liquid chromatography/quadrupole time-of-flight mass spectrometry (UPLC/Q-TOF MS) automated data analysis method. A total of 11 metabolites were detected: four were identified, and three of those four are reported for the first time here. The three new plasma metabolites were identified on the basis of mass fragmentation patterns and literature reports. The major in vivo metabolic processes associated with eleutheroside B in A. senticosus include demethylation, acetylation, oxidation and glucuronidation after deglycosylation. A fairly comprehensive metabolic pathway was proposed for eleutheroside B. Our results provide a meaningful basis for drug discovery, design and clinical applications related to A. senticosus in traditional Chinese medicine.

  4. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck

    2013-01-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and short ultra-high-performance liquid chromatography–tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids......-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C18 column using a 6.5 min 0.1 % ammonia (25...... %) in water/0.1 % ammonia (25 %) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity...

  5. Research on the Fault Coefficient in Complex Electrical Engineering

    Directory of Open Access Journals (Sweden)

    Yi Sun

    2015-08-01

    Full Text Available Fault detection and isolation in a complex system are research hotspots and frontier problems in the reliability engineering field. Fault identification can be regarded as a procedure of excavating key characteristics from massive failure data, then classifying and identifying fault samples. In this paper, based on the fundamentals of feature extraction for the fault coefficient, we discuss the fault coefficient feature in complex electrical engineering in detail. For general fault types in a complex power system, the fault coefficient feature remains accurate and reliable even in the presence of strong white Gaussian stochastic interference. The comparative analysis of noise influence also demonstrates the strong anti-interference ability and high redundancy of the fault coefficient feature in complex electrical engineering.

  6. Operator Performance Evaluation of Fault Management Interfaces for Next-Generation Spacecraft

    Science.gov (United States)

    Hayashi, Miwa; Ravinder, Ujwala; Beutter, Brent; McCann, Robert S.; Spirkovska, Lilly; Renema, Fritz

    2008-01-01

    In the cockpit of NASA's next generation of spacecraft, most vehicle commanding will be carried out via electronic interfaces instead of hard cockpit switches. Checklists will also be displayed and completed on electronic procedure viewers rather than on paper. Transitioning to electronic cockpit interfaces opens up opportunities for more automated assistance, including automated root-cause diagnosis capability. The paper reports an empirical study evaluating two potential concepts for fault management interfaces incorporating two different levels of automation. The operator performance benefits produced by automation were assessed. Also, some design recommendations for spacecraft fault management interfaces are discussed.

  7. Research on the Sparse Representation for Gearbox Compound Fault Features Using Wavelet Bases

    Directory of Open Access Journals (Sweden)

    Chunyan Luo

    2015-01-01

    Full Text Available The research on gearbox fault diagnosis has been gaining increasing attention in recent years, especially for single fault diagnosis. In engineering practice, there is often more than one fault in a gearbox, which manifests as a compound fault; hence gearbox compound fault diagnosis is equally important. Both bearing and gear faults in the gearbox tend to produce different kinds of transient impulse responses in the captured signal, so a suitable approach for compound fault diagnosis is needed. Sparse representation is one of the effective methods for feature extraction from strong background noise. Therefore, sparse representation under wavelet bases for compound fault feature extraction is developed in this paper. With the proposed method, the different transient features of both bearing and gear faults can be separated and extracted. Both a simulated study and a practical application to a gearbox with a compound fault verify the effectiveness of the proposed method.
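
    As a rough Python illustration of the sparse-representation idea in this record (not the authors' algorithm; the wavelet, decomposition level and threshold rule are assumptions chosen for the example), the sketch below keeps only the largest wavelet coefficients of a noisy vibration signal so that the transient, impulse-like content of a simulated compound fault stands out:

      import numpy as np
      import pywt

      def sparse_wavelet_features(signal, wavelet="db8", level=4, keep_ratio=0.05):
          """Keep only the largest wavelet coefficients (a sparse representation)
          and reconstruct the transient, impulse-like part of the signal."""
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          # Pick one global magnitude threshold from all detail coefficients.
          details = np.concatenate(coeffs[1:])
          thr = np.quantile(np.abs(details), 1.0 - keep_ratio)
          sparse = [coeffs[0]] + [pywt.threshold(c, thr, mode="hard") for c in coeffs[1:]]
          return pywt.waverec(sparse, wavelet)[: len(signal)]

      # Usage: a noisy signal carrying two impulse trains (compound fault surrogate).
      fs = 12000
      t = np.arange(0, 1, 1 / fs)
      impulses = np.zeros_like(t)
      impulses[::400] = 1.0        # e.g. bearing-related impacts
      impulses[::937] += 0.8       # e.g. gear-related impacts
      x = impulses + 0.3 * np.random.randn(t.size)
      transients = sparse_wavelet_features(x)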

  8. Fault Feature Extraction Method for Gears Based on Improved Local Mean Decomposition and Instantaneous Energy Distribution-Sample Entropy

    Institute of Scientific and Technical Information of China (English)

    孟宗; 王亚超; 胡猛

    2016-01-01

    A gear fault feature extraction method based on improved local mean decomposition (LMD) and instantaneous energy distribution (IED)-sample entropy (SampEn) is proposed. To address the end-effect problem of LMD, an improved LMD method based on the maximum similarity coefficient is put forward; the end effect is mitigated by searching inside the signal for the segments with the largest similarity coefficient to specified segments at both ends. Simulation results show that the method effectively alleviates the end effect of LMD. Decomposing the signal with the improved LMD yields instantaneous amplitude functions, from which the instantaneous energy distribution of the signal is obtained and used as the input of sample entropy to compute the IED-SampEn feature. Experimental study and comparison with PF-SampEn show that IED-SampEn reflects the gear fault state reasonably and effectively, is a representative feature vector for gear vibration signals, and can serve as an effective fault feature.
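
    The sample-entropy feature used in this record can be sketched in a few lines of Python. The improved LMD step and the IED construction are not reproduced; the input is simply assumed to be a 1-D sequence whose complexity is to be quantified, and the template counting is a slightly simplified form of the textbook definition:

      import numpy as np

      def sample_entropy(x, m=2, r_factor=0.2):
          """SampEn with Chebyshev distance and tolerance r = r_factor * std(x)."""
          x = np.asarray(x, dtype=float)
          r = r_factor * np.std(x)

          def count_matches(dim):
              templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
              count = 0
              for i in range(len(templates) - 1):
                  # Chebyshev distance from template i to all later templates.
                  dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                  count += np.sum(dist <= r)
              return count

          b = count_matches(m)        # matches of length m
          a = count_matches(m + 1)    # matches of length m + 1
          return -np.log(a / b) if a > 0 and b > 0 else np.inf

      # Usage: a periodic signal should give a lower SampEn than white noise.
      t = np.linspace(0, 10, 1000)
      print(sample_entropy(np.sin(2 * np.pi * t)))
      print(sample_entropy(np.random.randn(1000)))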

  9. Mixed-domain feature extraction approach to rolling bearing faults based on kernel principal component analysis

    Institute of Scientific and Technical Information of China (English)

    彭涛; 杨慧斌; 李健宝; 姜海燕; 魏巍

    2011-01-01

    In order to effectively use the various nonstationary statistical features with significant between-class differences from the time domain, frequency domain and time-frequency domain, a novel mixed-domain feature extraction approach based on kernel principal component analysis (KPCA) was proposed to improve the performance and efficiency of condition monitoring and fault diagnosis for rolling bearings. First, time-domain and frequency-domain features generated from the original signal, together with time-frequency-domain features generated by multi-resolution wavelet decomposition, were extracted; a mixed-domain feature set of 144 features was composed to characterize the original vibration signals. The kernel principal component analysis method was then used to perform a secondary extraction of the features in the mixed-domain set that are most sensitive to the failure characteristics. Based on an accumulated contribution rate of more than 90%, the first 11 nonlinear principal components were extracted as the primary feature vector for a support vector machine classifier. The results show that the mixed-domain feature set reflects the failure characteristics more comprehensively and accurately than a single feature or single-domain features. Kernel principal component analysis effectively reduces the input feature dimension while ensuring that the output features remain highly sensitive to the operational status of the bearings and highly separable for pattern recognition. Compared with the common feature extraction method based on wavelet decomposition, the proposed method extracts the fault features of rolling bearings of different types and degrees under different operating conditions more distinctly.
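
    A hedged sketch of the KPCA step only, with a synthetic stand-in for the 144-feature mixed-domain set; the RBF kernel and its width are illustrative choices, not the paper's settings:

      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 144))       # 200 samples x 144 mixed-domain features

      X_std = StandardScaler().fit_transform(X)
      kpca = KernelPCA(n_components=30, kernel="rbf", gamma=1.0 / X.shape[1])
      Z = kpca.fit_transform(X_std)

      # Contribution rate among the retained nonlinear components
      # (the attribute is named lambdas_ in older scikit-learn releases).
      contrib = kpca.eigenvalues_ / kpca.eigenvalues_.sum()
      n_keep = int(np.searchsorted(np.cumsum(contrib), 0.90) + 1)
      Z_primary = Z[:, :n_keep]             # primary feature vector for an SVM classifier
      print(n_keep, Z_primary.shape)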

  10. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  11. An automatic fault management model for distribution networks

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Haenninen, S. [VTT Energy, Espoo (Finland); Seppaenen, M. [North-Carelian Power Co (Finland); Antila, E.; Markkila, E. [ABB Transmit Oy (Finland)

    1998-08-01

    An automatic computer model, called the FI/FL-model, for fault location, fault isolation and supply restoration is presented. The model works as an integrated part of the substation SCADA, the AM/FM/GIS system and the medium voltage distribution network automation systems. In the model, three different techniques are used for fault location. First, by comparing the measured fault current to the computed one, an estimate for the fault distance is obtained. This information is then combined, in order to find the actual fault point, with the data obtained from the fault indicators in the line branching points. As a third technique, in the absence of better fault location data, statistical information of line section fault frequencies can also be used. For combining the different fault location information, fuzzy logic is used. As a result, the probability weights for the fault being located in different line sections are obtained. Once the faulty section is identified, it is automatically isolated by remote control of line switches. Then the supply is restored to the remaining parts of the network. If needed, reserve connections from other adjacent feeders can also be used. During the restoration process, the technical constraints of the network are checked. Among these are the load carrying capacity of line sections, voltage drop and the settings of relay protection. If there are several possible network topologies, the model selects the technically best alternative. The FI/FL-model has been in trial use at two substations of the North-Carelian Power Company since November 1996. This chapter lists the practical experiences during the test use period. Also the benefits of this kind of automation are assessed and future developments are outlined.
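
    A toy Python illustration of the fuzzy combination idea described above; the section names, membership values and the product-style fusion rule are invented for the example and are not the FI/FL-model's actual rules:

      import numpy as np

      sections = ["S1", "S2", "S3", "S4"]

      # Evidence 1: impedance-based fault distance estimate (membership per section).
      mu_distance = np.array([0.1, 0.7, 0.9, 0.2])
      # Evidence 2: fault indicators at the line branching points.
      mu_indicators = np.array([0.0, 0.6, 1.0, 0.3])
      # Evidence 3: statistical fault frequency of each line section.
      mu_statistics = np.array([0.3, 0.4, 0.5, 0.4])

      # Simple fuzzy fusion: algebraic-product t-norm, then normalization to weights.
      combined = mu_distance * mu_indicators * mu_statistics
      weights = combined / combined.sum()

      for name, w in zip(sections, weights):
          print(f"{name}: probability weight {w:.2f}")
      # The highest-weight section would be isolated by remote-controlled switches
      # before supply is restored to the healthy parts of the feeder.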

  12. Fault Tolerant Feedback Control

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, H.

    2001-01-01

    An architecture for fault tolerant feedback controllers based on the Youla parameterization is suggested. It is shown that the Youla parameterization will give a residual vector directly in connection with the fault diagnosis part of the fault tolerant feedback controller. It turns out that there is a separation between the feedback controller and the fault tolerant part. The closed-loop feedback properties are handled by the nominal feedback controller, and the fault tolerant part is handled by the design of the Youla parameter. The design of the fault tolerant part will not affect the design of the nominal feedback controller.

  13. Planetary Gearbox Fault Diagnosis Using Envelope Manifold Demodulation

    Directory of Open Access Journals (Sweden)

    Weigang Wen

    2016-01-01

    Full Text Available The important issue in planetary gear fault diagnosis is to extract dependable fault characteristics from the noisy vibration signal of the planetary gearbox. To address this critical problem, an envelope manifold demodulation method is proposed for planetary gear fault detection in this paper. This method combines complex wavelets, manifold learning, and the frequency spectrogram to implement planetary gear fault characteristic extraction. The vibration signal of the planetary gear is demodulated by wavelet enveloping. The envelope energy is adopted as an indicator to select the meshing frequency band. Manifold learning is utilized to reduce the effect of noise within the meshing frequency band. The fault characteristic frequency of the planetary gear is shown by the spectrogram. A planetary gearbox model and test rig are established and experiments with planet gear faults are conducted for verification. All experimental results demonstrate its effectiveness and reliability.
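
    A simplified envelope-demodulation sketch in Python, omitting the manifold-learning denoising step of the paper; the filter band and fault frequency are made-up example values:

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      fs = 20000
      t = np.arange(0, 2, 1 / fs)
      f_mesh, f_fault = 1200.0, 27.5          # meshing band centre and fault frequency (example)
      signal = (1 + 0.6 * np.cos(2 * np.pi * f_fault * t)) * np.sin(2 * np.pi * f_mesh * t)
      signal += 0.5 * np.random.randn(t.size)

      # Band-pass around the selected meshing frequency band.
      b, a = butter(4, [f_mesh - 200, f_mesh + 200], btype="bandpass", fs=fs)
      band = filtfilt(b, a, signal)

      # Envelope and its spectrum; a peak near f_fault indicates the gear fault.
      envelope = np.abs(hilbert(band))
      spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
      freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
      print("dominant envelope frequency:", freqs[np.argmax(spectrum)])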

  14. Rapid and automated analysis of aflatoxin M1 in milk and dairy products by online solid phase extraction coupled to ultra-high-pressure-liquid-chromatography tandem mass spectrometry.

    Science.gov (United States)

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Pagano, Imma; Russo, Mariateresa; Rastrelli, Luca

    2016-01-08

    This study reports a fast and automated analytical procedure for the analysis of aflatoxin M1 (AFM1) in milk and dairy products. The method is based on simultaneous protein precipitation and AFM1 extraction by salt-induced liquid-liquid extraction (SI-LLE), followed by online solid-phase extraction (online SPE) coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) analysis for the automatic pre-concentration, clean-up and sensitive and selective determination of AFM1. The main parameters affecting the extraction efficiency and accuracy of the analytical method were studied in detail. In the optimal conditions, acetonitrile and NaCl were used as extraction/denaturant solvent and salting-out agent in SI-LLE, respectively. After centrifugation, the organic phase (acetonitrile) was diluted with water (1:9 v/v) and purified (1 mL) by an online C18 cartridge coupled with a UHPLC column. Finally, selected reaction monitoring (SRM) acquisition mode was applied to the detection of AFM1. Validation studies were carried out on different dairy products (whole and skimmed cow milk, yogurt, goat milk, and powder infant formula), providing method quantification limits about 25 times lower than the AFM1 maximum levels permitted by EU regulation 1881/2006 in milk and dairy products for direct human consumption. Recoveries (86-102%) and repeatability (RSD […]) were obtained for the dairy products studied. The proposed method improves the performance of AFM1 analysis in milk samples, as AFM1 determination is performed with a higher degree of accuracy than the conventional methods. Other advantages are the reduction of the sample preparation procedure, time and cost of the analysis, enabling a high sample throughput that meets the current concerns of food safety and public health protection.

  15. LC-HR-MS/MS standard urine screening approach: Pros and cons of automated on-line extraction by turbulent flow chromatography versus dilute-and-shoot and comparison with established urine precipitation.

    Science.gov (United States)

    Helfer, Andreas G; Michely, Julian A; Weber, Armin A; Meyer, Markus R; Maurer, Hans H

    2017-02-01

    Comprehensive urine screening for drugs and metabolites by LC-HR-MS/MS using Orbitrap technology has been described with precipitation as a simple workup. In order to speed up, automate, and/or simplify the workup, an on-line extraction by turbulent flow chromatography and a dilute-and-shoot approach were developed and compared. After chromatographic separation within 10 min, the Q-Exactive mass spectrometer was run in full scan mode with positive/negative switching and a subsequent data-dependent acquisition mode. The workup approaches were validated concerning selectivity, recovery, matrix effects, process efficiency, and limits of identification and detection for typical drug representatives and metabolites. The total workup time for on-line extraction was 6 min, and for the dilution approach 3 min. For comparison, the established urine precipitation and evaporation lasted 10 min. The validation results were acceptable. The limits for on-line extraction were comparable with those described for precipitation, but lower than for dilution. Thanks to the high sensitivity of the LC-HR-MS/MS system, all three workup approaches were sufficient for comprehensive urine screening and allowed fast, reliable, and reproducible detection of cardiovascular drugs, drugs of abuse, and other CNS acting drugs after common doses.

  16. Fault tolerant system design for uninterruptible power supplies

    Directory of Open Access Journals (Sweden)

    B. Y. Volochiy

    2012-02-01

    Full Text Available The problem of design for reliability of a fault tolerant system for uninterruptible power supplies is considered. The configuration of the fault tolerant system determines the structure of the uninterruptible power supply: a power supply built from modules of the same type, a sliding stand-by reserve for them, a twofold total reserve of the power supply with two accumulator batteries, and the control and diagnostic means. A tool developed for the automated generation of analytical models of fault tolerant systems is presented, and its capabilities are illustrated by determining the requirements for the repair service and the accumulator batteries.

  17. Fault Diagnosis for Electrical Distribution Systems using Structural Analysis

    DEFF Research Database (Denmark)

    Knüppel, Thyge; Blanke, Mogens; Østergaard, Jacob

    2014-01-01

    Fault-tolerance in electrical distribution relies on the ability to diagnose possible faults and determine which components or units cause a problem or are close to doing so. Faults include defects in instrumentation, power generation, transformation and transmission. The focus of this paper is structural analysis, which finds redundancies in large sets of equations only from the structure (topology) of the equations. A salient feature is automated generation of redundancy relations. The method is indeed feasible in electrical networks, where circuit theory and network topology together formulate the constraints that define […]

  18. Automated extraction of DNA from reference samples from various types of biological materials on the Qiagen BioRobot EZ1 Workstation

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Jørgensen, Mads; Hansen, Anders Johannes

    2009-01-01

    We have validated and implemented a protocol for DNA extraction from various types of biological materials using a Qiagen BioRobot EZ1 Workstation. The sample materials included whole blood, blood from deceased persons, buccal cells on Omni swabs and FTA Cards, blood on FTA Cards and cotton swabs, and muscle biopsies. The DNA extraction was validated according to EN/ISO 17025 for the STR kits AmpFlSTR® Identifiler® and AmpFlSTR® Yfiler® (Applied Biosystems). Of 298 samples extracted, 11 (4%) did not yield acceptable results. In conclusion, we have demonstrated that extraction of DNA from various types of biological material can be performed quickly and without the use of hazardous chemicals, and that the DNA may be successfully STR typed according to the requirements of forensic genetic investigations accredited according to EN/ISO 17025.

  19. Fault detection and isolation in systems with parametric faults

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, Hans Henrik

    1999-01-01

    The problem of fault detection and isolation of parametric faults is considered in this paper. Parametric faults are associated with internal parameter variations in the dynamical system. A fault detection and isolation method for parametric faults is formulated […]

  20. Iowa Bedrock Faults

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — This fault coverage locates and identifies all currently known/interpreted fault zones in Iowa, that demonstrate offset of geologic units in exposure or subsurface...

  1. Faults, Images

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Through the study of faults and their effects, much can be learned about the size and recurrence intervals of earthquakes. Faults also teach us about crustal...

  2. Fault Monitoring and Re-Configurable Control for a Ship Propulsion Plant

    DEFF Research Database (Denmark)

    Blanke, M.; Izadi-Zamanabadi, Roozbeh; Lootsma, T.F.

    1998-01-01

    Minor faults in ship propulsion and their associated automation systems can cause a dramatic reduction in a ship's ability to propel and manoeuvre, and effective means are needed to prevent simple faults from developing into severe failures. The paper analyses the control system for a propulsion plant […]

  3. Heating automation

    OpenAIRE

    Tomažič, Tomaž

    2013-01-01

    This degree paper presents the usage and operation of peripheral devices with a microcontroller for heating automation. The main goal is to build a high-quality control system for heating three floors of a house and, with that, increase the efficiency of the heating devices and lower heating expenses. A heat pump, furnace, boiler pump, two floor-heating pumps and two radiator pumps need to be controlled by this system. For this work, we chose an STM32F4-Discovery development kit with five temperature sensors, an LCD disp...

  4. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  5. Marketing automation

    OpenAIRE

    Raluca Dania TODOR

    2017-01-01

    The automation of the marketing process seems nowadays to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to the...

  6. Do faults stay cool under stress?

    Science.gov (United States)

    Savage, H. M.; Polissar, P. J.; Sheppard, R. E.; Brodsky, E. E.; Rowe, C. D.

    2011-12-01

    Determining the absolute stress on faults during slip is one of the major goals of earthquake physics as this information is necessary for full mechanical modeling of the rupture process. One indicator of absolute stress is the total energy dissipated as heat through frictional resistance. The heat results in a temperature rise on the fault that is potentially measurable and interpretable as an indicator of the absolute stress. We present a new paleothermometer for fault zones that utilizes the thermal maturity of extractable organic material to determine the maximum frictional heating experienced by the fault. Because there are no retrograde reactions in these organic systems, maximum heating is preserved. We investigate four different faults: 1) the Punchbowl Fault, a strike-slip fault that is part of the ancient San Andreas system in southern California, 2) the Muddy Mountain Thrust, a continental thrust sheet in Nevada, 3) large shear zones of Sitkanik Island, AK, part of the proto-megathrust of the Kodiak Accretionary Complex and 4) the Pasagshak Point Megathrust, Kodiak Accretionary Complex, AK. According to a variety of organic thermal maturity indices, the thermal maturity of the rocks falls within the range of heating expected from the bounds on burial depth and time, indicating that the method is robust and in some cases improving our knowledge of burial depth. Only the Pasagshak Point Thrust, which is also pseudotachylyte-bearing, shows differential heating between the fault and off-fault samples. This implies that most of the faults did not get hotter than the surrounding rock during slip. Simple temperature models coupled to the kinetic reactions for organic maturity let us constrain certain aspects of the fault during slip such as fault friction, maximum slip in a single earthquake, the thickness of the active slipping zone and the effective normal stress. Because of the significant length of these faults, we find it unlikely that they never sustained

  7. Performance based fault diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    2002-01-01

    Different aspects of fault detection and fault isolation in closed-loop systems are considered. It is shown that using the standard setup known from feedback control, it is possible to formulate fault diagnosis problems based on a performance index in this general standard setup. It is also shown...

  8. Research on the application of full vector spectrum-rough set in extracting fault spectrum features of rotating machinery

    Institute of Scientific and Technical Information of China (English)

    王宏超; 韩捷; 陈宏; 巩晓赟

    2011-01-01

    As rotating machinery becomes larger in scale, faster in speed and higher in precision, it is increasingly important to extract its fault features comprehensively, timely and effectively. Traditional single-channel information acquisition carries the risk of misjudgment because the information is incomplete, and traditional information processing suffers from low efficiency. Based on the ideas of same-source information fusion and fault feature extraction, the full vector spectrum technique is combined with rough set theory, and a method for applying full vector spectrum-rough set theory to fault spectrum feature extraction of rotating machinery is proposed; the relevant definitions and algorithms are given. Experiments on typical faults verify that this method extracts fault spectrum features more accurately and comprehensively, making it an effective fault spectrum feature extraction method and a reference for on-line monitoring and diagnosis of rotating machinery faults.
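
    An illustrative sketch of the two-channel (same-source) fusion behind a full vector spectrum, with the rough-set feature reduction step omitted; the signals and frequencies are invented for the example:

      import numpy as np

      fs = 5000
      t = np.arange(0, 1, 1 / fs)
      x = 1.0 * np.cos(2 * np.pi * 50 * t) + 0.3 * np.cos(2 * np.pi * 150 * t)   # probe X
      y = 0.8 * np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 150 * t)   # probe Y

      z = x + 1j * y                        # fuse the two orthogonal channels
      spectrum = np.fft.fft(z) / len(z)
      freqs = np.fft.fftfreq(len(z), 1 / fs)

      # Forward (+f) and backward (-f) components at the rotation frequency.
      i_fwd = np.argmin(np.abs(freqs - 50))
      i_bwd = np.argmin(np.abs(freqs + 50))
      print("forward amplitude:", abs(spectrum[i_fwd]))
      print("backward amplitude:", abs(spectrum[i_bwd]))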

  9. Analysis of fault using microcomputer protection by symmetrical component method

    Directory of Open Access Journals (Sweden)

    Ashish Choubey

    2012-09-01

    Full Text Available To enhance the reliability of power supply to user terminals and to avoid repeated interference by a fault in the distribution system, fault identification, positioning, automatic fault isolation, network reconfiguration and restoration of supply to the non-faulted sections must be completed rapidly and automatically; for this purpose a microprocessor-based relay protection device has been developed. As fault-component theory is widely used in microcomputer protection, and the fault component exists in the fault-component network, it is necessary to build the fault-component network when a short-circuit fault emerges and to draw the current and voltage component phasor diagram at the fault point. In order to understand microcomputer protection based on the symmetrical-component principle, we obtain the sequence currents and sequence voltages according to the concept of symmetrical components. Distribution lines supply power directly to users, so the reliability of their operation determines the quality and level of the electricity supply. In recent decades, thanks to the tireless efforts of power scientists and technicians, relay protection technology and the application level of the equipment have been greatly improved, but currently produced computer hardware protection devices are still based on outdated systems. Software development faces maintenance difficulties and short lifetimes, the interface functions to factory automation systems are weak, and the network communication cannot meet actual requirements. The protection principle configuration and the device manufacturing process still need to be improved.
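
    The symmetrical-component transform mentioned in this record can be written compactly; the example phasors below are arbitrary values chosen to mimic an unbalanced fault, not data from the paper:

      import numpy as np

      a = np.exp(2j * np.pi / 3)                    # 120-degree rotation operator
      A = (1.0 / 3.0) * np.array([[1, 1,    1],
                                  [1, a,    a**2],
                                  [1, a**2, a]])

      # Unbalanced phase currents (per unit), e.g. during a single-phase fault.
      I_abc = np.array([5.0 * np.exp(1j * np.deg2rad(-80)),    # Ia
                        1.0 * np.exp(1j * np.deg2rad(-210)),   # Ib
                        1.0 * np.exp(1j * np.deg2rad(95))])    # Ic

      I_012 = A @ I_abc                             # [I0, I1, I2]
      for name, val in zip(("zero", "positive", "negative"), I_012):
          print(f"{name}-sequence: |I| = {abs(val):.3f}, angle = {np.degrees(np.angle(val)):.1f} deg")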

  10. A Fault Alarm and Diagnosis Method Based on Sensitive Parameters and Support Vector Machine

    Science.gov (United States)

    Zhang, Jinjie; Yao, Ziyun; Lv, Zhiquan; Zhu, Qunxiong; Xu, Fengtian; Jiang, Zhinong

    2015-08-01

    The extraction of fault features and the diagnosis of reciprocating compressors are currently hot research topics in the field of reciprocating machinery fault diagnosis. A large number of feature extraction and classification methods have been widely applied in related research, but practical fault alarming and diagnostic accuracy have not been effectively improved. Developing feature extraction and classification methods that meet the requirements of typical fault alarming and automatic diagnosis in practical engineering is an urgent task. The typical mechanical faults of reciprocating compressors are presented in this paper, and the existing data of an online monitoring system are used to extract 15 types of fault feature parameters; the inherent sensitive connections between faults and feature parameters are clarified using the distance evaluation technique, and sensitive characteristic parameters for the different faults are obtained. On this basis, a method based on fault feature parameters and a support vector machine (SVM) is developed and applied to practical fault diagnosis. A better capability for early fault warning is demonstrated by experiments and practical fault cases. Automatic classification of fault alarm data by the SVM achieves good diagnostic accuracy.
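
    A hedged sketch of the described pipeline, with SelectKBest/f_classif as a simple stand-in for the distance evaluation technique and synthetic data in place of the compressor monitoring data:

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.feature_selection import SelectKBest, f_classif
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # 15 candidate feature parameters, 4 fault classes (synthetic example).
      X, y = make_classification(n_samples=600, n_features=15, n_informative=6,
                                 n_classes=4, n_clusters_per_class=1, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      clf = make_pipeline(StandardScaler(),
                          SelectKBest(f_classif, k=6),   # keep the "sensitive" parameters
                          SVC(kernel="rbf", C=10.0, gamma="scale"))
      clf.fit(X_tr, y_tr)
      print("diagnostic accuracy on held-out data:", clf.score(X_te, y_te))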

  11. Analytical Model-based Fault Detection and Isolation in Control Systems

    DEFF Research Database (Denmark)

    Vukic, Z.; Ozbolt, H.; Blanke, M.

    1998-01-01

    The paper gives an introduction to and an overview of the field of fault detection and isolation for control systems. A summary of analytical (quantitative model-based) methods and their implementation is presented. The focus is on the analytical model-based fault detection and fault diagnosis methods, often viewed as the classical or deterministic ones. Emphasis is placed on the algorithms suitable for ship automation, unmanned underwater vehicles, and other systems of automatic control.

  12. Enantioselective determination of methylphenidate and ritalinic acid in whole blood from forensic cases using automated solid-phase extraction and liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Thomsen, Ragnar; B. Rasmussen, Henrik; Linnet, Kristian

    2012-01-01

    A chiral liquid chromatography tandem mass spectrometry (LC–MS-MS) method was developed and validated for quantifying methylphenidate and its major metabolite ritalinic acid in blood from forensic cases. Blood samples were prepared in a fully automated system by protein precipitation followed […] In cases where methylphenidate was not determined to be related to the cause of death, the femoral blood concentration of d-methylphenidate ranged from 5 to 58 ng/g, and from undetected to 48 ng/g for l-methylphenidate (median d/l-ratio 5.9). Ritalinic acid was present at concentrations 10–20 times higher, with roughly equal amounts of the d- and l-forms. In blood from 10 living subjects who were not suspected of being intoxicated by methylphenidate, the concentration ranges and patterns were similar to those of the postmortem cases. Thus, methylphenidate does not appear to undergo significant postmortem redistribution.

  13. An Automated Extraction Algorithm of Power Lines Based on Airborne Laser Scanning Data

    Institute of Scientific and Technical Information of China (English)

    尹辉增; 孙轩; 聂振钢

    2012-01-01

    An efficient automated algorithm for extracting power lines from airborne laser scanning (ALS) data was put forward. The algorithm adopts point cloud classification based on local height-histogram distribution patterns, line extraction in Hough space with priority given to the global direction feature, mathematical estimation of the hanging-point positions, and local piecewise polynomial fitting. Four key problems are thereby solved, namely point cloud classification, extraction of the planimetric position of power lines, extraction of the power line hanging points, and power line fitting. Finally, the applicability of the algorithm was proved with practical engineering data.

  14. Fault-Tree Compiler

    Science.gov (United States)

    Butler, Ricky W.; Boerschlein, David P.

    1993-01-01

    The Fault-Tree Compiler (FTC) program is a software tool used to calculate the probability of the top event in a fault tree. Gates of five different types are allowed in the fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. The high-level input language is easy to understand and use. In addition, the program supports a hierarchical fault-tree definition feature, which simplifies the tree-description process and reduces execution time. A set of programs was created forming the basis for a reliability-analysis workstation: SURE, ASSIST, PAWS/STEM, and the FTC fault-tree tool (LAR-14586). Written in PASCAL, ANSI-compliant C language, and FORTRAN 77. Other versions available upon request.
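
    A tiny Python illustration of what a fault-tree evaluation does (this is not the FTC tool itself): the top-event probability for independent basic events combined through AND, OR and M-of-N gates, on an invented example tree:

      from itertools import combinations
      from math import prod

      def p_and(ps):                 # all inputs fail
          return prod(ps)

      def p_or(ps):                  # at least one input fails
          return 1.0 - prod(1.0 - p for p in ps)

      def p_m_of_n(ps, m):           # at least m of n independent inputs fail
          n = len(ps)
          total = 0.0
          for k in range(m, n + 1):
              for idx in combinations(range(n), k):
                  fail = prod(ps[i] for i in idx)
                  ok = prod(1.0 - ps[i] for i in range(n) if i not in idx)
                  total += fail * ok
          return total

      # Example tree: TOP = OR( AND(e1, e2), 2-of-3(e3, e4, e5) )
      e = {"e1": 1e-3, "e2": 2e-3, "e3": 5e-4, "e4": 5e-4, "e5": 1e-3}
      top = p_or([p_and([e["e1"], e["e2"]]),
                  p_m_of_n([e["e3"], e["e4"], e["e5"]], m=2)])
      print(f"top-event probability = {top:.3e}")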

  15. Information Based Fault Diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2008-01-01

    Fault detection and isolation (FDI) of parametric faults in dynamic systems will be considered in this paper. An active fault diagnosis (AFD) approach is applied. The fault diagnosis will be investigated with respect to different information levels from the external inputs to the systems. These inputs are disturbance inputs, reference inputs and auxiliary inputs. The diagnosis of the system is derived by an evaluation of the signatures from the inputs in the residual outputs. The changes of the signatures from the external inputs are used for detection and isolation of the parametric faults.

  16. Improving the software fault localization process through testability information

    NARCIS (Netherlands)

    Gonzalez-Sanchez, A.; Abreu, R.; Gross, H.; Van Gemund, A.

    2010-01-01

    When failures occur during software testing, automated software fault localization helps to diagnose their root causes and identify the defective components of a program to support debugging. Diagnosis is carried out by selecting test cases in such a way that their pass or fail information will narrow

  17. A novel analog circuit fault diagnosis method based on a dual-space feature extraction algorithm and SVM with learning using privileged information

    Institute of Scientific and Technical Information of China (English)

    李涛柱; 李红波; 朱世先; 李燕杰; 李楠

    2012-01-01

    A novel fault diagnosis method based on a dual-space feature extraction algorithm and LUPI-SVM (support vector machine with learning using privileged information) is proposed in this paper, aiming at the problem of correctly identifying fault classes in analog circuit fault diagnosis. First, the fault feature vectors are extracted by principal component analysis (PCA). They are then pre-classified by a LUPI-SVM classifier and an SVM-GA classifier, and for the samples on which the two pre-classification results differ, additional features are extracted by independent component analysis (ICA). Finally, these features are input into the trained LUPI-SVM model to identify the different fault cases. Simulation results for the analog and mixed-signal test benchmark Sallen-Key filter circuit demonstrate that the proposed method improves classification accuracy, which opens a new direction for the fault diagnosis of analog circuits.

  18. MODIFIED LAPLACIAN EIGENMAP METHOD FOR FAULT DIAGNOSIS

    Institute of Scientific and Technical Information of China (English)

    JIANG Quansheng; JIA Minping; HU Jianzhong; XU Feiyun

    2008-01-01

    A novel method based on an improved Laplacian eigenmap algorithm for fault pattern classification is proposed. By modifying the Laplacian eigenmap algorithm to replace the Euclidean distance with a kernel-based geometric distance in neighbor graph construction, the method preserves the consistency of local neighbor information and effectively extracts the low-dimensional manifold features embedded in high-dimensional nonlinear data sets. A nonlinear dimensionality reduction algorithm based on the improved Laplacian eigenmap directly learns from high-dimensional fault signals and extracts their intrinsic manifold features. The method largely preserves the global geometric structure information embedded in the signals and clearly improves the classification performance of fault pattern recognition. Experimental results on both simulated and engineering data indicate the feasibility and effectiveness of the new method.

  19. Semi-automated building extraction from airborne laser scanning data. (Polish Title: Półautomatyczne modelowanie brył budynków na podstawie danych z lotniczego skaningu laserowego)

    Science.gov (United States)

    Marjasiewicz, M.; Malej, T.

    2014-12-01

    The main idea of this project is to introduce a concept for a semi-automated method of building model extraction from Airborne Laser Scanning data. The presented method is based on the RANSAC algorithm, which provides automatic detection of planes for roof model creation. In the case of Airborne Laser Scanning, the algorithm can process point clouds affected by noise and erroneous measurements (gross errors). The RANSAC algorithm is based on iterative processing of a set of points in order to estimate a geometric model. Research on using the algorithm for ALS data was performed in the available CloudCompare and SketchUp software. An important aspect of this research was the selection of algorithm parameters, which was made on the basis of the characteristics of the point cloud and the scanned objects. The analysis showed that the plane extraction error with the RANSAC algorithm does not exceed 20 centimeters for point clouds with a density of 4 pts/m². RANSAC can be successfully used in building modelling based on ALS data. Roofs created by the presented method could be used in visualizations at a much better level than Level of Detail 2 of the CityGML standard. If the model is textured, it can represent the LoD3 standard.
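
    A bare-bones RANSAC plane fit in the spirit of the described method (real roof extraction iterates over several planes and uses tuned thresholds); the points and thresholds below are synthetic:

      import numpy as np

      def ransac_plane(points, n_iter=500, dist_thresh=0.15, seed=None):
          """Return (normal, d, inlier_mask) of the best plane n.p + d = 0."""
          rng = np.random.default_rng(seed)
          best_inliers = np.zeros(len(points), dtype=bool)
          best_model = None
          for _ in range(n_iter):
              sample = points[rng.choice(len(points), 3, replace=False)]
              normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
              norm = np.linalg.norm(normal)
              if norm < 1e-9:                     # degenerate (collinear) sample
                  continue
              normal /= norm
              d = -normal @ sample[0]
              inliers = np.abs(points @ normal + d) < dist_thresh
              if inliers.sum() > best_inliers.sum():
                  best_inliers, best_model = inliers, (normal, d)
          return best_model[0], best_model[1], best_inliers

      # Synthetic sloped "roof" patch with measurement noise and gross errors.
      rng = np.random.default_rng(1)
      xy = rng.uniform(0, 10, size=(1000, 2))
      z = 0.3 * xy[:, 0] + 0.1 * xy[:, 1] + 5.0 + rng.normal(0, 0.05, 1000)
      pts = np.column_stack([xy, z])
      pts[:100, 2] += rng.uniform(1, 3, 100)      # gross errors
      normal, d, inliers = ransac_plane(pts, seed=2)
      print("inlier ratio:", inliers.mean())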

  20. Automated and sensitive determination of four anabolic androgenic steroids in urine by online turbulent flow solid-phase extraction coupled with liquid chromatography-tandem mass spectrometry: a novel approach for clinical monitoring and doping control.

    Science.gov (United States)

    Guo, Feng; Shao, Jing; Liu, Qian; Shi, Jian-Bo; Jiang, Gui-Bin

    2014-07-01

    A novel method for automated and sensitive analysis of testosterone, androstenedione, methyltestosterone and methenolone in urine samples by online turbulent flow solid-phase extraction coupled with high performance liquid chromatography-tandem mass spectrometry was developed. The optimization and validation of the method were discussed in detail. The Turboflow C18-P SPE column showed the best extraction efficiency for all the analytes. Nanogram per liter (ng/L) level of AAS could be determined directly and the limits of quantification (LOQs) were 0.01 ng/mL, which were much lower than normally concerned concentrations for these typical anabolic androgenic steroids (AAS) (0.1 ng/mL). The linearity range was from the LOQ to 100 ng/mL for each compound, with the coefficients of determination (r(2)) ranging from 0.9990 to 0.9999. The intraday and interday relative standard deviations (RSDs) ranged from 1.1% to 14.5% (n=5). The proposed method was successfully applied to the analysis of urine samples collected from 24 male athletes and 15 patients of prostate cancer. The proposed method provides an alternative practical way to rapidly determine AAS in urine samples, especially for clinical monitoring and doping control.

  1. Automated extraction of lysergic acid diethylamide (LSD) and N-demethyl-LSD from blood, serum, plasma, and urine samples using the Zymark RapidTrace with LC/MS/MS confirmation.

    Science.gov (United States)

    de Kanel, J; Vickery, W E; Waldner, B; Monahan, R M; Diamond, F X

    1998-05-01

    A forensic procedure for the quantitative confirmation of lysergic acid diethylamide (LSD) and the qualitative confirmation of its metabolite, N-demethyl-LSD, in blood, serum, plasma, and urine samples is presented. The Zymark RapidTrace was used to perform fully automated solid-phase extractions of all specimen types. After extract evaporation, confirmations were performed using liquid chromatography (LC) followed by positive electrospray ionization (ESI+) mass spectrometry/mass spectrometry (MS/MS) without derivatization. Quantitation of LSD was accomplished using LSD-d3 as an internal standard. The limit of quantitation (LOQ) for LSD was 0.05 ng/mL. The limit of detection (LOD) for both LSD and N-demethyl-LSD was 0.025 ng/mL. The recovery of LSD was greater than 95% at levels of 0.1 ng/mL and 2.0 ng/mL. For LSD at 1.0 ng/mL, the within-run and between-run (different day) relative standard deviation (RSD) was 2.2% and 4.4%, respectively.

  2. Intelligent Automated Diagnosis of Client Device Bottlenecks in Private Clouds

    CERN Document Server

    Widanapathirana, C; Sekercioglu, Y A; Ivanovich, M; Fitzpatrick, P; 10.1109/UCC.2011.42

    2012-01-01

    We present an automated solution for rapid diagnosis of client device problems in private cloud environments: the Intelligent Automated Client Diagnostic (IACD) system. Clients are diagnosed with the aid of Transmission Control Protocol (TCP) packet traces, by (i) observation of anomalous artifacts occurring as a result of each fault and (ii) subsequent use of the inference capabilities of soft-margin Support Vector Machine (SVM) classifiers. The IACD system features a modular design and is extendible to new faults, with detection capability unaffected by the TCP variant used at the client. Experimental evaluation of the IACD system in a controlled environment demonstrated an overall diagnostic accuracy of 98%.

  3. An Automated Method for Extracting Spatially Varying Time-Dependent Quantities from an ALEGRA Simulation Using VisIt Visualization Software

    Science.gov (United States)

    2014-07-01

    Visualization software such as VisIt presents an alternative method to examine data through the use of EXODUS databases. The report describes an automated approach for extracting transient quantities that vary spatially from an EXODUS database using a VisIt macro written in the Python programming language.

  4. Bearing Fault Diagnosis Based on Statistical Locally Linear Embedding

    Directory of Open Access Journals (Sweden)

    Xiang Wang

    2015-07-01

    Full Text Available Fault diagnosis is essentially a kind of pattern recognition. The measured signal samples usually lie on nonlinear low-dimensional manifolds embedded in the high-dimensional signal space, so how to implement feature extraction and dimensionality reduction and improve recognition performance is a crucial task. In this paper a novel machinery fault diagnosis approach based on a statistical locally linear embedding (S-LLE) algorithm, which extends LLE by exploiting the fault class label information, is proposed. The approach first extracts high-dimensional feature vectors from vibration signals by time-domain, frequency-domain and empirical mode decomposition (EMD) feature extraction, and then translates this complex feature space into a salient low-dimensional feature space by the manifold learning algorithm S-LLE, which outperforms other feature reduction methods such as PCA, LDA and LLE. Finally, in the reduced feature space, pattern classification and fault diagnosis are carried out easily and rapidly by a classifier. Rolling bearing fault signals are used to validate the proposed fault diagnosis approach. The results indicate that the proposed approach clearly improves the classification performance of fault pattern recognition and outperforms the other traditional approaches.
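
    A hedged sketch of the manifold-learning step, using plain LLE from scikit-learn as a stand-in for the supervised S-LLE variant of the paper and synthetic data in place of the bearing feature vectors:

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.manifold import LocallyLinearEmbedding
      from sklearn.model_selection import train_test_split
      from sklearn.neighbors import KNeighborsClassifier

      # Synthetic stand-in for high-dimensional time/frequency/EMD feature vectors.
      X, y = make_classification(n_samples=400, n_features=40, n_informative=8,
                                 n_classes=4, n_clusters_per_class=1, random_state=0)

      lle = LocallyLinearEmbedding(n_neighbors=12, n_components=3, random_state=0)
      Z = lle.fit_transform(X)                    # salient low-dimensional features

      Z_tr, Z_te, y_tr, y_te = train_test_split(Z, y, test_size=0.3, random_state=0)
      clf = KNeighborsClassifier(n_neighbors=5).fit(Z_tr, y_tr)
      print("recognition accuracy in the reduced space:", clf.score(Z_te, y_te))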

  5. Automated Extraction of Building Facade Footprints from Mobile LiDAR Point Clouds

    Institute of Scientific and Technical Information of China (English)

    魏征; 董震; 李清泉; 杨必胜

    2012-01-01

    We present a novel method for the automated extraction of building facade footprints from mobile LiDAR point clouds. The proposed method first generates a georeferenced feature image of the mobile LiDAR point cloud and then uses image segmentation to extract contour areas in the feature image that contain facade points of buildings, points of trees and points of other objects. After all the points in each contour area are extracted, a classification based on eigenvalue analysis and profile analysis is adopted to identify building objects among the point clouds extracted from the contour areas. All the points of a building object are then segmented into different planes using the RANSAC algorithm. For each building, the points in the facade planes are used to calculate the direction, the start point and the end point of the facade footprint using eigenvalue analysis. Finally, the footprints of the different facades of a building are refined, harmonized and joined. The experimental results show that the proposed method provides a promising and valid solution for automatically extracting building facade footprints from mobile LiDAR point clouds.

  6. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  7. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  8. Complex Fault Interaction in the Yuha Desert

    Science.gov (United States)

    Kroll, K.; Cochran, E. S.; Richards-Dinger, K. B.; Sumy, D. F.

    2012-12-01

    […] abruptly shuts off, suggesting fault activity is highly sensitive to local stress conditions. To further our investigation, we locate over 15,000 previously unreported aftershocks in the YD during the same time period. For this analysis we detect arrivals using an STA/LTA filter from data continuously recorded on 8 seismometers installed in the YD from 6 April through 14 June 2010. Event association was performed with the Antelope software package. Absolute locations were first determined with hypoinverse using the automated phase picks and the velocity model used in the above relocation procedure. We refined the relative locations using the automated detections and cross-correlation delay times in hypoDD. We use these newly detected earthquakes to further the investigation of fault geometry at the surface and how it relates to fault structure at depth, the rheology of the crust, and the spatiotemporal migration patterns within the aftershock distribution.

  9. Support Vector Machine for mechanical faults classification

    Institute of Scientific and Technical Information of China (English)

    JIANG Zhi-qiang; FU Han-guang; LI Ling-jun

    2005-01-01

    Support Vector Machine (SVM) is a machine learning algorithm based on Statistical Learning Theory (SLT) which can achieve good classification with only a few learning samples. SVM represents a new approach to pattern classification and has been shown to be particularly successful in many fields such as image identification and face recognition. It also provides a new way to develop intelligent fault diagnosis. This paper presents an SVM-based approach for fault diagnosis of rolling bearings. Experiments with vibration signals of bearings were conducted. The vibration signals acquired from the bearings were used directly in the calculation, without preprocessing to extract features. Compared with the Artificial Neural Network (ANN) based method, the SVM-based method has desirable advantages. A multi-fault SVM classifier based on binary classifiers is also constructed for gear faults in this paper. Further experiments with gear fault samples showed that the multi-fault SVM classifier has good classification ability and high efficiency for mechanical systems. It is suitable for online diagnosis of mechanical systems.

  10. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems. .

  11. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems nowadays to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to the technical progress, the fragmentation of marketing, the demand for customized products and services on one side, and the need to achieve constructive dialogue with customers, immediate and flexible response and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to improve substantially.

  12. Fault Locating, Prediction and Protection (FLPPS)

    Energy Technology Data Exchange (ETDEWEB)

    Yinger, Robert, J.; Venkata, S., S.; Centeno, Virgilio

    2010-09-30

    One of the main objectives of this DOE-sponsored project was to reduce customer outage time. Fault location, prediction, and protection are the most important aspects of fault management for the reduction of outage time. In the past most of the research and development on power system faults in these areas has focused on transmission systems, and it is not until recently with deregulation and competition that research on power system faults has begun to focus on the unique aspects of distribution systems. This project was planned with three Phases, approximately one year per phase. The first phase of the project involved an assessment of the state-of-the-art in fault location, prediction, and detection as well as the design, lab testing, and field installation of the advanced protection system on the SCE Circuit of the Future located north of San Bernardino, CA. The new feeder automation scheme, with vacuum fault interrupters, will limit the number of customers affected by the fault. Depending on the fault location, the substation breaker might not even trip. Through the use of fast communications (fiber) the fault locations can be determined and the proper fault interrupting switches opened automatically. With knowledge of circuit loadings at the time of the fault, ties to other circuits can be closed automatically to restore all customers except the faulted section. This new automation scheme limits outage time and increases reliability for customers. The second phase of the project involved the selection, modeling, testing and installation of a fault current limiter on the Circuit of the Future. While this project did not pay for the installation and testing of the fault current limiter, it did perform the evaluation of the fault current limiter and its impacts on the protection system of the Circuit of the Future. After investigation of several fault current limiters, the Zenergy superconducting, saturable core fault current limiter was selected for

  14. Chaos Synchronization Based Novel Real-Time Intelligent Fault Diagnosis for Photovoltaic Systems

    Directory of Open Access Journals (Sweden)

    Chin-Tsung Hsieh

    2014-01-01

    Full Text Available The traditional solar photovoltaic fault diagnosis system needs two to three sets of sensing elements to capture fault signals as fault features, and many fault diagnosis methods cannot be applied in real time. The fault diagnosis method proposed in this study needs only one set of sensing elements to capture the fault features of the system, which can be diagnosed in real time by creating the fault data of only one set of sensors. These two points reduce the cost and the fault diagnosis time, and they simplify the construction of the otherwise huge database. This study used Matlab to simulate faults in the solar photovoltaic system. The maximum power point tracker (MPPT) is used to keep a stable power supply to the system when the system has faults. The characteristic signal of the system fault voltage is captured and recorded, and the dynamic error of the fault voltage signal is extracted by chaos synchronization. Extension engineering is then used to implement the fault diagnosis. Finally, the overall fault diagnosis system only needs to capture the voltage signal of the solar photovoltaic system, and the fault type can be diagnosed instantly.

  15. DAME: planetary-prototype drilling automation.

    Science.gov (United States)

    Glass, B; Cannon, H; Branson, M; Hanagud, S; Paulsen, G

    2008-06-01

    We describe results from the Drilling Automation for Mars Exploration (DAME) project, including those of the summer 2006 tests from an Arctic analog site. The drill hardware is a hardened, evolved version of the Advanced Deep Drill by Honeybee Robotics. DAME has developed diagnostic and executive software for hands-off surface operations of the evolved version of this drill. The DAME drill automation tested from 2004 through 2006 included adaptively controlled drilling operations and the downhole diagnosis of drilling faults. It also included dynamic recovery capabilities when unexpected failures or drilling conditions were discovered. DAME has developed and tested drill automation software and hardware under stressful operating conditions during its Arctic field testing campaigns at a Mars analog site.

  16. NASA Space Flight Vehicle Fault Isolation Challenges

    Science.gov (United States)

    Bramon, Christopher; Inman, Sharon K.; Neeley, James R.; Jones, James V.; Tuttle, Loraine

    2016-01-01

    The Space Launch System (SLS) is the new NASA heavy lift launch vehicle and is scheduled for its first mission in 2017. The goal of the first mission, which will be uncrewed, is to demonstrate the integrated system performance of the SLS rocket and spacecraft before a crewed flight in 2021. SLS has many of the same logistics challenges as any other large scale program. Common logistics concerns for SLS include integration of discrete programs geographically separated, multiple prime contractors with distinct and different goals, schedule pressures and funding constraints. However, SLS also faces unique challenges. The new program is a confluence of new hardware and heritage, with heritage hardware constituting seventy-five percent of the program. This unique approach to design makes logistics concerns such as testability of the integrated flight vehicle especially problematic. The cost of fully automated diagnostics can be completely justified for a large fleet, but not so for a single flight vehicle. Fault detection is mandatory to assure the vehicle is capable of a safe launch, but fault isolation is another issue. SLS has considered various methods for fault isolation which can provide a reasonable balance between adequacy, timeliness and cost. This paper will address the analyses and decisions the NASA Logistics engineers are making to mitigate risk while providing a reasonable testability solution for fault isolation.

  17. FAULT DETECTION AND LOCALIZATION IN MOTORCYCLES BASED ON THE CHAIN CODE OF PSEUDOSPECTRA AND ACOUSTIC SIGNALS

    Directory of Open Access Journals (Sweden)

    B. S. Anami

    2013-06-01

    Full Text Available Vehicles produce sound signals with varying temporal and spectral properties under different working conditions. These sounds are indicative of the condition of the engine. Fault diagnosis is a significantly difficult task in geographically remote places where expertise is scarce. Automated fault diagnosis can assist riders to assess the health condition of their vehicles. This paper presents a method for fault detection and location in motorcycles based on the chain code of the pseudospectra and Mel-frequency cepstral coefficient (MFCC) features of acoustic signals. The work comprises two stages: fault detection and fault location. The fault detection stage uses the chain code of the pseudospectrum as a feature vector. If the motorcycle is identified as faulty, the MFCCs of the same sample are computed and used as features for fault location. Both stages employ dynamic time warping for the classification of faults. Five types of faults in motorcycles are considered in this work. Observed classification rates are over 90% for the fault detection stage and over 94% for the fault location stage. The work identifies other interesting applications in the development of acoustic fingerprints for fault diagnosis of machinery, tuning of musical instruments, medical diagnosis, etc.
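
    A minimal sketch of the second (fault-location) stage described above: MFCC sequences compared by dynamic time warping against per-fault reference recordings. The chain-code/pseudospectrum detection stage is not reproduced; the .wav file names and the template dictionary are hypothetical, and DTW is implemented directly rather than taken from the paper.

```python
# MFCC feature extraction plus DTW template matching (illustrative only).
import numpy as np
import librosa

def mfcc_features(wav_path, sr=22050, n_mfcc=13):
    """Load an acoustic recording and return its MFCC sequence (frames x coefficients)."""
    y, sr = librosa.load(wav_path, sr=sr)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping with Euclidean local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical reference templates, one recording per known fault class.
templates = {
    "healthy": mfcc_features("healthy_idle.wav"),
    "valve_clearance": mfcc_features("valve_clearance.wav"),
    "chain_loose": mfcc_features("chain_loose.wav"),
}

query = mfcc_features("test_sample.wav")
best = min(templates, key=lambda k: dtw_distance(query, templates[k]))
print("Nearest template (candidate diagnosis):", best)
```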

  18. Detection of Staphylococcus aureus enterotoxin production genes from patient samples using an automated extraction platform and multiplex real-time PCR.

    Science.gov (United States)

    Chiefari, Amy K; Perry, Michael J; Kelly-Cirino, Cassandra; Egan, Christina T

    2015-12-01

    To minimize specimen volume, handling and testing time, we have developed two TaqMan® multiplex real-time PCR (rtPCR) assays to detect staphylococcal enterotoxins A-E and Toxic Shock Syndrome Toxin production genes directly from clinical patient stool specimens, utilizing a novel lysis extraction process in parallel with the Roche MagNA Pure Compact. These assays are specific, sensitive and reliable for the detection of the staphylococcal enterotoxin encoding genes and the tst1 gene from known toxin producing strains of Staphylococcus aureus. Specificity was determined by testing a total of 47 microorganism strains, including 8 previously characterized staphylococcal enterotoxin producing strains, against each rtPCR target. Sensitivity for these assays ranges from 1 to 25 cfu per rtPCR reaction for cultured isolates and 8-20 cfu per rtPCR for the clinical stool matrix.

  19. Automated growth of metal-organic framework coatings on flow-through functional supports.

    Science.gov (United States)

    Maya, F; Palomino Cabello, C; Clavijo, S; Estela, J M; Cerdà, V; Turnes Palomino, G

    2015-05-11

    A fully automated method for the controlled growth of metal-organic framework coatings on flow-through functional supports is reported. The obtained hybrid flow-through supports show high performance for the automated extraction of water pollutants.

  20. Algorithms Could Automate Cancer Diagnosis

    Science.gov (United States)

    Baky, A. A.; Winkler, D. G.

    1982-01-01

    Five new algorithms form a complete statistical procedure for quantifying cell abnormalities from digitized images. The procedure could be the basis for automated detection and diagnosis of cancer. The objective of the procedure is to assign each cell an atypia status index (ASI), which quantifies its level of abnormality. It is possible that ASI values will be accurate and economical enough to allow diagnoses to be made quickly and accurately by computer processing of laboratory specimens extracted from patients.

  1. Simultaneous-Fault Diagnosis of Gas Turbine Generator Systems Using a Pairwise-Coupled Probabilistic Classifier

    Directory of Open Access Journals (Sweden)

    Zhixin Yang

    2013-01-01

    Full Text Available A reliable fault diagnostic system for a gas turbine generator system (GTGS), which is complicated and inherently subject to many types of component faults, is essential to avoid the interruption of electricity supply. However, the GTGS diagnosis faces challenges in terms of the existence of simultaneous-fault diagnosis and the high cost of acquiring the exponentially increased simultaneous-fault vibration signals for constructing the diagnostic system. This research proposes a new diagnostic framework combining feature extraction, a pairwise-coupled probabilistic classifier, and decision threshold optimization. The feature extraction module adopts wavelet packet transform and time-domain statistical features to extract vibration signal features. Kernel principal component analysis is then applied to further reduce the redundant features. The features of single faults in a simultaneous-fault pattern are extracted and then detected using a probabilistic classifier, namely, the pairwise-coupled relevance vector machine, which is trained with single-fault patterns only. Therefore, the training dataset of simultaneous-fault patterns is unnecessary. To optimize the decision threshold, this research proposes to use the grid search method, which can ensure a global solution as compared with traditional computational intelligence techniques. Experimental results show that the proposed framework performs well for both single-fault and simultaneous-fault diagnosis and is superior to frameworks without feature extraction and pairwise coupling.
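
    A hedged sketch of the feature-extraction module only, under the assumption of generic vibration segments: wavelet packet energies plus a few time-domain statistics, followed by kernel PCA. The pairwise-coupled relevance vector machine and the grid-searched decision threshold are not reproduced here; the wavelet, decomposition level and kernel parameters are assumptions.

```python
# Wavelet packet energies + time-domain statistics, then kernel PCA (sketch).
import numpy as np
import pywt
from scipy.stats import kurtosis, skew
from sklearn.decomposition import KernelPCA

def wpt_energy_features(signal, wavelet="db4", level=3):
    """Relative energy of each terminal node of a wavelet packet decomposition."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    energies = np.array([np.sum(np.square(n.data)) for n in nodes])
    return energies / (np.sum(energies) + 1e-12)

def time_domain_features(signal):
    """A few common time-domain statistics used in vibration diagnostics."""
    rms = np.sqrt(np.mean(signal ** 2))
    return np.array([rms,
                     np.max(np.abs(signal)) / (rms + 1e-12),  # crest factor
                     kurtosis(signal),
                     skew(signal)])

def feature_vector(signal):
    return np.concatenate([wpt_energy_features(signal), time_domain_features(signal)])

# Assumed training set: rows of raw vibration segments (100 segments x 2048 samples).
rng = np.random.default_rng(0)
segments = rng.normal(size=(100, 2048))
X = np.vstack([feature_vector(s) for s in segments])

# Kernel PCA to reduce redundancy in the combined feature set.
kpca = KernelPCA(n_components=8, kernel="rbf", gamma=0.1)
X_reduced = kpca.fit_transform(X)
print(X_reduced.shape)
```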

  2. Automated liquid-liquid extraction based on 96-well plate format in conjunction with ultra-performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS) for the quantitation of methoxsalen in human plasma.

    Science.gov (United States)

    Yadav, Manish; Contractor, Pritesh; Upadhyay, Vivek; Gupta, Ajay; Guttikar, Swati; Singhal, Puran; Goswami, Sailendra; Shrivastav, Pranav S

    2008-09-01

    A sensitive, specific and high throughput bioanalytical method using automated sample processing via 96-well plate liquid-liquid extraction and ultra-performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS) has been developed for the determination of methoxsalen in human plasma. Plasma samples with ketoconazole as internal standard (IS) were prepared by employing 0.2 mL human plasma in ethyl acetate:dichloromethane (80:20, v/v). The chromatographic separation was achieved on a Waters Acquity UPLC BEH C18 column using isocratic mobile phase, consisting of 10 mM ammonium formate and acetonitrile (60:40, v/v), at a flow rate of 0.5 mL/min. The linear dynamic range was established over the concentration range 1.1-213.1 ng/mL for methoxsalen. The method was rugged and rapid with a total run time of 1.5 min. It was successfully applied to a pivotal bioequivalence study in 12 healthy human subjects after oral administration of 10 mg extended release methoxsalen formulation under fasting condition.

  3. Determination of propoxur in environmental samples by automated solid-phase extraction followed by flow-injection analysis with tris(2,2'-bipyridyl)ruthenium(II) chemiluminescence detection

    Energy Technology Data Exchange (ETDEWEB)

    Perez-Ruiz, Tomas [Department of Analytical Chemistry, Faculty of Chemistry, University of Murcia, 30071 Murcia (Spain)]. E-mail: tpr@um.es; Martinez-Lozano, Carmen [Department of Analytical Chemistry, Faculty of Chemistry, University of Murcia, 30071 Murcia (Spain); Garcia, Maria Dolores [Department of Analytical Chemistry, Faculty of Chemistry, University of Murcia, 30071 Murcia (Spain)

    2007-02-19

    A sensitive method for the analysis of propoxur in environmental samples has been developed. It involves an automated solid-phase extraction (SPE) procedure using a Gilson Aspec XLi and flow-injection analysis (FI) with chemiluminescence (CL) detection. The FI-CL system relies on the photolysis of propoxur by irradiation using a low-pressure mercury lamp (main spectral line 254 nm). The resultant methylamine is subsequently detected by CL using tris(2,2'-bipyridyl)ruthenium(III), which is generated on-line by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. The linear concentration range of application was 0.05-5 µg mL−1 of propoxur, with a detection limit of 5 ng mL−1. The repeatability was 0.82% expressed as relative standard deviation (n = 10) and the reproducibility, studied on 5 consecutive days, was 2.1%. The sample throughput was 160 injections per hour. Propoxur residues below ng mL−1 levels could be determined in environmental water samples when an SPE preconcentration device was coupled on-line with the FI system. This SPE-FI-CL arrangement provides a detection limit as low as 5 ng L−1 using only 500 mL of sample. In the analysis of fruits and vegetables, the detection limit was about 10 µg kg−1.

  4. Determination of propoxur in environmental samples by automated solid-phase extraction followed by flow-injection analysis with tris(2,2'-bipyridyl)ruthenium(II) chemiluminescence detection.

    Science.gov (United States)

    Pérez-Ruiz, Tomás; Martínez-Lozano, Carmen; García, María Dolores

    2007-02-19

    A sensitive method for the analysis of propoxur in environmental samples has been developed. It involves an automated solid-phase extraction (SPE) procedure using a Gilson Aspec XLi and flow-injection analysis (FI) with chemiluminescence (CL) detection. The FI-CL system relies on the photolysis of propoxur by irradiation using a low-pressure mercury lamp (main spectral line 254 nm). The resultant methylamine is subsequently detected by CL using tris(2,2'-bipyridyl)ruthenium(III), which is on-line generated by photo-oxidation of the ruthenium(II) complex in the presence of peroxydisulfate. The linear concentration range of application was 0.05-5 microg mL(-1) of propoxur, with a detection limit of 5 ng mL(-1). The repeatability was 0.82% expressed as relative standard deviation (n=10) and the reproducibility, studied on 5 consecutive days, was 2.1%. The sample throughput was 160 injections per hour. Propoxur residues below ng mL(-1) levels could be determined in environmental water samples when an SPE preconcentration device was coupled on-line with the FI system. This SPE-FI-CL arrangement provides a detection limit as low as 5 ng L(-1) using only 500 mL of sample. In the analysis of fruits and vegetables, the detection limit was about 10 microg kg(-1).

  5. Maneuver Automation Software

    Science.gov (United States)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  6. Sub-module Short Circuit Fault Diagnosis in Modular Multilevel Converter Based on Wavelet Transform and Adaptive Neuro Fuzzy Inference System

    DEFF Research Database (Denmark)

    Liu, Hui; Loh, Poh Chiang; Blaabjerg, Frede

    2015-01-01

    for continuous operation and post-fault maintenance. In this article, a fault diagnosis technique is proposed for the short circuit fault in a modular multi-level converter sub-module using the wavelet transform and adaptive neuro fuzzy inference system. The fault features are extracted from output phase voltage...

  7. Fault kinematic and Mesozoic paleo-stress evolution of the Hoop fault complex, Barents Sea

    Science.gov (United States)

    Etchebes, Marie; Athmer, Wiebke; Stueland, Eirik; Robertson, Sarah C.; Bounaim, Aicha; Steckhan, Dirk; Hellem Boe, Trond; Brenna, Trond; Sonneland, Lars; Reidar Granli, John

    2016-04-01

    The Hoop fault complex is an extensional fault system characterized by a series of multiscale half- and full-grabens trending NNE-SSW, NE-SW and E-W, and transfer zones striking ENE-WSW. In a joint collaboration between OMV Norge and Schlumberger Stavanger Research, the tectonic history of the Hoop area was assessed. A dense fault network was extracted from 3D seismic data using a novel workflow for mapping large and complex fault systems. The characterization of the fault systems was performed by integrating observations from (1) fault plane analysis, (2) geometrical shapes and crosscutting relationships of the different fault sets, (3) time-thickness maps, and (4) by establishing the relative timing of the tectonic events on key seismic lines orthogonal to the main fault strike azimuths. At least four successive extensional tectonic events affecting the Hoop fault complex have been identified in the Mesozoic. The first tectonic event is characterized by an Upper Triassic extensional event with an E-W trending maximum horizontal paleo-stress direction (Phase 1). This event led to new accommodation space established as a set of full-grabens. The grabens were orthogonally crosscut during the Middle Jurassic by a set of NNE-SSW striking grabens and half-grabens (Phase 2). Phase 3 was inferred from a set of E-W striking reactivated normal faults sealed by the Upper Jurassic-Lower Cretaceous sequence. In the Lower Cretaceous, the general trend of the maximum horizontal paleo-stress axis of Phase 2 rotates clockwise from NNE-SSW to NE-SW (Phase 4). This stress rotation induced the reactivation of Phase 2 and Phase 3 normal fault sets, producing west-dipping half-grabens/tilt-block systems and transtensional fault zones. A comparison between our results and the Mesozoic regional-scale tectonic events published for the Atlantic-Arctic region agrees with our reconstructed paleo-stress history. This implies that the Hoop fault complex is the result of far-field forces

  8. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri...

  9. Method of rule extraction for fault diagnosis of a helicopter power train based on incomplete information

    Institute of Scientific and Technical Information of China (English)

    王珉; 胡茑庆; 秦国军

    2011-01-01

    In the acquisition of diagnostic knowledge for a helicopter power train, extracting fault diagnosis decision rules from incomplete information is a difficult problem. A method is therefore proposed that extracts the optimal generalized diagnostic decision rules for the power train from an incomplete diagnostic decision table, based on the maximal characteristic similar set (MCSS). The two types of unknown attribute values are analyzed, and the relation between instances in the incomplete diagnostic decision table is expressed in the form of attribute-value sets. The characteristic relation is introduced and the definition of the MCSS is given, and the generalized decision rule is described. The discernibility function matrix of the incomplete decision table is constructed with the MCSS as its unit, and, combined with the basic equivalences of propositional logic, rule extraction and reduction from the incomplete diagnostic decision table are realized. The method is applied to a fault diagnosis case based on vibration signals from a helicopter tail-reducer bearing; the application steps are given and the validity of the method is demonstrated.

  10. Efficient RT-Level Fault Diagnosis

    Institute of Scientific and Technical Information of China (English)

    Ozgur Sinanoglu; Alex Orailoglu

    2005-01-01

    Increasing IC densities necessitate diagnosis methodologies with enhanced defect locating capabilities. Yet the computational effort expended in extracting diagnostic information and the stringent storage requirements constitute major concerns due to the tremendous number of faults in typical ICs. In this paper, we propose an RT-level diagnosis methodology capable of responding to these challenges. In the proposed scheme, diagnostic information is computed on a grouped fault effect basis, enhancing both the storage and the computational aspects. The fault effect grouping criteria are identified based on a module structure analysis, improving the propagation ability of the diagnostic information through RT modules. Experimental results show that the proposed methodology provides superior speed-ups and significant diagnostic information compression at no sacrifice in diagnostic resolution, compared to the existing gate-level diagnosis approaches.

  11. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    Directory of Open Access Journals (Sweden)

    T. Fang

    2014-07-01

    Full Text Available A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 12% for standards, 4% for ambient samples) and a reasonably low limit of detection (0.31 nmol min−1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution and Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, road-side, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per air volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. However, the greater heterogeneity in the intrinsic DTT activity (per PM mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the instrument can also be used to determine oxidative potential with other acellular assays.

  12. A semi-automated system for quantifying the oxidative potential of ambient particles in aqueous extracts using the dithiothreitol (DTT) assay: results from the Southeastern Center for Air Pollution and Epidemiology (SCAPE)

    Science.gov (United States)

    Fang, T.; Verma, V.; Guo, H.; King, L. E.; Edgerton, E. S.; Weber, R. J.

    2015-01-01

    A variety of methods are used to measure the capability of particulate matter (PM) to catalytically generate reactive oxygen species (ROS) in vivo, also defined as the aerosol oxidative potential. A widely used measure of aerosol oxidative potential is the dithiothreitol (DTT) assay, which monitors the depletion of DTT (a surrogate for cellular antioxidants) as catalyzed by the redox-active species in PM. However, a major constraint in the routine use of the DTT assay for integrating it with large-scale health studies is its labor-intensive and time-consuming protocol. To specifically address this concern, we have developed a semi-automated system for quantifying the oxidative potential of aerosol liquid extracts using the DTT assay. The system, capable of unattended analysis at one sample per hour, has a high analytical precision (coefficient of variation of 15% for positive control, 4% for ambient samples) and reasonably low limit of detection (0.31 nmol min-1). Comparison of the automated approach with the manual method conducted on ambient samples yielded good agreement (slope = 1.08 ± 0.12, r2 = 0.92, N = 9). The system was utilized for the Southeastern Center for Air Pollution & Epidemiology (SCAPE) to generate an extensive data set on DTT activity of ambient particles collected from contrasting environments (urban, roadside, and rural) in the southeastern US. We find that water-soluble PM2.5 DTT activity on a per-air-volume basis was spatially uniform and often well correlated with PM2.5 mass (r = 0.49 to 0.88), suggesting regional sources contributing to the PM oxidative potential in the southeastern US. The correlation may also suggest a mechanistic explanation (oxidative stress) for observed PM2.5 mass-health associations. The heterogeneity in the intrinsic DTT activity (per-PM-mass basis) across seasons indicates variability in the DTT activity associated with aerosols from sources that vary with season. Although developed for the DTT assay, the

  13. Fault-tolerant design

    CERN Document Server

    Dubrova, Elena

    2013-01-01

    This textbook serves as an introduction to fault tolerance, intended for upper-division undergraduate students, graduate-level students and practicing engineers in need of an overview of the field. Readers will develop skills in modeling and evaluating fault-tolerant architectures in terms of reliability, availability and safety. They will gain a thorough understanding of fault-tolerant computers, including both the theory of how to design and evaluate them and the practical knowledge of achieving fault tolerance in electronic, communication and software systems. Coverage includes fault-tolerance techniques through hardware, software, information and time redundancy. The content is designed to be highly accessible, including numerous examples and exercises. Solutions and PowerPoint slides are available for instructors. · Provides textbook coverage of the fundamental concepts of fault-tolerance; · Describes a variety of basic techniques for achieving fault-toleran...

  14. Fault tolerant control for uncertain systems with parametric faults

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2006-01-01

    A fault tolerant control (FTC) architecture based on active fault diagnosis (AFD) and the YJBK (Youla, Jabr, Bongiorno and Kucera) parameterization is applied in this paper. Based on the FTC architecture, fault tolerant control of uncertain systems with slowly varying parametric faults...

  15. Fault isolability conditions for linear systems with additive faults

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Stoustrup, Jakob

    2006-01-01

    In this paper, we shall show that an unlimited number of additive single faults can be isolated under mild conditions if a general isolation scheme is applied. Multiple faults are also covered. The approach is algebraic and is based on a set representation of faults, where all faults within a set...

  16. Fault Monitoring and Fault Recovery Control for Position Moored Tanker

    DEFF Research Database (Denmark)

    Fang, Shaoji; Blanke, Mogens

    2011-01-01

    This paper addresses fault tolerant control for position mooring of a shuttle tanker operating in the North Sea. A complete framework for fault diagnosis is presented, but the loss of a sub-sea mooring line buoyancy element is given particular attention, since this fault could lead to mooring line breakage. ... Properties of detection and fault-tolerant control are demonstrated by high-fidelity simulations.

  17. Development of Asset Fault Signatures for Prognostic and Health Management in the Nuclear Industry

    Energy Technology Data Exchange (ETDEWEB)

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2014-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: Diagnostic Advisor, Asset Fault Signature (AFS) Database, Remaining Useful Life Advisor, and Remaining Useful Life Database. This paper focuses on development of asset fault signatures to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features based on technical examinations that can be used to detect a specific fault type. At the most basic level, fault signatures are comprised of an asset type, a fault type, and a set of one or more fault features (symptoms) that are indicative of the specified fault. The AFS Database is populated with asset fault signatures via a content development exercise that is based on the results of intensive technical research and on the knowledge and experience of technical experts. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
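
    A minimal illustration of the asset-fault-signature idea described above (asset type, fault type, and a set of fault features). The field names and example entries are hypothetical and do not reflect the actual FW-PHM database schema.

```python
# Illustrative asset fault signature data structure with a simple symptom-match score.
from dataclasses import dataclass, field
from typing import List, Set

@dataclass(frozen=True)
class FaultFeature:
    name: str            # e.g. "exhaust temperature rise"
    source: str          # e.g. "thermocouple", "oil analysis"

@dataclass
class AssetFaultSignature:
    asset_type: str      # e.g. "emergency diesel generator"
    fault_type: str      # e.g. "turbocharger fouling"
    features: List[FaultFeature] = field(default_factory=list)

    def matches(self, observed: Set[str]) -> float:
        """Fraction of the signature's symptoms present in the observed symptom set."""
        if not self.features:
            return 0.0
        hits = sum(1 for f in self.features if f.name in observed)
        return hits / len(self.features)

sig = AssetFaultSignature(
    asset_type="emergency diesel generator",
    fault_type="turbocharger fouling",
    features=[FaultFeature("exhaust temperature rise", "thermocouple"),
              FaultFeature("boost pressure drop", "pressure transducer")])
print(sig.matches({"exhaust temperature rise"}))  # 0.5
```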

  18. Latest Progress of Fault Detection and Localization in Complex Electrical Engineering

    Science.gov (United States)

    Zhao, Zheng; Wang, Can; Zhang, Yagang; Sun, Yi

    2014-01-01

    In research on complex electrical engineering, efficient fault detection and localization schemes are essential to quickly detect and locate faults so that appropriate and timely corrective mitigating and maintenance actions can be taken. In this paper, under the current measurement precision of PMU, we put forward a new type of fault detection and localization technology based on fault factor feature extraction. Extensive simulation experiments indicate that, even under the disturbance of white Gaussian stochastic noise, the fault detection and localization results based on the fault factor feature extraction principle remain accurate and reliable, which also shows that the fault detection and localization technology has strong anti-interference ability and great redundancy.

  19. Wavelet Transform and Neural Networks in Fault Diagnosis of a Motor Rotor

    Institute of Scientific and Technical Information of China (English)

    RONG Ming-xing

    2012-01-01

    In motor fault diagnosis, detection of vibration and of stator current frequency components are the two main means. This article discusses the signal detection method based on vibration faults. Because the motor vibration signal is a non-stationary random signal, fault signals often contain many time-varying, burst components. Traditional Fourier signal analysis cannot effectively extract the motor fault characteristics, and may even treat weak signals that are rich in failure information as noise. Therefore, we introduce the wavelet packet transform to extract the fault characteristic information of the signal. The obtained result was used as the neural network input signal, the L-M neural network optimization method was used for training, and the BP network was then used for fault recognition. This paper uses Matlab software to simulate and confirm the validity and accuracy of the motor fault diagnosis method.
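
    An illustrative sketch of the same idea with substitutions: wavelet packet energy features fed to a small neural network. scikit-learn's MLPClassifier (L-BFGS training) stands in for the L-M optimized BP network of the abstract, and the vibration data below are synthetic.

```python
# Wavelet packet energy features + small neural network classifier (synthetic data).
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def wp_energies(signal, wavelet="db4", level=3):
    """Normalized node energies of a level-3 wavelet packet decomposition."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    e = np.array([np.sum(n.data ** 2) for n in wp.get_level(level, order="freq")])
    return e / e.sum()

# Synthetic stand-in data: class 0 = broadband noise, class 1 = noise + periodic impulses.
rng = np.random.default_rng(1)
signals, labels = [], []
for k in range(200):
    s = rng.normal(size=1024)
    if k % 2:
        s[::64] += 5.0        # crude fault-like impulses
    signals.append(s)
    labels.append(k % 2)

X = np.vstack([wp_energies(s) for s in signals])
Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), solver="lbfgs", max_iter=2000, random_state=0)
clf.fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))
```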

  20. Fault Analysis in Cryptography

    CERN Document Server

    Joye, Marc

    2012-01-01

    In the 1970s researchers noticed that radioactive particles produced by elements naturally present in packaging material could cause bits to flip in sensitive areas of electronic chips. Research into the effect of cosmic rays on semiconductors, an area of particular interest in the aerospace industry, led to methods of hardening electronic devices designed for harsh environments. Ultimately various mechanisms for fault creation and propagation were discovered, and in particular it was noted that many cryptographic algorithms succumb to so-called fault attacks. Preventing fault attacks without

  1. Layered clustering multi-fault diagnosis for hydraulic piston pump

    Science.gov (United States)

    Du, Jun; Wang, Shaoping; Zhang, Haiyan

    2013-04-01

    Efficient diagnosis is very important for improving the reliability and performance of an aircraft hydraulic piston pump, and it is one of the key technologies in prognostic and health management systems. In practice, due to the harsh working environment and heavy working loads, multiple faults of an aircraft hydraulic pump may occur simultaneously after long-time operation. However, most existing diagnosis methods can only distinguish pump faults that occur individually. Therefore, a new method needs to be developed to realize effective diagnosis of simultaneous multiple faults on an aircraft hydraulic pump. In this paper, a new method based on a layered clustering algorithm is proposed to diagnose multiple faults of an aircraft hydraulic pump that occur simultaneously. Intensive failure mechanism analyses of the five main types of faults are carried out, and based on these analyses the optimal combination and layout of diagnostic sensors is attained. The three-layered diagnosis reasoning engine is designed according to the faults' risk priority number and the characteristics of different fault feature extraction methods. The most serious failures are first distinguished with individual signal processing. For the less distinct faults, i.e., swash plate eccentricity and incremental clearance increases between piston and slipper, a clustering diagnosis algorithm based on the statistical average relative power difference (ARPD) is proposed. By effectively enhancing the fault features of these two faults, the ARPDs calculated from vibration signals are employed to complete the hypothesis testing. The ARPDs of the different faults follow different probability distributions. Compared with the classical fast Fourier transform-based spectrum diagnosis method, the experimental results demonstrate that the proposed algorithm can diagnose the multiple faults, which occur simultaneously, with higher precision and reliability.
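
    The abstract does not give the exact ARPD formula, so the sketch below interprets it loosely as the relative difference in band power between a test signal and a healthy baseline, averaged over frequency bands. The bands, sampling rate and signals are assumptions, not the paper's definition.

```python
# Hedged "average relative power difference" style indicator (assumed formulation).
import numpy as np
from scipy.signal import welch

def band_powers(x, fs, bands):
    """Approximate power in each frequency band from a Welch PSD estimate."""
    f, pxx = welch(x, fs=fs, nperseg=1024)
    return np.array([np.sum(pxx[(f >= lo) & (f < hi)]) for lo, hi in bands])

def arpd(test, baseline, fs=10_000, bands=((0, 500), (500, 1000), (1000, 2000))):
    """Relative band-power difference of a test signal versus a healthy baseline, averaged over bands."""
    p_t = band_powers(test, fs, bands)
    p_b = band_powers(baseline, fs, bands)
    return np.mean((p_t - p_b) / (p_b + 1e-12))

rng = np.random.default_rng(2)
baseline = rng.normal(size=20_000)
faulty = baseline + 0.5 * np.sin(2 * np.pi * 750 * np.arange(20_000) / 10_000)
print("ARPD (faulty vs baseline):", arpd(faulty, baseline))
```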

  2. Manufacturing and automation

    Directory of Open Access Journals (Sweden)

    Ernesto Córdoba Nieto

    2010-04-01

    Full Text Available The article presents concepts and definitions from different sources concerning automation. The work approaches automation by virtue of the author’s experience in manufacturing production; why and how automation projects are embarked upon is considered. Technological reflection regarding the progressive advances or stages of automation in the production area is stressed. Coriat and Freyssenet’s thoughts about and approaches to the problem of automation and its current state are taken and examined, especially that referring to the problem’s relationship with reconciling the level of automation with the flexibility and productivity demanded by competitive, worldwide manufacturing.

  3. Fault Locating in HVDC Transmission Lines Using Generalized Regression Neural Network and Random Forest Algorithm

    Directory of Open Access Journals (Sweden)

    M. Farshad

    2013-09-01

    Full Text Available This paper presents a novel method based on machine learning strategies for fault locating in high voltage direct current (HVDC) transmission lines. In the proposed fault-location method, only post-fault voltage signals measured at one terminal are used for feature extraction. In this paper, due to the high dimension of the input feature vectors, two different estimators, including the generalized regression neural network (GRNN) and the random forest (RF) algorithm, are examined to find the relation between the features and the fault location. The results of evaluation using training and test patterns, obtained by simulating various fault types in a long overhead transmission line with different fault locations, fault resistances and pre-fault current values, have indicated the efficiency and the acceptable accuracy of the proposed approach.
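
    A sketch of the regression idea under stated assumptions: summary features of single-ended post-fault voltage windows mapped to fault distance with a random forest. A GRNN is not available in scikit-learn, so only the RF estimator is shown, and the "simulated" records below are synthetic placeholders.

```python
# Random-forest fault-distance regression from single-ended voltage features (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_cases, n_samples = 400, 256

# Placeholder for simulated post-fault voltage windows and true fault distances (km).
distance_km = rng.uniform(0, 900, size=n_cases)
voltages = np.array([np.exp(-np.arange(n_samples) / (20 + d / 10)) +
                     0.05 * rng.normal(size=n_samples) for d in distance_km])

# Simple feature vector per case: a handful of summary statistics of the window.
X = np.column_stack([voltages.mean(axis=1), voltages.std(axis=1),
                     voltages.min(axis=1), np.abs(np.diff(voltages, axis=1)).mean(axis=1)])

Xtr, Xte, ytr, yte = train_test_split(X, distance_km, test_size=0.25, random_state=0)
rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(Xtr, ytr)
print("mean absolute error (km):", np.mean(np.abs(rf.predict(Xte) - yte)))
```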

  4. Quaternary Fault Lines

    Data.gov (United States)

    Department of Homeland Security — This data set contains locations and information on faults and associated folds in the United States that are believed to be sources of M>6 earthquakes during the...

  5. A novel approach for reliable gearbox fault diagnosis in high-speed train driving system based on nonlinear feature extraction

    Institute of Scientific and Technical Information of China (English)

    何晓琴; 常友渠

    2015-01-01

    The traction gearbox is the guarantee of the power source of a high-speed train, and gearbox fault diagnosis ensures the safe operation of the locomotive. However, the nonlinear characteristics of gear faults still require further study. Therefore, a new fault diagnosis method for the traction gearbox of high-speed trains is proposed, which uses the excellent nonlinear analysis capabilities of Ensemble Empirical Mode Decomposition (EEMD) and Local Linear Embedding (LLE) to extract the key features of gear vibration signals, and uses a Support Vector Machine (SVM) to achieve reliable diagnosis of multiple gearbox faults. Experiments were carried out on a gearbox fault test bed; the results show that the proposed method can effectively detect gear wear, cracks and broken teeth, and that its diagnosis rate is 5% higher than that of existing methods (such as linear feature extraction methods), which verifies that the new method is effective and promising for application in engineering practice.
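
    A hedged sketch of the EEMD-LLE-SVM chain using the PyEMD (EMD-signal) and scikit-learn packages: IMF energies as raw features, locally linear embedding for nonlinear reduction, then an SVM. The signals, class structure and parameters are assumptions, not the paper's data.

```python
# EEMD-based IMF energy features, LLE embedding, SVM classification (synthetic signals).
import numpy as np
from PyEMD import EEMD
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def imf_energy_features(signal, max_imf=6):
    """Normalized energies of the first few EEMD intrinsic mode functions."""
    imfs = EEMD(trials=20).eemd(signal, max_imf=max_imf)
    e = np.array([np.sum(imf ** 2) for imf in imfs])
    e = np.pad(e, (0, max(0, max_imf + 1 - len(e))))[: max_imf + 1]
    return e / (e.sum() + 1e-12)

rng = np.random.default_rng(4)
signals, labels = [], []
for k in range(60):                      # small synthetic set with 3 "fault" classes
    t = np.arange(1024) / 1024
    s = rng.normal(scale=0.3, size=1024) + np.sin(2 * np.pi * (50 + 30 * (k % 3)) * t)
    signals.append(s)
    labels.append(k % 3)

X = np.vstack([imf_energy_features(s) for s in signals])
X_embedded = LocallyLinearEmbedding(n_neighbors=10, n_components=3).fit_transform(X)

Xtr, Xte, ytr, yte = train_test_split(X_embedded, labels, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10.0).fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))
```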

  6. Fault Length Vs Fault Displacement Evaluation In The Case Of Cerro Prieto Pull-Apart Basin (Baja California, Mexico) Subsidence

    Science.gov (United States)

    Glowacka, E.; Sarychikhina, O.; Nava Pichardo, F. A.; Farfan, F.; Garcia Arthur, M. A.; Orozco, L.; Brassea, J.

    2013-05-01

    The Cerro Prieto pull-apart basin is located in the southern part of the San Andreas Fault system, and is characterized by high seismicity, recent volcanism, tectonic deformation and hydrothermal activity (Lomnitz et al, 1970; Elders et al., 1984; Suárez-Vidal et al., 2008). Since the Cerro Prieto geothermal field production started, in 1973, a significant subsidence increase was observed (Glowacka and Nava, 1996, Glowacka et al., 1999), and a relation between fluid extraction rate and subsidence rate has been suggested (op. cit.). Analysis of existing deformation data (Glowacka et al., 1999, 2005, Sarychikhina 2011) points to the fact that, although the extraction changes influence the subsidence rate, the tectonic faults control the spatial extent of the observed subsidence. Tectonic faults act as water barriers in the direction perpendicular to the fault, and/or separate regions with different compaction, and as an effect a significant part of the subsidence is released as vertical displacement on the ground surface along fault ruptures. These fault ruptures cause damage to roads and irrigation canals, and water leakage. Since 1996, a network of geotechnical instruments has operated in the Mexicali Valley, for continuous recording of deformation phenomena. To date, the network (REDECVAM: Mexicali Valley Crustal Strain Measurement Array) includes two crackmeters and eight tiltmeters installed on, or very close to, the main faults; all instruments have sampling intervals in the 1 to 20 minutes range. Additionally, there are benchmarks for measuring vertical fault displacements for which readings are recorded every 3 months. Since the crackmeter measures vertical displacement on the fault at one place only, the question arises: can we use the crackmeter data to evaluate the length of the fractured fault and how quickly it grows, so that we know where to expect fractures in the canals or roads? We used the Wells and Coppersmith (1994) relations between

  7. A Systematic Methodology for Gearbox Health Assessment and Fault Classification

    Directory of Open Access Journals (Sweden)

    Jay Lee

    2011-01-01

    Full Text Available A systematic methodology for gearbox health assessment and fault classification is developed and evaluated for 560 data sets of gearbox vibration data provided by the Prognostics and Health Management Society for the 2009 data challenge competition. A comprehensive set of signal processing and feature extraction methods is used to extract over 200 features, including features extracted from the raw time signal, time synchronous signal, wavelet decomposition signal, frequency domain spectrum, envelope spectrum, among others. A regime segmentation approach using the tachometer signal, a spectrum similarity metric, and gear mesh frequency peak information is used to segment the data by gear type, input shaft speed, and braking torque load. A health assessment method that finds the minimum feature vector sum in each regime is used to classify and find the 80 baseline healthy data sets. A fault diagnosis method based on a distance calculation from normal, along with specific features correlated to different fault signatures, is used to diagnose specific faults. The fault diagnosis method is evaluated for the diagnosis of a gear tooth breakage, input shaft imbalance, bent shaft, bearing inner race defect, and bad key, and the method could be further extended for other faults as long as a set of features can be correlated with a known fault signature. Future work looks to further refine the distance calculation algorithm for fault diagnosis, as well as to further evaluate other signal processing methods such as empirical mode decomposition to see if an improved set of features can be used to improve the fault diagnosis accuracy.
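
    A hedged sketch of the health-assessment step only: records with the smallest scaled feature-vector sum are taken as the healthy baseline, and every record is then scored by a Mahalanobis-style distance from that baseline. The feature matrix below is synthetic, not the competition data.

```python
# Baseline selection by minimum feature-vector sum, then distance-from-normal scoring.
import numpy as np
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
features = rng.normal(size=(560, 20))                       # placeholder: 560 records x 20 features
features[80:] += rng.uniform(0.5, 2.0, size=(480, 20))      # non-baseline records shifted away from normal

scaled = StandardScaler().fit_transform(features)
scores = np.abs(scaled).sum(axis=1)                         # feature-vector sum per record
baseline_idx = np.argsort(scores)[:80]                      # 80 presumed-healthy records

mu = scaled[baseline_idx].mean(axis=0)
cov_inv = np.linalg.pinv(np.cov(scaled[baseline_idx], rowvar=False))

def distance_from_normal(x):
    """Mahalanobis-style distance of a feature vector from the baseline population."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

health_index = np.array([distance_from_normal(x) for x in scaled])
print("median distance, baseline vs rest:",
      np.median(health_index[baseline_idx]), np.median(np.delete(health_index, baseline_idx)))
```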

  8. On-line solid-phase extraction coupled with high-performance liquid chromatography and tandem mass spectrometry (SPE-HPLC-MS-MS) for quantification of bromazepam in human plasma: an automated method for bioequivalence studies.

    Science.gov (United States)

    Gonçalves, José Carlos Saraiva; Monteiro, Tânia Maria; Neves, Claúdia Silvana de Miranda; Gram, Karla Regina da Silva; Volpato, Nádia Maria; Silva, Vivian A; Caminha, Ricardo; Gonçalves, Maria do Rocio Bencke; Santos, Fábio Monteiro Dos; Silveira, Gabriel Estolano da; Noël, François

    2005-10-01

    A validated method for on-line solid-phase extraction coupled with high-performance liquid chromatography tandem mass spectrometry (SPE-HPLC-MS-MS) is described for the quantification of bromazepam in human plasma. The method involves a dilution of 300 µL of plasma with 100 µL of carbamazepine (2.5 ng/mL), used as internal standard, vortex-mixing, centrifugation, and injection of 100 µL of the supernate. The analytes were ionized using positive electrospray mass spectrometry then detected by multiple reaction monitoring (MRM). The m/z transitions 316→182 (bromazepam) and 237→194 (carbamazepine) were used for quantification. The calibration curve was linear from 1 ng/mL (limit of quantification) to 200 ng/mL. The retention times of bromazepam and carbamazepine were 2.6 and 3.2 minutes, respectively. The intraday and interday precisions were 3.43%-15.45% and 5.2%-17%, respectively. The intraday and interday accuracy was 94.00%-103.94%. This new automated method has been successfully applied in a bioequivalence study of 2 tablet formulations of 6 mg bromazepam: Lexotan® from Produtos Roche Químicos e Farmacêuticos SA, Rio de Janeiro, Brazil (reference) and a test formulation from Laboratórios Biosintética Ltda, São Paulo, Brazil. Because the 90% CI of geometric mean ratios between reference and test were completely included in the 80%-125% interval, the 2 formulations were considered bioequivalent. The comparison of different experimental conditions for establishing a dissolution profile in vitro, along with our bioavailability data, further allowed us to propose rationally based experimental conditions for a dissolution test of bromazepam tablets, which currently lacks a pharmacopeial monograph.

  9. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  10. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  11. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  12. Development of an automated scoring system for plant comet assay

    Directory of Open Access Journals (Sweden)

    Bertrand Pourrut

    2015-05-01

    - nucleus density: increasing the density of nuclei is important to increase scoring reliability (Sharma et al., 2012). In conclusion, increasing plant nucleus extraction yield and automated scoring of nuclei do represent big challenges. However, our promising preliminary results open up the perspective of an automated high-throughput scoring of plant nuclei.

  13. Active Fault Isolation in MIMO Systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2014-01-01

    Active fault isolation of parametric faults in closed-loop MIMO systems is considered in this paper. The fault isolation consists of two steps. The first step is group-wise fault isolation. Here, a group of faults is isolated from other possible faults in the system. The group-wise fault iso...

  14. Bevel Gearbox Fault Diagnosis using Vibration Measurements

    Directory of Open Access Journals (Sweden)

    Hartono Dennis

    2016-01-01

    Full Text Available The use of vibration measurement analysis has been proven to be effective for gearbox fault diagnosis. However, the complexity of vibration signals observed from a gearbox makes it difficult to accurately detect faults in the gearbox. This work is based on a comparative study of several time-frequency signal processing methods that can be used to extract information from transient vibration signals containing useful diagnostic information. Experiments were performed on a bevel gearbox test rig using vibration measurements obtained from accelerometers. Initially, the discrete wavelet transform was implemented for vibration signal analysis to extract the frequency content of the signal from the relevant frequency region. Several time-frequency signal processing methods were then incorporated to extract the fault features of vibration signals, and their diagnostic performances were compared. It was shown that the Short Time Fourier Transform (STFT) could not offer a good time resolution to detect the periodicity of the faulty gear tooth due to the difficulty in choosing an appropriate window length to capture the impulse signal. The Continuous Wavelet Transform (CWT), on the other hand, was suitable for the detection of vibration transients generated by a localized fault in a gearbox due to its multi-scale property. However, both methods still require a thorough visual inspection. In contrast, it was shown from the experiments that the diagnostic method using cepstrum analysis could provide a direct indication of the faulty tooth without the need for the thorough visual inspection required by CWT and STFT.
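
    A minimal real-cepstrum sketch of the "direct indication" mentioned above: a cepstral peak at the quefrency equal to the faulty gear's rotation period reveals the periodic impacts. The sampling rate, shaft speed and signal are assumptions.

```python
# Real cepstrum of a synthetic gearbox-like signal with one impact per revolution.
import numpy as np

fs = 20_000                                   # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
rotation_hz = 25.0                            # assumed shaft speed of the faulty gear

signal = 0.2 * np.random.default_rng(6).normal(size=t.size)
signal[(np.arange(t.size) % int(fs / rotation_hz)) == 0] += 3.0   # periodic impacts

spectrum = np.fft.rfft(signal)
real_cepstrum = np.fft.irfft(np.log(np.abs(spectrum) + 1e-12))

quefrency = np.arange(real_cepstrum.size) / fs
search = (quefrency > 0.005) & (quefrency < 0.2)          # ignore the very low quefrencies
peak_q = quefrency[search][np.argmax(real_cepstrum[search])]
print(f"cepstral peak at {peak_q * 1e3:.1f} ms ~ {1 / peak_q:.1f} Hz impact rate")
```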

  15. Rough Faults, Distributed Weakening, and Off-Fault Deformation

    Science.gov (United States)

    Griffith, W. A.; Nielsen, S. B.; di Toro, G.; Smith, S. A.; Niemeijer, A. R.

    2009-12-01

    We report systematic spatial variations of fault rocks along non-planar strike-slip faults cross-cutting the Lake Edison Granodiorite, Sierra Nevada, California (Sierran Wavy Fault) and the Lobbia outcrops of the Adamello Batholith in the Italian Alps (Lobbia Wavy Fault). In the case of the Sierran fault, pseudotachylyte formed at contractional fault bends, where it is found as thin (1-2 mm) fault-parallel veins. Epidote and chlorite developed in the same seismic context as the pseudotachylyte and are especially abundant in extensional fault bends. We argue that the presence of fluids, as illustrated by this example, does not necessarily preclude the development of frictional melt. In the case of the Lobbia fault, pseudotachylyte is present in variable thickness along the length of the fault, but the pseudotachylyte veins thicken and pool in extensional bends. The Lobbia fault surface is self-affine, and we conduct a quantitative analysis of microcrack distribution, stress, and friction along the fault. Numerical modeling results show that opening in extensional bends and localized thermal weakening in contractional bends counteract resistance encountered by fault waviness, resulting in an overall weaker fault than suggested by the corresponding static friction coefficient. Models also predict stress redistribution around bends in the faults which mirror microcrack distributions, indicating significant elastic and anelastic strain energy is dissipated into the wall rocks due to non-planar fault geometry. Together these observations suggest that, along non-planar faults, damage and energy dissipation occurs along the entire fault during slip, rather than being confined to the region close to the crack tip as predicted by classical fracture mechanics.

  16. A FRAMEWORK FOR AUTOMATED CHANGE DETECTION SYSTEM

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    To enhance the ability of remote sensing systems to provide accurate, timely, and complete geo-spatial information at regional or global scale, an automated change detection system has been and will continue to be one of the important and challenging problems in remote sensing. In this paper, the authors propose a framework for an automated change detection system at the landscape level using various geo-spatial data sources, including multi-sensor remotely sensed imagery and ancillary data layers. In this framework, the database is the central part and some associated techniques are discussed. These techniques include five subsystems: automated feature-based image registration, automated change finding, automated change feature extraction and identification, intelligent change recognition, change accuracy assessment, and database updating and visualization.

  17. Transformer Fault Diagnosis Based on Support Vector Machines

    Institute of Scientific and Technical Information of China (English)

    刘义艳; 陈晨; 亢旭红; 巨永锋

    2011-01-01

    Due to the lack of actual typical fault samples in transformer fault diagnosis, a new fault diagnosis method based on support vector machines (SVMs) is presented. According to the method, samples of the five characteristic gases dissolved in transformer oil are pre-selected by the K-means clustering (KMC) method as feature vectors, which are input into multi-class SVMs for training; the SVMs diagnosis model is thereby established to classify fault samples. Experiments and analysis show that the KMC algorithm concentrates the fault information and effectively relieves the great time consumption of determining the model parameters. Under the condition of limited samples, the presented method can reach a high correct judgment rate and achieve the purpose of automated diagnosis of transformer faults.
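
    A sketch of one reading of this procedure: per fault class, K-means condenses the raw five-gas samples into a few cluster centres that serve as the SVM training vectors. The gas concentrations and class labels below are synthetic placeholders.

```python
# K-means condensation of dissolved-gas samples followed by multi-class SVM training.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(7)
fault_classes = ["normal", "low-energy discharge", "thermal fault"]

# Synthetic placeholder: 150 five-gas samples (H2, CH4, C2H6, C2H4, C2H2) per class.
raw = {c: rng.normal(loc=10 * (i + 1), scale=3.0, size=(150, 5))
       for i, c in enumerate(fault_classes)}

X_train, y_train = [], []
for label, samples in raw.items():
    centres = KMeans(n_clusters=5, n_init=10, random_state=0).fit(samples).cluster_centers_
    X_train.append(centres)                 # condensed, representative vectors
    y_train.extend([label] * len(centres))
X_train = np.vstack(X_train)

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
print(clf.predict(rng.normal(loc=20, scale=3.0, size=(1, 5))))   # expect the middle class
```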

  18. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  19. Transient Fault Locating Method Based on Line Voltage and Zero-mode Current in Non-solidly Earthed Network

    Institute of Scientific and Technical Information of China (English)

    ZHANG Linli; XU Bingyin; XUE Yongduan; GAO Houlei

    2012-01-01

    Non-solidly earthed systems are widely used for medium voltage distribution networks at home and abroad. Fault point location, especially for the single phase-to-earth fault, is very difficult because the fault current is very weak and the fault arc is intermittent. Although several methods have been developed, the problem of fault location has not yet been resolved very well. A new fault location method based on the transient components of line voltage and zero-mode current is presented in this paper, which can realize fault section location through the feeder automation (FA) system. The line voltage signal can be obtained conveniently without requiring any additional equipment. This method is based on transient information and is not affected by the arc suppression coil.

  20. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2016-12-22

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
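
    A compact sketch of the serial structure described above: linear PCA first, then kernel PCA on the residual subspace. The monitoring statistics are simplified to squared norms of the two feature sets, and the data are synthetic.

```python
# Serial PCA (SPCA) structure: PCA, then kernel PCA on the PCA residuals.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(8)
X = rng.normal(size=(500, 10))
X[:, 3] = X[:, 0] ** 2 + 0.1 * rng.normal(size=500)   # inject a nonlinear relation

Xs = StandardScaler().fit_transform(X)

# Step 1: linear PCA retains the linear principal components.
pca = PCA(n_components=4).fit(Xs)
T_linear = pca.transform(Xs)
residual = Xs - pca.inverse_transform(T_linear)       # residual subspace

# Step 2: kernel PCA extracts nonlinear components from the residual subspace.
kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.2).fit(residual)
T_nonlinear = kpca.transform(residual)

# Simplified monitoring indices based on both feature sets.
stat_linear = np.sum(T_linear ** 2, axis=1)
stat_nonlinear = np.sum(T_nonlinear ** 2, axis=1)
print(stat_linear[:3], stat_nonlinear[:3])
```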

  1. Detecting eccentricity faults in a PMSM in non-stationary conditions

    OpenAIRE

    Javier Rosero García; José Luis Romeral; Esteban Rosero García

    2012-01-01

    Permanent magnet alternating current machines are being widely used in applications demanding high and rugged performance, such as industrial automation and the aerospace and automotive industries. This paper presents a study of a permanent magnet synchronous machine (PMSM) running in eccentricity; these machines’ condition monitoring and fault detection would provide added value and they are also assuming growing importance. This paper investigates the effect of eccentricity faults on PMSM m...

  2. Automated Feature Extraction from Hyperspectral Imagery Project

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to NASA Topic S7.01, Visual Learning Systems, Inc. (VLS) will develop a novel hyperspectral plug-in toolkit for its award winning Feature Analyst®...

  3. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  4. The effect of mechanical discontinuities on the growth of faults

    Science.gov (United States)

    Bonini, Lorenzo; Basili, Roberto; Bonanno, Emanuele; Toscani, Giovanni; Burrato, Pierfrancesco; Seno, Silvio; Valensise, Gianluca

    2016-04-01

    The growth of natural faults is controlled by several factors, including the nature of host rocks, the strain rate, the temperature, and the presence of fluids. In this work we focus on the mechanical characteristics of host rocks, and in particular on the role played by thin mechanical discontinuities on the upward propagation of faults and on associated secondary effects such as folding and fracturing. Our approach uses scaled, analogue models where natural rocks are simulated by wet clay (kaolin). A clay cake is placed above two rigid blocks in a hanging wall/footwall configuration on either side of a planar fault. Fault activity is simulated by motor-controlled movements of the hanging wall. We reproduce three types of faults: a 45°-dipping normal fault, a 45°-dipping reverse fault and a 30°-dipping reverse fault. These angles are selected as representative of most natural dip-slip faults. The analogues of the mechanical discontinuities are obtained by precutting the wet clay cake before starting the hanging wall movement. We monitor the experiments with high-resolution cameras and then obtain most of the data through the Digital Image Correlation method (D.I.C.). This technique accurately tracks the trajectories of the particles of the analogue material during the deformation process: this allows us to extract displacement field vectors plus the strain and shear rate distributions on the lateral side of the clay block, where the growth of new faults is best seen. Initially we run a series of isotropic experiments, i.e. experiments without discontinuities, to generate a reference model: then we introduce the discontinuities. For the extensional models they are cut at different dip angles, from horizontal to 45°-dipping, both synthetic and antithetic with respect to the master fault, whereas only horizontal discontinuities are introduced in the contractional models. Our experiments show that such discontinuities control: 1) the propagation rate of faults

  5. Feature extraction for roller bearing fault diagnosis based on adaptive multi-scale morphological gradient transformation

    Institute of Scientific and Technical Information of China (English)

    李兵; 张培林; 刘东升; 米双山; 任国全

    2011-01-01

    An impulsive signal is the characteristic response of a defective roller bearing, and extracting it from a noisy vibration signal is the key step in bearing fault diagnosis. A novel method named adaptive multi-scale morphological gradient (AMMG), based on mathematical morphology, is proposed here for feature extraction from roller bearing fault signals. The AMMG technique has the advantage of suppressing noise while preserving signal detail. Compared with the envelope demodulation method and the morphological closing transformation, simulation and test results demonstrate that the proposed AMMG technique can extract impulse signals more effectively from original signals disturbed by strong background noise; moreover, the computation of AMMG is relatively simple and fast. It provides an effective way to extract features for fault diagnosis of roller bearings.

  6. Study on Fault Current of DFIG during Slight Fault Condition

    OpenAIRE

    Xiangping Kong; Zhe Zhang; Xianggen Yin; Zhenxing Li

    2013-01-01

    In order to ensure the safety of DFIG when severe fault happens, crowbar protection is adopted. But during slight fault condition, the crowbar protection will not trip, and the DFIG is still excited by AC-DC-AC converter. In this condition, operation characteristics of the converter have large influence on the fault current characteristics of DFIG. By theoretical analysis and digital simulation, the fault current characteristics of DFIG during slight voltage dips are studied. And the influenc...

  7. Fault Monitoring and Fault Recovery Control for Position Moored Tanker

    DEFF Research Database (Denmark)

    Fang, Shaoji; Blanke, Mogens

    2009-01-01

    This paper addresses fault tolerant control for position mooring of a shuttle tanker operating in the North Sea. A complete framework for fault diagnosis is presented but the loss of a sub-sea mooring line buoyancy element is given particular attention, since this fault could lead to line breakage...... algorithm is proposed to accommodate buoyancy element failure and keep the mooring system in a safe state. Detection properties and fault-tolerant control are demonstrated by high fidelity simulations...

  8. Fault Tolerant Computer Architecture

    CERN Document Server

    Sorin, Daniel

    2009-01-01

    For many years, most computer architects have pursued one primary goal: performance. Architects have translated the ever-increasing abundance of ever-faster transistors provided by Moore's law into remarkable increases in performance. Recently, however, the bounty provided by Moore's law has been accompanied by several challenges that have arisen as devices have become smaller, including a decrease in dependability due to physical faults. In this book, we focus on the dependability challenge and the fault tolerance solutions that architects are developing to overcome it. The two main purposes

  9. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.

  10. Fault tolerant control based on active fault diagnosis

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik

    2005-01-01

    An active fault diagnosis (AFD) method will be considered in this paper in connection with a Fault Tolerant Control (FTC) architecture based on the YJBK parameterization of all stabilizing controllers. The architecture consists of a fault diagnosis (FD) part and a controller reconfiguration (CR...

  11. Wind turbine fault detection and fault tolerant control

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Johnson, Kathryn

    2013-01-01

    In this updated edition of a previous wind turbine fault detection and fault tolerant control challenge, we present a more sophisticated wind turbine model and updated fault scenarios to enhance the realism of the challenge and therefore the value of the solutions. This paper describes the challe...

  12. Model-Based Fault Diagnosis: Performing Root Cause and Impact Analyses in Real Time

    Science.gov (United States)

    Figueroa, Jorge F.; Walker, Mark G.; Kapadia, Ravi; Morris, Jonathan

    2012-01-01

    Generic, object-oriented fault models, built according to causal-directed graph theory, have been integrated into an overall software architecture dedicated to monitoring and predicting the health of mission- critical systems. Processing over the generic fault models is triggered by event detection logic that is defined according to the specific functional requirements of the system and its components. Once triggered, the fault models provide an automated way for performing both upstream root cause analysis (RCA), and for predicting downstream effects or impact analysis. The methodology has been applied to integrated system health management (ISHM) implementations at NASA SSC's Rocket Engine Test Stands (RETS).
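
    A toy illustration of the upstream root-cause / downstream impact idea described above, using a causal directed graph and a simple reachability traversal in Python; the component names are invented and this is not the NASA implementation.

        from collections import defaultdict

        # Directed causal edges: cause -> effect (hypothetical test-stand style names).
        edges = [("valve_stuck", "low_fuel_flow"),
                 ("low_fuel_flow", "low_chamber_pressure"),
                 ("sensor_drift", "low_chamber_pressure"),
                 ("low_chamber_pressure", "abort")]

        downstream = defaultdict(set)
        upstream = defaultdict(set)
        for cause, effect in edges:
            downstream[cause].add(effect)
            upstream[effect].add(cause)

        def closure(start, graph):
            """All nodes reachable from `start` following the given adjacency map."""
            seen, stack = set(), [start]
            while stack:
                node = stack.pop()
                for nxt in graph[node]:
                    if nxt not in seen:
                        seen.add(nxt)
                        stack.append(nxt)
            return seen

        # Upstream traversal = root cause analysis; downstream traversal = impact analysis.
        print("root-cause candidates:", closure("low_chamber_pressure", upstream))
        print("predicted impacts:    ", closure("low_fuel_flow", downstream))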

  13. ESR dating of fault rocks

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee Kwon [Kangwon National Univ., Chuncheon (Korea, Republic of)

    2002-03-15

    Past movement on faults can be dated by measuring the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from the surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs grain size shows a plateau for grains below a critical size: these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected from the Yangsan fault system. ESR dates from this fault system range from 870 to 240 ka. The results of this research suggest that long-term cyclic fault activity continued into the Pleistocene.
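
    The dating relation described above (age equals the equivalent dose divided by the dose rate, with a plateau sought across grain-size fractions) can be illustrated with a short Python sketch; all numbers below are invented for illustration only.

        # Hypothetical ESR age calculation: age = equivalent dose / dose rate.
        # All values are invented for illustration only.
        equivalent_dose_gy = {"<25 um": 240.0, "25-45 um": 245.0, "45-75 um": 310.0}  # Gy
        dose_rate_gy_per_ka = 1.0  # Gy per thousand years (assumed)

        for fraction, de in equivalent_dose_gy.items():
            age_ka = de / dose_rate_gy_per_ka
            print(f"{fraction}: {age_ka:.0f} ka")
        # A plateau in age across the finest fractions would indicate complete
        # resetting of those grains during the last fault movement.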

  14. Fault Management Assistant (FMA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — S&K Aerospace (SKA) proposes to develop the Fault Management Assistant (FMA) to aid project managers and fault management engineers in developing better and more...

  15. Improving Multiple Fault Diagnosability using Possible Conflicts

    Data.gov (United States)

    National Aeronautics and Space Administration — Multiple fault diagnosis is a difficult problem for dynamic systems. Due to fault masking, compensation, and relative time of fault occurrence, multiple faults can...

  16. Testing Cross-Talk Induced Delay Faults in Digital Circuit Based on Transient Current Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Youren; DENG Xiaoqian; CUI Jiang; YAO Rui; ZHANG Zhai

    2006-01-01

    The delay fault induced by the cross-talk effect is one of the difficult problems in the fault diagnosis of digital circuits. An intelligent fault diagnosis method based on IDDT testing and a support vector machine (SVM) classifier is proposed in this paper. Firstly, the fault model induced by the cross-talk effect and the IDDT testing method are analyzed, and then a delay fault localization method based on SVM is presented. The fault features of the sampled signals are extracted by wavelet packet decomposition and serve as input parameters of the SVM classifier to classify the different fault types. The simulation results illustrate that the presented method is accurate and effective, reaching a diagnosis rate above 95%.
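
    A minimal sketch of the feature-extraction-plus-classification pattern described above, assuming the PyWavelets and scikit-learn packages are available; the waveforms, labels and all parameter choices are placeholders rather than the authors' setup.

        import numpy as np
        import pywt
        from sklearn.svm import SVC

        def wavelet_packet_energies(signal, wavelet="db4", level=3):
            """Energy of each terminal wavelet-packet node, used as a feature vector."""
            wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
            nodes = wp.get_level(level, order="natural")
            return np.array([np.sum(np.square(node.data)) for node in nodes])

        # Placeholder data: rows stand in for sampled transient-current (IDDT) waveforms,
        # labels encode a hypothetical fault type for each waveform.
        rng = np.random.default_rng(0)
        waveforms = rng.normal(size=(40, 256))
        labels = rng.integers(0, 3, size=40)

        features = np.vstack([wavelet_packet_energies(w) for w in waveforms])
        clf = SVC(kernel="rbf").fit(features, labels)
        print(clf.predict(features[:5]))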

  17. Automated DNA Sequencing System

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective growth through automation.

  18. Electromagnetic Transient Response Analysis of DFIG under Cascading Grid Faults Considering Phase Angle Jumps

    DEFF Research Database (Denmark)

    Wang, Yun; Wu, Qiuwei

    2014-01-01

    This paper analyses the electromagnetic transient response characteristics of DFIG under symmetrical and asymmetrical cascading grid fault conditions, considering the phase angle jump of the grid. By deriving the dynamic equations of the DFIG while considering multiple constraints on balanced and unbalanced...... conditions, phase angle jumps, the interval of the cascading fault, and electromagnetic transient characteristics, the principle of the DFIG response under cascading voltage faults can be extracted. The influence of the grid angle jump on the transient characteristics of DFIG is analyzed and the electromagnetic response...

  19. Investigating fault coupling: Creep and microseismicity on the Hayward fault

    Science.gov (United States)

    Evans, E. L.; Loveless, J. P.; Meade, B. J.; Burgmann, R.

    2009-12-01

    We seek to quantify the relationship between interseismic slip activity and microseismicity along the Hayward fault in the eastern San Francisco Bay Area. During the interseismic regime the Hayward fault is known to exhibit variable degrees of locking both along strike and down-dip. Background microseismicity on and near the fault has been suggested to provide independent information about the rates of interseismic creep and the boundaries of creeping regions. In particular, repeating earthquakes within the fault zone have been suggested as a proxy for fault creep rates. To investigate this relationship, we invert GPS data for microplate rotations, fault slip rates, and fault coupling using a block model that spans western United States and includes the San Andreas, Hayward, Calaveras, Rogers Creek, and Green Valley faults in the greater Bay area. The tectonic context provided by the regional scale model ensures that the slip budget across Bay Area faults is consistent with large scale tectonic motions and kinematically connected to the central San Andreas fault. We image the spatial distribution of interseismic slip on a triangulated mesh of the Hayward fault and compare the distribution of interseismic fault coupling with the number of earthquakes and the moment rate of all on-fault seismicity. We quantitatively test the hypothesis that microseismicity might define the transitions between locked and creeping regions. The calculated correlations are tested against a null hypothesis that microseismicity is randomly distributed. We further extend this investigation to the step over region between the Hayward and Calaveras faults to illuminate the interactions between linking faults.

  20. Research on Low Current Grounding Electric Network Fault Line Selection

    Institute of Scientific and Technical Information of China (English)

    何金朋; 聂赫; 刘颖

    2011-01-01

    When a single-phase-to-ground fault occurs in China's 35 kV and below distribution networks with small-current (non-effectively) grounding, the transient zero-sequence currents of healthy and faulty lines differ only slightly, which limits the accuracy of fault line selection. To address this, a digital filter is designed that extracts the pure fault component of the zero-sequence current; the data are then subjected to correlation analysis and, combined with the substation automation bus insulation monitoring device and the automatic reclosing device, the faulty line is selected. Theoretical analysis and Matlab simulation show that the accuracy and reliability of this line selection method are significantly improved.
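
    A rough sketch of the two quantities involved above: the zero-sequence current as the average of the three phase currents, and a pure fault component formed by subtracting the waveform one cycle earlier. The signal, sampling settings and fault model are synthetic, and the paper's filtering and correlation analysis are not reproduced.

        import numpy as np

        fs, f0 = 5000, 50                      # sampling rate and system frequency (assumed)
        samples_per_cycle = fs // f0
        t = np.arange(0, 0.2, 1 / fs)

        ia = np.sin(2 * np.pi * f0 * t)
        ib = np.sin(2 * np.pi * f0 * t - 2 * np.pi / 3)
        ic = np.sin(2 * np.pi * f0 * t + 2 * np.pi / 3)
        ic = ic + np.where(t > 0.1, 0.3 * np.sin(2 * np.pi * f0 * t), 0.0)  # synthetic fault

        i0 = (ia + ib + ic) / 3.0              # zero-sequence current

        # Pure fault component: subtract the (pre-fault) waveform one cycle earlier.
        delta_i0 = i0[samples_per_cycle:] - i0[:-samples_per_cycle]
        print(np.abs(delta_i0).max())          # nonzero only after the simulated fault inception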

  1. Transformer fault diagnosis using continuous sparse autoencoder.

    Science.gov (United States)

    Wang, Lukun; Zhao, Xiaoying; Pei, Jiangnan; Tang, Gongyou

    2016-01-01

    This paper proposes a novel continuous sparse autoencoder (CSAE) which can be used in unsupervised feature learning. The CSAE adds a Gaussian stochastic unit to the activation function to extract features of nonlinear data. In this paper, CSAE is applied to solve the problem of transformer fault recognition. Firstly, based on the dissolved gas analysis method, the IEC three ratios are calculated from the concentrations of dissolved gases. Then the IEC three-ratio data are normalized to reduce data singularity and improve training speed. Secondly, a deep belief network is established from two layers of CSAE and one layer of back propagation (BP) network. Thirdly, CSAE is used for unsupervised training to obtain features, and the BP network is then used for supervised training to identify the transformer fault. Finally, experimental data from the IEC TC 10 dataset are used to illustrate the effectiveness of the presented approach. Comparative experiments clearly show that CSAE can extract features from the original data and achieves a superior correct differentiation rate on transformer fault diagnosis.
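
    As a small illustration of the pre-processing step mentioned above, the sketch below computes the IEC three ratios (C2H2/C2H4, CH4/H2, C2H4/C2H6) from dissolved-gas concentrations and applies a simple min-max rescaling as a stand-in for the normalization described; the concentrations are invented and the CSAE network itself is not reproduced.

        import numpy as np

        # Hypothetical dissolved-gas concentrations in ppm (invented values).
        gases = {"H2": 120.0, "CH4": 85.0, "C2H6": 30.0, "C2H4": 60.0, "C2H2": 5.0}

        # IEC three ratios commonly used in dissolved gas analysis.
        ratios = np.array([
            gases["C2H2"] / gases["C2H4"],
            gases["CH4"] / gases["H2"],
            gases["C2H4"] / gases["C2H6"],
        ])

        # Simple min-max rescaling as a stand-in for the normalization step
        # described in the abstract (bounds here are placeholders).
        lo, hi = ratios.min(), ratios.max()
        normalized = (ratios - lo) / (hi - lo)
        print(normalized)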

  2. Study of Fault Diagnosis Method for Wind Turbine with Decision Classification Algorithms and Expert System

    Directory of Open Access Journals (Sweden)

    Feng Yongxin

    2012-09-01

    Full Text Available This paper studies a fault diagnosis method based on the combination of decision classification algorithms and an expert system. A method of extracting diagnosis rules with the CTree software is given, and a fault diagnosis system based on CLIPS is developed. To verify the feasibility of the method, sample data were first obtained from simulations of direct-drive wind turbine and gearbox faults; diagnosis rules were then extracted with the CTree software; finally, the proposed fault diagnosis system and the extracted rules were used to diagnose the simulated faults. Test results showed a misdiagnosis rate within 5% in both cases, verifying the feasibility of the method.

  3. Row fault detection system

    Science.gov (United States)

    Archer, Charles Jens; Pinnow, Kurt Walter; Ratterman, Joseph D.; Smith, Brian Edward

    2008-10-14

    An apparatus, program product and method checks for nodal faults in a row of nodes by causing each node in the row to concurrently communicate with its adjacent neighbor nodes in the row. The communications are analyzed to determine a presence of a faulty node or connection.

  4. Fault-Mechanism Simulator

    Science.gov (United States)

    Guyton, J. W.

    1972-01-01

    An inexpensive, simple mechanical model of a fault can be produced to simulate the effects leading to an earthquake. This model has been used successfully with students from elementary to college levels and can be demonstrated to classes as large as thirty students. (DF)

  5. Heat reveals faults

    Energy Technology Data Exchange (ETDEWEB)

    Weinreich, Bernhard [Solarschmiede GmbH, Muenchen (Germany). Engineering Dept.

    2010-07-01

    Gremlins cannot hide from the all-revealing view of a thermographic camera, and it makes no difference whether it is a roof-mounted system or a megawatt-sized farm. Just as diverse is the range of faults that, with the growing level of expertise, can now be detected and differentiated in ever greater detail. (orig.)

  6. Fault Diagnosis of Batch Reactor Using Machine Learning Methods

    Directory of Open Access Journals (Sweden)

    Sujatha Subramanian

    2014-01-01

    Full Text Available Fault diagnosis of a batch reactor gives early detection of faults and minimizes the risk of thermal runaway. It provides superior performance and helps to improve safety and consistency. It has become more vital in this technical era. In this paper, a support vector machine (SVM) is used to estimate the heat release (Qr) of the batch reactor under both normal and faulty conditions. The signature of the residual, which is obtained from the difference between the nominal and estimated faulty Qr values, characterizes the different natures of faults occurring in the batch reactor. Appropriate statistical and geometric features are extracted from the residual signature and the total number of features is reduced using an SVM attribute selection filter and principal component analysis (PCA). Artificial neural network (ANN) classifiers such as the multilayer perceptron (MLP), radial basis function (RBF), and Bayes net are used to classify the different types of faults from the reduced features. The comparative study shows that the proposed method, which uses a limited number of features extracted from only one estimated parameter (Qr), is more efficient and faster for diagnosing the typical faults.

  7. Fault-Related Sanctuaries

    Science.gov (United States)

    Piccardi, L.

    2001-12-01

    Beyond the study of historical surface faulting events, this work investigates the possibility, in specific cases, of identifying pre-historical events whose memory survives in myths and legends. The myths of many famous sacred places of the ancient world contain relevant telluric references: "sacred" earthquakes, openings to the Underworld and/or chthonic dragons. Given the strong correspondence with local geological evidence, these myths may be considered as describing natural phenomena. It has been possible in this way to shed light on the geologic origin of famous myths (Piccardi, 1999, 2000 and 2001). Interdisciplinary researches reveal that the origin of several ancient sanctuaries may be linked in particular to peculiar geological phenomena observed on local active faults (like ground shaking and coseismic surface ruptures, gas and flames emissions, strong underground rumours). In many of these sanctuaries the sacred area is laid directly above the active fault. In a few cases, faulting has affected also the archaeological relics, right through the main temple (e.g. Delphi, Cnidus, Hierapolis of Phrygia). As such, the arrangement of the cult site and content of relative myths suggest that specific points along the trace of active faults have been noticed in the past and worshiped as special `sacred' places, most likely interpreted as Hades' Doors. The mythological stratification of most of these sanctuaries dates back to prehistory, and points to a common derivation from the cult of the Mother Goddess (the Lady of the Doors), which was largely widespread since at least 25000 BC. The cult itself was later reconverted into various different divinities, while the `sacred doors' of the Great Goddess and/or the dragons (offspring of Mother Earth and generally regarded as Keepers of the Doors) persisted in more recent mythologies. Piccardi L., 1999: The "Footprints" of the Archangel: Evidence of Early-Medieval Surface Faulting at Monte Sant'Angelo (Gargano, Italy

  8. Repeated extraction of DNA from FTA cards

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Ferrero, Laura; Børsting, Claus;

    2011-01-01

    Extraction of DNA using magnetic bead based techniques on automated DNA extraction instruments provides a fast, reliable and reproducible method for DNA extraction from various matrices. However, the yield of extracted DNA from FTA-cards is typically low. Here, we demonstrate that it is possible ...

  9. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  10. Landscape response to normal fault growth and linkage in the Southern Apennines, Italy.

    Science.gov (United States)

    Roda-Boluda, Duna; Whittaker, Alex

    2016-04-01

    It is now well-established that landscape can record spatial and temporal variations in tectonic rates. However, decoding this information to extract detailed histories of fault growth is often a complex problem that requires careful integration of tectonic and geomorphic data sets. Here, we present new data addressing both normal fault evolution and coupled landscape response for two normal faults in the Southern Apennines: the Vallo di Diano and East Agri faults. By integrating published constraints with new data, we show that these faults have total throws of up to 2100 m, and Holocene throw rates of up to 1 mm/yr at their maximum. We demonstrate that geomorphology is effectively recording tectonics, with relief, channel and catchment slopes varying along fault strike as normal fault activity does. Therefore, valuable information about fault growth and interaction can be extracted from their geomorphic expression. We use the spatial distribution of knickpoints on the footwall channels to infer two episodes of base level change, which can be associated with distinct fault interaction events. From our detailed fault throw profiles, we reconstruct the amount of throw accumulated after each of these events, and the segments involved in each, and we use slip rate enhancement factors derived from fault interaction theory to estimate the magnitude of the tectonic perturbation in each case. From this approach, we are able to reconstruct pre-linkage throw rates, and we estimate that fault linkage events likely took place 0.7 ± 0.2 Ma and 1.9 ± 0.6 Ma in the Vallo di Diano fault, and 1.1 ± 0.1 and 2.3 ± 0.9 Ma in the East Agri fault. Our study suggests that both faults started their activity at 3.6 ± 0.5 Ma. These fault linkage scenarios are consistent with the knickpoint heights, and may relate to soft-linkage interaction with the Southern Apennines normal fault array, the existence of which has been the subject of considerable debate. Our combined geomorphic and

  11. The extraction of the near-field deformation features along the faulted zone based on PS-InSAR survey

    Institute of Scientific and Technical Information of China (English)

    李凌婧; 姚鑫; 张永双; 王桂杰; 郭长宝

    2015-01-01

    Using PS-InSAR (Persistent Scatterer Interferometric Synthetic Aperture Radar) technology and L-band data, the authors conducted a survey of the near-field deformation around the Bamei-Daofu section of the Xianshuihe active fault from 2007 to 2011 and, based on analysis in combination with other materials, inferred some complex fault near-field deformation information: ① the deformation velocity differs between the northern and southern sections, and the velocities on the two sides of the fault also differ somewhat, with the velocity of the SW wall larger than that of the NE wall; the velocity difference in the far field is more significant, while the near-field velocity is feeble; ② in the area close to the active fault zone, the deformation velocities of PS (Persistent Scatterer) points are mainly comparatively small negative and positive values, reflecting surface ascent and suggesting that these locations consist mainly of wetland, groundwater exposure points, banks and gullies; it is inferred that these phenomena are attributable to surface bulging and deformation caused by climate warming, glacier melting and the uplift of the groundwater level, the tendency of wetland to uplift as a result of seasonal frost heaving, and a certain expansibility of the cataclastic rock and soil near the fault zone; ③ the uplift deformation around the Zhonggu-Bamei section results from thrust movement near the Xianshuihe fault, and the ductile shear zone absorbs and coordinates the overall block deformation; ④ high-deformation PS blocks reflect slope gravity deformation, especially in the Daofu-Shonglinkou and Qianning basin-Longdengba sections, revealing the geohazard effects of the fault; ⑤ the precise PS-InSAR results show that the deformation of the fault is complex and differs markedly between sections, periods and tectonic locations, so the movement cannot simply be regarded as overall translation or elevation-subsidence with the fault zone as the boundary.

  12. Automated Periodontal Diseases Classification System

    Directory of Open Access Journals (Sweden)

    Aliaa A. A. Youssif

    2012-01-01

    Full Text Available This paper presents an efficient and innovative system for automated classification of periodontal diseases. The strength of our technique lies in the fact that it incorporates knowledge from the patients' clinical data along with the features automatically extracted from the Haematoxylin and Eosin (H&E) stained microscopic images. Our system uses image processing techniques based on color deconvolution, morphological operations, and watershed transforms for epithelium and connective tissue segmentation, nuclear segmentation, and extraction of the microscopic immunohistochemical features for the nuclei, dilated blood vessels and collagen fibers. Feedforward backpropagation artificial neural networks are used for the classification process. We report 100% classification accuracy in correctly identifying the different periodontal diseases observed in our 30-sample dataset.

  13. Analysis of fault using microcomputer protection by symmetrical component method

    Directory of Open Access Journals (Sweden)

    Mr. Ashish Choubey

    2012-09-01

    Full Text Available To enhance power supply reliability for user terminals, to avoid repeated interference by faults in the distribution system, and to rapidly complete automatic fault identification, positioning, isolation and network reconfiguration up to the restoration of supply to non-faulted sections, a microprocessor-based relay protection device has been developed. As fault component theory is widely used in microcomputer protection, and the fault component exists in the fault-component network, it is necessary to build up the fault-component network when a short-circuit fault occurs and to draw the current and voltage component phasor diagram at the fault point. In order to understand microcomputer protection based on the symmetrical component principle, we obtained the sequence currents and sequence voltages according to the concept of symmetrical components. Distribution lines supply power directly to users, so the reliability of their operation determines the quality and level of the electricity supply. In recent decades, thanks to the tireless efforts of power scientists and technicians, relay protection technology and the application level of equipment have been greatly improved, but the computer hardware of currently produced domestic protection devices is still outdated. Software development suffers from maintenance difficulties and short survival times. The interface functions with factory automation systems are weak, and network communication cannot meet actual requirements. The protection principle configuration and device manufacturing process also remain to be improved.

  14. Automatic software fault localization based on ar tificial bee colony

    Institute of Scientific and Technical Information of China (English)

    Linzhi Huang∗; Jun Ai

    2015-01-01

    Software debugging accounts for a vast majority of the financial and time costs in software development and maintenance. Thus, approaches to software fault localization that can help automate the debugging process have become a hot topic in the field of software engineering. Given the great demand for software fault localization, an approach based on the artificial bee colony (ABC) algorithm is proposed and integrated with other related techniques. In this process, the source program is initially instrumented after analyzing the dependence information. The test case sets are then compiled and run on the instrumented program, and the execution results are input to the ABC algorithm. The algorithm determines the largest fitness value and best food source by calculating the average fitness of the employed bees in the iterative process. The program unit with the highest suspicion score corresponding to the best test case set is regarded as the final fault localization. Experiments are conducted with the TCAS program in the Siemens suite. Results demonstrate that the proposed fault localization method is effective and efficient. The ABC algorithm can efficiently avoid local optima and ensure the validity of the fault location to a larger extent.

  15. Remote online machine fault diagnostic system

    Science.gov (United States)

    Pan, Min-Chun; Li, Po-Ching

    2004-07-01

    The study aims at implementing a remote online machine fault diagnostic system built on both the BCB software development environment and Internet transmission communication. Various signal-processing computation schemes for signal analysis and pattern recognition purposes are implemented in the BCB graphical user interface. Hence, the machine fault diagnostic capability can be extended by using the socket application program interface over the TCP/IP protocol. In the study, the effectiveness of the developed remote diagnostic system is validated by monitoring a transmission-element test rig. A complete monitoring cycle, including data acquisition, signal processing, feature extraction, pattern recognition through ANNs, and online video monitoring, is demonstrated.

  16. Mechanical Fault Diagnosis Using Support Vector Machine

    Institute of Scientific and Technical Information of China (English)

    LI Ling-jun; ZHANG Zhou-suo; HE Zheng-jia

    2003-01-01

    The Support Vector Machine (SVM) is a machine learning algorithm based on Statistical Learning Theory (SLT), which can achieve good classification results even with few learning samples. SVM represents a new approach to pattern classification and has been shown to be particularly successful in many fields such as image identification and face recognition. It also provides us with a new method to develop intelligent fault diagnosis. This paper presents an SVM-based approach for fault diagnosis of rolling bearings. Experiments with vibration signals of bearings are conducted. The vibration signals acquired from the bearings are used directly in the calculation without preprocessing to extract features. Compared with methods based on Artificial Neural Networks (ANN), the SVM-based method has desirable advantages. It is applicable to on-line diagnosis of mechanical systems.

  17. Fault tolerant architecture for artificial olfactory system

    Science.gov (United States)

    Lotfivand, Nasser; Nizar Hamidon, Mohd; Abdolzadeh, Vida

    2015-05-01

    In this paper, to cover and mask the faults that occur in the sensing unit of an artificial olfactory system, a novel architecture is offered. The proposed architecture is able to tolerate failures in the sensors of the array and the faults that occur are masked. The proposed architecture for extracting the correct results from the output of the sensors can provide the quality of service for generated data from the sensor array. The results of various evaluations and analysis proved that the proposed architecture has acceptable performance in comparison with the classic form of the sensor array in gas identification. According to the results, achieving a high odor discrimination based on the suggested architecture is possible.

  18. An intelligent online fault diagnostic scheme for nonlinear systems

    Institute of Scientific and Technical Information of China (English)

    Hing Tung MOK; Che Wai CHAN; Zaiyue YANG

    2008-01-01

    An online fault diagnostic scheme for nonlinear systems based on neurofuzzy networks is proposed in this paper. The scheme involves two stages. In the first stage, the nonlinear system is approximated by a neurofuzzy network, which is trained offline from data obtained during the normal operation of the system. In the second stage, a residual is generated online from this network and is modelled by another neurofuzzy network trained online. Fuzzy rules are extracted from this network and are compared with those in the fault database obtained under different faulty operations, from which faults are diagnosed. The performance of the proposed intelligent fault scheme is illustrated using a two-tank water level control system under different faulty conditions.

  19. A New Fault Diagnosis Method of Rotating Machinery

    Directory of Open Access Journals (Sweden)

    Chih-Hao Chen

    2008-01-01

    Full Text Available This paper presents a new fault diagnosis procedure for rotating machinery using wavelet packets-fractal technology and a radial basis function neural network. The faults of rotating machinery considered in this study include imbalance, misalignment, looseness and imbalance combined with misalignment. When such faults occur, they usually induce non-stationary vibrations in the machine. After measuring the vibration signals, the wavelet packet transform is applied to these signals. The fractal dimension of each frequency band is extracted and the box counting dimension is used to depict the failure characteristics of the vibration signals. The failure modes are then classified by a radial basis function neural network. An experimental study was performed to evaluate the proposed method and the results show that the method can effectively detect and recognize different kinds of faults of rotating machinery.

  20. Nonlinear fault diagnosis method based on kernel principal component analysis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Zhang Chunkai; Shao Huihe

    2005-01-01

    To ensure that a system runs in working order, detection and diagnosis of faults play an important role in industrial processes. This paper proposes a nonlinear fault diagnosis method based on kernel principal component analysis (KPCA). In the proposed method, using the essential information of the nonlinear system extracted by KPCA, we construct a KPCA model of the nonlinear system under normal working conditions. New data are then projected onto the KPCA model. When new data are incompatible with the KPCA model, it can be concluded that the nonlinear system is out of its normal working condition. The proposed method was applied to fault diagnosis of rolling bearings. Simulation results show that it provides an effective means for fault detection and diagnosis of nonlinear systems.
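
    A minimal sketch of this kind of KPCA monitoring using scikit-learn's KernelPCA: fit on normal-condition data, then flag new samples whose reconstruction error exceeds a threshold learned from the normal data. The data, kernel settings and threshold are placeholders, not the authors' implementation.

        import numpy as np
        from sklearn.decomposition import KernelPCA

        rng = np.random.default_rng(0)
        normal_data = rng.normal(size=(200, 4))              # placeholder normal-condition data
        new_data = rng.normal(size=(20, 4)) + [3, 0, 0, 0]   # placeholder possibly-faulty data

        kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5, fit_inverse_transform=True)
        kpca.fit(normal_data)

        def reconstruction_error(model, X):
            """Squared error between samples and their KPCA reconstruction."""
            return np.sum((X - model.inverse_transform(model.transform(X))) ** 2, axis=1)

        threshold = np.percentile(reconstruction_error(kpca, normal_data), 99)  # assumed limit
        print(reconstruction_error(kpca, new_data) > threshold)  # True marks a suspected fault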

  1. Sensor fault diagnosis with a probabilistic decision process

    Science.gov (United States)

    Sharifi, Reza; Langari, Reza

    2013-01-01

    In this paper a probabilistic approach to sensor fault diagnosis is presented. The proposed method is applicable to systems whose dynamics can be approximated with only a few active states, especially in process control where the dynamics are usually relatively slow. Unlike most existing probabilistic approaches to fault diagnosis, which are based on Bayesian belief networks, in this approach the probabilistic model is directly extracted from a parity equation. The relevant parity equation can be found using a model of the system or through principal component analysis of data measured from the system. In addition, a sensor detectability index is introduced that specifies the level of detectability of sensor faults in a set of analytically redundant sensors. This index depends only on the internal relationships of the variables of the system and the noise level. The method is tested on a model of the Tennessee Eastman process and the results show fast and reliable prediction of faults in the detectable sensors.
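
    A compact sketch of deriving a parity-style residual from data via principal component analysis, as the abstract suggests is possible: the residual is the projection of a measurement onto the minor principal directions of normal-operation data, and a simulated sensor bias makes it grow. This is illustrative only, not the paper's method.

        import numpy as np

        rng = np.random.default_rng(2)
        # Three redundant-ish sensors driven by one underlying process variable (synthetic).
        process = rng.normal(size=500)
        train = np.column_stack([process + 0.05 * rng.normal(size=500) for _ in range(3)])

        # Principal directions from normal data; minor components span the parity space.
        mean = train.mean(axis=0)
        _, _, vt = np.linalg.svd(train - mean, full_matrices=False)
        parity_directions = vt[1:]        # directions with little normal-operation variance

        def residual(y):
            return parity_directions @ (y - mean)

        healthy = np.array([0.2, 0.21, 0.19])
        faulty = healthy + np.array([0.0, 0.5, 0.0])   # bias fault on sensor 2 (simulated)
        print(np.linalg.norm(residual(healthy)), np.linalg.norm(residual(faulty)))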

  2. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory.

  3. Using the EMD method to determine fault criterion for medium-low pressure gas regulators

    Science.gov (United States)

    Hao, Xuejun; Liu, Qiang; Yang, Guobin; Du, Yi

    2015-11-01

    By extracting the outlet pressure data of gas regulators, this paper uses the EMD toolbox of the MATLAB software, which can perform data decomposition and the Hilbert-Huang transform, to find the rules in the fault data. On this basis, a fault criterion for medium-low pressure gas regulators can be established.
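
    A small sketch of the decomposition step, assuming the PyEMD package (distributed as EMD-signal) is available; the outlet-pressure series is synthetic and the fault-criterion logic itself is not reproduced.

        import numpy as np
        from PyEMD import EMD  # assumes the EMD-signal (PyEMD) package is installed

        # Synthetic stand-in for a regulator outlet-pressure record.
        t = np.linspace(0, 10, 2000)
        pressure = 2.0 + 0.05 * np.sin(2 * np.pi * 0.5 * t) + 0.01 * np.random.randn(t.size)

        imfs = EMD().emd(pressure)  # intrinsic mode functions, coarse trend last
        print(f"number of IMFs: {imfs.shape[0]}")
        # The Hilbert spectrum of selected IMFs (Hilbert-Huang transform) would then be
        # inspected for patterns distinguishing normal from faulty regulator behaviour.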

  4. Research on Gear-broken Fault Diagnosis in a Tank Gearbox

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A fault diagnosis method for the working-position gear in a tank gearbox is put forward, based on simulating the fault of the working-position gear in an actual tank, extracting the envelope of the vibration signal by the Hilbert-transform amplitude demodulation method, and applying frequency zooming to the low-frequency band of the envelope signal.
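
    A brief sketch of envelope extraction by Hilbert-transform amplitude demodulation, as mentioned above, using SciPy; the vibration signal, sampling rate and modulation frequency are synthetic stand-ins.

        import numpy as np
        from scipy.signal import hilbert

        fs = 10_000                                        # sampling rate (assumed), Hz
        t = np.arange(0, 1.0, 1 / fs)
        carrier = np.sin(2 * np.pi * 1200 * t)             # gear-mesh-like carrier (synthetic)
        modulation = 1 + 0.5 * np.sin(2 * np.pi * 30 * t)  # fault-induced modulation (synthetic)
        vibration = modulation * carrier

        envelope = np.abs(hilbert(vibration))              # amplitude demodulation
        spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
        freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
        print(freqs[spectrum.argmax()])                    # recovers the ~30 Hz modulation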

  5. Network Fault Diagnosis Using DSM

    Institute of Scientific and Technical Information of China (English)

    Jiang Hao; Yan Pu-liu; Chen Xiao; Wu Jing

    2004-01-01

    The difference similitude matrix (DSM) is effective in reducing an information system, with a higher reduction rate and higher validity. We use the DSM method to analyze the fault data of computer networks and obtain fault diagnosis rules. By discretizing the relative values of the fault data, we obtain the information system of the fault data. The DSM method reduces the information system and yields the diagnosis rules. Simulation with an actual scenario shows that fault diagnosis based on DSM can obtain few and effective rules.

  6. Fault structure and deformation rates at the Lastros-Sfaka Graben, Crete

    Science.gov (United States)

    Mason, J.; Schneiderwind, S.; Pallikarakis, A.; Wiatr, T.; Mechernich, S.; Papanikolaou, I.; Reicherter, K.

    2016-06-01

    The Lastros and Sfaka faults have an antithetic relationship and form a ca. 2 km wide graben within the Ierapetra fault zone in eastern Crete. Both faults have impressive bedrock fault scarps many metres in height which form prominent features within the landscape. t-LiDAR investigations undertaken on the Lastros fault are used to accurately determine vertical displacements along a ca. 1.3 km long scanned segment. Analyses show that previous estimations of post glacial slip rate are too high because there are many areas along strike where the scarp is exhumed by natural erosion and/or anthropogenic activity. In areas not affected by erosion there is mean scarp height of 9.4 m. This leads to a slip rate of 0.69 ± 0.15 mm/a using 15 ± 3 ka for scarp exhumation. Using empirical calculations the expected earthquake magnitudes and displacement per event are discussed based on our observations. Trenching investigations on the Sfaka fault identify different generations of fissure fills. Retrodeformation analyses and 14C dating of the fill material indicate at least four events dating back to 16,055 ± 215 cal BP, with the last event having occurred soon after 6102 ± 113 cal BP. The Lastros fault is likely the controlling fault in the graben, and ruptures on the Lastros fault will sympathetically affect the Sfaka fault, which merges with the Lastros fault at a depth of 2.4 km. The extracted dates from the Sfaka fault fissure fills therefore either represent activity on the Lastros fault, assuming they formed coseismically, or accommodation events. Cross sections show that the finite throw is limited to around 300 m, and the derived slip rate for the Lastros fault therefore indicates that both faults are relatively young having initiated 435 ± 120 ka.
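
    One way to relate the numbers quoted above is to convert the mean scarp height to dip-parallel slip before dividing by the exhumation age; the fault dip used below is an assumption not stated in this record, chosen only to show how a value near the published 0.69 ± 0.15 mm/a can arise.

        import math

        height_m = 9.4                 # mean scarp height from the record
        age_ka, age_err_ka = 15.0, 3.0
        dip_deg = 65.0                 # assumed fault dip; not given in this record

        slip_m = height_m / math.sin(math.radians(dip_deg))   # dip-parallel slip
        rate = slip_m / age_ka                                 # mm/a, since m/ka == mm/a
        rate_err = rate * (age_err_ka / age_ka)                # relative error from the age only
        print(f"{rate:.2f} +/- {rate_err:.2f} mm/a")           # approx. 0.69 +/- 0.14 mm/a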

  7. Uncertainty estimation in finite fault inversion

    Science.gov (United States)

    Dettmer, Jan; Cummins, Phil R.; Benavente, Roberto

    2016-04-01

    This work considers uncertainty estimation for kinematic rupture models in finite fault inversion by Bayesian sampling. Since the general problem of slip estimation on an unknown fault from incomplete and noisy data is highly non-linear and currently intractable, assumptions are typically made to simplify the problem. These almost always include linearization of the time dependence of rupture by considering multiple discrete time windows, and a tessellation of the fault surface into a set of 'subfaults' whose dimensions are fixed below what is subjectively thought to be resolvable by the data. Even non-linear parameterizations are based on a fixed discretization. This results in over-parametrized models which include more parameters than resolvable by the data and require regularization criteria that stabilize the inversion. While it is increasingly common to consider slip uncertainties arising from observational error, the effects of the assumptions implicit in parameterization choices are rarely if ever considered. Here, we show that linearization and discretization assumptions can strongly affect both slip and uncertainty estimates and that therefore the selection of parametrizations should be included in the inference process. We apply Bayesian model selection to study the effect of parametrization choice on inversion results. The Bayesian sampling method which produces inversion results is based on a trans-dimensional rupture discretization which adapts the spatial and temporal parametrization complexity based on data information and does not require regularization. Slip magnitude, direction and rupture velocity are unknowns across the fault and causal first rupture times are obtained by solving the Eikonal equation for a spatially variable rupture-velocity field. The method provides automated local adaptation of rupture complexity based on data information and does not assume globally constant resolution. This is an important quality since seismic data do not

  8. A user's guide to SABLE 2.0: The Sandia Automated Boolean Logic Evaluation software

    Energy Technology Data Exchange (ETDEWEB)

    Hays, K.M.; Wyss, G.D.; Daniel, S.L. [Sandia National Labs., Albuquerque, NM (United States). Risk Assessment and Systems Modeling Dept.

    1996-04-01

    This document is a reference guide for the Sandia Automated Boolean Logic Evaluation software (SABLE) version 2.0 developed at Sandia National Laboratories. SABLE 2.0 is designed to solve and quantify fault trees on IBM-compatible personal computers using the Microsoft Windows operating environment. SABLE 2.0 consists of a Windows user interface combined with a fault tree solution engine that is derived from the well-known SETS fault tree analysis code. This manual explains the fundamentals of solving fault trees and shows how to use the Windows SABLE 2.0 interface to specify a problem, solve the problem, and view the output.
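
    SABLE itself is not included here, but the basic task it automates (evaluating Boolean gate logic over basic events) can be illustrated with a tiny sketch; the gates, events and probabilities below are made up.

        # Minimal fault-tree evaluation sketch (not SABLE): gates are ("AND"|"OR", children),
        # leaves are basic-event names mapped to failure probabilities.
        basic_events = {"pump_fails": 0.01, "valve_sticks": 0.02, "power_loss": 0.005}

        tree = ("OR", [
            ("AND", ["pump_fails", "valve_sticks"]),   # hypothetical cut set
            "power_loss",
        ])

        def probability(node):
            """Evaluate the tree assuming independent basic events."""
            if isinstance(node, str):
                return basic_events[node]
            kind, children = node
            probs = [probability(c) for c in children]
            if kind == "AND":
                out = 1.0
                for p in probs:
                    out *= p
                return out
            # OR gate: 1 - product of (1 - p)
            out = 1.0
            for p in probs:
                out *= (1.0 - p)
            return 1.0 - out

        print(f"top event probability approx. {probability(tree):.5f}")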

  9. Sparsity-based algorithm for detecting faults in rotating machines

    Science.gov (United States)

    He, Wangpeng; Ding, Yin; Zi, Yanyang; Selesnick, Ivan W.

    2016-05-01

    This paper addresses the detection of periodic transients in vibration signals so as to detect faults in rotating machines. For this purpose, we present a method to estimate periodic-group-sparse signals in noise. The method is based on the formulation of a convex optimization problem. A fast iterative algorithm is given for its solution. A simulated signal is formulated to verify the performance of the proposed approach for periodic feature extraction. The detection performance of comparative methods is compared with that of the proposed approach via RMSE values and receiver operating characteristic (ROC) curves. Finally, the proposed approach is applied to single fault diagnosis of a locomotive bearing and compound faults diagnosis of motor bearings. The processed results show that the proposed approach can effectively detect and extract the useful features of bearing outer race and inner race defect.

  10. Cyclostationary Analysis for Gearbox and Bearing Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Zhipeng Feng

    2015-01-01

    Full Text Available Gearbox and rolling element bearing vibration signals feature modulation, thus being cyclostationary. Therefore, the cyclic correlation and cyclic spectrum are suited to analyze their modulation characteristics and thereby extract gearbox and bearing fault symptoms. In order to thoroughly understand the cyclostationarity of gearbox and bearing vibrations, the explicit expressions of cyclic correlation and cyclic spectrum for amplitude modulation and frequency modulation (AM-FM) signals are derived, and their properties are summarized. The theoretical derivations are illustrated and validated by gearbox and bearing experimental signal analyses. The modulation characteristics caused by gearbox and bearing faults are extracted. In faulty gearbox and bearing cases, more peaks appear in the zero-lag slice of the cyclic correlation and in the cyclic spectrum than in healthy cases. The gear and bearing faults are detected by checking the presence, or monitoring the magnitude change, of peaks in the cyclic correlation and cyclic spectrum, and are located according to the peak cyclic frequency locations or sideband frequency spacing.
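
    For readers who want to experiment with the quantities discussed above, the sketch below estimates a cyclic autocorrelation at zero lag for a synthetic amplitude-modulated signal and scans candidate cyclic frequencies; it is a textbook-style estimator, not the authors' code.

        import numpy as np

        def cyclic_autocorrelation(x, alpha, tau, fs):
            """Estimate R_x(alpha, tau) = <x[n] * conj(x[n - tau]) * exp(-j*2*pi*alpha*n/fs)>."""
            n = np.arange(tau, x.size)
            products = x[n] * np.conj(x[n - tau]) * np.exp(-2j * np.pi * alpha * n / fs)
            return products.mean()

        fs = 1000
        t = np.arange(0, 2.0, 1 / fs)
        # Synthetic AM signal: 100 Hz carrier modulated at 10 Hz (stand-in for a fault tone).
        x = (1 + 0.8 * np.cos(2 * np.pi * 10 * t)) * np.cos(2 * np.pi * 100 * t)

        # Zero-lag cyclic autocorrelation scanned over candidate cyclic frequencies:
        alphas = np.arange(0, 60, 1.0)
        magnitudes = [abs(cyclic_autocorrelation(x, a, 0, fs)) for a in alphas]
        # Excluding alpha = 0, the strongest peak sits at the 10 Hz modulation frequency.
        print(alphas[int(np.argmax(magnitudes[1:])) + 1])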

  11. A solution for applying IEC 61499 function blocks in the development of substation automation systems

    OpenAIRE

    Vlad, Valentin; Popa, Cezar D.; Turcu, Corneliu O.; Buzduga, Corneliu

    2015-01-01

    This paper presents a solution for applying IEC 61499 function blocks along with IEC 61850 specifications in modeling and implementing control applications for substations automation. The IEC 61499 artifacts are used for structuring the control logic, while the IEC 61850 concepts for communication and information exchange between the automation devices. The proposed control architecture was implemented and validated in a simple fault protection scenario with simulated power equipment.

  12. Managing Fault Management Development

    Science.gov (United States)

    McDougal, John M.

    2010-01-01

    As the complexity of space missions grows, development of Fault Management (FM) capabilities is an increasingly common driver for significant cost overruns late in the development cycle. FM issues and the resulting cost overruns are rarely caused by a lack of technology, but rather by a lack of planning and emphasis by project management. A recent NASA FM Workshop brought together FM practitioners from a broad spectrum of institutions, mission types, and functional roles to identify the drivers underlying FM overruns and recommend solutions. They identified a number of areas in which increased program and project management focus can be used to control FM development cost growth. These include up-front planning for FM as a distinct engineering discipline; managing different, conflicting, and changing institutional goals and risk postures; ensuring the necessary resources for a disciplined, coordinated approach to end-to-end fault management engineering; and monitoring FM coordination across all mission systems.

  13. Seismic Fault Preserving Diffusion

    CERN Document Server

    Lavialle, Olivier; Germain, Christian; Donias, Marc; Guillon, Sebastien; Keskes, Naamen; Berthoumieu, Yannick

    2007-01-01

    This paper focuses on the denoising and enhancing of 3-D reflection seismic data. We propose a pre-processing step based on a non linear diffusion filtering leading to a better detection of seismic faults. The non linear diffusion approaches are based on the definition of a partial differential equation that allows us to simplify the images without blurring relevant details or discontinuities. Computing the structure tensor which provides information on the local orientation of the geological layers, we propose to drive the diffusion along these layers using a new approach called SFPD (Seismic Fault Preserving Diffusion). In SFPD, the eigenvalues of the tensor are fixed according to a confidence measure that takes into account the regularity of the local seismic structure. Results on both synthesized and real 3-D blocks show the efficiency of the proposed approach.

  14. Seismic fault preserving diffusion

    Science.gov (United States)

    Lavialle, Olivier; Pop, Sorin; Germain, Christian; Donias, Marc; Guillon, Sebastien; Keskes, Naamen; Berthoumieu, Yannick

    2007-02-01

    This paper focuses on the denoising and enhancing of 3-D reflection seismic data. We propose a pre-processing step based on a non-linear diffusion filtering leading to a better detection of seismic faults. The non-linear diffusion approaches are based on the definition of a partial differential equation that allows us to simplify the images without blurring relevant details or discontinuities. Computing the structure tensor which provides information on the local orientation of the geological layers, we propose to drive the diffusion along these layers using a new approach called SFPD (Seismic Fault Preserving Diffusion). In SFPD, the eigenvalues of the tensor are fixed according to a confidence measure that takes into account the regularity of the local seismic structure. Results on both synthesized and real 3-D blocks show the efficiency of the proposed approach.

  15. Automating checks of plan check automation.

    Science.gov (United States)

    Halabi, Tarek; Lu, Hsiao-Ming

    2014-07-08

    While a few physicists have designed new plan check automation solutions for their clinics, fewer, if any, managed to adapt existing solutions. As complex and varied as the systems they check, these programs must gain the full confidence of those who would run them on countless patient plans. The present automation effort, planCheck, therefore focuses on versatility and ease of implementation and verification. To demonstrate this, we apply planCheck to proton gantry, stereotactic proton gantry, stereotactic proton fixed beam (STAR), and IMRT treatments.

  16. Fault Tree Handbook

    Science.gov (United States)

    1981-01-01

    [Only fragmentary scanned text is available for this record: a list of component evaluation attributes (manufacturer, location, seismic susceptibility, flood susceptibility, temperature, humidity, radiation, wear-out susceptibility), an example of defining sensitivity levels for the category "Seismic Susceptibility" ranging from no sensitivity to extreme sensitivity, and reference fragments including a Hanford Company report (Richland, Washington, ARH-ST-112, July 1975) and W.E. Vesely, "Analysis of Fault Trees by Kinetic Tree Theory," Idaho Nuclear.]

  17. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and support

  18. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  19. More Benefits of Automation.

    Science.gov (United States)

    Getz, Malcolm

    1988-01-01

    Describes a study that measured the benefits of an automated catalog and automated circulation system from the library user's point of view in terms of the value of time saved. Topics discussed include patterns of use, access time, availability of information, search behaviors, and the effectiveness of the measures used. (seven references)…

  20. Faults in Linux

    DEFF Research Database (Denmark)

    Palix, Nicolas Jean-Michel; Thomas, Gaël; Saha, Suman;

    2011-01-01

    In 2001, Chou et al. published a study of faults found by applying a static analyzer to Linux versions 1.0 through 2.4.1. A major result of their work was that the drivers directory contained up to 7 times more of certain kinds of faults than other directories. This result inspired a number...... of development and research efforts on improving the reliability of driver code. Today Linux is used in a much wider range of environments, provides a much wider range of services, and has adopted a new development and release model. What has been the impact of these changes on code quality? Are drivers still...... a major problem? To answer these questions, we have transported the experiments of Chou et al. to Linux versions 2.6.0 to 2.6.33, released between late 2003 and early 2010. We find that Linux has more than doubled in size during this period, but that the number of faults per line of code has been...

  1. Automated addition of Chelex solution to tubes containing trace items

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Thomas Møller; Hansen, Anders Johannes;

    2011-01-01

Extraction of DNA from trace items for forensic genetic DNA typing using a manual Chelex based extraction protocol requires addition of Chelex solution to sample tubes containing trace items. Automated addition of Chelex solution may be hampered by the high viscosity of the solution and fast...

  2. Automated addition of Chelex solution to tubes containing trace items

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hansen, Thomas Møller; Hansen, Anders Johannes;

    2011-01-01

Extraction of DNA from trace items for forensic genetic DNA typing using a manual Chelex based extraction protocol requires addition of Chelex solution to sample tubes containing trace items. Automated addition of Chelex solution may be hampered by the high viscosity of the solution and fast sedim...

  3. Advances in inspection automation

    Science.gov (United States)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag and drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  4. Automation in immunohematology.

    Science.gov (United States)

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-07-01

There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results, is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  5. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  6. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  7. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results, is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  8. Research on the Comprehensive Demodulation of Gear Tooth Crack Early Fault

    Institute of Scientific and Technical Information of China (English)

    CUI Lingli; DING Fang; GAO Lixin; ZHANG Jianyu

    2006-01-01

The composition of a gear vibration signal is very complex. When a localized tooth defect such as a tooth crack is present, the engagement of the cracked tooth induces an impulsive change with comparatively low energy relative to the gear mesh signal and the background noise. This paper presents a new comprehensive demodulation method, combining amplitude envelope demodulation and phase demodulation, to extract early gear crack faults. A mathematical model of a gear vibration signal containing a crack fault is put forward. Simulation results based on this model show that the new comprehensive demodulation method is more effective at finding the fault and judging the fault level than conventional single amplitude demodulation.

  9. ESR dating of fault rocks

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hee Kwon [Kangwon National Univ., Chuncheon (Korea, Republic of)

    2003-02-15

    Past movement on faults can be dated by measurement of the intensity of ESR signals in quartz. These signals are reset by local lattice deformation and local frictional heating on grain contacts at the time of fault movement. The ESR signals then grow back as a result of bombardment by ionizing radiation from surrounding rocks. The age is obtained from the ratio of the equivalent dose, needed to produce the observed signal, to the dose rate. Fine grains are more completely reset during faulting, and a plot of age vs. grain size shows a plateau for grains below critical size; these grains are presumed to have been completely zeroed by the last fault activity. We carried out ESR dating of fault rocks collected near the Gori nuclear reactor. Most of the ESR signals of fault rocks collected from the basement are saturated. This indicates that the last movement of the faults had occurred before the Quaternary period. However, ESR dates from the Oyong fault zone range from 370 to 310 ka. Results of this research suggest that long-term cyclic fault activity of the Oyong fault zone continued into the Pleistocene.

  10. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Frank, Stephen; Heaney, Michael; Jin, Xin; Robertson, Joseph; Cheung, Howard; Elmore, Ryan; Henze, Gregor

    2016-08-26

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.

  11. Hybrid Model-Based and Data-Driven Fault Detection and Diagnostics for Commercial Buildings: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Frank, Stephen; Heaney, Michael; Jin, Xin; Robertson, Joseph; Cheung, Howard; Elmore, Ryan; Henze, Gregor

    2016-08-01

    Commercial buildings often experience faults that produce undesirable behavior in building systems. Building faults waste energy, decrease occupants' comfort, and increase operating costs. Automated fault detection and diagnosis (FDD) tools for buildings help building owners discover and identify the root causes of faults in building systems, equipment, and controls. Proper implementation of FDD has the potential to simultaneously improve comfort, reduce energy use, and narrow the gap between actual and optimal building performance. However, conventional rule-based FDD requires expensive instrumentation and valuable engineering labor, which limit deployment opportunities. This paper presents a hybrid, automated FDD approach that combines building energy models and statistical learning tools to detect and diagnose faults noninvasively, using minimal sensors, with little customization. We compare and contrast the performance of several hybrid FDD algorithms for a small security building. Our results indicate that the algorithms can detect and diagnose several common faults, but more work is required to reduce false positive rates and improve diagnosis accuracy.

  12. Fault Diagnosis for Rolling Bearing under Variable Conditions Based on Image Recognition

    Directory of Open Access Journals (Sweden)

    Bo Zhou

    2016-01-01

Full Text Available Rolling bearing faults often lead to electromechanical system failure due to the bearing's high speed and complex working conditions. Recently, a large number of fault diagnosis studies for rolling bearings based on vibration data have been reported. However, few studies have focused on fault diagnosis for rolling bearings under variable conditions. This paper proposes a fault diagnosis method based on image recognition for rolling bearings to realize fault classification under variable working conditions. The proposed method includes the following steps. First, the vibration signal data are transformed into a two-dimensional image based on the recurrence plot (RP) technique. Next, a popular feature extraction method which has been widely used in the image field, the scale invariant feature transform (SIFT), is employed to extract fault features from the two-dimensional RP and subsequently generate a 128-dimensional feature vector. Third, due to the redundancy of the high-dimensional feature, kernel principal component analysis is utilized to reduce the feature dimensionality. Finally, a neural network classifier trained by a probabilistic neural network is used to perform fault diagnosis. Verification experiment results demonstrate the effectiveness of the proposed fault diagnosis method for rolling bearings under variable conditions, thereby providing a promising approach to fault diagnosis for rolling bearings.
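
    For illustration, a minimal Python sketch of the recurrence-plot step described above; the embedding dimension, delay and threshold are assumed values, not parameters from the paper, and the SIFT/KPCA/PNN stages are omitted:

    ```python
    import numpy as np

    def recurrence_plot(x, eps=None, dim=3, tau=2):
        """Binary recurrence plot of a 1-D vibration signal.

        dim/tau: time-delay embedding parameters (assumed here);
        eps: recurrence threshold, defaulting to 10% of the maximum pairwise distance.
        """
        n = len(x) - (dim - 1) * tau
        # time-delay embedding: each row is one reconstructed state vector
        emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
        # pairwise Euclidean distances between state vectors
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        if eps is None:
            eps = 0.1 * d.max()
        return (d <= eps).astype(np.uint8)

    # toy usage: a noisy sinusoid standing in for a bearing vibration record
    rng = np.random.default_rng(0)
    signal = np.sin(2 * np.pi * 0.02 * np.arange(1000)) + 0.1 * rng.standard_normal(1000)
    rp = recurrence_plot(signal)
    print(rp.shape, rp.mean())   # image size and recurrence rate
    ```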

  13. A Method for Aileron Actuator Fault Diagnosis Based on PCA and PGC-SVM

    Directory of Open Access Journals (Sweden)

    Wei-Li Qin

    2016-01-01

Full Text Available Aileron actuators are pivotal components of an aircraft flight control system. Thus, the fault diagnosis of aileron actuators is vital to enhancing reliability and fault tolerant capability. This paper presents an aileron actuator fault diagnosis approach combining principal component analysis (PCA), grid search (GS), 10-fold cross validation (CV), and one-versus-one support vector machine (SVM). This method is referred to as PGC-SVM and utilizes the direct drive valve input, force motor current, and displacement feedback signal to realize fault detection and location. First, several common faults of aileron actuators, which include force motor coil break, sensor coil break, cylinder leakage, and amplifier gain reduction, are extracted from the fault quadrantal diagram; the corresponding fault mechanisms are analyzed. Second, data feature extraction is performed with dimension reduction using PCA. Finally, the GS and CV algorithms are employed to train a one-versus-one SVM for fault classification, thus obtaining the optimal model parameters and assuring the generalization of the trained SVM, respectively. To verify the effectiveness of the proposed approach, four types of faults are introduced into the simulation model established by AMESim and Simulink. The results demonstrate its desirable diagnostic performance, which outperforms that of the traditional SVM by comparison.
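
    A hedged sketch of a comparable PCA + grid-search + cross-validated one-versus-one SVM pipeline using scikit-learn; the synthetic feature matrix, parameter grid and component count are placeholders, not values from the paper:

    ```python
    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.svm import SVC
    from sklearn.model_selection import GridSearchCV

    # synthetic stand-in for the actuator signal feature matrix:
    # rows = observations, columns = features; y = fault class labels
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 12))
    y = rng.integers(0, 5, 200)          # 4 fault classes + healthy, labels 0..4

    pipe = Pipeline([
        ("scale", StandardScaler()),
        ("pca", PCA(n_components=6)),                    # dimension reduction
        ("svm", SVC(decision_function_shape="ovo")),     # one-versus-one SVM
    ])

    # grid search with 10-fold cross validation over the RBF-SVM parameters
    grid = GridSearchCV(pipe,
                        {"svm__C": [1, 10, 100], "svm__gamma": [0.01, 0.1, 1.0]},
                        cv=10)
    grid.fit(X, y)
    print(grid.best_params_, grid.best_score_)
    ```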

  14. An adaptive unsaturated bistable stochastic resonance method and its application in mechanical fault diagnosis

    Science.gov (United States)

    Qiao, Zijian; Lei, Yaguo; Lin, Jing; Jia, Feng

    2017-02-01

    In mechanical fault diagnosis, most traditional methods for signal processing attempt to suppress or cancel noise imbedded in vibration signals for extracting weak fault characteristics, whereas stochastic resonance (SR), as a potential tool for signal processing, is able to utilize the noise to enhance fault characteristics. The classical bistable SR (CBSR), as one of the most widely used SR methods, however, has the disadvantage of inherent output saturation. The output saturation not only reduces the output signal-to-noise ratio (SNR) but also limits the enhancement capability for fault characteristics. To overcome this shortcoming, a novel method is proposed to extract the fault characteristics, where a piecewise bistable potential model is established. Simulated signals are used to illustrate the effectiveness of the proposed method, and the results show that the method is able to extract weak fault characteristics and has good enhancement performance and anti-noise capability. Finally, the method is applied to fault diagnosis of bearings and planetary gearboxes, respectively. The diagnosis results demonstrate that the proposed method can obtain larger output SNR, higher spectrum peaks at fault characteristic frequencies and therefore larger recognizable degree than the CBSR method.
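
    For context, a minimal sketch of the classical bistable SR system that the abstract contrasts against, integrated with the Euler-Maruyama scheme; the parameters a, b and the noise intensity D are illustrative assumptions, and the paper's piecewise (unsaturated) potential is not reproduced here:

    ```python
    import numpy as np

    def bistable_sr(s, dt, a=1.0, b=1.0, D=0.5, seed=0):
        """Euler-Maruyama integration of the classical bistable SR system
        dx/dt = a*x - b*x**3 + s(t) + sqrt(2*D)*xi(t),
        i.e. motion in the double-well potential U(x) = -a*x^2/2 + b*x^4/4.
        Parameters a, b, D are illustrative, not tuned values from the paper."""
        rng = np.random.default_rng(seed)
        x = np.zeros(len(s))
        for k in range(1, len(s)):
            drift = a * x[k - 1] - b * x[k - 1] ** 3 + s[k - 1]
            x[k] = x[k - 1] + drift * dt + np.sqrt(2 * D * dt) * rng.standard_normal()
        return x

    # weak periodic "fault characteristic" buried in noise
    fs, f0 = 1000.0, 5.0
    t = np.arange(0, 4, 1 / fs)
    s = 0.2 * np.sin(2 * np.pi * f0 * t) + 0.8 * np.random.default_rng(1).standard_normal(len(t))
    y = bistable_sr(s, dt=1 / fs)

    spec = np.abs(np.fft.rfft(y)) / len(y)
    freqs = np.fft.rfftfreq(len(y), 1 / fs)
    print("spectral amplitude near f0:", spec[np.argmin(np.abs(freqs - f0))])
    ```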

  15. Time-frequency atoms-driven support vector machine method for bearings incipient fault diagnosis

    Science.gov (United States)

    Liu, Ruonan; Yang, Boyuan; Zhang, Xiaoli; Wang, Shibin; Chen, Xuefeng

    2016-06-01

Bearings play an essential role in the performance of mechanical systems, and fault diagnosis of a mechanical system is inseparably related to the diagnosis of its bearings. However, it is a challenge to detect weak faults in complex and non-stationary vibration signals with a large amount of noise, especially at an early stage. To improve the anti-noise ability and detect incipient faults, a novel fault detection method based on a short-time matching method and a Support Vector Machine (SVM) is proposed. In this paper, the mechanism of the roller bearing is discussed and an impact time-frequency dictionary is constructed targeting the multi-component characteristics and fault features of roller bearing fault vibration signals. Then, a short-time matching method is described, and the simulation results show excellent feature extraction at extremely low signal-to-noise ratio (SNR). After extracting the most relevant atoms as features, an SVM was trained for fault recognition. Finally, practical bearing experiments indicate that the proposed method is more effective and efficient than traditional methods in extracting the oscillatory characteristics of weak impact signals and in incipient fault diagnosis.

  16. Improving treatment plan evaluation with automation.

    Science.gov (United States)

    Covington, Elizabeth L; Chen, Xiaoping; Younge, Kelly C; Lee, Choonik; Matuszak, Martha M; Kessler, Marc L; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M; Filpansick, Stephanie E; Moran, Jean M

    2016-11-01

    The goal of this work is to evaluate the effectiveness of Plan-Checker Tool (PCT) which was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted with PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. PACS number(s): 87.55.-x, 87.55.N-, 87.55.Qr, 87.55.tm, 89.20.Bb.

  17. Fault-tolerant Supervisory Control

    DEFF Research Database (Denmark)

    Izadi-Zamanabadi, Roozbeh

The main purpose of this work has been to achieve active fault-tolerance in control systems, defined as a methodology where fault detection and isolation techniques are combined with supervisory control to achieve autonomous accommodation of faults before they develop into failures. The aim ... control algorithms. The drawback is, however, that these control systems have become more vulnerable to even simple faults in instrumentation. On the other hand, due to cost-optimality requirements, an extensive use of hardware redundancy has been prohibited. Nevertheless, the dependency and availability ... could be increased through enhancing control systems' ability to perform on-line fault detection and reconfiguration when a fault occurs and before a safety system shuts down the entire process. The main contributions of this research effort are development and experimentation with methodologies...

  18. Fault Tolerant Wind Farm Control

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob

    2013-01-01

...with best at a wind turbine control level. However, some faults are better dealt with at the wind farm control level, if the wind turbine is located in a wind farm. In this paper a benchmark model for fault detection and isolation, and fault tolerant control of wind turbines implemented at the wind farm control level is presented. The benchmark model includes a small wind farm of nine wind turbines, based on simple models of the wind turbines as well as the wind and interactions between wind turbines in the wind farm. The model includes wind and power reference scenarios as well as three relevant fault scenarios. This benchmark model is used in an international competition dealing with wind farm fault detection and isolation and fault tolerant control.

  19. Mechanical stratigraphy and normal faulting

    Science.gov (United States)

    Ferrill, David A.; Morris, Alan P.; McGinnis, Ronald N.; Smart, Kevin J.; Wigginton, Sarah S.; Hill, Nicola J.

    2017-01-01

    Mechanical stratigraphy encompasses the mechanical properties, thicknesses, and interface properties of rock units. Although mechanical stratigraphy often relates directly to lithostratigraphy, lithologic description alone does not adequately describe mechanical behavior. Analyses of normal faults with displacements of millimeters to 10's of kilometers in mechanically layered rocks reveal that mechanical stratigraphy influences nucleation, failure mode, fault geometry, displacement gradient, displacement distribution, fault core and damage zone characteristics, and fault zone deformation processes. The relationship between normal faulting and mechanical stratigraphy can be used either to predict structural style using knowledge of mechanical stratigraphy, or conversely to interpret mechanical stratigraphy based on characterization of the structural style. This review paper explores a range of mechanical stratigraphic controls on normal faulting illustrated by natural and modeled examples.

  20. USING MUTATION IN FAULT LOCALIZATION

    Directory of Open Access Journals (Sweden)

    Chenglong Sun

    2016-05-01

Full Text Available Fault localization is time-consuming and difficult, which makes it the bottleneck of the debugging process. To help facilitate this task, there exist many fault localization techniques that help narrow down the region of suspicious code in a program. Better accuracy in fault localization is achieved at a heavy computation cost: fault localization techniques that can effectively locate faults also manifest slow response rates. In this paper, we promote the use of pre-computing to distribute the time-intensive computations to the idle periods of the coding phase, in order to speed up such techniques and achieve both low cost and high accuracy. We raise the research problems of finding suitable techniques that can be pre-computed and adapting them to the pre-computing paradigm in a continuous integration environment. Further, we use an existing fault localization technique to demonstrate our research exploration, and show visions and challenges of the related methodologies.

  1. Mine-hoist fault-condition detection based on the wavelet packet transform and kernel PCA

    Institute of Scientific and Technical Information of China (English)

    XIA Shi-xiong; NIU Qiang; ZHOU Yong; ZHANG Lei

    2008-01-01

A new algorithm was developed to correctly identify fault conditions and accurately monitor fault development in a mine hoist. The new method is based on the Wavelet Packet Transform (WPT) and kernel PCA (Kernel Principal Component Analysis, KPCA). For non-linear monitoring systems the key to fault detection is the extraction of the main features. The wavelet packet transform is a signal processing technique that possesses excellent time-frequency localization characteristics and is suitable for analysing time-varying or transient signals. KPCA maps the original input features into a higher-dimensional feature space through a non-linear mapping; the principal components are then found in that feature space. The KPCA transformation was applied to extract the main nonlinear features from experimental fault feature data after wavelet packet transformation. The results show that the proposed method affords credible fault detection and identification.
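
    A rough sketch of WPT energy features followed by kernel PCA, assuming the PyWavelets and scikit-learn packages; the wavelet, decomposition level and RBF kernel width are assumptions rather than the paper's settings:

    ```python
    import numpy as np
    import pywt
    from sklearn.decomposition import KernelPCA

    def wpt_energy_features(signal, wavelet="db4", level=3):
        """Energy of each terminal node of a wavelet packet decomposition,
        normalised to unit sum -- a common WPT feature vector (2**level values)."""
        wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
        energies = np.array([np.sum(node.data ** 2)
                             for node in wp.get_level(level, order="natural")])
        return energies / energies.sum()

    # toy feature matrix: one row of WPT energies per vibration record
    rng = np.random.default_rng(0)
    X = np.vstack([wpt_energy_features(rng.standard_normal(1024)) for _ in range(50)])

    # nonlinear feature extraction with kernel PCA (RBF kernel assumed)
    kpca = KernelPCA(n_components=3, kernel="rbf", gamma=5.0)
    scores = kpca.fit_transform(X)
    print(scores.shape)   # (50, 3) principal components in the kernel feature space
    ```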

  2. Dynamic Reconstruction-Based Fuzzy Neural Network Method for Fault Detection in Chaotic System

    Institute of Scientific and Technical Information of China (English)

    YANG Hongying; YE Hao; WANG Guizeng

    2008-01-01

This paper presents a method for detecting weak fault signals in chaotic systems based on the chaotic dynamics reconstruction technique and the fuzzy neural system (FNS). The Grassberger-Procaccia algorithm and least squares regression were used to calculate the correlation dimension for the model order estimate. Based on the model order, an appropriately structured FNS model was designed to predict system faults. Through reasonable analysis of the predicted errors, the disturbed signal can be extracted efficiently and correctly from the chaotic background. Satisfactory results were obtained by using several kinds of simulated faults extracted from practical chaotic fault systems. Experimental results demonstrate that the proposed approach has good prediction accuracy and can deal with data having a -40 dB signal-to-noise ratio (SNR). The low SNR requirement makes the approach a powerful tool for early fault detection.

  3. Discrete wavelet transform-based fault diagnosis for driving system of pipeline detection robot arm

    Institute of Scientific and Technical Information of China (English)

    Deng Huiyu; Wang Xinli; Ma Peisun

    2005-01-01

A real-time wavelet multi-resolution analysis (MRA)-based fault detection algorithm is proposed. The first-stage detail MRA signals extracted from the original signals were used as the criteria for fault detection. By measuring sharp variations in the detail MRA signals, faults in the motor driving system of a pipeline detection robot arm could be detected. The fault type was then identified by comparing the sharp variations of the three-phase MRA signals. The effects of the faults were examined. The simulation results show that this algorithm is effective and robust, and it is promising for fault detection in a robot's joint driving system. The method is simple and rapid, and it can operate in real time.
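
    A minimal sketch of detecting sharp variations in the first-stage detail coefficients of a DWT, using PyWavelets; the wavelet choice, the robust noise estimate and the threshold factor are assumptions for illustration:

    ```python
    import numpy as np
    import pywt

    def dwt_fault_indicator(signal, wavelet="db4", k=5.0):
        """Level-1 detail coefficients of a DWT plus a simple magnitude threshold;
        sharp variations above k robust-sigmas are flagged as suspect."""
        _, cD1 = pywt.dwt(signal, wavelet)           # level-1 detail (high-pass)
        sigma = np.median(np.abs(cD1)) / 0.6745      # robust noise estimate
        return cD1, np.where(np.abs(cD1) > k * sigma)[0]

    # toy phase-current signal with an injected step at sample 600
    rng = np.random.default_rng(2)
    x = np.sin(2 * np.pi * 0.01 * np.arange(1200)) + 0.05 * rng.standard_normal(1200)
    x[600:] += 0.8                                   # abrupt change mimicking a fault
    cD1, hits = dwt_fault_indicator(x)
    print("suspect detail-coefficient indices:", hits)
    ```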

  4. Diagonal slice spectrum assisted optimal scale morphological filter for rolling element bearing fault diagnosis

    Science.gov (United States)

    Li, Yifan; Liang, Xihui; Zuo, Ming J.

    2017-02-01

This paper presents a novel signal processing scheme, the diagonal slice spectrum assisted optimal scale morphological filter (DSS-OSMF), for rolling element fault diagnosis. In this scheme, the concept of quadratic frequency coupling (QFC) is first defined and the ability of the diagonal slice spectrum (DSS) to detect QFC is derived. The DSS-OSMF possesses the merits of suppressing noise and detecting QFC. It can remove fault-independent frequency components and give a clear representation of fault symptoms. A simulated vibration signal and experimental vibration signals collected from a bearing test rig are employed to evaluate the effectiveness of the proposed method. Results show that the proposed method has a superior performance in extracting fault features of a defective rolling element bearing. In addition, comparisons are performed between a multi-scale morphological filter (MMF) and a DSS-OSMF. The DSS-OSMF outperforms the MMF in detecting an outer race fault and a rolling element fault of a rolling element bearing.
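
    As background, a sketch of a basic flat-structuring-element morphological filter (the average of grey-scale opening and closing) using SciPy; the scale is fixed by hand here, whereas DSS-OSMF selects it from the diagonal slice spectrum, which is not reproduced:

    ```python
    import numpy as np
    from scipy.ndimage import grey_opening, grey_closing

    def morph_filter(signal, scale):
        """Average of grey-scale opening and closing with a flat structuring
        element of `scale` samples -- a basic morphological filter."""
        return 0.5 * (grey_opening(signal, size=scale) + grey_closing(signal, size=scale))

    # toy impulsive fault signature buried in noise
    rng = np.random.default_rng(4)
    x = 0.3 * rng.standard_normal(2000)
    x[::200] += 2.0                      # periodic impacts every 200 samples
    filtered = morph_filter(x, scale=7)  # scale chosen by hand here, not optimised
    print(x.std(), filtered.std())
    ```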

  5. FAULT DIAGNOSIS APPROACH FOR ROLLER BEARINGS BASED ON EMPIRICAL MODE DECOMPOSITION METHOD AND HILBERT TRANSFORM

    Institute of Scientific and Technical Information of China (English)

    Yu Dejie; Cheng Junsheng; Yang Yu

    2005-01-01

Based upon the empirical mode decomposition (EMD) method and the Hilbert spectrum, a method for fault diagnosis of roller bearings is proposed. The orthogonal wavelet bases are used to translate vibration signals of a roller bearing into a time-scale representation; then, an envelope signal can be obtained by envelope spectrum analysis of the wavelet coefficients of high scales. By applying the EMD method and the Hilbert transform to the envelope signal, we can get the local Hilbert marginal spectrum, from which the faults in a roller bearing can be diagnosed and fault patterns can be identified. Practical vibration signals measured from roller bearings with outer-race faults or inner-race faults are analyzed by the proposed method. The results show that the proposed method is superior to the traditional envelope spectrum method in extracting the fault characteristics of roller bearings.
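
    A hedged sketch of the EMD-plus-Hilbert building blocks on a toy amplitude-modulated signal, assuming the EMD-signal (PyEMD) package; the wavelet-based envelope step and the marginal spectrum of the paper are not reproduced:

    ```python
    import numpy as np
    from scipy.signal import hilbert
    from PyEMD import EMD    # from the EMD-signal package (an assumption; any EMD implementation works)

    fs = 2000.0
    t = np.arange(0, 1, 1 / fs)
    # toy bearing-like signal: 120 Hz "fault" modulation on a 600 Hz carrier plus noise
    x = (1 + 0.5 * np.sin(2 * np.pi * 120 * t)) * np.sin(2 * np.pi * 600 * t)
    x += 0.2 * np.random.default_rng(0).standard_normal(len(t))

    imfs = EMD().emd(x)                       # intrinsic mode functions, highest frequency first
    analytic = hilbert(imfs[0])               # analytic signal of the first IMF
    envelope = np.abs(analytic)               # instantaneous amplitude
    env_spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
    freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
    print("dominant envelope frequency:", freqs[np.argmax(env_spec)])   # expect ~120 Hz for this toy signal
    ```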

  6. Comparative Study of Parametric and Non-parametric Approaches in Fault Detection and Isolation

    DEFF Research Database (Denmark)

    Katebi, S.D.; Blanke, M.; Katebi, M.R.

This report describes a comparative study between two approaches to fault detection and isolation in dynamic systems. The first approach uses a parametric model of the system; the main components of such techniques are residual and signature generation for processing and analysis. The second approach is non-parametric in the sense that the signature analysis is only dependent on the frequency or time domain information extracted directly from the input-output signals. Based on these approaches, two different fault monitoring schemes are developed where the feature extraction and fault decision...

  7. Planetary gearbox fault diagnosis using an adaptive stochastic resonance method

    Science.gov (United States)

    Lei, Yaguo; Han, Dong; Lin, Jing; He, Zhengjia

    2013-07-01

Planetary gearboxes are widely used in aerospace, automotive and heavy industry applications due to their large transmission ratio, strong load-bearing capacity and high transmission efficiency. The tough operating conditions of heavy duty and intensive impact load may cause gear tooth damage such as fatigue cracks and missing teeth. The challenging issues in fault diagnosis of planetary gearboxes include selection of sensitive measurement locations, investigation of vibration transmission paths and weak feature extraction. One of them is how to effectively discover the weak characteristics from noisy signals of faulty components in planetary gearboxes. To address this issue in fault diagnosis of planetary gearboxes, an adaptive stochastic resonance (ASR) method is proposed in this paper. The ASR method utilizes the optimization ability of ant colony algorithms and adaptively realizes the optimal stochastic resonance system matching the input signals. Using the ASR method, the noise may be weakened and weak characteristics highlighted, and therefore the faults can be diagnosed accurately. A planetary gearbox test rig is established and experiments with sun gear faults, including a chipped tooth and a missing tooth, are conducted. The vibration signals are collected under loaded conditions and various motor speeds. The proposed method is used to process the collected signals, and the results of feature extraction and fault diagnosis demonstrate its effectiveness.

  8. Multiscale Permutation Entropy Based Rolling Bearing Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Jinde Zheng

    2014-01-01

Full Text Available A new rolling bearing fault diagnosis approach based on multiscale permutation entropy (MPE), Laplacian score (LS), and support vector machines (SVMs) is proposed in this paper. Permutation entropy (PE) was recently proposed and defined to measure the randomicity and detect dynamical changes of time series. However, owing to the complexity of mechanical systems, the randomicity and dynamic changes of the vibration signal will exist at different scales. Thus, the definition of MPE is introduced and employed to extract the nonlinear fault characteristics from the bearing vibration signal at different scales. Besides, the SVM is utilized to accomplish the fault feature classification to fulfill the diagnostic procedure automatically. Meanwhile, in order to avoid a high dimension of features, the Laplacian score (LS) is used to refine the feature vector by ranking the features according to their importance and correlations with the main fault information. Finally, the rolling bearing fault diagnosis method based on MPE, LS, and SVM is proposed and applied to the experimental data. The experimental data analysis results indicate that the proposed method can identify the fault categories effectively.
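
    A minimal NumPy sketch of multiscale permutation entropy (coarse-graining followed by Bandt-Pompe permutation entropy); the order, delay and scale range are assumed values, and the LS feature selection and SVM stages are omitted:

    ```python
    import numpy as np
    from itertools import permutations

    def permutation_entropy(x, m=3, delay=1):
        """Normalised permutation entropy of order m (Bandt-Pompe)."""
        patterns = list(permutations(range(m)))
        counts = np.zeros(len(patterns))
        for i in range(len(x) - (m - 1) * delay):
            window = x[i:i + m * delay:delay]
            key = tuple(int(v) for v in np.argsort(window))   # ordinal pattern of the window
            counts[patterns.index(key)] += 1
        p = counts[counts > 0] / counts.sum()
        return -np.sum(p * np.log(p)) / np.log(len(patterns))

    def multiscale_pe(x, scales=range(1, 11), m=3):
        """Coarse-grain the series at each scale, then take its permutation entropy."""
        feats = []
        for s in scales:
            n = len(x) // s
            coarse = x[:n * s].reshape(n, s).mean(axis=1)   # non-overlapping averages
            feats.append(permutation_entropy(coarse, m=m))
        return np.array(feats)

    rng = np.random.default_rng(0)
    print(multiscale_pe(rng.standard_normal(4096)))   # near 1 at every scale for white noise
    ```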

  9. Thruster fault identification method for autonomous underwater vehicle using peak region energy and least square grey relational grade

    Directory of Open Access Journals (Sweden)

    Mingjun Zhang

    2015-12-01

Full Text Available A novel thruster fault identification method for autonomous underwater vehicles is presented in this article. It uses the proposed peak region energy method to extract the fault feature and the proposed least square grey relational grade method to estimate the fault degree. The peak region energy method is developed from the fusion feature modulus maximum method: it applies the fusion feature modulus maximum method to obtain the fusion feature and then regards the maximum of the peak region energy in the convolution results of the fusion feature as the fault feature. The least square grey relational grade method is developed from the grey relational analysis algorithm: it determines the fault degree interval by the grey relational analysis algorithm and then estimates the fault degree within that interval by the least square algorithm. Pool experiments on the experimental prototype are conducted to verify the effectiveness of the proposed methods. The experimental results show that the fault feature extracted by the peak region energy method is monotonic with fault degree, while the one extracted by the fusion feature modulus maximum method is not. The least square grey relational grade method can further produce an estimate between adjacent standard fault degrees, whereas the estimate of the grey relational analysis algorithm is just one of the standard fault degrees.

  10. Final Technical Report: PV Fault Detection Tool.

    Energy Technology Data Exchange (ETDEWEB)

    King, Bruce Hardison [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Christian Birk [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The PV Fault Detection Tool project plans to demonstrate that the FDT can (a) detect catastrophic and degradation faults and (b) identify the type of fault. This will be accomplished by collecting fault signatures using different instruments and integrating this information to establish a logical controller for detecting, diagnosing and classifying each fault.

  11. Toward Automated Feature Detection in UAVSAR Images

    Science.gov (United States)

    Parker, J. W.; Donnellan, A.; Glasscoe, M. T.

    2014-12-01

Edge detection identifies seismic or aseismic fault motion, as demonstrated in repeat-pass interferograms obtained by the Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) program. But this identification is not robust at present: it requires a flattened background image, interpolation into missing data (holes) and outliers, and background noise that is either sufficiently small or roughly white Gaussian. Identification and mitigation of nongaussian background image noise is essential to creating a robust, automated system to search for such features. Clearly a robust method is needed for machine scanning of the thousands of UAVSAR repeat-pass interferograms for evidence of fault slip, landslides, and other local features. Empirical examination of detrended noise based on 20 km east-west profiles through desert terrain with little tectonic deformation for a suite of flight interferograms shows nongaussian characteristics. Statistical measurement of curvature with varying length scale (Allan variance) shows nearly white behavior (Allan variance slope with spatial distance from roughly -1.76 to -2) from 25 to 400 meters; deviations from -2 suggest short-range differences (such as used in detecting edges) are often freer of noise than longer-range differences. At distances longer than 400 m the Allan variance flattens out without consistency from one interferogram to another. We attribute this additional noise afflicting difference estimates at longer distances to atmospheric water vapor and uncompensated aircraft motion. Paradoxically, California interferograms made with increasing time intervals before and after the El Mayor-Cucapah earthquake (2010, M7.2, Mexico) show visually stronger and more interesting edges, but edge detection methods developed for the first year do not produce reliable results over the first two years, because longer time spans suffer reduced coherence in the interferogram. The changes over time are reflecting fault slip and block
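
    A rough sketch of the Allan-variance-versus-length-scale measurement on a detrended profile; the sampling interval, lags and white-noise stand-in are assumptions, and conventions for the expected slope differ in the literature:

    ```python
    import numpy as np

    def allan_variance(profile, spacing, lags):
        """Allan variance of a 1-D displacement profile versus averaging length.

        profile: detrended values sampled every `spacing` metres;
        lags: averaging lengths in samples. Returns (lengths_m, avar)."""
        avars = []
        for m in lags:
            n = len(profile) // m
            means = profile[:n * m].reshape(n, m).mean(axis=1)   # bin averages
            avars.append(0.5 * np.mean(np.diff(means) ** 2))
        return np.array(lags) * spacing, np.array(avars)

    # toy 20 km east-west profile sampled every 25 m (white-noise stand-in)
    rng = np.random.default_rng(3)
    profile = 0.01 * rng.standard_normal(800)
    lengths, avar = allan_variance(profile, spacing=25.0, lags=[1, 2, 4, 8, 16])
    slope = np.polyfit(np.log(lengths), np.log(avar), 1)[0]
    print("log-log slope:", slope)   # about -1 for white noise with this estimator
    ```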

  12. Automated cognome construction and semi-automated hypothesis generation.

    Science.gov (United States)

    Voytek, Jessica B; Voytek, Bradley

    2012-06-30

    Modern neuroscientific research stands on the shoulders of countless giants. PubMed alone contains more than 21 million peer-reviewed articles with 40-50,000 more published every month. Understanding the human brain, cognition, and disease will require integrating facts from dozens of scientific fields spread amongst millions of studies locked away in static documents, making any such integration daunting, at best. The future of scientific progress will be aided by bridging the gap between the millions of published research articles and modern databases such as the Allen brain atlas (ABA). To that end, we have analyzed the text of over 3.5 million scientific abstracts to find associations between neuroscientific concepts. From the literature alone, we show that we can blindly and algorithmically extract a "cognome": relationships between brain structure, function, and disease. We demonstrate the potential of data-mining and cross-platform data-integration with the ABA by introducing two methods for semi-automated hypothesis generation. By analyzing statistical "holes" and discrepancies in the literature we can find understudied or overlooked research paths. That is, we have added a layer of semi-automation to a part of the scientific process itself. This is an important step toward fundamentally incorporating data-mining algorithms into the scientific method in a manner that is generalizable to any scientific or medical field.
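
    A toy sketch of the kind of concept co-occurrence counting that underlies such literature mining; the corpus and term list here are invented placeholders, not the actual pipeline:

    ```python
    from itertools import combinations
    from collections import Counter

    # tiny stand-in corpus; the real analysis ran over ~3.5 million abstracts
    abstracts = [
        "Theta oscillations in the hippocampus support working memory.",
        "Hippocampus lesions impair memory consolidation in patients.",
        "Prefrontal cortex and hippocampus interact during memory retrieval.",
    ]
    terms = ["hippocampus", "memory", "prefrontal cortex", "theta"]   # illustrative concept list

    cooc = Counter()
    for text in abstracts:
        present = [t for t in terms if t in text.lower()]
        for pair in combinations(sorted(present), 2):
            cooc[pair] += 1            # abstracts in which both concepts co-occur

    print(cooc.most_common())          # e.g. ('hippocampus', 'memory') co-occurs in all 3
    ```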

  13. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure. The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  14. Fault-Tree Compiler Program

    Science.gov (United States)

    Butler, Ricky W.; Martensen, Anna L.

    1992-01-01

    FTC, Fault-Tree Compiler program, is reliability-analysis software tool used to calculate probability of top event of fault tree. Five different types of gates allowed in fault tree: AND, OR, EXCLUSIVE OR, INVERT, and M OF N. High-level input language of FTC easy to understand and use. Program supports hierarchical fault-tree-definition feature simplifying process of description of tree and reduces execution time. Solution technique implemented in FORTRAN, and user interface in Pascal. Written to run on DEC VAX computer operating under VMS operating system.
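
    For reference, a small sketch of the gate arithmetic behind top-event probability for independent basic events; this illustrates AND/OR combination only and is not the FTC program or its input language (the tree and probabilities are invented):

    ```python
    # Minimal sketch of top-event probability for a small fault tree with
    # independent basic events -- illustrating the gate arithmetic, not the FTC tool.

    def p_and(*probs):
        """AND gate: all inputs must fail."""
        out = 1.0
        for p in probs:
            out *= p
        return out

    def p_or(*probs):
        """OR gate: at least one input fails (computed via complements)."""
        out = 1.0
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out

    # basic-event probabilities (illustrative values)
    pump_a, pump_b, valve = 1e-3, 1e-3, 5e-4

    # TOP = (pump A AND pump B) OR valve
    top = p_or(p_and(pump_a, pump_b), valve)
    print(f"top event probability: {top:.3e}")   # ~5.01e-04
    ```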

  15. Seismological Studies for Tensile Faults

    Directory of Open Access Journals (Sweden)

    Gwo-Bin Ou

    2008-01-01

Full Text Available A shear slip fault, the equivalent of a double couple source, has often been assumed as the kinematic source model in ground motion simulation. Estimation of seismic moment based on the shear slip model indicates the size of an earthquake. However, if the dislocation of the hanging wall relative to the footwall includes not only a shear slip tangent to the fault plane but also expansion and compression normal to the fault plane, the radiated seismic waves will differ from those of a shear slip fault. Taking into account the effects resulting from expansion and compression on a fault plane, we can resolve the tension and pressure axes as well as the fault plane solution more exactly from ground motions than previously, and can evaluate how far a fault zone opens or contracts during a developing rupture. In addition to a tensile angle and Poisson's ratio for the medium, a tensile fault with five degrees of freedom has been extended from the shear slip fault with only three degrees of freedom: strike, dip, and slip.

  16. An Overview of Transmission Line Protection by Artificial Neural Network: Fault Detection, Fault Classification, Fault Location, and Fault Direction Discrimination

    Directory of Open Access Journals (Sweden)

    Anamika Yadav

    2014-01-01

Full Text Available Contemporary power systems are associated with serious issues of faults on high voltage transmission lines. Instant isolation of a fault is necessary to maintain system stability. A protective relay utilizes current and voltage signals to detect, classify, and locate the fault in a transmission line. A trip signal is sent by the relay to a circuit breaker to disconnect the faulted line from the rest of the system in case of a disturbance, maintaining the stability of the remaining healthy system. This paper focuses on studies of fault detection, fault classification, fault location, fault phase selection, and fault direction discrimination using the artificial neural network approach. Artificial neural networks are valuable for power system applications as they can be trained with offline data. Efforts have been made in this study to incorporate and review approximately all important techniques and philosophies of transmission line protection reported in the literature until June 2014. This comprehensive and exhaustive survey should make it easier for new researchers to evaluate different ANN-based techniques, with a set of references to all concerned contributions.

  17. Application of neural networks for identification of faults in a 3D seismic survey offshore Tunisia

    Science.gov (United States)

    Mastouri, Raja; Marchant, Robin; Marillier, François; Jaboyedoff, Michel; Bouaziz, Samir

    2013-04-01

The Kerkennah High area (offshore Tunisia) is dominated by series of horsts and grabens resulting from multiple tectonic events and multiphase stress (extension, compression, translation). In order to decipher this complex structural history from a 3D seismic survey, a neural network is applied to extract a fault cube from the amplitude data (which does not image faults directly). The neural network transforms seismic attributes into a new 3D data cube in which faults are highlighted. This technique comprises the following steps. First, we compute several seismic attributes (dip-steering similarity, curvature, frequency, ridge and fault enhancement filters…) that enhance different aspects of the seismic data related to faulting. In a second step, a number of points in the seismic data are selected as representative of either faults or areas devoid of faults. These points are used by the artificial neural network to determine the ranges in which the different attributes are representative of faults or not. Based on this learning phase, the neural network is then applied to the entire 3D seismic cube to produce a fault cube that contains only faults whose contrast and continuity have been enhanced.
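
    A hedged sketch of the learning and application steps using a scikit-learn MLP as a stand-in for the neural network; the attribute cube, picked points and labels are synthetic placeholders:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # toy attribute cube: 4 seismic attributes on a small (inline, xline, time) grid
    rng = np.random.default_rng(0)
    cube = rng.standard_normal((32, 32, 64, 4))

    # interpreter-picked example locations: 1 = fault, 0 = no fault (synthetic here)
    n_train = 300
    idx = tuple(rng.integers(0, s, n_train) for s in cube.shape[:3])
    X_train = cube[idx]                          # attribute vectors at the picked points
    y_train = rng.integers(0, 2, n_train)        # stand-in labels

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    clf.fit(X_train, y_train)

    # apply to every sample of the survey to obtain a fault-probability cube
    fault_prob = clf.predict_proba(cube.reshape(-1, 4))[:, 1].reshape(cube.shape[:3])
    print(fault_prob.shape)                      # (32, 32, 64)
    ```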

  18. Diagnosis of combined faults in Rotary Machinery by Non-Naive Bayesian approach

    Science.gov (United States)

    Asr, Mahsa Yazdanian; Ettefagh, Mir Mohammad; Hassannejad, Reza; Razavi, Seyed Naser

    2017-02-01

When combined faults happen in different parts of rotating machines, their features are profoundly dependent. Experts are completely familiar with the characteristics of individual faults, and enough data are available for single faults, but the problem arises when the faults are combined and the separation of characteristics becomes complex. Therefore, experts cannot state exact information about the symptoms of a combined fault and its quality. In this paper, to overcome this drawback, a novel method is proposed. The core idea of the method is to identify a combined fault without using combined-fault features as the training data set; only individual-fault features are applied in the training step. For this purpose, after data acquisition and resampling of the obtained vibration signals, Empirical Mode Decomposition (EMD) is utilized to decompose the multi-component signals into Intrinsic Mode Functions (IMFs). With the use of the correlation coefficient, proper IMFs for feature extraction are selected. In the feature extraction step, the Shannon energy entropy of the IMFs was extracted as well as statistical features. It is obvious that most of the extracted features are strongly dependent. To consider this matter, a Non-Naive Bayesian Classifier (NNBC) is appointed, which relaxes the fundamental assumption of Naive Bayes, i.e., the independence among features. To demonstrate the superiority of NNBC, counterpart methods, including the normal Naive Bayesian classifier, the kernel Naive Bayesian classifier and back-propagation neural networks, were applied and the classification results are compared. Experimental vibration signals, collected from an automobile gearbox, were used to verify the effectiveness of the proposed method. During the classification process, only the features related individually to the healthy state, bearing failure and gear failures were assigned for training the classifier, while combined fault features (combined gear and bearing failures) were examined as test data. The achieved...

  19. Semi-automatic mapping of fault rocks on a Digital Outcrop Model, Gole Larghe Fault Zone (Southern Alps, Italy)

    Science.gov (United States)

    Mittempergher, Silvia; Vho, Alice; Bistacchi, Andrea

    2016-04-01

...them with respect to biotite. In higher resolution images this could be performed using circularity and size thresholds; however, this could not easily be implemented in an automated procedure since the thresholds must be varied by the interpreter for almost every image. In 1 x 1 m images the resolution is generally too low to distinguish cataclasite from pseudotachylyte, so most of the time fault rocks were treated together. For this analysis we developed a fully automated workflow that, after applying noise correction, classification and skeletonization algorithms, returns labeled edge images of fault segments together with vector polylines associated with edge properties. Vector and edge properties represent a useful format for further quantitative analysis, for instance for classifying fault segments based on structural criteria, detecting continuous fault traces, and detecting the kind of termination of faults/fractures. This approach allows the collection of statistically relevant datasets useful for further quantitative structural analysis.

  20. A New Automated Fingerprint Identification System

    Institute of Scientific and Technical Information of China (English)

沈学宁; 程民德; et al.

    1989-01-01

A new automated fingerprint identification system is proposed. In this system, based on some local properties of the digital image, the shape and minutiae features of a fingerprint can be extracted from the grey-level image without binarizing and thinning. In a query, a latent fingerprint can be matched with the filed fingerprints by shape and/or minutiae features. Matching by shape features is much faster than by minutiae.

  1. Fault Tolerance in ZigBee Wireless Sensor Networks

    Science.gov (United States)

    Alena, Richard; Gilstrap, Ray; Baldwin, Jarren; Stone, Thom; Wilson, Pete

    2011-01-01

    Wireless sensor networks (WSN) based on the IEEE 802.15.4 Personal Area Network standard are finding increasing use in the home automation and emerging smart energy markets. The network and application layers, based on the ZigBee 2007 PRO Standard, provide a convenient framework for component-based software that supports customer solutions from multiple vendors. This technology is supported by System-on-a-Chip solutions, resulting in extremely small and low-power nodes. The Wireless Connections in Space Project addresses the aerospace flight domain for both flight-critical and non-critical avionics. WSNs provide the inherent fault tolerance required for aerospace applications utilizing such technology. The team from Ames Research Center has developed techniques for assessing the fault tolerance of ZigBee WSNs challenged by radio frequency (RF) interference or WSN node failure.

  2. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  3. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  4. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  5. Hydrometeorological Automated Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Office of Hydrologic Development of the National Weather Service operates HADS, the Hydrometeorological Automated Data System. This data set contains the last 48...

  6. Automating the Media Center.

    Science.gov (United States)

    Holloway, Mary A.

    1988-01-01

    Discusses the need to develop more efficient information retrieval skills by the use of new technology. Lists four stages used in automating the media center. Describes North Carolina's pilot programs. Proposes benefits and looks at the media center's future. (MVL)

  7. Study on Fault Current of DFIG during Slight Fault Condition

    Directory of Open Access Journals (Sweden)

    Xiangping Kong

    2013-04-01

Full Text Available In order to ensure the safety of a DFIG when a severe fault happens, crowbar protection is adopted. But under slight fault conditions, the crowbar protection will not trip, and the DFIG is still excited by the AC-DC-AC converter. In this condition, the operating characteristics of the converter have a large influence on the fault current characteristics of the DFIG. By theoretical analysis and digital simulation, the fault current characteristics of the DFIG during slight voltage dips are studied, and the influence of the converter's controller parameters on the fault current characteristics is analyzed in particular. This builds a basis for the design of relay protection suitable for power grids with DFIG integration.

  8. Homogeneous Earthquake Faulting, Stress and Fault Strength on Kilometer Scales

    Science.gov (United States)

    Hardebeck, J. L.

    2006-12-01

I investigate small-scale fault structure using three new high-quality focal mechanism datasets of small (M ...) earthquakes ... Loma Prieta earthquake. I quantify the degree of mechanism variability on a range of length scales by comparing the hypocentral distance between every pair of events and the angular difference between their focal mechanisms. I explore the implications of focal mechanism variability for the heterogeneity or homogeneity of stress and fault strength on various length scales. Focal mechanisms are very similar, often identical to within the 1σ uncertainty of ~25°, on small length scales of ... the effect of uncertainty in earthquake locations and focal mechanisms on the apparent mechanism variability. The result that fault geometry, stress and fault strength are generally homogeneous on ~10 km length scales is encouraging for understanding earthquake physics. It may be possible to measure these parameters with enough precision to be useful in studying and modeling large earthquakes and the behavior of major faults.

  9. Fault Diagnosis and Fault Handling for Autonomous Aircraft

    DEFF Research Database (Denmark)

    Hansen, Søren

Unmanned Aerial Vehicles (UAVs) or drones are used increasingly for missions where piloted aircraft are unsuitable. Unmanned aircraft have a number of advantages with respect to size, weight and manoeuvrability that make it possible for them to solve tasks that an aircraft previously has been ... that the fault is discovered in time such that appropriate actions can be taken. That could either be the aircraft's controlling computer taking the fault into account or a human operator that intervenes. Detection of faults that occur during flight is exactly the subject of this thesis. Safety towards faults ... to another type of aircraft with different parameters. Amongst the main findings of this research project is a method to handle faults on the UAV's pitot tube, which measures the aircraft speed. A set of software redundancies based on GPS velocity information and engine thrust are used to detect abnormal...

  10. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  11. Repeated extraction of DNA from FTA cards

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Ferrero, Laura; Børsting, Claus

    2011-01-01

    Extraction of DNA using magnetic bead based techniques on automated DNA extraction instruments provides a fast, reliable and reproducible method for DNA extraction from various matrices. However, the yield of extracted DNA from FTA-cards is typically low. Here, we demonstrate that it is possible...... to repeatedly extract DNA from the processed FTA-disk. The method increases the yield from the nanogram range to the microgram range....

  12. Repeated extraction of DNA from FTA cards

    OpenAIRE

    Stangegaard, Michael; Ferrero, Laura; Børsting, Claus; Frank-Hansen, Rune; Hansen, Anders Johannes; Morling, Niels

    2011-01-01

    Extraction of DNA using magnetic bead based techniques on automated DNA extraction instruments provides a fast, reliable and reproducible method for DNA extraction from various matrices. However, the yield of extracted DNA from FTA-cards is typically low. Here, we demonstrate that it is possible to repeatedly extract DNA from the processed FTA-disk. The method increases the yield from the nanogram range to the microgram range.

  13. ACCOUNTING AUTOMATIONS RISKS

    OpenAIRE

    Муравський, В. В.; Хома, Н. Г.

    2015-01-01

    The accountant takes an active role in organizing automated accounting when information systems are introduced into enterprise activity. Effective accounting automation requires the identification and prevention of organizational risks. The authors researched, classified and generalized the risks of introducing information accounting systems. Ways to eliminate the sources of organizational risks and to minimize their consequences are given. The method of the effective con...

  14. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.
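
    For readers unfamiliar with the tool, a minimal Sikuli-style script might look like the sketch below. It assumes the SikuliX scripting environment, where click, wait, type and exists are provided as globals, and the referenced .png pattern images are hypothetical screenshots that would be captured from the application under test.

```python
# Minimal illustrative SikuliX-style script (runs inside the Sikuli IDE / SikuliX
# environment). The .png pattern images are hypothetical placeholders.
def test_login():
    wait("login_button.png", 15)          # wait up to 15 s for the login screen
    click("username_field.png")
    type("demo_user")                     # SikuliX's type() sends keystrokes
    click("password_field.png")
    type("secret")
    click("login_button.png")
    # assert on a visual cue that the login succeeded
    assert exists("welcome_banner.png", 10), "login did not reach the welcome screen"

test_login()
```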

  15. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  16. Automation of Diagrammatic Reasoning

    OpenAIRE

    Jamnik, Mateja; Bundy, Alan; Green, Ian

    1997-01-01

    Theorems in automated theorem proving are usually proved by formal logical proofs. However, there is a subset of problems which humans can prove in a different way, by the use of geometric operations on diagrams: so-called diagrammatic proofs. Insight is more clearly perceived in these than in the corresponding algebraic proofs: they capture an intuitive notion of truthfulness that humans find easy to see and understand. We are identifying and automating this diagrammatic reasoning on mathemat...

  17. Automated Lattice Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  18. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades have been a time of major change in marketing. Digitalization has become a permanent part of marketing and has at the same time enabled efficient collection of data. Personalization and customization of content play a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation, more information about customers is gathered ...

  19. Wavelet neural network based fault diagnosis in nonlinear analog circuits

    Institute of Scientific and Technical Information of China (English)

    Yin Shirong; Chen Guangju; Xie Yongle

    2006-01-01

    The theories of diagnosing nonlinear analog circuits by means of transient response testing are studied. Wavelet analysis is used to extract the transient response signature of nonlinear circuits and to compress the signature data. The best wavelet function is selected based on the between-category total scatter of the signatures. The fault dictionary of nonlinear circuits is constructed based on an improved back-propagation (BP) neural network. Experimental results demonstrate that the proposed method has high diagnostic sensitivity, fast fault identification and good deducibility.
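
    A minimal sketch of this kind of pipeline, assuming PyWavelets for the wavelet signature and scikit-learn's MLPClassifier as a stand-in for the improved BP network (with synthetic transient responses in place of measured ones), is given below; it is not the authors' exact method.

```python
# Sketch of the general pipeline: wavelet decomposition compresses each transient
# response into a short energy signature, and a small back-propagation network maps
# signatures to fault classes. Signals and labels are synthetic placeholders.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wavelet_signature(signal, wavelet="db4", level=4):
    """Relative energy of each wavelet sub-band: a compact signature of the response."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)

def simulated_response(fault_class):
    """Toy transient responses: a different decay rate and frequency per fault class."""
    decay = [3.0, 6.0, 10.0][fault_class]
    freq = [20.0, 35.0, 50.0][fault_class]
    return np.exp(-decay * t) * np.sin(2 * np.pi * freq * t) + 0.05 * rng.standard_normal(t.size)

X = np.array([wavelet_signature(simulated_response(c)) for c in range(3) for _ in range(40)])
y = np.array([c for c in range(3) for _ in range(40)])

# The "fault dictionary" is what the network learns from (signature, fault class) pairs.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
print("predicted class for a fresh class-2 response:",
      clf.predict([wavelet_signature(simulated_response(2))])[0])
```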

  20. Application of Rough Set Theory in Fault Diagnostic Rules Acquisition

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Rough set theory is a new mathematical tool to deal with vagueness and uncertainty. But original rough sets theory only generates deterministic rules and deals with data sets in which there is no noise. The variable precision rough set model (VPRSM) is presented to handle uncertain and noisy information. A method based on VPRSM is proposed to apply to fault diagnosis feature extraction and rules acquisition for industrial applications. An example for fault diagnosis of rotary machinery is given to show that the method is very effective.