WorldWideScience

Sample records for automated defect classification

  1. Automated Diagnosis and Classification of Steam Generator Tube Defects

    International Nuclear Information System (INIS)

    Garcia, Gabe V.

    2004-01-01

    A major cause of failure in nuclear steam generators is tube degradation. Tube defects are divided into seven categories, one of which is intergranular attack/stress corrosion cracking (IGA/SCC). Defects of this type usually begin on the outer surface of the tubes and propagate both inward and laterally. In many cases these defects occur at or near the tube support plates. Several different methods exist for the nondestructive evaluation of nuclear steam generator tubes for defect characterization

  2. Integrating image processing and classification technology into automated polarizing film defect inspection

    Science.gov (United States)

    Kuo, Chung-Feng Jeffrey; Lai, Chun-Yu; Kao, Chih-Hsiang; Chiu, Chin-Hsun

    2018-05-01

    In order to improve the current manual inspection and classification process for polarizing film on production lines, this study proposes a high-precision automated inspection and classification system for polarizing film, used for recognition and classification of four common defects: dent, foreign material, bright spot, and scratch. First, the median filter is used to remove the impulse noise in the defect image of polarizing film. The random noise in the background is smoothed by improved anisotropic diffusion, while the edge detail of the defect region is sharpened. Next, the defect image is transformed by Fourier transform to the frequency domain, combined with a Butterworth high-pass filter to sharpen the edge detail of the defect region, and brought back by inverse Fourier transform to the spatial domain to complete the image enhancement process. For image segmentation, the edge of the defect region is found by the Canny edge detector, and the complete defect region is then obtained by two-stage morphology processing. For defect classification, the feature values extracted from the images, including maximum gray level, eccentricity, and the contrast and homogeneity of the gray level co-occurrence matrix (GLCM), are used as the input of radial basis function neural network (RBFNN) and back-propagation neural network (BPNN) classifiers; 96 defect images are then used as training samples, and 84 defect images are used as testing samples to validate the classification effect. The results show that the classification accuracy using RBFNN is 98.9%. Thus, the proposed system can be used by manufacturing companies for a higher yield rate and lower cost. The processing time for a single image is 2.57 seconds, meeting the practical requirements of an industrial production line.
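
    The GLCM-feature-plus-classifier stage described in this record can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' code: scikit-image computes maximum gray level and GLCM contrast/homogeneity per defect patch, synthetic patches stand in for the polarizing-film images, and an RBF-kernel SVM stands in for the RBFNN/BPNN classifiers, keeping the 96-training/84-test split from the abstract.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_patch(bright_spot):
    """Hypothetical 32x32 gray patch: a bright spot or a scratch."""
    img = rng.integers(0, 60, size=(32, 32), dtype=np.uint8)
    if bright_spot:
        img[12:20, 12:20] = 220   # compact bright blob
    else:
        img[:, 15] = 200          # thin bright line
    return img

def features(img):
    """Maximum gray level plus GLCM contrast and homogeneity."""
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return [img.max(),
            graycoprops(glcm, "contrast").mean(),
            graycoprops(glcm, "homogeneity").mean()]

X = np.array([features(make_patch(i % 2 == 0)) for i in range(180)])
y = np.array([i % 2 for i in range(180)])
clf = SVC(kernel="rbf").fit(X[:96], y[:96])          # 96 training images
print("test accuracy:", clf.score(X[96:], y[96:]))   # 84 test images
```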

  3. Increasing reticle inspection efficiency and reducing wafer print-checks using automated defect classification and simulation

    Science.gov (United States)

    Ryu, Sung Jae; Lim, Sung Taek; Vacca, Anthony; Fiekowsky, Peter; Fiekowsky, Dan

    2013-09-01

    IC fabs inspect critical masks on a regular basis to ensure high wafer yields. These requalification inspections are costly for many reasons, including capital equipment, system maintenance, and labor costs. In addition, masks typically remain in the "requal" phase for extended, non-productive periods of time. The overall "requal" cycle time in which reticles remain non-productive is challenging to control. Shipping schedules can slip when wafer lots are put on hold until the master critical layer reticle is returned to production. Unfortunately, substituting backup critical layer reticles can significantly reduce an otherwise tightly controlled process window, adversely affecting wafer yields. One major requal cycle time component is the disposition process of mask inspections containing hundreds of defects. Not only is precious non-productive time extended by reviewing hundreds of potentially yield-limiting detections, but each additional classification also increases the risk of manual review techniques accidentally passing real yield-limiting defects. Even assuming all defects of interest are flagged by operators, how can any person's judgment be confident regarding the lithographic impact of such defects? The time reticles spend away from scanners, combined with potential yield loss due to lithographic uncertainty, presents significant cycle time loss and increased production costs. Fortunately, a software program has been developed which automates defect classification with simulated printability measurement, greatly reducing requal cycle time and improving overall disposition accuracy. This product, called ADAS (Auto Defect Analysis System), has been tested in both engineering and high-volume production environments with very successful results. In this paper, data are presented supporting a significant reduction in costly wafer print checks, improved inspection area productivity, and minimized risk of misclassified yield-limiting defects.

  4. Increasing reticle inspection efficiency and reducing wafer printchecks at 14nm using automated defect classification and simulation

    Science.gov (United States)

    Paracha, Shazad; Goodman, Eliot; Eynon, Benjamin G.; Noyes, Ben F.; Ha, Steven; Kim, Jong-Min; Lee, Dong-Seok; Lee, Dong-Heok; Cho, Sang-Soo; Ham, Young M.; Vacca, Anthony D.; Fiekowsky, Peter J.; Fiekowsky, Daniel I.

    2014-10-01

    IC fabs inspect critical masks on a regular basis to ensure high wafer yields. These requalification inspections are costly for many reasons, including capital equipment, system maintenance, and labor costs. In addition, masks typically remain in the "requal" phase for extended, non-productive periods of time. The overall "requal" cycle time in which reticles remain non-productive is challenging to control. Shipping schedules can slip when wafer lots are put on hold until the master critical layer reticle is returned to production. Unfortunately, substituting backup critical layer reticles can significantly reduce an otherwise tightly controlled process window, adversely affecting wafer yields. One major requal cycle time component is the disposition process of mask inspections containing hundreds of defects. Not only is precious non-productive time extended by reviewing hundreds of potentially yield-limiting detections, but each additional classification also increases the risk of manual review techniques accidentally passing real yield-limiting defects. Even assuming all defects of interest are flagged by operators, how can any person's judgment be confident regarding the lithographic impact of such defects? The time reticles spend away from scanners, combined with potential yield loss due to lithographic uncertainty, presents significant cycle time loss and increased production costs. An automatic defect analysis system (ADAS), which has been in fab production for numerous years, has been improved to handle the new challenges of the 14nm node by automating reticle defect classification, simulating each defect's printability under the intended illumination conditions. In this study, we have created programmed defects on a production 14nm-node critical-layer reticle. These defects have been analyzed with lithographic simulation software and compared to the results of both AIMS optical simulation and actual wafer prints.

  5. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since the mask will later act as a template for a considerable number of dies on the wafer. Defects on the initial blank substrate, and on subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for the inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and the separation of defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment inherent in the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask
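
    The abstract's account of the ADC engine, measured attributes fed into a decision tree that emits a classification code, can be mirrored by a toy rule tree. Everything below (attribute names, thresholds, class codes) is invented for illustration and is not Calibre's logic.

```python
from dataclasses import dataclass

@dataclass
class BlankDefect:
    size_um: float           # measured defect size
    transmitted_dark: bool   # signal polarity in transmitted review image
    reflected_bright: bool   # signal polarity in reflected review image
    signal_to_noise: float   # defect signal vs. background noise

def classify(d: BlankDefect) -> str:
    """Toy decision tree mapping measurements to a defect class code."""
    if d.signal_to_noise < 2.0:
        return "FALSE"        # indistinguishable from background noise
    if d.transmitted_dark and d.reflected_bright:
        return "PARTICLE"     # opaque object sitting on the coating
    if d.transmitted_dark:
        return "PIT_OR_VOID"
    return "NONCRITICAL" if d.size_um < 0.1 else "REVIEW"

print(classify(BlankDefect(0.25, True, True, 8.0)))  # -> PARTICLE
```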

  6. An automated cirrus classification

    Science.gov (United States)

    Gryspeerdt, Edward; Quaas, Johannes; Goren, Tom; Klocke, Daniel; Brueck, Matthias

    2018-05-01

    Cirrus clouds play an important role in determining the radiation budget of the earth, but many of their properties remain uncertain, particularly their response to aerosol variations and to warming. Part of the reason for this uncertainty is the dependence of cirrus cloud properties on the cloud formation mechanism, which itself is strongly dependent on the local meteorological conditions. In this work, a classification system (Identification and Classification of Cirrus or IC-CIR) is introduced to identify cirrus clouds by the cloud formation mechanism. Using reanalysis and satellite data, cirrus clouds are separated into four main types: orographic, frontal, convective and synoptic. Through a comparison to convection-permitting model simulations and back-trajectory-based analysis, it is shown that these observation-based regimes can provide extra information on the cloud-scale updraughts and the frequency of occurrence of liquid-origin ice, with the convective regime having higher updraughts and a greater occurrence of liquid-origin ice compared to the synoptic regimes. Despite having different cloud formation mechanisms, the radiative properties of the regimes are not distinct, indicating that retrieved cloud properties alone are insufficient to completely describe them. This classification is designed to be easily implemented in GCMs, helping improve future model-observation comparisons and leading to improved parametrisations of cirrus cloud processes.

  7. Automatic classification of defects in weld pipe

    International Nuclear Information System (INIS)

    Anuar Mikdad Muad; Mohd Ashhar Hj Khalid; Abdul Aziz Mohamad; Abu Bakar Mhd Ghazali; Abdul Razak Hamzah

    2000-01-01

    With the advancement of computer imaging technology, the image on hard radiographic film can be digitized and stored in a computer, and the manual process of defect recognition and classification may be replaced by the computer. In this paper a computerized method for automatic detection and classification of common defects in film radiography of weld pipe is described. The detection and classification processes consist of automatic selection of the area of interest on the image, followed by classification of common defects using image processing and special algorithms. Analysis of the attributes of each defect, such as area, size, shape and orientation, is carried out by the feature analysis process. These attributes reveal the type of each defect. These methods of defect classification achieve a high success rate. Our experience showed that sharp film images produced better results
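
    The feature-analysis step, deriving area, size, shape and orientation per defect and inferring a type, maps naturally onto connected-component measurements. Below is a hedged sketch using skimage.regionprops on a synthetic binary mask; the elongation thresholds and type names are illustrative only, not the authors' rules.

```python
import numpy as np
from skimage.measure import label, regionprops

def describe_defects(binary_mask):
    """Measure each blob and guess a weld-defect type from its shape."""
    for r in regionprops(label(binary_mask)):
        elongation = r.major_axis_length / max(r.minor_axis_length, 1e-6)
        kind = ("crack" if elongation > 4 else
                "slag inclusion" if elongation > 2 else "porosity")
        yield dict(area=r.area,
                   orientation_deg=round(np.degrees(r.orientation)),
                   elongation=round(elongation, 1), type=kind)

mask = np.zeros((64, 64), dtype=bool)
mask[30:33, 5:45] = True     # long thin indication     -> "crack"
mask[10:16, 10:16] = True    # compact round indication -> "porosity"
for d in describe_defects(mask):
    print(d)
```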

  8. Automatic classification of defects in weld pipe

    International Nuclear Information System (INIS)

    Anuar Mikdad Muad; Mohd Ashhar Khalid; Abdul Aziz Mohamad; Abu Bakar Mhd Ghazali; Abdul Razak Hamzah

    2001-01-01

    With the advancement of computer imaging technology, the image on hard radiographic film can be digitized and stored in a computer, and the manual process of defect recognition and classification may be replaced by the computer. In this paper, a computerized method for automatic detection and classification of common defects in film radiography of weld pipe is described. The detection and classification processes consist of automatic selection of the area of interest on the image, followed by classification of common defects using image processing and special algorithms. Analysis of the attributes of each defect, such as area, size, shape and orientation, is carried out by the feature analysis process. These attributes reveal the type of each defect. These methods of defect classification achieve a high success rate. Our experience showed that sharp film images produced better results. (Author)

  9. Disc defect classification for optical disc drives

    NARCIS (Netherlands)

    Helvoirt, van J.; Leenknegt, G.A.L.; Steinbuch, M.; Goossens, H.J.

    2005-01-01

    Optical disc drives are subject to various disturbances and faults. A special type of fault is the so-called disc defect. In this paper we present an approach for disc defect classification. It is based on hierarchical clustering of measured signals that are affected by disc defects.
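
    A minimal sketch of the hierarchical-clustering idea named in this record, with two invented per-event features (disturbance duration and amplitude) standing in for the measured drive signals:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# two hypothetical defect families in (duration_ms, amplitude) space
scratches = rng.normal([0.2, 3.0], 0.05, size=(20, 2))      # short, strong
fingerprints = rng.normal([2.0, 1.0], 0.05, size=(20, 2))   # long, weak
X = np.vstack([scratches, fingerprints])

Z = linkage(X, method="ward")                   # agglomerative clustering
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)   # two clusters matching the two defect families
```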

  10. Maxillectomy defects: a suggested classification scheme.

    Science.gov (United States)

    Akinmoladun, V I; Dosumu, O O; Olusanya, A A; Ikusika, O F

    2013-06-01

    The term "maxillectomy" has been used to describe a variety of surgical procedures for a spectrum of diseases involving diverse anatomical sites. Hence, classifications of maxillectomy defects have often made communication difficult. This article highlights this problem, emphasises the need for a uniform system of classification and suggests a classification system which is simple and comprehensive. Articles related to this subject, especially those with specified classifications of maxillary surgical defects, were sourced from the internet through Google, Scopus and PubMed using the search terms "maxillectomy defects classification". A manual search through available literature was also done. The review of the materials revealed many classifications and modifications of classifications from the descriptive, reconstructive and prosthodontic perspectives. No globally acceptable classification exists among practitioners involved in the management of diseases in the mid-facial region. There were over 14 classifications of maxillary defects found in the English literature. Attempts made to address the inadequacies of previous classifications have tended to result in cumbersome and relatively complex classifications. A single classification that is based on both surgical and prosthetic considerations is most desirable and is hereby proposed.

  11. Classification of Automated Search Traffic

    Science.gov (United States)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third-party systems interact with search engines for a variety of reasons, such as monitoring a web site's rank, augmenting online games, or maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes: interpretations of the physical model of human interactions, and behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
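
    The binary human-vs-bot classification step can be sketched with a generic learner over session-level features of the kind the paper lists (query rate, click behavior, query diversity). The feature names, distributions, and classifier choice below are stand-ins, not the paper's.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n = 1000
# columns: queries per minute, clicks per query, query-term entropy
humans = np.column_stack([rng.gamma(2.0, 1.0, n),
                          rng.beta(2, 2, n),
                          rng.normal(4.0, 1.0, n)])
bots = np.column_stack([rng.gamma(20.0, 1.0, n),
                        rng.beta(1, 9, n),
                        rng.normal(1.0, 1.0, n)])
X = np.vstack([humans, bots])
y = np.repeat([0, 1], n)   # 0 = human session, 1 = automated session

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(Xtr, ytr)
print("held-out accuracy:", clf.score(Xte, yte))
```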

  12. Automated Decision Tree Classification of Corneal Shape

    Science.gov (United States)

    Twa, Michael D.; Parthasarathy, Srinivasan; Roberts, Cynthia; Mahmoud, Ashraf M.; Raasch, Thomas W.; Bullimore, Mark A.

    2011-01-01

    Purpose The volume and complexity of data produced during videokeratography examinations present a challenge of interpretation. As a consequence, results are often analyzed qualitatively by subjective pattern recognition or reduced to comparisons of summary indices. We describe the application of decision tree induction, an automated machine learning classification method, to discriminate between normal and keratoconic corneal shapes in an objective and quantitative way. We then compared this method with other known classification methods. Methods The corneal surface was modeled with a seventh-order Zernike polynomial for 132 normal eyes of 92 subjects and 112 eyes of 71 subjects diagnosed with keratoconus. A decision tree classifier was induced using the C4.5 algorithm, and its classification performance was compared with the modified Rabinowitz–McDonnell index, Schwiegerling’s Z3 index (Z3), Keratoconus Prediction Index (KPI), KISA%, and Cone Location and Magnitude Index using recommended classification thresholds for each method. We also evaluated the area under the receiver operator characteristic (ROC) curve for each classification method. Results Our decision tree classifier performed equal to or better than the other classifiers tested: accuracy was 92% and the area under the ROC curve was 0.97. Our decision tree classifier reduced the information needed to distinguish between normal and keratoconus eyes using four of 36 Zernike polynomial coefficients. The four surface features selected as classification attributes by the decision tree method were inferior elevation, greater sagittal depth, oblique toricity, and trefoil. Conclusions Automated decision tree classification of corneal shape through Zernike polynomials is an accurate quantitative method of classification that is interpretable and can be generated from any instrument platform capable of raw elevation data output. This method of pattern classification is extendable to other classification
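
    The experiment, a decision tree over 36 Zernike coefficients scored by ROC AUC, is easy to mock up. sklearn's CART algorithm stands in for C4.5, and the coefficient data below are synthetic (two Gaussian classes with a few shifted terms), so the printed AUC will not reproduce the study's 0.97.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
normal = rng.normal(0.0, 0.3, size=(132, 36))   # 36 Zernike terms per eye
kerato = rng.normal(0.0, 0.3, size=(112, 36))
kerato[:, [7, 12]] += 1.0                       # shift a few surface terms
X = np.vstack([normal, kerato])
y = np.r_[np.zeros(132), np.ones(112)]

tree = DecisionTreeClassifier(max_depth=4, random_state=0)
prob = cross_val_predict(tree, X, y, cv=5, method="predict_proba")[:, 1]
print("AUC:", round(roc_auc_score(y, prob), 2))
```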

  13. Classification and printability of EUV mask defects from SEM images

    Science.gov (United States)

    Cho, Wonil; Price, Daniel; Morgan, Paul A.; Rost, Daniel; Satake, Masaki; Tolani, Vikram L.

    2017-10-01

    EUV lithography is starting to show more promise for patterning some critical layers at the 5nm technology node and beyond. However, there are still many key technical obstacles to overcome before bringing EUV lithography into high volume manufacturing (HVM). One of the greatest obstacles is manufacturing defect-free masks. For pattern defect inspection in the mask-shop, cutting-edge 193nm optical inspection tools have been used so far, due to the lack of e-beam mask inspection (EBMI) or EUV actinic pattern inspection (API) tools. The main issue with current 193nm inspection tools is their limited resolution for the mask dimensions targeted for EUV patterning. The theoretical resolution limit for 193nm mask inspection tools is about 60nm HP on masks, which means that main feature sizes on EUV masks will be well beyond the practical resolution of 193nm inspection tools. Nevertheless, 193nm inspection tools with various illumination conditions that maximize defect sensitivity and/or main-pattern modulation are being explored for initial EUV defect detection. Due to the generally low signal-to-noise of 193nm inspection imaging at EUV patterning dimensions, these inspections often result in hundreds or thousands of defects which then need to be accurately reviewed and dispositioned. Manually reviewing each defect is difficult due to poor resolution. In addition, the lack of a reliable aerial dispositioning system makes it very challenging to disposition for printability. In this paper, we present the use of SEM images of EUV masks for higher-resolution review and disposition of defects. In this approach, most of the defects detected by the 193nm inspection tools are first imaged on a mask SEM tool. These images, together with the corresponding post-OPC design clips, are provided to KLA-Tencor's Reticle Decision Center (RDC) platform, which provides ADC (Automated Defect Classification) and S2A (SEM

  14. Defect sizing using automated ultrasonic inspection techniques at RNL

    International Nuclear Information System (INIS)

    Rogerson, A.; Highmore, P.J.; Poulter, L.N.J.

    1983-10-01

    RNL has developed and applied automated wide-beam pulse-echo and time-of-flight techniques with synthetic aperture processing for sizing defects in clad thick-section weldments and nozzle corner regions. These techniques were amongst those used in the four test plate inspections making up the UKAEA Defect Detection Trials. In this report a critical appraisal is given of the sizing procedures adopted by RNL in these inspections. Several factors influencing sizing accuracy are discussed and results from particular defects highlighted. The time-of-flight technique with colour graphics data display is shown to be highly effective in imaging near-vertical buried defects and underclad defects of height greater than 5 mm. Early characterisation of any identified defect from its ultrasonic response under pulse-echo inspection is seen as a desirable aid to the selection of an appropriate advanced sizing technique for buried defects. (author)
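
    The time-of-flight (Silk) sizing referred to above rests on a simple geometric relation: with transmitter and receiver separated by 2S on the surface, a diffracting defect tip at depth d returns a transit time t = 2·sqrt(S² + d²)/c. Inverting for d and differencing the top and bottom tip depths gives the through-thickness height. The numbers below are hypothetical, chosen only to show the arithmetic.

```python
from math import sqrt

c = 5920.0   # m/s, longitudinal wave speed in steel
S = 0.040    # m, half the probe separation

def tip_depth(t_us):
    """Depth of a diffracting tip from its transit time in microseconds."""
    t = t_us * 1e-6
    return sqrt((c * t / 2.0) ** 2 - S ** 2)

top, bottom = tip_depth(15.0), tip_depth(18.0)
print(f"tip depths: {top * 1e3:.1f} mm and {bottom * 1e3:.1f} mm")
print(f"defect height: {(bottom - top) * 1e3:.1f} mm")
```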

  15. A computational framework for automation of point defect calculations

    International Nuclear Information System (INIS)

    Goyal, Anuj; Gorai, Prashun; Peng, Haowei

    2017-01-01

    We have developed a complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory. The framework provides an effective and efficient method for defect structure generation, and the creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band-filling correction for shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology.
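
    The quantity such a framework automates is the standard defect formation energy, Ef = E_def − E_host − Σᵢ nᵢμᵢ + q(E_F + E_VBM + ΔV) + E_corr, where the last two terms are exactly the potential-alignment and finite-size corrections listed above. A small sketch follows; the numbers are placeholders, not results from the paper or its package.

```python
def formation_energy(E_def, E_host, added, mu, q, E_F, E_VBM, dV, E_corr):
    """Ef = E_def - E_host - sum(n_i * mu_i) + q*(E_F + E_VBM + dV) + E_corr

    added: {species: n_i}, atoms added (+) or removed (-) to form the defect.
    """
    return (E_def - E_host
            - sum(n * mu[s] for s, n in added.items())
            + q * (E_F + E_VBM + dV)
            + E_corr)

# hypothetical doubly negative Zn vacancy in ZnO (one Zn atom removed)
Ef = formation_energy(E_def=-210.0, E_host=-218.9, added={"Zn": -1},
                      mu={"Zn": -1.27}, q=-2, E_F=1.0, E_VBM=2.1,
                      dV=0.05, E_corr=0.35)
print(f"Ef = {Ef:.2f} eV")
```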

  16. Automated estimation of defects in magnetographic defectoscopy. 1. Automated magnetographic flaw detectors

    International Nuclear Information System (INIS)

    Mikhajlov, S.P.; Vaulin, S.L.; Shcherbinin, V.E.; Shur, M.L.

    1993-01-01

    Specific features and possible functions of equipment for the automated estimation of elongated continuity defects on samples with a plane surface in magnetographic defectoscopy are discussed. Two models of automated magnetographic flaw detectors are described: one with a built-in microcomputer and one in the form of a computer attachment. Directions for further research and development are discussed. 35 refs., 6 figs

  17. Automated defect spatial signature analysis for semiconductor manufacturing process

    Science.gov (United States)

    Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed

    1999-01-01

    An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high-level categories, classifying the categorized data contained in each high-level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.
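
    One way to picture spatial signature analysis is density-based clustering of the defect coordinates: systematic signatures (a scratch, a cluster) form dense groups, while random fallout does not. DBSCAN is our illustrative choice here; the patent does not prescribe it, and the wafer-map data below are synthetic.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(4)
random_fallout = rng.uniform(-100, 100, size=(40, 2))   # mm on the wafer
t = np.linspace(0.0, 1.0, 40)
scratch = np.column_stack([-80 + 160 * t, 20 + 5 * t])  # linear streak
xy = np.vstack([random_fallout, scratch])

labels = DBSCAN(eps=10, min_samples=4).fit_predict(xy)
print("signature clusters found:", set(labels) - {-1})  # -1 = isolated
```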

  18. Woven fabric defects detection based on texture classification algorithm

    International Nuclear Information System (INIS)

    Ben Salem, Y.; Nasri, S.

    2011-01-01

    In this paper we compare two well-known texture classification methods for recognizing and classifying defects occurring in textile manufacturing: the local binary pattern (LBP) method and the co-occurrence matrix. The classifier used is the support vector machine (SVM). The system has been tested using the TILDA database. The results obtained are promising and show that LBP is a good method for defect recognition and classification, and it gives a good running time, especially for real-time applications.
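
    A compact version of the LBP-plus-SVM side of the comparison, with synthetic texture patches standing in for the TILDA fabric images:

```python
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_hist(img, P=8, R=1):
    """Histogram of uniform LBP codes as the texture descriptor."""
    img8 = (np.clip(img, 0, 1) * 255).astype(np.uint8)
    lbp = local_binary_pattern(img8, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(5)
plain = [rng.normal(0.5, 0.05, (64, 64)) for _ in range(30)]
defective = []
for _ in range(30):
    img = rng.normal(0.5, 0.05, (64, 64))
    img[20:24, :] += 0.5     # a weft-direction flaw
    defective.append(img)

X = np.array([lbp_hist(i) for i in plain + defective])
y = np.repeat([0, 1], 30)
clf = SVC(kernel="rbf").fit(X[::2], y[::2])            # even indices train
print("test accuracy:", clf.score(X[1::2], y[1::2]))   # odd indices test
```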

  19. Feature selection for neural network based defect classification of ceramic components using high frequency ultrasound.

    Science.gov (United States)

    Kesharaju, Manasa; Nagarajah, Romesh

    2015-09-01

    The motivation for this research stems from a need for providing a non-destructive testing method capable of detecting and locating any defects and microstructural variations within armour ceramic components before issuing them to the soldiers who rely on them for their survival. The development of an automated ultrasonic inspection based classification system would make possible the checking of each ceramic component and immediately alert the operator about the presence of defects. Generally, in many classification problems a choice of features or dimensionality reduction is significant and simultaneously very difficult, as a substantial computational effort is required to evaluate possible feature subsets. In this research, a combination of artificial neural networks and genetic algorithms is used to optimize the feature subset used in the classification of various defects in reaction-sintered silicon carbide ceramic components. Initially, wavelet-based feature extraction is implemented from the region of interest. An artificial neural network classifier is employed to evaluate the performance of these features. Genetic-algorithm-based feature selection is then performed. Principal Component Analysis, a popular technique for feature selection, is compared with the genetic-algorithm-based technique in terms of classification accuracy and selection of the optimal number of features. The experimental results confirm that features identified by Principal Component Analysis lead to improved performance in terms of classification percentage, with 96% compared to 94% for the genetic algorithm. Copyright © 2015 Elsevier B.V. All rights reserved.
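
    The GA-versus-PCA comparison can be sketched with a tiny genetic algorithm whose fitness is the cross-validated accuracy of a small neural network on the selected feature subset. The data, population size, and generation count are toy stand-ins for the ultrasound features, not the study's setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           random_state=0)

def fitness(mask):
    """CV accuracy of a small net on the masked feature subset."""
    if not mask.any():
        return 0.0
    net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                        random_state=0)
    return cross_val_score(net, X[:, mask], y, cv=3).mean()

rng = np.random.default_rng(6)
pop = rng.random((8, 20)) < 0.5                       # random feature masks
for _ in range(4):                                    # a few generations
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-4:]]            # truncation selection
    children = []
    for _ in range(len(pop)):
        a, b = parents[rng.integers(4, size=2)]
        child = np.where(rng.random(20) < 0.5, a, b)  # uniform crossover
        child ^= rng.random(20) < 0.05                # bit-flip mutation
        children.append(child)
    pop = np.array(children)

best = max(pop, key=fitness)
print("GA-selected subset accuracy:", round(fitness(best), 2))
k = max(int(best.sum()), 1)
Xp = PCA(n_components=k).fit_transform(X)             # PCA baseline, same k
net = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
print("PCA accuracy:", round(cross_val_score(net, Xp, y, cv=3).mean(), 2))
```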

  20. Defect detection and classification of machined surfaces under multiple illuminant directions

    Science.gov (United States)

    Liao, Yi; Weng, Xin; Swonger, C. W.; Ni, Jun

    2010-08-01

    Continuous improvement of product quality is crucial to the successful and competitive automotive manufacturing industry in the 21st century. The presence of surface porosity on flat machined surfaces such as cylinder heads/blocks and transmission cases may allow leaks of coolant, oil, or combustion gas between critical mating surfaces, thus causing damage to the engine or transmission. Therefore, 100% inline inspection plays an important role in improving product quality. Although the techniques of image processing and machine vision have been applied to machined surface inspection and much improved over the past 20 years, in today's automotive industry surface porosity inspection is still done by skilled humans, which is costly, tedious, time consuming and not capable of reliably detecting small defects. In our study, an automated defect detection and classification system for flat machined surfaces has been designed and constructed. In this paper, the importance of the illuminant direction in a machine vision system was first emphasized, and then the surface defect inspection system under multiple directional illuminations was designed and constructed. After that, image processing algorithms were developed to detect and classify five types of 2D or 3D surface defects (pore, 2D blemish, residue dirt, scratch, and gouge). The steps of image processing include: (1) image acquisition and contrast enhancement, (2) defect segmentation and feature extraction, and (3) defect classification. An artificial machined surface and an actual automotive part (a cylinder head surface) were tested; as a result, microscopic surface defects can be accurately detected and assigned to a surface defect class. The cycle time of this system is short enough that implementation of 100% inline inspection is feasible. The field of view of this system is 150mm×225mm, and surfaces larger than the field of view can be stitched together in software.

  1. Deep sub-wavelength metrology for advanced defect classification

    Science.gov (United States)

    van der Walle, P.; Kramer, E.; van der Donck, J. C. J.; Mulckhuyse, W.; Nijsten, L.; Bernal Arango, F. A.; de Jong, A.; van Zeijl, E.; Spruit, H. E. T.; van den Berg, J. H.; Nanda, G.; van Langen-Suurling, A. K.; Alkemade, P. F. A.; Pereira, S. F.; Maas, D. J.

    2017-06-01

    Particle defects are important contributors to yield loss in semiconductor manufacturing. Particles need to be detected and characterized in order to determine and eliminate their root cause. We have conceived a process flow for advanced defect classification (ADC) that distinguishes three consecutive steps: detection, review and classification. For defect detection, TNO has developed the Rapid Nano (RN3) particle scanner, which illuminates the sample from nine azimuth angles. The RN3 is capable of detecting 42 nm Latex Sphere Equivalent (LSE) particles on XXX-flat silicon wafers. For each sample, the lower detection limit (LDL) can be verified by an analysis of the speckle signal, which originates from the surface roughness of the substrate. In detection-mode (RN3.1), the signals from all illumination angles are added. In review-mode (RN3.9), the signals from all nine arms are recorded individually and analyzed in order to retrieve additional information on the shape and size of deep sub-wavelength defects. This paper presents experimental and modelling results on the extraction of shape information from the RN3.9 multi-azimuth signal, such as aspect ratio, skewness, and orientation of test defects. Both modelling and experimental work confirm that the RN3.9 signal contains detailed defect shape information. After review by RN3.9, defects are coarsely classified, yielding a purified Defect-of-Interest (DoI) list for further analysis on slower metrology tools, such as SEM, AFM or HIM, that provide more detailed review data and further classification. Purifying the DoI list via optical metrology with RN3.9 will make inspection time on slower review tools more efficient.

  2. Automated defect location and sizing by advanced ultrasonic techniques

    International Nuclear Information System (INIS)

    Murgatroyd, R.A.

    1983-01-01

    From this assessment of advanced automated defect location and sizing techniques it is concluded that: 1. pulse-echo techniques, when used at high sensitivity, are capable of detecting all known defects in the test weldments inspected; 2. search sensitivity has a marked influence on defect detection at both 1 and 2 MHz, and it is considered that 20% DAC is the highest amplitude threshold level which could be prudently adopted at the search stage; 3. the important through-thickness dimension of deeply buried defects in the height range 5 to 50 mm can be sized to an estimated accuracy of ±2 mm using the Silk technique, and applying a SAFT-type algorithm to the data gives good lateral positioning of defects; 4. the 70° longitudinal wave twin-crystal technique has proved to be a highly effective method of detecting underclad cracks, and a 70° shear wave pulse-echo technique and a 0° longitudinal wave twin-crystal method also give good detection results in the near-surface region; 5. the Silk technique has been effective in sizing defects in the height range 5 to 35 mm in the near-surface region

  3. Automated Classification of Seedlings Using Computer Vision

    DEFF Research Database (Denmark)

    Dyrmann, Mads; Christiansen, Peter

    The objective of this project is to investigate the possibilities of recognizing plant species at multiple growth stages based on RGB images. Plants and leaves are initially segmented from a database through a partly automated procedure providing samples of 2438 plants and 4767 leaves distributed...

  4. Intelligent Computer Vision System for Automated Classification

    International Nuclear Information System (INIS)

    Jordanov, Ivan; Georgieva, Antoniya

    2010-01-01

    In this paper we investigate an Intelligent Computer Vision System applied to the recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (feature number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: the combination of feature generation techniques; the application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and the use of a suitable NN design and learning method.

  5. Operational experiences with automated acoustic burst classification by neural networks

    International Nuclear Information System (INIS)

    Olma, B.; Ding, Y.; Enders, R.

    1996-01-01

    Monitoring Loose Parts Monitoring System sensors for signal bursts associated with metallic impacts of loose parts has proved to be a useful methodology for on-line assessment of the mechanical integrity of components in the primary circuit of nuclear power plants. With the availability of neural networks, powerful new possibilities for the classification and diagnosis of burst signals can be realized for acoustic monitoring with the on-line system RAMSES. In order to look for relevant burst signals, an automated classification is needed; that is, acoustic signature analysis and assessment have to be performed automatically on-line. A back-propagation neural network based on five pre-calculated signal parameter values has been set up for the identification of different signal types. During a three-month monitoring program of medium-operated check valves, burst signals were measured and classified separately according to their cause. The successful results of the three measurement campaigns with automated burst type classification are presented. (author)

  6. Improving reticle defect disposition via fully automated lithography simulation

    Science.gov (United States)

    Mann, Raunak; Goodman, Eliot; Lao, Keith; Ha, Steven; Vacca, Anthony; Fiekowsky, Peter; Fiekowsky, Dan

    2016-03-01

    Most advanced wafer fabs have embraced complex pattern decoration, which creates numerous challenges during in-fab reticle qualification. These optical proximity correction (OPC) techniques create assist features that tend to be very close in size and shape to the main patterns, as seen in Figure 1. A small defect on an assist feature will most likely have little or no impact on the fidelity of the wafer image, whereas the same defect on a main feature could significantly decrease device functionality. In order to properly disposition these defects, reticle inspection technicians need an efficient method that automatically separates main from assist features and predicts the resulting defect impact on the wafer image; this is the role of the Automated Defect Analysis System (ADAS) defect simulation system[1]. Up until now, use of ADAS simulation was limited to engineers due to the complexity of the settings that need to be manually entered in order to create an accurate result. A single error in entering one of these values can cause erroneous results, therefore full automation is necessary. In this study, we propose a new method where all needed simulation parameters are automatically loaded into ADAS. This is accomplished in two parts. First, we have created a scanner parameter database that is automatically identified from mask product and level names. Second, we automatically determine the appropriate simulation printability threshold by using a new reference image (provided by the inspection tool) that contains a known measured value of the reticle critical dimension (CD). This new method automatically loads the correct scanner conditions, sets the appropriate simulation threshold, and automatically measures the percentage of CD change caused by the defect. This streamlines qualification and reduces the number of reticles being put on hold, waiting for engineer review. We also present data showing the consistency and reliability of the new method, along with the impact on the efficiency of in-fab reticle qualification.

  7. Automated lung nodule classification following automated nodule detection on CT: A serial approach

    International Nuclear Information System (INIS)

    Armato, Samuel G. III; Altman, Michael B.; Wilkie, Joel; Sone, Shusuke; Li, Feng; Doi, Kunio; Roy, Arunabha S.

    2003-01-01

    We have evaluated the performance of an automated classifier applied to the task of differentiating malignant and benign lung nodules in low-dose helical computed tomography (CT) scans acquired as part of a lung cancer screening program. The nodules classified in this manner were initially identified by our automated lung nodule detection method, so that the output of automated lung nodule detection was used as input to automated lung nodule classification. This study begins to narrow the distinction between the 'detection task' and the 'classification task'. Automated lung nodule detection is based on two- and three-dimensional analyses of the CT image data. Gray-level-thresholding techniques are used to identify initial lung nodule candidates, for which morphological and gray-level features are computed. A rule-based approach is applied to reduce the number of nodule candidates that correspond to non-nodules, and the features of remaining candidates are merged through linear discriminant analysis to obtain final detection results. Automated lung nodule classification merges the features of the lung nodule candidates identified by the detection algorithm that correspond to actual nodules through another linear discriminant classifier to distinguish between malignant and benign nodules. The automated classification method was applied to the computerized detection results obtained from a database of 393 low-dose thoracic CT scans containing 470 confirmed lung nodules (69 malignant and 401 benign nodules). Receiver operating characteristic (ROC) analysis was used to evaluate the ability of the classifier to differentiate between nodule candidates that correspond to malignant nodules and nodule candidates that correspond to benign lesions. The area under the ROC curve for this classification task attained a value of 0.79 during a leave-one-out evaluation
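
    The classification stage, a linear discriminant over candidate features with leave-one-out ROC analysis, is straightforward to mock up. The three features and their class distributions below are invented; only the 401-benign/69-malignant split and the evaluation protocol follow the abstract.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
# invented morphological / gray-level features per detected nodule
benign = rng.normal([4.0, 0.9, 120.0], [1.0, 0.05, 15.0], size=(401, 3))
malignant = rng.normal([9.0, 0.7, 145.0], [2.0, 0.10, 15.0], size=(69, 3))
X = np.vstack([benign, malignant])
y = np.r_[np.zeros(401), np.ones(69)]

# leave-one-out evaluation, as in the study
scores = np.empty(len(y))
for i in range(len(y)):
    keep = np.arange(len(y)) != i
    lda = LinearDiscriminantAnalysis().fit(X[keep], y[keep])
    scores[i] = lda.predict_proba(X[i:i + 1])[0, 1]
print("ROC AUC:", round(roc_auc_score(y, scores), 2))
```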

  8. Developmental defects in zebrafish for classification of EGF pathway inhibitors

    International Nuclear Information System (INIS)

    Pruvot, Benoist; Curé, Yoann; Djiotsa, Joachim; Voncken, Audrey; Muller, Marc

    2014-01-01

    One of the major challenges when testing drug candidates targeted at a specific pathway in whole animals is the discrimination between specific effects and unwanted, off-target effects. Here we used the zebrafish to define several developmental defects caused by impairment of Egf signaling, a major pathway of interest in tumor biology. We inactivated Egf signaling by genetically blocking Egf expression or using specific inhibitors of the Egf receptor function. We show that the combined occurrence of defects in cartilage formation, disturbance of blood flow in the trunk and a decrease of myelin basic protein expression represent good indicators for impairment of Egf signaling. Finally, we present a classification of known tyrosine kinase inhibitors according to their specificity for the Egf pathway. In conclusion, we show that developmental indicators can help to discriminate between specific effects on the target pathway from off-target effects in molecularly targeted drug screening experiments in whole animal systems. - Highlights: • We analyze the functions of Egf signaling on zebrafish development. • Genetic blocking of Egf expression causes cartilage, myelin and circulatory defects. • Chemical inhibition of Egf receptor function causes similar defects. • Developmental defects can reveal the specificity of Egf pathway inhibitors

  9. Developmental defects in zebrafish for classification of EGF pathway inhibitors

    Energy Technology Data Exchange (ETDEWEB)

    Pruvot, Benoist; Curé, Yoann; Djiotsa, Joachim; Voncken, Audrey; Muller, Marc, E-mail: m.muller@ulg.ac.be

    2014-01-15

    One of the major challenges when testing drug candidates targeted at a specific pathway in whole animals is the discrimination between specific effects and unwanted, off-target effects. Here we used the zebrafish to define several developmental defects caused by impairment of Egf signaling, a major pathway of interest in tumor biology. We inactivated Egf signaling by genetically blocking Egf expression or using specific inhibitors of the Egf receptor function. We show that the combined occurrence of defects in cartilage formation, disturbance of blood flow in the trunk and a decrease of myelin basic protein expression represent good indicators for impairment of Egf signaling. Finally, we present a classification of known tyrosine kinase inhibitors according to their specificity for the Egf pathway. In conclusion, we show that developmental indicators can help to discriminate between specific effects on the target pathway from off-target effects in molecularly targeted drug screening experiments in whole animal systems. - Highlights: • We analyze the functions of Egf signaling on zebrafish development. • Genetic blocking of Egf expression causes cartilage, myelin and circulatory defects. • Chemical inhibition of Egf receptor function causes similar defects. • Developmental defects can reveal the specificity of Egf pathway inhibitors.

  10. Clever Toolbox - the Art of Automated Genre Classification

    DEFF Research Database (Denmark)

    2005-01-01

    Automatic musical genre classification can be defined as the science of finding computer algorithms that take a digitized sound clip as input and yield a musical genre as output. The goal of automated genre classification is, of course, that the musical genre should agree with the human classification... This demo illustrates an approach to the problem that first extracts frequency-based sound features, followed by a "linear regression" classifier. The basic features are the so-called mel-frequency cepstral coefficients (MFCCs), which are extracted on a time-scale of 30 msec. From these MFCC features, auto... is subsequently used for classification. This classifier is rather simple; current research investigates more advanced methods of classification...
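
    A minimal rendering of the described pipeline: MFCCs on short frames, summarized per clip, then a simple linear model. librosa is our assumed MFCC implementation, logistic regression stands in for the record's "linear regression" classifier, and synthetic tones/noise stand in for music clips.

```python
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

sr = 22050
rng = np.random.default_rng(8)

def clip_features(y):
    """Per-clip mean and std of 13 short-time MFCCs."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return np.r_[mfcc.mean(axis=1), mfcc.std(axis=1)]

tones = [np.sin(2 * np.pi * f * np.arange(sr) / sr)          # "genre A"
         for f in rng.uniform(200, 800, 20)]
noise = [rng.normal(0, 0.3, sr) for _ in range(20)]          # "genre B"
X = np.array([clip_features(c.astype(np.float32)) for c in tones + noise])
labels = np.repeat([0, 1], 20)

clf = LogisticRegression(max_iter=1000).fit(X[::2], labels[::2])
print("test accuracy:", clf.score(X[1::2], labels[1::2]))
```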

  11. Evolutionary fuzzy ARTMAP neural networks for classification of semiconductor defects.

    Science.gov (United States)

    Tan, Shing Chiang; Watada, Junzo; Ibrahim, Zuwairie; Khalid, Marzuki

    2015-05-01

    Wafer defect detection using an intelligent system is an approach to quality improvement in semiconductor manufacturing that aims to enhance process stability, increase production capacity, and improve yields. Occasionally, only a few records that indicate defective units are available, and they are classified as a minority group in a large database. Such a situation leads to an imbalanced data set problem, which poses a great challenge for machine-learning techniques to solve effectively. In addition, the database may comprise overlapping samples of different classes. This paper introduces two models of evolutionary fuzzy ARTMAP (FAM) neural networks to deal with imbalanced data set problems in semiconductor manufacturing operations. In particular, the FAM models and hybrid genetic algorithms are integrated in the proposed evolutionary artificial neural networks (EANNs) to classify an imbalanced data set. In addition, one of the proposed EANNs incorporates a facility to learn overlapping samples of different classes from the imbalanced data environment. The classification results of the proposed evolutionary FAM neural networks are presented, compared, and analyzed using several classification metrics. The outcomes positively indicate the effectiveness of the proposed networks in handling classification problems with imbalanced data sets.

  12. Automating the expert consensus paradigm for robust lung tissue classification

    Science.gov (United States)

    Rajagopalan, Srinivasan; Karwoski, Ronald A.; Raghunath, Sushravya; Bartholmai, Brian J.; Robb, Richard A.

    2012-03-01

    Clinicians confirm the efficacy of dynamic multidisciplinary interactions in diagnosing lung disease/wellness from CT scans. However, routine clinical practice cannot readily accommodate such interactions. Current schemes for automating lung tissue classification are based on a single, elusive disease-differentiating metric; this undermines their reliability in routine diagnosis. We propose a computational workflow that uses a collection (#: 15) of probability density function (pdf)-based similarity metrics to automatically cluster pattern-specific (#patterns: 5) volumes of interest (#VOI: 976) extracted from the lung CT scans of 14 patients. The resultant clusters are refined for intra-partition compactness and subsequently aggregated into a super cluster using a cluster ensemble technique. The super clusters were validated against the consensus agreement of four clinical experts. The aggregations correlated strongly with expert consensus. By effectively mimicking the expertise of physicians, the proposed workflow could make automation of lung tissue classification a clinical reality.

  13. “The Naming of Cats”: Automated Genre Classification

    Directory of Open Access Journals (Sweden)

    Yunhyong Kim

    2007-07-01

    This paper builds on the work presented at ECDL 2006 in automated genre classification as a step toward automating metadata extraction from digital documents for ingest into digital repositories such as those run by archives, libraries and eprint services (Kim & Ross, 2006b). We have previously proposed dividing the features of a document into five types (features for visual layout, language model features, stylometric features, features for semantic structure, and contextual features as an object linked to previously classified objects and other external sources) and have examined visual and language model features. The current paper compares results from testing classifiers based on image and stylometric features in a binary classification to show that certain genres have strong image features which enable effective separation of documents belonging to the genre from a large pool of other documents.

  14. Automated cell type discovery and classification through knowledge transfer

    Science.gov (United States)

    Lee, Hao-Chih; Kosoy, Roman; Becker, Christine E.

    2017-01-01

    Motivation: Recent advances in mass cytometry allow simultaneous measurements of up to 50 markers at single-cell resolution. However, the high dimensionality of mass cytometry data introduces computational challenges for automated data analysis and hinders translation of new biological understanding into clinical applications. Previous studies have applied machine learning to facilitate processing of mass cytometry data. However, manual inspection is still inevitable and has become the barrier to reliable large-scale analysis. Results: We present a new algorithm called Automated Cell-type Discovery and Classification (ACDC) that fully automates the classification of canonical cell populations and highlights novel cell types in mass cytometry data. Evaluations on real-world data show that ACDC provides accurate and reliable estimations compared to manual gating results. Additionally, ACDC automatically classifies previously ambiguous cell types to facilitate discovery. Our findings suggest that ACDC substantially improves both the reliability and interpretability of results obtained from high-dimensional mass cytometry profiling data. Availability and Implementation: A Python package (Python 3) and analysis scripts for reproducing the results are available at https://bitbucket.org/dudleylab/acdc. Contact: brian.kidd@mssm.edu or joel.dudley@mssm.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28158442

  15. Automated Classification of Asteroids into Families at Work

    Science.gov (United States)

    Knežević, Zoran; Milani, Andrea; Cellino, Alberto; Novaković, Bojan; Spoto, Federica; Paolicchi, Paolo

    2014-07-01

    We have recently proposed a new approach to asteroid family classification by combining the classical HCM method with an automated procedure to add newly discovered members to existing families. This approach is specifically intended to cope with ever-increasing asteroid data sets, and consists of several steps to segment the problem and handle the very large amount of data in an efficient and accurate manner. We briefly present all these steps and show the results from three subsequent updates making use of only the automated step of attributing the newly numbered asteroids to the known families. We describe the changes in the membership of individual families, as well as the evolution of the classification due to newly added intersections between families, resolved candidate family mergers, and the emergence of new candidates for mergers. We thus demonstrate how, with the new approach, the asteroid family classification becomes stable in general terms (converging towards a permanent list of confirmed families) while at the same time evolving in detail (to account for newly discovered asteroids) at each update.

  16. Literature classification for semi-automated updating of biological knowledgebases

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Kudahl, Ulrich Johan; Winther, Ole

    2013-01-01

    ... types of biological data, such as sequence data, are extensively stored in biological databases, while functional annotations, such as immunological epitopes, are found primarily in semi-structured formats or free text embedded in primary scientific literature. Results: We defined and applied a machine... classification of abstracts yielded an accuracy of 0.95, thus showing significant value in support of data extraction from the literature. Conclusion: We here propose a conceptual framework for semi-automated extraction of epitope data embedded in scientific literature using principles from text mining...

  17. Automated retinal vessel type classification in color fundus images

    Science.gov (United States)

    Yu, H.; Barriga, S.; Agurto, C.; Nemeth, S.; Bauman, W.; Soliz, P.

    2013-02-01

    Automated retinal vessel type classification is an essential first step toward machine-based quantitative measurement of various vessel topological parameters and identifying vessel abnormalities and alternations in cardiovascular disease risk analysis. This paper presents a new and accurate automatic artery and vein classification method developed for arteriolar-to-venular width ratio (AVR) and artery and vein tortuosity measurements in regions of interest (ROI) of 1.5 and 2.5 optic disc diameters from the disc center, respectively. This method includes illumination normalization, automatic optic disc detection and retinal vessel segmentation, feature extraction, and a partial least squares (PLS) classification. Normalized multi-color information, color variation, and multi-scale morphological features are extracted on each vessel segment. We trained the algorithm on a set of 51 color fundus images using manually marked arteries and veins. We tested the proposed method in a previously unseen test data set consisting of 42 images. We obtained an area under the ROC curve (AUC) of 93.7% in the ROI of AVR measurement and 91.5% of AUC in the ROI of tortuosity measurement. The proposed AV classification method has the potential to assist automatic cardiovascular disease early detection and risk analysis.
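
    The PLS classification step can be sketched with scikit-learn's PLSRegression used as a scorer, fed to ROC analysis as in the abstract. The three per-segment features and their distributions below are invented stand-ins for the color and morphology features described.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(9)
n = 300
# e.g. normalized red intensity, color variation, vessel width (invented)
arteries = rng.normal([0.7, 0.4, 6.0], [0.1, 0.1, 1.0], size=(n, 3))
veins = rng.normal([0.5, 0.3, 8.0], [0.1, 0.1, 1.0], size=(n, 3))
X = np.vstack([arteries, veins])
y = np.repeat([0.0, 1.0], n)   # 0 = artery, 1 = vein

pls = PLSRegression(n_components=2).fit(X[::2], y[::2])
scores = pls.predict(X[1::2]).ravel()
print("AUC:", round(roc_auc_score(y[1::2], scores), 3))
```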

  18. Automated Tissue Classification Framework for Reproducible Chronic Wound Assessment

    Directory of Open Access Journals (Sweden)

    Rashmi Mukherjee

    2014-01-01

    The aim of this paper was to develop a computer-assisted tissue classification (granulation, necrotic, and slough) scheme for chronic wound (CW) evaluation using medical image processing and statistical machine learning techniques. The red-green-blue (RGB) wound images captured by a normal digital camera were first transformed into the HSI (hue, saturation, and intensity) color space, and the "S" component of the HSI color channels was selected as it provided higher contrast. Wound areas from 6 different types of CW were segmented from whole images using fuzzy divergence based thresholding by minimizing edge ambiguity. A set of color and textural features describing granulation, necrotic, and slough tissues in the segmented wound area were extracted using various mathematical techniques. Finally, statistical learning algorithms, namely Bayesian classification and the support vector machine (SVM), were trained and tested for wound tissue classification in different CW images. The performance of the wound area segmentation protocol was further validated by ground truth images labeled by clinical experts. It was observed that the SVM with a 3rd-order polynomial kernel provided the highest accuracies, that is, 86.94%, 90.47%, and 75.53% for classifying granulation, slough, and necrotic tissues, respectively. The proposed automated tissue classification technique achieved the highest overall accuracy, 87.61%, with the highest kappa statistic value (0.793).
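
    The front end of the described pipeline, saturation channel, wound segmentation by thresholding, then a 3rd-order polynomial SVM on color features, can be sketched as below. Otsu thresholding stands in for the paper's fuzzy-divergence method, HSV saturation for the HSI "S" component, and the image and tissue labels are synthetic.

```python
import numpy as np
from skimage.color import rgb2hsv
from skimage.filters import threshold_otsu
from sklearn.svm import SVC

rng = np.random.default_rng(10)
rgb = rng.normal(0.6, 0.02, (128, 128, 3))   # pale, low-saturation skin
rgb[40:90, 40:64] = [0.70, 0.15, 0.15]       # red "granulation" patch
rgb[40:90, 64:90] = [0.75, 0.70, 0.20]       # yellow "slough" patch
rgb = np.clip(rgb + rng.normal(0, 0.01, rgb.shape), 0, 1)

s = rgb2hsv(rgb)[..., 1]                     # saturation channel
mask = s > threshold_otsu(s)                 # wound segmentation

feats = rgb[mask]                            # per-pixel color features
labels = (feats[:, 1] > 0.4).astype(int)     # 1 = "slough"-like (yellow)
clf = SVC(kernel="poly", degree=3).fit(feats[::2], labels[::2])
print("tissue classification accuracy:",
      round(clf.score(feats[1::2], labels[1::2]), 3))
```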

  19. Generalized classification of welds according to defect type based on radiation testing results

    International Nuclear Information System (INIS)

    Adamenko, A.A.; Demidko, V.G.

    1980-01-01

    A generalized classification of welds according to defect type is constructed, with respect to the real danger of a defect, which to a first approximation is proportional to the relative decrease in thickness, and with respect to a defect's potential danger, which can be determined by its sharpness. According to this classification, welded joints are divided into five classes in accordance with COMECON guides. The division into classes is carried out according to a two-fold numerical criterion, applicable when experimental data on the three linear dimensions of a defect are available. The above classification is of main importance for the automatic processing of radiation testing data.

  20. Automated recognition system for ELM classification in JET

    International Nuclear Information System (INIS)

    Duro, N.; Dormido, R.; Vega, J.; Dormido-Canto, S.; Farias, G.; Sanchez, J.; Vargas, H.; Murari, A.

    2009-01-01

    Edge localized modes (ELMs) are instabilities occurring in the edge of H-mode plasmas. Considerable efforts are being devoted to understanding the physics behind this non-linear phenomenon. A first characterization of ELMs is usually their identification as type I or type III. An automated pattern recognition system has been developed in JET for off-line ELM recognition and classification. The empirical method presented in this paper analyzes each individual ELM instead of starting from a temporal segment containing many ELM bursts. The ELM recognition and isolation is carried out using three signals: Dα, line integrated electron density and stored diamagnetic energy. A reduced set of characteristics (such as diamagnetic energy drop, ELM period or Dα shape) has been extracted to build supervised and unsupervised learning systems for classification purposes. The former are based on support vector machines (SVM). The latter have been developed with hierarchical and K-means clustering methods. The success rate of the classification systems is about 98% for a database of almost 300 ELMs.

  1. Advanced defect classification by smart sampling, based on sub-wavelength anisotropic scatterometry

    Science.gov (United States)

    van der Walle, Peter; Kramer, Esther; Ebeling, Rob; Spruit, Helma; Alkemade, Paul; Pereira, Silvania; van der Donck, Jacques; Maas, Diederik

    2018-03-01

    We report on advanced defect classification using TNO's RapidNano particle scanner. RapidNano was originally designed for defect detection on blank substrates. In detection mode, the RapidNano signals from nine azimuth angles are added for sensitivity. In review mode, signals from individual angles are analyzed to derive additional defect properties. We define the Fourier coefficient parameter space, which is useful for studying the statistical variation in defect types on a sample. By selecting defects from each defect type for further review by SEM, information on all defects can be obtained efficiently.
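
    The abstract does not spell out how the Fourier coefficient parameter space is computed; one plausible reading, sketched below under our own assumptions, treats the nine azimuth-angle signals of each defect as one period of an angular response and summarizes its anisotropy with low-order Fourier coefficient magnitudes.

      # Hedged illustration (not TNO's exact definition): low-order Fourier
      # coefficients of the per-defect angular scatter response.
      import numpy as np

      def angular_fourier_features(signals, n_coeffs=3):
          """signals: (n_defects, 9) scatter intensity per azimuth angle."""
          spectrum = np.fft.rfft(signals, axis=1)   # angular FFT per defect
          mags = np.abs(spectrum[:, :n_coeffs])     # |c0|, |c1|, |c2|
          return mags / (mags[:, :1] + 1e-12)       # normalize by the DC term

      defects = np.random.rand(100, 9)              # stand-in review-mode signals
      features = angular_fourier_features(defects)  # cluster/plot these per type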

  2. Classification of maxillectomy defects: a systematic review and criteria necessary for a universal description.

    Science.gov (United States)

    Bidra, Avinash S; Jacob, Rhonda F; Taylor, Thomas D

    2012-04-01

    Maxillectomy defects are complex and involve a number of anatomic structures. Several maxillectomy defect classifications have been proposed with no universal acceptance among surgeons and prosthodontists. Established criteria for describing the maxillectomy defect are lacking. This systematic review aimed to evaluate classification systems in the available literature, to provide a critical appraisal, and to identify the criteria necessary for a universal description of maxillectomy and midfacial defects. An electronic search of the English language literature between 1974 and June 2011 was performed by using PubMed, Scopus, and Cochrane databases with predetermined inclusion criteria. Key terms included in the search were maxillectomy classification, maxillary resection classification, maxillary removal classification, maxillary reconstruction classification, midfacial defect classification, and midfacial reconstruction classification. This was supplemented by a manual search of selected journals. After application of predetermined exclusion criteria, the final list of articles was reviewed in-depth to provide a critical appraisal and identify criteria for a universal description of a maxillectomy defect. The electronic database search yielded 261 titles. Systematic application of inclusion and exclusion criteria resulted in identification of 14 maxillectomy and midfacial defect classification systems. From these articles, 6 different criteria were identified as necessary for a universal description of a maxillectomy defect. Multiple deficiencies were noted in each classification system. Though most articles described the superior-inferior extent of the defect, only a small number of articles described the anterior-posterior and medial-lateral extent of the defect. Few articles listed dental status and soft palate involvement when describing maxillectomy defects. No classification system has accurately described the maxillectomy defect, based on

  3. ROLE OF DATA MINING CLASSIFICATION TECHNIQUE IN SOFTWARE DEFECT PREDICTION

    OpenAIRE

    Dr.A.R.Pon Periyasamy; Mrs A.Misbahulhuda

    2017-01-01

    Software defect prediction is the process of locating defective modules in software. Software quality is a field of study and practice that describes the desirable attributes of software products. The performance should be excellent with no defects. Software quality metrics are a subset of software metrics that focus on the quality aspects of the product, process, and project. The software defect prediction model helps in early detection of defects and contributes to t...

  4. Automated vegetation classification using Thematic Mapper Simulation data

    Science.gov (United States)

    Nedelman, K. S.; Cate, R. B.; Bizzell, R. M.

    1983-01-01

    The present investigation is concerned with the results of a study of Thematic Mapper Simulation (TMS) data. One objective of the study was to evaluate the usefulness of the Thematic Mapper's (TM) improved spatial resolution and spectral coverage. The study was undertaken as part of a preparation for the efficient incorporation of Landsat 4 data into ongoing technology development in remote sensing. The study included an application of automated Landsat vegetation classification technology to TMS data. Comparison of TMS data with Multispectral Scanner (MSS) data indicates that field definition, crop type discrimination, and subsequent proportion estimation may be greatly improved with the availability of TM data.

  5. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    Science.gov (United States)

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms, we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated on the Leaderboard set and reaches second position in the ranking. Our variant based on the GHMRF achieves first position in the Test ranking of the unsupervised approaches and seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
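
    A minimal sketch of the non-structured GMM variant follows, assuming voxel intensities from the MR modalities have already been stacked into a feature matrix; the number of components and the random stand-in data are our choices, and the paper's tissue-probability-map postprocess is omitted.

      # Unsupervised GMM clustering of multiparametric MR voxels (sketch only).
      import numpy as np
      from sklearn.mixture import GaussianMixture

      X = np.random.rand(5000, 4)      # stand-in: (voxels, modalities) e.g. T1/T1c/T2/FLAIR
      gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0)
      labels = gmm.fit_predict(X)      # one cluster id per voxel
      # reshape `labels` back to the image grid to obtain the segmentation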

  6. Automated authorship attribution using advanced signal classification techniques.

    Directory of Open Access Journals (Sweden)

    Maryam Ebrahimpour

    Full Text Available In this paper, we develop two automated authorship attribution schemes, one based on Multiple Discriminant Analysis (MDA and the other based on a Support Vector Machine (SVM. The classification features we exploit are based on word frequencies in the text. We adopt an approach of preprocessing each text by stripping it of all characters except a-z and space. This is in order to increase the portability of the software to different types of texts. We test the methodology on a corpus of undisputed English texts, and use leave-one-out cross validation to demonstrate classification accuracies in excess of 90%. We further test our methods on the Federalist Papers, which have a partly disputed authorship and a fair degree of scholarly consensus. And finally, we apply our methodology to the question of the authorship of the Letter to the Hebrews by comparing it against a number of original Greek texts of known authorship. These tests identify where some of the limitations lie, motivating a number of open questions for future work. An open source implementation of our methodology is freely available for use at https://github.com/matthewberryman/author-detection.
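
    The preprocessing and word-frequency pipeline described above can be sketched as follows; the linear-kernel SVM, scikit-learn, and the toy texts are our assumptions, and leave-one-out evaluation mirrors the paper's protocol.

      # Sketch: strip each text to a-z and space, build word-frequency features,
      # classify with an SVM, evaluate with leave-one-out cross validation.
      import re
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.model_selection import LeaveOneOut, cross_val_score
      from sklearn.svm import SVC

      def preprocess(text):
          return re.sub(r"[^a-z ]", "", text.lower())  # keep only a-z and space

      texts = ["The cause of liberty...", "To the people of the state..."] * 10
      labels = [0, 1] * 10                              # toy two-author corpus
      X = CountVectorizer().fit_transform(preprocess(t) for t in texts)
      acc = cross_val_score(SVC(kernel="linear"), X, labels, cv=LeaveOneOut()).mean()
      print("LOO accuracy:", acc)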

  7. Frequency Optimization for Enhancement of Surface Defect Classification Using the Eddy Current Technique

    Science.gov (United States)

    Fan, Mengbao; Wang, Qi; Cao, Binghua; Ye, Bo; Sunny, Ali Imam; Tian, Guiyun

    2016-01-01

    Eddy current testing is quite a popular non-contact and cost-effective method for nondestructive evaluation of product quality and structural integrity. Excitation frequency is one of the key performance factors for defect characterization. In the literature, there are many interesting papers dealing with wide spectral content and optimal frequency in terms of detection sensitivity. However, research activity on frequency optimization with respect to characterization performances is lacking. In this paper, an investigation into optimum excitation frequency has been conducted to enhance surface defect classification performance. The influences of excitation frequency for a group of defects were revealed in terms of detection sensitivity, contrast between defect features, and classification accuracy using kernel principal component analysis (KPCA) and a support vector machine (SVM). It is observed that probe signals are the most sensitive on the whole for a group of defects when excitation frequency is set near the frequency at which maximum probe signals are retrieved for the largest defect. After the use of KPCA, the margins between the defect features are optimum from the perspective of the SVM, which adopts optimal hyperplanes for structure risk minimization. As a result, the best classification accuracy is obtained. The main contribution is that the influences of excitation frequency on defect characterization are interpreted, and experiment-based procedures are proposed to determine the optimal excitation frequency for a group of defects rather than a single defect with respect to optimal characterization performances. PMID:27164112
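
    The KPCA-plus-SVM chain used for defect classification can be sketched as below, assuming per-defect signal features have already been extracted; the kernel parameters, component count, and random stand-in data are illustrative choices, not the authors' settings.

      # Hedged sketch: kernel PCA re-maps eddy-current defect features so class
      # margins widen, then an RBF SVM classifies the defect type.
      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)
      X = rng.normal(size=(120, 16))       # stand-in per-defect signal features
      y = rng.integers(0, 4, size=120)     # four surface defect classes

      model = make_pipeline(KernelPCA(n_components=8, kernel="rbf", gamma=0.1),
                            SVC(kernel="rbf"))
      model.fit(X[:80], y[:80])
      print("test accuracy:", model.score(X[80:], y[80:]))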

  8. The method of diagnosis and classification of the gingival line defects of the teeth hard tissues

    Directory of Open Access Journals (Sweden)

    Olena Bulbuk

    2017-06-01

    Full Text Available In the diagnosis and treatment of hard tissue defects, the choice of dental treatment tactics plays a significant role for defects located at the gingival line of any tooth. This work studies the problems of diagnosis and classification of gingival line defects of the teeth hard tissues, which contributes to objective, differentiated diagnostic and therapeutic approaches in the dental treatment of the various clinical variants of defect localization. The objective of the study is to develop an anatomical-functional classification for differentiated assessment of hard tissue defects in the gingival part, as a basis for applying differential diagnostic-therapeutic approaches to the dental treatment of hard tissue defects located in the gingival part of any tooth. Materials and methods: 48 patients with hard tissue defects located in the gingival part of a tooth were examined. A periodontal probe and X-ray examination were used to assess the magnitude of gingival line destruction. Results. As the result of the performed research, a classification of the gingival line defects of the hard tissues is proposed, based on an index whose value is the integer distance, in millimeters, from the epithelial attachment to the bottom of the defect cavity. Conclusions. The proposed classification fills an obvious gap in academic representations of hard tissue defects located in the gingival part of any tooth. It also offers the prospect of consensus on differentiated diagnostic-therapeutic approaches in different clinical variants of localization. This classification builds a methodological “bridge of continuity” between therapeutic and prosthetic dentistry in the treatment of the gingival line defects of dental hard tissues.

  9. Automated classification of Acid Rock Drainage potential from Corescan drill core imagery

    Science.gov (United States)

    Cracknell, M. J.; Jackson, L.; Parbhakar-Fox, A.; Savinova, K.

    2017-12-01

    Classification of the acid forming potential of waste rock is important for managing environmental hazards associated with mining operations. Current methods for the classification of acid rock drainage (ARD) potential usually involve labour intensive and subjective assessment of drill core and/or hand specimens. Manual methods are subject to operator bias, human error and the amount of material that can be assessed within a given time frame is limited. The automated classification of ARD potential documented here is based on the ARD Index developed by Parbhakar-Fox et al. (2011). This ARD Index involves the combination of five indicators: A - sulphide content; B - sulphide alteration; C - sulphide morphology; D - primary neutraliser content; and E - sulphide mineral association. Several components of the ARD Index require accurate identification of sulphide minerals. This is achieved by classifying Corescan Red-Green-Blue true colour images into the presence or absence of sulphide minerals using supervised classification. Subsequently, sulphide classification images are processed and combined with Corescan SWIR-based mineral classifications to obtain information on sulphide content, indices representing sulphide textures (disseminated versus massive and degree of veining), and spatially associated minerals. This information is combined to calculate ARD Index indicator values that feed into the classification of ARD potential. Automated ARD potential classifications of drill core samples associated with a porphyry Cu-Au deposit are compared to manually derived classifications and those obtained by standard static geochemical testing and X-ray diffractometry analyses. Results indicate a high degree of similarity between automated and manual ARD potential classifications. Major differences between approaches are observed in sulphide and neutraliser mineral percentages, likely due to the subjective nature of manual estimates of mineral content. The automated approach

  10. Comparison of an automated classification system with an empirical classification of circulation patterns over the Pannonian basin, Central Europe

    Science.gov (United States)

    Maheras, Panagiotis; Tolika, Konstantia; Tegoulias, Ioannis; Anagnostopoulou, Christina; Szpirosz, Klicász; Károssy, Csaba; Makra, László

    2018-04-01

    The aim of the study is to compare the performance of two classification methods based on the atmospheric circulation types over the Pannonian basin in Central Europe. Moreover, relationships including seasonal occurrences and correlation coefficients, as well as comparative diagrams of the seasonal occurrences of the circulation types of the two classification systems, are presented. When comparing the automated (objective) and empirical (subjective) classification methods, it was found that the frequency of the empirical anticyclonic (cyclonic) types is much higher (lower) than that of the automated anticyclonic (cyclonic) types on both an annual and a seasonal basis. The highest and statistically significant correlations between the circulation types of the two classification systems, as well as those between the cumulated seasonal anticyclonic and cyclonic types, occur in winter for both classifications, since the weather-influencing effect of the atmospheric circulation is most prevalent in this season. Precipitation amounts in Budapest display a decreasing trend, in accordance with the decrease in the occurrence of the automated cyclonic types. In contrast, the occurrence of the empirical cyclonic types displays an increasing trend. Some types in a given classification are usually accompanied by high ratios of certain types in the other classification.

  11. Yarn-dyed fabric defect classification based on convolutional neural network

    Science.gov (United States)

    Jing, Junfeng; Dong, Amei; Li, Pengfei; Zhang, Kaibing

    2017-09-01

    Considering that manual inspection of the yarn-dyed fabric can be time consuming and inefficient, we propose a yarn-dyed fabric defect classification method by using a convolutional neural network (CNN) based on a modified AlexNet. CNN shows powerful ability in performing feature extraction and fusion by simulating the learning mechanism of human brain. The local response normalization layers in AlexNet are replaced by the batch normalization layers, which can enhance both the computational efficiency and classification accuracy. In the training process of the network, the characteristics of the defect are extracted step by step and the essential features of the image can be obtained from the fusion of the edge details with several convolution operations. Then the max-pooling layers, the dropout layers, and the fully connected layers are employed in the classification model to reduce the computation cost and extract more precise features of the defective fabric. Finally, the results of the defect classification are predicted by the softmax function. The experimental results show promising performance with an acceptable average classification rate and strong robustness on yarn-dyed fabric defect classification.
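
    The paper's key architectural modification, replacing AlexNet's local response normalization with batch normalization, can be illustrated by the PyTorch fragment below. The layer sizes and the 5-class output are our assumptions, not the authors' exact configuration.

      # Illustrative AlexNet-style conv blocks with LRN -> BatchNorm substitution.
      import torch.nn as nn

      class FabricDefectNet(nn.Module):
          def __init__(self, n_classes=5):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(3, 64, kernel_size=11, stride=4, padding=2),
                  nn.BatchNorm2d(64),              # replaces AlexNet's LRN layer
                  nn.ReLU(inplace=True),
                  nn.MaxPool2d(3, stride=2),
                  nn.Conv2d(64, 192, kernel_size=5, padding=2),
                  nn.BatchNorm2d(192),             # second LRN -> BatchNorm swap
                  nn.ReLU(inplace=True),
                  nn.MaxPool2d(3, stride=2),
              )
              self.classifier = nn.Sequential(
                  nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                  nn.Dropout(0.5), nn.Linear(192, n_classes),  # softmax via loss
              )

          def forward(self, x):
              return self.classifier(self.features(x))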

  12. An Ultrasonic Pattern Recognition Approach to Welding Defect Classification

    International Nuclear Information System (INIS)

    Song, Sung Jin

    1995-01-01

    Classification of flaws in weldments from their ultrasonic scattering signals is very important in quantitative nondestructive evaluation. This problem is ideally suited to modern ultrasonic pattern recognition techniques. Here a brief discussion of a systematic approach to this methodology is presented, including ultrasonic feature extraction, feature selection, and classification. A strong emphasis is placed on probabilistic neural networks as efficient classifiers for many practical classification problems. In an example, probabilistic neural networks are applied to classify flaws in weldments into three classes: cracks, porosity, and slag inclusions. Probabilistic nets are shown to match the high performance of other classifiers without any training-time overhead. In addition, a forward selection scheme for sensitive features is addressed to enhance network performance.
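
    A probabilistic neural network is essentially a Parzen-window classifier, which is why it needs no iterative training. A minimal sketch follows; the single smoothing parameter sigma and the random stand-in features are our assumptions.

      # Minimal PNN (Parzen-window) classifier: per-class Gaussian kernel density
      # estimates, no iterative training; only sigma is free.
      import numpy as np

      def pnn_predict(X_train, y_train, X_test, sigma=0.5):
          classes = np.unique(y_train)
          scores = []
          for c in classes:
              Xc = X_train[y_train == c]                           # pattern layer
              d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
              scores.append(np.exp(-d2 / (2 * sigma**2)).mean(1))  # summation layer
          return classes[np.argmax(scores, axis=0)]                # decision layer

      X = np.random.rand(60, 8)                  # stand-in ultrasonic features
      y = np.random.randint(0, 3, 60)            # crack / porosity / slag inclusion
      print(pnn_predict(X, y, np.random.rand(5, 8)))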

  13. Land Cover and Land Use Classification with TWOPAC: towards Automated Processing for Pixel- and Object-Based Image Classification

    Directory of Open Access Journals (Sweden)

    Stefan Dech

    2012-09-01

    Full Text Available We present a novel and innovative automated processing environment for the derivation of land cover (LC) and land use (LU) information. This processing framework, named TWOPAC (TWinned Object and Pixel based Automated classification Chain), enables the standardized, independent, user-friendly, and comparable derivation of LC and LU information, with minimized manual classification labor. TWOPAC allows classification of multi-spectral and multi-temporal remote sensing imagery from different sensor types. TWOPAC enables not only pixel-based classification, but also allows classification based on object-based characteristics. Classification is based on a Decision Tree (DT) approach, for which the well-known C5.0 code has been implemented, which builds decision trees based on the concept of information entropy. TWOPAC enables automatic generation of the decision tree classifier based on a C5.0-retrieved ascii-file, as well as fully automatic validation of the classification output via sample-based accuracy assessment. Envisaging the automated generation of standardized land cover products, as well as area-wide classification of large amounts of data in preferably a short processing time, standardized interfaces for process control, Web Processing Services (WPS), as introduced by the Open Geospatial Consortium (OGC), are utilized. TWOPAC's functionality to process geospatial raster or vector data via web resources (server, network) enables TWOPAC's usability independent of any commercial client or desktop software and allows for large-scale data processing on servers. Furthermore, the components of TWOPAC were built using open source code components and are implemented as a plug-in for Quantum GIS software for easy handling of the classification process from the user's perspective.

  14. A systematic literature review of automated clinical coding and classification systems.

    Science.gov (United States)

    Stanfill, Mary H; Williams, Margaret; Fenton, Susan H; Jenders, Robert A; Hersh, William R

    2010-01-01

    Clinical coding and classification processes transform natural language descriptions in clinical text into data that can subsequently be used for clinical care, research, and other purposes. This systematic literature review examined studies that evaluated all types of automated coding and classification systems to determine the performance of such systems. Studies indexed in Medline or other relevant databases prior to March 2009 were considered. The 113 studies included in this review show that automated tools exist for a variety of coding and classification purposes, focus on various healthcare specialties, and handle a wide variety of clinical document types. Automated coding and classification systems themselves are not generalizable, nor are the results of the studies evaluating them. Published research shows these systems hold promise, but these data must be considered in context, with performance relative to the complexity of the task and the desired outcome.

  15. Development of an intelligent ultrasonic welding defect classification software

    International Nuclear Information System (INIS)

    Song, Sung Jin; Kim, Hak Joon; Jeong, Hee Don

    1997-01-01

    Ultrasonic pattern recognition is the most effective approach to the problem of discriminating types of flaws in weldments based on ultrasonic flaw signals. In spite of significant progress in research on this methodology, it has not been widely used in practical ultrasonic inspections of weldments in industry. Hence, for the convenient application of this approach in practical situations, we developed intelligent ultrasonic signature classification software which can discriminate the types of flaws in weldments based on their ultrasonic signals, using various tools from artificial intelligence such as neural networks. This software shows excellent performance in an experimental problem where flaws in weldments are classified into the two categories of cracks and non-cracks. This performance demonstrates the high potential of this software as a practical tool for ultrasonic flaw classification in weldments.

  16. Lithography-based automation in the design of program defect masks

    Science.gov (United States)

    Vakanas, George P.; Munir, Saghir; Tejnil, Edita; Bald, Daniel J.; Nagpal, Rajesh

    2004-05-01

    In this work, we are reporting on a lithography-based methodology and automation in the design of Program Defect Masks (PDMs). Leading edge technology masks have ever-shrinking primary features and more pronounced model-based secondary features such as optical proximity corrections (OPC), sub-resolution assist features (SRAFs) and phase-shifted mask (PSM) structures. In order to define defect disposition specifications for critical layers of a technology node, experience alone in deciding worst-case scenarios for the placement of program defects is necessary but may not be sufficient. MEEF calculations initiated from layout pattern data and their integration in a PDM layout flow provide a natural approach for improvements, relevance and accuracy in the placement of programmed defects. This methodology provides closed-loop feedback between layout and hard defect disposition specifications, thereby minimizing engineering test restarts, improving quality and reducing cost of high-end masks. Apart from SEMI and industry standards, best-known methods (BKMs) in integrated lithographically-based layout methodologies and automation specific to PDMs are scarce. The contribution of this paper lies in the implementation of Design-For-Test (DFT) principles to a synergistic interaction of CAD Layout and Aerial Image Simulator to drive layout improvements, highlight layout-to-fracture interactions and output accurate program defect placement coordinates to be used by tools in the mask shop.

  17. Estimated accuracy of classification of defects detected in welded joints by radiographic tests

    International Nuclear Information System (INIS)

    Siqueira, M.H.S.; De Silva, R.R.; De Souza, M.P.V.; Rebello, J.M.A.; Caloba, L.P.; Mery, D.

    2004-01-01

    This work is a study to estimate the accuracy of classification of the main classes of weld defects detected by radiographic testing: undercut, lack of penetration, porosity, slag inclusion, crack, and lack of fusion. To carry out this work, non-linear pattern classifiers were developed using neural networks, drawing on the largest possible number of radiographic patterns, together with statistical inference techniques based on random selection of samples with and without replacement (bootstrap) in order to estimate the accuracy of the classification. The results pointed to an estimated accuracy of around 80% for the classes of defects analyzed. (author)

  18. Estimated accuracy of classification of defects detected in welded joints by radiographic tests

    Energy Technology Data Exchange (ETDEWEB)

    Siqueira, M.H.S.; De Silva, R.R.; De Souza, M.P.V.; Rebello, J.M.A. [Federal Univ. of Rio de Janeiro, Dept., of Metallurgical and Materials Engineering, Rio de Janeiro (Brazil); Caloba, L.P. [Federal Univ. of Rio de Janeiro, Dept., of Electrical Engineering, Rio de Janeiro (Brazil); Mery, D. [Pontificia Unversidad Catolica de Chile, Escuela de Ingenieria - DCC, Dept. de Ciencia de la Computacion, Casilla, Santiago (Chile)

    2004-07-01

    This work is a study to estimate the accuracy of classification of the main classes of weld defects detected by radiographic testing: undercut, lack of penetration, porosity, slag inclusion, crack, and lack of fusion. To carry out this work, non-linear pattern classifiers were developed using neural networks, drawing on the largest possible number of radiographic patterns, together with statistical inference techniques based on random selection of samples with and without replacement (bootstrap) in order to estimate the accuracy of the classification. The results pointed to an estimated accuracy of around 80% for the classes of defects analyzed. (author)

  19. Unsupervised Classification of Surface Defects in Wire Rod Production Obtained by Eddy Current Sensors

    Directory of Open Access Journals (Sweden)

    Sergio Saludes-Rodil

    2015-04-01

    Full Text Available An unsupervised approach to classify surface defects in wire rod manufacturing is developed in this paper. The defects are extracted from an eddy current signal and classified using a clustering technique that uses the dynamic time warping distance as the dissimilarity measure. The new approach has been successfully tested using industrial data. It is shown that it outperforms other classification alternatives, such as the modified Fourier descriptors.
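
    The combination described above, dynamic time warping as the dissimilarity measure feeding a clustering step, can be sketched as follows. The plain O(nm) DTW implementation, the agglomerative linkage choice, and the number of clusters are our assumptions, not the authors' exact settings.

      # Sketch: DTW distance matrix over eddy-current defect signatures, then
      # hierarchical clustering into defect families.
      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage
      from scipy.spatial.distance import squareform

      def dtw(a, b):
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      signals = [np.random.rand(50) for _ in range(12)]   # stand-in signatures
      dist = np.array([[dtw(a, b) for b in signals] for a in signals])
      labels = fcluster(linkage(squareform(dist, checks=False), "average"),
                        t=3, criterion="maxclust")        # e.g. 3 defect families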

  20. Defect detection and classification of galvanized stamping parts based on fully convolution neural network

    Science.gov (United States)

    Xiao, Zhitao; Leng, Yanyi; Geng, Lei; Xi, Jiangtao

    2018-04-01

    In this paper, a new convolutional neural network method is proposed for the inspection and classification of galvanized stamping parts. Firstly, all workpieces are divided into normal and defective by image processing, and the region of interest (ROI) extracted from each defective workpiece is input to a trained fully convolutional network (FCN). The network uses end-to-end, pixel-to-pixel training, currently the most advanced approach in semantic segmentation, and predicts a result for each pixel. Secondly, we mark different pixel values for workpiece, defect, and background in the training images, and use the pixel values and pixel counts to recognize the defects in the output image. Finally, a threshold on the defect area, chosen according to the needs of the project, is set to achieve the final classification of the workpiece. The experimental results show that the proposed method can successfully achieve defect detection and classification of galvanized stamping parts under ordinary camera and illumination conditions, and its accuracy can reach 99.6%. Moreover, it overcomes the problem of complex image preprocessing and difficult feature extraction and shows better adaptability.

  1. Semi-Automated Classification of Seafloor Data Collected on the Delmarva Inner Shelf

    Science.gov (United States)

    Sweeney, E. M.; Pendleton, E. A.; Brothers, L. L.; Mahmud, A.; Thieler, E. R.

    2017-12-01

    We tested automated classification methods on acoustic bathymetry and backscatter data collected by the U.S. Geological Survey (USGS) and National Oceanic and Atmospheric Administration (NOAA) on the Delmarva inner continental shelf to efficiently and objectively identify sediment texture and geomorphology. Automated classification techniques are generally less subjective and take significantly less time than manual classification methods. We used a semi-automated process combining unsupervised and supervised classification techniques to characterize the seafloor based on bathymetric slope and relative backscatter intensity. Statistical comparison of our automated classification results with those of a manual classification conducted on a subset of the acoustic imagery indicates that our automated method was highly accurate (95% total accuracy and 93% Kappa). Our methods resolve sediment ridges, zones of flat seafloor, and areas of high and low backscatter. We compared our classification scheme with mean grain size statistics of samples collected in the study area and found strong correlations between backscatter intensity and sediment texture. High backscatter zones are associated with the presence of gravel and shells mixed with sand, and low backscatter areas are primarily clean sand or sand mixed with mud. Slope classes further elucidate textural and geomorphologic differences in the seafloor, such that steep slopes (>0.35°) with high backscatter are most often associated with the updrift side of sand ridges and bedforms, whereas low slope with high backscatter corresponds to coarse lag or shell deposits. Low backscatter and high slopes are most often found on the downdrift side of ridges and bedforms, and low backscatter and low slopes identify swale areas and sand sheets. We found that poor acoustic data quality was the most significant cause of inaccurate classification results, which required additional user input to mitigate. Our method worked well

  2. Automated Stellar Classification for Large Surveys with EKF and RBF Neural Networks

    Institute of Scientific and Technical Information of China (English)

    Ling Bai; Ping Guo; Zhan-Yi Hu

    2005-01-01

    An automated classification technique for large size stellar surveys is proposed. It uses the extended Kalman filter as a feature selector and pre-classifier of the data, and radial basis function neural networks for the classification. Experiments with real data have shown that the correct classification rate can reach as high as 93%, which is quite satisfactory. When different system models are selected for the extended Kalman filter, the classification results are relatively stable. It is shown that for this particular case the result using the extended Kalman filter is better than using principal component analysis.

  3. Reconstruction of road defects and road roughness classification using vehicle responses with artificial neural networks simulation

    CSIR Research Space (South Africa)

    Ngwangwa, HM

    2010-04-01

    Full Text Available Journal of Terramechanics, Volume 47, Issue 2, April 2010, Pages 97-111. H.M. Ngwangwa, P.S. Heyns, F...

  4. Simple Fully Automated Group Classification on Brain fMRI

    International Nuclear Information System (INIS)

    Honorio, J.; Goldstein, R.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-01-01

    We propose a simple, well-grounded classification technique which is suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise level, high subject variability, and imperfect registration, and that capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averages across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block design data sets that capture brain function under distinct monetary rewards for cocaine addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.

  5. Simple Fully Automated Group Classification on Brain fMRI

    Energy Technology Data Exchange (ETDEWEB)

    Honorio, J.; Goldstein, R.; Honorio, J.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-04-14

    We propose a simple, well-grounded classification technique which is suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise level, high subject variability, and imperfect registration, and that capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averages across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block design data sets that capture brain function under distinct monetary rewards for cocaine addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.

  6. Automation of the Analysis and Classification of the Line Material

    Directory of Open Access Journals (Sweden)

    A. A. Machuev

    2011-03-01

    Full Text Available This work is devoted to automating the analysis and verification of various data presentation formats, for which special software has been developed. The software was developed and tested on example files with typical extensions whose structural features are known in advance.

  7. Supervised learning for the automated transcription of spacer classification from spoligotype films

    Directory of Open Access Journals (Sweden)

    Abernethy Neil

    2009-08-01

    Full Text Available Abstract Background Molecular genotyping of bacteria has revolutionized the study of tuberculosis epidemiology, yet these established laboratory techniques typically require subjective and laborious interpretation by trained professionals. In the context of a Tuberculosis Case Contact study in The Gambia we used a reverse hybridization laboratory assay called spoligotype analysis. To facilitate processing of spoligotype images we have developed tools and algorithms to automate the classification and transcription of these data directly to a database while allowing for manual editing. Results Features extracted from each of the 1849 spots on a spoligo film were classified using two supervised learning algorithms. A graphical user interface allows manual editing of the classification, before export to a database. The application was tested on ten films of differing quality and the results of the best classifier were compared to expert manual classification, giving a median correct classification rate of 98.1% (inter quartile range: 97.1% to 99.2%, with an automated processing time of less than 1 minute per film. Conclusion The software implementation offers considerable time savings over manual processing whilst allowing expert editing of the automated classification. The automatic upload of the classification to a database reduces the chances of transcription errors.

  8. Experimental Study of the Effect of Internal Defects on Stress Waves during Automated Fiber Placement

    Directory of Open Access Journals (Sweden)

    Zhenyu Han

    2018-04-01

    Full Text Available Defect detection for composite components is currently realized only for offline defects and online surface defects during automated fiber placement (AFP). The characteristics of stress waves can be effectively applied to identify and detect internal defects in a material structure. However, the correlation mechanism between stress waves and internal defects during the AFP process remains unclear. This paper proposes a novel experimental method to measure stress waves, in which the continuous loading induced by the process itself is used as the excitation source, without any external excitation. Twenty-seven groups of thermosetting prepreg laminates were manufactured under different processing parameters to obtain different void contents. In order to quantitatively estimate the void content in the prepreg structure, a relational model between the void content and the ultrasonic attenuation coefficient was established using an A-scan ultrasonic flaw detector and optical microscopy. Furthermore, the high-frequency noise of the stress waves is removed using the Haar wavelet transform. The peaks, the Manhattan distance, and the mean stress during the laying process are analyzed and evaluated. The conclusions of this paper could provide theoretical support for online real-time detection of internal defects based on stress wave characteristics.
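
    The Haar wavelet denoising step can be sketched as below using PyWavelets; the soft-thresholding rule (universal threshold) and the synthetic decaying-tone signal are our assumptions, not the paper's exact procedure.

      # Sketch: Haar wavelet decomposition, soft-threshold the detail
      # coefficients, reconstruct the denoised stress-wave signal.
      import numpy as np
      import pywt

      def haar_denoise(signal, level=4):
          coeffs = pywt.wavedec(signal, "haar", level=level)
          sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise estimate
          thr = sigma * np.sqrt(2 * np.log(len(signal)))     # universal threshold
          coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
          return pywt.waverec(coeffs, "haar")[: len(signal)]

      t = np.linspace(0, 1, 1024)
      noisy = np.sin(40 * t) * np.exp(-3 * t) + 0.2 * np.random.randn(t.size)
      clean = haar_denoise(noisy)   # compute peaks / Manhattan distance on this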

  9. Automated supervised classification of variable stars. I. Methodology

    NARCIS (Netherlands)

    Debosscher, J.; Sarro, L.M.; Aerts, C.C.; Cuypers, J.; Vandenbussche, B.; Garrido, R.; Solano, E.

    2007-01-01

    Context: The fast classification of new variable stars is an important step in making them available for further research. Selection of science targets from large databases is much more efficient if they have been classified first. Defining the classes in terms of physical parameters is also

  10. Categorizing Children: Automated Text Classification of CHILDES files

    NARCIS (Netherlands)

    Opsomer, Rob; Knoth, Peter; Wiering, Marco; van Polen, Freek; Trapman, Jantine

    2008-01-01

    In this paper we present the application of machine learning text classification methods to two tasks: categorization of children’s speech in the CHILDES Database according to gender and age. Both tasks are binary. For age, we distinguish two age groups between the age of 1.9 and 3.0 years old. The

  11. Generating Clustered Journal Maps : An Automated System for Hierarchical Classification

    NARCIS (Netherlands)

    Leydesdorff, L.; Bornmann, L.; Wagner, C.S.

    2017-01-01

    Journal maps and classifications for 11,359 journals listed in the combined Journal Citation Reports 2015 of the Science and Social Sciences Citation Indexes are provided at https://leydesdorff.github.io/journals/ and http://www.leydesdorff.net/jcr15. A routine using VOSviewer for integrating the

  12. How automated image analysis techniques help scientists in species identification and classification?

    Science.gov (United States)

    Yousef Kalafi, Elham; Town, Christopher; Kaur Dhillon, Sarinder

    2017-09-04

    Identification of taxonomy at a specific level is time consuming and reliant upon expert ecologists. Hence the demand for automated species identification has increased over the last two decades. Automation of data classification is primarily focussed on images; incorporating and analysing image data has recently become easier due to developments in computational technology. Research efforts in species identification include processing specimen images, extracting distinguishing features, and classifying them into the correct categories. In this paper, we discuss recent automated species identification systems, categorizing and evaluating their methods. We reviewed and compared different methods in a step-by-step scheme of automated identification and classification systems for species images. The selection of methods is influenced by many variables, such as the level of classification, the number of training data, and the complexity of the images. The aim of this paper is to provide researchers and scientists with an extensive background study on work related to automated species identification, focusing on pattern recognition techniques for building such systems for biodiversity studies.

  13. Automated Classification of Consumer Health Information Needs in Patient Portal Messages

    Science.gov (United States)

    Cronin, Robert M.; Fabbri, Daniel; Denny, Joshua C.; Jackson, Gretchen Purcell

    2015-01-01

    Patients have diverse health information needs, and secure messaging through patient portals is an emerging means by which such needs are expressed and met. As patient portal adoption increases, growing volumes of secure messages may burden healthcare providers. Automated classification could expedite portal message triage and answering. We created four automated classifiers based on word content and natural language processing techniques to identify health information needs in 1000 patient-generated portal messages. Logistic regression and random forest classifiers detected single information needs well, with area under the curves of 0.804–0.914. A logistic regression classifier accurately found the set of needs within a message, with a Jaccard index of 0.859 (95% Confidence Interval: (0.847, 0.871)). Automated classification of consumer health information needs expressed in patient portal messages is feasible and may allow direct linking to relevant resources or creation of institutional resources for commonly expressed needs. PMID:26958285

  14. Automated Classification of Consumer Health Information Needs in Patient Portal Messages.

    Science.gov (United States)

    Cronin, Robert M; Fabbri, Daniel; Denny, Joshua C; Jackson, Gretchen Purcell

    2015-01-01

    Patients have diverse health information needs, and secure messaging through patient portals is an emerging means by which such needs are expressed and met. As patient portal adoption increases, growing volumes of secure messages may burden healthcare providers. Automated classification could expedite portal message triage and answering. We created four automated classifiers based on word content and natural language processing techniques to identify health information needs in 1000 patient-generated portal messages. Logistic regression and random forest classifiers detected single information needs well, with area under the curves of 0.804-0.914. A logistic regression classifier accurately found the set of needs within a message, with a Jaccard index of 0.859 (95% Confidence Interval: (0.847, 0.871)). Automated classification of consumer health information needs expressed in patient portal messages is feasible and may allow direct linking to relevant resources or creation of institutional resources for commonly expressed needs.

  15. Automated otolith image classification with multiple views: an evaluation on Sciaenidae.

    Science.gov (United States)

    Wong, J Y; Chu, C; Chong, V C; Dhillon, S K; Loh, K H

    2016-08-01

    Combined multiple 2D views (proximal, anterior and ventral aspects) of the sagittal otolith are proposed here as a method to capture shape information for fish classification. Classification performance of a single view compared with combined 2D views shows improved classification accuracy for the latter, for nine species of Sciaenidae. The effects of shape description methods (shape indices, Procrustes analysis and elliptical Fourier analysis) on classification performance were evaluated. Procrustes analysis and elliptical Fourier analysis perform better than shape indices when a single view is considered, but all perform equally well with combined views. A generic content-based image retrieval (CBIR) system that ranks dissimilarity (Procrustes distance) of otolith images was built to search query images without the need for detailed information of side (left or right), aspect (proximal or distal) and direction (positive or negative) of the otolith. Methods for the development of this automated classification system are discussed. © 2016 The Fisheries Society of the British Isles.

  16. Automated Feature Design for Time Series Classification by Genetic Programming

    OpenAIRE

    Harvey, Dustin Yewell

    2014-01-01

    Time series classification (TSC) methods discover and exploit patterns in time series and other one-dimensional signals. Although many accurate, robust classifiers exist for multivariate feature sets, general approaches are needed to extend machine learning techniques to make use of signal inputs. Numerous applications of TSC can be found in structural engineering, especially in the areas of structural health monitoring and non-destructive evaluation. Additionally, the fields of process contr...

  17. Aerial image measurement technique for automated reticle defect disposition (ARDD) in wafer fabs

    Science.gov (United States)

    Zibold, Axel M.; Schmid, Rainer M.; Stegemann, B.; Scheruebl, Thomas; Harnisch, Wolfgang; Kobiyama, Yuji

    2004-08-01

    The Aerial Image Measurement System (AIMS)* for 193 nm lithography emulation has been brought into operation successfully worldwide. A second generation system comprising 193 nm AIMS capability, mini-environment and SMIF, the AIMS fab 193 plus, is currently being introduced to the market. By adjusting the numerical aperture (NA), illumination type and partial illumination coherence to match the conditions in 193 nm steppers or scanners, it can emulate the exposure tool for any type of reticle, such as binary, OPC and PSM, down to the 65 nm node. The system allows rapid prediction of the wafer printability of defects or defect repairs, and of critical features such as dense patterns or contacts on the masks, without the need to perform expensive image qualification consisting of test wafer exposures followed by SEM measurements. AIMS is therefore a mask quality verification standard for high-end photomasks, established in mask shops worldwide. The progress on AIMS technology described in this paper highlights that, beyond mask shops, AIMS can be used very beneficially in the wafer fab, and we propose an Automated Reticle Defect Disposition (ARDD) process. At smaller nodes, where design rules are 65 nm or less, smaller defects on reticles are expected to occur in increasing numbers in the wafer fab. These smaller mask defects will matter more and more and become a serious yield-limiting factor. With increasing mask prices and increasing numbers and severity of defects on reticles, it will become cost-beneficial to perform defect disposition on reticles in wafer production. Ongoing studies demonstrate AIMS benefits for wafer fab applications. An outlook is given for the extension of 193 nm aerial imaging down to the 45 nm node based on emulation of immersion scanners.

  18. Feature Subset Selection and Instance Filtering for Cross-project Defect Prediction - Classification and Ranking

    Directory of Open Access Journals (Sweden)

    Faimison Porto

    2016-12-01

    Full Text Available Defect prediction models can be a good tool for organizing a project's test resources. The models can be constructed with two main goals: 1) to classify the software parts as defective or not; or 2) to rank the most defective parts in decreasing order. However, not all companies maintain an appropriate set of historical defect data. In this case, a company can build an appropriate dataset from known external projects, an approach called Cross-project Defect Prediction (CPDP). CPDP models, however, present low prediction performance due to the heterogeneity of the data. Recently, Instance Filtering methods were proposed to reduce this heterogeneity by selecting the most similar instances from the training dataset. Originally, the similarity is calculated based on all the available dataset features (or independent variables). We propose that using only the most relevant features in the similarity calculation can result in more accurate filtered datasets and better prediction performance. In this study we extend our previous work. We analyse both prediction goals, Classification and Ranking. We present an empirical evaluation of 41 different methods formed by associating Instance Filtering methods with Feature Selection methods. We used 36 versions of 11 open source projects in the experiments. The results show similar evidence for both prediction goals. First, the defect prediction performance of CPDP models can be improved by associating Feature Selection and Instance Filtering. Second, no evaluated method presented generally better performance. Indeed, the most appropriate method can vary according to the characteristics of the project being predicted.
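
    The core of instance filtering, keeping only external-project training instances similar to the target project's modules, can be sketched as follows. The value of k, the feature subset, and the nearest-neighbor similarity are illustrative choices, not the paper's exact configuration.

      # Sketch: for each target-project module, keep its k nearest external
      # training instances, measuring similarity on a selected feature subset.
      import numpy as np
      from sklearn.neighbors import NearestNeighbors

      def filter_training_set(X_ext, X_target, k=10, feature_idx=None):
          cols = feature_idx if feature_idx is not None else slice(None)
          nn = NearestNeighbors(n_neighbors=k).fit(X_ext[:, cols])
          _, idx = nn.kneighbors(X_target[:, cols])
          return np.unique(idx.ravel())      # indices of retained instances

      X_ext = np.random.rand(500, 20)        # external-project modules
      X_tgt = np.random.rand(80, 20)         # project being predicted
      keep = filter_training_set(X_ext, X_tgt, k=10, feature_idx=[0, 3, 7])
      X_train_filtered = X_ext[keep]         # train the defect model on this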

  19. Automated Classification of Phonological Errors in Aphasic Language

    Science.gov (United States)

    Ahuja, Sanjeev B.; Reggia, James A.; Berndt, Rita S.

    1984-01-01

    Using heuristically-guided state space search, a prototype program has been developed to simulate and classify phonemic errors occurring in the speech of neurologically-impaired patients. Simulations are based on an interchangeable rule/operator set of elementary errors which represent a theory of phonemic processing faults. This work introduces and evaluates a novel approach to error simulation and classification, it provides a prototype simulation tool for neurolinguistic research, and it forms the initial phase of a larger research effort involving computer modelling of neurolinguistic processes.

  20. Cascade classification of endocytoscopic images of colorectal lesions for automated pathological diagnosis

    Science.gov (United States)

    Itoh, Hayato; Mori, Yuichi; Misawa, Masashi; Oda, Masahiro; Kudo, Shin-ei; Mori, Kensaku

    2018-02-01

    This paper presents a new classification method for endocytoscopic images. Endocytoscopy is a new endoscopy technique that enables both conventional endoscopic observation and ultramagnified observation at the cell level. These ultramagnified views (endocytoscopic images) make it possible to perform pathological diagnosis from endoscopic views of polyps alone during colonoscopy. However, endocytoscopic image diagnosis requires considerable experience from physicians. An automated pathological diagnosis system is required to prevent the overlooking of neoplastic lesions in endocytoscopy. For this purpose, we propose a new automated endocytoscopic image classification method that classifies neoplastic and non-neoplastic endocytoscopic images. This method consists of two classification steps. At the first step, we classify an input image by support vector machine. We forward the image to the second step if the confidence of the first classification is low. At the second step, we classify the forwarded image by convolutional neural network. We reject the input image if the confidence of the second classification is also low. We experimentally evaluated the classification performance of the proposed method. In this experiment, we used about 16,000 and 4,000 colorectal endocytoscopic images as training and test data, respectively. The results show that the proposed method achieves a high sensitivity of 93.4% with a small rejection rate of 9.3%, even for difficult test data.
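
    The cascade logic itself (answer when confident, forward when not, reject when still uncertain) can be sketched as below. Both stages are stand-ins on random data; the paper uses an SVM first stage and a CNN second stage, and the confidence thresholds here are our assumptions.

      # Sketch of two-stage cascade classification with confidence thresholds;
      # -1 marks images rejected for manual review.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.svm import SVC

      def cascade_predict(stage1, stage2, X, t1=0.8, t2=0.8):
          p1 = stage1.predict_proba(X)
          out = np.where(p1.max(1) >= t1, p1.argmax(1), -1)   # -1 = undecided
          fwd = out == -1
          if fwd.any():
              p2 = stage2.predict_proba(X[fwd])
              out[fwd] = np.where(p2.max(1) >= t2, p2.argmax(1), -1)  # -1 = reject
          return out

      X = np.random.rand(200, 30)
      y = np.random.randint(0, 2, 200)             # neoplastic / non-neoplastic
      s1 = SVC(probability=True).fit(X, y)         # fast first stage
      s2 = RandomForestClassifier().fit(X, y)      # stand-in for the CNN stage
      print(cascade_predict(s1, s2, np.random.rand(10, 30)))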

  1. Automated Diatom Classification (Part A): Handcrafted Feature Approaches

    Directory of Open Access Journals (Sweden)

    Gloria Bueno

    2017-07-01

    Full Text Available This paper deals with automatic taxa identification based on machine learning methods. The aim is therefore to automatically classify diatoms, in terms of pattern recognition terminology. Diatoms are a kind of algae microorganism with high biodiversity at the species level, which are useful for water quality assessment. The most relevant features for diatom description and classification have been selected using an extensive dataset of 80 taxa with a minimum of 100 samples/taxon, augmented to 300 samples/taxon. In addition to published morphological, statistical and textural descriptors, a new textural descriptor, Local Binary Patterns (LBP), to characterize the diatom's valves, and a log Gabor implementation not tested before for this purpose are introduced in this paper. Results show an overall accuracy of 98.11% using bagging decision trees and combinations of descriptors. Finally, some phycological features of diatoms that are still difficult to integrate in computer systems are discussed for future work.
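
    A uniform-LBP feature histogram of the kind the paper introduces can be sketched with scikit-image as follows; the radius, number of sampling points, and the random stand-in image are our choices, not the authors'.

      # Sketch: uniform local binary pattern histogram as a texture feature.
      import numpy as np
      from skimage.feature import local_binary_pattern

      def lbp_histogram(gray, P=8, R=1.0):
          lbp = local_binary_pattern(gray, P, R, method="uniform")
          bins = P + 2                               # uniform patterns + "other"
          hist, _ = np.histogram(lbp, bins=bins, range=(0, bins), density=True)
          return hist                                # feature vector per image

      img = (np.random.rand(128, 128) * 255).astype(np.uint8)  # stand-in valve image
      features = lbp_histogram(img)                  # feed to bagged decision trees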

  2. An Automated Defect Prediction Framework using Genetic Algorithms: A Validation of Empirical Studies

    Directory of Open Access Journals (Sweden)

    Juan Murillo-Morera

    2016-05-01

    Full Text Available Today, it is common for software projects to collect measurement data through development processes. With these data, defect prediction software can try to estimate the defect proneness of a software module, with the objective of assisting and guiding software practitioners. With timely and accurate defect predictions, practitioners can focus their limited testing resources on higher-risk areas. This paper reports the results of three empirical studies that use an automated genetic defect prediction framework. This framework generates and compares different learning schemes (preprocessing + attribute selection + learning algorithm) and selects the best one using a genetic algorithm, with the objective of estimating the defect proneness of a software module. The first empirical study is a performance comparison of our framework with the most important framework in the literature. The second empirical study is a performance and runtime comparison between our framework and an exhaustive framework. The third empirical study, a sensitivity analysis, is our main contribution in this paper. Performance of the software defect prediction models (using AUC, Area Under the Curve) was validated using the NASA-MDP and PROMISE data sets. Seventeen data sets from NASA-MDP (13) and PROMISE (4) projects were analyzed by running an NxM-fold cross-validation. A genetic algorithm was used to select the components of the learning schemes automatically, and to assess and report the results. Our results show similar performance between the frameworks, with our framework achieving better runtime than the exhaustive framework. Finally, we report the best configuration according to the sensitivity analysis.

  3. Identification and classification of spine vertebrae by automated methods

    Science.gov (United States)

    Long, L. Rodney; Thoma, George R.

    2001-07-01

    We are currently working toward developing computer-assisted methods for the indexing of a collection of 17,000 digitized x-ray images by biomedical content. These images were collected as part of a nationwide health survey and form a research resource for osteoarthritis and bone morphometry. This task requires the development of algorithms to robustly analyze the x-ray contents for key landmarks, to segment the vertebral bodies, to accurately measure geometric features of the individual vertebrae and inter-vertebral areas, and to classify the spine anatomy into normal or abnormal classes for conditions of interest, including anterior osteophytes and disc space narrowing. Subtasks of this work have been created and divided among collaborators. In this paper, we provide a technical description of the overall task, report on progress made by collaborators, and provide the most recent results of our own research into obtaining a first-order location of the spine region of interest by automated methods. We are currently concentrating on images of the cervical spine, but will expand the work to include the lumbar spine as well. Development of successful image processing techniques for computer-assisted indexing of medical image collections is expected to have a significant impact within the medical research and patient care systems.

  4. Detection and classification of defects in ultrasonic NDE signals using time-frequency representations

    Science.gov (United States)

    Qidwai, Uvais; Costa, Antonio H.; Chen, C. H.

    2000-05-01

    The ultrasonic wave, generated by a piezoelectric transducer coupled to the test specimen, propagates through the material; part of its energy is reflected when it encounters a non-homogeneity or discontinuity in its path, while the remainder is reflected by the back surface of the test specimen. Defect echo signals are masked by the characteristics of the measuring instruments and the propagation paths taken by the ultrasonic wave, and are corrupted by additive noise. This leads to difficulties in comparing and analyzing signals, particularly in automated defect identification systems employing different transducers. Further, the multi-component nature of material defects can add to the complexity of the defect identification criteria. With many one-dimensional (1-D) approaches, multi-component defects cannot be detected. Another drawback is that these techniques are not very robust for sharp ultrasonic peaks, especially in a very hazardous environment. This paper proposes a technique based on the time-frequency representations (TFRs) of real defect signals corresponding to artificially produced defects of various geometries in metals. Cohen's class (quadratic) TFRs with Gaussian kernels are used to represent the signals in the time-frequency (TF) plane. Once the TFR is obtained, various morphological image processing techniques are applied to it (e.g., region-of-interest masking, edge detection, and profile separation). Based on the results of these operations, a binary image is produced which, in turn, leads to a novel set of features. Using these new features, defects have not only been detected but also classified as flat-cut, angular-cut, and circular-drill. Moreover, with some modifications of the threshold levels of the TFR kernel design, our technique can be used in relatively hostile environments with SNRs as low as 0 dB. Another important characteristic of our approach is the detection of multiple defects.
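
    The sketch below illustrates the general TF-plane pipeline with off-the-shelf pieces: a spectrogram (the simplest quadratic TFR) stands in for the paper's Gaussian-kernel Cohen's-class TFR, and a threshold-and-label step stands in for its morphological feature extraction; all parameters are illustrative:

    ```python
    import numpy as np
    from scipy.signal import spectrogram
    from scipy.ndimage import label

    def tfr_defect_regions(ultrasonic_trace, fs, rel_threshold=0.3):
        """Crude TF-plane detector: binarize a spectrogram and label
        connected echo regions (threshold and window are illustrative)."""
        f, t, S = spectrogram(ultrasonic_trace, fs=fs, nperseg=128)
        binary = S > rel_threshold * S.max()     # region-of-interest mask
        regions, n_regions = label(binary)       # each blob ~ one echo
        return f, t, regions, n_regions

    # Synthetic A-scan with two 20 MHz defect echoes, sampled at 100 MHz.
    fs = 100e6
    t = np.arange(2048) / fs
    sig = np.zeros_like(t)
    for t0 in (5e-6, 12e-6):
        sig += np.exp(-((t - t0) ** 2) / (2 * (0.2e-6) ** 2)) \
               * np.sin(2 * np.pi * 2e7 * t)
    _, _, _, n = tfr_defect_regions(sig, fs)
    print(f"detected {n} echo region(s)")        # multiple-defect detection
    ```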

  5. Automated classification of cell morphology by coherence-controlled holographic microscopy.

    Science.gov (United States)

    Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim

    2017-08-01

    In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy, enabling quantitative phase imaging, to the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be a valuable help in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all the preconditions for accurate automated analysis of live cell behavior while enabling noninvasive, label-free imaging with sufficient contrast and high spatiotemporal phase sensitivity.

  6. Automated processing of webcam images for phenological classification.

    Science.gov (United States)

    Bothmann, Ludwig; Menzel, Annette; Menze, Bjoern H; Schunk, Christian; Kauermann, Göran

    2017-01-01

    Along with global climate change, there is increasing interest in its effect on phenological patterns such as the start and end of the growing season. Scientific digital webcams are used for this purpose, taking one or more images every day of the same natural motif, showing for example trees or grassland sites. To derive phenological patterns from the webcam images, regions of interest are manually defined on these images by an expert, and subsequently a time series of percentage greenness is derived and analyzed with respect to structural changes. While this standard approach leads to satisfying results and allows one to determine dates of phenological change points, it involves a considerable amount of manual work and is therefore constrained to a limited number of webcams. In particular, this makes it impossible to apply the phenological analysis to a large network of publicly accessible webcams in order to capture spatial phenological variation. To scale the analysis up to several hundreds or thousands of webcams, we propose and evaluate two automated alternatives for the definition of regions of interest, allowing for efficient analyses of webcam images. A semi-supervised approach selects pixels based on the correlation of the pixels' time series of percentage greenness with a few prototype pixels. An unsupervised approach clusters pixels based on scores of a singular value decomposition. We show for a scientific webcam that the resulting regions of interest are at least as informative as those chosen by an expert, with the advantage that no manual action is required. Additionally, we show that the methods can even be applied to publicly available webcams accessed via the internet, yielding interesting partitions of the analyzed images. Finally, we show that the methods are suitable for the intended big-data applications by analyzing 13,988 webcams from the AMOS database. All developed methods are implemented in the statistical software R.
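
    A compact sketch of the semi-supervised ROI definition, assuming the per-pixel percentage-greenness series is already computed; the correlation threshold is an illustrative value:

    ```python
    import numpy as np

    def select_roi_by_correlation(greenness, prototypes, threshold=0.8):
        """Semi-supervised ROI definition: keep pixels whose percentage-
        greenness time series correlates strongly with prototype pixels.

        greenness : array (T, H, W), per-pixel percentage greenness over T days
        prototypes: list of (row, col) pixels chosen inside the target vegetation
        threshold : illustrative Pearson-correlation cutoff
        """
        T, H, W = greenness.shape
        series = greenness.reshape(T, -1)                      # (T, H*W)
        proto = np.mean([greenness[:, r, c] for r, c in prototypes], axis=0)
        # Pearson correlation of every pixel series with the prototype mean
        z = (series - series.mean(0)) / (series.std(0) + 1e-12)
        zp = (proto - proto.mean()) / (proto.std() + 1e-12)
        corr = (z * zp[:, None]).mean(0)
        return (corr >= threshold).reshape(H, W)               # boolean ROI mask
    ```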

  7. Update on Automated Classification of Interplanetary Dust Particles

    Science.gov (United States)

    Maroger, I.; Lasue, J.; Zolensky, M.

    2018-01-01

    Every year, the Earth accretes about 40,000 tons of extraterrestrial material less than 1 mm in size on its surface. These dust particles originate from active comets and from impacts between asteroids, and the very smallest may also come from interstellar space. Since 1981, the NASA Johnson Space Center (JSC) has systematically collected dust from Earth's stratosphere with airborne collectors and gathered it into "Cosmic Dust Catalogs". In those catalogs, a preliminary analysis of the dust particles based on SEM images, some geological characteristics, and X-ray energy-dispersive spectrometry (EDS) composition is compiled. Based on those properties, the IDPs are classified into four main groups: C (Cosmic), TCN (Natural Terrestrial Contaminant), TCA (Artificial Terrestrial Contaminant) and AOS (Aluminium Oxide Sphere). Nevertheless, 20% of those particles remain ambiguously classified. Lasue et al. presented a methodology to help automatically classify the particles published in catalog 15 based on their EDS spectra and nonlinear multivariate projections (as shown in Fig. 1). This work made it possible to relabel 155 of the 467 particles in catalog 15 and to reclassify some contaminants as potential cosmic dust. Further analyses of three such particles indicated their probable cosmic origin. The current work aims to bring complementary information to the automatic classification of IDPs to improve identification criteria.

  8. Automated classification of four types of developmental odontogenic cysts.

    Science.gov (United States)

    Frydenlund, A; Eramian, M; Daley, T

    2014-04-01

    Odontogenic cysts originate from remnants of the tooth-forming epithelium in the jaws and gingiva. There are various kinds of such cysts with different biological behaviours that carry different patient risks and require different treatment plans. Types of odontogenic cysts can be distinguished by the properties of their epithelial layers in H&E stained samples. Herein we detail a set of image features for automatically distinguishing between four types of odontogenic cyst in digital micrographs and evaluate their effectiveness using two statistical classifiers - a support vector machine (SVM) and bagging with logistic regression as the base learner (BLR). Cyst type was correctly predicted from among the four classes of odontogenic cysts between 83.8% and 92.3% of the time with the SVM and between 90 ± 0.92% and 95.4 ± 1.94% of the time with the BLR. One particular cyst type was associated with the majority of misclassifications. Omission of this cyst type from the data set improved the classification rate for the remaining three cyst types to 96.2% for both SVM and BLR.

  9. Automated color classification of urine dipstick image in urine examination

    Science.gov (United States)

    Rahmat, R. F.; Royananda; Muchtar, M. A.; Taqiuddin, R.; Adnan, S.; Anugrahwaty, R.; Budiarto, R.

    2018-03-01

    Urine examination using urine dipsticks has long been used to determine a person's health status. The economy and convenience of urine dipsticks are among the reasons they are still used for health checks. In practice, dipsticks are generally read manually, by visually comparing them with the reference colors; this leads to perceptual differences in reading the examination results. In this research, the authors used a scanner to obtain the urine dipstick color image. A scanner can be one solution for reading urine dipstick results because the light it produces is consistent. A method is required to replace the manual matching of the colors on the urine dipstick against the test reference colors. The proposed method combines Euclidean distance and Otsu thresholding with RGB color feature extraction to match the colors on the urine dipstick against the standard reference colors of urine examination. The results show that the proposed approach was able to classify the colors on a urine dipstick with an accuracy of 95.45%. The accuracy of color classification on the urine dipstick against the standard reference colors is influenced by the scanner resolution: the higher the resolution, the higher the accuracy.
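
    The nearest-reference matching reduces to a small amount of NumPy; the RGB reference values below are invented for illustration and would come from the scanned reference chart in practice:

    ```python
    import numpy as np

    # Illustrative reference chart for one pad: (RGB, label) entries.
    GLUCOSE_REFERENCE = [
        ((112, 161, 190), "negative"),
        ((121, 158, 114), "trace"),
        ((101, 140, 70), "+1"),
        ((66, 105, 55), "+2"),
    ]

    def classify_pad(mean_rgb, reference=GLUCOSE_REFERENCE):
        """Match a scanned dipstick pad to the nearest reference color
        by Euclidean distance in RGB space."""
        rgb = np.asarray(mean_rgb, dtype=float)
        dists = [np.linalg.norm(rgb - np.asarray(ref)) for ref, _ in reference]
        return reference[int(np.argmin(dists))][1]

    print(classify_pad((104, 142, 75)))   # -> "+1"
    ```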

  10. A fuzzy automated object classification by infrared laser camera

    Science.gov (United States)

    Kanazawa, Seigo; Taniguchi, Kazuhiko; Asari, Kazunari; Kuramoto, Kei; Kobashi, Syoji; Hata, Yutaka

    2011-06-01

    Home security at night is very important, and a system that watches a person's movements is useful for security. This paper describes a system that classifies adults, children, and other objects from the distance distribution measured by an infrared laser camera. The camera radiates near-infrared waves, receives the reflected ones, and converts the time of flight into a distance distribution. Our method consists of four steps. First, we perform background subtraction and noise rejection on the distance distribution. Second, we apply fuzzy clustering to the distance distribution, forming several clusters. Third, we extract features of each cluster, such as height, thickness, aspect ratio, and area ratio. We then form fuzzy if-then rules from knowledge of adults, children, and other objects so as to assign each cluster to one of these classes, constructing a fuzzy membership function for each feature. Finally, we assign each cluster to the class with the highest fuzzy degree among adult, child, and other object. In our experiment, we set up the camera in a room and tested three cases. The method successfully classified them in real-time processing.
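
    A minimal sketch of the fuzzy if-then classification over cluster features; the membership function ranges are invented for illustration, whereas the paper calibrates them from knowledge of adults, children, and other objects:

    ```python
    def tri(x, a, b, c):
        """Triangular fuzzy membership function."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Illustrative memberships on cluster height (meters).
    def mu_adult_height(h):
        return tri(h, 1.4, 1.7, 2.1)

    def mu_child_height(h):
        return tri(h, 0.6, 1.0, 1.5)

    def classify_cluster(height, aspect_ratio):
        """Fuzzy if-then rules: min as AND, highest class degree wins."""
        mu_upright = tri(aspect_ratio, 1.5, 3.0, 6.0)   # tall-and-thin shape
        degrees = {
            "adult": min(mu_adult_height(height), mu_upright),
            "child": min(mu_child_height(height), mu_upright),
        }
        degrees["other"] = 1.0 - max(degrees.values())  # no rule fires well
        return max(degrees, key=degrees.get)

    print(classify_cluster(height=1.75, aspect_ratio=3.2))  # -> "adult"
    ```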

  11. DEEP LEARNING AND IMAGE PROCESSING FOR AUTOMATED CRACK DETECTION AND DEFECT MEASUREMENT IN UNDERGROUND STRUCTURES

    Directory of Open Access Journals (Sweden)

    F. Panella

    2018-05-01

    This work presents the combination of Deep Learning (DL) and image processing to produce an automated crack recognition and defect measurement tool for civil structures. The authors focus on tunnel structures and surveying, and have developed an end-to-end tool for asset management of underground structures. In order to maintain the serviceability of tunnels, regular inspection is needed to assess their structural status. The traditional method of carrying out the survey is visual inspection: simple, but slow and relatively expensive, and the quality of the output depends on the ability and experience of the engineer as well as on the total workload (stress and tiredness may influence the ability to observe and record information). As a result of these issues, the last decade has seen a push to automate monitoring using new inspection methods. The present paper has the goal of combining DL with traditional image processing to create a tool able to detect, locate and measure structural defects.

  12. A Fully Automated Classification for Mapping the Annual Cropland Extent

    Science.gov (United States)

    Waldner, F.; Defourny, P.

    2015-12-01

    Mapping the global cropland extent is of paramount importance for food security. Indeed, accurate and reliable information on cropland and the location of major crop types is required to make future policy, investment, and logistical decisions, as well as for production monitoring. Timely cropland information directly feeds early warning systems such as GIEWS and FEWS NET. In Africa, and particularly in the arid and semi-arid region, food security is at the center of debate (at least 10% of the population remains undernourished) and accurate cropland estimation is a challenge. Spaceborne Earth Observation provides opportunities for global cropland monitoring in a spatially explicit, economic, efficient, and objective fashion. In both agricultural monitoring and climate modelling, cropland maps serve as masks to isolate agricultural land (i) for time-series analysis for crop condition monitoring and (ii) to investigate how cropland responds to climatic evolution. A large diversity of mapping strategies, ranging from the local to the global scale and associated with various degrees of accuracy, can be found in the literature. At the global scale, despite efforts, cropland is generally one of the classes with the poorest accuracy, which makes it difficult to use for agricultural applications. This research aims at improving cropland delineation from the local to the regional and global scales, as well as allowing near-real-time updates. To that aim, five temporal features were designed to target the key characteristics of crop spectral-temporal behavior. To ensure a high degree of automation, training data are extracted from available baseline land cover maps. The method delivers cropland maps with high accuracy over contrasting agro-systems in Ukraine, Argentina, China and Belgium. The accuracies reached are comparable to those obtained with classifiers trained with in-situ data. Besides, it was found that the cropland class is associated with low uncertainty.

  13. Automated detection and classification of cryptographic algorithms in binary programs through machine learning

    OpenAIRE

    Hosfelt, Diane Duros

    2015-01-01

    Threats from the internet, particularly malicious software (i.e., malware), often use cryptographic algorithms to disguise their actions and even to take control of a victim's system (as in the case of ransomware). Malware and other threats proliferate too quickly for the time-consuming traditional methods of binary analysis to be effective. By automating the detection and classification of cryptographic algorithms, we can speed up program analysis and more efficiently combat malware. This thesis will...

  14. An Automated Cropland Classification Algorithm (ACCA) for Tajikistan by Combining Landsat, MODIS, and Secondary Data

    OpenAIRE

    Thenkabail, Prasad S.; Wu, Zhuoting

    2012-01-01

    The overarching goal of this research was to develop and demonstrate an automated Cropland Classification Algorithm (ACCA) that will rapidly, routinely, and accurately classify agricultural cropland extent, areas, and characteristics (e.g., irrigated vs. rainfed) over large areas such as a country or a region through combination of multi-sensor remote sensing and secondary data. In this research, a rule-based ACCA was conceptualized, developed, and demonstrated for the country of Tajikistan u...

  15. A graphical automated detection system to locate hardwood log surface defects using high-resolution three-dimensional laser scan data

    Science.gov (United States)

    Liya Thomas; R. Edward. Thomas

    2011-01-01

    We have developed an automated defect detection system and a state-of-the-art graphical user interface (GUI) for hardwood logs. The algorithm identifies defects at least 0.5 inch high and at least 3 inches in diameter on barked hardwood log and stem surfaces. To summarize defect features and to build a knowledge base, hundreds of defects were measured, photographed, and...

  16. Perforator chimerism for the reconstruction of complex defects: A new chimeric free flap classification system.

    Science.gov (United States)

    Kim, Jeong Tae; Kim, Youn Hwan; Ghanem, Ali M

    2015-11-01

    Complex defects present structural and functional challenges to reconstructive surgeons. Compared to multiple free flaps or staged reconstruction, the use of chimeric flaps to reconstruct such defects has many advantages, such as a reduced number of operative procedures, less donor site morbidity, and preservation of recipient vessels. With the increased popularity of perforator flaps, chimeric flap harvest and design has benefited from the 'perforator concept', enabling more versatile and better reconstruction solutions. This article discusses perforator-based chimeric flaps and presents a practice-based classification system that incorporates the perforator flap concept into "Perforator Chimerism". The authors analyzed the variety of chimeric patterns used in 31 consecutive cases to present an illustrative case series and their new classification system. Accordingly, chimeric flaps are classified into four types. Type I: Classical Chimerism, Type II: Anastomotic Chimerism, Type III: Perforator Chimerism and Type IV: Mixed Chimerism. Type I is based on specific source vessel anatomy, whilst Type II requires microvascular anastomosis to create the chimeric reconstructive solution. Type III chimeric flaps utilize the perforator concept to raise two tissue components without microvascular anastomosis between them. Type IV chimeric flaps are mixed-type flaps comprising any combination of Types I to III. Incorporation of the perforator concept in planning and designing chimeric flaps has allowed safe, effective and aesthetically superior reconstruction of complex defects. The new classification system helps reconstructive surgeons and trainees understand chimeric flap design, facilitating the effective incorporation of this important reconstructive technique into the armamentarium of the reconstruction toolbox.

  17. Prototype semantic infrastructure for automated small molecule classification and annotation in lipidomics.

    Science.gov (United States)

    Chepelev, Leonid L; Riazanov, Alexandre; Kouznetsov, Alexandre; Low, Hong Sang; Dumontier, Michel; Baker, Christopher J O

    2011-07-26

    The development of high-throughput experimentation has led to astronomical growth in the number of biologically relevant lipids and lipid derivatives identified, screened, and deposited in numerous online databases. Unfortunately, efforts to annotate, classify, and analyze these chemical entities have largely remained in the hands of human curators using manual or semi-automated protocols, leaving many novel entities unclassified. Since chemical function is often closely linked to structure, accurate structure-based classification and annotation of chemical entities is imperative to understanding their functionality. As part of an exploratory study, we have investigated the utility of semantic web technologies in automated chemical classification and annotation of lipids. Our prototype framework consists of two components: an ontology and a set of federated web services that operate upon it. The formal lipid ontology we use here extends a part of the LiPrO ontology and draws on the lipid hierarchy in the LIPID MAPS database, as well as literature-derived knowledge. The federated semantic web services that operate upon this ontology are deployed within the Semantic Annotation, Discovery, and Integration (SADI) framework. Structure-based lipid classification is enacted by two core services. Firstly, a structural annotation service detects and enumerates relevant functional groups for a specified chemical structure. A second service reasons over lipid ontology class descriptions using the attributes obtained from the annotation service and identifies the appropriate lipid classification. We extend the utility of these core services by combining them with additional SADI services that retrieve associations between lipids and proteins and identify publications related to specified lipid types. Finally, we analyze the performance of SADI-enabled eicosanoid classification relative to the LIPID MAPS classification and reflect on the contribution of our integrative methodology.

  1. Leveraging Long-term Seismic Catalogs for Automated Real-time Event Classification

    Science.gov (United States)

    Linville, L.; Draelos, T.; Pankow, K. L.; Young, C. J.; Alvarez, S.

    2017-12-01

    We investigate the use of labeled event types available through reviewed seismic catalogs to produce automated event labels on new incoming data from the crustal region spanned by the cataloged events. Using events cataloged by the University of Utah Seismograph Stations between October 2012 and June 2017, we calculate the spectrogram for a time window that spans the duration of each event as seen on individual stations, resulting in 110k event spectrograms (50% local earthquake examples, 50% quarry blast examples). Using 80% of the randomized example events (~90k), a classifier is trained to distinguish between local earthquakes and quarry blasts. We explore variations of deep learning classifiers, incorporating elements of convolutional and recurrent neural networks. Using a single-layer Long Short-Term Memory recurrent neural network, we achieve 92% accuracy on the classification task on the remaining ~20k test examples. Leveraging the decisions from a group of stations that detected the same event, by taking the median of all classifications in the group, increases the model accuracy to 96%. Additional data with equivalent processing from 500 more recently cataloged events (July 2017) achieve the same accuracy as our test data on both single-station examples and multi-station medians, suggesting that the model can maintain accurate and stable classification rates on real-time automated events local to the University of Utah Seismograph Stations, with potentially minimal levels of re-training through time.
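
    A sketch of the single-layer LSTM classifier and the median-based station aggregation, assuming TensorFlow/Keras; the spectrogram shape and layer size are illustrative:

    ```python
    import numpy as np
    import tensorflow as tf

    # Each example is one event spectrogram on one station, shaped
    # (time_frames, freq_bins); label 0 = local earthquake, 1 = quarry blast.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(64, 129)),   # 64 frames x 129 freq bins
        tf.keras.layers.LSTM(64),                 # single recurrent layer
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    # model.fit(train_spectrograms, train_labels, validation_split=0.1)

    def event_label(per_station_probs, threshold=0.5):
        """Network-level decision: median of single-station classifications,
        mirroring the multi-station aggregation described above."""
        return int(np.median(np.asarray(per_station_probs) > threshold))
    ```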

  2. Classification of Atrial Septal Defect and Ventricular Septal Defect with Documented Hemodynamic Parameters via Cardiac Catheterization by Genetic Algorithms and Multi-Layered Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Mustafa Yıldız

    2012-08-01

    Introduction: We aimed to develop a classification method to discriminate ventricular septal defect and atrial septal defect by using several hemodynamic parameters. Patients and Methods: Forty-three patients (30 atrial septal defect, 13 ventricular septal defect; 26 female, 17 male) with documented hemodynamic parameters via cardiac catheterization are included in the study. Parameters such as blood pressure values of different areas, gender, age and Qp/Qs ratios are used for classification. The parameters used in classification were determined by the divergence analysis method. Those parameters are: (i) pulmonary artery diastolic pressure, (ii) Qp/Qs ratio, (iii) right atrium pressure, (iv) age, (v) pulmonary artery systolic pressure, (vi) left ventricular systolic pressure, (vii) aorta mean pressure, (viii) left ventricular diastolic pressure, (ix) aorta diastolic pressure, (x) aorta systolic pressure. These parameters, detected from our study population, were fed to a multi-layered artificial neural network, and the network was trained by a genetic algorithm. Results: The training cluster consists of 14 cases (7 atrial septal defect and 7 ventricular septal defect). The overall success ratio is 79.2%, and with proper training of the artificial neural network this ratio increases up to 89%. Conclusion: Parameters of the artificial neural network, which would need to be determined by the investigator in classical methods, can easily be found with the help of genetic algorithms. During the training of the artificial neural network by genetic algorithms, both the topology and the factors of the network can be determined. During the test stage, elements not included in the training cluster are assigned to the test cluster, and as a result of this study we observed that a multi-layered artificial neural network can be trained properly, and that a neural network is a successful method for the aimed classification.

  3. Towards more reliable automated multi-dose dispensing: retrospective follow-up study on medication dose errors and product defects.

    Science.gov (United States)

    Palttala, Iida; Heinämäki, Jyrki; Honkanen, Outi; Suominen, Risto; Antikainen, Osmo; Hirvonen, Jouni; Yliruusi, Jouko

    2013-03-01

    To date, little is known about the applicability of different types of pharmaceutical dosage forms in an automated high-speed multi-dose dispensing process. The purpose of the present study was to identify and further investigate various process-induced and/or product-related limitations associated with the multi-dose dispensing process. The rates of product defects and dose dispensing errors in automated multi-dose dispensing were retrospectively investigated over a 6-month follow-up period. The study was based on the analysis of process data from nine automated high-speed multi-dose dispensing systems. Special attention was paid to the dependence of multi-dose dispensing errors/product defects on pharmaceutical tablet properties (such as shape, dimensions, weight, scored lines, coatings, etc.) to profile the forms of tablets most suitable for automated dose dispensing systems. The relationship between the risk of errors in dose dispensing and tablet characteristics was visualized by creating a principal component analysis (PCA) model for the outcome of dispensed tablets. The two most common process-induced failures identified in multi-dose dispensing are predisposal of tablet defects and unexpected product transitions in the medication cassette (dose dispensing errors). The tablet defects are product-dependent failures, while the tablet transitions depend on the automated multi-dose dispensing system used. The occurrence of tablet defects is approximately twice as common as tablet transitions. The optimal tablet for high-speed multi-dose dispensing would be a round, relatively small or middle-sized, film-coated tablet without any scored line. Commercial tablet products can be profiled and classified based on their suitability for a high-speed multi-dose dispensing process.

  4. AUTOMATED UNSUPERVISED CLASSIFICATION OF THE SLOAN DIGITAL SKY SURVEY STELLAR SPECTRA USING k-MEANS CLUSTERING

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Almeida, J.; Allende Prieto, C., E-mail: jos@iac.es, E-mail: callende@iac.es [Instituto de Astrofisica de Canarias, E-38205 La Laguna, Tenerife (Spain)

    2013-01-20

    Large spectroscopic surveys require automated methods of analysis. This paper explores the use of k-means clustering as a tool for automated unsupervised classification of massive stellar spectral catalogs. The classification criteria are defined by the data and the algorithm, with no prior physical framework. We work with a representative set of stellar spectra associated with the Sloan Digital Sky Survey (SDSS) SEGUE and SEGUE-2 programs, which consists of 173,390 spectra from 3800 to 9200 Å sampled on 3849 wavelengths. We classify the original spectra as well as the spectra with the continuum removed. The second set contains only spectral lines and is less dependent on uncertainties of the flux calibration. The classification of the spectra with continuum yields 16 major classes. Roughly speaking, stars are split according to their colors, with enough finesse to distinguish dwarfs from giants of the same effective temperature, but with difficulty separating stars of different metallicities. There are classes corresponding to particular MK types, intrinsically blue stars, dust-reddened stars, stellar systems, and also classes collecting faulty spectra. Overall, there is no one-to-one correspondence between the classes we derive and the MK types. The classification of spectra without continuum yields 13 classes; the color separation is not as sharp, but it distinguishes stars of the same effective temperature and different metallicities. Some classes thus obtained present a fairly small range of physical parameters (200 K in effective temperature, 0.25 dex in surface gravity, and 0.35 dex in metallicity), so the classification can be used to estimate the main physical parameters of some stars at a minimum computational cost. We also analyze the outliers of the classification. Most of them turn out to be failures of the reduction pipeline, but there are also high-redshift QSOs, multiple stellar systems, dust-reddened stars, and galaxies.
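
    The core of the approach is standard k-means over the resampled spectra; the sketch below assumes scikit-learn and uses unit-norm scaling as a generic stand-in for the paper's preprocessing:

    ```python
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import normalize

    def classify_spectra(flux, n_classes=16, seed=0):
        """k-means over a stellar spectral catalog. `flux` has shape
        (n_spectra, n_wavelengths) on a common wavelength grid; n_classes=16
        echoes the number of major classes reported for spectra with
        continuum."""
        X = normalize(flux)                     # unit-norm spectra (stand-in)
        km = KMeans(n_clusters=n_classes, n_init=10, random_state=seed).fit(X)
        # Outlier candidates (faulty spectra, QSOs, ...) lie far from all centers.
        outlier_score = km.transform(X).min(axis=1)
        return km.labels_, outlier_score
    ```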

  5. ClassyFire: automated chemical classification with a comprehensive, computable taxonomy.

    Science.gov (United States)

    Djoumbou Feunang, Yannick; Eisner, Roman; Knox, Craig; Chepelev, Leonid; Hastings, Janna; Owen, Gareth; Fahy, Eoin; Steinbeck, Christoph; Subramanian, Shankar; Bolton, Evan; Greiner, Russell; Wishart, David S

    2016-01-01

    Scientists have long been driven by the desire to describe, organize, classify, and compare objects using taxonomies and/or ontologies. In contrast to biology, geology, and many other scientific disciplines, the world of chemistry still lacks a standardized chemical ontology or taxonomy. Several attempts at chemical classification have been made, but they have mostly been limited to either manual or semi-automated proof-of-principle applications. This is regrettable, as comprehensive chemical classification and description tools could not only improve our understanding of chemistry but also improve the linkage between chemistry and many other fields. For instance, the chemical classification of a compound could help predict its metabolic fate in humans, its druggability, or the potential hazards associated with it, among others. However, the sheer number (tens of millions of compounds) and complexity of chemical structures is such that any manual classification effort would prove to be near impossible. We have developed a comprehensive, flexible, and computable, purely structure-based chemical taxonomy (ChemOnt), along with a computer program (ClassyFire) that uses only chemical structures and structural features to automatically assign all known chemical compounds to a taxonomy consisting of >4800 different categories. This new chemical taxonomy consists of up to 11 different levels (Kingdom, SuperClass, Class, SubClass, etc.) with each of the categories defined by unambiguous, computable structural rules. Furthermore, each category is named using a consensus-based nomenclature and described (in English) based on the characteristic common structural properties of the compounds it contains. The ClassyFire webserver is freely accessible at http://classyfire.wishartlab.com/. Moreover, a Ruby API version is available at https://bitbucket.org/wishartlab/classyfire_api, which provides programmatic access to the ClassyFire server and database.

  6. Robust automated classification of first-motion polarities for focal mechanism determination with machine learning

    Science.gov (United States)

    Ross, Z. E.; Meier, M. A.; Hauksson, E.

    2017-12-01

    Accurate first-motion polarities are essential for determining earthquake focal mechanisms, but they are difficult to measure automatically because of picking errors and signal-to-noise issues. Here we develop an algorithm for reliable automated classification of first-motion polarities using machine learning. A classifier is designed to identify whether the first-motion polarity is up, down, or undefined by examining the waveform data directly. We first improve the accuracy of automatic P-wave onset picks by maximizing a weighted signal/noise ratio over a suite of candidate picks around the automatic pick. We then use the waveform amplitudes before and after the optimized pick as features for the classification. We demonstrate the method's potential by training and testing the classifier on tens of thousands of hand-made first-motion picks by the Southern California Seismic Network. The classifier assigned the same polarity as chosen by an analyst in more than 94% of the records. We show that the method generalizes to a variety of learning algorithms, including neural networks and random forest classifiers. The method is suitable for automated processing of large seismic waveform datasets and can potentially be used in real-time applications, e.g., for improving the source characterizations of earthquake early warning algorithms.
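
    A simplified sketch of the two ingredients described above: re-optimizing the onset pick by a signal/noise criterion, then extracting amplitudes around the refined pick as classifier features; window lengths (in samples) and the exact SNR weighting are illustrative:

    ```python
    import numpy as np

    def refine_pick(trace, coarse_pick, search=20, noise_win=100, sig_win=30):
        """Re-optimize a P onset by maximizing a simple SNR over candidate
        picks around the automatic pick (assumes coarse_pick leaves room
        for the noise window at the start of the trace)."""
        best, best_snr = coarse_pick, -np.inf
        for p in range(coarse_pick - search, coarse_pick + search + 1):
            noise = trace[p - noise_win:p]
            signal = trace[p:p + sig_win]
            snr = np.std(signal) / (np.std(noise) + 1e-12)
            if snr > best_snr:
                best, best_snr = p, snr
        return best

    def polarity_features(trace, pick, before=10, after=10):
        """Waveform amplitudes around the optimized pick: the feature vector
        fed to the polarity classifier (up / down / undefined)."""
        return trace[pick - before:pick + after]
    ```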

  7. A Framework to Support Automated Classification and Labeling of Brain Electromagnetic Patterns

    Directory of Open Access Journals (Sweden)

    Gwen A. Frishkoff

    2007-01-01

    This paper describes a framework for automated classification and labeling of patterns in electroencephalographic (EEG) and magnetoencephalographic (MEG) data. We describe recent progress on four goals: (1) specification of rules and concepts that capture expert knowledge of event-related potential (ERP) patterns in visual word recognition; (2) implementation of rules in an automated data processing and labeling stream; (3) data mining techniques that lead to refinement of rules; and (4) iterative steps towards system evaluation and optimization. This process combines top-down, or knowledge-driven, methods with bottom-up, or data-driven, methods. As illustrated here, these methods are complementary and can lead to the development of tools for pattern classification and labeling that are robust and conceptually transparent to researchers. The present application focuses on patterns in averaged EEG (ERP) data. We also describe efforts to extend our methods to represent patterns in MEG data, as well as EM patterns in source (anatomical) space. The broader aim of this work is to design an ontology-based system to support cross-laboratory, cross-paradigm, and cross-modal integration of brain functional data. Tools developed for this project are implemented in MATLAB and are freely available on request.

  8. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach.

    Science.gov (United States)

    Murat, Miraemiliana; Chang, Siow-Wee; Abu, Arpah; Yap, Hwa Jen; Yong, Kien-Thai

    2017-01-01

    Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, identifying plant species is a difficult task because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable, since it can help specialists as well as the public identify plant species easily. Shape descriptors were applied to the myDAUN dataset, which contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on our literature review, this is the first study on the development of a tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approaches. Four types of shape descriptors were used in this study, namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptors as well as combinations of hybrid descriptors were tested and compared. The tropical shrub species are classified using six different classifiers: artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three feature selection methods were tested on the myDAUN dataset: Relief, Correlation-based feature selection (CFS) and Pearson's coefficient correlation (PCC). The well-known Flavia and Swedish Leaf datasets were used as validation datasets for the proposed methods. The results showed that the hybrid of all descriptors with an ANN outperformed the other classifiers, with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia dataset and over 99% for the Swedish Leaf dataset.
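
    As an example of one of the shape descriptors listed above, the snippet below computes Hu invariant moments from a segmented leaf mask with OpenCV; the log-scaling step is a common practice for compressing their dynamic range, not a detail taken from the paper:

    ```python
    import cv2
    import numpy as np

    def hu_shape_descriptor(binary_leaf_mask):
        """Seven Hu invariant moments of a segmented leaf mask, log-scaled
        so the components share a comparable magnitude."""
        m = cv2.moments(binary_leaf_mask.astype(np.uint8))
        hu = cv2.HuMoments(m).flatten()
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
    ```

    Descriptors like this one would then be concatenated with MSD, HOG and Zernike features and passed to the classifiers compared in the study.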

  9. A simple and robust method for automated photometric classification of supernovae using neural networks

    Science.gov (United States)

    Karpenka, N. V.; Feroz, F.; Hobson, M. P.

    2013-02-01

    A method is presented for automated photometric classification of supernovae (SNe) as Type Ia or non-Ia. A two-step approach is adopted in which (i) the SN light-curve flux measurements in each observing filter are fitted separately to an analytical parametrized function that is sufficiently flexible to accommodate virtually all types of SNe and (ii) the fitted function parameters and their associated uncertainties, along with the number of flux measurements, the maximum-likelihood value of the fit and the Bayesian evidence for the model, are used as the input feature vector to a classification neural network that outputs the probability that the SN under consideration is of Type Ia. The method is trained and tested using data released following the Supernova Photometric Classification Challenge (SNPCC), consisting of light curves for 20,895 SNe in total. We consider several random divisions of the data into training and testing sets: for instance, for our sample D_1 (D_4), a total of 10 (40) per cent of the data are involved in training the algorithm and the remainder used for blind testing of the resulting classifier; we make no selection cuts. Assigning a canonical threshold probability of pth = 0.5 on the network output to classify an SN as Type Ia, for the sample D_1 (D_4) we obtain a completeness of 0.78 (0.82), purity of 0.77 (0.82) and SNPCC figure of merit of 0.41 (0.50). Including the SN host-galaxy redshift and its uncertainty as additional inputs to the classification network results in a modest 5-10 per cent increase in these values. We find that the quality of the classification does not vary significantly with SN redshift. Moreover, our probabilistic classification method allows one to calculate the expected completeness, purity and figure of merit (or other measures of classification quality) as a function of the threshold probability pth, without knowing the true classes of the SNe in the testing sample, as is the case in the classification of real SNe.
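
    Step (i) amounts to a per-filter nonlinear least-squares fit; the sketch below uses scipy.optimize.curve_fit with a rise-times-decay light-curve shape similar in spirit to, but not necessarily identical with, the paper's parametrization:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def lc_model(t, A, t0, t_rise, t_fall, c):
        """Flexible light-curve shape: exponential decay gated by a
        sigmoidal rise (a common choice for SN fits)."""
        return A * np.exp(-(t - t0) / t_fall) / (1.0 + np.exp(-(t - t0) / t_rise)) + c

    def fit_filter(times, fluxes, flux_errs):
        """Fit one observing filter; the fitted parameters and their
        uncertainties become part of the classifier's input features."""
        p0 = [fluxes.max(), times[np.argmax(fluxes)], 5.0, 20.0, 0.0]
        popt, pcov = curve_fit(lc_model, times, fluxes, p0=p0,
                               sigma=flux_errs, absolute_sigma=True,
                               maxfev=10000)
        return np.concatenate([popt, np.sqrt(np.diag(pcov))])
    ```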

  10. Automated artery-venous classification of retinal blood vessels based on structural mapping method

    Science.gov (United States)

    Joshi, Vinayak S.; Garvin, Mona K.; Reinhardt, Joseph M.; Abramoff, Michael D.

    2012-03-01

    Retinal blood vessels show morphologic modifications in response to various retinopathies. However, the specific responses exhibited by arteries and veins may provide more precise diagnostic information; e.g., diabetic retinopathy may be detected more accurately from venous dilatation than from average vessel dilatation. In order to analyze vessel-type-specific morphologic modifications, the classification of a vessel network into arteries and veins is required. We previously described a method for identification and separation of retinal vessel trees, i.e., structural mapping. We therefore propose an artery-venous classification based on structural mapping and on identification of the color properties prominent in each vessel type. The mean and standard deviation of both green channel intensity and hue channel intensity are analyzed in a region of interest around each centerline pixel of a vessel. Using the vector of color properties extracted from each centerline pixel, the pixel is assigned to one of two clusters (artery and vein) obtained by fuzzy C-means clustering. According to the proportion of clustered centerline pixels in a particular vessel, and utilizing the artery-venous crossing property of retinal vessels, each vessel is labeled as an artery or a vein. The classification results are compared with the manually annotated ground truth (gold standard). We applied the proposed method to a dataset of 15 retinal color fundus images, resulting in an accuracy of 88.28% correctly classified vessel pixels. The automated classification results match well with the gold standard, suggesting its potential for artery-venous classification and the respective morphology analysis.

  11. Automated classification and quantitative analysis of arterial and venous vessels in fundus images

    Science.gov (United States)

    Alam, Minhaj; Son, Taeyoon; Toslak, Devrim; Lim, Jennifer I.; Yao, Xincheng

    2018-02-01

    It is known that retinopathies may affect arteries and veins differently. Therefore, reliable differentiation of arteries and veins is essential for computer-aided analysis of fundus images. The purpose of this study is to validate an automated method for robust classification of arteries and veins (A-V) in digital fundus images. We combine optical density ratio (ODR) analysis and a blood vessel tracking algorithm to classify arteries and veins. A matched filtering method is used to enhance retinal blood vessels. Bottom-hat filtering and global thresholding are used to segment the vessels and skeletonize individual blood vessels. The vessel tracking algorithm is used to locate the optic disk and to identify the source nodes of blood vessels in the optic disk area. Each node can be identified as vein or artery using ODR information. Using the source nodes as starting points, the whole vessel trace is then tracked and classified as vein or artery using vessel curvature and angle information. 50 color fundus images from diabetic retinopathy patients were used to test the algorithm. Sensitivity, specificity, and accuracy metrics were measured to assess the validity of the proposed classification method compared to ground truths created by two independent observers. The algorithm demonstrated 97.52% accuracy in identifying blood vessels as vein or artery. A quantitative analysis based on the A-V classification showed that the average A-V width ratio for NPDR subjects with hypertension decreased significantly (43.13%).

  12. Automated Detection of Connective Tissue by Tissue Counter Analysis and Classification and Regression Trees

    Directory of Open Access Journals (Sweden)

    Josef Smolle

    2001-01-01

    Objective: To evaluate the feasibility of the CART (Classification and Regression Tree) procedure for the recognition of microscopic structures in tissue counter analysis. Methods: Digital microscopic images of H&E stained slides of normal human skin and of primary malignant melanoma were overlaid with regularly distributed square measuring masks (elements), and grey value, texture and colour features within each mask were recorded. In the learning set, elements were interactively labeled as representing either connective tissue of the reticular dermis, other tissue components, or background. Subsequently, CART models were based on these data sets. Results: Implementation of the CART classification rules into the image analysis program showed that in an independent test set 94.1% of elements classified as connective tissue of the reticular dermis were correctly labeled. Automated measurements of the total amount of tissue and of the amount of connective tissue within a slide showed high reproducibility (r=0.97 and r=0.94, respectively; p < 0.001). Conclusions: The CART procedure in tissue counter analysis yields simple and reproducible classification rules for tissue elements.
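
    In scikit-learn terms, the element-wise CART step and the derived connective-tissue measurement look roughly like this (tree settings illustrative; feature extraction per measuring mask is assumed done elsewhere):

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def train_tissue_cart(element_features, element_labels):
        """CART model over square measuring masks ('elements'); features are
        per-element grey value, texture and colour statistics."""
        cart = DecisionTreeClassifier(criterion="gini", min_samples_leaf=20)
        return cart.fit(element_features, element_labels)

    def connective_tissue_amount(cart, slide_elements):
        """Automated measurement: fraction of a slide's elements classified
        as reticular-dermis connective tissue."""
        pred = cart.predict(slide_elements)
        return np.mean(pred == "connective")
    ```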

  13. Detection of delamination defects in plate type fuel elements applying an automated C-Scan ultrasonic system

    International Nuclear Information System (INIS)

    Katchadjian, P.; Desimone, C.; Ziobrowski, C.; Garcia, A.

    2002-01-01

    For the inspection of plate-type fuel elements to be used in research nuclear reactors, an immersion pulse-echo ultrasonic technique was applied. An automated movement system along the X, Y and Z axes was implemented, which automates the test and displays the results in C-scan format, facilitating the immediate identification of possible defects and making the inspection repeatable. In this work, problems found during the laboratory tests and factors that hinder the inspection are discussed. The results of C-scans over UMo fuel elements with pattern defects are also shown. Finally, the main characteristics of the transducer that gave the best results are detailed. (author)

  14. Effective automated feature construction and selection for classification of biological sequences.

    Directory of Open Access Journals (Sweden)

    Uday Kamath

    Many open problems in bioinformatics involve elucidating underlying functional signals in biological sequences. DNA sequences, in particular, are characterized by rich architectures in which functional signals are increasingly found to combine local and distal interactions at the nucleotide level. Problems of interest include detection of regulatory regions, splice sites, exons, hypersensitive sites, and more. These problems naturally lend themselves to formulation as classification problems in machine learning. When classification is based on features extracted from the sequences under investigation, success is critically dependent on the chosen set of features. We present an algorithmic framework (EFFECT) for automated detection of functional signals in biological sequences. We focus here on classification problems involving DNA sequences, which state-of-the-art work in machine learning shows to be challenging and which involve complex combinations of local and distal features. EFFECT uses a two-stage process to first construct a set of candidate sequence-based features and then select the most effective subset for the classification task at hand. Both stages make heavy use of evolutionary algorithms to efficiently guide the search towards informative features capable of discriminating between sequences that contain a particular functional signal and those that do not. To demonstrate its generality, EFFECT is applied to three separate problems of importance in DNA research: the recognition of hypersensitive sites, splice sites, and ALU sites. Comparisons with state-of-the-art algorithms show that the framework is both general and powerful. In addition, a detailed analysis of the constructed features shows that they contain valuable biological information about DNA architecture, allowing biologists and other researchers to directly inspect the features and potentially use the insights obtained to assist wet-laboratory studies.

  15. Automated classification of bone marrow cells in microscopic images for diagnosis of leukemia: a comparison of two classification schemes with respect to the segmentation quality

    Science.gov (United States)

    Krappe, Sebastian; Benz, Michaela; Wittenberg, Thomas; Haferlach, Torsten; Münzenmayer, Christian

    2015-03-01

    The morphological analysis of bone marrow smears is fundamental for the diagnosis of leukemia. Currently, the counting and classification of the different types of bone marrow cells is done manually with a bright-field microscope. This is a time-consuming, partly subjective and tedious process. Furthermore, repeated examinations of a slide yield intra- and inter-observer variances. For this reason, automation of morphological bone marrow analysis is being pursued. This analysis comprises several steps: image acquisition and smear detection, cell localization and segmentation, feature extraction and cell classification. The automated classification of bone marrow cells depends on the automated cell segmentation and the choice of adequate features extracted from different parts of the cell. In this work, we focus on the evaluation of support vector machines (SVMs) and random forests (RFs) for the differentiation of bone marrow cells into 16 different classes, including immature and abnormal cell classes. Data sets of different segmentation quality are used to test the two approaches. Automated solutions for the morphological analysis of bone marrow smears could use such a classifier to pre-classify bone marrow cells and thereby shorten the examination duration.

  16. Automated method for identification and artery-venous classification of vessel trees in retinal vessel networks.

    Science.gov (United States)

    Joshi, Vinayak S; Reinhardt, Joseph M; Garvin, Mona K; Abramoff, Michael D

    2014-01-01

    The separation of the retinal vessel network into distinct arterial and venous vessel trees is of high interest. We propose an automated method for identification and separation of retinal vessel trees in a retinal color image by converting a vessel segmentation image into a vessel segment map and identifying the individual vessel trees by graph search. Orientation, width, and intensity of each vessel segment are utilized to find the optimal graph of vessel segments. The separated vessel trees are labeled as primary vessel or branches. We utilize the separated vessel trees for arterial-venous (AV) classification, based on the color properties of the vessels in each tree graph. We applied our approach to a dataset of 50 fundus images from 50 subjects. The proposed method correctly classified 91.44% of vessel pixels as either artery or vein, and 96.42% of major vessel segments.

  17. Developing and Integrating Advanced Movement Features Improves Automated Classification of Ciliate Species.

    Science.gov (United States)

    Soleymani, Ali; Pennekamp, Frank; Petchey, Owen L; Weibel, Robert

    2015-01-01

    Recent advances in tracking technologies such as GPS or video tracking systems describe the movement paths of individuals in unprecedented detail and are increasingly used in different fields, including ecology. However, extracting information from raw movement data requires advanced analysis techniques, for instance to infer behaviors expressed during a certain period of the recorded trajectory, or gender or species identity when data are obtained from remote tracking. In this paper, we address how different movement features affect the ability to automatically classify species identity, using a dataset of unicellular microbes (i.e., ciliates). Previously, morphological attributes and simple movement metrics, such as speed, were used for classifying ciliate species. Here, we demonstrate that adding advanced movement features, in particular those based on the discrete wavelet transform, to morphological features can improve classification. These results may have practical applications in automated monitoring of wastewater facilities as well as environmental monitoring of aquatic systems.
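
    A minimal sketch of wavelet-based movement features of the sort described here, assuming the PyWavelets package and a 1-D speed time series; the choice of mother wavelet, level, and summary statistics is illustrative only:

        # Illustrative wavelet features for a 1-D speed series (assumes PyWavelets)
        import numpy as np
        import pywt

        def wavelet_movement_features(speed, wavelet="db4", level=4):
            """Energy and entropy of the DWT coefficients at each decomposition level."""
            coeffs = pywt.wavedec(speed, wavelet, level=level)
            feats = []
            for c in coeffs:
                energy = np.sum(c ** 2)
                p = c ** 2 / (energy + 1e-12)                 # normalized coefficient energies
                feats += [energy, -np.sum(p * np.log2(p + 1e-12))]
            return np.array(feats)                            # append to morphological features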

  18. A FULLY AUTOMATED PIPELINE FOR CLASSIFICATION TASKS WITH AN APPLICATION TO REMOTE SENSING

    Directory of Open Access Journals (Sweden)

    K. Suzuki

    2016-06-01

    Full Text Available Nowadays deep learning is intensively in the spotlight owing to its victories at major competitions, which has undeservedly pushed 'shallow' machine learning methods, the relatively simple and handy algorithms commonly used by industrial engineers, into the background despite advantages such as the small amount of time and data they require for training. Taking a practical point of view, we utilized shallow learning algorithms to construct a learning pipeline such that operators can use machine learning without any special knowledge, expensive computation environment, or large amount of labelled data. The proposed pipeline automates the whole classification process, namely feature selection, feature weighting, and selection of the most suitable classifier with optimized hyperparameters. The configuration employs particle swarm optimization, a well-known metaheuristic algorithm that is generally fast and fine-grained, which enables us not only to optimize (hyper)parameters but also to determine appropriate features and classifier for the problem, a choice that has conventionally been made a priori from domain knowledge or handled with naive algorithms such as grid search. Through experiments with the MNIST and CIFAR-10 datasets, common computer-vision datasets for character recognition and object recognition respectively, our automated learning approach provides high performance considering its simple, non-specialized setting, small amount of training data, and practical learning time. Moreover, compared to deep learning, the performance stays robust almost without modification even on a remote sensing object recognition problem, which in turn indicates that our approach is likely to contribute to general classification problems.
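
    For readers unfamiliar with the approach, a compact particle swarm optimization loop for tuning classifier hyperparameters might look as follows: a self-contained Python sketch tuning an SVM's C and gamma by cross-validated accuracy, where the swarm constants and search ranges are assumptions rather than the paper's configuration:

        # Self-contained toy PSO tuning an SVM's (C, gamma) by cross-validated accuracy
        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        def pso_svc(X, y, n_particles=10, iters=20, rng=np.random.default_rng(0)):
            lo, hi = np.array([-2.0, -4.0]), np.array([3.0, 1.0])   # log10 search box
            pos = rng.uniform(lo, hi, size=(n_particles, 2))
            vel = np.zeros_like(pos)
            def score(p):
                return cross_val_score(SVC(C=10 ** p[0], gamma=10 ** p[1]), X, y, cv=3).mean()
            pbest, pbest_val = pos.copy(), np.array([score(p) for p in pos])
            gbest = pbest[pbest_val.argmax()].copy()
            for _ in range(iters):
                r1, r2 = rng.random((2, n_particles, 1))
                vel = 0.7 * vel + 1.4 * r1 * (pbest - pos) + 1.4 * r2 * (gbest - pos)
                pos = np.clip(pos + vel, lo, hi)
                vals = np.array([score(p) for p in pos])
                better = vals > pbest_val
                pbest[better], pbest_val[better] = pos[better], vals[better]
                gbest = pbest[pbest_val.argmax()].copy()
            return 10 ** gbest                                       # best (C, gamma)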

  19. Automated Analysis and Classification of Histological Tissue Features by Multi-Dimensional Microscopic Molecular Profiling.

    Directory of Open Access Journals (Sweden)

    Daniel P Riordan

    Full Text Available Characterization of the molecular attributes and spatial arrangements of cells and features within complex human tissues provides a critical basis for understanding processes involved in development and disease. Moreover, the ability to automate steps in the analysis and interpretation of histological images that currently require manual inspection by pathologists could revolutionize medical diagnostics. Toward this end, we developed a new imaging approach called multidimensional microscopic molecular profiling (MMMP) that can measure several independent molecular properties in situ at subcellular resolution for the same tissue specimen. MMMP involves repeated cycles of antibody or histochemical staining, imaging, and signal removal, which ultimately can generate information analogous to a multidimensional flow cytometry analysis on intact tissue sections. We performed a MMMP analysis on a tissue microarray containing a diverse set of 102 human tissues using a panel of 15 informative antibody and 5 histochemical stains plus DAPI. Large-scale unsupervised analysis of MMMP data, and visualization of the resulting classifications, identified molecular profiles that were associated with functional tissue features. We then directly annotated H&E images from this MMMP series such that canonical histological features of interest (e.g. blood vessels, epithelium, red blood cells) were individually labeled. By integrating image annotation data, we identified molecular signatures that were associated with specific histological annotations and we developed statistical models for automatically classifying these features. The classification accuracy for automated histology labeling was objectively evaluated using a cross-validation strategy, and significant accuracy (with a median per-pixel rate of 77% per feature from 15 annotated samples) for de novo feature prediction was obtained. These results suggest that high-dimensional profiling may advance the

  20. Automated classification of self-grooming in mice using open-source software.

    Science.gov (United States)

    van den Boom, Bastijn J G; Pavlidi, Pavlina; Wolf, Casper J H; Mooij, Adriana H; Willuhn, Ingo

    2017-09-01

    Manual analysis of behavior is labor intensive and subject to inter-rater variability. Although considerable progress in automation of analysis has been made, complex behavior such as grooming still lacks satisfactory automated quantification. We trained a freely available, automated classifier, Janelia Automatic Animal Behavior Annotator (JAABA), to quantify self-grooming duration and number of bouts based on video recordings of SAPAP3 knockout mice (a mouse line that self-grooms excessively) and wild-type animals. We compared the JAABA classifier with human expert observers to test its ability to measure self-grooming in three scenarios: mice in an open field, mice on an elevated plus-maze, and tethered mice in an open field. In each scenario, the classifier identified both grooming and non-grooming with great accuracy and correlated highly with results obtained by human observers. Consistently, the JAABA classifier confirmed previous reports of excessive grooming in SAPAP3 knockout mice. Thus far, manual analysis was regarded as the only valid quantification method for self-grooming. We demonstrate that the JAABA classifier is a valid and reliable scoring tool, more cost-efficient than manual scoring, easy to use, requires minimal effort, provides high throughput, and prevents inter-rater variability. We introduce the JAABA classifier as an efficient analysis tool for the assessment of rodent self-grooming with expert quality. In our "how-to" instructions, we provide all information necessary to implement behavioral classification with JAABA. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Automated image processing method for the diagnosis and classification of malaria on thin blood smears.

    Science.gov (United States)

    Ross, Nicholas E; Pritchard, Charles J; Rubin, David M; Dusé, Adriano G

    2006-05-01

    Malaria is a serious global health problem, and rapid, accurate diagnosis is required to control the disease. An image processing algorithm to automate the diagnosis of malaria on thin blood smears is developed. The image classification system is designed to positively identify malaria parasites present in thin blood smears, and differentiate the species of malaria. Images are acquired using a charge-coupled device camera connected to a light microscope. Morphological and novel threshold selection techniques are used to identify erythrocytes (red blood cells) and possible parasites present on microscopic slides. Image features based on colour, texture and the geometry of the cells and parasites are generated, as well as features that make use of a priori knowledge of the classification problem and mimic features used by human technicians. A two-stage tree classifier using backpropagation feedforward neural networks distinguishes between true and false positives, and then diagnoses the species (Plasmodium falciparum, P. vivax, P. ovale or P. malariae) of the infection. Malaria samples obtained from the Department of Clinical Microbiology and Infectious Diseases at the University of the Witwatersrand Medical School are used for training and testing of the system. Infected erythrocytes are positively identified with a sensitivity of 85% and a positive predictive value (PPV) of 81%, which makes the method highly sensitive at diagnosing a complete sample provided many views are analysed. Species were correctly determined for 11 out of 15 samples.
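
    The two-stage tree structure described above can be sketched in a few lines of Python; scikit-learn multilayer perceptrons stand in for the paper's backpropagation feedforward networks, and all names and layer sizes are hypothetical:

        # Illustrative two-stage scheme: parasite vs. artefact, then species
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        stage1 = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
        stage2 = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)

        def fit_two_stage(X, is_parasite, species):
            stage1.fit(X, is_parasite)                         # true vs. false positives
            stage2.fit(X[is_parasite], species[is_parasite])   # species, true positives only

        def predict_two_stage(X):
            out = np.full(len(X), "none", dtype=object)
            true_pos = stage1.predict(X).astype(bool)
            if true_pos.any():
                out[true_pos] = stage2.predict(X[true_pos])    # e.g. "P. falciparum"
            return out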

  2. Automated Classification of Heritage Buildings for As-Built Bim Using Machine Learning Techniques

    Science.gov (United States)

    Bassier, M.; Vergauwen, M.; Van Genechten, B.

    2017-08-01

    Semantically rich three dimensional models such as Building Information Models (BIMs) are increasingly used in digital heritage. They provide the required information to varying stakeholders during the different stages of the historic building's life cycle, which is crucial in the conservation process. The creation of as-built BIM models is based on point cloud data. However, manually interpreting this data is labour intensive and often leads to misinterpretations. By automatically classifying the point cloud, the information can be processed more efficiently. A key aspect in this automated scan-to-BIM process is the classification of building objects. In this research we look to automatically recognise elements in existing buildings to create compact semantic information models. Our algorithm efficiently extracts the main structural components such as floors, ceilings, roofs, walls and beams despite the presence of significant clutter and occlusions. More specifically, Support Vector Machines (SVM) are proposed for the classification. The algorithm is evaluated using real data of a variety of existing buildings. The results show that the classifier recognizes the objects with both high precision and recall. As a result, entire data sets are reliably labelled at once. The approach enables experts to better document and process heritage assets.

  3. Automated Segmentation and Classification of Coral using Fluid Lensing from Unmanned Airborne Platforms

    Science.gov (United States)

    Instrella, Ron; Chirayath, Ved

    2016-01-01

    In recent years, there has been a growing interest among biologists in monitoring the short and long term health of the world's coral reefs. The environmental impact of climate change poses a growing threat to these biologically diverse and fragile ecosystems, prompting scientists to use remote sensing platforms and computer vision algorithms to analyze shallow marine systems. In this study, we present a novel method for performing coral segmentation and classification from aerial data collected from small unmanned aerial vehicles (sUAV). Our method uses Fluid Lensing algorithms to remove and exploit strong optical distortions created along the air-fluid boundary to produce cm-scale resolution imagery of the ocean floor at depths up to 5 meters. A 3D model of the reef is reconstructed using structure from motion (SFM) algorithms, and the associated depth information is combined with multidimensional maximum a posteriori (MAP) estimation to separate organic from inorganic material and classify coral morphologies in the Fluid-Lensed transects. In this study, MAP estimation is performed using a set of manually classified 100 x 100 pixel training images to determine the most probable coral classification within an interrogated region of interest. Aerial footage of a coral reef was captured off the coast of American Samoa and used to test our proposed method. 90 x 20 meter transects of the Samoan coastline undergo automated classification and are manually segmented by a marine biologist for comparison, leading to success rates as high as 85%. This method has broad applications for coastal remote sensing, and will provide marine biologists access to large swaths of high resolution, segmented coral imagery.

  4. Automated classification of tropical shrub species: a hybrid of leaf shape and machine learning approach

    Directory of Open Access Journals (Sweden)

    Miraemiliana Murat

    2017-09-01

    Full Text Available Plants play a crucial role in foodstuff, medicine, industry, and environmental protection. The skill of recognising plants is very important in some applications, including conservation of endangered species and rehabilitation of lands after mining activities. However, it is a difficult task to identify plant species because it requires specialized knowledge. Developing an automated classification system for plant species is necessary and valuable since it can help specialists as well as the public in identifying plant species easily. Shape descriptors were applied on the myDAUN dataset, which contains 45 tropical shrub species collected from the University of Malaya (UM), Malaysia. Based on literature review, this is the first study in the development of a tropical shrub species image dataset and classification using a hybrid of leaf shape and machine learning approach. Four types of shape descriptors were used in this study, namely morphological shape descriptors (MSD), Histogram of Oriented Gradients (HOG), Hu invariant moments (Hu) and Zernike moments (ZM). Single descriptors, as well as combinations of hybrid descriptors, were tested and compared. The tropical shrub species are classified using six different classifiers, which are artificial neural network (ANN), random forest (RF), support vector machine (SVM), k-nearest neighbour (k-NN), linear discriminant analysis (LDA) and directed acyclic graph multiclass least squares twin support vector machine (DAG MLSTSVM). In addition, three types of feature selection methods were tested on the myDAUN dataset: Relief, Correlation-based feature selection (CFS) and Pearson's coefficient correlation (PCC). The well-known Flavia dataset and Swedish Leaf dataset were used as the validation datasets for the proposed methods. The results showed that the hybrid of all descriptors with ANN outperformed the other classifiers, with an average classification accuracy of 98.23% for the myDAUN dataset, 95.25% for the Flavia

  5. Methods for sorting out the defects according to size in automated ultrasonic testing of large-diameter thin-walled tubes

    International Nuclear Information System (INIS)

    Golovkin, A.M.; Matveev, A.S.

    1977-01-01

    Two methods of sizing defects during automated ultrasonic testing are considered: by echo-signal amplitude, and by the conventional depth of a defect. The specific features of the second method are analyzed, and its equivalence to the first is demonstrated. For sorting defects by their conventional width, a technique is suggested for standardizing flaw detectors against control reflectors of two sizes.

  6. An Automated Algorithm to Screen Massive Training Samples for a Global Impervious Surface Classification

    Science.gov (United States)

    Tan, Bin; Brown de Colstoun, Eric; Wolfe, Robert E.; Tilton, James C.; Huang, Chengquan; Smith, Sarah E.

    2012-01-01

    An algorithm is developed to automatically screen outliers from massive training samples for the Global Land Survey - Imperviousness Mapping Project (GLS-IMP). GLS-IMP is to produce a global 30 m spatial resolution impervious cover data set for the years 2000 and 2010 based on the Landsat Global Land Survey (GLS) data set. This unprecedented high resolution impervious cover data set is not only significant for urbanization studies but also needed by global carbon, hydrology, and energy balance research. A supervised classification method, the regression tree, is applied in this project, and a set of accurate training samples is the key to supervised classification. Here we developed global scale training samples from fine resolution (about 1 m) satellite data (Quickbird and Worldview2) and then aggregated the fine resolution impervious cover maps to 30 m resolution. To improve the classification accuracy, the training samples should be screened before being used to train the regression tree. It is impossible to manually screen 30 m resolution training samples collected globally; in Europe alone, there are 174 training sites, ranging in size from 4.5 km by 4.5 km to 8.1 km by 3.6 km, and the number of training samples exceeds six million. Therefore, we developed this automated, statistics-based algorithm to screen the training samples at two levels: the site level and the scene level. At the site level, all training samples are divided into 10 groups according to the percentage of impervious surface within a sample pixel; the samples falling within each 10% interval form one group. For each group, both univariate and multivariate outliers are detected and removed. Then the screening process escalates to the scene level. A similar screening process, but with a looser threshold, is applied at the scene level to account for the possible variance due to site differences. We do not perform the screening process across scenes because the scenes might vary due to
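
    A minimal sketch of the site-level screening idea, assuming per-sample spectral feature vectors and a per-pixel impervious percentage; the z-score and Mahalanobis thresholds below are illustrative placeholders, not the project's actual rules:

        # Illustrative site-level screening: univariate z-scores plus Mahalanobis distance
        import numpy as np

        def screen_group(X, z_thresh=3.0, md_factor=3.0):
            """Boolean mask of samples passing both outlier checks within one group."""
            z_ok = (np.abs((X - X.mean(0)) / (X.std(0) + 1e-12)) < z_thresh).all(1)
            diff = X - X.mean(0)
            cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
            md = np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))
            return z_ok & (md < md_factor * md.mean())

        def screen_site(X, pct_impervious):
            keep = np.zeros(len(X), bool)
            for lo in range(0, 100, 10):                   # one group per 10% bin
                g = (pct_impervious >= lo) & (pct_impervious < lo + 10)
                if g.sum() > X.shape[1] + 1:
                    keep[np.where(g)[0][screen_group(X[g])]] = True
                else:
                    keep |= g                              # too few samples to screen
            return keep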

  7. Automated classification of RNA 3D motifs and the RNA 3D Motif Atlas

    Science.gov (United States)

    Petrov, Anton I.; Zirbel, Craig L.; Leontis, Neocles B.

    2013-01-01

    The analysis of atomic-resolution RNA three-dimensional (3D) structures reveals that many internal and hairpin loops are modular, recurrent, and structured by conserved non-Watson–Crick base pairs. Structurally similar loops define RNA 3D motifs that are conserved in homologous RNA molecules, but can also occur at nonhomologous sites in diverse RNAs, and which often vary in sequence. To further our understanding of RNA motif structure and sequence variability and to provide a useful resource for structure modeling and prediction, we present a new method for automated classification of internal and hairpin loop RNA 3D motifs and a new online database called the RNA 3D Motif Atlas. To classify the motif instances, a representative set of internal and hairpin loops is automatically extracted from a nonredundant list of RNA-containing PDB files. Their structures are compared geometrically, all-against-all, using the FR3D program suite. The loops are clustered into motif groups, taking into account geometric similarity and structural annotations and making allowance for a variable number of bulged bases. The automated procedure that we have implemented identifies all hairpin and internal loop motifs previously described in the literature. All motif instances and motif groups are assigned unique and stable identifiers and are made available in the RNA 3D Motif Atlas (http://rna.bgsu.edu/motifs), which is automatically updated every four weeks. The RNA 3D Motif Atlas provides an interactive user interface for exploring motif diversity and tools for programmatic data access. PMID:23970545

  8. Automated, high accuracy classification of Parkinsonian disorders: a pattern recognition approach.

    Directory of Open Access Journals (Sweden)

    Andre F Marquand

    Full Text Available Progressive supranuclear palsy (PSP), multiple system atrophy (MSA) and idiopathic Parkinson's disease (IPD) can be clinically indistinguishable, especially in the early stages, despite distinct patterns of molecular pathology. Structural neuroimaging holds promise for providing objective biomarkers for discriminating these diseases at the single subject level but all studies to date have reported incomplete separation of disease groups. In this study, we employed multi-class pattern recognition to assess the value of anatomical patterns derived from a widely available structural neuroimaging sequence for automated classification of these disorders. To achieve this, 17 patients with PSP, 14 with IPD and 19 with MSA were scanned using structural MRI along with 19 healthy controls (HCs). An advanced probabilistic pattern recognition approach was employed to evaluate the diagnostic value of several pre-defined anatomical patterns for discriminating the disorders, including: (i) a subcortical motor network; (ii) each of its component regions and (iii) the whole brain. All disease groups could be discriminated simultaneously with high accuracy using the subcortical motor network. The region providing the most accurate predictions overall was the midbrain/brainstem, which discriminated all disease groups from one another and from HCs. The subcortical network also produced more accurate predictions than the whole brain and all of its constituent regions. PSP was accurately predicted from the midbrain/brainstem, cerebellum and all basal ganglia compartments; MSA from the midbrain/brainstem and cerebellum and IPD from the midbrain/brainstem only. This study demonstrates that automated analysis of structural MRI can accurately predict diagnosis in individual patients with Parkinsonian disorders, and identifies distinct patterns of regional atrophy particularly useful for this process.

  9. Interpreting complex data by methods of recognition and classification in an automated system of aerogeophysical material processing

    Energy Technology Data Exchange (ETDEWEB)

    Koval', L.A.; Dolgov, S.V.; Liokumovich, G.B.; Ovcharenko, A.V.; Priyezzhev, I.I.

    1984-01-01

    The ASOM-AGS/YeS system for automated processing of aerogeophysical data is equipped with complex interpretation of multichannel measurements. Algorithms of factor analysis and automatic classification, together with an apparatus of a priori specified (selected) decision rules, are used. The areas of effect of these procedures can be initially limited to the specified geological information. The possibilities of the method are demonstrated by the results of automated processing of airborne gamma-spectrometric measurements in the region of a known copper-porphyry occurrence in Kazakhstan. This ore deposit was clearly delineated after processing by the principal components method as a complex aureole of independent factors: U (sharp increase), Th (noticeable increase), K (decrease).

  10. Olive oil sensory defects classification with data fusion of instrumental techniques and multivariate analysis (PLS-DA).

    Science.gov (United States)

    Borràs, Eva; Ferré, Joan; Boqué, Ricard; Mestres, Montserrat; Aceña, Laura; Calvo, Angels; Busto, Olga

    2016-07-15

    Three instrumental techniques, headspace-mass spectrometry (HS-MS), mid-infrared spectroscopy (MIR) and UV-visible spectrophotometry (UV-vis), have been combined to classify virgin olive oil samples based on the presence or absence of sensory defects. The reference sensory values were provided by an official taste panel. Different data fusion strategies were studied to improve the discrimination capability compared to using each instrumental technique individually. A general model was applied to discriminate high-quality non-defective olive oils (extra-virgin) and the lowest-quality olive oils considered non-edible (lampante). A specific identification of key off-flavours, such as musty, winey, fusty and rancid, was also studied. The data fusion of the three techniques improved the classification results in most of the cases. Low-level data fusion was the best strategy to discriminate musty, winey and fusty defects, using HS-MS, MIR and UV-vis, and the rancid defect using only HS-MS and MIR. The mid-level data fusion approach using partial least squares-discriminant analysis (PLS-DA) scores was found to be the best strategy for defective vs non-defective and edible vs non-edible oil discrimination. However, the data fusion did not sufficiently improve the results obtained by a single technique (HS-MS) to classify non-defective classes. These results indicate that instrumental data fusion can be useful for the identification of sensory defects in virgin olive oils. Copyright © 2016 Elsevier Ltd. All rights reserved.
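
    The low- and mid-level fusion strategies can be sketched with scikit-learn's PLS implementation used as a PLS-DA (regressing a 0/1 class dummy and thresholding); the block names and component counts below are assumptions for illustration, not the paper's settings:

        # Illustrative PLS-DA with low- and mid-level fusion of instrumental blocks
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.preprocessing import StandardScaler

        def plsda_fit(X, y01, n_components=5):
            """Regress a 0/1 class dummy on an autoscaled block; return scaler and model."""
            scaler = StandardScaler().fit(X)
            pls = PLSRegression(n_components=n_components).fit(scaler.transform(X), y01)
            return scaler, pls

        def plsda_predict(scaler, pls, X_new):
            return (pls.predict(scaler.transform(X_new)).ravel() > 0.5).astype(int)

        # low-level fusion: concatenate raw blocks, then one PLS-DA
        #   X_low = np.hstack([X_hsms, X_mir, X_uvvis])
        # mid-level fusion: concatenate per-block PLS scores, then a final PLS-DA
        #   X_mid = np.hstack([plsda_fit(X, y01)[1].x_scores_ for X in (X_hsms, X_mir, X_uvvis)])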

  11. Accuracy of automated classification of major depressive disorder as a function of symptom severity.

    Science.gov (United States)

    Ramasubbu, Rajamannar; Brown, Matthew R G; Cortese, Filmeno; Gaxiola, Ismael; Goodyear, Bradley; Greenshaw, Andrew J; Dursun, Serdar M; Greiner, Russell

    2016-01-01

    Growing evidence documents the potential of machine learning for developing brain-based diagnostic methods for major depressive disorder (MDD). As symptom severity may influence brain activity, we investigated whether the severity of MDD affected the accuracy of machine-learned MDD-vs-control diagnostic classifiers. Forty-five medication-free patients with DSM-IV defined MDD and 19 healthy controls participated in the study. Based on depression severity as determined by the Hamilton Rating Scale for Depression (HRSD), MDD patients were sorted into three groups: mild to moderate depression (HRSD 14-19), severe depression (HRSD 20-23), and very severe depression (HRSD ≥ 24). We collected functional magnetic resonance imaging (fMRI) data during both resting state and an emotional face-matching task. Patients in each of the three severity groups were compared against controls in separate analyses, using either the resting-state or task-based fMRI data. We used each of these six datasets to train linear support vector machine (SVM) binary classifiers for identifying individuals as patients or controls. The resting-state fMRI data showed statistically significant classification accuracy only for the very severe depression group (accuracy 66%, p = 0.012 corrected), while mild to moderate (accuracy 58%, p = 1.0 corrected) and severe depression (accuracy 52%, p = 1.0 corrected) were only at chance. With task-based fMRI data, the automated classifier performed at chance in all three severity groups. Binary linear SVM classifiers achieved significant classification of very severe depression with resting-state fMRI, but brain measurements may have limited potential for differentiating patients with less severe depression from healthy controls.

  12. Modified classification and single-stage microsurgical repair of posttraumatic infected massive bone defects in lower extremities.

    Science.gov (United States)

    Yang, Yun-fa; Xu, Zhong-he; Zhang, Guang-ming; Wang, Jian-wei; Hu, Si-wang; Hou, Zhi-qi; Xu, Da-chuan

    2013-11-01

    Posttraumatic infected massive bone defects in lower extremities are difficult to repair because they frequently exhibit massive bone and/or soft tissue defects, serious bone infection, and excessive scar proliferation. This study aimed to determine whether these defects could be classified and repaired at a single stage. A total of 51 cases of posttraumatic infected massive bone defect in lower extremity were included in this study. They were classified into four types on the basis of the conditions of the bone defects, soft tissue defects, and injured limb length, including Type A (without soft tissue defects), Type B (with soft tissue defects of 10 × 20 cm or less), Type C (with soft tissue defects of 10 × 20 cm or more), and Type D (with the limb shortening of 3 cm or more). Four types of single-stage microsurgical repair protocols were planned accordingly and implemented respectively. These protocols included the following: Protocol A, where vascularized fibular graft was implemented for Type A; Protocol B, where vascularized fibular osteoseptocutaneous graft was implemented for Type B; Protocol C, where vascularized fibular graft and anterior lateral thigh flap were used for Type C; and Protocol D, where limb lengthening and Protocols A, B, or C were used for Type D. There were 12, 33, 4, and 2 cases of Types A, B, C, and D, respectively, according to this classification. During the surgery, three cases of planned Protocol B had to be shifted into Protocol C; however, all microsurgical repairs were completed. With reference to Johner-Wruhs evaluation method, the total percentage of excellent and good results was 82.35% after 6 to 41 months of follow-up. It was concluded that posttraumatic massive bone defects could be accurately classified into four types on the basis of the conditions of bone defects, soft tissue coverage, and injured limb length, and successfully repaired with the single-stage repair protocols after thorough debridement.

  13. Seasonal cultivated and fallow cropland mapping using MODIS-based automated cropland classification algorithm

    Science.gov (United States)

    Wu, Zhuoting; Thenkabail, Prasad S.; Mueller, Rick; Zakzeski, Audra; Melton, Forrest; Johnson, Lee; Rosevelt, Carolyn; Dwyer, John; Jones, Jeanine; Verdin, James P.

    2014-01-01

    Increasing drought occurrences and growing populations demand accurate, routine, and consistent cultivated and fallow cropland products to enable water and food security analysis. The overarching goal of this research was to develop and test an automated cropland classification algorithm (ACCA) that provides accurate, consistent, and repeatable information on seasonal cultivated as well as seasonal fallow cropland extents and areas based on Moderate Resolution Imaging Spectroradiometer remote sensing data. The seasonal ACCA development process involves writing a series of iterative decision tree codes to separate cultivated and fallow croplands from noncroplands, aiming to accurately mirror reliable reference data sources. A pixel-by-pixel accuracy assessment when compared with the U.S. Department of Agriculture (USDA) cropland data showed, on average, a producer's accuracy of 93% and a user's accuracy of 85% across all months. Further, ACCA-derived cropland maps agreed well with the USDA Farm Service Agency crop acreage-reported data for both cultivated and fallow croplands, with R-square values over 0.7, and with field surveys with an accuracy of ≥95% for cultivated croplands and ≥76% for fallow croplands. Our results demonstrated the ability of ACCA to generate cropland products, such as cultivated and fallow cropland extents and areas, accurately, automatically, and repeatedly throughout the growing season.

  14. Automated segmentation of geographic atrophy in fundus autofluorescence images using supervised pixel classification.

    Science.gov (United States)

    Hu, Zhihong; Medioni, Gerard G; Hernandez, Matthias; Sadda, Srinivas R

    2015-01-01

    Geographic atrophy (GA) is a manifestation of the advanced or late stage of age-related macular degeneration (AMD). AMD is the leading cause of blindness in people over the age of 65 in the western world. The purpose of this study is to develop a fully automated supervised pixel classification approach for segmenting GA, including uni- and multifocal patches, in fundus autofluorescence (FAF) images. The image features include region-wise intensity measures, gray-level co-occurrence matrix measures, and Gaussian filter banks. A [Formula: see text]-nearest-neighbor pixel classifier is applied to obtain a GA probability map, representing the likelihood that the image pixel belongs to GA. Sixteen randomly chosen FAF images were obtained from 16 subjects with GA. The algorithm-defined GA regions are compared with manual delineation performed by a certified image reading center grader. Eight-fold cross-validation is applied to evaluate the algorithm performance. The mean overlap ratio (OR), area correlation (Pearson's [Formula: see text]), accuracy (ACC), true positive rate (TPR), specificity (SPC), positive predictive value (PPV), and false discovery rate (FDR) between the algorithm- and manually defined GA regions are [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], and [Formula: see text], respectively.
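
    A minimal sketch of the supervised pixel-classification step, assuming per-pixel feature vectors have already been computed; scikit-learn's k-NN stands in for the paper's classifier, and the neighborhood size is an arbitrary placeholder:

        # Illustrative k-NN pixel classification yielding a GA probability map
        from sklearn.neighbors import KNeighborsClassifier

        def ga_probability_map(X_train, y_train, X_pixels, image_shape, k=15):
            """X_*: per-pixel feature vectors (intensity, GLCM, filter-bank responses);
            y_train: grader labels (1 = GA). Returns probability map and binary mask."""
            knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
            prob_map = knn.predict_proba(X_pixels)[:, 1].reshape(image_shape)
            return prob_map, prob_map > 0.5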

  15. Entropy-based automated classification of independent components separated from fMCG

    International Nuclear Information System (INIS)

    Comani, S; Srinivasan, V; Alleva, G; Romani, G L

    2007-01-01

    Fetal magnetocardiography (fMCG) is a noninvasive technique suitable for the prenatal diagnosis of fetal heart function. Reliable fetal cardiac signals can be reconstructed from multi-channel fMCG recordings by means of independent component analysis (ICA). However, the identification of the separated components is usually accomplished by visual inspection. This paper discusses a novel automated system based on entropy estimators, namely approximate entropy (ApEn) and sample entropy (SampEn), for the classification of independent components (ICs). The system was validated on 40 fMCG datasets of normal fetuses with gestational age ranging from 22 to 37 weeks. Both ApEn and SampEn were able to measure the stability and predictability of the physiological signals separated with ICA, and the entropy values of the three categories were significantly different at p < 0.01. The system performance was compared with that of a method based on the analysis of the time and frequency content of the components. The outcomes of this study showed a superior performance of the entropy-based system, in particular for early gestation, with an overall IC detection rate of 98.75% and 97.92% for ApEn and SampEn respectively, as against a value of 94.50% obtained with the time-frequency-based system. (note)
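
    Sample entropy itself is easy to state in code. The following compact Python version (a common simplified formulation with self-matches excluded, whose template counts differ slightly from the strict definition) conveys what the classifier measures:

        # Compact sample entropy, SampEn(m, r), with self-matches excluded
        import numpy as np

        def sample_entropy(x, m=2, r=0.2):
            x = np.asarray(x, float)
            tol = r * x.std()                              # tolerance as fraction of SD
            def matches(mm):
                t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
                d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=-1)   # Chebyshev distance
                return (d <= tol).sum() - len(t)           # exclude self-matches
            return -np.log(matches(m + 1) / matches(m))    # assumes enough matches exist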

  16. Deep SOMs for automated feature extraction and classification from big data streaming

    Science.gov (United States)

    Sakkari, Mohamed; Ejbali, Ridha; Zaied, Mourad

    2017-03-01

    In this paper, we propose a deep self-organizing map model (Deep-SOMs) for automated feature extraction and learning from big streaming data, benefiting from the Spark framework for real-time, highly parallel stream processing. The deep SOM architecture is based on the notion of abstraction (patterns are automatically extracted from the raw data, from less to more abstract). The proposed model consists of three hidden self-organizing layers, an input layer and an output layer. Each layer is made up of a multitude of SOMs, each map focusing only on a local sub-region of the input image. Each layer then trains on the local information to generate more global information in the higher layer. The proposed Deep-SOMs model is unique in terms of its layer architecture and its SOM sampling and learning methods. During the learning stage we use a set of unsupervised SOMs for feature extraction. We validate the effectiveness of our approach on large data sets such as the Leukemia and SRBCT datasets. Comparison results show that the Deep-SOMs model performs better than many existing algorithms for image classification.

  17. Eddy Current Signature Classification of Steam Generator Tube Defects Using A Learning Vector Quantization Neural Network

    International Nuclear Information System (INIS)

    Garcia, Gabe V.

    2005-01-01

    A major cause of failure in nuclear steam generators is degradation of their tubes. Although seven primary defect categories exist, one of the principal causes of tube failure is intergranular attack/stress corrosion cracking (IGA/SCC). This type of defect usually begins on the secondary side surface of the tubes and propagates both inwards and laterally. In many cases this defect is found at or near the tube support plates

  18. An automated classification system for the differentiation of obstructive lung diseases based on the textural analysis of HRCT images

    International Nuclear Information System (INIS)

    Park, Seong Hoon; Seo, Joon Beom; Kim, Nam Kug; Lee, Young Kyung; Kim, Song Soo; Chae, Eun Jin; Lee, June Goo

    2007-01-01

    To develop an automated classification system for the differentiation of obstructive lung diseases based on the textural analysis of HRCT images, and to evaluate the accuracy and usefulness of the system. For textural analysis, histogram features, gradient features, run length encoding, and a co-occurrence matrix were employed. A Bayesian classifier was used for automated classification. The images (n = 256) were selected from the HRCT images obtained from 17 healthy subjects (n = 67), 26 patients with bronchiolitis obliterans (n = 70), 28 patients with mild centrilobular emphysema (n = 65), and 21 patients with panlobular emphysema or severe centrilobular emphysema (n = 63). A five-fold cross-validation method was used to assess the performance of the system. Class-specific sensitivities were analyzed and the overall accuracy of the system was assessed with kappa statistics. The sensitivity of the system for each class was as follows: normal lung 84.9%, bronchiolitis obliterans 83.8%, mild centrilobular emphysema 77.0%, and panlobular emphysema or severe centrilobular emphysema 95.8%. The overall performance in differentiating each disease and the normal lung was satisfactory, with a kappa value of 0.779. An automated classification system for the differentiation between obstructive lung diseases based on the textural analysis of HRCT images was developed. The proposed system discriminates well between the various obstructive lung diseases and the normal lung.

  19. Using multiclass classification to automate the identification of patient safety incident reports by type and severity.

    Science.gov (United States)

    Wang, Ying; Coiera, Enrico; Runciman, William; Magrabi, Farah

    2017-06-12

    Approximately 10% of admissions to acute-care hospitals are associated with an adverse event. Analysis of incident reports helps to understand how and why incidents occur and can inform policy and practice for safer care. Unfortunately our capacity to monitor and respond to incident reports in a timely manner is limited by the sheer volumes of data collected. In this study, we aim to evaluate the feasibility of using multiclass classification to automate the identification of patient safety incidents in hospitals. Text based classifiers were applied to identify 10 incident types and 4 severity levels. Using the one-versus-one (OvsO) and one-versus-all (OvsA) ensemble strategies, we evaluated regularized logistic regression, linear support vector machine (SVM) and SVM with a radial-basis function (RBF) kernel. Classifiers were trained and tested with "balanced" datasets (n_Type = 2860, n_SeverityLevel = 1160) from a state-wide incident reporting system. Testing was also undertaken with imbalanced "stratified" datasets (n_Type = 6000, n_SeverityLevel = 5950) from the state-wide system and an independent hospital reporting system. Classifier performance was evaluated using a confusion matrix, as well as F-score, precision and recall. The most effective combination was an OvsO ensemble of binary SVM RBF classifiers with binary count feature extraction. For incident type, classifiers performed well on balanced and stratified datasets (F-score: 78.3, 73.9%), but were worse on independent datasets (68.5%). Reports about falls, medications, pressure injury, aggression and blood products were identified with high recall and precision. "Documentation" was the hardest type to identify. For severity level, the F-score for severity assessment code (SAC) 1 (extreme risk) was 87.3% and 64% for SAC4 (low risk) on balanced data. With stratified data, high recall was achieved for SAC1 (82.8-84%) but precision was poor (6.8-11.2%). High risk incidents (SAC2) were confused
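
    The winning combination, binary token counts feeding a one-vs-one ensemble of RBF-kernel SVMs, maps naturally onto scikit-learn; the sketch below is a generic reconstruction, not the authors' code, and the variable names are hypothetical:

        # Generic reconstruction: binary counts + one-vs-one RBF-SVM ensemble
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.multiclass import OneVsOneClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        clf = make_pipeline(
            CountVectorizer(binary=True),                  # binary token counts
            OneVsOneClassifier(SVC(kernel="rbf", gamma="scale")),
        )
        # clf.fit(report_texts, incident_types)            # free-text reports, 10 types
        # predicted = clf.predict(new_reports)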

  1. Automated classification of Permanent Scatterers time-series based on statistical characterization tests

    Science.gov (United States)

    Berti, Matteo; Corsini, Alessandro; Franceschini, Silvia; Iannacone, Jean Pascal

    2013-04-01

    time series are typically affected by a significant noise-to-signal ratio. The results of the analysis show that even with such a rough-quality dataset, our automated classification procedure can greatly improve radar interpretation of mass movements. In general, uncorrelated PS (type 0) are concentrated in flat areas such as fluvial terraces and valley bottoms, and along stable watershed divides; linear PS (type 1) are mainly located on slopes (both inside and outside mapped landslides) or near the edge of scarps or steep slopes; non-linear PS (types 2 to 5) typically fall inside landslide deposits or in the surrounding areas. The spatial distribution of classified PS allows detection of deformation phenomena that are not visible from the average velocity alone, and provides important information on the temporal evolution of the phenomena, such as acceleration, deceleration, seasonal fluctuations, and abrupt or continuous changes in the displacement rate. Based on these encouraging results we integrated all the classification algorithms into a Graphical User Interface (called PSTime), which is freely available as a standalone application.

  2. Feasibility of Genetic Algorithm for Textile Defect Classification Using Neural Network

    OpenAIRE

    Habib, Md. Tarek; Faisal, Rahat Hossain; Rokonuzzaman, M.

    2012-01-01

    The global market for the textile industry is highly competitive nowadays. Quality control in the production process has been a key factor for survival in such a competitive market. Automated textile inspection systems are very useful in this respect, because manual inspection is time consuming and not accurate enough. Hence, automated textile inspection systems have been drawing plenty of attention from researchers in different countries in order to replace...

  3. CONSTRUCTION OF A CALIBRATED PROBABILISTIC CLASSIFICATION CATALOG: APPLICATION TO 50k VARIABLE SOURCES IN THE ALL-SKY AUTOMATED SURVEY

    International Nuclear Information System (INIS)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Brink, Henrik; Crellin-Quick, Arien; Butler, Nathaniel R.

    2012-01-01

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.
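
    Probability calibration of the kind advocated here is available off the shelf in scikit-learn; as an illustration only (a random forest stands in for the paper's classifier, isotonic regression is one of several calibration choices, and the variable names are hypothetical):

        # Cross-validated isotonic calibration of class probabilities
        from sklearn.calibration import CalibratedClassifierCV
        from sklearn.ensemble import RandomForestClassifier

        def calibrated_classifier(X, y):
            base = RandomForestClassifier(n_estimators=500, random_state=0)
            return CalibratedClassifierCV(base, method="isotonic", cv=5).fit(X, y)

        # posteriors = calibrated_classifier(X_features, y_classes).predict_proba(X_new)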

  4. CONSTRUCTION OF A CALIBRATED PROBABILISTIC CLASSIFICATION CATALOG: APPLICATION TO 50k VARIABLE SOURCES IN THE ALL-SKY AUTOMATED SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Richards, Joseph W.; Starr, Dan L.; Miller, Adam A.; Bloom, Joshua S.; Brink, Henrik; Crellin-Quick, Arien [Astronomy Department, University of California, Berkeley, CA 94720-3411 (United States); Butler, Nathaniel R., E-mail: jwrichar@stat.berkeley.edu [School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287 (United States)

    2012-12-15

    With growing data volumes from synoptic surveys, astronomers necessarily must become more abstracted from the discovery and introspection processes. Given the scarcity of follow-up resources, there is a particularly sharp onus on the frameworks that replace these human roles to provide accurate and well-calibrated probabilistic classification catalogs. Such catalogs inform the subsequent follow-up, allowing consumers to optimize the selection of specific sources for further study and permitting rigorous treatment of classification purities and efficiencies for population studies. Here, we describe a process to produce a probabilistic classification catalog of variability with machine learning from a multi-epoch photometric survey. In addition to producing accurate classifications, we show how to estimate calibrated class probabilities and motivate the importance of probability calibration. We also introduce a methodology for feature-based anomaly detection, which allows discovery of objects in the survey that do not fit within the predefined class taxonomy. Finally, we apply these methods to sources observed by the All-Sky Automated Survey (ASAS), and release the Machine-learned ASAS Classification Catalog (MACC), a 28 class probabilistic classification catalog of 50,124 ASAS sources in the ASAS Catalog of Variable Stars. We estimate that MACC achieves a sub-20% classification error rate and demonstrate that the class posterior probabilities are reasonably calibrated. MACC classifications compare favorably to the classifications of several previous domain-specific ASAS papers and to the ASAS Catalog of Variable Stars, which had classified only 24% of those sources into one of 12 science classes.

  5. Automated radial basis function neural network based image classification system for diabetic retinopathy detection in retinal images

    Science.gov (United States)

    Anitha, J.; Vijila, C. Kezi Selva; Hemanth, D. Jude

    2010-02-01

    Diabetic retinopathy (DR) is a chronic eye disease for which early detection is highly essential to avoid any fatal results. Image processing of retinal images emerges as a feasible tool for this early diagnosis. Digital image processing techniques involve image classification, which is a significant technique for detecting abnormality in the eye. Various automated classification systems have been developed in recent years, but most of them lack high classification accuracy. Artificial neural networks are the widely preferred artificial intelligence technique since they yield superior results in terms of classification accuracy. In this work, a radial basis function (RBF) neural network based bi-level classification system is proposed to differentiate abnormal DR images from normal retinal images. The results are analyzed in terms of classification accuracy, sensitivity and specificity. A comparative analysis is performed against a probabilistic classifier, namely the Bayesian classifier, to show the superior nature of the neural classifier. Experimental results show promising results for the neural classifier in terms of the performance measures.
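
    Scikit-learn has no ready-made RBF network, but one can be sketched as k-means centres feeding a Gaussian hidden layer and a linear readout; this toy construction only illustrates the general architecture, not the paper's network or its parameters:

        # Toy RBF network: k-means centres, Gaussian hidden layer, linear readout
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LogisticRegression

        class SimpleRBFNet:
            def __init__(self, n_centres=20, gamma=1.0):
                self.km = KMeans(n_clusters=n_centres, n_init=10, random_state=0)
                self.gamma, self.out = gamma, LogisticRegression(max_iter=1000)
            def _hidden(self, X):
                d2 = ((X[:, None, :] - self.km.cluster_centers_[None]) ** 2).sum(-1)
                return np.exp(-self.gamma * d2)            # Gaussian activations
            def fit(self, X, y):
                self.km.fit(X)
                self.out.fit(self._hidden(X), y)
                return self
            def predict(self, X):
                return self.out.predict(self._hidden(X))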

  6. Studying the potential impact of automated document classification on scheduling a systematic review update

    Science.gov (United States)

    2012-01-01

    Background Systematic Reviews (SRs) are an essential part of evidence-based medicine, providing support for clinical practice and policy on a wide range of medical topics. However, producing SRs is resource-intensive, and progress in the research they review leads to SRs becoming outdated, requiring updates. Although the question of how and when to update SRs has been studied, the best method for determining when to update is still unclear, necessitating further research. Methods In this work we study the potential impact of a machine learning-based automated system for providing alerts when new publications become available within an SR topic. Some of these new publications are especially important, as they report findings that are more likely to initiate a review update. To this end, we have designed a classification algorithm to identify articles that are likely to be included in an SR update, along with an annotation scheme designed to identify the most important publications in a topic area. Using an SR database containing over 70,000 articles, we annotated articles from 9 topics that had received an update during the study period. The algorithm was then evaluated in terms of the overall correct and incorrect alert rate for publications meeting the topic inclusion criteria, as well as in terms of its ability to identify important, update-motivating publications in a topic area. Results Our initial approach, based on our previous work in topic-specific SR publication classification, identifies over 70% of the most important new publications, while maintaining a low overall alert rate. Conclusions We performed an initial analysis of the opportunities and challenges in aiding the SR update planning process with an informatics-based machine learning approach. Alerts could be a useful tool in the planning, scheduling, and allocation of resources for SR updates, providing an improvement in timeliness and coverage for the large number of medical topics needing SRs

  7. Automated classification of immunostaining patterns in breast tissue from the human protein atlas.

    Science.gov (United States)

    Swamidoss, Issac Niwas; Kårsnäs, Andreas; Uhlmann, Virginie; Ponnusamy, Palanisamy; Kampf, Caroline; Simonsson, Martin; Wählby, Carolina; Strand, Robin

    2013-01-01

    The Human Protein Atlas (HPA) is an effort to map the location of all human proteins (http://www.proteinatlas.org/). It contains a large number of histological images of sections from human tissue. Tissue micro arrays (TMA) are imaged by a slide scanning microscope, and each image represents a thin slice of a tissue core with a dark brown antibody specific stain and a blue counter stain. When generating antibodies for protein profiling of the human proteome, an important step in the quality control is to compare staining patterns of different antibodies directed towards the same protein. This comparison is an ultimate control that the antibody recognizes the right protein. In this paper, we propose and evaluate different approaches for classifying sub-cellular antibody staining patterns in breast tissue samples. The proposed methods include the computation of various features including gray level co-occurrence matrix (GLCM) features, complex wavelet co-occurrence matrix (CWCM) features, and weighted neighbor distance using compound hierarchy of algorithms representing morphology (WND-CHARM)-inspired features. The extracted features are used into two different multivariate classifiers (support vector machine (SVM) and linear discriminant analysis (LDA) classifier). Before extracting features, we use color deconvolution to separate different tissue components, such as the brownly stained positive regions and the blue cellular regions, in the immuno-stained TMA images of breast tissue. We present classification results based on combinations of feature measurements. The proposed complex wavelet features and the WND-CHARM features have accuracy similar to that of a human expert. Both human experts and the proposed automated methods have difficulties discriminating between nuclear and cytoplasmic staining patterns. This is to a large extent due to mixed staining of nucleus and cytoplasm. Methods for quantification of staining patterns in histopathology have many
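
    The preprocessing and texture-feature steps can be illustrated with scikit-image (assuming version 0.19 or later for the graycomatrix spelling); rgb2hed performs a standard haematoxylin-eosin-DAB colour deconvolution, and the distances, angles and properties below are illustrative choices, not the paper's configuration:

        # Colour deconvolution of the DAB stain followed by GLCM texture statistics
        import numpy as np
        from skimage.color import rgb2hed
        from skimage.feature import graycomatrix, graycoprops

        def dab_texture_features(rgb_tile):
            dab = rgb2hed(rgb_tile)[..., 2]                # DAB channel of the H-E-DAB space
            dab8 = np.uint8(255 * (dab - dab.min()) / (np.ptp(dab) + 1e-12))
            glcm = graycomatrix(dab8, distances=[1, 2], angles=[0, np.pi / 2],
                                levels=256, symmetric=True, normed=True)
            props = ("contrast", "homogeneity", "correlation", "energy")
            return np.concatenate([graycoprops(glcm, p).ravel() for p in props])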

  8. Automated classification of immunostaining patterns in breast tissue from the human protein Atlas

    Directory of Open Access Journals (Sweden)

    Issac Niwas Swamidoss

    2013-01-01

    Full Text Available Background: The Human Protein Atlas (HPA) is an effort to map the location of all human proteins (http://www.proteinatlas.org/). It contains a large number of histological images of sections from human tissue. Tissue micro arrays (TMA) are imaged by a slide scanning microscope, and each image represents a thin slice of a tissue core with a dark brown antibody specific stain and a blue counter stain. When generating antibodies for protein profiling of the human proteome, an important step in the quality control is to compare staining patterns of different antibodies directed towards the same protein. This comparison is an ultimate control that the antibody recognizes the right protein. In this paper, we propose and evaluate different approaches for classifying sub-cellular antibody staining patterns in breast tissue samples. Materials and Methods: The proposed methods include the computation of various features including gray level co-occurrence matrix (GLCM) features, complex wavelet co-occurrence matrix (CWCM) features, and weighted neighbor distance using compound hierarchy of algorithms representing morphology (WND-CHARM)-inspired features. The extracted features are used in two different multivariate classifiers (support vector machine (SVM) and linear discriminant analysis (LDA) classifiers). Before extracting features, we use color deconvolution to separate different tissue components, such as the brownly stained positive regions and the blue cellular regions, in the immuno-stained TMA images of breast tissue. Results: We present classification results based on combinations of feature measurements. The proposed complex wavelet features and the WND-CHARM features have accuracy similar to that of a human expert. Conclusions: Both human experts and the proposed automated methods have difficulties discriminating between nuclear and cytoplasmic staining patterns. This is to a large extent due to mixed staining of nucleus and cytoplasm. Methods for

  9. Automated classification of mouse pup isolation syllables: from cluster analysis to an Excel based ‘mouse pup syllable classification calculator’

    Directory of Open Access Journals (Sweden)

    Jasmine eGrimsley

    2013-01-01

    Full Text Available Mouse pups vocalize at high rates when they are cold or isolated from the nest. The proportions of each syllable type produced carry information about disease state and are being used as behavioral markers for the internal state of animals. Manual classifications of these vocalizations identified ten syllable types based on their spectro-temporal features. However, manual classification of mouse syllables is time consuming and vulnerable to experimenter bias. This study uses an automated cluster analysis to identify acoustically distinct syllable types produced by CBA/CaJ mouse pups, and then compares the results to prior manual classification methods. The cluster analysis identified two syllable types, based on their frequency bands, that have continuous frequency-time structure, and two syllable types featuring abrupt frequency transitions. Although cluster analysis computed fewer syllable types than manual classification, the clusters represented well the probability distributions of the acoustic features within syllables. These probability distributions indicate that some of the manually classified syllable types are not statistically distinct. The characteristics of the four classified clusters were used to generate a Microsoft Excel-based mouse syllable classifier that rapidly categorizes syllables, with over a 90% match, into the syllable types determined by cluster analysis.
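
    The record does not name the specific clustering algorithm, so the sketch below uses k-means over standardized spectro-temporal features as a stand-in; the feature set and values are invented for illustration.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Hypothetical per-syllable features: start frequency (kHz),
        # end frequency (kHz), bandwidth (kHz), duration (ms).
        rng = np.random.default_rng(1)
        features = rng.random((500, 4)) * [40.0, 40.0, 20.0, 100.0]

        # Standardize so no single feature dominates the distance metric,
        # then group syllables into the four acoustic clusters reported above.
        X = StandardScaler().fit_transform(features)
        syllable_type = KMeans(n_clusters=4, n_init=10,
                               random_state=0).fit_predict(X)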

  10. Identifying and locating surface defects in wood: Part of an automated lumber processing system

    Science.gov (United States)

    Richard W. Conners; Charles W. McMillin; Kingyao Lin; Ramon E. Vasquez-Espinosa

    1983-01-01

    Continued increases in the cost of materials and labor make it imperative for furniture manufacturers to control costs by improved yield and increased productivity. This paper describes an Automated Lumber Processing System (ALPS) that employs computer tomography, optical scanning technology, the calculation of an optimum cutting strategy, and a computer-driven laser...

  11. Automated classification of self-grooming in mice using open-source software

    NARCIS (Netherlands)

    Van den Boom, B.; Pavlidi, Pavlina; Wolf, Casper M H; Mooij, Hanne A H; Willuhn, Ingo

    BACKGROUND: Manual analysis of behavior is labor intensive and subject to inter-rater variability. Although considerable progress in automation of analysis has been made, complex behavior such as grooming still lacks satisfactory automated quantification. NEW METHOD: We trained a freely available,

  12. Automated classification of self-grooming in mice using open-source software

    NARCIS (Netherlands)

    van den Boom, Bastijn J. G.; Pavlidi, Pavlina; Wolf, Casper M. H.; Mooij, Hanne A. H.; Willuhn, Ingo

    2017-01-01

    Background: Manual analysis of behavior is labor intensive and subject to inter-rater variability. Although considerable progress in automation of analysis has been made, complex behavior such as grooming still lacks satisfactory automated quantification. New method: We trained a freely available,

  13. Vertebral Body Compression Fractures and Bone Density: Automated Detection and Classification on CT Images.

    Science.gov (United States)

    Burns, Joseph E; Yao, Jianhua; Summers, Ronald M

    2017-09-01

    Purpose To create and validate a computer system with which to detect, localize, and classify compression fractures and measure bone density of thoracic and lumbar vertebral bodies on computed tomographic (CT) images. Materials and Methods Institutional review board approval was obtained, and informed consent was waived in this HIPAA-compliant retrospective study. A CT study set of 150 patients (mean age, 73 years; age range, 55-96 years; 92 women, 58 men) with (n = 75) and without (n = 75) compression fractures was assembled. All case patients were age and sex matched with control subjects. A total of 210 thoracic and lumbar vertebrae showed compression fractures and were electronically marked and classified by a radiologist. Prototype fully automated spinal segmentation and fracture detection software were then used to analyze the study set. System performance was evaluated with free-response receiver operating characteristic analysis. Results Sensitivity for detection or localization of compression fractures was 95.7% (201 of 210; 95% confidence interval [CI]: 87.0%, 98.9%), with a false-positive rate of 0.29 per patient. Additionally, sensitivity was 98.7% and specificity was 77.3% at case-based receiver operating characteristic curve analysis. Accuracy for classification by Genant type (anterior, middle, or posterior height loss) was 0.95 (107 of 113; 95% CI: 0.89, 0.98), with weighted κ of 0.90 (95% CI: 0.81, 0.99). Accuracy for categorization by Genant height loss grade was 0.68 (77 of 113; 95% CI: 0.59, 0.76), with a weighted κ of 0.59 (95% CI: 0.47, 0.71). The average bone attenuation for T12-L4 vertebrae was 146 HU ± 29 (standard deviation) in case patients and 173 HU ± 42 in control patients; this difference was statistically significant. Conclusion The system can be used to detect compression fractures with high sensitivity and with a low false-positive rate, as well as to calculate vertebral bone density, on CT images. © RSNA, 2017 Online supplemental material is available for this article.

  14. Algorithms and data structures for automated change detection and classification of sidescan sonar imagery

    Science.gov (United States)

    Gendron, Marlin Lee

    During Mine Warfare (MIW) operations, MIW analysts perform change detection by visually comparing historical sidescan sonar imagery (SSI) collected by a sidescan sonar with recently collected SSI in an attempt to identify objects (which might be explosive mines) placed at sea since the last time the area was surveyed. This dissertation presents a data structure and three algorithms, developed by the author, that are part of an automated change detection and classification (ACDC) system. To reduce the time needed to perform change detection, MIW analysts at the Naval Oceanographic Office are currently using ACDC. The introductory chapter gives background information on change detection and ACDC, and describes how SSI is produced from raw sonar data. Chapter 2 presents the author's Geospatial Bitmap (GB) data structure, which is capable of storing information geographically and is utilized by the three algorithms. This chapter shows that a GB data structure used in a polygon-smoothing algorithm ran between 1.3 and 48.4 times faster than a sparse matrix data structure. Chapter 3 describes the GB clustering algorithm, which is the author's repeatable, order-independent method for clustering. Results from tests performed in this chapter show that the time to cluster a set of points is not affected by the distribution or the order of the points. In Chapter 4, the author presents his real-time computer-aided detection (CAD) algorithm that automatically detects mine-like objects on the seafloor in SSI. The author ran his GB-based CAD algorithm on real SSI data, and results of these tests indicate that his real-time CAD algorithm performs comparably to or better than other non-real-time CAD algorithms. The author presents his computer-aided search (CAS) algorithm in Chapter 5. CAS helps MIW analysts locate mine-like features that are geospatially close to previously detected features. A comparison between the CAS and a great circle distance algorithm shows that the
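
    As a rough illustration of the Geospatial Bitmap idea (one bit per geographic cell, giving fast set-membership queries over detections), here is a toy Python version; the dissertation's GB is considerably more sophisticated, and the coordinates below are arbitrary.

        import numpy as np

        class GeospatialBitmap:
            """Toy geospatial bitmap: one bit per lat/lon cell."""

            def __init__(self, lat_range, lon_range, cell_deg):
                self.lat0, self.lon0, self.cell = lat_range[0], lon_range[0], cell_deg
                rows = int((lat_range[1] - lat_range[0]) / cell_deg) + 1
                cols = int((lon_range[1] - lon_range[0]) / cell_deg) + 1
                self.bits = np.zeros((rows, cols), dtype=bool)

            def _index(self, lat, lon):
                return (int((lat - self.lat0) / self.cell),
                        int((lon - self.lon0) / self.cell))

            def set(self, lat, lon):
                self.bits[self._index(lat, lon)] = True

            def contains(self, lat, lon):
                return bool(self.bits[self._index(lat, lon)])

        gb = GeospatialBitmap((30.0, 31.0), (-88.0, -87.0), cell_deg=0.001)
        gb.set(30.25, -87.5)                  # mark a detected object
        print(gb.contains(30.25, -87.5))      # O(1) membership query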

  15. Automated Inspection of Defects in Optical Fiber Connector End Face Using Novel Morphology Approaches.

    Science.gov (United States)

    Mei, Shuang; Wang, Yudan; Wen, Guojun; Hu, Yang

    2018-05-03

    Increasing deployment of optical fiber networks and the need for reliable high bandwidth make the task of inspecting optical fiber connector end faces a crucial process that must not be neglected. Traditional end face inspections are usually performed by manual visual methods, which are low in efficiency and poor in precision for long-term industrial applications. More seriously, the inspection results cannot be quantified for subsequent analysis. Aiming at the characteristics of typical defects in the inspection process for optical fiber end faces, we propose a novel method, “difference of min-max ranking filtering” (DO2MR), for detection of region-based defects, e.g., dirt, oil, contamination, pits, and chips, and a special model, a “linear enhancement inspector” (LEI), for the detection of scratches. The DO2MR is a morphology method that intends to determine whether a pixel belongs to a defective region by comparing the difference of gray values of pixels in the neighborhood around the pixel. The LEI is also a morphology method that is designed to search for scratches at different orientations with a special linear detector. These two approaches can be easily integrated into optical inspection equipment for automatic quality verification. As far as we know, this is the first time that complete defect detection methods for optical fiber end faces are available in the literature. Experimental results demonstrate that the proposed DO2MR and LEI models yield good comprehensive performance with high precision and acceptable recall rates, and the image-level detection accuracies reach 96.0% and 89.3%, respectively.
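
    The abstract describes DO2MR as flagging pixels whose gray value differs sharply from their neighborhood. One plausible reading, sketched below with assumed window and sensitivity parameters (the published thresholding rule may differ), is a residual between local max and local min ranking filters.

        import numpy as np
        from scipy import ndimage

        def do2mr_like(gray, win=5, gamma=3.0):
            """Flag pixels whose neighborhood spans an unusually large
            gray-level range; `win` and `gamma` are assumed knobs."""
            g = gray.astype(float)
            residual = (ndimage.maximum_filter(g, size=win)
                        - ndimage.minimum_filter(g, size=win))
            threshold = residual.mean() + gamma * residual.std()
            mask = residual > threshold
            # Opening removes isolated false-positive pixels.
            return ndimage.binary_opening(mask, np.ones((3, 3), bool))

        rng = np.random.default_rng(2)
        end_face = rng.normal(128, 3, (200, 200))
        end_face[80:90, 100:115] -= 40        # synthetic dirt spot
        defect_mask = do2mr_like(end_face)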

  16. Automated Inspection of Defects in Optical Fiber Connector End Face Using Novel Morphology Approaches

    Directory of Open Access Journals (Sweden)

    Shuang Mei

    2018-05-01

    Full Text Available Increasing deployment of optical fiber networks and the need for reliable high bandwidth make the task of inspecting optical fiber connector end faces a crucial process that must not be neglected. Traditional end face inspections are usually performed by manual visual methods, which are low in efficiency and poor in precision for long-term industrial applications. More seriously, the inspection results cannot be quantified for subsequent analysis. Aiming at the characteristics of typical defects in the inspection process for optical fiber end faces, we propose a novel method, “difference of min-max ranking filtering” (DO2MR), for detection of region-based defects, e.g., dirt, oil, contamination, pits, and chips, and a special model, a “linear enhancement inspector” (LEI), for the detection of scratches. The DO2MR is a morphology method that intends to determine whether a pixel belongs to a defective region by comparing the difference of gray values of pixels in the neighborhood around the pixel. The LEI is also a morphology method that is designed to search for scratches at different orientations with a special linear detector. These two approaches can be easily integrated into optical inspection equipment for automatic quality verification. As far as we know, this is the first time that complete defect detection methods for optical fiber end faces are available in the literature. Experimental results demonstrate that the proposed DO2MR and LEI models yield good comprehensive performance with high precision and acceptable recall rates, and the image-level detection accuracies reach 96.0% and 89.3%, respectively.

  17. Regional Landslide Mapping Aided by Automated Classification of SqueeSAR™ Time Series (Northern Apennines, Italy)

    Science.gov (United States)

    Iannacone, J.; Berti, M.; Allievi, J.; Del Conte, S.; Corsini, A.

    2013-12-01

    Space borne InSAR has proven to be very valuable for landslide detection. In particular, extremely slow landslides (Cruden and Varnes, 1996) can now be clearly identified, thanks to the millimetric precision reached by recent multi-interferometric algorithms. The typical approach in radar interpretation for landslide mapping is based on the average annual velocity of the deformation, which is calculated over the entire time series. The Hotspot and Cluster Analysis (Lu et al., 2012) and the PSI-based matrix approach (Cigna et al., 2013) are examples of landslide mapping techniques based on average annual velocities. However, slope movements can be affected by non-linear deformation trends (i.e. reactivation of dormant landslides, deceleration due to natural or man-made slope stabilization, seasonal activity, etc). Therefore, analyzing deformation time series is crucial in order to fully characterize slope dynamics. While this is relatively simple to carry out manually when dealing with a small dataset, time series analysis over regional-scale datasets requires automated classification procedures. Berti et al. (2013) developed an automatic procedure for the analysis of InSAR time series based on a sequence of statistical tests. The analysis allows the time series to be classified into six distinctive target trends (0 = uncorrelated; 1 = linear; 2 = quadratic; 3 = bilinear; 4 = discontinuous without constant velocity; 5 = discontinuous with change in velocity) which are likely to represent different slope processes. The analysis also provides a series of descriptive parameters which can be used to characterize the temporal changes of ground motion. All the classification algorithms were integrated into a Graphical User Interface called PSTime. We investigated an area of about 2000 km2 in the Northern Apennines of Italy by using the SqueeSAR™ algorithm (Ferretti et al., 2011). Two Radarsat-1 data stacks, comprising 112 scenes in descending orbit and 124 scenes in ascending orbit
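
    PSTime's six-trend typing relies on a sequence of statistical tests; as a crude stand-in, the sketch below distinguishes only three of the target trends by comparing polynomial fits with AIC, on an invented displacement series.

        import numpy as np

        def classify_trend(t, y):
            """Pick the best of constant/linear/quadratic fits by AIC
            (the published procedure uses statistical tests and more
            trend classes)."""
            n, aic = len(y), {}
            for name, deg in (("uncorrelated", 0), ("linear", 1),
                              ("quadratic", 2)):
                resid = y - np.polyval(np.polyfit(t, y, deg), t)
                rss = float(resid @ resid)
                aic[name] = n * np.log(rss / n + 1e-12) + 2 * (deg + 1)
            return min(aic, key=aic.get)

        t = np.linspace(0, 5, 60)             # years of acquisitions
        y = -3.0 * t + np.random.default_rng(3).normal(0, 1.0, t.size)
        print(classify_trend(t, y))           # -> "linear"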

  18. Classification of defects in honeycomb composite structure of helicopter rotor blades

    International Nuclear Information System (INIS)

    Balasko, M.; Svab, E.; Molnar, Gy.; Veres, I.

    2005-01-01

    The use of non-destructive testing methods to qualify the state of rotor blades with respect to their expected flight hours, with the aim to extend their lifetime without any risk of breakdown, is an important financial demand. In order to detect the possible defects in the composite structure of Mi-8 and Mi-24 type helicopter rotor blades used by the Hungarian Army, we have performed combined neutron- and X-ray radiography measurements at the Budapest Research Reactor. Several types of defects were detected, analysed and typified. Among the most frequent and important defects observed were cavities, holes and/or cracks in the sealing elements on the interface of the honeycomb structure and the section borders. Inhomogeneities of the resin materials (resin-rich or starved areas) at the core-honeycomb surfaces proved to be another important point. Defects were detected at the adhesive filling, and water percolation was visualized at the sealing interfaces of the honeycomb sections. Corrosion effects and metal inclusions have also been detected

  19. Classification of defects in honeycomb composite structure of helicopter rotor blades

    Science.gov (United States)

    Balaskó, M.; Sváb, E.; Molnár, Gy.; Veres, I.

    2005-04-01

    The use of non-destructive testing methods to qualify the state of rotor blades with respect to their expected flight hours, with the aim to extend their lifetime without any risk of breakdown, is an important financial demand. In order to detect the possible defects in the composite structure of Mi-8 and Mi-24 type helicopter rotor blades used by the Hungarian Army, we have performed combined neutron- and X-ray radiography measurements at the Budapest Research Reactor. Several types of defects were detected, analysed and typified. Among the most frequent and important defects observed were cavities, holes and/or cracks in the sealing elements on the interface of the honeycomb structure and the section borders. Inhomogeneities of the resin materials (resin-rich or starved areas) at the core-honeycomb surfaces proved to be another important point. Defects were detected at the adhesive filling, and water percolation was visualized at the sealing interfaces of the honeycomb sections. Corrosion effects and metal inclusions have also been detected.

  20. Development of automated system based on neural network algorithm for detecting defects on molds installed on casting machines

    Science.gov (United States)

    Bazhin, V. Yu; Danilov, I. V.; Petrov, P. A.

    2018-05-01

    During the casting of light alloys and ligatures based on aluminum and magnesium, problems of the qualitative distribution of the metal and its crystallization in the mold arise. To monitor the defects of molds on the casting conveyor, a camera with a resolution of 780 x 580 pixels and a shooting rate of 75 frames per second was selected. Images of molds from casting machines were used as input data for the neural network algorithm. At the stage of preparing the digital database and its analytical evaluation, a convolutional neural network architecture was chosen for the algorithm. The information flow from the local controller is transferred to the OPC server and then to the SCADA system of the foundry. After training, the accuracy of neural network defect recognition was about 95.1% on a validation split; with the trained weight coefficients, the network achieved identical accuracy on the testing split. The proposed technical solutions make it possible to increase the efficiency of the automated process control system in the foundry by expanding the digital database.
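
    The record names a convolutional architecture without further detail; the sketch below is a generic small CNN for binary defect recognition in Keras, with the layer sizes, the 96 x 96 input, and the synthetic frames all assumed for illustration.

        import numpy as np
        from tensorflow import keras
        from tensorflow.keras import layers

        model = keras.Sequential([
            layers.Input(shape=(96, 96, 1)),
            layers.Conv2D(16, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Conv2D(32, 3, activation="relu"),
            layers.MaxPooling2D(),
            layers.Flatten(),
            layers.Dense(64, activation="relu"),
            layers.Dense(1, activation="sigmoid"),    # defect / no defect
        ])
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=["accuracy"])

        # Synthetic stand-in for mold images downscaled from the camera frames.
        rng = np.random.default_rng(4)
        x = rng.random((32, 96, 96, 1)).astype("float32")
        y = rng.integers(0, 2, size=32)
        model.fit(x, y, epochs=1, validation_split=0.25, verbose=0)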

  1. An automated Pearson's correlation change classification (APC3) approach for GC/MS metabonomic data using total ion chromatograms (TICs).

    Science.gov (United States)

    Prakash, Bhaskaran David; Esuvaranathan, Kesavan; Ho, Paul C; Pasikanti, Kishore Kumar; Chan, Eric Chun Yong; Yap, Chun Wei

    2013-05-21

    A fully automated and computationally efficient Pearson's correlation change classification (APC3) approach is proposed and shown to have overall comparable performance, with both an average accuracy and an average AUC of 0.89 ± 0.08, while being 3.9 to 7 times faster, easier to use, and less susceptible to outliers than other dimensional reduction and classification combinations, using only the total ion chromatogram (TIC) intensities of GC/MS data. The use of only the TIC permits the possible application of APC3 to other metabonomic data such as LC/MS TICs or NMR spectra. A RapidMiner implementation is available for download at http://padel.nus.edu.sg/software/padelapc3.
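
    Because only the TIC intensities are used, a correlation-based classifier can be stated in a few lines. The sketch below assigns a query TIC to the class whose mean TIC it correlates with most strongly; this is one plausible reading of correlation-based classification, not the published APC3 algorithm, and the data are synthetic.

        import numpy as np

        def classify_by_tic_correlation(train_tics, train_labels, query_tic):
            """Nearest class by Pearson correlation to class-mean TICs."""
            best_class, best_r = None, -2.0
            for c in np.unique(train_labels):
                mean_tic = train_tics[train_labels == c].mean(axis=0)
                r = np.corrcoef(query_tic, mean_tic)[0, 1]
                if r > best_r:
                    best_class, best_r = c, r
            return best_class, best_r

        rng = np.random.default_rng(5)
        tics = rng.random((40, 3000))         # 40 runs x 3000 scan points
        labels = np.repeat([0, 1], 20)        # e.g., control vs. case
        print(classify_by_tic_correlation(tics, labels, tics[0]))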

  2. Using support vector machines with tract-based spatial statistics for automated classification of Tourette syndrome children

    Science.gov (United States)

    Wen, Hongwei; Liu, Yue; Wang, Jieqiong; Zhang, Jishui; Peng, Yun; He, Huiguang

    2016-03-01

    Tourette syndrome (TS) is a developmental neuropsychiatric disorder with the cardinal symptoms of motor and vocal tics which emerges in early childhood and fluctuates in severity in later years. To date, the neural basis of TS is not fully understood, and TS has a long-term prognosis that is difficult to estimate accurately. Few studies have looked at the potential of using diffusion tensor imaging (DTI) in conjunction with machine learning algorithms to automate the classification of healthy children and TS children. Here we apply the Tract-Based Spatial Statistics (TBSS) method to 44 TS children and 48 age- and gender-matched healthy children in order to extract the diffusion values from each voxel in the white matter (WM) skeleton, and a feature selection algorithm (ReliefF) was used to select the most salient voxels for subsequent classification with a support vector machine (SVM). We use nested cross validation to yield an unbiased assessment of the classification method and prevent overestimation. Peak performance of the SVM classifier was achieved using the axial diffusivity (AD) metric, with accuracy of 88.04%, sensitivity of 88.64%, and specificity of 87.50%, demonstrating the potential of a joint TBSS and SVM pipeline for fast, objective classification of healthy and TS children. These results suggest that our methods may be useful for the early identification of subjects with TS, and hold promise for predicting prognosis and treatment outcome for individuals with TS.
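
    The nested cross-validation the authors use is easy to misimplement; keeping feature selection inside the inner loop is what prevents the overestimation they mention. A scikit-learn sketch under assumptions (synthetic data; SelectKBest stands in for ReliefF, which is not part of scikit-learn):

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.pipeline import Pipeline
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.model_selection import GridSearchCV, cross_val_score

        # Synthetic stand-in: 92 subjects x 5000 skeleton-voxel AD values.
        rng = np.random.default_rng(6)
        X = rng.normal(size=(92, 5000))
        y = np.array([0] * 48 + [1] * 44)     # healthy vs. TS

        # Feature selection lives inside the pipeline, so inner CV folds
        # never see the outer test subjects.
        pipe = Pipeline([("select", SelectKBest(f_classif)),
                         ("svm", SVC(kernel="linear"))])
        inner = GridSearchCV(pipe, {"select__k": [100, 500, 1000],
                                    "svm__C": [0.1, 1, 10]}, cv=5)
        outer_scores = cross_val_score(inner, X, y, cv=10)  # unbiased estimate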

  3. Quantum Cascade Laser-Based Infrared Microscopy for Label-Free and Automated Cancer Classification in Tissue Sections.

    Science.gov (United States)

    Kuepper, Claus; Kallenbach-Thieltges, Angela; Juette, Hendrik; Tannapfel, Andrea; Großerueschkamp, Frederik; Gerwert, Klaus

    2018-05-16

    A feasibility study using a quantum cascade laser-based infrared microscope for the rapid and label-free classification of colorectal cancer tissues is presented. Infrared imaging is a reliable, robust, automated, and operator-independent tissue classification method that has been used for differential classification of tissue thin sections, identifying tumorous regions. However, the long acquisition times of the FT-IR-based microscopes used so far have hampered the clinical translation of this technique. Here, the quantum cascade laser-based microscope provides infrared images for precise tissue classification within a few minutes. We analyzed 110 patients with UICC-Stage II and III colorectal cancer, showing 96% sensitivity and 100% specificity of this label-free method as compared to histopathology, the gold standard in routine clinical diagnostics. The main hurdle to the clinical translation of IR imaging is now overcome by the short acquisition time for high-quality diagnostic images, which is in the same time range as frozen sections by pathologists.

  4. Automated Classification of Radiology Reports for Acute Lung Injury: Comparison of Keyword and Machine Learning Based Natural Language Processing Approaches.

    Science.gov (United States)

    Solti, Imre; Cooke, Colin R; Xia, Fei; Wurfel, Mark M

    2009-11-01

    This paper compares the performance of keyword and machine learning-based chest x-ray report classification for Acute Lung Injury (ALI). ALI mortality is approximately 30 percent. High mortality is, in part, a consequence of delayed manual chest x-ray classification. An automated system could reduce the time to recognize ALI and lead to reductions in mortality. For our study, 96 and 857 chest x-ray reports in two corpora were labeled by domain experts for ALI. We developed a keyword and a Maximum Entropy-based classification system. Word unigrams and character n-grams provided the features for the machine learning system. The Maximum Entropy algorithm with character 6-grams achieved the highest performance (Recall = 0.91, Precision = 0.90 and F-measure = 0.91) on the 857-report corpus. This study has shown that for the classification of ALI chest x-ray reports, the machine learning approach is superior to the keyword-based system and achieves results comparable to those of the highest performing physician annotators.
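
    A Maximum Entropy classifier over character 6-grams maps directly onto standard tooling; the sketch below uses scikit-learn's logistic regression (a maximum-entropy formulation) with toy reports invented here, since the study corpora are not reproduced in this record.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        reports = ["bilateral airspace opacities consistent with ali",
                   "clear lungs, no focal consolidation",
                   "diffuse bilateral infiltrates, possible ards",
                   "no acute cardiopulmonary abnormality"]
        labels = [1, 0, 1, 0]                 # 1 = ALI-positive report

        # Character 6-grams feed a logistic-regression (MaxEnt) classifier.
        clf = make_pipeline(
            TfidfVectorizer(analyzer="char", ngram_range=(6, 6)),
            LogisticRegression(max_iter=1000))
        clf.fit(reports, labels)
        print(clf.predict(["new bilateral opacities noted"]))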

  5. Characterization of glycosylphosphatidylinositol biosynthesis defects by clinical features, flow cytometry, and automated image analysis

    DEFF Research Database (Denmark)

    Knaus, Alexej; Pantel, Jean Tori; Pendziwiat, Manuela

    2018-01-01

    , the increasing number of individuals with a GPIBD shows that hyperphosphatasia is a variable feature that is not ideal for a clinical classification. METHODS: We studied the discriminatory power of multiple GPI-linked substrates that were assessed by flow cytometry in blood cells and fibroblasts of 39 and 14...... those with PIGA mutations. Although the impairment of GPI-linked substrates is supposed to play the key role in the pathophysiology of GPIBDs, we could not observe gene-specific profiles for flow cytometric markers or a correlation between their cell surface levels and the severity of the phenotype...

  6. Development and evaluation of automated systems for detection and classification of banded chromosomes: current status and future perspectives

    International Nuclear Information System (INIS)

    Wang Xingwei; Zheng Bin; Wood, Marc; Li Shibo; Chen Wei; Liu Hong

    2005-01-01

    Automated detection and classification of banded chromosomes may help clinicians diagnose cancers and other genetic disorders at an early stage more efficiently and accurately. However, developing such an automated system (including both a high-speed microscopic image scanning device and related computer-assisted schemes) is quite a challenging and difficult task. Since the 1980s, great research efforts have been made to develop fast and more reliable methods to assist clinical technicians in performing this important and time-consuming task. A number of computer-assisted methods, including classical statistical methods, artificial neural networks and knowledge-based fuzzy logic systems, have been applied and tested. Based on the initial tests using limited datasets, encouraging results in algorithm and system development have been demonstrated. Despite the significant research effort and progress made over the last two decades, computer-assisted chromosome detection and classification systems have not been routinely accepted and used in clinical laboratories. Further research and development is needed

  7. Development and evaluation of automated systems for detection and classification of banded chromosomes: current status and future perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Wang Xingwei [Center for Bioengineering and School of Electrical and Computer Engineering, University of Oklahoma, OK (United States); Zheng Bin [Department of Radiology, University of Pittsburgh Medical Center, Pittsburgh, PA (United States); Wood, Marc [Center for Bioengineering and School of Electrical and Computer Engineering, University of Oklahoma, OK (United States); Li Shibo [Department of Pediatrics, University of Oklahoma Medical Center, Oklahoma City, OK (United States); Chen Wei [Department of Physics and Engineering, University of Central Oklahoma, Edmond, OK (United States); Liu Hong [Center for Bioengineering and School of Electrical and Computer Engineering, University of Oklahoma, OK (United States)

    2005-08-07

    Automated detection and classification of banded chromosomes may help clinicians diagnose cancers and other genetic disorders at an early stage more efficiently and accurately. However, developing such an automated system (including both a high-speed microscopic image scanning device and related computer-assisted schemes) is quite a challenging and difficult task. Since the 1980s, great research efforts have been made to develop fast and more reliable methods to assist clinical technicians in performing this important and time-consuming task. A number of computer-assisted methods, including classical statistical methods, artificial neural networks and knowledge-based fuzzy logic systems, have been applied and tested. Based on the initial tests using limited datasets, encouraging results in algorithm and system development have been demonstrated. Despite the significant research effort and progress made over the last two decades, computer-assisted chromosome detection and classification systems have not been routinely accepted and used in clinical laboratories. Further research and development is needed.

  8. Suggestions on performance of finite element limit analysis for eliminating the necessity of stress classifications in design and defect assessment

    Energy Technology Data Exchange (ETDEWEB)

    Fujioka, T. [Central Research Institute of Electric Power Industry, Tokyo (Japan)

    2001-07-01

    In structural design of a nuclear power component, stress classification from elastic stress analysis resultants is often used. Alternatively, to improve accuracy, finite element limit analysis may be performed. This paper examines some issues relating to the use of limit analysis; specifically, the treatment of multiple applied loads and the definition of the limit load from analysis using hardening plasticity laws. These are addressed both by detailed analysis for a simple geometry and by using the reference stress approach to estimate the inelastic displacement. The proposals are also applicable to a defect assessment of a cracked component, and treatment of distributed loads. It is shown that multiple or distributed loads should be treated as if they were applied proportionally irrespective of the actual nature of loads, and that the limit load from analysis with general plasticity laws may be estimated using a newly suggested reduced elastic slope method. (author)

  9. Suggestions on performance of finite element limit analysis for eliminating the necessity of stress classifications in design and defect assessment

    International Nuclear Information System (INIS)

    Fujioka, T.

    2001-01-01

    In structural design of a nuclear power component, stress classification from elastic stress analysis resultants is often used. Alternatively, to improve accuracy, finite element limit analysis may be performed. This paper examines some issues relating to the use of limit analysis; specifically, the treatment of multiple applied loads and the definition of the limit load from analysis using hardening plasticity laws. These are addressed both by detailed analysis for a simple geometry and by using the reference stress approach to estimate the inelastic displacement. The proposals are also applicable to a defect assessment of a cracked component, and treatment of distributed loads. It is shown that multiple or distributed loads should be treated as if they were applied proportionally irrespective of the actual nature of loads, and that the limit load from analysis with general plasticity laws may be estimated using a newly suggested reduced elastic slope method. (author)

  10. Automated grain extraction and classification by combining improved region growing segmentation and shape descriptors in electromagnetic mill classification system

    Science.gov (United States)

    Budzan, Sebastian

    2018-04-01

    In this paper, an automatic method of grain detection and classification is presented. As input, it uses a single digital image obtained from the milling process of copper ore with a high-quality digital camera. The grinding process is extremely energy- and cost-consuming, so granularity evaluation should be performed efficiently and quickly. The method proposed in this paper is based on three-stage image processing. First, all grains are detected using Seeded Region Growing (SRG) segmentation with a proposed adaptive thresholding based on the calculation of the Relative Standard Deviation (RSD). In the next step, the detection results are refined using information about the shape of the detected grains derived from a distance map. Finally, each grain in the sample is classified into one of the predefined granularity classes. The quality of the proposed method has been evaluated using samples of nominal granularity, and compared with other methods.
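
    A bare-bones version of seeded region growing with an RSD-driven adaptive tolerance is sketched below; the growth criterion and constants are a simplified reading of the method described above, not the paper's exact formulation.

        import numpy as np
        from collections import deque

        def region_grow(img, seed, k=1.5):
            """Grow a region from `seed`, admitting neighbors whose gray
            value stays within an adaptive, RSD-scaled tolerance."""
            h, w = img.shape
            mask = np.zeros((h, w), bool)
            mask[seed] = True
            vals, q = [float(img[seed])], deque([seed])
            while q:
                r, c = q.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < h and 0 <= cc < w and not mask[rr, cc]:
                        mean = float(np.mean(vals))
                        rsd = np.std(vals) / (abs(mean) + 1e-9)
                        tol = k * max(rsd * abs(mean), 1.0)  # adaptive tolerance
                        if abs(img[rr, cc] - mean) <= tol:
                            mask[rr, cc] = True
                            vals.append(float(img[rr, cc]))
                            q.append((rr, cc))
            return mask

        rng = np.random.default_rng(7)
        img = rng.normal(100, 2, (64, 64))
        img[20:40, 20:40] += 60               # one synthetic "grain"
        grain_mask = region_grow(img, seed=(30, 30))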

  11. Exploratory analysis of methods for automated classification of laboratory test orders into syndromic groups in veterinary medicine.

    Directory of Open Access Journals (Sweden)

    Fernanda C Dórea

    Full Text Available BACKGROUND: Recent focus on earlier detection of pathogen introduction in human and animal populations has led to the development of surveillance systems based on automated monitoring of health data. Real- or near real-time monitoring of pre-diagnostic data requires automated classification of records into syndromes--syndromic surveillance--using algorithms that incorporate medical knowledge in a reliable and efficient way, while remaining comprehensible to end users. METHODS: This paper describes the application of two machine learning methods (Naïve Bayes and Decision Trees) and rule-based methods to extract syndromic information from laboratory test requests submitted to a veterinary diagnostic laboratory. RESULTS: High performance (F1-macro = 0.9995) was achieved through the use of a rule-based syndrome classifier, based on rule induction followed by manual modification during the construction phase, which also resulted in clear interpretability of the resulting classification process. An unmodified rule induction algorithm achieved an F1-micro score of 0.979, though this fell to 0.677 when performance for individual classes was averaged in an unweighted manner (F1-macro), due to the fact that the algorithm failed to learn 3 of the 16 classes from the training set. Decision Trees showed interpretability equal to that of the rule-based approaches, but achieved an F1-micro score of 0.923 (falling to 0.311 when classes are given equal weight). A Naïve Bayes classifier learned all classes and achieved high performance (F1-micro = 0.994 and F1-macro = 0.955); however, the classification process is not transparent to the domain experts. CONCLUSION: The use of a manually customised rule set allowed for the development of a system for classification of laboratory tests into syndromic groups with very high performance, and high interpretability by the domain experts. Further research is required to develop internal validation rules in order to establish
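
    The gap between the F1-micro and F1-macro numbers above is what happens when a classifier never learns rare classes; the toy illustration below (invented labels, not the study's data) reproduces the effect.

        from sklearn.metrics import f1_score

        # Class "c" is rare and never predicted.
        y_true = ["a"] * 45 + ["b"] * 45 + ["c"] * 10
        y_pred = ["a"] * 45 + ["b"] * 50 + ["a"] * 5

        print(f1_score(y_true, y_pred, average="micro"))  # ~0.90, size-weighted
        print(f1_score(y_true, y_pred, average="macro"))  # ~0.63, punishes "c"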

  12. Precision automation of cell type classification and sub-cellular fluorescence quantification from laser scanning confocal images

    Directory of Open Access Journals (Sweden)

    Hardy Craig Hall

    2016-02-01

    Full Text Available While novel whole-plant phenotyping technologies have been successfully implemented into functional genomics and breeding programs, the potential of automated phenotyping with cellular resolution is largely unexploited. Laser scanning confocal microscopy has the potential to close this gap by providing spatially highly resolved images containing anatomic as well as chemical information on a subcellular basis. However, in the absence of automated methods, the assessment of the spatial patterns and abundance of fluorescent markers with subcellular resolution is still largely qualitative and time-consuming. Recent advances in image acquisition and analysis, coupled with improvements in microprocessor performance, have brought such automated methods within reach, so that information from thousands of cells per image for hundreds of images may be derived in an experimentally convenient time-frame. Here, we present a MATLAB-based analytical pipeline to (1) segment radial plant organs into individual cells, (2) classify cells into cell type categories based upon random forest classification, (3) divide each cell into sub-regions, and (4) quantify fluorescence intensity to a subcellular degree of precision for a separate fluorescence channel. In this research advance, we demonstrate the precision of this analytical process for the relatively complex tissues of Arabidopsis hypocotyls at various stages of development. High speed and robustness make our approach suitable for phenotyping of large collections of stem-like material and other tissue types.

  13. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    Directory of Open Access Journals (Sweden)

    Mei Zhan

    2015-04-01

    Full Text Available Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a

  14. Automated Processing of Imaging Data through Multi-tiered Classification of Biological Structures Illustrated Using Caenorhabditis elegans.

    Science.gov (United States)

    Zhan, Mei; Crane, Matthew M; Entchev, Eugeni V; Caballero, Antonio; Fernandes de Abreu, Diana Andrea; Ch'ng, QueeLim; Lu, Hang

    2015-04-01

    Quantitative imaging has become a vital technique in biological discovery and clinical diagnostics; a plethora of tools have recently been developed to enable new and accelerated forms of biological investigation. Increasingly, the capacity for high-throughput experimentation provided by new imaging modalities, contrast techniques, microscopy tools, microfluidics and computer controlled systems shifts the experimental bottleneck from the level of physical manipulation and raw data collection to automated recognition and data processing. Yet, despite their broad importance, image analysis solutions to address these needs have been narrowly tailored. Here, we present a generalizable formulation for autonomous identification of specific biological structures that is applicable for many problems. The process flow architecture we present here utilizes standard image processing techniques and the multi-tiered application of classification models such as support vector machines (SVM). These low-level functions are readily available in a large array of image processing software packages and programming languages. Our framework is thus both easy to implement at the modular level and provides specific high-level architecture to guide the solution of more complicated image-processing problems. We demonstrate the utility of the classification routine by developing two specific classifiers as a toolset for automation and cell identification in the model organism Caenorhabditis elegans. To serve a common need for automated high-resolution imaging and behavior applications in the C. elegans research community, we contribute a ready-to-use classifier for the identification of the head of the animal under bright field imaging. Furthermore, we extend our framework to address the pervasive problem of cell-specific identification under fluorescent imaging, which is critical for biological investigation in multicellular organisms or tissues. Using these examples as a guide, we envision

  15. Research and practice on NPP safety DCS application software V and V defect classification system

    International Nuclear Information System (INIS)

    Zhang Dongwei; Li Yunjian; Li Xiangjian

    2012-01-01

    One of the most significant aims of Verification and Validation (V and V) is to find software errors and risks, especially for DCS application software designed for a nuclear power plant (NPP). By classifying and analyzing errors, the resulting data can be used to assess the current status and potential risks of software development and to improve the quality of the project. A method of error classification is proposed, which is applied across the whole V and V life cycle, using a MW pressurized reactor project as an example. The purpose is to analyze errors discovered by V and V activities, resulting in improvement of the safety-critical DCS application software. (authors)

  16. Automated Surface Classification of SRF Cavities for the Investigation of the Influence of Surface Properties onto the Operational Performance

    International Nuclear Information System (INIS)

    Wenskat, Marc

    2015-07-01

    Superconducting niobium radio-frequency cavities are fundamental for the European XFEL and the International Linear Collider. To use the operational advantages of superconducting cavities, the inner surface has to fulfill quite demanding requirements. The surface roughness and cleanliness improved over the last decades and, with them, the achieved maximal accelerating field. Still, limitations of the maximal achieved accelerating field are observed which are not explained by localized geometrical defects or impurities. The scope of this thesis is a better understanding of these limitations in defect-free cavities based on global, rather than local, surface properties. For this goal, more than 30 cavities underwent subsequent surface treatments, cold RF tests and optical inspections within the ILC-HiGrade research program and the XFEL cavity production. An algorithm was developed which allows an automated surface characterization based on an optical inspection robot. This algorithm delivers a set of optical surface properties which describes the inner cavity surface. These optical surface properties deliver a framework for quality assurance of the fabrication procedures. Furthermore, they show promising results for a better understanding of the observed limitations in defect-free cavities.

  17. Automated Surface Classification of SRF Cavities for the Investigation of the Influence of Surface Properties onto the Operational Performance

    Energy Technology Data Exchange (ETDEWEB)

    Wenskat, Marc

    2015-07-15

    Superconducting niobium radio-frequency cavities are fundamental for the European XFEL and the International Linear Collider. To use the operational advantages of superconducting cavities, the inner surface has to fulfill quite demanding requirements. The surface roughness and cleanliness improved over the last decades and, with them, the achieved maximal accelerating field. Still, limitations of the maximal achieved accelerating field are observed which are not explained by localized geometrical defects or impurities. The scope of this thesis is a better understanding of these limitations in defect-free cavities based on global, rather than local, surface properties. For this goal, more than 30 cavities underwent subsequent surface treatments, cold RF tests and optical inspections within the ILC-HiGrade research program and the XFEL cavity production. An algorithm was developed which allows an automated surface characterization based on an optical inspection robot. This algorithm delivers a set of optical surface properties which describes the inner cavity surface. These optical surface properties deliver a framework for quality assurance of the fabrication procedures. Furthermore, they show promising results for a better understanding of the observed limitations in defect-free cavities.

  18. Automated classification of eligibility criteria in clinical trials to facilitate patient-trial matching for specific patient populations.

    Science.gov (United States)

    Zhang, Kevin; Demner-Fushman, Dina

    2017-07-01

    To develop automated classification methods for eligibility criteria in ClinicalTrials.gov to facilitate patient-trial matching for specific populations such as persons living with HIV or pregnant women. We annotated 891 interventional cancer trials from ClinicalTrials.gov based on their eligibility for human immunodeficiency virus (HIV)-positive patients using their eligibility criteria. These annotations were used to develop classifiers based on regular expressions and machine learning (ML). After evaluating classification of cancer trials for eligibility of HIV-positive patients, we sought to evaluate the generalizability of our approach to more general diseases and conditions. We annotated the eligibility criteria for 1570 of the most recent interventional trials from ClinicalTrials.gov for HIV-positive and pregnancy eligibility, and the classifiers were retrained and reevaluated using these data. On the cancer-HIV dataset, the baseline regex model, the bag-of-words ML classifier, and the ML classifier with named entity recognition (NER) achieved macro-averaged F2 scores of 0.77, 0.87, and 0.87, respectively; the addition of NER did not result in a significant performance improvement. On the general dataset, ML + NER achieved macro-averaged F2 scores of 0.91 and 0.85 for HIV and pregnancy, respectively. The eligibility status of specific patient populations, such as persons living with HIV and pregnant women, for clinical trials is of interest to both patients and clinicians. We show that it is feasible to develop a high-performing, automated trial classification system for eligibility status that can be integrated into consumer-facing search engines as well as patient-trial matching systems. Published by Oxford University Press on behalf of the American Medical Informatics Association 2017. This work is written by US Government employees and is in the public domain in the US.
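
    For the regex baseline, something as simple as the sketch below already captures many trials; the pattern is an illustrative stand-in, not the study's actual rule set.

        import re

        HIV_EXCLUSION = re.compile(
            r"exclusion criteria.*?\b(hiv|human immunodeficiency virus)\b",
            re.IGNORECASE | re.DOTALL)

        def hiv_eligible(eligibility_text: str) -> bool:
            """False when HIV appears among the exclusion criteria."""
            return HIV_EXCLUSION.search(eligibility_text) is None

        criteria = """Inclusion Criteria: histologically confirmed lymphoma.
        Exclusion Criteria: known HIV infection; uncontrolled diabetes."""
        print(hiv_eligible(criteria))  # False: trial excludes HIV-positive patients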

  19. Classification

    Science.gov (United States)

    Clary, Renee; Wandersee, James

    2013-01-01

    In this article, Renee Clary and James Wandersee describe the beginnings of "Classification," which lies at the very heart of science and depends upon pattern recognition. Clary and Wandersee approach patterns by first telling the story of the "Linnaean classification system," introduced by Carl Linnaeus (1707-1778), who is…

  20. Accuracy of automated classification of major depressive disorder as a function of symptom severity

    Directory of Open Access Journals (Sweden)

    Rajamannar Ramasubbu, MD, FRCPC, MSc

    2016-01-01

    Conclusions: Binary linear SVM classifiers achieved significant classification of very severe depression with resting-state fMRI, but the contribution of brain measurements may have limited potential in differentiating patients with less severe depression from healthy controls.

  1. Semi-automated landform classification for hazard mapping of soil liquefaction by earthquake

    Science.gov (United States)

    Nakano, Takayuki

    2018-05-01

    Soil liquefaction damage was caused by huge earthquakes in Japan, and similar damage is a concern for future huge earthquakes. On the other hand, the preparation of soil liquefaction risk maps (soil liquefaction hazard maps) is impeded by the difficulty of evaluating soil liquefaction risk. Generally, relative soil liquefaction risk can be evaluated from landform classification data by using empirical rules based on the relationship between the extent of soil liquefaction damage in past earthquakes and landform classification items. Therefore, I rearranged the relationship between landform classification items and soil liquefaction risk intelligibly, in order to enable appropriate and efficient evaluation of soil liquefaction risk based on landform classification data. I also developed a new method of generating landform classification data of 50-m grid size from existing landform classification data of 250-m grid size by using digital elevation model (DEM) data and multi-band satellite image data, in order to evaluate soil liquefaction risk in spatial detail. The products of this study are expected to contribute to the efficient production of soil liquefaction hazard maps by local governments.

  2. Automated retinal nerve fiber layer defect detection using fundus imaging in glaucoma.

    Science.gov (United States)

    Panda, Rashmi; Puhan, N B; Rao, Aparna; Padhy, Debananda; Panda, Ganapati

    2018-06-01

    Retinal nerve fiber layer defect (RNFLD) provides early objective evidence of structural changes in glaucoma. RNFLD detection is currently carried out using imaging modalities like OCT and GDx, which are expensive for routine practice. In this regard, we propose a novel automatic method for RNFLD detection and angular width quantification using cost-effective red-free fundus images, to be practically useful for computer-assisted glaucoma risk assessment. After blood vessel inpainting and CLAHE-based contrast enhancement, the initial boundary pixels are identified by local minima analysis of the 1-D intensity profiles on concentric circles. The true boundary pixels are classified using a random forest trained on the newly proposed cumulative zero count local binary pattern (CZC-LBP) and directional differential energy (DDE) features, along with Shannon entropy, Tsallis entropy and intensity features. Finally, the RNFLD angular width is obtained by random sample consensus (RANSAC) line fitting on the detected set of boundary pixels. The proposed method achieves high RNFLD detection performance on a newly created dataset, with sensitivity (SN) of 0.7821 at 0.2727 false positives per image (FPI) and an area under the curve (AUC) of 0.8733. Copyright © 2018 Elsevier Ltd. All rights reserved.
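
    The concentric-circle profile step translates directly into code. The sketch below samples a 1-D intensity profile around an assumed disc center and flags local minima as RNFLD candidates; sampling is nearest-neighbor, and the prominence value is an assumed tuning parameter, not taken from the paper.

        import numpy as np
        from scipy.signal import find_peaks

        def circular_profile(img, center, radius, n=720):
            """Intensity profile along a circle (nearest-neighbor sampling)."""
            theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
            rows = np.clip((center[0] + radius * np.sin(theta)).astype(int),
                           0, img.shape[0] - 1)
            cols = np.clip((center[1] + radius * np.cos(theta)).astype(int),
                           0, img.shape[1] - 1)
            return img[rows, cols], theta

        rng = np.random.default_rng(8)
        green_channel = rng.normal(150, 5, (600, 600))   # stand-in fundus image
        profile, theta = circular_profile(green_channel, (300, 300), radius=120)

        # Dark wedge defects appear as local minima of the profile.
        minima, _ = find_peaks(-profile, prominence=10)
        candidate_angles = np.degrees(theta[minima])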

  3. Genetic algorithm based feature selection combined with dual classification for the automated detection of proliferative diabetic retinopathy.

    Science.gov (United States)

    Welikala, R A; Fraz, M M; Dehmeshki, J; Hoppe, A; Tah, V; Mann, S; Williamson, T H; Barman, S A

    2015-07-01

    Proliferative diabetic retinopathy (PDR) is a condition that carries a high risk of severe visual impairment. The hallmark of PDR is the growth of abnormal new vessels. In this paper, an automated method for the detection of new vessels from retinal images is presented. This method is based on a dual classification approach. Two vessel segmentation approaches are applied to create two separate binary vessel maps, each of which holds vital information. Local morphology features are measured from each binary vessel map to produce two separate 4-D feature vectors. Independent classification is performed for each feature vector using a support vector machine (SVM) classifier. The system then combines these individual outcomes to produce a final decision. This is followed by the creation of additional features to generate 21-D feature vectors, which feed into a genetic algorithm based feature selection approach with the objective of finding feature subsets that improve the performance of the classification. Sensitivity and specificity results using a dataset of 60 images are 0.9138 and 0.9600, respectively, on a per patch basis and 1.000 and 0.975, respectively, on a per image basis. Copyright © 2015 Elsevier Ltd. All rights reserved.
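
    The dual-classification idea (one SVM per binary vessel map, with the decisions combined) can be sketched in a few lines; the feature values are synthetic, and the combination rule shown (logical AND of the two outcomes) is an assumption, not necessarily the paper's rule.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(9)
        X_map1 = rng.random((100, 4))         # 4-D morphology features, map 1
        X_map2 = rng.random((100, 4))         # 4-D morphology features, map 2
        y = rng.integers(0, 2, size=100)      # 1 = new-vessel patch

        svm1 = SVC().fit(X_map1, y)
        svm2 = SVC().fit(X_map2, y)
        combined = np.logical_and(svm1.predict(X_map1),
                                  svm2.predict(X_map2))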

  4. A copeptin-based classification of the osmoregulatory defects in the syndrome of inappropriate antidiuresis.

    Science.gov (United States)

    Fenske, Wiebke Kristin; Christ-Crain, Mirjam; Hörning, Anna; Simet, Jessica; Szinnai, Gabor; Fassnacht, Martin; Rutishauser, Jonas; Bichet, Daniel G; Störk, Stefan; Allolio, Bruno

    2014-10-01

    Hyponatremia, the most frequent electrolyte disorder, is caused predominantly by the syndrome of inappropriate antidiuresis (SIAD). A comprehensive characterization of SIAD subtypes, defined by type of osmotic dysregulation, is lacking, but may aid in predicting therapeutic success. Here, we analyzed serial measurements of serum osmolality and serum sodium, plasma arginine vasopressin (AVP), and plasma copeptin concentrations from 50 patients with hyponatremia who underwent hypertonic saline infusion. A close correlation between copeptin concentrations and serum osmolality existed in 68 healthy controls, with a mean osmotic threshold±SD of 282±4 mOsM/kg H2O. Furthermore, saline-induced changes in copeptin concentrations correlated with changes in AVP concentrations in controls and patients. With use of copeptin concentration as a surrogate measure of AVP concentration, patients with SIAD could be grouped according to osmoregulatory defect: Ten percent of patients had grossly elevated copeptin concentrations independent of serum osmolality (type A); 14% had copeptin concentrations that increased linearly with rising serum osmolality but had abnormally low osmotic thresholds (type B); 44% had normal copeptin concentrations independent of osmolality (type C), and 12% had suppressed copeptin concentrations independent of osmolality (type D). A novel SIAD subtype discovered in 20% of patients was characterized by a linear decrease in copeptin concentrations with increasing serum osmolality (type E or "barostat reset"). In conclusion, a partial or complete loss of AVP osmoregulation occurs in patients with SIAD. Although the mechanisms underlying osmoregulatory defects in individual patients are presumably diverse, we hypothesize that treatment responses and patient outcomes will vary according to SIAD subtype. Copyright © 2014 by the American Society of Nephrology.

  5. Automated and unbiased image analyses as tools in phenotypic classification of small-spored Alternaria species

    DEFF Research Database (Denmark)

    Andersen, Birgitte; Hansen, Michael Edberg; Smedsgaard, Jørn

    2005-01-01

    often has been broadly applied to various morphologically and chemically distinct groups of isolates from different hosts. The purpose of this study was to develop and evaluate automated and unbiased image analysis systems that will analyze different phenotypic characters and facilitate testing...

  6. Classification

    DEFF Research Database (Denmark)

    Hjørland, Birger

    2017-01-01

    This article presents and discusses definitions of the term “classification” and the related concepts “Concept/conceptualization,”“categorization,” “ordering,” “taxonomy” and “typology.” It further presents and discusses theories of classification including the influences of Aristotle...... and Wittgenstein. It presents different views on forming classes, including logical division, numerical taxonomy, historical classification, hermeneutical and pragmatic/critical views. Finally, issues related to artificial versus natural classification and taxonomic monism versus taxonomic pluralism are briefly...

  7. An Automated Cropland Classification Algorithm (ACCA) for Tajikistan by combining Landsat, MODIS, and secondary data

    Science.gov (United States)

    Thenkabail, Prasad S.; Wu, Zhuoting

    2012-01-01

    The overarching goal of this research was to develop and demonstrate an automated Cropland Classification Algorithm (ACCA) that will rapidly, routinely, and accurately classify agricultural cropland extent, areas, and characteristics (e.g., irrigated vs. rainfed) over large areas such as a country or a region through combination of multi-sensor remote sensing and secondary data. In this research, a rule-based ACCA was conceptualized, developed, and demonstrated for the country of Tajikistan using mega file data cubes (MFDCs) involving data from Landsat Global Land Survey (GLS), Landsat Enhanced Thematic Mapper Plus (ETM+) 30 m, Moderate Resolution Imaging Spectroradiometer (MODIS) 250 m time-series, a suite of secondary data (e.g., elevation, slope, precipitation, temperature), and in situ data. First, the process involved producing an accurate reference (or truth) cropland layer (TCL), consisting of cropland extent, areas, and irrigated vs. rainfed cropland areas, for the entire country of Tajikistan based on the MFDC of year 2005 (MFDC2005). The methods involved in producing the TCL included using ISOCLASS clustering, Tasseled Cap bi-spectral plots, spectro-temporal characteristics from MODIS 250 m monthly normalized difference vegetation index (NDVI) maximum value composites (MVC) time-series, and textural characteristics of higher resolution imagery. The TCL statistics accurately matched the national statistics of Tajikistan for irrigated and rainfed croplands, where about 70% of croplands were irrigated and the rest rainfed. Second, a rule-based ACCA was developed to replicate the TCL accurately (~80% producer's and user's accuracies, or within 20% quantity disagreement, involving about 10 million Landsat 30 m sized cropland pixels of Tajikistan). Development of ACCA was an iterative process involving a series of rules that are coded, refined, tweaked, and re-coded until ACCA-derived croplands (ACLs) match the TCLs accurately. Third, the ACCA derived cropland
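
    A toy rule set in the spirit of ACCA is sketched below; the thresholds are invented for illustration (the published rules were iteratively tuned against the TCL), and the NDVI stack is synthetic.

        import numpy as np

        def acca_like_rules(ndvi, green_thresh=0.35, min_green_months=3,
                            irrigation_peak=0.6):
            """ndvi: (12, H, W) monthly NDVI maximum-value composites."""
            green_months = (ndvi > green_thresh).sum(axis=0)
            cropland = green_months >= min_green_months          # rule 1: extent
            irrigated = cropland & (ndvi.max(axis=0) > irrigation_peak)  # rule 2
            classes = np.zeros(ndvi.shape[1:], dtype=np.uint8)   # 0 = non-crop
            classes[cropland] = 1                                # 1 = rainfed
            classes[irrigated] = 2                               # 2 = irrigated
            return classes

        rng = np.random.default_rng(10)
        ndvi_stack = rng.random((12, 100, 100))
        cropland_map = acca_like_rules(ndvi_stack)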

  8. An Automated Cropland Classification Algorithm (ACCA) for Tajikistan by Combining Landsat, MODIS, and Secondary Data

    Directory of Open Access Journals (Sweden)

    Prasad S. Thenkabail

    2012-09-01

    The overarching goal of this research was to develop and demonstrate an automated Cropland Classification Algorithm (ACCA) that will rapidly, routinely, and accurately classify agricultural cropland extent, areas, and characteristics (e.g., irrigated vs. rainfed) over large areas such as a country or a region through combination of multi-sensor remote sensing and secondary data. In this research, a rule-based ACCA was conceptualized, developed, and demonstrated for the country of Tajikistan using mega file data cubes (MFDCs) involving data from Landsat Global Land Survey (GLS), Landsat Enhanced Thematic Mapper Plus (ETM+) 30 m, Moderate Resolution Imaging Spectroradiometer (MODIS) 250 m time-series, a suite of secondary data (e.g., elevation, slope, precipitation, temperature), and in situ data. First, the process involved producing an accurate reference (or truth) cropland layer (TCL), consisting of cropland extent, areas, and irrigated vs. rainfed cropland areas, for the entire country of Tajikistan based on MFDC of year 2005 (MFDC2005). The methods involved in producing TCL included using ISOCLASS clustering, Tasseled Cap bi-spectral plots, spectro-temporal characteristics from MODIS 250 m monthly normalized difference vegetation index (NDVI) maximum value composites (MVC) time-series, and textural characteristics of higher resolution imagery. The TCL statistics accurately matched with the national statistics of Tajikistan for irrigated and rainfed croplands, where about 70% of croplands were irrigated and the rest rainfed. Second, a rule-based ACCA was developed to replicate the TCL accurately (~80% producer’s and user’s accuracies or within 20% quantity disagreement involving about 10 million Landsat 30 m sized cropland pixels of Tajikistan). Development of ACCA was an iterative process involving series of rules that are coded, refined, tweaked, and re-coded till ACCA derived croplands (ACLs) match accurately with TCLs. Third, the ACCA derived

  9. Use of self-organizing maps for classification of defects in the tubes from the steam generator of nuclear power plants

    International Nuclear Information System (INIS)

    Mesquita, Roberto Navarro de

    2002-01-01

    This thesis develops a new method for classifying different steam generator tube defects in nuclear power plants using eddy current test signals. The method uses self-organizing maps to compare how efficiently different signal features identify and classify these defects. A multiple-inference system is proposed that combines the classifications of maps trained on the different extracted features to infer the final defect type. The feature extraction methods used are the wavelet zero-crossing representation, linear predictive coding (LPC), and other basic time-domain signal representations such as magnitude and phase. Many feature vectors are obtained from combinations of these extracted features. These vectors are tested for defect classification, and the best ones are applied to the multiple-inference system. A systematic study of pre-processing, calibration, and analysis methods for steam generator tube defect signals in nuclear power plants is carried out. The efficiency of the method is demonstrated, and feature maps with the main prototypes are obtained for each type of steam generator tube defect. (author)
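
    As a rough illustration of the self-organizing-map approach described above, the following Python sketch uses the open-source minisom package on synthetic data; the feature dimension, defect labels, and map size are assumptions for demonstration only:

      from collections import Counter, defaultdict
      import numpy as np
      from minisom import MiniSom  # pip install minisom

      # Hypothetical eddy-current feature vectors (e.g., LPC or wavelet features),
      # one row per signal, each with a known defect label for training.
      rng = np.random.default_rng(0)
      features = rng.normal(size=(200, 16))
      labels = rng.choice(["pitting", "crack", "wear"], size=200)

      som = MiniSom(8, 8, input_len=16, sigma=1.5, learning_rate=0.5, random_seed=0)
      som.train_random(features, num_iteration=5000)

      # Label each map unit by majority vote of the training signals it wins.
      votes = defaultdict(Counter)
      for x, y in zip(features, labels):
          votes[som.winner(x)][y] += 1
      unit_label = {unit: c.most_common(1)[0][0] for unit, c in votes.items()}

      # A new signal is classified by the label of its best-matching unit.
      print(unit_label.get(som.winner(features[0]), "unknown"))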

  10. Panacea : Automating attack classification for anomaly-based network intrusion detection systems

    NARCIS (Netherlands)

    Bolzoni, D.; Etalle, S.; Hartel, P.H.; Kirda, E.; Jha, S.; Balzarotti, D.

    2009-01-01

    Anomaly-based intrusion detection systems are usually criticized because they lack a classification of attacks, thus security teams have to manually inspect any raised alert to classify it. We present a new approach, Panacea, to automatically and systematically classify attacks detected by an

  11. Panacea : Automating attack classification for anomaly-based network intrusion detection systems

    NARCIS (Netherlands)

    Bolzoni, D.; Etalle, S.; Hartel, P.H.

    2009-01-01

    Anomaly-based intrusion detection systems are usually criticized because they lack a classification of attacks, thus security teams have to manually inspect any raised alert to classify it. We present a new approach, Panacea, to automatically and systematically classify attacks detected by an

  12. Automated and unbiased classification of chemical profiles from fungi using high performance liquid chromatography

    DEFF Research Database (Denmark)

    Hansen, Michael Edberg; Andersen, Birgitte; Smedsgaard, Jørn

    2005-01-01

    In this paper we present a method for unbiased/unsupervised classification and identification of closely related fungi, using chemical analysis of secondary metabolite profiles created by HPLC with UV diode array detection. For two chromatographic data matrices a vector of locally aligned full sp...

  13. A Graphical User Interface (GUI) for Automated Classification of Bradley Fighting Vehicle Shock Absorbers

    National Research Council Canada - National Science Library

    Sincebaugh, Patrick

    1998-01-01

    .... We then explain the design and capabilities of the SSATS graphical user interface (GUI), which includes the integration of a neural network classification scheme. We finish by discussing recent results of utilizing the system to test and evaluate Bradley armored vehicle shock absorbers.

  14. An automated image processing method for classification of diabetic retinopathy stages from conjunctival microvasculature images

    Science.gov (United States)

    Khansari, Maziyar M.; O'Neill, William; Penn, Richard; Blair, Norman P.; Chau, Felix; Shahidi, Mahnaz

    2017-03-01

    The conjunctiva is a densely vascularized tissue of the eye that provides an opportunity for imaging of human microcirculation. In the current study, automated fine structure analysis of conjunctival microvasculature images was performed to discriminate stages of diabetic retinopathy (DR). The study population consisted of one group of nondiabetic control subjects (NC) and 3 groups of diabetic subjects, with no clinical DR (NDR), non-proliferative DR (NPDR), or proliferative DR (PDR). Ordinary least squares regression and Fisher linear discriminant analyses were performed to automatically discriminate images between group pairs of subjects. Human observers who were masked to the grouping of subjects performed image discrimination between group pairs. Over 80% and 70% of images of subjects with clinical and non-clinical DR were correctly discriminated by the automated method, respectively. The discrimination rates of the automated method were higher than human observers. The fine structure analysis of conjunctival microvasculature images provided discrimination of DR stages and can be potentially useful for DR screening and monitoring.
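
    A minimal sketch of the pairwise Fisher linear discriminant analysis mentioned above, using scikit-learn on synthetic stand-ins for vessel-morphology features (the feature count and group separation are assumed for illustration):

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      # Hypothetical microvasculature features (e.g., diameter, tortuosity,
      # fractal dimension) for a control group and one diabetic group.
      rng = np.random.default_rng(1)
      controls = rng.normal(0.0, 1.0, size=(40, 5))
      patients = rng.normal(0.6, 1.0, size=(40, 5))
      X = np.vstack([controls, patients])
      y = np.array([0] * 40 + [1] * 40)

      # Fisher LDA scored by cross-validated accuracy mirrors the pairwise
      # group discrimination described in the abstract.
      lda = LinearDiscriminantAnalysis()
      print(cross_val_score(lda, X, y, cv=5).mean())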

  15. Exploring repetitive DNA landscapes using REPCLASS, a tool that automates the classification of transposable elements in eukaryotic genomes.

    Science.gov (United States)

    Feschotte, Cédric; Keswani, Umeshkumar; Ranganathan, Nirmal; Guibotsy, Marcel L; Levine, David

    2009-07-23

    Eukaryotic genomes contain large amounts of repetitive DNA, most of which is derived from transposable elements (TEs). Progress has been made to develop computational tools for ab initio identification of repeat families, but there is an urgent need for tools that automate the annotation of TEs in genome sequences. Here we introduce REPCLASS, a tool that automates the classification of TE sequences. Using control repeat libraries, we show that the program can accurately classify virtually any known TE type. Combining REPCLASS with ab initio repeat finding in the genomes of Caenorhabditis elegans and Drosophila melanogaster allowed us to recover the contrasting TE landscape characteristic of these species. Unexpectedly, REPCLASS also uncovered several novel TE families in both genomes, augmenting the TE repertoire of these model species. When applied to the genomes of distant Caenorhabditis and Drosophila species, the approach revealed a remarkable conservation of TE composition profile within each genus, despite substantial interspecific variations in genome size and in the number of TEs and TE families. Lastly, we applied REPCLASS to analyze 10 fungal genomes from a wide taxonomic range, most of which have not been analyzed for TE content previously. The results showed that TE diversity varies widely across the fungal "kingdom" and appears to correlate positively with genome size, in particular for DNA transposons. Together, these data validate REPCLASS as a powerful tool to explore the repetitive DNA landscapes of eukaryotes and to shed light onto the evolutionary forces shaping TE diversity and genome architecture.

  16. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer.

    Science.gov (United States)

    Rogiers, Bart; Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain

    2017-01-01

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results.
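
    The contrast between model-based clustering and a more conventional unsupervised approach can be sketched with scikit-learn; the two CPT-like features and the cluster parameters below are assumptions, not the paper's actual setup:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.mixture import GaussianMixture

      # Hypothetical CPT features per depth interval, e.g. log cone resistance
      # and friction ratio, drawn from three overlapping "soil classes".
      rng = np.random.default_rng(2)
      X = np.vstack([rng.multivariate_normal(m, [[0.05, 0.02], [0.02, 0.08]], 300)
                     for m in ([1.0, 0.5], [2.0, 1.5], [3.0, 0.8])])

      # Model-based clustering: each soil class is a Gaussian component with its
      # own covariance, unlike k-means' implicit spherical, equal-variance clusters.
      gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
      soil_class = gmm.fit_predict(X)
      kmeans_class = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
      print(np.bincount(soil_class), np.bincount(kmeans_class))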

  17. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer.

    Directory of Open Access Journals (Sweden)

    Bart Rogiers

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches and yields at least equally accurate results.

  18. An Automated and Intelligent Medical Decision Support System for Brain MRI Scans Classification.

    Directory of Open Access Journals (Sweden)

    Muhammad Faisal Siddiqui

    A wide interest has been observed in medical health care applications that interpret neuroimaging scans by machine learning systems. This research proposes an intelligent, automatic, accurate, and robust classification technique to classify human brain magnetic resonance images (MRI) as normal or abnormal, in order to reduce human error in identifying diseases in brain MRIs. In this study, the fast discrete wavelet transform (DWT), principal component analysis (PCA), and least squares support vector machine (LS-SVM) are used as basic components. Firstly, the fast DWT is employed to extract the salient features of the brain MRI, followed by PCA, which reduces the dimensions of the features. These reduced feature vectors also shrink memory storage consumption by 99.5%. At last, an advanced classification technique based on LS-SVM is applied to brain MR image classification using the reduced features. To improve efficiency, the LS-SVM is used with a non-linear radial basis function (RBF) kernel. The proposed algorithm intelligently determines the optimized values of the hyper-parameters of the RBF kernel and also applies k-fold stratified cross validation to enhance the generalization of the system. The method was tested on benchmark datasets of T1-weighted and T2-weighted scans from 340 patients. From the analysis of experimental results and performance comparisons, it is observed that the proposed medical decision support system outperformed all other modern classifiers and achieves a 100% accuracy rate (specificity/sensitivity 100%/100%). Furthermore, in terms of computation time, the proposed technique is significantly faster than recent well-known methods, improving efficiency by 71%, 3%, and 4% in the feature extraction, feature reduction, and classification stages, respectively. These results indicate that the proposed well-trained machine learning system has the potential to make accurate predictions about brain abnormalities
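
    A minimal sketch of this DWT-PCA-SVM pipeline on synthetic images, with scikit-learn's SVC (RBF kernel) standing in for the LS-SVM used in the paper; image size, wavelet choice, and parameter values are assumptions:

      import numpy as np
      import pywt  # PyWavelets
      from sklearn.decomposition import PCA
      from sklearn.model_selection import StratifiedKFold, cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import SVC

      def dwt_features(image, wavelet="db4", level=3):
          """Use the flattened level-3 approximation sub-band as the feature vector."""
          coeffs = pywt.wavedec2(image, wavelet, level=level)
          return coeffs[0].ravel()

      # Hypothetical 64x64 brain MR slices with normal/abnormal labels.
      rng = np.random.default_rng(3)
      images = rng.normal(size=(60, 64, 64))
      y = rng.integers(0, 2, size=60)
      X = np.array([dwt_features(im) for im in images])

      # PCA for dimensionality reduction, then an RBF-kernel SVM, evaluated with
      # stratified k-fold cross validation as in the abstract.
      model = make_pipeline(PCA(n_components=20), SVC(kernel="rbf", C=10.0, gamma="scale"))
      cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
      print(cross_val_score(model, X, y, cv=cv).mean())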

  19. Automated Classification and Removal of EEG Artifacts With SVM and Wavelet-ICA.

    Science.gov (United States)

    Sai, Chong Yeh; Mokhtar, Norrima; Arof, Hamzah; Cumming, Paul; Iwahashi, Masahiro

    2018-05-01

    Brain electrical activity recordings by electroencephalography (EEG) are often contaminated with signal artifacts. Procedures for automated removal of EEG artifacts are frequently sought for clinical diagnostics and brain-computer interface applications. In recent years, a combination of independent component analysis (ICA) and the discrete wavelet transform has been introduced as a standard technique for EEG artifact removal. However, in performing the wavelet-ICA procedure, visual inspection or arbitrary thresholding may be required to identify artifactual components in the EEG signal. We now propose a novel approach for identifying artifactual components separated by wavelet-ICA using a pretrained support vector machine (SVM). Our method presents a robust and extendable system that enables fully automated identification and removal of artifacts from EEG signals, without applying any arbitrary thresholding. Using test data contaminated by eye blink artifacts, we show that our method performed better in identifying artifactual components than did existing thresholding methods. Furthermore, wavelet-ICA in conjunction with SVM successfully removed target artifacts, while largely retaining the EEG source signals of interest. We propose a set of features including kurtosis, variance, Shannon's entropy, and range of amplitude as training and test data of the SVM to identify eye blink artifacts in EEG signals. This combinatorial method is also extendable to accommodate multiple types of artifacts present in multichannel EEG. We envision future research to explore other descriptive features corresponding to other types of artifactual components.
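
    The proposed feature set (kurtosis, variance, Shannon entropy, amplitude range) lends itself to a short sketch; the synthetic "components" below are assumptions standing in for real wavelet-ICA outputs:

      import numpy as np
      from scipy.stats import kurtosis
      from sklearn.svm import SVC

      def component_features(x):
          """Kurtosis, variance, Shannon entropy, and amplitude range of one
          candidate component, per the feature list in the abstract."""
          hist, _ = np.histogram(x, bins=64, density=True)
          p = hist[hist > 0]
          p = p / p.sum()
          shannon = -np.sum(p * np.log2(p))
          return [kurtosis(x), np.var(x), shannon, np.ptp(x)]

      # Hypothetical components: label 1 = eye-blink artifact, 0 = neural signal.
      rng = np.random.default_rng(4)
      neural = [rng.normal(size=1000) for _ in range(50)]
      blinks = [np.r_[rng.normal(size=450), 8 * np.hanning(100), rng.normal(size=450)]
                for _ in range(50)]
      X = np.array([component_features(c) for c in neural + blinks])
      y = np.array([0] * 50 + [1] * 50)

      clf = SVC(kernel="rbf", gamma="scale").fit(X, y)  # flags artifactual components
      print(clf.score(X, y))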

  20. Automated Classification and Analysis of Non-metallic Inclusion Data Sets

    Science.gov (United States)

    Abdulsalam, Mohammad; Zhang, Tongsheng; Tan, Jia; Webler, Bryan A.

    2018-05-01

    The aim of this study is to utilize principal component analysis (PCA), clustering methods, and correlation analysis to condense and examine large, multivariate data sets produced from automated analysis of non-metallic inclusions. Non-metallic inclusions play a major role in defining the properties of steel, and their examination has been greatly aided by automated analysis in scanning electron microscopes equipped with energy dispersive X-ray spectroscopy. The methods were applied to analyze inclusions on two sets of samples: two laboratory-scale samples and four industrial samples from near-finished 4140 alloy steel components with varying machinability. The laboratory samples had well-defined inclusion chemistries, composed of MgO-Al2O3-CaO, spinel (MgO-Al2O3), and calcium aluminate inclusions. The industrial samples contained MnS inclusions as well as (Ca,Mn)S + calcium aluminate oxide inclusions. PCA could be used to reduce inclusion chemistry variables to a 2D plot, which revealed inclusion chemistry groupings in the samples. Clustering methods were used to automatically classify inclusion chemistry measurements into groups, i.e., no user-defined rules were required.
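
    A compact sketch of the PCA-plus-clustering workflow on hypothetical inclusion chemistries (the compositions and cluster count are invented for illustration):

      import numpy as np
      from sklearn.cluster import AgglomerativeClustering
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # Hypothetical per-inclusion chemistries (wt% MgO, Al2O3, CaO, MnS) for
      # two inclusion populations, mimicking automated SEM/EDS output.
      rng = np.random.default_rng(5)
      spinel = rng.dirichlet([6, 10, 1, 1], size=150) * 100
      calcium_aluminate = rng.dirichlet([1, 8, 8, 1], size=150) * 100
      X = np.vstack([spinel, calcium_aluminate])

      # Condense the composition variables to two principal components for a
      # 2-D chemistry plot, then group inclusions without user-defined rules.
      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
      groups = AgglomerativeClustering(n_clusters=2).fit_predict(scores)
      print(np.bincount(groups))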

  1. Automated fault extraction and classification using 3-D seismic data for the Ekofisk field development

    Energy Technology Data Exchange (ETDEWEB)

    Signer, C.; Nickel, M.; Randen, T.; Saeter, T.; Soenneland, H.H.

    1998-12-31

    Mapping of fractures is important for the prediction of fluid flow in many reservoir types. The fluid flow depends mainly on the efficiency of the reservoir seals. Improved spatial mapping of the open and closed fracture systems will allow a better prediction of the fluid flow pattern. The primary objective of this paper is to present fracture characterization at the reservoir scale combined with seismic facies mapping. The complexity of the giant Ekofisk field on the Norwegian continental shelf provides an ideal framework for testing the validity and the applicability of an automated seismic fault and fracture detection and mapping tool. The mapping of the faults can be based on seismic attribute grids, which means that attribute responses related to faults are extracted along key horizons which were interpreted in the reservoir interval. 3 refs., 3 figs.

  2. Automated Region of Interest Retrieval of Metallographic Images for Quality Classification in Industry

    Directory of Open Access Journals (Sweden)

    Petr Kotas

    2012-01-01

    The aim of this research is the development and testing of new methods to classify the quality of metallographic samples of steels with high added value (for example, grades X70 according to API). In this paper, we address the development of methods to classify the quality of slab sample images, with the main emphasis on the quality of the image center, called the segregation area. For this purpose, we introduce an alternative method for automated retrieval of the region of interest. In the first step, the metallographic image is segmented using both a spectral method and thresholding. Then, the extracted macrostructure of the metallographic image is automatically analyzed by statistical methods. Finally, the automatically extracted regions of interest are compared with the results of human experts. Practical experience with the retrieval of non-homogeneous, noisy digital images in an industrial environment is discussed as well.

  3. A controlled trial of automated classification of negation from clinical notes

    Directory of Open Access Journals (Sweden)

    Carruth William

    2005-05-01

    Abstract Background Identification of negation in electronic health records is essential if we are to understand the computable meaning of the records. Our objective is to compare the accuracy of an automated mechanism for assignment of negation to clinical concepts within a compositional expression against human-assigned negation, and to perform a failure analysis to identify the causes of poorly identified negation (i.e., missed conceptual representation, inaccurate conceptual representation, missed negation, or inaccurate identification of negation). Methods 41 clinical documents (medical evaluations; outside of Mayo these are sometimes referred to as history and physical examinations) were parsed using the Mayo Vocabulary Server Parsing Engine. SNOMED-CT™ was used to provide concept coverage for the clinical concepts in the records, which resulted in identification of concepts and textual clues to negation. These records were reviewed by an independent medical terminologist, and the results were tallied in a spreadsheet. Where questions arose on review, Internal Medicine faculty were employed to make a final determination. Results SNOMED-CT was used to provide concept coverage of the 14,792 concepts in 41 health records from Johns Hopkins University. Of these, 1,823 concepts were identified as negative by human review. The sensitivity (recall) of the assignment of negation was 97.2%. Conclusion Automated assignment of negation to concepts identified in health records based on review of the text is feasible and practical. Lexical assignment of negation is a good test of true negativity, as judged by the high sensitivity, specificity and positive likelihood ratio of the test. SNOMED-CT had overall coverage of 88.7% of the concepts being negated.

  4. Towards automated human gait disease classification using phase space representation of intrinsic mode functions

    Science.gov (United States)

    Pratiher, Sawon; Patra, Sayantani; Pratiher, Souvik

    2017-06-01

    A novel analytical methodology for distinguishing healthy from neurologically disordered gait patterns is proposed, employing a set of oscillating components called intrinsic mode functions (IMFs). These IMFs are generated by empirical mode decomposition of the gait time series, and the Hilbert-transformed analytic signal representation forms the complex-plane trace of the elliptically shaped analytic IMFs. The area measure and the relative change in the centroid position of the polygon formed by the convex hull of these analytic IMFs are taken as the discriminative features. A classification accuracy of 79.31% with an ensemble-learning-based AdaBoost classifier validates the adequacy of the proposed methodology for a computer-aided diagnostic (CAD) system for gait pattern identification. The efficacy of several potential biomarkers, such as the bandwidth of the amplitude-modulation and frequency-modulation IMFs and their mean frequency from the Fourier-Bessel expansion of each analytic IMF, is also discussed for its potency in gait pattern identification and classification.
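
    A rough sketch of the described pipeline, assuming the open-source PyEMD package and synthetic gait series; the features mirror the abstract (convex-hull area and centroid position of each analytic IMF's complex-plane trace), but every signal parameter is illustrative:

      import numpy as np
      from PyEMD import EMD                     # pip install EMD-signal
      from scipy.signal import hilbert
      from scipy.spatial import ConvexHull
      from sklearn.ensemble import AdaBoostClassifier

      def gait_features(series, n_imfs=3):
          """Convex-hull area and centroid distance of each analytic IMF's
          complex-plane trace; zero-padded if fewer IMFs are extracted."""
          feats = []
          for imf in EMD()(series)[:n_imfs]:
              z = hilbert(imf)
              pts = np.c_[z.real, z.imag]
              hull = ConvexHull(pts)
              centroid = pts[hull.vertices].mean(axis=0)
              feats += [hull.volume, np.linalg.norm(centroid)]  # .volume is area in 2-D
          return feats + [0.0] * (2 * n_imfs - len(feats))

      # Hypothetical stride-interval series for healthy vs. disordered gait.
      rng = np.random.default_rng(6)
      t = np.linspace(0, 10, 600)
      healthy = [np.sin(2 * np.pi * t) + 0.1 * rng.normal(size=t.size) for _ in range(20)]
      disordered = [np.sin(2 * np.pi * t + np.cumsum(0.05 * rng.normal(size=t.size)))
                    for _ in range(20)]
      X = np.array([gait_features(s) for s in healthy + disordered])
      y = np.array([0] * 20 + [1] * 20)
      print(AdaBoostClassifier(n_estimators=100).fit(X, y).score(X, y))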

  5. Automated classification of maxillofacial cysts in cone beam CT images using contourlet transformation and Spherical Harmonics.

    Science.gov (United States)

    Abdolali, Fatemeh; Zoroofi, Reza Aghaeizadeh; Otake, Yoshito; Sato, Yoshinobu

    2017-02-01

    Accurate detection of maxillofacial cysts is an essential step for diagnosis, monitoring and planning therapeutic intervention. Cysts can be of various sizes and shapes, and existing detection methods lead to poor results. Customizing automatic detection systems to gain sufficient accuracy in clinical practice is highly challenging. For this purpose, integrating engineering knowledge in efficient feature extraction is essential. This paper presents a novel framework for maxillofacial cyst detection. A hybrid methodology based on surface and texture information is introduced. The proposed approach consists of three main steps as follows: At first, each cystic lesion is segmented with high accuracy. Then, in the second and third steps, feature extraction and classification are performed. Contourlet and SPHARM coefficients are utilized as texture and shape features which are fed into the classifier. Two different classifiers are used in this study, i.e. support vector machine and sparse discriminant analysis. Generally, SPHARM coefficients are estimated by the iterative residual fitting (IRF) algorithm, which is based on a stepwise regression method. In order to improve the accuracy of IRF estimation, a method based on extra orthogonalization is employed to reduce linear dependency. We have utilized a ground-truth dataset consisting of cone beam CT images of 96 patients, belonging to three maxillofacial cyst categories: radicular cyst, dentigerous cyst and keratocystic odontogenic tumor. Using orthogonalized SPHARM, the residual sum of squares is decreased, which leads to a more accurate estimation. Analysis of the results based on statistical measures such as specificity, sensitivity, positive predictive value and negative predictive value is reported. A classification rate of 96.48% is achieved using sparse discriminant analysis and orthogonalized SPHARM features. Classification accuracy improved by at least 8.94% with respect to conventional features. This study

  6. Challenges in the automated classification of variable stars in large databases

    Directory of Open Access Journals (Sweden)

    Graham Matthew

    2017-01-01

    With ever-increasing numbers of astrophysical transient surveys, new facilities and archives of astronomical time series, time domain astronomy is emerging as a mainstream discipline. However, the sheer volume of data alone – hundreds of observations for hundreds of millions of sources – necessitates advanced statistical and machine learning methodologies for scientific discovery: characterization, categorization, and classification. Whilst these techniques are slowly entering the astronomer’s toolkit, their application to astronomical problems is not without its issues. In this paper, we will review some of the challenges posed by trying to identify variable stars in large data collections, including appropriate feature representations, dealing with uncertainties, establishing ground truths, and simple discrete classes.

  7. Automated Classification of Variable Stars in the Asteroseismology Program of the Kepler Space Mission

    DEFF Research Database (Denmark)

    Blomme, J.; Debosscher, J.; De Ridder, J.

    2010-01-01

    missions, are capable of identifying the most common types of stellar variability in a reliable way. Many new variables have been discovered, among which a large fraction are eclipsing/ellipsoidal binaries unknown prior to launch. A comparison is made between our classification from the Kepler data...... and the pre-launch class based on data from the ground, showing that the latter needs significant improvement. The noise properties of the Kepler data are compared to those of the exoplanet program of the CoRoT satellite.We find that Kepler improves on CoRoT by a factor of 2–2.3 in point-to-point scatter....

  8. An Automated Strategy for Unbiased Morphometric Analyses and Classifications of Growth Cones In Vitro.

    Directory of Open Access Journals (Sweden)

    Daryan Chitsaz

    During neural circuit development, attractive or repulsive guidance cue molecules direct growth cones (GCs) to their targets by eliciting cytoskeletal remodeling, which is reflected in their morphology. The experimental power of in vitro neuronal cultures to assay this process and its molecular mechanisms is well established; however, a method to rapidly find and quantify multiple morphological aspects of GCs is lacking. To this end, we have developed a free, easy to use, and fully automated Fiji macro, Conographer, which accurately identifies and measures many morphological parameters of GCs in 2D explant culture images. These measurements are then subjected to principal component analysis and k-means clustering to mathematically classify the GCs as "collapsed" or "extended". The morphological parameters measured for each GC are found to be significantly different between collapsed and extended GCs, and are sufficient to classify GCs as such with the same level of accuracy as human observers. Application of a known collapse-inducing ligand results in significant changes in all parameters, resulting in an increase in 'collapsed' GCs determined by k-means clustering, as expected. Our strategy provides a powerful tool for exploring the relationship between GC morphology and guidance cue signaling, which in particular will greatly facilitate high-throughput studies of the effects of drugs, gene silencing or overexpression, or any other experimental manipulation in the context of an in vitro axon guidance assay.

  9. An Automated Strategy for Unbiased Morphometric Analyses and Classifications of Growth Cones In Vitro.

    Science.gov (United States)

    Chitsaz, Daryan; Morales, Daniel; Law, Chris; Kania, Artur

    2015-01-01

    During neural circuit development, attractive or repulsive guidance cue molecules direct growth cones (GCs) to their targets by eliciting cytoskeletal remodeling, which is reflected in their morphology. The experimental power of in vitro neuronal cultures to assay this process and its molecular mechanisms is well established; however, a method to rapidly find and quantify multiple morphological aspects of GCs is lacking. To this end, we have developed a free, easy to use, and fully automated Fiji macro, Conographer, which accurately identifies and measures many morphological parameters of GCs in 2D explant culture images. These measurements are then subjected to principal component analysis and k-means clustering to mathematically classify the GCs as "collapsed" or "extended". The morphological parameters measured for each GC are found to be significantly different between collapsed and extended GCs, and are sufficient to classify GCs as such with the same level of accuracy as human observers. Application of a known collapse-inducing ligand results in significant changes in all parameters, resulting in an increase in 'collapsed' GCs determined by k-means clustering, as expected. Our strategy provides a powerful tool for exploring the relationship between GC morphology and guidance cue signaling, which in particular will greatly facilitate high-throughput studies of the effects of drugs, gene silencing or overexpression, or any other experimental manipulation in the context of an in vitro axon guidance assay.

  10. Automated Thermal Image Processing for Detection and Classification of Birds and Bats - FY2012 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Duberstein, Corey A.; Matzner, Shari; Cullinan, Valerie I.; Virden, Daniel J.; Myers, Joshua R.; Maxwell, Adam R.

    2012-09-01

    Surveying wildlife at risk from offshore wind energy development is difficult and expensive. Infrared video can be used to record birds and bats that pass through the camera view, but it is also time consuming and expensive to review video and determine what was recorded. We proposed to conduct algorithm and software development to identify and to differentiate thermally detected targets of interest that would allow automated processing of thermal image data to enumerate birds, bats, and insects. During FY2012 we developed computer code within MATLAB to identify objects recorded in video and extract attribute information that describes the objects recorded. We tested the efficiency of track identification using observer-based counts of tracks within segments of sample video. We examined object attributes, modeled the effects of random variability on attributes, and produced data smoothing techniques to limit random variation within attribute data. We also began drafting and testing methodology to identify objects recorded on video. We also recorded approximately 10 hours of infrared video of various marine birds, passerine birds, and bats near the Pacific Northwest National Laboratory (PNNL) Marine Sciences Laboratory (MSL) at Sequim, Washington. A total of 6 hours of bird video was captured overlooking Sequim Bay over a series of weeks. An additional 2 hours of video of birds was also captured during two weeks overlooking Dungeness Bay within the Strait of Juan de Fuca. Bats and passerine birds (swallows) were also recorded at dusk on the MSL campus during nine evenings. An observer noted the identity of objects viewed through the camera concurrently with recording. These video files will provide the information necessary to produce and test software developed during FY2013. The annotation will also form the basis for creation of a method to reliably identify recorded objects.

  11. Automated web usage data mining and recommendation system using K-Nearest Neighbor (KNN) classification method

    Directory of Open Access Journals (Sweden)

    D.A. Adeniyi

    2016-01-01

    The major problem of many on-line web sites is the presentation of many choices to the client at a time; this usually results in a strenuous and time-consuming task of finding the right product or information on the site. In this work, we present a study of automatic web usage data mining and a recommendation system based on the current user's behavior through his/her click-stream data on a newly developed Really Simple Syndication (RSS) reader website, in order to provide relevant information to the individual without explicitly asking for it. The K-Nearest-Neighbor (KNN) classification method has been trained to be used on-line and in real time to identify clients'/visitors' click-stream data, matching it to a particular user group and recommending a tailored browsing option that meets the need of the specific user at a particular time. To achieve this, web users' RSS address files were extracted, cleansed, formatted and grouped into meaningful sessions, and a data mart was developed. Our results show that the K-Nearest-Neighbor classifier is transparent, consistent, straightforward, simple to understand, has a high tendency to possess desirable qualities, and is easier to implement than most other machine learning techniques, specifically when there is little or no prior knowledge about the data distribution.
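
    The core KNN matching step can be sketched in a few lines with scikit-learn; the session features and user groups below are invented for illustration and are not the paper's data mart schema:

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      # Hypothetical click-stream sessions: dwell time (s), pages viewed, and the
      # fraction of sports/news/tech items clicked, each with a known user group.
      X_train = np.array([[120.0,  8, 0.7, 0.2, 0.1],
                          [300.0, 15, 0.1, 0.8, 0.1],
                          [ 45.0,  3, 0.1, 0.1, 0.8],
                          [200.0, 10, 0.6, 0.3, 0.1],
                          [260.0, 12, 0.2, 0.7, 0.1]])
      y_train = ["sports", "news", "tech", "sports", "news"]

      knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

      # An active visitor's session is matched to the nearest user group so the
      # site can recommend that group's tailored browsing options in real time.
      print(knn.predict([[150.0, 9, 0.65, 0.25, 0.10]]))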

  12. ViCTree: An automated framework for taxonomic classification from protein sequences.

    Science.gov (United States)

    Modha, Sejal; Thanki, Anil; Cotmore, Susan F; Davison, Andrew J; Hughes, Joseph

    2018-02-20

    The increasing rate of submission of genetic sequences into public databases is providing a growing resource for classifying the organisms that these sequences represent. To aid viral classification, we have developed ViCTree, which automatically integrates the relevant sets of sequences in NCBI GenBank and transforms them into an interactive maximum likelihood phylogenetic tree that can be updated automatically. ViCTree incorporates ViCTreeView, which is a JavaScript-based visualisation tool that enables the tree to be explored interactively in the context of pairwise distance data. To demonstrate utility, ViCTree was applied to subfamily Densovirinae of family Parvoviridae. This led to the identification of six new species of insect virus. ViCTree is open-source and can be run on any Linux- or Unix-based computer or cluster. A tutorial, the documentation and the source code are available under a GPL3 license, and can be accessed at http://bioinformatics.cvr.ac.uk/victree_web/. sejal.modha@glasgow.ac.uk.

  13. Automated Image Sampling and Classification Can Be Used to Explore Perceived Naturalness of Urban Spaces.

    Directory of Open Access Journals (Sweden)

    Roger Hyam

    The psychological restorative effects of exposure to nature are well established and extend to the mere viewing of images of nature. A previous study has shown that the Perceived Naturalness (PN) of images correlates with their restorative value. This study tests whether it is possible to detect the degree of PN of images using an image classifier. It takes images that have been scored by humans for PN (including a subset that has been assessed for restorative value) and passes them through the Google Vision API image classification service. The resulting labels are assigned to broad semantic classes to create a Calculated Semantic Naturalness (CSN) metric for each image. It was found that CSN correlates with PN. CSN was then calculated for a geospatial sampling of Google Street View images across the city of Edinburgh. CSN was found to correlate with PN in this sample also, indicating that the technique may be useful in large-scale studies. Because CSN correlates with PN, which correlates with restorativeness, it is suggested that CSN or a similar measure may be useful in automatically detecting restorative images and locations. In an exploratory aside, CSN was not found to correlate with an indicator of socioeconomic deprivation.

  14. Automated age-related macular degeneration classification in OCT using unsupervised feature learning

    Science.gov (United States)

    Venhuizen, Freerk G.; van Ginneken, Bram; Bloemen, Bart; van Grinsven, Mark J. J. P.; Philipsen, Rick; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I.

    2015-03-01

    Age-related Macular Degeneration (AMD) is a common eye disorder with high prevalence in elderly people. The disease mainly affects the central part of the retina, and could ultimately lead to permanent vision loss. Optical Coherence Tomography (OCT) is becoming the standard imaging modality in diagnosis of AMD and the assessment of its progression. However, the evaluation of the obtained volumetric scan is time consuming, expensive and the signs of early AMD are easy to miss. In this paper we propose a classification method to automatically distinguish AMD patients from healthy subjects with high accuracy. The method is based on an unsupervised feature learning approach, and processes the complete image without the need for an accurate pre-segmentation of the retina. The method can be divided in two steps: an unsupervised clustering stage that extracts a set of small descriptive image patches from the training data, and a supervised training stage that uses these patches to create a patch occurrence histogram for every image on which a random forest classifier is trained. Experiments using 384 volume scans show that the proposed method is capable of identifying AMD patients with high accuracy, obtaining an area under the receiver operating characteristic (ROC) curve of 0.984. Our method allows for a quick and reliable assessment of the presence of AMD pathology in OCT volume scans without the need for accurate layer segmentation algorithms.
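
    A minimal sketch of the two-stage approach, an unsupervised patch dictionary followed by a supervised random forest on patch-occurrence histograms; the synthetic arrays stand in for OCT scans, and the patch size and dictionary size are assumptions:

      import numpy as np
      from sklearn.cluster import MiniBatchKMeans
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_extraction.image import extract_patches_2d

      rng = np.random.default_rng(7)
      scans = rng.normal(size=(40, 64, 64))   # hypothetical OCT B-scans
      y = rng.integers(0, 2, size=40)         # 1 = AMD, 0 = healthy

      def patches_of(scan):
          return extract_patches_2d(scan, (8, 8), max_patches=100,
                                    random_state=0).reshape(100, -1)

      # Unsupervised stage: cluster small image patches into a visual dictionary.
      dictionary = MiniBatchKMeans(n_clusters=32, n_init=3, random_state=0)
      dictionary.fit(np.vstack([patches_of(s) for s in scans]))

      # Supervised stage: represent each scan as a patch-occurrence histogram
      # over the dictionary, then train a random forest on the histograms.
      X = np.array([np.bincount(dictionary.predict(patches_of(s)), minlength=32)
                    for s in scans])
      print(RandomForestClassifier(random_state=0).fit(X, y).score(X, y))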

  15. A novel scheme for the validation of an automated classification method for epileptic spikes by comparison with multiple observers.

    Science.gov (United States)

    Sharma, Niraj K; Pedreira, Carlos; Centeno, Maria; Chaudhary, Umair J; Wehner, Tim; França, Lucas G S; Yadee, Tinonkorn; Murta, Teresa; Leite, Marco; Vos, Sjoerd B; Ourselin, Sebastien; Diehl, Beate; Lemieux, Louis

    2017-07-01

    To validate the application of an automated neuronal spike classification algorithm, Wave_clus (WC), on interictal epileptiform discharges (IED) obtained from human intracranial EEG (icEEG) data. Five 10-min segments of icEEG recorded in 5 patients were used. WC and three expert EEG reviewers independently classified one hundred IED events into IED classes or non-IEDs. First, we determined whether WC-human agreement variability falls within inter-reviewer agreement variability by calculating the variation of information for each classifier pair and quantifying the overlap between all WC-reviewer and all reviewer-reviewer pairs. Second, we compared WC and EEG reviewers' spike identification and individual spike class labels visually and quantitatively. The overlap between all WC-human pairs and all human pairs was >80% for 3/5 patients and >58% for the other 2 patients, demonstrating that WC falls within inter-human variation. The average sensitivity of spike marking was 91% for WC and >87% for all three EEG reviewers. Finally, there was a strong visual and quantitative similarity between WC and the EEG reviewers. WC's performance is indistinguishable from that of the EEG reviewers, suggesting it could be a valid clinical tool for the assessment of IEDs. WC can be used to provide quantitative analysis of epileptic spikes. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
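
    The variation-of-information measure used above to compare classifier pairs can be computed directly from two label vectors; a small sketch with made-up class labels:

      import numpy as np
      from scipy.stats import entropy
      from sklearn.metrics import mutual_info_score

      def variation_of_information(a, b):
          """VI(A, B) = H(A) + H(B) - 2 I(A; B), in nats: 0 for identical
          partitions, larger values for greater disagreement."""
          return (entropy(np.bincount(a)) + entropy(np.bincount(b))
                  - 2 * mutual_info_score(a, b))

      # Hypothetical class labels for ten IED events from two classifiers.
      wave_clus = np.array([0, 0, 1, 1, 2, 2, 0, 1, 2, 0])
      reviewer = np.array([0, 0, 1, 1, 2, 0, 0, 1, 2, 1])
      print(variation_of_information(wave_clus, reviewer))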

  16. Machine Learning of Human Pluripotent Stem Cell-Derived Engineered Cardiac Tissue Contractility for Automated Drug Classification

    Directory of Open Access Journals (Sweden)

    Eugene K. Lee

    2017-11-01

    Accurately predicting the cardioactive effects of new molecular entities for therapeutics remains a daunting challenge. Immense research effort has been focused toward creating new screening platforms that utilize human pluripotent stem cell (hPSC)-derived cardiomyocytes and three-dimensional engineered cardiac tissue constructs to better recapitulate human heart function and drug responses. As these new platforms become increasingly sophisticated and high throughput, the drug screens result in larger multidimensional datasets. Improved automated analysis methods must therefore be developed in parallel to fully comprehend the cellular response across a multidimensional parameter space. Here, we describe the use of machine learning to comprehensively analyze 17 functional parameters derived from force readouts of hPSC-derived ventricular cardiac tissue strips (hvCTS) electrically paced at a range of frequencies and exposed to a library of compounds. A generated metric is effective for determining the cardioactivity of a given drug. Furthermore, we demonstrate a classification model that can automatically predict the mechanistic action of an unknown cardioactive drug.

  17. Segmentation methodology for automated classification and differentiation of soft tissues in multiband images of high-resolution ultrasonic transmission tomography.

    Science.gov (United States)

    Jeong, Jeong-Won; Shin, Dae C; Do, Synho; Marmarelis, Vasilis Z

    2006-08-01

    This paper presents a novel segmentation methodology for automated classification and differentiation of soft tissues using multiband data obtained with the newly developed system of high-resolution ultrasonic transmission tomography (HUTT) for imaging biological organs. This methodology extends and combines two existing approaches: the L-level set active contour (AC) segmentation approach and the agglomerative hierarchical kappa-means approach for unsupervised clustering (UC). To prevent the trapping of the current iterative minimization AC algorithm in a local minimum, we introduce a multiresolution approach that applies the level set functions at successively increasing resolutions of the image data. The resulting AC clusters are subsequently rearranged by the UC algorithm that seeks the optimal set of clusters yielding the minimum within-cluster distances in the feature space. The presented results from Monte Carlo simulations and experimental animal-tissue data demonstrate that the proposed methodology outperforms other existing methods without depending on heuristic parameters and provides a reliable means for soft tissue differentiation in HUTT images.

  18. An automated form of video image analysis applied to classification of movement disorders.

    Science.gov (United States)

    Chang, R; Guan, L; Burne, J A

    Video image analysis is able to provide quantitative data on postural and movement abnormalities and thus has an important application in neurological diagnosis and management. The conventional techniques require patients to be videotaped while wearing markers in a highly structured laboratory environment. This restricts the utility of video in routine clinical practice. We have begun development of intelligent software which aims to provide a more flexible system able to quantify human posture and movement directly from whole-body images without markers and in an unstructured environment. The steps involved are to extract complete human profiles from video frames, to fit skeletal frameworks to the profiles and derive joint angles and swing distances. By this means a given posture is reduced to a set of basic parameters that can provide input to a neural network classifier. To test the system's performance, we videotaped patients with dopa-responsive Parkinsonism and age-matched normals during several gait cycles, to yield 61 patient and 49 normal postures. These postures were reduced to their basic parameters and fed to the neural network classifier in various combinations. The optimal parameter sets (consisting of both swing distances and joint angles) yielded successful classification of normals and patients with an accuracy above 90%. This result demonstrated the feasibility of the approach. The technique has the potential to guide clinicians on the relative sensitivity of specific postural/gait features in diagnosis. Future studies will aim to improve the robustness of the system in providing accurate parameter estimates from subjects wearing a range of clothing, and to further improve discrimination by incorporating more stages of the gait cycle into the analysis.

  19. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    Science.gov (United States)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus clouds (CB) are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high spatial resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial frequency based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.

  20. Automated time activity classification based on global positioning system (GPS) tracking data.

    Science.gov (United States)

    Wu, Jun; Jiang, Chengsheng; Houston, Douglas; Baker, Dean; Delfino, Ralph

    2011-11-14

    Air pollution epidemiological studies are increasingly using global positioning system (GPS) to collect time-location data because they offer continuous tracking, high temporal resolution, and minimum reporting burden for participants. However, substantial uncertainties in the processing and classifying of raw GPS data create challenges for reliably characterizing time activity patterns. We developed and evaluated models to classify people's major time activity patterns from continuous GPS tracking data. We developed and evaluated two automated models to classify major time activity patterns (i.e., indoor, outdoor static, outdoor walking, and in-vehicle travel) based on GPS time activity data collected under free living conditions for 47 participants (N = 131 person-days) from the Harbor Communities Time Location Study (HCTLS) in 2008 and supplemental GPS data collected from three UC-Irvine research staff (N = 21 person-days) in 2010. Time activity patterns used for model development were manually classified by research staff using information from participant GPS recordings, activity logs, and follow-up interviews. We evaluated two models: (a) a rule-based model that developed user-defined rules based on time, speed, and spatial location, and (b) a random forest decision tree model. Indoor, outdoor static, outdoor walking and in-vehicle travel activities accounted for 82.7%, 6.1%, 3.2% and 7.2% of manually-classified time activities in the HCTLS dataset, respectively. The rule-based model classified indoor and in-vehicle travel periods reasonably well (Indoor: sensitivity > 91%, specificity > 80%, and precision > 96%; in-vehicle travel: sensitivity > 71%, specificity > 99%, and precision > 88%), but the performance was moderate for outdoor static and outdoor walking predictions. No striking differences in performance were observed between the rule-based and the random forest models. The random forest model was fast and easy to execute, but was likely less robust

  1. Automated system for lung nodules classification based on wavelet feature descriptor and support vector machine.

    Science.gov (United States)

    Madero Orozco, Hiram; Vergara Villegas, Osslan Osiris; Cruz Sánchez, Vianey Guadalupe; Ochoa Domínguez, Humberto de Jesús; Nandayapa Alfaro, Manuel de Jesús

    2015-02-12

    Lung cancer is a leading cause of death worldwide; it refers to the uncontrolled growth of abnormal cells in the lung. A computed tomography (CT) scan of the thorax is the most sensitive method for detecting cancerous lung nodules. A lung nodule is a round lesion which can be either non-cancerous or cancerous. In the CT, the lung cancer is observed as round white shadow nodules. The possibility to obtain a manually accurate interpretation from CT scans demands a big effort by the radiologist and might be a fatiguing process. Therefore, the design of a computer-aided diagnosis (CADx) system would be helpful as a second opinion tool. The stages of the proposed CADx are: a supervised extraction of the region of interest to eliminate the shape differences among CT images. The Daubechies db1, db2, and db4 wavelet transforms are computed with one and two levels of decomposition. After that, 19 features are computed from each wavelet sub-band. Then, the sub-band and attribute selection is performed. As a result, 11 features are selected and combined in pairs as inputs to the support vector machine (SVM), which is used to distinguish CT images containing cancerous nodules from those not containing nodules. The clinical data set used for experiments consists of 45 CT scans from ELCAP and LIDC. For the training stage 61 CT images were used (36 with cancerous lung nodules and 25 without lung nodules). The system performance was tested with 45 CT scans (23 CT scans with lung nodules and 22 without nodules), different from that used for training. The results obtained show that the methodology successfully classifies cancerous nodules with a diameter from 2 mm to 30 mm. The total preciseness obtained was 82%; the sensitivity was 90.90%, whereas the specificity was 73.91%. The CADx system presented is competitive with other literature systems in terms of sensitivity. The system reduces the complexity of classification by not performing the typical segmentation stage of most CADx

  2. Population-based evaluation of a suggested anatomic and clinical classification of congenital heart defects based on the International Paediatric and Congenital Cardiac Code

    Directory of Open Access Journals (Sweden)

    Goffinet François

    2011-10-01

    Abstract Background Classification of the overall spectrum of congenital heart defects (CHD) has always been challenging, in part because of the diversity of the cardiac phenotypes, but also because of the oft-complex associations. The purpose of our study was to establish a comprehensive and easy-to-use classification of CHD for clinical and epidemiological studies based on the long list of the International Paediatric and Congenital Cardiac Code (IPCCC). Methods We coded each individual malformation using six-digit codes from the long list of IPCCC. We then regrouped all lesions into 10 categories and 23 subcategories according to a multi-dimensional approach encompassing anatomic, diagnostic and therapeutic criteria. This anatomic and clinical classification of congenital heart disease (ACC-CHD) was then applied to data acquired from a population-based cohort of patients with CHD in France, made up of 2867 cases (82% live births, 1.8% stillbirths and 16.2% pregnancy terminations). Results The majority of cases (79.5%) could be identified with a single IPCCC code. The category "Heterotaxy, including isomerism and mirror-imagery" was the only one that typically required more than one code for identification of cases. The two largest categories were "ventricular septal defects" (52%) and "anomalies of the outflow tracts and arterial valves" (20% of cases). Conclusion Our proposed classification is not new, but rather a regrouping of the known spectrum of CHD into a manageable number of categories based on anatomic and clinical criteria. The classification is designed to use the code numbers of the long list of IPCCC but can accommodate ICD-10 codes. Its exhaustiveness, simplicity, and anatomic basis make it useful for clinical and epidemiologic studies, including those aimed at assessment of risk factors and outcomes.

  3. Visual detection of defects in solder joints

    Science.gov (United States)

    Blaignan, V. B.; Bourbakis, Nikolaos G.; Moghaddamzadeh, Ali; Yfantis, Evangelos A.

    1995-03-01

    The automatic, real-time visual acquisition and inspection of VLSI boards requires the use of machine vision and artificial intelligence methodologies in a new 'frame' for the achievement of better results regarding efficiency, product quality and automated service. In this paper, the visual detection and classification of different types of defects on solder joints in PC boards is presented by combining several image processing methods, such as smoothing, segmentation, edge detection, contour extraction and shape analysis. The results of this paper are based on simulated solder defects and a real one.

  4. A review of the automated detection and classification of acute leukaemia: Coherent taxonomy, datasets, validation and performance measurements, motivation, open challenges and recommendations.

    Science.gov (United States)

    Alsalem, M A; Zaidan, A A; Zaidan, B B; Hashim, M; Madhloom, H T; Azeez, N D; Alsyisuf, S

    2018-05-01

    Acute leukaemia diagnosis is a field requiring automated solutions, tools and methods and the ability to facilitate early detection and even prediction. Many studies have focused on the automatic detection and classification of acute leukaemia and its subtypes to enable highly accurate diagnosis. This study aimed to review and analyse the literature related to the detection and classification of acute leukaemia. The factors considered, to improve understanding of the field's various contextual aspects in published studies and their characteristics, were motivation, the open challenges that confronted researchers, and the recommendations presented to researchers to enhance this vital research area. We systematically searched all articles about the classification and detection of acute leukaemia, as well as their evaluation and benchmarking, in three main databases: ScienceDirect, Web of Science and IEEE Xplore, from 2007 to 2017. These indices were considered sufficiently extensive to encompass our field of literature. Based on our inclusion and exclusion criteria, 89 articles were selected. Most studies (58/89) focused on the methods or algorithms of acute leukaemia classification, a number of papers (22/89) covered developed systems for the detection or diagnosis of acute leukaemia, and a few papers (5/89) presented evaluation and comparative studies. The smallest portion (4/89) of articles comprised reviews and surveys. Acute leukaemia diagnosis is a field requiring automated solutions, tools and methods, and entails the ability to facilitate early detection or even prediction. Many studies have been performed on the automatic detection and classification of acute leukaemia and its subtypes to promote accurate diagnosis. Research areas on medical-image classification vary, but they are all equally vital. We expect this systematic review to help emphasise current research opportunities and thus extend and create additional research fields.

  5. Automatically high accurate and efficient photomask defects management solution for advanced lithography manufacture

    Science.gov (United States)

    Zhu, Jun; Chen, Lijun; Ma, Lantao; Li, Dejian; Jiang, Wei; Pan, Lihong; Shen, Huiting; Jia, Hongmin; Hsiang, Chingyun; Cheng, Guojie; Ling, Li; Chen, Shijie; Wang, Jun; Liao, Wenkui; Zhang, Gary

    2014-04-01

Defect review is a time-consuming job, and human error makes the results inconsistent. Defects located in don't-care areas, such as dark areas, do not hurt yield and need not be reviewed, whereas defects in critical areas, such as clear areas, can impact yield dramatically and need closer review. As integrated circuit dimensions decrease, inspections routinely detect thousands of mask defects or more. Traditional manual or simple classification approaches are unable to meet efficiency and accuracy requirements. This paper focuses on an automatic defect management and classification solution using the image output of Lasertec inspection equipment and Anchor pattern-centric image processing technology. The system can handle large numbers of defects with quick and accurate classification. Our experiments include Die to Die and Single Die modes, with classification accuracies of 87.4% and 93.3%, respectively. No critical or printable defects were missed in our test cases. The rates of missed classifications are 0.25% in Die to Die mode and 0.24% in Single Die mode; such missing rates are encouraging and acceptable for application on a production line. The results can be output and loaded back into the inspection machine for further review. This step helps users validate uncertain defects with clear, magnified images when the captured images cannot provide enough information to make a judgment. The system effectively reduces expensive inline defect review time. As a fully inline automated defect management solution, it is compatible with the current inspection approach and can be integrated with optical simulation, including scoring functions, and guide wafer-level defect inspection.
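
    Die to Die inspection conventionally compares the same pattern in two adjacent dies so that only defects survive the difference image. A minimal sketch of that comparison, assuming synthetic stand-in images and an illustrative threshold:

```python
import cv2
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for two aligned images of the same mask pattern;
# die B carries a small extra blob playing the role of a defect.
pattern = (rng.random((256, 256)) > 0.5).astype(np.uint8) * 200
die_a = pattern.copy()
die_b = pattern.copy()
cv2.circle(die_b, (80, 120), 4, 255, -1)          # injected "defect"

# Pixelwise difference: the identical pattern cancels, the defect remains.
diff = cv2.absdiff(die_a, die_b)
_, mask = cv2.threshold(diff, 30, 255, cv2.THRESH_BINARY)

num, _, stats, _ = cv2.connectedComponentsWithStats(mask)
for i in range(1, num):                           # label 0 is background
    x, y, w, h, area = stats[i]
    print(f"defect candidate at ({x},{y}), {w}x{h} px, area={area}")
```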

  6. Automated morphological analysis of bone marrow cells in microscopic images for diagnosis of leukemia: nucleus-plasma separation and cell classification using a hierarchical tree model of hematopoiesis

    Science.gov (United States)

    Krappe, Sebastian; Wittenberg, Thomas; Haferlach, Torsten; Münzenmayer, Christian

    2016-03-01

The morphological differentiation of bone marrow is fundamental for the diagnosis of leukemia. Currently, the counting and classification of the different types of bone marrow cells is done manually using bright-field microscopy. This is a time-consuming, subjective, tedious and error-prone process. Furthermore, repeated examinations of a slide may yield intra- and inter-observer variances. For that reason, a computer-assisted diagnosis system for bone marrow differentiation is pursued. In this work we focus (a) on a new method for the separation of nucleus and plasma parts and (b) on a knowledge-based hierarchical tree classifier for the differentiation of bone marrow cells into 16 different classes. Classification trees are easily interpretable and understandable and provide a classification together with an explanation. Using classification trees, expert knowledge (i.e. knowledge about similar classes and cell lines in the tree model of hematopoiesis) is integrated in the structure of the tree. The proposed segmentation method is evaluated with more than 10,000 manually segmented cells. For the evaluation of the proposed hierarchical classifier, more than 140,000 automatically segmented bone marrow cells are used. Future automated solutions for the morphological analysis of bone marrow smears could potentially apply such an approach for the pre-classification of bone marrow cells, thereby shortening the examination time.

  7. An objective method to optimize the MR sequence set for plaque classification in carotid vessel wall images using automated image segmentation.

    Directory of Open Access Journals (Sweden)

    Ronald van 't Klooster

    Full Text Available A typical MR imaging protocol to study the status of atherosclerosis in the carotid artery consists of the application of multiple MR sequences. Since scanner time is limited, a balance has to be reached between the duration of the applied MR protocol and the quantity and quality of the resulting images which are needed to assess the disease. In this study an objective method to optimize the MR sequence set for classification of soft plaque in vessel wall images of the carotid artery using automated image segmentation was developed. The automated method employs statistical pattern recognition techniques and was developed based on an extensive set of MR contrast weightings and corresponding manual segmentations of the vessel wall and soft plaque components, which were validated by histological sections. Evaluation of the results from nine contrast weightings showed the tradeoff between scan duration and automated image segmentation performance. For our dataset the best segmentation performance was achieved by selecting five contrast weightings. Similar performance was achieved with a set of three contrast weightings, which resulted in a reduction of scan time by more than 60%. The presented approach can help others to optimize MR imaging protocols by investigating the tradeoff between scan duration and automated image segmentation performance possibly leading to shorter scanning times and better image interpretation. This approach can potentially also be applied to other research fields focusing on different diseases and anatomical regions.

  8. Automated classification and segregation of brain MRI images into images captured with respect to ventricular region and eye-ball region

    Directory of Open Access Journals (Sweden)

    C. Arunkumar

    2014-05-01

Full Text Available Magnetic Resonance Imaging (MRI) images of the brain are used for the detection of various brain diseases, including tumors. In such cases, classification of MRI images captured with respect to the ventricular and eye-ball regions helps in the automated location and classification of such diseases. The methods employed in the paper segregate the given brain MRI images into those captured with respect to the ventricular region and those captured with respect to the eye-ball region. First, the given MRI image of the brain is segmented using the Particle Swarm Optimization (PSO) algorithm, which is an optimized algorithm for MRI image segmentation. The algorithm proposed in the paper is then applied to the segmented image. The algorithm detects whether the image consists of a ventricular region or an eye-ball region and classifies it accordingly.

  9. UT simulation using a fully automated 3D hybrid model: Application to planar backwall breaking defects inspection

    Science.gov (United States)

    Imperiale, Alexandre; Chatillon, Sylvain; Darmon, Michel; Leymarie, Nicolas; Demaldent, Edouard

    2018-04-01

    The high frequency models gathered in the CIVA software allow fast computations and provide satisfactory quantitative predictions in a wide range of situations. However, the domain of validity of these models is limited since they do not accurately predict the ultrasound response in configurations involving subwavelength complex phenomena. In addition, when modelling backwall breaking defects inspection, an important challenge remains to capture the propagation of the creeping waves that are generated at the critical angle. Hybrid models combining numerical and asymptotic methods have already been shown to be an effective strategy to overcome these limitations in 2D [1]. However, 3D simulations remain a crucial issue for industrial applications because of the computational cost of the numerical solver. A dedicated three dimensional high order finite element model combined with a domain decomposition method has been recently proposed to tackle 3D limitations [2]. In this communication, we will focus on the specific case of planar backwall breaking defects, with an adapted coupling strategy in order to efficiently model the propagation of creeping waves. Numerical and experimental validations will be proposed on various configurations.

  10. Detection, identification and classification of defects using ANN and a robotic manipulator of 2 G.L. (Kohonen and MLP algorithms)

    International Nuclear Information System (INIS)

    Barrera, G.; Fabian, M. A.; Ugalde, C. A.

    2002-01-01

The ultrasonic inspection technique has seen sustained growth since the 1980s. The immersion technique has several advantages compared with the contact technique. A flexible and low-cost solution is presented, based on virtual instrumentation, for the servomechanism (manipulator) control of the ultrasound inspection transducer in the immersion technique. The developed system uses a personal computer (PC), a Windows operating system, virtual instrumentation software, DAQ cards and a GPIB card. As a solution for the detection, classification and evaluation of defects, an Artificial Neural Network technique is proposed. It consists of the characterization and interpretation of acoustic signals (echoes) acquired by the immersion ultrasonic inspection technique. Two neural networks are proposed: Kohonen and the Multilayer Perceptron (MLP). With these techniques, complex non-linear processes can be modeled with great precision. The 2-degree-of-freedom manipulator control, the data acquisition and the network training have been carried out in a virtual instrument environment using LabVIEW and Data Engine. (Author) 14 refs
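
    A minimal sketch of the two network types named above, assuming synthetic stand-ins for the echo feature vectors: a hand-rolled one-dimensional Kohonen (SOM) update loop and a scikit-learn multilayer perceptron. The shapes, class count, and hyperparameters are illustrative, not those of the study:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical feature vectors from ultrasonic echoes (32 values per echo)
# and defect labels; real data would come from the immersion inspection.
X = rng.normal(size=(200, 32))
y = rng.integers(0, 3, size=200)          # three illustrative defect classes

# --- Minimal Kohonen (SOM) training: a 1-D map of 10 units ---
weights = rng.normal(size=(10, 32))
for t, x in enumerate(X):
    lr = 0.5 * (1 - t / len(X))                            # decaying rate
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best matching unit
    for j in range(10):                                    # neighborhood update
        h = np.exp(-((j - bmu) ** 2) / 2.0)
        weights[j] += lr * h * (x - weights[j])

# --- MLP classifier on the same features ---
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(X, y)
print("training accuracy:", mlp.score(X, y))
```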

  11. MSCT follow-up in malignant lymphoma. Comparison of manual linear measurements with semi-automated lymph node analysis for therapy response classification

    International Nuclear Information System (INIS)

    Wessling, J.; Puesken, M.; Kohlhase, N.; Persigehl, T.; Mesters, R.; Heindel, W.; Buerke, B.; Koch, R.

    2012-01-01

Purpose: To assess semi-automated lymph node analysis compared to manual measurements for therapy response classification of malignant lymphoma in MSCT. Materials and Methods: MSCT scans of 63 malignant lymphoma patients before and after 2 cycles of chemotherapy (307 target lymph nodes) were evaluated. The long axis diameter (LAD), short axis diameter (SAD) and bi-dimensional WHO were determined manually and semi-automatically. The time for manual and semi-automatic segmentation was evaluated. The reference standard response was defined as the mean relative change across all manual and semi-automatic measurements (mean manual/semi-automatic LAD, SAD, semi-automatic volume). Statistical analysis encompassed the t-test and McNemar's test for clustered data. Results: Response classification per lymph node revealed semi-automated volumetry and bi-dimensional WHO to be significantly more accurate than manual linear metric measurements. Response classification per patient based on RECIST revealed more patients to be correctly classified by semi-automatic measurements, e.g. 96.0%/92.9% (WHO bi-dimensional/volume) compared to 85.7%/84.1% for manual LAD and SAD, respectively (mean reduction in misclassified patients of 9.95%). Considering the use of correction tools, the time expenditure for lymph node segmentation (29.7 ± 17.4 sec) was the same as with the manual approach (29.1 ± 14.5 sec). Conclusion: Semi-automatically derived 'lymph node volume' and 'bi-dimensional WHO' significantly reduce the number of misclassified patients in the CT follow-up of malignant lymphoma by at least 10%. However, lymph node volumetry does not outperform bi-dimensional WHO. (orig.)

  13. Automated Defect Recognition as a Critical Element of a Three Dimensional X-ray Computed Tomography Imaging-Based Smart Non-Destructive Testing Technique in Additive Manufacturing of Near Net-Shape Parts

    Directory of Open Access Journals (Sweden)

    Istvan Szabo

    2017-11-01

Full Text Available In this paper, a state-of-the-art automated defect recognition (ADR) system is presented that was developed specifically for Non-Destructive Testing (NDT) of powder metallurgy (PM) parts using three-dimensional X-ray Computed Tomography (CT) imaging, towards enabling online quality assurance and enhanced integrity confidence. PM parts exhibit typical defects such as microscopic cracks, porosity, and voids internal to components, which, without an effective detection system, limit the growth of industrial applications. Compared to typical testing methods (e.g., destructive methods such as metallography, which is based on sampling, cutting, and polishing of parts), CT provides full coverage of defect detection. This paper establishes the importance and advantages of an automated NDT system for PM industry applications, with particular emphasis on image processing procedures for defect recognition. Moreover, the article describes how to establish a reference library based on real 3D X-ray CT images of net-shape parts. The paper follows the development of the ADR system from processing 2D image slices of a measured 3D X-ray image to processing the complete 3D X-ray image as a whole. The introduced technique is successfully integrated into an automated in-line quality control system highly sought by major industry sectors in Oil and Gas, Automotive, and Aerospace.

  14. Phenotype classification of zebrafish embryos by supervised learning.

    Directory of Open Access Journals (Sweden)

    Nathalie Jeanray

Full Text Available Zebrafish is increasingly used to assess the biological properties of chemical substances and thus is becoming a specific tool for toxicological and pharmacological studies. The effects of chemical substances on embryo survival and development are generally evaluated manually through microscopic observation by an expert and documented by several typical photographs. Here, we present a methodology to automatically classify brightfield images of wild-type zebrafish embryos according to their defects by using an image analysis approach based on supervised machine learning. We show that, compared to manual classification, automatic classification results in 90 to 100% agreement with consensus voting of biological experts for nine out of eleven considered defects in 3-day-old zebrafish larvae. Automation of the analysis and classification of zebrafish embryo pictures reduces the workload and time required of the biological expert and increases the reproducibility and objectivity of this classification.

  15. Late gadolinium uptake demonstrated with magnetic resonance in patients where automated PERFIT analysis of myocardial SPECT suggests irreversible perfusion defect

    International Nuclear Information System (INIS)

    Rosendahl, Lene; Blomstrand, Peter; Ohlsson, Jan L; Björklund, Per-Gunnar; Ahlander, Britt-Marie; Starck, Sven-Åke; Engvall, Jan E

    2008-01-01

Myocardial perfusion single photon emission computed tomography (MPS) is frequently used as the reference method for the determination of myocardial infarct size. PERFIT® is software utilizing a three-dimensional, gender-specific, averaged heart model for the automatic evaluation of myocardial perfusion. The purpose of this study was to compare the perfusion defect size on MPS, assessed with PERFIT, with the hyperenhanced volume assessed by late gadolinium enhancement magnetic resonance imaging (LGE) and to relate their effect on the wall motion score index (WMSI) assessed with cine magnetic resonance imaging (cine-MRI) and echocardiography (echo). LGE was performed in 40 patients where clinical MPS showed an irreversible uptake reduction suggesting a myocardial scar. Infarct volume, extent and major coronary supply were compared between MPS and LGE, as was the relationship between infarct size from both methods and WMSI. MPS showed a slightly larger infarct volume than LGE (MPS 29.6 ± 23.2 ml, LGE 22.1 ± 16.9 ml, p = 0.01), while no significant difference was found in infarct extent (MPS 11.7 ± 9.4%, LGE 13.0 ± 9.6%). The correlation coefficients between methods with respect to infarct size and infarct extent were 0.71 and 0.63, respectively. WMSI determined with cine-MRI correlated moderately with infarct volume and infarct extent (cine-MRI vs MPS volume r = 0.71, extent r = 0.71, cine-MRI vs LGE volume r = 0.62, extent r = 0.60). Similar results were achieved when wall motion was determined with echo. Both MPS and LGE showed the same major coronary supply to the infarct area in a majority of patients, Kappa = 0.84. MPS and LGE agree moderately in the determination of infarct size in both absolute and relative terms, although infarct volume is slightly larger with MPS. The correlation between WMSI and infarct size is moderate.

  16. Radiological assessment of breast density by visual classification (BI-RADS) compared to automated volumetric digital software (Quantra): implications for clinical practice.

    Science.gov (United States)

    Regini, Elisa; Mariscotti, Giovanna; Durando, Manuela; Ghione, Gianluca; Luparia, Andrea; Campanino, Pier Paolo; Bianchi, Caterina Chiara; Bergamasco, Laura; Fonio, Paolo; Gandini, Giovanni

    2014-10-01

This study was done to assess breast density on digital mammography and digital breast tomosynthesis according to the visual Breast Imaging Reporting and Data System (BI-RADS) classification, to compare visual assessment with Quantra software for automated density measurement, and to establish the role of the software in clinical practice. We analysed 200 digital mammograms performed in 2D and 3D modality, 100 of which were positive for breast cancer and 100 negative. Radiological density was assessed with the BI-RADS classification; a Quantra density cut-off value was sought on the 2D images only to discriminate between BI-RADS categories 1-2 and BI-RADS 3-4. Breast density was correlated with age, use of hormone therapy, and increased risk of disease. The agreement between the 2D and 3D assessments of BI-RADS density was high (K = 0.96). A cut-off value of 21% best discriminates between BI-RADS categories 1-2 and 3-4. Breast density was negatively correlated to age (r = -0.44) and positively to use of hormone therapy (p = 0.0004). Quantra density was higher in breasts with cancer than in healthy breasts. There is no clear difference between the visual assessments of density on 2D and 3D images. Use of the automated system requires the adoption of a cut-off value (set at 21%) to effectively discriminate BI-RADS 1-2 and 3-4, and could be useful in clinical practice.

  17. Use of self-organizing maps for classification of defects in the tubes from the steam generator of nuclear power plants; Classificação de defeitos em tubos de gerador de vapor de plantas nucleares utilizando mapas auto-organizáveis

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Roberto Navarro de

    2002-07-01

This thesis presents a new classification method for different steam generator tube defects in nuclear power plants using Eddy Current Test signals. The method uses self-organizing maps to compare the efficiency of different signal characteristics in identifying and classifying these defects. A multiple inference system is proposed which combines the classifications from maps trained on the different extracted characteristics to infer the final defect type. The feature extraction methods used are the wavelet zero-crossings representation, linear predictive coding (LPC), and other basic time-domain signal representations such as modulus and phase. Many characteristic vectors are obtained from combinations of these extracted characteristics. These vectors are tested for defect classification, and the best ones are applied to the multiple inference system. A systematic study of pre-processing, calibration and analysis methods for steam generator tube defect signals in nuclear power plants is done. The method's efficiency is demonstrated, and characteristic maps with the main prototypes are obtained for each steam generator tube defect type. (author)

  18. Application of a new genetic classification and semi-automated geomorphic mapping approach in the Perth submarine canyon, Australia

    Science.gov (United States)

    Picard, K.; Nanson, R.; Huang, Z.; Nichol, S.; McCulloch, M.

    2017-12-01

The acquisition of high resolution marine geophysical data has intensified in recent years (e.g. multibeam echo-sounding, sub-bottom profiling). This progress provides the opportunity to classify and map the seafloor in greater detail, using new methods that preserve the links between processes and morphology. Geoscience Australia has developed a new genetic classification approach, nested within the Harris et al (2014) global seafloor mapping framework. The approach divides parent units into sub-features based on established classification schemes and feature descriptors defined by Bradwell et al. (2016: http://nora.nerc.ac.uk/), the International Hydrographic Organization (https://www.iho.int) and the Coastal Marine and Ecological Classification Standard (https://www.cmecscatalog.org). Owing to the ecological significance of submarine canyon systems in particular, much recent attention has focused on defining their variation in form and process, whereby they can be classified using a range of topographic metrics, fluvial dis/connection and shelf-incising status. The Perth Canyon is incised into the continental slope and shelf of southwest Australia, covering an area of >1500 km² and extending from 4700 m water depth to the shelf break in 170 m. The canyon sits within a Marine Protected Area, incorporating a Marine National Park and Habitat Protection Zone in recognition of its benthic and pelagic biodiversity values. However, detailed information on the spatial patterns of the seabed habitats that influence this biodiversity is lacking. Here we use 20 m resolution bathymetry and acoustic backscatter data acquired in 2015 by the Schmidt Ocean Institute, plus sub-bottom datasets and sediment samples collected by Geoscience Australia in 2005, to apply the new geomorphic classification system to the Perth Canyon. This presentation will show the results of the geomorphic feature mapping of the canyon and its application to better defining potential benthic habitats.

  19. Food intake monitoring: an acoustical approach to automated food intake activity detection and classification of consumed food

    International Nuclear Information System (INIS)

    Päßler, Sebastian; Fischer, Wolf-Joachim; Wolff, Matthias

    2012-01-01

Obesity and nutrition-related diseases are currently growing challenges for medicine. A precise and timesaving method for food intake monitoring is needed. For this purpose, an approach based on the classification of sounds produced during food intake is presented. Sounds are recorded non-invasively by miniature microphones in the outer ear canal. A database of 51 participants eating seven types of food and consuming one drink has been developed for algorithm development and model training. The database is labeled manually using a protocol with instructions for annotation. The annotation procedure is evaluated using Cohen's kappa coefficient. The food intake activity is detected by the comparison of the signal energy of in-ear sounds to environmental sounds recorded by a reference microphone. Hidden Markov models are used for the recognition of single chew or swallowing events. Intake cycles are modeled as event sequences in finite-state grammars. Classification of consumed food is realized by a finite-state grammar decoder based on the Viterbi algorithm. We achieved a detection accuracy of 83% and a food classification accuracy of 79% on a test set of 10% of all records. Our approach addresses the need to monitor the time and occurrence of eating. With differentiation of consumed food, a first step toward the goal of meal weight estimation is taken. (paper)
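
    The finite-state grammar decoding step rests on the Viterbi algorithm. A minimal log-space Viterbi sketch with a toy two-state chew/swallow model; the probabilities and symbols are illustrative assumptions, not the paper's trained models:

```python
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """Return the most likely state sequence for a discrete observation sequence."""
    n_states = log_trans.shape[0]
    T = len(obs)
    delta = np.zeros((T, n_states))              # best log-probability so far
    psi = np.zeros((T, n_states), dtype=int)     # back-pointers
    delta[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_trans   # scores[i, j]: i -> j
        psi[t] = np.argmax(scores, axis=0)
        delta[t] = scores[psi[t], np.arange(n_states)] + log_emit[:, obs[t]]
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):                # trace back-pointers
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# Illustrative two-state model: 0 = chewing, 1 = swallowing, with three
# discrete acoustic symbols; all probabilities are assumptions.
log_start = np.log([0.8, 0.2])
log_trans = np.log([[0.9, 0.1], [0.5, 0.5]])
log_emit  = np.log([[0.6, 0.3, 0.1], [0.1, 0.2, 0.7]])
print(viterbi([0, 0, 2, 1, 2], log_start, log_trans, log_emit))
```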

  20. Development of an Automated MRI-Based Diagnostic Protocol for Amyotrophic Lateral Sclerosis Using Disease-Specific Pathognomonic Features: A Quantitative Disease-State Classification Study.

    Science.gov (United States)

    Schuster, Christina; Hardiman, Orla; Bede, Peter

    2016-01-01

    Despite significant advances in quantitative neuroimaging, the diagnosis of ALS remains clinical and MRI-based biomarkers are not currently used to aid the diagnosis. The objective of this study is to develop a robust, disease-specific, multimodal classification protocol and validate its diagnostic accuracy in independent, early-stage and follow-up data sets. 147 participants (81 ALS patients and 66 healthy controls) were divided into a training sample and a validation sample. Patients in the validation sample underwent follow-up imaging longitudinally. After removing age-related variability, indices of grey and white matter integrity in ALS-specific pathognomonic brain regions were included in a cross-validated binary logistic regression model to determine the probability of individual scans indicating ALS. The following anatomical regions were assessed for diagnostic classification: average grey matter density of the left and right precentral gyrus, the average fractional anisotropy and radial diffusivity of the left and right superior corona radiata, inferior corona radiata, internal capsule, mesencephalic crus of the cerebral peduncles, pontine segment of the corticospinal tract, and the average diffusivity values of the genu, corpus and splenium of the corpus callosum. Using a 50% probability cut-off value of suffering from ALS, the model was able to discriminate ALS patients and HC with good sensitivity (80.0%) and moderate accuracy (70.0%) in the training sample and superior sensitivity (85.7%) and accuracy (78.4%) in the independent validation sample. This diagnostic classification study endeavours to advance ALS biomarker research towards pragmatic clinical applications by providing an approach of automated individual-data interpretation based on group-level observations.
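
    The core of the protocol, cross-validated binary logistic regression with a 50% probability cut-off, can be sketched as follows; the random features stand in for the regional grey and white matter indices, and all numbers are illustrative:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the regional imaging indices (grey matter
# density, fractional anisotropy, diffusivity, ...) and ALS/control labels.
X = rng.normal(size=(147, 15))
y = rng.integers(0, 2, size=147)          # 1 = ALS, 0 = healthy control

# Cross-validated probability that each scan indicates ALS.
model = LogisticRegression(max_iter=1000)
proba = cross_val_predict(model, X, y, cv=5, method="predict_proba")[:, 1]

# 50% probability cut-off, as in the study design.
pred = (proba >= 0.5).astype(int)
sensitivity = np.mean(pred[y == 1] == 1)
accuracy = np.mean(pred == y)
print(f"sensitivity={sensitivity:.2f}, accuracy={accuracy:.2f}")
```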

  1. Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning versus non-deep learning.

    Science.gov (United States)

    Wang, Xinggang; Yang, Wei; Weinreb, Jeffrey; Han, Juan; Li, Qiubai; Kong, Xiangchuang; Yan, Yongluan; Ke, Zan; Luo, Bo; Liu, Tao; Wang, Liang

    2017-11-13

Prostate cancer (PCa) is a major cause of death, documented since ancient times in Egyptian Ptolemaic mummy imaging. PCa detection is critical to personalized medicine and varies considerably under an MRI scan. 172 patients with 2,602 morphologic images (axial 2D T2-weighted imaging) of the prostate were obtained. A deep learning approach with a deep convolutional neural network (DCNN) and a non-deep learning approach with SIFT image features and bag-of-words (BoW), a representative method for image recognition and analysis, were used to distinguish pathologically confirmed PCa patients from patients with prostate benign conditions (BCs) such as prostatitis or prostate benign hyperplasia (BPH). In fully automated detection of PCa patients, deep learning had a statistically higher area under the receiver operating characteristic curve (AUC) than non-deep learning (P = 0.0007); the AUC was 0.70 (95% CI 0.63-0.77) for the non-deep learning method. Our results suggest that deep learning with a DCNN is superior to non-deep learning with SIFT image features and a BoW model for fully automated differentiation of PCa patients from prostate BC patients. Our deep learning method is extensible to image modalities such as MR imaging, CT and PET of other organs.
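
    The non-deep baseline, SIFT features pooled into a bag-of-words histogram and fed to a conventional classifier, can be sketched as follows. The random images, vocabulary size, and final linear SVM are illustrative assumptions (the paper does not specify this exact pipeline), and the code assumes an OpenCV build that includes SIFT (opencv-python 4.4 or later):

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for grayscale T2-weighted slices and case labels.
images = [rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
          for _ in range(20)]
labels = rng.integers(0, 2, size=20)

sift = cv2.SIFT_create()

# 1. Pool SIFT descriptors from all training images.
all_desc = []
for im in images:
    _, desc = sift.detectAndCompute(im, None)
    if desc is not None:
        all_desc.append(desc)
all_desc = np.vstack(all_desc)

# 2. Build a visual vocabulary by k-means clustering of the descriptors.
k = 16
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(all_desc)

# 3. Encode each image as a normalized bag-of-words histogram.
def bow_hist(im):
    _, desc = sift.detectAndCompute(im, None)
    if desc is None:
        return np.zeros(k)
    words = kmeans.predict(desc)
    hist = np.bincount(words, minlength=k).astype(float)
    return hist / max(hist.sum(), 1.0)

X = np.array([bow_hist(im) for im in images])

# 4. Train a conventional classifier on the histograms.
clf = SVC(kernel="linear").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```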

  2. Automated classification and visualization of healthy and pathological dental tissues based on near-infrared hyper-spectral imaging

    Science.gov (United States)

    Usenik, Peter; Bürmen, Miran; Vrtovec, Tomaž; Fidler, Aleš; Pernuš, Franjo; Likar, Boštjan

    2011-03-01

    Despite major improvements in dental healthcare and technology, dental caries remains one of the most prevalent chronic diseases of modern society. The initial stages of dental caries are characterized by demineralization of enamel crystals, commonly known as white spots which are difficult to diagnose. If detected early enough, such demineralization can be arrested and reversed by non-surgical means through well established dental treatments (fluoride therapy, anti-bacterial therapy, low intensity laser irradiation). Near-infrared (NIR) hyper-spectral imaging is a new promising technique for early detection of demineralization based on distinct spectral features of healthy and pathological dental tissues. In this study, we apply NIR hyper-spectral imaging to classify and visualize healthy and pathological dental tissues including enamel, dentin, calculus, dentin caries, enamel caries and demineralized areas. For this purpose, a standardized teeth database was constructed consisting of 12 extracted human teeth with different degrees of natural dental lesions imaged by NIR hyper-spectral system, X-ray and digital color camera. The color and X-ray images of teeth were presented to a clinical expert for localization and classification of the dental tissues, thereby obtaining the gold standard. Principal component analysis was used for multivariate local modeling of healthy and pathological dental tissues. Finally, the dental tissues were classified by employing multiple discriminant analysis. High agreement was observed between the resulting classification and the gold standard with the classification sensitivity and specificity exceeding 85 % and 97 %, respectively. This study demonstrates that NIR hyper-spectral imaging has considerable diagnostic potential for imaging hard dental tissues.
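
    A minimal sketch of the modelling chain described above: per-pixel spectra compressed by principal component analysis, then classified with a discriminant analysis (scikit-learn's LinearDiscriminantAnalysis stands in here for the multiple discriminant analysis). The synthetic spectra, band count, and class labels are illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical per-pixel NIR spectra (256 wavelength bands) with tissue
# labels from an expert gold standard (0 = enamel, 1 = dentin, ...).
spectra = rng.normal(size=(5000, 256))
labels = rng.integers(0, 6, size=5000)      # six illustrative tissue classes

# Compress the spectra with PCA, then classify the scores.
pca = PCA(n_components=10).fit(spectra)
scores = pca.transform(spectra)

lda = LinearDiscriminantAnalysis().fit(scores, labels)
pred = lda.predict(scores)
print("training accuracy:", (pred == labels).mean())
```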

  3. Online Surface Defect Identification of Cold Rolled Strips Based on Local Binary Pattern and Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2018-03-01

Full Text Available In the production of cold-rolled strip, the strip surface may suffer from various defects which need to be detected and identified using an online inspection system. The system is equipped with high-speed and high-resolution cameras to acquire images from the moving strip surface. Features are then extracted from the images and are used as inputs of a pre-trained classifier to identify the type of defect. New types of defect often appear in production. At this point the pre-trained classifier needs to be quickly retrained and deployed in seconds to meet the requirement of the online identification of all defects in the environment of a continuous production line. Therefore, the method for extracting the image features and the training of the classification model should be automated and fast enough, normally within seconds. This paper presents our findings in investigating the computational and classification performance of various feature extraction methods and classification models for strip surface defect identification. The methods include Scale Invariant Feature Transform (SIFT), Speeded Up Robust Features (SURF) and Local Binary Patterns (LBP). The classifiers we have assessed include the Back Propagation (BP) neural network, Support Vector Machine (SVM) and Extreme Learning Machine (ELM). By comparing various combinations of different feature extraction and classification methods, our experiments show that the hybrid of LBP for feature extraction and ELM for defect classification results in less training and identification time with higher classification accuracy, which satisfies online real-time identification.
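
    The winning LBP-plus-ELM combination is easy to sketch, and the sketch also shows why ELM retraining takes only seconds: training reduces to a single least-squares solve. The synthetic patches, histogram binning, and hidden-layer size are illustrative assumptions:

```python
import numpy as np
from skimage.feature import local_binary_pattern

rng = np.random.default_rng(0)

# Hypothetical grayscale patches of strip surface and defect labels.
patches = rng.integers(0, 256, size=(300, 64, 64)).astype(np.uint8)
labels = rng.integers(0, 4, size=300)        # four illustrative defect types

# 1. LBP histogram features (uniform patterns, P=8 neighbours, R=1).
def lbp_features(img, P=8, R=1):
    codes = local_binary_pattern(img, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2))
    return hist / hist.sum()

X = np.array([lbp_features(p) for p in patches])

# 2. Minimal Extreme Learning Machine: a random hidden layer, then one
#    least-squares solve for the output weights (hence the speed).
n_hidden = 100
W = rng.normal(size=(X.shape[1], n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                        # hidden activations
Y = np.eye(labels.max() + 1)[labels]          # one-hot targets
beta = np.linalg.pinv(H) @ Y                  # output weights in one step

pred = np.argmax(H @ beta, axis=1)
print("training accuracy:", (pred == labels).mean())
```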

  4. Automated Detection, Localization, and Classification of Traumatic Vertebral Body Fractures in the Thoracic and Lumbar Spine at CT.

    Science.gov (United States)

    Burns, Joseph E; Yao, Jianhua; Muñoz, Hector; Summers, Ronald M

    2016-01-01

    To design and validate a fully automated computer system for the detection and anatomic localization of traumatic thoracic and lumbar vertebral body fractures at computed tomography (CT). This retrospective study was HIPAA compliant. Institutional review board approval was obtained, and informed consent was waived. CT examinations in 104 patients (mean age, 34.4 years; range, 14-88 years; 32 women, 72 men), consisting of 94 examinations with positive findings for fractures (59 with vertebral body fractures) and 10 control examinations (without vertebral fractures), were performed. There were 141 thoracic and lumbar vertebral body fractures in the case set. The locations of fractures were marked and classified by a radiologist according to Denis column involvement. The CT data set was divided into training and testing subsets (37 and 67 subsets, respectively) for analysis by means of prototype software for fully automated spinal segmentation and fracture detection. Free-response receiver operating characteristic analysis was performed. Training set sensitivity for detection and localization of fractures within each vertebra was 0.82 (28 of 34 findings; 95% confidence interval [CI]: 0.68, 0.90), with a false-positive rate of 2.5 findings per patient. The sensitivity for fracture localization to the correct vertebra was 0.88 (23 of 26 findings; 95% CI: 0.72, 0.96), with a false-positive rate of 1.3. Testing set sensitivity for the detection and localization of fractures within each vertebra was 0.81 (87 of 107 findings; 95% CI: 0.75, 0.87), with a false-positive rate of 2.7. The sensitivity for fracture localization to the correct vertebra was 0.92 (55 of 60 findings; 95% CI: 0.79, 0.94), with a false-positive rate of 1.6. The most common cause of false-positive findings was nutrient foramina (106 of 272 findings [39%]). The fully automated computer system detects and anatomically localizes vertebral body fractures in the thoracic and lumbar spine on CT images with a

  5. Defect reduction of patterned media templates and disks

    Science.gov (United States)

    Luo, Kang; Ha, Steven; Fretwell, John; Ramos, Rick; Ye, Zhengmao; Schmid, Gerard; LaBrake, Dwayne; Resnick, Douglas J.; Sreenivasan, S. V.

    2010-05-01

Imprint lithography has been shown to be an effective technique for the replication of nano-scale features. Acceptance of imprint lithography for manufacturing will require a demonstration of defect levels commensurate with cost-effective device production. This work summarizes the results of defect inspections of hard disks patterned using Jet and Flash Imprint Lithography (J-FIL™). Inspections were performed with optical-based automated inspection tools. For the hard drive market, it is important to understand the defectivity of both the template and the imprinted disk. This work presents a methodology for automated pattern inspection and defect classification for imprint-patterned media. Candela CS20 and 6120 tools from KLA-Tencor map the optical properties of the disk surface, producing high-resolution grayscale images of surface reflectivity and scattered light. Defects that have been identified in this manner are further characterized according to their morphology. The imprint process was tested after optimizing both the disk cleaning and adhesion layer processes that precede imprinting. An extended imprint run was performed and both the defect types and trends are reported.

  6. A machine vision system for automated non-invasive assessment of cell viability via dark field microscopy, wavelet feature selection and classification

    Directory of Open Access Journals (Sweden)

    Friehs Karl

    2008-10-01

Full Text Available Abstract Background Cell viability is one of the basic properties indicating the physiological state of the cell; thus, it has long been one of the major considerations in biotechnological applications. Conventional methods for extracting information about cell viability usually need reagents to be applied on the targeted cells. These reagent-based techniques are reliable and versatile; however, some of them might be invasive and even toxic to the target cells. In support of automated noninvasive assessment of cell viability, a machine vision system has been developed. Results This system is based on a supervised learning technique. It learns from images of certain kinds of cell populations and trains some classifiers. These trained classifiers are then employed to evaluate the images of given cell populations obtained via dark field microscopy. Wavelet decomposition is performed on the cell images. Energy and entropy are computed for each wavelet subimage as features. A feature selection algorithm is implemented to achieve better performance. Correlation between the results from the machine vision system and commonly accepted gold standards becomes stronger if wavelet features are utilized. The best performance is achieved with a selected subset of wavelet features. Conclusion The machine vision system based on dark field microscopy in conjunction with supervised machine learning and wavelet feature selection automates the cell viability assessment, and yields comparable results to commonly accepted methods. Wavelet features are found to be suitable to describe the discriminative properties of the live and dead cells in viability classification. According to the analysis, live cells exhibit more morphological detail and are intracellularly more organized than dead ones, which display more homogeneous and diffuse gray values throughout the cells. Feature selection increases the system's performance. The reason lies in the fact that feature

  7. Automated classification of seismic sources in a large database: a comparison of Random Forests and Deep Neural Networks.

    Science.gov (United States)

    Hibert, Clement; Stumpf, André; Provost, Floriane; Malet, Jean-Philippe

    2017-04-01

In the past decades, the increasing quality of seismic sensors and the capability to transfer large quantities of data remotely led to a fast densification of local, regional and global seismic networks for near real-time monitoring of crustal and surface processes. This technological advance permits the use of seismology to document geological and natural/anthropogenic processes (volcanoes, ice-calving, landslides, snow and rock avalanches, geothermal fields), but also led to an ever-growing quantity of seismic data. This wealth of seismic data makes the construction of complete seismicity catalogs, which include earthquakes but also other sources of seismic waves, more challenging and very time-consuming, as this critical pre-processing stage is classically done by human operators and because hundreds of thousands of seismic signals have to be processed. To overcome this issue, the development of automatic methods for the processing of continuous seismic data appears to be a necessity. The classification algorithm should satisfy the need for a method that is robust, precise and versatile enough to be deployed to monitor seismicity in very different contexts. In this study, we evaluate the ability of two machine learning algorithms, Random Forest and Deep Neural Network classifiers, to analyse seismic sources at the Piton de la Fournaise volcano. We gather a catalog of more than 20,000 events, belonging to 8 classes of seismic sources. We define 60 attributes, based on the waveform, the frequency content and the polarization of the seismic waves, to parameterize the seismic signals recorded. We show that both algorithms provide similar positive classification rates, with values exceeding 90% of the events. When trained with a sufficient number of events, the rate of positive identification can reach 99%. These very high rates of positive identification open the perspective of an operational implementation of these algorithms for near-real time monitoring of
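
    A minimal sketch of the Random Forest half of the comparison, with random stand-ins for the 60 waveform, spectral, and polarization attributes and the 8 source classes; sample counts and tree counts are illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 60 attributes per event, 8 source classes.
X = rng.normal(size=(2000, 60))
y = rng.integers(0, 8, size=2000)

clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```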

  8. Automated classifications of topography from DEMs by an unsupervised nested-means algorithm and a three-part geometric signature

    Science.gov (United States)

    Iwahashi, J.; Pike, R.J.

    2007-01-01

An iterative procedure that implements the classification of continuous topography as a problem in digital image-processing automatically divides an area into categories of surface form; three taxonomic criteria (slope gradient, local convexity, and surface texture) are calculated from a square-grid digital elevation model (DEM). The sequence of programmed operations combines twofold-partitioned maps of the three variables converted to greyscale images, using the mean of each variable as the dividing threshold. To subdivide increasingly subtle topography, grid cells sloping at less than mean gradient of the input DEM are classified by designating mean values of successively lower-sloping subsets of the study area (nested means) as taxonomic thresholds, thereby increasing the number of output categories from the minimum 8 to 12 or 16. Program output is exemplified by 16 topographic types for the world at 1-km spatial resolution (SRTM30 data), the Japanese Islands at 270 m, and part of Hokkaido at 55 m. Because the procedure is unsupervised and reflects frequency distributions of the input variables rather than pre-set criteria, the resulting classes are undefined and must be calibrated empirically by subsequent analysis. Maps of the example classifications reflect physiographic regions, geological structure, and landform as well as slope materials and processes; fine-textured terrain categories tend to correlate with erosional topography or older surfaces, coarse-textured classes with areas of little dissection. In Japan the resulting classes approximate landform types mapped from airphoto analysis, while in the Americas they create map patterns resembling Hammond's terrain types or surface-form classes; SRTM30 output for the United States compares favorably with Fenneman's physical divisions. Experiments are suggested for further developing the method; the Arc/Info AML and the map of terrain classes for the world are available as online downloads. © 2006 Elsevier
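
    A simplified reading of the nested-means idea can be sketched as follows: threshold each of the three variables at its mean to get 8 base classes, then split the low-sloping cells again at their own mean, giving 12 classes. The random grids stand in for DEM derivatives; this is a schematic illustration, not the published Arc/Info AML:

```python
import numpy as np

def nested_means_classes(slope, convexity, texture):
    """Threshold each variable at its mean (8 base classes), then split
    the low-sloping cells again at their own mean -> 12 classes total."""
    classes = np.zeros(slope.shape, dtype=int)
    for i, var in enumerate((slope, convexity, texture)):
        classes |= (var > var.mean()).astype(int) << i   # one bit per variable
    low = slope <= slope.mean()                  # cells below mean slope
    mid = low & (slope > slope[low].mean())      # nested-mean threshold
    classes[mid] += 8                            # new mid-slope categories
    return classes

rng = np.random.default_rng(0)
slope, convexity, texture = rng.random((3, 100, 100))  # stand-in derivatives
print(np.unique(nested_means_classes(slope, convexity, texture)))
```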

  9. Automated segmentation of ultrasonic breast lesions using statistical texture classification and active contour based on probability distance.

    Science.gov (United States)

    Liu, Bo; Cheng, H D; Huang, Jianhua; Tian, Jiawei; Liu, Jiafeng; Tang, Xianglong

    2009-08-01

    Because of its complicated structure, low signal/noise ratio, low contrast and blurry boundaries, fully automated segmentation of a breast ultrasound (BUS) image is a difficult task. In this paper, a novel segmentation method for BUS images without human intervention is proposed. Unlike most published approaches, the proposed method handles the segmentation problem by using a two-step strategy: ROI generation and ROI segmentation. First, a well-trained texture classifier categorizes the tissues into different classes, and the background knowledge rules are used for selecting the regions of interest (ROIs) from them. Second, a novel probability distance-based active contour model is applied for segmenting the ROIs and finding the accurate positions of the breast tumors. The active contour model combines both global statistical information and local edge information, using a level set approach. The proposed segmentation method was performed on 103 BUS images (48 benign and 55 malignant). To validate the performance, the results were compared with the corresponding tumor regions marked by an experienced radiologist. Three error metrics, true-positive ratio (TP), false-negative ratio (FN) and false-positive ratio (FP) were used for measuring the performance of the proposed method. The final results (TP = 91.31%, FN = 8.69% and FP = 7.26%) demonstrate that the proposed method can segment BUS images efficiently, quickly and automatically.

  10. Automated measurement and classification of pulmonary blood-flow velocity patterns using phase-contrast MRI and correlation analysis.

    Science.gov (United States)

    van Amerom, Joshua F P; Kellenberger, Christian J; Yoo, Shi-Joon; Macgowan, Christopher K

    2009-01-01

An automated method was evaluated to detect blood flow in small pulmonary arteries and classify each as artery or vein, based on a temporal correlation analysis of their blood-flow velocity patterns. The method was evaluated using velocity-sensitive phase-contrast magnetic resonance data collected in vitro with a pulsatile flow phantom and in vivo in 11 human volunteers. The accuracy of the method was validated in vitro, which showed relative velocity errors of 12% at low spatial resolution (four voxels per diameter), but was reduced to 5% at increased spatial resolution (16 voxels per diameter). The performance of the method was evaluated in vivo according to its reproducibility and agreement with manual velocity measurements by an experienced radiologist. In all volunteers, the correlation analysis was able to detect and segment peripheral pulmonary vessels and distinguish arterial from venous velocity patterns. The intrasubject variability of repeated measurements was approximately 10% of peak velocity, or 2.8 cm/s root-mean-variance, demonstrating the high reproducibility of the method. Excellent agreement was obtained between the correlation analysis and radiologist measurements of pulmonary velocities, with a correlation of R² = 0.98 (P < .001) and a slope of 0.99 ± 0.01.
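
    A minimal sketch of the correlation analysis: each voxel's velocity-time curve is compared against reference arterial and venous patterns and labelled by the stronger correlation. The reference curves and noise level are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames = 20
t = np.linspace(0, 1, n_frames)

# Hypothetical reference velocity patterns over the cardiac cycle.
arterial_ref = np.sin(2 * np.pi * t) ** 2          # pulsatile
venous_ref = 0.3 + 0.05 * np.sin(2 * np.pi * t)    # nearly steady

def classify_voxel(velocity_curve):
    """Label a voxel by which reference its time curve correlates with."""
    r_art = np.corrcoef(velocity_curve, arterial_ref)[0, 1]
    r_ven = np.corrcoef(velocity_curve, venous_ref)[0, 1]
    return "artery" if r_art > r_ven else "vein"

# A noisy pulsatile curve should classify as arterial.
test = arterial_ref + 0.1 * rng.normal(size=n_frames)
print(classify_voxel(test))
```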

  11. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Yan, W.

    1993-11-01

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods

  12. Mass defect filtering-oriented classification and precursor ions list-triggered high-resolution mass spectrometry analysis for the discovery of indole alkaloids from Uncaria sinensis.

    Science.gov (United States)

    Pan, Huiqin; Yang, Wenzhi; Yao, Changliang; Shen, Yao; Zhang, Yibei; Shi, Xiaojian; Yao, Shuai; Wu, Wanying; Guo, Dean

    2017-09-22

Discovery of new natural compounds is becoming increasingly challenging because of the interference from those known and abundant components. The aim of this study is to report a dereplication strategy, by integrating mass defect filtering (MDF)-oriented novelty classification and precursor ions list (PIL)-triggered high-resolution mass spectrometry analysis, and to validate it by discovering new indole alkaloids from the medicinal herb Uncaria sinensis. Rapid chromatographic separation was achieved on a Kinetex® EVO C18 column (<16 min). An in-house MDF algorithm, developed based on the informed phytochemistry information and molecular design, could more exactly screen the target alkaloids and divide them into three novelty levels: Known (KN), Unknown-but-Predicted (UP), and Unexpected (UN). A hybrid data acquisition method, namely PIL-triggered collision-induced dissociation MS2 and high-energy C-trap dissociation MS3 with dynamic exclusion on a linear ion trap/Orbitrap mass spectrometer, facilitated the acquisition of diverse product ions sufficient for the structural elucidation of both indole alkaloids and the N-oxides. Ultimately, 158 potentially new alkaloids, including 10 UP and 108 UN, were rapidly characterized from the stem, leaf, and flower of U. sinensis. Two new alkaloid compounds thereof were successfully isolated and identified by 1D and 2D NMR analyses. The varied ring E and novel alkaloid-acylquinic acid conjugates were first reported from the whole Uncaria genus. Conclusively, it is a practical chemical dereplication strategy that can enhance the efficiency and has the potential to be a routine approach for the discovery of new natural compounds. Copyright © 2017 Elsevier B.V. All rights reserved.
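
    Mass defect filtering itself reduces to comparing the decimal part of each measured mass against a window around a template value. A minimal sketch, where the template mass, tolerance, and observed m/z list are illustrative:

```python
# A minimal sketch of mass defect filtering: keep only ions whose mass
# defect (exact mass minus nominal mass) falls within a window derived
# from a template compound. All values below are illustrative.
def mass_defect(mz):
    return mz - round(mz)

template_mz = 369.1921          # a hypothetical indole alkaloid [M+H]+
tolerance = 0.05                # Da; illustrative filter half-width

observed = [369.1921, 385.1870, 401.1819, 523.3402, 255.0659]
center = mass_defect(template_mz)
hits = [mz for mz in observed if abs(mass_defect(mz) - center) <= tolerance]
print(hits)   # the homologous ions pass; the two outliers are rejected
```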

  13. Automated detection and classification of major retinal vessels for determination of diameter ratio of arteries and veins

    Science.gov (United States)

    Muramatsu, Chisako; Hatanaka, Yuji; Iwase, Tatsuhiko; Hara, Takeshi; Fujita, Hiroshi

    2010-03-01

Abnormalities of retinal vasculatures can indicate health conditions in the body, such as high blood pressure and diabetes. Providing an automatically determined width ratio of arteries and veins (A/V ratio) on retinal fundus images may help physicians in the diagnosis of hypertensive retinopathy, which may cause blindness. The purpose of this study was to detect major retinal vessels and classify them into arteries and veins for the determination of the A/V ratio. Images used in this study were obtained from the DRIVE database, which consists of 20 cases each for training and testing vessel detection algorithms. Starting with the reference standard of vasculature segmentation provided in the database, major arteries and veins each in the upper and lower temporal regions were manually selected for establishing the gold standard. We applied the black top-hat transformation and double-ring filter to detect retinal blood vessels. From the extracted vessels, large vessels extending from the optic disc to temporal regions were selected as target vessels for calculation of the A/V ratio. Image features were extracted from the vessel segments from quarter-disc to one disc diameter from the edge of the optic discs. The target segments in the training cases were classified into arteries and veins by using linear discriminant analysis, and the selected parameters were applied to those in the test cases. Out of 40 pairs, 30 pairs (75%) of arteries and veins in the 20 test cases were correctly classified. The result can be used for the automated calculation of the A/V ratio.
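
    The black top-hat step mentioned above can be sketched in a few lines: because vessels are darker than the retinal background, a black top-hat with a structuring element wider than a vessel enhances them (the double-ring filter is omitted here). The synthetic image, kernel size, and Otsu binarization are illustrative assumptions:

```python
import cv2
import numpy as np

# Synthetic stand-in for a fundus image: bright background, dark "vessel".
fundus = np.full((256, 256), 200, np.uint8)
cv2.line(fundus, (20, 30), (240, 220), 120, 3)

# Black top-hat (closing minus image) responds to dark, thin structures.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
enhanced = cv2.morphologyEx(fundus, cv2.MORPH_BLACKHAT, kernel)

# Binarize the enhanced vessels (Otsu threshold as an illustrative choice).
_, mask = cv2.threshold(enhanced, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
print("vessel pixels:", int(np.count_nonzero(mask)))
```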

  14. Evaluation of a rule-based method for epidemiological document classification towards the automation of systematic reviews.

    Science.gov (United States)

    Karystianis, George; Thayer, Kristina; Wolfe, Mary; Tsafnat, Guy

    2017-06-01

Most data extraction efforts in epidemiology are focused on obtaining targeted information from clinical trials. In contrast, limited research has been conducted on the identification of information from observational studies, a major source of human evidence in many fields, including environmental health. The recognition of key epidemiological information (e.g., exposures) through text mining techniques can assist in the automation of systematic reviews and other evidence summaries. We designed and applied a knowledge-driven, rule-based approach to identify targeted information (study design, participant population, exposure, outcome, confounding factors, and the country where the study was conducted) from abstracts of epidemiological studies included in several systematic reviews of environmental health exposures. The rules were based on common syntactical patterns observed in text and are thus not specific to any systematic review. To validate the general applicability of our approach, we compared the data extracted using our approach versus hand curation for 35 epidemiological study abstracts manually selected for inclusion in two systematic reviews. The returned F-score, precision, and recall ranged from 70% to 98%, 81% to 100%, and 54% to 97%, respectively. The highest precision was observed for exposure, outcome and population (100%) while recall was best for exposure and study design with 97% and 89%, respectively. The lowest recall was observed for the population (54%), which also had the lowest F-score (70%). The performance of our text-mining approach demonstrated encouraging results for the identification of targeted information from observational epidemiological study abstracts related to environmental exposures. We have demonstrated that rules based on generic syntactic patterns in one corpus can be applied to other observational study designs by simply interchanging the dictionaries aimed at identifying certain characteristics (i.e., outcomes
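
    A toy illustration of a knowledge-driven syntactic rule of the kind described, here for exposure phrases; the regular expression and the example abstract are illustrative assumptions, not the rules used in the study:

```python
import re

# Toy extraction rule: capture the object of "exposure to X" up to a
# simple syntactic boundary. Pattern and sentence are illustrative only.
EXPOSURE_RULE = re.compile(
    r"exposures? to ([a-z][a-z \-]+?)(?:,| and | in | was | were |\.)",
    re.IGNORECASE)

abstract = ("We studied occupational exposure to formaldehyde and "
            "residential exposure to air pollution in a cohort study.")
print(EXPOSURE_RULE.findall(abstract))   # ['formaldehyde', 'air pollution']
```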

  15. Automated correlation and classification of secondary ion mass spectrometry images using a k-means cluster method.

    Science.gov (United States)

    Konicek, Andrew R; Lefman, Jonathan; Szakal, Christopher

    2012-08-07

    We present a novel method for correlating and classifying ion-specific time-of-flight secondary ion mass spectrometry (ToF-SIMS) images within a multispectral dataset by grouping images with similar pixel intensity distributions. Binary centroid images are created by employing a k-means-based custom algorithm. Centroid images are compared to grayscale SIMS images using a newly developed correlation method that assigns the SIMS images to classes that have similar spatial (rather than spectral) patterns. Image features of both large and small spatial extent are identified without the need for image pre-processing, such as normalization or fixed-range mass-binning. A subsequent classification step tracks the class assignment of SIMS images over multiple iterations of increasing n classes per iteration, providing information about groups of images that have similar chemistry. Details are discussed while presenting data acquired with ToF-SIMS on a model sample of laser-printed inks. This approach can lead to the identification of distinct ion-specific chemistries for mass spectral imaging by ToF-SIMS, as well as matrix-assisted laser desorption ionization (MALDI), and desorption electrospray ionization (DESI).
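
    A minimal sketch of the two stages described above: binary centroid images from k-means on pixel intensities, then assignment of every ion image to the centroid whose spatial pattern it correlates with best. The random image stack and the choice of template images are illustrative:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical stack of ion-specific SIMS images (50 masses, 64x64 pixels).
images = rng.random((50, 64, 64))

# 1. Binary centroid image from k-means on one image's pixel intensities.
def centroid_image(img, n_clusters=2):
    flat = img.reshape(-1, 1)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(flat)
    bright = np.argmax(km.cluster_centers_.ravel())
    return (km.labels_ == bright).reshape(img.shape)

centroids = np.array([centroid_image(im) for im in images[:5]])  # templates

# 2. Assign every image to the centroid whose spatial pattern it
#    correlates with best (similar chemistry groups together).
def best_class(img):
    flat = img.ravel()
    corrs = [np.corrcoef(flat, c.ravel().astype(float))[0, 1]
             for c in centroids]
    return int(np.argmax(corrs))

assignments = [best_class(im) for im in images]
print(np.bincount(assignments))
```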

  16. Automated and simultaneous fovea center localization and macula segmentation using the new dynamic identification and classification of edges model

    Science.gov (United States)

    Onal, Sinan; Chen, Xin; Satamraju, Veeresh; Balasooriya, Maduka; Dabil-Karacal, Humeyra

    2016-01-01

    Detecting the position of retinal structures, including the fovea center and macula, in retinal images plays a key role in diagnosing eye diseases such as optic nerve hypoplasia, amblyopia, diabetic retinopathy, and macular edema. However, current detection methods are unreliable for infants or certain ethnic populations. Thus, a methodology that automatically localizes the fovea center and segments the macula on digital fundus images is proposed here; it may be useful for infants and across ethnicities. First, dark structures and bright artifacts are removed from the input image using preprocessing operations, and the resulting image is transformed to polar space. Second, the fovea center is identified, and the macula region is segmented using the proposed dynamic identification and classification of edges (DICE) model. The performance of the method was evaluated using 1200 fundus images obtained from the relatively large, diverse, and publicly available Messidor database. In 96.1% of these 1200 cases, the distance between the fovea center identified manually by ophthalmologists and automatically using the proposed method remained within 0 to 8 pixels. The Dice similarity index comparing the manually obtained results with those of the model for macula segmentation was 96.12% for these 1200 cases. Thus, the proposed method displayed a high degree of accuracy. The methodology using the DICE model is unique and advantageous over previously reported methods because it simultaneously determines the fovea center and segments the macula region without using any structural information, such as optic disc or blood vessel location, and it may prove useful for all populations, including infants. PMID:27660803
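
    For reference, the Dice similarity index used to score the macula segmentation can be computed as in this minimal sketch (synthetic masks, not the authors' code).

        import numpy as np

        def dice(a, b):
            """Dice similarity index between two binary segmentation masks."""
            a, b = a.astype(bool), b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # Hypothetical manual vs. automatic macula masks.
        manual = np.zeros((64, 64), bool); manual[20:40, 20:40] = True
        auto = np.zeros((64, 64), bool); auto[22:42, 21:41] = True
        print(round(dice(manual, auto), 4))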

  17. Spectral matching techniques (SMTs) and automated cropland classification algorithms (ACCAs) for mapping croplands of Australia using MODIS 250-m time-series (2000–2015) data

    Science.gov (United States)

    Teluguntla, Pardhasaradhi G.; Thenkabail, Prasad S.; Xiong, Jun N.; Gumma, Murali Krishna; Congalton, Russell G.; Oliphant, Adam; Poehnelt, Justin; Yadav, Kamini; Rao, Mahesh N.; Massey, Richard

    2017-01-01

    Mapping croplands, including fallow areas, is an important measure for determining the quantity of food that is produced, where it is produced, and when it is produced (e.g., seasonality). Furthermore, croplands are heavy consumers of water, accounting for anywhere between 70% and 90% of all human water use globally. Given these facts and the increase in global population to nearly 10 billion by the year 2050, the need for routine, rapid, and automated cropland mapping year-after-year and/or season-after-season is of great importance. The overarching goal of this study was to generate standard and routine cropland products, year-after-year, over very large areas through the use of two novel methods: (a) quantitative spectral matching techniques (QSMTs) applied at the continental level and (b) a rule-based Automated Cropland Classification Algorithm (ACCA) with the ability to hind-cast, now-cast, and future-cast. Australia was chosen for the study given its extensive croplands, rich history of agriculture, and yet nonexistent routine yearly generated cropland products using multi-temporal remote sensing. This research produced three distinct cropland products using Moderate Resolution Imaging Spectroradiometer (MODIS) 250-m normalized difference vegetation index 16-day composite time-series data for 16 years: 2000 through 2015. The products consisted of: (1) cropland extent/areas versus cropland fallow areas, (2) irrigated versus rainfed croplands, and (3) cropping intensities: single, double, and continuous cropping. An accurate reference cropland product (RCP) for the year 2014 (RCP2014), produced using QSMT, was used as a knowledge base to train and develop the ACCA algorithm, which was then applied to the MODIS time-series data for the years 2000–2015. A comparison between the ACCA-derived cropland products (ACPs) for the year 2014 (ACP2014) and RCP2014 provided an overall agreement of 89.4% (kappa = 0.814) with six classes: (a) producer’s accuracies varying
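
    The reported overall agreement and kappa can be reproduced from a confusion matrix as in this short sketch; the 2x2 counts are hypothetical, not the study's six-class table.

        import numpy as np

        def agreement_and_kappa(cm):
            """Overall agreement and Cohen's kappa from a square confusion matrix."""
            cm = np.asarray(cm, float)
            total = cm.sum()
            po = np.trace(cm) / total                      # observed agreement
            pe = (cm.sum(0) * cm.sum(1)).sum() / total**2  # chance agreement
            return po, (po - pe) / (1 - pe)

        # Hypothetical cropland vs. non-cropland cross-tabulation.
        cm = [[820, 70],
              [40, 470]]
        po, kappa = agreement_and_kappa(cm)
        print(f"overall agreement {po:.1%}, kappa {kappa:.3f}")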

  18. Particle Swarm Optimization approach to defect detection in armour ceramics.

    Science.gov (United States)

    Kesharaju, Manasa; Nagarajah, Romesh

    2017-03-01

    In this research, various extracted features were used in the development of an automated ultrasonic-sensor-based inspection system that enables defect classification in each ceramic component prior to despatch to the field. Classification is an important task, and the large number of irrelevant, redundant features commonly introduced into a dataset reduces a classifier's performance. Feature selection aims to reduce the dimensionality of the dataset while improving the performance of a classification system. In the context of a multi-criteria optimization problem (i.e., minimizing the classification error rate while reducing the number of features) such as the one discussed in this research, the literature suggests that evolutionary algorithms offer good results. Moreover, Particle Swarm Optimization (PSO) has not been explored in the classification of high-frequency ultrasonic signals. Hence, a binary-coded Particle Swarm Optimization (BPSO) technique is investigated for feature subset selection and for optimizing the classification error rate. In the proposed method, the population data is used as input to an Artificial Neural Network (ANN) based classification system to obtain the error rate, as the ANN serves as the evaluator of the PSO fitness function. Copyright © 2016. Published by Elsevier B.V.
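
    A compact sketch of binary PSO for feature subset selection follows, with an sklearn multilayer perceptron standing in for the ANN fitness evaluator; the swarm size, coefficients, and synthetic data are illustrative assumptions, not the paper's settings.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)
        # Synthetic stand-in for extracted ultrasonic signal features.
        X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                                   random_state=1)

        def error_rate(mask):
            """Fitness: cross-validated error of an ANN on the selected features."""
            mask = mask.astype(bool)
            if not mask.any():
                return 1.0
            ann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=1)
            return 1.0 - cross_val_score(ann, X[:, mask], y, cv=3).mean()

        n_particles, n_feat, iters = 8, X.shape[1], 10
        pos = (rng.random((n_particles, n_feat)) < 0.5).astype(int)   # binary positions
        vel = rng.normal(0.0, 1.0, (n_particles, n_feat))
        pbest = pos.copy()
        pbest_f = np.array([error_rate(p) for p in pos])
        gbest = pbest[pbest_f.argmin()].copy()

        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, n_feat))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            # Sigmoid of velocity gives the probability of selecting each feature.
            pos = (rng.random((n_particles, n_feat)) < 1.0 / (1.0 + np.exp(-vel))).astype(int)
            f = np.array([error_rate(p) for p in pos])
            improved = f < pbest_f
            pbest[improved], pbest_f[improved] = pos[improved], f[improved]
            gbest = pbest[pbest_f.argmin()].copy()

        print("selected features:", np.flatnonzero(gbest), "error:", round(pbest_f.min(), 3))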

  19. Adaptive Automation Design and Implementation

    Science.gov (United States)

    2015-09-17

    …with an automated system to a real-world adaptive automation system implementation. There have been plenty of adaptive automation … of systems without increasing manpower requirements by allocating routine tasks to automated aids, improving safety through the use of automated … between intermediate levels of automation, explicitly defining which human task a given level automates. Each model aids the creation and classification

  20. Evaluation of cell count and classification capabilities in body fluids using a fully automated Sysmex XN equipped with high-sensitive Analysis (hsA) mode and DI-60 hematology analyzer system.

    Science.gov (United States)

    Takemura, Hiroyuki; Ai, Tomohiko; Kimura, Konobu; Nagasaka, Kaori; Takahashi, Toshihiro; Tsuchiya, Koji; Yang, Haeun; Konishi, Aya; Uchihashi, Kinya; Horii, Takashi; Tabe, Yoko; Ohsaka, Akimichi

    2018-01-01

    The XN series automated hematology analyzer has been equipped with a body fluid (BF) mode to count and differentiate leukocytes in BF samples, including cerebrospinal fluid (CSF). However, its diagnostic accuracy is not reliable for CSF samples with low cell concentrations at the border between normal and pathologic levels. To overcome this limitation, a new flow cytometry-based technology, termed "high sensitive analysis (hsA) mode," has been developed. In addition, the XN series analyzer has been equipped with the automated digital cell imaging analyzer DI-60 to classify cell morphology, including normal leukocyte differentials and the detection of abnormal malignant cells. Using various BF samples, we evaluated the performance of the XN-hsA mode and DI-60 compared to manual microscopic examination. The reproducibility of the XN-hsA mode was good for samples with low cell densities (coefficient of variation, %CV: 7.8% for 6 cells/μL). The linearity of the XN-hsA mode was established up to 938 cells/μL. Cell counts obtained using the XN-hsA mode correlated highly with the corresponding microscopic examinations. Good correlation was also observed between the DI-60 analyses and manual microscopic classification for all leukocyte types except monocytes. In conclusion, the combined use of cell counting with the XN-hsA mode and automated morphological analysis using the DI-60 mode is potentially useful for the automated analysis of BF cells.

  1. The Effect of Automation on Job Duties, Classifications, Staffing Patterns, and Labor Costs in the UBC Library's Cataloguing Divisions: A Comparison of 1973 and 1986.

    Science.gov (United States)

    de Bruijn, Erik

    This report discusses an ex post facto study that was done to examine the effect that the implementation of automated systems has had on libraries and support staff, labor costs, and productivity in the cataloging divisions of the library of the University of British Columbia. A comparison was made between two years: 1973, a pre-automated period…

  2. Comparison between the Correlations of Retinal Nerve Fiber Layer Thickness Measured by Spectral Domain Optical Coherence Tomography and Visual Field Defects in Standard Automated White-on-White Perimetry versus Pulsar Perimetry.

    Science.gov (United States)

    Alnawaiseh, Maged; Hömberg, Lisann; Eter, Nicole; Prokosch, Verena

    2017-01-01

    To compare the structure-function relationships between retinal nerve fiber layer thickness (RNFLT) and visual field defects measured either by standard automated perimetry (SAP) or by Pulsar perimetry (PP). 263 eyes of 143 patients were prospectively included. Depending on the RNFLT, patients were assigned to the glaucoma group (group A: RNFL score 3-6) or the control group (group B: RNFL score 0-2). Structure-function relationships between RNFLT and mean sensitivity (MS) measured by SAP and PP were analyzed. Throughout the entire group, the MS assessed by PP and SAP correlated significantly with RNFLT in all sectors. In the glaucoma group, there was no significant difference between the correlations RNFL-SAP and RNFL-PP, whereas a significant difference was found in the control group. In the control group, the correlation between structure and function based on the PP data was significantly stronger than that based on SAP.

  3. Quality Control in Automated Manufacturing Processes – Combined Features for Image Processing

    Directory of Open Access Journals (Sweden)

    B. Kuhlenkötter

    2006-01-01

    In production processes the use of image processing systems is widespread, and hardware solutions and cameras are available for nearly every application. One important challenge for image processing systems is the development and selection of appropriate algorithms and software solutions to realise ambitious quality control for production processes. This article characterises the development of innovative software that combines features for automatic defect classification on product surfaces. The artificial intelligence method Support Vector Machine (SVM) is used to execute the classification task according to the combined features. This software is one crucial element in the automation of a manually operated production process.
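
    A minimal sketch of SVM-based classification on combined surface features; the synthetic data, class labels, and kernel settings are illustrative assumptions, not the article's configuration.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Hypothetical combined feature vectors for surface patches
        # (e.g., texture statistics concatenated with geometric measures).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 12))
        y = rng.integers(0, 3, size=300)   # 0 = defect-free, 1 = scratch, 2 = pore

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
        clf.fit(X[:250], y[:250])
        print("held-out accuracy:", clf.score(X[250:], y[250:]))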

  4. Development of Adaptive AE Signal Pattern Recognition Program and Application to Classification of Defects in Metal Contact Regions of Rotating Component

    International Nuclear Information System (INIS)

    Lee, K. Y.; Lee, C. M.; Kim, J. S.

    1996-01-01

    In this study, artificial defects in a rotary compressor are classified using pattern recognition of acoustic emission signals. For this purpose, a computer program was developed. The neural network classifier is compared with statistical classifiers such as the linear discriminant function classifier and the empirical Bayesian classifier, and is found to perform better. A recognition rate above 99% is achieved with the neural network classifier.

  5. Automated evaluation of ultrasonic indications

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method with its large field of applications and a high potential for automation provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.)

  6. Birth Defects

    Science.gov (United States)

    A birth defect is a problem that happens while a baby is developing in the mother's body. Most birth defects happen during the first 3 months of ... in the United States is born with a birth defect. A birth defect may affect how the ...

  7. The classification of motor neuron defects in the zebrafish embryo toxicity test (ZFET) as an animal alternative approach to assess developmental neurotoxicity.

    Science.gov (United States)

    Muth-Köhne, Elke; Wichmann, Arne; Delov, Vera; Fenske, Martina

    2012-07-01

    Rodents are widely used to test the developmental neurotoxicity potential of chemical substances. The regulatory test procedures are elaborate and the requirement of numerous animals is ethically disputable. Therefore, non-animal alternatives are highly desirable, but appropriate test systems that meet regulatory demands are not yet available. Hence, we have developed a new developmental neurotoxicity assay based on specific whole-mount immunostainings of primary and secondary motor neurons (using the monoclonal antibodies znp1 and zn8) in zebrafish embryos. By classifying the motor neuron defects, we evaluated the severity of the neurotoxic damage to individual primary and secondary motor neurons caused by chemical exposure and determined the corresponding effect concentration values (EC₅₀). In a proof-of-principle study, we investigated the effects of three model compounds thiocyclam, cartap and disulfiram, which show some neurotoxicity-indicating effects in vertebrates, and the positive controls ethanol and nicotine and the negative controls 3,4-dichloroaniline (3,4-DCA) and triclosan. As a quantitative measure of the neurotoxic potential of the test compounds, we calculated the ratios of the EC₅₀ values for motor neuron defects and the cumulative malformations, as determined in a zebrafish embryo toxicity test (ZFET). Based on this index, disulfiram was classified as the most potent and thiocyclam as the least potent developmental neurotoxin. The index also confirmed the control compounds as positive and negative neurotoxicants. Our findings demonstrate that this index can be used to reliably distinguish between neurotoxic and non-neurotoxic chemicals and provide a sound estimate for the neurodevelopmental hazard potential of a chemical. The demonstrated method can be a feasible approach to reduce the number of animals used in developmental neurotoxicity evaluation procedures. Copyright © 2012 Elsevier Inc. All rights reserved.
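
    As a small illustration of the described index computation (the ratio direction, units, and values below are hypothetical assumptions, not the paper's measurements):

        # Neurotoxic potential index: ratio of the EC50 for cumulative
        # malformations (ZFET) to the EC50 for motor neuron defects.
        # Hypothetical values in mg/L.
        compounds = {"disulfiram": (1.2, 0.05), "thiocyclam": (8.0, 6.5)}
        for name, (ec50_malformation, ec50_neuron) in compounds.items():
            print(name, round(ec50_malformation / ec50_neuron, 2))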

  8. ON THE WAYS OF AUTOMATED PROCESSING OF SPATIAL GEOMETRY OF THE SYSTEM “GATE-CASTING” FOR SOLVING OF THE CLASSIFICATION PROBLEMS

    Directory of Open Access Journals (Sweden)

    A. N. Chichko

    2007-01-01

    A system parameterization of castings that formalizes their spatial geometry is proposed. A taxonomy algorithm is described that can be used for solving casting classification problems in computer-aided design systems for foundry technologies. The method is validated on castings of the "cover" type.

  9. Investigation of Three-Group Classifiers to Fully Automate Detection and Classification of Breast Lesions in an Intelligent CAD Mammography Workstation

    National Research Council Canada - National Science Library

    Edwards, Darrin C; Metz, Charles E; Giger, Maryellen Lissak

    2007-01-01

    .... We proved that the area under the ROC curve (AUC) is not useful in classification tasks with three or more groups, and showed that the three decision boundary lines used by the three-group ideal observer are intricately related to one another...

  10. Model-based classification of CPT data and automated lithostratigraphic mapping for high-resolution characterization of a heterogeneous sedimentary aquifer

    OpenAIRE

    Rogiers, Bart; Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain

    2017-01-01

    Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type classification (SBT) of CPT data however have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT...

  11. Defect modelling

    International Nuclear Information System (INIS)

    Norgett, M.J.

    1980-01-01

    Calculations, drawing principally on developments at AERE Harwell, of the relaxation about lattice defects are reviewed with emphasis on the techniques required for such calculations. The principles of defect modelling are outlined and various programs developed for defect simulations are discussed. Particular calculations for metals, ionic crystals and oxides, are considered. (UK)

  12. Improving Student Question Classification

    Science.gov (United States)

    Heiner, Cecily; Zachary, Joseph L.

    2009-01-01

    Students in introductory programming classes often articulate their questions and information needs incompletely. Consequently, the automatic classification of student questions to provide automated tutorial responses is a challenging problem. This paper analyzes 411 questions from an introductory Java programming course by reducing the natural…

  13. Defining defect specifications to optimize photomask production and requalification

    Science.gov (United States)

    Fiekowsky, Peter

    2006-10-01

    Reducing defect repairs and accelerating defect analysis is becoming more important as the total cost of defect repairs on advanced masks increases. Photomask defect specs based on printability, as measured on AIMS microscopes, have been used for years, but the fundamental defect spec is still the defect size, as measured on the photomask, requiring the repair of many unprintable defects. ADAS, the Automated Defect Analysis System from AVI, is now available in most advanced mask shops. It makes the use of pure printability specs, or "Optimal Defect Specs", practical. This software uses advanced algorithms to eliminate false defects caused by approximations in the inspection algorithm, classify each defect, simulate each defect, and disposition each defect based on its printability and location. This paper defines "optimal defect specs", explains why they are now practical and economic, gives a method of determining them, and provides accuracy data.

  14. Creation of Defects Catalogue for Nonconforming Product Identification in the Foundry Organization

    Directory of Open Access Journals (Sweden)

    Andrea Sütőová

    2013-12-01

    The paper deals with the classification of casting defects and the creation of a defects catalogue in a foundry organization. The literature review describes the value of correct defect classification and identification, and some tools for defect classification are mentioned. Existing defect classifications and catalogues are often unusable for particular production processes and casting technologies; many foundries therefore create their own defects catalogues. A sample of the created catalogue, which classifies and describes defects occurring in an aluminium foundry organization, is presented in the paper together with its benefits. The created catalogue primarily serves as a visual support for production operators and quality control processes.

  15. Integrating dimension reduction and out-of-sample extension in automated classification of ex vivo human patellar cartilage on phase contrast X-ray computed tomography.

    Science.gov (United States)

    Nagarajan, Mahesh B; Coan, Paola; Huber, Markus B; Diemoz, Paul C; Wismüller, Axel

    2015-01-01

    Phase contrast X-ray computed tomography (PCI-CT) has been demonstrated as a novel imaging technique that can visualize human cartilage with high spatial resolution and soft tissue contrast. Different textural approaches have been previously investigated for characterizing chondrocyte organization on PCI-CT to enable classification of healthy and osteoarthritic cartilage. However, the large size of feature sets extracted in such studies motivates an investigation into algorithmic feature reduction for computing efficient feature representations without compromising their discriminatory power. For this purpose, geometrical feature sets derived from the scaling index method (SIM) were extracted from 1392 volumes of interest (VOI) annotated on PCI-CT images of ex vivo human patellar cartilage specimens. The extracted feature sets were subject to linear and non-linear dimension reduction techniques as well as feature selection based on evaluation of mutual information criteria. The reduced feature set was subsequently used in a machine learning task with support vector regression to classify VOIs as healthy or osteoarthritic; classification performance was evaluated using the area under the receiver-operating characteristic (ROC) curve (AUC). Our results show that the classification performance achieved by 9-D SIM-derived geometric feature sets (AUC: 0.96 ± 0.02) can be maintained with 2-D representations computed from both dimension reduction and feature selection (AUC values as high as 0.97 ± 0.02). Thus, such feature reduction techniques can offer a high degree of compaction to large feature sets extracted from PCI-CT images while maintaining their ability to characterize the underlying chondrocyte patterns.
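
    A minimal sketch of the dimension-reduction-plus-classification pipeline, assuming synthetic stand-in features, PCA in place of the specific reduction techniques evaluated, and support vector regression scored by ROC AUC:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.decomposition import PCA
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVR

        # Synthetic stand-in for 9-D SIM-derived geometric features per VOI.
        X, y = make_classification(n_samples=400, n_features=9, n_informative=5,
                                   random_state=0)
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

        # Reduce to a 2-D representation, then score VOIs with the continuous
        # output of support vector regression, as in the study's setup.
        pca = PCA(n_components=2).fit(Xtr)
        svr = SVR(kernel="rbf").fit(pca.transform(Xtr), ytr)
        print("AUC:", round(roc_auc_score(yte, svr.predict(pca.transform(Xte))), 3))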

  16. Toward Intelligent Software Defect Detection

    Science.gov (United States)

    Benson, Markland J.

    2011-01-01

    Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.

  17. The classification of phocomelia.

    Science.gov (United States)

    Tytherleigh-Strong, G; Hooper, G

    2003-06-01

    We studied 24 patients with 44 phocomelic upper limbs. Only 11 limbs could be grouped in the classification system of Frantz and O'Rahilly. The non-classifiable limbs were further studied and their characteristics identified. It is confirmed that phocomelia is not an intercalary defect.

  18. Multisource multibeam backscatter data: developing a strategy for the production of benthic habitat maps using semi-automated seafloor classification methods

    Science.gov (United States)

    Lacharité, Myriam; Brown, Craig J.; Gazzola, Vicki

    2018-06-01

    The establishment of multibeam echosounders (MBES) as a mainstream tool in ocean mapping has facilitated integrative approaches towards nautical charting, benthic habitat mapping, and seafloor geotechnical surveys. The bathymetric and backscatter information generated by MBES enables marine scientists to present highly accurate bathymetric data with a spatial resolution closely matching that of terrestrial mapping, and can generate customized thematic seafloor maps to meet multiple ocean management needs. However, when a variety of MBES systems are used, the creation of objective habitat maps can be hindered by the lack of backscatter calibration, due for example, to system-specific settings, yielding relative rather than absolute values. Here, we describe an approach using object-based image analysis to combine 4 non-overlapping and uncalibrated (backscatter) MBES coverages to form a seamless habitat map on St. Anns Bank (Atlantic Canada), a marine protected area hosting a diversity of benthic habitats. The benthoscape map was produced by analysing each coverage independently with supervised classification (k-nearest neighbor) of image-objects based on a common suite of 7 benthoscapes (determined with 4214 ground-truthing photographs at 61 stations, and characterized with backscatter, bathymetry, and bathymetric position index). Manual re-classification based on uncertainty in membership values to individual classes—especially at the boundaries between coverages—was used to build the final benthoscape map. Given the costs and scarcity of MBES surveys in offshore marine ecosystems—particularly in large ecosystems in need of adequate conservation strategies, such as in Canadian waters—developing approaches to synthesize multiple datasets to meet management needs is warranted.
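
    A minimal sketch of supervised k-nearest-neighbor classification of image-objects, with membership probabilities used to flag uncertain objects for manual re-classification; the features, class count, and data are illustrative assumptions.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        # Hypothetical image-objects described by mean backscatter, depth, and
        # bathymetric position index, labeled with one of 7 benthoscape classes
        # at ground-truthing stations.
        rng = np.random.default_rng(3)
        X_train = rng.normal(size=(61, 3))
        y_train = rng.integers(0, 7, size=61)

        knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
        proba = knn.predict_proba(rng.normal(size=(5, 3)))
        # A low maximum membership value flags an object for manual review,
        # e.g., at the boundaries between survey coverages.
        print(proba.max(axis=1))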

  19. Multi-Agent Information Classification Using Dynamic Acquaintance Lists.

    Science.gov (United States)

    Mukhopadhyay, Snehasis; Peng, Shengquan; Raje, Rajeev; Palakal, Mathew; Mostafa, Javed

    2003-01-01

    Discussion of automated information services focuses on information classification and collaborative agents, i.e. intelligent computer programs. Highlights include multi-agent systems; distributed artificial intelligence; thesauri; document representation and classification; agent modeling; acquaintances, or remote agents discovered through…

  20. Classifying Classifications

    DEFF Research Database (Denmark)

    Debus, Michael S.

    2017-01-01

    This paper critically analyzes seventeen game classifications. The classifications were chosen on the basis of diversity, ranging from pre-digital classifications (e.g. Murray 1952), over game studies classifications (e.g. Elverdam & Aarseth 2007), to classifications of drinking games (e.g. LaBrie et al. 2013). The analysis aims at three goals: the classifications' internal consistency, the abstraction of classification criteria, and the identification of differences in classification across fields and/or time. Especially the abstraction of classification criteria can be used in future endeavors into the topic of game classifications.

  1. Assessing the Spatial and Occupation Dynamics of the Brazilian Pasturelands Based on the Automated Classification of MODIS Images from 2000 to 2016

    Directory of Open Access Journals (Sweden)

    Leandro Parente

    2018-04-01

    The pasturelands of Brazil constitute an important asset for the country: they are the main food source for the world's largest commercial herd and represent the largest stock of open land in the country, occupying ~21% of the national territory. Understanding the spatio-temporal dynamics of these areas is of fundamental importance for the goal of promoting improved territorial governance, emission mitigation and productivity gains. To this effect, this study mapped, through objective criteria and automatic classification methods (Random Forest) applied to MODIS (Moderate Resolution Imaging Spectroradiometer) images, the totality of the Brazilian pastures between 2000 and 2016. Based on 90 spectro-temporal metrics derived from the Red, NIR and SWIR1 bands and distinct vegetation indices, distributed between dry and wet seasons, a total of 17 pasture maps with an approximate overall accuracy of 80% were produced with cloud computing (Google Earth Engine). During this period, the pasture area varied from ~152 (2000) to ~179 (2016) million hectares. This expansion pattern was consistent with the bovine herd variation and mostly occurred in the Amazon, which increased its total pasture area by ~15 million hectares between 2000 and 2005, while the Cerrado, Caatinga and Pantanal biomes showed an increase of ~8 million hectares in this same period. The Atlantic Forest was the only biome in which there was a retraction of pasture areas throughout this series. In general, the results of this study suggest the existence of two relevant moments for Brazilian pasture land use. The first, strongly supported by the opening of new grazing areas, prevailed between 2000 and 2005 and mostly occurred in the Deforestation Arc and in the Matopiba regions. From 2006 on, the total pasture area in Brazil showed a trend towards stabilization, indicating a slight intensification of livestock activity in recent years.
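
    A minimal sketch of the Random Forest classification step on spectro-temporal metrics, with synthetic stand-in data and illustrative hyperparameters (the study itself ran its classification in Google Earth Engine):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Synthetic stand-in: 90 spectro-temporal metrics per pixel, with a
        # binary pasture / non-pasture label.
        rng = np.random.default_rng(7)
        X = rng.normal(size=(1000, 90))
        y = rng.integers(0, 2, size=1000)

        rf = RandomForestClassifier(n_estimators=200, random_state=7).fit(X[:800], y[:800])
        print("out-of-sample accuracy:", rf.score(X[800:], y[800:]))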

  2. Automated Resource Classifier for agglomerative functional ...

    Indian Academy of Sciences (India)

    2007-06-16

    Jun 16, 2007 ... Automated resource; functional classification; integrative biology ... which is an open source software meeting the user requirements of flexibility. ... entries into any of the 7 basic non-overlapping functional classes: Cell wall, ...

  3. Little string origin of surface defects

    Energy Technology Data Exchange (ETDEWEB)

    Haouzi, Nathan; Schmid, Christian [Center for Theoretical Physics, University of California, Berkeley, LeConte Hall, Berkeley (United States)]

    2017-05-16

    We derive a large class of codimension-two defects of 4d N=4 Super Yang-Mills (SYM) theory from the (2,0) little string. The origin of the little string is type IIB theory compactified on an ADE singularity. The defects are D-branes wrapping the 2-cycles of the singularity. We use this construction to make contact with the description of SYM defects due to Gukov and Witten https://arxiv.org/abs/hep-th/0612073. Furthermore, we provide a geometric perspective on the nilpotent orbit classification of codimension-two defects, and the connection to ADE-type Toda CFT. The only data needed to specify the defects is a set of weights of the algebra obeying certain constraints, which we give explicitly. We highlight the differences between the defect classification in the little string theory and its (2,0) CFT limit.

  4. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time, software- and hardware-oriented house automation research project that was designed and implemented to automate a house's electricity and provide a security system that detects the presence of unexpected behavior.

  5. Advanced defect classification by optical metrology

    NARCIS (Netherlands)

    Maas, D.J.

    2017-01-01

    The goal of the workshop is to provide a high-level, invitation-only, international community that accelerates interactions between the main target groups: universities, institutes, entrepreneurs, intrapreneurs and investors, in order to facilitate customer development, application discovery or funding.

  6. Defects and defect processes in nonmetallic solids

    CERN Document Server

    Hayes, W

    2004-01-01

    This extensive survey covers defects in nonmetals, emphasizing point defects and point-defect processes. It encompasses electronic, vibrational, and optical properties of defective solids, plus dislocations and grain boundaries. 1985 edition.

  7. Cost Accounting in the Automated Manufacturing Environment

    Science.gov (United States)

    1988-06-01

    Naval Postgraduate School, Monterey, California. Thesis: Cost Accounting in the Automated Manufacturing Environment. Keywords: cost accounting; product costing; automated manufacturing; CAD/CAM; CIM.

  8. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In advanced MCRs, various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation is achieved by calculating the failure probability of human performance related to the cognitive activities

  9. Embedded defects

    International Nuclear Information System (INIS)

    Barriola, M.; Vachaspati, T.; Bucher, M.

    1994-01-01

    We give a prescription for embedding classical solutions and, in particular, topological defects in field theories which are invariant under symmetry groups that are not necessarily simple. After providing examples of embedded defects in field theories based on simple groups, we consider the electroweak model and show that it contains the Z string and a one-parameter family of strings called the W(α) string. It is argued that although the members of this family are gauge equivalent when considered in isolation, each member becomes physically distinct when multistring configurations are considered. We then turn to the issue of stability of embedded defects and demonstrate the instability of a large class of such solutions in the absence of bound states or condensates. The Z string is shown to be unstable for all values of the Higgs boson mass when θ_W = π/4. W strings are also shown to be unstable for a large range of parameters. Embedded monopoles suffer from the Brandt-Neri-Coleman instability. Finally, we connect the electroweak string solutions to the sphaleron

  10. Automated classification of computer network attacks

    CSIR Research Space (South Africa)

    Van Heerden, R

    2013-11-01

    … according to the relevant types of attack scenarios depicted in the ontology. The two network attack instances are the Distributed Denial of Service attack on SpamHaus in 2013 and the theft of 42 million Rand ($6.7 million) from South African Postbank...

  11. "SAFEGUARDING THE INTERESTS OF THE STATE" FROM DEFECTIVE DELINQUENT GIRLS.

    Science.gov (United States)

    Sohasky, Kate E

    2016-01-01

    The 1911 mental classification, "defective delinquent," was created as a temporary legal-medical category in order to identify a peculiar class of delinquent girls in a specific institutional setting. The defective delinquent's alleged slight mental defect, combined with her appearance of normalcy, rendered her a "dangerous" and "incurable" citizen. At the intersection of institutional history and the history of ideas, this article explores the largely overlooked role of borderline mental classifications of near-normalcy in the medicalization of intelligence and criminality during the first third of the twentieth-century United States. Borderline classifications served as mechanisms of control over women's bodies through the criminalization of their minds, and the advent of psychometric tests legitimated and facilitated the spread of this classification beyond its original and intended context. The borderline case of the defective delinquent girl demonstrates the significance of marginal mental classifications to the policing of bodies through the medicalization of intellect. © 2015 Wiley Periodicals, Inc.

  12. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  13. Empirical evaluation of three machine learning methods for automatic classification of neoplastic diagnoses

    Directory of Open Access Journals (Sweden)

    José Luis Jara

    2011-12-01

    Diagnoses are a valuable source of information for evaluating a health system. However, they are not used extensively by information systems because diagnoses are normally written in natural language. This work empirically evaluates three machine learning methods to automatically assign codes from the International Classification of Diseases (10th Revision) to 3,335 distinct diagnoses of neoplasms obtained from UMLS®. This evaluation is conducted on three different types of preprocessing. The results are encouraging: a well-known rule induction method and maximum entropy models achieve 90% accuracy in a balanced cross-validation experiment.
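
    A minimal sketch of a maximum entropy (multinomial logistic regression) text classifier for diagnosis-to-ICD coding; the example diagnoses and codes are illustrative, not the UMLS training data.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Tiny illustrative diagnosis-to-ICD-10 pairs.
        texts = ["malignant neoplasm of stomach", "benign neoplasm of skin",
                 "malignant neoplasm of breast", "benign neoplasm of colon"]
        codes = ["C16", "D23", "C50", "D12"]

        # Logistic regression on word/bigram features is a standard maxent setup.
        model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                              LogisticRegression(max_iter=1000))
        model.fit(texts, codes)
        print(model.predict(["neoplasm of stomach"]))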

  14. A Generic Deep-Learning-Based Approach for Automated Surface Inspection.

    Science.gov (United States)

    Ren, Ruoxu; Hung, Terence; Tan, Kay Chen

    2018-03-01

    Automated surface inspection (ASI) is a challenging task in industry, as collecting a training dataset is usually costly and related methods are highly dataset-dependent. In this paper, a generic approach that requires little training data for ASI is proposed. First, the approach builds a classifier on features of image patches, where the features are transferred from a pretrained deep learning network. Next, a pixel-wise prediction is obtained by convolving the trained classifier over the input image. An experiment on three public data sets and one industrial data set is carried out. The experiment involves two tasks: 1) image classification and 2) defect segmentation. The results of the proposed algorithm are compared against several of the best benchmarks in the literature. In the classification tasks, the proposed method improves accuracy by 0.66%-25.50%. In the segmentation tasks, the proposed method reduces error escape rates by 6.00%-19.00% in three defect types and improves accuracies by 2.29%-9.86% in all seven defect types. In addition, the proposed method achieves a 0.0% error escape rate in the segmentation task on the industrial data.
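
    A minimal sketch of the transfer-learning idea, assuming a torchvision ResNet-18 backbone as the pretrained feature extractor (instantiated here without downloaded weights to stay self-contained; use pretrained weights in practice) and an SVM trained on patch features; the data and labels are synthetic.

        import numpy as np
        import torch
        import torchvision.models as models
        from sklearn.svm import SVC

        # Feature extractor: a CNN truncated before its classifier head.
        backbone = models.resnet18(weights=None)  # weights=None keeps this offline
        backbone.fc = torch.nn.Identity()
        backbone.eval()

        def patch_features(patches):
            """patches: (N, 3, 224, 224) float tensor -> (N, 512) feature array."""
            with torch.no_grad():
                return backbone(patches).numpy()

        # Train a conventional classifier on features of labeled patches.
        patches = torch.rand(16, 3, 224, 224)
        labels = np.random.randint(0, 2, 16)   # 0 = defect-free, 1 = defect
        clf = SVC().fit(patch_features(patches), labels)
        print(clf.predict(patch_features(torch.rand(2, 3, 224, 224))))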

  15. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on how a Distribution Automation (DA) system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements.

  16. Precise design-based defect characterization and root cause analysis

    Science.gov (United States)

    Xie, Qian; Venkatachalam, Panneerselvam; Lee, Julie; Chen, Zhijin; Zafar, Khurram

    2017-03-01

    that human operators will typically miss), to obtain the exact defect location on design, to compare all defective patterns thus detected against a library of known patterns, and to classify all defective patterns as either new or known. By applying the computer to these tasks, we automate the entire process from defective pattern identification to pattern classification with high precision, and we perform this operation en masse during R & D, ramp, and volume production. By adopting the methodology, whenever a specific weak pattern is identified, we are able to run a series of characterization operations to ultimately arrive at the root cause. These characterization operations can include (a) searching all pre-existing Review SEM images for the presence of the specific weak pattern to determine whether there is any spatial (within die or within wafer) or temporal (within any particular date range, before or after a mask revision, etc.) correlation, (b) understanding the failure rate of the specific weak pattern to prioritize the urgency of the problem, and (c) comparing the weak pattern against an OPC (Optical Proximity Correction) Verification report or a PWQ (Process Window Qualification)/FEM (Focus Exposure Matrix) result to assess the likelihood of it being a litho-sensitive pattern, etc. After resolving the specific weak pattern, we categorize it as a known pattern, and the engineer moves forward with discovering new weak patterns.

  17. Automated visual inspection of textile

    DEFF Research Database (Denmark)

    Jensen, Rune Fisker; Carstensen, Jens Michael

    1997-01-01

    A method for automated inspection of two types of textile is presented. The goal of the inspection is to detect defects in the textile. A prototype is constructed for simulating the textile production line. At the prototype, the images of the textile are acquired by a high-speed line-scan camera... the prototype to a production line system we only need to gain a speed factor of 4.

  18. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation, Volume 2, includes the best papers from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012, in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into six sessions on the basis of the classification of the manuscripts considered, listed as follows: Mathematical Modeling, Analysis and Computation; Control Engineering; Reliable Networks Design; Vehicular Communications and Networking; Automation and Mechatronics.

  19. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-based sample workflow control (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: a workload increase (64%) with a reduction in cost per test (43%), a significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, a drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes), and better total response time for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and without significant headcount reduction (15% in our lab).

  20. Comparison of Threshold Saccadic Vector Optokinetic Perimetry (SVOP) and Standard Automated Perimetry (SAP) in Glaucoma. Part II: Patterns of Visual Field Loss and Acceptability.

    Science.gov (United States)

    McTrusty, Alice D; Cameron, Lorraine A; Perperidis, Antonios; Brash, Harry M; Tatham, Andrew J; Agarwal, Pankaj K; Murray, Ian C; Fleck, Brian W; Minns, Robert A

    2017-09-01

    We compared patterns of visual field loss detected by standard automated perimetry (SAP) to saccadic vector optokinetic perimetry (SVOP) and examined patient perceptions of each test. A cross-sectional study was done of 58 healthy subjects and 103 with glaucoma who were tested using SAP and two versions of SVOP (v1 and v2). Visual fields from both devices were categorized by masked graders as: 0, normal; 1, paracentral defect; 2, nasal step; 3, arcuate defect; 4, altitudinal; 5, biarcuate; and 6, end-stage field loss. SVOP and SAP classifications were cross-tabulated. Subjects completed a questionnaire on their opinions of each test. We analyzed 142 (v1) and 111 (v2) SVOP and SAP test pairs. SVOP v2 had a sensitivity of 97.7% and specificity of 77.9% for identifying normal versus abnormal visual fields. SAP and SVOP v2 classifications showed complete agreement in 54% of glaucoma patients, with a further 23% disagreeing by one category. On repeat testing, 86% of SVOP v2 classifications agreed with the previous test, compared to 91% of SAP classifications; 71% of subjects preferred SVOP compared to 20% who preferred SAP. Eye-tracking perimetry can be used to obtain threshold visual field sensitivity values in patients with glaucoma and produce maps of visual field defects, with patterns exhibiting close agreement to SAP. Patients preferred eye-tracking perimetry compared to SAP. This first report of threshold eye tracking perimetry shows good agreement with conventional automated perimetry and provides a benchmark for future iterations.
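
    For reference, the reported sensitivity and specificity follow directly from a 2x2 cross-tabulation of normal/abnormal field results; the counts below are illustrative, chosen only to approximate the published percentages.

        # Sensitivity and specificity of SVOP v2 against the SAP reference.
        tp, fn = 101, 2    # abnormal fields: detected / missed (illustrative counts)
        tn, fp = 45, 13    # normal fields: correctly passed / flagged
        print(f"sensitivity {tp / (tp + fn):.1%}, specificity {tn / (tn + fp):.1%}")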

  1. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  2. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  3. Facts about Birth Defects

    Science.gov (United States)

    Facts about Birth Defects. … having a baby born without a birth defect. Birth defects are common: every 4½ minutes, a …

  4. Neural Tube Defects

    Science.gov (United States)

    Neural tube defects are birth defects of the brain, spine, or spinal cord. They happen in the ... that she is pregnant. The two most common neural tube defects are spina bifida and anencephaly. In ...

  5. USING SAS ENTERPRISE GUIDE SOFTWARE IN CLASSIFICATION

    OpenAIRE

    Ana Maria Mihaela IORDACHE

    2011-01-01

    Data mining, also known as "knowledge discovery in large databases," is a modern and powerful information and communications technology tool that can be used to extract useful but as yet unknown information. It automates the process of discovering relations and patterns in raw data, and the results can be incorporated into an automated decision support system. This paper aims to present and perform the classification of European Union countries based on the social indicators calculated...

  6. Developing a database management system to support birth defects surveillance in Florida.

    Science.gov (United States)

    Salemi, Jason L; Hauser, Kimberlea W; Tanner, Jean Paul; Sampat, Diana; Correia, Jane A; Watkins, Sharon M; Kirby, Russell S

    2010-01-01

    The value of any public health surveillance program is derived from the ways in which data are managed and used to improve the public's health. Although birth defects surveillance programs vary in their case volume, budgets, staff, and objectives, the capacity to operate efficiently and maximize resources remains critical to long-term survival. The development of a fully-integrated relational database management system (DBMS) can enrich a surveillance program's data and improve efficiency. To build upon the Florida Birth Defects Registry--a statewide registry relying solely on linkage of administrative datasets and unconfirmed diagnosis codes-the Florida Department of Health provided funding to the University of South Florida to develop and pilot an enhanced surveillance system in targeted areas with a more comprehensive approach to case identification and diagnosis confirmation. To manage operational and administrative complexities, a DBMS was developed, capable of managing transmission of project data from multiple sources, tracking abstractor time during record reviews, offering tools for defect coding and case classification, and providing reports to DBMS users. Since its inception, the DBMS has been used as part of our surveillance projects to guide the receipt of over 200 case lists and review of 12,924 fetuses and infants (with associated maternal records) suspected of having selected birth defects in over 90 birthing and transfer facilities in Florida. The DBMS has provided both anticipated and unexpected benefits. Automation of the processes for managing incoming case lists has reduced clerical workload considerably, while improving accuracy of working lists for field abstraction. Data quality has improved through more effective use of internal edits and comparisons with values for other data elements, while simultaneously increasing abstractor efficiency in completion of case abstraction. We anticipate continual enhancement to the DBMS in the future

  7. Pattern recognition and classification an introduction

    CERN Document Server

    Dougherty, Geoff

    2012-01-01

    The use of pattern recognition and classification is fundamental to many of the automated electronic systems in use today. However, despite the existence of a number of notable books in the field, the subject remains very challenging, especially for the beginner. Pattern Recognition and Classification presents a comprehensive introduction to the core concepts involved in automated pattern recognition. It is designed to be accessible to newcomers from varied backgrounds, but it will also be useful to researchers and professionals in image and signal processing and analysis, and in computer vision.

  8. Two datasets of defect reports labeled by a crowd of annotators of unknown reliability

    Directory of Open Access Journals (Sweden)

    Jerónimo Hernández-González

    2018-06-01

    Classifying software defects according to any defined taxonomy is not straightforward. In order to be used for automating the classification of software defects, two sets of defect reports were collected from public issue tracking systems in two different real domains. Due to the lack of a domain expert, the collected defects were categorized by a set of annotators of unknown reliability according to their impact, following IBM's orthogonal defect classification taxonomy. Both datasets are prepared to solve the defect classification problem by means of techniques of the learning-from-crowds paradigm (Hernández-González et al. [1]). Two versions of both datasets are publicly shared. In the first version, the raw data is given: the text description of defects together with the category assigned by each annotator. In the second version, the text of each defect has been transformed into a descriptive vector using text-mining techniques.
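
    As a simple baseline for aggregating such crowd labels (the learning-from-crowds techniques in the referenced work are more sophisticated), a majority-vote sketch; the report names and label values are illustrative.

        from collections import Counter

        # Each defect report carries impact labels from several annotators
        # of unknown reliability; majority voting is the simplest aggregation.
        annotations = {
            "report-1": ["Reliability", "Capability", "Reliability"],
            "report-2": ["Usability", "Usability", "Capability"],
        }
        consensus = {report: Counter(labels).most_common(1)[0][0]
                     for report, labels in annotations.items()}
        print(consensus)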

  9. Automation for Primary Processing of Hardwoods

    Science.gov (United States)

    Daniel L. Schmoldt

    1992-01-01

    Hardwood sawmills critically need to incorporate automation and computer technology into their operations. Social constraints, forest biology constraints, forest product market changes, and financial necessity are forcing primary processors to boost their productivity and efficiency to higher levels. The locations, extent, and types of defects found in logs and on...

  10. ILT based defect simulation of inspection images accurately predicts mask defect printability on wafer

    Science.gov (United States)

    Deep, Prakash; Paninjath, Sankaranarayanan; Pereira, Mark; Buck, Peter

    2016-05-01

    At advanced technology nodes, mask complexity has increased because of the large-scale use of resolution enhancement technologies (RET), which include Optical Proximity Correction (OPC), Inverse Lithography Technology (ILT) and Source Mask Optimization (SMO). The number of defects detected during inspection of such masks has increased drastically, and differentiating critical from non-critical defects is more challenging, complex and time consuming. Because of the significant defectivity of EUVL masks and the non-availability of actinic inspection, it is important, and also challenging, to predict the criticality of defects for printability on wafer. This is one of the significant barriers to the adoption of EUVL for semiconductor manufacturing. Until actinic inspection is available, techniques are desired that can determine defect criticality from non-actinic inspection images. High resolution inspection of photomask images detects many defects, which are used for process and mask qualification. Repairing all defects is not practical and probably not required; however, it is imperative to know before repair which defects are severe enough to impact the wafer. Additionally, a wafer printability check is always desired after repairing a defect. AIMS™ review is the industry standard for this; however, performing AIMS™ review for all defects is expensive and very time consuming. A fast, accurate and economical mechanism is desired that can predict defect printability on wafer accurately and quickly from images captured using a high resolution inspection machine. Predicting defect printability from such images is challenging because the high resolution images do not correlate with actual mask contours. The challenge is increased by the use of optical conditions during inspection that differ from actual scanner conditions, so defects found in such images do not correlate directly with their actual impact on wafer. Our automated defect simulation tool predicts

  11. Histogram deconvolution - An aid to automated classifiers

    Science.gov (United States)

    Lorre, J. J.

    1983-01-01

    It is shown that N-dimensional histograms are convolved by the addition of noise in the picture domain. Three methods are described which provide the ability to deconvolve such noise-affected histograms. The purpose of the deconvolution is to provide automated classifiers with a higher quality N-dimensional histogram from which to obtain classification statistics.
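
    A rough sketch of the idea under stated assumptions (a 1-D histogram, a known Gaussian noise kernel, and Wiener-style regularized deconvolution rather than Lorre's methods): added pixel noise smears the two class modes together, and Fourier-domain deconvolution sharpens them again before classification statistics are taken.

        # Pixel noise convolves the true intensity histogram with the noise
        # density; regularized Fourier division approximately undoes it.
        import numpy as np

        rng = np.random.default_rng(0)
        pixels = np.concatenate([rng.normal(60, 2, 5000), rng.normal(90, 2, 5000)])
        noisy = pixels + rng.normal(0, 6, pixels.size)   # added sensor noise

        bins = np.arange(0, 161)
        h, _ = np.histogram(noisy, bins=bins, density=True)

        # Noise kernel sampled on the same grid, centered for FFT use.
        x = bins[:-1] - 80
        kernel = np.exp(-0.5 * (x / 6.0) ** 2); kernel /= kernel.sum()

        H, K = np.fft.fft(h), np.fft.fft(np.fft.ifftshift(kernel))
        eps = 1e-3                                       # regularizer
        deconv = np.real(np.fft.ifft(H * np.conj(K) / (np.abs(K) ** 2 + eps)))
        print("recovered modes near:", np.argmax(deconv[:75]),
              75 + np.argmax(deconv[75:]))               # should sit near 60, 90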

  12. Tissue Classification

    DEFF Research Database (Denmark)

    Van Leemput, Koen; Puonti, Oula

    2015-01-01

    Computational methods for automatically segmenting magnetic resonance images of the brain have seen tremendous advances in recent years. So-called tissue classification techniques, aimed at extracting the three main brain tissue classes (white matter, gray matter, and cerebrospinal fluid), are now well established. In their simplest form, these methods classify voxels independently based on their intensity alone, although much more sophisticated models are typically used in practice. This article aims to give an overview of often-used computational techniques for brain tissue classification...
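
    A minimal sketch of the "simplest form" described above, with synthetic intensities standing in for a real MR volume: a three-component Gaussian mixture classifies each voxel independently from its intensity alone.

        # Intensity-only voxel classification with a 3-class Gaussian mixture
        # (CSF / gray matter / white matter). Synthetic data for illustration.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(1)
        csf = rng.normal(40, 5, 3000)
        gm = rng.normal(110, 8, 5000)
        wm = rng.normal(160, 7, 4000)
        intensities = np.concatenate([csf, gm, wm]).reshape(-1, 1)

        gmm = GaussianMixture(n_components=3, random_state=0).fit(intensities)
        labels = gmm.predict(intensities)          # one tissue class per voxel
        order = np.argsort(gmm.means_.ravel())     # components as CSF < GM < WM
        print("class means:", gmm.means_.ravel()[order])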

  13. Plant automation

    International Nuclear Information System (INIS)

    Christensen, L.J.; Sackett, J.I.; Dayal, Y.; Wagner, W.K.

    1989-01-01

    This paper describes work at EBR-II in the development and demonstration of new control equipment and methods and associated schemes for plant prognosis, diagnosis, and automation. The development work has attracted the interest of other national laboratories, universities, and commercial companies. New initiatives include use of new control strategies, expert systems, advanced diagnostics, and operator displays. The unique opportunity offered by EBR-II is as a test bed where a total integrated approach to automatic reactor control can be directly tested under real power plant conditions

  14. Research on Defects Inspection of Solder Balls Based on Eddy Current Pulsed Thermography

    Directory of Open Access Journals (Sweden)

    Xiuyun Zhou

    2015-10-01

    Full Text Available In order to address tiny-defect detection for solder balls in high-density flip-chips, this paper presents a feasibility study of detectability and classification based on eddy current pulsed thermography (ECPT). Specifically, a 3D finite element inductive heating model is built for numerical analysis of how different kinds of defects, such as cracks and voids, disturb the temperature field. The temperature variation between defective and non-defective solder balls is monitored for defect identification and classification. Finally, an experimental study is carried out on 1 mm diameter solder balls using ECPT, verifying the efficacy of the technique.

  15. Future Control and Automation : Proceedings of the 2nd International Conference on Future Control and Automation

    CERN Document Server

    2012-01-01

    This volume, Future Control and Automation - Volume 1, includes the best papers selected from the 2012 2nd International Conference on Future Control and Automation (ICFCA 2012), held on July 1-2, 2012 in Changsha, China. Future control and automation is the use of control systems and information technologies to reduce the need for human work in the production of goods and services. This volume can be divided into five sessions on the basis of the classification of the manuscripts considered, listed as follows: Identification and Control, Navigation, Guidance and Sensor, Simulation Technology, Future Telecommunications and Control

  16. Classification of COROT Exoplanet Light Curves

    NARCIS (Netherlands)

    Debosscher, J.; Aerts, C.C.; Vandenbussche, B.

    2006-01-01

    We present a methodology for the automated variability classification of stars based on photometric time series. Our work is done in the framework of the COROT space mission to be launched in 2006, but will also be applicable to data of the future Gaia satellite. We developed routines that are

  17. Improving settlement type classification of aerial images

    CSIR Research Space (South Africa)

    Mdakane, L

    2014-10-01

    Full Text Available ... an automated method can be used to help identify human settlements in a fixed, repeatable and timely manner. The main contribution of this work is to improve generalisation in settlement-type classification of aerial imagery. Images acquired at different dates...

  18. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of different levels of automation are given. One of the examples is an automated production line for ceramic fuel pellets. (M.G.B.)

  19. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  20. Research of the application of the new communication technologies for distribution automation

    Science.gov (United States)

    Zhong, Guoxin; Wang, Hao

    2018-03-01

    A communication network is a key element of distribution automation. In recent years, new communication technologies for distribution automation have developed rapidly in China. This paper introduces the traditional communication technologies used in distribution automation and analyses their defects. It then gives a detailed analysis of some new communication technologies for distribution automation, both wired and wireless, and offers suggestions for applying these new technologies.

  1. Simulation based mask defect repair verification and disposition

    Science.gov (United States)

    Guo, Eric; Zhao, Shirley; Zhang, Skin; Qian, Sandy; Cheng, Guojie; Vikram, Abhishek; Li, Ling; Chen, Ye; Hsiang, Chingyun; Zhang, Gary; Su, Bo

    2009-10-01

    As the industry moves towards sub-65nm technology nodes, mask inspection, with increased sensitivity and shrinking critical defect size, catches more and more nuisance and false defects. Increased defect counts pose great challenges in post-inspection defect classification and disposition: which defects are real, and among the real defects, which should be repaired and how the post-repair defects should be verified. In this paper, we address the challenges in mask defect verification and disposition, in particular in post-repair defect verification, with an efficient methodology using SEM mask defect images and optical inspection mask defect images (the latter only for verification of phase- and transmission-related defects). We demonstrate the flow using programmed mask defects in a sub-65nm technology node design. In total, 20 types of defects were designed, including defects found in typical real circuit environments, with 30 different sizes designed for each type. A SEM image was taken of each programmed defect after the test mask was made. Selected defects were repaired and SEM images from the test mask were taken again. Wafers were printed with the test mask before and after repair as defect printability references. A software tool, SMDD (Simulation-based Mask Defect Disposition), has been used in this study. The software extracts edges from the mask SEM images and converts them into polygons saved in GDSII format. The converted polygons from the SEM images are then filled with the correct tone to form mask patterns and merged back into the original GDSII design file. This merge is for the purpose of contour simulation, since the SEM images normally cover only a small area (~1 μm) while accurate simulation requires including a larger area of optical proximity effect. With a lithography process model, the resist contour of the area of interest (AOI, the area surrounding a mask defect) can be simulated. If such complicated model is not available, a simple
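
    A sketch of the contour-extraction step only, using OpenCV on a synthetic clip; the SMDD tool's actual algorithms and its GDSII writing are not reproduced here (a real flow would hand the polygons to a layout library such as gdstk).

        # Binarize a SEM clip and trace its edges into polygon vertex lists,
        # the form a layout tool could merge back into a design. Illustrative
        # stand-in for the edge-extraction step described above.
        import cv2
        import numpy as np

        sem = np.zeros((128, 128), np.uint8)
        cv2.rectangle(sem, (40, 40), (88, 88), 255, -1)  # stand-in SEM feature

        _, binary = cv2.threshold(sem, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)

        polygons = [c.reshape(-1, 2).tolist() for c in contours]  # (x, y) lists
        print(f"{len(polygons)} polygon(s); first: {polygons[0]}")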

  2. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book covers methods of building an automation plan and designing automation facilities; automation of the machining process, including the basics of cutting, NC processing machines and chip handling; automation units such as drilling, tapping, boring, milling and slide units; hydraulics, covering its characteristics and basic hydraulic circuits; pneumatics; and the kinds and applications of automation in processes such as assembly, transportation, automatic machines and factory automation.

  3. Comparative evaluation of the underlying causes of death processed by the "Automated Classification of Medical Entities" and the "Seleção de Causa Básica" (Underlying Cause Selection) systems

    OpenAIRE

    Santo, Augusto H.; Pinheiro, Celso E.; Rodrigues, Eliana M.

    1998-01-01

    INTRODUCTION: The correct identification of the underlying cause of death and its precise assignment to a code from the International Classification of Diseases are important for achieving accurate and universally comparable mortality statistics. These factors, among others, led to the development of computer software programs to automatically identify the underlying cause of death. OBJECTIVE: This work was conceived to compare the underlying causes of death processed respecti...

  4. Transporter Classification Database (TCDB)

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Transporter Classification Database details a comprehensive classification system for membrane transport proteins known as the Transporter Classification (TC)...

  5. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  6. Establishment and application of medication error classification standards in nursing care based on the International Classification of Patient Safety

    Directory of Open Access Journals (Sweden)

    Xiao-Ping Zhu

    2014-09-01

    Conclusion: Application of this classification system will help nursing administrators to accurately detect system- and process-related defects leading to medication errors, and enable the factors to be targeted to improve the level of patient safety management.

  7. Text mining in the classification of digital documents

    Directory of Open Access Journals (Sweden)

    Marcial Contreras Barrera

    2016-11-01

    Full Text Available Objective: To develop an automated classifier for bibliographic material by means of text mining. Methodology: Text mining is used to develop the classifier, based on a supervised method comprising two phases, learning and recognition. In the learning phase, the classifier learns patterns through the analysis of bibliographic records from class Z (library science, information sciences and information resources) retrieved from the LIBRUNAM database; this phase yields a classifier capable of recognizing the different LC subclasses. In the recognition phase, the classifier is validated and evaluated through classification tests: bibliographic records from class Z are taken at random, classified by a cataloguer and processed by the automated classifier in order to measure its precision. Results: The application of text mining achieved the development of the automated classifier through a supervised document-classification method. The precision of the classifier, calculated by comparing the manually and automatically assigned topics, was 75.70%. Conclusions: The application of text mining facilitated the creation of the automated classifier, yielding useful technology for classifying bibliographic material with the aim of improving and speeding up the process of organizing digital documents.
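
    A compact sketch of the two-phase supervised scheme, with toy records and hypothetical LC subclasses in place of the LIBRUNAM data: learn from catalogued records, then score the classifier against a cataloguer's labels on held-out records.

        # Learning phase: fit TF-IDF + naive Bayes on catalogued records.
        # Recognition phase: compare predictions with a cataloguer's labels.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics import accuracy_score
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        train_text = ["cataloging rules for libraries",
                      "digital information retrieval systems",
                      "library classification schedules",
                      "indexing and information resources"]
        train_class = ["Z695", "ZA3075", "Z696", "ZA3075"]  # hypothetical subclasses

        model = make_pipeline(TfidfVectorizer(),
                              MultinomialNB()).fit(train_text, train_class)

        test_text = ["classification schedules for catalogers"]
        test_class = ["Z696"]                      # cataloguer's label
        print("correct share:",
              accuracy_score(test_class, model.predict(test_text)))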

  8. Featureless classification of light curves

    Science.gov (United States)

    Kügler, S. D.; Gianniotis, N.; Polsterer, K. L.

    2015-08-01

    In the era of rapidly increasing amounts of time series data, classification of variable objects has become the main objective of time-domain astronomy. Classification of irregularly sampled time series is particularly difficult because the data cannot be represented naturally as a vector which can be directly fed into a classifier. In the literature, various statistical features serve as vector representations. In this work, we represent time series by a density model. The density model captures all the information available, including measurement errors. Hence, we view this model as a generalization to the static features which directly can be derived, e.g. as moments from the density. Similarity between each pair of time series is quantified by the distance between their respective models. Classification is performed on the obtained distance matrix. In the numerical experiments, we use data from the OGLE (Optical Gravitational Lensing Experiment) and ASAS (All Sky Automated Survey) surveys and demonstrate that the proposed representation performs up to par with the best currently used feature-based approaches. The density representation preserves all static information present in the observational data, in contrast to a less-complete description by features. The density representation is an upper boundary in terms of information made available to the classifier. Consequently, the predictive power of the proposed classification depends on the choice of similarity measure and classifier, only. Due to its principled nature, we advocate that this new approach of representing time series has potential in tasks beyond classification, e.g. unsupervised learning.
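
    A simplified sketch of the featureless approach (density estimates without measurement errors, Jensen-Shannon as an example similarity measure): each series becomes a density model, and classification runs on the pairwise distance matrix between models.

        # Represent each irregularly sampled series by a KDE over its values,
        # build a pairwise distance matrix, and classify with kNN on it.
        import numpy as np
        from scipy.spatial.distance import jensenshannon
        from scipy.stats import gaussian_kde
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(2)
        curves = [rng.normal(0, 1, rng.integers(40, 80)) for _ in range(20)]
        curves += [rng.normal(0, 1, rng.integers(40, 80)) ** 2 for _ in range(20)]
        labels = np.array([0] * 20 + [1] * 20)

        grid = np.linspace(-5, 10, 200)
        densities = np.array([gaussian_kde(c)(grid) for c in curves])
        densities /= densities.sum(axis=1, keepdims=True)

        # Distance between each pair of density models.
        D = np.array([[jensenshannon(p, q) for q in densities]
                      for p in densities])

        knn = KNeighborsClassifier(n_neighbors=3,
                                   metric="precomputed").fit(D, labels)
        print("training accuracy:", knn.score(D, labels))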

  9. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  10. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems nowadays to be the only solution to cope with the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, and the demand for customized products and services on one side, and the need to achieve constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to improve substantially.

  11. Comparison of Size Modulation Standard Automated Perimetry and Conventional Standard Automated Perimetry with a 10-2 Test Program in Glaucoma Patients.

    Science.gov (United States)

    Hirasawa, Kazunori; Takahashi, Natsumi; Satou, Tsukasa; Kasahara, Masayuki; Matsumura, Kazuhiro; Shoji, Nobuyuki

    2017-08-01

    This prospective observational study compared the performance of size modulation standard automated perimetry with the Octopus 600 10-2 test program, in which the stimulus size is modulated during testing based on stimulus intensity, with that of conventional standard automated perimetry with the Humphrey 10-2 test program in glaucoma patients. Eighty-seven eyes of 87 glaucoma patients underwent size modulation standard automated perimetry with the Dynamic strategy and conventional standard automated perimetry using the SITA standard strategy. The main outcome measures were global indices, point-wise threshold, visual defect size and depth, reliability indices, and test duration; these were compared between size modulation standard automated perimetry and conventional standard automated perimetry. Global indices and point-wise threshold values between size modulation standard automated perimetry and conventional standard automated perimetry were moderately to strongly correlated. The visual-field defect size was smaller on size modulation standard automated perimetry than on conventional standard automated perimetry. The reliability indices, particularly the false-negative response, of size modulation standard automated perimetry were worse than those of conventional standard automated perimetry, and the test duration was longer with size modulation standard automated perimetry than with conventional standard automated perimetry (p = 0.02). Global indices and the point-wise threshold values of the two testing modalities correlated well. However, a large stimulus presented at an area of decreased sensitivity with size modulation standard automated perimetry could underestimate the actual threshold in the 10-2 test protocol, as compared with conventional standard automated perimetry.

  12. Craniotomy Frontal Bone Defect

    African Journals Online (AJOL)

    2018-03-01

    Mar 1, 2018 ... Defect reconstruction and fixation of the graft: the defect ... where all loose fragments of the fractured frontal bone were removed via the ... Graft sources include the mandible, ilium, and allografts ... in pediatric patients owing to skull growth. Thus, autologous ...

  13. Congenital platelet function defects

    Science.gov (United States)

    ... pool disorder; Glanzmann's thrombasthenia; Bernard-Soulier syndrome; Platelet function defects - congenital ... Congenital platelet function defects are bleeding disorders that cause reduced platelet function. Most of the time, people with these disorders have ...

  14. Defect of the Eyelids.

    Science.gov (United States)

    Lu, Guanning Nina; Pelton, Ron W; Humphrey, Clinton D; Kriet, John David

    2017-08-01

    Eyelid defects disrupt the complex natural form and function of the eyelids and present a surgical challenge. Detailed knowledge of eyelid anatomy is essential in evaluating a defect and composing a reconstructive plan. Numerous reconstructive techniques have been described, including primary closure, grafting, and a variety of local flaps. This article describes an updated reconstructive ladder for eyelid defects that can be used in various permutations to solve most eyelid defects. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. AUTOMATING THE DATA SECURITY PROCESS

    Directory of Open Access Journals (Sweden)

    Florin Ogigau-Neamtiu

    2017-11-01

    Full Text Available Contemporary organizations face big data security challenges in the cyber environment due to modern threats and a business working model that relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. Today's data classification and data obfuscation selection processes (encryption, masking or tokenization) suffer because of the human involvement in the process. Organizations need to shrink the data security domain by classifying information based on its importance, conducting risk assessment plans and using the most cost-effective data obfuscation techniques. The paper proposes a new model for data protection that uses automated machine decision-making procedures to classify data and to select the appropriate data obfuscation technique. The proposed system uses natural language processing capabilities to analyze input data and to select the best course of action. The system has capabilities to learn from previous experiences, thus improving itself and reducing the risk of wrong data classification.
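
    A minimal sketch of the decision flow, with a keyword scorer standing in for the paper's NLP and learning components; the terms, sensitivity levels, and technique mapping are invented for illustration.

        # Classify a document's sensitivity from its text, then map the
        # sensitivity level to an obfuscation technique. Illustrative only.
        SENSITIVE_TERMS = {"password": 3, "ssn": 3, "salary": 2, "contract": 1}
        TECHNIQUE_BY_LEVEL = {0: "none", 1: "masking",
                              2: "tokenization", 3: "encryption"}

        def classify_and_protect(text: str) -> str:
            score = max((w for t, w in SENSITIVE_TERMS.items()
                         if t in text.lower()), default=0)
            return TECHNIQUE_BY_LEVEL[score]

        print(classify_and_protect("Employee salary review attached"))  # tokenization
        print(classify_and_protect("Cafeteria menu for next week"))     # none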

  16. Micro-bridge defects: characterization and root cause analysis

    Science.gov (United States)

    Santoro, Gaetano; Van den Heuvel, Dieter; Braggin, Jennifer; Rosslee, Craig; Leray, Philippe J.; Cheng, Shaunee; Jehoul, Christiane; Schreutelkamp, Robert; Hillel, Noam

    2010-03-01

    Defect review of advanced lithography processes is becoming more and more challenging as feature sizes decrease. Previous studies using a defect review SEM on wafers produced by immersion lithography have resulted in a defect classification scheme which, among others, includes a category for micro-bridges. Micro-bridges are small connections between two adjacent lines in photoresist and are considered device-killing defects. Micro-bridge rates also tend to increase as feature sizes decrease, making them even more important for the next technology nodes. Especially because micro-bridge defects can originate from different root causes, it may become necessary to further refine and split the classification of this defect type into subgroups. This paper focuses on correlating the different types of micro-bridge defects to particular root causes, based on a full characterization and root cause analysis of this class of defects, using advanced SEM review capabilities such as high quality imaging at very low FOV, Multi Perspective SEM Imaging (MPSI), tilted column and rotated stage (Tilt&Rotation) imaging, and Focused Ion Beam (FIB) cross sectioning. Immersion lithography material has mainly been used to generate the data set presented in this work, although in the last part of the results some EUV lithography data will be presented as part of the continuing effort to extend micro-bridge defect characterization to EUV technology at the 40 nm technology node and beyond.

  17. Point defects in solids

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    The principal properties of point defects are studied: thermodynamics, electronic structure, interactions with extended defects, and production by irradiation. Some measuring methods are presented: atomic diffusion, spectroscopic methods, diffuse scattering of neutrons and X-rays, positron annihilation, and molecular dynamics. Point defects in various materials are then investigated: ionic crystals, oxides, semiconductor materials, metals, intermetallic compounds, carbides, and nitrides [fr]

  18. Fibrous metaphyseal defects

    International Nuclear Information System (INIS)

    Ritschl, P.; Hajek, P.C.; Pechmann, U.

    1989-01-01

    Sixteen patients with fibrous metaphyseal defects were examined with both plain radiography and magnetic resonance (MR) imaging. Depending on the age of the fibrous metaphyseal defects, characteristic radiomorphologic changes were found which correlated well with MR images. Following intravenous Gadolinium-DTPA injection, fibrous metaphyseal defects invariably exhibited a hyperintense border and signal enhancement. (orig./GDG)

  19. Birth Defects (For Parents)

    Science.gov (United States)

    What Are Birth Defects? While still in the womb, some babies ...

  20. Evaluation of Advanced Signal Processing Techniques to Improve Detection and Identification of Embedded Defects

    Energy Technology Data Exchange (ETDEWEB)

    Clayton, Dwight A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Santos-Villalobos, Hector J. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Baba, Justin S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-01

    ... or an improvement in contrast over conventional SAFT-reconstructed images. This report documents our efforts on four fronts: 1) a comparative study between traditional SAFT and FBD SAFT for concrete specimens with and without alkali-silica reaction (ASR) damage; 2) improvement of our Model-Based Iterative Reconstruction (MBIR) for thick reinforced concrete [5]; 3) development of a universal framework for sharing, reconstruction, and visualization of ultrasound NDE datasets; and 4) application of machine learning techniques for automated detection of ASR inside concrete. Our comparative study between FBD and traditional SAFT reconstruction images shows a clear difference between images of ASR and non-ASR specimens. In particular, the left first harmonic shows increased contrast and sensitivity to ASR damage. For MBIR, we show the superiority of model-based techniques over delay-and-sum techniques such as SAFT. Improvements include the elimination of artifacts caused by direct-arrival signals, and increased contrast and signal-to-noise ratio. For the universal framework, we document a data storage format based on the HDF5 file format, and propose a modular graphical user interface (GUI) for easy customization of data conversion, reconstruction, and visualization routines. Finally, two techniques for automated ASR detection are presented. The first is based on an analysis of the frequency content using a Hilbert Transform Indicator (HTI); the second employs artificial neural network (ANN) techniques for training and classification of ultrasound data into ASR-damaged and non-ASR-damaged classes. The ANN technique shows great potential, with classification accuracy above 95%. These approaches are extensible to the detection of additional defects and damage in reinforced, thick concrete.

  1. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  2. Dirichlet topological defects

    International Nuclear Information System (INIS)

    Carroll, S.M.; Trodden, M.

    1998-01-01

    We propose a class of field theories featuring solitonic solutions in which topological defects can end when they intersect other defects of equal or higher dimensionality. Such configurations may be termed "Dirichlet topological defects," in analogy with the D-branes of string theory. Our discussion focuses on defects in scalar field theories with either gauge or global symmetries, in 3+1 dimensions; the types of defects considered include walls ending on walls, strings on walls, and strings on strings. copyright 1998 The American Physical Society

  3. Synthetic Defects for Vibrothermography

    Science.gov (United States)

    Renshaw, Jeremy; Holland, Stephen D.; Thompson, R. Bruce; Eisenmann, David J.

    2010-02-01

    Synthetic defects are an important tool used for characterizing the performance of nondestructive evaluation techniques. Viscous material-filled synthetic defects were developed for use in vibrothermography (also known as sonic IR) as a tool to improve inspection accuracy and reliability. This paper describes how the heat-generation response of these VMF synthetic defects is similar to the response of real defects. It also shows how VMF defects can be applied to improve inspection accuracy for complex industrial parts and presents a study of their application in an aircraft engine stator vane.

  4. Defect production in ceramics

    Energy Technology Data Exchange (ETDEWEB)

    Zinkle, S.J. [Oak Ridge National Lab., TN (United States); Kinoshita, C. [Kyushu Univ. (Japan)

    1997-08-01

    A review is given of several important defect production and accumulation parameters for irradiated ceramics. Materials covered in this review include alumina, magnesia, spinel, silicon carbide, silicon nitride, aluminum nitride and diamond. Whereas threshold displacement energies for many ceramics are known within a reasonable level of uncertainty (with notable exceptions being AlN and Si₃N₄), relatively little information exists on the equally important parameters of surviving defect fraction (defect production efficiency) and point defect migration energies for most ceramics. Very little fundamental displacement damage information is available for nitride ceramics. The role of subthreshold irradiation on defect migration and microstructural evolution is also briefly discussed.

  5. On holographic defect entropy

    International Nuclear Information System (INIS)

    Estes, John; Jensen, Kristan; O’Bannon, Andy; Tsatis, Efstratios; Wrase, Timm

    2014-01-01

    We study a number of (3+1)- and (2+1)-dimensional defect and boundary conformal field theories holographically dual to supergravity theories. In all cases the defects or boundaries are planar, and the defects are codimension-one. Using holography, we compute the entanglement entropy of a (hemi-)spherical region centered on the defect (boundary). We define defect and boundary entropies from the entanglement entropy by an appropriate background subtraction. For some (3+1)-dimensional theories we find evidence that the defect/boundary entropy changes monotonically under certain renormalization group flows triggered by operators localized at the defect or boundary. This provides evidence that the g-theorem of (1+1)-dimensional field theories generalizes to higher dimensions

  6. Extracting and identifying concrete structural defects in GPR images

    Science.gov (United States)

    Ye, Qiling; Jiao, Liangbao; Liu, Chuanxin; Cao, Xuehong; Huston, Dryver; Xia, Tian

    2018-03-01

    Traditionally, most GPR data interpretation is performed manually. With the advancement of computing technologies, automating GPR data interpretation to achieve high efficiency and accuracy has become an active research subject. In this paper, analytical characterizations of major defects in concrete structures, including delamination, air voids and moisture, in GPR images are performed. In the study, the image features of different defects are compared, and algorithms are developed for defect feature extraction and identification. For validation, both simulation results and field test data are utilized.

  7. Genital and Urinary Tract Defects

    Science.gov (United States)

    What problems can genital and urinary tract defects cause? Genital and urinary tract defects affect ...

  8. Learning features for tissue classification with the classification restricted Boltzmann machine

    DEFF Research Database (Denmark)

    van Tulder, Gijs; de Bruijne, Marleen

    2014-01-01

    Performance of automated tissue classification in medical imaging depends on the choice of descriptive features. In this paper, we show how restricted Boltzmann machines (RBMs) can be used to learn features that are especially suited for texture-based tissue classification. We introduce the convolutional classification RBM ... The learned features outperform conventional RBM-based feature learning, which is unsupervised and uses only a generative learning objective, as well as often-used filter banks. We show that a mixture of generative and discriminative learning can produce filters that give a higher classification accuracy.
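
    A rough analogue using scikit-learn's generative BernoulliRBM followed by a discriminative classifier; the paper's classification RBM combines both objectives in one model, which this sketch does not reproduce.

        # Learn RBM features from image patches, then train a classifier on
        # them. BernoulliRBM is generative only; the classification RBM in
        # the paper adds a discriminative objective on top.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import BernoulliRBM
        from sklearn.pipeline import Pipeline

        rng = np.random.default_rng(5)
        patches = rng.random((200, 64))        # 200 8x8 texture patches in [0, 1]
        labels = (patches.mean(axis=1) > 0.5).astype(int)

        model = Pipeline([
            ("rbm", BernoulliRBM(n_components=32, learning_rate=0.05,
                                 n_iter=20, random_state=0)),
            ("clf", LogisticRegression(max_iter=1000)),
        ]).fit(patches, labels)
        print("training accuracy:", model.score(patches, labels))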

  9. Deep convolutional neural networks for detection of rail surface defects

    NARCIS (Netherlands)

    Faghih Roohi, S.; Hajizadeh, S.; Nunez Vicencio, Alfredo; Babuska, R.; De Schutter, B.H.K.; Estevez, Pablo A.; Angelov, Plamen P.; Del Moral Hernandez, Emilio

    2016-01-01

    In this paper, we propose a deep convolutional neural network solution to the analysis of image data for the detection of rail surface defects. The images are obtained from many hours of automated video recordings. This huge amount of data makes it impossible to manually inspect the images and
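
    A minimal convolutional classifier of the kind described, in PyTorch; the architecture, patch size, and class count are illustrative, not the authors' network.

        # Image patches of rail surface in, defect / no-defect logits out.
        import torch
        import torch.nn as nn

        class RailDefectNet(nn.Module):
            def __init__(self, n_classes: int = 2):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.head = nn.Linear(32 * 16 * 16, n_classes)  # 64x64 inputs

            def forward(self, x):
                return self.head(self.features(x).flatten(1))

        batch = torch.randn(8, 1, 64, 64)      # 8 grayscale patches
        logits = RailDefectNet()(batch)
        print(logits.shape)                    # torch.Size([8, 2])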

  10. Automated vehicle for railway track fault detection

    Science.gov (United States)

    Bhushan, M.; Sujay, S.; Tushar, B.; Chitra, P.

    2017-11-01

    For safety reasons, railroad tracks need to be inspected on a regular basis to detect physical defects or design non-compliances. Such track defects and non-compliances, if not detected within a certain interval of time, may eventually lead to severe consequences such as train derailments. With hundreds of thousands of miles of railroad track, inspection must happen twice weekly by a human inspector to maintain safety standards. Such manual inspection has many drawbacks that may result in poor inspection of the track, which may cause accidents in the future. To avoid such errors and severe accidents, this automated system is designed. Such a concept would introduce automation into the inspection of railway track and can help avoid mishaps and severe accidents due to faults in the track.

  11. Automated reticle inspection data analysis for wafer fabs

    Science.gov (United States)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-04-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Oftentimes, process engineers and other team members need to manually investigate several reticle inspection reports to determine if yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur, resulting in an incorrect root cause analysis effort. These delays waste valuable resources that could be spent on other more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect® data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.
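
    A sketch of the defect-map stacking idea with an invented report structure (the real KLA-Tencor formats are not reproduced here): accumulate per-layer defect coordinates on one grid so repeating signatures stand out.

        # Stack defect coordinates from several inspection reports onto one
        # grid; bins hit on multiple layers suggest a repeater signature.
        import numpy as np

        inspections = {                        # layer -> list of (x, y) in mm
            "layer_M1": [(10.2, 4.1), (10.3, 4.0), (55.0, 30.5)],
            "layer_M2": [(10.1, 4.2), (23.7, 18.9)],
        }

        grid = np.zeros((100, 150))            # 1 mm bins over the reticle
        for defects in inspections.values():
            for x, y in defects:
                grid[int(y), int(x)] += 1

        ys, xs = np.nonzero(grid >= 2)         # hit on multiple layers
        print("possible repeater signatures at (x, y):", list(zip(xs, ys)))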

  12. CLASSIFICATION OF LEARNING MANAGEMENT SYSTEMS

    Directory of Open Access Journals (Sweden)

    Yu. B. Popova

    2016-01-01

    Full Text Available The use of information technologies and, in particular, learning management systems increases the opportunities of teachers and students to reach their goals in education. Such systems provide learning content, help organize and monitor training, collect progress statistics and take into account the individual characteristics of each user. Currently, there is a huge inventory of both paid and free systems, physically located both on college servers and in the cloud, offering different feature sets under different licensing schemes and costs. This creates the problem of choosing the best system, a problem partly due to the lack of a comprehensive classification of such systems. Analysis of more than 30 of the most common automated learning management systems has shown that such systems should be classified according to certain criteria under which systems of the same type can be considered. The classification features offered by the author are: cost, functionality, modularity, meeting the customer's requirements, integration of content, physical location of the system, and adaptability of training. Considering learning management systems within these classifications and taking into account the current trends in their development, it is possible to identify the main requirements for them: functionality, reliability, ease of use, low cost, support for the SCORM standard or Tin Can API, modularity and adaptability. Following these requirements, at the Software Department of FITR BNTU, under the guidance of the author, a learning management system has been developed, used and continuously improved since 2009.

  13. BENCHMARKING MACHINE LEARNING TECHNIQUES FOR SOFTWARE DEFECT DETECTION

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning approaches are good at solving problems for which limited information is available. In most cases, software domain problems are characterized as a learning process that depends on various circumstances and changes accordingly. A predictive model is constructed using machine learning approaches, and modules are classified into defective and non-defective ones. Machine learning techniques help developers retrieve useful information after the classification and enable them to analyse data...
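
    A small benchmark harness in the same spirit, using synthetic module metrics in place of real defect datasets; the learners and scoring metric are illustrative choices.

        # Several learners classify software modules as defective or not from
        # static metrics; F1 suits the typical class imbalance of defect data.
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=400, n_features=20,
                                   weights=[0.8], random_state=0)  # imbalanced

        for name, clf in [("naive bayes", GaussianNB()),
                          ("svm", SVC()),
                          ("random forest", RandomForestClassifier(random_state=0))]:
            scores = cross_val_score(clf, X, y, cv=5, scoring="f1")
            print(f"{name:>13}: F1 = {scores.mean():.2f}")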

  14. Pipeline defect prediction using long range ultrasonic testing and intelligent processing

    International Nuclear Information System (INIS)

    Dino Isa; Rajprasad Rajkumar

    2009-01-01

    This paper deals with efforts to improve nondestructive testing (NDT) techniques by using artificial intelligence to detect and predict pipeline defects such as cracks and wall thinning. The main emphasis here is on the prediction of corrosion-type defects rather than just detection after the fact. Long range ultrasonic testing is employed, where a ring of piezoelectric transducers is used to generate torsional guided waves. Various defects, such as cracks as well as corrosion under insulation (CUI), are simulated on a test pipe. The machine learning algorithm known as the Support Vector Machine (SVM) is used to predict and classify transducer signals using regression and large-margin classification. Regression results show that the SVM is able to accurately predict future defects based on the trends of previous defects. The classification performance was also exceptional, showing an ability to detect defects at different depths as well as to distinguish closely spaced defects. (author)
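
    A sketch of the regression idea under simple assumptions (synthetic linear-growth wall-loss data, and a linear SVR so the fitted trend extrapolates): fit past measurements of a defect, then predict its future severity.

        # Fit an SVM regressor to past wall-loss measurements of a pipeline
        # defect and extrapolate its growth. Data and kernel are illustrative.
        import numpy as np
        from sklearn.svm import SVR

        months = np.arange(0, 24).reshape(-1, 1)
        wall_loss = (0.8 * months.ravel()
                     + np.random.default_rng(3).normal(0, 0.5, 24))

        svr = SVR(kernel="linear", C=100).fit(months, wall_loss)
        future = np.arange(24, 30).reshape(-1, 1)
        print("predicted wall loss (months 24-29):", svr.predict(future).round(1))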

  15. INTEGRATION OF INFORMATIONAL COMPUTER TECHNOLOGIES SMK: AUTOMATION OF THE MAIN FUNCTIONS OF THE TECHNICAL CONTROL DEPARTMENT

    Directory of Open Access Journals (Sweden)

    S. A. Pavlenko

    2010-01-01

    Full Text Available It is shown that automation of some functions of the technical control department allows defects, complaints and process failures to be recorded, and the necessary reporting forms and quality certificates for production to be generated.

  16. Defects in semiconductors

    CERN Document Server

    Romano, Lucia; Jagadish, Chennupati

    2015-01-01

    This volume, number 91 in the Semiconductors and Semimetals series, focuses on defects in semiconductors. Defects in semiconductors help to explain several phenomena, from diffusion to gettering, and to draw theories on materials' behavior in response to electrical or mechanical fields. The volume includes chapters focusing specifically on electron and proton irradiation of silicon, point defects in zinc oxide and gallium nitride, ion implantation defects and shallow junctions in silicon and germanium, and much more. It will help support students and scientists in their experimental and theoret

  17. Classification in context

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper surveys classification research literature, discusses various classification theories, and shows that the focus has traditionally been on establishing a scientific foundation for classification research. This paper argues that a shift has taken place, and suggests that contemporary classification research focus on contextual information as the guide for the design and construction of classification schemes.

  18. Classification of the web

    DEFF Research Database (Denmark)

    Mai, Jens Erik

    2004-01-01

    This paper discusses the challenges faced by investigations into the classification of the Web and outlines inquiries that are needed to use principles for bibliographic classification to construct classifications of the Web. This paper suggests that the classification of the Web meets challenges that call for inquiries into the theoretical foundation of bibliographic classification theory.

  19. Genome-Wide Comparative Gene Family Classification

    Science.gov (United States)

    Frech, Christian; Chen, Nansheng

    2010-01-01

    Correct classification of genes into gene families is important for understanding gene function and evolution. Although gene families of many species have been resolved both computationally and experimentally with high accuracy, gene family classification in most newly sequenced genomes has not been done with the same high standard. This project has been designed to develop a strategy to effectively and accurately classify gene families across genomes. We first examine and compare the performance of computer programs developed for automated gene family classification. We demonstrate that some programs, including the hierarchical average-linkage clustering algorithm MC-UPGMA and the popular Markov clustering algorithm TRIBE-MCL, can reconstruct manual curation of gene families accurately. However, their performance is highly sensitive to parameter setting, i.e. different gene families require different program parameters for correct resolution. To circumvent the problem of parameterization, we have developed a comparative strategy for gene family classification. This strategy takes advantage of existing curated gene families of reference species to find suitable parameters for classifying genes in related genomes. To demonstrate the effectiveness of this novel strategy, we use TRIBE-MCL to classify chemosensory and ABC transporter gene families in C. elegans and its four sister species. We conclude that fully automated programs can establish biologically accurate gene families if parameterized accordingly. Comparative gene family classification finds optimal parameters automatically, thus allowing rapid insights into gene families of newly sequenced species. PMID:20976221

  20. Study on on-machine defects measuring system on high power laser optical elements

    Science.gov (United States)

    Luo, Chi; Shi, Feng; Lin, Zhifan; Zhang, Tong; Wang, Guilin

    2017-10-01

    Surface defects on high power laser optical elements harm the performance of the imaging system, including through energy consumption and damage to the film layer. To improve the measurement of surface defects on high power laser optical elements, an on-machine defect measuring system was investigated. First, the selection and design of the system were completed based on an analysis of its working conditions. Processing algorithms were then designed to recognize, classify and evaluate surface defects. A calibration experiment for scratches was performed using a self-made standard alignment plate. Finally, detection and evaluation of the surface defects of a large-diameter semi-cylindrical silicon mirror were realized. The calibration results show that the size deviation is less than 4%, which meets the precision requirement for defect detection. Through image-based detection, the on-machine defect detection system achieves accurate identification of surface defects.

  1. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  2. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  3. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  4. Metallography of defects

    International Nuclear Information System (INIS)

    Borisova, E.A.; Bochvar, G.A.; Brun, M.Ya.

    1980-01-01

    Different types of defects of metallurgical, technological and service origin in intermediate and final products of titanium alloys are considered. Examples are given of metallic and nonmetallic inclusions, chemical inhomogeneity, grain variations, banding, cracks, burnt areas and porosity, and methods of detecting these defects are described. Metallography, X-ray spectral analysis and microhardness measurement are used.

  5. Beating Birth Defects

    Centers for Disease Control (CDC) Podcasts

    Each year in the U.S., one in 33 babies is affected by a major birth defect. Women can greatly improve their chances of giving birth to a healthy baby by avoiding some of the risk factors for birth defects before and during pregnancy. In this podcast, Dr. Stuart Shapira discusses ways to improve the chances of giving birth to a healthy baby.

  6. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  7. The Business Case for Automated Software Engineering

    Science.gov (United States)

    Menzies, Tim; Elrawas, Oussama; Hihn, Jairus M.; Feather, Martin S.; Madachy, Ray; Boehm, Barry

    2007-01-01

    Adoption of advanced automated SE (ASE) tools would be more favored if a business case could be made that these tools are more valuable than alternative methods. In theory, software prediction models can be used to make that case. In practice, this is complicated by the 'local tuning' problem: normally, predictors for software effort, defects and threats use local data to tune their predictions, and such local tuning data is often unavailable. This paper shows that assessing the relative merits of different SE methods need not require precise local tunings. STAR1 is a simulated annealer plus a Bayesian post-processor that explores the space of possible local tunings within software prediction models. STAR1 ranks project decisions by their effects on effort, defects and threats. In experiments with NASA systems, STAR1 found one project where ASE tools were essential for minimizing effort, defects and threats, and another project where ASE tools were merely optional.

  8. Defects at oxide surfaces

    CERN Document Server

    Thornton, Geoff

    2015-01-01

    This book presents the basics and characterization of defects at oxide surfaces. It provides a state-of-the-art review of the field, containing information on the various types of surface defects, and describes analytical methods for studying defects, their chemical activity and the catalytic reactivity of oxides. Numerical simulations of defective structures complete the picture developed. Defects on planar surfaces form the focus of much of the book, although the investigation of powder samples also forms an important part. The experimental study of planar surfaces opens the possibility of applying the large armoury of techniques that have been developed over the last half-century to study surfaces in ultra-high vacuum. This enables the acquisition of atomic level data under well-controlled conditions, providing a stringent test of theoretical methods. The latter can then be more reliably applied to systems such as nanoparticles for which accurate methods of characterization of structure and electronic properties ha

  9. Defects in dilute nitrides

    International Nuclear Information System (INIS)

    Chen, W.M.; Buyanova, I.A.; Tu, C.W.; Yonezu, H.

    2005-01-01

    We provide a brief review of our recent results from optically detected magnetic resonance studies of grown-in non-radiative defects in dilute nitrides, i.e. Ga(In)NAs and Ga(Al,In)NP. Defect complexes involving intrinsic defects such as As_Ga antisites and Ga_i self-interstitials were positively identified. Effects of growth conditions, chemical compositions and post-growth treatments on the formation of the defects are closely examined. These grown-in defects are shown to play an important role in non-radiative carrier recombination and thus in degrading the optical quality of the alloys, harmful to the performance of potential optoelectronic and photonic devices based on these dilute nitrides. (author)

  10. Classificação anatômica e correção cirúrgica da atresia pulmonar com comunicação interventricular [Anatomical classification and surgical repair of pulmonary atresia with ventricular septal defect]

    Directory of Open Access Journals (Sweden)

    Ulisses Alexandre CROTI

    2001-12-01

    Full Text Available OBJECTIVE: To analyze the anatomical characteristics, the results of the surgical techniques employed according to the number of procedures, and the mortality in each group of Barbero-Marcial's classification of pulmonary atresia with ventricular septal defect. MATERIAL AND METHODS: From January 1990 to November 1999, 73 patients submitted to cardiac catheterization and detailed pulmonary angiographic study before the first surgical intervention were analyzed. The anatomical characteristics of the pulmonary arteries and systemic-pulmonary collateral arteries were studied, as were the surgical techniques providing palliative, "definitive palliative" and definitive treatment. The causes of mortality were also described. RESULTS: Nineteen patients had the pulmonary segments supplied by pulmonary arteries (group A), 45 by pulmonary arteries and systemic-pulmonary collateral arteries (group B), and 9 only by systemic-pulmonary collateral arteries (group C). Group A had the highest proportion of definitive treatments, group B of palliative treatments, and group C of "definitive palliative" treatments (p

  11. Defect creation in solids by a decay of electronic excitations

    International Nuclear Information System (INIS)

    Klinger, M.I.; Lushchik, Ch.B.; Mashovets, T.V.; Kholodar', G.A.; Shejnkman, M.K.; Ehlango, M.A.; Kievskij Gosudarstvennyj Univ.; AN Ukrainskoj SSR, Kiev. Inst. Poluprovodnikov)

    1985-01-01

    A new type of radiationless transition in nonmetallic solids is discussed, accompanied neither by the release of heat nor by luminescence, but by displacements of a small number of atoms that are large in comparison with the interatomic distance. A classification is given of the instabilities (electrostatic, electron-vibrational, structural) leading to the creation of defects in crystalline and glassy solids. The processes of defect creation due both to the decay of self-trapped excitons in ionic crystals and to the multiple ionization of atoms near pre-existing charged centres in semiconductors are described. The mechanisms of the reconstruction of complex defects in semiconductors by nonequilibrium charge carriers and by electron-hole recombination are discussed. The role of charge carriers in thermal defect generation is considered. A mechanism of the peculiar defect creation in glassy semiconductors is discussed

  12. Automated evaluation of ultrasonic indications. State of the art -development trends. Pt. 1

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method with its large field of applications and a high potential for automation provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.) [de

  13. Oil defect detection of electrowetting display

    Science.gov (United States)

    Chiang, Hou-Chi; Tsai, Yu-Hsiang; Yan, Yung-Jhe; Huang, Ting-Wei; Mang, Ou-Yang

    2015-08-01

    In recent years, transparent displays have been an emerging topic in display technologies, with applications in many fields such as mobile devices and shopping or advertising windows. The electrowetting display (EWD) is one kind of potential transparent display technology, with the advantages of high transmittance, fast response time, high contrast and rich color with a pigment-based oil system. In the mass production of electrowetting displays, oil defects should be found by an automated optical inspection (AOI) detection system; this is useful for determining panel defects for quality control. Based on the research of our group, we propose a mechanism for an AOI detection system that detects different kinds of oil defects, caused by oil overflow or deteriorated material after oil coating or driving. We tested our mechanism on a 6-inch electrowetting display panel from ITRI, using an Epson V750 scanner with 1200 dpi resolution. Two AOI algorithms were developed: a high speed method and a high precision method. The high precision method successfully detects oil jumping and non-recovered oil. This AOI detection mechanism can be used to evaluate oil uniformity in the EWD panel process. In the future, our AOI detection system can be used in the quality control of panel manufacturing for mass production.
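
    A minimal AOI-style pass, illustrative only: subtract a reference image, threshold the residual, and report connected blobs as oil-defect candidates; the paper's two algorithms are not reproduced here.

        # Reference subtraction plus connected-component labeling as a toy
        # stand-in for an AOI oil-defect detector.
        import cv2
        import numpy as np

        reference = np.full((200, 200), 120, np.uint8)   # ideal panel image
        panel = reference.copy()
        cv2.circle(panel, (60, 80), 6, 30, -1)           # dark blob: oil defect

        residual = cv2.absdiff(panel, reference)
        _, mask = cv2.threshold(residual, 25, 255, cv2.THRESH_BINARY)
        n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)

        for i in range(1, n):                            # label 0 is background
            print(f"defect at {centroids[i].round(1)}, "
                  f"area {stats[i, cv2.CC_STAT_AREA]} px")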

  14. Inspection of surface defects for cladding tube with laser

    International Nuclear Information System (INIS)

    Senoo, Shigeo; Igarashi, Miyuki; Satoh, Masakazu; Miura, Makoto

    1978-01-01

    This paper presents the results of an experiment on mechanizing the visual inspection of cladding tube surface defects and improving the reliability of surface defect inspection. A laser spot inspection method was adopted for this purpose. Since the laser speckle pattern carries much information about the surface condition, the method can be utilized as an effective means for the detection or classification of surface defects. The laser beam is focused on the cladding tube surface, and the reflected beam forms typical stellar speckle patterns on a screen. Sample cladding tubes are driven in the longitudinal direction, and a photodetector is placed at a position where the secondary reflection falls on it. The laser beam reflected from defect-free surfaces shows a uniform distribution on the detector; when the focused incident beam is directed at a defect, the intensity of the reflected light is reduced. In the second method, the laser beam is scanned by a rotating cube mirror. As a result of the experiment, the typical patterns caused by defects were observed, and it is clear that the reflection patterns change with the kind of defect. The sensitivity of defect detection decreases with increasing laser beam diameter. Surface defect detection by intensity change was also tested. (Kato, T.)
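
    As described above, a defect shows up as a drop in reflected intensity at the photodetector while the tube is driven past the laser spot. The sketch below flags such dips in a sampled intensity trace; the moving-average baseline, the window size and the drop threshold are illustrative assumptions, not values from the paper.

        import numpy as np

        def flag_defects(intensity, window=101, drop_fraction=0.3):
            """Return sample indices where intensity falls well below the local baseline."""
            # Moving-average baseline of the (mostly defect-free) surface reflection.
            kernel = np.ones(window) / window
            baseline = np.convolve(intensity, kernel, mode="same")
            # A sample is flagged when it drops more than drop_fraction below the baseline.
            return np.where(intensity < (1.0 - drop_fraction) * baseline)[0]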

  15. Application of elastic net and infrared spectroscopy in the discrimination between defective and non-defective roasted coffees.

    Science.gov (United States)

    Craig, Ana Paula; Franca, Adriana S; Oliveira, Leandro S; Irudayaraj, Joseph; Ileleji, Klein

    2014-10-01

    The quality of the coffee beverage is negatively affected by the presence of defective coffee beans, and its evaluation still relies on highly subjective sensory panels. To tackle the problem of subjectivity, sophisticated analytical techniques have been developed and shown capable of discriminating defective from non-defective coffees after roasting. However, these techniques are not adequate for routine analysis, as they are laborious (sample preparation) and time-consuming; reliable, simpler and faster techniques need to be developed for this purpose. Thus, the aim of this study was to evaluate the performance of infrared spectroscopic methods, namely FTIR and NIR, for the discrimination of roasted defective and non-defective coffees, employing a novel statistical approach. The classification models based on Elastic Net exhibited a high percentage of correct classification, and the discriminant infrared spectral variables extracted provided a good interpretation of the models. The discrimination of defective and non-defective beans was associated with the main chemical descriptors of coffee, such as carbohydrates, proteins/amino acids, lipids, caffeine and chlorogenic acids. Copyright © 2014 Elsevier B.V. All rights reserved.
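
    The study pairs Elastic Net with FTIR/NIR spectra; a minimal sketch of that style of model is given below using scikit-learn's elastic-net-penalized logistic regression. The spectra files, hyperparameters and preprocessing are placeholders, not the study's actual choices.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Hypothetical files: rows = samples, columns = absorbance per wavenumber,
        # and one 0/1 label per sample (1 = defective, 0 = non-defective).
        X = np.load("spectra.npy")
        y = np.load("labels.npy")

        clf = make_pipeline(
            StandardScaler(),
            LogisticRegression(penalty="elasticnet", solver="saga",
                               l1_ratio=0.5, C=1.0, max_iter=10000),
        )
        print(cross_val_score(clf, X, y, cv=5).mean())
        # Nonzero coefficients point to discriminant wavenumbers, i.e. bands the
        # study relates to carbohydrates, lipids, caffeine and chlorogenic acids.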

  16. 7 CFR 51.652 - Classification of defects.

    Science.gov (United States)

    2010-01-01

    ... the equivalent of this amount, by volume, when occurring in other portions of the fruit. Green spots or oil spots More than slightly affecting appearance Aggregating more than a circle 1 inch in... Aggregating more than 25 percent of the surface. Skin breakdown Aggregating more than a circle 3/8 inch in...

  17. 7 CFR 51.713 - Classification of defects.

    Science.gov (United States)

    2010-01-01

    .... Creasing Materially weakens the skin, or extends over more than one-third of the surface Seriously weakens the skin, or extends over more than one-half of the surface Very seriously weakens the skin, or is... spots or oil spots More than slightly affecting appearance Aggregating more than a circle 7/8 inch in...

  18. 7 CFR 51.1837 - Classification of defects.

    Science.gov (United States)

    2010-01-01

    ... the skin, or extends over more than one-third of the surface Seriously weakens the skin, or extends over more than one-half of the surface Very seriously weakens the skin, or is distributed over... of the fruit. Green spots Aggregating more than a circle 1/2 inch (12.7 mm) in diameter Aggregating...

  19. 7 CFR 51.784 - Classification of defects.

    Science.gov (United States)

    2010-01-01

    ... the equivalent of this amount, by volume, when occurring in other portions of the fruit. Green spots... diameter, caused by scale Aggregating more than 1/3 of the surface, caused by scale. Oil spots Aggregating... deep or very rough or unsightly that appearance is very seriously affected. Skin breakdown Aggregating...

  20. 7 CFR 51.2340 - Classification of defects.

    Science.gov (United States)

    2010-01-01

    ... inch (9.5 mm) in diameter which materially detract from the appearance, edible or shipping quality When... which seriously detract from the appearance, edible or shipping quality. Leaf or Limbrubs When not... inch (9.5 mm) in diameter. Insects When feeding injury is evident on fruit or any insect is present in...

  1. 7 CFR 51.3416 - Classification of defects.

    Science.gov (United States)

    2010-01-01

    ... Maximum allowed for U.S. No. 2 processing Occurring outside of or not entirely confined to the vascular ring Internal Black Spot, Internal Discoloration, Vascular Browning, Fusarium Wilt, Net Necrosis, Other Necrosis, Stem End Browning 5% waste 10% waste. Occurring entirely within the vascular ring Hollow Heart or...

  2. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  3. New York State Thruway Authority automatic vehicle classification (AVC) : research report.

    Science.gov (United States)

    2008-03-31

    In December 2007, the N.Y.S. Thruway Authority (Thruway) concluded a Federal funded research effort to study technology and develop a design for retrofitting devices required in implementing a fully automated vehicle classification system i...

  4. Support Vector Machine and Parametric Wavelet-Based Texture Classification of Stem Cell Images

    National Research Council Canada - National Science Library

    Jeffreys, Christopher

    2004-01-01

    .... Since colony texture is a major discriminating feature in determining quality, we introduce a non-invasive, semi-automated texture-based stem cell colony classification methodology to aid researchers...

  5. Defects in Amorphous Semiconductors: The Case of Amorphous Indium Gallium Zinc Oxide

    Science.gov (United States)

    de Jamblinne de Meux, A.; Pourtois, G.; Genoe, J.; Heremans, P.

    2018-05-01

    Based on a rational classification of defects in amorphous materials, we propose a simplified model to describe intrinsic defects and hydrogen impurities in amorphous indium gallium zinc oxide (a-IGZO). The proposed approach consists of organizing defects into two categories: point defects, generating structural anomalies such as metal-metal or oxygen-oxygen bonds, and defects emerging from changes in the material stoichiometry, such as vacancies and interstitial atoms. Based on first-principles simulations, it is argued that the defects originating from the second group always act as perfect donors or perfect acceptors. This classification simplifies and rationalizes the nature of defects in amorphous phases. In a-IGZO, the most important point defects are metal-metal bonds (or small metal clusters) and peroxides (O-O single bonds). Electrons are captured by metal-metal bonds and released by the formation of peroxides. The presence of hydrogen can lead to two additional types of defects: metal-hydrogen defects, acting as acceptors, and oxygen-hydrogen defects, acting as donors. The impact of these defects is linked to different instabilities observed in a-IGZO. Specifically, the diffusion of hydrogen and oxygen is connected to positive- and negative-bias stresses, while negative-bias illumination stress originates from the formation of peroxides.

  6. Formation of topological defects

    International Nuclear Information System (INIS)

    Vachaspati, T.

    1991-01-01

    We consider the formation of point and line topological defects (monopoles and strings) from a general point of view by allowing the probability of formation of a defect to vary. To investigate the statistical properties of the defects at formation we give qualitative arguments that are independent of any particular model in which such defects occur. These arguments are substantiated by numerical results in the case of strings and for monopoles in two dimensions. We find that the network of strings at formation undergoes a transition at a certain critical density below which there are no infinite strings and the closed-string (loop) distribution is exponentially suppressed at large lengths. The results are contrasted with the results of statistical arguments applied to a box of strings in dynamical equilibrium. We argue that if point defects were to form with smaller probability, the distance between monopoles and antimonopoles would decrease while the monopole-to-monopole distance would increase. We find that monopoles are always paired with antimonopoles but the pairing becomes clean only when the number density of defects is small. A similar reasoning would also apply to other defects

  7. A study on limb reduction defects in six European regions

    NARCIS (Netherlands)

    Stoll, C; Calzolari, E; Cornel, M; GarciaMinaur, S; Garne, E; Nevin, N

    1996-01-01

    Limb reduction defects (LRD) gained special attention after the thalidomide tragedy in 1962. LRD are common congenital malformations which present as obvious congenital anomalies recognized at birth. Therefore it might be assumed that they are well documented. However, classification of LRDs is

  8. Defect inspection in hot slab surface: multi-source CCD imaging based fuzzy-rough sets method

    Science.gov (United States)

    Zhao, Liming; Zhang, Yi; Xu, Xiaodong; Xiao, Hong; Huang, Chao

    2016-09-01

    To provide an accurate surface defect inspection method and make automated, robust delineation of image regions of interest (ROI) a reality on the production line, a multi-source CCD imaging based fuzzy-rough sets method is proposed for hot slab surface quality assessment. The applicability of the presented method and the devised system is mainly tied to surface quality inspection for strip, billet, slab surfaces, etc. In this work we take into account the complementary advantages of two common machine vision (MV) systems: line-array CCD traditional scanning imaging (LS-imaging) and area-array CCD laser three-dimensional (3D) scanning imaging (AL-imaging). By establishing a fuzzy-rough sets model in the detection system, the seeds for relative fuzzy connectedness (RFC) delineation of the ROI can be placed adaptively; the model introduces upper and lower approximation sets for ROI definition, by which the boundary region can be delineated through an RFC region-competitive classification mechanism. For the first time, a multi-source CCD imaging based fuzzy-rough sets strategy is attempted for CC-slab surface defect inspection, allowing AI algorithms and powerful ROI delineation strategies to be applied automatically in the MV inspection field.

  9. 21 CFR 864.9175 - Automated blood grouping and antibody test system.

    Science.gov (United States)

    2010-04-01

    ...) Identification. An automated blood grouping and antibody test system is a device used to group erythrocytes (red blood cells) and to detect antibodies to blood group antigens. (b) Classification. Class II (performance...

  10. Evolution of a Benthic Imaging System From a Towed Camera to an Automated Habitat Characterization System

    Science.gov (United States)

    2008-09-01

    ...element in the development of HabCam as a tool for habitat characterization is the automated processing of images for color correction, segmentation of foreground targets from sediment, and classification of targets to taxonomic category...

  11. Formation Energies of Native Point Defects in Strained-Layer Superlattices (Postprint)

    Science.gov (United States)

    2017-06-05

    Report AFRL-RX-WP-JA-2017-0217, "Formation Energies of Native Point Defects in Strained-Layer Superlattices (Postprint)," Zhi-Gang Yu; interim report, 11 September 2013 - 5 November 2016. Subject terms: potential; bulk materials; total energy calculations; entropy; strained-layer superlattice (SLS).

  12. Ventricular Septal Defect (VSD)

    Science.gov (United States)

    ... Call your doctor if your baby or child: Tires easily when eating or playing Is not gaining ... heart procedures. Risk factors Ventricular septal defects may run in families and sometimes may occur with other ...

  13. Birth Defects: Cerebral Palsy

    Science.gov (United States)

    Cerebral palsy (also called CP) is a group of conditions ...

  14. Endocardial cushion defect

    Science.gov (United States)

    ... Philadelphia, PA: Elsevier; 2016:chap 426. Kouchoukos NT, Blackstone EH, Hanley FL, Kirklin JK. Atrioventricular septal defect. In: Kouchoukos NT, Blackstone EH, Hanley FL, Kirklin JK, eds. Kirklin/Barratt- ...

  15. Repairing Nanoparticle Surface Defects

    NARCIS (Netherlands)

    Marino, Emanuele; Kodger, Thomas E.; Crisp, R.W.; Timmerman, Dolf; MacArthur, Katherine E.; Heggen, Marc; Schall, Peter

    2017-01-01

    Solar devices based on semiconductor nanoparticles require the use of conductive ligands; however, replacing the native, insulating ligands with conductive metal chalcogenide complexes introduces structural defects within the crystalline nanostructure that act as traps for charge carriers. We

  16. Point defects in platinum

    International Nuclear Information System (INIS)

    Piercy, G.R.

    1960-01-01

    An investigation was made of the mobility and types of point defect introduced in platinum by deformation in liquid nitrogen, quenching into water from 1600 °C, or reactor irradiation at 50 °C. In all cases the activation energy for motion of the defect was determined from measurements of electrical resistivity. Measurements of density, hardness, and x-ray line broadening were also made where applicable. These experiments indicated that the principal defects remaining in platinum after irradiation were single vacant lattice sites, and after quenching were pairs of vacant lattice sites. Those present after deformation in liquid nitrogen were single vacant lattice sites and another type of defect, perhaps interstitial atoms. (author)

  17. Towards Automatic Classification of Wikipedia Content

    Science.gov (United States)

    Szymański, Julian

    Wikipedia - the Free Encyclopedia encounters the problem of the proper classification of new articles every day. The process of assigning articles to categories is performed manually, and it is a time-consuming task. It requires knowledge of the Wikipedia structure that is beyond typical editor competence, which leads to human-caused mistakes - omitted or wrong assignments of articles to categories. The article presents an application of an SVM classifier for the automatic classification of documents from The Free Encyclopedia. The classifier has been tested with two text representations: inter-document connections (hyperlinks) and word content. The results of the experiments, evaluated on hand-crafted data, show that the Wikipedia classification process can be partially automated. The proposed approach can be used for building a decision support system which suggests to editors the best categories that fit new content entered into Wikipedia.
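
    For the word-content representation mentioned above, a minimal sketch of an SVM text classifier is given below using scikit-learn; the corpus, labels and vectorizer settings are placeholder assumptions, not the article's setup.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        # Hypothetical corpus: plain text of articles and one category per article.
        articles = ["...article plain text...", "...another article..."]
        categories = ["Physics", "History"]

        model = make_pipeline(TfidfVectorizer(stop_words="english"), LinearSVC())
        model.fit(articles, categories)
        print(model.predict(["text of a newly submitted article"]))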

  18. An ordinal classification approach for CTG categorization.

    Science.gov (United States)

    Georgoulas, George; Karvelis, Petros; Gavrilis, Dimitris; Stylios, Chrysostomos D; Nikolakopoulos, George

    2017-07-01

    Evaluation of the cardiotocogram (CTG) is a standard approach employed during pregnancy and delivery, but its interpretation requires high-level expertise to decide whether the recording is Normal, Suspicious or Pathological. Therefore, a number of attempts have been carried out over the past three decades to develop sophisticated automated systems. These systems are usually (multiclass) classification systems that assign a category to the respective CTG. However, most of these systems do not take into consideration the natural ordering of the categories associated with CTG recordings. In this work, an algorithm that explicitly takes into consideration the ordering of CTG categories, based on a binary decomposition method, is investigated. The achieved results, using the C4.5 decision tree as a base classifier, show that the ordinal classification approach is marginally better than the traditional multiclass classification approach, which utilizes the standard C4.5 algorithm, for several performance criteria.
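
    A common way to realize the binary decomposition described above is the Frank-and-Hall-style scheme sketched below, with scikit-learn decision trees standing in for the paper's C4.5 base classifier; the class encoding and probability combination are illustrative, not the paper's exact method.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        def fit_ordinal(X, y, n_classes=3):
            # One binary classifier per threshold: does the sample exceed class k?
            # Classes are ordered 0 = Normal < 1 = Suspicious < 2 = Pathological.
            return [DecisionTreeClassifier().fit(X, (y > k).astype(int))
                    for k in range(n_classes - 1)]

        def predict_ordinal(models, X, n_classes=3):
            # P(y > k) from each binary model, converted to per-class probabilities.
            p_gt = np.column_stack([m.predict_proba(X)[:, 1] for m in models])
            probs = np.empty((X.shape[0], n_classes))
            probs[:, 0] = 1.0 - p_gt[:, 0]
            for k in range(1, n_classes - 1):
                probs[:, k] = p_gt[:, k - 1] - p_gt[:, k]
            probs[:, -1] = p_gt[:, -1]
            return probs.argmax(axis=1)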

  19. A neural network for noise correlation classification

    Science.gov (United States)

    Paitz, Patrick; Gokhberg, Alexey; Fichtner, Andreas

    2018-02-01

    We present an artificial neural network (ANN) for the classification of ambient seismic noise correlations into two categories, suitable and unsuitable for noise tomography. By using only a small manually classified data subset for network training, the ANN allows us to classify large data volumes with low human effort and to encode the valuable subjective experience of data analysts that cannot be captured by a deterministic algorithm. Based on a new feature extraction procedure that exploits the wavelet-like nature of seismic time-series, we efficiently reduce the dimensionality of the noise correlation data while keeping the relevant features needed for automated classification. Using global- and regional-scale data sets, we show that classification errors of 20 per cent or less can be achieved when the network training is performed with as little as 3.5 per cent and 16 per cent of the data sets, respectively. Furthermore, the ANN trained on the regional data can be applied to the global data, and vice versa, without a significant increase of the classification error. An experiment in which four students manually classified the data revealed that the classification error they would assign to each other is substantially larger than the classification error of the ANN (>35 per cent). This indicates that reproducibility would be hampered more by human subjectivity than by imperfections of the ANN.
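
    A minimal sketch of the overall idea, wavelet-based feature reduction followed by a small neural network, is given below using PyWavelets and scikit-learn; the wavelet family, decomposition level and network size are assumptions, not the authors' published configuration.

        import numpy as np
        import pywt
        from sklearn.neural_network import MLPClassifier

        def wavelet_features(trace, wavelet="db4", level=5):
            # Keep only the coarse approximation coefficients: a compact summary
            # of the waveform that preserves its large-scale shape.
            return pywt.wavedec(trace, wavelet, level=level)[0]

        # traces: noise-correlation waveforms; labels: 1 = suitable for tomography.
        # X = np.array([wavelet_features(t) for t in traces])
        # clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, labels)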

  20. Defect detection on videos using neural network

    Directory of Open Access Journals (Sweden)

    Sizyakin Roman

    2017-01-01

    Full Text Available In this paper, we consider a method for defect detection in a video sequence, which consists of three main steps: frame compensation, preprocessing by a detector based on the ranking of pixel values, and the classification of all pixels having anomalous values using convolutional neural networks. The effectiveness of the proposed method is shown in comparison with known techniques on several frames of a video sequence damaged under natural conditions. The analysis of the obtained results indicates the high efficiency of the proposed method. The additional use of machine learning as postprocessing significantly reduces the likelihood of false alarms.
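
    A minimal sketch of the ranking-based detection step is shown below, assuming motion-compensated grayscale frames: a pixel whose value is the temporal extreme across a three-frame window becomes a defect candidate for the subsequent CNN stage. The three-frame window and the extremeness test are illustrative assumptions, not the paper's exact detector.

        import numpy as np

        def rank_detector(prev_frame, frame, next_frame):
            """Return a boolean mask of defect candidates in the middle frame."""
            stack = np.stack([prev_frame, frame, next_frame])  # aligned grayscale frames
            ranks = stack.argsort(axis=0)                      # temporal order per pixel
            # Candidate where the middle frame (index 1) is the darkest or brightest,
            # since a film defect rarely persists across neighbouring frames.
            return (ranks[0] == 1) | (ranks[2] == 1)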

  1. Automated Discovery of Speech Act Categories in Educational Games

    Science.gov (United States)

    Rus, Vasile; Moldovan, Cristian; Niraula, Nobal; Graesser, Arthur C.

    2012-01-01

    In this paper we address the important task of automated discovery of speech act categories in dialogue-based, multi-party educational games. Speech acts are important in dialogue-based educational systems because they help infer the student speaker's intentions (the task of speech act classification) which in turn is crucial to providing adequate…

  2. Automated mapping of building facades by machine learning

    DEFF Research Database (Denmark)

    Höhle, Joachim

    2014-01-01

    Facades of buildings contain various types of objects which have to be recorded for information systems. The article describes a solution for this task focussing on automated classification by means of machine learning techniques. Stereo pairs of oblique images are used to derive 3D point clouds...

  3. Automated detection and categorization of genital injuries using digital colposcopy

    DEFF Research Database (Denmark)

    Fernandes, Kelwin; Cardoso, Jaime S.; Astrup, Birgitte Schmidt

    2017-01-01

    handcrafted features and deep learning techniques in the automated processing of colposcopic images for genital injury detection. Positive results were achieved by both paradigms in the segmentation and classification subtasks, with traditional and deep models being the best strategy for each subtask type...

  4. SAW Classification Algorithm for Chinese Text Classification

    OpenAIRE

    Xiaoli Guo; Huiyu Sun; Tiehua Zhou; Ling Wang; Zhaoyang Qu; Jiannan Zang

    2015-01-01

    Considering the explosive growth of data, the increased amount of text data’s effect on the performance of text categorization forward the need for higher requirements, such that the existing classification method cannot be satisfied. Based on the study of existing text classification technology and semantics, this paper puts forward a kind of Chinese text classification oriented SAW (Structural Auxiliary Word) algorithm. The algorithm uses the special space effect of Chinese text where words...

  5. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems to radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic processes in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation, a semi-automatic pipette and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  6. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Automated cloning methods

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 °C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  8. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  9. Angiographic differentiation of type of ventricular septal defects

    International Nuclear Information System (INIS)

    Cheon, Mal Soon; Park, Hee Young; Kim, Yang Sook

    1989-01-01

    Defects of the ventricular septum are the commonest type of congenital cardiac malformation. A classification of the subtypes of ventricular septal defects based on axial angiography is proposed from the study of 126 patients with defects of the ventricular septum. The results were as follows: 1. The incidence of ventricular septal defects was 39.6% of congenital heart malformations. 2. The sex distribution was 70 males and 56 females; the age ranged from 13 months to 26 years. 3. Angiographic features seen by axial angiography were as follows: a. Perimembranous defects, as seen on the long axial view of the left ventriculogram, were in continuity with the aortic valve. The relation of the defect to the tricuspid valve allows distinction of the extension of the perimembranous defect toward the inlet, trabecular, or infundibular zones. This relation was determined angiographically, using the course of the contrast medium from the left ventricle through the ventricular septal defect, opacifying the right ventricle. In inlet excavation, the shunted blood opacified the recess between the septal leaflet of the tricuspid valve and the interventricular septum in the early phase; in infundibular excavation, it opacified the recess between the anterior leaflet of the tricuspid valve and the anterior free wall of the right ventricle; and in trabecular excavation, the shunted blood traversed the anterior portion of the tricuspid valve ring and opacified the trabecular portion of the right ventricle. b. Muscular defects were separated from the semilunar and atrioventricular valves. c. Subarterial defects were related to both semilunar valves, and they were best demonstrated on the elongated right anterior oblique view of the left ventriculogram. d. Total infundibular defects were profiled in the right anterior oblique 30° and long axial views, subaortic in location in both views

  10. Norwegian Pitched Roof Defects

    Directory of Open Access Journals (Sweden)

    Lars Gullbrekken

    2016-06-01

    Full Text Available The building constructions investigated in this work are pitched wooden roofs with exterior vertical drainpipes and a wooden load-bearing system. The aim of this research is to further investigate the building defects of pitched wooden roofs and obtain an overview of typical roof defects. The work involves an analysis of the building defect archive of the research institute SINTEF Building and Infrastructure. The findings from the SINTEF archive show that moisture is a dominant exposure factor, especially in roof constructions. In pitched wooden roofs, more than half of the defects are caused by deficiencies in design, materials, or workmanship that allow moisture from precipitation or indoor air into the structure. Hence, it is important to increase the focus on robust and durable solutions to avoid defects from both exterior and interior moisture sources in pitched wooden roofs. Proper design of interior ventilation and vapour retarders seems to be the main way to control the entry of interior moisture into attic and roof spaces.

  11. Automated tone grading of granite

    International Nuclear Information System (INIS)

    Catalina Hernández, J.C.; Fernández Ramón, G.

    2017-01-01

    The production of a natural stone processing plant is subject to the intrinsic variability of the stone blocks that constitute its raw material, which may cause a lack of uniformity in the visual appearance of the produced material and often triggers complaints from customers. The best way to tackle this problem is to classify the product according to its visual features, which is traditionally done by hand: an operator observes each and every piece that comes out of the production line and assigns it to the closest match among a number of predefined classes, taking into account visual features of the material such as colour, texture, grain, veins, etc. However, this manual procedure presents significant consistency problems, due to the inherent subjectivity of the classification performed by each operator and the errors caused by their progressive fatigue. Attempts to employ automated sorting systems like the ones used in the ceramic tile industry have not been successful, as natural stone presents much higher variability than ceramic tiles. Therefore, it has been necessary to develop classification systems specifically designed for the treatment of the visual parameters that distinguish the different types of natural stone. This paper describes the details of a computer vision system developed by AITEMIN for the automatic classification of granite pieces according to their tone, which provides an integral solution to tone grading problems in the granite processing and marketing industry. The system has been designed to be easily trained by the end user, through the learning of the samples established as tone patterns by the user. [es

  12. Ichthyoplankton Classification Tool using Generative Adversarial Networks and Transfer Learning

    KAUST Repository

    Aljaafari, Nura

    2018-01-01

    ... This method is time-consuming and requires a high level of experience. The recent advances in AI have helped to solve and automate several difficult tasks, which motivated us to develop a classification tool for ichthyoplankton. We show that using machine

  13. Repairing Nanoparticle Surface Defects.

    Science.gov (United States)

    Marino, Emanuele; Kodger, Thomas E; Crisp, Ryan W; Timmerman, Dolf; MacArthur, Katherine E; Heggen, Marc; Schall, Peter

    2017-10-23

    Solar devices based on semiconductor nanoparticles require the use of conductive ligands; however, replacing the native, insulating ligands with conductive metal chalcogenide complexes introduces structural defects within the crystalline nanostructure that act as traps for charge carriers. We utilized atomically thin semiconductor nanoplatelets as a convenient platform for studying, both microscopically and spectroscopically, the development of defects during ligand exchange with the conductive ligands Na4SnS4 and (NH4)4Sn2S6. These defects can be repaired via mild chemical or thermal routes, through the addition of L-type ligands or wet annealing, respectively. This results in a higher-quality, conductive, colloidally stable nanomaterial that may be used as the active film in optoelectronic devices. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  14. Defect identification using positrons

    International Nuclear Information System (INIS)

    Beling, C.D.; Fung, S.

    2001-01-01

    The current use of the lifetime and Doppler broadening techniques in defect identification is demonstrated with two studies, the first being the identification of the carbon vacancy in n-6H SiC through lifetime spectroscopy, and the second the production of de-hydrogenated voids in α-Si:H through light soaking. Some less conventional ideas are presented for more specific defect identification, namely (i) the amalgamation of lifetime and Doppler techniques with conventional deep level transient spectroscopy in what may be called "positron-deep level transient spectroscopy", and (ii) the extraction of more spatial information on vacancy defects by means of what may be called "Fourier transform Doppler broadening of annihilation radiation spectroscopy" (orig.)

  15. Eisenmenger ventricular septal defect in a Humboldt penguin (Spheniscus humboldti).

    Science.gov (United States)

    Laughlin, D S; Ialeggio, D M; Trupkiewicz, J G; Sleeper, M M

    2016-09-01

    The Eisenmenger ventricular septal defect is an uncommon type of ventricular septal defect characterised in humans by a traditionally perimembranous ventricular septal defect, anterior deviation (cranioventral deviation in small animal patients) of the muscular outlet septum causing malalignment relative to the remainder of the muscular septum, and overriding of the aortic valve. This anomaly is reported infrequently in human patients and was identified in a 45-day-old Humboldt Penguin, Spheniscus humboldti, with signs of poor growth and a cardiac murmur. This case report describes the findings in this penguin and summarises the anatomy and classification of this cardiac anomaly. To the authors' knowledge this is the first report of an Eisenmenger ventricular septal defect in a veterinary patient. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Quantum computing with defects

    Science.gov (United States)

    Varley, Joel

    2011-03-01

    The development of a quantum computer is contingent upon the identification and design of systems for use as qubits, the basic units of quantum information. One of the most promising candidates consists of a defect in diamond known as the nitrogen-vacancy (NV(-1)) center, since it is an individually-addressable quantum system that can be initialized, manipulated, and measured with high fidelity at room temperature. While the success of the NV(-1) stems from its nature as a localized "deep-center" point defect, no systematic effort has been made to identify other defects that might behave in a similar way. We provide guidelines for identifying other defect centers with similar properties. We present a list of physical criteria that these centers and their hosts should meet and explain how these requirements can be used in conjunction with electronic structure theory to intelligently sort through candidate systems. To elucidate these points, we compare electronic structure calculations of the NV(-1) center in diamond with those of several deep centers in 4H silicon carbide (SiC). Using hybrid functionals, we report formation energies, configuration-coordinate diagrams, and defect-level diagrams to compare and contrast the properties of these defects. We find that the NC-VSi(-1) center in SiC, a structural analog of the NV(-1) center in diamond, may be a suitable center with very different optical transition energies. We also discuss how the proposed criteria can be translated into guidelines to discover NV analogs in other tetrahedrally coordinated materials. This work was performed in collaboration with J. R. Weber, W. F. Koehl, B. B. Buckley, A. Janotti, C. G. Van de Walle, and D. D. Awschalom. This work was supported by ARO, AFOSR, and NSF.

  17. Defects in semiconductors

    International Nuclear Information System (INIS)

    Pimentel, C.A.F.

    1983-01-01

    Some open problems in the study of defects in semiconductors are presented. In particular, a review is made of the more important problems in Si monocrystals of basic and technological interest: microdefects and the presence of oxygen and carbon. The techniques usually utilized in the characterization of semiconductor materials are emphasized according to their potentialities. Some applications of x-ray techniques in the characterization of epitaxial layers in heterostructures, important in optoelectronics, are shown. The increase in the efficiency of these defect analysis methods in semiconductor materials with the use of synchrotron x-ray sources is shown. (L.C.) [pt

  18. Lifecycle, Iteration, and Process Automation with SMS Gateway

    Directory of Open Access Journals (Sweden)

    Fenny Fenny

    2015-12-01

    Full Text Available Producing a better quality software system requires an understanding of software quality indicators through defect detection and automated testing. This paper aims to improve the design and automated testing process for an engine water pump in a drinking water plant. It proposes how software developers can improve the maintainability and reliability of an automated testing system and report abnormal states when an error occurs on the machine. The method in this paper uses the literature to explain best practices and case studies of a drinking water plant. Furthermore, this paper is expected to provide insights into efforts to better handle errors and to perform automated testing and monitoring on a machine.

  19. Classification of multiple sclerosis lesions using adaptive dictionary learning.

    Science.gov (United States)

    Deshpande, Hrishikesh; Maurel, Pierre; Barillot, Christian

    2015-12-01

    This paper presents a sparse representation and an adaptive dictionary learning based method for automated classification of multiple sclerosis (MS) lesions in magnetic resonance (MR) images. Manual delineation of MS lesions is a time-consuming task, requiring neuroradiology experts to analyze huge volume of MR data. This, in addition to the high intra- and inter-observer variability necessitates the requirement of automated MS lesion classification methods. Among many image representation models and classification methods that can be used for such purpose, we investigate the use of sparse modeling. In the recent years, sparse representation has evolved as a tool in modeling data using a few basis elements of an over-complete dictionary and has found applications in many image processing tasks including classification. We propose a supervised classification approach by learning dictionaries specific to the lesions and individual healthy brain tissues, which include white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF). The size of the dictionaries learned for each class plays a major role in data representation but it is an even more crucial element in the case of competitive classification. Our approach adapts the size of the dictionary for each class, depending on the complexity of the underlying data. The algorithm is validated using 52 multi-sequence MR images acquired from 13 MS patients. The results demonstrate the effectiveness of our approach in MS lesion classification. Copyright © 2015 Elsevier Ltd. All rights reserved.
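
    The classification rule implied above, assigning a patch to the class whose dictionary reconstructs it with the smallest residual, can be sketched as follows with scikit-learn's dictionary learner; the dictionary size and the OMP transform are placeholder choices, whereas the paper adapts the dictionary size per class.

        import numpy as np
        from sklearn.decomposition import MiniBatchDictionaryLearning

        def train_dictionaries(patches_by_class, n_atoms=64):
            # One dictionary per class (lesion, WM, GM, CSF); P is (n_patches, n_features).
            return {c: MiniBatchDictionaryLearning(n_components=n_atoms,
                                                   transform_algorithm="omp").fit(P)
                    for c, P in patches_by_class.items()}

        def classify(patch, dicts):
            # Assign the patch to the class with the smallest reconstruction residual.
            residuals = {}
            for c, d in dicts.items():
                code = d.transform(patch.reshape(1, -1))
                recon = code @ d.components_
                residuals[c] = np.linalg.norm(patch - recon.ravel())
            return min(residuals, key=residuals.get)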

  20. Automated System Marketplace 1994.

    Science.gov (United States)

    Griffiths, Jose-Marie; Kertis, Kimberly

    1994-01-01

    Reports results of the 1994 Automated System Marketplace survey based on responses from 60 vendors. Highlights include changes in the library automation marketplace; estimated library systems revenues; minicomputer and microcomputer-based systems; marketplace trends; global markets and mergers; research needs; new purchase processes; and profiles…

  1. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  2. Order Division Automated System.

    Science.gov (United States)

    Kniemeyer, Justin M.; And Others

    This publication was prepared by the Order Division Automation Project staff to fulfill the Library of Congress' requirement to document all automation efforts. The report was originally intended for internal use only and not for distribution outside the Library. It is now felt that the library community at-large may have an interest in the…

  3. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they have not always been successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  4. Automation and robotics

    Science.gov (United States)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  5. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  6. Automated cell analysis tool for a genome-wide RNAi screen with support vector machine based supervised learning

    Science.gov (United States)

    Remmele, Steffen; Ritzerfeld, Julia; Nickel, Walter; Hesser, Jürgen

    2011-03-01

    RNAi-based high-throughput microscopy screens have become an important tool in the biological sciences for decrypting mostly unknown biological functions of human genes. However, manual analysis is impossible for such screens, since the number of image data sets can often be in the hundreds of thousands. Reliable automated tools are thus required to analyse fluorescence microscopy image data sets, which usually contain two or more reaction channels. The image analysis tool presented here is designed to analyse an RNAi screen investigating the intracellular trafficking and targeting of acylated Src kinases. In this specific screen, a data set consists of three reaction channels, and the investigated cells can appear in different phenotypes. The main issues of the image processing task are automatic cell segmentation, which has to be robust and accurate for all phenotypes, and subsequent phenotype classification. The cell segmentation is done in two steps: the cell nuclei are segmented first, and a classifier-enhanced region growing based on the cell nuclei then segments the cells. The classification of the cells is realized by a support vector machine, which has to be trained manually using supervised learning. Furthermore, the tool is brightness invariant, allowing different staining quality, and it provides a quality control that copes with typical defects during preparation and acquisition. A first version of the tool has already been successfully applied to an RNAi screen containing three hundred thousand image data sets, and the SVM-extended version is designed for additional screens.
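
    A minimal sketch of the seeded region-growing step is given below, growing a cell region outward from a nucleus seed while pixel intensities stay near the running region mean; the paper enhances this step with a classifier, which is replaced here by a plain intensity test.

        import numpy as np
        from collections import deque

        def region_grow(image, seed, tol=15.0):
            """Grow a region from a nucleus seed while intensities stay near the mean."""
            h, w = image.shape
            mask = np.zeros((h, w), dtype=bool)
            mask[seed] = True
            mean, count = float(image[seed]), 1
            queue = deque([seed])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if (0 <= ny < h and 0 <= nx < w and not mask[ny, nx]
                            and abs(float(image[ny, nx]) - mean) <= tol):
                        mask[ny, nx] = True
                        count += 1                                  # update running mean
                        mean += (float(image[ny, nx]) - mean) / count
                        queue.append((ny, nx))
            return mask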

  7. Asteroid taxonomic classifications

    International Nuclear Information System (INIS)

    Tholen, D.J.

    1989-01-01

    This paper reports on three taxonomic classification schemes developed and applied to the body of available color and albedo data. Asteroid taxonomic classifications according to two of these schemes are reproduced

  8. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides a historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  9. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  10. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  11. Hand eczema classification

    DEFF Research Database (Denmark)

    Diepgen, T L; Andersen, Klaus Ejner; Brandao, F M

    2008-01-01

    of the disease is rarely evidence based, and a classification system for different subdiagnoses of hand eczema is not agreed upon. Randomized controlled trials investigating the treatment of hand eczema are called for. For this, as well as for clinical purposes, a generally accepted classification system ... A classification system for hand eczema is proposed. Conclusions: It is suggested that this classification be used in clinical work and in clinical trials.

  12. Automated Sunspot Detection and Classification Using SOHO/MDI Imagery

    Science.gov (United States)

    2015-03-01

    to the geocentric North). 3. Focus and size of the solar disk is adjusted to fit an 18 cm diameter circle on the worksheet. 4. Analyst hand draws the ... "General Nature of the Sunspot," The Astrophysical Journal 230, 905-913 (1979). 14. Wheatland, M. S., "A Bayesian Approach to Solar Flare Prediction," The

  13. A Framework for Automated Marmoset Vocalization Detection And Classification

    Science.gov (United States)

    2016-09-08

    for studying the origins and neural basis of human language. Vocalizations belonging to the same species, or Conspecific Vocalizations (CVs), are...applications including automatic speech recognition [17], speech enhancement [18], voice activity detection [19], hyper-nasality detection [20], and emotion ...vocalizations. The feature sets chosen have the desirable property of capturing characteristics of the signals that are useful in both identifying and

  14. Automated Classification of Martian Morphology Using a Terrain Fingerprinting Method

    NARCIS (Netherlands)

    Koenders, R.; Lindenbergh, R.C.; Zegers, T.E.

    2009-01-01

    The planet Mars has a relatively short human exploration history, while the size of the scientific community studying Mars is also smaller than its Earth equivalent. On the other hand the interest in Mars is large, basically because it is the planet in the solar system most similar to Earth. Several

  15. Advanced Automated Detection Analysis and Classification of Cracks in Pavement

    OpenAIRE

    Scott, Dennis

    2014-01-01

    Functional Session 5: Pavement Management Moderated by Akyiaa Hosten This presentation was held at the Pavement Evaluation 2014 Conference, which took place from September 15-18, 2014 in Blacksburg, Virginia. Presentation only

  16. Defect detection module

    International Nuclear Information System (INIS)

    Ernwein, R.; Westermann, G.

    1986-01-01

    The "defect detector" module is aimed at recording exceptional events or states. Designed for monitoring the presence of voltage on the high-voltage supply modules of drift chambers, it can also detect the disappearance of the supply voltage and take into account fast transient signals [fr

  17. Quantum computing with defects.

    Science.gov (United States)

    Weber, J R; Koehl, W F; Varley, J B; Janotti, A; Buckley, B B; Van de Walle, C G; Awschalom, D D

    2010-05-11

    Identifying and designing physical systems for use as qubits, the basic units of quantum information, are critical steps in the development of a quantum computer. Among the possibilities in the solid state, a defect in diamond known as the nitrogen-vacancy (NV(-1)) center stands out for its robustness--its quantum state can be initialized, manipulated, and measured with high fidelity at room temperature. Here we describe how to systematically identify other deep center defects with similar quantum-mechanical properties. We present a list of physical criteria that these centers and their hosts should meet and explain how these requirements can be used in conjunction with electronic structure theory to intelligently sort through candidate defect systems. To illustrate these points in detail, we compare electronic structure calculations of the NV(-1) center in diamond with those of several deep centers in 4H silicon carbide (SiC). We then discuss the proposed criteria for similar defects in other tetrahedrally coordinated semiconductors.

  18. PASTEC: an automatic transposable element classification tool.

    Directory of Open Access Journals (Sweden)

    Claire Hoede

    Full Text Available SUMMARY: The classification of transposable elements (TEs) is a key step towards deciphering their potential impact on the genome. However, this process is often based on manual sequence inspection by TE experts. With the wealth of genomic sequences now available, this task requires automation, making it accessible to most scientists. We propose a new tool, PASTEC, which classifies TEs by searching for structural features and similarities. This tool outperforms currently available software for TE classification. The main innovation of PASTEC is the search for HMM profiles, which is useful for inferring the classification of an unknown TE on the basis of the conserved functional domains of its proteins. In addition, PASTEC is the only tool providing an exhaustive spectrum of possible classifications to the order level of the Wicker hierarchical TE classification system. It can also automatically classify other repeated elements, such as SSRs (Simple Sequence Repeats), rDNA or potential repeated host genes. Finally, the output of this new tool is designed to facilitate manual curation by providing biologists with all the evidence accumulated for each TE consensus. AVAILABILITY: PASTEC is available as a REPET module or as standalone software (http://urgi.versailles.inra.fr/download/repet/REPET_linux-x64-2.2.tar.gz). It requires a Unix-like system. There are two standalone versions, one of which is parallelized (requiring Sun Grid Engine or Torque) and the other of which is not.

  19. Classification with support hyperplanes

    NARCIS (Netherlands)

    G.I. Nalbantov (Georgi); J.C. Bioch (Cor); P.J.F. Groenen (Patrick)

    2006-01-01

    A new classification method is proposed, called Support Hyperplanes (SHs). To solve the binary classification task, SHs consider the set of all hyperplanes that do not make classification mistakes, referred to as semi-consistent hyperplanes. A test object is classified using

  20. Standard classification: Physics

    International Nuclear Information System (INIS)

    1977-01-01

    This is a draft standard classification of physics. The conception is based on the physics part of the systematic catalogue of the Bayerische Staatsbibliothek and on the classification given in standard textbooks. The ICSU-AB classification now used worldwide by physics information services was not taken into account. (BJ) [de

  1. A neural network approach for radiographic image classification in NDT

    International Nuclear Information System (INIS)

    Lavayssiere, B.

    1993-05-01

    Radiography is used by EDF for pipe inspection in nuclear power plants in order to detect defects. The radiographs obtained are then digitized in a well-defined protocol. The aim of EDF is to develop a non-destructive testing system for recognizing defects. In this note, we describe the procedure for recognizing areas with defects. We first present the digitization protocol, characterize the poor quality of the images under study, and propose a procedure to enhance defects. We then examine the problem raised by the choice of good features for classification. After having shown that statistical or standard textural features such as homogeneity, entropy or contrast are not relevant, we develop a geometrical-statistical approach based on the cooperation between a study of signal correlations and an analysis of regional extrema. The principle consists of analysing and comparing, for areas with and without defects, the evolution of conditional probability matrices for increasing neighbourhood sizes, the shape of variograms, and the location of regional minima. We demonstrate that the anisotropy and surface of series of 'comet tails' associated with the probability matrices, the variogram slope and statistical indices, and the location of regional extrema are features able to discriminate areas with defects from areas without any. The classification is then realized by a neural network, whose structure, properties and learning mechanisms are detailed. Finally we discuss the results. (author). 5 figs., 21 refs
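
    Among the features named above, the variogram slope lends itself to a compact illustration. The sketch below computes an empirical horizontal variogram of a grey-level patch; the lag range and the horizontal-only direction are illustrative simplifications, not the note's exact procedure.

        import numpy as np

        def variogram(patch, max_lag=20):
            """Mean squared grey-level difference as a function of horizontal lag."""
            p = patch.astype(float)
            gamma = np.empty(max_lag)
            for lag in range(1, max_lag + 1):
                diffs = p[:, lag:] - p[:, :-lag]
                gamma[lag - 1] = 0.5 * np.mean(diffs ** 2)
            return gamma

        # The slope of gamma over small lags is one scalar feature fed, together
        # with the others, to the neural network classifier.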

  2. Classification of refrigerants

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-07-01

    This document is based on the US standard ANSI/ASHRAE 34, published in 2001 and entitled 'Designation and safety classification of refrigerants'. This classification clearly organizes, in an internationally consistent way, the refrigerants used throughout the world, thanks to a codification of refrigerants corresponding to their chemical composition. This note explains this codification: prefix, suffixes (hydrocarbons and derived fluids, azeotropic and non-azeotropic mixtures, various organic compounds, non-organic compounds), and safety classification (toxicity, flammability, case of mixtures). (J.S.)

  3. Defect Detection and Segmentation Framework for Remote Field Eddy Current Sensor Data

    Directory of Open Access Journals (Sweden)

    Raphael Falque

    2017-10-01

    Full Text Available Remote-Field Eddy-Current (RFEC technology is often used as a Non-Destructive Evaluation (NDE method to prevent water pipe failures. By analyzing the RFEC data, it is possible to quantify the corrosion present in pipes. Quantifying the corrosion involves detecting defects and extracting their depth and shape. For large sections of pipelines, this can be extremely time-consuming if performed manually. Automated approaches are therefore well motivated. In this article, we propose an automated framework to locate and segment defects in individual pipe segments, starting from raw RFEC measurements taken over large pipelines. The framework relies on a novel feature to robustly detect these defects and a segmentation algorithm applied to the deconvolved RFEC signal. The framework is evaluated using both simulated and real datasets, demonstrating its ability to efficiently segment the shape of corrosion defects.
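
    A minimal sketch of the deconvolve-then-segment idea is given below, assuming a Gaussian stand-in for the sensor response and a synthetic 1-D trace; it illustrates the general approach, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 512
    truth = np.zeros(n)
    truth[[100, 300, 310]] = 1.0                              # sharp defect indications
    kernel = np.exp(-0.5 * (np.arange(-15, 16) / 4.0) ** 2)   # assumed sensor response
    trace = np.convolve(truth, kernel, mode="same") + rng.normal(0, 0.02, n)

    # Wiener deconvolution: frequency-domain division regularized by an
    # assumed noise-to-signal power ratio.
    H = np.fft.rfft(kernel, n)
    restored = np.fft.irfft(np.fft.rfft(trace) * np.conj(H) / (np.abs(H) ** 2 + 1e-2), n)
    restored = np.roll(restored, len(kernel) // 2)            # undo the kernel's center offset

    # Threshold segmentation of the restored trace into defect intervals.
    above = np.concatenate(([False], restored > 0.4, [False]))
    edges = np.flatnonzero(np.diff(above.astype(int)))
    print(list(zip(edges[::2], edges[1::2])))                 # (start, stop) index pairs
    ```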

  4. Automated reliability assessment for spectroscopic redshift measurements

    Science.gov (United States)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥10^6) that will require fully automated data-processing pipelines to analyze the data, extract crucial information and ensure that all requirements are met. A fundamental element in these pipelines is to associate with each galaxy redshift measurement a quality, or reliability, estimate. Aims: In this work, we introduce a new approach to automating the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods: We propose to rephrase the spectroscopic redshift estimation into a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features in the redshift posterior PDF and machine learning algorithms. Results: As a working example, public data from the VIMOS VLT Deep Survey is exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy of 58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy of 98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions: Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis of an automated reliability assessment for
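
    The unsupervised step can be sketched in a few lines. The code below is illustrative only: it builds mock redshift posterior PDFs, summarizes each with a handful of descriptors (the particular features and cluster count are assumptions), and partitions them with k-means.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    z = np.linspace(0.0, 2.0, 500)

    def mock_pdf(n_modes):
        """A toy redshift posterior: a mixture of 1 to 3 Gaussian modes."""
        pdf = np.zeros_like(z)
        for _ in range(n_modes):
            mu, s = rng.uniform(0.1, 1.9), rng.uniform(0.01, 0.1)
            pdf += np.exp(-0.5 * ((z - mu) / s) ** 2)
        return pdf

    def pdf_features(pdf):
        """Descriptors of the posterior: peak strength, dispersion, entropy."""
        dz = z[1] - z[0]
        pdf = pdf / (pdf.sum() * dz)                 # normalize to unit area
        mean = (z * pdf).sum() * dz
        sigma = np.sqrt(((z - mean) ** 2 * pdf).sum() * dz)
        p = pdf / pdf.sum()
        entropy = -(p * np.log(p + 1e-12)).sum()
        return [pdf.max(), sigma, entropy]

    X = np.array([pdf_features(mock_pdf(rng.integers(1, 4))) for _ in range(200)])
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    print(np.bincount(labels))   # population of each reliability-like cluster
    ```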

  5. Automated Single Cell Data Decontamination Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Tennessen, Kristin [Lawrence Berkeley National Lab. (LBNL), Walnut Creek, CA (United States). Dept. of Energy Joint Genome Inst.; Pati, Amrita [Lawrence Berkeley National Lab. (LBNL), Walnut Creek, CA (United States). Dept. of Energy Joint Genome Inst.

    2014-03-21

    Recent technological advancements in single-cell genomics have encouraged the classification and functional assessment of microorganisms from a wide span of the biosphere's phylogeny.1,2 Environmental processes of interest to the DOE, such as bioremediation and carbon cycling, can be elucidated through the genomic lens of these unculturable microbes. However, contamination can occur at various stages of the single-cell sequencing process. Contaminated data can lead to wasted time and effort on meaningless analyses, inaccurate or erroneous conclusions, and pollution of public databases. A fully automated decontamination tool is necessary to prevent these instances and increase the throughput of the single-cell sequencing process.

  6. Current Strategies in Reconstruction of Maxillectomy Defects

    Science.gov (United States)

    Andrades, Patricio; Militsakh, Oleg; Hanasono, Matthew M.; Rieger, Jana; Rosenthal, Eben L.

    2014-01-01

    Objective To outline a contemporary review of defect classification and reconstructive options. Design Review article. Setting Tertiary care referral centers. Results Although prosthetic rehabilitation remains the standard of care in many institutions, the discomfort of wearing, removing, and cleaning a prosthesis; the inability to retain a prosthesis in large defects; and the frequent need for readjustments often limit the value of this cost-effective and successful method of restoring speech and mastication. However, flap reconstruction offers an option for many, although there is no agreement as to which techniques should be used for optimal reconstruction. Flap reconstruction also involves a longer recovery time with increased risk of surgical complications, has higher costs associated with the procedure, and requires access to a highly experienced surgeon. Conclusion The surgeon and reconstructive team must make individualized decisions based on the extent of the maxillectomy defect (eg, the resection of the infraorbital rim, the extent of palate excision, skin compromise) and the need for radiation therapy. PMID:21844415

  7. Digital detection system of surface defects for large aperture optical elements

    International Nuclear Information System (INIS)

    Fan Yong; Chen Niannian; Gao Lingling; Jia Yuan; Wang Junbo; Cheng Xiaofeng

    2009-01-01

    Based on the light defect images against the dark background in a scattering imaging system, a digital detection system of surface defects for large aperture optical elements has been presented. In the system, the image is segmented by a multi-area self-adaptive threshold segmentation method, then a pixel labeling method based on replacing arrays is adopted to extract defect features quickly, and at last the defects are classified through back-propagation neural networks. Experiment results show that the system can achieve real-time detection and classification. (authors)
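
    A minimal sketch of the detect-and-label stages is shown below (not the authors' system), assuming bright defects on a dark background: local thresholding stands in for the multi-area self-adaptive segmentation, and connected-component statistics stand in for the fast feature extraction; the neural network classification step is omitted.

    ```python
    import cv2
    import numpy as np

    # Synthetic scattering image: dark background with bright point and line defects.
    rng = np.random.default_rng(2)
    img = rng.normal(10, 3, (512, 512)).clip(0, 255).astype(np.uint8)
    cv2.circle(img, (100, 200), 4, 255, -1)
    cv2.line(img, (300, 300), (340, 310), 200, 1)

    # Local thresholding: each pixel is compared with its neighbourhood mean,
    # which tolerates uneven illumination across a large aperture.
    mask = cv2.adaptiveThreshold(img, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                 cv2.THRESH_BINARY, 31, -20)

    # Fast labeling and per-defect feature extraction of connected regions.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    for i in range(1, n):  # label 0 is the background
        x, y, w, h, area = stats[i]
        print(f"defect {i}: area={area}px, bbox=({x},{y},{w},{h})")
    ```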

  8. Specific classification of financial analysis of enterprise activity

    Directory of Open Access Journals (Sweden)

    Synkevych Nadiia I.

    2014-01-01

    Full Text Available Although modern scientific literature offers a wide variety of classifications of types of financial analysis of enterprise activity, differing in their approach to classification and in the number and content of their classification features, a comprehensive comparison and analysis of these existing classifications has not been carried out. This explains the urgency of this study. The article examines the classifications of types of financial analysis proposed by various scientists and presents its own approach to this problem. Based on the results of this analysis, the article builds an improved, specific classification of financial analysis of enterprise activity, organized by the following features: objects, subjects, goals of study, automation level, time period of the analytical base, scope of study, organisation system, classification features of the subject, spatial belonging, sufficiency, information sources, periodicity, criterial base, method of data selection for analysis and time direction. All types of financial analysis differ significantly in their inherent properties and parameters depending on the goals of the financial analysis. The developed specific classification gives subjects of financial analysis of enterprise activity the possibility to identify the specific type of financial analysis that would correctly meet the set goals.

  9. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, the availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  10. PHOTOMETRIC SUPERNOVA CLASSIFICATION WITH MACHINE LEARNING

    International Nuclear Information System (INIS)

    Lochner, Michelle; Peiris, Hiranya V.; Lahav, Ofer; Winter, Max K.; McEwen, Jason D.

    2016-01-01

    Automated photometric supernova classification has become an active area of research in recent years in light of current and upcoming imaging surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope, given that spectroscopic confirmation of type for all supernovae discovered will be impossible. Here, we develop a multi-faceted classification pipeline, combining existing and new approaches. Our pipeline consists of two stages: extracting descriptive features from the light curves and classification using a machine learning algorithm. Our feature extraction methods vary from model-dependent techniques, namely SALT2 fits, to more independent techniques that fit parametric models to curves, to a completely model-independent wavelet approach. We cover a range of representative machine learning algorithms, including naive Bayes, k-nearest neighbors, support vector machines, artificial neural networks, and boosted decision trees (BDTs). We test the pipeline on simulated multi-band DES light curves from the Supernova Photometric Classification Challenge. Using the commonly used area under the curve (AUC) of the Receiver Operating Characteristic as a metric, we find that the SALT2 fits and the wavelet approach, with the BDTs algorithm, each achieve an AUC of 0.98, where 1 represents perfect classification. We find that a representative training set is essential for good classification, whatever the feature set or algorithm, with implications for spectroscopic follow-up. Importantly, we find that by using either the SALT2 or the wavelet feature sets with a BDT algorithm, accurate classification is possible purely from light curve data, without the need for any redshift information.
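
    The second stage is easy to sketch. The snippet below is illustrative only, assuming precomputed light-curve features (stand-ins for SALT2 parameters or wavelet coefficients): a boosted-decision-tree-style classifier is trained and scored with the same AUC metric the paper uses.

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    # Stand-in features; a real pipeline would extract these from the
    # multi-band light curves (SALT2 fits, parametric fits, or wavelets).
    n = 2000
    X = rng.normal(size=(n, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n) > 0).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3).fit(X_tr, y_tr)
    print("AUC:", round(roc_auc_score(y_te, bdt.predict_proba(X_te)[:, 1]), 3))
    ```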

  11. PHOTOMETRIC SUPERNOVA CLASSIFICATION WITH MACHINE LEARNING

    Energy Technology Data Exchange (ETDEWEB)

    Lochner, Michelle; Peiris, Hiranya V.; Lahav, Ofer; Winter, Max K. [Department of Physics and Astronomy, University College London, Gower Street, London WC1E 6BT (United Kingdom); McEwen, Jason D., E-mail: dr.michelle.lochner@gmail.com [Mullard Space Science Laboratory, University College London, Surrey RH5 6NT (United Kingdom)

    2016-08-01

    Automated photometric supernova classification has become an active area of research in recent years in light of current and upcoming imaging surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope, given that spectroscopic confirmation of type for all supernovae discovered will be impossible. Here, we develop a multi-faceted classification pipeline, combining existing and new approaches. Our pipeline consists of two stages: extracting descriptive features from the light curves and classification using a machine learning algorithm. Our feature extraction methods vary from model-dependent techniques, namely SALT2 fits, to more independent techniques that fit parametric models to curves, to a completely model-independent wavelet approach. We cover a range of representative machine learning algorithms, including naive Bayes, k-nearest neighbors, support vector machines, artificial neural networks, and boosted decision trees (BDTs). We test the pipeline on simulated multi-band DES light curves from the Supernova Photometric Classification Challenge. Using the commonly used area under the curve (AUC) of the Receiver Operating Characteristic as a metric, we find that the SALT2 fits and the wavelet approach, with the BDTs algorithm, each achieve an AUC of 0.98, where 1 represents perfect classification. We find that a representative training set is essential for good classification, whatever the feature set or algorithm, with implications for spectroscopic follow-up. Importantly, we find that by using either the SALT2 or the wavelet feature sets with a BDT algorithm, accurate classification is possible purely from light curve data, without the need for any redshift information.

  12. Multi-decadal classification of synoptic weather types, observed trends and links to rainfall characteristics over Saudi Arabia

    KAUST Repository

    El Kenawy, Ahmed M.; McCabe, Matthew; Stenchikov, Georgiy L.; Raj, Jerry

    2014-01-01

    An automated version of the Lamb weather type classification scheme was employed to characterize daily circulation conditions in Saudi Arabia from 1960 to 2005. Daily gridded fields of sea level pressure (SLP) from both the NCEP

  13. Algorithm of Defect Segmentation for AFP Based on Prepregs

    Directory of Open Access Journals (Sweden)

    CAI Zhiqiang

    2017-04-01

    Full Text Available In order to ensure the performance of automated fiber placement (AFP) formed parts, and exploiting the homogeneity of the prepreg surface image along the fiber direction, a defect segmentation algorithm combining gray compensation and subtraction, based on image processing technology, is proposed. The gray compensation matrix of the image is used to compensate the gray-scale image, and the maximum error point of the image matrix is eliminated according to the observation that the gray error obeys a normal distribution. A standard image is established, using the allowed deviation coefficient K as the criterion for subtraction segmentation. Experiments show that the algorithm is effective and fast in segmenting two typical kinds of laying defects, bubbles and foreign objects, and provides a good theoretical basis for realizing automatic online monitoring of laying defects.
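
    A rough sketch of the compensate-and-subtract idea follows, under stated assumptions: a synthetic prepreg image whose columns run along the fiber direction, a constant defect-free standard image, and an illustrative deviation coefficient K. It is not the authors' exact algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    H, W = 64, 256
    img = np.tile(rng.normal(120, 2, (1, W)), (H, 1))   # homogeneous along each column (fiber)
    img += rng.normal(0, 1.5, (H, W))                   # sensor noise
    img[30:38, 100:112] += 40                           # synthetic foreign-object defect

    # Gray compensation: remove the stable cross-fiber gray pattern so the
    # background becomes flat before comparison with the standard image.
    compensation = img.mean(axis=0, keepdims=True) - img.mean()
    standard = np.full_like(img, img.mean())            # assumed defect-free standard
    residual = (img - compensation) - standard          # subtraction step

    K = 4.0                                             # allowed deviation coefficient
    mask = np.abs(residual) > K * residual.std()
    print("defect pixels flagged:", int(mask.sum()))
    ```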

  14. Software for roof defects recognition on aerial photographs

    Science.gov (United States)

    Yudin, D.; Naumov, A.; Dolzhenko, A.; Patrakova, E.

    2018-05-01

    The article presents information on software for recognizing roof defects on aerial photographs taken with air drones. An aerial image segmentation mechanism is described that detects roof defects: unevenness that causes water stagnation after rain. It is shown that the HSV-transformation approach allows quick detection of stagnation areas, their sizes and perimeters, but is sensitive to shadows and changes of roofing type. A Deep Fully Convolutional Network software solution eliminates this drawback. The test data set consists of roofing photos with defects and binary masks for them. The FCN approach gave acceptable image segmentation results in terms of average Dice metric value. This software can be used to automate the inspection of roof conditions in the production sector and in housing and utilities infrastructure.
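
    The HSV route can be sketched quickly. The bounds below are illustrative and scene-dependent, which is exactly the shadow and roofing-type sensitivity noted above; the synthetic image stands in for an aerial roof photo.

    ```python
    import cv2
    import numpy as np

    # Synthetic stand-in for an aerial roof photo: light roofing with one
    # darker, wetter-looking patch where water would stagnate.
    img = np.zeros((200, 200, 3), np.uint8)
    img[:] = (180, 180, 170)
    cv2.ellipse(img, (120, 90), (30, 18), 0, 0, 360, (90, 100, 60), -1)

    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    # Stagnation areas appear darker and more saturated than dry roofing;
    # these HSV bounds are illustrative assumptions.
    mask = cv2.inRange(hsv, (0, 30, 0), (179, 255, 130))

    # Size and perimeter of each detected stagnation area.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        print("area:", cv2.contourArea(c), "perimeter:", round(cv2.arcLength(c, True), 1))
    ```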

  15. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low-cost and large-scale nanostructure fabrication. This technique is based on a contact molding-demolding process that can produce a number of defects such as incomplete filling, negative patterns, and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. The results are independent of the device which captures the image (optical, confocal or electron microscope). The use of numerical images makes it possible to automate the detection and to compute a statistical analysis of defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures. © 2013 Elsevier B.V. All rights reserved.
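
    One illustrative way to detect such defects numerically (a sketch, not the paper's algorithm) is to exploit the grating's periodicity: fold the measured profile into periods and flag any period that deviates strongly from the median consensus.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    period = 16
    profile = np.sin(2 * np.pi * np.arange(period) / period)   # ideal grating period
    signal = np.tile(profile, 64) + rng.normal(0, 0.05, 64 * period)
    signal[500:510] = 0.0                                      # incomplete-filling defect

    # Fold the trace into periods and compare each with the median period:
    # a printed period far from the consensus is flagged as defective.
    periods = signal.reshape(-1, period)
    consensus = np.median(periods, axis=0)
    deviation = np.abs(periods - consensus).mean(axis=1)
    flagged = np.flatnonzero(deviation > 5 * np.median(deviation))
    print("defective periods:", flagged)
    ```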

  16. Classification, disease, and diagnosis.

    Science.gov (United States)

    Jutel, Annemarie

    2011-01-01

    Classification shapes medicine and guides its practice. Understanding classification must be part of the quest to better understand the social context and implications of diagnosis. Classifications are part of the human work that provides a foundation for the recognition and study of illness: deciding how the vast expanse of nature can be partitioned into meaningful chunks, stabilizing and structuring what is otherwise disordered. This article explores the aims of classification, their embodiment in medical diagnosis, and the historical traditions of medical classification. It provides a brief overview of the aims and principles of classification and their relevance to contemporary medicine. It also demonstrates how classifications operate as social framing devices that enable and disable communication, assert and refute authority, and are important items for sociological study.

  17. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Reifenhaeuser, R.; Schlicht, K.

    1976-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio to a definite efficiency. (orig.) [de

  18. Automation of radioimmunoassay

    International Nuclear Information System (INIS)

    Yamaguchi, Chisato; Yamada, Hideo; Iio, Masahiro

    1974-01-01

    Automation systems under development for measuring Australia antigen by radioimmunoassay were discussed. Samples were processed as follows: blood serum is dispensed by an automated sampler into the test tube and then incubated under controlled time and temperature; the first counting is omitted; labelled antibody is dispensed into the serum after washing; samples are incubated and then centrifuged; radioactivities in the precipitate are counted by an auto-well counter; measurements are tabulated by an automated typewriter. Not only a well-type counter but also a position counter was studied. (Kanao, N.)

  19. Automated electron microprobe

    International Nuclear Information System (INIS)

    Thompson, K.A.; Walker, L.R.

    1986-01-01

    The Plant Laboratory at the Oak Ridge Y-12 Plant has recently obtained a Cameca MBX electron microprobe with a Tracor Northern TN5500 automation system. This allows full stage and spectrometer automation and digital beam control. The capabilities of the system include qualitative and quantitative elemental microanalysis for all elements above and including boron in atomic number, high- and low-magnification imaging and processing, elemental mapping and enhancement, and particle size, shape, and composition analyses. Very low magnification, quantitative elemental mapping using stage control (which is of particular interest) has been accomplished along with automated size, shape, and composition analysis over a large relative area

  20. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure. The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management.

  1. Managing laboratory automation.

    Science.gov (United States)

    Saboe, T J

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Finally, some comments on future automation needs are discussed.

  2. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Full Text Available Development of an automated PCB inspection system meeting the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection process to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision and followed by testing and analysis, was proposed in order to aid the manufacturer in the process of automation.

  3. Operational proof of automation

    International Nuclear Information System (INIS)

    Jaerschky, R.; Schlicht, K.

    1977-01-01

    Automation of the power plant process may imply quite a number of problems. The automation of dynamic operations requires complicated programmes often interfering in several branched areas. This reduces clarity for the operating and maintenance staff, whilst increasing the possibilities of errors. The synthesis and the organization of standardized equipment have proved very successful. The possibilities offered by this kind of automation for improving the operation of power plants will only sufficiently and correctly be turned to profit, however, if the application of these equipment techniques is further improved and if it stands in a certain ratio to a definite efficiency. (orig.) [de

  4. Automatic appraisal of defects in irradiated pins by eddy current testing

    International Nuclear Information System (INIS)

    Marsol, R.; Cornu, B.

    1986-10-01

    Eddy current testing is very efficient for inspecting the sheaths of spent fuel elements. Automation of the process is being developed to replace visual examination of recorded eddy current signals. The method is applied to austenitic steel fuel cans for fast neutron reactors to detect cracks, voids, inclusions... The different types of defects and the experimental processes are recalled, then automatic detection and the method for defect qualification are presented [fr

  5. Congenital Heart Defects (For Parents)

    Science.gov (United States)

    ... to be associated with genetic disorders, such as Down syndrome. But the cause of most congenital heart defects isn't known. While they can't be prevented, many treatments are available for the defects and related health ...

  6. Antigravity from a spacetime defect

    OpenAIRE

    Klinkhamer, F. R.; Queiruga, J. M.

    2018-01-01

    We argue that there may exist spacetime defects embedded in Minkowski spacetime, which have negative active gravitational mass. One such spacetime defect then repels a test particle, corresponding to what may be called "antigravity."

  7. Automated ultrasonic shop inspection of reactor pressure vessel forgings

    International Nuclear Information System (INIS)

    Farley, J.M.; Dikstra, B.J.; Hanstock, D.J.; Pople, C.H.

    1986-01-01

    Automated ultrasonic shop inspection utilizing a computer-controlled system is being applied to each of the forgings for the reactor pressure vessel of the proposed Sizewell B PWR power station. Procedures which utilize a combination of high sensitivity shear wave pulse echo, 0 degrees and 70 degrees angled longitudinal waves, tandem and through-thickness arrays have been developed to provide comprehensive coverage and an overall reliability of inspection comparable to the best achieved in UKAEA defect detection trials and in PISC II. This paper describes the ultrasonic techniques, the automated system (its design, commissioning and testing), validation and the progress of the inspections

  8. Studies of defects and defect agglomerates by positron annihilation spectroscopy

    DEFF Research Database (Denmark)

    Eldrup, Morten Mostgaard; Singh, B.N.

    1997-01-01

    A brief introduction to positron annihilation spectroscopy (PAS), and in particular to its use for defect studies in metals, is given. Positrons injected into a metal may become trapped in defects such as vacancies, vacancy clusters, voids, bubbles and dislocations and subsequently annihilate from the trapped state in the defect. The annihilation characteristics (e.g., the lifetime of the positron) can be measured and provide information about the nature of the defect (e.g., size, density, morphology). The technique is sensitive to both defect size (in the range from monovacancies up to cavities...

  9. Enhanced defect of interest [DOI] monitoring by utilizing sensitive inspection and ADRTrue SEM review

    Science.gov (United States)

    Kirsch, Remo; Zeiske, Ulrich; Shabtay, Saar; Beyer, Mirko; Yerushalmi, Liran; Goshen, Oren

    2011-03-01

    As semiconductor process design rules continue to shrink, it becomes more and more difficult for optical inspection tools to separate true defects from nuisance. Therefore, monitoring Defects of Interest (DOI) becomes a real challenge (Figure 1). This phenomenon occurs because the signal received from real defects gets lower while noise levels remain almost the same, resulting in a high inspection nuisance rate, which jeopardizes the ability to provide a meaningful, true defect Pareto. A non-representative defect Pareto creates a real challenge for reliable process monitoring (Figure 4). Traditionally, inspection tool recipes were optimized to keep data load at a manageable level and provide defect maps with ~10% nuisance rate, but as defects of interest get smaller with design rule shrinkage, this requirement results in a painful compromise in detection sensitivity. The inspection is usually followed by defect review and classification using a scanning electron microscope (SEM); classification is done manually and is performed on only a small sample of the inspection defect map due to limitations of time and manual resources. The sample is usually 50-60 randomly selected locations; review is usually performed manually, and manual classification is performed for all the reviewed locations. In the approach described in this paper, the inspection tool recipe is optimized for sensitivity rather than low nuisance rate (i.e., detect all DOI while accepting a higher nuisance rate). Inspection results with a high nuisance rate introduce new challenges for SEM review methodology and tools. This paper describes a new approach which enhances process monitoring quality; it presents the results of collaborative work between the Process Diagnostic & Control Business Unit of Applied Materials® and GLOBALFOUNDRIES®, utilizing Applied Materials ADRTrueTM & SEMVisionTM capabilities. The study shows that the new approach reveals new defect types in the Pareto, and improves the ability to

  10. Congenital Heart Defects and CCHD

    Science.gov (United States)

    ... and more. ... in congenital heart defects. You have a family history of congenital heart ... syndrome or VCF. After birth your baby may be tested for CCHD as ...

  11. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  12. Hydrometeorological Automated Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Office of Hydrologic Development of the National Weather Service operates HADS, the Hydrometeorological Automated Data System. This data set contains the last 48...

  13. Automated External Defibrillator

    Science.gov (United States)

    ... leads to a 10 percent reduction in survival. Training to use an automated external defibrillator: learning how to use an AED and taking a CPR (cardiopulmonary resuscitation) course are helpful. However, if trained ...

  14. Planning for Office Automation.

    Science.gov (United States)

    Mick, Colin K.

    1983-01-01

    Outlines a practical approach to planning for office automation termed the "Focused Process Approach" (the "what" phase, "how" phase, "doing" phase) which is a synthesis of the problem-solving and participatory planning approaches. Thirteen references are provided. (EJS)

  15. Fixed automated spray technology.

    Science.gov (United States)

    2011-04-19

    This research project evaluated the construction and performance of Boschung's Fixed Automated Spray Technology (FAST) system. The FAST system automatically sprays de-icing material on the bridge when icing conditions are about to occur. The FA...

  16. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  17. Automation Interface Design Development

    Data.gov (United States)

    National Aeronautics and Space Administration — Our research makes its contributions at two levels. At one level, we addressed the problems of interaction between humans and computers/automation in a particular...

  18. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  19. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the careful radioprotection of operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of automation on regulations. - Highlights: ► Generator availability and robust chemistry drove the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on technological development will also be considered

  20. Security classification of information

    Energy Technology Data Exchange (ETDEWEB)

    Quist, A.S.

    1993-04-01

    This document is the second of a planned four-volume work that comprehensively discusses the security classification of information. The main focus of Volume 2 is on the principles for classification of information. Included herein are descriptions of the two major types of information that governments classify for national security reasons (subjective and objective information), guidance to use when determining whether information under consideration for classification is controlled by the government (a necessary requirement for classification to be effective), information disclosure risks and benefits (the benefits and costs of classification), standards to use when balancing information disclosure risks and benefits, guidance for assigning classification levels (Top Secret, Secret, or Confidential) to classified information, guidance for determining how long information should be classified (classification duration), classification of associations of information, classification of compilations of information, and principles for declassifying and downgrading information. Rules or principles of certain areas of our legal system (e.g., trade secret law) are sometimes mentioned to provide added support to some of those classification principles.

  1. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  2. Highway Electrification And Automation

    OpenAIRE

    Shladover, Steven E.

    1992-01-01

    This report addresses how the California Department of Transportation and the California PATH Program have made efforts to evaluate the feasibility and applicability of highway electrification and automation technologies. In addition to describing how the work was conducted, the report also describes the findings on highway electrification and highway automation, with experimental results, design study results, and a region-wide application impacts study for Los Angeles.

  3. Automated lattice data generation

    Directory of Open Access Journals (Sweden)

    Ayyar Venkitesh

    2018-01-01

    Full Text Available The process of generating ensembles of gauge configurations (and measuring various observables over them can be tedious and error-prone when done “by hand”. In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.

  4. Automated lattice data generation

    Science.gov (United States)

    Ayyar, Venkitesh; Hackett, Daniel C.; Jay, William I.; Neil, Ethan T.

    2018-03-01

    The process of generating ensembles of gauge configurations (and measuring various observables over them) can be tedious and error-prone when done "by hand". In practice, most of this procedure can be automated with the use of a workflow manager. We discuss how this automation can be accomplished using Taxi, a minimal Python-based workflow manager built for generating lattice data. We present a case study demonstrating this technology.

  5. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  6. Marketing automation supporting sales

    OpenAIRE

    Sandell, Niko

    2016-01-01

    The past couple of decades have been a time of major change in marketing. Digitalization has become a permanent part of marketing and has at the same time enabled efficient collection of data. Personalization and customization of content play a crucial role in marketing when new customers are acquired. This has also created a need for automation to facilitate the distribution of targeted content. As a result of successful marketing automation, more information about the customers is gathered ...

  7. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.

  8. Managing laboratory automation

    OpenAIRE

    Saboe, Thomas J.

    1995-01-01

    This paper discusses the process of managing automated systems through their life cycles within the quality-control (QC) laboratory environment. The focus is on the process of directing and managing the evolving automation of a laboratory; system examples are given. The author shows how both task and data systems have evolved, and how they interrelate. A BIG picture, or continuum view, is presented and some of the reasons for success or failure of the various examples cited are explored. Fina...

  9. Shielded cells transfer automation

    International Nuclear Information System (INIS)

    Fisher, J.J.

    1984-01-01

    Nuclear waste from shielded cells is removed, packaged, and transferred manually in many nuclear facilities. Radiation exposure is absorbed by operators during these operations and limited only through procedural controls. Technological advances in automation using robotics have allowed a production waste removal operation to be automated to reduce radiation exposure. The robotic system bags waste containers out of a glove box and transfers them to a shielded container. Operators control the system from outside the work area via television cameras. 9 figures

  10. Automated Status Notification System

    Science.gov (United States)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.

  11. Immobile defects in ferroelastic walls: Wall nucleation at defect sites

    Science.gov (United States)

    He, X.; Salje, E. K. H.; Ding, X.; Sun, J.

    2018-02-01

    Randomly distributed, static defects are enriched in ferroelastic domain walls. The relative concentration of defects in walls, N_d, follows a power law as a function of the total defect concentration C: N_d ~ C^α with α = 0.4. The enrichment N_d/C ranges from ~50 times when C = 10 ppm to ~3 times when C = 1000 ppm. The resulting enrichment is due to nucleation at defect sites, as observed in large-scale MD simulations. The dynamics of domain nucleation and switching depends on the defect concentration. Their energy distribution follows a power law with exponents during yield between ε ≈ 1.82 and 2.0 as the defect concentration increases. The power law exponent is ε ≈ 2.7 in the plastic regime, independent of the defect concentration.
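
    As a quick numerical illustration of the quoted scaling (synthetic data, not the paper's): a log-log least-squares fit recovers the exponent, and a prefactor of roughly 200 reproduces the ~50x and ~3x enrichment figures.

    ```python
    import numpy as np

    # Synthetic (C, N_d) pairs obeying N_d ~ C**0.4 with small scatter.
    C = np.array([10.0, 30.0, 100.0, 300.0, 1000.0])   # total concentration (ppm)
    Nd = 200.0 * C**0.4 * np.exp(np.random.default_rng(6).normal(0, 0.05, C.size))

    alpha, _ = np.polyfit(np.log(C), np.log(Nd), 1)    # log-log least squares
    print(f"fitted alpha = {alpha:.2f}")               # close to the reported 0.4
    print("enrichment N_d/C:", np.round(Nd / C, 1))    # decays from ~50x to ~3x
    ```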

  12. Microsurgical reconstruction of large nerve defects using autologous nerve grafts.

    Science.gov (United States)

    Daoutis, N K; Gerostathopoulos, N E; Efstathopoulos, D G; Misitizis, D P; Bouchlis, G N; Anagnostou, S K

    1994-01-01

    Between 1986 and 1993, 643 patients with peripheral nerve trauma were treated in our clinic. Primary neurorrhaphy was performed in 431 of these patients and nerve grafting in 212 patients. We present the functional results after nerve grafting in 93 patients with large nerve defects who were followed for more than 2 years. Evaluation of function was based on the Medical Research Council (MRC) classification for motor and sensory recovery. Factors affecting functional outcome, such as age of the patient, denervation time, length of the defect, and level of the injury were noted. Good results according to the MRC classification were obtained in the majority of cases, although function remained less than that of the uninjured side.

  13. Benign gastric filling defect

    International Nuclear Information System (INIS)

    Oh, K. K.; Lee, Y. H.; Cho, O. K.; Park, C. Y.

    1979-01-01

    The gastric lesion is a common source of complaints among Orientals; however, evaluation of gastric symptoms and laboratory examination offer little specific aid in the diagnosis of gastric diseases. Thus roentgenography of the gastrointestinal tract is one of the most reliable methods for detailed diagnosis. On double contrast study of the stomach, a gastric filling defect is mostly caused by malignant gastric cancer; however, other benign lesions can cause similar pictures and can be successfully treated by surgery. From this point of view, 66 cases of benign causes of gastric filling defect, verified pathologically by endoscopy or surgery during the last 7 years at Yonsei University College of Medicine, Severance Hospital, were analyzed. The characteristic radiological picture of each disease was discussed for precise radiologic diagnosis. 1. Of the total 66 cases, there were 52 cases of benign gastric tumor, 10 cases of gastric varices, 5 cases of gastric bezoar, 5 cases of corrosive gastritis, 3 cases of granulomatous disease and one case of gastric hematoma. 2. The most frequent cause of benign tumors was adenomatous polyp (35/42), and the next was leiomyoma (4/42). Others were one case each of carcinoid, neurofibroma and cyst. 3. Benign adenomatous polyps were characteristically relatively small with a smooth surface, and it was observed that large benign polyps were frequently type IV lesions with a stalk. 4. Submucosal tumors such as leiomyoma need differential diagnosis from polypoid malignant cancer; the characteristic point of differentiation was a well circumscribed, smooth-margined filling defect without definite mucosal destruction of the surface. 5. Gastric varices showed multiple lobulated filling defects, especially in the gastric fundus, that changed size and shape with respiration and posture of the patient. The same varicose lesions in the esophagus and a history of liver disease were helpful for easier diagnosis. 6. Gastric bezoar showed a well defined movable mass

  14. Benign gastric filling defect

    Energy Technology Data Exchange (ETDEWEB)

    Oh, K. K.; Lee, Y. H.; Cho, O. K.; Park, C. Y. [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    1979-06-15

    The gastric lesion is a common source of complaints among Orientals; however, evaluation of gastric symptoms and laboratory examination offer little specific aid in the diagnosis of gastric diseases. Thus roentgenography of the gastrointestinal tract is one of the most reliable methods for detailed diagnosis. On double contrast study of the stomach, a gastric filling defect is mostly caused by malignant gastric cancer; however, other benign lesions can cause similar pictures and can be successfully treated by surgery. From this point of view, 66 cases of benign causes of gastric filling defect, verified pathologically by endoscopy or surgery during the last 7 years at Yonsei University College of Medicine, Severance Hospital, were analyzed. The characteristic radiological picture of each disease was discussed for precise radiologic diagnosis. 1. Of the total 66 cases, there were 52 cases of benign gastric tumor, 10 cases of gastric varices, 5 cases of gastric bezoar, 5 cases of corrosive gastritis, 3 cases of granulomatous disease and one case of gastric hematoma. 2. The most frequent cause of benign tumors was adenomatous polyp (35/42), and the next was leiomyoma (4/42). Others were one case each of carcinoid, neurofibroma and cyst. 3. Benign adenomatous polyps were characteristically relatively small with a smooth surface, and it was observed that large benign polyps were frequently type IV lesions with a stalk. 4. Submucosal tumors such as leiomyoma need differential diagnosis from polypoid malignant cancer; the characteristic point of differentiation was a well circumscribed, smooth-margined filling defect without definite mucosal destruction of the surface. 5. Gastric varices showed multiple lobulated filling defects, especially in the gastric fundus, that changed size and shape with respiration and posture of the patient. The same varicose lesions in the esophagus and a history of liver disease were helpful for easier diagnosis. 6. Gastric bezoar showed a well defined movable mass

  15. Benign gastric filling defect

    Energy Technology Data Exchange (ETDEWEB)

    Oh, K K; Lee, Y H; Cho, O K; Park, C Y [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    1979-06-15

    The gastric lesion is a common source of complaints among Orientals; however, evaluation of gastric symptoms and laboratory examination offer little specific aid in the diagnosis of gastric diseases. Thus roentgenography of the gastrointestinal tract is one of the most reliable methods for detailed diagnosis. On double contrast study of the stomach, a gastric filling defect is mostly caused by malignant gastric cancer; however, other benign lesions can cause similar pictures and can be successfully treated by surgery. From this point of view, 66 cases of benign causes of gastric filling defect, verified pathologically by endoscopy or surgery during the last 7 years at Yonsei University College of Medicine, Severance Hospital, were analyzed. The characteristic radiological picture of each disease was discussed for precise radiologic diagnosis. 1. Of the total 66 cases, there were 52 cases of benign gastric tumor, 10 cases of gastric varices, 5 cases of gastric bezoar, 5 cases of corrosive gastritis, 3 cases of granulomatous disease and one case of gastric hematoma. 2. The most frequent cause of benign tumors was adenomatous polyp (35/42), and the next was leiomyoma (4/42). Others were one case each of carcinoid, neurofibroma and cyst. 3. Benign adenomatous polyps were characteristically relatively small with a smooth surface, and it was observed that large benign polyps were frequently type IV lesions with a stalk. 4. Submucosal tumors such as leiomyoma need differential diagnosis from polypoid malignant cancer; the characteristic point of differentiation was a well circumscribed, smooth-margined filling defect without definite mucosal destruction of the surface. 5. Gastric varices showed multiple lobulated filling defects, especially in the gastric fundus, that changed size and shape with respiration and posture of the patient. The same varicose lesions in the esophagus and a history of liver disease were helpful for easier diagnosis. 6. Gastric bezoar showed a well defined movable mass

  16. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    In automated production processes grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases and describes how different principles are adopted at different scales in different applications (from assembly to disassembly, from aerospace to the food industry, from textile to logistics). Finally, the most recent research is reviewed in order to introduce the new trends in grasping and to provide an outlook on the future of both grippers and robotic hands in automated production processes.

  17. Demonstration of automated robotic workcell for hazardous waste characterization

    International Nuclear Information System (INIS)

    Holliday, M.; Dougan, A.; Gavel, D.; Gustaveson, D.; Johnson, R.; Kettering, B.; Wilhelmsen, K.

    1993-02-01

    An automated robotic workcell to classify hazardous waste stream items with previously unknown characteristics has been designed, tested and demonstrated. The object attributes being quantified are radiation signature, metal content, and object orientation and volume. The multi-sensor information is used to make segregation decisions and to perform automatic grasping of objects. The workcell control program uses an off-line programming system by Cimetrix Inc. as a server to perform both simulation control and actual hardware control of the workcell. This paper discusses the overall workcell layout, sensor specifications, workcell supervisory control, 2D vision based automated grasp planning and object classification algorithms.

  18. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.

  19. Surface defects and chiral algebras

    Energy Technology Data Exchange (ETDEWEB)

    Córdova, Clay [School of Natural Sciences, Institute for Advanced Study,1 Einstein Dr, Princeton, NJ 08540 (United States); Gaiotto, Davide [Perimeter Institute for Theoretical Physics,31 Caroline St N, Waterloo, ON N2L 2Y5 (Canada); Shao, Shu-Heng [School of Natural Sciences, Institute for Advanced Study,1 Einstein Dr, Princeton, NJ 08540 (United States)

    2017-05-26

    We investigate superconformal surface defects in four-dimensional N=2 superconformal theories. Each such defect gives rise to a module of the associated chiral algebra and the surface defect Schur index is the character of this module. Various natural chiral algebra operations such as Drinfeld-Sokolov reduction and spectral flow can be interpreted as constructions involving four-dimensional surface defects. We compute the index of these defects in the free hypermultiplet theory and Argyres-Douglas theories, using both infrared techniques involving BPS states, as well as renormalization group flows onto Higgs branches. In each case we find perfect agreement with the predicted characters.

  20. Classification of Flotation Frothers

    Directory of Open Access Journals (Sweden)

    Jan Drzymala

    2018-02-01

    Full Text Available In this paper, a scheme for classifying flotation frothers is presented. The scheme first indicates the physical system in which a frother is present; four such systems are distinguished: the pure state, aqueous solution, aqueous solution/gas system and aqueous solution/gas/solid system. As a result, there are numerous classifications of flotation frothers, which can be organized into the scheme described in detail in this paper. It follows from the paper that a meaningful classification of frothers relies on choosing the physical system and then the feature, trend, or parameter(s) according to which the classification is performed. The proposed classification can play a useful role in the characterization and evaluation of flotation frothers.