Sample records for automatic term identification

  1. Automatic Language Identification (United States)


    Hundreds of input languages would need to be supported; the reader is referred to the linguistics literature on distinguishing one language from another. A training algorithm builds one model per language (e.g., French, German, Spanish) from a set of training speech utterances. Speech sounds (i.e., vowels) in each utterance are located automatically, and feature vectors are normalized to be insensitive to overall amplitude and pitch.

  2. 2010 United States Automatic Identification System Database (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  3. 2014 United States Automatic Identification System Database (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  4. 2011 United States Automatic Identification System Database (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  5. 2009 United States Automatic Identification System Database (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  6. 2012 United States Automatic Identification System Database (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...



    Vladimír Modrák; Peter Knuth


    Automatic identification of items saves time and is beneficial in various areas, including warehouse management. Identification can be performed by many technologies, but RFID technology appears to be one of the smartest solutions. This article deals with the testing and possible uses of RFID technology in warehouse management. All results and measurement outcomes are documented in the form of graphs, followed by a comprehensive analysis.

  8. Automatic identification of species with neural networks

    Directory of Open Access Journals (Sweden)

    Andrés Hernández-Serna


    Full Text Available A new automatic identification system using photographic images has been designed to recognize fish, plant, and butterfly species from Europe and South America. The automatic classification system integrates multiple image processing tools to extract the geometry, morphology, and texture of the images. Artificial neural networks (ANNs) were used as the pattern recognition method. We tested a data set that included 740 species and 11,198 individuals. Our results show that the system performed with high accuracy, reaching 91.65% true positive identifications for fish, 92.87% for plants and 93.25% for butterflies. Our results highlight how neural networks can complement species identification.
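The two-stage pipeline this record describes (image features in, species label out) can be illustrated with a deliberately simple stand-in for the trained ANN: a nearest-centroid classifier over feature vectors. All species names and feature values below are invented for illustration.

```python
import math

def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(samples):
    """samples: {species: [feature_vector, ...]} -> {species: centroid}."""
    return {sp: centroid(vecs) for sp, vecs in samples.items()}

def classify(model, vec):
    """Assign vec to the species whose centroid is nearest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda sp: dist(model[sp], vec))

# Hypothetical geometry/texture features: [aspect_ratio, mean_intensity]
samples = {
    "fish_a":      [[3.1, 0.40], [2.9, 0.42]],
    "butterfly_b": [[1.1, 0.80], [0.9, 0.78]],
}
model = train(samples)
print(classify(model, [3.0, 0.41]))  # fish_a
```

A real system would replace the centroid rule with the trained ANN and feed it the extracted geometry, morphology and texture attributes.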

  9. Intelligent Storage System Based on Automatic Identification

    Directory of Open Access Journals (Sweden)

    Kolarovszki Peter


    Full Text Available This article describes RFID technology in conjunction with warehouse management systems (WMS). It also deals with automatic identification and data capture technologies and the individual processes used in a warehouse management system, from goods entering production through identification to palletizing, storing, bin transfer and removal of goods from the warehouse. The article focuses on utilizing AMP middleware in WMS processes. Nowadays, the identification of goods in most warehouses is carried out through barcodes; in this article we specify how the processes described above can be handled through RFID technology. All results are verified by measurement in our AIDC laboratory at the University of Žilina, and in the Laboratory of Automatic Identification of Goods and Services at GS1 Slovakia. The results of our research bring a new point of view and indicate ways of using RFID technology in warehouse management systems.

  10. Automatic identification of mass spectra

    International Nuclear Information System (INIS)

    Drabloes, F.


    Several approaches to preprocessing and comparison of low resolution mass spectra have been evaluated by various test methods related to library search. It is shown that there is a clear correlation between the nature of any contamination of a spectrum, the basic principle of the transformation or distance measure, and the performance of the identification system. The identification of functionality from low resolution spectra has also been evaluated using several classification methods. It is shown that there is an upper limit to the success of this approach, but also that this can be improved significantly by using a very limited amount of additional information. 10 refs

  11. Automatic target identification using neural networks (United States)

    Abdallah, Mahmoud A.; Samu, Tayib I.; Grissom, William A.


    Neural network theories are applied to attain human-like performance in areas such as speech recognition, statistical mapping, and target recognition or identification. In target identification, one of the difficult tasks has been the extraction of features to be used to train the neural network which is subsequently used for the target's identification. The purpose of this paper is to describe the development of an automatic target identification system using features extracted from a specific class of targets. The extracted features were the graphical representations of the silhouettes of the targets. Image processing techniques and some Fast Fourier Transform (FFT) properties were implemented to extract the features. The FFT eliminates variations in the extracted features due to rotation or scaling. A neural network was trained with the extracted features using the Learning Vector Quantization paradigm. An identification system was set up to test the algorithm. The image processing software was interfaced with the MATLAB Neural Network Toolbox via a computer program written in C to automate the target identification process. The system performed well, as it classified the objects used to train it irrespective of rotation, scaling, and translation. This automatic target identification system had a classification success rate of about 95%.

  12. Optical Automatic Car Identification (OACI) : Volume 1. Advanced System Specification. (United States)


    A performance specification is provided in this report for an Optical Automatic Car Identification (OACI) scanner system which features 6% improved readability over existing industry scanner systems. It also includes the analysis and rationale which ...

  13. Estimating spatial travel times using automatic vehicle identification data (United States)


    Prepared ca. 2001. The paper describes an algorithm that was developed for estimating reliable and accurate average roadway link travel times using Automatic Vehicle Identification (AVI) data. The algorithm presented is unique in two aspects. First, ...
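The core of such an AVI travel-time estimate is matching tag IDs read at an upstream and a downstream station and averaging the time differences. A minimal sketch, with reader names and timestamps invented:

```python
def link_travel_times(upstream, downstream):
    """upstream/downstream: {tag_id: read_time_seconds} from two AVI readers.
    Returns per-vehicle travel times for tags seen at both stations."""
    return {tag: downstream[tag] - upstream[tag]
            for tag in upstream
            if tag in downstream and downstream[tag] > upstream[tag]}

def mean_travel_time(times):
    """Average link travel time over all matched vehicles."""
    return sum(times.values()) / len(times)

up   = {"A1": 100.0, "B2": 105.0, "C3": 110.0}
down = {"A1": 160.0, "B2": 170.0, "D4": 200.0}   # D4 unmatched, ignored
tt = link_travel_times(up, down)
print(mean_travel_time(tt))  # (60 + 65) / 2 = 62.5
```

The algorithm in the paper additionally filters outliers and smooths across time intervals; this sketch shows only the matching step.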

  14. Statistical pattern recognition for automatic writer identification and verification

    NARCIS (Netherlands)

    Bulacu, Marius Lucian


    The thesis addresses the problem of automatic person identification using scanned images of handwriting.Identifying the author of a handwritten sample using automatic image-based methods is an interesting pattern recognition problem with direct applicability in the forensic and historic document

  15. The problem of automatic identification of concepts

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre


    This paper deals with the problem of the automatic recognition of concepts and describes an important language tool, the ''linguistic filter'', which facilitates the construction of statistical algorithms. Certain special filters, of prepositions, conjunctions, negatives, logical implication, compound words, are presented. This is followed by a detailed description of a statistical algorithm allowing recognition of pronoun referents, and finally the problem of the automatic treatment of negatives in French is discussed [fr
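A "linguistic filter" of the kind described can be sketched as a stop-list pass over tokenized French text. The word lists below are small illustrative samples, not the paper's actual filters:

```python
# Minimal illustrative stop lists for a French "linguistic filter":
# prepositions, conjunctions and negation particles are removed before
# statistical processing, keeping only content words.
PREPOSITIONS = {"de", "à", "dans", "sur", "avec", "pour"}
CONJUNCTIONS = {"et", "ou", "mais", "donc", "car"}
NEGATIVES    = {"ne", "pas", "jamais"}

FILTER = PREPOSITIONS | CONJUNCTIONS | NEGATIVES

def linguistic_filter(tokens):
    """Drop function words covered by the filter lists."""
    return [t for t in tokens if t.lower() not in FILTER]

phrase = "le réacteur ne fonctionne pas avec ce combustible".split()
print(linguistic_filter(phrase))
```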


    Directory of Open Access Journals (Sweden)

    A. A. Vorobeva


    Full Text Available The Internet is anonymous: it allows posting under a false name, on behalf of others, or simply anonymously. Thus individuals and criminal or terrorist organizations can use the Internet for criminal purposes, hiding their identity to avoid prosecution. Existing approaches and algorithms for author identification of web posts in Russian are not effective, so the development of proven methods, techniques and tools for author identification is an extremely important and challenging task. In this work an algorithm and software for authorship identification of web posts were developed. During the study, the effectiveness of several classification and feature selection algorithms was tested. The algorithm includes several important steps: 1) feature extraction; 2) feature discretization; 3) feature selection with the most effective Relief-f algorithm (to find the feature set with the most discriminating power for each set of candidate authors and to maximize the accuracy of author identification); 4) author identification with a model based on the Random Forest algorithm. Random Forest and Relief-f are used here to identify the author of a short Russian-language text for the first time. An important step of author attribution is data preprocessing, i.e. discretization of continuous features; this had not previously been applied to improve the efficiency of author identification. The software outputs the top q authors with the highest probabilities of authorship. This approach is helpful for manual analysis in forensic linguistics, where the developed tool is used to narrow the set of candidate authors. In experiments with 10 candidate authors, the real author appeared in the top 3 in 90.02% of cases and in first place in 70.5% of cases.
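Step 3 of the algorithm can be illustrated with a bare-bones Relief-style feature weighting. The paper uses Relief-f; this sketch uses the basic single-neighbor Relief update (reward features that differ toward the nearest miss, penalize those that differ toward the nearest hit), and the stylometric features are invented:

```python
import math

def relief_weights(X, y, n_features):
    """Basic Relief: for each sample, find its nearest same-class neighbor
    (hit) and nearest other-class neighbor (miss), then reward features
    that separate classes. X: feature vectors scaled to [0, 1]."""
    w = [0.0] * n_features
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
    for i, xi in enumerate(X):
        hits   = [j for j in range(len(X)) if j != i and y[j] == y[i]]
        misses = [j for j in range(len(X)) if y[j] != y[i]]
        h = min(hits,   key=lambda j: dist(xi, X[j]))
        m = min(misses, key=lambda j: dist(xi, X[j]))
        for f in range(n_features):
            w[f] += abs(xi[f] - X[m][f]) - abs(xi[f] - X[h][f])
    return w

# Invented stylometric features per post: [avg_word_len, comma_rate, digit_rate]
X = [[0.2, 0.9, 0.1], [0.25, 0.85, 0.1],   # author 0
     [0.8, 0.1, 0.1], [0.75, 0.15, 0.1]]   # author 1
y = [0, 0, 1, 1]
w = relief_weights(X, y, 3)
# Features 0 and 1 discriminate the authors; feature 2 carries no signal.
print(sorted(range(3), key=lambda f: -w[f]))  # [1, 0, 2]
```

The selected top-ranked features would then be fed to the Random Forest classifier in step 4.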

  18. Automatic allograph matching in forensic writer identification

    NARCIS (Netherlands)

    Niels, R.M.J.; Vuurpijl, L.G.; Schomaker, L.R.B.


    A well-established task in forensic writer identification focuses on the comparison of prototypical character shapes (allographs) present in handwriting. In order for a computer to perform this task convincingly, it should yield results that are plausible and understandable to the human expert.

  19. Optical Automatic Car Identification (OACI) Field Test Program (United States)


    The results of the Optical Automatic Car Identification (OACI) tests at Chicago conducted from August 16 to September 4, 1975 are presented. The main purpose of this test was to determine the suitability of optics as a principle of operation for an a...

  20. Person categorization and automatic racial stereotyping effects on weapon identification. (United States)

    Jones, Christopher R; Fazio, Russell H


    Prior stereotyping research provides conflicting evidence regarding the importance of person categorization along a particular dimension for the automatic activation of a stereotype corresponding to that dimension. Experiment 1 replicated a racial stereotyping effect on object identification and examined whether it could be attenuated by encouraging categorization by age. Experiment 2 employed socially complex person stimuli and manipulated whether participants categorized spontaneously or by race. In Experiment 3, the distinctiveness of the racial dimension was manipulated by having Black females appear in the context of either Black males or White females. The results indicated that conditions fostering categorization by race consistently produced automatic racial stereotyping and that conditions fostering nonracial categorization can eliminate automatic racial stereotyping. Implications for the relation between automatic stereotype activation and dimension of categorization are discussed.

  1. Accuracy of Automatic Cephalometric Software on Landmark Identification (United States)

    Anuwongnukroh, N.; Dechkunakorn, S.; Damrongsri, S.; Nilwarat, C.; Pudpong, N.; Radomsutthisarn, W.; Kangern, S.


    This study assessed the accuracy of an automatic cephalometric analysis software program in the identification of cephalometric landmarks. Thirty randomly selected digital lateral cephalograms of patients undergoing orthodontic treatment were used. Thirteen landmarks (S, N, Or, A-point, U1T, U1A, B-point, Gn, Pog, Me, Go, L1T, and L1A) were identified on the digital image by the automatic cephalometric software and on cephalometric tracings by the manual method. Superimposition of the printed image and the manual tracing was done by registration at the soft-tissue profiles. The accuracy of landmarks located by the automatic method was compared with that of the manually identified landmarks by measuring the mean differences in distance of each landmark on a Cartesian plane whose X and Y coordinate axes passed through the center of the ear rod. A one-sample t-test was used to evaluate the mean differences. Statistically significant mean differences (p < 0.05) were found for most landmarks, with some differences exceeding 4 mm in the vertical direction. Only 5 of 13 landmarks (38.46%; S, N, Gn, Pog, and Go) showed no significant mean difference between the automatic and manual landmarking methods. It is concluded that if this automatic cephalometric analysis software is used for orthodontic diagnosis, the orthodontist must correct or modify the position of landmarks in order to increase the accuracy of cephalometric analysis.
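The statistical comparison described (mean automatic-versus-manual landmark differences tested with a one-sample t-test against zero) can be sketched as follows; the millimetre differences are invented:

```python
import math

def one_sample_t(diffs, mu0=0.0):
    """t statistic for H0: mean(diffs) == mu0, with sample std (n-1)."""
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return (mean - mu0) / math.sqrt(var / n)

# Hypothetical x-direction differences (mm), automatic minus manual, for one landmark
diffs = [0.4, -0.2, 0.1, 0.3, -0.1, 0.2, 0.0, 0.1]
t = one_sample_t(diffs)
print(round(t, 3))  # 1.414
```

The resulting t value would be compared against the critical value for n-1 degrees of freedom to decide whether the automatic landmark deviates significantly from the manual one.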

  2. Fast and automatic thermographic material identification for the recycling process (United States)

    Haferkamp, Heinz; Burmester, Ingo


    Within the framework of a future closed-loop recycling process, the automatic and economical sorting of plastics is a decisive element. The identification and sorting systems available at present are not yet suitable for sorting technical plastics, since essential demands, such as high recognition reliability and high identification rates across the variety of technical plastics, cannot be guaranteed. The Laser Zentrum Hannover e.V., in cooperation with Hoerotron GmbH and Preussag Noell GmbH, has therefore carried out investigations on a rapid thermographic, laser-supported material identification system for automatic sorting systems. The automatic identification of different engineering plastics from electronic or automotive waste is possible, and identification rates of up to 10 parts per second are enabled by fast IR line scanners. The procedure is based on the following principle: within a few milliseconds, a spot on the sample is heated by a CO2 laser. The sample's specific chemical and physical material properties cause a characteristic temperature distribution on its surface, which is measured by a fast IR line-scan system. This 'thermal impulse response' is then analyzed by a computer system. Investigations have shown that it is possible to analyze more than 18 different sorts of plastics at a frequency of 10 Hz. Crucial for the development of such a system are the rapid processing of imaging data, the minimization of interference caused by oscillating sample geometries, and handling the wide range of possible additives in the plastics in question. One possible application area is the sorting of plastics from car and electronics waste recycling.

  3. MAC, A System for Automatically IPR Identification, Collection and Distribution (United States)

    Serrão, Carlos

    Controlling Intellectual Property Rights (IPR) in the digital world is a very hard challenge. The ease of creating multiple bit-by-bit identical copies of original IPR works creates opportunities for digital piracy. One of the industries most affected by this is the music industry, which has suffered huge losses during the last few years as a result. Moreover, this is also affecting the way music rights collecting and distributing societies operate to assure correct music IPR identification, collection and distribution. In this article a system for automating this IPR identification, collection and distribution is presented and described. The system makes use of an advanced automatic audio identification system based on audio fingerprinting technology. This paper presents the details of the system and a use-case scenario where the system is being used.

  4. Automatic identification of algal community from microscopic images. (United States)

    Santhi, Natchimuthu; Pradeepa, Chinnaraj; Subashini, Parthasarathy; Kalaiselvi, Senthil


    A good understanding of the population dynamics of algal communities is crucial in several ecological and pollution studies of freshwater and oceanic systems. This paper reviews the automatic identification of algal communities from microscope images using image processing techniques. The various techniques of image preprocessing, segmentation, feature extraction and recognition are considered one by one and their parameters are summarized. Automatic identification and classification of algal communities are very difficult due to various factors, such as changes in size and shape with climatic changes, various growth periods, and the presence of other microbes. Therefore, the significance, uniqueness, and various approaches are discussed and the analyses in image processing methods are evaluated. Algal identification and the associated problems in water organisms are presented as challenges in image processing applications. Various image processing approaches based on textures, shapes, and object boundaries, as well as segmentation methods such as edge detection and color segmentation, are highlighted. Finally, artificial neural networks and some machine learning algorithms used to classify and identify algae are covered, and some of the benefits and drawbacks of these schemes are examined.


    Directory of Open Access Journals (Sweden)

    Marijan Gržan


    Full Text Available The Automatic Identification System (AIS) represents an important improvement in the fields of maritime security and vessel tracking. It is used by the signatory countries to the SOLAS Convention and by private and public providers. Its main advantage is that it can be used as an additional navigation aid, especially in avoiding collisions at sea and in search and rescue operations. The present work analyses the functioning of the AIS and the ways of exchanging data among its users. We also study one of the vulnerabilities of the system that can be abused by malicious users. The threat itself is analysed in detail in order to provide insight into the process, from the creation of a program to its implementation.
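At the transport level, AIS reports travel as NMEA 0183 "AIVDM" sentences whose integrity check is a simple XOR checksum over the characters between the leading '!' and the '*'. A sketch of framing and verification (the payload shown is only a plausible-looking example, not a decoded real message):

```python
def nmea_checksum(body):
    """NMEA 0183 checksum, as used by AIS AIVDM sentences: XOR of every
    character between the leading '!' (or '$') and the '*', as two hex digits."""
    c = 0
    for ch in body:
        c ^= ord(ch)
    return f"{c:02X}"

def verify(sentence):
    """sentence like '!AIVDM,...*hh' -> True if the checksum matches."""
    assert sentence[0] in "!$" and "*" in sentence
    body, given = sentence[1:].rsplit("*", 1)
    return nmea_checksum(body) == given.upper()

msg_body = "AIVDM,1,1,,A,13u?etPv2;0n:dDPwUM1U1Cb069D,0"
framed = f"!{msg_body}*{nmea_checksum(msg_body)}"
print(verify(framed))  # True by construction
```

This weak checksum protects only against transmission errors, not against forgery, which is part of why spoofed AIS traffic is a credible threat.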

  6. Automatic identification of otologic drilling faults: a preliminary report. (United States)

    Shen, Peng; Feng, Guodong; Cao, Tianyang; Gao, Zhiqiang; Li, Xisheng


    A preliminary study was carried out to identify parameters to characterize drilling faults when using an otologic drill under various operating conditions. An otologic drill was modified by the addition of four sensors. Under consistent conditions, the drill was used to simulate three important types of drilling faults and the captured data were analysed to extract characteristic signals. A multisensor information fusion system was designed to fuse the signals and automatically identify the faults. When identifying drilling faults, there was a high degree of repeatability and regularity, with an average recognition rate of >70%. This study shows that the variables measured change in a fashion that allows the identification of particular drilling faults, and that it is feasible to use these data to provide rapid feedback for a control system. Further experiments are being undertaken to implement such a system.

  7. 33 CFR 164.43 - Automatic Identification System Shipborne Equipment-Prince William Sound. (United States)


    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Automatic Identification System Shipborne Equipment-Prince William Sound. 164.43 Section 164.43 Navigation and Navigable Waters COAST GUARD... Automatic Identification System Shipborne Equipment—Prince William Sound. (a) Until December 31, 2004, each...

  8. Automatic Identification of Interictal Epileptiform Discharges in Secondary Generalized Epilepsy

    Directory of Open Access Journals (Sweden)

    Won-Du Chang


    Full Text Available Interictal epileptiform discharges (EDs) are characteristic signal patterns of scalp electroencephalogram (EEG) or intracranial EEG (iEEG) recorded from patients with epilepsy, which assist with the diagnosis and characterization of various types of epilepsy. The EEG signal, however, is often recorded from patients with epilepsy over a long period of time, and thus the detection and identification of EDs have been a burden on medical doctors. This paper proposes a new method for the automatic identification of two types of EDs, repeated sharp-waves (sharps) and runs of sharp-and-slow-waves (SSWs), which helps to pinpoint epileptogenic foci in secondary generalized epilepsy such as Lennox-Gastaut syndrome (LGS). In experiments with iEEG data acquired from a patient with LGS, our proposed method detected EDs with an accuracy of 93.76% and classified three different signal patterns with a mean classification accuracy of 87.69%, which was significantly higher than that of a conventional wavelet-based method. Our study shows that it is possible to successfully detect and discriminate sharps and SSWs from background EEG activity using the proposed method.

  9. Automatic failure identification of the nuclear power plant pellet fuel

    International Nuclear Information System (INIS)

    Oliveira, Adriano Fortunato de


    This work proposes the development of an automatic technique for evaluating defects to help in the fabrication stage of fuel elements. An intelligent image analysis was produced for the automatic recognition of defects in uranium pellets. An Artificial Neural Network (ANN) was trained using segments of histograms of pellets, containing examples of both normal (fault-free) and defective pellets (with the major defects normally found). The images of the pellets were segmented into 11 parts; histograms were made of these segments and used to train the ANN. Besides automating the process, the system obtained a classification accuracy of 98.33%. Although this percentage already represents a significant advance in the quality-control process, the use of more advanced photography and lighting techniques should reduce the error to insignificant levels at low cost. Technologically, the method developed, should it ever be implemented, will add substantial value in terms of process quality control and production outages in relation to the domestic manufacturing of nuclear fuel. (author)

  10. Automatic annotation of protein motif function with Gene Ontology terms

    Directory of Open Access Journals (Sweden)

    Gopalakrishnan Vanathi


    Full Text Available Abstract Background Conserved protein sequence motifs are short stretches of amino acid sequence patterns that potentially encode the function of proteins. Several sequence pattern searching algorithms and programs exist for identifying candidate protein motifs at the whole genome level. However, a much needed and important task is to determine the functions of the newly identified protein motifs. The Gene Ontology (GO) project is an endeavor to annotate the function of genes or protein sequences with terms from a dynamic, controlled vocabulary, and these annotations serve well as a knowledge base. Results This paper presents methods to mine the GO knowledge base and use the association between the GO terms assigned to a sequence and the motifs matched by the same sequence as evidence for predicting the functions of novel protein motifs automatically. The task of assigning GO terms to protein motifs is viewed as both a binary classification and an information retrieval problem, where PROSITE motifs are used as samples for model training and functional prediction. The mutual information of a motif and a GO term association is found to be a very useful feature. We take advantage of the known motifs to train a logistic regression classifier, which allows us to combine mutual information with other frequency-based features and obtain a probability of correct association. The trained logistic regression model has intuitively meaningful and logically plausible parameter values, and performs very well empirically according to our evaluation criteria. Conclusions In this research, different methods for the automatic annotation of protein motifs have been investigated. Empirical results demonstrated that the methods have great potential for detecting and augmenting information about the functions of newly discovered candidate protein motifs.
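The mutual-information feature the authors highlight can be computed directly from a 2x2 contingency table counting sequences by "matches motif M" against "annotated with GO term T". The counts below are invented:

```python
import math

def mutual_information(n11, n10, n01, n00):
    """MI in bits between two binary events (e.g. 'sequence matches motif M'
    and 'sequence carries GO term T') from 2x2 contingency counts."""
    n = n11 + n10 + n01 + n00
    mi = 0.0
    # (cell count, row marginal, column marginal) for each of the four cells
    for nxy, nx, ny in [
        (n11, n11 + n10, n11 + n01),
        (n10, n11 + n10, n10 + n00),
        (n01, n01 + n00, n11 + n01),
        (n00, n01 + n00, n10 + n00),
    ]:
        if nxy:
            mi += (nxy / n) * math.log2(n * nxy / (nx * ny))
    return mi

# Hypothetical counts: motif match strongly co-occurs with the GO term
dependent   = mutual_information(40, 10, 10, 40)
independent = mutual_information(25, 25, 25, 25)
print(dependent > independent)  # True: the association carries information
```

In the paper's setting, this MI score is one of the features combined by the logistic regression classifier.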

  11. Roadway system assessment using bluetooth-based automatic vehicle identification travel time data. (United States)


    This monograph is an exposition of several practice-ready methodologies for automatic vehicle identification (AVI) data collection : systems. This includes considerations in the physical setup of the collection system as well as the interpretation of...

  12. Automaticity: design of a registry to assess long-term acceptance and clinical impact of Automatic Algorithms in Insignia pacemakers. (United States)

    Alings, Marco; Vorstenbosch, Jan-Mark; Reeve, Helen


    Worldwide, the number of pacemaker implants is steadily increasing, and this poses an incremental burden on outpatient clinics. While device manufacturers have developed safe and effective automatic algorithms to lighten this workload, the clinical utilization of these algorithms has not been well studied. The Automaticity study is the first large-scale, worldwide registry to evaluate physicians' acceptance of automatic algorithms for ventricular capture, automatic sensing, and automatic optimization of sensor settings. The primary objective of the registry is to determine the percentage of patients who have any of the 'Automaticity Algorithms' reprogrammed within 12 months of pacemaker implant. Patients will be implanted with a commercially available pacemaker (Insignia I/Nexus I Ultra or Insignia I/Nexus I AVT, Boston Scientific CRM, St Paul, MN, USA). At discharge, all the 'Automaticity Algorithms' are to be programmed to 'Auto/On'. Data on changes in device programming, physicians' perception of algorithm function, and adverse events will be collected for 12 months following device implant. The Automaticity study is the first large-scale, prospective, multi-site, international registry designed to assess the long-term acceptance of automatic pacemaker algorithms for adjustment of the ventricular output, atrial and ventricular sensitivity, and optimization of minute-ventilation and accelerometer settings.

  13. Automatic Identification of Travel Locations in Rare Books - Object Oriented Information Management

    Directory of Open Access Journals (Sweden)

    Detlev Doherr


    Full Text Available The digital content of the Internet is growing exponentially, and the mass digitization of printed media opens access to literature, in particular the genre of travel literature from the 18th and 19th centuries, which consists of diaries or travel books describing routes, observations or inspirations. The identification of the locations described in digital text is a long-standing challenge, requiring information technology that supplies dynamic links to sources through new forms of interaction and synthesis between humanistic texts and scientific observations. Using object-oriented information technology, a prototype software tool was developed which makes it possible to automatically identify geographic locations and travel routes mentioned in rare books. The information objects contain properties such as names and classification codes for populated places, streams, mountains and regions. Together with the latitude and longitude of every single location, this makes it possible to geo-reference the information so that all processed and filtered datasets can be displayed by a map application. This method has already been used in the Humboldt Digital Library to present Alexander von Humboldt's maps, and it was tested in a case study on the works of Alexander von Humboldt and Johann Wolfgang von Goethe to prove the correctness and reliability of the automatic identification of locations. The results reveal numerous errors due to misspellings, changes of location names, and collisions between ordinary terms and location names. On the other hand, it becomes very clear that the results of automatic object detection and recognition can be improved by error-free and comprehensive sources. As a result, an increase in the quality and usability of the service can be expected, accompanied by more options for detecting unknown locations in the descriptions of rare books.
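The gazetteer-lookup step can be sketched as naive name matching against a table of locations with coordinates and feature classes. The entries, coordinates and diary sentence below are illustrative, not taken from the Humboldt Digital Library:

```python
# Hypothetical miniature gazetteer: name -> (lat, lon, feature class)
GAZETTEER = {
    "quito":      (-0.22, -78.51, "populated place"),
    "chimborazo": (-1.47, -78.82, "mountain"),
    "orinoco":    (8.62, -62.25, "stream"),
}

def identify_locations(text):
    """Return (name, lat, lon) for each gazetteer entry mentioned in text.
    Naive exact matching: misspellings and renamed places are missed,
    mirroring the error sources reported in the case study."""
    tokens = [t.strip(".,;").lower() for t in text.split()]
    found = []
    for tok in tokens:
        if tok in GAZETTEER and tok not in [f[0] for f in found]:
            lat, lon, _ = GAZETTEER[tok]
            found.append((tok, lat, lon))
    return found

diary = "From Quito we saw Chimborazo; later we descended the Orinoco."
print(identify_locations(diary))
```

The geo-referenced tuples returned here are what a map application would plot as the reconstructed travel route.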

  14. Automatic script identification from images using cluster-based templates

    Energy Technology Data Exchange (ETDEWEB)

    Hochberg, J.; Kerns, L.; Kelly, P.; Thomas, T.


    We have developed a technique for automatically identifying the script used to generate a document that is stored electronically in bit-image form. Our approach differs from previous work in that the distinctions among scripts are discovered by an automatic learning procedure, without any hands-on analysis. We first develop a set of representative symbols (templates) for each script in our database (Cyrillic, Roman, etc.). We do this by identifying all textual symbols in a set of training documents, scaling each symbol to a fixed size, clustering similar symbols, pruning minor clusters, and finding each cluster's centroid. To identify a new document's script, we identify and scale a subset of symbols from the document and compare them to the templates for each script. We choose the script whose templates provide the best match. Our current system distinguishes among the Armenian, Burmese, Chinese, Cyrillic, Ethiopic, Greek, Hebrew, Japanese, Korean, Roman, and Thai scripts with over 90% accuracy.
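The matching stage can be sketched with toy fixed-size bitmaps standing in for scaled symbols: each document symbol votes for the script owning its nearest template. The scripts and 4-pixel "templates" below are invented:

```python
import math

def nearest_script(symbols, templates):
    """Each scaled symbol votes for the script owning its nearest template
    (Euclidean distance on flattened bitmaps); the most-voted script wins."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    votes = {}
    for s in symbols:
        best = min(((script, t) for script, ts in templates.items() for t in ts),
                   key=lambda st: dist(s, st[1]))[0]
        votes[best] = votes.get(best, 0) + 1
    return max(votes, key=votes.get)

# Invented cluster centroids (templates) per script, as flattened 2x2 bitmaps
templates = {
    "roman":    [[1, 0, 1, 0], [1, 1, 0, 0]],
    "cyrillic": [[0, 1, 0, 1], [0, 0, 1, 1]],
}
doc = [[1, 0, 1, 0], [1, 1, 0, 0], [0, 1, 0, 1]]  # mostly roman-like symbols
print(nearest_script(doc, templates))  # roman
```

In the real system the templates are cluster centroids of many training symbols at a realistic bitmap size, but the voting logic is the same.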

  15. Automatic vertebral identification using surface-based registration (United States)

    Herring, Jeannette L.; Dawant, Benoit M.


    This work introduces an enhancement to currently existing methods of intra-operative vertebral registration by allowing the portion of the spinal column surface that correctly matches a set of physical vertebral points to be automatically selected from several possible choices. Automatic selection is made possible by the shape variations that exist among lumbar vertebrae. In our experiments, we register vertebral points representing physical space to spinal column surfaces extracted from computed tomography images. The vertebral points are taken from the posterior elements of a single vertebra to represent the region of surgical interest. The surface is extracted using an improved version of the fully automatic marching cubes algorithm, which results in a triangulated surface that contains multiple vertebrae. We find the correct portion of the surface by registering the set of physical points to multiple surface areas, including all vertebral surfaces that potentially match the physical point set. We then compute the standard deviation of the surface error for the set of points registered to each vertebral surface that is a possible match, and the registration that corresponds to the lowest standard deviation designates the correct match. We have performed our current experiments on two plastic spine phantoms and one patient.
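The selection criterion (the candidate vertebral surface with the lowest standard deviation of registration error wins) can be sketched in 2-D, with point clouds standing in for triangulated surfaces and nearest-point distance standing in for a true point-to-surface distance. All coordinates are invented:

```python
import math

def error_std(points, surface):
    """Std of point-to-surface distances, with 'surface' a point cloud and
    the distance taken to its nearest point (a stand-in for the true
    point-to-triangle distance after registration)."""
    def d(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    errs = [min(d(p, q) for q in surface) for p in points]
    mean = sum(errs) / len(errs)
    return math.sqrt(sum((e - mean) ** 2 for e in errs) / len(errs))

def select_vertebra(points, candidates):
    """Pick the candidate surface whose registration error is most
    consistent, i.e. has the lowest standard deviation."""
    return min(candidates, key=lambda name: error_std(points, candidates[name]))

physical = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
candidates = {
    "L3": [(0.0, 0.1), (1.0, 0.1), (2.0, 0.1)],   # uniform offset -> std 0
    "L4": [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0)],   # inconsistent errors
}
print(select_vertebra(physical, candidates))  # L3
```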

  16. An automatic microseismic or acoustic emission arrival identification scheme with deep recurrent neural networks (United States)

    Zheng, Jing; Lu, Jiren; Peng, Suping; Jiang, Tianqi


    Conventional arrival pick-up algorithms cannot avoid manual modification of their parameters for the simultaneous identification of multiple events under different signal-to-noise ratios (SNRs). Therefore, in order to automatically obtain the arrivals of multiple events with high precision under different SNRs, this study proposed an algorithm for picking up the arrivals of microseismic or acoustic emission events based on deep recurrent neural networks. The arrival identification was performed in two steps: a training phase and a testing phase. The training process was modelled by deep recurrent neural networks with a Long Short-Term Memory architecture. During the testing phase, the learned weights were used to identify the arrivals in the microseismic/acoustic emission data sets. The data sets were obtained from rock physics experiments on acoustic emission. To obtain data sets under different SNRs, this study added random noise to the raw experimental data sets. The results showed that the proposed method attained a hit-rate above 80 per cent at an SNR of 0 dB, and approximately 70 per cent at an SNR of -5 dB, within an absolute error of 10 sampling points. These results indicated that the proposed method has high selection precision and robustness.
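
    The noise-injection step, adding random noise so that the data reach a target SNR, can be sketched as below. This is a generic construction under the usual definition SNR = 10·log10(P_signal/P_noise); the signal is synthetic, not the study's acoustic emission data.

```python
# Scale white noise so that the resulting SNR (in dB) matches a target value,
# as done to build test sets at 0 dB and -5 dB. Signal here is a toy sinusoid.
import math, random

def add_noise_at_snr(signal, snr_db, rng):
    # 10*log10(P_signal / P_noise) == snr_db after scaling the noise.
    noise = [rng.gauss(0.0, 1.0) for _ in signal]
    p_sig = sum(s * s for s in signal) / len(signal)
    p_noise = sum(n * n for n in noise) / len(noise)
    scale = math.sqrt(p_sig / (p_noise * 10 ** (snr_db / 10.0)))
    return [s + scale * n for s, n in zip(signal, noise)]

rng = random.Random(0)
clean = [math.sin(2 * math.pi * 5 * t / 200.0) for t in range(200)]
noisy = add_noise_at_snr(clean, 0.0, rng)  # 0 dB: noise power == signal power
```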

  17. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)


    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, uniform corrosion is characterized by texture attributes extracted from the co-occurrence matrix and by the Self-Organizing Map (SOM) clustering algorithm. We present a technique for automatic inspection of oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology in designing trustworthy and robust early failure detection systems. (author)

  18. Semi-automatic long-term acoustic surveying

    DEFF Research Database (Denmark)

    Andreassen, Tórur; Surlykke, Annemarie; Hallam, John


    data sampling rates (500 kHz). Using a sound energy threshold criterion for triggering recording, we collected 236 GiB (Gi = 1024^3) of data at full bandwidth. We implemented a simple automatic method using a Support Vector Machine (SVM) classifier based on a combination of temporal and spectral analyses...

  19. Towards automatic addressee identification in multi-party dialogues

    NARCIS (Netherlands)

    Jovanovic, N.; op den Akker, Hendrikus J.A.


    The paper is about the issue of addressing in multi-party dialogues. Analysis of addressing behavior in face-to-face meetings results in the identification of several addressing mechanisms. From these we extract several utterance features and features of non-verbal communicative behavior of a

  20. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard


    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV...

  1. Identification with video game characters as automatic shift of self-perceptions

    NARCIS (Netherlands)

    Klimmt, C.; Hefner, D.; Vorderer, P.A.; Roth, C.; Blake, C.


    Two experiments tested the prediction that video game players identify with the character or role they are assigned, which leads to automatic shifts in implicit self-perceptions. Video game identification, thus, is considered as a kind of altered self-experience. In Study 1 (N = 61), participants

  2. Automatic Knowledge Extraction and Knowledge Structuring for a National Term Bank

    DEFF Research Database (Denmark)

    Lassen, Tine; Madsen, Bodil Nistrup; Erdman Thomsen, Hanne


    This paper gives an introduction to the plans and ongoing work in a project, the aim of which is to develop methods for automatic knowledge extraction and automatic construction and updating of ontologies. The project also aims at developing methods for automatic merging of terminological data from various existing sources, as well as methods for target group oriented knowledge dissemination. In this paper, we mainly focus on the plans for automatic knowledge extraction and knowledge structuring that will result in ontologies for a national term bank.


    Directory of Open Access Journals (Sweden)



    Speaker Identification (SI) aims at automatically identifying an individual by extracting and processing information from his/her voice. The speaker's voice is a robust biometric modality with a strong impact in several application areas. In this study, a new combination learning scheme is proposed, based on the Gaussian mixture model-universal background model (GMM-UBM) and Learning Vector Quantization (LVQ), for automatic text-independent speaker identification. Feature vectors constituted by the Mel Frequency Cepstral Coefficients (MFCC) extracted from the speech signal are used for training on the New England subset of the TIMIT database. The best results obtained were 90% for gender-independent speaker identification, 97% for male speakers and 93% for female speakers on test data using 36 MFCC features.
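
    The LVQ part of the combination scheme can be illustrated with a single LVQ1 update step: the nearest prototype is pulled toward a feature vector of its own class and pushed away otherwise. The 2-D vectors and speaker labels below are toy stand-ins for 36-dimensional MFCC features, not the paper's setup.

```python
# One LVQ1 learning step over speaker prototypes. Toy data only: real inputs
# would be MFCC vectors and one or more prototypes per enrolled speaker.

def lvq1_step(protos, labels, x, y, lr=0.2):
    # Find the nearest prototype; move it toward x if the labels match,
    # away from x otherwise (the classic LVQ1 rule).
    d2 = [sum((p - xi) ** 2 for p, xi in zip(proto, x)) for proto in protos]
    i = d2.index(min(d2))
    sign = 1.0 if labels[i] == y else -1.0
    protos[i] = tuple(p + sign * lr * (xi - p) for p, xi in zip(protos[i], x))
    return i

protos = [(0.0, 0.0), (1.0, 1.0)]
labels = ["spk_A", "spk_B"]
moved = lvq1_step(protos, labels, (0.2, 0.1), "spk_A")
print(moved, protos[0])  # nearest prototype (index 0) moves toward the sample
```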

  4. Automatic identification of otological drilling faults: an intelligent recognition algorithm. (United States)

    Cao, Tianyang; Li, Xisheng; Gao, Zhiqiang; Feng, Guodong; Shen, Peng


    This article presents an intelligent recognition algorithm that can recognize milling states of the otological drill by fusing multi-sensor information. An otological drill was modified by the addition of sensors. The algorithm was designed according to features of the milling process and is composed of a characteristic curve, an adaptive filter and a rule base. The characteristic curve can weaken the impact of the unstable normal milling process and reserve the features of drilling faults. The adaptive filter is capable of suppressing interference in the characteristic curve by fusing multi-sensor information. The rule base can identify drilling faults through the filtering result data. The experiments were repeated on fresh porcine scapulas, including normal milling and two drilling faults. The algorithm has high rates of identification. This study shows that the intelligent recognition algorithm can identify drilling faults under interference conditions. (c) 2010 John Wiley & Sons, Ltd.

  5. Ontology-based automatic identification of public health-related Turkish tweets. (United States)

    Küçük, Emine Ela; Yapar, Kürşad; Küçük, Dilek; Küçük, Doğan


    Social media analysis, such as the analysis of tweets, is a promising research topic for tracking public health concerns including epidemics. In this paper, we present an ontology-based approach to automatically identify public health-related Turkish tweets. The system is based on a public health ontology that we have constructed through a semi-automated procedure. The ontology concepts are expanded through a linguistically motivated relaxation scheme as the last stage of ontology development, before being integrated into our system to increase its coverage. The resulting lexical resource, which includes the terms corresponding to the ontology concepts, is used to filter the Twitter stream so that a plausible tweet subset, consisting mostly of public health-related tweets, can be obtained. Experiments are carried out on two million genuine tweets and promising precision rates are obtained. A Web-based interface for tracking the results of this identification system, intended for use by the related public health staff, was also implemented in the course of the current study. Hence, the current social media analysis study makes both technical and practical contributions to the significant domain of public health. Copyright © 2017 Elsevier Ltd. All rights reserved.
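
    The filtering stage, keeping only tweets that mention a term from the ontology-derived lexical resource, reduces to a term-matching predicate. A minimal sketch, with invented English terms and tweets standing in for the Turkish data:

```python
# Keep a tweet if it contains any term from the ontology-derived lexicon.
# Terms and tweets are illustrative stand-ins, not the paper's resources.

def build_filter(terms):
    terms = [t.lower() for t in terms]
    def keep(tweet):
        text = tweet.lower()
        return any(t in text for t in terms)
    return keep

health_terms = ["flu", "fever", "vaccination"]
keep = build_filter(health_terms)
tweets = ["Got my flu shot today", "Great match last night", "Fever all week"]
print([t for t in tweets if keep(t)])  # tweets mentioning an ontology term
```

    In the real system the term list is much larger (ontology concepts plus their relaxed variants), and the retained subset is what the Web interface then presents to public health staff.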

  6. Perspective of the applications of automatic identification technologies in the Serbian Army

    Directory of Open Access Journals (Sweden)

    Velibor V. Jovanović


    Without modern information systems, supply-chain management is almost impossible. Automatic identification technologies provide automated data processing, which improves operating conditions and supports decision making. Automatic identification media, notably BARCODE and RFID technology, are used as carriers of labels with high-quality data and adequate descriptions of material means, providing crucial visibility of inventory levels throughout the supply chain. With these media and the use of an adequate information system, the Ministry of Defense of the Republic of Serbia will be able to establish a system of codification and, in accordance with the NATO codification system, to successfully implement a unique codification, classification and determination of storage numbers for all tools, components and spare parts for their unequivocal identification. In perspective, this will help end users to perform everyday tasks without compromising the material integrity of security data. It will also help command structures to have reliable information for decision making to ensure optimal management. Products and services that pass the codification procedure will have the opportunity to be offered in the largest market of armament and military equipment. This paper gives a comparative analysis of two automatic identification technologies - BARCODE, the most common one, and RFID, the most advanced one - with an emphasis on the advantages and disadvantages of their use in tracking inventory through the supply chain. Their possible application in the Serbian Army is discussed in general.

  7. Automatic identification of variables in epidemiological datasets using logic regression. (United States)

    Lorenz, Matthias W; Abdi, Negin Ashtiani; Scheckenbach, Frank; Pflug, Anja; Bülbül, Alpaslan; Catapano, Alberico L; Agewall, Stefan; Ezhov, Marat; Bots, Michiel L; Kiechl, Stefan; Orth, Andreas


    For an individual participant data (IPD) meta-analysis, multiple datasets must be transformed into a consistent format, e.g. using uniform variable names. When large numbers of datasets have to be processed, this can be a time-consuming and error-prone task. Automated or semi-automated identification of variables can help to reduce the workload and improve data quality. For semi-automation, high sensitivity in the recognition of matching variables is particularly important, because it allows creating software which, for a target variable, presents a choice of source variables from which a user can choose the matching one, with only a low risk of having missed a correct source variable. For each variable in a set of target variables, a number of simple rules were manually created. With logic regression, an optimal Boolean combination of these rules was searched for every target variable, using a random subset of a large database of epidemiological and clinical cohort data (construction subset). In a second subset of this database (validation subset), these optimal rule combinations were validated. In the construction sample, the 41 target variables were allocated with an average positive predictive value (PPV) of 34% and a negative predictive value (NPV) of 95%. In the validation sample, PPV was 33%, whereas NPV remained at 94%. PPV was 50% or less for 63% of all variables in the construction sample and for 71% of all variables in the validation sample. We demonstrated that the application of logic regression to a complex data management task in large epidemiological IPD meta-analyses is feasible. However, the performance of the algorithm is poor, which may require backup strategies.
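
    The evaluation of a Boolean rule combination against a labelled validation set can be sketched as follows. The rule, variable names and gold labels are all invented for illustration; only the PPV/NPV bookkeeping mirrors the paper's metrics.

```python
# Evaluate one Boolean rule combination for a hypothetical target variable
# ("systolic blood pressure") against hand-labelled variable names, and
# compute PPV and NPV. Everything below is invented illustrative data.

def rule(name):
    # Example Boolean combination: name suggests systolic BP and is not a date.
    n = name.lower()
    return ("sys" in n or "sbp" in n) and "date" not in n

gold = {  # variable name -> is it really systolic blood pressure?
    "sbp_mean": True, "sys_bp1": True, "sys_exam_date": False,
    "diastolic": False, "chol_total": False, "sbp_visit2": True,
    "sysadmin_id": False,           # deliberately tricky false positive
}
tp = sum(rule(v) and y for v, y in gold.items())
fp = sum(rule(v) and not y for v, y in gold.items())
tn = sum(not rule(v) and not y for v, y in gold.items())
fn = sum(not rule(v) and y for v, y in gold.items())
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
print(f"PPV={ppv:.2f} NPV={npv:.2f}")  # → PPV=0.75 NPV=1.00
```

    Logic regression's job in the paper is to search for the Boolean combination of such simple rules that optimizes this kind of score over the construction subset.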

  8. Semi-automatic Term Extraction for the African Languages, with ...

    African Journals Online (AJOL)


    and extract potential terms from electronic corpora, is known as (semi-)automatic term extraction. In the great majority of the current approaches, characteristics of a special-language corpus are compared to those of a general-language corpus. In all approaches, humans remain the final arbiters, and must decide whether ...

  9. Automatic identification of non-reflective subsurface targets in radar sounder data based on morphological profile (United States)

    Khodadadzadeh, Mahdi; Ilisei, Ana-Maria; Bruzzone, Lorenzo


    The amount of radar sounder data, which are used to analyze the subsurface of icy environments (e.g., the Poles of Earth and Mars), is dramatically increasing from both airborne campaigns at the ice sheets and satellite missions to other planetary bodies. However, the main approach to the investigation of such data is visual interpretation, which is subjective and time consuming. Moreover, the few available automatic techniques have been developed for analyzing highly reflective subsurface targets, e.g., ice layers and the basal interface. Besides the highly reflective targets, glaciologists have also shown great interest in the analysis of non-reflective targets, such as the echo-free zone in ice sheets and the reflection-free zone in the subsurface of the South Pole of Mars. However, in the literature there is no dedicated automatic technique for the analysis of non-reflective targets. To address this limitation, we propose an automatic classification technique for the identification of non-reflective targets in radar sounder data. The method is made up of two steps: i) feature extraction, which is the core of the method, and ii) automatic classification of subsurface targets. We initially show that the features commonly employed for the analysis of the radar signal (e.g., statistical and texture-based features) are ineffective for the identification of non-reflective targets. Thus, for feature extraction, we propose to exploit structural information based on the morphological closing profile. We show the effectiveness of such features in discriminating non-reflective targets from the other ice subsurface targets. In the second step, a random forest classifier is used to perform the automatic classification. Our experimental results, conducted using two data sets from Central Antarctica and the South Pole of Mars, point out the effectiveness of the proposed technique for the accurate identification of non-reflective targets.
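
    The morphological closing profile can be illustrated in one dimension: grayscale closing (dilation followed by erosion) with growing structuring elements fills gaps between bright reflections, while quiet, non-reflective zones stay nearly unchanged. The synthetic "echo power" trace below is a toy stand-in for a radargram column, not real sounder data.

```python
# 1-D grayscale morphological closing profile. Bright layered zones are
# filled in by closing; low-power (non-reflective) zones change little,
# which is what makes the profile discriminative. Toy trace only.

def dilate(x, w):
    r = w // 2
    return [max(x[max(0, i - r): i + r + 1]) for i in range(len(x))]

def erode(x, w):
    r = w // 2
    return [min(x[max(0, i - r): i + r + 1]) for i in range(len(x))]

def closing_profile(x, windows):
    # One closed version of the trace per structuring-element size.
    return [erode(dilate(x, w), w) for w in windows]

trace = [0, 5, 0, 6, 0, 0, 0, 0, 1, 0]   # layered zone, then a quiet zone
profile = closing_profile(trace, [3, 5])
print(profile[0])  # gaps between bright layers filled; quiet zone preserved
```

    In the paper's second step, per-sample features drawn from such profiles are fed to a random forest classifier.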

  10. Managing Returnable Containers Logistics - A Case Study Part II - Improving Visibility through Using Automatic Identification Technologies

    Directory of Open Access Journals (Sweden)

    Gretchen Meiser


    This case study is the result of a project conducted on behalf of a company that uses its own returnable containers to transport purchased parts from suppliers. The objective of this project was to develop a proposal to enable the company to more effectively track and manage its returnable containers. The research activities in support of this project included (1) the analysis and documentation of the physical flow and the information flow associated with the containers and (2) the investigation of new technologies to improve the automatic identification and tracking of containers. This paper explains the automatic identification technologies and important criteria for selection. A companion paper details the flow of information and containers within the logistics chain, and it identifies areas for improving the management of the containers.

  11. A pattern recognition approach based on DTW for automatic transient identification in nuclear power plants

    International Nuclear Information System (INIS)

    Galbally, Javier; Galbally, David


    Highlights: • Novel transient identification method for NPPs. • Low-complexity. • Low training data requirements. • High accuracy. • Fully reproducible protocol carried out on a real benchmark. - Abstract: Automatic identification of transients in nuclear power plants (NPPs) allows monitoring the fatigue damage accumulated by critical components during plant operation, and is therefore of great importance for ensuring that usage factors remain within the original design bases postulated by the plant designer. Although several schemes to address this important issue have been explored in the literature, there is still no definitive solution available. In the present work, a new method for automatic transient identification is proposed, based on the Dynamic Time Warping (DTW) algorithm, largely used in other related areas such as signature or speech recognition. The novel transient identification system is evaluated on real operational data following a rigorous pattern recognition protocol. Results show the high accuracy of the proposed approach, which is combined with other interesting features such as its low complexity and its very limited requirements of training data.
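
    The core of the approach is the standard DTW distance: an observed transient is compared against stored reference transients by optimally warping one time series onto the other, and the nearest reference gives the identified class. A minimal textbook implementation with toy sequences (the transient names and data are invented, not the plant's):

```python
# Classic dynamic-programming DTW distance, plus nearest-template
# classification of an observed transient. Sequences are toy data.

def dtw(a, b):
    inf = float("inf")
    n, m = len(a), len(b)
    # D[i][j] = minimal accumulated cost aligning a[:i] with b[:j].
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

templates = {"startup": [0, 1, 2, 3, 3], "trip": [3, 2, 1, 0, 0]}
observed = [0, 0, 1, 2, 3]   # a time-shifted startup-like signal
best = min(templates, key=lambda k: dtw(observed, templates[k]))
print(best, dtw(observed, templates["startup"]))  # → startup 0.0
```

    Note how the warping absorbs the time shift: the observed signal matches the "startup" template with zero cost despite not being aligned sample-by-sample.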

  12. An Automatic Identification Procedure to Promote the use of FES-Cycling Training for Hemiparetic Patients

    Directory of Open Access Journals (Sweden)

    Emilia Ambrosini


    Cycling training induced by Functional Electrical Stimulation (FES) currently requires a manual setting of different parameters, which is a time-consuming and scarcely repeatable procedure. We proposed an automatic procedure for setting session-specific parameters optimized for hemiparetic patients. This procedure consisted of identifying the stimulation strategy as the angular ranges during which FES drove the motion, comparing the identified strategy with the physiological muscular activation strategy, and setting the pulse amplitude and duration of each stimulated muscle. Preliminary trials on 10 healthy volunteers helped define the procedure. Feasibility tests on 8 hemiparetic patients (5 stroke, 3 traumatic brain injury) were performed. The procedure maximized the motor output within the tolerance constraint, identified a biomimetic strategy in 6 patients, and always lasted less than 5 minutes. Its reasonable duration and automatic nature make the procedure usable at the beginning of every training session, potentially enhancing the performance of FES-cycling training.

  13. Identification of mycobacterium tuberculosis in sputum smear slide using automatic scanning microscope (United States)

    Rulaningtyas, Riries; Suksmono, Andriyan B.; Mengko, Tati L. R.; Saptawati, Putri


    Sputum smear observation plays an important role in tuberculosis (TB) diagnosis, and it requires accurate identification to avoid high diagnostic error rates. In developing countries, sputum smear slides are commonly observed with a conventional light microscope on Ziehl-Neelsen stained tissue, and the microscope is inexpensive to maintain. Clinicians perform a manual screening process for each sputum smear slide, which is time-consuming and requires extensive training to detect the presence of TB bacilli (mycobacterium tuberculosis) accurately, especially for negative slides and slides with few TB bacilli. To help clinicians, we propose an automatic scanning microscope with automatic identification of TB bacilli. The designed system drives the field movement of the light microscope with a stepper motor controlled by a microcontroller. Every sputum smear field is captured by a camera, and several image processing techniques are then applied to the sputum smear images. A color threshold on the hue channel in HSV color space is used for background subtraction, and the Sobel edge detection algorithm is used for TB bacilli image segmentation. Shape-based feature extraction is used to analyze the bacilli, and a neural network then classifies each object as a TB bacillus or not. The results indicated that our TB bacilli identification worked well and detected TB bacilli accurately in sputum smear slides with normal staining, but performed poorly on over-stained and under-stained tissue slides. Overall, however, the designed system can make sputum smear observation easier for clinicians.
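
    The hue-channel thresholding step can be sketched with the standard library's `colorsys` conversion: pixels whose hue falls inside a chosen band (here a red-ish band, loosely standing in for the Ziehl-Neelsen stain colour) become foreground. The image, band limits and colours are toy assumptions, not the paper's calibration.

```python
# Hue-band thresholding in HSV space for background subtraction.
# Pixels are (r, g, b) in [0, 1]; the band limits are illustrative only.
import colorsys

def hue_mask(pixels, h_lo, h_hi):
    # Returns a binary mask: 1 where the pixel's hue lies in [h_lo, h_hi].
    mask = []
    for row in pixels:
        mask.append([1 if h_lo <= colorsys.rgb_to_hsv(*p)[0] <= h_hi else 0
                     for p in row])
    return mask

image = [
    [(0.9, 0.1, 0.2), (0.2, 0.3, 0.9)],   # stained pixel, blue background
    [(0.8, 0.8, 0.8), (0.9, 0.0, 0.1)],   # gray background, stained pixel
]
mask = hue_mask(image, 0.90, 1.0)
print(mask)  # foreground where the stain hue falls inside the band
```

    In the full pipeline, Sobel edges and shape features would then be computed only on the foreground regions of this mask.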

  14. Forming and detection of digital watermarks in the System for Automatic Identification of VHF Transmissions

    Directory of Open Access Journals (Sweden)

    О. В. Шишкін


    Algorithms for forming and detecting digital watermarks are designed for automatic identification of VHF radiotelephone transmissions in the maritime and aeronautical mobile services. Inaudibility and interference resistance of the embedded digital data are provided by means of OFDM technology together with normalized distortion distribution and data packet detection by a hash function. Experiments were carried out using a ship's radio station RT-2048 Sailor and an L-CARD E14-140M USB ADC-DAC module in offline processing mode in Matlab.

  15. Single Document Automatic Text Summarization using Term Frequency-Inverse Document Frequency (TF-IDF)

    Directory of Open Access Journals (Sweden)

    Hans Christian


    The increasing availability of online information has triggered intensive research in the area of automatic text summarization within Natural Language Processing (NLP). Text summarization reduces the text by removing less useful information, which helps the reader find the required information quickly. There are many kinds of algorithms that can be used to summarize text; one of them is TF-IDF (Term Frequency-Inverse Document Frequency). This research aimed to produce an automatic text summarizer implemented with the TF-IDF algorithm and to compare it with various online automatic text summarizers. To evaluate the summary produced by each summarizer, the F-measure was used as the standard comparison metric. This research produced 67% accuracy on three data samples, higher than the other online summarizers.
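
    The TF-IDF extractive scheme can be sketched compactly: treat each sentence as a "document", weight words by term frequency times inverse document frequency, score sentences by their summed weights, and keep the top scorers. This is a generic sketch of the technique with deliberately naive tokenization, not the paper's implementation.

```python
# Minimal TF-IDF extractive summarizer: score each sentence by the summed
# TF-IDF weight of its words; return the top-scoring sentence(s) in their
# original order. Tokenization and the sample text are simplifications.
import math, re

def summarize(text, n_sentences=1):
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    docs = [re.findall(r"[a-z]+", s.lower()) for s in sentences]
    vocab = set(w for d in docs for w in d)
    # idf(w) = log(N / number of sentences containing w)
    idf = {w: math.log(len(docs) / sum(w in d for d in docs)) for w in vocab}
    def score(d):
        return sum(d.count(w) / len(d) * idf[w] for w in set(d)) if d else 0.0
    ranked = sorted(range(len(docs)), key=lambda i: score(docs[i]), reverse=True)
    chosen = sorted(ranked[:n_sentences])   # restore original sentence order
    return " ".join(sentences[i] for i in chosen)

text = ("TF-IDF scores words that are frequent in one sentence but rare elsewhere. "
        "The cat sat on the mat. "
        "The dog sat on the mat.")
print(summarize(text))
```

    The two near-duplicate filler sentences share most of their words, so their words get low IDF and the distinctive first sentence wins.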

  16. Deep learning for automatic localization, identification, and segmentation of vertebral bodies in volumetric MR images (United States)

    Suzani, Amin; Rasoulian, Abtin; Seitel, Alexander; Fels, Sidney; Rohling, Robert N.; Abolmaesumi, Purang


    This paper proposes an automatic method for vertebra localization, labeling, and segmentation in multi-slice Magnetic Resonance (MR) images. Prior work in this area on MR images mostly requires user interaction while our method is fully automatic. Cubic intensity-based features are extracted from image voxels. A deep learning approach is used for simultaneous localization and identification of vertebrae. The localized points are refined by local thresholding in the region of the detected vertebral column. Thereafter, a statistical multi-vertebrae model is initialized on the localized vertebrae. An iterative Expectation Maximization technique is used to register the vertebral body of the model to the image edges and obtain a segmentation of the lumbar vertebral bodies. The method is evaluated by applying it to nine volumetric MR images of the spine. The results demonstrate 100% vertebra identification and a mean surface error of below 2.8 mm for 3D segmentation. Computation time is less than three minutes per high-resolution volumetric image.

  17. Automatic identification and location technology of glass insulator self-shattering (United States)

    Huang, Xinbo; Zhang, Huiying; Zhang, Ye


    The insulator is one of the most important components of transmission lines and is vital to their safe operation under complex and harsh operating conditions. Glass insulators often self-shatter, but the available identification methods are inefficient and unreliable. An automatic identification and localization technology for self-shattered glass insulators is therefore proposed, consisting of cameras installed on tower video monitoring devices or unmanned aerial vehicles, a 4G/OPGW network, and a monitoring center, where the identification and localization algorithm is embedded in the expert software. First, images of insulators are captured by the cameras and processed to identify the region of the insulator string with the presented insulator string identification algorithm. Second, according to the characteristics of the insulator string image, a mathematical model of the insulator string is established to estimate the direction and length of the sliding blocks. Third, local binary pattern histograms of the template and the sliding block are extracted, by which the self-shattered insulator can be recognized and located. Finally, a series of experiments was carried out to verify the effectiveness of the algorithm. For single-insulator images, Ac, Pr, and Rc of the algorithm are 94.5%, 92.38%, and 96.78%, respectively. For double-insulator images, Ac, Pr, and Rc are 90.00%, 86.36%, and 93.23%, respectively.

  18. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    International Nuclear Information System (INIS)

    Amato, G.; Cristoforetti, G.; Legnaioli, S.; Lorenzetti, G.; Palleschi, V.; Sorrentino, F.; Tognoni, E.


    In this communication, we illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks in the database in its wavelength neighborhood. We suppose we have a database containing the peaks of all elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys is also illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis is discussed.
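
    The ranking idea can be sketched by treating each element as a vector of peak weights over a common wavelength grid and ranking elements by cosine similarity to the sample's vector, exactly as documents are ranked against a query in the vector space model. The element names are real symbols, but the wavelength bins and intensities below are invented, not real spectral line data.

```python
# Rank candidate elements by cosine similarity between the sample's peak
# vector and each element's reference peak vector. Toy intensities only.
import math

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den if den else 0.0

# Columns = wavelength bins; entries = weighted relative peak intensities.
elements = {
    "Fe": [0.9, 0.0, 0.4, 0.0, 0.2],
    "Cu": [0.0, 0.8, 0.0, 0.6, 0.0],
    "Al": [0.1, 0.0, 0.0, 0.9, 0.3],
}
sample = [0.7, 0.1, 0.3, 0.0, 0.2]
ranking = sorted(elements, key=lambda e: cosine(sample, elements[e]), reverse=True)
print(ranking)  # most likely element first
```

    In the paper's scheme the weights additionally discount peaks that have many database neighbours at nearby wavelengths, analogous to inverse document frequency.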

  19. Real-time automatic target identification system for air-to-ground targeting (United States)

    Nicholas, Mike; Wood, Jonathan; Nothard, Jo


    Future targeting systems, for manned or unmanned combat aircraft, aim to provide increased mission success and platform survivability by successfully detecting and identifying even difficult targets at very long ranges. One of the key enabling technologies for such systems is robust automatic target identification (ATI), operating on high resolution electro-optic sensor imagery. QinetiQ have developed a real time ATI processor which will be demonstrated with infrared imagery from the Wescam MX15 in airborne trials in summer 2005. This paper describes some of the novel ATI algorithms, the challenges overcome to port the ATI from the laboratory onto a real time system and offers an assessment of likely airborne performance based on analysis of synthetic image sequences.

  20. Benefit Analyses of Technologies for Automatic Identification to Be Implemented in the Healthcare Sector (United States)

    Krey, Mike; Schlatter, Ueli

    The tasks and objectives of automatic identification (Auto-ID) are to provide information on goods and products. It has been established for years in the areas of logistics and trading and can no longer be ignored by the German healthcare sector. Some German hospitals have already discovered the capabilities of Auto-ID. Improvements in quality and safety, and reductions in risk, cost and time, are areas where gains are achievable. Privacy protection, legal restraints, and the personal rights of patients and staff members are just a few aspects which make the healthcare sector a sensitive field for the implementation of Auto-ID. Auto-ID in this context comprises the different technologies, methods and products for the registration, provision and storage of relevant data. With the help of a quantifiable and science-based evaluation, an answer is sought as to which Auto-ID technology has the highest capability to be implemented in healthcare.

  1. Ontorat: automatic generation of new ontology terms, annotations, and axioms based on ontology design patterns. (United States)

    Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; He, Yongqun


    It is time-consuming to build an ontology with many terms and axioms. Thus it is desired to automate the process of ontology development. Ontology Design Patterns (ODPs) provide a reusable solution to solve a recurrent modeling problem in the context of ontology engineering. Because ontology terms often follow specific ODPs, the Ontology for Biomedical Investigations (OBI) developers proposed a Quick Term Templates (QTTs) process targeted at generating new ontology classes following the same pattern, using term templates in a spreadsheet format. Inspired by the ODPs and QTTs, the Ontorat web application is developed to automatically generate new ontology terms, annotations of terms, and logical axioms based on a specific ODP(s). The inputs of an Ontorat execution include axiom expression settings, an input data file, ID generation settings, and a target ontology (optional). The axiom expression settings can be saved as a predesigned Ontorat setting format text file for reuse. The input data file is generated based on a template file created by a specific ODP (text or Excel format). Ontorat is an efficient tool for ontology expansion. Different use cases are described. For example, Ontorat was applied to automatically generate over 1,000 Japan RIKEN cell line cell terms with both logical axioms and rich annotation axioms in the Cell Line Ontology (CLO). Approximately 800 licensed animal vaccines were represented and annotated in the Vaccine Ontology (VO) by Ontorat. The OBI team used Ontorat to add assay and device terms required by ENCODE project. Ontorat was also used to add missing annotations to all existing Biobank specific terms in the Biobank Ontology. A collection of ODPs and templates with examples are provided on the Ontorat website and can be reused to facilitate ontology development. With ever increasing ontology development and applications, Ontorat provides a timely platform for generating and annotating a large number of ontology terms by following

  2. SLIDE: automatic spine level identification system using a deep convolutional neural network. (United States)

    Hetherington, Jorden; Lessoway, Victoria; Gunka, Vit; Abolmaesumi, Purang; Rohling, Robert


    Percutaneous spinal needle insertion procedures often require proper identification of the vertebral level to effectively and safely deliver analgesic agents. The current clinical method involves "blind" identification of the vertebral level through manual palpation of the spine, which has only 30% reported accuracy. Therefore, there is a need for better anatomical identification prior to needle insertion. A real-time system was developed to identify the vertebral level from a sequence of ultrasound images, following a clinical imaging protocol. The system uses a deep convolutional neural network (CNN) to classify transverse images of the lower spine. Several existing CNN architectures were implemented, utilizing transfer learning, and compared for adequacy in a real-time system. In the system, the CNN output is processed, using a novel state machine, to automatically identify vertebral levels as the transducer moves up the spine. Additionally, a graphical display was developed and integrated within 3D Slicer. Finally, an augmented reality display, projecting the level onto the patient's back, was also designed. A small feasibility study [Formula: see text] evaluated performance. The proposed CNN successfully discriminates ultrasound images of the sacrum, intervertebral gaps, and vertebral bones, achieving 88% 20-fold cross-validation accuracy. Seventeen of 20 test ultrasound scans had successful identification of all vertebral levels, processed at real-time speed (40 frames/s). A machine learning system is presented that successfully identifies lumbar vertebral levels. The small study on human subjects demonstrated real-time performance. A projection-based augmented reality display was used to show the vertebral level directly on the subject adjacent to the puncture site.
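    The level-counting state machine described above can be sketched in a few lines: the CNN labels each ultrasound frame, and a vertebral level is counted at each debounced gap-to-bone transition after the sacrum has been seen. The class names, debouncing rule, and run length below are illustrative assumptions, not the authors' published design.

```python
def count_levels(frame_labels, min_run=3):
    """Return the per-frame vertebral level count (0 = sacrum not yet passed).

    frame_labels: per-frame CNN class, one of "sacrum", "gap", "bone".
    min_run: frames a label must persist before it is trusted (debouncing).
    """
    level = 0
    state = "start"
    run = 0
    prev = None
    levels = []
    for lab in frame_labels:
        run = run + 1 if lab == prev else 1
        prev = lab
        if run >= min_run:  # ignore noisy single-frame predictions
            if state == "start" and lab == "sacrum":
                state = "sacrum"
            elif state in ("sacrum", "bone") and lab == "gap":
                state = "gap"
            elif state == "gap" and lab == "bone":
                state = "bone"
                level += 1  # crossed an intervertebral gap: next level up
        levels.append(level)
    return levels
```

As the transducer moves up from the sacrum, each sustained gap-then-bone sequence increments the level, which is what the graphical and augmented reality displays would report.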

  3. Methodology for identifying the parameters of control-object models in an automatic trailing system

    Directory of Open Access Journals (Sweden)

    I.V. Zimchuk


    Full Text Available The decisive factor in successfully synthesizing optimal control systems for different processes is the adequacy of the mathematical model of the control object. In practice, the object's parameters can differ from those assumed a priori, so they need to be refined. In this context, the article presents the results of developing and applying a method for identifying the parameters of mathematical models of the control object of an automatic trailing system. The problem is solved under the assumptions that the control object is fully controllable and observable and that its differential equation is known a priori; the coefficients of this equation are to be determined. The identification quality criterion is minimization of the integral of the squared identification error. The method is based on a state-space description of the object's dynamics. The identification equation is synthesized using a vector-matrix representation of the model; it describes the relationship between the coefficients of the state and control matrices and the inputs and outputs of the object. The initial data for the calculation are experimental measurements of the reaction of the object's phase coordinates to a typical input signal. Computing the model parameters reduces to solving a system of first-order equations. The approach is illustrated by identifying the coefficients of the transfer function of a first-order control object. Digital simulation results are presented that confirm the validity of the mathematical derivations. The approach enables identification of models of both one-dimensional and multidimensional objects and does not require a large amount of computation. The order of the identified model is limited by the ability to measure the phase coordinates of the corresponding control object.
    The practical significance of the work is
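    As a toy illustration of the first-order example: assuming a model T·dy/dt + y = K·u with unknown gain K and time constant T, both coefficients can be recovered from a sampled step response by ordinary least squares on the regression y = K·u − T·dy/dt — a discrete-time simplification of the integral-squared-error criterion described above, not the article's exact vector-matrix formulation.

```python
def identify_first_order(t, y, u=1.0):
    """Least-squares estimate of (K, T) in T*dy/dt + y = K*u
    from a sampled step response, via the regression y = K*u - T*dy/dt."""
    n = len(t) - 1
    # finite-difference derivative and signal value, both taken at midpoints
    d = [(y[i + 1] - y[i]) / (t[i + 1] - t[i]) for i in range(n)]
    ym = [0.5 * (y[i + 1] + y[i]) for i in range(n)]
    # normal equations for ym = a + c*d, where a = K*u and c = -T
    Sd, Sdd = sum(d), sum(x * x for x in d)
    Sy, Syd = sum(ym), sum(p * q for p, q in zip(ym, d))
    det = n * Sdd - Sd * Sd
    a = (Sy * Sdd - Sd * Syd) / det
    c = (n * Syd - Sd * Sy) / det
    return a / u, -c
```

Feeding in a simulated step response of a known first-order plant recovers its K and T to within the discretization error.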

  4. A hybrid approach to automatic de-identification of psychiatric notes. (United States)

    Lee, Hee-Jin; Wu, Yonghui; Zhang, Yaoyun; Xu, Jun; Xu, Hua; Roberts, Kirk


    De-identification, or identifying and removing protected health information (PHI) from clinical data, is a critical step in making clinical data available for clinical applications and research. This paper presents a natural language processing system for the automatic de-identification of psychiatric notes, designed to participate in Track 1 of the 2016 CEGS N-GRID shared task. The system has a hybrid structure that combines machine learning techniques and rule-based approaches. The rule-based components exploit the structure of the psychiatric notes as well as characteristic surface patterns of PHI mentions. The machine learning components utilize supervised learning with rich features. In addition, the system's performance was boosted by integrating additional data into the training set through domain adaptation. The hybrid system achieved an overall micro-averaged F-score of 90.74 on the test set, second-best among all the participants of the CEGS N-GRID task. Copyright © 2017. Published by Elsevier Inc.
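    A minimal sketch of the rule-based side of such a hybrid system: surface patterns for a few PHI types are matched and replaced with placeholder tags. The regexes and tag names below are illustrative assumptions for a handful of PHI categories, not the actual rules of the CEGS N-GRID system.

```python
import re

# Illustrative surface patterns for three PHI types (dates, phone numbers,
# MRN-style record identifiers). A real system covers many more categories
# and combines these rules with machine-learned taggers.
PHI_PATTERNS = [
    ("DATE", re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b")),
    ("PHONE", re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b")),
    ("ID", re.compile(r"\bMRN[:#]?\s*\d{6,}\b")),
]

def deidentify(text):
    """Replace each matched PHI mention with its category placeholder."""
    for tag, pattern in PHI_PATTERNS:
        text = pattern.sub(f"[{tag}]", text)
    return text
```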

  5. A new technology for automatic identification and sorting of plastics for recycling. (United States)

    Ahmad, S R


    A new technology for the automatic sorting of plastics is described, based upon optical identification of the fluorescence signatures of dyes incorporated in such materials in trace concentrations prior to product manufacture. Three commercial tracers were selected primarily on the basis of their good absorbency in the 310-370 nm spectral band and their identifiable narrow-band fluorescence signatures in the visible band of the spectrum when present in binary combinations. This absorption band was selected because of the availability of strong emission lines in this band from a commercial Hg-arc lamp and the high fluorescence quantum yields of the tracers at this excitation wavelength band. The plastics chosen for tracing and identification are HDPE, LDPE, PP, EVA, PVC and PET; the tracers were compatible and chemically non-reactive with the host matrices and did not affect the transparency of the plastics. The design of the monochromatic, collimated excitation source and the sensor system is described, and their performance in identifying and sorting plastics doped with tracers at a few parts per million is evaluated. In an industrial sorting system, the sensor was able to sort 300 mm long plastic bottles at a conveyor belt speed of 3.5 m/s with a sorting purity of ~95%. The limitation was imposed by mechanical singulation irregularities at high speed and the limited processing speed of the computer used.
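    The binary-tracer scheme lends itself to a simple lookup: each resin is tagged with a pair of narrow-band emission peaks, and a detection is accepted when both measured peaks fall within tolerance of a registered signature. The wavelengths and resin assignments below are invented for illustration; the paper does not disclose the actual tracer-to-polymer mapping.

```python
# Hypothetical signature table: resin -> pair of emission-peak wavelengths (nm).
SIGNATURES = {
    "HDPE": (430, 515),
    "PVC": (430, 560),
    "PET": (515, 560),
}

def identify_resin(peaks, tol=5.0):
    """Return the first resin whose two signature peaks are both present
    in the measured peak list within +/- tol nm, else None."""
    for resin, signature in SIGNATURES.items():
        if all(any(abs(p - s) <= tol for p in peaks) for s in signature):
            return resin
    return None
```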

  6. Field-portable imaging remote sensing system for automatic identification and imaging of hazardous gases (United States)

    Harig, R.; Rusch, P.; Peters, H.; Gerhard, J.; Braun, R.; Sabbah, S.; Beecken, J.


    Hazardous compounds may be released into the atmosphere in the case of fires, chemical accidents, terrorist acts, or war. In these cases, information about the released compounds is required immediately in order to take appropriate measures to protect workers, residents, emergency response personnel at the site of the release, and the environment. Remote sensing by infrared spectroscopy allows detection and identification of hazardous clouds in the atmosphere from long distances. In addition, imaging spectroscopy allows an assessment of the location, the dimensions and the dispersion of a potentially hazardous cloud. This additional information may contribute significantly to a correct assessment of a situation by emergency response forces. Therefore an imaging remote sensing system based on a Fourier-transform spectrometer with a focal plane array detector for automatic identification and imaging of gases has been developed. Imaging systems allow the use of spatial information in addition to spectral information. Thus, in order to achieve low limits of detection, algorithms that combine algorithms for spectral analysis and image analysis have been developed. In this work, the system and first results of measurements are presented.

  7. Semi-automatic Term Extraction for an isiZulu Linguistic Terms ...

    African Journals Online (AJOL)


    This paper advances the use of frequency analysis and keyword analysis as strategies to extract terms for the compilation of a dictionary of isiZulu linguistic terms. The study uses the isiZulu National Corpus (INC) of about 1.2 million tokens as a reference corpus as well as an LSP corpus of about 100,000 tokens as a ...
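    The keyword-analysis step typically ranks candidate terms by a keyness statistic such as Dunning's log-likelihood, which compares a term's frequency in the specialized (LSP) corpus against the reference corpus. A minimal sketch of that statistic (the corpus figures used below are invented, not the isiZulu corpus counts):

```python
import math

def log_likelihood(freq_lsp, size_lsp, freq_ref, size_ref):
    """Dunning log-likelihood keyness of a term: how much more often it
    occurs in the LSP corpus than expected from the combined corpora.
    Values above ~15.13 are significant at p < 0.0001 (1 d.f.)."""
    total = size_lsp + size_ref
    expected_lsp = size_lsp * (freq_lsp + freq_ref) / total
    expected_ref = size_ref * (freq_lsp + freq_ref) / total
    ll = 0.0
    if freq_lsp:
        ll += freq_lsp * math.log(freq_lsp / expected_lsp)
    if freq_ref:
        ll += freq_ref * math.log(freq_ref / expected_ref)
    return 2.0 * ll
```

Terms whose keyness exceeds the chosen significance threshold become candidates for the dictionary.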

  8. Semi-automatic Term Extraction for an isiZulu Linguistic Terms ...

    African Journals Online (AJOL)

    Abstract. The University of KwaZulu-Natal (UKZN) is compiling a series of Language for Special Purposes (LSP) dictionaries for various specialized subject domains in line with its language policy and plan. The focus in this paper is the term extraction for words in the linguistics subject domain. This paper advances the use ...

  9. EVEREST: automatic identification and classification of protein domains in all protein sequences

    Directory of Open Access Journals (Sweden)

    Linial Nathan


    Full Text Available Abstract Background Proteins are comprised of one or several building blocks, known as domains. Such domains can be classified into families according to their evolutionary origin. Whereas sequencing technologies have advanced immensely in recent years, there are no matching computational methodologies for large-scale determination of protein domains and their boundaries. We provide and rigorously evaluate a novel set of domain families that is automatically generated from sequence data. Our domain family identification process, called EVEREST (EVolutionary Ensembles of REcurrent SegmenTs), begins by constructing a library of protein segments that emerge in an all vs. all pairwise sequence comparison. It then proceeds to cluster these segments into putative domain families. The selection of the best putative families is done using machine learning techniques. A statistical model is then created for each of the chosen families. This procedure is then iterated: the aforementioned statistical models are used to scan all protein sequences, to recreate a library of segments and to cluster them again. Results Processing the Swiss-Prot section of the UniProt Knowledgebase, release 7.2, EVEREST defines 20,230 domains, covering 85% of the amino acids of the Swiss-Prot database. EVEREST annotates 11,852 proteins (6% of the database) that are not annotated by Pfam A. In addition, in 43,086 proteins (20% of the database), EVEREST annotates a part of the protein that is not annotated by Pfam A. Performance tests show that EVEREST recovers 56% of Pfam A families and 63% of SCOP families with high accuracy, and suggests previously unknown domain families with at least 51% fidelity. EVEREST domains are often a combination of domains as defined by Pfam or SCOP and are frequently sub-domains of such domains. Conclusion The EVEREST process and its output domain families provide an exhaustive and validated view of the protein domain world that is automatically
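    A highly simplified sketch of the clustering step: segments are greedily grouped into putative families whenever their similarity to a family's seed segment exceeds a threshold. Real EVEREST clusters all-vs-all alignment scores and iterates with statistical models; the greedy rule and similarity function here are toy assumptions to show the shape of the computation.

```python
def greedy_cluster(segments, similarity, threshold=0.5):
    """Group segments into putative families: each segment joins the first
    family whose seed it resembles above `threshold`, else seeds a new one."""
    families = []  # each family is a list of segments; families[i][0] is the seed
    for seg in segments:
        for fam in families:
            if similarity(seg, fam[0]) >= threshold:
                fam.append(seg)
                break
        else:
            families.append([seg])
    return families
```

With segments represented as (start, end) intervals and overlap-based similarity, co-locating segments collapse into one family per region.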


    The report briefly presents a nonlinear model originally proposed by the late Norbert Wiener for the characterization of general systems. Three procedures are then offered for the identification of any given system in terms of the Wiener model. Finally, this report presents the results of a digital

  11. Automatic Active-Region Identification and Azimuth Disambiguation of the SOLIS/VSM Full-Disk Vector Magnetograms (United States)

    Georgoulis, M. K.; Raouafi, N.-E.; Henney, C. J.

    The Vector Spectromagnetograph (VSM) of the NSO's Synoptic Optical Long-Term Investigations of the Sun (SOLIS) facility is now operational and obtains the first-ever vector magnetic field measurements of the entire visible solar hemisphere. To fully exploit the unprecedented SOLIS/VSM data, however, one must first address two critical problems. First, the study of solar active regions requires an automatic, physically intuitive technique for active-region identification in the solar disk. Second, use of active-region vector magnetograms requires removal of the azimuthal 180° ambiguity in the orientation of the transverse magnetic field component. Here we report on an effort to address both problems simultaneously and efficiently. To identify solar active regions we apply an algorithm designed to locate complex, flux-balanced magnetic structures with a dominant East-West orientation on the disk. Each of the disk portions corresponding to active regions is thereafter extracted and subjected to the Nonpotential Magnetic Field Calculation (NPFC) method, which provides a physically intuitive solution of the 180° ambiguity. Both algorithms have been integrated into the VSM data pipeline and operate in real time, without human intervention. We conclude that this combined approach can contribute meaningfully to our emerging capability for full-disk vector magnetography, as pioneered by SOLIS today and as will be carried out by ground-based and space-borne magnetographs in the future.

  12. Automatic identification and characterization of radial files in light microscopy images of wood. (United States)

    Brunel, Guilhem; Borianne, Philippe; Subsol, Gérard; Jaeger, Marc; Caraglio, Yves


    Analysis of anatomical sections of wood provides important information for understanding the secondary growth and development of plants. This study reports on a new method for the automatic detection and characterization of cell files in wood images obtained by light microscopy. To facilitate interpretation of the results, reliability coefficients have been determined, which characterize the files, their cells and their respective measurements. Histological sections and blocks of the gymnosperms Pinus canariensis, P. nigra and Abies alba were used, together with histological sections of the angiosperm mahogany (Swietenia spp.). Samples were scanned microscopically and mosaic images were built up. After initial processing to reduce noise and enhance contrast, cells were identified using a 'watershed' algorithm and then cell files were built up by the successive aggregation of cells taken from progressively enlarged neighbouring regions. Cell characteristics such as thickness and size were calculated, and a method was developed to determine the reliability of the measurements relative to manual methods. Image analysis using this method can be performed in less than 20 s, which compares with a time of approx. 40 min to produce the same results manually. The results are accompanied by a reliability indicator that can highlight specific configurations of cells and also potentially erroneous data. The method provides a fast, economical and reliable tool for the identification of cell files. The reliability indicator characterizing the files permits quick filtering of data for statistical analysis while also highlighting particular biological configurations present in the wood sections.
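    The file-building step described above — successive aggregation of cells from progressively enlarged neighbouring regions — can be sketched on cell centroids: starting from a seed cell, repeatedly take the nearest unassigned neighbour found within a search radius that is enlarged until a neighbour appears or a cap is reached. The geometry and radii below are illustrative assumptions; the published method works on watershed-segmented cell regions, not bare centroids.

```python
import math

def build_file(start, cells, r0=1.5, r_max=4.0, step=0.5):
    """Chain cell centroids into a radial file by successive aggregation.

    start: seed centroid (x, y); cells: iterable of centroids.
    The search radius grows from r0 to r_max in `step` increments whenever
    no neighbour is found, mimicking progressively enlarged neighbourhoods.
    """
    cell_file, pool = [start], set(cells) - {start}
    current = start
    while True:
        r = r0
        nxt = None
        while nxt is None and r <= r_max:
            near = [c for c in pool if math.dist(current, c) <= r]
            if near:
                nxt = min(near, key=lambda c: math.dist(current, c))
            else:
                r += step  # enlarge the neighbourhood and retry
        if nxt is None:
            return cell_file  # no reachable cell: the file ends here
        cell_file.append(nxt)
        pool.discard(nxt)
        current = nxt
```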

  13. Adoption of automatic identification systems by grocery retailers in the Johannesburg area

    Directory of Open Access Journals (Sweden)

    Christopher C. Darlington


    Full Text Available Retailers not only need the right data capture technology to meet the requirements of their applications, they must also decide on what the optimum technology is from the different symbologies that have been developed over the years. Automatic identification systems (AIS are a priority to decision makers as they attempt to obtain the best blend of equipment to ensure greater loss prevention and higher reliability in data capture. However there is a risk of having too simplistic a view of adopting AIS, since no one solution is applicable across an industry or business model. This problem is addressed through an exploratory, descriptive study, where the nature and value of AIS adoption by grocery retailers in the Johannesburg area is interrogated. Mixed empirical results indicate that, as retailers adopt AIS in order to improve their supply chain management systems, different types of applications are associated with various constraints and opportunities. Overall this study is in line with previous research that supports the notion that supply chain decisions are of a strategic nature even though efficient management of information is a day-to-day business operational decision.

  14. Compensation of Cable Voltage Drops and Automatic Identification of Cable Parameters in 400 Hz Ground Power Units

    DEFF Research Database (Denmark)

    Borup, Uffe; Nielsen, Bo Vork; Blaabjerg, Frede


    ...self and mutual impedance parameters. The model predicts the voltage drop at both symmetrical and unbalanced loads. In order to determine the cable model parameters, an automatic identification concept is derived. The concept is tested at full scale on a 90-kVA 400-Hz GPU with two different cables... It is concluded that the performance is significantly improved both with symmetrical and unsymmetrical cables and with balanced and unbalanced loads...

  15. Long-term abacus training induces automatic processing of abacus numbers in children. (United States)

    Du, Fenglei; Yao, Yuan; Zhang, Qiong; Chen, Feiyan


    Abacus-based mental calculation (AMC) is a unique strategy for arithmetic that is based on the mental abacus. AMC experts can solve calculation problems with extraordinarily fast speed and high accuracy. Previous studies have demonstrated that abacus experts showed superior performance and special neural correlates during numerical tasks. However, most of those studies focused on the perception and cognition of Arabic numbers. It remains unclear how the abacus numbers were perceived. By applying a similar enumeration Stroop task, in which participants are presented with a visual display containing two abacus numbers and asked to compare the numerosity of beads that consisted of the abacus number, in the present study we investigated the automatic processing of the numerical value of abacus numbers in abacus-trained children. The results demonstrated a significant congruity effect in the numerosity comparison task for abacus-trained children, in both reaction time and error rate analysis. These results suggested that the numerical value of abacus numbers was perceived automatically by the abacus-trained children after long-term training.

  16. SU-F-R-25: Automatic Identification of Suspicious Recurrent/residual Disease Regions After Prostatectomy

    Energy Technology Data Exchange (ETDEWEB)

    Parra, N A; Abramowitz, M; Pollack, A; Stoyanova, R [University of Miami, Miami, FL (United States)


    Purpose: To automatically identify and outline suspicious regions of recurrent or residual disease in the prostate bed using Dynamic Contrast Enhanced MRI (DCE-MRI) in patients after prostatectomy. Methods: Twenty-two patients presenting for salvage radiotherapy and with an identified Gross Tumor Volume (GTV) in the prostate bed were retrospectively analyzed. The MRI data consisted of axial T2-weighted MRI (T2w) of the pelvis (resolution 1.25×1.25×2.5 mm; Field of View (FOV) 320×320 mm; slice thickness 2.5 mm; 72 slices) and Dynamic Contrast Enhanced MRI (DCE-MRI): 12 series of T1w with spatial resolution identical to T2w and 30-34 s temporal resolution. Unsupervised pattern recognition was used to decompose the 4D DCE data as the product W·H of weights W and k patterns H. A well-perfused pattern Hwp was identified, and the weight map Wwp associated with Hwp was used to delineate suspicious volumes. Thresholds of Wwp set at mean(Wwp)+S*std(Wwp), S=1, 1.5, 2 and 2.5, defined four volumes labeled DCE1.0 to DCE2.5. These volumes were displayed on T2w and, along with GTV, were correlated with the highest pre-treatment PSA values and with pharmacokinetic analysis constants. Results: GTV was significantly correlated with DCE2.0 (ρ=0.60, p<0.003) and DCE2.5 (ρ=0.58, p=0.004). Significant correlation was found between the highest pre-treatment PSA and GTV (ρ=0.42, p<0.049), DCE2.0 (ρ=0.52, p<0.012), and DCE2.5 (ρ=0.67, p<<0.01). Kruskal-Wallis analysis showed that the median Ktrans value was statistically different between non-specific prostate bed tissue (NSPBT) and both GTV (p<<0.001) and DCE2.5 (p<<0.001); however, while median Ve was statistically different between DCE2.5 and NSPBT (p=0.002), it was not statistically different between GTV and NSPBT (p=0.054), suggesting that the automatic volumes capture the area of malignancy more accurately. Conclusion: Software developed for identification and visualization of suspicious regions in DCE-MRI from post-prostatectomy patients has
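    The thresholding step above is straightforward to sketch: a voxel enters a suspicious volume when its well-perfused weight exceeds mean(Wwp) + S·std(Wwp). A minimal version over a flattened weight map (the factorization producing Wwp is assumed to have been done already):

```python
import statistics

def suspicious_volume(w_map, s=2.0):
    """Binary mask of voxels whose well-perfused weight exceeds
    mean(W) + s*std(W) -- the rule defining the DCE1.0..DCE2.5 volumes."""
    mu = statistics.fmean(w_map)
    sd = statistics.pstdev(w_map)
    threshold = mu + s * sd
    return [w > threshold for w in w_map]
```

Raising `s` from 1 to 2.5 shrinks the mask toward the most strongly perfused voxels, which is why DCE2.0 and DCE2.5 correlated best with the GTV.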

  17. Automatic temporal segment detection via bilateral long short-term memory recurrent neural networks (United States)

    Sun, Bo; Cao, Siming; He, Jun; Yu, Lejun; Li, Liandong


    Constrained by the physiology, the temporal factors associated with human behavior, irrespective of facial movement or body gesture, are described by four phases: neutral, onset, apex, and offset. Although they may benefit related recognition tasks, it is not easy to accurately detect such temporal segments. An automatic temporal segment detection framework using bilateral long short-term memory recurrent neural networks (BLSTM-RNN) to learn high-level temporal-spatial features, which synthesizes the local and global temporal-spatial information more efficiently, is presented. The framework is evaluated in detail over the face and body database (FABO). The comparison shows that the proposed framework outperforms state-of-the-art methods for solving the problem of temporal segment detection.

  18. AROMA-AIRWICK: a CHLOE/CDC-3600 system for the automatic identification of spark images and their association into tracks

    International Nuclear Information System (INIS)

    Clark, R.K.

    The AROMA-AIRWICK system, for CHLOE (an automatic film-scanning machine built at Argonne by Donald Hodges) and the CDC-3600 computer, is a system for the automatic identification of spark images and their association into tracks. AROMA-AIRWICK has been an outgrowth of the generally recognized need for the automatic processing of high-energy physics data and the fact that the Argonne National Laboratory has been a center of serious spark chamber development in recent years

  19. Identification of Units and Other Terms in Czech Medical Records

    Czech Academy of Sciences Publication Activity Database

    Zvára Jr., Karel; Kašpar, Václav


    Roč. 6, č. 1 (2010), s. 78-82 ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: natural language processing * healthcare documentation * medical reports * EHR * finite-state machine * regular expression Subject RIV: IN - Informatics, Computer Science

  20. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal-neutron-activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, M.C.


    A computer code that automatically analyzes gamma-ray spectra obtained with Ge(Li) detectors is described. The program contains such features as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification, and calculation of masses and errors. Finally, the results obtained with this computer code for a lunar sample are reported and briefly discussed
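    The automatic peak-location step can be illustrated with a toy detector: flag channels that are a local maximum and exceed the local baseline by k standard deviations, with Poisson counting noise giving sigma ≈ sqrt(baseline). The window size and k below are illustrative choices, not the CERPI/CEREL parameters.

```python
import math

def find_peaks(counts, half_window=5, k=4.0):
    """Return channel indices of statistically significant local maxima
    in a gamma-ray spectrum (list of channel counts)."""
    peaks = []
    for i in range(half_window, len(counts) - half_window):
        window = counts[i - half_window:i + half_window + 1]
        # baseline estimated from the window edges on either side of the peak
        baseline = (counts[i - half_window] + counts[i + half_window]) / 2.0
        significant = counts[i] > baseline + k * math.sqrt(max(baseline, 1.0))
        if counts[i] == max(window) and significant:
            peaks.append(i)
    return peaks
```

A full code would then fit each flagged peak to obtain its energy and intensity, and look the energy up in a nuclide library for identification.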

  1. Long term Suboxone™ emotional reactivity as measured by automatic detection in speech.

    Directory of Open Access Journals (Sweden)

    Edward Hill

    Full Text Available Addictions to illicit drugs are among the nation's most critical public health and societal problems. The current opioid prescription epidemic, the need for buprenorphine/naloxone (Suboxone®; SUBX) as an opioid maintenance substance, and its growing street diversion provided impetus to determine affective states ("true ground" emotionality) in long-term SUBX patients. Toward the goal of effective monitoring, we utilized emotion detection in speech as a measure of "true" emotionality in 36 SUBX patients compared to 44 individuals from the general population (GP) and 33 members of Alcoholics Anonymous (AA). Other, less objective studies have investigated the emotional reactivity of heroin, methadone and opioid-abstinent patients. These studies indicate that current opioid users have abnormal emotional experience, characterized by heightened response to unpleasant stimuli and blunted response to pleasant stimuli. However, this is the first study to our knowledge to evaluate "true ground" emotionality in long-term buprenorphine/naloxone combination (Suboxone™) patients. We found that long-term SUBX patients had a significantly flat affect (p<0.01) and less self-awareness of being happy, sad, and anxious compared to both the GP and AA groups. We caution against definitive interpretation of these seemingly important results until we compare the emotional reactivity of an opioid-abstinent control using automatic detection in speech. These findings encourage continued research strategies in SUBX patients to target the specific brain regions responsible for relapse prevention of opioid addiction.

  2. MetaboHunter: an automatic approach for identification of metabolites from 1H-NMR spectra of complex mixtures

    Directory of Open Access Journals (Sweden)

    Culf Adrian


    Full Text Available Abstract Background One-dimensional 1H-NMR spectroscopy is widely used for high-throughput characterization of metabolites in complex biological mixtures. However, the accurate identification of individual compounds is still a challenging task, particularly in spectral regions with higher peak densities. The need for automatic tools to facilitate and further improve the accuracy of such tasks, while using increasingly larger reference spectral libraries, is a priority of current metabolomics research. Results We introduce a web server application, called MetaboHunter, which can be used for automatic assignment of 1H-NMR spectra of metabolites. MetaboHunter provides methods for automatic metabolite identification based on spectra or peak lists, with three different search methods and with the possibility of peak drift in a user-defined spectral range. The assignment is performed using as reference libraries manually curated data from two major publicly available databases of NMR metabolite standard measurements (HMDB and MMCD). Tests using a variety of synthetic and experimental spectra of single- and multi-metabolite mixtures show that MetaboHunter is able to identify, on average, more than 80% of detectable metabolites from spectra of synthetic mixtures and more than 50% from spectra corresponding to experimental mixtures. This work also suggests that better scoring functions improve the performance of MetaboHunter's metabolite identification methods by more than 30%. Conclusions MetaboHunter is a freely accessible, easy to use and user-friendly 1H-NMR-based web server application that provides efficient data input and pre-processing, flexible parameter settings, fast and automatic metabolite fingerprinting and results visualization via intuitive plotting and compound peak hit maps. Compared to other published and freely accessible metabolomics tools, MetaboHunter implements three efficient methods to search for metabolites in manually curated
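    The core of peak-list matching with a drift tolerance is simple to sketch: a reference metabolite is scored by the fraction of its library peaks found in the measured list within ± drift ppm. The two-entry library below is invented for illustration and is far smaller than the HMDB/MMCD-derived libraries MetaboHunter actually uses; the scoring is also a simplification of its three search methods.

```python
# Toy reference library: metabolite -> list of 1H-NMR peak positions (ppm).
LIBRARY = {
    "lactate": [1.32, 4.10],
    "alanine": [1.47, 3.77],
}

def match_scores(measured, drift=0.02):
    """Score each library metabolite by the fraction of its reference peaks
    matched by a measured peak within +/- drift ppm."""
    scores = {}
    for name, ref_peaks in LIBRARY.items():
        hits = sum(any(abs(m - r) <= drift for m in measured) for r in ref_peaks)
        scores[name] = hits / len(ref_peaks)
    return scores
```

Ranking metabolites by this score (and resolving shared peaks) is essentially what a peak-list search mode returns to the user.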

  3. Short-Term Load Forecasting-Based Automatic Distribution Network Reconfiguration

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Huaiguang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ding, Fei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhang, Yingchen [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    In a traditional dynamic network reconfiguration study, the optimal topology is determined at every scheduled time point by using the real load data measured at that time. The development of the load forecasting technique can provide an accurate prediction of the load power that will happen in a future time and provide more information about load changes. With the inclusion of load forecasting, the optimal topology can be determined based on the predicted load conditions during a longer time period instead of using a snapshot of the load at the time when the reconfiguration happens; thus, the distribution system operator can use this information to better operate the system reconfiguration and achieve optimal solutions. This paper proposes a short-term load forecasting approach to automatically reconfigure distribution systems in a dynamic and pre-event manner. Specifically, a short-term and high-resolution distribution system load forecasting approach is proposed with a forecaster based on support vector regression and parallel parameters optimization. The network reconfiguration problem is solved by using the forecasted load continuously to determine the optimal network topology with the minimum amount of loss at the future time. The simulation results validate and evaluate the proposed approach.
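    The pre-event reconfiguration step can be sketched as: given the forecasted loads for the next interval, evaluate each candidate topology's I²R loss and switch to the minimum-loss one. The topologies, resistances, and loads below are invented toy values, and the loss model ignores voltage and radiality constraints that a real reconfiguration solver must enforce.

```python
def topology_loss(topology, loads, resistance):
    """I^2*R loss of one candidate topology.

    topology: maps each load bus to the feeder serving it.
    loads: per-bus current draw; resistance: per-feeder resistance.
    """
    feeder_current = {}
    for bus, feeder in topology.items():
        feeder_current[feeder] = feeder_current.get(feeder, 0.0) + loads[bus]
    return sum(resistance[f] * i ** 2 for f, i in feeder_current.items())

def reconfigure(candidates, forecast_loads, resistance):
    """Pick the candidate topology with minimum loss under the forecast."""
    return min(candidates, key=lambda t: topology_loss(t, forecast_loads, resistance))
```

Because loss grows with the square of feeder current, spreading the forecasted load across feeders wins whenever resistances are comparable.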

  4. Short-Term Load Forecasting Based Automatic Distribution Network Reconfiguration: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Huaiguang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ding, Fei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhang, Yingchen [National Renewable Energy Laboratory (NREL), Golden, CO (United States)


    In the traditional dynamic network reconfiguration study, the optimal topology is determined at every scheduled time point by using the real load data measured at that time. The development of load forecasting techniques can provide an accurate prediction of the load power at future times and more information about load changes. With the inclusion of load forecasting, the optimal topology can be determined based on the predicted load conditions over a longer time period instead of using a snapshot of the load at the time when the reconfiguration happens, thus providing information to the distribution system operator (DSO) to better operate the system reconfiguration and achieve optimal solutions. This paper therefore proposes a short-term load forecasting based approach for automatically reconfiguring distribution systems in a dynamic and pre-event manner. Specifically, a short-term, high-resolution distribution system load forecasting approach is proposed with a support vector regression (SVR) based forecaster and parallel parameter optimization. The network reconfiguration problem is solved by using the forecasted load continuously to determine the optimal network topology with minimum loss at the future time. The simulation results validate and evaluate the proposed approach.

  5. Label-free sensor for automatic identification of erythrocytes using digital in-line holographic microscopy and machine learning. (United States)

    Go, Taesik; Byeon, Hyeokjun; Lee, Sang Joon


    Cell types of erythrocytes should be identified because they are closely related to their functionality and viability. Conventional methods for classifying erythrocytes are time consuming and labor intensive. Therefore, an automatic and accurate erythrocyte classification system is indispensable in the healthcare and biomedical fields. In this study, we proposed a new label-free sensor for automatic identification of erythrocyte cell types using digital in-line holographic microscopy (DIHM) combined with machine learning algorithms. A total of 12 features, including information on intensity distributions, morphological descriptors, and optical focusing characteristics, are quantitatively obtained from numerically reconstructed holographic images. All individual features for discocytes, echinocytes, and spherocytes are statistically different. To improve the performance of cell type identification, we adopted several machine learning algorithms, including a decision tree model, support vector machine, linear discriminant classification, and k-nearest neighbor classification. With the aid of these machine learning algorithms, the extracted features are effectively utilized to distinguish erythrocytes. Among the four tested algorithms, the decision tree model exhibits the best identification performance for the training sets (n = 440, 98.18%) and test sets (n = 190, 97.37%). This proposed methodology, which smartly combines DIHM and machine learning, would be helpful for sensing abnormal erythrocytes and for computer-aided diagnosis of hematological diseases in the clinic. Copyright © 2017 Elsevier B.V. All rights reserved.
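    A toy stand-in for the trained decision tree gives the flavor of the classifier: an erythrocyte is routed through threshold tests on two of the kinds of features mentioned above (a morphology descriptor and an optical-focus metric). The feature names and thresholds are invented for illustration; the actual model is learned from all 12 holographic features.

```python
def classify_rbc(circularity, focus_spread):
    """Toy two-feature decision tree for erythrocyte type (illustrative
    thresholds, not the trained model from the paper)."""
    if circularity < 0.80:
        return "echinocyte"   # spiky outline lowers circularity
    if focus_spread > 0.5:
        return "spherocyte"   # thicker, rounder cell spreads the optical focus
    return "discocyte"
```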

  6. Automatic Identification of the Repolarization Endpoint by Computing the Dominant T-wave on a Reduced Number of Leads. (United States)

    Giuliani, C; Agostinelli, A; Di Nardo, F; Fioretti, S; Burattini, L


    Electrocardiographic (ECG) T-wave endpoint (Tend) identification suffers from a lack of reliability due to the presence of noise and variability among leads. Tend identification can be improved by using global repolarization waveforms obtained by combining several leads. The dominant T-wave (DTW) is a global repolarization waveform that has been shown to improve Tend identification when computed using the 15 leads (I to III, aVr, aVl, aVf, V1 to V6, X, Y, Z) usually available in clinics, of which only 8 (I, II, V1 to V6) are independent. The aim of the present study was to evaluate whether the 8 independent leads are sufficient to obtain a DTW which allows a reliable Tend identification. To this aim, Tend measures automatically identified from 15-dependent-lead DTWs of 46 control healthy subjects (CHS) and 103 acute myocardial infarction patients (AMIP) were compared with those obtained from 8-independent-lead DTWs. Results indicate that the Tend distributions do not have statistically different median values (CHS: 340 ms vs. 340 ms; AMIP: 325 ms vs. 320 ms), besides being strongly correlated (CHS: ρ=0.97; AMIP: ρ=0.88; both statistically significant). For automatic Tend identification from the DTW, the 8 independent leads can thus be used without a statistically significant loss of accuracy but with a significant decrease in computational effort. The dependence of 7 out of the 15 leads does not introduce a significant bias in the Tend determination from 15-dependent-lead DTWs.
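A dominant waveform of this kind is commonly obtained as the first principal component of the multi-lead matrix; a minimal numpy sketch of that construction on synthetic 8-lead data (illustrative signals, not the study's ECGs):

```python
# Hedged sketch: a global "dominant" repolarization waveform as the
# first singular component of an (n_leads x n_samples) lead matrix.
import numpy as np

def dominant_wave(leads):
    """First principal component of the lead matrix, a common way to
    combine several leads into one global waveform."""
    centered = leads - leads.mean(axis=1, keepdims=True)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return s[0] * vt[0]          # dominant temporal waveform (sign arbitrary)

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 400)
template = np.exp(-((t - 0.5) ** 2) / 0.01)      # T-wave-like bump at t = 0.5
leads8 = np.outer(rng.uniform(0.5, 1.5, 8), template) \
         + rng.normal(0, 0.01, (8, 400))         # 8 scaled noisy copies
dtw = dominant_wave(leads8)
peak = int(np.argmax(np.abs(dtw)))
print("peak sample index:", peak)
```

The Tend marker would then be located on this single waveform rather than on each lead separately.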

  7. Development of an automatic identification method in CsI detectors; Développement d'une méthode automatique d'identification dans les détecteurs CsI

    Energy Technology Data Exchange (ETDEWEB)

    Gourio, D. [Gesellschaft fuer Schwerionenforschung, Planckstrasse 1, D-64291 Darmstadt (Germany); Assenard, M.; Germain, M.; Reposeur, T.; Eudes, P.; Lautridou, P.; Laville, J.L.; Lebrun, C.; Rahmani, A. [Laboratoire de Physique Subatomique et des Technologies Associees - SUBATECH, Centre National de la Recherche Scientifique, 44 - Nantes (France)


    Multidetector arrays make it possible to measure almost all the particles produced in heavy-ion reactions. In particular, INDRA covers 90% of the solid angle and uses some 350 CsI detectors for charged particle detection (Z ≤ 4). Because the data yield from these multidetectors is huge, we developed a first approach to performing the identification procedure automatically for light particles in a CsI scintillator. It is based on pattern recognition, with a final check ensuring the consistency of the result.

  8. Automatic writer identification using connected-component contours and edge-based features of uppercase Western script. (United States)

    Schomaker, Lambert; Bulacu, Marius


    In this paper, a new technique for offline writer identification is presented, using connected-component contours (COCOCOs or CO3s) in uppercase handwritten samples. In our model, the writer is considered to be characterized by a stochastic pattern generator producing a family of connected components for the uppercase character set. Using a codebook of CO3s from an independent training set of 100 writers, the probability-density function (PDF) of CO3s was computed for an independent test set containing 150 unseen writers. Results revealed a high sensitivity of the CO3 PDF for identifying individual writers on the basis of a single sentence of uppercase characters. The proposed automatic approach bridges the gap between image-statistics approaches on one end and manually measured allograph features of individual characters on the other. Combining the CO3 PDF with an independent edge-based orientation and curvature PDF yielded very high correct identification rates.
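The codebook-PDF idea can be sketched as follows: shape descriptors are quantized against a codebook, each writer is represented by the normalized histogram (PDF) of codebook hits, and an unseen sample is attributed to the writer with the nearest PDF. Everything below (descriptor dimension, random codebook, chi-square distance) is an illustrative stand-in, not the paper's pipeline:

```python
# Hedged sketch: writer identification by comparing codebook-usage PDFs.
import numpy as np

rng = np.random.default_rng(3)
codebook = rng.normal(0, 1, (32, 16))      # stand-in for a learned CO3 codebook

def pdf_of(samples, codebook):
    """Normalized histogram (PDF) of nearest-codeword assignments."""
    d = np.linalg.norm(samples[:, None, :] - codebook[None, :, :], axis=2)
    hist = np.bincount(d.argmin(axis=1), minlength=len(codebook))
    return hist / hist.sum()

writer_a = rng.normal(0.5, 1, (200, 16))   # two synthetic "writers" with
writer_b = rng.normal(-0.5, 1, (200, 16))  # different descriptor distributions
pdfs = {"A": pdf_of(writer_a, codebook), "B": pdf_of(writer_b, codebook)}

query = pdf_of(rng.normal(0.5, 1, (80, 16)), codebook)   # unseen sample of writer A
chi2 = {w: np.sum((query - p) ** 2 / (query + p + 1e-12)) for w, p in pdfs.items()}
ident = min(chi2, key=chi2.get)
print("identified writer:", ident)
```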

  9. Contribution to automatic speech recognition. Analysis of the direct acoustical signal. Recognition of isolated words and phoneme identification

    International Nuclear Information System (INIS)

    Dupeyrat, Benoit


    This report deals with the acoustical-phonetic step of automatic speech recognition. The parameters used are the extrema of the acoustical signal, coded in amplitude and duration. This coding method, whose properties are described, is simple and well adapted to digital processing. The quality and intelligibility of the coded signal after reconstruction are particularly satisfactory. An experiment on the automatic recognition of isolated words has been carried out using this coding system. We have designed a filtering algorithm operating on the parameters of the coding. Thus the characteristics of the formants can be derived under certain conditions, which are discussed. Using these characteristics, the identification of a large part of the phonemes for a given speaker was achieved. Carrying on these studies required the development of a particular real-time processing methodology which allowed immediate evaluation of program improvements. Such processing on temporal coding of the acoustical signal is extremely powerful and, used in connection with other methods, could represent an efficient tool for automatic speech processing. (author) [fr
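The extrema coding described above reduces the signal to its local maxima and minima, each coded by amplitude and duration. A minimal sketch of that reduction (illustrative, not the report's implementation):

```python
# Hedged sketch: code a signal by its extrema, each as (amplitude,
# duration-in-samples since the previous extremum).
import numpy as np

def extrema_code(x):
    codes, last = [], 0
    for i in range(1, len(x) - 1):
        if (x[i] - x[i - 1]) * (x[i + 1] - x[i]) < 0:   # slope changes sign
            codes.append((float(x[i]), i - last))        # (amplitude, duration)
            last = i
    return codes

t = np.linspace(0, 1, 100, endpoint=False)
wave = np.sin(2 * np.pi * 3 * t)                # 3 cycles -> 6 extrema
codes = extrema_code(wave)
print("number of extrema:", len(codes))
```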

  10. Automatic identification of watercourses in flat and engineered landscapes by computing the skeleton of a LiDAR point cloud (United States)

    Broersen, Tom; Peters, Ravi; Ledoux, Hugo


    Drainage networks play a crucial role in protecting land against floods. It is therefore important to have an accurate map of the watercourses that form the drainage network. Previous work on the automatic identification of watercourses was typically based on grids, focused on natural landscapes, and used mostly the slope and curvature of the terrain. We focus in this paper on areas characterised by low-lying, flat, and engineered landscapes, such as those typical of the Netherlands. We propose a new methodology to identify watercourses automatically from elevation data; it uses solely a raw classified LiDAR point cloud as input. We show that by computing a skeleton of the point cloud twice (once in 2D and once in 3D), and by using the properties of the skeletons, we can identify most of the watercourses. We have implemented our methodology and tested it for three different soil types around Utrecht, the Netherlands. We were able to detect 98% of the watercourses for one soil type, and around 75% in the worst case, when compared to a reference dataset that was obtained semi-automatically.

  11. Maritime over the Horizon Sensor Integration: High Frequency Surface-Wave-Radar and Automatic Identification System Data Integration Algorithm. (United States)

    Nikolic, Dejan; Stojkovic, Nikola; Lekic, Nikola


    Obtaining the complete operational picture of the maritime situation in an Exclusive Economic Zone (EEZ) which lies over the horizon (OTH) requires the integration of data obtained from various sensors. These sensors include high frequency surface-wave-radar (HFSWR), the satellite automatic identification system (SAIS) and the land automatic identification system (LAIS). The algorithm proposed in this paper utilizes radar tracks obtained from a network of HFSWRs, already processed by a multi-target tracking algorithm, and associates SAIS and LAIS data with the corresponding radar tracks, thus forming integrated data pairs. During the integration process, all HFSWR targets in the vicinity of an AIS report are evaluated, and the one with the highest matching factor is used for data association. Conversely, if there are multiple AIS reports in the vicinity of a single HFSWR track, the algorithm still forms only one data pair, consisting of the AIS and HFSWR data with the highest mutual matching factor. During design and testing, special attention was given to the latency of AIS data, which can be very high in the EEZs of developing countries. The algorithm was designed, implemented and tested in a real working environment. The testing environment is located in the Gulf of Guinea and includes a network of two HFSWRs, several coastal sites with LAIS receivers, and SAIS data supplied by an external provider.
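The best-mutual-match rule described above can be sketched in a few lines. The matching factor here is a simple inverse-distance score and the vicinity gate is an arbitrary radius; both are illustrative assumptions, not the paper's scoring function:

```python
# Hedged sketch: associate each AIS report with its nearest radar track,
# then keep only the pair with the highest mutual matching factor when
# several AIS reports claim the same track.
import numpy as np

def associate(radar_tracks, ais_reports, gate=5.0):
    """Return (radar_index, ais_index) pairs, one best match per track."""
    pairs = []
    for j, ais in enumerate(ais_reports):
        d = np.linalg.norm(radar_tracks - ais, axis=1)
        i = int(d.argmin())
        if d[i] <= gate:                               # inside the vicinity gate
            pairs.append((i, j, 1.0 / (1.0 + d[i])))   # illustrative matching factor
    best = {}
    for i, j, score in pairs:                          # keep highest-scoring pair per track
        if i not in best or score > best[i][1]:
            best[i] = (j, score)
    return [(i, j) for i, (j, _) in best.items()]

radar = np.array([[0.0, 0.0], [10.0, 10.0]])
ais = np.array([[0.5, 0.2], [0.4, 0.1], [10.2, 9.9]])
result = associate(radar, ais)
print(result)
```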

  12. Algorithms for the automatic identification of MARFEs and UFOs in JET database of visible camera videos

    International Nuclear Information System (INIS)

    Murari, A.; Camplani, M.; Cannas, B.; Usai, P.; Mazon, D.; Delaunay, F.


    MARFE instabilities and UFOs leave clear signatures in JET fast visible camera videos. Given the potentially harmful consequences of these events, particularly as triggers of disruptions, it would be important to have the means of detecting them automatically. In this paper, the results of various algorithms to automatically identify MARFEs and UFOs in JET visible videos are reported. The objective is to retrieve the videos which have captured these events by exploring the whole JET database of images, as a preliminary step towards the development of real-time identifiers in the future. For the detection of MARFEs, a complete identifier has been finalized, using morphological operators and Hu moments. The final algorithm manages to identify the videos with MARFEs with a success rate exceeding 80%. Due to the lack of a complete statistics of examples, the UFO identifier is less developed, but a preliminary code can detect UFOs quite reliably. (authors)
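Hu moments, the shape descriptors named above, are invariants computed from normalized central moments of an image. A sketch of the first two invariants on a synthetic binary frame (illustrative data, not JET camera output):

```python
# Hedged sketch: Hu's first two moment invariants from normalized
# central moments; translation leaves them unchanged.
import numpy as np

def hu_first_two(img):
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00
    def mu(p, q):                      # central moment of order (p, q)
        return (((x - xc) ** p) * ((y - yc) ** q) * img).sum()
    def eta(p, q):                     # normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)
    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return h1, h2

blob = np.zeros((64, 64))
blob[20:40, 10:50] = 1.0               # elongated bright region
shifted = np.roll(blob, (5, 5), axis=(0, 1))
print(hu_first_two(blob), hu_first_two(shifted))
```

Because the invariants are stable under translation, scale, and rotation, they make compact features for recognizing a recurring shape such as a MARFE signature across frames.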

  13. Automatic identification of cell files in light microscopic images of conifer wood


    Kennel, Pol; Subsol, Gérard; Guéroult, Michaël; Borianne, Philippe


    In this paper, we present an automatic method to recognize cell files in light microscopic images of conifer wood. This original method is decomposed into three steps: a segmentation step, which extracts some anatomical structures in the image; a classification step, which identifies the cells of interest among these structures; and the cell file recognition step. Some preliminary results obtained on several species of conifers are presented and analyzed.

  14. The Effects of Degraded Vision and Automatic Combat Identification Reliability on Infantry Friendly Fire Engagements


    Kogler, Timothy Michael


    Fratricide is one of the most devastating consequences of any military conflict. Target identification failures have been identified as the last link in a chain of mistakes that can lead to fratricide. Other links include weapon and equipment malfunctions; command, control, and communication failures; navigation failures; fire discipline failures; and situation awareness failures. This research examined the effects of degraded vision and automatic combat identification reliability on the time-stress...

  15. Automatic classification of long-term ambulatory ECG records according to type of ischemic heart disease

    Directory of Open Access Journals (Sweden)

    Smrdel Aleš


    Background: Elevated transient ischemic ST segment episodes in ambulatory electrocardiographic (AECG) records generally appear in patients with transmural ischemia (e.g. Prinzmetal's angina), while depressed ischemic episodes appear in patients with subendocardial ischemia (e.g. unstable or stable angina). The huge amount of AECG data necessitates automatic methods for analysis. We present an algorithm which determines the type of transient ischemic episodes in the leads of records (elevations/depressions) and classifies AECG records according to the type of ischemic heart disease (Prinzmetal's angina; coronary artery diseases excluding Prinzmetal's angina; other heart diseases). Methods: The algorithm was developed using 24-hour AECG records of the Long Term ST Database (LTST DB). The algorithm robustly generates an ST segment level function in each AECG lead of the records, and tracks time-varying non-ischemic ST segment changes, such as slow drifts and axis shifts, to construct the ST segment reference function. The ST segment reference function is then subtracted from the ST segment level function to obtain the ST segment deviation function. Using the third statistical moment of the histogram of the ST segment deviation function, the algorithm determines the deflections of leads according to the type of ischemic episodes present (elevations, depressions), and then classifies records according to the type of ischemic heart disease. Results: Using 74 records of the LTST DB (containing elevated or depressed ischemic episodes, mixed ischemic episodes, or no episodes), the algorithm correctly determined the deflections of the majority of the leads of the records, correctly classified the majority of the records with Prinzmetal's angina into the Prinzmetal's angina category (7 out of 8), classified the majority of the records with other coronary artery diseases into the coronary artery diseases excluding Prinzmetal's angina category (47 out of 55), and correctly
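The deflection test based on the third statistical moment can be sketched directly: a right-skewed ST deviation histogram indicates predominantly elevated episodes, a left-skewed one depressed episodes. The sample values and magnitudes below are illustrative, not LTST DB data:

```python
# Hedged sketch: classify a lead's deflection by the sign of the third
# central moment (skewness) of its ST deviation distribution.
import numpy as np

def deflection(st_deviation):
    """'elevation' if the deviation histogram is right-skewed, else 'depression'."""
    d = np.asarray(st_deviation, dtype=float)
    m3 = np.mean((d - d.mean()) ** 3)        # third central moment
    return "elevation" if m3 > 0 else "depression"

rng = np.random.default_rng(4)
baseline = rng.normal(0, 5, 5000)                               # non-ischemic samples
elevated = np.concatenate([baseline, rng.normal(120, 20, 300)])  # transient ST rises
depressed = np.concatenate([baseline, rng.normal(-120, 20, 300)])
print(deflection(elevated), deflection(depressed))
```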

  16. Automatic NMR-based identification of chemical reaction types in mixtures of co-occurring reactions.

    Directory of Open Access Journals (Sweden)

    Diogo A R S Latino

    The combination of chemoinformatics approaches with NMR techniques and the increasing availability of data allow the resolution of problems far beyond the original application of NMR in structure elucidation/verification. The diversity of applications can range from process monitoring, metabolic profiling, and authentication of products to quality control. An application related to the automatic analysis of complex mixtures concerns mixtures of chemical reactions. We encoded mixtures of chemical reactions with the difference between the (1)H NMR spectra of the products and the reactants. All the signals arising from all the reactants of the co-occurring reactions were taken together (a simulated spectrum of the mixture of reactants), and the same was done for products. The difference spectrum is taken as the representation of the mixture of chemical reactions. A data set of 181 chemical reactions was used, each reaction manually assigned to one of 6 types. From this dataset, we simulated mixtures where two reactions of different types would occur simultaneously. Automatic learning methods were trained to classify the reactions occurring in a mixture from the (1)H NMR-based descriptor of the mixture. Unsupervised learning methods (self-organizing maps) produced a reasonable clustering of the mixtures by reaction type, and allowed the correct classification of 80% and 63% of the mixtures in two independent test sets of different similarity to the training set. With random forests (RF), the percentage of correct classifications was increased to 99% and 80% for the same test sets. The RF probability associated with the predictions yielded a robust indication of their reliability. This study demonstrates the possibility of applying machine learning methods to automatically identify types of co-occurring chemical reactions from NMR data. Using no explicit structural information about the reaction participants, reaction elucidation is performed without structure

  17. Automatic NMR-based identification of chemical reaction types in mixtures of co-occurring reactions. (United States)

    Latino, Diogo A R S; Aires-de-Sousa, João


    The combination of chemoinformatics approaches with NMR techniques and the increasing availability of data allow the resolution of problems far beyond the original application of NMR in structure elucidation/verification. The diversity of applications can range from process monitoring, metabolic profiling, and authentication of products to quality control. An application related to the automatic analysis of complex mixtures concerns mixtures of chemical reactions. We encoded mixtures of chemical reactions with the difference between the (1)H NMR spectra of the products and the reactants. All the signals arising from all the reactants of the co-occurring reactions were taken together (a simulated spectrum of the mixture of reactants) and the same was done for products. The difference spectrum is taken as the representation of the mixture of chemical reactions. A data set of 181 chemical reactions was used, each reaction manually assigned to one of 6 types. From this dataset, we simulated mixtures where two reactions of different types would occur simultaneously. Automatic learning methods were trained to classify the reactions occurring in a mixture from the (1)H NMR-based descriptor of the mixture. Unsupervised learning methods (self-organizing maps) produced a reasonable clustering of the mixtures by reaction type, and allowed the correct classification of 80% and 63% of the mixtures in two independent test sets of different similarity to the training set. With random forests (RF), the percentage of correct classifications was increased to 99% and 80% for the same test sets. The RF probability associated with the predictions yielded a robust indication of their reliability. This study demonstrates the possibility of applying machine learning methods to automatically identify types of co-occurring chemical reactions from NMR data. Using no explicit structural information about the reaction participants, reaction elucidation is performed without structure elucidation of
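The difference-descriptor plus random-forest step can be sketched with synthetic stand-ins: each "mixture" is a difference vector (products spectrum minus reactants spectrum) formed by summing two per-type signatures, and a random forest learns the unordered pair of types. None of the data below is real NMR:

```python
# Hedged sketch: classify which pair of reaction types co-occurs from a
# difference-spectrum descriptor, using a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)
n_bins, n_types = 64, 3
signatures = rng.normal(0, 1, (n_types, n_bins))     # per-type difference signature

def simulate_mixture(t1, t2):
    """Two co-occurring reactions: sum of their signatures plus noise."""
    return signatures[t1] + signatures[t2] + rng.normal(0, 0.3, n_bins)

pairs = [(a, b) for a in range(n_types) for b in range(a + 1, n_types)]
X = np.array([simulate_mixture(a, b) for a, b in pairs for _ in range(60)])
y = np.repeat(np.arange(len(pairs)), 60)             # label = unordered type pair

rf = RandomForestClassifier(n_estimators=100, oob_score=True, random_state=0)
rf.fit(X, y)
print("out-of-bag accuracy:", round(rf.oob_score_, 2))
```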

  18. Testing the algorithms for automatic identification of errors on the measured quantities of the nuclear power plant. Verification tests

    International Nuclear Information System (INIS)

    Svatek, J.


    During the development and implementation of supporting software for the control room and emergency control centre at the Dukovany nuclear power plant it appeared necessary to validate the input quantities in order to assure operating reliability of the software tools. Therefore, the development of software for validation of the measured quantities of the plant data sources was initiated, and the software had to be debugged and verified. The report contains the proposal for and description of the verification tests for testing the algorithms of automatic identification of errors on the observed quantities of the NPP by means of homemade validation software. In particular, the algorithms treated serve the validation of the hot leg temperature at primary circuit loop no. 2 or 4 at the Dukovany-2 reactor unit using data from the URAN and VK3 information systems, recorded during 3 different days. (author)

  19. Automatic untargeted metabolic profiling analysis coupled with Chemometrics for improving metabolite identification quality to enhance geographical origin discrimination capability. (United States)

    Han, Lu; Zhang, Yue-Ming; Song, Jing-Jing; Fan, Mei-Juan; Yu, Yong-Jie; Liu, Ping-Ping; Zheng, Qing-Xia; Chen, Qian-Si; Bai, Chang-Cai; Sun, Tao; She, Yuan-Bin


    Untargeted metabolic profiling analysis is employed to screen metabolites for specific purposes, such as geographical origin discrimination. However, the data analysis remains a challenging task. In this work, a new automatic untargeted metabolic profiling analysis coupled with a chemometric strategy was developed to improve the metabolite identification results and to enhance the geographical origin discrimination capability. Automatic untargeted metabolic profiling analysis with chemometrics (AuMPAC) was used to screen the total ion chromatographic (TIC) peaks that showed significant differences among the geographical regions. A chemometric peak resolution strategy was then employed for the screened TIC peaks. The retrieved components were further analyzed using ANOVA, and those that showed significant differences were used to build a geographical origin discrimination model using two-way encoding partial least squares. To demonstrate its performance, a geographical origin discrimination of flaxseed samples from six geographical regions in China was conducted, and 18 TIC peaks were screened. A total of 19 significantly different metabolites were obtained after the peak resolution. The accuracy of the geographical origin discrimination was up to 98%. A comparison of AuMPAC, AMDIS, and XCMS indicated that AuMPAC obtained the best geographical origin discrimination results. In conclusion, AuMPAC provides another method for data analysis. Copyright © 2018 Elsevier B.V. All rights reserved.
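The screening step, a one-way ANOVA across regions for each TIC peak, can be sketched as follows. The data, peak count, and significance threshold are illustrative stand-ins:

```python
# Hedged sketch: keep the peaks whose intensities differ significantly
# across geographical regions (one-way ANOVA per peak).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(6)
n_regions, n_samples, n_peaks = 6, 15, 40
region_means = rng.normal(0, 1, (n_regions, n_peaks))
region_means[:, 20:] = 0                 # last 20 peaks carry no origin signal

data = [m + rng.normal(0, 0.5, (n_samples, n_peaks)) for m in region_means]

selected = [p for p in range(n_peaks)
            if f_oneway(*[d[:, p] for d in data]).pvalue < 0.01]
print("screened peaks:", len(selected))
```

Only the screened peaks would then be passed to peak resolution and the discrimination model.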

  20. Semi-automatic identification of punching areas for tissue microarray building: the tubular breast cancer pilot study

    Directory of Open Access Journals (Sweden)

    Beltrame Francesco


    Background: Tissue MicroArray technology aims to perform immunohistochemical staining on hundreds of different tissue samples simultaneously. It allows faster analysis, considerably reducing the costs incurred in staining. A time-consuming phase of the methodology is the selection of tissue areas within the paraffin blocks: no utilities have been developed for the identification of areas to be punched from the donor block and assembled in the recipient block. Results: The presented work supports, in the specific case of a primary subtype of breast cancer (tubular breast cancer), the semi-automatic discrimination and localization between normal and pathological regions within the tissues. The diagnosis is performed by analysing specific morphological features of the sample, such as the absence of a double layer of cells around the lumen and the decay of a regular glands-and-lobules structure. These features are analysed using an algorithm which extracts morphological parameters from images and compares them to experimentally validated threshold values. Results are satisfactory, since in most of the cases the automatic diagnosis matches the response of the pathologists. In particular, on a total of 1296 sub-images showing normal and pathological areas of breast specimens, algorithm accuracy, sensitivity and specificity are 89%, 84% and 94%, respectively. Conclusions: The proposed work is a first attempt to demonstrate that automation in the Tissue MicroArray field is feasible and can represent an important tool for scientists to cope with this high-throughput technique.

  1. A computer program for automatic gamma-ray spectra analysis with isotope identification for the purpose of activation analysis

    International Nuclear Information System (INIS)

    Weigel, H.; Dauk, J.


    A FORTRAN IV program for a PDP-9 computer with 16K storage capacity has been developed that performs automatic analysis of complex gamma spectra taken with Ge(Li) detectors. It searches for full-energy peaks and evaluates the peak areas. The program also features an automatically performed isotope identification. It is written in such a flexible manner that, after reactor irradiation, spectra from samples of any composition can be evaluated for activation analysis. The peak search routine is based on the following criteria: the counting rate has to increase for two successive channels; and the amplitude of the corresponding maximum has to be greater than or equal to F1 times the statistical error of the counting rate in the valley just before the maximum. In order to detect superimposed peaks, it is assumed that the dependence of the FWHM on the channel number is roughly approximated by a linear function, and the actual and "theoretical" FWHM values are compared. To determine the net peak area, a Gaussian-based function is fitted to each peak. The isotope identification is based on the procedure developed by ADAMS and DAMS. (T.G.)
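The two quoted peak-search criteria translate almost directly into code. In the sketch below the valley lookback window and the F1 value are illustrative assumptions, and the spectrum is synthetic:

```python
# Hedged sketch: flag channel i as a peak if the counting rate rose over
# two successive channels and the maximum exceeds the preceding valley by
# F1 times the valley's statistical (sqrt of counts) error.
import numpy as np

def find_peaks(counts, f1=3.0, lookback=10):
    peaks = []
    for i in range(2, len(counts) - 1):
        rising = counts[i - 2] < counts[i - 1] < counts[i]
        is_max = counts[i] > counts[i + 1]
        if rising and is_max:
            valley = counts[max(0, i - lookback):i].min()  # valley before the maximum
            if counts[i] - valley >= f1 * np.sqrt(max(valley, 1)):
                peaks.append(i)
    return peaks

channel = np.arange(200)
spectrum = 50 + 400 * np.exp(-((channel - 80) ** 2) / 8)  # flat background + one peak
counts = np.random.default_rng(7).poisson(spectrum)       # Poisson counting noise
peaks = find_peaks(counts)
print("peak channels:", peaks)
```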

  2. Automatic machine-learning based identification of jogging periods from accelerometer measurements of adolescents under field conditions. (United States)

    Zdravevski, Eftim; Risteska Stojkoska, Biljana; Standl, Marie; Schulz, Holger


    The assessment of health benefits associated with physical activity depends on the activity duration, intensity and frequency; their correct identification is therefore very valuable and important in epidemiological and clinical studies. The aims of this study are to develop an algorithm for automatic identification of intended jogging periods, and to assess whether the identification performance is improved when using two accelerometers at the hip and ankle, compared to using only one at either position. The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions as part of the GINIplus study. The data were obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was performed to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: logistic regression, support vector machines, random forest and extremely randomized trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. The reported jogging periods were verified by visual inspection and used as the gold standard. After the feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60 s or 180 s. The best matching ratio, i.e. the length of correctly identified jogging periods related to the total time including the missed ones, was up to 0.875. It could be additionally improved up to 0.967 by application of post-classification rules which considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers; rather, almost the same performance could be achieved from
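The windowing and classification stage can be sketched as follows: acceleration is cut into fixed-length windows, simple statistical features are extracted per window, and a classifier labels each window. The signal model, window length and feature set below are illustrative stand-ins for the study's pipeline:

```python
# Hedged sketch: sliding-window features + random forest to label
# accelerometer windows as jogging (1) or not (0).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(8)
fs, win = 10, 60                                   # 10 Hz sampling, 60 s windows

def episode(active, seconds):
    """Synthetic acceleration magnitude: big oscillation while jogging."""
    amp = 2.0 if active else 0.2
    n = seconds * fs
    return amp * np.abs(np.sin(2 * np.pi * 2.5 * np.arange(n) / fs)) \
           + rng.normal(0, 0.05, n)

signal = np.concatenate([episode(False, 600), episode(True, 600), episode(False, 600)])
labels = np.repeat([0, 1, 0], 600 // win)          # one label per window

windows = signal.reshape(-1, win * fs)
feats = np.c_[windows.mean(1), windows.std(1), np.abs(np.diff(windows)).mean(1)]

clf = RandomForestClassifier(random_state=0).fit(feats[::2], labels[::2])
acc = clf.score(feats[1::2], labels[1::2])
print("held-out window accuracy:", acc)
```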

  3. Integration of onshore and offshore seismic arrays to study the seismicity of the Calabrian Region: a two steps automatic procedure for the identification of the best stations geometry


    D’Alessandro, A; Guerra, I; D’Anna, G; Gervasi, A; Harabaglia, P; Luzio, D; Stellato, G


    We plan to deploy several ocean-bottom broadband seismometers with hydrophones in the Taranto Gulf. Our aim is to investigate the offshore seismicity of the Sibari Gulf. The seismographic network optimization consists of identifying the optimal sites for the installation of the offshore stations, which is a crucial factor for the success of the monitoring campaign. In this paper, we propose a two-step automatic procedure for the identification of the best station geometry.

  4. Analysis and Development of FACE Automatic Apparatus for Rapid Identification of Transuranium Isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Sebesta, Edward Henri [Univ. of California, Berkeley, CA (United States)


    A description of and operating manual for the FACE Automatic Apparatus have been written, along with documentation of the FACE machine operating program, to provide a user manual for the FACE Automatic Apparatus. In addition, FACE machine performance was investigated to improve transuranium throughput. An analysis of the causes of transuranium isotope loss, both chemical and radioactive, was undertaken. To lower the radioactive loss, the dynamics of the most time-consuming step of the FACE machine, the drying and flaming of the chromatographic column output droplets in preparation of the sample for alpha spectroscopy and counting, was investigated. A series of droplets was dried in an experimental apparatus, demonstrating that droplets could be dried significantly faster through more intense heating, enabling the FACE machine cycle to be shortened by 30-60 seconds. Proposals incorporating these ideas were provided for FACE machine development. The 66% chemical loss of product was analyzed, and changes were proposed to reduce the radioisotope product loss. An analysis of the chromatographic column was also provided. All operating steps in the FACE machine are described and analyzed to provide a complete guide, along with the proposals for machine improvement.

  5. Automatic identification of comparative effectiveness research from medline citations to support clinicians' treatment information needs. (United States)

    Zhang, Mingyuan; Del Fiol, Guilherme; Grout, Randall W; Jonnalagadda, Siddhartha; Medlin, Richard; Mishra, Rashmi; Weir, Charlene; Liu, Hongfang; Mostafa, Javed; Fiszman, Marcelo


    Online knowledge resources such as Medline can address most clinicians' patient care information needs. Yet, significant barriers, notably lack of time, limit the use of these sources at the point of care. The most common information needs raised by clinicians are treatment-related. Comparative effectiveness studies allow clinicians to consider multiple treatment alternatives for a particular problem. Still, solutions are needed to enable efficient and effective consumption of comparative effectiveness research at the point of care. Design and assess an algorithm for automatically identifying comparative effectiveness studies and extracting the interventions investigated in these studies. The algorithm combines semantic natural language processing, Medline citation metadata, and machine learning techniques. We assessed the algorithm in a case study of treatment alternatives for depression. Both precision and recall for identifying comparative studies was 0.83. A total of 86% of the interventions extracted perfectly or partially matched the gold standard. Overall, the algorithm achieved reasonable performance. The method provides building blocks for the automatic summarization of comparative effectiveness research to inform point of care decision-making.

  6. Automatic identification of comparative effectiveness research from Medline citations to support clinicians’ treatment information needs (United States)

    Zhang, Mingyuan; Fiol, Guilherme Del; Grout, Randall W.; Jonnalagadda, Siddhartha; Medlin, Richard; Mishra, Rashmi; Weir, Charlene; Liu, Hongfang; Mostafa, Javed; Fiszman, Marcelo


    Online knowledge resources such as Medline can address most clinicians’ patient care information needs. Yet, significant barriers, notably lack of time, limit the use of these sources at the point of care. The most common information needs raised by clinicians are treatment-related. Comparative effectiveness studies allow clinicians to consider multiple treatment alternatives for a particular problem. Still, solutions are needed to enable efficient and effective consumption of comparative effectiveness research at the point of care. Objective: Design and assess an algorithm for automatically identifying comparative effectiveness studies and extracting the interventions investigated in these studies. Methods: The algorithm combines semantic natural language processing, Medline citation metadata, and machine learning techniques. We assessed the algorithm in a case study of treatment alternatives for depression. Results: Both precision and recall for identifying comparative studies was 0.83. A total of 86% of the interventions extracted perfectly or partially matched the gold standard. Conclusion: Overall, the algorithm achieved reasonable performance. The method provides building blocks for the automatic summarization of comparative effectiveness research to inform point of care decision-making. PMID:23920677

  7. Automatic identification of bird targets with radar via patterns produced by wing flapping

    NARCIS (Netherlands)

    Zaugg, S.; Saporta, G.; van Loon, E.; Schmaljohann, H.; Liechti, F.


    Bird identification with radar is important for bird migration research, environmental impact assessments (e.g. wind farms), aircraft security and radar meteorology. In a study on bird migration, radar signals from birds, insects and ground clutter were recorded. Signals from birds show a typical

  8. Accident identification system with automatic detection of abnormal condition using quantum computation

    Energy Technology Data Exchange (ETDEWEB)

    Nicolau, Andressa dos Santos; Schirru, Roberto [Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, RJ (Brazil); Lima, Alan Miranda Monteiro de [Coordenacao dos Programas de Pos-Graduacao em Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil)


    Transient identification systems have been proposed in order to keep the plant operating in safe conditions and to help operators make decisions, with maximum associated certainty, within the short time interval of an emergency. This article presents a time-independent system for transient/accident identification in a pressurized water nuclear reactor (PWR) that does not require an event (a reactor scram, for instance) to serve as the starting point t = 0. The model was developed to recognize the normal condition and three accidents from the design basis list of the Nuclear Power Plant Angra 2, postulated in the Final Safety Analysis Report (FSAR). Several sets of process variables were evaluated in order to establish a minimum set considered necessary and sufficient. The optimization step of the identification algorithm is based upon the paradigm of Quantum Computing: the optimization metaheuristic Quantum Inspired Evolutionary Algorithm (QEA) was implemented and works as a data mining tool. The results obtained with the QEA without the time variable are comparable to the techniques in the reference literature for the transient identification problem, with less computational effort (number of evaluations). The system robustly approximates the ideal solution, the Voronoi vectors, with only one partition per accident class. (author)

  9. Accident identification system with automatic detection of abnormal condition using quantum computation

    International Nuclear Information System (INIS)

    Nicolau, Andressa dos Santos; Schirru, Roberto; Lima, Alan Miranda Monteiro de


    Transient identification systems have been proposed in order to keep the plant operating in safe conditions and to help operators make decisions, with maximum associated certainty, within the short time interval of an emergency. This article presents a time-independent system for transient/accident identification in a pressurized water nuclear reactor (PWR) that does not require an event (a reactor scram, for instance) to serve as the starting point t = 0. The model was developed to recognize the normal condition and three accidents from the design basis list of the Nuclear Power Plant Angra 2, postulated in the Final Safety Analysis Report (FSAR). Several sets of process variables were evaluated in order to establish a minimum set considered necessary and sufficient. The optimization step of the identification algorithm is based upon the paradigm of Quantum Computing: the optimization metaheuristic Quantum Inspired Evolutionary Algorithm (QEA) was implemented and works as a data mining tool. The results obtained with the QEA without the time variable are comparable to the techniques in the reference literature for the transient identification problem, with less computational effort (number of evaluations). The system robustly approximates the ideal solution, the Voronoi vectors, with only one partition per accident class. (author)

  10. Automatic feed phase identification in multivariate bioprocess profiles by sequential binary classification. (United States)

    Nikzad-Langerodi, Ramin; Lughofer, Edwin; Saminger-Platz, Susanne; Zahel, Thomas; Sagmeister, Patrick; Herwig, Christoph


    In this paper, we propose a new strategy for retrospective identification of feed phases from online sensor-data enriched feed profiles of an Escherichia coli (E. coli) fed-batch fermentation process. In contrast to conventional (static), data-driven multi-class machine learning (ML), we exploit process knowledge in order to constrain our classification system, yielding more parsimonious models compared to static ML approaches. In particular, we enforce unidirectionality on a set of binary, multivariate classifiers trained to discriminate between adjacent feed phases by linking the classifiers through a one-way switch. The switch is activated when the actual classifier output changes. As a consequence, the next binary classifier in the classifier chain is used for the discrimination between the next feed phase pair, and so on. We allow activation of the switch only after a predefined number of consecutive predictions of a transition event in order to prevent premature activation of the switch, and we undertake a sensitivity analysis regarding the optimal choice of this (time) lag parameter. From a complexity/parsimony perspective, the benefit of our approach is three-fold: i) the multi-class learning task is broken down into binary subproblems which usually have simpler decision surfaces and tend to be less susceptible to the class-imbalance problem; ii) we exploit the fact that the process follows a rigid feed cycle structure (i.e. batch-feed-batch-feed), which allows us to focus on the subproblems involving phase transitions as they occur during the process while discarding off-transition classifiers; and iii) only one binary classifier is active at a time, which keeps effective model complexity low. We further use a combination of logistic regression and Lasso (i.e. regularized logistic regression, RLR) as a wrapper to extract the most relevant features for individual subproblems from the whole set of high-dimensional sensor data. We train different soft computing classifiers
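The one-way switch with a consecutive-prediction lag described above can be sketched in a few lines; the toy classifier and threshold below are illustrative assumptions, not the authors' implementation:

```python
class PhaseChain:
    """Chain of binary phase classifiers linked by a one-way switch.

    Only one classifier is active at a time; after `lag` consecutive
    'transition' predictions, control passes irreversibly to the next one.
    """
    def __init__(self, classifiers, lag=3):
        self.classifiers = classifiers  # each maps a feature vector -> 0 (same phase) or 1 (transition)
        self.lag = lag
        self.active = 0   # index of the currently active binary classifier = current phase
        self.streak = 0   # consecutive transition predictions seen so far

    def update(self, x):
        if self.active >= len(self.classifiers):
            return self.active          # final phase reached, nothing left to switch
        if self.classifiers[self.active](x) == 1:
            self.streak += 1
            if self.streak >= self.lag:  # debounce: prevents premature switching
                self.active += 1
                self.streak = 0
        else:
            self.streak = 0             # the transition must be sustained
        return self.active

# Toy example: one binary classifier that fires when a scalar feature exceeds 5.
chain = PhaseChain([lambda x: int(x > 5)], lag=2)
phases = [chain.update(x) for x in [1, 6, 2, 6, 7, 8]]
# A single brief excursion above 5 does not switch the phase; two in a row do.
```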

  11. Automatic extraction and identification of users' responses in Facebook medical quizzes. (United States)

    Rodríguez-González, Alejandro; Menasalvas Ruiz, Ernestina; Mayer Pujadas, Miguel A


    In the last few years the use of social media in medicine has grown exponentially, providing a new area of research based on the analysis and use of Web 2.0 capabilities. In addition, the use of social media in medical education is a subject of particular interest which has been addressed in several studies. One example of this application is the medical quizzes of The New England Journal of Medicine (NEJM), which regularly publishes a set of questions through its Facebook timeline. We present an approach for the automatic extraction of medical quizzes and their associated answers on a Facebook platform by means of a set of computer-based methods and algorithms. We have developed a tool, based on a set of computer-based methods and algorithms implemented in Java, for the extraction and analysis of the medical quizzes stored on the timeline of the NEJM Facebook page. The system is divided into two main modules: Crawler and Data retrieval. The system was launched on December 31, 2014 and crawled through a total of 3004 valid posts and 200,081 valid comments. The first post was dated July 23, 2009 and the last one December 30, 2014. 285 quizzes were analyzed, with 32,780 different users providing answers to the aforementioned quizzes. Patterns were found in 261 of the 285 quizzes (91.58%). Among these 261 quizzes where trends were found, users followed trends of incorrect answers in 13 quizzes and trends of correct answers in 248. The tool is capable of automatically identifying the correct and incorrect answers to quizzes provided in text format in Facebook posts, with a small rate of false negative cases, and the approach could be applicable to the extraction and analysis of other information sources on the Internet after some adaptation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  12. Source term identification in atmospheric modelling via sparse optimization (United States)

    Adam, Lukas; Branda, Martin; Hamburger, Thomas


    Inverse modelling plays an important role in identifying the amount of harmful substances released into the atmosphere during major incidents such as power plant accidents or volcano eruptions. Another possible application of inverse modelling lies in monitoring CO2 emission limits, where only observations at certain places are available and the task is to estimate the total releases at given locations. This gives rise to minimizing the discrepancy between the observations and the model predictions. There are two standard ways of solving such problems. In the first one, the discrepancy is regularized by adding additional terms; such terms may include Tikhonov regularization, distance from a priori information or a smoothing term. The resulting, usually quadratic, problem is then solved via standard optimization solvers. The second approach assumes that the error term follows a (normal) distribution and makes use of Bayesian modelling to identify the source term. Instead of following the above-mentioned approaches, we utilize techniques from the field of compressive sensing. Such techniques look for the sparsest solution (the solution with the smallest number of nonzeros) of a linear system, where a maximal allowed error term may be added to this system. Even though this is a well-developed field with many possible solution techniques, most of them do not consider even the simplest constraints which are naturally present in atmospheric modelling; one such example is the nonnegativity of release amounts. We believe that the concept of a sparse solution is natural both in the problem of identifying the source location and in that of identifying the time process of the source release. In the first case, it is usually assumed that there are only a few release points and the task is to find them. In the second case, the time window is usually much longer than the duration of the actual release. In both cases, the optimal solution should contain a large amount of zeros, giving rise to the
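A minimal sketch of sparse recovery with the nonnegativity constraint discussed above, using projected iterative shrinkage-thresholding (ISTA) on a toy sensor/source system; the matrix, penalty and step size are illustrative choices, not the authors' method:

```python
def nonneg_ista(A, b, lam=0.1, step=0.1, iters=2000):
    """Projected ISTA for min (1/2)||Ax - b||^2 + lam*sum(x) subject to x >= 0.

    The l1 term promotes the sparse release profiles the record argues for;
    the clipping enforces nonnegativity of release amounts. Plain-Python
    linear algebra, for illustration only.
    """
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]  # residual Ax - b
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]          # gradient A^T r
        # gradient step, l1 shrinkage (for x >= 0 it is just a shift), projection onto x >= 0
        x = [max(0.0, x[j] - step * (g[j] + lam)) for j in range(n)]
    return x

# Toy problem: two sensors, three candidate sources; only source 0 actually released.
A = [[1.0, 0.2, 0.1],
     [0.3, 1.0, 0.1]]
b = [1.0, 0.3]          # observations consistent with x = [1, 0, 0]
x = nonneg_ista(A, b)   # the recovered profile concentrates on source 0
```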

  13. An Automatic Parameter Identification Method for a PMSM Drive with LC-Filter

    DEFF Research Database (Denmark)

    Bech, Michael Møller; Christensen, Jeppe Haals; Weber, Magnus L.


    This paper presents a method for stand-still identification of parameters in a permanent magnet synchronous motor (PMSM) fed from an inverter equipped with a three-phase LC-type output filter. Using a special random modulation strategy, the method uses the inverter for broad-band excitation...... of the PMSM fed through an LC-filter. Based on the measured current response, model parameters for both the filter (L, R, C) and the PMSM (L and R) are estimated: first, the frequency response of the system is estimated using the Welch modified periodogram method, and then an optimization algorithm is used to find...... the parameters in an analytical reference model that minimize the model error. To demonstrate the practical feasibility of the method, a fully functional drive including an embedded real-time controller has been built. In addition to modulation, data acquisition and control the whole parameter identification...
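As a much-simplified illustration of the second step (fitting an analytical model to an estimated frequency response), the sketch below fits only a series RL impedance by grid search; the model, grids and values are assumptions for illustration, not the paper's LC-filter + PMSM reference model:

```python
import math

def fit_rl(freqs, H, R_grid, L_grid):
    """Grid-search fit of a series RL admittance model H(jw) = 1/(R + jwL)
    to a measured frequency response, minimizing the squared model error."""
    best = None
    for R in R_grid:
        for L in L_grid:
            err = sum(abs(1 / (R + 1j * 2 * math.pi * f * L) - h) ** 2
                      for f, h in zip(freqs, H))
            if best is None or err < best[0]:
                best = (err, R, L)
    return best[1], best[2]

# Synthesize a "measured" response from known parameters, then recover them.
R_true, L_true = 2.0, 0.01
freqs = [10.0, 50.0, 100.0, 500.0, 1000.0]
H = [1 / (R_true + 1j * 2 * math.pi * f * L_true) for f in freqs]
R_est, L_est = fit_rl(freqs, H,
                      R_grid=[1.0, 1.5, 2.0, 2.5, 3.0],
                      L_grid=[0.005, 0.01, 0.02])
```

In practice a continuous optimizer would replace the grid search, and the frequency response would come from a Welch periodogram estimate rather than a synthetic model.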

  14. Automatic and rapid identification of glycopeptides by nano-UPLC-LTQ-FT-MS and proteomic search engine. (United States)

    Giménez, Estela; Gay, Marina; Vilaseca, Marta


    Here we demonstrate the potential of nano-UPLC-LTQ-FT-MS and the Byonic™ proteomic search engine for the separation, detection, and identification of N- and O-glycopeptide glycoforms in standard glycoproteins. The use of a BEH C18 nanoACQUITY column allowed the separation of the glycopeptides present in the glycoprotein digest and a baseline resolution of the glycoforms of the same glycopeptide on the basis of the number of sialic acids. Moreover, we evaluated several acquisition strategies in order to improve the detection and characterization of glycopeptide glycoforms and maximize identification rates. The proposed strategy is simple to set up with the technology platforms commonly used in proteomics labs. The method allows a general glycosylation map of a given protein, including glycosites and their corresponding glycosylated structures, to be obtained rapidly and straightforwardly. The MS strategy selected in this work, based on a gas-phase fractionation approach, led to 136 unique peptides from four standard proteins, which represented 78% of the total number of peptides identified. Moreover, the method does not require an extra glycopeptide enrichment step, thus preventing the bias that this step could cause towards certain glycopeptide species. Data are available via ProteomeXchange with identifier PXD003578. We propose a simple and high-throughput glycoproteomics-based methodology that allows the separation of glycopeptide glycoforms on the basis of the number of sialic acids, and their automatic and rapid identification without prior knowledge of protein glycosites or of the type and structure of the glycans. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. A multiparametric automatic method to monitor long-term reproducibility in digital mammography: results from a regional screening programme. (United States)

    Gennaro, G; Ballaminut, A; Contento, G


    This study aims to illustrate a multiparametric automatic method for monitoring the long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. The variability of each IQI was below 5%, with the exception of one index associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for the reproducibility tests (multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control) was proven to be effective, applicable on a large scale, and applicable to any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
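The coefficient of variation used above as the reproducibility measure is straightforward to compute; the weekly values below are hypothetical, not the study's data:

```python
def cov_percent(values):
    """Coefficient of variation (COV) of an image-quality index over repeated
    phantom acquisitions, in percent, as used for reproducibility monitoring."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5  # sample std dev
    return 100.0 * sd / mean

# Hypothetical weekly measurements of one IQI on one system:
weekly_iqi = [101.2, 99.8, 100.5, 98.9, 100.1]
print(round(cov_percent(weekly_iqi), 2))  # 0.85 -- well within a 5% tolerance
```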

  16. A smart pattern recognition system for the automatic identification of aerospace acoustic sources (United States)

    Cabell, R. H.; Fuller, C. R.


    An intelligent air-noise recognition system is described that uses pattern recognition techniques to distinguish the noise signatures of five different types of acoustic sources: jet planes, propeller planes, a helicopter, a train, and a wind turbine. Information for classification is calculated using the power spectral density and autocorrelation taken from the output of a single microphone. Using this system, as many as 90 percent of test recordings were correctly identified, indicating that the linear discriminant functions developed can be used for aerospace source identification.
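Classification with linear discriminant functions of the kind mentioned above reduces to scoring each class with a linear form and picking the maximum; the features, weights and classes below are invented for illustration:

```python
def classify(x, discriminants):
    """Assign a feature vector x to the class whose linear discriminant
    g_c(x) = w_c . x + b_c scores highest."""
    scores = {c: sum(wi * xi for wi, xi in zip(w, x)) + b
              for c, (w, b) in discriminants.items()}
    return max(scores, key=scores.get)

# Hypothetical 2-D features (e.g. a spectral-centroid and an autocorrelation
# statistic) and made-up weights for two of the five source classes:
discriminants = {
    "jet":        ([ 1.0, -0.5], 0.0),
    "helicopter": ([-0.2,  1.0], 0.1),
}
label = classify([2.0, 0.3], discriminants)  # "jet": 1.85 beats "helicopter": 0.0
```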

  17. Automatic pattern identification of rock moisture based on the Staff-RF model (United States)

    Zheng, Wei; Tao, Kai; Jiang, Wei


    Studies on the moisture and damage state of rocks generally focus on qualitative description and mechanical information. Such approaches are not applicable to the real-time safety monitoring of rock mass. In this study, a musical staff computing model is used to quantify the acoustic emission (AE) signals of rocks with different moisture patterns. Then, the random forest (RF) method is adopted to form the Staff-RF model for the real-time pattern identification of rock moisture. The entire process requires only the computed information of the AE signals and does not require the mechanical conditions of the rocks.

  18. Quality assurance in the production of pipe fittings by automatic laser-based material identification (United States)

    Moench, Ingo; Peter, Laszlo; Priem, Roland; Sturm, Volker; Noll, Reinhard


    In plants of the chemical, nuclear and off-shore industries, application-specific high-alloyed steels are used for pipe fittings. Mixing of different steel grades can lead to corrosion with severe consequential damage. Growing quality requirements and environmental responsibilities demand 100% material control in the production of pipe fittings. Therefore, LIFT, an automatic inspection machine, was developed to ensure against any mixing of material grades. LIFT is able to identify more than 30 different steel grades. The inspection method is based on Laser-Induced Breakdown Spectrometry (LIBS). An expert system, which can be easily trained and recalibrated, was developed for the data evaluation. The result of the material inspection is transferred to an external handling system via a PLC interface. The duration of the inspection process is 2 seconds. The graphical user interface was developed with respect to the requirements of an unskilled operator. The software is based on a real-time operating system and provides safe and reliable operation. An interface for remote maintenance by modem enables fast operational support. Logged data are retrieved and evaluated; this is the basis for an adaptive improvement of the configuration of LIFT with respect to changing requirements in the production line. Within the first six months of routine operation, about 50000 pipe fittings were inspected.

  19. Clinical system engineering of long-term automatic thermal control during brain hypothermia under changing conditions. (United States)

    Wakamatsu, H; Utsuki, T; Mitaka, C; Ohno, K


    Automatic control systems of brain temperature using water surface-cooling were applied for the first time to the brain hypothermia treatment of patients. A patient in the ICU was regarded as a single controlled system with an input (temperature of the water into the blanket) and an output (tympanic membrane temperature). The proposed algorithm of optimal-adaptive and fuzzy control laws, together with our developed cooling and warming machine, was well confirmed during the hypothermic course, keeping the brain temperature of patients within its allowable range. Control was maintained without much influence from room temperature or from the metabolic and circulatory changes caused by various medical treatments, including the effect of the nonlinear and time-varying characteristics of individual patients. Clinical control of brain temperature was performed almost continuously over around 10 days, with the brain temperature scheduled between 35 degrees C and 37 degrees C by physicians according to the state of the patients. Their state was monitored during the therapeutic course of pharmacological treatment with almost daily examinations by CT imaging, referring to various vital signs including the temperature of the urinary bladder, with continuous measurement of intracranial pressure by a catheter placed in the cerebral sinus. The patients had good recovery to rehabilitation after mild hypothermia delivered by the proposed automatic control systems.

  20. Automatic cell identification and visualization using digital holographic microscopy with head mounted augmented reality devices. (United States)

    O'Connor, Timothy; Rawat, Siddharth; Markman, Adam; Javidi, Bahram


    We propose a compact imaging system that integrates an augmented reality head mounted device with digital holographic microscopy for automated cell identification and visualization. A shearing interferometer is used to produce holograms of biological cells, which are recorded using customized smart glasses containing an external camera. After image acquisition, segmentation is performed to isolate regions of interest containing biological cells in the field-of-view, followed by digital reconstruction of the cells, which is used to generate a three-dimensional (3D) pseudocolor optical path length profile. Morphological features are extracted from the cell's optical path length map, including mean optical path length, coefficient of variation, optical volume, projected area, projected area to optical volume ratio, cell skewness, and cell kurtosis. Classification is performed using the random forest classifier, support vector machines, and K-nearest neighbor, and the results are compared. Finally, the augmented reality device displays the cell's pseudocolor 3D rendering of its optical path length profile, extracted features, and the identified cell's type or class. The proposed system could allow a healthcare worker to quickly visualize cells using augmented reality smart glasses and extract the relevant information for rapid diagnosis. To the best of our knowledge, this is the first report on the integration of digital holographic microscopy with augmented reality devices for automated cell identification and visualization.

  1. Automatic Identification of Physical Activity Intensity and Modality from the Fusion of Accelerometry and Heart Rate Data. (United States)

    García-García, Fernando; Benito, Pedro J; Hernando, María E


    Physical activity (PA) is essential to prevent and to treat a variety of chronic diseases. The automated detection and quantification of PA over time empowers lifestyle interventions, facilitating reliable exercise tracking and data-driven counseling. We propose and compare various combinations of machine learning (ML) schemes for the automatic classification of PA from multi-modal data, simultaneously captured by a biaxial accelerometer and a heart rate (HR) monitor. Intensity levels (low / moderate / vigorous) were recognized, as well as, for vigorous exercise, its modality (sustained aerobic / resistance / mixed). In total, 178.63 h of data about PA intensity (65.55 % low / 18.96 % moderate / 15.49 % vigorous) and 17.00 h about modality were collected in two experiments: one in free-living conditions, the other in a fitness center under controlled protocols. The structure used for automatic classification comprised: a) definition of 42 time-domain signal features, b) dimensionality reduction, c) data clustering, and d) temporal filtering to exploit time redundancy by means of a Hidden Markov Model (HMM). Four dimensionality reduction techniques and four clustering algorithms were studied. In order to cope with class imbalance in the dataset, a custom performance metric was defined to aggregate recognition accuracy, precision and recall. The best scheme, which comprised a projection through Linear Discriminant Analysis (LDA) and k-means clustering, was evaluated in leave-one-subject-out cross-validation, notably outperforming the standard industry procedures for PA intensity classification: score 84.65 %, versus up to 63.60 %. Errors tended to be brief and to appear around transients. The application of ML techniques for pattern identification and temporal filtering made it possible to merge accelerometry and HR data in a solid manner, and achieved markedly better recognition performance than the standard methods for PA intensity estimation.
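The temporal-filtering step (d) can be illustrated with a minimal Viterbi decoder over the three intensity states; the transition and observation probabilities below are invented for illustration, not taken from the study:

```python
import math

def viterbi(obs_probs, trans, init):
    """Viterbi decoding as a temporal filter: smooths frame-wise intensity
    labels by penalizing improbable state transitions (log-domain to avoid
    numerical underflow on long sequences)."""
    n_states = len(init)
    V = [[math.log(init[s]) + math.log(obs_probs[0][s]) for s in range(n_states)]]
    back = []
    for t in range(1, len(obs_probs)):
        row, ptr = [], []
        for s in range(n_states):
            best_prev = max(range(n_states),
                            key=lambda p: V[-1][p] + math.log(trans[p][s]))
            row.append(V[-1][best_prev] + math.log(trans[best_prev][s])
                       + math.log(obs_probs[t][s]))
            ptr.append(best_prev)
        V.append(row)
        back.append(ptr)
    path = [max(range(n_states), key=lambda s: V[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# States 0/1/2 = low/moderate/vigorous; a "sticky" transition matrix suppresses
# a single-frame "vigorous" blip inside a run of "low" frames.
trans = [[0.9, 0.05, 0.05], [0.05, 0.9, 0.05], [0.05, 0.05, 0.9]]
init = [1 / 3, 1 / 3, 1 / 3]
obs = [[0.8, 0.1, 0.1], [0.8, 0.1, 0.1], [0.4, 0.1, 0.5], [0.8, 0.1, 0.1]]
path = viterbi(obs, trans, init)  # the third-frame blip is smoothed to "low"
```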

  2. New representations of {Delta}E-CsI matrix. Automatic identification of light particles; Nouvelles representations de matrice de type {Delta}E-CsI. Identification automatique des particules legeres

    Energy Technology Data Exchange (ETDEWEB)

    Benlliure, P.; Chbihi, A.


    The authors propose a new representation of a {Delta}E-CsI matrix that keeps the information necessary for light-particle discrimination in the INDRA radiation detector. They also propose an automatic identification method based upon the fact that the light-particle production rate is higher than that of heavier fragments.

  3. Semi-automatic construction of the Chinese-English MeSH using Web-based term translation method. (United States)

    Lu, Wen-Hsiang; Lin, Shih-Jui; Chan, Yi-Che; Chen, Kuan-Hsi


    Due to the language barrier, non-English users are unable to retrieve the most up-to-date medical information from authoritative U.S. medical websites, such as PubMed and MedlinePlus. A few cross-language medical information retrieval (CLMIR) systems have been utilizing MeSH (Medical Subject Headings) with multilingual thesauri to bridge the gap. Unfortunately, MeSH has not yet been translated into traditional Chinese. We propose a semi-automatic approach to constructing a Chinese-English MeSH based on Web-based term translation. The system provides knowledge engineers with candidate terms mined from anchor texts and search-result pages. The results are encouraging: currently, more than 19,000 Chinese-English MeSH entries have been compiled. This thesaurus will be used in Chinese-English CLMIR in the future.

  4. Large data analysis: automatic visual personal identification in a demography of 1.2 billion persons (United States)

    Daugman, John


    The largest biometric deployment in history is now underway in India, where the Government is enrolling the iris patterns (among other data) of all 1.2 billion citizens. The purpose of the Unique Identification Authority of India (UIDAI) is to ensure fair access to welfare benefits and entitlements, to reduce fraud, and enhance social inclusion. Only a minority of Indian citizens have bank accounts; only 4 percent possess passports; and less than half of all aid money reaches its intended recipients. A person who lacks any means of establishing their identity is excluded from entitlements and does not officially exist; thus the slogan of UIDAI is: "To give the poor an identity." This ambitious program enrolls a million people every day, across 36,000 stations run by 83 agencies, with a 3-year completion target for the entire national population. The halfway point was recently passed with more than 600 million persons now enrolled. In order to detect and prevent duplicate identities, every iris pattern that is enrolled is first compared against all others enrolled so far; thus the daily workflow now requires 600 trillion (or 600 million-million) iris cross-comparisons. Avoiding identity collisions (False Matches) requires high biometric entropy, and achieving the tremendous match speed requires phase bit coding. Both of these requirements are being delivered operationally by wavelet methods developed by the author for encoding and comparing iris patterns, which will be the focus of this "Large Data Award" presentation.
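The daily workload quoted above follows directly from the enrollment figures in the record:

```python
enrolled = 600_000_000       # identities already in the database (the halfway point)
new_per_day = 1_000_000      # daily enrollments, each checked against all existing identities
comparisons_per_day = new_per_day * enrolled
print(f"{comparisons_per_day:.1e}")  # 6.0e+14, i.e. 600 trillion iris cross-comparisons per day
```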

  5. Increasing Accuracy: A New Design and Algorithm for Automatically Measuring Weights, Travel Direction and Radio Frequency Identification (RFID) of Penguins. (United States)

    Afanasyev, Vsevolod; Buldyrev, Sergey V; Dunn, Michael J; Robst, Jeremy; Preston, Mark; Bremner, Steve F; Briggs, Dirk R; Brown, Ruth; Adlard, Stacey; Peat, Helen J


    A fully automated weighbridge using a new algorithm and mechanics integrated with a Radio Frequency Identification System is described. It is currently in use collecting data on Macaroni penguins (Eudyptes chrysolophus) at Bird Island, South Georgia. The technology allows researchers to collect very large, highly accurate datasets of both penguin weight and direction of their travel into or out of a breeding colony, providing important contributory information to help understand penguin breeding success, reproductive output and availability of prey. Reliable discrimination between single and multiple penguin crossings is demonstrated. Passive radio frequency tags implanted into penguins allow researchers to match weight and trip direction to individual birds. Low unit and operation costs, low maintenance needs, simple operator requirements and accurate time stamping of every record are all important features of this type of weighbridge, as is its proven ability to operate 24 hours a day throughout a breeding season, regardless of temperature or weather conditions. Users are able to define required levels of accuracy by adjusting filters and raw data are automatically recorded and stored allowing for a range of processing options. This paper presents the underlying principles, design specification and system description, provides evidence of the weighbridge's accurate performance and demonstrates how its design is a significant improvement on existing systems.

  6. Completing fishing monitoring with spaceborne Vessel Detection System (VDS) and Automatic Identification System (AIS) to assess illegal fishing in Indonesia. (United States)

    Longépé, Nicolas; Hajduch, Guillaume; Ardianto, Romy; Joux, Romain de; Nhunfat, Béatrice; Marzuki, Marza I; Fablet, Ronan; Hermawan, Indra; Germain, Olivier; Subki, Berny A; Farhan, Riza; Muttaqin, Ahmad Deni; Gaspar, Philippe


    The Indonesian fisheries management system is now equipped with state-of-the-art technologies to deter and combat Illegal, Unreported and Unregulated (IUU) fishing. Since October 2014, non-cooperative fishing vessels can be detected by a spaceborne Vessel Detection System (VDS) based on high-resolution radar imagery, which directly benefits coordinated patrol vessels in an operational context. This study attempts to monitor the amount of illegal fishing in the Arafura Sea based on this new source of information. It is analyzed together with Vessel Monitoring System (VMS) and satellite-based Automatic Identification System (Sat-AIS) data, taking into account their respective particularities. From October 2014 to March 2015, i.e. just after the establishment of a new moratorium by the Indonesian authorities, the estimated share of fishing vessels not carrying VMS, and thus operating illegally, ranged from 42 to 47%. One year later, in January 2016, this proportion decreased, ranging from 32 to 42%. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Hybrid EEG—Eye Tracker: Automatic Identification and Removal of Eye Movement and Blink Artifacts from Electroencephalographic Signal

    Directory of Open Access Journals (Sweden)

    Malik M. Naeem Mannan


    Full Text Available Contamination by eye movement and blink artifacts in electroencephalogram (EEG) recordings makes the analysis of EEG data more difficult and could result in misleading findings. Efficient removal of these artifacts from EEG data is an essential step in improving classification accuracy for brain-computer interface (BCI) development. In this paper, we propose an automatic framework based on independent component analysis (ICA) and system identification to identify and remove ocular artifacts from EEG data using a hybrid EEG and eye tracker system. The performance of the proposed algorithm is illustrated using experimental and standard EEG datasets. The proposed algorithm not only removes the ocular artifacts from the artifactual zone but also preserves the neuronal-activity-related EEG signals in the non-artifactual zone. The comparison with two state-of-the-art techniques, namely ADJUST-based ICA and REGICA, reveals the significantly improved performance of the proposed algorithm in removing eye movement and blink artifacts from EEG data. Additionally, results demonstrate that the proposed algorithm can achieve lower relative error and higher mutual information values between corrected EEG and artifact-free EEG data.

  8. Study on Safety of Navigation using Automatic Identification System for Marine Traffic Area Case Study: Malacca Straits

    Directory of Open Access Journals (Sweden)

    Muhammad Badrus Zaman


    Full Text Available The International Maritime Organization (IMO) has recommended the implementation of the Automatic Identification System (AIS) to improve the safety of navigation in marine traffic areas. By regulation, IMO requires AIS to be fitted aboard all ships of 300 gross tonnage and upwards engaged on international voyages, cargo ships of 500 gross tonnage and upwards not engaged on international voyages, and all passenger ships irrespective of size. The function of AIS is to enable communication between ships and between ship and port or shore. The study area of this work is the Malacca Strait, a strait categorized as high-risk and a busy area for maritime transportation because it lies on international shipping lanes. Many captains feel anxious and act cautiously when passing through the strait. An AIS receiver installed at Universiti Teknologi Malaysia by Kobe University, Japan, was used in this study. With the AIS receiver, the current conditions of ships in the Malacca Strait area can be monitored properly. In addition, the data recorded by the AIS receiver can be used in research to enhance the safety of navigation.

  9. Increasing Accuracy: A New Design and Algorithm for Automatically Measuring Weights, Travel Direction and Radio Frequency Identification (RFID) of Penguins.

    Directory of Open Access Journals (Sweden)

    Vsevolod Afanasyev

    Full Text Available A fully automated weighbridge using a new algorithm and mechanics integrated with a Radio Frequency Identification System is described. It is currently in use collecting data on Macaroni penguins (Eudyptes chrysolophus at Bird Island, South Georgia. The technology allows researchers to collect very large, highly accurate datasets of both penguin weight and direction of their travel into or out of a breeding colony, providing important contributory information to help understand penguin breeding success, reproductive output and availability of prey. Reliable discrimination between single and multiple penguin crossings is demonstrated. Passive radio frequency tags implanted into penguins allow researchers to match weight and trip direction to individual birds. Low unit and operation costs, low maintenance needs, simple operator requirements and accurate time stamping of every record are all important features of this type of weighbridge, as is its proven ability to operate 24 hours a day throughout a breeding season, regardless of temperature or weather conditions. Users are able to define required levels of accuracy by adjusting filters and raw data are automatically recorded and stored allowing for a range of processing options. This paper presents the underlying principles, design specification and system description, provides evidence of the weighbridge's accurate performance and demonstrates how its design is a significant improvement on existing systems.

  10. PhasePApy: A robust pure Python package for automatic identification of seismic phases (United States)

    Chen, Chen; Holland, Austin


    We developed a Python phase identification package, PhasePApy, for earthquake data processing and near-real-time monitoring. The package takes advantage of the growing number of Python libraries, including ObsPy, and all data formats supported by ObsPy can be used within PhasePApy. PhasePApy has two subpackages: the PhasePicker and the Associator, which aim to identify phase arrival onsets and associate them with phase types, respectively. The PhasePicker and the Associator can work jointly or separately. Three autopickers are implemented in the PhasePicker subpackage: the frequency-band picker, the Akaike information criterion (AIC) function derivative picker, and the kurtosis picker. All three autopickers identify picks with the same processing methods but different characteristic functions. The PhasePicker triggers picks with a dynamic threshold and can declare a pick with false-pick filtering. The PhasePicker also identifies pick polarity and uncertainty for further seismological analysis, such as focal mechanism determination. Two associators are included in the Associator subpackage: the 1D Associator and the 3D Associator, which assign phase types to picks that best fit potential earthquakes by minimizing root mean square (rms) residuals of the misfits in distance and time, respectively. The Associator processes multiple picks from all channels at a seismic station and aggregates them to increase computational efficiency. Both associators use travel-time look-up tables to determine the best estimate of the earthquake location and to evaluate the phase type for each pick. The PhasePApy package has been used extensively for local and regional earthquakes and can work for active source experiments as well.
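
    As an illustration of one of the characteristic functions mentioned above, a generic AIC-style onset picker can be sketched as follows (this is a textbook sketch of the Akaike-information-criterion picker, not PhasePApy's actual implementation; the name `aic_pick` is hypothetical):

```python
import numpy as np

def aic_pick(x):
    """Maeda-style AIC picker: the arrival onset is the sample k minimizing
    AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:]))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):  # avoid degenerate one-sample variances
        v1, v2 = np.var(x[:k]), np.var(x[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))
```

    On a trace that switches from low-amplitude noise to a high-amplitude arrival, the AIC minimum falls very close to the variance change.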

  11. Price strategy and pricing strategy: terms and content identification


    Panasenko Tetyana


    The article is devoted to the terminology and content identification of seemingly identical concepts "price strategy" and "pricing strategy". The article contains evidence that the price strategy determines the direction, principles and procedure of implementing the company price policy and pricing strategy creates a set of rules and practical methods of price formation in accordance with the pricing strategy of the company.

  12. Progressively expanded neural network for automatic material identification in hyperspectral imagery (United States)

    Paheding, Sidike

    The science of hyperspectral remote sensing focuses on the exploitation of the spectral signatures of various materials to enhance capabilities including object detection, recognition, and material characterization. Hyperspectral imagery (HSI) has been extensively used for object detection and identification applications, since it provides plenty of spectral information to uniquely identify materials by their reflectance spectra. HSI-based object detection algorithms can be generally classified into stochastic and deterministic approaches. Deterministic approaches are comparatively simple to apply, since they are usually based on direct spectral similarity measures such as spectral angle or spectral correlation. In contrast, stochastic algorithms require statistical modeling and estimation for the target and non-target classes. Over the decades, many single-class object detection methods have been proposed in the literature; however, deterministic multiclass object detection in HSI has not been explored. In this work, we propose a deterministic multiclass object detection scheme, named class-associative spectral fringe-adjusted joint transform correlation. The human brain is capable of simultaneously processing high volumes of multi-modal data received every second of the day. In contrast, a machine sees input data simply as random binary numbers. Although machines are computationally efficient, they are inferior when it comes to data abstraction and interpretation. Thus, mimicking the learning strength of the human brain has become a current trend in artificial intelligence. In this work, we present a biologically inspired neural network, named progressively expanded neural network (PEN Net), based on a nonlinear transformation of input neurons to a feature space for better pattern differentiation. In PEN Net, discrete fixed excitations are disassembled and scattered in the feature space as a nonlinear line. Each disassembled element on the line corresponds to a pattern with similar features
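
    The spectral angle mentioned above as a deterministic similarity measure is simply the angle between two reflectance vectors; a minimal sketch (generic spectral-angle-mapper style, not the fringe-adjusted correlation scheme of the record):

```python
import math

def spectral_angle(a, b):
    """Spectral angle (radians) between two reflectance spectra;
    smaller angle = more similar material, independent of illumination scale."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))
```

    Because the angle ignores vector magnitude, a spectrum and a uniformly brighter copy of it score as identical.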

  13. Validation of automatic landmark identification for atlas-based segmentation for radiation treatment planning of the head-and-neck region (United States)

    Leavens, Claudia; Vik, Torbjørn; Schulz, Heinrich; Allaire, Stéphane; Kim, John; Dawson, Laura; O'Sullivan, Brian; Breen, Stephen; Jaffray, David; Pekar, Vladimir


    Manual contouring of target volumes and organs at risk in radiation therapy is extremely time-consuming, in particular for treating the head-and-neck area, where a single patient treatment plan can take several hours to contour. As radiation treatment delivery moves towards adaptive treatment, the need for more efficient segmentation techniques will increase. We are developing a method for automatic model-based segmentation of the head and neck. This process can be broken down into three main steps: i) automatic landmark identification in the image dataset of interest, ii) automatic landmark-based initialization of deformable surface models to the patient image dataset, and iii) adaptation of the deformable models to the patient-specific anatomical boundaries of interest. In this paper, we focus on the validation of the first step of this method, quantifying the results of our automatic landmark identification method. We use an image atlas formed by applying thin-plate spline (TPS) interpolation to ten atlas datasets, using 27 manually identified landmarks in each atlas/training dataset. The principal variation modes returned by principal component analysis (PCA) of the landmark positions were used by an automatic registration algorithm, which sought the corresponding landmarks in the clinical dataset of interest using a controlled random search algorithm. Applying a run time of 60 seconds to the random search, a root mean square (rms) distance to the ground-truth landmark position of 9.5 +/- 0.6 mm was calculated for the identified landmarks. Automatic segmentation of the brain, mandible and brain stem, using the detected landmarks, is demonstrated.
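
    The PCA step described above, which the random search uses to constrain candidate landmark positions, can be sketched on flattened landmark coordinates (an illustrative sketch under the assumption of flattened (x, y, z) vectors; `landmark_pca` and `project` are hypothetical names, and the TPS atlas construction is omitted):

```python
import numpy as np

def landmark_pca(train, n_modes=2):
    """PCA over training landmark configurations.

    train : (n_datasets, n_landmarks*dim) flattened landmark coordinates.
    Returns the mean configuration and the first principal variation modes.
    """
    mean = train.mean(axis=0)
    _, _, vt = np.linalg.svd(train - mean, full_matrices=False)
    return mean, vt[:n_modes]

def project(config, mean, modes):
    """Express a configuration in the reduced mode space and reconstruct it,
    i.e. the closest configuration the shape model can represent."""
    b = modes @ (config - mean)
    return mean + modes.T @ b
```

    A configuration lying in the span of the training variation is reconstructed exactly, which is what lets the model regularize the search.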

  14. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation (United States)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.


    In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic measures (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term, continuously recorded time series. Because the time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms increase the difficulty of automatic seismic detection, an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied in this work. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through change point detection (CPD) on the seismic record. In this method, an anomalous signal (seismic event) is treated as a change point in the time series (seismic record): the statistical model of the signal in the neighborhood of the event point changes because of the seismic event occurrence. This means that SDAR aims to find statistical irregularities in the record through CPD. SDAR has three advantages. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, or polarization) for signal detection; therefore, it is an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models can be automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of present statistics on future data, which makes SDAR a robust algorithm for non-stationary signal processing. With these three advantages, the SDAR method can handle non-stationary, time-varying, long-term
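
    The discounted sequential update at the heart of SDAR can be sketched in a deliberately simplified form (a zeroth-order Gaussian model rather than a full AR model, so this is an illustrative sketch and not the implementation used in the study; `sdar_scores` and the discounting parameter `r` are illustrative names):

```python
import numpy as np

def sdar_scores(x, r=0.05):
    """Sequentially discounting mean/variance estimate (model order 0 for
    brevity). Returns an anomaly score (negative log-likelihood) per sample;
    a change point shows up as a burst of high scores."""
    mu, var = float(x[0]), 1.0
    scores = []
    for xt in x:
        scores.append(0.5 * np.log(2 * np.pi * var) + (xt - mu) ** 2 / (2 * var))
        mu = (1 - r) * mu + r * xt              # discounted mean update
        var = (1 - r) * var + r * (xt - mu) ** 2  # discounted variance update
    return np.array(scores)
```

    Older samples are geometrically down-weighted by `r`, which is the discounting property the record credits for robustness to non-stationarity.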

  15. Automatic identification of high impact articles in PubMed to support clinical decision making. (United States)

    Bian, Jiantao; Morid, Mohammad Amin; Jonnalagadda, Siddhartha; Luo, Gang; Del Fiol, Guilherme


    The practice of evidence-based medicine involves integrating the latest best available evidence into patient care decisions. Yet, critical barriers exist for clinicians' retrieval of evidence that is relevant for a particular patient from primary sources such as randomized controlled trials and meta-analyses. To help address those barriers, we investigated machine learning algorithms that find clinical studies with high clinical impact from PubMed®. Our machine learning algorithms use a variety of features including bibliometric features (e.g., citation count), social media attention, journal impact factors, and citation metadata. The algorithms were developed and evaluated with a gold standard composed of 502 high impact clinical studies that are referenced in 11 clinical evidence-based guidelines on the treatment of various diseases. We tested the following hypotheses: (1) our high impact classifier outperforms a state-of-the-art classifier based on citation metadata and citation terms, and PubMed's® relevance sort algorithm; and (2) the performance of our high impact classifier does not decrease significantly after removing proprietary features such as citation count. The mean top 20 precision of our high impact classifier was 34% versus 11% for the state-of-the-art classifier and 4% for PubMed's® relevance sort (p=0.009); and the performance of our high impact classifier did not decrease significantly after removing proprietary features (mean top 20 precision=34% vs. 36%; p=0.085). The high impact classifier, using features such as bibliometrics, social media attention and MEDLINE® metadata, outperformed previous approaches and is a promising alternative to identifying high impact studies for clinical decision support. Copyright © 2017 Elsevier Inc. All rights reserved.
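
    The evaluation metric used above, top-20 precision, is straightforward to compute; a minimal sketch (generic implementation, with hypothetical function and argument names):

```python
def precision_at_k(ranked_ids, relevant_ids, k=20):
    """Top-k precision: fraction of the first k retrieved articles
    that appear in the gold-standard (high-impact) set."""
    top = ranked_ids[:k]
    return sum(1 for article in top if article in relevant_ids) / k
```

    Averaging this value over all queries gives the "mean top 20 precision" reported in the record.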

  16. Automatic estimation of aquifer parameters using long-term water supply pumping and injection records (United States)

    Luo, Ning; Illman, Walter A.


    Analyses are presented of long-term hydrographs perturbed by variable pumping/injection events in a confined aquifer at a municipal water-supply well field in the Region of Waterloo, Ontario (Canada). Such records are typically not considered for aquifer test analysis. Here, the water-level variations are fingerprinted to pumping/injection rate changes using the Theis model implemented in the WELLS code coupled with PEST. Analyses of these records yield a set of transmissivity (T) and storativity (S) estimates between each monitoring and production borehole. These individual estimates are found to poorly predict water-level variations at nearby monitoring boreholes not used in the calibration effort. On the other hand, the geometric means of the individual T and S estimates are similar to those obtained from previous pumping tests conducted at the same site and adequately predict water-level variations in other boreholes. The analyses reveal that long-term municipal water-level records are amenable to analyses using a simple analytical solution to estimate aquifer parameters. However, uniform parameters estimated with analytical solutions should be considered as first rough estimates. More accurate hydraulic parameters should be obtained by calibrating a three-dimensional numerical model that rigorously captures the complexities of the site with these data.
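
    The Theis model referenced above gives drawdown s = Q/(4πT)·W(u) with u = r²S/(4Tt), where W is the exponential-integral well function. A minimal sketch using the standard series expansion (function names are illustrative; this is not the WELLS/PEST implementation):

```python
import math

def theis_W(u, terms=30):
    """Theis well function via its series expansion,
    W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!),
    adequate for moderate u (converges quickly for u < ~5)."""
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    s = -gamma - math.log(u)
    fact = 1.0
    for n in range(1, terms + 1):
        fact *= n
        s += ((-1) ** (n + 1)) * u ** n / (n * fact)
    return s

def theis_drawdown(Q, T, S, r, t):
    """Drawdown at radius r and time t for pumping rate Q,
    transmissivity T and storativity S (consistent units)."""
    u = r * r * S / (4 * T * t)
    return Q / (4 * math.pi * T) * theis_W(u)
```

    A standard tabulated check is W(0.01) ≈ 4.038, which the series reproduces.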

  17. Background and Source Term Identification in Active Neutron Interrogation Methods (United States)


    background source terms during active neutron interrogation.

    Table 5. Chemical properties of continental crust (average content of each element present in the Earth's crust, as oxide percent): SiO2 60.6; Al2O3 15.9; CaO 6.4; MgO 4.7; Na2O 3.1; Fe 6.7; K2O 1.8; TiO2 0.7; P2O5 0.1

  18. Automatic plume episode identification and cloud shine reconstruction method for ambient gamma dose rates during nuclear accidents. (United States)

    Zhang, Xiaole; Raskob, Wolfgang; Landman, Claudia; Trybushnyi, Dmytro; Haller, Christoph; Yuan, Hongyong


    Ambient gamma dose rate (GDR) is the primary observation quantity for nuclear emergency management due to its high acquisition frequency and dense spatial deployment. However, ambient GDR is the sum of both cloud and ground shine, which hinders its effective utilization. In this study, an automatic method is proposed to identify the radioactive plume passage and to separate the cloud and ground shine in the total GDR. The new method is evaluated against a synthetic GDR dataset generated by the JRODOS (Real-time On-line Decision Support) system and compared with another method (Hirayama, H. et al., 2014. Estimation of I-131 concentration using time history of pulse height distribution at monitoring post and detector response for radionuclide in plume. Transactions of the Atomic Energy Society of Japan 13:119-126, in Japanese (with English abstract)). The reconstructed cloud shine agrees well with the actual values for the whole synthetic dataset (1451 data points), with a very small absolute fractional bias (FB = 0.02) and normalized mean square error (NMSE = 2.04) as well as a large correlation coefficient (r = 0.95). The new method significantly outperforms the existing one (more than 95% reduction of FB and NMSE, and 61% improvement of the correlation coefficient), mainly due to the modification for high-deposition events. The code of the proposed methodology and all the test data are available for academic and non-commercial use. The new approach, with its detailed interpretation of the in-situ environment data, will help improve off-site source term inverse estimation for nuclear accidents. Copyright © 2017 Elsevier Ltd. All rights reserved.
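
    The record does not publish its separation algorithm in the abstract, but the general idea of splitting a total GDR series into a slowly varying ground-shine baseline and a transient cloud-shine component can be sketched crudely with a running-minimum baseline (entirely an illustrative assumption, not the authors' method; all names are hypothetical):

```python
import numpy as np

def split_cloud_ground(gdr, window=12, k=3.0):
    """Crude separation of a total gamma-dose-rate series:
    baseline = running minimum over the trailing window (ground shine proxy),
    cloud    = residual above the baseline,
    plume    = samples where the residual exceeds k*sigma of the early residual."""
    gdr = np.asarray(gdr, dtype=float)
    n = len(gdr)
    base = np.array([gdr[max(0, i - window):i + 1].min() for i in range(n)])
    cloud = gdr - base
    sigma = max(cloud[:window].std(), 1e-9)
    plume = cloud > k * sigma
    return base, cloud, plume
```

    A real method must also handle deposition raising the ground-shine baseline during plume passage, which is exactly the high-deposition case the record says it improves on.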

  19. Identification of terms to define unconstrained air transportation demands (United States)

    Jacobson, I. D.; Kuhilhau, A. R.


    The factors involved in the evaluation of unconstrained air transportation systems were carefully analyzed. By definition an unconstrained system is taken to be one in which the design can employ innovative and advanced concepts no longer limited by present environmental, social, political or regulatory settings. Four principal evaluation criteria are involved: (1) service utilization, based on the operating performance characteristics as viewed by potential patrons; (2) community impacts, reflecting decisions based on the perceived impacts of the system; (3) technological feasibility, estimating what is required to reduce the system to practice; and (4) financial feasibility, predicting the ability of the concepts to attract financial support. For each of these criteria, a set of terms or descriptors was identified, which should be used in the evaluation to render it complete. It is also demonstrated that these descriptors have the following properties: (a) their interpretation may be made by different groups of evaluators; (b) their interpretations and the way they are used may depend on the stage of development of the system in which they are used; (c) in formulating the problem, all descriptors should be addressed independent of the evaluation technique selected.

  20. Identification of conductive hearing loss using air conduction tests alone: reliability and validity of an automatic test battery. (United States)

    Convery, Elizabeth; Keidser, Gitte; Seeto, Mark; Freeston, Katrina; Zhou, Dan; Dillon, Harvey


    The primary objective of this study was to determine whether a combination of automatically administered pure-tone audiometry and a tone-in-noise detection task, both delivered via an air conduction (AC) pathway, could reliably and validly predict the presence of a conductive component to the hearing loss. The authors hypothesized that performance on the battery of tests would vary according to hearing-loss type. A secondary objective was to evaluate the reliability and validity of a novel automatic audiometry algorithm to assess its suitability for inclusion in the test battery. Participants underwent a series of hearing assessments conducted in a randomized order: manual pure-tone air conduction and bone conduction audiometry; automatic pure-tone air conduction audiometry; and an automatic tone-in-noise detection task. The automatic tests were each administered twice. The ability of the automatic test battery to (a) predict the presence of an air-bone gap (ABG) and (b) accurately measure AC hearing thresholds was assessed against the results of manual audiometry. Test-retest conditions were compared to determine the reliability of each component of the automatic test battery. Data were collected on 120 ears from normal-hearing and conductive, sensorineural, and mixed hearing-loss subgroups. Performance differences between different types of hearing loss were observed. Ears with a conductive component (conductive and mixed ears) tended to have normal signal-to-noise ratios (SNRs) despite impaired thresholds in quiet, while ears without a conductive component (normal and sensorineural ears) demonstrated, on average, an increasing relationship between their thresholds in quiet and their achieved SNR. Using the relationship between these two measures among ears with no conductive component as a benchmark, the likelihood that an ear has a conductive component can be estimated based on the deviation from this benchmark. The sensitivity and
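
    The deviation-from-benchmark idea in the last sentences can be sketched as a simple standardized residual from a linear threshold-vs-SNR benchmark (an illustrative sketch only; the benchmark form, parameters and function name are assumptions, not the study's actual model):

```python
def conductive_z(threshold_db, snr_db, bench_slope, bench_icept, bench_sd):
    """Standardized deviation of an ear's achieved SNR from the benchmark line
    fitted on non-conductive ears. A conductive ear keeps a near-normal SNR
    despite raised thresholds, so it falls well below the line (strongly
    negative z)."""
    expected_snr = bench_slope * threshold_db + bench_icept
    return (snr_db - expected_snr) / bench_sd
```

    With an assumed benchmark slope of 0.5 dB SNR per dB threshold, an ear with a 60 dB threshold but near-normal SNR deviates strongly, while a sensorineural ear sits near the line.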

  1. Application of stacked convolutional and long short-term memory network for accurate identification of CAD ECG signals. (United States)

    Tan, Jen Hong; Hagiwara, Yuki; Pang, Winnie; Lim, Ivy; Oh, Shu Lih; Adam, Muhammad; Tan, Ru San; Chen, Ming; Acharya, U Rajendra


    Coronary artery disease (CAD) is the most common cause of heart disease globally. This is because no symptoms are exhibited in its initial phase until the disease progresses to an advanced stage. The electrocardiogram (ECG) is a widely accessible diagnostic tool for CAD that captures abnormal activity of the heart. However, it lacks diagnostic sensitivity. One reason is that it is very challenging to visually interpret the ECG signal due to its very low amplitude, so identification of abnormal ECG morphology by clinicians may be prone to error. Thus, it is essential to develop software which can provide an automated and objective interpretation of the ECG signal. This paper proposes the implementation of a long short-term memory (LSTM) network with a convolutional neural network (CNN) to automatically and accurately diagnose CAD from ECG signals. Our proposed deep learning model is able to detect CAD ECG signals with a diagnostic accuracy of 99.85% under a blindfold strategy. The developed prototype model should be tested on an appropriately large database before clinical use. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Automatic Assessment of Global Craniofacial Differences between Crouzon mice and Wild-type mice in terms of the Cephalic Index

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur; Oubel, Estanislao; Frangi, Alejandro F.


    This paper presents the automatic assessment of differences between Wild-Type mice and Crouzon mice based on high-resolution 3D Micro CT data. One factor used for the diagnosis of Crouzon syndrome in humans is the cephalic index, which is the skull width/length ratio. This index has traditionally...... of landmark matching is limited using only affine transformations, the errors were considered acceptable. The automatic estimation of the cephalic index was in full agreement with the gold standard measurements. Discriminant analysis of the three scaling parameters resulted in a good classification...
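
    The cephalic index named above is simply the skull width/length ratio expressed as a percentage; a one-line sketch (the function name is illustrative):

```python
def cephalic_index(skull_width, skull_length):
    """Cephalic index = 100 * skull width / skull length (same units)."""
    return 100.0 * skull_width / skull_length
```

    A skull 75 units wide and 100 units long has a cephalic index of 75; Crouzon-type brachycephaly shows up as an elevated index.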

  3. Measuring Container Port Complementarity and Substitutability with Automatic Identification System (AIS) Data – Studying the Inter-port Relationships in the Oslo Fjord Multi-port Gateway Region

    Directory of Open Access Journals (Sweden)

    Halvor Schøyen


    Full Text Available This paper considers the degree of competition among small and medium-sized container ports located in a multi-port gateway region. The level of port competition is evaluated by means of an analysis of revealed preferences in the port-calling pattern of container feeder vessels deployed on their various links and routes. The unit of analysis is feeder-vessel sailing legs and port stays at/between adjacent container ports. At these ports' terminals, ships are moored and loading and unloading of containers are performed. The vessel movement data are provided by the Automatic Identification System (AIS). A study of the principal container ports in the Oslo Fjord area is performed, measuring actual container feeder traffic during the year 2015. It is demonstrated to what extent ports in the Oslo Fjord region act as substitutes, and to what extent they function more as complements to each other.

  4. Automatic identification of regions of interest on renal tomographic images;Identification automatique des regions d'interets sur des images tomographiques renales

    Energy Technology Data Exchange (ETDEWEB)

    Boukerroui, D.; Cocquerez, J.P. [Universite de Technologie de Compiegne, CNRS UMR 6599 Heudiasyc, 60 (France); Touhami, W. [Ecole Nationale d' ingenieurs de Tunis (Tunisia)


    We propose in this paper an original approach, in a statistical framework, for fully automatic delineation of kidneys (healthy and pathological) in 2D CT images. Our approach has two main steps: a localisation step followed by a delineation step. The localisation step is guided by a statistically learned prior spatial model on the one hand and a grey-level prior model on the other. The second step utilizes the localisation results to precisely delineate the kidney regions using a set of learned IF-THEN rules. The proposed approach is tested on clinically acquired images and promising results are obtained. (authors)

  5. Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle. (United States)

    Diaz-Varela, R A; Zarco-Tejada, P J; Angileri, V; Loudjani, P


    Agricultural terraces are features that provide a number of ecosystem services. As a result, their maintenance is supported by measures established by the European Common Agricultural Policy (CAP). In the framework of CAP implementation and monitoring, there is a current and future need for the development of robust, repeatable and cost-effective methodologies for the automatic identification and monitoring of these features at farm scale. This is a complex task, particularly when terraces are associated with complex vegetation cover patterns, as happens with permanent crops (e.g. olive trees). In this study we present a novel methodology for automatic and cost-efficient identification of terraces using only imagery from commercial off-the-shelf (COTS) cameras on board unmanned aerial vehicles (UAVs). Using state-of-the-art computer vision techniques, we generated orthoimagery and digital surface models (DSMs) at 11 cm spatial resolution with low user intervention. In a second stage, these data were used to identify terraces using a multi-scale object-oriented classification method. Results show the potential of this method even in highly complex agricultural areas, both regarding DSM reconstruction and image classification. The UAV-derived DSM had a root mean square error (RMSE) lower than 0.5 m when the height of the terraces was assessed against field GPS data. The subsequent automated terrace classification yielded an overall accuracy of 90% based exclusively on spectral and elevation data derived from the UAV imagery. Copyright © 2014 Elsevier Ltd. All rights reserved.
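
    The RMSE figure quoted for the DSM heights against field GPS checks is computed in the usual way; a minimal sketch (generic implementation, hypothetical names):

```python
import math

def rmse(predicted, reference):
    """Root mean square error between DSM-derived heights and GPS check
    heights (paired lists in the same units)."""
    assert len(predicted) == len(reference)
    n = len(predicted)
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n)
```
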

  6. Automatic writer identification using connected-component contours and edge-based features of uppercase western script

    NARCIS (Netherlands)

    Schomaker, L; Bulacu, M

    In this paper, a new technique for offline writer identification is presented, using connected-component contours (COCOCOs or CO(3)s) in uppercase handwritten samples. In our model, the writer is considered to be characterized by a stochastic pattern generator, producing a family of connected

  7. Automatic de-identification of textual documents in the electronic health record: a review of recent research. (United States)

    Meystre, Stephane M; Friedlin, F Jeffrey; South, Brett R; Shen, Shuying; Samore, Matthew H


    In the United States, the Health Insurance Portability and Accountability Act (HIPAA) protects the confidentiality of patient data and requires the informed consent of the patient and approval of the Institutional Review Board to use data for research purposes, but these requirements can be waived if data is de-identified. For clinical data to be considered de-identified, the HIPAA "Safe Harbor" technique requires 18 data elements (called PHI: Protected Health Information) to be removed. The de-identification of narrative text documents is often realized manually, and requires significant resources. Well aware of these issues, several authors have investigated automated de-identification of narrative text documents from the electronic health record, and a review of recent research in this domain is presented here. This review focuses on recently published research (after 1995), and includes relevant publications from bibliographic queries in PubMed, conference proceedings, the ACM Digital Library, and relevant publications referenced in already-included papers. The literature search returned more than 200 publications. The majority focused only on structured data de-identification instead of narrative text, on image de-identification, or described manual de-identification, and were therefore excluded. Finally, 18 publications describing automated text de-identification were selected for detailed analysis of the architecture and methods used, the types of PHI detected and removed, the external resources used, and the types of clinical documents targeted. All text de-identification systems aimed to identify and remove person names, and many included other types of PHI. Most systems used only one or two specific clinical document types, and were mostly based on two different groups of methodologies: pattern matching and machine learning. Many systems combined both approaches for different types of PHI, but the majority relied only on pattern matching, rules, and
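
    The pattern-matching family of methods discussed above can be illustrated with a toy regex-based scrubber (a deliberately minimal sketch: real systems cover all 18 Safe Harbor PHI categories; the patterns, tags and `scrub` name here are illustrative assumptions):

```python
import re

# Toy patterns for three PHI categories; real systems need far broader coverage.
PATTERNS = {
    "DATE":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "MRN":   re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
}

def scrub(text, name_list=()):
    """Replace pattern-matched PHI with category tags; person names come
    from a supplied dictionary, as in many rule-based de-identifiers."""
    for tag, pat in PATTERNS.items():
        text = pat.sub(f"[{tag}]", text)
    for name in name_list:
        text = re.sub(rf"\b{re.escape(name)}\b", "[NAME]", text)
    return text
```

    Dictionary lookup for names is brittle (misspellings, ambiguous words), which is one reason the reviewed systems often add machine learning on top of rules.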

  8. Evaluation of a direct method for the identification and antibiotic susceptibility assessment of microorganisms isolated from blood cultures by automatic systems

    Directory of Open Access Journals (Sweden)

    Sergio Frugoni


    Full Text Available The purpose of blood cultures in the septic patient is to guide a correct therapeutic approach. Identification and antibiotic susceptibility testing carried out directly from the bottle may provide important information in a short time. The introduction of automatic instrumentation has improved the detection of pathogens in blood; however, the time elapsing between positive detection and the microbiological report is still long. The aim of this study is the evaluation of a fast, easy and cheap method, applicable to routine practice, which could reduce the response time in the diagnosis of bacteraemia. The automatic systems Vitek Senior (bioMérieux) and Vitek 2 (bioMérieux) were used at Pio Albergo Trivulzio (Centre 1) and at the Istituto dei Tumori (Centre 2), respectively. To remove blood cells, 7 ml of the culture was transferred by vacuum sampling into a test tube and centrifuged for 10 minutes at 1000 rpm; the supernatant was further centrifuged for 10 minutes at 3000 rpm. 0.5 ml of BHI was added to the pellet, and the concentration of the bacterial suspension was adjusted for inoculation. In parallel, standard cultures in suitable culture media were carried out for comparison. In Centre 1 and Centre 2, respectively, 63 and 31 Gram-negative and 32 and 40 Gram-positive microorganisms were isolated and identified. The identification of Gram-negative and Gram-positive microorganisms showed an agreement between the direct and the standard method of 100% and 86.2%, and of 93.3% and 65.78%, respectively. For antibiotic susceptibility tests, 903 (Centre 1) and 491 (Centre 2) compounds were assessed in Gram-negative bacteria, and 396 and 509 in Gram-positive bacteria, respectively. The analysis highlighted that Centre 1 reported 0.30% very major errors (GE), 0.92% major errors (EM) and 1.23% minor errors (Em); Centre 2 showed 0.57% very major errors (GE), 0.09% major errors

  9. Neural Bases of Automaticity (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.


    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that, across practice, processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  10. Rapid and automatic chemical identification of the medicinal flower buds of Lonicera plants by the benchtop and hand-held Fourier transform infrared spectroscopy (United States)

    Chen, Jianbo; Guo, Baolin; Yan, Rui; Sun, Suqin; Zhou, Qun


    With the advent of hand-held equipment, Fourier transform infrared (FT-IR) spectroscopy is a promising analytical technique to minimize the time cost of the chemical identification of herbal materials. This research examines the feasibility of a hand-held FT-IR spectrometer for the on-site testing of herbal materials, using Lonicerae Japonicae Flos (LJF) and Lonicerae Flos (LF) as examples. Correlation-based linear discriminant models for LJF and LF are established for both the benchtop and hand-held FT-IR instruments. The benchtop FT-IR models correctly recognize all articles of LJF and LF. Although a few LF articles are misjudged at the sub-class level, the hand-held FT-IR models are able to discriminate exactly between LJF and LF. As a direct and label-free analytical technique, FT-IR spectroscopy has great potential for the rapid and automatic chemical identification of herbal materials both in the laboratory and in the field. This is helpful to prevent, in a timely manner, the spread and use of adulterated herbal materials.


    Directory of Open Access Journals (Sweden)

    Yuita Arum Sari


    Full Text Available Abstract Twitter is a social media application whose content can provide signals for identifying user emotion. Identification of user emotion can be utilized in commercial, health, political, and security domains. The difficulty of emotion identification in tweets stems from their unstructured short text messages, which make it hard to extract the main features. In this paper, we propose a new framework for identifying the tendency of user emotions using specific features, i.e. hashtags, emoji, emoticons, and adjective terms. Preprocessing is applied in the first phase, and then user emotions are identified by means of a kNN classifier. The proposed method achieves good results, close to the ground truth, with an accuracy of 92%.
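
    The kNN classification step described above can be sketched in a few lines of pure Python; the feature vectors (hashtag, emoji, emoticon, and adjective scores) and the emotion labels below are hypothetical stand-ins for the paper's actual features, not its dataset.

    ```python
    # Minimal kNN sketch; feature extraction is assumed done upstream.
    from collections import Counter
    import math

    def knn_predict(train, labels, x, k=3):
        """Majority vote among the k training points nearest to x."""
        nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))[:k]
        return Counter(labels[i] for i in nearest).most_common(1)[0][0]

    # Hypothetical per-tweet features: [hashtag, emoji, emoticon, adjective] scores.
    X_train = [[1.0, 0.8, 0.0, 0.6], [0.9, 0.7, 0.1, 0.5],
               [0.0, 0.1, 0.9, 0.8], [0.1, 0.0, 0.8, 0.9]]
    y_train = ["joy", "joy", "anger", "anger"]

    print(knn_predict(X_train, y_train, [0.95, 0.75, 0.05, 0.55]))  # joy
    ```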

  12. An investigation into the factors that influence toolmark identifications on ammunition discharged from semi-automatic pistols recovered from car fires. (United States)

    Collender, Mark A; Doherty, Kevin A J; Stanton, Kenneth T


    Following a shooting incident where a vehicle is used to convey the culprits to and from the scene, both the getaway car and the firearm are often deliberately burned in an attempt to destroy any forensic evidence which might subsequently be recovered. Here we investigate the factors that influence the ability to make toolmark identifications on ammunition discharged from pistols recovered from such car fires. This work was carried out by conducting a number of controlled furnace tests in conjunction with real car fire tests in which three 9 mm semi-automatic pistols were burned. Comparisons between pre-burn and post-burn test-fired ammunition discharged from these pistols were then performed to establish whether identifications were still possible. The surfaces of the furnace-heated samples and car fire samples were examined following heating/burning to establish what factors had influenced their surface morphology. The primary influence on the surfaces of the furnace-heated and car fire samples was the formation of oxide layers. The car fire samples were altered to a greater extent than the furnace-heated samples. Identifications were still possible between pre- and post-burn discharged cartridge cases, but not for the discharged bullets. It is suggested that the reason for this is the difference between the toolmarks impressed onto the base of cartridge cases on discharge and those striated along the surfaces of bullets. It was also found that the temperatures recorded in the front foot wells were considerably lower than those recorded on top of the rear seats during the car fires. These factors should be assessed by forensic firearms examiners when performing casework involving pistols recovered from car fires. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Identification of alcohol abuse and transition from long-term unemployment to disability pension. (United States)

    Nurmela, Kirsti; Heikkinen, Virpi; Hokkanen, Risto; Ylinen, Aarne; Uitti, Jukka; Mattila, Aino; Joukamaa, Matti; Virtanen, Pekka


    The aim of the study was to reveal potential gaps and inconsistencies in the identification of alcohol abuse in health care and in employment services, and to analyse the granting of disability pensions with respect to the alcohol abuse identification pattern. The material consisted of documentary information on 505 long-term unemployed subjects with low employability sent to the development project entitled 'Eligibility for a Disability Pension' in 2001-2006 in Finland. The dichotomous variables 'Alcohol abuse identified in employment services' and 'Alcohol abuse identified in health care' were cross-tabulated to obtain a four-class variable, 'Alcohol abuse identification pattern'. Logistic regression analyses were conducted to ascertain the association of the alcohol abuse identification pattern with the granting of disability pensions. Alcohol abuse was detected by both health care and employment services in 47% of those identified as abusers (41% of examinees). Each service system also identified cases that the other did not. When alcohol abuse was identified in health care only, the OR for a disability pension being granted was 2.8 (95% CI 1.5-5.2) compared with applicants without identified alcohol abuse. The result remained the same and statistically significant after adjusting for confounders. Alcohol abuse identified in health care was positively associated with the granting of a disability pension. Closer co-operation between employment services and health care could help to identify those long-term unemployed individuals with impaired work ability in need of thorough medical examination. © 2015 the Nordic Societies of Public Health.

  14. Automatic Frequency Identification under Sample Loss in Sinusoidal Pulse Width Modulation Signals Using an Iterative Autocorrelation Algorithm

    Directory of Open Access Journals (Sweden)

    Alejandro Said


    Full Text Available In this work, we present a simple algorithm to automatically calculate the Fourier spectrum of a Sinusoidal Pulse Width Modulation (SPWM) signal. Modulated voltage signals of this kind are used in industry by speed drives to vary the speed of alternating-current motors while maintaining a smooth torque. Nevertheless, the SPWM technique produces undesired harmonics, which cause stator heating and power losses. By monitoring these signals without human interaction, it is possible to identify the harmonic content of SPWM signals in a fast and continuous manner. The algorithm is based on the autocorrelation function, commonly used in radar and voice signal processing. Taking advantage of the symmetry properties of the autocorrelation, the algorithm is capable of estimating half of the period of the fundamental frequency, thus allowing one to estimate the number of samples necessary to produce an accurate Fourier spectrum. To deal with the loss of samples, i.e., the scan backlog, the algorithm iteratively acquires and trims the discrete sequence of samples until the required number of samples reaches a stable value. The simulation shows that the algorithm is affected by neither the magnitude of the switching pulses nor the acquisition noise.
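
    The core idea, recovering the fundamental period of a modulated signal from the peak of its autocorrelation, can be sketched as follows; the square-wave test signal, the sampling rate, and the brute-force autocorrelation are illustrative assumptions, not the authors' implementation.

    ```python
    import math

    def autocorr(x, lag):
        """Unnormalized autocorrelation of x at the given lag."""
        return sum(x[i] * x[i + lag] for i in range(len(x) - lag))

    def estimate_period(x, max_lag):
        """Lag of the largest autocorrelation peak in 1..max_lag."""
        return max(range(1, max_lag + 1), key=lambda lag: autocorr(x, lag))

    fs = 1000   # sampling rate in Hz (assumed)
    f0 = 50     # fundamental frequency in Hz
    # Square-wave stand-in for an SPWM phase voltage.
    x = [math.copysign(1.0, math.sin(2 * math.pi * f0 * t / fs)) for t in range(400)]

    period = estimate_period(x, 100)
    print(fs / period)  # recovered fundamental: 50.0 Hz
    ```

    Once the period is known, the number of samples needed for a leakage-free spectrum is simply an integer multiple of it.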

  15. Automatic identification of rockfalls and volcano-tectonic earthquakes at the Piton de la Fournaise volcano using a Random Forest algorithm (United States)

    Hibert, Clément; Provost, Floriane; Malet, Jean-Philippe; Maggi, Alessia; Stumpf, André; Ferrazzini, Valérie


    Monitoring the endogenous seismicity of volcanoes helps to forecast eruptions and prevent their related risks, and also provides critical information on the eruptive processes. Due to the high number of events recorded during pre-eruptive periods by seismic monitoring networks, cataloging each event can be complex and time-consuming if done by human operators. Automatic seismic signal processing methods are thus essential to build consistent catalogs based on objective criteria. We evaluated the performance of the "Random Forests" (RF) machine-learning algorithm for classifying seismic signals recorded at the Piton de la Fournaise volcano, La Réunion Island (France). We focused on the discrimination of the dominant event types (rockfalls and volcano-tectonic earthquakes) using over 19,000 events covering two time periods: 2009-2011 and 2014-2015. We parametrized the seismic signals using 60 attributes that were then given to the RF algorithm. When the RF classifier was given enough training samples, its sensitivity (rate of correct identification) exceeded 99%, and its performance remained high (above 90%) even with few training samples. The sensitivity collapsed when an RF classifier trained with data from 2009-2011 was used to classify data from the 2014-2015 catalog, because the physical characteristics of the rockfalls, and hence their seismic signals, had evolved between the two time periods. The main attribute families (waveform, spectrum, spectrogram and polarization) were all found to be useful for event discrimination. Our work validates the performance of the RF algorithm and suggests it could be implemented at other volcanic observatories to perform automatic, near-real-time classification of seismic events.
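
    The classification step can be sketched with scikit-learn's RandomForestClassifier; the two synthetic attributes below (standing in for the paper's 60 waveform/spectral attributes) and the class separations are assumptions for illustration only.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Hypothetical attributes per event, e.g. signal duration (s) and dominant
    # frequency (Hz): rockfalls longer and lower-frequency than VT earthquakes.
    rockfalls = rng.normal(loc=[30.0, 5.0], scale=2.0, size=(200, 2))
    vt_quakes = rng.normal(loc=[5.0, 15.0], scale=2.0, size=(200, 2))
    X = np.vstack([rockfalls, vt_quakes])
    y = np.array([0] * 200 + [1] * 200)  # 0 = rockfall, 1 = volcano-tectonic

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    print(clf.predict([[28.0, 6.0], [4.0, 16.0]]))  # [0 1]
    ```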

  16. Automatic identification of methotrexate-induced liver toxicity in patients with rheumatoid arthritis from the electronic medical record. (United States)

    Lin, Chen; Karlson, Elizabeth W; Dligach, Dmitriy; Ramirez, Monica P; Miller, Timothy A; Mo, Huan; Braggs, Natalie S; Cagan, Andrew; Gainer, Vivian; Denny, Joshua C; Savova, Guergana K


    To improve the accuracy of mining structured and unstructured components of the electronic medical record (EMR) by adding temporal features to automatically identify patients with rheumatoid arthritis (RA) with methotrexate-induced liver transaminase abnormalities. Codified information and a string-matching algorithm were applied to an RA cohort of 5903 patients from Partners HealthCare to select 1130 patients with potential liver toxicity. Supervised machine learning was applied as our key method. For features, the Apache clinical Text Analysis and Knowledge Extraction System (cTAKES) was used to extract standard vocabulary from relevant sections of the unstructured clinical narrative. Temporal features were further extracted to assess the temporal relevance of event mentions with regard to the date of the transaminase abnormality. All features were encapsulated in a 3-month-long episode for classification. Results were summarized at the patient level in a training set (N=480 patients) and evaluated against a test set (N=120 patients). The system achieved a positive predictive value (PPV) of 0.756, sensitivity of 0.919, and F1 score of 0.829 on the test set, which was significantly better than the best baseline system (PPV 0.590, sensitivity 0.703, F1 score 0.642). Our innovations, which included framing the phenotype problem as an episode-level classification task and adding temporal information, all proved highly effective. Automated methotrexate-induced liver toxicity phenotype discovery for patients with RA based on structured and unstructured information in the EMR shows accurate results. Our work demonstrates that adding temporal features significantly improved classification results. © The Author 2014. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email:

  17. The influence of short-term memory on standard discrimination and cued identification olfactory tasks. (United States)

    Zucco, Gesualdo M; Hummel, Thomas; Tomaiuolo, Francesco; Stevenson, Richard J


    Amongst the techniques used to assess olfactory function, discrimination and cued identification are those most prone to the influence of odour short-term memory (STM). The discrimination task requires participants to detect the odd one out of three presented odourants. As re-smelling is not permitted, an unintended STM load may arise, even though the task purports to assess discrimination ability. Analogously, the cued identification task requires participants to smell an odour and then select a label from three or four alternatives. As the interval between smelling and reading each label increases, this too imposes an STM load, even though the task aims to measure identification ability. We tested whether modifying the task designs to reduce STM load, thereby preventing these sources of error, improves performance on these tests. We examined five age groups of participants (Adolescents, Young adults, Middle-aged, Elderly, very Elderly), some of whom should be more prone to the effects of STM load than others, on standard and modified tests of discrimination and identification. We found that using a technique to reduce STM load improved performance, especially for the very Elderly and Adolescent groups. The findings indicate that STM load can adversely affect performance in groups vulnerable to memory impairment (i.e., the very Elderly) and in those who may still be acquiring memory-based representations of familiar odours (i.e., Adolescents). It may be that adults in general would be even more sensitive to the effects of olfactory STM load reduction if the odour-related task were more difficult. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. I RAN Fast and I Remembered What I Read: The Relationship between Reading, Rapid Automatic Naming, and Auditory and Visual Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Sheila G. Crewther


    Full Text Available Although rapid automatic naming (RAN) speed and short-term auditory memory are widely recognised as good predictors of reading ability in most age groups, the predictive value of short-term memory for visually presented digits for reading and RAN in young, typically developing learner readers (mean age 91.5 months) has seldom been investigated. We found that visual digit span is a better predictor of reading ability than auditory digit span in learner readers. A significant correlation was also found between RAN speed and visual, but not auditory, digit span. These results suggest that RAN speed may be a good predictor of a child's future reading ability and eventual fluency because, like visual digit span, it measures the rate of access to memory for visual icons and their semantic names and meanings. The results also suggest that auditory memory is not an important factor in young children learning to read.

  19. Identification of major depressive disorder among the long-term unemployed. (United States)

    Nurmela, Kirsti; Mattila, Aino; Heikkinen, Virpi; Uitti, Jukka; Ylinen, Aarne; Virtanen, Pekka


    Depression is a common mental health disorder among the unemployed, but research on identifying their depression in health care is scarce. The present study aimed to explore the identification of major depressive disorder (MDD) in health care among the long-term unemployed, and to find out whether the duration of unemployment correlates with the risk of unidentified MDD. The study sample consisted of the patient files of long-term unemployed people (duration of unemployment 1-35 years, median 11 years) who were diagnosed with MDD in a screening project (n = 243). The MDD diagnosis was found in the health care records of 101 of them. Binomial logistic regression models were used to explore the effect of the duration of unemployment, as a discrete variable, on the identification of MDD in health care. MDD was appropriately identified in health care for 42% (n = 101) of the participants with MDD. The odds ratio for unidentified MDD in health care was 1.060 (95% confidence interval 1.011; 1.111, p = 0.016) per unemployment year. When unemployment had continued, for example, for five years, the odds ratio for having unidentified MDD was 1.336. The association remained significant throughout adjustments for a set of background factors (gender, age, occupational status, marital status, homelessness, criminal record, suicide attempts, number of health care visits). This study among depressed long-term unemployed people indicates that the longer the unemployment period has lasted, the more commonly these people suffer from unidentified MDD. Health services should be developed with respect to sensitivity to detect signs of depression among the long-term unemployed.

  20. The Development of Automaticity in Short-Term Memory Search: Item-Response Learning and Category Learning (United States)

    Cao, Rui; Nosofsky, Robert M.; Shiffrin, Richard M.


    In short-term-memory (STM)-search tasks, observers judge whether a test probe was present in a short list of study items. Here we investigated the long-term learning mechanisms that lead to the highly efficient STM-search performance observed under conditions of consistent-mapping (CM) training, in which targets and foils never switch roles across…

  1. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich


    Automatic sequences are sequences produced by a finite automaton. Although they are not random, they may look random. They are complicated in the sense of not being ultimately periodic, and it may not be easy to name the rule by which a given sequence is generated; nevertheless, such a rule always exists. The concept of automatic sequences has applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
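
    A classic example of such a sequence is the Thue-Morse sequence, which is 2-automatic: a two-state automaton reads the binary digits of n, and the final state (the parity of the 1-bits of n) is the n-th term. A minimal sketch:

    ```python
    def thue_morse(n):
        """n-th Thue-Morse term: run a 2-state automaton over the binary
        digits of n; the state tracks the parity of 1-bits seen so far."""
        state = 0
        for bit in bin(n)[2:]:
            if bit == "1":
                state ^= 1   # transition on reading a 1
        return state

    print("".join(str(thue_morse(n)) for n in range(16)))  # 0110100110010110
    ```

    The output looks irregular and is provably not ultimately periodic, yet the generating rule is a two-state machine.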

  2. An MRI-derived definition of MCI-to-AD conversion for long-term, automatic prognosis of MCI patients.

    Directory of Open Access Journals (Sweden)

    Yaman Aksu

    Full Text Available Alzheimer's disease (AD) and mild cognitive impairment (MCI) are of great current research interest. While there is no consensus on whether MCIs actually "convert" to AD, this concept is widely applied. Thus, the more important question is not whether MCIs convert, but what is the best such definition. We focus on automatic prognostication, nominally using only a baseline brain image, of whether an MCI patient will convert within a multi-year period following the initial clinical visit. This is not a traditional supervised learning problem since, in ADNI, there are no definitive labeled conversion examples. It is not unsupervised, either, since there are (labeled) ADs and Controls, as well as cognitive scores for MCIs. Prior works have defined MCI subclasses based on whether or not clinical scores significantly change from baseline. There are concerns with these definitions, however, since, e.g., most MCIs (and ADs) do not change from a baseline CDR = 0.5 at any subsequent visit in ADNI, even while physiological changes may be occurring. These works ignore rich phenotypical information in an MCI patient's brain scan, and in the labeled AD and Control examples, when defining conversion. We propose an innovative definition, wherein an MCI patient is a converter if any of the patient's brain scans are classified "AD" by a Control-AD classifier. This definition bootstraps the design of a second classifier, specifically trained to predict whether or not MCIs will convert. We thus predict whether an AD-Control classifier will predict that a patient has AD. Our results demonstrate that this definition leads not only to much higher prognostic accuracy than by-CDR conversion, but also to subpopulations more consistent with known AD biomarkers (including CSF markers). We also identify key prognostic brain-region biomarkers.
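
    The two-stage labeling idea can be sketched as follows; the synthetic one-dimensional "scan features", the group means, and the LogisticRegression stand-in are all assumptions for illustration, not the paper's imaging pipeline.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    controls = rng.normal(0.0, 1.0, size=(100, 1))   # Control scans
    ads      = rng.normal(4.0, 1.0, size=(100, 1))   # AD scans

    # Stage 1: Control-vs-AD classifier trained on the labeled groups.
    stage1 = LogisticRegression().fit(np.vstack([controls, ads]),
                                      [0] * 100 + [1] * 100)

    # Stage 2: an MCI patient is labeled a "converter" if any scan classifies
    # as AD; these bootstrapped labels would then train the conversion classifier.
    mci_scans = np.array([[0.2], [3.8], [4.2], [-0.5]])
    print(stage1.predict(mci_scans))  # [0 1 1 0]
    ```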

  3. Radio frequency identification (RFID) of dentures in long-term care facilities. (United States)

    Madrid, Carlos; Korsvold, Tové; Rochat, Aline; Abarca, Marcelo


    The difficulty of identifying the ownership of lost dentures when found is a common and expensive problem in long-term care facilities (LTCFs) and hospitals. The purpose of this study was to evaluate the reliability of using radiofrequency identification (RFID) in the identification of dentures for LTCF residents after 3 and 6 months. Thirty-eight residents of 2 LTCFs in Switzerland agreed to participate after providing informed consent. The tag was programmed with the family and first names of the participants and then inserted in the dentures. After placement of the tag, the information was read. A second and third assessment to review the functioning of the tag occurred at 3 and 6 months, and defective tags (if present) were reported and replaced. The data were analyzed with descriptive statistics. At the 3-month assessment of 34 residents (63 tags), 1 tag was unreadable and 62 tags (98.2%) were operational. At 6 months, the tags of 27 of the enrolled residents (50 tags) were available for review. No examined tag was defective at this time period. Within the limits of this study (number of patients, 6-month time span), RFID appears to be a reliable method of tracking and identifying dentures, with only 1 of 65 devices being unreadable at 3 months and 100% of 50 initially placed tags being readable at the end of the trial. Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.

  4. Automatic segmentation of the hippocampus for preterm neonates from early-in-life to term-equivalent age

    Directory of Open Access Journals (Sweden)

    Ting Guo


    Conclusions: MAGeT-Brain is capable of segmenting hippocampi accurately in preterm neonates, even at early-in-life. Hippocampal asymmetry with a larger right side is demonstrated on early-in-life images, suggesting that this phenomenon has its onset in the 3rd trimester of gestation. Hippocampal volume assessed at the time of early-in-life and term-equivalent age is linearly associated with GA at birth, whereby smaller volumes are associated with earlier birth.

  5. The development of automaticity in short-term memory search: Item-response learning and category learning. (United States)

    Cao, Rui; Nosofsky, Robert M; Shiffrin, Richard M


    In short-term-memory (STM)-search tasks, observers judge whether a test probe was present in a short list of study items. Here we investigated the long-term learning mechanisms that lead to the highly efficient STM-search performance observed under conditions of consistent-mapping (CM) training, in which targets and foils never switch roles across trials. In item-response learning, subjects learn long-term mappings between individual items and target versus foil responses. In category learning, subjects learn high-level codes corresponding to separate sets of items and learn to attach old versus new responses to these category codes. To distinguish between these 2 forms of learning, we tested subjects in categorized varied mapping (CV) conditions: There were 2 distinct categories of items, but the assignment of categories to target versus foil responses varied across trials. In cases involving arbitrary categories, CV performance closely resembled standard varied-mapping performance without categories and departed dramatically from CM performance, supporting the item-response-learning hypothesis. In cases involving prelearned categories, CV performance resembled CM performance, as long as there was sufficient practice or steps taken to reduce trial-to-trial category-switching costs. This pattern of results supports the category-coding hypothesis for sufficiently well-learned categories. Thus, item-response learning occurs rapidly and is used early in CM training; category learning is much slower but is eventually adopted and is used to increase the efficiency of search beyond that available from item-response learning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. Predictors of long-term mortality in Multicenter Automatic Defibrillator Implantation Trial II (MADIT II) patients with implantable cardioverter-defibrillators. (United States)

    Cygankiewicz, Iwona; Gillespie, John; Zareba, Wojciech; Brown, Mary W; Goldenberg, Ilan; Klein, Helmut; McNitt, Scott; Polonsky, Slava; Andrews, Mark; Dwyer, Edward M; Hall, W Jackson; Moss, Arthur J


    Data on long-term follow-up and factors influencing mortality in implantable cardioverter-defibrillator (ICD) recipients are limited. The aim of this study was to evaluate mortality during long-term follow-up and the predictive value of several risk markers in Multicenter Automatic Defibrillator Implantation Trial II (MADIT II) patients with ICDs. The study involved U.S. patients from the MADIT II trial randomized to and receiving ICD treatment. Data regarding long-term mortality were retrieved from the National Death Registry. Several clinical, biochemical, and electrocardiogram variables were tested in a multivariate Cox model for predicting long-term mortality, and a score identifying high-, medium-, and lower-risk patients was developed. The study population consisted of 655 patients, mean age 64 ± 10 years. During a follow-up of up to 9 years, averaging 63 months, 294 deaths occurred. The 6-year cumulative probability of death was 40%, with evidence of a constant risk of about 8.5% per year among survivors. Median survival was estimated at 8 years. Multivariate analysis identified age >65 years, New York Heart Association class 3-4, diabetes, non-sinus rhythm, and increased levels of blood urea nitrogen as independent risk predictors of mortality. Patients with three or more of these risk factors had a 6-year mortality rate of 68%, compared with 43% in those with one to two risk factors and 19% in patients with no risk factors. A combination of a few readily available clinical variables indicating advanced disease and comorbid conditions identifies ICD patients at high risk of mortality during long-term follow-up.

  7. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before eruptions: BET_VHst for Vesuvius and Campi Flegrei during recent exercises (United States)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Rouwet, Dmtri; Tonini, Roberto; Macedonio, Giovanni; Marzocchi, Warner


    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions that are under consideration. Short temporal intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecasts and to update them frequently to account for the rapidly evolving situation. This information is obviously crucial for crisis management, since tephra may heavily affect building stability, public health, transportation and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology named BET_VHst (Selva et al. 2014) for short-term PVHA of volcanic tephra dispersal based on automatic interpretation of measurements from the monitoring system and on physical models of tephra dispersal from all possible vent positions and eruptive sizes, driven by frequently updated meteorological forecasts. The large uncertainty at all the steps required for the analysis, both aleatory and epistemic, is treated by means of Bayesian inference and statistical mixing of long- and short-term analyses. The BET_VHst model is here presented through its implementation during two exercises organized for volcanoes in the Neapolitan area: MESIMEX for Mt. Vesuvius, and VUELCO for Campi Flegrei. References Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252

  8. Long-term clinical evaluation of the automatic stance-phase lock-controlled prosthetic knee joint in young adults with unilateral above-knee amputation. (United States)

    Andrysek, Jan; Wright, F Virginia; Rotter, Karin; Garcia, Daniela; Valdebenito, Rebeca; Mitchell, Carlos Alvarez; Rozbaczylo, Claudio; Cubillos, Rafael


    The purpose of this study was to clinically evaluate the automatic stance-phase lock (ASPL) knee mechanism against participants' existing weight-activated braking (WAB) prosthetic knee joints. This prospective crossover study involved 10 young adults with an above-knee amputation. Primary measurements consisted of tests of walking speed and capacity. Heart rate was measured during the six-minute walk test, and the Physiological Cost Index (PCI) was calculated from heart-rate-estimated energy expenditure. Activity was measured with a pedometer. User function and quality of life were assessed using the Lower Limb Function Questionnaire (LLFQ) and the Prosthetic Evaluation Questionnaire (PEQ). A long-term follow-up over 12 months was completed. Walking speeds were the same for the WAB and ASPL knees. Energy expenditure (PCI) was lower for the ASPL knees (p = 0.007). Step counts were the same for both knees, and the questionnaires indicated a preference for the ASPL knee, attributed primarily to knee stability and improved walking, while its limitations included terminal impact noise. Nine of the 10 participants chose to keep using the ASPL knee as part of the long-term follow-up. Potential benefits of the ASPL knee were identified in this study by functional measures, questionnaires and user feedback, but not by changes in activity or the PEQ.

  9. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.


    This manual describes, and provides trouble-shooting aids for, the Automatic Sample Changer electronics on the automatic beta counting system developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier and then coupled to an amplifier. The amplifier output is discriminated and forms the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, the scaler data, and other information are punched out on a data card. The next sample to be counted is then automatically selected. The beta counter uses the same electronics for each count, the only difference being the sample identification number and the sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and punching the needed data on an 80-column data card

  10. Operator overloading as an enabling technology for automatic differentiation

    International Nuclear Information System (INIS)

    Corliss, G.F.; Griewank, A.


    We present an example of the science that is enabled by object-oriented programming techniques. Scientific computation often needs derivatives for solving nonlinear systems such as those arising in many PDE algorithms, optimization, parameter identification, stiff ordinary differential equations, or sensitivity analysis. Automatic differentiation computes derivatives accurately and efficiently by applying the chain rule to each arithmetic operation or elementary function. Operator overloading enables the techniques of either the forward or the reverse mode of automatic differentiation to be applied to real-world scientific problems. We illustrate automatic differentiation with an example drawn from a model of unsaturated flow in a porous medium. The problem arises from planning for the long-term storage of radioactive waste
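
    The forward mode described above is easy to sketch with a dual-number class whose overloaded operators carry a derivative value through every operation; this toy Python class (supporting only addition and multiplication) is an illustrative sketch, not the authors' tooling.

    ```python
    class Dual:
        """Dual number a + b·ε (with ε² = 0): overloading + and * makes every
        arithmetic operation apply the chain rule, giving exact derivatives."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot   # value and derivative
        def _wrap(self, o):
            return o if isinstance(o, Dual) else Dual(o)
        def __add__(self, o):
            o = self._wrap(o)
            return Dual(self.val + o.val, self.dot + o.dot)
        __radd__ = __add__
        def __mul__(self, o):
            o = self._wrap(o)
            # Product rule: (uv)' = u'v + uv'
            return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
        __rmul__ = __mul__

    def f(x):
        return 3 * x * x + 2 * x    # f'(x) = 6x + 2

    y = f(Dual(4.0, 1.0))           # seed dx/dx = 1
    print(y.val, y.dot)             # 56.0 26.0
    ```

    The same unmodified function f yields both its value and its exact derivative, which is precisely what makes operator overloading an enabling technology here.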

  11. Automatic segmentation of diatom images for classification

    NARCIS (Netherlands)

    Jalba, Andrei C.; Wilkinson, Michael H.F.; Roerdink, Jos B.T.M.

    A general framework for automatic segmentation of diatom images is presented. This segmentation is a critical first step in contour-based methods for automatic identification of diatoms by computerized image analysis. We review existing results, adapt popular segmentation methods to this difficult

  12. Inverse Relationship of Blood Pressure to Long-Term Outcomes and Benefit of Cardiac Resynchronization Therapy in Patients With Mild Heart Failure: A Multicenter Automatic Defibrillator Implantation Trial With Cardiac Resynchronization Therapy Long-Term Follow-Up Substudy. (United States)

    Biton, Yitschak; Moss, Arthur J; Kutyifa, Valentina; Mathias, Andrew; Sherazi, Saadia; Zareba, Wojciech; McNitt, Scott; Polonsky, Bronislava; Barsheshet, Alon; Brown, Mary W; Goldenberg, Ilan


    Previous studies have shown that low blood pressure is associated with increased mortality and heart failure (HF) in patients with left ventricular dysfunction. Cardiac resynchronization therapy (CRT) was shown to increase systolic blood pressure (SBP). Therefore, we hypothesized that treatment with CRT would provide incremental benefit in patients with lower SBP values. The independent contribution of SBP to outcome was analyzed in 1267 patients with left bundle branch block enrolled in the Multicenter Automatic Defibrillator Implantation Trial With Cardiac Resynchronization Therapy (MADIT-CRT). SBP was assessed as a continuous measure and further categorized into approximate quintiles. The risk of long-term HF or death and the benefit of CRT with defibrillator versus implantable cardioverter defibrillator alone were assessed in multivariate Cox proportional hazards regression models. Multivariate analysis showed that in the implantable cardioverter defibrillator arm, each 10-mm Hg decrement of SBP was independently associated with a significant 21% (P<0.001) increase in the risk of HF or death, with the lowest SBP quintile showing a >2-fold risk-increase. CRT with defibrillator provided the greatest HF or mortality risk reduction in patients with SBP<136 mm Hg, with a hazard ratio of 0.94, P=0.808, in patients with SBP≥136 mm Hg (P for trend=0.001). In patients with mild HF, prolonged QRS, and left bundle branch block, low SBP is related to higher risk of mortality or HF with implantable cardioverter defibrillator therapy alone. Treatment with CRT is associated with incremental clinical benefits in patients with lower baseline SBP values. URL: Unique identifier: NCT00180271. © 2015 American Heart Association, Inc.
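
    The reported per-10-mm-Hg hazard increase (21%) compounds multiplicatively under the log-linearity assumed by a Cox model. A minimal sketch; the 30-mm Hg decrement is an illustrative input, not a figure from the study:

```python
def relative_hazard(sbp_decrement_mm_hg, per_10mm=1.21):
    """Hazard multiplier for a given SBP decrement, assuming log-linearity."""
    return per_10mm ** (sbp_decrement_mm_hg / 10.0)

print(round(relative_hazard(30), 2))  # → 1.77, approaching a 2-fold risk
```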

  13. Identification and long term stability of DNA captured on a dental impression wafer. (United States)

    Kim, Maile; Siegler, Kate; Tamariz, Jeannie; Caragine, Theresa; Fernandez, Jill; Daronch, Marcia; Moursi, Amr


    The purpose of this study was to determine the quantity and quality of DNA extracted from a dental bite impression wafer immediately after impression and after 12 months of home storage. The authors' hypothesis was that the wafer would retain sufficient DNA with appropriate genetic markers to make an identification match. Two impression wafers (Toothprints(®) brand) were administered to 100 3- to 26-year-olds. A cotton swab was used as a control. DNA from wafers stored for 12 months at home was compared to DNA collected at time 0 and to swabs at specific sites to determine quality and accuracy. The amount of DNA captured and recovered was analyzed using MagAttract technology and a quantitative real-time polymerase chain reaction. Capillary gel electrophoresis was performed to determine the quality of the DNA profiles obtained from the wafers vs those generated from the swabs of each subject. Average DNA concentration was: 480 pg/μL (wafer at time 0); 392 pg/μL (wafer after 12 months kept by subjects); and 1,041 pg/μL (buccal swab). Sufficient DNA for human identification was recovered from all sets of wafers, producing clear DNA profiles and accurate matches to buccal swabs. No inhibitors were found that could interfere with DNA profiling. Toothprints® impression wafers can be useful for DNA collection and child identification. After 12 months, the wafer was still usable for DNA capture and identification match.

  14. What Information Does Your EHR Contain? Automatic Generation of a Clinical Metadata Warehouse (CMDW) to Support Identification and Data Access Within Distributed Clinical Research Networks. (United States)

    Bruland, Philipp; Doods, Justin; Storck, Michael; Dugas, Martin


    Data dictionaries provide structural meta-information about data definitions in health information technology (HIT) systems. In this regard, reusing healthcare data for secondary purposes offers several advantages (e.g. reduced documentation times or increased data quality). Prerequisites for data reuse are data quality, availability and identical meaning. In diverse projects, research data warehouses serve as core components between heterogeneous clinical databases and various research applications. Given the complexity (high number of data elements) and dynamics (regular updates) of electronic health record (EHR) data structures, we propose a clinical metadata warehouse (CMDW) based on a metadata registry standard. Metadata of two large hospitals were automatically inserted into two CMDWs containing 16,230 forms and 310,519 data elements. Automatic updates of metadata are possible, as are semantic annotations. A CMDW allows metadata discovery, data quality assessment and similarity analyses. Common data models for distributed research networks can be established based on similarity analyses.
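
    The similarity analyses a CMDW enables can be illustrated with a toy comparison of data-element names from two hypothetical hospitals; the element names and the use of Jaccard overlap are assumptions for illustration, not the authors' method.

```python
# Jaccard overlap between two hospitals' data-element vocabularies,
# as a minimal stand-in for cross-site metadata similarity analysis.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

hospital_a = {"systolic_bp", "diastolic_bp", "heart_rate", "icd10_code"}
hospital_b = {"systolic_bp", "heart_rate", "icd10_code", "spo2"}

print(round(jaccard(hospital_a, hospital_b), 2))  # → 0.6
```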

  15. Automatic identification of the number of food items in a meal using clustering techniques based on the monitoring of swallowing and chewing


    Lopez-Meyer, Paulo; Schuckers, Stephanie; Makeyev, Oleksandr; Fontana, Juan M.; Sazonov, Edward


    The number of distinct foods consumed in a meal is of significant clinical concern in the study of obesity and other eating disorders. This paper proposes the use of information contained in chewing and swallowing sequences for meal segmentation by food types. Data collected from experiments of 17 volunteers were analyzed using two different clustering techniques. First, an unsupervised clustering technique, Affinity Propagation (AP), was used to automatically identify the number of segments ...
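
    A hedged sketch of the clustering step, using scikit-learn's AffinityPropagation on synthetic two-dimensional feature vectors as stand-ins for the chewing/swallowing statistics used in the study:

```python
# Unsupervised estimation of the number of meal segments with Affinity
# Propagation, which chooses the cluster count itself. The three synthetic
# clouds stand in for per-segment chewing/swallowing features.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
features = np.vstack([
    rng.normal(loc=center, scale=0.3, size=(30, 2))
    for center in ([0.0, 0.0], [4.0, 0.0], [0.0, 4.0])
])

ap = AffinityPropagation(random_state=0).fit(features)
n_foods = len(ap.cluster_centers_indices_)  # estimated number of food items
```

    For well-separated clouds like these, AP typically recovers the natural number of clusters without being told it in advance.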

  16. Sequence protein identification by randomized sequence database and transcriptome mass spectrometry (SPIDER-TMS): from manual to automatic application of a 'de novo sequencing' approach. (United States)

    Pascale, Raffaella; Grossi, Gerarda; Cruciani, Gabriele; Mecca, Giansalvatore; Santoro, Donatello; Sarli Calace, Renzo; Falabella, Patrizia; Bianco, Giuliana

    Sequence protein identification by a randomized sequence database and transcriptome mass spectrometry software package has been developed at the University of Basilicata in Potenza (Italy), designed to facilitate the determination of the amino acid sequence of a peptide as well as unequivocal identification of proteins in a high-throughput manner, with enormous advantages in time, economic resources and expertise. The software package is a valid tool for the automation of a de novo sequencing approach, overcoming its main limits, and a versatile platform useful in the proteomic field for unequivocal identification of proteins starting from tandem mass spectrometry data. The strength of this software is that it is a user-friendly and non-statistical approach, so protein identification can be considered unambiguous.
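
    The core of any de novo sequencing approach is reading residues off a fragment-ion mass ladder. The following toy sketch (rounded residue masses, a synthetic b-ion ladder, and a four-residue peptide are all illustrative assumptions) shows the idea:

```python
# Rounded monoisotopic residue masses (illustrative subset).
masses = {"G": 57.02, "A": 71.04, "V": 99.07, "L": 113.08}

# Build a synthetic b-ion ladder for the peptide "GAVL".
b_ions, total = [], 1.01          # ~proton mass offset (illustrative)
for aa in "GAVL":
    total += masses[aa]
    b_ions.append(round(total, 2))

# De novo step: residues are the mass differences between consecutive ions.
diffs = [round(hi - lo, 2) for lo, hi in zip([1.01] + b_ions, b_ions)]
lookup = {m: aa for aa, m in masses.items()}
sequence = "".join(lookup[d] for d in diffs)
print(sequence)  # → GAVL
```

    Real spectra add noise, missing peaks, and ambiguous mass pairs (e.g. leucine/isoleucine), which is what dedicated software must resolve.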

  17. Molecular identification of Taenia specimens after long-term preservation in formalin. (United States)

    Jeon, Hyeong-Kyu; Kim, Kyu-Heon; Eom, Keeseon S


    The majority of Taenia tapeworm specimens in museum collections are kept in a formalin fixative for permanent preservation, mainly for use in morphological examinations. This study aims to improve Taenia tapeworm identification, even for specimens preserved in formalin for up to 81 years. Taenia tapeworms were collected by the parasite collection unit of the Swiss Natural History Museum and from units in Indonesia, Japan and Korea. A small amount of formalin-fixed tissue (100 mg) was crushed in liquid nitrogen and then soaked in a Tris-EDTA buffer for 3-5 h. The sample was then digested in SDS and proteinase K (20 mg/ml) for 3-5 h at 56 °C. After the addition of proteinase K (20 mg/ml), SDS and hexadecyl-trimethyl-ammonium bromide (CTAB), incubation was continued for another 3 h at 65 °C. A maximum yield of genomic DNA was obtained from this additional step, and the quality of genomic DNA obtained with this extraction method appeared to be independent of the duration of storage in the formalin fixative. The molecular identification of Taenia tapeworms was performed by using PCR and DNA sequences corresponding to positions 80-428 of the cox1 gene. T. asiatica was detected in the isolates from Indonesia, Japan and Korea. Improvements in the genomic DNA extraction method for formalin-fixed museum collections will help in the molecular identification of parasites. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  18. Automatisms: bridging clinical neurology with criminal law. (United States)

    Rolnick, Joshua; Parvizi, Josef


    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  19. Identification of structural DNA variations in human cell cultures after long-term passage. (United States)

    Pavlova, G V; Vergun, A A; Rybalkina, E Y; Butovskaya, P R; Ryskov, A P


    Random amplified polymorphic DNA (RAPD) analysis was adapted for genomic identification of cell cultures and evaluation of DNA stability in cells of different origin at different culture passages. DNA stability was observed in cultures after no more than 5 passages. Adipose-derived stromal cells demonstrated increased DNA instability. RAPD fragments from different cell lines after different numbers of passages were cloned and sequenced. The chromosomal localization of these fragments was identified, and single-nucleotide variations in RAPD fragments isolated from cell lines after 8-12 passages were revealed. Some of them had permanent localization, while most variations demonstrated random distribution and can be considered de novo mutations.

  20. DNA evolutionary algorithm (DNAEA) for source term identification in convection-diffusion equation

    International Nuclear Information System (INIS)

    Yang, X-H; Hu, X-X; Shen, Z-Y


    The source identification problem is recast as an optimization problem in this paper. The resulting problem is a complicated nonlinear optimization problem that is intractable with traditional optimization methods, so a DNA evolutionary algorithm (DNAEA) is presented to solve it. In this algorithm, an initial population is generated by a chaos algorithm. As the search range shrinks, DNAEA gradually converges to an optimal result built from the best individuals it obtains. The position and intensity of the pollution source are well identified with DNAEA. Compared with a Gray-coded genetic algorithm and a pure random search algorithm, DNAEA converges faster and achieves higher calculation precision.
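
    Two ingredients of the described algorithm, a logistic-map chaotic initial population and iterative shrinking of the search range, can be sketched on a toy source-identification objective. The objective, bounds, and contraction factor below are illustrative assumptions, not the paper's formulation:

```python
# Toy "source identification": recover (position, intensity) = (3.7, 2.5)
# by minimizing a misfit, with chaos-generated candidates and a search box
# that contracts around the best individual found so far.
import numpy as np

def objective(p):
    pos, strength = p
    return (pos - 3.7) ** 2 + (strength - 2.5) ** 2

def chaos_stream(x0=0.37):
    x = x0
    while True:                    # fully chaotic logistic map on (0, 1)
        x = 4.0 * x * (1.0 - x)
        yield x

stream = chaos_stream()
lo, hi = np.zeros(2), np.full(2, 10.0)
best = None
for _ in range(30):                # one shrinking "generation" per loop
    pop = lo + (hi - lo) * np.array(
        [[next(stream), next(stream)] for _ in range(50)])
    cand = min(pop, key=objective)
    if best is None or objective(cand) < objective(best):
        best = cand
    half = 0.4 * (hi - lo)         # contract the box around the incumbent
    lo = np.clip(best - half, 0.0, 10.0)
    hi = np.clip(best + half, 0.0, 10.0)

print(objective(best) < 0.1)  # → True
```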

  1. A comparative analysis of DBSCAN, K-means, and quadratic variation algorithms for automatic identification of swallows from swallowing accelerometry signals. (United States)

    Dudik, Joshua M; Kurosu, Atsuko; Coyle, James L; Sejdić, Ervin


    Cervical auscultation with high resolution sensors is currently under consideration as a method of automatically screening for specific swallowing abnormalities. To be clinically useful without human involvement, any devices based on cervical auscultation should be able to detect specified swallowing events in an automatic manner. In this paper, we comparatively analyze the density-based spatial clustering of applications with noise algorithm (DBSCAN), a k-means based algorithm, and an algorithm based on quadratic variation as methods of differentiating periods of swallowing activity from periods of time without swallows. These algorithms utilized swallowing vibration data exclusively and compared the results to a gold standard measure of swallowing duration. Data were collected from 23 subjects who were actively suffering from swallowing difficulties. Comparing the performance of the DBSCAN algorithm with a proven segmentation algorithm that utilizes k-means clustering demonstrated that the DBSCAN algorithm had a higher sensitivity and correctly segmented more swallows. Comparing its performance with a threshold-based algorithm that utilized the quadratic variation of the signal showed that the DBSCAN algorithm offered no direct increase in performance. However, it offered several other benefits including a faster run time and more consistent performance between patients. All algorithms showed noticeable differentiation from the endpoints provided by a videofluoroscopy examination as well as reduced sensitivity. In summary, we showed that the DBSCAN algorithm is a viable method for detecting the occurrence of a swallowing event using cervical auscultation signals, but significant work must be done to improve its performance before it can be implemented in an unsupervised manner. Copyright © 2015 Elsevier Ltd. All rights reserved.
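
    The DBSCAN-based segmentation idea can be sketched on a synthetic signal: cluster the time indices of high-energy samples so that each dense run becomes one detected swallow. The thresholds and DBSCAN parameters below are illustrative, not those of the study:

```python
# Detect "swallow" events as dense clusters of above-threshold samples.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
signal = 0.05 * rng.standard_normal(1000)   # 10 s of baseline noise at 100 Hz
signal[200:260] += 1.0                      # first synthetic swallow burst
signal[600:660] += 1.0                      # second burst

# cluster the time indices of high-energy samples; each dense run = a swallow
active = np.flatnonzero(np.abs(signal) > 0.5)
labels = DBSCAN(eps=5, min_samples=5).fit_predict(active.reshape(-1, 1))
n_swallows = len(set(labels) - {-1})        # -1 marks DBSCAN noise points
print(n_swallows)  # → 2
```

    Clustering in time rather than thresholding alone makes the detector robust to isolated noise spikes, which DBSCAN discards as outliers.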

  2. Comparison of short-term estrogenicity tests for identification of hormone-disrupting chemicals

    DEFF Research Database (Denmark)

    Andersen, H R; Andersson, A M; Arnold, S F


    The aim of this study was to compare results obtained by eight different short-term assays of estrogenlike actions of chemicals conducted in 10 different laboratories in five countries. Twenty chemicals were selected to represent direct-acting estrogens, compounds with estrogenic metabolites...

  3. Identification of a functional connectome for long-term fear memory in mice.

    Directory of Open Access Journals (Sweden)

    Anne L Wheeler

    Full Text Available Long-term memories are thought to depend upon the coordinated activation of a broad network of cortical and subcortical brain regions. However, the distributed nature of this representation has made it challenging to define the neural elements of the memory trace, and lesion and electrophysiological approaches provide only a narrow window into what is appreciated to be a much more global network. Here we used a global mapping approach to identify networks of brain regions activated following recall of long-term fear memories in mice. Analysis of Fos expression across 84 brain regions allowed us to identify regions that were co-active following memory recall. These analyses revealed that the functional organization of long-term fear memories depends on memory age and is altered in mutant mice that exhibit premature forgetting. Most importantly, these analyses indicate that long-term memory recall engages a network that has a distinct thalamic-hippocampal-cortical signature. This network is concurrently integrated and segregated and therefore has small-world properties, and contains hub-like regions in the prefrontal cortex and thalamus that may play privileged roles in memory expression.
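
    The co-activation analysis can be illustrated in miniature: correlate per-region activity across subjects, threshold into a graph, and look for hub regions by degree. The region names, data, and thresholds below are synthetic stand-ins, not the study's Fos data:

```python
# Build a co-activation network from per-region "activity counts" and find
# hub-like regions as nodes with high degree in the thresholded graph.
import numpy as np

rng = np.random.default_rng(4)
regions = ["PFC", "thalamus", "CA1", "CA3", "amygdala", "V1"]
memory_signal = rng.standard_normal(20)            # shared engram activity
counts = np.column_stack([
    memory_signal + 0.3 * rng.standard_normal(20)  # coupled regions
    if r in ("PFC", "thalamus", "CA1")
    else rng.standard_normal(20)                   # independent regions
    for r in regions
])

corr = np.corrcoef(counts.T)                       # region-by-region matrix
adj = (np.abs(corr) > 0.6) & ~np.eye(len(regions), dtype=bool)
degree = adj.sum(axis=1)
hubs = [r for r, d in zip(regions, degree) if d >= 2]
```

    In the real analysis, graph-theoretic measures such as clustering coefficient and path length would then establish the small-world character of the network.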

  4. Long-term forecasting of hourly electricity load: Identification of consumption profiles and segmentation of customers

    DEFF Research Database (Denmark)

    Møller Andersen, Frits; Larsen, Helge V.; Boomsma, Trine Krogh


    Data for aggregated hourly electricity demand shows systematic variations over the day, week, and seasons, and forecasting of aggregated hourly electricity load has been the subject of many studies. With hourly metering of individual customers, data for individual consumption profiles is available....... Using this data and analysing the case of Denmark, we show that consumption profiles for categories of customers are equally systematic but very different for distinct categories, that is, distinct categories of customers contribute differently to the aggregated electricity load profile. Therefore......, to model and forecast long-term changes in the aggregated electricity load profile, we identify profiles for different categories of customers and link these to projections of the aggregated annual consumption by categories of customers. Long-term projection of the aggregated load is important for future...
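
    The described bottom-up forecast can be sketched as scaling normalized per-category hourly profiles by projected consumption and summing. The category names and numbers below are illustrative:

```python
# Aggregate load = sum over categories of (projected consumption x profile).
import numpy as np

hours = 24
profiles = {
    "households": np.roll(np.bartlett(hours), 6),  # evening-peaked shape
    "industry": np.ones(hours),                    # flat base load
}
profiles = {k: v / v.sum() for k, v in profiles.items()}  # normalize to 1

# projected daily consumption per category, in MWh (illustrative numbers)
projection = {"households": 1200.0, "industry": 3000.0}

aggregate = sum(projection[c] * profiles[c] for c in profiles)
peak_hour = int(np.argmax(aggregate))
```

    Because the profiles are normalized, the total energy of the aggregate equals the sum of the category projections, so long-term changes in the mix of categories reshape the hourly profile without breaking the annual totals.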

  5. Liabilities identification and long-term management - Review of French situation

    International Nuclear Information System (INIS)


    In France, long term liabilities due to nuclear activities concern four main operators: Electricite de France (EDF), AREVA (an industrial group created on September 3, 2001 and covering the entire fuel cycle from ore extraction and transformation to the recycling of spent fuel), the Atomic Energy Commission (CEA, the French public research organism in the nuclear sector) and the French Agency for radioactive waste management (ANDRA, in charge of the long term operation of radioactive waste installations). Long term liabilities are due to the financing of both decommissioning of nuclear installations and radioactive waste long term management. In the current French organisational scheme, the different operators must take the responsibility for these long term liabilities. The setting of national policies and the establishment of the legislation are carried out at a national level by the French state. These include the supervision of the three operators through different Ministries and the regulatory control of safety through the Nuclear Safety Authority (ASN). EDF, AREVA, CEA and ANDRA are responsible for all aspects of decommissioning (from both a technical and a financial point of view). Within a safety regulatory frame, they take their own initiative concerning future expenses, based on estimated costs and the expected operational lifetime of the installations. They are responsible for the definition and implementation of the technical options. Through its supervision activities, the French State regularly requires updating studies of these estimated costs, which are conducted by the operators. A general review of the management of these long-term liabilities is also carried out every four years by the French Court of Accounts. Operators are required to constitute provisions during the life cycle of their installations. Provisions are calculated for each installation on the basis of the decommissioning expenses and of the reasonably estimated lifetime. They are re

  6. Comparison of short-term estrogenicity tests for identification of hormone-disrupting chemicals

    DEFF Research Database (Denmark)

    Andersen, H R; Andersson, A M; Arnold, S F


    The aim of this study was to compare results obtained by eight different short-term assays of estrogenlike actions of chemicals conducted in 10 different laboratories in five countries. Twenty chemicals were selected to represent direct-acting estrogens, compounds with estrogenic metabolites...... cells, transient reporter gene expression in MCF-7 cells, reporter gene expression in yeast strains stably transfected with the human ER and an estrogen-responsive reporter gene, and vitellogenin production in juvenile rainbow trout. 17beta-Estradiol, 17alpha-ethynyl estradiol, and diethylstilbestrol...... methods vary in their sensitivity to estrogenic compounds. Thus, short-term tests are useful for screening purposes, but the methods must be further validated by additional interlaboratory and interassay comparisons to document the reliability of the methods....

  7. Comparison of Short-Term Estrogenicity Tests for Identification of Hormone-Disrupting Chemicals

    DEFF Research Database (Denmark)

    Andersen, Helle Raun; Andersson, Anna-Maria; Arnold, Steven F.


    The aim of this study was to compare results obtained by eight different short-term assays of estrogenlike actions of chemicals conducted in 10 different laboratories in five countries. Twenty chemicals were selected to represent direct-acting estrogens, compounds with estrogenic metabolites......, transient reporter gene expression in MCF-7 cells, reporter gene expression in yeast strains stably transfected with the human ER and an estrogen-responsive reporter gene, and vitellogenin production in juvenile rainbow trout. 17β-Estradiol, 17α-ethynyl estradiol, and diethylstilbestrol induced a strong...... in their sensitivity to estrogenic compounds. Thus, short-term tests are useful for screening purposes, but the methods must be further validated by additional interlaboratory and interassay comparisons to document the reliability of the methods....

  8. Automatic Detection of Fake News


    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada


    The proliferation of misleading information in everyday access media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  9. Assessment of Granger causality by nonlinear model identification: application to short-term cardiovascular variability. (United States)

    Faes, Luca; Nollo, Giandomenico; Chon, Ki H


    A method for assessing Granger causal relationships in bivariate time series, based on nonlinear autoregressive (NAR) and nonlinear autoregressive exogenous (NARX) models, is presented. The method evaluates bilateral interactions between two time series by quantifying the predictability improvement (PI) of the output time series when the dynamics associated with the input time series are included, i.e., moving from NAR to NARX prediction. The NARX model identification was performed by the optimal parameter search (OPS) algorithm, and its results were compared to the least-squares method to determine the most appropriate method to be used for experimental data. The statistical significance of the PI was assessed using a surrogate data technique. The proposed method was tested with simulation examples involving short realizations of linear stochastic processes and nonlinear deterministic signals in which either unidirectional or bidirectional coupling and varying strengths of interactions were imposed. It was found that the OPS-based NARX model was accurate and sensitive in detecting imposed Granger causality conditions. In addition, the OPS-based NARX model was more accurate than the least squares method. Application to the systolic blood pressure and heart rate variability signals demonstrated the feasibility of the method. In particular, we found a bilateral causal relationship between the two signals as evidenced by the significant reduction in the PI values with the NARX model prediction compared to the NAR model prediction, which was also confirmed by the surrogate data analysis. Furthermore, we found significant reduction in the complexity of the dynamics of the two causal pathways of the two signals as the body position was changed from the supine to upright. The proposed method is general; thus, it can be applied to a wide variety of physiological signals to better understand causality and coupling that may be different between normal and diseased
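
    The predictability-improvement idea can be sketched with linear AR/ARX models fit by least squares, as a stand-in for the paper's nonlinear NAR/NARX models with OPS identification. Here x drives y, so adding lags of x should shrink the prediction error of y:

```python
# Granger-style predictability improvement: PI = 1 - MSE(ARX) / MSE(AR).
import numpy as np

rng = np.random.default_rng(2)
n, p = 2000, 2
x = rng.standard_normal(n)
y = np.zeros(n)
for k in range(1, n):                      # x Granger-causes y by construction
    y[k] = 0.5 * y[k - 1] + 0.8 * x[k - 1] + 0.1 * rng.standard_normal()

def residual_mse(target, regressors):
    coef, *_ = np.linalg.lstsq(regressors, target, rcond=None)
    return float(np.mean((target - regressors @ coef) ** 2))

Y = y[p:]
lags_y = np.column_stack([y[p - i:n - i] for i in range(1, p + 1)])   # "NAR"
lags_yx = np.column_stack([lags_y] +
                          [x[p - i:n - i] for i in range(1, p + 1)])  # "NARX"
pi = 1.0 - residual_mse(Y, lags_yx) / residual_mse(Y, lags_y)
print(pi > 0.5)  # → True: including x lags sharply improves prediction of y
```

    Running the same computation with the roles of x and y swapped would yield a PI near zero, which is the asymmetry that defines Granger causality.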

  10. Classification of Large-Scale Remote Sensing Images for Automatic Identification of Health Hazards: Smoke Detection Using an Autologistic Regression Classifier. (United States)

    Wolters, Mark A; Dean, C B


    Remote sensing images from Earth-orbiting satellites are a potentially rich data source for monitoring and cataloguing atmospheric health hazards that cover large geographic regions. A method is proposed for classifying such images into hazard and nonhazard regions using the autologistic regression model, which may be viewed as a spatial extension of logistic regression. The method includes a novel and simple approach to parameter estimation that makes it well suited to handling the large and high-dimensional datasets arising from satellite-borne instruments. The methodology is demonstrated on both simulated images and a real application to the identification of forest fire smoke.
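
    A hedged sketch of an autologistic-style classifier: ordinary logistic regression augmented with an autocovariate (the local mean of neighboring labels), fit by pseudolikelihood on a synthetic image. This simplification omits the paper's estimation refinements:

```python
# Autologistic-style classification of a noisy image: pixel intensity plus
# a spatial autocovariate, iterated at prediction time.
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
truth = np.zeros((40, 40), dtype=int)
truth[10:30, 10:30] = 1                                     # "smoke" region
intensity = truth + 0.8 * rng.standard_normal(truth.shape)  # noisy image

def autocov(labels):
    """Local mean of labels (3x3 window): the spatial autocovariate."""
    return uniform_filter(labels.astype(float), size=3)

# pseudolikelihood fit: use the true labels' autocovariate during training
X = np.column_stack([intensity.ravel(), autocov(truth).ravel()])
model = LogisticRegression().fit(X, truth.ravel())

# prediction: start from a crude threshold, then iterate the autocovariate
labels = (intensity > 0.5).astype(int)
for _ in range(5):
    Xp = np.column_stack([intensity.ravel(), autocov(labels).ravel()])
    labels = model.predict(Xp).reshape(truth.shape)

accuracy = (labels == truth).mean()   # spatial term cleans up the noise
```

    The spatial term acts like a majority vote among neighbors, which is why the iterated prediction recovers the contiguous hazard region far better than per-pixel thresholding.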

  11. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.


    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand and shoe monitors. In addition, this prototype system has a number of new features, including: microcomputer control and readout, nineteen large-area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company controlled boundaries. Investigation of the commercially available portal and hand and shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)
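
    The real-time background compensation can be sketched as a counting-statistics alarm test: flag a detector when net counts exceed a few standard deviations of the Poisson background. The threshold k=3 and the counts below are illustrative, not UNI's settings:

```python
# Background-compensated contamination alarm based on Poisson statistics.
import math

def alarm(gross_counts, background_counts, k=3.0):
    """Alarm when net counts exceed k sigma of the background."""
    net = gross_counts - background_counts
    return net > k * math.sqrt(max(background_counts, 1))

print(alarm(260, 200))  # net 60 > 3 * sqrt(200) ≈ 42.4 → True
print(alarm(210, 200))  # net 10 is within background fluctuation → False
```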

  12. Screening local Lactobacilli from Iran in terms of production of lactic acid and identification of superior strains

    Directory of Open Access Journals (Sweden)

    Fatemeh Soleimanifard


    Full Text Available Introduction: Lactobacilli are a group of lactic acid bacteria whose final fermentation product is lactic acid. The objective of this research is the selection of local Lactobacilli producing L(+) lactic acid. Materials and methods: In this research the local strains were screened based on their ability to produce lactic acid. The screening was performed in two stages: the first stage used the titration method and the second stage the enzymatic method. The superior strains obtained from the titration method were selected for the enzymatic test. Finally, the superior strains from the second (enzymatic) stage, which had the ability to produce L(+) lactic acid, were identified by biochemical tests. Then, molecular identification of the strains was performed by 16S rRNA sequencing. Results: In this study, the ability of 79 strains of local Lactobacilli to produce lactic acid was studied. The highest and lowest rates of lactic acid production were 34.8 and 12.4 mg/g. The superior Lactobacilli produced the optical isomer L(+); the highest level of L(+) lactic acid was 3.99 and the lowest 1.03 mg/g. The biochemical and molecular identification of the superior strains showed that they are Lactobacillus paracasei. The 16S rRNA sequences of the superior strains were deposited in NCBI with accession numbers KF735654, KF735655, KJ508201 and KJ508202. Discussion and conclusion: The amounts of lactic acid produced by the local Lactobacilli varied widely, and some strains produced more than previously reported. The results of this research suggest the use of the superior strains of Lactobacilli for production of pure L(+) lactic acid.

  13. Identification of long-term containment/stabilization technology performance issues

    International Nuclear Information System (INIS)

    Matthern, G.E.; Nickelson, D.F.


    U.S. Department of Energy (DOE) faces a particular challenge when addressing in situ remedial alternatives that leave long-lived radionuclides and hazardous contaminants onsite. These contaminants will remain a potential hazard for thousands of years. However, the risks, costs, and uncertainties associated with removal and offsite disposal are leading many sites to select in situ disposal alternatives. Improvements in containment, stabilization, and monitoring technologies will enhance the viability of such alternatives for implementation. DOE's Office of Science and Technology sponsored a two-day workshop designed to investigate issues associated with the long-term in situ stabilization and containment of buried, long-lived hazardous and radioactive contaminants. The workshop facilitated communication among end users representing most sites within the DOE, regulators, and technologists to define long-term performance issues for in situ stabilization and containment alternatives. Participants were divided into groups to identify issues and a strategy to address priority issues. This paper presents the results of the working groups and summarizes the conclusions. A common issue identified by the work groups is communication. Effective communication between technologists, risk assessors, end users, regulators, and other stakeholders would contribute greatly to resolution of both technical and programmatic issues

  14. Extended CT scale overcomes restoration caused streak artifacts for dental identification in CT--3D color encoded automatic discrimination of dental restorations. (United States)

    Jackowski, C; Lussi, A; Classens, M; Kilchoer, T; Bolliger, S; Aghayev, E; Criste, A; Dirnhofer, R; Thali, M J


    Besides DNA, dental radiographs play a major role in the identification of victims in mass casualties or in corpses with major postmortem alterations. Computed tomography (CT) is increasingly applied in forensic investigations and is used to scan the dentition of deceased persons within minutes. We investigated different restoration materials concerning their radiopacity in CT for dental identification purposes. Extracted teeth with different filling materials (composite, amalgam, ceramic, temporary fillings) were CT scanned. Radiopacities of the filling materials were analyzed in extended CT scale images. Radiopacity values ranged from 6000-8500 HU (temporary fillings), 4500-17000 HU (composite fillings) and >30710 HU (amalgam and gold). The values were used to define presets for a 3D colored volume rendering software. Streak artifacts caused by the filling materials could be distinctly reduced for the assessment of the dental status, and a postprocessing algorithm was introduced that allows 3D color-encoded visualization and discrimination of different dental restorations based on postmortem CT data.
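
    The reported radiopacity ranges suggest a simple threshold classifier for color-encoded rendering. A sketch; the label names and the handling of the overlapping temporary/composite ranges are assumptions:

```python
# Map extended-scale HU values to restoration labels using the ranges
# reported in the study. The temporary range (6000-8500 HU) lies inside
# the composite range, so it is applied last and wins on overlap.
import numpy as np

def classify_restoration(hu):
    hu = np.asarray(hu)
    labels = np.full(hu.shape, "tooth/other", dtype=object)
    labels[(hu >= 4500) & (hu <= 17000)] = "composite"
    labels[(hu >= 6000) & (hu <= 8500)] = "temporary"
    labels[hu > 30710] = "amalgam/gold"
    return labels

print(classify_restoration([5000, 7000, 40000, 1500]).tolist())
# → ['composite', 'temporary', 'amalgam/gold', 'tooth/other']
```

    In a volume renderer, each label would be assigned its own color transfer function, which is essentially what the described presets do.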

  15. Identification of liabilities and long-term management of the fund in Hungary

    International Nuclear Information System (INIS)

    Czoch, Ildiko


    According to the basic rules laid down in Act CXVI of 1996 on Atomic Energy, radioactive waste management shall not impose an undue burden on future generations. To satisfy this requirement, the long-term costs of waste disposal and of decommissioning of the plant shall be paid by the generation that enjoys the benefits of nuclear energy production and applications of isotopes. Accordingly, under the Act and its executive orders, a Central Nuclear Financial Fund was established on 1 January 1998 to finance radioactive waste disposal, interim storage and disposal of spent fuel, as well as decommissioning of nuclear facilities. The Minister supervising the Hungarian Atomic Energy Authority disposes of the Central Nuclear Financial Fund, while the Hungarian Atomic Energy Authority is responsible for its management. The Fund is a separate state fund pursuant to Act XXXVIII of 1992 on Public Finance, exclusively earmarked for financing the construction and operation of disposal facilities for the final disposal of radioactive waste, as well as for the interim storage and final disposal of spent fuel, and the decommissioning of nuclear facilities. A long-term plan (up to the decommissioning of the nuclear facilities), a medium-term plan (for five years) and an annual work schedule are to be prepared on the use of the Fund. The preparation of these plans/schedules is within the responsibilities of the Public Agency for Radioactive Waste Management. The long- and medium-term plans shall be annually reviewed and revised as required. The long- and medium-term plans and the annual work schedule shall be approved by the Minister supervising the Hungarian Atomic Energy Authority. The payments into the Fund are defined in accordance with these plans. The liabilities of the Paks Nuclear Power Plant for annual payments into the Fund are included in the law on the central budget on the proposal of the Minister supervising the Hungarian Atomic Energy Authority. It is based upon

  16. Clinical chorioamnionitis at term VIII: a rapid MMP-8 test for the identification of intra-amniotic inflammation. (United States)

    Chaiyasit, Noppadol; Romero, Roberto; Chaemsaithong, Piya; Docheva, Nikolina; Bhatti, Gaurav; Kusanovic, Juan Pedro; Dong, Zhong; Yeo, Lami; Pacora, Percy; Hassan, Sonia S; Erez, Offer


    Clinical chorioamnionitis is the most common infection/inflammatory process diagnosed in labor and delivery units worldwide. The condition is a syndrome that can be caused by (1) intra-amniotic infection, (2) intra-amniotic inflammation without demonstrable microorganisms (i.e. sterile intra-amniotic inflammation), and (3) maternal systemic inflammation that is not associated with intra-amniotic inflammation. The presence of intra-amniotic inflammation is a risk factor for adverse maternal and neonatal outcomes in a broad range of obstetrical syndromes that includes clinical chorioamnionitis at term. Although the diagnosis of intra-amniotic infection has relied on culture results, such information is not immediately available for patient management. Therefore, the diagnosis of intra-amniotic inflammation could be helpful as a proxy for intra-amniotic infection, while results of microbiologic studies are pending. A rapid test is now available for the diagnosis of intra-amniotic inflammation, based on the determination of neutrophil collagenase or matrix metalloproteinase-8 (MMP-8). The objectives of this study were (1) to evaluate the diagnostic indices of a rapid MMP-8 test for the identification of intra-amniotic inflammation/infection in patients with the diagnosis of clinical chorioamnionitis at term, and (2) to compare the diagnostic performance of a rapid MMP-8 test to that of a conventional enzyme-linked immunosorbent assay (ELISA) interleukin (IL)-6 test for patients with clinical chorioamnionitis at term. A retrospective cohort study was conducted. A transabdominal amniocentesis was performed in patients with clinical chorioamnionitis at term (n=44). Amniotic fluid was analyzed using cultivation techniques (for aerobic and anaerobic bacteria as well as genital Mycoplasmas) and broad-range polymerase chain reaction (PCR) coupled with electrospray ionization mass spectrometry (PCR/ESI-MS). 
Amniotic fluid IL-6 concentrations were determined by ELISA, and rapid

  17. Polarization microscopy imaging for the identification of unfertilized oocytes after short-term insemination. (United States)

    Guo, Yi; Liu, Wenqiang; Wang, Yu; Pan, Jiaping; Liang, Shanshan; Ruan, Jingling; Teng, Xiaoming


    To develop a unique approach using polarization microscopy (PM) to determine whether the presence of a spindle can be used as an indicator associated with fertilization failure 5 hours after short-term insemination. Observational study. Assisted reproduction center. Eighty-five patients undergoing short-term insemination. Oocytes imaged via PM at 4, 5, and 6 hours after standard insemination. Spindle visualization and fertilization rate, with rescue intracytoplasmic sperm injection (ICSI) results determined by rates of normal fertilization, abnormal fertilization, and good-quality embryo formation. After standard insemination, comparisons of spindle visualization at three time points indicated that the predictive accuracy rates were 84.30% at 5 hours, 86.80% at 6 hours, and 62.20% at 4 hours, with the rates at 5 and 6 hours statistically significantly higher than at 4 hours. A spindle was present in 242 of the 788 metaphase-II oocytes 5 hours after insemination, and there were 204 failed fertilizations on day 1. The positive predictive value was 0.84. After rescue ICSI, the abnormal fertilization rate of the polar body group (assessed using the polar body visualization method) was statistically significantly higher than that of the PM group (assessed using the spindle visualization method) and the regular ICSI group (9.37%, 5.88%, and 4.87%, respectively). The presence of a spindle 5 hours after insemination in in vitro fertilization is an accurate indicator of unfertilized oocytes. Spindle imaging combined with rescue measures effectively prevents fertilization failure and decreases the polyspermy rate. Copyright © 2017 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
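
    The reported positive predictive value follows directly from the counts in the abstract: of the 242 oocytes still showing a spindle at 5 hours, 204 failed to fertilize. A one-line check:

```python
spindle_present = 242       # MII oocytes with a visible spindle at 5 hours
failed_fertilization = 204  # of those, unfertilized on day 1

ppv = failed_fertilization / spindle_present
print(round(ppv, 2))  # 0.84, matching the reported positive predictive value
```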

  18. compMS2Miner: An Automatable Metabolite Identification, Visualization, and Data-Sharing R Package for High-Resolution LC-MS Data Sets. (United States)

    Edmands, William M B; Petrick, Lauren; Barupal, Dinesh K; Scalbert, Augustin; Wilson, Mark J; Wickliffe, Jeffrey K; Rappaport, Stephen M


    A long-standing challenge of untargeted metabolomic profiling by ultrahigh-performance liquid chromatography-high-resolution mass spectrometry (UHPLC-HRMS) is the efficient transition from unknown mass spectral features to confident metabolite annotations. The compMS2Miner (Comprehensive MS2 Miner) package was developed in the R language to facilitate rapid, comprehensive feature annotation using a peak-picker output and MS2 data files as inputs. The number of MS2 spectra that can be collected during a metabolomic profiling experiment far outweighs the time available for painstaking manual interpretation; therefore, a degree of software workflow autonomy is required for broad-scale metabolite annotation. compMS2Miner integrates many useful tools in a single workflow for metabolite annotation and also provides a means to overview the MS2 data with a Web application GUI, compMS2Explorer (Comprehensive MS2 Explorer), that also facilitates data sharing and transparency. The automatable compMS2Miner workflow consists of the following steps: (i) matching unknown MS1 features to precursor MS2 scans, (ii) filtration of spectral noise (dynamic noise filter), (iii) generation of composite mass spectra by summation of multiple similar spectrum signals and removal of redundant/contaminant spectra, (iv) interpretation of possible fragment ion substructures using an internal database, (v) annotation of unknowns with chemical and spectral databases, with prediction of mammalian biotransformation metabolites, wrapper functions for in silico fragmentation software, nearest-neighbor chemical similarity scoring, random-forest-based retention time prediction, text-mining-based false positive removal/true positive ranking, chemical taxonomic prediction, and differential-evolution-based global annotation score optimization; and (vi) network graph visualizations, data curation, and sharing via the compMS2Explorer application. 
Metabolite identities and comments
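
    Step (i) of the workflow, matching unknown MS1 features to MS2 precursor scans, amounts to a tolerance search over precursor m/z and retention time. A minimal sketch in Python (compMS2Miner itself is an R package; the feature values and tolerances here are invented for illustration):

```python
# Hypothetical MS1 feature table and MS2 precursor scans; matching pairs each
# unknown MS1 feature with MS2 scans whose precursor m/z and retention time
# fall within instrument tolerances.
ms1_features = [  # (feature_id, mz, rt_seconds)
    ("F1", 180.0634, 312.0),
    ("F2", 255.2330, 540.5),
]
ms2_scans = [  # (scan_id, precursor_mz, rt_seconds)
    ("S1", 180.0631, 311.2),
    ("S2", 301.1410, 700.0),
]

def match(features, scans, ppm=10.0, rt_tol=5.0):
    """Pair MS1 features with MS2 scans within ppm and RT tolerances."""
    pairs = []
    for fid, fmz, frt in features:
        for sid, smz, srt in scans:
            if abs(smz - fmz) / fmz * 1e6 <= ppm and abs(srt - frt) <= rt_tol:
                pairs.append((fid, sid))
    return pairs

print(match(ms1_features, ms2_scans))  # [('F1', 'S1')]
```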

  19. Identification of anomalous ULF emission related to short term earthquake precursor (United States)

    Sihotang, Bertalina; Ahadi, Suaidi


    The observation of ULF emissions recorded at geomagnetic stations is a powerful tool in the study of short-term earthquake precursors. Anomalies that arise when there is no magnetic storm disturbance can be considered precursors of an earthquake. Data used in this research were geomagnetic data recorded at LWA station (main station) and at GSI and TSI stations (reference stations), together with earthquake data. The earthquakes occurred on September 23, 2015 (Mw 5.1), October 1, 2015 (Mw 5.4), and November 18, 2015 (Mw 5.2). In determining onset time and lead time, the polarization ratio Z/H at a frequency of 0.012 Hz was used. The lead times detected were 4 days, 7 days, and 4 days before the respective earthquakes. The azimuths of the anomalous ULF emissions, which delineate the earthquake preparation zone, were investigated with the Single Station Transfer Function (SSTF); for the three earthquakes they were 217.3°, 208.5°, and 184.3°.
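
    The Z/H polarization ratio used here is essentially the ratio of vertical to horizontal spectral power in a narrow band around 0.012 Hz. A minimal sketch, assuming one-second sampling and an illustrative bandwidth (the authors' exact windowing is not given in the abstract):

```python
import numpy as np

def polarization_ratio(z, h, fs, f0=0.012, bw=0.004):
    """Ratio of Z to H spectral power in a narrow band centred on f0 (Hz)."""
    freqs = np.fft.rfftfreq(len(z), d=1.0 / fs)
    power_z = np.abs(np.fft.rfft(z)) ** 2
    power_h = np.abs(np.fft.rfft(h)) ** 2
    band = (freqs >= f0 - bw / 2) & (freqs <= f0 + bw / 2)
    return power_z[band].sum() / power_h[band].sum()

# Synthetic one-day record at 1 Hz: Z carries half the amplitude of H
fs = 1.0
t = np.arange(86400) / fs
h = np.sin(2 * np.pi * 0.012 * t)
z = 0.5 * np.sin(2 * np.pi * 0.012 * t)
print(polarization_ratio(z, h, fs))  # 0.25 (power scales as amplitude squared)
```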

  20. Identification of trend in long term precipitation and reference evapotranspiration over Narmada river basin (India) (United States)

    Pandey, Brij Kishor; Khare, Deepak


    Precipitation and reference evapotranspiration are key parameters in hydro-meteorological studies and are used for agricultural planning, irrigation system design, and management. Precipitation and evaporative demand are expected to be altered under climate change, affecting sustainable development. In this article, the spatial variability and temporal trends of precipitation and reference evapotranspiration (ETo) were investigated over the Narmada river basin (India), a humid tropical climatic region. In the present study, 12 and 28 observatory stations were selected for precipitation and ETo, respectively, over a 102-year period (1901-2002). A rigorous analysis for trend detection was carried out using nonparametric tests such as Mann-Kendall (MK) and Spearman's Rho (SR). Sen's slope estimator was used to analyze the rate of change in the long-term series. All stations of the basin exhibit a positive trend for annual ETo, while 8% of the stations indicate a significant negative trend for mean annual precipitation. Change points of annual precipitation were identified around the year 1962 by applying Buishand's and Pettitt's tests. Annual mean precipitation decreased by 9% in the upper part of the basin and increased by up to 5% in the lower part due to temporal changes, while annual mean ETo increased by 4-12% in most of the region. The results of the study are helpful in the planning and development of agricultural water resources.
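
    The two nonparametric tools named above are simple to state: the Mann-Kendall statistic sums the signs of all pairwise differences in the series, and Sen's slope is the median of all pairwise slopes. A compact sketch (illustrative data, not the basin series):

```python
import numpy as np
from itertools import combinations

def mann_kendall_S(x):
    """Mann-Kendall S statistic: sum of sign(x[j] - x[i]) over all pairs i < j."""
    x = np.asarray(x, dtype=float)
    return sum(np.sign(x[j] - x[i]) for i, j in combinations(range(len(x)), 2))

def sens_slope(x):
    """Sen's slope estimator: median of pairwise slopes (x[j] - x[i]) / (j - i)."""
    x = np.asarray(x, dtype=float)
    slopes = [(x[j] - x[i]) / (j - i) for i, j in combinations(range(len(x)), 2)]
    return np.median(slopes)

series = np.array([3.0, 4.0, 5.0, 7.0, 6.0, 8.0])
print(mann_kendall_S(series))  # 13.0: positive -> increasing trend
print(sens_slope(series))      # 1.0 units per time step
```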

  1. Identification of factors promoting ex vivo maintenance of mouse hematopoietic stem cells by long-term single-cell quantification. (United States)

    Kokkaliaris, Konstantinos D; Drew, Erin; Endele, Max; Loeffler, Dirk; Hoppe, Philipp S; Hilsenbeck, Oliver; Schauberger, Bernhard; Hinzen, Christoph; Skylaki, Stavroula; Theodorou, Marina; Kieslinger, Matthias; Lemischka, Ihor; Moore, Kateri; Schroeder, Timm


    The maintenance of hematopoietic stem cells (HSCs) during ex vivo culture is an important prerequisite for their therapeutic manipulation. However, despite intense research, culture conditions for robust maintenance of HSCs are still missing. Cultured HSCs are quickly lost, preventing their improved analysis and manipulation. Identification of novel factors supporting HSC ex vivo maintenance is therefore necessary. Coculture with the AFT024 stroma cell line is capable of maintaining HSCs ex vivo long-term, but the responsible molecular players remain unknown. Here, we use continuous long-term single-cell observation to identify the HSC behavioral signature under supportive or nonsupportive stroma cocultures. We report early HSC survival as a major characteristic of HSC-maintaining conditions. Behavioral screening after manipulation of candidate molecules revealed that the extracellular matrix protein dermatopontin (Dpt) is involved in HSC maintenance. DPT knockdown in supportive stroma impaired HSC survival, whereas ectopic expression of the Dpt gene or protein in nonsupportive conditions restored HSC survival. Supplementing defined stroma- and serum-free culture conditions with recombinant DPT protein improved HSC clonogenicity. These findings illustrate a previously uncharacterized role of Dpt in maintaining HSCs ex vivo. © 2016 by The American Society of Hematology.

  2. Hazard identification of inhaled nanomaterials: making use of short-term inhalation studies. (United States)

    Klein, Christoph L; Wiench, Karin; Wiemann, Martin; Ma-Hock, Lan; van Ravenzwaay, Ben; Landsiedel, Robert


    A major health concern for nanomaterials is their potential toxic effect after inhalation of dusts. Correspondingly, the core element of tier 1 in the currently proposed integrated testing strategy (ITS) is a short-term rat inhalation study (STIS) for this route of exposure. The STIS comprises a comprehensive scheme of biological effect and marker determinations in order to generate appropriate information on early key elements of pathogenesis, such as inflammatory reactions in the lung and indications of effects in other organs. Within the STIS, information on the persistence, progression and/or regression of effects is obtained. The STIS also addresses organ burden in the lung and potential translocation to other tissues. Up to now, the STIS has been performed in research projects and in routine testing of nanomaterials. Meanwhile, rat STIS results for more than 20 nanomaterials are available, including the representative nanomaterials listed by the Organization for Economic Cooperation and Development (OECD) working party on manufactured nanomaterials (WPMN), which has endorsed a list of representative manufactured nanomaterials (MN) as well as a set of relevant endpoints to be addressed. Here, results of STIS carried out with different nanomaterials are discussed as case studies. The ranking of different nanomaterials' potential to induce adverse effects, and the ranking of the respective NOAECs, are the same among the STIS and the corresponding subchronic and chronic studies. In another case study, translocation of a coated silica nanomaterial was judged critical for its safety assessment. Thus, the STIS enables application of the proposed ITS as long as reliable and relevant in vitro methods for tier 1 testing are still missing. Compared to traditional subacute and subchronic inhalation testing (according to OECD test guidelines 412 and 413), the STIS uses fewer animals and resources and offers additional information on organ burden and on the progression or regression of potential effects.

  3. Comparison of Multi-shot Models for Short-term Re-identification of People using RGB-D Sensors

    DEFF Research Database (Denmark)

    Møgelmose, Andreas; Bahnsen, Chris; Moeslund, Thomas B.


    This work explores different types of multi-shot descriptors for re-identification in an on-the-fly enrolled environment using RGB-D sensors. We present a full re-identification pipeline complete with detection, segmentation, feature extraction, and re-identification, which expands on previous work...

  4. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo


    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  5. A 100-m Fabry–Pérot Cavity with Automatic Alignment Controls for Long-Term Observations of Earth’s Strain

    Directory of Open Access Journals (Sweden)

    Akiteru Takamori


    Full Text Available We have developed and built a highly accurate laser strainmeter for geophysical observations. It features precise length measurement of a 100-m optical cavity with reference to a stable quantum standard. Unlike conventional laser strainmeters based on simple Michelson interferometers, which require uninterrupted fringe counting to track the evolution of ground deformations, this instrument is able to determine the absolute length of the cavity at any given time. The instrument offers advantages in covering a variety of geophysical events, ranging from instantaneous earthquakes to crustal deformations associated with tectonic strain changes that persist over time. An automatic alignment control and an autonomous relocking system were developed to realize stable performance and maximize observation time. The instrument was installed at a deep underground site in the Kamioka mine in Japan, and an effective resolution of 2 × 10−8 to 10−7 m was achieved. The regular tidal deformations and co-seismic strain changes were in good agreement with those from a theoretical model and a co-located conventional laser strainmeter. Only the new instrument was able to record large strain steps caused by a nearby large earthquake, owing to its capability of absolute length determination.

  6. Biometric identification standards research (United States)


    A "biometric" technology is an automatic method for the identification, or identity verification, of an individual based on physiological or behavioral characteristics. The primary objective of the study summarized in this tech brief was to make reco...

  7. Liabilities identification and long-term management at national level (Spain)

    International Nuclear Information System (INIS)

    Espejo Hernandez, Jose Manuel; Gonzalez Gomez, Jose Luis


    economic uncertainties in high-level waste disposal systems is a constant line of work, and in this respect ENRESA attempts to incorporate the most adequate techniques for cost analysis in a probabilistic framework. Even though the economic calculations are revised every year, tempering forecasting inaccuracies, it is felt that, in the longer term, problems might arise if there were a particularly significant time difference between the dates of plant decommissioning and the initiation of repository construction work. Under these conditions, any delay in constructing the definitive disposal facility might lead to not having sufficient financial resources available for its construction, operation or dismantling. Spanish legislation includes no indications in this respect. Conceptually, various treatment hypotheses could be envisaged, such as legally increasing the period of fee collection, the creation of an extra fee during the last few years of collection, the obligation for the waste producers to contract additional guarantees in order to address uncovered risks, or acceptance by the State of responsibilities in relation to this issue. Obviously, the case of a surplus of money after the completion of waste disposal is also to be taken into account. In relation to this hypothesis, criteria and procedures for liquidation or distribution would have to be set out. It is considered that, at present, it is too soon to approach such a question

  8. Long-term screening for sleep apnoea in paced patients: preliminary assessment of a novel patient management flowchart by using automatic pacemaker indexes and sleep lab polygraphy. (United States)

    Aimé, Ezio; Rovida, Marina; Contardi, Danilo; Ricci, Cristian; Gaeta, Maddalena; Innocenti, Ester; Cabral Tantchou-Tchoumi, Jacques


    The primary aim of this pilot study was to prospectively assess a flowchart to screen and diagnose paced patients (pts) affected by sleep apnoeas, by crosschecking indexes derived from pacemakers (with an on-board minute-ventilation sensor) against Sleep-Lab Polygraphy (PG) outcomes. Secondarily, "smoothed" long-term pacemaker indexes (all the information between two consecutive follow-up visits) were retrospectively compared with standard short-term pacemaker indexes (last 24 h) at each follow-up (FU) visit, to test their correlation and diagnostic concordance. Data from long-term FU of 61 paced pts were collected. At each visit, the standard short-term apnoea+hypopnoea (PM_AHI) index was retrieved from the pacemaker memory. Patients showing PM_AHI ≥ 30 at least once during FU were proposed to undergo a PG for diagnostic confirmation. Smoothed pacemaker (PM_SAHI) indexes were calculated by averaging the overall number of apnoeas/hypopnoeas over the period between two FU visits, and retrospectively compared with standard PM_AHI. Data were available from 609 consecutive visits (overall 4.64 ± 1.78 years FU). PM_AHI indexes were positive during FU in 40/61 pts (65.6%); 26/40 pts (65%) agreed to undergo a PG recording; the Sleep Lab confirmed positivity in 22/26 pts (84.6% positive predictive value for PM_AHI). A strong correlation (r=0.73) and a high level of concordance were found between smoothed and standard indexes (multivariate analysis, Cohen's kappa and Z-score tests). Pacemaker-derived indexes may help in screening paced pts potentially affected by sleep apnoeas. Long-term "smoothed" apnoea indexes could improve the accuracy of pacemaker screening capability, even though this hypothesis must be prospectively confirmed by larger studies. Copyright © 2014 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
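
    The comparison of "smoothed" versus standard indexes can be mimicked on synthetic data: the standard index reads only the last 24 h, while the smoothed index averages every night between two visits. All numbers below are made up for illustration; the study's indexes are derived from the pacemaker's minute-ventilation sensor, not from raw nightly counts.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical nightly apnoea+hypopnoea counts for 200 patients over a
# 180-night interval between two follow-up visits.
base_rate = rng.uniform(5, 40, size=200)[:, None]  # patient-specific severity
nightly = rng.poisson(lam=base_rate, size=(200, 180))

standard = nightly[:, -1]        # last-24h index read out at the visit
smoothed = nightly.mean(axis=1)  # average over the whole interval

r = np.corrcoef(standard, smoothed)[0, 1]
print(round(r, 2))  # strongly correlated, as the study found (r = 0.73 there)
```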

  9. DIRADTM - a system for real time detection and identification of radioactive objects

    International Nuclear Information System (INIS)

    Guillot, L.; Reboli, A.


    The authors present the DIRAD system (DIRAD stands for Detection and Identification of Radionuclides), an automatic system for real time identification of a radioactive anomaly and its interpretation in terms of risk level. It can be adapted to different contexts: pedestrian control, parcel or luggage control, road traffic control, and so on. In case of risk detection, an alert is transmitted in real time to a supervision station along with the whole set of spectral data

  10. Automatic fluid dispenser (United States)

    Sakellaris, P. C. (Inventor)


    Fluid automatically flows to individual dispensing units at predetermined times from a fluid supply and is available only for a predetermined interval of time after which an automatic control causes the fluid to drain from the individual dispensing units. Fluid deprivation continues until the beginning of a new cycle when the fluid is once again automatically made available at the individual dispensing units.

  11. 21 CFR 892.1900 - Automatic radiographic film processor. (United States)


    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automatic radiographic film processor. 892.1900 Section 892.1900 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES... processor. (a) Identification. An automatic radiographic film processor is a device intended to be used to...

  12. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António


    CONTROLO 2016, the 12th Portuguese Conference on Automatic Control (Guimarães, Portugal, September 14th to 16th), was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC, and promoted by the Portuguese Association for Automatic Control (APCA), the national member organization of the International Federation of Automatic Control (IFAC); the biennial CONTROLO conferences are the main events promoted by APCA. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show expressively how Automatic Control can be used to increase the well-being of people.

  13. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu


    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  14. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)


    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
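
    The chain-rule propagation described here is easy to demonstrate with forward-mode dual numbers; this is a generic textbook sketch, not code from the bibliography:

```python
import math

class Dual:
    """Forward-mode automatic differentiation via dual numbers a + b*eps."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)  # product rule
    __rmul__ = __mul__

def sin(x):
    # Chain rule for an elementary function: d/dx sin(u) = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# d/dx [x*sin(x) + 3x] at x = 2: no truncation error, unlike finite differences
x = Dual(2.0, 1.0)  # seed derivative dx/dx = 1
y = x * sin(x) + 3 * x
print(y.dot)  # sin(2) + 2*cos(2) + 3 ≈ 3.077
```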

  15. Automatically ordering events and times in text

    CERN Document Server

    Derczynski, Leon R A


    The book offers a detailed guide to temporal ordering, exploring open problems in the field and providing solutions and extensive analysis. It addresses the challenge of automatically ordering events and times in text. Aided by TimeML, it also describes and presents concepts relating to time in easy-to-compute terms. Working out the order that events and times happen has proven difficult for computers, since the language used to discuss time can be vague and complex. Mapping out these concepts for a computational system, which does not have its own inherent idea of time, is, unsurprisingly, tough. Solving this problem enables powerful systems that can plan, reason about events, and construct stories of their own accord, as well as understand the complex narratives that humans express and comprehend so naturally. This book presents a theory and data-driven analysis of temporal ordering, leading to the identification of exactly what is difficult about the task. It then proposes and evaluates machine-learning so...

  16. Automatic Amharic text news classification: A neural networks ...

    African Journals Online (AJOL)

    The study is on the automatic classification of Amharic news using a neural networks approach. The Learning Vector Quantization (LVQ) algorithm is employed to classify new instances of Amharic news based on a classifier developed using a training dataset. Two weighting schemes, Term Frequency (TF) and Term Frequency by ...
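
    The LVQ1 update rule at the heart of the approach moves the nearest prototype toward a training vector of the same class and away from one of a different class. A generic sketch on toy term-frequency vectors (not the Amharic dataset):

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=20):
    """LVQ1: attract the winning prototype on a class match, repel otherwise."""
    P = prototypes.astype(float).copy()
    for _ in range(epochs):
        for x, c in zip(X, y):
            k = np.argmin(((P - x) ** 2).sum(axis=1))  # nearest prototype
            step = lr * (x - P[k])
            P[k] += step if proto_labels[k] == c else -step
    return P

# Toy TF vectors for two news classes
X = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
y = [0, 0, 1, 1]
P = lvq1_train(X, y, np.array([[0.6, 0.4], [0.4, 0.6]]), proto_labels=[0, 1])

# Classify a new document by its nearest prototype's label
new_doc = np.array([0.8, 0.2])
print([0, 1][np.argmin(((P - new_doc) ** 2).sum(axis=1))])  # 0
```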

  17. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi


    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, the state space, state-space analysis of electric systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and the design of control systems.

  18. Focusing Automatic Code Inspections

    NARCIS (Netherlands)

    Boogerd, C.J.


    Automatic Code Inspection tools help developers in early detection of defects in software. A well-known drawback of many automatic inspection approaches is that they yield too many warnings and require a clearer focus. In this thesis, we provide such focus by proposing two methods to prioritize

  19. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.


    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided


    African Journals Online (AJOL)

    Both the nursing staff shortage and the need for precise control in the administration of dangerous drugs intravenously have led to the development of various devices to achieve an automatic system. The continuous automatic control of the drip rate eliminates errors due to any physical effect such as movement of the ...

  1. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike


    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  2. Automatic generation of warehouse mediators using an ontology engine

    Energy Technology Data Exchange (ETDEWEB)

    Critchlow, T., LLNL


    Data warehouses created for dynamic scientific environments, such as genetics, face significant challenges to their long-term feasibility. One of the most significant of these is the high frequency of schema evolution resulting from both technological advances and scientific insight. Failure to quickly incorporate these modifications will render the warehouse obsolete, yet each evolution requires significant effort to ensure the changes are correctly propagated. DataFoundry utilizes a mediated warehouse architecture with an ontology infrastructure to reduce the maintenance requirements of a warehouse. Among other things, the ontology is used as an information source for automatically generating mediators, the methods that transfer data between the data sources and the warehouse. The identification, definition, and representation of the metadata required to perform this task is a primary contribution of this work.
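
    The idea of generating mediators from an ontology can be caricatured in a few lines: a declarative mapping (standing in for the ontology) is enough to synthesize the function that moves records from a source schema into the warehouse schema. The column names and transforms below are invented for illustration; DataFoundry's actual mediator generation is far richer.

```python
# Hypothetical "ontology" fragment: warehouse column -> (source column, transform)
ontology = {
    "gene_symbol": ("symbol", str.upper),
    "length_kb":   ("length_bp", lambda bp: bp / 1000.0),
}

def make_mediator(mapping):
    """Generate a mediator function from a declarative schema mapping."""
    def mediator(source_record):
        return {dst: fn(source_record[src]) for dst, (src, fn) in mapping.items()}
    return mediator

# When the source schema evolves, only the mapping changes, not the mediator code
mediator = make_mediator(ontology)
print(mediator({"symbol": "brca1", "length_bp": 81189}))
# {'gene_symbol': 'BRCA1', 'length_kb': 81.189}
```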

  3. Automatic stereoscopic system for person recognition (United States)

    Murynin, Alexander B.; Matveev, Ivan A.; Kuznetsov, Victor D.


    A biometric access control system based on identification of the human face is presented. The system performs remote measurements of the necessary face features. Two different scenarios of system behavior are implemented. The first assumes verification of personal data entered by the visitor from a console using a keyboard or card reader; the system functions as an automatic checkpoint that strictly controls the access of different visitors. The other scenario makes it possible to identify visitors without any personal identifier or pass: only the person's biometrics are used to identify the visitor, and the recognition system automatically finds the necessary identification information previously stored in the database. Two laboratory models of the recognition system were developed. The models are designed to use different information types and sources. In addition to stereoscopic images input to the computer from cameras, the models can use voice data and some physical characteristics, such as the person's height, measured by the imaging system.

  4. PATH-01. Identification of Prognostic Variables Based on Molecular Profiling of Long-Term and Short-Term Surviving Glioblastoma Patients

    DEFF Research Database (Denmark)

    Michaelsen, Signe Regner; Urup, Thomas; Olsen, Lars Rønn


    Glioblastoma is a devastating disease and despite extensive treatment, overall survival (OS) for these patients remains poor. Yet, a small proportion of glioblastoma patients present relatively long survival of over 3 years, but the underlying molecular background separating these long-term survivors...... (LTS) from short-term survivors (STS) is still insufficiently understood. The purpose of this study was to identify independent prognostic variables for survival by examining molecular profiles of LTS and STS in a clinically well characterized cohort of glioblastoma patients. The cohort consisted....... For all patients, RNA had previously been purified from microdissected tumor tissue of the diagnostic specimen and analyzed for expression levels by a customized NanoString platform. This covered 800 genes related to glioblastoma cancer hallmarks, including regulation of angiogenesis and immune response...

  5. Fast automatic analysis of antenatal dexamethasone on micro-seizure activity in the EEG

    International Nuclear Information System (INIS)

    Rastin, S.J.; Unsworth, C.P.; Bennet, L.


    Full text: In this work we develop an automatic scheme for studying the effect of antenatal dexamethasone on EEG activity. To do so, an FFT (Fast Fourier Transform) based detector was designed and applied to EEG recordings obtained from two groups of fetal sheep. Both groups received two injections with a time delay of 24 h between them; however, the applied medicine was different for each group (dexamethasone and saline). The detector developed was used to automatically identify and classify micro-seizures that occurred in the frequency bands corresponding to the EEG transients known as slow waves (2.5–14 Hz). For each second of the data recordings the spectrum was computed, and a rise of the energy in each predefined frequency band was counted when the energy level exceeded a predefined corresponding threshold level (where the threshold level was obtained from the long-term average of the spectral points in each band). Our results demonstrate that it was possible to automatically count the micro-seizures for the three different bands in a time-effective manner. It was found that the number of transients did not strongly depend on the nature of the injected medicine, which was consistent with the results obtained manually by an EEG expert. In conclusion, the automatic detection scheme presented here would allow rapid micro-seizure event identification in hours of highly sampled EEG data, thus providing a valuable time-saving device.
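    As a rough illustration of the thresholding scheme described above, the sketch below counts 1-s epochs whose FFT band energy exceeds a multiple of the long-term average in that band. The function names, the 3x threshold factor and the synthetic test signal are our own assumptions, not the paper's parameters.

    ```python
    import numpy as np

    def count_band_transients(eeg, fs, bands, k=3.0):
        """Count 1-s epochs whose FFT band energy exceeds k times the
        long-term average energy in that band (k and the epoch length
        are assumptions; the paper derives its thresholds from
        long-term spectral averages in a similar way)."""
        n = int(fs)                              # samples per 1-s epoch
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        energies = {name: [] for name in bands}
        for i in range(len(eeg) // n):
            power = np.abs(np.fft.rfft(eeg[i * n:(i + 1) * n])) ** 2
            for name, (lo, hi) in bands.items():
                energies[name].append(power[(freqs >= lo) & (freqs <= hi)].sum())
        counts = {}
        for name, e in energies.items():
            e = np.asarray(e)
            counts[name] = int((e > k * e.mean()).sum())  # long-term average sets threshold
        return counts

    # Synthetic check: 60 s of noise with 10 Hz bursts injected at 10, 30 and 50 s
    fs = 256
    rng = np.random.default_rng(0)
    sig = rng.normal(0.0, 1.0, fs * 60)
    t = np.arange(fs) / fs
    for start in (10, 30, 50):
        sig[start * fs:(start + 1) * fs] += 5.0 * np.sin(2 * np.pi * 10 * t)
    counts = count_band_transients(sig, fs, {"slow": (2.5, 14.0)})
    print(counts)
    ```

    On the synthetic signal the three injected bursts are the only epochs whose slow-wave band energy clears the threshold.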

  6. Presentation video retrieval using automatically recovered slide and spoken text (United States)

    Cooper, Matthew


    Video is becoming a prevalent medium for e-learning. Lecture videos contain text information in both the presentation slides and lecturer's speech. This paper examines the relative utility of automatically recovered text from these sources for lecture video retrieval. To extract the visual information, we automatically detect slides within the videos and apply optical character recognition to obtain their text. Automatic speech recognition is used similarly to extract spoken text from the recorded audio. We perform controlled experiments with manually created ground truth for both the slide and spoken text from more than 60 hours of lecture video. We compare the automatically extracted slide and spoken text in terms of accuracy relative to ground truth, overlap with one another, and utility for video retrieval. Results reveal that automatically recovered slide text and spoken text contain different content with varying error profiles. Experiments demonstrate that automatically extracted slide text enables higher precision video retrieval than automatically recovered spoken text.

  7. Automatic Test Systems Acquisition

    National Research Council Canada - National Science Library


    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  8. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard


    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury autocode in terms of a phrase structure language, such as in the source language, target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate...

  9. Automatic requirements traceability


    Andžiulytė, Justė


    This paper focuses on automatic requirements traceability and algorithms that automatically find recommendation links for requirements. The main objective of this paper is the evaluation of these algorithms and the preparation of a method defining which algorithms to use in different cases. This paper presents and examines probabilistic, vector space and latent semantic indexing models of information retrieval and association rule mining, using the authors' own implementations of these algorithms and o...
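    A minimal vector-space-model sketch of the kind of traceability recommendation evaluated here: requirements and artifacts are embedded as TF-IDF vectors and linked by cosine similarity. The tokenization, weighting and toy requirement/artifact texts are simplifications of our own, not the paper's implementation.

    ```python
    import math
    from collections import Counter

    def tfidf_vectors(docs):
        """Build TF-IDF vectors for a list of documents (whitespace
        tokenization and raw term frequency; deliberately minimal)."""
        tokenized = [doc.lower().split() for doc in docs]
        df = Counter()
        for toks in tokenized:
            df.update(set(toks))
        n = len(docs)
        return [{t: tf[t] * math.log(n / df[t]) for t in tf}
                for tf in (Counter(toks) for toks in tokenized)]

    def cosine(u, v):
        dot = sum(u[t] * v.get(t, 0.0) for t in u)
        nu = math.sqrt(sum(x * x for x in u.values()))
        nv = math.sqrt(sum(x * x for x in v.values()))
        return dot / (nu * nv) if nu and nv else 0.0

    # Invented toy corpus: two requirements, two code artifacts
    requirements = ["user login shall require a password",
                    "the system shall export reports as pdf"]
    artifacts = ["PasswordChecker validates the user password at login",
                 "ReportExporter writes pdf reports"]
    vecs = tfidf_vectors(requirements + artifacts)
    req_vecs, art_vecs = vecs[:2], vecs[2:]
    # Recommend, for each requirement, the most similar artifact
    links = [(i, max(range(len(artifacts)),
                     key=lambda j: cosine(req_vecs[i], art_vecs[j])))
             for i in range(len(requirements))]
    print(links)
    ```

    Real traceability tools add thresholds and ranking so that low-similarity links are not recommended at all.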

  10. Position automatic determination technology

    International Nuclear Information System (INIS)


    This book covers methods of position determination and their characteristics, control methods for position determination and design considerations, sensor selection for position detectors, position determination in digital control systems, the application of clutch brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutches and brakes, air cylinders, cams and solenoids, stop-position control of automatic guided vehicles, stacker cranes, and automatic transfer control.

  11. B0-correction and k-means clustering for accurate and automatic identification of regions with reduced apparent diffusion coefficient (ADC) in advanced cervical cancer at the time of brachytherapy

    DEFF Research Database (Denmark)

    Haack, Søren; Pedersen, Erik Morre; Vinding, Mads Sloth

    in dose planning of radiotherapy. This study evaluates the use of k-means clustering for automatic user independent delineation of regions of reduced apparent diffusion coefficient (ADC) and the value of B0-correction of DW-MRI for reduction of geometrical distortions during dose planning of brachytherapy...
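    To make the clustering step concrete, here is a plain 1-D k-means sketch on synthetic ADC values; k = 2, the extreme-value initialization and the value ranges are assumptions for illustration, not the study's protocol.

    ```python
    import numpy as np

    def kmeans_1d(values, k=2, iters=100):
        """Plain 1-D k-means with extreme-value initialization
        (an illustrative stand-in for clustering an ADC map)."""
        centroids = np.linspace(values.min(), values.max(), k)
        for _ in range(iters):
            labels = np.argmin(np.abs(values[:, None] - centroids[None, :]), axis=1)
            new = np.array([values[labels == j].mean() if np.any(labels == j)
                            else centroids[j] for j in range(k)])
            if np.allclose(new, centroids):
                break
            centroids = new
        return labels, centroids

    # Synthetic ADC values: tumour voxels around 0.9, normal tissue
    # around 1.6 (units of 1e-3 mm^2/s; numbers are invented)
    rng = np.random.default_rng(1)
    adc = np.concatenate([rng.normal(0.9, 0.05, 200),
                          rng.normal(1.6, 0.10, 800)])
    labels, centroids = kmeans_1d(adc)
    reduced = labels == np.argmin(centroids)   # the low-ADC cluster
    print(int(reduced.sum()))
    ```

    The low-ADC cluster recovers essentially all 200 simulated tumour voxels without any user-chosen threshold, which is the point of the user-independent delineation.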

  12. Second-Language Learners' Identification of Target-Language Phonemes: A Short-Term Phonetic Training Study (United States)

    Cebrian, Juli; Carlet, Angelica


    This study examined the effect of short-term high-variability phonetic training on the perception of English /b/, /v/, /d/, /ð/, /ae/, /? /, /i/, and /i/ by Catalan/Spanish bilinguals learning English as a foreign language. Sixteen English-major undergraduates were tested before and after undergoing a four-session perceptual training program…

  13. Automatically predicting mood from expressed emotions

    NARCIS (Netherlands)

    Katsimerou, C.


    Affect-adaptive systems have the potential to assist users that experience systematically negative moods. This thesis aims at building a platform for predicting automatically a person’s mood from his/her visual expressions. The key word is mood, namely a relatively long-term, stable and diffused

  14. Enhancing Automaticity through Task-Based Language Learning (United States)

    De Ridder, Isabelle; Vangehuchten, Lieve; Gomez, Marta Sesena


    In general terms automaticity could be defined as the subconscious condition wherein "we perform a complex series of tasks very quickly and efficiently, without having to think about the various components and subcomponents of action involved" (DeKeyser 2001: 125). For language learning, Segalowitz (2003) characterised automaticity as a…

  15. Identification of relationships between climate indices and long-term precipitation in South Korea using ensemble empirical mode decomposition (United States)

    Kim, Taereem; Shin, Ju-Young; Kim, Sunghun; Heo, Jun-Haeng


    Climate indices characterize climate systems and may identify important indicators for long-term precipitation, which are driven by climate interactions in atmosphere-ocean circulation. In this study, we investigated the climate indices that are effective indicators of long-term precipitation in South Korea, and examined their relationships based on statistical methods. Monthly total precipitation was collected from a total of 60 meteorological stations, and the series were decomposed by ensemble empirical mode decomposition (EEMD) to identify the inherent oscillating patterns or cycles. Cross-correlation analysis and stepwise variable selection were employed to select the significant climate indices at each station. The climate indices that affect the monthly precipitation in South Korea were identified based on the selection frequencies of the selected indices at all stations. The NINO12 indices with four- and ten-month lags and the AMO index with no lag were identified as indicators of monthly precipitation in South Korea. Moreover, they carry meaningful physical information (e.g. periodic oscillations and long-term trend) inherent in the monthly precipitation. The NINO12 indices with four- and ten-month lags were strong indicators representing periodic oscillations in monthly precipitation. In addition, the long-term trend of the monthly precipitation could be explained by the AMO index. A multiple linear regression model was constructed to investigate the influences of the identified climate indices on the prediction of monthly precipitation. The three identified climate indices successfully explained the monthly precipitation in the winter dry season. Compared to the monthly precipitation in coastal areas, the monthly precipitation in inland areas showed stronger correlation to the identified climate indices.
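    The lag-selection idea can be sketched with a plain cross-correlation scan; the EEMD preprocessing and stepwise selection are omitted here, and the series are synthetic rather than the study's station data.

    ```python
    import numpy as np

    def best_lag(index, precip, max_lag=12):
        """Return the lag (in months) at which a climate index
        correlates most strongly with a precipitation series."""
        best = (0, 0.0)
        for lag in range(max_lag + 1):
            x = index[:len(index) - lag] if lag else index
            y = precip[lag:]
            r = np.corrcoef(x, y)[0, 1]
            if abs(r) > abs(best[1]):
                best = (lag, r)
        return best

    # Synthetic monthly series: precipitation echoes the index 4 months later
    rng = np.random.default_rng(2)
    idx = rng.normal(size=240)                       # 20 years of a climate index
    precip = np.roll(idx, 4) + 0.3 * rng.normal(size=240)
    lag, r = best_lag(idx, precip)
    print(lag, round(r, 2))
    ```

    The scan recovers the injected four-month lag; on real data the EEMD components, rather than the raw series, would be correlated this way.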

  16. Identification of first-stage labor arrest by electromyography in term nulliparous women after induction of labor. (United States)

    Vasak, Blanka; Graatsma, Elisabeth M; Hekman-Drost, Elske; Eijkemans, Marinus J; Schagen van Leeuwen, Jules H; Visser, Gerard H A; Jacod, Benoit C


    Worldwide induction and cesarean delivery rates have increased rapidly, with consequences for subsequent pregnancies. The majority of intrapartum cesarean deliveries are performed for failure to progress, typically in nulliparous women at term. Current uterine registration techniques fail to identify inefficient contractions leading to first-stage labor arrest. An alternative technique, uterine electromyography has been shown to identify inefficient contractions leading to first-stage arrest of labor in nulliparous women with spontaneous onset of labor at term. The objective of this study was to determine whether this finding can be reproduced in induction of labor. Uterine activity was measured in 141 nulliparous women with singleton term pregnancies and a fetus in cephalic position during induced labor. Electrical activity of the myometrium during contractions was characterized by its power density spectrum. No significant differences were found in contraction characteristics between women with induced labor delivering vaginally with or without oxytocin and women with arrested labor with subsequent cesarean delivery. Uterine electromyography shows no correlation with progression of labor in induced labor, which is in contrast to spontaneous labor. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.
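    The power-density-spectrum characterization of contractions can be illustrated with a minimal FFT peak-frequency sketch; the sampling rate and synthetic burst below are assumptions, and real EMG analysis uses more careful spectral estimation.

    ```python
    import numpy as np

    def peak_frequency(signal, fs):
        """Peak of the power density spectrum, the spectral summary
        used to characterize electrical activity during contractions
        (illustrative sketch, not the study's pipeline)."""
        power = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return freqs[np.argmax(power)]

    # Synthetic "contraction burst": a 0.5 Hz oscillation in noise
    fs = 20.0
    t = np.arange(0, 60, 1 / fs)
    burst = (np.sin(2 * np.pi * 0.5 * t)
             + 0.2 * np.random.default_rng(3).normal(size=t.size))
    pf = peak_frequency(burst, fs)
    print(pf)
    ```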

  17. Genetic Programming for Automatic Hydrological Modelling (United States)

    Chadalawada, Jayashree; Babovic, Vladan


    One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, increasing volumes of data and high levels of uncertainty. Each existing hydrological model varies in terms of conceptualization and process representation and is best suited to capture the environmental dynamics of a particular hydrological system. Data-driven approaches can be used to integrate alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework that is informed by prior understanding and data include: choice of the technique for the induction of knowledge from data; identification of alternative structural hypotheses; definition of rules and constraints for meaningful, intelligent combination of model component hypotheses; and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs based on a wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using the modelling decisions inspired by the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow duration curve based performance metrics. The collaboration between data-driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach
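    A toy genetic-programming loop in the spirit of this framework: expression trees over rainfall and evaporation inputs are evolved against a fitness metric. The operator set, fitness function and data below are invented for illustration and are far simpler than Superflex-style model components.

    ```python
    import random, math

    OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
           '*': lambda a, b: a * b, 'max': max, 'min': min}

    def random_expr(depth=3):
        """Random expression tree over rainfall P, evaporation E and constants."""
        if depth == 0 or random.random() < 0.3:
            return random.choice(['P', 'E', round(random.uniform(0, 1), 2)])
        op = random.choice(list(OPS))
        return (op, random_expr(depth - 1), random_expr(depth - 1))

    def evaluate(expr, P, E):
        if expr == 'P':
            return P
        if expr == 'E':
            return E
        if isinstance(expr, float):
            return expr
        op, a, b = expr
        return OPS[op](evaluate(a, P, E), evaluate(b, P, E))

    def rmse(expr, data):
        err = sum((evaluate(expr, P, E) - Q) ** 2 for P, E, Q in data) / len(data)
        return math.sqrt(err) if math.isfinite(err) else float('inf')

    def mutate(expr):
        if random.random() < 0.2:
            return random_expr(2)          # replace a subtree
        if isinstance(expr, tuple):
            op, a, b = expr
            return (op, mutate(a), mutate(b))
        return expr

    # Toy "runoff" data Q = max(P - E, 0), a structure GP can rediscover
    random.seed(4)
    data = [(P, E, max(P - E, 0.0))
            for P in (0.0, 1.0, 2.0, 5.0) for E in (0.0, 0.5, 1.0)]
    pop = [random_expr() for _ in range(200)]
    for _ in range(30):                    # keep the fittest, refill by mutation
        pop.sort(key=lambda e: rmse(e, data))
        pop = pop[:50] + [mutate(random.choice(pop[:50])) for _ in range(150)]
    best = min(pop, key=lambda e: rmse(e, data))
    print(round(rmse(best, data), 3))
    ```

    The real framework evolves hydrological model structures (reservoirs, fluxes) rather than raw arithmetic, and scores them with hydrological as well as statistical metrics.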

  18. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel


    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS), and presents a recent state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches covered are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed...
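    The statistical family of approaches the book covers can be illustrated with a frequency-based extractive summarizer; the stop-word list, scoring rule and sample text are deliberately minimal inventions.

    ```python
    import re
    from collections import Counter

    def summarize(text, n=2):
        """Frequency-based extractive summarizer: score each sentence
        by the average corpus frequency of its words, keep the top n
        in original order (the simplest statistical ADS approach)."""
        sentences = re.split(r'(?<=[.!?])\s+', text.strip())
        words = re.findall(r'\w+', text.lower())
        stop = {'the', 'a', 'an', 'of', 'is', 'and', 'to', 'in', 'it'}
        freq = Counter(w for w in words if w not in stop)

        def score(s):
            toks = re.findall(r'\w+', s.lower())
            return sum(freq[t] for t in toks) / max(len(toks), 1)

        ranked = sorted(sentences, key=score, reverse=True)[:n]
        return [s for s in sentences if s in ranked]   # keep original order

    text = ("Automatic summarization condenses documents. "
            "Extractive summarization selects the most informative sentences. "
            "The weather was pleasant yesterday. "
            "Statistical methods score sentences by word frequency.")
    summary = summarize(text, n=2)
    print(summary)
    ```

    The off-topic weather sentence scores lowest and is dropped, which is exactly the behaviour a statistical summarizer relies on.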

  19. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...... on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...

  20. Diagnosis of district potential in terms of renewable energies. Report 1 - Present situation: Assessment of renewable energy production, Identification and quantification of territory's potentialities in terms of renewable energies

    International Nuclear Information System (INIS)


    After a presentation of the Gers district context (geography, administrative organisation, demography, housing, economy, expertise), the report presents the energy situation, an overview of the solar thermal sector (installations and installers), of the solar photovoltaic sector (existing and projected installations, installers), of hydroelectricity, of wood-energy (individual heating, industrial heating plants, planned installations), of wind energy, of biogas, and of geothermal energy (existing and planned installations). It proposes an assessment of these energies as a whole. Then, after an overview of the district situation with respect to national objectives and to other districts of the region, the study reports an identification and quantification of potentialities in terms of theoretical resources for different energy sources (solar, wind, hydraulic, wood, methanization, valorizable biomass, geothermal, and agri-fuels). Avoided CO2 emissions are assessed

  1. Metaphor identification in large texts corpora.

    Directory of Open Access Journals (Sweden)

    Yair Neuman

    Full Text Available Identifying metaphorical language use (e.g., sweet child) is one of the challenges facing natural language processing. This paper describes three novel algorithms for automatic metaphor identification. The algorithms are variations of the same core algorithm. We evaluate the algorithms on two corpora of Reuters and New York Times articles. The paper presents the most comprehensive study of metaphor identification in terms of the scope of metaphorical phrases and annotated corpora size. The algorithms' performance in identifying linguistic phrases as metaphorical or literal has been compared to human judgment. Overall, the algorithms outperform the state-of-the-art algorithm with 71% precision and 27% averaged improvement in prediction over the base rate of metaphors in the corpus.

  2. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.


    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  3. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)


    Final report for Folsom Labs’ Solar Permit Generator project, which has successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  4. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads


    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstrac...

  5. ENVIRONMENTS and EOL: identification of Environment Ontology terms in text and the annotation of the Encyclopedia of Life. (United States)

    Pafilis, Evangelos; Frankild, Sune P; Schnetzer, Julia; Fanini, Lucia; Faulwetter, Sarah; Pavloudi, Christina; Vasileiadou, Katerina; Leary, Patrick; Hammock, Jennifer; Schulz, Katja; Parr, Cynthia Sims; Arvanitidis, Christos; Jensen, Lars Juhl


    The association of organisms to their environments is a key issue in exploring biodiversity patterns. This knowledge has traditionally been scattered, but textual descriptions of taxa and their habitats are now being consolidated in centralized resources. However, structured annotations are needed to facilitate large-scale analyses. Therefore, we developed ENVIRONMENTS, a fast dictionary-based tagger capable of identifying Environment Ontology (ENVO) terms in text. We evaluate the accuracy of the tagger on a new manually curated corpus of 600 Encyclopedia of Life (EOL) species pages. We use the tagger to associate taxa with environments by tagging EOL text content monthly, and integrate the results into the EOL to disseminate them to a broad audience of users. The software and the corpus are available under the open-source BSD and the CC-BY-NC-SA 3.0 licenses, respectively, at © The Author 2015. Published by Oxford University Press.
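    A toy dictionary-based tagger in the spirit of ENVIRONMENTS: terms are indexed by their first token and matched against the token stream. The matching strategy is heavily simplified, and the term-to-ID mapping below is an invented sample, not verified ENVO identifiers.

    ```python
    def build_index(terms):
        """Index multi-word dictionary terms by their first token so
        matching only probes candidates that can start at a position."""
        index = {}
        for term, term_id in terms.items():
            toks = term.lower().split()
            index.setdefault(toks[0], []).append((toks, term_id))
        return index

    def tag(text, index):
        """Return (id, matched-term) pairs for every dictionary hit."""
        tokens = text.lower().replace('.', ' ').replace(',', ' ').split()
        matches = []
        for i, tok in enumerate(tokens):
            for toks, term_id in index.get(tok, []):
                if tokens[i:i + len(toks)] == toks:
                    matches.append((term_id, ' '.join(toks)))
        return matches

    # Invented sample term list (IDs are placeholders, not checked
    # against the real Environment Ontology)
    envo_sample = {"coral reef": "ENVO:00000150",
                   "hydrothermal vent": "ENVO:00000215",
                   "reef": "ENVO:00000447"}
    idx = build_index(envo_sample)
    result = tag("This fish inhabits the coral reef near a hydrothermal vent.", idx)
    print(result)
    ```

    Note that both "coral reef" and the nested "reef" match; production taggers typically resolve such overlaps, e.g. by preferring the longest match.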

  6. Automatic scanning of emulsion films

    International Nuclear Information System (INIS)

    D'Ambrosio, N.; Mandrioli, G.; Sirrib, G.


    The use of nuclear emulsions in recent large neutrino experiments is mostly due to the significant results in the development of this detection technique. In the emulsion films, trajectories of through-going particles are permanently recorded: thus, the emulsion target can be considered not only a tracking but also a storing device. If the data readout is performed by automatic scanning systems interfaced to an acquisition computer equipped with a fast frame grabber, nuclear emulsions can be used as a very large target detector and quickly analyzed in particle physics experiments. Techniques for automatic scanning of nuclear emulsions were developed in the past decades; the effort was initiated by Niwa at Nagoya (Japan) in the late 70s. The first large-scale application was the CHORUS experiment; emulsions were then used to search for τ neutrinos in a high track density environment in DONUT. In order to measure with high accuracy and high speed, very strict constraints must be satisfied in terms of mechanical precision, camera speed and image processing power. Recent improvements in this technique are briefly reported

  7. Identification of Appropriate Housekeeping Genes for Gene Expression Analysis in Long-term Hypoxia-treated Kidney Cells. (United States)

    Moein, Shiva; Javanmard, Shaghayegh Haghjooy; Abedi, Maryam; Izadpanahi, Mohammad Hosein; Gheisari, Yousof


    Selection of stably expressed housekeeping genes (HKGs) is a crucial step in gene expression analysis. However, there are no universal HKGs for all experiments, and they should be determined for each biologic condition. The aim of this study was to detect appropriate HKGs for kidney cells cultured in long-term hypoxia. Based on a screening step using microarray data available from the Gene Expression Omnibus database, a set of candidate HKGs was chosen to be assessed in human kidney cells cultured in hypoxic or normoxic conditions for about 2 weeks in a time course manner. The stability of gene expression was assessed by RefFinder, a web-based tool that integrates four computational programs (geNorm, NormFinder, BestKeeper, and the comparative ΔΔCt method). GAPDH and ACTB were the most stable genes in hypoxia-treated cells, whereas B2M and ACTB were the best HKGs in cells cultured in normoxia. When both hypoxia- and normoxia-treated cells from all time points were evaluated together, GAPDH and ACTB equally showed the most stability. Because in relative quantification of real-time polymerase chain reaction data the same HKGs should be selected for all groups, we believe that GAPDH and ACTB are suitable HKGs for studies on the effect of hypoxia on cultured kidney cells.
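    The stability ranking can be sketched as follows, using the standard deviation of Ct values across samples as a simplified stand-in for the RefFinder programs; the Ct data below are invented.

    ```python
    import statistics

    def rank_stability(ct_values):
        """Rank candidate housekeeping genes by the standard deviation
        of their Ct values across samples: lower spread means more
        stable expression (a BestKeeper-like proxy, not RefFinder)."""
        return sorted(ct_values, key=lambda g: statistics.stdev(ct_values[g]))

    # Hypothetical Ct values across hypoxia time points (invented numbers)
    ct = {"GAPDH": [18.1, 18.2, 18.0, 18.3],
          "ACTB":  [17.5, 17.6, 17.4, 17.5],
          "B2M":   [20.1, 21.4, 19.2, 22.0]}
    ranking = rank_stability(ct)
    print(ranking)
    ```

    With these numbers B2M is visibly unstable while ACTB and GAPDH vary little, mirroring the kind of conclusion the study draws.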

  8. In situ testing of waste forms and container materials: Contribution to the identification of their long term behaviour

    International Nuclear Information System (INIS)

    Iseghem, P. Van; Kursten, B.; Valcke, E.; Serra, H.; Fays, J.; Sneyers, A.


    This paper reviews in situ projects that have been carried out in the underground laboratory in Boom Clay in Mol, Belgium. The projects involved in situ interaction between candidate container materials, nuclear waste glasses or cements, and the Boom Clay. The in situ tests on container materials showed strong corrosion of C-steel, so that a new container (overpack) material, stainless steel, is currently being considered in Belgium. The interaction between waste glass or cements and Boom Clay results in reaction layers a few hundred μm thick. These layers were successfully characterised in detail, and the reaction processes were interpreted on the basis of these analyses. The outputs of the in situ tests on the different materials are compared, and possible interferences discussed. The added value of these in situ tests, as part of the global objective of evaluating the long-term behaviour of the materials in a disposal rock in conjunction with laboratory tests and modelling, is discussed. (authors)

  9. Cardiac and mood-related changes during short-term abstinence from crack cocaine: the identification of possible withdrawal phenomena. (United States)

    Kajdasz, D K; Moore, J W; Donepudi, H; Cochrane, C E; Malcolm, R J


    Studies assessing withdrawal phenomenon during short-term abstinence from chronic cocaine use have been limited. Although cocaine abusers are reported to be at increased risk for cardiac disorders, little research has assessed cardiac parameters in cocaine abusers and subsequent changes in these parameters that may be associated with the discontinuation of cocaine use. In this study, we categorize 441 chronic cocaine abusers into three groups based on self-reported length of abstinence from cocaine use at entry into a trial approved by the National Institute on Drug Abuse (NIDA) assessing the use of pergolide mesylate in treating relapse and craving in crack cocaine abuse. Electrocardiogram (ECG) PR intervals were found to be correlated positively with length of abstinence, returning to normal population levels within 30 days. In addition, levels of generalized anxiety, nervousness, and heart racing were found to be correlated negatively with length of abstinence from crack cocaine. This work provides preliminary evidence of cardiac and mood-related parameters that are associated with cocaine abstinence and that may indicate specific withdrawal phenomena in chronic users. In addition, these results suggest that the risk of cardiomyopathies associated with abnormal atrial-ventricular polarization may dissipate relatively quickly in abusing individuals.

  10. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis

    NARCIS (Netherlands)

    Liu, C.; van Netten, Jaap J.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdinand


    Early identification of diabetic foot complications and their precursors is essential in preventing their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost effective. Infrared
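    The asymmetric analysis reduces to comparing contralateral temperature maps. A minimal sketch follows; the 2.2 °C cut-off is a commonly cited threshold in the diabetic-foot thermography literature, and the maps here are synthetic.

    ```python
    import numpy as np

    def asymmetry_hotspots(left, right, threshold=2.2):
        """Flag pixels where the temperature difference between
        corresponding locations on the two feet exceeds a threshold.
        `right` is mirrored so anatomical locations line up."""
        diff = left - np.fliplr(right)
        return np.abs(diff) > threshold

    # Synthetic 4x4 plantar temperature maps (degrees Celsius)
    left = np.full((4, 4), 30.0)
    right = np.full((4, 4), 30.0)
    left[1, 1] = 33.5          # simulated inflamed spot on the left foot
    mask = asymmetry_hotspots(left, right)
    print(int(mask.sum()), bool(mask[1, 1]))
    ```

    Real systems must first segment and register the feet in the infrared image before this pixelwise comparison is meaningful.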

  11. Automatic tracking of wake vortices using ground-wind sensor data (United States)


    Algorithms for automatic tracking of wake vortices using ground-wind anemometer data are developed. Methods of bad-data suppression, track initiation, and track termination are included. An effective sensor-failure detection-and-identification ...

  12. Profiling School Shooters: Automatic Text-Based Analysis

    Directory of Open Access Journals (Sweden)

    Yair eNeuman


    Full Text Available School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their varied characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by six school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology.

  13. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin


    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  14. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers that provide a renewed stimulus for continuing and deepening Bob's research visions. A familiar touch is given to the book by some pictures kindly provided to us by his wife Nieba, the personal recollections of his brother Gary and some of his colleagues and friends.

  15. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction-intention-action loyalty network. Habit strength is conceptualised as a cognitive construct to refer to the psychological aspects of the construct, such as routine, inertia, automaticity, or very little conscious deliberation. The data consist of 2962 consumers participating in a large European survey. The results show that habit strength significantly moderates the association between satisfaction and action loyalty and, respectively, between intended loyalty and action loyalty. At high levels of habit strength, consumers are more likely to free up cognitive resources and incline the balance from controlled to routine and automatic-like responses.

  16. Uranium casting furnace automatic temperature control development

    International Nuclear Information System (INIS)

    Lind, R.F.


    Development of an automatic molten uranium temperature control system for use on batch-type induction casting furnaces is described. Implementation of a two-color optical pyrometer, development of an optical scanner for the pyrometer, determination of furnace thermal dynamics, and design of control systems are addressed. The optical scanning system is shown to greatly improve pyrometer measurement repeatability, particularly where heavy floating slag accumulations cause surface temperature gradients. Thermal dynamics of the furnaces were determined by applying least-squares system identification techniques to actual production data. A unity feedback control system utilizing a proportional-integral-derivative compensator is designed by using frequency-domain techniques. 14 refs
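    The least-squares identification step can be sketched on a simulated first-order plant; the model structure and numbers below are assumptions for illustration, not the furnace's actual dynamics.

    ```python
    import numpy as np

    def identify_first_order(u, y):
        """Least-squares fit of y[k+1] = a*y[k] + b*u[k] from
        input-output records (y has one more sample than u).
        This is the kind of model least-squares system
        identification reduces process data to."""
        A = np.column_stack([y[:-1], u])
        coeffs, *_ = np.linalg.lstsq(A, y[1:], rcond=None)
        return coeffs

    # Simulate "production data" from a known plant, then recover it
    a_true, b_true = 0.95, 0.8
    rng = np.random.default_rng(5)
    u = rng.uniform(0.0, 1.0, 500)        # heater power input
    y = np.zeros(501)                     # melt temperature (arbitrary units)
    for k in range(500):
        y[k + 1] = a_true * y[k] + b_true * u[k] + rng.normal(0.0, 0.01)
    a_hat, b_hat = identify_first_order(u, y)
    print(round(a_hat, 3), round(b_hat, 3))
    ```

    Once such a model is fitted, the frequency-domain PID design mentioned in the abstract can proceed from its transfer function.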

  17. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments...... and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices....

  18. Automatic contact in DYNA3D for vehicle crashworthiness

    International Nuclear Information System (INIS)

    Whirley, R.G.; Engelmann, B.E.


    This paper presents a new formulation for the automatic definition and treatment of mechanical contact in explicit, nonlinear, finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational costs. The authors have used a new four-step automatic contact algorithm. Key aspects of the proposed method include (1) automatic identification of adjacent and opposite surfaces in the global search phase, and (2) the use of a smoothly varying surface normal that allows a consistent treatment of shell intersection and corner contact conditions without ad hoc rules. Three examples are given to illustrate the performance of the newly proposed algorithm in the public DYNA3D code

  19. Automatic fault extraction using a modified ant-colony algorithm

    International Nuclear Information System (INIS)

    Zhao, Junsheng; Sun, Sam Zandong


    The basis of automatic fault extraction is seismic attributes, such as the coherence cube, in which a fault is identified by minimum values. The biggest challenge in automatic fault extraction is noise in the seismic data. A fault, however, has better spatial continuity in a certain direction, which makes it quite different from noise. Exploiting this characteristic, a modified ant-colony algorithm is introduced into automatic fault identification and tracking, with the gradient direction and direction consistency used as constraints. Numerical model tests show that this method is feasible and effective for automatic fault extraction and noise suppression. Application to field data further illustrates its validity and superiority. (paper)

  20. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.


    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. For industrial applications, however, a tool for fast data analysis that is easy to handle is also required. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. In practice, the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and the data analysis can be performed with these algorithms. In combination with such automatic analysis, the Moessbauer spectrometer can be used as a probe instrument that covers the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.

  1. Identification and characterization of novel long-term metabolites of oxymesterone and mesterolone in human urine by application of selected reaction monitoring GC-CI-MS/MS. (United States)

    Polet, Michael; Van Gansbeke, Wim; Geldof, Lore; Deventer, Koen; Van Eenoo, Peter


    The search for metabolites with longer detection times remains an important task in, for example, toxicology and doping control. The impact of these long-term metabolites is highlighted by the high number of positive cases after reanalysis of samples that were stored for several years, e.g. samples of previous Olympic Games. A substantial number of previously alleged negative samples have now been declared positive due to the detection of various long-term steroid metabolites the existence of which was unknown during the Olympic Games of 2008 and 2012. In this work, the metabolism of oxymesterone and mesterolone, two anabolic androgenic steroids (AAS), was investigated by application of a selected reaction monitoring gas chromatography-chemical ionization-triple quadrupole mass spectrometry (GC-CI-MS/MS) protocol for metabolite detection and identification. Correlations between AAS structure and GC-CI-MS/MS fragmentation behaviour enabled the search for previously unknown but expected AAS metabolites by selection of theoretical transitions for expected metabolites. Use of different hydrolysis protocols allowed for evaluation of the detection window of both phase I and phase II metabolites. For oxymesterone, a new metabolite, 18-nor-17β-hydroxymethyl-17α-methyl-4-hydroxy-androst-4,13-diene-3-one, was identified. It was detectable up to 46 days by using GC-CI-MS/MS, whereas with a traditional screening (detection of metabolite 17-epioxymesterone with electron ionization GC-MS/MS) oxymesterone administration was only detectable for 3.5 days. A new metabolite was also found for mesterolone. It was identified as 1α-methyl-5α-androstan-3,6,16-triol-17-one and its sulfate form after hydrolysis with Helix pomatia resulted in a prolonged detection time (up to 15 days) for mesterolone abuse. Copyright © 2017 John Wiley & Sons, Ltd.

  2. Automatic Evaluation Of Interferograms (United States)

    Becker, Friedhelm; Meier, Gerd E. A.; Wegner, Horst


    A system for the automatic evaluation of interference patterns has been developed. After digitizing the interferograms from classical and holographic interferometers with a television digitizer and performing various picture-enhancement operations, the fringe loci are extracted by a floating-threshold method. The fringes are numbered using a special scheme after the removal of any fringe disconnections, which might appear if there was insufficient contrast in the interferograms. The reconstruction of the object function from the numbered fringe field is achieved by a local polynomial least-squares approximation. Applications are given, demonstrating the evaluation of interferograms of supersonic flow fields and the analysis of holographic interferograms of car tyres.
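    The reconstruction step can be illustrated with a small sketch: a least-squares fit of a 2-D polynomial to numbered fringe loci (x, y, fringe order N). The synthetic data, the polynomial degree and the use of a single global fit (the paper uses local approximations) are assumptions for illustration.

```python
# Hedged sketch: least-squares fit of a 2-D polynomial surface N(x, y)
# to scattered, numbered fringe loci. Data below are synthetic.
import numpy as np

def fit_fringe_surface(x, y, n, degree=2):
    """Fit fringe order N(x, y) by a 2-D polynomial of total degree <= degree."""
    terms = [(i, j) for i in range(degree + 1) for j in range(degree + 1 - i)]
    A = np.column_stack([x**i * y**j for i, j in terms])  # design matrix
    coeffs, *_ = np.linalg.lstsq(A, n, rcond=None)
    def surface(xq, yq):
        return sum(c * xq**i * yq**j for c, (i, j) in zip(coeffs, terms))
    return surface

# synthetic numbered fringe field: N = 1 + 2x + 3y + 0.5x^2
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
n = 1 + 2 * x + 3 * y + 0.5 * x**2
surf = fit_fringe_surface(x, y, n)
```

    Because the synthetic field lies exactly in the model space, the fit recovers it to floating-point precision; real fringe data would of course carry digitization noise.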

  3. Automatic quantitative renal scintigraphy

    International Nuclear Information System (INIS)

    Valeyre, J.; Deltour, G.; Delisle, M.J.; Bouchard, A.


    Renal scintigraphy data may be analyzed automatically by a processing system coupled to an Anger camera (TRIDAC-MULTI 8 or CINE 200). The computing sequence is as follows: normalization of the images; background-noise subtraction on both images; evaluation of mercury-197 uptake by the liver and spleen; calculation of the activity fraction of each kidney with respect to the injected dose, taking into account kidney depth, with the results referred to normal values; and printout of the results. Automation minimizes the scatter in the parameters and, by its simplification, is a great asset in routine work [fr

  4. Developing a Speaker Identification System for the DARPA RATS Project

    DEFF Research Database (Denmark)

    Plchot, O; Matsoukas, S; Matejka, P


    This paper describes the speaker identification (SID) system developed by the Patrol team for the first phase of the DARPA RATS (Robust Automatic Transcription of Speech) program, which seeks to advance state of the art detection capabilities on audio from highly degraded communication channels. We present results using multiple SID systems differing mainly in the algorithm used for voice activity detection (VAD) and feature extraction. We show that (a) unsupervised VAD performs as well as supervised methods in terms of downstream SID performance, (b) noise-robust feature extraction methods......

  5. Simplified automatic on-line document searching

    International Nuclear Information System (INIS)

    Ebinuma, Yukio


    The author proposes a search method for users who do not need comprehensive retrieval, i.e. one that automatically provides a flexible number of related documents. A group of technical terms is used as search terms to express an inquiry. Logical sums of the terms, taken in ascending order of frequency of usage, are built sequentially and automatically, and the search formulas q_m and q_(m-1), which meet certain threshold values, are then selected, also automatically. Users judge the precision of the search output on up to 20 items retrieved by formula q_m. If a user wants a recall ratio above 30%, the search result should be output by q_m; if less than 30% suffices, it should be output by q_(m-1). A search by this method over one year's volume of the INIS database (76,600 items) and five inquiries yielded, on average, a recall ratio of 32% and a precision ratio of 36% in the case of q_m. The terminal connect time was within 15 minutes per inquiry, more efficient than that of an inexperienced searcher. The method can be applied to on-line search systems for databases that use natural language only, or natural language together with a controlled vocabulary. (author)
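    The query-building idea can be sketched as follows. The toy inverted index, the threshold value and the helper names are hypothetical, not taken from the paper; the sketch only shows the mechanism of ORing terms in ascending frequency order and selecting the first formula that meets a hit-count threshold.

```python
# Illustrative sketch (not the author's code): OR search terms together
# in ascending order of document frequency, then pick the first formula
# q_m whose hit count reaches a threshold, keeping q_(m-1) as well.

def build_query_series(inverted_index, terms):
    """Return the cumulative OR-formulas q_1, q_2, ... as (terms, hits) pairs."""
    ordered = sorted(terms, key=lambda t: len(inverted_index.get(t, ())))
    hits = set()
    series = []
    for k, term in enumerate(ordered, start=1):
        hits |= set(inverted_index.get(term, ()))
        series.append((ordered[:k], set(hits)))
    return series

def select_formulas(series, threshold):
    """Return (q_(m-1), q_m), where q_m is the first formula whose hit
    count meets the threshold; q_(m-1) is None when m == 1."""
    for m, formula in enumerate(series):
        if len(formula[1]) >= threshold:
            return (series[m - 1] if m > 0 else None), formula
    return (series[-1] if series else None), None

# hypothetical toy index: term -> list of document ids
index = {"narrow": [1], "medium": [2, 3], "broad": [3, 4, 5]}
series = build_query_series(index, ["broad", "narrow", "medium"])
q_prev, q_m = select_formulas(series, threshold=4)
```

    Here q_m retrieves five documents and q_(m-1) three, mirroring the paper's choice between a higher-recall and a higher-precision output.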

  6. Automatic operation device for control rods

    International Nuclear Information System (INIS)

    Sekimizu, Koichi


    Purpose: To enable automatic operation of control rods based on the reactor operation plan and, in particular, to decrease the operator's load upon start-up and shutdown of the reactor. Constitution: The operation plan, demands for automatic operation, break-point setting values, power and reactor-core flow-rate changes, demands for operation interrupt, restart and forecasting, and the like are entered into an input device, and an overall judging device performs a long-term forecast as far as the break point, using a long-term forecasting device, based on the operation plan. Automatic reactor operation and the like are carried out based on the long-term forecast, and short-term forecasting of the change in reactor-core status due to the control-rod operation sequence is performed, based on the control-rod pattern and the operation plan. It is then judged whether the intended control-rod operation is possible or not, based on the result of the short-term forecast. (Aizawa, K.)

  7. Automatic readout micrometer (United States)

    Lauritzen, T.

    A measuring system is described for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior art surveying systems require highly skilled surveyors. Prior art systems include, for example, optical surveying systems which are susceptible to operator reading errors, and celestial navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibility of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or fine adjustment, without having the fine adjustment outrun the coarse adjustment, until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range.

  8. Reachability Games on Automatic Graphs (United States)

    Neider, Daniel

    In this work we study two-person reachability games on finite and infinite automatic graphs. For the finite case we empirically show that automatic game encodings are competitive with well-known symbolic techniques such as BDDs, SAT and QBF formulas. For the infinite case we present a novel algorithm utilizing algorithmic learning techniques, which makes it possible to solve large classes of automatic reachability games.

  9. Automatic reactor protection system tester

    International Nuclear Information System (INIS)

    Deliant, J.D.; Jahnke, S.; Raimondo, E.


    The object of this paper is to present the automatic tester of reactor protection systems designed and developed by EDF and Framatome. The following points are discussed in order: the necessity for reactor protection system testing; the drawbacks of manual testing; the description and use of the Framatome automatic tester; on-site installation of this system; and the positive results obtained using the Framatome automatic tester in France

  10. [The maintenance of automatic analysers and associated documentation]. (United States)

    Adjidé, V; Fournier, P; Vassault, A


    The maintenance of automatic analysers, and the associated documentation, forms part of the requirements of the ISO 15189 standard and of French regulation, and has to be defined in the laboratory policy. The management of periodic maintenance and its documentation shall be implemented and fulfilled. Corrective maintenance has to be organised so as to avoid interrupting the work of the laboratory. The recommendations concern the identification of equipment, including automatic analysers; the environmental conditions to take into account; the documentation provided by the manufacturer; and the documents prepared by the laboratory, including maintenance procedures.

  11. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von


    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples

  12. Automatic adjustment of astrochronologic correlations (United States)

    Zeeden, Christian; Kaboth, Stefanie; Hilgen, Frederik; Laskar, Jacques


    Here we present an algorithm for the automated adjustment and optimisation of correlations between proxy data and an orbital tuning target (or similar datasets, e.g. ice models) for the R environment (R Development Core Team 2008), building on the 'astrochron' package (Meyers 2014). The basis of this approach is an initial tuning on orbital (precession, obliquity, eccentricity) scale. We filter the orbital frequency ranges related to e.g. precession, obliquity or eccentricity in the data and compare these filtered components to an ensemble of target data, which may consist of e.g. different combinations of obliquity and precession, different phases of precession and obliquity, a mix of orbital and other data (e.g. ice models), or different orbital solutions. This approach allows for the identification of an ideal mix of precession and obliquity to be used as a tuning target. In addition, the uncertainty related to different tuning tie points (and also to the precession and obliquity contributions of the tuning target) can easily be assessed. Our message is to suggest an initial tuning and then obtain a reproducible tuned time scale, avoiding arbitrarily chosen tie points and replacing them with automatically chosen ones representing filter maxima (or minima). We present and discuss the approach outlined above and apply it to artificial and geological data. Artificial data are assessed to find optimal filter settings; real datasets are used to demonstrate the possibilities of such an approach. References: Meyers, S.R. (2014). Astrochron: An R Package for Astrochronology. R Development Core Team (2008). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0, URL
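    The filtering-and-comparison step can be illustrated with a minimal sketch. The actual work uses the R package 'astrochron'; this illustration instead uses a plain FFT band-pass on synthetic data, and all periods, band edges and noise levels are assumptions chosen only to show the idea of isolating a precession-band component and comparing it with a tuning target.

```python
# Minimal numpy sketch (assumed details, not the astrochron code):
# band-pass a synthetic proxy to isolate its precession-like band,
# then measure agreement with a precession-only tuning target.
import numpy as np

def bandpass(signal, dt, f_lo, f_hi):
    """Zero all FFT coefficients outside [f_lo, f_hi] (cycles/kyr)."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=dt)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(signal))

dt = 1.0                                   # 1-kyr sampling interval
t = np.arange(1024) * dt
rng = np.random.default_rng(1)
# proxy = precession (21 kyr) + obliquity (41 kyr) + noise
proxy = (np.sin(2 * np.pi * t / 21)
         + 0.5 * np.sin(2 * np.pi * t / 41)
         + 0.1 * rng.standard_normal(t.size))
target = np.sin(2 * np.pi * t / 21)        # precession-only target
prec_band = bandpass(proxy, dt, 1 / 30, 1 / 15)
r = np.corrcoef(prec_band, target)[0, 1]   # agreement with the target
```

    In the paper's setting, the same comparison would be repeated over an ensemble of candidate targets (different precession/obliquity mixes and phases) to pick the best-matching one.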

  13. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.


    The quantitative determination of metallographic parameters is analysed through a description of the Micro-Videomat automatic image analysis system, applied to the volumetric percentage of pearlite in nodular cast irons, the porosity and average grain size in high-density sintered pellets of UO 2 , and the grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with those of the corresponding direct counting processes: systematic point counting (grid) to measure volume fractions, and the intersection method, using a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron was dictated by the small difference in optical reflectivity between graphite and pearlite. Porosity evaluation of sintered UO 2 pellets is also analyzed [pt

  14. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.


    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  15. Automatic architectural style recognition

    Directory of Open Access Journals (Sweden)

    M. Mathias


    Procedural modeling has proven to be a very valuable tool in the field of architecture. In the last few years, research has soared to automatically create procedural models from images. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, the determination of the building style has remained a manual task. In this paper, we propose an algorithm which automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  16. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.


    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an ''Image Processing System'' was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  17. Automatic alkaloid removal system. (United States)

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd


    This automated alkaloid-removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, for the purpose of removing alkaloid toxicity from Dioscorea hispida (DH) tubers. DH is a poisonous plant; scientific study has shown that its tubers contain a toxic alkaloid constituent, dioscorine, and the tubers can only be consumed after the poison is removed. In this experiment, the tubers must first be blended into powder form before being placed in the machine basket. The user pushes the START button on the machine controller to switch on the water pump, which creates a turbulent wave of water in the machine tank. The water flow stops automatically when the outlet solenoid valve is triggered. The tuber powder is washed for 10 minutes while 1 liter of toxin-contaminated water flows out. The controller then automatically triggers the inlet solenoid valve, and fresh water flows into the tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h. Positive, significant results were obtained for several biological parameters: pH, temperature, dissolved oxygen, turbidity, conductivity, and fish survival rate (or time). These parameters were near or equal to those of the control water, and the toxin is assumed to be fully removed when the pH of the DH powder wash is close to that of the control water. The pH of the control water is about 5.3, the water from this process is 6.0, and before running the machine the pH of the contaminated water is about 3.8, which is too acidic. Compared with the traditional method, this automated machine saves time in removing toxicity from DH and requires less supervision by the user.

  18. Automatic segmentation of vertebrae from radiographs

    DEFF Research Database (Denmark)

    Mysling, Peter; Petersen, Peter Kersten; Nielsen, Mads


    Segmentation of vertebral contours is an essential task in the design of automatic tools for vertebral fracture assessment. In this paper, we propose a novel segmentation technique which does not require operator interaction. The proposed technique solves the segmentation problem in a hierarchical...... manner. In a first phase, a coarse estimate of the overall spine alignment and the vertebra locations is computed using a shape model sampling scheme. These samples are used to initialize a second phase of active shape model search, under a nonlinear model of vertebra appearance. The search...... is constrained by a conditional shape model, based on the variability of the coarse spine location estimates. The technique is evaluated on a data set of manually annotated lumbar radiographs. The results compare favorably to the previous work in automatic vertebra segmentation, in terms of both segmentation...

  19. Classifying visemes for automatic lipreading

    NARCIS (Netherlands)

    Visser, Michiel; Poel, Mannes; Nijholt, Antinus; Matousek, Vaclav; Mautner, Pavel; Ocelikovi, Jana; Sojka, Petr


    Automatic lipreading is automatic speech recognition that uses only visual information. The relevant data in a video signal is isolated and features are extracted from it. From a sequence of feature vectors, where every vector represents one video image, a sequence of higher level semantic elements

  20. Ferenczi's concept of identification with the aggressor: understanding dissociative structure with interacting victim and abuser self-states. (United States)

    Howell, Elizabeth F


    No one has described more passionately than Ferenczi the traumatic induction of dissociative trance with its resulting fragmentation of the personality. Ferenczi introduced the concept and term, identification with the aggressor in his seminal "Confusion of Tongues" paper, in which he described how the abused child becomes transfixed and robbed of his senses. Having been traumatically overwhelmed, the child becomes hypnotically transfixed by the aggressor's wishes and behavior, automatically identifying by mimicry rather than by a purposeful identification with the aggressor's role. To expand upon Ferenczi's observations, identification with the aggressor can be understood as a two-stage process. The first stage is automatic and initiated by trauma, but the second stage is defensive and purposeful. While identification with the aggressor begins as an automatic organismic process, with repeated activation and use, gradually it becomes a defensive process. Broadly, as a dissociative defense, it has two enacted relational parts, the part of the victim and the part of the aggressor. This paper describes the intrapersonal aspects (how aggressor and victim self-states interrelate in the internal world), as well as the interpersonal aspects (how these become enacted in the external). This formulation has relevance to understanding the broad spectrum of the dissociative structure of mind, borderline personality disorder, and dissociative identity disorder.

  1. Automatic exposure for xeromammography

    International Nuclear Information System (INIS)

    Aichinger, H.


    During mammography without intensifying screens, exposure measurements are carried out behind the film. It is, however, difficult to construct an absolutely shadow-free ionization chamber of adequate sensitivity working in the necessary range of 25 to 50 kV. Repeated attempts have been made to utilize the advantages of automatic exposure for xeromammography. In this case also, the ionization chamber was placed behind the Xerox plate. Depending on tube filtration, object thickness and tube voltage, more than 80%, sometimes even 90%, of the radiation is absorbed by the Xerox plate. In particular, the characteristic Mo radiation of 17.4 keV and 19.6 keV is almost totally absorbed by the plate and cannot therefore be registered by the ionization chamber. This results in a considerable dependence of the exposure on kV and object thickness. The dependence on tube voltage and object thickness has been examined dosimetrically and spectroscopically with a Ge(Li) spectrometer. Finally, the successful use of a shadow-free chamber is described; this has been particularly adapted for xeromammography and is placed in front of the plate. (orig) [de

  2. Historical Review and Perspective on Automatic Journalizing


    Kato, Masaki


    Contents: Introduction; 1. EDP Accounting and Automatic Journalizing; 2. Learning System of Automatic Journalizing; 3. Automatic Journalizing by the Artificial Intelligence; 4. Direction of the Progress of the Accounting Information System

  3. Assessing the efficacy of benchmarks for automatic speech accent recognition


    Benjamin Bock; Lior Shamir


    Speech accents can possess valuable information about the speaker, and can be used in intelligent multimedia-based human-computer interfaces. The performance of algorithms for automatic classification of accents is often evaluated using audio datasets that include recording samples of different people, representing different accents. Here we describe a method that can detect bias in accent datasets, and apply the method to two accent identification datasets to reveal the existence of dataset ...

  4. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye


    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity. This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  5. Image-based automatic recognition of larvae (United States)

    Sang, Ru; Yu, Guiying; Fan, Weijun; Guo, Tiantai


    To date, imagoes have been the main objects of research in quarantine pest recognition. However, pests in their larval stage are latent, and larvae spread abroad easily with the circulation of agricultural and forest products. This paper presents larvae as new research objects, recognized by means of machine vision, image processing and pattern recognition. More visual information is retained and the recognition rate is improved when color image segmentation is applied to images of larvae. Owing to its affine invariance, perspective invariance and brightness invariance, the scale-invariant feature transform (SIFT) is adopted for feature extraction. A neural network algorithm is utilized for pattern recognition, and automatic identification of larvae images is successfully achieved with satisfactory results.

  6. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    AMI) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings, together with some example applications based on these systems.

  7. AVID: Automatic Visualization Interface Designer

    National Research Council Canada - National Science Library

    Chuah, Mei


    .... Automatic generation offers great flexibility in performing data and information analysis tasks, because new designs are generated on a case by case basis to suit current and changing future needs...

  8. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.


    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  9. Behavioral and electrophysiological evidence for early and automatic detection of phonological equivalence in variable speech inputs. (United States)

    Kharlamov, Viktor; Campbell, Kenneth; Kazanina, Nina


    Speech sounds are not always perceived in accordance with their acoustic-phonetic content. For example, an early and automatic process of perceptual repair, which ensures conformity of speech inputs to the listener's native language phonology, applies to individual input segments that do not exist in the native inventory or to sound sequences that are illicit according to the native phonotactic restrictions on sound co-occurrences. The present study with Russian and Canadian English speakers shows that listeners may perceive phonetically distinct and licit sound sequences as equivalent when the native language system provides robust evidence for mapping multiple phonetic forms onto a single phonological representation. In Russian, due to an optional but productive t-deletion process that affects /stn/ clusters, the surface forms [sn] and [stn] may be phonologically equivalent and map to a single phonological form /stn/. In contrast, [sn] and [stn] clusters are usually phonologically distinct in (Canadian) English. Behavioral data from identification and discrimination tasks indicated that [sn] and [stn] clusters were more confusable for Russian than for English speakers. The EEG experiment employed an oddball paradigm with nonwords [asna] and [astna] used as the standard and deviant stimuli. A reliable mismatch negativity response was elicited approximately 100 msec postchange in the English group but not in the Russian group. These findings point to a perceptual repair mechanism that is engaged automatically at a prelexical level to ensure immediate encoding of speech inputs in phonological terms, which in turn enables efficient access to the meaning of a spoken utterance.

  10. An automatic image recognition approach

    Directory of Open Access Journals (Sweden)

    Tudor Barbu


    Full Text Available Our paper focuses on the graphical analysis domain. We propose an automatic image recognition technique. This approach consists of two main pattern recognition steps. First, it performs an image feature extraction operation on an input image set, using statistical dispersion features. Then, an unsupervised classification process is performed on the previously obtained graphical feature vectors. An automatic region-growing based clustering procedure is proposed and utilized in the classification stage.
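
    The two-stage approach above can be sketched as follows. The "images", the choice of dispersion statistics, and the distance threshold are all assumptions made for illustration; the region-growing rule (absorb any vector within a threshold of the running cluster mean) is a simplified reading of the paper's clustering procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical grayscale "images": two groups differing in contrast.
low_contrast  = [rng.normal(128, 5,  (16, 16)) for _ in range(10)]
high_contrast = [rng.normal(128, 40, (16, 16)) for _ in range(10)]
images = low_contrast + high_contrast

def dispersion_features(img):
    """Statistical dispersion features (stage 1: feature extraction)."""
    return np.array([img.std(),
                     np.percentile(img, 75) - np.percentile(img, 25)])

feats = np.array([dispersion_features(im) for im in images])

def region_growing_clusters(vectors, threshold):
    """Stage 2: unsupervised region-growing clustering. Grow a cluster
    around each unassigned seed, absorbing any vector that lies within
    `threshold` of the running cluster mean."""
    labels = -np.ones(len(vectors), dtype=int)
    current = 0
    for seed in range(len(vectors)):
        if labels[seed] != -1:
            continue
        labels[seed] = current
        mean = vectors[seed].copy()
        members = 1
        changed = True
        while changed:
            changed = False
            for i, v in enumerate(vectors):
                if labels[i] == -1 and np.linalg.norm(v - mean) < threshold:
                    labels[i] = current
                    mean = (mean * members + v) / (members + 1)
                    members += 1
                    changed = True
        current += 1
    return labels

labels = region_growing_clusters(feats, threshold=25.0)
print("clusters found:", len(set(labels.tolist())))
```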

  11. Photo-identification methods reveal seasonal and long-term site-fidelity of Risso’s dolphins (Grampus griseus) in shallow waters (Cardigan Bay, Wales)

    NARCIS (Netherlands)

    Boer, de M.N.; Leopold, M.F.; Simmonds, M.P.; Reijnders, P.J.H.


    A photo-identification study on Risso’s dolphins was carried out off Bardsey Island in Wales (July to September, 1997-2007). Their local abundance was estimated using two different analytical techniques: 1) mark-recapture of well-marked dolphins using a “closed-population” model; and 2) a census

  12. Artificial Intelligence In Automatic Target Recognizers: Technology And Timelines (United States)

    Gilmore, John F.


    The recognition of targets in thermal imagery has been a problem exhaustively analyzed in its current localized dimension. This paper discusses the application of artificial intelligence (AI) technology to automatic target recognition, a concept capable of expanding current ATR efforts into a new globalized dimension. Deficiencies of current automatic target recognition systems are reviewed in terms of system shortcomings. Areas of artificial intelligence which show the most promise in improving ATR performance are analyzed, and a timeline is formed in light of how near (as well as far) term artificial intelligence applications may exist. Current research in the area of high level expert vision systems is reviewed and the possible utilization of artificial intelligence architectures to improve low level image processing functions is also discussed. Additional application areas of relevance to solving the problem of automatic target recognition utilizing both high and low level processing are also explored.

  13. Global Distribution Adjustment and Nonlinear Feature Transformation for Automatic Colorization

    Directory of Open Access Journals (Sweden)

    Terumasa Aoki


    Full Text Available Automatic colorization is generally classified into two groups: propagation-based methods and reference-based methods. In reference-based automatic colorization methods, color image(s) are used as reference(s) to reconstruct the original color of a gray target image. The most important task here is to find the best matching pairs for all pixels between reference and target images in order to transfer color information from reference to target pixels. Many attractive local feature-based image matching methods have been developed over the last two decades. Unfortunately, as far as we know, there are no optimal matching methods for automatic colorization, because the requirements for pixel matching in automatic colorization are wholly different from those for traditional image matching. To design an efficient matching algorithm for automatic colorization, clustering pixels with low computational cost and generating descriptive feature vectors are the most important challenges to be solved. In this paper, we present a novel method to address these two problems. In particular, our work concentrates on solving the second problem (designing a descriptive feature vector); namely, we discuss how to learn a descriptive texture feature using a scaled sparse texture feature combined with a nonlinear transformation to construct an optimal feature descriptor. Our experimental results show that our proposed method outperforms the state-of-the-art methods in terms of robustness of color reconstruction for automatic colorization applications.

  14. Evaluation and analysis of term scoring methods for term extraction

    NARCIS (Netherlands)

    Verberne, S.; Sappelli, M.; Hiemstra, D.; Kraaij, W.


    We evaluate five term scoring methods for automatic term extraction on four different types of text collections: personal document collections, news articles, scientific articles and medical discharge summaries. Each collection has its own use case: author profiling, boolean query term suggestion,
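
    As a concrete example of what a term scoring method computes, the sketch below scores candidate terms with collection-wide TF-IDF, one classic scoring scheme (not necessarily among the five the paper evaluates). The document collection is a toy stand-in for the real corpora.

```python
import math
from collections import Counter

# Toy document collection (hypothetical; the paper evaluates real
# collections such as news and scientific articles).
docs = [
    "automatic term extraction from scientific articles",
    "term scoring methods for automatic extraction",
    "medical discharge summaries mention treatment and diagnosis",
    "news articles discuss politics and economy",
]
tokenized = [d.split() for d in docs]

def tfidf_scores(tokenized_docs):
    """Score each term by its TF-IDF weight summed over the collection:
    frequent in some documents, rare across the collection => high score."""
    n = len(tokenized_docs)
    df = Counter()                       # document frequency per term
    for toks in tokenized_docs:
        df.update(set(toks))
    scores = Counter()
    for toks in tokenized_docs:
        tf = Counter(toks)               # term frequency within one document
        for term, f in tf.items():
            scores[term] += f * math.log(n / df[term])
    return scores

scores = tfidf_scores(tokenized)
print("score('term') = %.4f" % scores["term"])
print("score('from') = %.4f" % scores["from"])
```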

  15. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc


    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze p...
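
    To illustrate the kind of fact a termination prover establishes, the toy check below exhibits a ranking function for a small integer transition system and verifies, on a finite grid, that every transition strictly decreases it. This is only a sanity check of a hand-supplied ranking function, not AProVE's actual (fully automatic) proof search.

```python
# Toy transition system: while x > 0 and y > 0, either (x,y) -> (x-1, y+2)
# or (x,y) -> (x, y-1). Termination follows if some well-founded measure
# strictly decreases on every step.

def rank(x, y):
    # Candidate lexicographic ranking function (an assumption we verify,
    # not something derived automatically here).
    return (x, y)

def successors(x, y):
    if x > 0 and y > 0:
        yield (x - 1, y + 2)   # rule 1: first component decreases
        yield (x, y - 1)       # rule 2: first stays, second decreases
    # no successors once a guard fails: the loop has exited

def decreases_everywhere(bound=30):
    """Exhaustively check on a finite grid that every transition strictly
    decreases the ranking function in lexicographic order."""
    for x in range(bound):
        for y in range(bound):
            for nx, ny in successors(x, y):
                if not (rank(nx, ny) < rank(x, y)):  # tuple < is lex order
                    return False
    return True

print("ranking function decreases on all checked transitions:",
      decreases_everywhere())
```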

  16. automatic time regulator for switching on an aeration device for ...

    African Journals Online (AJOL)

    human labor, necessitated the design of an automatic time regulator circuit, which controls the switching on and off .... Design Parameters. Rectifier performance parameters: The performance of the rectifier section of the power supply block was evaluated in terms of the following .... The C6 is a voltage control capacitor that.

  17. Selective visual working memory in fear of spiders: The role of automaticity and material-specificity

    NARCIS (Netherlands)

    Reinecke, A.; Becker, E.S.; Rinck, M.


    According to cognitive models of anxiety, biases arise when threat processing is automatic rather than strategic. Therefore, most of these models predict an attentional bias, but not an explicit memory bias. We suggest dividing memory into the highly automatic working memory (WM) component versus long-term memory

  18. The Masked Semantic Priming Effect Is Task Dependent: Reconsidering the Automatic Spreading Activation Process (United States)

    de Wit, Bianca; Kinoshita, Sachiko


    Semantic priming effects are popularly explained in terms of an automatic spreading activation process, according to which the activation of a node in a semantic network spreads automatically to interconnected nodes, preactivating a semantically related word. It is expected from this account that semantic priming effects should be routinely…


    Directory of Open Access Journals (Sweden)

    Gondor Mihaela


    Full Text Available This paper examines the role of Automatic Fiscal Stabilizers (AFS) in stabilizing cyclical fluctuations of macroeconomic output as an alternative to discretionary fiscal policy, acknowledging their considerable potential as an anti-crisis solution. The objectives of the study are to identify the general features of the concept of automatic fiscal stabilizers and to assess them logically from an economic perspective. Based on the literature in the field, this paper points out the disadvantages of discretionary fiscal policy and argues the need for Automatic Fiscal Stabilizers in order to provide faster decision making, shielded from political interference, and reduced uncertainty for households and the business environment. The paper concludes that fiscal policy should be used for smoothing the economic cycle, but in a way that includes transparency, responsibility and clear operating mechanisms among its features. Based on the research results, the paper assumes that pro-cyclicality reduces the effectiveness of Automatic Fiscal Stabilizers and therefore concludes that it is very important to avoid pro-cyclicality in fiscal rule design. Moreover, by committing in advance to specific fiscal policy actions contingent on economic developments, uncertainty about the fiscal policy framework during a recession should be reduced. Being based on logical rather than empirical, contextualized analysis, the paper presents some features of the AFS operating mechanism and also identifies and systematizes the factors that account for its importance and national individuality. Reaching a common understanding of the Automatic Fiscal Stabilizer concept as an institutional device for smoothing the gaps of the economic cycle across different countries, particularly for the European Union Member States, will facilitate efforts to coordinate fiscal policy responses during a crisis, especially in the context of the fiscal

  20. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong


    A self-designed automatic scaler is introduced. A microcontroller LPC936 is used as the master chip of the scaler. A counter integrated in the microcontroller is configured to operate as an external pulse counter. The software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed as a low-cost, low-power-consumption solution. By now, the automatic scaler has been applied in a surface contamination instrument. (authors)

  1. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard


    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  2. Traduction automatique et terminologie automatique (Automatic Translation and Automatic Terminology (United States)

    Dansereau, Jules


    An exposition of reasons why a system of automatic translation could not use a terminology bank except as a source of information. The fundamental difference between the two tools is explained and examples of translation and mistranslation are given as evidence of the limits and possibilities of each process. (Text is in French.) (AMH)

  3. Using Automatic Identification System Technology to Improve Maritime Border Security (United States)


    to transport WMD is of significant concern. Terrorists have demonstrated that they have the capability to use explosive-laden suicide boats as...Cost-Effective-ADS-B-Solution-General-Aviation#.VFBlQvTF_6I 42 When Malaysia Airlines flight MH...low-altitude coverage and coverage reliability.”123 Malaysia is also “implementing ADS-B surveillance to improve coverage of certain air routes

  4. Automatic identification of variables in epidemiological datasets using logic regression

    NARCIS (Netherlands)

    M.W. Lorenz (Matthias W.); Abdi, N.A. (Negin Ashtiani); F. Scheckenbach (Frank); A. Pflug (Anja); A. Bulbul (Alpaslan); A.L. Catapano (Alberico); S. Agewall (Stefan); M. Ezhov (Marat); M.L. Bots (Michiel); S. Kiechl (Stefan); Orth, A. (Andreas); G.D. Norata (Giuseppe); J.P. Empana (Jean Philippe); Lin, H.-J. (Hung-Ju); S. McLachlan (Stela); L. Bokemark (Lena); K. Ronkainen (Kimmo); Amato, M. (Mauro); U. Schminke (Ulf); Srinivasan, S.R. (Sathanur R.); L. Lind (Lars); Kato, A. (Akihiko); Dimitriadis, C. (Chrystosomos); Przewlocki, T. (Tadeusz); Okazaki, S. (Shuhei); C.D. Stehouwer (Coen); Lazarevic, T. (Tatjana); J. Willeit (Johann); Yanez, D.N. (David N.); H. Steinmetz (helmuth); Sander, D. (Dirk); H. Poppert (Holger); M. Desvarieux (Moise); M.A. Ikram (Arfan); Bevc, S. (Sebastjan); Staub, D. (Daniel); Sirtori, C.R. (Cesare R.); B. Iglseder (Bernhard); G. Engström; G.L. Tripepi (Giovanni); Beloqui, O. (Oscar); Lee, M.-S. (Moo-Sik); A. Friera (Alfonsa); W. Xie (Wuxiang); L. Grigore (Liliana); M. Plichart (Matthieu); Su, T.-C. (Ta-Chen); C.M. Robertson (Christine M); C. Schmidt (Caroline); Tuomainen, T.-P. (Tomi-Pekka); F. Veglia (Fabrizio); H. Völzke (Henry); M.G.A.A.M. Nijpels (Giel); Jovanovic, A. (Aleksandar); J. Willeit (Johann); Sacco, R.L. (Ralph L.); O.H. Franco (Oscar); Hojs, R. (Radovan); Uthoff, H. (Heiko); B. Hedblad (Bo); Park, H.W. (Hyun Woong); Suarez, C. (Carmen); Zhao, D. (Dong); Catapano, A. (Alberico); P. Ducimetiere (P.); Chien, K.-L. (Kuo-Liong); Price, J.F. (Jackie F.); G. Bergstrom (Goran); J. Kauhanen (Jussi); E. Tremoli (Elena); M. Dörr (Marcus); Berenson, G. (Gerald); A. Papagianni (Aikaterini); Kablak-Ziembicka, A. (Anna); Kitagawa, K. (Kazuo); J.M. Dekker (Jacqueline); Stolic, R. (Radojica); J.F. Polak (Joseph F.); M. Sitzer (Matthias); H. Bickel (Horst); T. Rundek (Tatjana); A. Hofman (Albert); Ekart, R. (Robert); Frauchiger, B. (Beat); Castelnuovo, S. (Samuela); M. Rosvall (Maria); C. Zoccali (Carmine); Landecho, M.F. (Manuel F.); Bae, J.-H. (Jang-Ho); Gabriel, R. (Rafael); Liu, J. (Jing); D. Baldassarre (Damiano); M. Kavousi (Maryam)


    Background: For an individual participant data (IPD) meta-analysis, multiple datasets must be transformed in a consistent format, e.g. using uniform variable names. When large numbers of datasets have to be processed, this can be a time-consuming and error-prone task. Automated or

  5. Automatic identification of variables in epidemiological datasets using logic regression

    NARCIS (Netherlands)

    Lorenz, Matthias W.; Abdi, Negin Ashtiani; Scheckenbach, Frank; Pflug, Anja; Bülbül, Alpaslan; Catapano, Alberico L.; Agewall, Stefan; Ezhov, Marat; Bots, Michiel L.; Kiechl, Stefan; Orth, Andreas; Norata, Giuseppe D.; Empana, Jean Philippe; Lin, Hung Ju; McLachlan, Stela; Bokemark, Lena; Ronkainen, Kimmo; Amato, Mauro; Schminke, Ulf; Srinivasan, Sathanur R.; Lind, Lars; Kato, Akihiko; Dimitriadis, Chrystosomos; Przewlocki, Tadeusz; Okazaki, Shuhei; Stehouwer, C. D.A.; Lazarevic, Tatjana; Willeit, Peter; Yanez, David N.; Steinmetz, Helmuth; Sander, Dirk; Poppert, Holger; Desvarieux, Moise; Ikram, M. Arfan; Bevc, Sebastjan; Staub, Daniel; Sirtori, Cesare R.; Iglseder, Bernhard; Engström, Gunnar; Tripepi, Giovanni; Beloqui, Oscar; Lee, Moo Sik; Friera, Alfonsa; Xie, Wuxiang; Grigore, Liliana; Plichart, Matthieu; Su, Ta Chen; Robertson, Christine; Schmidt, Caroline; Tuomainen, Tomi Pekka; Veglia, Fabrizio; Völzke, Henry; Nijpels, Giel; Jovanovic, Aleksandar; Willeit, Johann; Sacco, Ralph L.; Franco, Oscar H.; Hojs, Radovan; Uthoff, Heiko; Hedblad, Bo; Park, Hyun Woong; Suarez, Carmen; Zhao, Dong; Catapano, Alberico; Ducimetiere, Pierre; Chien, Kuo Liong; Price, Jackie F.; Bergström, Göran; Kauhanen, Jussi; Tremoli, Elena; Dörr, Marcus; Berenson, Gerald; Papagianni, Aikaterini; Kablak-Ziembicka, Anna; Kitagawa, Kazuo; Dekker, Jaqueline M.; Stolic, Radojica; Polak, Joseph F.; Sitzer, Matthias; Bickel, Horst; Rundek, Tatjana; Hofman, Albert; Ekart, Robert; Frauchiger, Beat; Castelnuovo, Samuela; Rosvall, Maria; Zoccali, Carmine; Landecho, Manuel F.; Bae, Jang Ho; Gabriel, Rafael; Liu, Jing; Baldassarre, Damiano; Kavousi, Maryam


    Background: For an individual participant data (IPD) meta-analysis, multiple datasets must be transformed in a consistent format, e.g. using uniform variable names. When large numbers of datasets have to be processed, this can be a time-consuming and error-prone task. Automated or semi-automated

  6. Automatic identification of variables in epidemiological datasets using logic regression

    NARCIS (Netherlands)

    M.W. Lorenz (Matthias W.); N.A. Abdi (Negin Ashtiani); F. Scheckenbach (Frank); A. Pflug (Anja); A. Bulbul (Alpaslan); A.L. Catapano (Alberico L.); S. Agewall (Stefan); M. Ezhov (Marat); M.L. Bots (Michiel); S. Kiechl (Stefan); A. Orth (Andreas)


    Background: For an individual participant data (IPD) meta-analysis, multiple datasets must be transformed in a consistent format, e.g. using uniform variable names. When large numbers of datasets have to be processed, this can be a time-consuming and error-prone task. Automated or

  7. Intelligent system for automatic feature detection and selection or identification (United States)

    Sun, C.T.; Shiang, P.S.; Jang, J.S.; Fu, C.Y.


    A neural network uses a fuzzy membership function, the parameters of which are adaptive during the training process, to parameterize the interconnection weights between the (n-1)th layer and the nth layer of the network. Each jth node in each kth layer of the network except the input layer produces its output value y_{k,j} according to the function shown in Equation 1, where N_{k-1} is the number of nodes in layer k-1, i indexes the nodes of layer k-1, and the w_{k,i,j} are interconnection weights. The interconnection weights to all nodes j in the nth layer are given by w_{n,i,j} = w_{n,j}(i, p_{n,j,1}, ..., p_{n,j,P_n}). The apparatus is trained by setting values for at least one of the parameters p_{n,j,1}, ..., p_{n,j,P_n}. Preferably the number of parameters P_n is less than the number of nodes N_{n-1} in layer n-1. The function w_{n,j}(i, p_{n,j,1}, ..., p_{n,j,P_n}) can be convex in i, and it can be bell-shaped. Sample functions for w_{n,j}(i, p_{n,j,1}, ..., p_{n,j,P_n}) include Equation 2, shown in the patent. 8 figs.

  8. 33 CFR 401.20 - Automatic Identification System. (United States)


    ... more than 50 passengers for hire; and (2) Each dredge, floating plant or towing vessel over 8 meters in... close to the primary conning position in the navigation bridge and a standard 120 Volt, AC, 3-prong...

  9. Using automatic identification system technology to improve maritime border security


    Lindstrom, Tedric R.


    Approved for public release; distribution is unlimited Our coastal waters are the United States’ most open and vulnerable borders. This vast maritime domain harbors critical threats from terrorism, criminal activities, and natural disasters. Maritime borders pose significant security challenges, as nefarious entities have used small boats to conduct illegal activities for years, and they continue to do so today. Illegal drugs, money, weapons, and migrants flow both directions across our ma...

  10. 47 CFR 25.281 - Automatic Transmitter Identification System (ATIS). (United States)


    ...) The protocol shall be International Morse Code keyed by a 1200 Hz ±800 Hz tone representing a mark and... code programmed into the ATIS device in a permanent manner such that it cannot be readily changed by...

  11. Automatic Identification System (AIS) Transmit Testing in Louisville Phase 2 (United States)


    4.1.2 Paducah, KY... 2013, team members traveled to Louisville, KY and Paducah, KY to gather input from various stakeholders. Since none of the stakeholders had used the... hours out from the lock. 4.1.2 Paducah, KY While in Paducah, a visit was made to Ingram barge. Mr. Mark Stevens (who has since retired) and Mr. Mike

  12. Child vocalization composition as discriminant information for automatic autism detection. (United States)

    Xu, Dongxin; Gilkerson, Jill; Richards, Jeffrey; Yapanel, Umit; Gray, Sharmi


    Early identification is crucial for young children with autism to access early intervention. Existing screens require either a parent-report questionnaire and/or direct observation by a trained practitioner. Although an automatic tool would benefit parents, clinicians and children, there is no automatic screening tool in clinical use. This study reports a fully automatic mechanism for autism detection/screening for young children. It is a direct extension of the LENA (Language ENvironment Analysis) system, which utilizes speech signal processing technology to analyze and monitor a child's natural language environment and the vocalizations/speech of the child. We found that child vocalization composition contains rich discriminant information for autism detection. By applying pattern recognition and machine learning approaches to child vocalization composition data, accuracy rates of 85% to 90% in cross-validation tests for autism detection were achieved at the equal-error-rate (EER) point on a data set with 34 children with autism, 30 language-delayed children and 76 typically developing children. Owing to its simple and automatic procedure, it is believed that this new tool can serve a significant role in childhood autism screening, especially with regard to population-based or universal screening.
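
    The core idea, classifying children from the composition of their vocalizations, can be sketched with a deliberately simple stand-in: synthetic composition vectors (proportions of acoustic categories per recording) and a nearest-centroid rule under leave-one-out cross-validation. The category proportions and group differences below are invented for illustration and are not the LENA features or the paper's classifier.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical "vocalization composition" vectors: proportions of four
# acoustic categories per child recording (synthetic data).
def sample_group(center, n):
    raw = rng.normal(center, 0.05, size=(n, 4)).clip(min=0.01)
    return raw / raw.sum(axis=1, keepdims=True)   # normalize to proportions

group_0 = sample_group([0.4, 0.3, 0.2, 0.1], 40)
group_1 = sample_group([0.2, 0.2, 0.3, 0.3], 40)
X = np.vstack([group_0, group_1])
y = np.array([0] * 40 + [1] * 40)

# Leave-one-out cross-validation with a nearest-centroid rule, a minimal
# stand-in for the paper's machine-learning stage.
correct = 0
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    c0 = X[mask & (y == 0)].mean(axis=0)
    c1 = X[mask & (y == 1)].mean(axis=0)
    pred = int(np.linalg.norm(X[i] - c1) < np.linalg.norm(X[i] - c0))
    correct += int(pred == y[i])

accuracy = correct / len(X)
print(f"leave-one-out accuracy: {accuracy:.2f}")
```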

  13. Identification of a subset of patients with acute myeloid leukemia characterized by long-term in vitro proliferation and altered cell cycle regulation of the leukemic cells. (United States)

    Hatfield, Kimberley Joanne; Reikvam, Håkon; Bruserud, Øystein


    The malignant cell population of acute myeloid leukemia (AML) includes a small population of stem/progenitor cells with long-term in vitro proliferation. We wanted to compare long-term AML cell proliferation for unselected patients, investigate the influence of endothelial cells on AML cell proliferation and identify biological characteristics associated with clonogenic capacity. Cells were cultured in medium supplemented with recombinant growth factors FMS-like tyrosine kinase-3 ligand, stem cell factor, IL-3, G-CSF and thrombopoietin. The colony-forming unit assay was used to estimate the number of progenitors in AML cell populations after 35 days of culture, and microarray was used to study global gene expression profiles between AML patients. Long-term cell proliferation was observed in 7 of 31 patients, whereas 3 additional patients showed long-term proliferation after endothelial cell coculture. Patient-specific differences in constitutive cytokine release were maintained during cell culture. Patients with long-term proliferation showed altered expression in six cell cycle-related genes (HMMR, BUB1, NUSAP1, AURKB, CCNF, DLGAP5), two genes involved in DNA replication (TOP2A, RFC3) and one gene with unknown function (LHFPL2). We identified a subset of AML patients characterized by long-term in vitro cell proliferation and altered expression of cell cycle regulators that may be potential candidates for treatment of AML.

  14. Who identifies with suicidal film characters? Determinants of identification with suicidal protagonists of drama films. (United States)

    Till, Benedikt; Herberth, Arno; Sonneck, Gernot; Vitouch, Peter; Niederkrotenthaler, Thomas


    Identification with a media character is an influential factor for the effects of a media product on the recipient, but still very little is known about this cognitive process. This study investigated to what extent identification of a recipient with the suicidal protagonist of a film drama is influenced by the similarity between them in terms of sex, age, and education as well as by the viewer's empathy and suicidality. Sixty adults were assigned randomly to one of two film groups. Both groups watched a drama that concluded with the tragic suicide of the protagonist. Identification, empathy, suicidality, as well as socio-demographic data were measured by questionnaires that were applied before and after the movie screening. Results indicated that identification was not associated with socio-demographic similarity or the viewer's suicidality. However, the greater the subjects' empathy was, the more they identified with the protagonist in one of the two films. This investigation provides evidence that challenges the common assumption that identification with a film character is automatically generated when viewer and protagonist are similar in terms of sex, age, education or attitude.

  15. Automatically Preparing Safe SQL Queries (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
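
    The contrast the transformation targets can be shown in miniature. The paper's approach rewrites legacy web-application source (not Python), so the sqlite3 snippet below is only an illustration of the before/after behavior: string concatenation lets input change the query structure, while a prepared/parameterized statement binds the input strictly as data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

# Unsafe pattern the transformation removes: concatenating user input into
# the SQL string lets a crafted value rewrite the query logic.
attacker_input = "' OR '1'='1"
unsafe_sql = "SELECT name FROM users WHERE role = '" + attacker_input + "'"
leaked = conn.execute(unsafe_sql).fetchall()

# Safe pattern it introduces: a prepared/parameterized statement, where the
# input is bound as data and can never be parsed as SQL.
safe = conn.execute(
    "SELECT name FROM users WHERE role = ?", (attacker_input,)
).fetchall()

print("unsafe query leaks:", leaked)     # injection returns every row
print("prepared query returns:", safe)   # no row has that literal role
```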

  16. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim


    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used combines two methods: the maximum correlation coefficient and correlation in the subpixel range...... interactive software is also part of a computer-assisted learning program on digital photogrammetry....
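
    The first of the two methods, locating a target at the position of maximum correlation coefficient, can be sketched as an exhaustive normalized cross-correlation search. The image, the cross-shaped target, and its position are synthetic stand-ins for réseau imagery, and the subpixel refinement step is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic image with a bright cross-shaped target at a known position.
image = rng.normal(0.0, 0.1, (40, 40))
target = np.zeros((7, 7))
target[3, :] = 1.0
target[:, 3] = 1.0
true_row, true_col = 12, 21
image[true_row:true_row + 7, true_col:true_col + 7] += target

def ncc(window, template):
    """Normalized correlation coefficient between an image window and the
    target template (invariant to window brightness and contrast)."""
    w = window - window.mean()
    t = template - template.mean()
    denom = np.sqrt((w * w).sum() * (t * t).sum())
    return (w * t).sum() / denom if denom > 0 else 0.0

# Exhaustive pixel-level search for the maximum correlation coefficient.
best, best_pos = -1.0, None
for r in range(image.shape[0] - 7 + 1):
    for c in range(image.shape[1] - 7 + 1):
        score = ncc(image[r:r + 7, c:c + 7], target)
        if score > best:
            best, best_pos = score, (r, c)

print("target located at", best_pos, "with coefficient %.2f" % best)
```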

  17. Automatic Error Analysis Using Intervals (United States)

    Rothwell, E. J.; Cloud, M. J.


    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
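
    The underlying idea, carrying guaranteed [lo, hi] bounds through a formula instead of propagating derivatives, can be sketched with a minimal interval type. This is a generic illustration of interval arithmetic, not the INTLAB toolbox (which is MATLAB-based and also handles rounding directions, which are ignored here).

```python
# Minimal interval type: each operation returns bounds that contain every
# possible result of applying the operation to points in the operand ranges.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

    def __repr__(self):
        return f"[{self.lo:.4f}, {self.hi:.4f}]"

# Measured quantities 2.0 +/- 0.01 and 3.0 +/- 0.02:
x = Interval(1.99, 2.01)
y = Interval(2.98, 3.02)

# Propagate through f(x, y) = x*y + x - y; the width of the result is a
# guaranteed worst-case error bound, with no derivative bookkeeping.
f = x * y + x - y
width = f.hi - f.lo
print("f(x, y) lies in", f, "width %.4f" % width)
```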

  18. Automatic measurement of the radioactive mercury uptake by the kidney

    International Nuclear Information System (INIS)

    Zurowski, S.; Raynaud, C.; CEA, 91 - Orsay


    An entirely automatic method to measure the Hg uptake by the kidney is proposed. The following operations are carried out in succession: measurement of extrarenal activity, demarcation of uptake areas, anatomical identification of uptake areas, separation of overlapping organ images and measurement of kidney depth. The first results thus calculated on 30 patients are very close to those obtained with a standard manual method and are highly encouraging. Two important points should be stressed: a broad demarcation of the uptake areas is necessary and an original method, that of standard errors, is useful for the background noise determination and uptake area demarcation. This automatic measurement technique is so designed that it can be applied to other special cases [fr

  19. Source identification and long-term monitoring of airborne particulate matter (PM2.5/PM10) in an urban region of Korea

    International Nuclear Information System (INIS)

    Yong-Sam Chung; Sun-Ha Kim; Jong-Hwa Moon; Young-Jin Kim; Jong-Myoung Lim; Jin-Hong Lee


    For the identification of air pollution sources, about 500 airborne particulate matter (PM2.5 and PM10) samples were collected using a Gent air sampler and a polycarbonate filter in an urban region in the middle of Korea from 2000 to 2003. The concentrations of 25 elements in the samples were measured using instrumental neutron activation analysis (INAA). Receptor modeling was performed on the air monitoring data using the positive matrix factorization (PMF2) method. This analysis identified 6 to 10 PMF factors, such as metal-alloy, oil combustion, diesel exhaust, coal combustion, gasoline exhaust, incinerator, Cu-smelter, biomass burning, sea-salt, and soil dust. (author)

  20. Singer Identification in Rembetiko Music


    Holzapfel, André; Stylianou, Yannis


    In this paper, the problem of the automatic identification of a singer is investigated using methods known from speaker identification. Ways of using world models are presented and the usage of Cepstral Mean Subtraction (CMS) is evaluated. In order to minimize differences due to musical style we use a novel data set consisting of samples from Greek Rembetiko music that are very similar in style. The data set also explores for the first time the influence of the recording quality, by includi...
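
    Cepstral Mean Subtraction itself is a one-line normalization: subtract the per-utterance mean of each cepstral coefficient, which removes any stationary channel coloration (a convolutional channel becomes an additive offset in the cepstral domain). The sketch below applies it to synthetic cepstra with an artificial channel offset; the frame counts and dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic cepstral features: frames x coefficients, plus a constant
# channel offset (the recording-quality effect CMS is meant to remove).
frames = rng.normal(0.0, 1.0, (200, 13))
channel_offset = rng.normal(0.0, 2.0, 13)   # stationary channel -> additive in cepstrum
observed = frames + channel_offset

def cepstral_mean_subtraction(cepstra):
    """Subtract the per-utterance mean of each coefficient, removing any
    stationary channel component from the features."""
    return cepstra - cepstra.mean(axis=0, keepdims=True)

normalized = cepstral_mean_subtraction(observed)

# After CMS the per-coefficient means are (numerically) zero, so a constant
# channel offset no longer distinguishes recordings.
residual = np.abs(normalized.mean(axis=0)).max()
print("max |mean| after CMS: %.2e" % residual)
```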

  1. Automatic lighting controls demonstration: Long-term results

    Energy Technology Data Exchange (ETDEWEB)

    Rubinstein, F. (Lawrence Berkeley Lab., CA (United States))


    An advanced electronically ballasted lighting control system was installed in a portion of an office building to measure the energy and demand savings. The lighting control system used an integrated lighting control scenario that included daylight following, lumen depreciation correction, and scheduling. The system reduced lighting energy on weekdays by 62% and 51% in the north and south daylit zones, respectively, compared to a reference zone that did not have controls. During the summer, over 75% energy savings were achieved on weekdays in the north daylit zone. Even in the south interior zone, which benefited little from daylight, correction strategies and adjustment of the aisleway lights to a low level resulted in energy use of only half that of the reference zone. Although, in general, the savings varied over the year due to changing daylight conditions, the energy reduction achieved with controls could be fit using a simple analytical model. Significant savings also occurred during core operating hours when it is more expensive to supply and use energy. Compared to the usage in the reference zone, energy reductions of 49%, 44%, and 62% were measured in the south daylight, south interior, and north daylight zones, respectively, during core operating hours throughout the year. Lighting energy usage on weekends decreased dramatically in the zones with controls, with the usage in the north daylit zone only 10% that of the reference zone. A simple survey developed to assess occupant response to the lighting control system showed that the occupants were satisfied with the light levels provided.

  2. Synthesis of digital locomotive receiver of automatic locomotive signaling

    Directory of Open Access Journals (Sweden)

    K. V. Goncharov


    Full Text Available Purpose. Automatic locomotive signaling of the continuous type with numeric coding (ALSN) has several disadvantages: a small number of signal indications, low noise immunity, high inertia and low functional flexibility. A search for new and more advanced methods of signal processing for automatic locomotive signaling, and the synthesis of a noise-proof digital locomotive receiver, are therefore essential. Methodology. The proposed algorithm for the detection and identification of locomotive signaling codes is based on computing the mutual correlations of the received oscillation with reference signals. For selecting the threshold levels of the decision element, the following criterion has been formulated: the locomotive receiver should maximize the probability of a correct decision for a given probability of dangerous errors. Findings. It has been found that the random nature of the ALSN signal amplitude does not affect the detection algorithm. However, the distribution law and numeric characteristics of the signal amplitude affect the probability of errors and should be considered when selecting threshold levels. According to the obtained algorithm for detection and identification of ALSN signals, a digital locomotive receiver has been synthesized. It contains a band-pass filter, a peak limiter, a normalizing amplifier with an automatic gain control circuit, an analog-to-digital converter and a digital signal processor. Originality. The ALSN system is improved by transferring its technical means to a modern microelectronic element base and by applying more advanced methods of detection and identification of locomotive signaling codes. Practical value. The use of digital technology in the construction of the ALSN locomotive receiver will expand its functionality and increase the noise immunity and operational stability of the locomotive signaling system under various destabilizing factors.
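
    The correlation-based identification step can be sketched as follows: correlate the received oscillation with each reference code and pick the best match. The three code patterns and the noise level are invented for illustration (real ALSN codes are specific pulse trains), and the dangerous-error threshold on the winning score is noted but not implemented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Three hypothetical code patterns standing in for ALSN numeric codes.
codes = {name: rng.choice([-1.0, 1.0], size=64)
         for name in ("green", "yellow", "red-yellow")}

# Received oscillation: the "yellow" code corrupted by channel noise.
received = codes["yellow"] + rng.normal(0.0, 0.8, 64)

def identify(signal, references):
    """Identify the transmitted code by mutual correlation with each
    reference signal, as in the receiver's decision element. A threshold
    on the best score would implement the 'no valid code' decision
    required by the dangerous-error criterion (omitted here)."""
    scores = {name: float(np.dot(signal, ref)) / len(ref)
              for name, ref in references.items()}
    return max(scores, key=scores.get), scores

decision, scores = identify(received, codes)
print("identified code:", decision)
```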

  3. A biometric approach to laboratory rodent identification. (United States)

    Cameron, Jens; Jacobson, Christina; Nilsson, Kenneth; Rögnvaldsson, Thorsteinn


    Individual identification of laboratory rodents typically involves invasive methods, such as tattoos, ear clips, and implanted transponders. Beyond the ethical dilemmas they may present, these methods may cause pain or distress that confounds research results. The authors describe a prototype device for biometric identification of laboratory rodents that would allow researchers to identify rodents without the complications of other methods. The device, which uses the rodent's ear blood vessel pattern as the identifier, is fast, automatic, noninvasive, and painless.

  4. Automatic feed system for ultrasonic machining (United States)

    Calkins, Noel C.


    Method and apparatus for ultrasonic machining in which feeding of a tool assembly holding a machining tool toward a workpiece is accomplished automatically. In ultrasonic machining, a tool located just above a workpiece and vibrating in a vertical direction imparts vertical movement to particles of abrasive material which then remove material from the workpiece. The tool does not contact the workpiece. Apparatus for moving the tool assembly vertically is provided such that it operates with a relatively small amount of friction. Adjustable counterbalance means is provided which allows the tool to be immobilized in its vertical travel. A downward force, termed overbalance force, is applied to the tool assembly. The overbalance force causes the tool to move toward the workpiece as material is removed from the workpiece.

  5. Automatic female dehumanization across the menstrual cycle. (United States)

    Piccoli, Valentina; Fantoni, Carlo; Foroni, Francesco; Bianchi, Mauro; Carnaghi, Andrea


In this study, we investigate whether hormonal shifts during the menstrual cycle contribute to the dehumanization of other women and men. Female participants with different levels of likelihood of conception (LoC) completed a semantic priming paradigm in a lexical decision task. When the word 'woman' was the prime, animal words were more accessible in high versus low LoC, whereas human words were more inhibited in high versus low LoC. When the word 'man' was used as the prime, no difference was found in terms of accessibility between high and low LoC for either animal or human words. These results show that female dehumanization is automatically elicited by menstrual cycle-related processes and is likely associated with enhanced activation of mate-attraction goals. © 2016 The British Psychological Society.

  6. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena


    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...
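The general two-stage algorithm mentioned here (select candidate topics, then rank them by significance) can be sketched in a toy form. This is not the thesis's actual feature set; the stopword list and the TF x IDF-style score are simplifications chosen for illustration.

```python
from collections import Counter
import math, re

STOPWORDS = {"the", "of", "a", "and", "in", "to", "is", "for"}

def candidates(text, max_len=2):
    """Stage 1: candidate topics are stopword-free word n-grams."""
    words = re.findall(r"[a-z]+", text.lower())
    cands = []
    for n in range(1, max_len + 1):
        for i in range(len(words) - n + 1):
            gram = words[i:i + n]
            if not (set(gram) & STOPWORDS):
                cands.append(" ".join(gram))
    return cands

def rank(text, doc_freq, n_docs):
    """Stage 2: rank candidates by a TF x IDF-style significance score."""
    tf = Counter(candidates(text))
    scored = {c: f * math.log(n_docs / (1 + doc_freq.get(c, 0)))
              for c, f in tf.items()}
    return sorted(scored, key=scored.get, reverse=True)
```

The thesis combines many more properties (semantic, domain-specific, encyclopedic) with a machine-learned model; this sketch shows only the candidate-then-rank skeleton.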

  7. Automatically extracting information needs from complex clinical questions. (United States)

    Cao, Yong-gang; Cimino, James J; Ely, John; Yu, Hong


Clinicians pose complex clinical questions when seeing patients, and identifying the answers to those questions in a timely manner helps improve the quality of patient care. We report here on two natural language processing models, namely, automatic topic assignment and keyword identification, that together automatically and effectively extract information needs from ad hoc clinical questions. Our study is motivated in the context of developing the larger clinical question answering system AskHERMES (Help clinicians to Extract and aRticulate Multimedia information for answering clinical quEstionS). We developed supervised machine-learning systems to automatically assign predefined general categories (e.g. etiology, procedure, and diagnosis) to a question. We also explored both supervised and unsupervised systems to automatically identify keywords that capture the main content of the question. We evaluated our systems on 4654 annotated clinical questions that were collected in practice. We achieved an F1 score of 76.0% for the task of general topic classification and 58.0% for keyword extraction. Our systems have been implemented into the larger question answering system AskHERMES. Our error analyses suggested that inconsistent annotation in our training data hurt both question analysis tasks. Our systems can automatically extract information needs from both short (fewer than 20 word tokens) and long questions, and from both well-structured and ill-formed questions. We speculate that the performance of general topic classification and keyword extraction can be further improved if consistently annotated data are made available. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. Self-Compassion and Automatic Thoughts (United States)

    Akin, Ahmet


    The aim of this research is to examine the relationships between self-compassion and automatic thoughts. Participants were 299 university students. In this study, the Self-compassion Scale and the Automatic Thoughts Questionnaire were used. The relationships between self-compassion and automatic thoughts were examined using correlation analysis…

  9. New software for computer-assisted dental-data matching in Disaster Victim Identification and long-term missing persons investigations: "DAVID Web". (United States)

    Clement, J G; Winship, V; Ceddia, J; Al-Amad, S; Morales, A; Hill, A J


In 1997 an internally supported but unfunded pilot project at the Victorian Institute of Forensic Medicine (VIFM) Australia led to the development of a computer system which closely mimicked Interpol paperwork for the storage, later retrieval and tentative matching of the many ante-mortem (AM) and post-mortem (PM) dental records that are often needed for rapid Disaster Victim Identification. The program was called "DAVID" (Disaster And Victim IDentification). It combined the skills of the VIFM Information Technology systems manager (VW), an experienced odontologist (JGC) and an expert database designer (JC); all current authors on this paper. Students from Monash University did much of the writing of the software to prescription. The student group involved won an Australian Information Industry Award in recognition of the contribution the new software could have made to the DVI process. Unfortunately, the potential of the software was never realized because, paradoxically, the federal nature of Australia frequently thwarts uniformity of systems across the entire country. As a consequence, the final development of DAVID never took place. Given the recent problems encountered post-tsunami by the odontologists who were obliged to use the Plass Data system (Plass Data Software, Holbaek, Denmark), and with the impending risks imposed upon Victoria by the decision to host the Commonwealth Games in Melbourne during March 2006, funding was sought and obtained from the state government to update counter-disaster preparedness at the VIFM. Some of these funds have been made available to upgrade and complete the DAVID project. In the wake of discussions between leading expert odontologists from around the world, held in Geneva during July 2003 at the invitation of the International Committee of the Red Cross, significant alterations to the initial design parameters of DAVID were proposed. This was part of broader discussions directed towards developing instruments which could be used by the ICRC's "The Missing

  10. An Automatic Document Indexing System Based on Cooperating Expert Systems: Design and Development. (United States)

    Schuegraf, Ernst J.; van Bommel, Martin F.


    Describes the design of an automatic indexing system that is based on statistical techniques and expert system technology. Highlights include system architecture; the derivation of topic indicators, including word frequency; experimental results using documents from ERIC; the effects of stemming; and the identification of characteristic…

  11. Automatic Service Derivation from Business Process Model Repositories via Semantic Technology

    NARCIS (Netherlands)

    Leopold, H.; Pittke, F.; Mendling, J.


    Although several approaches for service identification have been defined in research and practice, there is a notable lack of fully automated techniques. In this paper, we address the problem of manual work in the context of service derivation and present an approach for automatically deriving

  12. Automatic morphometry of synaptic boutons of cultured cells using granulometric analysis of digital images

    NARCIS (Netherlands)

    Prodanov, D.P.; Heeroma, Joost; Marani, Enrico


    Numbers, linear density, and surface area of synaptic boutons can be important parameters in studies on synaptic plasticity in cultured neurons. We present a method for automatic identification and morphometry of boutons based on filtering of digital images using granulometric analysis. Cultures of

  13. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    Brun, R.; Rademakers, F.


ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition, this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and its objects inspected, even when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case in which multiple data sets generated with many different class versions must be analyzed in the same session. ROOT also supports the automatic generation of C++ code describing the data objects in a file.

  14. Automatic design of magazine covers (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.


    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  15. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.


Radioimmunoassay provides dose-response curves that become linear under the logistic transformation (Rodbard). This transformation, being applicable to radioimmunoassay, is useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double-antibody technique was written by Dr. Kawamata. This approach proved useful for the automatic computation of data derived from double-antibody assays of insulin and C-peptides. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
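The linearizing step the abstract refers to can be sketched as follows: apply the logit transformation to the bound fraction B/B0, fit a straight line against log dose, and invert the fitted standard curve to read doses off measurements. The standard-curve data below are hypothetical, not from the paper.

```python
import math

def logit(y):
    """Rodbard-style logistic transformation of a bound fraction y = B/B0."""
    return math.log(y / (1.0 - y))

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def estimate_dose(b_over_b0, a, b):
    """Invert the fitted standard curve to read a dose off a measured B/B0."""
    return math.exp((logit(b_over_b0) - b) / a)

# Hypothetical standard curve: doses and bound fractions B/B0.
doses = [1, 2, 4, 8, 16]
bound = [0.88, 0.70, 0.45, 0.22, 0.09]
a, b = fit_line([math.log(d) for d in doses], [logit(v) for v in bound])
```

Because logit(B/B0) is close to linear in log dose, a simple least-squares line replaces the curved fit, which is what makes fully automatic computation of assay data practical.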

  16. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz


Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering and advanced signal processing techniques and scientific evaluation methodologies used in this multidisciplinary field are part of this exposition. The issues of modeling target signatures in various spectral modalities (LADAR, IR, SAR, high-resolution radar, acoustic, seismic, visible, hyperspectral) in diverse geometric aspects are addressed. The methods for signal processing and classification cover concepts such as sensor-adaptive and artificial neural networks, time reversal filt...

  17. Topical Session on Liabilities identification and long-term management at national level - Topical Session held during the 36. Meeting of the RWMC

    International Nuclear Information System (INIS)


    These proceedings cover a topical session that was held at the March 2003 meeting of the Radioactive Waste Management Committee. The topical session focused on liability assessment and management for decommissioning of all types of nuclear installations, including decontamination of historic sites and waste management, as applicable. The presentations covered the current, national situations. The first oral presentation, from Switzerland, set the scene by providing a broad coverage of the relevant issues. The subsequent presentations - five from Member countries and one from the EC - described additional national positions and the evolving EC proposed directives. Each oral presentation was followed by a brief period of Q and As for clarification only. A plenary discussion took place on the ensemble of presentations and a Rapporteur provided a report on points made and lessons learnt. Additionally, written contributions were provided by RWMC delegates from several other countries. These are included in the proceedings as are the papers from the oral sessions, and the Rapporteur's report. These papers are not intended to be exhaustive, but to give an informed glimpse of NEA countries' approaches to liability identification and management in the context of nuclear facilities decommissioning and dismantling

  18. MOS voltage automatic tuning circuit


    李, 田茂; 中田, 辰則; 松本, 寛樹


Abstract: An automatic tuning circuit adjusts frequency performance to compensate for process variation. A phase-locked loop (PLL) is a suitable oscillator for an integrated circuit. It is a feedback system that compares the input phase with the output phase and can make the output frequency equal to the input frequency. In this paper, a PLL formed of MOSFETs is presented. The presented circuit consists of an XOR circuit, a low-pass filter, and a relaxation oscillator. On PSPICE simulation...

  19. Automatic controller at associated memory

    International Nuclear Information System (INIS)

    Courty, P.


Organized around an A2-type controller, this CAMAC device allows, on command from the associated computer, the reading of 64K 16-bit words into an external memory. This memory is fully controlled by the computer. In the automatic mode, which operates at 10^6 words/s, the computer can access any other module of the same crate by cycle stealing [fr]

  20. Automatic Guidance for Remote Manipulator (United States)

    Johnston, A. R.


Position sensor and mirror guides manipulator toward object. Grasping becomes automatic when sensor begins to receive signal from reflector on object to be manipulated. Light-emitting diodes on manipulator produce light signals for reflector, which is composite of plane and corner reflectors. Proposed scheme especially useful when manipulator arm tends to flex or when object is moving. Sensor and microprocessor designed to compensate for manipulator-arm oscillation.

  1. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis


Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of the Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; and ALGOL 68. Other chapters discuss a general-purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming languages is also given.

  2. Automatically-Programed Machine Tools (United States)

    Purves, L.; Clerman, N.


    Software produces cutter location files for numerically-controlled machine tools. APT, acronym for Automatically Programed Tools, is among most widely used software systems for computerized machine tools. APT developed for explicit purpose of providing effective software system for programing NC machine tools. APT system includes specification of APT programing language and language processor, which executes APT statements and generates NC machine-tool motions specified by APT statements.

  3. Automatic computation of transfer functions (United States)

    Atcitty, Stanley; Watson, Luke Dale


    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
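The idea of computing a transfer function from a netlist description can be illustrated on the smallest possible case, a series-R shunt-C low-pass filter. The netlist format and function below are invented stand-ins for the patent's matrix netlist, not its actual representation.

```python
def rc_lowpass_response(netlist, omega):
    """Frequency response H(j*omega) of a single-node RC low-pass filter.

    netlist rows: (element, value), with one series R from the input node
    and one shunt C to ground. For this topology nodal analysis gives
    H(s) = 1 / (1 + s*R*C), evaluated here at s = j*omega.
    """
    R = next(v for e, v in netlist if e == "R")
    C = next(v for e, v in netlist if e == "C")
    return 1.0 / (1.0 + 1j * omega * R * C)

net = [("R", 1e3), ("C", 1e-6)]        # 1 kOhm, 1 uF -> corner at 1000 rad/s
h = rc_lowpass_response(net, 1000.0)   # response at the corner frequency
```

A general tool would assemble the nodal admittance matrix from the netlist and solve it symbolically in s; this sketch hard-codes the one-pole result to show the input/output relationship only.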

  4. CRISPR Recognition Tool (CRT): a tool for automatic detection ofclustered regularly interspaced palindromic repeats

    Energy Technology Data Exchange (ETDEWEB)

    Bland, Charles; Ramsey, Teresa L.; Sabree, Fareedah; Lowe,Micheal; Brown, Kyndall; Kyrpides, Nikos C.; Hugenholtz, Philip


Clustered Regularly Interspaced Palindromic Repeats (CRISPRs) are a novel type of direct repeat found in a wide range of bacteria and archaea. CRISPRs are beginning to attract attention because of their proposed mechanism: defending their hosts against invading extrachromosomal elements such as viruses. Existing repeat detection tools do a poor job of identifying CRISPRs due to the presence of unique spacer sequences separating the repeats. In this study, a new tool, CRT, is introduced that rapidly and accurately identifies CRISPRs in large DNA strings, such as genomes and metagenomes. CRT was compared to the CRISPR detection tools Patscan and Pilercr. In terms of correctness, CRT was shown to be very reliable, demonstrating significant improvements over Patscan for the measures precision, recall, and quality. When compared to Pilercr, CRT showed improved performance for recall and quality. In terms of speed, CRT also demonstrated superior performance, especially for genomes containing large numbers of repeats. In this paper a new tool was introduced for the automatic detection of CRISPR elements. This tool, CRT, was shown to be a significant improvement over current techniques for CRISPR identification. CRT's approach to detecting repetitive sequences is straightforward: it uses a simple sequential scan of a DNA sequence and detects repeats directly, without any major conversion or preprocessing of the input. This leads to a program that is easy to describe and understand, yet very accurate, fast, and memory efficient, being O(n) in space and O(nm/l) in time.
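The sequential-scan idea can be sketched as follows: slide a k-mer seed along the sequence and look for further copies of it at spacer-like distances. The parameters and the toy sequence are invented for illustration; CRT's actual heuristics and defaults differ.

```python
def find_direct_repeats(dna, k=8, min_spacer=10, max_spacer=40, min_copies=3):
    """Sequential scan for a k-mer repeated with spacers of plausible CRISPR length."""
    hits = []
    i = 0
    while i + k <= len(dna):
        seed = dna[i:i + k]
        positions = [i]
        j = i
        while True:
            lo = j + k + min_spacer          # earliest allowed start of next copy
            hi = j + k + max_spacer          # latest allowed start of next copy
            nxt = dna.find(seed, lo, hi + k)
            if nxt == -1:
                break
            positions.append(nxt)
            j = nxt
        if len(positions) >= min_copies:
            hits.append((seed, positions))
            i = positions[-1] + k            # skip past this repeat array
        else:
            i += 1
    return hits

# Toy "genome": a repeat of length 8 separated by 15-base spacers, four copies.
dna = "AAAA" + ("GTTTCCGT" + "ACGTACGTACGTACG") * 3 + "GTTTCCGT" + "CCCC"
hits = find_direct_repeats(dna)
```

The scan is a single left-to-right pass with bounded look-ahead, which is what keeps a tool of this shape linear in space.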

  5. Automatic segmentation of clinical texts. (United States)

    Apostolova, Emilia; Channin, David S; Demner-Fushman, Dina; Furst, Jacob; Lytinen, Steven; Raicu, Daniela


    Clinical narratives, such as radiology and pathology reports, are commonly available in electronic form. However, they are also commonly entered and stored as free text. Knowledge of the structure of clinical narratives is necessary for enhancing the productivity of healthcare departments and facilitating research. This study attempts to automatically segment medical reports into semantic sections. Our goal is to develop a robust and scalable medical report segmentation system requiring minimum user input for efficient retrieval and extraction of information from free-text clinical narratives. Hand-crafted rules were used to automatically identify a high-confidence training set. This automatically created training dataset was later used to develop metrics and an algorithm that determines the semantic structure of the medical reports. A word-vector cosine similarity metric combined with several heuristics was used to classify each report sentence into one of several pre-defined semantic sections. This baseline algorithm achieved 79% accuracy. A Support Vector Machine (SVM) classifier trained on additional formatting and contextual features was able to achieve 90% accuracy. Plans for future work include developing a configurable system that could accommodate various medical report formatting and content standards.
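The baseline the abstract describes (word-vector cosine similarity against pre-defined semantic sections) can be sketched as follows. The section names and prototype word lists are hypothetical examples, not the study's actual section inventory.

```python
from collections import Counter
import math, re

# Hypothetical prototype vocabulary for each semantic section.
SECTION_PROTOTYPES = {
    "findings":  "impression findings mass lesion normal opacity",
    "history":   "history presents patient complains symptoms onset",
    "technique": "technique contrast administered axial images obtained",
}

def vec(text):
    """Bag-of-words vector for a piece of text."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify_sentence(sentence):
    """Assign a sentence to the semantic section with the most similar word vector."""
    v = vec(sentence)
    return max(SECTION_PROTOTYPES, key=lambda s: cosine(v, vec(SECTION_PROTOTYPES[s])))
```

The study reports that this kind of similarity baseline reached 79% accuracy, and that an SVM with added formatting and contextual features reached 90%; the sketch covers only the baseline.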

  6. Image simulation for automatic license plate recognition (United States)

    Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José


    Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.

  7. Identification of long-term trends and seasonality in high-frequency water quality data from the Yangtze River basin, China. (United States)

    Duan, Weili; He, Bin; Chen, Yaning; Zou, Shan; Wang, Yi; Nover, Daniel; Chen, Wen; Yang, Guishan


Comprehensive understanding of the long-term trends and seasonality of water quality is important for controlling water pollution. This study focuses on spatio-temporal distributions, long-term trends, and seasonality of water quality in the Yangtze River basin, using a combination of the seasonal Mann-Kendall test and time-series decomposition. Weekly water quality data from 17 environmental stations for the period January 2004 to December 2015 were used. Results show gradual improvement in water quality during this period in the Yangtze River basin, with greater improvement in the Uppermost Yangtze River basin. The larger cities, with high GDP and population density, experienced relatively higher pollution levels due to discharge of industrial and household wastewater. Higher pollution levels were found in the Xiang and Gan River basins, as indicated by higher NH4-N and CODMn concentrations measured at the stations within these basins. Significant trends in water quality were identified for the 2004-2015 period. Operation of the Three Gorges Reservoir (TGR) enhanced pH fluctuations and possibly attenuated CODMn and NH4-N transport. Finally, seasonal cycles of varying strength were detected in the pollutant time series of river discharge. Seasonal patterns in pH indicate that maxima appear in winter and minima in summer, with the opposite true for CODMn. Accurate characterization of long-term trends and seasonality is a necessary goal of water quality monitoring efforts, and the analysis methods described here provide essential information for effectively controlling water pollution.
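The seasonal Mann-Kendall test used in this study can be sketched in a simplified form: compute Kendall's S statistic within each season (e.g. each calendar month) separately, then pool the statistics and their variances into one normal-approximation Z score. This sketch omits the tie and autocorrelation corrections a production implementation would include.

```python
import math

def mann_kendall_s(series):
    """Kendall's S statistic: count of increasing minus decreasing pairs."""
    return sum((series[j] > series[i]) - (series[j] < series[i])
               for i in range(len(series)) for j in range(i + 1, len(series)))

def seasonal_mann_kendall(values, period=12):
    """Pool per-season S statistics and variances into a trend-test Z score.

    Simplified: assumes no tied values and no serial correlation.
    """
    s_total, var_total = 0, 0.0
    for season in range(period):
        sub = values[season::period]
        n = len(sub)
        s_total += mann_kendall_s(sub)
        var_total += n * (n - 1) * (2 * n + 5) / 18.0
    if s_total > 0:
        return (s_total - 1) / math.sqrt(var_total)
    if s_total < 0:
        return (s_total + 1) / math.sqrt(var_total)
    return 0.0
```

A |Z| above 1.96 indicates a significant monotonic trend at the 5% level, which is how "significant trends in water quality" are identified in studies of this kind.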

  8. [Isolation, culture and identification of adipose-derived stem cells from SD rat adipose tissues subjected to long-term cryopreservation]. (United States)

    Liu, Qin; Wang, Liping; Chen, Fang; Zhang, Yi


    Objective To study the feasibility of isolation and culture of adipose-derived stem cells (ADSCs) from SD rat adipose tissues subjected to long-term cryopreservation. Methods We took inguinal fat pads from healthy SD rats. Adipose tissues were stored with 100 mL/L dimethyl sulfoxide (DMSO) combined with 900 mL/L fetal bovine serum (FBS) in liquid nitrogen. Three months later, the adipose tissues were resuscitated for the isolation and culture of ADSCs. The growth status and morphology were observed. The growth curve and cell surface markers CD29, CD45, CD90 of the 3rd passage cells were analyzed respectively by CCK-8 assay and immunocytochemistry. The 3rd passage cells were induced towards adipogenic lineages and osteogenic lineages by different inducers, and the resulting cells were examined separately by oil red O staining and alizarin red staining. Results The ADSCs obtained from SD rat adipose tissues subjected to long-term cryopreservation showed a spindle-shape appearance and had a good proliferation ability. The cell growth curve was typical "S" curve. Immunocytochemistry showed that the 3rd passage cells were positive for CD29 and CD90, while negative for CD45. The cells were positive for oil red O staining after adipogenic induction, and also positive for alizarin red staining after osteogenic induction. Conclusion The ADSCs can be isolated from SD rat adipose tissues subjected to long-term cryopreservation.

  9. FaNexer: Persian Keyphrase Automatic Indexer

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Falahati Qadimi Fumani


Full Text Available The main objective of this paper was to design a model of automatic keyphrase indexing for Persian. The trained model, consisting of six features ("TF", "TF × IDF", "RE", "RE × IDF", "Node Degree", and "First Occurrence"), is elaborated on. These six features are defined briefly, and for each feature the discretization ranges applied as well as the Yes/No probability scores of being an index term are reported. Finally, the way the model and each of its components perform is demonstrated step by step by running the software on a sample full-text article. The ultimate assessment of the software on 75 test articles revealed that it performed very well on full texts (F-measure = 27.3%, precision = 31.68%, and recall = 25.45%) and abstracts (F-measure = 28%, precision = 32.19%, and recall = 26.27%) when the default was set at 7. The software also proved successful with regard to the generation of keyphrases rather than single-word index terms at default 7. In all, 58.1% of the index terms generated by the software for full-text documents, and 58.67% of those generated for abstracts, were phrases. Finally, 78.86% and 74.48% of the keyterms generated for full texts and abstracts, respectively, were judged relevant by an LIS expert.
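The scoring scheme the abstract outlines (discretization ranges per feature, each range carrying a Yes/No probability of being an index term) can be sketched as a lookup-and-combine step. All ranges and probabilities below are made up for illustration; FaNexer's learned tables and its six actual features are not reproduced here.

```python
# Hypothetical discretization tables: (low, high, P(index term | value in range)).
FEATURE_TABLES = {
    "tf_idf":    [(0.0, 0.01, 0.10), (0.01, 0.05, 0.40), (0.05, 1e9, 0.75)],
    "first_occ": [(0.0, 0.1, 0.70), (0.1, 0.5, 0.35), (0.5, 1.01, 0.15)],
}

def prob_yes(feature, value):
    """Look up the Yes-probability for the discretized range containing value."""
    for lo, hi, p in FEATURE_TABLES[feature]:
        if lo <= value < hi:
            return p
    return 0.0

def keyphrase_score(tf_idf, first_occ):
    """Combine per-feature Yes-probabilities naive-Bayes style (independence assumed)."""
    return prob_yes("tf_idf", tf_idf) * prob_yes("first_occ", first_occ)
```

A candidate with a high TF x IDF value that first occurs early in the document thus outscores a rare, late-appearing one, matching the intuition behind the listed features.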

  10. Semi-automated identification of leopard frogs (United States)

    Petrovska-Delacrétaz, Dijana; Edwards, Aaron; Chiasson, John; Chollet, Gérard; Pilliod, David S.


    Principal component analysis is used to implement a semi-automatic recognition system to identify recaptured northern leopard frogs (Lithobates pipiens). Results of both open set and closed set experiments are given. The presented algorithm is shown to provide accurate identification of 209 individual leopard frogs from a total set of 1386 images.
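The PCA-based matching step can be sketched as follows: find the leading principal direction of the enrolled patterns, project every pattern onto it, and match a probe to the nearest enrolled individual in the projected space. The four-element "patterns" and gallery below are toy stand-ins for real ear-vessel feature vectors.

```python
import math

def leading_component(data, iters=200):
    """First principal direction of row-vector data via power iteration on X^T X."""
    dim = len(data[0])
    mean = [sum(row[d] for row in data) / len(data) for d in range(dim)]
    centered = [[row[d] - mean[d] for d in range(dim)] for row in data]
    v = [1.0] * dim
    for _ in range(iters):
        proj = [sum(r[d] * v[d] for d in range(dim)) for r in centered]
        w = [sum(p * r[d] for p, r in zip(proj, centered)) for d in range(dim)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return mean, v

def identify(probe, gallery, mean, v):
    """Match a probe pattern to the closest enrolled individual in 1-D PCA space."""
    def score(row):
        return sum((row[d] - mean[d]) * v[d] for d in range(len(row)))
    p = score(probe)
    return min(gallery, key=lambda name: abs(score(gallery[name]) - p))

# Toy gallery of enrolled "frogs" (hypothetical 4-dimensional feature vectors).
gallery = {"frog1": [0, 0, 0, 0], "frog2": [10, 10, 10, 10], "frog3": [20, 20, 20, 20]}
mean, v = leading_component(list(gallery.values()))
```

A real system keeps several components and supports open-set decisions (rejecting probes that match no enrolled individual); this sketch shows the closed-set case with one component only.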

  11. 47 CFR 74.482 - Station identification. (United States)


    ...) Automatically activated equipment may be used to transmit station identification in International Morse Code... unscrambled analog (F3E) mode or in International Morse Code pursuant to the provisions of paragraph (d) of... is maintained at 40%±10%, and that the code transmission rate is maintained between 20 and 25 words...

  12. Identification of long-term trends and seasonality in high-frequency water quality data from the Yangtze River basin, China.

    Directory of Open Access Journals (Sweden)

    Weili Duan



    Directory of Open Access Journals (Sweden)

    A. G. Stryzhniou


Full Text Available The paper presents a method for automatic raising and leveling of a support platform that differs from others in its simplicity and versatility. The method includes four phases of raising and leveling, during which the performance capabilities of the system are determined and the soil condition is tested. In addition, the current condition of the system is monitored and corrected, with control parameters issued to the control panel. The method can be used not only for static but also for dynamic leveling systems, such as active suspension. The method assumes identification and dynamic testing of the reference units. Movement of the reference units is synchronized to avoid dangerous skewing of the support platform. Recommendations for the system implementation and experimental model identification of the support platform are presented.

  14. An efficient scheme for automatic web pages categorization using the support vector machine (United States)

    Bhalla, Vinod Kumar; Kumar, Neeraj


    In the past few years, with the evolution of the Internet and related technologies, the number of Internet users has grown exponentially. These users demand access to relevant web pages from the Internet within a fraction of a second. To achieve this goal, an efficient categorization of web page contents is required. Manual categorization of these billions of web pages with high accuracy is a challenging task. Most of the existing techniques reported in the literature are semi-automatic, and higher levels of accuracy cannot be achieved with them. To achieve these goals, this paper proposes an automatic categorization of web pages into domain categories. The proposed scheme is based on the identification of specific and relevant features of the web pages. In the proposed scheme, extraction and evaluation of features are done first, followed by filtering of the feature set for categorization of domain web pages. A feature extraction tool based on the HTML document object model of the web page is developed in the proposed scheme. Feature extraction and weight assignment are based on a collection of domain-specific keyword lists developed by considering various domain pages. Moreover, the keyword list is reduced on the basis of the ids of keywords in the keyword list. Also, stemming of keywords and tag text is done to achieve higher accuracy. An extensive feature set is generated to develop a robust classification technique. The proposed scheme was evaluated using a machine learning method in combination with feature extraction and statistical analysis, using a support vector machine kernel as the classification tool. The results obtained confirm the effectiveness of the proposed scheme in terms of its accuracy on different categories of web pages.
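A toy version of the keyword-list feature extraction described above (HTML parsing, tag-dependent weighting, domain keyword scoring) might look like the sketch below. The keyword lists, their weights, and the title boost are illustrative assumptions, and the real scheme adds stemming and an SVM classifier on top of such features.

```python
from html.parser import HTMLParser
from collections import Counter

# Hypothetical domain keyword lists with per-keyword weights; the paper's
# actual lists are built from collections of domain pages.
DOMAIN_KEYWORDS = {
    "sports":  {"match": 2.0, "team": 1.5, "score": 1.0},
    "finance": {"market": 2.0, "stock": 1.5, "bank": 1.0},
}

class TextExtractor(HTMLParser):
    """Collect visible text tokens, weighting <title> text more heavily."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.tokens = Counter()
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        weight = 3 if self.in_title else 1  # assumed tag weighting
        for word in data.lower().split():
            self.tokens[word.strip(".,!?")] += weight

def categorize(html):
    parser = TextExtractor()
    parser.feed(html)
    # score = sum over domain keywords of (keyword weight * weighted count)
    scores = {d: sum(w * parser.tokens[k] for k, w in kws.items())
              for d, kws in DOMAIN_KEYWORDS.items()}
    return max(scores, key=scores.get)

page = "<html><title>Market report</title><body>The stock market rose.</body></html>"
```

Here `categorize(page)` returns "finance", because the title occurrence of "market" is counted three times.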

  15. Optimisation of the long-term efficacy of dental chair waterline disinfection by the identification and rectification of factors associated with waterline disinfection failure. (United States)

    O'Donnell, M J; Shore, A C; Russell, R J; Coleman, D C


    Although many studies have highlighted the problem of biofilm growth in dental chair unit waterlines (DUWs), no long-term studies on the efficacy of DUW disinfection using a large number of dental chair units (DCUs) have been reported. To investigate the long-term (21 months) efficacy of the Planmeca Waterline Cleaning System (WCS) to maintain the quality of DUW output water below the American Dental Association (ADA) recommended standard of ≤200 cfu/mL of aerobic heterotrophic bacteria using once-weekly disinfection with the hydrogen peroxide- and silver ion-containing disinfectant Planosil. Microbiological quality of DUW output water was monitored by culture on R2A agar for 10 DCUs fitted with the WCS. The presence of biofilm in DUWs was examined by electron microscopy. During the first 9 months a high prevalence (28/300 disinfection cycles; 9.3%) of intermittent DUW disinfection failure occurred in 8/10 DCUs, due to operator omission to disinfect all DUWs (10/28 failed cycles), incorrect compressed air pressure failing to distribute the disinfectant properly (4/28 failed cycles) and physical blockage of disinfectant intake valves due to corrosion effects of Planosil (14/28 failed cycles). On rectification of these faults through engineering redesign and procedural changes, no further cases of intermittent DUW disinfection failure were observed. Independently of these factors, a rapid and consistent decline in efficacy of DUW disinfection occurred in 4/10 DCUs following the initial 9 months of once-weekly disinfection. There was a highly significant difference (P<0.0001) in the prevalence of strongly catalase-positive Novosphingobium and Sphingomonas bacterial species (mean prevalence of 37.1%) in DUW output water from these 4 DCUs compared to the other 6 DCUs and DCU supply water (prevalence <1%), which correlated with biofilm presence in the DUWs and indicated selective pressure for maintenance of these species by prolonged disinfectant usage

  16. Identification of Typical Left Bundle Branch Block Contraction by Strain Echocardiography Is Additive to Electrocardiography in Prediction of Long-Term Outcome After Cardiac Resynchronization Therapy

    DEFF Research Database (Denmark)

    Risum, Niels; Tayal, Bhupendar; Hansen, Thomas F


    BACKGROUND: Current guidelines suggest that patients with left bundle branch block (LBBB) be treated with cardiac resynchronization therapy (CRT); however, one-third do not have a significant activation delay, which can result in nonresponse. By identifying characteristic opposing wall contraction......, 2-dimensional strain echocardiography (2DSE) may detect true LBBB activation. OBJECTIVES: This study sought to investigate whether the absence of a typical LBBB mechanical activation pattern by 2DSE was associated with unfavorable long-term outcome and if this is additive to electrocardiographic...... whether typical LBBB contraction was present. The pre-defined outcome was freedom from death, left ventricular assist device, or heart transplantation over 4 years. RESULTS: Two-thirds of patients (63%) had a typical LBBB contraction pattern. During 4 years, 48 patients (23%) reached the primary endpoint...

  17. Single thrombopoietin dose alleviates hematopoietic stem cells intrinsic short- and long-term ionizing radiation damage. In vivo identification of anatomical cell expansion sites. (United States)

    Tronik-Le Roux, Diana; Nicola, Marie-Anne; Vaigot, Pierre; Nurden, Paquita


    Hematopoietic stem cells (HSC) are essential for maintaining the integrity of complex and long-lived organisms. HSC, which are self-renewing, reconstitute the hematopoietic system throughout life and facilitate long-term repopulation of myeloablated recipients. We have previously demonstrated that when mice are exposed to sublethal doses of ionizing radiation, subsets of the stem/progenitor compartment are affected. In this study we examine the role of thrombopoietin (TPO) on the regenerative capacities of HSC after irradiation and report the first demonstration of efficacy of a single injection of TPO shortly after in vivo exposure to ionizing radiation for reducing HSC injury and improving their functional outcome. Our results demonstrate that TPO treatment not only reduced the number of apoptotic cells but also induced a significant modification of their intrinsic characteristics. These findings were supported by transplantation assays with long-term HSC that were irradiated or unirradiated, TPO treated or untreated, in CD45.1/CD45.2 systems and by using luciferase-labeled HSC for direct bioluminescence imaging in living animals. Of particular importance, our data demonstrate the skull to be a highly favorable site for the TPO-induced emergence of hematopoietic cells after irradiation, suggesting a TPO-mediated relationship of primitive hematopoietic cells to an anatomical component. Together, the data presented here provide novel findings about aspects of TPO action on stem cells, open new areas of investigation for therapeutic options in patients who are treated with radiation therapy, and show that early administration of a clinically suitable TPO-agonist counteracts the previously observed adverse effects.

  18. Unification of automatic target tracking and automatic target recognition (United States)

    Schachter, Bruce J.


    The subject being addressed is how an automatic target tracker (ATT) and an automatic target recognizer (ATR) can be fused together so tightly and so well that their distinctiveness becomes lost in the merger. This has historically not been the case outside of biology and a few academic papers. The biological model of ATT∪ATR arises from dynamic patterns of activity distributed across many neural circuits and structures (including retina). The information that the brain receives from the eyes is "old news" at the time that it receives it. The eyes and brain forecast a tracked object's future position, rather than relying on received retinal position. Anticipation of the next moment - building up a consistent perception - is accomplished under difficult conditions: motion (eyes, head, body, scene background, target) and processing limitations (neural noise, delays, eye jitter, distractions). Not only does the human vision system surmount these problems, but it has innate mechanisms to exploit motion in support of target detection and classification. Biological vision doesn't normally operate on snapshots. Feature extraction, detection and recognition are spatiotemporal. When vision is viewed as a spatiotemporal process, target detection, recognition, tracking, event detection and activity recognition do not seem as distinct as they are in current ATT and ATR designs. They appear as similar mechanisms operating at varying time scales. A framework is provided for unifying ATT and ATR.
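The predictive tracking described above - forecasting a target's next position rather than relying on late measurements - is classically captured by an alpha-beta filter. This is a generic one-dimensional sketch, not the ATT∪ATR design itself, and the gains are illustrative values.

```python
def alpha_beta_track(measurements, dt=1.0, alpha=0.5, beta=0.2):
    """One-dimensional alpha-beta filter: predict the next position from a
    constant-velocity model, then blend each new measurement back in."""
    x, v = measurements[0], 0.0
    predictions = []
    for z in measurements[1:]:
        # predict ahead (what the 'brain' expects before the 'retina' reports)
        x_pred = x + v * dt
        predictions.append(x_pred)
        # correct with the (late-arriving) measurement
        r = z - x_pred            # innovation
        x = x_pred + alpha * r    # position update
        v = v + (beta / dt) * r   # velocity update
    return x, v, predictions

# Track a target moving at a constant 2 units per step
meas = [2.0 * t for t in range(100)]
x, v, preds = alpha_beta_track(meas)
```

For a constant-velocity target the filter converges with no steady-state lag, so after a transient the estimated velocity settles near 2.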

  19. Automatic differential analysis of NMR experiments in complex samples. (United States)

    Margueritte, Laure; Markov, Petar; Chiron, Lionel; Starck, Jean-Philippe; Vonthron-Sénécheau, Catherine; Bourjot, Mélanie; Delsuc, Marc-André


    Liquid state nuclear magnetic resonance (NMR) is a powerful tool for the analysis of complex mixtures of unknown molecules. This capacity has been used in many analytical approaches: metabolomics, identification of active compounds in natural extracts, and characterization of species, and such studies require the acquisition of many diverse NMR measurements on series of samples. Although acquisition can easily be performed automatically, the number of NMR experiments involved in these studies increases very rapidly, and this data avalanche requires resorting to automatic processing and analysis. We present here a program that allows the autonomous, unsupervised processing of a large corpus of 1D, 2D, and diffusion-ordered spectroscopy experiments from a series of samples acquired in different conditions. The program provides all the signal processing steps, as well as peak-picking and bucketing of 1D and 2D spectra; the program and its components are fully available. In an experiment mimicking the search for a bioactive species in a natural extract, we use it for the automatic detection of small amounts of artemisinin added to a series of plant extracts and for the generation of the spectral fingerprint of this molecule. This program, called Plasmodesma, is a novel tool that should be useful to decipher complex mixtures, particularly in the discovery of biologically active natural products from plant extracts, but also in drug discovery or metabolomics studies. Copyright © 2017 John Wiley & Sons, Ltd.
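Bucketing, one of the processing steps mentioned, reduces a spectrum to fixed-width integrals before statistical comparison across samples. A minimal sketch follows; the 0.04 ppm bucket width is a conventional choice in NMR metabolomics, not necessarily Plasmodesma's default.

```python
def bucket_spectrum(intensities, ppm_axis, bucket_width=0.04):
    """Integrate a 1D spectrum into fixed-width ppm buckets (sum of the
    points falling in each bucket), a common dimensionality-reduction
    step before statistical comparison of spectra."""
    lo, hi = min(ppm_axis), max(ppm_axis)
    n_buckets = int((hi - lo) / bucket_width) + 1
    buckets = [0.0] * n_buckets
    for ppm, y in zip(ppm_axis, intensities):
        # clamp the last point into the final bucket
        idx = min(int((ppm - lo) / bucket_width), n_buckets - 1)
        buckets[idx] += y
    return buckets

# Toy spectrum: 10 points of unit intensity over 0.00-0.09 ppm
ppm = [i * 0.01 for i in range(10)]
buckets = bucket_spectrum([1.0] * 10, ppm)
```

The total integral is conserved while the number of variables shrinks, which is what makes bucketed spectra tractable for multivariate analysis.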

  20. A Review on Automatic Mammographic Density and Parenchymal Segmentation (United States)

    He, Wenda; Juette, Arne; Denton, Erika R. E.; Oliver, Arnau


    Breast cancer is the most frequently diagnosed cancer in women. However, the exact cause(s) of breast cancer still remains unknown. Early detection, precise identification of women at risk, and application of appropriate disease prevention measures are by far the most effective way to tackle breast cancer. There are more than 70 common genetic susceptibility factors included in the current non-image-based risk prediction models (e.g., the Gail and the Tyrer-Cuzick models). Image-based risk factors, such as mammographic densities and parenchymal patterns, have been established as biomarkers but have not been fully incorporated in the risk prediction models used for risk stratification in screening and/or measuring responsiveness to preventive approaches. Within computer aided mammography, automatic mammographic tissue segmentation methods have been developed for estimation of breast tissue composition to facilitate mammographic risk assessment. This paper presents a comprehensive review of automatic mammographic tissue segmentation methodologies developed over the past two decades and the evidence for risk assessment/density classification using segmentation. The aim of this review is to analyse how engineering advances have progressed and the impact automatic mammographic tissue segmentation has in a clinical environment, as well as to understand the current research gaps with respect to the incorporation of image-based risk factors in non-image-based risk prediction models. PMID:26171249

  1. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard


    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describe a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine including running ALGOL pro

  2. On automatic machine translation evaluation

    Directory of Open Access Journals (Sweden)

    Darinka Verdonik


    Full Text Available An important task in developing machine translation (MT) is evaluating system performance. Automatic measures are most commonly used for this task, as manual evaluation is time-consuming and costly. However, performing an objective evaluation is not a trivial task. Automatic measures, such as BLEU, TER, NIST, METEOR etc., have their own weaknesses, while manual evaluations are also problematic since they are always to some extent subjective. In this paper we test the influence of the test set on the results of automatic MT evaluation for the subtitling domain. Translating subtitles is a rather specific task for MT, since subtitles are a sort of summarization of spoken text rather than a direct translation of (written) text. An additional problem when translating a language pair that does not include English (in our example, Slovene-Serbian) is that the translations are commonly done from English to Serbian and from English to Slovenian, and not directly, since most of the TV production is originally filmed in English. All this poses additional challenges to MT and consequently to MT evaluation. Automatic evaluation is based on a reference translation, which is usually taken from an existing parallel corpus and marked as a test set. In our experiments, we compare the evaluation results for the same MT system output using three types of test set. In the first round, the test set consists of 4000 subtitles from the parallel corpus of subtitles SUMAT. These subtitles are not direct translations from Serbian to Slovene or vice versa, but are based on an English original. In the second round, the test set consists of 1000 subtitles randomly extracted from the first test set and translated anew, from Serbian to Slovenian, based solely on the Serbian written subtitles. In the third round, the test set consists of the same 1000 subtitles, however this time the Slovene translations were obtained by manually correcting the Slovene MT outputs so that they are correct translations of the
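For reference, an automatic metric such as BLEU scores a candidate against a reference translation by clipped n-gram precision. The toy version below (bigram order, single reference) only illustrates the idea; real BLEU uses up to 4-grams, multiple references, and corpus-level statistics.

```python
from collections import Counter
import math

def ngram_counts(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=2):
    """Toy BLEU: geometric mean of clipped 1..max_n-gram precisions,
    multiplied by a brevity penalty for short candidates."""
    cand, ref = candidate.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        c_counts, r_counts = ngram_counts(cand, n), ngram_counts(ref, n)
        # clip each candidate n-gram count at its reference count
        clipped = sum(min(c, r_counts[g]) for g, c in c_counts.items())
        total = max(sum(c_counts.values()), 1)
        log_prec += math.log(max(clipped, 1e-9) / total)
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(log_prec / max_n)
```

A candidate identical to the reference scores 1.0, and any n-gram mismatch pulls the score down, which is exactly why the choice of reference test set studied here matters so much.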

  3. Automatic Evaluation of Machine Translation

    DEFF Research Database (Denmark)

    Martinez, Mercedes Garcia; Koglin, Arlene; Mesa-Lao, Bartolomé


    of quality criteria in as few edits as possible. The quality of MT systems is generally measured by automatic metrics, producing scores that should correlate with human evaluation.In this study, we investigate correlations between one of such metrics, i.e. Translation Edit Rate (TER), and actual post...... of post-editing effort, namely i) temporal (time), ii) cognitive (mental processes) and iii) technical (keyboard activity). For the purposes of this research, TER scores were correlated with two different indicators of post-editing effort as computed in the CRITT Translation Process Database (TPR...

  4. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz


    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1 where J is the number of users helping that user.
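The claimed diversity gain is easy to sanity-check numerically. The helper below simply evaluates the formula quoted in the abstract, and setting J=0 (no helping users) recovers the standard HARQ gain of M, so the coordinated scheme is a strict generalization.

```python
def harq_diversity_gain(M, J):
    """Diversity gain quoted for the coordinated HARQ scheme: with a
    maximum of M (re)transmissions and J helping users the gain is
    (J+1)*(M-1)+1; with J=0 this reduces to the standard gain of M."""
    return (J + 1) * (M - 1) + 1
```

For example, with M = 4 retransmissions and J = 2 helpers the diversity gain rises from 4 to 10.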

  5. Motor automaticity in Parkinson’s disease (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu


    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor deficits in PD associated with impaired motor automaticity, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of using measures of automaticity in early diagnosis of PD would be valuable. PMID:26102020

  6. The RNA world, automatic sequences and oncogenetics

    International Nuclear Information System (INIS)

    Tahir Shah, K.


    We construct a model of the RNA world in terms of naturally evolving nucleotide sequences assuming only Crick-Watson base pairing and self-cleaving/splicing capability. These sequences have the following properties. 1) They are recognizable by an automaton (or automata). That is, for each k-sequence, there exists a k-automaton which accepts, recognizes or generates the k-sequence. These are known as automatic sequences. Fibonacci and Morse-Thue sequences are the most natural outcome of pre-biotic chemical conditions. 2) Infinite (resp. large) sequences are self-similar (resp. nearly self-similar) under certain rewrite rules and consequently give rise to fractal (resp. fractal-like) structures. Computationally, such sequences can also be generated by their corresponding deterministic parallel rewrite system, known as a DOL system. The self-similar sequences are fixed points of their respective rewrite rules. Some of these automatic sequences have the capability that they can read or 'accept' other sequences while others can detect errors and trigger error-correcting mechanisms. They can be enlarged and have block and/or palindrome structure. Linear recurring sequences such as the Fibonacci sequence are simply feedback shift registers, a well-known model of information processing machines. We show that a mutation of any rewrite rule can cause a combinatorial explosion of errors, and relate this to oncogenetic behavior. On the other hand, a mutation of sequences that are not rewrite rules leads to normal evolutionary change. Known experimental results support our hypothesis. (author). Refs
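The rewrite systems named here are easy to reproduce. The sketch below iterates a DOL system (all symbols rewritten in parallel at each step), generating the Morse-Thue and Fibonacci sequences from their standard rules; both are fixed points of their rules, which is the self-similarity the paper invokes.

```python
def d0l_iterate(axiom, rules, steps):
    """Iterate a DOL (deterministic, context-free Lindenmayer) rewrite
    system: every symbol is rewritten in parallel at each step."""
    word = axiom
    for _ in range(steps):
        word = "".join(rules[c] for c in word)
    return word

# Morse-Thue sequence: 0 -> 01, 1 -> 10
thue_morse = d0l_iterate("0", {"0": "01", "1": "10"}, 4)

# Fibonacci word: a -> ab, b -> a  (lengths follow the Fibonacci numbers)
fib_word = d0l_iterate("a", {"a": "ab", "b": "a"}, 5)
```

Each iterate is a prefix of the next, so the infinite sequence is the fixed point of the rule; corrupting a single rule entry instead changes every subsequent symbol, the "combinatorial explosion" the paper links to oncogenesis.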

  7. An automatically tuning intrusion detection system. (United States)

    Yu, Zhenwei; Tsai, Jeffrey J P; Weigert, Thomas


    An intrusion detection system (IDS) is a security layer used to detect ongoing intrusive activities in information systems. Traditionally, intrusion detection relies on extensive knowledge of security experts, in particular, on their familiarity with the computer system to be protected. To reduce this dependence, various data-mining and machine learning techniques have been deployed for intrusion detection. An IDS usually works in a dynamically changing environment, which forces continuous tuning of the intrusion detection model in order to maintain sufficient performance. The manual tuning process required by current systems depends on the system operators working out the tuning solution and integrating it into the detection model. In this paper, an automatically tuning IDS (ATIDS) is presented. The proposed system will automatically tune the detection model on-the-fly according to the feedback provided by the system operator when false predictions are encountered. The system is evaluated using the KDDCup'99 intrusion detection dataset. Experimental results show that the system achieves up to 35% improvement in terms of misclassification cost when compared with a system lacking the tuning feature. If only 10% of false predictions are used to tune the model, the system still achieves about 30% improvement. Moreover, when tuning is not delayed too long, the system can achieve about 20% improvement, with only 1.3% of the false predictions used to tune the model. The results of the experiments show that a practical system can be built based on ATIDS: system operators can focus on verification of predictions with low confidence, as only those predictions determined to be false will be used to tune the detection model.

  8. Automatic Registration and Mosaicking System for Remotely Sensed Imagery

    Directory of Open Access Journals (Sweden)

    Emiliano Castejon


    Full Text Available Image registration is an important operation in many remote sensing applications and involves, among other tasks, the identification of corresponding control points in the images. As manual identification of control points may be time-consuming and tiring, several automatic techniques have been developed. This paper describes a system for automatic registration and mosaicking of remote sensing images under development at the National Institute for Space Research (INPE) and at the University of California, Santa Barbara (UCSB). The user can provide information to the system in order to speed up the registration process as well as to avoid mismatched control points. Based on a statistical procedure, the system gives an indication of the registration quality. This allows users to stop the processing, to modify the registration parameters or to continue the processing. Extensive system tests have been performed with different types of data (optical, radar, multi-sensor, high-resolution images and video sequences) in order to check the system performance. An online demo system, containing several examples that can be carried out in a web browser, is available on the internet.

  9. Identification of long-term carbon sequestration in soils with historical inputs of biochar using novel stable isotope and spectroscopic techniques (United States)

    Hernandez-Soriano, Maria C.; Kerré, Bart; Hardy, Brieuc; Dufey, Joseph; Smolders, Erik


    Biochar is the collective term for organic matter (OM) that has been produced by pyrolysis of biomass, e.g. during production of charcoal or during natural processes such as bush fires. Biochar production and application is now suggested as one of the economically feasible options for global C-sequestration strategies. The C-sequestration in soil through application of biochar is not only related to its persistence (estimated lifetime exceeds 1000 years in soil), but also to indirect effects such as its potential to adsorb and increase OM stability in soil. Historical charcoal production sites that were in use >200 years ago in beech/oak forests have been localized in the south of Belgium. Aerial photography identified black spots in arable land on former forest sites. Soil sampling was conducted in an arable field used for maize production near Mettet (Belgium), where charcoal production was intensive until the late 18th century. Soils were sampled in a horizontal gradient across the 'black soils' that extend over a few decametres, collecting soil from the spots (Biochar Amended, BA) as well as from non-biochar-amended (NBA) soil. Stable C isotope composition was used to estimate the long-term C-sequestration derived from crops in these soils, where maize had been produced for about 15 years. Because the C in the biochar originates from forest wood (C3 plants), its isotopic signature (δ13C) differs from that of maize (a C4 plant). The C and N content and the δ13C were determined for bulk soil samples and for microaggregate size fractions separated by wet sieving. Fourier Transform Infrared Spectroscopy (FTIR) coupled to optical microscopy was used to obtain fingerprints of biochar and OM composition for soil microaggregates. The total C content in the BA soil (5.5%) and the C/N ratio (16.9) were higher than for NBA (C content 2.7%; C/N ratio 12.6), which confirms the persistence of OM in the BA. The average isotopic signature of bulk soil from BA (-26.08) was slightly
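The maize-derived (C4) carbon fraction follows from a standard two-end-member δ13C mixing model. The end-member values below are typical literature figures (C3 wood near -27‰, maize near -12.5‰), not the study's measured signatures.

```python
def maize_carbon_fraction(delta_sample, delta_c3=-27.0, delta_c4=-12.5):
    """Two-end-member isotope mixing: fraction of soil carbon derived
    from the C4 crop (maize), given the δ13C of the sample and of the
    two end-members.  End-member values here are illustrative."""
    return (delta_sample - delta_c3) / (delta_c4 - delta_c3)
```

A sample matching the C3 end-member yields a fraction of 0, one matching the C4 end-member yields 1, and intermediate signatures interpolate linearly, which is how maize-derived C accumulation is quantified in such soils.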

  10. A comparison of accurate automatic hippocampal segmentation methods. (United States)

    Zandifar, Azar; Fonov, Vladimir; Coupé, Pierrick; Pruessner, Jens; Collins, D Louis


    The hippocampus is one of the first brain structures affected by Alzheimer's disease (AD). While many automatic methods for hippocampal segmentation exist, few studies have compared them on the same data. In this study, we compare four fully automated hippocampal segmentation methods in terms of their conformity with manual segmentation and their ability to be used as an AD biomarker in clinical settings. We also apply error correction to the four automatic segmentation methods, and complete a comprehensive validation to investigate differences between the methods. The effect size and classification performance is measured for AD versus normal control (NC) groups and for stable mild cognitive impairment (sMCI) versus progressive mild cognitive impairment (pMCI) groups. Our study shows that the nonlinear patch-based segmentation method with error correction is the most accurate automatic segmentation method and yields the most conformity with manual segmentation (κ=0.894). The largest effect size between AD versus NC and sMCI versus pMCI is produced by FreeSurfer with error correction. We further show that, using only hippocampal volume, age, and sex as features, the area under the receiver operating characteristic curve reaches up to 0.8813 for AD versus NC and 0.6451 for sMCI versus pMCI. However, the automatic segmentation methods are not significantly different in their performance. Copyright © 2017. Published by Elsevier Inc.
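Conformity with manual segmentation, reported above as κ, can be computed for binary masks as follows. This is a generic sketch showing both the Dice overlap and Cohen's kappa on flattened 0/1 masks of equal length (both masks must be non-empty for Dice); it is not the paper's evaluation code.

```python
def dice(mask_a, mask_b):
    """Dice overlap between two non-empty binary masks (sequences of 0/1)."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    return 2.0 * inter / (sum(mask_a) + sum(mask_b))

def cohen_kappa(mask_a, mask_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(mask_a)
    agree = sum(a == b for a, b in zip(mask_a, mask_b)) / n
    p_a1, p_b1 = sum(mask_a) / n, sum(mask_b) / n
    chance = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)
    return (agree - chance) / (1 - chance)

manual    = [1, 1, 0, 0]
automatic = [1, 0, 0, 0]
```

Identical masks score 1.0 on both measures; kappa penalizes agreement that could arise by chance, which is why it is preferred when the structure occupies a small fraction of the image.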

  11. Identification of Suitable Indices for Identification of Potential Sites ...

    African Journals Online (AJOL)

    parameters for identification ... importance weights of the different factors need to .... A buffer map showing different suitability areas in terms of distance from drainage patterns/streams, was then extracted from the drainage pattern using Arc View.

  12. Identification and Characterization of the V(D)J Recombination Activating Gene 1 in Long-Term Memory of Context Fear Conditioning

    Directory of Open Access Journals (Sweden)

    Edgardo Castro-Pérez


    Full Text Available An increasing body of evidence suggests that mechanisms related to the introduction and repair of DNA double strand breaks (DSBs) may be associated with long-term memory (LTM) processes. Previous studies from our group suggested that factors known to function in DNA recombination/repair machineries, such as DNA ligases, polymerases, and DNA endonucleases, play a role in LTM. Here we report data using C57BL/6 mice showing that the V(D)J recombination-activating gene 1 (RAG1), which encodes a factor that introduces DSBs in immunoglobulin and T-cell receptor genes, is induced in the amygdala, but not in the hippocampus, after context fear conditioning. Amygdalar induction of RAG1 mRNA, measured by real-time PCR, was not observed in context-only or shock-only controls, suggesting that the context fear conditioning response is related to associative learning processes. Furthermore, double immunofluorescence studies demonstrated the neuronal localization of RAG1 protein in amygdalar sections prepared after perfusion and fixation. In functional studies, intra-amygdalar injections of RAG1 gapmer antisense oligonucleotides, given 1 h prior to conditioning, resulted in amygdalar knockdown of RAG1 mRNA and a significant impairment in LTM, tested 24 h after training. Overall, these findings suggest that the V(D)J recombination-activating gene 1, RAG1, may play a role in LTM consolidation.

  13. Identification and Characterization of the V(D)J Recombination Activating Gene 1 in Long-Term Memory of Context Fear Conditioning. (United States)

    Castro-Pérez, Edgardo; Soto-Soto, Emilio; Pérez-Carambot, Marizabeth; Dionisio-Santos, Dawling; Saied-Santiago, Kristian; Ortiz-Zuazaga, Humberto G; Peña de Ortiz, Sandra


    An increasing body of evidence suggests that mechanisms related to the introduction and repair of DNA double strand breaks (DSBs) may be associated with long-term memory (LTM) processes. Previous studies from our group suggested that factors known to function in DNA recombination/repair machineries, such as DNA ligases, polymerases, and DNA endonucleases, play a role in LTM. Here we report data using C57BL/6 mice showing that the V(D)J recombination-activating gene 1 (RAG1), which encodes a factor that introduces DSBs in immunoglobulin and T-cell receptor genes, is induced in the amygdala, but not in the hippocampus, after context fear conditioning. Amygdalar induction of RAG1 mRNA, measured by real-time PCR, was not observed in context-only or shock-only controls, suggesting that the context fear conditioning response is related to associative learning processes. Furthermore, double immunofluorescence studies demonstrated the neuronal localization of RAG1 protein in amygdalar sections prepared after perfusion and fixation. In functional studies, intra-amygdalar injections of RAG1 gapmer antisense oligonucleotides, given 1 h prior to conditioning, resulted in amygdalar knockdown of RAG1 mRNA and a significant impairment in LTM, tested 24 h after training. Overall, these findings suggest that the V(D)J recombination-activating gene 1, RAG1, may play a role in LTM consolidation.

  14. Metabolite Profiling of Diverse Rice Germplasm and Identification of Conserved Metabolic Markers of Rice Roots in Response to Long-Term Mild Salinity Stress. (United States)

    Nam, Myung Hee; Bang, Eunjung; Kwon, Taek Yun; Kim, Yuran; Kim, Eun Hee; Cho, Kyungwon; Park, Woong June; Kim, Beom-Gi; Yoon, In Sun


    The sensitivity of rice to salt stress greatly depends on growth stages, organ types and cultivars. In particular, the roots of young rice seedlings are highly salt-sensitive organs that limit plant growth, even under mild soil salinity conditions. In an attempt to identify metabolic markers of rice roots responding to salt stress, metabolite profiling was performed by ¹H-NMR spectroscopy in 38 rice genotypes that varied in biomass accumulation under long-term mild salinity conditions. Multivariate statistical analysis showed separation of the control and salt-treated rice roots and of rice genotypes with differential growth potential. By quantitative analyses of ¹H-NMR data, five conserved salt-responsive metabolic markers of rice roots were identified. Sucrose, allantoin and glutamate accumulated under salt stress, whereas the levels of glutamine and alanine decreased. A positive correlation of metabolite changes with growth potential and salt tolerance of rice genotypes was observed for allantoin and glutamine. Adjustment of nitrogen metabolism in rice roots is likely to be closely related to maintaining the growth potential and increasing the stress tolerance of rice.
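The multivariate separation step (control vs. salt-treated profiles) is typically visualized with PCA scores. Below is a minimal SVD-based sketch on synthetic "metabolite" data, where a shift in two columns stands in for the accumulated sucrose and allantoin; the data and dimensions are invented for illustration.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples onto the top principal components via SVD of the
    mean-centred data matrix, as used to visualise group separation."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Synthetic metabolite table: 10 control and 10 salt-treated samples,
# 5 metabolites, with the treated group shifted in the first two columns
rng = np.random.default_rng(0)
control = rng.normal(0.0, 0.1, size=(10, 5))
treated = rng.normal(0.0, 0.1, size=(10, 5))
treated[:, :2] += 2.0   # stand-in for accumulated sucrose and allantoin
scores = pca_scores(np.vstack([control, treated]))
```

The first principal component aligns with the treatment shift, so control and treated samples separate cleanly along PC1, mirroring the separation reported for the real root extracts.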

  15. Metabolite Profiling of Diverse Rice Germplasm and Identification of Conserved Metabolic Markers of Rice Roots in Response to Long-Term Mild Salinity Stress

    Directory of Open Access Journals (Sweden)

    Myung Hee Nam


    The sensitivity of rice to salt stress depends greatly on growth stage, organ type and cultivar. In particular, the roots of young rice seedlings are highly salt-sensitive organs that limit plant growth even under mild soil salinity. In an attempt to identify metabolic markers of rice roots responding to salt stress, metabolite profiling was performed by 1H-NMR spectroscopy in 38 rice genotypes that varied in biomass accumulation under long-term mild salinity conditions. Multivariate statistical analysis separated control from salt-treated rice roots and distinguished rice genotypes with differential growth potential. Quantitative analysis of the 1H-NMR data identified five conserved salt-responsive metabolic markers of rice roots: sucrose, allantoin and glutamate accumulated under salt stress, whereas the levels of glutamine and alanine decreased. For allantoin and glutamine, metabolite changes correlated positively with the growth potential and salt tolerance of the genotypes. Adjustment of nitrogen metabolism in rice roots is thus likely closely related to maintaining growth potential and increasing stress tolerance.

  16. Identification and understanding the factors affecting the public and political acceptance of long term storage of spent fuel and high-level radioactive wastes

    International Nuclear Information System (INIS)

    Gorea, Valica


    At the end of 2004, according to information available to the IAEA, 440 nuclear reactors were operating worldwide with a total net capacity of 366.3 GW(e); 6 of them were connected to the grid in 2004 (2 in Ukraine, one each in China, Japan and the Russian Federation, and a reconnection in Canada), compared with 2 connections and 2 reconnections in 2003. Also at the end of 2004, 26 nuclear power plants were under construction, with a total net capacity of 20.8 GW(e). The generally accepted conclusion is that nuclear power is still advancing and represents one of the options for medium- and long-term energy security. If we consider nuclear fusion, which is expected to produce commercial electric power in 30-40 years in practically unlimited quantities, this point becomes even more evident. Unfortunately, besides its beneficial characteristics (clean, stable in both output and price), nuclear power also has a negative one that generally instils fear in the public: radioactive waste. A classification of radioactive waste is not the aim of this presentation; I only want to point out that a nuclear power plant produces spent fuel over its lifetime, which is long-lived, highly radioactive and heat-generating. Other high-level waste (HLW) has similar characteristics, which is why these two categories of waste are treated together. In an open fuel cycle, which is also the case for the Cernavoda NPP, spent fuel and high-level waste are placed in interim storage for cooling for around 50 years and then transferred to a final repository, where they will be kept for hundreds of years. Considering that Cernavoda Unit 1 reaches 10 years of commercial operation in December 2006, the issue of final disposal is not as urgent as it may appear. The objectives of long-term management of radioactive waste are public health and protection of the environment.

  17. An automatic holographic adaptive phoropter (United States)

    Amirsolaimani, Babak; Peyghambarian, N.; Schwiegerling, Jim; Bablumyan, Arkady; Savidis, Nickolaos; Peyman, Gholam


    Phoropters are the most common instrument used to detect refractive errors. During a refractive exam, lenses are flipped in front of the patient, who looks at the eye chart and tries to read the symbols. The procedure depends entirely on the patient's cooperation in reading the chart, provides only a subjective measurement of visual acuity, and can at best give a rough estimate of the patient's vision. Phoropters are difficult to use for mass screenings because they require a skilled examiner, and young children and the elderly are hard to screen. We have developed a simplified, lightweight automatic phoropter that can measure the optical error of the eye objectively, without requiring the patient's input. The automatic holographic adaptive phoropter is based on a Shack-Hartmann wavefront sensor and three computer-controlled fluidic lenses. The fluidic lens system is designed to provide power and astigmatic corrections over a large range, without the need for verbal feedback from the patient, in less than 20 seconds.

  18. Automatic welding machine for piping

    International Nuclear Information System (INIS)

    Yoshida, Kazuhiro; Koyama, Takaichi; Iizuka, Tomio; Ito, Yoshitoshi; Takami, Katsumi.


    A remotely controlled automatic welding machine for piping was developed. It can be used effectively for long-distance pipelines, chemical plants, thermal power plants and nuclear power plants, offering good quality control, labour savings and good controllability. Before welding, the machine inspects the shape and dimensions of the edge preparation by touch; during welding, it detects the temperature of the melt pool, inspects the bead form by touch and checks the welding state by ITV; and after welding, it automatically grinds the bead surface and inspects the weld metal by ultrasonic testing. The construction of the welding system, the main specifications of the apparatus, the welding procedure in detail, the electrical power source, the cooling system, the structure and handling of the guide ring, the central control system and the operating characteristics are explained. The working procedure, the benefits of using this welding machine, and its application to nuclear power plants and other industrial fields are outlined. A HIDIC 08 is used as the controlling computer. The machine is suitable for welding stainless steel (SUS) piping as well as carbon steel piping. (Nakai, Y.)

  19. SRV-automatic handling device

    International Nuclear Information System (INIS)

    Yamada, Koji


    An automatic handling device for steam relief valves (SRVs) was developed to decrease worker exposure, increase the availability factor, improve reliability and operational safety, and save labour. A survey of actual SRV handling operations was made during a periodical inspection. The SRV automatic handling device consists of four components: a conveyor, an armed conveyor, a lifting machine, and a control/monitoring system. The conveyor is designed so that the existing I-rail installed in the containment vessel can be used without modification; it conveys an SRV along the rail. The armed conveyor, designed for a box rail, is used for SRVs installed away from the rail. The lifting machine brings an SRV installed away from the I-rail to a spot just below the rail so that the SRV can be transferred by the conveyor. The control/monitoring system consists of a control computer, an operation panel, a TV monitor and an annunciator, and the device is operated by remote control from a control room. A trial unit was constructed and performance/function testing was carried out using actual SRVs. The results show that the device can be operated satisfactorily by only two operators, and that removal and replacement of one SRV takes about 10 minutes. (Nogami, K.)

  20. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael


    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization. © 2014 The Author(s) Computer Graphics Forum © 2014 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.

  1. Cross-language identification of long-term average speech spectra in Korean and English: toward a better understanding of the quantitative difference between two languages. (United States)

    Noh, Heil; Lee, Dong-Hee


    To identify the quantitative differences between Korean and English in the long-term average speech spectrum (LTASS), 20 Korean speakers, who lived in the capital of Korea and spoke standard Korean as their first language, were compared with 20 native English speakers. For the Korean speakers, a passage from a novel and a passage from a leading newspaper article were chosen; for the English speakers, the Rainbow Passage was used. Speech was digitally recorded using a GenRad 1982 Precision Sound Level Meter and GoldWave® software and analysed in MATLAB. There was no significant difference in the LTASS between Korean subjects reading the news article and those reading the novel. For male subjects, the LTASS of Korean speakers was significantly lower than that of English speakers above 1.6 kHz except at 4 kHz, and the difference exceeded 5 dB, especially at higher frequencies. For women, the LTASS of Korean speakers showed significantly lower levels at 0.2, 0.5, 1, 1.25, 2, 2.5, 6.3, 8, and 10 kHz, but the differences were less than 5 dB. Overall, compared with English speakers, the LTASS of Korean speakers showed significantly lower levels at frequencies above 2 kHz except at 4 kHz; the difference was less than 5 dB between 2 and 5 kHz but more than 5 dB above 6 kHz. These results suggest that, to adjust hearing-aid fitting formulas for Koreans, the gain in the high-frequency regions needs to be raised.
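
    The LTASS measured in the study is essentially a Welch-style average of windowed FFT power spectra over the whole recording. The sketch below illustrates that computation; the window length, overlap, and the synthetic test tone are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def ltass_db(signal, fs, win=1024):
    """Long-term average speech spectrum: mean power spectrum over
    overlapping Hann-windowed frames, returned in dB."""
    hop = win // 2
    window = np.hanning(win)
    frames = [signal[i:i + win] * window
              for i in range(0, len(signal) - win + 1, hop)]
    # Average the per-frame power spectra over the full recording
    power = np.mean([np.abs(np.fft.rfft(f)) ** 2 for f in frames], axis=0)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    return freqs, 10.0 * np.log10(power + 1e-12)

fs = 16000
t = np.arange(0, 1.0, 1.0 / fs)
tone = np.sin(2 * np.pi * 1000 * t)        # stand-in for a speech recording
freqs, spec = ltass_db(tone, fs)
peak = freqs[np.argmax(spec)]
print(f"spectral peak at {peak:.0f} Hz")
```

    Comparing two such averaged spectra band-by-band (as the authors do for Korean vs. English speakers) then reduces to subtracting the dB curves at the frequencies of interest.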

  2. Automatic detection and visualisation of MEG ripple oscillations in epilepsy

    Directory of Open Access Journals (Sweden)

    Nicole van Klink


    High frequency oscillations (HFOs, 80–500 Hz) in invasive EEG are a biomarker for the epileptic focus. Ripples (80–250 Hz) have also been identified in non-invasive MEG, yet detection is impeded by noise, their low occurrence rates, and the workload of visual analysis. We propose a method that identifies ripples in MEG through noise reduction, beamforming and automatic detection with minimal user effort. We analysed 15 min of presurgical resting-state interictal MEG data of 25 patients with epilepsy. The MEG signal-to-noise ratio was improved by using a cross-validation signal space separation method, and by calculating ~2400 beamformer-based virtual sensors in the grey matter. Ripples in these sensors were automatically detected by an algorithm optimized for MEG. A small subset of the identified ripples was visually checked. Ripple locations were compared with MEG spike dipole locations and the resection area if available. Running the automatic detection algorithm resulted in on average 905 ripples per patient, of which on average 148 ripples were visually reviewed. Reviewing took approximately 5 min per patient, and identified ripples in 16 out of 25 patients. In 14 patients the ripple locations showed good or moderate concordance with the MEG spikes. For six out of eight patients who had surgery, the ripple locations showed concordance with the resection area: 4/5 with good outcome and 2/3 with poor outcome. Automatic ripple detection in beamformer-based virtual sensors is a feasible non-invasive tool for the identification of ripples in MEG. Our method requires minimal user effort and is easily applicable in a clinical setting.
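
    The core of any automatic ripple detector is band-limiting the signal to the ripple band and thresholding its energy envelope. The following is a deliberately simplified stand-in for the paper's MEG-optimized algorithm (the FFT band-pass, window length and threshold factor are illustrative assumptions):

```python
import numpy as np

def detect_ripples(signal, fs, band=(80.0, 250.0), win_s=0.02, k=5.0):
    """Flag samples whose ripple-band RMS exceeds k times the median RMS.

    Toy detector: FFT band-pass into the ripple band, short-window RMS
    envelope, data-driven median threshold."""
    # Band-pass by zeroing FFT bins outside the ripple band
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0.0
    ripple = np.fft.irfft(spec, n=len(signal))

    # Short-window RMS envelope of the band-limited signal
    win = max(1, int(win_s * fs))
    rms = np.sqrt(np.convolve(ripple ** 2, np.ones(win) / win, mode="same"))

    # Threshold relative to the record's median envelope
    return rms > k * np.median(rms)

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
background = 0.1 * np.sin(2 * np.pi * 10 * t)      # slow background rhythm
burst = np.where((t > 1.0) & (t < 1.1),            # 100 ms ripple burst
                 np.sin(2 * np.pi * 120 * t), 0.0)
mask = detect_ripples(background + burst, fs)
print(f"detected fraction: {mask.mean():.3f}")
```

    A clinical detector would add artifact rejection and per-channel statistics, but the band-pass/envelope/threshold structure is the same.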

  3. Automaticity in reading isiZulu


    Sandra Land


    Automaticity, or instant recognition of combinations of letters as units of language, is essential for proficient reading in any language. The article explores automaticity amongst competent adult first-language readers of isiZulu, and the factors associated with it or its opposite - active decoding. Whilst the transparent spelling patterns of isiZulu aid learner readers, some of its orthographical features may militate against their gaining automaticity. These features are agglutination; a c...

  4. Toward an Automatic Calibration of Dual Fluoroscopy Imaging Systems (United States)

    Al-Durgham, Kaleel; Lichti, Derek; Kuntze, Gregor; Sharma, Gulshan; Ronsky, Janet


    High-speed dual fluoroscopy (DF) imaging provides a novel, in-vivo solution to quantify the six-degree-of-freedom skeletal kinematics of humans and animals with sub-millimetre accuracy and high temporal resolution. A rigorous geometric calibration of DF system parameters is essential to ensure precise bony rotation and translation measurements. One way to achieve the system calibration is by performing a bundle adjustment with self-calibration, and a bundle adjustment-based system calibration was recently achieved for the first time. The system calibration through the bundle adjustment has been shown to be robust, precise, and straightforward. Nevertheless, due to the inherent absence of colour/semantic information in DF images, a significant amount of user input is needed to prepare the image observations for the bundle adjustment. This paper introduces a semi-automated methodology to minimise the amount of user input required to process calibration images and thereby facilitate the calibration task. The methodology is optimized for processing images acquired over a custom-made calibration frame with radio-opaque spherical targets. Canny edge detection is used to find distinct structural components of the calibration images. Edge-linking is applied to cluster the edge pixels into unique groups. Principal components analysis is utilized to automatically detect the calibration targets from the groups and to filter out possible outliers. Ellipse fitting is utilized to obtain the spatial measurements as well as to perform quality analysis over the detected targets. Single photo resection is used together with a template matching procedure to establish the image-to-object point correspondence and to simplify target identification. The proposed methodology identified 56,254 targets from 411 images, which were used to run a second bundle adjustment-based DF system calibration. Compared to a previous fully manual procedure, the proposed methodology has
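
    The cluster-then-filter step described above (group candidate pixels, then use principal components analysis to reject non-circular outliers) can be sketched as follows. This is a toy version under assumed parameters, not the authors' pipeline: it floods binary blobs, compares the PCA eigenvalues of each blob's pixel coordinates to estimate its aspect ratio, and keeps only near-circular blobs as target candidates.

```python
import numpy as np

def find_circular_targets(mask, min_size=20, max_aspect=1.5):
    """Label connected bright regions; keep near-circular ones via PCA.

    The square roots of the coordinate-covariance eigenvalues are the
    principal axis lengths, so their ratio rejects elongated outliers
    (e.g. frame edges). Returns the centroids of accepted blobs."""
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    centroids = []
    for i in range(h):
        for j in range(w):
            if mask[i, j] and not seen[i, j]:
                # Flood-fill one connected component (4-connectivity)
                stack, pts = [(i, j)], []
                seen[i, j] = True
                while stack:
                    y, x = stack.pop()
                    pts.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                pts = np.array(pts, dtype=float)
                if len(pts) < min_size:
                    continue
                ev = np.sort(np.linalg.eigvalsh(np.cov(pts.T)))
                if ev[0] > 0 and np.sqrt(ev[1] / ev[0]) <= max_aspect:
                    centroids.append(pts.mean(axis=0))
    return centroids

# Synthetic frame: one disc-shaped target and one elongated streak
img = np.zeros((60, 60), dtype=bool)
yy, xx = np.mgrid[:60, :60]
img[(yy - 20) ** 2 + (xx - 20) ** 2 <= 36] = True   # disc, radius 6: kept
img[40:43, 5:55] = True                              # streak: rejected
targets = find_circular_targets(img)
print(len(targets), targets[0])
```

    In the real methodology the accepted blobs would then be passed to ellipse fitting for sub-pixel centre estimates.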

  5. Automatic adventitious respiratory sound analysis: A systematic review.

    Directory of Open Access Journals (Sweden)

    Renard Xaviero Adhi Pramono

    Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison has not been well established. This systematic review of existing algorithms for the detection or classification of adventitious respiratory sounds provides a complete summary of methods used in the literature to give a baseline for future works. English articles published between 1938 and 2016 were searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases; additional articles were obtained from references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Only articles that focused on adventitious sound detection or classification based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated, were included. Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. A total of 77 reports from the literature were included in this review. 55 (71.43%) of the studies focused on wheeze, 40 (51.95%) on crackle, 9 (11.69%) on stridor, 9

  6. Long-term chemical analysis and organic aerosol source apportionment at nine sites in central Europe: source identification and uncertainty assessment

    Directory of Open Access Journals (Sweden)

    K. R. Daellenbach


    Long-term monitoring of organic aerosol is important for epidemiological studies, validation of atmospheric models, and air quality management. In this study, we apply a recently developed filter-based offline methodology using an aerosol mass spectrometer (AMS) to investigate the regional and seasonal differences of contributing organic aerosol sources. We present offline AMS measurements for particulate matter smaller than 10 µm at nine stations in central Europe with different exposure characteristics for the entire year of 2013 (819 samples). The focus of this study is a detailed source apportionment analysis (using positive matrix factorization, PMF) including in-depth assessment of the related uncertainties. Primary organic aerosol (POA) is separated into three components: hydrocarbon-like OA related to traffic emissions (HOA), cooking OA (COA), and biomass burning OA (BBOA). We observe enhanced production of secondary organic aerosol (SOA) in summer, following the increase in biogenic emissions with temperature (summer oxygenated OA, SOOA). In addition, a SOA component was extracted that correlated with an anthropogenic secondary inorganic species that is dominant in winter (winter oxygenated OA, WOOA). A factor (sulfur-containing organic, SC-OA) explaining sulfur-containing fragments (CH3SO2+), which has an event-driven temporal behaviour, was also identified. The relative yearly average factor contributions range from 4 to 14 % for HOA, from 3 to 11 % for COA, from 11 to 59 % for BBOA, from 5 to 23 % for SC-OA, from 14 to 27 % for WOOA, and from 15 to 38 % for SOOA. The uncertainty of the relative average factor contribution lies between 2 and 12 % of OA. At the sites north of the alpine crest, the sum of HOA, COA, and BBOA (POA) contributes less to OA (POA / OA = 0.3) than at the southern alpine valley sites (0.6). BBOA is the main contributor to POA, with 87 % in alpine valleys and 42 % north of the alpine crest
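
    Positive matrix factorization decomposes the measured data matrix X (samples × mass fragments) into nonnegative source contributions G and source profiles F, so that X ≈ G·F. True PMF additionally weights the residuals by per-point measurement uncertainties; the sketch below shows only the unweighted core of that factorization, using the standard Lee-Seung multiplicative updates on synthetic data:

```python
import numpy as np

def nmf(X, k, iters=500, seed=0):
    """Unweighted nonnegative matrix factorization X ~ G @ F (G, F >= 0).

    G holds source contributions per sample, F holds source profiles.
    PMF would weight the least-squares objective by uncertainties."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, k)) + 0.1
    F = rng.random((k, m)) + 0.1
    eps = 1e-12
    for _ in range(iters):
        # Lee-Seung multiplicative updates minimise ||X - G F||_F^2
        F *= (G.T @ X) / (G.T @ G @ F + eps)
        G *= (X @ F.T) / (G @ F @ F.T + eps)
    return G, F

# Two synthetic "source profiles" mixed with nonnegative contributions
rng = np.random.default_rng(1)
F_true = np.array([[1.0, 0.0, 0.5, 0.2],
                   [0.0, 1.0, 0.3, 0.6]])
G_true = rng.random((50, 2))
X = G_true @ F_true
G, F = nmf(X, k=2)
err = np.linalg.norm(X - G @ F) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.4f}")
```

    In source apportionment practice the number of factors k and the rotational ambiguity of the solution are the delicate choices, which is why the paper devotes so much attention to uncertainty assessment.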

  7. A Fully Automatic Fresh Apple Juicer: Peeling, Coring, Slicing and Juicing

    Directory of Open Access Journals (Sweden)

    Hu Fuwen


    With fresh apple juice as an example, a fully automatic and intelligent juicer prototype was built through the integrated application of servo positioning modules, a human-machine interface, an image vision sensor system and 3D printing. All steps, including peeling, coring, slicing and juicing, are performed automatically. The challenging technical problems of identifying and orienting the apple core and of adaptive peeling were solved creatively. Trial operation showed that fresh apple juice can be produced without manual intervention and that the system has potential applications in crowded sites such as malls, schools, restaurants and hospitals.

  8. A bar-code reader for an alpha-beta automatic counting system - FAG

    International Nuclear Information System (INIS)

    Levinson, S.; Shemesh, Y.; Ankry, N.; Assido, H.; German, U.; Peled, O.


    A bar-code laser system for sample number reading was integrated into the FAG Alpha-Beta automatic counting system. Sample identification by means of an attached bar-code label enables unmistakable and reliable attribution of results to the counted sample. Installation of the bar-code reader required several modifications: mechanical changes in the automatic sample changer, design and production of new sample holders, modification of the sample planchettes, changes in the electronic system, and an update of the system's operating software. (authors)

  9. Automatic Thermal Infrared Panoramic Imaging Sensor

    National Research Council Canada - National Science Library

    Gutin, Mikhail; Tsui, Eddy K; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey


    .... Automatic detection, location, and tracking of targets outside protected area ensures maximum protection and at the same time reduces the workload on personnel, increases reliability and confidence...

  10. Prototype Design and Application of a Semi-circular Automatic Parking System


    Atacak, Ismail; Erdogdu, Ertugrul


    Nowadays, with the increasing population of urban areas, the number of vehicles in traffic has also increased. This has brought major problems, caused by insufficient parking areas, in terms of traffic congestion, drivers and the environment. In this study, to overcome these problems, a multi-storey automatic parking system that automatically performs vehicle recognition, parking, delivery and pricing has been designed and the...

  11. Semi-Automatic Removal of Foreground Stars from Images of Galaxies (United States)

    Frei, Zsolt


    A new procedure, designed to remove foreground stars from galaxy profiles, is presented here. Although several programs exist for stellar and faint-object photometry, none of them treats star removal from images very carefully. I present my attempt to develop such a system, and briefly compare the performance of my software to one of the well-known stellar photometry packages, DAOPhot (Stetson 1987). The major steps in my procedure are: (1) automatic construction of an empirical 2D point spread function (PSF) from well-separated stars situated off the galaxy; (2) automatic identification of the peaks that are likely to be foreground stars, scaling the PSF and removing these stars, and patching residuals (in the automatically determined smallest possible area where residuals are truly significant); and (3) cosmetic fixing of remaining degradations in the image. The algorithm and software presented here are significantly better for automatic removal of foreground stars from images of galaxies than DAOPhot or similar packages, since: (a) the most suitable stars are selected automatically from the image for the PSF fit; (b) after star removal an intelligent and automatic procedure removes any possible residuals; (c) an unlimited number of images can be cleaned in one run without any user interaction whatsoever. (SECTION: Computing and Data Analysis)
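
    The heart of step (2), scaling the empirical PSF to a detected peak and subtracting it, can be sketched as a one-parameter least-squares fit. This is a minimal illustration under assumed synthetic data (a linear "galaxy" gradient plus a Gaussian "star"), not the author's implementation; in particular, a local background is estimated with a simple median rather than a galaxy-profile model.

```python
import numpy as np

def subtract_star(image, psf, peak_y, peak_x):
    """Remove one foreground star by least-squares scaling of an
    empirical PSF (odd-sized, centred), subtracting in place.

    Amplitude a minimises ||(patch - background) - a * psf||^2, which
    gives the closed form a = sum(residual * psf) / sum(psf**2)."""
    r = psf.shape[0] // 2
    patch = image[peak_y - r:peak_y + r + 1, peak_x - r:peak_x + r + 1]
    bg = np.median(patch)                  # crude local background level
    a = np.sum((patch - bg) * psf) / np.sum(psf * psf)
    patch -= a * psf                       # modifies `image` via the view
    return a

# Synthetic galaxy (smooth gradient) plus a Gaussian star of amplitude 3
yy, xx = np.mgrid[:41, :41]
galaxy = 0.01 * (80 - yy - xx)
gy, gx = np.mgrid[:11, :11]
psf = np.exp(-((gy - 5) ** 2 + (gx - 5) ** 2) / 4.0)
image = galaxy.copy()
image[15:26, 20:31] += 3.0 * psf           # star centred at (20, 25)
amp = subtract_star(image, psf, 20, 25)
print(f"fitted amplitude: {amp:.2f}")
```

    The fitted amplitude recovers the injected star's brightness up to the background-estimation error, which is what the procedure's residual-patching step is there to clean up.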

  12. An intelligent support system for automatic detection of cerebral vascular accidents from brain CT images. (United States)

    Hajimani, Elmira; Ruano, M G; Ruano, A E


    This paper presents a Radial Basis Function Neural Network (RBFNN) based detection system for automatic identification of Cerebral Vascular Accidents (CVA) through analysis of Computed Tomographic (CT) images. For the design of the neural network classifier, a Multi-Objective Genetic Algorithm (MOGA) framework is used to determine the architecture of the classifier, its corresponding parameters and input features, by maximizing the classification precision while ensuring generalization. This approach considers a large number of input features, comprising first- and second-order pixel intensity statistics as well as symmetry/asymmetry information with respect to the ideal mid-sagittal line. Values of specificity of 98% and sensitivity of 98% were obtained at pixel level by an ensemble of non-dominated models generated by MOGA, in a set of 150 CT slices (1,867,602 pixels) marked by a neuroradiologist. This approach also compares favorably at the lesion level with three other published solutions, in terms of specificity (86% compared with 84%), degree of coincidence of marked lesions (89% compared with 77%) and classification accuracy rate (96% compared with 88%). Copyright © 2017. Published by Elsevier B.V.

  13. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.


    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing the samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector of which the sensitive part is located inside the containment and additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought in turn one by one on to the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device [fr

  14. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole


    , using this method has been developed. (ADIODES is an abbreviation of "Automatic Differentiation Interval Ordinary Differential Equation Solver".) ADIODES is used to prove existence and uniqueness of periodic solutions to specific ordinary differential equations occurring in dynamical systems theory....... These proofs of existence and uniqueness are difficult or impossible to obtain using other known methods. Also, a method for solving boundary value problems is described. Finally, a method for enclosing solutions to a class of integral equations is described. This method is based on the mean value enclosure...... of an integral operator and uses interval Bernstein polynomials for enclosing the solution. Two numerical examples are given, using two orders of approximation and using different numbers of discretization points....

  15. Antares automatic beam alignment system

    International Nuclear Information System (INIS)

    Appert, Q.; Swann, T.; Sweatt, W.; Saxman, A.


    Antares is a 24-beam-line CO2 laser system for controlled fusion research, under construction at Los Alamos Scientific Laboratory (LASL). Rapid automatic alignment of this system is required prior to each experiment shot. The alignment requirements, operational constraints, and a developed prototype system are discussed. A visible-wavelength alignment technique is employed that uses a telescope/TV system to view point light sources appropriately located down the beamline. Auto-alignment is accomplished by means of a video centroid tracker, which determines the off-axis error of the point sources. The error is nulled by computer-driven, movable mirrors in a closed-loop system. The light sources are fiber-optic terminations located at key points in the optics path, primarily at the center of large copper mirrors, and remotely illuminated to reduce heating effects

  16. Automatic quantification of iris color

    DEFF Research Database (Denmark)

    Christoffersen, S.; Harder, Stine; Andersen, J. D.


    An automatic algorithm to quantify the eye colour and structural information from standard hi-resolution photos of the human iris has been developed. Initially, the major structures in the eye region are identified including the pupil, iris, sclera, and eyelashes. Based on this segmentation, the ...... is completely data driven and it can divide a group of eye images into classes based on structure, colour or a combination of the two. The methods have been tested on a large set of photos with promising results....... regions. The result is a blue-brown ratio for each eye. Furthermore, an image clustering approach has been used with promising results. The approach is based on using a sparse dictionary of feature vectors learned from a training set of iris regions. The feature vectors contain both local structural...

  17. Automatic creation of simulation configuration

    International Nuclear Information System (INIS)

    Oudot, G.; Poizat, F.


    SIPA, which stands for 'Simulator for Post Accident', includes: 1) a sophisticated software oriented workshop SWORD (which stands for 'Software Workshop Oriented towards Research and Development') designed in the ADA language including integrated CAD system and software tools for automatic generation of simulation software and man-machine interface in order to operate run-time simulation; 2) a 'simulator structure' based on hardware equipment and software for supervision and communications; 3) simulation configuration generated by SWORD, operated under the control of the 'simulator structure' and run on a target computer. SWORD has already been used to generate two simulation configurations (French 900 MW and 1300 MW nuclear power plants), which are now fully operational on the SIPA training simulator. (Z.S.) 1 ref

  18. Computerized automatic tip scanning operation

    International Nuclear Information System (INIS)

    Nishikawa, K.; Fukushima, T.; Nakai, H.; Yanagisawa, A.


    In BWR nuclear power stations the Traversing Incore Probe (TIP) system is one of the most important components in reactor monitoring and control. In previous TIP systems, however, operators have suffered from the complexity of operation and long operation time required. The system presented in this paper realizes the automatic operation of the TIP system by monitoring and driving it with a process computer. This system significantly reduces the burden on customer operators and improves plant efficiency by simplifying the operating procedure, augmenting the accuracy of the measured data, and shortening operating time. The process computer is one of the PODIA (Plant Operation by Displayed Information Automation) systems. This computer transfers control signals to the TIP control panel, which in turn drives equipment by microprocessor control. The process computer contains such components as the CRT/KB unit, the printer plotter, the hard copier, and the message typers required for efficient man-machine communications. Its operation and interface properties are described

  19. Automatic Differentiation and Deep Learning

    CERN Multimedia

    CERN. Geneva


    Statistical learning has been attracting more and more interest from the particle-physics community in recent times, with neural networks and gradient-based optimization being a focus. In this talk we shall discuss three things: automatic differentiation tools, i.e. tools to quickly build DAGs of computation that are fully differentiable, focusing on one such tool, PyTorch; easy deployment of trained neural networks into large systems with many constraints, for example deploying a model at the reconstruction phase, where the neural network has to be integrated into CERN's bulk data-processing C++-only environment; and some recent deep-learning models for segmentation and generation that might be useful for particle-physics problems.
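
    Tools such as PyTorch implement reverse-mode automatic differentiation over computation DAGs; the underlying idea is easiest to see in a tiny forward-mode sketch, where each value carries its derivative and every arithmetic operation propagates both. This is an illustrative toy, not how PyTorch is implemented.

```python
class Dual:
    """Forward-mode automatic differentiation with dual numbers:
    each Dual carries a value and the derivative of that value
    with respect to the chosen input."""

    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1    # f'(x) = 6x + 2

x = Dual(4.0, 1.0)                  # seed the derivative dx/dx = 1
y = f(x)
print(y.val, y.dot)                 # prints: 57.0 26.0
```

    Reverse mode (what `loss.backward()` does in PyTorch) instead records the DAG during the forward pass and propagates derivatives backwards, which is far cheaper when one scalar output depends on many parameters.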

  20. Automatic Channel Fault Detection on a Small Animal APD-Based Digital PET Scanner (United States)

    Charest, Jonathan; Beaudoin, Jean-François; Cadorette, Jules; Lecomte, Roger; Brunet, Charles-Antoine; Fontaine, Réjean


Avalanche photodiode (APD) based positron emission tomography (PET) scanners show enhanced imaging capabilities in terms of spatial resolution and contrast due to the one-to-one coupling and size of individual crystal-APD detectors. However, to ensure maximal performance, these PET scanners require proper calibration by qualified scanner operators, which can become a cumbersome task because of the large number of channels they comprise. An intelligent system (IS) intends to alleviate this workload by enabling a diagnosis of the observational errors of the scanner. The IS can be broken down into four hierarchical blocks: parameter extraction, channel fault detection, prioritization, and diagnosis. One of the main activities of the IS consists in analyzing available channel data, such as normalization coincidence counts and single count rates, crystal identification classification data, energy histograms, and APD bias and noise thresholds, to establish the channel health status that will be used to detect channel faults. This paper focuses on the first two blocks of the IS: parameter extraction and channel fault detection. The parameter extraction block processes available data on individual channels into parameters that are subsequently used by the fault detection block to generate the channel health status. To ensure extensibility, the channel fault detection block is divided into indicators representing different aspects of PET scanner performance: sensitivity, timing, crystal identification, and energy. Experiments on an 8 cm axial length LabPET scanner located at the Sherbrooke Molecular Imaging Center demonstrated an erroneous channel fault detection rate of 10% (with a 95% confidence interval (CI) of [9, 11]), which is considered tolerable. Globally, the IS achieves a channel fault detection efficiency of 96% (CI: [95, 97]), which proves that many faults can be detected automatically. Increased fault detection efficiency would be

  1. Automatic health record review to help prioritize gravely ill Social Security disability applicants. (United States)

    Abbott, Kenneth; Ho, Yen-Yi; Erickson, Jennifer


    Every year, thousands of patients die waiting for disability benefits from the Social Security Administration. Some qualify for expedited service under the Compassionate Allowance (CAL) initiative, but CAL software focuses exclusively on information from a single form field. This paper describes the development of a supplemental process for identifying some overlooked but gravely ill applicants, through automatic annotation of health records accompanying new claims. We explore improved prioritization instead of fully autonomous claims approval. We developed a sample of claims containing medical records at the moment of arrival in a single office. A series of tools annotated both patient records and public Web page descriptions of CAL medical conditions. We trained random forests to identify CAL patients and validated each model with 10-fold cross validation. Our main model, a general CAL classifier, had an area under the receiver operating characteristic curve of 0.915. Combining this classifier with existing software improved sensitivity from 0.960 to 0.994, detecting every deceased patient, but reducing positive predictive value to 0.216. True positive CAL identification is a priority, given CAL patient mortality. Mere prioritization of the false positives would not create a meaningful burden in terms of manual review. Death certificate data suggest the presence of truly ill patients among putative false positives. To a limited extent, it is possible to identify gravely ill Social Security disability applicants by analyzing annotations of unstructured electronic health records, and the level of identification is sufficient to be useful in prioritizing case reviews. Published by Oxford University Press on behalf of the American Medical Informatics Association 2017. This work is written by US Government employees and is in the public domain in the US.
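The classifier above is summarized by its area under the ROC curve (0.915). As a library-free illustration of what that metric measures, the following sketch computes ROC AUC from scores via its rank-statistic (Wilcoxon/Mann-Whitney) interpretation; the labels and scores below are made-up toy data, not the study's:

```python
def roc_auc(labels, scores):
    """AUC = probability that a random positive scores above a random
    negative (ties count half) -- the Mann-Whitney view of ROC AUC."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# toy example (hypothetical classifier scores)
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2]
print(round(roc_auc(labels, scores), 3))  # -> 0.889
```

An AUC of 0.915 therefore means a randomly chosen CAL claim outscores a randomly chosen non-CAL claim about 91.5% of the time, independent of any decision threshold.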

  2. Rapid Automatic Motor Encoding of Competing Reach Options

    Directory of Open Access Journals (Sweden)

    Jason P. Gallivan


    Full Text Available Mounting neural evidence suggests that, in situations in which there are multiple potential targets for action, the brain prepares, in parallel, competing movements associated with these targets, prior to implementing one of them. Central to this interpretation is the idea that competing viewed targets, prior to selection, are rapidly and automatically transformed into corresponding motor representations. Here, by applying target-specific, gradual visuomotor rotations and dissociating, unbeknownst to participants, the visual direction of potential targets from the direction of the movements required to reach the same targets, we provide direct evidence for this provocative idea. Our results offer strong empirical support for theories suggesting that competing action options are automatically represented in terms of the movements required to attain them. The rapid motor encoding of potential targets may support the fast optimization of motor costs under conditions of target uncertainty and allow the motor system to inform decisions about target selection.

  3. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani


Full Text Available Automatic systems have brought many revolutions to existing technologies. One technology that has seen significant development is the solar powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis while reducing manpower when used in an automatic manner. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer, set to intervals preferred by the user, that undergoes a continuous process. A magnetic contactor acts as a switch connected to the 10-hour timer, controlling the activation or termination of the electrical loads; it is powered by a solar panel that outputs electrical power and a rechargeable battery in electrical communication with the solar panel for storing that power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  4. Towards Automatic Trunk Classification on Young Conifers

    DEFF Research Database (Denmark)

    Petri, Stig; Immerkær, John


    In the garden nursery industry providing young Nordmann firs for Christmas tree plantations, there is a rising interest in automatic classification of their products to ensure consistently high quality and reduce the cost of manual labor. This paper describes a fully automatic single-view algorithm...

  5. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.


The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. The invention also enables automatic marking of films in radiographic inspection, identifying the test piece and the part of it where testing took place. (RW) [de

  6. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.


    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer

  7. Box & Jenkins Model Identification:A Comparison of Methodologies

    Directory of Open Access Journals (Sweden)

    Maria Augusta Soares Machado


Full Text Available This paper presents a comparison of a neuro-fuzzy backpropagation network and automatic forecast model identification for automatically identifying Box & Jenkins non-seasonal models. Recently, combinations of neural network and fuzzy logic technologies have been used to deal with uncertain and subjective problems. On the basis of the obtained results, it is concluded that this type of approach is very powerful.

  8. Scheduling with Automatic Resolution of Conflicts (United States)

    Clement, Bradley; Schaffer, Steve


    DSN Requirement Scheduler is a computer program that automatically schedules, reschedules, and resolves conflicts for allocations of resources of NASA s Deep Space Network (DSN) on the basis of ever-changing project requirements for DSN services. As used here, resources signifies, primarily, DSN antennas, ancillary equipment, and times during which they are available. Examples of project-required DSN services include arraying, segmentation, very-long-baseline interferometry, and multiple spacecraft per aperture. Requirements can include periodic reservations of specific or optional resources during specific time intervals or within ranges specified in terms of starting times and durations. This program is built on the Automated Scheduling and Planning Environment (ASPEN) software system (aspects of which have been described in previous NASA Tech Briefs articles), with customization to reflect requirements and constraints involved in allocation of DSN resources. Unlike prior DSN-resource- scheduling programs that make single passes through the requirements and require human intervention to resolve conflicts, this program makes repeated passes in a continuing search for all possible allocations, provides a best-effort solution at any time, and presents alternative solutions among which users can choose.

  9. Automatic generation of stop word lists for information retrieval and analysis (United States)

    Rose, Stuart J


    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
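The procedure above can be sketched in a few lines. Everything here is an illustrative simplification of the described method, not the patent's implementation: the function name, whitespace tokenization, the treatment of "keyword frequency" as any occurrence of a keyword term, and the smoothing constant are all assumptions.

```python
def stop_word_list(documents, keywords, min_ratio=1.0, max_size=10):
    """Terms that occur adjacent to known keywords more often than inside
    them are stop-word candidates; the final list is truncated by size."""
    adjacency = {}   # term occurs immediately before/after a keyword term
    within = {}      # term occurs as part of a keyword (simplified)
    kw_terms = {t for kw in keywords for t in kw.lower().split()}
    for doc in documents:
        words = doc.lower().split()
        for i, w in enumerate(words):
            if w in kw_terms:
                within[w] = within.get(w, 0) + 1
            neighbours = words[max(0, i - 1):i] + words[i + 1:i + 2]
            if any(n in kw_terms for n in neighbours):
                adjacency[w] = adjacency.get(w, 0) + 1
    candidates = []
    for term, adj in adjacency.items():
        ratio = adj / within.get(term, 1)   # hypothetical smoothing choice
        if ratio >= min_ratio and term not in kw_terms:
            candidates.append((adj, term))
    return [t for _, t in sorted(candidates, reverse=True)[:max_size]]

docs = ["the neural network model uses the network data"]
print(stop_word_list(docs, ["neural network"]))  # -> ['the', 'model', 'data']
```

Function words like "the" sit next to keywords far more often than inside them, which is exactly the signal the ratio test exploits.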

  10. Automatic video shot boundary detection using k-means clustering and improved adaptive dual threshold comparison (United States)

    Sa, Qila; Wang, Zhihui


At present, content-based video retrieval (CBVR) is the mainstream video retrieval method, using a video's own features to perform automatic identification and retrieval. This method involves a key technology: shot segmentation. In this paper, a method for automatic video shot boundary detection with k-means clustering and improved adaptive dual-threshold comparison is proposed. First, the visual features of every frame are extracted and divided into two categories using the k-means clustering algorithm, namely frames with significant change and frames with no significant change. Then, on the classification results, the improved adaptive dual-threshold comparison method is used to determine both abrupt and gradual shot boundaries. Finally, an automatic video shot boundary detection system is achieved.
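The two-category clustering step can be sketched in one dimension on per-frame change scores. The frame-difference values are made up, and the paper's adaptive dual thresholds for gradual transitions are not reproduced here; this only shows the k-means split into "significant change" vs "no significant change":

```python
def kmeans_1d(values, iters=20):
    """Two-cluster k-means on per-frame change scores: separates frames
    with significant change (shot-boundary candidates) from the rest."""
    lo, hi = min(values), max(values)
    c = [lo, hi]                       # initialise centres at the extremes
    for _ in range(iters):
        groups = ([], [])
        for v in values:
            # boolean index: True (1) when v is closer to the high centre
            groups[abs(v - c[0]) > abs(v - c[1])].append(v)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return c

# hypothetical inter-frame difference magnitudes; spikes mark hard cuts
diffs = [0.02, 0.03, 0.01, 0.95, 0.02, 0.04, 0.88, 0.03]
centres = kmeans_1d(diffs)
boundaries = [i for i, d in enumerate(diffs)
              if abs(d - centres[1]) < abs(d - centres[0])]
print(boundaries)  # -> [3, 6]
```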

  11. Automatic detection of adverse events to predict drug label changes using text and data mining techniques. (United States)

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki


The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Card, and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied to extract adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and subsequent prediction of label changes issued for 29 drugs by the UK regulatory authority in 2009. 76% of drug label changes were automatically predicted; of these, 6% were detected only by text mining. JSRE enabled precise identification of four adverse drug events from MEDLINE that were otherwise undetectable. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.

  12. Automatic exploitation system for photographic dosemeters; Systeme d'exploitation automatique des dosimetres photographiques

    Energy Technology Data Exchange (ETDEWEB)

    Magri, Y.; Devillard, D.; Godefroit, J.L.; Barillet, C.


The Laboratory of Dosimetry Exploitation (LED) has built equipment for the fully automatic processing of photographic film dosemeters. The system identifies films by bar code and measures doses with a completely automatic reader. The principle consists in splicing the emulsions to be processed into a ribbon and developing them in a circulating machine. The blackening of the film is measured on a reading plate with fourteen reading points, through which the emulsion ribbon circulates. Doses are computed with the usual calculation method, using dedicated computer codes. A comparison on 2000 dosemeters showed that the manual and automatic methods give the same results. The system has been operated by the LED since July 1995. (N.C.).

  13. Peak fitting and identification software library for high resolution gamma-ray spectra

    International Nuclear Information System (INIS)

    Uher, Josef; Roach, Greg; Tickner, James


A new gamma-ray spectral analysis software package is under development in our laboratory. It can be operated as a stand-alone program or called as a software library from Java, C, C++, and MATLAB™ environments. It provides an advanced graphical user interface for data acquisition, spectral analysis and radioisotope identification. The code uses a peak-fitting function that includes peak asymmetry, Compton continuum and flexible background terms. Peak fitting function parameters can be calibrated as functions of energy. Each parameter can be constrained to improve fitting of overlapping peaks. All of these features can be adjusted by the user. To assist with peak identification, the code can automatically measure half-lives of single or multiple overlapping peaks from a time series of spectra. It implements library-based peak identification, with options for restricting the search based on radioisotope half-lives and reaction types. The software also improves the reliability of isotope identification by utilizing Monte-Carlo simulation results.
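A full asymmetric peak fit with Compton and background terms is beyond a short sketch, but the core idea of extracting a peak position and width from background-subtracted counts can be illustrated with a moment estimate. This is a simplification of the fitting described above, and the channel energies and counts are synthetic:

```python
def peak_centroid(energies, counts):
    """Estimate peak position and width by moments after subtracting a
    straight-line background drawn between the first and last channels.
    A simplified stand-in for full asymmetric peak fitting."""
    n = len(counts)
    bg = [counts[0] + (counts[-1] - counts[0]) * i / (n - 1) for i in range(n)]
    net = [max(c - b, 0.0) for c, b in zip(counts, bg)]
    total = sum(net)
    mean = sum(e * y for e, y in zip(energies, net)) / total
    var = sum((e - mean) ** 2 * y for e, y in zip(energies, net)) / total
    return mean, var ** 0.5

# synthetic peak on a flat background (made-up numbers, keV / counts)
energies = [660, 661, 662, 663, 664]
counts = [10, 60, 110, 60, 10]
centre, sigma = peak_centroid(energies, counts)
print(round(centre, 1))  # -> 662.0
```

A real fitter would instead minimize chi-square over the full peak-shape model, which is what allows constrained parameters and overlapping-peak deconvolution.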

  14. Automatic locking orthotic knee device (United States)

    Weddendorf, Bruce C. (Inventor)


An articulated tang in clevis joint for incorporation in newly manufactured conventional strap-on orthotic knee devices or for replacing such joints in conventional strap-on orthotic knee devices is discussed. The instant tang in clevis joint allows the user the freedom to extend and bend the knee normally when no load (weight) is applied to the knee and to automatically lock the knee when the user transfers weight to the knee, thus preventing a damaged knee from bending uncontrollably when weight is applied to the knee. The tang in clevis joint of the present invention includes first and second clevis plates, a tang assembly and a spacer plate secured between the clevis plates. Each clevis plate includes a bevelled serrated upper section. A bevelled shoe is secured to the tang in close proximity to the bevelled serrated upper section of the clevis plates. A coiled spring mounted within an oblong bore of the tang normally urges the shoe secured to the tang out of engagement with the serrated upper section of each clevis plate to allow rotation of the tang relative to the clevis plates. When weight is applied to the joint, the load compresses the coiled spring, and the serrations on each clevis plate dig into the bevelled shoes secured to the tang to prevent relative movement between the tang and clevis plates. A shoulder is provided on the tang and the spacer plate to prevent overextension of the joint.

  15. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler


    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  16. Automatic Transmission Of Liquid Nitrogen

    Directory of Open Access Journals (Sweden)

    Sumedh Mhatre


Full Text Available Liquid nitrogen (LN2) is a major substance used as a chiller in industries such as ice cream factories, milk dairies, and blood banks for the storage of blood samples. It helps maintain the product at a low temperature for preservation purposes. LN2 cannot be fully utilised: in practice, if 3.75 litres of LN2 are used in a single day, around 12% (450 ml) is wasted due to vaporisation. A pressure relief valve is provided to create a pressure difference; if there is no pressure difference between the cylinder carrying the LN2 and its surroundings, the result is damage to the container as well as wastage of LN2. Transmission of LN2 from TA55 to BA3 is currently carried out manually, so care must be taken during transmission to avoid wastage. With this project concept, the transmission of LN2 is carried out automatically, reducing the wastage incurred with manual operation.

  17. Automatic segmentation of the colon (United States)

    Wyatt, Christopher L.; Ge, Yaorong; Vining, David J.


Virtual colonoscopy is a minimally invasive technique that enables detection of colorectal polyps and cancer. Normally, a patient's bowel is prepared with colonic lavage and gas insufflation prior to computed tomography (CT) scanning. An important step for 3D analysis of the image volume is segmentation of the colon. The high-contrast gas/tissue interface that exists in the colon lumen makes segmentation of the majority of the colon relatively easy; however, two factors inhibit automatic segmentation of the entire colon. First, the colon is not the only gas-filled organ in the data volume: the lungs, small bowel, and stomach also meet this criterion. User-defined seed points placed in the colon lumen have previously been required to spatially isolate only the colon. Second, portions of the colon lumen may be obstructed by peristalsis, large masses, and/or residual feces. These complicating factors require increased user interaction during the segmentation process to isolate additional colon segments. To automate the segmentation of the colon, we have developed a method to locate seed points and segment the gas-filled lumen with no user supervision. We have also developed an automated approach to improve lumen segmentation by digitally removing residual contrast-enhanced fluid resulting from a new bowel preparation that liquefies and opacifies any residual feces.
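The seed-point-plus-lumen step can be sketched as region growing over gas-valued voxels. The 2-D grid, the -800 HU threshold, and the toy values below are illustrative assumptions, not the paper's implementation:

```python
from collections import deque

def region_grow(volume, seed, threshold=-800):
    """Grow a region from a seed over 4-connected voxels whose value is
    below a gas threshold (air in CT is near -1000 HU).
    Simplified 2-D sketch of gas-filled lumen segmentation."""
    rows, cols = len(volume), len(volume[0])
    seen, queue = {seed}, deque([seed])   # seed assumed to lie in gas
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in seen and volume[nr][nc] < threshold):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

# toy CT slice in HU: -1000 = gas, 40 = soft tissue
slice_hu = [[40, 40, 40, 40],
            [40, -1000, -1000, 40],
            [40, -1000, 40, 40],
            [40, 40, 40, 40]]
lumen = region_grow(slice_hu, seed=(1, 1))
print(len(lumen))  # -> 3
```

Automating seed selection then amounts to finding gas-valued voxels and keeping only the grown components whose shape and position are consistent with the colon rather than the lungs or stomach.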

  18. Automatic panoramic thermal integrated sensor (United States)

    Gutin, Mikhail A.; Tsui, Eddy K.; Gutin, Olga N.


Historically, the US Army has recognized the advantages of panoramic imagers with high image resolution: increased area coverage with fewer cameras, instantaneous full-horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The novel ViperView™ high-resolution panoramic thermal imager is the heart of the Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC) in support of the Future Combat Systems (FCS) and the Intelligent Munitions Systems (IMS). The APTIS is anticipated to operate as an intelligent node in a wireless network of multifunctional nodes that work together to improve situational awareness (SA) in many defensive and offensive operations, as well as to serve as a sensor node in tactical intelligence, surveillance, and reconnaissance (ISR). The ViperView is an aberration-corrected omnidirectional imager with small optics designed to match the resolution of a 640x480-pixel IR camera, with improved image quality for longer-range target detection, classification, and tracking. The same approach is applicable to panoramic cameras working in the visible spectral range. Other components of the APTIS sensor suite include ancillary sensors, advanced power management, and wakeup capability. This paper describes the development status of the APTIS system.

  19. Automatic segmentation of psoriasis lesions (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang


The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Current algorithms can only handle single erythema or only deal with scaling segmentation, whereas in practice scaling and erythema are often mixed together. In order to segment the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied, exploiting the skin's Tyndall effect in imaging to eliminate reflection, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to obtain texture and color features. In this step, a feature of image roughness is defined so that scaling can be easily separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. The algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. On the data set provided by Union Hospital, more than 90% of images can be segmented accurately.

  20. A neurocomputational model of automatic sequence production. (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory


Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the basal ganglia (BG) in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.


    Human Engineering Inst., Cleveland, OH.


  2. Isotope Identification

    Energy Technology Data Exchange (ETDEWEB)

    Karpius, Peter Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


The objective of this training module is to examine the process of using gamma spectroscopy for radionuclide identification; apply pattern recognition to gamma spectra; identify methods of verifying energy calibration; and discuss potential causes of isotope misidentification.

  3. Automatic x-ray image characterisation for non-destructive evaluation


    Yin, Ying; Tian, Gui Yun


In this paper, we first introduce an automatic welding defect inspection system for X-ray image evaluation; then, a novel image segmentation approach is proposed. In this approach, we first apply an adaptive morphological filter (AMF) with an appropriate structuring element to remove noise and most of the background image, which is useless for defect identification. Secondly, edges are derived from edge detection using the Sobel operator. Morphological processing is used to m...
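The Sobel edge-detection step named above can be illustrated in a few lines; the toy image and the unfiltered border handling are assumptions of this sketch, not the authors' pipeline:

```python
def sobel_magnitude(img):
    """Gradient magnitude with the 3x3 Sobel kernels, as used for edge
    detection after noise filtering; border pixels are left at zero."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# vertical step edge: dark left half, bright right half
img = [[0, 0, 10, 10]] * 4
mag = sobel_magnitude(img)
print(mag[1])  # -> [0.0, 40.0, 40.0, 0.0]
```

The strong responses sit on the columns adjacent to the intensity jump, which is what the subsequent morphological processing would link into defect boundaries.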

  4. Computer vision-based automatic beverage dispenser prototype for user experience studies


    Merchán, Fernando; Valderrama, Elba; Poveda, Martín


This paper presents several aspects of the implementation of a prototype automatic beverage dispenser with computer vision functionalities. The system features touchless technologies, including face recognition for user identification and hand gesture recognition for beverage selection. This prototype is a test platform to explore consumer acceptance of these technologies and to compare them with others such as touch screens. We present both the technical aspects of the de...

  5. Intraoperative multichannel audio-visual information recording and automatic surgical phase and incident detection. (United States)

    Suzuki, Takashi; Sakurai, Yasuo; Yoshimitsu, Kitaro; Nambu, Kyojiro; Muragaki, Yoshihiro; Iseki, Hiroshi


Identification, analysis, and treatment of potential risk in the surgical workflow are key to decreasing medical errors in the operating room. For the automatic analysis of recorded surgical information, this study reports a multichannel audio-visual recording system together with its review and analysis system. Motion in the operating room is quantified using video file size, without motion tracking. Conversation among surgical staff is quantified using the fast Fourier transform and a frequency filter, without speech recognition. The results suggested the progression phase of the surgical procedure.
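The FFT-plus-frequency-filter idea can be sketched as a band-energy score: how much of a frame's spectral energy falls in the speech band. The band limits, sample values, and naive DFT below are illustrative assumptions, not the paper's implementation:

```python
import math

def band_energy_fraction(samples, rate, lo=300.0, hi=3400.0):
    """Fraction of spectral energy in the (assumed) speech band,
    computed with a naive DFT over the positive-frequency bins."""
    n = len(samples)
    total = band = 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * t / n)
                 for t, s in enumerate(samples))
        im = sum(-s * math.sin(2 * math.pi * k * t / n)
                 for t, s in enumerate(samples))
        power = re * re + im * im
        total += power
        if lo <= k * rate / n <= hi:
            band += power
    return band / total if total else 0.0

# 1 kHz tone sampled at 8 kHz: all energy falls inside the speech band
rate, n = 8000, 64
tone = [math.sin(2 * math.pi * 1000 * t / rate) for t in range(n)]
print(round(band_energy_fraction(tone, rate), 2))  # -> 1.0
```

Thresholding this score frame by frame gives a crude conversation-activity signal without any speech recognition, mirroring the quantification strategy described above.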

  6. Detection of tuberculosis by automatic cough sound analysis. (United States)

    Botha, G H Renier; Theron, Grant; Warren, Rob; Klopper, Marisa; Dheda, Kheertan; van Helden, Paul; Niesler, Thomas R


Globally, tuberculosis (TB) remains one of the most deadly diseases. Although several effective diagnostic methods exist, in lower-income countries clinics may not be in a position to afford expensive equipment and employ the trained experts needed to interpret results. In these situations, symptoms including cough are commonly used to identify patients for testing. However, self-reported cough has suboptimal sensitivity and specificity, which may be improved by digital detection. This study investigates a simple and easily applied method for TB screening based on the automatic analysis of coughing sounds. A database of cough audio recordings was collected and used to develop statistical classifiers. These classifiers use short-term spectral information to automatically distinguish between the coughs of TB-positive and TB-negative patients with an accuracy of 78% and an AUC of 0.95. When a set of five clinical measurements is available in addition to the audio, this accuracy improves to 82%. By choosing an appropriate decision threshold, the system can achieve a sensitivity of 95% at a specificity of approximately 72%. The experiments suggest that the classifiers are using some spectral information that is not perceivable by the human auditory system, and that certain frequencies are more useful for classification than others. We conclude that automatic classification of coughing sounds may represent a viable low-cost and low-complexity screening method for TB. © 2018 Institute of Physics and Engineering in Medicine.

  7. Open Dataset for the Automatic Recognition of Sedentary Behaviors. (United States)

    Possos, William; Cruz, Robinson; Cerón, Jesús D; López, Diego M; Sierra-Torres, Carlos H


Sedentarism is associated with the development of noncommunicable diseases (NCDs) such as cardiovascular diseases (CVDs), type 2 diabetes, and cancer. Therefore, the identification of specific sedentary behaviors (TV viewing, sitting at work, driving, relaxing, etc.) is especially relevant for planning personalized prevention programs. The aim was to build and evaluate a public dataset for the automatic recognition (classification) of sedentary behaviors. The dataset included data from 30 subjects who performed 23 sedentary behaviors while wearing a commercial wearable on the wrist, a smartphone on the hip, and another on the thigh. Bluetooth Low Energy (BLE) beacons were used in order to improve the automatic classification of the different sedentary behaviors. The study also compared six well-known data mining classification techniques in order to identify the most precise method for solving the classification problem of the 23 defined behaviors. The best classification accuracy was obtained using the Random Forest algorithm with data collected from the phone on the hip. Furthermore, the use of beacons as a reference for obtaining the symbolic location of the individual improved the precision of the classification.

  8. A Cough-Based Algorithm for Automatic Diagnosis of Pertussis (United States)

    Pramono, Renard Xaviero Adhi; Imtiaz, Syed Anas; Rodriguez-Villegas, Esther


Pertussis is a contagious respiratory disease which mainly affects young children and can be fatal if left untreated. The World Health Organization estimates 16 million pertussis cases annually worldwide, resulting in over 200,000 deaths. It is prevalent mainly in developing countries, where it is difficult to diagnose due to the lack of healthcare facilities and medical professionals. Hence, a low-cost, quick, and easily accessible solution is needed to provide pertussis diagnosis in such areas to contain an outbreak. In this paper we present an algorithm for the automated diagnosis of pertussis using audio signals, by analyzing cough and whoop sounds. The algorithm consists of three main blocks performing automatic cough detection, cough classification, and whooping sound detection. Each block extracts relevant features from the audio signal and classifies them using a logistic regression model. The outputs of these blocks are collated to provide a pertussis likelihood diagnosis. The performance of the proposed algorithm is evaluated using audio recordings from 38 patients. The algorithm is able to diagnose all pertussis cases successfully from all audio recordings without any false diagnoses. It can also automatically detect individual cough sounds with 92% accuracy and a PPV of 97%. The low complexity of the proposed algorithm, coupled with its high accuracy, demonstrates that it can be readily deployed using smartphones and can be extremely useful for quick identification or early screening of pertussis and for the control of infection outbreaks. PMID:27583523

  9. Machine learning-based automatic detection of pulmonary trunk (United States)

    Wu, Hong; Deng, Kun; Liang, Jianming


    Pulmonary embolism (PE) is a common cardiovascular emergency, with about 600,000 cases occurring annually and causing approximately 200,000 deaths in the US. CT pulmonary angiography (CTPA) has become the reference standard for PE diagnosis, but the interpretation of these large image datasets is made complex and time consuming by the intricate branching structure of the pulmonary vessels, a myriad of artifacts that may obscure or mimic PEs, and suboptimal bolus of contrast and inhomogeneities in the pulmonary arterial blood pool. To meet this challenge, several approaches for computer-aided diagnosis of PE in CTPA have been proposed. However, none of these approaches is capable of detecting central PEs, distinguishing the pulmonary artery from the vein to effectively remove false positives from the veins, and dynamically adapting to the suboptimal contrast conditions associated with CTPA scans. Overcoming these shortcomings requires highly efficient and accurate identification of the pulmonary trunk. For this very purpose, in this paper, we present a machine-learning-based approach for automatically detecting the pulmonary trunk. Our idea is to train a cascaded AdaBoost classifier with a large number of Haar features extracted from CTPA image samples, so that the pulmonary trunk can be automatically identified by sequentially scanning the CTPA images and classifying each encountered sub-image with the trained classifier. Our approach outperforms an existing anatomy-based approach, requiring no explicit representation of anatomical knowledge and achieving nearly 100% accuracy when tested on a large number of cases.
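
    Haar features over scanned sub-images are usually computed from an integral image (summed-area table), so each rectangle sum costs four lookups. A minimal sketch of that machinery on a toy 2x4 image, with a two-rectangle feature:

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img over rows < y, cols < x."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of the w*h rectangle with top-left corner (x, y)."""
    return ii[y + h][x + w] - ii[y][x + w] - ii[y + h][x] + ii[y][x]

def haar_two_rect(ii, x, y, w, h):
    """Two-rectangle Haar-like feature: left half minus right half."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)

img = [[1, 1, 5, 5],
       [1, 1, 5, 5]]
ii = integral_image(img)
print(haar_two_rect(ii, 0, 0, 4, 2))  # -> -16
```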

  10. A manual and an automatic TERS based virus discrimination (United States)

    Olschewski, Konstanze; Kämmer, Evelyn; Stöckel, Stephan; Bocklitz, Thomas; Deckert-Gaudig, Tanja; Zell, Roland; Cialla-May, Dana; Weber, Karina; Deckert, Volker; Popp, Jürgen


    Rapid techniques for virus identification are more relevant today than ever. Conventional virus detection and identification strategies generally rest upon various microbiological methods and genomic approaches, which are not suited for the analysis of single virus particles. In contrast, the highly sensitive spectroscopic technique tip-enhanced Raman spectroscopy (TERS) allows the characterisation of biological nano-structures like virions on a single-particle level. In this study, the feasibility of TERS in combination with chemometrics to discriminate two pathogenic viruses, Varicella-zoster virus (VZV) and Porcine teschovirus (PTV), was investigated. In a first step, chemometric methods transformed the spectral data in such a way that a rapid visual discrimination of the two examined viruses was enabled. In a further step, these methods were utilised to perform an automatic quality rating of the measured spectra. Spectra that passed this test were eventually used to calculate a classification model, through which a successful discrimination of the two viral species based on TERS spectra of single virus particles was also realised with a classification accuracy of 91%.

  11. Automatic Recognition of Road Signs (United States)

    Inoue, Yasuo; Kohashi, Yuuichirou; Ishikawa, Naoto; Nakajima, Masato


    The increase in traffic accidents is becoming a serious social problem with the recent rapid growth in traffic. In many cases, the driver's carelessness is the primary factor in traffic accidents, and driver assistance systems are in demand to support driver safety. In this research, we propose a new method for automatic detection and recognition of road signs by image processing. The purpose of this research is to prevent accidents caused by driver carelessness, and to call the driver's attention when a traffic regulation is violated. High accuracy and efficient sign detection are achieved by removing all unnecessary information except road signs from an image and detecting a road sign using shape features. First, color information that is not used in road signs is removed from the image. Next, edges other than circular and triangular ones are removed to select the sign shape. In the recognition process, a normalized cross-correlation operation is carried out on the two-dimensional differentiation pattern of a sign, yielding an accurate and efficient method for detecting road signs. Moreover, real-time operation in software was realized by holding down the calculation cost while maintaining highly precise sign detection and recognition. Specifically, processing at 0.1 s/frame became possible on a general-purpose PC (CPU: Pentium 4, 1.7 GHz). In-vehicle experimentation confirmed that our system processes in real time and that detection and recognition of signs are performed correctly.
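
    The normalized cross-correlation step can be illustrated on two small 2-D patterns. NCC equals 1.0 for patterns that differ only by a linear amplitude scale, which is what makes it robust for template matching under varying illumination:

```python
import math

def ncc(a, b):
    """Normalized cross-correlation of two equally sized 2-D patterns."""
    fa = [v for row in a for v in row]
    fb = [v for row in b for v in row]
    ma = sum(fa) / len(fa)
    mb = sum(fb) / len(fb)
    num = sum((x - ma) * (y - mb) for x, y in zip(fa, fb))
    den = math.sqrt(sum((x - ma) ** 2 for x in fa)
                    * sum((y - mb) ** 2 for y in fb))
    return num / den if den else 0.0

template = [[0, 1, 0],
            [1, 1, 1],
            [0, 1, 0]]
candidate = [[0, 2, 0],
             [2, 2, 2],
             [0, 2, 0]]  # same shape, doubled amplitude
print(round(ncc(template, candidate), 3))  # -> 1.0
```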

  12. Laser Scanner For Automatic Storage (United States)

    Carvalho, Fernando D.; Correia, Bento A.; Rebordao, Jose M.; Rodrigues, F. Carvalho


    Automated storage magazines are being used more and more in industry. One of the problems related to the automation of a storehouse is the identification of the products involved. Already used for stock management, bar codes provide an easy way to identify a product. Applied to automated magazines, bar codes encode a great variety of items in a small code. In order to be used by national producers of automated magazines, a dedicated laser scanner has been developed. The prototype uses a He-Ne laser whose beam scans a field angle of 75 degrees at 16 Hz. The scene reflectivity is transduced by a photodiode into an electrical signal, which is then binarized. This digital signal is the input of the decoding program. The machine is able to see bar codes and to decode the information. A parallel interface allows communication with the central unit, which is responsible for the management of the automated magazine.

  13. Radiation dosimetry by automatic image analysis of dicentric chromosomes

    International Nuclear Information System (INIS)

    Bayley, R.; Carothers, A.; Farrow, S.; Gordon, J.; Ji, L.; Piper, J.; Rutovitz, D.; Stark, M.; Chen, X.; Wald, N.; Pittsburgh Univ., PA


    A system for scoring dicentric chromosomes by image analysis comprised fully automatic location of mitotic cells; automatic retrieval, focusing and digitisation at high resolution; automatic rejection of nuclei and debris and detection and segmentation of chromosome clusters; automatic centromere location; and subsequent rapid interactive visual review of potential dicentric chromosomes to confirm positives and reject false positives. A calibration set of about 15000 cells was used to establish the quadratic dose response for 60Co γ-irradiation. The dose-response function parameters were established by a maximum likelihood technique, and confidence limits on the dose response, and on the corresponding inverse curve of estimated dose for observed dicentric frequency, were established by Monte Carlo techniques. The system was validated in a blind trial by analysing a test set comprising a total of about 8000 cells irradiated at 1 of 10 dose levels, and estimating the doses from the observed dicentric frequency. There was a close correspondence between the estimated and true doses. The overall sensitivity of the system, in terms of the proportion of the total population of dicentrics present in the cells analysed that were detected by the system, was measured to be about 40%. This implies that about 2.5 times more cells must be analysed by machine than by visual analysis. Taking this factor into account, the measured review time and false positive rates imply that analysis by the system of sufficient cells to provide the equivalent of a visual analysis of 500 cells would require about 1 h of operator review. (author). 20 refs.; 4 figs.; 5 tabs
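
    The quadratic dose response Y = c + αD + βD² can be inverted in closed form to estimate dose from an observed dicentric frequency, as done in the blind trial. The coefficients below are illustrative only, not the paper's 60Co calibration:

```python
import math

def dicentric_frequency(dose, c=0.0005, alpha=0.03, beta=0.06):
    """Quadratic dose response Y = c + alpha*D + beta*D^2
    (coefficients are invented, not the paper's fitted values)."""
    return c + alpha * dose + beta * dose ** 2

def estimate_dose(y, c=0.0005, alpha=0.03, beta=0.06):
    """Invert the quadratic, keeping the positive root."""
    disc = alpha ** 2 + 4 * beta * (y - c)
    return (-alpha + math.sqrt(disc)) / (2 * beta)

y = dicentric_frequency(2.0)        # dicentrics per cell at 2 Gy
print(round(estimate_dose(y), 6))   # -> 2.0
```

    Confidence limits on the estimated dose, as in the paper, would come from propagating the uncertainty of the fitted parameters (e.g. by Monte Carlo) rather than from this point inversion.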

  14. Automatic safety rod for reactors. [LMFBR (United States)

    Germer, J.H.


    An automatic safety rod for a nuclear reactor, containing neutron-absorbing material and designed to be inserted into a reactor core after a loss of flow. Actuation is based upon either a sudden decrease in core pressure drop or the pressure drop falling below a predetermined minimum value. The automatic safety rod includes a pressure-regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.

  15. Automatic Control of Freeboard and Turbine Operation

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Frigaard, Peter Bak; Friis-Madsen, Erik

    The report deals with the modules for automatic control of freeboard and turbine operation on board the Wave Dragon, Nissum Bredning (WD-NB) prototype, and covers what has been going on up to ultimo 2003.

  16. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.


    Two kinds of cognitive processes can be distinguished: automatic processes, which are mostly subconscious, are learned and changed very slowly, and are not subject to the capacity limitations of working memory; and strategic processes, which are conscious, are subject to capacity limitations......, and can easily be adapted to situational circumstances. Both the perception of advertising and the way advertising influences brand evaluation involve both processes. Automatic processes govern the recognition of advertising stimuli, the relevance decision which determines further higher-level processing...

  17. Towards automatic verification of ladder logic programs


    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha


    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  18. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan


    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.
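
    Transfinite interpolation blends the four boundary curves of a patch into an interior surface. A minimal bilinearly blended (Coons) sketch with invented edge elevation profiles, standing in for the function spaces used above:

```python
def coons(u, v, c0, c1, d0, d1, corners):
    """Bilinearly blended transfinite (Coons) interpolation on [0,1]^2.
    c0(v), c1(v) are the west/east edges; d0(u), d1(u) south/north;
    corners = (P00, P10, P01, P11)."""
    p00, p10, p01, p11 = corners
    return ((1 - u) * c0(v) + u * c1(v)
            + (1 - v) * d0(u) + v * d1(u)
            - ((1 - u) * (1 - v) * p00 + u * (1 - v) * p10
               + (1 - u) * v * p01 + u * v * p11))

# Terrain height over a square patch given only its four edge profiles
# (flat west/east edges, linearly rising south/north edges -- invented).
c0 = lambda v: 10.0           # west edge elevation
c1 = lambda v: 20.0           # east edge
d0 = lambda u: 10.0 + 10 * u  # south edge, rising west to east
d1 = lambda u: 10.0 + 10 * u  # north edge
print(coons(0.5, 0.5, c0, c1, d0, d1, (10.0, 20.0, 10.0, 20.0)))  # -> 15.0
```

    By construction the surface reproduces the boundary curves exactly; the L2 projection in the paper then fits the interior terrain data in this kind of space.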

  19. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.


    A method for the automatic control of commercial computer programs is presented. A connection was developed between the automation system of the EXAFS spectrometer (managed by a PC running DOS) and the commercial program for CCD detector control (managed by a PC running Windows). The described complex system is used for the automation of intermediate amplitude-spectra processing in EXAFS spectrum measurements at the Kurchatov SR source

  20. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias


    this issue is addressed. We propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing...... alternative to continuation methods. Automatic continuation also generally obtains better designs than the classical formulation using a reduced number of iterations....

  1. The UAB Informatics Institute and 2016 CEGS N-GRID de-identification shared task challenge. (United States)

    Bui, Duy Duc An; Wyatt, Mathew; Cimino, James J


    Clinical narratives (the text notes found in patients' medical records) are important information sources for secondary use in research. However, in order to protect patient privacy, they must be de-identified prior to use. Manual de-identification is considered to be the gold standard approach but is tedious, expensive, slow, and impractical for use with large-scale clinical data. Automated or semi-automated de-identification using computer algorithms is a potentially promising alternative. The Informatics Institute of the University of Alabama at Birmingham is applying de-identification to clinical data drawn from the UAB hospital's electronic medical records system before releasing them for research. We participated in a shared task challenge organized by the Centers of Excellence in Genomic Science (CEGS) Neuropsychiatric Genome-Scale and RDoC Individualized Domains (N-GRID), in the de-identification regular track, to gain experience developing our own automatic de-identification tool. We focused on the popular and successful methods from previous challenges: rule-based, dictionary-matching, and machine-learning approaches. We also explored new techniques such as disambiguation rules and term-ambiguity measurement, and used a multi-pass sieve framework at a micro level. For the challenge's primary measure (strict entity), our submissions achieved competitive results (f-measures: 87.3%, 87.1%, and 86.7%). For our preferred measure (binary token HIPAA), our submissions achieved superior results (f-measures: 93.7%, 93.6%, and 93%). With those encouraging results, we gained the confidence to improve and use the tool for the real de-identification task at the UAB Informatics Institute. Copyright © 2017 Elsevier Inc. All rights reserved.
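
    A rule-based pass of the kind mentioned above can be as simple as regular-expression substitutions. The two patterns below are illustrative toys, not the UAB tool's actual rules:

```python
import re

# Illustrative rules: mask US-style dates and phone numbers in a note.
PHI_RULES = [
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),
    (re.compile(r"\b\d{3}-\d{3}-\d{4}\b"), "[PHONE]"),
]

def deidentify(text):
    """Apply each masking rule in turn to the input text."""
    for pattern, tag in PHI_RULES:
        text = pattern.sub(tag, text)
    return text

note = "Seen on 03/14/2016, call 205-555-0147 to follow up."
print(deidentify(note))  # -> Seen on [DATE], call [PHONE] to follow up.
```

    Real systems layer many such rules with dictionaries and machine-learned classifiers, and must handle ambiguity (e.g. person names that are also common words).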



    Hu Ng; Hau-Lee Ton; Wooi-Haw Tan; Timothy Tzen-Vun Yap; Pei-Fen Chong; Junaidi Abdullah


    This paper presents a human identification system based on automatically extracted gait features. The proposed approach consists of three parts: extraction of human gait features from an enhanced human silhouette, a smoothing process on the extracted gait features, and classification by three classification techniques: fuzzy k-nearest neighbour, linear discriminant analysis and linear support vector machine. The gait features extracted are height, width, crotch height, step-size of the human silhouett...

  3. Automatic semi-continuous accumulation chamber for diffuse gas emissions monitoring in volcanic and non-volcanic areas (United States)

    Lelli, Matteo; Raco, Brunella; Norelli, Francesco; Virgili, Giorgio; Continanza, Davide


    For several decades the accumulation chamber method has been used intensively in monitoring diffuse gas emissions in volcanic areas. Although some improvements have been made in the sensitivity and reproducibility of the detectors, measuring the temporal variation of gas emissions usually requires expensive and bulky equipment. The unit described in this work is a low-cost, easy-to-install-and-manage instrument that will make the creation of low-cost monitoring networks possible. The non-dispersive infrared (NDIR) detector used has a concentration range of 0-5% CO2, but substitution with another detector (range 0-5000 ppm) is possible and very easy. The power supply unit has a 12 V, 7 Ah battery, which is recharged by a 35 W solar panel (equipped with a charge regulator). The control unit contains a custom-programmed CPU, and remote transmission is assured by a GPRS modem. The chamber is actuated by the datalogger unit, using a linear actuator to move between the closed position (sampling) and the open position (idle). A probe for the measurement of soil temperature, soil electrical conductivity, soil volumetric water content, air pressure and air temperature is assembled on the device, which is already arranged for the connection of other external sensors, including an automatic weather station. The automatic station has been field-tested on the island of Lipari (Sicily, Italy) over a period of three months, performing CO2 flux measurements (and also recording weather parameters) every hour. The possibility of measuring, in semi-continuous mode and at the same time, the gas fluxes from soil and many external parameters aids time-series analysis aimed at distinguishing gas flux anomalies due to variations in the deep system (e.g. the onset of volcanic crises) from those triggered by external conditions.
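
    In the accumulation chamber method the CO2 flux is derived from the slope of the concentration rise while the chamber is closed, scaled by the chamber volume and footprint. A sketch with invented readings, leaving the result in chamber units (ppm·m/s; conversion to mass flux would need pressure and temperature):

```python
def co2_flux(times_s, conc_ppm, volume_m3, area_m2):
    """Least-squares slope of chamber CO2 concentration vs. time,
    scaled by volume/area to a flux in ppm*m/s."""
    n = len(times_s)
    mt = sum(times_s) / n
    mc = sum(conc_ppm) / n
    slope = (sum((t - mt) * (c - mc) for t, c in zip(times_s, conc_ppm))
             / sum((t - mt) ** 2 for t in times_s))
    return slope * volume_m3 / area_m2

t = [0, 30, 60, 90, 120]       # s, hypothetical sampling times
c = [420, 435, 450, 465, 480]  # ppm, idealised linear rise
flux = co2_flux(t, c, volume_m3=0.006, area_m2=0.03)
print(flux)
```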

  4. Radio frequency identification and its application in e-commerce


    Bahr, Witold; Price, Brian J


    This chapter presents Radio Frequency Identification (RFID), which is one of the Automatic Identification and Data Capture (AIDC) technologies (Wamba and Boeck, 2008), and discusses the application of RFID in E-Commerce. Firstly, RFID is defined and the tag and reader components of the RFID system are explained. Then the historical context of RFID is briefly discussed. Next, RFID is contrasted with other AIDC technologies, especially the barcodes which are commonly applied in E-Commerce. Las...

  5. Identification device (United States)

    Lin, Jian-Shian; Su, Chih-Chieh; Chou, Ta-Hsin; Wu, Mount-Learn; Lai, Chieh-Lung; Hsu, Che-Lung; Lan, Hsiao-Chin; Huang, Hung-I.; Liu, Yung-Chih; Tu, Zong-Ru; Lee, Chien-Chieh; Chang, Jenq-Yang


    The identification device disclosed in the present invention comprises a carrier and a plurality of pseudo-pixels, wherein each of the pseudo-pixels is formed on the carrier and further comprises at least a light grating composed of a plurality of light grids. In a preferred aspect, each of the light grids is formed on the carrier while spaced from the others by an interval ranging between 50 nm and 900 nm. As the aforesaid identification device presents specific colors and patterns when viewed by the naked eye at a specific viewing angle, it is well suited to security and anti-counterfeit applications, since the specific colors and patterns become invisible when the device is viewed away from that angle.

  6. 46 CFR 63.25-1 - Small automatic auxiliary boilers. (United States)


    ... 46 Shipping 2 2010-10-01 2010-10-01 false Small automatic auxiliary boilers. 63.25-1 Section 63.25... AUXILIARY BOILERS Requirements for Specific Types of Automatic Auxiliary Boilers § 63.25-1 Small automatic auxiliary boilers. Small automatic auxiliary boilers defined as having heat-input ratings of 400,000 Btu/hr...

  7. 30 CFR 77.314 - Automatic temperature control instruments. (United States)


    ... UNDERGROUND COAL MINES Thermal Dryers § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic temperature control instruments. 77...

  8. Identification automatique des diatomées de la Merja fouarate : Une ...

    African Journals Online (AJOL)

    However, research continues, relying heavily on advances in artificial intelligence. Keywords: diatoms, present-day, automatic identification, digital image processing, mathematical morphology, Fouarate, Kenitra, Morocco. Automatic identification of Fouarate Merja diatoms: An alternative to ...

  9. Identification automatique des diatomées de la Merja fouarate : Une ...

    African Journals Online (AJOL)


    30 Sept. 2015 ... Keywords: diatoms, automatic identification, present-day image processing, mathematical morphology, Fouarate, Kenitra, Morocco. INTRODUCTION. In this study, we present an alternative to manual techniques for the determination of diatoms. The problem of the identification and classification ...

  10. Automatic learning-based beam angle selection for thoracic IMRT. (United States)

    Amit, Guy; Purdie, Thomas G; Levinshtein, Alex; Hope, Andrew J; Lindsay, Patricia; Marshall, Andrea; Jaffray, David A; Pekar, Vladimir


    The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose-volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner's clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2). The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume coverage and organ at risk
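
    One simple way to turn learned per-angle scores into a beam set, in the spirit of the optimization scheme described, is a greedy pick with a minimum angular separation. The scores and the separation threshold below are invented, not the paper's learned values:

```python
def select_beams(scored_angles, n_beams, min_sep_deg=20):
    """Greedily pick the n highest-scoring beam angles, skipping any
    candidate closer than min_sep_deg (circular distance, degrees)
    to an already chosen beam."""
    chosen = []
    for angle, _ in sorted(scored_angles, key=lambda p: -p[1]):
        if all(min(abs(angle - a), 360 - abs(angle - a)) >= min_sep_deg
               for a in chosen):
            chosen.append(angle)
        if len(chosen) == n_beams:
            break
    return sorted(chosen)

# Hypothetical per-angle scores (e.g. from a random-forest regressor):
scores = [(0, 0.9), (10, 0.85), (40, 0.8), (180, 0.7), (200, 0.6), (350, 0.95)]
print(select_beams(scores, 4))  # -> [10, 40, 180, 350]
```

    The paper's scheme additionally adjusts angles using learned interbeam dependencies; the greedy pass here only illustrates the selection half of the problem.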

  11. Automatic Power Line Inspection Using UAV Images

    Directory of Open Access Journals (Sweden)

    Yong Zhang


    Power line inspection ensures the safe operation of a power transmission grid. Using unmanned aerial vehicle (UAV) images of power line corridors is an effective way to carry out these vital inspections. In this paper, we propose an automatic inspection method for power lines using UAV images. This method, known as the power line automatic measurement method based on epipolar constraints (PLAMEC), acquires the spatial position of the power lines. Then, the semi patch matching based on epipolar constraints (SPMEC) dense matching method is applied to automatically extract dense point clouds within the power line corridor. Obstacles can then be automatically detected by calculating the spatial distance between a power line and the point cloud representing the ground. Experimental results show that PLAMEC automatically measures power lines effectively, with a measurement accuracy consistent with that of manual stereo measurements. The height root-mean-square (RMS) error of the point cloud was 0.233 m, and the RMS error of the power line was 0.205 m. In addition, we verified the detected obstacles in the field and measured the distance between the canopy and power line using a laser range finder. The results show that the difference between these two distances was within ±0.5 m.
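
    Obstacle detection by spatial distance between a power line and the corridor point cloud can be sketched as a point-to-segment clearance test. The coordinates and the 5 m clearance below are invented for illustration:

```python
import math

def point_segment_distance(p, a, b):
    """3-D distance from point p to the segment a-b."""
    ab = [b[i] - a[i] for i in range(3)]
    ap = [p[i] - a[i] for i in range(3)]
    denom = sum(c * c for c in ab)
    t = max(0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / denom))
    closest = [a[i] + t * ab[i] for i in range(3)]
    return math.dist(p, closest)

def obstacles(cloud, a, b, clearance=5.0):
    """Points of the corridor cloud closer to the line than the clearance."""
    return [p for p in cloud if point_segment_distance(p, a, b) < clearance]

line_a, line_b = (0, 0, 20), (100, 0, 20)        # a level 100 m span at 20 m
cloud = [(50, 0, 17), (50, 0, 3), (120, 0, 20)]  # canopy, ground, off-span
print(obstacles(cloud, line_a, line_b))  # -> [(50, 0, 17)]
```

    A real power line sags as a catenary rather than a straight segment, so production code would test clearance against the fitted curve, typically per span.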

  12. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.


    Over the past 3 years, with support from the US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that due to nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m3/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m3 of air, about 100-fold better than reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers
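
    Discrimination by activity ratios can be sketched as a threshold test on, for example, the 135Xe/133Xe ratio, which is high for a fresh, prompt release and low for long reactor irradiation. The activities and the threshold below are illustrative, not operational CTBT criteria:

```python
def source_screen(xe135, xe133, threshold=5.0):
    """Screen a sample by the 135Xe/133Xe activity ratio.
    Inputs are activities in the same units; the threshold is invented."""
    ratio = xe135 / xe133
    verdict = "explosion-like" if ratio > threshold else "reactor-like"
    return ratio, verdict

ratio, verdict = source_screen(xe135=1200.0, xe133=100.0)
print(round(ratio, 1), verdict)  # -> 12.0 explosion-like
```

    Operational screening combines several isotope ratios (e.g. involving the metastable states) rather than a single cut.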

  13. Musical Instrument Identification using Multiscale Mel-frequency Cepstral Coefficients

    DEFF Research Database (Denmark)

    Sturm, Bob L.; Morvidone, Marcela; Daudet, Laurent


    We investigate the benefits of evaluating Mel-frequency cepstral coefficients (MFCCs) over several time scales in the context of automatic musical instrument identification for signals that are monophonic but derived from real musical settings. We define several sets of features derived from MFCCs...
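
    MFCC filterbanks are laid out on the mel scale; a common conversion formula (one of several variants in use) and its inverse are:

```python
import math

def hz_to_mel(f_hz):
    """Common mel-scale formula used when placing MFCC filterbanks."""
    return 2595.0 * math.log10(1.0 + f_hz / 700.0)

def mel_to_hz(m):
    """Inverse of hz_to_mel."""
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

# Round-trip check on concert A:
print(round(mel_to_hz(hz_to_mel(440.0)), 6))  # -> 440.0
```

    Evaluating MFCCs over several window lengths, as in the paper, changes the time-frequency trade-off of the analysis, not this frequency warping itself.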

  14. A review on modeling, identification and servo control of robotic ...

    African Journals Online (AJOL)

    A robotic excavator is a hydraulically actuated 4-DOF manipulator mounted on a mobile chassis which performs automatic excavation. This article reviews the modeling, identification, and low-level control of robotic excavators. First, modeling of the nonlinear hydraulic dynamics, coupled manipulator dynamics, and soil-tool ...

  15. Spoken Indian language identification: a review of features and ...

    Indian Academy of Sciences (India)



    Apr 12, 2018 ... sound of that language. These language-specific properties can be exploited to identify a spoken language reliably. Automatic language identification has emerged as a prominent research area in. Indian languages processing. People from different regions of India speak around 800 different languages.

  16. Automatic Defect Detection of Fasteners on the Catenary Support Device Using Deep Convolutional Neural Network

    NARCIS (Netherlands)

    Chen, Junwen; Liu, Zhigang; Wang, H.; Nunez Vicencio, Alfredo; Han, Zhiwei


    The excitation and vibration triggered by the long-term operation of railway vehicles inevitably result in defective states of catenary support devices. With the massive construction of high-speed electrified railways, automatic defect detection of diverse and plentiful fasteners on the catenary

  17. Automatic Detection of Childhood Absence Epilepsy Seizures: Toward a Monitoring Device

    DEFF Research Database (Denmark)

    Duun-Henriksen, Jonas; Madsen, Rasmus E.; Remvig, Line S.


    long-term prognoses, balancing antiepileptic effects and side effects. The electroencephalographic appearance of paroxysms in childhood absence epilepsy is fairly homogeneous, making it feasible to develop patient-independent automatic detection. We implemented a state-of-the-art algorithm...

  18. Is Mobile-Assisted Language Learning Really Useful? An Examination of Recall Automatization and Learner Autonomy (United States)

    Sato, Takeshi; Murase, Fumiko; Burden, Tyler


    The aim of this study is to examine the advantages of Mobile-Assisted Language Learning (MALL), especially vocabulary learning of English as a foreign or second language (L2) in terms of the two strands: automatization and learner autonomy. Previous studies articulate that technology-enhanced L2 learning could bring about some positive effects.…

  19. Improving the Learning Experience of Business Subjects in Engineering Studies Using Automatic Spreadsheet Correctors (United States)

    Rafart Serra, Maria Assumpció; Bikfalvi, Andrea; Soler Masó, Josep; Prados Carrasco, Ferran; Poch Garcia, Jordi


    The combination of two macro trends, Information and Communication Technologies' (ICT) proliferation and novel approaches in education, has resulted in a series of opportunities with no precedent in terms of content, channels and methods in education. The present contribution aims to describe the experience of using an automatic spreadsheet…

  20. Towards Automatic Improvement of Patient Queries in Health Retrieval Systems

    Directory of Open Access Journals (Sweden)

    Nesrine KSENTINI


    With the adoption of health information technology for clinical health, e-health is becoming usual practice today. Users of this technology find it difficult to seek information relevant to their needs due to the increasing amount of clinical and medical data on the web and their lack of knowledge of medical jargon. In this regard, a method is described to improve users' queries by automatically adding new related terms which appear in the same context as the original query, in order to improve the final search results. This method is based on the assessment of semantic relationships, defined by a proposed statistical method, between a set of terms or keywords. Experiments were performed on the CLEF-eHealth-2015 database and the obtained results show the effectiveness of our proposed method.
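
    The idea of adding terms that appear in the same context as the query can be sketched with raw co-occurrence counts over a toy corpus. The paper's statistical relatedness measure is replaced here by simple counts, and the three "documents" are invented:

```python
from collections import Counter
from itertools import combinations

# Toy corpus standing in for a medical collection.
docs = [
    "myocardial infarction chest pain",
    "chest pain angina heart",
    "heart attack myocardial infarction",
]

cooc = Counter()
for doc in docs:
    for a, b in combinations(sorted(set(doc.split())), 2):
        cooc[(a, b)] += 1

def expand(query, k=2):
    """Add the k terms that most often co-occur with the query term."""
    related = Counter()
    for (a, b), n in cooc.items():
        if a == query:
            related[b] += n
        elif b == query:
            related[a] += n
    return [query] + [t for t, _ in related.most_common(k)]

print(expand("chest"))
```

    A real system would weight the expansion terms (e.g. by a statistical relatedness score) instead of treating them as equal to the original query term.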

  1. Patterns of Work Identification (United States)

    Hebden, J. E.


    The paper examines ways in which the concept of work identification may provide a useful means of delineating the boundaries of occupational groups. Two kinds of identification are discussed: identification with employing organizations and identification with occupation.

  2. Support vector machine for automatic pain recognition (United States)

    Monwar, Md Maruf; Rezaei, Siamak


    Facial expressions are a key index of emotion, and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects the face in each stored video frame using a skin-color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural-network-based and eigenimage-based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.

  3. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E


    Full Text Available Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.
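    To make the shape of such indexing rules concrete, the sketch below applies hand-written rules of the general form an ILP system could infer (trigger terms imply a recommended MeSH heading). The rules, trigger sets and headings here are invented for illustration; the actual ILP inference step is not shown.

```python
# Hypothetical indexing rules: if all trigger terms occur in the text,
# recommend the corresponding MeSH heading (rules invented for illustration).
rules = [
    ({"randomized", "trial"}, "Randomized Controlled Trials as Topic"),
    ({"neoplasm"}, "Neoplasms"),
    ({"mice"}, "Mice"),
]

def recommend(text):
    """Return the MeSH headings whose trigger terms all occur in the text."""
    tokens = set(text.lower().split())
    return [heading for triggers, heading in rules if triggers <= tokens]

recs = recommend("A randomized controlled trial of chemotherapy in mice")
```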

  4. Automaticity in reading isiZulu

    Directory of Open Access Journals (Sweden)

    Sandra Land


    Full Text Available Automaticity, or instant recognition of combinations of letters as units of language, is essential for proficient reading in any language. The article explores automaticity amongst competent adult first-language readers of isiZulu, and the factors associated with it or its opposite - active decoding. Whilst the transparent spelling patterns of isiZulu aid learner readers, some of its orthographical features may militate against their gaining automaticity. These features are agglutination; a conjoined writing system; comparatively long, complex words; and a high rate of recurring strings of particular letters. This implies that optimal strategies for teaching reading in orthographically opaque languages such as English should not be assumed to apply to languages with dissimilar orthographies. Keywords: Orthography; Eye movement; Reading; isiZulu

  5. Development of fully automatic pipe welding system

    International Nuclear Information System (INIS)

    Tanioka, Shin-ichi; Nakano, Mitsuhiro; Tejima, Akio; Yamada, Minoru; Saito, Tatsuo; Saito, Yoshiyuki; Abe, Rikio


    We have succeeded in developing a fully automatic TIG welding system, CAPTIG, that enables unmanned welding operations from the initial layer to the final finishing layer continuously. This welding system is designed for continuous, multilayered welding of thick, large-diameter fixed pipes in nuclear power plants and large boiler plants, where high-quality welding is demanded. In tests conducted with this welding system, several hours of continuous unmanned welding confirmed that excellent beads are formed, that good results are obtained in radiographic inspection, and that reliable high-quality welding is possible. The system incorporates a microcomputer for fully automatic control, featuring seam tracking, automatic control of the wire feed position, and self-checking of the inter-pass temperature, cooling water temperature and wire reserve. (author)

  6. Automatic control variac system for electronic accelerator

    International Nuclear Information System (INIS)

    Zhang Shuocheng; Wang Dan; Jing Lan; Qiao Weimin; Ma Yunhai


    An automatic control variac system was designed to satisfy the control requirements of the electronic accelerator developed by the Institute. The design and operational principles, the structure of the system, and the software for the industrial PC and the microcontroller unit are described. The interfaces of the control module are RS232 and RS485. A fiber optic interface (FOC) can be set up if an industrial FOC network is necessary, which extends the field of application and improves the system's communication. Practice has shown that the system can adjust the variac output voltage automatically and assure accurate, automatic control of the electronic accelerator. The system is designed in accordance with general design principles and possesses merits such as easy operation and maintenance, good expansibility, and low cost, so it could also be used in other industrial branches. (authors)

  7. Automaticity in reading isiZulu

    Directory of Open Access Journals (Sweden)

    Sandra Land


    Full Text Available Automaticity, or instant recognition of combinations of letters as units of language, is essential for proficient reading in any language. The article explores automaticity amongst competent adult first-language readers of isiZulu, and the factors associated with it or its opposite - active decoding. Whilst the transparent spelling patterns of isiZulu aid learner readers, some of its orthographical features may militate against their gaining automaticity. These features are agglutination; a conjoined writing system; comparatively long, complex words; and a high rate of recurring strings of particular letters. This implies that optimal strategies for teaching reading in orthographically opaque languages such as English should not be assumed to apply to languages with dissimilar orthographies. Keywords: Orthography; Eye movement; Reading; isiZulu

  8. Automatic inference of indexing rules for MEDLINE. (United States)

    Névéol, Aurélie; Shooshan, Sonya E; Claveau, Vincent


    Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  9. Oocytes Polar Body Detection for Automatic Enucleation

    Directory of Open Access Journals (Sweden)

    Di Chen


    Full Text Available Enucleation is a crucial step in cloning. To achieve automatic blind enucleation, the polar body of the oocyte must be detected automatically. Conventional polar body detection approaches have a low success rate or low efficiency. In this paper we propose a polar body detection method based on machine learning. On one hand, an improved Histogram of Oriented Gradients (HOG) algorithm is employed to extract features from polar body images, which increases the success rate. On the other hand, a position prediction method is put forward to narrow the search range for the polar body, which improves efficiency. Experimental results show a success rate of 96% for various types of polar bodies. Furthermore, the method was applied to an enucleation experiment and improved the degree of automation of enucleation.
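    A much-simplified HOG descriptor can be sketched in a few lines: per-cell histograms of gradient orientation, weighted by gradient magnitude. This is a generic illustration of the HOG idea, not the paper's improved variant; the cell size, bin count and test image are assumptions.

```python
import numpy as np

def hog_features(img, n_bins=9, cell=8):
    """Simplified HOG: per-cell histograms of gradient orientation, weighted by magnitude."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation in [0, 180)
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=n_bins, range=(0, 180), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))  # per-cell normalisation
    return np.concatenate(feats)

# A 16x16 patch with a vertical edge: the horizontal gradients concentrate
# all histogram mass in the 0-degree orientation bin of each edge cell.
img = np.zeros((16, 16))
img[:, 8:] = 1.0
f = hog_features(img)
```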

  10. Automatic emotional expression analysis from eye area (United States)

    Akkoç, Betül; Arslan, Ahmet


    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed with artificial intelligence techniques. In the experimental studies, the 6 universal emotions, consisting of expressions of happiness, sadness, surprise, disgust, anger and fear, were classified at a success rate of 84% using artificial neural networks.
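    A minimal sketch of the wavelet step, assuming a single-level 2-D Haar transform (the paper does not state which wavelet it uses) and subband energies as the classifier inputs; the 8x8 "eye area" patch is random stand-in data.

```python
import numpy as np

def haar_dwt2(block):
    """One level of the 2-D Haar wavelet transform: returns LL, LH, HL, HH subbands."""
    a = block[0::2, 0::2]  # top-left of each 2x2 neighbourhood
    b = block[0::2, 1::2]
    c = block[1::2, 0::2]
    d = block[1::2, 1::2]
    ll = (a + b + c + d) / 4.0  # approximation
    lh = (a - b + c - d) / 4.0  # horizontal detail
    hl = (a + b - c - d) / 4.0  # vertical detail
    hh = (a - b - c + d) / 4.0  # diagonal detail
    return ll, lh, hl, hh

# Hypothetical 8x8 "eye area" patch; subband energies serve as classifier inputs.
rng = np.random.default_rng(1)
patch = rng.random((8, 8))
ll, lh, hl, hh = haar_dwt2(patch)
features = np.array([np.sum(s ** 2) for s in (ll, lh, hl, hh)])
```

    The four energies would then be fed to the artificial neural network in place of raw pixels.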

  11. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh


    and specialization of classes (inheritance) are considered different abstractions. We present a new programming language, Lapis, that unifies inheritance and program specialization at the conceptual, syntactic, and semantic levels. This paper presents the initial development of Lapis, which uses inheritance...... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented......

  12. Automatic control system in the reactor peggy

    International Nuclear Information System (INIS)

    Bertrand, J.; Mourchon, R.; Da Costa, D.; Desandre-Navarre, Ch.


    The equipment makes it possible for the reactor to attain a given power automatically and for the power to be maintained at this level. The principle of operation consists in changing from one power to another, at constant period, by means of a programmer that transforms a power-step request into a voltage variation which is linear in time and represents the logarithm of the required power. The actual power is compared continuously with the required power. Stabilization occurs automatically as soon as the difference between the reactor power and the required power falls to a few per cent. (authors) [fr
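    The constant-period ramp can be sketched numerically: holding the reactor period constant means the demanded power grows exponentially, so a signal that is linear in time in log-power realizes the programmer described above. All numbers below are illustrative, not values from the PEGGY reactor.

```python
import math

# Constant-period power change: the programmer ramps log(P) linearly in time,
# so demanded power grows exponentially from P0 to P1 (all values illustrative).
P0, P1 = 1.0, 100.0   # initial and requested power, arbitrary units
period = 30.0         # reactor period in seconds

ramp_time = period * math.log(P1 / P0)  # time needed to reach P1 at this period

def demanded_power(t):
    """Demanded power at time t during the ramp, clamped at the requested level."""
    return min(P1, P0 * math.exp(t / period))

def stabilized(t, tol=0.03):
    """Switch to steady control once within a few per cent of the request."""
    return abs(demanded_power(t) - P1) / P1 <= tol
```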

  13. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan


    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically......

  14. Some results of automatic processing of images

    International Nuclear Information System (INIS)

    Golenishchev, I.A.; Gracheva, T.N.; Khardikov, S.V.


    The problems of automatic deciphering of radiographic pictures are considered, the purpose being to draw a conclusion about the quality of the inspected product on the basis of the product defect images in the picture. The methods of defect image recognition are listed, and the algorithms and the class features of defects are described. The results of deciphering a small radiographic picture by means of the ''Minsk-22'' computer are presented. It is established that the sensitivity of the automatic deciphering method is close to that obtained with visual deciphering

  15. Automatic speech recognition a deep learning approach

    CERN Document Server

    Yu, Dong


    This book summarizes recent advances in the field of automatic speech recognition, with a focus on discriminative and hierarchical models. It is the first automatic speech recognition book to include comprehensive coverage of recent developments such as conditional random fields and deep learning techniques. It presents insights and the theoretical foundations of a series of recent models, such as the conditional random field, semi-Markov and hidden conditional random fields, deep neural networks, deep belief networks, and deep stacking models for sequential learning. It also discusses practical considerations of using these models in both acoustic and language modeling for continuous speech recognition.

  16. Semi-automatic Data Integration using Karma (United States)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.


    Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping using a community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-intensive process, and state-of-the-art artificial intelligence systems are unable to fully automate it using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been applied in multiple domains over the last five years, including geospatial, biological, humanities and bibliographic applications. Karma allows a user to import their own ontology and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of

  17. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    International Nuclear Information System (INIS)

    Benkirane, A.; Auger, G.; Chbihi, A.; Bloyet, D.; Plagnol, E.


    This paper presents an original approach to solving an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques to extract the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from the underlying physics (and adapted to the image segmentation problem). This data fusion is used at several stages of the segmentation process. The approach yields interesting results in terms of segmentation performance, even in very noisy cases. Satisfactory classification results are obtained in cases where more ''classical'' automatic data classification methods fail. (authors). 25 refs., 14 figs., 1 append

  18. Automatic switching between noise classification and speech enhancement for hearing aid devices. (United States)

    Saki, Fatemeh; Kehtarnavaz, Nasser


    This paper presents a voice activity detector (VAD) for automatic switching between a noise classifier and a speech enhancer as part of the signal processing pipeline of hearing aid devices. The developed VAD consists of a computationally efficient feature extractor and a random forest classifier. Previously used signal features as well as two newly introduced signal features are extracted and fed into the classifier to perform automatic switching. This switching approach is compared to two popular VADs. The results obtained indicate the introduced approach outperforms these existing approaches in terms of both detection rate and processing time.
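    The switching logic can be sketched as follows, with a random forest trained on synthetic frame features (stand-ins for the paper's signal features; the cluster means, feature dimension and routing names are assumptions):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

# Synthetic frame features: "speech" frames have larger feature values than
# noise-only frames (illustrative stand-ins for the paper's features).
noise = rng.normal(0.0, 1.0, size=(200, 4))
speech = rng.normal(3.0, 1.0, size=(200, 4))
X = np.vstack([noise, speech])
y = np.array([0] * 200 + [1] * 200)  # 0 = noise, 1 = speech

# The VAD itself: a random forest over per-frame feature vectors.
vad = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def process_frame(frame):
    """Route each frame: speech enhancer when speech is detected, else noise classifier."""
    return "speech_enhancer" if vad.predict(frame[None, :])[0] == 1 else "noise_classifier"

route = process_frame(rng.normal(3.0, 1.0, size=4))
```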

  19. Automatic construction of a recurrent neural network based classifier for vehicle passage detection (United States)

    Burnaev, Evgeny; Koptelov, Ivan; Novikov, German; Khanipov, Timur


    Recurrent Neural Networks (RNNs) are extensively used for time-series modeling and prediction. We propose an approach for the automatic construction of a binary classifier based on Long Short-Term Memory RNNs (LSTM-RNNs) for detecting the passage of a vehicle through a checkpoint. As input to the classifier we use multidimensional signals from various sensors installed on the checkpoint. The obtained results demonstrate that the previous approach of handcrafting a classifier, consisting of a set of deterministic rules, can be successfully replaced by automatic RNN training on appropriately labelled data.


    Directory of Open Access Journals (Sweden)



    Full Text Available In radio communication systems, signal modulation format recognition is a significant characteristic used in radio signal monitoring and identification. Over the past few decades, modulation formats have become increasingly complex, which has led to the problem of how to accurately and promptly recognize a modulation format. In addressing these challenges, the development of automatic modulation recognition systems that can classify a radio signal's modulation format has received worldwide attention. Decision-theoretic methods and pattern recognition solutions are the two typical automatic modulation recognition approaches. While decision-theoretic approaches use probabilistic or likelihood functions, pattern recognition uses feature-based methods. This study applies the pattern recognition approach based on statistical parameters, using an artificial neural network to classify five different digital modulation formats. The paper deals with automatic recognition of both inter- and intra-class digitally modulated signals, in contrast to most of the existing algorithms in the literature, which deal with either inter-class or intra-class modulation format recognition. The results of this study show that accurate and prompt modulation recognition is possible beyond the lower bound of 5 dB commonly claimed in the literature. The other significant contribution of this paper is the use of the Python programming language, which reduces the computational complexity that characterizes other automatic modulation recognition classifiers developed using the conventional MATLAB neural network toolbox.
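    To illustrate the kind of statistical parameter such classifiers feed to a neural network, the sketch below computes a fourth-order statistic on noiseless symbol streams for two formats. The choice of the C40 statistic and of BPSK/QPSK is an illustrative assumption (the paper classifies five formats, with noise); in practice these features would be inputs to the network rather than a classifier by themselves.

```python
import numpy as np

rng = np.random.default_rng(3)

def symbols(fmt, n=4000):
    """Generate unit-power symbol streams for two illustrative digital formats."""
    if fmt == "BPSK":
        return rng.choice([-1.0, 1.0], size=n).astype(complex)
    if fmt == "QPSK":
        return rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=n) / np.sqrt(2)
    raise ValueError(fmt)

def c40(x):
    """Fourth-order statistic C40 = E[x^4] - 3*E[x^2]^2 for a zero-mean, unit-power signal."""
    return np.mean(x ** 4) - 3 * np.mean(x ** 2) ** 2

# |C40| is about 2 for BPSK and about 1 for QPSK, so it separates the formats.
feats = {fmt: abs(c40(symbols(fmt))) for fmt in ("BPSK", "QPSK")}
```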

  1. Sleep facilitates long-term face adaptation


    Ditye, Thomas; Javadi, Amir Homayoun; Carbon, Claus-Christian; Walsh, Vincent


    Adaptation is an automatic neural mechanism supporting the optimization of visual processing on the basis of previous experiences. While the short-term effects of adaptation on behaviour and physiology have been studied extensively, perceptual long-term changes associated with adaptation are still poorly understood. Here, we show that the integration of adaptation-dependent long-term shifts in neural function is facilitated by sleep. Perceptual shifts induced by adaptation to a distorted imag...

  2. Robust, accurate and fast automatic segmentation of the spinal cord. (United States)

    De Leener, Benjamin; Kadoury, Samuel; Cohen-Adad, Julien


    Spinal cord segmentation provides measures of atrophy and facilitates group analysis via inter-subject correspondence. Automatizing this procedure enables studies with large throughput and minimizes user bias. Although several automatic segmentation methods exist, they are often restricted in terms of image contrast and field-of-view. This paper presents a new automatic segmentation method (PropSeg) optimized for robustness, accuracy and speed. The algorithm is based on the propagation of a deformable model and is divided into three parts: firstly, an initialization step detects the spinal cord position and orientation using a circular Hough transform on multiple axial slices rostral and caudal to the starting plane and builds an initial elliptical tubular mesh. Secondly, a low-resolution deformable model is propagated along the spinal cord. To deal with highly variable contrast levels between the spinal cord and the cerebrospinal fluid, the deformation is coupled with a local contrast-to-noise adaptation at each iteration. Thirdly, a refinement process and a global deformation are applied on the propagated mesh to provide an accurate segmentation of the spinal cord. Validation was performed in 15 healthy subjects and two patients with spinal cord injury, using T1- and T2-weighted images of the entire spinal cord and on multiecho T2*-weighted images. Our method was compared against manual segmentation and against an active surface method. Results show high precision for all the MR sequences. Dice coefficients were 0.9 for the T1- and T2-weighted cohorts and 0.86 for the T2*-weighted images. The proposed method runs in less than 1min on a normal computer and can be used to quantify morphological features such as cross-sectional area along the whole spinal cord. Copyright © 2014 Elsevier Inc. All rights reserved.
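    The Dice coefficients reported above measure the overlap between the automatic and manual masks; the metric itself is simple to compute. A minimal sketch on toy 1-D masks (the mask values are illustrative, not data from the study):

```python
import numpy as np

def dice(seg, ref):
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    denom = seg.sum() + ref.sum()
    return 2.0 * np.logical_and(seg, ref).sum() / denom if denom else 1.0

# Toy 1-D masks standing in for automatic vs. manual cord segmentations.
auto = np.array([0, 1, 1, 1, 1, 0, 0])
manual = np.array([0, 0, 1, 1, 1, 1, 0])
score = dice(auto, manual)  # overlap 3, sizes 4 + 4 -> 0.75
```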

  3. Automatic segmentation of maxillofacial cysts in cone beam CT images. (United States)

    Abdolali, Fatemeh; Zoroofi, Reza Aghaeizadeh; Otake, Yoshito; Sato, Yoshinobu


    Accurate segmentation of cysts and tumors is an essential step for diagnosis, monitoring and planning of therapeutic intervention. This task is usually done manually; however, manual identification and segmentation are tedious. In this paper, an automatic method based on asymmetry analysis is proposed which is general enough to segment various types of jaw cysts. The key observation underlying this approach is that the normal head and face structure is roughly symmetric with respect to the midsagittal plane: the left and right parts can be divided equally by an axis of symmetry. Cysts and tumors typically disturb this symmetry. The proposed approach consists of three main steps. First, diffusion filtering is used for preprocessing and the symmetry axis is detected. Then, each image is divided into two parts. In the second stage, free form deformation (FFD) is used to correct the slight displacement of corresponding pixels between the left part and a reflected copy of the right part. In the final stage, intensity differences are analyzed and a number of constraints are enforced to remove false positive regions. The proposed method has been validated on 97 Cone Beam Computed Tomography (CBCT) sets containing various jaw cysts, collected from several image acquisition centers. Validation is performed using three similarity indicators (Jaccard index, Dice's coefficient and Hausdorff distance). Mean Dice's coefficients of 0.83, 0.87 and 0.80 are achieved for the Radicular, Dentigerous and KCOT classes, respectively. For most of the experiments, we achieved a high true positive (TP) rate, meaning that a large number of cyst pixels are correctly classified. Quantitative results of automatic segmentation show that the proposed method is more effective than one of the recent methods in the literature. Copyright © 2016 Elsevier Ltd. All rights reserved.
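    The core asymmetry idea can be sketched in a few lines: reflect the image across the midline, subtract, and keep what does not cancel. This toy version assumes a perfectly aligned midline and omits the diffusion filtering and FFD registration steps described above; the image values are invented for illustration.

```python
import numpy as np

# Toy axial "slice": the background is left/right symmetric, while a small
# lesion-like blob on the left breaks the symmetry (values illustrative).
img = np.zeros((8, 8))
img[2:4, 1:3] = 1.0

mirrored = img[:, ::-1]        # reflect across the midline (midsagittal axis)
diff = np.abs(img - mirrored)  # symmetric structure cancels out
candidates = diff > 0.5        # both the blob and its mirror location light up

asym_found = candidates.any()
```

    Note that the difference map flags the lesion position and its mirror image; the constraint-checking stage described above would be needed to keep only the true region.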

  4. Comparison of the techniques for the identification of the epidural space using the loss-of-resistance technique or an automated syringe - results of a randomized double-blind study. (United States)

    Duniec, Larysa; Nowakowski, Piotr; Sieczko, Jakub; Chlebus, Marcin; Łazowski, Tomasz


    The conventional loss-of-resistance technique for identification of the epidural space is highly dependent on the anaesthetist's personal experience and is susceptible to technical errors. Therefore, an alternative, automated technique was devised to overcome the drawbacks of the traditional method. The aim of the study was to compare the efficacy of epidural space identification and the complication rate between two groups - the automated syringe and conventional loss-of-resistance methods. 47 patients scheduled for orthopaedic and gynaecology procedures under epidural anaesthesia were enrolled in the study. The number of attempts, the ease of epidural space identification, the complication rate and the patients' acceptance of the two techniques were evaluated. The majority of blocks were performed by trainee anaesthetists (91.5%). No statistical difference was found between the groups in the number of needle insertion attempts (1 vs. 2), the efficacy of epidural anaesthesia or the number of complications. The ease of epidural space identification, as assessed by the anaesthetist, was significantly better (P = 0.011) in the automated group (87.5% vs. 52.4%). A similar number of patients (92% vs. 94%) in both groups stated that they would accept epidural anaesthesia in the future. The automated and loss-of-resistance methods of epidural space identification proved to be equivalent in terms of efficacy and safety. Since the use of the automated technique may facilitate epidural space identification, it may be regarded as a useful technique for anaesthetists inexperienced in epidural anaesthesia, or for trainees.

  5. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J. S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J. P. K.; Geertzen, J. H. B.


    This paper describes a new automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients, fitted

  6. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance, and allows knee-flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients,

  7. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan


    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... incrementalizing a broad range of static analyses....

  8. Automatic Estimation of Movement Statistics of People

    DEFF Research Database (Denmark)

    Ægidiussen Jensen, Thomas; Rasmussen, Henrik Anker; Moeslund, Thomas B.


    Automatic analysis of how people move about in a particular environment has a number of potential applications. However, no system has so far been able to do detection and tracking robustly. Instead, trajectories are often broken into tracklets. The key idea behind this paper is based around...

  9. Automatic program generation: future of software engineering

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, J.H.


    At this moment software development is still more of an art than an engineering discipline. Each piece of software is lovingly engineered, nurtured, and presented to the world as a tribute to the writer's skill. When will this change? When will the craftsmanship be removed and the programs be turned out like so many automobiles from an assembly line? Sooner or later it will happen: economic necessity will demand it. With the advent of cheap microcomputers and ever more powerful supercomputers doubling in capacity, much more software must be produced. The choices are to double the number of programmers, double the efficiency of each programmer, or find a way to produce the needed software automatically. Producing software automatically is the only logical choice. How will automatic programming come about? Some of the preliminary actions which need to be done, and are being done, are to encourage programmer plagiarism of existing software through public library mechanisms, to produce well-understood packages such as compilers automatically, to develop languages capable of producing software as output, and to learn enough about the whole process of programming to be able to automate it. Clearly, the emphasis must not be on efficiency or size, since ever larger and faster hardware is coming.

  10. Automatic Differentiation and its Program Realization

    Czech Academy of Sciences Publication Activity Database

    Hartman, J.; Lukšan, Ladislav; Zítko, J.


    Roč. 45, č. 5 (2009), s. 865-883 ISSN 0023-5954 R&D Projects: GA AV ČR IAA1030405 Institutional research plan: CEZ:AV0Z10300504 Keywords : automatic differentiation * modeling languages * systems of optimization Subject RIV: BA - General Mathematics Impact factor: 0.445, year: 2009

  11. Development of automatic facilities for ZEPHYR

    International Nuclear Information System (INIS)

    Eder, O.; Lackner, E.; Pohl, F.; Schilling, H.B.


    This concept of remotely controlled facilities for repair and maintenance tasks inside the ZEPHYR vacuum vessel uses a supporting structure to insert various types of mobile automatic devices. The devices are guided by an egg-shaped disc which is part of the supporting structure. Considerations on adapting the guiding disc to the vessel contour are included. (orig.)

  12. AUTORED - the JADE automatic data reduction system

    International Nuclear Information System (INIS)

    Whittaker, J.B.


    The design and implementation of, and experience with, an automatic data processing system for the reduction of data from the JADE experiment at DESY are described. The central elements are a database and a job submitter, which combine powerfully to minimise the need for manual intervention. (author)

  13. Automatic Assessment of 3D Modeling Exams (United States)

    Sanna, A.; Lamberti, F.; Paravati, G.; Demartini, C.


    Computer-based assessment of exams provides teachers and students with two main benefits: fairness and effectiveness in the evaluation process. This paper proposes a fully automatic evaluation tool for the Graphic and Virtual Design (GVD) curriculum at the First School of Architecture of the Politecnico di Torino, Italy. In particular, the tool is…

  14. Automatic alignment of audiobooks in Afrikaans

    CSIR Research Space (South Africa)

    Van Heerden, CJ


    Full Text Available to perform Maximum A Posteriori adaptation on the baseline models. The corresponding value for models trained on the audiobook data is 0.996. An automatic measure of alignment accuracy is also introduced and compared to accuracies measured relative to a gold...

  15. Automatic Smoker Detection from Telephone Speech Signals

    DEFF Research Database (Denmark)

    Alavijeh, Amir Hossein Poorjam; Hesaraki, Soheila; Safavi, Saeid


    This paper proposes an automatic smoking habit detection from spontaneous telephone speech signals. In this method, each utterance is modeled using i-vector and non-negative factor analysis (NFA) frameworks, which yield low-dimensional representation of utterances by applying factor analysis on G...

  16. Reduction of Dutch Sentences for Automatic Subtitling

    NARCIS (Netherlands)

    Tjong Kim Sang, E.F.; Daelemans, W.; Höthker, A.


    We compare machine learning approaches for sentence length reduction for automatic generation of subtitles for deaf and hearing-impaired people with a method which relies on hand-crafted deletion rules. We describe building the necessary resources for this task: a parallel corpus of examples of news

  17. Effective speed management through automatic enforcement.

    NARCIS (Netherlands)

    Oei, H.-l.


    This paper analyses several aspects of the Dutch experience of speed enforcement, and presents the results of some speed management experiments in The Netherlands, using automatic warning of speeders and enforcement of speeding. Traditional approaches to manage speed there have not resulted in

  18. Automatic invariant detection in dynamic web applications

    NARCIS (Netherlands)

    Groeneveld, F.; Mesbah, A.; Van Deursen, A.


    The complexity of modern web applications increases as client-side JavaScript and dynamic DOM programming are used to offer a more interactive web experience. In this paper, we focus on improving the dependability of such applications by automatically inferring invariants from the client-side and

  19. Automatization and familiarity in repeated checking

    NARCIS (Netherlands)

    Dek, E.C.P.|info:eu-repo/dai/nl/313959552; van den Hout, M.A.|info:eu-repo/dai/nl/070445354; Giele, C.L.|info:eu-repo/dai/nl/318754460; Engelhard, I.M.|info:eu-repo/dai/nl/239681533


    Repetitive, compulsive-like checking of an object leads to reductions in memory confidence, vividness, and detail. Experimental research suggests that this is caused by increased familiarity with perceptual characteristics of the stimulus and automatization of the checking procedure (Dek, van den

  20. 32 CFR 2001.30 - Automatic declassification. (United States)


    ... that originated in an agency that has ceased to exist and for which there is no successor agency, the... international agreement that does not permit automatic or unilateral declassification. The declassifying agency... foreign nuclear programs (e.g., intelligence assessments or reports, foreign nuclear program information...