WorldWideScience

Sample records for automatic term identification

  1. Automatic modal identification of cable-supported bridges instrumented with a long-term monitoring system

    Science.gov (United States)

    Ni, Y. Q.; Fan, K. Q.; Zheng, G.; Chan, T. H. T.; Ko, J. M.

    2003-08-01

    An automatic modal identification program is developed for continuous extraction of modal parameters of three cable-supported bridges in Hong Kong which are instrumented with a long-term monitoring system. The program employs the Complex Modal Indication Function (CMIF) algorithm to identify modal properties from continuous ambient vibration measurements in an on-line manner. By using the LabVIEW graphical programming language, the software realizes the algorithm in Virtual Instrument (VI) style. The applicability and implementation issues of the developed software are demonstrated by using one-year measurement data acquired from 67 channels of accelerometers deployed on the cable-stayed Ting Kau Bridge. With the continuously identified results, normal variability of modal vectors caused by varying environmental and operational conditions is observed. Such observation is very helpful for selection of appropriate measured modal vectors for structural health monitoring applications.
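
    The record above names the Complex Mode Indication Function (CMIF) but not its implementation. As a rough illustration of the underlying idea only, the hedged sketch below computes the cross power spectral density matrix of multi-channel ambient acceleration data and takes its singular values at each frequency line; peaks in the first singular value suggest modal frequencies. The channel count, sampling rate and synthetic signal are assumptions, not data from the Ting Kau Bridge.

```python
# Illustrative CMIF-style sketch: singular values of the cross-spectral
# density matrix of ambient acceleration records peak at structural modes.
import numpy as np
from scipy.signal import csd

def cmif(acc, fs, nperseg=2048):
    """acc: (n_samples, n_channels) ambient acceleration records."""
    n_ch = acc.shape[1]
    f, _ = csd(acc[:, 0], acc[:, 0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(acc[:, i], acc[:, j], fs=fs, nperseg=nperseg)
    sv = np.linalg.svd(G, compute_uv=False)      # CMIF curves, shape (n_freq, n_ch)
    return f, sv

if __name__ == "__main__":
    fs = 50.0                                    # assumed sampling rate (Hz)
    t = np.arange(0, 600, 1 / fs)
    rng = np.random.default_rng(0)
    # Two synthetic "modes" at 0.8 Hz and 1.7 Hz buried in noise, three channels.
    acc = np.stack([np.sin(2 * np.pi * 0.8 * t) + 0.5 * np.sin(2 * np.pi * 1.7 * t + p)
                    + 0.3 * rng.standard_normal(t.size) for p in (0.0, 1.0, 2.0)], axis=1)
    f, sv = cmif(acc, fs)
    print("candidate modal frequencies:", f[np.argsort(sv[:, 0])[-5:]])
```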

  2. Automatic identification in mining

    Energy Technology Data Exchange (ETDEWEB)

    Puckett, D; Patrick, C [Mine Computers and Electronics Inc., Morehead, KY (United States)

    1998-06-01

    The feasibility of monitoring the locations and vital statistics of equipment and personnel in surface and underground mining operations has increased with advancements in radio frequency identification (RFID) technology. This paper addresses the use of RFID technology, which is relatively new to the mining industry, to track surface equipment in mine pits, loading points and processing facilities. Specific applications are discussed, including both simplified and complex truck tracking systems and an automatic pit ticket system. This paper concludes with a discussion of the future possibilities of using RFID technology in mining including monitoring heart and respiration rates, body temperatures and exertion levels; monitoring repetitious movements for the study of work habits; and logging air quality via personnel sensors. 10 refs., 5 figs.

  3. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  4. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  5. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  6. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  7. 2011 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  8. Automatic identification of mass spectra

    International Nuclear Information System (INIS)

    Drabloes, F.

    1992-01-01

    Several approaches to preprocessing and comparison of low resolution mass spectra have been evaluated by various test methods related to library search. It is shown that there is a clear correlation between the nature of any contamination of a spectrum, the basic principle of the transformation or distance measure, and the performance of the identification system. The identification of functionality from low resolution spectra has also been evaluated using several classification methods. It is shown that there is an upper limit to the success of this approach, but also that this can be improved significantly by using a very limited amount of additional information. 10 refs

  9. Automatic identification of tuberculosis mycobacterium

    Directory of Open Access Journals (Sweden)

    Cicero Ferreira Fernandes Costa Filho

    Introduction: According to the Global TB control report of 2013, “Tuberculosis (TB) remains a major global health problem. In 2012, an estimated 8.6 million people developed TB and 1.3 million died from the disease.” Two main sputum smear microscopy techniques are used for TB diagnosis: fluorescence microscopy and conventional microscopy. Fluorescence microscopy is a more expensive diagnostic method because of the high cost of the microscopy unit and its maintenance; conventional microscopy is therefore more appropriate for use in developing countries. Methods: This paper presents a new method for detecting tuberculosis bacilli in conventional sputum smear microscopy. The method consists of two main steps, bacillus segmentation and post-processing. In the first step, the scalar selection technique was used to select input variables for the segmentation classifiers from four color spaces. Thirty features were used, including subtractions of the color components of different color spaces. In the post-processing step, three filters were used to separate bacilli from artifacts: a size filter, a geometric filter and a rule-based filter that uses the components of the RGB color space. Results: In bacillus identification, an overall sensitivity of 96.80% and an error rate of 3.38% were obtained. An image database of 120 sputum-smear microscopy images from 12 patients, with objects marked as bacillus, agglomerated bacilli and artifact, was generated and is now available online. Conclusions: The best results were obtained with a support vector machine for bacillus segmentation combined with the application of the three post-processing filters.
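
    A hedged sketch of the post-processing stage described above (a size filter, a geometric filter and a rule-based RGB filter) applied to a binary mask of candidate bacilli. All threshold values are illustrative assumptions, not the authors' settings.

```python
# Post-processing sketch: keep only mask regions whose size, elongation and
# mean RGB colour are consistent with stained bacilli. Thresholds are assumed.
import numpy as np
from skimage.measure import label, regionprops

def filter_candidates(mask, rgb_image,
                      min_area=20, max_area=400,        # size filter (pixels)
                      min_eccentricity=0.7,             # geometric filter (rod-like)
                      min_red_excess=10):               # rule-based RGB filter
    labeled = label(mask)
    kept = np.zeros_like(mask, dtype=bool)
    for region in regionprops(labeled):
        if not (min_area <= region.area <= max_area):
            continue                                    # reject: wrong size
        if region.eccentricity < min_eccentricity:
            continue                                    # reject: not elongated
        rr, cc = region.coords[:, 0], region.coords[:, 1]
        r, g, b = (rgb_image[rr, cc, k].mean() for k in range(3))
        if r - g < min_red_excess:                      # carbol-fuchsin stain is reddish
            continue                                    # reject: likely artifact colour
        kept[labeled == region.label] = True
    return kept
```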

  10. Automatically identifying gene/protein terms in MEDLINE abstracts.

    Science.gov (United States)

    Yu, Hong; Hatzivassiloglou, Vasileios; Rzhetsky, Andrey; Wilbur, W John

    2002-01-01

    Natural language processing (NLP) techniques are used to extract information automatically from computer-readable literature. In biology, the identification of terms corresponding to biological substances (e.g., genes and proteins) is a necessary step that precedes the application of other NLP systems that extract biological information (e.g., protein-protein interactions, gene regulation events, and biochemical pathways). We have developed GPmarkup (for "gene/protein-full name mark up"), a software system that automatically identifies gene/protein terms (i.e., symbols or full names) in MEDLINE abstracts. As part of the markup process, we also automatically generated a knowledge source of paired gene/protein symbols and full names (e.g., LARD for lymphocyte associated receptor of death) from MEDLINE. We found that many of the pairs in our knowledge source do not appear in the current GenBank database, so our methods may also be used for automatic lexicon generation. GPmarkup has 73% recall and 93% precision in identifying and marking up gene/protein terms in MEDLINE abstracts. A random sample of gene/protein symbols and full names and a sample set of marked up abstracts can be viewed at http://www.cpmc.columbia.edu/homepages/yuh9001/GPmarkup/. Contact: hy52@columbia.edu. Voice: 212-939-7028; fax: 212-666-0140.

  11. Estimating spatial travel times using automatic vehicle identification data

    Science.gov (United States)

    2001-01-01

    Prepared ca. 2001. The paper describes an algorithm that was developed for estimating reliable and accurate average roadway link travel times using Automatic Vehicle Identification (AVI) data. The algorithm presented is unique in two aspects. First, ...
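
    The abstract above is truncated and does not give the algorithm itself. As a baseline illustration only, the hedged sketch below estimates link travel times from AVI records by matching tag reads at an upstream and a downstream station, screening outliers, and averaging per interval. Column names and thresholds are assumptions.

```python
# Baseline AVI travel-time sketch (not the paper's algorithm): match tag IDs
# between two readers, drop implausible observations, average per 5 minutes.
import pandas as pd

def link_travel_times(upstream: pd.DataFrame, downstream: pd.DataFrame,
                      max_minutes=60, interval="5min"):
    """Both frames need columns: 'tag_id', 'timestamp' (datetime64)."""
    m = upstream.merge(downstream, on="tag_id", suffixes=("_up", "_down"))
    m["tt"] = (m["timestamp_down"] - m["timestamp_up"]).dt.total_seconds()
    m = m[(m["tt"] > 0) & (m["tt"] < max_minutes * 60)]        # drop re-reads / stops
    q1, q3 = m["tt"].quantile([0.25, 0.75])                    # simple IQR outlier screen
    m = m[m["tt"].between(q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1))]
    return m.set_index("timestamp_up")["tt"].resample(interval).mean()

up = pd.DataFrame({"tag_id": ["A1", "B2"],
                   "timestamp": pd.to_datetime(["2001-05-01 08:00:00", "2001-05-01 08:01:00"])})
down = pd.DataFrame({"tag_id": ["A1", "B2"],
                     "timestamp": pd.to_datetime(["2001-05-01 08:04:10", "2001-05-01 08:05:30"])})
print(link_travel_times(up, down))
```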

  12. Optical Automatic Car Identification (OACI) : Volume 1. Advanced System Specification.

    Science.gov (United States)

    1978-12-01

    A performance specification is provided in this report for an Optical Automatic Car Identification (OACI) scanner system which features 6% improved readability over existing industry scanner systems. It also includes the analysis and rationale which ...

  13. 47 CFR 25.281 - Automatic Transmitter Identification System (ATIS).

    Science.gov (United States)

    2010-10-01

    Title 47 Telecommunication, Vol. 2 (revised 2010-10-01): Automatic Transmitter Identification System (ATIS). Section 25.281, FEDERAL COMMUNICATIONS COMMISSION (CONTINUED), COMMON CARRIER SERVICES, SATELLITE COMMUNICATIONS, Technical Operations § 25.281 Automatic Transmitter...

  14. Statistical pattern recognition for automatic writer identification and verification

    NARCIS (Netherlands)

    Bulacu, Marius Lucian

    2007-01-01

    The thesis addresses the problem of automatic person identification using scanned images of handwriting. Identifying the author of a handwritten sample using automatic image-based methods is an interesting pattern recognition problem with direct applicability in the forensic and historic document

  15. The problem of automatic identification of concepts

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre

    1975-11-01

    This paper deals with the problem of the automatic recognition of concepts and describes an important language tool, the ''linguistic filter'', which facilitates the construction of statistical algorithms. Certain special filters, covering prepositions, conjunctions, negatives, logical implication and compound words, are presented. This is followed by a detailed description of a statistical algorithm for recognizing pronoun referents, and finally the problem of the automatic treatment of negatives in French is discussed [fr

  16. Development of an automatic identification algorithm for antibiogram analysis

    OpenAIRE

    Costa, LFR; Eduardo Silva; Noronha, VT; Ivone Vaz-Moreira; Olga C Nunes; de Andrade, MM

    2015-01-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) has been proposed as a solution to overcome some issues associated with the disc diffusion method, which is the main goal of this work. AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested using suscepti...

  17. FORENSIC LINGUISTICS: AUTOMATIC WEB AUTHOR IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    A. A. Vorobeva

    2016-03-01

    The Internet is anonymous: it allows posting under a false name, on behalf of others, or anonymously. Individuals and criminal or terrorist organizations can therefore use the Internet for criminal purposes while hiding their identity to avoid prosecution. Existing approaches and algorithms for author identification of Russian-language web posts are not effective, so the development of proven methods, techniques and tools for author identification is an important and challenging task. In this work an algorithm and software for authorship identification of web posts were developed, and the effectiveness of several classification and feature selection algorithms was tested. The algorithm includes the following steps: (1) feature extraction; (2) feature discretization; (3) feature selection with the Relief-f algorithm, the most effective of those tested, to find the feature set with the most discriminating power for each set of candidate authors and maximize identification accuracy; (4) author identification with a model based on the Random Forest algorithm. Random Forest and Relief-f are used here to identify the author of a short Russian-language text for the first time. An important step of author attribution is data preprocessing, namely discretization of continuous features, which had not previously been applied to improve the efficiency of author identification. The software outputs the top q authors with the maximum probabilities of authorship. This approach is helpful for manual analysis in forensic linguistics, where the developed tool is used to narrow the set of candidate authors. In experiments with 10 candidate authors, the real author appeared in the top 3 in 90.02% of cases and in first place in 70.5% of cases.
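
    A hedged sketch of the four-step pipeline listed above. Hand-crafted stylometric features stand in for the paper's feature set and a mutual-information selector stands in for Relief-f; the posts, labels and parameters are invented for illustration only.

```python
# Sketch of: feature extraction -> discretization -> feature selection ->
# Random Forest, reporting the top-q most probable authors.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import KBinsDiscretizer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import RandomForestClassifier

def stylometric_features(text: str) -> list:
    words = text.split()
    return [
        len(text),                                             # post length
        float(np.mean([len(w) for w in words])) if words else 0.0,  # avg word length
        sum(c in ",.!?;:" for c in text) / max(len(text), 1),  # punctuation rate
        sum(c.isupper() for c in text) / max(len(text), 1),    # upper-case rate
    ]

posts = [
    "Пример текста первого автора!", "Он снова пишет, коротко и резко.",
    "второй автор пишет длинно, без заглавных букв и очень спокойно...",
    "и ещё одно длинное сообщение второго автора, опять без заглавных",
    "Первый автор. Краткость!", "третий пост второго автора, всё так же спокойно",
]
authors = ["author_a", "author_a", "author_b", "author_b", "author_a", "author_b"]
X = np.array([stylometric_features(p) for p in posts])

pipe = make_pipeline(
    KBinsDiscretizer(n_bins=3, encode="ordinal", strategy="uniform"),   # 2) discretization
    SelectKBest(mutual_info_classif, k=3),          # 3) selection (Relief-f in the paper)
    RandomForestClassifier(n_estimators=300, random_state=0),           # 4) identification
)
pipe.fit(X, authors)

proba = pipe.predict_proba([stylometric_features("анонимный пост неизвестного автора")])[0]
top_q = np.argsort(proba)[::-1][:2]                 # report the top-q candidate authors
print([(pipe.classes_[i], round(float(proba[i]), 3)) for i in top_q])
```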

  18. Optical Automatic Car Identification (OACI) Field Test Program

    Science.gov (United States)

    1976-05-01

    The results of the Optical Automatic Car Identification (OACI) tests at Chicago conducted from August 16 to September 4, 1975 are presented. The main purpose of this test was to determine the suitability of optics as a principle of operation for an a...

  19. Automatic Identification System (AIS) Transmit Testing in Louisville Phase 2

    Science.gov (United States)

    2014-08-01

    [Network configuration diagram omitted: NAIS site controller, TV32 command-center computer, AIS radio interface and SAAB R40 AIS base station connected through the VTS accreditation boundary.] Automatic Identification System (AIS) Transmit Testing in Louisville, Phase 2. Distribution Statement A: Approved for public release.

  20. Accuracy of Automatic Cephalometric Software on Landmark Identification

    Science.gov (United States)

    Anuwongnukroh, N.; Dechkunakorn, S.; Damrongsri, S.; Nilwarat, C.; Pudpong, N.; Radomsutthisarn, W.; Kangern, S.

    2017-11-01

    This study assessed the accuracy of an automatic cephalometric analysis software package in the identification of cephalometric landmarks. Thirty randomly selected digital lateral cephalograms of patients undergoing orthodontic treatment were used. Thirteen landmarks (S, N, Or, A-point, U1T, U1A, B-point, Gn, Pog, Me, Go, L1T, and L1A) were identified on the digital image by the automatic cephalometric software and on cephalometric tracings by the manual method. Superimposition of the printed image and the manual tracing was done by registration at the soft tissue profiles. The accuracy of landmarks located by the automatic method was compared with that of the manually identified landmarks by measuring the mean differences of distances of each landmark on a Cartesian plane whose X and Y coordinate axes passed through the center of the ear rod. A one-sample t-test was used to evaluate the mean differences. Statistically significant mean differences between the two methods were found in both horizontal and vertical directions for most landmarks; the largest mean difference was found for A-point (3.04 mm) in the vertical direction. Only 5 of the 13 landmarks (38.46%; S, N, Gn, Pog, and Go) showed no significant mean difference between the automatic and manual landmarking methods. It is concluded that if this automatic cephalometric analysis software is used for orthodontic diagnosis, the orthodontist must correct or modify the positions of landmarks in order to increase the accuracy of the cephalometric analysis.
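
    A minimal sketch of the statistical comparison described above: per-landmark differences between automatic and manual coordinates, evaluated with a one-sample t-test against zero. The coordinate arrays are synthetic placeholders, not the study's measurements.

```python
# One-sample t-test of automatic-vs-manual landmark differences (synthetic data).
import numpy as np
from scipy.stats import ttest_1samp

rng = np.random.default_rng(1)
n_films = 30
landmarks = ["S", "N", "Or", "A-point", "Gn", "Pog", "Go"]

for name in landmarks:
    auto = rng.normal(0.0, 1.0, size=(n_films, 2))            # automatic x, y (mm)
    manual = auto + rng.normal(0.3, 0.8, size=(n_films, 2))   # manual x, y (mm)
    diff = auto - manual
    for axis, label in enumerate(("horizontal", "vertical")):
        t, p = ttest_1samp(diff[:, axis], popmean=0.0)
        flag = "significant" if p < 0.05 else "n.s."
        print(f"{name:8s} {label:10s} mean diff {diff[:, axis].mean():+.2f} mm  (p={p:.3f}, {flag})")
```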

  1. MAC, A System for Automatically IPR Identification, Collection and Distribution

    Science.gov (United States)

    Serrão, Carlos

    Controlling Intellectual Property Rights (IPR) in the Digital World is a very hard challenge. The facility to create multiple bit-by-bit identical copies of original IPR works creates opportunities for digital piracy. One of the industries most affected by this is the Music Industry, which has sustained huge losses during the last few years as a result. Moreover, this is also affecting the way that music rights collecting and distributing societies operate to assure correct music IPR identification, collection and distribution. In this article a system for automating this IPR identification, collection and distribution is presented and described. The system makes use of an advanced automatic audio identification system based on audio fingerprinting technology. This paper presents the details of the system and a use-case scenario where it is being used.

  2. Automatic Adviser on Mobile Objects Status Identification and Classification

    Science.gov (United States)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Saryan, A. S.

    2018-05-01

    A mobile object status identification task is defined within image discrimination theory. It is proposed to classify objects into three classes: the object is in operation; maintenance is required; and the object should be removed from the production process. Two methods were developed to construct the separating boundaries between the designated classes: (a) using statistical information on the executed movements of the research objects, and (b) based on regulatory documents and expert commentary. A simulation of the Automatic Adviser's operation and a complex for analysing the operation results were synthesized. The research results are illustrated with a specific example of cuts rolling from the hump yard. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.

  3. Semi-automatic long-term acoustic surveying

    DEFF Research Database (Denmark)

    Andreassen, Tórur; Surlykke, Annemarie; Hallam, John

    2014-01-01

    Increasing concern about decline in biodiversity has created a demand for population surveys. Acoustic monitoring is an efficient non-invasive method, which may be deployed for surveys of animals as diverse as insects, birds, and bats. Long-term unmanned automatic monitoring may provide unique...... to determine bat behavior and correct for the bias toward loud bats inherent in acoustic surveying....

  4. Channel Access Algorithm Design for Automatic Identification System

    Institute of Scientific and Technical Information of China (English)

    Oh Sang-heon; Kim Seung-pum; Hwang Dong-hwan; Park Chan-sik; Lee Sang-jeong

    2003-01-01

    The Automatic Identification System (AIS) is maritime equipment that allows an efficient exchange of navigational data between ships and between ships and shore stations. It utilizes a channel access algorithm which can quickly resolve conflicts without any intervention from control stations. In this paper, a design of the channel access algorithm for the AIS is presented. The input/output relationship of each access algorithm module is defined by drawing the state transition diagram, dataflow diagram and flowchart based on the technical standard, ITU-R M.1371. In order to verify the designed channel access algorithm, a simulator was developed using the C/C++ programming language. The results show that the proposed channel access algorithm can properly allocate transmission slots and meets the operational performance requirements specified by the technical standard.
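
    The designed algorithm follows ITU-R M.1371; the sketch below only illustrates, in a much-simplified form, the self-organised TDMA idea behind it: each station picks a free slot at random inside a selection interval around its nominal slot and holds it for a random timeout. The frame size matches the standard, while the interval width, timeouts and bookkeeping are assumptions.

```python
# Much-simplified SOTDMA-style slot selection (illustrative, not ITU-R M.1371 itself).
import random

SLOTS_PER_FRAME = 2250              # one-minute frame on one AIS channel

class SlotMap:
    def __init__(self):
        self.occupied = {}          # slot index -> frames remaining (timeout)

    def free_slots(self, nominal, width=0.2):
        half = int(SLOTS_PER_FRAME * width / 2)                # selection interval
        candidates = [(nominal + k) % SLOTS_PER_FRAME for k in range(-half, half + 1)]
        return [s for s in candidates if s not in self.occupied]

    def select(self, nominal):
        free = self.free_slots(nominal)
        slot = random.choice(free) if free else nominal        # reuse if congested
        self.occupied[slot] = random.randint(3, 8)             # slot timeout (frames)
        return slot

    def next_frame(self):
        self.occupied = {s: t - 1 for s, t in self.occupied.items() if t > 1}

slot_map = SlotMap()
print("allocated slots:", [slot_map.select(nominal=100 + 37 * i) for i in range(5)])
```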

  5. Automatic identification and normalization of dosage forms in drug monographs

    Science.gov (United States)

    2012-01-01

    Background: Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered across a wide variety of websites of differing quality and credibility. Methods: As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying a drug's dosage form information in addition to recognizing the drug name. We developed rules and patterns for identifying dosage forms from different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results: Our method represents a significant improvement over a baseline lookup approach, achieving an overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions: We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431
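
    A hedged sketch of the rule-and-lookup style of processing described above: regular expressions spot dosage-form mentions in monograph text and a small synonym table maps them to normalised, RxNorm-style forms. The patterns and mappings are illustrative assumptions, not the authors' rules.

```python
# Regex spotting of dosage forms plus normalisation via a toy synonym table.
import re

NORMALIZED_FORMS = {
    "tablet": "Oral Tablet",
    "tablets": "Oral Tablet",
    "capsule": "Oral Capsule",
    "capsules": "Oral Capsule",
    "oral solution": "Oral Solution",
    "injection": "Injectable Solution",
    "topical cream": "Topical Cream",
}
PATTERN = re.compile(r"\b(" + "|".join(sorted(NORMALIZED_FORMS, key=len, reverse=True)) + r")\b",
                     re.IGNORECASE)

def dosage_forms(monograph_text: str) -> set:
    return {NORMALIZED_FORMS[m.group(1).lower()] for m in PATTERN.finditer(monograph_text)}

text = "Supplied as 250 mg capsules and as an oral solution; not for injection."
print(dosage_forms(text))   # {'Oral Capsule', 'Oral Solution', 'Injectable Solution'}
```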

  6. Automatic identification of otologic drilling faults: a preliminary report.

    Science.gov (United States)

    Shen, Peng; Feng, Guodong; Cao, Tianyang; Gao, Zhiqiang; Li, Xisheng

    2009-09-01

    A preliminary study was carried out to identify parameters to characterize drilling faults when using an otologic drill under various operating conditions. An otologic drill was modified by the addition of four sensors. Under consistent conditions, the drill was used to simulate three important types of drilling faults and the captured data were analysed to extract characteristic signals. A multisensor information fusion system was designed to fuse the signals and automatically identify the faults. When identifying drilling faults, there was a high degree of repeatability and regularity, with an average recognition rate of >70%. This study shows that the variables measured change in a fashion that allows the identification of particular drilling faults, and that it is feasible to use these data to provide rapid feedback for a control system. Further experiments are being undertaken to implement such a system.

  7. 33 CFR 164.43 - Automatic Identification System Shipborne Equipment-Prince William Sound.

    Science.gov (United States)

    2010-07-01

    Title 33 Navigation and Navigable Waters, Vol. 2 (revised 2010-07-01): Automatic Identification System Shipborne Equipment—Prince William Sound. Section 164.43, Navigation and Navigable Waters, COAST GUARD... Automatic Identification System Shipborne Equipment—Prince William Sound. (a) Until December 31, 2004, each...

  8. Rewriting and suppressing UMLS terms for improved biomedical term identification

    NARCIS (Netherlands)

    K.M. Hettne (Kristina); E.M. van Mulligen (Erik); M.J. Schuemie (Martijn); R.J.A. Schijvenaars (Bob); J.A. Kors (Jan)

    2010-01-01

    Background: Identification of terms is essential for biomedical text mining. We concentrate here on the use of vocabularies for term identification, specifically the Unified Medical Language System (UMLS). To make the UMLS more suitable for biomedical text mining we implemented and

  9. Automatic Identification System modular receiver for academic purposes

    Science.gov (United States)

    Cabrera, F.; Molina, N.; Tichavska, M.; Araña, V.

    2016-07-01

    The Automatic Identification System (AIS) standard is encompassed within the Global Maritime Distress and Safety System (GMDSS), in force since 1999. The GMDSS is a set of procedures, equipment, and communication protocols designed with the aim of increasing the safety of sea crossings, facilitating navigation, and the rescue of vessels in danger. The use of this system is not only increasingly attractive for security purposes but also potentially creates intelligence products through the added-value information that this network can transmit from ships in real time (identification, position, course, speed, dimensions, flag, among others). Within the marine electronics market, commercial receivers implement this standard and allow users to access vessel-broadcast information when within coverage range. In addition to satellite services, users may request actionable information from private or public terrestrial AIS networks, where real-time feeds or historical data can be accessed from their nodes. This paper describes the configuration of an AIS receiver based on a modular design. The modular design facilitates the evaluation of specific modules, a better understanding of the standard, and the possibility of changing hardware modules to improve the performance of the prototype. Thus, the aim of this paper is to describe the system's specifications and its main hardware components, and to present educational didactics on the setup and use of a modular, terrestrial AIS receiver for academic purposes in undergraduate studies such as electrical engineering, telecommunications, and maritime studies.

  10. Automatic failure identification of the nuclear power plant pellet fuel

    International Nuclear Information System (INIS)

    Oliveira, Adriano Fortunato de

    2010-01-01

    This work proposes the development of an automatic technique for evaluating defects to support the fuel element fabrication stage. An intelligent image analysis system was produced for automatic recognition of defects in uranium pellets. An Artificial Neural Network (ANN) was trained using segments of pellet histograms, containing examples of both normal (defect-free) pellets and defective pellets (with the major defects normally found). The pellet images were divided into 11 segments; histograms of these segments were computed and used to train the ANN. Besides automating the process, the system achieved a classification accuracy of 98.33%. Although this percentage already represents a significant advance in the quality control process, the use of more advanced photography and lighting techniques is expected to reduce the error to insignificant levels at low cost. Technologically, the method developed, should it be implemented, will add substantial value in terms of process quality control and reduced production outages in the domestic manufacturing of nuclear fuel. (author)
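
    A hedged sketch of the approach described above: each pellet image is split into 11 segments, per-segment grey-level histograms are concatenated into a feature vector, and a small neural network separates normal from defective pellets. The images here are random placeholders and the network size is an assumption.

```python
# Histogram-segment features plus a small MLP classifier (synthetic data).
import numpy as np
from sklearn.neural_network import MLPClassifier

N_SEGMENTS, N_BINS = 11, 16

def histogram_features(image: np.ndarray) -> np.ndarray:
    """image: 2-D uint8 array of the pellet surface."""
    segments = np.array_split(image, N_SEGMENTS, axis=1)       # 11 vertical slices
    feats = [np.histogram(seg, bins=N_BINS, range=(0, 255), density=True)[0]
             for seg in segments]
    return np.concatenate(feats)

rng = np.random.default_rng(0)
normal = [rng.integers(90, 160, size=(64, 88), dtype=np.uint8) for _ in range(40)]
defective = [rng.integers(0, 255, size=(64, 88), dtype=np.uint8) for _ in range(40)]
X = np.array([histogram_features(im) for im in normal + defective])
y = [0] * len(normal) + [1] * len(defective)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```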

  11. Development of an Automatic Identification System Autonomous Positioning System

    Directory of Open Access Journals (Sweden)

    Qing Hu

    2015-11-01

    In order to overcome the vulnerability of the global navigation satellite system (GNSS) and provide robust position, navigation and time (PNT) information for marine navigation, an autonomous positioning system based on ranging-mode Automatic Identification System (AIS) is presented in this paper. The principle of the AIS autonomous positioning system (AAPS) is investigated, including the positioning algorithm, the signal measurement technique, the geometric dilution of precision, the time synchronization technique and the additional secondary factor correction technique. In order to validate the proposed AAPS, a verification system has been established in the Xinghai sea region of Dalian (China). Static and dynamic positioning experiments were performed; the original function of the AIS in the AAPS is not affected. The experimental results show that the positioning precision of the AAPS is better than 10 m in areas with good geometric dilution of precision (GDOP) when the additional secondary factor correction technology is applied. This is the most economical solution for a land-based positioning system to complement the GNSS for the navigation safety of vessels sailing along coasts.
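
    A hedged, two-dimensional sketch of range-based positioning of the kind the AAPS performs: an iterative least-squares fix from ranges to known base stations, plus a dilution-of-precision figure from the final design matrix. The clock term and the secondary-factor correction are omitted, and the station layout and ranges are synthetic.

```python
# Iterative least-squares range positioning with a simple DOP estimate.
import numpy as np

stations = np.array([[0.0, 0.0], [8000.0, 1000.0], [2000.0, 9000.0], [9000.0, 8000.0]])
true_pos = np.array([4000.0, 3000.0])
ranges = np.linalg.norm(stations - true_pos, axis=1) + np.random.default_rng(0).normal(0, 5, 4)

def least_squares_fix(stations, ranges, iters=10):
    x = stations.mean(axis=0)                      # initial guess: centroid
    for _ in range(iters):
        d = np.linalg.norm(stations - x, axis=1)
        H = (x - stations) / d[:, None]            # unit line-of-sight vectors (design matrix)
        dx, *_ = np.linalg.lstsq(H, ranges - d, rcond=None)
        x = x + dx
    dop = np.sqrt(np.trace(np.linalg.inv(H.T @ H)))   # position dilution of precision
    return x, dop

pos, dop = least_squares_fix(stations, ranges)
print("estimated position:", pos.round(1), " DOP:", round(float(dop), 2))
```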

  12. Roadway system assessment using bluetooth-based automatic vehicle identification travel time data.

    Science.gov (United States)

    2012-12-01

    This monograph is an exposition of several practice-ready methodologies for automatic vehicle identification (AVI) data collection systems. This includes considerations in the physical setup of the collection system as well as the interpretation of...

  13. Automatic topic identification of health-related messages in online health community using text classification.

    Science.gov (United States)

    Lu, Yingjie

    2013-01-01

    To facilitate patient involvement in online health communities and help patients obtain the informative and emotional support they need, a topic identification approach is proposed in this paper for automatically identifying the topics of health-related messages in an online health community, thus assisting patients in reaching the most relevant messages for their queries efficiently. A feature-based classification framework was presented for automatic topic identification. We first collected messages related to some predefined topics in an online health community. We then combined three different types of features (n-gram-based features, domain-specific features and sentiment features) to build four feature sets for health-related text representation. Finally, three different text classification techniques, C4.5, Naïve Bayes and SVM, were adopted to evaluate our topic classification model. By comparing different feature sets and classification techniques, we found that n-gram-based, domain-specific and sentiment features were all effective in distinguishing different types of health-related topics, and that feature reduction based on information gain further improved topic classification performance. In terms of classification techniques, SVM significantly outperformed C4.5 and Naïve Bayes. The experimental results demonstrate that the proposed approach can identify the topics of online health-related messages efficiently.
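
    A hedged sketch of the classification framework compared above: word n-gram features, information-gain-style feature reduction (mutual information is used here) and a linear SVM, the best performer in the study. The example messages and topic labels are invented placeholders.

```python
# n-gram features -> feature reduction -> SVM topic classifier (toy data).
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.svm import LinearSVC

messages = [
    "What dosage of metformin is typical for type 2 diabetes?",
    "Feeling anxious before my surgery next week, any advice?",
    "Has anyone experienced side effects from this medication?",
    "Looking for emotional support after my diagnosis.",
]
topics = ["treatment", "emotional", "treatment", "emotional"]

pipe = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),                # n-gram-based features
    SelectKBest(mutual_info_classif, k=20),             # information-gain-style reduction
    LinearSVC(),                                         # best-performing classifier in the study
)
pipe.fit(messages, topics)
print(pipe.predict(["Does anyone know the right dose for this drug?"]))
```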

  14. Automatic annotation of protein motif function with Gene Ontology terms

    Directory of Open Access Journals (Sweden)

    Gopalakrishnan Vanathi

    2004-09-01

    Background: Conserved protein sequence motifs are short stretches of amino acid sequence patterns that potentially encode the function of proteins. Several sequence pattern searching algorithms and programs exist for identifying candidate protein motifs at the whole genome level. However, a much needed and important task is to determine the functions of the newly identified protein motifs. The Gene Ontology (GO) project is an endeavor to annotate the function of genes or protein sequences with terms from a dynamic, controlled vocabulary, and these annotations serve well as a knowledge base. Results: This paper presents methods to mine the GO knowledge base and use the association between the GO terms assigned to a sequence and the motifs matched by the same sequence as evidence for automatically predicting the functions of novel protein motifs. The task of assigning GO terms to protein motifs is viewed as both a binary classification and an information retrieval problem, where PROSITE motifs are used as samples for model training and functional prediction. The mutual information of a motif and a GO term association is found to be a very useful feature. We take advantage of the known motifs to train a logistic regression classifier, which allows us to combine mutual information with other frequency-based features and obtain a probability of correct association. The trained logistic regression model has intuitively meaningful and logically plausible parameter values, and performs very well empirically according to our evaluation criteria. Conclusions: In this research, different methods for automatic annotation of protein motifs have been investigated. Empirical results demonstrate that the methods have great potential for detecting and augmenting information about the functions of newly discovered candidate protein motifs.
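
    A hedged sketch of the core feature described above: the association between a motif and a GO term is scored by mutual information over their co-occurrence across annotated sequences and combined with a frequency feature in a logistic regression. All counts and labels are invented for illustration.

```python
# Mutual information of motif/GO-term co-occurrence feeding a logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

def mutual_information(n_both, n_motif, n_term, n_total):
    """MI of the binary events 'sequence matches motif' / 'sequence has GO term'."""
    p = np.array([[n_both, n_motif - n_both],
                  [n_term - n_both, n_total - n_motif - n_term + n_both]], float) / n_total
    px, py = p.sum(axis=1, keepdims=True), p.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p > 0, p * np.log2(p / (px * py)), 0.0)
    return terms.sum()

# Training pairs (motif, GO term) with features [MI, co-occurrence frequency]
# and a label saying whether the association is a correct annotation.
X = np.array([[mutual_information(40, 50, 60, 1000), 40 / 1000],
              [mutual_information(2, 50, 400, 1000), 2 / 1000],
              [mutual_information(30, 35, 45, 1000), 30 / 1000],
              [mutual_information(5, 300, 350, 1000), 5 / 1000]])
y = [1, 0, 1, 0]

model = LogisticRegression().fit(X, y)
print("P(correct association):", model.predict_proba(X)[:, 1].round(2))
```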

  15. Rewriting and suppressing UMLS terms for improved biomedical term identification

    Directory of Open Access Journals (Sweden)

    Hettne Kristina M

    2010-03-01

    Background: Identification of terms is essential for biomedical text mining. We concentrate here on the use of vocabularies for term identification, specifically the Unified Medical Language System (UMLS). To make the UMLS more suitable for biomedical text mining we implemented and evaluated nine term rewrite and eight term suppression rules. The rules rely on UMLS properties that have been identified in previous work by others, together with an additional set of new properties discovered by our group during our work with the UMLS. Our work complements the earlier work in that we measure the impact of the different rules on the number of terms identified in a MEDLINE corpus. The number of uniquely identified terms and their frequency in MEDLINE were computed before and after applying the rules. The 50 most frequently found terms, together with a sample of 100 randomly selected terms, were evaluated for every rule. Results: Five of the nine rewrite rules were found to generate additional synonyms and spelling variants that correctly corresponded to the meaning of the original terms, and seven of the eight suppression rules were found to suppress only undesired terms. Using the five rewrite rules that passed our evaluation, we were able to identify 1,117,772 new occurrences of 14,784 rewritten terms in MEDLINE. Without the rewriting, we recognized 651,268 terms belonging to 397,414 concepts; with rewriting, we recognized 666,053 terms belonging to 410,823 concepts, an increase of 2.8% in the number of terms and 3.4% in the number of concepts recognized. Using the seven suppression rules, a total of 257,118 undesired terms were suppressed in the UMLS, notably decreasing its size; 7,397 terms were suppressed in the corpus. Conclusions: We recommend applying the five rewrite rules and seven suppression rules that passed our evaluation when the UMLS is to be used for biomedical term identification in MEDLINE. A software
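
    A hedged sketch of the kind of processing involved: a rewrite rule generates an extra term variant (here, stripping a trailing parenthetical qualifier, a hypothetical example rather than one of the nine evaluated rules) and the occurrences of each variant are counted in a small corpus of abstracts.

```python
# Toy term rewriting plus corpus occurrence counting.
import re
from collections import Counter

def rewrite(term: str) -> set:
    variants = {term}
    stripped = re.sub(r"\s*\([^)]*\)\s*$", "", term)   # "heart attack (disorder)" -> "heart attack"
    if stripped and stripped != term:
        variants.add(stripped)
    return variants

terms = ["heart attack (disorder)", "aspirin"]
abstracts = ["Aspirin reduces the risk of a heart attack.",
             "A heart attack (disorder) was recorded."]

counts = Counter()
for term in terms:
    for variant in rewrite(term):
        pattern = re.compile(re.escape(variant), re.IGNORECASE)
        counts[variant] = sum(len(pattern.findall(a)) for a in abstracts)
print(counts)
```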

  16. Automatic identification of temporal sequences in chewing sounds

    NARCIS (Netherlands)

    Amft, O.D.; Kusserow, M.; Tröster, G.

    2007-01-01

    Chewing is an essential part of food intake. The analysis and detection of food patterns is an important component of an automatic dietary monitoring system. However chewing is a time-variable process depending on food properties. We present an automated methodology to extract sub-sequences of

  17. Automatic Bayesian single molecule identification for localization microscopy

    OpenAIRE

    Tang, Yunqing; Hendriks, Johnny; Gensch, Thomas; Dai, Luru; Li, Junbai

    2016-01-01

    Single molecule localization microscopy (SMLM) is on its way to become a mainstream imaging technique in the life sciences. However, analysis of SMLM data is biased by user provided subjective parameters required by the analysis software. To remove this human bias we introduce here the Auto-Bayes method that executes the analysis of SMLM data automatically. We demonstrate the success of the method using the photoelectron count of an emitter as selection characteristic. Moreover, the principle...

  18. Automatic vertebral identification using surface-based registration

    Science.gov (United States)

    Herring, Jeannette L.; Dawant, Benoit M.

    2000-06-01

    This work introduces an enhancement to currently existing methods of intra-operative vertebral registration by allowing the portion of the spinal column surface that correctly matches a set of physical vertebral points to be automatically selected from several possible choices. Automatic selection is made possible by the shape variations that exist among lumbar vertebrae. In our experiments, we register vertebral points representing physical space to spinal column surfaces extracted from computed tomography images. The vertebral points are taken from the posterior elements of a single vertebra to represent the region of surgical interest. The surface is extracted using an improved version of the fully automatic marching cubes algorithm, which results in a triangulated surface that contains multiple vertebrae. We find the correct portion of the surface by registering the set of physical points to multiple surface areas, including all vertebral surfaces that potentially match the physical point set. We then compute the standard deviation of the surface error for the set of points registered to each vertebral surface that is a possible match, and the registration that corresponds to the lowest standard deviation designates the correct match. We have performed our current experiments on two plastic spine phantoms and one patient.

  19. The epidural needle guidance with an intelligent and automatic identification system for epidural anesthesia

    Science.gov (United States)

    Kao, Meng-Chun; Ting, Chien-Kun; Kuo, Wen-Chuan

    2018-02-01

    Incorrect placement of the needle causes medical complications in the epidural block, such as dural puncture or spinal cord injury. This study proposes a system that combines an optical coherence tomography (OCT) imaging probe with an automatic identification (AI) system to objectively identify the position of the epidural needle tip. The automatic identification system uses three features as image parameters to distinguish the different tissues with three classifiers. We found that the support vector machine (SVM) classifier had the highest accuracy, specificity, and sensitivity, which reached 95%, 98%, and 92%, respectively.

  20. An automatic microseismic or acoustic emission arrival identification scheme with deep recurrent neural networks

    Science.gov (United States)

    Zheng, Jing; Lu, Jiren; Peng, Suping; Jiang, Tianqi

    2018-02-01

    Conventional arrival pick-up algorithms cannot avoid manual modification of parameters for the simultaneous identification of multiple events under different signal-to-noise ratios (SNRs). Therefore, in order to automatically obtain the arrivals of multiple events with high precision under different SNRs, this study proposes an algorithm that picks the arrival of microseismic or acoustic emission events using deep recurrent neural networks. The arrival identification is performed in two important steps, a training phase and a testing phase. The training process is modelled by deep recurrent neural networks using a Long Short-Term Memory architecture. During the testing phase, the learned weights are used to identify the arrivals in microseismic/acoustic emission data sets obtained from rock physics acoustic emission experiments. In order to obtain data sets under different SNRs, random noise was added to the raw experimental data. The results show that the proposed method attains a hit rate above 80 per cent at an SNR of 0 dB, and approximately 70 per cent at an SNR of -5 dB, with an absolute error within 10 sampling points. These results indicate that the proposed method has high picking precision and robustness.
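
    A hedged sketch of the picking idea: a recurrent network with LSTM cells labels each sample of a trace as pre-arrival or post-arrival, and the first post-arrival prediction is taken as the pick. The network size, training loop and synthetic trace are illustrative assumptions, not the paper's architecture.

```python
# LSTM-based per-sample arrival labelling on a synthetic noisy trace.
import torch
import torch.nn as nn

class ArrivalPicker(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                         # x: (batch, n_samples, 1)
        out, _ = self.lstm(x)
        return torch.sigmoid(self.head(out)).squeeze(-1)   # per-sample P(post-arrival)

# Synthetic noisy trace with an "event" starting at sample 600.
torch.manual_seed(0)
n = 1200
trace = 0.1 * torch.randn(n)
trace[600:] += torch.sin(torch.linspace(0.0, 60.0, n - 600)) * torch.exp(-torch.linspace(0.0, 5.0, n - 600))
labels = (torch.arange(n) >= 600).float()

model = ArrivalPicker()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCELoss()
x, y = trace.view(1, n, 1), labels.view(1, n)
for _ in range(50):                               # tiny training loop, illustration only
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

pred = model(x)[0] > 0.5
idx = pred.nonzero(as_tuple=True)[0]
picked = int(idx[0]) if len(idx) else -1          # first post-arrival sample
print("picked arrival sample:", picked, " true arrival: 600")
```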

  1. Automatic Adviser on stationary devices status identification and anticipated change

    Science.gov (United States)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Pushkarev, E. A.

    2018-05-01

    A task is defined to synthesize an Automatic Adviser that identifies the status of stationary automation-system devices using an autoregressive model of changes in their key parameters. The applied model type is rationalized and a monitoring-process algorithm for the research objects is developed. A complex for simulating the objects' status operation and analysing the prediction results is proposed. The research results are illustrated with a specific example of a hump yard compressor station. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.

  2. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, uniform corrosion is characterized by texture attributes extracted from the co-occurrence matrix together with the Self Organizing Map (SOM) clustering algorithm. We present a technique for automatic inspection of oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology in designing trustworthy and robust early failure detection systems. (author)
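
    A hedged sketch of the texture pipeline described above: grey-level co-occurrence (GLCM) attributes are computed per image patch and the patches are clustered into corroded versus sound groups. KMeans stands in for the Self Organizing Map used by the authors, and the patches are synthetic.

```python
# GLCM texture attributes per patch, followed by unsupervised clustering.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.cluster import KMeans

def glcm_features(patch: np.ndarray) -> list:
    glcm = graycomatrix(patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return [float(graycoprops(glcm, prop).mean())
            for prop in ("contrast", "homogeneity", "energy", "correlation")]

rng = np.random.default_rng(0)
smooth = [rng.integers(100, 120, (32, 32), dtype=np.uint8) for _ in range(20)]   # "sound" metal
rough = [rng.integers(0, 255, (32, 32), dtype=np.uint8) for _ in range(20)]      # "corroded"
X = np.array([glcm_features(p) for p in smooth + rough])

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster labels:", clusters)
```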

  3. Automatic identification of inertial sensor placement on human body segments during walking

    NARCIS (Netherlands)

    Weenk, D.; van Beijnum, Bernhard J.F.; Baten, Christian T.M.; Hermens, Hermanus J.; Veltink, Petrus H.

    2013-01-01

    We present a novel method for the automatic identification of inertial sensors on human body segments during walking. This method allows the user to place (wireless) inertial sensors on arbitrary body segments. Next, the user walks for just a few seconds and the segment to which each sensor is

  4. Automatic Priming Effects for New Associations in Lexical Decision and Perceptual Identification

    NARCIS (Netherlands)

    D. Pecher (Diane); J.G.W. Raaijmakers (Jeroen)

    1999-01-01

    Information storage in semantic memory was investigated by looking at automatic priming effects for new associations in two experiments. In the study phase word pairs were presented in a paired-associate learning task. Lexical decision and perceptual identification were used to examine

  5. Automatic slice identification in 3D medical images with a ConvNet regressor

    NARCIS (Netherlands)

    de Vos, Bob D.; Viergever, Max A.; de Jong, Pim A.; Išgum, Ivana

    2016-01-01

    Identification of anatomical regions of interest is a prerequisite in many medical image analysis tasks. We propose a method that automatically identifies a slice of interest (SOI) in 3D images with a convolutional neural network (ConvNet) regressor. In 150 chest CT scans two reference slices were

  6. Identification with video game characters as automatic shift of self-perceptions

    NARCIS (Netherlands)

    Klimmt, C.; Hefner, D.; Vorderer, P.A.; Roth, C.; Blake, C.

    2010-01-01

    Two experiments tested the prediction that video game players identify with the character or role they are assigned, which leads to automatic shifts in implicit self-perceptions. Video game identification, thus, is considered as a kind of altered self-experience. In Study 1 (N = 61), participants

  7. Automatic Identification and Reconstruction of the Right Phrenic Nerve on Computed Tomography

    OpenAIRE

    Bamps, Kobe; Cuypers, Céline; Polmans, Pieter; Claesen, Luc; Koopman, Pieter

    2016-01-01

    An automatic computer algorithm was successfully constructed, enabling identification and reconstruction of the right phrenic nerve on high resolution coronary computed tomography scans. This could lead to a substantial reduction in the incidence of phrenic nerve paralysis during pulmonary vein isolation using balloon techniques.

  8. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard

    2010-01-01

    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV...

  9. Automatic Knowledge Extraction and Knowledge Structuring for a National Term Bank

    DEFF Research Database (Denmark)

    Lassen, Tine; Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2011-01-01

    This paper gives an introduction to the plans and ongoing work in a project, the aim of which is to develop methods for automatic knowledge extraction and automatic construction and updating of ontologies. The project also aims at developing methods for automatic merging of terminological data from various existing sources, as well as methods for target group oriented knowledge dissemination. In this paper, we mainly focus on the plans for automatic knowledge extraction and knowledge structuring that will result in ontologies for a national term bank.

  10. LEARNING VECTOR QUANTIZATION FOR ADAPTED GAUSSIAN MIXTURE MODELS IN AUTOMATIC SPEAKER IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    IMEN TRABELSI

    2017-05-01

    Speaker Identification (SI) aims at automatically identifying an individual by extracting and processing information from his or her voice. The speaker's voice is a robust biometric modality with a strong impact in several application areas. In this study, a new combined learning scheme is proposed based on the Gaussian mixture model-universal background model (GMM-UBM) and Learning Vector Quantization (LVQ) for automatic text-independent speaker identification. Feature vectors constituted by the Mel Frequency Cepstral Coefficients (MFCC) extracted from the speech signal are used for training on the New England subset of the TIMIT database. The best results obtained were 90% for gender-independent speaker identification, 97% for male speakers and 93% for female speakers on test data using 36 MFCC features.
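
    A hedged, much-simplified sketch of MFCC-plus-GMM speaker identification: one GMM per enrolled speaker, initialised from a pooled background model, with the test utterance assigned to the highest-likelihood model. The MAP adaptation and LVQ stages of the paper are omitted, and the audio here is synthetic noise rather than TIMIT speech.

```python
# MFCC features + per-speaker GMMs seeded from a pooled "background" model.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def mfcc_features(y, sr=16000, n_mfcc=13):
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T   # (frames, n_mfcc)

rng = np.random.default_rng(0)
enroll = {f"speaker_{i}": mfcc_features(rng.standard_normal(16000 * 3).astype(np.float32))
          for i in range(3)}

ubm = GaussianMixture(n_components=8, covariance_type="diag", random_state=0)
ubm.fit(np.vstack(list(enroll.values())))                       # pooled background model

models = {}
for name, feats in enroll.items():
    gmm = GaussianMixture(n_components=8, covariance_type="diag",
                          means_init=ubm.means_, random_state=0)
    models[name] = gmm.fit(feats)

test = mfcc_features(rng.standard_normal(16000 * 2).astype(np.float32))
scores = {name: gmm.score(test) for name, gmm in models.items()}
print("identified as:", max(scores, key=scores.get))
```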

  11. Automatic identification of otological drilling faults: an intelligent recognition algorithm.

    Science.gov (United States)

    Cao, Tianyang; Li, Xisheng; Gao, Zhiqiang; Feng, Guodong; Shen, Peng

    2010-06-01

    This article presents an intelligent recognition algorithm that can recognize milling states of the otological drill by fusing multi-sensor information. An otological drill was modified by the addition of sensors. The algorithm was designed according to features of the milling process and is composed of a characteristic curve, an adaptive filter and a rule base. The characteristic curve can weaken the impact of the unstable normal milling process and reserve the features of drilling faults. The adaptive filter is capable of suppressing interference in the characteristic curve by fusing multi-sensor information. The rule base can identify drilling faults through the filtering result data. The experiments were repeated on fresh porcine scapulas, including normal milling and two drilling faults. The algorithm has high rates of identification. This study shows that the intelligent recognition algorithm can identify drilling faults under interference conditions. (c) 2010 John Wiley & Sons, Ltd.

  12. Associative priming in a masked perceptual identification task: evidence for automatic processes.

    Science.gov (United States)

    Pecher, Diane; Zeelenberg, René; Raaijmakers, Jeroen G W

    2002-10-01

    Two experiments investigated the influence of automatic and strategic processes on associative priming effects in a perceptual identification task in which prime-target pairs are briefly presented and masked. In this paradigm, priming is defined as a higher percentage of correctly identified targets for related pairs than for unrelated pairs. In Experiment 1, priming was obtained for mediated word pairs. This mediated priming effect was affected neither by the presence of direct associations nor by the presentation time of the primes, indicating that automatic priming effects play a role in perceptual identification. Experiment 2 showed that the priming effect was not affected by the proportion (.90 vs. .10) of related pairs if primes were presented briefly to prevent their identification. However, a large proportion effect was found when primes were presented for 1000 ms so that they were clearly visible. These results indicate that priming in a masked perceptual identification task is the result of automatic processes and is not affected by strategies. The present paradigm provides a valuable alternative to more commonly used tasks such as lexical decision.

  13. Perspective of the applications of automatic identification technologies in the Serbian Army

    Directory of Open Access Journals (Sweden)

    Velibor V. Jovanović

    2012-07-01

    Without modern information systems, supply-chain management is almost impossible. Automatic identification technologies provide automated data processing, which improves conditions for and supports decision making. Automatic identification media, notably BARCODE and RFID technology, are used as carriers of labels with high-quality data and adequate descriptions of materiel, providing crucial visibility of inventory levels through the supply chain. With these media and an adequate information system, the Ministry of Defense of the Republic of Serbia will be able to establish a system of codification and, in accordance with the NATO codification system, successfully implement unique codification, classification and determination of storage numbers for all tools, components and spare parts for their unequivocal identification. In the long term, this will help end users to perform everyday tasks without compromising the integrity of materiel security data, and help command structures to have reliable information for decision making and optimal management. Products and services that pass the codification procedure will have the opportunity to be offered in the largest market of armament and military equipment. This paper gives a comparative analysis of two automatic identification technologies, BARCODE, the most common, and RFID, the most advanced, with an emphasis on the advantages and disadvantages of their use in tracking inventory through the supply chain. Their possible application in the Serbian Army is discussed in general terms.

  14. Managing Returnable Containers Logistics - A Case Study Part II - Improving Visibility through Using Automatic Identification Technologies

    Directory of Open Access Journals (Sweden)

    Gretchen Meiser

    2011-05-01

    This case study is the result of a project conducted on behalf of a company that uses its own returnable containers to transport purchased parts from suppliers. The objective of the project was to develop a proposal to enable the company to more effectively track and manage its returnable containers. The research activities in support of this project included (1) the analysis and documentation of the physical flow and the information flow associated with the containers and (2) the investigation of new technologies to improve the automatic identification and tracking of containers. This paper explains the automatic identification technologies and important criteria for their selection. A companion paper details the flow of information and containers within the logistics chain and identifies areas for improving the management of the containers.

  15. Automatic limb identification and sleeping parameters assessment for pressure ulcer prevention.

    Science.gov (United States)

    Baran Pouyan, Maziyar; Birjandtalab, Javad; Nourani, Mehrdad; Matthew Pompeo, M D

    2016-08-01

    Pressure ulcers (PUs) are common among vulnerable patients such as the elderly, the bedridden and diabetics. PUs are very painful for patients and costly for hospitals and nursing homes. Assessment of sleeping parameters on at-risk limbs is critical for ulcer prevention, and an effective assessment depends on automatic identification and tracking of at-risk limbs. Accurate limb identification can be used to analyze the pressure distribution and assess risk for each limb. In this paper, we propose a graph-based clustering approach to extract the body limbs from pressure data collected by a commercial pressure map system. A robust signature-based technique is employed to automatically label each limb. Finally, an assessment technique is applied to evaluate the stress experienced by each limb over time. The experimental results indicate high performance, with more than 94% average accuracy for the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Roadway System Assessment Using Bluetooth-Based Automatic Vehicle Identification Travel Time Data

    OpenAIRE

    Day, Christopher M.; Brennan, Thomas M.; Hainen, Alexander M.; Remias, Stephen M.; Bullock, Darcy M.

    2012-01-01

    This monograph is an exposition of several practice-ready methodologies for automatic vehicle identification (AVI) data collection systems. This includes considerations in the physical setup of the collection system as well as the interpretation of the data. An extended discussion is provided, with examples, demonstrating data techniques for converting the raw data into more concise metrics and views. Examples of statistical before-after tests are also provided. A series of case studies were ...

  17. Semi-automatic Term Extraction for the African Languages, with ...

    African Journals Online (AJOL)

    rbr

    for the treatment of single-word terms versus multi-word terms; and the various findings are summarised in a ... these days in many different types of dictionary to use the systematic evidence ... not form the focus of the current investigation. ... When one studies the first 25 unique terms on the KeyWord list, one sees that.

  18. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal neutron activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, C.

    1978-01-01

    A description is given of a computer code which automatically analyses gamma-ray spectra obtained with Ge(Li) detectors. The program contains features such as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification and calculation of masses and errors. Finally, the results obtained with our computer code for a lunar sample are reported and briefly discussed

  19. A pattern recognition approach based on DTW for automatic transient identification in nuclear power plants

    International Nuclear Information System (INIS)

    Galbally, Javier; Galbally, David

    2015-01-01

    Highlights: • Novel transient identification method for NPPs. • Low-complexity. • Low training data requirements. • High accuracy. • Fully reproducible protocol carried out on a real benchmark. - Abstract: Automatic identification of transients in nuclear power plants (NPPs) allows monitoring the fatigue damage accumulated by critical components during plant operation, and is therefore of great importance for ensuring that usage factors remain within the original design bases postulated by the plant designer. Although several schemes to address this important issue have been explored in the literature, there is still no definitive solution available. In the present work, a new method for automatic transient identification is proposed, based on the Dynamic Time Warping (DTW) algorithm, largely used in other related areas such as signature or speech recognition. The novel transient identification system is evaluated on real operational data following a rigorous pattern recognition protocol. Results show the high accuracy of the proposed approach, which is combined with other interesting features such as its low complexity and its very limited requirements of training data

  20. Exploration of Web Users' Search Interests through Automatic Subject Categorization of Query Terms.

    Science.gov (United States)

    Pu, Hsiao-tieh; Yang, Chyan; Chuang, Shui-Lung

    2001-01-01

    Proposes a mechanism that carefully integrates human and machine efforts to explore Web users' search interests. The approach consists of a four-step process: extraction of core terms; construction of subject taxonomy; automatic subject categorization of query terms; and observation of users' search interests. Research findings are proved valuable…

  1. Automatic identification of bullet signatures based on consecutive matching striae (CMS) criteria.

    Science.gov (United States)

    Chu, Wei; Thompson, Robert M; Song, John; Vorburger, Theodore V

    2013-09-10

    The consecutive matching striae (CMS) numeric criteria for firearm and toolmark identifications have been widely accepted by forensic examiners, although there have been questions concerning their observer subjectivity and limited statistical support. In this paper, based on signal processing and extraction, a model for the automatic and objective counting of CMS is proposed. The position and shape information of the striae on the bullet land is represented by a feature profile, which is used for determining the CMS number automatically. Rapid counting of the CMS number provides a basis for ballistics correlations with large databases and further statistical and probability analysis. Experimental results in this report using bullets fired from ten consecutively manufactured barrels support the developed model. Published by Elsevier Ireland Ltd.
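
    As an illustration of the counting step only (not the authors' signal-processing model), the number of consecutive matching striae between two striae-position lists can be obtained with a simple run-length count; the tolerance below is purely illustrative.

        def consecutive_matching_striae(striae_a, striae_b, tol=2.0):
            """Count the longest run of consecutively matching striae between two
            sorted position lists (positions and tolerance are illustrative)."""
            matches = []
            j = 0
            for pos in striae_a:
                # advance through striae_b until a position within tolerance could match
                while j < len(striae_b) and striae_b[j] < pos - tol:
                    j += 1
                matches.append(j < len(striae_b) and abs(striae_b[j] - pos) <= tol)
            # length of the longest run of True values = maximum CMS
            best = run = 0
            for m in matches:
                run = run + 1 if m else 0
                best = max(best, run)
            return best

        print(consecutive_matching_striae([1.0, 3.1, 5.2, 9.8], [1.2, 3.0, 5.5, 12.0]))  # -> 3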

  2. An Automatic Identification Procedure to Promote the use of FES-Cycling Training for Hemiparetic Patients

    Directory of Open Access Journals (Sweden)

    Emilia Ambrosini

    2014-01-01

    Full Text Available Cycling induced by Functional Electrical Stimulation (FES) training currently requires a manual setting of different parameters, which is a time-consuming and scarcely repeatable procedure. We proposed an automatic procedure for setting session-specific parameters optimized for hemiparetic patients. This procedure consisted of the identification of the stimulation strategy as the angular ranges during which FES drove the motion, the comparison between the identified strategy and the physiological muscular activation strategy, and the setting of the pulse amplitude and duration of each stimulated muscle. Preliminary trials on 10 healthy volunteers helped define the procedure. Feasibility tests on 8 hemiparetic patients (5 stroke, 3 traumatic brain injury) were performed. The procedure maximized the motor output within the tolerance constraint, identified a biomimetic strategy in 6 patients, and always lasted less than 5 minutes. Its reasonable duration and automatic nature make the procedure usable at the beginning of every training session, potentially enhancing the performance of FES-cycling training.

  3. Identification of mycobacterium tuberculosis in sputum smear slide using automatic scanning microscope

    Science.gov (United States)

    Rulaningtyas, Riries; Suksmono, Andriyan B.; Mengko, Tati L. R.; Saptawati, Putri

    2015-04-01

    Sputum smear observation plays an important role in tuberculosis (TB) diagnosis, and accurate identification is needed to avoid diagnostic errors. In developing countries, sputum smear slides are commonly examined with a conventional light microscope on Ziehl-Neelsen stained tissue, since the microscope is inexpensive to maintain. Clinicians screen sputum smear slides manually, which is time consuming and requires extensive training to detect the presence of TB bacilli (Mycobacterium tuberculosis) accurately, especially for negative slides and slides with few TB bacilli. To assist clinicians, we propose an automatic scanning microscope with automatic identification of TB bacilli. The designed system drives the field movement of a light microscope with a stepper motor controlled by a microcontroller, and every sputum smear field is captured by a camera. Several image processing techniques are then applied to the sputum smear images: a color threshold on the hue channel in HSV color space is used for background subtraction, and the Sobel edge detection algorithm is used for TB bacilli segmentation. Shape-based features are extracted for bacilli analysis, and a neural network then classifies whether an object is a TB bacillus. The results indicate that the identification works well and detects TB bacilli accurately in sputum smear slides with normal staining, but not in over-stained or under-stained tissue slides. Overall, the designed system can make sputum smear observation easier for clinicians.
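
    The segmentation and feature-extraction chain described above can be roughly sketched with OpenCV as follows; the HSV threshold range, minimum area and shape features are illustrative assumptions, and the neural-network classification stage is omitted.

        import cv2
        import numpy as np

        def candidate_bacilli_features(image_bgr):
            """Segment red-stained candidate objects by HSV thresholding and
            return simple shape features for each candidate (illustrative values)."""
            hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
            # assumed hue/saturation range for the red Ziehl-Neelsen stain
            mask = cv2.inRange(hsv, (160, 60, 40), (180, 255, 255))
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
            features = []
            for c in contours:
                area = cv2.contourArea(c)
                if area < 10:                      # ignore tiny specks
                    continue
                perimeter = cv2.arcLength(c, True)
                x, y, w, h = cv2.boundingRect(c)
                elongation = max(w, h) / max(1, min(w, h))
                features.append([area, perimeter, elongation])
            # in the described system these features would be fed to a neural network
            return np.array(features)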

  4. Automatic Identification of Subtechniques in Skating-Style Roller Skiing Using Inertial Sensors

    Science.gov (United States)

    Sakurai, Yoshihisa; Fujita, Zenya; Ishige, Yusuke

    2016-01-01

    This study aims to develop and validate an automated system for identifying skating-style cross-country subtechniques using inertial sensors. In the first experiment, the performance of a male cross-country skier was used to develop an automated identification system. In the second, eight male and seven female college cross-country skiers participated to validate the developed identification system. Each subject wore inertial sensors on both wrists and both roller skis, and a small video camera on a backpack. All subjects skied through a 3450 m roller ski course using a skating style at their maximum speed. The adopted subtechniques were identified by the automated method based on the data obtained from the sensors, as well as by visual observations from a video recording of the same ski run. The system correctly identified 6418 subtechniques from a total of 6768 cycles, which indicates an accuracy of 94.8%. The precisions of the automatic system for identifying the V1R, V1L, V2R, V2L, V2AR, and V2AL subtechniques were 87.6%, 87.0%, 97.5%, 97.8%, 92.1%, and 92.0%, respectively. Most incorrect identification cases occurred during a subtechnique identification that included a transition and turn event. Identification accuracy can be improved by separately identifying transition and turn events. This system could be used to evaluate each skier’s subtechniques in course conditions. PMID:27049388

  5. Automatic Identification of Subtechniques in Skating-Style Roller Skiing Using Inertial Sensors

    Directory of Open Access Journals (Sweden)

    Yoshihisa Sakurai

    2016-04-01

    Full Text Available This study aims to develop and validate an automated system for identifying skating-style cross-country subtechniques using inertial sensors. In the first experiment, the performance of a male cross-country skier was used to develop an automated identification system. In the second, eight male and seven female college cross-country skiers participated to validate the developed identification system. Each subject wore inertial sensors on both wrists and both roller skis, and a small video camera on a backpack. All subjects skied through a 3450 m roller ski course using a skating style at their maximum speed. The adopted subtechniques were identified by the automated method based on the data obtained from the sensors, as well as by visual observations from a video recording of the same ski run. The system correctly identified 6418 subtechniques from a total of 6768 cycles, which indicates an accuracy of 94.8%. The precisions of the automatic system for identifying the V1R, V1L, V2R, V2L, V2AR, and V2AL subtechniques were 87.6%, 87.0%, 97.5%, 97.8%, 92.1%, and 92.0%, respectively. Most incorrect identification cases occurred during a subtechnique identification that included a transition and turn event. Identification accuracy can be improved by separately identifying transition and turn events. This system could be used to evaluate each skier’s subtechniques in course conditions.

  6. Forming and detection of digital watermarks in the System for Automatic Identification of VHF Transmissions

    Directory of Open Access Journals (Sweden)

    О. В. Шишкін

    2013-07-01

    Full Text Available Forming and detection algorithms for digital watermarks are designed for automatic identification of VHF radiotelephone transmissions in the maritime and aeronautical mobile services. Inaudibility and interference resistance of the embedded digital data are provided by means of OFDM technology together with a normalized distribution of distortions and data packet detection by a hash function. Experiments were carried out with the ship's radio station RT-2048 Sailor and a USB ADC-DAC module of type Е14-140M L-CARD, with off-line processing in Matlab.

  7. Deep learning for automatic localization, identification, and segmentation of vertebral bodies in volumetric MR images

    Science.gov (United States)

    Suzani, Amin; Rasoulian, Abtin; Seitel, Alexander; Fels, Sidney; Rohling, Robert N.; Abolmaesumi, Purang

    2015-03-01

    This paper proposes an automatic method for vertebra localization, labeling, and segmentation in multi-slice Magnetic Resonance (MR) images. Prior work in this area on MR images mostly requires user interaction, while our method is fully automatic. Cubic intensity-based features are extracted from image voxels. A deep learning approach is used for simultaneous localization and identification of vertebrae. The localized points are refined by local thresholding in the region of the detected vertebral column. Thereafter, a statistical multi-vertebrae model is initialized on the localized vertebrae. An iterative Expectation Maximization technique is used to register the vertebral body of the model to the image edges and obtain a segmentation of the lumbar vertebral bodies. The method is evaluated by applying it to nine volumetric MR images of the spine. The results demonstrate 100% vertebra identification and a mean surface error of below 2.8 mm for 3D segmentation. Computation time is less than three minutes per high-resolution volumetric image.

  8. Automatic and Controlled Processing in Sentence Recall: The Role of Long-Term and Working Memory

    Science.gov (United States)

    Jefferies, E.; Lambon Ralph, M.A.; Baddeley, A.D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley,…

  9. Evaluating current automatic de-identification methods with Veteran’s health administration clinical documents

    Directory of Open Access Journals (Sweden)

    Ferrández Oscar

    2012-07-01

    Full Text Available Abstract Background The increased use and adoption of Electronic Health Records (EHR) causes a tremendous growth in digital information useful for clinicians, researchers and many other operational purposes. However, this information is rich in Protected Health Information (PHI), which severely restricts its access and possible uses. A number of investigators have developed methods for automatically de-identifying EHR documents by removing PHI, as specified in the Health Insurance Portability and Accountability Act “Safe Harbor” method. This study focuses on the evaluation of existing automated text de-identification methods and tools, as applied to Veterans Health Administration (VHA) clinical documents, to assess which methods perform better with each category of PHI found in our clinical notes; and when new methods are needed to improve performance. Methods We installed and evaluated five text de-identification systems “out-of-the-box” using a corpus of VHA clinical documents. The systems based on machine learning methods were trained with the 2006 i2b2 de-identification corpora and evaluated with our VHA corpus, and also evaluated with a ten-fold cross-validation experiment using our VHA corpus. We counted exact, partial, and fully contained matches with reference annotations, considering each PHI type separately, or only one unique ‘PHI’ category. Performance of the systems was assessed using recall (equivalent to sensitivity) and precision (equivalent to positive predictive value) metrics, as well as the F2-measure. Results Overall, systems based on rules and pattern matching achieved better recall, and precision was always better with systems based on machine learning approaches. The highest “out-of-the-box” F2-measure was 67% for partial matches; the best precision and recall were 95% and 78%, respectively. Finally, the ten-fold cross validation experiment allowed for an increase of the F2-measure to 79% with partial matches
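
    For reference, the recall, precision and F2-measure used in this evaluation can be computed directly from true-positive, false-positive and false-negative counts; the sketch below is a generic illustration with made-up counts, not code from any of the evaluated systems.

        def precision_recall_fbeta(tp, fp, fn, beta=2.0):
            """Precision, recall and F-beta; beta=2 weights recall more heavily than precision."""
            precision = tp / (tp + fp) if tp + fp else 0.0
            recall = tp / (tp + fn) if tp + fn else 0.0
            if precision == 0.0 and recall == 0.0:
                return precision, recall, 0.0
            fbeta = (1 + beta**2) * precision * recall / (beta**2 * precision + recall)
            return precision, recall, fbeta

        # e.g. 780 PHI mentions found correctly, 40 spurious detections, 220 missed
        print(precision_recall_fbeta(780, 40, 220))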

  10. Visible and NIR spectral band combination to produce high security ID tags for automatic identification

    Science.gov (United States)

    Pérez-Cabré, Elisabet; Millán, María S.; Javidi, Bahram

    2006-09-01

    Verification of a piece of information and/or authentication of a given object or person are common operations carried out by automatic security systems that can be applied, for instance, to control the entrance to restricted areas, access to public buildings, identification of cardholders, etc. Vulnerability of such security systems may depend on the ease of counterfeiting the information used as a piece of identification for verification and authentication. To protect data against tampering, the signature that identifies an object is usually encrypted to avoid easy recognition at human sight and easy reproduction using conventional devices for imaging or scanning. To make counterfeiting even more difficult, we propose to combine data from visible and near infrared (NIR) spectral bands. By doing this, neither the visible content nor the NIR data by themselves are sufficient to allow the signature recognition and thus, the identification of a given object. Only the appropriate combination of both signals permits a satisfactory authentication. In addition, the resulting signature is encrypted following a fully-phase encryption technique and the obtained complex-amplitude distribution is encoded on an ID tag. Spatial multiplexing of the encrypted signature allows us to build a distortion-invariant ID tag, so that remote authentication can be achieved even if the tag is captured under rotation or at different distances. We also explore the possibility of using partial information of the encrypted signature to simplify the ID tag design.

  11. Benefit Analyses of Technologies for Automatic Identification to Be Implemented in the Healthcare Sector

    Science.gov (United States)

    Krey, Mike; Schlatter, Ueli

    The tasks and objectives of automatic identification (Auto-ID) are to provide information on goods and products. It has already been established for years in the areas of logistics and trading and can no longer be ignored by the German healthcare sector. Some German hospitals have already discovered the capabilities of Auto-ID. Improvements in quality and safety and reductions in risk, cost and time are areas where gains are achievable. Privacy protection, legal restraints, and the personal rights of patients and staff members are just a few aspects which make the healthcare sector a sensitive field for the implementation of Auto-ID. Auto-ID in this context comprises the different technologies, methods and products for the registration, provision and storage of relevant data. With the help of a quantifiable and science-based evaluation, an answer is sought as to which Auto-ID has the highest capability to be implemented in the healthcare business.

  12. A support vector machine approach to the automatic identification of fluorescence spectra emitted by biological agents

    Science.gov (United States)

    Gelfusa, M.; Murari, A.; Lungaroni, M.; Malizia, A.; Parracino, S.; Peluso, E.; Cenciarelli, O.; Carestia, M.; Pizzoferrato, R.; Vega, J.; Gaudio, P.

    2016-10-01

    Two of the major new concerns of modern societies are biosecurity and biosafety. Several biological agents (BAs) such as toxins, bacteria, viruses, fungi and parasites are able to cause damage to living systems, whether humans, animals or plants. Optical techniques, in particular Light Detection And Ranging (LIDAR), based on the transmission of laser pulses and analysis of the return signals, can be successfully applied to monitoring the release of biological agents into the atmosphere. It is well known that most biological agents tend to emit specific fluorescence spectra, which in principle allow their detection and identification, if excited by light of the appropriate wavelength. For these reasons, the detection of the UV Light Induced Fluorescence (UV-LIF) emitted by BAs is particularly promising. On the other hand, the stand-off detection of BAs poses a series of challenging issues; one of the most severe is the automatic discrimination between various agents which emit very similar fluorescence spectra. In this paper, a new data analysis method, based on a combination of advanced filtering techniques and Support Vector Machines, is described. The proposed approach covers all the aspects of the data analysis process, from filtering and denoising to automatic recognition of the agents. A systematic series of numerical tests has been performed to assess the potential and limits of the proposed methodology. The first investigations of experimental data have already given very encouraging results.
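
    A minimal sketch of the final classification stage, training a support vector machine on already filtered and denoised spectra, is given below using scikit-learn; the spectra, labels and SVM hyperparameters are placeholders, not the authors' configuration.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Hypothetical data: each row is a denoised UV-LIF spectrum, labels name the agent.
        rng = np.random.default_rng(0)
        spectra = rng.normal(size=(60, 128))          # 60 spectra, 128 wavelength bins
        labels = np.repeat(["agent_A", "agent_B", "background"], 20)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
        clf.fit(spectra, labels)

        unknown = rng.normal(size=(1, 128))           # a newly measured spectrum
        print(clf.predict(unknown))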

  13. Methodology of parameter identification for models of control objects of an automatic trailing system

    Directory of Open Access Journals (Sweden)

    I.V. Zimchuk

    2017-04-01

    Full Text Available The determining factor for successfully synthesizing optimal control systems for different processes is the adequacy of the mathematical model of the control object. In practice, the parameters of the object can differ from those assumed a priori, so they need to be refined. In this context, the article presents the results of the development and application of a method for identifying the parameters of mathematical models of the control object of an automatic trailing system. The problem is solved under the assumptions that the control object is fully controllable and observable and that its differential equation is known a priori; the coefficients of this equation are to be determined. The identification quality criterion is the minimization of the integral of the squared identification error. The method is based on a state-space description of the object dynamics. The identification equation is synthesized using a vector-matrix representation of the model and describes the relationship between the coefficients of the state and control matrices and the inputs and outputs of the object. The initial data for the calculation are the results of an experimental investigation of the response of the phase coordinates of the control object to a typical input signal. Computing the model parameters reduces to solving a system of first-order equations. The approach is illustrated by an example of identifying the coefficients of a first-order transfer function of a control object. Results of digital simulation are presented which confirm the validity of the mathematical derivations. The approach enables the identification of models of one-dimensional and multidimensional objects and does not require a large amount of calculation. The order of the identified model is limited by the ability to measure the phase coordinates of the corresponding control object. The practical significance of the work is
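
    As a generic illustration of the idea (not the article's equations), the sketch below estimates the two coefficients of a first-order model dy/dt = a*y + b*u from sampled input/output data by ordinary least squares over the sampled data.

        import numpy as np

        def identify_first_order(t, u, y):
            """Fit dy/dt = a*y + b*u to sampled data by least squares."""
            dydt = np.gradient(y, t)                  # numerical derivative of the output
            A = np.column_stack([y, u])               # regressor matrix [y(t), u(t)]
            (a, b), *_ = np.linalg.lstsq(A, dydt, rcond=None)
            return a, b

        # Hypothetical step-response data for a system with a = -2, b = 4
        t = np.linspace(0, 5, 500)
        u = np.ones_like(t)
        y = 2.0 * (1.0 - np.exp(-2.0 * t))            # analytic response of dy/dt = -2y + 4u
        print(identify_first_order(t, u, y))          # approximately (-2.0, 4.0)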

  14. Automatic and controlled processing in sentence recall: The role of long-term and working memory

    OpenAIRE

    Jefferies, Elizabeth; Lambon Ralph, Matthew A.; Baddeley, Alan D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley, 2000) proposes that the executive component of working memory plays a crucial role in the formation of links between different representational formats...

  15. Automatic Identification of Alpine Mass Movements by a Combination of Seismic and Infrasound Sensors

    Science.gov (United States)

    Hübl, Johannes; McArdell, Brian W.; Walter, Fabian

    2018-01-01

    The automatic detection and identification of alpine mass movements such as debris flows, debris floods, or landslides have been of increasing importance for devising mitigation measures in densely populated and intensively used alpine regions. Since these mass movements emit characteristic seismic and acoustic waves in the low-frequency range (<30 Hz), several approaches have already been developed for detection and warning systems based on these signals. However, a combination of the two methods, for improving detection probability and reducing false alarms, is still applied rarely. This paper presents an update and extension of a previously published approach for a detection and identification system based on a combination of seismic and infrasound sensors. Furthermore, this work evaluates the possible early warning times at several test sites and aims to analyze the seismic and infrasound spectral signature produced by different sediment-related mass movements to identify the process type and estimate the magnitude of the event. Thus, this study presents an initial method for estimating the peak discharge and total volume of debris flows based on infrasound data. Tests on several catchments show that this system can detect and identify mass movements in real time directly at the sensor site with high accuracy and a low false alarm ratio. PMID:29789449

  16. A new technology for automatic identification and sorting of plastics for recycling.

    Science.gov (United States)

    Ahmad, S R

    2004-10-01

    A new technology for automatic sorting of plastics, based upon optical identification of fluorescence signatures of dyes, incorporated in such materials in trace concentrations prior to product manufacturing, is described. Three commercial tracers were selected primarily on the basis of their good absorbency in the 310-370 nm spectral band and their identifiable narrow-band fluorescence signatures in the visible band of the spectrum when present in binary combinations. This absorption band was selected because of the availability of strong emission lines in this band from a commercial Hg-arc lamp and high fluorescence quantum yields of the tracers at this excitation wavelength band. The plastics chosen for tracing and identification are HDPE, LDPE, PP, EVA, PVC and PET, and the tracers were compatible and chemically non-reactive with the host matrices and did not affect the transparency of the plastics. The design of the monochromatic and collimated excitation source and of the sensor system is described, and their performance in identifying and sorting plastics doped with tracers at a few parts per million concentration levels is evaluated. In an industrial sorting system, the sensor was able to sort 300 mm long plastic bottles at a conveyor belt speed of 3.5 m/s with a sorting purity of ~95%. The limitation was imposed by mechanical singulation irregularities at high speed and the limited processing speed of the computer used.

  17. A hybrid approach to automatic de-identification of psychiatric notes.

    Science.gov (United States)

    Lee, Hee-Jin; Wu, Yonghui; Zhang, Yaoyun; Xu, Jun; Xu, Hua; Roberts, Kirk

    2017-11-01

    De-identification, or identifying and removing protected health information (PHI) from clinical data, is a critical step in making clinical data available for clinical applications and research. This paper presents a natural language processing system for automatic de-identification of psychiatric notes, which was designed to participate in the 2016 CEGS N-GRID shared task Track 1. The system has a hybrid structure that combines machine learning techniques and rule-based approaches. The rule-based components exploit the structure of the psychiatric notes as well as characteristic surface patterns of PHI mentions. The machine learning components utilize supervised learning with rich features. In addition, the system performance was boosted by integrating additional data into the training set through domain adaptation. The hybrid system achieved an overall micro-averaged F-score of 90.74 on the test set, second-best among all the participants of the CEGS N-GRID task. Copyright © 2017. Published by Elsevier Inc.

  18. Automatic Identification of Alpine Mass Movements by a Combination of Seismic and Infrasound Sensors

    Directory of Open Access Journals (Sweden)

    Andreas Schimmel

    2018-05-01

    Full Text Available The automatic detection and identification of alpine mass movements such as debris flows, debris floods, or landslides have been of increasing importance for devising mitigation measures in densely populated and intensively used alpine regions. Since these mass movements emit characteristic seismic and acoustic waves in the low-frequency range (<30 Hz), several approaches have already been developed for detection and warning systems based on these signals. However, a combination of the two methods, for improving detection probability and reducing false alarms, is still applied rarely. This paper presents an update and extension of a previously published approach for a detection and identification system based on a combination of seismic and infrasound sensors. Furthermore, this work evaluates the possible early warning times at several test sites and aims to analyze the seismic and infrasound spectral signature produced by different sediment-related mass movements to identify the process type and estimate the magnitude of the event. Thus, this study presents an initial method for estimating the peak discharge and total volume of debris flows based on infrasound data. Tests on several catchments show that this system can detect and identify mass movements in real time directly at the sensor site with high accuracy and a low false alarm ratio.

  19. Automatic multimodal detection for long-term seizure documentation in epilepsy.

    Science.gov (United States)

    Fürbass, F; Kampusch, S; Kaniusas, E; Koren, J; Pirker, S; Hopfengärtner, R; Stefan, H; Kluge, T; Baumgartner, C

    2017-08-01

    This study investigated the sensitivity and false detection rate of a multimodal automatic seizure detection algorithm and the applicability to reduced electrode montages for long-term seizure documentation in epilepsy patients. An automatic seizure detection algorithm based on EEG, EMG, and ECG signals was developed. EEG/ECG recordings of 92 patients from two epilepsy monitoring units including 494 seizures were used to assess detection performance. EMG data were extracted by bandpass filtering of EEG signals. Sensitivity and false detection rate were evaluated for each signal modality and for reduced electrode montages. All focal seizures evolving to bilateral tonic-clonic (BTCS, n=50) and 89% of focal seizures (FS, n=139) were detected. Average sensitivity in temporal lobe epilepsy (TLE) patients was 94% and 74% in extratemporal lobe epilepsy (XTLE) patients. Overall detection sensitivity was 86%. Average false detection rate was 12.8 false detections in 24h (FD/24h) for TLE and 22 FD/24h in XTLE patients. Utilization of 8 frontal and temporal electrodes reduced average sensitivity from 86% to 81%. Our automatic multimodal seizure detection algorithm shows high sensitivity with full and reduced electrode montages. Evaluation of different signal modalities and electrode montages paves the way for semi-automatic seizure documentation systems. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
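
    The EMG extraction step mentioned above — bandpass filtering of the EEG channels — could be sketched with SciPy as follows; the passband edges and sampling rate are illustrative assumptions, not the authors' settings.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def extract_emg(eeg, fs, low=20.0, high=100.0, order=4):
            """Zero-phase Butterworth bandpass to isolate muscle activity from an EEG channel."""
            nyq = fs / 2.0
            b, a = butter(order, [low / nyq, high / nyq], btype="bandpass")
            return filtfilt(b, a, eeg)

        fs = 256.0                                    # hypothetical sampling rate in Hz
        eeg = np.random.randn(int(60 * fs))           # one minute of a single channel
        emg = extract_emg(eeg, fs)
        print(emg.shape)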

  20. Ontorat: automatic generation of new ontology terms, annotations, and axioms based on ontology design patterns.

    Science.gov (United States)

    Xiang, Zuoshuang; Zheng, Jie; Lin, Yu; He, Yongqun

    2015-01-01

    It is time-consuming to build an ontology with many terms and axioms. Thus it is desired to automate the process of ontology development. Ontology Design Patterns (ODPs) provide a reusable solution to solve a recurrent modeling problem in the context of ontology engineering. Because ontology terms often follow specific ODPs, the Ontology for Biomedical Investigations (OBI) developers proposed a Quick Term Templates (QTTs) process targeted at generating new ontology classes following the same pattern, using term templates in a spreadsheet format. Inspired by the ODPs and QTTs, the Ontorat web application is developed to automatically generate new ontology terms, annotations of terms, and logical axioms based on a specific ODP(s). The inputs of an Ontorat execution include axiom expression settings, an input data file, ID generation settings, and a target ontology (optional). The axiom expression settings can be saved as a predesigned Ontorat setting format text file for reuse. The input data file is generated based on a template file created by a specific ODP (text or Excel format). Ontorat is an efficient tool for ontology expansion. Different use cases are described. For example, Ontorat was applied to automatically generate over 1,000 Japan RIKEN cell line cell terms with both logical axioms and rich annotation axioms in the Cell Line Ontology (CLO). Approximately 800 licensed animal vaccines were represented and annotated in the Vaccine Ontology (VO) by Ontorat. The OBI team used Ontorat to add assay and device terms required by ENCODE project. Ontorat was also used to add missing annotations to all existing Biobank specific terms in the Biobank Ontology. A collection of ODPs and templates with examples are provided on the Ontorat website and can be reused to facilitate ontology development. With ever increasing ontology development and applications, Ontorat provides a timely platform for generating and annotating a large number of ontology terms by following

  1. Integration and segregation of large-scale brain networks during short-term task automatization.

    Science.gov (United States)

    Mohr, Holger; Wolfensteller, Uta; Betzel, Richard F; Mišić, Bratislav; Sporns, Olaf; Richiardi, Jonas; Ruge, Hannes

    2016-11-03

    The human brain is organized into large-scale functional networks that can flexibly reconfigure their connectivity patterns, supporting both rapid adaptive control and long-term learning processes. However, it has remained unclear how short-term network dynamics support the rapid transformation of instructions into fluent behaviour. Comparing fMRI data of a learning sample (N=70) with a control sample (N=67), we find that increasingly efficient task processing during short-term practice is associated with a reorganization of large-scale network interactions. Practice-related efficiency gains are facilitated by enhanced coupling between the cingulo-opercular network and the dorsal attention network. Simultaneously, short-term task automatization is accompanied by decreasing activation of the fronto-parietal network, indicating a release of high-level cognitive control, and a segregation of the default mode network from task-related networks. These findings suggest that short-term task automatization is enabled by the brain's ability to rapidly reconfigure its large-scale network organization involving complementary integration and segregation processes.

  2. Semi-automatic Term Extraction for an isiZulu Linguistic Terms ...

    African Journals Online (AJOL)

    user

    This paper advances the use of frequency analysis and keyword analysis as strategies to extract terms for the compilation of a dictionary of isiZulu linguistic terms. The study uses the isiZulu National Corpus (INC) of about 1.2 million tokens as a reference corpus as well as an LSP corpus of about 100,000 tokens as a ...
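
    A bare-bones illustration of the keyword (keyness) comparison between an LSP corpus and a reference corpus is sketched below using the standard log-likelihood statistic; the tokenisation, the example tokens and the ranking cut-off are assumptions, not the paper's actual procedure.

        import math
        from collections import Counter

        def keyness(lsp_tokens, ref_tokens, top=25):
            """Rank words by log-likelihood keyness of the LSP corpus against the reference corpus."""
            lsp, ref = Counter(lsp_tokens), Counter(ref_tokens)
            n_lsp, n_ref = sum(lsp.values()), sum(ref.values())
            scores = {}
            for w, a in lsp.items():
                b = ref.get(w, 0)
                e1 = n_lsp * (a + b) / (n_lsp + n_ref)   # expected frequencies under no difference
                e2 = n_ref * (a + b) / (n_lsp + n_ref)
                ll = 2 * (a * math.log(a / e1) + (b * math.log(b / e2) if b else 0))
                scores[w] = ll
            return sorted(scores, key=scores.get, reverse=True)[:top]

        print(keyness("term corpus term keyword".split(),
                      "the corpus of the reference".split(), top=2))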

  3. The Iqmulus Urban Showcase: Automatic Tree Classification and Identification in Huge Mobile Mapping Point Clouds

    Science.gov (United States)

    Böhm, J.; Bredif, M.; Gierlinger, T.; Krämer, M.; Lindenberg, R.; Liu, K.; Michel, F.; Sirmacek, B.

    2016-06-01

    Current 3D data capturing as implemented on for example airborne or mobile laser scanning systems is able to efficiently sample the surface of a city by billions of unselective points during one working day. What is still difficult is to extract and visualize meaningful information hidden in these point clouds with the same efficiency. This is where the FP7 IQmulus project enters the scene. IQmulus is an interactive facility for processing and visualizing big spatial data. In this study the potential of IQmulus is demonstrated on a laser mobile mapping point cloud of 1 billion points sampling ~ 10 km of street environment in Toulouse, France. After the data is uploaded to the IQmulus Hadoop Distributed File System, a workflow is defined by the user consisting of retiling the data followed by a PCA driven local dimensionality analysis, which runs efficiently on the IQmulus cloud facility using a Spark implementation. Points scattering in 3 directions are clustered in the tree class, and are separated next into individual trees. Five hours of processing at the 12 node computing cluster results in the automatic identification of 4000+ urban trees. Visualization of the results in the IQmulus fat client helps users to appreciate the results, and developers to identify remaining flaws in the processing workflow.
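
    The PCA-driven local dimensionality analysis mentioned above essentially reduces to eigenvalue ratios of each point's neighborhood covariance; the sketch below shows one common formulation (linearity, planarity, scattering) with an assumed neighborhood size, not the IQmulus Spark implementation.

        import numpy as np
        from scipy.spatial import cKDTree

        def dimensionality_features(points, k=20):
            """Linearity / planarity / scattering per point from neighborhood PCA eigenvalues."""
            tree = cKDTree(points)
            _, idx = tree.query(points, k=k)
            feats = np.zeros((len(points), 3))
            for i, nbrs in enumerate(idx):
                nb = points[nbrs] - points[nbrs].mean(axis=0)
                evals = np.linalg.eigvalsh(nb.T @ nb / k)[::-1]     # l1 >= l2 >= l3
                l1, l2, l3 = np.sqrt(np.maximum(evals, 0))
                feats[i] = [(l1 - l2) / l1, (l2 - l3) / l1, l3 / l1]
            # points scattering in 3 directions (high third column) are tree candidates
            return feats

        pts = np.random.rand(1000, 3)
        print(dimensionality_features(pts, k=15)[:3])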

  4. THE IQMULUS URBAN SHOWCASE: AUTOMATIC TREE CLASSIFICATION AND IDENTIFICATION IN HUGE MOBILE MAPPING POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    J. Böhm

    2016-06-01

    Full Text Available Current 3D data capturing as implemented on for example airborne or mobile laser scanning systems is able to efficiently sample the surface of a city by billions of unselective points during one working day. What is still difficult is to extract and visualize meaningful information hidden in these point clouds with the same efficiency. This is where the FP7 IQmulus project enters the scene. IQmulus is an interactive facility for processing and visualizing big spatial data. In this study the potential of IQmulus is demonstrated on a laser mobile mapping point cloud of 1 billion points sampling ~ 10 km of street environment in Toulouse, France. After the data is uploaded to the IQmulus Hadoop Distributed File System, a workflow is defined by the user consisting of retiling the data followed by a PCA driven local dimensionality analysis, which runs efficiently on the IQmulus cloud facility using a Spark implementation. Points scattering in 3 directions are clustered in the tree class, and are separated next into individual trees. Five hours of processing at the 12 node computing cluster results in the automatic identification of 4000+ urban trees. Visualization of the results in the IQmulus fat client helps users to appreciate the results, and developers to identify remaining flaws in the processing workflow.

  5. Integration of low level and ontology derived features for automatic weapon recognition and identification

    Science.gov (United States)

    Sirakov, Nikolay M.; Suh, Sang; Attardo, Salvatore

    2011-06-01

    This paper presents a further step of research toward the development of a quick and accurate weapons identification methodology and system. A basic stage of this methodology is the automatic acquisition and updating of a weapons ontology as a source for deriving high-level weapons information. The present paper outlines the main ideas used to approach this goal. In the next stage, a clustering approach is suggested on the basis of a hierarchy of concepts. An inherent slot of every node of the proposed ontology is a low-level features vector (LLFV), which facilitates the search through the ontology. Part of the LLFV is information about the object's parts. To partition an object, a new approach is presented that is capable of defining the object's concavities, which are used to mark the end points of weapon parts, considered as convexities. Further, an existing matching approach is optimized to determine whether an ontological object matches the objects from an input image. Objects from derived ontological clusters are considered for the matching process. Image resizing is studied and applied to decrease the runtime of the matching approach and to investigate its rotational and scaling invariance. A set of experiments is performed to validate the theoretical concepts.

  6. Adoption of automatic identification systems by grocery retailers in the Johannesburg area

    Directory of Open Access Journals (Sweden)

    Christopher C. Darlington

    2011-11-01

    Full Text Available Retailers not only need the right data capture technology to meet the requirements of their applications, they must also decide on what the optimum technology is from the different symbologies that have been developed over the years. Automatic identification systems (AIS) are a priority to decision makers as they attempt to obtain the best blend of equipment to ensure greater loss prevention and higher reliability in data capture. However there is a risk of having too simplistic a view of adopting AIS, since no one solution is applicable across an industry or business model. This problem is addressed through an exploratory, descriptive study, where the nature and value of AIS adoption by grocery retailers in the Johannesburg area is interrogated. Mixed empirical results indicate that, as retailers adopt AIS in order to improve their supply chain management systems, different types of applications are associated with various constraints and opportunities. Overall this study is in line with previous research that supports the notion that supply chain decisions are of a strategic nature even though efficient management of information is a day-to-day business operational decision.

  7. Modified automatic term selection v2: A faster algorithm to calculate inelastic scattering cross-sections

    Energy Technology Data Exchange (ETDEWEB)

    Rusz, Ján, E-mail: jan.rusz@fysik.uu.se

    2017-06-15

    Highlights: • New algorithm for calculating the double differential scattering cross-section. • Shows good convergence properties. • Outperforms the older MATS algorithm, particularly in zone axis calculations. - Abstract: We present a new algorithm for calculating the inelastic scattering cross-section for fast electrons. Compared to the previous Modified Automatic Term Selection (MATS) algorithm (Rusz et al. [18]), it has far better convergence properties in zone axis calculations and it allows the contributions of individual atoms to be identified. One can think of it as a blend of the MATS algorithm and a method described by Weickenmeier and Kohl [10].

  8. Terminology of the public relations field: corpus — automatic term recognition — terminology database

    Directory of Open Access Journals (Sweden)

    Nataša Logar Berginc

    2013-12-01

    Full Text Available The article describes an analysis of automatic term recognition results performed for single- and multi-word terms with the LUIZ term extraction system. The target application of the results is a terminology database of Public Relations and the main resource the KoRP Public Relations Corpus. Our analysis is focused on two segments: (a) single-word noun term candidates, which we compare with the frequency list of nouns from KoRP and evaluate termhood on the basis of the judgements of two domain experts, and (b) multi-word term candidates with verb and noun as headword. In order to better assess the performance of the system and the soundness of our approach we also performed an analysis of recall. Our results show that the terminological relevance of extracted nouns is indeed higher than that of merely frequent nouns, and that verbal phrases only rarely count as proper terms. The most productive patterns of multi-word terms with noun as a headword have the following structure: [adjective + noun], [adjective + and + adjective + noun] and [adjective + adjective + noun]. The analysis of recall shows low inter-annotator agreement, but nevertheless very satisfactory recall levels.

  9. Automatic Threshold Determination for a Local Approach of Change Detection in Long-Term Signal Recordings

    Directory of Open Access Journals (Sweden)

    David Hewson

    2007-01-01

    Full Text Available CUSUM (cumulative sum) is a well-known method that can be used to detect changes in a signal when the parameters of this signal are known. This paper presents an adaptation of the CUSUM-based change detection algorithms to long-term signal recordings where the various hypotheses contained in the signal are unknown. The starting point of the work was the dynamic cumulative sum (DCS) algorithm, previously developed for application to long-term electromyography (EMG) recordings. DCS has been improved in two ways. The first was a new procedure to estimate the distribution parameters to ensure the respect of the detectability property. The second was the definition of two separate, automatically determined thresholds. One of them (lower threshold) acted to stop the estimation process, the other one (upper threshold) was applied to the detection function. The automatic determination of the thresholds was based on the Kullback-Leibler distance which gives information about the distance between the detected segments (events).
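
    For readers unfamiliar with the underlying detector, a textbook one-sided CUSUM for a Gaussian mean shift (with known pre- and post-change means, unlike the adaptive DCS variant described above) can be sketched as follows; the threshold and simulated signal are illustrative.

        import numpy as np

        def cusum_detect(x, mu0, mu1, sigma, threshold):
            """One-sided CUSUM for a mean shift from mu0 to mu1; returns alarm index or -1."""
            s = 0.0
            for k, xk in enumerate(x):
                # log-likelihood ratio increment of Gaussian hypotheses mu1 vs mu0
                llr = (mu1 - mu0) / sigma**2 * (xk - (mu0 + mu1) / 2.0)
                s = max(0.0, s + llr)          # reset at zero keeps the test one-sided
                if s > threshold:
                    return k
            return -1

        rng = np.random.default_rng(1)
        signal = np.concatenate([rng.normal(0, 1, 500), rng.normal(1.5, 1, 500)])
        print(cusum_detect(signal, mu0=0.0, mu1=1.5, sigma=1.0, threshold=10.0))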

  10. Identification of Units and Other Terms in Czech Medical Records

    Czech Academy of Sciences Publication Activity Database

    Zvára Jr., Karel; Kašpar, Václav

    2010-01-01

    Vol. 6, No. 1 (2010), pp. 78-82 ISSN 1801-5603 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords: natural language processing * healthcare documentation * medical reports * EHR * finite-state machine * regular expression Subject RIV: IN - Informatics, Computer Science http://www.ejbi.org/en/ejbi/article/61-en-identification-of-units-and-other-terms-in-czech-medical-records.html

  11. GPU-accelerated automatic identification of robust beam setups for proton and carbon-ion radiotherapy

    International Nuclear Information System (INIS)

    Ammazzalorso, F; Jelen, U; Bednarz, T

    2014-01-01

    We demonstrate acceleration on graphic processing units (GPU) of automatic identification of robust particle therapy beam setups, minimizing negative dosimetric effects of Bragg peak displacement caused by treatment-time patient positioning errors. Our particle therapy research toolkit, RobuR, was extended with OpenCL support and used to implement calculation on GPU of the Port Homogeneity Index, a metric scoring irradiation port robustness through analysis of tissue density patterns prior to dose optimization and computation. Results were benchmarked against an independent native CPU implementation. Numerical results were in agreement between the GPU implementation and native CPU implementation. For 10 skull base cases, the GPU-accelerated implementation was employed to select beam setups for proton and carbon ion treatment plans, which proved to be dosimetrically robust, when recomputed in presence of various simulated positioning errors. From the point of view of performance, average running time on the GPU decreased by at least one order of magnitude compared to the CPU, rendering the GPU-accelerated analysis a feasible step in a clinical treatment planning interactive session. In conclusion, selection of robust particle therapy beam setups can be effectively accelerated on a GPU and become an unintrusive part of the particle therapy treatment planning workflow. Additionally, the speed gain opens new usage scenarios, like interactive analysis manipulation (e.g. constraining of some setup) and re-execution. Finally, through OpenCL portable parallelism, the new implementation is suitable also for CPU-only use, taking advantage of multiple cores, and can potentially exploit types of accelerators other than GPUs.

  12. GPU-accelerated automatic identification of robust beam setups for proton and carbon-ion radiotherapy

    Science.gov (United States)

    Ammazzalorso, F.; Bednarz, T.; Jelen, U.

    2014-03-01

    We demonstrate acceleration on graphic processing units (GPU) of automatic identification of robust particle therapy beam setups, minimizing negative dosimetric effects of Bragg peak displacement caused by treatment-time patient positioning errors. Our particle therapy research toolkit, RobuR, was extended with OpenCL support and used to implement calculation on GPU of the Port Homogeneity Index, a metric scoring irradiation port robustness through analysis of tissue density patterns prior to dose optimization and computation. Results were benchmarked against an independent native CPU implementation. Numerical results were in agreement between the GPU implementation and native CPU implementation. For 10 skull base cases, the GPU-accelerated implementation was employed to select beam setups for proton and carbon ion treatment plans, which proved to be dosimetrically robust, when recomputed in presence of various simulated positioning errors. From the point of view of performance, average running time on the GPU decreased by at least one order of magnitude compared to the CPU, rendering the GPU-accelerated analysis a feasible step in a clinical treatment planning interactive session. In conclusion, selection of robust particle therapy beam setups can be effectively accelerated on a GPU and become an unintrusive part of the particle therapy treatment planning workflow. Additionally, the speed gain opens new usage scenarios, like interactive analysis manipulation (e.g. constraining of some setup) and re-execution. Finally, through OpenCL portable parallelism, the new implementation is suitable also for CPU-only use, taking advantage of multiple cores, and can potentially exploit types of accelerators other than GPUs.

  13. FragIdent – Automatic identification and characterisation of cDNA-fragments

    Directory of Open Access Journals (Sweden)

    Goehler Heike

    2009-03-01

    Full Text Available Abstract Background Many genetic studies and functional assays are based on cDNA fragments. After the generation of cDNA fragments from an mRNA sample, their content is at first unknown and must be assigned by sequencing reactions or hybridisation experiments. Even in characterised libraries, a considerable number of clones are wrongly annotated. Furthermore, mix-ups can happen in the laboratory. It is therefore essential to the relevance of experimental results to confirm or determine the identity of the employed cDNA fragments. However, the manual approach for the characterisation of these fragments using BLAST web interfaces is not suited for larger numbers of sequences and so far, no user-friendly software is publicly available. Results Here we present the development of FragIdent, an application for the automatic identification of open reading frames (ORFs) within cDNA-fragments. The software performs BLAST analyses to identify the genes represented by the sequences and suggests primers to complete the sequencing of the whole insert. Gene-specific information as well as the protein domains encoded by the cDNA fragment are retrieved from Internet-based databases and included in the output. The application features an intuitive graphical interface and is designed for researchers without any bioinformatics skills. It is suited for projects comprising up to several hundred different clones. Conclusion We used FragIdent to identify 84 cDNA clones from a yeast two-hybrid experiment. Furthermore, we identified 131 protein domains within our analysed clones. The source code is freely available from our homepage at http://compbio.charite.de/genetik/FragIdent/.

  14. FragIdent--automatic identification and characterisation of cDNA-fragments.

    Science.gov (United States)

    Seelow, Dominik; Goehler, Heike; Hoffmann, Katrin

    2009-03-02

    Many genetic studies and functional assays are based on cDNA fragments. After the generation of cDNA fragments from an mRNA sample, their content is at first unknown and must be assigned by sequencing reactions or hybridisation experiments. Even in characterised libraries, a considerable number of clones are wrongly annotated. Furthermore, mix-ups can happen in the laboratory. It is therefore essential to the relevance of experimental results to confirm or determine the identity of the employed cDNA fragments. However, the manual approach for the characterisation of these fragments using BLAST web interfaces is not suited for larger numbers of sequences and so far, no user-friendly software is publicly available. Here we present the development of FragIdent, an application for the automatic identification of open reading frames (ORFs) within cDNA-fragments. The software performs BLAST analyses to identify the genes represented by the sequences and suggests primers to complete the sequencing of the whole insert. Gene-specific information as well as the protein domains encoded by the cDNA fragment are retrieved from Internet-based databases and included in the output. The application features an intuitive graphical interface and is designed for researchers without any bioinformatics skills. It is suited for projects comprising up to several hundred different clones. We used FragIdent to identify 84 cDNA clones from a yeast two-hybrid experiment. Furthermore, we identified 131 protein domains within our analysed clones. The source code is freely available from our homepage at http://compbio.charite.de/genetik/FragIdent/.

  15. AROMA-AIRWICK: a CHLOE/CDC-3600 system for the automatic identification of spark images and their association into tracks

    International Nuclear Information System (INIS)

    Clark, R.K.

    The AROMA-AIRWICK system, which runs on CHLOE, an automatic film-scanning machine built at Argonne by Donald Hodges, and on the CDC-3600 computer, automatically identifies spark images and associates them into tracks. AROMA-AIRWICK has been an outgrowth of the generally recognized need for automatic processing of high energy physics data and of the fact that the Argonne National Laboratory has been a center of serious spark chamber development in recent years.

  16. [Effects of long-term Tai Ji Quan exercise on autonomic nervous modulation in the elderly].

    Science.gov (United States)

    Guo, Feng

    2015-03-01

    To examine the effects of long-term Tai Ji Quan (traditional Chinese exercise) on autonomic nervous modulation in the elderly. Eighteen subjects from a Tai Ji Quan exercise class at the Liaoning University of Retired Veteran Cadres were assigned to a long-term Tai Ji Quan exercise group (10 subjects) or a novice group (8 subjects). Electrocardiography, respiratory and blood pressure data were collected at the following time points: at rest before Tai Ji Quan exercise, and 30 min or 60 min after Tai Ji Quan exercise. At rest, subjects in the long-term Tai Ji Quan exercise group showed higher values than subjects in the novice group for the RR interval, standard deviation of normal-to-normal intervals (SDNN), total power (TP), low frequency power (LFP), high frequency power (HFP) and normalized high frequency power (nHFP), but lower values for LFP/HFP, systolic and diastolic blood pressure, and heart rate. At rest, the respiratory rate of subjects in the long-term Tai Ji Quan exercise group was significantly lower than that of the novices. After Tai Ji Quan exercise, TP, nHFP, LFP/HFP, heart rate and systolic pressure showed significant changes, and the magnitude of these changes was larger in the Tai Ji Quan exercise group than in the novice group. Long-term Tai Ji Quan exercise can improve vagal modulation and tends to reduce sympathetic modulation.

  17. CERPI and CEREL, two computer codes for the automatic identification and determination of gamma emitters in thermal-neutron-activated samples

    International Nuclear Information System (INIS)

    Giannini, M.; Oliva, P.R.; Ramorino, M.C.

    1979-01-01

    A computer code that automatically analyzes gamma-ray spectra obtained with Ge(Li) detectors is described. The program contains such features as automatic peak location and fitting, determination of peak energies and intensities, nuclide identification, and calculation of masses and errors. Finally, the results obtained with this computer code for a lunar sample are reported and briefly discussed

  18. Google Earth Visualizations of the Marine Automatic Identification System (AIS): Monitoring Ship Traffic in National Marine Sanctuaries

    Science.gov (United States)

    Schwehr, K.; Hatch, L.; Thompson, M.; Wiley, D.

    2007-12-01

    The Automatic Identification System (AIS) is a new technology that provides ship position reports with location, time, and identity information without human intervention from ships carrying the transponders to any receiver listening to the broadcasts. In collaboration with the USCG's Research and Development Center, NOAA's Stellwagen Bank National Marine Sanctuary (SBNMS) has installed 3 AIS receivers around Massachusetts Bay to monitor ship traffic transiting the sanctuary and surrounding waters. The SBNMS and the USCG also worked together to propose shifting the shipping lanes (termed the traffic separation scheme; TSS) that transit the sanctuary slightly to the north to reduce the probability of ship strikes of whales that frequent the sanctuary. Following approval by the United Nations' International Maritime Organization, AIS provided a means for NOAA to assess changes in the distribution of shipping traffic caused by the formal change in the TSS effective July 1, 2007. However, there was no easy way to visualize this type of time series data. We have created a software package called noaadata-py to process the AIS ship reports and produce KML files for viewing in Google Earth. Ship tracks can be shown changing over time to allow the viewer to feel the motion of traffic through the sanctuary. The ship tracks can also be gridded to create ship traffic density reports for specified periods of time. The density is displayed as a map draped on the sea surface or as vertical histogram columns. Additional visualizations such as bathymetry images, S57 nautical charts, and USCG Marine Information for Safety and Law Enforcement (MISLE) can be combined with the ship traffic visualizations to give a more complete picture of the maritime environment. AIS traffic analyses have the potential to give managers throughout NOAA's National Marine Sanctuaries an improved ability to assess the impacts of ship traffic on the marine resources they seek to protect. Viewing ship traffic
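
    As a rough sketch of the kind of conversion noaadata-py performs, the snippet below turns a few decoded AIS position reports into a minimal KML track using only the Python standard library; the report field names and example coordinates are assumptions, not the package's actual data structures.

        import xml.etree.ElementTree as ET

        def ais_track_to_kml(mmsi, reports):
            """Build a KML LineString from decoded AIS position reports (lon/lat in degrees)."""
            kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
            doc = ET.SubElement(kml, "Document")
            pm = ET.SubElement(doc, "Placemark")
            ET.SubElement(pm, "name").text = "MMSI %s" % mmsi
            line = ET.SubElement(pm, "LineString")
            ET.SubElement(line, "coordinates").text = " ".join(
                "%.6f,%.6f,0" % (r["lon"], r["lat"]) for r in reports)
            return ET.tostring(kml, encoding="unicode")

        reports = [{"lon": -70.25, "lat": 42.40}, {"lon": -70.20, "lat": 42.42}]
        print(ais_track_to_kml("366999999", reports))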

  19. Automatic temporal segment detection via bilateral long short-term memory recurrent neural networks

    Science.gov (United States)

    Sun, Bo; Cao, Siming; He, Jun; Yu, Lejun; Li, Liandong

    2017-03-01

    Constrained by physiology, the temporal factors associated with human behavior, whether facial movement or body gesture, are described by four phases: neutral, onset, apex, and offset. Although these phases may benefit related recognition tasks, such temporal segments are not easy to detect accurately. An automatic temporal segment detection framework is presented that uses bilateral long short-term memory recurrent neural networks (BLSTM-RNN) to learn high-level temporal-spatial features, synthesizing local and global temporal-spatial information more efficiently. The framework is evaluated in detail on the face and body database (FABO). The comparison shows that the proposed framework outperforms state-of-the-art methods for temporal segment detection.
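
    A minimal bidirectional LSTM sequence labeller in the spirit of the described framework can be sketched in PyTorch as follows; the four output classes follow the neutral/onset/apex/offset phases above, while the feature dimension, hidden size and input shapes are assumptions.

        import torch
        import torch.nn as nn

        class PhaseTagger(nn.Module):
            """Per-frame classification of temporal phases with a bidirectional LSTM."""
            def __init__(self, feat_dim=136, hidden=64, n_phases=4):
                super().__init__()
                self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
                self.head = nn.Linear(2 * hidden, n_phases)   # 2x for the two directions

            def forward(self, x):                 # x: (batch, frames, feat_dim)
                out, _ = self.lstm(x)
                return self.head(out)             # (batch, frames, n_phases) logits

        model = PhaseTagger()
        frames = torch.randn(2, 120, 136)         # two clips, 120 frames of assumed features
        print(model(frames).shape)                # torch.Size([2, 120, 4])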

  20. Long term Suboxone™ emotional reactivity as measured by automatic detection in speech.

    Directory of Open Access Journals (Sweden)

    Edward Hill

    Full Text Available Addictions to illicit drugs are among the nation's most critical public health and societal problems. The current opioid prescription epidemic and the need for buprenorphine/naloxone (Suboxone®; SUBX) as an opioid maintenance substance, and its growing street diversion provided impetus to determine affective states ("true ground emotionality") in long-term SUBX patients. Toward the goal of effective monitoring, we utilized emotion-detection in speech as a measure of "true" emotionality in 36 SUBX patients compared to 44 individuals from the general population (GP) and 33 members of Alcoholics Anonymous (AA). Other less objective studies have investigated emotional reactivity of heroin, methadone and opioid abstinent patients. These studies indicate that current opioid users have abnormal emotional experience, characterized by heightened response to unpleasant stimuli and blunted response to pleasant stimuli. However, this is the first study to our knowledge to evaluate "true ground" emotionality in long-term buprenorphine/naloxone combination (Suboxone™). We found in long-term SUBX patients a significantly flat affect (p<0.01), and they had less self-awareness of being happy, sad, and anxious compared to both the GP and AA groups. We caution definitive interpretation of these seemingly important results until we compare the emotional reactivity of an opioid abstinent control using automatic detection in speech. These findings encourage continued research strategies in SUBX patients to target the specific brain regions responsible for relapse prevention of opioid addiction.

  1. Label-free sensor for automatic identification of erythrocytes using digital in-line holographic microscopy and machine learning.

    Science.gov (United States)

    Go, Taesik; Byeon, Hyeokjun; Lee, Sang Joon

    2018-04-30

    The cell types of erythrocytes should be identified because they are closely related to their functionality and viability. Conventional methods for classifying erythrocytes are time consuming and labor intensive. Therefore, an automatic and accurate erythrocyte classification system is indispensable in the healthcare and biomedical fields. In this study, we propose a new label-free sensor for automatic identification of erythrocyte cell types using digital in-line holographic microscopy (DIHM) combined with machine learning algorithms. A total of 12 features, including information on intensity distributions, morphological descriptors, and optical focusing characteristics, are quantitatively obtained from numerically reconstructed holographic images. All individual features for discocytes, echinocytes, and spherocytes are statistically different. To improve the performance of cell type identification, we adopted several machine learning algorithms, such as the decision tree model, support vector machine, linear discriminant classification, and k-nearest neighbor classification. With the aid of these machine learning algorithms, the extracted features are effectively utilized to distinguish erythrocytes. Among the four tested algorithms, the decision tree model exhibits the best identification performance for the training set (n = 440, 98.18%) and the test set (n = 190, 97.37%). This proposed methodology, which smartly combines DIHM and machine learning, would be helpful for sensing abnormal erythrocytes and for computer-aided diagnosis of hematological diseases in the clinic. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Automatic Identification of the Repolarization Endpoint by Computing the Dominant T-wave on a Reduced Number of Leads.

    Science.gov (United States)

    Giuliani, C; Agostinelli, A; Di Nardo, F; Fioretti, S; Burattini, L

    2016-01-01

    Electrocardiographic (ECG) T-wave endpoint (Tend) identification suffers from a lack of reliability due to the presence of noise and variability among leads. Tend identification can be improved by using global repolarization waveforms obtained by combining several leads. The dominant T-wave (DTW) is a global repolarization waveform that has proved to improve Tend identification when computed using the 15 leads (I to III, aVr, aVl, aVf, V1 to V6, X, Y, Z) usually available in clinics, of which only 8 (I, II, V1 to V6) are independent. The aim of the present study was to evaluate whether the 8 independent leads are sufficient to obtain a DTW which allows a reliable Tend identification. To this aim, Tend measures automatically identified from 15-dependent-lead DTWs of 46 control healthy subjects (CHS) and 103 acute myocardial infarction patients (AMIP) were compared with those obtained from 8-independent-lead DTWs. Results indicate that the Tend distributions do not have statistically different median values (CHS: 340 ms vs. 340 ms, respectively; AMIP: 325 ms vs. 320 ms, respectively) and are strongly correlated (CHS: ρ=0.97; AMIP: ρ=0.88). Thus, for automatic Tend identification from the DTW, the 8 independent leads can be used without a statistically significant loss of accuracy but with a significant decrement of computational effort. The dependence of 7 out of the 15 leads does not introduce a significant bias in the Tend determination from 15-dependent-lead DTWs.
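    One common way to obtain a dominant repolarization waveform from a multi-lead T-wave segment is to take the first singular component of the samples-by-leads matrix; the sketch below illustrates that general idea and is not the exact DTW formulation used in the study.

```python
# Sketch: dominant repolarization waveform as the first singular component of
# a time-aligned, multi-lead T-wave matrix (illustrative, not the study's DTW).
import numpy as np

def dominant_t_wave(t_waves):
    """t_waves: (n_samples, n_leads) array of time-aligned T-wave segments."""
    centered = t_waves - t_waves.mean(axis=0, keepdims=True)
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    return u[:, 0] * s[0]   # dominant waveform (first left singular vector, scaled)

# Tend could then be located on this single, less noisy waveform (e.g. with a
# tangent or threshold rule) instead of on each individual lead.
```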

  3. Short-Term Load Forecasting-Based Automatic Distribution Network Reconfiguration

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Huaiguang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ding, Fei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhang, Yingchen [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-23

    In a traditional dynamic network reconfiguration study, the optimal topology is determined at every scheduled time point by using the real load data measured at that time. The development of the load forecasting technique can provide an accurate prediction of the load power that will happen in a future time and provide more information about load changes. With the inclusion of load forecasting, the optimal topology can be determined based on the predicted load conditions during a longer time period instead of using a snapshot of the load at the time when the reconfiguration happens; thus, the distribution system operator can use this information to better operate the system reconfiguration and achieve optimal solutions. This paper proposes a short-term load forecasting approach to automatically reconfigure distribution systems in a dynamic and pre-event manner. Specifically, a short-term and high-resolution distribution system load forecasting approach is proposed with a forecaster based on support vector regression and parallel parameters optimization. The network reconfiguration problem is solved by using the forecasted load continuously to determine the optimal network topology with the minimum amount of loss at the future time. The simulation results validate and evaluate the proposed approach.
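    The forecasting stage can be pictured with a brief, hedged sketch: support vector regression trained on lagged load values with a parallel grid search over its hyper-parameters. The lag structure, parameter grid, and dummy series below are illustrative assumptions, not the configuration used in the paper.

```python
# Sketch: SVR-based short-term load forecaster with parallel parameter search.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

def make_lagged(load, n_lags=24):
    """Build (X, y) pairs where each target is predicted from its n_lags past values."""
    X = np.array([load[i - n_lags:i] for i in range(n_lags, len(load))])
    y = load[n_lags:]
    return X, y

load = np.sin(np.linspace(0, 40, 2000)) + 0.05 * np.random.randn(2000)  # dummy series
X, y = make_lagged(load)

search = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid={"C": [1, 10, 100], "gamma": [0.01, 0.1], "epsilon": [0.01, 0.1]},
    cv=3, n_jobs=-1)                 # parallel parameter optimization
search.fit(X, y)
next_step = search.predict(X[-1:])   # forecast used by the reconfiguration solver
```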

  4. Short-Term Load Forecasting Based Automatic Distribution Network Reconfiguration: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Huaiguang [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ding, Fei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zhang, Yingchen [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-07-26

    In the traditional dynamic network reconfiguration study, the optimal topology is determined at every scheduled time point by using the real load data measured at that time. The development of load forecasting techniques can provide an accurate prediction of the load power that will occur at a future time and more information about load changes. With the inclusion of load forecasting, the optimal topology can be determined based on the predicted load conditions during a longer time period instead of using a snapshot of the load at the time when the reconfiguration happens, thus providing information that allows the distribution system operator (DSO) to better operate the system reconfiguration and achieve optimal solutions. Therefore, this paper proposes a short-term load forecasting based approach for automatically reconfiguring distribution systems in a dynamic and pre-event manner. Specifically, a short-term and high-resolution distribution system load forecasting approach is proposed with a support vector regression (SVR) based forecaster and parallel parameter optimization. The network reconfiguration problem is then solved by using the forecasted load continuously to determine the optimal network topology with the minimum loss at the future time. The simulation results validate and evaluate the proposed approach.

  5. Automatic writer identification using connected-component contours and edge-based features of uppercase Western script.

    Science.gov (United States)

    Schomaker, Lambert; Bulacu, Marius

    2004-06-01

    In this paper, a new technique for offline writer identification is presented, using connected-component contours (COCOCOs or CO3s) in uppercase handwritten samples. In our model, the writer is considered to be characterized by a stochastic pattern generator, producing a family of connected components for the uppercase character set. Using a codebook of CO3s from an independent training set of 100 writers, the probability-density function (PDF) of CO3s was computed for an independent test set containing 150 unseen writers. Results revealed a high sensitivity of the CO3 PDF for identifying individual writers on the basis of a single sentence of uppercase characters. The proposed automatic approach bridges the gap between image-statistics approaches on one end and manually measured allograph features of individual characters on the other end. Combining the CO3 PDF with an independent edge-based orientation and curvature PDF yielded very high correct identification rates.

  6. Contribution to automatic speech recognition. Analysis of the direct acoustical signal. Recognition of isolated words and phoneme identification

    International Nuclear Information System (INIS)

    Dupeyrat, Benoit

    1981-01-01

    This report deals with the acoustical-phonetic step of automatic speech recognition. The parameters used are the extrema of the acoustical signal (coded in amplitude and duration). This coding method, the properties of which are described, is simple and well adapted to digital processing. The quality and the intelligibility of the coded signal after reconstruction are particularly satisfactory. An experiment on the automatic recognition of isolated words has been carried out using this coding system. We have designed a filtering algorithm operating on the parameters of the coding. Thus the characteristics of the formants can be derived under certain conditions, which are discussed. Using these characteristics, the identification of a large part of the phonemes for a given speaker was achieved. Carrying on the studies has required the development of a particular methodology of real-time processing which allowed immediate evaluation of the improvement of the programs. Such processing on temporal coding of the acoustical signal is extremely powerful and could represent, used in connection with other methods, an efficient tool for the automatic processing of speech. (author) [fr

  7. Automatic identification of watercourses in flat and engineered landscapes by computing the skeleton of a LiDAR point cloud

    Science.gov (United States)

    Broersen, Tom; Peters, Ravi; Ledoux, Hugo

    2017-09-01

    Drainage networks play a crucial role in protecting land against floods. It is therefore important to have an accurate map of the watercourses that form the drainage network. Previous work on the automatic identification of watercourses was typically based on grids, focused on natural landscapes, and used mostly the slope and curvature of the terrain. In this paper we focus on areas characterised by low-lying, flat, and engineered landscapes, such as those typical of the Netherlands. We propose a new methodology to identify watercourses automatically from elevation data; it uses solely a raw classified LiDAR point cloud as input. We show that by computing a skeleton of the point cloud twice (once in 2D and once in 3D) and by using the properties of the skeletons, we can identify most of the watercourses. We have implemented our methodology and tested it for three different soil types around Utrecht, the Netherlands. We were able to detect 98% of the watercourses for one soil type, and around 75% in the worst case, when compared to a reference dataset that was obtained semi-automatically.
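    The 2D part of the idea can be sketched as follows: rasterise the ground-classified LiDAR points and skeletonise the data gaps between them (open water absorbs the laser pulses), yielding candidate watercourse centrelines. The class handling, cell size, and the omission of the 3D skeleton are simplifications of the published method.

```python
# Sketch: candidate watercourse centrelines from LiDAR data gaps (simplified).
import numpy as np
from skimage.morphology import skeletonize

def watercourse_skeleton(xy_points, cell=0.5):
    """xy_points: (n, 2) ground-point coordinates; returns a boolean skeleton image."""
    x, y = xy_points[:, 0], xy_points[:, 1]
    nx = int((x.max() - x.min()) / cell) + 1
    ny = int((y.max() - y.min()) / cell) + 1
    occupied = np.zeros((ny, nx), dtype=bool)
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    occupied[iy, ix] = True
    gaps = ~occupied                  # cells with no ground returns (water absorbs LiDAR)
    return skeletonize(gaps)          # medial axis of the data gaps
```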

  8. Automatic mental heath assistant : monitoring and measuring nonverbal behavior of the crew during long-term missions

    NARCIS (Netherlands)

    Voynarovskaya, N.; Gorbunov, R.D.; Barakova, E.I.; Rauterberg, G.W.M.; Barakova, E.I.; Ruyter, B.; Spink, A.

    2010-01-01

    This paper presents a method for monitoring the mental state of small isolated crews during long-term missions (such as space missions, polar expeditions, submarine crews, and meteorological stations). The research is done as a part of the Automatic Mental Health Assistant (AMHA) project, which aims

  9. Maritime over the Horizon Sensor Integration: High Frequency Surface-Wave-Radar and Automatic Identification System Data Integration Algorithm.

    Science.gov (United States)

    Nikolic, Dejan; Stojkovic, Nikola; Lekic, Nikola

    2018-04-09

    Obtaining the complete operational picture of the maritime situation in the Exclusive Economic Zone (EEZ) which lies over the horizon (OTH) requires the integration of data obtained from various sensors. These sensors include: high frequency surface-wave-radar (HFSWR), satellite automatic identification system (SAIS) and land automatic identification system (LAIS). The algorithm proposed in this paper utilizes radar tracks obtained from the network of HFSWRs, which are already processed by a multi-target tracking algorithm, and associates SAIS and LAIS data with the corresponding radar tracks, thus forming an integrated data pair. During the integration process, all HFSWR targets in the vicinity of AIS data are evaluated and the one which has the highest matching factor is used for data association. On the other hand, if there are multiple AIS data in the vicinity of a single HFSWR track, the algorithm still makes only one data pair, which consists of the AIS and HFSWR data with the highest mutual matching factor. During the design and testing, special attention is given to the latency of AIS data, which can be very high in the EEZs of developing countries. The algorithm is designed, implemented and tested in a real working environment. The testing environment is located in the Gulf of Guinea and includes a network of two HFSWRs, several coastal sites with LAIS receivers, and SAIS data supplied by a satellite AIS data provider.
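    The association step lends itself to a small illustration: the sketch below scores each radar track against an AIS report with a simple position/time matching factor and keeps only the best-scoring pair. The weighting scheme and thresholds are assumptions, not the exact matching factor defined in the paper.

```python
# Sketch: associate an AIS report with the best-matching HFSWR track.
import math

def matching_factor(track, ais, max_dist_m=5000.0, max_dt_s=600.0):
    """Return a score in [0, 1]; higher means a better radar/AIS match."""
    dist = math.hypot(track["x"] - ais["x"], track["y"] - ais["y"])
    dt = abs(track["t"] - ais["t"])              # AIS latency can be large
    if dist > max_dist_m or dt > max_dt_s:
        return 0.0
    return (1.0 - dist / max_dist_m) * (1.0 - dt / max_dt_s)

def associate(tracks, ais_report):
    """Pick the single radar track with the highest matching factor, if any."""
    scored = [(matching_factor(t, ais_report), t) for t in tracks]
    best_score, best_track = max(scored, key=lambda s: s[0])
    return (best_track, best_score) if best_score > 0.0 else (None, 0.0)
```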

  10. Flight Extraction and Phase Identification for Large Automatic Dependent Surveillance–Broadcast Datasets

    NARCIS (Netherlands)

    Sun, J.; Ellerbroek, J.; Hoekstra, J.M.

    2017-01-01

    Automatic dependent surveillance–broadcast (ADS-B) [1,2] is widely implemented in modern commercial aircraft and will become mandatory equipment in 2020. Flight state information such as position, velocity, and vertical rate are broadcast by tens of thousands of aircraft around the world constantly

  11. Automatic semantic encoding in verbal short-term memory: evidence from the concreteness effect.

    Science.gov (United States)

    Campoy, Guillermo; Castellà, Judit; Provencio, Violeta; Hitch, Graham J; Baddeley, Alan D

    2015-01-01

    The concreteness effect in verbal short-term memory (STM) tasks is assumed to be a consequence of semantic encoding in STM, with immediate recall of concrete words benefiting from richer semantic representations. We used the concreteness effect to test the hypothesis that semantic encoding in standard verbal STM tasks is a consequence of controlled, attention-demanding mechanisms of strategic semantic retrieval and encoding. Experiment 1 analysed the effect of presentation rate, with slow presentations being assumed to benefit strategic, time-dependent semantic encoding. Experiments 2 and 3 provided a more direct test of the strategic hypothesis by introducing three different concurrent attention-demanding tasks. Although Experiment 1 showed a larger concreteness effect with slow presentations, the following two experiments yielded strong evidence against the strategic hypothesis. Limiting available attention resources by concurrent tasks reduced global memory performance, but the concreteness effect was equivalent to that found in control conditions. We conclude that semantic effects in STM result from automatic semantic encoding and provide tentative explanations for the interaction between the concreteness effect and the presentation rate.

  12. Automatic identification of epileptic seizures from EEG signals using linear programming boosting.

    Science.gov (United States)

    Hassan, Ahnaf Rashik; Subasi, Abdulhamit

    2016-11-01

    Computerized epileptic seizure detection is essential for expediting epilepsy diagnosis and research and for assisting medical professionals. Moreover, the implementation of an epilepsy monitoring device that has low power and is portable requires a reliable and successful seizure detection scheme. In this work, the problem of automated epilepsy seizure detection using single-channel EEG signals has been addressed. At first, segments of EEG signals are decomposed using a newly proposed signal processing scheme, namely complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN). Six spectral moments are extracted from the CEEMDAN mode functions and train and test matrices are formed afterward. These matrices are fed into the classifier to identify epileptic seizures from EEG signal segments. In this work, we implement an ensemble learning based machine learning algorithm, namely linear programming boosting (LPBoost), to perform classification. The efficacy of spectral features in the CEEMDAN domain is validated by graphical and statistical analyses. The performance of CEEMDAN is compared to those of its predecessors to further inspect its suitability. The effectiveness and the appropriateness of LPBoost are demonstrated as opposed to the commonly used classification models. Resubstitution and 10-fold cross-validation error analyses confirm the superior algorithm performance of the proposed scheme. The algorithmic performance of our epilepsy seizure identification scheme is also evaluated against state-of-the-art works in the literature. Experimental outcomes manifest that the proposed seizure detection scheme performs better than the existing works in terms of accuracy, sensitivity, specificity, and Cohen's Kappa coefficient. It can be anticipated that owing to its use of only one channel of EEG signal, the proposed method will be suitable for device implementation, eliminate the onus of clinicians for analyzing a large bulk of data manually, and

  13. Algorithms for the automatic identification of MARFEs and UFOs in JET database of visible camera videos

    International Nuclear Information System (INIS)

    Murari, A.; Camplani, M.; Cannas, B.; Usai, P.; Mazon, D.; Delaunay, F.

    2010-01-01

    MARFE instabilities and UFOs leave clear signatures in JET fast visible camera videos. Given the potential harmful consequences of these events, particularly as triggers of disruptions, it would be important to have the means of detecting them automatically. In this paper, the results of various algorithms to identify automatically the MARFEs and UFOs in JET visible videos are reported. The objective is to retrieve the videos, which have captured these events, exploring the whole JET database of images, as a preliminary step to the development of real-time identifiers in the future. For the detection of MARFEs, a complete identifier has been finalized, using morphological operators and Hu moments. The final algorithm manages to identify the videos with MARFEs with a success rate exceeding 80%. Due to the lack of a complete statistics of examples, the UFO identifier is less developed, but a preliminary code can detect UFOs quite reliably. (authors)
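    As an illustration of the image features named in the record, the sketch below binarises a camera frame, applies morphological opening and closing, and computes log-scaled Hu moments of the result; the threshold and kernel size are placeholders rather than the values used on the JET videos.

```python
# Sketch: morphological clean-up followed by Hu-moment features for one frame.
import cv2
import numpy as np

def frame_hu_moments(frame_gray, thresh=200):
    _, mask = cv2.threshold(frame_gray, thresh, 255, cv2.THRESH_BINARY)
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # remove speckle
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)   # fill small holes
    hu = cv2.HuMoments(cv2.moments(mask)).flatten()
    # Log-scale the Hu moments, which span many orders of magnitude.
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
```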

  14. Identification of fracture zones and its application in automatic bone fracture reduction.

    Science.gov (United States)

    Paulano-Godino, Félix; Jiménez-Delgado, Juan J

    2017-04-01

    The preoperative planning of bone fractures using information from CT scans increases the probability of obtaining satisfactory results, since specialists are provided with additional information before surgery. The reduction of complex bone fractures requires solving a 3D puzzle in order to place each fragment into its correct position. Computer-assisted solutions may aid in this process by identifying the number of fragments and their location, by calculating the fracture zones or even by computing the correct position of each fragment. The main goal of this paper is the development of an automatic method to calculate contact zones between fragments and thus to ease the computation of bone fracture reduction. In this paper, an automatic method to calculate the contact zone between two bone fragments is presented. In a previous step, bone fragments are segmented and labelled from CT images and a point cloud is generated for each bone fragment. The calculated contact zones enable the automatic reduction of complex fractures. To that end, an automatic method to match bone fragments in complex fractures is also presented. The proposed method has been successfully applied in the calculation of the contact zone of 4 different bones from the ankle area. The calculated fracture zones enabled the reduction of all the tested cases using the presented matching algorithm. The performed tests show that the reduction of these fractures using the proposed methods led to a small overlap between fragments. The presented method makes the application of puzzle-solving strategies easier, since it does not obtain the entire fracture zone but the contact area between each pair of fragments. Therefore, it is not necessary to find correspondences between fracture zones, and fragments may be aligned two by two. The developed algorithms have been successfully applied in different fracture cases in the ankle area. The small overlap error obtained in the performed tests

  15. Independent component analysis-based algorithm for automatic identification of Raman spectra applied to artistic pigments and pigment mixtures.

    Science.gov (United States)

    González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio

    2015-03-01

    A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
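    A hedged sketch of the multivariate core follows: the mixture spectra are compressed with PCA, unmixed with ICA, and each recovered component is matched to a reference library by correlation. The metrics and reliability factor of the published method are more elaborate than this illustration, and the data layout (several mixture spectra, a named reference library) is an assumption.

```python
# Sketch: PCA + ICA unmixing of Raman mixture spectra, matched to a library.
import numpy as np
from sklearn.decomposition import PCA, FastICA

def identify_components(mixtures, library, n_components=3):
    """mixtures: (n_spectra, n_wavenumbers); library: dict name -> reference spectrum."""
    # Treat each wavenumber as a sample so recovered sources live in spectral space.
    reduced = PCA(n_components=n_components).fit_transform(mixtures.T)
    sources = FastICA(n_components=n_components, random_state=0).fit_transform(reduced)
    hits = []
    for k in range(n_components):
        comp = sources[:, k]
        scores = {name: abs(np.corrcoef(comp, ref)[0, 1]) for name, ref in library.items()}
        best = max(scores, key=scores.get)
        hits.append((best, scores[best]))   # (pigment name, reliability-like score)
    return hits
```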

  16. Automatic NMR-based identification of chemical reaction types in mixtures of co-occurring reactions.

    Directory of Open Access Journals (Sweden)

    Diogo A R S Latino

    Full Text Available The combination of chemoinformatics approaches with NMR techniques and the increasing availability of data allow the resolution of problems far beyond the original application of NMR in structure elucidation/verification. The diversity of applications can range from process monitoring, metabolic profiling, authentication of products, to quality control. An application related to the automatic analysis of complex mixtures concerns mixtures of chemical reactions. We encoded mixtures of chemical reactions with the difference between the (1)H NMR spectra of the products and the reactants. All the signals arising from all the reactants of the co-occurring reactions were taken together (a simulated spectrum of the mixture of reactants) and the same was done for products. The difference spectrum is taken as the representation of the mixture of chemical reactions. A data set of 181 chemical reactions was used, each reaction manually assigned to one of 6 types. From this dataset, we simulated mixtures where two reactions of different types would occur simultaneously. Automatic learning methods were trained to classify the reactions occurring in a mixture from the (1)H NMR-based descriptor of the mixture. Unsupervised learning methods (self-organizing maps) produced a reasonable clustering of the mixtures by reaction type, and allowed the correct classification of 80% and 63% of the mixtures in two independent test sets of different similarity to the training set. With random forests (RF), the percentage of correct classifications was increased to 99% and 80% for the same test sets. The RF probability associated to the predictions yielded a robust indication of their reliability. This study demonstrates the possibility of applying machine learning methods to automatically identify types of co-occurring chemical reactions from NMR data. Using no explicit structural information about the reactions participants, reaction elucidation is performed without structure

  17. Automatic NMR-based identification of chemical reaction types in mixtures of co-occurring reactions.

    Science.gov (United States)

    Latino, Diogo A R S; Aires-de-Sousa, João

    2014-01-01

    The combination of chemoinformatics approaches with NMR techniques and the increasing availability of data allow the resolution of problems far beyond the original application of NMR in structure elucidation/verification. The diversity of applications can range from process monitoring, metabolic profiling, authentication of products, to quality control. An application related to the automatic analysis of complex mixtures concerns mixtures of chemical reactions. We encoded mixtures of chemical reactions with the difference between the (1)H NMR spectra of the products and the reactants. All the signals arising from all the reactants of the co-occurring reactions were taken together (a simulated spectrum of the mixture of reactants) and the same was done for products. The difference spectrum is taken as the representation of the mixture of chemical reactions. A data set of 181 chemical reactions was used, each reaction manually assigned to one of 6 types. From this dataset, we simulated mixtures where two reactions of different types would occur simultaneously. Automatic learning methods were trained to classify the reactions occurring in a mixture from the (1)H NMR-based descriptor of the mixture. Unsupervised learning methods (self-organizing maps) produced a reasonable clustering of the mixtures by reaction type, and allowed the correct classification of 80% and 63% of the mixtures in two independent test sets of different similarity to the training set. With random forests (RF), the percentage of correct classifications was increased to 99% and 80% for the same test sets. The RF probability associated to the predictions yielded a robust indication of their reliability. This study demonstrates the possibility of applying machine learning methods to automatically identify types of co-occurring chemical reactions from NMR data. Using no explicit structural information about the reactions participants, reaction elucidation is performed without structure elucidation of

  18. Testing the algorithms for automatic identification of errors on the measured quantities of the nuclear power plant. Verification tests

    International Nuclear Information System (INIS)

    Svatek, J.

    1999-12-01

    During the development and implementation of supporting software for the control room and emergency control centre at the Dukovany nuclear power plant it appeared necessary to validate the input quantities in order to assure operating reliability of the software tools. Therefore, the development of software for validation of the measured quantities of the plant data sources was initiated, and the software had to be debugged and verified. The report contains the proposal for and description of the verification tests for testing the algorithms of automatic identification of errors on the observed quantities of the NPP by means of homemade validation software. In particular, the algorithms treated serve the validation of the hot leg temperature at primary circuit loop no. 2 or 4 at the Dukovany-2 reactor unit using data from the URAN and VK3 information systems, recorded during 3 different days. (author)

  19. Automatic identification of single- and/or few-layer thin-film material

    DEFF Research Database (Denmark)

    2014-01-01

    One or more digital representations of single- (101) and/or few-layer (102) thin-film material are automatically identified robustly and reliably in a digital image (100), the digital image (100) having a predetermined number of colour components, by - determining (304) a background colour component of the digital image (100) for each colour component, and - determining or estimating (306) a colour component of thin-film material to be identified in the digital image (100) for each colour component by obtaining a pre-determined contrast value (C_R; C_G; C_B) for each colour component...

  20. Automatic machine-learning based identification of jogging periods from accelerometer measurements of adolescents under field conditions.

    Science.gov (United States)

    Zdravevski, Eftim; Risteska Stojkoska, Biljana; Standl, Marie; Schulz, Holger

    2017-01-01

    Assessment of health benefits associated with physical activity depends on the activity duration, intensity and frequency; therefore, their correct identification is very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether the identification performance is improved when using two accelerometers at the hip and ankle, compared to when using only one at either position. The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions, as part of the GINIplus study. The data was obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was applied to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: Logistic regression, Support Vector Machines, Random Forest and Extremely Randomized Trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. The reported jogging periods were verified by visual inspection and used as the gold standard. After the feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60s or 180s. The best matching ratio, i.e. the length of correctly identified jogging periods related to the total time including the missed ones, was up to 0.875. It could be additionally improved up to 0.967 by application of post-classification rules, which considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers; rather, almost the same performance could be achieved from
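    The processing chain can be sketched as follows: the acceleration signal is cut into fixed-length windows, simple per-window features are computed, and each window is classified as jogging or not with a random forest. The window length and the hand-picked feature set are illustrative; the study itself relied on automated feature engineering.

```python
# Sketch: windowed accelerometer features + random forest jogging classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(acc_magnitude, fs=50, win_s=60):
    """Split a 1-D acceleration-magnitude signal into windows and featurise them."""
    win = fs * win_s
    n = len(acc_magnitude) // win
    feats = []
    for i in range(n):
        seg = acc_magnitude[i * win:(i + 1) * win]
        feats.append([seg.mean(), seg.std(), seg.min(), seg.max(),
                      np.percentile(seg, 90), np.abs(np.diff(seg)).mean()])
    return np.array(feats)

# X (window features) and y (diary-derived jogging labels) would come from the
# hip and/or ankle sensors; training then follows the usual pattern:
# clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
# y_pred = clf.predict(X_test)
```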

  1. Automatic machine-learning based identification of jogging periods from accelerometer measurements of adolescents under field conditions.

    Directory of Open Access Journals (Sweden)

    Eftim Zdravevski

    Full Text Available Assessment of health benefits associated with physical activity depends on the activity duration, intensity and frequency; therefore, their correct identification is very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether the identification performance is improved when using two accelerometers at the hip and ankle, compared to when using only one at either position. The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions, as part of the GINIplus study. The data was obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was applied to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: Logistic regression, Support Vector Machines, Random Forest and Extremely Randomized Trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. The reported jogging periods were verified by visual inspection and used as the gold standard. After the feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60s or 180s. The best matching ratio, i.e. the length of correctly identified jogging periods related to the total time including the missed ones, was up to 0.875. It could be additionally improved up to 0.967 by application of post-classification rules, which considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers; rather, almost the same performance could be

  2. AUTOMATIC RECOGNITION OF CORONAL TYPE II RADIO BURSTS: THE AUTOMATED RADIO BURST IDENTIFICATION SYSTEM METHOD AND FIRST OBSERVATIONS

    International Nuclear Information System (INIS)

    Lobzin, Vasili V.; Cairns, Iver H.; Robinson, Peter A.; Steward, Graham; Patterson, Garth

    2010-01-01

    Major space weather events such as solar flares and coronal mass ejections are usually accompanied by solar radio bursts, which can potentially be used for real-time space weather forecasts. Type II radio bursts are produced near the local plasma frequency and its harmonic by fast electrons accelerated by a shock wave moving through the corona and solar wind with a typical speed of ∼1000 km s⁻¹. The coronal bursts have dynamic spectra with frequency gradually falling with time and durations of several minutes. This Letter presents a new method developed to detect type II coronal radio bursts automatically and describes its implementation in an extended Automated Radio Burst Identification System (ARBIS 2). Preliminary tests of the method with spectra obtained in 2002 show that the performance of the current implementation is quite high, ∼80%, while the probability of false positives is reasonably low, with one false positive per 100-200 hr for high solar activity and less than one false event per 10000 hr for low solar activity periods. The first automatically detected coronal type II radio burst is also presented.

  3. Semi-automatic identification of punching areas for tissue microarray building: the tubular breast cancer pilot study

    Directory of Open Access Journals (Sweden)

    Beltrame Francesco

    2010-11-01

    Full Text Available Abstract Background Tissue MicroArray technology aims to perform immunohistochemical staining on hundreds of different tissue samples simultaneously. It allows faster analysis, considerably reducing costs incurred in staining. A time-consuming phase of the methodology is the selection of tissue areas within paraffin blocks: no utilities have been developed for the identification of areas to be punched from the donor block and assembled in the recipient block. Results The presented work supports, in the specific case of a primary subtype of breast cancer (tubular breast cancer), the semi-automatic discrimination and localization between normal and pathological regions within the tissues. The diagnosis is performed by analysing specific morphological features of the sample such as the absence of a double layer of cells around the lumen and the decay of a regular glands-and-lobules structure. These features are analysed using an algorithm which performs the extraction of morphological parameters from images and compares them to experimentally validated threshold values. Results are satisfactory since in most of the cases the automatic diagnosis matches the response of the pathologists. In particular, on a total of 1296 sub-images showing normal and pathological areas of breast specimens, algorithm accuracy, sensitivity and specificity are respectively 89%, 84% and 94%. Conclusions The proposed work is a first attempt to demonstrate that automation in the Tissue MicroArray field is feasible and it can represent an important tool for scientists to cope with this high-throughput technique.

  4. A computer program for automatic gamma-ray spectra analysis with isotope identification for the purpose of activation analysis

    International Nuclear Information System (INIS)

    Weigel, H.; Dauk, J.

    1974-01-01

    A FORTRAN IV program for a PDP-9 computer with 16K storage capacity is developed, performing automatic analysis of complex gamma spectra taken with Ge(Li) detectors. It searches for full-energy peaks and evaluates the peak areas. The program also features an automatically performed isotope identification. It is written in such a flexible manner that, after reactor irradiation, spectra from samples of any composition can be evaluated for activation analysis. The peak search routine is based on the following criteria: the counting rate has to increase for two successive channels; and the amplitude of the corresponding maximum has to be greater than or equal to F1 times the statistical error of the counting rate in the valley just before the maximum. In order to detect superimposed peaks, it is assumed that the dependence of the FWHM on channel number is roughly approximated by a linear function, and the actual and "theoretical" FWHM values are compared. To determine the net peak area, a Gaussian-based function is fitted to each peak. The isotope identification is based on the procedure developed by ADAMS and DAMS. (T.G.)
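    The stated peak-search criterion translates into a short sketch: a channel is flagged as a candidate peak when the count rate rises over two successive channels and the local maximum exceeds the preceding valley by at least F1 times the valley's statistical (Poisson) error. The valley-tracking details below are assumptions for illustration.

```python
# Sketch: candidate peak search in a gamma spectrum using the stated criteria.
import numpy as np

def find_candidate_peaks(counts, f1=3.0):
    counts = np.asarray(counts, dtype=float)
    peaks = []
    valley = counts[0]                 # lowest count seen since the last peak
    for i in range(2, len(counts) - 1):
        valley = min(valley, counts[i - 2])
        rising = counts[i - 1] > counts[i - 2] and counts[i] > counts[i - 1]
        is_max = counts[i] >= counts[i + 1]
        if rising and is_max and (counts[i] - valley) >= f1 * np.sqrt(max(valley, 1.0)):
            peaks.append(i)
            valley = counts[i]         # restart valley tracking after a peak
    return peaks
```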

  5. Exploring the potential of machine learning for automatic slum identification from VHR imagery

    NARCIS (Netherlands)

    Duque, J.C.; Patino, J.E.; Betancourt, A.

    2017-01-01

    Slum identification in urban settlements is a crucial step in the process of formulation of pro-poor policies. However, the use of conventional methods for slum detection such as field surveys can be time-consuming and costly. This paper explores the possibility of implementing a low-cost

  6. Automatic identification of bird targets with radar via patterns produced by wing flapping

    NARCIS (Netherlands)

    Zaugg, S.; Saporta, G.; van Loon, E.; Schmaljohann, H.; Liechti, F.

    2008-01-01

    Bird identification with radar is important for bird migration research, environmental impact assessments (e.g. wind farms), aircraft security and radar meteorology. In a study on bird migration, radar signals from birds, insects and ground clutter were recorded. Signals from birds show a typical

  7. Accident identification system with automatic detection of abnormal condition using quantum computation

    International Nuclear Information System (INIS)

    Nicolau, Andressa dos Santos; Schirru, Roberto; Lima, Alan Miranda Monteiro de

    2011-01-01

    Transient identification systems have been proposed in order to keep the plant operating in safe conditions and to help operators make decisions in the short time intervals of an emergency with the maximum associated certainty. This article presents a system, independent of time and of the use of an event as a starting point for t = 0 (reactor scram, for instance), for transient/accident identification in a pressurized water nuclear reactor (PWR). The model was developed to recognize the normal condition and three accidents from the design basis list of the Nuclear Power Plant Angra 2, postulated in the Final Safety Analysis Report (FSAR). Several sets of process variables were used in order to establish a minimum set of variables considered necessary and sufficient. The optimization step of the identification algorithm is based upon the paradigm of quantum computing. In this case, the optimization metaheuristic Quantum Inspired Evolutionary Algorithm (QEA) was implemented and works as a data mining tool. The results obtained with the QEA without the time variable are comparable to the techniques in the reference literature for the transient identification problem, with less computational effort (number of evaluations). The system robustly provides a solution that approximates the ideal solution (the Voronoi vectors) with only one partition for the classes of accidents. (author)

  8. A framework for automatic custom instruction identification on multi-issue ASIPs

    NARCIS (Netherlands)

    Nery, A.S.; Nedjah, N.; Franca, F.M.G.; Jozwiak, L.; Corporaal, H.

    2014-01-01

    Custom Instruction Identification is an important part in the design of efficient Application-Specific Processors (ASIPs). It consists of profiling of a given application to find patterns of basic operations that are frequently executed. Operations of such patterns can be implemented together as a

  9. Automatic identification of motion artifacts in EHG recording for robust analysis of uterine contractions.

    Science.gov (United States)

    Ye-Lin, Yiyao; Garcia-Casado, Javier; Prats-Boluda, Gema; Alberola-Rubio, José; Perales, Alfredo

    2014-01-01

    Electrohysterography (EHG) is a noninvasive technique for monitoring uterine electrical activity. However, the presence of artifacts in the EHG signal may give rise to erroneous interpretations and make it difficult to extract useful information from these recordings. The aim of this work was to develop an automatic system of segmenting EHG recordings that distinguishes between uterine contractions and artifacts. Firstly, the segmentation is performed using an algorithm that generates the TOCO-like signal derived from the EHG and detects windows with significant changes in amplitude. After that, these segments are classified in two groups: artifacted and nonartifacted signals. To develop a classifier, a total of eleven spectral, temporal, and nonlinear features were calculated from EHG signal windows from 12 women in the first stage of labor that had previously been classified by experts. The combination of characteristics that led to the highest degree of accuracy in detecting artifacts was then determined. The results showed that it is possible to obtain automatic detection of motion artifacts in segmented EHG recordings with a precision of 92.2% using only seven features. The proposed algorithm and classifier together compose a useful tool for analyzing EHG signals and would help to promote clinical applications of this technique.

  10. Automatic Identification of Motion Artifacts in EHG Recording for Robust Analysis of Uterine Contractions

    Directory of Open Access Journals (Sweden)

    Yiyao Ye-Lin

    2014-01-01

    Full Text Available Electrohysterography (EHG) is a noninvasive technique for monitoring uterine electrical activity. However, the presence of artifacts in the EHG signal may give rise to erroneous interpretations and make it difficult to extract useful information from these recordings. The aim of this work was to develop an automatic system of segmenting EHG recordings that distinguishes between uterine contractions and artifacts. Firstly, the segmentation is performed using an algorithm that generates the TOCO-like signal derived from the EHG and detects windows with significant changes in amplitude. After that, these segments are classified in two groups: artifacted and nonartifacted signals. To develop a classifier, a total of eleven spectral, temporal, and nonlinear features were calculated from EHG signal windows from 12 women in the first stage of labor that had previously been classified by experts. The combination of characteristics that led to the highest degree of accuracy in detecting artifacts was then determined. The results showed that it is possible to obtain automatic detection of motion artifacts in segmented EHG recordings with a precision of 92.2% using only seven features. The proposed algorithm and classifier together compose a useful tool for analyzing EHG signals and would help to promote clinical applications of this technique.

  11. Automatic feed phase identification in multivariate bioprocess profiles by sequential binary classification.

    Science.gov (United States)

    Nikzad-Langerodi, Ramin; Lughofer, Edwin; Saminger-Platz, Susanne; Zahel, Thomas; Sagmeister, Patrick; Herwig, Christoph

    2017-08-22

    In this paper, we propose a new strategy for retrospective identification of feed phases from online sensor-data enriched feed profiles of an Escherichia coli (E. coli) fed-batch fermentation process. In contrast to conventional (static), data-driven multi-class machine learning (ML), we exploit process knowledge in order to constrain our classification system, yielding more parsimonious models compared to static ML approaches. In particular, we enforce unidirectionality on a set of binary, multivariate classifiers trained to discriminate between adjacent feed phases by linking the classifiers through a one-way switch. The switch is activated when the actual classifier output changes. As a consequence, the next binary classifier in the classifier chain is used for the discrimination between the next feed phase pair, etc. We allow activation of the switch only after a predefined number of consecutive predictions of a transition event in order to prevent premature activation of the switch, and undertake a sensitivity analysis regarding the optimal choice of the (time) lag parameter. From a complexity/parsimony perspective the benefit of our approach is three-fold: i) The multi-class learning task is broken down into binary subproblems which usually have simpler decision surfaces and tend to be less susceptible to the class-imbalance problem. ii) We exploit the fact that the process follows a rigid feed cycle structure (i.e. batch-feed-batch-feed) which allows us to focus on the subproblems involving phase transitions as they occur during the process while discarding off-transition classifiers and iii) only one binary classifier is active at a time which keeps effective model complexity low. We further use a combination of logistic regression and Lasso (i.e. regularized logistic regression, RLR) as a wrapper to extract the most relevant features for individual subproblems from the whole set of high-dimensional sensor data. We train different soft computing classifiers
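    The chained, unidirectional scheme can be illustrated with a minimal sketch: one binary classifier per adjacent phase pair and a one-way switch that advances to the next classifier only after a fixed number of consecutive transition predictions. The classifier choice, the lag value, and the class interface are placeholders, not the authors' implementation.

```python
# Sketch: chained binary classifiers with a one-way switch and a lag parameter.
from sklearn.linear_model import LogisticRegression

class PhaseChain:
    def __init__(self, classifiers, lag=5):
        self.classifiers = classifiers   # one fitted binary model per phase transition
        self.lag = lag                   # consecutive transition votes needed to switch
        self.stage = 0                   # index of the currently active classifier
        self.streak = 0                  # current run of "transition" predictions

    def update(self, x):
        """Feed one new sensor vector (1-D numpy array); return the current phase index."""
        if self.stage >= len(self.classifiers):
            return self.stage            # final phase reached, switch locked
        pred = self.classifiers[self.stage].predict(x.reshape(1, -1))[0]
        self.streak = self.streak + 1 if pred == 1 else 0
        if self.streak >= self.lag:      # activate the one-way switch
            self.stage += 1
            self.streak = 0
        return self.stage

# Intended construction, assuming one labelled dataset per phase transition:
# clfs = [LogisticRegression(penalty="l1", solver="liblinear").fit(X_i, y_i)
#         for X_i, y_i in transition_datasets]     # regularized LR as in the wrapper
# chain = PhaseChain(clfs, lag=5)
```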

  12. An Automatic Parameter Identification Method for a PMSM Drive with LC-Filter

    DEFF Research Database (Denmark)

    Bech, Michael Møller; Christensen, Jeppe Haals; Weber, Magnus L.

    2016-01-01

    of the PMSM fed through an LC-filter. Based on the measured current response, model parameters for both the filter (L, R, C) and the PMSM (L and R) are estimated: first, the frequency response of the system is estimated using the Welch modified periodogram method, and then an optimization algorithm is used to find the parameters in an analytical reference model that minimize the model error. To demonstrate the practical feasibility of the method, a fully functional drive including an embedded real-time controller has been built. In addition to modulation, data acquisition and control, the whole parameter identification method is also implemented on the real-time controller. Based on laboratory experiments on a 22 kW drive, it is concluded that the embedded identification method can estimate the five parameters in less than ten seconds.
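    A hedged sketch of the two-step idea is given below: the frequency response is estimated from measured excitation and response signals with Welch-type spectral estimates, and an analytical model is then fitted to it by numerical optimization. The generic R-L-C model and starting values are assumptions, not the five-parameter drive model of the paper.

```python
# Sketch: empirical frequency response + least-squares fit of an analytical model.
import numpy as np
from scipy import signal, optimize

def estimate_frf(u, y, fs):
    """Empirical frequency response H(f) = Puy / Puu from measured signals."""
    f, p_uu = signal.welch(u, fs=fs, nperseg=1024)
    _, p_uy = signal.csd(u, y, fs=fs, nperseg=1024)
    return f, p_uy / p_uu

def model_frf(f, r, l, c):
    """Generic response of a series R-L branch with a shunt capacitor (placeholder model)."""
    w = 2j * np.pi * f
    return 1.0 / (r + w * l + 1.0 / (w * c + 1e-12))

def fit_parameters(f, h_meas, x0=(1.0, 1e-3, 1e-5)):
    cost = lambda p: np.sum(np.abs(model_frf(f[1:], *p) - h_meas[1:]) ** 2)
    res = optimize.minimize(cost, x0, method="Nelder-Mead")
    return res.x        # estimated (R, L, C) minimizing the model error
```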

  13. Automatic extraction and identification of users' responses in Facebook medical quizzes.

    Science.gov (United States)

    Rodríguez-González, Alejandro; Menasalvas Ruiz, Ernestina; Mayer Pujadas, Miguel A

    2016-04-01

    In the last few years the use of social media in medicine has grown exponentially, providing a new area of research based on the analysis and use of Web 2.0 capabilities. In addition, the use of social media in medical education is a subject of particular interest which has been addressed in several studies. One example of this application is the medical quizzes of The New England Journal of Medicine (NEJM), which regularly publishes a set of questions through its Facebook timeline. We present an approach for the automatic extraction of medical quizzes and their associated answers on a Facebook platform by means of a set of computer-based methods and algorithms. We have developed a tool for the extraction and analysis of medical quizzes stored on the Facebook timeline of the NEJM Facebook page, based on a set of computer-based methods and algorithms using Java. The system is divided into two main modules: Crawler and Data retrieval. The system was launched on December 31, 2014 and crawled through a total of 3004 valid posts and 200,081 valid comments. The first post was dated July 23, 2009 and the last one December 30, 2014. 285 quizzes were analyzed, with 32,780 different users providing answers to the aforementioned quizzes. Of the 285 quizzes, patterns were found in 261 (91.58%). From these 261 quizzes where trends were found, we saw that users follow trends of incorrect answers in 13 quizzes and trends of correct answers in 248. This tool is capable of automatically identifying the correct and wrong answers to a quiz provided in text format on Facebook posts, with a small rate of false negatives, and the approach could be applicable to the extraction and analysis of other information sources on the Internet after some adaptations. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Source term identification in atmospheric modelling via sparse optimization

    Science.gov (United States)

    Adam, Lukas; Branda, Martin; Hamburger, Thomas

    2015-04-01

    Inverse modelling plays an important role in identifying the amount of harmful substances released into the atmosphere during major incidents such as power plant accidents or volcano eruptions. Another possible application of inverse modelling lies in monitoring CO2 emission limits, where only observations at certain places are available and the task is to estimate the total releases at given locations. This gives rise to minimizing the discrepancy between the observations and the model predictions. There are two standard ways of solving such problems. In the first one, this discrepancy is regularized by adding additional terms. Such terms may include Tikhonov regularization, distance from a priori information or a smoothing term. The resulting, usually quadratic, problem is then solved via standard optimization solvers. The second approach assumes that the error term has a (normal) distribution and makes use of Bayesian modelling to identify the source term. Instead of following the above-mentioned approaches, we utilize techniques from the field of compressive sensing. Such techniques look for the sparsest solution (the solution with the smallest number of nonzeros) of a linear system, where a maximal allowed error term may be added to this system. Even though this field is a developed one with many possible solution techniques, most of them do not consider even the simplest constraints which are naturally present in atmospheric modelling. One such example is the nonnegativity of release amounts. We believe that the concept of a sparse solution is natural in both problems of identification of the source location and of the time process of the source release. In the first case, it is usually assumed that there are only a few release points and the task is to find them. In the second case, the time window is usually much longer than the duration of the actual release. In both cases, the optimal solution should contain a large amount of zeros, giving rise to the
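    The compressive-sensing formulation can be illustrated with an L1-penalized, nonnegative least-squares fit: given a source-receptor sensitivity matrix A and observations y, a sparse, nonnegative release vector is recovered. The matrix, its dimensions, and the regularization weight below are placeholders for illustration.

```python
# Sketch: sparse, nonnegative source-term estimate via L1-penalized regression.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
A = rng.random((40, 120))           # 40 observations, 120 candidate release time slots
x_true = np.zeros(120)
x_true[[10, 55]] = [3.0, 1.5]       # only two slots actually release material
y = A @ x_true + 0.01 * rng.standard_normal(40)

model = Lasso(alpha=0.01, positive=True, max_iter=50000)  # sparsity + nonnegativity
model.fit(A, y)
x_hat = model.coef_                  # sparse, nonnegative source-term estimate
```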

  15. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks.

    Science.gov (United States)

    Zazo, Ruben; Lozano-Diez, Alicia; Gonzalez-Dominguez, Javier; Toledano, Doroteo T; Gonzalez-Rodriguez, Joaquin

    2016-01-01

    Long Short Term Memory (LSTM) Recurrent Neural Networks (RNNs) have recently outperformed other state-of-the-art approaches, such as i-vector and Deep Neural Networks (DNNs), in automatic Language Identification (LID), particularly when dealing with very short utterances (∼3s). In this contribution we present an open-source, end-to-end, LSTM RNN system running on limited computational resources (a single GPU) that outperforms a reference i-vector system on a subset of the NIST Language Recognition Evaluation (8 target languages, 3s task) by up to 26%. This result is in line with previously published research using proprietary LSTM implementations and huge computational resources, which made these former results hardly reproducible. Further, we extend those previous experiments by modeling unseen languages (out-of-set, OOS, modeling), which is crucial in real applications. Results show that an LSTM RNN with OOS modeling is able to detect these languages and generalizes robustly to unseen OOS languages. Finally, we also analyze the effect of even more limited test data (from 2.25s to 0.1s), proving that with as little as 0.5s an accuracy of over 50% can be achieved.

  16. Automatic and rapid identification of glycopeptides by nano-UPLC-LTQ-FT-MS and proteomic search engine.

    Science.gov (United States)

    Giménez, Estela; Gay, Marina; Vilaseca, Marta

    2017-01-30

    Here we demonstrate the potential of nano-UPLC-LTQ-FT-MS and the Byonic™ proteomic search engine for the separation, detection, and identification of N- and O-glycopeptide glycoforms in standard glycoproteins. The use of a BEH C18 nanoACQUITY column allowed the separation of the glycopeptides present in the glycoprotein digest and a baseline-resolution of the glycoforms of the same glycopeptide on the basis of the number of sialic acids. Moreover, we evaluated several acquisition strategies in order to improve the detection and characterization of glycopeptide glycoforms with the maximum number of identification percentages. The proposed strategy is simple to set up with the technology platforms commonly used in proteomic labs. The method allows the straightforward and rapid obtention of a general glycosylated map of a given protein, including glycosites and their corresponding glycosylated structures. The MS strategy selected in this work, based on a gas phase fractionation approach, led to 136 unique peptides from four standard proteins, which represented 78% of the total number of peptides identified. Moreover, the method does not require an extra glycopeptide enrichment step, thus preventing the bias that this step could cause towards certain glycopeptide species. Data are available via ProteomeXchange with identifier PXD003578. We propose a simple and high-throughput glycoproteomics-based methodology that allows the separation of glycopeptide glycoforms on the basis of the number of sialic acids, and their automatic and rapid identification without prior knowledge of protein glycosites or type and structure of the glycans. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Automatic pattern identification of rock moisture based on the Staff-RF model

    Science.gov (United States)

    Zheng, Wei; Tao, Kai; Jiang, Wei

    2018-04-01

    Studies on the moisture and damage state of rocks generally focus on qualitative descriptions and mechanical information about the rocks. Such approaches are not applicable to the real-time safety monitoring of rock mass. In this study, a musical staff computing model is used to quantify the acoustic emission (AE) signals of rocks with different moisture patterns. Then, the random forest (RF) method is adopted to form the staff-RF model for the real-time pattern identification of rock moisture. The entire process requires only information computed from the AE signal and does not require knowledge of the mechanical condition of the rocks.

  18. Automatic identification of NDA measured items: Use of E-tags

    International Nuclear Information System (INIS)

    Chitumbo, K.; Olsen, R.; Hatcher, C.R.; Kadner, S.P.

    1995-01-01

    This paper describes how electronic identification devices or E-tags could reduce the time spent by IAEA inspectors making nondestructive assay (NDA) measurements. As one example, the use of E-tags with a high-level neutron coincidence counter (HLNC) is discussed in detail. Sections of the paper include inspection procedures, system description, software, and future plans. Mounting of E-tags, modifications to the HLNC, and the use of tamper-indicating devices are also discussed. The technology appears to have wide application to different types of nuclear facilities and inspections and could significantly change NDA inspection procedures

  19. Automatic cell identification and visualization using digital holographic microscopy with head mounted augmented reality devices.

    Science.gov (United States)

    O'Connor, Timothy; Rawat, Siddharth; Markman, Adam; Javidi, Bahram

    2018-03-01

    We propose a compact imaging system that integrates an augmented reality head mounted device with digital holographic microscopy for automated cell identification and visualization. A shearing interferometer is used to produce holograms of biological cells, which are recorded using customized smart glasses containing an external camera. After image acquisition, segmentation is performed to isolate regions of interest containing biological cells in the field-of-view, followed by digital reconstruction of the cells, which is used to generate a three-dimensional (3D) pseudocolor optical path length profile. Morphological features are extracted from the cell's optical path length map, including mean optical path length, coefficient of variation, optical volume, projected area, projected area to optical volume ratio, cell skewness, and cell kurtosis. Classification is performed using the random forest classifier, support vector machines, and K-nearest neighbor, and the results are compared. Finally, the augmented reality device displays the cell's pseudocolor 3D rendering of its optical path length profile, extracted features, and the identified cell's type or class. The proposed system could allow a healthcare worker to quickly visualize cells using augmented reality smart glasses and extract the relevant information for rapid diagnosis. To the best of our knowledge, this is the first report on the integration of digital holographic microscopy with augmented reality devices for automated cell identification and visualization.
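
    The classification chain described above (morphological features computed from an optical path length map, then a random forest, SVM, or kNN classifier) can be sketched as follows. This is an illustrative outline, not the authors' code; the array shapes, pixel-area constant, and toy labels are assumptions.

    ```python
    # Sketch: morphological features from a segmented optical-path-length (OPL) map,
    # then a random-forest classifier (SVM or kNN could be swapped in the same way).
    import numpy as np
    from scipy.stats import skew, kurtosis
    from sklearn.ensemble import RandomForestClassifier

    def opl_features(opl, pixel_area=1.0):
        """opl: 2-D array, optical path length inside one segmented cell (0 outside)."""
        vals = opl[opl > 0]
        volume = vals.sum() * pixel_area          # optical volume
        area = vals.size * pixel_area             # projected area
        return [vals.mean(),                      # mean optical path length
                vals.std() / vals.mean(),         # coefficient of variation
                volume,
                area,
                area / volume,                    # projected area / optical volume ratio
                skew(vals),                       # cell skewness
                kurtosis(vals)]                   # cell kurtosis

    # Toy demonstration with random "cells" and made-up labels.
    rng = np.random.default_rng(0)
    cells = [rng.random((32, 32)) * (rng.random((32, 32)) > 0.5) for _ in range(6)]
    X = np.array([opl_features(c) for c in cells])
    y = [0, 1, 0, 1, 0, 1]
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
    ```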

  20. Automatic identification of optimal marker genes for phenotypic and taxonomic groups of microorganisms.

    Directory of Open Access Journals (Sweden)

    Elad Segev

    Full Text Available Finding optimal markers for microorganisms important in the medical, agricultural, environmental or ecological fields is of great importance. Thousands of complete microbial genomes now available allow us, for the first time, to exhaustively identify marker proteins for groups of microbial organisms. In this work, we model the biological task as the well-known mathematical "hitting set" problem, solving it based on both greedy and randomized approximation algorithms. We identify unique markers for 17 phenotypic and taxonomic microbial groups, including proteins related to the nitrite reductase enzyme as markers for the non-anammox nitrifying bacteria group, and two transcription regulation proteins, nusG and yhiF, as markers for the Archaea and Escherichia/Shigella taxonomic groups, respectively. Additionally, we identify marker proteins for three subtypes of pathogenic E. coli, which previously had no known optimal markers. Practically, depending on the completeness of the database, this algorithm can be used for the identification of marker genes for any microbial group; these marker genes may be prime candidates for understanding the genetic basis of the group's phenotype or may help discover novel functions which are uniquely shared among a group of microbes. We show that our method is both theoretically and practically efficient, while establishing an upper bound on its time complexity and approximation ratio; thus, it promises to remain efficient and permit the identification of marker proteins that are specific to phenotypic or taxonomic groups, even as more and more bacterial genomes are being sequenced.
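
    The abstract casts marker selection as the classical "hitting set" problem. A minimal sketch of the textbook greedy approximation is shown below, under an assumed formulation in which every genome of the target group must contain at least one chosen marker; the data structures and toy names are hypothetical, not the authors' implementation.

    ```python
    # Greedy approximation for the hitting-set / set-cover view of marker selection:
    # repeatedly pick the candidate marker present in ("hitting") the largest number
    # of still-uncovered genomes of the target group.
    def greedy_marker_set(genomes, candidates):
        """genomes: dict genome_id -> set of proteins; candidates: candidate markers.
        Returns (chosen markers, genomes left uncovered)."""
        uncovered = set(genomes)
        chosen = set()
        while uncovered:
            best = max(candidates,
                       key=lambda m: sum(1 for g in uncovered if m in genomes[g]))
            hit = {g for g in uncovered if best in genomes[g]}
            if not hit:                 # no remaining marker covers these genomes
                break
            chosen.add(best)
            uncovered -= hit
        return chosen, uncovered

    # Hypothetical toy input.
    genomes = {"g1": {"nusG", "yhiF"}, "g2": {"nusG"}, "g3": {"yhiF", "nirK"}}
    print(greedy_marker_set(genomes, {"nusG", "yhiF", "nirK"}))
    ```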

  1. Automatic identification of mobile and rigid substructures in molecular dynamics simulations and fractional structural fluctuation analysis.

    Directory of Open Access Journals (Sweden)

    Leandro Martínez

    Full Text Available The analysis of structural mobility in molecular dynamics plays a key role in data interpretation, particularly in the simulation of biomolecules. The most common mobility measures computed from simulations are the Root Mean Square Deviation (RMSD) and Root Mean Square Fluctuations (RMSF) of the structures. These are computed after the alignment of atomic coordinates in each trajectory step to a reference structure. This rigid-body alignment is not robust, in the sense that if a small portion of the structure is highly mobile, the RMSD and RMSF increase for all atoms, possibly resulting in poor quantification of the structural fluctuations and, often, in overlooking important fluctuations associated with biological function. The motivation of this work is to provide a robust measure of structural mobility that is practical and easy to interpret. We propose a Low-Order-Value-Optimization (LOVO) strategy for the robust alignment of the least mobile substructures in a simulation. These substructures are automatically identified by the method. The algorithm consists of the iterative superposition of the fraction of the structure displaying the smallest displacements. Therefore, the least mobile substructures are identified, providing a clearer picture of the overall structural fluctuations. Examples are given to illustrate the interpretative advantages of this strategy. The software for performing the alignments was named MDLovoFit and it is available as free software at: http://leandro.iqm.unicamp.br/mdlovofit.
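
    The core of the LOVO idea, iteratively superposing only the fraction of atoms with the smallest displacements, can be sketched in a few lines. This is a simplified illustration using the standard Kabsch superposition, not the MDLovoFit code; the fraction parameter and array shapes are assumptions.

    ```python
    import numpy as np

    def kabsch(P, Q):
        """Optimal rotation R and translation t superposing P onto Q (both N x 3)."""
        Pc, Qc = P - P.mean(0), Q - Q.mean(0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = Q.mean(0) - P.mean(0) @ R.T
        return R, t

    def lovo_align(frame, ref, frac=0.7, n_iter=20):
        """Iteratively superpose only the fraction `frac` of atoms with the smallest
        displacements; returns the aligned frame and the indices of the rigid core."""
        idx = np.arange(len(frame))                     # start from all atoms
        for _ in range(n_iter):
            R, t = kabsch(frame[idx], ref[idx])
            aligned = frame @ R.T + t
            disp = np.linalg.norm(aligned - ref, axis=1)
            idx = np.argsort(disp)[:max(3, int(frac * len(frame)))]
        return aligned, idx
    ```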

  2. Quality assurance in the production of pipe fittings by automatic laser-based material identification

    Science.gov (United States)

    Moench, Ingo; Peter, Laszlo; Priem, Roland; Sturm, Volker; Noll, Reinhard

    1999-09-01

    In plants of the chemical, nuclear and off-shore industry, application-specific high-alloyed steels are used for pipe fittings. Mixing of different steel grades can lead to corrosion with severe consequential damages. Growing quality requirements and environmental responsibilities demand a 100% material control in the production of the pipe fittings. Therefore, LIFT, an automatic inspection machine, was developed to ensure against any mixing of material grades. LIFT is able to identify more than 30 different steel grades. The inspection method is based on Laser-Induced Breakdown Spectrometry (LIBS). An expert system, which can be easily trained and recalibrated, was developed for the data evaluation. The result of the material inspection is transferred to an external handling system via a PLC interface. The duration of the inspection process is 2 seconds. The graphical user interface was developed with respect to the requirements of an unskilled operator. The software is based on a realtime operating system and provides a safe and reliable operation. An interface for remote maintenance by modem enables fast operational support. Logged data are retrieved and evaluated. This is the basis for an adaptive improvement of the configuration of LIFT with respect to changing requirements in the production line. Within the first six months of routine operation, about 50000 pipe fittings were inspected.

  3. Automatic Identification of Physical Activity Intensity and Modality from the Fusion of Accelerometry and Heart Rate Data.

    Science.gov (United States)

    García-García, Fernando; Benito, Pedro J; Hernando, María E

    2016-12-07

    Physical activity (PA) is essential to prevent and to treat a variety of chronic diseases. The automated detection and quantification of PA over time empowers lifestyle interventions, facilitating reliable exercise tracking and data-driven counseling. We propose and compare various combinations of machine learning (ML) schemes for the automatic classification of PA from multi-modal data, simultaneously captured by a biaxial accelerometer and a heart rate (HR) monitor. Intensity levels (low/moderate/vigorous) were recognized and, for vigorous exercise, its modality (sustained aerobic/resistance/mixed). In total, 178.63 h of data about PA intensity (65.55% low / 18.96% moderate / 15.49% vigorous) and 17.00 h about modality were collected in two experiments: one in free-living conditions, another in a fitness center under controlled protocols. The structure used for automatic classification comprised: a) definition of 42 time-domain signal features, b) dimensionality reduction, c) data clustering, and d) temporal filtering to exploit time redundancy by means of a Hidden Markov Model (HMM). Four dimensionality reduction techniques and four clustering algorithms were studied. In order to cope with class imbalance in the dataset, a custom performance metric was defined to aggregate recognition accuracy, precision and recall. The best scheme, which comprised a projection through Linear Discriminant Analysis (LDA) and k-means clustering, was evaluated in leave-one-subject-out cross-validation, notably outperforming the standard industry procedures for PA intensity classification: a score of 84.65%, versus up to 63.60%. Errors tended to be brief and to appear around transients. The application of ML techniques for pattern identification and temporal filtering made it possible to merge accelerometry and HR data in a robust manner, and achieved markedly better recognition performance than the standard methods for PA intensity estimation.
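
    A minimal sketch of the winning chain (LDA projection followed by k-means clustering) and of an imbalance-aware aggregate metric is given below using scikit-learn. Feature extraction and the HMM temporal filter are assumed to happen elsewhere; the function names and the equal-weight aggregation are illustrative, not the authors' exact metric.

    ```python
    # Sketch of the best-performing chain: LDA projection, k-means clustering, and an
    # imbalance-aware aggregate of accuracy/precision/recall (illustrative weighting).
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.cluster import KMeans
    from sklearn.metrics import accuracy_score, precision_score, recall_score

    def fit_lda_kmeans(X_train, y_train, n_classes=3):
        y_train = np.asarray(y_train)
        lda = LinearDiscriminantAnalysis(n_components=n_classes - 1).fit(X_train, y_train)
        km = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit(lda.transform(X_train))
        # map each cluster to the majority training label among its members
        mapping = {c: np.bincount(y_train[km.labels_ == c]).argmax() for c in range(n_classes)}
        return lda, km, mapping

    def predict(lda, km, mapping, X):
        return np.array([mapping[c] for c in km.predict(lda.transform(X))])

    def balanced_score(y_true, y_pred):
        """Illustrative aggregate of accuracy, macro precision and macro recall."""
        return np.mean([accuracy_score(y_true, y_pred),
                        precision_score(y_true, y_pred, average="macro", zero_division=0),
                        recall_score(y_true, y_pred, average="macro", zero_division=0)])
    ```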

  4. Automatic detection of patient identification and positioning errors in radiation therapy treatment using 3-dimensional setup images.

    Science.gov (United States)

    Jani, Shyam S; Low, Daniel A; Lamb, James M

    2015-01-01

    To develop an automated system that detects patient identification and positioning errors between 3-dimensional computed tomography (CT) and kilovoltage CT planning images. Planning kilovoltage CT images were collected for head and neck (H&N), pelvis, and spine treatments with corresponding 3-dimensional cone beam CT and megavoltage CT setup images from TrueBeam and TomoTherapy units, respectively. Patient identification errors were simulated by registering setup and planning images from different patients. For positioning errors, setup and planning images were misaligned by 1 to 5 cm in the 6 anatomical directions for H&N and pelvis patients. Spinal misalignments were simulated by misaligning to adjacent vertebral bodies. Image pairs were assessed using commonly used image similarity metrics as well as custom-designed metrics. Linear discriminant analysis classification models were trained and tested on the imaging datasets, and misclassification error (MCE), sensitivity, and specificity parameters were estimated using 10-fold cross-validation. For patient identification, our workflow produced MCE estimates of 0.66%, 1.67%, and 0% for H&N, pelvis, and spine TomoTherapy images, respectively. Sensitivity and specificity ranged from 97.5% to 100%. MCEs of 3.5%, 2.3%, and 2.1% were obtained for TrueBeam images of the above sites, respectively, with sensitivity and specificity estimates between 95.4% and 97.7%. MCEs for 1-cm H&N/pelvis misalignments were 1.3%/5.1% and 9.1%/8.6% for TomoTherapy and TrueBeam images, respectively. Two-centimeter MCE estimates were 0.4%/1.6% and 3.1%/3.2%, respectively. MCEs for vertebral body misalignments were 4.8% and 3.6% for TomoTherapy and TrueBeam images, respectively. Patient identification and gross misalignment errors can be robustly and automatically detected using 3-dimensional setup images of different energies across 3 commonly treated anatomical sites. Copyright © 2015 American Society for Radiation Oncology. Published by
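
    One generic similarity metric that such a workflow could compute between a planning image and a setup image is histogram-based mutual information, which drops sharply when the images are misaligned or come from different patients. The sketch below is an illustration of that kind of metric, not the authors' custom-designed features.

    ```python
    import numpy as np

    def mutual_information(img_a, img_b, bins=64):
        """Histogram-based mutual information between two equally sized image arrays."""
        hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
        pxy = hist / hist.sum()                        # joint intensity distribution
        px, py = pxy.sum(axis=1), pxy.sum(axis=0)      # marginals
        nz = pxy > 0
        return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))
    ```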

  5. Large data analysis: automatic visual personal identification in a demography of 1.2 billion persons

    Science.gov (United States)

    Daugman, John

    2014-05-01

    The largest biometric deployment in history is now underway in India, where the Government is enrolling the iris patterns (among other data) of all 1.2 billion citizens. The purpose of the Unique Identification Authority of India (UIDAI) is to ensure fair access to welfare benefits and entitlements, to reduce fraud, and enhance social inclusion. Only a minority of Indian citizens have bank accounts; only 4 percent possess passports; and less than half of all aid money reaches its intended recipients. A person who lacks any means of establishing their identity is excluded from entitlements and does not officially exist; thus the slogan of UIDAI is: "To give the poor an identity." This ambitious program enrolls a million people every day, across 36,000 stations run by 83 agencies, with a 3-year completion target for the entire national population. The halfway point was recently passed with more than 600 million persons now enrolled. In order to detect and prevent duplicate identities, every iris pattern that is enrolled is first compared against all others enrolled so far; thus the daily workflow now requires 600 trillion (or 600 million-million) iris cross-comparisons. Avoiding identity collisions (False Matches) requires high biometric entropy, and achieving the tremendous match speed requires phase bit coding. Both of these requirements are being delivered operationally by wavelet methods developed by the author for encoding and comparing iris patterns, which will be the focus of this "Large Data Award" presentation.
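
    Iris codes produced by phase bit coding are typically compared with a masked fractional Hamming distance, which is what makes hundreds of trillions of daily cross-comparisons tractable. A minimal sketch of that comparison is shown below; the array names, sizes, and the decision threshold are illustrative.

    ```python
    import numpy as np

    def fractional_hamming_distance(code_a, code_b, mask_a, mask_b):
        """code_*: boolean phase-bit arrays; mask_*: True where bits are valid
        (not occluded by eyelids, reflections, etc.)."""
        valid = mask_a & mask_b
        disagree = (code_a ^ code_b) & valid
        return np.count_nonzero(disagree) / np.count_nonzero(valid)

    # Illustrative use: distances well below roughly 0.33 are usually treated as a match.
    rng = np.random.default_rng(0)
    a = rng.random(2048) < 0.5
    b = a.copy(); b[:100] ^= True            # same eye, a few noisy bits flipped
    m = np.ones(2048, dtype=bool)
    print(fractional_hamming_distance(a, b, m, m))
    ```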

  6. Automatic Identification and Quantification of Extra-Well Fluorescence in Microarray Images.

    Science.gov (United States)

    Rivera, Robert; Wang, Jie; Yu, Xiaobo; Demirkan, Gokhan; Hopper, Marika; Bian, Xiaofang; Tahsin, Tasnia; Magee, D Mitchell; Qiu, Ji; LaBaer, Joshua; Wallstrom, Garrick

    2017-11-03

    In recent studies involving NAPPA microarrays, extra-well fluorescence is used as a key measure for identifying disease biomarkers because there is evidence to support that it is better correlated with strong antibody responses than statistical analysis involving intraspot intensity. Because this feature is not well quantified by traditional image analysis software, identification and quantification of extra-well fluorescence is performed manually, which is both time-consuming and highly susceptible to variation between raters. A system that could automate this task efficiently and effectively would greatly improve the process of data acquisition in microarray studies, thereby accelerating the discovery of disease biomarkers. In this study, we experimented with different machine learning methods, as well as novel heuristics, for identifying spots exhibiting extra-well fluorescence (rings) in microarray images and assigning each ring a grade of 1-5 based on its intensity and morphology. The sensitivity of our final system for identifying rings was found to be 72% at 99% specificity and 98% at 92% specificity. Our system performs this task significantly faster than a human, while maintaining high performance, and therefore represents a valuable tool for microarray image analysis.

  7. A multiparametric automatic method to monitor long-term reproducibility in digital mammography: results from a regional screening programme.

    Science.gov (United States)

    Gennaro, G; Ballaminut, A; Contento, G

    2017-09-01

    This study aims to illustrate a multiparametric automatic method for monitoring long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. Variability of each IQI was below 5%, with the exception of one index associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for reproducibility tests (multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control) was proven to be effective and applicable on a large scale and to any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
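
    The statistical core of such monitoring (per-index coefficients of variation plus comparison against baseline values) is small enough to sketch. The code below is an illustrative outline, not the software tool used in the programme; the 5% tolerance and array shapes are assumptions.

    ```python
    # Sketch: per-index coefficient of variation over the weekly phantom images and a
    # simple flag against baseline values (5% tolerance assumed for illustration).
    import numpy as np

    def iqi_reproducibility(weekly_iqis, baseline, tolerance=0.05):
        """weekly_iqis: (n_weeks, n_indices) array; baseline: (n_indices,) array."""
        weekly = np.asarray(weekly_iqis, dtype=float)
        baseline = np.asarray(baseline, dtype=float)
        cov = weekly.std(axis=0, ddof=1) / weekly.mean(axis=0)   # overall variability per IQI
        rel_dev = np.abs(weekly - baseline) / baseline           # weekly deviation from baseline
        out_of_control = (rel_dev > tolerance).any(axis=0)       # indices needing attention
        return cov, out_of_control
    ```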

  8. An automatic sodium-loop for testing the long-term behaviour of sintered bodies flowed through by gas

    International Nuclear Information System (INIS)

    Barkleit, G.; George, G.; Haase, I.; Kiessling, W.

    1980-08-01

    An automatic sodium loop NAKOS for testing the long-term behaviour of porous stainless steel bodies which are flowed through by gas is described. The loop, equipped with a special safety protection system, is capable of operating unattended for up to 1000 h. During a 500 h experiment the safety system and the gas permeability measuring method for testing the porous bodies were tested. First results on the behaviour of sintered bodies in high-purity liquid sodium at temperatures of about 850 K, together with some details of the production of these bodies, are given. (author)

  9. Increasing Accuracy: A New Design and Algorithm for Automatically Measuring Weights, Travel Direction and Radio Frequency Identification (RFID) of Penguins.

    Science.gov (United States)

    Afanasyev, Vsevolod; Buldyrev, Sergey V; Dunn, Michael J; Robst, Jeremy; Preston, Mark; Bremner, Steve F; Briggs, Dirk R; Brown, Ruth; Adlard, Stacey; Peat, Helen J

    2015-01-01

    A fully automated weighbridge using a new algorithm and mechanics integrated with a Radio Frequency Identification System is described. It is currently in use collecting data on Macaroni penguins (Eudyptes chrysolophus) at Bird Island, South Georgia. The technology allows researchers to collect very large, highly accurate datasets of both penguin weight and direction of their travel into or out of a breeding colony, providing important contributory information to help understand penguin breeding success, reproductive output and availability of prey. Reliable discrimination between single and multiple penguin crossings is demonstrated. Passive radio frequency tags implanted into penguins allow researchers to match weight and trip direction to individual birds. Low unit and operation costs, low maintenance needs, simple operator requirements and accurate time stamping of every record are all important features of this type of weighbridge, as is its proven ability to operate 24 hours a day throughout a breeding season, regardless of temperature or weather conditions. Users are able to define required levels of accuracy by adjusting filters and raw data are automatically recorded and stored allowing for a range of processing options. This paper presents the underlying principles, design specification and system description, provides evidence of the weighbridge's accurate performance and demonstrates how its design is a significant improvement on existing systems.

  10. Hybrid EEG—Eye Tracker: Automatic Identification and Removal of Eye Movement and Blink Artifacts from Electroencephalographic Signal

    Directory of Open Access Journals (Sweden)

    Malik M. Naeem Mannan

    2016-02-01

    Full Text Available Contamination by eye movement and blink artifacts in electroencephalogram (EEG) recordings makes the analysis of EEG data more difficult and could result in misleading findings. Efficient removal of these artifacts from EEG data is an essential step in improving classification accuracy for developing a brain-computer interface (BCI). In this paper, we propose an automatic framework based on independent component analysis (ICA) and system identification to identify and remove ocular artifacts from EEG data using a hybrid EEG and eye tracker system. The performance of the proposed algorithm is illustrated using experimental and standard EEG datasets. The proposed algorithm not only removes the ocular artifacts from the artifactual zone but also preserves the neuronal activity-related EEG signals in the non-artifactual zone. A comparison with two state-of-the-art techniques, namely ADJUST-based ICA and REGICA, reveals the significantly improved performance of the proposed algorithm in removing eye movement and blink artifacts from EEG data. Additionally, the results demonstrate that the proposed algorithm achieves lower relative error and higher mutual information values between the corrected EEG and artifact-free EEG data.
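
    A stripped-down version of the ICA side of such a framework is sketched below: decompose the EEG, flag independent components that correlate strongly with a reference eye-movement trace, zero them, and reconstruct. The correlation threshold and variable names are assumptions, and the published method additionally uses system identification, which is omitted here.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    def remove_ocular_artifacts(eeg, eye_ref, corr_thresh=0.6, random_state=0):
        """eeg: (n_samples, n_channels) array; eye_ref: (n_samples,) eye-movement trace."""
        ica = FastICA(n_components=eeg.shape[1], random_state=random_state)
        sources = ica.fit_transform(eeg)                  # (n_samples, n_components)
        corr = np.array([abs(np.corrcoef(s, eye_ref)[0, 1]) for s in sources.T])
        sources[:, corr > corr_thresh] = 0.0              # zero out ocular components
        return ica.inverse_transform(sources)             # reconstructed, cleaned EEG
    ```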

  11. PhasePApy: A robust pure Python package for automatic identification of seismic phases

    Science.gov (United States)

    Chen, Chen; Holland, Austin

    2016-01-01

    We developed a Python phase identification package: the PhasePApy for earthquake data processing and near‐real‐time monitoring. The package takes advantage of the growing number of Python libraries including Obspy. All the data formats supported by Obspy can be supported within the PhasePApy. The PhasePApy has two subpackages: the PhasePicker and the Associator, aiming to identify phase arrival onsets and associate them to phase types, respectively. The PhasePicker and the Associator can work jointly or separately. Three autopickers are implemented in the PhasePicker subpackage: the frequency‐band picker, the Akaike information criteria function derivative picker, and the kurtosis picker. All three autopickers identify picks with the same processing methods but different characteristic functions. The PhasePicker triggers the pick with a dynamic threshold and can declare a pick with false‐pick filtering. Also, the PhasePicker identifies a pick polarity and uncertainty for further seismological analysis, such as focal mechanism determination. Two associators are included in the Associator subpackage: the 1D Associator and 3D Associator, which assign phase types to picks that can best fit potential earthquakes by minimizing root mean square (rms) residuals of the misfits in distance and time, respectively. The Associator processes multiple picks from all channels at a seismic station and aggregates them to increase computational efficiencies. Both associators use travel‐time look up tables to determine the best estimation of the earthquake location and evaluate the phase type for picks. The PhasePApy package has been used extensively for local and regional earthquakes and can work for active source experiments as well.
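
    The kurtosis picker mentioned above relies on the fact that a sliding-window kurtosis jumps when an impulsive phase arrives. A rough illustration of that characteristic function and a dynamic threshold is sketched below; it is not the PhasePApy implementation, and the window length and threshold factor are arbitrary choices.

    ```python
    import numpy as np
    from scipy.stats import kurtosis

    def kurtosis_cf(trace, win):
        """Characteristic function: kurtosis over a trailing window of `win` samples."""
        cf = np.zeros(len(trace))
        for i in range(win, len(trace)):
            cf[i] = kurtosis(trace[i - win:i])
        return cf

    def pick_onset(trace, win=100, n_sigma=5.0):
        """Return the first sample where the CF exceeds a noise-based dynamic threshold."""
        cf = kurtosis_cf(trace, win)
        noise = cf[win:2 * win]                       # assume the record starts with noise
        thresh = noise.mean() + n_sigma * noise.std()
        above = np.where(cf > thresh)[0]
        return int(above[0]) if above.size else None
    ```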

  12. Price strategy and pricing strategy: terms and content identification

    OpenAIRE

    Panasenko Tetyana

    2015-01-01

    The article is devoted to the terminology and content identification of seemingly identical concepts "price strategy" and "pricing strategy". The article contains evidence that the price strategy determines the direction, principles and procedure of implementing the company price policy and pricing strategy creates a set of rules and practical methods of price formation in accordance with the pricing strategy of the company.

  13. Progressively expanded neural network for automatic material identification in hyperspectral imagery

    Science.gov (United States)

    Paheding, Sidike

    The science of hyperspectral remote sensing focuses on the exploitation of the spectral signatures of various materials to enhance capabilities including object detection, recognition, and material characterization. Hyperspectral imagery (HSI) has been extensively used for object detection and identification applications since it provides plenty of spectral information to uniquely identify materials by their reflectance spectra. HSI-based object detection algorithms can be generally classified into stochastic and deterministic approaches. Deterministic approaches are comparatively simple to apply since they are usually based on direct spectral similarity such as spectral angles or spectral correlation. In contrast, stochastic algorithms require statistical modeling and estimation for the target class and non-target class. Over the decades, many single-class object detection methods have been proposed in the literature; however, deterministic multiclass object detection in HSI has not been explored. In this work, we propose a deterministic multiclass object detection scheme, named class-associative spectral fringe-adjusted joint transform correlation. The human brain is capable of simultaneously processing high volumes of multi-modal data received every second of the day. In contrast, a machine sees input data simply as random binary numbers. Although machines are computationally efficient, they are inferior when it comes to data abstraction and interpretation. Thus, mimicking the learning strength of the human brain has been a continuing trend in artificial intelligence. In this work, we present a biologically inspired neural network, named progressively expanded neural network (PEN Net), based on nonlinear transformation of input neurons to a feature space for better pattern differentiation. In PEN Net, discrete fixed excitations are disassembled and scattered in the feature space as a nonlinear line. Each disassembled element on the line corresponds to a pattern with similar features

  14. Price strategy and pricing strategy: terms and content identification

    Directory of Open Access Journals (Sweden)

    Panasenko Tetyana

    2015-11-01

    Full Text Available The article is devoted to the terminology and content identification of seemingly identical concepts "price strategy" and "pricing strategy". The article contains evidence that the price strategy determines the direction, principles and procedure of implementing the company price policy and pricing strategy creates a set of rules and practical methods of price formation in accordance with the pricing strategy of the company.

  15. Automatic identification of high impact articles in PubMed to support clinical decision making.

    Science.gov (United States)

    Bian, Jiantao; Morid, Mohammad Amin; Jonnalagadda, Siddhartha; Luo, Gang; Del Fiol, Guilherme

    2017-09-01

    The practice of evidence-based medicine involves integrating the latest best available evidence into patient care decisions. Yet, critical barriers exist for clinicians' retrieval of evidence that is relevant for a particular patient from primary sources such as randomized controlled trials and meta-analyses. To help address those barriers, we investigated machine learning algorithms that find clinical studies with high clinical impact from PubMed®. Our machine learning algorithms use a variety of features including bibliometric features (e.g., citation count), social media attention, journal impact factors, and citation metadata. The algorithms were developed and evaluated with a gold standard composed of 502 high impact clinical studies that are referenced in 11 clinical evidence-based guidelines on the treatment of various diseases. We tested the following hypotheses: (1) our high impact classifier outperforms a state-of-the-art classifier based on citation metadata and citation terms, and PubMed's® relevance sort algorithm; and (2) the performance of our high impact classifier does not decrease significantly after removing proprietary features such as citation count. The mean top 20 precision of our high impact classifier was 34% versus 11% for the state-of-the-art classifier and 4% for PubMed's® relevance sort (p=0.009); and the performance of our high impact classifier did not decrease significantly after removing proprietary features (mean top 20 precision=34% vs. 36%; p=0.085). The high impact classifier, using features such as bibliometrics, social media attention and MEDLINE® metadata, outperformed previous approaches and is a promising alternative to identifying high impact studies for clinical decision support. Copyright © 2017 Elsevier Inc. All rights reserved.
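
    The headline figure, "top 20 precision", is simply the fraction of truly high-impact studies among the 20 articles ranked highest by the classifier. A minimal sketch of that metric is given below; the argument names are illustrative.

    ```python
    import numpy as np

    def top_k_precision(scores, is_high_impact, k=20):
        """Fraction of truly high-impact studies among the k highest-scored articles."""
        order = np.argsort(scores)[::-1][:k]              # indices of the k top-ranked articles
        return float(np.mean(np.asarray(is_high_impact, dtype=float)[order]))
    ```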

  16. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation

    Science.gov (United States)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

    In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic messages (magnitudes, phases, distributions et al.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term sustained time series record. Considering that the time-varying characteristics of the Signal to Noise Ratio (SNR) of a long-term record and the uneven energy distributions of seismic event waveforms increase the difficulty of automatic seismic detection, in this work an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify the effective seismic events in the time series through Change Point detection (CPD) on the seismic record. In this method, an anomaly signal (seismic event) can be treated as a change point in the time series (seismic record). The statistical model of the signal in the neighborhood of the event point will change because of the seismic event occurrence. This means the SDAR aims to find the statistical irregularities of the record through CPD. There are 3 advantages of SDAR. 1. Anti-noise ability. The SDAR does not use waveform messages (such as amplitude, energy, polarization) for signal detection. Therefore, it is an appropriate technique for low-SNR data. 2. Real-time estimation. When new data appear in the record, the probability distribution models can be automatically updated by SDAR for on-line processing. 3. Discounting property. The SDAR introduces a discounting parameter to decrease the influence of present statistic values on future data. This makes SDAR a robust algorithm for non-stationary signal processing. With these 3 advantages, the SDAR method can handle the non-stationary time-varying long-term
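
    The discounting idea can be illustrated with a very small online model: an AR(1) predictor whose statistics are updated with a forgetting factor r, scoring each new sample by its squared, variance-normalized prediction error (large scores flag change-point candidates). This is a simplified sketch of the principle, not the authors' SDAR implementation; the parameter values and class name are arbitrary.

    ```python
    class DiscountedAR1:
        """Toy sequentially discounting AR(1) scorer for change-point candidates."""

        def __init__(self, r=0.02):
            self.r = r                  # discounting (forgetting) factor
            self.mean, self.var = 0.0, 1.0
            self.cov, self.lag_var = 0.0, 1.0
            self.prev = None

        def score(self, x):
            if self.prev is None:       # first sample: just initialize
                self.mean, self.prev = x, x
                return 0.0
            r = self.r
            self.mean += r * (x - self.mean)
            self.cov += r * ((x - self.mean) * (self.prev - self.mean) - self.cov)
            self.lag_var += r * ((self.prev - self.mean) ** 2 - self.lag_var)
            a = self.cov / self.lag_var if self.lag_var > 1e-12 else 0.0
            pred = self.mean + a * (self.prev - self.mean)   # AR(1) one-step prediction
            self.var += r * ((x - pred) ** 2 - self.var)
            s = (x - pred) ** 2 / max(self.var, 1e-12)       # normalized prediction error
            self.prev = x
            return s

    detector = DiscountedAR1(r=0.05)
    scores = [detector.score(x) for x in [0.1, 0.12, 0.09, 0.11, 2.5, 2.6]]  # jump at index 4
    ```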

  17. REMI and ROUSE: Quantitative Models for Long-Term and Short-Term Priming in Perceptual Identification

    NARCIS (Netherlands)

    E.J. Wagenmakers (Eric-Jan); R. Zeelenberg (René); D.E. Huber (David); J.G.W. Raaijmakers (Jeroen)

    2003-01-01

    The REM model originally developed for recognition memory (Shiffrin & Steyvers, 1997) has recently been extended to implicit memory phenomena observed during threshold identification of words. We discuss two REM models based on Bayesian principles: a model for long-term priming (REMI;

  18. Automatic estimation of aquifer parameters using long-term water supply pumping and injection records

    Science.gov (United States)

    Luo, Ning; Illman, Walter A.

    2016-09-01

    Analyses are presented of long-term hydrographs perturbed by variable pumping/injection events in a confined aquifer at a municipal water-supply well field in the Region of Waterloo, Ontario (Canada). Such records are typically not considered for aquifer test analysis. Here, the water-level variations are fingerprinted to pumping/injection rate changes using the Theis model implemented in the WELLS code coupled with PEST. Analyses of these records yield a set of transmissivity (T) and storativity (S) estimates between each monitoring and production borehole. These individual estimates are found to poorly predict water-level variations at nearby monitoring boreholes not used in the calibration effort. On the other hand, the geometric means of the individual T and S estimates are similar to those obtained from previous pumping tests conducted at the same site and adequately predict water-level variations in other boreholes. The analyses reveal that long-term municipal water-level records are amenable to analyses using a simple analytical solution to estimate aquifer parameters. However, uniform parameters estimated with analytical solutions should be considered as first rough estimates. More accurate hydraulic parameters should be obtained by calibrating a three-dimensional numerical model that rigorously captures the complexities of the site with these data.
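
    For a single constant-rate event, fitting T and S with the Theis solution reduces to a small curve fit; the WELLS/PEST workflow in the abstract generalizes this to variable rates by superposition. The sketch below uses SciPy on synthetic data, and the well distance, pumping rate and starting values are purely illustrative.

    ```python
    # Sketch: fit transmissivity T and storativity S to synthetic drawdown data
    # using the Theis solution for a single constant-rate pumping event.
    import numpy as np
    from scipy.special import exp1
    from scipy.optimize import curve_fit

    r = 250.0      # m, distance between pumping and observation wells (assumed)
    Q = 0.05       # m^3/s, constant pumping rate (assumed)

    def theis_drawdown(t, T, S):
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)    # exp1(u) is the Theis well function W(u)

    rng = np.random.default_rng(0)
    t_obs = np.array([600.0, 1800.0, 3600.0, 7200.0, 14400.0])             # s
    s_obs = theis_drawdown(t_obs, 5e-3, 2e-4) + rng.normal(0, 1e-3, t_obs.size)

    (T_fit, S_fit), _ = curve_fit(theis_drawdown, t_obs, s_obs, p0=(1e-3, 1e-4))
    print(T_fit, S_fit)
    ```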

  19. Rapid Identification of Cortical Motor Areas in Rodents by High-Frequency Automatic Cortical Stimulation and Novel Motor Threshold Algorithm

    Directory of Open Access Journals (Sweden)

    Mitsuaki Takemi

    2017-10-01

    Full Text Available Cortical stimulation mapping is a valuable tool to test the functional organization of the motor cortex in both basic neurophysiology (e.g., elucidating the process of motor plasticity) and clinical practice (e.g., before resecting brain tumors involving the motor cortex). However, compilation of motor maps based on the motor threshold (MT) requires a large number of cortical stimulations and is therefore time consuming. Shortening the time for mapping may reduce stress on the subjects and unveil short-term plasticity mechanisms. In this study, we aimed to establish a cortical stimulation mapping procedure in which the time needed to identify a motor area is reduced to the order of minutes without compromising reliability. We developed an automatic motor mapping system that applies epidural cortical surface stimulations (CSSs) one by one through 32 micro-electrocorticographic electrodes while examining the muscles represented in a cortical region. The next stimulus intensity was selected according to previously evoked electromyographic responses in a closed-loop fashion. CSS was repeated at 4 Hz and electromyographic responses were submitted to a newly proposed algorithm estimating the MT with a smaller number of stimuli than traditional approaches. The results showed that in all tested rats (n = 12) the motor area maps identified by our novel mapping procedure (novel MT algorithm and 4-Hz CSS) significantly correlated with the maps achieved by the conventional MT algorithm with 1-Hz CSS. The reliability of both mapping methods was very high (intraclass correlation coefficients ≧0.8), while the time needed for the mapping was reduced to one-twelfth with the novel method. Furthermore, the motor maps assessed by intracortical microstimulation and the novel CSS mapping procedure in two rats were compared and were also significantly correlated. Our novel mapping procedure that determined a cortical motor area within a few minutes could help

  20. Preliminary investigation of processes that affect source term identification

    International Nuclear Information System (INIS)

    Wickliff, D.S.; Solomon, D.K.; Farrow, N.D.

    1991-09-01

    Solid Waste Storage Area (SWSA) 5 is known to be a significant source of contaminants, especially tritium (3H), to the White Oak Creek (WOC) watershed. For example, Solomon et al. (1991) estimated the total 3H discharge in Melton Branch (most of which originates in SWSA 5) for the 1988 water year to be 1210 Ci. A critical issue for making decisions concerning remedial actions at SWSA 5 is knowing whether the annual contaminant discharge is increasing or decreasing. Because (1) the magnitude of the annual contaminant discharge is highly correlated to the amount of annual precipitation (Solomon et al., 1991) and (2) a significant lag may exist between the time of peak contaminant release from primary sources (i.e., waste trenches) and the time of peak discharge into streams, short-term stream monitoring by itself is not sufficient for predicting future contaminant discharges. In this study we use 3H to examine the link between contaminant release from primary waste sources and contaminant discharge into streams. By understanding and quantifying subsurface transport processes, realistic predictions of future contaminant discharge, along with an evaluation of the effectiveness of remedial action alternatives, will be possible. The objectives of this study are (1) to characterize the subsurface movement of contaminants (primarily 3H) with an emphasis on the effects of matrix diffusion; (2) to determine the relative strength of primary vs secondary sources; and (3) to establish a methodology capable of determining whether the 3H discharge from SWSA 5 to streams is increasing or decreasing

  1. Identification of conductive hearing loss using air conduction tests alone: reliability and validity of an automatic test battery.

    Science.gov (United States)

    Convery, Elizabeth; Keidser, Gitte; Seeto, Mark; Freeston, Katrina; Zhou, Dan; Dillon, Harvey

    2014-01-01

    The primary objective of this study was to determine whether a combination of automatically administered pure-tone audiometry and a tone-in-noise detection task, both delivered via an air conduction (AC) pathway, could reliably and validly predict the presence of a conductive component to the hearing loss. The authors hypothesized that performance on the battery of tests would vary according to hearing loss type. A secondary objective was to evaluate the reliability and validity of a novel automatic audiometry algorithm to assess its suitability for inclusion in the test battery. Participants underwent a series of hearing assessments that were conducted in a randomized order: manual pure-tone air conduction audiometry and bone conduction audiometry; automatic pure-tone air conduction audiometry; and an automatic tone-in-noise detection task. The automatic tests were each administered twice. The ability of the automatic test battery to: (a) predict the presence of an air-bone gap (ABG); and (b) accurately measure AC hearing thresholds was assessed against the results of manual audiometry. Test-retest conditions were compared to determine the reliability of each component of the automatic test battery. Data were collected on 120 ears from normal-hearing and conductive, sensorineural, and mixed hearing-loss subgroups. Performance differences between different types of hearing loss were observed. Ears with a conductive component (conductive and mixed ears) tended to have normal signal to noise ratios (SNR) despite impaired thresholds in quiet, while ears without a conductive component (normal and sensorineural ears) demonstrated, on average, an increasing relationship between their thresholds in quiet and their achieved SNR. Using the relationship between these two measures among ears with no conductive component as a benchmark, the likelihood that an ear has a conductive component can be estimated based on the deviation from this benchmark. The sensitivity and

  2. Automatic reference selection for quantitative EEG interpretation: identification of diffuse/localised activity and the active earlobe reference, iterative detection of the distribution of EEG rhythms.

    Science.gov (United States)

    Wang, Bei; Wang, Xingyu; Ikeda, Akio; Nagamine, Takashi; Shibasaki, Hiroshi; Nakamura, Masatoshi

    2014-01-01

    EEG (Electroencephalograph) interpretation is important for the diagnosis of neurological disorders. The proper adjustment of the montage can highlight the EEG rhythm of interest and avoid false interpretation. The aim of this study was to develop an automatic reference selection method to identify a suitable reference. The results may contribute to the accurate inspection of the distribution of EEG rhythms for quantitative EEG interpretation. The method includes two pre-judgements and one iterative detection module. The diffuse case is initially identified by pre-judgement 1 when intermittent rhythmic waveforms occur over large areas along the scalp. The earlobe reference or averaged reference is adopted for the diffuse case due to the effect of the earlobe reference depending on pre-judgement 2. An iterative detection algorithm is developed for the localised case when the signal is distributed in a small area of the brain. The suitable averaged reference is finally determined based on the detected focal and distributed electrodes. The presented technique was applied to the pathological EEG recordings of nine patients. One example of the diffuse case is introduced by illustrating the results of the pre-judgements. The diffusely intermittent rhythmic slow wave is identified. The effect of active earlobe reference is analysed. Two examples of the localised case are presented, indicating the results of the iterative detection module. The focal and distributed electrodes are detected automatically during the repeating algorithm. The identification of diffuse and localised activity was satisfactory compared with the visual inspection. The EEG rhythm of interest can be highlighted using a suitable selected reference. The implementation of an automatic reference selection method is helpful to detect the distribution of an EEG rhythm, which can improve the accuracy of EEG interpretation during both visual inspection and automatic interpretation. Copyright © 2013 IPEM

  3. Automatic writer identification using connected-component contours and edge-based features of uppercase western script

    NARCIS (Netherlands)

    Schomaker, L; Bulacu, M

    In this paper, a new technique for offline writer identification is presented, using connected-component contours (COCOCOs or CO(3)s) in uppercase handwritten samples. In our model, the writer is considered to be characterized by a stochastic pattern generator, producing a family of connected

  4. Automatic identification approach for high-performance liquid chromatography-multiple reaction monitoring fatty acid global profiling.

    Science.gov (United States)

    Tie, Cai; Hu, Ting; Jia, Zhi-Xin; Zhang, Jin-Lan

    2015-08-18

    Fatty acids (FAs) are a group of lipid molecules that are essential to organisms. As potential biomarkers for different diseases, FAs have attracted increasing attention from both biological researchers and the pharmaceutical industry. A sensitive and accurate method for globally profiling and identifying FAs is required for biomarker discovery. The high selectivity and sensitivity of high-performance liquid chromatography-multiple reaction monitoring (HPLC-MRM) gives it great potential to fulfill the need to identify FAs from complicated matrices. This paper developed a new approach for global FA profiling and identification for HPLC-MRM FA data mining. Mathematical models for identifying FAs were simulated using the isotope-induced retention time (RT) shift (IRS) and peak area ratios between parallel isotope peaks for a series of FA standards. The FA structures were predicted using another model based on the RT and molecular weight. Fully automated FA identification software was coded using the Qt platform based on these mathematical models. Different samples were used to verify the software. A high identification efficiency (greater than 75%) was observed when 96 FA species were identified in plasma. This FA identification strategy promises to accelerate FA research and applications.

  5. Automatic de-identification of textual documents in the electronic health record: a review of recent research

    Directory of Open Access Journals (Sweden)

    South Brett R

    2010-08-01

    Full Text Available Abstract Background In the United States, the Health Insurance Portability and Accountability Act (HIPAA) protects the confidentiality of patient data and requires the informed consent of the patient and approval of the Internal Review Board to use data for research purposes, but these requirements can be waived if data is de-identified. For clinical data to be considered de-identified, the HIPAA "Safe Harbor" technique requires 18 data elements (called PHI: Protected Health Information) to be removed. The de-identification of narrative text documents is often realized manually, and requires significant resources. Well aware of these issues, several authors have investigated automated de-identification of narrative text documents from the electronic health record, and a review of recent research in this domain is presented here. Methods This review focuses on recently published research (after 1995), and includes relevant publications from bibliographic queries in PubMed, conference proceedings, the ACM Digital Library, and interesting publications referenced in already included papers. Results The literature search returned more than 200 publications. The majority focused only on structured data de-identification instead of narrative text, on image de-identification, or described manual de-identification, and were therefore excluded. Finally, 18 publications describing automated text de-identification were selected for detailed analysis of the architecture and methods used, the types of PHI detected and removed, the external resources used, and the types of clinical documents targeted. All text de-identification systems aimed to identify and remove person names, and many included other types of PHI. Most systems used only one or two specific clinical document types, and were mostly based on two different groups of methodologies: pattern matching and machine learning. Many systems combined both approaches for different types of PHI, but the

  6. Automatic de-identification of textual documents in the electronic health record: a review of recent research.

    Science.gov (United States)

    Meystre, Stephane M; Friedlin, F Jeffrey; South, Brett R; Shen, Shuying; Samore, Matthew H

    2010-08-02

    In the United States, the Health Insurance Portability and Accountability Act (HIPAA) protects the confidentiality of patient data and requires the informed consent of the patient and approval of the Internal Review Board to use data for research purposes, but these requirements can be waived if data is de-identified. For clinical data to be considered de-identified, the HIPAA "Safe Harbor" technique requires 18 data elements (called PHI: Protected Health Information) to be removed. The de-identification of narrative text documents is often realized manually, and requires significant resources. Well aware of these issues, several authors have investigated automated de-identification of narrative text documents from the electronic health record, and a review of recent research in this domain is presented here. This review focuses on recently published research (after 1995), and includes relevant publications from bibliographic queries in PubMed, conference proceedings, the ACM Digital Library, and interesting publications referenced in already included papers. The literature search returned more than 200 publications. The majority focused only on structured data de-identification instead of narrative text, on image de-identification, or described manual de-identification, and were therefore excluded. Finally, 18 publications describing automated text de-identification were selected for detailed analysis of the architecture and methods used, the types of PHI detected and removed, the external resources used, and the types of clinical documents targeted. All text de-identification systems aimed to identify and remove person names, and many included other types of PHI. Most systems used only one or two specific clinical document types, and were mostly based on two different groups of methodologies: pattern matching and machine learning. Many systems combined both approaches for different types of PHI, but the majority relied only on pattern matching, rules, and
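
    Of the two methodology families the review identifies, the pattern-matching side can be illustrated with a toy scrubber that replaces a few obvious PHI types with category tags. The patterns and tags below are illustrative placeholders; production systems reviewed in the paper combine far richer rules, dictionaries and machine-learning models.

    ```python
    import re

    # Toy PHI patterns: dates, US-style phone numbers, record numbers, e-mail addresses.
    PHI_PATTERNS = {
        "DATE":  r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",
        "PHONE": r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b",
        "MRN":   r"\bMRN[:\s]*\d{6,10}\b",
        "EMAIL": r"\b[\w.+-]+@[\w-]+\.[\w.]+\b",
    }

    def scrub(note):
        """Replace every matched PHI span with a bracketed category tag."""
        for tag, pattern in PHI_PATTERNS.items():
            note = re.sub(pattern, f"[{tag}]", note, flags=re.IGNORECASE)
        return note

    print(scrub("Seen on 03/14/2010, MRN: 00123456, call 555-867-5309."))
    ```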

  7. Evaluation of a direct method for the identification and antibiotic susceptibility assessment of microorganisms isolated from blood cultures by automatic systems

    Directory of Open Access Journals (Sweden)

    Sergio Frugoni

    2008-03-01

    Full Text Available The purpose of blood cultures in the septic patient is to guide a correct therapeutic approach. Identification and antibiotic susceptibility testing carried out directly from the bottle may give important information in a short time. The introduction of automatic instrumentation has improved the detection of pathogens in the blood; however, the time elapsing between positive detection and the microbiological report is still long. The aim of this study is the evaluation of a fast, easy and cheap method, applicable to routine work, which could reduce the response time in the diagnosis of bacteraemia. The automatic systems Vitek Senior (bioMérieux) and Vitek 2 (bioMérieux) were used at Pio Albergo Trivulzio (Centre1) and at the Istituto dei Tumori (Centre2), respectively. To remove blood cells, 7 ml of the culture was transferred by vacuum sampling into a test tube and centrifuged for 10 minutes at 1000 rpm; the supernatant was further centrifuged for 10 minutes at 3000 rpm. 0.5 ml of BHI was added to the pellet or sediment, and the concentration of the bacterial suspension was adjusted for inoculation. At the same time, standard cultures in suitable culture media were carried out for comparison. In Centre1 and Centre2, 63 and 31 Gram-negative and 32 and 40 Gram-positive microorganisms were isolated and identified, respectively. The identification of Gram-negative and Gram-positive microorganisms showed an agreement between the direct and the standard method of 100% and 86.2%, and of 93.3% and 65.78%, respectively. For antibiotic susceptibility tests, a total of 903 (Centre1) and 491 (Centre2) and 396 and 509 compounds were assessed in Gram-negative and Gram-positive bacteria, respectively. The analysis highlighted that Centre1 reported 0.30% very major errors (GE), 0.92% major errors (EM) and 1.23% minor errors (Em). Centre2 showed 0.57% very major errors (GE), 0.09% major errors

  8. Automatic Assessment of Global Craniofacial Differences between Crouzon mice and Wild-type mice in terms of the Cephalic Index

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur; Oubel, Estanislao; Frangi, Alejandro F.

    2006-01-01

    of landmark matching is limited using only affine transformations, the errors were considered acceptable. The automatic estimation of the cephalic index was in full agreement with the gold standard measurements. Discriminant analysis of the three scaling parameters resulted in a good classification...

  9. Combining Facial Recognition, Automatic License Plate Readers and Closed Circuit Television to Create an Interstate Identification System for Wanted Subjects

    Science.gov (United States)

    2015-12-01

    they are responsive or unresponsive based on what is in their best interest at the time. Media and privacy groups get their power from controversy; the...of potential suspect identification. To do so, the thesis examines all systems’ basic capabilities, privacy issues or concerns, best practices, possible areas for improvement, and policy considerations. Since the

  10. Rapid and automatic chemical identification of the medicinal flower buds of Lonicera plants by the benchtop and hand-held Fourier transform infrared spectroscopy

    Science.gov (United States)

    Chen, Jianbo; Guo, Baolin; Yan, Rui; Sun, Suqin; Zhou, Qun

    2017-07-01

    With the utilization of hand-held equipment, Fourier transform infrared (FT-IR) spectroscopy is a promising analytical technique to minimize the time cost for the chemical identification of herbal materials. This research examines the feasibility of the hand-held FT-IR spectrometer for the on-site testing of herbal materials, using Lonicerae Japonicae Flos (LJF) and Lonicerae Flos (LF) as examples. Correlation-based linear discriminant models for LJF and LF are established based on the benchtop and hand-held FT-IR instruments. The benchtop FT-IR models can exactly recognize all articles of LJF and LF. Although a few LF articles are misjudged at the sub-class level, the hand-held FT-IR models are able to exactly discriminate LJF and LF. As a direct and label-free analytical technique, FT-IR spectroscopy has great potential in the rapid and automatic chemical identification of herbal materials either in laboratories or in the field. This could help prevent the spread and use of adulterated herbal materials in a timely manner.
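
    The correlation-based part of such models reduces to a very small computation: correlate a query spectrum against class mean spectra sampled on a common wavenumber grid and take the best match. This is an illustrative sketch, not the discriminant models built in the study; the reference dictionary is hypothetical.

    ```python
    import numpy as np

    def classify_spectrum(query, references):
        """query: 1-D absorbance array; references: dict class name -> mean spectrum
        sampled on the same wavenumber grid. Returns the best class and all correlations."""
        corr = {name: float(np.corrcoef(query, ref)[0, 1]) for name, ref in references.items()}
        best = max(corr, key=corr.get)
        return best, corr
    ```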

  11. USER EMOTION IDENTIFICATION IN TWITTER USING SPECIFIC FEATURES: HASHTAG, EMOJI, EMOTICON, AND ADJECTIVE TERM

    Directory of Open Access Journals (Sweden)

    Yuita Arum Sari

    2014-08-01

    Full Text Available Abstract Twitter is a social media application whose messages can provide cues for identifying user emotion. Identification of user emotion can be utilized in commercial, health, political, and security domains. The difficulty of emotion identification in tweets is that the unstructured short text messages make it hard to extract the main features. In this paper, we propose a new framework for identifying the tendency of user emotions using specific features, i.e. hashtags, emojis, emoticons, and adjective terms. Preprocessing is applied in the first phase, and then user emotions are identified by means of a kNN classification method. The proposed method achieves good results close to the ground truth, with an accuracy of 92%.
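
    The feature idea (counts of hashtags, emoticons/emojis and adjective cues per tweet) combined with kNN can be sketched as follows. The lexicons, labels and tweets below are toy placeholders, not the authors' resources, and real emoji handling would need Unicode-aware tokenization.

    ```python
    from sklearn.neighbors import KNeighborsClassifier

    EMOTICONS = {":)", ":(", ":D", ":'("}
    HAPPY_ADJ = {"happy", "great", "awesome"}
    SAD_ADJ = {"sad", "awful", "terrible"}

    def features(tweet):
        tokens = tweet.split()
        words = [t.lower() for t in tokens]
        return [sum(t.startswith("#") for t in tokens),    # hashtag count
                sum(t in EMOTICONS for t in tokens),        # emoticon count
                sum(w in HAPPY_ADJ for w in words),         # positive adjective cues
                sum(w in SAD_ADJ for w in words)]           # negative adjective cues

    tweets = ["so happy today :) #blessed", "awful day :( #fail",
              "great game :D", "terrible news :'("]
    labels = ["joy", "sadness", "joy", "sadness"]

    clf = KNeighborsClassifier(n_neighbors=1).fit([features(t) for t in tweets], labels)
    print(clf.predict([features("awesome trip :) #fun")]))   # expected: ['joy']
    ```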

  12. An investigation into the factors that influence toolmark identifications on ammunition discharged from semi-automatic pistols recovered from car fires.

    Science.gov (United States)

    Collender, Mark A; Doherty, Kevin A J; Stanton, Kenneth T

    2017-01-01

    Following a shooting incident where a vehicle is used to convey the culprits to and from the scene, both the getaway car and the firearm are often deliberately burned in an attempt to destroy any forensic evidence which may be subsequently recovered. Here we investigate the factors that influence the ability to make toolmark identifications on ammunition discharged from pistols recovered from such car fires. This work was carried out by conducting a number of controlled furnace tests in conjunction with real car fire tests in which three 9mm semi-automatic pistols were burned. Comparisons between pre-burn and post burn test fired ammunition discharged from these pistols were then performed to establish if identifications were still possible. The surfaces of the furnace heated samples and car fire samples were examined following heating/burning to establish what factors had influenced their surface morphology. The primary influence on the surfaces of the furnace heated and car fire samples was the formation of oxide layers. The car fire samples were altered to a greater extent than the furnace heated samples. Identifications were still possible between pre- and post-burn discharged cartridge cases, but this was not the case for the discharged bullets. It is suggested that the reason for this is a difference between the types of firearms discharge-generated toolmarks impressed onto the base of cartridge cases compared to those striated along the surfaces of bullets. It was also found that the temperatures recorded in the front foot wells were considerably less than those recorded on top of the rear seats during the car fires. These factors should be assessed by forensic firearms examiners when performing casework involving pistols recovered from car fires. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Identification of alcohol abuse and transition from long-term unemployment to disability pension.

    Science.gov (United States)

    Nurmela, Kirsti; Heikkinen, Virpi; Hokkanen, Risto; Ylinen, Aarne; Uitti, Jukka; Mattila, Aino; Joukamaa, Matti; Virtanen, Pekka

    2015-07-01

    The aim of the study was to reveal potential gaps and inconsistencies in the identification of alcohol abuse in health care and in employment services and to analyse the granting of disability pensions with respect to the alcohol abuse identification pattern. The material consisted of documentary information on 505 long-term unemployed subjects with low employability sent to the development project entitled 'Eligibility for a Disability Pension' in 2001-2006 in Finland. The dichotomous variables 'Alcohol abuse identified in employment services' and 'Alcohol abuse identified in health care' were cross-tabulated to obtain a four-class variable 'Alcohol abuse identification pattern'. Logistic regression analyses were conducted to ascertain the association of the alcohol abuse identification pattern with the granting of disability pensions. Alcohol abuse was detected by both health care and employment services in 47% of those identified as abusers (41% of examinees). Each service system also identified cases that the other did not. When alcohol abuse was identified in health care only, the OR for a disability pension being granted was 2.8 (95% CI 1.5-5.2) compared with applicants without identified alcohol abuse. The result remained the same and statistically significant after adjusting for confounders. Alcohol abuse identified in health care was positively associated with the granting of a disability pension. Closer co-operation between employment services and health care could help to identify those long-term unemployed individuals with impaired work ability in need of thorough medical examination. © 2015 the Nordic Societies of Public Health.

  14. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  15. Planning for Site Transition to Long-Term Stewardship: Identification of Requirements and Issues

    International Nuclear Information System (INIS)

    Banaee, J.

    2002-01-01

    A systematic methodology is presented and applied for the identification of requirements and issues pertaining to the planning for, and transition to, long term stewardship (LTS). The method has been applied to three of the twelve identified LTS functions. The results of the application of the methodology to contaminated and uncontaminated federal real property in those three functions are presented. The issues that could be seen as impediments to the implementation of LTS are also identified for the three areas under consideration. The identified requirements are significant and in some cases complex to implement. It is clear that early and careful planning is required in all circumstances.

  16. Planning for Site Transition to Long-Term Stewardship: Identification of Requirements and Issues

    Energy Technology Data Exchange (ETDEWEB)

    Banaee, J.

    2002-05-16

    A systematic methodology is presented and applied for the identification of requirements and issues pertaining to the planning for, and transition to, long term stewardship (LTS). The method has been applied to three of the twelve identified LTS functions. The results of the application of the methodology to contaminated and uncontaminated federal real property in those three functions are presented. The issues that could be seen as impediments to the implementation of LTS are also identified for the three areas under consideration. The identified requirements are significant and in some cases complex to implement. It is clear that early and careful planning is required in all circumstances.

  17. Planning for Site Transition to Long-Term Stewardship: Identification of Requirements and Issues

    Energy Technology Data Exchange (ETDEWEB)

    Banaee, Jila

    2002-08-01

    A systematic methodology is presented and applied for the identification of requirements and issues pertaining to the planning for, and transition to, long term stewardship (LTS). The method has been applied to three of the twelve identified LTS functions. The results of the application of the methodology to contaminated and uncontaminated federal real property in those three functions are presented. The issues that could be seen as impediments to the implementation of LTS are also identified for the three areas under consideration. The identified requirements are significant and in some cases complex to implement. It is clear that early and careful planning is required in all circumstances.

  18. The influence of short-term memory on standard discrimination and cued identification olfactory tasks.

    Science.gov (United States)

    Zucco, Gesualdo M; Hummel, Thomas; Tomaiuolo, Francesco; Stevenson, Richard J

    2014-01-30

    Amongst the techniques to assess olfactory functions, discrimination and cued identification are those most prone to the influence of odour short-term memory (STM). The discrimination task requires participants to detect the odd one out of three presented odourants. As re-smelling is not permitted, an unintended STM load may arise, even though the task purports to assess discrimination ability. Analogously, the cued identification task requires participants to smell an odour and then select a label from three or four alternatives. As the interval between smelling and reading each label increases, this too imposes an STM load, even though the task aims to measure identification ability. We tested whether modifying task design to reduce STM load improves performance on these tests. We examined five age groups of participants (Adolescents, Young adults, Middle-aged, Elderly, very Elderly), some of whom should be more prone to the effects of STM load than others, on standard and modified tests of discrimination and identification. We found that using a technique to reduce STM load improved performance, especially for the very Elderly and Adolescent groups, as sources of error were prevented. The findings indicate that STM load can adversely affect performance in groups vulnerable to memory impairment (i.e., very Elderly) and in those who may still be acquiring memory-based representations of familiar odours (i.e., Adolescents). It may be that adults in general would be even more sensitive to the effects of olfactory STM load reduction if the odour-related task were more difficult. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Automatic methods for long-term tracking and the detection and decoding of communication dances in honeybees

    Directory of Open Access Journals (Sweden)

    Fernando Wario

    2015-09-01

    The honeybee waggle dance communication system is an intriguing example of abstract animal communication and has been investigated thoroughly throughout the last seven decades. Typically, observables such as durations or angles are extracted manually, directly from the observation hive or from video recordings, to quantify dance properties, particularly to determine where bees have foraged. In recent years, biology has profited from automation, improving measurement precision, removing human bias, and accelerating data collection. As a further step, we have developed technologies to track all individuals of a honeybee colony and detect and decode communication dances automatically. In strong contrast to conventional approaches that focus on a small subset of the hive life, whether this regards time, space, or animal identity, our more inclusive system will help the understanding of the dance comprehensively in its spatial, temporal, and social context. In this contribution, we present full specifications of the recording setup and the software for automatic recognition and decoding of tags and dances, and we discuss potential research directions that may benefit from automation. Lastly, to exemplify the power of the methodology, we show experimental data and respective analyses for a continuous, experimental recording of nine weeks' duration.

  20. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  1. I RAN Fast and I Remembered What I Read: The Relationship between Reading, Rapid Automatic Naming, and Auditory and Visual Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Sheila G. Crewther

    2011-05-01

    Although rapid automatic naming (RAN) speed and short-term auditory memory are widely recognised as good predictors of reading ability in most age groups, the predictive value of short-term memory for visually presented digits for reading and RAN in young typically developing learner readers (mean age 91.5 months) has seldom been investigated. We found that visual digit span is a better predictor of reading ability than auditory digit span in learner readers. A significant correlation has also been found between RAN speed and visual, but not auditory digit span. These results suggest that RAN speed may be a good predictor of a child's future reading ability and eventual fluency because, like visual digit span, it is a measure of rate of access to memory for the visual icons and their semantic name and meaning. The results also suggest that auditory memory is not an important factor in young children learning to read.

  2. Radio frequency identification (RFID) of dentures in long-term care facilities.

    Science.gov (United States)

    Madrid, Carlos; Korsvold, Tové; Rochat, Aline; Abarca, Marcelo

    2012-03-01

    Identifying the ownership of lost dentures when they are found is a common and expensive problem in long-term care facilities (LTCFs) and hospitals. The purpose of this study was to evaluate the reliability of using radiofrequency identification (RFID) in the identification of dentures for LTCF residents after 3 and 6 months. Thirty-eight residents of 2 LTCFs in Switzerland agreed to participate after providing informed consent. The tag was programmed with the family and first names of the participants and then inserted in the dentures. After placement of the tag, the information was read. A second and third assessment to review the functioning of the tag occurred at 3 and 6 months, and defective tags (if present) were reported and replaced. The data were analyzed with descriptive statistics. At the 3-month assessment of 34 residents (63 tags), 1 tag was unreadable and 62 tags (98.2%) were operational. At 6 months, the tags of 27 of the enrolled residents (50 tags) were available for review. No examined tag was defective at this time period. Within the limits of this study (number of patients, 6-month time span), RFID appears to be a reliable method of tracking and identifying dentures, with only 1 of 65 devices being unreadable at 3 months and 100% of 50 initially placed tags being readable at the end of the trial. Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.

  3. A New Approach to Environmentally Safe Unique Identification of Long-Term Stored Copper Canisters

    International Nuclear Information System (INIS)

    Chernikova, D.; Axell, K.; Nordlund, A.

    2015-01-01

    A new approach to environmentally safe unique identification of long-term stored copper canisters is suggested in this paper. The approach is based on the use of a tungsten-based insert placed inside a copper cask between a top iron lid and a copper lid. The insert/label is marked with a unique code in the form of a binary number, which is implemented as a combination of holes in the tungsten plate. In order to provide the necessary redundancy of the identifier, the tungsten label is marked with a few identical binary codes. The position of the code (i.e., the holes in the tungsten) corresponds to a predefined placement of the spent fuel assemblies in the iron container. This is in order to avoid any non-uniformity of the gamma background at the canister surface caused by the presence of iron-filled spaces between spent nuclear fuel assemblies. Due to the use of the tungsten material, gamma rays emitted by the spent fuel assemblies are collimated in a specific way because of the strong attenuation properties of tungsten. As a result, the variation in the gamma-counting rate in a detector array placed on top of the copper lid provides the distribution of the holes in the tungsten insert, in other words the unique identifier. Thus, this way of identifying the copper cask does not impair the integrity of the cask, and it offers a way for the information about the spent nuclear fuel to remain legible on a time scale of up to a few thousand years. (author)

  4. Identification of pumping influences in long-term water level fluctuations.

    Science.gov (United States)

    Harp, Dylan R; Vesselinov, Velimir V

    2011-01-01

    Identification of the pumping influences at monitoring wells caused by spatially and temporally variable water supply pumping can be a challenging yet important hydrogeological task. The information obtained can be critical for conceptualization of the hydrogeological conditions and for indications of the zone of influence of the individual pumping wells. However, the pumping influences are often intermittent and small in magnitude, with variable production rates from multiple pumping wells. While these difficulties may support an inclination to abandon the existing dataset and conduct a dedicated cross-hole pumping test, that option can be challenging and expensive to coordinate and execute. This paper presents a method that couples a simple analytical model with inverse modeling for the analysis of a long-term water level record. The methodology allows the identification of the pumping wells influencing the water level fluctuations. Thus, the analysis provides an efficient and cost-effective alternative to designed and coordinated cross-hole pumping tests. We apply this method to a dataset from the Los Alamos National Laboratory site. Our analysis also provides (1) an evaluation of the information content of the transient water level data; (2) indications of potential structures of the aquifer heterogeneity inhibiting or promoting pressure propagation; and (3) guidance for the development of more complicated models requiring detailed specification of the aquifer heterogeneity. Copyright © 2010 The Author(s). Journal compilation © 2010 National Ground Water Association.
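
    To make the idea of fitting a simple analytical model to a water-level record concrete, the sketch below fits a Theis drawdown curve to a synthetic drawdown series with SciPy. This is a generic textbook stand-in, not the authors' code: the pumping rate Q, distance r and parameter values are invented for the example.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.special import exp1

      # Hypothetical setup: a monitoring well at distance r from a supply well
      # pumping at rate Q; the Theis solution gives drawdown versus time.
      Q, r = 500.0, 250.0                      # m^3/day and m (assumed values)
      t = np.linspace(0.05, 10.0, 60)          # days since pumping started

      def theis(t, T, S):
          """Theis drawdown for transmissivity T (m^2/day) and storativity S."""
          u = r ** 2 * S / (4.0 * T * t)
          return Q / (4.0 * np.pi * T) * exp1(u)

      rng = np.random.default_rng(0)
      observed = theis(t, 300.0, 2e-4) + 0.01 * rng.standard_normal(t.size)

      # Inverse step: recover the aquifer parameters from the noisy record.
      (T_est, S_est), _ = curve_fit(theis, t, observed, p0=(100.0, 1e-3),
                                    bounds=([1.0, 1e-6], [1e4, 1e-1]))
      print(T_est, S_est)                      # should be close to 300 and 2e-4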

  5. The Development of Automaticity in Short-Term Memory Search: Item-Response Learning and Category Learning

    Science.gov (United States)

    Cao, Rui; Nosofsky, Robert M.; Shiffrin, Richard M.

    2017-01-01

    In short-term-memory (STM)-search tasks, observers judge whether a test probe was present in a short list of study items. Here we investigated the long-term learning mechanisms that lead to the highly efficient STM-search performance observed under conditions of consistent-mapping (CM) training, in which targets and foils never switch roles across…

  6. Long-term memory for odors: influences of familiarity and identification across 64 days.

    Science.gov (United States)

    Cornell Kärnekull, Stina; Jönsson, Fredrik U; Willander, Johan; Sikström, Sverker; Larsson, Maria

    2015-05-01

    Few studies have investigated long-term odor recognition memory, although some early observations suggested that the forgetting rate of olfactory representations is slower than for other sensory modalities. This study investigated recognition memory across 64 days for high and low familiar odors and faces. Memory was assessed in 83 young participants on 4 occasions: immediately, and 4, 16, and 64 days after encoding. The results indicated significant forgetting for odors and faces across the 64 days. The forgetting functions for the 2 modalities were not fundamentally different. Moreover, high familiar odors and faces were better remembered than low familiar ones, indicating an important role of semantic knowledge in recognition proficiency for both modalities. Although odor recognition was significantly better than chance at the 64-day testing, memory for the low familiar odors was relatively poor. Also, the results indicated that odor identification consistency across sessions, irrespective of accuracy, was positively related to successful recognition. © The Author 2015. Published by Oxford University Press.

  7. Automatic identification and removal of ocular artifacts in EEG--improved adaptive predictor filtering for portable applications.

    Science.gov (United States)

    Zhao, Qinglin; Hu, Bin; Shi, Yujun; Li, Yang; Moore, Philip; Sun, Minghou; Peng, Hong

    2014-06-01

    Electroencephalogram (EEG) signals have a long history of use as a noninvasive approach to measure brain function. An essential component in EEG-based applications is the removal of Ocular Artifacts (OA) from the EEG signals. In this paper, we propose a hybrid de-noising method combining Discrete Wavelet Transformation (DWT) and an Adaptive Predictor Filter (APF). A particularly novel feature of the proposed method is the use of the APF, based on an adaptive autoregressive model, for prediction of the waveform of signals in the ocular artifact zones. In our test, based on simulated data, the accuracy of noise removal in the proposed model was significantly increased when compared to existing methods, including Wavelet Packet Transform (WPT) and Independent Component Analysis (ICA), and Discrete Wavelet Transform (DWT) and Adaptive Noise Cancellation (ANC). The results demonstrate that the proposed method achieved a lower mean square error and higher correlation between the original and corrected EEG. The proposed method has also been evaluated using data from calibration trials for the Online Predictive Tools for Intervention in Mental Illness (OPTIMI) project. The results of this evaluation indicate an improvement in performance in terms of the recovery of true EEG signals, EEG tracking, and computational speed in the analysis. The proposed method is well suited to applications in portable environments where the constraints with respect to acceptable wearable sensor attachments usually dictate single channel devices.
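
    The core of the APF step is an adaptive autoregressive predictor. The fragment below is a minimal, hypothetical NumPy sketch of such a predictor using a normalized-LMS coefficient update; it is not the authors' implementation, and the simulated blink-like artifact is invented purely to exercise the code.

      import numpy as np

      def adaptive_ar_predictor(signal, order=4, mu=0.5):
          """One-step-ahead AR prediction with a normalized-LMS coefficient update.

          Within marked ocular-artifact zones, the predicted waveform can stand
          in for (or be subtracted from) the contaminated samples.
          """
          w = np.zeros(order)                     # AR coefficients, adapted on-line
          pred = np.zeros_like(signal)
          for n in range(order, len(signal)):
              x = signal[n - order:n][::-1]       # most recent samples first
              pred[n] = w @ x
              e = signal[n] - pred[n]             # prediction error
              w += mu * e * x / (x @ x + 1e-8)    # normalized LMS update
          return pred

      # Toy usage: EEG-like noise plus a slow, large blink-like deflection.
      rng = np.random.default_rng(0)
      t = np.arange(2000) / 250.0
      eeg = 0.5 * rng.standard_normal(t.size)
      blink = 30.0 * np.exp(-((t - 4.0) ** 2) / 0.05)
      predicted = adaptive_ar_predictor(eeg + blink)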

  8. A novel GLM-based method for the Automatic IDentification of functional Events (AIDE) in fNIRS data recorded in naturalistic environments.

    Science.gov (United States)

    Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias

    2017-07-15

    Recent technological advances have allowed the development of portable functional Near-Infrared Spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real-world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on general linear model (GLM) least-squares fit analysis for the Automatic IDentification of functional Events (or AIDE) directly from real-world fNIRS neuroimaging data. In order to investigate the accuracy and feasibility of this method, as a proof-of-principle we applied the algorithm to (i) synthetic fNIRS data simulating both block-, event-related and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from simulated fNIRS data with an accuracy of 89%, 97% and 91% for the simulated block-, event-related and mixed-design experiments respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the experimentally measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two different conditions respectively (condition 1: social, interact with a person; condition 2: non-social, interact with an object). AIDE managed to recover 3/4 events and 3/6 events for conditions 1 and 2 respectively. The identified functional events were then compared with behavioural data from the video recordings of the movements and actions of the participant. Our results suggest that "brain-first" rather than "behaviour-first" analysis is
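
    The idea of a GLM least-squares search for event onsets can be illustrated with the hypothetical NumPy/SciPy sketch below, which convolves candidate boxcars with a canonical haemodynamic response and keeps the onset with the smallest residual. The HRF shape, sampling rate and candidate onsets are assumptions made for the illustration, not part of AIDE itself.

      import numpy as np
      from scipy import stats

      def canonical_hrf(t):
          # double-gamma haemodynamic response (illustrative shape only)
          return stats.gamma.pdf(t, 6) - stats.gamma.pdf(t, 16) / 6.0

      def best_onset(y, fs, duration, candidate_onsets):
          """Return the candidate onset whose GLM fit explains the data best."""
          n = y.size
          hrf = canonical_hrf(np.arange(0, 30, 1.0 / fs))
          best, best_rss = None, np.inf
          for onset in candidate_onsets:
              box = np.zeros(n)
              i0 = int(onset * fs)
              box[i0:i0 + int(duration * fs)] = 1.0
              reg = np.convolve(box, hrf)[:n]            # predicted response
              X = np.column_stack([reg, np.ones(n)])     # regressor + constant
              beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
              rss = rss[0] if rss.size else np.sum((y - X @ beta) ** 2)
              if rss < best_rss:
                  best, best_rss = onset, rss
          return best

      # Toy usage: simulate a response to an event starting at 20 s and recover it.
      fs, n = 10.0, 600
      true = np.zeros(n); true[int(20 * fs):int(25 * fs)] = 1.0
      y = np.convolve(true, canonical_hrf(np.arange(0, 30, 1 / fs)))[:n]
      y += 0.05 * np.random.default_rng(1).standard_normal(n)
      print(best_onset(y, fs, duration=5.0, candidate_onsets=range(0, 50, 5)))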

  9. Automatic segmentation of the hippocampus for preterm neonates from early-in-life to term-equivalent age

    Directory of Open Access Journals (Sweden)

    Ting Guo

    2015-01-01

    Conclusions: MAGeT-Brain is capable of segmenting hippocampi accurately in preterm neonates, even at early-in-life. Hippocampal asymmetry with a larger right side is demonstrated on early-in-life images, suggesting that this phenomenon has its onset in the 3rd trimester of gestation. Hippocampal volume assessed at the time of early-in-life and term-equivalent age is linearly associated with GA at birth, whereby smaller volumes are associated with earlier birth.

  10. The development of automaticity in short-term memory search: Item-response learning and category learning.

    Science.gov (United States)

    Cao, Rui; Nosofsky, Robert M; Shiffrin, Richard M

    2017-05-01

    In short-term-memory (STM)-search tasks, observers judge whether a test probe was present in a short list of study items. Here we investigated the long-term learning mechanisms that lead to the highly efficient STM-search performance observed under conditions of consistent-mapping (CM) training, in which targets and foils never switch roles across trials. In item-response learning, subjects learn long-term mappings between individual items and target versus foil responses. In category learning, subjects learn high-level codes corresponding to separate sets of items and learn to attach old versus new responses to these category codes. To distinguish between these 2 forms of learning, we tested subjects in categorized varied mapping (CV) conditions: There were 2 distinct categories of items, but the assignment of categories to target versus foil responses varied across trials. In cases involving arbitrary categories, CV performance closely resembled standard varied-mapping performance without categories and departed dramatically from CM performance, supporting the item-response-learning hypothesis. In cases involving prelearned categories, CV performance resembled CM performance, as long as there was sufficient practice or steps taken to reduce trial-to-trial category-switching costs. This pattern of results supports the category-coding hypothesis for sufficiently well-learned categories. Thus, item-response learning occurs rapidly and is used early in CM training; category learning is much slower but is eventually adopted and is used to increase the efficiency of search beyond that available from item-response learning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. Automatized near-real-time short-term Probabilistic Volcanic Hazard Assessment of tephra dispersion before eruptions: BET_VHst for Vesuvius and Campi Flegrei during recent exercises

    Science.gov (United States)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Rouwet, Dmitri; Tonini, Roberto; Macedonio, Giovanni; Marzocchi, Warner

    2015-04-01

    Probabilistic Volcanic Hazard Assessment (PVHA) represents the most complete scientific contribution for planning rational strategies aimed at mitigating the risk posed by volcanic activity at different time scales. The definition of the space-time window for PVHA is related to the kind of risk mitigation actions that are under consideration. Short temporal intervals (days to weeks) are important for short-term risk mitigation actions like the evacuation of a volcanic area. During volcanic unrest episodes or eruptions, it is of primary importance to produce short-term tephra fallout forecasts and to update them frequently to account for the rapidly evolving situation. This information is obviously crucial for crisis management, since tephra may heavily affect building stability, public health, transportation and evacuation routes (airports, trains, road traffic) and lifelines (electric power supply). In this study, we propose a methodology named BET_VHst (Selva et al. 2014) for short-term PVHA of volcanic tephra dispersal based on automatic interpretation of measures from the monitoring system and on physical models of tephra dispersal from all possible vent positions and eruptive sizes based on frequently updated meteorological forecasts. The large uncertainty at all the steps required for the analysis, both aleatory and epistemic, is treated by means of Bayesian inference and statistical mixing of long- and short-term analyses. The BET_VHst model is here presented through its implementation during two exercises organized for volcanoes in the Neapolitan area: MESIMEX for Mt. Vesuvius, and VUELCO for Campi Flegrei. References Selva J., Costa A., Sandri L., Macedonio G., Marzocchi W. (2014) Probabilistic short-term volcanic hazard in phases of unrest: a case study for tephra fallout, J. Geophys. Res., 119, doi: 10.1002/2014JB011252
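
    As a purely illustrative sketch of the general idea of Bayesian updating and of mixing long- and short-term information (not the actual BET_VHst event tree), the toy code below updates a Beta prior on a monthly eruption probability with a hypothetical count of monitoring anomalies and then mixes the two estimates; every number in it is invented.

      import numpy as np

      # Long-term analysis: Beta prior on the monthly eruption probability,
      # hypothetically built from the eruptive record (invented numbers).
      a_lt, b_lt = 2.0, 98.0                      # prior mean ~ 0.02

      # Short-term analysis: number of monitoring parameters currently anomalous.
      n_anomalous, n_monitored = 4, 6             # hypothetical unrest state

      # Bayesian update of the prior by the monitoring evidence, then a simple
      # statistical mixing of the long- and short-term probability estimates.
      a_st, b_st = a_lt + n_anomalous, b_lt + (n_monitored - n_anomalous)
      p_long = a_lt / (a_lt + b_lt)
      p_short = a_st / (a_st + b_st)
      w = n_anomalous / n_monitored               # lean on short-term during unrest
      p_mixed = (1.0 - w) * p_long + w * p_short
      print(p_long, p_short, p_mixed)

      # Epistemic uncertainty can be propagated by sampling both Beta distributions.
      samples = np.random.default_rng(0).beta([a_lt, a_st], [b_lt, b_st], size=(10000, 2))
      print(np.percentile(samples, [5, 50, 95], axis=0))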

  12. Long-term clinical evaluation of the automatic stance-phase lock-controlled prosthetic knee joint in young adults with unilateral above-knee amputation.

    Science.gov (United States)

    Andrysek, Jan; Wright, F Virginia; Rotter, Karin; Garcia, Daniela; Valdebenito, Rebeca; Mitchell, Carlos Alvarez; Rozbaczylo, Claudio; Cubillos, Rafael

    2017-05-01

    The purpose of this study was to clinically evaluate the automatic stance-phase lock (ASPL) knee mechanism against participants' existing weight-activated braking (WAB) prosthetic knee joint. This prospective crossover study involved 10 young adults with an above-knee amputation. Primary measurements consisted of tests of walking speeds and capacity. Heart rate was measured during the six-minute walk test, and the Physiological Cost Index (PCI) was calculated from heart rate as an estimate of energy expenditure. Activity was measured with a pedometer. User function and quality of life were assessed using the Lower Limb Function Questionnaire (LLFQ) and Prosthetic Evaluation Questionnaire (PEQ). Long-term follow-up over 12 months was completed. Walking speeds were the same for the WAB and ASPL knees. Energy expenditure (PCI) was lower for the ASPL knees (p = 0.007). Step counts were the same for both knees, and questionnaires indicated a preference for the ASPL knee, attributed primarily to knee stability and improved walking, while limitations included terminal impact noise. Nine of 10 participants chose to keep using the ASPL knee as part of the long-term follow-up. Potential benefits of the ASPL knee were identified in this study by functional measures, questionnaires and user feedback, but not by changes in activity or the PEQ.

  13. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data, and other information are punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and the sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card.

  14. Operator overloading as an enabling technology for automatic differentiation

    International Nuclear Information System (INIS)

    Corliss, G.F.; Griewank, A.

    1993-01-01

    We present an example of the science that is enabled by object-oriented programming techniques. Scientific computation often needs derivatives for solving nonlinear systems such as those arising in many PDE algorithms, optimization, parameter identification, stiff ordinary differential equations, or sensitivity analysis. Automatic differentiation computes derivatives accurately and efficiently by applying the chain rule to each arithmetic operation or elementary function. Operator overloading enables the techniques of either the forward or the reverse mode of automatic differentiation to be applied to real-world scientific problems. We illustrate automatic differentiation with an example drawn from a model of unsaturated flow in a porous medium. The problem arises from planning for the long-term storage of radioactive waste
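
    A minimal illustration of forward-mode automatic differentiation implemented through operator overloading is the dual-number class below. It is a generic Python sketch, not the tooling described in the report: each overloaded operator propagates a derivative alongside the value by the chain rule.

      import math

      class Dual:
          """Forward-mode AD via operator overloading: carries (value, derivative)."""
          def __init__(self, val, dot=0.0):
              self.val, self.dot = val, dot
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.dot + o.dot)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
          __rmul__ = __mul__
          def exp(self):
              e = math.exp(self.val)
              return Dual(e, e * self.dot)

      # d/dx of f(x) = x*x + 3*x + exp(x) at x = 1, exact to machine precision,
      # obtained without symbolic algebra or finite differencing.
      x = Dual(1.0, 1.0)                          # seed derivative dx/dx = 1
      f = x * x + 3 * x + x.exp()
      print(f.val, f.dot)                         # 4 + e and 5 + e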

  15. Mining Twitter as a First Step toward Assessing the Adequacy of Gender Identification Terms on Intake Forms.

    Science.gov (United States)

    Hicks, Amanda; Hogan, William R; Rutherford, Michael; Malin, Bradley; Xie, Mengjun; Fellbaum, Christiane; Yin, Zhijun; Fabbri, Daniel; Hanna, Josh; Bian, Jiang

    2015-01-01

    The Institute of Medicine (IOM) recommends that health care providers collect data on gender identity. If these data are to be useful, they should utilize terms that characterize gender identity in a manner that is 1) sensitive to transgender and gender non-binary individuals (trans* people) and 2) semantically structured to render associated data meaningful to the health care professionals. We developed a set of tools and approaches for analyzing Twitter data as a basis for generating hypotheses on language used to identify gender and discuss gender-related issues across regions and population groups. We offer sample hypotheses regarding regional variations in the usage of certain terms such as 'genderqueer', 'genderfluid', and 'neutrois' and their usefulness as terms on intake forms. While these hypotheses cannot be directly validated with Twitter data alone, our data and tools help to formulate testable hypotheses and design future studies regarding the adequacy of gender identification terms on intake forms.

  16. Numerical method of identification of an unknown source term in a heat equation

    Directory of Open Access Journals (Sweden)

    Fatullayev Afet Golayoğlu

    2002-01-01

    A numerical procedure for an inverse problem of identification of an unknown source in a heat equation is presented. The approach of the proposed method is to approximate the unknown function by piecewise-linear polygons, which are determined consecutively from the solution of a minimization problem based on the overspecified data. Numerical examples are presented.
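
    A hypothetical miniature version of this kind of procedure is sketched below: a 1D heat equation with a time-dependent source is solved by explicit finite differences, the source is parameterized by piecewise-linear nodes, and the nodes are recovered by minimizing the misfit to overspecified sensor data with SciPy. The grid, node count and true source are invented for the example and the solver is deliberately simple.

      import numpy as np
      from scipy.optimize import minimize

      # Toy forward model: u_t = u_xx + f(t), u(0,t) = u(1,t) = 0, u(x,0) = 0.
      # The unknown time-dependent source f(t) is approximated by a polygon
      # (piecewise-linear nodes) and recovered from data measured at x* = 0.5.
      nx, nt, T = 21, 400, 0.2
      x = np.linspace(0.0, 1.0, nx); dx = x[1] - x[0]; dt = T / nt
      obs_index = nx // 2                          # sensor location

      def solve(f_nodes, node_times):
          f = np.interp(np.arange(nt) * dt, node_times, f_nodes)
          u = np.zeros(nx); trace = np.zeros(nt)
          r = dt / dx ** 2                         # r = 0.2, explicit scheme stable
          for k in range(nt):
              u[1:-1] += r * (u[2:] - 2 * u[1:-1] + u[:-2]) + dt * f[k]
              trace[k] = u[obs_index]
          return trace

      node_times = np.linspace(0.0, T, 5)
      true_nodes = np.array([0.0, 2.0, 1.0, 3.0, 0.5])
      data = solve(true_nodes, node_times)         # synthetic overspecified data

      def misfit(nodes):
          return np.sum((solve(nodes, node_times) - data) ** 2)

      est = minimize(misfit, np.zeros(5), method="Nelder-Mead",
                     options={"maxiter": 10000, "xatol": 1e-8, "fatol": 1e-12}).x
      print(np.round(est, 2))                      # should approximate true_nodes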

  17. Ethnicity-specific birthweight distributions improve identification of term newborns at risk for short-term morbidity.

    Science.gov (United States)

    Hanley, Gillian E; Janssen, Patricia A

    2013-11-01

    We aimed to determine whether ethnicity-specific birthweight distributions more accurately identify newborns at risk for short-term neonatal morbidity associated with small for gestational age (SGA) birth than population-based distributions not stratified on ethnicity. We examined 100,463 singleton term infants born to parents in Washington State between Jan. 1, 2006, and Dec. 31, 2008. Using multivariable logistic regression models, we compared the ability of an ethnicity-specific growth distribution and a population-based growth distribution to predict which infants were at increased risk of adverse short-term outcomes such as a low Apgar score. Infants considered SGA by the ethnicity-specific distributions had the highest rates of each of the adverse outcomes assessed, more than double those of infants only considered SGA by the population-based standards. When controlling for mother's age, parity, body mass index, education, gestational age, mode of delivery, and marital status, newborns considered SGA by ethnicity-specific birthweight distributions were between 2 and 7 times more likely to suffer from the adverse outcomes listed above than infants who were not SGA. In contrast, newborns considered SGA by population-based birthweight distributions alone were at no higher risk of any adverse outcome except hypothermia (adjusted odds ratio, 2.76; 95% confidence interval, 1.68-4.55) and neonatal intensive care unit admission (adjusted odds ratio, 1.40; 95% confidence interval, 1.18-1.67). Ethnicity-specific birthweight distributions were significantly better at identifying the infants at higher risk of short-term neonatal morbidity, suggesting that their use could save resources and unnecessary parental anxiety. Copyright © 2013 Mosby, Inc. All rights reserved.
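
    The essential computation behind such a comparison, deriving SGA cut-offs from the 10th birthweight percentile either for the whole population or within ethnicity strata, can be sketched in pandas as below; the data frame, column names, ethnic groups and weight distributions are all hypothetical and chosen only to make the example run.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(6)
      df = pd.DataFrame({
          "ethnicity": rng.choice(["A", "B", "C"], size=5000),
          "gest_week": rng.integers(37, 42, size=5000),
      })
      means = {"A": 3400, "B": 3200, "C": 3600}      # invented group means (g)
      df["birthweight"] = [rng.normal(means[e] + 100 * (g - 39), 400)
                           for e, g in zip(df.ethnicity, df.gest_week)]

      # Population-based cut-offs: 10th percentile by gestational age only.
      pop_cut = df.groupby("gest_week")["birthweight"].transform(
          lambda s: s.quantile(0.10))
      df["sga_population"] = df.birthweight < pop_cut

      # Ethnicity-specific cut-offs: 10th percentile within ethnicity strata.
      eth_cut = df.groupby(["ethnicity", "gest_week"])["birthweight"].transform(
          lambda s: s.quantile(0.10))
      df["sga_ethnic"] = df.birthweight < eth_cut

      print(pd.crosstab(df.sga_population, df.sga_ethnic))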

  18. Sequence protein identification by randomized sequence database and transcriptome mass spectrometry (SPIDER-TMS): from manual to automatic application of a 'de novo sequencing' approach.

    Science.gov (United States)

    Pascale, Raffaella; Grossi, Gerarda; Cruciani, Gabriele; Mecca, Giansalvatore; Santoro, Donatello; Sarli Calace, Renzo; Falabella, Patrizia; Bianco, Giuliana

    The sequence protein identification by randomized sequence database and transcriptome mass spectrometry (SPIDER-TMS) software package has been developed at the University of Basilicata in Potenza (Italy) and designed to facilitate the determination of the amino acid sequence of a peptide as well as the unequivocal identification of proteins in a high-throughput manner, with enormous advantages in terms of time, economic resources and expertise. The software package is a valid tool for the automation of a de novo sequencing approach, overcoming its main limitations, and a versatile platform useful in the proteomic field for the unequivocal identification of proteins starting from tandem mass spectrometry data. The strength of this software is that it is a user-friendly and non-statistical approach, so protein identification can be considered unambiguous.

  19. What Information Does Your EHR Contain? Automatic Generation of a Clinical Metadata Warehouse (CMDW) to Support Identification and Data Access Within Distributed Clinical Research Networks.

    Science.gov (United States)

    Bruland, Philipp; Doods, Justin; Storck, Michael; Dugas, Martin

    2017-01-01

    Data dictionaries provide structural meta-information about data definitions in health information technology (HIT) systems. In this regard, reusing healthcare data for secondary purposes offers several advantages (e.g. reduced documentation times or increased data quality). Prerequisites for data reuse are data quality, availability and identical meaning. In diverse projects, research data warehouses serve as core components between heterogeneous clinical databases and various research applications. Given the complexity (high number of data elements) and dynamics (regular updates) of electronic health record (EHR) data structures, we propose a clinical metadata warehouse (CMDW) based on a metadata registry standard. Metadata of two large hospitals were automatically inserted into two CMDWs containing 16,230 forms and 310,519 data elements. Automatic updates of metadata are possible, as are semantic annotations. A CMDW allows metadata discovery, data quality assessment and similarity analyses. Common data models for distributed research networks can be established based on similarity analyses.
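
    As a small, hypothetical illustration of one kind of similarity analysis a CMDW enables, the snippet below computes the Jaccard overlap between the data-element sets of two forms; the element names are invented and do not come from the cited systems.

      # Hypothetical data-element sets of two EHR forms stored in a metadata warehouse.
      form_a = {"patient_id", "birth_date", "systolic_bp", "diastolic_bp", "heart_rate"}
      form_b = {"patient_id", "birth_date", "heart_rate", "spo2", "resp_rate"}

      def jaccard(a: set, b: set) -> float:
          """Share of data elements common to both forms."""
          return len(a & b) / len(a | b)

      print(f"form similarity: {jaccard(form_a, form_b):.2f}")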

  20. DNA evolutionary algorithm (DNAEA) for source term identification in convection-diffusion equation

    International Nuclear Information System (INIS)

    Yang, X-H; Hu, X-X; Shen, Z-Y

    2008-01-01

    In this paper, the source identification problem is recast as an optimization problem. This is a complicated nonlinear optimization problem that is very intractable with traditional optimization methods, so a DNA evolutionary algorithm (DNAEA) is presented to solve it. In this algorithm, an initial population is generated by a chaos algorithm. As the search range shrinks, DNAEA gradually converges to an optimal result through the excellent individuals it obtains. The position and intensity of the pollution source are recovered well with DNAEA. Compared with a Gray-coded genetic algorithm and a pure random search algorithm, DNAEA has a faster convergence speed and higher calculation precision.
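
    The sketch below is a heavily simplified, hypothetical analogue of this approach: a population initialized with a chaotic logistic map searches for the position and intensity of an instantaneous point source in a 1D advection-diffusion model. The plain mutate-and-select loop stands in for the DNA-style operators, and all model parameters are invented.

      import numpy as np

      # Toy convection-diffusion forward model: instantaneous release of mass M
      # at position x0, advected with speed u and diffused with coefficient D.
      u, D, t_obs = 1.0, 0.1, 2.0
      x_sensors = np.linspace(0.0, 6.0, 13)

      def concentration(x0, M):
          return (M / np.sqrt(4 * np.pi * D * t_obs)
                  * np.exp(-(x_sensors - x0 - u * t_obs) ** 2 / (4 * D * t_obs)))

      observed = concentration(1.5, 2.0)           # synthetic "measurements"

      def misfit(params):
          x0, M = params
          return np.sum((concentration(x0, M) - observed) ** 2)

      # Chaotic (logistic-map) initialisation of the population, echoing the
      # paper's chaos-generated initial population.
      pop_size, n_gen = 40, 200
      z = np.linspace(0.11, 0.89, pop_size * 2).reshape(pop_size, 2)
      for _ in range(20):
          z = 4.0 * z * (1.0 - z)
      low, high = np.array([0.0, 0.1]), np.array([5.0, 5.0])
      pop = low + z * (high - low)

      rng = np.random.default_rng(1)
      for gen in range(n_gen):
          scores = np.array([misfit(p) for p in pop])
          elite = pop[np.argsort(scores)[: pop_size // 2]]       # keep the best half
          noise = rng.normal(scale=0.1 * (1 - gen / n_gen), size=elite.shape)
          pop = np.vstack([elite, np.clip(elite + noise, low, high)])
      print(pop[np.argmin([misfit(p) for p in pop])])            # ~ [1.5, 2.0]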

  1. Automatic identification of fault zone head waves and direct P waves and its application in the Parkfield section of the San Andreas Fault, California

    Science.gov (United States)

    Li, Zefeng; Peng, Zhigang

    2016-06-01

    Fault zone head waves (FZHWs) are observed along major strike-slip faults and can provide high-resolution imaging of fault interface properties at seismogenic depth. In this paper, we present a new method to automatically detect FZHWs and pick direct P waves as secondary arrivals (DWSAs). The algorithm identifies FZHWs by computing the amplitude ratios between the potential FZHWs and DWSAs. The polarities, polarizations and characteristic periods of FZHWs and DWSAs are then used to refine the picks or evaluate the pick quality. We apply the method to the Parkfield section of the San Andreas Fault where FZHWs have been identified before by manual picks. We compare results from automatically and manually picked arrivals and find general agreement between them. The obtained velocity contrast at Parkfield is generally 5-10 per cent near Middle Mountain while it decreases to below 5 per cent near Gold Hill. We also find many FZHWs recorded by the stations within 1 km of the background seismicity (i.e. the Southwest Fracture Zone) that have not been reported before. These FZHWs could be generated within a relatively wide low-velocity zone sandwiched between the fast Salinian block on the southwest side and the slow Franciscan Mélange on the northeast side. Station FROB on the southwest (fast) side also recorded a small portion of weak precursory signals before sharp P waves. However, the polarities of the weak signals are consistent with right-lateral strike-slip mechanisms, suggesting that they are unlikely to be genuine FZHW signals.
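
    The amplitude-ratio screening step can be illustrated with the short, hypothetical NumPy sketch below, which compares mean absolute amplitudes in windows around a candidate FZHW pick and the direct-wave pick; window length, sampling rate and the synthetic trace are assumptions for the example, not the authors' parameters.

      import numpy as np

      def fzhw_candidate_ratio(trace, fs, t_fzhw, t_direct, win=0.2):
          """Amplitude ratio between a candidate fault zone head wave and the
          direct (secondary) arrival; times are seconds from trace start."""
          def window_amp(t0):
              i0 = int(t0 * fs)
              seg = trace[i0:i0 + int(win * fs)]
              return np.mean(np.abs(seg))
          return window_amp(t_fzhw) / window_amp(t_direct)

      # Toy usage: a weak emergent precursor followed by a sharper, larger direct P.
      fs = 100.0
      trace = np.zeros(1000)
      trace[300:320] = 0.2          # candidate FZHW: small-amplitude precursor
      trace[350:370] = 1.0          # direct P: larger, impulsive arrival
      print(fzhw_candidate_ratio(trace, fs, 3.0, 3.5))   # ratio well below 1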

  2. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  3. Automatic garment template sewing technology based on machine identification

    Institute of Scientific and Technical Information of China (English)

    张华玲; 戴斌辉; 原竞杰

    2016-01-01

    In view of the low efficiency of the traditional garment sewing process and its dependence on manual operation, an automatic sewing technology for intelligent garment templates is put forward based on machine vision. A mechanical body with three degrees of freedom of motion in the X/Y/Z directions is designed to complete the cutting and moulding of fabric, PVC, leather and other materials. Intelligent vision software is then used for teach-in acquisition, automatically generating the sample pattern, completing intelligent trajectory planning, and driving the sequential actions of the mechanical body through an embedded platform. The automation of garment sewing is thereby realized, improving the streamlining, standardization and efficiency of garment operations and decreasing the dependence of garment factories on skilled workers.

  4. A comparative analysis of DBSCAN, K-means, and quadratic variation algorithms for automatic identification of swallows from swallowing accelerometry signals.

    Science.gov (United States)

    Dudik, Joshua M; Kurosu, Atsuko; Coyle, James L; Sejdić, Ervin

    2015-04-01

    Cervical auscultation with high-resolution sensors is currently under consideration as a method of automatically screening for specific swallowing abnormalities. To be clinically useful without human involvement, any devices based on cervical auscultation should be able to detect specified swallowing events in an automatic manner. In this paper, we comparatively analyze the density-based spatial clustering of applications with noise algorithm (DBSCAN), a k-means based algorithm, and an algorithm based on quadratic variation as methods of differentiating periods of swallowing activity from periods of time without swallows. These algorithms utilized swallowing vibration data exclusively and compared the results to a gold standard measure of swallowing duration. Data were collected from 23 subjects who were actively suffering from swallowing difficulties. Comparing the performance of the DBSCAN algorithm with a proven segmentation algorithm that utilizes k-means clustering demonstrated that the DBSCAN algorithm had a higher sensitivity and correctly segmented more swallows. Comparing its performance with a threshold-based algorithm that utilized the quadratic variation of the signal showed that the DBSCAN algorithm offered no direct increase in performance. However, it offered several other benefits including a faster run time and more consistent performance between patients. All algorithms showed noticeable differentiation from the endpoints provided by a videofluoroscopy examination as well as reduced sensitivity. In summary, we showed that the DBSCAN algorithm is a viable method for detecting the occurrence of a swallowing event using cervical auscultation signals, but significant work must be done to improve its performance before it can be implemented in an unsupervised manner. Copyright © 2015 Elsevier Ltd. All rights reserved.
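
    A hypothetical scikit-learn sketch of the two clustering strategies is given below: frame-wise signal energy is computed from a simulated vibration trace, DBSCAN groups the time stamps of high-energy frames into events, and k-means splits frames into active and baseline classes. The signal, frame length and thresholds are invented and much simpler than the study's feature set.

      import numpy as np
      from sklearn.cluster import DBSCAN, KMeans

      # Toy accelerometry-like signal: quiet baseline with two "swallow" bursts.
      rng = np.random.default_rng(2)
      fs = 1000
      sig = 0.05 * rng.standard_normal(20 * fs)
      sig[4000:5500] += 0.8 * rng.standard_normal(1500)
      sig[12000:13500] += 0.8 * rng.standard_normal(1500)

      # Frame-wise energy features.
      frame = 250
      energy = np.array([np.sum(sig[i:i + frame] ** 2)
                         for i in range(0, sig.size - frame, frame)])
      times = np.arange(energy.size) * frame / fs

      # DBSCAN over the time stamps of high-energy frames groups them into events.
      active = times[energy > 5 * np.median(energy)]
      labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(active.reshape(-1, 1))
      print("DBSCAN events:", len(set(labels) - {-1}))

      # k-means alternative: split frames into "active" vs "baseline" by energy.
      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(energy.reshape(-1, 1))
      print("k-means active frames:",
            int(np.sum(km.labels_ == np.argmax(km.cluster_centers_))))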

  5. Comparison of Short-Term Estrogenicity Tests for Identification of Hormone-Disrupting Chemicals

    DEFF Research Database (Denmark)

    Andersen, Helle Raun; Andersson, Anna-Maria; Arnold, Steven F.

    1999-01-01

    The aim of this study was to compare results obtained by eight different short-term assays of estrogenlike actions of chemicals conducted in 10 different laboratories in five countries. Twenty chemicals were selected to represent direct-acting estrogens, compounds with estrogenic metabolites, est...

  6. Comparison of short-term estrogenicity tests for identification of hormone-disrupting chemicals

    DEFF Research Database (Denmark)

    Andersen, H R; Andersson, A M; Arnold, S F

    1999-01-01

    The aim of this study was to compare results obtained by eight different short-term assays of estrogenlike actions of chemicals conducted in 10 different laboratories in five countries. Twenty chemicals were selected to represent direct-acting estrogens, compounds with estrogenic metabolites, est...

  7. Liabilities identification and long-term management - Review of French situation

    International Nuclear Information System (INIS)

    2003-01-01

    In France, long-term liabilities due to nuclear activities concern four main operators: Electricite de France (EDF), AREVA (an industrial group created on September 3, 2001 and covering the entire fuel cycle from ore extraction and transformation to the recycling of spent fuel), the Atomic Energy Commission (CEA, the French public research organism in the nuclear sector) and the French Agency for radioactive waste management (ANDRA, in charge of the long-term operation of radioactive waste installations). Long-term liabilities are due to the financing of both the decommissioning of nuclear installations and the long-term management of radioactive waste. In the current French organisational scheme, the different operators must take responsibility for these long-term liabilities. The setting of national policies and the establishment of the legislation are carried out at a national level by the French state. These include the supervision of the operators through different Ministries and the regulatory control of safety through the Nuclear Safety Authority (ASN). EDF, AREVA, CEA and ANDRA are responsible for all aspects of the decommissioning (from a technical and financial point of view). Within a safety regulatory frame, they have their own initiative concerning future expenses, based on estimated costs and the expected operational lifetime of the installations. They are responsible for the definition and implementation of the technical options. Through its supervision activities, the French State regularly requires updating studies of these estimated costs, which are conducted by the operators. A general review of the management of these long-term liabilities is also carried out on a four-year basis by the French Court of Accounts. Operators are required to constitute provisions during the life cycle of their installations. Provisions are calculated for each installation on the basis of the decommissioning expenses and of the reasonably estimated lifetime. They are re

  8. Long-term forecasting of hourly electricity load: Identification of consumption profiles and segmentation of customers

    DEFF Research Database (Denmark)

    Møller Andersen, Frits; Larsen, Helge V.; Boomsma, Trine Krogh

    2013-01-01

    Data for aggregated hourly electricity demand shows systematic variations over the day, week, and seasons, and forecasting of aggregated hourly electricity load has been the subject of many studies. With hourly metering of individual customers, data for individual consumption profiles is available. Using this data and analysing the case of Denmark, we show that consumption profiles for categories of customers are equally systematic but very different for distinct categories, that is, distinct categories of customers contribute differently to the aggregated electricity load profile. Therefore, to model and forecast long-term changes in the aggregated electricity load profile, we identify profiles for different categories of customers and link these to projections of the aggregated annual consumption by categories of customers. Long-term projection of the aggregated load is important for future...
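
    The core data manipulation, deriving category-specific hourly load profiles that can later be scaled by projected annual consumption per category, can be sketched in pandas as below; the meter readings, the two customer categories and their load shapes are invented purely to make the example self-contained.

      import numpy as np
      import pandas as pd

      # Hypothetical hourly meter readings for customers in two categories.
      rng = np.random.default_rng(7)
      hours = pd.date_range("2024-01-01", periods=24 * 7, freq="h")
      records = []
      for cust in range(40):
          cat = "household" if cust < 30 else "industry"
          base = 0.5 if cat == "household" else 5.0
          for ts in hours:
              peak = (1.0 if cat == "household" and ts.hour in (18, 19, 20)
                      else 1.0 if cat == "industry" and 8 <= ts.hour <= 16 else 0.2)
              records.append((cust, cat, ts, base * peak + 0.05 * rng.standard_normal()))
      df = pd.DataFrame(records, columns=["customer", "category", "time", "kwh"])

      # Category-specific hourly profiles (each category's hourly share of its load).
      profiles = (df.assign(hour=df.time.dt.hour)
                    .groupby(["category", "hour"])["kwh"].mean()
                    .groupby(level=0).transform(lambda s: s / s.sum()))
      print(profiles.unstack(0).head())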

  9. Records, record linkage, and the identification of long term environmental hazards

    Energy Technology Data Exchange (ETDEWEB)

    Acheson, E. D.

    1978-11-15

    Long-term effects of toxic substances in man which have been recognized so far have been noticed because they have involved gross relative risks, or bizarre effects, or have been stumbled upon by chance or because of special circumstances. These facts and some recent epidemiological evidence together suggest that a systematic approach with more precise methods and data would almost certainly reveal the effects of many more toxic substances, particularly in workers exposed in manufacturing industry. Additional ways are suggested in which record linkage techniques might be used to identify substances with long-term toxic effects. Obstacles to further progress in the field of monitoring for long-term hazards in man are: lack of a public policy dealing with confidentiality and informed consent in the use of identifiable personal records, which balances the needs of bona fide research workers with proper safeguards for the privacy of the individual, and lack of resources to improve the quality, accessibility and organization of the appropriate data. (PCS)

  10. Classification of Large-Scale Remote Sensing Images for Automatic Identification of Health Hazards: Smoke Detection Using an Autologistic Regression Classifier.

    Science.gov (United States)

    Wolters, Mark A; Dean, C B

    2017-01-01

    Remote sensing images from Earth-orbiting satellites are a potentially rich data source for monitoring and cataloguing atmospheric health hazards that cover large geographic regions. A method is proposed for classifying such images into hazard and nonhazard regions using the autologistic regression model, which may be viewed as a spatial extension of logistic regression. The method includes a novel and simple approach to parameter estimation that makes it well suited to handling the large and high-dimensional datasets arising from satellite-borne instruments. The methodology is demonstrated on both simulated images and a real application to the identification of forest fire smoke.
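
    A toy, hypothetical sketch of the spatial idea behind autologistic regression is given below: each pixel's label is regressed on its own covariate plus an autocovariate built from its neighbours' labels (a pseudo-likelihood fit via ordinary logistic regression). It is not the authors' estimation scheme, and the image, labels and covariate are simulated.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Toy image: pixel intensity x and binary "smoke" label y on a grid.
      rng = np.random.default_rng(3)
      H = W = 40
      y = np.zeros((H, W), dtype=int)
      y[10:25, 12:30] = 1                                # contiguous hazard region
      x = 0.8 * y + 0.3 * rng.standard_normal((H, W))    # noisy covariate

      def neighbour_sum(lab):
          """Sum of the four neighbours' labels for every pixel."""
          s = np.zeros_like(lab, dtype=float)
          s[1:, :] += lab[:-1, :]; s[:-1, :] += lab[1:, :]
          s[:, 1:] += lab[:, :-1]; s[:, :-1] += lab[:, 1:]
          return s

      # Pseudo-likelihood fit: logistic regression of each label on its own
      # covariate and the autologistic neighbour term.
      X = np.column_stack([x.ravel(), neighbour_sum(y).ravel()])
      model = LogisticRegression(max_iter=200).fit(X, y.ravel())
      print(model.coef_)        # [covariate effect, spatial association effect]

      # Prediction on a new image would iterate: classify, recompute the
      # neighbour term from the current labels, and re-classify until stable.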

  11. SU-E-J-182: A Feasibility Study Evaluating Automatic Identification of Gross Tumor Volume for Breast Cancer Radiotherapy Using Dynamic Contrast-Enhanced MR Imaging

    International Nuclear Information System (INIS)

    Wang, C; Horton, J; Yin, F; Blitzblau, R; Palta, M; Chang, Z

    2014-01-01

    Purpose: To develop a computerized pharmacokinetic model-free Gross Tumor Volume (GTV) segmentation method based on dynamic contrast-enhanced MRI (DCE-MRI) data that can improve physician GTV contouring efficiency. Methods: 12 patients with biopsy-proven early stage breast cancer with post-contrast enhanced DCE-MRI images were analyzed in this study. A fuzzy c-means (FCM) clustering-based method was applied to segment 3D GTV from pre-operative DCE-MRI data. A region of interest (ROI) is selected by a clinician/physicist, and the normalized signal evolution curves were calculated by dividing the signal intensity enhancement value at each voxel by the pre-contrast signal intensity value at the corresponding voxel. Three semi-quantitative metrics were analyzed based on normalized signal evolution curves: initial Area Under signal evolution Curve (iAUC), Immediate Enhancement Ratio (IER), and Variance of Enhancement Slope (VES). The FCM algorithm was applied to partition ROI voxels into GTV voxels and non-GTV voxels by using the three analyzed metrics. The partition map for the smaller cluster is then generated and binarized with an automatically calculated threshold. To reduce spurious structures resulting from background, a labeling operation was performed to keep the largest three-dimensional connected component as the identified target. Basic morphological operations including hole-filling and spur removal were utilized to improve the target smoothness. Each segmented GTV was compared to that drawn by experienced radiation oncologists. An agreement index was proposed to quantify the overlap between the GTVs identified using the two approaches, and a threshold value of 0.4 is regarded as acceptable. Results: The GTVs identified by the proposed method overlapped with the ones drawn by radiation oncologists in all cases, and in 10 out of 12 cases, the agreement indices were above the threshold of 0.4. Conclusion: The proposed automatic segmentation method was shown to

  12. SU-E-J-182: A Feasibility Study Evaluating Automatic Identification of Gross Tumor Volume for Breast Cancer Radiotherapy Using Dynamic Contrast-Enhanced MR Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C; Horton, J; Yin, F; Blitzblau, R; Palta, M; Chang, Z [Duke University Medical Center, Durham, NC (United States)

    2014-06-01

    Purpose: To develop a computerized pharmacokinetic model-free Gross Tumor Volume (GTV) segmentation method based on dynamic contrast-enhanced MRI (DCE-MRI) data that can improve physician GTV contouring efficiency. Methods: 12 patients with biopsy-proven early stage breast cancer with post-contrast enhanced DCE-MRI images were analyzed in this study. A fuzzy c-means (FCM) clustering-based method was applied to segment 3D GTV from pre-operative DCE-MRI data. A region of interest (ROI) is selected by a clinician/physicist, and the normalized signal evolution curves were calculated by dividing the signal intensity enhancement value at each voxel by the pre-contrast signal intensity value at the corresponding voxel. Three semi-quantitative metrics were analyzed based on normalized signal evolution curves: initial Area Under signal evolution Curve (iAUC), Immediate Enhancement Ratio (IER), and Variance of Enhancement Slope (VES). The FCM algorithm was applied to partition ROI voxels into GTV voxels and non-GTV voxels by using the three analyzed metrics. The partition map for the smaller cluster is then generated and binarized with an automatically calculated threshold. To reduce spurious structures resulting from background, a labeling operation was performed to keep the largest three-dimensional connected component as the identified target. Basic morphological operations including hole-filling and spur removal were utilized to improve the target smoothness. Each segmented GTV was compared to that drawn by experienced radiation oncologists. An agreement index was proposed to quantify the overlap between the GTVs identified using the two approaches, and a threshold value of 0.4 is regarded as acceptable. Results: The GTVs identified by the proposed method overlapped with the ones drawn by radiation oncologists in all cases, and in 10 out of 12 cases, the agreement indices were above the threshold of 0.4. Conclusion: The proposed automatic segmentation method was shown to
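
    A generic fuzzy c-means sketch of the clustering step is given below, run on invented voxel features standing in for (iAUC, IER, VES) within a selected ROI; it is not the authors' implementation, and the smaller of the two clusters is taken as the candidate GTV as described above.

      import numpy as np

      def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
          """Minimal fuzzy c-means: returns memberships U (n_samples x c) and centers."""
          rng = np.random.default_rng(seed)
          U = rng.dirichlet(np.ones(c), size=len(X))           # random memberships
          for _ in range(n_iter):
              W = U ** m
              centers = (W.T @ X) / W.sum(axis=0)[:, None]
              d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
              U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
          return U, centers

      # Toy voxel features standing in for (iAUC, IER, VES) inside a selected ROI.
      rng = np.random.default_rng(4)
      tumour = rng.normal([2.0, 1.5, 1.0], 0.2, size=(200, 3))
      normal = rng.normal([0.5, 0.3, 0.2], 0.2, size=(800, 3))
      features = np.vstack([tumour, normal])

      U, centers = fuzzy_c_means(features)
      labels = U.argmax(axis=1)
      gtv_cluster = np.argmin(np.bincount(labels))   # smaller cluster taken as GTV
      print("voxels assigned to candidate GTV:", int(np.sum(labels == gtv_cluster)))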

  13. Screening local Lactobacilli from Iran in terms of production of lactic acid and identification of superior strains

    Directory of Open Access Journals (Sweden)

    Fatemeh Soleimanifard

    2015-12-01

    Introduction: Lactobacilli are a group of lactic acid bacteria whose final fermentation product is lactic acid. The objective of this research is the selection of local Lactobacilli producing L(+) lactic acid. Materials and methods: In this research the local strains were screened based on their ability to produce lactic acid. The screening was performed in two stages: the first stage used a titration method and the second stage an enzymatic method. The superior strains obtained from the titration method were selected for the enzymatic test. Finally, the superior strains from the second (enzymatic) stage, which had the ability to produce L(+) lactic acid, were identified by biochemical tests. Then, molecular identification of the strains was performed by 16S rRNA sequencing. Results: In this study, the ability of 79 strains of local Lactobacilli to produce lactic acid was studied. The highest and lowest rates of lactic acid production were 34.8 and 12.4 mg/g. The superior Lactobacilli in terms of lactic acid production were able to produce the L(+) optical isomer; the highest level of L(+) lactic acid was 3.99 mg/g and the lowest 1.03 mg/g. The biochemical and molecular identification of the superior strains showed that the strains are Lactobacillus paracasei. The 16S rRNA sequences of the superior strains were reported in NCBI with accession numbers KF735654, KF735655, KJ508201 and KJ508202. Discussion and conclusion: The amounts of lactic acid produced by the local Lactobacilli were very different, and production by some of these strains was higher than that in available reports. The results of this research suggest the use of the superior strains of Lactobacilli for the production of pure L(+) lactic acid.

  14. Identification of long-term containment/stabilization technology performance issues

    International Nuclear Information System (INIS)

    Matthern, G.E.; Nickelson, D.F.

    1997-01-01

    U.S. Department of Energy (DOE) faces a somewhat unique challenge when addressing in situ remedial alternatives that leave long-lived radionuclides and hazardous contaminants onsite. These contaminants will remain a potential hazard for thousands of years. However, the risks, costs, and uncertainties associated with removal and offsite disposal are leading many sites to select in situ disposal alternatives. Improvements in containment, stabilization, and monitoring technologies will enhance the viability of such alternatives for implementation. DOE's Office of Science and Technology sponsored a two day workshop designed to investigate issues associated with the long-term in situ stabilization and containment of buried, long-lived hazardous and radioactive contaminants. The workshop facilitated communication among end users representing most sites within the DOE, regulators, and technologists to define long-term performance issues for in situ stabilization and containment alternatives. Participants were divided into groups to identify issues and a strategy to address priority issues. This paper presents the results of the working groups and summarizes the conclusions. A common issue identified by the work groups is communication. Effective communication between technologists, risk assessors, end users, regulators, and other stakeholders would contribute greatly to resolution of both technical and programmatic issues

  15. Identification of liabilities and long-term management of the fund in Hungary

    International Nuclear Information System (INIS)

    Czoch, Ildiko

    2003-01-01

    According to the basic rules laid down in the Act on Atomic Energy CXVI. of 1996, radioactive waste management shall not impose an undue burden on future generations. To satisfy this requirement, the long-term costs of waste disposal and of decommissioning of the plant shall be paid by the generation that enjoys the benefits of nuclear energy production and applications of isotopes. Accordingly, by the Act and its executive orders, a Central Nuclear Financial Fund was established on 1 January 1998 to finance radioactive waste disposal, interim storage and disposal of spent fuel as well as decommissioning of nuclear facilities. The Minister supervising the Hungarian Atomic Energy Authority disposes of the Central Nuclear Financial Fund, while the Hungarian Atomic Energy Authority is responsible for its management. The Fund is a separate state fund pursuant to Act XXXVIII of 1992 on Public Finance, exclusively earmarked for financing the construction and operation of disposal facilities for the final disposal of radioactive waste, as well as for the interim storage and final disposal of spent fuel, and the decommissioning of nuclear facilities. A long-term plan (up to the decommissioning of the nuclear facilities), a medium-term plan (for five years) and an annual work schedule are to be prepared on the use of the Fund. The preparation of these plans/schedules is within the responsibilities of the Public Agency for Radioactive Waste Management. The long- and medium-term plans shall be annually reviewed and revised as required. The long- and medium-term plans and the annual work schedule shall be approved by the Minister supervising the Hungarian Atomic Energy Authority. The payments into the Fund are defined in accordance with these plans. The liabilities of the Paks Nuclear Power Plant for annual payments into the Fund are included in the law on the central budget on the proposal of the Minister supervising the Hungarian Atomic Energy Authority. It is based upon

  16. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday access media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  17. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.

    1978-01-01

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand and shoe monitors. In addition, this prototype system also has a number of new features, including: microcomputer control and readout, nineteen large area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company controlled boundaries. Investigation of the commercially available portal and hand and shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  18. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  19. Identification of factors promoting ex vivo maintenance of mouse hematopoietic stem cells by long-term single-cell quantification.

    Science.gov (United States)

    Kokkaliaris, Konstantinos D; Drew, Erin; Endele, Max; Loeffler, Dirk; Hoppe, Philipp S; Hilsenbeck, Oliver; Schauberger, Bernhard; Hinzen, Christoph; Skylaki, Stavroula; Theodorou, Marina; Kieslinger, Matthias; Lemischka, Ihor; Moore, Kateri; Schroeder, Timm

    2016-09-01

    The maintenance of hematopoietic stem cells (HSCs) during ex vivo culture is an important prerequisite for their therapeutic manipulation. However, despite intense research, culture conditions for robust maintenance of HSCs are still missing. Cultured HSCs are quickly lost, preventing their improved analysis and manipulation. Identification of novel factors supporting HSC ex vivo maintenance is therefore necessary. Coculture with the AFT024 stroma cell line is capable of maintaining HSCs ex vivo long-term, but the responsible molecular players remain unknown. Here, we use continuous long-term single-cell observation to identify the HSC behavioral signature under supportive or nonsupportive stroma cocultures. We report early HSC survival as a major characteristic of HSC-maintaining conditions. Behavioral screening after manipulation of candidate molecules revealed that the extracellular matrix protein dermatopontin (Dpt) is involved in HSC maintenance. DPT knockdown in supportive stroma impaired HSC survival, whereas ectopic expression of the Dpt gene or protein in nonsupportive conditions restored HSC survival. Supplementing defined stroma- and serum-free culture conditions with recombinant DPT protein improved HSC clonogenicity. These findings illustrate a previously uncharacterized role of Dpt in maintaining HSCs ex vivo. © 2016 by The American Society of Hematology.

  20. Identification of trend in long term precipitation and reference evapotranspiration over Narmada river basin (India)

    Science.gov (United States)

    Pandey, Brij Kishor; Khare, Deepak

    2018-02-01

    Precipitation and reference evapotranspiration are key parameters in hydro-meteorological studies and are used for agricultural planning, irrigation system design and management. Precipitation and evaporative demand are expected to alter under climate change and to affect sustainable development. In this article, the spatial variability and temporal trend of precipitation and reference evapotranspiration (ETo) were investigated over the Narmada river basin (India), a humid tropical climatic region. In the present study, 12 and 28 observatory stations were selected for precipitation and ETo, respectively, over a 102-year period (1901-2002). A rigorous analysis for trend detection was carried out using non-parametric tests such as Mann-Kendall (MK) and Spearman Rho (SR). Sen's slope estimator was used to analyze the rate of change in the long-term series. All the stations of the basin exhibit a positive trend for annual ETo, while 8% of the stations indicate a significant negative trend for mean annual precipitation. Change points of annual precipitation were identified around the year 1962 by applying Buishand's and Pettit's tests. Annual mean precipitation decreased by 9% in the upper part of the basin and increased by at most 5% in the lower part due to these temporal changes, while annual mean ETo increased by 4-12% in most of the region. The results of the study are helpful in the planning and development of agricultural water resources.

  1. Liabilities identification and long-term management at national level (Spain)

    International Nuclear Information System (INIS)

    Espejo Hernandez, Jose Manuel; Gonzalez Gomez, Jose Luis

    2003-01-01

    economic uncertainties in high-level waste disposal systems is a constant line of work, and in this respect ENRESA attempts to incorporate the most adequate techniques for cost analysis in a probabilistic framework. Even though the economic calculations are revised every year, tempering forecasting inaccuracies, in the longer term it is felt that problems might arise if there were a particularly significant time difference between the dates of plant decommissioning and the initiation of repository construction work. Under these conditions, any delay in constructing the definitive disposal facility might lead to not having sufficient financial resources available for its construction, operation or dismantling. The Spanish legislation includes no indications in this respect. Conceptually, various treatment hypotheses could be envisaged, such as legally increasing the period of fee collection, the creation of an extra fee during the last few years of collection, the obligation for the waste producers to contract additional guarantees in order to address uncovered risks, or acceptance by the State of responsibilities in relation to this issue. Obviously, the case of a surplus of money after the completion of waste disposal is also to be taken into account. In relation to this hypothesis, criteria and procedures for liquidation or distribution would have to be set out. It is considered that, at present, it is too soon to approach such a question

  2. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpablo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  3. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  4. A 100-m Fabry–Pérot Cavity with Automatic Alignment Controls for Long-Term Observations of Earth’s Strain

    Directory of Open Access Journals (Sweden)

    Akiteru Takamori

    2014-08-01

    Full Text Available We have developed and built a highly accurate laser strainmeter for geophysical observations. It features the precise length measurement of a 100-m optical cavity with reference to a stable quantum standard. Unlike conventional laser strainmeters based on simple Michelson interferometers that require uninterrupted fringe counting to track the evolution of ground deformations, this instrument is able to determine the absolute length of the cavity at any given time. The instrument offers an advantage in covering a variety of geophysical events, ranging from instantaneous earthquakes to crustal deformations associated with tectonic strain changes that persist over time. An automatic alignment control and an autonomous relocking system have been developed to realize stable performance and maximize observation times. It was installed in a deep underground site at the Kamioka mine in Japan, and an effective resolution of 2 × (10−8 − 10−7) m was achieved. The regular tidal deformations and co-seismic strain changes were in good agreement with those from a theoretical model and a co-located conventional laser strainmeter. Only the new instrument was able to record large strain steps caused by a nearby large earthquake because of its capability of absolute length determination.

  5. Piezo-electric automatic vehicle classification system : Oregon Department of Transportation with Castle Rock Consultants for a SHRP Long Term Pavement Performance Site.

    Science.gov (United States)

    1990-05-01

    Oregon has twelve sites that are part of the Strategic Highway Research Program (SHRP), Long Term Pavement Performance (LTPP) studies. Part of the data gathering on these sites involves vehicle weight and classification. This pilot project was to hel...

  6. Piezo-electric automatic vehicle classification system : Oregon Department of Transportation with Castle Rock Consultants for a SHRP Long Term Pavement Performance Site : final report.

    Science.gov (United States)

    1991-07-01

    Oregon has twelve pavement test sites that are part of the Strategic Highway Research Program (SHRP), Long Term Pavement Performance (LTPP) studies. Part of the data gathering on these sites involves vehicle weight and classification. This pilot proj...

  7. DIRAD™ - a system for real time detection and identification of radioactive objects

    International Nuclear Information System (INIS)

    Guillot, L.; Reboli, A.

    2009-01-01

    The authors present the DIRAD system (DIRAD stands for Detection and Identification of Radionuclides), an automatic system for real time identification of a radioactive anomaly and its interpretation in terms of risk level. It can be adapted to different contexts: pedestrian control, parcel or luggage control, road traffic control, and so on. In case of risk detection, an alert is transmitted in real time to a supervision station along with the whole set of spectral data

  8. DNA typing for personal identification of urine after long-term preservation for testing in doping control.

    Science.gov (United States)

    Aoki, Kimiko; Tanaka, Hiroyuki; Ueki, Makoto

    2017-08-01

    When the tampering of a urine sample is suspected in doping control, personal identification of the sample needs to be determined by short tandem repeat (STR) analysis using DNA. We established a method for extracting DNA from urine samples stored at -20 °C without using any additives or procedures, which is consistent with how samples are required to be managed for doping control. The method, using the Puregene® Blood Core kit followed by NucleoSpin® gDNA Clean-up or NucleoSpin® gDNA Clean-up XS kit, does not need any special instrument and can provide a purified extract with high-quality DNA from up to 40 mL of urine suitable for STR analysis using an AmpFlSTR® Identifiler® PCR amplification kit. Storing urine at -20 °C is detrimental to the stability of DNA. The DNA concentration of preserved urine could not be predicted by specific gravity or creatinine level at the time of urine collection. The DNA concentration of a purified extract (10 μL) was required to be >0.06 ng/μL to ensure a successful STR analysis. Thus, the required extraction volumes of urine preserved for 3-7 years at -20 °C were estimated to be 30 mL and 20 mL to succeed in at least 86% of men and 91% of women, respectively. Considering the long half-life of DNA during long-term preservation, our extraction method is applicable to urine samples stored even for 10 years, which is currently the storage duration allowed (increased from 8 years) before re-examination in doping control. Copyright © 2016 John Wiley & Sons, Ltd.

  9. Comparison of Multi-shot Models for Short-term Re-identification of People using RGB-D Sensors

    DEFF Research Database (Denmark)

    Møgelmose, Andreas; Bahnsen, Chris; Moeslund, Thomas B.

    2015-01-01

    This work explores different types of multi-shot descriptors for re-identification in an on-the-fly enrolled environment using RGB-D sensors. We present a full re-identification pipeline complete with detection, segmentation, feature extraction, and re-identification, which expands on previous work...... by using multi-shot descriptors modeling people over a full camera pass instead of single frames with no temporal linking. We compare two different multi-shot models; mean histogram and histogram series, and test them each in 3 different color spaces. Both histogram descriptors are assisted by a depth...

  10. Rapid Identification of Microorganisms from Positive Blood Culture by MALDI-TOF MS After Short-Term Incubation on Solid Medium.

    Science.gov (United States)

    Curtoni, Antonio; Cipriani, Raffaella; Marra, Elisa Simona; Barbui, Anna Maria; Cavallo, Rossana; Costa, Cristina

    2017-01-01

    Matrix-assisted laser-desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry (MS) is a useful tool for rapid identification of microorganisms. Unfortunately, its direct application to positive blood culture still lacks standardized procedures. In this study, we evaluated an easy- and rapid-to-perform protocol for MALDI-TOF MS direct identification of microorganisms from positive blood culture after a short-term incubation on solid medium. This protocol was used to evaluate direct identification of microorganisms from 162 positive monomicrobial blood cultures; at different incubation times (3, 5, 24 h), the MALDI-TOF MS assay was performed on the growing microorganism patina. Overall, MALDI-TOF MS concordance with conventional methods at species level was 60.5, 80.2, and 93.8% at 3, 5, and 24 h, respectively. Considering only bacteria, the identification performances at species level were 64.1, 85.0, and 94.1% at 3, 5, and 24 h, respectively. This protocol, applied to a commercially available MS typing system, may represent a fast and powerful diagnostic tool for direct pathogen identification and for prompt, pathogen-driven antimicrobial therapy in selected cases.

  11. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility compared to partially automated variants, would therefore provide a solution. In addition, the fully automated test reduces the radiation dose for the test personnel. (orig.) [de

  12. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control – APCA, national member organization of the International Federation of Automatic Control – IFAC. The CONTROLO 2016 – 12th Portuguese Conference on Automatic Control, Guimarães, Portugal, September 14th to 16th, was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show expressively how Automatic Control can be used to increase the well-being of people.

  13. Long-term screening for sleep apnoea in paced patients: preliminary assessment of a novel patient management flowchart by using automatic pacemaker indexes and sleep lab polygraphy.

    Science.gov (United States)

    Aimé, Ezio; Rovida, Marina; Contardi, Danilo; Ricci, Cristian; Gaeta, Maddalena; Innocenti, Ester; Cabral Tantchou-Tchoumi, Jacques

    2014-10-01

    The primary aim of this pilot study was to prospectively assess a flowchart to screen and diagnose paced patients (pts) affected by sleep apnoeas, by crosschecking indexes derived from pacemakers (minute ventilation sensor on-board) with Sleep-Lab Polygraphy (PG) outcomes. Secondarily, "smoothed" long-term pacemaker indexes (all the information between two consecutive follow-up visits) have been retrospectively compared vs. standard short-term pacemaker indexes (last 24h) at each follow-up (FU) visit, to test their correlation and diagnostic concordance. Data from long-term FU of 61 paced pts were collected. At each visit, the standard short-term apnoea+hypopnoea (PM_AHI) index was retrieved from the pacemaker memory. Patients showing PM_AHI ≥ 30 at least once during FU were proposed to undergo a PG for diagnostic confirmation. Smoothed pacemaker (PM_SAHI) indexes were calculated by averaging the overall number of apnoeas/hypopnoeas over the period between two FU visits, and retrospectively compared with standard PM_AHI. Data were available from 609 consecutive visits (overall 4.64 ± 1.78 years FU). PM_AHI indexes were positive during FU in 40/61 pts (65.6%); 26/40 pts (65%) accepted to undergo a PG recording; Sleep-Lab confirmed positivity in 22/26 pts (84.6% positive predictive value for PM_AHI). A strong correlation (r=0.73) and a high level of concordance were found between smoothed and standard indexes (multivariate analysis, Cohen's-k and Z-score tests). Pacemaker-derived indexes may help in screening paced pts potentially affected by sleep apnoeas. Long-term "smoothed" apnoea indexes could improve the accuracy of pacemaker screening capability, even though this hypothesis must be prospectively confirmed by larger studies. Copyright © 2014 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.

  14. Rapid identification of microorganisms from positive blood cultures by MALDI-TOF mass spectrometry subsequent to very short-term incubation on solid medium.

    Science.gov (United States)

    Idelevich, E A; Schüle, I; Grünastel, B; Wüllenweber, J; Peters, G; Becker, K

    2014-10-01

    Rapid identification of the causative microorganism is important for appropriate antimicrobial therapy of bloodstream infections. Bacteria from positive blood culture (BC) bottles are not readily available for identification by matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS). Lysis and centrifugation procedures suggested for direct MALDI-TOF MS from positive BCs without previous culture are associated with additional hands-on processing time and costs. Here, we describe an alternative approach applying MALDI-TOF MS from bacterial cultures incubated very briefly on solid medium. After plating of positive BC broth on Columbia blood agar (n = 165), MALDI-TOF MS was performed after 1.5, 2, 3, 4, 5, 6, 7, 8, 12 and (for control) 24 h of incubation until reliable identification to the species level was achieved (score ≥2.0). Mean incubation time needed to achieve species-level identification was 5.9 and 2.0 h for Gram-positive aerobic cocci (GPC, n = 86) and Gram-negative aerobic rods (GNR, n = 42), respectively. Short agar cultures with incubation times ≤2, ≤4, ≤6, ≤8 and ≤12 h yielded species identification in 1.2%, 18.6%, 64.0%, 96.5%, 98.8% of GPC, and in 76.2%, 95.2%, 97.6%, 97.6%, 97.6% of GNR, respectively. Control species identification at 24 h was achieved in 100% of GPC and 97.6% of GNR. Ethanol/formic acid protein extraction performed for an additional 34 GPC isolates cultivated from positive BCs showed further reduction in time to species identification (3.1 h). MALDI-TOF MS using biomass subsequent to very short-term incubation on solid medium allows very early and reliable bacterial identification from positive BCs without additional time and cost expenditure. © 2014 The Authors Clinical Microbiology and Infection © 2014 European Society of Clinical Microbiology and Infectious Diseases.

  15. Automatically annotating web pages using Google Rich Snippets

    NARCIS (Netherlands)

    Hogenboom, F.P.; Frasincar, F.; Vandic, D.; Meer, van der J.; Boon, F.; Kaymak, U.

    2011-01-01

    We propose the Automatic Review Recognition and annOtation of Web pages (ARROW) framework, a framework for Web page review identification and annotation using RDFa Google Rich Snippets. The ARROW framework consists of four steps: hotspot identification, subjectivity analysis, information

  16. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  17. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  18. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.

  19. Automatically ordering events and times in text

    CERN Document Server

    Derczynski, Leon R A

    2017-01-01

    The book offers a detailed guide to temporal ordering, exploring open problems in the field and providing solutions and extensive analysis. It addresses the challenge of automatically ordering events and times in text. Aided by TimeML, it also describes and presents concepts relating to time in easy-to-compute terms. Working out the order that events and times happen has proven difficult for computers, since the language used to discuss time can be vague and complex. Mapping out these concepts for a computational system, which does not have its own inherent idea of time, is, unsurprisingly, tough. Solving this problem enables powerful systems that can plan, reason about events, and construct stories of their own accord, as well as understand the complex narratives that humans express and comprehend so naturally. This book presents a theory and data-driven analysis of temporal ordering, leading to the identification of exactly what is difficult about the task. It then proposes and evaluates machine-learning so...

  20. Automatic generation of warehouse mediators using an ontology engine

    Energy Technology Data Exchange (ETDEWEB)

    Critchlow, T., LLNL

    1998-04-01

    Data warehouses created for dynamic scientific environments, such as genetics, face significant challenges to their long-term feasibility. One of the most significant of these is the high frequency of schema evolution resulting from both technological advances and scientific insight. Failure to quickly incorporate these modifications will quickly render the warehouse obsolete, yet each evolution requires significant effort to ensure the changes are correctly propagated. DataFoundry utilizes a mediated warehouse architecture with an ontology infrastructure to reduce the maintenance requirements of a warehouse. Among other things, the ontology is used as an information source for automatically generating mediators, the methods that transfer data between the data sources and the warehouse. The identification, definition and representation of the metadata required to perform this task is a primary contribution of this work.

  1. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, state space, state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and the design of control systems.

  2. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  3. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
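
    The chain-rule propagation described above can be illustrated with a minimal forward-mode sketch based on dual numbers. This is only an illustration of the general technique, not the method or code of the report itself, and all names in it are made up.

```python
# Minimal forward-mode automatic differentiation via dual numbers.
# Illustrative sketch only; names are not taken from the report.
import math

class Dual:
    """Carries a value and its first derivative through elementary operations."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule propagates the derivative exactly, with no truncation error.
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def sin(x):
    # Chain rule for an elementary function: d/dx sin(u) = cos(u) * u'.
    return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

# d/dx [x*sin(x) + 2x] at x = 1.5, seeded with derivative 1.
x = Dual(1.5, 1.0)
f = x * sin(x) + 2 * x
print(f.value, f.deriv)   # derivative equals sin(1.5) + 1.5*cos(1.5) + 2
```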

  4. Toward Automatic Georeferencing of Archival Aerial Photogrammetric Surveys

    Science.gov (United States)

    Giordano, S.; Le Bris, A.; Mallet, C.

    2018-05-01

    Images from archival aerial photogrammetric surveys are a unique and relatively unexplored means to chronicle 3D land-cover changes over the past 100 years. They provide a relatively dense temporal sampling of the territories with very high spatial resolution. Such time series image analysis is a mandatory baseline for a large variety of long-term environmental monitoring studies. The current bottleneck for accurate comparison between epochs is their fine georeferencing step. No fully automatic method has been proposed yet and existing studies are rather limited in terms of area and number of dates. The state of the art shows that the major challenge is the identification of ground references: cartographic coordinates and their position in the archival images. This task is manually performed, and extremely time-consuming. This paper proposes to use a photogrammetric approach, and states that the 3D information that can be computed is the key to full automation. Its original idea lies in a 2-step approach: (i) the computation of a coarse absolute image orientation; (ii) the use of the coarse Digital Surface Model (DSM) information for automatic absolute image orientation. It only relies on a recent orthoimage+DSM, used as master reference for all epochs. The coarse orthoimage, compared with such a reference, allows the identification of dense ground references and the coarse DSM provides their position in the archival images. Results on two areas and 5 dates show that this method is compatible with long and dense archival aerial image series. Satisfactory planimetric and altimetric accuracies are reported, with variations depending on the ground sampling distance of the images and the location of the Ground Control Points.

  5. TOWARD AUTOMATIC GEOREFERENCING OF ARCHIVAL AERIAL PHOTOGRAMMETRIC SURVEYS

    Directory of Open Access Journals (Sweden)

    S. Giordano

    2018-05-01

    Full Text Available Images from archival aerial photogrammetric surveys are a unique and relatively unexplored means to chronicle 3D land-cover changes over the past 100 years. They provide a relatively dense temporal sampling of the territories with very high spatial resolution. Such time series image analysis is a mandatory baseline for a large variety of long-term environmental monitoring studies. The current bottleneck for accurate comparison between epochs is their fine georeferencing step. No fully automatic method has been proposed yet and existing studies are rather limited in terms of area and number of dates. The state of the art shows that the major challenge is the identification of ground references: cartographic coordinates and their position in the archival images. This task is manually performed, and extremely time-consuming. This paper proposes to use a photogrammetric approach, and states that the 3D information that can be computed is the key to full automation. Its original idea lies in a 2-step approach: (i) the computation of a coarse absolute image orientation; (ii) the use of the coarse Digital Surface Model (DSM) information for automatic absolute image orientation. It only relies on a recent orthoimage+DSM, used as master reference for all epochs. The coarse orthoimage, compared with such a reference, allows the identification of dense ground references and the coarse DSM provides their position in the archival images. Results on two areas and 5 dates show that this method is compatible with long and dense archival aerial image series. Satisfactory planimetric and altimetric accuracies are reported, with variations depending on the ground sampling distance of the images and the location of the Ground Control Points.

  6. The Efficacy of Short-term Gated Audiovisual Speech Training for Improving Auditory Sentence Identification in Noise in Elderly Hearing Aid Users

    Science.gov (United States)

    Moradi, Shahram; Wahlin, Anna; Hällgren, Mathias; Rönnberg, Jerker; Lidestam, Björn

    2017-01-01

    This study aimed to examine the efficacy and maintenance of short-term (one-session) gated audiovisual speech training for improving auditory sentence identification in noise in experienced elderly hearing-aid users. Twenty-five hearing aid users (16 men and 9 women), with an average age of 70.8 years, were randomly divided into an experimental (audiovisual training, n = 14) and a control (auditory training, n = 11) group. Participants underwent gated speech identification tasks comprising Swedish consonants and words presented at 65 dB sound pressure level with a 0 dB signal-to-noise ratio (steady-state broadband noise), in audiovisual or auditory-only training conditions. The Hearing-in-Noise Test was employed to measure participants’ auditory sentence identification in noise before the training (pre-test), promptly after training (post-test), and 1 month after training (one-month follow-up). The results showed that audiovisual training improved auditory sentence identification in noise promptly after the training (post-test vs. pre-test scores); furthermore, this improvement was maintained 1 month after the training (one-month follow-up vs. pre-test scores). Such improvement was not observed in the control group, neither promptly after the training nor at the one-month follow-up. However, neither a significant between-groups difference nor an interaction between group and session was observed. Conclusion: Audiovisual training may be considered in aural rehabilitation of hearing aid users to improve listening capabilities in noisy conditions. However, the lack of a significant between-groups effect (audiovisual vs. auditory) or of an interaction between group and session calls for further research. PMID:28348542

  7. Automatic measurement of images on astrometric plates

    Science.gov (United States)

    Ortiz Gil, A.; Lopez Garcia, A.; Martinez Gonzalez, J. M.; Yershov, V.

    1994-04-01

    We present some results on the process of automatic detection and measurement of objects in overlapped fields of astrometric plates. The main steps of our algorithm are the following: determination of the scale and tilt between the charge coupled device (CCD) and microscope coordinate systems and estimation of the signal-to-noise ratio in each field; image identification and improvement of its position and size; final image centering; and image selection and storage. Several parameters allow the use of variable criteria for image identification, characterization and selection. Problems related to faint images and crowded fields will be approached by special techniques (morphological filters, histogram properties and fitting models).
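
    The first step, determining the scale and tilt between the CCD and microscope coordinate systems, amounts to fitting a similarity transform between matched point positions. The sketch below shows one way to do this by linear least squares; it is an assumption-laden illustration with synthetic data, not the plate-measurement software described above.

```python
# Least-squares fit of scale s and rotation (tilt) theta mapping CCD coordinates
# onto microscope coordinates: [X, Y] = s*R(theta)*[x, y] + [tx, ty].
# Illustrative sketch only; data and names are synthetic stand-ins.
import numpy as np

def fit_similarity(ccd_xy, mic_xy):
    """Solve for a, b, tx, ty in X = a*x - b*y + tx, Y = b*x + a*y + ty."""
    x, y = ccd_xy[:, 0], ccd_xy[:, 1]
    ones, zeros = np.ones_like(x), np.zeros_like(x)
    A = np.vstack([np.column_stack([x, -y, ones, zeros]),    # X equations
                   np.column_stack([y,  x, zeros, ones])])   # Y equations
    rhs = np.concatenate([mic_xy[:, 0], mic_xy[:, 1]])
    (a, b, tx, ty), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    scale = np.hypot(a, b)
    tilt_deg = np.degrees(np.arctan2(b, a))
    return scale, tilt_deg, (tx, ty)

# Synthetic example: 20 matched positions with known scale 1.002 and tilt 0.3 deg.
rng = np.random.default_rng(0)
ccd = rng.uniform(0, 1000, size=(20, 2))
th = np.radians(0.3)
R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
mic = 1.002 * ccd @ R.T + np.array([5.0, -3.0])
print(fit_similarity(ccd, mic))   # approximately (1.002, 0.3, (5.0, -3.0))
```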

  8. Fast automatic analysis of antenatal dexamethasone on micro-seizure activity in the EEG

    International Nuclear Information System (INIS)

    Rastin, S.J.; Unsworth, C.P.; Bennet, L.

    2010-01-01

    Full text: In this work we develop an automatic scheme for studying the effect of antenatal dexamethasone on EEG activity. To do so, an FFT (Fast Fourier Transform) based detector was designed and applied to EEG recordings obtained from two groups of fetal sheep. Both groups received two injections with a time delay of 24 h between them; however, the applied medicine was different for each group (Dex and saline). The detector was used to automatically identify and classify micro-seizures that occurred in the frequency bands corresponding to the EEG transients known as slow waves (2.5–14 Hz). For each second of the data recordings the spectrum was computed, and the rise of the energy in each predefined frequency band was then counted when the energy level exceeded a predefined corresponding threshold level (where the threshold level was obtained from the long-term average of the spectral points in each band). Our results demonstrate that it was possible to automatically count the micro-seizures for the three different bands in a time-effective manner. It was found that the number of transients did not strongly depend on the nature of the injected medicine, which was consistent with the results manually obtained by an EEG expert. In conclusion, the automatic detection scheme presented here allows for rapid micro-seizure event identification in hours of highly sampled EEG data, thus providing a valuable time-saving device.
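
    A detector of the kind described above (per-second spectra, band energies compared against a threshold derived from the long-term average) can be sketched as follows. The sampling rate, band edges and threshold factor are illustrative assumptions, not the values used in the study.

```python
# Per-second FFT band-energy counter for EEG transients.
# Band edges, threshold factor and sampling rate are illustrative assumptions.
import numpy as np

def count_band_events(eeg, fs, bands, threshold_factor=3.0):
    """Count 1-s epochs whose band energy exceeds a long-term-average threshold."""
    n = int(fs)                                           # samples per 1-s epoch
    epochs = eeg[:len(eeg) // n * n].reshape(-1, n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    power = np.abs(np.fft.rfft(epochs, axis=1)) ** 2      # per-epoch spectrum

    counts = {}
    for name, (lo, hi) in bands.items():
        sel = (freqs >= lo) & (freqs < hi)
        band_energy = power[:, sel].sum(axis=1)
        threshold = threshold_factor * band_energy.mean()  # long-term average
        counts[name] = int((band_energy > threshold).sum())
    return counts

# Example on synthetic data: 10 minutes of noise sampled at 256 Hz.
fs = 256.0
eeg = np.random.default_rng(1).standard_normal(int(600 * fs))
bands = {"slow": (0.5, 2.5), "intermediate": (2.5, 14.0), "fast": (14.0, 30.0)}
print(count_band_events(eeg, fs, bands))
```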

  9. B0-correction and k-means clustering for accurate and automatic identification of regions with reduced apparent diffusion coefficient (ADC) in adva nced cervical cancer at the time of brachytherapy

    DEFF Research Database (Denmark)

    Haack, Søren; Pedersen, Erik Morre; Vinding, Mads Sloth

    in dose planning of radiotherapy. This study evaluates the use of k-means clustering for automatic, user-independent delineation of regions of reduced apparent diffusion coefficient (ADC) and the value of B0-correction of DW-MRI for reduction of geometrical distortions during dose planning of brachytherapy...
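
    A hedged sketch of how k-means might be used for user-independent delineation of low-ADC regions is given below; the cluster count, array shapes and synthetic data are assumptions, not the study's protocol.

```python
# k-means delineation of low-ADC regions in a 2D ADC map.
# Cluster count and data are illustrative assumptions, not the study protocol.
import numpy as np
from sklearn.cluster import KMeans

def delineate_low_adc(adc_map, n_clusters=3):
    """Cluster ADC values and return a mask of the lowest-ADC cluster."""
    values = adc_map.reshape(-1, 1)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(values)
    low_cluster = np.argmin(km.cluster_centers_.ravel())   # lowest mean ADC
    return (km.labels_ == low_cluster).reshape(adc_map.shape)

# Synthetic ADC map (x10^-3 mm^2/s) with an embedded low-ADC region.
rng = np.random.default_rng(2)
adc = rng.normal(1.5, 0.2, size=(64, 64))
adc[20:30, 20:30] = rng.normal(0.8, 0.1, size=(10, 10))
mask = delineate_low_adc(adc)
print(mask.sum(), "voxels flagged as reduced ADC")
```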

  10. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury autocode in terms of a phrase structure language, such as in the source language, target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate

  11. Thai Automatic Speech Recognition

    National Research Council Canada - National Science Library

    Suebvisai, Sinaporn; Charoenpornsawat, Paisarn; Black, Alan; Woszczyna, Monika; Schultz, Tanja

    2005-01-01

    .... We focus on the discussion of the rapid deployment of ASR for Thai under limited time and data resources, including rapid data collection issues, acoustic model bootstrap, and automatic generation of pronunciations...

  12. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  13. Automatic Test Systems Aquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  14. Diagnostic and prognostic system for identification of accident scenarios and prediction of 'source term' in nuclear power plants under accident conditions

    International Nuclear Information System (INIS)

    Santhosh; Gera, B.; Kumar, Mithilesh

    2014-01-01

    A nuclear power plant experiences a number of transients during its operation. These transients may be due to equipment failure, malfunctioning of process support systems, etc. In such a situation, the plant may end up in an undesired abnormal state. In case of such an undesired plant condition, the operator has to carry out diagnostic and corrective actions. When an event occurs starting from steady state operation, instrument readings develop a time-dependent pattern, and these patterns are unique with respect to the type of the particular event. Therefore, by properly selecting the plant process parameters, the transients can be distinguished. In this connection, a computer based tool known as the Diagnostic and Prognostic System has been developed for identification of large pipe break scenarios in 220 MWe Pressurised Heavy Water Reactors (PHWRs) and for prediction of the expected 'Source Term' and consequence for a situation where the Emergency Core Cooling System (ECCS) is not available or only partially available. The Diagnostic and Prognostic System is essentially a transient identification and expected source term forecasting system. The system is based on Artificial Neural Networks (ANNs) that continuously monitor the plant conditions and quickly identify a Loss Of Coolant Accident (LOCA) scenario based on the reactor process parameter values. The system further identifies the availability of ECCS injection and, in case of non-availability of ECCS, it can forecast the expected 'Source Term'. The system supports plant operators as well as emergency preparedness. The ANN is trained with a process parameter database pertaining to accident conditions and tested against blind exercises. In order to assess the feasibility of implementing it in the plant for real-time diagnosis, this system has been set up on a high speed computing facility and has been demonstrated successfully for LOCA scenarios. (author)
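
    As an illustration only, a transient classifier of this general kind can be sketched with a small feed-forward network trained on vectors of process parameters; the features, class labels and data below are synthetic stand-ins, not the plant's parameters or the authors' network.

```python
# Illustrative feed-forward ANN for transient (e.g. break-size) classification
# from process-parameter vectors. Data, features and classes are synthetic stand-ins.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
n_per_class, n_features = 200, 12          # e.g. 12 process parameters per time window
classes = ["no_break", "small_break", "large_break"]

# Synthetic training data: each class occupies a shifted region of parameter space.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_features))
               for i in range(len(classes))])
y = np.repeat(classes, n_per_class)

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(32, 16),
                                    max_iter=2000, random_state=0))
model.fit(X, y)

# Classify a new parameter vector (drawn near the "large_break" region).
sample = rng.normal(loc=2.0, scale=1.0, size=(1, n_features))
print(model.predict(sample))
```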

  15. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  16. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book describes methods of position determination and their characteristics, control methods for position determination and design considerations, sensor selection for position detectors, position determination in digital control systems, the application of clutches and brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutches and brakes, air cylinders, cams and solenoids, stop position control of automatic guided vehicles, stacker cranes, and automatic transfer control.

  17. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving—namely, make the driver feel like they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but it might also reduce the workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  18. Investigation of an automatic trim algorithm for restructurable aircraft control

    Science.gov (United States)

    Weiss, J.; Eterno, J.; Grunberg, D.; Looze, D.; Ostroff, A.

    1986-01-01

    This paper develops and solves an automatic trim problem for restructurable aircraft control. The trim solution is applied as a feed-forward control to reject measurable disturbances following control element failures. Disturbance rejection and command following performances are recovered through the automatic feedback control redesign procedure described by Looze et al. (1985). For this project the existence of a failure detection mechanism is assumed, and methods to cope with potential detection and identification inaccuracies are addressed.

  19. Identification of relationships between climate indices and long-term precipitation in South Korea using ensemble empirical mode decomposition

    Science.gov (United States)

    Kim, Taereem; Shin, Ju-Young; Kim, Sunghun; Heo, Jun-Haeng

    2018-02-01

    Climate indices characterize climate systems and may identify important indicators for long-term precipitation, which are driven by climate interactions in atmosphere-ocean circulation. In this study, we investigated the climate indices that are effective indicators of long-term precipitation in South Korea, and examined their relationships based on statistical methods. Monthly total precipitation was collected from a total of 60 meteorological stations, and the series were decomposed by ensemble empirical mode decomposition (EEMD) to identify the inherent oscillating patterns or cycles. Cross-correlation analysis and stepwise variable selection were employed to select the significant climate indices at each station. The climate indices that affect the monthly precipitation in South Korea were identified based on the selection frequencies of the selected indices at all stations. The NINO12 indices with four- and ten-month lags and the AMO index with no lag were identified as indicators of monthly precipitation in South Korea. Moreover, they carry meaningful physical information (e.g. periodic oscillations and long-term trend) inherent in the monthly precipitation. The NINO12 indices with four- and ten-month lags were strong indicators representing periodic oscillations in the monthly precipitation. In addition, the long-term trend of the monthly precipitation could be explained by the AMO index. A multiple linear regression model was constructed to investigate the influence of the identified climate indices on the prediction of monthly precipitation. The three identified climate indices successfully explained the monthly precipitation in the winter dry season. Compared to the monthly precipitation in coastal areas, the monthly precipitation in inland areas showed stronger correlation with the identified climate indices.
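
    One ingredient of such an analysis, the lagged cross-correlation between a climate index and a precipitation series, can be sketched as below. The EEMD decomposition step is omitted and the data are synthetic, so this is only an illustration of the lag-selection idea, not the study's procedure.

```python
# Lagged cross-correlation between a climate index and a monthly precipitation
# series, used to pick an indicative lag. Synthetic data; EEMD step omitted.
import numpy as np

def best_lag(index_series, precip_series, max_lag=12):
    """Return the lag (in months) at which the index best correlates with precipitation."""
    correlations = {}
    for lag in range(max_lag + 1):
        if lag == 0:
            x, y = index_series, precip_series
        else:
            x, y = index_series[:-lag], precip_series[lag:]   # index leads by `lag` months
        correlations[lag] = np.corrcoef(x, y)[0, 1]
    lag = max(correlations, key=lambda k: abs(correlations[k]))
    return lag, correlations[lag]

# Synthetic example: precipitation responds to the index with a 4-month delay.
rng = np.random.default_rng(4)
idx = rng.standard_normal(1224)                       # 102 years of monthly values
precip = np.roll(idx, 4) + 0.5 * rng.standard_normal(1224)
precip[:4] = rng.standard_normal(4)                   # discard wrapped-around values
print(best_lag(idx, precip))                          # expected lag of about 4
```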

  20. Identification of first-stage labor arrest by electromyography in term nulliparous women after induction of labor.

    Science.gov (United States)

    Vasak, Blanka; Graatsma, Elisabeth M; Hekman-Drost, Elske; Eijkemans, Marinus J; Schagen van Leeuwen, Jules H; Visser, Gerard H A; Jacod, Benoit C

    2017-07-01

    Worldwide induction and cesarean delivery rates have increased rapidly, with consequences for subsequent pregnancies. The majority of intrapartum cesarean deliveries are performed for failure to progress, typically in nulliparous women at term. Current uterine registration techniques fail to identify inefficient contractions leading to first-stage labor arrest. An alternative technique, uterine electromyography has been shown to identify inefficient contractions leading to first-stage arrest of labor in nulliparous women with spontaneous onset of labor at term. The objective of this study was to determine whether this finding can be reproduced in induction of labor. Uterine activity was measured in 141 nulliparous women with singleton term pregnancies and a fetus in cephalic position during induced labor. Electrical activity of the myometrium during contractions was characterized by its power density spectrum. No significant differences were found in contraction characteristics between women with induced labor delivering vaginally with or without oxytocin and women with arrested labor with subsequent cesarean delivery. Uterine electromyography shows no correlation with progression of labor in induced labor, which is in contrast to spontaneous labor. © 2017 Nordic Federation of Societies of Obstetrics and Gynecology.
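
    The power density spectrum characterization mentioned above could, under assumptions about the sampling rate and window length, be computed with a standard Welch estimate; the sketch below is illustrative and is not the study's processing chain.

```python
# Power density spectrum of a uterine EMG contraction segment (Welch estimate).
# Sampling rate, window length and the synthetic signal are illustrative assumptions.
import numpy as np
from scipy.signal import welch

fs = 20.0                                    # assumed EMG sampling rate in Hz
rng = np.random.default_rng(5)
t = np.arange(0, 120, 1 / fs)                # one 2-minute contraction window
emg = np.sin(2 * np.pi * 0.45 * t) + 0.3 * rng.standard_normal(t.size)

freqs, psd = welch(emg, fs=fs, nperseg=512)
peak_freq = freqs[np.argmax(psd)]            # dominant frequency of the burst
median_freq = freqs[np.searchsorted(np.cumsum(psd), 0.5 * psd.sum())]
print(f"peak {peak_freq:.2f} Hz, median {median_freq:.2f} Hz")
```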

  1. A Joint Approach for Single-Channel Speaker Identification and Speech Separation

    DEFF Research Database (Denmark)

    Mowlaee, Pejman; Saeidi, Rahim; Christensen, Mads Græsbøll

    2012-01-01

    In this paper, we present a novel system for joint speaker identification and speech separation. For speaker identification, a single-channel speaker identification algorithm is proposed which provides an estimate of signal-to-signal ratio (SSR) as a by-product. For speech separation, we propose... a sinusoidal model-based algorithm. The speech separation algorithm consists of a double-talk/single-talk detector followed by a minimum mean square error estimator of sinusoidal parameters for finding optimal codevectors from pre-trained speaker codebooks. In evaluating the proposed system, we start from... accuracy; here, we report the objective and subjective results as well. The results show that the proposed system performs as well as the best of the state-of-the-art in terms of perceived quality, while its performance in terms of speaker identification and automatic speech recognition results...

  2. Diagnosis of district potential in terms of renewable energies. Report 1 - Present situation: Assessment of renewable energy production, Identification and quantification of territory's potentialities in terms of renewable energies

    International Nuclear Information System (INIS)

    2010-10-01

    After a presentation of the Gers district context (geography, administrative organisation, demography, housing, economy, expertise), the report presents the energy situation, an overview of the solar thermal sector (installations and installers), of the solar photovoltaic sector (existing and projected installations, installers), of hydroelectricity, of wood-energy (individual heating, industrial heating plants, planned installations), of wind energy, of biogas, and of geothermal energy (existing and planned installations). It proposes an assessment of these energies as a whole. Then, after an overview of the district situation with respect to national objectives and to other districts of the region, the study reports an identification and quantification of potentialities in terms of theoretical resources for different energy sources (solar, wind, hydraulic, wood, methanization, valorizable biomass, geothermal, and agri-fuels). Avoided CO2 emissions are assessed.

  3. Genetic Programming for Automatic Hydrological Modelling

    Science.gov (United States)

    Chadalawada, Jayashree; Babovic, Vladan

    2017-04-01

    One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involve the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given the limited understanding of underlying processes, increasing volume of data and high levels of uncertainty. Each of the existing hydrological models varies in terms of conceptualization and process representation and is best suited to capture the environmental dynamics of a particular hydrological system. Data driven approaches can be used in the integration of alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of an integrated modelling framework that is influenced by prior understanding and data include the choice of the technique for the induction of knowledge from data, identification of alternative structural hypotheses, definition of rules and constraints for meaningful, intelligent combination of model component hypotheses, and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that tests different conceptual model constructs based on a wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using the modelling decisions inspired from the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow duration curve based performance metrics. The collaboration between data driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach

  4. ENVIRONMENTS and EOL: identification of Environment Ontology terms in text and the annotation of the Encyclopedia of Life.

    Science.gov (United States)

    Pafilis, Evangelos; Frankild, Sune P; Schnetzer, Julia; Fanini, Lucia; Faulwetter, Sarah; Pavloudi, Christina; Vasileiadou, Katerina; Leary, Patrick; Hammock, Jennifer; Schulz, Katja; Parr, Cynthia Sims; Arvanitidis, Christos; Jensen, Lars Juhl

    2015-06-01

    The association of organisms to their environments is a key issue in exploring biodiversity patterns. This knowledge has traditionally been scattered, but textual descriptions of taxa and their habitats are now being consolidated in centralized resources. However, structured annotations are needed to facilitate large-scale analyses. Therefore, we developed ENVIRONMENTS, a fast dictionary-based tagger capable of identifying Environment Ontology (ENVO) terms in text. We evaluate the accuracy of the tagger on a new manually curated corpus of 600 Encyclopedia of Life (EOL) species pages. We use the tagger to associate taxa with environments by tagging EOL text content monthly, and integrate the results into the EOL to disseminate them to a broad audience of users. The software and the corpus are available under the open-source BSD and the CC-BY-NC-SA 3.0 licenses, respectively, at http://environments.hcmr.gr. © The Author 2015. Published by Oxford University Press.
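
    In miniature, a dictionary-based tagger of this kind boils down to matching ontology term names in text; the sketch below illustrates the idea with a tiny placeholder term list and is not the ENVIRONMENTS implementation itself.

```python
# Miniature dictionary-based tagger: matching of (a few) ontology term names in text.
# The term dictionary is a placeholder stand-in for ENVO; this is not the actual
# ENVIRONMENTS implementation.
import re

ENVO_TERMS = {                      # tiny illustrative subset; IDs are placeholders
    "coral reef": "ENVO:0000000",
    "estuary": "ENVO:0000001",
    "sea water": "ENVO:0000002",
}

def tag_environments(text, terms=ENVO_TERMS):
    """Return (term, ontology_id, start, end) for each dictionary hit in the text."""
    hits = []
    for name, envo_id in terms.items():
        for match in re.finditer(r"\b" + re.escape(name) + r"\b", text, flags=re.IGNORECASE):
            hits.append((name, envo_id, match.start(), match.end()))
    return sorted(hits, key=lambda h: h[2])

page = "This species inhabits coral reef slopes and adjacent sea water habitats."
print(tag_environments(page))
```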

  5. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his...... honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers...... by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  6. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations for and the different algorithms of automatic document summarization (ADS), and provides a recent state of the art. The book presents the main problems of ADS, the difficulties involved and the solutions provided by the community, along with recent advances, current applications and trends. The approaches covered are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts.  The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been develop

  7. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings...... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  8. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may be extended also by the calculation of saturation activities from k₀ and Q₀ factors, and f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)

  9. Automatic Control of Personal Rapid Transit Vehicles

    Science.gov (United States)

    Smith, P. D.

    1972-01-01

    The requirements for automatic longitudinal control of a string of closely packed personal vehicles are outlined. Optimal control theory is used to design feedback controllers for strings of vehicles. An important modification of the usual optimal control scheme is the inclusion of jerk in the cost functional. Although the jerk term was included in the formulation, the effect of its inclusion was not studied in depth; adding the jerk term is expected to increase passenger comfort.
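
    As an illustration only (the symbols and weights below are assumptions, not taken from the report), a longitudinal-control cost functional of the kind described, augmented with a jerk penalty, can be written as

    ```latex
    J = \int_{0}^{T} \left( \mathbf{x}^{\mathsf{T}} Q\, \mathbf{x} + r\,u^{2} + \rho\, \dot{u}^{2} \right) dt ,
    ```

    where x collects the spacing and speed errors of the vehicle string, u is the commanded acceleration, and \dot{u} is the jerk. Penalizing \dot{u} limits how abruptly the commanded acceleration may change, which is what links the jerk term to passenger comfort.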

  10. Automatic segmentation of vertebrae from radiographs

    DEFF Research Database (Denmark)

    Mysling, Peter; Petersen, Peter Kersten; Nielsen, Mads

    2011-01-01

    Segmentation of vertebral contours is an essential task in the design of automatic tools for vertebral fracture assessment. In this paper, we propose a novel segmentation technique which does not require operator interaction. The proposed technique solves the segmentation problem in a hierarchical...... is constrained by a conditional shape model, based on the variability of the coarse spine location estimates. The technique is evaluated on a data set of manually annotated lumbar radiographs. The results compare favorably to the previous work in automatic vertebra segmentation, in terms of both segmentation...

  11. Cliff : the automatized zipper

    NARCIS (Netherlands)

    Baharom, M.Z.; Toeters, M.J.; Delbressine, F.L.M.; Bangaru, C.; Feijs, L.M.G.

    2016-01-01

    It is our strong belief that fashion - more specifically apparel - can support us much more in our daily life than it currently does. The Cliff project takes the opportunity to create a generic automatized zipper. It is a response to the struggle of the elderly, people with physical disability, and

  12. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of input. We describe a system to derive such time bounds automatically using abstract...

  13. Automatic Oscillating Turret.

    Science.gov (United States)

    1981-03-01

    Final report (February 1978 - September 1980) on the Automatic Oscillating Turret System; includes an appendix describing the oscillating bumper turret. Among the criteria requirements were: (1) turret controls inside the cab; (2) automatic oscillation with fixed elevation, ranging from 20° below the horizontal to

  14. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  15. Automatic sweep circuit

    International Nuclear Information System (INIS)

    Keefe, D.J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input is described. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found

  16. Automatic sweep circuit

    Science.gov (United States)

    Keefe, Donald J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.

  17. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.

  18. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  19. The power grid AGC frequency bias coefficient online identification method based on wide area information

    Science.gov (United States)

    Wang, Zian; Li, Shiguang; Yu, Ting

    2015-12-01

    This paper proposes an online identification method for the regional frequency deviation (bias) coefficient, based on an analysis of the AGC adjustment response mechanism of the interconnected grid with respect to this coefficient and on the real-time operating state of generators as measured by PMUs. The optimization of the regional frequency deviation coefficient is analysed for the actual operating state of the power system, achieving more accurate and efficient automatic generation control. The validity of the online identification method is verified by building a long-term frequency control simulation model of a two-area interconnected power system.
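
    For context, the frequency bias coefficient identified here is the B_i that appears in the standard area control error (ACE) driving AGC; the formulation below is textbook material rather than a quotation from the paper:

    ```latex
    \mathrm{ACE}_i = \Delta P_{\mathrm{tie},i} + B_i \, \Delta f ,
    ```

    where \Delta P_{\mathrm{tie},i} is the deviation of area i's net tie-line interchange, \Delta f is the system frequency deviation, and B_i (in MW/Hz) is the frequency bias coefficient whose online identification the paper addresses.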

  20. Automatic control system at the ''Loviisa'' NPP

    International Nuclear Information System (INIS)

    Kukhtevich, I.V.; Mal'tsev, B.K.; Sergievskaya, E.N.

    1980-01-01

    The automatic control system of the Loviisa-1 NPP (Finland) is described. According to the operating conditions of the Finnish power system, the Loviisa-1 NPP must follow a weekly and daily load schedule and participate in the ongoing control of power system frequency and capacity. To meet these requirements, the NPP is equipped with an all-regime automatic control system functioning during reactor start-up and shutdown, in normal and transient regimes, and in emergency situations. The automatic control system includes: a data subsystem, an automatic control subsystem, a discrete control subsystem (including remote control), a subsystem for reactor control and protection, and an overall station protection system with in-reactor control and dosimetry. The structures of the data-computer complex, the discrete control subsystems, the reactor control and protection systems, the neutron flux control system, the in-reactor control system, the station protection system and the system for monitoring fuel element tightness are presented briefly. Two years of operating experience confirmed the advisability of the chosen extent of automation. The Loviisa-1 NPP operates successfully in the mode of weekly and daily control of the dispatcher schedule and ongoing frequency control (short-term control)

  1. Automatic tracking of wake vortices using ground-wind sensor data

    Science.gov (United States)

    1977-01-03

    Algorithms for automatic tracking of wake vortices using ground-wind anemometer data are developed. Methods of bad-data suppression, track initiation, and track termination are included. An effective sensor-failure detection-and-identification ...

  2. Identity verification using computer vision for automatic garage door opening

    NARCIS (Netherlands)

    Wijnhoven, R.G.J.; With, de P.H.N.

    2011-01-01

    We present a novel system for automatic identification of vehicles as part of an intelligent access control system for a garage entrance. Using a camera in the door, cars are detected and matched to the database of authenticated cars. Once a car is detected, License Plate Recognition (LPR) is

  3. Profiling School Shooters: Automatic Text-Based Analysis

    Directory of Open Access Journals (Sweden)

    Yair eNeuman

    2015-06-01

    Full Text Available School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their varied characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by six school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology.

  4. Uranium casting furnace automatic temperature control development

    International Nuclear Information System (INIS)

    Lind, R.F.

    1992-01-01

    Development of an automatic molten uranium temperature control system for use on batch-type induction casting furnaces is described. Implementation of a two-color optical pyrometer, development of an optical scanner for the pyrometer, determination of furnace thermal dynamics, and design of control systems are addressed. The optical scanning system is shown to greatly improve pyrometer measurement repeatability, particularly where heavy floating slag accumulations cause surface temperature gradients. Thermal dynamics of the furnaces were determined by applying least-squares system identification techniques to actual production data. A unity feedback control system utilizing a proportional-integral-derivative compensator is designed by using frequency-domain techniques. 14 refs

  5. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of automatic thesaurus building and automatic classification are examined [fr

  6. Automatic contact in DYNA3D for vehicle crashworthiness

    International Nuclear Information System (INIS)

    Whirley, R.G.; Engelmann, B.E.

    1994-01-01

    This paper presents a new formulation for the automatic definition and treatment of mechanical contact in explicit, nonlinear, finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational costs. The authors have used a new four-step automatic contact algorithm. Key aspects of the proposed method include (1) automatic identification of adjacent and opposite surfaces in the global search phase, and (2) the use of a smoothly varying surface normal that allows a consistent treatment of shell intersection and corner contact conditions without ad hoc rules. Three examples are given to illustrate the performance of the newly proposed algorithm in the public DYNA3D code

  7. Automatic fault extraction using a modified ant-colony algorithm

    International Nuclear Information System (INIS)

    Zhao, Junsheng; Sun, Sam Zandong

    2013-01-01

    The basis of automatic fault extraction is seismic attributes, such as the coherence cube which is always used to identify a fault by the minimum value. The biggest challenge in automatic fault extraction is noise, including that of seismic data. However, a fault has a better spatial continuity in certain direction, which makes it quite different from noise. Considering this characteristic, a modified ant-colony algorithm is introduced into automatic fault identification and tracking, where the gradient direction and direction consistency are used as constraints. Numerical model test results show that this method is feasible and effective in automatic fault extraction and noise suppression. The application of field data further illustrates its validity and superiority. (paper)

  8. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality in nuclear power equipment, as well as for coping with the working environment at the plant site. The latest automatic welders in practical use for nuclear power equipment at Toshiba and IHI factories, namely those for pipes and for lining tanks, are described here. The pipe welder performs buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, followed by butt welding, through the same controller. The lining tank welder can weld two parallel weld lines simultaneously on a large thin-plate lining tank. Both types of welders are demonstrating excellent performance in the shops as well as at the plant site. (author)

  9. Automatic structural scene digitalization.

    Science.gov (United States)

    Tang, Rui; Wang, Yuhan; Cosker, Darren; Li, Wenbin

    2017-01-01

    In this paper, we present an automatic system for the analysis and labeling of structural scenes, i.e. floor plan drawings in Computer-Aided Design (CAD) format. The proposed system applies a fusion strategy to detect and recognize various components of CAD floor plans, such as walls, doors, windows and other ambiguous assets. Technically, a general rule-based filter parsing method is first adopted to extract effective information from the original floor plan. Then, an image-processing based recovery method is employed to correct the information extracted in the first step. Our proposed method is fully automatic and runs in real time. The analysis system provides high accuracy and is also evaluated on a public website that, on average, logs more than ten thousand effective uses per day and reaches a relatively high satisfaction rate.

  10. Automatic trend estimation

    CERN Document Server

    Vamos¸, C˘alin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  11. Automatic calibration of gamma spectrometers

    International Nuclear Information System (INIS)

    Tluchor, D.; Jiranek, V.

    1989-01-01

    The principle is described of energy calibration of the spectrometric path based on the measurement of the standard of one radionuclide or a set of them. The entire computer-aided process is divided into three main steps, viz.: the insertion of the calibration standard by the operator; the start of the calibration program; energy calibration by the computer. The program was selected such that the spectrum identification should not depend on adjustment of the digital or analog elements of the gamma spectrometric measuring path. The ECL program is described for automatic energy calibration as is its control, the organization of data file ECL.DAT and the necessary hardware support. The computer-multichannel analyzer communication was provided using an interface pair of Canberra 8673V and Canberra 8573 operating in the RS-422 standard. All subroutines for communication with the multichannel analyzer were written in MACRO 11 while the main program and the other subroutines were written in FORTRAN-77. (E.J.). 1 tab., 4 refs

  12. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. For industrial use, however, an easy-to-handle tool for fast data analysis is also required. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. Experience shows that the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster by using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and their data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument that covers the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.

  13. Identification and characterization of novel long-term metabolites of oxymesterone and mesterolone in human urine by application of selected reaction monitoring GC-CI-MS/MS.

    Science.gov (United States)

    Polet, Michael; Van Gansbeke, Wim; Geldof, Lore; Deventer, Koen; Van Eenoo, Peter

    2017-11-01

    The search for metabolites with longer detection times remains an important task in, for example, toxicology and doping control. The impact of these long-term metabolites is highlighted by the high number of positive cases after reanalysis of samples that were stored for several years, e.g. samples of previous Olympic Games. A substantial number of previously alleged negative samples have now been declared positive due to the detection of various long-term steroid metabolites the existence of which was unknown during the Olympic Games of 2008 and 2012. In this work, the metabolism of oxymesterone and mesterolone, two anabolic androgenic steroids (AAS), was investigated by application of a selected reaction monitoring gas chromatography-chemical ionization-triple quadrupole mass spectrometry (GC-CI-MS/MS) protocol for metabolite detection and identification. Correlations between AAS structure and GC-CI-MS/MS fragmentation behaviour enabled the search for previously unknown but expected AAS metabolites by selection of theoretical transitions for expected metabolites. Use of different hydrolysis protocols allowed for evaluation of the detection window of both phase I and phase II metabolites. For oxymesterone, a new metabolite, 18-nor-17β-hydroxymethyl-17α-methyl-4-hydroxy-androst-4,13-diene-3-one, was identified. It was detectable up to 46 days by using GC-CI-MS/MS, whereas with a traditional screening (detection of metabolite 17-epioxymesterone with electron ionization GC-MS/MS) oxymesterone administration was only detectable for 3.5 days. A new metabolite was also found for mesterolone. It was identified as 1α-methyl-5α-androstan-3,6,16-triol-17-one and its sulfate form after hydrolysis with Helix pomatia resulted in a prolonged detection time (up to 15 days) for mesterolone abuse. Copyright © 2017 John Wiley & Sons, Ltd.

  14. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments...... and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices....

  15. Automatic LOD selection

    OpenAIRE

    Forsman, Isabelle

    2017-01-01

    In this paper, a method to automatically generate transition distances for LOD, improving image stability and performance, is presented. Three different methods were tested, all measuring the change between two levels of detail using the spatial frequency. The methods were implemented as an optional pre-processing step in order to determine the transition distances from multiple view directions. During run-time, both view-direction-based selection and the furthest distance for each direction was ...

  16. Automatic topics segmentation for TV news video

    Science.gov (United States)

    Hmayda, Mounira; Ejbali, Ridha; Zaied, Mourad

    2017-03-01

    Automatic identification of television programs in the TV stream is an important task for operating archives. This article proposes a new spatio-temporal approach to identify programs in the TV stream in two main steps. First, a reference catalogue of video features for visual jingles is built. We exploit the features that characterize instances of the same program type to identify the different types of programs in the television flow; the role of the video features is to represent the visual invariants of each visual jingle using appropriate automatic descriptors for each television program. Second, programs in television streams are identified by examining the similarity of the video signal to the visual grammars in the catalogue. The main idea of the identification process is to compare the visual similarity of the video signal features in the television flow to the catalogue. After presenting the proposed approach, the paper overviews encouraging experimental results on several streams extracted from different channels and composed of several programs.

  17. Automatic quantitative renal scintigraphy

    International Nuclear Information System (INIS)

    Valeyre, J.; Deltour, G.; Delisle, M.J.; Bouchard, A.

    1976-01-01

    Renal scintigraphy data may be analyzed automatically by the use of a processing system coupled to an Anger camera (TRIDAC-MULTI 8 or CINE 200). The computing sequence is as follows: normalization of the images; background noise subtraction on both images; evaluation of mercury 197 uptake by the liver and spleen; calculation of the activity fractions on each kidney with respect to the injected dose, taking into account the kidney depth and the results referred to normal values; edition of the results. Automation minimizes the scattering parameters and by its simplification is a great asset in routine work [fr

  18. AUTOMATIC FREQUENCY CONTROL SYSTEM

    Science.gov (United States)

    Hansen, C.F.; Salisbury, J.D.

    1961-01-10

    A control is described for automatically matching the frequency of a resonant cavity to that of a driving oscillator. The driving oscillator is disconnected from the cavity and a secondary oscillator is actuated in which the cavity is the frequency determining element. A low frequency is mixed with the output of the driving oscillator and the resultant lower and upper sidebands are separately derived. The frequencies of the sidebands are compared with the secondary oscillator frequency, deriving a servo control signal to adjust a tuning element in the cavity and match the cavity frequency to that of the driving oscillator. The driving oscillator may then be connected to the cavity.

  19. Automatic dipole subtraction

    International Nuclear Information System (INIS)

    Hasegawa, K.

    2008-01-01

    The Catani-Seymour dipole subtraction is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. We automatized the procedure in a computer code. The code is useful especially for the processes with many parton legs. In this talk, we first explain the algorithm of the dipole subtraction and the whole structure of our code. After that we show the results for some processes where the infrared divergences of real emission processes are subtracted. (author)

  20. Automatic programmable air ozonizer

    International Nuclear Information System (INIS)

    Gubarev, S.P.; Klosovsky, A.V.; Opaleva, G.P.; Taran, V.S.; Zolototrubova, M.I.

    2015-01-01

    In this paper we describe a compact, economical, easy-to-manage automatic air ozonizer developed at the Institute of Plasma Physics of the NSC KIPT. It is designed for sanitation and disinfection of premises and for cleaning the air of foreign odors. A distinctive feature of the device is the generation of a given ozone concentration, approximately 0.7 of the maximum allowable concentration (MAC), and the automatic maintenance of this specified level, which allows people to remain inside the treated premises during operation. A microprocessor controller was developed to control the operation of the ozonizer

  1. Identification of Relevant Phytochemical Constituents for Characterization and Authentication of Tomatoes by General Linear Model Linked to Automatic Interaction Detection (GLM-AID) and Artificial Neural Network Models (ANNs).

    Science.gov (United States)

    Hernández Suárez, Marcos; Astray Dopazo, Gonzalo; Larios López, Dina; Espinosa, Francisco

    2015-01-01

    There are a large number of tomato cultivars with a wide range of morphological, chemical, nutritional and sensorial characteristics. Many factors are known to affect the nutrient content of tomato cultivars. A complete understanding of the effect of these factors would require an exhaustive experimental design, a multidisciplinary scientific approach and a suitable statistical method. Some multivariate analytical techniques such as Principal Component Analysis (PCA) or Factor Analysis (FA) have been widely applied in order to search for patterns in the behaviour and to reduce the dimensionality of a data set by means of a new set of uncorrelated latent variables. However, in some cases it is not useful to replace the original variables with these latent variables. In this study, the Automatic Interaction Detection (AID) algorithm and Artificial Neural Network (ANN) models were applied as alternatives to PCA, FA and other multivariate analytical techniques in order to identify the relevant phytochemical constituents for the characterization and authentication of tomatoes. To prove the feasibility of the AID algorithm and ANN models for this purpose, both methods were applied to a data set of twenty-five chemical parameters analysed in 167 tomato samples from Tenerife (Spain). Each tomato sample was defined by three factors: cultivar, agricultural practice and harvest date. The General Linear Model linked to AID (GLM-AID) tree structure was organized into three levels according to the number of factors. p-Coumaric acid was the compound that allowed the tomato samples to be distinguished according to the day of harvest. More than one chemical parameter was necessary to distinguish among different agricultural practices and among the tomato cultivars. Several ANN models, with 25 and 10 input variables, for the prediction of cultivar, agricultural practice and harvest date, were developed. Finally, the models with 10 input variables were chosen, with goodness of fit between 44 and 100

  2. Identification of Relevant Phytochemical Constituents for Characterization and Authentication of Tomatoes by General Linear Model Linked to Automatic Interaction Detection (GLM-AID and Artificial Neural Network Models (ANNs.

    Directory of Open Access Journals (Sweden)

    Marcos Hernández Suárez

    Full Text Available There are a large number of tomato cultivars with a wide range of morphological, chemical, nutritional and sensorial characteristics. Many factors are known to affect the nutrient content of tomato cultivars. A complete understanding of the effect of these factors would require an exhaustive experimental design, a multidisciplinary scientific approach and a suitable statistical method. Some multivariate analytical techniques such as Principal Component Analysis (PCA) or Factor Analysis (FA) have been widely applied in order to search for patterns in the behaviour and to reduce the dimensionality of a data set by means of a new set of uncorrelated latent variables. However, in some cases it is not useful to replace the original variables with these latent variables. In this study, the Automatic Interaction Detection (AID) algorithm and Artificial Neural Network (ANN) models were applied as alternatives to PCA, FA and other multivariate analytical techniques in order to identify the relevant phytochemical constituents for the characterization and authentication of tomatoes. To prove the feasibility of the AID algorithm and ANN models to achieve the purpose of this study, both methods were applied to a data set of twenty-five chemical parameters analysed in 167 tomato samples from Tenerife (Spain). Each tomato sample was defined by three factors: cultivar, agricultural practice and harvest date. The General Linear Model linked to AID (GLM-AID) tree structure was organized into three levels according to the number of factors. p-Coumaric acid was the compound that allowed the tomato samples to be distinguished according to the day of harvest. More than one chemical parameter was necessary to distinguish among different agricultural practices and among the tomato cultivars. Several ANN models, with 25 and 10 input variables, for the prediction of cultivar, agricultural practice and harvest date, were developed. Finally, the models with 10 input variables were chosen, with goodness of fit

  3. Automatic operation device for control rods

    International Nuclear Information System (INIS)

    Sekimizu, Koichi

    1984-01-01

    Purpose: To enable automatic operation of control rods based on the reactor operation plan and, in particular, to decrease the operator's load during start-up and shutdown of the reactor. Constitution: The operation plan, the demand for automatic operation, break point setting values, power and reactor core flow rate changes, demands for operation interrupt, restart and forecasting, and the like are entered through an input device, and an overall judging device performs a long-term forecast up to the break point using a long-term forecasting device based on the operation plan. Automatic reactor operation is then carried out based on the long-term forecast, while a short-term forecast of the change in reactor core status due to the control rod operation sequence is made based on the control rod pattern and the operation plan. It is then judged, from the result of the short-term forecast, whether operation of the intended control rod is possible. (Aizawa, K.)

  4. Simplified automatic on-line document searching

    International Nuclear Information System (INIS)

    Ebinuma, Yukio

    1983-01-01

    The author proposes a search method for users who do not need comprehensive retrieval, namely one that automatically provides a flexible number of related documents to the user. A group of technical terms is used as search terms to express an inquiry. Logical sums of the terms, in ascending order of frequency of usage, are prepared sequentially and automatically, and the search formulas q_m and q_(m-1) that meet certain threshold values are then selected, also automatically. Users judge the precision of the search output on up to 20 items retrieved by the formula q_m. If a user wishes a recall ratio above 30%, the search result should be output by q_m; if a recall ratio below 30% is acceptable, it should be output by q_(m-1). A search by this method using one year's volume of the INIS database (76,600 items) and five inquiries resulted in a recall ratio of 32% and a precision ratio of 36% on average in the case of q_m. The connect time at a terminal was within 15 minutes per inquiry. This is more efficient than an inexperienced searcher. The method can be applied to on-line search systems for databases in which natural language only, or natural language and a controlled vocabulary, are used. (author)
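
    A rough sketch of the described strategy is shown below: query terms are OR-ed together in ascending order of usage frequency, and the expansion stops once the retrieved set would exceed a size threshold, so the previous formula (q_(m-1)) is kept. The in-memory inverted index, the frequency counts and the threshold are simplified assumptions for illustration.

    ```python
    # Rough sketch: OR together query terms, rarest first, and stop expanding once
    # the retrieved set would exceed max_docs (keeping the previous formula q_(m-1)).
    # The toy inverted index, frequencies and threshold are simplified assumptions.
    def build_query_levels(terms, term_frequency):
        """q_1, q_2, ... : logical sums of terms, rarest terms first."""
        ordered = sorted(terms, key=lambda t: term_frequency.get(t, 0))
        return [set(ordered[: i + 1]) for i in range(len(ordered))]

    def retrieve(query_terms, index):
        """Documents containing at least one of the query terms (logical sum)."""
        return {doc for term in query_terms for doc in index.get(term, set())}

    def search(terms, index, term_frequency, max_docs=20):
        """Return the output of the largest q_m whose result stays within max_docs."""
        result = set()
        for level in build_query_levels(terms, term_frequency):
            candidate = retrieve(level, index)
            if len(candidate) > max_docs:
                break                      # fall back to the previous level
            result = candidate
        return result

    # toy inverted index: term -> documents containing it
    index = {"reactor": {1, 2, 3}, "neutron": {2, 4}, "dosimetry": {4}}
    freq = {"reactor": 120, "neutron": 45, "dosimetry": 3}
    print(search(["reactor", "neutron", "dosimetry"], index, freq, max_docs=3))
    ```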

  5. Forensic Automatic Speaker Recognition Based on Likelihood Ratio Using Acoustic-phonetic Features Measured Automatically

    Directory of Open Access Journals (Sweden)

    Huapeng Wang

    2015-01-01

    Full Text Available Forensic speaker recognition is experiencing a remarkable paradigm shift in terms of the evaluation framework and the presentation of voice evidence. This paper proposes a new method of forensic automatic speaker recognition using the likelihood ratio framework to quantify the strength of voice evidence. The proposed method uses a reference database to calculate the within- and between-speaker variability. Some acoustic-phonetic features are extracted automatically using the software VoiceSauce. The effectiveness of the approach was tested using two Mandarin databases: a mobile telephone database and a landline database. The experimental results indicate that these acoustic-phonetic features do have some discriminating potential and are worth using for discrimination. The automatic acoustic-phonetic features have acceptable discriminative performance and can provide more reliable results in evidence analysis when fused with other kinds of voice features.
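
    The likelihood-ratio framework referred to here is the standard forensic formulation rather than something specific to this paper: the strength of the voice evidence E is expressed as

    ```latex
    \mathrm{LR} = \frac{p(E \mid H_{\mathrm{ss}})}{p(E \mid H_{\mathrm{ds}})} ,
    ```

    where H_ss is the hypothesis that the questioned and known recordings come from the same speaker and H_ds that they come from different speakers; within-speaker variability informs the numerator and between-speaker variability, estimated from the reference database, informs the denominator.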

  6. [The maintenance of automatic analysers and associated documentation].

    Science.gov (United States)

    Adjidé, V; Fournier, P; Vassault, A

    2010-12-01

    The maintenance of automatic analysers and the associated documentation, which form part of the requirements of the ISO 15189 standard as well as of French regulation, have to be defined in the laboratory policy. The management of periodic maintenance and its documentation shall be implemented and fulfilled. Corrective maintenance has to be organised so as to avoid interrupting the work of the laboratory. The recommendations concern the identification of materials, including automatic analysers, the environmental conditions to take into account, the documentation provided by the manufacturer, and documents prepared by the laboratory, including maintenance procedures.

  7. Cost-benefit analysis of the ATM automatic deposit service

    Directory of Open Access Journals (Sweden)

    Ivica Županović

    2015-03-01

    Full Text Available Bankers and other financial experts have analyzed the value of automated teller machines (ATMs) in terms of growing consumer demand, the rising costs of technology development, decreasing profitability and market share. This paper presents a step-by-step cost-benefit analysis of the ATM automatic deposit service. The first step is to determine user attitudes towards the ATM automatic deposit service by using the Technology Acceptance Model (TAM). The second step is to determine location priorities for ATMs that provide automatic deposit services, using the Analytic Hierarchy Process (AHP) model. The results of these steps enable a highly efficient application of cost-benefit analysis for evaluating the costs and benefits of automatic deposit services. To understand the proposed procedure fully, beyond purely theoretical terms, a real-world case study is conducted.

  8. Analyzing Program Termination and Complexity Automatically with AProVE

    DEFF Research Database (Denmark)

    Giesl, Jürgen; Aschermann, Cornelius; Brockschmidt, Marc

    2017-01-01

    In this system description, we present the tool AProVE for automatic termination and complexity proofs of Java, C, Haskell, Prolog, and rewrite systems. In addition to classical term rewrite systems (TRSs), AProVE also supports rewrite systems containing built-in integers (int-TRSs). To analyze...... programs in high-level languages, AProVE automatically converts them to (int-)TRSs. Then, a wide range of techniques is employed to prove termination and to infer complexity bounds for the resulting rewrite systems. The generated proofs can be exported to check their correctness using automatic certifiers...

  9. Automatic vs. manual curation of a multi-source chemical dictionary: the impact on text mining

    Science.gov (United States)

    2010-01-01

    Background Previously, we developed a combined dictionary dubbed Chemlist for the identification of small molecules and drugs in text based on a number of publicly available databases and tested it on an annotated corpus. To achieve an acceptable recall and precision we used a number of automatic and semi-automatic processing steps together with disambiguation rules. However, it remained to be investigated which impact an extensive manual curation of a multi-source chemical dictionary would have on chemical term identification in text. ChemSpider is a chemical database that has undergone extensive manual curation aimed at establishing valid chemical name-to-structure relationships. Results We acquired the component of ChemSpider containing only manually curated names and synonyms. Rule-based term filtering, semi-automatic manual curation, and disambiguation rules were applied. We tested the dictionary from ChemSpider on an annotated corpus and compared the results with those for the Chemlist dictionary. The ChemSpider dictionary of ca. 80 k names was only a 1/3 to a 1/4 the size of Chemlist at around 300 k. The ChemSpider dictionary had a precision of 0.43 and a recall of 0.19 before the application of filtering and disambiguation and a precision of 0.87 and a recall of 0.19 after filtering and disambiguation. The Chemlist dictionary had a precision of 0.20 and a recall of 0.47 before the application of filtering and disambiguation and a precision of 0.67 and a recall of 0.40 after filtering and disambiguation. Conclusions We conclude the following: (1) The ChemSpider dictionary achieved the best precision but the Chemlist dictionary had a higher recall and the best F-score; (2) Rule-based filtering and disambiguation is necessary to achieve a high precision for both the automatically generated and the manually curated dictionary. ChemSpider is available as a web service at http://www.chemspider.com/ and the Chemlist dictionary is freely available as an XML file in
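
    The F-scores behind conclusion (1) are not quoted in the excerpt, but assuming the balanced F1 measure they can be recomputed from the reported post-filtering precision and recall:

    ```latex
    F_1 = \frac{2PR}{P+R}, \qquad
    F_1^{\text{ChemSpider}} = \frac{2 \cdot 0.87 \cdot 0.19}{0.87 + 0.19} \approx 0.31, \qquad
    F_1^{\text{Chemlist}} = \frac{2 \cdot 0.67 \cdot 0.40}{0.67 + 0.40} \approx 0.50 .
    ```

    This is consistent with the statement that Chemlist, despite its lower precision, attains the best F-score.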

  10. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples

  11. Automatic fuel exchanging device

    International Nuclear Information System (INIS)

    Takahashi, Fuminobu.

    1984-01-01

    Purpose: To enable designation of the identification number of a fuel assembly in a nuclear reactor pressure vessel, so that the designated assembly can be reliably exchanged within a short time. Constitution: The identification number (or letter) pressed on the grip of a fuel assembly is detected by a two-dimensional ultrasonic probe of a pull-up mechanism. When the detected number corresponds with the designated number, a control signal is output, whereby the pull-up drive control mechanism or pull-up mechanism responds by pulling up and exchanging the fuel assembly with the identified number. With such a constitution, the fuel assembly can be recognized rapidly and reliably even if the pressed letters deviate to the left or right of the probe, and furthermore the hinge portion and the signal processing portion can be simplified. (Horiuchi, T.)

  12. Ferenczi's concept of identification with the aggressor: understanding dissociative structure with interacting victim and abuser self-states.

    Science.gov (United States)

    Howell, Elizabeth F

    2014-03-01

    No one has described more passionately than Ferenczi the traumatic induction of dissociative trance with its resulting fragmentation of the personality. Ferenczi introduced the concept and term, identification with the aggressor in his seminal "Confusion of Tongues" paper, in which he described how the abused child becomes transfixed and robbed of his senses. Having been traumatically overwhelmed, the child becomes hypnotically transfixed by the aggressor's wishes and behavior, automatically identifying by mimicry rather than by a purposeful identification with the aggressor's role. To expand upon Ferenczi's observations, identification with the aggressor can be understood as a two-stage process. The first stage is automatic and initiated by trauma, but the second stage is defensive and purposeful. While identification with the aggressor begins as an automatic organismic process, with repeated activation and use, gradually it becomes a defensive process. Broadly, as a dissociative defense, it has two enacted relational parts, the part of the victim and the part of the aggressor. This paper describes the intrapersonal aspects (how aggressor and victim self-states interrelate in the internal world), as well as the interpersonal aspects (how these become enacted in the external). This formulation has relevance to understanding the broad spectrum of the dissociative structure of mind, borderline personality disorder, and dissociative identity disorder.

  13. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through the description of the Micro-Videomat automatic image analysis system and the volumetric percentage of perlite in nodular cast irons, porosity and average grain size in high-density sintered pellets of UO₂, and grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with the corresponding ones from the direct counting process: counting of systematic points (grid) to measure volume, and the intersections method, utilizing a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron resulted from the small difference in optical reflectivity of graphite and perlite. Porosity evaluation of sintered UO₂ pellets is also analyzed [pt

  14. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  15. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an ''Image Processing System'' was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  16. AUTOMATIC ARCHITECTURAL STYLE RECOGNITION

    Directory of Open Access Journals (Sweden)

    M. Mathias

    2012-09-01

    Full Text Available Procedural modeling has proven to be a very valuable tool in the field of architecture. In the last few years, research has soared to automatically create procedural models from images. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, the determination of the building style has remained a manual task. In this paper, we propose an algorithm which automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  17. Automatic Text Analysis Based on Transition Phenomena of Word Occurrences

    Science.gov (United States)

    Pao, Miranda Lee

    1978-01-01

    Describes a method of selecting index terms directly from a word frequency list, an idea originally suggested by Goffman. Results of the analysis of word frequencies of two articles seem to indicate that the automated selection of index terms from a frequency list holds some promise for automatic indexing. (Author/MBR)

  18. Neuro-fuzzy system modeling based on automatic fuzzy clustering

    Institute of Scientific and Technical Information of China (English)

    Yuangang TANG; Fuchun SUN; Zengqi SUN

    2005-01-01

    A neuro-fuzzy system model based on automatic fuzzy clustering is proposed. A hybrid model identification algorithm is also developed to decide the model structure and model parameters. The algorithm mainly includes three parts: 1) automatic fuzzy C-means (AFCM), which is applied to generate fuzzy rules automatically and then fix the size of the neuro-fuzzy network, by which the complexity of system design is reduced greatly at the price of some fitting capability; 2) recursive least squares estimation (RLSE), which is used to update the parameters of the Takagi-Sugeno model employed to describe the behavior of the system; 3) a gradient descent algorithm, proposed for the fuzzy values according to the back-propagation algorithm of neural networks. Finally, modeling the dynamical equation of a two-link manipulator with the proposed approach is illustrated to validate the feasibility of the method.
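
    The clustering step at the heart of such a model can be illustrated with a compact fuzzy C-means sketch in NumPy. The automatic selection of the number of clusters (the "A" in AFCM), the RLSE estimation of the Takagi-Sugeno consequents and the gradient-descent tuning described above are all omitted; the fuzzifier m and the toy data are assumptions.

    ```python
    # Compact fuzzy C-means sketch (NumPy). The automatic cluster-number selection,
    # the RLSE estimation of Takagi-Sugeno consequents and the gradient tuning
    # described in the abstract are omitted; 'm' is the usual fuzzifier exponent.
    import numpy as np

    def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        U = rng.random((c, len(X)))
        U /= U.sum(axis=0)                         # memberships sum to 1 per sample
        for _ in range(n_iter):
            Um = U ** m
            centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
            d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-12
            U = 1.0 / (d ** (2.0 / (m - 1.0)))
            U /= U.sum(axis=0)                     # renormalise memberships
        return centers, U

    X = np.vstack([np.random.randn(50, 2) + off for off in ([0, 0], [5, 5], [0, 5])])
    centers, memberships = fuzzy_c_means(X, c=3)
    print(np.round(centers, 2))
    ```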

  19. Automatic Modulation Recognition by Support Vector Machines Using Wavelet Kernel

    Energy Technology Data Exchange (ETDEWEB)

    Feng, X Z; Yang, J; Luo, F L; Chen, J Y; Zhong, X P [College of Mechatronic Engineering and Automation, National University of Defense Technology, Changsha (China)

    2006-10-15

    Automatic modulation identification plays a significant role in electronic warfare, electronic surveillance systems and electronic countermeasures. The task of modulation recognition of communication signals is to determine the modulation type and signal parameters. In fact, automatic modulation identification can be regarded as an application of pattern recognition in the communication field. The support vector machine (SVM) is a universal learning machine that is widely used in the fields of pattern recognition, regression estimation and probability density estimation. In this paper, a new method using a wavelet kernel function is proposed, which maps the input vector x_i into a high-dimensional feature space F. In this feature space F, we can construct the optimal hyperplane that realizes the maximal margin. That is to say, we can use the SVM to classify communication signals into two groups, namely analogue modulated signals and digitally modulated signals. Finally, computer simulation results are given, which show the good performance of the method.

  20. Automatic Modulation Recognition by Support Vector Machines Using Wavelet Kernel

    International Nuclear Information System (INIS)

    Feng, X Z; Yang, J; Luo, F L; Chen, J Y; Zhong, X P

    2006-01-01

    Automatic modulation identification plays a significant role in electronic warfare, electronic surveillance systems and electronic countermeasures. The task of modulation recognition of communication signals is to determine the modulation type and signal parameters. In fact, automatic modulation identification can be regarded as an application of pattern recognition in the communication field. The support vector machine (SVM) is a universal learning machine that is widely used in the fields of pattern recognition, regression estimation and probability density estimation. In this paper, a new method using a wavelet kernel function is proposed, which maps the input vector x_i into a high-dimensional feature space F. In this feature space F, we can construct the optimal hyperplane that realizes the maximal margin. That is to say, we can use the SVM to classify communication signals into two groups, namely analogue modulated signals and digitally modulated signals. Finally, computer simulation results are given, which show the good performance of the method
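
    A sketch of the general idea, pairing a generic SVM with a Morlet-type product wavelet kernel, is given below. The synthetic two-class data stand in for real analogue/digital signal features, and the kernel form and scale parameter are assumptions; this is an illustration, not the authors' implementation.

    ```python
    # Sketch of an SVM with a Morlet-type product wavelet kernel.
    # The synthetic two-class data stand in for real analogue/digital signal
    # features; the kernel scale 'a' is an illustrative assumption.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    def wavelet_kernel(X, Y, a=1.0):
        """K(x, y) = prod_i cos(1.75*(x_i - y_i)/a) * exp(-((x_i - y_i)/a)^2 / 2)."""
        diff = (X[:, None, :] - Y[None, :, :]) / a
        return np.prod(np.cos(1.75 * diff) * np.exp(-0.5 * diff ** 2), axis=2)

    X, y = make_classification(n_samples=400, n_features=6, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    clf = SVC(kernel=wavelet_kernel)       # a callable kernel returns the Gram matrix
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
    ```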

  1. Automatic EEG spike detection.

    Science.gov (United States)

    Harner, Richard

    2009-10-01

    Since the 1970s advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and still most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-peak amplitude and duration. Wavelet parameters are very appropriate to AESD but need to be combined with other parameters to achieve desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.

  2. Image-based automatic recognition of larvae

    Science.gov (United States)

    Sang, Ru; Yu, Guiying; Fan, Weijun; Guo, Tiantai

    2010-08-01

    To date, adult insects (imagoes) have been the main objects of research in quarantine pest recognition. However, pests in their larval stage are latent, and larvae spread abroad easily with the circulation of agricultural and forest products. In this paper, larvae are taken as new research objects and recognized by means of machine vision, image processing and pattern recognition. More visual information is preserved and the recognition rate is improved when color image segmentation is applied to images of larvae. Owing to its affine, perspective and brightness invariance, the scale-invariant feature transform (SIFT) is adopted for feature extraction. A neural network algorithm is utilized for pattern recognition, and the automatic identification of larva images is successfully achieved with satisfactory results.

  3. Automatized system of radioactive material analysis

    International Nuclear Information System (INIS)

    Pchelkin, V.A.; Sviderskij, M.F.; Litvinov, V.A.; Lavrikov, S.A.

    1979-01-01

    An automatized system has been developed for the identification of substance, element and isotope content of radioactive materials on the basis of data obtained for studying physical-chemical properties of substances (with the help of atomic-absorption spectrometers, infrared spectrometer, mass-spectrometer, derivatograph etc.). The system is based on the following principles: independent operation of each device; a possibility of increasing the number of physical instruments and devices; modular properties of engineering and computer means; modular properties and standardization of mathematical equipment, high reliability of the system; continuity of programming languages; a possibility of controlling the devices with the help of high-level language, typification of the system; simple and easy service; low cost. Block-diagram of the system is given

  4. Automatic extraction of composite terms for construction of ontologies: an experiment in the health care area - DOI: 10.3395/reciis.v3i1.244en

    Directory of Open Access Journals (Sweden)

    Lucelene Lopes

    2009-04-01

    Full Text Available In this article we demonstrate the use of the tool OntoLP in the ontology construction process in an experiment in the health care area. Specifically, terms are extracted from a corpus in the pediatrics area. We compare the result obtained by the tool with a reference list of terms obtained manually. In this comparison, bigrams and trigrams obtained through different methods are analyzed. We conclude by noting the advantages of including complex linguistic information, such as syntactic and semantic analysis, in the processing.

  5. Automatic dirt trail analysis in dermoscopy images.

    Science.gov (United States)

    Cheng, Beibei; Joe Stanley, R; Stoecker, William V; Osterwise, Christopher T P; Stricklin, Sherea M; Hinton, Kristen A; Moss, Randy H; Oliviero, Margaret; Rabinovitz, Harold S

    2013-02-01

    Basal cell carcinoma (BCC) is the most common cancer in the US. Dermatoscopes are devices used by physicians to facilitate the early detection of these cancers based on the identification of skin lesion structures often specific to BCCs. One new lesion structure, referred to as dirt trails, has the appearance of dark gray, brown or black dots and clods of varying sizes distributed in elongated clusters with indistinct borders, often appearing as curvilinear trails. In this research, we explore a dirt trail detection and analysis algorithm for extracting, measuring, and characterizing dirt trails based on size, distribution, and color in dermoscopic skin lesion images. These dirt trails are then used to automatically discriminate BCC from benign skin lesions. For an experimental data set of 35 BCC images with dirt trails and 79 benign lesion images, a neural network-based classifier achieved an area of 0.902 under a receiver operating characteristic curve using a leave-one-out approach. Results obtained from this study show that automatic detection of dirt trails in dermoscopic images of BCC is feasible. This is important because of the large number of these skin cancers seen every year and the challenge of discovering them earlier with instrumentation. © 2011 John Wiley & Sons A/S.
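
    For readers unfamiliar with the evaluation protocol, the sketch below shows one hedged way to reproduce a leave-one-out area-under-ROC estimate with a generic neural network classifier; the feature matrix and labels are random placeholders, not the dermoscopy features used in the study.

        import numpy as np
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.neural_network import MLPClassifier
        from sklearn.metrics import roc_auc_score

        # Placeholder feature matrix and labels (1 = BCC with dirt trails, 0 = benign)
        X = np.random.randn(30, 6)
        y = np.array([1] * 15 + [0] * 15)

        clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000)
        proba = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="predict_proba")
        print("leave-one-out AUC:", roc_auc_score(y, proba[:, 1]))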

  6. Behavioral and electrophysiological evidence for early and automatic detection of phonological equivalence in variable speech inputs.

    Science.gov (United States)

    Kharlamov, Viktor; Campbell, Kenneth; Kazanina, Nina

    2011-11-01

    Speech sounds are not always perceived in accordance with their acoustic-phonetic content. For example, an early and automatic process of perceptual repair, which ensures conformity of speech inputs to the listener's native language phonology, applies to individual input segments that do not exist in the native inventory or to sound sequences that are illicit according to the native phonotactic restrictions on sound co-occurrences. The present study with Russian and Canadian English speakers shows that listeners may perceive phonetically distinct and licit sound sequences as equivalent when the native language system provides robust evidence for mapping multiple phonetic forms onto a single phonological representation. In Russian, due to an optional but productive t-deletion process that affects /stn/ clusters, the surface forms [sn] and [stn] may be phonologically equivalent and map to a single phonological form /stn/. In contrast, [sn] and [stn] clusters are usually phonologically distinct in (Canadian) English. Behavioral data from identification and discrimination tasks indicated that [sn] and [stn] clusters were more confusable for Russian than for English speakers. The EEG experiment employed an oddball paradigm with nonwords [asna] and [astna] used as the standard and deviant stimuli. A reliable mismatch negativity response was elicited approximately 100 msec postchange in the English group but not in the Russian group. These findings point to a perceptual repair mechanism that is engaged automatically at a prelexical level to ensure immediate encoding of speech inputs in phonological terms, which in turn enables efficient access to the meaning of a spoken utterance.

  7. Automatic detection and quantitative analysis of cells in the mouse primary motor cortex

    Science.gov (United States)

    Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui

    2014-09-01

    Neuronal cells play a very important role in metabolism regulation and mechanism control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired with 1-μm voxel resolution. We have developed a completely automatic pipeline to perform cell centroid detection, and provide three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; iv) laminar density estimation. Investigations of the presented method reveal promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum from the output vectorizations of detected cell centroids in mouse primary motor cortex, and find significant cellular density distribution variations in different layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
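
    A deliberately simplified, hypothetical version of steps ii) and iii) is sketched below: threshold the image, label connected components, and take their centers of mass as cell centroids (the published pipeline is considerably more elaborate).

        import numpy as np
        from scipy import ndimage as ndi

        def detect_centroids(image, threshold):
            binary = image > threshold                          # ii) binarization
            labels, n = ndi.label(binary)                       # connected components
            return np.array(ndi.center_of_mass(image, labels, list(range(1, n + 1))))

        img = np.zeros((100, 100))
        for r, c in [(20, 30), (60, 70), (80, 15)]:             # three synthetic "cells"
            img[r - 2:r + 3, c - 2:c + 3] = 1.0
        print(detect_centroids(img, 0.5))                       # ~[[20,30],[60,70],[80,15]]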

  8. Word level language identification in online multilingual communication

    NARCIS (Netherlands)

    Nguyen, Dong-Phuong; Dogruoz, A. Seza

    2013-01-01

    Multilingual speakers switch between languages in online and spoken communication. Analyses of large scale multilingual data require automatic language identification at the word level. For our experiments with multilingual online discussions, we first tag the language of individual words using

  9. Automatic control algorithm effects on energy production

    Science.gov (United States)

    Mcnerney, G. M.

    1981-01-01

    A computer model was developed using actual wind time series and turbine performance data to simulate the power produced by the Sandia 17-m VAWT operating in automatic control. The model was used to investigate the influence of starting algorithms on annual energy production. The results indicate that, depending on turbine and local wind characteristics, a bad choice of a control algorithm can significantly reduce overall energy production. The model can be used to select control algorithms and threshold parameters that maximize long term energy production. The results from local site and turbine characteristics were generalized to obtain general guidelines for control algorithm design.

  10. Towards automatic exchange of information

    OpenAIRE

    Oberson, Xavier

    2015-01-01

    This article describes the various steps that led towards automatic exchange of information as the global standard and the issues that remain to be solved. First, the various competing models for exchange of information, such as Double Tax Treaties (DTT), TIEAs, FATCA or EU Directives, are described with a view to showing how they interact with one another. Second, the so-called Rubik strategy is summarized and compared with automatic exchange of information (AEOI). The third part then describes ...

  11. Direct blood culturing on solid medium outperforms an automated continuously monitored broth-based blood culture system in terms of time to identification and susceptibility testing

    Directory of Open Access Journals (Sweden)

    E.A. Idelevich

    2016-03-01

    Full Text Available Pathogen identification and antimicrobial susceptibility testing (AST) should be available as soon as possible for patients with bloodstream infections. We investigated whether a lysis-centrifugation (LC) blood culture (BC) method, combined with matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) identification and Vitek 2 AST, provides a time advantage in comparison with the currently used automated broth-based BC system. Seven bacterial reference strains were each added to 10 mL human blood in final concentrations of 100, 10 and 1 CFU/mL. Inoculated blood was added to the Isolator 10 tube and centrifuged at 3000 g for 30 min, then 1.5 mL sediment was distributed onto five 150-mm agar plates. Growth was observed hourly and microcolonies were subjected to MALDI-TOF MS and Vitek 2 as soon as possible. For comparison, seeded blood was introduced into an aerobic BC bottle and incubated in the BACTEC 9240 automated BC system. For all species/concentration combinations except one, successful identification and Vitek 2 inoculation were achieved even before growth detection by BACTEC. The fastest identification and inoculation for AST were achieved with Escherichia coli in concentrations of 100 CFU/mL and 10 CFU/mL (after 7 h each), while BACTEC flagged the respective samples positive after 9.5 h and 10 h. Use of the LC-BC method allows skipping of incubation in automated BC systems and, used in combination with rapid diagnostics from microcolonies, provides a considerable advantage in time to result. This suggests that the usefulness of direct BC on solid medium should be re-evaluated in the era of rapid microbiology.

  12. Artificial Intelligence In Automatic Target Recognizers: Technology And Timelines

    Science.gov (United States)

    Gilmore, John F.

    1984-12-01

    The recognition of targets in thermal imagery has been a problem exhaustively analyzed in its current localized dimension. This paper discusses the application of artificial intelligence (AI) technology to automatic target recognition, a concept capable of expanding current ATR efforts into a new globalized dimension. Deficiencies of current automatic target recognition systems are reviewed in terms of system shortcomings. Areas of artificial intelligence which show the most promise in improving ATR performance are analyzed, and a timeline is formed in light of how near (as well as far) term artificial intelligence applications may exist. Current research in the area of high level expert vision systems is reviewed and the possible utilization of artificial intelligence architectures to improve low level image processing functions is also discussed. Additional application areas of relevance to solving the problem of automatic target recognition utilizing both high and low level processing are also explored.

  13. Global Distribution Adjustment and Nonlinear Feature Transformation for Automatic Colorization

    Directory of Open Access Journals (Sweden)

    Terumasa Aoki

    2018-01-01

    Full Text Available Automatic colorization is generally classified into two groups: propagation-based methods and reference-based methods. In reference-based automatic colorization methods, color image(s) are used as reference(s) to reconstruct the original color of a gray target image. The most important task here is to find the best matching pairs for all pixels between reference and target images in order to transfer color information from reference to target pixels. A lot of attractive local feature-based image matching methods have been developed over the last two decades. Unfortunately, as far as we know, there are no optimal matching methods for automatic colorization because the requirements for pixel matching in automatic colorization are wholly different from those for traditional image matching. To design an efficient matching algorithm for automatic colorization, clustering pixels with low computational cost and generating a descriptive feature vector are the most important challenges to be solved. In this paper, we present a novel method to address these two problems. In particular, our work concentrates on solving the second problem (designing a descriptive feature vector); namely, we discuss how to learn a descriptive texture feature using a scaled sparse texture feature combined with a nonlinear transformation to construct an optimal feature descriptor. Our experimental results show our proposed method outperforms the state-of-the-art methods in terms of robustness of color reconstruction for automatic colorization applications.

  14. Automatic adventitious respiratory sound analysis: A systematic review.

    Science.gov (United States)

    Pramono, Renard Xaviero Adhi; Bowyer, Stuart; Rodriguez-Villegas, Esther

    2017-01-01

    Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison has not been well established. To provide a review of existing algorithms for the detection or classification of adventitious respiratory sounds. This systematic review provides a complete summary of methods used in the literature to give a baseline for future works. A systematic review of English articles published between 1938 and 2016, searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases. Additional articles were further obtained by references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Only articles were included that focused on adventitious sound detection or classification, based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated. Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. A total of 77 reports from the literature were included in this review. 55 (71.43%) of the studies focused on wheeze, 40 (51.95%) on crackle, 9 (11.69%) on stridor, 9 (11

  15. Who identifies with suicidal film characters? Determinants of identification with suicidal protagonists of drama films.

    Science.gov (United States)

    Till, Benedikt; Herberth, Arno; Sonneck, Gernot; Vitouch, Peter; Niederkrotenthaler, Thomas

    2013-06-01

    Identification with a media character is an influential factor for the effects of a media product on the recipient, but still very little is known about this cognitive process. This study investigated to what extent identification of a recipient with the suicidal protagonist of a film drama is influenced by the similarity between them in terms of sex, age, and education as well as by the viewer's empathy and suicidality. Sixty adults were assigned randomly to one of two film groups. Both groups watched a drama that concluded with the tragic suicide of the protagonist. Identification, empathy, suicidality, as well as socio-demographic data were measured by questionnaires that were applied before and after the movie screening. Results indicated that identification was not associated with socio-demographic similarity or the viewer's suicidality. However, the greater the subjects' empathy was, the more they identified with the protagonist in one of the two films. This investigation provides evidence that challenges the common assumption that identification with a film character is automatically generated when viewer and protagonist are similar in terms of sex, age, education or attitude.

  16. 33 CFR 401.20 - Automatic Identification System.

    Science.gov (United States)

    2010-07-01

    ... more than 50 passengers for hire; and (2) Each dredge, floating plant or towing vessel over 8 meters in... close to the primary conning position in the navigation bridge and a standard 120 Volt, AC, 3-prong...

  17. Automatic Identification of E-Learner Emotional States to Ameliorate ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... offer a high quality training and educational services. ... will explore certain challenges which face the distant-learner: the ... Moreover, it is going to help teacher evaluate the learning .... trajectories, as the sign of a slow flow. ... [4] E.L., Vallerand, R.J., Pelletier, L.G. & Ryan, R.M. , Motivation and education:.

  18. Using Automatic Identification System Technology to Improve Maritime Border Security

    Science.gov (United States)

    2014-12-01

    digital selective calling EPIRB Emergency Position Indicating Radio Beacon EU European Union FAA Federal Aviation Administration GAO U. S. Government...that has visited a hovering vessel or received merchandise outside the territorial sea. A hovering vessel is defined as a vessel loitering offshore...often with the intent to introduce merchandise into the United States illegally. Departing the United States and transiting international or foreign

  19. Using automatic identification system technology to improve maritime border security

    OpenAIRE

    Lindstrom, Tedric R.

    2014-01-01

    Approved for public release; distribution is unlimited Our coastal waters are the United States’ most open and vulnerable borders. This vast maritime domain harbors critical threats from terrorism, criminal activities, and natural disasters. Maritime borders pose significant security challenges, as nefarious entities have used small boats to conduct illegal activities for years, and they continue to do so today. Illegal drugs, money, weapons, and migrants flow both directions across our ma...

  20. Automatic Identification of Nutritious Contexts for Learning Vocabulary Words

    Science.gov (United States)

    Mostow, Jack; Gates, Donna; Ellison, Ross; Goutam, Rahul

    2015-01-01

    Vocabulary knowledge is crucial to literacy development and academic success. Previous research has shown learning the meaning of a word requires encountering it in diverse informative contexts. In this work, we try to identify "nutritious" contexts for a word--contexts that help students build a rich mental representation of the word's…

  1. Automatic thematic content analysis: finding frames in news

    NARCIS (Netherlands)

    Odijk, D.; Burscher, B.; Vliegenthart, R.; de Rijke, M.; Jatowt, A.; Lim, E.P.; Ding, Y.; Miura, A.; Tezuka, T.; Dias, G.; Tanaka, K.; Flanagin, A.; Dai, B.T.

    2013-01-01

    Framing in news is the way in which journalists depict an issue in terms of a ‘central organizing idea.’ Frames can be a perspective on an issue. We explore the automatic classification of four generic news frames: conflict, human interest, economic consequences, and morality. Complex

  2. Choosing Actuators for Automatic Control Systems of Thermal Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Gorbunov, A. I., E-mail: gor@tornado.nsk.ru [JSC “Tornado Modular Systems” (Russian Federation); Serdyukov, O. V. [Siberian Branch of the Russian Academy of Sciences, Institute of Automation and Electrometry (Russian Federation)

    2015-03-15

    Two types of actuators for automatic control systems of thermal power plants are analyzed: (i) pulse-controlled actuator and (ii) analog-controlled actuator with positioning function. The actuators are compared in terms of control circuit, control accuracy, reliability, and cost.

  3. A prospective multicenter clinical feasibility study of a new automatic speaking valve for postlaryngectomy voice rehabilitation

    NARCIS (Netherlands)

    Lansaat, L.; de Kleijn, B. J.; Hilgers, F. J. M.; van der Laan, B. F. A. M.; van den Brekel, M. W. M.

    Evaluation of the short- and long-term clinical feasibility, and exploration of the limitations and advantages, of a new automatic speaking valve (ASV) with integrated HME for laryngectomized patients, the Provox FreeHands FlexiVoice (FlexiVoice). This ASV enables not only automatic, but also manual closure

  4. THEORETICAL CONSIDERATIONS REGARDING THE AUTOMATIC FISCAL STABILIZERS OPERATING MECHANISM

    Directory of Open Access Journals (Sweden)

    Gondor Mihaela

    2012-07-01

    Full Text Available This paper examines the role of Automatic Fiscal Stabilizers (AFS) for stabilizing the cyclical fluctuations of macroeconomic output as an alternative to discretionary fiscal policy, acknowledging their huge potential as an anti-crisis solution. The objectives of the study are the identification of the general features of the concept of automatic fiscal stabilizers and their logical assessment from an economic perspective. Based on the literature in the field, this paper points out the disadvantages of discretionary fiscal policy and argues the need to use Automatic Fiscal Stabilizers in order to provide a faster decision-making process, shielded from political interference, and reduced uncertainty for households and the business environment. The paper concludes on the need to use fiscal policy for smoothing the economic cycle, but in a way which includes among its features transparency, responsibility and clear operating mechanisms. Based on the research results, the present paper assumes that pro-cyclicality reduces the effectiveness of Automatic Fiscal Stabilizers and as a result concludes that it is very important to avoid pro-cyclicality in fiscal rule design. Moreover, by committing in advance to specific fiscal policy action contingent on economic developments, uncertainty about the fiscal policy framework during a recession should be reduced. Being based on logical analysis and not on an empirical, contextualized one, the paper presents some features of the AFS operating mechanism and also identifies and systematizes the factors which determine its importance and national individuality. Reaching common understanding of the Automatic Fiscal Stabilizer concept as an institutional device for smoothing the gaps of the economic cycles across different countries, particularly for the European Union Member States, will facilitate efforts to coordinate fiscal policy responses during a crisis, especially in the context of the fiscal

  5. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings...... differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives...... are described: The forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages
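
    As a toy illustration of the forward method mentioned above (FADBAD/TADIFF themselves are C++ packages; this Python sketch is only an analogy, not their API), dual numbers carry a value and a derivative through each arithmetic operation:

        import math

        class Dual:
            # Pair (value, derivative); arithmetic propagates both by the chain rule.
            def __init__(self, value, deriv=0.0):
                self.value, self.deriv = value, deriv
            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)
            __radd__ = __add__
            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value * other.value,
                            self.value * other.deriv + self.deriv * other.value)
            __rmul__ = __mul__

        def sin(x):
            return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

        x = Dual(2.0, 1.0)              # seed dx/dx = 1
        f = x * x + sin(x)              # f(x) = x^2 + sin(x)
        print(f.value, f.deriv)         # f(2) and f'(2) = 2*2 + cos(2)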

  6. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.

  7. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller LPC936 is used as the master chip in the scaler. A counter integrated with the microcontroller is configured to operate as an external pulse counter. The software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed as a low-cost, low-power-consumption solution. By now, the automatic scaler has been applied in a surface contamination instrument. (authors)

  8. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  9. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  10. A biometric approach to laboratory rodent identification.

    Science.gov (United States)

    Cameron, Jens; Jacobson, Christina; Nilsson, Kenneth; Rögnvaldsson, Thorsteinn

    2007-03-01

    Individual identification of laboratory rodents typically involves invasive methods, such as tattoos, ear clips, and implanted transponders. Beyond the ethical dilemmas they may present, these methods may cause pain or distress that confounds research results. The authors describe a prototype device for biometric identification of laboratory rodents that would allow researchers to identify rodents without the complications of other methods. The device, which uses the rodent's ear blood vessel pattern as the identifier, is fast, automatic, noninvasive, and painless.

  11. Child vocalization composition as discriminant information for automatic autism detection.

    Science.gov (United States)

    Xu, Dongxin; Gilkerson, Jill; Richards, Jeffrey; Yapanel, Umit; Gray, Sharmi

    2009-01-01

    Early identification is crucial for young children with autism to access early intervention. The existing screens require either a parent-report questionnaire and/or direct observation by a trained practitioner. Although an automatic tool would benefit parents, clinicians and children, there is no automatic screening tool in clinical use. This study reports a fully automatic mechanism for autism detection/screening for young children. This is a direct extension of the LENA (Language ENvironment Analysis) system, which utilizes speech signal processing technology to analyze and monitor a child's natural language environment and the vocalizations/speech of the child. It is discovered that child vocalization composition contains rich discriminant information for autism detection. By applying pattern recognition and machine learning approaches to child vocalization composition data, accuracy rates of 85% to 90% in cross-validation tests for autism detection have been achieved at the equal-error-rate (EER) point on a data set with 34 children with autism, 30 language delayed children and 76 typically developing children. Due to its easy and automatic procedure, it is believed that this new tool can serve a significant role in childhood autism screening, especially in regards to population-based or universal screening.
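
    Since the reported accuracy is quoted at the equal-error-rate (EER) point, the hedged sketch below shows one common way to locate that operating point from classifier scores; the labels and scores here are synthetic placeholders, not LENA data.

        import numpy as np
        from sklearn.metrics import roc_curve

        def equal_error_rate(y_true, scores):
            # EER is the operating point where false-positive and false-negative rates meet.
            fpr, tpr, thresholds = roc_curve(y_true, scores)
            fnr = 1.0 - tpr
            idx = np.argmin(np.abs(fnr - fpr))
            return (fpr[idx] + fnr[idx]) / 2.0, thresholds[idx]

        y_true = np.array([0] * 10 + [1] * 10)
        scores = np.concatenate([np.random.rand(10) * 0.6, 0.4 + np.random.rand(10) * 0.6])
        eer, thr = equal_error_rate(y_true, scores)
        print("EER:", eer, "at threshold", thr)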

  12. Source identification and long-term monitoring of airborne particulate matter (PM2.5/PM10) in an urban region of Korea

    International Nuclear Information System (INIS)

    Yong-Sam Chung; Sun-Ha Kim; Jong-Hwa Moon; Young-Jin Kim; Jong-Myoung Lim; Jin-Hong Lee

    2006-01-01

    For the identification of air pollution sources, about 500 airborne particulate matter (PM2.5 and PM10) samples were collected using a Gent air sampler and a polycarbonate filter in an urban region in the middle of Korea from 2000 to 2003. The concentrations of 25 elements in the samples were measured using instrumental neutron activation analysis (INAA). Receptor modeling was performed on the air monitoring data using the positive matrix factorization (PMF2) method. According to this analysis, 6 to 10 PMF factors, such as metal alloy, oil combustion, diesel exhaust, coal combustion, gasoline exhaust, incinerator, Cu smelter, biomass burning, sea salt, and soil dust, were identified. (author)
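
    As a rough, simplified analogue of the receptor-modeling step (PMF2 additionally weights each observation by its measurement uncertainty, which plain non-negative matrix factorization does not), the sketch below factors a synthetic samples-by-elements concentration matrix into source contributions and source profiles:

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)
        profiles = rng.random((6, 25))                 # 6 hypothetical source profiles, 25 elements
        contributions = rng.random((500, 6))           # contribution of each source per sample
        X = contributions @ profiles                   # synthetic concentration data

        model = NMF(n_components=6, init="nndsvda", max_iter=500)
        G = model.fit_transform(X)                     # estimated source contributions per sample
        F = model.components_                          # estimated element profile of each source
        print(G.shape, F.shape)                        # (500, 6) (6, 25)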

  13. The DanTermBank Project

    DEFF Research Database (Denmark)

    Lassen, Tine; Madsen, Bodil Nistrup; Pram Nielsen, Louise

    This paper gives an introduction to the plans and ongoing work in a project, the aim of which is to develop methods for automatic knowledge extraction and automatic construction and updating of ontologies. The project also aims at developing methods for automatic merging of terminological data from various existing sources, as well as methods for target group oriented knowledge dissemination. In this paper, we mainly focus on the plans for automatic knowledge extraction and knowledge structuring that will result in ontologies for a national term bank.

  14. OPTICAL correlation identification technology applied in underwater laser imaging target identification

    Science.gov (United States)

    Yao, Guang-tao; Zhang, Xiao-hui; Ge, Wei-long

    2012-01-01

    Underwater laser imaging detection is an effective method of detecting short-distance targets underwater and an important complement to sonar detection. With the development of underwater laser imaging technology and underwater vehicle technology, underwater automatic target identification has received more and more attention, and it remains a research challenge in the area of underwater optical imaging information processing. Today, underwater automatic target identification based on optical imaging is usually realized with digital-circuit software programming. The algorithm realization and control of this method is very flexible. However, the optical imaging information consists of 2D or even 3D images, and the amount of information to process is large, so electronic hardware with a purely digital algorithm needs a long identification time and can hardly meet the demands of real-time identification. If computer parallel processing is adopted, the identification speed can be improved, but complexity, size and power consumption increase. This paper attempts to apply optical correlation identification technology to realize underwater automatic target identification. Optical correlation identification exploits the Fourier transform property of a Fourier lens, which can accomplish the Fourier transform of image information at the nanosecond level, and optical spatial interconnection calculation has the features of parallelism, high speed, large capacity and high resolution; combined with the flexibility of calculation and control of the digital-circuit method, this yields an optoelectronic hybrid identification mode. We derive the theoretical formulation of correlation identification, analyze the principle of optical correlation identification, and write a MATLAB simulation program. We use single-frame images obtained by underwater range-gated laser imaging for identification, and by identifying and locating targets at different positions, we can improve
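
    A purely digital analogue of the correlator idea (offered only as an illustration, not the optical implementation described above) is a frequency-domain matched filter: the Fourier transform of the target scene is multiplied by the conjugate transform of a reference template, and the inverse transform peaks where the two best align.

        import numpy as np

        def correlate_fft(target, template):
            # Circular cross-correlation via FFT; the peak marks the best match location.
            T = np.fft.fft2(target)
            R = np.fft.fft2(template, s=target.shape)
            return np.abs(np.fft.ifft2(T * np.conj(R)))

        target = np.zeros((128, 128))
        target[40:60, 70:90] = 1.0                     # bright object in the scene
        template = np.ones((20, 20))                   # reference shape
        corr = correlate_fft(target, template)
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        print(peak)                                    # top-left corner of the matched region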

  15. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since the mask will later act as a template for a considerable number of dies on a wafer. Defects on the initial blank substrate, and on subsequently cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and distinguishing defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment compared to the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask

  16. The automatic lumber planing mill

    Science.gov (United States)

    Peter Koch

    1957-01-01

    It is probable that a truly automatic planing operation could be devised if some of the variables commonly present in mill-run lumber were eliminated and the remaining variables kept under close control. This paper will deal with the more general situation faced by most lumber manufacturing plants. In other words, it will be assumed that the incoming lumber has...

  17. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
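
    The sketch below is a minimal illustration of the safe pattern that such a transformation targets, shown here with Python's sqlite3 parameter binding rather than the paper's actual source-to-source rewriting of legacy web applications:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
        conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

        user_input = "alice' OR '1'='1"            # classic injection attempt

        # Unsafe: string concatenation lets the input rewrite the query structure.
        unsafe_query = "SELECT * FROM users WHERE name = '" + user_input + "'"

        # Safe: the statement is prepared with a placeholder and the input is passed
        # as data, so it can never change the query structure.
        rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
        print(rows)                                # [] -- no row matches the literal string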

  18. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used is a combination of two methods: the maximum correlation coefficient and correlation in the subpixel range...... interactive software is also part of a computer-assisted learning program on digital photogrammetry.

  19. Automatic analysis of ultrasonic data

    International Nuclear Information System (INIS)

    Horteur, P.; Colin, J.; Benoist, P.; Bonis, M.; Paradis, L.

    1986-10-01

    This paper describes an automatic and self-contained data processing system, transportable on site, able to produce images such as A-scans and B-scans and to present the results of the inspection very quickly. It can be used for pressure vessel inspection [fr

  20. Automatic measurement of the radioactive mercury uptake by the kidney

    International Nuclear Information System (INIS)

    Zurowski, S.; Raynaud, C.; CEA, 91 - Orsay

    1976-01-01

    An entirely automatic method to measure the Hg uptake by the kidney is proposed. The following operations are carried out in succession: measurement of extrarenal activity, demarcation of uptake areas, anatomical identification of uptake areas, separation of overlapping organ images and measurement of kidney depth. The first results thus calculated on 30 patients are very close to those obtained with a standard manual method and are highly encouraging. Two important points should be stressed: a broad demarcation of the uptake areas is necessary and an original method, that of standard errors, is useful for the background noise determination and uptake area demarcation. This automatic measurement technique is so designed that it can be applied to other special cases [fr

  1. Early Automatic Detection of Parkinson's Disease Based on Sleep Recordings

    DEFF Research Database (Denmark)

    Kempfner, Jacob; Sorensen, Helge B D; Nikolic, Miki

    2014-01-01

    SUMMARY: Idiopathic rapid-eye-movement (REM) sleep behavior disorder (iRBD) is most likely the earliest sign of Parkinson's Disease (PD) and is characterized by REM sleep without atonia (RSWA) and consequently increased muscle activity. However, some muscle twitching in normal subjects occurs...... during REM sleep. PURPOSE: There are no generally accepted methods for evaluation of this activity and a normal range has not been established. Consequently, there is a need for objective criteria. METHOD: In this study we propose a full-automatic method for detection of RSWA. REM sleep identification...... the number of outliers during REM sleep was used as a quantitative measure of muscle activity. RESULTS: The proposed method was able to automatically separate all iRBD test subjects from healthy elderly controls and subjects with periodic limb movement disorder. CONCLUSION: The proposed work is considered...

  2. Automatic analysis of microscopic images of red blood cell aggregates

    Science.gov (United States)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells commonly termed a rouleau. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be of interest for adaptation as a routine in hemorheological and clinical biochemistry laboratories, because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  3. New software for computer-assisted dental-data matching in Disaster Victim Identification and long-term missing persons investigations: "DAVID Web".

    Science.gov (United States)

    Clement, J G; Winship, V; Ceddia, J; Al-Amad, S; Morales, A; Hill, A J

    2006-05-15

    In 1997 an internally supported but unfunded pilot project at the Victorian Institute of Forensic Medicine (VIFM) Australia led to the development of a computer system which closely mimicked Interpol paperwork for the storage, later retrieval and tentative matching of the many AM and PM dental records that are often needed for rapid Disaster Victim Identification. The program was called "DAVID" (Disaster And Victim IDentification). It combined the skills of the VIFM Information Technology systems manager (VW), an experienced odontologist (JGC) and an expert database designer (JC); all current authors on this paper. Students did much of the writing of software to prescription from Monash University. The student group involved won an Australian Information Industry Award in recognition of the contribution the new software could have made to the DVI process. Unfortunately, the potential of the software was never realized because paradoxically the federal nature of Australia frequently thwarts uniformity of systems across the entire country. As a consequence, the final development of DAVID never took place. Given the recent problems encountered post-tsunami by the odontologists who were obliged to use the Plass Data system (Plass Data Software, Holbaek, Denmark) and with the impending risks imposed upon Victoria by the decision to host the Commonwealth Games in Melbourne during March 2006, funding was sought and obtained from the state government to update counter disaster preparedness at the VIFM. Some of these funds have been made available to upgrade and complete the DAVID project. In the wake of discussions between leading expert odontologists from around the world held in Geneva during July 2003 at the invitation of the International Committee of the Red Cross significant alterations to the initial design parameters of DAVID were proposed. This was part of broader discussions directed towards developing instruments which could be used by the ICRC's "The Missing

  4. Topical Session on Liabilities identification and long-term management at national level - Topical Session held during the 36. Meeting of the RWMC

    International Nuclear Information System (INIS)

    2003-01-01

    These proceedings cover a topical session that was held at the March 2003 meeting of the Radioactive Waste Management Committee. The topical session focused on liability assessment and management for decommissioning of all types of nuclear installations, including decontamination of historic sites and waste management, as applicable. The presentations covered the current, national situations. The first oral presentation, from Switzerland, set the scene by providing a broad coverage of the relevant issues. The subsequent presentations - five from Member countries and one from the EC - described additional national positions and the evolving EC proposed directives. Each oral presentation was followed by a brief period of Q and As for clarification only. A plenary discussion took place on the ensemble of presentations and a Rapporteur provided a report on points made and lessons learnt. Additionally, written contributions were provided by RWMC delegates from several other countries. These are included in the proceedings as are the papers from the oral sessions, and the Rapporteur's report. These papers are not intended to be exhaustive, but to give an informed glimpse of NEA countries' approaches to liability identification and management in the context of nuclear facilities decommissioning and dismantling

  5. Identification of long-term trends and seasonality in high-frequency water quality data from the Yangtze River basin, China

    Science.gov (United States)

    He, Bin; Chen, Yaning; Zou, Shan; Wang, Yi; Nover, Daniel; Chen, Wen; Yang, Guishan

    2018-01-01

    Comprehensive understanding of the long-term trends and seasonality of water quality is important for controlling water pollution. This study focuses on spatio-temporal distributions, long-term trends, and seasonality of water quality in the Yangtze River basin using a combination of the seasonal Mann-Kendall test and time-series decomposition. The used weekly water quality data were from 17 environmental stations for the period January 2004 to December 2015. Results show gradual improvement in water quality during this period in the Yangtze River basin and greater improvement in the Uppermost Yangtze River basin. The larger cities, with high GDP and population density, experienced relatively higher pollution levels due to discharge of industrial and household wastewater. There are higher pollution levels in Xiang and Gan River basins, as indicated by higher NH4-N and CODMn concentrations measured at the stations within these basins. Significant trends in water quality were identified for the 2004–2015 period. Operations of the three Gorges Reservoir (TGR) enhanced pH fluctuations and possibly attenuated CODMn, and NH4-N transportation. Finally, seasonal cycles of varying strength were detected for time-series of pollutants in river discharge. Seasonal patterns in pH indicate that maxima appear in winter, and minima in summer, with the opposite true for CODMn. Accurate understanding of long-term trends and seasonality are necessary goals of water quality monitoring system efforts and the analysis methods described here provide essential information for effectively controlling water pollution. PMID:29466354

  6. Identification of long-term trends and seasonality in high-frequency water quality data from the Yangtze River basin, China.

    Science.gov (United States)

    Duan, Weili; He, Bin; Chen, Yaning; Zou, Shan; Wang, Yi; Nover, Daniel; Chen, Wen; Yang, Guishan

    2018-01-01

    Comprehensive understanding of the long-term trends and seasonality of water quality is important for controlling water pollution. This study focuses on spatio-temporal distributions, long-term trends, and seasonality of water quality in the Yangtze River basin using a combination of the seasonal Mann-Kendall test and time-series decomposition. The used weekly water quality data were from 17 environmental stations for the period January 2004 to December 2015. Results show gradual improvement in water quality during this period in the Yangtze River basin and greater improvement in the Uppermost Yangtze River basin. The larger cities, with high GDP and population density, experienced relatively higher pollution levels due to discharge of industrial and household wastewater. There are higher pollution levels in Xiang and Gan River basins, as indicated by higher NH4-N and CODMn concentrations measured at the stations within these basins. Significant trends in water quality were identified for the 2004-2015 period. Operations of the three Gorges Reservoir (TGR) enhanced pH fluctuations and possibly attenuated CODMn, and NH4-N transportation. Finally, seasonal cycles of varying strength were detected for time-series of pollutants in river discharge. Seasonal patterns in pH indicate that maxima appear in winter, and minima in summer, with the opposite true for CODMn. Accurate understanding of long-term trends and seasonality are necessary goals of water quality monitoring system efforts and the analysis methods described here provide essential information for effectively controlling water pollution.

  7. Synthesis of digital locomotive receiver of automatic locomotive signaling

    Directory of Open Access Journals (Sweden)

    K. V. Goncharov

    2013-02-01

    Full Text Available Purpose. Automatic locomotive signaling of continuous type with numeric coding (ALSN) has several disadvantages: a small number of signal indications, low noise immunity, high inertia and low functional flexibility. The search for new, more advanced signal-processing methods for automatic locomotive signaling and the synthesis of a noise-proof digital locomotive receiver are therefore essential. Methodology. The proposed algorithm for detection and identification of locomotive signaling codes is based on computing the mutual correlation of the received oscillation with reference signals. For selecting the threshold levels of the decision element the following criterion has been formulated: the locomotive receiver should maximize the probability of a correct decision for a given probability of dangerous errors. Findings. It has been found that the random nature of the ALSN signal amplitude does not affect the detection algorithm. However, the distribution law and numerical characteristics of the signal amplitude affect the probability of errors and should be considered when selecting threshold levels. Based on the obtained algorithm for detection and identification of ALSN signals, a digital locomotive receiver has been synthesized. It contains a band-pass filter, peak limiter, normalizing amplifier with automatic gain control circuit, analog-to-digital converter and digital signal processor. Originality. The ALSN system is improved by transferring its technical means to a modern microelectronic element base and by applying more advanced methods of detection and identification of locomotive signaling codes. Practical value. Use of digital technology in the construction of the ALSN locomotive receiver will expand its functionality and increase the noise immunity and operational stability of the locomotive signaling system under various destabilizing factors.
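
    A hedged sketch of the correlation-receiver idea follows: the received oscillation is compared with stored reference waveforms and the best match is accepted only if its normalized correlation exceeds a decision threshold. The reference waveforms and the threshold value below are placeholders, not actual ALSN code parameters.

        import numpy as np

        def identify_code(received, references, threshold=0.6):
            # Normalized correlation of the received signal against each reference code.
            scores = {}
            r = received - received.mean()
            for name, ref in references.items():
                s = ref - ref.mean()
                scores[name] = np.dot(r, s) / (np.linalg.norm(r) * np.linalg.norm(s) + 1e-12)
            best = max(scores, key=scores.get)
            return (best if scores[best] >= threshold else None), scores

        t = np.arange(0, 1, 1 / 1000)
        refs = {"code_A": np.sin(2 * np.pi * 5 * t),            # placeholder code patterns
                "code_B": np.sign(np.sin(2 * np.pi * 3 * t))}
        rx = refs["code_B"] + 0.5 * np.random.randn(t.size)     # noisy received oscillation
        print(identify_code(rx, refs))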

  8. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    Title: Automatic differentiation algorithms in model analysis
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods

  9. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques

  10. Automatic Service Derivation from Business Process Model Repositories via Semantic Technology

    NARCIS (Netherlands)

    Leopold, H.; Pittke, F.; Mendling, J.

    2015-01-01

    Although several approaches for service identification have been defined in research and practice, there is a notable lack of fully automated techniques. In this paper, we address the problem of manual work in the context of service derivation and present an approach for automatically deriving

  11. Automatic morphometry of synaptic boutons of cultured cells using granulometric analysis of digital images

    NARCIS (Netherlands)

    Prodanov, D.P.; Heeroma, Joost; Marani, Enrico

    2006-01-01

    Numbers, linear density, and surface area of synaptic boutons can be important parameters in studies on synaptic plasticity in cultured neurons. We present a method for automatic identification and morphometry of boutons based on filtering of digital images using granulometric analysis. Cultures of
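
    A toy version of the granulometric idea (a simplified sketch, not the authors' implementation) is shown below: morphological openings with disks of increasing radius progressively remove bright structures, and the drop in remaining area between radii indicates how many bouton-sized objects of each size are present.

        import numpy as np
        from scipy import ndimage as ndi

        def disk(radius):
            y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
            return x ** 2 + y ** 2 <= radius ** 2

        def granulometry(image, radii):
            # Remaining bright area after opening with a disk of each radius.
            return np.array([ndi.grey_opening(image, footprint=disk(r)).sum() for r in radii])

        img = np.zeros((128, 128))
        for cy, cx, r in [(30, 30, 3), (80, 90, 3), (60, 40, 6)]:   # synthetic boutons
            img[cy - r:cy + r + 1, cx - r:cx + r + 1] += disk(r)
        print(granulometry(img, radii=[1, 2, 3, 4, 5, 6, 7]))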

  12. Automatic radar target recognition of objects falling on railway tracks

    International Nuclear Information System (INIS)

    Mroué, A; Heddebaut, M; Elbahhar, F; Rivenq, A; Rouvaen, J-M

    2012-01-01

    This paper presents an automatic radar target recognition procedure based on complex resonances, using the signals provided by ultra-wideband radar. This procedure is dedicated to the detection and identification of objects lying on railway tracks. For efficient complex resonance extraction, a comparison between several pole extraction methods is illustrated. In addition, preprocessing methods are presented that aim to remove most of the erroneous poles interfering with the discrimination scheme. Once the physical poles are determined, a specific discrimination technique based on Euclidean distances is introduced. Both simulation and experimental results are presented, showing efficient discrimination of different targets, including guided-transport passengers

  13. Feature extraction and classification in automatic weld seam radioscopy

    International Nuclear Information System (INIS)

    Heindoerfer, F.; Pohle, R.

    1994-01-01

    The investigations conducted have shown that automatic feature extraction and classification procedures permit the identification of weld seam flaws. Within this context the favored learning fuzzy classifier represents a very good alternative to conventional classifiers. The results have also made clear that improvements, mainly in the field of image registration, are still possible by increasing the resolution of the radioscopy system, since an almost error-free classification is conceivable only if the flaw is segmented correctly, i.e. in its full size, which requires improved detail recognizability and a sufficient contrast difference. (orig./MM) [de

  14. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...

  15. Usage of aids monitoring in automatic braking systems of modern cars

    Directory of Open Access Journals (Sweden)

    Dembitskyi V.

    2016-08-01

    Full Text Available Safety can be increased by installing on vehicles automatic braking systems that monitor the traffic situation and the actions of the driver. This paper considers the advantages and disadvantages of automatic braking systems and analyzes the modern tracking tools used in them. Based on accident statistics, the main hazards that an automatic braking system can reduce are identified. In order to ensure the accuracy of the information, research was conducted to determine the optimal combination of different sensors that provides an adequate perception of road conditions. The tracking system should be equipped with a combination of sensors; when an obstacle or danger is detected, a signal is transmitted to the information processing and decision-making system. Information from the monitoring system should include data for the identification of the object, its condition and its speed.

  16. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  17. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    Brun, R.; Rademakers, F.

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition, this version produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and its objects inspected, even when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case in which multiple data sets generated with many different class versions must be analyzed in the same session. ROOT also supports the automatic generation of C++ code describing the data objects in a file.

  18. Automatic digitization of SMA data

    Science.gov (United States)

    Väänänen, Mika; Tanskanen, Eija

    2017-04-01

    In the 1970s and 1980s the Scandinavian Magnetometer Array produced large amounts of excellent data from over 30 stations in Norway, Sweden and Finland. 620 film reels and 20 kilometers of film have been preserved, and the longest time series produced in the campaign spans almost five uninterrupted years, but the data has never seen widespread use due to the choice of medium. Film is a difficult medium to digitize efficiently. Previously, events of interest were searched for by hand, and digitization was done by projecting the film on paper and plotting it by hand. We propose a method of automatically digitizing geomagnetic data stored on film and extracting the numerical values from the digitized data. The automatic digitization process helps in preserving old, valuable data that might otherwise go unused.
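
    A very small sketch of the trace-extraction idea follows (assumptions: the scanned frame is a grayscale array with a single dark trace crossing every column, and the pixel-to-nanotesla calibration is already known; real magnetogram films additionally need trace separation, timing-mark detection and baseline handling):

        import numpy as np

        def extract_trace(frame, dark_trace=True):
            """Return, for each image column, the row of the trace's intensity centre of mass."""
            img = np.asarray(frame, dtype=float)
            weights = (img.max() - img) if dark_trace else img   # emphasize the recorded trace
            rows = np.arange(img.shape[0])[:, None]
            return (weights * rows).sum(axis=0) / weights.sum(axis=0)

        def to_physical_units(trace_rows, baseline_row, nT_per_pixel):
            """Convert pixel deflections to field variations using an assumed calibration."""
            return (baseline_row - trace_rows) * nT_per_pixel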

  19. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provided dose-response curves which showed linearity after logistic transformation (Rodbard). This transformation, which is applicable to radioimmunoassay, should be useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double antibody technique was made by Dr. Kawamata. This approach proved useful for the automatic computation of data derived from double antibody assays of insulin and C-peptides. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)
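
    As an illustration of the kind of computation involved, the sketch below fits a logistic (Rodbard-type, four-parameter) model to a standard curve and inverts it to read off unknown doses; all numbers and parameter starting values are invented for the example, not taken from the study.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(dose, a, b, c, d):
            """a: response at zero dose, d: response at infinite dose,
            c: dose at half-maximal response (ED50), b: slope factor."""
            return d + (a - d) / (1.0 + (dose / c) ** b)

        standard_dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])       # e.g. ng/mL insulin
        bound_counts  = np.array([9500, 8700, 6900, 4300, 2200, 1100])   # counts per minute

        params, _ = curve_fit(four_pl, standard_dose, bound_counts,
                              p0=(10000.0, 1.0, 2.0, 800.0), bounds=(0.0, np.inf))

        def dose_from_counts(counts, a, b, c, d):
            """Invert the fitted standard curve to obtain the dose for a measured response."""
            return c * ((a - d) / (counts - d) - 1.0) ** (1.0 / b)

        print(dose_from_counts(5000.0, *params))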

  20. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering and advanced signal processing techniques, together with the scientific evaluation methodologies used in this multidisciplinary field, are part of this exposition. The issues of modeling target signatures in various spectral modalities (LADAR, IR, SAR, high-resolution radar, acoustic, seismic, visible, hyperspectral) and in diverse geometric aspects are addressed. The methods for signal processing and classification cover concepts such as sensor-adaptive and artificial neural networks, time reversal filt...

  1. Automatic Conflict Detection on Contracts

    Science.gov (United States)

    Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo

    Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts, can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into the applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.

  2. Automatic spent fuel ID number reader (I)

    International Nuclear Information System (INIS)

    Tanabe, S.; Kawamoto, H.; Fujimaki, K.; Kobe, A.

    1991-01-01

    An effective and efficient technique has been developed to facilitate the identification of LWR spent fuel stored in large-scale spent fuel storage pools, such as those of processing plants. Experience shows that there are often difficulties in the implementation of the operator's nuclear material accountancy and control work, as well as in safeguards inspections conducted on spent fuel assemblies stored in deep water pools. This paper reports that the technique is realized as an automatic spent fuel ID number reader system installed on the fuel handling machine. The ID number reader system consists of an optical sub-system and an image processing sub-system. Thousands of spent fuel assemblies stored in underwater open racks in each storage pool could be identified within a relatively short time (e.g. within several hours) by using this combination. Various performance tests were carried out on the image processing sub-system in 1990 using TV images obtained from different types of spent fuel assemblies stored in various storage pools of PWR and BWR power stations

  3. Image simulation for automatic license plate recognition

    Science.gov (United States)

    Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José

    2012-01-01

    Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.
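
    The synthesis step can be sketched in a few lines (purely illustrative: the layout, font handling and distortion parameters below are assumptions, whereas the paper estimates distortion parameters from measurements of real plate images):

        import numpy as np
        from PIL import Image, ImageDraw, ImageFilter

        def synthesize_plate(text, size=(240, 60), blur_radius=1.2, noise_sigma=8.0, seed=0):
            """Render a toy plate layout, then apply simple capture distortions (blur + noise)."""
            plate = Image.new("L", size, color=230)                 # light plate background
            ImageDraw.Draw(plate).text((10, 20), text, fill=20)     # dark characters, default font
            plate = plate.filter(ImageFilter.GaussianBlur(blur_radius))   # optical blur
            pixels = np.array(plate, dtype=float)
            pixels += np.random.default_rng(seed).normal(0.0, noise_sigma, pixels.shape)  # sensor noise
            return Image.fromarray(np.clip(pixels, 0, 255).astype(np.uint8))

        synthesize_plate("ABC 1234").save("synthetic_plate.png")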

  4. Automatically Determining Scale Within Unstructured Point Clouds

    Science.gov (United States)

    Kadamen, Jayren; Sithole, George

    2016-06-01

    Three-dimensional models obtained from imagery have an arbitrary scale and therefore have to be scaled. Automatically scaling these models requires the detection of objects in them, which can be computationally intensive. Real-time object detection may pose problems for applications such as indoor navigation. This investigation poses the idea that relational cues, specifically height ratios, within indoor environments may offer an easier means to obtain scales for models created using imagery. The investigation aimed to show two things: (a) that the size of objects, especially their height off the ground, is consistent within an environment, and (b) that, based on this consistency, objects can be identified and their general size used to scale a model. To test the idea, a hypothesis is first tested on a terrestrial lidar scan of an indoor environment. Later, as a proof of concept, the same test is applied to a model created using imagery. The most notable finding was that the detection of objects can be more readily done by studying the ratio between the dimensions of objects whose dimensions are defined by human physiology. For example, the dimensions of desks and chairs are related to the height of an average person. In the test, the difference between generalised and actual dimensions of objects was assessed. A maximum difference of 3.96% (2.93 cm) was observed from automated scaling. By analysing the ratio between the heights (distance from the floor) of the tops of objects in a room, identification was also achieved.
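
    A compact sketch of the ratio idea is given below (the reference heights, tolerance and already-measured object heights are invented for illustration; detecting the objects and measuring their heights in the point cloud is assumed to have been done elsewhere):

        REFERENCE_HEIGHTS_M = {"desk": 0.75, "chair_seat": 0.45, "door": 2.05}

        def identify_and_scale(model_heights, tolerance=0.1):
            """model_heights: label -> height above floor in arbitrary model units.
            Returns an estimated metres-per-model-unit scale, or None if no ratio matches."""
            scales, labels = [], list(model_heights)
            for i, a in enumerate(labels):
                for b in labels[i + 1:]:
                    observed = model_heights[a] / model_heights[b]
                    expected = REFERENCE_HEIGHTS_M[a] / REFERENCE_HEIGHTS_M[b]
                    if abs(observed - expected) / expected <= tolerance:
                        scales.append(REFERENCE_HEIGHTS_M[a] / model_heights[a])
            return sum(scales) / len(scales) if scales else None

        print(identify_and_scale({"desk": 1.5, "chair_seat": 0.92, "door": 4.1}))  # ~0.5 m per unit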

  5. Identification of flap structure-specific endonuclease 1 as a factor involved in long-term memory formation of aversive learning.

    Science.gov (United States)

    Saavedra-Rodríguez, Lorena; Vázquez, Adrinel; Ortiz-Zuazaga, Humberto G; Chorna, Nataliya E; González, Fernando A; Andrés, Lissette; Rodríguez, Karen; Ramírez, Fernando; Rodríguez, Alan; Peña de Ortiz, Sandra

    2009-05-06

    We previously proposed that DNA recombination/repair processes play a role in memory formation. Here, we examined the possible role of the fen-1 gene, encoding a flap structure-specific endonuclease, in memory consolidation of conditioned taste aversion (CTA). Quantitative real-time PCR showed that amygdalar fen-1 mRNA induction was associated with the central processing of the illness experience related to CTA and with CTA itself, but not with the central processing resulting from the presentation of a novel flavor. CTA also increased expression of the Fen-1 protein in the amygdala, but not the insular cortex. In addition, double immunofluorescence analyses showed that amygdalar Fen-1 expression is mostly localized within neurons. Importantly, functional studies demonstrated that amygdalar antisense knockdown of fen-1 expression impaired consolidation, but not short-term memory, of CTA. Overall, these studies define the fen-1 endonuclease as a new DNA recombination/repair factor involved in the formation of long-term memories.

  6. MOS voltage automatic tuning circuit

    OpenAIRE

    李, 田茂; 中田, 辰則; 松本, 寛樹

    2004-01-01

    Abstract: An automatic tuning circuit adjusts frequency performance to compensate for process variation. A phase-locked loop (PLL) is a suitable oscillator for an integrated circuit: it is a feedback system that compares the input phase with the output phase and can make the output frequency equal to the input frequency. In this paper, a PLL formed of MOSFETs is presented. The presented circuit consists of an XOR circuit, a low-pass filter and a relaxation oscillator. On PSPICE simulation...

  7. The Mark II Automatic Diflux

    Directory of Open Access Journals (Sweden)

    Jean L Rasson

    2011-07-01

    Full Text Available We report here on the new realization of an automatic fluxgate theodolite able to perform unattended absolute geomagnetic declination and inclination measurements: the AUTODIF MKII. The main changes of this version compared with the former one are presented, as well as the improved specifications now expected. We also explain the absolute orientation procedure, which uses a laser beam and a corner cube, and the method for leveling the fluxgate sensor, which differs from that of a conventional DIflux theodolite.

  8. CLG for Automatic Image Segmentation

    OpenAIRE

    Christo Ananth; S.Santhana Priya; S.Manisha; T.Ezhil Jothi; M.S.Ramasubhaeswari

    2017-01-01

    This paper proposes an automatic segmentation method which effectively combines the Active Contour Model, the Live Wire method and the Graph Cut approach (CLG). The aim of the Live Wire method is to give the user control over the segmentation process during execution. The Active Contour Model provides a statistical model of object shape and appearance, built during a training phase, that is applied to a new image. In the graph cut technique, each pixel is represented as a node and the distance between those nodes is rep...

  9. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; the ALGOL 68. Other chapters discuss the general purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming language is also shown.

  10. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
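
    A toy version of the idea can be written with symbolic algebra (a sketch under simplifying assumptions only: a two-node series-R, shunt-C network described by its nodal equations rather than the netlist-matrix format described in the record):

        import sympy as sp

        s, R, C = sp.symbols('s R C', positive=True)
        V1, V2, Vin = sp.symbols('V1 V2 Vin')

        # Nodal description of a series-R, shunt-C low-pass filter:
        # node 1 is driven by the ideal source, node 2 obeys Kirchhoff's current law.
        equations = [sp.Eq(V1, Vin),
                     sp.Eq((V2 - V1) / R + s * C * V2, 0)]
        solution = sp.solve(equations, [V1, V2], dict=True)[0]

        H = sp.simplify(solution[V2] / Vin)   # transfer function Vout/Vin
        print(H)                              # -> 1/(C*R*s + 1)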

  11. Automatic wipers with mist control

    OpenAIRE

    Ashik K.P; A.N.Basavaraju

    2016-01-01

    This paper illustrates automatic wipers with mist control. Nowadays, accidents are common in commercial vehicles, and one of their causes is the formation of mist inside the vehicle during heavy rain. In rainy seasons the wiper on the windshield of a commercial vehicle has to be controlled by the driver himself, which distracts his concentration from driving. Also, when the rain lasts for a longer time (say about 15 minutes), the formation of mist on t...

  12. How CBO Estimates Automatic Stabilizers

    Science.gov (United States)

    2015-11-01

    the economy. Most types of revenues (mainly personal, corporate, and social insurance taxes) are sensitive to the business cycle and account for most of... Medicare taxes for self-employed people, taxes on production and imports, and unemployment insurance taxes. Those six categories account for the bulk of... federal tax revenues. Individual taxes account for most of the automatic stabilizers from revenues, followed by Social Security plus Medicare

  13. Group Dynamics in Automatic Imitation.

    Science.gov (United States)

    Gleibs, Ilka H; Wilson, Neil; Reddy, Geetha; Catmur, Caroline

    Imitation, matching the configural body movements of another individual, plays a crucial part in social interaction. We investigated whether automatic imitation is influenced not only by who we imitate (ingroup vs. outgroup member) but also by the nature of an expected interaction situation (competitive vs. cooperative). In line with assumptions from Social Identity Theory, we predicted that both social group membership and the expected situation impact on the level of automatic imitation. We adopted a 2 (group membership target: ingroup, outgroup) x 2 (situation: cooperative, competitive) design. The dependent variable was the degree to which participants imitated the target in a reaction time automatic imitation task. 99 female students from two British universities participated. We found a significant two-way interaction on the imitation effect. When interacting in expectation of cooperation, imitation was stronger for an ingroup target than for an outgroup target. However, this was not the case in the competitive condition, where imitation did not differ between ingroup and outgroup targets. This demonstrates that the goal structure of an expected interaction determines the extent to which intergroup relations influence imitation, supporting a social identity approach.

  14. Automatic programming for critical applications

    Science.gov (United States)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Usually, execution performance is an expected requirement in a software development process. Unfortunately, the verification and the maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plant systems. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed along with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  15. Single thrombopoietin dose alleviates hematopoietic stem cells intrinsic short- and long-term ionizing radiation damage. In vivo identification of anatomical cell expansion sites.

    Science.gov (United States)

    Tronik-Le Roux, Diana; Nicola, Marie-Anne; Vaigot, Pierre; Nurden, Paquita

    2015-01-01

    Hematopoietic stem cells (HSC) are essential for maintaining the integrity of complex and long-lived organisms. HSC, which are self-renewing, reconstitute the hematopoietic system throughout life and facilitate long-term repopulation of myeloablated recipients. We have previously demonstrated that when mice are exposed to sublethal doses of ionizing radiation, subsets of the stem/progenitor compartment are affected. In this study we examine the role of thrombopoietin (TPO) on the regenerative capacities of HSC after irradiation and report the first demonstration of efficacy of a single injection of TPO shortly after in vivo exposure to ionizing radiation for reducing HSC injury and improving their functional outcome. Our results demonstrate that TPO treatment not only reduced the number of apoptotic cells but also induced a significant modification of their intrinsic characteristics. These findings were supported by transplantation assays with long-term HSC that were irradiated or unirradiated, TPO treated or untreated, in CD45.1/CD45.2 systems and by using luciferase-labeled HSC for direct bioluminescence imaging in living animals. Of particular importance, our data demonstrate the skull to be a highly favorable site for the TPO-induced emergence of hematopoietic cells after irradiation, suggesting a TPO-mediated relationship of primitive hematopoietic cells to an anatomical component. Together, the data presented here provide novel findings about aspects of TPO action on stem cells, open new areas of investigation for therapeutic options in patients who are treated with radiation therapy, and show that early administration of a clinically suitable TPO-agonist counteracts the previously observed adverse effects.

  16. Development of an optimal filter substrate for the identification of small microplastic particles in food by micro-Raman spectroscopy.

    Science.gov (United States)

    Oßmann, Barbara E; Sarau, George; Schmitt, Sebastian W; Holtmannspötter, Heinrich; Christiansen, Silke H; Dicke, Wilhelm

    2017-06-01

    When analysing microplastics in food, it is important for toxicological reasons to achieve clear identification of particles down to a size of at least 1 μm. One reliable optical analytical technique allowing this is micro-Raman spectroscopy. After isolation of particles via filtration, analysis is typically performed directly on the filter surface. In order to obtain high-quality Raman spectra, the material of the membrane filters should not show any interference, in terms of background or Raman signals, during spectrum acquisition. To facilitate the usage of automatic particle detection, membrane filters should also show specific optical properties. In this work, besides eight different, commercially available membrane filters, three newly designed metal-coated polycarbonate membrane filters were tested to fulfil these requirements. We found that aluminium-coated polycarbonate membrane filters had ideal characteristics as a substrate for micro-Raman spectroscopy. Their spectrum shows no or minimal interference with particle spectra, depending on the laser wavelength. Furthermore, automatic particle detection can be applied when analysing the filter surface under dark-field illumination. With this new membrane filter, interference-free analysis of microplastics down to a size of 1 μm becomes possible. Thus, an important size class of these contaminants can now be visualized and spectrally identified. Graphical abstract A newly developed aluminium-coated polycarbonate membrane filter enables automatic particle detection and the generation of high-quality Raman spectra, allowing identification of small microplastics.

  17. Identification of long-term carbon sequestration in soils with historical inputs of biochar using novel stable isotope and spectroscopic techniques

    Science.gov (United States)

    Hernandez-Soriano, Maria C.; Kerré, Bart; Hardy, Brieuc; Dufey, Joseph; Smolders, Erik

    2013-04-01

    Biochar is the collective term for organic matter (OM) that has been produced by pyrolysis of biomass, e.g. during production of charcoal or during natural processes such as bush fires. Biochar production and application is now suggested as one of the economically feasible options for global C-sequestration strategies. The C-sequestration in soil through application of biochar is not only related to its persistence (its estimated lifetime in soil exceeds 1000 years), but also to indirect effects such as its potential to adsorb and increase OM stability in soil. Historical charcoal production sites that had been in use more than 200 years ago in beech/oak forests have been localized in the south of Belgium. Aerial photography identified black spots in arable land on former forest sites. Soil sampling was conducted in an arable field used for maize production near Mettet (Belgium), where charcoal production was intensive until the late 18th century. Soils were sampled in a horizontal gradient across the 'black soils', which extend over a few decametres, collecting soil from the spots (Biochar Amended, BA) as well as from the non-biochar-amended soil (NBA). Stable C isotope composition was used to estimate the long-term C-sequestration derived from crops in these soils, where maize had been produced for about 15 years. Because the C in the biochar originates in forest wood (C3 plants), its isotopic signature (δ13C) differs from that of maize (a C4 plant). The C and N content and the δ13C were determined for bulk soil samples and for microaggregate size fractions separated by wet sieving. Fourier Transform Infrared Spectroscopy (FTIR) coupled to optical microscopy was used to obtain fingerprints of biochar and OM composition for soil microaggregates. The total C content in the BA soil (5.5%) and the C/N ratio (16.9) were higher than for NBA (C content 2.7%; C/N ratio 12.6), which confirms the persistence of OM in the BA. The average isotopic signature of bulk soil from BA (-26.08) was slightly
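
    The isotopic reasoning can be condensed into a standard two-pool mixing model; the sketch below uses invented end-member values (typical C3 and C4 signatures) rather than the values measured in the study.

        def c4_fraction(delta_sample, delta_c3=-27.0, delta_c4=-12.5):
            """Two-pool mixing model: fraction of soil carbon derived from the C4 crop (maize).
            End-member delta-13C values are illustrative assumptions, not the study's data."""
            return (delta_sample - delta_c3) / (delta_c4 - delta_c3)

        # Example with the bulk-soil signature quoted above for the biochar-amended plot:
        print(round(c4_fraction(-26.08), 3))   # small maize-derived fraction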

  18. Identification and understanding the factors affecting the public and political acceptance of long term storage of spent fuel and high-level radioactive wastes

    International Nuclear Information System (INIS)

    Gorea, Valica

    2006-01-01

    At the end of 2004, according to information available to the IAEA, there were 440 nuclear reactors operating worldwide, with a total net capacity of 366.3 GW(e); 6 of them were connected to the grid in 2004 (2 in Ukraine, one each in China, Japan and the Russian Federation, and a reconnection in Canada), by comparison with 2 connections and 2 re-connections in 2003. Also at the end of 2004, 26 nuclear power plants were under construction, with a total net capacity of 20.8 GW(e). The generally accepted conclusion is that nuclear power is still in progress and represents one of the options for energy security in the medium and long term. If we refer to nuclear fusion, which may produce commercial electric power in 30-40 years in practically unlimited quantities, the above point becomes even more evident. Unfortunately, besides its beneficial characteristics, such as being clean and stable in output and price, nuclear power also has a negative characteristic which generally instils fear in people: radioactive waste. A classification of radioactive waste is not the target of this presentation; the point is simply that a nuclear power plant produces spent fuel over time, which is long-lived, highly radioactive and heat-generating. Other high-level wastes have similar characteristics (HLW = High Level Waste), for which reason these two categories of waste are treated together. The spent fuel and the high-level waste are stored on an interim basis for cooling, for around 50 years, after which they are transferred to the final repository where they will be kept for hundreds of years in the case of an open fuel cycle, which is also the case for the Cernavoda NPP. Taking into consideration that Cernavoda Unit 1 reaches the age of 10 years of commercial operation in December 2006, the issue of final disposal is not as urgent as it looks. The objectives of long-term management of radioactive waste are public health and protection of the environment

  19. De Novo Transcriptome Assembly and Identification of Gene Candidates for Rapid Evolution of Soil Al Tolerance in Anthoxanthum odoratum at the Long-Term Park Grass Experiment.

    Science.gov (United States)

    Gould, Billie; McCouch, Susan; Geber, Monica

    2015-01-01

    Studies of adaptation in the wild grass Anthoxanthum odoratum at the Park Grass Experiment (PGE) provided one of the earliest examples of rapid evolution in plants. Anthoxanthum has become locally adapted to differences in soil Al toxicity, which have developed there due to soil acidification from long-term experimental fertilizer treatments. In this study, we used transcriptome sequencing to identify Al stress responsive genes in Anthoxanthum and identify candidates among them for further molecular study of rapid Al tolerance evolution at the PGE. We examined the Al content of Anthoxanthum tissues and conducted RNA-sequencing of root tips, the primary site of Al-induced damage. We found that despite its high tolerance Anthoxanthum is not an Al-accumulating species. Genes similar to those involved in organic acid exudation (TaALMT1, ZmMATE), cell wall modification (OsSTAR1), and internal Al detoxification (OsNRAT1) in cultivated grasses were responsive to Al exposure. Expression of a large suite of novel loci was also triggered by early exposure to Al stress in roots. Three hundred forty-five transcripts were significantly more up- or down-regulated in tolerant vs. sensitive Anthoxanthum genotypes, providing important targets for future study of rapid evolution at the PGE.

  20. De Novo Transcriptome Assembly and Identification of Gene Candidates for Rapid Evolution of Soil Al Tolerance in Anthoxanthum odoratum at the Long-Term Park Grass Experiment.

    Directory of Open Access Journals (Sweden)

    Billie Gould

    Full Text Available Studies of adaptation in the wild grass Anthoxanthum odoratum at the Park Grass Experiment (PGE) provided one of the earliest examples of rapid evolution in plants. Anthoxanthum has become locally adapted to differences in soil Al toxicity, which have developed there due to soil acidification from long-term experimental fertilizer treatments. In this study, we used transcriptome sequencing to identify Al stress responsive genes in Anthoxanthum and identify candidates among them for further molecular study of rapid Al tolerance evolution at the PGE. We examined the Al content of Anthoxanthum tissues and conducted RNA-sequencing of root tips, the primary site of Al-induced damage. We found that despite its high tolerance Anthoxanthum is not an Al-accumulating species. Genes similar to those involved in organic acid exudation (TaALMT1, ZmMATE), cell wall modification (OsSTAR1), and internal Al detoxification (OsNRAT1) in cultivated grasses were responsive to Al exposure. Expression of a large suite of novel loci was also triggered by early exposure to Al stress in roots. Three hundred forty-five transcripts were significantly more up- or down-regulated in tolerant vs. sensitive Anthoxanthum genotypes, providing important targets for future study of rapid evolution at the PGE.

  1. Automatic positioning control device for automatic control rod exchanger

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To attain accurate positioning for a control rod exchanger. Constitution: The present position of the automatic control rod exchanger is detected by a synchro generator. An aimed stopping position for the exchanger, a stop instruction range that accounts for the operational delay in the control system and the inertia-running distance of the mechanical system, and a coincidence confirmation range depending on the required positioning accuracy are set in advance. If there is a difference between the present position and the aimed stopping position, the automatic exchanger is caused to run toward the aimed stopping position. A stop instruction is generated upon arrival within the stop instruction range, and a coincidence confirmation signal is generated upon arrival within the coincidence confirmation range. Since uncertain factors that influence the positioning accuracy, such as the operational delay in the control system and the inertia-running distance of the mechanical system, are made definite by actual measurement or similar methods, and the stop instruction range and the coincidence confirmation range are set based on the measured data, the positioning accuracy can be improved. (Ikeda, J.)
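
    The logic described above can be sketched as follows (an illustration only; the ranges, step size and inertia travel are invented numbers standing in for the measured data mentioned in the record):

        def position_exchanger(present, target, stop_range, confirm_range,
                               step=1.0, inertia=0.5):
            """Run toward the target, issue the stop instruction inside `stop_range`,
            let the mechanism coast by `inertia`, then check the coincidence range."""
            direction = 1.0 if target >= present else -1.0
            while abs(target - present) > stop_range:
                present += direction * step          # commanded travel
            present += direction * inertia           # inertia running after the stop instruction
            coincidence = abs(target - present) <= confirm_range
            return present, coincidence

        print(position_exchanger(present=0.0, target=10.3, stop_range=0.8, confirm_range=0.3))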

  2. An automatic system to study sperm motility and energetics

    OpenAIRE

    Shi, LZ; Nascimento, JM; Chandsawangbhuwana, C; Botvinick, EL; Berns, MW

    2008-01-01

    An integrated robotic laser and microscope system has been developed to automatically analyze individual sperm motility and energetics. The custom-designed optical system directs near-infrared laser light into an inverted microscope to create a single-point 3-D gradient laser trap at the focal spot of the microscope objective. A two-level computer structure is described that quantifies the sperm motility (in terms of swimming speed and swimming force) and energetics (measuring mid-piece membr...

  3. Exogenous (automatic) attention to emotional stimuli: a review

    OpenAIRE

    Carretié, Luis

    2014-01-01

    Current knowledge on the architecture of exogenous attention (also called automatic, bottom-up, or stimulus-driven attention, among other terms) has been mainly obtained from studies employing neutral, anodyne stimuli. Since, from an evolutionary perspective, exogenous attention can be understood as an adaptive tool for rapidly detecting salient events, reorienting processing resources to them, and enhancing processing mechanisms, emotional events (which are, by definition, salient for the in...

  4. Automatic calculations of electroweak processes

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1996-01-01

    The GRACE system is an excellent tool for calculating cross sections and for automatically generating events of elementary processes. However, it is not always easy for beginners to use. An interactive version of GRACE is being developed so as to provide a user-friendly system. Since it works in exactly the same environment as PAW, all functions of PAW are available for handling any histogram information produced by GRACE. As an application, the cross sections of all elementary processes with up to 5-body final states induced by e + e - interaction are going to be calculated and summarized as a catalogue. (author)

  5. Automatic Strain-Rate Controller,

    Science.gov (United States)

    1976-12-01

    [Scanned report text, largely illegible after digitization. Recoverable fragments: Automatic Strain-Rate Controller, Rome Air Development Center, Griffiss AFB, December 1976, R. L. Huntsinger and J. A. Adamski. The controller is a Leeds and Northrup Series 80 CAT with proportional band, rate, reset and approach controls, taking its input from the deviation output. Surviving procedural steps describe moving the set-point slowly up to 3 or 4 and, if the recorder pointer hunts, adjusting the function controls.]

  6. Commutated automatic gain control system

    Science.gov (United States)

    Yost, S. R.

    1982-01-01

    A commutated automatic gain control (AGC) system was designed and built for a prototype Loran C receiver. The receiver uses a microcomputer to control a memory-aided phase-locked loop (MAPLL). The microcomputer also controls the input/output, latitude/longitude conversion, and the recently added AGC system. The circuit designed for the AGC is described, and bench and flight test results are presented. The AGC circuit samples starting at a point 40 microseconds after a zero crossing, as determined by the software lock pulse that is ultimately generated by a 30-microsecond delay-and-add network in the receiver front-end envelope detector.

  7. Automatic liquid nitrogen feeding device

    International Nuclear Information System (INIS)

    Gillardeau, J.; Bona, F.; Dejachy, G.

    1963-01-01

    An automatic liquid nitrogen feeding device has been developed and used in the framework of corrosion tests performed with constantly renewed uranium hexafluoride. The issue was to feed liquid nitrogen to a large-capacity metallic trap in order to condense uranium hexafluoride at the exit of the corrosion chambers. After a study of the various available devices, a feeding device was specifically designed to be robust, safe and autonomous, as well as to ensure a high liquid nitrogen flow rate and a high feeding frequency. The device, made of standard material, has been used for 4000 hours without any problem [fr

  8. Automatic alignment of radionuclide images

    International Nuclear Information System (INIS)

    Barber, D.C.

    1982-01-01

    The variability of the position, dimensions and orientation of a radionuclide image within the field of view of a gamma camera hampers attempts to analyse the image numerically. This paper describes a method of using a set of training images of a particular type, in this case right lateral brain images, to define the likely variations in the position, dimensions and orientation for that type of image and to provide alignment data for a program that automatically aligns new images of the specified type to a standard position, size and orientation. Examples are given of the use of this method on three types of radionuclide image. (author)

  9. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1, where J is the number of users helping that user.

  10. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describes a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine including running ALGOL pro

  11. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor automaticity associated motor deficits in PD, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of using measures of automaticity in early diagnosis of PD would be valuable. PMID:26102020

  12. Long-term chemical analysis and organic aerosol source apportionment at nine sites in central Europe: source identification and uncertainty assessment

    Science.gov (United States)

    Daellenbach, Kaspar R.; Stefenelli, Giulia; Bozzetti, Carlo; Vlachou, Athanasia; Fermo, Paola; Gonzalez, Raquel; Piazzalunga, Andrea; Colombi, Cristina; Canonaco, Francesco; Hueglin, Christoph; Kasper-Giebl, Anne; Jaffrezo, Jean-Luc; Bianchi, Federico; Slowik, Jay G.; Baltensperger, Urs; El-Haddad, Imad; Prévôt, André S. H.

    2017-11-01

    Long-term monitoring of organic aerosol is important for epidemiological studies, validation of atmospheric models, and air quality management. In this study, we apply a recently developed filter-based offline methodology using an aerosol mass spectrometer (AMS) to investigate the regional and seasonal differences of contributing organic aerosol sources. We present offline AMS measurements for particulate matter smaller than 10 µm at nine stations in central Europe with different exposure characteristics for the entire year of 2013 (819 samples). The focus of this study is a detailed source apportionment analysis (using positive matrix factorization, PMF) including in-depth assessment of the related uncertainties. Primary organic aerosol (POA) is separated into three components: hydrocarbon-like OA related to traffic emissions (HOA), cooking OA (COA), and biomass burning OA (BBOA). We observe enhanced production of secondary organic aerosol (SOA) in summer, following the increase in biogenic emissions with temperature (summer oxygenated OA, SOOA). In addition, a SOA component was extracted that correlated with an anthropogenic secondary inorganic species that is dominant in winter (winter oxygenated OA, WOOA). A factor (sulfur-containing organic, SC-OA) explaining sulfur-containing fragments (CH3SO2+), which has an event-driven temporal behaviour, was also identified. The relative yearly average factor contributions range from 4 to 14 % for HOA, from 3 to 11 % for COA, from 11 to 59 % for BBOA, from 5 to 23 % for SC-OA, from 14 to 27 % for WOOA, and from 15 to 38 % for SOOA. The uncertainty of the relative average factor contribution lies between 2 and 12 % of OA. At the sites north of the alpine crest, the sum of HOA, COA, and BBOA (POA) contributes less to OA (POA / OA = 0.3) than at the southern alpine valley sites (0.6). BBOA is the main contributor to POA with 87 % in alpine valleys and 42 % north of the alpine crest. Furthermore, the influence of primary
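
    To make the factorization idea concrete, the sketch below runs a plain non-negative matrix factorization on a synthetic data matrix; this is only an illustration of splitting samples into factor profiles and contributions, not the weighted, error-model-based PMF used in the study, and the matrix dimensions are invented apart from the 819 filter samples mentioned above.

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(1)
        X = rng.random((819, 120))          # hypothetical samples x mass-spectral variables

        model = NMF(n_components=6, init="nndsvda", max_iter=500, random_state=0)
        G = model.fit_transform(X)          # factor contributions per filter sample
        F = model.components_               # factor profiles (HOA, COA, BBOA, ... in the study)

        factor_mass = G.sum(axis=0) * F.sum(axis=1)   # total reconstructed signal per factor
        print(factor_mass / factor_mass.sum())        # relative average factor contributions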

  13. Body odors promote automatic imitation in autism.

    Science.gov (United States)

    Parma, Valentina; Bulgheroni, Maria; Tirindelli, Roberto; Castiello, Umberto

    2013-08-01

    Autism spectrum disorders comprise a range of neurodevelopmental pathologies characterized, among other symptoms, by impaired social interactions. Individuals with this diagnosis are reported to often identify people by repetitively sniffing pieces of clothing or the body odor of family members. Since body odors are known to initiate and mediate many different social behaviors, smelling the body odor of a family member might constitute a sensory-based action promoting social contact. In light of this, we hypothesized that the body odor of a family member would facilitate the appearance of automatic imitation, an essential social skill known to be impaired in autism. We recruited 20 autistic and 20 typically developing children. Body odors were collected from the children's mothers' axillae. A child observed a model (their mother or a stranger mother) execute (or not) a reach-to-grasp action toward an object. Subsequently, she performed the same action. The object was imbued with the child's mother's odor, a stranger mother's odor, or no odor. The actions were videotaped, and movement time was calculated post hoc via a digitalization technique. Automatic imitation effects, expressed in terms of total movement time reduction, appear in autistic children only when exposed to objects paired with their own mother's odor. The maternal odor, which conveys a social message otherwise neglected, helps autistic children to covertly imitate the actions of others. Our results represent a starting point holding theoretical and practical relevance for the development of new strategies to enhance communication and social behavior among autistic individuals. Copyright © 2013 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  14. Digital movie-based on automatic titrations.

    Science.gov (United States)

    Lima, Ricardo Alexandre C; Almeida, Luciano F; Lyra, Wellington S; Siqueira, Lucas A; Gaião, Edvaldo N; Paiva Junior, Sérgio S L; Lima, Rafaela L F C

    2016-01-15

    This study proposes the use of digital movies (DMs) in a flow-batch analyzer (FBA) to perform automatic, fast and accurate titrations. The term used for this process is "Digital movie-based on automatic titrations" (DMB-AT). A webcam records the DM during the addition of the titrant to the mixing chamber (MC). While the DM is recorded, it is decompiled into frames ordered sequentially at a constant rate of 26 frames per second (FPS). The first frame is used as a reference to define the region of interest (ROI) of 28×13 pixels and the R, G and B values, which are used to calculate the Hue (H) values for each frame. The Pearson's correlation coefficient (r) is calculated between the H values of the initial frame and each subsequent frame. The titration curves are plotted in real time using the r values and the opening time of the titrant valve. The end point is estimated by the second derivative method. Software written in the C language manages all analytical steps and data treatment in real time. The feasibility of the method was attested by application to acid/base test samples and edible oils. Results were compared with classical titration and did not present statistically significant differences when the paired t-test at the 95% confidence level was applied. The proposed method is able to process about 117-128 samples per hour for the test and edible oil samples, respectively, and its precision was confirmed by overall relative standard deviation (RSD) values, always less than 1.0%. Copyright © 2015 Elsevier B.V. All rights reserved.
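
    The frame-processing chain lends itself to a short sketch (assumptions: frames are already available as RGB arrays with values in [0, 1], the ROI position is arbitrary here, and the timing against the titrant-valve opening is omitted):

        import numpy as np
        from matplotlib.colors import rgb_to_hsv

        def titration_curve(frames, roi=(slice(0, 13), slice(0, 28))):
            """Pearson r between the ROI hue values of the first frame and each later frame."""
            hue_ref = rgb_to_hsv(frames[0][roi])[..., 0].ravel()
            curve = []
            for frame in frames[1:]:
                hue = rgb_to_hsv(frame[roi])[..., 0].ravel()
                curve.append(np.corrcoef(hue_ref, hue)[0, 1])
            return np.asarray(curve)

        def end_point_index(curve):
            """Locate the end point at the extremum of the numerical second derivative."""
            second = np.gradient(np.gradient(curve))
            return int(np.argmax(np.abs(second)))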

  15. The RNA world, automatic sequences and oncogenetics

    Energy Technology Data Exchange (ETDEWEB)

    Tahir Shah, K

    1993-04-01

    We construct a model of the RNA world in terms of naturally evolving nucleotide sequences assuming only Crick-Watson base pairing and self-cleaving/splicing capability. These sequences have the following properties. (1) They are recognizable by an automaton (or automata). That is, for each k-sequence there exists a k-automaton which accepts, recognizes or generates the k-sequence. These are known as automatic sequences. Fibonacci and Morse-Thue sequences are the most natural outcome of pre-biotic chemical conditions. (2) Infinite (resp. large) sequences are self-similar (resp. nearly self-similar) under certain rewrite rules and consequently give rise to fractal (resp. fractal-like) structures. Computationally, such sequences can also be generated by their corresponding deterministic parallel rewrite system, known as a D0L system. The self-similar sequences are fixed points of their respective rewrite rules. Some of these automatic sequences have the capability to read or 'accept' other sequences, while others can detect errors and trigger error-correcting mechanisms. They can be enlarged and have block and/or palindrome structure. Linear recurring sequences such as the Fibonacci sequence are simply feedback shift registers, a well-known model of information processing machines. We show that a mutation of any rewrite rule can cause a combinatorial explosion of errors, and relate this to oncogenetic behavior. On the other hand, a mutation of sequences that are not rewrite rules leads to normal evolutionary change. Known experimental results support our hypothesis. (author). Refs.
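
    The rewrite-rule viewpoint is easy to make concrete; the sketch below expands a deterministic parallel (D0L) system and reproduces the Thue-Morse and Fibonacci words mentioned above (the axioms and rules are the standard textbook ones).

        def dol_expand(axiom, rules, steps):
            """Apply a deterministic parallel rewrite (D0L) system `steps` times."""
            word = axiom
            for _ in range(steps):
                word = "".join(rules[symbol] for symbol in word)
            return word

        # Thue-Morse: 0 -> 01, 1 -> 10 ; Fibonacci word: a -> ab, b -> a
        print(dol_expand("0", {"0": "01", "1": "10"}, 4))   # 0110100110010110
        print(dol_expand("a", {"a": "ab", "b": "a"}, 5))    # abaababaabaab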

  16. The RNA world, automatic sequences and oncogenetics

    International Nuclear Information System (INIS)

    Tahir Shah, K.

    1993-04-01

    We construct a model of the RNA world in terms of naturally evolving nucleotide sequences assuming only Crick-Watson base pairing and self-cleaving/splicing capability. These sequences have the following properties. 1) They are recognizable by an automaton (or automata). That is, for each k-sequence there exists a k-automaton which accepts, recognizes or generates the k-sequence. These are known as automatic sequences. Fibonacci and Morse-Thue sequences are the most natural outcome of pre-biotic chemical conditions. 2) Infinite (resp. large) sequences are self-similar (resp. nearly self-similar) under certain rewrite rules and consequently give rise to fractal (resp. fractal-like) structures. Computationally, such sequences can also be generated by their corresponding deterministic parallel rewrite system, known as a D0L system. The self-similar sequences are fixed points of their respective rewrite rules. Some of these automatic sequences have the capability to read or 'accept' other sequences, while others can detect errors and trigger error-correcting mechanisms. They can be enlarged and have block and/or palindrome structure. Linear recurring sequences such as the Fibonacci sequence are simply feedback shift registers, a well-known model of information processing machines. We show that a mutation of any rewrite rule can cause a combinatorial explosion of errors, and relate this to oncogenetic behavior. On the other hand, a mutation of sequences that are not rewrite rules leads to normal evolutionary change. Known experimental results support our hypothesis. (author). Refs

  17. Automatic evaluations and exercise setting preference in frequent exercisers.

    Science.gov (United States)

    Antoniewicz, Franziska; Brand, Ralf

    2014-12-01

    The goals of this study were to test whether exercise-related stimuli can elicit automatic evaluative responses and whether automatic evaluations reflect exercise setting preference in highly active exercisers. An adapted version of the Affect Misattribution Procedure was employed. Seventy-two highly active exercisers (26 years ± 9.03; 43% female) were subliminally primed (7 ms) with pictures depicting typical fitness center scenarios or gray rectangles (control primes). After each prime, participants consciously evaluated the "pleasantness" of a Chinese symbol. Controlled evaluations were measured with a questionnaire and were more positive in participants who regularly visited fitness centers than in those who reported avoiding this exercise setting. Only center exercisers gave automatic positive evaluations of the fitness center setting (partial eta squared = .08). It is proposed that a subliminal Affect Misattribution Procedure paradigm can elicit automatic evaluations to exercising and that, in highly active exercisers, these evaluations play a role in decisions about the exercise setting rather than the amounts of physical exercise. Findings are interpreted in terms of a dual systems theory of social information processing and behavior.

  18. Automatic detection and visualisation of MEG ripple oscillations in epilepsy

    Directory of Open Access Journals (Sweden)

    Nicole van Klink

    2017-01-01

    Full Text Available High frequency oscillations (HFOs, 80–500 Hz) in invasive EEG are a biomarker for the epileptic focus. Ripples (80–250 Hz) have also been identified in non-invasive MEG, yet detection is impeded by noise, their low occurrence rates, and the workload of visual analysis. We propose a method that identifies ripples in MEG through noise reduction, beamforming and automatic detection with minimal user effort. We analysed 15 min of presurgical resting-state interictal MEG data of 25 patients with epilepsy. The MEG signal-to-noise ratio was improved by using a cross-validation signal space separation method, and by calculating ~2400 beamformer-based virtual sensors in the grey matter. Ripples in these sensors were automatically detected by an algorithm optimized for MEG. A small subset of the identified ripples was visually checked. Ripple locations were compared with MEG spike dipole locations and the resection area if available. Running the automatic detection algorithm resulted in on average 905 ripples per patient, of which on average 148 ripples were visually reviewed. Reviewing took approximately 5 min per patient, and identified ripples in 16 out of 25 patients. In 14 patients the ripple locations showed good or moderate concordance with the MEG spikes. For six out of eight patients who had surgery, the ripple locations showed concordance with the resection area: 4/5 with good outcome and 2/3 with poor outcome. Automatic ripple detection in beamformer-based virtual sensors is a feasible non-invasive tool for the identification of ripples in MEG. Our method requires minimal user effort and is easily applicable in a clinical setting.
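
    For illustration, a deliberately simple ripple detector on a single virtual-sensor trace might look like the sketch below (the band limits, threshold and minimum duration are generic choices, not the optimized MEG algorithm used in the study; the sampling rate fs must exceed twice the upper band edge):

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def detect_ripples(signal, fs, band=(80.0, 250.0), threshold_sd=3.0, min_cycles=4):
            """Band-pass the trace, take the Hilbert envelope, and return (start, stop)
            sample indices of supra-threshold runs lasting at least `min_cycles` cycles."""
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
            envelope = np.abs(hilbert(filtfilt(b, a, signal)))
            above = envelope > envelope.mean() + threshold_sd * envelope.std()
            min_samples = int(min_cycles * fs / band[0])
            events, start = [], None
            for i, flag in enumerate(np.append(above, False)):
                if flag and start is None:
                    start = i
                elif not flag and start is not None:
                    if i - start >= min_samples:
                        events.append((start, i))
                    start = None
            return events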

  19. SRV-automatic handling device

    International Nuclear Information System (INIS)

    Yamada, Koji

    1987-01-01

    An automatic handling device for the steam relief valves (SRVs) has been developed in order to decrease worker exposure, increase the availability factor, improve reliability, improve operational safety, and save labor. A survey was made during a periodical inspection to examine the actual SRV handling operation. The SRV automatic handling device consists of four components: a conveyor, an armed conveyor, a lifting machine, and a control/monitoring system. The conveyor is designed so that the existing I-rail installed in the containment vessel can be used without any modification; it is employed for conveying an SRV along the rail. The armed conveyor, designed for a box rail, is used for an SRV installed away from the rail. By using the lifting machine, an SRV installed away from the I-rail is brought to a spot just below the rail so that the SRV can be transferred by the conveyor. The control/monitoring system consists of a control computer, an operation panel, a TV monitor and an annunciator. The SRV handling device is operated by remote control from a control room. A trial equipment was constructed and performance/function testing was carried out using actual SRVs. As a result, it is shown that the SRV handling device requires only two operators to serve satisfactorily. The required time for removal and replacement of one SRV is about 10 minutes. (Nogami, K.)

  20. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

    A new automatic uranium analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed. It can also vary the FIA operation modes, so that the instrument has the functions of a universal instrument. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in the aqueous solution after adding cetylpyridinium bromide (CPB). Uranium in solution in the range 0.02-500 mg · L -1 can be directly determined without any pretreatment. A sample throughput rate of 30-90 h -1 and a reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant

  1. An automatic holographic adaptive phoropter

    Science.gov (United States)

    Amirsolaimani, Babak; Peyghambarian, N.; Schwiegerling, Jim; Bablumyan, Arkady; Savidis, Nickolaos; Peyman, Gholam

    2017-08-01

    Phoropters are the most common instrument used to detect refractive errors. During a refractive exam, lenses are flipped in front of the patient, who looks at the eye chart and tries to read the symbols. The procedure is fully dependent on the cooperation of the patient to read the eye chart, provides only a subjective measurement of visual acuity, and can at best provide a rough estimate of the patient's vision. Phoropters are difficult to use for mass screenings because they require a skilled examiner, and young children and the elderly are hard to screen. We have developed a simplified, lightweight automatic phoropter that can measure the optical error of the eye objectively without requiring the patient's input. The automatic holographic adaptive phoropter is based on a Shack-Hartmann wavefront sensor and three computer-controlled fluidic lenses. The fluidic lens system is designed to provide spherical and astigmatic corrections over a large range, without the need for verbal feedback from the patient, in less than 20 seconds.

  2. Automatic welding machine for piping

    International Nuclear Information System (INIS)

    Yoshida, Kazuhiro; Koyama, Takaichi; Iizuka, Tomio; Ito, Yoshitoshi; Takami, Katsumi.

    1978-01-01

    A remotely controlled automatic special welding machine for piping was developed. This machine can be used effectively for long-distance pipelines, chemical plants, thermal power plants and nuclear power plants from the viewpoint of good quality control, reduction of labor and good controllability. The functions of this welding machine are to inspect the shape and dimensions of the edge preparation by touch sensing before welding; to detect the temperature of the melt pool, inspect the bead form by touch sensing, and check the welding state by ITV during welding; and to grind the bead surface and inspect the weld metal by ultrasonic testing automatically after welding. The construction of this welding system, the main specifications of the apparatus, the welding procedure in detail, the electrical source of this welding machine, the cooling system, the structure and handling of the guide ring, the central control system and the operating characteristics are explained. The working procedure, the benefits of using this welding machine, and its application to nuclear power plants and other industrial fields are outlined. The HIDIC 08 is used as the controlling computer. This welding machine is useful for welding SUS piping as well as carbon steel piping. (Nakai, Y.)

  3. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization. © 2014 The Author(s) Computer Graphics Forum © 2014 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.

  4. Automatic Computer Mapping of Terrain

    Science.gov (United States)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost, and improve the accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations, using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs are depicted, but were greatly reduced by using preprocessing techniques.

  5. ACIR: automatic cochlea image registration

    Science.gov (United States)

    Al-Dhamari, Ibraheem; Bauer, Sabine; Paulus, Dietrich; Lissek, Friedrich; Jacob, Roland

    2017-02-01

    Efficient Cochlear Implant (CI) surgery requires prior knowledge of the cochlea's size and its characteristics. This information helps to select suitable implants for different patients. To get these measurements, a segmentation method for cochlea medical images is needed. An important pre-processing step for good cochlea segmentation is efficient image registration. The cochlea's small size and complex structure, in addition to the different resolutions and head positions during imaging, pose a major challenge for the automated registration of the different image modalities. In this paper, an Automatic Cochlea Image Registration (ACIR) method for multi-modal human cochlea images is proposed. This method is based on using small areas that have clear structures in both input images instead of registering the complete image. It uses the Adaptive Stochastic Gradient Descent Optimizer (ASGD) and Mattes's Mutual Information metric (MMI) to estimate 3D rigid transform parameters. State-of-the-art medical image registration optimizers published over the last two years are studied and compared quantitatively using the standard Dice Similarity Coefficient (DSC). ACIR requires only 4.86 seconds on average to align cochlea images automatically and to put all the modalities in the same spatial locations without human interference. The source code is based on the tool elastix and is provided for free as a 3D Slicer plugin. Another contribution of this work is a proposed public cochlea standard dataset which can be downloaded for free from a public XNAT server.
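
    The published tool is built on elastix with an ASGD optimizer; as a rough, hedged analogue only, the SimpleITK sketch below sets up a 3D rigid registration driven by Mattes mutual information and plain gradient descent. File names and parameter values are assumptions, not taken from the paper.

```python
# Rough analogue only: the paper's pipeline uses elastix with an ASGD optimizer;
# this SimpleITK sketch illustrates MI-driven 3D rigid registration in general.
import SimpleITK as sitk

fixed = sitk.ReadImage("cochlea_ct.nii.gz", sitk.sitkFloat32)    # assumed file names
moving = sitk.ReadImage("cochlea_mr.nii.gz", sitk.sitkFloat32)

initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=32)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.1)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()
reg.SetInitialTransform(initial, inPlace=False)

rigid = reg.Execute(fixed, moving)                 # estimated 3D rigid transform
aligned = sitk.Resample(moving, fixed, rigid, sitk.sitkLinear, 0.0)
```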

  6. Automatic referral to cardiac rehabilitation.

    Science.gov (United States)

    Fischer, Jane P

    2008-01-01

    The pervasive negative impact of cardiovascular disease in the United States is well documented. Although advances have been made, the campaign to reduce the occurrence, progression, and mortality continues. Determining evidence-based data is only half the battle. Implementing new and updated clinical guidelines into daily practice is a challenging task. Cardiac rehabilitation is an example of a proven intervention whose benefit is hindered through erratic implementation. The American Association of Cardiovascular and Pulmonary Rehabilitation (AACVPR), the American College of Cardiology (ACC), and the American Heart Association (AHA) have responded to this problem by publishing the AACVPR/ACC/AHA 2007 Performance Measures on Cardiac Rehabilitation for Referral to and Delivery of Cardiac Rehabilitation/Secondary Prevention Services. This new national guideline recommends automatic referral to cardiac rehabilitation for every eligible patient (performance measure A-1). This article offers guidance for the initiation of an automatic referral system, including individualizing your protocol with regard to electronic or paper-based order entry structures.

  7. Automatic identification of Fouarate Merja diatoms: An ...

    African Journals Online (AJOL)

    SARAH

    30 Sept. 2015 ... Automatic identification of Fouarate Merja diatoms: An alternative to manual determination and classification .... 21 genera, with an error of 4% because several organisms .... Electronic, Technologies of Information and.

  8. Fuel number identification method and device

    International Nuclear Information System (INIS)

    Doi, Takami; Seno, Makoto; Kikuchi, Takashi; Sakamoto, Hiromi; Takahashi, Masaki; Tanaka, Keiji.

    1997-01-01

    The present invention provides a method of and a device for automatically identifying fuel numbers impressed on fuel assemblies disposed in a fuel reprocessing facility, power plant or reactor core at high speed and with a high identification rate. Three or more character images are photographed for each fuel assembly under different illumination conditions. As a result, a different character image is obtained for the same impressed characters for each illumination direction. Learning on a neural network system is applied to the images of all of the characters impressed on the fuel assembly obtained under illumination from the predetermined directions. Then, one identification result per illumination direction is obtained for each character to be identified. Since the final identification is determined by majority vote over these results, highly accurate automatic identification can be realized. (I.S.)
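
    The record describes combining one recognition result per illumination direction by majority decision. A minimal sketch of that voting step follows; the character recognizer itself is omitted, and the sample readings are invented.

```python
# Minimal sketch of the majority-vote step: combine per-illumination character
# readings into one identification. The recognizer itself is not shown.
from collections import Counter

def vote_per_character(readings):
    """readings: one recognized string per illumination direction, equal lengths."""
    assert readings and all(len(r) == len(readings[0]) for r in readings)
    voted = []
    for chars in zip(*readings):                    # same character position, all views
        winner, _ = Counter(chars).most_common(1)[0]
        voted.append(winner)
    return "".join(voted)

# e.g. three illumination directions reading the same impressed ID
print(vote_per_character(["F2B1O4", "F2B104", "F28104"]))   # -> "F2B104"
```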

  9. A bar-code reader for an alpha-beta automatic counting system - FAG

    International Nuclear Information System (INIS)

    Levinson, S.; Shemesh, Y.; Ankry, N.; Assido, H.; German, U.; Peled, O.

    1996-01-01

    A bar-code laser system for sample number reading was integrated into the FAG Alpha-Beta automatic counting system. The sample identification by means of an attached bar-code label enables unmistakable and reliable attribution of results to the counted sample. Installation of the bar-code reader system required several modifications: Mechanical changes in the automatic sample changer, design and production of new sample holders, modification of the sample planchettes, changes in the electronic system, update of the operating software of the system (authors)

  10. A bar-code reader for an alpha-beta automatic counting system - FAG

    Energy Technology Data Exchange (ETDEWEB)

    Levinson, S; Shemesh, Y; Ankry, N; Assido, H; German, U; Peled, O [Israel Atomic Energy Commission, Beersheba (Israel). Nuclear Research Center-Negev

    1996-12-01

    A bar-code laser system for sample number reading was integrated into the FAG Alpha-Beta automatic counting system. The sample identification by means of an attached bar-code label enables unmistakable and reliable attribution of results to the counted sample. Installation of the bar-code reader system required several modifications: Mechanical changes in the automatic sample changer, design and production of new sample holders, modification of the sample planchettes, changes in the electronic system, update of the operating software of the system (authors).

  11. A Fully Automatic Fresh Apple Juicer: Peeling, Coring, Slicing and Juicing

    Directory of Open Access Journals (Sweden)

    Hu Fuwen

    2017-01-01

    Full Text Available Taking fresh apple juice as an example, a fully automatic and intelligent juicer prototype was built through the integrated application of servo positioning modules, a human-machine interface, an image vision sensor system and 3D printing. All steps, including peeling, coring, slicing and juicing, were performed automatically. The challenging technical problems of identifying and orienting the apple core and of adaptive peeling were solved creatively. The trial operation results showed that fresh apple juice can be produced without manual intervention and that the system has potential applications in crowded sites such as malls, schools, restaurants and hospitals.

  12. Automatic adventitious respiratory sound analysis: A systematic review.

    Directory of Open Access Journals (Sweden)

    Renard Xaviero Adhi Pramono

    Full Text Available Automatic detection or classification of adventitious sounds is useful to assist physicians in diagnosing or monitoring diseases such as asthma, Chronic Obstructive Pulmonary Disease (COPD), and pneumonia. While computerised respiratory sound analysis, specifically for the detection or classification of adventitious sounds, has recently been the focus of an increasing number of studies, a standardised approach and comparison has not been well established. To provide a review of existing algorithms for the detection or classification of adventitious respiratory sounds, this systematic review provides a complete summary of methods used in the literature to give a baseline for future works. A systematic review of English articles published between 1938 and 2016, searched using the Scopus (1938-2016) and IEEExplore (1984-2016) databases. Additional articles were further obtained from references listed in the articles found. Search terms included adventitious sound detection, adventitious sound classification, abnormal respiratory sound detection, abnormal respiratory sound classification, wheeze detection, wheeze classification, crackle detection, crackle classification, rhonchi detection, rhonchi classification, stridor detection, stridor classification, pleural rub detection, pleural rub classification, squawk detection, and squawk classification. Only articles were included that focused on adventitious sound detection or classification, based on respiratory sounds, with performance reported and sufficient information provided to be approximately repeated. Investigators extracted data about the adventitious sound type analysed, approach and level of analysis, instrumentation or data source, location of sensor, amount of data obtained, data management, features, methods, and performance achieved. A total of 77 reports from the literature were included in this review. 55 (71.43%) of the studies focused on wheeze, 40 (51.95%) on crackle, 9 (11.69%) on stridor, 9

  13. Telecommand and monitoring of automatic recloser through cell phone communication; Telecomando e monitoramento de religadoras automaticas via comunicacao celular

    Energy Technology Data Exchange (ETDEWEB)

    Gardiman, Vitor Luiz G.; Pires Neto, Francisco M.; Rufini, Ricardo; Marques, Rogerio [Bandeirante Energia, Sao Paulo, SP (Brazil)

    2004-02-01

    This article presents the telecommand and monitoring system for automatic reclosers of the medium voltage distribution network adopted by Bandeirante Energia, Brazil, using the cellular phone network. The system, installed on 110 automatic reclosers, provided short-term availability of telesupervision and telecontrol, fast installation, and low operation costs with good reliability.

  14. Interactivity in automatic control: foundations and experiences

    OpenAIRE

    Dormido Bencomo, Sebastián; Guzmán Sánchez, José Luis; Costa Castelló, Ramon; Berenguel, M

    2012-01-01

    The first part of this paper presents the concepts of interactivity and visualization and their essential role in learning the fundamentals and techniques of automatic control. The authors' more than 10 years of experience in the development and design of interactive tools dedicated to the study of automatic control concepts is also presented. The second part of the paper summarizes the main features of the “Automatic Control with Interactive Tools” text that has been recently published by Pea...

  15. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs ... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented ...

  16. Isotope Identification

    Energy Technology Data Exchange (ETDEWEB)

    Karpius, Peter Joseph [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-18

    The objective of this training module is to examine the process of using gamma spectroscopy for radionuclide identification; apply pattern recognition to gamma spectra; identify methods of verifying energy calibration; and discuss potential causes of isotope misidentification.

  17. Semi-Automatic Removal of Foreground Stars from Images of Galaxies

    Science.gov (United States)

    Frei, Zsolt

    1996-07-01

    A new procedure, designed to remove foreground stars from galaxy profiles, is presented here. Although several programs exist for stellar and faint object photometry, none of them treat star removal from the images very carefully. I present my attempt to develop such a system, and briefly compare the performance of my software to one of the well-known stellar photometry packages, DAOPhot (Stetson 1987). Major steps in my procedure are: (1) automatic construction of an empirical 2D point spread function from well separated stars that are situated off the galaxy; (2) automatic identification of those peaks that are likely to be foreground stars, scaling the PSF and removing these stars, and patching residuals (in the automatically determined smallest possible area where residuals are truly significant); and (3) cosmetic fixing of remaining degradations in the image. The algorithm and software presented here are significantly better for automatic removal of foreground stars from images of galaxies than DAOPhot or similar packages, since: (a) the most suitable stars are selected automatically from the image for the PSF fit; (b) after star removal an intelligent and automatic procedure removes any possible residuals; (c) an unlimited number of images can be cleaned in one run without any user interaction whatsoever. (SECTION: Computing and Data Analysis)

  18. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Natural language processing techniques for automatic test questions generation using discourse connectives. ... Journal of Computer Science and Its Application.

  19. Automatic Thermal Infrared Panoramic Imaging Sensor

    National Research Council Canada - National Science Library

    Gutin, Mikhail; Tsui, Eddy K; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-01-01

    .... Automatic detection, location, and tracking of targets outside the protected area ensures maximum protection and at the same time reduces the workload on personnel, increases reliability and confidence...

  20. Identification of persons entering through the door from the accelerometers data

    OpenAIRE

    Vodopivec, Tadej

    2013-01-01

    This thesis presents a system for automatic identification of a person who walks through a door, based on information gathered with acceleration sensors mounted on the door. The procedure for automatic identification consists of four steps. In the first step, data acquisition and coordinate system rotation are performed. Coordinate system rotation is needed in order to match the direction of the measured accelerations with the direction of the components of the forces acting on the door...

  1. An intelligent support system for automatic detection of cerebral vascular accidents from brain CT images.

    Science.gov (United States)

    Hajimani, Elmira; Ruano, M G; Ruano, A E

    2017-07-01

    This paper presents a Radial Basis Functions Neural Network (RBFNN) based detection system, for automatic identification of Cerebral Vascular Accidents (CVA) through analysis of Computed Tomographic (CT) images. For the design of a neural network classifier, a Multi Objective Genetic Algorithm (MOGA) framework is used to determine the architecture of the classifier, its corresponding parameters and input features by maximizing the classification precision, while ensuring generalization. This approach considers a large number of input features, comprising first and second order pixel intensity statistics, as well as symmetry/asymmetry information with respect to the ideal mid-sagittal line. Values of specificity of 98% and sensitivity of 98% were obtained, at pixel level, by an ensemble of non-dominated models generated by MOGA, in a set of 150 CT slices (1,867,602 pixels), marked by a NeuroRadiologist. This approach also compares favorably at a lesion level with three other published solutions, in terms of specificity (86% compared with 84%), degree of coincidence of marked lesions (89% compared with 77%) and classification accuracy rate (96% compared with 88%). Copyright © 2017. Published by Elsevier B.V.

  2. Prototype Design and Application of a Semi-circular Automatic Parking System

    OpenAIRE

    Atacak, Ismail; Erdogdu, Ertugrul

    2017-01-01

    Nowadays, with the increasing population in urban areas, the number of vehicles used in traffic has also increased in these areas. This has brought major problems caused by insufficient parking areas, in terms of traffic congestion and the impact on drivers and the environment. In this study, in order to overcome these problems, a multi-storey automatic parking system that automatically performs vehicle recognition, vehicle parking, vehicle delivery and pricing processes has been designed and the...

  3. Automatic Detection of Childhood Absence Epilepsy Seizures: Toward a Monitoring Device

    DEFF Research Database (Denmark)

    Duun-Henriksen, Jonas; Madsen, Rasmus E.; Remvig, Line S.

    2012-01-01

    Automatic detection of paroxysms in patients with childhood absence epilepsy has been neglected for several years. We acquire reliable detections using only a single-channel brainwave monitor, allowing for unobtrusive monitoring of antiepileptic drug effects. Ultimately we seek to obtain optimal long-term prognoses, balancing antiepileptic effects and side effects. The electroencephalographic appearance of paroxysms in childhood absence epilepsy is fairly homogeneous, making it feasible to develop patient-independent automatic detection. We implemented a state-of-the-art algorithm

  4. Automatic health record review to help prioritize gravely ill Social Security disability applicants.

    Science.gov (United States)

    Abbott, Kenneth; Ho, Yen-Yi; Erickson, Jennifer

    2017-07-01

    Every year, thousands of patients die waiting for disability benefits from the Social Security Administration. Some qualify for expedited service under the Compassionate Allowance (CAL) initiative, but CAL software focuses exclusively on information from a single form field. This paper describes the development of a supplemental process for identifying some overlooked but gravely ill applicants, through automatic annotation of health records accompanying new claims. We explore improved prioritization instead of fully autonomous claims approval. We developed a sample of claims containing medical records at the moment of arrival in a single office. A series of tools annotated both patient records and public Web page descriptions of CAL medical conditions. We trained random forests to identify CAL patients and validated each model with 10-fold cross validation. Our main model, a general CAL classifier, had an area under the receiver operating characteristic curve of 0.915. Combining this classifier with existing software improved sensitivity from 0.960 to 0.994, detecting every deceased patient, but reducing positive predictive value to 0.216. True positive CAL identification is a priority, given CAL patient mortality. Mere prioritization of the false positives would not create a meaningful burden in terms of manual review. Death certificate data suggest the presence of truly ill patients among putative false positives. To a limited extent, it is possible to identify gravely ill Social Security disability applicants by analyzing annotations of unstructured electronic health records, and the level of identification is sufficient to be useful in prioritizing case reviews. Published by Oxford University Press on behalf of the American Medical Informatics Association 2017. This work is written by US Government employees and is in the public domain in the US.
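
    As an illustration of the modelling step described here, the sketch below trains a random forest and reports 10-fold cross-validated AUC on placeholder data; the feature matrix, labels and hyperparameters are assumptions, not the study's.

```python
# Illustration of the modelling step: random forest + 10-fold cross-validated AUC.
# Features, labels and hyperparameters below are placeholders, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((500, 40))            # e.g. counts of annotated CAL-related concepts
y = rng.integers(0, 2, 500)          # 1 = CAL-eligible claim, 0 = not

clf = RandomForestClassifier(n_estimators=500, random_state=0)
auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
print(f"mean AUC over 10 folds: {auc.mean():.3f}")
```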

  5. Peak fitting and identification software library for high resolution gamma-ray spectra

    International Nuclear Information System (INIS)

    Uher, Josef; Roach, Greg; Tickner, James

    2010-01-01

    A new gamma-ray spectral analysis software package is under development in our laboratory. It can be operated as a stand-alone program or called as a software library from Java, C, C++ and MATLAB™ environments. It provides an advanced graphical user interface for data acquisition, spectral analysis and radioisotope identification. The code uses a peak-fitting function that includes peak asymmetry, Compton continuum and flexible background terms. Peak fitting function parameters can be calibrated as functions of energy. Each parameter can be constrained to improve fitting of overlapping peaks. All of these features can be adjusted by the user. To assist with peak identification, the code can automatically measure half-lives of single or multiple overlapping peaks from a time series of spectra. It implements library-based peak identification, with options for restricting the search based on radioisotope half-lives and reaction types. The software also improves the reliability of isotope identification by utilizing Monte-Carlo simulation results.
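
    The described fitting function adds peak asymmetry, a Compton step and flexible background terms; as a simplified stand-in, the sketch below fits a single Gaussian peak plus a linear background by least squares. The synthetic spectrum and parameter values are invented for illustration.

```python
# Simplified stand-in for the described fit: one Gaussian peak plus a linear
# background (the real code adds asymmetry and Compton-continuum terms).
import numpy as np
from scipy.optimize import curve_fit

def peak_model(E, area, centroid, sigma, slope, offset):
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-((E - centroid) ** 2) / (2 * sigma ** 2))
    return gauss + slope * E + offset

# synthetic spectrum around a hypothetical 661.7 keV peak
E = np.linspace(650.0, 675.0, 200)
true_counts = peak_model(E, area=5000, centroid=661.7, sigma=1.2, slope=-0.5, offset=400)
counts = np.random.default_rng(1).poisson(true_counts)

p0 = [(counts.max() - counts.min()) * 3.0, E[np.argmax(counts)], 1.0, 0.0, float(counts.min())]
popt, pcov = curve_fit(peak_model, E, counts, p0=p0)
print("centroid: %.2f keV, FWHM: %.2f keV" % (popt[1], 2.355 * abs(popt[2])))
```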

  6. Team reasoning and group identification

    NARCIS (Netherlands)

    Hindriks, Frank

    The team reasoning approach explains cooperation in terms of group identification, which in turn is explicated in terms of agency transformation and payoff transformation. Empirical research in social psychology is consistent with the significance of agency and payoff transformation. However, it

  7. Antares automatic beam alignment system

    International Nuclear Information System (INIS)

    Appert, Q.; Swann, T.; Sweatt, W.; Saxman, A.

    1980-01-01

    Antares is a 24-beam-line CO2 laser system for controlled fusion research, under construction at Los Alamos Scientific Laboratory (LASL). Rapid automatic alignment of this system is required prior to each experiment shot. The alignment requirements, operational constraints, and a developed prototype system are discussed. A visible-wavelength alignment technique is employed that uses a telescope/TV system to view point light sources appropriately located down the beamline. Auto-alignment is accomplished by means of a video centroid tracker, which determines the off-axis error of the point sources. The error is nulled by computer-driven, movable mirrors in a closed-loop system. The light sources are fiber-optic terminations located at key points in the optics path, primarily at the center of large copper mirrors, and remotely illuminated to reduce heating effects

  8. Computerized automatic tip scanning operation

    International Nuclear Information System (INIS)

    Nishikawa, K.; Fukushima, T.; Nakai, H.; Yanagisawa, A.

    1984-01-01

    In BWR nuclear power stations the Traversing Incore Probe (TIP) system is one of the most important components in reactor monitoring and control. In previous TIP systems, however, operators have suffered from the complexity of operation and long operation time required. The system presented in this paper realizes the automatic operation of the TIP system by monitoring and driving it with a process computer. This system significantly reduces the burden on customer operators and improves plant efficiency by simplifying the operating procedure, augmenting the accuracy of the measured data, and shortening operating time. The process computer is one of the PODIA (Plant Operation by Displayed Information Automation) systems. This computer transfers control signals to the TIP control panel, which in turn drives equipment by microprocessor control. The process computer contains such components as the CRT/KB unit, the printer plotter, the hard copier, and the message typers required for efficient man-machine communications. Its operation and interface properties are described

  9. Automatic Differentiation and Deep Learning

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Statistical learning has been getting more and more interest from the particle-physics community in recent times, with neural networks and gradient-based optimization being a focus. In this talk we shall discuss three things: automatic differentiation tools: tools to quickly build DAGs of computation that are fully differentiable. We shall focus on one such tool, "PyTorch". Easy deployment of trained neural networks into large systems with many constraints: for example, deploying a model at the reconstruction phase where the neural network has to be integrated into CERN's bulk data-processing C++-only environment. Some recent models in deep learning for segmentation and generation that might be useful for particle physics problems.
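
    A minimal example of the kind of tool referred to here: PyTorch builds the computation DAG as operations execute and differentiates it in reverse mode with a single backward() call.

```python
# Minimal PyTorch example: build a small computation DAG and let reverse-mode
# automatic differentiation compute the gradients.
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
w = torch.tensor([0.5, -1.0, 2.0], requires_grad=True)

y = torch.sum(w * x ** 2)    # scalar output of the graph
y.backward()                 # backpropagate

print(x.grad)   # dy/dx = 2*w*x  -> tensor([ 1., -4., 12.])
print(w.grad)   # dy/dw = x**2   -> tensor([1., 4., 9.])
```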

  10. Automatic Detection of Terminology Evolution

    Science.gov (United States)

    Tahmasebi, Nina

    As archives contain documents that span a long period of time, the language used to create these documents and the language used for querying the archive can differ. This difference is due to evolution in both terminology and semantics and will cause a significant number of relevant documents to be omitted. A static solution is to use query expansion based on explicit knowledge banks such as thesauri or ontologies. However, as we are able to archive resources with more varied terminology, it will be infeasible to use only explicit knowledge for this purpose. There exist few or no thesauri covering very domain-specific terminologies or slang as used in blogs, etc. In this Ph.D. thesis we focus on automatically detecting terminology evolution in a completely unsupervised manner as described in this technical paper.

  11. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.

    1980-01-01

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing the samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector of which the sensitive part is located inside the containment and additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought in turn one by one on to the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device [fr

  12. Automatic creation of simulation configuration

    International Nuclear Information System (INIS)

    Oudot, G.; Poizat, F.

    1993-01-01

    SIPA, which stands for 'Simulator for Post Accident', includes: 1) a sophisticated software oriented workshop SWORD (which stands for 'Software Workshop Oriented towards Research and Development') designed in the ADA language including integrated CAD system and software tools for automatic generation of simulation software and man-machine interface in order to operate run-time simulation; 2) a 'simulator structure' based on hardware equipment and software for supervision and communications; 3) simulation configuration generated by SWORD, operated under the control of the 'simulator structure' and run on a target computer. SWORD has already been used to generate two simulation configurations (French 900 MW and 1300 MW nuclear power plants), which are now fully operational on the SIPA training simulator. (Z.S.) 1 ref

  13. Automatic Regulation of Wastewater Discharge

    Directory of Open Access Journals (Sweden)

    Bolea Yolanda

    2017-01-01

    Full Text Available Wastewater plants, mainly those with secondary treatment, discharge polluted water into the environment that cannot be used for any human activity. When these discharges are into the sea, it is expected that most of the biological pollutants die or almost disappear before the water reaches areas of human use. This natural removal of bacteria, viruses and other pathogens is due to conditions such as the salinity of seawater and the effect of sunlight, and the dumping areas are calculated taking these conditions into account. However, under certain meteorological phenomena, water arrives at the coast without the full disappearance of pollutant elements. In the Mediterranean Sea there are periods of adverse climatic conditions that pollute the coast near the wastewater dumping sites. In this paper, the authors present an automatic control system that prevents such pollution episodes using two mathematical models, one for pollutant transport and the other for pollutant removal in wastewater spills.

  14. Rapid Automatic Motor Encoding of Competing Reach Options

    Directory of Open Access Journals (Sweden)

    Jason P. Gallivan

    2017-02-01

    Full Text Available Mounting neural evidence suggests that, in situations in which there are multiple potential targets for action, the brain prepares, in parallel, competing movements associated with these targets, prior to implementing one of them. Central to this interpretation is the idea that competing viewed targets, prior to selection, are rapidly and automatically transformed into corresponding motor representations. Here, by applying target-specific, gradual visuomotor rotations and dissociating, unbeknownst to participants, the visual direction of potential targets from the direction of the movements required to reach the same targets, we provide direct evidence for this provocative idea. Our results offer strong empirical support for theories suggesting that competing action options are automatically represented in terms of the movements required to attain them. The rapid motor encoding of potential targets may support the fast optimization of motor costs under conditions of target uncertainty and allow the motor system to inform decisions about target selection.

  15. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available Automatic systems have brought many revolutions to existing technologies. One technology that has seen considerable development is the solar-powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis, and using it in an automatic system reduces manpower. The researchers believe an automatic shrimp feeding system may help solve the problems of manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer that can be set to intervals preferred by the user and operates continuously. A magnetic contactor acts as a switch connected to the 10-hour timer and controls the activation or termination of electrical loads; the system is powered by a solar panel, with a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  16. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. Furthermore, the invention enables automatic marking of the films in radiographic inspection, identifying the test piece and the part of it where testing took place. (RW) [de]

  17. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

    The subject is covered in chapters entitled: the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  18. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    Jover, P.

    1976-01-01

    The fundamental concepts in automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics for the main components are then studied and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactors) [fr

  19. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  20. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.

    2003-01-01

    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer

  1. Automatic contact algorithm in DYNA3D for crashworthiness and impact problems

    International Nuclear Information System (INIS)

    Whirley, Robert G.; Engelmann, Bruce E.

    1994-01-01

    This paper presents a new approach for the automatic definition and treatment of mechanical contact in explicit non-linear finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational costs. Key aspects of the proposed new method include automatic identification of adjacent and opposite surfaces in the global search phase, and the use of a well-defined surface normal which allows a consistent treatment of shell intersection and corner contact conditions without ad hoc rules. The paper concludes with three examples which illustrate the performance of the newly proposed algorithm in the public DYNA3D code. ((orig.))

  2. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    Science.gov (United States)

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Card and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied for extraction of adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and the subsequent prediction of label changes issued for 29 drugs by the UK Regulatory Authority in 2009. 76% of drug label changes were automatically predicted. Out of these, 6% of drug label changes were detected only by text mining. JSRE enabled precise identification of four adverse drug events from MEDLINE that were undetectable otherwise. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well-placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.
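
    The abstract does not specify which statistic was used for signal detection; as an illustration only, a common disproportionality measure in pharmacovigilance is the proportional reporting ratio (PRR), sketched below with hypothetical counts.

```python
# Illustration only: the proportional reporting ratio (PRR), a common
# disproportionality statistic for adverse-event signal detection. The abstract
# does not state which statistic the authors actually used.

def prr(a, b, c, d):
    """a = reports of drug D with event E, b = drug D without E,
    c = other drugs with E,             d = other drugs without E."""
    return (a / (a + b)) / (c / (c + d))

# hypothetical report counts
print(f"PRR = {prr(a=30, b=970, c=200, d=49800):.1f}")   # -> PRR = 7.5
```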

  3. Automatic video shot boundary detection using k-means clustering and improved adaptive dual threshold comparison

    Science.gov (United States)

    Sa, Qila; Wang, Zhihui

    2018-03-01

    At present, content-based video retrieval (CBVR) is the most mainstream video retrieval method, using the video's own features to perform automatic identification and retrieval. This method involves a key technology, i.e. shot segmentation. In this paper, a method for automatic video shot boundary detection with K-means clustering and improved adaptive dual-threshold comparison is proposed. First, the visual features of every frame are extracted and divided into two categories using the K-means clustering algorithm, namely, frames with significant change and frames with no significant change. Then, the improved adaptive dual-threshold comparison method is applied to the classification results to determine the abrupt as well as gradual shot boundaries. Finally, an automatic video shot boundary detection system is achieved.
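
    A hedged sketch of the first stage only: per-frame histogram-difference features are clustered into "significant change" versus "no significant change" with K-means. The adaptive dual-threshold refinement is omitted, and the feature choice and file name are assumptions rather than the paper's exact design.

```python
# Sketch of the clustering stage only: K-means splits per-frame histogram
# differences into "significant change" vs "no significant change" frames.
import numpy as np
import cv2
from sklearn.cluster import KMeans

def frame_difference_features(video_path):
    cap = cv2.VideoCapture(video_path)
    diffs, prev_hist = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, [16, 16], [0, 180, 0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        if prev_hist is not None:
            diffs.append(np.linalg.norm(hist - prev_hist))
        prev_hist = hist
    cap.release()
    return np.array(diffs).reshape(-1, 1)

diffs = frame_difference_features("example.mp4")           # assumed file name
labels = KMeans(n_clusters=2, n_init=10).fit_predict(diffs)
change_cluster = labels[int(np.argmax(diffs))]             # cluster holding big jumps
candidate_boundaries = np.where(labels == change_cluster)[0] + 1
print(candidate_boundaries)
```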

  4. Grey-identification model based wind power generation short-term prediction

    Institute of Scientific and Technical Information of China (English)

    2013-01-01

    To predict the output power of wind turbines accurately, a wind power generation short-term prediction method based on the grey GM(1,1) model and an identification method is presented for a real wind farm. Residual error revision is applied to forecast the wind speed and obtain an accurate predicted wind speed series. Then, in order to increase the prediction precision of wind power, the FIR-MA iterative identification model is adopted to fit the real relationship between sequential wind speed and wind power and obtain a proper FIR-MA model. By modelling a wind turbine whose rated capacity is 850 kW, this paper compares the predicted wind generation power with the observed data, using mean absolute percentage error, root mean square error and single-point error as the evaluation indexes. The simulation shows the effectiveness and the practical applicability of the presented method, which can predict the real-time generation power of wind turbines and raise the accuracy of wind power prediction. Finally, the simulation using actual data from a wind farm in China proves the efficiency of the
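
    As an illustration of the grey-model component, the sketch below implements a basic GM(1,1) one-step-ahead forecast on a short wind-speed series; the residual-error revision and the FIR-MA power-curve identification described in the record are omitted, and the sample data are invented.

```python
# Basic GM(1,1) one-step-ahead forecast; residual correction and the FIR-MA
# power-curve fit described in the record are omitted. Sample data are invented.
import numpy as np

def gm11_forecast(x0, steps=1):
    x0 = np.asarray(x0, dtype=float)
    n = len(x0)
    x1 = np.cumsum(x0)                                 # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # background values
    B = np.column_stack((-z1, np.ones(n - 1)))
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # grey parameters

    def x1_hat(k):                                     # fitted accumulated series
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    return np.array([x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)])

wind_speed = [5.1, 5.4, 5.8, 6.0, 6.5, 6.9]            # hypothetical hourly means (m/s)
print(gm11_forecast(wind_speed, steps=2))
```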

  5. Automatic generation of stop word lists for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
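
    One plausible reading of the described procedure, sketched below with deliberately naive tokenization: count, for each corpus word, how often it appears inside a supplied keyword phrase versus directly adjacent to a keyword occurrence, drop words whose adjacency-to-keyword ratio falls below the threshold, and truncate the survivors into the stop word list. Function names, thresholds and the handling of single-token keywords are assumptions, not the patented method's exact definitions.

```python
# One plausible reading of the ratio test in this record, with naive tokenization.
# Helper names, thresholds and phrase handling are assumptions.
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

def generate_stop_words(documents, keyword_phrases, min_ratio=1.0, max_words=200):
    phrases = [tuple(tokenize(p)) for p in keyword_phrases]
    keyword_freq, adjacency_freq = Counter(), Counter()
    for doc in documents:
        tokens = tokenize(doc)
        for phrase in phrases:
            n = len(phrase)
            for i in range(len(tokens) - n + 1):
                if tuple(tokens[i:i + n]) != phrase:
                    continue
                keyword_freq.update(phrase)              # words occurring inside a keyword
                if i > 0:
                    adjacency_freq[tokens[i - 1]] += 1   # word just before the keyword
                if i + n < len(tokens):
                    adjacency_freq[tokens[i + n]] += 1   # word just after the keyword
    kept = [w for w in adjacency_freq
            if adjacency_freq[w] / max(keyword_freq[w], 1) >= min_ratio]
    kept.sort(key=lambda w: adjacency_freq[w], reverse=True)
    return kept[:max_words]
```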

  6. The UAB Informatics Institute and 2016 CEGS N-GRID de-identification shared task challenge.

    Science.gov (United States)

    Bui, Duy Duc An; Wyatt, Mathew; Cimino, James J

    2017-11-01

    Clinical narratives (the text notes found in patients' medical records) are important information sources for secondary use in research. However, in order to protect patient privacy, they must be de-identified prior to use. Manual de-identification is considered to be the gold standard approach but is tedious, expensive, slow, and impractical for use with large-scale clinical data. Automated or semi-automated de-identification using computer algorithms is a potentially promising alternative. The Informatics Institute of the University of Alabama at Birmingham is applying de-identification to clinical data drawn from the UAB hospital's electronic medical records system before releasing them for research. We participated in a shared task challenge by the Centers of Excellence in Genomic Science (CEGS) Neuropsychiatric Genome-Scale and RDoC Individualized Domains (N-GRID), in the de-identification regular track, to gain experience developing our own automatic de-identification tool. We focused on the popular and successful methods from previous challenges: rule-based, dictionary-matching, and machine-learning approaches. We also explored new techniques such as disambiguation rules, term ambiguity measurement, and the use of a multi-pass sieve framework at a micro level. For the challenge's primary measure (strict entity), our submissions achieved competitive results (f-measures: 87.3%, 87.1%, and 86.7%). For our preferred measure (binary token HIPAA), our submissions achieved superior results (f-measures: 93.7%, 93.6%, and 93%). With those encouraging results, we gain the confidence to improve and use the tool for the real de-identification task at the UAB Informatics Institute. Copyright © 2017 Elsevier Inc. All rights reserved.
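
    As a sketch of the rule-based and dictionary-matching layer only (the submitted systems also used machine learning and a multi-pass sieve), the snippet below masks a few HIPAA identifier types with regular expressions plus a simple name dictionary; the patterns and the sample note are illustrative assumptions.

```python
# Sketch of the rule-based/dictionary layer only; the submitted systems also used
# machine learning and a multi-pass sieve. Patterns and sample note are illustrative.
import re

PATTERNS = {
    "DATE":  re.compile(r"\b\d{1,2}[/-]\d{1,2}[/-]\d{2,4}\b"),
    "PHONE": re.compile(r"\b\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}\b"),
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "MRN":   re.compile(r"\bMRN:?\s*\d{6,10}\b", re.IGNORECASE),
}

def deidentify(text, name_dictionary=()):
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    for name in name_dictionary:                      # simple dictionary matching
        text = re.sub(rf"\b{re.escape(name)}\b", "[NAME]", text, flags=re.IGNORECASE)
    return text

note = "Pt. John Doe (MRN 12345678) seen 03/14/2016, call 555-867-5309."
print(deidentify(note, name_dictionary=["John Doe"]))
# -> Pt. [NAME] ([MRN]) seen [DATE], call [PHONE].
```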

  7. A Multiple Identity Approach to Gender: Identification with Women, Identification with Feminists, and Their Interaction

    OpenAIRE

    van Breen, Jolien A.; Spears, Russell; Kuppens, Toon; de Lemus, Soledad

    2017-01-01

    Across four studies, we examine multiple identities in the context of gender and propose that women's attitudes toward gender group membership are governed by two largely orthogonal dimensions of gender identity: identification with women and identification with feminists. We argue that identification with women reflects attitudes toward the content society gives to group membership: what does it mean to be a woman in terms of group characteristics, interests and values? Identification with f...

  8. Mathematical modelling and quality indices optimization of automatic control systems of reactor facility

    International Nuclear Information System (INIS)

    Severin, V.P.

    2007-01-01

    The mathematical modeling of automatic control systems of the WWER-1000 reactor facility with various regulator types is considered. Linear and nonlinear models of the neutron power control systems of the WWER-1000 nuclear reactor with various numbers of delayed neutron groups are developed. The results of the optimization of direct quality indexes of the neutron power control systems of the WWER-1000 nuclear reactor are presented. The identification and optimization of steam generator level control systems with various regulator types are performed

  9. Face Prediction Model for an Automatic Age-invariant Face Recognition System

    OpenAIRE

    Yadav, Poonam

    2015-01-01

    Automated face recognition and identification software is becoming part of our daily life; it finds its abode not only in Facebook's auto photo tagging, Apple's iPhoto, Google's Picasa and Microsoft's Kinect, but also in the Homeland Security Department's dedicated biometric face detection systems. Most of these automatic face identification systems fail where the effects of aging come into...

  10. Automatic locking orthotic knee device

    Science.gov (United States)

    Weddendorf, Bruce C. (Inventor)

    1993-01-01

    An articulated tang in clevis joint for incorporation in newly manufactured conventional strap-on orthotic knee devices or for replacing such joints in conventional strap-on orthotic knee devices is discussed. The instant tang in clevis joint allows the user the freedom to extend and bend the knee normally when no load (weight) is applied to the knee and to automatically lock the knee when the user transfers weight to the knee, thus preventing a damaged knee from bending uncontrollably when weight is applied to the knee. The tang in clevis joint of the present invention includes first and second clevis plates, a tang assembly and a spacer plate secured between the clevis plates. Each clevis plate includes a bevelled serrated upper section. A bevelled shoe is secured to the tang in close proximity to the bevelled serrated upper section of the clevis plates. A coiled spring mounted within an oblong bore of the tang normally urges the shoes secured to the tang out of engagement with the serrated upper section of each clevis plate to allow rotation of the tang relative to the clevis plates. When weight is applied to the joint, the load compresses the coiled spring, and the serrations on each clevis plate dig into the bevelled shoes secured to the tang to prevent relative movement between the tang and the clevis plates. A shoulder is provided on the tang and the spacer plate to prevent overextension of the joint.

  11. Automatic Transmission Of Liquid Nitrogen

    Directory of Open Access Journals (Sweden)

    Sumedh Mhatre

    2015-08-01

    Full Text Available Liquid nitrogen is one of the major substances used as a chiller in industry, for example in ice cream factories, milk dairies, blood sample storage and blood banks. It helps to maintain the required product at a lower temperature for preservation purposes. We cannot fully utilise the LN2: in practice, if 3.75 litres of LN2 are used in a single day, around 12% (450 ml) is wasted due to vaporisation. A pressure relief valve is provided to create a pressure difference. If there is no pressure difference between the cylinder carrying LN2 and its surroundings, it will result in damage to the container as well as wastage of LN2. Transmission of LN2 from TA55 to BA3 is carried out manually, so care must be taken during the transmission of LN2 in order to avoid wastage. With the help of this project concept, the transmission of LN2 will be carried out automatically so as to reduce the wastage of LN2 that occurs with manual operation.

  12. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  13. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Currently, those algorithms can only handle single erythema or only deal with scaling segmentation. In practice, scaling and erythema are often mixed together. In order to segment the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied in the imaging, based on the skin's Tyndall effect, to eliminate reflections, and the Lab color space is used to match human perception. In the second step, a sliding window and its sub-windows are used to extract texture and color features. In this step, a feature of image roughness has been defined, so that scaling can be easily separated from normal skin. Finally, random forests are used to ensure the generalization ability of the algorithm. This algorithm can give reliable segmentation results even when images have different lighting conditions and skin types. On the data set offered by Union Hospital, more than 90% of images can be segmented accurately.

  14. Modeling and Analysis of Surgery Patient Identification Using RFID

    OpenAIRE

    Byungho Jeong; Chen-Yang Cheng; Vittal Prabhu

    2009-01-01

    This article proposes a workflow and reliability model for surgery patient identification using RFID (Radio Frequency Identification). Certain types of mistakes may be prevented by automatically identifying the patient before surgery. The proposed workflow is designed to ensure that both the correct site and patient are engaged in the surgical process. The reliability model can be used to assess improvements in patients’ safety during this process. A proof-of-concept system is developed to ...

  15. Radio frequency identification and its application in e-commerce

    OpenAIRE

    Bahr, Witold; Price, Brian J

    2016-01-01

    This chapter presents Radio Frequency Identification (RFID), which is one of the Automatic Identification and Data Capture (AIDC) technologies (Wamba and Boeck, 2008) and discusses the application of RFID in E-Commerce. Firstly RFID is defined and the tag and reader components of the RFID system are explained. Then historical context of RFID is briefly discussed. Next, RFID is contrasted with other AIDC technologies, especially the use of barcodes which are commonly applied in E-Commerce. Las...

  16. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the BG in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  17. Development of a microcontroller-based automatic control system for the electrohydraulic total artificial heart.

    Science.gov (United States)

    Kim, H C; Khanwilkar, P S; Bearnson, G B; Olsen, D B

    1997-01-01

    An automatic physiological control system for the actively filled, alternately pumped ventricles of the volumetrically coupled, electrohydraulic total artificial heart (EHTAH) was developed for long-term use. The automatic control system must ensure that the device: 1) maintains a physiological response of cardiac output, 2) compensates for a nonphysiological condition, and 3) is stable, reliable, and operates at a high power efficiency. The developed automatic control system met these requirements both in vitro, in week-long continuous mock circulation tests, and in vivo, in acute open-chested animals (calves). Satisfactory results were also obtained in a series of chronic animal experiments, including 21 days of continuous operation of the fully automatic control mode, and 138 days of operation in a manual mode, in a 159-day calf implant.

  18. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g_z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  19. Automatic classification of liver scintigram patterns by computer

    International Nuclear Information System (INIS)

    Csernay, L.; Csirik, J.

    1976-01-01

    Pattern recognition of the projection image is one of the problems in the automatic evaluation of scintigrams. An algorithm and a computerized programme able to classify the shapes of liver scintigrams have been developed by the authors. The programme not only differentiates normal and pathologic basic forms, but also identifies nine normal forms described in the literature. For pattern recognition, structural and local parameters of the picture were defined. A detailed description of the programme is given in their reports. The programme correctly classified 55 out of 60 actual liver scintigrams; in the remaining 5 cases the result differed from the subjective classification, and all of these were normal liver scan patterns. No wrong classification was obtained when distinguishing normal from pathologic patterns.

  20. Automatic Operation For A Robot Lawn Mower

    Science.gov (United States)

    Huang, Y. Y.; Cao, Z. L.; Oh, S. J.; Kattan, E. U.; Hall, E. L.

    1987-02-01

    A domestic mobile robot, a lawn mower that performs in automatic operation mode, has been built at the Center of Robotics Research, University of Cincinnati. The robot lawn mower completes its work automatically using a region-filling operation, a new kind of path planning for mobile robots. Several strategies for region-filling path planning have been developed for a partly known or an unknown environment. An advanced omnidirectional navigation system and a multisensor-based control system are also used in the automatic operation. Research on the robot lawn mower, and especially on region-filling path planning, is significant for industrial and agricultural applications.
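
    As a toy illustration of region filling, the sketch below generates a simple back-and-forth ("boustrophedon") sweep over a known rectangular grid; obstacle handling and the strategies for partly known or unknown environments developed in the paper are not modelled here.

      # Toy boustrophedon ("lawnmower") sweep over a known rectangular grid of cells.
      def region_fill_path(rows, cols):
          """Return grid cells in a back-and-forth sweep that covers the region."""
          path = []
          for r in range(rows):
              cells = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
              path.extend((r, c) for c in cells)
          return path

      if __name__ == "__main__":
          print(region_fill_path(3, 4))
          # [(0, 0), (0, 1), (0, 2), (0, 3), (1, 3), (1, 2), ...] - every cell visited once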

  1. Automatic plasma control in magnetic traps

    International Nuclear Information System (INIS)

    Samojlenko, Y.; Chuyanov, V.

    1984-01-01

    Hot plasma is essentially in a thermodynamically non-steady state. Automatic plasma control basically means monitoring deviations from the steady state and producing a suitable magnetic or electric field which brings the plasma back to its original state. Two systems of automatic plasma control are briefly described: control with a magnetic field using a negative-impedance circuit, and control using an electric field. It appears that systems of automatic plasma stabilization will be an indispensable component of the fusion reactor, and their capabilities will in many ways determine the reactor economy. (Ha)

  2. Reactor protection system with automatic self-testing and diagnostic

    International Nuclear Information System (INIS)

    Gaubatz, D.C.

    1996-01-01

    A reactor protection system is disclosed having four divisions, with quad redundant sensors for each scram parameter providing input to four independent microprocessor-based electronic chassis. Each electronic chassis acquires the scram parameter data from its own sensor, digitizes the information, and then transmits the sensor reading to the other three electronic chassis via optical fibers. To increase system availability and reduce false scrams, the reactor protection system employs two levels of voting on a need for reactor scram. The electronic chassis perform software divisional data processing, vote 2/3 with spare based upon information from all four sensors, and send the divisional scram signals to the hardware logic panel, which performs a 2/4 division vote on whether or not to initiate a reactor scram. Each chassis makes a divisional scram decision based on data from all sensors. Automatic detection and discrimination against failed sensors allow the reactor protection system to automatically enter a known state when sensor failures occur. Cross-communication of sensor readings allows comparison of four theoretically ''identical'' values. This permits identification of sensor errors such as drift or malfunction. A diagnostic request for service is issued for errant sensor data. Automated self-test and diagnostic monitoring, from sensor input through the output relay logic, virtually eliminate the need for manual surveillance testing. This provides an ability for each division to cross-check all divisions and to sense failures of the hardware logic. 16 figs
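
    The final hardware vote described above is a 2-out-of-4 decision; a minimal sketch of that voting step is shown below, with a simple exclusion of sensors already flagged as failed added as an assumption for illustration, not as the patented logic.

      # Illustrative 2-out-of-4 scram vote that excludes divisions whose sensors
      # the self-test has already flagged as failed.
      def two_out_of_four(trip, failed):
          """trip/failed: one boolean per division (length 4).
          Returns True if at least two healthy divisions demand a scram."""
          votes = sum(1 for t, f in zip(trip, failed) if t and not f)
          return votes >= 2

      # Divisions 1, 2 and 3 report a trip, but division 2's sensor is flagged as
      # failed, leaving two healthy trip votes.
      print(two_out_of_four([True, True, True, False],
                            [False, True, False, False]))   # True -> initiate scram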

  3. A Cough-Based Algorithm for Automatic Diagnosis of Pertussis

    Science.gov (United States)

    Pramono, Renard Xaviero Adhi; Imtiaz, Syed Anas; Rodriguez-Villegas, Esther

    2016-01-01

    Pertussis is a contagious respiratory disease which mainly affects young children and can be fatal if left untreated. The World Health Organization estimates 16 million pertussis cases annually worldwide resulting in over 200,000 deaths. It is prevalent mainly in developing countries where it is difficult to diagnose due to the lack of healthcare facilities and medical professionals. Hence, a low-cost, quick and easily accessible solution is needed to provide pertussis diagnosis in such areas to contain an outbreak. In this paper we present an algorithm for automated diagnosis of pertussis using audio signals by analyzing cough and whoop sounds. The algorithm consists of three main blocks to perform automatic cough detection, cough classification and whooping sound detection. Each of these blocks extracts relevant features from the audio signal and subsequently classifies them using a logistic regression model. The output from these blocks is collated to provide a pertussis likelihood diagnosis. The performance of the proposed algorithm is evaluated using audio recordings from 38 patients. The algorithm successfully diagnoses pertussis from all audio recordings without any false diagnoses. It can also automatically detect individual cough sounds with 92% accuracy and a PPV of 97%. The low complexity of the proposed algorithm coupled with its high accuracy demonstrates that it can be readily deployed using smartphones and can be extremely useful for quick identification or early screening of pertussis and for the control of infection outbreaks. PMID:27583523
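
    A minimal sketch of the classification stage is given below: each detected sound segment is reduced to a feature vector and scored with logistic regression. The MFCC-mean features used here are illustrative stand-ins, not the feature set of the paper.

      # Sketch of the classification stage only: each detected sound segment is
      # reduced to a feature vector and scored with logistic regression.
      import numpy as np
      import librosa
      from sklearn.linear_model import LogisticRegression

      def segment_features(y, sr):
          """Summarise one audio segment as the mean of 13 MFCCs (illustrative)."""
          mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
          return mfcc.mean(axis=1)

      # X: feature vectors of labelled segments; labels: 1 = cough (or whoop), 0 = other.
      # clf = LogisticRegression(max_iter=1000).fit(X, labels)
      # p = clf.predict_proba(segment_features(segment, sr).reshape(1, -1))[0, 1]
      # Segment-level probabilities would then be collated into a pertussis likelihood.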

  4. 45 CFR 95.631 - Cost identification for purpose of FFP claims.

    Science.gov (United States)

    2010-10-01

    45 CFR 95.631 (Public Welfare, revised as of 2010-10-01): Cost identification for purpose of FFP claims, under Automatic Data Processing Equipment and Services - Conditions for Federal Financial Participation (FFP), Federal Financial Participation in Costs of ADP Acquisitions.

  5. Radiation dosimetry by automatic image analysis of dicentric chromosomes

    International Nuclear Information System (INIS)

    Bayley, R.; Carothers, A.; Farrow, S.; Gordon, J.; Ji, L.; Piper, J.; Rutovitz, D.; Stark, M.; Chen, X.; Wald, N.; Pittsburgh Univ., PA

    1991-01-01

    A system for scoring dicentric chromosomes by image analysis comprised fully automatic location of mitotic cells, automatic retrieval, focus and digitisation at high resolution, automatic rejection of nuclei and debris and detection and segmentation of chromosome clusters, automatic centromere location, and subsequent rapid interactive visual review of potential dicentric chromosomes to confirm positives and reject false positives. A calibration set of about 15000 cells was used to establish the quadratic dose response for 60Co γ-irradiation. The dose-response function parameters were established by a maximum likelihood technique, and confidence limits in the dose response and in the corresponding inverse curve, of estimated dose for observed dicentric frequency, were established by Monte Carlo techniques. The system was validated in a blind trial by analysing a test set comprising a total of about 8000 cells irradiated at one of 10 dose levels, and estimating the doses from the observed dicentric frequency. There was a close correspondence between the estimated and true doses. The overall sensitivity of the system, defined as the proportion of all dicentrics present in the analysed cells that were detected by the system, was measured to be about 40%. This implies that about 2.5 times more cells must be analysed by machine than by visual analysis. Taking this factor into account, the measured review time and false positive rates imply that analysis by the system of sufficient cells to provide the equivalent of a visual analysis of 500 cells would require about 1 h for operator review. (author). 20 refs.; 4 figs.; 5 tabs
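
    The quadratic dose response Y(D) = c + aD + bD^2 can be fitted by maximum likelihood under a Poisson model for the dicentric counts; a sketch of such a fit is shown below, with purely illustrative counts standing in for the calibration data.

      # Maximum-likelihood fit of the quadratic yield curve Y(D) = c + a*D + b*D**2,
      # assuming Poisson-distributed dicentric counts per dose point.  The counts
      # below are purely illustrative, not the paper's calibration data.
      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import gammaln

      doses = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])        # dose in Gy
      cells = np.array([5000, 3000, 2000, 1500, 1000, 800])   # cells scored per dose
      dics = np.array([5, 30, 90, 300, 500, 700])             # dicentrics observed

      def neg_log_lik(params):
          c, a, b = params
          mu = cells * (c + a * doses + b * doses ** 2)       # expected dicentric counts
          if np.any(mu <= 0):
              return np.inf
          return -np.sum(dics * np.log(mu) - mu - gammaln(dics + 1))

      fit = minimize(neg_log_lik, x0=[0.001, 0.02, 0.05], method="Nelder-Mead")
      c, a, b = fit.x
      print("Y(D) = %.4f + %.4f D + %.4f D^2" % (c, a, b))
      # The inverse curve (dose for an observed yield y) solves b*D**2 + a*D + (c - y) = 0;
      # confidence limits would come from Monte Carlo resampling, as in the paper.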

  6. Requirements to a Norwegian national automatic gamma monitoring system

    DEFF Research Database (Denmark)

    Lauritzen, B.; Jensen, Per Hedemann; Nielsen, F.

    2005-01-01

    An assessment of the overall requirements to a Norwegian gamma-monitoring network is undertaken, with special emphasis on the geographical distribution of automatic gamma-monitoring stations, the type of detectors in such stations, and the sensitivity of the system in terms of ambient dose equivalent rate increments above the natural background levels. The study is based upon simplified deterministic calculations of the radiological consequences of generic nuclear accident scenarios. The density of gamma-monitoring stations has been estimated from an analysis of the dispersion of radioactive materials over...

  7. Automatic coding of online collaboration protocols

    NARCIS (Netherlands)

    Erkens, Gijsbert; Janssen, J.J.H.M.

    2006-01-01

    An automatic coding procedure is described to determine the communicative functions of messages in chat discussions. Five main communicative functions are distinguished: argumentative (indicating a line of argumentation or reasoning), responsive (e.g., confirmations, denials, and answers),

  8. Automatic Amharic text news classification: A neural networks ...

    African Journals Online (AJOL)

    School of Computing and Electrical Engineering, Institute of Technology, Bahir Dar University, Bahir Dar ... The study is on the automatic classification of Amharic news using a neural networks approach. Learning Vector ...

  9. Suspect/foil identification in actual crimes and in the laboratory: a reality monitoring analysis.

    Science.gov (United States)

    Behrman, Bruce W; Richards, Regina E

    2005-06-01

    Four reality monitoring variables were used to discriminate suspect from foil identifications in 183 actual criminal cases. Four hundred sixty-one identification attempts based on five- and six-person lineups were analyzed. These identification attempts resulted in 238 suspect identifications and 68 foil identifications. Confidence, automatic processing, eliminative processing and feature use comprised the set of reality monitoring variables. Thirty-five verbal confidence phrases taken from police reports were assigned numerical values on a 10-point confidence scale. Automatic processing identifications were those that occurred "immediately" or "without hesitation." Eliminative processing identifications occurred when witnesses compared or eliminated persons in the lineups. Confidence, automatic processing and eliminative processing were significant predictors, but feature use was not. Confidence was the most effective discriminator. In cases that involved substantial evidence extrinsic to the identification, 43% of the suspect identifications were made with high confidence, whereas only 10% of the foil identifications were made with high confidence. The results of a laboratory study using the same predictors generally paralleled the archival results. Forensic implications are discussed.

  10. Musical Instrument Identification using Multiscale Mel-frequency Cepstral Coefficients

    DEFF Research Database (Denmark)

    Sturm, Bob L.; Morvidone, Marcela; Daudet, Laurent

    2010-01-01

    We investigate the benefits of evaluating Mel-frequency cepstral coefficients (MFCCs) over several time scales in the context of automatic musical instrument identification for signals that are monophonic but derived from real musical settings. We define several sets of features derived from MFCC...... multiscale decompositions perform significantly better than features computed using a single time-resolution....
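
    A sketch of one way to compute such multiscale MFCC features is given below, analysing the same signal with several window lengths and concatenating summary statistics; the specific scales, MFCC order and statistics are assumptions, not the paper's exact configuration.

      # Sketch of "multiscale" MFCCs: the same signal analysed with several window
      # lengths, then summarised and concatenated into one feature vector.
      import numpy as np
      import librosa

      def multiscale_mfcc(y, sr, win_lengths=(512, 1024, 2048, 4096), n_mfcc=13):
          feats = []
          for n_fft in win_lengths:
              m = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc,
                                       n_fft=n_fft, hop_length=n_fft // 2)
              feats.append(np.concatenate([m.mean(axis=1), m.std(axis=1)]))
          return np.concatenate(feats)        # one vector per recording

      # y, sr = librosa.load("note.wav", sr=None)
      # x = multiscale_mfcc(y, sr)            # feed x to any instrument classifier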

  11. Radio Frequency Identification in Construction Operation and Maintenance

    DEFF Research Database (Denmark)

    Sørensen, Kristian Birch; Christiansson, Per; Svidt, Kjeld

    2008-01-01

    As early as 1995 it was stated that automatic identification of objects using RFID was a promising technology for the construction industry. However, 13 years later, applications of RFID in the construction industry are rare and mostly limited to prototype projects or theft prevention...

  12. 48 CFR 252.211-7006 - Radio Frequency Identification.

    Science.gov (United States)

    2010-10-01

    ... supply, as defined in DoD 4140.1-R, DoD Supply Chain Materiel Management Regulation, AP1.1.11: (A... International and the Uniform Code Council to establish and support the EPC network as the global standard for immediate, automatic, and accurate identification of any item in the supply chain of any company, in any...

  13. Automatic shadowing device for electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, F W; Bogitch, S

    1960-01-01

    For the past ten years in the laboratory of the Department of Nuclear Medicine and Radiation Biology at the University of California, and before that at Rochester, New York, every evaporation was done with the aid of an automatic shadowing device. For several months the automatic shadowing device has been available at the Atomic Bomb Casualty Commission (ABCC) Hiroshima, Japan with the modifications described. 1 reference.

  14. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    A method for the automatic control of commercial computer programs is presented. A connection has been developed between the automatic system of the EXAFS spectrometer (managed by a PC under DOS) and the commercial program controlling the CCD detector (managed by a PC under Windows). The described complex system is used to automate the intermediate processing of amplitude spectra in EXAFS spectrum measurements at the Kurchatov SR source.

  15. Automatic Control of Silicon Melt Level

    Science.gov (United States)

    Duncan, C. S.; Stickel, W. B.

    1982-01-01

    A new circuit, when combined with a melt-replenishment system and a melt-level sensor, offers continuous closed-loop automatic control of the melt level during web growth. Installed on a silicon-web furnace, the circuit controls the melt level to within 0.1 mm for as long as 8 hours. The circuit affords a greater area growth rate and higher web quality; automatic melt-level control also allows semiautomatic growth of web over long periods, which can greatly reduce costs.
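
    As an illustration of the closed-loop idea (not the actual circuit), the sketch below implements a simple proportional-integral controller that turns the melt-level error reported by the sensor into a replenishment-rate command; the gains, units and timing are arbitrary assumptions.

      # Toy proportional-integral controller: the melt-level error from the sensor
      # is turned into a replenishment-rate command.
      def make_pi_controller(kp, ki, dt):
          integral = 0.0
          def update(setpoint_mm, measured_mm):
              nonlocal integral
              error = setpoint_mm - measured_mm
              integral += error * dt
              return kp * error + ki * integral   # replenishment-rate command
          return update

      controller = make_pi_controller(kp=2.0, ki=0.1, dt=1.0)
      print(controller(setpoint_mm=0.0, measured_mm=-0.05))   # level 0.05 mm low -> positive command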

  16. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  17. Automatic Control of Freeboard and Turbine Operation

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Frigaard, Peter Bak; Friis-Madsen, Erik

    The report deals with the modules for automatic control of freeboard and turbine operation on board the Wave dragon, Nissum Bredning (WD-NB) prototype, and covers what has been going on up to ultimo 2003.

  18. Automatic Vetting for Malice in Android Platforms

    Science.gov (United States)

    2016-05-01

    Final technical report, Iowa State University, May 2016; title: Automatic Vetting for Malice in Android Platforms; period covered Dec 2013 - Dec 2015; contract number FA8750-14-2... Cited source: Android Apps from Play Store Infected with Brain Test Malware, http://www.ibtimes.co.uk/google-removes-13-android-apps-play-store-infected-brain-test...

  19. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.
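
    A rough stand-in for the L2-projection step is sketched below as a discrete least-squares fit of scattered terrain samples onto a coarse bilinear tensor-product grid; the transfinite function spaces and the image-processing-driven refinement of the actual procedure are not reproduced, and all names are illustrative.

      # Least-squares (discrete L2) fit of scattered terrain samples z(x, y) onto a
      # coarse bilinear grid; a crude stand-in for the paper's projection step.
      import numpy as np

      def fit_bilinear_grid(x, y, z, nx=8, ny=8):
          """Fit z samples with nodal heights on an (nx+1) x (ny+1) grid."""
          xs = (x - x.min()) / (x.max() - x.min()) * nx
          ys = (y - y.min()) / (y.max() - y.min()) * ny
          i = np.clip(xs.astype(int), 0, nx - 1)
          j = np.clip(ys.astype(int), 0, ny - 1)
          u, v = xs - i, ys - j                          # local coordinates in [0, 1]
          A = np.zeros((len(z), (nx + 1) * (ny + 1)))
          idx = lambda ii, jj: ii * (ny + 1) + jj        # node (ii, jj) -> column
          rows = np.arange(len(z))
          A[rows, idx(i, j)] = (1 - u) * (1 - v)
          A[rows, idx(i + 1, j)] = u * (1 - v)
          A[rows, idx(i, j + 1)] = (1 - u) * v
          A[rows, idx(i + 1, j + 1)] = u * v
          coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
          return coeffs.reshape(nx + 1, ny + 1)          # nodal heights of the fitted surface

      # x, y, z = columns of a terrain point cloud; the residuals |A @ coeffs - z|
      # could then drive refinement of high-error cells, in the spirit of the paper.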

  20. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    this issue is addressed. We propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing alternative to continuation methods. Automatic continuation also generally obtains better designs than the classical formulation using a reduced number of iterations...
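
    For context, a minimal sketch of the classical penalty continuation that the proposed method is compared against is given below: a sequence of subproblems is solved while the penalization parameter p grows. Here 'solve_topopt' is a hypothetical stand-in for the optimizer plus finite-element analysis, and the demo solver is only a toy.

      # Classical penalty continuation (the baseline the paper compares against):
      # solve a sequence of topology optimization problems while the penalization
      # parameter p is gradually increased.
      import numpy as np

      def continuation(solve_topopt, x0, p_start=1.0, p_end=3.0, p_step=0.5):
          x, p = x0, p_start
          while p <= p_end:
              x = solve_topopt(x, p)   # warm-start each solve from the previous design
              p += p_step
          return x

      # Toy stand-in solver: pushes intermediate densities toward 0/1 as p grows.
      demo_solver = lambda x, p: x ** p / (x ** p + (1 - x) ** p)
      print(continuation(demo_solver, x0=np.array([0.2, 0.4, 0.6, 0.8])))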