WorldWideScience

Sample records for fields automatically identified

  1. Automatically identifying gene/protein terms in MEDLINE abstracts.

    Science.gov (United States)

    Yu, Hong; Hatzivassiloglou, Vasileios; Rzhetsky, Andrey; Wilbur, W John

    2002-01-01

    Natural language processing (NLP) techniques are used to extract information automatically from computer-readable literature. In biology, the identification of terms corresponding to biological substances (e.g., genes and proteins) is a necessary step that precedes the application of other NLP systems that extract biological information (e.g., protein-protein interactions, gene regulation events, and biochemical pathways). We have developed GPmarkup (for "gene/protein-full name mark up"), a software system that automatically identifies gene/protein terms (i.e., symbols or full names) in MEDLINE abstracts. As part of the markup process, we also automatically generated a knowledge source of paired gene/protein symbols and full names (e.g., LARD for lymphocyte associated receptor of death) from MEDLINE. We found that many of the pairs in our knowledge source do not appear in the current GenBank database; our methods may therefore also be used for automatic lexicon generation. GPmarkup has 73% recall and 93% precision in identifying and marking up gene/protein terms in MEDLINE abstracts. A random sample of gene/protein symbols and full names and a sample set of marked-up abstracts can be viewed at http://www.cpmc.columbia.edu/homepages/yuh9001/GPmarkup/. Contact: hy52@columbia.edu. Voice: 212-939-7028; fax: 212-666-0140.
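
    The symbol/full-name pairing mentioned in this abstract can be sketched with a simple initial-letter heuristic (a hypothetical illustration only; GPmarkup's actual extraction is more sophisticated and handles cases such as LARD, where the symbol skips short words like "of"):

```python
import re

def extract_symbol_pairs(text):
    """Extract (symbol, full name) candidates from patterns like
    'natural language processing (NLP)'.  Heuristic: a parenthesized
    all-caps token whose letters are the initials of the immediately
    preceding words.  Function name and pattern are illustrative only."""
    pairs = []
    for match in re.finditer(r'((?:\w+[ -]){1,8})\(([A-Z]{2,8})\)', text):
        words, symbol = match.group(1).split(), match.group(2)
        # take as many preceding words as the symbol has letters
        candidate = words[-len(symbol):]
        if len(candidate) == len(symbol) and all(
                w[0].upper() == s for w, s in zip(candidate, symbol)):
            pairs.append((symbol, ' '.join(candidate)))
    return pairs
```

    A production lexicon builder would score candidates statistically rather than require exact initial matches.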

  2. Automatic system for evaluation of ionizing field

    International Nuclear Information System (INIS)

    Pimenta, N.L.; Calil, S.J.

    1992-01-01

    A three-dimensional Cartesian manipulator, able to position an ionization chamber at any point in space, is developed for evaluating ionizing fields. The control system is implemented on an IBM microcomputer. The system is aimed at the study of isodose curves from ionizing sources and at verifying the performance of radiotherapy equipment. (C.G.C.)

  3. Markov random field based automatic image alignment for electron tomography.

    Science.gov (United States)

    Amat, Fernando; Moussavi, Farshid; Comolli, Luis R; Elidan, Gal; Downing, Kenneth H; Horowitz, Mark

    2008-03-01

    We present a method for automatic full-precision alignment of the images in a tomographic tilt series. Full-precision automatic alignment of cryo electron microscopy images has remained a difficult challenge to date, due to the limited electron dose and low image contrast. These facts lead to poor signal to noise ratio (SNR) in the images, which causes automatic feature trackers to generate errors, even with high contrast gold particles as fiducial features. To enable fully automatic alignment for full-precision reconstructions, we frame the problem probabilistically as finding the most likely particle tracks given a set of noisy images, using contextual information to make the solution more robust to the noise in each image. To solve this maximum likelihood problem, we use Markov Random Fields (MRF) to establish the correspondence of features in alignment and robust optimization for projection model estimation. The resulting algorithm, called Robust Alignment and Projection Estimation for Tomographic Reconstruction, or RAPTOR, has not needed any manual intervention for the difficult datasets we have tried, and has provided sub-pixel alignment that is as good as the manual approach by an expert user. We are able to automatically map complete and partial marker trajectories and thus obtain highly accurate image alignment. Our method has been applied to challenging cryo electron tomographic datasets with low SNR from intact bacterial cells, as well as several plastic section and X-ray datasets.
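
    RAPTOR's full MRF formulation is beyond a short sketch, but the "robust optimization for projection model estimation" idea can be illustrated with Huber-style iteratively reweighted least squares on a toy one-dimensional model (function name, gains, and the linear model are hypothetical):

```python
import numpy as np

def huber_irls_fit(x, y, delta=1.0, iters=30):
    """Robustly fit y ~ a*x + b with Huber-style iteratively
    reweighted least squares, down-weighting outliers such as
    mistracked fiducial markers."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        r = y - (a * x + b)                         # current residuals
        # Huber weights: 1 inside the delta band, delta/|r| outside
        w = np.minimum(1.0, delta / np.maximum(np.abs(r), 1e-12))
        A = np.vstack([x, np.ones_like(x)]).T
        sw = np.sqrt(w)                             # weighted LS solve
        a, b = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)[0]
    return a, b
```

    Outlying points contribute almost nothing to the final fit, which is the property that makes such estimators tolerate feature-tracking errors.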

  4. Parallel computation of automatic differentiation applied to magnetic field calculations

    International Nuclear Information System (INIS)

    Hinkins, R.L.; Lawrence Berkeley Lab., CA

    1994-09-01

    The author presents a parallelization of an accelerator physics application to simulate magnetic fields in three dimensions. The problem involves the evaluation of high-order derivatives with respect to two variables of a multivariate function. Automatic differentiation software had been used with some success, but the computation time was prohibitive. The implementation runs on several platforms, including a network of workstations using PVM, a MasPar using MPFortran, and a CM-5 using CMFortran. A careful examination of the code led to several optimizations that improved its serial performance by a factor of 8.7. The parallelization produced further improvements, especially on the MasPar, with a speedup factor of 620. As a result, a problem that took six days on a SPARC 10/41 now runs in minutes on the MasPar, making it feasible for physicists at Lawrence Berkeley Laboratory to simulate larger magnets.
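
    The principle behind automatic differentiation, exact derivatives propagated through arithmetic rather than finite differences, can be sketched with first-order dual numbers (the paper's software computed high-order multivariate derivatives; this minimal class is only illustrative):

```python
class Dual:
    """Forward-mode automatic differentiation: carry a value and its
    first derivative together through every arithmetic operation."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        # product rule applied automatically at every multiplication
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.der + self.der * other.val)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f'(x) exactly (to machine precision), no finite differences."""
    return f(Dual(x, 1.0)).der
```

    High-order and multivariate AD generalizes this by carrying a truncated Taylor series instead of a single derivative.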

  5. Systems and methods for automatically identifying and linking names in digital resources

    Science.gov (United States)

    Parker, Charles T.; Lyons, Catherine M.; Roston, Gerald P.; Garrity, George M.

    2017-06-06

    The present invention provides systems and methods for automatically identifying name-like strings in digital resources, matching these name-like strings against a set of names held in an expertly curated database, and, for those name-like strings found in said database, enhancing the content by associating additional matter with the name, wherein said matter includes information about the name that is held within said database and pointers to other digital resources which include the same name and its synonyms.
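
    The matching step described in the claim can be sketched as a dictionary lookup over name-like strings (the regex, function name, and database shape here are hypothetical; the patented system uses an expertly curated names database):

```python
import re

def link_names(text, name_db):
    """Find name-like strings (here, a capitalized binomial pattern,
    purely illustrative) and attach the curated record for any string
    present in the database."""
    linked = []
    for m in re.finditer(r'\b([A-Z][a-z]+ [a-z]+)\b', text):
        record = name_db.get(m.group(1))
        if record:
            linked.append((m.group(1), record))
    return linked
```

    In a real system the matched span would be rewritten as a hyperlink to the record and to other resources mentioning the same name.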

  6. An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.

    Science.gov (United States)

    Saccomani, Maria Pia

    2011-08-01

    Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.

  7. A comparison of coronal mass ejections identified by manual and automatic methods

    Directory of Open Access Journals (Sweden)

    S. Yashiro

    2008-10-01

    Coronal mass ejections (CMEs) are related to many phenomena (e.g. flares, solar energetic particles, and geomagnetic storms), so compiling event catalogs is important for a global understanding of these phenomena. CMEs have long been identified manually, but in the SOHO era automatic identification methods are being developed. In order to clarify the advantages and disadvantages of the manual and automatic CME catalogs, we examined the distributions of CME properties listed in the CDAW (manual) and CACTus (automatic) catalogs. The two catalogs agree well on the properties of wide CMEs (width > 120°), while there is a significant discrepancy for narrow CMEs (width ≤ 30°): CACTus lists a larger number of narrow CMEs than CDAW. We carried out an event-by-event examination of a sample of events and found that the CDAW catalog has missed many narrow CMEs during the solar maximum. Another significant discrepancy was found for fast CMEs (speed > 1000 km/s): the majority of the fast CDAW CMEs are wide and originate from low latitudes, while the fast CACTus CMEs are narrow and originate from all latitudes. Event-by-event examination of a sample of events suggests that CACTus has a problem with the detection of fast CMEs.

  8. Analogue particle identifier and test unit for automatic measuring of errors

    International Nuclear Information System (INIS)

    Boden, A.; Lauch, J.

    1979-04-01

    A high-accuracy analogue particle identifier is described. The unit is used for particle identification or for correcting experiment-based errors in magnetic spectrometers. Signals proportional to the energy, the time-of-flight, or the position of absorption of the particles are supplied to an analogue computation circuit (multifunction converter). Three computation functions are available for different applications. The output of the identifier produces correction signals or pulses whose amplitudes are proportional to the mass of the particles. Particle identification and data correction can be optimized by adjusting variable parameters. An automatic test unit has been developed for adjustment and routine checking of particle identifiers. The computation functions can be tested by this unit with an accuracy of 1%. (orig.)
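
    The energy/time-of-flight relation at the heart of such an identifier follows from E = ½mv² with v = d/t, so the mass-proportional output obeys m ∝ E·t²/d². A digital sketch of that relation (non-relativistic; function name and unit choices are illustrative):

```python
def mass_from_energy_tof(energy_mev, tof_ns, path_m):
    """Non-relativistic particle identification:
    E = 1/2 m v^2 with v = d/t gives m c^2 = 2 E / beta^2.
    Returns the rest mass in MeV/c^2."""
    C = 0.299792458              # speed of light in m/ns
    v = path_m / tof_ns          # particle speed in m/ns
    beta = v / C
    return 2.0 * energy_mev / (beta * beta)
```

    The analogue multifunction converter implements the same product/quotient of signals in hardware, producing a pulse amplitude proportional to the mass.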

  9. Optical Automatic Car Identification (OACI) Field Test Program

    Science.gov (United States)

    1976-05-01

    The results of the Optical Automatic Car Identification (OACI) tests at Chicago conducted from August 16 to September 4, 1975 are presented. The main purpose of this test was to determine the suitability of optics as a principle of operation for an a...

  10. Difficulty identifying feelings and automatic activation in the fusiform gyrus in response to facial emotion.

    Science.gov (United States)

    Eichmann, Mischa; Kugel, Harald; Suslow, Thomas

    2008-12-01

    Difficulties in identifying and differentiating one's emotions are a central characteristic of alexithymia. In the present study, automatic activation of the fusiform gyrus to facial emotion was investigated as a function of alexithymia as assessed by the 20-item Toronto Alexithymia Scale. During 3 Tesla fMRI scanning, pictures of faces bearing sad, happy, and neutral expressions masked by neutral faces were presented to 22 healthy adults who also responded to the Toronto Alexithymia Scale. The fusiform gyrus was selected as the region of interest, and voxel values of this region were extracted, summarized as means, and tested among the different conditions (sad, happy, and neutral faces). Masked sad facial emotions were associated with greater bilateral activation of the fusiform gyrus than masked neutral faces. The subscale, Difficulty Identifying Feelings, was negatively correlated with the neural response of the fusiform gyrus to masked sad faces. The correlation results suggest that automatic hyporesponsiveness of the fusiform gyrus to negative emotion stimuli may reflect problems in recognizing one's emotions in everyday life.

  11. Automatic address validation and health record review to identify homeless Social Security disability applicants.

    Science.gov (United States)

    Erickson, Jennifer; Abbott, Kenneth; Susienka, Lucinda

    2018-06-01

    Homeless patients face a variety of obstacles in pursuit of basic social services. Acknowledging this, the Social Security Administration directs employees to prioritize homeless patients and handle their disability claims with special care. However, under existing manual processes for identification of homelessness, many homeless patients never receive the special service to which they are entitled. In this paper, we explore address validation and automatic annotation of electronic health records to improve identification of homeless patients. We developed a sample of claims containing medical records at the moment of arrival in a single office. Using address validation software, we reconciled patient addresses with public directories of homeless shelters, veterans' hospitals and clinics, and correctional facilities. Other tools annotated electronic health records. We trained random forests to identify homeless patients and validated each model with 10-fold cross validation. For our finished model, the area under the receiver operating characteristic curve was 0.942. The random forest improved sensitivity from 0.067 to 0.879 but decreased positive predictive value to 0.382. Presumed false positive classifications bore many characteristics of homelessness. Organizations could use these methods to prompt early collection of information necessary to avoid labor-intensive attempts to reestablish contact with homeless individuals. Annually, such methods could benefit tens of thousands of patients who are homeless, destitute, and in urgent need of assistance. We were able to identify many more homeless patients through a combination of automatic address validation and natural language processing of unstructured electronic health records. Copyright © 2018. Published by Elsevier Inc.
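
    The reported sensitivity (0.879) and positive predictive value (0.382) are confusion-matrix quantities; a minimal sketch of their definitions (the counts in the example are hypothetical, not the study's):

```python
def classification_metrics(tp, fp, fn, tn):
    """Sensitivity (recall) and positive predictive value (precision)
    computed from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # fraction of true cases that were flagged
    ppv = tp / (tp + fp)           # fraction of flags that were correct
    return sensitivity, ppv
```

    The trade-off the abstract describes, high sensitivity at the cost of lower PPV, is visible directly in these two ratios: flagging more candidates raises tp but also fp.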

  12. Applying deep learning technology to automatically identify metaphase chromosomes using scanning microscopic images: an initial investigation

    Science.gov (United States)

    Qiu, Yuchen; Lu, Xianglan; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Li, Shibo; Liu, Hong; Zheng, Bin

    2016-03-01

    Automated high throughput scanning microscopy is a fast developing screening technology used in cytogenetic laboratories for the diagnosis of leukemia or other genetic diseases. However, one of the major challenges of using this new technology is how to efficiently detect the analyzable metaphase chromosomes during the scanning process. The purpose of this investigation is to develop a computer aided detection (CAD) scheme based on deep learning technology, which can identify metaphase chromosomes with high accuracy. The CAD scheme includes an eight-layer neural network. The first six layers form an automatic feature extraction module with an architecture of three convolution-max-pooling layer pairs; the 1st, 2nd and 3rd pairs contain 30, 20 and 20 feature maps, respectively. The seventh and eighth layers form a multilayer perceptron (MLP) based classifier, which is used to identify the analyzable metaphase chromosomes. The performance of the new CAD scheme was assessed by the receiver operating characteristic (ROC) method. A total of 150 regions of interest (ROIs) were selected to test the performance of the new CAD scheme; each ROI contains either an interphase cell or metaphase chromosomes. The results indicate that the new scheme is able to achieve an area under the ROC curve (AUC) of 0.886+/-0.043. This investigation demonstrates that applying a deep learning technique may significantly improve the accuracy of metaphase chromosome detection using scanning microscopic imaging technology in the future.
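
    The AUC reported above can be computed without tracing the ROC curve, via the Mann-Whitney rank statistic; a small sketch (the scores are hypothetical classifier outputs, not the study's data):

```python
def auc_rank(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen positive
    (e.g. an analyzable metaphase ROI) scores above a randomly
    chosen negative, counting ties as 1/2 (Mann-Whitney U
    divided by n_pos * n_neg)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

    This O(n·m) form is fine for 150 ROIs; large studies use the equivalent rank-sum formulation.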

  13. Automatable algorithms to identify nonmedical opioid use using electronic data: a systematic review.

    Science.gov (United States)

    Canan, Chelsea; Polinski, Jennifer M; Alexander, G Caleb; Kowal, Mary K; Brennan, Troyen A; Shrank, William H

    2017-11-01

    Improved methods to identify nonmedical opioid use can help direct health care resources to individuals who need them. Automated algorithms that use large databases of electronic health care claims or records for surveillance are a potential means to achieve this goal. In this systematic review, we reviewed the utility, attempts at validation, and application of such algorithms to detect nonmedical opioid use. We searched PubMed and Embase for articles describing automatable algorithms that used electronic health care claims or records to identify patients or prescribers with likely nonmedical opioid use. We assessed algorithm development, validation, and performance characteristics and the settings where they were applied. Study variability precluded a meta-analysis. Of 15 included algorithms, 10 targeted patients, 2 targeted providers, 2 targeted both, and 1 identified medications with high abuse potential. Most patient-focused algorithms (67%) used prescription drug claims and/or medical claims, with diagnosis codes of substance abuse and/or dependence as the reference standard. Eleven algorithms were developed via regression modeling. Four used natural language processing, data mining, audit analysis, or factor analysis. Automated algorithms can facilitate population-level surveillance. However, there is no true gold standard for determining nonmedical opioid use. Users must recognize the implications of identifying false positives and, conversely, false negatives. Few algorithms have been applied in real-world settings. Automated algorithms may facilitate identification of patients and/or providers most likely to need more intensive screening and/or intervention for nonmedical opioid use. Additional implementation research in real-world settings would clarify their utility. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. 

  14. A technique for automatically extracting useful field of view and central field of view images.

    Science.gov (United States)

    Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar

    2016-01-01

    It is essential to ensure the uniform response of a single photon emission computed tomography gamma camera system before using it for clinical studies, by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min, depending both on the activity of the Cobalt-57 flood source and on the counts prespecified in the vendor's protocol (usually 4000K-10,000K counts). If the acquired total counts are less than the prespecified total counts, the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, removing unwanted information, and automatically extracting and saving the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on the Ubuntu operating system and was verified by subjecting it to simulated and real flood source images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time and with fewer constraints.
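
    The extraction step can be sketched in a few lines of NumPy: threshold the flood image to find the exposed region's bounding box (UFOV) and take its central 75% as the CFOV, following the NEMA convention; the NEMA integral uniformity formula is also shown. This is an assumption-laden sketch, not the authors' MATLAB implementation:

```python
import numpy as np

def extract_fov(image, thresh_frac=0.1, cfov_frac=0.75):
    """Crop a flood image to its useful field of view (bounding box of
    pixels above a threshold) and a central field of view (the central
    75% of the UFOV, per the NEMA convention)."""
    mask = image > thresh_frac * image.max()
    rows, cols = np.any(mask, axis=1), np.any(mask, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    ufov = image[r0:r1 + 1, c0:c1 + 1]
    h, w = ufov.shape
    dh = int(round(h * (1 - cfov_frac) / 2))
    dw = int(round(w * (1 - cfov_frac) / 2))
    cfov = ufov[dh:h - dh, dw:w - dw]
    return ufov, cfov

def integral_uniformity(region):
    """NEMA integral uniformity: 100 * (max - min) / (max + min)."""
    return 100.0 * (region.max() - region.min()) / (region.max() + region.min())
```

    A real preprocessing step would also smooth the image and handle masked or zero-count pixels before computing uniformity.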

  15. A technique for automatically extracting useful field of view and central field of view images

    International Nuclear Information System (INIS)

    Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar

    2016-01-01

    It is essential to ensure the uniform response of a single photon emission computed tomography gamma camera system before using it for clinical studies, by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min, depending both on the activity of the Cobalt-57 flood source and on the counts prespecified in the vendor's protocol (usually 4000K-10,000K counts). If the acquired total counts are less than the prespecified total counts, the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, removing unwanted information, and automatically extracting and saving the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on the Ubuntu operating system and was verified by subjecting it to simulated and real flood source images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time and with fewer constraints.

  16. Automatically generating Feynman rules for improved lattice field theories

    International Nuclear Information System (INIS)

    Hart, A.; Hippel, G.M. von; Horgan, R.R.; Storoni, L.C.

    2005-01-01

    Deriving the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially when improvement terms are present. This physically important task is, however, suitable for automation. We describe a flexible algorithm for generating Feynman rules for a wide range of lattice field theories including gluons, relativistic fermions and heavy quarks. We also present an efficient implementation of this in a freely available, multi-platform programming language (Python), optimised to deal with a wide class of lattice field theories.

  17. Field Robotics in Sports: Automatic Generation of guidance Lines for Automatic Grass Cutting, Striping and Pitch Marking of Football Playing Fields

    Directory of Open Access Journals (Sweden)

    Ole Green

    2011-03-01

    Progress is constantly being made and new applications are constantly emerging in the area of field robotics. In this paper, a promising application of field robotics to football playing fields is introduced. An algorithmic approach is presented for generating the waypoints required to guide a GPS-based field robot through a football playing field to automatically carry out periodic tasks such as cutting the grass, pitch and line marking, and lawn striping. Manual operation of these tasks requires very skilful personnel able to work long hours with very high concentration for the field to comply with the standards of the Fédération Internationale de Football Association (FIFA). By contrast, a GPS-guided vehicle or robot with three implements (a grass mower, a lawn striping roller, and a track marking illustrator) is capable of working 24 h a day, in most weather and in harsh soil conditions, without loss of quality. The proposed approach to the automatic operation of football playing fields requires no or very limited human intervention; it therefore saves numerous working hours and frees workers to focus on other tasks. An economic feasibility study showed that the proposed method is economically superior to current manual practices.
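
    The waypoint generation can be sketched for an idealized rectangular field as boustrophedon (back-and-forth) coverage, with passes spaced one implement width apart (function name and the flat, axis-aligned field are assumptions; the paper's algorithm works on GPS coordinates of a real pitch):

```python
def boustrophedon_waypoints(width_m, length_m, implement_m):
    """Back-and-forth guidance lines covering a rectangular field:
    parallel passes spaced one implement width apart, alternating
    direction so the vehicle never crosses its own stripe."""
    waypoints, x, reverse = [], implement_m / 2.0, False
    while x <= width_m - implement_m / 2.0 + 1e-9:
        ys = (length_m, 0.0) if reverse else (0.0, length_m)
        waypoints += [(x, ys[0]), (x, ys[1])]   # start and end of the pass
        reverse = not reverse
        x += implement_m
    return waypoints
```

    For striping, alternating pass direction is what produces the light/dark mowing pattern; headland turns between passes are omitted here.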

  18. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard

    2010-01-01

    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV...

  19. Automatic jargon identifier for scientists engaging with the public and science communication educators

    Science.gov (United States)

    Chapnik, Noam; Yosef, Roy; Baram-Tsabari, Ayelet

    2017-01-01

    Scientists are required to communicate science and research not only to other experts in the field, but also to scientists and experts from other fields, as well as to the public and policymakers. One fundamental suggestion when communicating with non-experts is to avoid professional jargon. However, because they are trained to speak with highly specialized language, avoiding jargon is difficult for scientists, and there is no standard to guide scientists in adjusting their messages. In this research project, we present the development and validation of the data produced by an up-to-date, scientist-friendly program for identifying jargon in popular written texts, based on a corpus of over 90 million words published on the BBC site during the years 2012–2015. The validation of results by the jargon identifier, the De-jargonizer, involved three mini studies: (1) comparison and correlation with existing frequency word lists in the literature; (2) a comparison with previous research on spoken language jargon use in TED transcripts of non-science lectures, TED transcripts of science lectures and transcripts of academic science lectures; and (3) a test of 5,000 pairs of published research abstracts and lay reader summaries describing the same article from the journals PLOS Computational Biology and PLOS Genetics. Validation procedures showed that the data classification of the De-jargonizer significantly correlates with existing frequency word lists, replicates similar jargon differences in previous studies on scientific versus general lectures, and identifies significant differences in jargon use between abstracts and lay summaries. As expected, more jargon was found in the academic abstracts than lay summaries; however, the percentage of jargon in the lay summaries exceeded the amount recommended for the public to understand the text. Thus, the De-jargonizer can help scientists identify problematic jargon when communicating science to non-experts, and be implemented
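
    The frequency-based classification underlying the De-jargonizer can be sketched as a lookup against a general-language word list (the mini word list and function name here are hypothetical; the real tool is built from a roughly 90-million-word BBC corpus):

```python
def jargon_rate(text, common_words):
    """Fraction of words absent from a general-language frequency
    list -- a crude proxy for the De-jargonizer's classification."""
    words = [w.strip('.,;:()').lower() for w in text.split()]
    words = [w for w in words if w]
    flagged = [w for w in words if w not in common_words]
    return len(flagged) / len(words), flagged

# hypothetical mini frequency list; the real tool uses a large news corpus
COMMON = {'the', 'cell', 'division', 'we', 'studied', 'in', 'cancer'}
```

    A writer would then rephrase or define the flagged words, aiming to keep the jargon rate below the level lay readers can tolerate.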

  20. Automatic jargon identifier for scientists engaging with the public and science communication educators.

    Directory of Open Access Journals (Sweden)

    Tzipora Rakedzon

    Scientists are required to communicate science and research not only to other experts in the field, but also to scientists and experts from other fields, as well as to the public and policymakers. One fundamental suggestion when communicating with non-experts is to avoid professional jargon. However, because they are trained to speak with highly specialized language, avoiding jargon is difficult for scientists, and there is no standard to guide scientists in adjusting their messages. In this research project, we present the development and validation of the data produced by an up-to-date, scientist-friendly program for identifying jargon in popular written texts, based on a corpus of over 90 million words published on the BBC site during the years 2012-2015. The validation of results by the jargon identifier, the De-jargonizer, involved three mini studies: (1) comparison and correlation with existing frequency word lists in the literature; (2) a comparison with previous research on spoken language jargon use in TED transcripts of non-science lectures, TED transcripts of science lectures and transcripts of academic science lectures; and (3) a test of 5,000 pairs of published research abstracts and lay reader summaries describing the same article from the journals PLOS Computational Biology and PLOS Genetics. Validation procedures showed that the data classification of the De-jargonizer significantly correlates with existing frequency word lists, replicates similar jargon differences in previous studies on scientific versus general lectures, and identifies significant differences in jargon use between abstracts and lay summaries. As expected, more jargon was found in the academic abstracts than lay summaries; however, the percentage of jargon in the lay summaries exceeded the amount recommended for the public to understand the text. Thus, the De-jargonizer can help scientists identify problematic jargon when communicating science to non-experts, and

  21. Automatic jargon identifier for scientists engaging with the public and science communication educators.

    Science.gov (United States)

    Rakedzon, Tzipora; Segev, Elad; Chapnik, Noam; Yosef, Roy; Baram-Tsabari, Ayelet

    2017-01-01

    Scientists are required to communicate science and research not only to other experts in the field, but also to scientists and experts from other fields, as well as to the public and policymakers. One fundamental suggestion when communicating with non-experts is to avoid professional jargon. However, because they are trained to speak with highly specialized language, avoiding jargon is difficult for scientists, and there is no standard to guide scientists in adjusting their messages. In this research project, we present the development and validation of the data produced by an up-to-date, scientist-friendly program for identifying jargon in popular written texts, based on a corpus of over 90 million words published on the BBC site during the years 2012-2015. The validation of results by the jargon identifier, the De-jargonizer, involved three mini studies: (1) comparison and correlation with existing frequency word lists in the literature; (2) a comparison with previous research on spoken language jargon use in TED transcripts of non-science lectures, TED transcripts of science lectures and transcripts of academic science lectures; and (3) a test of 5,000 pairs of published research abstracts and lay reader summaries describing the same article from the journals PLOS Computational Biology and PLOS Genetics. Validation procedures showed that the data classification of the De-jargonizer significantly correlates with existing frequency word lists, replicates similar jargon differences in previous studies on scientific versus general lectures, and identifies significant differences in jargon use between abstracts and lay summaries. As expected, more jargon was found in the academic abstracts than lay summaries; however, the percentage of jargon in the lay summaries exceeded the amount recommended for the public to understand the text. Thus, the De-jargonizer can help scientists identify problematic jargon when communicating science to non-experts, and be implemented by

  22. Automatic alignment device for focal spot measurements in the center of the field for mammography

    International Nuclear Information System (INIS)

    Vieira, Marcelo A.C.; Watanabe, Alex O.; Oliveira Junior, Paulo D.; Schiabel, Homero

    2010-01-01

    Some quality control procedures used in mammography, such as focal spot evaluation, require prior alignment of the measurement equipment with the central X-ray beam. However, alignment procedures are, in general, the most difficult task and the one that takes the most time to perform. Moreover, the operator is sometimes exposed to radiation during this procedure. This work presents an automatic alignment system for mammographic equipment that locates the central ray of the radiation beam and immediately aligns with it by moving itself automatically across the field. The system consists of a bidirectional moving device connected to a CCD sensor for digital radiographic image acquisition. A computational analysis of a radiographic image, acquired at any position in the field, is performed in order to determine its position under the X-ray beam. Finally, a mechanical system with two directions of movement, electronically controlled by a microcontroller over USB communication, makes the system align automatically with the central ray of the radiation beam. The alignment process is fully automatic, fast, and accurate, with no operator exposure to radiation, which allows considerable time savings in performing quality control procedures for mammography. (author)

  3. Wide-Field Imaging Telescope-0 (WIT0) with automatic observing system

    Science.gov (United States)

    Ji, Tae-Geun; Byeon, Seoyeon; Lee, Hye-In; Park, Woojin; Lee, Sang-Yun; Hwang, Sungyong; Choi, Changsu; Gibson, Coyne Andrew; Kuehne, John W.; Prochaska, Travis; Marshall, Jennifer L.; Im, Myungshin; Pak, Soojong

    2018-01-01

    We introduce the Wide-Field Imaging Telescope-0 (WIT0) and its automatic observing system. It was developed to monitor the variability of many sources at a time, e.g. young stellar objects and active galactic nuclei, and can also locate transient sources such as supernovae or gamma-ray bursts. In 2017 February, we installed the wide-field 10-inch telescope (Takahashi CCA-250) as a piggyback system on the 30-inch telescope at the McDonald Observatory in Texas, US. The 10-inch telescope has a 2.35 × 2.35 deg field-of-view with a 4k × 4k CCD camera (FLI ML16803). To improve the observational efficiency of the system, we developed new automatic observing software, KAOS30 (KHU Automatic Observing Software for McDonald 30-inch telescope), written in Visual C++ for the Windows operating system. The software consists of four control packages: the Telescope Control Package (TCP), the Data Acquisition Package (DAP), the Auto Focus Package (AFP), and the Script Mode Package (SMP). Since it also supports instruments that use the ASCOM driver, additional hardware installation is greatly simplified. We commissioned KAOS30 in 2017 August and are in the process of testing it. Based on the WIT0 experience, we will extend KAOS30 to control multiple telescopes in future projects.

  4. Automatic NMR field-frequency lock-pulsed phase locked loop approach.

    Science.gov (United States)

    Kan, S; Gonord, P; Fan, M; Sauzade, M; Courtieu, J

    1978-06-01

    A self-contained deuterium frequency-field lock scheme for a high-resolution NMR spectrometer is described. It is based on phase locked loop techniques in which the free induction decay signal behaves as a voltage-controlled oscillator. By pulsing the spins at an offset frequency of a few hundred hertz and using a digital phase-frequency discriminator, this method not only eliminates the usual phase, RF power, and offset adjustments needed in conventional lock systems but also possesses automatic pull-in characteristics that dispense with the use of field sweeps to locate the NMR line prior to closure of the lock loop.

  5. Developing Automatic Water Table Control System for Reducing Greenhouse Gas Emissions from Paddy Fields

    Science.gov (United States)

    Arif, C.; Fauzan, M. I.; Satyanto, K. S.; Budi, I. S.; Masaru, M.

    2018-05-01

    The water table in rice fields plays an important role in mitigating greenhouse gas (GHG) emissions from paddy fields. Continuous flooding, maintaining the water table 2-5 cm above the soil surface, is not effective and releases more GHG emissions. The System of Rice Intensification (SRI), an alternative rice farming method, applies intermittent irrigation with a lower water table, which has been shown to reduce GHG emissions without reducing productivity significantly. The objectives of this study were to develop an automatic water table control system for SRI application and then to evaluate its performance. The control system was developed with fuzzy logic algorithms on a Raspberry Pi mini PC. In laboratory and field tests, the developed system worked well, as indicated by low MAPE (mean absolute percentage error) values: 16.88% for the simulation test and 15.80% for the field test, respectively. The system can save up to 42.54% of irrigation water without reducing productivity significantly when compared to manual irrigation systems.
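
    The MAPE figure of merit quoted above is straightforward to compute; a minimal sketch with invented sample water-level data:

```python
# Mean absolute percentage error between observed water levels and the
# controller's target levels. The sample values are invented.

def mape(actual, predicted):
    return 100.0 * sum(abs((a - p) / a)
                       for a, p in zip(actual, predicted)) / len(actual)

observed = [10.0, 8.0, 12.0]   # measured water table depth, cm (invented)
setpoint = [9.0, 8.8, 11.4]    # controller target, cm (invented)
err = mape(observed, setpoint)
```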

  6. Semi-automatic mapping for identifying complex geobodies in seismic images

    Science.gov (United States)

    Domínguez-C, Raymundo; Romero-Salcedo, Manuel; Velasquillo-Martínez, Luis G.; Shemeretov, Leonid

    2017-03-01

    Seismic images are composed of positive and negative seismic wave traces with different amplitudes (Robein 2010 Seismic Imaging: A Review of the Techniques, their Principles, Merits and Limitations (Houten: EAGE)). The association of these amplitudes together with a color palette forms complex visual patterns. The color intensity of such patterns is directly related to impedance contrasts: the higher the contrast, the higher the color intensity. Generally speaking, low impedance contrasts are depicted with low tone colors, creating zones with different patterns whose features are not evident to the 3D automated mapping options available in commercial software. In this work, a workflow for semi-automatic mapping of seismic images, focused on those areas with low-intensity colored zones that may be associated with geobodies of petroleum interest, is proposed. The CIE L*A*B* color space was used to perform the seismic image processing, which helped find small but significant differences between pixel tones. This process generated binary masks that bound regions of low-intensity color. The three-dimensional mask projection allowed the construction of 3D structures for such zones (geobodies). The proposed method was applied to a set of digital images from a seismic cube and tested on four representative study cases. The obtained results are encouraging because interesting geobodies are obtained with a minimum of information.

  7. Automatic Recognition of Chinese Personal Name Using Conditional Random Fields and Knowledge Base

    Directory of Open Access Journals (Sweden)

    Chuan Gu

    2015-01-01

    Based on the features of Chinese personal names, we present an approach to Chinese personal name recognition using conditional random fields (CRF) and a knowledge base. The method builds multiple features for the CRF model by adopting the Chinese character as the processing unit, selects useful features with a selection algorithm based on the knowledge base and an incremental feature template, and finally implements automatic recognition of Chinese personal names in Chinese documents. Experimental results on an open real-world corpus demonstrated the effectiveness of our method, which achieved high accuracy and recall.
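
    A character-level feature template of the kind such CRF taggers consume can be sketched as follows. The tiny surname lexicon and the feature set are invented for illustration, not taken from the paper; a real system would pass these dictionaries to a CRF toolkit such as CRF++ or sklearn-crfsuite.

```python
# Build per-character feature dicts for a CRF-style Chinese name tagger:
# the character itself, a knowledge-base surname flag, and context characters.

SURNAMES = {"王", "李", "张"}   # toy knowledge base of family names (invented)

def char_features(sent, i):
    c = sent[i]
    return {
        "char": c,
        "is_surname": c in SURNAMES,
        "prev": sent[i - 1] if i > 0 else "<BOS>",
        "next": sent[i + 1] if i < len(sent) - 1 else "<EOS>",
    }

sent = "王小明在北京"
features = [char_features(sent, i) for i in range(len(sent))]
```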

  8. The RISE Framework: Using Learning Analytics to Automatically Identify Open Educational Resources for Continuous Improvement

    Science.gov (United States)

    Bodily, Robert; Nyland, Rob; Wiley, David

    2017-01-01

    The RISE (Resource Inspection, Selection, and Enhancement) Framework is a framework supporting the continuous improvement of open educational resources (OER). The framework is an automated process that identifies learning resources that should be evaluated and either eliminated or improved. This is particularly useful in OER contexts where the…

  9. Identifying Basketball Plays from Sensor Data; towards a Low-Cost Automatic Extraction of Advanced Statistics

    DEFF Research Database (Denmark)

    Sangüesa, Adrià Arbués; Moeslund, Thomas B.; Bahnsen, Chris Holmberg

    2017-01-01

    Advanced statistics have proved to be a crucial tool for basketball coaches in order to improve training skills. Indeed, the performance of the team can be further optimized by studying the behaviour of players under certain conditions. In the United States of America, companies such as STATS or Second Spectrum use a complex multi-camera setup to deliver advanced statistics to all NBA teams, but the price of this service is far beyond the budget of the vast majority of European teams. For this reason, a first prototype based on positioning sensors is presented. An experimental dataset has been created and meaningful basketball features have been extracted. 97.9% accuracy is obtained using Support Vector Machines when identifying 5 different classic plays: floppy offense, pick and roll, press break, post-up situation and fast breaks. After recognizing these plays in video sequences, advanced...

  10. Identifying Discrimination at Work: The Use of Field Experiments.

    Science.gov (United States)

    Pager, Devah; Western, Bruce

    2012-06-01

    Antidiscrimination law offers protection to workers who have been treated unfairly on the basis of their race, gender, religion, or national origin. In order for these protections to be invoked, however, potential plaintiffs must be aware of and able to document discriminatory treatment. Given the subtlety of contemporary forms of discrimination, it is often difficult to identify discrimination when it has taken place. The methodology of field experiments offers one approach to measuring and detecting hiring discrimination, providing direct observation of discrimination in real-world settings. In this article, we discuss the findings of two recent field experiments measuring racial discrimination in low wage labor markets. This research provides several relevant findings for researchers and those interested in civil rights enforcement: (1) it produces estimates of the rate of discrimination at the point of hire; (2) it yields evidence about the interactions associated with discrimination (many of which reveal the subtlety with which contemporary discrimination is practiced); and (3) it provides a vehicle for both research on and enforcement of antidiscrimination law.

  11. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    Science.gov (United States)

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) in normal subjects, a Peripheral Pulse Analyzer (PPA) was used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data were acquired in seven rounds; placebo was administered in rounds 1 and 2, and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to groups of around 40 subjects each. Although processing of the data required human intervention, a software application has been developed to analyze the processed data and detect responses, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data, and its outcome has been compared with the manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than the subjective analysis; the additional responses were manually verified to be true positives, which indicates the robustness of the application software. The automatic analysis software was also run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility, and 385 responses were detected in contrast to 272 for the variability parameters. It was observed that 65% of the subjects eliciting a response were common to both sets. This not only validates the software utility for giving consistent yields but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
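
    The adaptive, statistics-based threshold can be illustrated with a minimal sketch: flag a post-intervention value as a response when it deviates from the baseline mean by more than k standard deviations. The data and the choice of k are invented; the utility's actual statistics are not detailed in the abstract.

```python
# Adaptive response detection: a value counts as a response when it lies more
# than k baseline standard deviations from the baseline mean.
import statistics

def detect_response(baseline, value, k=2.0):
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return abs(value - mu) > k * sd

baseline = [70.0, 72.0, 71.0, 69.0, 73.0]   # invented pre-intervention values
```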

  12. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    Science.gov (United States)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O'Donovan, Peter; O'Sullivan, Dominic T. J.

    2017-11-01

    Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation as well as the sub-system to which the fault was attributed. Manually identifying faults using maintenance logs can be effective, but is also highly time consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages on the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine’s sub-systems, a routine maintenance activity, a grid-related event or a number of other categories. This is then checked against maintenance logs for accuracy, and the labelled data are fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage, and can be effectively used for SCADA-based prediction of turbine faults.
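
    The core labelling idea, finding contiguous zero-power periods and attributing each to the alarm active at its onset, can be sketched as follows. The sample data, alarm codes, and category mapping are invented, not taken from the paper.

```python
# Label turbine stoppages from SCADA samples: group contiguous zero-power
# runs and tag each with the category of the alarm active when it began.

CATEGORY = {101: "pitch fault", 205: "grid event", 0: "unknown"}  # invented

def find_stoppages(samples):
    """samples: list of (timestamp, power_kW, alarm_code)."""
    stoppages, start, code = [], None, 0
    for t, p, alarm in samples:
        if p <= 0 and start is None:        # stoppage begins
            start, code = t, alarm
        elif p > 0 and start is not None:   # stoppage ends
            stoppages.append((start, t, CATEGORY.get(code, "unknown")))
            start = None
    return stoppages

data = [(0, 500, 0), (1, 0, 101), (2, 0, 101), (3, 450, 0),
        (4, 0, 205), (5, 480, 0)]
events = find_stoppages(data)
```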

  13. Nonrelativistic effective field theories of QED and QCD. Applications and automatic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Shtabovenko, Vladyslav

    2017-05-22

    This thesis deals with the applications of nonrelativistic Effective Field Theories to electromagnetic and strong interactions. The main results of this work are divided into three parts. In the first part, we use potential Nonrelativistic Quantum Electrodynamics (pNRQED), an EFT of QED at energies much below m_e α (with m_e being the electron mass and α the fine-structure constant), to develop a consistent description of electromagnetic van der Waals forces between two hydrogen atoms at a separation R much larger than the Bohr radius. We consider the interactions at short (R << 1/(m_e α^2)), long (R >> 1/(m_e α^2)) and intermediate (R ∝ 1/(m_e α^2)) distances and identify the relevant dynamical scales that characterize each of the three regimes. For each regime we construct a suitable van der Waals EFT, that provides the simplest description of the low-energy dynamics. In this framework, van der Waals potentials naturally arise from the matching coefficients of the corresponding EFTs. They can be computed in a systematic way, order by order in the relevant expansion parameters, as is done in this work. Furthermore, the potentials receive contributions from radiative corrections and have to be renormalized. The development of a consistent EFT framework to treat electromagnetic van der Waals interactions between hydrogen atoms and the renormalization of the corresponding van der Waals potentials are the novel features of this study. In the second part, we study relativistic O(α_s^0 υ^2) (with α_s being the strong coupling constant) corrections to the exclusive electromagnetic production of the heavy quarkonium χ_cJ and a hard photon in the framework of nonrelativistic Quantum Chromodynamics (NRQCD), an EFT of QCD that takes full advantage of the nonrelativistic nature of charmonia and bottomonia and exploits the wide separation of the relevant dynamical scales. These scales are m_Q >> m_Q υ >> m_Q υ^2

  14. Nonrelativistic effective field theories of QED and QCD. Applications and automatic calculations

    International Nuclear Information System (INIS)

    Shtabovenko, Vladyslav

    2017-01-01

    This thesis deals with the applications of nonrelativistic Effective Field Theories to electromagnetic and strong interactions. The main results of this work are divided into three parts. In the first part, we use potential Nonrelativistic Quantum Electrodynamics (pNRQED), an EFT of QED at energies much below m_e α (with m_e being the electron mass and α the fine-structure constant), to develop a consistent description of electromagnetic van der Waals forces between two hydrogen atoms at a separation R much larger than the Bohr radius. We consider the interactions at short (R << 1/(m_e α^2)), long (R >> 1/(m_e α^2)) and intermediate (R ∝ 1/(m_e α^2)) distances and identify the relevant dynamical scales that characterize each of the three regimes. For each regime we construct a suitable van der Waals EFT, that provides the simplest description of the low-energy dynamics. In this framework, van der Waals potentials naturally arise from the matching coefficients of the corresponding EFTs. They can be computed in a systematic way, order by order in the relevant expansion parameters, as is done in this work. Furthermore, the potentials receive contributions from radiative corrections and have to be renormalized. The development of a consistent EFT framework to treat electromagnetic van der Waals interactions between hydrogen atoms and the renormalization of the corresponding van der Waals potentials are the novel features of this study. In the second part, we study relativistic O(α_s^0 υ^2) (with α_s being the strong coupling constant) corrections to the exclusive electromagnetic production of the heavy quarkonium χ_cJ and a hard photon in the framework of nonrelativistic Quantum Chromodynamics (NRQCD), an EFT of QCD that takes full advantage of the nonrelativistic nature of charmonia and bottomonia and exploits the wide separation of the relevant dynamical scales. These scales are m_Q >> m_Q υ >> m_Q υ^2, where m_Q is the heavy quark mass and υ is the relative

  15. Automatic detection of diabetic retinopathy features in ultra-wide field retinal images

    Science.gov (United States)

    Levenkova, Anastasia; Sowmya, Arcot; Kalloniatis, Michael; Ly, Angelica; Ho, Arthur

    2017-03-01

    Diabetic retinopathy (DR) is a major cause of irreversible vision loss. DR screening relies on retinal clinical signs (features). Opportunities for computer-aided DR feature detection have emerged with the development of Ultra-WideField (UWF) digital scanning laser technology. UWF imaging covers 82% greater retinal area (200°), against 45° in conventional cameras, allowing more clinically relevant retinopathy to be detected. UWF images also provide a high resolution of 3078 × 2702 pixels. Currently DR screening uses 7 overlapping conventional fundus images, and the UWF images provide similar results. However, in 40% of cases, more retinopathy was found by UWF outside the 7 standard ETDRS fields, and in 10% of cases, retinopathy was reclassified as more severe. This is because UWF imaging allows examination of both the central retina and more peripheral regions, with the latter implicated in DR. We have developed an algorithm for automatic recognition of DR features, including bright lesions (cotton wool spots and exudates) and dark lesions (microaneurysms and blot, dot and flame haemorrhages) in UWF images. The algorithm extracts features from grayscale (green "red-free" laser light) and colour-composite UWF images, including intensity, Histogram-of-Gradient and Local Binary Pattern features. Pixel-based classification is performed with three different classifiers. The main contribution is the automatic detection of DR features in the peripheral retina. The method is evaluated by leave-one-out cross-validation on 25 UWF retinal images with 167 bright lesions, and 61 other images with 1089 dark lesions. The SVM classifier performs best, with an AUC of 94.4% / 95.31% for bright / dark lesions.
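
    One of the texture features mentioned, the local binary pattern (LBP), is easy to sketch: each pixel is encoded by comparing its eight neighbours with its own value. The 3 × 3 image below is invented, and real implementations vary in neighbour ordering and rotation invariance; this is only an illustrative instance.

```python
# Basic 8-neighbour local binary pattern code for one pixel: set bit i when
# neighbour i (clockwise from the top-left) is >= the centre value.

def lbp_code(img, y, x):
    c = img[y][x]
    neighbours = [img[y - 1][x - 1], img[y - 1][x], img[y - 1][x + 1],
                  img[y][x + 1], img[y + 1][x + 1], img[y + 1][x],
                  img[y + 1][x - 1], img[y][x - 1]]
    return sum((1 << i) for i, n in enumerate(neighbours) if n >= c)

img = [[5, 9, 1],
       [3, 4, 8],
       [2, 7, 6]]
code = lbp_code(img, 1, 1)
```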

  16. Automatic lung tumor segmentation on PET/CT images using fuzzy Markov random field model.

    Science.gov (United States)

    Guo, Yu; Feng, Yuanming; Sun, Jian; Zhang, Ning; Lin, Wang; Sa, Yu; Wang, Ping

    2014-01-01

    The combination of positron emission tomography (PET) and CT images provides complementary functional and anatomical information of human tissues and it has been used for better tumor volume definition of lung cancer. This paper proposed a robust method for automatic lung tumor segmentation on PET/CT images. The new method is based on a fuzzy Markov random field (MRF) model. The combination of PET and CT image information is achieved by using a proper joint posterior probability distribution of observed features in the fuzzy MRF model, which performs better than the commonly used Gaussian joint distribution. In this study, the PET and CT simulation images of 7 non-small cell lung cancer (NSCLC) patients were used to evaluate the proposed method. Tumor segmentations with the proposed method and a manual method by an experienced radiation oncologist on the fused images were performed, respectively. Segmentation results obtained with the two methods were similar, and Dice's similarity coefficient (DSC) was 0.85 ± 0.013. It has been shown that effective and automatic segmentations can be achieved with this method for lung tumors located near other organs with similar intensities in PET and CT images, such as when the tumors extend into the chest wall or mediastinum.

  17. Automatic Lung Tumor Segmentation on PET/CT Images Using Fuzzy Markov Random Field Model

    Directory of Open Access Journals (Sweden)

    Yu Guo

    2014-01-01

    The combination of positron emission tomography (PET) and CT images provides complementary functional and anatomical information of human tissues and it has been used for better tumor volume definition of lung cancer. This paper proposed a robust method for automatic lung tumor segmentation on PET/CT images. The new method is based on a fuzzy Markov random field (MRF) model. The combination of PET and CT image information is achieved by using a proper joint posterior probability distribution of observed features in the fuzzy MRF model, which performs better than the commonly used Gaussian joint distribution. In this study, the PET and CT simulation images of 7 non-small cell lung cancer (NSCLC) patients were used to evaluate the proposed method. Tumor segmentations with the proposed method and a manual method by an experienced radiation oncologist on the fused images were performed, respectively. Segmentation results obtained with the two methods were similar, and Dice’s similarity coefficient (DSC) was 0.85 ± 0.013. It has been shown that effective and automatic segmentations can be achieved with this method for lung tumors located near other organs with similar intensities in PET and CT images, such as when the tumors extend into the chest wall or mediastinum.

  18. Automatic analysis of altered gait in arylsulphatase A-deficient mice in the open field.

    Science.gov (United States)

    Leroy, Toon; Stroobants, Stijn; Aerts, Jean-Marie; D'Hooge, Rudi; Berckmans, Daniel

    2009-08-01

    In current research with laboratory animals, observing their dynamic behavior or locomotion is a labor-intensive task. Automatic continuous monitoring can provide quantitative data on each animal's condition and coordination ability. The objective of the present work is to develop an automated mouse observation system integrated with a conventional open-field test for motor function evaluation. Data were acquired from 86 mice carrying a targeted disruption of the arylsulphatase A (ASA) gene, which impairs coordinated locomotion: 36 heterozygotes (12 females) and 50 knockout mice (30 females), aged 6 months. The mice were placed one at a time into the test setup, which consisted of a Plexiglas cage (53 × 34.5 × 26 cm) and two fluorescent bulbs for proper illumination. The transparent cage allowed images to be captured from underneath, providing information about the dynamic variation of the positions of the limbs of the mice for gait reconstruction. Every mouse was recorded for 10 min. Background subtraction and color filtering were used to measure and calculate image features: variables that contain crucial information, such as the mouse's position, orientation, body outline, and candidate locations for the mouse's paws. A set of heuristic rules was used to prune implausible paw features and label the remaining ones as front/hind and left/right. Paw features that were consistent over subsequent images were then matched to footprints. Finally, from the measured footprint sequence, eight parameters were calculated in order to quantify the gait of the mouse. This automatic observation technique can be integrated with a regular open-field test, where the trajectory and motor function of a free-moving mouse are measured simultaneously.
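
    The background-subtraction step can be sketched minimally: pixels that differ from a static background by more than a threshold are kept as foreground (the mouse). One-dimensional toy frames are used for brevity; the actual system works on colour video frames, and the threshold here is an invented value.

```python
# Foreground mask by background subtraction: 1 where the frame deviates from
# the static background by more than `thresh`, else 0.

def foreground_mask(frame, background, thresh=20):
    return [1 if abs(f - b) > thresh else 0 for f, b in zip(frame, background)]

background = [10, 10, 12, 11, 10]   # invented static background intensities
frame      = [10, 90, 95, 11, 10]   # invented frame with a bright object
mask = foreground_mask(frame, background)
```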

  19. Automatic control of positioning along the joint during EBW in conditions of action of magnetic fields

    Science.gov (United States)

    Druzhinina, A. A.; Laptenok, V. D.; Murygin, A. V.; Laptenok, P. V.

    2016-11-01

    Accurate positioning along the joint during electron beam welding is a difficult scientific and technical problem, yet essential for achieving high-quality welds. No definitive solution has yet been found, owing to the weak interference immunity of joint-position sensors during the welding process. Magnetic fields frequently deflect the electron beam from the optical axis of the electron beam gun during welding. A collimated X-ray sensor is used to monitor the beam deflection caused by the action of magnetic fields; its signal is processed by the method of synchronous detection. Analysis of the spectral characteristics of the X-ray sensor showed that displacement of the joint from the optical axis of the gun affects the output signal of the sensor. The authors propose a dual-circuit system for automatic positioning of the electron beam on the joint during electron beam welding in the presence of magnetic interference. This system includes a joint-tracking contour and a magnetic-field compensation contour. The proposed system is stable. Calculation of the dynamic error of the system showed that the positioning error does not exceed the permissible deviation of the electron beam from the joint plane.

  20. Automatic feathering of split fields for step-and-shoot intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Dogan, Nesrin; Leybovich, Leonid B; Sethi, Anil; Emami, Bahman

    2003-01-01

    Due to leaf travel range limitations of the Varian Dynamic Multileaf Collimator (DMLC) system, an IMRT field width exceeding 14.5 cm is split into two or more adjacent abutting sub-fields. The abutting sub-fields are then delivered as separate treatment fields. The accuracy of the delivery is very sensitive to multileaf positioning accuracy. Uncertainties in leaf and carriage positions cause errors in the delivered dose (e.g., hot or cold spots) along the match line of abutting sub-fields. The dose errors are proportional to the penumbra slope at the edge of each sub-field. To alleviate this problem, we developed techniques that feather the split line of IMRT fields. Feathering of the split line was achieved by dividing IMRT fields into several sub-groups with different split line positions. A Varian 21EX accelerator with an 80-leaf DMLC was used for IMRT delivery. Cylindrical targets with varying widths (>14.5 cm) were created to study the split line positions. Seven coplanar 6 MV fields were selected for planning using the NOMOS-CORVUS™ system. The isocentre of the fields was positioned at the centre of the target volume. Verification was done in a 30 × 30 × 30 cm³ polystyrene phantom using film dosimetry. We investigated two techniques to shift the split line from its original position, i.e., to feather it: (1) varying the isocentre position along the target width and (2) introducing a 'pseudo target' outside of the patient (phantom). The position of the 'pseudo target' was determined by analysing the divergence of the IMRT fields. For target widths of 14-28 cm, IMRT fields were automatically split into two sub-fields, and the split line was positioned along the centre of the target by CORVUS. Measured dose distributions demonstrated that the dose to the critical structure was 10% higher than planned when the split line crossed through the centre of the target. Both methods of modifying the split line positions resulted in maximum shifts of ∼1 cm

  1. Towards an automatic wind speed and direction profiler for Wide Field adaptive optics systems

    Science.gov (United States)

    Sivo, G.; Turchi, A.; Masciadri, E.; Guesalaga, A.; Neichel, B.

    2018-05-01

    Wide Field Adaptive Optics (WFAO) systems are among the most sophisticated adaptive optics (AO) systems available today on large telescopes. Knowledge of the vertical spatio-temporal distribution of wind speed (WS) and direction (WD) is fundamental to optimize the performance of such systems. Previous studies already proved that the Gemini Multi-Conjugated AO system (GeMS) is able to retrieve measurements of the WS and WD stratification using the SLOpe Detection And Ranging (SLODAR) technique and to store measurements in the telemetry data. In order to assess the reliability of these estimates and of the SLODAR technique applied to such complex AO systems, in this study we compared WS and WD values retrieved from GeMS with those obtained with the atmospheric model Meso-NH on a rich statistical sample of nights. It has previously been proved that the latter technique provides excellent agreement with a large sample of radiosoundings, both in statistical terms and on individual flights. It can be considered, therefore, as an independent reference. The excellent agreement between GeMS measurements and the model that we find in this study proves the robustness of the SLODAR approach. To bypass the complex procedures necessary to achieve automatic wind measurements with GeMS, we propose a simple automatic method to monitor nightly WS and WD using Meso-NH model estimates. Such a method can be applied to any present or new-generation facility supported by WFAO systems. The interest of this study is, therefore, well beyond the optimization of GeMS performance.

  2. Automatically Identifying Fusion Events between GLUT4 Storage Vesicles and the Plasma Membrane in TIRF Microscopy Image Sequences

    Directory of Open Access Journals (Sweden)

    Jian Wu

    2015-01-01

    Quantitative analysis of the dynamic behavior of membrane-bound secretory vesicles has proven to be important in biological research. This paper proposes a novel approach to automatically identify the elusive fusion events between VAMP2-pHluorin labeled GLUT4 storage vesicles (GSVs) and the plasma membrane. The differentiation is implemented to detect the initiation of fusion events by modified forward subtraction of consecutive frames in the TIRFM image sequence. Spatially connected pixels in difference images brighter than a specified adaptive threshold are grouped into a distinct fusion spot. The vesicles are located at the intensity-weighted centroid of their fusion spots. To reveal the true in vivo nature of a fusion event, 2D Gaussian fitting of the fusion spot is used to derive the intensity-weighted centroid and the spot size during the fusion process. The fusion event and its termination can be determined according to the change of spot size. The method is evaluated on real experimental data with ground truth annotated by expert cell biologists. The evaluation results show that it can achieve relatively high accuracy, comparing favorably to the manual analysis, yet at a small fraction of the time.
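
    The forward-subtraction idea can be sketched as follows: difference consecutive frames, keep pixels that brightened by more than an adaptive threshold (here, mean plus two standard deviations of the differences, an invented choice standing in for the paper's threshold), and take the intensity-weighted centroid of the survivors as the fusion spot. The toy frames are invented.

```python
# Detect a candidate fusion spot by forward subtraction of consecutive frames:
# threshold the brightening adaptively, then return the intensity-weighted
# centroid (x, y) of the surviving pixels, or None when nothing passes.
import statistics

def fusion_spot(prev, curr):
    diffs = [c - p for rp, rc in zip(prev, curr) for p, c in zip(rp, rc)]
    thr = statistics.mean(diffs) + 2 * statistics.stdev(diffs)
    pts = [(x, y, curr[y][x] - prev[y][x])
           for y in range(len(curr)) for x in range(len(curr[0]))
           if curr[y][x] - prev[y][x] > thr]
    if not pts:
        return None
    w = sum(v for _, _, v in pts)
    return (sum(x * v for x, _, v in pts) / w,
            sum(y * v for _, y, v in pts) / w)

prev = [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
curr = [[0, 0, 0], [0, 50, 0], [0, 0, 0]]
spot = fusion_spot(prev, curr)
```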

  3. Exploratory field trial of motorcycle autonomous emergency braking (MAEB): Considerations on the acceptability of unexpected automatic decelerations.

    Science.gov (United States)

    Savino, Giovanni; Pierini, Marco; Thompson, Jason; Fitzharris, Michael; Lenné, Michael G

    2016-11-16

    Autonomous emergency braking (AEB) acts to slow down a vehicle when an unavoidable impending collision is detected. In addition to documented benefits when applied to passenger cars, AEB has also shown potential when applied to motorcycles (MAEB). However, the feasibility of MAEB as practically applied to motorcycles in the real world is not well understood. In this study we performed a field trial involving 16 riders on a test motorcycle subjected to automatic decelerations, thus simulating MAEB activation. The tests were conducted along a rectilinear path at a nominal speed of 40 km/h and with a mean deceleration of 0.15 g (15% of full braking) deployed at random times. Riders were also exposed to one final undeclared brake activation with the aim of providing genuinely unexpected automatic braking events. Participants were consistently able to manage automatic decelerations of the vehicle with minor to moderate effort. Results of undeclared activations were consistent with those of standard runs. This study demonstrated the feasibility of a moderate automatic deceleration in a scenario of a motorcycle travelling in a straight path, supporting the notion that the application of AEB on motorcycles is practicable. Furthermore, the proposed field trial can be used as a reference for future regulation or consumer tests in order to address the safety and acceptability of unexpected automatic decelerations on a motorcycle.

  4. Diagnostic tools for identifying sleepy drivers in the field.

    Science.gov (United States)

    2013-05-06

    The overarching goal of this project was to identify and evaluate cognitive and behavioral indices that are sensitive to sleep deprivation and may help identify commercial motor vehicle (CMV) drivers who are at risk for driving in a sleep-deprived ...

  5. Development of a doorframe-typed swinging seedling pick-up device for automatic field transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Han, H.; Mao, H.; Hu, J.; Tian, K.

    2015-07-01

    A doorframe-typed swing seedling pick-up device for automatic field transplanters was developed and evaluated in a laboratory. The device, consisting of a path manipulator and two grippers, can move the pins slowly to extract seedlings from the tray cells and return quickly to the pick-up point for the next extraction. The path manipulator was constructed with a creative design combining type-II mechanisms in series. It consists of an oscillating guide linkage mechanism and a grooved globoidal cam mechanism. The gripper is a pincette-type mechanism whose pick-up pins penetrate the root mass for seedling extraction. The dynamics of the designed seedling pick-up device were simulated with ADAMS software. As this was the first prototype, various performance tests under local production conditions were conducted to determine the optimal machine operation parameters and transplant production conditions. Because the gripper with multiple fine pins is moved by the swing pick-up device, it can effectively complete the transplanting work cycle of extracting, transferring, and discharging a seedling. The laboratory evaluation showed that the pick-up device equipped with two grippers can extract 80 seedlings/min with a 90% success rate and a 3% failure rate in discharging seedlings, using 42-day-old tomato plantlets. The quality of the extracted seedlings was satisfactory. (Author)

  6. Bianchi identities and the automatic conservation of energy-momentum and angular momentum in general-relativistic field theories

    International Nuclear Information System (INIS)

    Hehl, F.W.; McCrea, J.D.

    1986-01-01

    Automatic conservation of energy-momentum and angular momentum is guaranteed in a gravitational theory if, via the field equations, the conservation laws for the material currents are reduced to the contracted Bianchi identities. We first execute an irreducible decomposition of the Bianchi identities in a Riemann-Cartan space-time. Then, starting from a Riemannian space-time with or without torsion, we determine those gravitational theories which have automatic conservation: general relativity and the Einstein-Cartan-Sciama-Kibble theory, both with cosmological constant, and the nonviable pseudoscalar model. The Poincare gauge theory of gravity, like gauge theories of internal groups, has no automatic conservation in the sense defined above. This does not lead to any difficulties in principle. Analogies to 3-dimensional continuum mechanics are stressed throughout the article.

  7. Bianchi identities and the automatic conservation of energy-momentum and angular momentum in general-relativistic field theories

    Science.gov (United States)

    Hehl, Friedrich W.; McCrea, J. Dermott

    1986-03-01

    Automatic conservation of energy-momentum and angular momentum is guaranteed in a gravitational theory if, via the field equations, the conservation laws for the material currents are reduced to the contracted Bianchi identities. We first execute an irreducible decomposition of the Bianchi identities in a Riemann-Cartan space-time. Then, starting from a Riemannian space-time with or without torsion, we determine those gravitational theories which have automatic conservation: general relativity and the Einstein-Cartan-Sciama-Kibble theory, both with cosmological constant, and the nonviable pseudoscalar model. The Poincaré gauge theory of gravity, like gauge theories of internal groups, has no automatic conservation in the sense defined above. This does not lead to any difficulties in principle. Analogies to 3-dimensional continuum mechanics are stressed throughout the article.

  8. The system for automatic dose rate measurements by mobile groups in field

    International Nuclear Information System (INIS)

    Drabova, D.; Filgas, R.; Cespirova, I.; Ejemova, M.

    1998-01-01

    The comparison of characteristics between a pressurized ionization chamber, a plastic scintillator and a proportional counter is given. Based on the requirements and a comparison of the properties of the various probes, a system for automatic dose rate measurement and integration of geographic co-ordinates in the field was designed and tested. The system is built around a proportional counter, a so-called intelligent probe that can be easily connected to a personal computer. The probe measures in the energy range 30 keV - 1.3 MeV with reasonable energy and angular response; it can measure dose rates in the range 50 nSv/h - 1 Sv/h with a typical efficiency of 9.5 imp/s per μSv/h. The probe is fixed in a holder placed on the front mask of a car. For the simultaneous determination of geographical co-ordinates, a personal GPS navigator Garmin 95 is used. Both devices are controlled by a notebook via two serial ports; the second serial port, which is not common in notebooks, is provided by a PCMCIA card. Data collected in the field by a mobile group can be transmitted to the assessment centre by cellular GSM phone, using a Nokia 2110 connected to the notebook by a PCMCIA card. The whole system is powered from the car battery and controlled by specially developed software. The software was developed in the FoxPro 2.5 environment and works under MS-DOS 6.22; it also runs without problems in a Windows 95 DOS window. The results of dose rate measurements obtained during route monitoring are stored in files. They can be displayed on a graphic screen, presenting the geographical distribution of the dose rate values colour-coded on a map and the time sequence of the measured data. (authors)
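    The probe efficiency quoted in this abstract (9.5 imp/s per μSv/h) implies a direct conversion from count rate to dose rate; a minimal sketch, where the range check mirrors the stated measuring range of 50 nSv/h to 1 Sv/h (the function name and return convention are illustrative):

```python
# Convert a measured count rate to a dose rate using the probe efficiency
# quoted in the abstract (9.5 counts/s per uSv/h), and check whether the
# result falls inside the stated measuring range.

EFFICIENCY = 9.5        # counts per second, per uSv/h
RANGE_MIN_USVH = 0.05   # 50 nSv/h expressed in uSv/h
RANGE_MAX_USVH = 1e6    # 1 Sv/h expressed in uSv/h

def dose_rate_usvh(count_rate_cps):
    """Return (dose rate in uSv/h, whether it lies in the measuring range)."""
    rate = count_rate_cps / EFFICIENCY
    in_range = RANGE_MIN_USVH <= rate <= RANGE_MAX_USVH
    return rate, in_range
```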

  9. Automatization of laboratory extraction installation intended for investigations in the field of reprocessing of spent fuels

    International Nuclear Information System (INIS)

    Vznuzdaev, E.A.; Galkin, B.Ya.; Gofman, F.Eh.

    1981-01-01

    The paper describes an automated test stand for optimum control of the technological extraction process in spent fuel reprocessing, by means of a computer-based automated control system. Preliminary experiments conducted on the stand with spent fuel from a WWER-440 reactor showed the high efficiency of automation and the possibility of carrying out technological investigations in a short period of time, yielding much information that cannot be obtained with an ordinary organisation of the work [ru

  10. Automatic machine-learning based identification of jogging periods from accelerometer measurements of adolescents under field conditions.

    Science.gov (United States)

    Zdravevski, Eftim; Risteska Stojkoska, Biljana; Standl, Marie; Schulz, Holger

    2017-01-01

    Assessment of the health benefits associated with physical activity depends on the activity's duration, intensity and frequency; their correct identification is therefore very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether the identification performance is improved when using two accelerometers at the hip and ankle, compared to when using only one at either position. The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions as part of the GINIplus study. The data were obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was applied to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: logistic regression, Support Vector Machines, Random Forest and Extremely Randomized Trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. The reported jogging periods were verified by visual inspection and used as the gold standard. After feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60 s or 180 s. The best matching ratio, i.e. the length of correctly identified jogging periods relative to the total time including the missed ones, was up to 0.875. It could be further improved, up to 0.967, by applying post-classification rules that considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers; rather, almost the same performance could be achieved from
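    The sliding-window segmentation step described in this abstract can be sketched in pure Python. The 60 s window, the zero overlap and the mean/standard-deviation features are illustrative assumptions, not the study's automated feature-engineering pipeline.

```python
# Cut a stream of accelerometer samples into fixed-length windows and
# summarise each window with simple features a classifier could consume.

def window_features(samples, rate_hz, window_s=60, step_s=60):
    """Return (start_time_s, mean, std) for each full window of samples."""
    size = int(window_s * rate_hz)
    step = int(step_s * rate_hz)
    feats = []
    for start in range(0, len(samples) - size + 1, step):
        win = samples[start:start + size]
        mean = sum(win) / size
        std = (sum((v - mean) ** 2 for v in win) / size) ** 0.5
        feats.append((start / rate_hz, mean, std))
    return feats
```

    In the study, such per-window features would then feed one of the four classifiers (logistic regression, SVM, Random Forest, Extremely Randomized Trees).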

  11. Automatic machine-learning based identification of jogging periods from accelerometer measurements of adolescents under field conditions.

    Directory of Open Access Journals (Sweden)

    Eftim Zdravevski

    Full Text Available Assessment of the health benefits associated with physical activity depends on the activity's duration, intensity and frequency; their correct identification is therefore very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether the identification performance is improved when using two accelerometers at the hip and ankle, compared to when using only one at either position. The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions as part of the GINIplus study. The data were obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was applied to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: logistic regression, Support Vector Machines, Random Forest and Extremely Randomized Trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. The reported jogging periods were verified by visual inspection and used as the gold standard. After feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60 s or 180 s. The best matching ratio, i.e. the length of correctly identified jogging periods relative to the total time including the missed ones, was up to 0.875. It could be further improved, up to 0.967, by applying post-classification rules that considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers; rather, almost the same performance could be

  12. Identifying Future Training Technology Opportunities Using Career Field Models and Simulations

    National Research Council Canada - National Science Library

    Bennett, Jr., Winston; Stone, Brice; Turner, Kathryn; Ruck, Hendrick W

    2002-01-01

    ... itself. This report presents results from a recent application of a career field education and training planning simulation capability to identify cost-effective opportunities for the introduction...

  13. Scan-Less Line Field Optical Coherence Tomography, with Automatic Image Segmentation, as a Measurement Tool for Automotive Coatings

    Directory of Open Access Journals (Sweden)

    Samuel Lawman

    2017-04-01

    Full Text Available The measurement of the thicknesses of layers is important for the quality assurance of industrial coating systems. Current measurement techniques only provide a limited amount of information. Here, we show that spectral domain Line Field (LF) Optical Coherence Tomography (OCT) is able to return to the user a cross-sectional B-Scan image in a single shot with no mechanical moving parts. To reliably extract layer thicknesses from such images of automotive paint systems, we present an automatic graph-search image segmentation algorithm. To show that the algorithm works independently of the OCT device, the measurements are repeated with a separate time domain Full Field (FF) OCT system. This gives matching mean thickness values within the standard deviations of the measured thicknesses across each B-Scan image. The combination of LF-OCT with graph-search segmentation is potentially a powerful technique for the quality assurance of non-opaque industrial coating layers.

  14. Automatic detection of tulip breaking virus (TBV) in tulip fields using machine vision

    NARCIS (Netherlands)

    Polder, G.; Heijden, van der G.W.A.M.; Doorn, van J.; Baltissen, A.H.M.C.

    2014-01-01

    Tulip breaking virus (TBV) causes severe economic losses in flower bulbs in the Netherlands. To prevent further spread by aphids, the vector of the disease, infected plants must be removed from the field as soon as possible. Until now screening has been carried out by visual inspection in the field.

  15. Automatic detection of tulip breaking virus (TBV) in tulip fields using machine vision

    NARCIS (Netherlands)

    Polder, G.; Heijden, van der G.W.A.M.; Doorn, van J.; Baltissen, A.H.M.C.

    2012-01-01

    Tulip breaking virus (TBV) causes severe economic losses for the Netherlands. Infected plants must be removed from the field as soon as possible to prevent further spread by aphids. Until now, screening has been done by visual inspection in the field. As the availability of human experts is limited, there

  16. Terminology of the public relations field: corpus — automatic term recognition — terminology database

    Directory of Open Access Journals (Sweden)

    Nataša Logar Berginc

    2013-12-01

    Full Text Available The article describes an analysis of automatic term recognition results performed for single- and multi-word terms with the LUIZ term extraction system. The target application of the results is a terminology database of Public Relations, and the main resource is the KoRP Public Relations Corpus. Our analysis is focused on two segments: (a) single-word noun term candidates, which we compare with the frequency list of nouns from KoRP and whose termhood we evaluate on the basis of the judgements of two domain experts, and (b) multi-word term candidates with a verb or a noun as headword. In order to better assess the performance of the system and the soundness of our approach, we also performed an analysis of recall. Our results show that the terminological relevance of extracted nouns is indeed higher than that of merely frequent nouns, and that verbal phrases only rarely count as proper terms. The most productive patterns of multi-word terms with a noun as headword have the following structure: [adjective + noun], [adjective + and + adjective + noun] and [adjective + adjective + noun]. The analysis of recall shows low inter-annotator agreement, but nevertheless very satisfactory recall levels.

  17. Identifying research fields within business and management: a journal cross-citation analysis

    NARCIS (Netherlands)

    Mingers, J.; Leydesdorff, L.

    2015-01-01

    A discipline such as business and management (B&M) is very broad and has many fields within it, ranging from fairly scientific ones such as management science or economics to softer ones such as information systems. There are at least three reasons why it is important to identify these sub-fields

  18. FieldChopper, a new tool for automatic model generation and virtual screening based on molecular fields.

    Science.gov (United States)

    Kalliokoski, Tuomo; Ronkko, Toni; Poso, Antti

    2008-06-01

    Algorithms were developed for ligand-based virtual screening of molecular databases. FieldChopper (FC) is based on the discretization of the electrostatic and van der Waals field into three classes. A model is built from a set of superimposed active molecules. The similarity of the compounds in the database to the model is then calculated using matrices that define scores for comparing field values of different categories. The method was validated using 12 publicly available data sets by comparing the method to the electrostatic similarity comparison program EON. The results suggest that FC is competitive with more complex descriptors and could be used as a molecular sieve in virtual screening experiments when multiple active ligands are known.
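    The core idea described in this abstract, discretizing field values into three classes and scoring pairs of classes with a matrix, can be sketched as follows. The cut-off values and the score matrix entries are invented for illustration; they are not FieldChopper's actual parameters.

```python
# Sketch of matrix-based scoring over discretized field values: each grid
# point's field value is mapped to one of three classes, and two aligned
# grids are compared with a class-pair score matrix.

def discretize(value, low=-0.5, high=0.5):
    """Map a field value to class 0 (negative), 1 (neutral) or 2 (positive)."""
    if value < low:
        return 0
    if value > high:
        return 2
    return 1

# score[a][b]: reward matching classes, penalise opposite-sign mismatches
# (illustrative values).
SCORE = [
    [ 1.0, 0.0, -1.0],
    [ 0.0, 0.5,  0.0],
    [-1.0, 0.0,  1.0],
]

def field_similarity(model_field, probe_field):
    """Average matrix score over aligned grid points of two field samplings."""
    pairs = zip(model_field, probe_field)
    total = sum(SCORE[discretize(m)][discretize(p)] for m, p in pairs)
    return total / len(model_field)
```

    In a virtual-screening setting, database compounds would be ranked by such a similarity score against the model built from the superimposed actives.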

  19. Automatic verification of step-and-shoot IMRT field segments using portal imaging

    International Nuclear Information System (INIS)

    Woo, M.K.; Lightstone, A.W.; Shan, G.; Kumaraswamy, L.; Li, Y.

    2003-01-01

    In step-and-shoot IMRT, many individual beam segments are delivered. These segments are generated by the IMRT treatment planning system and subsequently transmitted electronically through computer hardware and software modules before they are finally delivered. Hence, an independent system that monitors the actual field shape during treatment delivery adds a level of quality assurance to this complicated process. In this paper we describe the development and testing of such a system. The system verifies the field shape by comparing the radiation field detected by the built-in portal imaging system on the linac to the field shape planned on the treatment planning system. The comparison is based on a software algorithm that detects the leaf edge positions of the radiation field on the portal image and compares them to the calculated positions. The process is fully automated and requires minimal intervention from the radiation therapists. The system has been tested with actual clinical plan sequences and was able to alert the operator to incorrect settings in real time.
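    The edge-comparison step described above can be sketched for a single 1-D intensity profile across a leaf pair. The half-maximum edge rule and the tolerance value are common conventions assumed for illustration; they are not necessarily the authors' exact algorithm.

```python
# Find the radiation field edges in a 1-D portal-image intensity profile
# (here, the first/last pixels at or above half the maximum) and compare
# them with the planned leaf positions within a tolerance.

def field_edges(profile):
    """Return (left, right) indices where intensity reaches 50% of maximum."""
    half = max(profile) / 2.0
    above = [i for i, v in enumerate(profile) if v >= half]
    return above[0], above[-1]

def segment_matches(profile, planned_left, planned_right, tol=2):
    """True if both detected edges are within tol pixels of the plan."""
    left, right = field_edges(profile)
    return abs(left - planned_left) <= tol and abs(right - planned_right) <= tol
```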

  20. Persistent Identifiers for Field Deployments: A Missing Link in the Provenance Chain

    Science.gov (United States)

    Arko, R. A.; Ji, P.; Fils, D.; Shepherd, A.; Chandler, C. L.; Lehnert, K.

    2016-12-01

    Research in the geosciences is characterized by a wide range of complex and costly field deployments including oceanographic cruises, submersible dives, drilling expeditions, seismic networks, geodetic campaigns, moored arrays, aircraft flights, and satellite missions. Each deployment typically produces a mix of sensor and sample data, spanning a period from hours to decades, that ultimately yields a long tail of post-field products and publications. Publishing persistent, citable identifiers for field deployments will facilitate 1) preservation and reuse of the original field data, 2) reproducibility of the resulting publications, and 3) recognition for both the facilities that operate the platforms and the investigators who secure funding for the experiments. In the ocean domain, sharing unique identifiers for field deployments is a familiar practice. For example, the Biological and Chemical Oceanography Data Management Office (BCO-DMO) routinely links datasets to cruise identifiers published by the Rolling Deck to Repository (R2R) program. In recent years, facilities have started to publish formal/persistent identifiers, typically Digital Object Identifiers (DOIs), for field deployments including seismic networks, oceanographic cruises, and moored arrays. For example, the EarthChem Library (ECL) publishes a DOI for each dataset which, if it derived from an oceanographic research cruise on a US vessel, is linked to a DOI for the cruise published by R2R. Work is underway to create similar links for the IODP JOIDES Resolution Science Operator (JRSO) and the Continental Scientific Drilling Coordination Office (CSDCO). We present results and lessons learned including a draft schema for publishing field deployments as DataCite DOI records; current practice for linking these DOIs with related identifiers such as Open Researcher and Contributor IDs (ORCIDs), Open Funder Registry (OFR) codes, and International Geo Sample Numbers (IGSNs); and consideration of other

  1. The readmission risk flag: using the electronic health record to automatically identify patients at risk for 30-day readmission.

    Science.gov (United States)

    Baillie, Charles A; VanZandbergen, Christine; Tait, Gordon; Hanish, Asaf; Leas, Brian; French, Benjamin; Hanson, C William; Behta, Maryam; Umscheid, Craig A

    2013-12-01

    Identification of patients at high risk for readmission is a crucial step toward improving care and reducing readmissions. The adoption of electronic health records (EHR) may prove important to strategies designed to risk stratify patients and introduce targeted interventions. To develop and implement an automated prediction model integrated into our health system's EHR that identifies on admission patients at high risk for readmission within 30 days of discharge. Retrospective and prospective cohort. Healthcare system consisting of 3 hospitals. All adult patients admitted from August 2009 to September 2012. An automated readmission risk flag integrated into the EHR. Thirty-day all-cause and 7-day unplanned healthcare system readmissions. Using retrospective data, a single risk factor, ≥ 2 inpatient admissions in the past 12 months, was found to have the best balance of sensitivity (40%), positive predictive value (31%), and proportion of patients flagged (18%), with a C statistic of 0.62. Sensitivity (39%), positive predictive value (30%), proportion of patients flagged (18%), and C statistic (0.61) during the 12-month period after implementation of the risk flag were similar. There was no evidence for an effect of the intervention on 30-day all-cause and 7-day unplanned readmission rates in the 12-month period after implementation. An automated prediction model was effectively integrated into an existing EHR and identified patients on admission who were at risk for readmission within 30 days of discharge. © 2013 Society of Hospital Medicine.
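    The single flagging rule that performed best in this study (two or more inpatient admissions in the past 12 months) reduces to a one-line check at admission time. The function signature and record layout below are a hypothetical illustration, not the health system's EHR integration.

```python
# Flag a patient at admission if they had >= 2 inpatient admissions in the
# preceding 12 months (the rule reported in the abstract; 365 days is an
# assumed approximation of "12 months").

from datetime import date, timedelta

def flag_readmission_risk(admission_date, prior_admission_dates):
    """True if >= 2 prior inpatient admissions fall in the past 12 months."""
    cutoff = admission_date - timedelta(days=365)
    recent = [d for d in prior_admission_dates if cutoff <= d < admission_date]
    return len(recent) >= 2
```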

  2. De-identifying Swedish clinical text - refinement of a gold standard and experiments with Conditional random fields

    Directory of Open Access Journals (Sweden)

    Dalianis Hercules

    2010-04-01

    Full Text Available Abstract Background In order to perform research on the information contained in Electronic Patient Records (EPRs), access to the data itself is needed. This is often very difficult due to confidentiality regulations. The data sets need to be fully de-identified before they can be distributed to researchers. De-identification is a difficult task where the definitions of annotation classes are not self-evident. Results We present work on the creation of two refined variants of a manually annotated gold standard for de-identification, one created automatically, and one created through discussions among the annotators. The data are a subset of the Stockholm EPR Corpus, a data set available within our research group. These are used for the training and evaluation of an automatic system based on the Conditional Random Fields algorithm. Evaluating with four-fold cross-validation on sets of around 4,000-6,000 annotation instances, we obtained very promising results for both gold standards: an F-score around 0.80 for a number of experiments, with higher results for certain annotation classes. Moreover, 49 false positives found by the system but missed by the annotators were verified as true positives. Conclusions Our intention is to make this gold standard, the Stockholm EPR PHI Corpus, available to other research groups in the future. Despite being slightly more time-consuming, we believe the manual consensus gold standard is the most valuable for further research. We also propose a set of annotation classes to be used for similar de-identification tasks.

  3. ZCURVE 3.0: identify prokaryotic genes with higher accuracy as well as automatically and accurately select essential genes.

    Science.gov (United States)

    Hua, Zhi-Gang; Lin, Yan; Yuan, Ya-Zhou; Yang, De-Chang; Wei, Wen; Guo, Feng-Biao

    2015-07-01

    In 2003, we developed an ab initio program, ZCURVE 1.0, to find genes in bacterial and archaeal genomes. In this work, we present the updated version (i.e. ZCURVE 3.0). Using 422 prokaryotic genomes, the average accuracy was 93.7% with the updated version, compared with 88.7% with the original version. Such results also demonstrate that ZCURVE 3.0 is comparable with Glimmer 3.02 and may provide complementary predictions to it. In fact, the joint application of the two programs generated better results by correctly finding more annotated genes while also containing fewer false-positive predictions. As the exclusive function, ZCURVE 3.0 contains one post-processing program that can identify essential genes with high accuracy (generally >90%). We hope ZCURVE 3.0 will receive wide use with the web-based running mode. The updated ZCURVE can be freely accessed from http://cefg.uestc.edu.cn/zcurve/ or http://tubic.tju.edu.cn/zcurveb/ without any restrictions. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
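    The representation underlying ZCURVE is the Z-curve transform, which maps a DNA sequence onto three cumulative components: x (purine vs. pyrimidine), y (amino vs. keto) and z (weak vs. strong hydrogen bonding). The sketch below computes only this standard transform; the gene-finding statistics ZCURVE builds on top of it are not reproduced here.

```python
# Compute the three cumulative Z-curve components of a DNA sequence:
#   x_n = (A_n + G_n) - (C_n + T_n)   purine vs. pyrimidine
#   y_n = (A_n + C_n) - (G_n + T_n)   amino vs. keto
#   z_n = (A_n + T_n) - (G_n + C_n)   weak vs. strong hydrogen bonds

def z_curve(seq):
    """Return lists (x, y, z) of cumulative Z-curve components for seq."""
    a = c = g = t = 0
    xs, ys, zs = [], [], []
    for base in seq.upper():
        if base == 'A': a += 1
        elif base == 'C': c += 1
        elif base == 'G': g += 1
        elif base == 'T': t += 1
        xs.append((a + g) - (c + t))
        ys.append((a + c) - (g + t))
        zs.append((a + t) - (g + c))
    return xs, ys, zs
```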

  4. ZCURVE 3.0: identify prokaryotic genes with higher accuracy as well as automatically and accurately select essential genes

    Science.gov (United States)

    Hua, Zhi-Gang; Lin, Yan; Yuan, Ya-Zhou; Yang, De-Chang; Wei, Wen; Guo, Feng-Biao

    2015-01-01

    In 2003, we developed an ab initio program, ZCURVE 1.0, to find genes in bacterial and archaeal genomes. In this work, we present the updated version (i.e. ZCURVE 3.0). Using 422 prokaryotic genomes, the average accuracy was 93.7% with the updated version, compared with 88.7% with the original version. Such results also demonstrate that ZCURVE 3.0 is comparable with Glimmer 3.02 and may provide complementary predictions to it. In fact, the joint application of the two programs generated better results by correctly finding more annotated genes while also containing fewer false-positive predictions. As the exclusive function, ZCURVE 3.0 contains one post-processing program that can identify essential genes with high accuracy (generally >90%). We hope ZCURVE 3.0 will receive wide use with the web-based running mode. The updated ZCURVE can be freely accessed from http://cefg.uestc.edu.cn/zcurve/ or http://tubic.tju.edu.cn/zcurveb/ without any restrictions. PMID:25977299

  5. Screening for chloroquine maculopathy in populations with uncertain reliability in outcomes of automatic visual field testing

    NARCIS (Netherlands)

    P. Kunavisarut (Paradee); Chavengsaksongkram, P. (Pimploy); A. Rothová (Aniki); K. Pathanapitoon (Kessara)

    2016-01-01

    Purpose: The purpose of this study was to compare screening methods for the early detection of maculopathy in patients treated with chloroquine (CQ) or hydroxychloroquine (HCQ) and to identify the risk factors for the development of toxic maculopathy. Methods: We performed a prospective

  6. Influence of fogging lenses and cycloplegia on open-field automatic refraction.

    Science.gov (United States)

    Queirós, A; González-Méijome, J; Jorge, J

    2008-07-01

    To compare refractive values measured with and without cycloplegia, or with fogging lenses, using an open-field auto-refractor. One hundred and forty-two young adults were enrolled from a university population; 96 were female (67.6%) and 46 were male (32.4%), and the age range was 18-26 years (mean 22.3 +/- 3.7 years). The refraction measurement was obtained for the right eye of each subject with the Grand Seiko Auto Ref/Keratometer WAM-5500 (GS) under three conditions, always in this sequence: (1) without cycloplegia (GS), (2) without cycloplegia but using a + 2.00 D fogging lens (GS_2D) and (3) with cycloplegia (GS_cycl). When the average values of spherical equivalent were compared, both accommodation control strategies were almost equally successful: GS, M = -0.85 +/- 2.21 D; GS_2D, M = -0.53 +/- 2.10 D and GS_cycl, M = -0.57 +/- 2.24 D (Kruskal-Wallis test, p open-field autorefraction is performed in young adults.
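    The spherical equivalent values (M) compared across the three measurement conditions follow the standard optometric definition M = sphere + cylinder / 2 (in dioptres); a minimal sketch:

```python
# Spherical equivalent of a sphero-cylindrical refraction, in dioptres:
# M = sphere + cylinder / 2.

def spherical_equivalent(sphere_d, cylinder_d):
    """Return the spherical equivalent (D) from sphere and cylinder powers."""
    return sphere_d + cylinder_d / 2.0
```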

  7. Hyperspectral classification of grassland species: towards a UAS application for semi-automatic field surveys

    Science.gov (United States)

    Lopatin, Javier; Fassnacht, Fabian E.; Kattenborn, Teja; Schmidtlein, Sebastian

    2017-04-01

    Grasslands are among the ecosystems that have been most strongly altered during the past decades by anthropogenic impacts, affecting their structural and functional composition. To monitor the spatial and/or temporal changes of these environments, a reliable field survey is needed first. As quality relevés are usually expensive and time-consuming, the amount of information available is usually poor or not well spatially distributed at the regional scale. In the present study, we investigate the possibility of a semi-automated method for repeated surveys of monitoring sites. We analyze the applicability of very high spatial resolution hyperspectral data to classify grassland species at the level of individuals. The AISA+ imaging spectrometer mounted on a scaffold was used to scan 1 m2 grassland plots and assess the impact of four sources of variation on the predicted species cover: (1) the spatial resolution of the scans, (2) the species number and structural diversity, (3) the species cover, and (4) the species functional types (bryophytes, forbs and graminoids). We found that the spatial resolution and the diversity level (mainly structural diversity) were the most important sources of variation for the proposed approach. A spatial resolution below 1 cm produced relatively high model performance, while predictions with pixel sizes above that threshold produced inadequate results. Areas with low interspecies overlap reached median classification values of 0.8 (kappa). On the contrary, results were not satisfactory in plots with frequent interspecies overlap in multiple layers. By means of a bootstrapping procedure, we found that areas with shadows and mixed pixels introduce uncertainties into the classification. We conclude that the application of very high resolution hyperspectral remote sensing as a robust alternative or supplement to field surveys is possible for environments with low structural heterogeneity.
This study presents the first try of a

  8. Field manual for identifying and preserving high-water mark data

    Science.gov (United States)

    Feaster, Toby D.; Koenig, Todd A.

    2017-09-26

    This field manual provides general guidance for identifying and collecting high-water marks and is meant to be used by field personnel as a quick reference. The field manual describes purposes for collecting and documenting high-water marks along with the most common types of high-water marks. The manual provides a list of suggested field equipment, describes rules of thumb and best practices for finding high-water marks, and describes the importance of evaluating each high-water mark and assigning a numeric uncertainty value as part of the flagging process. The manual also includes an appendix of photographs of a variety of high-water marks obtained from various U.S. Geological Survey field investigations along with general comments about the logic for the assigned uncertainty values.

  9. Automatic segmentation for brain MR images via a convex optimized segmentation and bias field correction coupled model.

    Science.gov (United States)

    Chen, Yunjie; Zhao, Bo; Zhang, Jianwei; Zheng, Yuhui

    2014-09-01

    Accurate segmentation of magnetic resonance (MR) images remains challenging, mainly because of intensity inhomogeneity, commonly known as bias field. Active contour models with geometric information constraints have recently been applied; however, most of them handle the bias field in a separate pre-processing step before segmentation of the MR data. This paper presents a novel automatic variational method that segments brain MR images while simultaneously correcting the bias field, even for images with high intensity inhomogeneities. We first define a function for clustering the image pixels in a small neighborhood. The cluster centers in this objective function carry a multiplicative factor that estimates the bias within the neighborhood. To reduce the effect of noise, the local intensity variations are described by Gaussian distributions with different means and variances. The objective functions are then integrated over the entire domain. To obtain the global optimum and make the results independent of the initialization of the algorithm, we reconstruct the energy function to be convex and minimize it using the split Bregman method. A salient advantage of our method is that its result is independent of initialization, which allows robust and fully automated application. Our method is able to estimate bias fields with quite general profiles, even in 7T MR images. Moreover, our model can also distinguish regions with similar intensity distributions but different variances. The proposed method has been rigorously validated with images acquired on a variety of imaging modalities, with promising results. Copyright © 2014 Elsevier Inc. All rights reserved.
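
    The alternating structure behind such coupled segmentation-and-bias models can be illustrated with a toy one-dimensional sketch. This is plain alternating minimization in NumPy, not the authors' convex split Bregman formulation: pixels are assigned to bias-corrected cluster centers, the centers are re-fit by least squares, and the bias is re-estimated as a smoothed intensity ratio. All signal values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D "image": two tissue classes times a smooth multiplicative bias.
n = 400
true_labels = (np.arange(n) >= n // 2).astype(int)
true_means = np.array([1.0, 2.0])
bias = 1.0 + 0.3 * np.sin(np.linspace(0.0, np.pi, n))   # slowly varying field
img = true_means[true_labels] * bias + 0.02 * rng.standard_normal(n)

def smooth(x, w=51):
    """Moving average with proper normalization at the boundaries."""
    k = np.ones(w)
    return np.convolve(x, k, mode="same") / np.convolve(np.ones_like(x), k, mode="same")

# Alternate: (1) assign pixels to the nearest bias-corrected center,
# (2) re-estimate the centers, (3) re-estimate the bias as a smoothed ratio.
b = np.ones(n)
c = np.array([img.min(), img.max()])
for _ in range(20):
    labels = np.argmin((img[:, None] - b[:, None] * c[None, :]) ** 2, axis=1)
    for j in (0, 1):
        m = labels == j
        if m.any():
            c[j] = np.sum(img[m] * b[m]) / np.sum(b[m] ** 2)  # least-squares fit
    b = smooth(img / c[labels])

accuracy = float(np.mean(labels == true_labels))
```

The multiplicative ambiguity between `b` and `c` is resolved here only up to a scale factor; the paper's convex formulation additionally guarantees a global optimum, which this greedy loop does not.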

  10. Automatic history matching of an offshore field in Brazil; Ajuste automatico de historico de um campo offshore no Brasil

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Jose P.M. dos [PETROBRAS S.A., Macae, RJ (Brazil). Exploracao e Producao. Bacia de Campos]. E-mail: zepedro@ep-bc.petrobras.com.br; Schiozer, Denis J. [Universidade Estadual de Campinas, SP (Brazil). Dept. de Engenharia de Petroleo]. E-mail: denis@cepetro.unicamp.br

    2000-07-01

    Efficient reservoir management is strongly influenced by good production prediction, which in turn depends on good reservoir characterization. Owing to the complexity of the dynamics of multiphase flow in porous media and to the several geological uncertainties involved, this characterization is validated through history matching of the reservoir under study. History matching is usually a very complex task, and much of the time it can be a frustrating experience because of the large number of variables to be adjusted to reach a final objective that may combine several matches. Automated history matching techniques have been the object of several studies but have met limited acceptance because of the large computational effort required. Nowadays they are becoming more attractive, motivated by recent hardware and software developments. This work shows an example of automatic history matching applied to an offshore field in Brazil, with emphasis on the benefits of parallel computing and optimization techniques in reducing the total time of the process. It is shown that although the computational effort is higher, the total time of a reservoir study can be significantly reduced, with higher-quality results. (author)
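
    As a schematic illustration of the workflow (not the study's actual reservoir simulator or optimizer), history matching reduces to minimizing the mismatch between simulated and observed production. The toy decline-curve "simulator" and single permeability-like parameter below are hypothetical; the parallel parameter sweep shows where parallel computing cuts wall-clock time.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Toy "reservoir simulator": exponential production decline whose rate
# depends on a single permeability-like parameter k (hypothetical model).
t = np.linspace(0.0, 5.0, 50)

def simulate(k):
    return 100.0 * np.exp(-k * t)

k_true = 0.8
observed = simulate(k_true)  # stands in for the production history

def mismatch(k):
    """History-match objective: squared error between simulated and observed."""
    return float(np.sum((simulate(k) - observed) ** 2))

# Evaluate candidate parameters in parallel, mimicking the use of parallel
# computing to reduce the total time of the matching process.
candidates = np.linspace(0.1, 2.0, 96)
with ThreadPoolExecutor(max_workers=8) as pool:
    errors = list(pool.map(mismatch, candidates))

k_best = float(candidates[int(np.argmin(errors))])
```

A real study would replace the grid sweep with an optimization algorithm and each `mismatch` call with a full reservoir simulation, which is precisely why parallelizing the evaluations matters.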

  11. Review of Stat-Spotting: A Field Guide to Identifying Dubious Data by Joel Best

    Directory of Open Access Journals (Sweden)

    Joe Swingle

    2009-07-01

    Full Text Available Best, Joel. Stat-Spotting: A Field Guide to Identifying Dubious Data. (Berkeley: University of California Press, 2008) 144 pp. $19.95. ISBN 978-0-520-25746-7. Stat-Spotting is a practical, do-it-yourself manual for detecting questionable claims reported in the media. Using examples drawn mostly from mass media sources, Stat-Spotting provides readers with a number of useful tips for identifying potentially problematic statistics. The author's skillful analyses and explanations, presented in clear and concise prose, make Stat-Spotting an ideal guide for anyone who reads a newspaper, watches television, or surfs the Web. In short, everyone.

  12. Persistent Identifiers for Field Expeditions: A Next Step for the US Oceanographic Research Fleet

    Science.gov (United States)

    Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen

    2016-04-01

    Oceanographic research cruises are complex affairs, typically requiring an extensive effort to secure the funding, plan the experiment, and mobilize the field party. Yet cruises are not typically published online as first-class digital objects with persistent, citable identifiers linked to the scientific literature. The Rolling Deck to Repository (R2R; info@rvdata.us) program maintains a master catalog of oceanographic cruises for the United States research fleet, currently documenting over 6,000 expeditions on 37 active and retired vessels. In 2015, R2R started routinely publishing a Digital Object Identifier (DOI) for each completed cruise. Cruise DOIs, in turn, are linked to related persistent identifiers where available including the Open Researcher and Contributor ID (ORCID) for members of the science party, the International Geo Sample Number (IGSN) for physical specimens collected during the cruise, the Open Funder Registry (FundRef) codes that supported the experiment, and additional DOIs for datasets, journal articles, and other products resulting from the cruise. Publishing a persistent identifier for each field expedition will facilitate interoperability between the many different repositories that hold research products from cruises; will provide credit to the investigators who secured the funding and carried out the experiment; and will facilitate the gathering of fleet-wide altmetrics that demonstrate the broad impact of oceanographic research.

  13. New Method to Identify Field Joint Coating Failures Based on MFL In-Line Inspection Signals

    Directory of Open Access Journals (Sweden)

    Lianshuang Dai

    2018-02-01

    Full Text Available The aboveground indirect detection and random excavation applied in past years to buried long-distance oil and gas pipelines can identify only some damaged coating locations. Hence, large numbers of field joint coating (FJC) failures go unnoticed until they lead to failures of the pipelines themselves. Based on the analysis of magnetic flux leakage (MFL) in-line inspection (ILI) signals, combined with the statistical results of 414 excavations from two different pipeline sections, a new method to identify failed FJC is established. Although it can identify FJC failures only when there are signs of corrosion on the pipe body, it is much more efficient and cost-saving. The resulting identification rule still needs further validation and improvement to become more applicable and accurate.

  14. The identifiable victim effect in charitable giving: evidence from a natural field experiment

    DEFF Research Database (Denmark)

    Lesner, Tine; Rasmussen, O. D.

    2014-01-01

    We design a natural field experiment to enhance our understanding of the role of the identifiable victim effect in charitable giving. Using direct mail solicitations to 25,797 prior donors of a nonprofit charity, we tested the responsiveness of donors to make a contribution to either an identifiable or a statistical victim. Unlike much previous research, which has used only laboratory experiments, we find that the campaign letter focusing on one identifiable victim did not result in significantly larger donations than the campaign letter focusing on the statistical victim. In addition to the role … campaigns. We find some evidence of crowding out, indicating that charitable giving could be a zero-sum game; however, the treatment letters did not have different effects on other payments.

  15. Identifying open magnetic field regions of the Sun and their heliospheric counterparts

    Science.gov (United States)

    Krista, L. D.; Reinard, A.

    2017-12-01

    Open magnetic regions on the Sun are either long-lived (coronal holes) or transient (dimmings) in nature. Both phenomena are fundamental to our understanding of solar behavior as a whole. Coronal holes are the sources of high-speed solar wind streams that cause recurrent geomagnetic storms. Furthermore, the variation of coronal hole properties (area, location, magnetic field strength) over the solar activity cycle is an important marker of the global evolution of the solar magnetic field. Dimming regions, on the other hand, are short-lived coronal holes that often emerge in the wake of solar eruptions. By analyzing their physical properties and their temporal evolution, we aim to understand their connection with their eruptive counterparts (flares and coronal mass ejections) and to predict the possibility of a geomagnetic storm. We developed the Coronal Hole Automated Recognition and Monitoring (CHARM) and the Coronal Dimming Tracker (CoDiT) algorithms. These tools not only identify open magnetic field regions but also track their evolution. CHARM also provides daily coronal hole maps that are used for forecasts at the NOAA Space Weather Prediction Center. Our goal is to better understand the processes that give rise to eruptive and non-eruptive open field regions and to investigate how these regions evolve over time and influence space weather.
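
    At its core, automated coronal hole detection of this kind starts from intensity thresholding of EUV images (followed by connected-region extraction and magnetic-field checks, omitted here). Below is a minimal sketch on a synthetic intensity map; the threshold rule is an assumption for illustration, not CHARM's actual criterion.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic EUV-like intensity map: a dark circular "coronal hole" on a
# brighter quiet-Sun background (arbitrary units, not real solar data).
n = 64
y, x = np.mgrid[:n, :n]
hole = (x - 20) ** 2 + (y - 20) ** 2 < 10 ** 2
img = np.where(hole, 50.0, 200.0) + 5.0 * rng.standard_normal((n, n))

# Assumed threshold rule: a fixed fraction of the mean intensity.
threshold = 0.6 * img.mean()
mask = img < threshold          # candidate coronal-hole pixels
area_fraction = float(mask.mean())
```

A production pipeline would additionally group the masked pixels into connected regions, reject small ones, and check the dominant magnetic polarity before calling a region a coronal hole.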

  16. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping.

    Science.gov (United States)

    Sadeghi-Tehran, Pouria; Virlet, Nicolas; Sabermanesh, Kasra; Hawkesford, Malcolm J

    2017-01-01

    Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when applied to images acquired in dynamic field environments. In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with the state-of-the-art and other learning methods on digital images. All methods are compared and evaluated under different environmental conditions and with the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements and (4) an estimation of performance along the full life cycle of a wheat canopy. The method described is capable of coping with the environmental challenges faced in field conditions, with high levels of adaptiveness and without the need for adjusting a threshold for each digital image. The proposed method is also an ideal candidate to process a time series of phenotypic information throughout the crop growth acquired in the field. Moreover, the introduced method has the advantage that it is not limited to growth measurements but can also be applied to other tasks, such as identifying weeds, diseases, or stress.
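
    The multi-feature idea can be sketched as follows: stack several per-pixel features (raw colour plus colour indices) and let a learned classifier, rather than a hand-tuned threshold, separate vegetation from background. The features and the tiny nearest-centroid classifier below are illustrative stand-ins, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 8x8 RGB "field image": left half vegetation (green-dominant),
# right half soil (red/brown-dominant), with noise.
h, w = 8, 8
img = np.empty((h, w, 3))
img[:, : w // 2] = [0.2, 0.6, 0.2]   # vegetation
img[:, w // 2 :] = [0.5, 0.4, 0.3]   # soil
img += 0.05 * rng.standard_normal(img.shape)

# Multi-feature representation per pixel: raw RGB plus two colour indices
# commonly used in vegetation segmentation (Excess Green and green ratio).
r, g, b = img[..., 0], img[..., 1], img[..., 2]
exg = 2 * g - r - b
gratio = g / (r + g + b)
features = np.dstack([img, exg[..., None], gratio[..., None]]).reshape(-1, 5)

# Tiny nearest-centroid classifier trained on a few "hand-labelled" pixels,
# standing in for the learned model of the paper.
veg_train = features[:: w][:4]          # pixels from column 0 (vegetation)
soil_train = features[w - 1 :: w][:4]   # pixels from column w-1 (soil)
centroids = np.stack([veg_train.mean(0), soil_train.mean(0)])
labels = np.argmin(
    ((features[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1
).reshape(h, w)  # 0 = vegetation, 1 = soil
```

Because the decision is made in feature space rather than on a single intensity cutoff, no per-image threshold needs adjusting, which is the property the abstract emphasizes.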

  17. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping

    Directory of Open Access Journals (Sweden)

    Pouria Sadeghi-Tehran

    2017-11-01

    Full Text Available Abstract Background Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when applied to images acquired in dynamic field environments. Results In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with the state-of-the-art and other learning methods on digital images. All methods are compared and evaluated under different environmental conditions and with the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements and (4) an estimation of performance along the full life cycle of a wheat canopy. Conclusion The method described is capable of coping with the environmental challenges faced in field conditions, with high levels of adaptiveness and without the need for adjusting a threshold for each digital image. The proposed method is also an ideal candidate to process a time series of phenotypic information throughout the crop growth acquired in the field. Moreover, the introduced method has the advantage that it is not limited to growth measurements but can also be applied to other tasks, such as identifying weeds, diseases, or stress.

  18. Enhancing the Employability of Chinese International Students: Identifying Achievements and Gaps in the Research Field

    Directory of Open Access Journals (Sweden)

    Xuemeng Cao

    2017-10-01

    Full Text Available This article shows what achievements have been made by existing studies on graduate employability and what gaps remain to be filled in this field. It starts with a retrospective account of the changing concept of employability, followed by a presentation of the practices that have been used to support graduate employability enhancement in different countries. Moreover, the article gives a critical review of the Chinese graduate labour market context. Finally, limitations of existing studies are identified, reflecting an expectation that future research on graduate employability will meet the demands of an increasingly international dimension of higher education.

  19. A simple field method to identify foot strike pattern during running.

    Science.gov (United States)

    Giandolini, Marlène; Poupard, Thibaut; Gimenez, Philippe; Horvais, Nicolas; Millet, Guillaume Y; Morin, Jean-Benoît; Samozino, Pierre

    2014-05-07

    Identifying foot strike patterns in running is an important issue for sport clinicians, coaches and footwear manufacturers. Current methods allow the monitoring of either many steps in laboratory conditions or only a few steps in the field. Because measuring running biomechanics during actual practice is critical, our purpose is to validate a method for identifying foot strike patterns during continuous field measurements. Based on heel and metatarsal accelerations, this method requires two uniaxial accelerometers. The time between heel and metatarsal acceleration peaks (THM) was compared to the foot strike angle in the sagittal plane (αfoot) obtained by 2D video analysis for various conditions of speed, slope, footwear, foot strike and state of fatigue. Acceleration and kinematic measurements were performed at 1000 Hz and 120 Hz, respectively, during 2-min treadmill running bouts. Significant correlations were observed between THM and αfoot for 14 out of 15 conditions, with an overall correlation coefficient of r = 0.916. The method held for each foot strike pattern except extreme forefoot strike, during which the heel rarely or never strikes the ground, and for different footwear and states of fatigue. We proposed a classification based on THM: FFS < −5.49 ms
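
    The core computation is simple enough to sketch: locate the two acceleration peaks and take their time difference. The traces below are synthetic, and only the forefoot cutoff of −5.49 ms comes from the abstract; the rearfoot cutoff is an illustrative assumption, not the paper's value.

```python
import numpy as np

fs = 1000  # Hz, accelerometer sampling rate as in the study

# Synthetic single-step traces: heel and metatarsal acceleration, each with
# one impact peak; peak times are illustrative, not data from the paper.
t = np.arange(0.0, 0.3, 1.0 / fs)

def impact(t0, width=0.01):
    """Gaussian-shaped impact transient centred at time t0 (seconds)."""
    return np.exp(-((t - t0) ** 2) / (2 * width ** 2))

heel = impact(0.100)        # heel strikes first ...
meta = impact(0.125)        # ... metatarsals peak 25 ms later -> rearfoot strike

def thm_ms(heel_acc, meta_acc, fs):
    """Time between heel and metatarsal acceleration peaks, in milliseconds."""
    return (np.argmax(meta_acc) - np.argmax(heel_acc)) / fs * 1000.0

def classify(thm, ffs_cutoff=-5.49, rfs_cutoff=5.49):
    # ffs_cutoff is from the abstract; rfs_cutoff is a placeholder.
    if thm < ffs_cutoff:
        return "forefoot"
    if thm > rfs_cutoff:
        return "rearfoot"
    return "midfoot"

thm = thm_ms(heel, meta, fs)
pattern = classify(thm)
```

A negative THM means the metatarsal peak precedes the heel peak, which is the signature of a forefoot strike; in continuous field recordings this computation would be repeated per detected step.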

  20. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  1. Towards identifying the mechanisms underlying field-aligned edge-loss of HHFW power on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    Perkins, R. J.; Bell, R. E.; Bertelli, N.; Diallo, A.; Gerhardt, S.; Hosea, J. C.; Jaworski, M. A.; LeBlanc, B. P.; Kramer, G. J.; Maingi, R.; Phillips, C. K.; Podestà, M.; Roquemore, L.; Scotti, F.; Taylor, G.; Wilson, J. R. [Princeton Plasma Physics Laboratory, Princeton, NJ (United States); Ahn, J-W.; Gray, T. K.; Green, D. L.; McLean, A. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); and others

    2014-02-12

    Fast-wave heating will be a major heating scheme on ITER, as it can heat ions directly and, unlike neutral beams, is relatively unaffected by the large machine size. However, fast-wave interactions with the plasma edge can lead to deleterious effects such as, in the case of the high-harmonic fast-wave (HHFW) system on NSTX, large losses of fast-wave power in the scrape-off layer (SOL) under certain conditions. In such scenarios, a large fraction of the lost HHFW power is deposited on the upper and lower divertors in bright spiral shapes. The responsible mechanism(s) has not yet been identified but may include fast-wave propagation in the scrape-off layer, parametric decay instability, and RF currents driven by the antenna reactive fields. Understanding and mitigating these losses is important not only for improving the heating and current drive on NSTX-Upgrade but also for understanding fast-wave propagation across the SOL in any fast-wave system. This talk summarizes experimental results demonstrating that the flow of lost HHFW power to the divertor regions largely follows the open SOL magnetic field lines. This lost power flux is relatively large close to both the antenna and the last closed flux surface, with a reduced level in between, so the loss mechanism cannot be localized to the antenna. At the same time, significant losses also occur along field lines connected to the inboard edge of the bottom antenna plate. The power lost within the spirals is roughly estimated, showing that these field-aligned losses to the divertor are significant but may not account for the total HHFW loss. To elucidate the role of the onset layer for perpendicular fast-wave propagation with regard to fast-wave propagation in the SOL, a cylindrical cold-plasma model is being developed. This model, in addition to advanced RF codes such as TORIC and AORSA, is aimed at identifying the underlying mechanism(s) behind these SOL losses, to minimize their effects in NSTX-U, and to predict

  2. Towards identifying the mechanisms underlying field-aligned edge-loss of HHFW power on NSTX

    International Nuclear Information System (INIS)

    Perkins, R. J.; Bell, R. E.; Bertelli, N.; Diallo, A.; Gerhardt, S.; Hosea, J. C.; Jaworski, M. A.; LeBlanc, B. P.; Kramer, G. J.; Maingi, R.; Phillips, C. K.; Podestà, M.; Roquemore, L.; Scotti, F.; Taylor, G.; Wilson, J. R.; Ahn, J-W.; Gray, T. K.; Green, D. L.; McLean, A.

    2014-01-01

    Fast-wave heating will be a major heating scheme on ITER, as it can heat ions directly and, unlike neutral beams, is relatively unaffected by the large machine size. However, fast-wave interactions with the plasma edge can lead to deleterious effects such as, in the case of the high-harmonic fast-wave (HHFW) system on NSTX, large losses of fast-wave power in the scrape-off layer (SOL) under certain conditions. In such scenarios, a large fraction of the lost HHFW power is deposited on the upper and lower divertors in bright spiral shapes. The responsible mechanism(s) has not yet been identified but may include fast-wave propagation in the scrape-off layer, parametric decay instability, and RF currents driven by the antenna reactive fields. Understanding and mitigating these losses is important not only for improving the heating and current drive on NSTX-Upgrade but also for understanding fast-wave propagation across the SOL in any fast-wave system. This talk summarizes experimental results demonstrating that the flow of lost HHFW power to the divertor regions largely follows the open SOL magnetic field lines. This lost power flux is relatively large close to both the antenna and the last closed flux surface, with a reduced level in between, so the loss mechanism cannot be localized to the antenna. At the same time, significant losses also occur along field lines connected to the inboard edge of the bottom antenna plate. The power lost within the spirals is roughly estimated, showing that these field-aligned losses to the divertor are significant but may not account for the total HHFW loss. To elucidate the role of the onset layer for perpendicular fast-wave propagation with regard to fast-wave propagation in the SOL, a cylindrical cold-plasma model is being developed. This model, in addition to advanced RF codes such as TORIC and AORSA, is aimed at identifying the underlying mechanism(s) behind these SOL losses, to minimize their effects in NSTX-U, and to predict

  3. Intracellular recording, sensory field mapping, and culturing identified neurons in the leech, Hirudo medicinalis.

    Science.gov (United States)

    Titlow, Josh; Majeed, Zana R; Nicholls, John G; Cooper, Robin L

    2013-11-04

    The freshwater leech, Hirudo medicinalis, is a versatile model organism that has been used to address scientific questions in the fields of neurophysiology, neuroethology, and developmental biology. The goal of this report is to consolidate experimental techniques from the leech system into a single article that will be of use to physiologists with expertise in other nervous system preparations, or to biology students with little or no electrophysiology experience. We demonstrate how to dissect the leech for recording intracellularly from identified neural circuits in the ganglion. Next we show how individual cells of known function can be removed from the ganglion to be cultured in a Petri dish, and how to record from those neurons in culture. Then we demonstrate how to prepare a patch of innervated skin to be used for mapping sensory or motor fields. These leech preparations are still widely used to address basic electrical properties of neural networks, behavior, synaptogenesis, and development. They are also an appropriate training module for neuroscience or physiology teaching laboratories.

  4. Identifying diffused nitrate sources in a stream in an agricultural field using a dual isotopic approach

    International Nuclear Information System (INIS)

    Ding, Jingtao; Xi, Beidou; Gao, Rutai; He, Liansheng; Liu, Hongliang; Dai, Xuanli; Yu, Yijun

    2014-01-01

    Nitrate (NO3−) pollution is a severe problem in aquatic systems in Taihu Lake Basin in China. A dual isotope approach (δ15N-NO3− and δ18O-NO3−) was applied to identify diffused NO3− inputs in a stream in an agricultural field at the basin in 2013. The site-specific isotopic characteristics of five NO3− sources (atmospheric deposition, AD; NO3− derived from soil organic matter nitrification, NS; NO3− derived from chemical fertilizer nitrification, NF; groundwater, GW; and manure and sewage, M and S) were identified. NO3− concentrations in the stream during the rainy season [mean ± standard deviation (SD) = 2.5 ± 0.4 mg/L] were lower than those during the dry season (mean ± SD = 4.0 ± 0.5 mg/L), whereas the δ18O-NO3− values during the rainy season (mean ± SD = +12.3 ± 3.6‰) were higher than those during the dry season (mean ± SD = +0.9 ± 1.9‰). Both chemical and isotopic characteristics indicated that mixing with atmospheric NO3− resulted in the high δ18O values during the rainy season, whereas NS and M and S were the dominant NO3− sources during the dry season. A Bayesian model was used to determine the contribution of each NO3− source to total stream NO3−. Results showed that reduced N nitrification in soil zones (including soil organic matter and fertilizer) was the main NO3− source throughout the year. M and S contributed more NO3− during the dry season (22.4%) than during the rainy season (17.8%). AD generated substantial amounts of NO3− in May (18.4%), June (29.8%), and July (24.5%). With the assessment of temporal variation of diffused NO3− sources in the agricultural field, improved agricultural management practices can be implemented to protect the water resource and avoid further water quality deterioration in Taihu Lake Basin. - Highlights: • The isotopic characteristics of potential NO3− sources were identified. • Mixing with atmospheric NO3− resulted
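
    Underlying the Bayesian source apportionment is a linear isotope mass balance, which in the two-source case reduces to a closed-form fraction. The δ15N signatures below are illustrative placeholders, not the paper's site-specific values.

```python
# Two-source linear mixing sketch: the fraction of nitrate from source A
# follows from where the sample's delta value lies between the two source
# signatures. All delta values here are assumed for illustration.
delta_soil = 5.0     # assumed d15N of soil-nitrification nitrate (per mil)
delta_manure = 15.0  # assumed d15N of manure/sewage nitrate (per mil)
delta_sample = 7.5   # assumed measured stream value (per mil)

def mixing_fraction(d_sample, d_a, d_b):
    """Fraction contributed by source A in a two-source linear mixing model."""
    return (d_sample - d_b) / (d_a - d_b)

f_soil = mixing_fraction(delta_sample, delta_soil, delta_manure)
f_manure = 1.0 - f_soil
```

With five candidate sources the system is underdetermined, which is why the paper uses a Bayesian mixing model to estimate a posterior distribution over the source fractions instead of this closed form.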

  5. Bottom-up communication. Identifying opportunities and limitations through an exploratory field-based evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.; Irvine, K.N. [Institute of Energy and Sustainable Development, De Montfort University, Leicester, LE1 9BH (United Kingdom)

    2013-02-15

    Communication to promote behaviours like energy saving can use significant resources. What is less clear is the comparative value of the different approaches available to communicators. While it is generally agreed that 'bottom-up' approaches, in which individuals are actively involved rather than passive, are preferable to 'top-down' authority-led projects, there is a dearth of evidence verifying why this should be. Additionally, while the literature has examined the mechanics of the different approaches, less attention has been paid to the associated psychological implications. This paper reports on an exploratory comparative study that examined the effects of six distinct communication activities. The activities used different communication approaches, some participative and others more top-down and informational. Two theories, from behavioural studies and communication, were used to identify key variables for consideration in this field-based evaluation. The evaluation aimed to assess not just which activity might be most successful, as this has limited generalisability, but also to gain insight into the psychological impacts that might contribute to success. Analysis found support for the general hypothesis that bottom-up approaches have more impact on behaviour change than top-down ones. The study also identified that, in this instance, the difference in reported behaviour across the activities related partly to the extent to which intentions to change behaviour were implemented. One possible explanation for the difference in reported behaviour change across the activities is that a bottom-up approach may offer a supportive environment in which participants can discuss progress with like-minded individuals. A further possible explanation is that, despite controlling for intention at an individual level, the pre-existence of strong intentions may have an effect on group success. These suggestive findings point toward the critical need for additional and larger-scale studies.

  6. Full field optical coherence tomography can identify spermatogenesis in a rodent Sertoli-cell-only model.

    Science.gov (United States)

    Ramasamy, Ranjith; Sterling, Joshua; Manzoor, Maryem; Salamoon, Bekheit; Jain, Manu; Fisher, Erik; Li, Phillip S; Schlegel, Peter N; Mukherjee, Sushmita

    2012-01-01

    Microdissection testicular sperm extraction (micro-TESE) has replaced conventional testis biopsies as the method of choice for obtaining sperm for in vitro fertilization for men with nonobstructive azoospermia. A technical challenge of micro-TESE is that low-magnification inspection of the tubules with a surgical microscope is insufficient to definitively identify sperm-containing tubules, necessitating tissue removal and cytologic assessment. Full field optical coherence tomography (FFOCT) uses white-light interference microscopy to generate quick high-resolution tomographic images of fresh (unprocessed and unstained) tissue. Furthermore, by using a nonlaser safe light source (150 W halogen lamp) for tissue illumination, it ensures that the sperm extracted for in vitro fertilization are not photo-damaged or mutagenized. A focal Sertoli-cell-only rodent model was created by busulfan injection in adult rats. Ex vivo testicular tissues from both normal and busulfan-treated rats were imaged with a commercial modified FFOCT system, Light-CT™, and the images were correlated with gold-standard hematoxylin and eosin staining. Light-CT™ identified spermatogenesis within the seminiferous tubules in freshly excised testicular tissue, without the use of exogenous contrast or fixation. Normal adult rats exhibited tubules of uniform size and shape (diameter 328 ± 11 μm). The busulfan-treated animals showed marked heterogeneity in tubular size and shape (diameter 178 ± 35 μm), and only 10% contained sperm within the lumen. FFOCT has the potential to facilitate real-time visualization of spermatogenesis in humans and to aid in micro-TESE for men with infertility.

  7. Idiopathic environmental intolerance attributed to electromagnetic fields (IEI-EMF: A systematic review of identifying criteria

    Directory of Open Access Journals (Sweden)

    Baliatsas Christos

    2012-08-01

    Full Text Available Abstract Background Idiopathic environmental intolerance attributed to electromagnetic fields (IEI-EMF) remains a complex and unclear phenomenon, often characterized by the report of various non-specific physical symptoms (NSPS) when an EMF source is present or perceived by the individual. The lack of validated criteria for defining and assessing IEI-EMF affects the quality of the relevant research, hindering not only the comparison and integration of study findings but also the identification and management of patients by health care providers. The objective of this review was to evaluate and summarize the criteria that previous studies employed to identify IEI-EMF participants. Methods An extensive literature search was performed for studies published up to June 2011. We searched EMBASE, Medline, PsycINFO, Scopus and Web of Science. Additionally, citation analyses were performed for key papers, reference sections of relevant papers were searched, conference proceedings were examined, and a literature database held by the Mobile Phones Research Unit of King's College London was reviewed. Results Sixty-three studies were included. "Hypersensitivity to EMF" was the most frequently used descriptive term. Despite heterogeneity, the criteria predominantly used to identify IEI-EMF individuals were: 1. Self-report of being (hyper)sensitive to EMF. 2. Attribution of NSPS to at least one EMF source. 3. Absence of a medical or psychiatric/psychological disorder capable of accounting for these symptoms. 4. Symptoms should occur soon (up to 24 hours) after the individual perceives an exposure source or exposed area. (Hyper)sensitivity to EMF was either generalized (attribution to various EMF sources) or source-specific. Experimental studies used a larger number of criteria than observational ones and more frequently performed a medical examination or interview as a prerequisite for inclusion. Conclusions Considerable heterogeneity exists in the

  8. Using Smoke Injection in Drains to Identify Potential Preferential Pathways in a Drained Arable Field

    Science.gov (United States)

    Nielsen, M. H.; Petersen, C. T.; Hansen, S.

    2014-12-01

    Macropores that form a continuous pathway between the soil surface and subsurface drains favour the transport of many contaminants from agricultural fields to surface waters. The smoke injection method presented by Shipitalo and Gibbs (2000) for demonstrating and quantifying such pathways has been further developed and used on a drained Danish sandy loam. To identify the preferential pathways to drains, smoke was injected into three 1.15 m deep tile drains (total drain length 93 m), and smoke-emitting macropores (SEMP) at the soil surface were counted and characterized as producing either strong or weak plumes compared with reference plumes from 3 and 6 mm wide tubes. In the two situations investigated in the present study, an early spring and an autumn situation, smoke penetrated the soil surface layer only via earthworm burrows located in a 1.0 m wide belt directly above the drain lines. However, it is known from previous studies that desiccation fractures in a dry summer situation can also contribute to the smoke pattern. The distance between SEMP measured along the drain lines was on average 0.46 m, whereas the average spacing between SEMP with strong plumes was 2.3 m. Ponded water was applied in 6 cm wide rings placed above 52 burrows, including 17 reference burrows that did not emit smoke. Thirteen pathways in the soil were examined using dye tracer and profile excavation. SEMP with strong plumes marked the entrances of highly efficient transport pathways conducting surface-applied water and dye tracer into the drain. However, no single burrow was traced all the way from the surface into the drain; the dye patterns branched off into a network of other macropores. Water infiltration rates were significantly higher in smoke-emitting burrows than in the reference burrows. Preferential pathways between the soil surface and drains and surface waters were associated primarily with unevenly distributed SEMP producing strong smoke plumes.

  9. Developing a methodology for identifying action zones to protect and manage groundwater well fields

    Science.gov (United States)

    Bellier, Sandra; Viennot, Pascal; Ledoux, Emmanuel; Schott, Celine

    2013-04-01

Implementation of a long-term action plan to manage and protect well fields is a complex and very expensive process. In this context, the relevance and efficiency of such action plans for water quality should be evaluated. The objective of this study is to set up a methodology for identifying relevant action zones in which environmental changes may significantly impact the quantity or quality of pumped water. In the Seine-et-Marne department (France), three sectors integrating numerous well fields pumping in Champigny's limestone aquifer are considered a priority under French environmental laws. This aquifer, located south-east of Paris, supplies more than one million people with drinking water. The catchment areas of these abstractions are very large (2000 km2), and their intrinsic vulnerability was established by a simple parametric approach that cannot account for the complexity of the hydrosystem. Consequently, a methodology based on distributed modeling of the aquifer processes was developed. The basin is modeled using the hydrogeological model MODCOU, developed at MINES ParisTech since the 1980s. It simulates surface and groundwater flow in aquifer systems and represents the local characteristics of the hydrosystem (aquifers communicating by leakage, river infiltration, supply from sinkholes, and locally perched or dewatering aquifers). The model was calibrated by matching simulated river discharge hydrographs and piezometric heads with observations going back to the 1970s. With this modelling tool, a methodology based on the transfer of a theoretical tracer through the hydrosystem from the ground surface to the outlets was implemented to evaluate the spatial distribution of the contributing areas under contrasted (wet or dry) recharge periods. The results show that the area contributing to supply for most catchments is smaller than 300 km2 and that the major contributing zones are located along rivers. This finding illustrates the importance of

  10. Automatic Weather Station (AWS) Program operated by the University of Wisconsin-Madison during the 2012-2013 field season: Challenges and Successes

    Directory of Open Access Journals (Sweden)

    Matthew A. Lazzara

    2015-03-01

    Full Text Available This report reviews the 2012-2013 field season activities of the University of Wisconsin-Madison's Antarctic Automatic Weather Station (AWS) program, summarizes the science that these sites are supporting, and outlines the factors that impact the number of AWS sites serviced in any given field season. The 2012-2013 austral summer season was unusual in the AWS network's history. Challenges encountered include, but are not limited to, warmer-than-normal conditions in the Ross Island area impacting airfield operations, changes to logistical procedures, and competition for shared resources. A flexible work plan provides the best means of taking on these challenges while maximizing AWS servicing efforts under restricted conditions and meeting the need for the routine servicing that maintaining an autonomous observing network demands.

  11. Comparative Study between Sequential Automatic and Manual Home Respiratory Polygraphy Scoring Using a Three-Channel Device: Impact of the Manual Editing of Events to Identify Severe Obstructive Sleep Apnea

    Directory of Open Access Journals (Sweden)

    Glenda Ernst

    2015-01-01

    Full Text Available Objective. According to current guidelines, autoscoring of respiratory events in respiratory polygraphy requires manual scoring. The aim of this study was to evaluate the agreement between automatic analysis and manual scoring to identify patients with suspected OSA. Methods. This retrospective study analyzed 791 records from respiratory polygraphy (RP) performed at home. The grade of association between automatic and manual scoring was evaluated using the Kappa coefficient, and agreement was assessed using the Bland-Altman test and the intraclass correlation coefficient (ICC). ROC curve analysis was used to determine the accuracy in identifying AHI≥30 events/h. Results. The population analyzed consisted of 493 male (62.3%) and 298 female patients, with an average age of 54.7±14.20 years and BMI of 32.7±8.21 kg/m2. There was no significant difference between the automatic and manual apnea/hypopnea indexes (aAHI 17.25 (SD: 17.42) versus mAHI 21.20±7.96; NS). The agreement between mAHI and aAHI for AHI≥30 was 94%, with a Kappa coefficient of 0.83 (p<0.001) and an ICC of 0.83. The AUC-ROC, sensitivity, and specificity were 0.99 (CI 95%: 0.98-0.99, p<0.001), 86% (CI 95%: 78.7–91.4), and 97% (CI 95%: 96–98.3), respectively. Conclusions. We observed good agreement between automatic scoring and sequential manual scoring in identifying subjects with AHI≥30 events/h.
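The agreement statistic used above for the binary AHI≥30 decision can be illustrated with a small stand-alone sketch. The AHI values below are hypothetical, not the study's data; the kappa computation itself is the standard formula for two binary raters:

```python
def cohen_kappa(a, b):
    """Cohen's kappa for two binary raters given as lists of 0/1."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n                      # marginal rates
    p_exp = pa1 * pb1 + (1 - pa1) * (1 - pb1)              # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical automatic vs. manual AHI values for eight patients
auto_ahi = [12, 35, 48, 8, 31, 22, 40, 5]
manual_ahi = [15, 38, 45, 10, 28, 25, 42, 7]
auto_flag = [1 if x >= 30 else 0 for x in auto_ahi]        # severe-OSA flag
manual_flag = [1 if x >= 30 else 0 for x in manual_ahi]
print(round(cohen_kappa(auto_flag, manual_flag), 3))       # → 0.75
```

Kappa discounts the agreement expected by chance, which is why the study reports it alongside the raw 94% agreement.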

  12. Methodological Comparison between a Novel Automatic Sampling System for Gas Chromatography versus Photoacoustic Spectroscopy for Measuring Greenhouse Gas Emissions under Field Conditions

    Directory of Open Access Journals (Sweden)

    Alexander J. Schmithausen

    2016-10-01

    Full Text Available Trace gases such as nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are climate-related gases, and their emissions from agricultural livestock barns are not negligible. Conventional measurement systems in the field (Fourier transform infrared spectroscopy (FTIR); photoacoustic systems (PAS)) are not sufficiently sensitive to N2O. Laser-based measurement systems are highly accurate, but they are very expensive to purchase and maintain. One cost-effective alternative is gas chromatography (GC) with electron capture detection (ECD), but this is not suitable for field applications due to its radiation source. Measuring samples collected automatically under field conditions in the laboratory at a later time presents many challenges. This study presents a sampling system designed to enable laboratory analysis of N2O concentrations sampled under field conditions. Analyses were carried out using PAS in the field (online system) and GC in the laboratory (offline system). Both measurement systems showed a good correlation for CH4 and CO2 concentrations. Measured N2O concentrations were near the detection limit for PAS. GC achieved more reliable results for N2O in very low concentration ranges.

  13. Identifying fecal matter contamination in produce fields using multispectral reflectance imaging under ambient solar illumination

    Science.gov (United States)

    An imaging device to detect fecal contamination in fresh produce fields could allow the producer to avoid harvesting fecal-contaminated produce. E.coli O157:H7 outbreaks have been associated with fecal-contaminated leafy greens. In this study, in-field spectral profiles of bovine fecal matter, soil,...

  14. Ultra-low field nuclear magnetic resonance and magnetic resonance imaging to discriminate and identify materials

    Science.gov (United States)

    Matlashov, Andrei Nikolaevich; Urbaitis, Algis V.; Savukov, Igor Mykhaylovich; Espy, Michelle A.; Volegov, Petr Lvovich; Kraus, Jr., Robert Henry

    2013-03-05

    Method comprising obtaining an NMR measurement from a sample wherein an ultra-low field NMR system probes the sample and produces the NMR measurement and wherein a sampling temperature, prepolarizing field, and measurement field are known; detecting the NMR measurement by means of inductive coils; analyzing the NMR measurement to obtain at least one measurement feature wherein the measurement feature comprises T1, T2, T1.rho., or the frequency dependence thereof; and, searching for the at least one measurement feature within a database comprising NMR reference data for at least one material to determine if the sample comprises a material of interest.
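The final step of the method (searching measured features against a reference database) can be sketched as follows. The material names, relaxation values, and tolerance are invented for illustration and do not come from the patent:

```python
# Hypothetical reference table of ULF-NMR relaxation features (T1, T2 in ms)
REFERENCE = {
    "water":    {"T1": 2500.0, "T2": 2200.0},
    "ethanol":  {"T1": 1500.0, "T2": 1200.0},
    "glycerol": {"T1": 35.0,   "T2": 30.0},
}

def match_material(t1, t2, tolerance=0.10):
    """Return reference materials whose T1 and T2 both lie within a
    fractional tolerance of the measured values."""
    hits = []
    for name, ref in REFERENCE.items():
        if (abs(t1 - ref["T1"]) <= tolerance * ref["T1"]
                and abs(t2 - ref["T2"]) <= tolerance * ref["T2"]):
            hits.append(name)
    return hits

print(match_material(2400.0, 2100.0))  # → ['water']
```

A real implementation would also match T1-rho and the frequency dependence of the features, as the claim enumerates.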

  15. Identifying the Tunneling Site in Strong-Field Ionization of H_{2}^{+}.

    Science.gov (United States)

    Liu, Kunlong; Barth, Ingo

    2017-12-15

    The tunneling site of the electron in a molecule exposed to a strong laser field determines the initial position of the ionizing electron and, as a result, has a large impact on the subsequent ultrafast electron dynamics on the polyatomic Coulomb potential. Here, the tunneling site of the electron of H_{2}^{+} ionized by a strong circularly polarized (CP) laser pulse is studied by numerically solving the time-dependent Schrödinger equation. We show that the electron removed from the down-field site is directly driven away by the CP field and the lateral photoelectron momentum distribution (LPMD) exhibits a Gaussian-like distribution, whereas the corresponding LPMD of the electron removed from the up-field site differs from the Gaussian shape due to Coulomb focusing and scattering by the down-field core. Our study presents direct evidence clarifying a long-standing controversy over the tunneling site in H_{2}^{+} and highlights the important role of the tunneling site in strong-field molecular ionization.

  16. Identifying a cooperative control mechanism between an applied field and the environment of open quantum systems

    Science.gov (United States)

    Gao, Fang; Rey-de-Castro, Roberto; Wang, Yaoxiong; Rabitz, Herschel; Shuang, Feng

    2016-05-01

    Many systems under control with an applied field also interact with the surrounding environment. Understanding the control mechanisms has remained a challenge, especially the role played by the interaction between the field and the environment. In order to address this need, here we expand the scope of the Hamiltonian-encoding and observable-decoding (HE-OD) technique. HE-OD was originally introduced as a theoretical and experimental tool for revealing the mechanism induced by control fields in closed quantum systems. The results of open-system HE-OD analysis presented here provide quantitative mechanistic insights into the roles played by a Markovian environment. Two model open quantum systems are considered for illustration. In these systems, transitions are induced by either an applied field linked to a dipole operator or Lindblad operators coupled to the system. For modest control yields, the HE-OD results clearly show distinct cooperation between the dynamics induced by the optimal field and the environment. Although the HE-OD methodology introduced here is considered in simulations, it has an analogous direct experimental formulation, which we suggest may be applied to open systems in the laboratory to reveal mechanistic insights.

  17. Identifying Student Competencies in Macro Practice: Articulating the Practice Wisdom of Field Instructors

    Science.gov (United States)

    Regehr, Cheryl; Bogo, Marion; Donovan, Kirsten; Lim, April; Anstice, Susan

    2012-01-01

    Although a growing literature examines competencies in clinical practice, competencies of students in macro social work practice have received comparatively little attention. A grounded-theory methodology was used to elicit field instructor views of student competencies in community, organization, and policy contexts. Competencies described by…

  18. An automatic time domain reflectometry device to measure and store soil water contents for stand-alone field use

    NARCIS (Netherlands)

    Elsen, van den H.G.M.; Kokot, J.; Skierucha, W.; Halbertsma, J.M.

    1995-01-01

    A field set-up was developed to measure soil moisture content at ten different positions using the time domain reflectometry (TDR) technique. The set-up works on a 12 V battery or solar panel system, independent of an external power source, and has low power consumption and compact dimensions. The

  19. Identifying students’ learning performance as a way to determine the admission process in physical education field

    Science.gov (United States)

    Prihanto, J. B.; Kartiko, D. C.; Wijaya, A.

    2018-01-01

    Interest in the physical education field has been rising in the past ten years, as can be seen from the increasing number of registrants for physical education programs at several universities. This research analyzes the students' admission process and its relation to their performance in learning activities in the department of physical education at Universitas Negeri Surabaya. The design of this study was quantitative data analysis. The research was conducted by collecting students' admission data and their transcripts. The results showed that the most influential admission factor in the physical education program was the students' field of study in high school, whereas their achievements in sports competitions and family welfare are not likely to be important factors. These results provide a recommendation for the next admission process related to the quality of graduates.

  20. Identifying fecal matter contamination in produce fields using multispectral reflectance imaging under ambient solar illumination

    Science.gov (United States)

    Everard, Colm D.; Kim, Moon S.; Lee, Hoonsoo; O'Donnell, Colm P.

    2016-05-01

    An imaging device to detect fecal contamination in fresh produce fields could allow the producer to avoid harvesting fecal-contaminated produce. E. coli O157:H7 outbreaks have been associated with fecal-contaminated leafy greens. In this study, in-field spectral profiles of bovine fecal matter, soil, and spinach leaves are compared. A common-aperture imager designed with two identical monochromatic cameras, a beam splitter, and optical filters was used to simultaneously capture two spectral images of leaves contaminated with both fecal matter and soil. The optical filters were 10 nm full-width-half-maximum bandpass filters, one at 690 nm and the second at 710 nm, mounted in front of the object lenses. New images were created from the ratio of the two spectral images on a pixel-by-pixel basis. Image analysis showed that fecal matter contamination could be distinguished from soil and leaf in the ratio images. This technology has the potential to allow detection of fecal contamination in produce fields, which can be a source of foodborne illnesses. It has the added benefit of mitigating cross-contamination during harvesting and processing.

  1. Study of Events with Identified Forward Particles at the Split Field Magnet

    CERN Multimedia

    2002-01-01

    This experiment will study two aspects of particle production in the forward region: 1) In the recent discovery of charm production in hadronic interactions at the Split Field Magnet, the triggering on strange particles at medium p_T has proven to be a very effective tool for the study of heavy resonances, especially those carrying new flavours like charm and beauty. We want to carry out a more detailed investigation of the production dynamics of charmed particles, together with a search for beauty mesons and baryons. 2) A trigger on forward particles at high p_T (> 5 GeV/c) provides unique features to determine the properties of the parton-parton subprocesses. We want to study the relative contributions of quark, diquark and gluon scattering. This experimental programme will be carried out using the improved Split Field Magnet spectrometer (SFM). The different detection systems provide: a) Detection and momentum analysis of charged particles in 4π solid angle. An improved programm...

  2. Application of Chebyshev Formalism to Identify Nonlinear Magnetic Field Components in Beam Transport Systems

    Energy Technology Data Exchange (ETDEWEB)

    Spata, Michael [Old Dominion Univ., Norfolk, VA (United States)

    2012-08-01

    An experiment was conducted at Jefferson Lab's Continuous Electron Beam Accelerator Facility to develop a beam-based technique for characterizing the extent of the nonlinearity of the magnetic fields of a beam transport system. Horizontally and vertically oriented pairs of air-core kicker magnets were simultaneously driven at two different frequencies to provide a time-dependent transverse modulation of the beam orbit relative to the unperturbed reference orbit. Fourier decomposition of the position data at eight different points along the beamline was then used to measure the amplitude of these frequencies. For a purely linear transport system one expects to find solely the frequencies that were applied to the kickers with amplitudes that depend on the phase advance of the lattice. In the presence of nonlinear fields one expects to also find harmonics of the driving frequencies that depend on the order of the nonlinearity. Chebyshev polynomials and their unique properties allow one to directly quantify the magnitude of the nonlinearity with the minimum error. A calibration standard was developed using one of the sextupole magnets in a CEBAF beamline. The technique was then applied to a pair of Arc 1 dipoles and then to the magnets in the Transport Recombiner beamline to measure their multipole content as a function of transverse position within the magnets.
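The diagnostic principle described above (an order-n nonlinearity in the transport converts a drive at frequency f into harmonics at n·f, which Fourier decomposition of the position data then reveals) can be sketched with a toy signal and a stdlib DFT. The sample count, drive frequency, and nonlinear coefficient are arbitrary illustrative choices, not values from the experiment:

```python
import cmath
import math

def dft_mag(signal, k):
    """Magnitude of the k-th DFT bin of a real signal, normalized by N."""
    n = len(signal)
    s = sum(x * cmath.exp(-2j * math.pi * k * t / n)
            for t, x in enumerate(signal))
    return abs(s) / n

N, f = 256, 8                                   # samples, driving-frequency bin
x = [math.cos(2 * math.pi * f * t / N) for t in range(N)]

linear = list(x)                                # linear transport: drive only
nonlinear = [xi + 0.2 * xi**2 for xi in x]      # sextupole-like 2nd-order term

# The quadratic term creates a second harmonic (cos^2 = (1 + cos 2θ)/2),
# so a peak appears at bin 2f only for the nonlinear response.
print(dft_mag(linear, 2 * f), dft_mag(nonlinear, 2 * f))
```

In the experiment the Chebyshev formalism plays the role of turning these measured harmonic amplitudes into a minimum-error estimate of the multipole content.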

  3. Self-identified obese people request less money: a field experiment.

    Directory of Open Access Journals (Sweden)

    Antonios Proestakis

    2016-09-01

    Full Text Available Empirical evidence suggests that obese people are discriminated against in different social environments such as the workplace. Yet, the degree to which obese people are internalizing and adjusting their own behaviour as a result of this discriminatory behaviour has not been studied thoroughly. We develop a proxy for measuring experimentally the self-weight bias by giving both self-identified obese (n=90) and non-obese (n=180) individuals the opportunity to request a positive amount of money after having performed an identical task. Consistent with System Justification Theory, we find that self-identified obese individuals, due to a preexisting false consciousness, request significantly lower amounts of money than non-obese ones. A within-subject comparison between self-reports and external interviewers' evaluations reveals that the excessive weight felt by the self but not reported by evaluators captures the self-weight bias not only for obese but also for non-obese individuals. Linking our experimental results to the supply side of the labour market, we argue that self-weight bias, as expressed by lower salary requests, enhances discriminatory behaviour against individuals who feel, but may not actually be, obese and consequently exacerbates the wage gap across weight.

  4. Self-identified Obese People Request Less Money: A Field Experiment.

    Science.gov (United States)

    Proestakis, Antonios; Brañas-Garza, Pablo

    2016-01-01

    Empirical evidence suggests that obese people are discriminated in different social environments, such as the work place. Yet, the degree to which obese people are internalizing and adjusting their own behavior as a result of this discriminatory behavior has not been thoroughly studied. We develop a proxy for measuring experimentally the "self-weight bias" by giving to both self-identified obese ( n = 90) and non-obese ( n = 180) individuals the opportunity to request a positive amount of money after having performed an identical task. Consistent with the System Justification Theory, we find that self-identified obese individuals, due to a preexisting false consciousness , request significantly lower amounts of money than non-obese ones. A within subject comparison between self-reports and external monitors' evaluations reveals that the excessive weight felt by the "self" but not reported by evaluators captures the self-weight bias not only for obese but also for non-obese individuals. Linking our experimental results to the supply side of the labor market, we argue that self-weight bias, as expressed by lower salary requests, enhances discriminatory behavior against individuals who feel, but may not actually be, obese and consequently exacerbates the wage gap across weight.

  5. Trace Metal Bioremediation: Assessment of Model Components from Laboratory and Field Studies to Identify Critical Variables

    International Nuclear Information System (INIS)

    Peter Jaffe; Herschel Rabitz

    2003-01-01

    The objective of this project was to gain insight into the modeling support needed for the understanding, design, and operation of trace metal/radionuclide bioremediation. To achieve this objective, a workshop was convened to discuss the elements such a model should contain. A ''protomodel'' was developed based on the recommendations of the workshop and was used to perform sensitivity analysis as well as some preliminary simulations in support of bioremediation test experiments at UMTRA sites. To simulate the numerous biogeochemical processes that occur during the bioremediation of uranium-contaminated aquifers, a time-dependent, one-dimensional reactive transport model has been developed. The model consists of a set of coupled mass balance equations accounting for advection, diffusion, dispersion, and a kinetic formulation of the transformations affecting an organic substrate, electron acceptors, the corresponding reduced species, and uranium. This set of equations is solved numerically using a finite element scheme. The redox conditions of the domain are characterized by estimating the pE based on the concentrations of the dominant terminal electron acceptor and its corresponding reduced species. This pE and the concentrations of relevant species are passed to a modified version of MINTEQA2, which calculates the speciation and solubilities of the species of interest. Kinetics of abiotic reactions are described as being proportional to the difference between the actual and equilibrium concentrations. A global uncertainty assessment, determined by Random Sampling High Dimensional Model Representation (RS-HDMR), was performed to attain a phenomenological understanding of the origins of output variability, to suggest input parameter refinements, and to provide guidance for field experiments to improve the quality of the model predictions. Results indicated that for the usually high nitrate contents found at many DOE sites, overall
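The kinetic closure described above (abiotic reaction rates proportional to the distance from the equilibrium concentration) reduces to dC/dt = -k(C - C_eq) for each species, which can be sketched with an explicit-Euler update. All values below are hypothetical illustrations, not parameters from the protomodel:

```python
def relax_step(c, c_eq, k, dt):
    """One explicit-Euler step of dC/dt = -k * (C - C_eq)."""
    return c + dt * (-k) * (c - c_eq)

# Hypothetical: a dissolved concentration relaxing toward equilibrium
c, c_eq, k, dt = 10.0, 2.0, 0.5, 0.1
for _ in range(200):
    c = relax_step(c, c_eq, k, dt)
print(round(c, 4))   # decays toward c_eq = 2.0
```

In the full model this update would be coupled to the advection-dispersion terms and to the MINTEQA2 speciation call that supplies C_eq at each step.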

  6. Identifying Gender-Sensitive Agroforestry Options: Methodological Considerations From the Field

    Directory of Open Access Journals (Sweden)

    Sarah-Lan Mathez-Stiefel

    2016-11-01

    Full Text Available Agroforestry is seen as a promising set of land use practices that can lead to increased ecological integrity and sustainable benefits in mountain areas. Agroforestry practices can also enhance smallholder farmers' resilience in the face of social and ecological change. There is a need for critical examination of existing practices to ensure that agroforestry recommendations for smallholder farmers are socially inclusive and grounded in local experience, knowledge, and perceptions. In this paper, we present a transdisciplinary systems approach to the identification and analysis of suitable agroforestry options, which takes into account gendered perceptions of the benefits and values of natural resources. The 4-step approach consists of an appraisal of local perceptions of the social-ecological context and dynamics, an inventory of existing agroforestry practices and species, a gendered valuation of agroforestry practices and species, and the development of locally adapted and gender-sensitive agroforestry options. In a study using this approach in the Peruvian Andes, data were collected through a combination of participatory tools for gender research and ethnobotanical methods. This paper shares lessons learned and offers recommendations for researchers and practitioners in the field of sustainable mountain development. We discuss methodological considerations in the identification of locally adapted agroforestry options, the understanding of local social-ecological systems, the facilitation of social learning processes, engagement in gender research, and the establishment of ethical research collaborations. The methodology presented here is especially recommended for the exploratory phase of any natural resource management initiative in mountain areas with high environmental and sociocultural variability.

  7. Accuracy of topographic index models at identifying ephemeral gully trajectories on agricultural fields

    Science.gov (United States)

    Sheshukov, Aleksey Y.; Sekaluvu, Lawrence; Hutchinson, Stacy L.

    2018-04-01

    Topographic index (TI) models have been widely used to predict trajectories and initiation points of ephemeral gullies (EGs) in agricultural landscapes. Prediction of EGs relies strongly on the selected value of the critical TI threshold, and the accuracy depends on topographic features, agricultural management, and the dataset of observed EGs. This study statistically evaluated the predictions of TI models in two paired watersheds in Central Kansas that had different levels of structural disturbance due to implemented conservation practices. Four TI models depending solely on the topographic factors of slope, contributing area, and planform curvature were used in this study. The observed EGs were obtained by field reconnaissance and through hydrological reconditioning of digital elevation models (DEMs). Kernel Density Estimation analysis was used to evaluate the TI distribution within a 10-m buffer of the observed EG trajectories. EG occurrence within catchments was analyzed using kappa statistics of the error matrix approach, while the lengths of predicted EGs were compared with the observed dataset using the Nash-Sutcliffe Efficiency (NSE) statistic. The TI frequency analysis produced a bi-modal distribution of topographic indexes, with the pixels within the EG trajectory having a higher peak. The graphs of kappa and NSE versus critical TI threshold showed similar profiles for all four TI models and both watersheds, with the maximum value representing the best comparison with the observed data. The Compound Topographic Index (CTI) model presented the best overall accuracy, with an NSE of 0.55 and a kappa of 0.32. The statistics for the disturbed watershed showed higher best critical TI threshold values than for the undisturbed watershed. Structural conservation practices implemented in the disturbed watershed reduced ephemeral channels in headwater catchments, thus producing less variability in catchments with EGs. The variation in critical thresholds for all
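The length comparison step uses the standard Nash-Sutcliffe efficiency. The sketch below implements that formula; the observed and predicted gully lengths are invented for illustration, and `flag_gully` is a hedged simplification of a CTI-style threshold test (product of contributing area, slope, and curvature), not the study's exact model:

```python
def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.
    1.0 is a perfect match; values <= 0 are no better than the mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

def flag_gully(area, slope, curv, threshold):
    """Flag a pixel as a potential EG point when a compound index
    (area * slope * curvature; hedged form) exceeds the critical threshold."""
    return area * slope * curv > threshold

# Hypothetical observed vs. predicted EG lengths (m) in five catchments
observed = [120.0, 80.0, 0.0, 200.0, 50.0]
predicted = [110.0, 95.0, 20.0, 180.0, 60.0]
print(round(nash_sutcliffe(observed, predicted), 3))
```

Sweeping the threshold and taking the maximum of kappa (for occurrence) and NSE (for length) is how the study selects the best critical TI value for each model.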

  8. Response Properties of a Newly Identified Tristratified Narrow Field Amacrine Cell in the Mouse Retina.

    Directory of Open Access Journals (Sweden)

    G S Newkirk

    Full Text Available Amacrine cells were targeted for whole-cell recording using two-photon fluorescence microscopy in a transgenic mouse line in which the promoter for dopamine receptor 2 drove expression of green fluorescent protein in a narrow-field tristratified amacrine cell (TNAC) that had not been studied previously. Light evoked a multiphasic response that was the sum of hyperpolarizing and depolarizing synaptic inputs, consistent with distinct dendritic ramifications in the off and on sublaminae of the inner plexiform layer. The amplitude and waveform of the response, which consisted of an initial brief hyperpolarization at light onset followed by recovery to a plateau potential close to the dark resting potential and a hyperpolarizing response at light offset, varied little over an intensity range from 0.4 to ~10^6 Rh*/rod/s. This suggests that the cell functions as a differentiator that generates an output signal (a transient reduction in inhibitory input to downstream retinal neurons) that is proportional to the derivative of the light input, independent of its intensity. The underlying circuitry appears to consist of rod- and cone-driven on and off bipolar cells that provide direct excitatory input to the cell as well as to GABAergic amacrine cells that are synaptically coupled to TNAC. Canonical reagents that blocked excitatory (glutamatergic) and inhibitory (GABA and glycine) synaptic transmission had effects on responses to scotopic stimuli consistent with the rod-driven component of the proposed circuit. However, responses evoked by photopic stimuli were paradoxical and could not be interpreted on the basis of conventional thinking about the neuropharmacology of synaptic interactions in the retina.

  9. Automatic fault extraction using a modified ant-colony algorithm

    International Nuclear Information System (INIS)

    Zhao, Junsheng; Sun, Sam Zandong

    2013-01-01

    The basis of automatic fault extraction is seismic attributes, such as the coherence cube, in which a fault is identified by minimum values. The biggest challenge in automatic fault extraction is noise, including that in the seismic data itself. However, a fault has better spatial continuity in a certain direction, which makes it quite different from noise. Exploiting this characteristic, a modified ant-colony algorithm is introduced for automatic fault identification and tracking, in which the gradient direction and direction consistency are used as constraints. Numerical model tests show that this method is feasible and effective for automatic fault extraction and noise suppression. Application to field data further illustrates its validity and superiority. (paper)

  10. Identifying the sources of produced water in the oil field by isotopic techniques

    International Nuclear Information System (INIS)

    Nguyen Minh Quy; Hoang Long; Le Thi Thu Huong; Luong Van Huan; Vo Thi Tuong Hanh

    2014-01-01

    The objective of this study is to identify the sources of the formation water in the Southwest Su-Tu-Den (STD SW) basement reservoir. To achieve this objective, isotopic techniques along with geochemical analyses of chloride, bromide, and strontium dissolved in the water were applied. The isotopic techniques used in this study were the determination of the water stable isotope signatures (δ2H and δ18O) and of the 87Sr/86Sr ratio of strontium in rock cutting samples and dissolved in the formation water. The obtained results showed that the stable isotope composition of water in the Lower Miocene was -3‰ and -23‰ for δ18O and δ2H, respectively, indicating the primeval nature of seawater in the reservoir. Meanwhile, the isotopic composition of water in the basement was clustered in a range of alternated freshwater, with δ18O and δ2H being -(3-4)‰ and -(54-60)‰, respectively. The strontium isotope ratio for water in the Lower Miocene reservoir was lower than that for water in the basement, confirming the different natures of the water in the two reservoirs. The obtained results assure the applicability of the techniques, and it is recommended that studies on the identification of the flow path of the formation water in the STD SW basement reservoir be continued. (author)

  11. Review of Stat-Spotting: A Field Guide to Identifying Dubious Data by Joel Best

    OpenAIRE

    Joe Swingle

    2009-01-01

    Best, Joel. Stat-Spotting: A Field Guide to Identifying Dubious Data. (Berkeley: University of California Press, 2008) 144 pp. $19.95. ISBN 1-978-0-520-25746-7.Stat-Spotting is a practical, do-it-yourself manual for detecting questionable claims reported in the media. Using examples drawn mostly from mass media sources, Stat-Spotting provides readers with a number of useful tips for identifying potentially problematic statistics. The author’s skillful analyses and explanations presented in cl...

  12. Fungi in Thailand: a case study of the efficacy of an ITS barcode for automatically identifying species within the Annulohypoxylon and Hypoxylon genera.

    Directory of Open Access Journals (Sweden)

    Nuttika Suwannasai

    Full Text Available Thailand, a part of the Indo-Burma biodiversity hotspot, has many endemic animals and plants. Some of its fungal species are difficult to recognize and separate, complicating assessments of biodiversity. We assessed species diversity within the fungal genera Annulohypoxylon and Hypoxylon, which produce biologically active and potentially therapeutic compounds, by applying classical taxonomic methods to 552 teleomorphs collected from across Thailand. Using probability of correct identification (PCI, we also assessed the efficacy of automated species identification with a fungal barcode marker, ITS, in the model system of Annulohypoxylon and Hypoxylon. The 552 teleomorphs yielded 137 ITS sequences; in addition, we examined 128 GenBank ITS sequences, to assess biases in evaluating a DNA barcode with GenBank data. The use of multiple sequence alignment in a barcode database like BOLD raises some concerns about non-protein barcode markers like ITS, so we also compared species identification using different alignment methods. Our results suggest the following. (1 Multiple sequence alignment of ITS sequences is competitive with pairwise alignment when identifying species, so BOLD should be able to preserve its present bioinformatics workflow for species identification for ITS, and possibly therefore with at least some other non-protein barcode markers. (2 Automated species identification is insensitive to a specific choice of evolutionary distance, contributing to resolution of a current debate in DNA barcoding. (3 Statistical methods are available to address, at least partially, the possibility of expert misidentification of species. Phylogenetic trees discovered a cryptic species and strongly supported monophyletic clades for many Annulohypoxylon and Hypoxylon species, suggesting that ITS can contribute usefully to a barcode for these fungi. The PCIs here, derived solely from ITS, suggest that a fungal barcode will require secondary markers in
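A leave-one-out nearest-neighbour match rate is one simple way to approximate a probability of correct identification. This sketch uses invented toy sequences and a plain p-distance; it is a stand-in for the idea, not the study's data or its exact PCI definition:

```python
def p_distance(a, b):
    """Proportion of differing sites between two equal-length aligned sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def pci(records):
    """Fraction of sequences whose nearest neighbour (leave-one-out)
    belongs to the same species: a simple PCI-like score."""
    correct = 0
    for i, (sp_i, seq_i) in enumerate(records):
        best = min(
            (p_distance(seq_i, seq_j), sp_j)
            for j, (sp_j, seq_j) in enumerate(records) if j != i
        )
        correct += best[1] == sp_i          # nearest neighbour has same label?
    return correct / len(records)

# Hypothetical aligned ITS fragments labelled by species
data = [
    ("H.fragiforme", "ACGTACGTAA"),
    ("H.fragiforme", "ACGTACGTAT"),
    ("A.cohaerens",  "TTGTACCAAA"),
    ("A.cohaerens",  "TTGTACCAAT"),
]
print(pci(data))  # → 1.0
```

Because the score only asks whether conspecific sequences cluster more tightly than heterospecific ones, it is largely insensitive to the specific distance chosen, which echoes finding (2) above.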

  13. Fungi in Thailand: a case study of the efficacy of an ITS barcode for automatically identifying species within the Annulohypoxylon and Hypoxylon genera.

    Science.gov (United States)

    Suwannasai, Nuttika; Martín, María P; Phosri, Cherdchai; Sihanonth, Prakitsin; Whalley, Anthony J S; Spouge, John L

    2013-01-01

    Thailand, a part of the Indo-Burma biodiversity hotspot, has many endemic animals and plants. Some of its fungal species are difficult to recognize and separate, complicating assessments of biodiversity. We assessed species diversity within the fungal genera Annulohypoxylon and Hypoxylon, which produce biologically active and potentially therapeutic compounds, by applying classical taxonomic methods to 552 teleomorphs collected from across Thailand. Using probability of correct identification (PCI), we also assessed the efficacy of automated species identification with a fungal barcode marker, ITS, in the model system of Annulohypoxylon and Hypoxylon. The 552 teleomorphs yielded 137 ITS sequences; in addition, we examined 128 GenBank ITS sequences, to assess biases in evaluating a DNA barcode with GenBank data. The use of multiple sequence alignment in a barcode database like BOLD raises some concerns about non-protein barcode markers like ITS, so we also compared species identification using different alignment methods. Our results suggest the following. (1) Multiple sequence alignment of ITS sequences is competitive with pairwise alignment when identifying species, so BOLD should be able to preserve its present bioinformatics workflow for species identification for ITS, and possibly therefore with at least some other non-protein barcode markers. (2) Automated species identification is insensitive to a specific choice of evolutionary distance, contributing to resolution of a current debate in DNA barcoding. (3) Statistical methods are available to address, at least partially, the possibility of expert misidentification of species. Phylogenetic trees discovered a cryptic species and strongly supported monophyletic clades for many Annulohypoxylon and Hypoxylon species, suggesting that ITS can contribute usefully to a barcode for these fungi. The PCIs here, derived solely from ITS, suggest that a fungal barcode will require secondary markers in Annulohypoxylon and
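The probability of correct identification (PCI) used in this record can be illustrated with a minimal sketch: identify each sequence by the species of its nearest neighbor under a plain p-distance, then score the fraction identified correctly. The sequences and species labels below are hypothetical, and the study's actual distance and alignment choices are more involved.

```python
# Toy PCI (probability of correct identification) for aligned barcode
# sequences, using nearest-neighbor species assignment and p-distance.

def p_distance(a, b):
    """Proportion of mismatched sites between two aligned sequences."""
    return sum(1 for x, y in zip(a, b) if x != y) / len(a)

def identify(query_idx, seqs, labels):
    """Assign the query the species label of its nearest non-self neighbor."""
    best = min(
        (i for i in range(len(seqs)) if i != query_idx),
        key=lambda i: p_distance(seqs[query_idx], seqs[i]),
    )
    return labels[best]

def pci(seqs, labels):
    """Fraction of sequences whose nearest neighbor shares their species."""
    correct = sum(
        1 for i in range(len(seqs)) if identify(i, seqs, labels) == labels[i]
    )
    return correct / len(seqs)

# Hypothetical 8-bp "barcodes" for two species, two specimens each.
seqs = ["ACGTACGT", "ACGTACGA", "TTGTACCA", "TTGTACCT"]
labels = ["sp1", "sp1", "sp2", "sp2"]
print(pci(seqs, labels))
```

In this toy data each specimen's nearest neighbor is its conspecific, so the PCI is 1.0; misidentified or cryptic species lower the score.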

  14. Approaches to identifying reservoir heterogeneity and reserve growth opportunities from subsurface data: The Oficina Formation, Budare field, Venezuela

    Energy Technology Data Exchange (ETDEWEB)

Hamilton, D.S.; Raeuchle, S.K.; Holtz, M.H. [Bureau of Economic Geology, Austin, TX (United States)] [and others]

    1997-08-01

We applied an integrated geologic, geophysical, and engineering approach devised to identify heterogeneities in the subsurface that might lead to reserve growth opportunities in our analysis of the Oficina Formation at Budare field, Venezuela. The approach involves 4 key steps: (1) Determine geologic reservoir architecture; (2) Investigate trends in reservoir fluid flow; (3) Integrate fluid flow trends with reservoir architecture; and (4) Estimate original oil-in-place, residual oil saturation, and remaining mobile oil, to identify opportunities for reserve growth. There are three main oil-producing reservoirs in the Oficina Formation that were deposited in a bed-load fluvial system, an incised valley-fill, and a barrier-strandplain system. Reservoir continuity is complex because, in addition to lateral facies variability, the major Oficina depositional systems were internally subdivided by high-frequency stratigraphic surfaces. These surfaces define times of intermittent lacustrine and marine flooding events that punctuated the fluvial and marginal marine sedimentation, respectively. Syn- and post-depositional faulting further disrupted reservoir continuity. Trends in fluid flow established from initial fluid levels, response to recompletion workovers, and pressure depletion data demonstrated barriers to lateral and vertical fluid flow caused by a combination of reservoir facies pinchout, flooding shale markers, and the faults. Considerable reserve growth potential exists at Budare field because the reservoir units are highly compartmentalized by the depositional heterogeneity and structural complexity. Numerous reserve growth opportunities were identified in attics updip of existing production, in untapped or incompletely drained compartments, and in field extensions.

  15. First results from the field test of households with dynamic tariff and automatic control in the regenerative model region Harz; Erste Ergebnisse des Haushaltsfeldtests mit dynamischen Tarif und automatischer Steuerung in der Regenerativen Modellregion Harz

    Energy Technology Data Exchange (ETDEWEB)

    Funke, Stephan; Landau, Markus [Fraunhofer Institut fuer Windenergie und Energiesystemtechnik (IWES), Kassel (Germany); Filzek, Dirk; Volkert, Christina [CUBE Engineering GmbH, Kassel (Germany); Fechner, Amelie [Saarland Univ., Saarbruecken (Germany). Forschungsgruppe Umweltpsychologie

    2012-07-01

As part of the E-Energy research project RegModHarz (Regenerative Model Region Harz), a field test with test households is carried out. A system developed in the project, consisting of a dynamic tariff, an appliance control and a monitoring system, is tested. Concomitantly, the acceptance of this system by the field test participants is evaluated. First results from the commissioning of the system are already available. Currently, the second phase of the field test is under way, in which the participants can actively and automatically adjust their power consumption to the availability of electricity from renewable energy sources in the model region.

  16. Remote Sensing and GIS as Tools for Identifying Risk for Phreatomagmatic Eruptions in the Bishoftu Volcanic Field, Ethiopia

    Science.gov (United States)

    Pennington, H. G.; Graettinger, A.

    2017-12-01

Bishoftu is a fast-growing town in the Oromia region of Ethiopia, located 47 km southeast of the nation's capital, Addis Ababa. It is situated atop a monogenetic basaltic volcanic field, called the Bishoftu Volcanic Field (BVF), which is composed of maar craters, scoria cones, lava flows, and rhyolite domes. Although not well dated, the morphology and archeological evidence have been used to infer a Holocene age, indicating that the community is exposed to continued volcanic risk. The presence of phreatomagmatic constructs in particular indicates that the hazards are not only vent-localized, but may have far-reaching impacts. Hazard mapping is an essential tool for evaluating and communicating risks. This study presents the results of GIS analyses of proximal and distal syn-eruptive hazards associated with phreatomagmatic eruptions in the BVF. A digitized infrastructure map based on a SPOT 6 satellite image is used to identify the areas at risk from eruption scenarios. Parameters such as wind direction, vent location, and explosion energy are varied for hazard simulations to quantify the area impacted by different eruption scenarios. Proximal syn-eruptive hazards include tephra fall, pyroclastic base surges, and ballistic bombs. Distal hazards consist predominantly of ash fall. Eruption scenarios are simulated using the Eject and Plumeria models as well as similar case studies from other urban volcanic fields. Within 5 km of the volcanic field center, more than 30 km2 of residential and commercial/industrial infrastructure will be damaged by proximal syn-eruptive hazards, in addition to 34 km2 of agricultural land, 291 km of roads, more than 10 km of railway, an airport, and two health centers. Within 100 km of the volcanic field center, ash fall will affect 3946 km2 of agricultural land, 179 km2 of residential land, and 28 km2 of commercial/industrial land. Approximately 2700 km of roads and railways, 553 km of waterways, an airport, and 14 health centers are located
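The ballistic-bomb component of such eruption simulations can be bounded with a drag-free range formula. This is only a toy sketch with assumed ejection speeds; Eject-style models include drag and landing elevation, so real hazard footprints are smaller.

```python
import math

def ballistic_range(v0, theta_deg, g=9.81):
    """Drag-free horizontal range (m) of a block launched at speed v0 (m/s)
    and angle theta_deg above horizontal, landing at launch elevation."""
    theta = math.radians(theta_deg)
    return v0 ** 2 * math.sin(2 * theta) / g

# Assumed ejection velocities for illustration only.
for v0 in (100, 150, 200):
    print(f"v0 = {v0:3d} m/s -> max range ~ {ballistic_range(v0, 45):.0f} m")
```

The 45-degree launch gives the maximum drag-free range; varying vent location and explosion energy in a hazard model amounts to sweeping v0 and the launch point over plausible values.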

  17. Automatic alignment device for focal spot measurements in the center of the field for mammography; Sistema automatico de alinhamento para avaliacao do ponto focal no centro do campo de equipamentos mamograficos

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Marcelo A.C.; Watanabe, Alex O.; Oliveira Junior, Paulo D.; Schiabel, Homero [Universidade de Sao Paulo (USP), Sao Carlos, SP (Brazil). Escola de Engenharia. Dept. de Engenharia Eletrica], e-mail: mvieira@sc.usp.br

    2010-03-15

Some quality control procedures used for mammography, such as focal spot evaluation, require prior alignment of the measurement equipment with the X-ray central beam. However, alignment is, in general, the most difficult and time-consuming part of these procedures, and the operator is sometimes exposed to radiation while performing it. This work presents an automatic alignment system for mammographic equipment that locates the central ray of the radiation beam and immediately aligns with it by moving automatically along the field. The system consists of a bidirectional moving device connected to a CCD sensor for digital radiographic image acquisition. A computational analysis of a radiographic image, acquired at any position on the field, is performed in order to determine its position under the X-ray beam. Finally, a mechanical system with two directions of movement, electronically controlled by a microcontroller over USB communication, aligns the system automatically with the central ray of the radiation beam. The alignment process is fully automatic, fast and accurate, with no operator exposure to radiation, which saves considerable time in carrying out quality control procedures for mammography. (author)
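The central-ray location step described here can be sketched as an intensity-weighted centroid of the acquired radiographic frame. The synthetic image and the centroid approach below are illustrative assumptions, since the abstract does not specify the image analysis actually used.

```python
def beam_centroid(image):
    """Intensity-weighted centroid (row, col) of a 2-D image given as a
    list of rows; approximates the central-ray position on the detector."""
    total = wr = wc = 0.0
    for r, row_vals in enumerate(image):
        for c, v in enumerate(row_vals):
            total += v
            wr += r * v
            wc += c * v
    return wr / total, wc / total

# Synthetic radiographic frame with peak intensity near (row 2, col 3).
image = [
    [0, 0, 1, 1, 0],
    [0, 1, 4, 5, 1],
    [0, 2, 8, 9, 2],
    [0, 1, 4, 5, 1],
]
row, col = beam_centroid(image)
# A stepper-driven stage would then move until this centroid coincides
# with the detector center.
print(round(row, 2), round(col, 2))
```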

  18. Development of a field testing protocol for identifying Deepwater Horizon oil spill residues trapped near Gulf of Mexico beaches

    Science.gov (United States)

    Han, Yuling

    2018-01-01

The Deepwater Horizon (DWH) accident, one of the largest oil spills in U.S. history, contaminated several beaches located along the Gulf of Mexico (GOM) shoreline. The residues from the spill still continue to be deposited on some of these beaches. Methods to track and monitor the fate of these residues require approaches that can differentiate the DWH residues from other types of petroleum residues. This is because, historically, crude oil released from sources such as natural seeps and anthropogenic discharges has also deposited other types of petroleum residues on GOM beaches. Therefore, identifying the origin of these residues is critical for developing effective management strategies for monitoring the long-term environmental impacts of the DWH oil spill. Advanced fingerprinting methods that are currently used for identifying the source of oil spill residues require detailed laboratory studies, which can be cost-prohibitive. Also, most agencies typically use untrained workers or volunteers to conduct shoreline monitoring surveys, and these workers will not have access to advanced laboratory facilities. Furthermore, it is impractical to routinely fingerprint large volumes of samples that are collected after a major oil spill event, such as the DWH spill. In this study, we propose a simple field testing protocol that can identify DWH oil spill residues based on their unique physical characteristics. The robustness of the method is demonstrated by testing a variety of oil spill samples, and the results are verified by characterizing the samples using advanced chemical fingerprinting methods. The verification data show that the method yields results that are consistent with the results derived from advanced fingerprinting methods. The proposed protocol is a reliable, cost-effective, practical field approach for differentiating DWH residues from other types of petroleum residues. PMID:29329313

  19. Development of a field testing protocol for identifying Deepwater Horizon oil spill residues trapped near Gulf of Mexico beaches.

    Science.gov (United States)

    Han, Yuling; Clement, T Prabhakar

    2018-01-01

The Deepwater Horizon (DWH) accident, one of the largest oil spills in U.S. history, contaminated several beaches located along the Gulf of Mexico (GOM) shoreline. The residues from the spill still continue to be deposited on some of these beaches. Methods to track and monitor the fate of these residues require approaches that can differentiate the DWH residues from other types of petroleum residues. This is because, historically, crude oil released from sources such as natural seeps and anthropogenic discharges has also deposited other types of petroleum residues on GOM beaches. Therefore, identifying the origin of these residues is critical for developing effective management strategies for monitoring the long-term environmental impacts of the DWH oil spill. Advanced fingerprinting methods that are currently used for identifying the source of oil spill residues require detailed laboratory studies, which can be cost-prohibitive. Also, most agencies typically use untrained workers or volunteers to conduct shoreline monitoring surveys, and these workers will not have access to advanced laboratory facilities. Furthermore, it is impractical to routinely fingerprint large volumes of samples that are collected after a major oil spill event, such as the DWH spill. In this study, we propose a simple field testing protocol that can identify DWH oil spill residues based on their unique physical characteristics. The robustness of the method is demonstrated by testing a variety of oil spill samples, and the results are verified by characterizing the samples using advanced chemical fingerprinting methods. The verification data show that the method yields results that are consistent with the results derived from advanced fingerprinting methods. The proposed protocol is a reliable, cost-effective, practical field approach for differentiating DWH residues from other types of petroleum residues.

  20. Isl1 expression at the venous pole identifies a novel role for the second heart field in cardiac development.

    Science.gov (United States)

    Snarr, Brian S; O'Neal, Jessica L; Chintalapudi, Mastan R; Wirrig, Elaine E; Phelps, Aimee L; Kubalak, Steven W; Wessels, Andy

    2007-11-09

    The right ventricle and outflow tract of the developing heart are derived from mesodermal progenitor cells from the second heart field (SHF). SHF cells have been characterized by expression of the transcription factor Islet-1 (Isl1). Although Isl1 expression has also been reported in the venous pole, the specific contribution of the SHF to this part of the heart is unknown. Here we show that Isl1 is strongly expressed in the dorsal mesenchymal protrusion (DMP), a non-endocardially-derived mesenchymal structure involved in atrioventricular septation. We further demonstrate that abnormal development of the SHF-derived DMP is associated with the pathogenesis of atrioventricular septal defects. These results identify a novel role for the SHF.

  1. Identifying Faults Associated with the 2001 Avoca Induced(?) Seismicity Sequence of Western New York State Using Potential Field Wavelets.

    Science.gov (United States)

    Horowitz, F. G.; Ebinger, C.; Jordan, T. E.

    2017-12-01

Results from recent DOE and USGS sponsored projects in the (intraplate) northeastern portions of the US and southeastern portions of Canada have identified locations of steeply dipping structures - many previously unknown - from a Poisson wavelet multiscale edge ('worm') analysis of gravity and magnetic fields. The Avoca sequence of induced(?) seismicity in western New York state occurred during January and February of 2001. The Avoca earthquake sequence is associated with industrial hydraulic fracturing activity "related to a proposed natural gas storage facility near Avoca to be constructed by solution mining" (Kim, 2001). The main Avoca event was a felt Mb = 3.2 earthquake on Feb. 3, 2001 recorded by the Lamont Cooperative Seismic Network. Earlier, smaller events were located by the Canadian Geological Survey's seismic network north of the Canadian border - implying that the event locations might be biased because they occurred off the southern edge of the array. Some of these events were also felt locally, according to local newspaper reports. By plotting the location of the seismic events and that of the injection well - reported via its API number - we find a strong correlation with structures detected via our potential field worms. The injection occurred near a NE-SW striking structure that was not activated. All but one of the earthquakes occurred about 5 km north of the injection well on or nearby to an E-W striking structure that appears to intersect the NE-SW structure. The final, small (MN=2.2) earthquake was located on a different complex structure about 10 km north of the other events. We suggest that potential field methods such as ours might be appropriate for locating structures of concern for induced seismic activity in association with industrial activity. Reference: Kim, W.-Y. (2001). The Lamont cooperative seismic network and the national seismic system: Earthquake hazard studies in the northeastern United States. Tech. Rep. 98-01, Lamont

  2. Nuclear proliferomics: A new field of study to identify signatures of nuclear materials as demonstrated on alpha-UO3.

    Science.gov (United States)

Schwerdt, Ian J; Brenkmann, Alexandria; Martinson, Sean; Albrecht, Brent D; Heffernan, Sean; Klosterman, Michael R; Kirkham, Trenton; Tasdizen, Tolga; McDonald IV, Luther W

    2018-08-15

The use of a limited set of signatures in nuclear forensics and nuclear safeguards may reduce the discriminating power for identifying unknown nuclear materials, or for verifying processing at existing facilities. Nuclear proliferomics is a proposed new field of study that advocates for the acquisition of large databases of nuclear material properties from a variety of analytical techniques. As demonstrated on a common uranium trioxide polymorph, α-UO3, in this paper, nuclear proliferomics increases the ability to improve confidence in identifying the processing history of nuclear materials. Specifically, α-UO3 was investigated from the calcination of unwashed uranyl peroxide at 350, 400, 450, 500, and 550 °C in air. Scanning electron microscopy (SEM) images were acquired of the surface morphology, and distinct qualitative differences are presented between unwashed and washed uranyl peroxide, as well as the calcination products from the unwashed uranyl peroxide at the investigated temperatures. Differential scanning calorimetry (DSC), UV-Vis spectrophotometry, powder X-ray diffraction (p-XRD), and thermogravimetric analysis-mass spectrometry (TGA-MS) were used to understand the source of these morphological differences as a function of calcination temperature. Additionally, the SEM images were manually segmented using Morphological Analysis for MAterials (MAMA) software to identify quantifiable differences in morphology for three different surface features present on the unwashed uranyl peroxide calcination products. No single quantifiable signature was sufficient to discern all calcination temperatures with a high degree of confidence; therefore, advanced statistical analysis was performed to allow the combination of a number of quantitative signatures, with their associated uncertainties, to allow for complete discernment by calcination history. Furthermore, machine learning was applied to the acquired SEM images to demonstrate automated discernment with
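The combination of quantitative signatures with associated uncertainties can be illustrated with a toy nearest-class score that weights each morphological feature by its spread. All feature names and numbers below are hypothetical; the paper's actual statistical analysis is considerably more sophisticated.

```python
def class_score(sample, class_mean, class_sd):
    """Sum of squared, uncertainty-normalized deviations across features.
    Lower means a better match to that calcination class."""
    return sum(((x - m) / s) ** 2
               for x, m, s in zip(sample, class_mean, class_sd))

# Hypothetical (mean, SD) per feature (e.g. particle area, circularity)
# for two calcination temperatures, plus one unknown sample.
classes = {
    "350C": ([2.0, 0.60], [0.3, 0.05]),
    "550C": ([3.5, 0.45], [0.4, 0.06]),
}
sample = [3.3, 0.47]
best = min(classes, key=lambda k: class_score(sample, *classes[k]))
print(best)
```

Normalizing each deviation by the class standard deviation is what lets several weak signatures, none decisive alone, combine into a confident assignment.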

  3. Coupling field and laboratory measurements to estimate the emission factors of identified and unidentified trace gases for prescribed fires

    Directory of Open Access Journals (Sweden)

    R. J. Yokelson

    2013-01-01

Full Text Available An extensive program of experiments focused on biomass burning emissions began with a laboratory phase in which vegetative fuels commonly consumed in prescribed fires were collected in the southeastern and southwestern US and burned in a series of 71 fires at the US Forest Service Fire Sciences Laboratory in Missoula, Montana. The particulate matter (PM2.5) emissions were measured by gravimetric filter sampling with subsequent analysis for elemental carbon (EC), organic carbon (OC), and 38 elements. The trace gas emissions were measured by an open-path Fourier transform infrared (OP-FTIR) spectrometer, proton-transfer-reaction mass spectrometry (PTR-MS), proton-transfer ion-trap mass spectrometry (PIT-MS), negative-ion proton-transfer chemical-ionization mass spectrometry (NI-PT-CIMS), and gas chromatography with MS detection (GC-MS). 204 trace gas species (mostly non-methane organic compounds (NMOC)) were identified and quantified with the above instruments. Many of the 182 species quantified by the GC-MS have rarely, if ever, been measured in smoke before. An additional 153 significant peaks in the unit mass resolution mass spectra were quantified, but either could not be identified or most of the signal at that molecular mass was unaccounted for by identifiable species.

In a second, "field" phase of this program, airborne and ground-based measurements were made of the emissions from prescribed fires that were mostly located in the same land management units where the fuels for the lab fires were collected. A broad variety, but smaller number of species (21 trace gas species and PM2.5) was measured on 14 fires in chaparral and oak savanna in the southwestern US, as well as pine forest understory in the southeastern US and Sierra Nevada mountains of California. The field measurements of emission factors (EF) are useful both for modeling and to examine the representativeness of our lab fire EF. The lab EF/field EF ratio for

  4. Coupling field and laboratory measurements to estimate the emission factors of identified and unidentified trace gases for prescribed fires

    Energy Technology Data Exchange (ETDEWEB)

    Yokelson, R. J.; Burling, I. R.; Gilman, J. B.; Warneke, C.; Stockwell, C. E.; de Gouw, J.; Akagi, S. K.; Urbanski, S. P.; Veres, P.; Roberts, J. M.; Kuster, W. C.; Reardon, J.; Griffith, D. W. T.; Johnson, T. J.; Hosseini, S.; Miller, J. W.; Cocker III, D. R.; Jung, H.; Weise, D. R.

    2013-01-01

    Vegetative fuels commonly consumed in prescribed fires were collected from five locations in the southeastern and southwestern U.S. and burned in a series of 77 fires at the U.S. Forest Service Fire Sciences Laboratory in Missoula, Montana. The particulate matter (PM2.5) emissions were measured by gravimetric filter sampling with subsequent analysis for elemental carbon (EC), organic carbon (OC), and 38 elements. The trace gas emissions were measured with a large suite of state-of-the-art instrumentation including an open-path Fourier transform infrared (OP FTIR) spectrometer, proton-transfer-reaction mass spectrometry (PTR-MS), proton-transfer ion-trap mass spectrometry (PIT-MS), negative-ion proton-transfer chemical-ionization mass spectrometry (NI-PT-CIMS), and gas chromatography with MS detection (GC-MS). 204 trace gas species (mostly non-methane organic compounds (NMOC)) were identified and quantified with the above instruments. An additional 152 significant peaks in the unit mass resolution mass spectra were quantified, but either could not be identified or most of the signal at that molecular mass was unaccounted for by identifiable species. As phase II of this study, we conducted airborne and ground-based sampling of the emissions from real prescribed fires mostly in the same land management units where the fuels for the lab fires were collected. A broad variety, but smaller number of species (21 trace gas species and PM2.5) was measured on 14 fires in chaparral and oak savanna in the southwestern US, as well as pine forest understory in the southeastern US and Sierra Nevada mountains of California. These extensive field measurements of emission factors (EF) for temperate biomass burning are useful both for modeling and to examine the representativeness of our lab fire EF. The lab/field EF ratio for the pine understory fuels was not statistically different from one, on average. However, our lab EF for “smoldering compounds” emitted by burning the semi
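Emission factors in studies of this kind are commonly derived by carbon mass balance. A minimal version of that calculation, with illustrative excess mixing ratios and an assumed fuel carbon fraction of 0.50, might look like:

```python
def emission_factor(species_mw, dx, carbon_species, fc=0.50):
    """Carbon mass-balance emission factor (g species per kg dry fuel).
    dx: excess mixing ratio (ppm) of the species of interest;
    carbon_species: {name: (excess ppm, carbon atoms per molecule)} for
    every measured carbon-containing species in the smoke;
    fc: assumed carbon mass fraction of the fuel."""
    total_c = sum(ppm * n_c for ppm, n_c in carbon_species.values())
    return fc * 1000.0 * (species_mw / 12.011) * (dx / total_c)

# Illustrative smoke sample: excess CO2, CO, CH4 only. Real studies sum
# carbon over hundreds of measured species.
carbon = {"CO2": (900.0, 1), "CO": (80.0, 1), "CH4": (6.0, 1)}
ef_co = emission_factor(28.01, carbon["CO"][0], carbon)
print(f"EF(CO) ~ {ef_co:.0f} g/kg")
```

Ignoring minor carbon species (as in this three-species toy) slightly inflates each EF, which is why comprehensive speciation like that described above matters.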

  5. Identifying and prioritizing the factors influencing the success of science and technology foresight in the field of economy

    Directory of Open Access Journals (Sweden)

    Afsaneh Raieninezhad

    2014-08-01

Full Text Available With the increasingly complex global environment and the tremendous growth of network communication technology worldwide, strategic planning and foresight activities in science and technology have become very important. Organizations and businesses are gradually realizing the importance of foresight, and many attempt to carry out such activities. However, this concept is still not well known in our country and among our organizations. Recognizing the factors that influence the success of foresight is therefore an issue that organizations and practitioners face. This research seeks to identify and rank those factors, particularly in the field of economy, and develops five hypotheses. In this paper, the factors affecting the success of foresight are grouped into four categories: rationale, structure, scope, and results. Data were collected by questionnaire, and binomial tests, Pearson correlation, and the Friedman test were used to test the hypotheses. Analysis of the questionnaire data in SPSS confirmed all research hypotheses. It also became clear that the rationale component had the greatest impact on the success of science and technology foresight in the field of economy.

  6. Study of the method of water-injected meat identifying based on low-field nuclear magnetic resonance

    Science.gov (United States)

    Xu, Jianmei; Lin, Qing; Yang, Fang; Zheng, Zheng; Ai, Zhujun

    2018-01-01

The aim of this study was to apply low-field nuclear magnetic resonance to study the regular variation of the transverse relaxation spectrum parameters of water-injected meat with the proportion of injected water. On this basis, one-way ANOVA and discriminant analysis were used to compare how well these parameters distinguish the water-injected proportion, and to establish a model for identifying water-injected meat. The results show that, except for T21b, T22e and T23b, the parameters of the T2 relaxation spectrum changed regularly with the water-injected proportion. The parameters differed in their ability to distinguish the water-injected proportion. With S, P22 and T23m as predictor variables, Fisher and Bayes models were established by discriminant analysis, enabling qualitative and quantitative classification of water-injected meat. The correct discrimination rate was 88% for both validation and cross-validation, indicating a stable model.
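A two-class Fisher discriminant of the kind used in this record can be sketched for two features in pure Python; the toy numbers below merely stand in for relaxation-spectrum predictors such as S and P22.

```python
def fisher_direction(class_a, class_b):
    """Fisher discriminant direction w = Sw^-1 (mean_a - mean_b) for two
    classes of 2-feature samples, with pooled within-class scatter Sw."""
    def mean(pts):
        return [sum(p[i] for p in pts) / len(pts) for i in (0, 1)]
    ma, mb = mean(class_a), mean(class_b)
    s = [[0.0, 0.0], [0.0, 0.0]]  # pooled within-class scatter (2x2)
    for pts, m in ((class_a, ma), (class_b, mb)):
        for p in pts:
            d = (p[0] - m[0], p[1] - m[1])
            for i in (0, 1):
                for j in (0, 1):
                    s[i][j] += d[i] * d[j]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    dm = (ma[0] - mb[0], ma[1] - mb[1])
    # w = Sw^-1 * dm via the closed-form 2x2 inverse.
    return [(s[1][1] * dm[0] - s[0][1] * dm[1]) / det,
            (s[0][0] * dm[1] - s[1][0] * dm[0]) / det]

# Toy (feature1, feature2) samples for control vs water-injected meat.
control = [(1.0, 2.0), (1.2, 1.9), (0.9, 2.2)]
injected = [(2.0, 3.0), (2.2, 3.1), (1.9, 2.8)]
w = fisher_direction(control, injected)

def project(p):
    return w[0] * p[0] + w[1] * p[1]

# Classify a new sample against the midpoint of the projected class means.
mid = (sum(map(project, control)) / 3 + sum(map(project, injected)) / 3) / 2
ctrl_side = sum(map(project, control)) / 3 > mid
sample = (1.1, 2.1)
label = "control" if (project(sample) > mid) == ctrl_side else "injected"
print(label)
```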

  7. Automatic plasma control in magnetic traps

    International Nuclear Information System (INIS)

    Samojlenko, Y.; Chuyanov, V.

    1984-01-01

Hot plasma is essentially in a thermodynamic non-steady state. Automatic plasma control basically means monitoring deviations from the steady state and producing a suitable magnetic or electric field which brings the plasma back to its original state. Two systems of automatic plasma control are briefly described: control with a magnetic field using a negative impedance circuit, and control using an electric field. It appears that systems of automatic plasma stabilization will be an indispensable component of the fusion reactor, and their capabilities will in many ways determine the reactor economy. (Ha)

  8. Accessories for Enhancement of the Semi-Automatic Welding Processes

    National Research Council Canada - National Science Library

    Wheeler, Douglas M; Sawhill, James M

    2000-01-01

    The project's objective is to identify specific areas of the semi-automatic welding operation that is performed with the major semi-automatic processes, which would be more productive if a suitable...

  9. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.

    1978-01-01

United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand and shoe monitors. In addition, this prototype system also has a number of new features, including: micro computer control and readout, nineteen large area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company controlled boundaries. Investigation of the commercially available portal and hand and shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  10. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility than partially automated variants, would therefore provide a solution. In addition, a fully automated test reduces the radiation dose received by the tester. (orig.) [de]

  11. Comparative Mapping of Soil Physical-Chemical and Structural Parameters at Field Scale to Identify Zones of Enhanced Leaching Risk

    DEFF Research Database (Denmark)

    Nørgaard, Trine; Møldrup, Per; Olsen, Preben

    2013-01-01

Preferential flow and particle-facilitated transport through macropores contribute significantly to the transport of strongly sorbing substances such as pesticides and phosphorus. The aim of this study was to perform a field-scale characterization of basic soil physical properties like clay...... and organic carbon content and investigate whether it was possible to relate these to derived structural parameters such as bulk density and conservative tracer parameters and to actual particle and phosphorus leaching patterns obtained from laboratory leaching experiments. Sixty-five cylindrical soil columns...... of 20 cm height and 20 cm diameter and bulk soil were sampled from the topsoil in a 15 m × 15 m grid in an agricultural loamy field. Highest clay contents and highest bulk densities were found in the northern part of the field. Leaching experiments with a conservative tracer showed fast 5% tracer...

  12. Stable isotopes as signposts of fluid throughput in Rotokawa and other geothermal fields, and the difficulty of identifying magmatic fingerprints

    International Nuclear Information System (INIS)

    Blattner, P.; Woldemichael, S.; Auckland Univ.; Browne, P.R.L.; Auckland Univ.

    1994-01-01

    We present a background for water-rock interaction generally, and new data on the Rotokawa geothermal field. The oxygen isotope shift of total rock samples allows the deduction of past flowpaths and total fluid throughput. Estimates of any input of true exsolved magmatic water are difficult, as the lithosphere can act as an effective isotopic screen. (authors). 1 fig., 6 refs

  13. THEORETICAL CONSIDERATIONS REGARDING THE AUTOMATIC FISCAL STABILIZERS OPERATING MECHANISM

    Directory of Open Access Journals (Sweden)

    Gondor Mihaela

    2012-07-01

    Full Text Available This paper examines the role of Automatic Fiscal Stabilizers (AFS) for stabilizing the cyclical fluctuations of macroeconomic output as an alternative to discretionary fiscal policy, admitting their huge potential as an anti-crisis solution. The objectives of the study are the identification of the general features of the concept of automatic fiscal stabilizers and their logical assessment from an economic perspective. Based on the literature in the field, this paper points out the disadvantages of discretionary fiscal policy and argues the need of using Automatic Fiscal Stabilizers in order to provide a faster decision making process, shielded from political interference, and reduced uncertainty for households and the business environment. The paper concludes on the need of using fiscal policy for smoothing the economic cycle, but in a way which includes among its features transparency, responsibility and clear operating mechanisms. Based on the research results the present paper assumes that pro-cyclicality reduces the effectiveness of the Automatic Fiscal Stabilizers and as a result concludes that it is very important to avoid pro-cyclicality in fiscal rule design. Moreover, by committing in advance to specific fiscal policy action contingent on economic developments, uncertainty about the fiscal policy framework during a recession should be reduced. Being based on logical analysis rather than on an empirical, contextualized one, the paper presents some features of the AFS operating mechanism and also identifies and systematizes the factors which provide its importance and national individuality. Reaching common understanding on the Automatic Fiscal Stabilizer concept as an institutional device for smoothing the gap of the economic cycles across different countries, particularly for the European Union Member States, will facilitate efforts to coordinate fiscal policy responses during a crisis, especially in the context of the fiscal

  14. A comparison of hydroponic and soil-based screening methods to identify salt tolerance in the field in barley

    Science.gov (United States)

    Tavakkoli, Ehsan; Fatehi, Foad; Rengasamy, Pichu; McDonald, Glenn K.

    2012-01-01

    Success in breeding crops for yield and other quantitative traits depends on the use of methods to evaluate genotypes accurately under field conditions. Although many screening criteria have been suggested to distinguish between genotypes for their salt tolerance under controlled environmental conditions, there is a need to test these criteria in the field. In this study, the salt tolerance, ion concentrations, and accumulation of compatible solutes of genotypes of barley with a range of putative salt tolerance were investigated using three growing conditions (hydroponics, soil in pots, and natural saline field). Initially, 60 genotypes of barley were screened for their salt tolerance and uptake of Na+, Cl–, and K+ at 150 mM NaCl and, based on this, a subset of 15 genotypes was selected for testing in pots and in the field. Expression of salt tolerance in saline solution culture was not a reliable indicator of the differences in salt tolerance between barley plants that were evident in saline soil-based comparisons. Significant correlations were observed in the rankings of genotypes on the basis of their grain yield production at a moderately saline field site and their relative shoot growth in pots at ECe 7.2 [Spearman’s rank correlation (rs)=0.79] and ECe 15.3 (rs=0.82) and the crucial parameter of leaf Na+ (rs=0.72) and Cl– (rs=0.82) concentrations at ECe 7.2 dS m−1. This work has established screening procedures that correlated well with grain yield at sites with moderate levels of soil salinity. This study also showed that both salt exclusion and osmotic tolerance are involved in salt tolerance and that the relative importance of these traits may differ with the severity of the salt stress. In soil, ion exclusion tended to be more important at low to moderate levels of stress but osmotic stress became more important at higher stress levels. Salt exclusion coupled with a synthesis of organic solutes were shown to be important components of salt
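
The genotype rankings in studies like this one are compared with Spearman's rank correlation (rs): Pearson correlation computed on ranks rather than raw values, so it is insensitive to the nonlinear relation between pot growth and field yield. A minimal sketch of the statistic, using illustrative numbers rather than the study's data:

```python
def rankdata(values):
    # Assign 1-based ranks, averaging ranks over ties
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Pearson correlation of the rank vectors
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Perfectly monotonic rankings give rs = 1.0
print(spearman([1, 2, 3, 4, 5], [10, 20, 30, 40, 50]))  # → 1.0
```

Because only the ordering of genotypes enters the statistic, an rs of 0.79-0.82 between pot and field rankings, as reported above, indicates that the cheaper screen preserves most of the field ordering.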

  15. A comparison of screening methods to identify waterlogging tolerance in the field in Brassica napus L. during plant ontogeny.

    Directory of Open Access Journals (Sweden)

    Xiling Zou

    Full Text Available Waterlogging tolerance is typically evaluated at a specific development stage, with an implicit assumption that differences in waterlogging tolerance expressed in these systems will result in improved yield performance in the field. It is necessary to examine these criteria in the field. In the present study, three experiments were conducted to screen waterlogging tolerance in 25 rapeseed (Brassica napus L.) varieties at different developmental stages: the seedling establishment stage and seedling stage in a controlled environment, and the maturity stage in the field. The assessments of physiological parameters at the three growth stages suggest that there were differences in waterlogging tolerance at all developmental stages, providing an important basis for further breeding of more tolerant materials. The results indicated that flash waterlogging restricts plant growth but that growth is restored after removal of the stress. Correlation analysis between the waterlogging tolerance coefficient (WTC) of yield and other traits revealed that there was consistency in waterlogging tolerance of the genotypes until maturity, and good tolerance at the seedling establishment stage and seedling stage can guarantee tolerance in later stages. Waterlogging-tolerant plants could be selected using some specific traits at any stage, and selection would be most effective at the seedling establishment stage. Thus, our study provides a method for screening waterlogging tolerance, which would give a suitable basis for initial selection among a large number of germplasm or breeding populations for waterlogging tolerance and help verify their potential utility in crop improvement.

  16. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering and advanced signal processing techniques and scientific evaluation methodologies being used in this multidisciplinary field are part of this exposition. The issues of modeling of target signatures in various spectral modalities, LADAR, IR, SAR, high resolution radar, acoustic, seismic, visible, hyperspectral, in diverse geometric aspects are addressed. The methods for signal processing and classification cover concepts such as sensor adaptive and artificial neural networks, time reversal filt...

  17. Video-Based Electroshocking Platform to Identify Lamprey Ammocoete Habitats: Field Validation and New Discoveries in the Columbia River Basin

    Energy Technology Data Exchange (ETDEWEB)

    Arntzen, Evan V.; Mueller, Robert P.

    2017-05-04

    A deep water electroshocking platform (DEP), developed to characterize larval lampreys (ammocoetes) and associated habitat in depths up to 15 m, was recently tested in the field. The DEP samples 0.55 m²·min⁻¹ without requiring ammocoete transport to the surface. Searches were conducted at a known rearing location (mouth of the Wind River, WA) and at locations on the Cowlitz River, WA, where ammocoetes had not previously been found. At the mouth of the Wind River, video-imaged ammocoetes ranged from 50 to 150 mm in water depths between 1.5 m and 4.5 m and were more common in sediments containing organic silt. Ammocoetes (n=137) were detected at 61% of locations sampled (summer) and 50% of the locations sampled (winter). Following the field verification, the DEP was used on the lower 11.7 km of the Cowlitz River, WA. Ammocoetes (n=41) were found with a detection rate of 26% at specific search locations. Cowlitz River sediment containing ammocoetes was also dominated by silt with organic material, often downstream of alluvial bars in water depths from 0.8 to 1.7 m. Test results indicated a high sampling efficiency, favorable detection rates, and little or no impact to ammocoetes and their surrounding benthic environments.

  18. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  19. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  20. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
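
The chain-rule propagation the abstract refers to can be illustrated with forward-mode automatic differentiation over dual numbers; this is a generic sketch, not code from the bibliography:

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0; b carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        # Product rule: (uv)' = u'v + uv'
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

x = Dual(4.0, 1.0)                 # seed the derivative of x as 1
y = f(x)
print(y.val, y.dot)                # → 57.0 26.0
```

The derivative 26.0 is exact to machine precision, with no truncation error, which is the distinction from numerical differencing that the abstract draws.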

  1. Automatic device for compensating the earth's magnetic field around a β spectrometer; Ensemble automatique de compensation du champ terrestre autour d'un spectrometre

    Energy Technology Data Exchange (ETDEWEB)

    Ristori, Ch [Commissariat a l' Energie Atomique, Grenoble (France). Centre d' Etudes Nucleaires

    1967-02-15

    This paper shows how the earth magnetic field inside a double-focusing, π√2, iron-free beta-ray spectrometer (radius 50 cm) has been compensated. Three orthogonal magnetic fields are generated by three square coil sets. Each stabilized power supply is regulated through its own magnetometer (of the fluxgate type) and the earth field inside the spectrometer is compensated to 10⁻⁴ Oe whatever the earth field or power supply oscillations may be. (author) [French] Cette etude a pour but de compenser le champ magnetique terrestre autour d'un spectrometre beta π√2 a double focalisation, a bobines sans fer et de rayon moyen des trajectoires de 50 cm. Le champ magnetique terrestre est compense par superposition de trois champs orthogonaux, chacun cree par un ensemble de cadres carres. Chacune de ces composantes est mesuree par un magnetometre. Cet ensemble permet de travailler en regulation de courant ou en regulation de champ. En regulation de courant, l'operation est manuelle. En regulation de champ, pour chaque groupe de cadres, l'alimentation stabilisee est asservie par son magnetometre et malgre les variations du champ terrestre ou de la tension secteur, la compensation du champ terrestre se fait toujours correctement au niveau du spectrometre, a 10⁻⁴ Oe pres. (auteur)
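
The regulation principle described (each magnetometer steering its coil power supply until the residual field at the spectrometer is nulled) can be sketched as a discrete closed loop for one field component; every number below is hypothetical, chosen only to mirror the 10⁻⁴ Oe tolerance:

```python
# Toy closed-loop nulling of one field component (all values hypothetical).
# The coil field is proportional to the supply current; the "magnetometer"
# reads the sum of the earth field and the coil field, and the supply
# current is trimmed until the residual is within tolerance.
earth_field = 0.5        # Oe, component to cancel
coil_gain = 0.1          # Oe per ampere of coil current
current = 0.0            # A, coil supply current
loop_gain = 5.0          # A per Oe of residual

for step in range(100):
    residual = earth_field + coil_gain * current  # magnetometer reading
    if abs(residual) < 1e-4:                      # within 10^-4 Oe spec
        break
    current -= loop_gain * residual               # trim the supply

print(abs(residual) < 1e-4)  # → True
```

With these gains each pass halves the residual, so the loop settles within the tolerance in about a dozen iterations; the real apparatus does the same continuously in analog hardware, one loop per orthogonal coil set.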

  2. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  3. Development of labor saving operation technique by making large scale paddy field and direct seeding cultivation of rice in Tohoku district [Japan], 2: Development of technique for automatic precision laser-levelling system

    International Nuclear Information System (INIS)

    Kimura, S.; Imazono, S.; Yaji, Y.

    1999-01-01

    1) Preparation of large paddy fields and utilization of direct rice seeding cultivation are expected to be the key technologies for low-cost, labor-saving, large-scale rice cultivation. To achieve this, a land leveling technique for field operations has to be developed. A precise land leveling operation by a wheel tractor with a laser-beam emitter and receiver in a wet paddy field was developed. 2) The automatic measurement system for the surface level of a rice paddy field by tractor that we developed was highly practical. After the measured data are stored in the memory of a hand-held computer, the standard deviation of the field height values is shown on the display. The measured data are also exported to a personal computer via RS232C, and a contour map of the paddy field is drawn quickly, which is useful for the land leveling work. 3) Considering the relation between rice seed germination and water depth in the field, the precision of the field leveling for direct rice seeding is required to be under 1.5 cm of standard deviation (s.d.). To realize this precision, a prototype leveling apparatus consisting of a laser emitter, a laser receiver and a dry-land leveler pulled by a tractor was developed and its performance was tested. The results of a land leveling test in a field of 1 ha indicated that an elevation difference across the field of 16 cm was improved such that 92% of the field was within ±2.5 cm (1.58 cm s.d.) after the leveling work. The working efficiency was 0.57 hour/10 a. For precise leveling work, the soil water content should be below the plastic limit, under which less soil adheres to the blade of the leveler. The performance tests of the laser-assisted leveling apparatus for paddy harrowing work revealed that for accurate operation only the blade should be controlled by a hydraulic cylinder according to the laser beam. Since large amounts of soil cannot be handled by the apparatus, leveling during paddy harrowing work is recommended only for fine leveling

  4. Two-processor automatized system to control fast measurements of the magnetic field index of the JINR 10 GeV proton synchrotron

    International Nuclear Information System (INIS)

    Chernykh, E.V.

    1981-01-01

    A two-processor system comprising a hard-wired module and an ES-1010 computer to control measurements of the magnetic field index of the JINR 10 GeV proton synchrotron is described. The system comprises the control module, a computer interface and a parallel branch driver residing in the CAMAC system crate. The control module controls the analogue multiplexer and the analogue-to-digital converter through their front panels and writes the information into a buffer memory module through the CAMAC highway. The computer controls the system, reads the information into core memory, writes it onto magnetic tape, processes it and outputs n=f(r) plots on a TV monitor and printer. The system provides measurement of up to 100 points during a magnetic field rise, with a minimum measurement time of 50 μs [ru

  5. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  6. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book describes automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, the electrometer, linearization of systems, the state space, state-space analysis of electric systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and control system design.

  7. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e., automatically controlling the virtual...

  8. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are found automatically. The method is illustrated by simple examples. Source code in FORTRAN is provided

  9. A mobile automatic gas chromatograph system to measure CO2, CH4 and N2O fluxes from soil in the field

    International Nuclear Information System (INIS)

    Silvola, J.; Martikainen, P.; Nykaenen, H.

    1992-01-01

    A caravan has been converted into a mobile laboratory for measuring fluxes of CO2, CH4 and N2O from the soil in the field. The caravan was equipped with a gas chromatograph fitted with TC-, FI- and EC-detectors, and a PC-controlled data logger. The gas collecting chambers can be used up to 50 m from the caravan. The closing and opening of the chambers, as well as the flows of sample gases from the chambers to the gas chromatograph, are pneumatically regulated. Simultaneous recordings of temperature, light intensity and the depth of the water table are made. The system was used for two months in 1992, and some preliminary results are presented

  10. An automatic taxonomy of galaxy morphology using unsupervised machine learning

    Science.gov (United States)

    Hocking, Alex; Geach, James E.; Sun, Yi; Davey, Neil

    2018-01-01

    We present an unsupervised machine learning technique that automatically segments and labels galaxies in astronomical imaging surveys using only pixel data. Distinct from previous unsupervised machine learning approaches used in astronomy, we use no pre-selection or pre-filtering of target galaxy type to identify galaxies that are similar. We demonstrate the technique on the Hubble Space Telescope (HST) Frontier Fields. By training the algorithm using galaxies from one field (Abell 2744) and applying the result to another (MACS 0416.1-2403), we show how the algorithm can cleanly separate early and late type galaxies without any form of pre-directed training for what an 'early' or 'late' type galaxy is. We then apply the technique to the HST Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS) fields, creating a catalogue of approximately 60 000 classifications. We show how the automatic classification groups galaxies of similar morphological (and photometric) type and make the classifications public via a catalogue, a visual catalogue and galaxy similarity search. We compare the CANDELS machine-based classifications to human classifications from the Galaxy Zoo: CANDELS project. Although there is not a direct mapping between Galaxy Zoo and our hierarchical labelling, we demonstrate a good level of concordance between human and machine classifications. Finally, we show how the technique can be used to identify rarer objects and present lensed galaxy candidates from the CANDELS imaging.
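
As a toy illustration of the unsupervised grouping idea (clustering feature vectors with no labels and no pre-directed training), here is a minimal k-means sketch; the two-feature data are hypothetical stand-ins for per-galaxy features, and k-means is a deliberately simpler algorithm than the one the paper uses:

```python
def kmeans(points, k, iters=20):
    # Evenly spaced initial centres, deterministic for this sketch
    step = len(points) // k
    centers = [list(points[i * step]) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest centre (squared Euclidean)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Move each centre to the mean of its cluster
        centers = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Hypothetical 2-feature vectors (e.g. a concentration index and a colour)
# for two morphological groups; clustering recovers the split without labels.
early = [[0.90 + 0.01 * i, 0.8] for i in range(10)]
late = [[0.10 + 0.01 * i, 0.2] for i in range(10)]
centers, clusters = kmeans(early + late, k=2)
print(sorted(len(c) for c in clusters))  # → [10, 10]
```

The point of the sketch is the workflow the abstract describes: similar objects fall into the same group purely from their measured features, and any 'early'/'late' naming is attached to the groups only afterwards.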

  11. Automatic apparatus and data transmission for field response tests of the ground; Automatisation et teletransmission des donnees pour les tests de reponse du terrain

    Energy Technology Data Exchange (ETDEWEB)

    Laloui, L.; Steinmann, G.

    2004-07-01

    This is the report on the third part of a development started in 1998 at the Swiss Federal Institute of Technology Lausanne (EPFL) in Lausanne, Switzerland. Energy piles are becoming increasingly used as heat exchangers and heat storage devices, as are geothermal probes. Their design and sizing are subject to some uncertainty because the planner has to estimate the thermal and mechanical properties of the ground surrounding the piles or probes. The aim of the project was to develop an apparatus for field measurements of the thermal and mechanical properties of an energy pile or a geothermal probe (thermal response tests). In the reported third phase of the project, the portable apparatus was equipped with a data transmission device using the Internet. Real-time data acquisition and supervision are now implemented and data processing has been improved. Another goal of the project was to obtain official accreditation of such response tests according to the European standard EN 45000. First operating experience from a test in Lyon, France is reported.

  12. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, 'automaticity' refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, which is due to both a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: (a) the functional significance of automaticity; (b) the neurophysiology of automaticity; (c) measurement of automaticity; (d) mechanistic factors that compromise automaticity; and (e) strategies for rehabilitation of automaticity.

  13. Automatic Estimation of Volumetric Breast Density Using Artificial Neural Network-Based Calibration of Full-Field Digital Mammography: Feasibility on Japanese Women With and Without Breast Cancer.

    Science.gov (United States)

    Wang, Jeff; Kato, Fumi; Yamashita, Hiroko; Baba, Motoi; Cui, Yi; Li, Ruijiang; Oyama-Manabe, Noriko; Shirato, Hiroki

    2017-04-01

    Breast cancer is the most common invasive cancer among women and its incidence is increasing. Risk assessment is valuable, and recent methods are incorporating novel biomarkers such as mammographic density. Artificial neural networks (ANN) are adaptive algorithms capable of performing pattern-to-pattern learning and are well suited for medical applications. They are potentially useful for calibrating full-field digital mammography (FFDM) for quantitative analysis. This study uses ANN modeling to estimate volumetric breast density (VBD) from FFDM on Japanese women with and without breast cancer. ANN calibration of VBD was performed using phantom data for one FFDM system. Mammograms of 46 Japanese women diagnosed with invasive carcinoma and 53 with negative findings were analyzed using the ANN models learned. ANN-estimated VBD was validated against phantom data; compared intra-patient with qualitative composition scoring and with MRI VBD; and compared inter-patient with classical risk factors of breast cancer as well as cancer status. Phantom validations reached an R² of 0.993. Intra-patient validations ranged from an R² of 0.789 with VBD to 0.908 with breast volume. ANN VBD agreed well with BI-RADS scoring and MRI VBD, with R² ranging from 0.665 with VBD to 0.852 with breast volume. VBD was significantly higher in women with cancer. Associations with age, BMI, menopause, and cancer status previously reported were also confirmed. ANN modeling appears to produce reasonable measures of mammographic density validated with phantoms, with existing measures of breast density, and with classical biomarkers of breast cancer. FFDM VBD is significantly higher in Japanese women with cancer.

  14. Thai Automatic Speech Recognition

    National Research Council Canada - National Science Library

    Suebvisai, Sinaporn; Charoenpornsawat, Paisarn; Black, Alan; Woszczyna, Monika; Schultz, Tanja

    2005-01-01

    .... We focus on the discussion of the rapid deployment of ASR for Thai under limited time and data resources, including rapid data collection issues, acoustic model bootstrap, and automatic generation of pronunciations...

  15. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  16. Automatic Test Systems Aquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  17. Automatic alignment of radionuclide images

    International Nuclear Information System (INIS)

    Barber, D.C.

    1982-01-01

    The variability of the position, dimensions and orientation of a radionuclide image within the field of view of a gamma camera hampers attempts to analyse the image numerically. This paper describes a method of using a set of training images of a particular type, in this case right lateral brain images, to define the likely variations in the position, dimensions and orientation for that type of image and to provide alignment data for a program that automatically aligns new images of the specified type to a standard position, size and orientation. Examples are given of the use of this method on three types of radionuclide image. (author)

  18. Consciousness wanted, attention found: Reasons for the advantage of the left visual field in identifying T2 among rapidly presented series.

    Science.gov (United States)

    Verleger, Rolf; Śmigasiewicz, Kamila

    2015-09-01

    Everyday experience suggests that people are equally aware of events in both hemi-fields. However, when two streams of stimuli are rapidly presented left and right containing two targets, the second target is better identified in the left than in the right visual field. This might be considered evidence for a right-hemisphere advantage in generating conscious percepts. However, this putative asymmetry of conscious perception cannot be measured independently of participants' access to their conscious percepts, and there is actually evidence from split-brain patients for the reverse, left-hemisphere advantage in having access to conscious percepts. Several other topics were studied in search of the responsible mechanism, among others: Mutual inhibition of hemispheres, cooperation of hemispheres in perceiving midline stimuli, and asymmetries in processing various perceptual inputs. Directing attention by salient cues turned out to be one of the few mechanisms capable of modifying the left visual-field advantage in this paradigm. Thus, this left visual-field advantage is best explained by the notion of a right-hemisphere advantage in directing attention to salient events. Dovetailing with the pathological asymmetries of attention after right-hemisphere lesions and with asymmetries of brain activation when healthy participants shift their attention, the present results extend that body of evidence by demonstrating unusually large and reliable behavioral asymmetries for attention-directing processes in healthy participants. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of gz = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
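
The effect size reported (gz) is a standardized mean difference for within-subject designs: the mean of each participant's compatibility difference divided by the standard deviation of those differences, scaled by Hedges' small-sample correction J ≈ 1 − 3/(4·df − 1). A sketch with made-up per-participant reaction-time differences:

```python
import math

def hedges_gz(diffs):
    """Standardized mean difference for paired data (Cohen's dz) with
    Hedges' small-sample correction J = 1 - 3/(4*df - 1), df = n - 1."""
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    dz = mean / sd
    j = 1 - 3 / (4 * (n - 1) - 1)
    return j * dz

# Hypothetical per-participant RT differences (incompatible - compatible), ms
print(round(hedges_gz([30, 25, 40, 35, 28, 33, 38, 27]), 2))  # → 5.27
```

In a meta-analysis, one such gz per experiment is then pooled across the 226 experiments to give the overall estimate and its confidence interval.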

  20. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  1. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book covers: methods of position determination and their characteristics; control methods for position determination and points of design; sensor choice for position detectors; position determination in digital control systems; application of clutch brakes in high-frequency position determination; automation techniques for position determination; position determination by electromagnetic clutches and brakes, air cylinders, cams and solenoids; stop-position control of automatic guided vehicles; and stacker cranes and automatic transfer control.

  2. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving—namely, make the driver feel like they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but might reduce the workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  3. 2nd International Conference on Mechatronics and Automatic Control

    CERN Document Server

    2015-01-01

    This book examines mechatronics and automatic control systems. The book covers important emerging topics in signal processing, control theory, sensors, mechanic manufacturing systems and automation. The book presents papers from the second International Conference on Mechatronics and Automatic Control Systems held in Beijing, China on September 20-21, 2014. It examines how to improve productivity through the latest advanced technologies, covering new systems and techniques in the broad field of mechatronics and automatic control systems.

  4. Further characterization of field strains of rotavirus from Nigeria VP4 genotype P6 most frequently identified among symptomatically infected children.

    Science.gov (United States)

    Adah, M I; Rohwedder, A; Olaleye, O D; Durojaiye, O A; Werchau, H

    1997-10-01

    Polymerase chain reaction was utilized to characterize the VP4 types of 39 Rotavirus field isolates from symptomatically infected children in Nigeria. Genotype P6 was identified most frequently, occurring in 41.03 per cent of the typed specimens. Genotype P8 was identified as the next most prevalent (33.3 per cent). Genotype P6 was widespread (68.75 per cent) among infected neonates in Southern Nigeria, but mixed infection was more prevalent (70 per cent) among Northern Nigerian children. Four distinct strains were identified with four different P genotypes. Overall, strain G1P8 predominated (22.22 per cent), followed by G3P6 (17.8 per cent). Strain G1P8 was most prevalent (70 per cent) among infants aged 3.1-9 months, but strain G3P6 was most frequently identified among neonates. The occurrence of mixed-infection genotypes demonstrates the potential for reassortment events among different rotavirus genogroups in Nigeria. The epidemiological implications of these findings for rotavirus vaccine development and application in the country were discussed.

  5. A new type industrial total station based on target automatic collimation

    Science.gov (United States)

    Lao, Dabao; Zhou, Weihu; Ji, Rongyi; Dong, Dengfeng; Xiong, Zhi; Wei, Jiang

    2018-01-01

    In industrial field measurement, present measuring instruments rely on manual operation and collimation, which results in low efficiency. In order to solve this problem, a new type of industrial total station is presented in this paper. The new instrument can identify and track a cooperative target automatically while measuring the coordinates of the target in real time. Realizing the system required working out several key technologies, including high-precision absolute distance measurement, small high-accuracy angle measurement, vision-based automatic target collimation, and quick precise control. After customized system assembly and adjustment, the new industrial total station was established. As the experiments demonstrated, the coordinate accuracy of the instrument is within 15 ppm at a distance of 60 m, which proves that the measuring system is feasible. The results showed that the total station can satisfy most industrial field measurement requirements.

  6. Automatic measurement for solid state track detectors

    International Nuclear Information System (INIS)

    Ogura, Koichi

    1982-01-01

    Since tracks in solid state track detectors are measured with a microscope, observers are forced to do tedious work that consumes time and labour, which leads to poor statistical accuracy or personal error. Therefore, much research has aimed at simplifying and automating track measurement. There are two categories of automated measurement: simple counting of the number of tracks, and measurements that also require geometrical elements such as the size of tracks or their coordinates. The former is called automatic counting and the latter automatic analysis. The general approach in automatic counting is to estimate the total number of tracks over the total detector area or in a field of view of a microscope; it is suitable when the track density is high. Methods that count tracks one by one include spark counting and the scanning microdensitometer. Automatic analysis includes video image analysis, in which high-quality images obtained with a high-resolution video camera are processed with a micro-computer, and the tracks are automatically recognized and measured by feature extraction. This method is described in detail. Among the many kinds of automatic measurement reported so far, the most frequently used are ''spark counting'' and ''video image analysis''. (Wakatsuki, Y.)

  7. A web based semi automatic frame work for astrobiological researches

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Full Text Available Astrobiology addresses the possibility of extraterrestrial life and explores measures towards its recognition. Research in this context is founded upon the premise that indicators of life encountered in space will be recognizable. However, effective recognition can be accomplished through a universal adaptation of life signatures, without restricting attention solely to those attributes that represent local solutions to the challenges of survival. Life indicators should be modelled with reference to the temporal and environmental variations specific to each planet and time. In this paper, we investigate a semi-automatic open-source framework for the accurate detection and interpretation of life signatures by facilitating public participation, in a manner similar to the SETI@home project. Involving the public in identifying patterns can give the mission a thrust, and is implemented using the semi-automatic framework. Advanced intelligent methodologies may augment the integration of this human-machine analysis. Automatic and manual evaluations along with a dynamic learning strategy have been adopted to provide accurate results. The system also helps to provide a deep public understanding of the space agency’s work and facilitates mass involvement in astrobiological studies. It will surely help to motivate young, eager minds to pursue a career in this field.

  8. fields

    Directory of Open Access Journals (Sweden)

    Brad J. Arnold

    2014-07-01

    Full Text Available Surface irrigation, such as flood or furrow, is the predominant form of irrigation in California for agronomic crops. Compared to other irrigation methods, however, it is inefficient in terms of water use; large quantities of water, instead of being used for crop production, are lost to excess deep percolation and tail runoff. In surface-irrigated fields, irrigators commonly cut off the inflow of water when the water advance reaches a familiar or convenient location downfield, but this experience-based strategy has not been very successful in reducing the tail runoff water. Our study compared conventional cutoff practices to a retroactively applied model-based cutoff method in four commercially producing alfalfa fields in Northern California, and evaluated the model using a simple sensor system for practical application in typical alfalfa fields. These field tests illustrated that the model can be used to reduce tail runoff in typical surface-irrigated fields, and using it with a wireless sensor system saves time and labor as well as water.

  9. Combining field performance with controlled environment plant imaging to identify the genetic control of growth and transpiration underlying yield response to water-deficit stress in wheat.

    Science.gov (United States)

    Parent, Boris; Shahinnia, Fahimeh; Maphosa, Lance; Berger, Bettina; Rabie, Huwaida; Chalmers, Ken; Kovalchuk, Alex; Langridge, Peter; Fleury, Delphine

    2015-09-01

    Crop yield in low-rainfall environments is a complex trait under multigenic control that shows significant genotype×environment (G×E) interaction. One way to understand and track this trait is to link physiological studies to genetics by using imaging platforms to phenotype large segregating populations. A wheat population developed from parental lines contrasting in their mechanisms of yield maintenance under water deficit was studied in both an imaging platform and in the field. We combined phenotyping methods in a common analysis pipeline to estimate biomass and leaf area from images and then inferred growth and relative growth rate, transpiration, and water-use efficiency, and applied these to genetic analysis. From the 20 quantitative trait loci (QTLs) found for several traits in the platform, some showed strong effects, accounting for between 26 and 43% of the variation on chromosomes 1A and 1B, indicating that the G×E interaction could be reduced in a controlled environment and by using dynamic variables. Co-location of QTLs identified in the platform and in the field showed a possible common genetic basis at some loci. Co-located QTLs were found for average growth rate, leaf expansion rate, transpiration rate, and water-use efficiency from the platform with yield, spike number, grain weight, grain number, and harvest index in the field. These results demonstrated that imaging platforms are a suitable alternative to field-based screening and may be used to phenotype recombinant lines for positional cloning. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  10. Very Portable Remote Automatic Weather Stations

    Science.gov (United States)

    John R. Warren

    1987-01-01

    Remote Automatic Weather Stations (RAWS) were introduced to Forest Service and Bureau of Land Management field units in 1978 following development, test, and evaluation activities conducted jointly by the two agencies. The original configuration was designed for semi-permanent installation. Subsequently, a need for a more portable RAWS was expressed, and one was...

  11. Distributed automatic control of technological processes in conditions of weightlessness

    Science.gov (United States)

    Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.

    1986-01-01

    Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.

  12. Automatic sign language recognition inspired by human sign perception

    NARCIS (Netherlands)

    Ten Holt, G.A.

    2010-01-01

    Automatic sign language recognition is a relatively new field of research (since ca. 1990). Its objectives are to automatically analyze sign language utterances. There are several issues within the research area that merit investigation: how to capture the utterances (cameras, magnetic sensors,

  13. A model based method for automatic facial expression recognition

    NARCIS (Netherlands)

    Kuilenburg, H. van; Wiering, M.A.; Uyl, M. den

    2006-01-01

    Automatic facial expression recognition is a research topic with interesting applications in the field of human-computer interaction, psychology and product marketing. The classification accuracy for an automatic system which uses static images as input is however largely limited by the image

  14. Application of the RES methodology for identifying features, events and processes (FEPs) for near-field analysis of copper-steel canister

    International Nuclear Information System (INIS)

    Vieno, T.; Hautojaervi, A.; Raiko, H.; Ahonen, L.; Salo, J.P.

    1994-12-01

    Rock Engineering Systems (RES) is an approach to discover the important characteristics and interactions of a complex problem. Recently RES has been applied to identify features, events and processes (FEPs) for performance analysis of nuclear waste repositories. The RES methodology was applied to identify FEPs for the near-field analysis of the copper-steel canister for spent fuel disposal. The aims of the exercise were to learn and test the RES methodology and, secondly, to find out how much the results differ when RES is applied by two different groups on the same problem. A similar exercise was previously carried out by a SKB group. A total of 90 potentially significant FEPs were identified. The exercise showed that the RES methodology is a practicable tool to get a comprehensive and transparent picture of a complex problem. The approach is easy to learn and use. It reveals the important characteristics and interactions and organizes them in a format easy to understand. (9 refs., 5 figs., 3 tabs.)

  15. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  16. The automatic component of habit in health behavior: habit as cue-contingent automaticity.

    Science.gov (United States)

    Orbell, Sheina; Verplanken, Bas

    2010-07-01

    Habit might be usefully characterized as a form of automaticity that involves the association of a cue and a response. Three studies examined habitual automaticity in regard to different aspects of the cue-response relationship characteristic of unhealthy and healthy habits. In each study, habitual automaticity was assessed by the Self-Report Habit Index (SRHI). In Study 1 SRHI scores correlated with attentional bias to smoking cues in a Stroop task. Study 2 examined the ability of a habit cue to elicit an unwanted habit response. In a prospective field study, habitual automaticity in relation to smoking when drinking alcohol in a licensed public house (pub) predicted the likelihood of cigarette-related action slips 2 months later after smoking in pubs had become illegal. In Study 3 experimental group participants formed an implementation intention to floss in response to a specified situational cue. Habitual automaticity of dental flossing was rapidly enhanced compared to controls. The studies provided three different demonstrations of the importance of cues in the automatic operation of habits. Habitual automaticity assessed by the SRHI captured aspects of a habit that go beyond mere frequency or consistency of the behavior. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  17. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). We present a recent state of the art. The book shows the main problems of ADS, the difficulties, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been develop...

  18. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  19. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may be extended also by the calculation of saturation activities from k0 and Q0 factors, and f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)

  20. Natural or Induced: Identifying Natural and Induced Swarms from Pre-production and Co-production Microseismic Catalogs at the Coso Geothermal Field

    Science.gov (United States)

    Schoenball, Martin; Kaven, Joern; Glen, Jonathan M. G.; Davatzes, Nicholas C.

    2015-01-01

    Increased levels of seismicity coinciding with injection of reservoir fluids have prompted interest in methods to distinguish induced from natural seismicity. Discrimination between induced and natural seismicity is especially difficult in areas that have high levels of natural seismicity, such as the geothermal fields at the Salton Sea and Coso, both in California. Both areas show swarm-like sequences that could be related to natural, deep fluid migration as part of the natural hydrothermal system. Therefore, swarms often have spatio-temporal patterns that resemble fluid-induced seismicity, and might possibly share other characteristics. The Coso Geothermal Field and its surroundings are one of the most seismically active areas in California, with a large proportion of the activity occurring as seismic swarms. Here we analyze clustered seismicity in and surrounding the currently produced reservoir comparatively for pre-production and co-production periods. We perform a cluster analysis, based on the inter-event distance in a space-time-energy domain, to identify notable earthquake sequences. For each event j, the closest previous event i is identified and their relationship categorized. If this nearest neighbor's distance is below a threshold based on the local minimum of the bimodal distribution of nearest neighbor distances, then the event j is included in the cluster as a child to this parent event i. If it is above the threshold, event j begins a new cluster. This process identifies subsets of events whose nearest neighbor distances and relative timing qualify as a cluster, as well as characterizing the parent-child relationships among events in the cluster. We apply this method to three different catalogs: (1) a two-year microseismic survey of the Coso geothermal area that was acquired before exploration drilling in the area began; (2) the HYS_catalog_2013 that contains 52,000 double-difference relocated events and covers the years 1981 to 2013; and (3) a
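The nearest-neighbor clustering step described above can be sketched in a few lines. This is an illustrative implementation of the general space-time-energy nearest-neighbor approach (in the style of Zaliapin-type declustering); the distance exponents, parameter names and default values are assumptions for demonstration, not the values used in the study.

```python
import numpy as np

def nearest_neighbor_links(t, x, y, mag, d_f=1.6, b=1.0, w=0.5):
    """For each event j, find the closest previous event i under a
    space-time distance rescaled by the parent's magnitude (energy).
    Returns parent indices and nearest-neighbor distances eta."""
    n = len(t)
    parent = np.full(n, -1)
    eta = np.full(n, np.inf)
    for j in range(1, n):
        dt = t[j] - t[:j]                          # time to earlier events
        r = np.hypot(x[j] - x[:j], y[j] - y[:j])   # epicentral distance
        # space-time distance, downweighted by parent magnitude
        d = dt**w * r**(d_f * (1 - w)) * 10 ** (-b * mag[:j])
        i = int(np.argmin(d))
        parent[j], eta[j] = i, d[i]
    return parent, eta

def build_clusters(parent, eta, threshold):
    """Events whose nearest-neighbor distance falls below the threshold
    (e.g. the local minimum of a bimodal eta distribution) join their
    parent's cluster; otherwise they start a new cluster."""
    n = len(parent)
    cluster = np.arange(n)
    for j in range(1, n):
        if eta[j] < threshold:
            cluster[j] = cluster[parent[j]]        # parents precede children
    return cluster
```

In practice the threshold would be picked from the observed bimodal distribution of eta rather than fixed in advance.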

  1. Cliff: the automatized zipper

    NARCIS (Netherlands)

    Baharom, M.Z.; Toeters, M.J.; Delbressine, F.L.M.; Bangaru, C.; Feijs, L.M.G.

    2016-01-01

    It is our strong belief that fashion - more specifically apparel - can support us so much more in our daily life than it currently does. The Cliff project takes the opportunity to create a generic automatized zipper. It is a response to the struggles of the elderly, people with physical disabilities, and

  2. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...
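As a toy illustration of what a time bound function is (a textbook example, not the derivation system the abstract describes), one can instrument a program to count its dominant operation and compare the count against a derived worst-case bound:

```python
def insertion_sort_steps(a):
    """Insertion sort instrumented to count comparisons, so the measured
    cost can be checked against the derived worst-case time bound
    T(n) = n(n-1)/2."""
    a = list(a)
    steps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            steps += 1                          # one comparison
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return a, steps

def time_bound(n):
    """Worst-case comparison count for input size n."""
    return n * (n - 1) // 2
```

A reverse-sorted input attains the bound exactly; any other input stays below it.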

  3. Automatic Oscillating Turret.

    Science.gov (United States)

    1981-03-01

    Final Report, February 1978 - September 1980: Automatic Oscillating Turret System. [Report documentation page and table of contents garbled in extraction; recoverable headings include "Oscillating Bumper Turret" and "Description: Turret Controls".] Other criteria requirements were: 1. Turret controls inside cab. 2. Automatic oscillation with fixed elevation to range from 20° below the horizontal to...

  4. Reactor component automatic grapple

    International Nuclear Information System (INIS)

    Greenaway, P.R.

    1982-01-01

    A grapple for handling nuclear reactor components in a medium such as liquid sodium which, upon proper seating and alignment of the grapple with the component as sensed by a mechanical logic integral to the grapple, automatically seizes the component. The mechanical logic system also precludes seizure in the absence of proper seating and alignment. (author)

  5. Automatic sweep circuit

    International Nuclear Information System (INIS)

    Keefe, D.J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input is described. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found, then a different time window is examined until the signal is found.

  6. Automatic sweep circuit

    Science.gov (United States)

    Keefe, Donald J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.
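The sweeping logic these two records describe - average repeated sweeps for statistical accuracy, then examine successive time windows after the trigger until the response is found - can be sketched in software. The function name and the simple mean-amplitude detection rule are illustrative assumptions, not the patented circuit's actual design:

```python
import numpy as np

def sweep_search(trials, window_len, n_repeats, threshold):
    """Average n_repeats sweeps to suppress uncorrelated noise, then
    scan successive time windows after the trigger and report the first
    window whose mean amplitude exceeds the detection threshold."""
    trials = np.asarray(trials, dtype=float)      # shape: (repeats, samples)
    mean_trace = trials[:n_repeats].mean(axis=0)  # averaged post-trigger trace
    for start in range(0, mean_trace.size - window_len + 1, window_len):
        window = mean_trace[start:start + window_len]
        if window.mean() > threshold:
            return start      # sample index where the evoked response begins
    return None               # response not found in any window
```

Here the "different time window" of the original circuit corresponds to stepping the `start` index until the detector fires.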

  7. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.
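One concrete instance of such a recursive classification algorithm is an online partitioning scheme in which each new observation nudges the nearest class center with a decaying gain - a stochastic-approximation step toward an optimum of a quadratic classification functional. A minimal sketch (illustrative only, not Bauman and Dorofeyuk's exact functional):

```python
import numpy as np

def recursive_classify(stream, k):
    """Online (recursive) partitioning: each observation is assigned to
    the nearest class center, which is then moved toward it with a
    decaying gain 1/n_j (n_j = observations seen by class j)."""
    stream = np.asarray(stream, dtype=float)
    centers = stream[:k].copy()          # seed with the first k points
    counts = np.ones(k)
    labels = list(range(k))              # the seeds label themselves
    for obs in stream[k:]:
        j = int(np.argmin(np.linalg.norm(centers - obs, axis=1)))
        counts[j] += 1
        centers[j] += (obs - centers[j]) / counts[j]   # decaying gain
        labels.append(j)
    return centers, labels
```

The 1/n_j gain makes each center the running mean of its class, which is exactly the recursion that maximizes a within-class quadratic objective.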

  8. Automatic Commercial Permit Sets

    Energy Technology Data Exchange (ETDEWEB)

    Grana, Paul [Folsom Labs, Inc., San Francisco, CA (United States)

    2017-12-21

    Final report for Folsom Labs’ Solar Permit Generator project, which has successfully completed, resulting in the development and commercialization of a software toolkit within the cloud-based HelioScope software environment that enables solar engineers to automatically generate and manage draft documents for permit submission.

  9. Whole genome-wide transcript profiling to identify differentially expressed genes associated with seed field emergence in two soybean low phytate mutants.

    Science.gov (United States)

    Yuan, Fengjie; Yu, Xiaomin; Dong, Dekun; Yang, Qinghua; Fu, Xujun; Zhu, Shenlong; Zhu, Danhua

    2017-01-18

    Seed germination is important to soybean (Glycine max) growth and development, ultimately affecting soybean yield. Lower seed field emergence has been the main hindrance for breeding soybeans low in phytate. Although this reduction could be overcome by additional breeding and selection, the mechanisms of seed germination in different low phytate mutants remain unknown. In this study, we performed a comparative transcript analysis of two low phytate soybean mutants (TW-1 and TW-1-M), which have the same mutation, a 2 bp deletion in GmMIPS1, but show a significant difference in seed field emergence: that of TW-1-M was higher than that of TW-1. Numerous genes analyzed by RNA-Seq showed markedly different expression levels between the TW-1-M and TW-1 mutants. Approximately 30,000-35,000 read-mapped genes and ~21,000-25,000 expressed genes were identified for each library. There were ~3,900-9,200 differentially expressed genes (DEGs) in each contrast library; the number of up-regulated genes was similar to the number of down-regulated genes in the mutants TW-1 and TW-1-M. Gene ontology functional categories of DEGs indicated that the ethylene-mediated signaling pathway, the abscisic acid-mediated signaling pathway, response to hormone, ethylene biosynthetic process, ethylene metabolic process, regulation of hormone levels, oxidation-reduction process, regulation of flavonoid biosynthetic process and regulation of abscisic acid-activated signaling pathway had high correlations with seed germination. In total, 2457 DEGs involved in the above functional categories were identified. Twenty-two genes with 20 biological functions were the most highly up/down-regulated (|log2FC| > 5) in the high field emergence mutant TW-1-M and were related to metabolic or signaling pathways. Fifty-seven genes with 36 biological functions had the greatest expression abundance (FRPM > 100) in germination-related pathways. Seed germination in the soybean low phytate mutants is a very complex process.

  10. Automatic characterization of dynamics in Absence Epilepsy

    DEFF Research Database (Denmark)

    Petersen, Katrine N. H.; Nielsen, Trine N.; Kjær, Troels W.

    2013-01-01

    Dynamics of the spike-wave paroxysms in Childhood Absence Epilepsy (CAE) are automatically characterized using novel approaches. Features are extracted from scalograms formed by Continuous Wavelet Transform (CWT). Detection algorithms are designed to identify an estimate of the temporal development...

  11. A Machine Vision System for Automatically Grading Hardwood Lumber - (Proceedings)

    Science.gov (United States)

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas H. Drayer; Joe G. Tront; Philip A. Araman; Robert L. Brisbon

    1990-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  12. Statistical pattern recognition for automatic writer identification and verification

    NARCIS (Netherlands)

    Bulacu, Marius Lucian

    2007-01-01

    The thesis addresses the problem of automatic person identification using scanned images of handwriting.Identifying the author of a handwritten sample using automatic image-based methods is an interesting pattern recognition problem with direct applicability in the forensic and historic document

  13. Sensitivity-based virtual fields for the non-linear virtual fields method

    Science.gov (United States)

    Marek, Aleksander; Davis, Frances M.; Pierron, Fabrice

    2017-09-01

    The virtual fields method is an approach to inversely identify material parameters using full-field deformation data. In this manuscript, a new set of automatically-defined virtual fields for non-linear constitutive models has been proposed. These new sensitivity-based virtual fields reduce the influence of noise on the parameter identification. The sensitivity-based virtual fields were applied to a numerical example involving small strain plasticity; however, the general formulation derived for these virtual fields is applicable to any non-linear constitutive model. To quantify the improvement offered by these new virtual fields, they were compared with stiffness-based and manually defined virtual fields. The proposed sensitivity-based virtual fields were consistently able to identify plastic model parameters and outperform the stiffness-based and manually defined virtual fields when the data was corrupted by noise.
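To make the idea of the virtual fields method concrete, consider the simplest linear case rather than the paper's non-linear, sensitivity-based formulation: a uniform bar of cross-section A under end load F, with full-field strain data eps(x). Choosing the virtual displacement u*(x) = x (virtual strain eps* = 1), the principle of virtual work, E · A · ∫ eps dx = F · L, yields the modulus directly. This 1D toy is only a sketch of the method's principle; all names are illustrative:

```python
import numpy as np

def identify_modulus(strain, x, force, area):
    """1D virtual fields method: with virtual field u*(x) = x
    (virtual strain = 1), internal virtual work E * A * ∫ strain dx
    balances external virtual work F * L, so E follows directly."""
    L = x[-1] - x[0]
    # trapezoid rule for ∫ strain dx over the measurement points
    integral = np.sum((strain[1:] + strain[:-1]) / 2.0 * np.diff(x))
    return force * L / (area * integral)
```

The paper's contribution is choosing richer virtual fields (automatically, from sensitivities) so that this kind of identification stays stable for non-linear models and noisy full-field data.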

  14. Automatic welding machine for piping

    International Nuclear Information System (INIS)

    Yoshida, Kazuhiro; Koyama, Takaichi; Iizuka, Tomio; Ito, Yoshitoshi; Takami, Katsumi.

    1978-01-01

    A remotely controlled automatic welding machine for piping was developed. From the viewpoints of quality control, labor reduction and controllability, this machine can be used effectively for long-distance pipelines, chemical plants, thermal power generating plants and nuclear power plants. Its functions are: before welding, to inspect the shape and dimensions of the edge preparation by touch; during welding, to detect the temperature of the melt pool, inspect the bead form by touch, and check the welding state by ITV; and after welding, to grind the bead surface and inspect the weld metal automatically by ultrasonic test. The construction of this welding system, the main specifications of the apparatus, the welding procedure in detail, the electrical source of this welding machine, the cooling system, the structure and handling of the guide ring, the central control system and the operating characteristics are explained. The working procedure, the benefits of using this welding machine, and its application to nuclear power plants and other industrial fields are outlined. The HIDIC 08 is used as the controlling computer. This welding machine is useful for welding SUS (stainless steel) piping as well as carbon steel piping. (Nakai, Y.)

  15. Application of an automatic cloud tracking technique to Meteosat water vapor and infrared observations

    Science.gov (United States)

    Endlich, R. M.; Wolf, D. E.

    1980-01-01

    The automatic cloud tracking system was applied to METEOSAT 6.7 micrometers water vapor measurements to learn whether the system can track the motions of water vapor patterns. Data for the midlatitudes, subtropics, and tropics were selected from a sequence of METEOSAT pictures for 25 April 1978. Trackable features in the water vapor patterns were identified using a clustering technique and the features were tracked by two different methods. In flat (low contrast) water vapor fields, the automatic motion computations were not reliable, but in areas where the water vapor fields contained small scale structure (such as in the vicinity of active weather phenomena) the computations were successful. Cloud motions were computed using METEOSAT infrared observations (including tropical convective systems and midlatitude jet stream cirrus).

  16. Identifying Possible Pheromones of Cerambycid Beetles by Field Testing Known Pheromone Components in Four Widely Separated Regions of the United States.

    Science.gov (United States)

    Millar, Jocelyn G; Mitchell, Robert F; Mongold-Diers, Judith A; Zou, Yunfan; Bográn, Carlos E; Fierke, Melissa K; Ginzel, Matthew D; Johnson, Crawford W; Meeker, James R; Poland, Therese M; Ragenovich, Iral; Hanks, Lawrence M

    2018-02-09

    The pheromone components of many cerambycid beetles appear to be broadly shared among related species, including species native to different regions of the world. This apparent conservation of pheromone structures within the family suggests that field trials of common pheromone components could be used as a means of attracting multiple species, which then could be targeted for full identification of their pheromones. Here, we describe the results of such field trials that were conducted in nine states in the northeastern, midwestern, southern, and western United States. Traps captured 12,742 cerambycid beetles of 153 species and subspecies. Species attracted in significant numbers to a particular treatment (some in multiple regions) included 19 species in the subfamily Cerambycinae, 15 species in the Lamiinae, one species in the Prioninae, and two species in the Spondylidinae. Pheromones or likely pheromones for many of these species, such as 3-hydroxyhexan-2-one and syn- and anti-2,3-hexanediols for cerambycine species, and fuscumol and/or fuscumol acetate for lamiine species, had already been identified. New information about attractants (in most cases likely pheromone components) was found for five cerambycine species (Ancylocera bicolor [Olivier], Elaphidion mucronatum [Say], Knulliana cincta cincta [Drury], Phymatodes aeneus LeConte, and Rusticoclytus annosus emotus [Brown]), and five lamiine species (Ecyrus dasycerus dasycerus [Say], Lepturges symmetricus [Haldeman], Sternidius misellus [LeConte], Styloleptus biustus biustus [LeConte], and Urgleptes signatus [LeConte]). Consistent attraction of some species to the same compounds in independent bioassays demonstrated the utility and reliability of pheromone-based methods for sampling cerambycid populations across broad spatial scales.

  17. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet staff from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and of automatic classification are examined.

  18. Discriminative Chemical Patterns: Automatic and Interactive Design.

    Science.gov (United States)

    Bietz, Stefan; Schomburg, Karen T; Hilbig, Matthias; Rarey, Matthias

    2015-08-24

    The classification of molecules with respect to their inhibiting, activating, or toxicological potential constitutes a central aspect in the field of cheminformatics. Often, a discriminative feature is needed to distinguish two different molecule sets. Besides physicochemical properties, substructures and chemical patterns belong to the descriptors most frequently applied for this purpose. As a commonly used example of this descriptor class, SMARTS strings represent a powerful concept for the representation and processing of abstract chemical patterns. While their usage facilitates a convenient way to apply previously derived classification rules on new molecule sets, the manual generation of useful SMARTS patterns remains a complex and time-consuming process. Here, we introduce SMARTSminer, a new algorithm for the automatic derivation of discriminative SMARTS patterns from preclassified molecule sets. Based on a specially adapted subgraph mining algorithm, SMARTSminer identifies structural features that are frequent in only one of the given molecule classes. In comparison to elemental substructures, it also supports the consideration of general and specific SMARTS features. Furthermore, SMARTSminer is integrated into an interactive pattern editor named SMARTSeditor. This allows for an intuitive visualization on the basis of the SMARTSviewer concept as well as interactive adaption and further improvement of the generated patterns. Additionally, a new molecular matching feature provides an immediate feedback on a pattern's matching behavior across the molecule sets. We demonstrate the utility of the SMARTSminer functionality and its integration into the SMARTSeditor software in several different classification scenarios.
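    As a rough illustration of the discriminative-pattern idea behind SMARTSminer, the sketch below mines string fragments of SMILES codes that occur in every molecule of one class and in none of the other. This is only a toy analog: the actual algorithm performs subgraph mining over molecular graphs and emits SMARTS patterns, and the molecules shown are hypothetical examples, not data from the paper.

```python
from itertools import chain

def discriminative_substrings(class_a, class_b, min_len=2, max_len=4):
    """Toy analog of discriminative-pattern mining: find string fragments
    that occur in every molecule of class A but in no molecule of class B.
    Real SMARTSminer mines subgraphs, not substrings; this is only a sketch."""
    def fragments(smiles):
        return {smiles[i:i + n]
                for n in range(min_len, max_len + 1)
                for i in range(len(smiles) - n + 1)}
    # Fragments shared by every member of class A ...
    common_a = set.intersection(*(fragments(s) for s in class_a))
    # ... minus any fragment seen anywhere in class B.
    seen_b = set(chain.from_iterable(fragments(s) for s in class_b))
    return common_a - seen_b

# Hypothetical example: carboxylic acids vs. plain alkanes
acids = ["CCC(=O)O", "CC(=O)O"]
alkanes = ["CCCC", "CCC"]
print(sorted(discriminative_substrings(acids, alkanes)))
```

    The discriminating fragments it finds (such as "C(=O") correspond to the carboxyl group absent from the alkanes, mirroring how a mined SMARTS pattern would separate the two classes.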

  19. Automatic sentence extraction for the detection of scientific paper relations

    Science.gov (United States)

    Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.

    2018-03-01

    The relations between scientific papers are very useful for researchers to see the interconnections between scientific papers quickly. By observing inter-article relationships, researchers can identify, among other things, the weaknesses of existing research, the performance improvements achieved to date, and the tools or data typically used in research in specific fields. So far, the methods that have been developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, while the paper relations are identified based on the citation sentences. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
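    The citation-sentence idea described above can be sketched with a simple rule-based extractor. The regex below covers two common citation-marker styles; the marker patterns and example text are assumptions for illustration, not the paper's actual implementation.

```python
import re

# Two common citation marker styles: bracketed numbers "[3]" and
# author-year parentheticals "(Smith, 2010)". The markers actually used
# by the paper's system are not specified; this is an illustrative sketch.
CITATION = re.compile(
    r"\[\d+(?:,\s*\d+)*\]|\([A-Z][A-Za-z]+(?: et al\.)?,\s*\d{4}\)")

def citation_sentences(text):
    """Split text into sentences and keep those containing a citation marker."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences if CITATION.search(s)]

doc = ("Prior work achieved 85% accuracy [3]. We improve on it. "
       "A rule-based method was also proposed (Smith, 2010).")
print(citation_sentences(doc))
```

    The extracted sentences can then feed whatever relation-detection step follows, replacing the manual extraction the abstract describes as slow and inefficient.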

  20. Automatic classification of time-variable X-ray sources

    International Nuclear Information System (INIS)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.

    2014-01-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources, and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
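    A minimal sketch of the pipeline described above, using scikit-learn's Random Forest with 10-fold cross-validation. The feature table here is synthetic (generated by make_classification); the real study used 873 sources with features derived from time series, spectra, and multi-wavelength context, so the accuracy below is not comparable to the quoted ∼97%.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the 2XMMi-DR2 feature table: 873 sources,
# 7 classes, a dozen numeric features. Entirely illustrative data.
X, y = make_classification(n_samples=873, n_features=12, n_informative=8,
                           n_classes=7, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
print(f"mean CV accuracy: {scores.mean():.2f}")
```

    The same two-step pattern (cross-validate on labeled sources, then predict class probabilities for unknown sources with `predict_proba`) yields a probabilistically classified catalog of the kind the abstract describes.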

  2. Application of a simplified calculation for full-wave microtremor H/V spectral ratio based on the diffuse field approximation to identify underground velocity structures

    Science.gov (United States)

    Wu, Hao; Masaki, Kazuaki; Irikura, Kojiro; Sánchez-Sesma, Francisco José

    2017-12-01

    Under the diffuse field approximation, the full-wave (FW) microtremor H/V spectral ratio (H/V) is modeled as the square root of the ratio of the sum of the imaginary parts of the Green's function of the horizontal components to that of the vertical one. For a given layered medium, the FW H/V can be well approximated with only the surface-wave (SW) H/V of the "cap-layered" medium, which consists of the given layered medium and a new higher-velocity half-space (cap layer) at large depth. Because the contribution of surface waves can be obtained simply by the residue theorem, the computation of the SW H/V of the cap-layered medium is faster than that of the FW H/V evaluated by the discrete wavenumber method and the contour integration method. The simplified computation of the SW H/V was then applied to identify the underground velocity structures at six KiK-net strong-motion stations. The inverted underground velocity structures were used to evaluate FW H/Vs, which were consistent with the SW H/Vs of the corresponding cap-layered media. A previous study on surface-wave H/Vs, derived under the assumption of distributed surface sources and a fixed Rayleigh-to-Love amplitude ratio for horizontal motions, showed good agreement with the SW H/Vs of our study. The consistency between observed and theoretical spectral ratios, such as the H/V spectral ratio of earthquake motions and the spectral ratio of horizontal motions between the surface and the bottom of a borehole, indicated that the underground velocity structures identified from the SW H/V of the cap-layered medium were well resolved by the new method.
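    The quantity described in the first sentence can be written out explicitly as follows (a sketch following the diffuse-field literature; the notation, with G_ij the Green's function components at the coincident source-receiver point x on the surface, is assumed):

```latex
% Full-wave microtremor H/V under the diffuse field approximation:
% the square root of the ratio of the summed imaginary parts of the
% horizontal Green's function components to the vertical one.
\[
  \frac{H}{V}(\omega) \;=\;
  \sqrt{\frac{\operatorname{Im} G_{11}(\mathbf{x},\mathbf{x};\omega)
            + \operatorname{Im} G_{22}(\mathbf{x},\mathbf{x};\omega)}
             {\operatorname{Im} G_{33}(\mathbf{x},\mathbf{x};\omega)}}
\]
```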

  3. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality in nuclear power equipment, as well as for coping with the working environment at the plant site. As the latest of the automatic welders in practical use for welding nuclear power apparatuses in the factories of Toshiba and IHI, those for pipes and lining tanks are described here. The pipe welder performs buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, and the succeeding butt welding, through the same controller. The lining tank welder is able to perform simultaneous welding of two parallel weld lines on a large thin-plate lining tank. Both types of welders are demonstrating excellent performance in the shops as well as at the plant site. (author)

  4. Automatic structural scene digitalization.

    Science.gov (United States)

    Tang, Rui; Wang, Yuhan; Cosker, Darren; Li, Wenbin

    2017-01-01

    In this paper, we present an automatic system for the analysis and labeling of structural scenes, i.e., floor plan drawings in Computer-Aided Design (CAD) format. The proposed system applies a fusion strategy to detect and recognize various components of CAD floor plans, such as walls, doors, windows and other ambiguous assets. Technically, a general rule-based filter parsing method is first adopted to extract effective information from the original floor plan. Then, an image-processing based recovery method is employed to correct the information extracted in the first step. Our proposed method is fully automatic and runs in real time. The analysis system provides high accuracy and has also been evaluated on a public website that, on average, receives more than ten thousand effective uses per day and reaches a relatively high satisfaction rate.

  5. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains a clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  6. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices.

  7. Automatic LOD selection

    OpenAIRE

    Forsman, Isabelle

    2017-01-01

    In this paper, a method to automatically generate transition distances for LOD, improving image stability and performance, is presented. Three different methods were tested, all measuring the change between two levels of detail using the spatial frequency. The methods were implemented as an optional pre-processing step in order to determine the transition distances from multiple view directions. During run-time both view direction based selection and the furthest distance for each direction was ...

  8. Automatically identifying characteristic features of non-native English accents

    NARCIS (Netherlands)

    Bloem, Jelke; Wieling, Martijn; Nerbonne, John; Côté, Marie-Hélène; Knooihuizen, Remco; Nerbonne, John

    2016-01-01

    In this work, we demonstrate the application of statistical measures from dialectometry to the study of accented English speech. This new methodology enables a more quantitative approach to the study of accents. Studies on spoken dialect data have shown that a combination of representativeness (the

  9. Automatically identifying scatter in fluorescence data using robust techniques

    DEFF Research Database (Denmark)

    Engelen, S.; Frosch, Stina; Hubert, M.

    2007-01-01

    as input data for three different PARAFAC methods. Firstly inserting missing values in the scatter regions are tested, secondly an interpolation of the scatter regions is performed and finally the scatter regions are down-weighted. These results show that the PARAFAC method to choose after scatter...

  10. Identifying the Factors Affecting Papers' Citability in the Field of Medicine: an Evidence-based Approach Using 200 Highly and Lowly-cited Papers.

    Science.gov (United States)

    Yaminfirooz, Mousa; Ardali, Farzaneh Raeesi

    2018-01-01

    Nowadays, publishing highly-cited papers is important for researchers and editors. In this evidence-based study, the factors influencing the citability of published papers in the field of medicine have been identified. 200 papers indexed in Scopus (in two groups, highly-cited and lowly-cited, with 100 papers in each) were studied. The needed data were collected manually with a researcher-made checklist. Data analysis was done in SPSS using descriptive and inferential statistics. Variables such as journal IF, journal rank, journal subject quartile, the first/corresponding author's h-index, the number of documents produced by the first/corresponding author, SJR and SNIP had a significantly positive correlation with paper citability (p < .05), while paper age, paper type, the number of references, the number of authors, indexing institute and journal kind had no relationship with paper citability (p > .05). The factors affecting citability are thus among the indicators relating to authors, publishing journals and published papers. Determining the extent to which these factors influence the citability of a paper needs further large-scale research. Authors and editors aiming for highly-cited papers should consider these factors when authoring and publishing papers.

  12. Using automatic correction software in the field of health sciences

    Directory of Open Access Journals (Sweden)

    Ferrán Prados

    2010-06-01

    We are living through an era of profound change in university education. The implementation of the Bologna plan has led us to propose new teaching methodologies, to review the role of the student, competency-based assessment, and the incorporation of ICT; such things were unthinkable little more than a decade ago. Among the different computing platforms, those that allow automatic correction of exercises stand out, because they are instruments of great pedagogical interest: they assess students instantly and provide immediate feedback on their knowledge, in the form of a help message or a grade. If we add the power of the Internet to these tools, using an e-learning environment, the result allows working, correcting, evaluating, resolving doubts, and so on, from anywhere and at any time. This paper presents part of such a platform and the results of its use in the field of health sciences.

  13. Automatic measurement of images on astrometric plates

    Science.gov (United States)

    Ortiz Gil, A.; Lopez Garcia, A.; Martinez Gonzalez, J. M.; Yershov, V.

    1994-04-01

    We present some results on the process of automatic detection and measurement of objects in overlapped fields of astrometric plates. The main steps of our algorithm are the following: determination of the scale and tilt between the charge-coupled device (CCD) and microscope coordinate systems, and estimation of the signal-to-noise ratio in each field; image identification and improvement of its position and size; final image centering; image selection and storage. Several parameters allow the use of variable criteria for image identification, characterization and selection. Problems related to faint images and crowded fields will be approached by special techniques (morphological filters, histogram properties and fitting models).

  14. Using Novel Laboratory Incubations and Field Experiments to Identify the Source and Fate of Reactive Organic Carbon in an Arsenic-contaminated Aquifer System

    Science.gov (United States)

    Stahl, M.; Tarek, M. H.; Badruzzaman, B.; Harvey, C. F.

    2017-12-01

    Characterizing the sources and fate of organic matter (OM) within aquifer systems is key to our understanding of both the broader global carbon cycle and the quality of our groundwater resources. The linkage between the subsurface carbon cycle and groundwater quality is perhaps nowhere more apparent than in the aquifer systems of South and Southeast Asia, where the contamination of groundwater with geogenic arsenic (As) is widespread and threatens the health of millions of individuals. OM fuels the biogeochemical processes driving As mobilization within these aquifers; however, the source of the reactive OM (i.e., modern surface-derived or aged sedimentary OM) is widely debated. To characterize the sources of OM driving aquifer redox processes, we tracked DIC and DOC concentrations and isotopes (stable and radiocarbon) along groundwater flow-paths and beneath an instrumented study pond at a field site in Bangladesh. We also conducted a set of novel groundwater incubation experiments, in which we carbon-dated the DOC at the start and end of an experiment in order to determine the age of the OM that was mineralized. Our carbon/isotope balance reveals that aquifer recharge introduces a large quantity of young (i.e., near-modern) OM that is efficiently mineralized within the upper few meters of the aquifer, effectively preventing this pool of reactive surface-sourced OM from being transported deeper into the aquifer where significant As mobilization takes place. The OM mineralized below the upper few meters is an aged, sedimentary source. Consistent with our field data, our incubation experiments show that below the upper few meters of the aquifer the reactive DOC is significantly older than the bulk DOC and has an age consistent with sedimentary OM. Combining our novel set of incubation experiments and a carbon/isotope balance along groundwater flow-paths and beneath our study pond we have identified the sources of reactive OM across different aquifer depths in a

  15. The Associate Principal Astronomer for AI Management of Automatic Telescopes

    Science.gov (United States)

    Henry, Gregory W.

    1998-01-01

    This research program in the scheduling and management of automatic telescopes had the following objectives: 1. To field test the 1993 Automatic Telescope Instruction Set (ATIS93) programming language, which was specifically developed to allow real-time control of an automatic telescope via an artificial intelligence scheduler running on a remote computer. 2. To develop and test the procedures for two-way communication between a telescope controller and a remote scheduler via the Internet. 3. To test various concepts in AI scheduling being developed at NASA Ames Research Center on an automatic telescope operated by Tennessee State University at the Fairborn Observatory site in southern Arizona. 4. To develop a prototype software package, dubbed the Associate Principal Astronomer, for the efficient scheduling and management of automatic telescopes.

  16. An Autonomous Robotic System for Mapping Weeds in Fields

    DEFF Research Database (Denmark)

    Hansen, Karl Damkjær; Garcia Ruiz, Francisco Jose; Kazmi, Wajahat

    2013-01-01

    The ASETA project develops theory and methods for robotic agricultural systems. In ASETA, unmanned aircraft and unmanned ground vehicles are used to automate the task of identifying and removing weeds in sugar beet fields. The framework for a working automatic robotic weeding system is presented...

  17. The association between type of spine fracture and the mechanism of trauma: A useful tool for identifying mechanism of trauma on legal medicine field.

    Science.gov (United States)

    Aghakhani, Kamran; Kordrostami, Roya; Memarian, Azadeh; Asl, Nahid Dadashzadeh; Zavareh, Fatemeh Noorian

    2018-05-01

    Determining the association between the mechanism of trauma and the type of spinal column fracture is a useful approach for exactly describing spine injury in the forensic medicine field. We aimed to determine the mechanism of trauma based on the distribution of spinal column fractures. This cross-sectional survey was performed on 117 consecutive patients with a history of spinal trauma who were admitted to the emergency ward of Rasoul-e-Akram Hospital in Tehran, Iran from April 2015 to March 2016. The baseline characteristics were collected by reviewing the hospital records. With respect to the mechanism of fracture, 63.2% of fractures were caused by falls, 30.8% by collisions with motor vehicles, and the rest by violence. Regarding the site of fracture, the lumbosacral region was affected in 47.9%, thoracic in 29.9%, and cervical in 13.7%. Regarding the type of fracture, burst fracture was the most common type (71.8%), followed by compressive fracture (14.5%). The site of fracture was specifically associated with the mechanism of injury: the most common injuries induced by falling from height were found in the lumbosacral and cervical sites, the most frequent injuries from traffic accidents were found in the thoracic site, and injuries following violence were observed more in the lumbar vertebrae. Burst fractures were more frequent in patients injured by falling from height and by traffic accidents, while burst and compressive fractures were observed about equally in patients injured by violence (p = 0.003). The type of spine fracture due to trauma is closely associated with the mechanism of trauma, which can be helpful in legal medicine to identify the mechanism of trauma in affected patients.

  18. Automatic quantitative renal scintigraphy

    International Nuclear Information System (INIS)

    Valeyre, J.; Deltour, G.; Delisle, M.J.; Bouchard, A.

    1976-01-01

    Renal scintigraphy data may be analyzed automatically by the use of a processing system coupled to an Anger camera (TRIDAC-MULTI 8 or CINE 200). The computing sequence is as follows: normalization of the images; background noise subtraction on both images; evaluation of mercury-197 uptake by the liver and spleen; calculation of the activity fractions of each kidney with respect to the injected dose, taking into account the kidney depth, with the results referred to normal values; and edition of the results. Automation minimizes the scattering of parameters and, by its simplification, is a great asset in routine work.

  19. AUTOMATIC FREQUENCY CONTROL SYSTEM

    Science.gov (United States)

    Hansen, C.F.; Salisbury, J.D.

    1961-01-10

    A control is described for automatically matching the frequency of a resonant cavity to that of a driving oscillator. The driving oscillator is disconnected from the cavity and a secondary oscillator is actuated in which the cavity is the frequency-determining element. A low frequency is mixed with the output of the driving oscillator and the resulting lower and upper sidebands are separately derived. The frequencies of the sidebands are compared with the secondary oscillator frequency, deriving a servo control signal to adjust a tuning element in the cavity and match the cavity frequency to that of the driving oscillator. The driving oscillator may then be reconnected to the cavity.

  20. Automatic dipole subtraction

    International Nuclear Information System (INIS)

    Hasegawa, K.

    2008-01-01

    The Catani-Seymour dipole subtraction is a general procedure to treat infrared divergences in real emission processes at next-to-leading order in QCD. We have automated the procedure in a computer code. The code is especially useful for processes with many parton legs. In this talk, we first explain the algorithm of the dipole subtraction and the overall structure of our code. After that, we show results for some processes in which the infrared divergences of the real emission processes are subtracted. (author)

  1. Automatic programmable air ozonizer

    International Nuclear Information System (INIS)

    Gubarev, S.P.; Klosovsky, A.V.; Opaleva, G.P.; Taran, V.S.; Zolototrubova, M.I.

    2015-01-01

    In this paper we describe a compact, economical, easy-to-manage automatic air ozonizer developed at the Institute of Plasma Physics of the NSC KIPT. It is designed for sanitation and disinfection of premises and for cleaning the air of foreign odors. A distinctive feature of the developed device is the generation of a given concentration of ozone, approximately 0.7 of the maximum allowable concentration (MAC), and the automatic maintenance of this specified level. This allows people to remain inside the treated premises during operation. A microprocessor controller was developed to control the operation of the ozonizer.

  2. Automatic Detect and Trace of Solar Filaments

    Science.gov (United States)

    Fang, Cheng; Chen, P. F.; Tang, Yu-hua; Hao, Qi; Guo, Yang

    We developed a series of methods to automatically detect and trace solar filaments in solar Hα images. The programs are able not only to recognize filaments and determine their properties, such as position, area and other relevant parameters, but also to trace the daily evolution of the filaments. For full-disk solar Hα images, the method consists of three parts: first, preprocessing is applied to correct the original images; second, the Canny edge-detection method is used to detect the filaments; third, filament properties are recognized through morphological operators. For each Hα filament and its barb features, we introduced the unweighted undirected graph concept and adopted Dijkstra's shortest-path algorithm to recognize the filament spine; the polarity inversion line shift method is then used to measure the polarities on both sides of the filament and determine the filament axis chirality; finally, the connected components labeling method is employed to identify the barbs, and the angle between each barb and the spine is calculated to indicate the barb chirality. Our algorithms are applied to observations from several observatories, including the Optical & Near Infrared Solar Eruption Tracer (ONSET) of Nanjing University, Mauna Loa Solar Observatory (MLSO) and Big Bear Solar Observatory (BBSO). The programs are demonstrated to be effective and efficient. We used our method to automatically process and analyze 3470 images obtained by MLSO from January 1998 to December 2009, and obtained a butterfly diagram of filaments. It shows that the latitudinal migration of solar filaments had three trends in Solar Cycle 23: the drift velocity was fast from 1998 to the solar maximum; after the solar maximum it became relatively slow; and after 2006 the migration became divergent, signifying the solar minimum. About 60% of filaments with latitudes above 50 degrees migrate towards the polar regions with relatively high velocities, and the latitudinal migrating
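
    The spine-recognition step above reduces, on an unweighted undirected graph, to breadth-first search, which is what Dijkstra's algorithm becomes when all edges have equal weight. A minimal sketch, with a toy six-pixel "skeleton" standing in for a real filament graph:

```python
from collections import deque

def shortest_path(adj, start, goal):
    """Shortest path in an unweighted undirected graph via BFS
    (Dijkstra's algorithm with all edge weights equal)."""
    prev = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:                 # reconstruct path back to start
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in adj[node]:
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    return None

# Toy filament "skeleton": nodes are pixels, edges are adjacencies.
adj = {
    "A": ["B"], "B": ["A", "C", "D"], "C": ["B", "E"],
    "D": ["B"], "E": ["C", "F"], "F": ["E"],
}
print(shortest_path(adj, "A", "F"))   # → ['A', 'B', 'C', 'E', 'F']
```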

  3. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in the department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist, since May 1992. The ACR dictionary files consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself among the upper- and lower-level codes of the selected one, which were displayed simultaneously on the screen. According to the first number of the selected organ code, the corresponding pathology code file was chosen automatically. In a similar fashion to the organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of data fields. Because this program was written in 'User's Defined Function' form, decoding of a stored ACR code was achieved by the same program, and the program could be incorporated into other data processing programs. This program had the merits of simple operation, accurate and detailed coding, and easy adjustment for other programs. Therefore, this program can be used for automation of routine work in the department of radiology.
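
    The two-stage lookup can be sketched as nested dictionaries: the organ code selects, by its first digit, which pathology table applies, and the final ACR code is the pair joined by a period. The entries and descriptions below are invented placeholders, not records from the real ACR dictionaries:

```python
# Hypothetical fragments of the dictionaries; the real system uses 11 files
# and far more entries. Descriptions are invented placeholders.
ORGANS = {"131": "organ A", "761": "organ B"}
PATHOLOGY = {                # pathology tables keyed by first digit of organ code
    "1": {"3661": "pathology X"},
    "7": {"4100": "pathology Y"},
}

def acr_code(organ_code, pathology_code):
    """Join an organ code and a pathology code into one ACR code string."""
    if organ_code not in ORGANS:
        raise KeyError(f"unknown organ code: {organ_code}")
    table = PATHOLOGY[organ_code[0]]      # first digit selects the table
    if pathology_code not in table:
        raise KeyError(f"unknown pathology code: {pathology_code}")
    return f"{organ_code}.{pathology_code}"

print(acr_code("131", "3661"))   # → 131.3661
```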

  4. Culture, attribution and automaticity: a social cognitive neuroscience view.

    Science.gov (United States)

    Mason, Malia F; Morris, Michael W

    2010-06-01

    A fundamental challenge facing social perceivers is identifying the cause underlying other people's behavior. Evidence indicates that East Asian perceivers are more likely than Western perceivers to reference the social context when attributing a cause to a target person's actions. One outstanding question is whether this reflects a culture's influence on automatic or on controlled components of causal attribution. After reviewing behavioral evidence that culture can shape automatic mental processes as well as controlled reasoning, we discuss the evidence in favor of cultural differences in automatic and controlled components of causal attribution more specifically. We contend that insights emerging from social cognitive neuroscience research can inform this debate. After introducing an attribution framework popular among social neuroscientists, we consider findings relevant to the automaticity of attribution, before speculating how one could use a social neuroscience approach to clarify whether culture affects automatic, controlled or both types of attribution processes.

  5. [An automatic system controlled by microcontroller for carotid sinus perfusion].

    Science.gov (United States)

    Yi, X L; Wang, M Y; Fan, Z Z; He, R R

    2001-08-01

    To establish a new method for controlling the carotid perfusion pressure automatically. A cheap, practical automatic perfusion unit based on the AT89C2051 microcontroller was designed. The unit, an LDB-M perfusion pump and the carotid sinus of an animal constituted an automatic perfusion system. This system is able to provide ramp and stepwise up-and-down perfusion patterns and has been used in baroreflex research. It can ensure the precision and reproducibility of the perfusion pressure curve, and improve the technical level in the corresponding medical field.
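
    The ramp and stepwise up-and-down perfusion patterns are straightforward to generate in firmware; a sketch in Python (pressure units, step sizes and hold counts are illustrative, not values from the paper):

```python
def ramp(p0, p1, steps):
    """Linear ramp of perfusion pressure from p0 to p1, inclusive."""
    return [p0 + (p1 - p0) * i / (steps - 1) for i in range(steps)]

def stepwise_up_down(p0, p1, step, hold):
    """Stepwise up-then-down profile; each plateau is held `hold` ticks."""
    up = list(range(p0, p1 + step, step))
    profile = []
    for p in up + up[-2::-1]:   # ascend, then descend without repeating the peak
        profile += [p] * hold
    return profile

print(ramp(0, 10, 6))                     # → [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
print(stepwise_up_down(40, 100, 20, 2))
```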

  6. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples.
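
    As a concrete sketch of one such example: a common two-dimensional Thue-Morse construction takes the XOR of the one-dimensional values along each coordinate, and its 2-automaticity shows up as invariance under the decimation (i, j) → (2i, 2j). This illustrates the flavor of these sets, not the authors' formal definitions:

```python
def tm(n):
    """One-dimensional Thue-Morse value: parity of the 1-bits of n."""
    return bin(n).count("1") % 2

def tm2(i, j):
    """A standard two-dimensional Thue-Morse construction: XOR of the
    one-dimensional values along each coordinate."""
    return tm(i) ^ tm(j)

# The automatic set D = {(i, j) : tm2(i, j) == 0} over an 8x8 patch.
D = {(i, j) for i in range(8) for j in range(8) if tm2(i, j) == 0}

# 2-automaticity as decimation self-similarity: value at (2i, 2j) equals (i, j).
assert all(tm2(2 * i, 2 * j) == tm2(i, j) for i in range(8) for j in range(8))

for i in range(8):
    print("".join("#" if (i, j) in D else "." for j in range(8)))
```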

  7. High-throughput phenotyping (HTP) identifies seedling root traits linked to variation in seed yield and nutrient capture in field-grown oilseed rape (Brassica napus L.).

    Science.gov (United States)

    Thomas, C L; Graham, N S; Hayden, R; Meacham, M C; Neugebauer, K; Nightingale, M; Dupuy, L X; Hammond, J P; White, P J; Broadley, M R

    2016-04-06

    Root traits can be selected for crop improvement. Techniques such as soil excavations can be used to screen root traits in the field, but are limited to genotypes that are well-adapted to field conditions. The aim of this study was to compare a low-cost, high-throughput root phenotyping (HTP) technique in a controlled environment with field performance, using oilseed rape (OSR;Brassica napus) varieties. Primary root length (PRL), lateral root length and lateral root density (LRD) were measured on 14-d-old seedlings of elite OSR varieties (n = 32) using a 'pouch and wick' HTP system (∼40 replicates). Six field experiments were conducted using the same varieties at two UK sites each year for 3 years. Plants were excavated at the 6- to 8-leaf stage for general vigour assessments of roots and shoots in all six experiments, and final seed yield was determined. Leaves were sampled for mineral composition from one of the field experiments. Seedling PRL in the HTP system correlated with seed yield in four out of six (r = 0·50, 0·50, 0·33, 0·49;P emergence in three out of five (r = 0·59, 0·22, 0·49;P emergence, general early vigour or yield in the field. Associations between PRL and field performance are generally related to early vigour. These root traits might therefore be of limited additional selection value, given that vigour can be measured easily on shoots/canopies. In contrast, LRD cannot be assessed easily in the field and, if LRD can improve nutrient uptake, then it may be possible to use HTP systems to screen this trait in both elite and more genetically diverse, non-field-adapted OSR. © The Author 2016. Published by Oxford University Press on behalf of the Annals of Botany Company.

  8. Automatic identification in mining

    Energy Technology Data Exchange (ETDEWEB)

    Puckett, D; Patrick, C [Mine Computers and Electronics Inc., Morehead, KY (United States)

    1998-06-01

    The feasibility of monitoring the locations and vital statistics of equipment and personnel in surface and underground mining operations has increased with advancements in radio frequency identification (RFID) technology. This paper addresses the use of RFID technology, which is relatively new to the mining industry, to track surface equipment in mine pits, loading points and processing facilities. Specific applications are discussed, including both simplified and complex truck tracking systems and an automatic pit ticket system. This paper concludes with a discussion of the future possibilities of using RFID technology in mining including monitoring heart and respiration rates, body temperatures and exertion levels; monitoring repetitious movements for the study of work habits; and logging air quality via personnel sensors. 10 refs., 5 figs.

  9. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through the description of Micro-Videomat automatic image analysis system and volumetric percentage of perlite in nodular cast irons, porosity and average grain size in high-density sintered pellets of UO 2 , and grain size of ferritic steel. Techniques adopted are described and results obtained are compared with the corresponding ones by the direct counting process: counting of systematic points (grid) to measure volume and intersections method, by utilizing a circunference of known radius for the average grain size. The adopted technique for nodular cast iron resulted from the small difference of optical reflectivity of graphite and perlite. Porosity evaluation of sintered UO 2 pellets is also analyzed [pt

  10. Semi-automatic fluoroscope

    International Nuclear Information System (INIS)

    Tarpley, M.W.

    1976-10-01

    Extruded aluminum-clad uranium-aluminum alloy fuel tubes must pass many quality control tests before irradiation in Savannah River Plant nuclear reactors. Nondestructive test equipment has been built to automatically detect high and low density areas in the fuel tubes using x-ray absorption techniques with a video analysis system. The equipment detects areas as small as 0.060-in. dia with 2 percent penetrameter sensitivity. These areas are graded as to size and density by an operator using electronic gages. Video image enhancement techniques permit inspection of ribbed cylindrical tubes and make possible the testing of areas under the ribs. Operation of the testing machine, the special low light level television camera, and analysis and enhancement techniques are discussed

  11. Automatic surveying techniques

    International Nuclear Information System (INIS)

    Sah, R.

    1976-01-01

    In order to investigate the feasibility of automatic surveying methods in a more systematic manner, the PEP organization signed a contract in late 1975 for TRW Systems Group to undertake a feasibility study. The completion of this study resulted in TRW Report 6452.10-75-101, dated December 29, 1975, which was largely devoted to an analysis of a survey system based on an Inertial Navigation System. This PEP note is a review and, in some instances, an extension of that TRW report. A second survey system which employed an ''Image Processing System'' was also considered by TRW, and it will be reviewed in the last section of this note. 5 refs., 5 figs., 3 tabs

  12. AUTOMATIC ARCHITECTURAL STYLE RECOGNITION

    Directory of Open Access Journals (Sweden)

    M. Mathias

    2012-09-01

    Procedural modeling has proven to be a very valuable tool in the field of architecture. In the last few years, research has soared to automatically create procedural models from images. However, current algorithms for this process of inverse procedural modeling rely on the assumption that the building style is known. So far, the determination of the building style has remained a manual task. In this paper, we propose an algorithm which automates this process through classification of architectural styles from facade images. Our classifier first identifies the images containing buildings, then separates individual facades within an image and determines the building style. This information could then be used to initialize the building reconstruction process. We have trained our classifier to distinguish between several distinct architectural styles, namely Flemish Renaissance, Haussmannian and Neoclassical. Finally, we demonstrate our approach on various street-side images.

  13. Learning algorithms and automatic processing of languages

    International Nuclear Information System (INIS)

    Fluhr, Christian Yves Andre

    1977-01-01

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to describe how these mechanisms are simulated on a computer. He outlines the specific role of learning in various manifestations of intelligence. Then, based on the Markov's algorithm theory, the author discusses the notion of learning algorithm. Two main types of learning algorithms are then addressed: firstly, an 'algorithm-teacher dialogue' type sanction-based algorithm which aims at learning how to solve grammatical ambiguities in submitted texts; secondly, an algorithm related to a document system which structures semantic data automatically obtained from a set of texts in order to be able to understand by references to any question on the content of these texts

  14. Automatic traveltime picking using instantaneous traveltime

    KAUST Repository

    Saragiotis, Christos

    2013-02-08

    Event picking is used in many steps of seismic processing. We present an automatic event picking method that is based on a new attribute of seismic signals, instantaneous traveltime. The calculation of the instantaneous traveltime consists of two separate but interrelated stages. First, a trace is mapped onto the time-frequency domain. Then the time-frequency representation is mapped back onto the time domain by an appropriate operation. The computed instantaneous traveltime equals the recording time at those instances at which there is a seismic event, a feature that is used to pick the events. We analyzed the concept of the instantaneous traveltime and demonstrated the application of our automatic picking method on dynamite and Vibroseis field data.

  15. Automatic traveltime picking using instantaneous traveltime

    KAUST Repository

    Saragiotis, Christos; Alkhalifah, Tariq Ali; Fomel, Sergey

    2013-01-01

    Event picking is used in many steps of seismic processing. We present an automatic event picking method that is based on a new attribute of seismic signals, instantaneous traveltime. The calculation of the instantaneous traveltime consists of two separate but interrelated stages. First, a trace is mapped onto the time-frequency domain. Then the time-frequency representation is mapped back onto the time domain by an appropriate operation. The computed instantaneous traveltime equals the recording time at those instances at which there is a seismic event, a feature that is used to pick the events. We analyzed the concept of the instantaneous traveltime and demonstrated the application of our automatic picking method on dynamite and Vibroseis field data.

  16. Automatic evidence retrieval for systematic reviews.

    Science.gov (United States)

    Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G; Tsafnat, Guy

    2014-10-01

    Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing's effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Our goal was to evaluate the capacity of an automatic citation snowballing method to identify and retrieve the full text and/or abstracts of cited articles. Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. The automatic method was able to correctly identify 633 citations (as a proportion of included citations: recall=66.7%, F1 score=79.3%; as a proportion of citations in MAS: recall=85.5%, F1 score=91.2%) with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews.
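
    The two recall figures quoted above differ only in their denominators (all 949 citations versus the 740 actually indexed in MAS); the arithmetic can be reproduced directly from the abstract's numbers:

```python
# Figures reported in the abstract.
correct, total, in_mas, precision = 633, 949, 740, 0.977

recall_all = correct / total     # denominator: every citation in the reviews
recall_mas = correct / in_mas    # denominator: citations findable in MAS
f1_all = 2 * precision * recall_all / (precision + recall_all)

print(f"recall (all): {recall_all:.1%}")   # → 66.7%
print(f"recall (MAS): {recall_mas:.1%}")   # → 85.5%
print(f"F1 (all):     {f1_all:.1%}")       # → 79.3%
```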

  17. Automatic localization of the da Vinci surgical instrument tips in 3-D transrectal ultrasound.

    Science.gov (United States)

    Mohareri, Omid; Ramezani, Mahdi; Adebar, Troy K; Abolmaesumi, Purang; Salcudean, Septimiu E

    2013-09-01

    Robot-assisted laparoscopic radical prostatectomy (RALRP) using the da Vinci surgical system is the current state-of-the-art treatment option for clinically confined prostate cancer. Given the limited field of view of the surgical site in RALRP, several groups have proposed the integration of transrectal ultrasound (TRUS) imaging in the surgical workflow to assist with accurate resection of the prostate and the sparing of the neurovascular bundles (NVBs). We previously introduced a robotic TRUS manipulator and a method for automatically tracking da Vinci surgical instruments with the TRUS imaging plane, in order to facilitate the integration of intraoperative TRUS in RALRP. Rapid and automatic registration of the kinematic frames of the da Vinci surgical system and the robotic TRUS probe manipulator is a critical component of the instrument tracking system. In this paper, we propose a fully automatic registration technique based on automatic 3-D TRUS localization of robot instrument tips pressed against the air-tissue boundary anterior to the prostate. The detection approach uses a multiscale filtering technique to identify and localize surgical instrument tips in the TRUS volume, and could also be used to detect other surface fiducials in 3-D ultrasound. Experiments have been performed using a tissue phantom and two ex vivo tissue samples to show the feasibility of the proposed methods. Also, an initial in vivo evaluation of the system has been carried out on a live anaesthetized dog with a da Vinci Si surgical system and a target registration error (defined as the root mean square distance of corresponding points after registration) of 2.68 mm has been achieved. Results show this method's accuracy and consistency for automatic registration of TRUS images to the da Vinci surgical system.

  18. Two-field photography can identify patients with vision-threatening diabetic retinopathy - A screening approach in the primary care setting

    NARCIS (Netherlands)

    Stellingwerf, C; Hardus, PLLJ; Hooymans, JMM

    2001-01-01

    OBJECTIVE - To compare the effectiveness of two 45 degrees photographic fields per eye in the screening for diabetic retinopathy with the routine ophthalmologist's examination and to study the effectiveness of visual acuity measurement in the detection of diabetic macular edema, RESEARCH DESIGN AND

  19. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should…

  20. Automatic termination of a protective action

    International Nuclear Information System (INIS)

    Heil, P.H.

    1986-01-01

    Subcommittee 6 of NPEC is responsible for the development of IEEE Standard 603. The adequacy of requirements concerning control and termination of protective actions was raised during the balloting of IEEE Standard 603-1980. In essence, the concern dealt with the requirement for deliberate operator action to return the system to normal. It was questioned whether control actions such as automatic termination of system operation were allowed. Changes in the standard were made to clarify that there was a distinction between control (including termination) and return to normal, and also to state that automatic control may be required. Additionally, an action item was identified in the foreword of IEEE Standard 603-1980 to determine if any additional changes were needed. The purpose of this paper is to present the results of this additional work.

  1. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    Andriamampianina, Lala

    1983-01-01

    A system for the semi-automatic survey of drawings is presented. Its design has been oriented towards reducing the stored information required for drawing reproduction. The equipment consists mainly of a plotter driven by a micro-computer, but the pen of the plotter is replaced by a circular photodiode array. Line drawings are first viewed as a concatenation of vectors, with a constant angle between consecutive vectors, and then divided into arcs of circles and line segments. A dynamic analysis of line intersections with the circular sensor makes it possible to identify starting points and end points of a line, for the purpose of automatically following connected lines in a drawing. The advantage of the described method is that precision depends practically only on the plotter performance, the sensor resolution being relevant only for the thickness of strokes and the distance between two strokes. (author) [fr

  2. Automatic EEG spike detection.

    Science.gov (United States)

    Harner, Richard

    2009-10-01

    Since the 1970s advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and still most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-peak amplitude and duration. Wavelet parameters are very appropriate to AESD but need to be combined with other parameters to achieve desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.
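
    A minimal expert-parameter detector using only the two parameters the review singles out as essential, amplitude and duration, might look like the sketch below. The thresholds and the toy trace are invented for illustration; real AESD systems combine many more parameters:

```python
def detect_spikes(signal, amp_thresh, max_width):
    """Threshold-based spike candidates: contiguous runs of samples whose
    amplitude exceeds amp_thresh and whose width stays under max_width
    samples. Broad waves that exceed the amplitude threshold but last too
    long are rejected."""
    spikes, start = [], None
    for i, v in enumerate(signal):
        if abs(v) > amp_thresh:
            if start is None:
                start = i
        elif start is not None:
            if i - start <= max_width:
                spikes.append((start, i - 1))
            start = None
    if start is not None and len(signal) - start <= max_width:
        spikes.append((start, len(signal) - 1))
    return spikes

# Toy trace: background near 0, one sharp event, one broad (non-spike) wave.
trace = [0, 1, 0, 9, 10, 0, 5, 5, 5, 5, 5, 5, 0]
print(detect_spikes(trace, amp_thresh=4, max_width=3))  # → [(3, 4)]
```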

  3. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  4. Identifying the site of granite uranium deposit with radon survey and soil-natural thermoluminescence survey. A case study of Xiazhuang granite uranium field

    International Nuclear Information System (INIS)

    Yang Yaxin; Wu Yamei; Wu Xinmin; Chen Yue; Zheng Yongming; Zhang Ye; Wu Lieqin

    2007-01-01

    This paper briefly introduces the methods and procedures for field and indoor radon surveys and thermoluminescence (TL) surveys. The application of these two methods to the Xiazhuang uranium field in Guangdong province shows: (1) the positive anomalies of the radon survey coincide well with the fractured zone, and the positive anomalies of the TL survey correspond to uranium mineralization on granite-type uranium deposits of the silicified fracture zone; the uranium deposit can be effectively explored when these two kinds of anomalies match together. (2) the positive anomalies of the radon survey coincide well with the fractured zone, and the positive anomalies of TL correspond to the position where the intersection between the fractured zone and the diabase dyke projects onto the ground. (authors)

  5. The Governmentality of Meta-governance : Identifying Theoretical and Empirical Challenges of Network Governance in the Political Field of Security and Beyond

    OpenAIRE

    Larsson, Oscar

    2015-01-01

    Meta-governance recently emerged in the field of governance as a new approach which claims that its use enables modern states to overcome problems associated with network governance. This thesis shares the view that networks are an important feature of contemporary politics which must be taken seriously, but it also maintains that networks pose substantial analytical and political challenges. It proceeds to investigate the potential possibilities and problems associated with meta-governance o...

  6. Can we use subchondral bone thickness on high-field magnetic resonance images to identify Thoroughbred racehorses at risk of catastrophic lateral condylar fracture?

    Science.gov (United States)

    Tranquille, C A; Murray, R C; Parkin, T D H

    2017-03-01

    Fractures of the lateral condyle of the third metacarpus (MC3) are a significant welfare concern in horseracing worldwide. The primary aim of this work was to identify magnetic resonance (MR) image-detectable prefracture markers that have the potential for use as a screening tool to identify horses at significant risk of catastrophic fracture. Case-control study of bone-level risk factors for fracture in racehorses. A total of 191 MC3s from horses with and without lateral condylar fracture of MC3 were subjected to MR imaging. The depth of dense subchondral/trabecular bone was measured at several sites around the distal end of the bone and regression analyses were conducted to identify differences in this depth between horses with and without lateral condylar fracture. Greater depth of dense subchondral/trabecular bone in the palmar half of the lateral parasagittal groove of distal MC3 was associated with an increased likelihood of being from a horse that had sustained a fracture. Receiver operating characteristic analysis was used to identify the optimal cut-off in the depth of dense subchondral/trabecular bone at this site to best discriminate fracture status. Positive and negative predictive values were calculated using the prevalence of fracture within the current study and also a prevalence estimate for the wider racehorse population. There is a requirement to identify suitable prescreening test(s) to eliminate many true negative horses and increase the prevalence of prefracture pathology in the subpopulation that would be screened using MR imaging, in turn maximising the positive predictive value of this test. © 2016 EVJ Ltd.
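
    The closing argument about prescreening follows from Bayes' rule: predictive values depend on prevalence, not just on test sensitivity and specificity. A sketch with hypothetical test characteristics (not estimates from this study):

```python
def ppv_npv(sensitivity, specificity, prevalence):
    """Positive and negative predictive values via Bayes' rule."""
    tp = sensitivity * prevalence
    fp = (1 - specificity) * (1 - prevalence)
    fn = (1 - sensitivity) * prevalence
    tn = specificity * (1 - prevalence)
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical screening test: the same sensitivity/specificity gives a far
# lower PPV at low prevalence, which is why enriching the screened
# subpopulation with a prescreening step raises the value of MR screening.
for prev in (0.5, 0.01):
    ppv, npv = ppv_npv(sensitivity=0.9, specificity=0.9, prevalence=prev)
    print(f"prevalence {prev:.0%}: PPV {ppv:.1%}, NPV {npv:.1%}")
```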

  7. Determination of polycyclic aromatic compounds. Part project 11: Practice-oriented adaption and field testing of various automatic measuring devices; Messung polycyclischer aromatischer Verbindungen. Teilvorhaben 11: Praxisbezogene Anpassung und Felderprobung verschiedener automatischer Messeinrichtungen

    Energy Technology Data Exchange (ETDEWEB)

    Wilbring, P.; Jockel, W.

    1997-05-01

    The purpose of the present study was to examine various automatic emission measuring devices. The task was to determine polycyclic aromatic hydrocarbons (PAH) on-line. The following measuring devices were used: photoelectric aerosol sensor; emission mass spectrometer; laser-induced aerosol fluorescence; chemical ionisation mass spectrometer; photoelectric aerosol sensor. Most of the above-named measuring devices for automatic PAH monitoring had already demonstrated their general suitability in the course of extensive studies carried out in precursor projects. The next step, performed in this study, was to test the measuring devices' fitness for use. First, practice-oriented laboratory tests were carried out on the devices, whose measuring principles are highly diverse. These tests focussed on the identification of process parameters (e.g., detection limit, cross-sensitivity, availability, drift behaviour) and examination of the devices' analysis function and hence of their calibratability. (orig./SR)

  8. Microbial mineralization of cis-dichloroethene and vinyl chloride as a component of natural attenuation of chloroethene contaminants under conditions identified in the field as anoxic

    Science.gov (United States)

    Bradley, Paul M.

    2012-01-01

    Chlororespiration is a key component of remediation at many chloroethene-contaminated sites. In some instances, limited accumulation of reductive dechlorination daughter products may suggest that natural attenuation is not adequate for site remediation. This conclusion is justified when evidence for parent compound (tetrachloroethene, PCE, or trichloroethene, TCE) degradation is lacking. For many chloroethene-contaminated shallow aquifer systems, however, non-conservative losses of the parent compounds are clear but the mass balance between parent compound attenuation and accumulation of reductive dechlorination daughter products is incomplete. Incomplete mass balance indicates a failure to account for important contaminant attenuation mechanisms, and is consistent with contaminant degradation to non-diagnostic mineralization products. An ongoing technical debate over the potential for mineralization of dichloroethene (DCE) and vinyl chloride (VC) to CO2 in the complete absence of diatomic oxygen has largely obscured the importance of microbial DCE/VC mineralization at dissolved oxygen (DO) concentrations below the current field standard for nominally anoxic conditions. This study demonstrates that oxygen-based microbial mineralization of DCE and VC can be substantial under field conditions that are frequently characterized as "anoxic." Because mischaracterization of operant contaminant biodegradation processes can lead to expensive and ineffective remedial actions, a modified framework for assessing the potential importance of oxygen during chloroethene biodegradation was developed.

  9. Personality in speech assessment and automatic classification

    CERN Document Server

    Polzehl, Tim

    2015-01-01

    This work combines interdisciplinary knowledge and experience from research fields of psychology, linguistics, audio-processing, machine learning, and computer science. The work systematically explores a novel research topic devoted to automated modeling of personality expression from speech. For this aim, it introduces a novel personality assessment questionnaire and presents the results of extensive labeling sessions to annotate the speech data with personality assessments. It provides estimates of the Big 5 personality traits, i.e. openness, conscientiousness, extroversion, agreeableness, and neuroticism. Based on a database built on the questionnaire, the book presents models to tell apart different personality types or classes from speech automatically.

  10. Towards automatic exchange of information

    OpenAIRE

    Oberson, Xavier

    2015-01-01

    This article describes the various steps that led towards automatic exchange of information as the global standard and the issues that remain to be solved. First, the various competing models of exchange of information, such as Double Tax Treaties (DTT), TIEAs, FATCA or EU Directives, are described with a view to showing how they interact with one another. Second, the so-called Rubik Strategy is summarized and compared with automatic exchange of information (AEOI). The third part then describes ...

  11. Prestack inversion based on anisotropic Markov random field-maximum posterior probability inversion and its application to identify shale gas sweet spots

    Science.gov (United States)

    Wang, Kang-Ning; Sun, Zan-Dong; Dong, Ning

    2015-12-01

    Economic shale gas production requires hydraulic fracture stimulation to increase the formation permeability. Hydraulic fracturing strongly depends on geomechanical parameters such as Young's modulus and Poisson's ratio. Fracture-prone sweet spots can be predicted by prestack inversion, which is an ill-posed problem; thus, regularization is needed to obtain unique and stable solutions. To characterize gas-bearing shale sedimentary bodies, elastic parameter variations are regarded as an anisotropic Markov random field. Bayesian statistics are adopted for transforming prestack inversion to the maximum posterior probability. Two energy functions for the lateral and vertical directions are used to describe the distribution, and the expectation-maximization algorithm is used to estimate the hyperparameters of the prior probability of elastic parameters. Finally, the inversion yields clear geological boundaries, high vertical resolution, and reasonable lateral continuity using the conjugate gradient method to minimize the objective function. Antinoise and imaging ability of the method were tested using synthetic and real data.
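    The inversion described above is ultimately solved by minimizing an objective function with the conjugate gradient method. As an illustration of that final step only, here is a minimal sketch of linear conjugate gradients on a toy symmetric positive-definite system; the matrix and vector values are invented for demonstration and have nothing to do with the paper's seismic data.

```python
# Linear conjugate gradient: minimize f(x) = 0.5 x^T A x - b^T x,
# equivalently solve A x = b for symmetric positive-definite A.
def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=100):
    n = len(b)
    x = list(x0)
    # Initial residual r = b - A x
    r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    p = list(r)                       # first search direction
    rs_old = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
        x = [x[i] + alpha * p[i] for i in range(n)]   # step along p
        r = [r[i] - alpha * Ap[i] for i in range(n)]  # update residual
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        # New direction is conjugate to all previous ones
        p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
        rs_old = rs_new
    return x

# Tiny stand-in for the inversion's normal equations
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b, [0.0, 0.0])
```

For an n-dimensional SPD system, exact arithmetic converges in at most n iterations, which is why the method suits large inversion problems where each iteration only needs matrix-vector products.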

  12. Automatic Melody Segmentation

    NARCIS (Netherlands)

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation

  13. Identifiability in stochastic models

    CERN Document Server

    1992-01-01

    The problem of identifiability is basic to all statistical methods and data analysis, occurring in such diverse areas as Reliability Theory, Survival Analysis, and Econometrics, where stochastic modeling is widely used. Mathematics dealing with identifiability per se is closely related to the so-called branch of "characterization problems" in Probability Theory. This book brings together relevant material on identifiability as it occurs in these diverse fields.

  14. Technology-Enhanced Assessment of Math Fact Automaticity: Patterns of Performance for Low- and Typically Achieving Students

    Science.gov (United States)

    Stickney, Eric M.; Sharp, Lindsay B.; Kenyon, Amanda S.

    2012-01-01

    Because math fact automaticity has been identified as a key barrier for students struggling with mathematics, we examined how initial math achievement levels influenced the path to automaticity (e.g., variation in number of attempts, speed of retrieval, and skill maintenance over time) and the relation between attainment of automaticity and gains…

  15. Intensive field phenotyping of maize (Zea mays L.) root crowns identifies phenes and phene integration associated with plant growth and nitrogen acquisition.

    Science.gov (United States)

    York, Larry M; Lynch, Jonathan P

    2015-09-01

    Root architecture is an important regulator of nitrogen (N) acquisition. Existing methods to phenotype the root architecture of cereal crops are generally limited to seedlings or to the outer roots of mature root crowns. The functional integration of root phenes is poorly understood. In this study, intensive phenotyping of mature root crowns of maize was conducted to discover phenes and phene modules related to N acquisition. Twelve maize genotypes were grown under replete and deficient N regimes in the field in South Africa and eight in the USA. An image was captured for every whorl of nodal roots in each crown. Custom software was used to measure root phenes including nodal occupancy, angle, diameter, distance to branching, lateral branching, and lateral length. Variation existed for all root phenes within maize root crowns. Size-related phenes such as diameter and number were substantially influenced by nodal position, while angle, lateral density, and distance to branching were not. Greater distance to branching, the length from the shoot to the emergence of laterals, is proposed to be a novel phene state that minimizes placing roots in already explored soil. Root phenes from both older and younger whorls of nodal roots contributed to variation in shoot mass and N uptake. The additive integration of root phenes accounted for 70% of the variation observed in shoot mass in low N soil. These results demonstrate the utility of intensive phenotyping of mature root systems, as well as the importance of phene integration in soil resource acquisition. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  16. Automatic generation of anatomic characteristics from cerebral aneurysm surface models.

    Science.gov (United States)

    Neugebauer, M; Lawonn, K; Beuing, O; Preim, B

    2013-03-01

    Computer-aided research on cerebral aneurysms often depends on a polygonal mesh representation of the vessel lumen. To support a differentiated, anatomy-aware analysis, it is necessary to derive anatomic descriptors from the surface model. We present an approach to the automatic decomposition of the adjacent vessels into near- and far-vessel regions and computation of the axial plane. We also present two example applications of the geometric descriptors: automatic computation of a unique vessel order and automatic viewpoint selection. Approximation methods are employed to analyze vessel cross-sections and the vessel area profile along the centerline. The resulting transition zones between near- and far-vessel regions are used as input for an optimization process to compute the axial plane. The unique vessel order is defined via projection into the plane space of the axial plane. The viewing direction for the automatic viewpoint selection is derived from the normal vector of the axial plane. The approach was successfully applied to representative data sets exhibiting a broad variability with respect to the configuration of their adjacent vessels. A robustness analysis showed that the automatic decomposition is stable against noise. A survey with 4 medical experts showed a broad agreement with the automatically defined transition zones. Due to the general nature of the underlying algorithms, this approach is applicable to most of the likely aneurysm configurations in the cerebral vasculature. Additional geometric information obtained during automatic decomposition can support correction in case the automatic approach fails. The resulting descriptors can be used for various applications in the field of visualization, exploration and analysis of cerebral aneurysms.

  17. Candidate isolated neutron stars and other optically blank x-ray fields identified from the rosat all-sky and sloan digital sky surveys

    Energy Technology Data Exchange (ETDEWEB)

    Agueros, Marcel A.; Anderson, Scott F.; /Washington U., Seattle, Astron. Dept.; Margon, Bruce; /Baltimore, Space Telescope Sci.; Haberl, Frank; Voges, Wolfgang; /Garching,; Annis, James; /Fermilab; Schneider, Donald P.; /Penn State U., Astron. Astrophys.; Brinkmann, Jonathan; /Apache Point Observ.

    2005-11-01

    Only seven radio-quiet isolated neutron stars (INSs) emitting thermal X-rays are known, a sample that has yet to definitively address such fundamental issues as the equation of state of degenerate neutron matter. We describe a selection algorithm based on a cross-correlation of the ROSAT All-Sky Survey (RASS) and the Sloan Digital Sky Survey (SDSS) that identifies X-ray error circles devoid of plausible optical counterparts to the SDSS g ≈ 22 magnitude limit. We quantitatively characterize these error circles as optically blank; they may host INSs or other similarly exotic X-ray sources such as radio-quiet BL Lacs, obscured AGN, etc. Our search is an order of magnitude more selective than previous searches for optically blank RASS error circles, and excludes the 99.9% of error circles that contain more common X-ray-emitting subclasses. We find 11 candidates, nine of which are new. While our search is designed to find the best INS candidates and not to produce a complete list of INSs in the RASS, it is reassuring that our number of candidates is consistent with predictions from INS population models. Further X-ray observations will obtain pinpoint positions and determine whether these sources are entirely optically blank at g ≈ 22, supporting the presence of likely isolated neutron stars and perhaps enabling detailed follow-up studies of neutron star physics.
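    The core of the selection algorithm is a positional cross-match: an X-ray error circle is "optically blank" if no catalogued optical source lies within its positional uncertainty. A minimal sketch of that idea follows; the coordinates and the 30-arcsecond match radius are invented illustration values, not the survey's actual error-circle radii.

```python
# Sketch of cross-correlating X-ray positions against an optical catalogue
# and keeping only the positions with no counterpart inside the match radius.
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees via the haversine formula."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    d = 2 * math.asin(math.sqrt(
        math.sin((dec2 - dec1) / 2) ** 2
        + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2))
    return math.degrees(d)

def optically_blank(xray_positions, optical_catalog, radius_deg=30 / 3600):
    """Return X-ray positions with no optical source within radius_deg."""
    blank = []
    for xra, xdec in xray_positions:
        if not any(ang_sep_deg(xra, xdec, ora, odec) <= radius_deg
                   for ora, odec in optical_catalog):
            blank.append((xra, xdec))
    return blank

xray = [(150.000, 2.000), (180.000, -1.000)]   # two error-circle centres
optical = [(150.001, 2.001)]                   # counterpart near the first only
blank = optically_blank(xray, optical)
```

A production cross-match would use a spatial index (e.g., HEALPix or a k-d tree) rather than this all-pairs loop, but the selection logic is the same.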

  18. Automatic topics segmentation for TV news video

    Science.gov (United States)

    Hmayda, Mounira; Ejbali, Ridha; Zaied, Mourad

    2017-03-01

    Automatic identification of television programs in the TV stream is an important task for operating archives. This article proposes a new spatio-temporal approach to identify the programs in a TV stream in two main steps. First, a reference catalogue of video features for visual jingles is built. We exploit the features that characterize the instances of the same program type to identify the different types of programs in the television stream. The role of the video features is to represent the visual invariants for each visual jingle using appropriate automatic descriptors for each television program. Second, programs in television streams are identified by examining the similarity of the video signal to the visual grammars in the catalogue. The main idea of the identification process is to compare the visual similarity of the video signal features in the television stream to the catalogue. After presenting the proposed approach, the paper reports encouraging experimental results on several streams extracted from different channels and composed of several programs.

  19. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Tikhonov, Anatoly Fedorovich; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that relate to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality-management tools at all stages of the technological process. It is established that unaccounted-for moisture in aggregates adversely affects the concrete mixture homogeneity and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, in which the kneading-and-mixing machinery operates under continuous automatic control of homogeneity. Theoretical underpinnings of homogeneity control are presented, which relate homogeneity to changes in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during the continuous mixing of components. The following technical means for establishing automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, the system offers a structure flowchart with transfer functions that determine the ACS operation in transient dynamic mode.

  20. Automatic Segmentation and Deep Learning of Bird Sounds

    NARCIS (Netherlands)

    Koops, Hendrik Vincent; Van Balen, J.M.H.; Wiering, F.

    2015-01-01

    We present a study on automatic birdsong recognition with deep neural networks using the BIRDCLEF2014 dataset. Through deep learning, feature hierarchies are learned that represent the data on several levels of abstraction. Deep learning has been applied with success to problems in fields such as

  1. The Role of Automatic Obesity Stereotypes in Real Hiring Discrimination

    Science.gov (United States)

    Agerstrom, Jens; Rooth, Dan-Olof

    2011-01-01

    This study examined whether automatic stereotypes captured by the implicit association test (IAT) can predict real hiring discrimination against the obese. In an unobtrusive field experiment, job applications were sent to a large number of real job vacancies. The applications were matched on credentials but differed with respect to the applicant's…

  2. An enhanced model for automatically extracting topic phrase from ...

    African Journals Online (AJOL)

    The key benefit foreseen from this automatic document classification is not only related to search engines, but also to many other fields like, document organization, text filtering and semantic index managing. Key words: Keyphrase extraction, machine learning, search engine snippet, document classification, topic tracking ...

  3. An Efficient Metric of Automatic Weight Generation for Properties in Instance Matching Technique

    OpenAIRE

    Seddiqui, Md. Hanif; Nath, Rudra Pratap Deb; Aono, Masaki

    2015-01-01

    The proliferation of heterogeneous data sources of semantic knowledge bases intensifies the need for an automatic instance matching technique. However, the efficiency of instance matching is often influenced by the weight of a property associated with instances. Automatic weight generation is a non-trivial but important task in instance matching. Therefore, identifying an appropriate metric for generating a property weight automatically is nevertheless a formidab...

  4. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings...... differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives...... are described: The forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages...
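    The forward method mentioned above can be sketched in a few lines with dual numbers: each value carries its derivative alongside it, and arithmetic propagates both. This is only a toy illustration of the principle (FADBAD/TADIFF themselves are C++ packages with a far richer operator set).

```python
# Minimal forward-mode automatic differentiation via dual numbers.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u v)' = u' v + u v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f and f' at x by seeding the dual part with 1."""
    out = f(Dual(x, 1.0))
    return out.val, out.der

# f(x) = x^2 + 3x, so f'(x) = 2x + 3; at x = 2 this gives (10, 7)
val, der = derivative(lambda x: x * x + 3 * x, 2.0)
```

The derivative comes out exact (up to floating point), not approximated by finite differences, which is the key appeal of automatic differentiation for validated numerics.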

  5. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes, and provides trouble-shooting aids for, the Automatic Sample Changer electronics on the automatic beta counting system developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then coupled to an amplifier. The amplifier output is discriminated and fed to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, the scaler data, and other information are punched out on a data card. The next sample to be counted is then automatically selected. The beta counter uses the same electronics for each count, the only differences being the sample identification number and the sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and punching the needed data on an 80-column data card.

  6. Trends of progress in medical technics as far as automatization is concerned

    Energy Technology Data Exchange (ETDEWEB)

    Agoston, M [Medicor Muevek, Budapest (Hungary)

    1978-09-01

    Modernization of medical treatment is developing in the direction of establishing large hospitals and polyclinics. Highly productive automatic equipment makes it possible to perform mass examinations with high efficiency. X-ray instruments still form the most valuable and indispensable device group. One direction for developing the automation of these machines is achieving the best X-ray exposure. The relatively slow but continuous spread of isotope diagnostic instruments has likewise been connected with a number of advances in automation. In the field of sterilization, bactericidal materials, gas and radiation sterilization methods, as well as combined systems, have come into use. Automation also has a strong influence on the domain of epidemiology.

  7. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGAII), which is one of the most potent algorithms currently in use...... for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been......In order to design the controllers of tomorrow, a need has risen for tools that can aid in the design of these. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations...

  8. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
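    The basic task, generating a finite operation table that satisfies a set of equations, can be illustrated with a brute-force search. The sketch below enumerates all binary operations on a small domain and keeps those satisfying idempotence and commutativity; the equations are chosen for illustration, and real model generators prune this search far more cleverly.

```python
# Brute-force finite model generation: enumerate every binary operation
# table on an n-element domain and keep those satisfying all equations.
from itertools import product

def find_models(n, equations):
    """Yield operation tables (dicts (x, y) -> z) satisfying every equation."""
    domain = range(n)
    cells = list(product(domain, repeat=2))        # all table entries
    for values in product(domain, repeat=len(cells)):
        op = dict(zip(cells, values))              # one candidate table
        if all(eq(op, domain) for eq in equations):
            yield op

# Equations expressed as predicates over a candidate table:
idempotent = lambda op, dom: all(op[(x, x)] == x for x in dom)
commutative = lambda op, dom: all(op[(x, y)] == op[(y, x)]
                                  for x in dom for y in dom)

# On the 2-element domain, idempotence fixes the diagonal and commutativity
# ties the two off-diagonal cells together, leaving exactly 2 models.
models = list(find_models(2, [idempotent, commutative]))
```

The exponential blow-up (n to the power n squared candidate tables) is exactly why the constraint-propagation techniques used by real model generators matter.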

  9. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller LPC936 is used as the master chip in the scaler. A counter integrated with the microcontroller is configured to operate as an external pulse counter. Software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed as a low-cost, low-power-consumption solution. By now, the automatic scaler has been applied in a surface contamination instrument. (authors)

  10. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  11. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide the grinding tool along the prospective welding path. A skatelike fixture holds a rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. The operator grasps handles to push the rolling fixture along the part. The rollers maintain a precise dimensional relationship so the grinding wheel cuts to a precise depth. The fixture-mounted grinder machines the surface to a quality sufficient for automatic welding; manual welding, with its attendant variations and distortion, is not necessary. Developed to enable automatic welding of parts whose manual welding resulted in a weld bead permeated with microscopic fissures.

  12. Special Issue on Automatic Application Tuning for HPC Architectures

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    2014-01-01

    High Performance Computing architectures have become incredibly complex, and exploiting their full potential is becoming more and more challenging. As a consequence, automatic performance tuning (autotuning) of HPC applications is of growing interest, and many research groups around the world are currently involved. Autotuning is still a rapidly evolving research field with many different approaches being taken. This special issue features selected papers presented at the Dagstuhl seminar on “Automatic Application Tuning for HPC Architectures” in October 2013, which brought together researchers from the areas of autotuning and performance analysis in order to exchange ideas and steer future collaborations.

  13. The development of automatic neutron diffractometry at Harwell

    International Nuclear Information System (INIS)

    Hall, J.W.

    1978-08-01

    Neutron diffractometry contributes substantially to studies of the structure of materials. Scientists at Harwell were among the first to make the collection of diffractometer data automatic and have continued to contribute to this field. This paper outlines the development of automatic neutron diffractometers at Harwell from 1960, and considers the various ANDROMACHE systems up to a hierarchical computer system that is anticipated for 1979. Appendices provide examples of the documentation provided for users of the ANDROMACHE Mark 6 neutron diffractometer system and give brief descriptions of the elements of the programs. (author)

  14. Automatic fuel exchanging device

    International Nuclear Information System (INIS)

    Takahashi, Fuminobu.

    1984-01-01

    Purpose: To enable designation of the identification number of a fuel assembly in a nuclear reactor pressure vessel, so that the designated assembly can be exchanged reliably within a short time. Constitution: The identification number (or letter) stamped on the grip of a fuel assembly is detected by a two-dimensional ultrasonic probe on the pull-up mechanism. When the detected number matches the designated number, a control signal is output, whereby the pull-up drive control mechanism or pull-up mechanism responds by pulling up and exchanging the fuel assembly with the identified number. With such a constitution, the fuel assembly can be recognized rapidly and reliably even if the stamped letters deviate to the left or right of the probe, and further, the hinge portion and the signal processing portion can be simplified. (Horiuchi, T.)

  15. Accuracy of Automatic Cephalometric Software on Landmark Identification

    Science.gov (United States)

    Anuwongnukroh, N.; Dechkunakorn, S.; Damrongsri, S.; Nilwarat, C.; Pudpong, N.; Radomsutthisarn, W.; Kangern, S.

    2017-11-01

    This study was to assess the accuracy of an automatic cephalometric analysis software in the identification of cephalometric landmarks. Thirty randomly selected digital lateral cephalograms of patients undergoing orthodontic treatment were used in this study. Thirteen landmarks (S, N, Or, A-point, U1T, U1A, B-point, Gn, Pog, Me, Go, L1T, and L1A) were identified on the digital image by the automatic cephalometric software and on cephalometric tracing by the manual method. Superimposition of the printed image and the manual tracing was done by registration at the soft-tissue profiles. The accuracy of landmarks located by the automatic method was compared with that of the manually identified landmarks by measuring the mean differences of distances of each landmark on the Cartesian plane, where the X and Y coordinate axes passed through the center of the ear rod. A one-sample t-test was used to evaluate the mean differences. Statistically significant mean differences were found in both horizontal and vertical directions; small mean differences were found for most landmarks, while the largest mean difference was found for A-point (3.04 mm) in the vertical direction. Only 5 of 13 landmarks (38.46%; S, N, Gn, Pog, and Go) showed no significant mean difference between the automatic and manual landmarking methods. It is concluded that if this automatic cephalometric analysis software is used for orthodontic diagnosis, the orthodontist must correct or modify the position of landmarks in order to increase the accuracy of cephalometric analysis.
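    The comparison described above reduces to measuring, per landmark, the distance between the automatic and manual coordinates on a shared X/Y frame. A minimal sketch of that computation follows; the landmark coordinates are invented for illustration, not the study's data.

```python
# Per-landmark error between automatic and manual landmarking, measured as
# Euclidean distance on a common coordinate frame (same units as the input).
import math

def landmark_errors(auto_pts, manual_pts):
    """Distance per landmark name between the two identification methods."""
    return {name: math.dist(auto_pts[name], manual_pts[name])
            for name in auto_pts}

# Hypothetical coordinates (mm) for three of the thirteen landmarks
auto_pts = {"S": (10.0, 20.0), "N": (30.0, 40.5), "A-point": (12.0, 25.0)}
manual_pts = {"S": (10.0, 20.0), "N": (30.0, 40.0), "A-point": (12.0, 22.0)}
errors = landmark_errors(auto_pts, manual_pts)
```

Averaging such distances over all cephalograms, and testing them against zero, yields exactly the kind of per-landmark mean-difference table the study reports.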

  16. The Potential of Automatic Word Comparison for Historical Linguistics.

    Science.gov (United States)

    List, Johann-Mattis; Greenhill, Simon J; Gray, Russell D

    2017-01-01

    The amount of data from languages spoken all over the world is rapidly increasing. Traditional manual methods in historical linguistics need to face the challenges brought by this influx of data. Automatic approaches to word comparison could provide invaluable help to pre-analyze data which can be later enhanced by experts. In this way, computational approaches can take care of the repetitive and schematic tasks, leaving experts to concentrate on answering interesting questions. Here we test the potential of automatic methods to detect etymologically related words (cognates) in cross-linguistic data. Using a newly compiled database of expert cognate judgments across five different language families, we compare how well different automatic approaches distinguish related from unrelated words. Our results show that automatic methods can identify cognates with a very high degree of accuracy, reaching 89% for the best-performing method, Infomap. We identify the specific strengths and weaknesses of these different methods and point to major challenges for future approaches. Current automatic approaches for cognate detection, although not perfect, could become an important component of future research in historical linguistics.
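    At its simplest, automatic cognate detection scores pairs of words by form similarity and thresholds the score. The toy sketch below uses a raw string-similarity ratio; the word pairs and the 0.5 threshold are illustrative, and methods like the Infomap approach cited above operate on sound-class alignments rather than spelling.

```python
# Toy cognate screening: keep word pairs whose surface similarity
# exceeds a threshold. Real methods align sound classes, not letters.
from difflib import SequenceMatcher

def similarity(a, b):
    """Crude form similarity in [0, 1] between two words."""
    return SequenceMatcher(None, a, b).ratio()

def likely_cognates(pairs, threshold=0.5):
    """Filter candidate pairs by similarity score."""
    return [(a, b) for a, b in pairs if similarity(a, b) >= threshold]

# English/German pairs: the first two are true cognates; "dog"/"Hund"
# are not related, and their surface forms also differ strongly.
pairs = [("night", "nacht"), ("hand", "hand"), ("dog", "hund")]
found = likely_cognates(pairs)
```

The example also shows the method's known weakness: surface similarity misses true cognates whose forms have diverged and can be fooled by chance resemblances, which is why expert post-editing remains part of the workflow.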

  17. Metabolic changes in occipital lobe epilepsy with automatisms

    Directory of Open Access Journals (Sweden)

    Chong H Wong

    2014-07-01

    Purpose: Some studies suggest that the pattern of glucose hypometabolism relates not only to the ictal-onset zone, but also reflects seizure propagation. We investigated metabolic changes in patients with occipital lobe epilepsy (OLE) that may reflect propagation of ictal discharge during seizures with automatisms. Methods: Fifteen patients who had undergone epilepsy surgery for intractable OLE and had undergone interictal Fluorine-18-fluorodeoxyglucose positron emission tomography (18F-FDG-PET) between 1994 and 2004 were divided into two groups (with and without automatisms during seizure). Significant regions of hypometabolism were identified by comparing 18F-FDG-PET results from each group with 16 healthy controls by using Statistical Parametric Mapping (SPM 2). Key Findings: Significant hypometabolism was confined largely to the epileptogenic occipital lobe in the patient group without automatisms. In patients with automatisms, glucose hypometabolism extended from the epileptogenic occipital lobe into the ipsilateral temporal lobe. Significance: We identified a distinctive hypometabolic pattern that was specific for OLE patients with automatisms during a seizure. This finding supports the postulate that seizure propagation is a cause of glucose hypometabolism beyond the region of seizure onset.

  18. Metabolic changes in occipital lobe epilepsy with automatisms.

    Science.gov (United States)

    Wong, Chong H; Mohamed, Armin; Wen, Lingfeng; Eberl, Stefan; Somerville, Ernest; Fulham, Michael; Bleasel, Andrew F

    2014-01-01

    Some studies suggest that the pattern of glucose hypometabolism relates not only to the ictal-onset zone but also reflects seizure propagation. We investigated metabolic changes in patients with occipital lobe epilepsy (OLE) that may reflect propagation of ictal discharge during seizures with automatisms. Fifteen patients who had undergone epilepsy surgery for intractable OLE and had undergone interictal Fluorine-18-fluorodeoxyglucose positron-emission tomography ((18)F-FDG-PET) between 1994 and 2004 were divided into two groups (with and without automatisms during seizure). Significant regions of hypometabolism were identified by comparing (18)F-FDG-PET results from each group with 16 healthy controls by using statistical parametric mapping. Significant hypometabolism was confined largely to the epileptogenic occipital lobe in the patient group without automatisms. In patients with automatisms, glucose hypometabolism extended from the epileptogenic occipital lobe into the ipsilateral temporal lobe. We identified a distinctive hypometabolic pattern that was specific for OLE patients with automatisms during a seizure. This finding supports the postulate that seizure propagation is a cause of glucose hypometabolism beyond the region of seizure onset.

  19. Automatic detection and classification of damage zone(s) for incorporating in digital image correlation technique

    Science.gov (United States)

    Bhattacharjee, Sudipta; Deb, Debasis

    2016-07-01

    Digital image correlation (DIC) is a technique developed for monitoring surface deformation/displacement of an object under loading conditions. This method is further refined to make it capable of handling discontinuities on the surface of the sample. A damage zone refers to a surface area that has fractured and opened in the course of loading. In this study, an algorithm is presented to automatically detect multiple damage zones in a deformed image. The algorithm identifies the pixels located inside these zones and eliminates them from the FEM-DIC process. The proposed algorithm is successfully applied to several damaged samples to estimate the displacement fields of an object under loading conditions. This study shows that the resulting displacement fields represent the damage conditions reasonably well compared to the regular FEM-DIC technique that does not account for damage zones.
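The detection step can be sketched as a threshold-and-label pass over the image; the dark-pixel threshold, the minimum-area filter, and the synthetic crack image below are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np
from scipy import ndimage

def detect_damage_zones(image, dark_thresh=50, min_area=20):
    """Flag pixels inside damage (fracture) zones so they can be
    excluded from the DIC correlation subsets.  Assumes damage zones
    appear as dark, connected regions (hypothetical criterion)."""
    mask = image < dark_thresh              # candidate damaged pixels
    labels, n = ndimage.label(mask)         # connected components
    keep = np.zeros_like(mask)
    for i in range(1, n + 1):
        region = labels == i
        if region.sum() >= min_area:        # ignore tiny speckles
            keep |= region
    return keep                             # True = exclude from DIC

# synthetic image: bright background with one dark crack (4 x 40 pixels)
img = np.full((64, 64), 200, dtype=np.uint8)
img[30:34, 10:50] = 10
zone = detect_damage_zones(img)
```

The returned mask would then be used to drop the flagged pixels from the FEM-DIC displacement computation.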

  20. The automatic lumber planing mill

    Science.gov (United States)

    Peter Koch

    1957-01-01

    It is probable that a truly automatic planing operation could be devised if some of the variables commonly present in mill-run lumber were eliminated and the remaining variables kept under close control. This paper will deal with the more general situation faced by most lumber manufacturing plants. In other words, it will be assumed that the incoming lumber has...

  1. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpaolo

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  2. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.
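The PREPARE-statement idea this transformation targets can be illustrated with Python's sqlite3 driver (a stand-in for the paper's legacy web-application setting; the table and injection payload are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE users (name TEXT, role TEXT)")
cur.execute("INSERT INTO users VALUES (?, ?)", ("alice", "admin"))

# Unsafe legacy pattern (string concatenation) that the transformation
# replaces:  "SELECT role FROM users WHERE name = '" + name + "'"
# Safe parameterized form: the driver treats `name` strictly as data.
name = "alice' OR '1'='1"                        # classic injection payload
cur.execute("SELECT role FROM users WHERE name = ?", (name,))
rows = cur.fetchall()                            # payload matches nothing

cur.execute("SELECT role FROM users WHERE name = ?", ("alice",))
ok = cur.fetchall()                              # legitimate lookup succeeds
```

With placeholders, the malicious string can never change the query's structure, which is exactly why rewriting legacy code to this form removes the injection vector.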

  3. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used is a combination of two methods: the maximum correlation coefficient and correlation in the subpixel range. ... The interactive software is also part of a computer-assisted learning program on digital photogrammetry.
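A minimal 1-D sketch of the two combined methods (maximum correlation coefficient, then subpixel refinement); the parabolic peak interpolation is one common subpixel scheme and, like the synthetic signals, is an assumption for illustration:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation coefficient of two equal-length windows."""
    a = a - a.mean()
    b = b - b.mean()
    den = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if den == 0 else float((a * b).sum() / den)

def locate_target(search, template):
    """Shift of `template` inside `search` with the maximum correlation
    coefficient, refined to subpixel precision by fitting a parabola
    through the peak score and its two neighbours."""
    m = len(template)
    scores = np.array([ncc(search[i:i + m], template)
                       for i in range(len(search) - m + 1)])
    k = int(scores.argmax())
    if 0 < k < len(scores) - 1:
        y0, y1, y2 = scores[k - 1], scores[k], scores[k + 1]
        return k + 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    return float(k)

t = np.linspace(0, 1, 9)
template = np.exp(-((t - 0.5) ** 2) / 0.02)      # symmetric target profile
signal = np.zeros(50)
signal[20:29] += template                        # true position = 20
pos = locate_target(signal, template)
```

In 2-D target measurement the same idea applies along both image axes.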

  4. Automatic analysis of ultrasonic data

    International Nuclear Information System (INIS)

    Horteur, P.; Colin, J.; Benoist, P.; Bonis, M.; Paradis, L.

    1986-10-01

    This paper describes an automatic, self-contained data processing system, transportable on site, able to produce images such as "A-scan" and "B-scan" and to present the results of an inspection very quickly. It can be used for pressure vessel inspection.

  5. Automatic Tamil lyric generation based on ontological interpretation ...

    Indian Academy of Sciences (India)

    This system proposes an n-gram based approach to automatic Tamil lyric generation, by the ontological semantic interpretation of the input scene. The approach is based on identifying the semantics conveyed in the scenario, thereby making the system understand the situation and generate lyrics accordingly. The heart of ...

  6. A Machine Vision System for Automatically Grading Hardwood Lumber - (Industrial Metrology)

    Science.gov (United States)

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas T. Drayer; Philip A. Araman; Robert L. Brisbon

    1992-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  7. TMB: Automatic Differentiation and Laplace Approximation

    Directory of Open Access Journals (Sweden)

    Kasper Kristensen

    2016-04-01

    TMB is an open source R package that enables quick implementation of complex nonlinear random effects (latent variable) models in a manner similar to the established AD Model Builder package (ADMB, http://admb-project.org/; Fournier et al. 2011). In addition, it offers easy access to parallel computations. The user defines the joint likelihood for the data and the random effects as a C++ template function, while all other operations are done in R, e.g., reading in the data. The package evaluates and maximizes the Laplace approximation of the marginal likelihood, where the random effects are automatically integrated out. This approximation, and its derivatives, are obtained using automatic differentiation (up to order three) of the joint likelihood. The computations are designed to be fast for problems with many random effects (≈ 10^6) and parameters (≈ 10^3). Computation times using ADMB and TMB are compared on a suite of examples ranging from simple models to large spatial models where the random effects are a Gaussian random field. Speedups ranging from 1.5 to about 100 are obtained, with increasing gains for large problems. The package and examples are available at http://tmb-project.org/.
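The Laplace step can be illustrated on a toy one-random-effect Gaussian model (an assumption for illustration; TMB itself does this with C++ templates and automatic differentiation). For this Gaussian toy model the approximation happens to be exact, which makes it easy to check:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy model: observation y ~ N(u, 1), random effect u ~ N(0, s^2).

def nll_joint(u, y, s):
    """Joint negative log-likelihood  -log p(y|u) - log p(u)."""
    return (0.5 * (y - u) ** 2 + 0.5 * np.log(2 * np.pi)
            + 0.5 * (u / s) ** 2 + 0.5 * np.log(2 * np.pi * s ** 2))

def laplace_marginal_nll(y, s):
    """Laplace approximation of -log ∫ exp(-nll_joint(u)) du:
    inner optimization over u, then a Gaussian correction via the
    Hessian at the mode (here constant: 1 + 1/s^2)."""
    res = minimize_scalar(lambda u: nll_joint(u, y, s))
    u_hat = res.x
    h = 1.0 + 1.0 / s ** 2
    return nll_joint(u_hat, y, s) - 0.5 * np.log(2 * np.pi) + 0.5 * np.log(h)

# Exact marginal: y ~ N(0, s^2 + 1)
y, s = 1.3, 2.0
exact = 0.5 * np.log(2 * np.pi * (s ** 2 + 1)) + 0.5 * y ** 2 / (s ** 2 + 1)
approx = laplace_marginal_nll(y, s)
```

TMB performs the same inner optimization and Hessian computation for thousands to millions of random effects at once, with derivatives supplied by automatic differentiation.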

  8. Automatic component calibration and error diagnostics for model-based accelerator control. Phase I final report

    International Nuclear Information System (INIS)

    Carl Stern; Martin Lee

    1999-01-01

    Phase I work studied the feasibility of developing software for automatic component calibration and error correction in beamline optics models. A prototype application was developed that corrects quadrupole field strength errors in beamline models

  9. Automatic component calibration and error diagnostics for model-based accelerator control. Phase I final report

    CERN Document Server

    Carl-Stern

    1999-01-01

    Phase I work studied the feasibility of developing software for automatic component calibration and error correction in beamline optics models. A prototype application was developed that corrects quadrupole field strength errors in beamline models.

  10. Automatic Texture and Orthophoto Generation from Registered Panoramic Views

    DEFF Research Database (Denmark)

    Krispel, Ulrich; Evers, Henrik Leander; Tamke, Martin

    2015-01-01

    In order to detect these elements, we developed a method that utilizes range data and color information from high-resolution panoramic images of indoor scenes, taken at the scanner's position. A proxy geometry is derived from the point clouds; orthographic views of the scene are automatically identified from the geometry and an image per view is created via projection. We combine methods of computer vision to train a classifier to detect the objects of interest from these orthographic views. Furthermore, these views can be used for automatic texturing of the proxy geometry.

  11. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    Title: Automatic differentiation algorithms in model analysis
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods

  12. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  13. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques

  14. Automatic measurement of target crossing speed

    Science.gov (United States)

    Wardell, Mark; Lougheed, James H.

    1992-11-01

    The motion of ground vehicle targets after a ballistic round is launched can be a major source of inaccuracy for small (handheld) anti-armour weapon systems. A method of automatically measuring the crossing component to compensate the fire control solution has been devised and tested against various targets in a range of environments. A photodetector array aligned with the sight's horizontal reticle obtains scene features, which are digitized and processed to separate target from sight motion. Relative motion of the target against the background is briefly monitored to deduce angular crossing rate and a compensating lead angle is introduced into the aim point. Research to gather quantitative data and optimize algorithm performance is described, and some results from field testing are presented.
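The crossing-rate measurement from successive detector-array scans can be sketched as a cross-correlation peak search (an illustrative simplification: the `dt` and `pixel_angle` values are made up, and the paper's algorithm additionally separates target motion from sight motion):

```python
import numpy as np

def crossing_rate(scan1, scan2, dt, pixel_angle):
    """Angular crossing rate estimated from two scans of a linear
    photodetector array taken `dt` seconds apart.  The target's shift in
    pixels is the lag of the cross-correlation peak of the mean-removed
    scans; `pixel_angle` (rad/pixel) converts it to an angular rate."""
    a = scan1 - scan1.mean()
    b = scan2 - scan2.mean()
    corr = np.correlate(b, a, mode="full")
    shift = int(corr.argmax()) - (len(a) - 1)    # pixels moved between scans
    return shift * pixel_angle / dt              # rad/s

scan1 = np.zeros(100)
scan1[40:45] = 1.0                               # target signature
scan2 = np.roll(scan1, 6)                        # moved 6 pixels right
rate = crossing_rate(scan1, scan2, dt=0.01, pixel_angle=0.001)
```

The estimated rate then feeds the lead-angle correction in the fire control solution.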

  15. Template-based automatic extraction of the joint space of foot bones from CT scan

    Science.gov (United States)

    Park, Eunbi; Kim, Taeho; Park, Jinah

    2016-03-01

    Clean bone segmentation is critical in studying the joint anatomy for measuring the spacing between the bones. However, separation of the coupled bones in CT images is sometimes difficult due to ambiguous gray values coming from the noise and the heterogeneity of bone materials, as well as narrowing of the joint space. For fine reconstruction of the individual local boundaries, manual operation is a common practice, where segmentation remains a bottleneck. In this paper, we present an automatic method for extracting the joint space by applying graph cut on a Markov random field model to the region of interest (ROI), which is identified by a template of 3D bone structures. The template includes an encoded articular surface which identifies the tight region of the high-intensity bone boundaries together with the fuzzy joint area of interest. The localized shape information from the template model within the ROI effectively separates the nearby bones. By narrowing the ROI down to a region including two types of tissue, the object extraction problem was reduced to binary segmentation and solved via graph cut. Based on the shape of a joint space marked by the template, the hard constraint was set by the initial seeds, which were automatically generated from thresholding and morphological operations. The performance and the robustness of the proposed method are evaluated on 12 volumes of ankle CT data, where each volume includes a set of 4 tarsal bones (calcaneus, talus, navicular and cuboid).

  16. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  17. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    Brun, R.; Rademakers, F.

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and objects inspected, also when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file

  18. Automatic digitization of SMA data

    Science.gov (United States)

    Väänänen, Mika; Tanskanen, Eija

    2017-04-01

    In the 1970s and 1980s the Scandinavian Magnetometer Array produced large amounts of excellent data from over 30 stations in Norway, Sweden and Finland. 620 film reels and 20 kilometers of film have been preserved, and the longest time series produced in the campaign spans almost uninterrupted for five years, but the data has never seen widespread use due to the choice of medium. Film is a difficult medium to digitize efficiently. Previously, events of interest were searched for by hand, and digitization was done by projecting the film on paper and plotting it by hand. We propose a method of automatically digitizing geomagnetic data stored on film and extracting the numerical values from the digitized data. The automatic digitization process helps preserve old, valuable data that might otherwise go unused.
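One plausible core step of such a pipeline, recovering a trace's y-position for every film column, can be sketched as follows (an assumption for illustration; the abstract does not describe the SMA processing chain at this level):

```python
import numpy as np

def trace_from_film(img, dark_thresh=128):
    """Recover a magnetogram trace from a scanned film frame.  For every
    pixel column, take the intensity-weighted centroid of the dark
    (trace) pixels as the curve's y-position; columns with no dark
    pixels yield NaN."""
    h, w = img.shape
    dark = (img < dark_thresh).astype(float)
    ys = np.arange(h, dtype=float)
    y = np.full(w, np.nan)
    for x in range(w):
        col = dark[:, x]
        if col.sum() > 0:
            y[x] = (ys * col).sum() / col.sum()
    return y

# synthetic frame: white background, dark sine trace
h, w = 64, 128
img = np.full((h, w), 255, dtype=np.uint8)
xs = np.arange(w)
rows = (32 + 10 * np.sin(2 * np.pi * xs / w)).astype(int)
img[rows, xs] = 0
y = trace_from_film(img)
```

The recovered pixel positions would then be scaled by the instrument's calibration to obtain magnetic field values.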

  19. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provides dose-response curves that show linearity under logistic transformation (Rodbard). This transformation, which is applicable to radioimmunoassay, should be useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double antibody technique was written by Dr. Kawamata. This approach proved useful for the automatic computation of data derived from double antibody assays of insulin and C-peptides. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)

  20. Automatic Conflict Detection on Contracts

    Science.gov (United States)

    Fenech, Stephen; Pace, Gordon J.; Schneider, Gerardo

    Many software applications are based on collaborating, yet competing, agents or virtual organisations exchanging services. Contracts, expressing obligations, permissions and prohibitions of the different actors, can be used to protect the interests of the organisations engaged in such service exchange. However, the potentially dynamic composition of services with different contracts, and the combination of service contracts with local contracts can give rise to unexpected conflicts, exposing the need for automatic techniques for contract analysis. In this paper we look at automatic analysis techniques for contracts written in the contract language CL. We present a trace semantics of CL suitable for conflict analysis, and a decision procedure for detecting conflicts (together with its proof of soundness, completeness and termination). We also discuss its implementation and look into the applications of the contract analysis approach we present. These techniques are applied to a small case study of an airline check-in desk.

  1. MOS voltage automatic tuning circuit

    OpenAIRE

    李, 田茂; 中田, 辰則; 松本, 寛樹

    2004-01-01

    An automatic tuning circuit adjusts frequency performance to compensate for process variation. A phase-locked loop (PLL) is a suitable oscillator for an integrated circuit. It is a feedback system that compares the input phase with the output phase, and it can make the output frequency equal to the input frequency. In this paper, a PLL formed of MOSFETs is presented. The presented circuit consists of an XOR circuit, a low-pass filter and a relaxation oscillator. On PSPICE simulation...

  2. The Mark II Automatic Diflux

    Directory of Open Access Journals (Sweden)

    Jean L Rasson

    2011-07-01

    We report here on the new realization of an automatic fluxgate theodolite able to perform unattended absolute geomagnetic declination and inclination measurements: the AUTODIF MKII. The main changes of this version compared with the former one are presented as well as the better specifications we expect now. We also explain the absolute orientation procedure by means of a laser beam and a corner cube and the method for leveling the fluxgate sensor, which is different from a conventional DIflux theodolite.

  3. CLG for Automatic Image Segmentation

    OpenAIRE

    Christo Ananth; S.Santhana Priya; S.Manisha; T.Ezhil Jothi; M.S.Ramasubhaeswari

    2017-01-01

    This paper proposes an automatic segmentation method which effectively combines the Active Contour Model, the Live Wire method and the Graph Cut approach (CLG). The aim of the Live Wire method is to give the user control over the segmentation process during execution. The Active Contour Model provides a statistical model of object shape and appearance, built during a training phase, to a new image. In the graph cut technique, each pixel is represented as a node and the distance between those nodes is rep...

  4. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of the Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; and ALGOL 68. Other chapters discuss a general-purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming languages is also included.

  5. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.
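As a small, self-contained analogue (not the patent's actual netlist format), the transfer function of an RC low-pass filter can be computed by assembling and solving its nodal matrix at each frequency:

```python
import numpy as np

def rc_lowpass_tf(R, C, freqs_hz):
    """Evaluate the voltage transfer function of an RC low-pass filter
    by assembling and solving its nodal equations at each frequency.
    Unknowns: node voltages v1, v2 and the source current."""
    H = []
    for f in freqs_hz:
        w = 2 * np.pi * f
        G = 1.0 / R
        # rows: KCL at node 1, KCL at node 2, source constraint v1 = 1 V
        A = np.array([[G, -G, 1.0],
                      [-G, G + 1j * w * C, 0.0],
                      [1.0, 0.0, 0.0]], dtype=complex)
        b = np.array([0.0, 0.0, 1.0], dtype=complex)
        v = np.linalg.solve(A, b)
        H.append(v[1])               # output voltage = H(jw) for 1 V input
    return np.array(H)

R, C = 1e3, 1e-6                     # cutoff fc = 1/(2*pi*R*C) ≈ 159 Hz
fc = 1 / (2 * np.pi * R * C)
H = rc_lowpass_tf(R, C, [0.0, fc])   # |H| = 1 at DC, 1/sqrt(2) at fc
```

The same matrix-assembly-and-solve pattern generalizes from this two-node circuit to an arbitrary netlist.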

  6. Automatic wipers with mist control

    OpenAIRE

    Ashik K.P; A.N.Basavaraju

    2016-01-01

    This paper illustrates automatic wipers with mist control. In modern days, accidents are common in commercial vehicles, and one of their causes is the formation of mist inside the vehicle due to heavy rain. In rainy seasons, the driver of a commercial vehicle has to control the windshield wiper himself, which distracts his concentration from driving. Also, when the rain lasts for a longer time (say about 15 minutes), the formation of mist on t...

  7. How CBO Estimates Automatic Stabilizers

    Science.gov (United States)

    2015-11-01

    the economy. Most types of revenues (mainly personal, corporate, and social insurance taxes) are sensitive to the business cycle and account for most of ... Medicare taxes for self-employed people, taxes on production and imports, and unemployment insurance taxes. Those six categories account for the bulk of ... federal tax revenues. Individual taxes account for most of the automatic stabilizers from revenues, followed by Social Security plus Medicare

  8. Group Dynamics in Automatic Imitation.

    Science.gov (United States)

    Gleibs, Ilka H; Wilson, Neil; Reddy, Geetha; Catmur, Caroline

    Imitation, matching the configural body movements of another individual, plays a crucial part in social interaction. We investigated whether automatic imitation is influenced not only by whom we imitate (ingroup vs. outgroup member) but also by the nature of an expected interaction situation (competitive vs. cooperative). In line with assumptions from Social Identity Theory, we predicted that both social group membership and the expected situation impact the level of automatic imitation. We adopted a 2 (group membership target: ingroup, outgroup) x 2 (situation: cooperative, competitive) design. The dependent variable was the degree to which participants imitated the target in a reaction-time automatic imitation task. 99 female students from two British universities participated. We found a significant two-way interaction on the imitation effect. When interacting in expectation of cooperation, imitation was stronger for an ingroup target than for an outgroup target. This was not the case in the competitive condition, where imitation did not differ between ingroup and outgroup targets. This demonstrates that the goal structure of an expected interaction determines the extent to which intergroup relations influence imitation, supporting a social identity approach.

  9. Automatic programming for critical applications

    Science.gov (United States)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Usually, execution performance is an expected requirement in a software development process. Unfortunately, the verification and maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plants. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  10. A Review of Automatic Methods Based on Image Processing Techniques for Tuberculosis Detection from Microscopic Sputum Smear Images.

    Science.gov (United States)

    Panicker, Rani Oomman; Soman, Biju; Saini, Gagan; Rajan, Jeny

    2016-01-01

    Tuberculosis (TB) is an infectious disease caused by the bacteria Mycobacterium tuberculosis. It primarily affects the lungs, but it can also affect other parts of the body. TB remains one of the leading causes of death in developing countries, and its recent resurgences in both developed and developing countries warrant global attention. The number of deaths due to TB is very high (as per the WHO report, 1.5 million died in 2013), although most are preventable if diagnosed early and treated. There are many tools for TB detection, but the most widely used one is sputum smear microscopy. It is done manually and is often time consuming; a laboratory technician is expected to spend at least 15 min per slide, limiting the number of slides that can be screened. Many countries, including India, have a dearth of properly trained technicians, and they often fail to detect TB cases due to the stress of a heavy workload. Automatic methods are generally considered as a solution to this problem. Attempts have been made to develop automatic approaches to identify TB bacteria from microscopic sputum smear images. In this paper, we provide a review of automatic methods based on image processing techniques published between 1998 and 2014. The review shows that the accuracy of algorithms for the automatic detection of TB increased significantly over the years and gladly acknowledges that commercial products based on published works also started appearing in the market. This review could be useful to researchers and practitioners working in the field of TB automation, providing a comprehensive and accessible overview of methods of this field of research.

  11. Individual differences in automatic emotion regulation affect the asymmetry of the LPP component.

    Directory of Open Access Journals (Sweden)

    Jing Zhang

    The main goal of this study was to investigate how automatic emotion regulation altered the hemispheric asymmetry of ERPs elicited by emotion processing. We examined the effect of individual differences in automatic emotion regulation on the late positive potential (LPP) when participants were viewing blocks of positive high arousal, positive low arousal, negative high arousal and negative low arousal pictures from the International Affective Picture System (IAPS). Two participant groups were categorized by the Emotion Regulation-Implicit Association Test, which has been used in previous research to identify two groups of participants with automatic emotion control and with automatic emotion express. The main finding was that the automatic emotion express group showed a right dominance of the LPP component at posterior electrodes, especially in high arousal conditions. But no right dominance of the LPP component was observed for the automatic emotion control group. We also found the group with automatic emotion control showed no differences in the right posterior LPP amplitude between high- and low-arousal emotion conditions, while the participants with automatic emotion express showed larger LPP amplitude in the right posterior in high-arousal conditions compared to low-arousal conditions. This result suggested that AER (automatic emotion regulation) modulated the hemispheric asymmetry of LPP on posterior electrodes and supported the right hemisphere hypothesis.

  12. Individual differences in automatic emotion regulation affect the asymmetry of the LPP component.

    Science.gov (United States)

    Zhang, Jing; Zhou, Renlai

    2014-01-01

    The main goal of this study was to investigate how automatic emotion regulation altered the hemispheric asymmetry of ERPs elicited by emotion processing. We examined the effect of individual differences in automatic emotion regulation on the late positive potential (LPP) when participants were viewing blocks of positive high arousal, positive low arousal, negative high arousal and negative low arousal pictures from International affect picture system (IAPS). Two participant groups were categorized by the Emotion Regulation-Implicit Association Test which has been used in previous research to identify two groups of participants with automatic emotion control and with automatic emotion express. The main finding was that automatic emotion express group showed a right dominance of the LPP component at posterior electrodes, especially in high arousal conditions. But no right dominance of the LPP component was observed for automatic emotion control group. We also found the group with automatic emotion control showed no differences in the right posterior LPP amplitude between high- and low-arousal emotion conditions, while the participants with automatic emotion express showed larger LPP amplitude in the right posterior in high-arousal conditions compared to low-arousal conditions. This result suggested that AER (Automatic emotion regulation) modulated the hemispheric asymmetry of LPP on posterior electrodes and supported the right hemisphere hypothesis.

  13. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...

  14. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  15. System for automatic detection of lung nodules exhibiting growth

    Science.gov (United States)

    Novak, Carol L.; Shen, Hong; Odry, Benjamin L.; Ko, Jane P.; Naidich, David P.

    2004-05-01

    Lung nodules that exhibit growth over time are considered highly suspicious for malignancy. We present a completely automated system for detection of growing lung nodules, using initial and follow-up multi-slice CT studies. The system begins with automatic detection of lung nodules in the later CT study, generating a preliminary list of candidate nodules. Next an automatic system for registering locations in two studies matches each candidate in the later study to its corresponding position in the earlier study. Then a method for automatic segmentation of lung nodules is applied to each candidate and its matching location, and the computed volumes are compared. The output of the system is a list of nodule candidates that are new or have exhibited volumetric growth since the previous scan. In a preliminary test of 10 patients examined by two radiologists, the automatic system identified 18 candidates as growing nodules. 7 (39%) of these corresponded to validated nodules or other focal abnormalities that exhibited growth. 4 of the 7 true detections had not been identified by either of the radiologists during their initial examinations of the studies. This technique represents a powerful method of surveillance that may reduce the probability of missing subtle or early malignant disease.
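    The final comparison step of the pipeline can be sketched as follows; the 25% growth threshold and the 90-day scan interval below are illustrative values, not the paper's.

```python
import math

def growth_assessment(vol_prev_mm3, vol_curr_mm3, interval_days, rel_threshold=0.25):
    """Flag a nodule candidate as growing when its segmented volume has
    increased by more than rel_threshold between the two CT studies, and
    estimate the volume doubling time under an exponential-growth assumption."""
    rel_change = (vol_curr_mm3 - vol_prev_mm3) / vol_prev_mm3
    growing = rel_change > rel_threshold
    doubling_days = None
    if vol_curr_mm3 > vol_prev_mm3:
        doubling_days = interval_days * math.log(2) / math.log(vol_curr_mm3 / vol_prev_mm3)
    return growing, doubling_days

growing, dt = growth_assessment(vol_prev_mm3=100.0, vol_curr_mm3=200.0, interval_days=90)
# a nodule that exactly doubled over 90 days is flagged, with a 90-day doubling time
```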

  16. Automatic Modulation Recognition by Support Vector Machines Using Wavelet Kernel

    Energy Technology Data Exchange (ETDEWEB)

    Feng, X Z; Yang, J; Luo, F L; Chen, J Y; Zhong, X P [College of Mechatronic Engineering and Automation, National University of Defense Technology, Changsha (China)

    2006-10-15

    Automatic modulation identification plays a significant role in electronic warfare, electronic surveillance systems and electronic countermeasures. The task of modulation recognition of communication signals is to determine the modulation type and signal parameters. In fact, automatic modulation identification can be regarded as an application of pattern recognition in the communications field. The support vector machine (SVM) is a universal learning machine widely used in the fields of pattern recognition, regression estimation and probability density estimation. In this paper, a new method using a wavelet kernel function is proposed, which maps the input vector x_i into a high-dimensional feature space F. In this feature space F, we can construct the optimal hyperplane that realizes the maximal margin. That is to say, SVM can be used to classify the communication signals into two groups, namely analogue modulated signals and digitally modulated signals. Finally, computer simulation results are given, which show the good performance of the method.
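    The wavelet kernel referred to here is commonly constructed as a product of Morlet-type mother wavelets, k(x, y) = prod_i cos(1.75*(x_i - y_i)/a) * exp(-(x_i - y_i)^2 / (2 a^2)). A numpy sketch; the dilation factor a and the sample vectors are arbitrary choices, not values from the paper.

```python
import numpy as np

def wavelet_kernel(X, Y, a=1.0):
    """Translation-invariant wavelet kernel built from the Morlet-type mother
    wavelet h(u) = cos(1.75 u) exp(-u^2 / 2), with dilation factor a.
    Returns the Gram matrix between the row vectors of X and Y."""
    u = (X[:, None, :] - Y[None, :, :]) / a      # pairwise differences, scaled
    return np.prod(np.cos(1.75 * u) * np.exp(-u**2 / 2.0), axis=-1)

X = np.array([[0.0, 1.0], [0.5, -0.2], [1.2, 0.3]])
K = wavelet_kernel(X, X, a=1.5)
```

The resulting Gram matrix is symmetric with unit diagonal, and can be passed to any SVM implementation that accepts precomputed kernels.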

  17. Automatic Modulation Recognition by Support Vector Machines Using Wavelet Kernel

    International Nuclear Information System (INIS)

    Feng, X Z; Yang, J; Luo, F L; Chen, J Y; Zhong, X P

    2006-01-01

    Automatic modulation identification plays a significant role in electronic warfare, electronic surveillance systems and electronic countermeasures. The task of modulation recognition of communication signals is to determine the modulation type and signal parameters. In fact, automatic modulation identification can be regarded as an application of pattern recognition in the communications field. The support vector machine (SVM) is a universal learning machine widely used in the fields of pattern recognition, regression estimation and probability density estimation. In this paper, a new method using a wavelet kernel function is proposed, which maps the input vector x_i into a high-dimensional feature space F. In this feature space F, we can construct the optimal hyperplane that realizes the maximal margin. That is to say, SVM can be used to classify the communication signals into two groups, namely analogue modulated signals and digitally modulated signals. Finally, computer simulation results are given, which show the good performance of the method

  18. Automatic anatomically selective image enhancement in digital chest radiography

    International Nuclear Information System (INIS)

    Sezan, M.I.; Minerbo, G.N.; Schaetzing, R.

    1989-01-01

    The authors develop a technique for automatic anatomically selective enhancement of digital chest radiographs. Anatomically selective enhancement is motivated by the desire to simultaneously meet the different enhancement requirements of the lung field and the mediastinum. A recent peak detection algorithm and a set of rules are applied to the image histogram to determine automatically a gray-level threshold between the lung field and mediastinum. The gray-level threshold facilitates anatomically selective gray-scale modification and/or unsharp masking. Further, in an attempt to suppress possible white-band or black-band artifacts due to unsharp masking at sharp edges, local-contrast adaptivity is incorporated into anatomically selective unsharp masking by designing an anatomy-sensitive emphasis parameter which varies asymmetrically with positive and negative values of the local image contrast
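    The gray-level thresholding step can be sketched as picking the deepest valley between the two dominant histogram peaks; the simple one-neighbour peak test and the toy histogram below stand in for the paper's peak-detection algorithm and rule set.

```python
import numpy as np

def lung_mediastinum_threshold(hist):
    """Pick a gray-level threshold as the deepest valley between the two
    most prominent peaks of a 1-D histogram of gray-level counts."""
    # local maxima via a simple one-neighbour test
    peaks = [i for i in range(1, len(hist) - 1)
             if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1]]
    # the two tallest peaks, ordered by gray level
    p1, p2 = sorted(sorted(peaks, key=lambda i: hist[i])[-2:])
    # threshold = histogram minimum between them
    return p1 + int(np.argmin(hist[p1:p2 + 1]))

hist = np.array([0, 2, 9, 4, 1, 0, 1, 3, 8, 3, 0], dtype=float)
t = lung_mediastinum_threshold(hist)
# valley between the peaks at bins 2 and 8 lies at bin 5
```

Pixels below the threshold would then receive the mediastinum enhancement parameters and pixels above it the lung-field parameters.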

  19. Automatic Reverse Engineering of Private Flight Control Protocols of UAVs

    Directory of Open Access Journals (Sweden)

    Ran Ji

    2017-01-01

    Full Text Available The increasing use of civil unmanned aerial vehicles (UAVs) has the potential to threaten public safety and privacy. Therefore, airspace administrators urgently need an effective method to regulate UAVs. Understanding the meaning and format of UAV flight control commands by automatic protocol reverse-engineering techniques is highly beneficial to UAV regulation. To improve our understanding of the meaning and format of UAV flight control commands, this paper proposes a method to automatically analyze the private flight control protocols of UAVs. First, we classify flight control commands collected from a binary network trace into clusters; then, we analyze the meaning of the flight control commands by the accumulated error of each cluster; next, we extract the binary format of the commands and infer field semantics; and finally, we infer the location of the check field in the commands and the generator polynomial matrix. The proposed approach is validated via experiments on a widely used consumer UAV.
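    A minimal sketch of the format-extraction step: byte positions that are constant across a trace of equal-length commands are likely fixed header fields, while varying positions are likely parameter or check fields. The 0x55 0xAA sync bytes and the command layout below are invented for illustration, not taken from any real UAV protocol.

```python
def infer_fields(commands):
    """Given equal-length binary commands, mark each byte position as
    'C' (constant across the trace, likely a fixed/header field) or
    'V' (variable, likely a parameter or check field)."""
    length = len(commands[0])
    assert all(len(c) == length for c in commands)
    profile = []
    for i in range(length):
        values = {c[i] for c in commands}
        profile.append('C' if len(values) == 1 else 'V')
    return ''.join(profile)

cmds = [bytes([0x55, 0xAA, t, c]) for t, c in [(1, 7), (2, 9), (3, 4)]]
# two constant sync bytes followed by two variable bytes
```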

  20. ALOHA: Automatic libraries of helicity amplitudes for Feynman diagram computations

    Science.gov (United States)

    de Aquino, Priscila; Link, William; Maltoni, Fabio; Mattelaer, Olivier; Stelzer, Tim

    2012-10-01

    We present an application that automatically writes the HELAS (HELicity Amplitude Subroutines) library corresponding to the Feynman rules of any quantum field theory Lagrangian. The code is written in Python and takes the Universal FeynRules Output (UFO) as an input. From this input it produces the complete set of routines, wave-functions and amplitudes, that are needed for the computation of Feynman diagrams at leading as well as at higher orders. The representation is language independent and currently it can output routines in Fortran, C++, and Python. A few sample applications implemented in the MADGRAPH 5 framework are presented. Program summary Program title: ALOHA Catalogue identifier: AEMS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: http://www.opensource.org/licenses/UoI-NCSA.php No. of lines in distributed program, including test data, etc.: 6094320 No. of bytes in distributed program, including test data, etc.: 7479819 Distribution format: tar.gz Programming language: Python2.6 Computer: 32/64 bit Operating system: Linux/Mac/Windows RAM: 512 Mbytes Classification: 4.4, 11.6 Nature of problem: An efficient numerical evaluation of a squared matrix element can be done with the help of the helicity routines implemented in the HELAS library [1]. This static library contains a limited number of helicity functions and is therefore not always able to provide the needed routine in the presence of an arbitrary interaction. This program provides a way to automatically create the corresponding routines for any given model. Solution method: ALOHA takes the Feynman rules associated to the vertex obtained from the model information (in the UFO format [2]), and multiplies it by the different wavefunctions or propagators. As a result the analytical expression of the helicity routines is obtained. Subsequently, this expression is

  1. Automatic alignment of double optical paths in excimer laser amplifier

    Science.gov (United States)

    Wang, Dahui; Zhao, Xueqing; Hua, Hengqi; Zhang, Yongsheng; Hu, Yun; Yi, Aiping; Zhao, Jun

    2013-05-01

    A beam automatic alignment method for double-pass amplification in an electron-pumped excimer laser system is demonstrated. In this way, the beams from the amplifiers can be transferred along the designated direction and accordingly irradiate the target with high stability and accuracy. However, since no natural alignment reference exists in excimer laser amplifiers, a two-cross-hair structure is used to align the beams. One cross-hair placed in the input beam serves as the near-field reference, while the other, placed in the output beam, serves as the far-field reference. The two cross-hairs are imaged onto charge-coupled devices (CCDs) by separate image-relaying structures. The errors between the intersection points of the two cross-hair images and the centroid coordinates of the actual beam are recorded automatically and sent to a closed-loop feedback control mechanism. Negative feedback keeps running until a preset accuracy is reached. On the basis of this design, the alignment optical path was built and the software compiled, after which the experiment on double-path automatic alignment in the electron-pumped excimer laser amplifier was carried out, and the related influencing factors and the alignment precision were analyzed. Experimental results indicate that the alignment system can aim the beams automatically in a short time. The analysis shows that the accuracy of the alignment system is 0.63 μrad and the maximum beam restoration error is 13.75 μm. Furthermore, the larger the distance between the two cross-hairs, the higher the precision of the system. The automatic alignment system has been used in an angular-multiplexing excimer Master Oscillator Power Amplification (MOPA) system and can satisfy the requirement of beam alignment precision on the whole.
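    The negative-feedback loop can be sketched as a proportional correction applied until the pointing error falls inside the preset accuracy. The loop gain below is an invented value; only the 0.63 μrad tolerance comes from the abstract.

```python
def align(error_urad, gain=0.8, tol=0.63, max_iters=100):
    """Iteratively drive the pointing error (cross-hair intersection minus
    beam centroid) toward zero with a proportional mirror correction,
    stopping once the preset accuracy is reached."""
    iters = 0
    while abs(error_urad) > tol and iters < max_iters:
        error_urad -= gain * error_urad   # apply proportional correction
        iters += 1
    return error_urad, iters

residual, steps = align(50.0)
# a 50 urad initial error converges below 0.63 urad in a few steps
```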

  2. Automatic positioning control device for automatic control rod exchanger

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To attain accurate positioning for a control rod exchanger. Constitution: The present position of the automatic control rod exchanger is detected by a synchro generator. An aimed stopping position for the exchanger, a stop instruction range that depends on the operation delay of the control system and the inertia-running distance of the mechanical system, and a coincidence confirmation range that depends on the required positioning accuracy are set in advance. If there is a difference between the present position and the aimed stopping position, the automatic exchanger is caused to run toward the aimed stopping position. A stop instruction is generated upon arrival within the stop instruction range, and a coincidence confirmation signal is generated upon arrival within the coincidence confirmation range. Since uncertain factors that influence the positioning accuracy, such as the operation delay of the control system and the inertia-running distance of the mechanical system, are made definite by actual measurement or the like, and the stop instruction range and the coincidence confirmation range are set based on the measured data, the positioning accuracy can be improved. (Ikeda, J.)
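    The run/stop/confirm logic described above can be sketched as follows; the numeric positions and ranges in the example are illustrative only.

```python
def positioning_signals(present, target, stop_range, coincide_range):
    """Decision logic for the exchanger: keep running while outside the
    stop-instruction range, issue a stop instruction inside it, and issue
    a coincidence confirmation inside the (tighter) confirmation range."""
    dist = abs(target - present)
    return {
        'run': dist > stop_range,
        'stop_instruction': dist <= stop_range,
        'coincidence_confirmed': dist <= coincide_range,
    }

s = positioning_signals(present=98.0, target=100.0, stop_range=5.0, coincide_range=0.5)
# inside the stop range, so a stop instruction is issued,
# but the position is not yet confirmed as coincident
```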

  3. ORCID Author Identifiers: A Primer for Librarians.

    Science.gov (United States)

    Akers, Katherine G; Sarkozy, Alexandra; Wu, Wendy; Slyman, Alison

    2016-01-01

    The ORCID (Open Researcher and Contributor ID) registry helps disambiguate authors and streamline research workflows by assigning unique 16-digit author identifiers that enable automatic linkages between researchers and their scholarly activities. This article describes how ORCID works, the benefits of using ORCID, and how librarians can promote ORCID at their institutions by raising awareness of ORCID, helping researchers create and populate ORCID profiles, and integrating ORCID identifiers into institutional repositories and other university research information systems.

  4. Automatic noninvasive measurement of systolic blood pressure using photoplethysmography

    Directory of Open Access Journals (Sweden)

    Glik Zehava

    2009-10-01

    Full Text Available Abstract Background Automatic measurement of arterial blood pressure is important, but the available commercial automatic blood pressure meters, mostly based on oscillometry, are of low accuracy. Methods In this study, we present a cuff-based technique for automatic measurement of systolic blood pressure, based on photoplethysmographic signals measured simultaneously in fingers of both hands. After inflating the pressure cuff to a level above systolic blood pressure at a relatively slow rate, it is slowly deflated. The cuff pressure for which the photoplethysmographic signal reappeared during the deflation of the pressure cuff was taken as the systolic blood pressure. The algorithm for the detection of the photoplethysmographic signal involves: (1) determination of the time-segments in which the photoplethysmographic signal distal to the cuff is expected to appear, utilizing the photoplethysmographic signal in the free hand, and (2) discrimination between random fluctuations and the photoplethysmographic pattern. The detected pulses in the time-segments were identified as photoplethysmographic pulses if they met two criteria, based on the pulse waveform and on the correlation between the signal in each segment and the signal in the two neighboring segments. Results Comparison of the photoplethysmographic-based automatic technique to sphygmomanometry, the reference standard, shows that the standard deviation of their differences was 3.7 mmHg. For subjects with systolic blood pressure above 130 mmHg the standard deviation was even lower, 2.9 mmHg. These values are much lower than the 8 mmHg value imposed by the AAMI standard for automatic blood pressure meters. Conclusion The photoplethysmographic-based technique for automatic measurement of systolic blood pressure, and the algorithm presented in this study, seem to be accurate.
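    The reappearance criterion can be sketched as follows, assuming the cuff pressures are sampled in decreasing order during deflation; the two-consecutive-pulse requirement below is a simple stand-in for the paper's waveform and correlation criteria.

```python
def systolic_bp(cuff_pressures, pulse_detected, consecutive=2):
    """Take the systolic pressure as the cuff pressure at which PPG pulses
    first reappear during deflation; requiring a run of `consecutive`
    detections guards against isolated random fluctuations."""
    run = 0
    for i, seen in enumerate(pulse_detected):
        run = run + 1 if seen else 0
        if run == consecutive:
            # report the pressure at which the sustained run began
            return cuff_pressures[i - consecutive + 1]
    return None

pressures = [180, 170, 160, 150, 140, 130, 120]
detected  = [False, False, True, False, True, True, True]
# the isolated detection at 160 mmHg is rejected; pulses persist from 140 mmHg on
```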

  5. Automatic acquisition and shape analysis of metastable peaks

    International Nuclear Information System (INIS)

    Maendli, H.; Robbiani, R.; Kuster, Th.; Seibl, J.

    1979-01-01

    A method for the automatic acquisition and evaluation of metastable peaks due to transitions in the first field-free region of a double-focussing mass spectrometer is presented. The data are acquired by computer-controlled repetitive scanning of the accelerating voltage with concomitant accumulation; the evaluation is made by mathematical differentiation of the resulting curve. Examples of the application of the method are given. (Auth.)

  6. A general graphical user interface for automatic reliability modeling

    Science.gov (United States)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1991-01-01

    Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have text fields, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.

  7. Automatic calculations of electroweak processes

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1996-01-01

    The GRACE system is an excellent tool for calculating cross sections and generating events of elementary processes automatically. However, it is not always easy for beginners to use. An interactive version of GRACE is being developed to make the system more user friendly. Since it works in exactly the same environment as PAW, all functions of PAW are available for handling any histogram information produced by GRACE. As an application, the cross sections of all elementary processes with up to 5-body final states induced by e+e− interactions are going to be calculated and summarized as a catalogue. (author)

  8. Automatic Strain-Rate Controller,

    Science.gov (United States)

    1976-12-01

    (Report text garbled in scanning; only fragments are recoverable.) Rome Air Development Center, Griffiss AFB, December 1976; authors R. L. Huntsinger and J. A. Adamski. The controller is a Leeds and Northrup Series 80 CAT with proportional band, rate, reset, and approach controls, with its input taken from the deviation output. Surviving procedure fragments describe moving the set-point slowly up to 3 or 4 and, if the recorder pointer hunts, adjusting the function controls.

  9. Commutated automatic gain control system

    Science.gov (United States)

    Yost, S. R.

    1982-01-01

    A commutated automatic gain control (AGC) system was designed and built for a prototype Loran C receiver. The receiver uses a microcomputer to control a memory aided phase-locked loop (MAPLL). The microcomputer also controls the input/output, latitude/longitude conversion, and the recently added AGC system. The circuit designed for the AGC is described, and bench and flight test results are presented. The AGC circuit described actually samples starting at a point 40 microseconds after a zero crossing determined by the software lock pulse ultimately generated by a 30 microsecond delay and add network in the receiver front end envelope detector.

  10. Automatic liquid nitrogen feeding device

    International Nuclear Information System (INIS)

    Gillardeau, J.; Bona, F.; Dejachy, G.

    1963-01-01

    An automatic liquid nitrogen feeding device has been developed and used in the framework of corrosion tests performed with constantly renewed uranium hexafluoride. The aim was to feed liquid nitrogen to a large-capacity metallic trap in order to condense uranium hexafluoride at the exit of the corrosion chambers. After a study of the various available devices, a feeding device was specifically designed to be robust, safe and autonomous, while ensuring a high liquid nitrogen flow rate and a high feeding frequency. The device, made of standard materials, has been used for 4000 hours without any problem. [fr]

  11. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury autocode in terms of a phrase structure language, such as in the source language, target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate

  12. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1 where J is the number of users helping that user.
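    The quoted diversity-gain expression is easy to check numerically; note that J = 0 (no helping users) recovers the plain uncoordinated HARQ gain of M.

```python
def diversity_gain(M, J):
    """Diversity gain of a user under the coordinated HARQ scheme quoted
    above: M retransmissions, J helping users."""
    return (J + 1) * (M - 1) + 1

assert diversity_gain(M=3, J=0) == 3    # no helpers: reduces to M
assert diversity_gain(M=3, J=2) == 7    # coordination multiplies the gain
```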

  13. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describes a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine including running ALGOL pro

  14. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor automaticity associated motor deficits in PD, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of using measures of automaticity in early diagnosis of PD would be valuable. PMID:26102020

  15. Self-adjusting automatic control of sowing unit

    Directory of Open Access Journals (Sweden)

    A. Yu. Izmaylov

    2015-01-01

    Full Text Available Self-adjusting automatic control of a sowing unit and differentiated application of mineral fertilizer doses according to agrochemical indicators of the soil (precision agriculture) are being used more widely nowadays. The main requirement for differentiated seeding and fertilizing is the accuracy and speed of the transition from one norm to another. At a unit speed of 10 km/h, the unit moves about 1.5 m or more in 0.5 s, whereas the radio-channel differentiated correction is updated every 10 s, and in the RTK mode every 0.5-2 s, which limits the accuracy of seed and fertilizer placement. A block diagram was worked out for an automatic control system for seeding and mineral fertilizing that uses navigation means for orienting machine-tractor aggregates in the field, together with technical means for realizing precision agriculture at sowing and fertilizer application based on electronic soil-fertility maps and navigation satellite systems. To regulate the fertilizing dose, the unit must be fitted with an electric drive, and to reduce errors it should use GLONASS, GPS and Galileo navigation receivers. A 32-channel receiver tracking the four leading navigation systems GPS/GLONASS/Galileo/Compass, developed by the domestic firm «KB NAVIS», is suggested. The automated device created by the All-Russia Research Institute of Mechanization for Agriculture, based on NAVSTAR and GLONASS/GPS information, successfully controls seeding and makes differentiated fertilizing possible.

  16. Development of Automatic Remote Exposure Controller for Gamma Radiography

    International Nuclear Information System (INIS)

    Joo, Gwang Tae; Shin, Jin Seong; Kim, Dong Eun; Song, Jung Ho; Choo, Seung Hwan; Chang, Hong Keun

    2002-01-01

    Recently, about 1,000 sets of gamma radiographic equipment have been operated manually by about 2,500 persons in Korea. In order for radiographers to work effectively while avoiding the hazard of high-level radiation from the source, many field workers have long wanted a wireless automatic remote exposure controller. The KITCO research team has developed an automatic remote exposure controller that can regulate the speed between 0.4 and 1.2 m/s using a 24 V, 200 W BLDC motor with an output of 54 kgf·, a torque and safety factor suitable for the work. The developed controller can regulate the motor rpm, the pigtail position via a photo-sensor, and the exposure time via a timer coupled to an RF sensor. The equipment is expected to find many practical applications, with the economic advantage that it can be attached to existing manual remote exposure controllers and used with both AC and DC supplies.

  17. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since this mask is to later act as a template for a considerable number of dies on wafer. Defects on the initial blank substrate, and subsequent cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical defects or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include defect location and size, signal polarity (dark, bright) in both transmitted and reflected review images, and the separation of defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment compared to the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results from the evaluation of the Automatic Defect Classification (ADC) product at MP Mask
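    A toy decision tree in the spirit of the flow described above, translating measured defect properties into a classification code. The property names, thresholds and codes are invented for illustration; they are not Calibre's actual rules.

```python
def classify_blank_defect(polarity, size_um, transmitted_signal):
    """Map measured defect properties (signal polarity, size, transmitted
    signal strength relative to background) to a classification code via
    nested decisions, mimicking a decision-tree classifier."""
    if transmitted_signal < 0.05:
        return 'NOISE'                                  # indistinguishable from background
    if polarity == 'dark':
        return 'PARTICLE' if size_um >= 0.5 else 'SMALL_DARK'
    return 'PIT' if size_um >= 0.5 else 'SMALL_BRIGHT'

code = classify_blank_defect(polarity='dark', size_um=0.8, transmitted_signal=0.4)
```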

  18. SRV-automatic handling device

    International Nuclear Information System (INIS)

    Yamada, Koji

    1987-01-01

    Automatic handling device for the steam relief valves (SRVs) is developed in order to achieve a decrease in exposure of workers, increase in availability factor, improvement in reliability, improvement in safety of operation, and labor saving. A survey is made during a periodical inspection to examine the actual SRV handling operation. An SRV automatic handling device consists of four components: conveyor, armed conveyor, lifting machine, and control/monitoring system. The conveyor is so designed that the existing I-rail installed in the containment vessel can be used without any modification. This is employed for conveying an SRV along the rail. The armed conveyor, designed for a box rail, is used for an SRV installed away from the rail. By using the lifting machine, an SRV installed away from the I-rail is brought to a spot just below the rail so that the SRV can be transferred by the conveyor. The control/monitoring system consists of a control computer, operation panel, TV monitor and annunciator. The SRV handling device is operated by remote control from a control room. A trial equipment is constructed and performance/function testing is carried out using actual SRVs. As a result, it is shown that the SRV handling device requires only two operators to serve satisfactorily. The required time for removal and replacement of one SRV is about 10 minutes. (Nogami, K.)

  19. A new uranium automatic analyzer

    International Nuclear Information System (INIS)

    Xia Buyun; Zhu Yaokun; Wang Bin; Cong Peiyuan; Zhang Lan

    1993-01-01

    A new automatic uranium analyzer based on the flow injection analysis (FIA) principle has been developed. It consists of a multichannel peristaltic pump, an injection valve, a photometric detector, a single-chip microprocessor system and electronic circuitry. The newly designed multifunctional auto-injection valve can automatically change the injection volume of the sample and the channels, so that the determination ranges and items can easily be changed. It can also switch among FIA operation modes, giving the instrument the functions of a universal analyzer. A chromatographic column with extractant-containing resin was installed in the manifold of the analyzer for the concentration and separation of trace uranium. 2-(5-bromo-2-pyridylazo)-5-diethylaminophenol (Br-PADAP) was used as the colour reagent. Uranium was determined in aqueous solution by adding cetylpyridinium bromide (CPB). Uranium in solution in the range 0.02-500 mg·L⁻¹ can be determined directly without any pretreatment. A sample throughput of 30-90 h⁻¹ and a reproducibility of 1-2% were obtained. The analyzer has been satisfactorily applied in the laboratory and the plant

  20. An automatic holographic adaptive phoropter

    Science.gov (United States)

    Amirsolaimani, Babak; Peyghambarian, N.; Schwiegerling, Jim; Bablumyan, Arkady; Savidis, Nickolaos; Peyman, Gholam

    2017-08-01

    Phoropters are the most common instrument used to detect refractive errors. During a refractive exam, lenses are flipped in front of the patient, who looks at the eye chart and tries to read the symbols. The procedure is fully dependent on the cooperation of the patient, provides only a subjective measurement of visual acuity, and can at best give a rough estimate of the patient's vision. Phoropters are difficult to use for mass screenings, require a skilled examiner, and are hard to use for screening young children and the elderly. We have developed a simplified, lightweight automatic phoropter that can measure the optical error of the eye objectively without requiring the patient's input. The automatic holographic adaptive phoropter is based on a Shack-Hartmann wavefront sensor and three computer-controlled fluidic lenses. The fluidic lens system is designed to provide power and astigmatic corrections over a large range without the need for verbal feedback from the patient, in less than 20 seconds.

  1. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization. © 2014 The Author(s) Computer Graphics Forum © 2014 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.

  2. Automatic Computer Mapping of Terrain

    Science.gov (United States)

    Smedes, H. W.

    1971-01-01

    Computer processing of 17 wavelength bands of visible, reflective infrared, and thermal infrared scanner spectrometer data, and of three wavelength bands derived from color aerial film, has resulted in successful automatic computer mapping of eight or more terrain classes in a Yellowstone National Park test site. The tests involved: (1) supervised and non-supervised computer programs; (2) special preprocessing of the scanner data to reduce computer processing time and cost and to improve accuracy; and (3) studies of the effectiveness of the proposed Earth Resources Technology Satellite (ERTS) data channels in the automatic mapping of the same terrain, based on simulations using the same set of scanner data. The following terrain classes have been mapped with greater than 80 percent accuracy in a 12-square-mile area with 1,800 feet of relief: (1) bedrock exposures, (2) vegetated rock rubble, (3) talus, (4) glacial kame meadow, (5) glacial till meadow, (6) forest, (7) bog, and (8) water. In addition, shadows of clouds and cliffs were depicted, but were greatly reduced by the preprocessing techniques.
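    The supervised mapping step can be illustrated with a minimal nearest-centroid classifier: each terrain class gets a mean spectral signature from training pixels, and new pixels are assigned to the closest signature. The band values and class signatures below are invented for illustration; the original programs and data are not reproduced here:

```python
import math

def train_centroids(samples):
    """Compute a mean spectral vector (centroid) per terrain class.

    samples: dict mapping class name -> list of spectral vectors
             (one reflectance value per wavelength band)."""
    centroids = {}
    for cls, vectors in samples.items():
        n = len(vectors)
        centroids[cls] = [sum(v[i] for v in vectors) / n
                          for i in range(len(vectors[0]))]
    return centroids

def classify(pixel, centroids):
    """Assign a pixel's spectral vector to the nearest class centroid."""
    return min(centroids,
               key=lambda cls: math.dist(pixel, centroids[cls]))

# Toy 3-band signatures for two of the mapped classes.
training = {
    "water":  [[0.05, 0.03, 0.01], [0.06, 0.04, 0.02]],
    "forest": [[0.08, 0.30, 0.15], [0.07, 0.28, 0.14]],
}
centroids = train_centroids(training)
print(classify([0.06, 0.05, 0.02], centroids))  # -> water
```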

  3. ACIR: automatic cochlea image registration

    Science.gov (United States)

    Al-Dhamari, Ibraheem; Bauer, Sabine; Paulus, Dietrich; Lissek, Friedrich; Jacob, Roland

    2017-02-01

    Efficient Cochlear Implant (CI) surgery requires prior knowledge of the cochlea's size and its characteristics. This information helps to select suitable implants for different patients. To obtain these measurements, a method for segmenting cochlea medical images is needed. An important pre-processing step for good cochlea segmentation is efficient image registration. The cochlea's small size and complex structure, together with the different resolutions and head positions during imaging, pose a major challenge for the automated registration of the different image modalities. In this paper, an Automatic Cochlea Image Registration (ACIR) method for multi-modal human cochlea images is proposed. The method registers small areas that have clear structures in both input images instead of the complete image. It uses the Adaptive Stochastic Gradient Descent optimizer (ASGD) and Mattes' Mutual Information metric (MMI) to estimate 3D rigid transform parameters. State-of-the-art medical image registration optimizers published over the last two years are studied and compared quantitatively using the standard Dice Similarity Coefficient (DSC). ACIR requires only 4.86 seconds on average to align cochlea images automatically and to bring all the modalities into the same spatial locations without human interference. The source code is based on the tool elastix and is provided for free as a 3D Slicer plugin. Another contribution of this work is a proposed public cochlea standard dataset, which can be downloaded for free from a public XNAT server.
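    The DSC used above for quantitative comparison has a simple closed form, DSC = 2|A∩B| / (|A| + |B|). A minimal sketch on toy binary masks (the masks are hypothetical, not cochlea data):

```python
def dice_coefficient(mask_a, mask_b):
    """Dice Similarity Coefficient between two binary masks:
    DSC = 2*|A intersect B| / (|A| + |B|), masks as flat 0/1 sequences."""
    inter = sum(a and b for a, b in zip(mask_a, mask_b))
    size = sum(mask_a) + sum(mask_b)
    return 2.0 * inter / size if size else 1.0

a = [1, 1, 1, 0, 0]
b = [0, 1, 1, 1, 0]
print(dice_coefficient(a, b))  # 2*2/(3+3) -> 0.666...
```

A DSC of 1.0 means perfect overlap; values near 0 indicate failed registration.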

  4. Automatic referral to cardiac rehabilitation.

    Science.gov (United States)

    Fischer, Jane P

    2008-01-01

    The pervasive negative impact of cardiovascular disease in the United States is well documented. Although advances have been made, the campaign to reduce its occurrence, progression, and mortality continues. Determining evidence-based data is only half the battle. Implementing new and updated clinical guidelines in daily practice is a challenging task. Cardiac rehabilitation is an example of a proven intervention whose benefit is hindered by erratic implementation. The American Association of Cardiovascular and Pulmonary Rehabilitation (AACVPR), the American College of Cardiology (ACC), and the American Heart Association (AHA) have responded to this problem by publishing the AACVPR/ACC/AHA 2007 Performance Measures on Cardiac Rehabilitation for Referral to and Delivery of Cardiac Rehabilitation/Secondary Prevention Services. This new national guideline recommends automatic referral to cardiac rehabilitation for every eligible patient (performance measure A-1). This article offers guidance for the initiation of an automatic referral system, including individualizing your protocol with regard to electronic or paper-based order entry structures.

  5. Automatic Assessment of Craniofacial Growth in a Mouse Model of Crouzon Syndrome

    DEFF Research Database (Denmark)

    Thorup, Signe Strann; Larsen, Rasmus; Darvann, Tron Andre

    2009-01-01

    Non-rigid volumetric image registration was applied to micro-CT scans of ten 4-week and twenty 6-week euthanized mice for growth modeling. Each age group consisted of 50% normal and 50% Crouzon mice. Four 3D mean shapes, one for each mouse-type and age group, were created. By extracting a dense field of growth vectors for each mouse-type, growth models were created using linear interpolation and visualized as 3D animations. Spatial regions of significantly different growth were identified using the local False Discovery Rate method, which estimates the expected percentage of false predictions in a set of predictions. The approach provides a tool for spatially detailed automatic phenotyping. MAIN OBJECTIVES OF PRESENTATION: We will present a 3D growth model of normal and Crouzon mice, and differences will be statistically and visually compared.

  6. NEUROIMAGING AND PATTERN RECOGNITION TECHNIQUES FOR AUTOMATIC DETECTION OF ALZHEIMER’S DISEASE: A REVIEW

    Directory of Open Access Journals (Sweden)

    Rupali Kamathe

    2017-08-01

    Alzheimer’s disease (AD) is the most common form of dementia, and no firm treatment is currently available that can stop or reverse the disease's progression. A combination of brain imaging and clinical tests checking for signs of memory impairment is used to identify patients with AD. In recent years, neuroimaging techniques combined with machine learning algorithms have received a lot of attention in this field. There is a need to develop automated techniques that detect the disease well before the patient suffers irreversible loss. This paper reviews such semi- and fully automatic techniques, with a detailed comparison of the methods implemented, class labels considered, databases used, and the results obtained in related studies. The review compares different neuroimaging techniques in detail and reveals the potential of machine learning algorithms in medical image analysis, particularly in AD, enabling even early detection of the disease via the class labelled Mild Cognitive Impairment.

  7. Automatic block-matching registration to improve lung tumor localization during image-guided radiotherapy

    Science.gov (United States)

    Robertson, Scott Patrick

    To improve relatively poor outcomes for locally-advanced lung cancer patients, many current efforts are dedicated to minimizing uncertainties in radiotherapy. This enables the isotoxic delivery of escalated tumor doses, leading to better local tumor control. The current dissertation specifically addresses inter-fractional uncertainties resulting from patient setup variability. An automatic block-matching registration (BMR) algorithm is implemented and evaluated for the purpose of directly localizing advanced-stage lung tumors during image-guided radiation therapy. In this algorithm, small image sub-volumes, termed "blocks", are automatically identified on the tumor surface in an initial planning computed tomography (CT) image. Each block is independently and automatically registered to daily images acquired immediately prior to each treatment fraction. To improve the accuracy and robustness of BMR, this algorithm incorporates multi-resolution pyramid registration, regularization with a median filter, and a new multiple-candidate-registrations technique. The result of block-matching is a sparse displacement vector field that models local tissue deformations near the tumor surface. The distribution of displacement vectors is aggregated to obtain the final tumor registration, corresponding to the treatment couch shift for patient setup correction. Compared to existing rigid and deformable registration algorithms, the final BMR algorithm significantly improves the overlap between target volumes from the planning CT and registered daily images. Furthermore, BMR results in the smallest treatment margins for the given study population. However, despite these improvements, large residual target localization errors were noted, indicating that purely rigid couch shifts cannot correct for all sources of inter-fractional variability. Further reductions in treatment uncertainties may require the combination of high-quality target localization and adaptive radiotherapy.
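    The core block-matching idea, registering a small block by exhaustively minimizing a dissimilarity measure over candidate positions, can be sketched in 2D with a sum-of-squared-differences search. This is a toy illustration, not the BMR algorithm itself, which adds multi-resolution pyramids, median-filter regularization, and multiple candidate registrations:

```python
def ssd(block, image, r0, c0):
    """Sum of squared differences between a block and the image patch at (r0, c0)."""
    return sum(
        (block[r][c] - image[r0 + r][c0 + c]) ** 2
        for r in range(len(block))
        for c in range(len(block[0]))
    )

def match_block(block, image):
    """Exhaustively search for the (row, col) offset minimizing SSD."""
    bh, bw = len(block), len(block[0])
    h, w = len(image), len(image[0])
    return min(
        ((r, c) for r in range(h - bh + 1) for c in range(w - bw + 1)),
        key=lambda rc: ssd(block, image, *rc),
    )

# Toy "daily image" containing the planning-CT block at offset (1, 1).
image = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 6, 0],
    [0, 0, 0, 0],
]
block = [[9, 8],
         [7, 6]]
print(match_block(block, image))  # -> (1, 1)
```

In BMR many such independent block matches form a sparse displacement field, which is then aggregated into a single couch shift.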

  8. Glaucomatous patterns in Frequency Doubling Technology (FDT) perimetry data identified by unsupervised machine learning classifiers.

    Science.gov (United States)

    Bowd, Christopher; Weinreb, Robert N; Balasubramanian, Madhusudhanan; Lee, Intae; Jang, Giljin; Yousefi, Siamak; Zangwill, Linda M; Medeiros, Felipe A; Girkin, Christopher A; Liebmann, Jeffrey M; Goldbaum, Michael H

    2014-01-01

    The variational Bayesian independent component analysis-mixture model (VIM), an unsupervised machine-learning classifier, was used to automatically separate Matrix Frequency Doubling Technology (FDT) perimetry data into clusters of healthy and glaucomatous eyes, and to identify axes representing statistically independent patterns of defect in the glaucoma clusters. FDT measurements were obtained from 1,190 eyes with normal FDT results and 786 eyes with abnormal FDT results from the UCSD-based Diagnostic Innovations in Glaucoma Study (DIGS) and African Descent and Glaucoma Evaluation Study (ADAGES). For all eyes, VIM input was 52 threshold test points from the 24-2 test pattern, plus age. FDT mean deviation was -1.00 dB (S.D. = 2.80 dB) and -5.57 dB (S.D. = 5.09 dB) in FDT-normal eyes and FDT-abnormal eyes, respectively (p<0.001). VIM identified meaningful clusters of FDT data and positioned a set of statistically independent axes through the mean of each cluster. The optimal VIM model separated the FDT fields into 3 clusters. Cluster N contained primarily normal fields (1109/1190, specificity 93.1%) and clusters G1 and G2 combined, contained primarily abnormal fields (651/786, sensitivity 82.8%). For clusters G1 and G2 the optimal number of axes were 2 and 5, respectively. Patterns automatically generated along axes within the glaucoma clusters were similar to those known to be indicative of glaucoma. Fields located farther from the normal mean on each glaucoma axis showed increasing field defect severity. VIM successfully separated FDT fields from healthy and glaucoma eyes without a priori information about class membership, and identified familiar glaucomatous patterns of loss.
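    The reported sensitivity and specificity follow directly from the cluster counts in the abstract: specificity is the fraction of FDT-normal fields assigned to cluster N, and sensitivity is the fraction of FDT-abnormal fields assigned to clusters G1 and G2. A quick check in Python:

```python
def sensitivity(true_pos, total_abnormal):
    """Fraction of abnormal fields assigned to the glaucoma clusters."""
    return true_pos / total_abnormal

def specificity(true_neg, total_normal):
    """Fraction of normal fields assigned to the healthy cluster."""
    return true_neg / total_normal

# Counts reported in the abstract above.
print(round(100 * specificity(1109, 1190), 1))
print(round(100 * sensitivity(651, 786), 1))   # -> 82.8
```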

  9. Body odors promote automatic imitation in autism.

    Science.gov (United States)

    Parma, Valentina; Bulgheroni, Maria; Tirindelli, Roberto; Castiello, Umberto

    2013-08-01

    Autism spectrum disorders comprise a range of neurodevelopmental pathologies characterized, among other symptoms, by impaired social interactions. Individuals with this diagnosis are reported to often identify people by repetitively sniffing pieces of clothing or the body odor of family members. Since body odors are known to initiate and mediate many different social behaviors, smelling the body odor of a family member might constitute a sensory-based action promoting social contact. In light of this, we hypothesized that the body odor of a family member would facilitate the appearance of automatic imitation, an essential social skill known to be impaired in autism. We recruited 20 autistic and 20 typically developing children. Body odors were collected from the children's mothers' axillae. A child observed a model (their mother or a stranger mother) execute (or not) a reach-to-grasp action toward an object. Subsequently, she performed the same action. The object was imbued with the child's mother's odor, a stranger mother's odor, or no odor. The actions were videotaped, and movement time was calculated post hoc via a digitalization technique. Automatic imitation effects-expressed in terms of total movement time reduction-appear in autistic children only when exposed to objects paired with their own mother's odor. The maternal odor, which conveys a social message otherwise neglected, helps autistic children to covertly imitate the actions of others. Our results represent a starting point holding theoretical and practical relevance for the development of new strategies to enhance communication and social behavior among autistic individuals. Copyright © 2013 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  10. Automatic Recognition of Object Names in Literature

    Science.gov (United States)

    Bonnin, C.; Lesteven, S.; Derriere, S.; Oberto, A.

    2008-08-01

    SIMBAD is a database of astronomical objects that provides (among other things) their bibliographic references in a large number of journals. Currently, these references have to be entered manually by librarians who read each paper. To cope with the increasing number of papers, CDS is developing a tool to assist the librarians in their work, taking advantage of the Dictionary of Nomenclature of Celestial Objects, which keeps track of object acronyms and their origin. The program searches for object names directly in PDF documents by comparing the words with all the formats stored in the Dictionary of Nomenclature. It also searches for variable star names based on constellation names, and for a large list of usual names such as Aldebaran or the Crab. Object names found in the documents often correspond to several astronomical objects. The system retrieves all possible matches, displays them with their object type given by SIMBAD, and lets the librarian make the final choice. The bibliographic reference can then be automatically added to the object identifiers in the database. In addition, the systematic usage of the Dictionary of Nomenclature, which is updated manually, made it possible to check it automatically and to detect errors and inconsistencies. Last but not least, the program collects additional information such as the position of the object names in the document (in the title, subtitle, abstract, table, figure caption...) and their number of occurrences. In the future, this will make it possible to calculate the 'weight' of an object in a reference and to provide SIMBAD users with important new information, which will help them find the most relevant papers in the object's reference list.
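    The dictionary-driven matching can be sketched as a set of per-acronym regular expressions applied to the text. The miniature format table below is hypothetical and far simpler than the actual Dictionary of Nomenclature:

```python
import re

# Hypothetical miniature "dictionary of nomenclature": acronym -> name format.
FORMATS = {
    "NGC": r"\bNGC\s?\d{1,4}\b",
    "HD":  r"\bHD\s?\d{1,6}\b",
    "M":   r"\bM\s?\d{1,3}\b",
}

def find_object_names(text):
    """Scan text for identifiers matching any known nomenclature format."""
    hits = []
    for acronym, pattern in FORMATS.items():
        for m in re.finditer(pattern, text):
            hits.append((acronym, m.group()))
    return sorted(hits)

text = "We compare NGC 205 with the companion of M 31 (see also HD 164058)."
print(find_object_names(text))
```

As in the real system, a hit only proposes candidates; disambiguation against the database is a separate step.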

  11. Interactivity in automatic control: foundations and experiences

    OpenAIRE

    Dormido Bencomo, Sebastián; Guzmán Sánchez, José Luis; Costa Castelló, Ramon; Berenguel, M

    2012-01-01

    The first part of this paper presents the concepts of interactivity and visualization and their essential role in learning the fundamentals and techniques of automatic control. The authors' more than 10 years of experience in the development and design of interactive tools dedicated to the study of automatic control concepts is also presented. The second part of the paper summarizes the main features of the “Automatic Control with Interactive Tools” text that has recently been published by Pea...

  12. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs ... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented...

  13. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Natural language processing techniques for automatic test question generation using discourse connectives. Journal of Computer Science and Its Application.

  14. Automatic Thermal Infrared Panoramic Imaging Sensor

    National Research Council Canada - National Science Library

    Gutin, Mikhail; Tsui, Eddy K; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-01-01

    Automatic detection, location, and tracking of targets outside the protected area ensure maximum protection and at the same time reduce the workload on personnel and increase reliability and confidence...

  15. Development of an automatic human duress detection system

    International Nuclear Information System (INIS)

    Greene, E.R.; Davis, J.G.; Tuttle, W.C.

    1979-01-01

    A method for automatically detecting duress in security personnel uses real-time assessment of physiological data (heart rate) to evaluate psychological stress. Using body-worn tape recorders, field data were collected on 22 Albuquerque police officers (20 male, 2 female) to determine actual heart-rate responses in both routine and life-threatening situations. Off-line computer analysis was applied to the data to determine the speed and reliability with which an alarm could be triggered. Alarm algorithms relating field responses to laboratory-collected baseline responses have been developed.
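    One simple form such an alarm algorithm could take is a sustained-exceedance rule against a laboratory baseline. The threshold rule, parameters, and sample values below are assumptions for illustration, not the algorithms developed in the study:

```python
def duress_alarm(heart_rates, baseline_mean, baseline_sd, k=3.0, sustain=5):
    """Fire an alarm when heart rate stays more than k standard deviations
    above the officer's laboratory baseline for `sustain` consecutive samples.

    Returns the sample index at which the alarm fires, or None."""
    threshold = baseline_mean + k * baseline_sd
    run = 0
    for i, hr in enumerate(heart_rates):
        run = run + 1 if hr > threshold else 0
        if run >= sustain:
            return i
    return None

# Hypothetical baseline 70 +/- 8 bpm; a sustained surge past 94 bpm fires.
samples = [72, 75, 110, 118, 121, 125, 130, 128]
print(duress_alarm(samples, 70, 8))  # -> 6
```

Requiring several consecutive exceedances trades alarm speed for robustness against brief, routine heart-rate spikes.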

  16. Evolving a rule system controller for automatic driving in a car racing competition

    OpenAIRE

    Pérez, Diego; Sáez Achaerandio, Yago; Recio Isasi, Gustavo; Isasi Viñuela, Pedro

    2008-01-01

    IEEE Symposium on Computational Intelligence and Games. Perth, Australia, 15-18 December 2008. The techniques and technologies supporting Automatic Vehicle Guidance are an important research area. Automobile manufacturers view automatic driving as a very interesting product with motivating key features: improved car safety, reduced emissions and fuel consumption, and optimized driver comfort during long journeys. Car racing is an active research field where new ...

  17. The "controbloc", a programmable automatic device for the 1,300 MW generation of power stations

    International Nuclear Information System (INIS)

    Pralus, B.; Winzelle, J.C.

    1983-01-01

    Technological progress in the field of microelectronics has led to the development of an automatic control device, the "controbloc", for operating and controlling nuclear power plants. The "controbloc" will be used in automatic systems requiring a high degree of safety and versatility, and is now being installed in the first of the new generation of 1,300 MW power stations. The main characteristics of the device and the evaluation tests that have been carried out are described [fr]

  18. Automatic detection and visualisation of MEG ripple oscillations in epilepsy

    Directory of Open Access Journals (Sweden)

    Nicole van Klink

    2017-01-01

    High frequency oscillations (HFOs, 80–500 Hz) in invasive EEG are a biomarker for the epileptic focus. Ripples (80–250 Hz) have also been identified in non-invasive MEG, yet detection is impeded by noise, their low occurrence rates, and the workload of visual analysis. We propose a method that identifies ripples in MEG through noise reduction, beamforming and automatic detection with minimal user effort. We analysed 15 min of presurgical resting-state interictal MEG data of 25 patients with epilepsy. The MEG signal-to-noise ratio was improved by using a cross-validation signal space separation method, and by calculating ~2400 beamformer-based virtual sensors in the grey matter. Ripples in these sensors were automatically detected by an algorithm optimized for MEG. A small subset of the identified ripples was visually checked. Ripple locations were compared with MEG spike dipole locations and, where available, the resection area. Running the automatic detection algorithm resulted in on average 905 ripples per patient, of which on average 148 were visually reviewed. Reviewing took approximately 5 min per patient and identified ripples in 16 of 25 patients. In 14 patients the ripple locations showed good or moderate concordance with the MEG spikes. For six of the eight patients who had surgery, the ripple locations showed concordance with the resection area: 4/5 with good outcome and 2/3 with poor outcome. Automatic ripple detection in beamformer-based virtual sensors is a feasible non-invasive tool for the identification of ripples in MEG. Our method requires minimal user effort and is easily applicable in a clinical setting.
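    A toy stand-in for the detection step, flagging windows whose RMS amplitude stands out against the background, illustrates the idea of automatic event detection on a virtual-sensor signal; the real detector is optimized for MEG and far more elaborate (band-passing, morphology checks, artifact rejection):

```python
import math

def detect_ripples(signal, fs, win_s=0.02, k=3.0):
    """Flag windows whose RMS amplitude exceeds k times the median window RMS.
    A toy amplitude detector, not the published MEG-optimized algorithm."""
    win = max(1, int(win_s * fs))
    rms = [
        math.sqrt(sum(x * x for x in signal[i:i + win]) / win)
        for i in range(0, len(signal) - win + 1, win)
    ]
    med = sorted(rms)[len(rms) // 2]
    return [i for i, r in enumerate(rms) if r > k * med]

# Synthetic trace: low-amplitude background with a 120 Hz high-amplitude burst.
fs = 1000
quiet = [0.1 * math.sin(2 * math.pi * 10 * t / fs) for t in range(200)]
burst = [1.5 * math.sin(2 * math.pi * 120 * t / fs) for t in range(40)]
sig = quiet + burst + quiet
print(detect_ripples(sig, fs))  # windows covering the burst
```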

  19. Antares automatic beam alignment system

    International Nuclear Information System (INIS)

    Appert, Q.; Swann, T.; Sweatt, W.; Saxman, A.

    1980-01-01

    Antares is a 24-beam-line CO2 laser system for controlled fusion research under construction at the Los Alamos Scientific Laboratory (LASL). Rapid automatic alignment of this system is required prior to each experimental shot. The alignment requirements, operational constraints, and a developed prototype system are discussed. A visible-wavelength alignment technique is employed that uses a telescope/TV system to view point light sources appropriately located down the beamline. Auto-alignment is accomplished by means of a video centroid tracker, which determines the off-axis error of the point sources. The error is nulled by computer-driven, movable mirrors in a closed-loop system. The light sources are fiber-optic terminations located at key points in the optics path, primarily at the centers of large copper mirrors, and remotely illuminated to reduce heating effects.

  20. Computerized automatic tip scanning operation

    International Nuclear Information System (INIS)

    Nishikawa, K.; Fukushima, T.; Nakai, H.; Yanagisawa, A.

    1984-01-01

    In BWR nuclear power stations the Traversing Incore Probe (TIP) system is one of the most important components in reactor monitoring and control. In previous TIP systems, however, operators have suffered from the complexity of operation and long operation time required. The system presented in this paper realizes the automatic operation of the TIP system by monitoring and driving it with a process computer. This system significantly reduces the burden on customer operators and improves plant efficiency by simplifying the operating procedure, augmenting the accuracy of the measured data, and shortening operating time. The process computer is one of the PODIA (Plant Operation by Displayed Information Automation) systems. This computer transfers control signals to the TIP control panel, which in turn drives equipment by microprocessor control. The process computer contains such components as the CRT/KB unit, the printer plotter, the hard copier, and the message typers required for efficient man-machine communications. Its operation and interface properties are described

  1. Automatic Differentiation and Deep Learning

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Statistical learning has been getting more and more interest from the particle-physics community in recent times, with neural networks and gradient-based optimization being a focus. In this talk we shall discuss three things: (1) automatic differentiation tools, which quickly build DAGs of computation that are fully differentiable; we shall focus on one such tool, PyTorch; (2) easy deployment of trained neural networks into large systems with many constraints, for example deploying a model at the reconstruction phase, where the neural network has to be integrated into CERN's bulk data-processing, C++-only environment; (3) some recent models in deep learning for segmentation and generation that might be useful for particle-physics problems.
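    The core idea behind such tools, propagating derivatives alongside values through a computation, can be shown without any framework using forward-mode dual numbers. PyTorch itself uses reverse-mode differentiation on a computation DAG; this sketch is only the simplest relative of that machinery:

```python
class Dual:
    """Forward-mode automatic differentiation via dual numbers:
    each value carries its derivative with respect to one chosen input."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f and df/dx at x in a single forward pass."""
    out = f(Dual(x, 1.0))
    return out.val, out.dot

# f(x) = 3x^2 + 2x: f(2) = 16, f'(2) = 14
print(derivative(lambda x: 3 * x * x + 2 * x, 2.0))  # -> (16.0, 14.0)
```

Reverse mode records the same DAG but accumulates derivatives backwards, which is cheaper when one scalar loss depends on many parameters.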

  2. Automatic Detection of Terminology Evolution

    Science.gov (United States)

    Tahmasebi, Nina

    As archives contain documents that span a long period of time, the language used to create these documents and the language used for querying the archive can differ. This difference is due to evolution in both terminology and semantics, and it causes a significant number of relevant documents to be omitted. A static solution is to use query expansion based on explicit knowledge banks such as thesauri or ontologies. However, as we archive resources with ever more varied terminology, it becomes infeasible to rely on explicit knowledge alone: few or no thesauri cover very domain-specific terminologies or the slang used in blogs, etc. In this Ph.D. thesis we focus on automatically detecting terminology evolution in a completely unsupervised manner, as described in this technical paper.

  3. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    Lamargot, J.-P.; Wanin, Maurice.

    1980-01-01

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing the samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiations. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector of which the sensitive part is located inside the containment and additionally comprising a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought in turn one by one on to the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device [fr

  4. Automatic creation of simulation configuration

    International Nuclear Information System (INIS)

    Oudot, G.; Poizat, F.

    1993-01-01

    SIPA, which stands for 'Simulator for Post Accident', includes: 1) a sophisticated software oriented workshop SWORD (which stands for 'Software Workshop Oriented towards Research and Development') designed in the ADA language including integrated CAD system and software tools for automatic generation of simulation software and man-machine interface in order to operate run-time simulation; 2) a 'simulator structure' based on hardware equipment and software for supervision and communications; 3) simulation configuration generated by SWORD, operated under the control of the 'simulator structure' and run on a target computer. SWORD has already been used to generate two simulation configurations (French 900 MW and 1300 MW nuclear power plants), which are now fully operational on the SIPA training simulator. (Z.S.) 1 ref

  5. Automatic Regulation of Wastewater Discharge

    Directory of Open Access Journals (Sweden)

    Bolea Yolanda

    2017-01-01

    Wastewater plants, mainly those with secondary treatments, discharge to the environment polluted water that cannot be used in any human activity. When the discharge is into the sea, it is expected that most of the biological pollutants die or almost disappear before the water reaches areas of human use. This natural die-off of bacteria, viruses and other pathogens is due to conditions such as the salinity of the sea and the effect of sunlight, and the dumping areas are calculated taking these conditions into account. However, under certain meteorological phenomena, water arrives at the coast before the pollutant elements have fully disappeared. In the Mediterranean Sea there are periods of adverse climatic conditions during which the coast near a wastewater outfall becomes polluted. In this paper, the authors present an automatic control that prevents such pollution episodes using two mathematical models, one for pollutant transport and the other for pollutant removal in wastewater spills.

  6. ANA, automatic natural learning of a semantic network

    International Nuclear Information System (INIS)

    Enguehard, Chantal

    1992-01-01

    The objective of this research thesis is the automatic extraction of terminology and the study of its automatic structuring into a semantic network. The operation is applied to a text corpus representing knowledge of a specific field, in order to select the technical vocabulary relevant to that field. To this end, the author developed a method and software for the automatic acquisition of terminology items. The author first gives an overview of systems and methods for document indexing and thesaurus elaboration, and a brief presentation of the state of the art in machine learning. He then discusses some drawbacks of natural language processing systems that rely on large knowledge sources such as grammars and dictionaries. After presenting the adopted approach and some hypotheses, the author defines the objects and operators necessary for easier data handling, presents the knowledge acquisition process, and finally describes the system's implementation in detail. Some results are assessed and discussed, and limitations and perspectives are commented upon [fr]

  7. Automatic process control in anaerobic digestion technology: A critical review.

    Science.gov (United States)

    Nguyen, Duc; Gadhamshetty, Venkataramana; Nitayavardhana, Saoharit; Khanal, Samir Kumar

    2015-10-01

    Anaerobic digestion (AD) is a mature technology that relies upon the synergistic effort of a diverse group of microbial communities to metabolize diverse organic substrates. However, AD is highly sensitive to process disturbances, so it is advantageous to use online monitoring and process control techniques to operate the AD process efficiently. A range of electrochemical, chromatographic and spectroscopic devices can be deployed for on-line monitoring and control of the AD process. While the complexity of control strategies ranges from simple feedback control to advanced control systems, the implementation of advanced instrumentation and advanced control strategies is still debated. Centralized AD plants could be the answer for applications of progressive automatic control in this field. This article provides a critical overview of the available automatic control technologies that can be implemented in AD processes at different scales. Copyright © 2015 Elsevier Ltd. All rights reserved.
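    At the simple end of the control-complexity range mentioned above sits plain feedback control. A bang-bang digester-temperature controller with hysteresis, with the setpoint and deadband values invented for illustration, shows the pattern:

```python
def onoff_heater(temp, heating, setpoint=37.0, band=0.5):
    """Bang-bang feedback control with hysteresis for a mesophilic digester:
    heater on below setpoint - band, off above setpoint + band."""
    if temp < setpoint - band:
        return True
    if temp > setpoint + band:
        return False
    return heating  # inside the deadband: keep the previous state

# Simulated temperature readings (degrees C) fed through the controller.
state = False
log = []
for t in [36.0, 36.3, 37.2, 37.6, 37.2, 36.4]:
    state = onoff_heater(t, state)
    log.append(state)
print(log)  # -> [True, True, True, False, False, True]
```

The hysteresis band prevents the actuator from chattering around the setpoint, which is the main practical refinement over a bare threshold.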

  8. 5th International Conference on Electrical Engineering and Automatic Control

    CERN Document Server

    Yao, Yufeng

    2016-01-01

    Building on instrument, electrical and automatic control systems, the 5th International Conference on Electrical Engineering and Automatic Control (CEEAC) was established at the crossroads of information technology and control technology, and seeks to effectively apply information technology to the sweeping trend that views control as the core of intelligent manufacturing and life. This book looks ahead to advanced manufacturing development, an area shaped by intelligent manufacturing. It highlights the application and promotion of process control in traditional industries, such as the steel and petrochemical industries; the cooperative control of technical equipment and systems, represented by robot technology and multi-axis CNC; and the control and support of emerging process technologies, represented by laser melting and stacking, as well as the emerging industry represented by sustainable and intelligent life. The book places particular emphasis on the micro-segments field, such as...

  9. Segmenting articular cartilage automatically using a voxel classification approach

    DEFF Research Database (Denmark)

    Folkesson, Jenny; Dam, Erik B; Olsen, Ole F

    2007-01-01

    We present a fully automatic method for articular cartilage segmentation from magnetic resonance imaging (MRI) which we use as the foundation of a quantitative cartilage assessment. We evaluate our method by comparisons to manual segmentations by a radiologist and by examining the interscan...... reproducibility of the volume and area estimates. Training and evaluation of the method is performed on a data set consisting of 139 scans of knees with a status ranging from healthy to severely osteoarthritic. This is, to our knowledge, the only fully automatic cartilage segmentation method that has good...... agreement with manual segmentations, an interscan reproducibility as good as that of a human expert, and enables the separation between healthy and osteoarthritic populations. While high-field scanners offer high-quality imaging from which the articular cartilage have been evaluated extensively using manual...

  10. Fast Appearance Modeling for Automatic Primary Video Object Segmentation.

    Science.gov (United States)

    Yang, Jiong; Price, Brian; Shen, Xiaohui; Lin, Zhe; Yuan, Junsong

    2016-02-01

    Automatic segmentation of the primary object in a video clip is a challenging problem as there is no prior knowledge of the primary object. Most existing techniques thus adopt an iterative approach to foreground and background appearance modeling, i.e., fix the appearance model while optimizing the segmentation and fix the segmentation while optimizing the appearance model. However, these approaches may rely on good initialization and can easily be trapped in local optima. In addition, they are usually time consuming for analyzing videos. To address these limitations, we propose a novel and efficient appearance modeling technique for automatic primary video object segmentation in the Markov random field (MRF) framework. It embeds the appearance constraint as auxiliary nodes and edges in the MRF structure, and can optimize both the segmentation and appearance model parameters simultaneously in one graph cut. Extensive experimental evaluations validate the superiority of the proposed approach over state-of-the-art methods, in both efficiency and effectiveness.
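
    The iterative baseline the abstract contrasts with (fix the appearance model while optimizing the segmentation, then fix the segmentation while updating the model) can be sketched as a two-class nearest-mean loop on pixel intensities. This is an illustrative simplification, not the MRF graph-cut formulation proposed in the paper.

```python
def iterative_two_class(pixels, iters=20):
    """Alternate between fixing the appearance model (two class means)
    and fixing the segmentation (nearest-mean labels) -- the iterative
    scheme the abstract describes as prone to local optima."""
    fg, bg = max(pixels), min(pixels)   # crude initialization
    labels = [0] * len(pixels)
    for _ in range(iters):
        # optimize segmentation with the appearance model fixed
        labels = [1 if abs(p - fg) < abs(p - bg) else 0 for p in pixels]
        # optimize the appearance model with the segmentation fixed
        fg_px = [p for p, l in zip(pixels, labels) if l == 1]
        bg_px = [p for p, l in zip(pixels, labels) if l == 0]
        if fg_px:
            fg = sum(fg_px) / len(fg_px)
        if bg_px:
            bg = sum(bg_px) / len(bg_px)
    return labels, fg, bg
```

    The dependence on the crude initialization in the first line is exactly the weakness the paper's joint optimization is designed to avoid.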

  11. 2011 International Conference in Electrics, Communication and Automatic Control Proceedings

    CERN Document Server

    2012-01-01

    This two-volume set contains the very latest, cutting-edge material in electrics, communication and automatic control. As a vital field of research that is highly relevant to current developments in a number of technological domains, the subjects it covers include micro-electronics and integrated circuit control, signal processing technology, next-generation network infrastructure, wireless communication and scientific instruments. The aim of the International Conference in Electrics, Communication and Automatic Control, held in Chongqing, China, in June 2011 was to provide a valuable inclusive platform for researchers, engineers, academicians and industrial professionals from all over the world to share their research results with fellow scientists in the sector. The call for papers netted well over 600 submissions, of which 224 were selected for presentation. This fully peer-reviewed collection of papers from the conference can be viewed as a single-source compendium of the latest trends and techniques in t...

  12. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.
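
    The claim that random selection of alternatives offers good results "with certain probability" has a simple quantitative core: the best of k uniformly sampled candidates lands in the top p-fraction of all alternatives with probability 1 - (1 - p)^k. A small sketch with hypothetical values and parameters:

```python
import random

def best_of_random_sample(values, k, rng):
    """Pick k alternatives uniformly at random and return the best one."""
    return max(rng.choice(values) for _ in range(k))

def empirical_top_fraction_rate(n=10_000, k=59, p=0.05, trials=2000, seed=1):
    """Estimate how often the sampled best lies in the top p-fraction.
    Theory: 1 - (1 - p)**k; with p = 0.05, k = 59 this exceeds 95%."""
    rng = random.Random(seed)
    values = list(range(n))
    threshold = n * (1 - p)          # top-5% cutoff
    hits = sum(best_of_random_sample(values, k, rng) >= threshold
               for _ in range(trials))
    return hits / trials
```

    The point of the book's thesis survives in the sketch: the cost (k draws) is independent of the total number of alternatives n, at the price of a probabilistic rather than deterministic guarantee.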

  13. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available - Automatic systems have brought many revolutions to existing technologies. One technology that has seen substantial development is the solar powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis, and using it in an automatic system also reduces manpower. The researchers believe an automatic shrimp feeding system may help solve problems in manual feeding operations. The project study aimed to design and develop a solar powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10 hour timer, set at intervals preferred by the user, to run a continuous process. The magnetic contactor acts as a switch connected to the 10 hour timer which controls the activation or termination of electrical loads, and is powered by means of a solar panel outputting electrical power and a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer to be used be tested to avoid malfunction and achieve a fully automatic system, and that the system may be improved to handle changes in the scope of the project.

  14. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. The invention also enables automatic marking of films in radiographic inspection, for labelling of the test piece and of the part of it where testing took place. (RW) [de

  15. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

    The subject is covered in chapters, entitled; the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  16. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  17. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    Jover, P.

    1976-01-01

    The fundamental concepts in automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics for the main components are then studied and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactors) [fr

  18. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  19. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.

    2003-01-01

    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer

  20. Post-factum detection of radiation treatment of meat and fish by means of DNA alterations identified by gas chromatography-mass spectrometry or pulsed-field gel electrophoresis

    International Nuclear Information System (INIS)

    Mayer, M.

    1994-01-01

    The doctoral thesis explains methods and experiments for post-factum detection of radiation-induced alterations of DNA. There are various manifestations of such alterations. Ionizing radiation can directly alter the bases and/or the sugar component, or can indirectly induce DNA damage by forming water radicals. Both mechanisms result in base derivatives, released in part from the DNA strand or formed by alterations of the 2-deoxyribose, inducing strand breaks (single and double strand breaks). The first part of the thesis explains the approach of applying GC-MS for detection of radiation-induced base derivatives, using herring sperm DNA as a model DNA. Some typical types of base derivatives were identified (thymine glycol, 5-hydroxycytosine). Some base derivatives were also found in DNA samples derived from poultry meat. These base derivatives are known to be indicators of food processing with ionizing radiation, but surprisingly were also found in non-irradiated controls, although in minor amounts. The second part discusses the identification of strand breaks by pulsed-field gel electrophoresis. This method is capable of producing evidence that irradiation markedly enhances the proportion of short-chain DNA molecules as compared to non-irradiated controls. DNA molecules of a size of approx. 2.2 million base pairs are almost completely broken into short-chain fragments. The method reliably detects radiation treatments down to 1500 Gy, even if applied long ago. (orig./MG) [de

  1. Automatically ordering events and times in text

    CERN Document Server

    Derczynski, Leon R A

    2017-01-01

    The book offers a detailed guide to temporal ordering, exploring open problems in the field and providing solutions and extensive analysis. It addresses the challenge of automatically ordering events and times in text. Aided by TimeML, it also describes and presents concepts relating to time in easy-to-compute terms. Working out the order that events and times happen has proven difficult for computers, since the language used to discuss time can be vague and complex. Mapping out these concepts for a computational system, which does not have its own inherent idea of time, is, unsurprisingly, tough. Solving this problem enables powerful systems that can plan, reason about events, and construct stories of their own accord, as well as understand the complex narratives that humans express and comprehend so naturally. This book presents a theory and data-driven analysis of temporal ordering, leading to the identification of exactly what is difficult about the task. It then proposes and evaluates machine-learning so...

  2. Automatic design of prestressed concrete vessels

    International Nuclear Information System (INIS)

    Sotomura, Kentaro; Murazumi, Yasuyuki

    1984-01-01

    Prestressed concrete appeared after high strength steel had been produced; it therefore has a history of only about 40 years, even in Europe where it was developed. High compressive force is applied to the concrete beforehand by high strength steel so that it can resist tensile force. It is superior to ordinary steel construction in strength, economy, rust prevention, fire protection and workability, and it competes with ordinary steel in the fields of bridges, towers, water tanks, water pipes, barges, LPG and LNG tanks, reactor pressure vessels, reactor containment vessels and so on. The design of the prestressed concrete containment vessels (PCCVs) being constructed in Japan adopts the form of a semi-spherical dome mounted on a cylindrical wall of 43 m inside diameter and about 1.5 m thickness, with steel pipe sheaths for inserting tendons arranged in the wall. The Taisei Construction Co. has developed the PC-ADE system, which enables the optimum design of PCCVs. The outline of the automatic design system, the design of the tendon arrangement, the preparation of the load data for stress analysis, the stress analysis by the axisymmetric finite element method and the calculation of cross sections are explained. Design is a creative activity, and in the design of PCCVs also, the intention of the designers should be materialized when this program is utilized. (Kako, I.)

  3. Automatic keywording of High Energy Physics

    CERN Document Server

    Dallman, David Peter

    1999-01-01

    Bibliographic databases were developed from the traditional library card catalogue in order to enable users to access library documents via various types of bibliographic information, such as title, author, series or conference date. In addition these catalogues sometimes contained some form of indexation by subject, such as the Universal (or Dewey) Decimal Classification used for books. With the introduction of the eprint archives, set up by the High Energy Physics (HEP) Community in the early 90s, huge collections of documents in several fields have been made available on the World Wide Web. These developments however have not yet been followed up from a keywording point of view. We will see in this paper how important it is to attribute keywords to all documents in the area of HEP Grey Literature. As libraries are facing a future with less and less manpower available and more and more documents, we will explore the possibility of being helped by automatic classification software. We will specifically menti...

  4. Genital automatisms: Reappraisal of a remarkable but ignored symptom of focal seizures.

    Science.gov (United States)

    Dede, Hava Özlem; Bebek, Nerses; Gürses, Candan; Baysal-Kıraç, Leyla; Baykan, Betül; Gökyiğit, Ayşen

    2018-03-01

    Genital automatisms (GAs) are uncommon clinical phenomena of focal seizures. They are defined as repeated fondling, grabbing, or scratching of the genitals. The aim of this study was to determine the lateralizing and localizing value and associated clinical characteristics of GAs. Three hundred thirteen consecutive patients with drug-resistant seizures who were referred to our tertiary center for presurgical evaluation between 2009 and 2016 were investigated. The incidence of specific kinds of behavior, clinical semiology, associated symptoms/signs with corresponding ictal electroencephalography (EEG) findings, and their potential role in seizure localization and lateralization were evaluated. Fifteen (4.8%) of 313 patients had GAs. Genital automatisms were identified in 19 (16.4%) of a total 116 seizures. Genital automatisms were observed to occur more often in men than in women (M/F: 10/5). Nine of fifteen patients (60%) had temporal lobe epilepsy (right/left: 4/5) and three (20%) had frontal lobe epilepsy (right/left: 1/2), whereas the remaining two patients could not be classified. One patient was diagnosed as having Rasmussen encephalitis. Genital automatisms were ipsilateral to epileptic focus in 12 patients and contralateral in only one patient according to ictal-interictal EEG and neuroimaging findings. Epileptic focus could not be lateralized in the last 2 patients. Genital automatisms were associated with unilateral hand automatisms such as postictal nose wiping or manual automatisms in 13 (86.7%) of 15 and contralateral dystonia was seen in 6 patients. All patients had amnesia of the performance of GAs. Genital automatisms are more frequent in seizures originating from the temporal lobe, and they can also be seen in frontal lobe seizures. Genital automatisms seem to have a high lateralizing value to the ipsilateral hemisphere and are mostly concordant with other unilateral hand automatisms. Men exhibit GAs more often than women. Copyright © 2017

  5. Anatomy-based automatic detection and segmentation of major vessels in thoracic CTA images

    International Nuclear Information System (INIS)

    Zou Xiaotao; Liang Jianming; Wolf, M.; Salganicoff, M.; Krishnan, A.; Nadich, D.P.

    2007-01-01

    Existing approaches for automated computerized detection of pulmonary embolism (PE) using computed tomography angiography (CTA) usually focus on segmental and sub-segmental emboli. The goal of our current research is to extend our existing approach to the automated detection of central PE. In order to detect central emboli, the major vessels must first be identified and segmented automatically. This submission presents an anatomy-based method for automatic computerized detection and segmentation of aortas and main pulmonary arteries in CTA images. (orig.)

  6. Near Identifiability of Dynamical Systems

    Science.gov (United States)

    Hadaegh, F. Y.; Bekey, G. A.

    1987-01-01

    Concepts regarding approximate mathematical models treated rigorously. Paper presents new results in analysis of structural identifiability, equivalence, and near equivalence between mathematical models and physical processes they represent. Helps establish rigorous mathematical basis for concepts related to structural identifiability and equivalence revealing fundamental requirements, tacit assumptions, and sources of error. "Structural identifiability," as used by workers in this field, loosely translates as meaning ability to specify unique mathematical model and set of model parameters that accurately predict behavior of corresponding physical system.

  7. A semi-automatic annotation tool for cooking video

    Science.gov (United States)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  8. Connection of automatic integral multichannel monitor of aerosol concentration

    International Nuclear Information System (INIS)

    Krejci, M.; Stulik, P.

    1985-01-01

    The instrument consists of the actual aerosol concentration monitor with two equivalent inputs, of an electropneumatic sampling selector, an aerosol pump, an electropneumatic valve, and of an exhaust device. For integral operating mode the instrument allows rapid checking and indication of exceedance of the permissible aerosol concentration limit at any sampling point. Upon exceedance of the permissible concentration limit, the device automatically switches into the multichannel cyclic measurement mode while the sampling point is identified where the aerosol concentration was increased. An emergency is displayed if the permissible limit has been exceeded. Following removal of the source of dangerous aerosol concentration, the control unit automatically switches the device into the integral measurement mode. (J.B.)

  9. Automatic calibration of gamma spectrometers

    International Nuclear Information System (INIS)

    Tluchor, D.; Jiranek, V.

    1989-01-01

    The principle is described of energy calibration of the spectrometric path based on the measurement of the standard of one radionuclide or a set of them. The entire computer-aided process is divided into three main steps, viz.: the insertion of the calibration standard by the operator; the start of the calibration program; energy calibration by the computer. The program was selected such that the spectrum identification should not depend on adjustment of the digital or analog elements of the gamma spectrometric measuring path. The ECL program is described for automatic energy calibration as is its control, the organization of data file ECL.DAT and the necessary hardware support. The computer-multichannel analyzer communication was provided using an interface pair of Canberra 8673V and Canberra 8573 operating in the RS-422 standard. All subroutines for communication with the multichannel analyzer were written in MACRO 11 while the main program and the other subroutines were written in FORTRAN-77. (E.J.). 1 tab., 4 refs
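
    The energy calibration step described above (mapping peak channels of a measured standard to known energies) reduces, in the linear case, to a least-squares fit E = a + b·channel. The peak channel positions below are hypothetical; only the reference energies (the Cs-137 and Co-60 lines) are standard values.

```python
def energy_calibration(channels, energies):
    """Least-squares linear fit E = a + b * channel from reference peaks."""
    n = len(channels)
    mx = sum(channels) / n
    my = sum(energies) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(channels, energies))
         / sum((x - mx) ** 2 for x in channels))
    a = my - b * mx
    return a, b

# Hypothetical peak positions for a Cs-137 + Co-60 calibration standard:
peaks = {662.0: 1324.0, 1173.2: 2346.1, 1332.5: 2664.8}  # keV -> channel
a, b = energy_calibration(list(peaks.values()), list(peaks.keys()))
```

    Once a and b are fitted, every channel in the spectrum can be converted to an energy, independently of the analog or digital gain settings of the measuring path, which is the point the abstract makes.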

  10. Automatic locking orthotic knee device

    Science.gov (United States)

    Weddendorf, Bruce C. (Inventor)

    1993-01-01

    An articulated tang-in-clevis joint for incorporation in newly manufactured conventional strap-on orthotic knee devices, or for replacing such joints in conventional strap-on orthotic knee devices, is discussed. The instant tang-in-clevis joint allows the user the freedom to extend and bend the knee normally when no load (weight) is applied to the knee, and automatically locks the knee when the user transfers weight to it, thus preventing a damaged knee from bending uncontrollably when weight is applied. The tang-in-clevis joint of the present invention includes first and second clevis plates, a tang assembly and a spacer plate secured between the clevis plates. Each clevis plate includes a bevelled serrated upper section. A bevelled shoe is secured to the tang in close proximity to the bevelled serrated upper section of the clevis plates. A coiled spring mounted within an oblong bore of the tang normally urges the shoes secured to the tang out of engagement with the serrated upper section of each clevis plate, to allow rotation of the tang relative to the clevis plates. When weight is applied to the joint, the load compresses the coiled spring, and the serrations on each clevis plate dig into the bevelled shoes secured to the tang to prevent relative movement between the tang and clevis plates. A shoulder is provided on the tang and the spacer plate to prevent overextension of the joint.

  11. Automatic Transmission Of Liquid Nitrogen

    Directory of Open Access Journals (Sweden)

    Sumedh Mhatre

    2015-08-01

    Full Text Available Liquid nitrogen is one of the major substances used as a chiller in industry, for example in ice cream factories, milk dairies, blood sample storage and blood banks. It helps to maintain the required product at a lower temperature for preservation purposes. We cannot fully utilise the LN2: in practice, if we use 3.75 litres of LN2 in a single day, then around 12% of the LN2 (450 ml) is wasted due to vaporisation. A pressure relief valve is provided to create a pressure difference; if there is no pressure difference between the cylinder carrying the LN2 and its surroundings, the result is damage to the container as well as wastage of LN2. Transmission of LN2 from TA55 to BA3 is carried out manually, so care must be taken during the transmission of LN2 in order to avoid wastage. With the help of this project concept, the transmission of LN2 will be carried out automatically so as to reduce the wastage of LN2 that occurs with manual operation.

  12. Portable and Automatic Moessbauer Analysis

    International Nuclear Information System (INIS)

    Souza, P. A. de; Garg, V. K.; Klingelhoefer, G.; Gellert, R.; Guetlich, P.

    2002-01-01

    A portable Moessbauer spectrometer, developed for extraterrestrial applications, opens up new industrial applications of MBS. But for industrial applications, an easy-to-handle tool for fast data analysis is also required. The analysis of Moessbauer spectra and their parameters is a barrier to the popularity of this widely applicable spectroscopic technique in industry. In our experience, the analysis of a Moessbauer spectrum is time-consuming and requires the dedication of a specialist. However, the analysis of Moessbauer spectra, from the fitting to the identification of the sample phases, can be made faster using genetic algorithms, fuzzy logic and artificial neural networks. Industrial applications are very specific, and the data analysis can be performed using these algorithms. In combination with automatic analysis, the Moessbauer spectrometer can be used as a probe instrument which covers the main industrial needs for on-line monitoring of products, processes and case studies. Some of these real industrial applications will be discussed.

  13. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Current algorithms can only handle single erythema or only deal with scaling segmentation; in practice, scaling and erythema are often mixed together. In order to segment the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied, exploiting the skin's Tyndall effect during imaging to eliminate reflection, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to obtain texture and color features. In this step, an image roughness feature has been defined so that scaling can be easily separated from normal skin. In the end, random forests are used to ensure the generalization ability of the algorithm. This algorithm can give reliable segmentation results even when images have different lighting conditions and skin types. In the data set offered by Union Hospital, more than 90% of images can be segmented accurately.
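
    The "image roughness" feature is not defined in the abstract. One plausible realization, sketched here purely as an assumption, is the local standard deviation of intensity over the sliding window: low on smooth skin, high on scaling.

```python
import statistics

def roughness_map(image, win=3):
    """Local standard deviation of intensity in a win x win sliding window --
    one simple way to realize a 'roughness' feature (the paper's exact
    definition is not given in the abstract)."""
    h, w = len(image), len(image[0])
    rough = [[0.0] * w for _ in range(h)]
    half = win // 2
    for i in range(h):
        for j in range(w):
            # clamp the window at the image borders
            patch = [image[y][x]
                     for y in range(max(0, i - half), min(h, i + half + 1))
                     for x in range(max(0, j - half), min(w, j + half + 1))]
            rough[i][j] = statistics.pstdev(patch)
    return rough
```

    Fed into a random forest alongside the Lab color channels, such a feature lets the classifier separate textured scaling from smooth normal skin even when mean intensities are similar.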

  14. Automatic spent fuel ID number reader (I)

    International Nuclear Information System (INIS)

    Tanabe, S.; Kawamoto, H.; Fujimaki, K.; Kobe, A.

    1991-01-01

    An effective and efficient technique has been developed for facilitating the identification of LWR spent fuel stored in large scale spent fuel storage pools, such as those of processing plants. Experience shows that there are often difficulties in the implementation of the operator's nuclear material accountancy and control work, as well as of safeguards inspections, conducted on spent fuel assemblies stored in deep water pools. This paper reports that the technique is realized as an automatic spent fuel ID number reader system installed on the fuel handling machine. The ID number reader system consists of an optical sub-system and an image processing sub-system. Thousands of spent fuel assemblies stored in under-water open racks in each storage pool can be identified within a relatively short time (e.g., several hours) using this combination. Various performance tests were carried out on the image processing sub-system in 1990, using TV images obtained from different types of spent fuel assemblies stored in various storage pools of PWR and BWR power stations.

  15. Automatic caption generation for news images.

    Science.gov (United States)

    Feng, Yansong; Lapata, Mirella

    2013-04-01

    This paper is concerned with the task of automatically generating captions for images, which is important for many image-related applications. Examples include video and image retrieval as well as the development of tools that aid visually impaired individuals to access pictorial information. Our approach leverages the vast resource of pictures available on the web and the fact that many of them are captioned and colocated with thematically related documents. Our model learns to create captions from a database of news articles, the pictures embedded in them, and their captions, and consists of two stages. Content selection identifies what the image and accompanying article are about, whereas surface realization determines how to verbalize the chosen content. We approximate content selection with a probabilistic image annotation model that suggests keywords for an image. The model postulates that images and their textual descriptions are generated by a shared set of latent variables (topics) and is trained on a weakly labeled dataset (which treats the captions and associated news articles as image labels). Inspired by recent work in summarization, we propose extractive and abstractive surface realization models. Experimental results show that it is viable to generate captions that are pertinent to the specific content of an image and its associated article, while permitting creativity in the description. Indeed, the output of our abstractive model compares favorably to handwritten captions and is often superior to extractive methods.

  16. Automatic Deduction in Dynamic Geometry using Sage

    Directory of Open Access Journals (Sweden)

    Francisco Botana

    2012-02-01

    Full Text Available We present a symbolic tool that provides robust algebraic methods to handle automatic deduction tasks for a dynamic geometry construction. The main prototype has been developed as two different worksheets for the open source computer algebra system Sage, corresponding to two different ways of coding a geometric construction. In one worksheet, diagrams constructed with the open source dynamic geometry system GeoGebra are accepted. In this worksheet, Groebner bases are used to either compute the equation of a geometric locus in the case of a locus construction or to determine the truth of a general geometric statement included in the GeoGebra construction as a boolean variable. In the second worksheet, locus constructions coded using the common file format for dynamic geometry developed by the Intergeo project are accepted for computation. The prototype and several examples are provided for testing. Moreover, a third Sage worksheet is presented in which a novel algorithm to eliminate extraneous parts in symbolically computed loci has been implemented. The algorithm, based on a recent work on the Groebner cover of parametric systems, identifies degenerate components and extraneous adherence points in loci, both natural byproducts of general polynomial algebraic methods. Detailed examples are discussed.

  17. Automatic Generation of Minimal Cut Sets

    Directory of Open Access Journals (Sweden)

    Sentot Kromodimoeljo

    2015-06-01

    Full Text Available A cut set is a collection of component failure modes that could lead to a system failure. Cut Set Analysis (CSA) is applied to critical systems to identify and rank system vulnerabilities at design time. Model checking tools have been used to automate the generation of minimal cut sets but are generally based on checking reachability of system failure states. This paper describes a new approach to CSA using a Linear Temporal Logic (LTL) model checker called BT Analyser that supports the generation of multiple counterexamples. The approach enables a broader class of system failures to be analysed, by generalising from failure state formulae to failure behaviours expressed in LTL. The traditional approach to CSA using model checking requires the model or system failure to be modified, usually by hand, to eliminate already-discovered cut sets, and the model checker to be rerun, at each step. By contrast, the new approach works incrementally and fully automatically, thereby removing the tedious and error-prone manual process and resulting in significantly reduced computation time. This in turn enables larger models to be checked. Two different strategies for using BT Analyser for CSA are presented. There is generally no single best strategy for model checking: their relative efficiency depends on the model and property being analysed. Comparative results are given for the A320 hydraulics case study in the Behavior Tree modelling language.
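    Independently of the LTL-based tooling described above, the core notion of a minimal cut set can be illustrated with a brute-force enumeration over failure combinations; the component names and failure predicate below are hypothetical.

```python
from itertools import combinations

def minimal_cut_sets(components, system_fails):
    """Enumerate minimal cut sets by increasing size: a combination is a
    minimal cut set if it causes system failure and no already-found
    (hence smaller) cut set is contained in it."""
    minimal = []
    for size in range(1, len(components) + 1):
        for combo in combinations(components, size):
            s = set(combo)
            if any(m <= s for m in minimal):
                continue  # a smaller cut set is contained in s: not minimal
            if system_fails(s):
                minimal.append(s)
    return minimal

# Hypothetical two-channel hydraulic system: it fails if the pump P fails,
# or if both redundant valves V1 and V2 fail.
fails = lambda s: 'P' in s or {'V1', 'V2'} <= s
cuts = minimal_cut_sets(['P', 'V1', 'V2'], fails)
```

    The model-checking approach of the paper avoids this exponential enumeration by extracting cut sets from counterexample traces, which also covers temporal (ordered) failure behaviours that a static predicate cannot express.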

  18. Automatically Determining Scale Within Unstructured Point Clouds

    Science.gov (United States)

    Kadamen, Jayren; Sithole, George

    2016-06-01

    Three-dimensional models obtained from imagery have an arbitrary scale and therefore have to be scaled. Automatically scaling these models requires the detection of objects in them, which can be computationally intensive. Real-time object detection may pose problems for applications such as indoor navigation. This investigation poses the idea that relational cues, specifically height ratios, within indoor environments may offer an easier means to obtain scales for models created using imagery. The investigation aimed to show two things: (a) that the size of objects, especially their height off the ground, is consistent within an environment, and (b) that based on this consistency, objects can be identified and their general size used to scale a model. To test the idea, a hypothesis is first tested on a terrestrial lidar scan of an indoor environment. Later, as a proof of concept, the same test is applied to a model created using imagery. The most notable finding was that objects can be detected more readily by studying the ratio between the dimensions of objects whose dimensions are defined by human physiology. For example, the dimensions of desks and chairs are related to the height of an average person. In the test, the difference between generalised and actual dimensions of objects was assessed. A maximum difference of 3.96% (2.93 cm) was observed from automated scaling. By analysing the ratio between the heights (distance from the floor) of the tops of objects in a room, identification was also achieved.
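    A minimal sketch of the scaling step described above: given the heights of detected objects in arbitrary model units and generic real-world heights for those object classes, a robust scale factor can be taken as the median of the per-object ratios. The object classes and numbers are invented for illustration.

```python
def estimate_scale(model_heights, reference_heights):
    """Estimate a single scale factor for an unscaled model by matching
    heights of detected objects (e.g. desk tops, chair seats) against
    generic real-world heights. The median ratio resists outliers from
    misdetected objects."""
    ratios = sorted(ref / mod
                    for mod, ref in zip(model_heights, reference_heights))
    mid = len(ratios) // 2
    if len(ratios) % 2:
        return ratios[mid]
    return 0.5 * (ratios[mid - 1] + ratios[mid])

# Hypothetical indoor model: desk top, chair seat, door handle heights in
# model units, matched against generic metric heights for those classes.
model = [1.44, 0.90, 2.10]        # arbitrary model units
reference = [0.72, 0.45, 1.05]    # metres (generic indoor objects)
scale = estimate_scale(model, reference)   # metres per model unit
```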

  19. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the basal ganglia (BG) in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  20. Research progress of on-line automatic monitoring of chemical oxygen demand (COD) of water

    Science.gov (United States)

    Cai, Youfa; Fu, Xing; Gao, Xiaolu; Li, Lianyin

    2018-02-01

    With increasingly strict control of pollutant emissions in China, on-line automatic monitoring of water quality is particularly urgent. The chemical oxygen demand (COD) is a comprehensive index of the contamination caused by organic matter, and it is therefore taken as one important index of energy saving and emission reduction in China's "Twelfth Five-Year" program. So far, the COD on-line automatic monitoring instrument has played an important role in the field of sewage monitoring. This paper reviews the existing methods for on-line automatic monitoring of COD and, on that basis, points out the future trend of COD on-line automatic monitoring instruments.

  1. Design of cylindrical pipe automatic welding control system based on STM32

    Science.gov (United States)

    Chen, Shuaishuai; Shen, Weicong

    2018-04-01

    The development of the modern economy has made the demand for pipeline construction increase rapidly, and pipeline welding has become an important link in pipeline construction. At present, manual welding methods are still widely used at home and abroad, and field pipe welding in particular lacks miniature and portable automatic welding equipment. An automated welding system consists of a control system, comprising a lower-computer control panel and a host-computer operating interface, together with automatic welding machine mechanisms and welding power systems that work in coordination with the control system. In this paper, a new control system for automatic pipe welding based on the control panel of the lower computer and the interface of the host computer is proposed, which has many advantages over the traditional automatic welding machine.

  2. Automatic synthesis of sequential control schemes

    International Nuclear Information System (INIS)

    Klein, I.

    1993-01-01

    Of all hard- and software developed for industrial control purposes, the majority is devoted to sequential, or binary valued, control and only a minor part to classical linear control. Typically, the sequential parts of the controller are invoked during startup and shut-down to bring the system into its normal operating region and into some safe standby region, respectively. Despite its importance, fairly little theoretical research has been devoted to this area, and sequential control programs are therefore still created manually without much theoretical support to obtain a systematic approach. We propose a method to create sequential control programs automatically. The main idea is to spend some effort off-line modelling the plant, and from this model generate the control strategy, that is, the plan. The plant is modelled using action structures, thereby concentrating on the actions instead of the states of the plant. In general the planning problem shows exponential complexity in the number of state variables. However, by focusing on the actions, we can identify problem classes as well as algorithms such that the planning complexity is reduced to polynomial complexity. We prove that these algorithms are sound, i.e., the generated solution will solve the stated problem, and complete, i.e., if the algorithms fail, then no solution exists. The algorithms generate a plan as a set of actions and a partial order on this set specifying the execution order. The generated plan is proven to be minimal and maximally parallel. For a larger class of problems we propose a method to split the original problem into a number of simple problems that can each be solved using one of the presented algorithms. It is also shown how a plan can be translated into a GRAFCET chart, and to illustrate these ideas we have implemented a planning tool, i.e., a system that is able to automatically create control schemes. Such a tool can of course also be used on-line if it is fast enough.

  3. Automatic Operation For A Robot Lawn Mower

    Science.gov (United States)

    Huang, Y. Y.; Cao, Z. L.; Oh, S. J.; Kattan, E. U.; Hall, E. L.

    1987-02-01

    A domestic mobile robot, a lawn mower that performs an automatic operation mode, has been built in the Center of Robotics Research, University of Cincinnati. The robot lawn mower automatically completes its work with the region-filling operation, a new kind of path planning for mobile robots. Some strategies for region-filling path planning have been developed for a partly known or an unknown environment. An advanced omnidirectional navigation system and a multisensor-based control system are also used in the automatic operation. Research on the robot lawn mower, especially on region filling in path planning, is significant for industrial and agricultural applications.
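    The region-filling operation mentioned above can be sketched, for the simplest rectangular case, as a boustrophedon (back-and-forth) sweep over a grid; this is an illustrative stand-in, not the robot's actual planner.

```python
def boustrophedon_path(width, height):
    """Generate a lawn-mower (boustrophedon) coverage path over a
    rectangular grid: sweep each row, alternating direction, so that
    every cell is visited exactly once with no gaps."""
    path = []
    for y in range(height):
        xs = range(width) if y % 2 == 0 else range(width - 1, -1, -1)
        path.extend((x, y) for x in xs)
    return path

# Cover a hypothetical 4 x 3 lawn discretised into grid cells.
path = boustrophedon_path(4, 3)
```

    Real region filling must additionally decompose the free space around obstacles into sweepable cells and order them, which is where the cited strategies for partly known and unknown environments come in.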

  4. Semi Automatic Ontology Instantiation in the domain of Risk Management

    Science.gov (United States)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of Ontological Engineering is to automatically or semi-automatically support the process of Ontology Learning and Ontology Population from semi-structured documents (texts). In this paper we describe a Semi-Automatic Ontology Instantiation method from natural language text, in the domain of Risk Management. This method is composed of three steps: 1) annotation with part-of-speech tags, 2) semantic relation instance extraction, 3) the ontology instantiation process. It is based on combined NLP techniques using human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge, it is not domain dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA project (supported by the European Community) as a Generic Domain Ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from the Environmental Protection Agency.

  5. Electrical design of a 110-ft long muon pipe with automatic degaussing

    International Nuclear Information System (INIS)

    Visser, A.T.

    1985-11-01

    This memo describes a magnetized cylindrical pipe made from tape-wound, grain-oriented, low-carbon steel rolls. Grain-oriented steel yields much higher magnetic fields at low ampere-turns than cast iron or other steel pipes. This is especially important when only a few windings are allowed in the inner bore. The power supply and operating costs are also much lower. The pipe has a high (approx. 9 kG) remnant field, but is automatically degaussed upon shutdown of the DC excitation power supply. A remnant field detector senses whether degaussing was successful. The pipe is used in the muon beam line. Its magnetic field deflects unwanted halo muons. Tests need to be conducted with and without the pipe field. It is therefore desirable that the pipe field automatically return to zero when the DC excitation is shut off. This can be rather easily accomplished.

  6. Software design of automatic counting system for nuclear track based on mathematical morphology algorithm

    International Nuclear Information System (INIS)

    Pan Yi; Mao Wanchong

    2010-01-01

    The parameter measurement of nuclear track occupies an important position in the field of nuclear technology. However, traditional artificial counting method has many limitations. In recent years, DSP and digital image processing technology have been applied in nuclear field more and more. For the sake of reducing errors of visual measurement in artificial counting method, an automatic counting system for nuclear track based on DM642 real-time image processing platform is introduced in this article, which is able to effectively remove interferences from the background and noise points, as well as automatically extract nuclear track-points by using mathematical morphology algorithm. (authors)
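    A toy version of the morphology-based counting pipeline: a binary opening (erosion then dilation with a cross-shaped structuring element) removes single-pixel noise, and connected-component labelling then counts the surviving track spots. The image and structuring element are illustrative; the abstract does not specify the DM642 system's actual operators.

```python
def erode(pixels, offsets):
    # A pixel survives erosion iff every structuring-element offset lands
    # on a foreground pixel.
    return {p for p in pixels
            if all((p[0] + dx, p[1] + dy) in pixels for dx, dy in offsets)}

def dilate(pixels, offsets):
    return {(p[0] + dx, p[1] + dy) for p in pixels for dx, dy in offsets}

def opening(pixels, offsets):
    # Erosion followed by dilation: removes features smaller than the
    # structuring element while roughly preserving larger ones.
    return dilate(erode(pixels, offsets), offsets)

def count_components(pixels):
    # 4-connected component count via iterative flood fill.
    seen, count = set(), 0
    for p in pixels:
        if p in seen:
            continue
        count += 1
        stack = [p]
        while stack:
            x, y = stack.pop()
            if (x, y) in seen or (x, y) not in pixels:
                continue
            seen.add((x, y))
            stack.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    return count

CROSS = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]
# Synthetic image: two 3x3 "track spots" plus one single-pixel noise speck.
track = lambda ox, oy: {(ox + i, oy + j) for i in range(3) for j in range(3)}
image = track(0, 0) | track(10, 10) | {(5, 5)}
count = count_components(opening(image, CROSS))
```

    The opening deletes the isolated speck at (5, 5) while both genuine spots survive, so the component count equals the number of tracks.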

  7. Automatic categorization of diverse experimental information in the bioscience literature

    Science.gov (United States)

    2012-01-01

    Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus, is usually time consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental datatypes in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community and thereby greatly reducing time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in the curation work flow at

  8. Automatic categorization of diverse experimental information in the bioscience literature

    Directory of Open Access Journals (Sweden)

    Fang Ruihua

    2012-01-01

    Full Text Available Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus, is usually time consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental datatypes in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community and thereby greatly reducing time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in

  9. Automatic categorization of diverse experimental information in the bioscience literature.

    Science.gov (United States)

    Fang, Ruihua; Schindelman, Gary; Van Auken, Kimberly; Fernandes, Jolene; Chen, Wen; Wang, Xiaodong; Davis, Paul; Tuli, Mary Ann; Marygold, Steven J; Millburn, Gillian; Matthews, Beverley; Zhang, Haiyan; Brown, Nick; Gelbart, William M; Sternberg, Paul W

    2012-01-26

    Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus, is usually time consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental datatypes in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community and thereby greatly reducing time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in the curation work flow at WormBase for
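    To illustrate the kind of classifier involved, the sketch below trains a tiny linear SVM (Pegasos-style subgradient descent on the hinge loss, a stdlib-only stand-in for an off-the-shelf SVM package) on a hypothetical curation task with bag-of-words features. The documents, labels, and data type are invented.

```python
def featurize(text, vocab):
    # Bag-of-words term counts over a fixed vocabulary.
    vec = [0.0] * len(vocab)
    for tok in text.lower().split():
        if tok in vocab:
            vec[vocab[tok]] += 1.0
    return vec

def train_linear_svm(docs, labels, vocab, epochs=200, lam=0.01):
    """Tiny linear SVM trained with Pegasos-style subgradient descent on
    the regularized hinge loss; labels are +1 / -1."""
    w = [0.0] * len(vocab)
    t = 0
    data = [(featurize(d, vocab), y) for d, y in zip(docs, labels)]
    for _ in range(epochs):
        for x, y in data:
            t += 1
            eta = 1.0 / (lam * t)
            margin = y * sum(wi * xi for wi, xi in zip(w, x))
            w = [(1.0 - eta * lam) * wi for wi in w]   # regularization step
            if margin < 1.0:                            # hinge-loss step
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
    return w

def predict(w, text, vocab):
    x = featurize(text, vocab)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0.0 else -1

# Hypothetical curation task: flag abstracts reporting RNAi phenotype data.
docs = ["rnai knockdown phenotype observed in embryos",
        "we report expression pattern of the gene",
        "phenotype after rnai treatment was scored",
        "in situ expression pattern analysis"]
labels = [1, -1, 1, -1]
vocab = {tok: i for i, tok in
         enumerate(sorted({t for d in docs for t in d.lower().split()}))}
w = train_linear_svm(docs, labels, vocab)
preds = [predict(w, d, vocab) for d in docs]
```

    The production systems described above would instead use a mature SVM implementation with much richer features, but the decision rule, a sign of a learned linear score over text features, is the same.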

  10. Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing

    Science.gov (United States)

    LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.

    2017-12-01

    With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality study and weather condition monitoring. In such infrastructures, many sensor nodes are distributed in a specific area and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, (1) the algorithm adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time series window for a particular parameter in a sensor node. Case studies using different data sets are presented and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues or natural events.
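    A drastically simplified stand-in for the spatio-temporal anomaly model: score each fixed-length window of one sensor's stream by its mean absolute deviation from the median of spatially neighbouring sensors, then rank windows by score. The data and window length below are illustrative, not from the paper's case studies.

```python
def window_anomaly_scores(series, neighbors, win=4):
    """Rank non-overlapping time windows of one sensor stream by their
    deviation from the median of neighbouring sensors over the same
    window. Returns (window_start, score) pairs, most anomalous first."""
    scores = []
    for start in range(0, len(series) - win + 1, win):
        diffs = []
        for t in range(start, start + win):
            vals = sorted(n[t] for n in neighbors)
            median = vals[len(vals) // 2]
            diffs.append(abs(series[t] - median))
        scores.append((start, sum(diffs) / win))
    return sorted(scores, key=lambda s: -s[1])

# Hypothetical temperature streams: the sensor drifts in its third window
# (t = 8..11) while three nearby sensors stay near 20 degrees.
sensor = [20.0, 20.1, 20.0, 19.9, 20.2, 20.1, 20.0, 20.1,
          25.0, 25.2, 24.9, 25.1, 20.0, 20.1, 20.2, 20.0]
nbrs = [[20.0] * 16, [20.1] * 16, [19.9] * 16]
ranked = window_anomaly_scores(sensor, nbrs)
top_window_start = ranked[0][0]
```

    The published algorithm learns the normal temporal evolution adaptively rather than assuming neighbours agree; this sketch only captures the spatial-consensus half of that idea.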

  11. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  12. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control – APCA, national member organization of the International Federation of Automatic Control – IFAC. CONTROLO 2016 – the 12th Portuguese Conference on Automatic Control, Guimarães, Portugal, September 14th to 16th – was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show expressively how Automatic Control can be used to increase the well-being of people.

  13. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  14. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  15. Automatic coding of online collaboration protocols

    NARCIS (Netherlands)

    Erkens, Gijsbert; Janssen, J.J.H.M.

    2006-01-01

    An automatic coding procedure is described to determine the communicative functions of messages in chat discussions. Five main communicative functions are distinguished: argumentative (indicating a line of argumentation or reasoning), responsive (e.g., confirmations, denials, and answers),

  16. Automatic Amharic text news classification: A neural networks ...

    African Journals Online (AJOL)

    School of Computing and Electrical Engineering, Institute of Technology, Bahir Dar University, Bahir Dar ... The study is on classification of Amharic news automatically using neural networks approach. Learning Vector ... INTRODUCTION.

  17. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  18. 2011 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  19. Automatic analysis of trabecular bone structure from knee MRI

    DEFF Research Database (Denmark)

    Marques, Joselene; Granlund, Rabia; Lillholm, Martin

    2012-01-01

    We investigated the feasibility of quantifying osteoarthritis (OA) by analysis of the trabecular bone structure in low-field knee MRI. Generic texture features were extracted from the images and subsequently selected by sequential floating forward selection (SFFS), following a fully automatic, uncommitted machine-learning based framework. Six different classifiers were evaluated in cross-validation schemes and the results showed that the presence of OA can be quantified by a bone structure marker. The performance of the developed marker reached a generalization area-under-the-ROC (AUC) of 0...
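    Sequential floating forward selection, as used above, can be sketched generically: greedily add the feature that most improves the criterion, then "float" by removing features whenever that improves it further. The toy scoring function below (a hypothetical stand-in for classifier AUC) rewards two complementary features and makes a duplicated feature worthless.

```python
def sffs(features, score, k):
    """Sequential floating forward selection: add the best feature, then
    backtrack-remove any feature whose removal improves the score (the
    'floating' step), until k features are selected."""
    selected = []
    while len(selected) < k:
        best = max((f for f in features if f not in selected),
                   key=lambda f: score(selected + [f]))
        selected.append(best)
        improved = True
        while improved and len(selected) > 2:
            improved = False
            for f in list(selected):
                trial = [g for g in selected if g != f]
                if score(trial) > score(selected):
                    selected = trial
                    improved = True
                    break
    return selected

def toy_score(subset):
    # Stand-in criterion: 'copy_of_a' duplicates 'texture_a', so only one
    # of them contributes; 'texture_b' adds independent information.
    s = set(subset)
    val = 0.4 if 'texture_b' in s else 0.0
    if 'texture_a' in s or 'copy_of_a' in s:
        val += 0.5
    return val

chosen = sffs(['texture_a', 'copy_of_a', 'texture_b'], toy_score, 2)
```

    In the paper the criterion would be a cross-validated classifier performance over texture features, which is far more expensive to evaluate but plugs into the same search skeleton.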

  20. Feature extraction and classification in automatic weld seam radioscopy

    International Nuclear Information System (INIS)

    Heindoerfer, F.; Pohle, R.

    1994-01-01

    The investigations conducted have shown that automatic feature extraction and classification procedures permit the identification of weld seam flaws. Within this context the favored learning fuzzy classificator represents a very good alternative to conventional classificators. The results have also made clear that improvements, mainly in the field of image registration, are still possible by increasing the resolution of the radioscopy system: an almost error-free classification is conceivable only if the flaw is segmented correctly, i.e. in its full size, which requires improved detail recognizability and sufficient contrast difference. (orig./MM) [de

  1. Low-cost automatic station for compost temperature monitoring

    Directory of Open Access Journals (Sweden)

    Marcelo D. L. Jordão

    Full Text Available ABSTRACT Temperature monitoring is an important procedure to control the composting process. Due to cost limitations, temperature monitoring is usually manual, with daily sampling resolution. The objective of this study was to develop an automatic station, for US$ 150, able to monitor air temperature at two different points in a compost pile with a 5-min time resolution. In the calibration test, the sensors showed an estimated uncertainty of ± 1 to ± 1.9 ºC. In the field validation test, the station maintained autonomy for seven days and endured high humidity and extreme temperature (> 70 °C).

  2. Efficient Semi-Automatic 3D Segmentation for Neuron Tracing in Electron Microscopy Images

    Science.gov (United States)

    Jones, Cory; Liu, Ting; Cohan, Nathaniel Wood; Ellisman, Mark; Tasdizen, Tolga

    2015-01-01

    Background: In the area of connectomics, there is a significant gap between the time required for data acquisition and dense reconstruction of the neural processes contained in the same dataset. Automatic methods are able to eliminate this timing gap, but the state-of-the-art accuracy so far is insufficient for use without user corrections. If completed naively, this process of correction can be tedious and time consuming. New Method: We present a new semi-automatic method that can be used to perform 3D segmentation of neurites in EM image stacks. It utilizes an automatic method that creates a hierarchical structure for recommended merges of superpixels. The user is then guided through each predicted region to quickly identify errors and establish correct links. Results: We tested our method on three datasets with both novice and expert users. Accuracy and timing were compared with published automatic, semi-automatic, and manual results. Comparison with Existing Methods: Post-automatic correction methods have also been used in [1] and [2]. These methods do not provide navigation or suggestions in the manner we present. Other semi-automatic methods require user input prior to the automatic segmentation, such as [3] and [4], and are inherently different from our method. Conclusion: Using this method on the three datasets, novice users achieved accuracy exceeding state-of-the-art automatic results, and expert users achieved accuracy on par with full manual labeling but with a 70% time improvement when compared with other examples in publication. PMID:25769273

  3. Automatic shadowing device for electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, F W; Bogitch, S

    1960-01-01

    For the past ten years in the laboratory of the Department of Nuclear Medicine and Radiation Biology at the University of California, and before that at Rochester, New York, every evaporation was done with the aid of an automatic shadowing device. For several months the automatic shadowing device has been available at the Atomic Bomb Casualty Commission (ABCC) Hiroshima, Japan with the modifications described. 1 reference.

  4. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    A method for automatic control of commercial computer programs is presented. The developed approach connects the EXAFS spectrometer automation system (managed by a PC under DOS) with the commercial program for CCD detector control (managed by a PC under Windows). The described complex system is used for the automation of intermediate amplitude spectra processing in EXAFS spectrum measurements at the Kurchatov SR source.

  5. Automatic Control of Silicon Melt Level

    Science.gov (United States)

    Duncan, C. S.; Stickel, W. B.

    1982-01-01

    A new circuit, when combined with melt-replenishment system and melt level sensor, offers continuous closed-loop automatic control of melt-level during web growth. Installed on silicon-web furnace, circuit controls melt-level to within 0.1 mm for as long as 8 hours. Circuit affords greater area growth rate and higher web quality, automatic melt-level control also allows semiautomatic growth of web over long periods which can greatly reduce costs.
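    The closed-loop behaviour described above can be mimicked with a textbook discrete PI controller driving a replenishment feed against a constant withdrawal (web growth) rate; the gains, rates, and units below are invented, not taken from the silicon-web furnace.

```python
def simulate_melt_level(setpoint, steps, kp=0.8, ki=0.2):
    """Closed-loop melt-level control sketch: at each step a PI controller
    computes a replenishment feed from the level error, while a constant
    withdrawal models material consumed by web growth."""
    level, integral, withdrawal = 0.0, 0.0, 0.5
    history = []
    for _ in range(steps):
        error = setpoint - level
        integral += error
        feed = kp * error + ki * integral   # PI control law
        level += feed - withdrawal          # simple integrator plant
        history.append(level)
    return history

levels = simulate_melt_level(setpoint=10.0, steps=200)
final_error = abs(10.0 - levels[-1])
```

    The integral term is what lets the controller hold the setpoint exactly despite the constant withdrawal: at steady state the integral settles at the value whose contribution matches the withdrawal rate.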

  6. Towards automatic verification of ladder logic programs

    OpenAIRE

    Zoubek , Bohumir; Roussel , Jean-Marc; Kwiatkowska , Martha

    2003-01-01

    Control system programs are usually validated by testing prior to their deployment. Unfortunately, testing is not exhaustive and therefore it is possible that a program which passed all the required tests still contains errors. In this paper we apply techniques of automatic verification to a control program written in ladder logic. A model is constructed mechanically from the ladder logic program and subjected to automatic verification against requirements that include...

  7. The problem of automatic identification of concepts

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre

    1975-11-01

    This paper deals with the problem of the automatic recognition of concepts and describes an important language tool, the ''linguistic filter'', which facilitates the construction of statistical algorithms. Certain special filters, of prepositions, conjunctions, negatives, logical implication, compound words, are presented. This is followed by a detailed description of a statistical algorithm allowing recognition of pronoun referents, and finally the problem of the automatic treatment of negatives in French is discussed [fr

  8. Automatic Control of Freeboard and Turbine Operation

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Frigaard, Peter Bak; Friis-Madsen, Erik

    The report deals with the modules for automatic control of freeboard and turbine operation on board the Wave Dragon, Nissum Bredning (WD-NB) prototype, and covers what has been going on up to ultimo 2003.

  9. Automatic Vetting for Malice in Android Platforms

    Science.gov (United States)

    2016-05-01

    Final technical report, Iowa State University, May 2016 (period covered: Dec 2013 - Dec 2015; contract FA8750-14-2...). Among the cited sources: "Android Apps from Play Store Infected with Brain Test Malware," http://www.ibtimes.co.uk/google-removes-13-android-apps-play-store-infected-brain-test...

  10. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  11. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    This issue is addressed here: we propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing alternative to classical continuation methods. Automatic continuation also generally obtains better designs than the classical formulation, using a reduced number of iterations.

  12. Automatic single questionnaire intensity (SQI, EMS98 scale) estimation using ranking models built on the existing BCSF database

    Science.gov (United States)

    Schlupp, A.; Sira, C.; Schmitt, K.; Schaming, M.

    2013-12-01

    the fact that each definitive BCSF SQI is determined by expert analysis. We compare the SQIs obtained by these methods on our database and discuss the coherency of, and the variations between, the automatic and manual processes. The methods achieve high scores, with up to 85% of the forms correctly classified and most of the remaining forms shifted by only one intensity degree. This allows us to use the ranking methods as the best automatic approach for fast SQI estimation and for producing fast shakemaps. The next step in improving these methods will be to identify why the remaining forms are not classified at the correct value, and to find a way to select the few forms that should still be analyzed by an expert. Note that beyond intensity VI, online questionnaires are insufficient and a field survey is indispensable to estimate intensity. For such surveys in France, BCSF leads a macroseismic intervention group (GIM).

  13. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  14. IADE: a system for intelligent automatic design of bioisosteric analogs

    Science.gov (United States)

    Ertl, Peter; Lewis, Richard

    2012-11-01

    IADE, a software system supporting molecular modellers through the automatic design of non-classical bioisosteric analogs, scaffold hopping and fragment growing, is presented. The program combines sophisticated cheminformatics functionalities for constructing novel analogs and filtering them based on their drug-likeness and synthetic accessibility using automatic structure-based design capabilities: the best candidates are selected according to their similarity to the template ligand and to their interactions with the protein binding site. IADE works in an iterative manner, improving the fitness of designed molecules in every generation until structures with optimal properties are identified. The program frees molecular modellers from routine, repetitive tasks, allowing them to focus on analysis and evaluation of the automatically designed analogs, considerably enhancing their work efficiency as well as the area of chemical space that can be covered. The performance of IADE is illustrated through a case study of the design of a nonclassical bioisosteric analog of a farnesyltransferase inhibitor—an analog that has won a recent "Design a Molecule" competition.

  15. Implicit proactive interference, age, and automatic versus controlled retrieval strategies.

    Science.gov (United States)

    Ikier, Simay; Yang, Lixia; Hasher, Lynn

    2008-05-01

    We assessed the extent to which implicit proactive interference results from automatic versus controlled retrieval among younger and older adults. During a study phase, targets (e.g., "ALLERGY") either were or were not preceded by nontarget competitors (e.g., "ANALOGY"). After a filled interval, the participants were asked to complete word fragments, some of which cued studied words (e.g., "A_L_ _GY"). Retrieval strategies were identified by the difference in response speed between a phase containing fragments that cued only new words and a phase that included a mix of fragments cuing old and new words. Previous results were replicated: Proactive interference was found in implicit memory, and the negative effects were greater for older than for younger adults. Novel findings demonstrate two retrieval processes that contribute to interference: an automatic one that is age invariant and a controlled process that can reduce the magnitude of the automatic interference effects. The controlled process, however, is used effectively only by younger adults. This pattern of findings potentially explains age differences in susceptibility to proactive interference.

  16. Automatic Detection of Vehicles Using Intensity Laser and Anaglyph Image

    Directory of Open Access Journals (Sweden)

    Hideo Araki

    2006-12-01

    In this work a methodology is presented for the automatic detection of moving cars in digital aerial images of urban areas, using intensity, anaglyph, and subtraction images. The anaglyph image is used to identify moving cars because, lacking homologues between the two exposures, they appear in red. An implicit model was developed to assign a digital pixel value with this specific property, using the ratio between the RGB components of car objects in the anaglyph image. The intensity image is used to reduce false positives and to restrict processing to roads and streets. The subtraction image is applied to reduce false positives caused by road markings. The implemented algorithm normalizes the left and right images and then forms the anaglyph using a translation. The results demonstrate the applicability of the proposed method and its potential for automatic car detection.
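
    The red-ratio test described above can be sketched as follows; the threshold and the pixel values are invented for illustration and are not taken from the paper:

```python
# Sketch of the red-ratio test for moving-car candidates in an anaglyph
# image: a moving object has no left/right homologue, so it shows up
# strongly red. Threshold and pixel values are hypothetical.
def red_ratio(rgb):
    r, g, b = rgb
    return r / max(1.0, (g + b) / 2.0)   # red vs. mean of green/blue

def motion_mask(image, threshold=1.6):
    return [[red_ratio(px) >= threshold for px in row] for row in image]

image = [
    [(120, 118, 115), (200, 40, 50)],   # road pixel, moving-car pixel
    [(90, 92, 95), (130, 125, 128)],    # road pixel, parked-car pixel
]
mask = motion_mask(image)
```

    Only the strongly red pixel (the moving car) is flagged; gray road and parked-car pixels fall below the ratio threshold.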

  17. Automatic Parallelization of Scientific Application

    DEFF Research Database (Denmark)

    Blum, Troels

    performance gains. Scientists working with computer simulations should be allowed to focus on their field of research and not spend excessive amounts of time learning exotic programming models and languages. We have with Bohrium achieved very promising results by starting out with a relatively simple approach...

  18. Automatic Earthquake Detection by Active Learning

    Science.gov (United States)

    Bergen, K.; Beroza, G. C.

    2017-12-01

    In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
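
    The abstract does not fix a particular query strategy; the sketch below illustrates one common choice, pool-based uncertainty sampling, with a toy one-dimensional "detector" and a simulated expert (all data and names are hypothetical):

```python
# Pool-based active learning with uncertainty sampling (illustrative):
# a 1-D threshold "detector" stands in for an earthquake/noise
# classifier, and the expert oracle is simulated by the true labels.
pool = [(x / 100.0, 1 if x >= 55 else 0) for x in range(100)]
labeled = [pool[0], pool[99]]            # one seed example per class
unlabeled = [p for p in pool if p not in labeled]

def fit_threshold(data):
    lo = max(x for x, y in data if y == 0)   # highest known noise sample
    hi = min(x for x, y in data if y == 1)   # lowest known earthquake
    return (lo + hi) / 2.0                   # midpoint decision boundary

for _ in range(10):                          # 10 queries to the "expert"
    t = fit_threshold(labeled)
    # query the unlabeled sample closest to the boundary (most uncertain)
    i = min(range(len(unlabeled)), key=lambda j: abs(unlabeled[j][0] - t))
    labeled.append(unlabeled.pop(i))

final_t = fit_threshold(labeled)
```

    Each query bisects the region of uncertainty, so a handful of expert labels pins down the decision boundary that random labeling would need far more data to find.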

  19. Towards automatic musical instrument timbre recognition

    Science.gov (United States)

    Park, Tae Hong

    This dissertation comprises two parts: one focused on issues concerning the research and development of an artificial system for automatic musical instrument timbre recognition, and one on musical compositions. The technical part of the essay includes a detailed record of developed and implemented algorithms for feature extraction and pattern recognition. A review of existing literature introducing historical aspects surrounding timbre research, problems associated with a number of timbre definitions, and highlights of selected research activities that have had significant impact in this field is also included. The developed timbre recognition system follows a bottom-up, data-driven model that includes a pre-processing module, a feature extraction module, and an RBF/EBF (Radial/Elliptical Basis Function) neural-network-based pattern recognition module. 829 monophonic samples from 12 instruments were chosen from the Peter Siedlaczek library (Best Service), along with other samples from the Internet and personal collections. Significant emphasis has been put on feature extraction development and testing to achieve robust and consistent feature vectors that are eventually passed to the neural network module. In order to avoid a garbage-in-garbage-out (GIGO) trap and improve generality, extra care was taken in designing and testing the developed algorithms using various dynamics, different playing techniques, and a variety of pitches for each instrument, with inclusion of the attack and steady-state portions of each signal. Most of the research and development was conducted in Matlab. The compositional part of the essay includes brief introductions to "A d'Ess Are," "Aboji," "48 13 N, 16 20 O," and "pH-SQ," and presents a general outline of the ideas and concepts behind the architectural designs of the pieces, including formal structures, time structures, orchestration methods, and pitch structures.

  20. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. 
The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  1. The estimation of tax-benefit automatic stabilizers in Serbia: A combined micro-macro approach

    Directory of Open Access Journals (Sweden)

    Ranđelović Saša

    2013-01-01

    The large volatility of GDP due to the economic crisis, particularly in transition economies, has brought the issue of automatic stabilizers back into the focus of economic policy. The vast majority of the empirical literature in this field relates to the estimation of the size of automatic stabilizers in developed countries, usually based on macroeconomic data. On the other hand, the empirical literature on this topic based on micro data, particularly for transition economies, is limited. This paper provides an evaluation of the size of automatic stabilizers in one transition economy (Serbia) by combining tax-benefit simulation modelling based on micro data with econometric methods based on macroeconomic data. The results show that, in the case of a shock, around 17% of a fall in market income would be absorbed by automatic stabilizers. Although the stabilizing effects of the tax-benefit system in Serbia are lower than in other European countries, the total size of automatic stabilizers is close to the average value in these countries, due to the higher elasticity of demand to income. The results also show that a progressivity-enhancing income tax reform would only slightly increase automatic stabilizers, due to the large informal economy and the large share of agriculture in total household income.
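
    The income-stabilisation coefficient underlying such estimates can be illustrated with a toy micro-simulation; the tax/benefit schedule and household incomes below are invented and bear no relation to Serbia's actual system:

```python
# Hypothetical micro-simulation of the income-stabilisation coefficient
# tau = 1 - (change in disposable income) / (change in market income).
# The stylised tax/benefit schedule and incomes are illustrative only.
def disposable(market_income):
    tax = 0.15 * market_income                        # stylised income tax
    contrib = 0.10 * market_income                    # social contributions
    benefit = 50.0 if market_income < 500.0 else 0.0  # means-tested benefit
    return market_income - tax - contrib + benefit

def stabilisation_coefficient(incomes, shock=0.95):
    d_market = sum(y * (1 - shock) for y in incomes)
    d_disp = sum(disposable(y) - disposable(y * shock) for y in incomes)
    return 1.0 - d_disp / d_market

incomes = [300.0, 800.0, 1500.0, 2500.0]   # four hypothetical households
tau = stabilisation_coefficient(incomes)
```

    Here a 5% market-income shock yields tau = 0.25, i.e. a quarter of the shock is absorbed by taxes and contributions; the paper's estimate for Serbia is about 17%.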

  2. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    30 Mineral Resources (2010-07-01 edition), UNDERGROUND COAL MINES, Thermal Dryers. § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for thermal dryer system shall be of the recording type. (b) Automatic...

  3. 46 CFR 52.01-10 - Automatic controls.

    Science.gov (United States)

    2010-10-01

    46 Shipping (2010-10-01 edition), Requirements. § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  4. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    46 Shipping (2010-10-01 edition), AUXILIARY BOILERS, Requirements for Specific Types of Automatic Auxiliary Boilers. § 63.25-1 Small automatic auxiliary boilers. Small automatic auxiliary boilers defined as having heat-input ratings of 400,000 Btu/hr...

  5. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    14 Aeronautics and Space (2010-01-01 edition), Installation. § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and positively...

  6. Automatic Mosaicking of Satellite Imagery Considering the Clouds

    Science.gov (United States)

    Kang, Yifei; Pan, Li; Chen, Qi; Zhang, Tong; Zhang, Shasha; Liu, Zhang

    2016-06-01

    With the rapid development of high-resolution remote sensing for earth observation, satellite imagery is widely used in the fields of resource investigation, environmental protection, and agricultural research. Image mosaicking is an important part of satellite imagery production. However, the presence of clouds creates two main problems for automatic image mosaicking: 1) image blurring may be introduced during the dodging process, and 2) automatically generated seamlines may pass through cloudy areas. To address these problems, an automatic mosaicking method for cloudy satellite imagery is proposed in this paper. Firstly, modified Otsu thresholding and morphological processing are employed to extract cloudy areas and obtain the percentage of cloud cover. Then, the cloud detection results are used to optimize the dodging and mosaicking processes, so that the mosaic is composed of clear-sky areas rather than cloudy areas wherever possible, and clear-sky areas remain sharp and undistorted. Chinese GF-1 wide-field-of-view orthoimages are employed as experimental data. The performance of the proposed approach is evaluated in four aspects: the effect of cloud detection, the sharpness of clear-sky areas, the rationality of the seamlines, and efficiency. The evaluation results demonstrate that the mosaic obtained by our method has fewer clouds, better internal color consistency, and better visual clarity than that obtained by the traditional method. The time consumed by the proposed method for 17 scenes of GF-1 orthoimages is within 4 hours on a desktop computer, which meets general production requirements for massive satellite imagery.
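
    The standard Otsu step in such a pipeline can be sketched in pure Python (without the paper's modifications or the morphological clean-up; the toy "scene" below is hypothetical):

```python
# Sketch of cloud masking with Otsu thresholding: pick the threshold
# that maximises between-class variance, then treat the bright class
# as cloud. Pixel values are hypothetical 8-bit intensities.
def otsu_threshold(pixels):
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0
    for t in range(256):
        w_b += hist[t]                    # background weight (values <= t)
        if w_b == 0:
            continue
        w_f = total - w_b                 # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mean_b = sum_b / w_b
        mean_f = (total_sum - sum_b) / w_f
        var_between = w_b * w_f * (mean_b - mean_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# toy scene: dark ground (~40-45) with a bright cloud patch (220)
pixels = [40] * 80 + [45] * 10 + [220] * 10
t = otsu_threshold(pixels)
cloud_cover = sum(p > t for p in pixels) / len(pixels)
```

    On this toy histogram the threshold lands between the ground and cloud modes, and the cloud-cover percentage comes out at 10%.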

  7. Automatic Adviser on stationary devices status identification and anticipated change

    Science.gov (United States)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Pushkarev, E. A.

    2018-05-01

    The task is defined of synthesizing an Automatic Adviser that identifies the status of stationary automation devices using an autoregressive model of changes in their key parameters. The choice of the applied model type is justified, and an algorithm for the monitoring process of the research objects is developed. A complex for simulating the operation status of mobile objects and analyzing the prediction results is proposed. The research results are illustrated with the specific example of a hump yard compressor station. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.
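
    The paper's model order and decision thresholds are not given; as a generic sketch, an AR(1) model fitted to a stationary stretch of a device parameter can flag a status change when the one-step prediction residual leaves a 3-sigma band (data, model order, and threshold are all assumptions):

```python
# Status identification with an autoregressive model (illustrative):
# fit AR(1) on a training window, then raise an alarm whenever the
# one-step prediction residual exceeds 3 sigma of the training residuals.
def fit_ar1(series):
    x, y = series[:-1], series[1:]
    num = sum(a * b for a, b in zip(x, y))
    den = sum(a * a for a in x)
    return num / den                      # least-squares AR(1) coefficient

def detect_change(series, train_len):
    phi = fit_ar1(series[:train_len])
    resid = [series[i] - phi * series[i - 1] for i in range(1, train_len)]
    mu = sum(resid) / len(resid)
    sigma = (sum((r - mu) ** 2 for r in resid) / len(resid)) ** 0.5
    alarms = []
    for i in range(train_len, len(series)):
        r = series[i] - phi * series[i - 1]
        if abs(r - mu) > 3 * sigma:
            alarms.append(i)
    return alarms

# hypothetical compressor pressure: stable, then a step fault at index 30
series = [10.0 + 0.01 * ((-1) ** i) for i in range(30)] + [14.0] * 10
alarms = detect_change(series, train_len=30)
```

    The step change is flagged at the first faulty sample; once the series settles at the new level, AR(1) predictions track it again and no further alarms fire.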

  8. MadEvent: automatic event generation with MadGraph

    International Nuclear Information System (INIS)

    Maltoni, Fabio; Stelzer, Tim

    2003-01-01

    We present a new multi-channel integration method and its implementation in the multi-purpose event generator MadEvent, which is based on MadGraph. Given a process, MadGraph automatically identifies all the relevant subprocesses, generates both the amplitudes and the mappings needed for an efficient integration over the phase space, and passes them to MadEvent. As a result, a process-specific, stand-alone code is produced that allows the user to calculate cross sections and produce unweighted events in a standard output format. Several examples are given for processes that are relevant for physics studies at present and forthcoming colliders. (author)
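
    The multi-channel idea (a mixture of channel densities, each adapted to one peaking structure of the integrand) can be illustrated on a one-dimensional toy integrand with two Cauchy peaks; everything below is invented for the demo and is not MadEvent code:

```python
import math
import random

# Illustrative multi-channel importance sampling: two truncated-Cauchy
# channels on [0, 1], one per resonance peak of the toy integrand.
random.seed(1)
GAMMA = 0.1
PEAKS = [0.3, 0.7]

def f(x):
    return sum(1.0 / ((x - c) ** 2 + GAMMA ** 2) for c in PEAKS)

def channel_norm(c):
    return math.atan((1 - c) / GAMMA) + math.atan(c / GAMMA)

def sample_channel(c):
    # inverse-CDF sampling of a Cauchy peak truncated to [0, 1]
    u = random.random()
    lo = -math.atan(c / GAMMA)
    return c + GAMMA * math.tan(lo + u * channel_norm(c))

def channel_density(x, c):
    return 1.0 / (GAMMA * channel_norm(c) * (1 + ((x - c) / GAMMA) ** 2))

def integrate(n=20000, alphas=(0.5, 0.5)):
    total = 0.0
    for _ in range(n):
        k = 0 if random.random() < alphas[0] else 1
        x = sample_channel(PEAKS[k])
        g = sum(a * channel_density(x, c) for a, c in zip(alphas, PEAKS))
        total += f(x) / g                  # importance weight
    return total / n

estimate = integrate()
exact = sum(channel_norm(c) / GAMMA for c in PEAKS)   # analytic integral
```

    Because each channel here matches one peak exactly, the weights f/g are nearly constant and the estimate is essentially exact; with imperfect channels, the mixture weights alpha_i would be tuned to minimize the variance.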

  9. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA]; Cowley, Wendy E [Richland, WA]; Crow, Vernon L [Richland, WA]; Cramer, Nicholas O [Richland, WA]

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
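
    The scoring described above (degree/frequency word scores summed over each candidate phrase) can be sketched as follows; the stop-word list is a small stand-in for a real one:

```python
import re

# Minimal RAKE sketch: candidate phrases are runs of words between stop
# words/punctuation; each word scores degree/frequency, and a phrase
# scores the sum of its word scores. The stop list is truncated.
STOP = {"a", "an", "and", "are", "as", "be", "by", "can", "for", "in",
        "is", "of", "on", "or", "that", "the", "this", "to", "used", "with"}

def candidate_phrases(text):
    words = re.split(r"[^a-zA-Z]+", text.lower())
    phrases, current = [], []
    for w in words:
        if not w or w in STOP:             # delimiters end a candidate
            if current:
                phrases.append(tuple(current))
            current = []
        else:
            current.append(w)
    if current:
        phrases.append(tuple(current))
    return phrases

def rake(text):
    phrases = candidate_phrases(text)
    freq, degree = {}, {}
    for ph in phrases:
        for w in ph:
            freq[w] = freq.get(w, 0) + 1
            degree[w] = degree.get(w, 0) + len(ph)   # co-occurrence degree
    scores = {ph: sum(degree[w] / freq[w] for w in ph) for ph in phrases}
    return sorted(scores, key=scores.get, reverse=True)

top = rake("Rapid automatic keyword extraction is used for information "
           "retrieval and analysis of candidate keywords in large "
           "document collections.")
```

    Longer multi-word phrases accumulate degree, so "rapid automatic keyword extraction" outranks the shorter candidates.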

  10. Subsurface occurrence and potential source areas of chlorinated ethenes identified using concentrations and concentration ratios, Air Force Plant 4 and Naval Air Station-Joint Reserve Base Carswell Field, Fort Worth, Texas

    Science.gov (United States)

    Garcia, C. Amanda

    2005-01-01

    The U.S. Geological Survey, in cooperation with the U.S. Air Force Aeronautical Systems Center, Environmental Management Directorate, conducted a study during 2003-05 to characterize the subsurface occurrence and identify potential source areas of the volatile organic compounds classified as chlorinated ethenes at U.S. Air Force Plant 4 (AFP4) and adjacent Naval Air Station-Joint Reserve Base Carswell Field (NAS-JRB) at Fort Worth, Texas. The solubilized chlorinated ethenes detected in the alluvial aquifer originated as either released solvents (tetrachloroethene [PCE], trichloroethene [TCE], and trans-1,2-dichloroethene [trans-DCE]) or degradation products of the released solvents (TCE, cis-1,2-dichloroethene [cis-DCE], and trans-DCE). The combined influences of topographic- and bedrock-surface configurations result in a water table that generally slopes away from a ground-water divide approximately coincident with bedrock highs and the 1-mile-long aircraft assembly building at AFP4. Highest TCE concentrations (10,000 to 920,000 micrograms per liter) occur near Building 181, west of Building 12, and at landfill 3. Highest PCE concentrations (500 to 920 micrograms per liter) occur near Buildings 4 and 5. Highest cis-DCE concentrations (5,000 to 710,000 micrograms per liter) occur at landfill 3. Highest trans-DCE concentrations (1,000 to 1,700 micrograms per liter) occur just south of Building 181 and at landfill 3. Ratios of parent-compound to daughter-product concentrations that increase in relatively short distances (tens to 100s of feet) along downgradient ground-water flow paths can indicate a contributing source in the vicinity of the increase. Largest increases in ratio of PCE to TCE concentrations are three orders of magnitude from 0.01 to 2.7 and 7.1 between nearby wells in the northeastern part of NAS-JRB. In the northern part of NAS-JRB, the largest increases in TCE to total DCE concentration ratios relative to ratios at upgradient wells are from 17 to

  11. Examples of testing global identifiability of biological and biomedical models with the DAISY software.

    Science.gov (United States)

    Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina

    2010-04-01

    DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification, which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any obtained numerical estimates will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two further examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability, and to the computational complexity of the software. The main focus of this paper is not the description of the mathematical background of the algorithm, which has been presented elsewhere, but illustrating its use and some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.
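
    What "lack of global identifiability" means in practice can be shown numerically on a textbook one-compartment model (not one of the paper's benchmark models): for dx/dt = -k x, x(0) = x0, y = c x, only k and the product c*x0 can be recovered from the output.

```python
import math

# Non-identifiability demo for dx/dt = -k x, x(0) = x0, y = c x.
# Two parameter sets sharing k and the product c*x0 generate identical
# outputs, so c and x0 are not individually identifiable from y alone.
def output(k, c, x0, times):
    return [c * x0 * math.exp(-k * t) for t in times]

times = [0.1 * i for i in range(20)]
y1 = output(k=0.5, c=2.0, x0=3.0, times=times)   # c*x0 = 6
y2 = output(k=0.5, c=1.0, x0=6.0, times=times)   # c*x0 = 6, same k
y3 = output(k=0.6, c=2.0, x0=3.0, times=times)   # different k

indistinguishable = max(abs(a - b) for a, b in zip(y1, y2)) < 1e-12
distinguishable = max(abs(a - b) for a, b in zip(y1, y3)) > 1e-3
```

    Any estimator fed y1 could return either (c, x0) pair with equal justification, which is exactly the situation tools like DAISY detect symbolically before estimation is attempted.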

  12. Microprocessor controlled system for automatic and semi-automatic syntheses of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.

    1986-01-01

    A computer-based system has been constructed to control the automatic synthesis of 2-deoxy-2-(18F)fluoro-D-glucose and is also being used in the development of an automatic synthesis of L-6-(18F)fluorodopa. (author)

  13. Automatic Data Logging and Quality Analysis System for Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yong-Yi Fanjiang

    2017-01-01

    The testing phase of mobile device products includes two important test projects that must be completed before shipment: the field trial and the beta user trial. During the field trial, the product is certified based on its integration and stability with the local operator's system, and, during the beta user trial, the product is certified by multiple users regarding its daily use, where the goal is to detect and solve problems early. In the traditional approach to issue reporting, testers must log into a web site, fill out a problem form, and then upload logs through a browser or FTP; this is inconvenient, and problems are reported slowly. Therefore, we propose an "automatic logging analysis system" (ALAS) to construct a convenient test environment: a record analysis (log parser) program automates the parsing of log files, and issues are automatically sent to the database by the system. Finally, the mean time between failures (MTBF) is used to establish measurement indicators for the beta user trial.
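
    The log-parsing and MTBF steps can be sketched with a made-up log format (ALAS's real format and database schema are not described in the abstract):

```python
# Hypothetical log format and parser in the spirit of ALAS: each line
# carries an epoch timestamp, a level, and a message; FATAL lines count
# as failures. MTBF = observed operating span / number of failures.
LOG = """\
1000 INFO boot completed
2500 FATAL modem reset
4000 INFO call established
7000 FATAL app crash
9000 INFO heartbeat
"""

def parse(log_text):
    events = []
    for line in log_text.splitlines():
        ts, level, msg = line.split(" ", 2)
        events.append((int(ts), level, msg))
    return events

def mtbf(events):
    failures = [ts for ts, level, _ in events if level == "FATAL"]
    span = events[-1][0] - events[0][0]
    return span / len(failures)

events = parse(LOG)
value = mtbf(events)
```

    With two failures over a span of 8000 time units, the computed MTBF is 4000.0; a real system would also bucket issues by component before writing them to the database.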

  14. Automatic control system at the ''Loviisa'' NPP

    International Nuclear Information System (INIS)

    Kukhtevich, I.V.; Mal'tsev, B.K.; Sergievskaya, E.N.

    1980-01-01

    The automatic control system of the Loviisa-1 NPP (Finland) is described. Under the operating conditions of the Finnish power system, the Loviisa-1 NPP must operate in the mode of weekly and daily control of the loading schedule and participate in the ongoing control of power system frequency and capacity. To meet these requirements, the NPP is equipped with an all-regime system for automatic control that functions during reactor start-up and shut-down, in normal and transient regimes, and in emergency situations. The automatic control system includes: a data subsystem, an automatic control subsystem, a discrete control subsystem (including remote control), a subsystem for reactor control and protection, and the overall station protection system, including in-reactor control and dosimetry. The structures of the data-computer complex, the discrete control subsystems, the reactor control and protection systems, the neutron flux control system, the in-reactor control system, the station protection system, and the system for monitoring fuel element tightness are briefly presented. Two years of operating experience at the NPP confirmed the advisability of the chosen degree of automation. The Loviisa-1 NPP operates successfully in the mode of weekly and daily control of the loading schedule and ongoing frequency control (short-term control)

  15. Automatically sweeping dual-channel boxcar integrator

    International Nuclear Information System (INIS)

    Keefe, D.J.; Patterson, D.R.

    1978-01-01

    An automatically sweeping dual-channel boxcar integrator has been developed to automate the search for a signal that repeatedly follows a trigger pulse by a constant or slowly varying time delay when that signal is completely hidden in random electrical noise and dc-offset drifts. The automatically sweeping dual-channel boxcar integrator improves the signal-to-noise ratio and eliminates dc-drift errors in the same way that a conventional dual-channel boxcar integrator does, but, in addition, automatically locates the hidden signal. When the signal is found, its time delay is displayed with 100-ns resolution, and its peak value is automatically measured and displayed. This relieves the operator of the tedious, time-consuming, and error-prone search for the signal whenever the time delay changes. The automatically sweeping boxcar integrator can also be used as a conventional dual-channel boxcar integrator. In either mode, it can repeatedly integrate a signal up to 990 times and thus make accurate measurements of the signal pulse height in the presence of random noise, dc offsets, and unsynchronized interfering signals.
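
    The recovery principle (integrate up to 990 sweeps; the synchronous signal adds coherently while noise averages down as 1/sqrt(N)) can be demonstrated with a toy simulation; bin positions, amplitudes, and the noise level are arbitrary choices:

```python
import random

# Toy demonstration of gated-sweep averaging: a signal that always
# follows the trigger adds coherently over N sweeps, while random noise
# averages down as 1/sqrt(N), exposing the hidden signal's time bin.
random.seed(42)
N_SWEEPS = 990                      # the instrument's stated maximum
SIGNAL_POS, AMPLITUDE, NOISE_RMS = 37, 0.5, 1.0

avg = [0.0] * 100                   # 100 time bins after the trigger
for _ in range(N_SWEEPS):
    for i in range(100):
        sample = random.gauss(0.0, NOISE_RMS)
        if i == SIGNAL_POS:
            sample += AMPLITUDE     # signal buried at half the noise RMS
        avg[i] += sample / N_SWEEPS

found = max(range(100), key=lambda i: avg[i])   # locate the hidden signal
```

    After 990 sweeps the residual noise RMS is about 0.03, so the 0.5-amplitude signal, invisible in any single sweep, stands out unambiguously in the averaged record.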

  16. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that produced by nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m3/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m3 of air, about 100-fold better than reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed, and the calculated radioxenon concentrations and raw gamma-ray spectra are automatically transmitted to data centers
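    The activity-ratio discrimination mentioned above rests on simple decay arithmetic. A hedged sketch follows: the half-lives are those quoted in the abstract, while the function name and the sample numbers are invented for illustration.

```python
import math

# Half-lives from the abstract, converted to hours.
HALF_LIFE_H = {"Xe-131m": 11.9 * 24, "Xe-133m": 2.19 * 24,
               "Xe-133": 5.24 * 24, "Xe-135": 9.10}

def decay_correct(activity, nuclide, delay_h):
    """Back-correct a measured activity to collection time:
    A0 = A * 2^(t / T_half)."""
    return activity * 2 ** (delay_h / HALF_LIFE_H[nuclide])

# A Xe-135 sample measured 9.10 h (one half-life) after collection
# had half its collection-time activity:
a0 = decay_correct(50.0, "Xe-135", 9.10)
print(a0)  # 100.0
```

    Decay-correcting all four nuclides to a common reference time is what makes their activity ratios comparable between samples.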

  17. Automatic identification of otologic drilling faults: a preliminary report.

    Science.gov (United States)

    Shen, Peng; Feng, Guodong; Cao, Tianyang; Gao, Zhiqiang; Li, Xisheng

    2009-09-01

    A preliminary study was carried out to identify parameters to characterize drilling faults when using an otologic drill under various operating conditions. An otologic drill was modified by the addition of four sensors. Under consistent conditions, the drill was used to simulate three important types of drilling faults and the captured data were analysed to extract characteristic signals. A multisensor information fusion system was designed to fuse the signals and automatically identify the faults. When identifying drilling faults, there was a high degree of repeatability and regularity, with an average recognition rate of >70%. This study shows that the variables measured change in a fashion that allows the identification of particular drilling faults, and that it is feasible to use these data to provide rapid feedback for a control system. Further experiments are being undertaken to implement such a system.

  18. 77 FR 3404 - Energy Conservation Standards for Automatic Commercial Ice Makers: Public Meeting and...

    Science.gov (United States)

    2012-01-24

    .... Email: [email protected] . SUPPLEMENTARY INFORMATION: I. Statutory Authority II. History of... feedback from interested parties on its analytical framework, models, and preliminary results. II. History... automatic commercial ice makers installed in the field, such as in hospitals and restaurants. Details of the...

  19. Integration of wireless sensor networks into automatic irrigation scheduling of a center pivot

    Science.gov (United States)

    A six-span center pivot system was used as a platform for testing two wireless sensor networks (WSN) of infrared thermometers. The cropped field was a semi-circle, divided into six pie shaped sections of which three were irrigated manually and three were irrigated automatically based on the time tem...

  20. Label-free sensor for automatic identification of erythrocytes using digital in-line holographic microscopy and machine learning.

    Science.gov (United States)

    Go, Taesik; Byeon, Hyeokjun; Lee, Sang Joon

    2018-04-30

    Cell types of erythrocytes should be identified because they are closely related to their functionality and viability. Conventional methods for classifying erythrocytes are time consuming and labor intensive. Therefore, an automatic and accurate erythrocyte classification system is indispensable in healthcare and biomedical fields. In this study, we proposed a new label-free sensor for automatic identification of erythrocyte cell types using digital in-line holographic microscopy (DIHM) combined with machine learning algorithms. A total of 12 features, including information on intensity distributions, morphological descriptors, and optical focusing characteristics, are quantitatively obtained from numerically reconstructed holographic images. All individual features for discocytes, echinocytes, and spherocytes are statistically different. To improve the performance of cell type identification, we adopted several machine learning algorithms, such as the decision tree model, support vector machine, linear discriminant classification, and k-nearest neighbor classification. With the aid of these machine learning algorithms, the extracted features are effectively utilized to distinguish erythrocytes. Among the four tested algorithms, the decision tree model exhibits the best identification performance for the training sets (n = 440, 98.18%) and test sets (n = 190, 97.37%). This proposed methodology, which smartly combines DIHM and machine learning, would be helpful for sensing abnormal erythrocytes and for computer-aided diagnosis of hematological diseases in the clinic. Copyright © 2017 Elsevier B.V. All rights reserved.
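    The paper classifies erythrocytes with decision trees, SVMs, LDA, and k-NN over 12 holographic features. As a minimal stand-in (not the authors' pipeline), a nearest-centroid classifier on two synthetic features shows the shape of such a feature-based identification step; the feature values and cluster positions are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_centroids(X, y):
    """Mean feature vector per cell type."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, x):
    """Assign the class whose centroid is nearest in feature space."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Synthetic stand-ins for two of the holographic features
# (e.g. a mean-intensity and a sphericity-like descriptor).
X = np.vstack([rng.normal([0, 0], 0.3, (50, 2)),   # "discocyte"
               rng.normal([3, 0], 0.3, (50, 2)),   # "echinocyte"
               rng.normal([0, 3], 0.3, (50, 2))])  # "spherocyte"
y = np.array(["discocyte"] * 50 + ["echinocyte"] * 50 + ["spherocyte"] * 50)

cents = fit_centroids(X, y)
print(predict(cents, np.array([2.9, 0.1])))  # echinocyte
```

    Any of the four classifiers named in the abstract would replace `predict` here; the feature-extraction step is what the holographic reconstruction supplies.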

  1. Automatic control variac system for electronic accelerator

    International Nuclear Information System (INIS)

    Zhang Shuocheng; Wang Dan; Jing Lan; Qiao Weimin; Ma Yunhai

    2006-01-01

    An automatic control variac system was designed to satisfy the control requirements of the electronic accelerator developed by the Institute. The design and operational principles, the structure of the system, and the software of the industrial PC and the microcontroller unit are described. The interfaces of the control module are RS232 and RS485. A fiber optic (FOC) interface can be set up if an industrial FOC network is necessary, which extends the field of application and improves the system's communication. Practice shows that the system can adjust the variac output voltage automatically and ensures accurate, automatic control of the electronic accelerator. The system is designed in accordance with general design principles and offers easy operation and maintenance, good expansibility, and low cost; it could therefore also be used in other industrial branches. (authors)

  2. Automatic weld torch guidance control system

    Science.gov (United States)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit-television optical automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light-sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light level signal digitized, and an 8-bit word transmitted to scratch-pad memory. From memory, the microprocessor performs an analysis of the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross-seam actuator digital drive motor controller to complete the closed-loop feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.
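    A cross-seam tracking error of the kind computed here can be sketched as an intensity-weighted centroid of one digitized scan line relative to the line centre. The pixel pitch and scan values below are invented; the real system analyzes full camera frames.

```python
def tracking_error_mm(scan_line, mm_per_pixel=0.05):
    """Cross-seam tracking error: intensity-weighted centroid of a
    digitized scan line, relative to the line centre, in millimetres.
    The seam is assumed to image as the bright feature in the line."""
    total = sum(scan_line)
    centroid = sum(i * v for i, v in enumerate(scan_line)) / total
    centre = (len(scan_line) - 1) / 2
    return (centroid - centre) * mm_per_pixel

# Bright seam feature two pixels right of centre on a 9-pixel line;
# the uniform background pulls the centroid partway back toward centre.
line = [1, 1, 1, 1, 1, 1, 10, 1, 1]
print(round(tracking_error_mm(line), 3))  # 0.05
```

    The sign of the returned error tells the cross-seam actuator which way to drive, and its magnitude scales the correction.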

  3. Oocytes Polar Body Detection for Automatic Enucleation

    Directory of Open Access Journals (Sweden)

    Di Chen

    2016-02-01

    Full Text Available Enucleation is a crucial step in cloning. In order to achieve automatic blind enucleation, we should detect the polar body of the oocyte automatically. The conventional polar body detection approaches have low success rate or low efficiency. We propose a polar body detection method based on machine learning in this paper. On one hand, the improved Histogram of Oriented Gradient (HOG algorithm is employed to extract features of polar body images, which will increase success rate. On the other hand, a position prediction method is put forward to narrow the search range of polar body, which will improve efficiency. Experiment results show that the success rate is 96% for various types of polar bodies. Furthermore, the method is applied to an enucleation experiment and improves the degree of automatic enucleation.
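    The HOG features used above for polar-body detection are built from histograms of gradient orientations. A minimal NumPy sketch of that core ingredient (not the paper's improved HOG, which adds block normalisation and other refinements):

```python
import numpy as np

def orientation_histogram(img, n_bins=9):
    """Histogram of gradient orientations weighted by gradient magnitude,
    the core ingredient of HOG features."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0  # unsigned orientation
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, 180), weights=mag)
    s = hist.sum()
    return hist / s if s > 0 else hist  # L1-normalised

# A vertical step edge produces purely horizontal gradients,
# so all the weight lands in the bin nearest 0 degrees:
img = np.zeros((8, 8))
img[:, 4:] = 1.0
h = orientation_histogram(img)
print(h.argmax())  # 0
```

    In a full HOG descriptor, such histograms are computed per cell, grouped into overlapping blocks, and concatenated before being fed to the classifier.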

  4. Automatic inference of indexing rules for MEDLINE

    Directory of Open Access Journals (Sweden)

    Shooshan Sonya E

    2008-11-01

    Full Text Available Abstract Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI, a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.
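    Induced indexing rules of the kind described can be thought of as trigger-term sets that fire a MeSH heading when all their terms occur. The rules below are invented toy examples; real MTI/ILP rules are learned from MEDLINE training data.

```python
# Hypothetical examples of the kind of term -> heading rules ILP can
# induce; these are NOT actual MTI rules.
RULES = [
    ({"renal", "kidney"}, "Kidney Diseases"),
    ({"randomized", "trial"}, "Randomized Controlled Trials as Topic"),
]

def recommend_headings(abstract, rules=RULES):
    """Fire every rule whose trigger terms all occur in the abstract."""
    words = set(abstract.lower().split())
    return [heading for terms, heading in rules if terms <= words]

text = "A randomized controlled trial of dialysis in chronic kidney failure"
print(recommend_headings(text))
```

    Only the second rule fires here: "randomized" and "trial" both occur, but "renal" does not, so the kidney rule stays silent even though "kidney" is present.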

  5. Mimicry and automatic imitation are not correlated

    Science.gov (United States)

    van Den Bossche, Sofie; Cracco, Emiel; Bardi, Lara; Rigoni, Davide; Brass, Marcel

    2017-01-01

    It is widely known that individuals have a tendency to imitate each other. However, different psychological disciplines assess imitation in different manners. While social psychologists assess mimicry by means of action observation, cognitive psychologists assess automatic imitation with reaction time based measures on a trial-by-trial basis. Although these methods differ in crucial methodological aspects, both phenomena are assumed to rely on similar underlying mechanisms. This raises the fundamental question whether mimicry and automatic imitation are actually correlated. In the present research we assessed both phenomena and did not find a meaningful correlation. Moreover, personality traits such as empathy, autism traits, and traits related to self- versus other-focus did not correlate with mimicry or automatic imitation either. Theoretical implications are discussed. PMID:28877197

  6. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided
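    Converting counted alpha tracks to a radon concentration is a normalization by detector area, exposure time, and a calibration factor. The calibration factor below is a placeholder; real values come from chamber calibration of the specific monitor design.

```python
def radon_concentration(tracks, area_cm2, exposure_days,
                        background_per_cm2=0.0, cf=0.002):
    """Radon concentration from counted alpha tracks.
    cf is a hypothetical calibration factor in
    (tracks/cm^2) per (Bq/m^3 * day); real values are determined by
    exposing monitors in a calibrated radon chamber."""
    density = tracks / area_cm2 - background_per_cm2
    return density / (cf * exposure_days)

# 180 tracks on 1 cm^2 of registration material after a 90-day deployment:
print(round(radon_concentration(180, 1.0, 90), 1))  # approx. 1000 Bq/m^3
```

    Automatic counting of the spliced registration strips supplies the `tracks` figure; everything else is bookkeeping per monitor.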

  7. Automatic welding of stainless steel tubing

    Science.gov (United States)

    Clautice, W. E.

    1978-01-01

    The use of automatic welding for making girth welds in stainless steel tubing was investigated as well as the reduction in fabrication costs resulting from the elimination of radiographic inspection. Test methodology, materials, and techniques are discussed, and data sheets for individual tests are included. Process variables studied include welding amperes, revolutions per minute, and shielding gas flow. Strip chart recordings, as a definitive method of insuring weld quality, are studied. Test results, determined by both radiographic and visual inspection, are presented and indicate that once optimum welding procedures for specific sizes of tubing are established, and the welding machine operations are certified, then the automatic tube welding process produces good quality welds repeatedly, with a high degree of reliability. Revised specifications for welding tubing using the automatic process and weld visual inspection requirements at the Kennedy Space Center are enumerated.

  8. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects face from the stored video frame using skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experiment results indicate that using support vector machine as classifier can certainly improve the performance of automatic pain recognition system.

  9. Automatic color preference correction for color reproduction

    Science.gov (United States)

    Tsukada, Masato; Funayama, Chisato; Tajima, Johji

    2000-12-01

    The reproduction of natural objects in color images has attracted a great deal of attention. Reproducing more pleasing colors for natural objects is one of the methods available to improve image quality. We developed an automatic color correction method to maintain preferred color reproduction for three significant categories: facial skin color, green grass and blue sky. In this method, a representative color in an object area to be corrected is automatically extracted from an input image, and a set of color correction parameters is selected depending on the representative color. The improvement in image quality for reproductions of natural images was more than 93 percent in subjective experiments. These results show the usefulness of our automatic color correction method for the reproduction of preferred colors.
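    The category-wise correction described (extract a representative color, then shift the category's pixels toward a preferred color) can be sketched as follows; the colors, the category, and the correction strength are invented for illustration, not the paper's parameters.

```python
def correct_category(pixels, representative, preferred, strength=0.6):
    """Shift every pixel in a detected category (e.g. facial skin) by a
    fraction of the offset between the category's representative colour
    and the preferred colour for that category."""
    dr = [strength * (p - r) for p, r in zip(preferred, representative)]
    return [tuple(min(255, max(0, round(c + d)))
                  for c, d in zip(px, dr)) for px in pixels]

skin = [(200, 150, 130), (190, 140, 120)]   # pixels in the detected area
rep = (195, 145, 125)    # representative colour extracted from the image
pref = (205, 155, 135)   # hypothetical preferred skin tone
print(correct_category(skin, rep, pref))
```

    Shifting by a fraction of the offset, rather than the full offset, preserves the natural variation within the object area while moving it toward the preferred reproduction.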

  10. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset, in which an accidental release of radioactivity in the environment was simulated in the South-Western corner of the monitored area. However, it is capable of identifying a release of radioactivity provided that the number of sampling locations is sufficiently high. Consequently, we believe that a combination of applying the presented mapping approach and the physical knowledge of the transport processes of radioactivity should be used to predict the extreme values.
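    Interpolating point monitoring data onto a map, and the smoothing of extremes that comes with it, can be illustrated with simple inverse-distance weighting. This is a stand-in for the paper's geostatistical approach, not its actual method; the station layout and values are invented.

```python
def idw(stations, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from
    (xi, yi, value) monitoring stations. Like any weighted average,
    it smooths the field and underestimates isolated extremes."""
    num = den = 0.0
    for xi, yi, v in stations:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0:
            return v  # exactly at a station
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Three background stations and one hot station at a simulated release:
stations = [(0, 0, 10.0), (1, 0, 10.0), (0, 1, 10.0), (1, 1, 400.0)]
print(round(idw(stations, 0.5, 0.5), 1))  # smoothed well below the 400 peak
```

    At the centre of the square all four stations contribute equally, so the estimate is the plain mean (107.5), far below the 400 extreme: exactly the underestimation of peaks the abstract describes.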

  11. Automatic detection of biological cells

    International Nuclear Information System (INIS)

    Alves Da Costa, Caiuby

    1983-01-01

    The present research work dealt with the analysis of biological cell images in general, and more specifically with cervical cells. It was carried out in order to develop an automaton enabling better cancer prevention through automated mass screening. The device has been implemented on a Motorola 68000 microprocessor system. The automaton carries out cell nucleus analysis in several steps. The main steps are: first, the automaton focuses on an individual cell nucleus among the smear's cells (about 10,000); second, it processes each nucleus image. The digital processing yields geometrical features of the nucleus (area and perimeter) for each cell. These data are stored in a local memory for further discriminant analysis by a microcomputer. In this way smears are classed in two groups: healthy smears and uncertain smears. The automaton uses wired logic for image acquisition, and its software algorithms provide image reconstruction. The reconstruction algorithms are general purpose. Tests have proved that they can reconstruct any two-dimensional image independently of its geometrical form. Moreover, they can reconstruct any single image among the several images present in the observation field. The processing times registered during the tests (for different cases) were all below three minutes for 10,000 images (each formed by an average of 450 pixels). The advantages of the method are generality and speed. The only restriction is the length of the primary sensor (CCD linear array). Thus the automaton's application can be extended beyond the biological image field. (author) [fr
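    The geometrical features mentioned (area and perimeter of a segmented nucleus) can be computed by pixel counting on a binary mask; a minimal sketch on an invented mask:

```python
def area_perimeter(mask):
    """Area = foreground pixel count; perimeter = count of foreground
    pixels with at least one 4-connected background (or out-of-image)
    neighbour."""
    h, w = len(mask), len(mask[0])
    area = perim = 0
    for i in range(h):
        for j in range(w):
            if not mask[i][j]:
                continue
            area += 1
            nbrs = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
            if any(not (0 <= a < h and 0 <= b < w) or not mask[a][b]
                   for a, b in nbrs):
                perim += 1
    return area, perim

# A 3x3 square "nucleus": 9 pixels of area, 8 of them on the boundary.
mask = [[0, 0, 0, 0, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 0, 0, 0, 0]]
print(area_perimeter(mask))  # (9, 8)
```

    A discriminant analysis over such per-nucleus features is then what separates healthy from uncertain smears.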

  12. Automatic control system in the reactor peggy

    International Nuclear Information System (INIS)

    Bertrand, J.; Mourchon, R.; Da Costa, D.; Desandre-Navarre, Ch.

    1967-01-01

    The equipment makes it possible for the reactor to attain a given power automatically and for the power to be maintained around this level. The principle of its operation consists in the changing from one power to another, at constant period, by means of a programmer transforming a power-step request into a voltage variation which is linear with time and which represents the logarithm of the required power. The real power is compared continuously with the required power. Stabilization occurs automatically as soon as the difference between the reactor power and the required power diminishes to a few per cent. (authors) [fr
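    A constant-period power change, in which the logarithm of the requested power varies linearly with time, can be sketched as a setpoint generator. This illustrates the principle only, not the PEGGY programmer itself; the power levels and ramp time are invented.

```python
import math

def power_setpoint(p_start, p_end, t, ramp_time):
    """Constant-period power ramp: the logarithm of the setpoint varies
    linearly in time, so the power itself changes exponentially from
    p_start to p_end over ramp_time, then holds."""
    if t >= ramp_time:
        return p_end
    frac = t / ramp_time
    return math.exp((1 - frac) * math.log(p_start) + frac * math.log(p_end))

# Ramping 1 kW -> 100 kW in 60 s passes through 10 kW at the midpoint,
# i.e. equal numbers of decades in equal times:
print(round(power_setpoint(1.0, 100.0, 30.0, 60.0), 6))  # 10.0
```

    Comparing this setpoint with the measured power and stabilising once the difference falls to a few percent is the feedback step the abstract describes.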

  13. Automatic Smoker Detection from Telephone Speech Signals

    DEFF Research Database (Denmark)

    Poorjam, Amir Hossein; Hesaraki, Soheila; Safavi, Saeid

    2017-01-01

    This paper proposes an automatic smoking habit detection from spontaneous telephone speech signals. In this method, each utterance is modeled using i-vector and non-negative factor analysis (NFA) frameworks, which yield low-dimensional representation of utterances by applying factor analysis...... method is evaluated on telephone speech signals of speakers whose smoking habits are known drawn from the National Institute of Standards and Technology (NIST) 2008 and 2010 Speaker Recognition Evaluation databases. Experimental results over 1194 utterances show the effectiveness of the proposed approach...... for the automatic smoking habit detection task....

  14. Automatic segmentation of vertebrae from radiographs

    DEFF Research Database (Denmark)

    Mysling, Peter; Petersen, Peter Kersten; Nielsen, Mads

    2011-01-01

    Segmentation of vertebral contours is an essential task in the design of automatic tools for vertebral fracture assessment. In this paper, we propose a novel segmentation technique which does not require operator interaction. The proposed technique solves the segmentation problem in a hierarchical...... is constrained by a conditional shape model, based on the variability of the coarse spine location estimates. The technique is evaluated on a data set of manually annotated lumbar radiographs. The results compare favorably to the previous work in automatic vertebra segmentation, in terms of both segmentation...

  15. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    and specialization of classes (inheritance) are considered different abstractions. We present a new programming language, Lapis, that unifies inheritance and program specialization at the conceptual, syntactic, and semantic levels. This paper presents the initial development of Lapis, which uses inheritance...... with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented...

  16. Automatic incrementalization of Prolog based static analyses

    DEFF Research Database (Denmark)

    Eichberg, Michael; Kahl, Matthias; Saha, Diptikalyan

    2007-01-01

    Modern development environments integrate various static analyses into the build process. Analyses that analyze the whole project whenever the project changes are impractical in this context. We present an approach to automatic incrementalization of analyses that are specified as tabled logic...... programs and evaluated using incremental tabled evaluation, a technique for efficiently updating memo tables in response to changes in facts and rules. The approach has been implemented and integrated into the Eclipse IDE. Our measurements show that this technique is effective for automatically...

  17. Automatization of the radiation control measurements

    International Nuclear Information System (INIS)

    Seki, Akio; Ogata, Harumi; Horikoshi, Yoshinori; Shirai, Kenji

    1988-01-01

    The Plutonium Fuel Production Facility (PFPF) was constructed to fabricate the MOX fuels for the 'MONJU' and 'JOYO' reactors and to develop practical fuel fabrication technology. For the fuel fabrication process in this facility, a centralized control system has been adopted for mass production of the fuel and reduction of the radiation exposure dose. The radiation control systems are likewise suited to the large-scale facility and to the automatic, remote fuel fabrication process. One of the typical radiation control systems is the self-moving survey system, which was developed by PNC and adopted for automatic routine monitoring. (author)

  18. Automatic Control Of Length Of Welding Arc

    Science.gov (United States)

    Iceland, William F.

    1991-01-01

    Nonlinear relationships among current, voltage, and length stored in electronic memory. Conceptual microprocessor-based control subsystem maintains constant length of welding arc in gas/tungsten arc-welding system, even when welding current varied. Uses feedback of current and voltage from welding arc. Directs motor to set position of torch according to previously measured relationships among current, voltage, and length of arc. Signal paths marked "calibration" or "welding" used during those processes only. Other signal paths used during both processes. Control subsystem added to existing manual or automatic welding system equipped with automatic voltage control.
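    Recovering arc length from measured voltage via a stored calibration table, then issuing a proportional torch correction, can be sketched as below. The table values and gain are invented; the real system stores the full nonlinear current/voltage/length relationship measured during calibration.

```python
# Hypothetical calibration table at one fixed welding current:
# (arc voltage in V, arc length in mm).
CAL = [(10.0, 1.0), (12.0, 2.0), (14.5, 3.0), (17.5, 4.0)]

def arc_length_from_voltage(v, table=CAL):
    """Piecewise-linear lookup of arc length from measured arc voltage."""
    for (v0, l0), (v1, l1) in zip(table, table[1:]):
        if v0 <= v <= v1:
            return l0 + (l1 - l0) * (v - v0) / (v1 - v0)
    raise ValueError("voltage outside calibrated range")

def torch_correction(v_measured, length_setpoint, gain=0.5):
    """Feedback step: move the torch by a fraction of the length error."""
    return gain * (length_setpoint - arc_length_from_voltage(v_measured))

print(arc_length_from_voltage(13.25))  # midway between 2.0 and 3.0 mm
```

    With current varying during welding, the controller would select (or interpolate between) tables for different currents, which is where the stored nonlinear relationships come in.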

  19. Semi-automatic logarithmic converter of logs

    International Nuclear Information System (INIS)

    Gol'dman, Z.A.; Bondar's, V.V.

    1974-01-01

    An original semi-automatic converter was developed for converting BK resistance logging charts and the time interval, ΔT, of acoustic logs from a linear to a logarithmic scale with a specific ratio, for subsequent combination with neutron-gamma logging charts in operative interpretation of logging materials by a normalization method. The converter can be used to increase productivity by giving curves different from those obtained in manual, pointwise processing. The equipment operates reliably and is simple to use. (author)
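    The linear-to-logarithmic rescaling such a converter performs is, at its core, one formula; the chart span and resistivity values below are invented for illustration.

```python
import math

def to_log_scale(value, vmin, vmax, chart_width):
    """Map a value onto a logarithmic chart scale spanning
    [vmin, vmax] over chart_width length units."""
    return chart_width * math.log10(value / vmin) / math.log10(vmax / vmin)

# On a 10 cm chart spanning 1-100 ohm*m, 10 ohm*m sits at mid-scale:
print(to_log_scale(10.0, 1.0, 100.0, 10.0))  # 5.0
```

    Plotting both the resistivity curve and the ΔT curve on the same logarithmic scale is what allows them to be overlaid with the neutron-gamma charts for normalization.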

  20. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, 6 universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
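    The discrete wavelet transformation used to derive parameters from the eye area can be illustrated with a single Haar analysis step; this is a simplification for illustration, since the paper does not state which wavelet family was used.

```python
def haar_step(signal):
    """One level of the Haar discrete wavelet transform:
    pairwise averages (approximation) and differences (detail),
    the kind of coefficients usable as classifier inputs."""
    approx = [(a + b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / 2 for a, b in zip(signal[0::2], signal[1::2])]
    return approx, detail

row = [4, 6, 10, 12, 8, 8, 0, 2]  # e.g. one pixel row from the eye region
print(haar_step(row))  # ([5.0, 11.0, 8.0, 1.0], [-1.0, -1.0, 0.0, -1.0])
```

    Repeating the step on the approximation coefficients gives a multi-level decomposition; a subset of the resulting coefficients then feeds the neural network.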

  1. Robot-assisted automatic ultrasound calibration.

    Science.gov (United States)

    Aalamifar, Fereshteh; Cheng, Alexis; Kim, Younsu; Hu, Xiao; Zhang, Haichong K; Guo, Xiaoyu; Boctor, Emad M

    2016-10-01

    Ultrasound (US) calibration is the process of determining the unknown transformation from a coordinate frame such as the robot's tooltip to the US image frame and is a necessary task for any robotic or tracked US system. US calibration requires submillimeter-range accuracy for most applications, but it is a time-consuming and repetitive task. We provide a new framework for automatic US calibration with robot assistance and without the need for temporal calibration. US calibration based on an active echo (AE) phantom was previously proposed, and its superiority over conventional cross-wire phantom-based calibration was shown. In this work, we use AE to guide the robotic arm motion through the process of data collection; we combine the capability of the AE point to localize itself in the frame of the US image with the automatic motion of the robotic arm to provide a framework for calibrating the arm to the US image automatically. We demonstrated the efficacy of the automated method compared to the manual method through experiments. To highlight the necessity of frequent ultrasound calibration, it is demonstrated that the calibration precision changed from 1.67 to 3.20 mm if the data collection is not repeated after a dismounting/mounting of the probe holder. In a large data set experiment, similar reconstruction precision of automatic and manual data collection was observed, while the time was reduced by 58%. In addition, we compared ten automatic calibrations with ten manual ones, each performed in 15 min, and showed that all the automatic ones could converge when the initial matrix was set to identity, while this was not achieved with the manual data sets. Given the same initial matrix, the repeatability of the automatic method was [0.46, 0.34, 0.80, 0.47] versus [0.42, 0.51, 0.98, 1.15] mm in the manual case for the four corners of the US image. The submillimeter accuracy requirement of US calibration makes frequent data collections unavoidable. We proposed an automated
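    US calibration ultimately estimates an unknown rigid transform from point correspondences. A hedged sketch of that underlying step (Kabsch/SVD rigid registration on synthetic points, not the authors' AE-phantom pipeline):

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rotation R and translation t with Q ~ R @ P + t,
    via the Kabsch/SVD method on centred 3xN point sets."""
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Recover a known 90-degree rotation about z plus a translation:
rng = np.random.default_rng(2)
P = rng.normal(size=(3, 10))
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
t_true = np.array([[1.0], [2.0], [3.0]])
R, t = rigid_fit(P, R_true @ P + t_true)
print(np.allclose(R, R_true), np.allclose(t, t_true))  # True True
```

    In the actual framework, the AE point's self-localization in the image frame supplies one side of the correspondences and the robot kinematics the other, and the calibration solver accounts for scale and noise on top of this rigid core.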

  2. Some results of automatic processing of images

    International Nuclear Information System (INIS)

    Golenishchev, I.A.; Gracheva, T.N.; Khardikov, S.V.

    1975-01-01

    The problems of automatic deciphering of radiographic pictures, the purpose of which is to draw a conclusion concerning the quality of the inspected product on the basis of the product defect images in the picture, are considered. The methods of defect image recognition are listed, and the algorithms and the class features of defects are described. The results of deciphering a small radiographic picture by means of the ''Minsk-22'' computer are presented. It is established that the sensitivity of the automatic deciphering method is close to that obtained with visual deciphering

  3. Development of automatic laser welding system

    International Nuclear Information System (INIS)

    Ohwaki, Katsura

    2002-01-01

    Lasers are a new production tool for high-speed, low-distortion welding, and their application to automatic welding lines is increasing. IHI has long experience of laser processing for the preservation of nuclear power plants, the welding of airplane engines, and so on. Moreover, YAG laser oscillators and various kinds of hardware have been developed for laser welding and automation. Combining these welding technologies and laser hardware technologies produces the automatic laser welding system. In this paper, the component technologies are described, including combined optics intended to improve welding stability, laser oscillators, a monitoring system, a seam tracking system and so on. (author)

  4. Automatic Evaluations and Exercising: Systematic Review and Implications for Future Research.

    Science.gov (United States)

    Schinkoeth, Michaela; Antoniewicz, Franziska

    2017-01-01

    The general purpose of this systematic review was to summarize, structure and evaluate the findings on automatic evaluations of exercising. Studies were eligible for inclusion if they reported measuring automatic evaluations of exercising with an implicit measure and assessed some kind of exercise variable. Fourteen nonexperimental and six experimental studies (out of a total N = 1,928) were identified and rated by two independent reviewers. The main study characteristics were extracted and the grade of evidence for each study evaluated. First, results revealed a large heterogeneity in the applied measures to assess automatic evaluations of exercising and the exercise variables. Generally, small to large-sized significant relations between automatic evaluations of exercising and exercise variables were identified in the vast majority of studies. The review offers a systematization of the various examined exercise variables and prompts to differentiate more carefully between actually observed exercise behavior (proximal exercise indicator) and associated physiological or psychological variables (distal exercise indicator). Second, a lack of transparent reported reflections on the differing theoretical basis leading to the use of specific implicit measures was observed. Implicit measures should be applied purposefully, taking into consideration the individual advantages or disadvantages of the measures. Third, 12 studies were rated as providing first-grade evidence (lowest grade of evidence), five represent second-grade and three were rated as third-grade evidence. There is a dramatic lack of experimental studies, which are essential for illustrating the cause-effect relation between automatic evaluations of exercising and exercise and investigating under which conditions automatic evaluations of exercising influence behavior. 
Conclusions about the necessity of exercise interventions targeted at the alteration of automatic evaluations of exercising should therefore

  5. ANALYSIS OF EXISTING AND PROSPECTIVE TECHNICAL CONTROL SYSTEMS OF NUMERIC CODES AUTOMATIC BLOCKING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2013-09-01

    Full Text Available Purpose. To identify the characteristic features of engineering control systems for numeric-code automatic blocking, to identify their advantages and disadvantages, to analyze the possibility of their use in diagnosing the status of automatic-blocking devices, and to set targets for the development of new diagnostic systems. Methodology. In order to achieve these objectives, the theoretical-analytical method and the method of functional analysis have been used. Findings. The analysis of existing and future facilities for remote control and diagnostics of automatic-blocking devices showed that the existing diagnostic systems are not sufficiently informative and are designed primarily to monitor discrete parameters, which in turn does not allow a decision-support subsystem to be built on them. For the development of new technical diagnostic systems, it was proposed to use the principle of centralized distributed processing of diagnostic data and to include a decision-support subsystem in the diagnostic system; this will reduce the maintenance workload for blocking devices and shorten the recovery time after a failure occurs. Originality. Currently existing engineering control facilities for automatic blocking cannot provide a full assessment of the state of block signaling and interlocking devices. Criteria for the development of new technical diagnostic systems, with increased amounts of diagnostic information and its automatic analysis, were proposed. Practical value. The results of this analysis can be used in practice to select technical controls for automatic-blocking devices, as well as for the further development of automatic-blocking diagnostic systems that allow a gradual transition from a planned preventive maintenance model to service based on the actual state of the monitored devices.

  6. RECOVIR Software for Identifying Viruses

    Science.gov (United States)

    Chakravarty, Sugoto; Fox, George E.; Zhu, Dianhui

    2013-01-01

    Most single-stranded RNA (ssRNA) viruses mutate rapidly to generate a large number of strains with highly divergent capsid sequences. Determining the capsid residues or nucleotides that uniquely characterize these strains is critical in understanding the strain diversity of these viruses. RECOVIR (an acronym for "recognize viruses") software predicts the strains of some ssRNA viruses from their limited sequence data. Novel phylogenetic-tree-based databases of protein or nucleic acid residues that uniquely characterize these virus strains are created. Strains of input virus sequences (partial or complete) are predicted through residue-wise comparisons with the databases. RECOVIR uses these unique characterizing residues to automatically identify strains from partial or complete capsid sequences of picornaviruses and caliciviruses, two of the most highly diverse ssRNA virus families. Partition-wise comparisons of the database residues with the corresponding residues of more than 300 complete and partial sequences of these viruses resulted in correct strain identification for all of these sequences. This study shows the feasibility of creating databases of hitherto unknown residues that uniquely characterize the capsid sequences of two of the most highly divergent ssRNA virus families. These databases enable automated strain identification from partial or complete capsid sequences of these human and animal pathogens.
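    The residue-wise comparison idea can be sketched as follows. The database layout (strain name mapped to position-residue pairs), the strain names, and the fractional scoring are illustrative assumptions for this sketch, not RECOVIR's actual database format.

    ```python
    def predict_strain(sequence, signature_db):
        """Score each strain by the fraction of its uniquely characterizing
        residues that the input capsid sequence carries, and return the
        best-scoring strain. `signature_db` maps strain -> {position: residue}."""
        def score(signature):
            hits = sum(1 for pos, res in signature.items()
                       if pos < len(sequence) and sequence[pos] == res)
            return hits / len(signature)
        return max(signature_db, key=lambda s: score(signature_db[s]))

    # Hypothetical signature database with two strains
    db = {
        "strain_A": {3: "L", 7: "K", 12: "D"},
        "strain_B": {3: "V", 7: "R", 12: "E"},
    }
    print(predict_strain("MKTLAGIKQWQCDPE", db))  # strain_A
    ```

    A real system would also handle partial sequences whose coordinates must first be aligned to the database numbering; that alignment step is omitted here.
    
    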

  7. Automatic generation of groundwater model hydrostratigraphy from AEM resistivity and boreholes

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; Christiansen, A. V.

    2014-01-01

    Regional hydrological models are important tools in water resources management. Model prediction uncertainty is primarily due to structural (geological) non-uniqueness which makes sampling of the structural model space necessary to estimate prediction uncertainties. Geological structures and hete...... and discharge observations. The method was applied to field data collected at a Danish field site. Our results show that a competitive hydrological model can be constructed from the AEM dataset using the automatic procedure outlined above....

  8. Automatic Hidden-Web Table Interpretation by Sibling Page Comparison

    Science.gov (United States)

    Tao, Cui; Embley, David W.

    The longstanding problem of automatic table interpretation still eludes us. Its solution would not only be an aid to table processing applications such as large volume table conversion, but would also be an aid in solving related problems such as information extraction and semi-structured data management. In this paper, we offer a conceptual modeling solution for the common special case in which so-called sibling pages are available. The sibling pages we consider are pages on the hidden web, commonly generated from underlying databases. We compare them to identify and connect nonvarying components (category labels) and varying components (data values). We tested our solution using more than 2,000 tables in source pages from three different domains: car advertisements, molecular biology, and geopolitical information. Experimental results show that the system can successfully identify sibling tables, generate structure patterns, interpret tables using the generated patterns, and automatically adjust the structure patterns, if necessary, as it processes a sequence of hidden-web pages. For these activities, the system was able to achieve an overall F-measure of 94.5%.
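    The central comparison step can be illustrated with a minimal sketch: cells that stay constant across equally shaped sibling tables are treated as category labels, cells that vary are treated as data values. The function name and the toy car-advertisement tables are illustrative, not from the paper's system.

    ```python
    def classify_cells(sibling_tables):
        """Given equally shaped tables (lists of rows) scraped from sibling
        pages, mark each cell 'label' (constant across pages) or 'value'
        (varying across pages)."""
        rows, cols = len(sibling_tables[0]), len(sibling_tables[0][0])
        grid = []
        for r in range(rows):
            grid_row = []
            for c in range(cols):
                cell_texts = {table[r][c] for table in sibling_tables}
                grid_row.append("label" if len(cell_texts) == 1 else "value")
            grid.append(grid_row)
        return grid

    # Two sibling pages generated from the same underlying database
    page_a = [["Make", "Toyota"], ["Price", "$9,500"]]
    page_b = [["Make", "Honda"],  ["Price", "$8,200"]]
    print(classify_cells([page_a, page_b]))
    # [['label', 'value'], ['label', 'value']]
    ```

    The real system must also align tables whose structure shifts between pages; this sketch assumes identical layouts.
    
    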

  9. AISLE: an automatic volumetric segmentation method for the study of lung allometry.

    Science.gov (United States)

    Ren, Hongliang; Kazanzides, Peter

    2011-01-01

    We developed a fully automatic segmentation method for volumetric CT (computed tomography) datasets to support the construction of a statistical atlas for the study of allometric laws of the lung. The proposed segmentation method, AISLE (Automated ITK-Snap based on Level-set), is based on the level-set implementation from an existing semi-automatic segmentation program, ITK-Snap. AISLE can segment the lung field without human interaction and provide intermediate graphical results as desired. The preliminary experimental results show that the proposed method can achieve accurate segmentation, in terms of the volumetric overlap metric, by comparison with the ground-truth segmentation performed by a radiologist.
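    The basic idea of automatic lung-field extraction can be illustrated with a much simpler stand-in than AISLE's level-set evolution: threshold air-like voxels and keep the largest connected component. The HU threshold and function name below are illustrative assumptions, not part of AISLE.

    ```python
    import numpy as np
    from scipy import ndimage

    def largest_low_density_region(volume, threshold=-400):
        """Toy stand-in for automatic lung-field extraction: mask voxels with
        air-like attenuation (HU below `threshold`) and keep only the largest
        connected component. AISLE itself uses a level-set evolution instead."""
        mask = volume < threshold
        labels, n_components = ndimage.label(mask)
        if n_components == 0:
            return np.zeros_like(mask)
        sizes = ndimage.sum(mask, labels, range(1, n_components + 1))
        return labels == (np.argmax(sizes) + 1)

    # Synthetic "CT" volume: soft tissue (+40 HU) containing one air pocket (-800 HU)
    vol = np.full((20, 20, 20), 40.0)
    vol[5:15, 5:15, 5:15] = -800.0
    seg = largest_low_density_region(vol)
    print(int(seg.sum()))  # 1000 voxels in the 10x10x10 pocket
    ```

    A real pipeline would add morphological cleanup and separate the airways from the lung parenchyma before computing overlap metrics.
    
    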

  10. A framework for automatic segmentation in three dimensions of microstructural tomography data

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Hansen, Karin Vels; Larsen, Rasmus

    2010-01-01

    Routine use of quantitative three dimensional analysis of material microstructure by in particular, focused ion beam (FIB) serial sectioning is generally restricted by the time consuming task of manually delineating structures within each image slice or the quality of manual and automatic...... segmentation schemes. We present here a framework for performing automatic segmentation of complex microstructures using a level set method. The technique is based on numerical approximations to partial differential equations to evolve a 3D surface to capture the phase boundaries. Vector fields derived from...

  11. Automatic analysis of macerals and reflectance; Analisis Automatico de Macerales y Reflectancia

    Energy Technology Data Exchange (ETDEWEB)

    Catalina, J.C.; Alarcon, D.; Gonzalez Prado, J.

    1998-12-01

    A new system has been developed to automatically perform maceral and reflectance analyses of single-seam bituminous coals, improving the interlaboratory accuracy of these types of analyses. The system follows the same steps as the manual method, requiring a human operator only for the preparation of coal samples and system startup; sample scanning, microscope focusing and field-centre analysis are then fully automatic. The main and most innovative idea of this approach is to coordinate an expert system with an image-processing system, using both reflectance and morphological information. In this way, the system aims to reproduce the analysis procedure followed by a human expert in petrography. (Author)
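    The reflectance side of such an analysis can be caricatured as binning measured reflectance into maceral groups. The band boundaries below are rough illustrative placeholders, not the system's calibrated values, and the real system combines reflectance with morphological evidence via its expert system.

    ```python
    def classify_maceral(reflectance_pct):
        """Toy reflectance-only classification of a measured point into a
        maceral group. The thresholds are hypothetical placeholders."""
        if reflectance_pct < 0.5:
            return "liptinite"    # low-reflecting group
        elif reflectance_pct < 1.5:
            return "vitrinite"    # intermediate reflectance
        return "inertinite"       # high-reflecting group

    print([classify_maceral(r) for r in (0.2, 0.9, 2.1)])
    # ['liptinite', 'vitrinite', 'inertinite']
    ```
    
    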

  12. Improvement of remote control system of automatic ultrasonic equipment for inspection of reactor pressure vessel

    International Nuclear Information System (INIS)

    Cheong, Yong Moo; Jung, H. K.; Joo, Y. S.; Koo, K. M.; Hyung, H.; Sim, C. M.; Gong, U. S.; Kim, S. H.; Lee, J. P.; Rhoo, H. C.; Kim, M. S.; Ryoo, S. K.; Choi, C. H.; Oh, K. I.

    1999-12-01

    One of the important issues related to nuclear safety is the in-service inspection of the reactor pressure vessel (RPV). A remote-controlled automatic ultrasonic method is applied to this inspection. At present, operation of the automatic ultrasonic inspection system owned by KAERI is interrupted due to the degradation of its parts. In order to resume field inspection, a new remote control system was designed and installed on the existing equipment. New ultrasonic sensors and their modules for RPV inspection were designed and fabricated in accordance with the new requirements of the inspection codes. The ultrasonic sensors were verified for use in RPV inspection. (author)

  13. Improvement of remote control system of automatic ultrasonic equipment for inspection of reactor pressure vessel

    Energy Technology Data Exchange (ETDEWEB)

    Cheong, Yong Moo; Jung, H. K.; Joo, Y. S.; Koo, K. M.; Hyung, H.; Sim, C. M.; Gong, U. S.; Kim, S. H.; Lee, J. P.; Rhoo, H. C.; Kim, M. S.; Ryoo, S. K.; Choi, C. H.; Oh, K. I

    1999-12-01

    One of the important issues related to nuclear safety is the in-service inspection of the reactor pressure vessel (RPV). A remote-controlled automatic ultrasonic method is applied to this inspection. At present, operation of the automatic ultrasonic inspection system owned by KAERI is interrupted due to the degradation of its parts. In order to resume field inspection, a new remote control system was designed and installed on the existing equipment. New ultrasonic sensors and their modules for RPV inspection were designed and fabricated in accordance with the new requirements of the inspection codes. The ultrasonic sensors were verified for use in RPV inspection. (author)

  14. The development of an automatic scanning method for CR-39 neutron dosimeter

    International Nuclear Information System (INIS)

    Tawara, Hiroko; Miyajima, Mitsuhiro; Sasaki, Shin-ichi; Hozumi, Ken-ichi

    1989-01-01

    A method for measuring low-level neutron dose has been developed using CR-39 track detectors with an automatic scanning system. The system is composed of an optical microscope with a video camera, an image processor and a personal computer. The focus point of the microscope and the X-Y stage are controlled from the computer. From the results of automatic measurements, the minimum detectable neutron dose is estimated at 4.6 mrem in a uniform neutron field with an energy spectrum equivalent to that of an Am-Be source. (author)
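    The image-processing step of such a scanner boils down to counting etched-track candidates in each microscope field. A minimal sketch, assuming a grayscale frame where tracks appear dark; the threshold and minimum-area values are illustrative calibration placeholders.

    ```python
    import numpy as np
    from scipy import ndimage

    def count_tracks(frame, dark_threshold=60, min_area=4):
        """Count dark etched-track candidates in one microscope field by
        thresholding and connected-component labelling, rejecting blobs
        smaller than `min_area` pixels as noise."""
        mask = frame < dark_threshold
        labels, n_blobs = ndimage.label(mask)
        areas = ndimage.sum(mask, labels, range(1, n_blobs + 1))
        return int(np.count_nonzero(areas >= min_area))

    # Synthetic frame: bright background with two tracks and one noise pixel
    frame = np.full((64, 64), 200, dtype=np.uint8)
    frame[10:14, 10:14] = 20    # 16-pixel track
    frame[30:33, 40:43] = 25    # 9-pixel track
    frame[50, 50] = 10          # single-pixel noise, rejected by min_area
    print(count_tracks(frame))  # 2
    ```

    Dose would then be estimated from the track density over many fields using a source-specific calibration factor, which is omitted here.
    
    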

  15. Using Historical Data to Automatically Identify Air-Traffic Control Behavior

    Science.gov (United States)

    Lauderdale, Todd A.; Wu, Yuefeng; Tretto, Celeste

    2014-01-01

    This project seeks to develop statistics-based machine learning models to characterize the types of errors present when using current systems to predict future aircraft states. These models will be data-driven, based on large quantities of historical data. Once these models are developed, they will be used to infer situations in the historical data where an air-traffic controller intervened on an aircraft's route, even when there is no direct recording of this action.
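    One simple way to infer unrecorded interventions, sketched here as an illustration and not as the project's actual model, is to score each trajectory point by its prediction residual and flag statistical outliers.

    ```python
    import numpy as np

    def flag_interventions(predicted, actual, z_thresh=3.0):
        """Flag trajectory points whose prediction residual is an outlier
        (z-score above `z_thresh`), as a crude proxy for points where a
        controller intervened on the route. Names are illustrative."""
        residuals = np.linalg.norm(np.asarray(actual, float)
                                   - np.asarray(predicted, float), axis=1)
        mu, sigma = residuals.mean(), residuals.std()
        if sigma == 0:
            return []
        return [i for i, r in enumerate(residuals) if (r - mu) / sigma > z_thresh]

    # Straight-line prediction vs. a track with one abrupt lateral deviation
    pred = [(t, 0.0) for t in range(20)]
    act  = [(t, 0.0) for t in range(20)]
    act[12] = (12, 5.0)                   # sudden off-route excursion
    print(flag_interventions(pred, act))  # [12]
    ```

    A statistical model trained on historical residuals would replace the global z-score with learned, context-dependent error distributions.
    
    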

  16. Using the Chandra Source-Finding Algorithm to Automatically Identify Solar X-ray Bright Points

    Science.gov (United States)

    Adams, Mitzi L.; Tennant, A.; Cirtain, J. M.

    2009-01-01

    This poster details a technique of bright point identification that is used to find sources in Chandra X-ray data. The algorithm, part of a program called LEXTRCT, searches for regions of a given size that are above a minimum signal-to-noise ratio. The algorithm allows selected pixels to be excluded from the source-finding, thus allowing exclusion of saturated pixels (from flares and/or active regions). For Chandra data the noise is determined by photon counting statistics, whereas solar telescopes typically integrate a flux. Thus the calculated signal-to-noise ratio is incorrect, but we find we can scale the number to get reasonable results. For example, Nakakubo and Hara (1998) find 297 bright points in a September 11, 1996 Yohkoh image; with judicious selection of the signal-to-noise ratio, our algorithm finds 300 sources. To further assess the efficacy of the algorithm, we analyze a SOHO/EIT image (195 Angstroms) and compare results with those published in the literature (McIntosh and Gurman, 2005). Finally, we analyze three sets of data from Hinode, representing different parts of the decline to minimum of the solar cycle.
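    The box-scan, SNR-thresholded detection described above can be sketched on a photon-counting image as follows. This is a generic illustration with hypothetical parameter names, not LEXTRCT's actual implementation; the background is crudely taken as the global median count per pixel.

    ```python
    import numpy as np

    def find_bright_points(counts, box=3, snr_min=5.0, exclude=None):
        """Flag box centres whose summed counts exceed the expected
        background by `snr_min` times the Poisson background noise.
        Pixels marked True in `exclude` (e.g. saturated flare pixels)
        are skipped, mirroring the exclusion feature described above."""
        h, w = counts.shape
        valid = np.ones((h, w), dtype=bool)
        if exclude is not None:
            valid &= ~exclude
        bkg = np.median(counts[valid])            # background counts per pixel
        half = box // 2
        sources = []
        for y in range(half, h - half):
            for x in range(half, w - half):
                if not valid[y-half:y+half+1, x-half:x+half+1].all():
                    continue                      # box touches an excluded pixel
                total = counts[y-half:y+half+1, x-half:x+half+1].sum()
                signal = total - bkg * box * box
                noise = np.sqrt(max(bkg * box * box, 1.0))  # Poisson statistics
                if signal / noise >= snr_min:
                    sources.append((y, x))
        return sources

    rng = np.random.default_rng(0)
    img = rng.poisson(2.0, size=(32, 32)).astype(float)
    img[16, 16] += 100.0                          # inject one bright point
    hits = find_bright_points(img, box=3, snr_min=5.0)
    print(any(abs(y - 16) <= 1 and abs(x - 16) <= 1 for (y, x) in hits))  # True
    ```

    For flux-integrating solar telescopes the Poisson noise model no longer holds, which is exactly why the abstract notes the SNR must be rescaled.
    
    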

  17. Supporting Teachers in Identifying Students' Learning Styles in Learning Management Systems: An Automatic Student Modelling Approach

    Science.gov (United States)

    Graf, Sabine; Kinshuk; Liu, Tzu-Chien

    2009-01-01

    In learning management systems (LMSs), teachers have more difficulty noticing and knowing how individual students behave and learn in a course, compared to face-to-face education. Enabling teachers to know their students' learning styles and making students aware of their own learning styles increases teachers' and students' understanding about…

  18. Advances in automatic welding control

    International Nuclear Information System (INIS)

    White, D.; Woodacre, A.; Taylor, A.F.

    1972-01-01

    The development at the Reactor Fuel Element Laboratories, UKAEA Springfields, of a computer-based welding process control system, was aimed initially at the TIG welding of the end seals of nuclear fuel elements. The system provides for mixed multi-station operation with on-line real-time capability and can be used either as a research tool or for production requirements at competitive costs. The operation of the control system, the form of the power source, and the servo motor control units are described. Typically, continuous or pulse-arc welding sequences can be digitally programmed in 0.1 sec increments, with current in 0.5 A increments up to a maximum of 256 A; up to three servo motors can be operated with speeds selected in 0.1 percent increments of their maximum. Up to six welding parameters can be monitored digitally at speeds of up to once every 10 msec. Some applications are described and it is shown that the equipment has wider uses outside the nuclear fuel element field. High quality industrial welding requirements can also be met and the system is not limited to the TIG process.
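    The programming resolutions quoted above (0.1 s time steps, 0.5 A current steps up to 256 A, 0.1 % servo speed steps) can be made concrete with a small quantization sketch; the function name and tuple layout are illustrative, not from the UKAEA system.

    ```python
    def quantize_weld_step(duration_s, current_a, speed_pct):
        """Snap one programmed weld step to the stated resolutions:
        0.1 s time increments, 0.5 A current increments (capped at 256 A),
        and 0.1 % servo speed increments."""
        return (round(duration_s * 10) / 10,
                min(256.0, round(current_a * 2) / 2),
                round(speed_pct * 10) / 10)

    # A requested step of 2.34 s at 127.7 A with 33.333 % servo speed
    print(quantize_weld_step(2.34, 127.7, 33.333))  # (2.3, 127.5, 33.3)
    ```

    A full weld program would be a sequence of such steps executed by the real-time controller.
    
    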

  19. Advances in automatic welding control

    International Nuclear Information System (INIS)

    White, D.; Woodacre, A.; Taylor, A.F.

    1972-01-01

    The development at the Reactor Fuel Element Laboratories, UKAEA Springfields, of a computer-based welding process control system, was aimed initially at the TIG welding of the end seals of nuclear fuel elements. The system provides for mixed multi-station operation with on-line real-time capability and can be used either as a research tool or for production requirements at competitive costs. The operation of the control system, the form of the power source and the servo motor control units are described. Typically, continuous or pulse-arc welding sequences can be digitally programmed in 0.1 sec increments, with current in 0.5 A increments up to a maximum of 256 A; up to three servo motors can be operated with speeds selected in 0.1% increments of their maximum. Up to six welding parameters can be monitored digitally at speeds of up to once every 10 msec. Some applications are described and it is shown that the equipment has wider uses outside the nuclear fuel element field. High quality industrial welding requirements can also be met and the system is not limited to the TIG process. (author)

  20. Automatic spinal cord localization, robust to MRI contrasts using global curve optimization.

    Science.gov (United States)

    Gros, Charley; De Leener, Benjamin; Dupont, Sara M; Martin, Allan R; Fehlings, Michael G; Bakshi, Rohit; Tummala, Subhash; Auclair, Vincent; McLaren, Donald G; Callot, Virginie; Cohen-Adad, Julien; Sdika, Michaël

    2018-02-01

    During the last two decades, MRI has been increasingly used for providing valuable quantitative information about spinal cord morphometry, such as quantification of spinal cord atrophy in various diseases. However, despite the significant improvement of MR sequences adapted to the spinal cord, automatic image processing tools for spinal cord MRI data are not yet as developed as for the brain. There is nonetheless great interest in fully automatic and fast processing methods to be able to propose quantitative analysis pipelines on large datasets without user bias. The first step of most of these analysis pipelines is to detect the spinal cord, which is challenging to achieve automatically across the broad range of MRI contrasts, fields of view, resolutions and pathologies. In this paper, a fully automated, robust and fast method for detecting the spinal cord centerline on MRI volumes is introduced. The algorithm uses a global optimization scheme that attempts to strike a balance between a probabilistic localization map of the spinal cord center point and the overall spatial consistency of the spinal cord centerline (i.e. the rostro-caudal continuity of the spinal cord). Additionally, a new post-processing feature, which aims to automatically split brain and spine regions, is introduced, to be able to detect a consistent spinal cord centerline independently of the field of view. We present data on the validation of the proposed algorithm, known as "OptiC", from a large dataset involving 20 centers, 4 contrasts (T2-weighted n = 287, T1-weighted n = 120, T2*-weighted n = 307, diffusion-weighted n = 90), and 501 subjects including 173 patients with a variety of neurologic diseases. Validation involved the gold-standard centerline coverage, the mean square error between the true and predicted centerlines and the ability to accurately separate brain and spine regions. Overall, OptiC was able to cover 98.77% of the gold-standard centerline, with a