WorldWideScience

Sample records for fields automatically identified

  1. Automatically identifying periodic social events from Twitter

    NARCIS (Netherlands)

    Kunneman, F.A.; Bosch, A.P.J. van den

    2015-01-01

    Many events referred to on Twitter are of a periodic nature, characterized by roughly constant time intervals in between occurrences. Examples are annual music festivals, weekly television programs, and the full moon cycle. We propose a system that can automatically identify periodic events from Twitter...
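
    For illustration only (this is not the authors' system): a minimal sketch of the core test implied by the abstract, flagging an event series as periodic when its inter-occurrence intervals are roughly constant, i.e. have a low coefficient of variation. The 0.1 threshold is an assumption.

```python
from datetime import date
import statistics

def is_periodic(event_dates, max_cv=0.1):
    """Flag a series as periodic when its inter-occurrence intervals
    are roughly constant (low coefficient of variation)."""
    days = sorted(d.toordinal() for d in event_dates)
    intervals = [b - a for a, b in zip(days, days[1:])]
    if len(intervals) < 2:
        return False  # too few occurrences to judge periodicity
    cv = statistics.stdev(intervals) / statistics.mean(intervals)
    return cv <= max_cv

# A weekly television programme: constant seven-day gaps -> periodic.
weekly = [date(2015, 1, d) for d in (5, 12, 19, 26)]
print(is_periodic(weekly))  # True
```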

  2. Automatically Identifying Morphological Relations in Machine-Readable Dictionaries

    CERN Document Server

    Pentheroudakis, Joseph; Vanderwende, Lucy

    1994-01-01

    We describe an automated method for identifying classes of morphologically related words in an on-line dictionary, and for linking individual senses in the derived form to one or more senses in the base form by means of morphological relation attributes. We also present an algorithm for computing a score reflecting the system's certainty in these derivational links; this computation relies on the content of semantic relations associated with each sense, which are extracted automatically by parsing each sense definition and subjecting the parse structure to automated semantic analysis. By processing the entire set of headwords in the dictionary in this fashion we create a large set of directed derivational graphs, which can then be accessed by other components in our broad-coverage NLP system. Spurious or unlikely derivations are not discarded, but are rather added to the dictionary and assigned a negative score; this allows the system to handle non-standard uses of these forms.

  3. Automatic Tooth Segmentation of Dental Mesh Based on Harmonic Fields

    Directory of Open Access Journals (Sweden)

    Sheng-hui Liao

    2015-01-01

    An important preprocess in computer-aided orthodontics is to segment teeth from the dental models accurately, which should involve as few manual interactions as possible. But fully automatic partition of all teeth is not a trivial task, since teeth occur in different shapes and their arrangements vary substantially from one individual to another. The difficulty is exacerbated when severe teeth malocclusion and crowding problems occur, which is a common occurrence in clinical cases. Most published methods in this area are either inaccurate or require extensive manual interaction. Motivated by the state-of-the-art general mesh segmentation methods that adopted the theory of harmonic fields to detect partition boundaries, this paper proposes a novel, dental-targeted segmentation framework for dental meshes. With a specially designed weighting scheme and a strategy of a priori knowledge to guide the assignment of harmonic constraints, this method can identify teeth partition boundaries effectively. Extensive experiments and quantitative analysis demonstrate that the proposed method is able to partition high-quality teeth automatically with robustness and efficiency.

  4. Automatic Tooth Segmentation of Dental Mesh Based on Harmonic Fields.

    Science.gov (United States)

    Liao, Sheng-hui; Liu, Shi-jian; Zou, Bei-ji; Ding, Xi; Liang, Ye; Huang, Jun-hui

    2015-01-01

    An important preprocess in computer-aided orthodontics is to segment teeth from the dental models accurately, which should involve as few manual interactions as possible. But fully automatic partition of all teeth is not a trivial task, since teeth occur in different shapes and their arrangements vary substantially from one individual to another. The difficulty is exacerbated when severe teeth malocclusion and crowding problems occur, which is a common occurrence in clinical cases. Most published methods in this area are either inaccurate or require extensive manual interaction. Motivated by the state-of-the-art general mesh segmentation methods that adopted the theory of harmonic fields to detect partition boundaries, this paper proposes a novel, dental-targeted segmentation framework for dental meshes. With a specially designed weighting scheme and a strategy of a priori knowledge to guide the assignment of harmonic constraints, this method can identify teeth partition boundaries effectively. Extensive experiments and quantitative analysis demonstrate that the proposed method is able to partition high-quality teeth automatically with robustness and efficiency.

  5. Automatic detection of asteroids and meteoroids. A Wide Field Survey

    Science.gov (United States)

    Vereš, P.; Tóth, J.; Jedicke, R.; Tonry, J.; Denneau, L.; Wainscoat, R.; Kornoš, L.; Šilha, J.

    2014-07-01

    We propose a low-cost robotic optical survey aimed at 1-300 m Near Earth Objects (NEO) based on four state-of-the-art telescopes having an extremely wide field of view. The small Near-Earth Asteroids (NEA) represent a potential risk but also easily accessible space resources for future robotic or human in-situ space exploration, or commercial activities. The survey system will be optimized for the detection of fast-moving, trailed asteroids and space debris, and will provide real-time alert notifications. The expected cost of the system, including 1-year development and 2-year operation, is 1,000,000 EUR. The successful demonstration of the system will promote cost-efficient ADAM-WFS (Automatic Detection of Asteroids and Meteoroids -- A Wide Field Survey) systems to be built around the world.

  6. Automatic Detection of Asteroids and Meteoroids - A Wide Field Survey

    CERN Document Server

    Vereš, P; Jedicke, R; Tonry, J; Denneau, L; Wainscoat, R; Kornoš, L; Šilha, J

    2014-01-01

    We propose a low-cost robotic optical survey aimed at 1-300 m Near Earth Objects (NEO) based on four state-of-the-art telescopes having an extremely wide field of view. The small Near-Earth Asteroids (NEA) represent a potential risk but also easily accessible space resources for future robotic or human in-situ space exploration, or commercial activities. The survey system will be optimized for the detection of fast-moving, trailed asteroids and space debris, and will provide real-time alert notifications. The expected cost of the system, including 1-year development and 2-year operation, is 1,000,000 EUR. The successful demonstration of the system will promote cost-efficient ADAM-WFS (Automatic Detection of Asteroids and Meteoroids - A Wide Field Survey) systems to be built around the world.

  7. An improved, SSH-based method to automatically identify mesoscale eddies in the ocean

    Institute of Scientific and Technical Information of China (English)

    WANG Xin; DU Yun-yan; ZHOU Cheng-hu; FAN Xing; YI Jia-wei

    2013-01-01

    Mesoscale eddies are an important component of oceanic features. How to automatically identify these mesoscale eddies from available data has become an important research topic. Through careful examination of existing methods, we propose an improved, SSH-based automatic identification method. Using the inclusion relation of enclosed SSH contours, the mesoscale eddy boundary and core(s) can be automatically identified. The time evolution of eddies can be examined by a threshold search algorithm and a tracking algorithm based on similarity. Sea-surface height (SSH) data from the Naval Research Laboratory Layered Ocean Model (NLOM) and sea-level anomaly (SLA) data from altimeters are used in many experiments in which different automatic identification methods are compared. Our results indicate that the improved method is able to extract the mesoscale eddy boundary more precisely, retaining the multiple-core structure. In combination with the tracking algorithm, this method can capture complete mesoscale eddy processes. It can thus provide reliable information for further study of eddy dynamics: merging, splitting, and the evolution of multi-core structures.
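
    As a toy illustration of the enclosed-contour idea (not the paper's threshold-search or tracking algorithms), the sketch below collects closed SSH contours; nested closed contours then mark an eddy boundary enclosing one or more cores. The scikit-image call and the contour levels are assumptions.

```python
import numpy as np
from skimage import measure

def closed_ssh_contours(ssh, levels):
    """Return the closed contours of an SSH field; the outermost
    closed contour of a nested set is an eddy-boundary candidate."""
    closed = []
    for lev in levels:
        for c in measure.find_contours(ssh, lev):
            if np.allclose(c[0], c[-1]):  # curve closes on itself
                closed.append((lev, c))
    return closed

# Synthetic Gaussian "eddy" centred in a 100 x 100 SSH field.
y, x = np.mgrid[0:100, 0:100]
ssh = np.exp(-((x - 50.0)**2 + (y - 50.0)**2) / 200.0)
print(len(closed_ssh_contours(ssh, np.linspace(0.2, 0.8, 4))))  # 4
```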

  8. Automatic mapping of rice fields in the Sacramento Valley for water resources management

    Science.gov (United States)

    Zhong, L.; Yin, H.; Reyes, E.; Chung, F. I.

    2015-12-01

    Water use by rice fields is one of the most important components in hydrologic model simulation of the Sacramento Valley, California. In this study, rice fields were mapped by an automatic approach using Landsat imagery. The automatic approach is advantageous for its capacity to map rice fields repeatedly, consistently, and in a timely manner without the need to collect training data. Seasonal dynamics of the Enhanced Vegetation Index (EVI) and Normalized Difference Moisture Index (NDMI) were employed to identify rice based on its phenological characteristics. Classification could be conducted around the planting date for early response to cropland use change, or for the full growing season to monitor rice growth. Two studies are illustrated as applications of this mapping method: 1. A rice map was produced before mid-June to forecast rice acreage and water use in the 2015 drought. Due to continuous drought, rice acreage in the Sacramento Valley reached the historical minimum of the past 20 years in 2014, and further reduction occurred in 2015. A quantitative measure of rice field extent is needed to forecast rice water use as early as possible. The automatic mapping method utilized the spectral dynamics during initial flooding to identify rice fields. Based on the map product, the forecast of rice water demand was made to facilitate the simulation of current-year hydrologic conditions. 2. Rice field extent has been mapped since 1989 and phenological metrics have been derived to study changes in the growing season. The increasing use of short-season rice varieties and special weather conditions (like El Niño in 2015) may alter the seasonal pattern of water demand by rice. Rice fields were identified based on the temporal profiles of NDMI and EVI derived from series of segmented images. Validation using field survey data and other land use maps showed promising accuracy. The start and the end of the growing season and other phenological metrics were extracted from object...
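
    For reference, the two indices named above and the early-season flood signature they make detectable (wet but not yet vegetated pixels) can be sketched as below; the band formulas are standard, while the thresholds are illustrative assumptions, not the study's rules.

```python
import numpy as np

def ndmi(nir, swir):
    """Normalized Difference Moisture Index from NIR and SWIR bands."""
    return (nir - swir) / (nir + swir)

def evi(nir, red, blue):
    """Enhanced Vegetation Index with the standard coefficients."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

def flooded_rice_mask(nir, swir, red, blue, ndmi_min=0.2, evi_max=0.2):
    """Initial flooding: wet (high NDMI) but barely vegetated (low EVI)."""
    return (ndmi(nir, swir) > ndmi_min) & (evi(nir, red, blue) < evi_max)

# Reflectances typical of a freshly flooded paddy pixel.
print(flooded_rice_mask(np.array([0.05]), np.array([0.02]),
                        np.array([0.04]), np.array([0.05])))  # [ True]
```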

  9. Automatic derivation of domain terms and concept location based on the analysis of the identifiers

    CERN Document Server

    Vaclavik, Peter; Mezei, Marek

    2010-01-01

    Developers express the meaning of domain ideas in specifically selected identifiers and comments that form the implemented code. Software maintenance requires knowledge and understanding of the encoded ideas. This paper presents a way to automatically create a domain vocabulary. Knowledge of the domain vocabulary supports the comprehension of a specific domain for later code maintenance or evolution. We present experiments conducted in two selected domains: application servers and web frameworks. Knowledge of domain terms enables easy localization of the chunks of code that belong to a certain term. We consider these chunks of code as "concepts" and their placement in the code as "concept location". Application developers may also benefit from the obtained domain terms. These terms are parts of speech that characterize a certain concept. Concepts are encoded in "classes" (OO paradigm) and the obtained vocabulary of terms supports the selection and the comprehension of the class' appropriate identifiers. ...

  10. A New Automatic Method to Identify Galaxy Mergers I. Description and Application to the STAGES Survey

    CERN Document Server

    Hoyos, Carlos; Gray, Meghan E; Maltby, David T; Bell, Eric F; Barazza, Fabio D; Boehm, Asmus; Haussler, Boris; Jahnke, Knud; Jogee, Sharda; Lane, Kyle P; McIntosh, Daniel H; Wolf, Christian

    2011-01-01

    We present an automatic method to identify galaxy mergers using the morphological information contained in the residual images of galaxies after the subtraction of a Sersic model. The removal of the bulk signal from the host galaxy light is done with the aim of detecting the fainter minor mergers. The specific morphological parameters that are used in the merger diagnostic suggested here are the Residual Flux Fraction and the asymmetry of the residuals. The new diagnostic has been calibrated and optimized so that the resulting merger sample is very complete. However, the contamination by non-mergers is also high. If the same optimization method is adopted for combinations of other structural parameters such as the CAS system, the merger indicator we introduce yields merger samples of equal or higher statistical quality than the samples obtained through the use of other structural parameters. We explore the ability of the method presented here to select minor mergers by identifying a sample of visually classif...
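
    As an aside, the residual asymmetry mentioned above can be sketched with the standard CAS-style rotational definition applied to the Sersic-subtracted residual image; the paper's exact normalization and background handling may differ.

```python
import numpy as np

def residual_asymmetry(residual):
    """Compare the residual image with its 180-degree rotation;
    symmetric residuals score ~0, merger debris scores higher."""
    rot = np.rot90(residual, 2)
    return np.abs(residual - rot).sum() / (2.0 * np.abs(residual).sum())

print(residual_asymmetry(np.ones((5, 5))))  # 0.0 for a symmetric image
```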

  11. Automatic Pain Intensity Estimation using Heteroscedastic Conditional Ordinal Random Fields

    NARCIS (Netherlands)

    Rudovic, Ognjen; Pavlovic, Vladimir; Pantic, Maja

    2013-01-01

    Automatic pain intensity estimation from facial images is challenging mainly because of high variability in subject-specific pain expressiveness. This heterogeneity in the subjects causes their facial appearance to vary significantly when experiencing the same pain level. The standard classification

  12. Identifying Solar Analogs in the Kepler Field

    Science.gov (United States)

    Buzasi, Derek L.; Lezcano, Andrew; Preston, Heather L.

    2014-06-01

    Since human beings live on a planet orbiting a G2 V star, to us perhaps the most intrinsically interesting category of stars about which planets have been discovered is solar analogs. While Kepler has observed more than 26000 targets which have effective temperatures within 100 K of the Sun, many of these are not true solar analogs due to activity, surface gravity, metallicity, or other considerations. Here we combine ground-based measurements of effective temperature and metallicity with data on rotational periods and surface gravities derived from 16 quarters of Kepler observations to produce a near-complete sample of solar analogs in the Kepler field. We then compare the statistical distribution of stellar physical parameters, including activity level, for subsets of solar analogs consisting of KOIs and those with no detected exoplanets. Finally, we produce a list of potential solar twins in the Kepler field.

  13. An improved schlieren method for measurement and automatic reconstruction of the far-field focal spot

    Science.gov (United States)

    Wang, Zhengzhou; Hu, Bingliang; Yin, Qinye

    2017-01-01

    The schlieren method of measuring far-field focal spots offers many advantages at the Shenguang III laser facility, such as low cost and automatic laser-path collimation. However, current methods of far-field focal spot measurement often suffer from low precision and efficiency when the final focal spot is merged manually, thereby reducing the accuracy of reconstruction. In this paper, we introduce an improved schlieren method to construct the high dynamic-range image of far-field focal spots and improve the reconstruction accuracy and efficiency. First, a detection method based on weak light beam sampling and magnification imaging was designed; images of the main and side lobes of the focused laser irradiance in the far field were obtained using two scientific CCD cameras. Second, using a self-correlation template matching algorithm, a circle the same size as the schlieren ball was cut from the main-lobe image and used to shift the relative region of the main-lobe image within a 100×100 pixel region. The position with the largest correlation coefficient between the side-lobe image and the cut main-lobe image was identified as the best matching point. Finally, the least squares method was used to fit the center of the small schlieren ball in the side lobe, and the error was less than 1 pixel. The experimental results show that this method enables accurate, high-dynamic-range measurement of a far-field focal spot and automatic image reconstruction. Because the best matching point is obtained through image processing rather than through traditional reconstruction based on manual splicing, this method is less dependent on manual focal-spot reconstruction and thus offers better experimental precision. PMID:28207758

  14. Identifying the field of health communication.

    Science.gov (United States)

    Hannawa, Annegret F; García-Jiménez, Leonarda; Candrian, Carey; Rossmann, Constanze; Schulz, Peter J

    2015-01-01

    This empirical investigation addresses four paradigmatically framed research questions to illuminate the epistemological status of the field of health communication, systematically addressing the limitations of existing disciplinary introspections. A content analysis of published health communication research indicated that the millennium marked a new stage of health communication research with a visible shift onto macro-level communication of health information among nonhealth professionals. The analysis also revealed the emergence of a paradigm around this particular topic area, with its contributing scholars predominantly sharing postpositivistic thought traditions and cross-sectional survey-analytic methodologies. More interdisciplinary collaborations and meta-theoretical assessments are needed to facilitate a continued growth of this evolving paradigm, which may advance health communication scholars in their search for a disciplinary identity.

  15. Applying deep learning technology to automatically identify metaphase chromosomes using scanning microscopic images: an initial investigation

    Science.gov (United States)

    Qiu, Yuchen; Lu, Xianglan; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Li, Shibo; Liu, Hong; Zheng, Bin

    2016-03-01

    Automated high-throughput scanning microscopy is a fast developing screening technology used in cytogenetic laboratories for the diagnosis of leukemia or other genetic diseases. However, one of the major challenges of using this new technology is how to efficiently detect the analyzable metaphase chromosomes during the scanning process. The purpose of this investigation is to develop a computer-aided detection (CAD) scheme based on deep learning technology, which can identify the metaphase chromosomes with high accuracy. The CAD scheme includes an eight-layer neural network. The first six layers form an automatic feature extraction module, which has an architecture of three convolution-max-pooling layer pairs. The 1st, 2nd and 3rd pairs contain 30, 20, and 20 feature maps, respectively. The seventh and eighth layers form a multilayer perceptron (MLP) based classifier, which is used to identify the analyzable metaphase chromosomes. The performance of the new CAD scheme was assessed by the receiver operating characteristic (ROC) method. A set of 150 regions of interest (ROIs) was selected to test the performance of our new CAD scheme. Each ROI contains either an interphase cell or metaphase chromosomes. The results indicate that the new scheme is able to achieve an area under the ROC curve (AUC) of 0.886 ± 0.043. This investigation demonstrates that applying a deep learning technique may significantly improve the accuracy of metaphase chromosome detection using scanning microscopic imaging technology in the future.
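
    The layer counts above are enough to sketch the architecture. The following Keras model mirrors the described three conv/max-pooling pairs (30, 20, 20 feature maps) and the MLP head; the kernel sizes, input size and training settings are assumptions, not taken from the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(64, 64, 1)),        # ROI size is a guess
    layers.Conv2D(30, 5, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(20, 5, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(20, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),    # MLP classifier head
    layers.Dense(1, activation="sigmoid"),  # metaphase vs. interphase
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])  # abstract reports AUC
model.summary()
```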

  16. Using Nanoinformatics Methods for Automatically Identifying Relevant Nanotoxicology Entities from the Literature

    Directory of Open Access Journals (Sweden)

    Miguel García-Remesal

    2013-01-01

    Nanoinformatics is an emerging research field that uses informatics techniques to collect, process, store, and retrieve data, information, and knowledge on nanoparticles, nanomaterials, and nanodevices and their potential applications in health care. In this paper, we have focused on the solutions that nanoinformatics can provide to facilitate nanotoxicology research. For this, we have taken a computational approach to automatically recognize and extract nanotoxicology-related entities from the scientific literature. The desired entities belong to four different categories: nanoparticles, routes of exposure, toxic effects, and targets. The entity recognizer was trained using a corpus that we specifically created for this purpose and was validated by two nanomedicine/nanotoxicology experts. We evaluated the performance of our entity recognizer using 10-fold cross-validation. The precisions range from 87.6% (targets) to 93.0% (routes of exposure), while recall values range from 82.6% (routes of exposure) to 87.4% (toxic effects). These results prove the feasibility of using computational approaches to reliably perform different named entity recognition (NER)-dependent tasks, such as for instance augmented reading or semantic searches. This research is a “proof of concept” that can be expanded to stimulate further developments that could assist researchers in managing data, information, and knowledge at the nanolevel, thus accelerating research in nanomedicine.

  17. Using Nanoinformatics Methods for Automatically Identifying Relevant Nanotoxicology Entities from the Literature

    Science.gov (United States)

    García-Remesal, Miguel; García-Ruiz, Alejandro; Pérez-Rey, David; de la Iglesia, Diana; Maojo, Víctor

    2013-01-01

    Nanoinformatics is an emerging research field that uses informatics techniques to collect, process, store, and retrieve data, information, and knowledge on nanoparticles, nanomaterials, and nanodevices and their potential applications in health care. In this paper, we have focused on the solutions that nanoinformatics can provide to facilitate nanotoxicology research. For this, we have taken a computational approach to automatically recognize and extract nanotoxicology-related entities from the scientific literature. The desired entities belong to four different categories: nanoparticles, routes of exposure, toxic effects, and targets. The entity recognizer was trained using a corpus that we specifically created for this purpose and was validated by two nanomedicine/nanotoxicology experts. We evaluated the performance of our entity recognizer using 10-fold cross-validation. The precisions range from 87.6% (targets) to 93.0% (routes of exposure), while recall values range from 82.6% (routes of exposure) to 87.4% (toxic effects). These results prove the feasibility of using computational approaches to reliably perform different named entity recognition (NER)-dependent tasks, such as for instance augmented reading or semantic searches. This research is a “proof of concept” that can be expanded to stimulate further developments that could assist researchers in managing data, information, and knowledge at the nanolevel, thus accelerating research in nanomedicine. PMID:23509721

  18. Field Robotics in Sports: Automatic Generation of Guidance Lines for Automatic Grass Cutting, Striping and Pitch Marking of Football Playing Fields

    Directory of Open Access Journals (Sweden)

    Ole Green

    2011-03-01

    Progress is constantly being made and new applications are constantly coming out in the area of field robotics. In this paper, a promising application of field robotics to football playing fields is introduced. An algorithmic approach is presented for generating the way points required to guide a GPS-based field robot through a football playing field to automatically carry out periodical tasks such as cutting the grass, pitch and line marking, and lawn striping. The manual operation of these tasks requires very skilful personnel able to work for long hours with very high concentration for the football field to be compatible with the standards of the Federation Internationale de Football Association (FIFA). On the other hand, a GPS-guided vehicle or robot with three implements (a grass mower, a lawn-striping roller and a track-marking illustrator) is capable of working 24 h a day, in most weather and in harsh soil conditions, without loss of quality. The proposed approach for the automatic operation of football playing fields requires no or very limited human intervention and therefore saves numerous working hours and frees a worker to focus on other tasks. An economic feasibility study showed that the proposed method is economically superior to the current manual practices.
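
    A minimal sketch of way-point generation for such a task, assuming a simple rectangular pitch and a back-and-forth (boustrophedon) mowing pattern; the paper's algorithm also covers striping and line-marking patterns.

```python
def mowing_waypoints(width_m, length_m, swath_m):
    """Back-and-forth way points covering a rectangular pitch;
    a GPS-guided mower visits them in order."""
    points, x, back = [], 0.0, False
    while x <= width_m:
        ys = (length_m, 0.0) if back else (0.0, length_m)
        points += [(x, ys[0]), (x, ys[1])]
        back = not back
        x += swath_m
    return points

# 68 m x 105 m pitch cut with a 1.5 m mower: 46 passes, 92 way points.
print(len(mowing_waypoints(68.0, 105.0, 1.5)))  # 92
```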

  1. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard

    2010-01-01

    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV...

  2. Field potential soil variability index to identify precision agriculture opportunity

    Science.gov (United States)

    Precision agriculture (PA) technologies used for identifying and managing within-field variability are not widely used despite decades of advancement. Technological innovations in agronomic tools, such as canopy reflectance or electrical conductivity sensors, have created opportunities to achieve a ...

  3. Automatic Associations with “Erotic” in Child Sexual Offenders: Identifying Those in Danger of Reoffence

    Directory of Open Access Journals (Sweden)

    Melanie Caroline Steffens

    2008-12-01

    If sexual offence (rape or sexual abuse) has aspects of automatic rather than controlled behavior, in the sense of being triggered by situational cues, it might be predicted better by reaction-time measures of automatic cognition than by questionnaires. Two Implicit Association Tests (IATs) were used to test whether male pedophile and sadistic offenders (N = 46) differ from each other and from a male control group (N = 47) with regard to their automatic associations of erotic. The first IAT tested associations of erotic with child versus woman, the second IAT tested associations of erotic with harmony versus humiliation. Supplementary scales concern social desirability, locus of control, behaviour control, and evaluation of aims. First, no evidence for the validity of the humiliation-erotic IAT could be found. Second, offenders who were rated to be in danger of relapse by their therapists, and those rated to be exclusively pedophile, showed an increased child-erotic association as compared to the other groups.

  4. Automatic Method for Identifying Photospheric Bright Points and Granules Observed by Sunrise

    CERN Document Server

    Javaherian, Mohsen; Amiri, Ali; Ziaei, Shervin

    2014-01-01

    In this study, we propose methods for the automatic detection of photospheric features (bright points and granules) from ultra-violet (UV) radiation, using a feature-based classifier. The methods use quiet-Sun observations in 214 nm and 525 nm images taken by Sunrise on 9 June 2009. Region-growing and mean-shift procedures are applied to segment the bright points (BPs) and granules, respectively. Zernike moments of each region are computed. The Zernike moments of BPs, granules, and other features are distinctive enough to be separated using a support vector machine (SVM) classifier. The size distribution of BPs can be fitted with a power-law slope of -1.5. The peak value of granule sizes is found to be about 0.5 arcsec^2. The mean value of the filling factor of BPs is 0.01, and for granules it is 0.51. There is a critical scale for granules such that small granules with sizes smaller than 2.5 arcsec^2 cover a wide range of brightness, while the brightness of large granules approaches unity. The mean...
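
    The region-descriptor step can be sketched with the mahotas implementation of Zernike moments feeding an SVM; the radius, degree and RBF kernel here are assumptions, not the paper's settings.

```python
import mahotas
from sklearn.svm import SVC

def region_descriptor(mask, radius=15):
    """Zernike moments of one segmented region (BP or granule)."""
    return mahotas.features.zernike_moments(mask, radius, degree=8)

def train_region_classifier(masks, labels):
    """masks: binary region images; labels: 'BP', 'granule', 'other'."""
    clf = SVC(kernel="rbf")
    clf.fit([region_descriptor(m) for m in masks], labels)
    return clf
```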

  5. The RISE Framework: Using Learning Analytics to Automatically Identify Open Educational Resources for Continuous Improvement

    Science.gov (United States)

    Bodily, Robert; Nyland, Rob; Wiley, David

    2017-01-01

    The RISE (Resource Inspection, Selection, and Enhancement) Framework is a framework supporting the continuous improvement of open educational resources (OER). The framework is an automated process that identifies learning resources that should be evaluated and either eliminated or improved. This is particularly useful in OER contexts where the…

  6. An automatic system to identify heart disease risk factors in clinical texts over time.

    Science.gov (United States)

    Chen, Qingcai; Li, Haodi; Tang, Buzhou; Wang, Xiaolong; Liu, Xin; Liu, Zengjian; Liu, Shu; Wang, Weida; Deng, Qiwen; Zhu, Suisong; Chen, Yangxin; Wang, Jingfeng

    2015-12-01

    Despite recent progress in prediction and prevention, heart disease remains a leading cause of death. One preliminary step in heart disease prediction and prevention is risk factor identification. Many studies have been proposed to identify risk factors associated with heart disease; however, none have attempted to identify all risk factors. In 2014, the National Center of Informatics for Integrating Biology and the Bedside (i2b2) issued a clinical natural language processing (NLP) challenge that involved a track (track 2) for identifying heart disease risk factors in clinical texts over time. This track aimed to identify medically relevant information related to heart disease risk and track its progression over sets of longitudinal patient medical records. Identification of tags and attributes associated with disease presence and progression, risk factors, and medications in patient medical history was required. Our participation led to the development of a hybrid pipeline system based on both machine learning-based and rule-based approaches. Evaluation using the challenge corpus revealed that our system achieved an F1-score of 92.68%, making it the top-ranked system (without additional annotations) of the 2014 i2b2 clinical NLP challenge.
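
    As a toy stand-in for the rule-based half of such a hybrid pipeline (the actual system's rules, machine-learned taggers and negation handling are far richer), regular expressions can catch explicit risk-factor mentions:

```python
import re

RULES = {
    "diabetes":     re.compile(r"\b(diabetes|dm type [12])\b", re.I),
    "hypertension": re.compile(r"\b(hypertension|htn)\b", re.I),
    "smoker":       re.compile(r"\bsmok(er|ing|es)\b", re.I),
}

def tag_risk_factors(note):
    """Return risk factors whose surface forms appear in the note.
    Real systems must also handle negation ("denies smoking")."""
    return {f for f, rx in RULES.items() if rx.search(note)}

print(sorted(tag_risk_factors("Pt with HTN and DM type 2.")))
# ['diabetes', 'hypertension']
```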

  7. Automatic Rice Crop Height Measurement Using a Field Server and Digital Image Processing

    Directory of Open Access Journals (Sweden)

    Tanakorn Sritarapipat

    2014-01-01

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required.

  8. Automatic rice crop height measurement using a field server and digital image processing.

    Science.gov (United States)

    Sritarapipat, Tanakorn; Rakwatin, Preesan; Kasetkasem, Teerasit

    2014-01-07

    Rice crop height is an important agronomic trait linked to plant type and yield potential. This research developed an automatic image processing technique to detect rice crop height based on images taken by a digital camera attached to a field server. The camera acquires rice paddy images daily at a consistent time of day. The images include the rice plants and a marker bar used to provide a height reference. The rice crop height can be indirectly measured from the images by measuring the height of the marker bar compared to the height of the initial marker bar. Four digital image processing steps are employed to automatically measure the rice crop height: band selection, filtering, thresholding, and height measurement. Band selection is used to remove redundant features. Filtering extracts significant features of the marker bar. The thresholding method is applied to separate objects and boundaries of the marker bar versus other areas. The marker bar is detected and compared with the initial marker bar to measure the rice crop height. Our experiment used a field server with a digital camera to continuously monitor a rice field located in Suphanburi Province, Thailand. The experimental results show that the proposed method measures rice crop height effectively, with no human intervention required.
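
    A minimal sketch of the final height-measurement step under simplifying assumptions (band selection, filtering and bar localization omitted; the threshold and scale are illustrative): the canopy height is the initial bar length minus the bar length still visible above the crop.

```python
import numpy as np

def crop_height_cm(image, col, threshold, cm_per_px, bar_len_px):
    """Crop height from one image column crossing the marker bar:
    (initial bar pixels - visible bar pixels) * cm-per-pixel scale."""
    visible = int((image[:, col] > threshold).sum())  # bright bar pixels
    return (bar_len_px - visible) * cm_per_px

# Synthetic column: an 80 px bar of which the top 50 px remain visible.
img = np.zeros((100, 3), dtype=np.uint8)
img[10:60, 1] = 255
print(crop_height_cm(img, 1, 128, 0.5, 80))  # (80 - 50) * 0.5 = 15.0 cm
```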

  9. Automatic Recognition of Chinese Personal Name Using Conditional Random Fields and Knowledge Base

    Directory of Open Access Journals (Sweden)

    Chuan Gu

    2015-01-01

    According to the features of Chinese personal names, we present an approach for Chinese personal name recognition based on conditional random fields (CRF) and a knowledge base in this paper. The method builds multiple features for the CRF model by adopting the Chinese character as the processing unit, selects useful features based on a knowledge-base selection algorithm and an incremental feature template, and finally implements the automatic recognition of Chinese personal names in Chinese documents. The experimental results on an open real corpus demonstrated the effectiveness of our method, which obtained high accuracy and recall rates of recognition.
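
    A minimal character-unit CRF sketch in the spirit of the abstract, using the sklearn-crfsuite package; the feature template and the tiny training pair are illustrative, and the paper's knowledge-base feature selection is not reproduced.

```python
import sklearn_crfsuite  # pip install sklearn-crfsuite

def char_features(sent, i):
    """Features for one character, the processing unit named above."""
    return {
        "char": sent[i],
        "prev": sent[i - 1] if i > 0 else "<BOS>",
        "next": sent[i + 1] if i + 1 < len(sent) else "<EOS>",
    }

# Toy pair: B/I/O tags marking the personal name in one sentence.
X = [[char_features("张伟去北京", i) for i in range(5)]]
y = [["B-PER", "I-PER", "O", "O", "O"]]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, y)
print(crf.predict(X))  # [['B-PER', 'I-PER', 'O', 'O', 'O']]
```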

  10. Automatic Cell Detection in Bright-Field Microscope Images Using SIFT, Random Forests, and Hierarchical Clustering.

    Science.gov (United States)

    Mualla, Firas; Scholl, Simon; Sommerfeldt, Bjorn; Maier, Andreas; Hornegger, Joachim

    2013-12-01

    We present a novel machine learning-based system for unstained cell detection in bright-field microscope images. The system is fully automatic since it requires no manual parameter tuning. It is also highly invariant with respect to illumination conditions and to the size and orientation of cells. Images from two adherent cell lines and one suspension cell line were used in the evaluation, for a total of more than 3500 cells. Besides real images, simulated images were also used in the evaluation. The detection error was between approximately zero and 15.5%, which is significantly superior performance compared to baseline approaches.

  11. Forensic Loci Allele Database (FLAD): Automatically generated, permanent identifiers for sequenced forensic alleles.

    Science.gov (United States)

    Van Neste, Christophe; Van Criekinge, Wim; Deforce, Dieter; Van Nieuwerburgh, Filip

    2016-01-01

    It is difficult to predict if and when massively parallel sequencing of forensic STR loci will replace capillary electrophoresis as the new standard technology in forensic genetics. The main benefits of sequencing are increased multiplexing scales and SNP detection. There is not yet a consensus on how sequenced profiles should be reported. We present the Forensic Loci Allele Database (FLAD) service, made freely available on http://forensic.ugent.be/FLAD/. It offers permanent identifiers for sequenced forensic alleles (STR or SNP) and their microvariants for use in forensic allele nomenclature. Analogous to GenBank, its aim is to provide permanent identifiers for forensically relevant allele sequences. Researchers who are developing forensic sequencing kits or performing population studies can register on http://forensic.ugent.be/FLAD/ and add loci and allele sequences with a short and simple application interface (API).

  12. Pride-asap: Automatic fragment ion annotation of identified PRIDE spectra

    OpenAIRE

    Hulstaert, Niels; Reisinger, Florian; Rameseder, Jonathan; Barsnes, Harald; Vizcaíno, Juan Antonio; Martens, Lennart

    2013-01-01

    We present an open source software application and library written in Java that provides a uniform annotation of identified spectra stored in the PRIDE database. Pride-asap can be run in command-line mode for automated processing of multiple PRIDE experiments, but also has a graphical user interface that allows end users to annotate the spectra in PRIDE experiments and to inspect the results in detail. Pride-asap binaries, source code and additional information can be downloaded from http:/...

  13. An Automatic UAV Mapping System for Supporting UN (United Nations) Field Operations

    Science.gov (United States)

    Choi, K.; Cheon, J. W.; Kim, H. Y.; Lee, I.

    2016-06-01

    The United Nations (UN) has performed field operations worldwide, such as peacekeeping or rescue missions. When such an operation is needed, the UN dispatches an operation team, usually with a GIS (Geographic Information System) customized to the specific operation. The base maps for the GIS are generated mostly from satellite images, which may not retain a high resolution or reflect the current situation. To build an up-to-date high resolution map, we propose a UAV (unmanned aerial vehicle) based automatic mapping system, which can operate fully automatically, from the acquisition of sensory data to the processing that generates geospatial products such as a mosaicked orthoimage of a target area. In this study, we analyse the requirements for UN field operations, suggest a UAV mapping system with an operation scenario, and investigate the applicability of the system. With the proposed system, we can efficiently construct a tailored GIS with up-to-date and high resolution base maps for a specific operation.

  14. Automatic Processing of Chinese GF-1 Wide Field of View Images

    Science.gov (United States)

    Zhang, Y.; Wan, Y.; Wang, B.; Kang, Y.; Xiong, J.

    2015-04-01

    The wide field of view (WFV) imaging instrument carried on the Chinese GF-1 satellite includes four cameras. Each camera has a 200 km swath width; the cameras acquire earth images at the same time, and the observation can be repeated within only 4 days. This enables applications of remote sensing imagery to advance from non-scheduled land observation to periodic land monitoring in areas that use images at such resolutions. This paper introduces an automatic data analysing and processing technique for the wide-swath images acquired by the GF-1 satellite. Firstly, the images are validated by a self-adaptive Gaussian mixture model based cloud detection method to confirm whether they are qualified and suitable to be involved in the automatic processing workflow. Then the ground control points (GCPs) are quickly and automatically matched from public geo-information products such as the rectified panchromatic images of Landsat-8. Before the geometric correction, the cloud detection results are also used to eliminate invalid GCPs distributed in cloud-covered areas, which obviously reduces the ratio of GCP blunders. The geometric correction module not only rectifies the rational function models (RFMs), but also provides the self-calibration model and parameters for the non-linear distortion, and it is iteratively processed to detect blunders. The maximum geometric distortion in a WFV image decreases from about 10-15 pixels to 1-2 pixels when compensated by the self-calibration model. The processing experiments involve hundreds of WFV images of the GF-1 satellite acquired from June to September 2013, covering the whole mainland of China. All the processing work can be finished by one operator within 2 days on a desktop computer with a second-generation Intel Core-i7 CPU and a 4-solid-state-disk array. The digital ortho maps (DOM) are automatically generated with 3-arc-second Shuttle Radar Topography Mission (SRTM) elevation data. The geometric accuracies of the...

  15. Automatic detection of diabetic retinopathy features in ultra-wide field retinal images

    Science.gov (United States)

    Levenkova, Anastasia; Sowmya, Arcot; Kalloniatis, Michael; Ly, Angelica; Ho, Arthur

    2017-03-01

    Diabetic retinopathy (DR) is a major cause of irreversible vision loss. DR screening relies on retinal clinical signs (features). Opportunities for computer-aided DR feature detection have emerged with the development of Ultra-WideField (UWF) digital scanning laser technology. UWF imaging covers 82% greater retinal area (200°), against 45° in conventional cameras [3], allowing more clinically relevant retinopathy to be detected [4]. UWF images also provide a high resolution of 3078 x 2702 pixels. Currently DR screening uses 7 overlapping conventional fundus images, and the UWF images provide similar results [1,4]. However, in 40% of cases, more retinopathy was found outside the 7-field (ETDRS) fields by UWF, and in 10% of cases, retinopathy was reclassified as more severe [4]. This is because UWF imaging allows examination of both the central retina and more peripheral regions, with the latter implicated in DR [6]. We have developed an algorithm for automatic recognition of DR features, including bright (cotton wool spots and exudates) and dark lesions (microaneurysms and blot, dot and flame haemorrhages) in UWF images. The algorithm extracts features from grayscale (green "red-free" laser light) and colour-composite UWF images, including intensity, Histogram-of-Gradient and Local binary patterns. Pixel-based classification is performed with three different classifiers. The main contribution is the automatic detection of DR features in the peripheral retina. The method is evaluated by leave-one-out cross-validation on 25 UWF retinal images with 167 bright lesions, and 61 other images with 1089 dark lesions. The SVM classifier performs best with an AUC of 94.4% / 95.31% for bright / dark lesions.

  16. Automatic lung tumor segmentation on PET/CT images using fuzzy Markov random field model.

    Science.gov (United States)

    Guo, Yu; Feng, Yuanming; Sun, Jian; Zhang, Ning; Lin, Wang; Sa, Yu; Wang, Ping

    2014-01-01

    The combination of positron emission tomography (PET) and CT images provides complementary functional and anatomical information of human tissues and it has been used for better tumor volume definition of lung cancer. This paper proposed a robust method for automatic lung tumor segmentation on PET/CT images. The new method is based on a fuzzy Markov random field (MRF) model. The combination of PET and CT image information is achieved by using a proper joint posterior probability distribution of observed features in the fuzzy MRF model, which performs better than the commonly used Gaussian joint distribution. In this study, the PET and CT simulation images of 7 non-small cell lung cancer (NSCLC) patients were used to evaluate the proposed method. Tumor segmentations with the proposed method and manual method by an experienced radiation oncologist on the fused images were performed, respectively. Segmentation results obtained with the two methods were similar and Dice's similarity coefficient (DSC) was 0.85 ± 0.013. It has been shown that effective and automatic segmentations can be achieved with this method for lung tumors located near other organs with similar intensities in PET and CT images, such as when the tumors extend into the chest wall or mediastinum.
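
    For reference, the agreement metric quoted above is straightforward to compute; a minimal sketch for binary masks:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient (DSC) between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

auto = np.zeros((8, 8), bool);   auto[2:6, 2:6] = True    # 16 px
manual = np.zeros((8, 8), bool); manual[3:7, 2:6] = True  # 16 px
print(dice(auto, manual))  # 12 px overlap -> 0.75
```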

  17. Automatic Lung Tumor Segmentation on PET/CT Images Using Fuzzy Markov Random Field Model

    Directory of Open Access Journals (Sweden)

    Yu Guo

    2014-01-01

    The combination of positron emission tomography (PET) and CT images provides complementary functional and anatomical information of human tissues and it has been used for better tumor volume definition of lung cancer. This paper proposed a robust method for automatic lung tumor segmentation on PET/CT images. The new method is based on a fuzzy Markov random field (MRF) model. The combination of PET and CT image information is achieved by using a proper joint posterior probability distribution of observed features in the fuzzy MRF model, which performs better than the commonly used Gaussian joint distribution. In this study, the PET and CT simulation images of 7 non-small cell lung cancer (NSCLC) patients were used to evaluate the proposed method. Tumor segmentations with the proposed method and manual method by an experienced radiation oncologist on the fused images were performed, respectively. Segmentation results obtained with the two methods were similar and Dice’s similarity coefficient (DSC) was 0.85 ± 0.013. It has been shown that effective and automatic segmentations can be achieved with this method for lung tumors located near other organs with similar intensities in PET and CT images, such as when the tumors extend into the chest wall or mediastinum.

  18. Automatic magnetic resonance spinal cord segmentation with topology constraints for variable fields of view.

    Science.gov (United States)

    Chen, Min; Carass, Aaron; Oh, Jiwon; Nair, Govind; Pham, Dzung L; Reich, Daniel S; Prince, Jerry L

    2013-12-01

    Spinal cord segmentation is an important step in the analysis of neurological diseases such as multiple sclerosis. Several studies have shown correlations between disease progression and metrics relating to spinal cord atrophy and shape changes. Current practices primarily involve segmenting the spinal cord manually or semi-automatically, which can be inconsistent and time-consuming for large datasets. An automatic method that segments the spinal cord and cerebrospinal fluid from magnetic resonance images is presented. The method uses a deformable atlas and topology constraints to produce results that are robust to noise and artifacts. The method is designed to be easily extended to new data with different modalities, resolutions, and fields of view. Validation was performed on two distinct datasets. The first consists of magnetization transfer-prepared T2*-weighted gradient-echo MRI centered only on the cervical vertebrae (C1-C5). The second consists of T1-weighted MRI that covers both the cervical and portions of the thoracic vertebrae (C1-T4). Results were found to be highly accurate in comparison to manual segmentations. A pilot study was carried out to demonstrate the potential utility of this new method for research and clinical studies of multiple sclerosis.

  19. Identifying Structures in Social Conversations in NSCLC Patients through the Semi-Automatic Extraction of Topical Taxonomies

    Directory of Open Access Journals (Sweden)

    Giancarlo Crocetti

    2016-01-01

    The exploration of social conversations for addressing patients' needs is an important analytical task to which many scholarly publications are contributing to fill the knowledge gap in this area. The main difficulty remains the inability to turn such contributions into pragmatic processes that the pharmaceutical industry can leverage in order to generate insight from social media data, which can be considered one of the most challenging sources of information available today due to its sheer volume and noise. This study is based on the work by Scott Spangler and Jeffrey Kreulen and applies it to identify structure in social media through the extraction of a topical taxonomy able to capture the latent knowledge in social conversations on health-related sites. The mechanism for automatically identifying and generating a taxonomy from social conversations is developed and pressure-tested using public data from media sites focused on the needs of cancer patients and their families. Moreover, a novel method for generating the category labels and determining an optimal number of categories is presented, which extends Spangler and Kreulen's research in a meaningful way. We assume the reader is familiar with taxonomies, what they are and how they are used.

  20. Automatic control of positioning along the joint during EBW in conditions of action of magnetic fields

    Science.gov (United States)

    Druzhinina, A. A.; Laptenok, V. D.; Murygin, A. V.; Laptenok, P. V.

    2016-11-01

    Positioning along the joint during electron beam welding is a difficult scientific and technical problem that must be solved to achieve high-quality welds. A definitive solution has not yet been found. This is caused by the weak interference protection of joint-position sensors directly in the welding process. During electron beam welding, magnetic fields frequently deflect the electron beam from the optical axis of the electron beam gun. A collimated X-ray sensor is used to monitor the beam deflection caused by the action of magnetic fields. The signal of the X-ray sensor is processed by the method of synchronous detection. Analysis of the spectral characteristics of the X-ray sensor showed that the displacement of the joint from the optical axis of the gun affects the output signal of the sensor. The authors propose a dual-circuit system for automatic positioning of the electron beam on the joint during electron beam welding under magnetic interference. This system includes a joint-tracking loop and a magnetic-field compensation loop. The proposed system is stable. Calculation of the dynamic error of the system showed that the positioning error does not exceed the permissible deviation of the electron beam from the joint plane.
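
    For intuition, synchronous (lock-in) detection as applied to the X-ray sensor signal can be sketched in a few lines; the reference frequency and noise level here are invented for the demonstration.

```python
import numpy as np

def lock_in(signal, t, f_ref):
    """Multiply by quadrature references and average (a crude
    low-pass) to recover a weak component buried in interference."""
    i = np.mean(signal * np.sin(2 * np.pi * f_ref * t))
    q = np.mean(signal * np.cos(2 * np.pi * f_ref * t))
    return 2.0 * np.hypot(i, q)  # amplitude of the f_ref component

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 10000)
weak = 0.05 * np.sin(2 * np.pi * 370.0 * t)    # buried signal
noisy = weak + rng.normal(0.0, 1.0, t.size)    # 20x stronger noise
print(lock_in(noisy, t, 370.0))  # close to 0.05
```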

  1. Automatic NMO Correction and Full Common Depth Point NMO Velocity Field Estimation in Anisotropic Media

    Science.gov (United States)

    Sedek, Mohamed; Gross, Lutz; Tyson, Stephen

    2017-01-01

    We present a new computational method of automatic normal moveout (NMO) correction that not only accurately flattens and corrects the far-offset data, but simultaneously provides the NMO velocity (v_nmo) for each individual seismic trace. The method is based on a predefined number of NMO velocity sweeps using linear vertical interpolation of different NMO velocities at each seismic trace. At each sweep, we measure the semblance between the zero-offset trace (pilot trace) and the next seismic trace using a trace-by-trace rather than sample-by-sample based semblance measure; then, after all the sweeps are done, the one with the maximum semblance value is chosen, which is assumed to be the most suitable NMO velocity trace that accurately flattens seismic reflection events. Other traces follow the same process, and a final velocity field is then extracted. Isotropic, anisotropic and laterally heterogeneous synthetic geological models were built to test the method. A range of synthetic background noise, from 10 to 30 %, was applied to the models. In addition, the method was tested on Hess's VTI (vertical transverse isotropy) model. Furthermore, we tested our method on a real pre-stack seismic CDP gather from a gas field in Alaska. The results from the presented examples show excellent NMO correction and a reasonably accurate extracted NMO velocity field.
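
    A stripped-down sketch of the sweep idea, assuming a single flat reflector and using normalized correlation as a stand-in for the paper's trace-by-trace semblance:

```python
import numpy as np

def nmo_correct(trace, offset, t, v):
    """Flatten one trace with the hyperbolic moveout
    t_src(t0) = sqrt(t0**2 + (offset / v)**2)."""
    t_src = np.sqrt(t**2 + (offset / v)**2)
    return np.interp(t_src, t, trace, left=0.0, right=0.0)

def best_nmo_velocity(pilot, trace, offset, t, v_sweep):
    """Keep the sweep velocity whose corrected trace correlates
    best with the zero-offset pilot trace."""
    def score(v):
        c = nmo_correct(trace, offset, t, v)
        return np.dot(pilot, c) / (np.linalg.norm(c) + 1e-12)
    return max(v_sweep, key=score)

# Synthetic check: event at t0 = 1.0 s, true velocity 2000 m/s.
t = np.linspace(0.0, 3.0, 1501)
wavelet = lambda tt, t0: np.exp(-((tt - t0) / 0.02)**2)
pilot = wavelet(t, 1.0)
trace = wavelet(t, np.sqrt(1.0 + (1000.0 / 2000.0)**2))
print(best_nmo_velocity(pilot, trace, 1000.0, t,
                        np.arange(1500.0, 2501.0, 50.0)))  # 2000.0
```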

  2. Automatically Identifying Fusion Events between GLUT4 Storage Vesicles and the Plasma Membrane in TIRF Microscopy Image Sequences

    Directory of Open Access Journals (Sweden)

    Jian Wu

    2015-01-01

    Quantitative analysis of the dynamic behavior of membrane-bound secretory vesicles has proven to be important in biological research. This paper proposes a novel approach to automatically identify the elusive fusion events between VAMP2-pHluorin labeled GLUT4 storage vesicles (GSVs) and the plasma membrane. Differentiation is implemented to detect the initiation of fusion events by modified forward subtraction of consecutive frames in the TIRFM image sequence. Spatially connected pixels in difference images brighter than a specified adaptive threshold are grouped into a distinct fusion spot. The vesicles are located at the intensity-weighted centroid of their fusion spots. To reveal the true in vivo nature of a fusion event, 2D Gaussian fitting of the fusion spot is used to derive the intensity-weighted centroid and the spot size during the fusion process. The fusion event and its termination can be determined according to the change of spot size. The method is evaluated on real experimental data with ground truth annotated by expert cell biologists. The evaluation results show that it can achieve relatively high accuracy, comparing favorably to manual analysis, yet at a small fraction of the time.
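
    The detection step described above (forward subtraction, adaptive threshold, grouping into spots, intensity-weighted centroids) can be sketched with SciPy; the 3-sigma threshold is an assumption, and the 2D Gaussian fitting stage is omitted.

```python
import numpy as np
from scipy import ndimage

def fusion_spots(prev_frame, frame, k=3.0):
    """Forward frame subtraction, adaptive threshold, and grouping of
    connected bright pixels into spots with intensity-weighted centroids."""
    diff = frame.astype(float) - prev_frame
    thresh = diff.mean() + k * diff.std()  # adaptive threshold
    labels, n = ndimage.label(diff > thresh)
    return ndimage.center_of_mass(diff, labels, range(1, n + 1))

prev = np.zeros((64, 64))
cur = prev.copy()
cur[30:33, 40:43] = 5.0         # a spot brightens between frames
print(fusion_spots(prev, cur))  # [(31.0, 41.0)]
```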

  3. Machine learning algorithm for automatic detection of CT-identifiable hyperdense lesions associated with traumatic brain injury

    Science.gov (United States)

    Keshavamurthy, Krishna N.; Leary, Owen P.; Merck, Lisa H.; Kimia, Benjamin; Collins, Scott; Wright, David W.; Allen, Jason W.; Brock, Jeffrey F.; Merck, Derek

    2017-03-01

    Traumatic brain injury (TBI) is a major cause of death and disability in the United States. Time to treatment is often related to patient outcome. Access to cerebral imaging data in a timely manner is a vital component of patient care. Current methods of detecting and quantifying intracranial pathology can be time-consuming and require careful review of 2D/3D patient images by a radiologist. Additional time is needed for image protocoling, acquisition, and processing. These steps often occur in series, adding more time to the process and potentially delaying time-dependent management decisions for patients with traumatic brain injury. Our team adapted machine learning and computer vision methods to develop a technique that rapidly and automatically detects CT-identifiable lesions. Specifically, we use the scale invariant feature transform (SIFT) [1] and deep convolutional neural networks (CNN) [2] to identify important image features that can distinguish TBI lesions from background data. Our learning algorithm is a linear support vector machine (SVM) [3]. Further, we also employ tools from topological data analysis (TDA) for gleaning insights into the correlation patterns between healthy and pathological data. The technique was validated using 409 CT scans of the brain, acquired via the Progesterone for the Treatment of Traumatic Brain Injury phase III clinical trial (ProTECT_III), which studied patients with moderate to severe TBI [4]. CT data were annotated by a central radiologist and included patients with positive and negative scans. Additionally, the largest lesion on each positive scan was manually segmented. We reserved 80% of the data for training the SVM and used the remaining 20% for testing. Preliminary results are promising, with 92.55% prediction accuracy (sensitivity = 91.15%, specificity = 93.45%), indicating the potential usefulness of this technique in clinical scenarios.
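
    As a rough sketch of the SIFT-plus-SVM part of such a pipeline (mean-pooling the descriptors is a crude stand-in for the paper's actual feature construction, and all names here are illustrative):

```python
import cv2
import numpy as np
from sklearn.svm import LinearSVC

sift = cv2.SIFT_create()

def slice_vector(img_u8):
    """One fixed-length vector per CT slice: mean of its SIFT
    descriptors (zeros when no keypoints are found)."""
    _, desc = sift.detectAndCompute(img_u8, None)
    return np.zeros(128) if desc is None else desc.mean(axis=0)

def train_lesion_classifier(slices, labels):
    """slices: list of uint8 grayscale arrays; labels: 1 = lesion."""
    clf = LinearSVC()
    clf.fit([slice_vector(s) for s in slices], labels)
    return clf
```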

  4. [Full-field and automatic methodology of spectral calibration for PGP imaging spectrometer].

    Science.gov (United States)

    Sun, Ci; Bayanheshig; Cui, Ji-cheng; Pan, Ming-zhong; Li, Xiao-tian; Tang, Yu-guo

    2014-08-01

    In order to analyze quantitatively the spectral data obtained by a prism-grating-prism (PGP) imaging spectrometer, spectral calibration is required to determine the spectral characteristics of the PGP imaging spectrometer, such as the center wavelength of every spectral channel, the spectral resolution and the spectral bending. A full-field spectral calibration system based on the collimated monochromatic light method is designed. A spherical mirror is used to provide collimated light, and a freely sliding and rotating folding mirror is adopted to change the angle of the incident light in order to realize full-field and automatic calibration of the imaging spectrometer. Spectral calibration experiments were performed on a PGP imaging spectrometer to obtain its spectral performance parameters, and an accuracy analysis was carried out in combination with the structural features of the entire calibration system. The analysis indicates that the spectral calibration accuracy of the system reaches 0.1 nm, and the bandwidth accuracy reaches 1.3%. The calibration system has the merits of small size, good versatility and high precision, and because it is automated, additional human-induced errors are avoided. The calibration system can be used for the spectral calibration of other imaging spectrometers whose structures are similar to the PGP design.

  5. Evaluation of automatic exposure control performance in full-field digital mammography systems using contrast-detail analysis

    Science.gov (United States)

    Suarez Castellanos, Ivan M.; Kaczmarek, Richard; Brunner, Claudia C.; de Las Heras, Hugo; Liu, Haimo; Chakrabarti, Kish

    2012-03-01

    Full Field Digital Mammography (FFDM) is increasingly replacing screen-film systems for screening and diagnosis of breast abnormalities. All FFDM systems are equipped with an Automatic Exposure Control (AEC) which automatically selects technique factors to optimize dose and image quality. It is therefore crucial that AEC performance is properly adjusted and optimized for different breast thicknesses. In this work, we studied the AEC performance of three widely used FFDM systems using the CDMAM and QUART mam/digi phantoms. We used the CDMAM phantom to generate contrast-detail (C-D) curves for each AEC mode available in the FFDM systems under study, for phantoms with X-ray attenuation properties equivalent to 3.2 cm, 6 cm and 7.5 cm thick breasts. Generated C-D curves were compared with ideal C-D curves constructed using a metric referred to as the k-factor, which is the product of the thickness and the diameter of the smallest correctly identified disks in the CDMAM phantom. Previous observer studies have indicated that k-factor values of 60 to 80 μm² are particularly useful in demonstrating the threshold for object detectability for detectors used in digital mammography systems. The QUART mam/digi phantom was used to calculate contrast-to-noise ratio (CNR) values at different phantom thicknesses. The results of the C-D analysis and CNR measurements were used to determine limiting CNR values intended to provide a threshold for proper image quality assessment. The results of the contrast-detail analysis show that for two of the three evaluated FFDM systems, low-contrast signal detectability worsens at larger phantom thicknesses. This agrees with the results obtained with the QUART mam/digi phantom, where CNR decreases below the determined limiting CNR values.
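
    Both figures of merit used in this record are simple to compute once regions of interest are chosen. A minimal sketch follows; the CNR definition uses the common (signal − background)/noise form, which may differ in detail from the QUART protocol, and the unit handling in k_factor is an assumption:

```python
# Sketch of the two metrics discussed above. The CNR definition follows the
# common (signal - background) / background-noise form, which may differ in
# detail from the QUART mam/digi protocol; ROIs are caller-supplied arrays.
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio from two pixel arrays."""
    return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

def k_factor(thickness_um, diameter_mm):
    """k-factor: thickness times diameter of the smallest detected disk,
    expressed in um^2 (the mm-to-um conversion here is an assumption)."""
    return thickness_um * diameter_mm * 1000.0
```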

  6. Adaptive automatic data analysis in full-field fringe-pattern-based optical metrology

    Science.gov (United States)

    Trusiak, Maciej; Patorski, Krzysztof; Sluzewski, Lukasz; Pokorski, Krzysztof; Sunderland, Zofia

    2016-12-01

    Fringe pattern processing and analysis is an important task in full-field optical measurement techniques like interferometry, digital holography, structured illumination and moiré. In this contribution we present several adaptive automatic data analysis solutions based on the notion of the Hilbert-Huang transform for measurand retrieval via fringe pattern phase and amplitude demodulation. The Hilbert-Huang transform consists of a 2D empirical mode decomposition algorithm and Hilbert spiral transform analysis. Empirical mode decomposition adaptively dissects a meaningful number of same-scale subimages from the analyzed pattern; it is a data-driven method. Appropriately managing this set of unique subimages results in a very powerful fringe pre-filtering tool. Phase/amplitude demodulation is performed using the Hilbert spiral transform aided by a local fringe orientation estimator. We describe several optical measurement techniques for the characterization of technical and biological objects based on especially tailored modifications of the Hilbert-Huang algorithm for fringe pattern denoising, detrending and amplitude/phase demodulation.
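
    As a one-dimensional analogue of the demodulation step, the analytic signal obtained from the Hilbert transform directly yields local amplitude and phase. The sketch below, using SciPy, shows only the principle; the method described above operates in 2D, with empirical mode decomposition for pre-filtering and the Hilbert spiral transform aided by an orientation estimator:

```python
# 1D analogue of the fringe demodulation idea: remove the background, then
# recover amplitude and phase from the analytic signal. The paper's method is
# 2D (EMD + Hilbert spiral transform); this sketch shows only the principle.
import numpy as np
from scipy.signal import hilbert

def demodulate_profile(fringe_profile):
    """fringe_profile: 1D intensity cross-section of a fringe pattern."""
    detrended = fringe_profile - fringe_profile.mean()   # crude detrending
    analytic = hilbert(detrended)                        # I + i*H{I}
    amplitude = np.abs(analytic)                         # local modulation
    phase = np.unwrap(np.angle(analytic))                # demodulated phase
    return amplitude, phase
```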

  7. Development of a doorframe-typed swinging seedling pick-up device for automatic field transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Han, H.; Mao, H.; Hu, J.; Tian, K.

    2015-07-01

    A doorframe-typed swing seedling pick-up device for automatic field transplanters was developed and evaluated in a laboratory. The device, consisting of a path manipulator and two grippers, can move the pins slowly to extract seedlings from the tray cells and return quickly to the pick-up point for the next extraction. The path manipulator was constructed as a creative series combination of type-II mechanisms. It consists of an oscillating guide linkage mechanism and a grooved globoidal cam mechanism. The gripper is a pincette-type mechanism using the pick-up pins to penetrate into the root mass for seedling extraction. The dynamics of the designed seedling pick-up device were simulated with ADAMS software. Being the first prototype, various performance tests under local production conditions were conducted to find the optimal machine operation parameters and transplant production conditions. As the gripper with multiple fine pins is moved by the swing pick-up device, it can effectively complete the transplanting work cycle of extracting, transferring, and discharging a seedling. The laboratory evaluation showed that the pick-up device equipped with two grippers can extract 80 seedlings/min with a 90% success rate and a 3% failure rate in discharging seedlings, using 42-day-old tomato plantlets. The quality of the extracted seedlings was satisfactory. (Author)

  8. Development of a doorframe-typed swinging seedling pick-up device for automatic field transplantation

    Directory of Open Access Journals (Sweden)

    Han Luhua

    2015-06-01

    Full Text Available A doorframe-typed swing seedling pick-up device for automatic field transplanters was developed and evaluated in a laboratory. The device, consisting of a path manipulator and two grippers, can move the pins slowly to extract seedlings from the tray cells and return quickly to the pick-up point for the next extraction. The path manipulator was constructed as a creative series combination of type-II mechanisms. It consists of an oscillating guide linkage mechanism and a grooved globoidal cam mechanism. The gripper is a pincette-type mechanism using the pick-up pins to penetrate into the root mass for seedling extraction. The dynamics of the designed seedling pick-up device were simulated with ADAMS software. Being the first prototype, various performance tests under local production conditions were conducted to find the optimal machine operation parameters and transplant production conditions. As the gripper with multiple fine pins is moved by the swing pick-up device, it can effectively complete the transplanting work cycle of extracting, transferring, and discharging a seedling. The laboratory evaluation showed that the pick-up device equipped with two grippers can extract 80 seedlings/min with a 90% success rate and a 3% failure rate in discharging seedlings, using 42-day-old tomato plantlets. The quality of the extracted seedlings was satisfactory.

  9. Working of spontaneously combustible coal seams with automatic air pressure regulation in the excavation field

    Energy Technology Data Exchange (ETDEWEB)

    Golik, A.S.; Churikov, Yu.V.; Troyan, N.P.

    1980-01-01

    A demonstration is made of the effectiveness of using an automatic air pressure control system during the working of spontaneously combustible coal seams in order to control endogenic fires and gas. 2 figures.

  10. Automatic detection of asteroids and meteoroids --- a wide-field survey

    Science.gov (United States)

    Vereš, P.; Tóth, J.; Jedicke, R.; Tonry, J.; Denneau, L.; Wainscoat, R.; Kornoš, L.; Šilha, J.

    2014-07-01

    The small Near-Earth Asteroids (NEAs) represent a potential risk but also an easily accessible space resource for future robotic or human in-situ space exploration or commercial activities. However, the population of 1--300 m NEAs is not well understood in terms of size-frequency and orbital distribution. NEAs with diameters below 200 m tend to have much faster spin rates than large objects, and they are believed to be monolithic and not rubble-pile like their large counterparts. Moreover, the current surveys do not systematically search for the small NEAs, which are mostly overlooked. We propose a low-cost robotic optical survey (ADAM-WFS) aimed at small NEAs, based on four state-of-the-art telescopes having extremely wide fields of view. The four Houghton-Terebizh 30-cm astrographs (Fig. left) with 4096×4096-pixel CCD cameras will acquire 96 square degrees in one exposure with a plate scale of 4.4 arcsec/pixel. In 30 seconds, the system will be able to reach +17.5 mag in unfiltered mode. The survey will be operated on a semi-automatic basis, covering the entire night sky three times per night and optimized toward recognition of fast-moving targets. The advantage of the proposed system is the use of existing off-the-shelf components and software for the image processing and object identification and linking (Denneau et al., 2013). The one-year simulation of the survey (Fig. right) at the testing location at AGO Modra observatory in Slovakia revealed that we will detect 60--240 NEAs between 1--300 m that get closer than 10 lunar distances from the Earth. The number of detections will rise by a factor of 1.5--2 in case the survey is placed at a superb observing location such as the Canary Islands. The survey will also serve as an impact warning system for imminent impactors. Our simulation showed that we have a 20% chance of finding a 50-m NEA on a direct impact orbit. The survey will provide multiple byproducts from the all-sky scans, such as comet discoveries, sparse

  11. Inhibition of bradycardia pacing caused by far-field atrial sensing in a third-generation cardioverter defibrillator with an automatic gain feature.

    Science.gov (United States)

    Curwin, J H; Roelke, M; Ruskin, J N

    1996-01-01

    The diagnostic accuracy of implantable cardioverter defibrillators may be improved by automatically adjusting gain algorithms, which in general reduce the likelihood of oversensing while maintaining the ability to detect the low amplitude signals associated with ventricular fibrillation. We present a patient with a third-generation device who developed prolonged ventricular asystole arising as a complication of the automatic gain feature. During asystole the device automatically increased sensitivity in order to prevent undersensing of ventricular fibrillation, which in this case resulted in far-field sensing of atrial activity and inhibition of ventricular pacing.

  12. Using chi-Squared Automatic Interaction Detection (CHAID) modelling to identify groups of methadone treatment clients experiencing significantly poorer treatment outcomes.

    Science.gov (United States)

    Murphy, Emma L; Comiskey, Catherine M

    2013-10-01

    In times of scarce resources it is important for services to make evidence-based decisions when identifying clients with poor outcomes. Chi-squared Automatic Interaction Detection (CHAID) modelling was used to identify characteristics of clients experiencing statistically significant poor outcomes. A national, longitudinal study recruited and interviewed, using the Maudsley Addiction Profile (MAP), 215 clients starting methadone treatment, and 78% were interviewed one year later. Four CHAID analyses were conducted to model the interactions between the primary outcome variable (heroin use in the 90 days prior to the one-year interview) and variables on drug use, treatment history, social functioning and demographics. Results revealed that regardless of these other variables, males over 22 years of age consistently demonstrated significantly poorer outcomes than all other clients. CHAID models can be easily applied by service providers to provide ongoing evidence on clients exhibiting poor outcomes and requiring priority within services.
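
    CHAID grows a tree by repeatedly selecting the categorical predictor most significantly associated with the outcome under a chi-squared test. A minimal sketch of that split-selection step, assuming pandas and SciPy; full CHAID additionally merges predictor categories and applies a Bonferroni adjustment, both omitted here:

```python
# Sketch of CHAID's core step: pick the categorical predictor whose cross-
# tabulation with the outcome has the smallest chi-squared p-value. Category
# merging and Bonferroni adjustment, part of full CHAID, are omitted.
import pandas as pd
from scipy.stats import chi2_contingency

def best_chaid_split(df, predictors, outcome):
    """df: DataFrame of categorical columns; returns (predictor, p_value)."""
    best = (None, 1.0)
    for col in predictors:
        table = pd.crosstab(df[col], df[outcome])
        _, p, _, _ = chi2_contingency(table)
        if p < best[1]:
            best = (col, p)
    return best
```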

  13. Automatic detection of tulip breaking virus (TBV) in tulip fields using machine vision

    NARCIS (Netherlands)

    Polder, G.; Heijden, van der G.W.A.M.; Doorn, van J.; Baltissen, A.H.M.C.

    2012-01-01

    Tulip breaking virus (TBV) causes severe economic losses for the Netherlands. Infected plants must be removed from the field as soon as possible to prevent further spread by aphids. Until now screening is done by visual inspection in the field. As the availability of human experts is limited there

  14. ZCURVE 3.0: identify prokaryotic genes with higher accuracy as well as automatically and accurately select essential genes.

    Science.gov (United States)

    Hua, Zhi-Gang; Lin, Yan; Yuan, Ya-Zhou; Yang, De-Chang; Wei, Wen; Guo, Feng-Biao

    2015-07-01

    In 2003, we developed an ab initio program, ZCURVE 1.0, to find genes in bacterial and archaeal genomes. In this work, we present the updated version (i.e., ZCURVE 3.0). Using 422 prokaryotic genomes, the average accuracy was 93.7% with the updated version, compared with 88.7% with the original version. Such results also demonstrate that ZCURVE 3.0 is comparable with Glimmer 3.02 and may provide complementary predictions to it. In fact, the joint application of the two programs generated better results by correctly finding more annotated genes while producing fewer false-positive predictions. As an exclusive function, ZCURVE 3.0 contains a post-processing program that can identify essential genes with high accuracy (generally >90%). We hope ZCURVE 3.0 will receive wide use through its web-based running mode. The updated ZCURVE can be freely accessed from http://cefg.uestc.edu.cn/zcurve/ or http://tubic.tju.edu.cn/zcurveb/ without any restrictions.
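
    ZCURVE's features derive from the Z-curve, a three-dimensional representation of a DNA sequence built from cumulative base-composition disparities. A sketch of the transform itself follows; the gene-finding model built on top of these coordinates is the program's own and is not reproduced here:

```python
# Sketch of the Z-curve transform underlying ZCURVE: cumulative purine/
# pyrimidine, amino/keto, and weak/strong hydrogen-bond disparities along the
# sequence. The downstream gene-finding classifier is not reproduced here.
import numpy as np

def z_curve(seq):
    """Map a DNA string to its Z-curve: an (n, 3) array of (x, y, z) points."""
    counts = {b: np.cumsum([c == b for c in seq.upper()]) for b in "ACGT"}
    a, c, g, t = counts["A"], counts["C"], counts["G"], counts["T"]
    x = (a + g) - (c + t)    # purines minus pyrimidines
    y = (a + c) - (g + t)    # amino minus keto bases
    z = (a + t) - (g + c)    # weak minus strong hydrogen bonds
    return np.stack([x, y, z], axis=1)
```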

  15. De-identifying Swedish clinical text - refinement of a gold standard and experiments with Conditional random fields

    Directory of Open Access Journals (Sweden)

    Dalianis Hercules

    2010-04-01

    Full Text Available Abstract Background In order to perform research on the information contained in Electronic Patient Records (EPRs), access to the data itself is needed. This is often very difficult due to confidentiality regulations. The data sets need to be fully de-identified before they can be distributed to researchers. De-identification is a difficult task where the definitions of annotation classes are not self-evident. Results We present work on the creation of two refined variants of a manually annotated Gold standard for de-identification, one created automatically, and one created through discussions among the annotators. The data is a subset from the Stockholm EPR Corpus, a data set available within our research group. These are used for the training and evaluation of an automatic system based on the Conditional Random Fields algorithm. Evaluating with four-fold cross-validation on sets of around 4,000-6,000 annotation instances, we obtained very promising results for both Gold Standards: an F-score around 0.80 for a number of experiments, with higher results for certain annotation classes. Moreover, 49 false positives that were verified true positives were found by the system but missed by the annotators. Conclusions Our intention is to make this Gold standard, the Stockholm EPR PHI Corpus, available to other research groups in the future. Despite being slightly more time-consuming, we believe the manual consensus gold standard is the most valuable for further research. We also propose a set of annotation classes to be used for similar de-identification tasks.
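
    A minimal sketch of training a token-level CRF for de-identification, assuming the sklearn-crfsuite package; the feature template is a toy assumption, and since the Stockholm EPR PHI Corpus is not publicly distributed, the input `sentences` is hypothetical:

```python
# Minimal sketch of training a CRF de-identifier with sklearn-crfsuite.
# The feature template is a toy assumption; `sentences` stands in for a
# corpus like the (non-public) Stockholm EPR PHI Corpus.
import sklearn_crfsuite

def token_features(sent, i):
    w = sent[i][0]
    return {"lower": w.lower(), "is_upper": w.isupper(),
            "is_digit": w.isdigit(), "prefix3": w[:3], "suffix3": w[-3:]}

def train_crf(sentences):
    """sentences: list of [(token, label), ...] with labels like B-Name, O."""
    X = [[token_features(s, i) for i in range(len(s))] for s in sentences]
    y = [[label for _, label in s] for s in sentences]
    crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1,
                               max_iterations=100)
    crf.fit(X, y)
    return crf
```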

  16. Markov Random Field Based Automatic Image Alignment for ElectronTomography

    Energy Technology Data Exchange (ETDEWEB)

    Moussavi, Farshid; Amat, Fernando; Comolli, Luis R.; Elidan, Gal; Downing, Kenneth H.; Horowitz, Mark

    2007-11-30

    Cryo electron tomography (cryo-ET) is the primary method for obtaining 3D reconstructions of intact bacteria, viruses, and complex molecular machines ([7],[2]). It first flash freezes a specimen in a thin layer of ice, and then rotates the ice sheet in a transmission electron microscope (TEM), recording images of different projections through the sample. The resulting images are aligned and then back projected to form the desired 3-D model. The typical resolution of a biological electron microscope is on the order of 1 nm per pixel, which means that small imprecisions in the microscope's stage or lenses can cause large alignment errors. To enable a high-precision alignment, biologists add a small number of spherical gold beads to the sample before it is frozen. These beads generate high-contrast dots in the image that can be tracked across projections. Each gold bead can be seen as a marker with a fixed location in 3D, which provides the reference points to bring all the images to a common frame, as in the classical structure-from-motion problem. A high-accuracy alignment is critical to obtain a high-resolution tomogram (usually on the order of 5-15 nm resolution). While some methods try to automate the task of tracking markers and aligning the images ([8],[4]), they require user intervention if the SNR of the image becomes too low. Unfortunately, cryogenic electron tomography (cryo-ET) often has poor SNR, since the samples are relatively thick (for TEM) and the restricted electron dose usually results in projections with SNR under 0 dB. This paper shows that formulating this problem as a maximum-likelihood estimation task yields an approach that is able to automatically align cryo-ET datasets with high precision using inference in graphical models. This approach has been packaged into publicly available software called RAPTOR (Robust Alignment and Projection estimation for Tomographic Reconstruction).

  17. Automatic compensation of magnetic field for a rubidium space cold atom clock

    Science.gov (United States)

    Lin, Li; Jingwei, Ji; Wei, Ren; Xin, Zhao; Xiangkai, Peng; Jingfeng, Xiang; Desheng, Lü; Liang, Liu

    2016-07-01

    When the cold atom clock operates in microgravity in near-Earth orbit, its performance will be affected by fluctuations of the magnetic field. A strategy is proposed to suppress the fluctuation of the magnetic field by additional coils, whose current is adjusted accordingly to compensate for the magnetic fluctuation through linear and incremental compensation. The flight model of the cold atom clock was tested in a simulated orbital magnetic environment, and the magnetic field fluctuation in the Ramsey cavity was reduced from 17 nT to 2 nT, implying that the uncertainty due to the second-order Zeeman shift is reduced to less than 2×10⁻¹⁶. In addition, utilizing the compensation, the magnetic field in the trapping zone can be suppressed from 7.5 μT to less than 0.3 μT to meet the magnetic field requirement for polarization-gradient cooling of atoms. Project supported by the Ministry of Science and Technology of China (Grant No. 2013YQ09094304), the Youth Innovation Promotion Association, Chinese Academy of Sciences, and the National Natural Science Foundation of China (Grant Nos. 11034008 and 11274324).
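
    The compensation scheme reduces to a feedback loop on the coil current. A minimal sketch under stated assumptions: the coil constant, gain, and read/write functions are hypothetical placeholders, and the published linear-plus-incremental scheme is approximated here by an incremental correction applied through an assumed linear coil constant:

```python
# Sketch of the compensation idea: incrementally adjust the coil current to
# cancel the measured field error through an assumed linear coil constant.
# Gains, the coil constant, and the read/write callbacks are hypothetical.
def compensate(read_field_nT, set_coil_current_mA,
               target_nT=0.0, coil_gain_nT_per_mA=5.0, k_inc=0.2):
    current_mA = 0.0
    for _ in range(1000):                  # control-loop iterations
        error_nT = read_field_nT() - target_nT
        # apply only a fraction of the full linear correction per step,
        # so the loop converges smoothly instead of overshooting
        current_mA -= k_inc * (error_nT / coil_gain_nT_per_mA)
        set_coil_current_mA(current_mA)
```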

  18. Scaling laws for Bénard-Marangoni convection using automatic background field generation

    Science.gov (United States)

    Wynn, Andrew; Pershin, Anton; Fantuzzi, Giovanni

    2016-11-01

    We consider scaling laws for Bénard-Marangoni convection and, in particular, study the question of obtaining an upper bound on the Nusselt number Nu in terms of the flow's forcing parameter, the Marangoni number Ma. It has recently been proven using the background field method that, in the case of infinite Prandtl number, these fundamental quantities are related by the scaling law Nu ≲ Ma^(2/7). We construct an optimal background field which numerically improves this bound. The constructed background profiles have boundary layers at both the upper and lower domain boundaries, in contrast to those used in previous work. Analysis will be presented explaining how the two boundary layers interact (in the context of the optimization problem associated with the background field method) to improve the Nusselt number bound within the background profile method.

  19. Automatic segmentation for brain MR images via a convex optimized segmentation and bias field correction coupled model.

    Science.gov (United States)

    Chen, Yunjie; Zhao, Bo; Zhang, Jianwei; Zheng, Yuhui

    2014-09-01

    Accurate segmentation of magnetic resonance (MR) images remains challenging mainly due to intensity inhomogeneity, which is also commonly known as bias field. Recently, active contour models with geometric information constraints have been applied; however, most of them deal with the bias field by using a necessary pre-processing step before segmentation of the MR data. This paper presents a novel automatic variational method, which can segment brain MR images while simultaneously correcting the bias field, even in images with high intensity inhomogeneities. We first define a function for clustering the image pixels in a smaller neighborhood. The cluster centers in this objective function have a multiplicative factor that estimates the bias within the neighborhood. In order to reduce the effect of noise, the local intensity variations are described by Gaussian distributions with different means and variances. Then, the objective functions are integrated over the entire domain. In order to obtain the global optimum and make the results independent of the initialization of the algorithm, we reformulated the energy function to be convex and minimized it using the Split Bregman method. A salient advantage of our method is that its result is independent of initialization, which allows robust and fully automated application. Our method is able to estimate bias of quite general profiles, even in 7T MR images. Moreover, our model can also distinguish regions with similar intensity distributions but different variances. The proposed method has been rigorously validated with images acquired on a variety of imaging modalities, with promising results.

  20. Automatic mapping of visual cortex receptive fields: a fast and precise algorithm.

    Science.gov (United States)

    Fiorani, Mario; Azzi, João C B; Soares, Juliana G M; Gattass, Ricardo

    2014-01-15

    An important issue for neurophysiological studies of the visual system is the definition of the region of the visual field that can modify a neuron's activity (i.e., the neuron's receptive field - RF). Usually a trade-off exists between precision and the time required to map a RF. Manual methods (qualitative) are fast but impose a variable degree of imprecision, while quantitative methods are more precise but usually require more time. We describe a rapid quantitative method for mapping visual RFs that is derived from computerized tomography and named back-projection. This method finds the intersection of responsive regions of the visual field based on spike density functions that are generated over time in response to long bars moving in different directions. An algorithm corrects the response profiles for latencies and allows for the conversion of the time domain into a 2D-space domain. The final product is an RF map that shows the distribution of the neuronal activity in visual-spatial coordinates. In addition to mapping the RF, this method also provides functional properties, such as latency, orientation and direction preference indexes. This method exhibits the following beneficial properties: (a) speed; (b) ease of implementation; (c) precise RF localization; (d) sensitivity (this method can map RFs based on few responses); (e) reliability (this method provides consistent information about RF shapes and sizes, which will allow for comparative studies); (f) comprehensiveness (this method can scan for RFs over an extensive area of the visual field); (g) informativeness (it provides functional quantitative data about the RF); and (h) usefulness (this method can map RFs in regions without direct retinal inputs, such as the cortical representations of the optic disc and of retinal lesions, which should allow for studies of functional connectivity, reorganization and neural plasticity). Furthermore, our method allows for precise mapping of RFs in a 30° by 30

  1. Semi-automatic liver tumor segmentation with hidden Markov measure field model and non-parametric distribution estimation.

    Science.gov (United States)

    Häme, Yrjö; Pollari, Mika

    2012-01-01

    A novel liver tumor segmentation method for CT images is presented. The aim of this work was to reduce the manual labor and time required in the treatment planning of radiofrequency ablation (RFA) by reliably providing accurate and automated tumor segmentations. The developed method is semi-automatic, requiring only minimal user interaction. The segmentation is based on non-parametric intensity distribution estimation and a hidden Markov measure field model, with application of a spherical shape prior. A post-processing operation is also presented to remove the overflow to adjacent tissue. In addition to the conventional approach of using a single image as input data, an approach using images from multiple contrast phases was developed. The accuracy of the method was validated with two sets of patient data and artificially generated samples. The patient data included preoperative RFA images and a public data set from the "3D Liver Tumor Segmentation Challenge 2008". The method achieved very high accuracy with the RFA data, and outperformed other methods evaluated with the public data set, receiving an average overlap error of 30.3%, which represents an improvement of 2.3 percentage points over the previously best-performing semi-automatic method. The average volume difference was 23.5%, and the average, RMS, and maximum surface distance errors were 1.87, 2.43, and 8.09 mm, respectively. The method produced good results even for tumors with very low contrast and ambiguous borders, and the performance remained high with noisy image data.

  2. Field manual for identifying and preserving high-water mark data

    Science.gov (United States)

    Feaster, Toby D.; Koenig, Todd A.

    2017-09-26

    This field manual provides general guidance for identifying and collecting high-water marks and is meant to be used by field personnel as a quick reference. The field manual describes purposes for collecting and documenting high-water marks along with the most common types of high-water marks. The manual provides a list of suggested field equipment, describes rules of thumb and best practices for finding high-water marks, and describes the importance of evaluating each high-water mark and assigning a numeric uncertainty value as part of the flagging process. The manual also includes an appendix of photographs of a variety of high-water marks obtained from various U.S. Geological Survey field investigations along with general comments about the logic for the assigned uncertainty values.

  3. In Search of Paradigms: Identifying the Theoretical Foundations of the IS Field

    NARCIS (Netherlands)

    Moody, D.L.; Iacob, Maria Eugenia; Amrit, Chintan Amrit; Alexander, T.; Turpin, M.; van Deventer, J.P.

    2010-01-01

    The goal of this paper is to identify the theoretical foundations of the IS field. Currently there is a lack of consensus about what the core IS theories are, or even whether we have any at all. If we do, they certainly don't appear in IS curricula or textbooks as they do in more mature disciplines. So far,

  4. The identifiable victim effect in charitable giving: evidence from a natural field experiment

    DEFF Research Database (Denmark)

    Lesner, Tine; Rasmussen, O. D.

    2014-01-01

    We design a natural field experiment to enhance our understanding of the role of the identifiable victim effect in charitable giving. Using direct mail solicitations to 25797 prior donors of a nonprofit charity, we tested the responsiveness of donors to make a contribution to either an identifiab...

  5. RECOVIR: An application package to automatically identify some single stranded RNA viruses using capsid protein residues that uniquely distinguish among these viruses

    Directory of Open Access Journals (Sweden)

    Fox George E

    2007-10-01

    Full Text Available Abstract Background Most single stranded RNA (ssRNA) viruses mutate rapidly to generate a large number of strains having highly divergent capsid sequences. Accurate strain recognition in uncharacterized target capsid sequences is essential for epidemiology, diagnostics, and vaccine development. Strain recognition based on similarity scores between target sequences and sequences of homology-matched reference strains is often time-consuming and ambiguous. This is especially true if only partial target sequences are available or if different ssRNA virus families are jointly analyzed. In such cases, knowledge of residues that uniquely distinguish among known reference strains is critical for rapid and unambiguous strain identification. Conventional sequence comparisons are unable to identify such capsid residues due to high sequence divergence among the ssRNA virus reference strains. Consequently, automated general methods to reliably identify strains using strain-distinguishing residues are not currently available. Results We present here RECOVIR ("recognize viruses"), a software tool to automatically detect strains of caliciviruses and picornaviruses by comparing their capsid residues with built-in databases of residues that uniquely distinguish among known reference strains of these viruses. The databases were created by constructing partitioned phylogenetic trees of complete capsid sequences of these viruses. Strains were correctly identified for more than 300 complete and partial target sequences by comparing the database residues with the aligned residues of these sequences. It required about 5 seconds of real time to process each sequence. A Java-based user interface coupled with Perl-coded computational modules ensures high portability of the software. RECOVIR currently runs on Windows XP and Linux platforms. The software generalizes a manual method briefly outlined earlier for human caliciviruses. Conclusion This study shows implementation of

  6. Automatic history matching of an offshore field in Brazil; Ajuste automatico de historico de um campo offshore no Brasil

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Jose P.M. dos [PETROBRAS S.A., Macae, RJ (Brazil). Exploracao e Producao. Bacia de Campos]. E-mail: zepedro@ep-bc.petrobras.com.br; Schiozer, Denis J. [Universidade Estadual de Campinas, SP (Brazil). Dept. de Engenharia de Petroleo]. E-mail: denis@cepetro.unicamp.br

    2000-07-01

    Efficient reservoir management is strongly influenced by good production prediction, which in turn depends on good reservoir characterization. Due to the complexity of the dynamics of multiphase flow in porous media and to the several geological uncertainties involved in the process, this characterization is validated through a history match in the study of the reservoir in question. History matching is usually a very complex task, and much of the time it can be a frustrating experience due to the high number of variables to be adjusted to reach a final objective, which can be a combination of several matches. Automated history matching techniques have been the object of several studies but have found limited acceptance due to the large computational effort required. Nowadays, they are becoming more attractive, motivated by recent hardware and software developments. This work shows an example of the application of automatic history matching using an offshore field in Brazil, with emphasis on the benefits of the use of parallel computing and optimization techniques to reduce the total time of the process. It is shown that although the computational effort is higher, the total time of a reservoir study can be significantly reduced, with higher quality results. (author)

  7. Identifying a framework of institutional change in the field of higher education in Russia

    Directory of Open Access Journals (Sweden)

    Volchik Vyacheslav, V.

    2015-12-01

    Full Text Available The paper focuses on the features of institutional change in the field of higher education in Russia. Institutional environment of Russian higher education is very dynamic, institutions change quickly; therefore, interactions between actors occur spontaneously rather than deliberately. The article aims at identifying relevant institutions, regulatory mechanisms, informal rules and practices that influence actors’ behavior in the field. The paper emphasizes the application of qualitative interpretative methods in examining actors’ behavior. Participant observation and questionnaires have been chosen as prevailing data collection methods. The results obtained through participant observation and questionnaires are intermediate, preceding the stage of semi-structured interviews.

  8. A New Method of Identifying 3D Null Points in Solar Vector Magnetic Fields

    Institute of Scientific and Technical Information of China (English)

    Hui Zhao; Jing-Xiu Wang; Jun Zhang; Chi-Jie Xiao

    2005-01-01

    Employing the Poincaré index of isolated null points in a vector field, we worked out a mathematical method of searching for 3D null points in coronal magnetic fields. After introducing the relevant differential topology, we test the method using the analytical model of Brown & Priest. The location of the null point identified by our method coincides precisely with the analytical solution. Finally, we apply the method to the 3D coronal magnetic fields reconstructed from an observed MDI magnetogram of a super-active region (NOAA 10488). We find that the 3D null point seems to be a key element in the magnetic topology associated with flare occurrence.
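
    A cheap precursor to the Poincaré-index test is to flag grid cells in which every component of B changes sign across the cell's corners, a necessary (but not sufficient) condition for an interior null. A sketch with NumPy; the rigorous criterion used in the paper is the Poincaré index itself:

```python
# Coarse candidate search for 3D nulls: flag grid cells where every component
# of B changes sign across the cell's eight corners. This is only a necessary
# condition; the paper's Poincare-index test is the rigorous criterion.
import numpy as np

def null_candidate_cells(Bx, By, Bz):
    """Bx, By, Bz: 3D arrays on a regular grid; returns a list of cell indices."""
    cells = []
    ni, nj, nk = Bx.shape
    for i in range(ni - 1):
        for j in range(nj - 1):
            for k in range(nk - 1):
                corners = (slice(i, i + 2), slice(j, j + 2), slice(k, k + 2))
                if all(B[corners].min() < 0 < B[corners].max()
                       for B in (Bx, By, Bz)):
                    cells.append((i, j, k))
    return cells
```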

  9. Automatic segmentation of ground-glass opacities in lung CT images by using Markov random field-based algorithms.

    Science.gov (United States)

    Zhu, Yanjie; Tan, Yongqing; Hua, Yanqing; Zhang, Guozhen; Zhang, Jianguo

    2012-06-01

    Chest radiologists rely on the segmentation and quantitative analysis of ground-glass opacities (GGO) to perform imaging diagnoses that evaluate the disease severity or recovery stages of diffuse parenchymal lung diseases. However, GGO are computationally difficult to segment and analyze compared with other lung disease patterns, since GGO usually do not have clear boundaries. In this paper, we present a new approach which automatically segments GGO in lung computed tomography (CT) images using algorithms derived from Markov random field theory. Further, we systematically evaluate the performance of the algorithms in segmenting GGO in lung CT images under different situations. CT image studies from 41 patients with diffuse lung diseases were enrolled in this research. The local distributions were modeled with both simple and adaptive maximum a posteriori (MAP) models, the latter referred to as AMAP. For best segmentation, we used the simulated annealing algorithm with a Gibbs sampler to solve the combinatorial optimization problem of the MAP estimators, and we applied a knowledge-guided strategy to reduce false positive regions. We achieved AMAP-based GGO segmentation results of 86.94%, 94.33%, and 94.06% in average sensitivity, specificity, and accuracy, respectively, and we evaluated the performance using radiologists' subjective evaluation and quantitative analysis and diagnosis. We also compared the results of AMAP-based GGO segmentation with those of support vector machine-based methods, and we discuss the reliability and other issues of AMAP-based GGO segmentation. Our research results demonstrate the acceptability and usefulness of AMAP-based GGO segmentation for assisting radiologists in detecting GGO in high-resolution CT diagnostic procedures.
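
    The MAP labeling problem described above can be illustrated with a much simpler optimizer. The sketch below uses iterated conditional modes (ICM) on a two-class Potts model with Gaussian data terms; the paper instead uses simulated annealing with a Gibbs sampler, and the class means, sigma, and beta here are illustrative assumptions:

```python
# Sketch of MAP-MRF labeling with a Potts smoothness prior. The paper uses
# simulated annealing with a Gibbs sampler; iterated conditional modes (ICM),
# shown here for brevity, is a cheaper greedy alternative. Two classes only;
# the Gaussian class parameters and beta are illustrative assumptions.
import numpy as np

def icm_segment(img, means=(0.3, 0.7), sigma=0.1, beta=1.5, n_iter=5):
    """img: 2D float array scaled to [0, 1]; returns a binary label map."""
    labels = np.argmin(np.stack([(img - m) ** 2 for m in means]), axis=0)
    for _ in range(n_iter):
        pad = np.pad(labels, 1, mode="edge")
        neighbors = (pad[:-2, 1:-1], pad[2:, 1:-1],
                     pad[1:-1, :-2], pad[1:-1, 2:])
        energies = []
        for lab, mu in enumerate(means):
            data = (img - mu) ** 2 / (2 * sigma ** 2)           # data term
            smooth = beta * sum(nb != lab for nb in neighbors)  # Potts prior
            energies.append(data + smooth)
        labels = np.argmin(np.stack(energies), axis=0)
    return labels
```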

  10. Persistent Identifiers for Field Expeditions: A Next Step for the US Oceanographic Research Fleet

    Science.gov (United States)

    Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen

    2016-04-01

    Oceanographic research cruises are complex affairs, typically requiring an extensive effort to secure the funding, plan the experiment, and mobilize the field party. Yet cruises are not typically published online as first-class digital objects with persistent, citable identifiers linked to the scientific literature. The Rolling Deck to Repository (R2R; info@rvdata.us) program maintains a master catalog of oceanographic cruises for the United States research fleet, currently documenting over 6,000 expeditions on 37 active and retired vessels. In 2015, R2R started routinely publishing a Digital Object Identifier (DOI) for each completed cruise. Cruise DOIs, in turn, are linked to related persistent identifiers where available including the Open Researcher and Contributor ID (ORCID) for members of the science party, the International Geo Sample Number (IGSN) for physical specimens collected during the cruise, the Open Funder Registry (FundRef) codes that supported the experiment, and additional DOIs for datasets, journal articles, and other products resulting from the cruise. Publishing a persistent identifier for each field expedition will facilitate interoperability between the many different repositories that hold research products from cruises; will provide credit to the investigators who secured the funding and carried out the experiment; and will facilitate the gathering of fleet-wide altmetrics that demonstrate the broad impact of oceanographic research.

  11. Human State Computing – Employing different feature sources of non-intrusive biosignals for pattern recognition based automatic state recognition within occupational fields of application

    OpenAIRE

    Schoss, Tom

    2016-01-01

    The aim of the present thesis is to outline how psychology in general and occupational psychology in particular can benefit from automatic biosignal analysis. Progress in this field within the last decade offers opportunities to widen the horizon of commonly employed statistics and to use an interdisciplinary approach to gain new insights into typical occupational issues with the help of human state computing. Advances within psychological methodology are necessary, because recent approaches are neit...

  12. A Field Evaluation of a Prototype Global Identifier for UF6 Cylinders

    Energy Technology Data Exchange (ETDEWEB)

    White-Horton, Jessica L [ORNL; Whitaker, J Michael [ORNL

    2016-01-01

    The U.S. Department of Energy's National Nuclear Security Administration (DOE/NNSA), members of the U.S. national laboratories, UF6 industry stakeholders, and international inspectorates have been working on developing a global identifier (ID) for UF6 cylinders. This industry-driven project has identified efficiency gains for facility operations, state and/or regional regulation, and inspections by the International Atomic Energy Agency. The global ID features standardized alphanumeric characters, a large font, and a barcode (for automated reading), and it is affixed to the cylinder. Four years of active engagement with all of the stakeholders has resulted in the development of user requirements, implementation guidelines, and a preferred design for the ID. Although this project was conceived and has been largely managed from the DOE/NNSA side to address international non-proliferation concerns, it remains an industry-driven initiative. When designing the field evaluation, the NNSA team worked closely with a World Nuclear Transport Institute (WNTI) Working Group on UF6 Cylinder Identification to determine features that would provide the most benefit. The WNTI Working Group consists predominantly of industry members associated with cylinder fabrication, UF6 conversion, enrichment, fuel fabrication, and cylinder transport. Despite its industry-laden focus, the DOE/NNSA team realized that a vote of confidence from the IAEA could serve as a catalyst for the overall project and its eventual implementation. In April of 2016, a field evaluation was conducted to demonstrate how the key features of the identifier would work in an operational setting. A selected team travelled to Vienna to evaluate the benefits of a global identifier in performing a PIV in a cylinder storage area containing ~50-100 cylinders. The mock tag-checking exercise was conducted three separate times, with varying scenarios and three different teams. The first group performed the exercise according

  13. Identifying Areas for Field Conservation of Forages in Latin American Disturbed Environments

    Directory of Open Access Journals (Sweden)

    Michael Peters

    2005-06-01

    Full Text Available This paper uses the spatial analysis tools DIVA and FloraMap to identify potential areas for the in situ conservation of a set of 10 forage species. We introduce the idea of roadside verges as conservation areas and discuss the risks and opportunities of two potential scenarios for conservation. These are the introduction of mass reservoirs outside of the original areas of collection and conservation inside the area of origin. Four potential areas for in situ conservation in Latin America are identified. Although more detailed studies using remote sensing, soil information, and field reconnaissance will be necessary for a final assessment of the suggested areas as field conservation sites, we discuss the possibilities of establishing low-maintenance communities and the potential dangers of introducing harmful weed species. We do not have final answers with regard to the permanent maintenance of genetic diversity in these areas but suggest that further studies of genetic drift in the populations would not only be scientifically useful but might also lead to identifying useful genotypes for local use.

  14. Field-based high throughput phenotyping rapidly identifies genomic regions controlling yield components in rice

    Science.gov (United States)

    Tanger, Paul; Klassen, Stephen; Mojica, Julius P.; Lovell, John T.; Moyers, Brook T.; Baraoidan, Marietta; Naredo, Maria Elizabeth B.; McNally, Kenneth L.; Poland, Jesse; Bush, Daniel R.; Leung, Hei; Leach, Jan E.; McKay, John K.

    2017-01-01

    To ensure food security in the face of population growth, decreasing water and land for agriculture, and increasing climate variability, crop yields must increase faster than the current rates. Increased yields will require implementing novel approaches in genetic discovery and breeding. Here we demonstrate the potential of field-based high throughput phenotyping (HTP) on a large recombinant population of rice to identify genetic variation underlying important traits. We find that detecting quantitative trait loci (QTL) with HTP phenotyping is as accurate and effective as traditional labor-intensive measures of flowering time, height, biomass, grain yield, and harvest index. Genetic mapping in this population, derived from a cross of a modern cultivar (IR64) with a landrace (Aswina), identified four alleles with negative effect on grain yield that are fixed in IR64, demonstrating the potential for HTP of large populations as a strategy for the second green revolution. PMID:28220807

  15. Field-based high throughput phenotyping rapidly identifies genomic regions controlling yield components in rice.

    Science.gov (United States)

    Tanger, Paul; Klassen, Stephen; Mojica, Julius P; Lovell, John T; Moyers, Brook T; Baraoidan, Marietta; Naredo, Maria Elizabeth B; McNally, Kenneth L; Poland, Jesse; Bush, Daniel R; Leung, Hei; Leach, Jan E; McKay, John K

    2017-02-21

    To ensure food security in the face of population growth, decreasing water and land for agriculture, and increasing climate variability, crop yields must increase faster than the current rates. Increased yields will require implementing novel approaches in genetic discovery and breeding. Here we demonstrate the potential of field-based high throughput phenotyping (HTP) on a large recombinant population of rice to identify genetic variation underlying important traits. We find that detecting quantitative trait loci (QTL) with HTP phenotyping is as accurate and effective as traditional labor-intensive measures of flowering time, height, biomass, grain yield, and harvest index. Genetic mapping in this population, derived from a cross of a modern cultivar (IR64) with a landrace (Aswina), identified four alleles with negative effect on grain yield that are fixed in IR64, demonstrating the potential for HTP of large populations as a strategy for the second green revolution.

  16. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may look random. They are complicated in the sense of not being ultimately periodic, and it may not be easy to name the rule by which a sequence is generated; however, such a rule always exists. The concept of automatic sequences has special applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: · a general introduction to automatic sequences · the basic (combinatorial) properties of automatic sequences · the algebraic approach to automatic sequences · geometric objects related to automatic sequences.

  17. Progression of patterns (POP): a machine classifier algorithm to identify glaucoma progression in visual fields.

    Science.gov (United States)

    Goldbaum, Michael H; Lee, Intae; Jang, Giljin; Balasubramanian, Madhusudhanan; Sample, Pamela A; Weinreb, Robert N; Liebmann, Jeffrey M; Girkin, Christopher A; Anderson, Douglas R; Zangwill, Linda M; Fredette, Marie-Josee; Jung, Tzyy-Ping; Medeiros, Felipe A; Bowd, Christopher

    2012-09-25

    We evaluated Progression of Patterns (POP) for its ability to identify progression of glaucomatous visual field (VF) defects. POP uses a variational Bayesian independent component mixture model (VIM), a machine learning classifier (MLC) developed previously. VIM separated Swedish Interactive Thresholding Algorithm (SITA) VFs from a set of 2,085 normal and glaucomatous eyes into nine axes (VF patterns), seven of them glaucomatous. Stable glaucoma was simulated in a second set of 55 patient eyes with five VFs each, collected within four weeks. A third set of 628 eyes with 4,186 VFs (mean ± SD of 6.7 ± 1.7 VFs over 4.0 ± 1.4 years) was tested for progression. Tested eyes were placed into suspect and glaucoma categories at baseline, based on VFs and disk stereoscopic photographs; a subset of eyes had stereophotographic evidence of progressive glaucomatous optic neuropathy (PGON). Each sequence of fields was projected along seven VIM glaucoma axes. Linear regression (LR) slopes generated from projections onto each axis yielded a degree of confidence (DOC) that there was progression. At 95% specificity, progression cutoffs were established for POP, visual field index (VFI), and mean deviation (MD). Guided progression analysis (GPA) was also compared. POP identified a statistically similar number of eyes (P > 0.05) as progressing compared with VFI, MD, and GPA in suspects (3.8%, 2.7%, 5.6%, and 2.9%, respectively), more eyes than GPA (P = 0.01) in glaucoma (16.0%, 15.3%, 12.0%, and 7.3%, respectively), and more eyes than GPA (P = 0.05) in PGON eyes (26.3%, 23.7%, 27.6%, and 14.5%, respectively). POP, with its display of DOC of progression and its identification of the progressing VF defect pattern, adds to the information available to the clinician for detecting VF progression.
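
    The progression decision in POP reduces to linear-regression slopes of per-axis projections over time, thresholded at cutoffs fixed for 95% specificity. A minimal sketch follows; the learned VIM axes and the cutoff values are placeholders, not the published model:

```python
# Sketch of the slope step in POP: project each visit's VF onto the learned
# glaucoma axes, regress the projections on exam date, and flag progression
# when any slope exceeds a cutoff set at 95% specificity. The axes and the
# cutoff values here are placeholders, not the published model.
import numpy as np

def progression_slopes(vf_series, axes, exam_years):
    """vf_series: (n_visits, n_points); axes: (n_axes, n_points)."""
    proj = vf_series @ axes.T                      # (n_visits, n_axes)
    return np.array([np.polyfit(exam_years, proj[:, k], 1)[0]
                     for k in range(axes.shape[0])])

def is_progressing(slopes, cutoffs):
    """cutoffs: per-axis slope thresholds fixed at 95% specificity."""
    return bool(np.any(slopes > cutoffs))
```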

  18. Towards identifying the mechanisms underlying field-aligned edge-loss of HHFW power on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    Perkins, R. J. [Princeton Plasma Physics Laboratory (PPPL); Ahn, Joonwook [ORNL; Bell, R. E. [Princeton Plasma Physics Laboratory (PPPL); Bertelli, Nicola [Princeton Plasma Physics Laboratory (PPPL); Diallo, A. [Princeton Plasma Physics Laboratory (PPPL); Gerhardt, S. [Princeton Plasma Physics Laboratory (PPPL); Gray, T. K. [Oak Ridge National Laboratory (ORNL); Green, David L [ORNL; Jaeger, E. F. [XCEL; Hosea, J. [Princeton Plasma Physics Laboratory (PPPL); Jaworski, M. A. [Princeton Plasma Physics Laboratory (PPPL); LeBlanc, B [Princeton Plasma Physics Laboratory (PPPL); Kramer, G. [Princeton Plasma Physics Laboratory (PPPL); McLean, Adam G [ORNL; Maingi, Rajesh [ORNL; Phillips, C. K. [Princeton Plasma Physics Laboratory (PPPL); Podesta, M. [Princeton Plasma Physics Laboratory (PPPL); Ryan, Philip Michael [ORNL; Sabbagh, S. A. [Columbia University; Scotti, F. [Princeton Plasma Physics Laboratory (PPPL); Taylor, G. [Princeton Plasma Physics Laboratory (PPPL); Wilson, J. R. [Princeton Plasma Physics Laboratory (PPPL)

    2013-01-01

    Fast-wave heating will be a major heating scheme on ITER, as it can heat ions directly and, unlike neutral beams, is relatively unaffected by the large machine size. However, fast-wave interactions with the plasma edge can lead to deleterious effects such as, in the case of the high-harmonic fast-wave (HHFW) system on NSTX, large losses of fast-wave power in the scrape-off layer (SOL) under certain conditions. In such scenarios, a large fraction of the lost HHFW power is deposited on the upper and lower divertors in bright spiral shapes. The responsible mechanism(s) has not yet been identified but may include fast-wave propagation in the scrape-off layer, parametric decay instability, and RF currents driven by the antenna reactive fields. Understanding and mitigating these losses is important not only for improving the heating and current drive on NSTX-Upgrade but also for understanding fast-wave propagation across the SOL in any fast-wave system. This talk summarizes experimental results demonstrating that the flow of lost HHFW power to the divertor regions largely follows the open SOL magnetic field lines. This lost power flux is relatively large close to both the antenna and the last closed flux surface, with a reduced level in between, so the loss mechanism cannot be localized to the antenna. At the same time, significant losses also occur along field lines connected to the inboard edge of the bottom antenna plate. The power lost within the spirals is roughly estimated, showing that these field-aligned losses to the divertor are significant but may not account for the total HHFW loss. To elucidate the role of the onset layer for perpendicular fast-wave propagation with regard to fast-wave propagation in the SOL, a cylindrical cold-plasma model is being developed. This model, in addition to advanced RF codes such as TORIC and AORSA, is aimed at identifying the underlying mechanism(s) behind these SOL losses, to minimize their effects in NSTX-U, and to predict

  19. The primary motor area for voluntary diaphragmatic motion identified by high field fMRI.

    Science.gov (United States)

    Nakayama, Takahiro; Fujii, Yukihiko; Suzuki, Kiyotaka; Kanazawa, Ichiro; Nakada, Tsutomu

    2004-06-01

    In order to identify the precise location of the primary motor area for the diaphragm with respect to the classical motor homunculus, functional magnetic resonance imaging (fMRI) experiments were performed utilizing independent component-cross correlation-sequential epoch (ICS) analysis on a high-field (3.0 Tesla) system. Activations which correlated with voluntary diaphragmatic motion mapped onto the area anterolateral to that for voluntary hand motion (internal control in ICS analysis). Multiple subject analysis yielded the primary motor cortex for the diaphragm to be (±48, -4, 47) in the Talairach and Tournoux coordinates. The results were highly consistent with the previously reported cortical area for the diaphragm determined by transcranial electrical/magnetic stimulation.

  20. Cardiac magnetic field map topology quantified by Kullback-Leibler entropy identifies patients with hypertrophic cardiomyopathy

    Science.gov (United States)

    Schirdewan, A.; Gapelyuk, A.; Fischer, R.; Koch, L.; Schütt, H.; Zacharzowsky, U.; Dietz, R.; Thierfelder, L.; Wessel, N.

    2007-03-01

    Hypertrophic cardiomyopathy (HCM) is a common primary inherited cardiac muscle disorder, defined clinically by the presence of unexplained left ventricular hypertrophy. The detection of affected patients remains challenging. Genetic testing is limited because an underlying mutation can be found in only 50%-60% of all HCM diagnoses. Furthermore, the disease has a varied clinical course and outcome, with many patients having little or no discernible cardiovascular symptoms, whereas others develop profound exercise limitation and recurrent arrhythmias or sudden cardiac death. Therefore prospective screening of HCM family members is strongly recommended. According to the current guidelines this includes serial echocardiographic and electrocardiographic examinations. In this study we investigated the capability of cardiac magnetic field mapping (CMFM) to detect patients suffering from HCM. We introduce for the first time a combined diagnostic approach based on map topology quantification using Kullback-Leibler (KL) entropy and regional magnetic field strength parameters. The cardiac magnetic field was recorded over the anterior chest wall using a multichannel LT-SQUID system. CMFM was calculated on a regular 36-point grid. We analyzed CMFM in patients with a confirmed diagnosis of HCM (HCM, n = 33, 43.8±13 years, 13 women, 20 men), a control group of healthy subjects (NORMAL, n = 57, 39.6±8.9 years, 22 women and 35 men), and patients with confirmed cardiac hypertrophy due to arterial hypertension (HYP, n = 42, 49.7±7.9 years, 15 women and 27 men). A subgroup analysis was performed between HCM patients suffering from the obstructive (HOCM, n = 19) and nonobstructive (HNCM, n = 14) forms of the disease. KL entropy-based map topology quantification alone identified HCM patients with a sensitivity of 78.8% and a specificity of 86.9% (overall classification rate 84.8%). The combination of the KL parameters with a regional field strength parameter improved the overall
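
    The topology measure itself is the discrete Kullback-Leibler divergence between a patient's normalized field map and a reference map. A minimal sketch, assuming field magnitudes are normalized into probability distributions and that a group-average map serves as the reference (both assumptions for illustration):

```python
# Sketch of the map-topology measure: normalize two 36-point magnetic field
# maps into discrete distributions and compute the Kullback-Leibler
# divergence of the patient map from a reference (e.g., group-average) map.
# The magnitude normalization and the reference map are assumptions.
import numpy as np

def kl_entropy(patient_map, reference_map, eps=1e-12):
    """Maps: arrays of field values on the same grid (magnitudes are used)."""
    p = np.abs(patient_map).ravel() + eps
    q = np.abs(reference_map).ravel() + eps
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log(p / q)))
```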

  1. A simple field method to identify foot strike pattern during running.

    Science.gov (United States)

    Giandolini, Marlène; Poupard, Thibaut; Gimenez, Philippe; Horvais, Nicolas; Millet, Guillaume Y; Morin, Jean-Benoît; Samozino, Pierre

    2014-05-07

    Identifying foot strike patterns in running is an important issue for sport clinicians, coaches and the footwear industry. Current methods allow the monitoring of either many steps in laboratory conditions or only a few steps in the field. Because measuring running biomechanics during actual practice is critical, our purpose is to validate a method aimed at identifying foot strike patterns during continuous field measurements. Based on heel and metatarsal accelerations, this method requires two uniaxial accelerometers. The time between heel and metatarsal acceleration peaks (THM) was compared to the foot strike angle in the sagittal plane (αfoot) obtained by 2D video analysis for various conditions of speed, slope, footwear, foot strike and state of fatigue. Acceleration and kinematic measurements were performed at 1000 Hz and 120 Hz, respectively, during 2-min treadmill running bouts. Significant correlations were observed between THM and αfoot for 14 out of 15 conditions. The overall correlation coefficient was r = 0.916 (P < 0.0001, n = 288). The THM method is thus highly reliable for a wide range of speeds and slopes, for all types of foot strike except extreme forefoot strike (during which the heel rarely or never strikes the ground), and for different footwear and states of fatigue. We proposed a classification based on THM: FFS<-5.49ms
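
    The THM measure reduces to the signed time between the heel and metatarsal acceleration peaks within one step. A minimal sketch; the peak picking by simple argmax and the rearfoot cutoff are placeholders (only the forefoot cutoff of -5.49 ms appears in the truncated classification above):

```python
# Sketch of the THM computation: locate the heel and metatarsal acceleration
# peaks for one step and take the signed time difference. The peak-picking
# by argmax and the rearfoot cutoff are placeholders; the -5.49 ms forefoot
# cutoff is the one quoted (truncated) in the record above.
import numpy as np

def thm_ms(heel_acc, meta_acc, fs_hz=1000.0):
    """Signed time (ms) from heel peak to metatarsal peak within one step."""
    t_heel = np.argmax(heel_acc) / fs_hz
    t_meta = np.argmax(meta_acc) / fs_hz
    return (t_meta - t_heel) * 1000.0

def classify(thm, rear_cutoff_ms=15.0, fore_cutoff_ms=-5.49):
    if thm > rear_cutoff_ms:
        return "rearfoot"
    if thm < fore_cutoff_ms:
        return "forefoot"
    return "midfoot"
```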

  2. Non-destructive Phenotyping to Identify Brachiaria Hybrids Tolerant to Waterlogging Stress under Field Conditions.

    Science.gov (United States)

    Jiménez, Juan de la Cruz; Cardoso, Juan A; Leiva, Luisa F; Gil, Juanita; Forero, Manuel G; Worthington, Margaret L; Miles, John W; Rao, Idupulapati M

    2017-01-01

    Brachiaria grasses are sown in tropical regions around the world, especially in the Neotropics, to improve livestock production. Waterlogging is a major constraint to the productivity and persistence of Brachiaria grasses during the rainy season. While some Brachiaria cultivars are moderately tolerant to seasonal waterlogging, none of the commercial cultivars combines superior yield potential and nutritional quality with a high level of waterlogging tolerance. The Brachiaria breeding program at the International Center for Tropical Agriculture has been using recurrent selection for the past two decades to combine forage yield with resistance to biotic and abiotic stress factors. The main objective of this study was to test the suitability of the normalized difference vegetation index (NDVI) and image-based phenotyping as non-destructive approaches to identify Brachiaria hybrids tolerant to waterlogging stress under field conditions. Nineteen promising hybrid selections from the breeding program and three commercial checks were evaluated for their tolerance to waterlogging under field conditions. The waterlogging treatment was imposed by applying and maintaining water to 3 cm above the soil surface. Plant performance was determined non-destructively using proximal sensing and image-based phenotyping, and also destructively via harvesting for comparison. Image analysis of projected green and dead areas, NDVI and shoot biomass were positively correlated (r ≥ 0.8). Our results indicate that image analysis and NDVI can serve as non-destructive screening approaches for the identification of Brachiaria hybrids tolerant to waterlogging stress.

  3. Non-destructive Phenotyping to Identify Brachiaria Hybrids Tolerant to Waterlogging Stress under Field Conditions

    Science.gov (United States)

    Jiménez, Juan de la Cruz; Cardoso, Juan A.; Leiva, Luisa F.; Gil, Juanita; Forero, Manuel G.; Worthington, Margaret L.; Miles, John W.; Rao, Idupulapati M.

    2017-01-01

    Brachiaria grasses are sown in tropical regions around the world, especially in the Neotropics, to improve livestock production. Waterlogging is a major constraint to the productivity and persistence of Brachiaria grasses during the rainy season. While some Brachiaria cultivars are moderately tolerant to seasonal waterlogging, none of the commercial cultivars combines superior yield potential and nutritional quality with a high level of waterlogging tolerance. The Brachiaria breeding program at the International Center for Tropical Agriculture has been using recurrent selection for the past two decades to combine forage yield with resistance to biotic and abiotic stress factors. The main objective of this study was to test the suitability of the normalized difference vegetation index (NDVI) and image-based phenotyping as non-destructive approaches to identify Brachiaria hybrids tolerant to waterlogging stress under field conditions. Nineteen promising hybrid selections from the breeding program and three commercial checks were evaluated for their tolerance to waterlogging under field conditions. The waterlogging treatment was imposed by applying and maintaining water at 3 cm above the soil surface. Plant performance was determined non-destructively using proximal sensing and image-based phenotyping, and also destructively via harvesting for comparison. Image analysis of projected green and dead areas, NDVI and shoot biomass were positively correlated (r ≥ 0.8). Our results indicate that image analysis and NDVI can serve as non-destructive screening approaches for the identification of Brachiaria hybrids tolerant to waterlogging stress. PMID:28243249

  4. Automatic de-identification of electronic medical records using token-level and character-level conditional random fields.

    Science.gov (United States)

    Liu, Zengjian; Chen, Yangxin; Tang, Buzhou; Wang, Xiaolong; Chen, Qingcai; Li, Haodi; Wang, Jingfeng; Deng, Qiwen; Zhu, Suisong

    2015-12-01

    De-identification, i.e., identifying and removing all protected health information (PHI) present in clinical data including electronic medical records (EMRs), is a critical step in making clinical data publicly available. The 2014 i2b2 (Informatics for Integrating Biology and the Bedside) clinical natural language processing (NLP) challenge set up a track for de-identification (track 1). In this study, we propose a hybrid system based on both machine learning and rule-based approaches for the de-identification track. In our system, PHI instances are first identified by two conditional random fields (CRFs), one token-level and one character-level, and a rule-based classifier, and are then merged by a set of rules. Experiments conducted on the i2b2 corpus show that our system achieves the highest micro F-scores of 94.64%, 91.24% and 91.63% under the "token", "strict" and "relaxed" criteria respectively, placing it among the top-ranked systems of the 2014 i2b2 challenge. After integrating some refined localization dictionaries, our system is further improved, with F-scores of 94.83%, 91.57% and 91.95% under the "token", "strict" and "relaxed" criteria respectively.
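
    A minimal sketch of the token-level CRF component in the spirit of the system described, using the sklearn-crfsuite package; the feature set, hyperparameters and BIO label names are illustrative assumptions, not the authors' configuration:

      import sklearn_crfsuite

      def token_features(sent, i):
          w = sent[i]
          return {
              "lower": w.lower(),
              "is_digit": w.isdigit(),
              "is_title": w.istitle(),
              "suffix3": w[-3:],
              "prev": sent[i - 1].lower() if i > 0 else "<BOS>",
              "next": sent[i + 1].lower() if i < len(sent) - 1 else "<EOS>",
          }

      # X: sentences as sequences of feature dicts; y: PHI tags in BIO form,
      # e.g. "B-DATE", "I-NAME", "O" (label inventory assumed here).
      def fit_crf(X, y):
          crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1,
                                     max_iterations=100)
          crf.fit(X, y)
          return crf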

  5. Three-dimensional modeling of a thermal dendrite using the phase field method with automatic anisotropic and unstructured adaptive finite element meshing

    Science.gov (United States)

    Sarkis, C.; Silva, L.; Gandin, Ch-A.; Plapp, M.

    2016-03-01

    Dendritic growth is computed with automatic adaptation of an anisotropic and unstructured finite element mesh. The energy conservation equation is formulated for solid and liquid phases considering an interface balance that includes the Gibbs-Thomson effect. An equation for a diffuse interface is also developed by considering a phase field function with constant negative value in the liquid and constant positive value in the solid. The unknowns are the phase field function and a dimensionless temperature, as proposed by [1]. Linear finite element interpolation is used for both variables, and discretization stabilization techniques ensure convergence towards a correct non-oscillating solution. In order to perform quantitative computations of dendritic growth on a large domain, two additional numerical ingredients are necessary: automatic anisotropic unstructured adaptive meshing [2,3] and parallel implementation [4], both made available with the numerical platform used (CimLib), based on C++ developments. Mesh adaptation is found to greatly reduce the number of degrees of freedom. Results of phase field simulations for dendritic solidification of a pure material in two and three dimensions are shown and compared with the reference work [1]. Algorithm details and CPU times are also discussed.
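
    The abstract's sign convention (phase field negative in the liquid, positive in the solid) matches the widely used quantitative thermal phase-field formulations; as a hedged sketch of what such a model looks like (one common dimensionless form, of Karma-Rappel type, not necessarily the exact equations of [1]):

      \tau_0 \frac{\partial \phi}{\partial t} = W^2 \nabla^2 \phi + \phi - \phi^3 - \lambda\, u\, (1 - \phi^2)^2,
      \qquad
      \frac{\partial u}{\partial t} = D \nabla^2 u + \frac{1}{2} \frac{\partial \phi}{\partial t},

    where τ0, W and λ set the interface kinetics, width and coupling strength, and D is the thermal diffusivity; anisotropy enters by making W and τ0 orientation-dependent, which is what selects the dendrite's preferred growth directions.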

  6. LanHEP—a package for the automatic generation of Feynman rules in field theory. Version 3.0

    Science.gov (United States)

    Semenov, A. V.

    2009-03-01

    The LanHEP program version 3.0 for Feynman rules generation from the Lagrangian is described. It reads the Lagrangian written in a compact form, close to the one used in publications. This means that Lagrangian terms can be written with summation over indices of broken symmetries and using special symbols for complicated expressions, such as the covariant derivative and the strength tensor for gauge fields. Supersymmetric theories can be described using the superpotential formalism and the 2-component fermion notation. The output is Feynman rules in terms of physical fields and independent parameters in the form of CompHEP model files, which allows one to start calculations of processes in the new physical model. Alternatively, Feynman rules can be generated in FeynArts format or as a LaTeX table. One-loop counterterms can also be generated in FeynArts format.

    Program summary
    Program title: LanHEP
    Catalogue identifier: ADZV_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECH_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 83 041
    No. of bytes in distributed program, including test data, etc.: 1 090 931
    Distribution format: tar.gz
    Programming language: C
    Computer: PC
    Operating system: Linux
    RAM: 2 MB (SM), 12 MB (MSSM), 120 MB (MSSM with counterterms)
    Classification: 4.4
    Nature of problem: Deriving Feynman rules from the Lagrangian
    Solution method: The program reads the Lagrangian written in a compact form, close to the one used in publications, with summation over indices of broken symmetries and special symbols for complicated expressions such as the covariant derivative and the strength tensor for gauge fields. Tools for checking the correctness of the model, and for simplifying the output expressions, are provided. The output is

  7. Comparative Study between Sequential Automatic and Manual Home Respiratory Polygraphy Scoring Using a Three-Channel Device: Impact of the Manual Editing of Events to Identify Severe Obstructive Sleep Apnea

    Directory of Open Access Journals (Sweden)

    Glenda Ernst

    2015-01-01

    Full Text Available Objective. According to current guidelines, automatic scoring of respiratory events in respiratory polygraphy requires subsequent manual editing. The aim of this study was to evaluate the agreement between automatic analysis and manual scoring to identify patients with suspected OSA. Methods. This retrospective study analyzed 791 records from respiratory polygraphy (RP) performed at home. The grade of association between automatic and manual scoring was evaluated using the Kappa coefficient, and agreement was assessed using the Bland-Altman test and the intraclass correlation coefficient (CCI). ROC curve analysis was used to determine accuracy in the identification of AHI≥30 events/h. Results. The population analyzed consisted of 493 male (62.3%) and 298 female patients, with an average age of 54.7±14.20 years and BMI of 32.7±8.21 kg/m2. There was no significant difference between the automatic and manual apnea/hypopnea indexes (aAHI 17.25 (SD: 17.42) versus mAHI 21.20±7.96; p: NS). The agreement between mAHI and aAHI for AHI≥30 was 94%, with a Kappa coefficient of 0.83 (p<0.001) and a CCI of 0.83. The AUC-ROC, sensitivity, and specificity were 0.99 (95% CI: 0.98-0.99, p<0.001), 86% (95% CI: 78.7-91.4), and 97% (95% CI: 96-98.3), respectively. Conclusions. We observed good agreement between automatic scoring and sequential manual scoring to identify subjects with AHI≥30 events/h.
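
    The agreement statistics reported (Bland-Altman limits, Kappa on the AHI≥30 decision) can be sketched as follows; the array names and the 1.96·SD limits convention are assumptions consistent with standard practice, not the study's code:

      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      def bland_altman(a, m):
          """Bias and 95% limits of agreement between automatic (a) and manual (m) AHI."""
          a, m = np.asarray(a, float), np.asarray(m, float)
          diff = a - m
          bias, sd = diff.mean(), diff.std(ddof=1)
          return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

      def severe_kappa(a, m, cutoff=30.0):
          """Kappa for agreement on the severe-OSA decision (AHI >= 30 events/h)."""
          return cohen_kappa_score(np.asarray(a) >= cutoff, np.asarray(m) >= cutoff)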

  8. Bottom-up communication. Identifying opportunities and limitations through an exploratory field-based evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.; Irvine, K.N. [Institute of Energy and Sustainable Development, De Montfort University, Leicester, LE1 9BH (United Kingdom)

    2013-02-15

    Communication to promote behaviours like energy saving can use significant resources. What is less clear is the comparative value of the different approaches available to communicators. While it is generally agreed that 'bottom-up' approaches, where individuals are actively involved rather than passive, are preferable to 'top-down' authority-led projects, there is a dearth of evidence verifying why this should be. Additionally, while the literature has examined the mechanics of the different approaches, less attention has been paid to the associated psychological implications. This paper reports on an exploratory comparative study that examined the effects of six distinct communication activities. The activities used different communication approaches, some participative and others more top-down informational. Two theories, from behavioural studies and communication, were used to identify key variables for consideration in this field-based evaluation. The evaluation aimed to assess not just which activity might be most successful, as this has limited generalisability, but also to gain insight into which psychological impacts might contribute to success. Analysis found support for the general hypothesis that bottom-up approaches have more impact on behaviour change than top-down ones. The study also identified that, in this instance, the difference in reported behaviour across the activities related partly to the extent to which intentions to change behaviour were implemented. One possible explanation for the difference in reported behaviour change across the activities is that a bottom-up approach may offer a supportive environment where participants can discuss progress with like-minded individuals. A further possible explanation is that, despite controlling for intention at an individual level, the pre-existence of strong intentions may have an effect on group success. These suggestive findings point toward the critical need for additional and larger-scale studies.

  9. Identifying areas of the visual field important for quality of life in patients with glaucoma.

    Directory of Open Access Journals (Sweden)

    Hiroshi Murata

    Full Text Available The purpose of this study was to create a vision-related quality of life (VRQoL) prediction system to identify visual field (VF) test points associated with decreased VRQoL in patients with glaucoma. VRQoL score was surveyed in 164 patients with glaucoma using the 'Sumi questionnaire'. A binocular VF was created from monocular VFs by using the integrated VF (IVF) method. VRQoL score was predicted using the 'Random Forest' method, based on visual acuity (VA) of better and worse eyes (better-eye and worse-eye VA) and total deviation (TD) values from the IVF. For comparison, VRQoL scores were regressed (linear regression) against: (i) mean of TD (IVF MD); (ii) better-eye VA; (iii) worse-eye VA; and (iv) IVF MD and better- and worse-eye VAs. The rank of importance of IVF test points was identified using the Random Forest method. The root mean squared prediction error associated with the Random Forest method (0.30 to 1.97) was significantly smaller than those of the linear regression models (0.34 to 3.38, p<0.05, ten-fold cross-validation test). Worse-eye VA was the most important variable in all VRQoL tasks. In general, important VF test points were concentrated along the horizontal meridian. Particular areas of the IVF were important for different tasks: peripheral superior and inferior areas in the left hemifield for the 'letters and sentences' task; peripheral, mid-peripheral and para-central inferior regions for the 'walking' task; the peripheral superior region for the 'going out' task; and a broad scattered area across the IVF for the 'dining' task. The VRQoL prediction model with the Random Forest method enables clinicians to better understand patients' VRQoL based on standard clinical measurements of VA and VF.
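
    A minimal sketch of the Random Forest prediction-and-ranking step as described; the feature layout, hyperparameters and scoring choices are assumptions, not the study's exact setup:

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import cross_val_score

      # X: per-patient features [better-eye VA, worse-eye VA, IVF TD values...];
      # y: VRQoL task score from the questionnaire.
      def evaluate(X, y):
          rf = RandomForestRegressor(n_estimators=500, random_state=0)
          # Ten-fold cross-validated RMSE, mirroring the abstract's comparison
          # against the linear regression models.
          mse = -cross_val_score(rf, X, y, cv=10, scoring="neg_mean_squared_error")
          rf.fit(X, y)
          ranking = np.argsort(rf.feature_importances_)[::-1]  # most important first
          return np.sqrt(mse).mean(), ranking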

  10. Idiopathic environmental intolerance attributed to electromagnetic fields (IEI-EMF: A systematic review of identifying criteria

    Directory of Open Access Journals (Sweden)

    Baliatsas Christos

    2012-08-01

    Full Text Available Abstract Background Idiopathic environmental intolerance attributed to electromagnetic fields (IEI-EMF) remains a complex and unclear phenomenon, often characterized by the report of various, non-specific physical symptoms (NSPS) when an EMF source is present or perceived by the individual. The lack of validated criteria for defining and assessing IEI-EMF affects the quality of the relevant research, hindering not only the comparison or integration of study findings, but also the identification and management of patients by health care providers. The objective of this review was to evaluate and summarize the criteria that previous studies employed to identify IEI-EMF participants. Methods An extensive literature search was performed for studies published up to June 2011. We searched EMBASE, Medline, Psychinfo, Scopus and Web of Science. Additionally, citation analyses were performed for key papers, reference sections of relevant papers were searched, conference proceedings were examined and a literature database held by the Mobile Phones Research Unit of King's College London was reviewed. Results Sixty-three studies were included. "Hypersensitivity to EMF" was the most frequently used descriptive term. Despite heterogeneity, the criteria predominantly used to identify IEI-EMF individuals were: 1. Self-report of being (hyper)sensitive to EMF. 2. Attribution of NSPS to at least one EMF source. 3. Absence of a medical or psychiatric/psychological disorder capable of accounting for these symptoms. 4. Symptoms should occur soon (up to 24 hours) after the individual perceives an exposure source or exposed area. (Hyper)sensitivity to EMF was either generalized (attribution to various EMF sources) or source-specific. Experimental studies used a larger number of criteria than those of observational design and more frequently performed a medical examination or interview as a prerequisite for inclusion. Conclusions Considerable heterogeneity exists in the

  11. Methodological Comparison between a Novel Automatic Sampling System for Gas Chromatography versus Photoacoustic Spectroscopy for Measuring Greenhouse Gas Emissions under Field Conditions

    Science.gov (United States)

    Schmithausen, Alexander J.; Trimborn, Manfred; Büscher, Wolfgang

    2016-01-01

    Trace gases such as nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are climate-related gases, and their emissions from agricultural livestock barns are not negligible. Conventional measurement systems in the field (Fourier transform infrared spectroscopy (FTIR); photoacoustic system (PAS)) are not sufficiently sensitive to N2O. Laser-based measurement systems are highly accurate, but they are very expensive to purchase and maintain. One cost-effective alternative is gas chromatography (GC) with electron capture detection (ECD), but this is not suitable for field applications due to its radiation source. Measuring samples collected automatically under field conditions in the laboratory at a subsequent time presents many challenges. This study presents a sampling system designed to enable laboratory analysis of N2O concentrations sampled under field conditions. Analyses were carried out using PAS in the field (online system) and GC in the laboratory (offline system). Both measurement systems showed a good correlation for CH4 and CO2 concentrations. Measured N2O concentrations were near the detection limit for PAS. GC achieved more reliable results for N2O in very low concentration ranges. PMID:27706101

  12. Methodological Comparison between a Novel Automatic Sampling System for Gas Chromatography versus Photoacoustic Spectroscopy for Measuring Greenhouse Gas Emissions under Field Conditions

    Directory of Open Access Journals (Sweden)

    Alexander J. Schmithausen

    2016-10-01

    Full Text Available Trace gases such as nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are climate-related gases, and their emissions from agricultural livestock barns are not negligible. Conventional measurement systems in the field (Fourier transform infrared spectroscopy (FTIR); photoacoustic system (PAS)) are not sufficiently sensitive to N2O. Laser-based measurement systems are highly accurate, but they are very expensive to purchase and maintain. One cost-effective alternative is gas chromatography (GC) with electron capture detection (ECD), but this is not suitable for field applications due to its radiation source. Measuring samples collected automatically under field conditions in the laboratory at a subsequent time presents many challenges. This study presents a sampling system designed to enable laboratory analysis of N2O concentrations sampled under field conditions. Analyses were carried out using PAS in the field (online system) and GC in the laboratory (offline system). Both measurement systems showed a good correlation for CH4 and CO2 concentrations. Measured N2O concentrations were near the detection limit for PAS. GC achieved more reliable results for N2O in very low concentration ranges.

  13. Identifying fecal matter contamination in produce fields using multispectral reflectance imaging under ambient solar illumination

    Science.gov (United States)

    An imaging device to detect fecal contamination in fresh produce fields could allow the producer to avoid harvesting fecal-contaminated produce. E. coli O157:H7 outbreaks have been associated with fecal-contaminated leafy greens. In this study, in-field spectral profiles of bovine fecal matter, soil,...

  14. Novel Method of CNC Automatic Programming for Machining Field

    Institute of Scientific and Technical Information of China (English)

    姚壮; 马跃; 张富彦

    2011-01-01

    The rapid development of CNC technology places higher demands on the efficiency and convenience of NC programming. To meet the needs of shop-floor programming by engineering technicians, this paper presents an automatic programming method suitable for use at the machining site. Based on the contour geometry of the workpiece to be machined, the method generates NC machining programs in two ways: automatic programming from DXF files, and automatic programming with direct input of graphic dimensions. Experiments show that the method makes CNC programming more flexible and efficient.
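
    As a hedged illustration of the DXF-based programming mode (the paper's own toolchain is not specified), a sketch that reads LINE entities from a DXF contour with the ezdxf package and emits straight-line G-code moves:

      import ezdxf  # assumed DXF-parsing library, not the paper's own tool

      def contour_to_gcode(dxf_path, feed=200):
          """Emit straight-line G-code moves for LINE entities in a DXF contour.
          A minimal sketch: arcs, tool compensation and safety moves are omitted."""
          doc = ezdxf.readfile(dxf_path)
          msp = doc.modelspace()
          lines = ["G21 G90"]  # metric units, absolute coordinates
          for e in msp.query("LINE"):
              sx, sy, _ = e.dxf.start
              ex, ey, _ = e.dxf.end
              lines.append(f"G00 X{sx:.3f} Y{sy:.3f}")
              lines.append(f"G01 X{ex:.3f} Y{ey:.3f} F{feed}")
          lines.append("M30")
          return "\n".join(lines)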

  15. DEFINING THE MAGNETIC FIELD FOR THE ELEMENTS OF AIR MOTORS AND DEVICES FOR AUTOMATIC TAKE-OFF OF THE MILKING MACHINE

    Directory of Open Access Journals (Sweden)

    Koledov R. V.

    2015-04-01

    Full Text Available Efficient dairy cattle production depends largely on the technology used to house and service the animals. Most farms use a tethered housing system, and with this milking method new techniques are required to increase farm productivity and reduce the cost of dairy products. The most rational approach is to improve the design of existing milking machines. We consider a device for the automatic removal of the suspended part of a milking machine, together with its structure and working principle. The main unit of the device is a pneumatic motor that operates from the vacuum system of the milking plant. Magnetic elements are mounted on the curved vanes of the rotor, which sits on the shaft of the pneumatic motor inside the housing. These elements must have suitable geometrical and force parameters to ensure normal operation of the device for automatic removal of the suspended part of the milking machine. The article describes laboratory research and an analysis of the magnetic elements. The laboratory studies revealed how the traction capacity of the pneumatic motor depends on the linear and force parameters of the magnetic elements, and established the geometric parameters and magnetic field strength required for normal operation.

  16. Monitoring a Wide Manufacture Field Automatically by Multiple Sensors

    Institute of Scientific and Technical Information of China (English)

    吕健; 滨岛京子; 姜伟

    2006-01-01

    This research develops a safety measure for human-machine cooperative systems in which the machine region and the human region cannot be separated, owing to overlap and to movement of both humans and machines. Our proposal is to automatically monitor moving objects by image sensing and recognition, so that the machine system obtains sufficient information about the state of the environment and the production progress at any time, and the machines can accordingly take corresponding actions automatically to avoid hazards. For this purpose, two types of monitoring systems are proposed: the first is based on an omnidirectional vision sensor, and the second on a stereo vision sensor. Each type may be used alone or together with the other, depending on the requirements of the safety system and the specific situation of the manufacturing field to be monitored. This paper describes both types and, for the application of these image sensors to safety control, proposes the construction of a hierarchical safety system.

  17. Failure Analysis to Identify Thermal Runaway of Bypass Diodes in Fielded Modules

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, Chuanxiao; Uchida, Yasunori; Johnston, Steve; Hacke, Peter; Wohlgemuth, John; Al-Jassim, Mowafak

    2017-03-14

    We studied a bypass diode recovered from fielded modules in a rooftop installation to determine the failure mechanism. The field-failed diode showed characteristics similar to thermal runaway, specifically X-ray tomography evidence of migrated metal. We also observed burn marks on the silicon surface like those on diodes lab-stressed into thermal runaway. The reaction products are more soluble than silicon, and the surface is oxygen-rich.

  18. Identifying erosive periods by using RUSLE factors in mountain fields of the Central Spanish Pyrenees

    Directory of Open Access Journals (Sweden)

    M. López-Vicente

    2008-03-01

    Full Text Available The Mediterranean environment is characterized by strong temporal variations in rainfall volume and intensity, soil moisture and vegetation cover over the year. These factors play a key role in soil erosion. The aim of this work is to identify different erosive periods as a function of the temporal changes in rainfall and runoff characteristics (erosivity, maximum intensity and number of erosive events), soil properties (soil erodibility in relation to freeze-thaw processes and soil moisture content) and current tillage practices in a set of agricultural fields in a mountainous area of the Central Pyrenees in NE Spain. To this purpose, the rainfall and runoff erosivity (R), soil erodibility (K) and cover-management (C) factors of the empirical RUSLE soil loss model were used. The R, K and C factors were calculated at a monthly scale. The first erosive period extends from July to October and presents the highest values of erosivity (87.8 MJ mm ha−1 h−1), maximum rainfall intensity (22.3 mm h−1) and monthly soil erosion (0.25 Mg ha−1 month−1), with the minimum values of duration of erosive storms, freeze-thaw cycles, soil moisture content and soil erodibility (0.007 Mg h MJ−1 mm−1). This period includes the harvesting and plowing tillage practices. The second erosive period lasts two months, from May to June, and presents the lowest total and monthly soil losses (0.10 Mg ha−1 month−1), which correspond to the maximum protection of the soil by the crop cover (C factor = 0.05) due to the maximum stage of the growing season, and intermediate values of rainfall and runoff erosivity, maximum rainfall intensity and soil erodibility. The third erosive period extends from November to April and has the minimum values of rainfall erosivity (17.5 MJ mm ha−1 h−1) and
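
    For context, the RUSLE model whose R, K and C factors are computed here at a monthly scale estimates soil loss as a product of factors; in its standard form:

      A = R \cdot K \cdot LS \cdot C \cdot P

    where A is the soil loss (Mg ha−1), R the rainfall-runoff erosivity (MJ mm ha−1 h−1), K the soil erodibility (Mg h MJ−1 mm−1), LS the topographic (slope length and steepness) factor, C the cover-management factor and P the support-practice factor; the LS and P factors are not discussed in this abstract.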

  19. Identifying erosive periods by using RUSLE factors in mountain fields of the Central Spanish Pyrenees

    Directory of Open Access Journals (Sweden)

    M. López-Vicente

    2007-07-01

    Full Text Available The Mediterranean environment is characterized by strong temporal variations in rainfall volume and intensity, soil moisture and vegetation cover over the year. These factors play a key role in soil erosion. The aim of this work is to identify different erosive periods as a function of the temporal changes in rainfall and runoff characteristics (erosivity, maximum intensity and number of erosive events), soil properties (soil erodibility in relation to freeze-thaw processes and soil moisture content) and current tillage practices in a set of agricultural fields in a mountainous area of the Central Pyrenees in NE Spain. To this purpose, the rainfall and runoff erosivity (R), soil erodibility (K) and cover-management (C) factors of the empirical RUSLE soil loss model were used. The R, K and C factors were calculated at a monthly scale. The first erosive period extends from July to October and presents the highest values of erosivity (87.8 MJ mm ha−1 h−1), maximum rainfall intensity (22.3 mm h−1) and monthly soil erosion (0.10 Mg ha−1 month−1), with the minimum values of duration of erosive storms, freeze-thaw cycles, soil moisture content and soil erodibility (0.007 Mg h MJ−1 mm−1). This period includes the harvesting and plowing tillage practices. The second erosive period lasts two months, from May to June, and presents the lowest total and monthly soil losses (0.04 Mg ha−1 month−1), which correspond to the maximum protection of the soil by the crop cover (C factor = 0.05) due to the maximum stage of the growing season, and intermediate values of rainfall and runoff erosivity, maximum rainfall intensity and soil erodibility. The third erosive period extends from November to April and has the minimum values of rainfall erosivity (17.5 MJ mm ha−1 h−1) and maximum rainfall intensity (6.0 mm h−1

  20. Methods for automatized detection of rapid changes in lateral boundary condition fields for NWP limited area models

    Science.gov (United States)

    Tudor, M.

    2015-08-01

    Three-hourly temporal resolution of lateral boundary data for limited area models (LAMs) can be too infrequent to resolve rapidly moving storms. This problem is expected to be worse with increasing horizontal resolution. In order to detect intensive disturbances in surface pressure moving rapidly through the model domain, a filtered surface pressure field (MCUF) is computed operationally in the ARPEGE global model of Météo France. The field is distributed in the coupling files along with conventional meteorological fields used for lateral boundary conditions (LBCs) for the operational forecast using limited area model ALADIN (Aire Limitée Adaptation dynamique Développement InterNational) in the Meteorological and Hydrological Service of Croatia (DHMZ). Here an analysis is performed of the MCUF field for the LACE coupling domain for the period from 23 January 2006, when it became available, until 15 November 2014. The MCUF field is a good indicator of rapidly moving pressure disturbances (RMPDs). Its spatial and temporal distribution can be associated with the usual cyclone tracks and areas known to be supporting cyclogenesis. An alternative set of coupling files from the IFS operational run in the European Centre for Medium-Range Weather Forecasts (ECMWF) is also available operationally in DHMZ with 3-hourly temporal resolution, but the MCUF field is not available. Here, several methods are tested that detect RMPDs in surface pressure a posteriori from the IFS model fields provided in the coupling files. MCUF is computed by running ALADIN on the coupling files from IFS. The error function is computed using one-time-step integration of ALADIN on the coupling files without initialization, initialized with digital filter initialization (DFI) or scale-selective DFI (SSDFI). Finally, the amplitude of changes in the mean sea level pressure is computed from the fields in the coupling files. The results are compared to the MCUF field of ARPEGE and the results of same
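
    A minimal sketch of the last of these methods, flagging coupling intervals whose mean-sea-level-pressure change is anomalously large; the field handling and the threshold value are assumptions, not the paper's settings:

      import numpy as np

      def mslp_change_amplitude(mslp_t0, mslp_t1):
          """Max absolute 3-hourly MSLP change (hPa) over the coupling domain."""
          return np.max(np.abs(np.asarray(mslp_t1) - np.asarray(mslp_t0)))

      def flag_rmpd(fields, threshold_hpa=8.0):
          """Flag coupling intervals whose pressure change exceeds a threshold.
          `fields` is a time-ordered sequence of 2-D MSLP arrays; the
          8 hPa / 3 h cutoff is an illustrative assumption."""
          return [i for i in range(len(fields) - 1)
                  if mslp_change_amplitude(fields[i], fields[i + 1]) > threshold_hpa]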

  1. Semi-automatic measurement of visual verticality perception in humans reveals a new category of visual field dependency.

    Science.gov (United States)

    Kaleff, C R; Aschidamini, C; Baron, J; Di Leone, C N; Leone, C N; Canavarro, S; Vargas, C D

    2011-08-01

    Previous assessment of verticality by means of rod and rod and frame tests indicated that human subjects can be more (field dependent) or less (field independent) influenced by a frame placed around a tilted rod. In the present study we propose a new approach to these tests. The judgment of visual verticality (rod test) was evaluated in 50 young subjects (28 males, ranging in age from 20 to 27 years) by randomly projecting a luminous rod tilted between -18 and +18° (negative values indicating left tilts) onto a tangent screen. In the rod and frame test the rod was displayed within a luminous fixed frame tilted at +18 or -18°. Subjects were instructed to verbally indicate the rod's inclination direction (forced choice). Visual dependency was estimated by means of a Visual Index calculated from rod and rod and frame test values. Based on this index, volunteers were classified as field dependent, intermediate and field independent. A fourth category was created within the field-independent subjects for whom the amount of correct guesses in the rod and frame test exceeded that of the rod test, thus indicating improved performance when a surrounding frame was present. In conclusion, the combined use of subjective visual vertical and the rod and frame test provides a specific and reliable form of evaluation of verticality in healthy subjects and might be of use to probe changes in brain function after central or peripheral lesions.

  2. Semi-automatic measurement of visual verticality perception in humans reveals a new category of visual field dependency

    Directory of Open Access Journals (Sweden)

    C.R. Kaleff

    2011-08-01

    Full Text Available Previous assessment of verticality by means of rod and rod and frame tests indicated that human subjects can be more (field dependent) or less (field independent) influenced by a frame placed around a tilted rod. In the present study we propose a new approach to these tests. The judgment of visual verticality (rod test) was evaluated in 50 young subjects (28 males, ranging in age from 20 to 27 years) by randomly projecting a luminous rod tilted between -18 and +18° (negative values indicating left tilts) onto a tangent screen. In the rod and frame test the rod was displayed within a luminous fixed frame tilted at +18 or -18°. Subjects were instructed to verbally indicate the rod's inclination direction (forced choice). Visual dependency was estimated by means of a Visual Index calculated from rod and rod and frame test values. Based on this index, volunteers were classified as field dependent, intermediate and field independent. A fourth category was created within the field-independent subjects for whom the amount of correct guesses in the rod and frame test exceeded that of the rod test, thus indicating improved performance when a surrounding frame was present. In conclusion, the combined use of subjective visual vertical and the rod and frame test provides a specific and reliable form of evaluation of verticality in healthy subjects and might be of use to probe changes in brain function after central or peripheral lesions.

  3. Brut: Automatic bubble classifier

    Science.gov (United States)

    Beaumont, Christopher; Goodman, Alyssa; Williams, Jonathan; Kendrew, Sarah; Simpson, Robert

    2014-07-01

    Brut, written in Python, identifies bubbles in infrared images of the Galactic midplane; it uses a database of known bubbles from the Milky Way Project and Spitzer images to build an automatic bubble classifier. The classifier is based on the Random Forest algorithm, and uses the WiseRF implementation of this algorithm.

  4. Identifying Student Competencies in Macro Practice: Articulating the Practice Wisdom of Field Instructors

    Science.gov (United States)

    Regehr, Cheryl; Bogo, Marion; Donovan, Kirsten; Lim, April; Anstice, Susan

    2012-01-01

    Although a growing literature examines competencies in clinical practice, competencies of students in macro social work practice have received comparatively little attention. A grounded-theory methodology was used to elicit field instructor views of student competencies in community, organization, and policy contexts. Competencies described by…

  5. Comparative evaluation of genetic assays to identify oral pre-cancerous fields

    NARCIS (Netherlands)

    Bremmer, J.F.; Braakhuis, B.J.; Brink, A.; Broeckaert, M.A.; Beliën, J.A.M.; Meijer, G.A.; Kuik, D.J.; Leemans, C.R.; Bloemena, E.; van der Waal, I.; Brakenhoff, R.H.

    2008-01-01

    Background: Oral squamous cell carcinomas often develop in a pre-cancerous field, defined as mucosal epithelium with cancer-related genetic alterations, and which may appear as a clinically visible lesion. The test characteristics of three genetic assays that were developed to detect pre-cancerous f

  6. Idiopathic environmental intolerance attributed to electromagnetic fields (IEI-EMF): A systematic review of identifying criteria

    NARCIS (Netherlands)

    Baliatsas, C.; van Kamp, I.; Lebret, E.; Rubin, J.G.

    2012-01-01

    ABSTRACT: BACKGROUND: Idiopathic environmental intolerance attributed to electromagnetic fields (IEI-EMF) remains a complex and unclear phenomenon, often characterized by the report of various, non-specific physical symptoms (NSPS) when an EMF source is present or perceived by the individual. The la

  7. Latent TGFβ binding protein 3 identifies a second heart field in zebrafish

    Science.gov (United States)

    Zhou, Yong; Cashman, Timothy J.; Nevis, Kathleen R.; Obregon, Pablo; Carney, Sara A.; Liu, Yan; Gu, Aihua; Mosimann, Christian; Sondalle, Samuel; Peterson, Richard E.; Heideman, Warren; Burns, Caroline E.; Burns, C. Geoffrey

    2012-01-01

    The four-chambered mammalian heart develops from two fields of cardiac progenitor cells (CPCs) distinguished by their spatiotemporal patterns of differentiation and contributions to the definitive heart [1–3]. The first heart field differentiates earlier in lateral plate mesoderm, generates the linear heart tube and ultimately gives rise to the left ventricle. The second heart field (SHF) differentiates later in pharyngeal mesoderm, elongates the heart tube, and gives rise to the outflow tract (OFT) and much of the right ventricle. Because hearts in lower vertebrates contain a rudimentary OFT but not a right ventricle [4], the existence and function of SHF-like cells in these species has remained a topic of speculation [4–10]. Here we provide direct evidence from Cre/Lox-mediated lineage tracing and loss of function studies in zebrafish, a lower vertebrate with a single ventricle, that latent-TGFβ binding protein 3 (ltbp3) transcripts mark a field of CPCs with defining characteristics of the anterior SHF in mammals. Specifically, ltbp3+ cells differentiate in pharyngeal mesoderm after formation of the heart tube, elongate the heart tube at the outflow pole, and give rise to three cardiovascular lineages in the OFT and myocardium in the distal ventricle. In addition to expressing Ltbp3, a protein that regulates the bioavailability of TGFβ ligands [11], zebrafish SHF cells co-express nkx2.5, an evolutionarily conserved marker of CPCs in both fields [4]. Embryos devoid of ltbp3 lack the same cardiac structures derived from ltbp3+ cells due to compromised progenitor proliferation. Additionally, small-molecule inhibition of TGFβ signaling phenocopies the ltbp3-morphant phenotype whereas expression of a constitutively active TGFβ type I receptor rescues it. Taken together, our findings uncover a requirement for ltbp3-TGFβ signaling during zebrafish SHF development, a process that serves to enlarge the single ventricular chamber in this species. PMID:21623370

  8. An automatic time domain reflectometry device to measure and store soil water contents for stand-alone field use

    NARCIS (Netherlands)

    Elsen, van den H.G.M.; Kokot, J.; Skierucha, W.; Halbertsma, J.M.

    1995-01-01

    A field set-up was developed to measure soil moisture content on ten different positions using the time domain reflectometry (TDR) technique. The set-up works on a 12 V battery or solar panel system, independent of an external power source, has low power consumption, and compact dimensions. The

  9. An automatic time domain reflectometry device to measure and store soil water contents for stand-alone field use

    NARCIS (Netherlands)

    Elsen, van den H.G.M.; Kokot, J.; Skierucha, W.; Halbertsma, J.M.

    1995-01-01

    A field set-up was developed to measure soil moisture content on ten different positions using the time domain reflectometry (TDR) technique. The set-up works on a 12 V battery or solar panel system, independent of an external power source, has low power consumption, and compact dimensions. The syst

  10. Idiopathic environmental intolerance attributed to electromagnetic fields (IEI-EMF): A systematic review of identifying criteria

    OpenAIRE

    Baliatsas Christos; Van Kamp Irene; Lebret Erik; Rubin G

    2012-01-01

    Abstract Background Idiopathic environmental intolerance attributed to electromagnetic fields (IEI-EMF) remains a complex and unclear phenomenon, often characterized by the report of various, non-specific physical symptoms (NSPS) when an EMF source is present or perceived by the individual. The lack of validated criteria for defining and assessing IEI-EMF affects the quality of the relevant research, hindering not only the comparison or integration of study findings, but also the identificati...

  11. Intracellular Recording, Sensory Field Mapping, and Culturing Identified Neurons in the Leech, Hirudo medicinalis

    OpenAIRE

    Titlow, Josh; Majeed, Zana R.; Nicholls, John G.; Cooper, Robin L.

    2013-01-01

    The freshwater leech, Hirudo medicinalis, is a versatile model organism that has been used to address scientific questions in the fields of neurophysiology, neuroethology, and developmental biology. The goal of this report is to consolidate experimental techniques from the leech system into a single article that will be of use to physiologists with expertise in other nervous system preparations, or to biology students with little or no electrophysiology experience. We demonstrate how to disse...

  12. Ocean-bottom Pressure Signals as Potential Identifiers of Tsunami Earthquakes in the Near Field

    Science.gov (United States)

    Salaree, A.; Okal, E. A.

    2015-12-01

    The real-time detection of "tsunami earthquakes" remains a challenge, especially in the near field. These events are characterized by an anomalously slow seismic rupture, with their true long-period seismic moment, and hence, tsunami potential, deceptively concealed from short-period waves and in particular felt accelerations. In the context of the deployment of long-period ocean-bottom sensors in epicentral areas, we explore simple but robust ways to quantify source parameters which could potentially lead to the real-time identification of tsunami earthquakes in the near field. We use records of 2011 Tohoku aftershocks on the JAMSTEC stations deployed off the coast of Japan in the wake of the mainshock. Because seismic phases are not resolvable at short distances, we simply consider an integrated measurement Ω of the square of pressure variations, sharing the philosophy of Boatwright and Choy's (1986) seismic energy, and compare this parameter, scaled to seismic moment, with other discriminants, such as Newman and Okal's (1998) energy-to-moment ratio, Θ, Okal et al.'s (2002) T-wave parameter Γ, or Okal's (2013) parameter Φ combining (in the far field) body-wave duration and energy. We also consider the duration of the pressure signal, and examine its relation to Ω.
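
    A sketch of the integrated measurement described, under the assumption that Ω is the time integral of squared (demeaned) pressure over the record window; the exact windowing and scaling are not given in the abstract:

      import numpy as np

      def omega(pressure, dt):
          """Integrated square of ocean-bottom pressure variations over a record.
          `pressure` in Pa, `dt` sampling interval in s; demeaning isolates the
          transient signal (an assumption about the preprocessing)."""
          p = np.asarray(pressure, float)
          p = p - p.mean()
          return np.trapz(p ** 2, dx=dt)  # units: Pa^2 * s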

  13. Identifying fecal matter contamination in produce fields using multispectral reflectance imaging under ambient solar illumination

    Science.gov (United States)

    Everard, Colm D.; Kim, Moon S.; Lee, Hoonsoo; O'Donnell, Colm P.

    2016-05-01

    An imaging device to detect fecal contamination in fresh produce fields could allow the producer to avoid harvesting fecal-contaminated produce. E. coli O157:H7 outbreaks have been associated with fecal-contaminated leafy greens. In this study, in-field spectral profiles of bovine fecal matter, soil, and spinach leaves are compared. A common-aperture imager designed with two identical monochromatic cameras, a beam splitter, and optical filters was used to simultaneously capture two spectral images of leaves contaminated with both fecal matter and soil. The optical filters were 10 nm full-width-at-half-maximum bandpass filters, one at 690 nm and the second at 710 nm, mounted in front of the object lenses. New images were created using the ratio of the two spectral images on a pixel-by-pixel basis. Image analysis showed that fecal matter contamination could be distinguished from soil and leaf in the ratio images. This technology has the potential to allow detection of fecal contamination in produce fields, which can be a source of foodborne illnesses, with the added benefit of mitigating cross-contamination during harvesting and processing.
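
    A minimal sketch of the pixel-by-pixel band-ratio step; the 690/710 nm band choice follows the abstract, while the decision threshold is an illustrative assumption:

      import numpy as np

      def ratio_image(img_690, img_710, eps=1e-6):
          """Pixel-wise two-band ratio used to separate fecal matter from soil
          and leaf; epsilon guards against division by zero."""
          return np.asarray(img_710, float) / (np.asarray(img_690, float) + eps)

      def fecal_mask(r, threshold=1.5):
          # Illustrative threshold only; in practice it would be calibrated
          # against in-field spectral profiles of feces, soil and spinach.
          return r > threshold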

  14. Comparative Mapping of Soil Physical-Chemical and Structural Parameters at Field Scale to Identify Zones of Enhanced Leaching Risk

    DEFF Research Database (Denmark)

    Nørgaard, Trine; Møldrup, Per; Olsen, Preben

    2013-01-01

    characteristics including soil texture, bulk density, dissolved tracer, particle and phosphorus transport parameters identified the northern one-third of the field as a zone with higher leaching risk. This risk assessment based on parameter mapping from measurements on intact samples was in good agreement...

  15. Topology-based automatic assignment of force field atom types

    Institute of Scientific and Technical Information of China (English)

    陈健; 冯长根

    2002-01-01

    For molecules other than standard amino acid and nucleic acid residues, assigning atom types to the atoms of a molecule, and converting atom types between different force fields, is usually done manually. This paper interprets the definition of force field atom types from the viewpoint of molecular topology. Conventionally, an atom type definition represents a structural fragment in the structure of a molecule; from the topological point of view, each such fragment maps one-to-one onto a topological subset of the molecule of interest, which can be used as the topological definition of the atom type in place of the conventional semantic definition. Based on this topological definition, the authors put forward an approach to the automatic assignment of atom types. The method is easy to program and accommodates user adjustments to the atom types.
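
    A minimal sketch of the idea using the RDKit library, in which each atom type is defined by a topological (substructure) pattern; the patterns and type names are illustrative assumptions, not the paper's definitions:

      from rdkit import Chem

      # Hypothetical atom-type table: (type name, SMARTS pattern). First match
      # wins, so more specific patterns are listed first.
      ATOM_TYPES = [
          ("C.ar", "c"),            # aromatic carbon
          ("C.sp2", "[CX3]=[*]"),   # sp2 carbon
          ("C.sp3", "[CX4]"),       # sp3 carbon
          ("O.h", "[OX2H]"),        # hydroxyl oxygen
      ]

      def assign_atom_types(smiles):
          mol = Chem.MolFromSmiles(smiles)
          types = {}
          for name, smarts in ATOM_TYPES:
              patt = Chem.MolFromSmarts(smarts)
              for match in mol.GetSubstructMatches(patt):
                  types.setdefault(match[0], name)  # type the first matched atom
          return types

      print(assign_atom_types("c1ccccc1CO"))  # benzyl alcohol: c, C.sp3, O.h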

  16. Using earthquake clusters to identify fracture zones at Puna geothermal field, Hawaii

    Science.gov (United States)

    Lucas, A.; Shalev, E.; Malin, P.; Kenedi, C. L.

    2010-12-01

    The actively producing Puna geothermal system (PGS) is located on the Kilauea East Rift Zone (ERZ), which extends out from the active Kilauea volcano on Hawaii. In the Puna area the rift trend is identified as NE-SW from surface expressions of normal faulting with a corresponding strike; at PGS the surface expression offsets in a left step, but no rift-perpendicular faulting is observed. An eight-station borehole seismic network has been installed in the area of the geothermal system. Since June 2006, a total of 6162 earthquakes have been located close to or inside the geothermal system. The spread of earthquake locations follows the rift trend, but down-rift to the NE of PGS almost no earthquakes are observed. Most earthquakes located within the PGS range between 2-3 km depth. Up-rift to the SW of PGS the number of events decreases and the depth range increases to 3-4 km. All initial locations used Hypoinverse71 and showed no trends other than the dominant rift-parallel one. Double-difference relocation of all earthquakes, using both catalog and cross-correlation data, identified one large cluster but could not conclusively identify trends within the cluster. A large number of earthquake waveforms showed identifiable shear wave splitting. For five stations out of the six where shear wave splitting was observed, the dominant polarization direction was rift-parallel. Two of the five stations also showed a smaller rift-perpendicular signal. The sixth station (located close to the area of the rift offset) displayed a N-S polarization, approximately halfway between rift-parallel and rift-perpendicular. The shear wave splitting time delays indicate that fracture density is higher at the PGS compared to the surrounding ERZ. Correlation-coefficient clustering with independent P and S wave windows was used to identify clusters based on similar earthquake waveforms. In total, 40 localized clusters containing ten or more events were identified. The largest cluster was located in the
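
    A minimal sketch of correlation-coefficient clustering of event waveforms; the 0.8 similarity cutoff and single-linkage choice are assumptions, not the study's parameters:

      import numpy as np
      from scipy.cluster.hierarchy import fcluster, linkage

      def xcorr_max(a, b):
          """Peak normalized cross-correlation between two waveforms."""
          a = (a - a.mean()) / (a.std() * len(a))
          b = (b - b.mean()) / b.std()
          return np.abs(np.correlate(a, b, mode="full")).max()

      def cluster_events(waveforms, cc_threshold=0.8):
          """Group events whose waveforms correlate above a threshold."""
          n = len(waveforms)
          dist = np.zeros((n, n))
          for i in range(n):
              for j in range(i + 1, n):
                  dist[i, j] = dist[j, i] = 1.0 - xcorr_max(waveforms[i],
                                                            waveforms[j])
          condensed = dist[np.triu_indices(n, k=1)]
          return fcluster(linkage(condensed, method="single"),
                          t=1.0 - cc_threshold, criterion="distance")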

  17. Automatic generation of force fields and property surfaces for use in variational vibrational calculations of anharmonic vibrational energies and zero-point vibrational averaged properties.

    Science.gov (United States)

    Kongsted, Jacob; Christiansen, Ove

    2006-09-28

    An automatic and general procedure for the calculation of geometrical derivatives of the energy and general property surfaces for molecular systems is developed and implemented. General expressions for an n-mode representation are derived, where the n-mode representation includes only the couplings between n or fewer degrees of freedom. The general expressions are specialized to derivative force fields and property surfaces, and a scheme for the calculation of the numerical derivatives is implemented. The implementation is interfaced to electronic structure programs and may be used for both ground and excited electronic states. The implementation is done in the context of a vibrational structure program and can be used in combination with vibrational self-consistent field (VSCF), vibrational configuration interaction (VCI), vibrational Møller-Plesset, and vibrational coupled cluster calculations of anharmonic wave functions, and for the calculation of vibrationally averaged properties at the VSCF and VCI levels. Sample calculations are presented for fundamental vibrational energies and vibrationally averaged dipole moments and frequency-dependent polarizabilities and hyperpolarizabilities of water and formaldehyde.
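
    The n-mode representation referred to truncates the potential (or property) surface at couplings of at most n vibrational coordinates; schematically, for a potential V of M modes:

      V(q_1,\dots,q_M) \approx \sum_i V_i(q_i) + \sum_{i<j} V_{ij}(q_i,q_j) + \sum_{i<j<k} V_{ijk}(q_i,q_j,q_k) + \cdots

    where each higher-order term contains only the coupling not already captured by the lower-order terms, and the expansion is truncated after the n-body terms.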

  18. Study of Events with Identified Forward Particles at the Split Field Magnet

    CERN Multimedia

    2002-01-01

    This experiment will study two aspects of particle production in the forward region:

    1) In the recent discovery of charm production in hadronic interactions at the Split Field Magnet, the triggering on strange particles at medium pT has proven to be a very effective tool for the study of heavy resonances, especially those carrying new flavours like charm and beauty. We want to carry out a more detailed investigation of the production dynamics of charmed particles, together with a search for beauty mesons and baryons.

    2) A trigger on forward particles at high pT (> 5 GeV/c) provides unique features to determine the properties of the parton-parton subprocesses. We want to study the relative contributions of quark, diquark and gluon scattering.

    This experimental programme will be carried out using the improved Split Field Magnet spectrometer (SFM). The different detection systems provide:

    a) Detection and momentum analysis of charged particles in 4π solid angle. An improved programm...

  19. Application of Chebyshev Formalism to Identify Nonlinear Magnetic Field Components in Beam Transport Systems

    Energy Technology Data Exchange (ETDEWEB)

    Spata, Michael [Old Dominion Univ., Norfolk, VA (United States)

    2012-08-01

    An experiment was conducted at Jefferson Lab's Continuous Electron Beam Accelerator Facility to develop a beam-based technique for characterizing the extent of the nonlinearity of the magnetic fields of a beam transport system. Horizontally and vertically oriented pairs of air-core kicker magnets were simultaneously driven at two different frequencies to provide a time-dependent transverse modulation of the beam orbit relative to the unperturbed reference orbit. Fourier decomposition of the position data at eight different points along the beamline was then used to measure the amplitude of these frequencies. For a purely linear transport system one expects to find solely the frequencies that were applied to the kickers with amplitudes that depend on the phase advance of the lattice. In the presence of nonlinear fields one expects to also find harmonics of the driving frequencies that depend on the order of the nonlinearity. Chebyshev polynomials and their unique properties allow one to directly quantify the magnitude of the nonlinearity with the minimum error. A calibration standard was developed using one of the sextupole magnets in a CEBAF beamline. The technique was then applied to a pair of Arc 1 dipoles and then to the magnets in the Transport Recombiner beamline to measure their multipole content as a function of transverse position within the magnets.
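
    The Chebyshev property underlying the analysis (a sketch consistent with the abstract's description, not its exact estimator) is that a cosine modulation passed through a polynomial response produces clean harmonics:

      T_n(\cos\theta) = \cos(n\theta)

    so if the kicker modulation is x(t) = x_0 cos(ωt) and the field nonlinearity is expanded as B(x) = Σ_n b_n T_n(x/x_0), the Fourier amplitude measured at harmonic nω isolates the coefficient b_n, quantifying the order-n multipole content with minimal cross-talk between orders.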

  20. Automatic Reading

    Institute of Scientific and Technical Information of China (English)

    胡迪

    2007-01-01

    Reading is the key to school success and, like any skill, it takes practice. A child learns to walk by practising until he no longer has to think about how to put one foot in front of the other. The great athlete practises until he can play quickly, accurately and without thinking. Educators call it automaticity.

  1. Research fronts analysis: A bibliometric method to identify emerging fields of research

    Science.gov (United States)

    Miwa, Sayaka; Ando, Satoko

    Research fronts analysis identifies emerging areas of research by observing co-citation clustering among highly cited papers. This article introduces the concept of research fronts analysis, explains its methodology and provides case examples. It also illustrates developing research fronts in Japan by looking at past winners of the Thomson Reuters Research Fronts Awards. Research fronts analysis is currently being used by the Japanese government to identify new trends in science and technology. Information professionals can also utilize this bibliometric method as a research evaluation tool.

  2. Project MINERVA's Follow-up on Wide-Field, Small Telescope Photometry to Identify Exoplanets

    Science.gov (United States)

    Houghton, Audrey; Henderson, Morgan; Johnson, Samson; Sergi, Anthony; Eastman, Jason D.; Beatty, Thomas G.; McCrady, Nate

    2017-01-01

    MINERVA is an array of four 0.7-m telescopes equipped for high precision photometry and spectroscopy dedicated to exoplanet observations. During the first 18 months of science operations, MINERVA engaged in a program of photometric follow-up of potential transiting exoplanet targets identified by the Kilodegree Extremely Little Telescope (KELT). Robotically-obtained observations are passed through our data reduction pipeline and we extract light curves via differential photometry. We seek transit signals via a Markov chain Monte Carlo fit using BATMAN. We discuss results for over 100 target stars analyzed to date.
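
    A minimal sketch of generating a model light curve with the batman package mentioned in the abstract; the planet parameters are hypothetical placeholders, not values for any KELT target:

      import numpy as np
      import batman  # transit-modeling package referenced in the abstract

      params = batman.TransitParams()
      params.t0 = 0.0          # mid-transit time (days)
      params.per = 3.0         # orbital period (days)
      params.rp = 0.1          # planet radius / stellar radius
      params.a = 8.0           # semi-major axis / stellar radius
      params.inc = 88.0        # inclination (degrees)
      params.ecc = 0.0         # eccentricity
      params.w = 90.0          # longitude of periastron (degrees)
      params.limb_dark = "quadratic"
      params.u = [0.3, 0.2]    # limb-darkening coefficients

      t = np.linspace(-0.1, 0.1, 500)          # days from mid-transit
      model = batman.TransitModel(params, t)
      flux = model.light_curve(params)         # compare against MINERVA photometry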

  3. Automatic identification of TCM terminology in Shanghan Lun based on conditional random field

    Institute of Scientific and Technical Information of China (English)

    孟洪宇; 谢晴宇; 常虹; 孟庆刚

    2015-01-01

    Objective: To explore methods for the automatic identification of TCM terminology and to expand the forms of natural language processing applied to TCM documents. Methods: Using conditional random fields (CRF), terms for symptoms, diseases, pulse types and prescriptions recorded in Shanghan Lun were annotated and automatically identified. The effects on terminology identification of different combinations of features, namely the Chinese character itself, part of speech, word boundary and term category label, were analyzed, and the most effective combination was selected. Results: The TCM terminology identification model combining the features of the Chinese character itself, part of speech, word boundary and term category label achieved a precision of 85.00%, a recall of 68.00% and an F score of 75.56%. Conclusion: The multi-feature model combining the Chinese character itself, part of speech, word boundary and term category label achieved the best identification result of all combinations.

  4. Identifying the Presence of AMD-Derived Soil CO2 in Field Investigations Using Isotope Ratios

    Directory of Open Access Journals (Sweden)

    Kwame Awuah-Offei

    2016-03-01

    Full Text Available Recent incidents of hazardous accumulations of CO2 in homes on or adjacent to reclaimed mine land have been shown to be linked to neutralization reactions between acidic mine drainage and carbonate material. An efficient and economic method is needed to identify the presence of acid mine drainage (AMD)-derived CO2 on reclaimed mine land prior to construction. One approach to identifying the presence of AMD-derived CO2 is to characterize the stable carbon isotope ratios of soil CO2. To do so, a viable method is necessary to acquire soil gas samples for isotope ratio analysis. This paper presents preliminary investigations of the effectiveness of two methods of acquiring gas samples (sampling during soil flux measurements, and using a slam bar) for isotope analysis. The results indicate that direct soil gas sampling is cheaper and provides better results. Neither method is adequate without accounting for temporal effects due to changing gas transport mechanisms. These results have significant implications for safe post-mining land uses and future investigations of leakage from geologic carbon sequestration sites.
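
    For reference, the stable carbon isotope ratio used in this kind of characterization is conventionally reported in per mil (‰) delta notation against the VPDB standard (standard geochemical practice, not specific to this paper):

      \delta^{13}\mathrm{C} = \left( \frac{({}^{13}\mathrm{C}/{}^{12}\mathrm{C})_{\mathrm{sample}}}{({}^{13}\mathrm{C}/{}^{12}\mathrm{C})_{\mathrm{VPDB}}} - 1 \right) \times 1000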

  5. Self-identified obese people request less money: a field experiment.

    Directory of Open Access Journals (Sweden)

    Antonios Proestakis

    2016-09-01

    Full Text Available Empirical evidence suggests that obese people are discriminated against in different social environments, such as the workplace. Yet the degree to which obese people internalize and adjust their own behaviour as a result of this discriminatory behaviour has not been studied thoroughly. We develop a proxy for measuring the self-weight bias experimentally by giving both self-identified obese (n=90) and non-obese (n=180) individuals the opportunity to request a positive amount of money after having performed an identical task. Consistent with System Justification Theory, we find that self-identified obese individuals, due to a preexisting false consciousness, request significantly lower amounts of money than non-obese ones. A within-subject comparison between self-reports and external interviewers' evaluations reveals that the excessive weight felt by the self but not reported by evaluators captures the self-weight bias, not only for obese but also for non-obese individuals. Linking our experimental results to the supply side of the labour market, we argue that self-weight bias, as expressed by lower salary requests, reinforces discriminatory behaviour against individuals who feel, but may not actually be, obese and consequently exacerbates the wage gap across weight.

  6. Accuracy of the discharge destination field in administrative data for identifying transfer to a long-term acute care hospital

    Directory of Open Access Journals (Sweden)

    Iwashyna Theodore J

    2010-07-01

    Full Text Available Abstract Background Long-term acute care hospitals (LTACs) provide specialized care for patients recovering from severe acute illness. In order to facilitate research into LTAC utilization and outcomes, we studied whether or not the discharge destination field in administrative data accurately identifies patients transferred to an LTAC following acute care hospitalization. Findings We used the 2006 hospitalization claims for United States Medicare beneficiaries to examine the performance characteristics of the discharge destination field in the administrative record, compared to the reference standard of directly observing LTAC transfers in the claims. We found that the discharge destination field was highly specific (99.7%, 95 percent CI: 99.7% - 99.8%) but modestly sensitive (77.3%, 95 percent CI: 77.0% - 77.6%), with corresponding low positive predictive value (72.6%, 95 percent CI: 72.3% - 72.9%) and high negative predictive value (99.8%, 95 percent CI: 99.8% - 99.8%). Sensitivity and specificity were similar when limiting the analysis to only intensive care unit patients and mechanically ventilated patients, two groups with higher rates of LTAC utilization. Performance characteristics were slightly better when limiting the analysis to Pennsylvania, a state with relatively high LTAC penetration. Conclusions The discharge destination field in administrative data can result in misclassification when used to identify patients transferred to long-term acute care hospitals. Directly observing transfers in the claims is the preferable method, although this approach is only feasible in identified data.
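
    The performance characteristics quoted above follow from the standard 2x2 confusion-matrix definitions; a minimal sketch, with hypothetical counts chosen only to reproduce percentages of the same order, is:

      # Standard diagnostic-accuracy metrics from a 2x2 confusion matrix.
      def diagnostics(tp, fp, fn, tn):
          return {
              "sensitivity": tp / (tp + fn),  # true transfers that were flagged
              "specificity": tn / (tn + fp),  # non-transfers left unflagged
              "ppv": tp / (tp + fp),          # flagged records truly transfers
              "npv": tn / (tn + fn),          # unflagged records truly not
          }

      # Hypothetical counts for illustration, not the study's data:
      print(diagnostics(tp=773, fp=292, fn=227, tn=98708))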

  7. Fungi in Thailand: a case study of the efficacy of an ITS barcode for automatically identifying species within the Annulohypoxylon and Hypoxylon genera.

    Directory of Open Access Journals (Sweden)

    Nuttika Suwannasai

    Full Text Available Thailand, a part of the Indo-Burma biodiversity hotspot, has many endemic animals and plants. Some of its fungal species are difficult to recognize and separate, complicating assessments of biodiversity. We assessed species diversity within the fungal genera Annulohypoxylon and Hypoxylon, which produce biologically active and potentially therapeutic compounds, by applying classical taxonomic methods to 552 teleomorphs collected from across Thailand. Using probability of correct identification (PCI), we also assessed the efficacy of automated species identification with a fungal barcode marker, ITS, in the model system of Annulohypoxylon and Hypoxylon. The 552 teleomorphs yielded 137 ITS sequences; in addition, we examined 128 GenBank ITS sequences, to assess biases in evaluating a DNA barcode with GenBank data. The use of multiple sequence alignment in a barcode database like BOLD raises some concerns about non-protein barcode markers like ITS, so we also compared species identification using different alignment methods. Our results suggest the following. (1) Multiple sequence alignment of ITS sequences is competitive with pairwise alignment when identifying species, so BOLD should be able to preserve its present bioinformatics workflow for species identification for ITS, and possibly therefore with at least some other non-protein barcode markers. (2) Automated species identification is insensitive to a specific choice of evolutionary distance, contributing to resolution of a current debate in DNA barcoding. (3) Statistical methods are available to address, at least partially, the possibility of expert misidentification of species. Phylogenetic trees discovered a cryptic species and strongly supported monophyletic clades for many Annulohypoxylon and Hypoxylon species, suggesting that ITS can contribute usefully to a barcode for these fungi. The PCIs here, derived solely from ITS, suggest that a fungal barcode will require secondary markers in

  8. Identifying best practices for clinical decision support and knowledge management in the field.

    Science.gov (United States)

    Ash, Joan S; Sittig, Dean F; Dykstra, Richard; Wright, Adam; McMullen, Carmit; Richardson, Joshua; Middleton, Blackford

    2010-01-01

    To investigate best practices for implementing and managing clinical decision support (CDS) in community hospitals and ambulatory settings, we carried out a series of ethnographic studies to gather information from nine diverse organizations. Using the Rapid Assessment Process methodology, we conducted surveys, interviews, and observations over a period of two years in eight different geographic regions of the U.S.A. We first utilized a template organizing method for an expedited analysis of the data, followed by a deeper and more time-consuming interpretive approach. We identified five major categories of best practices that require careful consideration while carrying out the planning, implementation, and knowledge management processes related to CDS. As more health care organizations implement clinical systems such as computerized provider order entry with CDS, descriptions of lessons learned by CDS pioneers can provide valuable guidance so that CDS can have optimal impact on health care quality.

  9. Identifying Landscape Areas Prone to Generating Storm Runoff in Central New York Agricultural Fields

    Science.gov (United States)

    Hofmeister, K.; Walter, M. T.

    2015-12-01

    Nonpoint source (NPS) pollution continues to be a leading cause of surface water degradation, especially in agricultural areas. In humid regions where variable source area (VSA) hydrology dominates storm runoff, NPS pollution is generated where VSAs coincide with polluting activities. Mapping storm runoff risks could allow for more precise and informed targeting of NPS pollution mitigation practices in agricultural landscapes. Topographic wetness indices (TWI) provide good approximations of relative soil moisture patterns and relative storm runoff risks. Simulation models are typically used in conjunction with TWIs to quantify VSA behavior. In this study we use empirically derived relationships between TWI values, volumetric water content (VWC) and rainfall frequencies to develop runoff probability maps. Rainfall and soil VWC were measured across regionally representative agricultural areas in central New York over three years (2012-2015) to determine the volume of runoff generated from agricultural fields in the area. We assumed the threshold for storm runoff occurs when the combination of antecedent soil water and rainfall are sufficient to saturate the soil. We determined that approximately 50% of the storm runoff volume is generated from 10% of the land area during spring, summer, and autumn seasons, while the risk of storm runoff generation is higher in the spring and autumn seasons than in the summer for the same area of land.
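
    The topographic wetness index used in such work is conventionally defined as TWI = ln(a / tan β), with a the specific upslope contributing area and β the local slope; a minimal sketch (grid values are hypothetical) is:

      # Topographic wetness index, TWI = ln(a / tan(beta)).
      import numpy as np

      def twi(contrib_area, slope_rad, eps=1e-6):
          """contrib_area: specific upslope area (m); slope_rad: slope in
          radians; eps guards against division by zero on flat cells."""
          return np.log(contrib_area / (np.tan(slope_rad) + eps))

      a = np.array([50.0, 500.0, 5000.0])            # hypothetical areas (m)
      beta = np.radians(np.array([10.0, 5.0, 1.0]))  # flatter cells downslope
      print(twi(a, beta))  # larger TWI marks wetter, runoff-prone locations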

  10. Response Properties of a Newly Identified Tristratified Narrow Field Amacrine Cell in the Mouse Retina.

    Directory of Open Access Journals (Sweden)

    G S Newkirk

    Full Text Available Amacrine cells were targeted for whole cell recording using two-photon fluorescence microscopy in a transgenic mouse line in which the promoter for dopamine receptor 2 drove expression of green fluorescent protein in a narrow field tristratified amacrine cell (TNAC) that had not been studied previously. Light evoked a multiphasic response that was the sum of hyperpolarizing and depolarizing synaptic inputs, consistent with distinct dendritic ramifications in the off and on sublamina of the inner plexiform layer. The amplitude and waveform of the response, which consisted of an initial brief hyperpolarization at light onset, followed by recovery to a plateau potential close to the dark resting potential, and a hyperpolarizing response at light offset, varied little over an intensity range from 0.4 to ~10^6 Rh*/rod/s. This suggests that the cell functions as a differentiator that generates an output signal (a transient reduction in inhibitory input to downstream retinal neurons) that is proportional to the derivative of the light input independent of its intensity. The underlying circuitry appears to consist of rod and cone driven on and off bipolar cells that provide direct excitatory input to the cell, as well as to GABAergic amacrine cells that are synaptically coupled to TNAC. Canonical reagents that blocked excitatory (glutamatergic) and inhibitory (GABA and glycine) synaptic transmission had effects on responses to scotopic stimuli consistent with the rod driven component of the proposed circuit. However, responses evoked by photopic stimuli were paradoxical and could not be interpreted on the basis of conventional thinking about the neuropharmacology of synaptic interactions in the retina.

  11. Identifying Gender-Sensitive Agroforestry Options: Methodological Considerations From the Field

    Directory of Open Access Journals (Sweden)

    Sarah-Lan Mathez-Stiefel

    2016-11-01

    Full Text Available Agroforestry is seen as a promising set of land use practices that can lead to increased ecological integrity and sustainable benefits in mountain areas. Agroforestry practices can also enhance smallholder farmers' resilience in the face of social and ecological change. There is a need for critical examination of existing practices to ensure that agroforestry recommendations for smallholder farmers are socially inclusive and grounded in local experience, knowledge, and perceptions. In this paper, we present a transdisciplinary systems approach to the identification and analysis of suitable agroforestry options, which takes into account gendered perceptions of the benefits and values of natural resources. The 4-step approach consists of an appraisal of local perceptions of the social-ecological context and dynamics, an inventory of existing agroforestry practices and species, a gendered valuation of agroforestry practices and species, and the development of locally adapted and gender-sensitive agroforestry options. In a study using this approach in the Peruvian Andes, data were collected through a combination of participatory tools for gender research and ethnobotanical methods. This paper shares lessons learned and offers recommendations for researchers and practitioners in the field of sustainable mountain development. We discuss methodological considerations in the identification of locally adapted agroforestry options, the understanding of local social-ecological systems, the facilitation of social learning processes, engagement in gender research, and the establishment of ethical research collaborations. The methodology presented here is especially recommended for the exploratory phase of any natural resource management initiative in mountain areas with high environmental and sociocultural variability.

  12. Identifying nearby field T dwarfs in the UKIDSS Galactic Clusters Survey

    CERN Document Server

    Lodieu, N; Hambly, N C; Pinfield, D J

    2008-01-01

    We present the discovery of two new late-T dwarfs identified in the UKIRT Infrared Deep Sky Survey (UKIDSS) Galactic Clusters Survey (GCS) Data Release 2 (DR2). These T dwarfs are nearby old T dwarfs along the line of sight to star-forming regions and open clusters targeted by the UKIDSS GCS. They are found towards the Alpha Per cluster and Orion complex, respectively, from a search in 54 square degrees surveyed in five filters. Photometric candidates were picked up in two-colour diagrams, in a very similar manner to candidates extracted from the UKIDSS Large Area Survey (LAS) but taking advantage of the Z filter employed by the GCS. Both candidates exhibit near-infrared J-band spectra with strong methane and water absorption bands characteristic of late-T dwarfs. We derive spectral types of T6.5+/-0.5 and T7+/-1 and estimate photometric distances less than 50 pc for UGCS J030013.86+490142.5 and UGCS J053022.52-052447.4, respectively. The space density of T dwarfs found in the GCS seems consistent with discov...

  13. Identify biosorption effects of Thiobacillus towards perfluorooctanoic acid (PFOA): Pilot study from field to laboratory.

    Science.gov (United States)

    Li, Lei; Wang, Tieyu; Sun, Yajun; Wang, Pei; Yvette, Baninla; Meng, Jing; Li, Qifeng; Zhou, Yunqiao

    2017-03-01

    The concentrations of perfluoroalkyl acids (PFAAs) and the bacterial community composition along the Xiaoqing River were explored with HPLC-MS/MS and Illumina high-throughput sequencing in the present study. The results showed that perfluorooctanoic acid (PFOA) was the predominant PFAA in all sediment samples, and that high levels of PFOA led to an evident increase in the abundance of Thiobacillus. Thiobacillus was accordingly identified as able to survive at high concentrations of PFOA. Therefore, Thiobacillus thioparus and Thiobacillus denitrificans were selected as receptors for an indoor biosorption experiment. The growth curves under different PFOA concentrations and the residual rates of PFOA during cultivation were analyzed. The results showed that increasing concentrations of PFOA below 5000 ng/L led to a clear increase in the growth rate of T. thioparus, whereas PFOA promoted the growth of T. denitrificans only within a relatively narrow concentration range, and the effect was weak. The addition of different concentrations of PFOA had no apparent effect on pH values in the media of either T. thioparus or T. denitrificans. The concentrations of PFOA in the liquid media decreased after bacterial culturing. The removal rates of PFOA by T. thioparus and T. denitrificans were 21.1-26.8% and 13.5-18.4%, respectively. These findings indicate that T. thioparus could serve as a potential biosorbent able to eliminate PFOA effectively in aquatic environments, which would provide novel information for PFOA ecological decontamination and remediation.

  14. A Field Program to Identify TRI Chemicals and Determine Emission Factors from DoD Munitions Activities

    Science.gov (United States)

    2006-01-01

    FINAL REPORT to the STRATEGIC ENVIRONMENTAL RESEARCH AND DEVELOPMENT PROGRAM (SERDP) on A FIELD PROGRAM TO IDENTIFY TRI CHEMICALS...adhere to EPCRA, including the toxic release inventory (TRI) requirements. A particularly difficult reporting issue for DoD concerns air emissions...primarily because of constraints imposed on the physical location of the lidar due to laser safety issues. The lack of flexibility in repositioning

  15. Approaches to identifying reservoir heterogeneity and reserve growth opportunities from subsurface data: The Oficina Formation, Budare field, Venezuela

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, D.S.; Raeuchle, S.K.; Holtz, M.H. [Bureau of Economic Geology, Austin, TX (United States)] [and others]

    1997-08-01

    We applied an integrated geologic, geophysical, and engineering approach devised to identify heterogeneities in the subsurface that might lead to reserve growth opportunities in our analysis of the Oficina Formation at Budare field, Venezuela. The approach involves four key steps: (1) determine geologic reservoir architecture; (2) investigate trends in reservoir fluid flow; (3) integrate fluid flow trends with reservoir architecture; and (4) estimate original oil-in-place, residual oil saturation, and remaining mobile oil, to identify opportunities for reserve growth. There are three main oil-producing reservoirs in the Oficina Formation, deposited in a bed-load fluvial system, an incised valley fill, and a barrier-strandplain system. Reservoir continuity is complex because, in addition to lateral facies variability, the major Oficina depositional systems were internally subdivided by high-frequency stratigraphic surfaces. These surfaces record times of intermittent lacustrine and marine flooding events that punctuated the fluvial and marginal-marine sedimentation, respectively. Syn- and post-depositional faulting further disrupted reservoir continuity. Trends in fluid flow established from initial fluid levels, responses to recompletion workovers, and pressure depletion data demonstrated barriers to lateral and vertical fluid flow caused by a combination of reservoir facies pinchouts, flooding shale markers, and the faults. Considerable reserve growth potential exists at Budare field because the reservoir units are highly compartmentalized by the depositional heterogeneity and structural complexity. Numerous reserve growth opportunities were identified in attics updip of existing production, in untapped or incompletely drained compartments, and in field extensions.

  16. First results from the field test of households with dynamic tariff and automatic control in the regenerative model region Harz; Erste Ergebnisse des Haushaltsfeldtests mit dynamischen Tarif und automatischer Steuerung in der Regenerativen Modellregion Harz

    Energy Technology Data Exchange (ETDEWEB)

    Funke, Stephan; Landau, Markus [Fraunhofer Institut fuer Windenergie und Energiesystemtechnik (IWES), Kassel (Germany); Filzek, Dirk; Volkert, Christina [CUBE Engineering GmbH, Kassel (Germany); Fechner, Amelie [Saarland Univ., Saarbruecken (Germany). Forschungsgruppe Umweltpsychologie

    2012-07-01

    As part of the E-Energy research project RegModHarz (Regenerative Model Region Harz), a field test with test households is being carried out. A system developed in the project, consisting of a dynamic tariff, an appliance control and a monitoring system, is tested. In parallel, the acceptance of this system by the participants of the field test is evaluated. First results from the commissioning of the system are already available. Currently, the second phase of the field test is being performed, in which the participants can adjust their power consumption actively and automatically to the availability of electricity from renewable energy sources in the model region.

  17. DNase SISPA-next generation sequencing confirms Schmallenberg virus in Belgian field samples and identifies genetic variation in Europe.

    Directory of Open Access Journals (Sweden)

    Toon Rosseel

    Full Text Available In 2011, a novel Orthobunyavirus was identified in cattle and sheep in Germany and The Netherlands. This virus was named Schmallenberg virus (SBV). Later, presence of the virus was confirmed using real-time RT-PCR in cases of congenital malformations of bovines and ovines in several European countries, including Belgium. In the absence of specific sequencing protocols for this novel virus we confirmed its presence in RT-qPCR positive field samples using DNase SISPA-next generation sequencing (NGS), a virus discovery method based on random amplification and next generation sequencing. An in vitro transcribed RNA was used to construct a standard curve allowing the quantification of viral RNA in the field samples. Two field samples of aborted lambs containing 7.66 and 7.64 log10 RNA copies per µL total RNA allowed unambiguous identification of SBV. One sample yielded 192 SBV reads covering about 81% of the L segment, 56% of the M segment and 13% of the S segment. The other sample resulted in 8 reads distributed over the L and M segments. Three weakly positive field samples (one from an aborted calf, two from aborted lambs) containing virus quantities equivalent to 4.27-4.89 log10 RNA copies per µL did not allow identification using DNase SISPA-NGS. This partial sequence information was compared to the whole genome sequence of SBV isolated from bovines in Germany, identifying several sequence differences. The applied virus discovery method allowed the confirmation of SBV in RT-qPCR positive brain samples. However, the failure to confirm SBV in weakly PCR-positive samples illustrates the importance of selecting properly targeted and fresh field samples in any virus discovery method. The partial sequences derived from the field samples showed several differences compared to the sequences from bovines in Germany, indicating sequence divergence within the epidemic.

  18. Multi-Stage Automatic Ultrafiltration System Enhanced with Complex Physical Fields

    Institute of Scientific and Technical Information of China (English)

    傅晓琴; 李琳; 李冰; 陈玲

    2011-01-01

    Proposed in this paper is an automatic ultrafiltration system enhanced with an ultrasonic and a pulsed electric field, which is used to overcome concentration polarization and membrane fouling during ultrafiltration and to improve the automation level of the ultrafiltration process. The system consists of a crude filtration loop, two ultrafiltration loops, a cleaning loop and a heat-exchange loop, and offers excellent operational flexibility because it can perform multi-stage ultrafiltration simultaneously or single-stage ultrafiltration separately. It also possesses a multi-layer distributed automatic control subsystem based on the Kunlun MGCS configuration environment, which adopts an industrial personal computer and a Siemens S7-200 PLC as the system hardware. Application of the proposed system to the concentration of pumpkin polysaccharides shows that, under the adopted experimental conditions, the average permeation flux increases by 1.15 to 2.53 times and the rejection by 15 percentage points, which means that the system greatly improves ultrafiltration efficiency and effectiveness.

  19. Candidate Clusters of Galaxies at $z>1.3$ Identified in the Spitzer SPT Deep Field Survey

    CERN Document Server

    Rettura, A; Stern, D; Mei, S; Ashby, M L N; Brodwin, M; Gettings, D; Gonzalez, A H; Stanford, S A; Bartlett, J G

    2014-01-01

    We present 279 galaxy cluster candidates at $z > 1.3$ selected from the 94 deg$^{2}$ Spitzer South Pole Telescope Deep Field (SSDF) survey. We use a simple algorithm to select candidate high-redshift clusters of galaxies based on Spitzer/IRAC mid-infrared data combined with shallow all-sky optical data. We identify distant cluster candidates in SSDF adopting an overdensity threshold that results in a high purity (80%) cluster sample based on tests in the Spitzer Deep, Wide-Field Survey of the Boötes field. Our simple algorithm detects all three $1.4 < z \leq 1.75$ X-ray detected clusters in the Boötes field. The uniqueness of the SSDF survey resides not just in its area, one of the largest contiguous extragalactic fields observed with Spitzer, but also in its deep, multi-wavelength coverage by the South Pole Telescope (SPT), Herschel/SPIRE and XMM-Newton. This rich dataset will allow direct or stacked measurements of Sunyaev-Zel'dovich effect decrements or X-ray masses for many of the SSDF clusters pre...

  20. Automatic alignment device for focal spot measurements in the center of the field for mammography; Sistema automatico de alinhamento para avaliacao do ponto focal no centro do campo de equipamentos mamograficos

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Marcelo A.C.; Watanabe, Alex O.; Oliveira Junior, Paulo D.; Schiabel, Homero [Universidade de Sao Paulo (USP), Sao Carlos, SP (Brazil). Escola de Engenharia. Dept. de Engenharia Eletrica], e-mail: mvieira@sc.usp.br

    2010-03-15

    Some quality control procedures used for mammography, such as focal spot evaluation, require previous alignment of the measurement equipment with the X-ray central beam. However, alignment procedures are, in general, the most difficult task and the one that needs the most time to be performed. Moreover, the operator is sometimes exposed to radiation during this procedure. This work presents an automatic alignment system for mammographic equipment that locates the central ray of the radiation beam and immediately aligns with it by moving itself automatically along the field. The system consists of a bidirectional moving device connected to a CCD sensor for digital radiographic image acquisition. A computational analysis of a radiographic image, acquired at any position on the field, is performed in order to determine its position under the X-ray beam. Finally, a mechanical system with two moving directions, electronically controlled by a microcontroller over USB communication, makes the system align automatically with the central ray of the radiation beam. The alignment process is fully automatic, fast and accurate, with no operator exposure to radiation, which allows considerable time savings in carrying out quality control procedures for mammography. (author)

  1. Metagenomic analysis of viruses associated with field-grown and retail lettuce identifies human and animal viruses.

    Science.gov (United States)

    Aw, Tiong Gim; Wengert, Samantha; Rose, Joan B

    2016-04-16

    The emergence of culture- and sequence-independent metagenomic methods has not only provided great insight into the microbial community structure in a wide range of clinical and environmental samples but has also proven to be a powerful tool for pathogen detection. Recent studies of the food microbiome have revealed the vast genetic diversity of bacteria associated with fresh produce. However, no work has been done to apply metagenomic methods to viruses associated with fresh produce for addressing food safety. Thus, there is little knowledge about the presence and diversity of viruses associated with fresh produce from farm to fork. To address this knowledge gap, we assessed viruses on commercial romaine and iceberg lettuces in fields and a produce distribution center using shotgun metagenomic sequencing targeting both RNA and DNA viruses. Commercial lettuce harbors an immense assemblage of viruses that infect a wide range of hosts. As expected, plant pathogenic viruses dominated these communities. Sequences of rotaviruses and picobirnaviruses were also identified in both field-harvested and retail lettuce samples, suggesting an emerging foodborne transmission threat that has yet to be fully recognized. The identification of human and animal viruses in lettuce samples in the field emphasizes the importance of preventing viral contamination of leafy greens starting at the field. Although there are still some inherent experimental and bioinformatics challenges in applying viral metagenomic approaches to food safety testing, this work will facilitate further application of this unprecedented deep sequencing method to food samples.

  2. Automatic Chemical Dosing System for the Union Station of an Oil Field

    Institute of Scientific and Technical Information of China (English)

    翟波; 杨文

    2011-01-01

    To address the shortcomings of the current manual chemical dosing method at oil field union stations, a PLC is used as the control unit. It controls the frequency converter, and thereby the dosing rate, according to the monitored quantities and the set concentration, based on automatically measured pipeline flow, temperature, oil pressure, and pump current and voltage. The lower-level unit communicates with the host computer in real time to display the dosing concentration, dosing rate and crude oil flow, and the system provides multiple fault alarm functions and key interlocks. The automatic dosing system reduces workers' labor intensity and improves the dosing accuracy and production efficiency of the union station.
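
    A minimal sketch of the flow-proportional dosing logic this record describes: the dosing rate tracks the measured pipeline flow so that the set chemical concentration is held, with simple alarm checks on the monitored pump signals. All signal names and limits are hypothetical, and the logic is shown in Python rather than on a PLC.

      # Flow-proportional chemical dosing with basic alarm checks.
      # Signal names and limits are hypothetical illustrations.
      def dosing_rate(flow_m3_h, target_mg_L):
          """Dose in g/h needed to hold the target concentration."""
          return flow_m3_h * 1000.0 * target_mg_L / 1000.0  # m3->L, mg->g

      def check_alarms(pump_current_A, pressure_MPa):
          alarms = []
          if not 2.0 <= pump_current_A <= 12.0:
              alarms.append("pump current out of range")
          if pressure_MPa > 1.6:
              alarms.append("pipeline overpressure")
          return alarms

      flow, target = 120.0, 25.0        # hypothetical: 120 m3/h at 25 mg/L
      print(dosing_rate(flow, target))  # 3000.0 g/h, fed to the VFD setpoint
      print(check_alarms(pump_current_A=8.5, pressure_MPa=1.2))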

  3. Semi-automatic delimitation of volcanic edifice boundaries: Validation and application to the cinder cones of the Tancitaro-Nueva Italia region (Michoacán-Guanajuato Volcanic Field, Mexico)

    Science.gov (United States)

    Di Traglia, Federico; Morelli, Stefano; Casagli, Nicola; Garduño Monroy, Victor Hugo

    2014-08-01

    The shape and size of monogenetic volcanoes are the result of complex evolutions involving the interaction of eruptive activity, structural setting and degradational processes. Morphological studies of cinder cones aim to evaluate volcanic hazard on the Earth and to decipher the origins of various structures on extraterrestrial planets. Efforts have been dedicated so far to characterizing cinder cone morphology in a systematic and comparable manner. However, manual delimitation is time-consuming and influenced by user subjectivity while, on the other hand, automatic boundary delimitation of volcanic terrains can be affected by irregular topography. In this work, the semi-automatic delimitation of volcanic edifice boundaries proposed by Grosse et al. (2009) for stratovolcanoes was tested for the first time on monogenetic cinder cones. The method, based on the integration of DEM-derived slope and curvature maps, is applied here to the Tancitaro-Nueva Italia region of the Michoacán-Guanajuato Volcanic Field (Mexico), where 309 Plio-Quaternary cinder cones are located. The semi-automatic extraction identified 137 of the 309 cinder cones of the Tancitaro-Nueva Italia region recognized by means of manual extraction, corresponding to 44.3% of the total number of cinder cones. Analysis of vent alignments allowed us to identify NE-SW vent alignments and cone elongations, consistent with a NE-SW σmax and a NW-SE σmin. By constructing a vent intensity map, based on computing the number of vents within a radius r centred on each vent of the data set and choosing r = 5 km, four vent intensity maxima were derived: one positioned NW of Volcano Tancitaro, one NE, one S, and another vent cluster located at the SE boundary of the studied area. The spacing of the cluster centroids (24 km) can be related to the thickness of the crust (9-10 km) overlying the magma reservoir.
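
    A minimal sketch of the DEM-derived slope and curvature maps on which this kind of delimitation rests; the synthetic cone and grid spacing are hypothetical, and the actual Grosse et al. (2009) procedure integrates these layers in a more elaborate way.

      # Slope and Laplacian-curvature maps from a DEM; boundary candidates
      # follow the break in slope, where curvature magnitude is high.
      import numpy as np

      def slope_and_curvature(dem, cell=30.0):
          dzdy, dzdx = np.gradient(dem, cell)                  # first derivatives
          slope = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))  # slope (degrees)
          d2y, _ = np.gradient(dzdy, cell)
          _, d2x = np.gradient(dzdx, cell)
          return slope, d2x + d2y                              # slope, curvature

      # Synthetic cinder cone: a Gaussian hill on a ~1 km grid at 30 m spacing.
      x, y = np.meshgrid(np.linspace(-500, 500, 34), np.linspace(-500, 500, 34))
      dem = 200.0 * np.exp(-(x**2 + y**2) / (2 * 150.0**2))
      slope, curv = slope_and_curvature(dem)
      print(slope.max(), curv.min(), curv.max())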

  4. Automatization of lexicographic work

    Directory of Open Access Journals (Sweden)

    Iztok Kosem

    2013-12-01

    Full Text Available A new approach to lexicographic work, in which the lexicographer is seen more as a validator of choices made by the computer, was recently envisaged by Rundell and Kilgarriff (2011). In this paper, we describe an experiment using such an approach during the creation of the Slovene Lexical Database (Gantar and Krek, 2011). The corpus data, i.e. grammatical relations, collocations, examples, and grammatical labels, were automatically extracted from the 1.18-billion-word Gigafida corpus of Slovene. The evaluation of the extracted data consisted of comparing the time spent writing a manual entry with that spent on a (semi-)automatic entry, and identifying potential improvements in the extraction algorithm and in the presentation of data. An important finding was that the automatic approach was far more effective than the manual approach, without any significant loss of information. Based on our experience, we would propose a slightly revised version of the approach envisaged by Rundell and Kilgarriff, in which the validation of data is left to lower-level linguists or crowd-sourcing, whereas high-level tasks such as meaning description remain the domain of lexicographers. Such an approach indeed reduces the scope of the lexicographer's work; however, it also makes it possible to bring the content to users more quickly.

  5. Automatic Moth Detection from Trap Images for Pest Management

    OpenAIRE

    Ding, Weiguang; Taylor, Graham

    2016-01-01

    Monitoring the number of insect pests is a crucial component in pheromone-based pest management systems. In this paper, we propose an automatic detection pipeline based on deep learning for identifying and counting pests in images taken inside field traps. Applied to a commercial codling moth dataset, our method shows promising performance both qualitatively and quantitatively. Compared to previous attempts at pest detection, our approach uses no pest-specific engineering which enables it to ...

  6. CANDIDATE CLUSTERS OF GALAXIES AT z > 1.3 IDENTIFIED IN THE SPITZER SOUTH POLE TELESCOPE DEEP FIELD SURVEY

    Energy Technology Data Exchange (ETDEWEB)

    Rettura, A.; Stern, D. [Jet Propulsion Laboratory, California Institute of Technology, MS 169-234, Pasadena, CA 91109 (United States); Martinez-Manso, J.; Gettings, D.; Gonzalez, A. H. [Department of Astronomy, University of Florida, Gainesville, FL 32611 (United States); Mei, S. [GEPI, Observatoire de Paris, Section de Meudon, Meudon Cedex (France); Ashby, M. L. N. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA 02138 (United States); Brodwin, M. [Department of Physics and Astronomy, University of Missouri, Kansas City, MO 64110 (United States); Stanford, S. A. [Department of Physics, University of California, Davis, CA 95616 (United States); Bartlett, J. G. [APC, AstroParticule et Cosmologie, Universite Paris Diderot, CNRS/IN2P3, CEA/lrfu, Observatoire de Paris, Sorbonne Paris Cite, 75205 Paris Cedex 13 (France)

    2014-12-20

    We present 279 galaxy cluster candidates at z > 1.3 selected from the 94 deg^2 Spitzer South Pole Telescope Deep Field (SSDF) survey. We use a simple algorithm to select candidate high-redshift clusters of galaxies based on Spitzer/IRAC mid-infrared data combined with shallow all-sky optical data. We identify distant cluster candidates adopting an overdensity threshold that results in a high purity (80%) cluster sample based on tests in the Spitzer Deep, Wide-Field Survey of the Boötes field. Our simple algorithm detects all three 1.4 < z ≤ 1.75 X-ray detected clusters in the Boötes field. The uniqueness of the SSDF survey resides not just in its area, one of the largest contiguous extragalactic fields observed with Spitzer, but also in its deep, multi-wavelength coverage by the South Pole Telescope (SPT), Herschel/SPIRE, and XMM-Newton. This rich data set will allow direct or stacked measurements of Sunyaev-Zel'dovich effect decrements or X-ray masses for many of the SSDF clusters presented here, and enable a systematic study of the most distant clusters on an unprecedented scale. We measure the angular correlation function of our sample and find that these candidates show strong clustering. Employing the COSMOS/UltraVista photometric catalog in order to infer the redshift distribution of our cluster selection, we find that these clusters have a comoving number density n_c = (0.7^{+6.3}_{-0.6}) × 10^{-7} h^3 Mpc^{-3} and a spatial clustering correlation scale length r_0 = (32 ± 7) h^{-1} Mpc. Assuming our sample is comprised of dark matter halos above a characteristic minimum mass, M_min, we derive that at z = 1.5 these clusters reside in halos larger than M_min = 1.5^{+0.9}_{-0.7} × 10^{14} h^{-1} M_⊙. We find that the mean mass of our cluster sample is equal to M_mean = 1.9^{+1.0}_{-0.8} × 10^{14} h^{-1} M_⊙; thus, our sample contains the progenitors of

  7. Numerical simulation analysis of the temperature and velocity fields of an automatic umbrella dryer

    Institute of Scientific and Technical Information of China (English)

    徐逢; 胡勇

    2014-01-01

    The working principle and design scheme of an automatic umbrella dryer are introduced in this paper. The temperature and velocity fields inside the automatic umbrella dryer are analyzed using SolidWorks Flow Simulation. The results show that the temperature distribution is fairly uniform; in the regions of low wind speed the temperature is around 40 °C, far below the temperature that would damage the umbrella fabric. The final results show that the structure of the automatic umbrella dryer is reasonable and has good prospects for market application.

  8. Identifying Active Faults in Northeast North America Using Hypocenters and Multiscale Edge Wavelet Analyses of Potential Fields

    Science.gov (United States)

    Carpenter, K.; Horowitz, F.; Ebinger, C. J.; Navarrete, L. C.; Diaz-Etchevehere, D.

    2015-12-01

    Multiscale edge Poisson wavelet analyses of potential field data ("worms") have a physical interpretation as the locations of lateral boundaries in a source distribution that exactly generates the observed field. The worm technique is therefore well suited to analyses of crustal-scale structures that could be reactivated by tectonic stress or by fluid injection processes, providing new tools to analyze existing continental-scale data sets. Northeastern North America (US, Canada) hosts potentially damaging intraplate earthquakes, yet many of the Proterozoic structures are covered by thick sedimentary sequences or dense vegetation, and crustal structure is relatively poorly known. For the purpose of extending basement structure beneath the Appalachian basin and establishing a consistent regional basis for comparison, we use worms to identify steeply dipping structures in compiled gravity and magnetic anomaly data sets. We compare results to intraplate earthquake locations to assess seismic hazards. Clearly, not all locations of lateral boundaries are faults, and we do not expect all faults to have shown activity in the ~50 years of seismic records available. However, proximity statistics between hypocenters and worms are of interest since they assist in the identification and location of a subset of potentially active faults. We compare structures of lateral mass-density or magnetization contrast with locations of earthquake hypocenters cataloged from the ISC, the NEIC, and the ANF from the EarthScope Transportable Array. We develop a GIS-based method for calculating hypocenter/worm proximity, and we will show statistics and maps from this method for the region at the meeting.
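
    A minimal sketch of the hypocenter/worm proximity statistic mentioned here, using a k-d tree for nearest-neighbor distances; the projected coordinates (in km) are hypothetical.

      # Nearest-distance statistics between epicenters and "worm" points.
      import numpy as np
      from scipy.spatial import cKDTree

      worm_pts = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.0], [3.0, 1.5]])
      quakes = np.array([[0.9, 0.6], [2.5, 1.2], [10.0, 9.0]])

      dist, _ = cKDTree(worm_pts).query(quakes)  # distance to nearest worm
      print("nearest offsets (km):", dist)
      print("median offset (km):", np.median(dist))
      print("fraction within 1 km:", np.mean(dist < 1.0))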

  9. Receptive Field Vectors of Genetically-Identified Retinal Ganglion Cells Reveal Cell-Type-Dependent Visual Functions.

    Directory of Open Access Journals (Sweden)

    Matthew L Katz

    Full Text Available Sensory stimuli are encoded by diverse kinds of neurons but the identities of the recorded neurons that are studied are often unknown. We explored in detail the firing patterns of eight previously defined genetically-identified retinal ganglion cell (RGC) types from a single transgenic mouse line. We first introduce a new technique of deriving receptive field vectors (RFVs) which utilises a modified form of mutual information ("Quadratic Mutual Information"). We analysed the firing patterns of RGCs during presentation of short duration (~10 second) complex visual scenes (natural movies). We probed the high dimensional space formed by the visual input for a much smaller dimensional subspace of RFVs that give the most information about the response of each cell. The new technique is very efficient and fast and the derivation of novel types of RFVs formed by the natural scene visual input was possible even with limited numbers of spikes per cell. This approach enabled us to estimate the 'visual memory' of each cell type and the corresponding receptive field area by calculating Mutual Information as a function of the number of frames and radius. Finally, we made predictions of biologically relevant functions based on the RFVs of each cell type. RGC class analysis was complemented with results for the cells' response to simple visual input in the form of black and white spot stimulation, and their classification on several key physiological metrics. Thus RFVs lead to predictions of biological roles based on limited data and facilitate analysis of sensory-evoked spiking data from defined cell types.

  10. Coupling field and laboratory measurements to estimate the emission factors of identified and unidentified trace gases for prescribed fires

    Science.gov (United States)

    Yokelson, R. J.; Burling, I. R.; Gilman, J. B.; Warneke, C.; Stockwell, C. E.; de Gouw, J.; Akagi, S. K.; Urbanski, S. P.; Veres, P.; Roberts, J. M.; Kuster, W. C.; Reardon, J.; Griffith, D. W. T.; Johnson, T. J.; Hosseini, S.; Miller, J. W.; Cocker, D. R., III; Jung, H.; Weise, D. R.

    2013-01-01

    An extensive program of experiments focused on biomass burning emissions began with a laboratory phase in which vegetative fuels commonly consumed in prescribed fires were collected in the southeastern and southwestern US and burned in a series of 71 fires at the US Forest Service Fire Sciences Laboratory in Missoula, Montana. The particulate matter (PM2.5) emissions were measured by gravimetric filter sampling with subsequent analysis for elemental carbon (EC), organic carbon (OC), and 38 elements. The trace gas emissions were measured by an open-path Fourier transform infrared (OP-FTIR) spectrometer, proton-transfer-reaction mass spectrometry (PTR-MS), proton-transfer ion-trap mass spectrometry (PIT-MS), negative-ion proton-transfer chemical-ionization mass spectrometry (NI-PT-CIMS), and gas chromatography with MS detection (GC-MS). 204 trace gas species (mostly non-methane organic compounds (NMOC)) were identified and quantified with the above instruments. Many of the 182 species quantified by the GC-MS have rarely, if ever, been measured in smoke before. An additional 153 significant peaks in the unit mass resolution mass spectra were quantified, but either could not be identified or most of the signal at that molecular mass was unaccounted for by identifiable species. In a second, "field" phase of this program, airborne and ground-based measurements were made of the emissions from prescribed fires that were mostly located in the same land management units where the fuels for the lab fires were collected. A broad variety, but smaller number of species (21 trace gas species and PM2.5) was measured on 14 fires in chaparral and oak savanna in the southwestern US, as well as pine forest understory in the southeastern US and Sierra Nevada mountains of California. The field measurements of emission factors (EF) are useful both for modeling and to examine the representativeness of our lab fire EF. The lab EF/field EF ratio for the pine understory fuels was not

  11. Coupling field and laboratory measurements to estimate the emission factors of identified and unidentified trace gases for prescribed fires

    Directory of Open Access Journals (Sweden)

    R. J. Yokelson

    2013-01-01

    Full Text Available An extensive program of experiments focused on biomass burning emissions began with a laboratory phase in which vegetative fuels commonly consumed in prescribed fires were collected in the southeastern and southwestern US and burned in a series of 71 fires at the US Forest Service Fire Sciences Laboratory in Missoula, Montana. The particulate matter (PM2.5) emissions were measured by gravimetric filter sampling with subsequent analysis for elemental carbon (EC), organic carbon (OC), and 38 elements. The trace gas emissions were measured by an open-path Fourier transform infrared (OP-FTIR) spectrometer, proton-transfer-reaction mass spectrometry (PTR-MS), proton-transfer ion-trap mass spectrometry (PIT-MS), negative-ion proton-transfer chemical-ionization mass spectrometry (NI-PT-CIMS), and gas chromatography with MS detection (GC-MS). 204 trace gas species (mostly non-methane organic compounds, NMOC) were identified and quantified with the above instruments. Many of the 182 species quantified by the GC-MS have rarely, if ever, been measured in smoke before. An additional 153 significant peaks in the unit mass resolution mass spectra were quantified, but either could not be identified or most of the signal at that molecular mass was unaccounted for by identifiable species.

    In a second, "field" phase of this program, airborne and ground-based measurements were made of the emissions from prescribed fires that were mostly located in the same land management units where the fuels for the lab fires were collected. A broad variety, but smaller number of species (21 trace gas species and PM2.5 was measured on 14 fires in chaparral and oak savanna in the southwestern US, as well as pine forest understory in the southeastern US and Sierra Nevada mountains of California. The field measurements of emission factors (EF are useful both for modeling and to examine the representativeness of our lab fire EF. The lab EF/field EF ratio for

  12. Coupling field and laboratory measurements to estimate the emission factors of identified and unidentified trace gases for prescribed fires

    Directory of Open Access Journals (Sweden)

    R. J. Yokelson

    2012-08-01

    Full Text Available An extensive program of experiments focused on biomass burning emissions began with a laboratory phase in which vegetative fuels commonly consumed in prescribed fires were collected in the southeastern and southwestern US and burned in a series of 71 fires at the US Forest Service Fire Sciences Laboratory in Missoula, Montana. The particulate matter (PM2.5) emissions were measured by gravimetric filter sampling with subsequent analysis for elemental carbon (EC), organic carbon (OC), and 38 elements. The trace gas emissions were measured by an open-path Fourier transform infrared (OP-FTIR) spectrometer, proton-transfer-reaction mass spectrometry (PTR-MS), proton-transfer ion-trap mass spectrometry (PIT-MS), negative-ion proton-transfer chemical-ionization mass spectrometry (NI-PT-CIMS), and gas chromatography with MS detection (GC-MS). 204 trace gas species (mostly non-methane organic compounds, NMOC) were identified and quantified with the above instruments. Many of the 182 species quantified by the GC-MS have rarely, if ever, been measured in smoke before. An additional 153 significant peaks in the unit mass resolution mass spectra were quantified, but either could not be identified or most of the signal at that molecular mass was unaccounted for by identifiable species.

    In a second, "field" phase of this program, airborne and ground-based measurements were made of the emissions from prescribed fires that were mostly located in the same land management units where the fuels for the lab fires were collected. A broad variety, but smaller number of species (21 trace gas species and PM2.5 was measured on 14 fires in chaparral and oak savanna in the southwestern US, as well as pine forest understory in the southeastern US and Sierra Nevada mountains of California. The field measurements of emission factors (EF are useful both for modeling and to examine the representativeness of our lab fire EF. The lab EF/field EF ratio for

  13. Coupling field and laboratory measurements to estimate the emission factors of identified and unidentified trace gases for prescribed fires

    Energy Technology Data Exchange (ETDEWEB)

    Yokelson, R. J.; Burling, I. R.; Gilman, J. B.; Warneke, C.; Stockwell, C. E.; de Gouw, J.; Akagi, S. K.; Urbanski, S. P.; Veres, P.; Roberts, J. M.; Kuster, W. C.; Reardon, J.; Griffith, D. W. T.; Johnson, T. J.; Hosseini, S.; Miller, J. W.; Cocker III, D. R.; Jung, H.; Weise, D. R.

    2013-01-01

    Vegetative fuels commonly consumed in prescribed fires were collected from five locations in the southeastern and southwestern U.S. and burned in a series of 77 fires at the U.S. Forest Service Fire Sciences Laboratory in Missoula, Montana. The particulate matter (PM2.5) emissions were measured by gravimetric filter sampling with subsequent analysis for elemental carbon (EC), organic carbon (OC), and 38 elements. The trace gas emissions were measured with a large suite of state-of-the-art instrumentation including an open-path Fourier transform infrared (OP FTIR) spectrometer, proton-transfer-reaction mass spectrometry (PTR-MS), proton-transfer ion-trap mass spectrometry (PIT-MS), negative-ion proton-transfer chemical-ionization mass spectrometry (NI-PT-CIMS), and gas chromatography with MS detection (GC-MS). 204 trace gas species (mostly non-methane organic compounds (NMOC)) were identified and quantified with the above instruments. An additional 152 significant peaks in the unit mass resolution mass spectra were quantified, but either could not be identified or most of the signal at that molecular mass was unaccounted for by identifiable species. As phase II of this study, we conducted airborne and ground-based sampling of the emissions from real prescribed fires mostly in the same land management units where the fuels for the lab fires were collected. A broad variety, but smaller number of species (21 trace gas species and PM2.5) was measured on 14 fires in chaparral and oak savanna in the southwestern US, as well as pine forest understory in the southeastern US and Sierra Nevada mountains of California. These extensive field measurements of emission factors (EF) for temperate biomass burning are useful both for modeling and to examine the representativeness of our lab fire EF. The lab/field EF ratio for the pine understory fuels was not statistically different from one, on average. However, our lab EF for “smoldering compounds” emitted by burning the semi
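
    Emission factors in this literature are typically derived by the carbon mass balance method, in which the fuel carbon fraction is apportioned among the measured carbon-containing species according to their excess mixing ratios. A minimal sketch under that assumption (all numbers below are hypothetical, and F_C = 0.5 is only a typical fuel carbon fraction):

      # Carbon mass balance emission factors, g of species per kg of dry fuel:
      # EF_x = F_C * 1000 * (MW_x / (12.011 * nC_x)) * (carbon fraction as x).
      species = {  # name: (molar mass g/mol, C atoms, excess mixing ratio ppbv)
          "CO2": (44.01, 1, 100000.0),
          "CO":  (28.01, 1, 8000.0),
          "CH4": (16.04, 1, 400.0),
      }
      F_C = 0.5  # typical carbon mass fraction of dry biomass fuel

      total_c = sum(n_c * dx for (_, n_c, dx) in species.values())
      for name, (mw, n_c, dx) in species.items():
          carbon_frac = n_c * dx / total_c  # fraction of emitted carbon as x
          ef = F_C * 1000.0 * (mw / (12.011 * n_c)) * carbon_frac
          print(f"EF({name}) = {ef:.1f} g/kg")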

  14. Identifying and prioritizing the factors influencing the success of science and technology foresight in the field of economy

    Directory of Open Access Journals (Sweden)

    Afsaneh Raieninezhad

    2014-08-01

    Full Text Available Given an increasingly complex global environment and the tremendous growth of network communication technology worldwide, strategic planning and foresight activities in science and technology have become very important. Organizations and businesses are gradually realizing the importance of foresight, and many organizations attempt to carry out such activities. However, this concept is still not well known in our country and among our organizations. Therefore, recognizing the factors influencing the success of this concept is an issue that organizations and practitioners face. This research thus seeks to identify and rank those factors, particularly in the area of the economy, and develops five hypotheses. In this paper, the factors affecting the success of foresight are grouped into four categories: rationale, structure, scope, and results. Data for this study were collected with a questionnaire, and binomial tests, Pearson correlation and the Friedman test were used to test the hypotheses. According to the analysis of the questionnaire data, conducted with SPSS software, all research hypotheses were confirmed. It also became clear that the rationale component has the greatest impact on the success of science and technology foresight in the field of the economy.

  15. Identifying and preventing medical errors in patients with limited English proficiency: key findings and tools for the field.

    Science.gov (United States)

    Wasserman, Melanie; Renfrew, Megan R; Green, Alexander R; Lopez, Lenny; Tan-McGrory, Aswita; Brach, Cindy; Betancourt, Joseph R

    2014-01-01

    Since the 1999 Institute of Medicine (IOM) report To Err is Human, progress has been made in patient safety, but few efforts have focused on safety in patients with limited English proficiency (LEP). This article describes the development, content, and testing of two new evidence-based Agency for Healthcare Research and Quality (AHRQ) tools for LEP patient safety. In the content development phase, a comprehensive mixed-methods approach was used to identify common causes of errors for LEP patients, high-risk scenarios, and evidence-based strategies to address them. Based on our findings, Improving Patient Safety Systems for Limited English Proficient Patients: A Guide for Hospitals contains recommendations to improve detection and prevention of medical errors across diverse populations, and TeamSTEPPS Enhancing Safety for Patients with Limited English Proficiency Module trains staff to improve safety through team communication and incorporating interpreters in the care process. The Hospital Guide was validated with leaders in quality and safety at diverse hospitals, and the TeamSTEPPS LEP module was field-tested in varied settings within three hospitals. Both tools were found to be implementable, acceptable to their audiences, and conducive to learning. Further research on the impact of the combined use of the guide and module would shed light on their value as a multifaceted intervention.

  16. Effect of auditory deafferentation on the synaptic connectivity of a pair of identified interneurons in adult field crickets.

    Science.gov (United States)

    Brodfuehrer, P D; Hoy, R R

    1988-01-01

    In adult crickets, Teleogryllus oceanicus, unilateral auditory deafferentation causes the medial dendrites of an afferent-deprived, identified auditory interneuron (Int-1) in the prothoracic ganglion to sprout and form new functional connections in the contralateral auditory neuropil. The establishment of these new functional connections by the deafferented Int-1, however, does not appear to affect the physiological responses of Int-1's homolog on the intact side of the prothoracic ganglion which also innervates this auditory neuropil. Thus it appears that the sprouting dendrites of the deafferented Int-1 are not functionally competing with those of the intact Int-1 for synaptic connections in the remaining auditory neuropil following unilateral deafferentation in adult crickets. Moreover, we demonstrate that auditory function is restored to the afferent-deprived Int-1 within 4-6 days following deafferentation, when few branches of Int-1's medial dendrites can be seen to have sprouted. The strength of the physiological responses and extent of dendritic sprouting in the deafferented Int-1 progressively increase with time following deafferentation. By 28 days following deafferentation, most of the normal physiological responses of Int-1 to auditory stimuli have been restored in the deafferented Int-1, and the medial dendrites of the deafferented Int-1 have clearly sprouted and grown across into the contralateral auditory afferent field. The strength of the physiological responses of the deafferented Int-1 to auditory stimuli and extent of dendritic sprouting in the deafferented Int-1 are greater in crickets deafferented as juveniles than as adults. Thus, neuronal plasticity persists in Int-1 following sensory deprivation from the earliest juvenile stages through adulthood.

  17. Development and testing of a photometric method to identify non-operating solar hot water systems in field settings.

    Energy Technology Data Exchange (ETDEWEB)

    He, Hongbo (University of New Mexico, Albuquerque, NM); Vorobieff, Peter V. (University of New Mexico, Albuquerque, NM); Menicucci, David (University of New Mexico, Albuquerque, NM); Mammoli, Andrea A. (University of New Mexico, Albuquerque, NM); Carlson, Jeffrey J.

    2012-06-01

    This report presents the results of experimental tests of a concept for using infrared (IR) photos to identify non-operational systems based on their glazing temperatures; operating systems have lower glazing temperatures than those in stagnation. In recent years thousands of new solar hot water (SHW) systems have been installed in some utility districts. As these numbers increase, concern is growing about the systems' dependability, because installation rebates are often based on the assumption that all of the SHW systems will perform flawlessly for a 20-year period. If SHW systems routinely fail prematurely, then the utilities will have overpaid for grid-energy reduction performance that is unrealized. Moreover, utilities are responsible for replacing the energy for loads that failed SHW systems were supplying. Thus, utilities are seeking data to quantify the reliability of SHW systems. The work described herein is intended to help meet this need. The details of the experiment are presented, including a description of the SHW collectors that were examined, the testbed that was used to control the system and record data, the IR camera that was employed, and the conditions in which testing was completed. The details of the associated analysis are presented, including direct examination of the video records of operational and stagnant collectors, as well as the development of a model to predict glazing temperatures and an analysis of temporal intermittency of the images, both of which are critical to properly adjusting the IR camera for optimal performance. Many IR images and a video are presented to show the contrast between operating and stagnant collectors. The major conclusion is that the technique has the potential to be applied using an aircraft fitted with an IR camera that flies over an area with installed SHW systems, recording the images. Subsequent analysis of the images can determine the operational condition of the fielded collectors. Specific

  18. Automatic Fixture Planning

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Fixture planning is a crucial problem in the field of fixture design. In this paper, the research scope and research methods of computer-aided fixture planning are presented. Based on the positioning principles of typical workparts, an ANN algorithm, namely the Hopfield algorithm, is adopted for automatic fixture planning. The paper also investigates in depth the selection of positioning and clamping surfaces (or points) on workparts, using positioning-clamping-surface-selection rules and a matrix evaluation of deterministic workpart positioning. Finally, methods to select positioning and clamping elements from a database, and a layout algorithm to assemble the selected fixture elements into a tangible fixture, are developed.
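
    Although the abstract names the Hopfield algorithm without giving code, its core is an energy function minimized by local updates. Below is a minimal, self-contained sketch of a discrete Hopfield network (Hebbian storage, asynchronous updates, keep-on-tie); the bipolar patterns are arbitrary stand-ins, not the paper's fixture-planning encoding.

```python
import numpy as np

def train(patterns):
    """Store bipolar patterns with the Hebbian rule W += p p^T."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)                  # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=200, seed=0):
    """Asynchronous updates; each accepted flip lowers the network energy."""
    rng = np.random.default_rng(seed)
    state = state.copy()
    for _ in range(steps):
        i = rng.integers(len(state))
        h = W[i] @ state                    # local field at neuron i
        if h != 0:
            state[i] = 1 if h > 0 else -1
    return state

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1]])
W = train(patterns)
noisy = np.array([1, -1, 1, -1, -1, -1])    # corrupted copy of pattern 0
print(recall(W, noisy))                     # settles back onto pattern 0
```

    In a fixture-planning encoding, each neuron would stand for a candidate positioning or clamping choice and the weights would encode their compatibility, but that mapping is specific to the paper.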

  19. THEORETICAL CONSIDERATIONS REGARDING THE AUTOMATIC FISCAL STABILIZERS OPERATING MECHANISM

    Directory of Open Access Journals (Sweden)

    Gondor Mihaela

    2012-07-01

    Full Text Available This paper examines the role of Automatic Fiscal Stabilizers (AFS) in stabilizing the cyclical fluctuations of macroeconomic output as an alternative to discretionary fiscal policy, acknowledging their considerable potential as an anti-crisis solution. The objectives of the study are to identify the general features of the concept of automatic fiscal stabilizers and to assess them logically from an economic perspective. Based on the literature in the field, this paper points out the disadvantages of discretionary fiscal policy and argues the need for Automatic Fiscal Stabilizers in order to provide faster decision making, shielded from political interference, and reduced uncertainty for households and the business environment. The paper concludes that fiscal policy should be used for smoothing the economic cycle, but in a way that includes transparency, responsibility and clear operating mechanisms among its features. Based on the research results, the present paper assumes that pro-cyclicality reduces the effectiveness of the Automatic Fiscal Stabilizer and as a result concludes that it is very important to avoid pro-cyclicality in fiscal rule design. Moreover, by committing in advance to specific fiscal policy actions contingent on economic developments, uncertainty about the fiscal policy framework during a recession should be reduced. Being based on logical analysis rather than empirical, contextualized analysis, the paper presents some features of the AFS operating mechanism and also identifies and systematizes the factors which determine its importance and national individuality. Reaching a common understanding of the Automatic Fiscal Stabilizer concept as an institutional device for smoothing the gaps of the economic cycles across different countries, particularly for the European Union Member States, will facilitate efforts to coordinate fiscal policy responses during a crisis, especially in the context of the fiscal

  20. UMLS-based automatic image indexing.

    Science.gov (United States)

    Sneiderman, C; Sneiderman, Charles Alan; Demner-Fushman, D; Demner-Fushman, Dina; Fung, K W; Fung, Kin Wah; Bray, B; Bray, Bruce

    2008-01-01

    To date, the most accurate image retrieval techniques rely on textual descriptions of images. Our goal is to automatically generate indexing terms for an image extracted from a biomedical article by identifying Unified Medical Language System (UMLS) concepts in the image caption and its discussion in the text. In a pilot evaluation of the suggested image indexing method by five physicians, a third of the automatically identified index terms were found suitable for indexing.
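
    The record names the approach but not its mechanics. The toy sketch below matches caption text against a tiny concept table; real systems map text to UMLS with tools such as MetaMap, and the three-entry "thesaurus" and its CUIs here are purely illustrative.

```python
# Toy caption-to-concept matcher; the table and CUIs are invented for the demo.
concepts = {
    "chest x-ray": "C0039985",
    "pneumonia":   "C0032285",
    "ct":          "C0040405",
}

def index_caption(caption):
    """Return sorted concept IDs whose terms occur in the caption."""
    text = caption.lower()
    return sorted({cui for term, cui in concepts.items() if term in text})

# Real indexers tokenize and normalize; bare substring matching over-fires.
print(index_caption("Chest X-ray showing right lower lobe pneumonia"))
```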

  1. Automatic speech recognition a deep learning approach

    CERN Document Server

    Yu, Dong

    2015-01-01

    This book summarizes the recent advancement in the field of automatic speech recognition with a focus on discriminative and hierarchical models. It is the first automatic speech recognition book to include comprehensive coverage of recent developments such as conditional random fields and deep learning techniques. It presents insights into, and the theoretical foundations of, a series of recent models, such as the conditional random field, the semi-Markov and hidden conditional random fields, deep neural networks, deep belief networks, and deep stacking models for sequential learning. It also discusses practical considerations of using these models in both acoustic and language modeling for continuous speech recognition.

  2. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing and information extraction in the state-of-the-art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering and advanced signal processing techniques and scientific evaluation methodologies being used in this multidisciplinary field will be part of this exposition. The issues of modeling target signatures in various spectral modalities, LADAR, IR, SAR, high-resolution radar, acoustic, seismic, visible, hyperspectral, in diverse geometric aspects will be addressed. The methods for signal processing and classification will cover concepts such as sensor-adaptive and artificial neural networks, time reversal filt...

  3. UAV-BASED AUTOMATIC TREE GROWTH MEASUREMENT FOR BIOMASS ESTIMATION

    Directory of Open Access Journals (Sweden)

    M. Karpina

    2016-06-01

    Full Text Available Manual in-situ measurements of geometric tree parameters for biomass volume estimation are time-consuming and economically ineffective. Photogrammetric techniques can be deployed in order to automate the measurement procedure. The purpose of the presented work is automatic tree growth estimation based on Unmanned Aircraft Vehicle (UAV) imagery. The experiment was conducted in an agricultural test field with Scots pine canopies. The data was collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week. In situ measurements were correlated with the UAV data acquisition. The correlation aimed at the investigation of optimal conditions for a flight and parameter settings for image acquisition. The collected images are processed in a state-of-the-art tool resulting in the generation of dense 3D point clouds. The algorithm is developed in order to estimate geometric tree parameters from the 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the reference measurements performed manually. The comparison allows for the evaluation of the automatic growth estimation process. The accuracy achieved using UAV photogrammetry for tree height estimation is about 5 cm.
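
    As a rough illustration of the height-calculation step, the sketch below derives per-tree heights from a point cloud given already-detected stem positions; the cloud and stem coordinates are synthetic placeholders, not the paper's data or algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
points = rng.uniform([0, 0, 0], [10, 10, 2], (5000, 3))  # fake (x, y, z) cloud
stems = np.array([[2.0, 3.0], [7.5, 6.0]])               # detected stem (x, y)

def tree_height(points, stem_xy, radius=0.5):
    """Height = top minus ground level among points near the stem axis."""
    d = np.linalg.norm(points[:, :2] - stem_xy, axis=1)
    crown = points[d < radius]
    if crown.size == 0:
        return np.nan
    return crown[:, 2].max() - crown[:, 2].min()

for xy in stems:
    print(xy, "height ~", round(tree_height(points, xy), 2), "m")
```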

  4. Uav-Based Automatic Tree Growth Measurement for Biomass Estimation

    Science.gov (United States)

    Karpina, M.; Jarząbek-Rychard, M.; Tymków, P.; Borkowski, A.

    2016-06-01

    Manual in-situ measurements of geometric tree parameters for biomass volume estimation are time-consuming and economically ineffective. Photogrammetric techniques can be deployed in order to automate the measurement procedure. The purpose of the presented work is automatic tree growth estimation based on Unmanned Aircraft Vehicle (UAV) imagery. The experiment was conducted in an agricultural test field with Scots pine canopies. The data was collected using a Leica Aibotix X6V2 platform equipped with a Nikon D800 camera. Reference geometric parameters of selected sample plants were measured manually each week. In situ measurements were correlated with the UAV data acquisition. The correlation aimed at the investigation of optimal conditions for a flight and parameter settings for image acquisition. The collected images are processed in a state-of-the-art tool resulting in the generation of dense 3D point clouds. The algorithm is developed in order to estimate geometric tree parameters from the 3D points. Stem positions and tree tops are identified automatically in a cross section, followed by the calculation of tree heights. The automatically derived height values are compared to the reference measurements performed manually. The comparison allows for the evaluation of the automatic growth estimation process. The accuracy achieved using UAV photogrammetry for tree height estimation is about 5 cm.

  5. Automatic translation among spoken languages

    Science.gov (United States)

    Walter, Sharon M.; Costigan, Kelly

    1994-01-01

    The Machine Aided Voice Translation (MAVT) system was developed in response to the shortage of experienced military field interrogators with both foreign language proficiency and interrogation skills. Combining speech recognition, machine translation, and speech generation technologies, the MAVT accepts an interrogator's spoken English question and translates it into spoken Spanish. The spoken Spanish response of the potential informant can then be translated into spoken English. Potential military and civilian applications for automatic spoken language translation technology are discussed in this paper.

  6. An automatic approach for 3D registration of CT scans

    Science.gov (United States)

    Hu, Yang; Saber, Eli; Dianat, Sohail; Vantaram, Sreenath Rao; Abhyankar, Vishwas

    2012-03-01

    CT (Computed tomography) is a widely employed imaging modality in the medical field. Normally, a volume of CT scans is prescribed by a doctor when a specific region of the body (typically neck to groin) is suspected of being abnormal. The doctors are required to make professional diagnoses based upon the obtained datasets. In this paper, we propose an automatic registration algorithm that helps healthcare personnel to automatically align corresponding scans from 'Study' to 'Atlas'. The proposed algorithm is capable of aligning both 'Atlas' and 'Study' into the same resolution through 3D interpolation. After retrieving the scanned slice volume in the 'Study' and the corresponding volume in the original 'Atlas' dataset, a 3D cross correlation method is used to identify and register various body parts.
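
    A translation-only toy version of cross-correlation alignment can be written with FFTs; the sketch below recovers a known shift between two synthetic volumes and is, of course, far simpler than a clinical 'Study'-to-'Atlas' pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
atlas = rng.normal(size=(32, 32, 32))
shift = (3, -2, 5)
study = np.roll(atlas, shift, axis=(0, 1, 2))     # synthetic shifted copy

# Circular cross-correlation via the FFT; its peak sits at the shift.
xcorr = np.fft.ifftn(np.fft.fftn(study) * np.conj(np.fft.fftn(atlas))).real
peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
# Peak indices are modulo the volume size; unwrap them to signed shifts.
est = [p if p <= s // 2 else p - s for p, s in zip(peak, xcorr.shape)]
print(est)                                        # -> [3, -2, 5]
```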

  7. Automatic classification of time-variable X-ray sources

    CERN Document Server

    Lo, Kitty K; Murphy, Tara; Gaensler, B M

    2014-01-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the second XMM-Newton serendipitous source catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross validation accuracy of the training data is ~97% on a seven-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest der...
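
    The pipeline can be reproduced in outline with scikit-learn; the synthetic features below merely stand in for the time-series, spectral, and contextual features the paper derives for its 873 training sources.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Stand-in data: 873 sources, 7 classes, 20 features (all synthetic).
X, y = make_classification(n_samples=873, n_features=20, n_informative=10,
                           n_classes=7, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)        # 10-fold cross validation
print(f"mean CV accuracy: {scores.mean():.2f}")
```

    Calling `predict_proba` on unlabeled sources would then yield a probabilistically classified catalog of the kind the abstract describes.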

  8. Automatic Narrow-Deep Feature Recognition for Mould Manufacturing

    Institute of Scientific and Technical Information of China (English)

    Zheng-Ming Chen; Kun-Jin He; Jing Liu

    2011-01-01

    Narrow, long, and deep areas that must be machined by special machining processes usually exist in moulds. To identify such narrow-deep areas automatically, an automatic narrow-deep feature (NF) recognition method is put forward. First, the narrow-deep feature is defined for this field, and a feature hint is extracted from the mould using the characteristics of the narrow-deep feature. Second, the elementary constituent faces (ECF) of a feature are found on the basis of the feature hint. By extending and clipping the ECF, the feature faces are obtained incrementally by geometric reasoning, and the related basic narrow-deep features (BNF) are combined heuristically. The proposed NF recognition method provides an intelligent connection between CAD and CAPP for machining narrow-deep areas in moulds.

  9. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, a progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  10. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday-access media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...
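
    A minimal linear baseline for this task can be put together with scikit-learn; the four-headline "corpus" below is invented, and the paper's datasets and feature sets are far richer.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["scientists confirm vaccine safety in large trial",
         "miracle cure hidden by doctors revealed",
         "central bank raises interest rates by 25 basis points",
         "shocking secret the government does not want you to know"]
labels = [0, 1, 0, 1]                    # 0 = legitimate, 1 = fake

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression())
model.fit(texts, labels)
print(model.predict(["doctors hide this one shocking secret"]))
```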

  11. A comparison of hydroponic and soil-based screening methods to identify salt tolerance in the field in barley

    Science.gov (United States)

    Tavakkoli, Ehsan; Fatehi, Foad; Rengasamy, Pichu; McDonald, Glenn K.

    2012-01-01

    Success in breeding crops for yield and other quantitative traits depends on the use of methods to evaluate genotypes accurately under field conditions. Although many screening criteria have been suggested to distinguish between genotypes for their salt tolerance under controlled environmental conditions, there is a need to test these criteria in the field. In this study, the salt tolerance, ion concentrations, and accumulation of compatible solutes of genotypes of barley with a range of putative salt tolerance were investigated using three growing conditions (hydroponics, soil in pots, and natural saline field). Initially, 60 genotypes of barley were screened for their salt tolerance and uptake of Na+, Cl–, and K+ at 150 mM NaCl and, based on this, a subset of 15 genotypes was selected for testing in pots and in the field. Expression of salt tolerance in saline solution culture was not a reliable indicator of the differences in salt tolerance between barley plants that were evident in saline soil-based comparisons. Significant correlations were observed in the rankings of genotypes on the basis of their grain yield production at a moderately saline field site and their relative shoot growth in pots at ECe 7.2 [Spearman’s rank correlation (rs)=0.79] and ECe 15.3 (rs=0.82) and the crucial parameter of leaf Na+ (rs=0.72) and Cl– (rs=0.82) concentrations at ECe 7.2 dS m−1. This work has established screening procedures that correlated well with grain yield at sites with moderate levels of soil salinity. This study also showed that both salt exclusion and osmotic tolerance are involved in salt tolerance and that the relative importance of these traits may differ with the severity of the salt stress. In soil, ion exclusion tended to be more important at low to moderate levels of stress but osmotic stress became more important at higher stress levels. Salt exclusion coupled with a synthesis of organic solutes were shown to be important components of salt

  12. A comparison of hydroponic and soil-based screening methods to identify salt tolerance in the field in barley.

    Science.gov (United States)

    Tavakkoli, Ehsan; Fatehi, Foad; Rengasamy, Pichu; McDonald, Glenn K

    2012-06-01

    Success in breeding crops for yield and other quantitative traits depends on the use of methods to evaluate genotypes accurately under field conditions. Although many screening criteria have been suggested to distinguish between genotypes for their salt tolerance under controlled environmental conditions, there is a need to test these criteria in the field. In this study, the salt tolerance, ion concentrations, and accumulation of compatible solutes of genotypes of barley with a range of putative salt tolerance were investigated using three growing conditions (hydroponics, soil in pots, and natural saline field). Initially, 60 genotypes of barley were screened for their salt tolerance and uptake of Na+, Cl–, and K+ at 150 mM NaCl and, based on this, a subset of 15 genotypes was selected for testing in pots and in the field. Expression of salt tolerance in saline solution culture was not a reliable indicator of the differences in salt tolerance between barley plants that were evident in saline soil-based comparisons. Significant correlations were observed in the rankings of genotypes on the basis of their grain yield production at a moderately saline field site and their relative shoot growth in pots at ECe 7.2 [Spearman's rank correlation (rs)=0.79] and ECe 15.3 (rs=0.82) and the crucial parameter of leaf Na+ (rs=0.72) and Cl– (rs=0.82) concentrations at ECe 7.2 dS m−1. This work has established screening procedures that correlated well with grain yield at sites with moderate levels of soil salinity. This study also showed that both salt exclusion and osmotic tolerance are involved in salt tolerance and that the relative importance of these traits may differ with the severity of the salt stress. In soil, ion exclusion tended to be more important at low to moderate levels of stress but osmotic stress became more important at higher stress levels. Salt exclusion coupled with a synthesis of organic solutes were shown to be important components of
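
    The statistic doing the work in both versions of this record is Spearman's rank correlation between screen rankings and field yield. A minimal sketch with invented numbers chosen to land near the reported rs values:

```python
from scipy.stats import spearmanr

pot_shoot_growth = [0.91, 0.74, 0.62, 0.55, 0.80, 0.47, 0.69, 0.58]
field_yield      = [3.2, 2.1, 2.6, 1.8, 2.3, 1.5, 2.9, 2.0]   # t/ha, fake

rs, p = spearmanr(pot_shoot_growth, field_yield)
print(f"rs = {rs:.2f}, p = {p:.3f}")      # rs ~ 0.81 for these toy numbers
```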

  13. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least-squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
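
    To make "neither symbolic nor numeric" concrete, here is a textbook forward-mode automatic differentiation sketch using dual numbers; it illustrates the chain-rule propagation the bibliography is about and is not drawn from any listed work.

```python
import math

class Dual:
    """A value paired with its derivative; operations apply the chain rule."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        return Dual(self.val + other.val, self.der + other.der)
    def __mul__(self, other):
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

def sin(x):
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

x = Dual(1.5, 1.0)            # seed dx/dx = 1
y = sin(x * x) + x            # f(x) = sin(x^2) + x
print(y.val, y.der)           # f(1.5) and f'(1.5) = 2x cos(x^2) + 1, exactly
```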

  14. Development and evaluation of an automatically adjusting coarse-grained force field for a β-O-4 type lignin from atomistic simulations

    Science.gov (United States)

    Li, Wenzhuo; Zhao, Yingying; Huang, Shuaiyu; Zhang, Song; Zhang, Lin

    2017-01-01

    The goal of this work was to develop a coarse-grained (CG) model of a β-O-4 type lignin polymer, because of the time-consuming process required to achieve equilibrium for its atomistic model. The automatic adjustment method was used to develop the lignin CG model, which enables easy discrimination between chemically varied polymers. In the process of building the lignin CG model, a sum of n Gaussian functions was obtained by approximating the corresponding atomistic potentials derived from a simple Boltzmann inversion of the distributions of the structural parameters. This allowed the establishment of the potential functions of the CG bond stretching and angular bending. To obtain the potential function of the CG dihedral angle, an algorithm similar to a Fourier progression form was employed together with a nonlinear curve-fitting method. The numerical potentials of the nonbonded portion of the lignin CG model were obtained using a potential inversion iterative method derived from the corresponding atomistic nonbonded distributions. The study results showed that the proposed CG model of lignin agreed well with its atomistic model in terms of the distributions of bond lengths, bending angles, dihedral angles and nonbonded distances between the CG beads. The lignin CG model also reproduced the static and dynamic properties of the atomistic model. The results of the comparative evaluation of the two models suggested that the designed lignin CG model was efficient and reliable.
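
    The first step the abstract names, simple Boltzmann inversion, turns a sampled distribution p(q) of a coarse-grained coordinate into a potential via U(q) = -kB*T*ln p(q). A minimal sketch on a synthetic bond-length distribution (the Gaussian-sum fitting and the iterative nonbonded inversion are omitted):

```python
import numpy as np

kBT = 1.0                                   # reduced units
rng = np.random.default_rng(0)
samples = rng.normal(1.5, 0.1, 100_000)     # fake CG bond-length samples

hist, edges = np.histogram(samples, bins=80, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
mask = hist > 0
U = -kBT * np.log(hist[mask])               # potential of mean force
U -= U.min()                                # fix the arbitrary additive shift
print(round(centers[mask][np.argmin(U)], 2))  # minimum near the mean, 1.5
```

    For a Gaussian distribution the inverted potential is harmonic, which is consistent with the Gaussian-based fitting the abstract describes.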

  15. Identifying Areas of Tension in the Field of Technology-Enhanced Learning: Results of an International Delphi Study

    Science.gov (United States)

    Plesch, Christine; Kaendler, Celia; Rummel, Nikol; Wiedmann, Michael; Spada, Hans

    2013-01-01

    Despite steady progress in research in technology-enhanced learning (TEL), the translation of research findings and technology into educational practices falls short of expectations. We present five Areas of Tension which were identified and evaluated in an international Delphi study on TEL. These tensions might impede a more comprehensive…

  16. A comparison of screening methods to identify waterlogging tolerance in the field in Brassica napus L. during plant ontogeny.

    Directory of Open Access Journals (Sweden)

    Xiling Zou

    Full Text Available Waterlogging tolerance is typically evaluated at a specific development stage, with an implicit assumption that differences in waterlogging tolerance expressed in these systems will result in improved yield performance in the field. It is necessary to examine these criteria in the field. In the present study, three experiments were conducted to screen waterlogging tolerance in 25 rapeseed (Brassica napus L.) varieties at different developmental stages: the seedling establishment stage and the seedling stage in controlled environments, and the maturity stage in the field. Assessments of physiological parameters at the three growth stages suggest that waterlogging tolerance differs at all developmental stages, providing an important basis for breeding more tolerant materials. The results indicated that flash waterlogging restricts plant growth but that growth is restored after removal of the stress. Correlation analysis between the waterlogging tolerance coefficient (WTC) of yield and other traits revealed that the waterlogging tolerance of the genotypes was consistent until maturity, and that good tolerance at the seedling establishment and seedling stages can guarantee tolerance at later stages. Waterlogging-tolerant plants could be selected using some specific traits at any stage, and selection would be most effective at the seedling establishment stage. Thus, our study provides a method for screening waterlogging tolerance that gives a suitable basis for initial selection among large numbers of germplasm accessions or breeding populations and helps verify their potential utility in crop improvement.

  17. Automatic Kurdish Dialects Identification

    Directory of Open Access Journals (Sweden)

    Hossein Hassani

    2016-02-01

    Full Text Available Automatic dialect identification is a necessary language technology for processing multi-dialect languages in which the dialects are linguistically far from each other. This becomes particularly crucial where the dialects are mutually unintelligible. Therefore, to perform computational activities on these languages, the system needs to identify the dialect that is the subject of the process. The Kurdish language encompasses various dialects. It is written using several different scripts, and the language lacks a standard orthography. This situation makes Kurdish dialect identification more interesting and more necessary, both from the research and from the application perspectives. In this research, we have applied a classification method based on supervised machine learning to identify the dialects of Kurdish texts. The research has focused on the two most widely spoken and dominant Kurdish dialects, namely Kurmanji and Sorani. The approach could be applied to the other Kurdish dialects as well, and the method is also applicable to languages which are similar to Kurdish in their dialectal diversity and differences.
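
    A supervised classifier of the kind the paper describes can be sketched with scikit-learn. The six "sentences" below are invented placeholders, not real Kurmanji or Sorani text, so only the pipeline shape is meaningful.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = ["ez diçim malê", "ew diçe bajêr", "min nameyek nivîsî",
         "ême dechin bo mal", "ew dechête shar", "min namêk nûsî"]
labels = ["kurmanji", "kurmanji", "kurmanji",
          "sorani", "sorani", "sorani"]

clf = make_pipeline(CountVectorizer(analyzer="char", ngram_range=(2, 4)),
                    MultinomialNB())
clf.fit(texts, labels)
print(clf.predict(["ez diçim bajêr"]))      # prints the predicted dialect
```

    Character n-grams are a common choice when, as here, the dialects can be written in different scripts and lack a standard orthography.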

  18. Meaning representation for automatic indexing of Arabic texts

    Directory of Open Access Journals (Sweden)

    Bakhouche Abdelali

    2012-11-01

    Full Text Available The aim of indexing is to identify the words that represent the main idea of a paragraph or a specific text. In the framework of meaning representation for automatic natural language processing (NLP) of Arabic, we propose a model based on conceptual vectors. These vectors try to represent the whole set of ideas contained in a textual segment (word, expression, text). The model leans on a modern linguistic conception, semantic field theory. Relying on the semantic relations (synonymy, homonymy) between words, we use these fields to construct a semantic field database and a vector space, and we then calculate the meaning of textual segments in the semantic fields. Finally, we use this model for indexing the text.

  19. Automatic object recognition: critical issues and current approaches

    Science.gov (United States)

    Sadjadi, Firooz A.

    1991-08-01

    Automatic object recognition, with its diverse applications in numerous fields of science and technology, is permeating many aspects of military and civilian industries. This paper gives an overview of the issues confronting the automatic object recognition field and the approaches being used to address these issues.

  20. Automatic Germination Evaluation and Qualitative Analysis of Essential Oil of Mentha × piperita L. under the Influence of High Frequency Pulsatile Electromagnetic and Ultrasound Pulsatile Fields

    Directory of Open Access Journals (Sweden)

    Valentin SINGUREANU

    2015-04-01

    Full Text Available The study illustrates the influence of high-frequency pulsatile electromagnetic fields and ultrasound pulsatile fields on Mentha × piperita L. seed germination and the quality of its essential oil. The physiological role of the above-mentioned experimental factors was considered to be a catalytic base point, improving the germination percentage, SVI (seedling vigor index) and GVI (germination velocity index). All the biometric aspects of the germination process (seed area, seed perimeter, seed development along the x and y radii, radicle length, hypocotyl length) were determined using free, open-source software, consolidating the general idea that scientific communities can improve and perfect open-source projects. The high-frequency pulsatile electromagnetic field (91.75%) and ultrasound pulsatile field (64.75%) experimental variants gave higher germination percentages than the control (47.00%). Following the determination of the main terpenes, the same experimental variants showed high accumulations of menthol, eugenol, thymol, eucalyptol, linalool and other components. These aspects can be scientifically supported by the seedling vigor index values obtained for the high-frequency pulsatile electromagnetic fields (1985.47) and ultrasound pulsatile fields (1480.09), creating the general premises for better development stages in the nursery sector. The raised accumulation of the main therapeutic terpenes in Mentha × piperita L. must be followed up in further studies, in which microscopic imagery of glandular trichomes and their density may lead to more profound conclusions.
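
    The two indices named in the abstract have common textbook definitions, assumed here rather than quoted from the paper: SVI = germination percentage x mean seedling length, and GVI = the sum over days of seeds newly germinated on day i divided by i. With invented counts:

```python
germinated_per_day = [0, 5, 12, 20, 9, 4]   # days 1..6, invented counts
seeds_total = 60
mean_seedling_len_cm = 3.4                  # invented

germination_pct = 100 * sum(germinated_per_day) / seeds_total
svi = germination_pct * mean_seedling_len_cm
gvi = sum(n / day for day, n in enumerate(germinated_per_day, start=1))
print(round(germination_pct, 1), round(svi, 1), round(gvi, 2))
```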

  1. Pulsed field gel electrophoresis identifies an outbreak of Salmonella enterica serotype Montevideo infection associated with a supermarket hot food outlet.

    Science.gov (United States)

    Threlfall, E J; Hampton, M D; Ward, L R; Richardson, I R; Lanser, S; Greener, T

    1999-09-01

    In February 1996 Salmonella enterica serotype Montevideo infection in a patient in the North Tyneside area was attributed to consumption of cooked chicken bought from a supermarket hot food outlet. Isolates from the patient, leftover food, and environmental samples were indistinguishable by pulsed field gel electrophoresis (PFGE). PFGE also demonstrated that an outbreak of infection with S. Montevideo associated with the hot food outlet had occurred in late 1995 and early 1996. This study shows the importance of microbial strain discrimination in outbreak investigations and illustrates the value of close liaison between microbiologists, epidemiologists, and environmental health officers in the control of salmonella outbreaks.

  2. Video-Based Electroshocking Platform to Identify Lamprey Ammocoete Habitats: Field Validation and New Discoveries in the Columbia River Basin

    Energy Technology Data Exchange (ETDEWEB)

    Arntzen, Evan V.; Mueller, Robert P.

    2017-05-04

    A deep water electroshocking platform (DEP), developed to characterize larval lampreys (ammocoetes) and associated habitat in depths up to 15 m, was recently tested in the field. The DEP samples 0.55 m²∙min⁻¹ without requiring ammocoete transport to the surface. Searches were conducted at a known rearing location (mouth of the Wind River, WA) and at locations on the Cowlitz River, WA, where ammocoetes had not previously been found. At the mouth of the Wind River, video-imaged ammocoetes ranged from 50 to 150 mm in water depths between 1.5 m and 4.5 m and were more common in sediments containing organic silt. Ammocoetes (n=137) were detected at 61% of locations sampled (summer) and 50% of the locations sampled (winter). Following the field verification, the DEP was used on the lower 11.7 km of the Cowlitz River, WA. Ammocoetes (n=41) were found with a detection rate of 26% at specific search locations. Cowlitz River sediment containing ammocoetes was also dominated by silt with organic material, often downstream of alluvial bars in water depths from 0.8 to 1.7 m. Test results indicated a high sampling efficiency, favorable detection rates, and little or no impact to ammocoetes and their surrounding benthic environments.

  3. Mediation and Automatization.

    Science.gov (United States)

    Hutchins, Edwin

    This paper discusses the relationship between the mediation of task performance by some structure that is not inherent in the task domain itself and the phenomenon of automatization, in which skilled performance becomes effortless or phenomenologically "automatic" after extensive practice. The use of a common simple explicit mediating…

  4. Automatic Differentiation Package

    Energy Technology Data Exchange (ETDEWEB)

    2007-03-01

    Sacado is an automatic differentiation package for C++ codes using operator overloading and C++ templating. Sacado provides forward, reverse, and Taylor polynomial automatic differentiation classes and utilities for incorporating these classes into C++ codes. Users can compute derivatives of computations arising in engineering and scientific applications, including nonlinear equation solving, time integration, sensitivity analysis, stability analysis, optimization, and uncertainty quantification.

  5. Digital automatic gain control

    Science.gov (United States)

    Uzdy, Z.

    1980-01-01

    Performance analysis, used to evaluate the fitness of several circuits for digital automatic gain control (AGC), indicates that a digital integrator employing a coherent amplitude detector (CAD) is the device best suited for the application. The circuit reduces gain error to half that of conventional analog AGC while making it possible to automatically modify the response of the receiver to match incoming signal conditions.
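
    A digital AGC of the described kind, an integrator driven by the error between a detected output amplitude and a set point, fits in a few lines; all parameters below are illustrative, not taken from the paper.

```python
import numpy as np

x = 0.2 * np.sin(2 * np.pi * 0.01 * np.arange(3000))   # weak input tone

target, mu = 0.64, 0.01      # 0.64 ~ mean |sin| of a unit-amplitude tone
gain, level = 1.0, 0.0
out = np.empty_like(x)
for n, s in enumerate(x):
    y = gain * s
    level += 0.02 * (abs(y) - level)   # one-pole amplitude detector
    gain += mu * (target - level)      # digital integrator closes the loop
    out[n] = y
print(round(gain, 2))                  # rises toward about 1/0.2 = 5
```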

  6. Focusing Automatic Code Inspections

    NARCIS (Netherlands)

    Boogerd, C.J.

    2010-01-01

    Automatic Code Inspection tools help developers in early detection of defects in software. A well-known drawback of many automatic inspection approaches is that they yield too many warnings and require a clearer focus. In this thesis, we provide such focus by proposing two methods to prioritize

  7. Identifying soil management zones in a sugarcane field using proximal sensed electromagnetic induction and gamma-ray spectrometry data

    Science.gov (United States)

    Dennerley, Claire; Huang, Jingyi; Nielson, Rod; Sefton, Michael; Triantafilis, John

    2017-04-01

    Over 70% of the Australian sugarcane industry operates in alluvial-estuarine areas characterised by sodic and infertile soils. There is a need to supply ameliorants, improve fertiliser use, and minimise off-farm pollution to the Great Barrier Reef. Therefore, information is required about the spatial variation in soils, but traditional approaches are cost-prohibitive. Herein we show how a digital soil mapping (DSM) approach can be used to identify soil management zones. In the first instance, ancillary data, including electromagnetic induction and gamma-ray spectrometry data, were collected. Using a fuzzy k-means clustering algorithm, between two and six management zones were identified. Using restricted maximum likelihood (REML) analysis of various topsoil (0-0.3 m) and subsoil (0.6-0.9 m) physical (e.g. clay) and chemical (e.g. exchangeable sodium percentage [ESP], exchangeable calcium and magnesium) properties, three zones were determined by minimising the mean squared prediction error. To manage the moderately sodic topsoil ESP of zones 3A and 3C and the sodic topsoil of zone 3B, different gypsum requirements were prescribed. Lime can also be added differentially to address low exchangeable Ca in zones 3A, 3B and 3C. With regard to exchangeable Mg, zones 3A and 3C do not require any fertiliser, whereas zone 3B requires the addition of a moderate amount. The results were consistent with percentage yield variance, suggesting that the lower yield in zone 3C was due to topsoil sodicity and a strongly sodic subsoil with higher clay content. We conclude that the DSM approach was successful in identifying soil management zones and can be used to improve structural stability and soil fertility.
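
    The zoning step, fuzzy k-means (fuzzy c-means) clustering of the ancillary data, can be sketched compactly; the two-dimensional synthetic data below stand in for the electromagnetic induction and gamma-ray covariates.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(2.0, 0.3, (50, 2)),
               rng.normal([0.0, 3.0], 0.3, (50, 2))])   # three fake "zones"

k, m = 3, 2.0                          # clusters and fuzziness exponent
U = rng.dirichlet(np.ones(k), len(X))  # random initial memberships
for _ in range(100):
    W = U ** m
    centers = (W.T @ X) / W.sum(axis=0)[:, None]
    d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
    U = 1.0 / d ** (2 / (m - 1))
    U /= U.sum(axis=1, keepdims=True)  # membership rows sum to one
print(np.round(centers, 2))            # one centre per management zone
```

    Each site's membership row then says how strongly it belongs to each zone, which is what makes differential gypsum and lime prescriptions possible.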

  8. Genetic Fate Mapping Identifies Second Heart Field Progenitor Cells As a Source of Adipocytes in Arrhythmogenic Right Ventricular Cardiomyopathy

    Science.gov (United States)

    Lombardi, Raffaella; Dong, Jinjiang; Rodriguez, Gabriela; Bell, Achim; Leung, Tack Ki; Schwartz, Robert J.; Willerson, James T.; Brugada, Ramon; Marian, Ali J.

    2009-01-01

    The phenotypic hallmark of arrhythmogenic right ventricular cardiomyopathy, a genetic disease of desmosomal proteins, is fibroadipocytic replacement of the right ventricle. The cellular origin of the excess adipocytes, the responsible mechanism(s), and the basis for the predominant involvement of the right ventricle are unknown. We generated 3 sets of lineage tracer mice regulated by the cardiac lineage promoters α-myosin heavy chain (αMyHC), Nkx2.5, or Mef2C. We conditionally expressed the reporter enhanced yellow fluorescent protein while concomitantly deleting the desmosomal protein desmoplakin in cardiac myocyte lineages using the Cre-LoxP technique. Lineage tracer mice showed excess fibroadiposis and increased numbers of adipocytes in the hearts. Few adipocytes in the hearts of αMyHC-regulated lineage tracer mice, but the majority of adipocytes in the hearts of Nkx2.5- and Mef2C-regulated lineage tracer mice, expressed enhanced yellow fluorescent protein. In addition, rare cells coexpressed adipogenic transcription factors and the second heart field markers Isl1 and Mef2C in the lineage tracer mouse hearts and in human myocardium from patients with arrhythmogenic right ventricular cardiomyopathy. To delineate the responsible mechanism, we generated transgenic mice expressing the desmosomal protein plakoglobin in myocyte lineages. Transgene plakoglobin translocated to the nucleus, as detected by immunoblotting and immunofluorescence staining, and coimmunoprecipitated with Tcf7l2, a canonical Wnt signaling transcription factor. Expression levels of the canonical Wnt/Tcf7l2 targets bone morphogenetic protein 7 and Wnt5b, which promote adipogenesis, were increased, and the expression level of connective tissue growth factor, an inhibitor of adipogenesis, was decreased. We conclude that adipocytes in arrhythmogenic right ventricular cardiomyopathy originate from second heart field cardiac progenitors, which switch to an adipogenic fate because of suppressed canonical Wnt signaling by nuclear

  9. New lipolytic enzymes identified by screening two metagenomic libraries derived from the soil of a winter wheat field

    Directory of Open Access Journals (Sweden)

    Stroobants, A.

    2015-01-01

    Full Text Available Description of the subject. Lipolytic enzymes are widely distributed and fulfil important physiological functions in the microorganisms inhabiting diverse environments. Soils are rich, diversified environments containing microbial communities that remain largely unknown. Objectives. This work aimed to discover new lipolytic enzymes. Method. New enzymes were found by functional screening of two seasonal metagenomic libraries (a winter and a spring library) constructed from an agricultural soil. Screens were performed on 2xYT medium supplemented with 3% lipase reagent. Results. Nineteen positive clones were isolated. Analysis of the corresponding inserts led to the identification of 23 putative lipolytic enzymes (13 from the winter library and 10 from the spring library), displaying between 31% and 62% identity to known enzymes and belonging to seven different families. Conclusions. As the enzymes show low identity to known enzymes, they may display novel biochemical features.

  10. Correlation Between Quadrant Specific Automatic Visual Field Defect and Retinal Nerve Fiber Layer Thickness as Measured by Scanning Laser Polarimetry in Patients With Primary Open Angle Glaucoma

    Directory of Open Access Journals (Sweden)

    Yo-Chen Chang

    2008-05-01

    Full Text Available The purpose of this study was to correlate quadrant-specific Humphrey visual field mean deviation (MD) with retinal nerve fiber layer (RNFL) thickness as measured by scanning laser polarimetry (GDx), and to determine whether there is a difference in the correlation with visual field defect between the Asian normative database provided by GDx (GDx database) and our native normative database (KMU database). In an age-matched study, a control group of 240 normal eyes underwent GDx. Another 60 eyes with visual field defects due to primary open angle glaucoma underwent autoperimetry and GDx examination. First, we compared four GDx measurements between the control and study groups. Next, we divided the visual field into four quadrants (superior, inferior, temporal, nasal) and calculated the MD of each quadrant. We correlated the MD of the superior, inferior and overall visual field with RNFL thickness judged by the two databases (the GDx Asian internal normative database and the database from our control group). GDx detected abnormal RNFL thickness significantly more accurately when using the KMU database (p = 0.0473 for the superior quadrant; p = 0.0074 for the inferior quadrant; p = 0.0011 for average thickness) than when using the GDx database. There was no significant difference in specificity between the two databases. The normal ranges in the GDx internal normative database for Asians are too wide. By using our own GDx normative database, the correlations with the MD of autoperimetry were significantly improved. We suggest that every laboratory and clinic in Asia establish its own normative database for GDx.

  11. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, ‘automaticity’ refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, which is due both to a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: (a) the functional significance of automaticity; (b) the neurophysiology of automaticity; (c) measurement of automaticity; (d) mechanistic factors that compromise automaticity; and (e) strategies for rehabilitation of automaticity.

  12. Strong-Field Breit-Wheeler Pair Production in Short Laser Pulses: Identifying Multiphoton Interference and Carrier-Envelope Phase Effects

    CERN Document Server

    Jansen, Martin J A

    2015-01-01

    The creation of electron-positron pairs by the strong-field Breit-Wheeler process in intense short laser pulses is investigated in the framework of laser-dressed quantum electrodynamics. Regarding laser field parameters in the multiphoton regime, special attention is brought to the energy spectrum of the created particles, which can be reproduced and explained by means of an intuitive model. The model is based on the probabilities of multiphoton events driven by the spectral components of the laser pulse. It allows, in particular, the identification of interferences between different pair production channels, which exhibit a characteristic dependence on the laser carrier-envelope phase.

  13. Strong-field Breit-Wheeler pair production in short laser pulses: Identifying multiphoton interference and carrier-envelope-phase effects

    Science.gov (United States)

    Jansen, Martin J. A.; Müller, Carsten

    2016-03-01

    The creation of electron-positron pairs by the strong-field Breit-Wheeler process in intense short laser pulses is investigated in the framework of laser-dressed quantum electrodynamics. Regarding laser field parameters in the multiphoton regime, special attention is brought to the energy spectrum of the created particles, which can be reproduced and explained by means of an intuitive model. The model is based on the probabilities of multiphoton events driven by the spectral components of the laser pulse. It allows us, in particular, to identify interferences between different pair production channels which exhibit a characteristic dependence on the laser carrier-envelope phase.

  14. Tendinopathy: Investigating the Intersection of Clinical and Animal Research to Identify Progress and Hurdles in the Field

    Science.gov (United States)

    Titan, Ashley; Andarawis-Puri, Nelly

    2017-01-01

    Biological treatments, surgical interventions, and rehabilitation exercises have been successfully used to treat tendinopathy, but the development of effective treatments has been hindered by the lack of mechanistic data regarding the pathogenesis of the disease. While insightful, clinical studies are limited in their capacity to provide data regarding the pathogenesis of tendinopathies, emphasizing the value of animal models and cell culture studies to fill this essential gap in knowledge. Clinical pathological findings from imaging studies or histological analysis are not universal across patients with tendinopathy and have not been clearly associated with the onset of symptoms. There are several unresolved controversies, including the cellular changes that accompany the tendinopathic disease state and the role of inflammation. Additional research is needed to correlate the manifestations of the disease with its pathogenesis, with the goal of reaching a field-wide consensus on the pathology of the disease state. Such a consensus will allow standardized clinical practices to more effectively diagnose and treat tendinopathy. PMID:27792676

  15. Identifying and Interpreting Stratification in Sedimentary Rocks on Mars: Insight from Rover and Orbital Observations and Terrestrial Field Analogs

    Science.gov (United States)

    Edgar, Lauren A.

    Sedimentary rocks on Mars provide insight into past aqueous and atmospheric processes, climate regimes, and potential habitability. The stratigraphic architecture of sedimentary rocks on Mars is similar to that of Earth, indicating that the processes that govern deposition and erosion on Mars can be reasonably inferred through reference to analogous terrestrial systems. This dissertation aims to understand Martian surface processes through the use of (1) ground-based observations from the Mars Exploration Rovers, (2) orbital data from the High Resolution Imaging Science Experiment onboard the Mars Reconnaissance Orbiter, and (3) the use of terrestrial field analogs to understand bedforms and sediment transport on Mars. Chapters 1 and 2 trace the history of aqueous activity at Meridiani Planum, through the reconstruction of eolian bedforms at Victoria crater, and the identification of a potential mudstone facies at Santa Maria crater. Chapter 3 uses Terrestrial Laser Scanning to study cross-bedding in pyroclastic surge deposits on Earth in order to understand sediment transport in these events and to establish criteria for their identification on Mars. The final chapter analyzes stratal geometries in the Martian North Polar Layered Deposits using tools for sequence stratigraphic analysis, to better constrain past surface processes and past climate conditions on Mars.

  16. Automatic tracking sensor camera system

    Science.gov (United States)

    Tsuda, Takao; Kato, Daiichiro; Ishikawa, Akio; Inoue, Seiki

    2001-04-01

    We are developing a sensor camera system for automatically tracking and determining the positions of subjects moving in three dimensions. The system is intended to operate even within areas as large as soccer fields. The system measures the 3D coordinates of the object while driving the pan and tilt movements of the camera heads and the degree of zoom of the lenses. Its principal feature is that it automatically zooms in as the object moves farther away and out as the object moves closer, maintaining the area of the object as a fixed proportion of the image. This feature makes stable detection by image processing possible. We are planning to use the system to detect the position of a soccer ball during a soccer game. In this paper, we describe the configuration of the automatic tracking sensor camera system under development. We then give an analysis of the movements of the ball within images of games, the results of experiments on the method of image processing used to detect the ball, and the results of other experiments to verify the accuracy of an experimental system. These results show that the system is sufficiently accurate in terms of obtaining positions in three dimensions.
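
    Under a pinhole-camera assumption (standard, though not stated in the abstract), keeping the imaged object size fixed means scaling the focal length linearly with distance; the numbers below are illustrative.

```python
def focal_length_mm(distance_m, object_size_m=0.22, image_size_mm=4.0):
    """Focal length that keeps a 0.22 m ball imaged at 4 mm on the sensor."""
    # Pinhole model: image_size = f * object_size / distance.
    return image_size_mm * distance_m / object_size_m

for d in (10, 30, 60):                       # metres
    print(d, "m ->", round(focal_length_mm(d), 1), "mm")
```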

  17. 2nd International Conference on Mechatronics and Automatic Control

    CERN Document Server

    2015-01-01

    This book examines mechatronics and automatic control systems, covering important emerging topics in signal processing, control theory, sensors, mechanical manufacturing systems, and automation. It presents papers from the Second International Conference on Mechatronics and Automatic Control Systems, held in Beijing, China, on September 20-21, 2014, and examines how to improve productivity through the latest advanced technologies, covering new systems and techniques in the broad field of mechatronics and automatic control systems.

  18. Word Automaticity of Tree Automatic Scattered Linear Orderings Is Decidable

    CERN Document Server

    Huschenbett, Martin

    2012-01-01

    A tree automatic structure is a structure whose domain can be encoded by a regular tree language such that each relation is recognisable by a finite automaton processing tuples of trees synchronously. Words can be regarded as specific simple trees and a structure is word automatic if it is encodable using only these trees. The question naturally arises whether a given tree automatic structure is already word automatic. We prove that this problem is decidable for tree automatic scattered linear orderings. Moreover, we show that in case of a positive answer a word automatic presentation is computable from the tree automatic presentation.

  19. A web based semi automatic frame work for astrobiological researches

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Full Text Available Astrobiology addresses the possibility of extraterrestrial life and explores measures towards its recognition. Research in this context is founded upon the premise that indicators of life encountered in space will be recognizable. However, effective recognition can be accomplished through a universal adaptation of life signatures, without restricting attention solely to those attributes that represent local solutions to the challenges of survival. The life indicators should be modelled with reference to temporal and environmental variations specific to each planet and time. In this paper, we investigate a semi-automatic open-source framework for the accurate detection and interpretation of life signatures that facilitates public participation, in a way similar to that adopted by the SETI@home project. The involvement of the public in identifying patterns can give the mission added momentum and is implemented using a semi-automatic framework. Different advanced intelligent methodologies may augment the integration of this human-machine analysis. Automatic and manual evaluations, along with a dynamic learning strategy, have been adopted to provide accurate results. The system also helps to provide a deeper public understanding of space agencies' work and facilitates mass involvement in astrobiological studies. It will surely help to motivate young eager minds to pursue a career in this field.

  20. Automatic Image-Based Pencil Sketch Rendering

    Institute of Scientific and Technical Information of China (English)

    王进; 鲍虎军; 周伟华; 彭群生; 徐迎庆

    2002-01-01

    This paper presents an automatic image-based approach for converting greyscale images to pencil sketches, in which strokes follow the image features. The algorithm first extracts a dense direction field automatically using Logical/Linear operators, which embody the drawing mechanism. Next, a reconstruction approach based on a sampling-and-interpolation scheme is introduced to generate stroke paths from the direction field. Finally, pencil strokes are rendered along the specified paths with consideration of image tone and artificial illumination. As an important application, the technique is applied to render portraits from images with little user interaction. The experimental results demonstrate that the approach can automatically achieve compelling pencil sketches from reference images.
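
    Only the first stage lends itself to a generic sketch: estimating a stroke direction field perpendicular to the image gradient. Sobel filters stand in below for the paper's Logical/Linear operators.

```python
import numpy as np
from scipy import ndimage

img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0                    # toy greyscale image: a square

gx = ndimage.sobel(img, axis=1)            # horizontal gradient
gy = ndimage.sobel(img, axis=0)            # vertical gradient
theta = np.arctan2(gy, gx) + np.pi / 2     # strokes run along edges
print(round(np.degrees(theta[16, 32]), 1)) # 180.0: horizontal stroke, top edge
```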

  1. Identifying Seismic Risk in the Appalachian Basin Geothermal Play Fairway Analysis Project Using Potential Fields, Seismicity, and the World Stress Map

    Science.gov (United States)

    Horowitz, F. G.

    2015-12-01

    A collaborative effort between Cornell University, Southern Methodist University, and West Virginia University has been sponsored by the US Department of Energy to perform a Geothermal Play Fairway Analysis of the low-temperature direct-use potential of portions of the Appalachian sedimentary basin in New York, Pennsylvania, and West Virginia, abbreviated here as GPFA-AB. One risk factor, of several being analyzed for the GPFA-AB, is whether a candidate location is near an active fault and thereby potentially susceptible to induced seismicity from geothermal operations. Existing fault maps do not share the GPFA-AB boundaries or scale; hence, their use leads to problems of uneven coverage, varying interpretation of faults vs. lineaments, and different mapping scales. For more uniformity across the GPFA-AB region, we use an analysis of gravity and magnetic fields. Multiscale edge Poisson wavelet analyses of potential fields ("worms") have a physical interpretation as the locations of lateral boundaries in a source distribution that exactly generates the observed field. Not all worms are faults, and of faults, only a subset might be active. Also, worms are only sensitive to steeply dipping structures. To identify some active structures, we plot worms and intra-plate earthquakes from the ISC, NEIC, and EarthScope TA catalogs. Worms within a small distance of epicenters are tracked spatially. To within errors in location, this is a sufficient condition to identify structures that might be active faults, which we categorize with higher risk than other structures. Plotting worms together with World Stress Map σ1 directions yields an alternative approach to identifying activatable structures: here, we use worms to identify structures with strikes favorably oriented for failure by Byerlee's law. While this is a necessary criterion for fault activation, it is not a sufficient one, because we lack detailed information about stress magnitudes throughout the GPFA-AB region

  2. Automatic generation of time resolved motion vector fields of coronary arteries and 4D surface extraction using rotational x-ray angiography

    Science.gov (United States)

    Jandt, Uwe; Schäfer, Dirk; Grass, Michael; Rasche, Volker

    2009-01-01

    Rotational coronary angiography provides a multitude of x-ray projections of the contrast agent enhanced coronary arteries along a given trajectory with parallel ECG recording. These data can be used to derive motion information of the coronary arteries including vessel displacement and pulsation. In this paper, a fully automated algorithm to generate 4D motion vector fields for coronary arteries from multi-phase 3D centerline data is presented. The algorithm computes similarity measures of centerline segments at different cardiac phases and defines corresponding centerline segments as those with highest similarity. In order to achieve an excellent matching accuracy, an increasing number of bifurcations is included as reference points in an iterative manner. Based on the motion data, time-dependent vessel surface extraction is performed on the projections without the need of prior reconstruction. The algorithm accuracy is evaluated quantitatively on phantom data. The magnitude of longitudinal errors (parallel to the centerline) reaches approx. 0.50 mm and is thus more than twice as large as the transversal 3D extraction errors of the underlying multi-phase 3D centerline data. It is shown that the algorithm can extract asymmetric stenoses accurately. The feasibility on clinical data is demonstrated on five different cases. The ability of the algorithm to extract time-dependent surface data, e.g. for quantification of pulsating stenosis is demonstrated.

  3. Automatic generation of time resolved motion vector fields of coronary arteries and 4D surface extraction using rotational x-ray angiography

    Energy Technology Data Exchange (ETDEWEB)

    Jandt, Uwe; Schaefer, Dirk; Grass, Michael [Philips Research Europe-Hamburg, Roentgenstr. 24, 22335 Hamburg (Germany); Rasche, Volker [University of Ulm, Department of Internal Medicine II, Robert-Koch-Strasse 8, 89081 Ulm (Germany)], E-mail: ujandt@gmx.de

    2009-01-07

    Rotational coronary angiography provides a multitude of x-ray projections of the contrast agent enhanced coronary arteries along a given trajectory with parallel ECG recording. These data can be used to derive motion information of the coronary arteries including vessel displacement and pulsation. In this paper, a fully automated algorithm to generate 4D motion vector fields for coronary arteries from multi-phase 3D centerline data is presented. The algorithm computes similarity measures of centerline segments at different cardiac phases and defines corresponding centerline segments as those with highest similarity. In order to achieve an excellent matching accuracy, an increasing number of bifurcations is included as reference points in an iterative manner. Based on the motion data, time-dependent vessel surface extraction is performed on the projections without the need of prior reconstruction. The algorithm accuracy is evaluated quantitatively on phantom data. The magnitude of longitudinal errors (parallel to the centerline) reaches approx. 0.50 mm and is thus more than twice as large as the transversal 3D extraction errors of the underlying multi-phase 3D centerline data. It is shown that the algorithm can extract asymmetric stenoses accurately. The feasibility on clinical data is demonstrated on five different cases. The ability of the algorithm to extract time-dependent surface data, e.g. for quantification of pulsating stenosis is demonstrated.

  4. Automatic apparatus and data transmission for field response tests of the ground; Automatisation et teletransmission des donnees pour les tests de reponse du terrain

    Energy Technology Data Exchange (ETDEWEB)

    Laloui, L.; Steinmann, G.

    2004-07-01

    This is the report on the third part of a development started in 1998 at the Swiss Federal Institute of Technology Lausanne (EPFL) in Lausanne, Switzerland. Energy piles are becoming increasingly used as heat exchangers and heat storage devices, as are geothermal probes. Their design and sizing is subject to some uncertainty due to the fact that the planner has to estimate the thermal and mechanical properties of the ground surrounding the piles or probes. The aim of the project was to develop an apparatus for field measurements of the thermal and mechanical properties of an energy pile or a geothermal probe (thermal response tests). In the reported third phase of the project the portable apparatus was equipped with a data transmission device using the Internet. Real-time data acquisition and supervision is now implemented and data processing has been improved. Another goal of the project was to obtain official accreditation of such response tests according to the European standard EN 45000. First operating experience from a test in Lyon, France, is reported.

  5. Designing a Knowledge Base for Automatic Book Classification.

    Science.gov (United States)

    Kim, Jeong-Hyen; Lee, Kyung-Ho

    2002-01-01

    Reports on the design of a knowledge base for an automatic classification in the library science field by using the facet classification principles of colon classification. Discusses inputting titles or key words into the computer to create class numbers through automatic subject recognition and processing title key words. (Author/LRW)

  6. Automatic Positioning System of Small Agricultural Robot

    Science.gov (United States)

    Momot, M. V.; Proskokov, A. V.; Natalchenko, A. S.; Biktimirov, A. S.

    2016-08-01

    The present article discusses automatic positioning systems of agricultural robots used in field work. The existing solutions in this area have been analyzed. The article proposes an original solution, which is easy to implement and is characterized by high-accuracy positioning.

  7. Performance in Magnetic Field of the Bunch and Track Identifier Prototype for the Muon Barrel Trigger: Results of the 2000 Test Beam

    CERN Document Server

    Castellani, Lorenzo; Martinelli, Roberto; Meneguzzo, Anna Teresa; Vanini, Sara; Zotto, Pierluigi

    2001-01-01

    A sample of ASIC prototypes of the first level trigger front-end for the CMS muon barrel drift chambers, the so-called Bunch and Track Identifier, was tested on a Superlayer of a chamber prototype with the final cell design. Tests were performed using a collimated muon beam incident on the Superlayer at several angles. The Superlayer was immersed in a magnetic field parallel to the beam. We report on the performance of the tested prototypes with respect to bunch crossing efficiency, resolution and noise. The impact of different setting parameters of the BTI module has been investigated.

  8. A formal re-description of the cockroach Hebardina concinna anchored on DNA Barcodes confirms wing polymorphism and identifies morphological characters for field identification.

    Directory of Open Access Journals (Sweden)

    Qiaoyun Yue

    Full Text Available BACKGROUND: Hebardina concinna is a domestic pest and potential vector of pathogens throughout East and Southeast Asia, yet identification of this species has been difficult due to a lack of diagnostic morphological characters, and to uncertainty in the relationship between macropterous (long-winged) and brachypterous (small-winged) morphotypes. In insects, male genital structures are typically species-specific and are frequently used to identify species. However, male genital structures in H. concinna had not previously been described, in part due to difficulty in identifying conspecifics. METHODS/PRINCIPAL FINDINGS: We collected 15 putative H. concinna individuals, from Chinese populations, of both wing morphotypes and both sexes and then generated mitochondrial COI (the standard barcode region) and COII sequences from five of these individuals. These confirmed that both morphotypes of both sexes are the same species. We then dissected male genitalia and compared genital structures from macropterous and brachypterous individuals, which we showed to be identical, and present here for the first time a detailed description of H. concinna male genital structures. We also present a complete re-description of the morphological characters of this species, including both wing morphs. CONCLUSIONS/SIGNIFICANCE: This work describes a practical application of DNA barcoding to confirm that putatively polymorphic insects are conspecific and then to identify species-specific characters that can be used in the field to identify individuals and to obviate the delay and cost of returning samples to a laboratory for DNA sequencing.

  9. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of automatic program derivation. The book also includes some papers by members of the IFIP Working Group 2.1, of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers ...

  10. Automatic computerized radiographic identification of cephalometric landmarks.

    Science.gov (United States)

    Rudolph, D J; Sinclair, P M; Coggins, J M

    1998-02-01

    Computerized cephalometric analysis currently requires manual identification of landmark locations. This process is time-consuming and limited in accuracy. The purpose of this study was to develop and test a novel method for automatic computer identification of cephalometric landmarks. Spatial spectroscopy (SS) is a computerized method that identifies image structure on the basis of a convolution of the image with a set of filters followed by a decision method using statistical pattern recognition techniques. By this method, characteristic features are used to recognize anatomic structures. This study compared manual identification on a computer monitor and the SS automatic method for landmark identification on minimum resolution images (0.16 cm2 per pixel). Minimum resolution (defined as the lowest resolution at which a cephalometric structure could be identified) was used to reduce computational time and memory requirements during this development stage of the SS method. Fifteen landmarks were selected on a set of 14 test images. The results showed no statistical difference (p > 0.05) in mean landmark identification errors between manual identification on the computer display and automatic identification using SS. We conclude that SS shows potential for the automatic detection of landmarks, which is an important step in the development of a completely automatic cephalometric analysis.

  11. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS), and presents a recent state of the art. The book shows the main problems of ADS, the difficulties, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches covered are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been developed ...

  12. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations, i.e. automatically controlling the virtual camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot ...

  13. Statistical pattern recognition for automatic writer identification and verification

    NARCIS (Netherlands)

    Bulacu, Marius Lucian

    2007-01-01

    The thesis addresses the problem of automatic person identification using scanned images of handwriting. Identifying the author of a handwritten sample using automatic image-based methods is an interesting pattern recognition problem with direct applicability in the forensic and historic document analysis fields.

  14. A Machine Vision System for Automatically Grading Hardwood Lumber - (Proceedings)

    Science.gov (United States)

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas H. Drayer; Joe G. Tront; Philip A. Araman; Robert L. Brisbon

    1990-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  15. Statistical pattern recognition for automatic writer identification and verification

    NARCIS (Netherlands)

    Bulacu, Marius Lucian

    2007-01-01

    The thesis addresses the problem of automatic person identification using scanned images of handwriting. Identifying the author of a handwritten sample using automatic image-based methods is an interesting pattern recognition problem with direct applicability in the forensic and historic document analysis fields.

  16. Automatic characterization of dynamics in Absence Epilepsy

    DEFF Research Database (Denmark)

    Petersen, Katrine N. H.; Nielsen, Trine N.; Kjær, Troels W.

    2013-01-01

    Dynamics of the spike-wave paroxysms in Childhood Absence Epilepsy (CAE) are automatically characterized using novel approaches. Features are extracted from scalograms formed by Continuous Wavelet Transform (CWT). Detection algorithms are designed to identify an estimate of the temporal development...

  17. Automatic Age Estimation System for Face Images

    Directory of Open Access Journals (Sweden)

    Chin-Teng Lin

    2012-11-01

    Full Text Available Humans are the most important tracking objects in surveillance systems. However, human tracking is not enough to provide the required information for personalized recognition. In this paper, we present a novel and reliable framework for automatic age estimation based on computer vision. It exploits global face features based on the combination of Gabor wavelets and orthogonal locality preserving projections. In addition, the proposed system can extract face aging features automatically in real‐time. This means that the proposed system has more potential in applications compared to other semi‐automatic systems. The results obtained from this novel approach could provide clearer insight for operators in the field of age estimation to develop real‐world applications.

  18. Exploring Automatization Processes.

    Science.gov (United States)

    DeKeyser, Robert M.

    1996-01-01

    Presents the rationale for and the results of a pilot study attempting to document in detail how automatization takes place as the result of different kinds of intensive practice. Results show that reaction times and error rates gradually decline with practice, and the practice effect is skill-specific. (36 references) (CK)

  19. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of input. We describe a system to derive such time bounds automatically using abstract interpretation.

  20. Automatic Dance Lesson Generation

    Science.gov (United States)

    Yang, Yang; Leung, H.; Yue, Lihua; Deng, LiQun

    2012-01-01

    In this paper, an automatic lesson generation system is presented which is suitable in a learning-by-mimicking scenario where the learning objects can be represented as multiattribute time series data. The dance is used as an example in this paper to illustrate the idea. Given a dance motion sequence as the input, the proposed lesson generation…

  1. Automaticity and Reading: Perspectives from the Instance Theory of Automatization.

    Science.gov (United States)

    Logan, Gordon D.

    1997-01-01

    Reviews recent literature on automaticity, defining the criteria that distinguish automatic processing from non-automatic processing, and describing modern theories of the underlying mechanisms. Focuses on evidence from studies of reading and draws implications from theory and data for practical issues in teaching reading. Suggests that…

  2. Automatic classification of time-variable X-ray sources

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M. [Sydney Institute for Astronomy, School of Physics, The University of Sydney, Sydney, NSW 2006 (Australia)

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy of the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
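
    As a rough, non-authoritative illustration of this workflow, the sketch below trains a Random Forest and derives class probabilities and a classification margin for unlabelled sources. The feature matrix and labels are synthetic placeholders, not the 2XMMi-DR2 features.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(873, 12))    # 873 labelled sources (placeholder features)
        y_train = rng.integers(0, 7, size=873)  # 7 classes, as in the paper (placeholder labels)

        clf = RandomForestClassifier(n_estimators=500, random_state=0)
        print("10-fold CV accuracy:", cross_val_score(clf, X_train, y_train, cv=10).mean())

        clf.fit(X_train, y_train)
        X_unknown = rng.normal(size=(411, 12))  # the 411 unclassified sources (placeholders)
        proba = clf.predict_proba(X_unknown)    # probabilistically classified catalog
        # Classification margin: gap between the two highest class probabilities;
        # small margins flag candidate anomalies for closer inspection.
        top2 = np.sort(proba, axis=1)[:, -2:]
        margin = top2[:, 1] - top2[:, 0]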

  3. Automatic Detection of Cyberbullying on Social Media

    OpenAIRE

    Engman, Love

    2016-01-01

    Bullying on social media is a dire problem for many youths, leading to severe health problems. In this thesis we describe the construction of a software prototype capable of automatically identifying bullying comments on the social media platform ASKfm using Natural Language Processing (NLP) and Machine Learning (ML) techniques. State of the art NLP and ML algorithms from previous research are studied and evaluated for the task of identifying bullying comments in a data set from ASKfm. The be...
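
    A minimal sketch of such a pipeline, assuming TF-IDF features and a linear classifier; the thesis evaluates several NLP/ML algorithms, and the toy comments below are invented:

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        comments = ["you are awesome", "nobody likes you, loser"]  # toy examples
        labels = [0, 1]                                            # 1 = bullying

        model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                              LogisticRegression())
        model.fit(comments, labels)
        print(model.predict(["everyone hates you"]))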

  4. Discriminative Chemical Patterns: Automatic and Interactive Design.

    Science.gov (United States)

    Bietz, Stefan; Schomburg, Karen T; Hilbig, Matthias; Rarey, Matthias

    2015-08-24

    The classification of molecules with respect to their inhibiting, activating, or toxicological potential constitutes a central aspect in the field of cheminformatics. Often, a discriminative feature is needed to distinguish two different molecule sets. Besides physicochemical properties, substructures and chemical patterns belong to the descriptors most frequently applied for this purpose. As a commonly used example of this descriptor class, SMARTS strings represent a powerful concept for the representation and processing of abstract chemical patterns. While their usage facilitates a convenient way to apply previously derived classification rules on new molecule sets, the manual generation of useful SMARTS patterns remains a complex and time-consuming process. Here, we introduce SMARTSminer, a new algorithm for the automatic derivation of discriminative SMARTS patterns from preclassified molecule sets. Based on a specially adapted subgraph mining algorithm, SMARTSminer identifies structural features that are frequent in only one of the given molecule classes. In comparison to elemental substructures, it also supports the consideration of general and specific SMARTS features. Furthermore, SMARTSminer is integrated into an interactive pattern editor named SMARTSeditor. This allows for an intuitive visualization on the basis of the SMARTSviewer concept as well as interactive adaption and further improvement of the generated patterns. Additionally, a new molecular matching feature provides an immediate feedback on a pattern's matching behavior across the molecule sets. We demonstrate the utility of the SMARTSminer functionality and its integration into the SMARTSeditor software in several different classification scenarios.
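
    As a hedged illustration of how a mined discriminative SMARTS pattern would be applied to two pre-classified molecule sets (the pattern and molecules below are arbitrary examples, not SMARTSminer output), using RDKit:

        from rdkit import Chem

        pattern = Chem.MolFromSmarts("c1ccccc1[OX2H]")  # phenol-like substructure
        actives = [Chem.MolFromSmiles(s) for s in ("Oc1ccccc1", "Oc1ccc(Cl)cc1")]
        inactives = [Chem.MolFromSmiles(s) for s in ("c1ccccc1", "CCO")]

        # A discriminative pattern should match one class far more often than the other.
        hits_active = sum(m.HasSubstructMatch(pattern) for m in actives)
        hits_inactive = sum(m.HasSubstructMatch(pattern) for m in inactives)
        print(f"matches: {hits_active}/2 actives, {hits_inactive}/2 inactives")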

  5. Natural or Induced: Identifying Natural and Induced Swarms from Pre-production and Co-production Microseismic Catalogs at the Coso Geothermal Field

    Science.gov (United States)

    Schoenball, Martin; Kaven, Joern; Glen, Jonathan M. G.; Davatzes, Nicholas C.

    2015-01-01

    Increased levels of seismicity coinciding with injection of reservoir fluids have prompted interest in methods to distinguish induced from natural seismicity. Discrimination between induced and natural seismicity is especially difficult in areas that have high levels of natural seismicity, such as the geothermal fields at the Salton Sea and Coso, both in California. Both areas show swarm-like sequences that could be related to natural, deep fluid migration as part of the natural hydrothermal system. Therefore, swarms often have spatio-temporal patterns that resemble fluid-induced seismicity, and might possibly share other characteristics. The Coso Geothermal Field and its surroundings is one of the most seismically active areas in California, with a large proportion of its activity occurring as seismic swarms. Here we analyze clustered seismicity in and surrounding the currently produced reservoir comparatively for pre-production and co-production periods. We perform a cluster analysis, based on the inter-event distance in a space-time-energy domain, to identify notable earthquake sequences. For each event j, the closest previous event i is identified and their relationship categorized. If this nearest neighbor's distance is below a threshold based on the local minimum of the bimodal distribution of nearest neighbor distances, then event j is included in the cluster as a child of this parent event i. If it is above the threshold, event j begins a new cluster. This process identifies subsets of events whose nearest neighbor distances and relative timing qualify as a cluster, as well as characterizing the parent-child relationships among events in the cluster. We apply this method to three different catalogs: (1) a two-year microseismic survey of the Coso geothermal area that was acquired before exploration drilling in the area began; (2) the HYS_catalog_2013 that contains 52,000 double-difference relocated events and covers the years 1981 to 2013; and (3) a
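
    A minimal sketch of this linkage rule, assuming a commonly used Zaliapin-style space-time-energy metric; the exact metric, parameters and threshold used for the Coso catalogs may differ:

        import numpy as np

        def link_events(t, x, y, m, d=1.6, b=1.0, eta_thresh=1e-4):
            """t: origin times (sorted), x/y: epicentres in km, m: magnitudes.
            Returns parent[j] = index of the nearest previous event i,
            or -1 if event j starts a new cluster."""
            parent = np.full(len(t), -1)
            for j in range(1, len(t)):
                dt = t[j] - t[:j]                          # inter-event times
                dr = np.hypot(x[j] - x[:j], y[j] - y[:j])  # epicentral distances
                eta = dt * np.maximum(dr, 1e-3) ** d * 10.0 ** (-b * m[:j])
                i = int(np.argmin(eta))                    # nearest neighbour
                if eta[i] < eta_thresh:                    # below the local minimum
                    parent[j] = i                          # j is a child of i
            return parent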

  6. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    Medical ultrasound has been a widely used imaging modality in healthcare platforms for examination, diagnostic purposes, and for real-time guidance during surgery. However, despite the recent advances, medical ultrasound remains the most operator-dependent imaging modality, as it heavily relies on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image ...

  7. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  8. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices.

  10. Document Exploration and Automatic Knowledge Extraction for Unstructured Biomedical Text

    Science.gov (United States)

    Chu, S.; Totaro, G.; Doshi, N.; Thapar, S.; Mattmann, C. A.; Ramirez, P.

    2015-12-01

    We describe our work on building a web-browser based document reader with built-in exploration tool and automatic concept extraction of medical entities for biomedical text. Vast amounts of biomedical information are offered in unstructured text form through scientific publications and R&D reports. Utilizing text mining can help us to mine information and extract relevant knowledge from a plethora of biomedical text. The ability to employ such technologies to aid researchers in coping with information overload is greatly desirable. In recent years, there has been an increased interest in automatic biomedical concept extraction [1, 2] and intelligent PDF reader tools with the ability to search on content and find related articles [3]. Such reader tools are typically desktop applications and are limited to specific platforms. Our goal is to provide researchers with a simple tool to aid them in finding, reading, and exploring documents. Thus, we propose a web-based document explorer, which we called Shangri-Docs, which combines a document reader with automatic concept extraction and highlighting of relevant terms. Shangri-Docs also provides the ability to evaluate a wide variety of document formats (e.g. PDF, Word, PPT, text, etc.) and to exploit the linked nature of the Web and personal content by performing searches on content from public sites (e.g. Wikipedia, PubMed) and private cataloged databases simultaneously. Shangri-Docs utilizes Apache cTAKES (clinical Text Analysis and Knowledge Extraction System) [4] and the Unified Medical Language System (UMLS) to automatically identify and highlight terms and concepts, such as specific symptoms, diseases, drugs, and anatomical sites, mentioned in the text. cTAKES was originally designed specifically to extract information from clinical medical records. Our investigation leads us to extend the automatic knowledge extraction process of cTAKES for the biomedical research domain by improving the ontology guided information extraction

  11. Automatically identifying scatter in fluorescence data using robust techniques

    DEFF Research Database (Denmark)

    Engelen, S.; Frosch, Stina; Hubert, M.

    2007-01-01

    First and second order Rayleigh and Raman scatter is a common problem when fitting Parallel Factor Analysis (PARAFAC) to fluorescence excitation-emission data (EEM). The scatter does not contain any relevant chemical information and does not conform to the low-rank trilinear model. A scatter identification method is developed based on robust statistical methods. The method does not demand any visual inspection of the data prior to modeling, and can handle first and second order Rayleigh scatter as well as Raman scatter in various types of EEM data. The results of the automated scatter identification method were used as input data for three different PARAFAC methods. Firstly, inserting missing values in the scatter regions is tested; secondly, an interpolation of the scatter regions is performed; and finally, the scatter regions are down-weighted. These results show that the PARAFAC method to choose after scatter ...

  12. Automatically identifying characteristic features of non-native English accents

    NARCIS (Netherlands)

    Bloem, Jelke; Wieling, Martijn; Nerbonne, John; Côté, Marie-Hélène; Knooihuizen, Remco; Nerbonne, John

    2016-01-01

    In this work, we demonstrate the application of statistical measures from dialectometry to the study of accented English speech. This new methodology enables a more quantitative approach to the study of accents. Studies on spoken dialect data have shown that a combination of representativeness (the

  13. Automatic stereoscopic system for person recognition

    Science.gov (United States)

    Murynin, Alexander B.; Matveev, Ivan A.; Kuznetsov, Victor D.

    1999-06-01

    A biometric access control system based on identification of the human face is presented. The system developed performs remote measurements of the necessary face features. Two different scenarios of system behavior are implemented. The first one assumes verification of personal data entered by the visitor from a console using a keyboard or card reader. The system functions as an automatic checkpoint that strictly controls the access of different visitors. The other scenario makes it possible to identify visitors without any personal identifier or pass; only the person's biometrics are used to identify the visitor. The recognition system automatically finds the necessary identification information preliminarily stored in the database. Two laboratory models of the recognition system were developed. The models are designed to use different information types and sources. In addition to stereoscopic images inputted to the computer from cameras, the models can use voice data and some physical characteristics of the person, such as height, measured by the imaging system.

  14. Semi-automatic knee cartilage segmentation

    Science.gov (United States)

    Dam, Erik B.; Folkesson, Jenny; Pettersen, Paola C.; Christiansen, Claus

    2006-03-01

    Osteo-Arthritis (OA) is a very common age-related cause of pain and reduced range of motion. A central effect of OA is wear-down of the articular cartilage that otherwise ensures smooth joint motion. Quantification of the cartilage breakdown is central in monitoring disease progression and therefore cartilage segmentation is required. Recent advances allow automatic cartilage segmentation with high accuracy in most cases. However, the automatic methods still fail in some problematic cases. For clinical studies, even if a few failing cases will be averaged out in the overall results, this reduces the mean accuracy and precision and thereby necessitates larger/longer studies. Since the severe OA cases are often most problematic for the automatic methods, there is even a risk that the quantification will introduce a bias in the results. Therefore, interactive inspection and correction of these problematic cases is desirable. For diagnosis on individuals, this is even more crucial since the diagnosis will otherwise simply fail. We introduce and evaluate a semi-automatic cartilage segmentation method combining an automatic pre-segmentation with an interactive step that allows inspection and correction. The automatic step consists of voxel classification based on supervised learning. The interactive step combines a watershed transformation of the original scan with the posterior probability map from the classification step at sub-voxel precision. We evaluate the method for the task of segmenting the tibial cartilage sheet from low-field magnetic resonance imaging (MRI) of knees. The evaluation shows that the combined method allows accurate and highly reproducible correction of the segmentation of even the worst cases in approximately ten minutes of interaction.
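
    One plausible reading of the interactive step, sketched below with synthetic stand-ins for the MRI data: the classifier's posterior probability map seeds watershed markers on the image gradient, so that manual correction amounts to editing seeds.

        import numpy as np
        from skimage.filters import sobel
        from skimage.segmentation import watershed

        scan = np.random.rand(64, 64)       # placeholder 2-D slice of the knee MRI
        posterior = np.random.rand(64, 64)  # placeholder P(cartilage) from the classifier

        markers = np.zeros_like(scan, dtype=int)
        markers[posterior > 0.9] = 2        # confident cartilage seeds
        markers[posterior < 0.1] = 1        # confident background seeds

        # Flood the gradient image from the seeds; editing seeds refines the result.
        labels = watershed(sobel(scan), markers)
        cartilage_mask = labels == 2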

  15. Automatic quantification of iris color

    DEFF Research Database (Denmark)

    Christoffersen, S.; Harder, Stine; Andersen, J. D.

    2012-01-01

    An automatic algorithm to quantify the eye colour and structural information from standard hi-resolution photos of the human iris has been developed. Initially, the major structures in the eye region are identified including the pupil, iris, sclera, and eyelashes. Based on this segmentation...... regions. The result is a blue-brown ratio for each eye. Furthermore, an image clustering approach has been used with promising results. The approach is based on using a sparse dictionary of feature vectors learned from a training set of iris regions. The feature vectors contain both local structural...... is completely data driven and it can divide a group of eye images into classes based on structure, colour or a combination of the two. The methods have been tested on a large set of photos with promising results....

  16. Automatic summarising factors and directions

    CERN Document Server

    Jones, K S

    1998-01-01

    This position paper suggests that progress with automatic summarising demands a better research methodology and a carefully focussed research strategy. In order to develop effective procedures it is necessary to identify and respond to the context factors, i.e. input, purpose, and output factors, that bear on summarising and its evaluation. The paper analyses and illustrates these factors and their implications for evaluation. It then argues that this analysis, together with the state of the art and the intrinsic difficulty of summarising, imply a nearer-term strategy concentrating on shallow, but not surface, text analysis and on indicative summarising. This is illustrated with current work, from which a potentially productive research programme can be developed.

  17. Carrier-phase differential GPS for automatic control of land vehicles

    Science.gov (United States)

    O'Connor, Michael Lee

    Real-time centimeter-level navigation has countless potential applications in land vehicles, including precise topographic field mapping, runway snowplowing in bad weather, and land mine detection and avoidance. Perhaps the most obvious and immediate need for accurate, robust land vehicle sensing is in the guidance and control of agricultural vehicles. Accurate guidance and automatic control of farm vehicles offers many potential advantages; however, previous attempts to automate these vehicles have been unsuccessful due to sensor limitations. With the recent development of real-time carrier-phase differential GPS (CDGPS), a single inexpensive GPS receiver can measure a vehicle's position to within a few centimeters and orientation to fractions of a degree. This ability to provide accurate real-time measurements of multiple vehicle states makes CDGPS ideal for automatic control of vehicles. This work describes the theoretical and experimental work behind the first successfully demonstrated automatic control system for land vehicles based on CDGPS. An extension of pseudolite-based CDGPS initialization methods was explored for land vehicles and demonstrated experimentally. Original land vehicle dynamic models were developed and identified using this innovative sensor. After initial automatic control testing using a Yamaha Fleetmaster golf cart, a centimeter-level, fully autonomous row guidance capability was demonstrated on a John Deere 7800 farm tractor.

  18. An Autonomous Robotic System for Mapping Weeds in Fields

    DEFF Research Database (Denmark)

    Hansen, Karl Damkjær; Garcia Ruiz, Francisco Jose; Kazmi, Wajahat

    2013-01-01

    The ASETA project develops theory and methods for robotic agricultural systems. In ASETA, unmanned aircraft and unmanned ground vehicles are used to automate the task of identifying and removing weeds in sugar beet fields. The framework for a working automatic robotic weeding system is presented...

  19. Using automatic correction software in the field of health sciences (Utilización de software de corrección automática en el campo de las ciencias de la salud)

    Directory of Open Access Journals (Sweden)

    Ferrán Prados

    2010-06-01

    Full Text Available We are living through a period of profound change in university education. The implementation of the Bologna plan has led us to propose new teaching methodologies, to review the role of the student and competency-based assessment, and to incorporate ICT; facts that were unthinkable little more than a decade ago. Among the various computing platforms, those that allow automatic correction of exercises stand out as instruments of great pedagogical interest, since they assess students instantly and provide feedback on their knowledge in the form of a help message or a grade. If we add the power of the Internet to these tools, using an e-learning environment, the result makes it possible to work, correct, evaluate and resolve doubts from anywhere and at any time. This paper presents part of such a platform and the results of its use in the field of health sciences.

  20. Towards automatic calibration of 2-dimensional flood propagation models

    Directory of Open Access Journals (Sweden)

    P. Fabio

    2009-11-01

    Full Text Available Hydraulic models for flood propagation description are an essential tool in many fields, e.g. civil engineering, flood hazard and risk assessments, evaluation of flood control measures, etc. Nowadays there are many models of different complexity available, regarding both the mathematical foundation and the spatial dimensions, and most of them are comparatively easy to operate due to sophisticated tools for model setup and control. However, the calibration of these models is still underdeveloped in contrast to other models like e.g. hydrological models or models used in ecosystem analysis. This has basically two reasons: first, the lack of relevant data against which the models can be calibrated, because flood events are very rarely monitored due to the disturbances inflicted by them and the lack of appropriate measuring equipment in place. Secondly, especially the two-dimensional models are computationally very demanding and therefore the use of available sophisticated automatic calibration procedures is restricted in many cases. This study takes a well documented flood event in August 2002 at the Mulde River in Germany as an example and investigates the most appropriate calibration strategy for a full 2-D hyperbolic finite element model. The model-independent optimiser PEST, which makes automatic calibration possible, is used. The application of the parallel version of the optimiser to the model and calibration data showed that (a) it is possible to use automatic calibration in combination with a 2-D hydraulic model, and (b) equifinality of model parameterisation can also be caused by a too large number of degrees of freedom in the calibration data in contrast to a too simple model setup. In order to improve model calibration and reduce equifinality, a method was developed to identify calibration data with likely errors that obstruct model calibration.

  1. Automatic Detect and Trace of Solar Filaments

    Science.gov (United States)

    Fang, Cheng; Chen, P. F.; Tang, Yu-hua; Hao, Qi; Guo, Yang

    We developed a series of methods to automatically detect and trace solar filaments in solar Hα images. The programs are able not only to recognize filaments and determine their properties, such as the position, the area and other relevant parameters, but also to trace the daily evolution of the filaments. For solar full-disk Hα images, the method consists of three parts: first, preprocessing is applied to correct the original images; second, the Canny edge-detection method is used to detect the filaments; third, filament properties are recognized through morphological operators. For each Hα filament and its barb features, we introduced the unweighted undirected graph concept and adopted Dijkstra's shortest-path algorithm to recognize the filament spine; then we used the polarity inversion line shift method, measuring the polarities on both sides of the filament, to determine the filament axis chirality; finally, we employed the connected components labeling method to identify the barbs and calculated the angle between each barb and the spine to indicate the barb chirality. Our algorithms are applied to the observations from various observatories, including the Optical & Near Infrared Solar Eruption Tracer (ONSET) at Nanjing University, Mauna Loa Solar Observatory (MLSO) and Big Bear Solar Observatory (BBSO). The programs are demonstrated to be effective and efficient. We used our method to automatically process and analyze 3470 images obtained by MLSO from January 1998 to December 2009, and a butterfly diagram of filaments was obtained. It shows that the latitudinal migration of solar filaments has three trends in Solar Cycle 23: the drift velocity was fast from 1998 to the solar maximum; after the solar maximum, it became relatively slow; and after 2006, the migration became divergent, signifying the solar minimum. About 60% of filaments with latitudes larger than 50 degrees migrate towards the polar regions with relatively high velocities, and the latitudinal migrating
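
    A minimal sketch of the detection stage only (Canny edges, morphological closing, connected components); the thresholds, the input file name and the omitted preprocessing are assumptions:

        import cv2

        halpha = cv2.imread("halpha_fulldisk.png", cv2.IMREAD_GRAYSCALE)  # assumed file
        edges = cv2.Canny(halpha, 50, 150)

        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)

        # Connected components yield candidate filaments with position and area.
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(closed)
        filaments = [(tuple(centroids[i]), int(stats[i, cv2.CC_STAT_AREA]))
                     for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > 100]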

  2. Towards automatic classification of all WISE sources

    Science.gov (United States)

    Kurcz, A.; Bilicki, M.; Solarz, A.; Krupa, M.; Pollo, A.; Małek, K.

    2016-07-01

    Context. The Wide-field Infrared Survey Explorer (WISE) has detected hundreds of millions of sources over the entire sky. Classifying them reliably is, however, a challenging task owing to degeneracies in WISE multicolour space and low levels of detection in its two longest-wavelength bandpasses. Simple colour cuts are often not sufficient; for satisfactory levels of completeness and purity, more sophisticated classification methods are needed. Aims: Here we aim to obtain comprehensive and reliable star, galaxy, and quasar catalogues based on automatic source classification in full-sky WISE data. This means that the final classification will employ only parameters available from WISE itself, in particular those which are reliably measured for the majority of sources. Methods: For the automatic classification we applied a supervised machine learning algorithm, support vector machines (SVM). It requires a training sample with relevant classes already identified, and we chose to use the SDSS spectroscopic dataset (DR10) for that purpose. We tested the performance of two kernels used by the classifier, and determined the minimum number of sources in the training set required to achieve stable classification, as well as the minimum dimension of the parameter space. We also tested SVM classification accuracy as a function of extinction and apparent magnitude. Thus, the calibrated classifier was finally applied to all-sky WISE data, flux-limited to 16 mag (Vega) in the 3.4 μm channel. Results: By calibrating on the test data drawn from SDSS, we first established that a polynomial kernel is preferred over a radial one for this particular dataset. Next, using three classification parameters (W1 magnitude, W1-W2 colour, and a differential aperture magnitude) we obtained very good classification efficiency in all the tests. At the bright end, the completeness for stars and galaxies reaches ~95%, deteriorating to ~80% at W1 = 16 mag, while for quasars it stays at a level of
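
    A hedged sketch of the SVM step using the three classification parameters named above; the training values are random placeholders rather than SDSS DR10 cross-matched data:

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        X = rng.normal(size=(1000, 3))     # [W1, W1-W2, differential aperture magnitude]
        y = rng.integers(0, 3, size=1000)  # 0 = star, 1 = galaxy, 2 = quasar (placeholders)

        clf = SVC(kernel="poly", degree=3, probability=True)  # polynomial kernel
        clf.fit(X, y)
        print(clf.predict_proba(X[:5]))    # per-class membership probabilities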

  3. MARZ: Manual and automatic redshifting software

    Science.gov (United States)

    Hinton, S. R.; Davis, Tamara M.; Lidman, C.; Glazebrook, K.; Lewis, G. F.

    2016-04-01

    The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application MARZ with greater usability, flexibility, and the capacity to analyse a wider range of object types than the RUNZ software package previously used for redshifting spectra from 2dF. MARZ is an open-source, client-based, Javascript web-application which provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph to produce high quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines if conforming to the current FITS file standard is not possible. Behind the scenes, a modified version of the AUTOZ cross-correlation algorithm is used to match input spectra against a variety of stellar and galaxy templates, and automatic matching performance for OzDES spectra has increased from 54% (RUNZ) to 91% (MARZ). Spectra not matched correctly by the automatic algorithm can be easily redshifted manually by cycling automatic results, manual template comparison, or marking spectral features.
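
    A minimal sketch of cross-correlation redshifting in the spirit of AUTOZ/MARZ: shift a rest-frame template over a grid of trial redshifts and keep the one maximising the normalised correlation with the observed spectrum. Real pipelines work in log-wavelength and filter/apodise the spectra first; this simplified version is illustrative only.

        import numpy as np

        def best_redshift(obs_wave, obs_flux, tmpl_wave, tmpl_flux,
                          z_grid=np.linspace(0.0, 1.0, 2001)):
            scores = []
            for z in z_grid:
                # Redshift the template onto the observed wavelength grid.
                shifted = np.interp(obs_wave, tmpl_wave * (1.0 + z), tmpl_flux,
                                    left=0.0, right=0.0)
                s = shifted - shifted.mean()
                o = obs_flux - obs_flux.mean()
                denom = np.linalg.norm(s) * np.linalg.norm(o)
                scores.append(np.dot(s, o) / denom if denom > 0 else 0.0)
            return z_grid[int(np.argmax(scores))]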

  4. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction–intention–action loyalty network. Habit strength is conceptualised as a cognitive construct to refer to the psychological aspects of the construct, such as routine, inertia, automaticity, or very little conscious deliberation. The data consist of 2962 consumers participating in a large European survey. The results show that habit strength significantly moderates the association between satisfaction and action loyalty and, respectively, between intended loyalty and action loyalty.

  5. Automatic Configuration in NTP

    Institute of Scientific and Technical Information of China (English)

    Jiang Zongli(蒋宗礼); Xu Binbin

    2003-01-01

    NTP is nowadays the most widely used distributed network time protocol, which aims at synchronizing the clocks of computers in a network and keeping the accuracy and validation of the time information which is transmitted in the network. Without automatic configuration mechanism, the stability and flexibility of the synchronization network built upon NTP protocol are not satisfying. P2P's resource discovery mechanism is used to look for time sources in a synchronization network, and according to the network environment and node's quality, the synchronization network is constructed dynamically.

  6. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of input. We describe a system to derive such time bounds automatically using abstract interpretation. The semantics-based setting makes it possible to prove the correctness of the time bound function. The system can analyse programs in a first-order subset of Lisp and we show how the system can also be used to analyse programs in other languages.

  7. [An automatic system controlled by microcontroller for carotid sinus perfusion].

    Science.gov (United States)

    Yi, X L; Wang, M Y; Fan, Z Z; He, R R

    2001-08-01

    To establish a new method for automatically controlling the carotid perfusion pressure, a cheap, practical automatic perfusion unit based on an AT89C2051 microcontroller was designed. The unit, an LDB-M perfusion pump and the carotid sinus of an animal constituted an automatic perfusion system. This system is able to provide ramp and stepwise up-down perfusion patterns and has been used in baroreflex research. It can ensure the precision and reproducibility of the perfusion pressure curve, and improve the technical level in the corresponding medical field.

  8. Automatic color based reassembly of fragmented images and paintings.

    Science.gov (United States)

    Tsamoura, Efthymia; Pitas, Ioannis

    2010-03-01

    The problem of reassembling image fragments arises in many scientific fields, such as forensics and archaeology. In the field of archaeology, the pictorial excavation findings are almost always in the form of painting fragments. The manual execution of this task is very difficult, as it requires a great amount of time, skill and effort. Thus, the automation of such work is very important and can lead to faster, more efficient painting reassembly and to a significant reduction in the human effort involved. In this paper, an integrated method for automatic color-based 2-D image fragment reassembly is presented. The proposed 2-D reassembly technique is divided into four steps. Initially, the image fragments which are probably spatially adjacent are identified utilizing techniques employed in content-based image retrieval systems. The second operation is to identify the matching contour segments for every retained couple of image fragments, via a dynamic programming technique. The next step is to identify the optimal transformation in order to align the matching contour segments. Many registration techniques have been evaluated to this end. Finally, the overall image is reassembled from its properly aligned fragments. This is achieved via a novel algorithm, which exploits the alignment angles found during the previous step. In each stage, the most robust algorithms having the best performance are investigated and their results are fed to the next step. We have experimented with the proposed method using digitally scanned images of actual torn pieces of paper image prints and produced very satisfactory reassembly results.
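
    A hedged sketch of the second step: scoring how well two fragment contour segments match via dynamic programming over local turning angles. This is a generic sequence-alignment stand-in, not the authors' exact formulation.

        import numpy as np

        def turning_angles(contour):
            """contour: (n, 2) array of boundary points -> local turning angles."""
            v = np.diff(contour, axis=0)
            return np.diff(np.arctan2(v[:, 1], v[:, 0]))

        def dp_match_cost(a, b, gap=0.5):
            """Edit-distance-style alignment cost between two angle sequences;
            lower cost means the contour segments match better."""
            n, m = len(a), len(b)
            D = np.zeros((n + 1, m + 1))
            D[:, 0] = np.arange(n + 1) * gap
            D[0, :] = np.arange(m + 1) * gap
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    sub = abs(a[i - 1] - b[j - 1])
                    D[i, j] = min(D[i - 1, j - 1] + sub,
                                  D[i - 1, j] + gap,
                                  D[i, j - 1] + gap)
            return D[n, m]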

  9. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction–intention–action loyalty network. Habit strength is conceptualised as a cognitive construct to refer to the psychologic......, respectively, between intended loyalty and action loyalty. At high levels of habit strength, consumers are more likely to free up cognitive resources and incline the balance from controlled to routine and automatic-like responses.......This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction–intention–action loyalty network. Habit strength is conceptualised as a cognitive construct to refer to the psychological...... aspects of the construct, such as routine, inertia, automaticity, or very little conscious deliberation. The data consist of 2962 consumers participating in a large European survey. The results show that habit strength significantly moderates the association between satisfaction and action loyalty, and...

  10. Automatic Calculation of Dimension Chains in AutoCAD

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In the course of mechanical part design, process planning and assembly design, we often have to calculate and analyse a dimension chain. Traditionally, a dimension chain is established and calculated manually. With the wide application of computers in the field of mechanical design and manufacture, people began to use computers to acquire and calculate dimension chains automatically. In reported work, a dimension chain can be established and calculated automatically. However, dimension text value ...

  11. Comparison of automatic control systems

    Science.gov (United States)

    Oppelt, W

    1941-01-01

    This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such an analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulator process.

  12. Automatic Caption Generation for Electronics Textbooks

    Directory of Open Access Journals (Sweden)

    Veena Thakur

    2014-12-01

    Full Text Available Automatic or semi-automatic approaches for developing Technology Supported Learning Systems (TSLS) are required to lighten their development cost. The main objective of this paper is to automate the generation of a caption module; it aims at reproducing the way teachers prepare their lessons and the learning material they will use throughout the course. Teachers tend to choose one or more textbooks that cover the contents of their subjects, determine the topics to be addressed, and identify the parts of the textbooks which may be helpful for the students. The caption model describes the entities, attributes, roles and their relationships, plus the constraints that govern the problem domain; it is created in order to represent the vocabulary and key concepts of the problem domain. The caption model also identifies the relationships among all the entities within the scope of the problem domain, and commonly identifies their attributes. It defines a vocabulary and is helpful as a communication tool. DOM-Sortze is a framework that enables the semi-automatic generation of the Caption Module for a technology supported learning system (TSLS) from electronic textbooks. The semi-automatic generation of the Caption Module entails the identification and elicitation of knowledge from the documents, to which end Natural Language Processing (NLP) techniques are combined with ontologies and heuristic reasoning.

  13. Image feature meaning for automatic key-frame extraction

    Science.gov (United States)

    Di Lecce, Vincenzo; Guerriero, Andrea

    2003-12-01

    Video abstraction and summarization, being required in several applications, have directed a number of research efforts towards automatic video analysis techniques. The processes for automatic video analysis are based on the recognition of short sequences of contiguous frames that describe the same scene (shots), and of key frames representing the salient content of the shot. Since effective shot boundary detection techniques exist in the literature, in this paper we focus our attention on key frame extraction techniques that identify the low-level visual features of the frames that best represent the shot content. To evaluate the features' performance, key frames automatically extracted using these features are compared to human operator video annotations.
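
    A minimal sketch of one such low-level-feature picker: within a shot, choose the frame whose colour histogram is closest to the shot's mean histogram. The feature choice is an assumption for illustration; the paper evaluates which features best represent shot content.

        import cv2
        import numpy as np

        def key_frame(frames):
            """frames: list of BGR images from one shot -> index of the key frame."""
            hists = []
            for f in frames:
                h = cv2.calcHist([f], [0, 1, 2], None, [8, 8, 8],
                                 [0, 256, 0, 256, 0, 256])
                hists.append(cv2.normalize(h, h).flatten())
            mean_hist = np.mean(hists, axis=0)
            return int(np.argmin([np.linalg.norm(h - mean_hist) for h in hists]))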

  14. Automatic image cropping for republishing

    Science.gov (United States)

    Cheatle, Phil

    2010-02-01

    Image cropping is an important aspect of creating aesthetically pleasing web pages and repurposing content for different web or printed output layouts. Cropping provides both the possibility of improving the composition of the image, and also the ability to change the aspect ratio of the image to suit the layout design needs of different document or web page formats. This paper presents a method for aesthetically cropping images on the basis of their content. Underlying the approach is a novel segmentation-based saliency method which identifies some regions as "distractions", as an alternative to the conventional "foreground" and "background" classifications. Distractions are a particular problem with typical consumer photos found on social networking websites such as FaceBook, Flickr etc. Automatic cropping is achieved by identifying the main subject area of the image and then using an optimization search to expand this to form an aesthetically pleasing crop. Evaluation of aesthetic functions like auto-crop is difficult as there is no single correct solution. A further contribution of this paper is an automated evaluation method which goes some way towards handling the complexity of aesthetic assessment. This allows crop algorithms to be easily evaluated against a large test set.

  15. Using Hybrid Decision Tree-Hough Transform Approach For Automatic Bank Check Processing

    Directory of Open Access Journals (Sweden)

    Heba A. Elnemr

    2012-05-01

    Full Text Available One of the first steps in the realization of an automatic system for bank check processing is the automatic classification of checks and extraction of the handwritten area. This paper presents a new hybrid method which couples together the statistical color histogram features, the entropy, the energy and the Hough transform to achieve the automatic classification of checks as well as the segmentation and recognition of the various information on the check. The proposed method relies on two stages. First, a two-step classification algorithm is implemented. In the first step, a decision classification tree is built using the entropy, the energy, the logo location and histogram features of colored bank checks. These features are used to classify checks into several groups. Each group may contain one or more types of checks. Therefore, in the second step the bank logo or bank name is matched against its stored template to identify the correct prototype. Second, the Hough transform is utilized to detect lines in the classified checks. These lines are used as indicators of the bank check fields. A group of experiments is performed showing that the proposed technique is promising as regards classifying bank checks and extracting the important fields in the checks.
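
    A hedged sketch of the line-detection stage using OpenCV's probabilistic Hough transform; the input file name and all thresholds are illustrative:

        import cv2
        import numpy as np

        check = cv2.imread("check.png", cv2.IMREAD_GRAYSCALE)  # assumed input image
        edges = cv2.Canny(check, 50, 150)
        lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                                minLineLength=150, maxLineGap=5)

        # Keep near-horizontal lines as indicators of the check's fields.
        field_lines = []
        if lines is not None:
            for x1, y1, x2, y2 in lines[:, 0]:
                if abs(y2 - y1) < 3:
                    field_lines.append((x1, y1, x2, y2))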

  16. Automatic traveltime picking using instantaneous traveltime

    KAUST Repository

    Saragiotis, Christos

    2013-02-08

    Event picking is used in many steps of seismic processing. We present an automatic event picking method that is based on a new attribute of seismic signals, instantaneous traveltime. The calculation of the instantaneous traveltime consists of two separate but interrelated stages. First, a trace is mapped onto the time-frequency domain. Then the time-frequency representation is mapped back onto the time domain by an appropriate operation. The computed instantaneous traveltime equals the recording time at those instances at which there is a seismic event, a feature that is used to pick the events. We analyzed the concept of the instantaneous traveltime and demonstrated the application of our automatic picking method on dynamite and Vibroseis field data.
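
    The instantaneous traveltime attribute itself is defined in the paper; as a loose stand-in that captures only the stated fixed-point property (the computed time equals the recording time at an event), the sketch below compares an envelope-weighted local time centroid with the recording time. Window length and amplitude threshold are ad hoc assumptions, not the authors' algorithm.

```python
# Simplified stand-in picker: pick samples where the envelope-weighted time
# centroid of a local window coincides with the recording time. This mimics
# the fixed-point property described above but is NOT the paper's attribute.
import numpy as np
from scipy.signal import hilbert

def centroid_picker(trace, dt, win=0.2, rel_threshold=0.1):
    """Pick times where the local envelope time-centroid equals the recording time."""
    t = np.arange(len(trace)) * dt
    env = np.abs(hilbert(trace))             # instantaneous amplitude
    half = max(int(win / (2 * dt)), 1)
    picks = []
    for i in range(half, len(trace) - half):
        w = env[i - half:i + half + 1]
        if w.sum() == 0 or env[i] < rel_threshold * env.max():
            continue                          # ignore quiet samples
        local_t = (t[i - half:i + half + 1] * w).sum() / w.sum()
        if abs(local_t - t[i]) < dt / 2:      # fixed point: centroid == time
            picks.append(t[i])
    return picks
```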

  17. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulation items. Compared to manual ultrasonic inspection, automatic inspection has the benefits of more accurate and reliable inspection results and reduced radiation exposure. The final objective of this project is to develop an automatic inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, and a robot control computer which steers the robot to navigate exactly along the inspection path. We expect our system to contribute to reduced inspection time, enhanced performance, and effective management of inspection results. The system developed in this project can be put to practical use for inspection work after field tests. (author)

  18. Automatic detection of aircraft emergency landing sites

    Science.gov (United States)

    Shen, Yu-Fei; Rahman, Zia-ur; Krusienski, Dean; Li, Jiang

    2011-06-01

    An automatic landing site detection algorithm is proposed for aircraft emergency landing. Emergency landing is an unplanned event in response to emergency situations. If, as is unfortunately usually the case, there is no airstrip or airfield that can be reached by the un-powered aircraft, a crash landing or ditching has to be carried out. Identifying a safe landing site is critical to the survival of passengers and crew. Conventionally, the pilot chooses the landing site visually by looking at the terrain through the cockpit. The success of this vital decision greatly depends on the external environmental factors that can impair human vision, and on the pilot's flight experience that can vary significantly among pilots. Therefore, we propose a robust, reliable and efficient algorithm that is expected to alleviate the negative impact of these factors. We present only the detection mechanism of the proposed algorithm and assume that the image enhancement for increased visibility, and image stitching for a larger field-of-view, have already been performed on the images acquired by aircraft-mounted cameras. Specifically, we describe an elastic bound detection method which is designed to position the horizon. The terrain image is divided into non-overlapping blocks which are then clustered according to a "roughness" measure. Adjacent smooth blocks are merged to form potential landing sites whose dimensions are measured with principal component analysis and geometric transformations. If the dimensions of the candidate region exceed the minimum requirement for safe landing, the potential landing site is considered a safe candidate and highlighted on the human machine interface. In the end, the pilot makes the final decision by confirming one of the candidates, also considering other factors such as wind speed and wind direction. Preliminary results show the feasibility of the proposed algorithm.
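
    The block-roughness stage might look like the following sketch, which tiles the terrain image, scores each block by mean gradient magnitude, and keeps the smoothest blocks as landing-site fragments; block size and the quantile cutoff are illustrative, and the merging and PCA measurement steps are omitted.

```python
# Tile the terrain image into non-overlapping blocks, score each block by
# mean gradient magnitude ("roughness"), and keep the smoothest blocks as
# landing-site fragments. Block size and quantile cutoff are illustrative.
import numpy as np

def smooth_blocks(terrain, block=32, quantile=0.25):
    gy, gx = np.gradient(terrain.astype(float))
    rough = np.hypot(gx, gy)
    h, w = terrain.shape
    scores = {}
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            scores[(r, c)] = rough[r:r + block, c:c + block].mean()
    cutoff = np.quantile(list(scores.values()), quantile)
    # adjacent surviving blocks would then be merged and measured against
    # the minimum safe-landing dimensions
    return [pos for pos, s in scores.items() if s <= cutoff]
```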

  19. Automatic summarization of audio-visual soccer feeds

    OpenAIRE

    Chen F; De Vleeschouwer C; Duxans Barrobes H.; Gregorio Escalada J.; Conejero D.

    2010-01-01

    This paper presents a fully automatic system for soccer game summarization. The system takes audio-visual content as an input, and builds on the integration of two independent but complementary contributions (i) to identify crucial periods of the soccer game in a fully automatic way, and (ii) to summarize the soccer game as a function of individual narrative preferences of the user. The process involves both audio and video analysis, and handles the personalized summarization challenge as a r...

  20. Automatic localization of the da Vinci surgical instrument tips in 3-D transrectal ultrasound.

    Science.gov (United States)

    Mohareri, Omid; Ramezani, Mahdi; Adebar, Troy K; Abolmaesumi, Purang; Salcudean, Septimiu E

    2013-09-01

    Robot-assisted laparoscopic radical prostatectomy (RALRP) using the da Vinci surgical system is the current state-of-the-art treatment option for clinically confined prostate cancer. Given the limited field of view of the surgical site in RALRP, several groups have proposed the integration of transrectal ultrasound (TRUS) imaging in the surgical workflow to assist with accurate resection of the prostate and the sparing of the neurovascular bundles (NVBs). We previously introduced a robotic TRUS manipulator and a method for automatically tracking da Vinci surgical instruments with the TRUS imaging plane, in order to facilitate the integration of intraoperative TRUS in RALRP. Rapid and automatic registration of the kinematic frames of the da Vinci surgical system and the robotic TRUS probe manipulator is a critical component of the instrument tracking system. In this paper, we propose a fully automatic registration technique based on automatic 3-D TRUS localization of robot instrument tips pressed against the air-tissue boundary anterior to the prostate. The detection approach uses a multiscale filtering technique to identify and localize surgical instrument tips in the TRUS volume, and could also be used to detect other surface fiducials in 3-D ultrasound. Experiments have been performed using a tissue phantom and two ex vivo tissue samples to show the feasibility of the proposed methods. Also, an initial in vivo evaluation of the system has been carried out on a live anaesthetized dog with a da Vinci Si surgical system and a target registration error (defined as the root mean square distance of corresponding points after registration) of 2.68 mm has been achieved. Results show this method's accuracy and consistency for automatic registration of TRUS images to the da Vinci surgical system.
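
    As a rough analogue of the multiscale filtering step, assuming the ultrasound volume is available as a NumPy array, the sketch below runs a scale-normalised Laplacian-of-Gaussian filter bank and keeps the strongest point-like responses as candidate tips; the scales, neighbourhood size and candidate count are assumptions, not the authors' parameters.

```python
# Scale-normalised Laplacian-of-Gaussian filter bank over a 3-D volume;
# the strongest point-like (blob) responses become candidate tip locations.
import numpy as np
from scipy.ndimage import gaussian_laplace, maximum_filter

def tip_candidates(volume, sigmas=(1.0, 2.0, 3.0), n_candidates=5):
    vol = volume.astype(float)
    # bright point-like structures give strongly negative LoG responses
    response = np.min([s ** 2 * gaussian_laplace(vol, s) for s in sigmas],
                      axis=0)
    is_min = response == -maximum_filter(-response, size=5)  # local minima
    coords = np.argwhere(is_min)
    order = np.argsort(response[is_min])      # most negative response first
    return coords[order[:n_candidates]]
```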

  1. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities...... for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler’s ability to generate loop-parallel code. We use this compilation system to modify two sequential...... benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...

  2. Intelligent Storage System Based on Automatic Identification

    Directory of Open Access Journals (Sweden)

    Kolarovszki Peter

    2014-09-01

    Full Text Available This article describes RFID technology in conjunction with warehouse management systems. The article also deals with automatic identification and data capture technologies and the individual processes used in a warehouse management system. It describes processes from the entry of goods into production to the identification of goods, as well as palletizing, storing, bin transfer and removal of goods from the warehouse. The article focuses on utilizing AMP middleware in WMS processes. Nowadays, the identification of goods in most warehouses is carried out through barcodes. In this article we want to specify how the processes described above can be handled through RFID technology. All results are verified by measurements in our AIDC laboratory, which is located at the University of Žilina, and in the Laboratory of Automatic Identification of Goods and Services located at GS1 Slovakia. The results of our research bring a new point of view and indicate ways of using RFID technology in warehouse management systems.

  3. Towards automatic classification of all WISE sources

    CERN Document Server

    Kurcz, Agnieszka; Solarz, Aleksandra; Krupa, Magdalena; Pollo, Agnieszka; Małek, Katarzyna

    2016-01-01

    The WISE satellite has detected hundreds of millions of sources over the entire sky. Classifying them reliably is, however, a challenging task due to degeneracies in the WISE multicolour space and low levels of detection in its two longest-wavelength bandpasses. Here we aim at obtaining comprehensive and reliable star, galaxy and quasar catalogues based on automatic source classification in full-sky WISE data. This means that the final classification will employ only parameters available from WISE itself, in particular those reliably measured for a majority of sources. For the automatic classification we applied the support vector machines (SVM) algorithm, which requires a training sample with the relevant classes already identified, and we chose to use the SDSS spectroscopic dataset for that purpose. By calibrating the classifier on test data drawn from SDSS, we first established that a polynomial kernel is preferred over a radial one for this particular dataset. Next, using three classification parameters (W1 magnit...
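
    A minimal sketch of such a set-up, assuming scikit-learn purely for illustration (the paper does not prescribe an implementation): an SVM with a polynomial kernel trained on three placeholder parameters against placeholder star/galaxy/quasar labels.

```python
# Polynomial-kernel SVM classification on three parameters. Features and
# labels here are random placeholders standing in for WISE parameters and
# SDSS-derived classes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))      # e.g. a magnitude, a colour, a concentration
y = rng.integers(0, 3, size=1000)   # 0=star, 1=galaxy, 2=quasar (placeholder)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3, C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```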

  4. Automatic Palette Identification of Colored Graphics

    Science.gov (United States)

    Lacroix, Vinciane

    The median-shift, a new clustering algorithm, is proposed to automatically identify the palette of colored graphics, a pre-requisite for graphics vectorization. The median-shift is an iterative process which shifts each data point to the "median" point of its neighborhood defined thanks to a distance measure and a maximum radius, the only parameter of the method. The process is viewed as a graph transformation which converges to a set of clusters made of one or several connected vertices. As the palette identification depends on color perception, the clustering is performed in the L*a*b* feature space. As pixels located on edges are made of mixed colors not expected to be part of the palette, they are removed from the initial data set by an automatic pre-processing. Results are shown on scanned maps and on the Macbeth color chart and compared to well established methods.
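
    The iteration can be sketched as below, assuming the "median point" of a neighbourhood is the component-wise median (the paper's exact definition may differ) and using scikit-image for the L*a*b* conversion; the radius and the random stand-in data are illustrative.

```python
# Median-shift iteration in L*a*b* space, assuming a component-wise median.
import numpy as np
from skimage.color import rgb2lab

def median_shift(points, radius, n_iter=20):
    pts = points.copy()
    for _ in range(n_iter):
        for i in range(len(pts)):
            nbrs = pts[np.linalg.norm(pts - pts[i], axis=1) <= radius]
            pts[i] = np.median(nbrs, axis=0)
    return pts

rgb = np.random.default_rng(1).random((500, 1, 3))  # stand-in for graphic pixels
lab = rgb2lab(rgb).reshape(-1, 3)
shifted = median_shift(lab, radius=10.0)
# points that converged to (nearly) the same location form one palette colour
palette = np.unique(np.round(shifted, 1), axis=0)
print(len(palette), "palette candidates")
```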

  5. Automatic basal slice detection for cardiac analysis

    Science.gov (United States)

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S.

    2016-03-01

    Identification of the basal slice in cardiac imaging is a key step in measuring the ejection fraction (EF) of the left ventricle (LV). Despite research on cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, has been shown to have high inter-observer variability, with variation of the EF by up to 8%. Therefore, an automatic way of identifying the basal slice is still required. Prior published methods operate by automatically tracking the mitral valve points from the long-axis view of the LV. These approaches assume that the basal slice is the first short-axis slice below the mitral valve. However, guidelines published in 2013 by the Society for Cardiovascular Magnetic Resonance indicate that the basal slice is the uppermost short-axis slice with more than 50% myocardium surrounding the blood cavity. Consequently, these existing methods at times identify the incorrect short-axis slice. Correct identification of the basal slice under these guidelines is challenging due to poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that focuses on the two-chamber slice to find the basal slice. To this end, an active shape model is trained to automatically segment the two-chamber view for 51 samples using the leave-one-out strategy. The basal slice is detected using temporal binary profiles created for each short-axis slice from the segmented two-chamber slice. Of the 51 successfully tested samples, 92% and 84% of detection results were accurate at the end-systolic and end-diastolic phases of the cardiac cycle, respectively.

  6. Identifying Factors That Predict Promotion Time to E-4 and Re-Enlistment Eligibility for U.S. Marine Corps Field Radio Operators

    Science.gov (United States)

    2014-12-01

    Thesis by William G. Wathen on factors that predict promotion time to E-4 and re-enlistment eligibility for U.S. Marine Corps field radio operators, including testing for MOS suitability. Subject terms: career assignment, MOS assignment, linear regression. 77 pages.

  7. High-throughput phenotyping (HTP) identifies seedling root traits linked to variation in seed yield and nutrient capture in field-grown oilseed rape (Brassica napus L.).

    Science.gov (United States)

    Thomas, C L; Graham, N S; Hayden, R; Meacham, M C; Neugebauer, K; Nightingale, M; Dupuy, L X; Hammond, J P; White, P J; Broadley, M R

    2016-04-06

    Root traits can be selected for crop improvement. Techniques such as soil excavations can be used to screen root traits in the field, but are limited to genotypes that are well-adapted to field conditions. The aim of this study was to compare a low-cost, high-throughput root phenotyping (HTP) technique in a controlled environment with field performance, using oilseed rape (OSR;Brassica napus) varieties. Primary root length (PRL), lateral root length and lateral root density (LRD) were measured on 14-d-old seedlings of elite OSR varieties (n = 32) using a 'pouch and wick' HTP system (∼40 replicates). Six field experiments were conducted using the same varieties at two UK sites each year for 3 years. Plants were excavated at the 6- to 8-leaf stage for general vigour assessments of roots and shoots in all six experiments, and final seed yield was determined. Leaves were sampled for mineral composition from one of the field experiments. Seedling PRL in the HTP system correlated with seed yield in four out of six (r = 0·50, 0·50, 0·33, 0·49;P HTP systems to screen this trait in both elite and more genetically diverse, non-field-adapted OSR. © The Author 2016. Published by Oxford University Press on behalf of the Annals of Botany Company.

  8. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  9. Automatic aircraft recognition

    Science.gov (United States)

    Hmam, Hatem; Kim, Jijoong

    2002-08-01

    Automatic aircraft recognition is very complex because of clutter, shadows, clouds, self-occlusion and degraded imaging conditions. This paper presents an aircraft recognition system, which assumes from the start that the image is possibly degraded, and implements a number of strategies to overcome edge fragmentation and distortion. The current vision system employs a bottom up approach, where recognition begins by locating image primitives (e.g., lines and corners), which are then combined in an incremental fashion into larger sets of line groupings using knowledge about aircraft, as viewed from a generic viewpoint. Knowledge about aircraft is represented in the form of whole/part shape description and the connectedness property, and is embedded in production rules, which primarily aim at finding instances of the aircraft parts in the image and checking the connectedness property between the parts. Once a match is found, a confidence score is assigned and as evidence in support of an aircraft interpretation is accumulated, the score is increased proportionally. Finally a selection of the resulting image interpretations with the highest scores, is subjected to competition tests, and only non-ambiguous interpretations are allowed to survive. Experimental results demonstrating the effectiveness of the current recognition system are given.

  10. High-throughput phenotyping (HTP) identifies seedling root traits linked to variation in seed yield and nutrient capture in field-grown oilseed rape (Brassica napus L.)

    Science.gov (United States)

    Thomas, C. L.; Graham, N. S.; Hayden, R.; Meacham, M. C.; Neugebauer, K.; Nightingale, M.; Dupuy, L. X.; Hammond, J. P.; White, P. J.; Broadley, M. R.

    2016-01-01

    Background and Aims Root traits can be selected for crop improvement. Techniques such as soil excavations can be used to screen root traits in the field, but are limited to genotypes that are well-adapted to field conditions. The aim of this study was to compare a low-cost, high-throughput root phenotyping (HTP) technique in a controlled environment with field performance, using oilseed rape (OSR; Brassica napus) varieties. Methods Primary root length (PRL), lateral root length and lateral root density (LRD) were measured on 14-d-old seedlings of elite OSR varieties (n = 32) using a ‘pouch and wick’ HTP system (∼40 replicates). Six field experiments were conducted using the same varieties at two UK sites each year for 3 years. Plants were excavated at the 6- to 8-leaf stage for general vigour assessments of roots and shoots in all six experiments, and final seed yield was determined. Leaves were sampled for mineral composition from one of the field experiments. Key Results Seedling PRL in the HTP system correlated with seed yield in four out of six (r = 0·50, 0·50, 0·33, 0·49; P root traits might therefore be of limited additional selection value, given that vigour can be measured easily on shoots/canopies. In contrast, LRD cannot be assessed easily in the field and, if LRD can improve nutrient uptake, then it may be possible to use HTP systems to screen this trait in both elite and more genetically diverse, non-field-adapted OSR. PMID:27052342

  11. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity.This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  12. The Automatic Telescope Network (ATN)

    CERN Document Server

    Mattox, J R

    1999-01-01

    Because of the scheduled GLAST mission by NASA, there is strong scientific justification for preparation for very extensive blazar monitoring in the optical bands to exploit the opportunity to learn about blazars through the correlation of variability of the gamma-ray flux with flux at lower frequencies. Current optical facilities do not provide the required capability. Developments in technology have enabled astronomers to readily deploy automatic telescopes. The effort to create an Automatic Telescope Network (ATN) for blazar monitoring in the GLAST era is described. Other scientific applications of the networks of automatic telescopes are discussed. The potential of the ATN for science education is also discussed.

  13. Using geochemical tracing system to identify new types of gas sources in marine strata of the Hotan River Gas Field in the Tarim Basin

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    There are natural gas sources of various modes of occurrence in superimposed basins. Besides the conventional kerogen and ancient oil reservoir, dispersed soluble organic matter (DSOM) is an important direct gas source. Because of its wide distribution, great potential to generate gas and proneness to crack under catalysis, DSOM is an important type of gas source in the highly evolved zones in marine strata. Through the geological and geochemical analysis that reflects the long-period evolvement and multiple gas accumulation processes in marine strata, and using the ternary geochemical tracing system, here we study the origin and accumulation of the natural gas in the Hotan River Gas Field in the Tarim Basin. The natural gas is produced from highly evolved and cracked DSOM in the Hotan River Gas Field, and it is accumulated after migrating for a long distance along faults. This conclusion is of great significance to the further expansion of the oil and gas exploration fields in the Tarim Basin.

  14. Personality in speech assessment and automatic classification

    CERN Document Server

    Polzehl, Tim

    2015-01-01

    This work combines interdisciplinary knowledge and experience from research fields of psychology, linguistics, audio-processing, machine learning, and computer science. The work systematically explores a novel research topic devoted to automated modeling of personality expression from speech. For this aim, it introduces a novel personality assessment questionnaire and presents the results of extensive labeling sessions to annotate the speech data with personality assessments. It provides estimates of the Big 5 personality traits, i.e. openness, conscientiousness, extroversion, agreeableness, and neuroticism. Based on a database built on the questionnaire, the book presents models to tell apart different personality types or classes from speech automatically.

  15. Capacitance sensor for automatic soil retreat measurements

    Institute of Scientific and Technical Information of China (English)

    GU Jun; YANG Juan; YIN Wu-liang; WANG Chao; WANG Hua-xiang; LIU Ze; CHENG Su-sen

    2008-01-01

    To continuously monitor soil retreat due to erosion in the field, provide valuable information about erosion processes, and overcome the inefficiency, high time consumption and labor intensity of existing methods, this paper describes a novel capacitance sensor for measuring soil retreat. A capacitance-sensor-based probe is proposed, which can automatically measure the depth of the soil around it, and the data can be recorded by a data logger. Experimental results in the lab verify its usefulness.

  16. Automatic Sarcasm Detection in Twitter Messages

    OpenAIRE

    Ræder, Johan Georg Cyrus Mazaher

    2016-01-01

    In the past decade, social media like Twitter have become popular and a part of everyday life for many people. Opinion mining of the thoughts and opinions they share can be of interest to, e.g., companies and organizations. The sentiment of a text can be drastically altered when figurative language such as sarcasm is used. This thesis presents a system for automatic sarcasm detection in Twitter messages. To get a better understanding of the field, state-of-the-art systems fo...

  17. Automatic Mode Switch (AMS) Causes Less Synchronization

    Directory of Open Access Journals (Sweden)

    Jorat

    2016-03-01

    Full Text Available Introduction: Cardiac resynchronization devices are part of modern heart failure management. After implantation, we analyze and program devices in an attempt to ensure their success. Biventricular pacing should be 98% or more for the lowest mortality and best symptom improvement. Case Presentation: In this case series, we present a combination of far-field sensing and automatic mode switching (AMS) in six patients. We found that this combination causes ventricular sensing (VS) episodes with wide QRS and no synchronization. We turned off the AMS and alleviated the problem. Conclusions: Switching AMS off may increase biventricular pacing in some patients.

  18. Two-field photography can identify patients with vision-threatening diabetic retinopathy - A screening approach in the primary care setting

    NARCIS (Netherlands)

    Stellingwerf, C; Hardus, PLLJ; Hooymans, JMM

    2001-01-01

    OBJECTIVE - To compare the effectiveness of two 45 degrees photographic fields per eye in the screening for diabetic retinopathy with the routine ophthalmologist's examination and to study the effectiveness of visual acuity measurement in the detection of diabetic macular edema, RESEARCH DESIGN AND

  19. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  20. Automatic gray scale correction of video data

    Science.gov (United States)

    Chochia, Pavel A.

    1995-01-01

    Automatic gray scale correction of captured video data (both still and moving images) is one of the least researched questions in the image processing area, yet the question is touched on in almost every book concerned with image processing. Classically it is related to image enhancement, and it is frequently classified among histogram modification techniques. Traditionally used algorithms, based on analysis of the image histogram, cannot solve the problem properly. The difficulty is associated with the absence of a formal quantitative estimate of image quality -- till now the most often used criteria are human visual perception and experience. Hence, the problem of finding measurable properties of real images that might be the basis for automatically building a gray scale correction function (sometimes identified as a gamma-correction function) is still unsolved. In this paper we try to discern some common properties of real images that could help us to evaluate gray scale image distortion and, finally, to construct an appropriate correction function to enhance an image. Such a method could be used in automatic image processing procedures, like enhancing medical images, reproducing pictures in the publishing industry, correcting remote sensing images, and preprocessing captured data in the computer vision area, among many other applications. The question of the complexity of the analysis procedure becomes important when an algorithm is realized in real time (for example, in video input devices like video cameras).
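
    For contrast, here is the kind of simple histogram-based baseline the paper argues is insufficient: estimate a gamma that maps the image's median grey level to mid-grey and apply it. This is a generic textbook correction, not the author's proposal.

```python
# Simple histogram-based gamma correction: choose gamma so that the median
# grey level maps to 0.5, then apply it to the whole image.
import numpy as np

def auto_gamma(image):
    """image: float array in [0, 1]; returns the gamma-corrected image."""
    med = np.clip(np.median(image), 1e-3, 1 - 1e-3)
    gamma = np.log(0.5) / np.log(med)        # med ** gamma == 0.5
    return np.clip(image, 0, 1) ** gamma

frame = np.random.default_rng(2).random((480, 640)) ** 2.2   # dark test frame
corrected = auto_gamma(frame)
print(round(float(np.median(corrected)), 3))                 # approx. 0.5
```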

  1. Automatic morphometry of nerve histological sections.

    Science.gov (United States)

    Romero, E; Cuisenaire, O; Denef, J F; Delbeke, J; Macq, B; Veraart, C

    2000-04-15

    A method for the automatic segmentation, recognition and measurement of neuronal myelinated fibers in nerve histological sections is presented. In this method, the fiber parameters, i.e. perimeter, area, position of the fiber and myelin sheath thickness, are automatically computed. Obliquity of the sections may be taken into account. First, the image is thresholded to provide a coarse classification between myelin and non-myelin pixels. Next, the resulting binary image is further simplified using connected morphological operators. By applying semantic rules to the zonal graph, axon candidates are identified. Those are either isolated or still connected. Then, separation of connected fibers is performed by evaluating myelin sheath thickness around each candidate area with a Euclidean distance transformation. Finally, properties of each detected fiber are computed and false positives are removed. The accuracy of the method is assessed by evaluating missed detections and the false positive ratio, and by comparing the results to the manual procedure with sampling. In the evaluated nerve surface, 0.9% false positives were found, along with 6.36% missed detections. The resulting histograms show strong correlation with those obtained by manual measurement. The noise introduced by this method is significantly lower than the intrinsic sampling variability. This automatic method constitutes an original tool for morphometrical analysis.
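
    The first stages of such a pipeline can be sketched with SciPy as follows: threshold into myelin and non-myelin, clean up morphologically, and read an approximate sheath thickness off the Euclidean distance transform. The threshold, iteration counts and thickness estimate are illustrative assumptions.

```python
# Threshold a grey-level section into myelin/non-myelin, clean up with
# morphological opening, and estimate sheath thickness from the Euclidean
# distance transform. Threshold and iteration counts are placeholders.
import numpy as np
from scipy import ndimage as ndi

def myelin_thickness(section, threshold=0.5):
    """section: float image in [0, 1]; returns per-component thickness estimates."""
    myelin = section < threshold                        # dark rings = myelin
    myelin = ndi.binary_opening(myelin, iterations=2)   # drop small speckle
    dist = ndi.distance_transform_edt(myelin)           # distance to background
    labels, n = ndi.label(myelin)
    if n == 0:
        return np.array([])
    # twice the maximal ridge distance inside each ring approximates the
    # local myelin sheath thickness
    peaks = ndi.maximum(dist, labels, index=range(1, n + 1))
    return 2 * np.asarray(peaks)
```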

  2. Prospects for de-automatization.

    Science.gov (United States)

    Kihlstrom, John F

    2011-06-01

    Research by Raz and his associates has repeatedly found that suggestions for hypnotic agnosia, administered to highly hypnotizable subjects, reduce or even eliminate Stroop interference. The present paper sought unsuccessfully to extend these findings to negative priming in the Stroop task. Nevertheless, the reduction of Stroop interference has broad theoretical implications, both for our understanding of automaticity and for the prospect of de-automatizing cognition in meditation and other altered states of consciousness.

  3. Process automatization in system administration

    OpenAIRE

    Petauer, Janja

    2013-01-01

    The aim of the thesis is to present the automatization of user management in the company Studio Moderna. The company has grown exponentially in recent years, which is why we needed to find a faster, easier and cheaper way of managing user accounts. We automatized the processes of creating, changing and removing user accounts within Active Directory. We prepared a user interface inside an existing application, used JavaScript for drop-down menus, and wrote a script in a scripting programming langu...

  4. Automatic Sarcasm Detection: A Survey

    OpenAIRE

    Joshi, Aditya; Bhattacharyya, Pushpak; Carman, Mark James

    2016-01-01

    Automatic sarcasm detection is the task of predicting sarcasm in text. This is a crucial step to sentiment analysis, considering prevalence and challenges of sarcasm in sentiment-bearing text. Beginning with an approach that used speech-based features, sarcasm detection has witnessed great interest from the sentiment analysis community. This paper is the first known compilation of past work in automatic sarcasm detection. We observe three milestones in the research so far: semi-supervised pat...

  5. Automatic Coarse Graining of Polymers

    OpenAIRE

    Faller, Roland

    2003-01-01

    Several recently proposed semi-automatic and fully-automatic coarse-graining schemes for polymer simulations are discussed. All these techniques derive effective potentials for multi-atom units or super-atoms from atomistic simulations. These include techniques relying on single chain simulations in vacuum and self-consistent optimizations from the melt, like the simplex method and the inverted Boltzmann method. The focus is on matching the polymer structure on different scales. Several ...

  6. The automatization of journalistic narrative

    Directory of Open Access Journals (Sweden)

    Naara Normande

    2013-06-01

    Full Text Available This paper proposes an initial discussion about the production of automatized journalistic narratives. Despite being a topic discussed on specialized sites and at international conferences in the communication area, the concepts remain underdeveloped in academic research. For this article, we studied the concepts of narrative, databases and algorithms, indicating a theoretical trend that explains these automatized journalistic narratives. For characterization, we use the cases of the Los Angeles Times, Narrative Science and Automated Insights.

  7. Identifiability in stochastic models

    CERN Document Server

    1992-01-01

    The problem of identifiability is basic to all statistical methods and data analysis, occurring in such diverse areas as Reliability Theory, Survival Analysis, and Econometrics, where stochastic modeling is widely used. Mathematics dealing with identifiability per se is closely related to the so-called branch of "characterization problems" in Probability Theory. This book brings together relevant material on identifiability as it occurs in these diverse fields.

  8. Needles in the haystack: Using open-text fields to identify persons with intellectual and developmental disabilities in administrative home care data.

    Science.gov (United States)

    McKenzie, Katherine; Martin, Lynn; Ouellette-Kuntz, Hélène

    2017-08-22

    Use of administrative health data to study populations of interest is becoming more common. Identifying individuals with intellectual and developmental disabilities (IDD) in existing databases can be challenging due to inconsistent definitions and terminologies of IDD over time and across sectors, and the inability to rely on etiologies of IDD as they are frequently unknown. To identify diagnoses related to IDD in an administrative database and create a cohort of persons with IDD. Open-text diagnostic entries related to IDD were identified in an Ontario home care database (2003-2015) and coded as being either acceptable (e.g. Down syndrome) or ambiguous (e.g. intellectually challenged). The cognitive and functional skills of the resulting groups were compared using logistic regressions and standardized differences, and their age distributions were compared to that of the general home care population. Just under 1% of the home care population had a diagnostic entry related to IDD. Ambiguous terms were most commonly used (61%), and this group tended to be older and less impaired than the group with more acceptable terms used to describe their IDD. Open-text diagnostic variables in administrative health records can be used to identify and study individuals with IDD. Future work is needed to educate assessors on the importance of using standard, accepted terminology when recording diagnoses related to IDD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Automatic Melody Segmentation

    NARCIS (Netherlands)

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation analy

  10. A Computer-controlled, Fully Automatic NMR/NQR Double Resonance Spectrometer

    Science.gov (United States)

    Zhenye, Feng; Lücken, Edwin A. C.; Diolot, Jacques

    1992-02-01

    A completely automatic computer-controlled NMR/NQR double resonance spectrometer is described. It features automatic tuning of the low, variable-frequency power amplifier, thus permitting unattended use over long periods with high sensitivity and signal reproducibility. The sample is transferred between the low-frequency, zero-field region and the high-field region using compressed air, and the possibility of switching on a field of several tens of gauss during the transfer of the sample is also included.

  11. Automatic thoracic body region localization

    Science.gov (United States)

    Bai, PeiRui; Udupa, Jayaram K.; Tong, YuBing; Xie, ShiPeng; Torigian, Drew A.

    2017-03-01

    Radiological imaging and image interpretation for clinical decision making are mostly specific to each body region such as head & neck, thorax, abdomen, pelvis, and extremities. For automating image analysis and consistency of results, standardizing definitions of body regions and of the various anatomic objects, tissue regions, and zones in them becomes essential. Assuming that a standardized definition of body regions is available, a fundamental early step in automated image and object analytics is to automatically trim the given image stack into image volumes exactly satisfying the body region definition. This paper presents a solution to this problem based on the concept of virtual landmarks and evaluates it on whole-body positron emission tomography/computed tomography (PET/CT) scans. The method first selects a (set of) reference object(s), segments it (them) roughly, and identifies virtual landmarks for the object(s). The geometric relationship between these landmarks and the boundary locations of body regions in the craniocaudal direction is then learned through a neural network regressor, and the locations are predicted. Based on low-dose unenhanced CT images of 180 near-whole-body PET/CT scans (which include 34 whole-body PET/CT scans), the mean localization error for the boundaries of the superior thorax (TS) and inferior thorax (TI), expressed as a number of slices (slice spacing ≈ 4 mm) and using either the skeleton or the pleural spaces as reference objects, is found to be 3 and 2 slices (using the skeleton) and 3 and 5 slices (using the pleural spaces), respectively, or 13 and 10 mm (using the skeleton) and 10.5 and 20 mm (using the pleural spaces), respectively. Improvements of this performance via optimal selection of objects and virtual landmarks, and other object analytics applications, are currently being pursued.

  12. Identifying native-like protein structures with scoring functions based on all-atom ECEPP force fields, implicit solvent models and structure relaxation.

    Science.gov (United States)

    Arnautova, Yelena A; Vorobjev, Yury N; Vila, Jorge A; Scheraga, Harold A

    2009-10-01

    Availability of energy functions which can discriminate native-like from non-native protein conformations is crucial for theoretical protein structure prediction and refinement of low-resolution protein models. This article reports the results of benchmark tests for scoring functions based on two all-atom ECEPP force fields, that is, ECEPP/3 and ECEPP05, and two implicit solvent models for a large set of protein decoys. The following three scoring functions are considered: (i) ECEPP05 plus a solvent-accessible surface area model with the parameters optimized with a set of protein decoys (ECEPP05/SA); (ii) ECEPP/3 plus the solvent-accessible surface area model of Ooi et al. (Proc Natl Acad Sci USA 1987;84:3086-3090) (ECEPP3/OONS); and (iii) ECEPP05 plus an implicit solvent model based on a solution of the Poisson equation with an optimized Fast Adaptive Multigrid Boundary Element (FAMBEpH) method (ECEPP05/FAMBEpH). Short Monte Carlo-with-Minimization (MCM) simulations, following local energy minimization, are used as a scoring method with ECEPP05/SA and ECEPP3/OONS potentials, whereas energy calculation is used with ECEPP05/FAMBEpH. The performance of each scoring function is evaluated by examining its ability to distinguish between native-like and non-native protein structures. The results of the tests show that the new ECEPP05/SA scoring function represents a significant improvement over the earlier ECEPP3/OONS version of the force field. Thus, it is able to rank native-like structures with C(alpha) root-mean-square-deviations below 3.5 A as lowest-energy conformations for 76% and within the top 10 for 87% of the proteins tested, compared with 69 and 80%, respectively, for ECEPP3/OONS. The use of the FAMBEpH solvation model, which provides a more accurate description of the protein-solvent interactions, improves the discriminative ability of the scoring function to 89%. All failed tests in which the native-like structures cannot be discriminated as those with low

  13. Automatic Facial Expression Analysis A Survey

    Directory of Open Access Journals (Sweden)

    C.P. Sumathi

    2013-01-01

    Full Text Available Automatic facial expression recognition has been one of the most active research topics since the 1990s. There have been recent advances in face detection, facial expression recognition and classification. Multiple methods have been devised for facial feature extraction which help in identifying faces and facial expressions. This paper surveys some of the work published from 2003 to date. Various methods are analysed to identify facial expressions. The paper also discusses facial parameterization using Facial Action Coding System (FACS) action units and the methods which recognize the action unit parameters using extracted facial expression data. Various kinds of facial expressions are present in the human face, which can be identified based on their geometric features, appearance features and hybrid features. The two basic concepts of extracting features are based on facial deformation and facial motion. This article also identifies techniques based on the characteristics of expressions and classifies the suitable methods that can be implemented.

  14. Microbial mineralization of cis-dichloroethene and vinyl chloride as a component of natural attenuation of chloroethene contaminants under conditions identified in the field as anoxic

    Science.gov (United States)

    Bradley, Paul M.

    2012-01-01

    Chlororespiration is a key component of remediation at many chloroethene-contaminated sites. In some instances, limited accumulation of reductive dechlorination daughter products may suggest that natural attenuation is not adequate for site remediation. This conclusion is justified when evidence for parent compound (tetrachloroethene, PCE, or trichloroethene, TCE) degradation is lacking. For many chloroethene-contaminated shallow aquifer systems, however, non-conservative losses of the parent compounds are clear but the mass balance between parent compound attenuation and accumulation of reductive dechlorination daughter products is incomplete. Incomplete mass balance indicates a failure to account for important contaminant attenuation mechanisms, and is consistent with contaminant degradation to non-diagnostic mineralization products. An ongoing technical debate over the potential for mineralization of dichloroethene (DCE) and vinyl chloride (VC) to CO2 in the complete absence of diatomic oxygen has largely obscured the importance of microbial DCE/VC mineralization at dissolved oxygen (DO) concentrations below the current field standard (DO < 0.1-0.5 milligrams per liter) for nominally anoxic conditions. This study demonstrates that oxygen-based microbial mineralization of DCE and VC can be substantial under field conditions that are frequently characterized as "anoxic." Because mischaracterization of operant contaminant biodegradation processes can lead to expensive and ineffective remedial actions, a modified framework for assessing the potential importance of oxygen during chloroethene biodegradation was developed.

  15. Mapping Planetary Volcanic Deposits: Identifying Vents and Distinguishing between Effects of Eruption Conditions and Local Storage and Release on Flow Field Morphology

    Science.gov (United States)

    Bleacher, J. E.; Eppler, D. B.; Skinner, J. A.; Evans, C. A.; Feng, W.; Gruener, J. E.; Hurwitz, D. M.; Whitson, P.; Janoiko, B.

    2014-01-01

    Terrestrial geologic mapping techniques are regularly used for "photogeologic" mapping of other planets, but these approaches are complicated by the diverse type, areal coverage, and spatial resolution of available data sets. When available, spatially-limited in-situ human and/or robotic surface observations can sometimes introduce a level of detail that is difficult to integrate with regional or global interpretations. To assess best practices for utilizing observations acquired from orbit and on the surface, our team conducted a comparative study of geologic mapping and interpretation techniques. We compared maps generated for the same area in the San Francisco Volcanic Field (SFVF) in northern Arizona using 1) data collected for reconnaissance before and during the 2010 Desert Research And Technology Studies campaign, and 2) during a traditional, terrestrial field geology study. The operations, related results, and direct mapping comparisons are discussed in companion LPSC abstracts. Here we present new geologic interpretations for a volcanic cone and related lava flows as derived from all approaches involved in this study. Mapping results indicate a need for caution when interpreting past eruption conditions on other planetary surfaces from orbital data alone.

  16. Mapping Planetary Volcanic Deposits: Identifying Vents and Distinguishing between Effects of Eruption Conditions and Local Lava Storage and Release on Flow Field Morphology

    Science.gov (United States)

    Bleacher, J. E.; Eppler, D. B.; Skinner, J. A.; Evans, C. A.; Feng, W.; Gruener, J. E.; Hurwitz, D. M.; Whitson, P.; Janoiko, B.

    2014-01-01

    Terrestrial geologic mapping techniques are regularly used for "photogeologic" mapping of other planets, but these approaches are complicated by the diverse type, areal coverage, and spatial resolution of available data sets. When available, spatially-limited in-situ human and/or robotic surface observations can sometimes introduce a level of detail that is difficult to integrate with regional or global interpretations. To assess best practices for utilizing observations acquired from orbit and on the surface, our team conducted a comparative study of geologic mapping and interpretation techniques. We compared maps generated for the same area in the San Francisco Volcanic Field (SFVF) in northern Arizona using 1) data collected for reconnaissance before and during the 2010 Desert Research And Technology Studies campaign, and 2) during a traditional, terrestrial field geology study. The operations, related results, and direct mapping comparisons are discussed in companion LPSC abstracts [1-3]. Here we present new geologic interpretations for a volcanic cone and related lava flows as derived from all approaches involved in this study. Mapping results indicate a need for caution when interpreting past eruption conditions on other planetary surfaces from orbital data alone.

  17. AUTOMATIC URBAN ILLEGAL BUILDING DETECTION USING MULTI-TEMPORAL SATELLITE IMAGES AND GEOSPATIAL INFORMATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    N. Khalili Moghadam

    2015-12-01

    Full Text Available With the unprecedented growth of urban population and urban development, we are faced with a growing trend of illegal building (IB) construction. Field visits, the currently used method of IB detection, are time- and manpower-consuming, in addition to their high cost. Therefore, automatic IB detection is required. Acquiring multi-temporal satellite images and using image processing techniques for automatic change detection is one of the optimal methods that can be used in IB monitoring. In this research an automatic method of IB detection is proposed. Two panchromatic satellite images of IRS-P5, acquired at different times over the study area in a part of Tehran, the city map and an updated spatial database of existing buildings were used to detect the suspected IBs. In the pre-processing step, the images were geometrically and radiometrically corrected. In the next step, the changed pixels were detected using the K-means clustering technique because of its speed and the limited user intervention required. Then, the changed pixels of each building were identified and the change percentage of each building was compared with a standard change threshold to detect the buildings which are under construction. Finally, the IBs were detected by checking the municipality database. Constructed buildings that do not match the municipal database are then field-checked to identify the IBs. The results show that out of 343 buildings appearing in the images, only 19 buildings were detected as under construction and three of them as unlicensed buildings. Furthermore, overall accuracies of 83%, 79% and 75% were obtained for K-means change detection, detection of buildings under construction and IB detection, respectively.
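
    A hedged sketch of the change-detection step: k-means separates per-pixel differences of two co-registered images into "changed" and "unchanged" clusters, and the changed fraction inside each building footprint can then be compared with the change threshold. scikit-learn is assumed for illustration; the actual processing chain is more involved.

```python
# K-means change detection on per-pixel absolute differences of two
# co-registered images, plus a per-building change percentage.
import numpy as np
from sklearn.cluster import KMeans

def changed_mask(img_t1, img_t2):
    diff = np.abs(img_t2.astype(float) - img_t1.astype(float)).reshape(-1, 1)
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(diff)
    changed_label = int(np.argmax(km.cluster_centers_.ravel()))
    return (km.labels_ == changed_label).reshape(img_t1.shape)

def change_percentage(mask, footprint):
    """footprint: boolean mask of one building from the spatial database."""
    return 100.0 * mask[footprint].mean()
```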

  18. The Role of Automatic Obesity Stereotypes in Real Hiring Discrimination

    Science.gov (United States)

    Agerstrom, Jens; Rooth, Dan-Olof

    2011-01-01

    This study examined whether automatic stereotypes captured by the implicit association test (IAT) can predict real hiring discrimination against the obese. In an unobtrusive field experiment, job applications were sent to a large number of real job vacancies. The applications were matched on credentials but differed with respect to the applicant's…

  19. Automatic Segmentation and Deep Learning of Bird Sounds

    NARCIS (Netherlands)

    Koops, Hendrik Vincent; Van Balen, J.M.H.; Wiering, F.

    2015-01-01

    We present a study on automatic birdsong recognition with deep neural networks using the BIRDCLEF2014 dataset. Through deep learning, feature hierarchies are learned that represent the data on several levels of abstraction. Deep learning has been applied with success to problems in fields such as mu

  20. The Eras and Trends of Automatic Short Answer Grading

    Science.gov (United States)

    Burrows, Steven; Gurevych, Iryna; Stein, Benno

    2015-01-01

    Automatic short answer grading (ASAG) is the task of assessing short natural language responses to objective questions using computational methods. The active research in this field has increased enormously of late with over 80 papers fitting a definition of ASAG. However, the past efforts have generally been ad-hoc and non-comparable until…

  2. Automatic analysis of trabecular bone structure from knee MRI

    DEFF Research Database (Denmark)

    Marques, Joselene; Granlund, Rabia; Lillholm, Martin

    2012-01-01

    We investigated the feasibility of quantifying osteoarthritis (OA) by analysis of the trabecular bone structure in low-field knee MRI. Generic texture features were extracted from the images and subsequently selected by sequential floating forward selection (SFFS), following a fully automatic, un...

  4. Automatic guiding of the primary image of solar Gregory telescopes

    NARCIS (Netherlands)

    Küveler, G.; Wiehr, E.; Thomas, D.; Harzer, M.; Bianda, M.; Epple, A.; Sütterlin, P.; Weisshaar, E.

    1998-01-01

    The primary image reflected from the field-stop of solar Gregory telescopes is used for automatic guiding. This new system avoids temporal varying influences from the bending of the telescope tube by the main mirror's gravity and from offsets between the telescope and a separate guiding refractor.

  5. New automatic minidisk infiltrometer: design and testing

    Directory of Open Access Journals (Sweden)

    Klípa Vladimír

    2015-06-01

    Full Text Available Soil hydraulic conductivity is a key parameter for predicting water flow through the soil profile. We have developed an automatic minidisk infiltrometer (AMI) to enable easy measurement of unsaturated hydraulic conductivity using the tension infiltrometer method in the field. AMI senses the cumulative infiltration by recording the change in buoyancy force acting on a vertical solid bar fixed in the reservoir tube of the infiltrometer. Performance of the instrument was tested in the laboratory and in two contrasting catchments at three sites with different land use. Hydraulic conductivities determined using AMI were compared with earlier manually taken readings. The results of laboratory testing demonstrated the high accuracy and robustness of the AMI measurement. Field testing of AMI proved the suitability of the instrument for determining sorptivity and near-saturated hydraulic conductivity.

  6. Automatic sign language identification

    OpenAIRE

    Gebre, B.G.; Wittenburg, P.; Heskes, T.

    2013-01-01

    We propose a Random-Forest based sign language identification system. The system uses low-level visual features and is based on the hypothesis that sign languages have varying distributions of phonemes (hand-shapes, locations and movements). We evaluated the system on two sign languages -- British SL and Greek SL, both taken from a publicly available corpus, called Dicta Sign Corpus. Achieved average F1 scores are about 95% - indicating that sign languages can be identified with high accuracy...
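
    The classification set-up can be sketched roughly as follows, with random placeholder features standing in for the corpus-derived visual features; scikit-learn's random forest is assumed purely for illustration.

```python
# Random-forest classification over per-video feature vectors with binary
# language labels. Feature extraction from video is assumed done elsewhere;
# the arrays below are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(200, 64))      # e.g. histograms of hand shape/location/movement
y = rng.integers(0, 2, size=200)    # 0 = British SL, 1 = Greek SL (placeholder)

forest = RandomForestClassifier(n_estimators=300, random_state=0)
print("CV accuracy:", cross_val_score(forest, X, y, cv=5).mean())
```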

  7. Identifying Activity

    CERN Document Server

    Lewis, Adrian S

    2009-01-01

    Identification of active constraints in constrained optimization is of interest from both practical and theoretical viewpoints, as it holds the promise of reducing an inequality-constrained problem to an equality-constrained problem, in a neighborhood of a solution. We study this issue in the more general setting of composite nonsmooth minimization, in which the objective is a composition of a smooth vector function c with a lower semicontinuous function h, typically nonsmooth but structured. In this setting, the graph of the generalized gradient of h can often be decomposed into a union (nondisjoint) of simpler subsets. "Identification" amounts to deciding which subsets of the graph are "active" in the criticality conditions at a given solution. We give conditions under which any convergent sequence of approximate critical points finitely identifies the activity. Prominent among these properties is a condition akin to the Mangasarian-Fromovitz constraint qualification, which ensures boundedness of the set of...

  8. The Potential of Automatic Word Comparison for Historical Linguistics.

    Science.gov (United States)

    List, Johann-Mattis; Greenhill, Simon J; Gray, Russell D

    2017-01-01

    The amount of data from languages spoken all over the world is rapidly increasing. Traditional manual methods in historical linguistics need to face the challenges brought by this influx of data. Automatic approaches to word comparison could provide invaluable help to pre-analyze data which can be later enhanced by experts. In this way, computational approaches can take care of the repetitive and schematic tasks leaving experts to concentrate on answering interesting questions. Here we test the potential of automatic methods to detect etymologically related words (cognates) in cross-linguistic data. Using a newly compiled database of expert cognate judgments across five different language families, we compare how well different automatic approaches distinguish related from unrelated words. Our results show that automatic methods can identify cognates with a very high degree of accuracy, reaching 89% for the best-performing method Infomap. We identify the specific strengths and weaknesses of these different methods and point to major challenges for future approaches. Current automatic approaches for cognate detection-although not perfect-could become an important component of future research in historical linguistics.
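
    For a feel of the task (and well below the best-performing Infomap method), here is one of the simplest baselines used in such comparisons: treat two words sharing a meaning as cognate candidates whenever their normalised edit distance falls below a threshold.

```python
# Edit-distance baseline for cognate detection: words with the same meaning
# are paired when their normalised Levenshtein distance is small. The
# threshold and toy data are illustrative.
from itertools import combinations

def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def cognate_pairs(words, threshold=0.5):
    """words: list of (language, form) sharing one concept, e.g. 'hand'."""
    pairs = []
    for (l1, w1), (l2, w2) in combinations(words, 2):
        ned = edit_distance(w1, w2) / max(len(w1), len(w2))
        if ned <= threshold:
            pairs.append((l1, w1, l2, w2))
    return pairs

print(cognate_pairs([("English", "hand"), ("German", "hand"),
                     ("Dutch", "hand"), ("French", "main")]))
```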

  9. Metabolic changes in occipital lobe epilepsy with automatisms

    Directory of Open Access Journals (Sweden)

    Chong H Wong

    2014-07-01

    Full Text Available Purpose: Some studies suggest that the pattern of glucose hypometabolism relates not only to the ictal-onset zone but also reflects seizure propagation. We investigated metabolic changes in patients with occipital lobe epilepsy (OLE) that may reflect propagation of the ictal discharge during seizures with automatisms. Methods: Fifteen patients who had undergone epilepsy surgery for intractable OLE and had undergone interictal fluorine-18-fluorodeoxyglucose positron emission tomography (18F-FDG-PET) between 1994 and 2004 were divided into two groups (with and without automatisms during seizures). Significant regions of hypometabolism were identified by comparing the 18F-FDG-PET results of each group with those of 16 healthy controls using Statistical Parametric Mapping (SPM 2). Key Findings: Significant hypometabolism was confined largely to the epileptogenic occipital lobe in the patient group without automatisms. In patients with automatisms, glucose hypometabolism extended from the epileptogenic occipital lobe into the ipsilateral temporal lobe. Significance: We identified a distinctive hypometabolic pattern that was specific to OLE patients with automatisms during a seizure. This finding supports the postulate that seizure propagation is a cause of glucose hypometabolism beyond the region of seizure onset.

  10. Super pixel density based clustering automatic image classification method

    Science.gov (United States)

    Xu, Mingxing; Zhang, Chuan; Zhang, Tianxu

    2015-12-01

    Image classification is an important means of image segmentation and data mining, and achieving rapid automated image classification has long been a focus of research. In this paper, we propose a super-pixel, density-based clustering algorithm for automatic image classification and outlier identification. Pixel location coordinates and gray values are used to compute density and distance, from which images are classified automatically and outliers are extracted. Because a large number of pixels dramatically increases the computational complexity, the image is first preprocessed into a small number of super-pixel sub-blocks before the density and distance calculations. A normalized density-and-distance discrimination rule then selects cluster centers automatically, so that the image is classified and outliers are identified without human intervention. Extensive experiments show that our method requires no manual input, runs faster than density clustering on raw pixels, and performs automated classification and outlier extraction effectively.
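
    The abstract's "density and distance" construction resembles density-peaks clustering (Rodriguez and Laio, 2014); below is a minimal NumPy sketch of that idea applied to super-pixel features, not the authors' code. The feature layout, the cutoff d_c, and the choice of three centers are assumptions.

```python
# A minimal sketch of density-peaks clustering on super-pixel features.
import numpy as np

def density_peaks(features, d_c=0.5):
    """features: (n, 3) array of [row, col, gray] per super-pixel."""
    d = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    rho = (d < d_c).sum(axis=1) - 1            # local density within cutoff d_c
    delta = np.zeros(len(rho))
    for i in range(len(rho)):
        higher = np.where(rho > rho[i])[0]     # points of higher density
        delta[i] = d[i, higher].min() if len(higher) else d[i].max()
    gamma = rho * delta                        # normalized decision score
    centers = np.argsort(gamma)[-3:]           # e.g. keep 3 cluster centers
    labels = np.argmin(d[:, centers], axis=1)  # assign to nearest center
    outliers = np.flatnonzero(rho == 0)        # isolated points as outliers
    return labels, centers, outliers
```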

  12. Identifying dynamically young galaxy groups via wide-angle tail galaxies: A case study in the COSMOS field at z=0.53

    CERN Document Server

    Oklopcic, A; Giodini, S; Zamorani, G; Birzan, L; Schinnerer, E; Carilli, C L; Finoguenov, A; Lilly, S; Koekemoer, A; Scoville, N Z

    2010-01-01

    We present an analysis of a wide-angle tail (WAT) radio galaxy located in a galaxy group in the COSMOS field at a redshift of z=0.53 (hereafter CWAT-02). We find that the host galaxy of CWAT-02 is the brightest galaxy in the group, although it does not coincide with the center of mass of the system. Estimating a) the velocity of CWAT-02, relative to the intra-cluster medium (ICM), and b) the line-of-sight peculiar velocity of CWAT-02's host galaxy, relative to the average velocity of the group, we find that both values are higher than those expected for a dominant galaxy in a relaxed system. This suggests that CWAT-02's host group is dynamically young and likely in the process of an ongoing group merger. Our results are consistent with previous findings showing that the presence of a wide-angle tail galaxy in a galaxy group or cluster can be used as an indicator of dynamically young non-relaxed systems. Taking the unrelaxed state of CWAT-02's host group into account, we discuss the impact of radio-AGN heating...

  13. Exposure dating and glacial reconstruction at Mt. Field, Tasmania, Australia, identifies MIS 3 and MIS 2 glacial advances and climatic variability

    Science.gov (United States)

    Mackintosh, A. N.; Barrows, T. T.; Colhoun, E. A.; Fifield, L. K.

    2006-05-01

    Tasmania is important for understanding Quaternary climatic change because it is one of only three areas that experienced extensive mid-latitude Southern Hemisphere glaciation and it lies in a dominantly oceanic environment at a great distance from Northern Hemisphere ice sheet feedbacks. We applied exposure dating using 36Cl to an extensive sequence of moraines from the last glacial at Mt. Field, Tasmania. Glaciers advanced at 41-44 ka during Marine oxygen Isotope Stage (MIS) 3 and at 18 ka during MIS 2. Both advances occurred in response to an ELA lowering greater than 1100 m below the present-day mean summer freezing level, and a possible temperature reduction of 7-8°C. Deglaciation was rapid and complete by ca. 16 ka. The overall story emerging from studies of former Tasmanian glaciers is that the MIS 2 glaciation was of limited extent and that some glaciers were more extensive during earlier parts of the last glacial cycle.

  14. Intensive field phenotyping of maize (Zea mays L.) root crowns identifies phenes and phene integration associated with plant growth and nitrogen acquisition.

    Science.gov (United States)

    York, Larry M; Lynch, Jonathan P

    2015-09-01

    Root architecture is an important regulator of nitrogen (N) acquisition. Existing methods to phenotype the root architecture of cereal crops are generally limited to seedlings or to the outer roots of mature root crowns. The functional integration of root phenes is poorly understood. In this study, intensive phenotyping of mature root crowns of maize was conducted to discover phenes and phene modules related to N acquisition. Twelve maize genotypes were grown under replete and deficient N regimes in the field in South Africa and eight in the USA. An image was captured for every whorl of nodal roots in each crown. Custom software was used to measure root phenes including nodal occupancy, angle, diameter, distance to branching, lateral branching, and lateral length. Variation existed for all root phenes within maize root crowns. Size-related phenes such as diameter and number were substantially influenced by nodal position, while angle, lateral density, and distance to branching were not. Greater distance to branching, the length from the shoot to the emergence of laterals, is proposed to be a novel phene state that minimizes placing roots in already explored soil. Root phenes from both older and younger whorls of nodal roots contributed to variation in shoot mass and N uptake. The additive integration of root phenes accounted for 70% of the variation observed in shoot mass in low N soil. These results demonstrate the utility of intensive phenotyping of mature root systems, as well as the importance of phene integration in soil resource acquisition.

  15. Algorithms for skiascopy measurement automatization

    Science.gov (United States)

    Fomins, Sergejs; Trukša, Renārs; Krūmiņa, Gunta

    2014-10-01

    An automatic dynamic infrared retinoscope was developed that allows the procedure to run at a much higher rate. Our system uses a USB image sensor with up to 180 Hz refresh rate, equipped with a long-focus objective, and an 850 nm infrared light-emitting diode as the light source. Two servo motors driven by a microprocessor control the rotation of the semitransparent mirror and the motion of the retinoscope chassis. The image of the eye's pupil reflex is captured via software and analyzed along the horizontal plane. An algorithm for automatic analysis of the accommodative state was developed based on the intensity changes of the fundus reflex.

  16. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  17. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    This paper presents an approach, based on universal kriging, for automatic mapping of monitoring data. The performance of the mapping approach is tested on two data-sets containing daily mean gamma dose rates in Germany reported by means of the national automatic monitoring network (IMIS). ... In the second dataset an accidental release of radioactivity in the environment was simulated in the South-Western corner of the monitored area. The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset. However, it is capable...
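
    As a concrete illustration, here is a hedged universal-kriging sketch using the pykrige package (our library choice, not the paper's code); the station coordinates, dose-rate values, and variogram settings are placeholders.

```python
# Universal-kriging interpolation of sparse station readings onto a grid.
import numpy as np
from pykrige.uk import UniversalKriging

# x, y: station coordinates; z: daily mean gamma dose rate at each station
x = np.array([0.5, 1.2, 3.1, 4.7, 2.2])
y = np.array([0.8, 2.9, 1.5, 3.3, 4.1])
z = np.array([0.11, 0.12, 0.10, 0.35, 0.13])       # note one elevated value

uk = UniversalKriging(x, y, z,
                      variogram_model="spherical",
                      drift_terms=["regional_linear"])  # universal-kriging drift
gridx = np.linspace(0.0, 5.0, 50)
gridy = np.linspace(0.0, 5.0, 50)
z_map, var_map = uk.execute("grid", gridx, gridy)  # predictions + kriging variance
# var_map flags where the smoothed interpolation is least reliable.
```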

  18. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for the automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
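
    To make the model-generation idea concrete, here is a toy sketch (ours, not the paper's tool) that enumerates all binary operation tables on a small domain and keeps those satisfying a given equational theory, here commutativity and associativity.

```python
# Brute-force finite model generation for a tiny equational theory.
from itertools import product

def models(n):
    dom = range(n)
    for flat in product(dom, repeat=n * n):          # every n x n operation table
        op = [list(flat[i * n:(i + 1) * n]) for i in dom]
        commutative = all(op[a][b] == op[b][a] for a in dom for b in dom)
        associative = all(op[op[a][b]][c] == op[a][op[b][c]]
                          for a in dom for b in dom for c in dom)
        if commutative and associative:
            yield op

print(len(list(models(2))))  # count the commutative semigroups of order 2
```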

  19. Automatic detection and classification of damage zone(s) for incorporating in digital image correlation technique

    Science.gov (United States)

    Bhattacharjee, Sudipta; Deb, Debasis

    2016-07-01

    Digital image correlation (DIC) is a technique developed for monitoring the surface deformation/displacement of an object under loading conditions. This method is further refined to make it capable of handling discontinuities on the surface of the sample. A damage zone refers to a surface area that fractures and opens in the course of loading. In this study, an algorithm is presented to automatically detect multiple damage zones in the deformed image. The algorithm identifies the pixels located inside these zones and eliminates them from the FEM-DIC process. The proposed algorithm is successfully implemented on several damaged samples to estimate the displacement fields of an object under loading conditions. This study shows that the displacement fields represent the damage conditions reasonably well as compared to the regular FEM-DIC technique that does not consider the damage zones.

  20. Developing a Satellite Based Automatic System for Crop Monitoring: Kenya's Great Rift Valley, A Case Study

    Science.gov (United States)

    Lucciani, Roberto; Laneve, Giovanni; Jahjah, Munzer; Mito, Collins

    2016-08-01

    The crop growth stage represents essential information for the management of agricultural areas. In this study we investigate the feasibility of a tool based on remotely sensed satellite (Landsat 8) imagery that can automatically classify crop fields, and we assess how far resolution enhancement by pan-sharpening techniques and the extraction of phenological information, used to create decision rules for assigning a semantic class to an object, can support the classification process. Moreover, we investigate the opportunity to extract vegetation health status information from a remotely sensed assessment of the equivalent water thickness (EWT). Our case study is Kenya's Great Rift Valley, where a ground truth campaign was conducted during August 2015 to collect GPS measurements of crop fields, leaf area index (LAI) and chlorophyll samples.
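
    The abstract does not spell out the authors' EWT retrieval; as a hedged illustration of the general idea, the sketch below computes the normalized difference water index NDWI = (NIR - SWIR1) / (NIR + SWIR1) from Landsat 8 reflectances, a common proxy for canopy water content. The band choices and function name are our assumptions.

```python
# Canopy-water proxy from Landsat 8 surface reflectance.
import numpy as np

def ndwi(nir, swir1):
    """nir, swir1: 2D reflectance arrays (Landsat 8 bands 5 and 6)."""
    nir = nir.astype(np.float64)
    swir1 = swir1.astype(np.float64)
    with np.errstate(divide="ignore", invalid="ignore"):
        index = (nir - swir1) / (nir + swir1)
    return np.where(np.isfinite(index), index, 0.0)  # mask divide-by-zero pixels
```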

  1. A Machine Vision System for Automatically Grading Hardwood Lumber - (Industrial Metrology)

    Science.gov (United States)

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas T. Drayer; Philip A. Araman; Robert L. Brisbon

    1992-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  2. Young L Dwarfs Identified in the Field: A Preliminary Low-Gravity, Optical Spectral Sequence from L0 to L5

    CERN Document Server

    Cruz, Kelle L; Burgasser, Adam J

    2008-01-01

    We present an analysis of 23 L dwarfs whose optical spectra display unusual features. Twenty-one were uncovered during our search for nearby, late-type objects using the Two Micron All-Sky Survey while two were identified in the literature. The unusual spectral features, notably weak FeH molecular absorption and weak Na I and K I doublets, are attributable to low-gravity and indicate that these L dwarfs are young, low-mass brown dwarfs. We use these data to expand the spectral classification scheme for L0 to L5-type dwarfs to include three gravity classes. Most of the low-gravity L dwarfs have southerly declinations and distance estimates within 60 pc. Their implied youth, on-sky distribution, and distances suggest that they are members of nearby, intermediate-age (~10-100 Myr), loose associations such as the Beta Pictoris moving group, the Tucana/Horologium association, and the AB Doradus moving group. At an age of 30 Myr and with effective temperatures from 1500 to 2400 K, evolutionary models predict masses...

  3. Automatic Recognition of Facial Actions in Spontaneous Expressions

    Directory of Open Access Journals (Sweden)

    Marian Stewart Bartlett

    2006-09-01

    Full Text Available Spontaneous facial expressions differ from posed expressions both in which muscles are moved and in the dynamics of the movement. Advances in the field of automatic facial expression measurement will require development and assessment on spontaneous behavior. Here we present preliminary results on a task of facial action detection in spontaneous facial expressions. We employ a user-independent, fully automatic system for real-time recognition of facial actions from the Facial Action Coding System (FACS). The system automatically detects frontal faces in the video stream and codes each frame with respect to 20 action units. The approach applies machine learning methods, such as support vector machines and AdaBoost, to texture-based image representations. The output margin of the learned classifiers predicts action unit intensity. Frame-by-frame intensity measurements will enable investigations into facial expression dynamics that were previously intractable by human coding.
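
    As a hedged sketch of the learning step described above (not the authors' implementation), a linear SVM can be trained per action unit on texture features, with its output margin serving as the intensity proxy the abstract mentions; the feature and label arrays below are synthetic placeholders.

```python
# Per-action-unit linear SVM whose margin is used as an intensity score.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))          # stand-in Gabor/texture features per frame
y = rng.integers(0, 2, size=200)        # stand-in AU present/absent labels

clf = LinearSVC(C=1.0).fit(X, y)        # one detector per action unit
margins = clf.decision_function(X)      # signed distance to the hyperplane
intensity = np.clip(margins, 0, None)   # larger positive margin ~ stronger AU
```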

  4. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpaolo;

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...

  5. How CBO Estimates Automatic Stabilizers

    Science.gov (United States)

    2015-11-01

    Most types of revenues—mainly personal, corporate, and social insurance taxes—are sensitive to the business cycle and account for most of... See Fiscal Policy in the United States: Automatic Stabilizers, Discretionary Fiscal Policy Actions, and the Economy, Finance and Economics Discussion Series Paper 2010-43 (Board of Governors of the Federal Reserve System).

  6. Automatic Identification of Metaphoric Utterances

    Science.gov (United States)

    Dunn, Jonathan Edwin

    2013-01-01

    This dissertation analyzes the problem of metaphor identification in linguistic and computational semantics, considering both manual and automatic approaches. It describes a manual approach to metaphor identification, the Metaphoricity Measurement Procedure (MMP), and compares this approach with other manual approaches. The dissertation then…

  7. Automatic Error Analysis Using Intervals

    Science.gov (United States)

    Rothwell, E. J.; Cloud, M. J.

    2012-01-01

    A technique for automatic error analysis using interval mathematics is introduced. A comparison to standard error propagation methods shows that in cases involving complicated formulas, the interval approach gives comparable error estimates with much less effort. Several examples are considered, and numerical errors are computed using the INTLAB…
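
    To illustrate the idea independently of the INTLAB toolbox used in the article, here is a toy interval class in Python: every arithmetic operation returns an interval enclosing all possible results, so the width of the final interval is an automatic error bound. Outward rounding control is deliberately omitted.

```python
# Minimal interval arithmetic for automatic error propagation.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __truediv__(self, other):   # assumes 0 is not inside `other`
        return self * Interval(1.0 / other.hi, 1.0 / other.lo)

    def __repr__(self):
        return f"[{self.lo:.5g}, {self.hi:.5g}]"

V = Interval(9.9, 10.1)    # voltage measured to +/- 1%
I = Interval(1.98, 2.02)   # current measured to +/- 1%
R = V / I                  # resistance interval encloses all consistent values
print(R)                   # the interval width is the propagated error bound
```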

  8. Trevi Park: Automatic Parking System

    OpenAIRE

    ECT Team, Purdue

    2007-01-01

    TreviPark is an underground, multi-story stacking system that holds cars efficiently, thus reducing the cost of each parking space. It is a fully automatic parking system intended to maximize space utilization in parking structures. TreviPark costs less than a conventional urban garage and takes up half the volume and 80% of the depth.

  9. Automatic agar tray inoculation device

    Science.gov (United States)

    Wilkins, J. R.; Mills, S. M.

    1972-01-01

    The automatic agar tray inoculation device is simple in design and foolproof in operation. It employs either a conventional inoculating loop or a cotton swab for uniform inoculation of agar media, and it allows the technician to carry on with other activities while the tray is being inoculated.

  10. An Automatic Program Synthesis Bibliography.

    Science.gov (United States)

    1988-01-19

    ...experimentation of the BIS system. International Journal of Man-Machine Studies, 17:173-188, 1982. [67] A. F. Cardenas. Technology for automatic... Philadelphia, PA, May 1978. [110] B. L. Gates and J. A. van Hulzen. Proceedings of European Conference on Computer Algebra (EUROCAL 85), pages 583-584. Volume 203...

  11. Automatic milking : a better understanding

    NARCIS (Netherlands)

    Meijering, A.; Hogeveen, H.; Koning, de C.J.A.M.

    2004-01-01

    In 2000 the book Robotic Milking, reflecting the proceedings of an International Symposium which was held in The Netherlands came out. At that time, commercial introduction of automatic milking systems was no longer obstructed by technological inadequacies. Particularly in a few west-European countr

  12. AUTOMATIC CLASSIFICATION OF STRUCTURAL MRI FOR DIAGNOSIS OF NEURODEGENERATIVE DISEASES

    Directory of Open Access Journals (Sweden)

    Hernández-Tamames Juan Antonio

    2010-12-01

    Full Text Available This paper presents an automatic approach that classifies structural magnetic resonance images as pathological or healthy controls. A classification model was trained to find the boundaries that separate the study groups. The method uses the deformation values from a set of regions automatically identified as relevant, in a process that selects the statistically significant regions of a t-test under the restriction that this significance must be spatially coherent within a neighborhood of 5 voxels. The proposed method was assessed in distinguishing healthy controls from schizophrenia patients. Classification results showed accuracy between 74% and 89%, depending on the stage of the disease and the number of training samples.
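
    A hedged sketch of the region-selection-plus-classification idea follows; the spatial-coherence restriction is simplified away, and all data below are synthetic placeholders.

```python
# Mass-univariate t-tests select discriminative features; an SVM separates groups.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.svm import SVC

rng = np.random.default_rng(1)
patients = rng.normal(0.2, 1.0, size=(30, 500))   # stand-in deformation values
controls = rng.normal(0.0, 1.0, size=(30, 500))

t, p = ttest_ind(patients, controls, axis=0)
keep = p < 0.01                                    # significant regions only

X = np.vstack([patients, controls])[:, keep]
y = np.array([1] * 30 + [0] * 30)
clf = SVC(kernel="linear").fit(X, y)               # separates the study groups
```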

  13. Automatic Artist Recognition of Songs for Advanced Retrieval

    Institute of Scientific and Technical Information of China (English)

    ZHU Song-hao; LIU Yun-cai

    2008-01-01

    Automatic recognition of artists is very important in acoustic music indexing, browsing, and content-based acoustic music retrieval, but it remains challenging to extract the most representative and salient attributes to describe diverse artists. In this paper, we developed a novel system to recognize artists automatically. The proposed system can efficiently identify the artist's voice in a raw song by analyzing substantive features extracted from both pure music and singing mixed with accompanying music. Experiments on songs of different genres illustrate that the proposed system is effective.

  14. MARZ: Manual and Automatic Redshifting Software

    CERN Document Server

    Hinton, Samuel R; Lidman, Chris; Glazebrook, Karl; Lewis, Geraint F

    2016-01-01

    The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application Marz with greater usability, flexibility, and the capacity to analyse a wider range of object types than the Runz software package previously used for redshifting spectra from 2dF. Marz is an open-source, client-based, Javascript web-application which provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph to produce high quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines if conforming to the current FITS file standard is not possible. Behind the scenes, a modified version of the Autoz cross-correlation algorithm is used to match input spectra against a variety of stellar and galaxy templates, and automat...

  15. Distance transform for automatic dermatologic images composition

    Science.gov (United States)

    Grana, C.; Pellacani, G.; Seidenari, S.; Cucchiara, R.

    2006-03-01

    In this paper we focus on the problem of automatically registering dermatological images: although different products are available, most of them share the problem of a limited field of view on the skin. A possible solution is the composition of multiple takes of the same lesion with digital software, such as that used for creating panorama images. In this work, the Harris Corner Detector is used to perform automatic selection of matching points, and the RANSAC method is employed to cope with outlier pairs. Projective mapping is then used to match the two images. Given a set of correspondence points, Singular Value Decomposition was used to compute the transform parameters. At this point the two images need to be blended together. One initial assumption is often implicitly made: that the aim is to merge two rectangular images. When merging occurs between more than two images iteratively, however, this assumption fails. To cope with differently shaped images, we employed the Distance Transform and provided a weighted merging of images. Different tests were conducted with dermatological images, both with a standard rectangular frame and with atypical shapes, for example a ring due to the objective and lens selection. The successive composition of different circular images with other blending functions, such as the Hat function, does not correctly remove the border, and residuals of the circular mask remain visible. By applying Distance Transform blending, the result produced is insensitive to the outer shape of the image.
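
    The blending step lends itself to a compact re-implementation. The sketch below (our reconstruction, assuming the images are already registered into a common frame with boolean validity masks) weights each pixel by its distance to the border of its own mask, which is the role the abstract describes for the Distance Transform.

```python
# Distance-transform blending of two arbitrarily shaped, registered images.
import numpy as np
from scipy.ndimage import distance_transform_edt

def blend(img1, mask1, img2, mask2):
    """Weight each pixel by its distance to the border of its own mask."""
    w1 = distance_transform_edt(mask1)          # 0 outside, grows toward center
    w2 = distance_transform_edt(mask2)
    total = w1 + w2
    total[total == 0] = 1.0                     # avoid division outside both masks
    out = (img1 * w1 + img2 * w2) / total
    return np.where((w1 + w2) > 0, out, 0.0)
```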

  16. Automatic script identification from images using cluster-based templates

    Energy Technology Data Exchange (ETDEWEB)

    Hochberg, J.; Kerns, L.; Kelly, P.; Thomas, T.

    1995-02-01

    We have developed a technique for automatically identifying the script used to generate a document that is stored electronically in bit image form. Our approach differs from previous work in that the distinctions among scripts are discovered by an automatic learning procedure, without any hands-on analysis. We first develop a set of representative symbols (templates) for each script in our database (Cyrillic, Roman, etc.). We do this by identifying all textual symbols in a set of training documents, scaling each symbol to a fixed size, clustering similar symbols, pruning minor clusters, and finding each cluster's centroid. To identify a new document's script, we identify and scale a subset of symbols from the document and compare them to the templates for each script. We choose the script whose templates provide the best match. Our current system distinguishes among the Armenian, Burmese, Chinese, Cyrillic, Ethiopic, Greek, Hebrew, Japanese, Korean, Roman, and Thai scripts with over 90% accuracy.
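
    A hedged sketch of the cluster-based template idea (not the original system): symbol extraction and scaling are assumed done elsewhere, and k-means stands in for whatever clustering procedure the authors used.

```python
# Cluster-based script templates and nearest-template script identification.
import numpy as np
from sklearn.cluster import KMeans

def make_templates(symbols, n_templates=40):
    """symbols: (n, 16*16) array of scaled, flattened symbol images."""
    km = KMeans(n_clusters=n_templates, n_init=10).fit(symbols)
    return km.cluster_centers_                   # one template per cluster

def identify_script(doc_symbols, templates_by_script):
    scores = {}
    for script, templates in templates_by_script.items():
        d = np.linalg.norm(doc_symbols[:, None, :] - templates[None, :, :],
                           axis=-1)
        scores[script] = d.min(axis=1).mean()    # mean best-template distance
    return min(scores, key=scores.get)           # smallest distance wins
```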

  17. An Automatic Image Inpainting Algorithm Based on FCM

    Directory of Open Access Journals (Sweden)

    Jiansheng Liu

    2014-01-01

    Full Text Available Many existing image inpainting algorithms require the repaired area to be determined manually by the user. Aiming at this drawback of traditional image inpainting algorithms, this paper proposes an algorithm that identifies the repaired area automatically using the fuzzy C-means (FCM) algorithm. FCM classifies the image pixels into a number of categories according to the similarity principle, clustering similar pixels into the same category as far as possible. Given the gray value of the pixels to be inpainted, we determine the category closest to the inpainting area, take that category as the area to be inpainted, and then restore it with the TV model to realize automatic image inpainting.
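
    For reference, here is a generic fuzzy C-means update loop in NumPy (a textbook FCM on gray values, not the paper's implementation; the fuzzifier m and the cluster count are assumptions).

```python
# Textbook fuzzy C-means on 1D gray values.
import numpy as np

def fcm(x, c=3, m=2.0, iters=100, seed=0):
    """x: (n,) gray values. Returns (centers, memberships u of shape (c, n))."""
    rng = np.random.default_rng(seed)
    u = rng.random((c, len(x)))
    u /= u.sum(axis=0)                          # memberships sum to 1 per pixel
    for _ in range(iters):
        um = u ** m
        centers = um @ x / um.sum(axis=1)       # fuzzily weighted means
        d = np.abs(x[None, :] - centers[:, None]) + 1e-9
        u = 1.0 / (d ** (2.0 / (m - 1.0)))      # standard FCM membership update
        u /= u.sum(axis=0)
    return centers, u
```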

  18. Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.

    Science.gov (United States)

    Denecke, Kerstin

    2016-01-01

    Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The full potential of these sources of experiential knowledge often remains unexploited, since retrieval and analysis are difficult and time-consuming, and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we describe how faceted search could offer intuitive retrieval of critical incident reports and how text mining could support the analysis of relations among events. To realise an automated analysis, natural language processing needs to be applied. Therefore, we analyse the language of critical incident reports and derive requirements for automatic processing methods. We learned that there is huge potential for automatic analysis of incident reports, but there are still challenges to be solved.

  19. Automatic detection of microcalcifications with multi-fractal spectrum.

    Science.gov (United States)

    Ding, Yong; Dai, Hang; Zhang, Hang

    2014-01-01

    To improve the detection of micro-calcifications (MCs), this paper proposes an automatic MC detection system that makes use of the multi-fractal spectrum of digitized mammograms. The system is based on the principle that normal tissue possesses certain fractal properties that change in the presence of MCs. The multi-fractal spectrum is applied to reveal such fractal properties. By quantifying the deviations of the multi-fractal spectra between normal tissue and MCs, the system can identify MCs that alter the fractal properties and finally locate their position. The performance of the proposed system is compared with leading automatic detection systems on a mammographic image database. Experimental results demonstrate that the proposed system is statistically superior to most of the compared systems and delivers superior performance.

  20. Template-based automatic extraction of the joint space of foot bones from CT scan

    Science.gov (United States)

    Park, Eunbi; Kim, Taeho; Park, Jinah

    2016-03-01

    Clean bone segmentation is critical in studying the joint anatomy for measuring the spacing between bones. However, separation of the coupled bones in CT images is sometimes difficult due to ambiguous gray values coming from noise and the heterogeneity of bone materials, as well as narrowing of the joint space. For fine reconstruction of the individual local boundaries, manual operation is common practice, and the segmentation remains a bottleneck. In this paper, we present an automatic method for extracting the joint space by applying graph cut on a Markov random field model to the region of interest (ROI), which is identified by a template of 3D bone structures. The template includes an encoded articular surface that identifies the tight region of the high-intensity bone boundaries together with the fuzzy joint area of interest. The localized shape information from the template model within the ROI effectively separates the nearby bones. By narrowing the ROI down to the region including two types of tissue, the object extraction problem is reduced to binary segmentation and solved via graph cut. Based on the shape of the joint space marked by the template, the hard constraint is set by initial seeds that are automatically generated from thresholding and morphological operations. The performance and the robustness of the proposed method are evaluated on 12 volumes of ankle CT data, where each volume includes a set of 4 tarsal bones (calcaneus, talus, navicular and cuboid).

  1. Automatic background knowledge selection for matching biomedical ontologies.

    Directory of Open Access Journals (Sweden)

    Daniel Faria

    Full Text Available Ontology matching is a growing field of research that is of critical importance for the semantic web initiative. The use of background knowledge for ontology matching is often a key factor for success, particularly in complex and lexically rich domains such as the life sciences. However, in most ontology matching systems, the background knowledge sources are either predefined by the system or have to be provided by the user. In this paper, we present a novel methodology for automatically selecting background knowledge sources for any given ontologies to match. This methodology measures the usefulness of each background knowledge source by assessing the fraction of classes mapped through it over those mapped directly, which we call the mapping gain. We implemented this methodology in the AgreementMakerLight ontology matching framework, and evaluate it using the benchmark biomedical ontology matching tasks from the Ontology Alignment Evaluation Initiative (OAEI 2013). In each matching problem, our methodology consistently identified the sources of background knowledge that led to the highest improvements over the baseline alignment (i.e., without background knowledge). Furthermore, our proposed mapping gain parameter is strongly correlated with the F-measure of the produced alignments, thus making it a good estimator for ontology matching techniques based on background knowledge.
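
    The mapping-gain computation is simple enough to state in code; the sketch below is one reading of the abstract's definition, with invented input names (sets of mapped class identifiers).

```python
# One reading of "mapping gain": classes gained via a background source,
# relative to the classes mapped directly (without background knowledge).
def mapping_gain(direct_mappings: set, background_mappings: set) -> float:
    gained = background_mappings - direct_mappings
    return len(gained) / len(direct_mappings) if direct_mappings else float("inf")

# Rank candidate background sources by estimated usefulness, e.g.:
# sorted(sources, key=lambda s: mapping_gain(direct, via[s]), reverse=True)
```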

  2. Alexithymia and automatic processing of emotional stimuli: a systematic review.

    Science.gov (United States)

    Donges, Uta-Susan; Suslow, Thomas

    2017-04-01

    Alexithymia is a personality trait characterized by difficulties in recognizing and verbalizing emotions and the utilization of a cognitive style that is oriented toward external events, rather than intrapsychic experiences. Alexithymia is considered a vulnerability factor influencing onset and course of many psychiatric disorders. Even though emotions are, in general, elicited involuntarily and emerge without conscious effort, it is surprising that little attention in etiological considerations concerning alexithymia has been given to deficits in automatic emotion processing and their neurobiological bases. In this article, results from studies using behavioral or neurobiological research methods were systematically reviewed in which automatic processing of external emotional information was investigated as a function of alexithymia in healthy individuals. Twenty-two studies were identified through a literature search of Psycinfo, PubMed, and Web of Science databases from 1990 to 2016. The review reveals deficits in the automatic processing of emotional stimuli in alexithymia at a behavioral and neurobiological level. The vast majority of the reviewed studies examined visual processing. The alexithymia facets externally oriented thinking and difficulties identifying feelings were found to be related to impairments in the automatic processing of threat-related facial expressions. Alexithymic individuals manifest low reactivity to barely visible negative emotional stimuli in brain regions responsible for appraisal, encoding, and affective response, e.g. amygdala, occipitotemporal areas, and insula. Against this background, it appears plausible to assume that deficits in automatic emotion processing could be factors contributing to alexithymic personality characteristics. Directions for future research on alexithymia and automatic emotion perception are suggested.

  3. Method: automatic segmentation of mitochondria utilizing patch classification, contour pair classification, and automatically seeded level sets.

    Science.gov (United States)

    Giuly, Richard J; Martone, Maryann E; Ellisman, Mark H

    2012-02-09

    While progress has been made to develop automatic segmentation techniques for mitochondria, there remains a need for more accurate and robust techniques to delineate mitochondria in serial blockface scanning electron microscopic data. Previously developed texture based methods are limited for solving this problem because texture alone is often not sufficient to identify mitochondria. This paper presents a new three-step method, the Cytoseg process, for automated segmentation of mitochondria contained in 3D electron microscopic volumes generated through serial block face scanning electron microscopic imaging. The method consists of three steps. The first is a random forest patch classification step operating directly on 2D image patches. The second step consists of contour-pair classification. At the final step, we introduce a method to automatically seed a level set operation with output from previous steps. We report accuracy of the Cytoseg process on three types of tissue and compare it to a previous method based on Radon-Like Features. At step 1, we show that the patch classifier identifies mitochondria texture but creates many false positive pixels. At step 2, our contour processing step produces contours and then filters them with a second classification step, helping to improve overall accuracy. We show that our final level set operation, which is automatically seeded with output from previous steps, helps to smooth the results. Overall, our results show that use of contour pair classification and level set operations improve segmentation accuracy beyond patch classification alone. We show that the Cytoseg process performs well compared to another modern technique based on Radon-Like Features. We demonstrated that texture based methods for mitochondria segmentation can be enhanced with multiple steps that form an image processing pipeline. While we used a random-forest based patch classifier to recognize texture, it would be possible to replace this with
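
    A hedged sketch of step 1 only (the random-forest patch classification); the patch size, training data, and labels below are synthetic placeholders, and steps 2 and 3 (contour-pair classification, seeded level sets) are not shown.

```python
# Random-forest classification of small 2D image patches.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_patches(image, centers, half=5):
    """Cut (2*half+1)^2 pixel patches around given (row, col) centers."""
    return np.array([image[r - half:r + half + 1, c - half:c + half + 1].ravel()
                     for r, c in centers])

rng = np.random.default_rng(2)
image = rng.random((256, 256))                 # stand-in EM image
centers = [(r, c) for r, c in rng.integers(6, 250, size=(300, 2))]
X = extract_patches(image, centers)
y = rng.integers(0, 2, size=len(X))            # stand-in annotation labels

rf = RandomForestClassifier(n_estimators=100).fit(X, y)
probability_map = rf.predict_proba(X)[:, 1]    # per-patch mitochondria score
```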

  5. Automatic Control System for Neutron Laboratory Safety

    Institute of Scientific and Technical Information of China (English)

    ZHAO; Xiao; ZHANG; Guo-guang; FENG; Shu-qiang; SU; Dan; YANG; Guo-zhao; ZHANG; Shuai

    2015-01-01

    In order to support the neutron generator experiment and realize automatic control of the experiment, an automatic control system for neutron laboratory safety was designed. The system block diagram is shown in Fig. 1. The automatic control device processes switch signals, so a PLC is selected as the core component.

  6. An Automatic Proof of Euler's Formula

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    2005-05-01

    Full Text Available In this information age, everything is digitalized. The encoding of functions and the automatic proof of functions are important. This paper discusses the automatic calculation of Taylor expansion coefficients; as an example, the technique can be applied to prove Euler's formula automatically.
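
    A small illustration of the idea (ours, not the paper's system), using sympy to compute Taylor coefficients automatically and compare exp(ix) with cos(x) + i sin(x) term by term.

```python
# Verify Euler's formula by comparing automatically computed Taylor series.
import sympy as sp

x = sp.symbols("x", real=True)
n = 12  # compare the first n Taylor coefficients

lhs = sp.series(sp.exp(sp.I * x), x, 0, n).removeO()
rhs = sp.series(sp.cos(x) + sp.I * sp.sin(x), x, 0, n).removeO()
assert sp.simplify(lhs - rhs) == 0   # coefficients agree up to order n
print("Euler's formula verified up to order", n)
```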

  7. Self-Compassion and Automatic Thoughts

    Science.gov (United States)

    Akin, Ahmet

    2012-01-01

    The aim of this research is to examine the relationships between self-compassion and automatic thoughts. Participants were 299 university students. In this study, the Self-compassion Scale and the Automatic Thoughts Questionnaire were used. The relationships between self-compassion and automatic thoughts were examined using correlation analysis…

  8. 8 CFR 1205.1 - Automatic revocation.

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Automatic revocation. 1205.1 Section 1205.1... REGULATIONS REVOCATION OF APPROVAL OF PETITIONS § 1205.1 Automatic revocation. (a) Reasons for automatic revocation. The approval of a petition or self-petition made under section 204 of the Act and in...

  9. 8 CFR 205.1 - Automatic revocation.

    Science.gov (United States)

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Automatic revocation. 205.1 Section 205.1 Aliens and Nationality DEPARTMENT OF HOMELAND SECURITY IMMIGRATION REGULATIONS REVOCATION OF APPROVAL OF PETITIONS § 205.1 Automatic revocation. (a) Reasons for automatic revocation. The approval of a petition...

  10. Automatic spikes detection in seismogram

    Institute of Scientific and Technical Information of China (English)

    王海军; 靳平; 刘贵忠

    2003-01-01

    Data processing for a seismic network is complex and tedious, because a large amount of data is recorded every day, making it impossible to process all of it manually. Therefore, seismic data should be processed automatically to produce initial results for event detection and location; these results are then reviewed and modified by an analyst. In automatic processing, data quality checking is important. Three main types of problem data exist in real seismic records: spikes, repeated data, and dropouts. A spike is defined as an isolated large-amplitude point; the other two problem types share the feature that the amplitudes of sample points are uniform over an interval. In data quality checking, the first step is to detect and count problem data in a data segment; if the percentage of problem data exceeds a threshold, the whole segment is masked and not processed further.
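
    A simple sketch of the spike definition above (isolated large-amplitude points), using a robust median/MAD threshold; the threshold k is illustrative, not taken from the article.

```python
# Detect isolated large-amplitude points (spikes) in a seismogram trace.
import numpy as np

def find_spikes(trace, k=8.0):
    """Return indices of isolated points deviating k MADs from the median."""
    med = np.median(trace)
    mad = np.median(np.abs(trace - med)) + 1e-12
    big = np.abs(trace - med) > k * mad
    spikes = []
    for i in np.flatnonzero(big):
        left = big[i - 1] if i > 0 else False
        right = big[i + 1] if i < len(big) - 1 else False
        if not left and not right:      # isolated: both neighbours are ordinary
            spikes.append(i)
    return np.array(spikes, dtype=int)
```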

  11. Automatic Schema Evolution in Root

    Institute of Scientific and Technical Information of China (English)

    Rene Brun; Fons Rademakers

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition, this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and its objects inspected, even when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file.

  12. Automatic design of magazine covers

    Science.gov (United States)

    Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.

    2012-03-01

    In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.

  13. Laser Scanner For Automatic Storage

    Science.gov (United States)

    Carvalho, Fernando D.; Correia, Bento A.; Rebordao, Jose M.; Rodrigues, F. Carvalho

    1989-01-01

    Automated magazines are being used in industry more and more. One of the problems related to the automation of a storehouse is the identification of the products involved. Already used for stock management, bar codes offer an easy way to identify a product. Applied to automated magazines, bar codes allow a great variety of items in a small code. In order to be used by the national producers of automated magazines, a dedicated laser scanner has been developed. The prototype uses an He-Ne laser whose beam scans a field angle of 75 degrees at 16 Hz. The scene reflectivity is transduced by a photodiode into an electrical signal, which is then binarized. This digital signal is the input of the decoding program. The machine is able to see bar codes and to decode the information. A parallel interface allows communication with the central unit, which is responsible for the management of the automated magazine.

  14. The Automatic Galaxy Collision Software

    CERN Document Server

    Smith, Beverly J; Pfeiffer, Phillip; Perkins, Sam; Barkanic, Jason; Fritts, Steve; Southerland, Derek; Manchikalapudi, Dinikar; Baker, Matt; Luckey, John; Franklin, Coral; Moffett, Amanda; Struck, Curtis

    2009-01-01

    The key to understanding the physical processes that occur during galaxy interactions is dynamical modeling, and especially the detailed matching of numerical models to specific systems. To make modeling interacting galaxies more efficient, we have constructed the `Automatic Galaxy Collision' (AGC) code, which requires less human intervention in finding good matches to data. We present some preliminary results from this code for the well-studied system Arp 284 (NGC 7714/5), and address questions of uniqueness of solutions.

  15. Automatic computation of transfer functions

    Science.gov (United States)

    Atcitty, Stanley; Watson, Luke Dale

    2015-04-14

    Technologies pertaining to the automatic computation of transfer functions for a physical system are described herein. The physical system is one of an electrical system, a mechanical system, an electromechanical system, an electrochemical system, or an electromagnetic system. A netlist in the form of a matrix comprises data that is indicative of elements in the physical system, values for the elements in the physical system, and structure of the physical system. Transfer functions for the physical system are computed based upon the netlist.

  16. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; the ALGOL 68. Other chapters discuss the general purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming language is also shown.

  17. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used is a combination of two methods: the maximum correlation coefficient and the correlation in the subpixel range. ... interactive software is also part of a computer-assisted learning program on digital photogrammetry.

  18. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, acronym for Automatically Programed Tools, is among most widely used software systems for computerized machine tools. APT developed for explicit purpose of providing effective software system for programing NC machine tools. APT system includes specification of APT programing language and language processor, which executes APT statements and generates NC machine-tool motions specified by APT statements.

  20. The Mark II Automatic Diflux

    Directory of Open Access Journals (Sweden)

    Jean L Rasson

    2011-07-01

    Full Text Available We report here on the new realization of an automatic fluxgate theodolite able to perform unattended absolute geomagnetic declination and inclination measurements: the AUTODIF MKII. The main changes in this version compared with the former one are presented, as well as the improved specifications we now expect. We also explain the absolute orientation procedure by means of a laser beam and a corner cube, and the method for leveling the fluxgate sensor, which differs from that of a conventional DIflux theodolite.

  1. Automatic Network Fingerprinting through Single-Node Motifs

    CERN Document Server

    Echtermeyer, Christoph; Rodrigues, Francisco A; Kaiser, Marcus; 10.1371/journal.pone.0015765

    2011-01-01

    Complex networks have been characterised by their specific connectivity patterns (network motifs), but their building blocks can also be identified and described by node-motifs: a combination of local network features. One technique to identify single node-motifs has been presented by Costa et al. (L. D. F. Costa, F. A. Rodrigues, C. C. Hilgetag, and M. Kaiser, Europhys. Lett., 87, 1, 2009). Here, we first suggest improvements to the method, including how its parameters can be determined automatically. Such automatic routines make high-throughput studies of many networks feasible. Second, the new routines are validated on different network series. Third, we provide an example of how the method can be used to analyse network time series. In conclusion, we provide a robust method for systematically discovering and classifying characteristic nodes of a network. In contrast to classical motif analysis, our approach can identify individual components (here: nodes) that are specific to a network. Such special nodes...

  2. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...

  3. Automatic system for corneal ulcer diagnostic: II

    Science.gov (United States)

    Ventura, Liliane; Chiaradia, Caio; Faria de Sousa, Sidney J.

    1998-06-01

    A corneal ulcer is a de-epithelization of the cornea and a very common disease in agricultural countries. The parameter most used by clinicians to identify favorable ulcer evolution is the regression of the affected area. However, this kind of evaluation is subjective, since only the horizontal and vertical axes are measured on a graduated scale and the affected area is estimated; moreover, the disease is documented by photographs. In order to overcome this subjectiveness and to store the images in a more accessible way (hard disks, floppy disks, etc.), we have developed an automatic system to evaluate the affected area (the ulcer). An optical system implemented on a slit lamp (SL) is connected to a CCD detector. The image is displayed on a PC monitor by a commercial frame grabber, and dedicated software for determining the area of the ulcer (precision of 20 mm) has been developed.

  4. Automatic hanging protocol for chest radiographs

    Science.gov (United States)

    Luo, Hui; Hao, Wei; Cornelius, Craig

    2005-04-01

    Chest radiography is one of the most widely used techniques in diagnostic imaging. It makes up at least one third of all conventional diagnostic radiographic procedures in hospitals. However, in both film-screen and computed radiography, images are often digitized with the view and orientation unknown or mislabeled, which causes inefficiency in displaying them in the picture archive and communication system (PACS). Hence, the goal of this work is to provide a robust, efficient, and automatic hanging protocol for chest radiographs. To achieve it, the method starts with recognition by extracting a set of distinctive features from chest radiographs. Next, a well-defined probabilistic classifier is used to train and classify the radiographs. The orientation of the radiographs is identified by an efficient algorithm that locates the neck, heart, and abdomen positions in the radiographs. The initial experiment was performed on radiographs collected from daily routine chest exams in hospitals, and it has shown promising results.

  5. A Review of Methods of Instance-based Automatic Image Annotation

    Directory of Open Access Journals (Sweden)

    Morad Derakhshan

    2016-12-01

    Full Text Available Today, the use of automatic image annotation to fill the semantic gap between the low-level features of images and an understanding of their information in the retrieval process has become popular. Since automatic image annotation is crucial for understanding digital images, several methods have been proposed to annotate an image automatically. Among the most important of these methods is instance-based image annotation. As these methods are widely used, this paper analyzes the most important instance-based image annotation methods. First, the main components of instance-based automatic image annotation are analyzed. Afterwards, the main instance-based methods are reviewed and compared with respect to various features. Finally, the most important challenges and open problems in instance-based image annotation are discussed.

  6. ORCID Author Identifiers: A Primer for Librarians.

    Science.gov (United States)

    Akers, Katherine G; Sarkozy, Alexandra; Wu, Wendy; Slyman, Alison

    2016-01-01

    The ORCID (Open Researcher and Contributor ID) registry helps disambiguate authors and streamline research workflows by assigning unique 16-digit author identifiers that enable automatic linkages between researchers and their scholarly activities. This article describes how ORCID works, the benefits of using ORCID, and how librarians can promote ORCID at their institutions by raising awareness of ORCID, helping researchers create and populate ORCID profiles, and integrating ORCID identifiers into institutional repositories and other university research information systems.

  7. Video Analytics Algorithm for Automatic Vehicle Classification (Intelligent Transport System)

    Directory of Open Access Journals (Sweden)

    Arta Iftikhar

    2013-04-01

    Full Text Available Automated vehicle detection and classification is an important component of intelligent transport systems. Due to its significant importance in various fields, such as traffic accident avoidance, toll collection, congestion avoidance, monitoring of terrorist activities, and security and surveillance systems, the intelligent transport system has become an important field of study. Various technologies have been used for detecting and classifying vehicles automatically. Automated vehicle detection is broadly divided into two types: hardware-based and software-based detection. Various algorithms have been implemented to classify different vehicles from videos. In this paper, an efficient and economical solution for automatic vehicle detection and classification is proposed. The proposed system first isolates the object through background subtraction, followed by vehicle detection using ontology. Vehicle detection is based on low-level features such as shape, size, and spatial location. Finally, the system classifies each vehicle into one of the known vehicle classes based on size.
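
    A rough sketch of the background-subtraction and size-based classification steps using OpenCV; the video path, noise threshold and class size limits are illustrative assumptions, not values from the paper.

    import cv2
    import numpy as np

    # Illustrative size thresholds in pixels^2; a real deployment would calibrate
    # them against the camera geometry.
    SIZE_CLASSES = [(1500, "motorcycle"), (8000, "car"), (float("inf"), "truck/bus")]

    def classify_by_area(area):
        for limit, label in SIZE_CLASSES:
            if area < limit:
                return label

    cap = cv2.VideoCapture("traffic.mp4")                  # hypothetical input video
    subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
    kernel = np.ones((5, 5), np.uint8)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = subtractor.apply(frame)                     # background subtraction
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for c in contours:
            area = cv2.contourArea(c)
            if area > 500:                                 # drop small noise blobs
                print(classify_by_area(area))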

  8. ALOHA: Automatic libraries of helicity amplitudes for Feynman diagram computations

    Science.gov (United States)

    de Aquino, Priscila; Link, William; Maltoni, Fabio; Mattelaer, Olivier; Stelzer, Tim

    2012-10-01

    We present an application that automatically writes the HELAS (HELicity Amplitude Subroutines) library corresponding to the Feynman rules of any quantum field theory Lagrangian. The code is written in Python and takes the Universal FeynRules Output (UFO) as an input. From this input it produces the complete set of routines, wave-functions and amplitudes, that are needed for the computation of Feynman diagrams at leading as well as at higher orders. The representation is language independent and currently it can output routines in Fortran, C++, and Python. A few sample applications implemented in the MADGRAPH 5 framework are presented. Program summary Program title: ALOHA Catalogue identifier: AEMS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: http://www.opensource.org/licenses/UoI-NCSA.php No. of lines in distributed program, including test data, etc.: 6094320 No. of bytes in distributed program, including test data, etc.: 7479819 Distribution format: tar.gz Programming language: Python 2.6 Computer: 32/64 bit Operating system: Linux/Mac/Windows RAM: 512 Mbytes Classification: 4.4, 11.6 Nature of problem: An efficient numerical evaluation of a squared matrix element can be done with the help of the helicity routines implemented in the HELAS library [1]. This static library contains a limited number of helicity functions and is therefore not always able to provide the needed routine in the presence of an arbitrary interaction. This program provides a way to automatically create the corresponding routines for any given model. Solution method: ALOHA takes the Feynman rules associated with the vertex obtained from the model information (in the UFO format [2]), and multiplies them by the different wavefunctions or propagators. As a result the analytical expression of the helicity routines is obtained. Subsequently, this expression is

  9. Automatic noninvasive measurement of systolic blood pressure using photoplethysmography

    Directory of Open Access Journals (Sweden)

    Glik Zehava

    2009-10-01

    Full Text Available Abstract Background Automatic measurement of arterial blood pressure is important, but the available commercial automatic blood pressure meters, mostly based on oscillometry, are of low accuracy. Methods In this study, we present a cuff-based technique for automatic measurement of systolic blood pressure, based on photoplethysmographic signals measured simultaneously in fingers of both hands. After inflating the pressure cuff to a level above systolic blood pressure at a relatively slow rate, it is slowly deflated. The cuff pressure at which the photoplethysmographic signal reappeared during deflation of the pressure cuff was taken as the systolic blood pressure. The algorithm for the detection of the photoplethysmographic signal involves: (1) determination of the time-segments in which the photoplethysmographic signal distal to the cuff is expected to appear, utilizing the photoplethysmographic signal in the free hand, and (2) discrimination between random fluctuations and the photoplethysmographic pattern. The detected pulses in the time-segments were identified as photoplethysmographic pulses if they met two criteria, based on the pulse waveform and on the correlation between the signal in each segment and the signal in the two neighboring segments. Results Comparison of the photoplethysmographic-based automatic technique to sphygmomanometry, the reference standard, shows that the standard deviation of their differences was 3.7 mmHg. For subjects with systolic blood pressure above 130 mmHg the standard deviation was even lower, 2.9 mmHg. These values are much lower than the 8 mmHg limit imposed by the AAMI standard for automatic blood pressure meters. Conclusion The photoplethysmographic-based technique for automatic measurement of systolic blood pressure, with the algorithm presented in this study, appears to be accurate.
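
    A minimal sketch of the detection idea, assuming synchronized arrays of cuff pressure and distal PPG samples recorded during deflation; the peak criteria below are simplified placeholders for the paper's waveform and correlation tests.

    import numpy as np
    from scipy.signal import find_peaks

    def systolic_bp(cuff_pressure, ppg_distal, fs, window_s=2.0):
        """Cuff pressure (mmHg) at which distal PPG pulses reappear during deflation."""
        win = int(window_s * fs)
        for start in range(0, len(ppg_distal) - win, win):
            seg = ppg_distal[start:start + win]
            seg = seg - np.mean(seg)
            # A plausible pulse train: sufficiently tall peaks at heart-rate-like spacing
            peaks, _ = find_peaks(seg, height=0.5 * np.std(seg), distance=int(0.4 * fs))
            if len(peaks) >= 2:
                return float(cuff_pressure[start + peaks[0]])
        return float("nan")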

  10. Automatic alignment of double optical paths in excimer laser amplifier

    Science.gov (United States)

    Wang, Dahui; Zhao, Xueqing; Hua, Hengqi; Zhang, Yongsheng; Hu, Yun; Yi, Aiping; Zhao, Jun

    2013-05-01

    A beam automatic alignment method used for double-path amplification in an electron-pumped excimer laser system is demonstrated. In this way, the beams from the amplifiers can be transferred along the designated direction and accordingly irradiate the target with high stability and accuracy. However, owing to the absence of natural alignment references in excimer laser amplifiers, a two-cross-hair structure is used to align the beams. Here, one cross-hair placed in the input beam is regarded as the near-field reference, while the other, placed in the output beam, is regarded as the far-field reference. The two cross-hairs are imaged onto Charge Coupled Devices (CCD) by separate image-relaying structures. The errors between the intersection points of the two cross-hair images and the centroid coordinates of the actual beam are recorded automatically and sent to a closed-loop feedback control mechanism. Negative feedback keeps running until a preset accuracy is reached. On the basis of the above-mentioned design, the alignment optical path was built and the software compiled, after which the experiment of double-path automatic alignment in the electron-pumped excimer laser amplifier was carried out. The related influencing factors and the alignment precision are also analyzed. Experimental results indicate that the alignment system can achieve the aiming direction of automatically aligned beams in a short time. The analysis shows that the accuracy of the alignment system is 0.63 μrad and the maximum beam restoration error is 13.75 μm. Furthermore, the larger the distance between the two cross-hairs, the higher the precision of the system. The automatic alignment system has been used in an angular-multiplexing excimer Main Oscillation Power Amplification (MOPA) system and can satisfy the requirement of beam alignment precision on the whole.
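
    The error signal driving the feedback loop is the offset between the beam's intensity centroid and the cross-hair intersection on the CCD. A small illustration with a synthetic frame; the reference coordinates and spot position are invented.

    import numpy as np

    def beam_centroid(image):
        """Intensity-weighted centroid (x, y) of a beam spot on a CCD frame."""
        img = np.asarray(image, dtype=float)
        ys, xs = np.indices(img.shape)
        total = img.sum()
        return (xs * img).sum() / total, (ys * img).sum() / total

    # Synthetic CCD frame with a Gaussian spot near (520, 380)
    yy, xx = np.mgrid[0:768, 0:1024]
    ccd_frame = np.exp(-((xx - 520.0) ** 2 + (yy - 380.0) ** 2) / (2 * 25.0 ** 2))

    reference = (512.0, 384.0)          # cross-hair intersection (pixel coordinates)
    cx, cy = beam_centroid(ccd_frame)
    error = (cx - reference[0], cy - reference[1])   # fed back to the mirror actuators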

  11. Poster — Thur Eve — 70: Automatic lung bronchial and vessel bifurcations detection algorithm for deformable image registration assessment

    Energy Technology Data Exchange (ETDEWEB)

    Labine, Alexandre; Carrier, Jean-François; Bedwani, Stéphane [Centre hospitalier de l'Université de Montréal (Canada); Chav, Ramnada; De Guise, Jacques [Laboratoire de recherche en imagerie et d'orthopédie-CRCHUM, École de technologie supérieure (Canada)

    2014-08-15

    Purpose: To investigate an automatic bronchial and vessel bifurcation detection algorithm for deformable image registration (DIR) assessment to improve lung cancer radiation treatment. Methods: 4DCT datasets were acquired and exported to the Varian treatment planning system (TPS) Eclipse™ for contouring. The TPS lung contour was used as the prior shape for a segmentation algorithm based on hierarchical surface deformation that identifies the deformed lung volumes of the 10 breathing phases. A Hounsfield unit (HU) threshold filter was applied within the segmented lung volumes to identify blood vessels and airways. The segmented blood vessels and airways were skeletonised using a hierarchical curve-skeleton algorithm based on a generalized potential field approach. A graph representation of the computed skeleton was generated to assign one of three labels to each node: termination node, continuation node or branching node. Results: 320 ± 51 bifurcations were detected in the right lung of a patient for the 10 breathing phases. The bifurcations were visually analyzed. 92 ± 10 bifurcations were found in the upper half of the lung and 228 ± 45 bifurcations were found in the lower half of the lung. Discrepancies between the ten vessel trees were mainly ascribed to large deformations and to regions where the HU varies. Conclusions: We established an automatic method for DIR assessment using the morphological information of the patient anatomy. This approach allows a description of the lung's internal structure movement, which is needed to validate the DIR deformation fields for accurate 4D cancer treatment planning.
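
    The node-labeling step can be illustrated on a graph representation of the skeleton: a node of degree 1 is a termination, degree 2 a continuation, and degree 3 or more a branching (candidate bifurcation). A toy sketch with networkx, which is our choice of library rather than necessarily the authors':

    import networkx as nx

    def label_skeleton_nodes(g: nx.Graph) -> dict:
        """Label each skeleton node by degree: 1 = termination,
        2 = continuation, >= 3 = branching (candidate bifurcation)."""
        labels = {}
        for node in g.nodes:
            d = g.degree(node)
            labels[node] = ("termination" if d == 1
                            else "continuation" if d == 2
                            else "branching")
        return labels

    # Toy skeleton: a vessel that splits into two branches at node 2
    g = nx.Graph([(0, 1), (1, 2), (2, 3), (2, 4)])
    print(label_skeleton_nodes(g))   # node 2 -> 'branching'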

  12. Unification of automatic target tracking and automatic target recognition

    Science.gov (United States)

    Schachter, Bruce J.

    2014-06-01

    The subject being addressed is how an automatic target tracker (ATT) and an automatic target recognizer (ATR) can be fused together so tightly and so well that their distinctiveness becomes lost in the merger. This has historically not been the case outside of biology and a few academic papers. The biological model of ATT∪ATR arises from dynamic patterns of activity distributed across many neural circuits and structures (including the retina). The information that the brain receives from the eyes is "old news" at the time that it receives it. The eyes and brain forecast a tracked object's future position, rather than relying on received retinal position. Anticipation of the next moment - building up a consistent perception - is accomplished under difficult conditions: motion (eyes, head, body, scene background, target) and processing limitations (neural noise, delays, eye jitter, distractions). Not only does the human vision system surmount these problems, but it has innate mechanisms to exploit motion in support of target detection and classification. Biological vision doesn't normally operate on snapshots. Feature extraction, detection and recognition are spatiotemporal. When vision is viewed as a spatiotemporal process, target detection, recognition, tracking, event detection and activity recognition do not seem as distinct as they are in current ATT and ATR designs. They appear as similar mechanisms taking place at varying time scales. A framework is provided for unifying ATT and ATR.

  13. Data-driven approach to identify field-scale biogeochemical transitions using geochemical and geophysical data and hidden Markov models: Development and application at a uranium-contaminated aquifer

    Science.gov (United States)

    Chen, Jinsong; Hubbard, Susan S.; Williams, Kenneth H.

    2013-10-01

    Although mechanistic reaction networks have been developed to quantify the biogeochemical evolution of subsurface systems associated with bioremediation, it is difficult in practice to quantify the onset and distribution of these transitions at the field scale using commonly collected wellbore datasets. As an alternative to the mechanistic methods, we develop a data-driven, statistical model to identify biogeochemical transitions using various time-lapse aqueous geochemical data (e.g., Fe(II), sulfate, sulfide, acetate, and uranium concentrations) and induced polarization (IP) data. We assume that the biogeochemical transitions can be classified into several dominant states that correspond to redox transitions, and test the method at a uranium-contaminated site. The relationships between the geophysical observations and the geochemical time series vary depending upon the unknown underlying redox status, which is modeled as a hidden Markov random field. We estimate the unknown parameters by maximizing the joint likelihood function using the expectation-maximization algorithm. The case study results show that, when considered together, aqueous geochemical data and IP imaginary conductivity provide a key diagnostic signature of biogeochemical stages. The developed method provides useful information for evaluating the effectiveness of bioremediation, such as the probability of being in specific redox stages following biostimulation where desirable pathways (e.g., uranium removal) are more highly favored. The use of geophysical data in the approach advances the possibility of using noninvasive methods to monitor critical biogeochemical system stages and transitions remotely and over field-relevant scales (e.g., from square meters to several hectares).
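
    The authors fit a hidden Markov random field by expectation-maximization; as a loose, generic stand-in, the sketch below fits an ordinary Gaussian HMM to synthetic multivariate time series with the hmmlearn package and decodes the most likely state sequence. The three-state choice and all data are illustrative.

    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    # Synthetic stand-in: rows are time samples, columns are observables such as
    # Fe(II), sulfate, acetate and IP imaginary conductivity.
    rng = np.random.default_rng(0)
    observations = np.vstack([rng.normal(m, 0.3, size=(100, 4)) for m in (0.0, 1.5, 3.0)])

    model = GaussianHMM(n_components=3, covariance_type="full", n_iter=200, random_state=0)
    model.fit(observations)                         # EM estimation of unknown parameters
    states = model.predict(observations)            # most likely hidden state per sample
    posteriors = model.predict_proba(observations)  # per-state probabilities over time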

  14. Development of Automatic Remote Exposure Controller for Gamma Radiography

    Energy Technology Data Exchange (ETDEWEB)

    Joo, Gwang Tae; Shin, Jin Seong; Kim, Dong Eun; Song, Jung Ho; Choo, Seung Hwan; Chang, Hong Keun [Korea Industrial Testing Co., Ltd., Seoul (Korea, Republic of)

    2002-10-15

    Recently, about 1,000 sets of gamma radiographic equipment have been in use in Korea, operated manually by about 2,500 workers. In order for radiography to be carried out effectively while avoiding the hazard of high-level radiation from the source, many field workers have long expected the development of a wireless automatic remote exposure controller. The KITCO research team has developed an automatic remote exposure controller that can regulate the speed over 0.4~1.2 m/s using a 24 V, 200 W BLDC motor with an output of 54 kgf, providing suitable torque and a safety factor for the work. The developed automatic remote exposure controller can control the motor rpm, the pigtail position via a photo-sensor, and the exposure time via a timer with an RF sensor. The developed equipment is thus expected to be usable in many practical applications, with the economic advantage of combining automatic and manual operation, since it can be attached to existing manual remote exposure controllers and supports both AC and DC power.

  15. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 2 is a collection of papers that discusses the controversy about the suitability of COBOL as a common business-oriented language, and the development of different common languages for scientific computation. A couple of papers describe the use of the Genie system in numerical calculation and analyze Mercury Autocode in terms of a phrase structure language, covering the source language, the target language, the order structure of ATLAS, and the meta-syntactical language of the assembly program. Other papers explain interference or an "intermediate...

  16. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    Steve Renals

    2011-10-01

    This paper is about the recognition and interpretation of multiparty meetings captured as audio, video and other signals. This is a challenging task since the meetings consist of spontaneous and conversational interactions between a number of participants: it is a multimodal, multiparty, multistream problem. We discuss the capture and annotation of the Augmented Multiparty Interaction (AMI) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings, together with some example applications based on these systems.

  17. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming, Volume 4 is a collection of papers that deals with the GIER ALGOL compiler, a parameterized compiler based on mechanical linguistics, and the JOVIAL language. A couple of papers describe a commercial use of stacks, an IBM system, and what an ideal computer program support system should be. One paper reviews the system of compilation, the development of a more advanced language, programming techniques, machine independence, and program transfer to other machines. Another paper describes the ALGOL 60 system for the GIER machine, including running ALGOL programs.

  18. Automatic Inference of DATR Theories

    CERN Document Server

    Barg, P

    1996-01-01

    This paper presents an approach for the automatic acquisition of linguistic knowledge from unstructured data. The acquired knowledge is represented in the lexical knowledge representation language DATR. A set of transformation rules that establish inheritance relationships and a default-inference algorithm make up the basic components of the system. Since the overall approach is not restricted to a special domain, the heuristic inference strategy uses criteria to evaluate the quality of a DATR theory, where different domains may require different criteria. The system is applied to the linguistic learning task of German noun inflection.

  19. Coordinated hybrid automatic repeat request

    KAUST Repository

    Makki, Behrooz

    2014-11-01

    We develop a coordinated hybrid automatic repeat request (HARQ) approach. With the proposed scheme, if a user message is correctly decoded in the first HARQ rounds, its spectrum is allocated to other users, to improve the network outage probability and the users' fairness. The results, which are obtained for single- and multiple-antenna setups, demonstrate the efficiency of the proposed approach in different conditions. For instance, with a maximum of M retransmissions and single transmit/receive antennas, the diversity gain of a user increases from M to (J+1)(M-1)+1, where J is the number of users helping that user.
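
    A quick worked instance of the quoted diversity gain, using the formula stated in the abstract:

    \[ d = (J+1)(M-1) + 1, \qquad J = 2,\ M = 3 \;\Rightarrow\; d = 3 \cdot 2 + 1 = 7, \]

    compared with a diversity gain of \(M = 3\) for the same user without coordination.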

  20. Commutated automatic gain control system

    Science.gov (United States)

    Yost, S. R.

    1982-01-01

    The commutated automatic gain control (AGC) system designed and built for the prototype Loran-C receiver is discussed. The current version of the prototype receiver, the Mini L-80, was tested initially in 1980. The receiver uses a Super Jolt microcomputer to control a memory-aided phase-locked loop (MAPLL). The microcomputer also controls the input/output, latitude/longitude conversion, and the recently added AGC system. The AGC adjusts the level of each station signal such that the early portion of each envelope rise is at about the same amplitude in the receiver envelope detector.

  1. On automatic machine translation evaluation

    Directory of Open Access Journals (Sweden)

    Darinka Verdonik

    2013-05-01

    Full Text Available An important task in developing machine translation (MT) is evaluating system performance. Automatic measures are most commonly used for this task, as manual evaluation is time-consuming and costly. However, performing an objective evaluation is not a trivial task. Automatic measures, such as BLEU, TER, NIST, METEOR etc., have their own weaknesses, while manual evaluations are also problematic, since they are always to some extent subjective. In this paper we test the influence of the test set on the results of automatic MT evaluation for the subtitling domain. Translating subtitles is a rather specific task for MT, since subtitles are a sort of summarization of spoken text rather than a direct translation of (written) text. An additional problem when translating a language pair that does not include English, in our example Slovene-Serbian, is that the translations are commonly done from English to Serbian and from English to Slovene, and not directly, since most TV production is originally filmed in English. All this poses additional challenges to MT and consequently to MT evaluation. Automatic evaluation is based on a reference translation, which is usually taken from an existing parallel corpus and marked as a test set. In our experiments, we compare the evaluation results for the same MT system output using three types of test set. In the first round, the test set is 4000 subtitles from the SUMAT parallel corpus of subtitles. These subtitles are not direct translations from Serbian to Slovene or vice versa, but are based on an English original. In the second round, the test set is 1000 subtitles randomly extracted from the first test set and translated anew, from Serbian to Slovene, based solely on the Serbian written subtitles. In the third round, the test set is the same 1000 subtitles, but this time the Slovene translations were obtained by manually correcting the Slovene MT outputs so that they are correct translations of the Serbian subtitles.
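
    As a concrete illustration of reference-based automatic scoring, the sketch below computes corpus-level BLEU and TER with the sacrebleu package for a toy hypothesis/reference pair; the strings are invented placeholders, not the SUMAT test data.

    import sacrebleu
    from sacrebleu.metrics import TER

    hypotheses = ["the weather is nice today", "he did not come to the meeting"]
    references = [["the weather is beautiful today", "he didn't come to the meeting"]]

    bleu = sacrebleu.corpus_bleu(hypotheses, references)   # corpus-level BLEU
    ter = TER().corpus_score(hypotheses, references)       # translation edit rate
    print(f"BLEU = {bleu.score:.1f}, TER = {ter.score:.1f}")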

  2. Automatic Identification and Data Extraction from 2-Dimensional Plots in Digital Documents

    CERN Document Server

    Brouwer, William; Das, Sujatha; Mitra, Prasenjit; Giles, C L

    2008-01-01

    Most search engines index the textual content of documents in digital libraries. However, scholarly articles frequently report important findings in figures for visual impact and the contents of these figures are not indexed. These contents are often invaluable to the researcher in various fields, for the purposes of direct comparison with their own work. Therefore, searching for figures and extracting figure data are important problems. To the best of our knowledge, there exists no tool to automatically extract data from figures in digital documents. If we can extract data from these images automatically and store them in a database, an end-user can query and combine data from multiple digital documents simultaneously and efficiently. We propose a framework based on image analysis and machine learning to extract information from 2-D plot images and store them in a database. The proposed algorithm identifies a 2-D plot and extracts the axis labels, legend and the data points from the 2-D plot. We also segrega...

  3. Digital Identifier Systems: Comparative Evaluation

    Directory of Open Access Journals (Sweden)

    Hamid Reza Khedmatgozar

    2015-02-01

    Full Text Available An identifier is one of the main elements in identifying an object in the digital environment. Digital identifier systems were developed in response to problems such as the violation of the persistency and uniqueness of physical identifiers and URLs in the digital environment; these identifiers try to guarantee the uniqueness and persistency of hostnames by using indirect names for the Domain Name System (DNS). The main objective of this research is to identify qualified digital identifier systems among the existing systems. To achieve this objective, the researchers took two major steps: first, identifying the main criteria for distinguishing digital identifiers, based on a literature review and a focus group interview; and second, performing a comparative evaluation of the common identifier systems in the world. Findings of the first step yielded seven main criteria in three domains for distinguishing digital identifier systems: identifier uniqueness and persistency in the identifier-features domain; digital identification, digital uniqueness, digital persistency and digital actionability in the digital-coverage domain; and globality in the comprehensiveness-of-scope domain. In the second step, the results of the comparative evaluation indicated that six identifier systems, namely DOI, Handle, UCI, URN, ARK and PURL, are appropriate choices for use as digital identifier systems. Also, according to these results, three identification systems, including NBN, MARIAM and ISNI, were identified as suitable choices for digital identification in certain specialized fields. Given the many benefits of using these identifiers in important applied fields, such as digital content chains and network integration, digital rights management, cross-referencing, digital libraries and citation analysis, the results of this study can help digital environment experts to select digital identifiers and use them effectively in applied fields.

  4. Automatic classification of blank substrate defects

    Science.gov (United States)

    Boettiger, Tom; Buck, Peter; Paninjath, Sankaranarayanan; Pereira, Mark; Ronald, Rob; Rost, Dan; Samir, Bhamidipati

    2014-10-01

    Mask preparation stages are crucial in mask manufacturing, since the mask will later act as a template for a considerable number of dies on a wafer. Defects on the initial blank substrate, and on the subsequently cleaned and coated substrates, can have a profound impact on the usability of the finished mask. This emphasizes the need for early and accurate identification of blank substrate defects and of the risk they pose to the patterned reticle. While Automatic Defect Classification (ADC) is a well-developed technology for the inspection and analysis of defects on patterned wafers and masks in the semiconductor industry, ADC for mask blanks is still in the early stages of adoption and development. Calibre ADC is a powerful analysis tool for fast, accurate, consistent and automatic classification of defects on mask blanks. Accurate, automated classification of mask blanks leads to better usability of blanks by enabling defect avoidance technologies during mask writing. Detailed information on blank defects can help to select appropriate job-decks to be written on the mask by defect avoidance tools [1][4][5]. Smart algorithms separate critical defects from the potentially large number of non-critical or false defects detected at various stages during mask blank preparation. Mechanisms used by Calibre ADC to identify and characterize defects include the defect location and size, the signal polarity (dark, bright) in both transmitted and reflected review images, and the separation of defect signals from background noise in defect images. The Calibre ADC engine then uses a decision tree to translate this information into a defect classification code. Using this automated process improves classification accuracy, repeatability and speed, while avoiding the subjectivity of human judgment inherent in the alternative of manual defect classification by trained personnel [2]. This paper focuses on the results of the evaluation of the Automatic Defect Classification (ADC) product at MP Mask

  5. From gaze cueing to perspective taking: Revisiting the claim that we automatically compute where or what other people are looking at.

    Science.gov (United States)

    Bukowski, Henryk; Hietanen, Jari K; Samson, Dana

    2015-09-14

    Two paradigms have shown that people automatically compute what or where another person is looking at. In the visual perspective-taking paradigm, participants judge how many objects they see; whereas, in the gaze cueing paradigm, participants identify a target. Unlike in the former task, in the latter task, the influence of what or where the other person is looking at is only observed when the other person is presented alone before the task-relevant objects. We show that this discrepancy across the two paradigms is not due to differences in visual settings (Experiment 1) or available time to extract the directional information (Experiment 2), but that it is caused by how attention is deployed in response to task instructions (Experiment 3). Thus, the mere presence of another person in the field of view is not sufficient to compute where/what that person is looking at, which qualifies the claimed automaticity of such computations.

  6. Automatic computational labeling of glomerular textural boundaries

    Science.gov (United States)

    Ginley, Brandon; Tomaszewski, John E.; Sarder, Pinaki

    2017-03-01

    The glomerulus, a specialized bundle of capillaries, is the blood-filtering unit of the kidney. Each human kidney contains about 1 million glomeruli. Structural damage in the glomerular micro-compartments gives rise to several renal conditions, the most severe of which is proteinuria, where excessive blood proteins flow freely into the urine. The sole way to confirm glomerular structural damage in renal pathology is by examining histopathological or immunofluorescence-stained needle biopsies under a light microscope. However, this method is extremely tedious and time consuming, and requires manual scoring of the number and volume of structures. Computational quantification of equivalent features promises to greatly ease this manual burden. The largest obstacle to computational quantification of renal tissue is the ability to recognize complex glomerular textural boundaries automatically. Here we present a computational pipeline to identify glomerular boundaries with high precision and accuracy. The pipeline employs an integrated approach composed of Gabor filtering, Gaussian blurring, statistical F-testing, and the distance transform, and performs significantly better than a standard Gabor-based textural segmentation method. Our integrated approach provides a mean accuracy/precision of 0.89/0.97 on n = 200 Hematoxylin and Eosin (HE) glomerulus images, and a mean accuracy/precision of 0.88/0.94 on n = 200 Periodic Acid Schiff (PAS) glomerulus images. The respective accuracy/precision of the Gabor filter bank based method is 0.83/0.84 for HE and 0.78/0.8 for PAS. Our method will simplify computational partitioning of glomerular micro-compartments hidden within dense textural boundaries. Automatic quantification of glomeruli will streamline structural analysis in the clinic and can help realize real-time diagnoses and interventions.
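
    A crude sketch of the Gabor-filtering and Gaussian-blurring stages with scikit-image; the frequency, orientations and Otsu threshold are our simplifications and omit the paper's F-testing and distance-transform steps.

    import numpy as np
    from skimage import io, filters

    # Load a grayscale glomerulus image (the path is a placeholder)
    img = io.imread("glomerulus.png", as_gray=True)

    # Gabor filter bank over several orientations, then Gaussian smoothing
    responses = []
    for theta in np.linspace(0, np.pi, 6, endpoint=False):
        real, imag = filters.gabor(img, frequency=0.2, theta=theta)
        energy = filters.gaussian(np.hypot(real, imag), sigma=3)  # local texture energy
        responses.append(energy)

    texture = np.max(responses, axis=0)
    boundary_mask = texture > filters.threshold_otsu(texture)     # crude textural boundary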

  7. Automatic Recognition of Object Names in Literature

    Science.gov (United States)

    Bonnin, C.; Lesteven, S.; Derriere, S.; Oberto, A.

    2008-08-01

    SIMBAD is a database of astronomical objects that provides (among other things) their bibliographic references in a large number of journals. Currently, these references have to be entered manually by librarians who read each paper. To cope with the increasing number of papers, CDS is developing a tool to assist the librarians in their work, taking advantage of the Dictionary of Nomenclature of Celestial Objects, which keeps track of object acronyms and of their origin. The program searches for object names directly in PDF documents by comparing the words with all the formats stored in the Dictionary of Nomenclature. It also searches for variable star names based on constellation names, and for a large list of usual names such as Aldebaran or the Crab. Object names found in the documents often correspond to several astronomical objects. The system retrieves all possible matches, displays them with their object type given by SIMBAD, and lets the librarian make the final choice. The bibliographic reference can then be automatically added to the object identifiers in the database. Besides, the systematic use of the Dictionary of Nomenclature, which is updated manually, made it possible to check it automatically and to detect errors and inconsistencies. Last but not least, the program collects additional information, such as the position of the object names in the document (in the title, subtitle, abstract, tables, figure captions...) and their number of occurrences. In the future, this will permit calculating the 'weight' of an object in a reference and providing SIMBAD users with important new information, which will help them find the most relevant papers in an object's reference list.

  8. Automatic generation of tourist brochures

    KAUST Repository

    Birsak, Michael

    2014-05-01

    We present a novel framework for the automatic generation of tourist brochures that include routing instructions and additional information presented in the form of so-called detail lenses. The first contribution of this paper is the automatic creation of layouts for the brochures. Our approach is based on the minimization of an energy function that combines multiple goals: positioning of the lenses as close as possible to the corresponding region shown in an overview map, keeping the number of lenses low, and an efficient numbering of the lenses. The second contribution is a route-aware simplification of the graph of streets used for traveling between the points of interest (POIs). This is done by reducing the graph consisting of all shortest paths through the minimization of an energy function. The output is a subset of street segments that enable traveling between all the POIs without considerable detours, while at the same time guaranteeing a clutter-free visualization.

  9. ACIR: automatic cochlea image registration

    Science.gov (United States)

    Al-Dhamari, Ibraheem; Bauer, Sabine; Paulus, Dietrich; Lissek, Friedrich; Jacob, Roland

    2017-02-01

    Efficient Cochlear Implant (CI) surgery requires prior knowledge of the cochlea's size and its characteristics. This information helps to select suitable implants for different patients. To obtain these measurements, a segmentation method for cochlea medical images is needed, and an important pre-processing step for good cochlea segmentation is efficient image registration. The cochlea's small size and complex structure, in addition to the different resolutions and head positions during imaging, pose a big challenge for the automated registration of the different image modalities. In this paper, an Automatic Cochlea Image Registration (ACIR) method for multi-modal human cochlea images is proposed. The method is based on using small areas that have clear structures in both input images, instead of registering the complete image. It uses the Adaptive Stochastic Gradient Descent optimizer (ASGD) and Mattes's Mutual Information metric (MMI) to estimate the parameters of a 3D rigid transform. State-of-the-art medical image registration optimizers published over the last two years are studied and compared quantitatively using the standard Dice Similarity Coefficient (DSC). ACIR requires only 4.86 seconds on average to align cochlea images automatically and to put all the modalities in the same spatial locations without human interference. The source code is based on the tool elastix and is provided for free as a 3D Slicer plugin. Another contribution of this work is a proposed public cochlea standard dataset, which can be downloaded for free from a public XNAT server.
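
    ACIR itself is built on elastix; as a loose illustration of the same ingredients (Mattes mutual information, a 3D rigid transform, and a gradient-descent optimizer standing in for ASGD), here is a SimpleITK sketch with placeholder file paths.

    import SimpleITK as sitk

    fixed = sitk.ReadImage("cochlea_fixed.nii.gz", sitk.sitkFloat32)   # placeholder paths
    moving = sitk.ReadImage("cochlea_moving.nii.gz", sitk.sitkFloat32)

    reg = sitk.ImageRegistrationMethod()
    reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
    reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
    reg.SetOptimizerScalesFromPhysicalShift()
    reg.SetInterpolator(sitk.sitkLinear)
    reg.SetInitialTransform(
        sitk.CenteredTransformInitializer(fixed, moving, sitk.Euler3DTransform(),
                                          sitk.CenteredTransformInitializerFilter.GEOMETRY))
    rigid = reg.Execute(fixed, moving)            # estimated 3D rigid transform
    resampled = sitk.Resample(moving, fixed, rigid, sitk.sitkLinear, 0.0)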

  10. QXT-full Automatic Saccharify Instrument

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    QXT is a fully automatic, eight-well saccharification instrument. The instrument uses microcomputer process control and can correctly automate the whole saccharification process. By adopting a high-precision expert PID control mode and the microcomputer's digital automatic calibration technology, it not only ensures the precision of the linear temperature-ramp region (1 ℃/min) and of the constant-temperature region (temperature error ±0.2 ℃), but also overcomes disturbances.
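
    The abstract credits an expert PID mode for holding the temperature profile; the sketch below is a generic discrete PID loop with invented gains, not the instrument's firmware.

    class PID:
        """Minimal discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measured):
            error = setpoint - measured
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    pid = PID(kp=8.0, ki=0.4, kd=2.0, dt=1.0)            # illustrative gains, 1 s step
    power = pid.update(setpoint=25.0, measured=24.8)     # heater command for this step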

  11. Automatic Control of Water Pumping Stations

    Institute of Scientific and Technical Information of China (English)

    Muhannad Alrheeh; JIANG Zhengfeng

    2006-01-01

    Automatic control of pumps is an interesting proposal for operating the many kinds of water pumping stations, which differ according to their functions. In this paper, the pumping station considered is used for a water supply system. This paper introduces the idea of a pump controller and the important factors that must be considered when designing an automatic control system for water pumping stations. The automatic control circuit, with the function of all its components, is then introduced.

  12. An automatic coastline detector for use with SAR images

    Energy Technology Data Exchange (ETDEWEB)

    Erteza, Ireena A.

    1998-09-01

    SAR imagery for coastline detection has many potential advantages over conventional optical stereoscopic techniques. For example, SAR collection is not restricted to daylight or cloud-free conditions. In addition, the techniques for coastline detection with SAR images can be automated. In this paper, we present the algorithmic development of an automatic coastline detector for use with SAR imagery. Three main algorithms comprise the automatic coastline detection method. The first algorithm covers the image pre-processing steps that must be applied to the original image in order to accentuate the land/water boundary. The second algorithm automatically follows along the accentuated land/water boundary and produces a single-pixel-wide coastline. The third algorithm identifies islands and marks them. This paper describes the development of these three algorithms in detail. Examples of imagery are used throughout to illustrate the various steps of the algorithms. Actual code is included in appendices. The algorithms presented are preliminary versions that can be applied to automatic coastline detection in SAR imagery. Many variations of and additions to the algorithms can be made to improve robustness and automation, as required by a particular application.
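
    A compressed sketch of the pre-processing and boundary-following ideas using OpenCV; median filtering, Otsu thresholding and contour extraction stand in for the paper's specific algorithms, and the file path is a placeholder.

    import cv2
    import numpy as np

    sar = cv2.imread("sar_scene.png", cv2.IMREAD_GRAYSCALE)        # placeholder path

    # Pre-processing: median speckle suppression, then accentuate land/water contrast
    smooth = cv2.medianBlur(sar, 7)
    _, water = cv2.threshold(smooth, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    water = cv2.morphologyEx(water, cv2.MORPH_CLOSE, np.ones((9, 9), np.uint8))

    # Boundary following: the largest external contour is taken as the coastline,
    # recorded as a single-pixel-wide chain of boundary points
    contours, _ = cv2.findContours(water, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    coastline = max(contours, key=cv2.contourArea)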

  13. A Computational Approach to Automatic Prediction of Drunk Texting

    OpenAIRE

    Joshi, Aditya; Mishra, Abhijit; AR, Balamurali; Bhattacharyya, Pushpak; Carman, Mark

    2016-01-01

    Alcohol abuse may lead to unsociable behavior such as crime, drunk driving, or privacy leaks. We introduce automatic drunk-texting prediction as the task of identifying whether a text was written when under the influence of alcohol. We experiment with tweets labeled using hashtags as distant supervision. Our classifiers use a set of N-gram and stylistic features to detect drunk tweets. Our observations present the first quantitative evidence that text contains signals that can be exploited to...
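
    A small sketch of an n-gram text classifier in the same spirit, using scikit-learn; the toy tweets and labels are invented, and character n-grams stand in for the paper's N-gram plus stylistic feature set.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy labeled tweets (1 = drunk, 0 = sober); the paper's labels came from
    # distant supervision via hashtags.
    texts = ["omg whyyy did i text himmm", "meeting moved to 3pm, see you there",
             "i loveee everyoneee tonighttt", "the report is attached below"]
    labels = [1, 0, 1, 0]

    clf = make_pipeline(
        TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
        LogisticRegression(max_iter=1000),
    )
    clf.fit(texts, labels)
    print(clf.predict(["soooo happpy right nowww"]))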

  14. Management of natural resources through automatic cartographic inventory

    Science.gov (United States)

    Rey, P. A.; Gourinard, Y.; Cambou, F. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. Significant correspondence codes relating ERTS imagery to ground truth from vegetation and geology maps have been established. The use of color equidensity and color composite methods for selecting zones of equal densitometric value on ERTS imagery was perfected. The primary interest of the temporal color composite is stressed. A chain of transfer operations from ERTS imagery to the automatic mapping of natural resources was developed.

  15. Automatic detection and visualisation of MEG ripple oscillations in epilepsy

    Directory of Open Access Journals (Sweden)

    Nicole van Klink

    2017-01-01

    Full Text Available High frequency oscillations (HFOs, 80–500 Hz) in invasive EEG are a biomarker for the epileptic focus. Ripples (80–250 Hz) have also been identified in non-invasive MEG, yet detection is impeded by noise, their low occurrence rates, and the workload of visual analysis. We propose a method that identifies ripples in MEG through noise reduction, beamforming and automatic detection with minimal user effort. We analysed 15 min of presurgical resting-state interictal MEG data of 25 patients with epilepsy. The MEG signal-to-noise ratio was improved by using a cross-validation signal space separation method, and by calculating ~2400 beamformer-based virtual sensors in the grey matter. Ripples in these sensors were automatically detected by an algorithm optimized for MEG. A small subset of the identified ripples was visually checked. Ripple locations were compared with MEG spike dipole locations and the resection area, if available. Running the automatic detection algorithm resulted in on average 905 ripples per patient, of which on average 148 ripples were visually reviewed. Reviewing took approximately 5 min per patient and identified ripples in 16 out of 25 patients. In 14 patients the ripple locations showed good or moderate concordance with the MEG spikes. For six out of eight patients who had surgery, the ripple locations showed concordance with the resection area: 4/5 with good outcome and 2/3 with poor outcome. Automatic ripple detection in beamformer-based virtual sensors is a feasible non-invasive tool for the identification of ripples in MEG. Our method requires minimal user effort and is easily applicable in a clinical setting.
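
    A generic sketch of band-limited ripple detection (band-pass 80–250 Hz, Hilbert envelope, amplitude threshold, minimum duration); the thresholds are illustrative, and this is not the paper's MEG-optimized detector.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def detect_ripples(signal, fs, band=(80, 250), thresh_sd=3.0, min_dur=0.025):
        """Return (start, end) times of candidate ripples: band-limited envelope
        exceeding mean + thresh_sd*SD for at least min_dur seconds."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, signal)
        envelope = np.abs(hilbert(filtered))
        above = envelope > envelope.mean() + thresh_sd * envelope.std()
        events, start = [], None
        for i, flag in enumerate(above):
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                if (i - start) / fs >= min_dur:
                    events.append((start / fs, i / fs))
                start = None
        return events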

  16. Automatic Inference of Cryptographic Key Length Based on Analysis of Proof Tightness

    Science.gov (United States)

    2016-06-01

    ...the general number field sieve (GNFS) algorithm [23]. Chapter 4 discusses the concept of operations and design. We provide a software tool that utilizes machine-readable proof and attack metadata for maintaining and automatically reasoning about these expanded attack trees.

  17. Complete approach to automatic identification and subpixel center location for ellipse feature

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    To meet the need for automatic image feature extraction with high precision in visual inspection, a complete approach to automatic identification and sub-pixel center location for similar-ellipse features is proposed. In the method, the feature area is identified automatically based on edge attributes, and the sub-pixel center location is accomplished with a least-squares algorithm. Experiments show that the method is valid, practical and highly precise. Meanwhile, the method suits the instrumentation of visual inspection, because it is easy to realize and requires no man-machine interaction.
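
    A minimal sketch of the least-squares step, assuming edge points have already been extracted: fit the conic A x^2 + B xy + C y^2 + D x + E y = 1 and read the sub-pixel center off the fitted coefficients. This shows the general technique, not necessarily the paper's exact formulation.

    import numpy as np

    def ellipse_center(x, y):
        """Least-squares conic fit A x^2 + B xy + C y^2 + D x + E y = 1;
        the center solves  [2A B; B 2C] [x0; y0] = [-D; -E]."""
        M = np.column_stack([x**2, x*y, y**2, x, y])
        A, B, C, D, E = np.linalg.lstsq(M, np.ones_like(x), rcond=None)[0]
        return np.linalg.solve([[2*A, B], [B, 2*C]], [-D, -E])

    # Noisy points on an ellipse centered at (40.3, 25.7)
    rng = np.random.default_rng(0)
    t = np.linspace(0, 2*np.pi, 200)
    x = 40.3 + 15.0*np.cos(t) + rng.normal(0, 0.05, t.size)
    y = 25.7 + 9.0*np.sin(t) + rng.normal(0, 0.05, t.size)
    print(ellipse_center(x, y))   # ~ (40.3, 25.7), at sub-pixel resolution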

  18. Autoclass: An automatic classification system

    Science.gov (United States)

    Stutz, John; Cheeseman, Peter; Hanson, Robin

    1991-01-01

    The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework, and using various mathematical and algorithmic approximations, the AutoClass System searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit, or share, model parameters through a class hierarchy. The mathematical foundations of AutoClass are summarized.

  19. Automatic Image Interpolation Using Homography

    Directory of Open Access Journals (Sweden)

    Chi-Tsung Liu

    2010-01-01

    Full Text Available While taking photographs, we often face the problem that unwanted foreground objects (e.g., vehicles, signs, and pedestrians) occlude the main subject(s). We propose to apply image interpolation (also known as inpainting) techniques to remove unwanted objects from photographs and to automatically patch the vacancy after the unwanted objects are removed. When given only a single image, if the information lost after removing the unwanted objects is too great, the patching results are usually unsatisfactory. The proposed inpainting techniques employ homographic constraints from geometry to incorporate multiple images taken from different viewpoints. Our experimental results showed that the proposed techniques could effectively reduce the effort of searching for potential patches from multiple input images and decide the best patches for the missing regions.
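
    The geometric core, estimating a homography between two viewpoints and warping one into the other so its pixels can patch the masked region, can be sketched with OpenCV; the ORB feature matching and the file paths are our assumptions, not necessarily the paper's pipeline.

    import cv2
    import numpy as np

    src = cv2.imread("view_a.jpg")     # auxiliary viewpoint (placeholder paths)
    dst = cv2.imread("view_b.jpg")     # image whose occluded region needs patching

    g1 = cv2.cvtColor(src, cv2.COLOR_BGR2GRAY)
    g2 = cv2.cvtColor(dst, cv2.COLOR_BGR2GRAY)

    # Match features between views, then estimate the homography robustly
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(g1, None)
    k2, d2 = orb.detectAndCompute(g2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(pts1, pts2, cv2.RANSAC, 3.0)

    # Pixels from the auxiliary view, mapped into the target frame, can fill the hole
    warped = cv2.warpPerspective(src, H, (dst.shape[1], dst.shape[0]))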

  20. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings with intervals as initial values. A modification of the mean value enclosure of discrete mappings is considered, namely the extended mean value enclosure, which in most cases leads to even better enclosures. An advantage of these methods is the possibility to combine the three methods in an extremely flexible way, and we examine some applications where this flexibility is very useful. A method for Taylor expanding solutions of ordinary differential equations is presented, together with a method for obtaining interval enclosures of the truncation errors incurred. These methods have previously been described in connection with discretizing solutions of ordinary differential equations.
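
    A tiny illustration of the underlying idea, under our own simplifications: a bare-bones interval type and the mean value form f(c) + f'(X)(X - c), which often yields a narrower enclosure of the range of f over X than naive interval evaluation.

    from dataclasses import dataclass

    @dataclass
    class Interval:
        lo: float
        hi: float
        def __add__(self, o):  return Interval(self.lo + o.lo, self.hi + o.hi)
        def __sub__(self, o):  return Interval(self.lo - o.hi, self.hi - o.lo)
        def __mul__(self, o):
            ps = [self.lo*o.lo, self.lo*o.hi, self.hi*o.lo, self.hi*o.hi]
            return Interval(min(ps), max(ps))

    def mean_value_enclosure(f, df, X: Interval) -> Interval:
        """Enclose f over X via  f(c) + f'(X) * (X - c),  c = midpoint of X."""
        c = 0.5 * (X.lo + X.hi)
        fc = f(c)
        return Interval(fc, fc) + df(X) * (X - Interval(c, c))

    # Example: f(x) = x^2 - x on X = [1, 2]; f'(x) = 2x - 1 (monotone, so exact)
    X = Interval(1.0, 2.0)
    enc = mean_value_enclosure(lambda x: x*x - x,
                               lambda I: Interval(2*I.lo - 1, 2*I.hi - 1), X)
    # True range is [0, 2]; naive evaluation gives [-1, 3], the mean value
    # form gives the narrower enclosure [-0.75, 2.25].
    print(enc.lo, enc.hi)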

  1. Automatic Network Reconstruction using ASP

    CERN Document Server

    Ostrowski, Max; Durzinsky, Markus; Marwan, Wolfgang; Wagler, Annegret

    2011-01-01

    Building biological models by inferring functional dependencies from experimental data is an important issue in Molecular Biology. To relieve the biologist from this traditionally manual process, various approaches have been proposed to increase the degree of automation. However, available approaches often yield a single model only, rely on specific assumptions, and/or use dedicated, heuristic algorithms that are intolerant to changing circumstances or requirements in view of the rapid progress made in Biotechnology. Our aim is to provide a declarative solution to the problem by appeal to Answer Set Programming (ASP), overcoming these difficulties. We build upon an existing approach to Automatic Network Reconstruction proposed by some of the authors. This approach has firm mathematical foundations and is well suited for ASP due to its combinatorial flavor, providing a characterization of all models explaining a set of experiments. The usage of ASP has several benefits over the existing heuristic approach...

  2. Automatic orientation correction for radiographs

    Science.gov (United States)

    Luo, Hui; Luo, Jiebo; Wang, Xiaohui

    2006-03-01

    In picture archiving and communications systems (PACS), images need to be displayed in standardized ways for radiologists' interpretations. However, for most radiographs acquired by computed radiography (CR), digital radiography (DR), or digitized films, the image orientation is undetermined because of the variation of examination conditions and patient situations. To address this problem, an automatic orientation correction method is presented. It first detects the most indicative region for orientation in a radiograph, and then extracts a set of low-level visual features sensitive to rotation from the region. Based on these features, a trained classifier based on a support vector machine is employed to recognize the correct orientation of the radiograph and reorient it to a desired position. A large-scale experiment has been conducted on more than 12,000 radiographs covering a large variety of body parts and projections to validate the method. The overall performance is quite promising, with the success rate of orientation correction reaching 95.2%.

  3. Towards Automatic Processing of Virtual City Models for Simulations

    Science.gov (United States)

    Piepereit, R.; Schilling, A.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2016-10-01

    Especially in the field of numerical simulations, such as flow and acoustic simulations, interest in using virtual 3D models to optimize urban systems is increasing. The few instances in which simulations have already been carried out in practice have involved an extremely high manual, and therefore uneconomical, effort for processing the models. The use of different ways of capturing models in Geographic Information Systems (GIS) and Computer Aided Engineering (CAE) further increases the already very high complexity of the processing. To obtain virtual 3D models suitable for simulation, we developed a tool for automatic processing with the goal of establishing ties between the worlds of GIS and CAE. In this paper we introduce a way to use Coons surfaces for the automatic processing of building models in LoD2, and investigate ways to simplify LoD3 models in order to reduce information that is unnecessary for a numerical simulation.

  4. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.

  5. 2011 International Conference in Electrics, Communication and Automatic Control Proceedings

    CERN Document Server

    2012-01-01

    This two-volume set contains the very latest, cutting-edge material in electrics, communication and automatic control. As a vital field of research that is highly relevant to current developments in a number of technological domains, the subjects it covers include micro-electronics and integrated circuit control, signal processing technology, next-generation network infrastructure, wireless communication and scientific instruments. The aim of the International Conference in Electrics, Communication and Automatic Control, held in Chongqing, China, in June 2011, was to provide a valuable, inclusive platform for researchers, engineers, academics and industry professionals from all over the world to share their research results with fellow scientists in the sector. The call for papers netted well over 600 submissions, of which 224 were selected for presentation. This fully peer-reviewed collection of papers from the conference can be viewed as a single-source compendium of the latest trends and techniques in the field.

  6. 5th International Conference on Electrical Engineering and Automatic Control

    CERN Document Server

    Yao, Yufeng

    2016-01-01

    On the basis of instrument electrical and automatic control systems, the 5th International Conference on Electrical Engineering and Automatic Control (CEEAC) was established at the crossroads of information technology and control technology, and seeks to effectively apply information technology to the sweeping trend that views control as the core of intelligent manufacturing and life. This book looks forward into advanced manufacturing development, an area shaped by intelligent manufacturing. It highlights the application and promotion of process control, represented by traditional industries such as the steel and petrochemical industries; technical equipment and system cooperative control, represented by robot technology and multi-axis CNC; and the control and support of emerging process technologies, represented by laser melting and stacking, as well as the emerging industry represented by sustainable and intelligent life. The book places particular emphasis on micro-segment fields, such as...

  7. Automatic Segmentation of Abdominal Adipose Tissue in MRI

    DEFF Research Database (Denmark)

    Mosbech, Thomas Hammershaimb; Pilgaard, Kasper; Vaag, Allan

    2011-01-01

    This paper presents a method for automatically segmenting abdominal adipose tissue from 3-dimensional magnetic resonance images. We distinguish between three types of adipose tissue: visceral, deep subcutaneous and superficial subcutaneous. Images are pre-processed to remove the bias field effect of intensity inhomogeneities. This effect is estimated by a thin-plate spline extended to fit two classes of automatically sampled intensity points in 3D. Adipose tissue pixels are labelled with fuzzy c-means clustering and locally determined thresholds. The visceral and subcutaneous adipose tissue are separated using deformable models, incorporating information from the clustering. The subcutaneous adipose tissue is subdivided into a deep and a superficial part by means of dynamic programming applied to a spatial transformation of the image data. Regression analysis shows good correspondences between our
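
    The fuzzy c-means step can be sketched in a few lines of numpy; this is the textbook algorithm on synthetic 1-D intensities, not the paper's implementation with locally determined thresholds.

    import numpy as np

    def fuzzy_c_means(x, c=3, m=2.0, iters=100, seed=0):
        """Minimal fuzzy c-means on 1-D intensities x: returns (centers, memberships)."""
        rng = np.random.default_rng(seed)
        u = rng.random((c, x.size))
        u /= u.sum(axis=0)                      # memberships sum to 1 per pixel
        for _ in range(iters):
            um = u ** m
            centers = um @ x / um.sum(axis=1)   # fuzzily weighted class means
            d = np.abs(x[None, :] - centers[:, None]) + 1e-9
            u = 1.0 / d ** (2 / (m - 1))        # standard FCM membership update
            u /= u.sum(axis=0)
        return centers, u

    rng = np.random.default_rng(1)
    intensities = np.concatenate([rng.normal(mu, 5, 500) for mu in (30, 90, 160)])
    centers, memberships = fuzzy_c_means(intensities)
    labels = memberships.argmax(axis=0)         # hard labels from fuzzy memberships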

  8. Building an Automatic Thesaurus to Enhance Information Retrieval

    Directory of Open Access Journals (Sweden)

    Essam Said Hanandeh

    2013-01-01

    Full Text Available One of the major problems of modern Information Retrieval (IR) systems is the vocabulary problem, which concerns the discrepancies between the terms used for describing documents and the terms used by researchers to describe their information need. We have implemented an automatic thesaurus; the system was built using the Vector Space Model (VSM) with the cosine similarity measure. In this paper we used 242 selected Arabic abstract documents, all involving computer science and information systems. The main goal of this paper is to design and build an automatic Arabic thesaurus, using term-term similarity, that can be used in any special field or domain to improve the query expansion process and to retrieve more relevant documents for the user's query. The study concluded that the similarity thesaurus improved recall and precision over the traditional information retrieval system.
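
    A small sketch of term-term cosine similarity over a TF-IDF document-term matrix, using scikit-learn on an invented toy corpus; the ranked neighbor terms would serve as expansion candidates.

    import numpy as np
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    docs = ["information retrieval evaluates query relevance",
            "thesaurus expansion improves query recall",
            "vector space model scores document similarity"]   # toy corpus

    vec = TfidfVectorizer()
    X = vec.fit_transform(docs)                 # rows: documents, columns: terms

    # Term-term similarity: cosine between term columns of the doc-term matrix
    term_sim = cosine_similarity(X.T)
    terms = vec.get_feature_names_out()

    qi = list(terms).index("query")
    related = np.argsort(term_sim[qi])[::-1][1:4]
    print([terms[j] for j in related])          # candidate expansion terms for 'query'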

  9. Automatic Image Segmentation based on MRF-MAP

    CERN Document Server

    Qiyang, Zhao

    2012-01-01

    Solving the Maximum a Posteriori problem on a Markov Random Field, MRF-MAP, is a prevailing method in recent interactive image segmentation tools. Although mathematically explicit in its computational targets, and impressive in segmentation quality, MRF-MAP is hard to accomplish without interactive information from users, so it has rarely been adopted in a fully automatic setting to date. In this paper, we present an automatic image segmentation algorithm, NegCut, based on an approximation to MRF-MAP. First we prove that MRF-MAP is NP-hard when the probabilistic models are unknown, and then present an approximation function in the form of minimum cuts on graphs with negative weights. Finally, the binary segmentation is taken from the largest eigenvector of the target matrix, computed with a tuned version of the Lanczos eigensolver. In our experiments, NegCut is competitive in segmentation quality.
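
    The final step, taking a binary segmentation from the sign pattern of the largest eigenvector, can be sketched with SciPy's Lanczos-based eigsh solver; the random sparse matrix below is a toy stand-in for the paper's graph-derived target matrix.

    from scipy.sparse import random as sparse_random
    from scipy.sparse.linalg import eigsh

    # Toy symmetric target matrix standing in for the graph-derived matrix
    A = sparse_random(500, 500, density=0.02, random_state=0)
    A = (A + A.T) * 0.5

    # Largest-eigenvalue eigenvector via the (Lanczos-based) eigsh solver
    vals, vecs = eigsh(A, k=1, which="LA")
    segmentation = vecs[:, 0] > 0        # binary labels from the eigenvector's sign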

  10. Visuospatial stimulus-bound automatic writing behavior: a right hemispheric stroke syndrome.

    Science.gov (United States)

    Evyapan, D; Kumral, E

    2001-01-23

    Three cases of visuospatial stimulus-bound automatic writing behavior were identified among 80 patients (4%) with acute right cerebral hemispheric stroke. All cases had similar clinical characteristics and writing behavior, and visuospatial stimulus-bound automatic writing was related to visually perceived letters. This syndrome might be specific for right hemispheric stroke and might be included among other hypergraphic syndromes attributable to right hemispheric damage.

  11. Automatic Cobb angle determination from radiographic images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H.F.; Veldhuizen, Albert G.; van Ooijen, Peter M.A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  12. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  13. An Experiment in Automatic Hierarchical Document Classification.

    Science.gov (United States)

    Garland, Kathleen

    1983-01-01

    Describes method of automatic document classification in which documents classed as QA by the Library of Congress classification system were clustered at six thresholds by keyword using the single link technique. Automatically generated clusters were compared to Library of Congress subclasses, and a partially classified hierarchy was formed. Twelve references…
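
    A hedged sketch of threshold-based single-link clustering over keyword vectors, in the spirit of the record above; the document vectors, distance metric, and the six threshold values are invented for illustration.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(1)
    docs = rng.random((30, 8))                  # 30 documents, 8 keyword weights

    Z = linkage(pdist(docs, metric="cosine"), method="single")
    for t in (0.2, 0.3, 0.4, 0.5, 0.6, 0.7):    # six distance thresholds
        labels = fcluster(Z, t=t, criterion="distance")
        print(t, labels.max(), "clusters")      # cluster count at each threshold
    ```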

  14. Automatic cobb angle determination from radiographic images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H.F.; Veldhuizen, Albert G.; van Ooijen, Peter M.A.; Purnama, Ketut E.; Verkerke, Gijsbertus Jacob

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  15. Generalization versus Contextualization in Automatic Evaluation

    Science.gov (United States)

    Gawronski, Bertram; Rydell, Robert J.; Vervliet, Bram; De Houwer, Jan

    2010-01-01

    Research has shown that automatic evaluations can be highly robust and difficult to change, highly malleable and easy to change, and highly context dependent. We tested a representational account of these disparate findings, which specifies the conditions under which automatic evaluations reflect (a) initially acquired information, (b)…

  16. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available Automation has brought many advances to existing technologies, and one technology seeing such development is the solar-powered automatic shrimp feeding system. Solar power, a renewable energy source, can be an alternative solution to the energy crisis and can reduce manpower when used in an automatic manner. The researchers believe an automatic shrimp feeding system may help solve the problems of manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer, set at intervals preferred by the user, to run a continuous process. The magnetic contactor acts as a switch connected to the 10-hour timer which controls the activation or termination of electrical loads, powered by means of a solar panel and a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and operated within the desired output. It was recommended that the timer be tested to avoid malfunction and achieve a fully automatic system, and that the system be improved to handle changes in the scope of the project.

  17. A semi-automatic annotation tool for cooking video

    Science.gov (United States)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  18. Associating fuzzy logic, neural networks and multivariable statistic methodologies in the automatic identification of oil reservoir lithologies through well logs

    Energy Technology Data Exchange (ETDEWEB)

    Carrasquilla, Abel [Universidade Estadual do Norte Fluminense Darcy Ribeiro (UENF), Macae, RJ (Brazil). Lab. de Engenharia e Exploracao de Petroleo]. E-mail: abel@lenep.uenf.br; Silva, Jadir da [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Dept. de Geologia; Flexa, Roosevelt [Baker Hughes do Brasil Ltda, Macae, RJ (Brazil)

    2008-07-01

    In this article, we present a new approach to the automatic identification of lithologies using only well log data, which associates fuzzy logic, neural networks and multivariable statistical methods. Firstly, we chose well logs that represent lithological types, such as gamma ray (GR) and density (RHOB), and applied a fuzzy logic algorithm to determine the optimal number of clusters. In the following step, a competitive neural network is developed, based on Kohonen's learning rule, where the input layer is composed of two neurons, matching the number of logs used. The competitive layer, in turn, is composed of as many neurons as the number of clusters determined by the fuzzy logic algorithm. Finally, some databank elements of the lithological types are selected at random to be the discriminant variables, which correspond to the input data of the multigroup discriminant analysis program. With the application of this methodology, the lithological types were automatically identified along a well of the Namorado Oil Field, Campos Basin; the results presented some difficulty, mainly because of the geological complexity of this field. (author)
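
    The competitive network step can be sketched with a plain winner-take-all (Kohonen-style) learning rule: two inputs standing in for the GR and RHOB logs, and one weight vector per cluster neuron. The data scaling, cluster count k, and learning-rate schedule are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.random((500, 2))                    # normalized (GR, RHOB) samples
    k, lr = 4, 0.1                              # clusters from the fuzzy step; learning rate
    W = rng.random((k, 2))                      # one weight vector per competitive neuron

    for epoch in range(20):
        for x in rng.permutation(X):
            winner = np.argmin(np.linalg.norm(W - x, axis=1))
            W[winner] += lr * (x - W[winner])   # move only the winning neuron toward x
        lr *= 0.9                               # decay the learning rate

    # Assign each sample to its nearest neuron (lithology cluster).
    labels = np.argmin(np.linalg.norm(X[:, None] - W[None], axis=2), axis=1)
    ```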

  19. Automatic Detection of Omega Signals Captured by the Poynting Flux Analyzer (PFX) on Board the Akebono Satellite

    CERN Document Server

    Suarjaya, I Made Agus Dwi; Goto, Yoshitaka

    2016-01-01

    The Akebono satellite was launched in 1989 to observe the Earth's magnetosphere and plasmasphere. Omega was a navigation system with eight ground-station transmitters and a transmission pattern that repeated every 10 s. From 1989 to 1997, the PFX on board the Akebono satellite received signals at 10.2 kHz from these stations. The huge amount of PFX data became valuable for studying the propagation characteristics of VLF waves in the ionosphere and plasmasphere. In this study, we introduce a method for automatic detection of Omega signals from the PFX data in a systematic way; it involves identifying a transmission station, calculating the delay time, and estimating the signal intensity. We show the reliability of the automatic detection system, with which we were able to detect the Omega signal and confirm its propagation to the opposite hemisphere along the Earth's magnetic field lines. For more than three years (39 months), we detected 43,734 and 111,049 signals in the magnetic and electric field, respectively, and demonstrated that the proposed method is powerful enough for statistical analyses.

  20. Automatic Detection of Omega Signals Captured by the Poynting Flux Analyzer (PFX on Board the Akebono Satellite

    Directory of Open Access Journals (Sweden)

    Made Agus Dwi Suarjaya

    2016-10-01

    Full Text Available The Akebono satellite was launched in 1989 to observe the Earth’s magnetosphere and plasmasphere. Omega was a navigation system with eight ground-station transmitters and a transmission pattern that repeated every 10 s. From 1989 to 1997, the PFX on board the Akebono satellite received signals at 10.2 kHz from these stations. The huge amount of PFX data became valuable for studying the propagation characteristics of VLF waves in the ionosphere and plasmasphere. In this study, we introduce a method for automatic detection of Omega signals from the PFX data in a systematic way; it involves identifying a transmission station, calculating the delay time, and estimating the signal intensity. We show the reliability of the automatic detection system, with which we were able to detect the Omega signal and confirm its propagation to the opposite hemisphere along the Earth’s magnetic field lines. For more than three years (39 months), we detected 43,734 and 111,049 signals in the magnetic and electric field, respectively, and demonstrated that the proposed method is powerful enough for the statistical analyses.
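
    A minimal sketch of narrowband detection at 10.2 kHz: band-pass filtering, short-time energy, and a threshold trigger. The sampling rate, filter bandwidth, window length, and threshold are illustrative assumptions, not the PFX processing chain described above.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    fs = 48_000                                   # assumed sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    # Synthetic record: a 10.2 kHz tone present during the first second, plus noise.
    x = np.sin(2 * np.pi * 10_200 * t) * (t < 1) + 0.5 * np.random.randn(t.size)

    sos = butter(4, [10_000, 10_400], btype="bandpass", fs=fs, output="sos")
    y = sosfiltfilt(sos, x)                       # zero-phase band-pass around 10.2 kHz

    win = int(0.1 * fs)                           # 100 ms energy windows
    energy = np.array([np.mean(y[i:i + win] ** 2)
                       for i in range(0, y.size - win, win)])
    detections = np.where(energy > 5 * np.median(energy))[0]   # threshold trigger
    ```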

  1. Automatic Synthesis of Anthropomorphic Pulmonary CT Phantoms.

    Directory of Open Access Journals (Sweden)

    Daniel Jimenez-Carretero

    Full Text Available The great density and structural complexity of pulmonary vessels and airways impose limitations on the generation of accurate reference standards, which are critical in training and in the validation of image processing methods for features such as pulmonary vessel segmentation or artery-vein (AV) separations. The design of synthetic computed tomography (CT) images of the lung could overcome these difficulties by providing a database of pseudorealistic cases in a constrained and controlled scenario where each part of the image is differentiated unequivocally. This work demonstrates a complete framework to generate computational anthropomorphic CT phantoms of the human lung automatically. Starting from biological and image-based knowledge about the topology and relationships between structures, the system is able to generate synthetic pulmonary arteries, veins, and airways using iterative growth methods that can be merged into a final simulated lung with realistic features. A dataset of 24 labeled anthropomorphic pulmonary CT phantoms was synthesized with the proposed system. Visual examination and quantitative measurements of intensity distributions, dispersion of structures and relationships between pulmonary air and blood flow systems show good correspondence between real and synthetic lungs (p > 0.05, with low Cohen's d effect size and AUC values), supporting the potentiality of the tool and the usefulness of the generated phantoms in the biomedical image processing field.

  2. Automatically ordering events and times in text

    CERN Document Server

    Derczynski, Leon R A

    2017-01-01

    The book offers a detailed guide to temporal ordering, exploring open problems in the field and providing solutions and extensive analysis. It addresses the challenge of automatically ordering events and times in text. Aided by TimeML, it also describes and presents concepts relating to time in easy-to-compute terms. Working out the order that events and times happen has proven difficult for computers, since the language used to discuss time can be vague and complex. Mapping out these concepts for a computational system, which does not have its own inherent idea of time, is, unsurprisingly, tough. Solving this problem enables powerful systems that can plan, reason about events, and construct stories of their own accord, as well as understand the complex narratives that humans express and comprehend so naturally. This book presents a theory and data-driven analysis of temporal ordering, leading to the identification of exactly what is difficult about the task. It then proposes and evaluates machine-learning so...

  3. Automatic keywording of High Energy Physics

    CERN Document Server

    Dallman, David Peter

    1999-01-01

    Bibliographic databases were developed from the traditional library card catalogue in order to enable users to access library documents via various types of bibliographic information, such as title, author, series or conference date. In addition these catalogues sometimes contained some form of indexation by subject, such as the Universal (or Dewey) Decimal Classification used for books. With the introduction of the eprint archives, set up by the High Energy Physics (HEP) Community in the early 90s, huge collections of documents in several fields have been made available on the World Wide Web. These developments however have not yet been followed up from a keywording point of view. We will see in this paper how important it is to attribute keywords to all documents in the area of HEP Grey Literature. As libraries are facing a future with less and less manpower available and more and more documents, we will explore the possibility of being helped by automatic classification software. We will specifically menti...

  4. Automatic Synthesis of Anthropomorphic Pulmonary CT Phantoms

    Science.gov (United States)

    Jimenez-Carretero, Daniel; San Jose Estepar, Raul; Diaz Cacio, Mario; Ledesma-Carbayo, Maria J.

    2016-01-01

    The great density and structural complexity of pulmonary vessels and airways impose limitations on the generation of accurate reference standards, which are critical in training and in the validation of image processing methods for features such as pulmonary vessel segmentation or artery–vein (AV) separations. The design of synthetic computed tomography (CT) images of the lung could overcome these difficulties by providing a database of pseudorealistic cases in a constrained and controlled scenario where each part of the image is differentiated unequivocally. This work demonstrates a complete framework to generate computational anthropomorphic CT phantoms of the human lung automatically. Starting from biological and image-based knowledge about the topology and relationships between structures, the system is able to generate synthetic pulmonary arteries, veins, and airways using iterative growth methods that can be merged into a final simulated lung with realistic features. A dataset of 24 labeled anthropomorphic pulmonary CT phantoms was synthesized with the proposed system. Visual examination and quantitative measurements of intensity distributions, dispersion of structures and relationships between pulmonary air and blood flow systems show good correspondence between real and synthetic lungs (p > 0.05 with low Cohen’s d effect size and AUC values), supporting the potentiality of the tool and the usefulness of the generated phantoms in the biomedical image processing field. PMID:26731653

  5. Surgery with computerized virtual reality for the automatic detection of tumors.

    Science.gov (United States)

    Fernández Fernández de Santo; Nieto Llanos, S; Ortiz Aguilar, M; Sánchez Colodrón, E; Tello López, J; Blasco Delgado, O; Galván Pérez, A; Maestu García, M; Guerra Paredes, E

    1999-07-01

    We present a novel and highly accurate system based on informatics engineering capable of automatic detection of tumors directly in the operating field. The system can identify the outlines of the tumor, determine whether it is malignant or not, detect lymphadenopathy and determine whether nodes are metastasized or not. The highly elaborate system, based on artificial vision, has been used in 30 gastric and 5 pancreatic neoplasms, among other tumor types. Images of the surgical field were recorded with a video camera connected to a computer, which was operated by the engineer. Questions asked by the surgeon during the procedure were processed immediately and sent to the virtual reality helmet worn by the surgeon, to the TV monitor in the operating room, or to both. The system is based on purely physical and mathematical processes that work reliably; in this sense it is free from errors and is self-consistent, operator errors or hardware failure excepted. In all cases tested here the system correctly identified the tumor as benign or malignant, revealed the extension of the tumor, and detected lymph node metastases. In every case these results were confirmed by histological examination.

  6. Spectral Curve Fitting for Automatic Hyperspectral Data Analysis

    CERN Document Server

    Brown, Adrian J

    2014-01-01

    Automatic discovery and curve fitting of absorption bands in hyperspectral data can enable the analyst to identify materials present in a scene by comparison with library spectra. This procedure is common in laboratory spectra, but is challenging for sparse hyperspectral data. A procedure for robust discovery of overlapping bands in hyperspectral data is described in this paper. The method is capable of automatically discovering and fitting symmetric absorption bands, can separate overlapping absorption bands in a stable manner, and has relatively low sensitivity to noise. A comparison with techniques already available in the literature is presented using simulated spectra. An application is demonstrated utilizing the shortwave infrared (2.0-2.5 micron or 5000-4000 cm-1) region. A small hyperspectral scene is processed to demonstrate the ability of the method to detect small shifts in absorption wavelength caused by varying white mica chemistry in a natural setting.
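
    The band-fitting idea reduces to nonlinear least squares on a symmetric band model. The sketch below fits a single Gaussian absorption band on a synthetic SWIR spectrum with scipy.optimize.curve_fit; the band center, width, and noise level are invented for illustration.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian_band(wl, depth, center, width, continuum):
        """Flat continuum minus a symmetric Gaussian absorption band."""
        return continuum - depth * np.exp(-0.5 * ((wl - center) / width) ** 2)

    wl = np.linspace(2.0, 2.5, 300)               # wavelength, microns (SWIR)
    true = gaussian_band(wl, 0.2, 2.2, 0.02, 0.9)
    spectrum = true + 0.01 * np.random.randn(wl.size)

    p0 = [0.1, 2.21, 0.03, 1.0]                   # initial parameter guesses
    params, cov = curve_fit(gaussian_band, wl, spectrum, p0=p0)
    depth, center, width, continuum = params      # fitted band parameters
    ```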

  7. A model for automatic identification of human pulse signals

    Institute of Scientific and Technical Information of China (English)

    Hui-yan WANG; Pei-yong ZHANG

    2008-01-01

    This paper presents a quantitative method for automatic identification of human pulse signals. The idea is to start with the extraction of characteristic parameters and then to construct the recognition model based on Bayesian networks. To identify depth, frequency and rhythm, several parameters are proposed. To distinguish the strength and shape, which cannot be represented by one or several parameters and are hard to recognize, the main time-domain feature parameters are computed based on the feature points of the pulse signal. Then the extracted parameters are taken as the input and five models for automatic pulse signal identification are constructed based on Bayesian networks. Experimental results demonstrate that the method is feasible and effective in recognizing depth, frequency, rhythm, strength and shape of pulse signals, which can be expected to facilitate the modernization of pulse diagnosis.

  8. Semi-automatic long-term acoustic surveying

    DEFF Research Database (Denmark)

    Andreassen, Tórur; Surlykke, Annemarie; Hallam, John

    2014-01-01

    ... data sampling rates (500 kHz). Using a sound energy threshold criterion for triggering recording, we collected 236 GB (Gi = 1024³) of data at full bandwidth. We implemented a simple automatic method using a Support Vector Machine (SVM) classifier based on a combination of temporal and spectral analyses to classify events into bat calls and non-bat events. After experimentation we selected duration, energy, bandwidth, and entropy as classification features to identify short, high-energy, structured sounds in the right frequency range. The spectral entropy makes use of the orderly arrangement of frequencies in bat calls to reject short noise pulses, e.g. from rain. The SVM classifier reduced our dataset to 162 MB of candidate bat calls with an estimated accuracy of 96% for dry nights and 70% when it was raining. The automatic survey revealed calls from two species of bat not previously recorded in the area...
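
    The classification step can be sketched as follows: compute duration, energy, bandwidth, and spectral entropy per event and train an SVM. The feature extraction below is a simplified stand-in for the temporal/spectral analyses in the record, and the events are synthetic.

    ```python
    import numpy as np
    from scipy.signal import welch
    from sklearn.svm import SVC

    def features(event, fs=500_000):
        f, psd = welch(event, fs=fs, nperseg=256)
        p = psd / psd.sum()
        entropy = -np.sum(p * np.log2(p + 1e-12))        # spectral entropy
        bandwidth = f[p > 0.01 * p.max()].ptp()          # rough occupied bandwidth
        return [event.size / fs, np.sum(event ** 2), bandwidth, entropy]

    rng = np.random.default_rng(0)
    calls = [np.sin(2 * np.pi * 40_000 * np.arange(2000) / 500_000)
             + 0.1 * rng.standard_normal(2000) for _ in range(50)]
    noise = [rng.standard_normal(2000) for _ in range(50)]   # e.g., rain pulses

    X = np.array([features(e) for e in calls + noise])
    y = np.array([1] * 50 + [0] * 50)
    clf = SVC(kernel="rbf").fit(X, y)                    # bat call vs. non-bat event
    ```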

  9. Image Processing Method for Automatic Discrimination of Hoverfly Species

    Directory of Open Access Journals (Sweden)

    Vladimir Crnojević

    2014-01-01

    Full Text Available An approach to automatic hoverfly species discrimination based on detection and extraction of vein junctions in the wing venation patterns of insects is presented in the paper. The dataset used in our experiments consists of high-resolution microscopic wing images of several hoverfly species collected over a relatively long period of time at different geographic locations. Junctions are detected using a combination of the well-known HOG (histograms of oriented gradients) and a robust version of the recently proposed CLBP (complete local binary pattern). These features are used to train an SVM classifier to detect junctions in wing images. Once the junctions are identified, they are used to extract statistics characterizing the constellations of these points. Such simple features can be used to automatically discriminate four selected hoverfly species with a polynomial-kernel SVM and achieve high classification accuracy.
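
    A simplified illustration of the junction-detection pipeline: HOG descriptors of image patches feeding a polynomial-kernel SVM. The CLBP component is omitted here, and the patches and labels are synthetic placeholders rather than wing imagery.

    ```python
    import numpy as np
    from skimage.feature import hog
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # 24x24 patches: label 1 = contains a vein junction, 0 = background.
    patches = rng.random((200, 24, 24))
    labels = rng.integers(0, 2, 200)               # placeholder ground truth

    X = np.array([hog(p, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
                  for p in patches])
    clf = SVC(kernel="poly", degree=3).fit(X, labels)   # polynomial kernel, as in the abstract
    ```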

  10. Refinements to the Boolean approach to automatic data editing

    Energy Technology Data Exchange (ETDEWEB)

    Liepins, G.E.

    1980-09-01

    Automatic data editing consists of three components: identification of erroneous records, identification of the most likely erroneous fields within an erroneous record (fields to impute), and assignment of acceptable values to failing records. Moreover, the types of data considered naturally fall into three categories: coded (categorical) data, continuous data, and mixed data (both coded and continuous). For the case of coded data, a natural way to approach automatic data editing is commonly referred to as the Boolean approach, first developed by Fellegi and Holt. For the fields-to-impute problem, central to the operation of the Fellegi-Holt approach is the explicit recognition of certain implied edits; Fellegi and Holt originally required a complete set of edits, and their algorithm to generate this complete set has occasionally had the distinct disadvantage of failing to converge within reasonable time. The primary result of this paper is an algorithm that significantly prunes the Fellegi-Holt edit generation process, yet nonetheless generates a collection of implied edits sufficient for the solution of the fields-to-impute problem. 3 figures.

  11. 7 CFR 58.418 - Automatic cheese making equipment.

    Science.gov (United States)

    2010-01-01

    ... processing or packaging areas. (c) Automatic salter. The automatic salter shall be constructed of stainless.... The automatic salter shall be constructed so that it can be satisfactorily cleaned. The salting...

  12. Automatic classification of seismo-volcanic signatures

    Science.gov (United States)

    Malfante, Marielle; Dalla Mura, Mauro; Mars, Jérôme; Macedo, Orlando; Inza, Adolfo; Métaxian, Jean-Philippe

    2017-04-01

    The prediction of volcanic eruptions and the evaluation of their associated risks is still a timely and open issue. For this purpose, several types of signals are recorded in the proximity of volcanoes and then analysed by experts. Typically, seismic signals that are considered as precursors or indicators of an active volcanic phase are detected and manually classified. In this work, we propose an architecture for automatic classification of seismo-volcanic waves. The system we propose is based on supervised machine learning. Specifically, a prediction model is built from a large dataset of labelled examples by means of a learning algorithm (Support Vector Machine or Random Forest). Four main steps are involved: (i) preprocess the signals, (ii) from each signal, extract features that are useful for class discrimination, (iii) use an automatic learning algorithm to train a prediction model and (iv) classify (i.e., assign a semantic label) newly recorded and unlabelled examples. Our main contribution lies in the definition of the feature space used to represent the signals (i.e., in the choice of the features to extract from the data). Feature vectors describe the data in a space of lower dimension with respect to the original one. Ideally, signals are separable in the feature space depending on their classes. For this work, we consider a large set of features (79) gathered from an extensive state of the art in both the acoustic and seismic fields. An analysis of this feature set shows that for the application of interest, 11 features are sufficient to discriminate the data. The architecture is tested on 4725 seismic events recorded between June 2006 and September 2011 at Ubinas, the most active volcano of Peru. Six main classes of signals are considered: volcanic tremors (TR), long period (LP), volcano-tectonic (VT), explosion (EXP), hybrids (HIB) and tornillo (TOR). Our model reaches above 90% accuracy, thereby validating the proposed architecture and the...
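
    Steps (ii)-(iv) can be sketched as a generic feature-extraction plus Random Forest pipeline. The five features below are generic stand-ins, not the 79-feature set (or the selected 11) described above, and the events and labels are synthetic.

    ```python
    import numpy as np
    from scipy.stats import kurtosis, skew
    from sklearn.ensemble import RandomForestClassifier

    def event_features(sig, fs=100.0):
        """A few generic temporal/spectral descriptors of one seismic event."""
        spec = np.abs(np.fft.rfft(sig))
        freqs = np.fft.rfftfreq(sig.size, 1 / fs)
        centroid = np.sum(freqs * spec) / spec.sum()   # spectral centroid
        return [sig.std(), kurtosis(sig), skew(sig), centroid, sig.size / fs]

    rng = np.random.default_rng(0)
    events = [rng.standard_normal(3000) for _ in range(300)]
    labels = rng.choice(["TR", "LP", "VT", "EXP", "HIB", "TOR"], 300)  # placeholder

    X = np.array([event_features(e) for e in events])
    model = RandomForestClassifier(n_estimators=200).fit(X, labels)
    ```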

  13. Automatic seagrass pattern identification on sonar images

    Science.gov (United States)

    Rahnemoonfar, Maryam; Rahman, Abdullah

    2016-05-01

    Natural and human-induced disturbances are resulting in degradation and loss of seagrass. Freshwater flooding, severe meteorological events and invasive species are among the major natural disturbances. Human-induced disturbances are mainly due to boat propeller scars in shallow seagrass meadows and anchor scars in the deeper areas. Therefore, there is a vital need to map seagrass ecosystems in order to determine worldwide abundance and distribution. Currently there is no established method for mapping the potholes or scars in seagrass. One of the most precise sensors for mapping seagrass disturbance is side scan sonar. Here we propose an automatic method which detects seagrass potholes in sonar images. Side scan sonar images are notorious for speckle noise and uneven illumination across the image. Moreover, the disturbance presents complex patterns where most segmentation techniques will fail. In this paper, by applying mathematical morphology techniques and calculating the local standard deviation of the image, the images were enhanced and the pothole patterns identified. The proposed method was applied to sonar images taken from Laguna Madre in Texas. Experimental results show the effectiveness of the proposed method.
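
    The enhancement step can be illustrated with a local standard deviation map plus grayscale morphology: low local variation flags smooth, pothole-like regions. The window size, structuring element, and threshold are illustrative assumptions, and the image below is synthetic rather than sonar data.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter, grey_opening

    def local_std(img, size=9):
        """Local standard deviation via the mean-of-squares identity."""
        mean = uniform_filter(img, size)
        sq_mean = uniform_filter(img ** 2, size)
        return np.sqrt(np.maximum(sq_mean - mean ** 2, 0))

    rng = np.random.default_rng(0)
    sonar = rng.random((256, 256))                 # stand-in for a side-scan image
    sonar[100:130, 100:140] *= 0.2                 # a dark, low-texture "pothole"

    smoothed = grey_opening(sonar, size=(5, 5))    # suppress speckle-like noise
    texture = local_std(smoothed, size=9)
    potholes = texture < 0.5 * texture.mean()      # low local variation regions
    ```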

  14. Automatically Determining Scale Within Unstructured Point Clouds

    Science.gov (United States)

    Kadamen, Jayren; Sithole, George

    2016-06-01

    Three-dimensional models obtained from imagery have an arbitrary scale and therefore have to be scaled. Automatically scaling these models requires the detection of objects in the models, which can be computationally intensive. Real-time object detection may pose problems for applications such as indoor navigation. This investigation poses the idea that relational cues, specifically height ratios, within indoor environments may offer an easier means to obtain scales for models created using imagery. The investigation aimed to show two things: (a) that the size of objects, especially their height off the ground, is consistent within an environment, and (b) that based on this consistency, objects can be identified and their general size used to scale a model. To test the idea, a hypothesis is first tested on a terrestrial lidar scan of an indoor environment. Later, as a proof of concept, the same test is applied to a model created using imagery. The most notable finding was that the detection of objects can be more readily done by studying the ratio between the dimensions of objects that have their dimensions defined by human physiology. For example, the dimensions of desks and chairs are related to the height of an average person. In the test, the difference between generalised and actual dimensions of objects was assessed. A maximum difference of 3.96% (2.93 cm) was observed from automated scaling. By analysing the ratio between the heights (distance from the floor) of the tops of objects in a room, identification was also achieved.

  15. Automatic Generation of Minimal Cut Sets

    Directory of Open Access Journals (Sweden)

    Sentot Kromodimoeljo

    2015-06-01

    Full Text Available A cut set is a collection of component failure modes that could lead to a system failure. Cut Set Analysis (CSA is applied to critical systems to identify and rank system vulnerabilities at design time. Model checking tools have been used to automate the generation of minimal cut sets but are generally based on checking reachability of system failure states. This paper describes a new approach to CSA using a Linear Temporal Logic (LTL model checker called BT Analyser that supports the generation of multiple counterexamples. The approach enables a broader class of system failures to be analysed, by generalising from failure state formulae to failure behaviours expressed in LTL. The traditional approach to CSA using model checking requires the model or system failure to be modified, usually by hand, to eliminate already-discovered cut sets, and the model checker to be rerun, at each step. By contrast, the new approach works incrementally and fully automatically, thereby removing the tedious and error-prone manual process and resulting in significantly reduced computation time. This in turn enables larger models to be checked. Two different strategies for using BT Analyser for CSA are presented. There is generally no single best strategy for model checking: their relative efficiency depends on the model and property being analysed. Comparative results are given for the A320 hydraulics case study in the Behavior Tree modelling language.

  16. A Hierarchy of Tree-Automatic Structures

    CERN Document Server

    Finkel, Olivier

    2011-01-01

    We consider $\omega^n$-automatic structures which are relational structures whose domain and relations are accepted by automata reading ordinal words of length $\omega^n$ for some integer $n\geq 1$. We show that all these structures are $\omega$-tree-automatic structures presentable by Muller or Rabin tree automata. We prove that the isomorphism relation for $\omega^2$-automatic (resp. $\omega^n$-automatic for $n>2$) boolean algebras (respectively, partial orders, rings, commutative rings, non-commutative rings, non-commutative groups) is not determined by the axiomatic system ZFC. We infer from the proof of the above result that the isomorphism problem for $\omega^n$-automatic boolean algebras, $n > 1$, (respectively, rings, commutative rings, non-commutative rings, non-commutative groups) is neither a $\Sigma_2^1$-set nor a $\Pi_2^1$-set. We obtain that there exist infinitely many $\omega^n$-automatic, hence also $\omega$-tree-automatic, atomless boolean algebras $B_n$, $n\geq 1$, which are pairwise isomorp...

  17. Automatic Weather Station (AWS) Lidar

    Science.gov (United States)

    Rall, Jonathan A.R.; Abshire, James B.; Spinhirne, James D.; Smith, David E. (Technical Monitor)

    2000-01-01

    An autonomous, low-power atmospheric lidar instrument is being developed at NASA Goddard Space Flight Center. This compact, portable lidar will operate continuously in a temperature controlled enclosure, charge its own batteries through a combination of a small rugged wind generator and solar panels, and transmit its data from remote locations to ground stations via satellite. A network of these instruments will be established by co-locating them at remote Automatic Weather Station (AWS) sites in Antarctica under the auspices of the National Science Foundation (NSF). The NSF Office of Polar Programs provides support to place the weather stations in remote areas of Antarctica in support of meteorological research and operations. The AWS meteorological data will directly benefit the analysis of the lidar data, while a network of ground-based atmospheric lidars will provide knowledge regarding the temporal evolution and spatial extent of Type Ia polar stratospheric clouds (PSCs). These clouds play a crucial role in the annual austral springtime destruction of stratospheric ozone over Antarctica, i.e. the ozone hole. In addition, the lidar will monitor and record the general atmospheric conditions (transmission and backscatter) of the overlying atmosphere, which will benefit the Geoscience Laser Altimeter System (GLAS). Prototype lidar instruments have been deployed to the Amundsen-Scott South Pole Station (1995-96, 2000) and to an Automated Geophysical Observatory site (AGO 1) in January 1999. We report on data acquired with these instruments, instrument performance, and anticipated performance of the AWS Lidar.

  18. Automatic segmentation of psoriasis lesions

    Science.gov (United States)

    Ning, Yang; Shi, Chenbo; Wang, Li; Shu, Chang

    2014-10-01

    The automatic segmentation of psoriatic lesions has been widely researched in recent years. It is an important step in computer-aided methods of calculating PASI for the estimation of lesions. Current algorithms can only handle single erythema or only deal with scaling segmentation; in practice, scaling and erythema are often mixed together. In order to segment the lesion area, this paper proposes an algorithm based on random forests with color and texture features. The algorithm has three steps. In the first step, polarized light is applied, based on the skin's Tyndall effect, to eliminate reflections in the imaging, and the Lab color space is used to fit human perception. In the second step, a sliding window and its sub-windows are used to obtain textural and color features. In this step, a feature of image roughness is defined, so that scaling can be easily separated from normal skin. In the end, random forests are used to ensure the generalization ability of the algorithm. This algorithm gives reliable segmentation results even when images have different lighting conditions and skin types. On the data set offered by Union Hospital, more than 90% of images can be segmented accurately.

  19. Automatic Assessment of Programming assignment

    Directory of Open Access Journals (Sweden)

    Surendra Gupta

    2012-01-01

    Full Text Available The study of programming languages is increasingly important, and effective programming skills are needed by all computer science students; they can master programming only through intensive exercise and practice. Due to the increasing number of students in class, the assessment of programming exercises leads to an extensive workload for the teacher or instructor, particularly if it has to be carried out manually. In this paper, we propose an automatic assessment system for programming assignments, using verification programs with random inputs. One of the most important properties of a program is that it carries out its intended function; the intended function of a program, or part of a program, can be verified by using an inverse function's verification program. We have used such verification programs for checking intended functionality and evaluating programs. This assessment system has been tested on basic C programming courses, and the results show that it can work well for basic programming exercises, with some initial promising results.
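
    The verification idea, checking a program through its inverse function on random inputs, can be sketched as below: a student sqrt implementation is accepted only if squaring its output recovers the input. The function, trial count, and tolerance are illustrative assumptions, not the paper's system.

    ```python
    import random

    def student_sqrt(x):            # stand-in for a submitted solution
        lo, hi = 0.0, max(1.0, x)
        for _ in range(60):         # bisection search for the square root
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if mid * mid < x else (lo, mid)
        return lo

    def verify(submission, trials=1000, tol=1e-6):
        """Check a sqrt submission via its inverse: (sqrt x)^2 should equal x."""
        for _ in range(trials):
            x = random.uniform(0, 1e6)
            y = submission(x)
            if abs(y * y - x) > tol * max(1.0, x):
                return False, x     # counterexample input found
        return True, None

    ok, counterexample = verify(student_sqrt)
    ```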

  20. Automatic Transmission Of Liquid Nitrogen

    Directory of Open Access Journals (Sweden)

    Sumedh Mhatre

    2015-08-01

    Full Text Available Liquid nitrogen is one of the major substances used as a chiller in industry, for example in ice cream factories, milk dairies, blood banks and the storage of blood samples. It helps to maintain the required product at a lower temperature for preservation purposes. We cannot fully utilise the LN2: in practice, if we use 3.75 litres of LN2 in a single day, then around 12% of the LN2 (450 ml) is wasted due to vaporisation. A pressure relief valve is provided to create a pressure difference; if there is no pressure difference between the cylinder carrying LN2 and its surroundings, it will result in damage to the container as well as wastage of LN2. Transmission of LN2 from TA55 to BA3 is carried out manually, so care must be taken during transmission in order to avoid wastage. With the help of this project concept, the transmission of LN2 will be carried out automatically, so as to reduce the wastage of LN2 that occurs with manual operation.

  1. Automatic Testing of a CANopen Node

    OpenAIRE

    Liang, Hui

    2013-01-01

    This Bachelor’s thesis was commissioned by TK Engineering Oy in Vaasa. The goals of the thesis were to test a prototype CANopen node, called UWASA Node for conformance to the CiA 301 standard, and to develop the automatic performance test software and the automatic CiA 401 test software. A test report that describes to the designer what needs to be corrected and improved is made in this thesis. For the CiA 301 test there is a CANopen conformance test tool that can be used. The automatic perfo...

  2. An Automatic Clustering Technique for Optimal Clusters

    CERN Document Server

    Pavan, K Karteeka; Rao, A V Dattatreya; 10.5121/ijcsea.2011.1412

    2011-01-01

    This paper proposes a simple, automatic and efficient clustering algorithm, namely Automatic Merging for Optimal Clusters (AMOC), which aims to generate nearly optimal clusters for given datasets automatically. AMOC is an extension of standard k-means with a two-phase iterative procedure that combines certain validation techniques in order to find optimal clusters through automated merging of clusters. Experiments on both synthetic and real data have shown that the proposed algorithm finds nearly optimal clustering structures in terms of number of clusters, compactness and separation.
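
    A simplified sketch of the merge-for-optimal-clusters idea: over-cluster with k-means, then greedily merge the closest pair of centroids while a validity index (here, silhouette) improves. This illustrates the flavor of the approach only; it is not the AMOC algorithm itself.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score

    def cluster_with_merging(X, k_start=10, seed=0):
        labels = KMeans(n_clusters=k_start, n_init=10, random_state=seed).fit_predict(X)
        best = silhouette_score(X, labels)
        while len(np.unique(labels)) > 2:
            ids = np.unique(labels)
            centroids = np.array([X[labels == i].mean(axis=0) for i in ids])
            d = np.linalg.norm(centroids[:, None] - centroids[None], axis=2)
            np.fill_diagonal(d, np.inf)
            a, b = np.unravel_index(d.argmin(), d.shape)     # closest centroid pair
            merged = labels.copy()
            merged[merged == ids[b]] = ids[a]                # tentatively merge them
            score = silhouette_score(X, merged)
            if score <= best:
                break                                        # merging no longer helps
            labels, best = merged, score
        return labels

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(c, 0.3, (50, 2)) for c in ((0, 0), (3, 0), (0, 3))])
    labels = cluster_with_merging(X)                         # recovers ~3 clusters
    ```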

  3. Automatic tuning of flexible interventional RF receiver coils.

    Science.gov (United States)

    Venook, Ross D; Hargreaves, Brian A; Gold, Garry E; Conolly, Steven M; Scott, Greig C

    2005-10-01

    Microcontroller-based circuitry was built and tested for automatically tuning flexible RF receiver coils at the touch of a button. This circuitry is robust to 10% changes in probe center frequency, is in line with the scanner, and requires less than 1 s to tune a simple probe. Images were acquired using this circuitry with a varactor-tunable 1-inch flexible probe in a phantom and in an in vitro porcine knee model. The phantom experiments support the use of automatic tuning by demonstrating 30% signal-to-noise ratio (SNR) losses for 5% changes in coil center frequency, in agreement with theoretical calculations. Comparisons between patellofemoral cartilage images obtained using a 3-inch surface coil and the surgically-implanted 1-inch flexible coil reveal a worst-case local SNR advantage of a factor of 4 for the smaller coil. This work confirms that surgically implanted coils can greatly improve resolution in small-field-of-view (FOV) applications, and demonstrates the importance and feasibility of automatically tuning such probes.

  4. Automatic graphene transfer system for improved material quality and efficiency

    Science.gov (United States)

    Boscá, Alberto; Pedrós, Jorge; Martínez, Javier; Palacios, Tomás; Calle, Fernando

    2016-02-01

    In most applications based on chemical vapor deposition (CVD) graphene, the transfer from the growth to the target substrate is a critical step for the final device performance. Manual procedures are time consuming and depend on handling skills, whereas existing automatic roll-to-roll methods work well for flexible substrates but tend to induce mechanical damage in rigid ones. A new system that automatically transfers CVD graphene to an arbitrary target substrate has been developed. The process is based on the all-fluidic manipulation of the graphene to avoid mechanical damage, strain and contamination, and on the combination of capillary action and electrostatic repulsion between the graphene and its container to ensure a centered sample on top of the target substrate. The improved carrier mobility and yield of the automatically transferred graphene, as compared to that manually transferred, is demonstrated by the optical and electrical characterization of field-effect transistors fabricated on both materials. In particular, 70% higher mobility values, with a 30% decrease in the unintentional doping and a 10% strain reduction are achieved. The system has been developed for lab-scale transfer and proved to be scalable for industrial applications.

  5. Automatic computational models of acoustical category features: Talking versus singing

    Science.gov (United States)

    Gerhard, David

    2003-10-01

    The automatic discrimination between acoustical categories has been an increasingly interesting problem in the fields of computer listening, multimedia databases, and music information retrieval. A system is presented which automatically generates classification models, given a set of destination classes and a set of a priori labeled acoustic events. Computational models are created using comparative probability density estimations. For the specific example presented, the destination classes are talking and singing. Individual feature models are evaluated using two measures: the Kolmogorov-Smirnov distance measures feature separation, and accuracy is measured using absolute and relative metrics. The system automatically segments the event set into a user-defined number (n) of development subsets, and runs a development cycle for each set, generating n separate systems, each of which is evaluated using the above metrics to improve overall system accuracy and to reduce inherent data skew from any one development subset. Multiple features for the same acoustical categories are then compared for underlying feature overlap using cross-correlation. Advantages of automated computational models include improved system development and testing, a shortened development cycle, and automation of common system evaluation tasks. Numerical results are presented relating to the talking/singing classification problem.
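
    The Kolmogorov-Smirnov distance as a feature-separation measure is easy to demonstrate: compare a feature's empirical distributions under the two classes. The feature values below are synthetic stand-ins for talking/singing data.

    ```python
    import numpy as np
    from scipy.stats import ks_2samp

    rng = np.random.default_rng(0)
    talking = rng.normal(loc=0.3, scale=0.1, size=500)   # e.g., a pitch-stability feature
    singing = rng.normal(loc=0.6, scale=0.1, size=500)

    stat, pvalue = ks_2samp(talking, singing)
    print(f"KS distance = {stat:.3f}")   # larger distance -> better class separation
    ```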

  6. Automatic categorization of diverse experimental information in the bioscience literature

    Directory of Open Access Journals (Sweden)

    Fang Ruihua

    2012-01-01

    Full Text Available Abstract Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify, from all published literature, the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus is usually time consuming. We developed an automatic method, based on the Support Vector Machine (SVM) machine learning method, for identifying papers containing these curation data types among a large pool of published scientific papers. This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years and is in the process of being adopted in the biocuration processes at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community, thereby greatly reducing the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature, such as C. elegans and D. melanogaster, to identify papers with any of these data types for a single database. This approach has great significance because, for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in

  7. Automatic health record review to help prioritize gravely ill Social Security disability applicants.

    Science.gov (United States)

    Abbott, Kenneth; Ho, Yen-Yi; Erickson, Jennifer

    2017-07-01

    Every year, thousands of patients die waiting for disability benefits from the Social Security Administration. Some qualify for expedited service under the Compassionate Allowance (CAL) initiative, but CAL software focuses exclusively on information from a single form field. This paper describes the development of a supplemental process for identifying some overlooked but gravely ill applicants, through automatic annotation of health records accompanying new claims. We explore improved prioritization instead of fully autonomous claims approval. We developed a sample of claims containing medical records at the moment of arrival in a single office. A series of tools annotated both patient records and public Web page descriptions of CAL medical conditions. We trained random forests to identify CAL patients and validated each model with 10-fold cross validation. Our main model, a general CAL classifier, had an area under the receiver operating characteristic curve of 0.915. Combining this classifier with existing software improved sensitivity from 0.960 to 0.994, detecting every deceased patient, but reducing positive predictive value to 0.216. True positive CAL identification is a priority, given CAL patient mortality. Mere prioritization of the false positives would not create a meaningful burden in terms of manual review. Death certificate data suggest the presence of truly ill patients among putative false positives. To a limited extent, it is possible to identify gravely ill Social Security disability applicants by analyzing annotations of unstructured electronic health records, and the level of identification is sufficient to be useful in prioritizing case reviews.

  8. The concept of automatic reinforcement: implications for behavioral research in developmental disabilities.

    Science.gov (United States)

    Vollmer, T R

    1994-01-01

    Automatic reinforcement refers to situations in which behavior is maintained by operant mechanisms independent of the social environment. A number of difficulties exist in conducting an adequate functional analysis of automatically reinforced aberrant behavior. For example, sources of reinforcement are often difficult or impossible to identify, manipulate, or control. Further, the development of treatments is often difficult because many behavioral interventions, such as timeout, involve manipulation of the social environment--an approach that may be functionally irrelevant in the case of automatic reinforcement. This article discusses the problems inherent in the analysis of automatically reinforced behavior and reviews four classes of treatment that are compatible with that behavioral function. The four types of intervention reviewed include manipulations of establishing operations, sensory extinction, differential reinforcement, and punishment. Suggestions for future research are discussed.

  9. Automatic Artifact Removal from Electroencephalogram Data Based on A Priori Artifact Information

    Directory of Open Access Journals (Sweden)

    Chi Zhang

    2015-01-01

    Full Text Available Electroencephalogram (EEG) is susceptible to various nonneural physiological artifacts. Automatic artifact removal from EEG data remains a key challenge for extracting relevant information from brain activities. To adapt to variable subjects and EEG acquisition environments, this paper presents an automatic online artifact removal method based on a priori artifact information. The combination of discrete wavelet transform and independent component analysis (ICA), wavelet-ICA, was utilized to separate artifact components. The artifact components were then automatically identified using a priori artifact information, which was acquired in advance. Subsequently, signal reconstruction without the artifact components was performed to obtain artifact-free signals. The results showed that, using this automatic online artifact removal method, there were statistically significant improvements in the classification accuracies of both experiments, namely, motor imagery and emotion recognition.
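
    A hedged sketch of the wavelet-ICA idea: decompose each channel with a discrete wavelet transform, run ICA across channels on one coefficient band, zero the components matching a stored a priori artifact waveform, and reconstruct. The correlation threshold and template matching are illustrative stand-ins for the paper's identification criterion.

    ```python
    import numpy as np
    import pywt
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    artifact = rng.standard_normal(1024)            # "a priori" artifact waveform
    eeg = rng.standard_normal((8, 1024)) + 0.8 * artifact   # 8 contaminated channels

    # 1) DWT per channel; keep the coefficient structure for reconstruction.
    coeffs = [pywt.wavedec(ch, "db4", level=3) for ch in eeg]
    approx = np.array([c[0] for c in coeffs])       # approximation band, all channels

    # 2) ICA across channels on the approximation band.
    ica = FastICA(n_components=8, random_state=0)
    sources = ica.fit_transform(approx.T).T

    # 3) Flag components correlated with the artifact's approximation band.
    art_approx = pywt.wavedec(artifact, "db4", level=3)[0]
    corr = np.array([abs(np.corrcoef(s, art_approx)[0, 1]) for s in sources])
    sources[corr > 0.5] = 0                         # remove flagged components

    # 4) Put the cleaned band back and inverse-transform each channel.
    cleaned_approx = ica.inverse_transform(sources.T).T
    clean = np.array([pywt.waverec([cleaned_approx[i]] + coeffs[i][1:], "db4")
                      for i in range(len(coeffs))])
    ```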

  10. Automatic single questionnaire intensity (SQI, EMS98 scale) estimation using ranking models built on the existing BCSF database

    Science.gov (United States)

    Schlupp, A.; Sira, C.; Schmitt, K.; Schaming, M.

    2013-12-01

    ... the fact that each definitive BCSF SQI is determined by an expert analysis. We compare the SQIs obtained by these methods from our database and discuss the coherency and variations between the automatic and manual processes. These methods lead to high scores, with up to 85% of the forms well classified and most of the remaining forms classified with a shift of only one intensity degree. This allows us to use the ranking methods as the best automatic methods for fast SQI estimation and to produce fast shakemaps. The next step, to improve the use of these methods, will be to identify explanations for the forms not classified at the correct value, and a way to select the few remaining forms that should be analyzed by the expert. Note that beyond intensity VI, on-line questionnaires are insufficient and a field survey is indispensable to estimate intensity. For such surveys, in France, BCSF leads a macroseismic intervention group (GIM).

  11. MFM Automatic Control System Development for CYCIAE-100

    Institute of Scientific and Technical Information of China (English)

    CAO; Lei; YIN; Zhi-guo; LV; Yin-long; ZHONG; Jun-qing

    2012-01-01

    In order to do the magnetic field measurement (MFM) work for CYCIAE-100, an automatic MFM facility has been developed by the cyclotron team at CIAE. 1 Design of project The MFM facility for CYCIAE-100 adopts the method of circular and radial motion to complete the measurement. In the circular direction, an open-loop control is adopted at the hardware level. A compensation algorithm forms a virtual closed-loop control based on the position signal from the angle encoder

  12. Automatically Discovering Relaxed Lyapunov Functions for Polynomial Dynamical Systems

    CERN Document Server

    Liu, Jiang; Zhao, Hengjun

    2011-01-01

    The notion of Lyapunov function plays a key role in the design and verification of dynamical systems, as well as hybrid and cyber-physical systems. In this paper, to analyze the asymptotic stability of a dynamical system, we generalize standard Lyapunov functions to relaxed Lyapunov functions (RLFs), by considering higher-order Lie derivatives of certain functions along the system's vector field. Furthermore, we present a complete method for automatically discovering polynomial RLFs for polynomial dynamical systems (PDSs). Our method is complete in the sense that it is able to discover all polynomial RLFs by enumerating all polynomial templates for any PDS.
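
    The core computation, higher-order Lie derivatives of a candidate function along a polynomial vector field, is easy to reproduce symbolically. The system and candidate V below are toy examples, not taken from the paper.

    ```python
    import sympy as sp

    x, y = sp.symbols("x y")
    f = sp.Matrix([-y - x**3, x - y**3])        # polynomial vector field dx/dt
    V = x**2 + y**2                             # candidate (relaxed) Lyapunov function

    def lie_derivative(V, f, state, order=1):
        """Apply the Lie derivative along f to V, `order` times."""
        for _ in range(order):
            V = sum(sp.diff(V, v) * fi for v, fi in zip(state, f))
        return sp.simplify(V)

    L1 = lie_derivative(V, f, (x, y))           # first-order: dV/dt along f
    L2 = lie_derivative(V, f, (x, y), order=2)  # second-order Lie derivative
    print(L1)                                   # -2*x**4 - 2*y**4 (negative semidefinite)
    ```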

  13. Automatic tuning of myoelectric prostheses.

    Science.gov (United States)

    Bonivento, C; Davalli, A; Fantuzzi, C; Sacchetti, R; Terenzi, S

    1998-07-01

    This paper is concerned with the development of a software package for the automatic tuning of myoelectric prostheses. The package core consists of Fuzzy Logic Expert Systems (FLES) that embody skilled operator heuristics in the tuning of prosthesis control parameters. The prosthesis system is an artificial arm-hand system developed at the National Institute of Accidents at Work (INAIL) laboratories. The prosthesis is powered by an electric motor that is controlled by a microprocessor using myoelectric signals acquired from skin-surface electrodes placed on a muscle in the residual limb of the subject. The software package, Microprocessor Controlled Arm (MCA) Auto Tuning, is a tool for aiding both INAIL expert operators and unskilled persons in the controller parameter tuning procedure. Prosthesis control parameter setup and subsequent recurrent adjustments are fundamental for the correct working of the prosthesis, especially when we consider that myoelectric parameters may vary greatly with environmental modifications. Parameter adjustment normally requires the end-user to go to the manufacturer's laboratory for the control parameter setup because, generally, he/she does not have the necessary knowledge and instruments to do this at home. However, this procedure is not very practical and involves a waste of time for the technicians and uneasiness for the clients. The idea behind the MCA Auto Tuning package consists of translating technician expertise into an FLES knowledge database. The software interacts through a user-friendly graphic interface with an unskilled user, who is guided through a step-by-step procedure in the prosthesis parameter tuning that emulates the traditional expert-aided procedure. The adoption of this program on a large scale may yield considerable economic benefits and improve the service quality supplied to the users of prostheses. In fact, the time required to set the prosthesis parameters is remarkably reduced, as is the technician

  14. Automatization and familiarity in repeated checking

    NARCIS (Netherlands)

    Dek, Eliane C P; van den Hout, Marcel A.; Giele, Catharina L.; Engelhard, Iris M.

    2014-01-01

    Repeated checking paradoxically increases memory uncertainty. This study investigated the underlying mechanism of this effect. We hypothesized that as a result of repeated checking, familiarity with stimuli increases, and automatization of the checking procedure occurs, which should result in decrea

  15. Automatic Speech Segmentation Based on HMM

    Directory of Open Access Journals (Sweden)

    M. Kroul

    2007-06-01

    Full Text Available This contribution deals with the problem of automatic phoneme segmentation using HMMs. Automating the speech segmentation task is important for applications where large amounts of data need to be processed, so that manual segmentation is out of the question. In this paper we focus on automatic segmentation of recordings which will be used to create a triphone synthesis unit database. For speech synthesis, speech unit quality is a crucial aspect, so maximal segmentation accuracy is needed here. In this work, different kinds of HMMs with various parameters have been trained and their usefulness for automatic segmentation is discussed. At the end of this work, segmentation accuracy tests of all models are presented.

  16. Collapsible truss structure is automatically expandable

    Science.gov (United States)

    1965-01-01

    Coil springs wound with maximum initial tension in a three-truss, closed loop structure form a collapsible truss structure. The truss automatically expands and provides excellent rigidity and close dimensional tolerance when expanded.

  17. Automatic coding of online collaboration protocols

    NARCIS (Netherlands)

    Erkens, Gijsbert; Janssen, J.J.H.M.

    2006-01-01

    An automatic coding procedure is described to determine the communicative functions of messages in chat discussions. Five main communicative functions are distinguished: argumentative (indicating a line of argumentation or reasoning), responsive (e.g., confirmations, denials, and answers), informati

  18. 2010 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2010 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  19. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control – APCA, national member organization of the International Federation of Automatic Control – IFAC. CONTROLO 2016 – the 12th Portuguese Conference on Automatic Control, Guimarães, Portugal, September 14th to 16th – was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers range from cutting-edge theoretical research to innovative control applications and show expressively how Automatic Control can be used to increase the well-being of people.

  20. Automatic Evolution of Molecular Nanotechnology Designs

    Science.gov (United States)

    Globus, Al; Lawton, John; Wipke, Todd; Saini, Subhash (Technical Monitor)

    1998-01-01

    This paper describes strategies for automatically generating designs for analog circuits at the molecular level. Software maps out the edges and vertices of potential nanotechnology systems on graphs, then selects appropriate ones through evolutionary or genetic paradigms.

  1. Automatic lexical classification: bridging research and practice.

    Science.gov (United States)

    Korhonen, Anna

    2010-08-13

    Natural language processing (NLP)--the automatic analysis, understanding and generation of human language by computers--is vitally dependent on accurate knowledge about words. Because words change their behaviour between text types, domains and sub-languages, a fully accurate static lexical resource (e.g. a dictionary, word classification) is unattainable. Researchers are now developing techniques that could be used to automatically acquire or update lexical resources from textual data. If successful, the automatic approach could considerably enhance the accuracy and portability of language technologies, such as machine translation, text mining and summarization. This paper reviews the recent and on-going research in automatic lexical acquisition. Focusing on lexical classification, it discusses the many challenges that still need to be met before the approach can benefit NLP on a large scale.

  2. Automaticity in social-cognitive processes.

    Science.gov (United States)

    Bargh, John A; Schwader, Kay L; Hailey, Sarah E; Dyer, Rebecca L; Boothby, Erica J

    2012-12-01

    Over the past several years, the concept of automaticity of higher cognitive processes has permeated nearly all domains of psychological research. In this review, we highlight insights arising from studies in decision-making, moral judgments, close relationships, emotional processes, face perception and social judgment, motivation and goal pursuit, conformity and behavioral contagion, embodied cognition, and the emergence of higher-level automatic processes in early childhood. Taken together, recent work in these domains demonstrates that automaticity does not result exclusively from a process of skill acquisition (in which a process always begins as a conscious and deliberate one, becoming capable of automatic operation only with frequent use) - there are evolved substrates and early childhood learning mechanisms involved as well.

  3. Automatic acquisition of pattern collocations in GO

    Institute of Scientific and Technical Information of China (English)

    LIU Zhi-qing; DOU Qing; LI Wen-hong; LU Ben-jie

    2008-01-01

    The quality, quantity, and consistency of the knowledge used in GO-playing programs often determine their strengths, and automatic acquisition of large amounts of high-quality and consistent GO knowledge is crucial for successful GO playing. In a previous article on this subject, we presented an algorithm for efficient and automatic acquisition of spatial patterns of GO as well as their frequency of occurrence from game records. In this article, we present two algorithms, one for efficient and automatic acquisition of pairs of spatial patterns that appear jointly in a local context, and the other for determining whether the joint pattern appearances are statistically significant and not just a coincidence. Results of the two algorithms include 1 779 966 pairs of spatial patterns acquired automatically from 16 067 game records of professional GO players, of which about 99.8% are qualified as pattern collocations with a statistical confidence of 99.5% or higher.
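
    As an illustration of the statistical filtering step, the sketch below tests whether two patterns co-occur more often than independence would predict, using a one-sided binomial test at the 99.5% confidence level the abstract mentions; the paper does not spell out its exact test, so this is a plausible stand-in with hypothetical names and counts.

        from scipy.stats import binomtest

        def is_collocation(n_joint, n_a, n_b, n_contexts, confidence=0.995):
            """True if patterns A and B appear jointly in local contexts
            significantly more often than chance.

            n_joint:    contexts where A and B appear together
            n_a, n_b:   contexts where A (resp. B) appears
            n_contexts: total number of local contexts examined
            """
            p_chance = (n_a / n_contexts) * (n_b / n_contexts)  # independence
            result = binomtest(n_joint, n_contexts, p_chance,
                               alternative="greater")
            return result.pvalue < (1.0 - confidence)

        # e.g. is_collocation(n_joint=120, n_a=900, n_b=800, n_contexts=50000)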

  4. 2014 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2014 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  5. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Journal of Computer Science and Its Application ... The questions were generated by first extracting the text from the materials supplied by the ... Keywords: Discourse Connectives, Machine Learning, Automatic Test Generation E-Learning.

  6. Automatic identification for standing tree limb pruning

    Institute of Scientific and Technical Information of China (English)

    Sun Renshan; Li Wenbin; Tian Yongchen; Hua Li

    2006-01-01

    To meet the demand of automatic pruning machines, this paper presents a new method for dynamic automatic identification of standing tree limbs and capture of the digital images of Platycladus orientalis. Methods of computer vision, image processing and wavelet analysis technology were used to compress, filter, segment, abate noise and capture the outline of the picture. We then present the algorithm for dynamic automatic identification of standing tree limbs, extracting basic growth characteristics of the standing trees such as the form, size, degree of bending and their relative spatial position. We use pattern recognition technology to confirm the proportionate relationship matching the database and thus achieve the goal of dynamic automatic identification of standing tree limbs.

  7. 2009 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2009 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  8. A Demonstration of Automatically Switched Optical Network

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    We build an automatically switched optical network (ASON) testbed with four optical cross-connect nodes. Many fundamental ASON features are demonstrated, implemented by control protocols based on the generalized multi-protocol label switching (GMPLS) framework.

  9. 2011 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2011 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  10. 2012 United States Automatic Identification System Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 2012 United States Automatic Identification System Database contains vessel traffic data for planning purposes within the U.S. coastal waters. The database is...

  11. Automatic Classification of Interplanetary Dust Particles

    Science.gov (United States)

    Lasue, J.; Stepinski, T. F.; Bell, S. W.

    2010-03-01

    We present an automatic classification of the IDPs collected by NASA-JSC based on their EDS spectra. Agglomerative clustering and Sammon's map algorithms are used to visualize relationships between the clusters.

  12. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  13. Automatic and Direct Identification of Blink Components from Scalp EEG

    Directory of Open Access Journals (Sweden)

    Guojun Dai

    2013-08-01

    Full Text Available Eye blink is an important and inevitable artifact during scalp electroencephalogram (EEG) recording. The main problem in EEG signal processing is how to identify eye blink components automatically with independent component analysis (ICA). Taking into account the fact that the eye blink, as an external source, has a higher sum of correlation with frontal EEG channels than all other sources, due to both its location and its significant amplitude, in this paper we propose a method based on a correlation index and the feature of power distribution to automatically detect eye blink components. Furthermore, we prove mathematically that the correlation between independent components and scalp EEG channels can be computed directly from the mixing matrix of ICA. This helps to simplify calculations and understand the implications of the correlation. The proposed method does not need a template or thresholds selected in advance, and it works without simultaneously recording an electrooculography (EOG) reference. The experimental results demonstrate that the proposed method can automatically recognize eye blink components with high accuracy on entire datasets from 15 subjects.

  14. Automatic and direct identification of blink components from scalp EEG.

    Science.gov (United States)

    Kong, Wanzeng; Zhou, Zhanpeng; Hu, Sanqing; Zhang, Jianhai; Babiloni, Fabio; Dai, Guojun

    2013-08-16

    Eye blink is an important and inevitable artifact during scalp electroencephalogram (EEG) recording. The main problem in EEG signal processing is how to identify eye blink components automatically with independent component analysis (ICA). Taking into account the fact that the eye blink, as an external source, has a higher sum of correlation with frontal EEG channels than all other sources, due to both its location and its significant amplitude, in this paper we propose a method based on a correlation index and the feature of power distribution to automatically detect eye blink components. Furthermore, we prove mathematically that the correlation between independent components and scalp EEG channels can be computed directly from the mixing matrix of ICA. This helps to simplify calculations and understand the implications of the correlation. The proposed method does not need a template or thresholds selected in advance, and it works without simultaneously recording an electrooculography (EOG) reference. The experimental results demonstrate that the proposed method can automatically recognize eye blink components with high accuracy on entire datasets from 15 subjects.
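
    The key computational shortcut, reading channel/component correlations off the mixing matrix, can be sketched as follows, assuming x = As with unit-variance, uncorrelated sources; the use of FastICA, the frontal channel indices, and the scoring rule are illustrative choices, not the authors' code.

        import numpy as np
        from sklearn.decomposition import FastICA

        def blink_component(eeg, frontal_idx):
            """eeg: (n_samples, n_channels). Returns the index of the
            component with the largest summed |correlation| over the
            frontal channels."""
            ica = FastICA(n_components=eeg.shape[1], whiten="unit-variance")
            ica.fit(eeg)
            A = ica.mixing_  # (n_channels, n_components)
            # corr(channel i, component j) = A_ij / ||A_i,:||
            # when the sources are unit-variance and uncorrelated.
            corr = A / np.linalg.norm(A, axis=1, keepdims=True)
            score = np.abs(corr[frontal_idx, :]).sum(axis=0)
            return int(np.argmax(score))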

  15. Automatic Detection of Vehicles Using Intensity Laser and Anaglyph Image

    Directory of Open Access Journals (Sweden)

    Hideo Araki

    2006-12-01

    Full Text Available This work presents a methodology for automatic detection of moving cars in digital aerial images of urban areas, using intensity, anaglyph, and subtraction images. The anaglyph image is used to identify moving cars because, in the exposure pair, a moving car appears in red due to the lack of homology between the two images of the object. An implicit model was developed to provide a digital pixel value with this specific property, using the ratio between the RGB colors of the car object in the anaglyph image. The intensity image is used to reduce false positives and to restrict processing to roads and streets. The subtraction image is applied to reduce the false positives caused by road markings. The goal of this paper is to automatically detect moving cars in digital aerial images of urban areas. The implemented algorithm normalizes the left and right images and then forms the anaglyph by translation. The results show the applicability of the proposed method and its potential for automatic car detection, and the performance of the proposed methodology is presented.

  16. Automatic Spot Identification for High Throughput Microarray Analysis

    Science.gov (United States)

    Wu, Eunice; Su, Yan A.; Billings, Eric; Brooks, Bernard R.; Wu, Xiongwu

    2013-01-01

    High throughput microarray analysis has great potential in scientific research, disease diagnosis, and drug discovery. A major hurdle toward high throughput microarray analysis is the time and effort needed to accurately locate gene spots in microarray images. An automatic microarray image processor will allow accurate and efficient determination of spot locations and sizes so that gene expression information can be reliably extracted in a high throughput manner. Current microarray image processing tools require intensive manual operations in addition to the input of grid parameters to correctly and accurately identify gene spots. This work developed a method, herein called auto-spot, to automate the spot identification process. Through a series of correlation and convolution operations, as well as pixel manipulations, this method makes spot identification an automatic and accurate process. Testing with real microarray images has demonstrated that this method is capable of automatically extracting subgrids from microarray images and determining spot locations and sizes within each subgrid, regardless of variations in array patterns and background noises. With this method, we are one step closer to the goal of high throughput microarray analysis. PMID:24298393
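
    As a taste of the correlation-and-convolution approach, the sketch below locates candidate spot centers by matching the image against a Gaussian spot template; the template size and peak threshold are illustrative assumptions, not the auto-spot parameters.

        import numpy as np
        from scipy.signal import fftconvolve
        from scipy.ndimage import maximum_filter

        def find_spots(image, spot_sigma=3.0, rel_thresh=0.5):
            """Return (row, col) coordinates of candidate spot centers."""
            r = int(3 * spot_sigma)
            y, x = np.mgrid[-r:r + 1, -r:r + 1]
            template = np.exp(-(x**2 + y**2) / (2 * spot_sigma**2))
            template -= template.mean()  # zero-mean correlation response
            # Correlation implemented as convolution with flipped template.
            response = fftconvolve(image, template[::-1, ::-1], mode="same")
            # Candidate centers: local maxima above a relative threshold.
            peaks = (response == maximum_filter(response, size=2 * r + 1))
            peaks &= response > rel_thresh * response.max()
            return np.argwhere(peaks)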

  17. Automatic Control of Freeboard and Turbine Operation

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Frigaard, Peter Bak; Friis-Madsen, Erik;

    The report deals with the modules for automatic control of freeboard and turbine operation on board the Wave Dragon, Nissum Bredning (WD-NB) prototype, and covers what has been going on up to ultimo 2003.

  18. Automatic Age Estimation System for Face Images

    OpenAIRE

    Chin-Teng Lin; Dong-Lin Li; Jian-Hao Lai; Ming-Feng Han; Jyh-Yeong Chang

    2012-01-01

    Humans are the most important tracking objects in surveillance systems. However, human tracking is not enough to provide the required information for personalized recognition. In this paper, we present a novel and reliable framework for automatic age estimation based on computer vision. It exploits global face features based on the combination of Gabor wavelets and orthogonal locality preserving projections. In addition, the proposed system can extract face aging features automatically in rea...

  19. Automatic safety rod for reactors. [LMFBR

    Science.gov (United States)

    Germer, J.H.

    1982-03-23

    An automatic safety rod for a nuclear reactor, containing neutron absorbing material and designed to be inserted into a reactor core after a loss of flow. Actuation occurs either upon a sudden decrease in core pressure drop or when the pressure drop falls below a predetermined minimum value. The automatic safety rod includes a pressure regulating device whereby a controlled decrease in operating pressure due to reduced coolant flow does not cause the rod to drop into the core.

  20. Automatic Fringe Detection Of Dynamic Moire Patterns

    Science.gov (United States)

    Fang, Jing; Su, Xian-ji; Shi, Hong-ming

    1989-10-01

    The fringe-carrier method is used for automatic fringe-order numbering of dynamic in-plane moire patterns. In the experiment, both static carrier and dynamic moire patterns are recorded. Image files corresponding to successive instants are set up to assign fringe orders automatically. By subtracting the carrier image from the modulated ones, the moire patterns due to the dynamic deformations are restored, with fringe-order variation displayed by different grey levels.

  1. Automatic Text Summarization: Past, Present and Future

    OpenAIRE

    Saggion, Horacio; Poibeau, Thierry

    2012-01-01

    Automatic text summarization, the computer-based production of condensed versions of documents, is an important technology for the information society. Without summaries it would be practically impossible for human beings to get access to the ever growing mass of information available online. Although research in text summarization is over fifty years old, some efforts are still needed given the insufficient quality of automatic summaries and the number of interesting ...

  2. Phoneme vs Grapheme Based Automatic Speech Recognition

    OpenAIRE

    Magimai.-Doss, Mathew; Dines, John; Bourlard, Hervé; Hermansky, Hynek

    2004-01-01

    In recent literature, different approaches have been proposed to use graphemes as subword units with implicit source of phoneme information for automatic speech recognition. The major advantage of using graphemes as subword units is that the definition of lexicon is easy. In previous studies, results comparable to phoneme-based automatic speech recognition systems have been reported using context-independent graphemes or context-dependent graphemes with decision trees. In this paper, we study...

  3. Automatic quiz generation for elderly people

    OpenAIRE

    Samuelsen, Jeanette

    2016-01-01

    Studies have indicated that games can be beneficial for the elderly, in areas such as cognitive functioning and well-being. Taking part in social activities, such as playing a game with others, could also be beneficial. One type of game is a computer-based quiz. One can create quiz questions manually; however, this can be time-consuming. Another approach is to generate quiz questions automatically. This project has examined how quizzes for Norwegian elderly can be automatically generated usin...

  4. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    In this paper, this issue is addressed. We propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing alternative to classical continuation methods. Automatic continuation also generally obtains better designs than the classical formulation, using a reduced number of iterations.

  5. Automatic Configuration of Programmable Logic Controller Emulators

    Science.gov (United States)

    2015-03-01

    Programmable logic controllers (PLCs), which are used to control much of the world's critical infrastructures, are highly vulnerable and exposed to the Internet. Many efforts have ... a scalable solution is needed in order to automatically configure PLC emulators. The ScriptGenE Framework presented in this thesis leverages several techniques used in reverse engineering protocols in order to automatically configure PLC emulators using network traces. The accuracy, flexibility, and

  6. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathaniel O.

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  7. Development of an automatic measuring device for total sugar content in chlortetracycline fermenter based on STM32

    Science.gov (United States)

    Liu, Ruochen; Chen, Xiangguang; Yao, Minpu; Huang, Suyi; Ma, Deshou; Zhou, Biao

    2017-01-01

    Because the fermented liquid in a chlortetracycline fermenter has high viscosity and complex composition, conventional instruments cannot directly measure its total sugar content. At present, total sugar content is usually measured by offline manual sampling, which takes considerable time and manpower and introduces lag into the control process. To automate this measurement, we developed an automatic measuring device for total sugar content based on an STM32 microcomputer. It realizes automatic sampling, filtering, and measurement of the fermented liquid as well as automatic washing of the device; it also displays the measurement results in the field and handles data communication. The experimental results show that the automatic measuring device can meet the demands of practical application.

  8. Examples of testing global identifiability of biological and biomedical models with the DAISY software.

    Science.gov (United States)

    Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina

    2010-04-01

    DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification, which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any obtained numerical estimates will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two other examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability, and to the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and on some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/.

  9. Towards automatic musical instrument timbre recognition

    Science.gov (United States)

    Park, Tae Hong

    This dissertation comprises two parts, focused on issues concerning research and development of an artificial system for automatic musical instrument timbre recognition, and on musical compositions. The technical part of the essay includes a detailed record of developed and implemented algorithms for feature extraction and pattern recognition. A review of existing literature is also included, introducing historical aspects surrounding timbre research, problems associated with a number of timbre definitions, and highlights of selected research activities that have had significant impact in this field. The developed timbre recognition system follows a bottom-up, data-driven model that includes a pre-processing module, a feature extraction module, and an RBF/EBF (Radial/Elliptical Basis Function) neural-network-based pattern recognition module. 829 monophonic samples from 12 instruments were chosen from the Peter Siedlaczek library (Best Service) and other samples from the Internet and personal collections. Significant emphasis was put on feature extraction development and testing to achieve robust and consistent feature vectors that are eventually passed to the neural network module. In order to avoid a garbage-in-garbage-out (GIGO) trap and improve generality, extra care was taken in designing and testing the developed algorithms using various dynamics, different playing techniques, and a variety of pitches for each instrument, with inclusion of attack and steady-state portions of a signal. Most of the research and development was conducted in Matlab. The compositional part of the essay includes brief introductions to "A d'Ess Are," "Aboji," "48 13 N, 16 20 O," and "pH-SQ." A general outline pertaining to the ideas and concepts behind the architectural designs of the pieces, including formal structures, time structures, orchestration methods, and pitch structures, is also presented.

  10. The estimation of tax-benefit automatic stabilizers in Serbia: A combined micro-macro approach

    Directory of Open Access Journals (Sweden)

    Ranđelović Saša

    2013-01-01

    Full Text Available The large volatility of GDP due to the economic crisis, particularly in transition economies, has brought the issue of automatic stabilizers back into the focus of economic policy. The vast majority of empirical literature in this field relates to the estimation of the size of automatic stabilizers in developed countries, usually based on macroeconomic data. On the other hand, empirical literature on this topic based on micro data, particularly for transition economies, is limited. This paper provides an evaluation of the size of automatic stabilizers in one transition economy (Serbia), by combining tax-benefit simulation modelling based on micro data with econometric methods based on macroeconomic data. The results show that, in the case of a shock, around 17% of a fall in market income would be absorbed by automatic stabilizers. Although the stabilizing effects of the tax-benefit system in Serbia are lower than in other European countries, the total size of automatic stabilizers is close to the average value in these countries, due to the higher elasticity of demand to income. The results also show that a progressivity-enhancing income tax reform would only slightly increase automatic stabilizers, due to the large informal economy and the large share of agriculture in total household income.
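
    For readers unfamiliar with the measure, a common income stabilization coefficient in this literature is tau = 1 - (sum of changes in disposable income) / (sum of changes in market income), computed over simulated micro data. The sketch below uses hypothetical numbers, not the Serbian data.

        import numpy as np

        def stabilization_coefficient(market_0, disposable_0,
                                      market_1, disposable_1):
            """tau = 1 - sum(delta disposable) / sum(delta market)."""
            d_market = np.sum(market_1 - market_0)
            d_disposable = np.sum(disposable_1 - disposable_0)
            return 1.0 - d_disposable / d_market

        # e.g. a 5% drop in everyone's market income (toy figures):
        m0 = np.array([1000.0, 2000.0, 3000.0])
        d0 = np.array([800.0, 1500.0, 2100.0])   # after taxes and benefits
        m1 = 0.95 * m0
        d1 = np.array([770.0, 1430.0, 1985.0])   # simulated post-shock incomes
        print(stabilization_coefficient(m0, d0, m1, d1))  # share absorbed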

  11. Featured Image: Identifying Weird Galaxies

    Science.gov (United States)

    Kohler, Susanna

    2017-08-01

    Hoag's Object, an example of a ring galaxy. [NASA/Hubble Heritage Team/Ray A. Lucas (STScI/AURA)] The above image shows PanSTARRS observations of some of the 185 galaxies identified in a recent study as ring galaxies: bizarre and rare irregular galaxies that exhibit stars and gas in a ring around a central nucleus. Ring galaxies could form in a number of ways; one theory is that some form in a galaxy collision, when a smaller galaxy punches through the center of a larger one, triggering star formation around the center. In a recent study, Ian Timmis and Lior Shamir of Lawrence Technological University in Michigan explore ways to identify ring galaxies among the overwhelming number of images expected from large upcoming surveys. They develop a computer analysis method that automatically finds ring galaxy candidates based on their visual appearance, and they test their approach on the 3 million galaxy images of the first PanSTARRS data release. To see more of the remarkable galaxies the authors found, and to learn more about their identification method, check out the paper below. Citation: Ian Timmis and Lior Shamir 2017 ApJS 231 2. doi:10.3847/1538-4365/aa78a3

  12. Automatic quantification of iris color

    DEFF Research Database (Denmark)

    Christoffersen, S.; Harder, Stine; Andersen, J. D.

    2012-01-01

    Firstly, the iris is resampled into a standardized quadratic coordinate system, where occluded and invalid regions are masked out. Secondly, a pixel classification approach has been evaluated with good results. It is based on a so-called Markov Random Field spatial classification into dominantly brown and blue...

  13. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    Title 14, Aeronautics and Space; § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and...

  14. Rapid automatic keyword extraction for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J [Richland, WA; Cowley,; E, Wendy [Richland, WA; Crow, Vernon L [Richland, WA; Cramer, Nicholas O [Richland, WA

    2012-03-06

    Methods and systems for rapid automatic keyword extraction for information retrieval and analysis. Embodiments can include parsing words in an individual document by delimiters, stop words, or both in order to identify candidate keywords. Word scores for each word within the candidate keywords are then calculated based on a function of co-occurrence degree, co-occurrence frequency, or both. Based on a function of the word scores for words within the candidate keyword, a keyword score is calculated for each of the candidate keywords. A portion of the candidate keywords are then extracted as keywords based, at least in part, on the candidate keywords having the highest keyword scores.
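
    A minimal sketch of the extraction procedure described above, with a toy stop-word list; the scoring follows the degree/frequency idea but simplifies details of the patented method.

        import re
        from collections import defaultdict

        STOP_WORDS = {"a", "an", "and", "of", "the", "for", "in", "on",
                      "is", "to"}

        def rake(text, top_k=5):
            # Split into candidate phrases at stop words.
            words = re.findall(r"[a-zA-Z']+", text.lower())
            phrases, current = [], []
            for w in words:
                if w in STOP_WORDS:
                    if current:
                        phrases.append(current)
                    current = []
                else:
                    current.append(w)
            if current:
                phrases.append(current)
            # Word score = co-occurrence degree / frequency within phrases.
            freq, degree = defaultdict(int), defaultdict(int)
            for phrase in phrases:
                for w in phrase:
                    freq[w] += 1
                    degree[w] += len(phrase)
            word_score = {w: degree[w] / freq[w] for w in freq}
            # Keyword score = sum of member word scores; keep the best.
            scored = {" ".join(p): sum(word_score[w] for w in p)
                      for p in phrases}
            return sorted(scored, key=scored.get, reverse=True)[:top_k]

        print(rake("Rapid automatic keyword extraction for information "
                   "retrieval and analysis"))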

  15. Automatic cone photoreceptor segmentation using graph theory and dynamic programming.

    Science.gov (United States)

    Chiu, Stephanie J; Lokhnygina, Yuliya; Dubis, Adam M; Dubra, Alfredo; Carroll, Joseph; Izatt, Joseph A; Farsiu, Sina

    2013-06-01

    Geometrical analysis of the photoreceptor mosaic can reveal subclinical ocular pathologies. In this paper, we describe a fully automatic algorithm to identify and segment photoreceptors in adaptive optics ophthalmoscope images of the photoreceptor mosaic. This method is an extension of our previously described closed contour segmentation framework based on graph theory and dynamic programming (GTDP). We validated the performance of the proposed algorithm by comparing it to the state-of-the-art technique on a large data set consisting of over 200,000 cones and posted the results online. We found that the GTDP method achieved a higher detection rate, decreasing the cone miss rate by over a factor of five.

  16. Management of natural resources through automatic cartographic inventory. [France

    Science.gov (United States)

    Rey, P.; Gourinard, Y.; Cambou, F. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. (1) Accurate recognition of previously known ground features from ERTS-1 imagery has been confirmed and a probable detection range for the major signatures can be given. (2) Unidentified elements, however, must be decoded by means of the equal densitometric value zone method. (3) Determination of these zonings involves an analogical treatment of images using the color equidensity methods (pseudo-color), color composites and especially temporal color composite (repetitive superposition). (4) After this analogical preparation, the digital equidensities can be processed by computer in the four MSS bands, according to a series of transfer operations from imagery and automatic cartography.

  17. Automatic Mosaicking of Satellite Imagery Considering the Clouds

    Science.gov (United States)

    Kang, Yifei; Pan, Li; Chen, Qi; Zhang, Tong; Zhang, Shasha; Liu, Zhang

    2016-06-01

    With the rapid development of high-resolution remote sensing for earth observation, satellite imagery is widely used in the fields of resource investigation, environmental protection, and agricultural research. Image mosaicking is an important part of satellite imagery production. However, the presence of clouds causes problems for automatic image mosaicking, mainly in two respects: 1) image blurring may be introduced during the dodging process, and 2) automatically generated seamlines may pass through cloudy areas. To address these problems, an automatic mosaicking method for cloudy satellite imagery is proposed in this paper. Firstly, modified Otsu thresholding and morphological processing are employed to extract cloudy areas and obtain the percentage of cloud cover. Then, the cloud detection results are used to optimize the dodging and mosaicking processes, so that the mosaic image is composed of clear-sky areas rather than cloudy areas, and the clear-sky areas remain sharp and distortion-free. Chinese GF-1 wide-field-of-view orthoimages are employed as experimental data. The performance of the proposed approach is evaluated in four aspects: the effect of cloud detection, the sharpness of clear-sky areas, the rationality of seamlines, and efficiency. The evaluation results demonstrate that the mosaic image obtained by our method has fewer clouds, better internal color consistency, and better visual clarity than that obtained by the traditional method. The time consumed by the proposed method for 17 scenes of GF-1 orthoimages is within 4 hours on a desktop computer, an efficiency that meets general production requirements for massive satellite imagery.
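
    The cloud-extraction step can be sketched with plain Otsu thresholding plus morphological clean-up, as below; the kernel size and single-band uint8 input are illustrative assumptions, not the paper's exact "modified Otsu" procedure.

        import cv2
        import numpy as np

        def cloud_mask(band):
            """band: single-band uint8 image. Returns a binary cloud mask
            and the cloud-cover percentage."""
            # Clouds are bright, so keep pixels above the Otsu threshold.
            _, mask = cv2.threshold(band, 0, 255,
                                    cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            # Opening removes speckle; closing fills small holes.
            kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
            mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
            cover = 100.0 * np.count_nonzero(mask) / mask.size
            return mask, cover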

  18. Field Deployable DNA analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, E; Christian, A; Marion, J; Sorensen, K; Arroyo, E; Vrankovich, G; Hara, C; Nguyen, C

    2005-02-09

    This report details the feasibility of a field-deployable DNA analyzer. Steps for swabbing cells from surfaces and extracting DNA in an automatable way are presented. Since enzymatic amplification reactions are highly sensitive to environmental contamination, sample preparation is a crucial step in making an autonomous deployable instrument. We perform sample clean-up and concentration in a flow-through packed bed. For small initial samples, whole genome amplification is performed in the packed bed, resulting in enough product for subsequent PCR amplification. In addition to DNA, which can be used to identify a subject, protein is also left behind, the analysis of which can be used to determine exposure to certain substances, such as radionuclides. Our preparative step for DNA analysis left behind the protein complement as a waste stream; we sought to determine whether the proteins themselves could be analyzed in a fieldable device. We successfully developed a two-step lateral flow assay for protein analysis and demonstrate a proof-of-principle assay.

  19. Practical automatic Arabic license plate recognition system

    Science.gov (United States)

    Mohammad, Khader; Agaian, Sos; Saleh, Hani

    2011-02-01

    Since the 1970s, the need for automatic license plate recognition systems, sometimes referred to as Automatic License Plate Recognition (ALPR) systems, has been increasing. A license plate recognition system is an automatic system able to recognize a license plate number extracted from image sensors. Specifically, ALPR systems are used in conjunction with various transportation systems in application areas such as law enforcement (e.g., speed limit enforcement) and commercial uses such as parking enforcement, automatic toll payment, private and public entrances, border control, and theft and vandalism control. Vehicle license plate recognition has been intensively studied in many countries; due to the different types of license plates in use, the requirements of an ALPR system differ for each country. Generally, an automatic license plate localization and recognition system is made up of three modules: license plate localization, character segmentation, and optical character recognition. This paper presents an Arabic license plate recognition system that is insensitive to character size, font, shape, and orientation, with an extremely high accuracy rate. The proposed system is based on a combination of enhancement, license plate localization, morphological processing, and feature vector extraction using the Haar transform. The system is fast because the classification of alphabetic and numeric characters exploits the organization of the license plate. Experimental results for license plates from two different Arab countries show an average of 99% successful license plate localization and recognition on a total of more than 20 different images captured in a complex outdoor environment, with run times lower than those of conventional and many state-of-the-art methods.

  20. Automatic Identification of Antibodies in the Protein Data Bank

    Institute of Scientific and Technical Information of China (English)

    LI Xun; WANG Renxiao

    2009-01-01

    An automatic method has been developed for identifying antibody entries in the protein data bank (PDB). Our method, called KIAb (Keyword-based Identification of Antibodies), parses PDB-format files to search for particular keywords relevant to antibodies, and makes a judgment accordingly. Our method identified 780 entries as antibodies across the entire PDB. Among them, 767 entries were confirmed by manual inspection, indicating a high success rate of 98.3%. Our method recovered essentially all of the entries compiled in the Summary of Antibody Crystal Structures (SACS) database. It also identified a number of entries missed by SACS. Our method thus provides a more complete mining of antibody entries in PDB with a very low false positive rate.
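
    In the same spirit, the sketch below scans the header records of PDB-format files for antibody-related terms; the keyword list and the record types inspected are illustrative assumptions, not the exact KIAb rules.

        import glob

        KEYWORDS = ("antibody", "immunoglobulin", "fab fragment",
                    "fv fragment")

        def looks_like_antibody(pdb_path):
            """Scan title/keyword/compound header records of a PDB file
            for antibody-related terms."""
            with open(pdb_path, errors="ignore") as f:
                for line in f:
                    if line.startswith(("TITLE", "KEYWDS", "COMPND")):
                        text = line[10:].lower()
                        if any(k in text for k in KEYWORDS):
                            return True
            return False

        hits = [p for p in glob.glob("pdb/*.pdb") if looks_like_antibody(p)]
        print(f"{len(hits)} candidate antibody entries")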