WorldWideScience

Sample records for fields automatically identified

  1. Automatically identifying gene/protein terms in MEDLINE abstracts.

    Science.gov (United States)

    Yu, Hong; Hatzivassiloglou, Vasileios; Rzhetsky, Andrey; Wilbur, W John

    2002-01-01

    Natural language processing (NLP) techniques are used to extract information automatically from computer-readable literature. In biology, the identification of terms corresponding to biological substances (e.g., genes and proteins) is a necessary step that precedes the application of other NLP systems that extract biological information (e.g., protein-protein interactions, gene regulation events, and biochemical pathways). We have developed GPmarkup (for "gene/protein-full name mark up"), a software system that automatically identifies gene/protein terms (i.e., symbols or full names) in MEDLINE abstracts. As part of the markup process, we also automatically generated a knowledge source of paired gene/protein symbols and full names (e.g., LARD for lymphocyte associated receptor of death) from MEDLINE. We found that many of the pairs in our knowledge source do not appear in the current GenBank database; therefore our methods may also be used for automatic lexicon generation. GPmarkup has 73% recall and 93% precision in identifying and marking up gene/protein terms in MEDLINE abstracts. A random sample of gene/protein symbols and full names and a sample set of marked up abstracts can be viewed at http://www.cpmc.columbia.edu/homepages/yuh9001/GPmarkup/. Contact: hy52@columbia.edu. Voice: 212-939-7028; fax: 212-666-0140.
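
    As a rough illustration of the symbol/full-name pairing idea described above (not the GPmarkup implementation itself), the following Python sketch pulls "full name (SYMBOL)" patterns out of abstract text and keeps pairs whose symbol can be read off the initials of the preceding words; the LARD example from the abstract is used as test input.

      import re

      STOPWORDS = {'of', 'the', 'and', 'for', 'in', 'a', 'an'}

      def extract_symbol_fullname_pairs(text):
          """Illustrative sketch (not the GPmarkup algorithm): find patterns of
          the form 'full name (SYMBOL)' and keep pairs whose symbol can be read
          off the initial letters of the preceding words, skipping short
          function words, as in 'lymphocyte associated receptor of death (LARD)'."""
          pairs = []
          for match in re.finditer(r'\(([A-Z][A-Za-z0-9-]{1,9})\)', text):
              symbol = match.group(1)
              words = re.findall(r'[A-Za-z-]+', text[:match.start()])[-10:]
              for start in range(len(words)):
                  candidate = words[start:]
                  content = [w for w in candidate if w.lower() not in STOPWORDS]
                  initials = ''.join(w[0].lower() for w in content)
                  if content and initials == symbol.lower():
                      pairs.append((symbol, ' '.join(candidate)))
                      break
          return pairs

      abstract = ("Apoptosis can be mediated by the lymphocyte associated "
                  "receptor of death (LARD) in activated T cells.")
      print(extract_symbol_fullname_pairs(abstract))
      # [('LARD', 'lymphocyte associated receptor of death')]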

  2. Field Robotics in Sports: Automatic Generation of guidance Lines for Automatic Grass Cutting, Striping and Pitch Marking of Football Playing Fields

    Directory of Open Access Journals (Sweden)

    Ole Green

    2011-03-01

    Progress is constantly being made and new applications are constantly emerging in the area of field robotics. In this paper, a promising application of field robotics to football playing fields is introduced. An algorithmic approach is presented for generating the way points required to guide a GPS-based field robot through a football playing field to automatically carry out periodical tasks such as cutting the grass, pitch and line marking illustrations and lawn striping. The manual operation of these tasks requires very skilful personnel able to work for long hours with very high concentration for the football pitch to be compatible with the standards of the Federation Internationale de Football Association (FIFA). On the other hand, a GPS-guided vehicle or robot with three implements (grass mower, lawn striping roller and track marking illustrator) is capable of working 24 h a day, in most weather and in harsh soil conditions without loss of quality. The proposed approach for the automatic operation of football playing fields requires no or very limited human intervention; it therefore saves numerous working hours and frees workers to focus on other tasks. An economic feasibility study showed that the proposed method is economically superior to the current manual practices.
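
    A minimal Python sketch of the way-point generation idea (not the paper's algorithm; the pitch dimensions, implement width and overlap below are assumptions) that lays out parallel back-and-forth guidance lines across a rectangular pitch:

      def guidance_waypoints(length_m=105.0, width_m=68.0,
                             implement_width_m=1.2, overlap_m=0.1):
          """Minimal sketch of guidance-line generation for parallel passes over
          a rectangular pitch. Returns an ordered list of (x, y) way points in a
          local field frame, driving back and forth along the pitch length with
          a spacing set by the implement width minus the overlap."""
          spacing = implement_width_m - overlap_m
          waypoints = []
          y = spacing / 2.0
          line = 0
          while y < width_m:
              if line % 2 == 0:        # drive "up" the pitch on even lines
                  waypoints.extend([(0.0, y), (length_m, y)])
              else:                    # drive back "down" on odd lines
                  waypoints.extend([(length_m, y), (0.0, y)])
              y += spacing
              line += 1
          return waypoints

      wp = guidance_waypoints()
      print(len(wp) // 2, "passes, first pass:", wp[:2])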

  3. Wide-Field Imaging Telescope-0 (WIT0) with automatic observing system

    Science.gov (United States)

    Ji, Tae-Geun; Byeon, Seoyeon; Lee, Hye-In; Park, Woojin; Lee, Sang-Yun; Hwang, Sungyong; Choi, Changsu; Gibson, Coyne Andrew; Kuehne, John W.; Prochaska, Travis; Marshall, Jennifer L.; Im, Myungshin; Pak, Soojong

    2018-01-01

    We introduce Wide-Field Imaging Telescope-0 (WIT0), with an automatic observing system. It is developed for monitoring the variabilities of many sources at a time, e.g. young stellar objects and active galactic nuclei. It can also find the locations of transient sources such as supernovae or gamma-ray bursts. In 2017 February, we installed the wide-field 10-inch telescope (Takahashi CCA-250) as a piggyback system on the 30-inch telescope at the McDonald Observatory in Texas, US. The 10-inch telescope has a 2.35 × 2.35 deg field-of-view with a 4k × 4k CCD camera (FLI ML16803). To improve the observational efficiency of the system, we developed a new automatic observing software, KAOS30 (KHU Automatic Observing Software for the McDonald 30-inch telescope), written in Visual C++ for the Windows operating system. The software consists of four control packages: the Telescope Control Package (TCP), the Data Acquisition Package (DAP), the Auto Focus Package (AFP), and the Script Mode Package (SMP). Since it also supports instruments that use the ASCOM driver, additional hardware installation is greatly simplified. We commissioned KAOS30 in 2017 August and are in the process of testing. Based on the WIT0 experience, we will extend KAOS30 to control multiple telescopes in future projects.

  4. Markov random field based automatic image alignment for electron tomography.

    Science.gov (United States)

    Amat, Fernando; Moussavi, Farshid; Comolli, Luis R; Elidan, Gal; Downing, Kenneth H; Horowitz, Mark

    2008-03-01

    We present a method for automatic full-precision alignment of the images in a tomographic tilt series. Full-precision automatic alignment of cryo-electron microscopy images has remained a difficult challenge to date, due to the limited electron dose and low image contrast. These facts lead to a poor signal-to-noise ratio (SNR) in the images, which causes automatic feature trackers to generate errors, even with high contrast gold particles as fiducial features. To enable fully automatic alignment for full-precision reconstructions, we frame the problem probabilistically as finding the most likely particle tracks given a set of noisy images, using contextual information to make the solution more robust to the noise in each image. To solve this maximum likelihood problem, we use Markov Random Fields (MRF) to establish the correspondence of features in alignment and robust optimization for projection model estimation. The resulting algorithm, called Robust Alignment and Projection Estimation for Tomographic Reconstruction, or RAPTOR, has not needed any manual intervention for the difficult datasets we have tried, and has provided sub-pixel alignment that is as good as the manual approach by an expert user. We are able to automatically map complete and partial marker trajectories and thus obtain highly accurate image alignment. Our method has been applied to challenging cryo-electron tomographic datasets with low SNR from intact bacterial cells, as well as several plastic section and X-ray datasets.

  5. Analogue particle identifier and test unit for automatic measuring of errors

    International Nuclear Information System (INIS)

    Boden, A.; Lauch, J.

    1979-04-01

    A high accuracy analogue particle identifier is described. The unit is used for particle identification or for data correction of experiment-based errors in magnetic spectrometers. Signals which are proportional to the energy, the time-of-flight or the position of absorption of the particles are supplied to an analogue computation circuit (multifunction converter). Three computation functions are available for different applications. The output of the identifier produces correction signals or pulses whose amplitudes are proportional to the mass of the particles. Particle identification and data correction can be optimized by the adjustment of variable parameters. An automatic test unit has been developed for adjustment and routine checking of particle identifiers. The computation functions can be tested by this unit with an accuracy of 1%. (orig.)

  6. Performance Modelling of Automatic Identification System with Extended Field of View

    DEFF Research Database (Denmark)

    Lauersen, Troels; Mortensen, Hans Peter; Pedersen, Nikolaj Bisgaard

    2010-01-01

    This paper deals with AIS (Automatic Identification System) behavior, to investigate the severity of packet collisions in an extended field of view (FOV). This is an important issue for satellite-based AIS, and the main goal is a feasibility study to find out to what extent an increased FOV...

  7. Automatic NMR field-frequency lock-pulsed phase locked loop approach.

    Science.gov (United States)

    Kan, S; Gonord, P; Fan, M; Sauzade, M; Courtieu, J

    1978-06-01

    A self-contained deuterium frequency-field lock scheme for a high-resolution NMR spectrometer is described. It is based on phase locked loop techniques in which the free induction decay signal behaves as a voltage-controlled oscillator. By pulsing the spins at an offset frequency of a few hundred hertz and using a digital phase-frequency discriminator, this method not only eliminates the usual phase, rf power, and offset adjustments needed in conventional lock systems but also possesses the automatic pull-in characteristics that dispense with the use of field sweeps to locate the NMR line prior to closure of the lock loop.

  8. A comparison of coronal mass ejections identified by manual and automatic methods

    Directory of Open Access Journals (Sweden)

    S. Yashiro

    2008-10-01

    Coronal mass ejections (CMEs) are related to many phenomena (e.g. flares, solar energetic particles, geomagnetic storms), thus compiling event catalogs is important for a global understanding of these phenomena. CMEs have been identified manually for a long time, but in the SOHO era, automatic identification methods are being developed. In order to clarify the advantages and disadvantages of the manual and automatic CME catalogs, we examined the distributions of CME properties listed in the CDAW (manual) and CACTus (automatic) catalogs. Both catalogs show good agreement on the wide CMEs (width>120°) in their properties, while there is a significant discrepancy for the narrow CMEs (width≤30°): CACTus has a larger number of narrow CMEs than CDAW. We carried out an event-by-event examination of a sample of events and found that the CDAW catalog has missed many narrow CMEs during the solar maximum. Another significant discrepancy was found for the fast CMEs (speed>1000 km/s): the majority of the fast CDAW CMEs are wide and originate from low latitudes, while the fast CACTus CMEs are narrow and originate from all latitudes. Event-by-event examination of a sample of events suggests that CACTus has a problem with the detection of the fast CMEs.

  9. An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.

    Science.gov (United States)

    Saccomani, Maria Pia

    2011-08-01

    Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.

  10. Automatic alignment device for focal spot measurements in the center of the field for mammography

    International Nuclear Information System (INIS)

    Vieira, Marcelo A.C.; Watanabe, Alex O.; Oliveira Junior, Paulo D.; Schiabel, Homero

    2010-01-01

    Some quality control procedures used in mammography, such as focal spot evaluation, require prior alignment of the measurement equipment with the X-ray central beam. However, alignment procedures are, in general, the most difficult task and the one that needs more time to be performed. Moreover, the operator is sometimes exposed to radiation during this procedure. This work presents an automatic alignment system for mammographic equipment that locates the central ray of the radiation beam and immediately aligns with it by moving itself automatically along the field. The system consists of a bidirectional moving device connected to a CCD sensor for digital radiographic image acquisition. A computational analysis of a radiographic image, acquired at any position in the field, is performed in order to determine its position under the X-ray beam. Finally, a mechanical system with two directions of movement, electronically controlled by a microcontroller over USB communication, makes the system align automatically with the central ray of the radiation beam. The alignment process is fully automatic, fast and accurate, with no operator exposure to radiation, which allows considerable time savings in performing quality control procedures for mammography. (author)
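
    The positioning step can be pictured with a short Python sketch (an illustration, not the authors' software): the centre of the radiation field is taken as the intensity-weighted centroid of the image, and the offset from the sensor centre gives the stage displacement. The pixel pitch and the synthetic test image are assumptions.

      import numpy as np

      def stage_offset_mm(image, pixel_pitch_mm=0.05):
          """Locate the centre of the radiation field as the intensity-weighted
          centroid of the image and return the (dx, dy) displacement, in mm,
          that a two-axis stage would move to centre the detector on it."""
          img = image.astype(float)
          img -= img.min()                       # remove background offset
          rows, cols = np.indices(img.shape)
          total = img.sum()
          cy = (rows * img).sum() / total        # centroid row
          cx = (cols * img).sum() / total        # centroid column
          centre_y, centre_x = (np.array(img.shape) - 1) / 2.0
          return ((cx - centre_x) * pixel_pitch_mm,
                  (cy - centre_y) * pixel_pitch_mm)

      # Synthetic example: a bright field displaced towards one corner of the sensor.
      test = np.zeros((200, 200))
      test[120:180, 130:190] = 100.0
      print(stage_offset_mm(test))   # positive dx, dy -> move towards that corner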

  11. Systems and methods for automatically identifying and linking names in digital resources

    Science.gov (United States)

    Parker, Charles T.; Lyons, Catherine M.; Roston, Gerald P.; Garrity, George M.

    2017-06-06

    The present invention provides systems and methods for automatically identifying name-like-strings in digital resources, matching these name-like-strings against a set of names held in an expertly curated database, and, for those name-like-strings found in said database, enhancing the content by associating additional matter with the name, wherein said matter includes information about the names that is held within said database and pointers to other digital resources which include the same name and its synonyms.

  12. Automatic fault extraction using a modified ant-colony algorithm

    International Nuclear Information System (INIS)

    Zhao, Junsheng; Sun, Sam Zandong

    2013-01-01

    Automatic fault extraction is based on seismic attributes such as the coherence cube, in which a fault is commonly identified by the minimum value. The biggest challenge in automatic fault extraction is noise, including that of the seismic data. However, a fault has better spatial continuity in a certain direction, which makes it quite different from noise. Considering this characteristic, a modified ant-colony algorithm is introduced into automatic fault identification and tracking, where the gradient direction and direction consistency are used as constraints. Numerical model test results show that this method is feasible and effective in automatic fault extraction and noise suppression. Application to field data further illustrates its validity and superiority. (paper)

  13. Exploratory field trial of motorcycle autonomous emergency braking (MAEB): Considerations on the acceptability of unexpected automatic decelerations.

    Science.gov (United States)

    Savino, Giovanni; Pierini, Marco; Thompson, Jason; Fitzharris, Michael; Lenné, Michael G

    2016-11-16

    Autonomous emergency braking (AEB) acts to slow down a vehicle when an unavoidable impending collision is detected. In addition to documented benefits when applied to passenger cars, AEB has also shown potential when applied to motorcycles (MAEB). However, the feasibility of MAEB as practically applied to motorcycles in the real world is not well understood. In this study we performed a field trial involving 16 riders on a test motorcycle subjected to automatic decelerations, thus simulating MAEB activation. The tests were conducted along a rectilinear path at a nominal speed of 40 km/h and with a mean deceleration of 0.15 g (15% of full braking) deployed at random times. Riders were also exposed to one final undeclared brake activation with the aim of providing genuinely unexpected automatic braking events. Participants were consistently able to manage automatic decelerations of the vehicle with minor to moderate effort. Results of undeclared activations were consistent with those of standard runs. This study demonstrated the feasibility of a moderate automatic deceleration in a scenario of a motorcycle travelling on a straight path, supporting the notion that the application of AEB to motorcycles is practicable. Furthermore, the proposed field trial can be used as a reference for future regulation or consumer tests in order to address the safety and acceptability of unexpected automatic decelerations on a motorcycle.

  14. Difficulty identifying feelings and automatic activation in the fusiform gyrus in response to facial emotion.

    Science.gov (United States)

    Eichmann, Mischa; Kugel, Harald; Suslow, Thomas

    2008-12-01

    Difficulties in identifying and differentiating one's emotions are a central characteristic of alexithymia. In the present study, automatic activation of the fusiform gyrus to facial emotion was investigated as a function of alexithymia as assessed by the 20-item Toronto Alexithymia Scale. During 3 Tesla fMRI scanning, pictures of faces bearing sad, happy, and neutral expressions masked by neutral faces were presented to 22 healthy adults who also responded to the Toronto Alexithymia Scale. The fusiform gyrus was selected as the region of interest, and voxel values of this region were extracted, summarized as means, and tested among the different conditions (sad, happy, and neutral faces). Masked sad facial emotions were associated with greater bilateral activation of the fusiform gyrus than masked neutral faces. The subscale, Difficulty Identifying Feelings, was negatively correlated with the neural response of the fusiform gyrus to masked sad faces. The correlation results suggest that automatic hyporesponsiveness of the fusiform gyrus to negative emotion stimuli may reflect problems in recognizing one's emotions in everyday life.

  15. Automatic plasma control in magnetic traps

    International Nuclear Information System (INIS)

    Samojlenko, Y.; Chuyanov, V.

    1984-01-01

    Hot plasma is essentially in thermodynamic non-steady state. Automatic plasma control basically means monitoring deviations from steady state and producing a suitable magnetic or electric field which brings the plasma back to its original state. Briefly described are two systems of automatic plasma control: control with a magnetic field using a negative impedance circuit, and control using an electric field. It appears that systems of automatic plasma stabilization will be an indispensable component of the fusion reactor and its possibilities will in many ways determine the reactor economy. (Ha)

  16. Automatic detection of diabetic retinopathy features in ultra-wide field retinal images

    Science.gov (United States)

    Levenkova, Anastasia; Sowmya, Arcot; Kalloniatis, Michael; Ly, Angelica; Ho, Arthur

    2017-03-01

    Diabetic retinopathy (DR) is a major cause of irreversible vision loss. DR screening relies on retinal clinical signs (features). Opportunities for computer-aided DR feature detection have emerged with the development of Ultra-WideField (UWF) digital scanning laser technology. UWF imaging covers 82% greater retinal area (200°), against 45° in conventional cameras, allowing more clinically relevant retinopathy to be detected. UWF images also provide a high resolution of 3078 x 2702 pixels. Currently DR screening uses 7 overlapping conventional fundus images, and the UWF images provide similar results. However, in 40% of cases, more retinopathy was found outside the 7-field ETDRS fields by UWF, and in 10% of cases retinopathy was reclassified as more severe. This is because UWF imaging allows examination of both the central retina and more peripheral regions, with the latter implicated in DR. We have developed an algorithm for automatic recognition of DR features, including bright (cotton wool spots and exudates) and dark lesions (microaneurysms and blot, dot and flame haemorrhages) in UWF images. The algorithm extracts features from grayscale (green "red-free" laser light) and colour-composite UWF images, including intensity, histogram-of-gradient and local binary pattern features. Pixel-based classification is performed with three different classifiers. The main contribution is the automatic detection of DR features in the peripheral retina. The method is evaluated by leave-one-out cross-validation on 25 UWF retinal images with 167 bright lesions, and 61 other images with 1089 dark lesions. The SVM classifier performs best, with AUC of 94.4% / 95.31% for bright / dark lesions.
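
    A hedged Python sketch of this kind of pipeline (synthetic patches and a simplified feature set; not the authors' code): intensity, histogram-of-gradient and local-binary-pattern features feed an SVM that is evaluated by leave-one-out cross-validation.

      import numpy as np
      from skimage.feature import hog, local_binary_pattern
      from sklearn.svm import SVC
      from sklearn.model_selection import LeaveOneOut, cross_val_score

      def patch_features(patch):
          """Feature vector for one grayscale patch: mean/std intensity, a
          histogram-of-gradient descriptor and a uniform-LBP histogram (a
          simplified stand-in for the feature set described above)."""
          hog_vec = hog(patch, pixels_per_cell=(16, 16), cells_per_block=(2, 2))
          lbp = local_binary_pattern(patch, P=8, R=1, method='uniform')
          lbp_hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
          return np.concatenate([[patch.mean(), patch.std()], hog_vec, lbp_hist])

      # Synthetic stand-in data: 64x64 patches labelled lesion (1) / background (0).
      rng = np.random.default_rng(0)
      patches = rng.integers(0, 256, size=(30, 64, 64)).astype(np.uint8)
      labels = rng.integers(0, 2, 30)
      X = np.array([patch_features(p) for p in patches])

      # Leave-one-out cross-validation with an SVM, as in the evaluation above.
      scores = cross_val_score(SVC(kernel='rbf', C=1.0), X, labels, cv=LeaveOneOut())
      print("LOO accuracy:", scores.mean())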

  17. Bianchi identities and the automatic conservation of energy-momentum and angular momentum in general-relativistic field theories

    International Nuclear Information System (INIS)

    Hehl, F.W.; McCrea, J.D.

    1986-01-01

    Automatic conservation of energy-momentum and angular momentum is guaranteed in a gravitational theory if, via the field equations, the conservation laws for the material currents are reduced to the contracted Bianchi identities. We first execute an irreducible decomposition of the Bianchi identities in a Riemann-Cartan space-time. Then, starting from a Riemannian space-time with or without torsion, we determine those gravitational theories which have automatic conservation: general relativity and the Einstein-Cartan-Sciama-Kibble theory, both with cosmological constant, and the nonviable pseudoscalar model. The Poincaré gauge theory of gravity, like gauge theories of internal groups, has no automatic conservation in the sense defined above. This does not lead to any difficulties in principle. Analogies to 3-dimensional continuum mechanics are stressed throughout the article.

  18. Bianchi identities and the automatic conservation of energy-momentum and angular momentum in general-relativistic field theories

    Science.gov (United States)

    Hehl, Friedrich W.; McCrea, J. Dermott

    1986-03-01

    Automatic conservation of energy-momentum and angular momentum is guaranteed in a gravitational theory if, via the field equations, the conservation laws for the material currents are reduced to the contracted Bianchi identities. We first execute an irreducible decomposition of the Bianchi identities in a Riemann-Cartan space-time. Then, starting from a Riemannian space-time with or without torsion, we determine those gravitational theories which have automatic conservation: general relativity and the Einstein-Cartan-Sciama-Kibble theory, both with cosmological constant, and the nonviable pseudoscalar model. The Poincaré gauge theory of gravity, like gauge theories of internal groups, has no automatic conservation in the sense defined above. This does not lead to any difficulties in principle. Analogies to 3-dimensional continuum mechanics are stressed throughout the article.

  19. Automatic address validation and health record review to identify homeless Social Security disability applicants.

    Science.gov (United States)

    Erickson, Jennifer; Abbott, Kenneth; Susienka, Lucinda

    2018-06-01

    Homeless patients face a variety of obstacles in pursuit of basic social services. Acknowledging this, the Social Security Administration directs employees to prioritize homeless patients and handle their disability claims with special care. However, under existing manual processes for identification of homelessness, many homeless patients never receive the special service to which they are entitled. In this paper, we explore address validation and automatic annotation of electronic health records to improve identification of homeless patients. We developed a sample of claims containing medical records at the moment of arrival in a single office. Using address validation software, we reconciled patient addresses with public directories of homeless shelters, veterans' hospitals and clinics, and correctional facilities. Other tools annotated electronic health records. We trained random forests to identify homeless patients and validated each model with 10-fold cross validation. For our finished model, the area under the receiver operating characteristic curve was 0.942. The random forest improved sensitivity from 0.067 to 0.879 but decreased positive predictive value to 0.382. Presumed false positive classifications bore many characteristics of homelessness. Organizations could use these methods to prompt early collection of information necessary to avoid labor-intensive attempts to reestablish contact with homeless individuals. Annually, such methods could benefit tens of thousands of patients who are homeless, destitute, and in urgent need of assistance. We were able to identify many more homeless patients through a combination of automatic address validation and natural language processing of unstructured electronic health records. Copyright © 2018. Published by Elsevier Inc.
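
    The modelling step can be sketched as follows (synthetic data and hypothetical feature columns, not the authors' claims data): a random forest validated with 10-fold cross-validation and scored by area under the ROC curve.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import StratifiedKFold, cross_val_score

      # Hypothetical feature matrix: one row per claim, columns such as
      # [address matches shelter directory, address matches correctional
      #  facility, count of homelessness-related terms in the health record].
      rng = np.random.default_rng(1)
      X = rng.random((500, 3))
      y = (X[:, 0] + 0.5 * X[:, 2] + 0.3 * rng.random(500) > 1.0).astype(int)

      # Random forest evaluated with 10-fold cross-validation on ROC AUC,
      # mirroring the validation strategy described above (data is synthetic).
      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
      auc = cross_val_score(clf, X, y, cv=cv, scoring='roc_auc')
      print("mean AUC:", auc.mean())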

  20. Sensitivity-based virtual fields for the non-linear virtual fields method

    Science.gov (United States)

    Marek, Aleksander; Davis, Frances M.; Pierron, Fabrice

    2017-09-01

    The virtual fields method is an approach to inversely identify material parameters using full-field deformation data. In this manuscript, a new set of automatically-defined virtual fields for non-linear constitutive models has been proposed. These new sensitivity-based virtual fields reduce the influence of noise on the parameter identification. The sensitivity-based virtual fields were applied to a numerical example involving small strain plasticity; however, the general formulation derived for these virtual fields is applicable to any non-linear constitutive model. To quantify the improvement offered by these new virtual fields, they were compared with stiffness-based and manually defined virtual fields. The proposed sensitivity-based virtual fields were consistently able to identify plastic model parameters and outperform the stiffness-based and manually defined virtual fields when the data was corrupted by noise.

  1. Developing Automatic Water Table Control System for Reducing Greenhouse Gas Emissions from Paddy Fields

    Science.gov (United States)

    Arif, C.; Fauzan, M. I.; Satyanto, K. S.; Budi, I. S.; Masaru, M.

    2018-05-01

    The water table in rice fields plays an important role in mitigating greenhouse gas (GHG) emissions from paddy fields. Continuous flooding, with the water table maintained 2-5 cm above the soil surface, is not effective and releases more GHG emissions. The System of Rice Intensification (SRI), an alternative rice farming practice, applies intermittent irrigation by maintaining a lower water table and has been shown to reduce GHG emissions without reducing productivity significantly. The objectives of this study were to develop an automatic water table control system for SRI application and then evaluate its performance. The control system was developed based on fuzzy logic algorithms using the Raspberry Pi mini PC. Based on laboratory and field tests, the developed system worked well, as indicated by low MAPE (mean absolute percentage error) values. MAPE values for simulation and field tests were 16.88% and 15.80%, respectively. This system can save irrigation water up to 42.54% without reducing productivity significantly when compared to manual irrigation systems.
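
    A simplified Python sketch of the evaluation and control idea (crisp threshold rules stand in for the paper's fuzzy logic controller; the setpoint and readings are assumptions): MAPE compares observed and target water-table levels, and a rule maps the current level to an irrigation action.

      def mape(observed, target):
          """Mean absolute percentage error between observed and target levels."""
          pairs = [(o, t) for o, t in zip(observed, target) if t != 0]
          return 100.0 * sum(abs(o - t) / abs(t) for o, t in pairs) / len(pairs)

      def irrigation_action(level_cm, setpoint_cm=-5.0, deadband_cm=1.0):
          """Crisp stand-in for the fuzzy controller: open the inlet when the
          water table falls too far below the SRI setpoint, drain when it rises
          too far above it, otherwise hold."""
          error = level_cm - setpoint_cm
          if error < -deadband_cm:
              return "irrigate"
          if error > deadband_cm:
              return "drain"
          return "hold"

      observed = [-6.2, -4.1, -5.5, -3.8]   # sensor readings, cm relative to soil surface
      target = [-5.0, -5.0, -5.0, -5.0]
      print(mape(observed, target), [irrigation_action(x) for x in observed])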

  2. A new type industrial total station based on target automatic collimation

    Science.gov (United States)

    Lao, Dabao; Zhou, Weihu; Ji, Rongyi; Dong, Dengfeng; Xiong, Zhi; Wei, Jiang

    2018-01-01

    For industrial field measurement, present measuring instruments rely on manual operation and collimation, which leads to low efficiency. In order to solve this problem, a new type of industrial total station is presented in this paper. The new instrument can identify and trace a cooperative target automatically while the coordinate of the target is measured in real time. Realizing the system requires key technologies including high-precision absolute distance measurement, small-size high-accuracy angle measurement, vision-based automatic target collimation, and fast precise control. After customized system assembly and adjustment, the new type of industrial total station was established. As the experiments demonstrated, the coordinate accuracy of the instrument is under 15 ppm at a distance of 60 m, which proves that the measuring system is feasible. The results show that the total station can satisfy most industrial field measurement requirements.
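
    Once the distance and the two angles are measured, the target coordinate follows from the standard polar-to-Cartesian conversion; the short Python sketch below illustrates this step only (the tracking and collimation parts are not shown, and the example readings are assumptions).

      import math

      def polar_to_xyz(slope_distance_m, horizontal_angle_deg, zenith_angle_deg):
          """Standard conversion from total-station observations (slope distance,
          horizontal circle reading, zenith angle) to local Cartesian coordinates."""
          az = math.radians(horizontal_angle_deg)
          zen = math.radians(zenith_angle_deg)
          horizontal = slope_distance_m * math.sin(zen)
          return (horizontal * math.sin(az),          # East
                  horizontal * math.cos(az),          # North
                  slope_distance_m * math.cos(zen))   # Up

      # A target 60 m away, slightly above the horizon, bearing 30 degrees.
      print(polar_to_xyz(60.0, 30.0, 88.0))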

  3. Parallel computation of automatic differentiation applied to magnetic field calculations

    International Nuclear Information System (INIS)

    Hinkins, R.L.; Lawrence Berkeley Lab., CA

    1994-09-01

    The author presents a parallelization of an accelerator physics application to simulate magnetic fields in three dimensions. The problem involves the evaluation of high order derivatives with respect to two variables of a multivariate function. Automatic differentiation software had been used with some success, but the computation time was prohibitive. The implementation runs on several platforms, including a network of workstations using PVM, a MasPar using MPFortran, and a CM-5 using CMFortran. A careful examination of the code led to several optimizations that improved its serial performance by a factor of 8.7. The parallelization produced further improvements, especially on the MasPar, with a speedup factor of 620. As a result, a problem that took six days on a SPARC 10/41 now runs in minutes on the MasPar, making it feasible for physicists at Lawrence Berkeley Laboratory to simulate larger magnets.
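
    The kind of quantity involved can be illustrated with symbolic differentiation in Python/SymPy (the paper used dedicated automatic-differentiation software and a different field model; the function below is a toy stand-in): high-order mixed partial derivatives of a multivariate function with respect to two variables.

      import sympy as sp

      # Toy scalar field standing in for the magnetic field expansion.
      x, y, z = sp.symbols('x y z')
      B = sp.sin(x * y) * sp.exp(-z**2) + x**3 * y**2

      order_x, order_y = 4, 3
      d = sp.diff(B, x, order_x, y, order_y)       # d^7 B / dx^4 dy^3
      d_func = sp.lambdify((x, y, z), d, 'math')   # numeric evaluation
      print(sp.simplify(d))
      print(d_func(0.3, 0.2, 0.1))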

  4. Automatic Recognition of Chinese Personal Name Using Conditional Random Fields and Knowledge Base

    Directory of Open Access Journals (Sweden)

    Chuan Gu

    2015-01-01

    According to the features of Chinese personal names, we present an approach for Chinese personal name recognition based on conditional random fields (CRF) and a knowledge base in this paper. The method builds multiple features of the CRF model by adopting the Chinese character as the processing unit, selects useful features based on a knowledge-base selection algorithm and an incremental feature template, and finally implements the automatic recognition of Chinese personal names from Chinese documents. The experimental results on an open real corpus demonstrate the effectiveness of our method, which obtained high precision and recall rates of recognition.
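
    A minimal character-level CRF sketch in Python, assuming the sklearn-crfsuite package (the toy corpus, feature set and B/I/O labelling below are illustrative, not the paper's features or knowledge base):

      import sklearn_crfsuite

      def char_features(sentence, i):
          # Character identity plus immediate context, one dict per character.
          feats = {'char': sentence[i], 'is_first': i == 0,
                   'is_last': i == len(sentence) - 1}
          if i > 0:
              feats['prev_char'] = sentence[i - 1]
          if i < len(sentence) - 1:
              feats['next_char'] = sentence[i + 1]
          return feats

      # Toy corpus: B/I mark characters inside a personal name, O marks all others.
      sentences = ["张伟去了北京", "李明和王芳见面"]
      labels = [["B", "I", "O", "O", "O", "O"], ["B", "I", "O", "B", "I", "O", "O"]]

      X = [[char_features(s, i) for i in range(len(s))] for s in sentences]
      crf = sklearn_crfsuite.CRF(algorithm='lbfgs', c1=0.1, c2=0.1, max_iterations=50)
      crf.fit(X, labels)
      print(crf.predict([[char_features("张伟在上海", i) for i in range(5)]]))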

  5. The automatic component of habit in health behavior: habit as cue-contingent automaticity.

    Science.gov (United States)

    Orbell, Sheina; Verplanken, Bas

    2010-07-01

    Habit might be usefully characterized as a form of automaticity that involves the association of a cue and a response. Three studies examined habitual automaticity in regard to different aspects of the cue-response relationship characteristic of unhealthy and healthy habits. In each study, habitual automaticity was assessed by the Self-Report Habit Index (SRHI). In Study 1 SRHI scores correlated with attentional bias to smoking cues in a Stroop task. Study 2 examined the ability of a habit cue to elicit an unwanted habit response. In a prospective field study, habitual automaticity in relation to smoking when drinking alcohol in a licensed public house (pub) predicted the likelihood of cigarette-related action slips 2 months later after smoking in pubs had become illegal. In Study 3 experimental group participants formed an implementation intention to floss in response to a specified situational cue. Habitual automaticity of dental flossing was rapidly enhanced compared to controls. The studies provided three different demonstrations of the importance of cues in the automatic operation of habits. Habitual automaticity assessed by the SRHI captured aspects of a habit that go beyond mere frequency or consistency of the behavior. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  6. An automatic taxonomy of galaxy morphology using unsupervised machine learning

    Science.gov (United States)

    Hocking, Alex; Geach, James E.; Sun, Yi; Davey, Neil

    2018-01-01

    We present an unsupervised machine learning technique that automatically segments and labels galaxies in astronomical imaging surveys using only pixel data. Distinct from previous unsupervised machine learning approaches used in astronomy, we use no pre-selection or pre-filtering of target galaxy type to identify galaxies that are similar. We demonstrate the technique on the Hubble Space Telescope (HST) Frontier Fields. By training the algorithm using galaxies from one field (Abell 2744) and applying the result to another (MACS 0416.1-2403), we show how the algorithm can cleanly separate early and late type galaxies without any form of pre-directed training for what an 'early' or 'late' type galaxy is. We then apply the technique to the HST Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS) fields, creating a catalogue of approximately 60 000 classifications. We show how the automatic classification groups galaxies of similar morphological (and photometric) type and make the classifications public via a catalogue, a visual catalogue and a galaxy similarity search. We compare the CANDELS machine-based classifications to human classifications from the Galaxy Zoo: CANDELS project. Although there is not a direct mapping between Galaxy Zoo and our hierarchical labelling, we demonstrate a good level of concordance between human and machine classifications. Finally, we show how the technique can be used to identify rarer objects and present lensed galaxy candidates from the CANDELS imaging.
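
    As a much-simplified illustration of unsupervised, pixel-only grouping (k-means on a few hand-crafted features stands in for the paper's technique; the postage stamps are synthetic):

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import StandardScaler

      def galaxy_feature_vector(stamp):
          """Simple per-object features from pixel data only: total flux,
          concentration (core / total flux), flux-weighted size and peak
          fraction. An illustrative stand-in for richer pixel features."""
          total = stamp.sum()
          h, w = stamp.shape
          core = stamp[h // 4: 3 * h // 4, w // 4: 3 * w // 4].sum()
          ys, xs = np.indices(stamp.shape)
          cy, cx = (ys * stamp).sum() / total, (xs * stamp).sum() / total
          r2 = ((ys - cy) ** 2 + (xs - cx) ** 2) * stamp
          return [total, core / total, np.sqrt(r2.sum() / total), stamp.max() / total]

      # Synthetic postage stamps standing in for sources detected in a survey image.
      rng = np.random.default_rng(2)
      stamps = rng.random((100, 32, 32)) + 1e-3
      X = StandardScaler().fit_transform([galaxy_feature_vector(s) for s in stamps])

      # Unsupervised grouping into morphological classes.
      labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
      print(np.bincount(labels))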

  7. Optical Automatic Car Identification (OACI) Field Test Program

    Science.gov (United States)

    1976-05-01

    The results of the Optical Automatic Car Identification (OACI) tests at Chicago conducted from August 16 to September 4, 1975 are presented. The main purpose of this test was to determine the suitability of optics as a principle of operation for an a...

  8. Application of an automatic cloud tracking technique to Meteosat water vapor and infrared observations

    Science.gov (United States)

    Endlich, R. M.; Wolf, D. E.

    1980-01-01

    The automatic cloud tracking system was applied to METEOSAT 6.7 micrometers water vapor measurements to learn whether the system can track the motions of water vapor patterns. Data for the midlatitudes, subtropics, and tropics were selected from a sequence of METEOSAT pictures for 25 April 1978. Trackable features in the water vapor patterns were identified using a clustering technique and the features were tracked by two different methods. In flat (low contrast) water vapor fields, the automatic motion computations were not reliable, but in areas where the water vapor fields contained small scale structure (such as in the vicinity of active weather phenomena) the computations were successful. Cloud motions were computed using METEOSAT infrared observations (including tropical convective systems and midlatitude jet stream cirrus).

  9. Development of Software for Automatic Analysis of Intervention in the Field of Homeopathy.

    Science.gov (United States)

    Jain, Rajesh Kumar; Goyal, Shagun; Bhat, Sushma N; Rao, Srinath; Sakthidharan, Vivek; Kumar, Prasanna; Sajan, Kannanaikal Rappayi; Jindal, Sameer Kumar; Jindal, Ghanshyam D

    2018-05-01

    To study the effect of homeopathic medicines (in higher potencies) in normal subjects, a Peripheral Pulse Analyzer (PPA) was used to record physiologic variability parameters before and after administration of the medicine/placebo in 210 normal subjects. Data were acquired in seven rounds; placebo was administered in rounds 1 and 2 and medicine in potencies 6, 30, 200, 1 M, and 10 M was administered in rounds 3 to 7, respectively. Five different medicines in the said potencies were given to groups of around 40 subjects each. Although processing of the data required human intervention, a software application has been developed to analyze the processed data and detect the response, eliminating undue delay as well as human bias in subjective analysis. This utility, named Automatic Analysis of Intervention in the Field of Homeopathy, is run on the processed PPA data and its outcome has been compared with the manual analysis. The application software uses an adaptive threshold based on statistics for detecting responses, in contrast to the fixed threshold used in manual analysis. The automatic analysis detected 12.96% more responses than the subjective analysis. The higher response rates have been manually verified to be true positives. This indicates the robustness of the application software. The automatic analysis software was also run on another set of pulse harmonic parameters derived from the same data set to study cardiovascular susceptibility, and 385 responses were detected in contrast to 272 for the variability parameters. It was observed that 65% of the subjects eliciting a response were common. This not only validates the software utility as giving a consistent yield but also reveals the certainty of the response. This development may lead to electronic proving of homeopathic medicines (e-proving).
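
    The adaptive-threshold idea can be sketched as follows (the abstract does not give the exact statistic, so the rule below, baseline mean plus k standard deviations, is an assumption, as are the example values):

      import statistics

      def detect_response(baseline, post, k=2.0):
          """Illustrative adaptive-threshold rule: flag a response when the
          post-intervention value departs from the baseline mean by more than
          k baseline standard deviations, instead of one fixed cut-off."""
          mu = statistics.mean(baseline)
          sigma = statistics.stdev(baseline)
          threshold = k * sigma
          return abs(post - mu) > threshold, mu, threshold

      baseline_rounds = [48.2, 47.9, 48.5, 48.1, 48.3]   # a variability parameter, rounds 1-2
      post_medicine = 49.6                               # same parameter after a later round
      print(detect_response(baseline_rounds, post_medicine))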

  10. A technique for automatically extracting useful field of view and central field of view images.

    Science.gov (United States)

    Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar

    2016-01-01

    It is essential to ensure the uniform response of the single photon emission computed tomography gamma camera system before using it for clinical studies by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min, depending on both the activity of the Cobalt-57 flood source and the prespecified counts in the vendor's protocol (usually 4000K-10,000K counts). If the acquired total counts are less than the prespecified total counts, then the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, removing unwanted information, and automatically extracting and saving the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on the Ubuntu operating system and was verified by subjecting it to simulated and real flood source images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time with fewer constraints.
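
    A rough Python sketch of the extraction idea (the thresholds, the synthetic flood image and the omission of the NEMA smoothing step are simplifications): crop the exposed area as the useful field of view (UFOV), take the central 75% as the central field of view (CFOV), and compute integral uniformity.

      import numpy as np

      def extract_ufov_cfov(flood, threshold_frac=0.1, cfov_frac=0.75):
          """Locate the exposed area of the flood image, crop it as the UFOV,
          and take the central 75% of the UFOV dimensions as the CFOV."""
          mask = flood > threshold_frac * flood.max()
          rows = np.where(mask.any(axis=1))[0]
          cols = np.where(mask.any(axis=0))[0]
          ufov = flood[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]
          dy = int(round(ufov.shape[0] * (1 - cfov_frac) / 2))
          dx = int(round(ufov.shape[1] * (1 - cfov_frac) / 2))
          cfov = ufov[dy:ufov.shape[0] - dy, dx:ufov.shape[1] - dx]
          return ufov, cfov

      def integral_uniformity(region):
          """Integral uniformity in percent: 100 * (max - min) / (max + min)."""
          return 100.0 * (region.max() - region.min()) / (region.max() + region.min())

      flood = np.zeros((256, 256))
      flood[30:230, 40:220] = 1000 + np.random.default_rng(3).normal(0, 20, (200, 180))
      ufov, cfov = extract_ufov_cfov(flood)
      print(ufov.shape, cfov.shape, integral_uniformity(cfov))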

  11. A technique for automatically extracting useful field of view and central field of view images

    International Nuclear Information System (INIS)

    Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar

    2016-01-01

    It is essential to ensure the uniform response of the single photon emission computed tomography gamma camera system before using it for clinical studies by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min, depending on both the activity of the Cobalt-57 flood source and the prespecified counts in the vendor's protocol (usually 4000K-10,000K counts). If the acquired total counts are less than the prespecified total counts, then the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, removing unwanted information, and automatically extracting and saving the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on the Ubuntu operating system and was verified by subjecting it to simulated and real flood source images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time with fewer constraints.

  12. Automatable algorithms to identify nonmedical opioid use using electronic data: a systematic review.

    Science.gov (United States)

    Canan, Chelsea; Polinski, Jennifer M; Alexander, G Caleb; Kowal, Mary K; Brennan, Troyen A; Shrank, William H

    2017-11-01

    Improved methods to identify nonmedical opioid use can help direct health care resources to individuals who need them. Automated algorithms that use large databases of electronic health care claims or records for surveillance are a potential means to achieve this goal. In this systematic review, we reviewed the utility, attempts at validation, and application of such algorithms to detect nonmedical opioid use. We searched PubMed and Embase for articles describing automatable algorithms that used electronic health care claims or records to identify patients or prescribers with likely nonmedical opioid use. We assessed algorithm development, validation, and performance characteristics and the settings where they were applied. Study variability precluded a meta-analysis. Of 15 included algorithms, 10 targeted patients, 2 targeted providers, 2 targeted both, and 1 identified medications with high abuse potential. Most patient-focused algorithms (67%) used prescription drug claims and/or medical claims, with diagnosis codes of substance abuse and/or dependence as the reference standard. Eleven algorithms were developed via regression modeling. Four used natural language processing, data mining, audit analysis, or factor analysis. Automated algorithms can facilitate population-level surveillance. However, there is no true gold standard for determining nonmedical opioid use. Users must recognize the implications of identifying false positives and, conversely, false negatives. Few algorithms have been applied in real-world settings. Automated algorithms may facilitate identification of patients and/or providers most likely to need more intensive screening and/or intervention for nonmedical opioid use. Additional implementation research in real-world settings would clarify their utility. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  13. Automatic lung tumor segmentation on PET/CT images using fuzzy Markov random field model.

    Science.gov (United States)

    Guo, Yu; Feng, Yuanming; Sun, Jian; Zhang, Ning; Lin, Wang; Sa, Yu; Wang, Ping

    2014-01-01

    The combination of positron emission tomography (PET) and CT images provides complementary functional and anatomical information of human tissues, and it has been used for better tumor volume definition in lung cancer. This paper proposes a robust method for automatic lung tumor segmentation on PET/CT images. The new method is based on a fuzzy Markov random field (MRF) model. The combination of PET and CT image information is achieved by using a proper joint posterior probability distribution of observed features in the fuzzy MRF model, which performs better than the commonly used Gaussian joint distribution. In this study, the PET and CT simulation images of 7 non-small cell lung cancer (NSCLC) patients were used to evaluate the proposed method. Tumor segmentations with the proposed method and manual segmentation by an experienced radiation oncologist on the fused images were performed, respectively. Segmentation results obtained with the two methods were similar, and Dice's similarity coefficient (DSC) was 0.85 ± 0.013. It has been shown that effective and automatic segmentations can be achieved with this method for lung tumors located near other organs with similar intensities in PET and CT images, such as when the tumors extend into the chest wall or mediastinum.
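
    A greatly simplified Python sketch of MRF-based PET/CT segmentation (a crisp two-class Ising model optimised by iterated conditional modes stands in for the paper's fuzzy MRF with its joint posterior; the data below is synthetic):

      import numpy as np

      def icm_segment(pet, ct, beta=1.5, n_iter=5):
          """Two-class (tumour / background) MRF on co-registered PET/CT, with
          independent Gaussian likelihoods per modality and an Ising smoothness
          prior, optimised by iterated conditional modes (ICM)."""
          labels = (pet > pet.mean()).astype(int)          # crude initialisation
          for _ in range(n_iter):
              # re-estimate per-class means/stds for both modalities
              stats = []
              for c in (0, 1):
                  m = labels == c
                  stats.append([pet[m].mean(), pet[m].std() + 1e-6,
                                ct[m].mean(), ct[m].std() + 1e-6])
              padded = np.pad(labels, 1, mode='edge')
              for c in (0, 1):
                  pm, ps, cm, cs = stats[c]
                  data_cost = ((pet - pm) ** 2 / (2 * ps ** 2) +
                               (ct - cm) ** 2 / (2 * cs ** 2))
                  neighbours_same = sum((padded[1 + dy:padded.shape[0] - 1 + dy,
                                                1 + dx:padded.shape[1] - 1 + dx] == c)
                                        for dy, dx in [(-1, 0), (1, 0), (0, -1), (0, 1)])
                  cost = data_cost + beta * (4 - neighbours_same)
                  if c == 0:
                      best_cost, labels_new = cost, np.zeros_like(labels)
                  else:
                      labels_new = np.where(cost < best_cost, 1, labels_new)
              labels = labels_new
          return labels

      rng = np.random.default_rng(4)
      pet = rng.normal(1.0, 0.3, (64, 64))
      ct = rng.normal(40.0, 5.0, (64, 64))
      pet[20:40, 20:40] += 3.0        # synthetic "tumour" region
      ct[20:40, 20:40] += 30.0
      seg = icm_segment(pet, ct)
      print(seg.sum(), "pixels labelled tumour")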

  14. Automatic Lung Tumor Segmentation on PET/CT Images Using Fuzzy Markov Random Field Model

    Directory of Open Access Journals (Sweden)

    Yu Guo

    2014-01-01

    The combination of positron emission tomography (PET) and CT images provides complementary functional and anatomical information of human tissues, and it has been used for better tumor volume definition in lung cancer. This paper proposes a robust method for automatic lung tumor segmentation on PET/CT images. The new method is based on a fuzzy Markov random field (MRF) model. The combination of PET and CT image information is achieved by using a proper joint posterior probability distribution of observed features in the fuzzy MRF model, which performs better than the commonly used Gaussian joint distribution. In this study, the PET and CT simulation images of 7 non-small cell lung cancer (NSCLC) patients were used to evaluate the proposed method. Tumor segmentations with the proposed method and manual segmentation by an experienced radiation oncologist on the fused images were performed, respectively. Segmentation results obtained with the two methods were similar, and Dice's similarity coefficient (DSC) was 0.85 ± 0.013. It has been shown that effective and automatic segmentations can be achieved with this method for lung tumors located near other organs with similar intensities in PET and CT images, such as when the tumors extend into the chest wall or mediastinum.

  15. De-identifying Swedish clinical text - refinement of a gold standard and experiments with Conditional random fields

    Directory of Open Access Journals (Sweden)

    Dalianis Hercules

    2010-04-01

    Background: In order to perform research on the information contained in Electronic Patient Records (EPRs), access to the data itself is needed. This is often very difficult due to confidentiality regulations. The data sets need to be fully de-identified before they can be distributed to researchers. De-identification is a difficult task where the definitions of annotation classes are not self-evident. Results: We present work on the creation of two refined variants of a manually annotated gold standard for de-identification, one created automatically, and one created through discussions among the annotators. The data is a subset of the Stockholm EPR Corpus, a data set available within our research group. These are used for the training and evaluation of an automatic system based on the Conditional Random Fields algorithm. Evaluating with four-fold cross-validation on sets of around 4000-6000 annotation instances, we obtained very promising results for both gold standards: F-scores around 0.80 for a number of experiments, with higher results for certain annotation classes. Moreover, 49 apparent false positives found by the system were verified to be true positives that had been missed by the annotators. Conclusions: Our intention is to make this gold standard, the Stockholm EPR PHI Corpus, available to other research groups in the future. Despite it being slightly more time-consuming, we believe the manual consensus gold standard is the most valuable for further research. We also propose a set of annotation classes to be used for similar de-identification tasks.

  16. Automatic sentence extraction for the detection of scientific paper relations

    Science.gov (United States)

    Sibaroni, Y.; Prasetiyowati, S. S.; Miftachudin, M.

    2018-03-01

    The relations between scientific papers are very useful for researchers to see the interconnections between papers quickly. By observing the inter-article relationships, researchers can identify, among others, the weaknesses of existing research, performance improvements achieved to date, and tools or data typically used in research in specific fields. So far, methods that have been developed to detect paper relations include machine learning and rule-based methods. However, a problem still arises in the process of sentence extraction from scientific paper documents, which is still done manually. This manual process makes the detection of scientific paper relations slow and inefficient. To overcome this problem, this study performs automatic sentence extraction, while the paper relations are identified based on the citation sentences. The performance of the built system is then compared with that of the manual extraction system. The analysis results suggest that automatic sentence extraction achieves a very high level of performance in the detection of paper relations, close to that of manual sentence extraction.
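
    A minimal Python sketch of automatic citation-sentence extraction (the sentence splitter and citation patterns below are assumptions, not the paper's method): split the text into sentences and keep those containing a citation marker such as "[12]" or "(Smith et al., 2015)".

      import re

      CITATION = re.compile(
          r'\[\d+(?:,\s*\d+)*\]|\([A-Z][A-Za-z-]+(?: et al\.)?,\s*\d{4}\)')

      def citation_sentences(text):
          # Naive sentence split on terminal punctuation followed by whitespace.
          sentences = re.split(r'(?<=[.!?])\s+', text)
          return [s for s in sentences if CITATION.search(s)]

      sample = ("Deep parsing remains slow. Our extraction step follows the pipeline "
                "of (Smith et al., 2015). We improve recall over the baseline in [3].")
      print(citation_sentences(sample))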

  17. Automatic analysis of altered gait in arylsulphatase A-deficient mice in the open field.

    Science.gov (United States)

    Leroy, Toon; Stroobants, Stijn; Aerts, Jean-Marie; D'Hooge, Rudi; Berckmans, Daniel

    2009-08-01

    In current research with laboratory animals, observing their dynamic behavior or locomotion is a labor-intensive task. Automatic continuous monitoring can provide quantitative data on each animal's condition and coordination ability. The objective of the present work is to develop an automated mouse observation system integrated with a conventional open-field test for motor function evaluation. Data were acquired from 86 mice having a targeted disruption of the arylsulphatase A (ASA) gene and having lowered coordinated locomotion abilities as a symptom. The mice used were 36 heterozygotes (12 females) and 50 knockout mice (30 females) at the age of 6 months. The mice were placed one at a time into the test setup, which consisted of a Plexiglas cage (53x34.5x26 cm) and two fluorescent bulbs for proper illumination. The transparent cage allowed images to be captured from underneath the cage, so image information could be obtained about the dynamic variation of the positions of the limbs of the mice for gait reconstruction. Every mouse was recorded for 10 min. Background subtraction and color filtering were used to measure and calculate image features, which are variables that contain crucial information, such as the mouse's position, orientation, body outline, and possible locations for the mouse's paws. A set of heuristic rules was used to prune implausible paw features and label the remaining ones as front/hind and left/right. After we had pruned the implausible paw features, the paw features that were consistent over subsequent images were matched to footprints. Finally, from the measured footprint sequence, eight parameters were calculated in order to quantify the gait of the mouse. This automatic observation technique can be integrated with a regular open-field test, where the trajectory and motor function of a free-moving mouse are measured simultaneously.

  18. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing, and information extraction in the state-of-the art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering, and advanced signal processing techniques and scientific evaluation methodologies being used in this multi disciplinary field will be part of this exposition. The issues of modeling of target signatures in various spectral modalities, LADAR, IR, SAR, high resolution radar, acoustic, seismic, visible, hyperspectral, in diverse geometric aspects will be addressed. The methods for signal processing and classification will cover concepts such as sensor adaptive and artificial neural networks, time reversal filt...

  19. Automatic sign language recognition inspired by human sign perception

    NARCIS (Netherlands)

    Ten Holt, G.A.

    2010-01-01

    Automatic sign language recognition is a relatively new field of research (since ca. 1990). Its objectives are to automatically analyze sign language utterances. There are several issues within the research area that merit investigation: how to capture the utterances (cameras, magnetic sensors,

  20. A model based method for automatic facial expression recognition

    NARCIS (Netherlands)

    Kuilenburg, H. van; Wiering, M.A.; Uyl, M. den

    2006-01-01

    Automatic facial expression recognition is a research topic with interesting applications in the field of human-computer interaction, psychology and product marketing. The classification accuracy for an automatic system which uses static images as input is however largely limited by the image

  1. THEORETICAL CONSIDERATIONS REGARDING THE AUTOMATIC FISCAL STABILIZERS OPERATING MECHANISM

    Directory of Open Access Journals (Sweden)

    Gondor Mihaela

    2012-07-01

    This paper examines the role of Automatic Fiscal Stabilizers (AFS) in stabilizing the cyclical fluctuations of macroeconomic output as an alternative to discretionary fiscal policy, acknowledging their considerable potential as an anti-crisis solution. The objectives of the study are the identification of the general features of the concept of automatic fiscal stabilizers and their logical assessment from an economic perspective. Based on the literature in the field, this paper points out the disadvantages of discretionary fiscal policy and argues the need for using Automatic Fiscal Stabilizers in order to provide a faster decision-making process, shielded from political interference, and reduced uncertainty for households and the business environment. The paper concludes on the need to use fiscal policy for smoothing the economic cycle, but in a way which includes among its features transparency, responsibility and clear operating mechanisms. Based on the research results, the present paper assumes that pro-cyclicality reduces the effectiveness of the Automatic Fiscal Stabilizers and as a result concludes that it is very important to avoid pro-cyclicality in fiscal rule design. Moreover, by committing in advance to specific fiscal policy action contingent on economic developments, uncertainty about the fiscal policy framework during a recession should be reduced. Being based on logical analysis rather than empirical, contextualized analysis, the paper presents some features of the AFS operating mechanism and also identifies and systematizes the factors which give it its importance and national individuality. Reaching common understanding of the Automatic Fiscal Stabilizer concept as an institutional device for smoothing the gaps of the economic cycles across different countries, particularly for the European Union Member States, will facilitate efforts to coordinate fiscal policy responses during a crisis, especially in the context of the fiscal

  2. Automatic imitation: A meta-analysis.

    Science.gov (United States)

    Cracco, Emiel; Bardi, Lara; Desmet, Charlotte; Genschow, Oliver; Rigoni, Davide; De Coster, Lize; Radkova, Ina; Deschrijver, Eliane; Brass, Marcel

    2018-05-01

    Automatic imitation is the finding that movement execution is facilitated by compatible and impeded by incompatible observed movements. In the past 15 years, automatic imitation has been studied to understand the relation between perception and action in social interaction. Although research on this topic started in cognitive science, interest quickly spread to related disciplines such as social psychology, clinical psychology, and neuroscience. However, important theoretical questions have remained unanswered. Therefore, in the present meta-analysis, we evaluated seven key questions on automatic imitation. The results, based on 161 studies containing 226 experiments, revealed an overall effect size of g_z = 0.95, 95% CI [0.88, 1.02]. Moderator analyses identified automatic imitation as a flexible, largely automatic process that is driven by movement and effector compatibility, but is also influenced by spatial compatibility. Automatic imitation was found to be stronger for forced choice tasks than for simple response tasks, for human agents than for nonhuman agents, and for goalless actions than for goal-directed actions. However, it was not modulated by more subtle factors such as animacy beliefs, motion profiles, or visual perspective. Finally, there was no evidence for a relation between automatic imitation and either empathy or autism. Among other things, these findings point toward actor-imitator similarity as a crucial modulator of automatic imitation and challenge the view that imitative tendencies are an indicator of social functioning. The current meta-analysis has important theoretical implications and sheds light on longstanding controversies in the literature on automatic imitation and related domains. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. Persistent Identifiers for Field Deployments: A Missing Link in the Provenance Chain

    Science.gov (United States)

    Arko, R. A.; Ji, P.; Fils, D.; Shepherd, A.; Chandler, C. L.; Lehnert, K.

    2016-12-01

    Research in the geosciences is characterized by a wide range of complex and costly field deployments including oceanographic cruises, submersible dives, drilling expeditions, seismic networks, geodetic campaigns, moored arrays, aircraft flights, and satellite missions. Each deployment typically produces a mix of sensor and sample data, spanning a period from hours to decades, that ultimately yields a long tail of post-field products and publications. Publishing persistent, citable identifiers for field deployments will facilitate 1) preservation and reuse of the original field data, 2) reproducibility of the resulting publications, and 3) recognition for both the facilities that operate the platforms and the investigators who secure funding for the experiments. In the ocean domain, sharing unique identifiers for field deployments is a familiar practice. For example, the Biological and Chemical Oceanography Data Management Office (BCO-DMO) routinely links datasets to cruise identifiers published by the Rolling Deck to Repository (R2R) program. In recent years, facilities have started to publish formal/persistent identifiers, typically Digital Object Identifiers (DOIs), for field deployments including seismic networks, oceanographic cruises, and moored arrays. For example, the EarthChem Library (ECL) publishes a DOI for each dataset which, if it derived from an oceanographic research cruise on a US vessel, is linked to a DOI for the cruise published by R2R. Work is underway to create similar links for the IODP JOIDES Resolution Science Operator (JRSO) and the Continental Scientific Drilling Coordination Office (CSDCO). We present results and lessons learned including a draft schema for publishing field deployments as DataCite DOI records; current practice for linking these DOIs with related identifiers such as Open Researcher and Contributor IDs (ORCIDs), Open Funder Registry (OFR) codes, and International Geo Sample Numbers (IGSNs); and consideration of other
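
    As a rough illustration of the kind of linked metadata discussed in this record, the sketch below shows a DataCite-style record for a hypothetical cruise deployment expressed as a Python dictionary. The field names follow the general public DataCite metadata schema rather than the draft deployment schema mentioned above, and every identifier and value is a placeholder.

        # Illustrative sketch only: a DataCite-style metadata record (as a Python
        # dict) linking a hypothetical cruise deployment DOI to related
        # identifiers. Field names follow the general DataCite schema; the draft
        # deployment schema mentioned above is not reproduced here, and every
        # value below is a placeholder.
        deployment_record = {
            "identifier": {"identifier": "10.XXXX/example-cruise", "identifierType": "DOI"},
            "titles": [{"title": "Research cruise (example deployment record)"}],
            "publisher": "Example operating facility",
            "publicationYear": "2016",
            "resourceType": {"resourceTypeGeneral": "Other", "resourceType": "Cruise"},
            "creators": [{
                "creatorName": "Example, Investigator",
                "nameIdentifiers": [{
                    "nameIdentifier": "0000-0000-0000-0000",
                    "nameIdentifierScheme": "ORCID",
                }],
            }],
            "relatedIdentifiers": [
                {"relatedIdentifier": "10.XXXX/example-dataset",
                 "relatedIdentifierType": "DOI", "relationType": "IsSourceOf"},
                {"relatedIdentifier": "IGSNXXXXXXX",
                 "relatedIdentifierType": "IGSN", "relationType": "IsSourceOf"},
            ],
        }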

  4. 2nd International Conference on Mechatronics and Automatic Control

    CERN Document Server

    2015-01-01

    This book examines mechatronics and automatic control systems. It covers important emerging topics in signal processing, control theory, sensors, mechanical manufacturing systems and automation, and presents papers from the Second International Conference on Mechatronics and Automatic Control Systems held in Beijing, China, on September 20-21, 2014. The book examines how to improve productivity through the latest advanced technologies, covering new systems and techniques in the broad field of mechatronics and automatic control systems.

  5. Accessories for Enhancement of the Semi-Automatic Welding Processes

    National Research Council Canada - National Science Library

    Wheeler, Douglas M; Sawhill, James M

    2000-01-01

    The project's objective is to identify specific areas of the semi-automatic welding operation, as performed with the major semi-automatic processes, which would be more productive if a suitable...

  6. Metabolic changes in occipital lobe epilepsy with automatisms.

    Science.gov (United States)

    Wong, Chong H; Mohamed, Armin; Wen, Lingfeng; Eberl, Stefan; Somerville, Ernest; Fulham, Michael; Bleasel, Andrew F

    2014-01-01

    Some studies suggest that the pattern of glucose hypometabolism relates not only to the ictal-onset zone but also reflects seizure propagation. We investigated metabolic changes in patients with occipital lobe epilepsy (OLE) that may reflect propagation of ictal discharge during seizures with automatisms. Fifteen patients who had undergone epilepsy surgery for intractable OLE and had undergone interictal Fluorine-18-fluorodeoxyglucose positron-emission tomography ((18)F-FDG-PET) between 1994 and 2004 were divided into two groups (with and without automatisms during seizure). Significant regions of hypometabolism were identified by comparing (18)F-FDG-PET results from each group with 16 healthy controls by using statistical parametric mapping. Significant hypometabolism was confined largely to the epileptogenic occipital lobe in the patient group without automatisms. In patients with automatisms, glucose hypometabolism extended from the epileptogenic occipital lobe into the ipsilateral temporal lobe. We identified a distinctive hypometabolic pattern that was specific for OLE patients with automatisms during a seizure. This finding supports the postulate that seizure propagation is a cause of glucose hypometabolism beyond the region of seizure onset.

  7. Metabolic changes in occipital lobe epilepsy with automatisms

    Directory of Open Access Journals (Sweden)

    Chong H Wong

    2014-07-01

    Full Text Available Purpose: Some studies suggest that the pattern of glucose hypometabolism relates not only to the ictal-onset zone, but also reflects seizure propagation. We investigated metabolic changes in patients with occipital lobe epilepsy (OLE) that may reflect propagation of ictal discharge during seizures with automatisms. Methods: Fifteen patients who had undergone epilepsy surgery for intractable OLE and had undergone interictal Fluorine-18-fluorodeoxyglucose positron emission tomography (18F-FDG-PET) between 1994 and 2004 were divided into two groups (with and without automatisms during seizure). Significant regions of hypometabolism were identified by comparing 18F-FDG-PET results from each group with 16 healthy controls by using Statistical Parametric Mapping (SPM 2). Key Findings: Significant hypometabolism was confined largely to the epileptogenic occipital lobe in the patient group without automatisms. In patients with automatisms, glucose hypometabolism extended from the epileptogenic occipital lobe into the ipsilateral temporal lobe. Significance: We identified a distinctive hypometabolic pattern that was specific for OLE patients with automatisms during a seizure. This finding supports the postulate that seizure propagation is a cause of glucose hypometabolism beyond the region of seizure onset.

  8. Automatic classification of time-variable X-ray sources

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M. [Sydney Institute for Astronomy, School of Physics, The University of Sydney, Sydney, NSW 2006 (Australia)

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
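
    As a rough sketch of the kind of supervised pipeline this record describes, the code below trains a Random Forest on labelled sources, reports 10-fold cross-validation accuracy and assigns class probabilities to unknown sources. The feature matrix, class labels and all settings are stand-ins, not the catalogue data or the authors' exact configuration.

        # Minimal sketch of a Random Forest pipeline of the kind described above.
        # The feature table and its contents are hypothetical placeholders.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Stand-in for the 873 labelled training sources: each row holds features
        # derived from time series, spectra and multi-wavelength context.
        X_train = rng.normal(size=(873, 20))
        y_train = rng.integers(0, 7, size=873)          # 7 source classes

        clf = RandomForestClassifier(n_estimators=500, random_state=0)

        # 10-fold cross-validation accuracy on the training set.
        scores = cross_val_score(clf, X_train, y_train, cv=10)
        print(f"10-fold CV accuracy: {scores.mean():.3f}")

        # Fit on all labelled data, then assign class probabilities to the
        # unknown variable sources (here random stand-ins for the 411 objects).
        clf.fit(X_train, y_train)
        X_unknown = rng.normal(size=(411, 20))
        probabilities = clf.predict_proba(X_unknown)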

  9. Automatic classification of time-variable X-ray sources

    International Nuclear Information System (INIS)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.

    2014-01-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources and their features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% on a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.

  10. Statistical pattern recognition for automatic writer identification and verification

    NARCIS (Netherlands)

    Bulacu, Marius Lucian

    2007-01-01

    The thesis addresses the problem of automatic person identification using scanned images of handwriting. Identifying the author of a handwritten sample using automatic image-based methods is an interesting pattern recognition problem with direct applicability in the forensic and historic document

  11. The Potential of Automatic Word Comparison for Historical Linguistics.

    Science.gov (United States)

    List, Johann-Mattis; Greenhill, Simon J; Gray, Russell D

    2017-01-01

    The amount of data from languages spoken all over the world is rapidly increasing. Traditional manual methods in historical linguistics need to face the challenges brought by this influx of data. Automatic approaches to word comparison could provide invaluable help to pre-analyze data which can be later enhanced by experts. In this way, computational approaches can take care of the repetitive and schematic tasks, leaving experts to concentrate on answering interesting questions. Here we test the potential of automatic methods to detect etymologically related words (cognates) in cross-linguistic data. Using a newly compiled database of expert cognate judgments across five different language families, we compare how well different automatic approaches distinguish related from unrelated words. Our results show that automatic methods can identify cognates with a very high degree of accuracy, reaching 89% for the best-performing method, Infomap. We identify the specific strengths and weaknesses of these different methods and point to major challenges for future approaches. Current automatic approaches for cognate detection, although not perfect, could become an important component of future research in historical linguistics.
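
    The methods compared in this record (such as Infomap-based clustering) are considerably more sophisticated than simple string similarity, but the underlying idea of scoring word pairs for relatedness can be illustrated with a normalized edit distance; the sketch below is an illustrative stand-in only, not one of the evaluated methods.

        # Illustrative stand-in only: a normalized-edit-distance heuristic for
        # flagging candidate cognates. The methods compared in the study above
        # are more sophisticated than this.
        def edit_distance(a: str, b: str) -> int:
            # Classic dynamic-programming Levenshtein distance.
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, start=1):
                cur = [i]
                for j, cb in enumerate(b, start=1):
                    cur.append(min(prev[j] + 1,                 # deletion
                                   cur[j - 1] + 1,              # insertion
                                   prev[j - 1] + (ca != cb)))   # substitution
                prev = cur
            return prev[-1]

        def candidate_cognates(word_a: str, word_b: str, threshold: float = 0.5) -> bool:
            # Normalize by the longer word so scores are comparable across lengths.
            dist = edit_distance(word_a, word_b)
            return dist / max(len(word_a), len(word_b)) <= threshold

        # Hypothetical example: two words for the same concept in related languages.
        print(candidate_cognates("wasser", "water"))   # True under this threshold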

  12. An Autonomous Robotic System for Mapping Weeds in Fields

    DEFF Research Database (Denmark)

    Hansen, Karl Damkjær; Garcia Ruiz, Francisco Jose; Kazmi, Wajahat

    2013-01-01

    The ASETA project develops theory and methods for robotic agricultural systems. In ASETA, unmanned aircraft and unmanned ground vehicles are used to automate the task of identifying and removing weeds in sugar beet fields. The framework for a working automatic robotic weeding system is presented...

  13. Automatic feathering of split fields for step-and-shoot intensity modulated radiation therapy

    International Nuclear Information System (INIS)

    Dogan, Nesrin; Leybovich, Leonid B; Sethi, Anil; Emami, Bahman

    2003-01-01

    Due to leaf travel range limitations of the Varian Dynamic Multileaf Collimator (DMLC) system, an IMRT field width exceeding 14.5 cm is split into two or more adjacent abutting sub-fields. The abutting sub-fields are then delivered as separate treatment fields. The accuracy of the delivery is very sensitive to multileaf positioning accuracy. Uncertainties in leaf and carriage positions cause errors in the delivered dose (e.g., hot or cold spots) along the match line of abutting sub-fields. The dose errors are proportional to the penumbra slope at the edge of each sub-field. To alleviate this problem, we developed techniques that feather the split line of IMRT fields. Feathering of the split line was achieved by dividing IMRT fields into several sub-groups with different split line positions. A Varian 21EX accelerator with an 80-leaf DMLC was used for IMRT delivery. Cylindrical targets with varying widths (>14.5 cm) were created to study the split line positions. Seven coplanar 6 MV fields were selected for planning using the NOMOS-CORVUS system. The isocentre of the fields was positioned at the centre of the target volume. Verification was done in a 30 x 30 x 30 cm3 polystyrene phantom using film dosimetry. We investigated two techniques to move the split line from its original position, or to feather it: (1) varying the isocentre position along the target width and (2) introducing a 'pseudo target' outside of the patient (phantom). The position of the 'pseudo target' was determined by analysing the divergence of the IMRT fields. For target widths of 14-28 cm, IMRT fields were automatically split into two sub-fields, and the split line was positioned along the centre of the target by CORVUS. Measured dose distributions demonstrated that the dose to the critical structure was 10% higher than planned when the split line crossed through the centre of the target. Both methods of modifying the split line positions resulted in maximum shifts of ∼1 cm

  14. A web based semi automatic frame work for astrobiological researches

    Directory of Open Access Journals (Sweden)

    P.V. Arun

    2013-12-01

    Full Text Available Astrobiology addresses the possibility of extraterrestrial life and explores measures towards its recognition. Research in this context is founded upon the premise that indicators of life encountered in space will be recognizable. However, effective recognition can be accomplished through a universal adaptation of life signatures, without restricting it solely to those attributes that represent local solutions to the challenges of survival. Life indicators should be modelled with reference to the temporal and environmental variations specific to each planet and time. In this paper, we investigate a semi-automatic open source framework for the accurate detection and interpretation of life signatures that facilitates public participation, in a similar way to that adopted by the SETI@home project. Involving the public in identifying patterns can give the mission additional momentum and is implemented using the semi-automatic framework. Advanced intelligent methodologies may augment the integration of this human-machine analysis. Automatic and manual evaluations, along with a dynamic learning strategy, have been adopted to provide accurate results. The system also helps to provide a deeper public understanding of the space agency's work and facilitates mass involvement in astrobiological studies. It will surely help to motivate eager young minds to pursue a career in this field.

  15. The Associate Principal Astronomer for AI Management of Automatic Telescopes

    Science.gov (United States)

    Henry, Gregory W.

    1998-01-01

    This research program in scheduling and management of automatic telescopes had the following objectives: 1. To field test the 1993 Automatic Telescope Instruction Set (ATIS93) programming language, which was specifically developed to allow real-time control of an automatic telescope via an artificial intelligence scheduler running on a remote computer. 2. To develop and test the procedures for two-way communication between a telescope controller and remote scheduler via the Internet. 3. To test various concepts in AI scheduling being developed at NASA Ames Research Center on an automatic telescope operated by Tennessee State University at the Fairborn Observatory site in southern Arizona; and 4. To develop a prototype software package, dubbed the Associate Principal Astronomer, for the efficient scheduling and management of automatic telescopes.

  16. Towards an automatic wind speed and direction profiler for Wide Field adaptive optics systems

    Science.gov (United States)

    Sivo, G.; Turchi, A.; Masciadri, E.; Guesalaga, A.; Neichel, B.

    2018-05-01

    Wide Field Adaptive Optics (WFAO) systems are among the most sophisticated adaptive optics (AO) systems available today on large telescopes. Knowledge of the vertical spatio-temporal distribution of wind speed (WS) and direction (WD) is fundamental to optimize the performance of such systems. Previous studies already proved that the Gemini Multi-Conjugated AO system (GeMS) is able to retrieve measurements of the WS and WD stratification using the SLOpe Detection And Ranging (SLODAR) technique and to store measurements in the telemetry data. In order to assess the reliability of these estimates and of the SLODAR technique applied to such complex AO systems, in this study we compared WS and WD values retrieved from GeMS with those obtained with the atmospheric model Meso-NH on a rich statistical sample of nights. It has previously been proved that the latter technique provided excellent agreement with a large sample of radiosoundings, both in statistical terms and on individual flights. It can be considered, therefore, as an independent reference. The excellent agreement between GeMS measurements and the model that we find in this study proves the robustness of the SLODAR approach. To bypass the complex procedures necessary to achieve automatic measurements of the wind with GeMS, we propose a simple automatic method to monitor nightly WS and WD using Meso-NH model estimates. Such a method can be applied to whatever present or new-generation facilities are supported by WFAO systems. The interest of this study is, therefore, well beyond the optimization of GeMS performance.

  17. Automatic machine-learning based identification of jogging periods from accelerometer measurements of adolescents under field conditions.

    Science.gov (United States)

    Zdravevski, Eftim; Risteska Stojkoska, Biljana; Standl, Marie; Schulz, Holger

    2017-01-01

    Assessment of the health benefits associated with physical activity depends on the activity duration, intensity and frequency, therefore their correct identification is very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether the identification performance is improved when using two accelerometers at the hip and ankle, compared to when using only one at either position. The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions as part of the GINIplus study. The data were obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was applied to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: Logistic Regression, Support Vector Machines, Random Forest and Extremely Randomized Trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. The reported jogging periods were verified by visual inspection and used as the gold standard. After the feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60 s or 180 s. The best matching ratio, i.e. the length of correctly identified jogging periods related to the total time including the missed ones, was up to 0.875. It could be additionally improved up to 0.967 by application of post-classification rules, which considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers, rather almost the same performance could be achieved from
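
    A minimal sketch of the sliding-window classification approach described above is given below. The window length, the summary features and the classifier settings are illustrative assumptions rather than the study's actual configuration.

        # Minimal sketch of sliding-window activity classification. Window length,
        # feature set and classifier settings are illustrative assumptions.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def window_features(signal: np.ndarray, fs: int, win_s: int = 60) -> np.ndarray:
            """Split a 1-D acceleration magnitude trace into non-overlapping
            windows and compute simple summary features per window."""
            win = win_s * fs
            n = len(signal) // win
            feats = []
            for k in range(n):
                seg = signal[k * win:(k + 1) * win]
                feats.append([seg.mean(), seg.std(), seg.min(), seg.max(),
                              np.percentile(seg, 90)])
            return np.array(feats)

        rng = np.random.default_rng(1)
        fs = 50                                     # assumed sampling rate in Hz
        hip = rng.normal(1.0, 0.3, size=fs * 3600)  # stand-in hip accelerometer trace
        labels = rng.integers(0, 2, size=60)        # stand-in per-window labels (1 = jogging)

        X = window_features(hip, fs, win_s=60)
        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, labels)
        pred = clf.predict(X)                       # per-window jogging / not-jogging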

  18. Automatic machine-learning based identification of jogging periods from accelerometer measurements of adolescents under field conditions.

    Directory of Open Access Journals (Sweden)

    Eftim Zdravevski

    Full Text Available Assessment of the health benefits associated with physical activity depends on the activity duration, intensity and frequency, therefore their correct identification is very valuable and important in epidemiological and clinical studies. The aims of this study are: to develop an algorithm for automatic identification of intended jogging periods; and to assess whether the identification performance is improved when using two accelerometers at the hip and ankle, compared to when using only one at either position. The study used diarized jogging periods and the corresponding accelerometer data from thirty-nine 15-year-old adolescents, collected under field conditions as part of the GINIplus study. The data were obtained from two accelerometers placed at the hip and ankle. An automated feature engineering technique was applied to extract features from the raw accelerometer readings and to select a subset of the most significant features. Four machine learning algorithms were used for classification: Logistic Regression, Support Vector Machines, Random Forest and Extremely Randomized Trees. Classification was performed using only data from the hip accelerometer, using only data from the ankle accelerometer, and using data from both accelerometers. The reported jogging periods were verified by visual inspection and used as the gold standard. After the feature selection and tuning of the classification algorithms, all options provided a classification accuracy of at least 0.99, independent of the applied segmentation strategy with sliding windows of either 60 s or 180 s. The best matching ratio, i.e. the length of correctly identified jogging periods related to the total time including the missed ones, was up to 0.875. It could be additionally improved up to 0.967 by application of post-classification rules, which considered the duration of breaks and jogging periods. There was no obvious benefit of using two accelerometers, rather almost the same performance could be

  19. Automatic evidence retrieval for systematic reviews.

    Science.gov (United States)

    Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G; Tsafnat, Guy

    2014-10-01

    Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing's effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Our goal was to evaluate an automatic method for citation snowballing's capacity to identify and retrieve the full text and/or abstracts of cited articles. Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. The automatic method was able to correctly identify 633 (as proportion of included citations: recall=66.7%, F1 score=79.3%; as proportion of citations in MAS: recall=85.5%, F1 score=91.2%) of citations with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews.
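
    The evaluation described above boils down to comparing the automatically retrieved citation set against a manually verified set. A minimal sketch of that comparison, with hypothetical citation identifiers, is shown below.

        # Minimal sketch of the evaluation logic described above: compare the set
        # of citations retrieved automatically against a manually verified set
        # and report precision, recall and F1. The identifiers are hypothetical.
        def precision_recall_f1(retrieved: set, relevant: set):
            true_pos = len(retrieved & relevant)
            precision = true_pos / len(retrieved) if retrieved else 0.0
            recall = true_pos / len(relevant) if relevant else 0.0
            f1 = (2 * precision * recall / (precision + recall)
                  if precision + recall else 0.0)
            return precision, recall, f1

        manually_found = {"cite_001", "cite_002", "cite_003", "cite_004"}
        auto_found = {"cite_001", "cite_002", "cite_005"}

        p, r, f1 = precision_recall_f1(auto_found, manually_found)
        print(f"precision={p:.2f} recall={r:.2f} F1={f1:.2f}")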

  20. Field manual for identifying and preserving high-water mark data

    Science.gov (United States)

    Feaster, Toby D.; Koenig, Todd A.

    2017-09-26

    This field manual provides general guidance for identifying and collecting high-water marks and is meant to be used by field personnel as a quick reference. The field manual describes purposes for collecting and documenting high-water marks along with the most common types of high-water marks. The manual provides a list of suggested field equipment, describes rules of thumb and best practices for finding high-water marks, and describes the importance of evaluating each high-water mark and assigning a numeric uncertainty value as part of the flagging process. The manual also includes an appendix of photographs of a variety of high-water marks obtained from various U.S. Geological Survey field investigations along with general comments about the logic for the assigned uncertainty values.

  1. [An automatic system controlled by microcontroller for carotid sinus perfusion].

    Science.gov (United States)

    Yi, X L; Wang, M Y; Fan, Z Z; He, R R

    2001-08-01

    To establish a new method for automatically controlling the carotid perfusion pressure. A cheap, practical automatic perfusion unit based on the AT89C2051 microcontroller was designed. The unit, an LDB-M perfusion pump and the carotid sinus of an animal constituted an automatic perfusion system. This system was able to provide ramp and stepwise up-down perfusion patterns and has been used in research on the baroreflex. It can ensure the precision and reproducibility of the perfusion pressure curve and improve the technical level in the corresponding medical field.

  2. Glaucomatous patterns in Frequency Doubling Technology (FDT) perimetry data identified by unsupervised machine learning classifiers.

    Science.gov (United States)

    Bowd, Christopher; Weinreb, Robert N; Balasubramanian, Madhusudhanan; Lee, Intae; Jang, Giljin; Yousefi, Siamak; Zangwill, Linda M; Medeiros, Felipe A; Girkin, Christopher A; Liebmann, Jeffrey M; Goldbaum, Michael H

    2014-01-01

    The variational Bayesian independent component analysis-mixture model (VIM), an unsupervised machine-learning classifier, was used to automatically separate Matrix Frequency Doubling Technology (FDT) perimetry data into clusters of healthy and glaucomatous eyes, and to identify axes representing statistically independent patterns of defect in the glaucoma clusters. FDT measurements were obtained from 1,190 eyes with normal FDT results and 786 eyes with abnormal FDT results from the UCSD-based Diagnostic Innovations in Glaucoma Study (DIGS) and African Descent and Glaucoma Evaluation Study (ADAGES). For all eyes, VIM input was 52 threshold test points from the 24-2 test pattern, plus age. FDT mean deviation was -1.00 dB (S.D. = 2.80 dB) and -5.57 dB (S.D. = 5.09 dB) in FDT-normal eyes and FDT-abnormal eyes, respectively (p<0.001). VIM identified meaningful clusters of FDT data and positioned a set of statistically independent axes through the mean of each cluster. The optimal VIM model separated the FDT fields into 3 clusters. Cluster N contained primarily normal fields (1109/1190, specificity 93.1%), and clusters G1 and G2 combined contained primarily abnormal fields (651/786, sensitivity 82.8%). For clusters G1 and G2 the optimal numbers of axes were 2 and 5, respectively. Patterns automatically generated along axes within the glaucoma clusters were similar to those known to be indicative of glaucoma. Fields located farther from the normal mean on each glaucoma axis showed increasing field defect severity. VIM successfully separated FDT fields from healthy and glaucoma eyes without a priori information about class membership, and identified familiar glaucomatous patterns of loss.
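
    As a simplified stand-in for the unsupervised step described above, the sketch below clusters 53-dimensional vectors (52 threshold points plus age) with an ordinary Gaussian mixture model; the study's variational Bayesian ICA-mixture (VIM) classifier is more elaborate, and the data here are random placeholders.

        # Simplified stand-in for the unsupervised clustering step: a Gaussian
        # mixture model in place of the study's VIM classifier. Input layout
        # (52 threshold points plus age) follows the abstract; data are random.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(2)
        n_eyes = 1976                      # 1190 FDT-normal + 786 FDT-abnormal eyes
        thresholds = rng.normal(-2.0, 3.0, size=(n_eyes, 52))   # dB thresholds
        age = rng.normal(60, 10, size=(n_eyes, 1))
        X = np.hstack([thresholds, age])

        # Three clusters, mirroring the optimal model reported above
        # (one mostly-normal cluster and two glaucoma clusters).
        gmm = GaussianMixture(n_components=3, covariance_type="full",
                              random_state=0).fit(X)
        cluster = gmm.predict(X)           # cluster membership per eye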

  3. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    Science.gov (United States)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O'Donovan, Peter; O'Sullivan, Dominic T. J.

    2017-11-01

    Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation as well as the sub-system to which the fault was attributed. Manually identifying faults using maintenance logs can be effective, but is also highly time consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages on the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine's sub-systems, a routine maintenance activity, a grid-related event or a number of other categories. This is then checked against maintenance logs for accuracy and the labelled data fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage, and can be effectively used for SCADA-based prediction of turbine faults.
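
    A minimal sketch of the labelling step described above is given below: zero-power periods in 10-minute SCADA data are grouped into stoppage events and annotated with the alarm codes active during each period. Column names, file names and alarm categories are hypothetical, not the paper's schema.

        # Minimal sketch of the labelling idea: find zero-power periods in
        # 10-minute SCADA data and attach the alarm codes active during each
        # period. Column names and inputs are hypothetical assumptions.
        import pandas as pd

        def label_stoppages(scada: pd.DataFrame, alarms: pd.DataFrame) -> list:
            """scada: indexed by 10-min timestamps with a 'power_kw' column.
            alarms: columns 'start', 'end', 'code'."""
            stopped = scada["power_kw"] <= 0
            # Group consecutive stopped intervals into stoppage events.
            group_id = (stopped != stopped.shift()).cumsum()
            events = []
            for _, block in scada[stopped].groupby(group_id[stopped]):
                start, end = block.index[0], block.index[-1]
                active = alarms[(alarms["start"] <= end) & (alarms["end"] >= start)]
                events.append({"start": start, "end": end,
                               "alarm_codes": sorted(active["code"].unique())})
            return events

        # Hypothetical usage:
        # scada = pd.read_csv("turbine_scada.csv", parse_dates=["timestamp"],
        #                     index_col="timestamp")
        # alarms = pd.read_csv("alarms.csv", parse_dates=["start", "end"])
        # stoppages = label_stoppages(scada, alarms)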

  4. ANA, automatic natural learning of a semantic network

    International Nuclear Information System (INIS)

    Enguehard, Chantal

    1992-01-01

    The objective of this research thesis is the automatic extraction of terminology and the study of its automatic structuring in order to produce a semantic network. The operation is applied to a text corpus representing knowledge of a specific field in order to select the relevant technical vocabulary for that field. To this end, the author developed a method and software for the automatic acquisition of terminology items. The author first gives an overview of systems and methods for document indexing and thesaurus construction, and a brief presentation of the state of the art in learning. He then discusses some drawbacks of natural language processing systems that rely on large knowledge sources such as grammars and dictionaries. After presenting the adopted approach and some hypotheses, the author defines the objects and operators necessary for easier data handling, presents the knowledge acquisition process, and finally describes the implementation of the system in detail. Some results are assessed and discussed, and limitations and perspectives are commented on. [fr]

  5. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    In order to design the controllers of tomorrow, a need has risen for tools that can aid in the design of these. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGAII), which is one of the most potent algorithms currently in use for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been...

  6. Automatic measurement for solid state track detectors

    International Nuclear Information System (INIS)

    Ogura, Koichi

    1982-01-01

    Because tracks in solid state track detectors are measured with a microscope, observers are forced to do hard work that consumes time and labour. This results in poor statistical accuracy or personal error. Therefore, much research has been done with the aim of simplifying and automating track measurement. There are two categories of automated measurement: simple counting of the number of tracks, and cases requiring knowledge of geometrical elements such as the size of tracks or their coordinates as well as the number of tracks. The former is called automatic counting and the latter automatic analysis. The usual way to evaluate the number of tracks in automatic counting is to estimate the total number of tracks over the whole detector area or in a field of view of a microscope. It is suitable when the track density is high. Methods that count tracks one by one include spark counting and the scanning microdensitometer. Automatic analysis includes video image analysis, in which high quality images obtained with a high resolution video camera are processed with a micro-computer, and the tracks are automatically recognized and measured by feature extraction. This method is described in detail. Among the many kinds of automatic measurement reported so far, the most frequently used are ''spark counting'' and ''video image analysis''. (Wakatsuki, Y.)

  7. Automatic alignment device for focal spot measurements in the center of the field for mammography; Sistema automatico de alinhamento para avaliacao do ponto focal no centro do campo de equipamentos mamograficos

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Marcelo A.C.; Watanabe, Alex O.; Oliveira Junior, Paulo D.; Schiabel, Homero [Universidade de Sao Paulo (USP), Sao Carlos, SP (Brazil). Escola de Engenharia. Dept. de Engenharia Eletrica], e-mail: mvieira@sc.usp.br

    2010-03-15

    Some quality control procedures used in mammography, such as focal spot evaluation, require prior alignment of the measurement equipment with the central X-ray beam. However, alignment procedures are, in general, the most difficult and time-consuming task, and the operator is sometimes exposed to radiation during the procedure. This work presents an automatic alignment system for mammographic equipment that locates the central ray of the radiation beam and then aligns with it by moving itself automatically across the field. The system consists of a bidirectional moving device connected to a CCD sensor for digital radiographic image acquisition. A computational analysis of a radiographic image, acquired at any position in the field, is performed in order to determine its position under the X-ray beam. Finally, a two-axis mechanical system, electronically controlled by a microcontroller over USB communication, makes the system align automatically with the central ray of the radiation beam. The alignment process is fully automatic, fast and accurate, with no operator exposure to radiation, which saves considerable time in carrying out quality control procedures for mammography. (author)

  8. Automatic control of positioning along the joint during EBW in conditions of action of magnetic fields

    Science.gov (United States)

    Druzhinina, A. A.; Laptenok, V. D.; Murygin, A. V.; Laptenok, P. V.

    2016-11-01

    Positioning along the joint during electron beam welding is a difficult scientific and technical problem in achieving high weld quality, and a definitive solution has not yet been found. This is caused by the weak interference immunity of joint-position sensors operating directly in the welding process. During electron beam welding, magnetic fields frequently deflect the electron beam from the optical axis of the electron beam gun. A collimated X-ray sensor is used to monitor the beam deflection caused by the action of magnetic fields. The X-ray sensor signal is processed by the method of synchronous detection. Analysis of the spectral characteristics of the X-ray sensor showed that the displacement of the joint from the optical axis of the gun affects the sensor output signal. The authors propose a dual-circuit system for automatic positioning of the electron beam on the joint during electron beam welding under magnetic interference. This system includes a joint-tracking loop and a magnetic-field compensation loop. The proposed system is stable. Calculation of the dynamic error of the system showed that the positioning error does not exceed the permissible deviation of the electron beam from the joint plane.

  9. Distributed automatic control of technological processes in conditions of weightlessness

    Science.gov (United States)

    Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.

    1986-01-01

    Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.

  10. Automatic jargon identifier for scientists engaging with the public and science communication educators

    Science.gov (United States)

    Chapnik, Noam; Yosef, Roy; Baram-Tsabari, Ayelet

    2017-01-01

    Scientists are required to communicate science and research not only to other experts in the field, but also to scientists and experts from other fields, as well as to the public and policymakers. One fundamental suggestion when communicating with non-experts is to avoid professional jargon. However, because they are trained to speak with highly specialized language, avoiding jargon is difficult for scientists, and there is no standard to guide scientists in adjusting their messages. In this research project, we present the development and validation of the data produced by an up-to-date, scientist-friendly program for identifying jargon in popular written texts, based on a corpus of over 90 million words published in the BBC site during the years 2012–2015. The validation of results by the jargon identifier, the De-jargonizer, involved three mini studies: (1) comparison and correlation with existing frequency word lists in the literature; (2) a comparison with previous research on spoken language jargon use in TED transcripts of non-science lectures, TED transcripts of science lectures and transcripts of academic science lectures; and (3) a test of 5,000 pairs of published research abstracts and lay reader summaries describing the same article from the journals PLOS Computational Biology and PLOS Genetics. Validation procedures showed that the data classification of the De-jargonizer significantly correlates with existing frequency word lists, replicates similar jargon differences in previous studies on scientific versus general lectures, and identifies significant differences in jargon use between abstracts and lay summaries. As expected, more jargon was found in the academic abstracts than lay summaries; however, the percentage of jargon in the lay summaries exceeded the amount recommended for the public to understand the text. Thus, the De-jargonizer can help scientists identify problematic jargon when communicating science to non-experts, and be implemented
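
    A minimal sketch of a frequency-list jargon check in the spirit of the De-jargonizer is shown below: words that are rare in (or absent from) a large general-language corpus are flagged as jargon. The toy corpus, the example sentence and the rarity threshold are illustrative assumptions, not the tool's actual data or rules.

        # Minimal sketch of a frequency-based jargon check. The word list and the
        # rarity threshold here are illustrative assumptions.
        import re
        from collections import Counter

        def build_frequency_list(corpus_text: str) -> Counter:
            return Counter(re.findall(r"[a-z]+", corpus_text.lower()))

        def jargon_share(text: str, freq: Counter, rare_below: int = 5) -> float:
            words = re.findall(r"[a-z]+", text.lower())
            if not words:
                return 0.0
            jargon = [w for w in words if freq[w] < rare_below]
            return len(jargon) / len(words)

        # Hypothetical general-language corpus and a sentence to score.
        freq = build_frequency_list("the cell divides and the cell grows " * 1000)
        sentence = "Metaphase chromosomes were identified via automated karyotyping"
        print(f"jargon share: {jargon_share(sentence, freq):.0%}")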

  11. Automatic jargon identifier for scientists engaging with the public and science communication educators.

    Directory of Open Access Journals (Sweden)

    Tzipora Rakedzon

    Full Text Available Scientists are required to communicate science and research not only to other experts in the field, but also to scientists and experts from other fields, as well as to the public and policymakers. One fundamental suggestion when communicating with non-experts is to avoid professional jargon. However, because they are trained to speak with highly specialized language, avoiding jargon is difficult for scientists, and there is no standard to guide scientists in adjusting their messages. In this research project, we present the development and validation of the data produced by an up-to-date, scientist-friendly program for identifying jargon in popular written texts, based on a corpus of over 90 million words published in the BBC site during the years 2012-2015. The validation of results by the jargon identifier, the De-jargonizer, involved three mini studies: (1) comparison and correlation with existing frequency word lists in the literature; (2) a comparison with previous research on spoken language jargon use in TED transcripts of non-science lectures, TED transcripts of science lectures and transcripts of academic science lectures; and (3) a test of 5,000 pairs of published research abstracts and lay reader summaries describing the same article from the journals PLOS Computational Biology and PLOS Genetics. Validation procedures showed that the data classification of the De-jargonizer significantly correlates with existing frequency word lists, replicates similar jargon differences in previous studies on scientific versus general lectures, and identifies significant differences in jargon use between abstracts and lay summaries. As expected, more jargon was found in the academic abstracts than lay summaries; however, the percentage of jargon in the lay summaries exceeded the amount recommended for the public to understand the text. Thus, the De-jargonizer can help scientists identify problematic jargon when communicating science to non-experts, and

  12. Automatic jargon identifier for scientists engaging with the public and science communication educators.

    Science.gov (United States)

    Rakedzon, Tzipora; Segev, Elad; Chapnik, Noam; Yosef, Roy; Baram-Tsabari, Ayelet

    2017-01-01

    Scientists are required to communicate science and research not only to other experts in the field, but also to scientists and experts from other fields, as well as to the public and policymakers. One fundamental suggestion when communicating with non-experts is to avoid professional jargon. However, because they are trained to speak with highly specialized language, avoiding jargon is difficult for scientists, and there is no standard to guide scientists in adjusting their messages. In this research project, we present the development and validation of the data produced by an up-to-date, scientist-friendly program for identifying jargon in popular written texts, based on a corpus of over 90 million words published in the BBC site during the years 2012-2015. The validation of results by the jargon identifier, the De-jargonizer, involved three mini studies: (1) comparison and correlation with existing frequency word lists in the literature; (2) a comparison with previous research on spoken language jargon use in TED transcripts of non-science lectures, TED transcripts of science lectures and transcripts of academic science lectures; and (3) a test of 5,000 pairs of published research abstracts and lay reader summaries describing the same article from the journals PLOS Computational Biology and PLOS Genetics. Validation procedures showed that the data classification of the De-jargonizer significantly correlates with existing frequency word lists, replicates similar jargon differences in previous studies on scientific versus general lectures, and identifies significant differences in jargon use between abstracts and lay summaries. As expected, more jargon was found in the academic abstracts than lay summaries; however, the percentage of jargon in the lay summaries exceeded the amount recommended for the public to understand the text. Thus, the De-jargonizer can help scientists identify problematic jargon when communicating science to non-experts, and be implemented by

  13. Comparative Study between Sequential Automatic and Manual Home Respiratory Polygraphy Scoring Using a Three-Channel Device: Impact of the Manual Editing of Events to Identify Severe Obstructive Sleep Apnea

    Directory of Open Access Journals (Sweden)

    Glenda Ernst

    2015-01-01

    Full Text Available Objective. According to current guidelines, autoscoring of respiratory events in respiratory polygraphy requires manual scoring. The aim of this study was to evaluate the agreement between automatic analysis and manual scoring to identify patients with suspected OSA. Methods. This retrospective study analyzed 791 records from respiratory polygraphy (RP) performed at home. The grade of association between automatic and manual scoring was evaluated using the Kappa coefficient, and the agreement using the Bland and Altman test and the intraclass correlation coefficient (CCI). To determine the accuracy in identifying AHI ≥ 30 events/h, ROC curve analysis was used. Results. The population analyzed consisted of 493 male (62.3%) and 298 female patients, with an average age of 54.7 ± 14.20 years and BMI of 32.7 ± 8.21 kg/m2. There was no significant difference between the automatic and manual apnea/hypopnea indexes (aAHI, mAHI): aAHI 17.25 (SD: 17.42) versus mAHI 21.20 ± 7.96 (p: NS). The agreement between mAHI and aAHI for AHI ≥ 30 was 94%, with a Kappa coefficient of 0.83 (p < 0.001) and a CCI of 0.83. The AUC-ROC, sensitivity, and specificity were 0.99 (95% CI: 0.98-0.99, p < 0.001), 86% (95% CI: 78.7–91.4), and 97% (95% CI: 96–98.3), respectively. Conclusions. We observed good agreement between automatic scoring and sequential manual scoring for identifying subjects with AHI ≥ 30 events/h.
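
    The headline agreement statistic in this record is Cohen's kappa for the binary decision AHI ≥ 30 events/h by automatic versus manual scoring. A minimal sketch of that computation, with illustrative counts rather than the study data, is shown below.

        # Minimal sketch of Cohen's kappa for a binary agreement question
        # (automatic vs. manual AHI >= 30). The counts are illustrative only.
        def cohens_kappa(both_pos, auto_only, manual_only, both_neg):
            n = both_pos + auto_only + manual_only + both_neg
            observed = (both_pos + both_neg) / n
            # Expected agreement by chance from the marginal proportions.
            p_auto_pos = (both_pos + auto_only) / n
            p_man_pos = (both_pos + manual_only) / n
            expected = p_auto_pos * p_man_pos + (1 - p_auto_pos) * (1 - p_man_pos)
            return (observed - expected) / (1 - expected)

        # Hypothetical 2x2 table: rows = automatic scoring, columns = manual scoring.
        kappa = cohens_kappa(both_pos=100, auto_only=8, manual_only=15, both_neg=600)
        print(f"kappa = {kappa:.2f}")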

  14. Automatization of laboratory extraction installation intended for investigations in the field of reprocessing of spent fuels

    International Nuclear Information System (INIS)

    Vznuzdaev, E.A.; Galkin, B.Ya.; Gofman, F.Eh.

    1981-01-01

    The paper describes an automated test stand for solving the problem of optimal control of the technological extraction process in spent fuel reprocessing by means of a computer-based automated control system. Preliminary experiments conducted on the stand with spent fuel from a WWER-440 reactor have shown the high efficiency of automation and the possibility of conducting technological investigations in a short period of time, yielding information that cannot be obtained with a conventional organisation of the work. [ru]

  15. Automatic Imitation

    Science.gov (United States)

    Heyes, Cecilia

    2011-01-01

    "Automatic imitation" is a type of stimulus-response compatibility effect in which the topographical features of task-irrelevant action stimuli facilitate similar, and interfere with dissimilar, responses. This article reviews behavioral, neurophysiological, and neuroimaging research on automatic imitation, asking in what sense it is "automatic"…

  16. Accuracy of Automatic Cephalometric Software on Landmark Identification

    Science.gov (United States)

    Anuwongnukroh, N.; Dechkunakorn, S.; Damrongsri, S.; Nilwarat, C.; Pudpong, N.; Radomsutthisarn, W.; Kangern, S.

    2017-11-01

    This study aimed to assess the accuracy of automatic cephalometric analysis software in the identification of cephalometric landmarks. Thirty randomly selected digital lateral cephalograms of patients undergoing orthodontic treatment were used. Thirteen landmarks (S, N, Or, A-point, U1T, U1A, B-point, Gn, Pog, Me, Go, L1T, and L1A) were identified on the digital image by the automatic cephalometric software and on cephalometric tracings by the manual method. Superimposition of the printed image and the manual tracing was done by registration at the soft tissue profiles. The accuracy of landmarks located by the automatic method was compared with that of the manually identified landmarks by measuring the mean differences in the distances of each landmark on a Cartesian plane whose X and Y coordinate axes passed through the center of the ear rod. A one-sample t-test was used to evaluate the mean differences. Statistically significant mean differences (p < 0.05) were found in both horizontal and vertical directions, including a mean difference for A-point (3.04 mm) in the vertical direction. Only 5 of the 13 landmarks (38.46%; S, N, Gn, Pog, and Go) showed no significant mean difference between the automatic and manual landmarking methods. It is concluded that if this automatic cephalometric analysis software is used for orthodontic diagnosis, the orthodontist must correct or modify the positions of landmarks in order to increase the accuracy of cephalometric analysis.

  17. Automaticity of walking: functional significance, mechanisms, measurement and rehabilitation strategies

    Directory of Open Access Journals (Sweden)

    David J Clark

    2015-05-01

    Full Text Available Automaticity is a hallmark feature of walking in adults who are healthy and well-functioning. In the context of walking, 'automaticity' refers to the ability of the nervous system to successfully control typical steady state walking with minimal use of attention-demanding executive control resources. Converging lines of evidence indicate that walking deficits and disorders are characterized in part by a shift in the locomotor control strategy from healthy automaticity to compensatory executive control. This is potentially detrimental to walking performance, as an executive control strategy is not optimized for locomotor control. Furthermore, it places excessive demands on a limited pool of executive reserves. The result is compromised ability to perform basic and complex walking tasks and heightened risk for adverse mobility outcomes including falls. Strategies for rehabilitation of automaticity are not well defined, which is due to both a lack of systematic research into the causes of impaired automaticity and to a lack of robust neurophysiological assessments by which to gauge automaticity. These gaps in knowledge are concerning given the serious functional implications of compromised automaticity. Therefore, the objective of this article is to advance the science of automaticity of walking by consolidating evidence and identifying gaps in knowledge regarding: (a) functional significance of automaticity; (b) neurophysiology of automaticity; (c) measurement of automaticity; (d) mechanistic factors that compromise automaticity; and (e) strategies for rehabilitation of automaticity.

  18. Automatic traveltime picking using instantaneous traveltime

    KAUST Repository

    Saragiotis, Christos; Alkhalifah, Tariq Ali; Fomel, Sergey

    2013-01-01

    Event picking is used in many steps of seismic processing. We present an automatic event picking method that is based on a new attribute of seismic signals, instantaneous traveltime. The calculation of the instantaneous traveltime consists of two separate but interrelated stages. First, a trace is mapped onto the time-frequency domain. Then the time-frequency representation is mapped back onto the time domain by an appropriate operation. The computed instantaneous traveltime equals the recording time at those instances at which there is a seismic event, a feature that is used to pick the events. We analyzed the concept of the instantaneous traveltime and demonstrated the application of our automatic picking method on dynamite and Vibroseis field data.

  19. Automatic traveltime picking using instantaneous traveltime

    KAUST Repository

    Saragiotis, Christos

    2013-02-08

    Event picking is used in many steps of seismic processing. We present an automatic event picking method that is based on a new attribute of seismic signals, instantaneous traveltime. The calculation of the instantaneous traveltime consists of two separate but interrelated stages. First, a trace is mapped onto the time-frequency domain. Then the time-frequency representation is mapped back onto the time domain by an appropriate operation. The computed instantaneous traveltime equals the recording time at those instances at which there is a seismic event, a feature that is used to pick the events. We analyzed the concept of the instantaneous traveltime and demonstrated the application of our automatic picking method on dynamite and Vibroseis field data.

  20. Electrical design of a 110-ft long muon pipe with automatic degaussing

    International Nuclear Information System (INIS)

    Visser, A.T.

    1985-11-01

    This memo describes a magnetized cylindrical pipe made from tape-wound, grain-oriented, low carbon steel rolls. Grain-oriented steel yields much higher magnetic fields at low ampere-turns than cast iron or other steel pipes. This is especially important when only a few windings are allowed in the inner bore. The power supply and operating costs are also much lower. The pipe has a high (approx. 9 kG) remnant field, but is automatically degaussed upon shutdown of the DC excitation power supply. A remnant field detector senses whether degaussing was successful. The pipe is used in the muon beam line. Its magnetic field deflects unwanted halo muons. Tests need to be conducted with and without the pipe field. It is therefore desirable that the pipe field automatically returns to zero when the DC excitation is shut off. This can be rather easily accomplished

  1. Very Portable Remote Automatic Weather Stations

    Science.gov (United States)

    John R. Warren

    1987-01-01

    Remote Automatic Weather Stations (RAWS) were introduced to Forest Service and Bureau of Land Management field units in 1978 following development, test, and evaluation activities conducted jointly by the two agencies. The original configuration was designed for semi-permanent installation. Subsequently, a need for a more portable RAWS was expressed, and one was...

  2. Culture, attribution and automaticity: a social cognitive neuroscience view.

    Science.gov (United States)

    Mason, Malia F; Morris, Michael W

    2010-06-01

    A fundamental challenge facing social perceivers is identifying the cause underlying other people's behavior. Evidence indicates that East Asian perceivers are more likely than Western perceivers to reference the social context when attributing a cause to a target person's actions. One outstanding question is whether this reflects a culture's influence on automatic or on controlled components of causal attribution. After reviewing behavioral evidence that culture can shape automatic mental processes as well as controlled reasoning, we discuss the evidence in favor of cultural differences in automatic and controlled components of causal attribution more specifically. We contend that insights emerging from social cognitive neuroscience research can inform this debate. After introducing an attribution framework popular among social neuroscientists, we consider findings relevant to the automaticity of attribution, before speculating how one could use a social neuroscience approach to clarify whether culture affects automatic, controlled or both types of attribution processes.

  3. The development of automatic neutron diffractometry at Harwell

    International Nuclear Information System (INIS)

    Hall, J.W.

    1978-08-01

    Neutron diffractometry contributes substantially to studies of the structure of materials. Scientists at Harwell were among the first to make the collection of diffractometer data automatic and have continued to contribute to this field. This paper outlines the development of automatic neutron diffractometers at Harwell from 1960, and considers the various ANDROMACHE systems up to a hierarchical computer system that is anticipated for 1979. Appendices provide examples of the documentation provided for users of the ANDROMACHE Mark 6 neutron diffractometer system and give brief descriptions of the elements of the programs. (author)

  4. Identifying Future Training Technology Opportunities Using Career Field Models and Simulations

    National Research Council Canada - National Science Library

    Bennett, Jr., Winston; Stone, Brice; Turner, Kathryn; Ruck, Hendrick W

    2002-01-01

    ... itself. This report presents results from a recent application of a career field education and training planning simulation capability to identify cost-effective opportunities for the introduction...

  5. Trends of progress in medical technics as far as automatization is concerned

    Energy Technology Data Exchange (ETDEWEB)

    Agoston, M [Medicor Muevek, Budapest (Hungary)

    1978-09-01

    The modernization of medical treatment is moving in the direction of establishing large hospitals and polyclinics. Highly productive automatic equipment makes it possible to perform mass examinations with high efficiency. X-ray instruments still form the most valuable and indispensable device group. One direction for developing the automation of these machines is achieving the best X-ray exposure. The relatively slow but continuous spread of isotope diagnostic instruments has also been connected with a number of results in automation. In the field of sterilization, bactericidal materials, gas and radiation sterilizing methods, as well as combined systems, are coming into use. Automation also has a strong influence on the domain of epidemiology.

  6. Scan-Less Line Field Optical Coherence Tomography, with Automatic Image Segmentation, as a Measurement Tool for Automotive Coatings

    Directory of Open Access Journals (Sweden)

    Samuel Lawman

    2017-04-01

    Full Text Available The measurement of the thicknesses of layers is important for the quality assurance of industrial coating systems. Current measurement techniques only provide a limited amount of information. Here, we show that spectral domain Line Field (LF) Optical Coherence Tomography (OCT) is able to return to the user a cross-sectional B-Scan image in a single shot with no mechanical moving parts. To reliably extract layer thicknesses from such images of automotive paint systems, we present an automatic graph search image segmentation algorithm. To show that the algorithm works independently of the OCT device, the measurements are repeated with a separate time domain Full Field (FF) OCT system. This gives matching mean thickness values within the standard deviations of the measured thicknesses across each B-Scan image. The combination of LF-OCT with graph search segmentation is potentially a powerful technique for the quality assurance of non-opaque industrial coating layers.

  7. Applying deep learning technology to automatically identify metaphase chromosomes using scanning microscopic images: an initial investigation

    Science.gov (United States)

    Qiu, Yuchen; Lu, Xianglan; Yan, Shiju; Tan, Maxine; Cheng, Samuel; Li, Shibo; Liu, Hong; Zheng, Bin

    2016-03-01

    Automated high throughput scanning microscopy is a fast developing screening technology used in cytogenetic laboratories for the diagnosis of leukemia or other genetic diseases. However, one of the major challenges of using this new technology is how to efficiently detect the analyzable metaphase chromosomes during the scanning process. The purpose of this investigation is to develop a computer-aided detection (CAD) scheme based on deep learning technology which can identify the metaphase chromosomes with high accuracy. The CAD scheme comprises an eight-layer neural network. The first six layers form an automatic feature extraction module, which has an architecture of three convolution-max-pooling layer pairs. The 1st, 2nd and 3rd pairs contain 30, 20 and 20 feature maps, respectively. The seventh and eighth layers form a multilayer perceptron (MLP) based classifier, which is used to identify the analyzable metaphase chromosomes. The performance of the new CAD scheme was assessed by the receiver operating characteristic (ROC) method. A total of 150 regions of interest (ROIs) were selected to test the performance of our new CAD scheme. Each ROI contains either an interphase cell or metaphase chromosomes. The results indicate that the new scheme is able to achieve an area under the ROC curve (AUC) of 0.886+/-0.043. This investigation demonstrates that applying a deep learning technique may significantly improve the accuracy of metaphase chromosome detection using a scanning microscopic imaging technology in the future.
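    The architecture described above (three convolution/max-pooling pairs with 30, 20 and 20 feature maps feeding an MLP classifier) can be sketched in a few lines. The sketch below is an illustrative reconstruction rather than the authors' code: the kernel sizes, the 64x64 grayscale ROI size, the activation functions and the hidden-layer width are assumptions.

    ```python
    import torch
    import torch.nn as nn

    # Sketch of the eight-layer scheme: three conv + max-pooling pairs
    # (30, 20, 20 feature maps) followed by a two-layer perceptron that
    # scores an ROI as metaphase vs. interphase.
    class MetaphaseCAD(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 30, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(30, 20, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(20, 20, kernel_size=3), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(20 * 5 * 5, 64), nn.ReLU(),
                nn.Linear(64, 2),  # metaphase vs. interphase
            )

        def forward(self, x):
            return self.classifier(self.features(x))

    model = MetaphaseCAD()
    scores = model(torch.randn(8, 1, 64, 64))  # batch of 8 assumed 64x64 ROIs
    print(scores.shape)  # torch.Size([8, 2])
    ```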

  8. Automatic topics segmentation for TV news video

    Science.gov (United States)

    Hmayda, Mounira; Ejbali, Ridha; Zaied, Mourad

    2017-03-01

    Automatic identification of television programs in the TV stream is an important task for operating archives. This article proposes a new spatio-temporal approach to identify the programs in a TV stream in two main steps. First, a reference catalogue of video features for visual jingles is built. We exploit the features that characterize the instances of the same program type to identify the different types of programs in the television flow. The role of the video features is to represent the visual invariants of each visual jingle, using automatic descriptors appropriate to each television program. Second, programs in the television stream are identified by examining the similarity of the video signal to the visual jingles in the catalogue. The main idea of the identification process is to compare the visual similarity of the video signal features in the television flow to the catalogue. After presenting the proposed approach, the paper reports encouraging experimental results on several streams extracted from different channels and composed of several programs.
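    As a rough illustration of the matching step only (not the paper's algorithm), the sketch below compares a descriptor vector extracted from a stream window against a catalogue of jingle descriptors using cosine similarity. The feature extraction itself, the descriptor dimensionality and the threshold value are placeholders.

    ```python
    import numpy as np

    def cosine_similarity(a, b):
        """Cosine similarity between two descriptor vectors."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def identify_program(window_descriptor, catalogue, threshold=0.85):
        """Return the catalogue entry whose jingle descriptor best matches the
        descriptor of the current stream window, or None if no similarity
        exceeds the (assumed) threshold."""
        best_name, best_score = None, threshold
        for name, jingle_descriptor in catalogue.items():
            score = cosine_similarity(window_descriptor, jingle_descriptor)
            if score > best_score:
                best_name, best_score = name, score
        return best_name

    # Toy usage with random 128-dimensional descriptors (placeholder values).
    rng = np.random.default_rng(0)
    catalogue = {"news": rng.normal(size=128), "weather": rng.normal(size=128)}
    window = catalogue["news"] + 0.05 * rng.normal(size=128)  # noisy observation
    print(identify_program(window, catalogue))  # -> "news"
    ```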

  9. The system for automatic dose rate measurements by mobile groups in field

    International Nuclear Information System (INIS)

    Drabova, D.; Filgas, R.; Cespirova, I.; Ejemova, M.

    1998-01-01

    A comparison of the characteristics of a pressurized ionization chamber, a plastic scintillator and a proportional counter is given. Based on the requirements and on the comparison of the properties of the various probes, a system for automatic dose rate measurement and integration of geographic co-ordinates in the field was designed and tested. The system is based on a proportional counter, a so-called intelligent probe that can easily be connected to a personal computer. The probe measures in the energy range 30 keV - 1.3 MeV with reasonable energy and angular response; it can measure the dose rate in the range 50 nSv/h - 1 Sv/h with a typical efficiency of 9.5 imp/s per μSv/h. The probe is fixed in a holder placed on the front mask of a car. For the simultaneous determination of geographical co-ordinates, the personal GPS navigator Garmin 95 is used. Both devices are controlled by a notebook via two serial ports. The second serial port, which is not common in notebooks, can easily be provided by a PCMCIA card. The data collected in the field by a mobile group can be transmitted to the assessment centre by cellular GSM phone; a Nokia 2110 system connected to the notebook by a PCMCIA card is used. The whole system is powered from the car battery and is controlled by specially developed software. The software was developed in the FoxPro 2.5 environment and works under MS-DOS 6.22; it also works without problems in a Windows 95 DOS window. The results of dose rate measurements obtained during route monitoring are stored in files. They can be displayed on a graphic screen, presenting the geographical distribution of the dose rate values colour-coded on a map and the time sequence of the measured data. (authors)

  10. Identifying research fields within business and management: a journal cross-citation analysis

    NARCIS (Netherlands)

    Mingers, J.; Leydesdorff, L.

    2015-01-01

    A discipline such as business and management (B&M) is very broad and has many fields within it, ranging from fairly scientific ones such as management science or economics to softer ones such as information systems. There are at least three reasons why it is important to identify these sub-fields

  11. Automatic, anatomically selective, artifact-free enhancement of digital chest radiographs

    International Nuclear Information System (INIS)

    Sezan, M.I.; Tekalp, A.M.; Schaetzing, R.

    1988-01-01

    The authors propose a technique for automatic, anatomically selective, artifact-free enhancement of digital chest radiographs. Anatomically selective enhancement is motivated by the different enhancement requirements of the lung field and the mediastinum. A recent peak detection algorithm is applied to the image histogram to automatically determine a gray-level threshold between the lung and mediastinum fields. The gray-level threshold facilitates anatomically selective gray-scale modification and unsharp masking. Further, in an attempt to suppress possible white-band artifacts due to unsharp masking at sharp edges, local-contrast adaptivity is incorporated into anatomically selective unsharp masking by designing an anatomy-sensitive emphasis parameter that varied asymmetrically with positive and negative values of the local image contrast

  12. A semi-automatic annotation tool for cooking video

    Science.gov (United States)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  13. Automatic Weather Station (AWS Program operated by the University of Wisconsin-Madison during the 2012-2013 field season: Challenges and Successes

    Directory of Open Access Journals (Sweden)

    Matthew A. Lazzara

    2015-03-01

    Full Text Available This report reviews 2012-2013 field season activities of the University of Wisconsin-Madison's Antarctic Automatic Weather Station (AWS) program, summarizes the science that these sites are supporting, and outlines the factors that impact the number of AWS sites serviced in any given field season. The 2012-2013 austral summer season was unusual in the AWS network history. Challenges encountered include, but are not limited to, warmer than normal conditions in the Ross Island area impacting airfield operations, changes to logistical procedures, and competition for shared resources. A flexible work plan provides the best means for taking on these challenges while maximizing AWS servicing efforts under restricted conditions and meeting the need for routine servicing that maintaining an autonomous observing network demands.

  14. Automatic anatomically selective image enhancement in digital chest radiography

    International Nuclear Information System (INIS)

    Sezan, M.I.; Minerbo, G.N.; Schaetzing, R.

    1989-01-01

    The authors develop a technique for automatic anatomically selective enhancement of digital chest radiographs. Anatomically selective enhancement is motivated by the desire to simultaneously meet the different enhancement requirements of the lung field and the mediastinum. A recent peak detection algorithm and a set of rules are applied to the image histogram to determine automatically a gray-level threshold between the lung field and mediastinum. The gray-level threshold facilitates anatomically selective gray-scale modification and/or unsharp masking. Further, in an attempt to suppress possible white-band or black-band artifacts due to unsharp masking at sharp edges, local-contrast adaptivity is incorporated into anatomically selective unsharp masking by designing an anatomy-sensitive emphasis parameter which varies asymmetrically with positive and negative values of the local image contrast

  15. A Machine Vision System for Automatically Grading Hardwood Lumber - (Proceedings)

    Science.gov (United States)

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas H. Drayer; Joe G. Tront; Philip A. Araman; Robert L. Brisbon

    1990-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  16. System for automatic detection of lung nodules exhibiting growth

    Science.gov (United States)

    Novak, Carol L.; Shen, Hong; Odry, Benjamin L.; Ko, Jane P.; Naidich, David P.

    2004-05-01

    Lung nodules that exhibit growth over time are considered highly suspicious for malignancy. We present a completely automated system for detection of growing lung nodules, using initial and follow-up multi-slice CT studies. The system begins with automatic detection of lung nodules in the later CT study, generating a preliminary list of candidate nodules. Next an automatic system for registering locations in two studies matches each candidate in the later study to its corresponding position in the earlier study. Then a method for automatic segmentation of lung nodules is applied to each candidate and its matching location, and the computed volumes are compared. The output of the system is a list of nodule candidates that are new or have exhibited volumetric growth since the previous scan. In a preliminary test of 10 patients examined by two radiologists, the automatic system identified 18 candidates as growing nodules. 7 (39%) of these corresponded to validated nodules or other focal abnormalities that exhibited growth. 4 of the 7 true detections had not been identified by either of the radiologists during their initial examinations of the studies. This technique represents a powerful method of surveillance that may reduce the probability of missing subtle or early malignant disease.

  17. Automatic characterization of dynamics in Absence Epilepsy

    DEFF Research Database (Denmark)

    Petersen, Katrine N. H.; Nielsen, Trine N.; Kjær, Troels W.

    2013-01-01

    Dynamics of the spike-wave paroxysms in Childhood Absence Epilepsy (CAE) are automatically characterized using novel approaches. Features are extracted from scalograms formed by Continuous Wavelet Transform (CWT). Detection algorithms are designed to identify an estimate of the temporal development...

  18. Automatic Segmentation and Deep Learning of Bird Sounds

    NARCIS (Netherlands)

    Koops, Hendrik Vincent; Van Balen, J.M.H.; Wiering, F.

    2015-01-01

    We present a study on automatic birdsong recognition with deep neural networks using the BIRDCLEF2014 dataset. Through deep learning, feature hierarchies are learned that represent the data on several levels of abstraction. Deep learning has been applied with success to problems in fields such as

  19. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. [comp.

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
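    To make the chain-rule propagation concrete, here is a minimal forward-mode automatic differentiation sketch using dual numbers. It is an illustrative example of the general technique, not material taken from the bibliography itself.

    ```python
    class Dual:
        """Dual number a + b*eps: carries a value and its derivative together,
        so derivatives propagate exactly via the chain rule (forward mode)."""
        def __init__(self, value, deriv=0.0):
            self.value, self.deriv = value, deriv

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value + other.value, self.deriv + other.deriv)

        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.value * other.value,
                        self.deriv * other.value + self.value * other.deriv)

        __rmul__ = __mul__

    def derivative(f, x):
        """Evaluate f and df/dx at x in one pass, neither symbolically
        nor by finite differences."""
        out = f(Dual(x, 1.0))
        return out.value, out.deriv

    # Example: f(x) = 3*x*x + 2*x, so f'(x) = 6*x + 2; at x = 4: f = 56, f' = 26.
    print(derivative(lambda x: 3 * x * x + 2 * x, 4.0))  # (56.0, 26.0)
    ```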

  20. Nonrelativistic effective field theories of QED and QCD. Applications and automatic calculations

    Energy Technology Data Exchange (ETDEWEB)

    Shtabovenko, Vladyslav

    2017-05-22

    This thesis deals with the applications of nonrelativistic Effective Field Theories to electromagnetic and strong interactions. The main results of this work are divided into three parts. In the first part, we use potential Nonrelativistic Quantum Electrodynamics (pNRQED), an EFT of QED at energies much below m_e α (with m_e being the electron mass and α the fine-structure constant), to develop a consistent description of electromagnetic van der Waals forces between two hydrogen atoms at a separation R much larger than the Bohr radius. We consider the interactions at short (R << 1/(m_e α^2)), long (R >> 1/(m_e α^2)) and intermediate (R ∝ 1/(m_e α^2)) distances and identify the relevant dynamical scales that characterize each of the three regimes. For each regime we construct a suitable van der Waals EFT that provides the simplest description of the low-energy dynamics. In this framework, van der Waals potentials naturally arise from the matching coefficients of the corresponding EFTs. They can be computed in a systematic way, order by order in the relevant expansion parameters, as is done in this work. Furthermore, the potentials receive contributions from radiative corrections and have to be renormalized. The development of a consistent EFT framework to treat electromagnetic van der Waals interactions between hydrogen atoms and the renormalization of the corresponding van der Waals potentials are the novel features of this study. In the second part, we study relativistic O(α_s^0 υ^2) (with α_s being the strong coupling constant) corrections to the exclusive electromagnetic production of the heavy quarkonium χ_cJ and a hard photon in the framework of nonrelativistic Quantum Chromodynamics (NRQCD), an EFT of QCD that takes full advantage of the nonrelativistic nature of charmonia and bottomonia and exploits the wide separation of the relevant dynamical scales. These scales are m_Q >> m_Q υ >> m_Q υ^2

  1. Nonrelativistic effective field theories of QED and QCD. Applications and automatic calculations

    International Nuclear Information System (INIS)

    Shtabovenko, Vladyslav

    2017-01-01

    This thesis deals with the applications of nonrelativistic Effective Field Theories to electromagnetic and strong interactions. The main results of this work are divided into three parts. In the first part, we use potential Nonrelativistic Quantum Electrodynamics (pNRQED), an EFT of QED at energies much below m_e α (with m_e being the electron mass and α the fine-structure constant), to develop a consistent description of electromagnetic van der Waals forces between two hydrogen atoms at a separation R much larger than the Bohr radius. We consider the interactions at short (R << 1/(m_e α^2)), long (R >> 1/(m_e α^2)) and intermediate (R ∝ 1/(m_e α^2)) distances and identify the relevant dynamical scales that characterize each of the three regimes. For each regime we construct a suitable van der Waals EFT that provides the simplest description of the low-energy dynamics. In this framework, van der Waals potentials naturally arise from the matching coefficients of the corresponding EFTs. They can be computed in a systematic way, order by order in the relevant expansion parameters, as is done in this work. Furthermore, the potentials receive contributions from radiative corrections and have to be renormalized. The development of a consistent EFT framework to treat electromagnetic van der Waals interactions between hydrogen atoms and the renormalization of the corresponding van der Waals potentials are the novel features of this study. In the second part, we study relativistic O(α_s^0 υ^2) (with α_s being the strong coupling constant) corrections to the exclusive electromagnetic production of the heavy quarkonium χ_cJ and a hard photon in the framework of nonrelativistic Quantum Chromodynamics (NRQCD), an EFT of QCD that takes full advantage of the nonrelativistic nature of charmonia and bottomonia and exploits the wide separation of the relevant dynamical scales. These scales are m_Q >> m_Q υ >> m_Q υ^2, where m_Q is the heavy quark mass and υ is the relative

  2. Is automatic CPAP titration as effective as manual CPAP titration in OSAHS patients? A meta-analysis.

    Science.gov (United States)

    Gao, Weijie; Jin, Yinghui; Wang, Yan; Sun, Mei; Chen, Baoyuan; Zhou, Ning; Deng, Yuan

    2012-06-01

    It is costly and time-consuming to conduct the standard manual titration to identify an effective pressure before continuous positive airway pressure (CPAP) treatment for obstructive sleep apnea (OSA) patients. Automatic titration is cheaper and more easily available than manual titration. The purpose of this systematic review was to evaluate the effect of automatic titration in identifying a pressure and on the improvement of the apnea/hypopnea index (AHI) and somnolence, the change of sleep quality, and the acceptance and compliance of CPAP treatment, compared with manual titration. A systematic search was made of the PubMed, EMBASE, Cochrane Library, SCI, China Academic Journals Full-text Databases, Chinese Biomedical Literature Database, Chinese Scientific Journals Databases and Chinese Medical Association Journals. Randomized controlled trials comparing automatic titration and manual titration were reviewed. Studies were pooled to yield odds ratios (OR) or mean differences (MD) with 95% confidence intervals (CI). Ten trials involving 849 patients met the inclusion criteria. It is hard to identify a trend in the pressures determined by either automatic or manual titration. Automatic titration can improve the AHI (MD = 0.03/h, 95% CI = -4.48 to 4.53) and the Epworth sleepiness scale (SMD = -0.02, 95% CI = -0.34 to 0.31) as effectively as manual titration. There is no difference in sleep architecture between automatic and manual titration. The acceptance of CPAP treatment (OR = 0.96, 95% CI = 0.60 to 1.55) and the compliance with treatment (MD = -0.04, 95% CI = -0.17 to 0.10) after automatic titration do not differ from manual titration. Automatic titration is as effective as standard manual titration in improving AHI and somnolence while maintaining sleep quality similar to the standard method. In addition, automatic titration has the same effect on the acceptance and compliance of CPAP treatment as manual titration. With the potential advantage
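    For readers unfamiliar with how such pooled mean differences are obtained, the sketch below shows a generic fixed-effect inverse-variance pooling of per-study mean differences. The numbers are invented for illustration and are not the studies from this meta-analysis.

    ```python
    import math

    def pooled_mean_difference(study_mds, study_ses):
        """Fixed-effect inverse-variance pooling of mean differences (MD).
        Each study contributes weight 1/SE^2; returns pooled MD and 95% CI."""
        weights = [1.0 / se ** 2 for se in study_ses]
        pooled = sum(w * md for w, md in zip(weights, study_mds)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))
        return pooled, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

    # Hypothetical per-study AHI mean differences (events/h) and standard errors.
    mds = [0.5, -0.8, 0.2]
    ses = [1.1, 0.9, 1.4]
    md, ci = pooled_mean_difference(mds, ses)
    print(f"pooled MD = {md:.2f}/h, 95% CI = {ci[0]:.2f} to {ci[1]:.2f}")
    ```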

  3. Development of Automatic Remote Exposure Controller for Gamma Radiography

    International Nuclear Information System (INIS)

    Joo, Gwang Tae; Shin, Jin Seong; Kim, Dong Eun; Song, Jung Ho; Choo, Seung Hwan; Chang, Hong Keun

    2002-01-01

    Recently, about 1,000 sets of gamma radiographic equipment have been operated manually by about 2,500 persons in Korea. In order to perform radiography effectively while avoiding the hazards of the high-level radiation from the source, many field workers have long awaited the development of a wireless automatic remote exposure controller. The KITCO research team has developed an automatic remote exposure controller that can regulate the speed within 0.4∼1.2 m/s using a 24 V, 200 W BLDC motor with an output of 54 kgf·, providing suitable torque and a safety factor for the work. The developed automatic remote exposure controller can control the motor rpm, the pigtail position via a photo-sensor, and the exposure time via a timer linked to an RF sensor. Thus, the developed unit is expected to be usable in many practical applications, with the economic advantage of combining automatic and manual operation, since it can be attached to existing manual remote exposure controllers and used with both AC and DC power.

  4. An Efficient Metric of Automatic Weight Generation for Properties in Instance Matching Technique

    OpenAIRE

    Seddiqui, Md. Hanif; Nath, Rudra Pratap Deb; Aono, Masaki

    2015-01-01

    The proliferation of heterogeneous data sources of semantic knowledge bases intensifies the need for an automatic instance matching technique. However, the efficiency of instance matching is often influenced by the weight of a property associated with instances. Automatic weight generation is a non-trivial, yet important, task in instance matching techniques. Therefore, identifying an appropriate metric for generating the weight for a property automatically is nevertheless a formidab...

  5. Automatic detection and visualisation of MEG ripple oscillations in epilepsy

    Directory of Open Access Journals (Sweden)

    Nicole van Klink

    2017-01-01

    Full Text Available High frequency oscillations (HFOs, 80–500 Hz) in invasive EEG are a biomarker for the epileptic focus. Ripples (80–250 Hz) have also been identified in non-invasive MEG, yet detection is impeded by noise, their low occurrence rates, and the workload of visual analysis. We propose a method that identifies ripples in MEG through noise reduction, beamforming and automatic detection with minimal user effort. We analysed 15 min of presurgical resting-state interictal MEG data of 25 patients with epilepsy. The MEG signal-to-noise ratio was improved by using a cross-validation signal space separation method, and by calculating ~2400 beamformer-based virtual sensors in the grey matter. Ripples in these sensors were automatically detected by an algorithm optimized for MEG. A small subset of the identified ripples was visually checked. Ripple locations were compared with MEG spike dipole locations and the resection area if available. Running the automatic detection algorithm resulted in on average 905 ripples per patient, of which on average 148 ripples were visually reviewed. Reviewing took approximately 5 min per patient, and identified ripples in 16 out of 25 patients. In 14 patients the ripple locations showed good or moderate concordance with the MEG spikes. For six out of eight patients who had surgery, the ripple locations showed concordance with the resection area: 4/5 with good outcome and 2/3 with poor outcome. Automatic ripple detection in beamformer-based virtual sensors is a feasible non-invasive tool for the identification of ripples in MEG. Our method requires minimal user effort and is easily applicable in a clinical setting.
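    As a rough illustration of this kind of automatic ripple detection (not the authors' algorithm), the sketch below band-pass filters a single virtual-sensor trace to the 80-250 Hz ripple band and flags segments whose envelope exceeds a threshold derived from the channel's own statistics. The sampling rate, threshold factor and minimum event duration are assumptions.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    def detect_ripples(trace, fs=1000.0, band=(80.0, 250.0),
                       n_sd=3.0, min_duration_s=0.02):
        """Return (start, end) sample indices of candidate ripple events.
        Threshold = mean + n_sd * SD of the band-passed envelope; events
        shorter than min_duration_s are discarded."""
        sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
        envelope = np.abs(hilbert(sosfiltfilt(sos, trace)))
        above = envelope > envelope.mean() + n_sd * envelope.std()

        events, start = [], None
        for i, flag in enumerate(above):
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                if (i - start) / fs >= min_duration_s:
                    events.append((start, i))
                start = None
        if start is not None and (len(above) - start) / fs >= min_duration_s:
            events.append((start, len(above)))
        return events

    # Toy usage: noise with a short 120 Hz burst inserted around 0.5 s.
    fs = 1000.0
    t = np.arange(0, 2.0, 1 / fs)
    trace = np.random.default_rng(1).normal(scale=0.1, size=t.size)
    burst = (t > 0.5) & (t < 0.55)
    trace[burst] += np.sin(2 * np.pi * 120 * t[burst])
    print(detect_ripples(trace, fs))  # roughly one event around samples 500-550
    ```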

  6. Automatic generation of anatomic characteristics from cerebral aneurysm surface models.

    Science.gov (United States)

    Neugebauer, M; Lawonn, K; Beuing, O; Preim, B

    2013-03-01

    Computer-aided research on cerebral aneurysms often depends on a polygonal mesh representation of the vessel lumen. To support a differentiated, anatomy-aware analysis, it is necessary to derive anatomic descriptors from the surface model. We present an approach for the automatic decomposition of the adjacent vessels into near- and far-vessel regions and the computation of the axial plane. We also present, as examples, two applications of the geometric descriptors: automatic computation of a unique vessel order and automatic viewpoint selection. Approximation methods are employed to analyze vessel cross-sections and the vessel area profile along the centerline. The resulting transition zones between near- and far-vessel regions are used as input for an optimization process to compute the axial plane. The unique vessel order is defined via projection into the plane space of the axial plane. The viewing direction for the automatic viewpoint selection is derived from the normal vector of the axial plane. The approach was successfully applied to representative data sets exhibiting a broad variability with respect to the configuration of their adjacent vessels. A robustness analysis showed that the automatic decomposition is stable against noise. A survey with 4 medical experts showed broad agreement with the automatically defined transition zones. Due to the general nature of the underlying algorithms, this approach is applicable to most of the likely aneurysm configurations in the cerebral vasculature. Additional geometric information obtained during automatic decomposition can support correction in case the automatic approach fails. The resulting descriptors can be used for various applications in the field of visualization, exploration and analysis of cerebral aneurysms.

  7. Automatic measurement of images on astrometric plates

    Science.gov (United States)

    Ortiz Gil, A.; Lopez Garcia, A.; Martinez Gonzalez, J. M.; Yershov, V.

    1994-04-01

    We present some results on the process of automatic detection and measurement of objects in overlapped fields of astrometric plates. The main steps of our algorithm are the following: determination of the scale and tilt between the charge-coupled device (CCD) and microscope coordinate systems, and estimation of the signal-to-noise ratio in each field; image identification and improvement of its position and size; final image centering; and image selection and storage. Several parameters allow the use of variable criteria for image identification, characterization and selection. Problems related to faint images and crowded fields will be approached with special techniques (morphological filters, histogram properties and fitting models).

  8. Special Issue on Automatic Application Tuning for HPC Architectures

    Directory of Open Access Journals (Sweden)

    Siegfried Benkner

    2014-01-01

    Full Text Available High Performance Computing architectures have become incredibly complex and exploiting their full potential is becoming more and more challenging. As a consequence, automatic performance tuning (autotuning) of HPC applications is of growing interest and many research groups around the world are currently involved. Autotuning is still a rapidly evolving research field with many different approaches being taken. This special issue features selected papers presented at the Dagstuhl seminar on “Automatic Application Tuning for HPC Architectures” in October 2013, which brought together researchers from the areas of autotuning and performance analysis in order to exchange ideas and steer future collaborations.

  9. Research progress of on-line automatic monitoring of chemical oxygen demand (COD) of water

    Science.gov (United States)

    Cai, Youfa; Fu, Xing; Gao, Xiaolu; Li, Lianyin

    2018-02-01

    With the increasingly strict control of pollutant emissions in China, on-line automatic monitoring of water quality is particularly urgent. The chemical oxygen demand (COD) is a comprehensive index measuring the contamination caused by organic matter, and it is thus taken as an important index of energy saving and emission reduction in China's "Twelfth Five-Year" program. So far, COD on-line automatic monitoring instruments have played an important role in the field of sewage monitoring. This paper reviews the existing methods for achieving on-line automatic monitoring of COD and, on that basis, points out the future trends of COD on-line automatic monitoring instruments.

  10. A Machine Vision System for Automatically Grading Hardwood Lumber - (Industrial Metrology)

    Science.gov (United States)

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas T. Drayer; Philip A. Araman; Robert L. Brisbon

    1992-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  11. A Review of Automatic Methods Based on Image Processing Techniques for Tuberculosis Detection from Microscopic Sputum Smear Images.

    Science.gov (United States)

    Panicker, Rani Oomman; Soman, Biju; Saini, Gagan; Rajan, Jeny

    2016-01-01

    Tuberculosis (TB) is an infectious disease caused by the bacteria Mycobacterium tuberculosis. It primarily affects the lungs, but it can also affect other parts of the body. TB remains one of the leading causes of death in developing countries, and its recent resurgences in both developed and developing countries warrant global attention. The number of deaths due to TB is very high (as per the WHO report, 1.5 million died in 2013), although most are preventable if diagnosed early and treated. There are many tools for TB detection, but the most widely used one is sputum smear microscopy. It is done manually and is often time consuming; a laboratory technician is expected to spend at least 15 min per slide, limiting the number of slides that can be screened. Many countries, including India, have a dearth of properly trained technicians, and they often fail to detect TB cases due to the stress of a heavy workload. Automatic methods are generally considered as a solution to this problem. Attempts have been made to develop automatic approaches to identify TB bacteria from microscopic sputum smear images. In this paper, we provide a review of automatic methods based on image processing techniques published between 1998 and 2014. The review shows that the accuracy of algorithms for the automatic detection of TB increased significantly over the years and gladly acknowledges that commercial products based on published works also started appearing in the market. This review could be useful to researchers and practitioners working in the field of TB automation, providing a comprehensive and accessible overview of methods of this field of research.

  12. Methodological Comparison between a Novel Automatic Sampling System for Gas Chromatography versus Photoacoustic Spectroscopy for Measuring Greenhouse Gas Emissions under Field Conditions

    Directory of Open Access Journals (Sweden)

    Alexander J. Schmithausen

    2016-10-01

    Full Text Available Trace gases such as nitrous oxide (N2O), methane (CH4), and carbon dioxide (CO2) are climate-related gases, and their emissions from agricultural livestock barns are not negligible. Conventional measurement systems in the field (Fourier transform infrared spectroscopy (FTIR); photoacoustic system (PAS)) are not sufficiently sensitive to N2O. Laser-based measurement systems are highly accurate, but they are very expensive to purchase and maintain. One cost-effective alternative is gas chromatography (GC) with electron capture detection (ECD), but this is not suitable for field applications due to radiation. Measuring samples collected automatically under field conditions in the laboratory at a subsequent time presents many challenges. This study presents a sampling approach designed to allow laboratory analysis of N2O concentrations sampled under field conditions. Analyses were carried out using PAS in the field (online system) and GC in the laboratory (offline system). Both measurement systems showed a good correlation for CH4 and CO2 concentrations. Measured N2O concentrations were near the detection limit for PAS. GC achieved more reliable results for N2O in very low concentration ranges.

  13. Automatic localization of the da Vinci surgical instrument tips in 3-D transrectal ultrasound.

    Science.gov (United States)

    Mohareri, Omid; Ramezani, Mahdi; Adebar, Troy K; Abolmaesumi, Purang; Salcudean, Septimiu E

    2013-09-01

    Robot-assisted laparoscopic radical prostatectomy (RALRP) using the da Vinci surgical system is the current state-of-the-art treatment option for clinically confined prostate cancer. Given the limited field of view of the surgical site in RALRP, several groups have proposed the integration of transrectal ultrasound (TRUS) imaging in the surgical workflow to assist with accurate resection of the prostate and the sparing of the neurovascular bundles (NVBs). We previously introduced a robotic TRUS manipulator and a method for automatically tracking da Vinci surgical instruments with the TRUS imaging plane, in order to facilitate the integration of intraoperative TRUS in RALRP. Rapid and automatic registration of the kinematic frames of the da Vinci surgical system and the robotic TRUS probe manipulator is a critical component of the instrument tracking system. In this paper, we propose a fully automatic registration technique based on automatic 3-D TRUS localization of robot instrument tips pressed against the air-tissue boundary anterior to the prostate. The detection approach uses a multiscale filtering technique to identify and localize surgical instrument tips in the TRUS volume, and could also be used to detect other surface fiducials in 3-D ultrasound. Experiments have been performed using a tissue phantom and two ex vivo tissue samples to show the feasibility of the proposed methods. Also, an initial in vivo evaluation of the system has been carried out on a live anaesthetized dog with a da Vinci Si surgical system and a target registration error (defined as the root mean square distance of corresponding points after registration) of 2.68 mm has been achieved. Results show this method's accuracy and consistency for automatic registration of TRUS images to the da Vinci surgical system.

  14. Automatic Evaluations and Exercising: Systematic Review and Implications for Future Research.

    Science.gov (United States)

    Schinkoeth, Michaela; Antoniewicz, Franziska

    2017-01-01

    The general purpose of this systematic review was to summarize, structure and evaluate the findings on automatic evaluations of exercising. Studies were eligible for inclusion if they reported measuring automatic evaluations of exercising with an implicit measure and assessed some kind of exercise variable. Fourteen nonexperimental and six experimental studies (out of a total N = 1,928) were identified and rated by two independent reviewers. The main study characteristics were extracted and the grade of evidence for each study evaluated. First, the results revealed a large heterogeneity in the measures applied to assess automatic evaluations of exercising and in the exercise variables. Generally, small to large significant relations between automatic evaluations of exercising and exercise variables were identified in the vast majority of studies. The review offers a systematization of the various exercise variables examined and prompts researchers to differentiate more carefully between actually observed exercise behavior (proximal exercise indicator) and associated physiological or psychological variables (distal exercise indicator). Second, a lack of transparently reported reflection on the differing theoretical bases leading to the use of specific implicit measures was observed. Implicit measures should be applied purposefully, taking into consideration the individual advantages or disadvantages of the measures. Third, 12 studies were rated as providing first-grade evidence (the lowest grade of evidence), five as second-grade and three as third-grade evidence. There is a dramatic lack of experimental studies, which are essential for illustrating the cause-effect relation between automatic evaluations of exercising and exercise and for investigating under which conditions automatic evaluations of exercising influence behavior. Conclusions about the necessity of exercise interventions targeted at the alteration of automatic evaluations of exercising should therefore

  15. An enhanced model for automatically extracting topic phrase from ...

    African Journals Online (AJOL)

    The key benefit foreseen from this automatic document classification is not only related to search engines, but also to many other fields, such as document organization, text filtering and semantic index management. Key words: Keyphrase extraction, machine learning, search engine snippet, document classification, topic tracking ...

  16. Automatic alignment of double optical paths in excimer laser amplifier

    Science.gov (United States)

    Wang, Dahui; Zhao, Xueqing; Hua, Hengqi; Zhang, Yongsheng; Hu, Yun; Yi, Aiping; Zhao, Jun

    2013-05-01

    An automatic beam alignment method for double-path amplification in an electron-pumped excimer laser system is demonstrated. In this way, the beams from the amplifiers can be directed along the designated direction and accordingly irradiate the target with high stability and accuracy. However, since there are no natural alignment references in excimer laser amplifiers, a two-cross-hair structure is used to align the beams. One cross-hair placed in the input beam is regarded as the near-field reference, while the other, placed in the output beam, is regarded as the far-field reference. The two cross-hairs are imaged onto charge-coupled devices (CCDs) by separate image-relaying structures. The errors between the intersection points of the two cross-hair images and the centroid coordinates of the actual beam are recorded automatically and sent to a closed-loop feedback control mechanism. The negative feedback keeps running until a preset accuracy is reached. On the basis of this design, the alignment optical path was built and the software written, after which the double-path automatic alignment experiment in the electron-pumped excimer laser amplifier was carried out. The related influencing factors and the alignment precision are also analysed. Experimental results indicate that the alignment system can automatically align the beams to the aiming direction in a short time. The analysis shows that the accuracy of the alignment system is 0.63 μrad and the maximum beam restoration error is 13.75 μm. Furthermore, the larger the distance between the two cross-hairs, the higher the precision of the system. Therefore, the automatic alignment system has been used in an angular multiplexing excimer Main Oscillation Power Amplification (MOPA) system and can satisfy the beam alignment precision requirements on the whole.
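    The closed-loop principle described above (compare the beam centroid on the CCD with the cross-hair reference and correct until a preset accuracy is reached) can be illustrated with a simple proportional-feedback sketch. The camera and actuator interfaces, the gain and the tolerance below are hypothetical placeholders, not parameters of the reported system.

    ```python
    import numpy as np

    def beam_centroid(image):
        """Intensity-weighted centroid (row, col) of a CCD frame."""
        total = image.sum()
        rows, cols = np.indices(image.shape)
        return np.array([(rows * image).sum() / total,
                         (cols * image).sum() / total])

    def align_beam(grab_frame, move_mirror, reference_px, tolerance_px=0.5,
                   gain=0.4, max_iterations=50):
        """Iteratively steer the beam until its centroid matches the cross-hair
        reference on the CCD. grab_frame() returns an image; move_mirror(d)
        applies a correction proportional to the pixel error (assumed interfaces)."""
        for _ in range(max_iterations):
            error = reference_px - beam_centroid(grab_frame())
            if np.linalg.norm(error) < tolerance_px:
                return True            # preset accuracy reached
            move_mirror(gain * error)  # negative feedback step
        return False
    ```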

  17. Semi-automatic mapping for identifying complex geobodies in seismic images

    Science.gov (United States)

    Domínguez-C, Raymundo; Romero-Salcedo, Manuel; Velasquillo-Martínez, Luis G.; Shemeretov, Leonid

    2017-03-01

    Seismic images are composed of positive and negative seismic wave traces with different amplitudes (Robein 2010 Seismic Imaging: A Review of the Techniques, their Principles, Merits and Limitations (Houten: EAGE)). The association of these amplitudes together with a color palette forms complex visual patterns. The color intensity of such patterns is directly related to impedance contrasts: the higher the contrast, the higher the color intensity. Generally speaking, low impedance contrasts are depicted with low-tone colors, creating zones with different patterns whose features are not evident to the 3D automated mapping options available in commercial software. In this work, a workflow for a semi-automatic mapping of seismic images focused on those areas with low-intensity colored zones that may be associated with geobodies of petroleum interest is proposed. The CIE L*A*B* color space was used to perform the seismic image processing, which helped find small but significant differences between pixel tones. This process generated binary masks that bound color regions to low-intensity colors. The three-dimensional mask projection allowed the construction of 3D structures for such zones (geobodies). The proposed method was applied to a set of digital images from a seismic cube and tested on four representative study cases. The obtained results are encouraging because interesting geobodies are obtained with a minimum of information.
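    A minimal sketch of the masking step described above, assuming scikit-image is available: convert a seismic RGB section to CIE L*a*b*, threshold the lightness channel to isolate low-intensity (low-contrast) zones, and stack the per-section binary masks into a rough 3D volume. The threshold value and the morphological clean-up are assumptions, not values from the paper.

    ```python
    import numpy as np
    from skimage import color, morphology

    def low_intensity_mask(rgb_section, lightness_threshold=35.0, min_size=64):
        """Binary mask of low-color-intensity zones in one seismic section.
        rgb_section: float array in [0, 1] of shape (H, W, 3)."""
        lab = color.rgb2lab(rgb_section)           # L* channel ranges 0..100
        mask = lab[..., 0] < lightness_threshold    # dark / low-contrast pixels
        return morphology.remove_small_objects(mask, min_size=min_size)

    def geobody_volume(sections):
        """Stack per-section masks along the inline axis; the result can then
        be labelled into connected components to delineate candidate geobodies."""
        return np.stack([low_intensity_mask(s) for s in sections], axis=0)
    ```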

  18. Technology-Enhanced Assessment of Math Fact Automaticity: Patterns of Performance for Low- and Typically Achieving Students

    Science.gov (United States)

    Stickney, Eric M.; Sharp, Lindsay B.; Kenyon, Amanda S.

    2012-01-01

    Because math fact automaticity has been identified as a key barrier for students struggling with mathematics, we examined how initial math achievement levels influenced the path to automaticity (e.g., variation in number of attempts, speed of retrieval, and skill maintenance over time) and the relation between attainment of automaticity and gains…

  19. Learning algorithms and automatic processing of languages

    International Nuclear Information System (INIS)

    Fluhr, Christian Yves Andre

    1977-01-01

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to the automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to explain how these mechanisms are simulated on a computer, and outlines the specific role of learning in various manifestations of intelligence. Then, based on Markov algorithm theory, the author discusses the notion of a learning algorithm. Two main types of learning algorithms are addressed: firstly, a sanction-based algorithm of the 'algorithm-teacher dialogue' type, which aims at learning how to resolve grammatical ambiguities in submitted texts; secondly, an algorithm related to a documentation system, which automatically structures semantic data obtained from a set of texts in order to be able to answer, by reference, any question on the content of these texts.

  20. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    Andriamampianina, Lala

    1983-01-01

    A system for the semi-automatic survey of drawings is presented. Its design has been oriented toward reducing the stored information required for drawing reproduction. The equipment consists mainly of a plotter driven by a micro-computer, but the pen of the plotter is replaced by a circular photodiode array. Line drawings are first viewed as a concatenation of vectors, with a constant angle between consecutive vectors, and then divided into arcs of circles and line segments. A dynamic analysis of line intersections with the circular sensor makes it possible to identify the starting and end points of a line, for the purpose of automatically following connected lines in a drawing. The advantage of the described method is that precision depends practically only on the plotter performance, the sensor resolution affecting only the thickness of strokes and the distance between two strokes. (author) [fr]

  1. Design of cylindrical pipe automatic welding control system based on STM32

    Science.gov (United States)

    Chen, Shuaishuai; Shen, Weicong

    2018-04-01

    The development of the modern economy has rapidly increased the demand for pipeline construction, and pipeline welding has become an important link in it. At present, manual welding methods are still widely used at home and abroad, and field pipe welding in particular lacks miniature, portable automatic welding equipment. An automated welding system consists of a control system, comprising a lower-computer control panel and a host-computer operating interface, together with the automatic welding machine mechanisms and welding power systems that work in coordination with the control system. In this paper, a new control system for automatic pipe welding based on a lower-computer control panel and a host-computer interface is proposed, which has many advantages over traditional automatic welding machines.

  2. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, the payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  3. Automatic termination of a protective action

    International Nuclear Information System (INIS)

    Heil, P.H.

    1986-01-01

    Subcommittee 6 of NPEC is responsible for the development of IEEE Standard 603. The adequacy of the requirements concerning control and termination of protective actions was raised during the balloting of IEEE Standard 603-1980. In essence, the concern dealt with the requirement for deliberate operator action to return the system to normal. It was questioned whether control actions such as automatic termination of system operation were allowed. Changes in the standard were made to clarify that there is a distinction between control (including termination) and return to normal, and also to state that automatic control may be required. Additionally, an action item was identified in the foreword of IEEE Standard 603-1980 to determine if any additional changes were needed. The purpose of this paper is to present the results of that additional work.

  4. Automatic detection and classification of damage zone(s) for incorporating in digital image correlation technique

    Science.gov (United States)

    Bhattacharjee, Sudipta; Deb, Debasis

    2016-07-01

    Digital image correlation (DIC) is a technique developed for monitoring the surface deformation/displacement of an object under loading conditions. This method is further refined to make it capable of handling discontinuities on the surface of the sample. A damage zone refers to a surface area that has fractured and opened in the course of loading. In this study, an algorithm is presented to automatically detect multiple damage zones in the deformed image. The algorithm identifies the pixels located inside these zones and eliminates them from the FEM-DIC process. The proposed algorithm is successfully implemented on several damaged samples to estimate the displacement fields of an object under loading conditions. This study shows that the displacement fields represent the damage conditions reasonably well compared with the regular FEM-DIC technique that does not consider the damage zones.

  5. Software design of automatic counting system for nuclear track based on mathematical morphology algorithm

    International Nuclear Information System (INIS)

    Pan Yi; Mao Wanchong

    2010-01-01

    The measurement of nuclear track parameters occupies an important position in the field of nuclear technology. However, the traditional manual counting method has many limitations. In recent years, DSP and digital image processing technology have been applied in the nuclear field more and more. In order to reduce the errors of visual measurement in the manual counting method, an automatic counting system for nuclear tracks based on the DM642 real-time image processing platform is introduced in this article; it is able to effectively remove interference from the background and noise points, as well as automatically extract nuclear track points using a mathematical morphology algorithm. (authors)
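    As an illustration of the general idea (not the DM642 implementation), the sketch below uses a morphological opening to suppress isolated noise pixels in a binarized track image and then counts the remaining connected components. The structuring-element size and the binarization threshold are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage

    def count_tracks(image, threshold=128, opening_size=3):
        """Count nuclear track spots in a grayscale image using mathematical
        morphology: binarize, open to remove isolated noise pixels, then label
        connected components."""
        binary = image > threshold
        structure = np.ones((opening_size, opening_size), dtype=bool)
        cleaned = ndimage.binary_opening(binary, structure=structure)
        _, n_tracks = ndimage.label(cleaned)
        return n_tracks

    # Toy usage: two bright 5x5 track spots plus scattered single-pixel noise.
    img = np.zeros((100, 100), dtype=np.uint8)
    img[10:15, 10:15] = 255
    img[60:65, 70:75] = 255
    rng = np.random.default_rng(0)
    noise = rng.integers(0, 100, size=(20, 2))
    img[noise[:, 0], noise[:, 1]] = 255
    print(count_tracks(img))  # -> 2 (noise pixels are removed by the opening)
    ```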

  6. Automatic coding method of the ACR Code

    International Nuclear Information System (INIS)

    Park, Kwi Ae; Ihm, Jong Sool; Ahn, Woo Hyun; Baik, Seung Kook; Choi, Han Yong; Kim, Bong Gi

    1993-01-01

    The authors developed a computer program for automatic coding of the ACR (American College of Radiology) code. Automatic coding of the ACR code is essential for computerization of the data in a department of radiology. This program was written in the FoxBASE language and has been used for automatic coding of diagnoses in the Department of Radiology, Wallace Memorial Baptist Hospital, since May 1992. The ACR dictionary consisted of 11 files, one for the organ codes and the others for the pathology codes. The organ code was obtained by typing the organ name or the code number itself, chosen among the upper- and lower-level codes that were simultaneously displayed on the screen. According to the first digit of the selected organ code, the corresponding pathology code file was chosen automatically. In a fashion similar to organ code selection, the proper pathology code was obtained. An example of an obtained ACR code is '131.3661'. This procedure was reproducible regardless of the number of fields of data. Because this program was written in 'User's Defined Function' form, decoding of the stored ACR code was achieved by the same program, and incorporation of this program into other data processing programs was possible. This program has the merits of simple operation, accurate and detailed coding, and easy adaptation to other programs. Therefore, it can be used to automate routine work in the department of radiology.
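    The coding flow described above (pick an organ code, use its first digit to select the matching pathology dictionary, then combine the two) can be sketched as a simple lookup. The dictionary entries and name-to-code mappings below are invented placeholders; only the 'organ.pathology' output format and the '131.3661' string echo the abstract, and the real program's 11 dictionary files are of course far larger.

    ```python
    # Hypothetical miniature dictionaries standing in for the program's
    # organ file and per-digit pathology files.
    ORGAN_CODES = {"lung": "131", "liver": "761"}           # invented entries
    PATHOLOGY_FILES = {
        "1": {"nodule": "3661", "pneumonia": "2050"},        # invented entries
        "7": {"cyst": "3310"},
    }

    def acr_code(organ_name, pathology_name):
        """Combine organ and pathology codes into an ACR-style 'organ.pathology'
        string, choosing the pathology dictionary by the organ code's first digit."""
        organ = ORGAN_CODES[organ_name]
        pathology = PATHOLOGY_FILES[organ[0]][pathology_name]
        return f"{organ}.{pathology}"

    print(acr_code("lung", "nodule"))  # -> "131.3661" with these invented tables
    ```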

  7. Evolving a rule system controller for automatic driving in a car racing competition

    OpenAIRE

    Pérez, Diego; Sáez Achaerandio, Yago; Recio Isasi, Gustavo; Isasi Viñuela, Pedro

    2008-01-01

    IEEE Symposium on Computational Intelligence and Games. Perth, Australia, 15-18 December 2008. The techniques and the technologies supporting Automatic Vehicle Guidance are important issues. Automobile manufacturers view automatic driving as a very interesting product with motivating key features which allow improvement of the car safety, reduction in emission or fuel consumption or optimization of driver comfort during long journeys. Car racing is an active research field where new ...

  8. Realization of the ergonomics design and automatic control of the fundus cameras

    Science.gov (United States)

    Zeng, Chi-liang; Xiao, Ze-xin; Deng, Shi-chao; Yu, Xin-ye

    2012-12-01

    The principle of ergonomic design in fundus cameras is to extend user comfort through automatic control. Firstly, a 3D positional numerical control system is designed for positioning on the eye pupils of patients undergoing fundus examinations. This system consists of an electronically controlled chin bracket that moves up and down, lateral movement of the binocular together with the detector, and automatic refocusing on the edges of the eye pupils. Secondly, an auto-focusing device for the object plane of the patient's fundus is designed, which collects the patient's fundus images automatically whether the eyes are ametropic or not. Finally, a moving visual target is developed for expanding the fields of the fundus images.

  9. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.

  10. Assessment of automatic segmentation of teeth using a watershed-based method.

    Science.gov (United States)

    Galibourg, Antoine; Dumoncel, Jean; Telmon, Norbert; Calvet, Adèle; Michetti, Jérôme; Maret, Delphine

    2018-01-01

    Automatic 3D segmentation (AS) of teeth is being actively developed in research and clinical fields. Here, we assess the effect of automatic segmentation using a watershed-based method on the accuracy and reproducibility of 3D reconstructions in volumetric measurements, by comparing it with a semi-automatic segmentation (SAS) method that has already been validated. The study sample comprised 52 teeth, scanned with micro-CT (41 µm voxel size) and CBCT (76, 200 and 300 µm voxel sizes). Each tooth was segmented by AS based on a watershed method and by SAS. For all surface reconstructions, volumetric measurements were obtained and analysed statistically. Surfaces were then aligned using the SAS surfaces as the reference. The topography of the geometric discrepancies was displayed using a colour map, allowing the maximum differences to be located. AS reconstructions showed tooth volumes similar to SAS for the 41 µm voxel size. A difference in volumes was observed, and increased with the voxel size, for CBCT data. The maximum differences were mainly found at the cervical margins and incisal edges, but the general form was preserved. Micro-CT, a modality used in dental research, provides data that can be segmented automatically, which is time-saving. AS with CBCT data enables the general form of the region of interest to be displayed. However, our AS method can still be used for metrically reliable measurements in the field of clinical dentistry if some manual refinements are applied.
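    For readers unfamiliar with watershed-based segmentation, the sketch below shows a generic marker-controlled watershed on one binarized CT slice using scikit-image. It illustrates the class of method named in the paper rather than the authors' specific pipeline; the Otsu threshold and the peak-based marker strategy are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage
    from skimage.filters import threshold_otsu
    from skimage.feature import peak_local_max
    from skimage.segmentation import watershed

    def watershed_segment(slice_2d):
        """Marker-controlled watershed on one grayscale CT slice.
        Returns a label image where each catchment basin is one object."""
        binary = slice_2d > threshold_otsu(slice_2d)         # foreground mask
        distance = ndimage.distance_transform_edt(binary)     # distance map
        peaks = peak_local_max(distance, labels=binary, min_distance=10)
        markers = np.zeros_like(slice_2d, dtype=np.int32)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)  # seed labels
        # Flood the negated distance map from the markers, restricted to the mask.
        return watershed(-distance, markers, mask=binary)
    ```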

  11. Automatic segmentation for brain MR images via a convex optimized segmentation and bias field correction coupled model.

    Science.gov (United States)

    Chen, Yunjie; Zhao, Bo; Zhang, Jianwei; Zheng, Yuhui

    2014-09-01

    Accurate segmentation of magnetic resonance (MR) images remains challenging, mainly due to intensity inhomogeneity, which is also commonly known as the bias field. Recently, active contour models with geometric information constraints have been applied; however, most of them handle the bias field with a separate pre-processing step before segmentation of the MR data. This paper presents a novel automatic variational method which can segment brain MR images while simultaneously correcting the bias field, even in images with high intensity inhomogeneity. We first define a function for clustering the image pixels in a small neighborhood. The cluster centers in this objective function carry a multiplicative factor that estimates the bias within the neighborhood. In order to reduce the effect of noise, the local intensity variations are described by Gaussian distributions with different means and variances. Then, the objective functions are integrated over the entire domain. In order to obtain the global optimum and make the results independent of the initialization of the algorithm, we reformulated the energy function as a convex functional and minimized it using the Split Bregman method. A salient advantage of our method is that its result is independent of initialization, which allows robust and fully automated application. Our method is able to estimate bias fields with quite general profiles, even in 7T MR images. Moreover, our model can also distinguish regions with similar intensity distributions but different variances. The proposed method has been rigorously validated with images acquired on a variety of imaging modalities, with promising results. Copyright © 2014 Elsevier Inc. All rights reserved.
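
    For orientation, local clustering energies of the kind sketched above are often written in the following generic form, where b is the multiplicative bias field, the c_i are cluster centres, the u_i are membership functions and K is a localized kernel; this is a sketch of one widely used formulation, not necessarily the exact functional of this paper:

```latex
E(u, c, b) = \sum_{i=1}^{N} \int_{\Omega} \left( \int_{\Omega} K(x - y)\,\bigl| I(y) - b(x)\, c_i \bigr|^{2}\, u_i(y)\, \mathrm{d}y \right) \mathrm{d}x
```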

  12. Individual differences in automatic emotion regulation affect the asymmetry of the LPP component.

    Directory of Open Access Journals (Sweden)

    Jing Zhang

    Full Text Available The main goal of this study was to investigate how automatic emotion regulation altered the hemispheric asymmetry of ERPs elicited by emotion processing. We examined the effect of individual differences in automatic emotion regulation on the late positive potential (LPP when participants were viewing blocks of positive high arousal, positive low arousal, negative high arousal and negative low arousal pictures from International affect picture system (IAPS. Two participant groups were categorized by the Emotion Regulation-Implicit Association Test which has been used in previous research to identify two groups of participants with automatic emotion control and with automatic emotion express. The main finding was that automatic emotion express group showed a right dominance of the LPP component at posterior electrodes, especially in high arousal conditions. But no right dominance of the LPP component was observed for automatic emotion control group. We also found the group with automatic emotion control showed no differences in the right posterior LPP amplitude between high- and low-arousal emotion conditions, while the participants with automatic emotion express showed larger LPP amplitude in the right posterior in high-arousal conditions compared to low-arousal conditions. This result suggested that AER (Automatic emotion regulation modulated the hemispheric asymmetry of LPP on posterior electrodes and supported the right hemisphere hypothesis.

  13. Individual differences in automatic emotion regulation affect the asymmetry of the LPP component.

    Science.gov (United States)

    Zhang, Jing; Zhou, Renlai

    2014-01-01

    The main goal of this study was to investigate how automatic emotion regulation altered the hemispheric asymmetry of ERPs elicited by emotion processing. We examined the effect of individual differences in automatic emotion regulation on the late positive potential (LPP) when participants were viewing blocks of positive high arousal, positive low arousal, negative high arousal and negative low arousal pictures from International affect picture system (IAPS). Two participant groups were categorized by the Emotion Regulation-Implicit Association Test which has been used in previous research to identify two groups of participants with automatic emotion control and with automatic emotion express. The main finding was that automatic emotion express group showed a right dominance of the LPP component at posterior electrodes, especially in high arousal conditions. But no right dominance of the LPP component was observed for automatic emotion control group. We also found the group with automatic emotion control showed no differences in the right posterior LPP amplitude between high- and low-arousal emotion conditions, while the participants with automatic emotion express showed larger LPP amplitude in the right posterior in high-arousal conditions compared to low-arousal conditions. This result suggested that AER (Automatic emotion regulation) modulated the hemispheric asymmetry of LPP on posterior electrodes and supported the right hemisphere hypothesis.

  14. Development of an automatic human duress detection system

    International Nuclear Information System (INIS)

    Greene, E.R.; Davis, J.G.; Tuttle, W.C.

    1979-01-01

    A method for automatically detecting duress in security personnel utilizes real-time assessment of physiological data (heart rate) to evaluate psychological stress. Using body-worn tape recorders, field data have been collected on 22 Albuquerque police officers (20 male, 2 female) to determine actual heart rate responses in both routine and life-threatening situations. Off-line computer analysis has been applied to the data to determine the speed and reliability with which an alarm could be triggered. Alarm algorithms relating field responses to laboratory-collected baseline responses have been developed.
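
    As a rough illustration of how such an alarm algorithm might compare field heart-rate data against laboratory baselines, here is a minimal sketch; the threshold factor, the sustain window and the data layout are all assumptions, not the published algorithm.

```python
import numpy as np

def duress_alarm(heart_rate_bpm, baseline_mean, baseline_std, k=3.0, sustain=5):
    """Return the index of the first sample at which an alarm would trigger, or None.
    An alarm requires the heart rate to stay more than k baseline standard deviations
    above the officer's laboratory baseline for `sustain` consecutive samples."""
    above = np.asarray(heart_rate_bpm) > baseline_mean + k * baseline_std
    run = 0
    for i, flag in enumerate(above):
        run = run + 1 if flag else 0
        if run >= sustain:
            return i
    return None
```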

  15. Identifying Discrimination at Work: The Use of Field Experiments.

    Science.gov (United States)

    Pager, Devah; Western, Bruce

    2012-06-01

    Antidiscrimination law offers protection to workers who have been treated unfairly on the basis of their race, gender, religion, or national origin. In order for these protections to be invoked, however, potential plaintiffs must be aware of and able to document discriminatory treatment. Given the subtlety of contemporary forms of discrimination, it is often difficult to identify discrimination when it has taken place. The methodology of field experiments offers one approach to measuring and detecting hiring discrimination, providing direct observation of discrimination in real-world settings. In this article, we discuss the findings of two recent field experiments measuring racial discrimination in low wage labor markets. This research provides several relevant findings for researchers and those interested in civil rights enforcement: (1) it produces estimates of the rate of discrimination at the point of hire; (2) it yields evidence about the interactions associated with discrimination (many of which reveal the subtlety with which contemporary discrimination is practiced); and (3) it provides a vehicle for both research on and enforcement of antidiscrimination law.

  16. Automatic Detection of Fake News

    OpenAIRE

    Pérez-Rosas, Verónica; Kleinberg, Bennett; Lefevre, Alexandra; Mihalcea, Rada

    2017-01-01

    The proliferation of misleading information in everyday-access media outlets such as social media feeds, news blogs, and online newspapers has made it challenging to identify trustworthy news sources, thus increasing the need for computational tools able to provide insights into the reliability of online content. In this paper, we focus on the automatic identification of fake content in online news. Our contribution is twofold. First, we introduce two novel datasets for the task of fake news...

  17. TU-H-CAMPUS-JeP1-02: Fully Automatic Verification of Automatically Contoured Normal Tissues in the Head and Neck

    Energy Technology Data Exchange (ETDEWEB)

    McCarroll, R [UT MD Anderson Cancer Center, Houston, TX (United States); UT Health Science Center, Graduate School of Biomedical Sciences, Houston, TX (United States); Beadle, B; Yang, J; Zhang, L; Kisling, K; Balter, P; Stingo, F; Nelson, C; Followill, D; Court, L [UT MD Anderson Cancer Center, Houston, TX (United States); Mejia, M [University of Santo Tomas Hospital, Manila, Metro Manila (Philippines)

    2016-06-15

    Purpose: To investigate and validate the use of an independent deformable-based contouring algorithm for automatic verification of auto-contoured structures in the head and neck, towards fully automated treatment planning. Methods: Two independent automatic contouring algorithms [(1) Eclipse's Smart Segmentation followed by pixel-wise majority voting, (2) an in-house multi-atlas based method] were used to create contours of 6 normal structures of 10 head-and-neck patients. After rating by a radiation oncologist, the higher performing algorithm was selected as the primary contouring method, and the other was used for automatic verification of the primary. To determine the ability of the verification algorithm to detect incorrect contours, contours from the primary method were shifted from 0.5 to 2 cm. Using a logit model, the structure-specific minimum detectable shift was identified. The models were then applied to a set of twenty different patients and the sensitivity and specificity of the models were verified. Results: Per physician rating, the multi-atlas method (4.8/5 point scale, with 3 rated as generally acceptable for planning purposes) was selected as primary and the Eclipse-based method (3.5/5) for verification. Mean distance to agreement and true positive rate were selected as covariates in an optimized logit model. These models, when applied to a group of twenty different patients, indicated that shifts could be detected at 0.5 cm (brain), 0.75 cm (mandible, cord), 1 cm (brainstem, cochlea), or 1.25 cm (parotid), with sensitivity and specificity greater than 0.95. If the sensitivity and specificity constraints are reduced to 0.9, the detectable shifts of mandible and brainstem were reduced by 0.25 cm. These shifts represent additional safety margins which might be considered if auto-contours are used for automatic treatment planning without physician review. Conclusion: Automatically contoured structures can be automatically verified. This fully automated process could be used to
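
    A verification model of the kind described, predicting whether a contour is unacceptably shifted from covariates such as mean distance to agreement and true positive rate, can be sketched with scikit-learn. The numbers, variable names and the 0.5 operating threshold below are illustrative assumptions, not the study's fitted model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [mean distance to agreement (mm), true positive rate].
# y = 1 if the primary contour was deliberately shifted (i.e. "incorrect").
X_train = np.array([[1.2, 0.95], [4.8, 0.60], [0.9, 0.97], [6.1, 0.48]])
y_train = np.array([0, 1, 0, 1])

model = LogisticRegression().fit(X_train, y_train)

def contour_flagged(mda_mm, tpr, threshold=0.5):
    """Flag a contour for review when the predicted shift probability is high."""
    p = model.predict_proba([[mda_mm, tpr]])[0, 1]
    return p > threshold
```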

  18. Automatic Mosaicking of Satellite Imagery Considering the Clouds

    Science.gov (United States)

    Kang, Yifei; Pan, Li; Chen, Qi; Zhang, Tong; Zhang, Shasha; Liu, Zhang

    2016-06-01

    With the rapid development of high-resolution remote sensing for earth observation, satellite imagery is widely used in the fields of resource investigation, environmental protection, and agricultural research. Image mosaicking is an important part of satellite imagery production. However, the presence of clouds creates problems for automatic image mosaicking in two main respects: 1) image blurring may be introduced during the process of image dodging, and 2) cloudy areas may be crossed by automatically generated seamlines. To address these problems, an automatic mosaicking method is proposed for cloudy satellite imagery in this paper. Firstly, modified Otsu thresholding and morphological processing are employed to extract cloudy areas and obtain the percentage of cloud cover. Then, the cloud detection results are used to optimize the processes of dodging and mosaicking. Thus, the mosaic image can be composed of more clear-sky areas instead of cloudy areas, and the clear-sky areas remain sharp and distortion-free. Chinese GF-1 wide-field-of-view orthoimages are employed as experimental data. The performance of the proposed approach is evaluated in four aspects: the effect of cloud detection, the sharpness of clear-sky areas, the rationality of seamlines, and efficiency. The evaluation results demonstrate that the mosaic image obtained by our method has fewer clouds, better internal color consistency and better visual clarity than that obtained by the traditional method. The time consumed by the proposed method for 17 scenes of GF-1 orthoimages is within 4 hours on a desktop computer. This efficiency can meet the general production requirements for massive satellite imagery.
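
    The cloud-masking step (Otsu thresholding followed by morphological clean-up) can be prototyped as follows. This is a plain Otsu sketch with scikit-image, not the paper's modified variant, and the structuring-element size is an assumption.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.morphology import binary_opening, binary_closing, disk

def cloud_mask(band):
    """Return a boolean cloud mask and the cloud-cover percentage for one band."""
    mask = band > threshold_otsu(band)                         # clouds are bright
    selem = disk(5)
    mask = binary_closing(binary_opening(mask, selem), selem)  # remove speckle, fill holes
    return mask, 100.0 * mask.mean()
```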

  19. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that are related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality-management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects the homogeneity of the concrete mixture and, consequently, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, since the task of providing a uniform mixture is performed by the automatic control system of the kneading-and-mixing machinery with on-line automatic control of homogeneity. Theoretical underpinnings of homogeneity control are presented, relating homogeneity to changes in the frequency of vibrodynamic oscillations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during continuous mixing of the components. The following technical means were chosen for the automatic control system: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To assess the quality of automatic control, a block diagram with transfer functions describing the operation of the ACS in the transient dynamic mode is presented.

  20. Automatic acquisition and shape analysis of metastable peaks

    International Nuclear Information System (INIS)

    Maendli, H.; Robbiani, R.; Kuster, Th.; Seibl, J.

    1979-01-01

    A method for automatic acquisition and evaluation of metastable peaks due to transitions in the first field-free region of a double-focussing mass spectrometer is presented. The data are acquired by computer-controlled repetitive scanning of the accelerating voltage and concomitant accumulation; the evaluation is made by mathematical differentiation of the resulting curve. Examples of the application of the method are given. (Auth.)

  1. Finding weak points automatically

    International Nuclear Information System (INIS)

    Archinger, P.; Wassenberg, M.

    1999-01-01

    Operators of nuclear power stations have to carry out material tests on selected components at regular intervals. A fully automated test, which achieves clearly higher reproducibility compared to partially automated variants, would therefore provide a solution. In addition, the fully automated test reduces the radiation dose to the test personnel. (orig.) [de

  2. Automatic Texture and Orthophoto Generation from Registered Panoramic Views

    DEFF Research Database (Denmark)

    Krispel, Ulrich; Evers, Henrik Leander; Tamke, Martin

    2015-01-01

    ... from range data only. In order to detect these elements, we developed a method that utilizes range data and color information from high-resolution panoramic images of indoor scenes, taken at the scanner's position. A proxy geometry is derived from the point clouds; orthographic views of the scene are automatically identified from the geometry and an image per view is created via projection. We combine methods of computer vision to train a classifier to detect the objects of interest from these orthographic views. Furthermore, these views can be used for automatic texturing of the proxy geometry.

  3. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    Lattin, Kenneth R.

    1978-01-01

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand and shoe monitors. In addition, this prototype system also has a number of new features, including: microcomputer control and readout, nineteen large area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company controlled boundaries. Investigation of the commercially available portal and hand and shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  4. Automatic Tamil lyric generation based on ontological interpretation ...

    Indian Academy of Sciences (India)

    This system proposes an n-gram based approach to automatic Tamil lyric generation, by the ontological semantic interpretation of the input scene. The approach is based on identifying the semantics conveyed in the scenario, thereby making the system understand the situation and generate lyrics accordingly. The heart of ...

  5. Automatically Identifying Fusion Events between GLUT4 Storage Vesicles and the Plasma Membrane in TIRF Microscopy Image Sequences

    Directory of Open Access Journals (Sweden)

    Jian Wu

    2015-01-01

    Full Text Available Quantitative analysis of the dynamic behavior of membrane-bound secretory vesicles has proven to be important in biological research. This paper proposes a novel approach to automatically identify the elusive fusion events between VAMP2-pHluorin labeled GLUT4 storage vesicles (GSVs) and the plasma membrane. Frame differencing, implemented as modified forward subtraction of consecutive frames in the TIRFM image sequence, is used to detect the initiation of fusion events. Spatially connected pixels in difference images brighter than a specified adaptive threshold are grouped into a distinct fusion spot. The vesicles are located at the intensity-weighted centroid of their fusion spots. To reveal the true in vivo nature of a fusion event, 2D Gaussian fitting for the fusion spot is used to derive the intensity-weighted centroid and the spot size during the fusion process. The fusion event and its termination can be determined according to the change of spot size. The method is evaluated on real experimental data with ground truth annotated by expert cell biologists. The evaluation results show that it can achieve relatively high accuracy, comparing favorably to manual analysis, yet at a small fraction of the time.
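
    The forward-subtraction and spot-grouping steps can be sketched directly with NumPy and SciPy. This is an illustrative simplification (it omits the 2D Gaussian fitting and spot-size tracking), and the threshold rule and minimum spot size are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi

def detect_fusion_spots(frames, k=3.0, min_pixels=4):
    """Forward-subtract consecutive TIRFM frames and report candidate fusion
    spots as (frame index, centroid row, centroid column, spot size in pixels)."""
    events = []
    for t in range(1, len(frames)):
        diff = frames[t].astype(float) - frames[t - 1].astype(float)
        diff[diff < 0] = 0                                   # keep only brightening pixels
        labels, n = ndi.label(diff > diff.mean() + k * diff.std())
        for lab in range(1, n + 1):
            rows, cols = np.nonzero(labels == lab)
            if rows.size < min_pixels:
                continue
            w = diff[rows, cols]                             # intensity-weighted centroid
            events.append((t, np.average(rows, weights=w),
                              np.average(cols, weights=w), rows.size))
    return events
```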

  6. A framework for automatic segmentation in three dimensions of microstructural tomography data

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Hansen, Karin Vels; Larsen, Rasmus

    2010-01-01

    Routine use of quantitative three-dimensional analysis of material microstructure by, in particular, focused ion beam (FIB) serial sectioning is generally restricted by the time-consuming task of manually delineating structures within each image slice or by the quality of manual and automatic segmentation schemes. We present here a framework for performing automatic segmentation of complex microstructures using a level set method. The technique is based on numerical approximations to partial differential equations to evolve a 3D surface to capture the phase boundaries. Vector fields derived from...

  7. The development of an automatic scanning method for CR-39 neutron dosimeter

    International Nuclear Information System (INIS)

    Tawara, Hiroko; Miyajima, Mitsuhiro; Sasaki, Shin-ichi; Hozumi, Ken-ichi

    1989-01-01

    A method of measuring low-level neutron dose has been developed with CR-39 track detectors using an automatic scanning system. It is composed of an optical microscope with a video camera, an image processor and a personal computer. The focus point of the microscope and the X-Y stage are controlled from the computer. The minimum detectable neutron dose is estimated at 4.6 mrem in a uniform neutron field with an energy spectrum equivalent to that of an Am-Be source, based on the results of automatic measurements. (author)

  8. Automatic Modulation Recognition by Support Vector Machines Using Wavelet Kernel

    Energy Technology Data Exchange (ETDEWEB)

    Feng, X Z; Yang, J; Luo, F L; Chen, J Y; Zhong, X P [College of Mechatronic Engineering and Automation, National University of Defense Technology, Changsha (China)

    2006-10-15

    Automatic modulation identification plays a significant role in electronic warfare, electronic surveillance systems and electronic countermeasures. The task of modulation recognition of communication signals is to determine the modulation type and signal parameters. In fact, automatic modulation identification can be regarded as an application of pattern recognition in the communications field. The support vector machine (SVM) is a universal learning machine widely used in pattern recognition, regression estimation and probability density estimation. In this paper, a new method using a wavelet kernel function is proposed, which maps the input vector x_i into a high-dimensional feature space F. In this feature space F, we can construct the optimal hyperplane that realizes the maximal margin. That is to say, we can use the SVM to classify communication signals into two groups, namely analogue modulated signals and digitally modulated signals. Computer simulation results are given, which show the good performance of the method.
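
    A custom wavelet kernel can be plugged into an off-the-shelf SVM by passing a Gram-matrix callable. The sketch below uses a Morlet-type wavelet kernel often cited in this line of work; the dilation parameter a and the 1.75 constant are assumptions, and the feature vectors stand in for whatever modulation features are extracted from the signals.

```python
import numpy as np
from sklearn.svm import SVC

def wavelet_kernel(X, Y, a=1.0):
    """Gram matrix of a Morlet-type wavelet kernel:
    K(x, y) = prod_j cos(1.75*(x_j - y_j)/a) * exp(-((x_j - y_j)/a)^2 / 2)."""
    D = (X[:, None, :] - Y[None, :, :]) / a
    return np.prod(np.cos(1.75 * D) * np.exp(-0.5 * D ** 2), axis=2)

# X: feature vectors extracted from the signals; y: 0 = analogue, 1 = digital.
clf = SVC(kernel=wavelet_kernel)
# clf.fit(X, y); clf.predict(X_new)
```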

  9. Automatic Modulation Recognition by Support Vector Machines Using Wavelet Kernel

    International Nuclear Information System (INIS)

    Feng, X Z; Yang, J; Luo, F L; Chen, J Y; Zhong, X P

    2006-01-01

    Automatic modulation identification plays a significant role in electronic warfare, electronic surveillance systems and electronic countermeasures. The task of modulation recognition of communication signals is to determine the modulation type and signal parameters. In fact, automatic modulation identification can be regarded as an application of pattern recognition in the communications field. The support vector machine (SVM) is a universal learning machine widely used in pattern recognition, regression estimation and probability density estimation. In this paper, a new method using a wavelet kernel function is proposed, which maps the input vector x_i into a high-dimensional feature space F. In this feature space F, we can construct the optimal hyperplane that realizes the maximal margin. That is to say, we can use the SVM to classify communication signals into two groups, namely analogue modulated signals and digitally modulated signals. Computer simulation results are given, which show the good performance of the method.

  10. Automatic Reverse Engineering of Private Flight Control Protocols of UAVs

    Directory of Open Access Journals (Sweden)

    Ran Ji

    2017-01-01

    Full Text Available The increasing use of civil unmanned aerial vehicles (UAVs) has the potential to threaten public safety and privacy. Therefore, airspace administrators urgently need an effective method to regulate UAVs. Understanding the meaning and format of UAV flight control commands by automatic protocol reverse-engineering techniques is highly beneficial to UAV regulation. To improve our understanding of the meaning and format of UAV flight control commands, this paper proposes a method to automatically analyze the private flight control protocols of UAVs. First, we classify flight control commands collected from a binary network trace into clusters; then, we analyze the meaning of the flight control commands by the accumulated error of each cluster; next, we extract the binary format of the commands and infer field semantics in these commands; and finally, we infer the location of the check field in each command and the generator polynomial matrix. The proposed approach is validated via experiments on a widely used consumer UAV.
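
    The first step, grouping captured commands before inferring field boundaries, can be sketched with a simple byte-level k-means. This is a generic illustration rather than the paper's clustering criterion, and it assumes the captured command payloads can be padded or truncated to a common length.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_commands(payloads, n_clusters=8, length=32):
    """Cluster raw command payloads (bytes objects) by their byte values."""
    X = np.zeros((len(payloads), length))
    for i, p in enumerate(payloads):
        b = p[:length].ljust(length, b"\x00")        # pad/truncate to a fixed length
        X[i] = np.frombuffer(b, dtype=np.uint8)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    return km.labels_
```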

  11. Microprocessor controlled system for automatic and semi-automatic syntheses of radiopharmaceuticals

    International Nuclear Information System (INIS)

    Ruth, T.J.; Adam, M.J.; Morris, D.; Jivan, S.

    1986-01-01

    A computer-based system has been constructed to control the automatic synthesis of 2-deoxy-2-(18F)fluoro-D-glucose and is also being used in the development of an automatic synthesis of L-6-(18F)fluorodopa. (author)

  12. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual...

  13. Automatic Loop Parallelization via Compiler Guided Refactoring

    DEFF Research Database (Denmark)

    Larsen, Per; Ladelsky, Razya; Lidman, Jacob

    For many parallel applications, performance relies not on instruction-level parallelism, but on loop-level parallelism. Unfortunately, many modern applications are written in ways that obstruct automatic loop parallelization. Since we cannot identify sufficient parallelization opportunities for these codes in a static, off-line compiler, we developed an interactive compilation feedback system that guides the programmer in iteratively modifying application source, thereby improving the compiler's ability to generate loop-parallel code. We use this compilation system to modify two sequential benchmarks, finding that the code parallelized in this way runs up to 8.3 times faster on an octo-core Intel Xeon 5570 system and up to 12.5 times faster on a quad-core IBM POWER6 system. Benchmark performance varies significantly between the systems. This suggests that semi-automatic parallelization should...

  14. Development of a doorframe-typed swinging seedling pick-up device for automatic field transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Han, H.; Mao, H.; Hu, J.; Tian, K.

    2015-07-01

    A doorframe-typed swinging seedling pick-up device for automatic field transplanters was developed and evaluated in a laboratory. The device, consisting of a path manipulator and two grippers, can move the pins slowly to extract seedlings from the tray cells and return quickly to the pick-up point for the next extraction. The path manipulator was constructed using a type-Ⅱ mechanism combination in series. It consists of an oscillating guide linkage mechanism and a grooved globoidal cam mechanism. The gripper is a pincette-type mechanism whose pick-up pins penetrate the root mass for seedling extraction. The dynamics of the designed seedling pick-up device were simulated with ADAMS software. As this was the first prototype, various performance tests were conducted under local production conditions to find the optimal machine operation parameters and transplant production conditions. Because the gripper with multiple fine pins is moved by the swing pick-up device, it can effectively complete the transplanting work cycle of extracting, transferring, and discharging a seedling. The laboratory evaluation showed that the pick-up device equipped with two grippers can extract 80 seedlings/min with a 90% success rate and a 3% failure rate in discharging seedlings, using 42-day-old tomato plantlets. The quality of seedling extraction was satisfactory. (Author)

  15. ANALYSIS OF EXISTING AND PROSPECTIVE TECHNICAL CONTROL SYSTEMS OF NUMERIC CODES AUTOMATIC BLOCKING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2013-09-01

    Full Text Available Purpose: To identify the characteristic features of existing technical control systems for numeric-code automatic blocking, to identify their advantages and disadvantages, and to analyze the possibility of using them for diagnosing the status of automatic blocking devices and for setting targets for the development of new diagnostic systems. Methodology: To achieve these targets, theoretical-analytical methods and the method of functional analysis have been used. Findings: The analysis of existing and future facilities for remote control and diagnostics of automatic blocking devices showed that the existing diagnostic systems are not sufficiently informative and are designed primarily to monitor discrete parameters, which in turn does not allow a decision-support subsystem to be constructed. For the development of new technical diagnostic systems it is proposed to use the principle of centralized distributed processing of diagnostic data and to include a decision-support subsystem in the diagnostic system; this will reduce the amount of maintenance work on the blocking devices and shorten the recovery time after a failure occurs. Originality: The currently existing technical control facilities for automatic blocking cannot provide a full assessment of the state of block-section signalling and interlocking devices. Criteria for the development of new technical diagnostic systems, with increased amounts of diagnostic information and automatic analysis of that information, are proposed. Practical value: These results can be used in practice when selecting technical controls for automatic blocking devices, as well as in the further development of automatic-blocking diagnostic systems, allowing a gradual transition from a planned preventive maintenance model to maintenance based on the actual state of the monitored devices.

  16. Motor automaticity in Parkinson’s disease

    Science.gov (United States)

    Wu, Tao; Hallett, Mark; Chan, Piu

    2017-01-01

    Bradykinesia is the most important feature contributing to motor difficulties in Parkinson’s disease (PD). However, the pathophysiology underlying bradykinesia is not fully understood. One important aspect is that PD patients have difficulty in performing learned motor skills automatically, but this problem has been generally overlooked. Here we review motor automaticity associated motor deficits in PD, such as reduced arm swing, decreased stride length, freezing of gait, micrographia and reduced facial expression. Recent neuroimaging studies have revealed some neural mechanisms underlying impaired motor automaticity in PD, including less efficient neural coding of movement, failure to shift automated motor skills to the sensorimotor striatum, instability of the automatic mode within the striatum, and use of attentional control and/or compensatory efforts to execute movements usually performed automatically in healthy people. PD patients lose previously acquired automatic skills due to their impaired sensorimotor striatum, and have difficulty in acquiring new automatic skills or restoring lost motor skills. More investigations on the pathophysiology of motor automaticity, the effect of L-dopa or surgical treatments on automaticity, and the potential role of using measures of automaticity in early diagnosis of PD would be valuable. PMID:26102020

  17. Robust automatic high resolution segmentation of SOFC anode porosity in 3D

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Bowen, Jacob R.

    2008-01-01

    Routine use of 3D characterization of SOFCs by focused ion beam (FIB) serial sectioning is generally restricted by the time-consuming task of manually delineating structures within each image slice. We apply advanced image analysis algorithms to automatically segment the porosity phase of an SOFC anode in 3D. The technique is based on numerical approximations to partial differential equations to evolve a 3D surface to the desired phase boundary. Vector fields derived from the experimentally acquired data are used as the driving force. The automatic segmentation compared to manual delineation reveals a good correspondence, and the two approaches are quantitatively compared. It is concluded that the automatic approach is more robust, more reproducible and orders of magnitude quicker than manual segmentation of SOFC anode porosity for subsequent quantitative 3D analysis. Lastly...

  18. Automatic Assessment of Craniofacial Growth in a Mouse Model of Crouzon Syndrome

    DEFF Research Database (Denmark)

    Thorup, Signe Strann; Larsen, Rasmus; Darvann, Tron Andre

    2009-01-01

    Non-rigid volumetric image registration was applied to micro-CT scans of ten 4-week and twenty 6-week euthanized mice for growth modeling. Each age group consisted of 50% normal and 50% Crouzon mice. Four 3D mean shapes, one for each mouse-type and age group, were created. Extracting a dense field of growth vectors for each mouse-type, growth models were created using linear interpolation and visualized as 3D animations. Spatial regions of significantly different growth were identified using the local False Discovery Rate method, estimating the expected percentage of false predictions in a set of predictions. For all ... a tool for spatially detailed automatic phenotyping. MAIN OBJECTIVES OF PRESENTATION: We will present a 3D growth model of normal and Crouzon mice, and differences will be statistically and visually compared.

  19. Automatic noninvasive measurement of systolic blood pressure using photoplethysmography

    Directory of Open Access Journals (Sweden)

    Glik Zehava

    2009-10-01

    Full Text Available Abstract Background: Automatic measurement of arterial blood pressure is important, but the available commercial automatic blood pressure meters, mostly based on oscillometry, are of low accuracy. Methods: In this study, we present a cuff-based technique for automatic measurement of systolic blood pressure, based on photoplethysmographic signals measured simultaneously in fingers of both hands. After inflating the pressure cuff to a level above systolic blood pressure at a relatively slow rate, it is slowly deflated. The cuff pressure for which the photoplethysmographic signal reappeared during the deflation of the pressure cuff was taken as the systolic blood pressure. The algorithm for the detection of the photoplethysmographic signal involves: (1) determination of the time-segments in which the photoplethysmographic signal distal to the cuff is expected to appear, utilizing the photoplethysmographic signal in the free hand, and (2) discrimination between random fluctuations and the photoplethysmographic pattern. The detected pulses in the time-segments were identified as photoplethysmographic pulses if they met two criteria, based on the pulse waveform and on the correlation between the signal in each segment and the signal in the two neighboring segments. Results: Comparison of the photoplethysmography-based automatic technique to sphygmomanometry, the reference standard, shows that the standard deviation of their differences was 3.7 mmHg. For subjects with systolic blood pressure above 130 mmHg the standard deviation was even lower, 2.9 mmHg. These values are much lower than the 8 mmHg value imposed by the AAMI standard for automatic blood pressure meters. Conclusion: The photoplethysmography-based technique for automatic measurement of systolic blood pressure, and the algorithm presented in this study, seem to be accurate.
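
    The core decision rule, taking the cuff pressure at which distal photoplethysmographic pulses reappear during deflation as the systolic pressure, can be sketched as below. This simplified version uses an amplitude criterion only and omits the paper's use of the free-hand signal to define expected pulse time-segments; the window length and threshold factor are assumptions.

```python
import numpy as np

def estimate_sbp(cuff_pressure_mmHg, ppg_distal, fs, window_s=2.0, k=4.0):
    """Return the cuff pressure at which distal PPG pulses reappear.
    Assumes the recording starts with the cuff inflated above systolic pressure,
    so the first window contains no pulses and serves as the noise reference."""
    n = int(window_s * fs)
    noise = np.std(ppg_distal[:n])
    for start in range(n, len(ppg_distal) - n, n):
        segment = ppg_distal[start:start + n]
        if np.ptp(segment) > k * noise:          # pulse amplitude exceeds noise floor
            return cuff_pressure_mmHg[start]
    return None
```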

  20. Template-based automatic extraction of the joint space of foot bones from CT scan

    Science.gov (United States)

    Park, Eunbi; Kim, Taeho; Park, Jinah

    2016-03-01

    Clean bone segmentation is critical in studying joint anatomy for measuring the spacing between the bones. However, separation of the coupled bones in CT images is sometimes difficult due to ambiguous gray values coming from noise and the heterogeneity of bone materials, as well as narrowing of the joint space. For fine reconstruction of the individual local boundaries, manual operation is common practice, and this segmentation remains a bottleneck. In this paper, we present an automatic method for extracting the joint space by applying graph cut on a Markov random field model to the region of interest (ROI), which is identified by a template of 3D bone structures. The template includes an encoded articular surface which identifies the tight region of the high-intensity bone boundaries together with the fuzzy joint area of interest. The localized shape information from the template model within the ROI effectively separates the nearby bones. By narrowing the ROI down to a region including two types of tissue, the object extraction problem was reduced to binary segmentation and solved via graph cut. Based on the shape of the joint space marked by the template, the hard constraint was set by the initial seeds, which were automatically generated from thresholding and morphological operations. The performance and robustness of the proposed method are evaluated on 12 volumes of ankle CT data, where each volume includes a set of 4 tarsal bones (calcaneus, talus, navicular and cuboid).

  1. Automatic Photoelectric Telescope Service

    International Nuclear Information System (INIS)

    Genet, R.M.; Boyd, L.J.; Kissell, K.E.; Crawford, D.L.; Hall, D.S.; BDM Corp., McLean, VA; Kitt Peak National Observatory, Tucson, AZ; Dyer Observatory, Nashville, TN)

    1987-01-01

    Automatic observatories have the potential of gathering sizable amounts of high-quality astronomical data at low cost. The Automatic Photoelectric Telescope Service (APT Service) has realized this potential and is routinely making photometric observations of a large number of variable stars. However, without observers to provide on-site monitoring, it was necessary to incorporate special quality checks into the operation of the APT Service at its multiple automatic telescope installation on Mount Hopkins. 18 references

  2. Automatic detection of osteoporotic vertebral fractures in routine thoracic and abdominal MDCT

    Energy Technology Data Exchange (ETDEWEB)

    Baum, Thomas; Dobritz, Martin; Rummeny, Ernst J.; Noel, Peter B. [Technische Universitaet Muenchen, Institut fuer Radiologie, Klinikum rechts der Isar, Muenchen (Germany); Bauer, Jan S. [Technische Universitaet Muenchen, Abteilung fuer Neuroradiologie, Klinikum rechts der Isar, Muenchen (Germany); Klinder, Tobias; Lorenz, Cristian [Philips Research Laboratories, Hamburg (Germany)

    2014-04-15

    To develop a prototype algorithm for automatic spine segmentation in MDCT images and use it to automatically detect osteoporotic vertebral fractures. Cross-sectional routine thoracic and abdominal MDCT images of 71 patients including 8 males and 9 females with 25 osteoporotic vertebral fractures and longitudinal MDCT images of 9 patients with 18 incidental fractures in the follow-up MDCT were retrospectively selected. The spine segmentation algorithm localised and identified the vertebrae T5-L5. Each vertebra was automatically segmented by using corresponding vertebra surface shape models that were adapted to the original images. Anterior, middle, and posterior height of each vertebra was automatically determined; the anterior-posterior ratio (APR) and middle-posterior ratio (MPR) were computed. As the gold standard, radiologists graded vertebral fractures from T5 to L5 according to the Genant classification in consensus. Using ROC analysis to differentiate vertebrae without versus with prevalent fracture, AUC values of 0.84 and 0.83 were obtained for APR and MPR, respectively (p < 0.001). Longitudinal changes in APR and MPR were significantly different between vertebrae without versus with incidental fracture (ΔAPR: -8.5 % ± 8.6 % versus -1.6 % ± 4.2 %, p = 0.002; ΔMPR: -11.4 % ± 7.7 % versus -1.2 % ± 1.6 %, p < 0.001). This prototype algorithm may support radiologists in reporting currently underdiagnosed osteoporotic vertebral fractures so that appropriate therapy can be initiated. • This spine segmentation algorithm automatically localised, identified, and segmented the vertebrae in MDCT images. (orig.)
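
    Once the three vertebral heights are available, the shape ratios and ROC analysis are straightforward to reproduce. The sketch below uses illustrative names only; the ratio is negated so that lower ratios score as more likely fractured.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def height_ratios(h_anterior, h_middle, h_posterior):
    """Anterior-posterior (APR) and middle-posterior (MPR) vertebral height ratios."""
    h_a, h_m, h_p = map(np.asarray, (h_anterior, h_middle, h_posterior))
    return h_a / h_p, h_m / h_p

# fracture = 1 for vertebrae graded as fractured, 0 otherwise (one entry per vertebra)
# apr, mpr = height_ratios(h_anterior, h_middle, h_posterior)
# auc_apr = roc_auc_score(fracture, -apr)   # lower APR -> higher fracture score
# auc_mpr = roc_auc_score(fracture, -mpr)
```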

  3. The Role of Automatic Obesity Stereotypes in Real Hiring Discrimination

    Science.gov (United States)

    Agerstrom, Jens; Rooth, Dan-Olof

    2011-01-01

    This study examined whether automatic stereotypes captured by the implicit association test (IAT) can predict real hiring discrimination against the obese. In an unobtrusive field experiment, job applications were sent to a large number of real job vacancies. The applications were matched on credentials but differed with respect to the applicant's…

  4. Genital automatisms: Reappraisal of a remarkable but ignored symptom of focal seizures.

    Science.gov (United States)

    Dede, Hava Özlem; Bebek, Nerses; Gürses, Candan; Baysal-Kıraç, Leyla; Baykan, Betül; Gökyiğit, Ayşen

    2018-03-01

    Genital automatisms (GAs) are uncommon clinical phenomena of focal seizures. They are defined as repeated fondling, grabbing, or scratching of the genitals. The aim of this study was to determine the lateralizing and localizing value and associated clinical characteristics of GAs. Three hundred thirteen consecutive patients with drug-resistant seizures who were referred to our tertiary center for presurgical evaluation between 2009 and 2016 were investigated. The incidence of specific kinds of behavior, clinical semiology, associated symptoms/signs with corresponding ictal electroencephalography (EEG) findings, and their potential role in seizure localization and lateralization were evaluated. Fifteen (4.8%) of 313 patients had GAs. Genital automatisms were identified in 19 (16.4%) of a total 116 seizures. Genital automatisms were observed to occur more often in men than in women (M/F: 10/5). Nine of fifteen patients (60%) had temporal lobe epilepsy (right/left: 4/5) and three (20%) had frontal lobe epilepsy (right/left: 1/2), whereas the remaining two patients could not be classified. One patient was diagnosed as having Rasmussen encephalitis. Genital automatisms were ipsilateral to epileptic focus in 12 patients and contralateral in only one patient according to ictal-interictal EEG and neuroimaging findings. Epileptic focus could not be lateralized in the last 2 patients. Genital automatisms were associated with unilateral hand automatisms such as postictal nose wiping or manual automatisms in 13 (86.7%) of 15 and contralateral dystonia was seen in 6 patients. All patients had amnesia of the performance of GAs. Genital automatisms are more frequent in seizures originating from the temporal lobe, and they can also be seen in frontal lobe seizures. Genital automatisms seem to have a high lateralizing value to the ipsilateral hemisphere and are mostly concordant with other unilateral hand automatisms. Men exhibit GAs more often than women. Copyright © 2017

  5. Automatic generation of groundwater model hydrostratigraphy from AEM resistivity and boreholes

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; Christiansen, A. V.

    2014-01-01

    Regional hydrological models are important tools in water resources management. Model prediction uncertainty is primarily due to structural (geological) non-uniqueness, which makes sampling of the structural model space necessary to estimate prediction uncertainties. Geological structures and heterogeneities ... and discharge observations. The method was applied to field data collected at a Danish field site. Our results show that a competitive hydrological model can be constructed from the AEM dataset using the automatic procedure outlined above.

  6. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    Andreewsky, Alexandre; Fluhr, Christian.

    1975-06-01

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of the automatic building of thesauri and of automatic classification are examined [fr

  7. Automatic alignment of radionuclide images

    International Nuclear Information System (INIS)

    Barber, D.C.

    1982-01-01

    The variability of the position, dimensions and orientation of a radionuclide image within the field of view of a gamma camera hampers attempts to analyse the image numerically. This paper describes a method of using a set of training images of a particular type, in this case right lateral brain images, to define the likely variations in the position, dimensions and orientation for that type of image and to provide alignment data for a program that automatically aligns new images of the specified type to a standard position, size and orientation. Examples are given of the use of this method on three types of radionuclide image. (author)

  8. Magnetic field systems employing a superconducting D.C. field coil

    International Nuclear Information System (INIS)

    Bartram, T.C.; Hazell, P.A.

    1977-01-01

    Method and equipment for transferring energy to or from a direct-current superconducting field coil to change the magnetic field generated by the coil, in which a second direct-current superconducting coil is used as a storage coil, and energy transfer between the field coil and the storage coil is effected automatically in dependence upon a control program. Preferably, the control program acts upon a variable transformer which is coupled by respective rectifier/inverters to the field and storage coils and also serves for the initial supply of energy to the coils.

  9. Automatic yield-line analysis of slabs using discontinuity layout optimization.

    Science.gov (United States)

    Gilbert, Matthew; He, Linwei; Smith, Colin C; Le, Canh V

    2014-08-08

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented.

  10. Segmenting articular cartilage automatically using a voxel classification approach

    DEFF Research Database (Denmark)

    Folkesson, Jenny; Dam, Erik B; Olsen, Ole F

    2007-01-01

    We present a fully automatic method for articular cartilage segmentation from magnetic resonance imaging (MRI), which we use as the foundation of a quantitative cartilage assessment. We evaluate our method by comparison to manual segmentations by a radiologist and by examining the interscan reproducibility of the volume and area estimates. Training and evaluation of the method are performed on a data set consisting of 139 scans of knees with a status ranging from healthy to severely osteoarthritic. This is, to our knowledge, the only fully automatic cartilage segmentation method that has good agreement with manual segmentations, an interscan reproducibility as good as that of a human expert, and enables the separation between healthy and osteoarthritic populations. While high-field scanners offer high-quality imaging from which the articular cartilage has been evaluated extensively using manual...

  11. Automatic Data Logging and Quality Analysis System for Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yong-Yi Fanjiang

    2017-01-01

    Full Text Available The testing phase of mobile device products includes two important test projects that must be completed before shipment: the field trial and the beta user trial. During the field trial, the product is certified based on its integration and stability with the local operator's system, and, during the beta user trial, the product is certified by multiple users regarding its daily use, where the goal is to detect and solve problems early. In the traditional approach to reporting issues, testers must log into a web site, fill out a problem form, and then upload logs through a browser or FTP; this is inconvenient, and problems are reported slowly. Therefore, we propose an "automatic logging analysis system" (ALAS) to construct a convenient test environment and, using a record analysis (log parser) program, automate the parsing of log files and have issues sent automatically to the database by the system. Finally, the mean time between failures (MTBF) is used to establish measurement indicators for the beta user trial.
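
    As a toy illustration of the log-parsing and MTBF reporting that such a system automates, consider the following sketch; the failure markers and log format are hypothetical placeholders, not ALAS's actual parser.

```python
import re

# Hypothetical failure markers; a real parser would match the device's actual log format.
FAILURE_PATTERN = re.compile(r"\b(FATAL|crash|kernel panic)\b", re.IGNORECASE)

def mtbf_from_logs(log_lines, total_operating_hours):
    """Count failure entries in uploaded device logs and report MTBF in hours."""
    failures = sum(1 for line in log_lines if FAILURE_PATTERN.search(line))
    return total_operating_hours / failures if failures else float("inf")
```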

  12. AISLE: an automatic volumetric segmentation method for the study of lung allometry.

    Science.gov (United States)

    Ren, Hongliang; Kazanzides, Peter

    2011-01-01

    We developed a fully automatic segmentation method for volumetric CT (computed tomography) datasets to support construction of a statistical atlas for the study of allometric laws of the lung. The proposed segmentation method, AISLE (Automated ITK-Snap based on Level-set), is based on the level-set implementation from an existing semi-automatic segmentation program, ITK-Snap. AISLE can segment the lung field without human interaction and provide intermediate graphical results as desired. The preliminary experimental results show that the proposed method achieves accurate segmentation, in terms of the volumetric overlap metric, when compared with the ground-truth segmentation performed by a radiologist.
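
    A generic level-set lung-field segmentation in the spirit of the above, but not the AISLE/ITK-Snap implementation itself, can be sketched with scikit-image's morphological Chan-Vese solver; the iteration count and checkerboard initialization are assumptions.

```python
from skimage.segmentation import morphological_chan_vese, checkerboard_level_set

def segment_slice(ct_slice, iterations=200):
    """Evolve a level set on one normalized CT slice and return a binary mask."""
    init = checkerboard_level_set(ct_slice.shape, square_size=10)
    return morphological_chan_vese(ct_slice, iterations,
                                   init_level_set=init, smoothing=3)
```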

  13. Connection of automatic integral multichannel monitor of aerosol concentration

    International Nuclear Information System (INIS)

    Krejci, M.; Stulik, P.

    1985-01-01

    The instrument consists of the actual aerosol concentration monitor with two equivalent inputs, an electropneumatic sampling selector, an aerosol pump, an electropneumatic valve, and an exhaust device. In the integral operating mode the instrument allows rapid checking and indication of exceedance of the permissible aerosol concentration limit at any sampling point. When the permissible concentration limit is exceeded, the device automatically switches into the multichannel cyclic measurement mode and identifies the sampling point at which the aerosol concentration increased. An emergency indication is displayed if the permissible limit has been exceeded. Following removal of the source of the dangerous aerosol concentration, the control unit automatically switches the device back into the integral measurement mode. (J.B.)

  14. Efficient Semi-Automatic 3D Segmentation for Neuron Tracing in Electron Microscopy Images

    Science.gov (United States)

    Jones, Cory; Liu, Ting; Cohan, Nathaniel Wood; Ellisman, Mark; Tasdizen, Tolga

    2015-01-01

    Background: In the area of connectomics, there is a significant gap between the time required for data acquisition and dense reconstruction of the neural processes contained in the same dataset. Automatic methods are able to eliminate this timing gap, but the state-of-the-art accuracy so far is insufficient for use without user corrections. If completed naively, this process of correction can be tedious and time consuming. New Method: We present a new semi-automatic method that can be used to perform 3D segmentation of neurites in EM image stacks. It utilizes an automatic method that creates a hierarchical structure for recommended merges of superpixels. The user is then guided through each predicted region to quickly identify errors and establish correct links. Results: We tested our method on three datasets with both novice and expert users. Accuracy and timing were compared with published automatic, semi-automatic, and manual results. Comparison with Existing Methods: Post-automatic correction methods have also been used in [1] and [2]. These methods do not provide navigation or suggestions in the manner we present. Other semi-automatic methods require user input prior to the automatic segmentation such as [3] and [4] and are inherently different than our method. Conclusion: Using this method on the three datasets, novice users achieved accuracy exceeding state-of-the-art automatic results, and expert users achieved accuracy on par with full manual labeling but with a 70% time improvement when compared with other examples in publication. PMID:25769273

  15. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

    Full Text Available Automatic differentiation is a relatively recent technique for the differentiation of functions, applicable directly to the source code that computes the function, written in standard programming languages. The technique permits automation of the differentiation step, which is crucial for dynamic simulation and optimization of processes. The values of the derivatives obtained with AD are exact (to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as in differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of automatic differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code and the incorporation of AD tools in consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.
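
    The idea of applying the rules of differentiation to an algorithmic specification can be made concrete with a tiny forward-mode example based on dual numbers; this is a minimal sketch, not any particular AD tool used in process simulators.

```python
import math
from dataclasses import dataclass

@dataclass
class Dual:
    val: float   # function value
    der: float   # derivative carried alongside the value

    def __add__(self, other):
        return Dual(self.val + other.val, self.der + other.der)

    def __mul__(self, other):
        # Product rule applied to the algorithmic specification, not to a formula.
        return Dual(self.val * other.val, self.der * other.val + self.val * other.der)

def sin(x: Dual) -> Dual:
    return Dual(math.sin(x.val), math.cos(x.val) * x.der)

x = Dual(2.0, 1.0)      # seed dx/dx = 1
y = x * sin(x)          # y = x*sin(x)
print(y.val, y.der)     # value and derivative, exact to roundoff
```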

  16. Semi Automatic Ontology Instantiation in the domain of Risk Management

    Science.gov (United States)

    Makki, Jawad; Alquier, Anne-Marie; Prince, Violaine

    One of the challenging tasks in the context of ontological engineering is to automatically or semi-automatically support the process of ontology learning and ontology population from semi-structured documents (texts). In this paper we describe a semi-automatic ontology instantiation method from natural language text in the domain of risk management. The method is composed of three steps: 1) annotation with part-of-speech tags, 2) semantic relation instance extraction, and 3) the ontology instantiation process. It is based on combined NLP techniques, with human intervention between steps 2 and 3 for control and validation. Since it relies heavily on linguistic knowledge, it is not domain dependent, which is a good feature for portability between the different fields of risk management application. The proposed methodology uses the ontology of the PRIMA project (supported by the European Community) as a generic domain ontology and populates it via an available corpus. A first validation of the approach is done through an experiment with Chemical Fact Sheets from the Environmental Protection Agency.

  17. Automatic welding machine for piping

    International Nuclear Information System (INIS)

    Yoshida, Kazuhiro; Koyama, Takaichi; Iizuka, Tomio; Ito, Yoshitoshi; Takami, Katsumi.

    1978-01-01

    A remotely controlled automatic special welding machine for piping was developed. The machine can be used effectively for long-distance pipelines, chemical plants, thermal power plants and nuclear power plants, offering good quality control, reduced labor and good controllability. Its functions are to inspect the shape and dimensions of the edge preparation by touch sensing before welding; to detect the temperature of the melt pool, inspect the bead form by touch sensing and monitor the welding state by ITV during welding; and to grind the bead surface and inspect the weld metal by ultrasonic testing automatically after welding. The construction of the welding system, the main specifications of the apparatus, the detailed welding procedure, the electrical power source, the cooling system, the structure and handling of the guide ring, the central control system and the operating characteristics are explained. The working procedure, the benefits of using this welding machine, and its application to nuclear power plants and other industrial fields are outlined. The HIDIC 08 is used as the controlling computer. The machine is suitable for welding stainless steel (SUS) piping as well as carbon steel piping. (Nakai, Y.)

  18. The estimation of tax-benefit automatic stabilizers in Serbia: A combined micro-macro approach

    Directory of Open Access Journals (Sweden)

    Ranđelović Saša

    2013-01-01

    Full Text Available The large volatility of GDP due to the economic crisis, particularly in transition economies, has brought the issue of automatic stabilizers back into the focus of economic policy. The vast majority of the empirical literature in this field relates to the estimation of the size of automatic stabilizers in developed countries, usually based on macroeconomic data. On the other hand, the empirical literature on this topic based on micro data, particularly for transition economies, is limited. This paper provides an evaluation of the size of automatic stabilizers in one transition economy (Serbia) by combining tax-benefit simulation modelling based on micro data with econometric methods based on macroeconomic data. The results show that, in the case of a shock, around 17% of a fall in market income would be absorbed by automatic stabilizers. Although the stabilizing effects of the tax-benefit system in Serbia are lower than in other European countries, the total size of automatic stabilizers is close to the average value in these countries, due to the higher elasticity of demand to income. The results also show that a progressivity-enhancing income tax reform would only slightly increase automatic stabilizers, due to the large informal economy and the large share of agriculture in total household income.
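
    The 17% figure quoted above is a single summary statistic; a common way such numbers are defined in this literature, given here only as an illustrative formula rather than the paper's exact specification, is the income stabilization coefficient:

      \tau \;=\; 1 - \frac{\sum_i \Delta Y_i^{D}}{\sum_i \Delta Y_i^{M}}

    where \Delta Y_i^{M} is the change in household i's market income under the simulated shock and \Delta Y_i^{D} is the resulting change in disposable income; \tau \approx 0.17 then corresponds to 17% of the shock being absorbed by taxes and benefits rather than passed through to disposable income.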

  19. Automatic sets and Delone sets

    International Nuclear Information System (INIS)

    Barbe, A; Haeseler, F von

    2004-01-01

    Automatic sets D ⊂ Z^m are characterized by having a finite number of decimations. They are equivalently generated by fixed points of certain substitution systems, or by certain finite automata. As examples, two-dimensional versions of the Thue-Morse, Baum-Sweet, Rudin-Shapiro and paperfolding sequences are presented. We give a necessary and sufficient condition for an automatic set D ⊂ Z^m to be a Delone set in R^m. The result is then extended to automatic sets that are defined as fixed points of certain substitutions. The morphology of automatic sets is discussed by means of examples
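
    As a concrete illustration (not taken from the paper, and only one of several possible two-dimensional constructions), the snippet below generates a two-dimensional Thue-Morse set: the point (i, j) is kept when the total number of 1-bits in the binary expansions of i and j is even, a property a finite automaton can decide by reading the base-2 digits of (i, j).

      # Illustrative 2D Thue-Morse automatic set on a small grid.
      # A point (i, j) is in the set when the total number of 1-bits in the
      # binary expansions of i and j is even.

      def in_thue_morse_2d(i: int, j: int) -> bool:
          return (bin(i).count("1") + bin(j).count("1")) % 2 == 0

      N = 16
      for j in range(N):
          print("".join("#" if in_thue_morse_2d(i, j) else "." for i in range(N)))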

  20. Automatic block-matching registration to improve lung tumor localization during image-guided radiotherapy

    Science.gov (United States)

    Robertson, Scott Patrick

    To improve relatively poor outcomes for locally-advanced lung cancer patients, many current efforts are dedicated to minimizing uncertainties in radiotherapy. This enables the isotoxic delivery of escalated tumor doses, leading to better local tumor control. The current dissertation specifically addresses inter-fractional uncertainties resulting from patient setup variability. An automatic block-matching registration (BMR) algorithm is implemented and evaluated for the purpose of directly localizing advanced-stage lung tumors during image-guided radiation therapy. In this algorithm, small image sub-volumes, termed "blocks", are automatically identified on the tumor surface in an initial planning computed tomography (CT) image. Each block is independently and automatically registered to daily images acquired immediately prior to each treatment fraction. To improve the accuracy and robustness of BMR, this algorithm incorporates multi-resolution pyramid registration, regularization with a median filter, and a new multiple-candidate-registrations technique. The result of block-matching is a sparse displacement vector field that models local tissue deformations near the tumor surface. The distribution of displacement vectors is aggregated to obtain the final tumor registration, corresponding to the treatment couch shift for patient setup correction. Compared to existing rigid and deformable registration algorithms, the final BMR algorithm significantly improves the overlap between target volumes from the planning CT and registered daily images. Furthermore, BMR results in the smallest treatment margins for the given study population. However, despite these improvements, large residual target localization errors were noted, indicating that purely rigid couch shifts cannot correct for all sources of inter-fractional variability. Further reductions in treatment uncertainties may require the combination of high-quality target localization and adaptive radiotherapy.
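
    The dissertation abstract describes block matching only at a high level; as a much-simplified 2D sketch (NumPy; block centres, block size and search range are arbitrary choices here, and the multi-resolution, median-filter and multiple-candidate refinements mentioned above are omitted), the fragment below registers each block by exhaustive search over translations and aggregates the per-block displacement vectors with a median to obtain a single shift estimate.

      import numpy as np

      def match_block(fixed, moving, center, half=8, search=5):
          """Find the integer shift of one block by minimising the sum of squared differences."""
          cy, cx = center
          block = fixed[cy - half:cy + half, cx - half:cx + half]
          best, best_shift = np.inf, (0, 0)
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  cand = moving[cy - half + dy:cy + half + dy,
                                cx - half + dx:cx + half + dx]
                  ssd = np.sum((block - cand) ** 2)
                  if ssd < best:
                      best, best_shift = ssd, (dy, dx)
          return best_shift

      def estimate_shift(fixed, moving, centers):
          """Aggregate per-block displacements into a single translation estimate."""
          shifts = np.array([match_block(fixed, moving, c) for c in centers])
          return np.median(shifts, axis=0)   # robust to a few badly matched blocks

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          fixed = rng.normal(size=(128, 128))
          moving = np.roll(fixed, shift=(3, -2), axis=(0, 1))   # known (3, -2) shift
          centers = [(40, 40), (40, 90), (90, 40), (90, 90)]     # hypothetical block centres
          print(estimate_shift(fixed, moving, centers))          # expect ~[3, -2]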

  1. Automatic 3d Building Model Generations with Airborne LiDAR Data

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems have become more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, three-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication and mobile navigation. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications, so a simple and quick approach for automatic 3D building model generation is needed for the many studies that include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is targeted. An approach is proposed that includes automatic point-based classification of the raw LiDAR point cloud using hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules were performed to improve the classification results, using different test areas identified in the study area. The proposed approach was tested in a study area in Zekeriyakoy, Istanbul, which contains partly open areas, forest areas and many types of buildings, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results on the study area verified that automatic 3D
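
    The hierarchical rules themselves are not reproduced in this record; purely as an illustrative sketch with invented thresholds and attribute names, the snippet below shows the general shape of point-based rule classification of a LiDAR cloud into ground, vegetation and building-roof candidates using height above ground and a local planarity attribute.

      import numpy as np

      def classify_points(height_above_ground, planarity,
                          low=0.5, roof=2.5, planar=0.8):
          """Toy hierarchical rules: ground -> low vegetation -> building vs. high vegetation.
          Thresholds (metres / unitless planarity) are illustrative only."""
          labels = np.full(height_above_ground.shape, "unclassified", dtype=object)
          labels[height_above_ground < low] = "ground"
          mid = (height_above_ground >= low) & (height_above_ground < roof)
          labels[mid] = "low_vegetation"
          high = height_above_ground >= roof
          labels[high & (planarity >= planar)] = "building_roof"
          labels[high & (planarity < planar)] = "high_vegetation"
          return labels

      # Example with made-up per-point attributes
      h = np.array([0.1, 1.2, 6.0, 7.5])
      p = np.array([0.2, 0.3, 0.95, 0.4])
      print(classify_points(h, p))
      # ['ground' 'low_vegetation' 'building_roof' 'high_vegetation']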

  2. AUTOMATIC 3D BUILDING MODEL GENERATIONS WITH AIRBORNE LiDAR DATA

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2017-11-01

    Full Text Available LiDAR systems have become more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, three-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems and is of major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication and mobile navigation. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications, so a simple and quick approach for automatic 3D building model generation is needed for the many studies that include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is targeted. An approach is proposed that includes automatic point-based classification of the raw LiDAR point cloud using hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules were performed to improve the classification results, using different test areas identified in the study area. The proposed approach was tested in a study area in Zekeriyakoy, Istanbul, which contains partly open areas, forest areas and many types of buildings, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results on the study area verified

  3. Automatic categorization of diverse experimental information in the bioscience literature.

    Science.gov (United States)

    Fang, Ruihua; Schindelman, Gary; Van Auken, Kimberly; Fernandes, Jolene; Chen, Wen; Wang, Xiaodong; Davis, Paul; Tuli, Mary Ann; Marygold, Steven J; Millburn, Gillian; Matthews, Beverley; Zhang, Haiyan; Brown, Nick; Gelbart, William M; Sternberg, Paul W

    2012-01-26

    Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus, is usually time consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental data types in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community and thereby greatly reduce the time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in the curation work flow at WormBase for
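
    The abstract names the classifier (SVM) but not a toolkit; the fragment below is a generic illustration, using scikit-learn as an assumed library rather than the production WormBase pipeline, of how abstracts can be turned into TF-IDF features and ranked by an SVM decision score for curator triage.

      # Illustrative only: a minimal SVM-based paper classifier for one data type.
      # scikit-learn is an assumed toolkit; the production pipeline may differ.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.svm import LinearSVC
      from sklearn.pipeline import make_pipeline

      train_texts = ["rnai phenotype observed in c. elegans ...",
                     "crystal structure of a kinase domain ...",
                     "genome-wide rnai screen identifies ...",
                     "review of field sampling methods ..."]
      train_labels = [1, 0, 1, 0]          # 1 = contains the curation data type

      clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
      clf.fit(train_texts, train_labels)

      new_papers = ["systematic rnai knockdown reveals ...",
                    "survey of museum specimens ..."]
      scores = clf.decision_function(new_papers)   # rank papers for curator triage
      for paper, s in sorted(zip(new_papers, scores), key=lambda t: -t[1]):
          print(f"{s:+.2f}  {paper}")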

  4. Development and evaluation of new semi-automatic TLD reader software

    International Nuclear Information System (INIS)

    Pathan, M.S.; Pradhan, S.M.; Palani Selvam, T.; Datta, D.

    2018-01-01

    Nowadays, technology advancement is primarily focused on creating a user-friendly environment for operating any machine and on minimizing human error through automation of procedures. In the present study, the development and evaluation of new software for the semi-automatic TLD badge reader (TLDBR-7B) is presented. The software provides an interactive interface and is compatible with the latest Windows operating systems as well as the USB mode of data communication. Important new features of the software are automatic glow curve analysis for identifying any abnormality, an event log register, user-defined limits on TL count and on the time of temperature stabilization for readout interruption, and automatic reading resumption options.

  5. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free RL but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load, a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  6. The ''controbloc'', a programmable automatic device for the 1,300 MW generation of power stations

    International Nuclear Information System (INIS)

    Pralus, B.; Winzelle, J.C.

    1983-01-01

    Technological progress in the field of microelectronics has led to the development of an automatic control device, the ''controbloc'', for operating and controlling nuclear power plants. The ''controbloc'' will be used in automatic systems with a high degree of safety and versatility and is now being installed in the first of the new generation 1,300 MW power stations. The main characteristics of the device and the evaluation tests which have been carried out are described [fr

  7. IADE: a system for intelligent automatic design of bioisosteric analogs

    Science.gov (United States)

    Ertl, Peter; Lewis, Richard

    2012-11-01

    IADE, a software system supporting molecular modellers through the automatic design of non-classical bioisosteric analogs, scaffold hopping and fragment growing, is presented. The program combines sophisticated cheminformatics functionalities for constructing novel analogs and filtering them based on their drug-likeness and synthetic accessibility using automatic structure-based design capabilities: the best candidates are selected according to their similarity to the template ligand and to their interactions with the protein binding site. IADE works in an iterative manner, improving the fitness of designed molecules in every generation until structures with optimal properties are identified. The program frees molecular modellers from routine, repetitive tasks, allowing them to focus on analysis and evaluation of the automatically designed analogs, considerably enhancing their work efficiency as well as the area of chemical space that can be covered. The performance of IADE is illustrated through a case study of the design of a nonclassical bioisosteric analog of a farnesyltransferase inhibitor—an analog that has won a recent "Design a Molecule" competition.

  8. Automatic system for evaluation of ionizing field

    International Nuclear Information System (INIS)

    Pimenta, N.L.; Calil, S.J.

    1992-01-01

    A three-dimensional Cartesian manipulator, able to position an ionization chamber at any point in space for evaluating the ionizing field, was developed. The control system is implemented on an IBM microcomputer. The system is aimed at studying isodose curves from ionizing sources and verifying the performance of radiotherapy equipment. (C.G.C.)

  9. TMB: Automatic Differentiation and Laplace Approximation

    Directory of Open Access Journals (Sweden)

    Kasper Kristensen

    2016-04-01

    Full Text Available TMB is an open source R package that enables quick implementation of complex nonlinear random effects (latent variable) models in a manner similar to the established AD Model Builder package (ADMB, http://admb-project.org/; Fournier et al. 2011). In addition, it offers easy access to parallel computations. The user defines the joint likelihood for the data and the random effects as a C++ template function, while all other operations are done in R, e.g., reading in the data. The package evaluates and maximizes the Laplace approximation of the marginal likelihood, where the random effects are automatically integrated out. This approximation, and its derivatives, are obtained using automatic differentiation (up to order three) of the joint likelihood. The computations are designed to be fast for problems with many random effects (≈ 10^6) and parameters (≈ 10^3). Computation times using ADMB and TMB are compared on a suite of examples ranging from simple models to large spatial models where the random effects are a Gaussian random field. Speedups ranging from 1.5 to about 100 are obtained, with increasing gains for large problems. The package and examples are available at http://tmb-project.org/.
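
    For readers unfamiliar with the approach, the quantity TMB maximizes has the general form below (standard Laplace-approximation notation, not copied from the package documentation): the random effects u are integrated out around their conditional mode.

      L(\theta) \;=\; \int e^{-f(u,\theta)}\,du
      \;\approx\; (2\pi)^{n/2}\,
      \det\!\left(\nabla^2_{u} f(\hat{u}_\theta,\theta)\right)^{-1/2}
      e^{-f(\hat{u}_\theta,\theta)},
      \qquad \hat{u}_\theta = \arg\min_u f(u,\theta)

    where f is the negative joint log-likelihood, u are the random effects and n is their number; the Hessian and the derivatives of the whole expression with respect to \theta are obtained by automatic differentiation.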

  10. Automatisms: bridging clinical neurology with criminal law.

    Science.gov (United States)

    Rolnick, Joshua; Parvizi, Josef

    2011-03-01

    The law, like neurology, grapples with the relationship between disease states and behavior. Sometimes, the two disciplines share the same terminology, such as automatism. In law, the "automatism defense" is a claim that action was involuntary or performed while unconscious. Someone charged with a serious crime can acknowledge committing the act and yet may go free if, relying on the expert testimony of clinicians, the court determines that the act of crime was committed in a state of automatism. In this review, we explore the relationship between the use of automatism in the legal and clinical literature. We close by addressing several issues raised by the automatism defense: semantic ambiguity surrounding the term automatism, the presence or absence of consciousness during automatisms, and the methodological obstacles that have hindered the study of cognition during automatisms. Copyright © 2010 Elsevier Inc. All rights reserved.

  11. Aircraft noise effects on sleep: a systematic comparison of EEG awakenings and automatically detected cardiac activations

    International Nuclear Information System (INIS)

    Basner, Mathias; Müller, Uwe; Elmenhorst, Eva-Maria; Kluge, Götz; Griefahn, Barbara

    2008-01-01

    Polysomnography is the gold standard for investigating noise effects on sleep, but data collection and analysis are sumptuous and expensive. We recently developed an algorithm for the automatic identification of cardiac activations associated with cortical arousals, which uses heart rate information derived from a single electrocardiogram (ECG) channel. We hypothesized that cardiac activations can be used as estimates for EEG awakenings. Polysomnographic EEG awakenings and automatically detected cardiac activations were systematically compared using laboratory data of 112 subjects (47 male, mean ± SD age 37.9 ± 13 years), 985 nights and 23 855 aircraft noise events (ANEs). The probability of automatically detected cardiac activations increased monotonically with increasing maximum sound pressure levels of ANEs, exceeding the probability of EEG awakenings by up to 18.1%. If spontaneous reactions were taken into account, exposure–response curves were practically identical for EEG awakenings and cardiac activations. Automatically detected cardiac activations may be used as estimates for EEG awakenings. More investigations are needed to further validate the ECG algorithm in the field and to investigate inter-individual differences in its ability to predict EEG awakenings. This inexpensive, objective and non-invasive method facilitates large-scale field studies on the effects of traffic noise on sleep

  12. The identifiable victim effect in charitable giving: evidence from a natural field experiment

    DEFF Research Database (Denmark)

    Lesner, Tine; Rasmussen, O. D.

    2014-01-01

    We design a natural field experiment to enhance our understanding of the role of the identifiable victim effect in charitable giving. Using direct mail solicitations to 25797 prior donors of a nonprofit charity, we tested the responsiveness of donors to make a contribution to either an identifiable or a statistical victim. Unlike much previous research, which has used only laboratory experiments, we find that the campaign letter focusing on one identifiable victim did not result in significantly larger donations than the campaign letter focusing on the statistical victim. In addition to the role ... campaigns. We find some evidence of crowding out, indicating that charitable giving could be a zero-sum game; however, the treatment letters did not have different effects on other payments.

  13. A simple field method to identify foot strike pattern during running.

    Science.gov (United States)

    Giandolini, Marlène; Poupard, Thibaut; Gimenez, Philippe; Horvais, Nicolas; Millet, Guillaume Y; Morin, Jean-Benoît; Samozino, Pierre

    2014-05-07

    Identifying foot strike patterns in running is an important issue for sport clinicians, coaches and the footwear industry. Current methods allow the monitoring of either many steps in laboratory conditions or only a few steps in the field. Because measuring running biomechanics during actual practice is critical, our purpose is to validate a method for identifying foot strike patterns during continuous field measurements. Based on heel and metatarsal accelerations, this method requires two uniaxial accelerometers. The time between heel and metatarsal acceleration peaks (THM) was compared to the foot strike angle in the sagittal plane (αfoot) obtained by 2D video analysis for various conditions of speed, slope, footwear, foot strike and state of fatigue. Acceleration and kinematic measurements were performed at 1000 Hz and 120 Hz, respectively, during 2-min treadmill running bouts. Significant correlations were observed between THM and αfoot for 14 out of 15 conditions. The overall correlation coefficient was r=0.916 (P<0.0001). THM thus appears to be a valid parameter for identifying foot strike, except for extreme forefoot strike, during which the heel rarely or never strikes the ground, and across different footwear and states of fatigue. We proposed a classification based on THM: FFS < -5.49 ms
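
    This record truncates the abstract; as a hedged illustration of the measurement it describes (signals, sampling and peak-picking parameters below are synthetic, and the published classification thresholds are not reproduced), the snippet computes THM, the time from heel to metatarsal acceleration peak within each step, from two accelerometer traces.

      import numpy as np
      from scipy.signal import find_peaks

      FS = 1000  # Hz, as in the study

      def thm_per_step(heel_acc, meta_acc, min_step_interval=0.5):
          """Time (ms) from heel to metatarsal acceleration peak for each step.
          Negative values indicate the metatarsal peak occurred first (forefoot-like strike)."""
          dist = int(min_step_interval * FS)
          heel_peaks, _ = find_peaks(heel_acc, distance=dist)
          meta_peaks, _ = find_peaks(meta_acc, distance=dist)
          thm = []
          for hp in heel_peaks:
              # pair each heel peak with the nearest metatarsal peak
              mp = meta_peaks[np.argmin(np.abs(meta_peaks - hp))]
              thm.append((mp - hp) * 1000.0 / FS)
          return np.array(thm)

      # Synthetic two-step example: metatarsal peaks 20 ms after heel peaks (rearfoot-like)
      t = np.arange(0, 2.0, 1 / FS)
      heel = np.exp(-((t - 0.4) ** 2) / 1e-4) + np.exp(-((t - 1.4) ** 2) / 1e-4)
      meta = np.exp(-((t - 0.42) ** 2) / 1e-4) + np.exp(-((t - 1.42) ** 2) / 1e-4)
      print(thm_per_step(heel, meta))   # ~[20. 20.] ms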

  14. Automatic discovery of the communication network topology for building a supercomputer model

    Science.gov (United States)

    Sobolev, Sergey; Stefanov, Konstantin; Voevodin, Vadim

    2016-10-01

    The Research Computing Center of Lomonosov Moscow State University is developing the Octotron software suite for automatic monitoring and mitigation of emergency situations in supercomputers so as to maximize hardware reliability. The suite is based on a software model of the supercomputer. The model uses a graph to describe the computing system components and their interconnections. One of the most complex components of a supercomputer that needs to be included in the model is its communication network. This work describes the proposed approach for automatically discovering the Ethernet communication network topology in a supercomputer and its description in terms of the Octotron model. This suite automatically detects computing nodes and switches, collects information about them and identifies their interconnections. The application of this approach is demonstrated on the "Lomonosov" and "Lomonosov-2" supercomputers.

  15. Features and perspectives of automatized construction crane-manipulators

    Science.gov (United States)

    Stepanov, Mikhail A.; Ilukhin, Peter A.

    2018-03-01

    The modern construction industry still involves a high percentage of manual labor, and the greatest prospects for improving the construction process lie in the field of automation. In this article, automated construction crane-manipulators are studied in order to arrive at the most rational design scheme. This is done by formulating a list of general conditions necessary for such cranes and a set of specialized kinematic conditions. A variety of kinematic schemes is evaluated against these conditions, and some are taken forward for dynamic analysis. A comparative dynamic analysis of the selected schemes was performed and the most rational scheme was identified. This provides a basis for more complex and practical research on crane-manipulator design, so that ways to implement such cranes in practice can now be calculated properly. The perspectives for implementing automated control systems and information networks on construction sites, in order to improve the quality of construction work, labor safety and ecological safety, are also shown.

  16. Automatic Hidden-Web Table Interpretation by Sibling Page Comparison

    Science.gov (United States)

    Tao, Cui; Embley, David W.

    The longstanding problem of automatic table interpretation still eludes us. Its solution would not only be an aid to table processing applications such as large-volume table conversion, but would also be an aid in solving related problems such as information extraction and semi-structured data management. In this paper, we offer a conceptual modeling solution for the common special case in which so-called sibling pages are available. The sibling pages we consider are pages on the hidden web, commonly generated from underlying databases. We compare them to identify and connect nonvarying components (category labels) and varying components (data values). We tested our solution using more than 2,000 tables in source pages from three different domains: car advertisements, molecular biology, and geopolitical information. Experimental results show that the system can successfully identify sibling tables, generate structure patterns, interpret tables using the generated patterns, and automatically adjust the structure patterns, if necessary, as it processes a sequence of hidden-web pages. For these activities, the system was able to achieve an overall F-measure of 94.5%.
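
    The full technique also involves structure-pattern generation and adjustment; the toy sketch below (hand-coded cell lists standing in for two parsed sibling tables) illustrates only the core comparison step: cells that do not vary across sibling pages are treated as category labels, while cells that do vary are treated as data values.

      # Toy illustration: separate label cells from data cells by comparing two
      # parsed sibling tables (here hard-coded lists standing in for HTML parses).

      sibling_a = [["Make", "Toyota"], ["Model", "Corolla"], ["Year", "2005"]]
      sibling_b = [["Make", "Honda"],  ["Model", "Civic"],   ["Year", "2007"]]

      labels, values = [], []
      for row_a, row_b in zip(sibling_a, sibling_b):
          for cell_a, cell_b in zip(row_a, row_b):
              if cell_a == cell_b:
                  labels.append(cell_a)            # nonvarying component -> category label
              else:
                  values.append((cell_a, cell_b))  # varying component -> data values

      print("labels:", labels)                     # ['Make', 'Model', 'Year']
      print("values:", values)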

  17. Automatic analysis of macerals and reflectance; Analisis Automatico de Macerales y Reflectancia

    Energy Technology Data Exchange (ETDEWEB)

    Catalina, J.C.; Alarcon, D.; Gonzalez Prado, J.

    1998-12-01

    A new system has been developed to automatically perform maceral and reflectance analyses of single-seam bituminous coals, improving the interlaboratory accuracy of these types of analyses. The system follows the same steps as the manual method, requiring a human operator for the preparation of coal samples and for system startup; sample scanning, microscope focusing and field-centre analysis are then fully automatic. The main and most innovative idea of this approach is to coordinate an expert system with an image processing system, using both reflectance and morphological information. In this way, the system tries to reproduce the analysis procedure followed by a human expert in petrography. (Author)

  18. Effectiveness of an automatic manual wheelchair braking system in the prevention of falls.

    Science.gov (United States)

    Martorello, Laura; Swanson, Edward

    2006-01-01

    The purpose of this study was to evaluate the effectiveness of an automatic manual wheelchair braking system in reducing falls for patients at high risk of falls while transferring to and from a manual wheelchair. The study design was a normative survey carried out through a written questionnaire sent to 60 skilled nursing facilities to collect data from medical charts identifying patients at high risk for falls who used an automatic wheelchair braking system. The participating facilities reported a frequency of falls by high-risk patients while transferring to and from the wheelchair ranging from 2 to 10 per year, with a median fall rate per facility of 4 falls. One year after the installation of the automatic wheelchair braking system, participating facilities reported zero to three falls during transfers by high-risk patients, with a median fall rate of zero falls. This represents a statistically significant reduction of 78% in the fall rate of high-risk patients while transferring to and from the wheelchair, t(18) = 6.39, p < .001, after the automatic braking system for manual wheelchairs was installed. The application of the automatic braking system gives clients, families/caregivers, and facility personnel an increased safety factor for the reduction of falls from the wheelchair.

  19. MRI-alone radiation therapy planning for prostate cancer: Automatic fiducial marker detection

    International Nuclear Information System (INIS)

    Ghose, Soumya; Mitra, Jhimli; Rivest-Hénault, David; Fazlollahi, Amir; Fripp, Jurgen; Dowling, Jason A.; Stanwell, Peter; Pichler, Peter; Sun, Jidi; Greer, Peter B.

    2016-01-01

    Purpose: The feasibility of radiation therapy treatment planning using substitute computed tomography (sCT) generated from magnetic resonance images (MRIs) has been demonstrated by a number of research groups. One challenge with an MRI-alone workflow is the accurate identification of intraprostatic gold fiducial markers, which are frequently used for prostate localization prior to each dose delivery fraction. This paper investigates a template-matching approach for the detection of these seeds in MRI. Methods: Two different gradient echo T1 and T2* weighted MRI sequences were acquired from fifteen prostate cancer patients and evaluated for seed detection. For training, seed templates from manual contours were selected in a spectral clustering manifold learning framework. This aids in clustering “similar” gold fiducial markers together. The marker with the minimum distance to a cluster centroid was selected as the representative template of that cluster during training. During testing, Gaussian mixture modeling followed by a Markovian model was used in automatic detection of the probable candidates. The probable candidates were rigidly registered to the templates identified from spectral clustering, and a similarity metric is computed for ranking and detection. Results: A fiducial detection accuracy of 95% was obtained compared to manual observations. Expert radiation therapist observers were able to correctly identify all three implanted seeds on 11 of the 15 scans (the proposed method correctly identified all seeds on 10 of the 15). Conclusions: An novel automatic framework for gold fiducial marker detection in MRI is proposed and evaluated with detection accuracies comparable to manual detection. When radiation therapists are unable to determine the seed location in MRI, they refer back to the planning CT (only available in the existing clinical framework); similarly, an automatic quality control is built into the automatic software to ensure that all gold

  20. MRI-alone radiation therapy planning for prostate cancer: Automatic fiducial marker detection

    Energy Technology Data Exchange (ETDEWEB)

    Ghose, Soumya, E-mail: soumya.ghose@case.edu; Mitra, Jhimli [Department of Biomedical Engineering, Case Western Reserve University, Cleveland, Ohio 44106 and CSIRO Health and Biosecurity, The Australian e-Health & Research Centre, Herston, QLD 4029 (Australia); Rivest-Hénault, David; Fazlollahi, Amir; Fripp, Jurgen; Dowling, Jason A. [CSIRO Health and Biosecurity, The Australian e-Health & Research Centre, Herston, QLD 4029 (Australia); Stanwell, Peter [School of health sciences, The University of Newcastle, Newcastle, NSW 2308 (Australia); Pichler, Peter [Department of Radiation Oncology, Cavalry Mater Newcastle Hospital, Newcastle, NSW 2298 (Australia); Sun, Jidi; Greer, Peter B. [School of Mathematical and Physical Sciences, The University of Newcastle, Newcastle, NSW 2308, Australia and Department of Radiation Oncology, Cavalry Mater Newcastle Hospital, Newcastle, NSW 2298 (Australia)

    2016-05-15

    Purpose: The feasibility of radiation therapy treatment planning using substitute computed tomography (sCT) generated from magnetic resonance images (MRIs) has been demonstrated by a number of research groups. One challenge with an MRI-alone workflow is the accurate identification of intraprostatic gold fiducial markers, which are frequently used for prostate localization prior to each dose delivery fraction. This paper investigates a template-matching approach for the detection of these seeds in MRI. Methods: Two different gradient echo T1 and T2* weighted MRI sequences were acquired from fifteen prostate cancer patients and evaluated for seed detection. For training, seed templates from manual contours were selected in a spectral clustering manifold learning framework. This aids in clustering “similar” gold fiducial markers together. The marker with the minimum distance to a cluster centroid was selected as the representative template of that cluster during training. During testing, Gaussian mixture modeling followed by a Markovian model was used in automatic detection of the probable candidates. The probable candidates were rigidly registered to the templates identified from spectral clustering, and a similarity metric is computed for ranking and detection. Results: A fiducial detection accuracy of 95% was obtained compared to manual observations. Expert radiation therapist observers were able to correctly identify all three implanted seeds on 11 of the 15 scans (the proposed method correctly identified all seeds on 10 of the 15). Conclusions: An novel automatic framework for gold fiducial marker detection in MRI is proposed and evaluated with detection accuracies comparable to manual detection. When radiation therapists are unable to determine the seed location in MRI, they refer back to the planning CT (only available in the existing clinical framework); similarly, an automatic quality control is built into the automatic software to ensure that all gold

  1. Identifying open magnetic field regions of the Sun and their heliospheric counterparts

    Science.gov (United States)

    Krista, L. D.; Reinard, A.

    2017-12-01

    Open magnetic regions on the Sun are either long-lived (coronal holes) or transient (dimmings) in nature. Both phenomena are fundamental to our understanding of the solar behavior as a whole. Coronal holes are the sources of high-speed solar wind streams that cause recurrent geomagnetic storms. Furthermore, the variation of coronal hole properties (area, location, magnetic field strength) over the solar activity cycle is an important marker of the global evolution of the solar magnetic field. Dimming regions, on the other hand, are short-lived coronal holes that often emerge in the wake of solar eruptions. By analyzing their physical properties and their temporal evolution, we aim to understand their connection with their eruptive counterparts (flares and coronal mass ejections) and predict the possibility of a geomagnetic storm. The author developed the Coronal Hole Automated Recognition and Monitoring (CHARM) and the Coronal Dimming Tracker (CoDiT) algorithms. These tools not only identify but track the evolution of open magnetic field regions. CHARM also provides daily coronal hole maps, that are used for forecasts at the NOAA Space Weather Prediction Center. Our goal is to better understand the processes that give rise to eruptive and non-eruptive open field regions and investigate how these regions evolve over time and influence space weather.

  2. Persistent Identifiers for Field Expeditions: A Next Step for the US Oceanographic Research Fleet

    Science.gov (United States)

    Arko, Robert; Carbotte, Suzanne; Chandler, Cynthia; Smith, Shawn; Stocks, Karen

    2016-04-01

    Oceanographic research cruises are complex affairs, typically requiring an extensive effort to secure the funding, plan the experiment, and mobilize the field party. Yet cruises are not typically published online as first-class digital objects with persistent, citable identifiers linked to the scientific literature. The Rolling Deck to Repository (R2R; info@rvdata.us) program maintains a master catalog of oceanographic cruises for the United States research fleet, currently documenting over 6,000 expeditions on 37 active and retired vessels. In 2015, R2R started routinely publishing a Digital Object Identifier (DOI) for each completed cruise. Cruise DOIs, in turn, are linked to related persistent identifiers where available including the Open Researcher and Contributor ID (ORCID) for members of the science party, the International Geo Sample Number (IGSN) for physical specimens collected during the cruise, the Open Funder Registry (FundRef) codes that supported the experiment, and additional DOIs for datasets, journal articles, and other products resulting from the cruise. Publishing a persistent identifier for each field expedition will facilitate interoperability between the many different repositories that hold research products from cruises; will provide credit to the investigators who secured the funding and carried out the experiment; and will facilitate the gathering of fleet-wide altmetrics that demonstrate the broad impact of oceanographic research.

  3. Anatomy-based automatic detection and segmentation of major vessels in thoracic CTA images

    International Nuclear Information System (INIS)

    Zou Xiaotao; Liang Jianming; Wolf, M.; Salganicoff, M.; Krishnan, A.; Nadich, D.P.

    2007-01-01

    Existing approaches for automated computerized detection of pulmonary embolism (PE) using computed tomography angiography (CTA) usually focus on segmental and sub-segmental emboli. The goal of our current research is to extend our existing approach to automated detection of central PE. In order to detect central emboli, the major vessels must be first identified and segmented automatically. This submission presents an anatomy-based method for automatic computerized detection and segmentation of aortas and main pulmonary arteries in CTA images. (orig.)

  4. Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing

    Science.gov (United States)

    LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.

    2017-12-01

    With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality study and weather condition monitoring. In such infrastructures, many sensor nodes are distributed in a specific area and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, the algorithm (1) adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time-series window for a particular parameter in a sensor node. Case studies using different data sets are presented, and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues or natural events.
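
    The paper's model is only summarized here; as a simplified, hypothetical stand-in for the idea of ranking anomalous time windows across several sensor nodes (window length, node count and the injected fault below are invented), the snippet scores each window of one node's series by how far its mean deviates from the same window observed at the other nodes.

      import numpy as np

      def window_anomaly_scores(readings, window=24):
          """readings: array of shape (n_nodes, n_samples) for one parameter.
          Returns (n_nodes, n_windows) scores: distance of a node's window mean from the
          other nodes' mean for that window, in units of their standard deviation."""
          n_nodes, n_samples = readings.shape
          n_win = n_samples // window
          w = readings[:, :n_win * window].reshape(n_nodes, n_win, window).mean(axis=2)
          scores = np.zeros_like(w)
          for i in range(n_nodes):
              others = np.delete(w, i, axis=0)          # spatial neighbours (all other nodes)
              mu, sd = others.mean(axis=0), others.std(axis=0) + 1e-9
              scores[i] = np.abs(w[i] - mu) / sd
          return scores

      rng = np.random.default_rng(1)
      data = rng.normal(20.0, 0.5, size=(5, 24 * 10))   # 5 nodes, 10 days of hourly data
      data[2, 24 * 6:24 * 7] += 8.0                     # inject a faulty day on node 2
      scores = window_anomaly_scores(data)
      node, win = np.unravel_index(np.argmax(scores), scores.shape)
      print(f"most anomalous window: node {node}, window {win}")   # node 2, window 6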

  5. Neural Bases of Automaticity

    Science.gov (United States)

    Servant, Mathieu; Cassey, Peter; Woodman, Geoffrey F.; Logan, Gordon D.

    2018-01-01

    Automaticity allows us to perform tasks in a fast, efficient, and effortless manner after sufficient practice. Theories of automaticity propose that across practice processing transitions from being controlled by working memory to being controlled by long-term memory retrieval. Recent event-related potential (ERP) studies have sought to test this…

  6. First results from the field test of households with dynamic tarif and automatic control in the regenerative model region Harz; Erste Ergebnisse des Haushaltsfeldtests mit dynamischen Tarif und automatischer Steuerung in der Regenerativen Modellregion Harz

    Energy Technology Data Exchange (ETDEWEB)

    Funke, Stephan; Landau, Markus [Fraunhofer Institut fuer Windenergie und Energiesystemtechnik (IWES), Kassel (Germany); Filzek, Dirk; Volkert, Christina [CUBE Engineering GmbH, Kassel (Germany); Fechner, Amelie [Saarland Univ., Saarbruecken (Germany). Forschungsgruppe Umweltpsychologie

    2012-07-01

    As part of the E-Energy research project RegModHarz (Regenerative Model Region Harz), a field test with test households is being carried out. A system developed in the project, consisting of a dynamic tariff, appliance control and a monitoring system, is being tested. In parallel, the acceptance of this system by the field test participants is evaluated. First results from the commissioning of the system are already available. Currently, the second phase of the field test is under way, in which the participants can actively and automatically adjust their power consumption to the availability of electricity from renewable energy sources in the model region.

  7. Brand and automaticity

    OpenAIRE

    Liu, J.

    2008-01-01

    A presumption of most consumer research is that consumers endeavor to maximize the utility of their choices and are in complete control of their purchasing and consumption behavior. However, everyday life experience suggests that many of our choices are not all that reasoned or conscious. Indeed, automaticity, one facet of behavior, is indispensable to complete the portrait of consumers. Despite its importance, little attention is paid to how the automatic side of behavior can be captured and...

  8. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of the automatic program derivation. The book also includes some papers by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers...

  9. A semi-automatic method for peak and valley detection in free-breathing respiratory waveforms

    International Nuclear Information System (INIS)

    Lu Wei; Nystrom, Michelle M.; Parikh, Parag J.; Fooshee, David R.; Hubenschmidt, James P.; Bradley, Jeffrey D.; Low, Daniel A.

    2006-01-01

    The existing commercial software often inadequately determines respiratory peaks for patients in respiration correlated computed tomography. A semi-automatic method was developed for peak and valley detection in free-breathing respiratory waveforms. First the waveform is separated into breath cycles by identifying intercepts of a moving average curve with the inspiration and expiration branches of the waveform. Peaks and valleys were then defined, respectively, as the maximum and minimum between pairs of alternating inspiration and expiration intercepts. Finally, automatic corrections and manual user interventions were employed. On average for each of the 20 patients, 99% of 307 peaks and valleys were automatically detected in 2.8 s. This method was robust for bellows waveforms with large variations
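
    The method is described above in prose only; the following rough sketch (not the authors' code; sampling rate and moving-average span are arbitrary choices) separates a respiratory trace into segments at the crossings of its moving-average curve and takes the maximum of each above-average segment as a peak and the minimum of each below-average segment as a valley, mirroring the intercept-based definition in the abstract.

      import numpy as np

      def peaks_and_valleys(signal, fs=25, avg_window_s=4.0):
          """Return (peak_indices, valley_indices) of a free-breathing waveform."""
          w = int(avg_window_s * fs)
          kernel = np.ones(w) / w
          avg = np.convolve(signal, kernel, mode="same")      # moving-average curve
          above = signal > avg
          crossings = np.flatnonzero(np.diff(above.astype(int))) + 1
          peaks, valleys = [], []
          for start, end in zip(crossings[:-1], crossings[1:]):
              seg = signal[start:end]
              if above[start]:                                  # inspiration branch
                  peaks.append(start + int(np.argmax(seg)))
              else:                                             # expiration branch
                  valleys.append(start + int(np.argmin(seg)))
          return np.array(peaks), np.array(valleys)

      fs = 25
      t = np.arange(0, 60, 1 / fs)
      breath = np.sin(2 * np.pi * t / 4.0) + 0.05 * np.random.default_rng(0).normal(size=t.size)
      p, v = peaks_and_valleys(breath, fs)
      print(len(p), "peaks and", len(v), "valleys detected")    # roughly 15 of each over 60 s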

  10. Development of advanced automatic operation system for nuclear ship. 1. Perfect automatic normal operation

    International Nuclear Information System (INIS)

    Nakazawa, Toshio; Yabuuti, Noriaki; Takahashi, Hiroki; Shimazaki, Junya

    1999-02-01

    The development of operation support systems, such as automatic operating systems and anomaly diagnosis systems for the reactor, is very important for a practical nuclear ship, because the number of operators is limited and the conditions are severe, making it very difficult to receive outside support in the case of an accident. The goal of this development is to realize fully automatic control over the complete sequence of normal operation, from reactor start-up to shutdown. The automatic control system for normal operation has been developed based on the operating experience of the first Japanese nuclear ship 'Mutsu'. The automation techniques were verified against 'Mutsu' plant data recorded during manual operation. Fully automatic control of start-up and shutdown operations was achieved by setting the desired operating values and the limits on parameter fluctuations, and by programming the operation of the principal equipment such as the main coolant pump and the heaters. This report presents the automatic operation system developed for reactor start-up and shutdown and the verification of the system using the Nuclear Ship Engineering Simulator System. (author)

  11. 14 CFR 23.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 23.1329 Section 23...: Installation § 23.1329 Automatic pilot system. If an automatic pilot system is installed, it must meet the following: (a) Each system must be designed so that the automatic pilot can— (1) Be quickly and positively...

  12. 46 CFR 52.01-10 - Automatic controls.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Automatic controls. 52.01-10 Section 52.01-10 Shipping... Requirements § 52.01-10 Automatic controls. (a) Each main boiler must meet the special requirements for automatic safety controls in § 62.35-20(a)(1) of this chapter. (b) Each automatically controlled auxiliary...

  13. A general graphical user interface for automatic reliability modeling

    Science.gov (United States)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1991-01-01

    Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have field texts, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.

  14. ALOHA: Automatic libraries of helicity amplitudes for Feynman diagram computations

    Science.gov (United States)

    de Aquino, Priscila; Link, William; Maltoni, Fabio; Mattelaer, Olivier; Stelzer, Tim

    2012-10-01

    We present an application that automatically writes the HELAS (HELicity Amplitude Subroutines) library corresponding to the Feynman rules of any quantum field theory Lagrangian. The code is written in Python and takes the Universal FeynRules Output (UFO) as an input. From this input it produces the complete set of routines, wave-functions and amplitudes, that are needed for the computation of Feynman diagrams at leading as well as at higher orders. The representation is language independent and currently it can output routines in Fortran, C++, and Python. A few sample applications implemented in the MADGRAPH 5 framework are presented. Program summary Program title: ALOHA Catalogue identifier: AEMS_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMS_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: http://www.opensource.org/licenses/UoI-NCSA.php No. of lines in distributed program, including test data, etc.: 6094320 No. of bytes in distributed program, including test data, etc.: 7479819 Distribution format: tar.gz Programming language: Python2.6 Computer: 32/64 bit Operating system: Linux/Mac/Windows RAM: 512 Mbytes Classification: 4.4, 11.6 Nature of problem: An effcient numerical evaluation of a squared matrix element can be done with the help of the helicity routines implemented in the HELAS library [1]. This static library contains a limited number of helicity functions and is therefore not always able to provide the needed routine in the presence of an arbitrary interaction. This program provides a way to automatically create the corresponding routines for any given model. Solution method: ALOHA takes the Feynman rules associated to the vertex obtained from the model information (in the UFO format [2]), and multiplies it by the different wavefunctions or propagators. As a result the analytical expression of the helicity routines is obtained. Subsequently, this expression is

  15. Automatic component calibration and error diagnostics for model-based accelerator control. Phase I final report

    International Nuclear Information System (INIS)

    Carl Stern; Martin Lee

    1999-01-01

    Phase I work studied the feasibility of developing software for automatic component calibration and error correction in beamline optics models. A prototype application was developed that corrects quadrupole field strength errors in beamline models

  16. Automatic component calibration and error diagnostics for model-based accelerator control. Phase I final report

    CERN Document Server

    Carl-Stern

    1999-01-01

    Phase I work studied the feasibility of developing software for automatic component calibration and error correction in beamline optics models. A prototype application was developed that corrects quadrupole field strength errors in beamline models.

  17. Simple Methods for Scanner Drift Normalization Validated for Automatic Segmentation of Knee Magnetic Resonance Imaging

    DEFF Research Database (Denmark)

    Dam, Erik Bjørnager

    2018-01-01

    Scanner drift is a well-known magnetic resonance imaging (MRI) artifact characterized by gradual signal degradation and scan intensity changes over time. In addition, hardware and software updates may imply abrupt changes in signal. The combined effects are particularly challenging for automatic image analysis methods used in longitudinal studies. The implication is increased measurement variation and a risk of bias in the estimations (e.g. in the volume change for a structure). We proposed two quite different approaches for scanner drift normalization and demonstrated their performance for segmentation of knee MRI using the fully automatic KneeIQ framework. The validation included a total of 1975 scans from both high-field and low-field MRI. The results demonstrated that the pre-processing method denoted Atlas Affine Normalization significantly removed scanner drift effects and ensured...
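
    Neither of the two normalization approaches evaluated in the paper is spelled out in this record; purely as a generic illustration of intensity-drift correction (not the Atlas Affine Normalization method itself), the sketch below linearly rescales each scan so that its mean and standard deviation inside a mask match those of a reference scan.

      import numpy as np

      def normalize_to_reference(scan, reference, mask):
          """Linearly rescale `scan` so its masked mean/std match the reference scan.
          A toy stand-in for scanner-drift normalization, not the KneeIQ method."""
          s_mu, s_sd = scan[mask].mean(), scan[mask].std()
          r_mu, r_sd = reference[mask].mean(), reference[mask].std()
          return (scan - s_mu) * (r_sd / s_sd) + r_mu

      rng = np.random.default_rng(42)
      reference = rng.normal(100.0, 20.0, size=(64, 64))
      mask = np.ones_like(reference, dtype=bool)
      drifted = reference * 1.15 + 12.0           # simulated gradual scanner drift
      corrected = normalize_to_reference(drifted, reference, mask)
      print(float(np.abs(corrected - reference).max()))   # ~0: drift removed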

  18. Automatic welding and cladding in heavy fabrication

    International Nuclear Information System (INIS)

    Altamer, A. de

    1980-01-01

    A description is given of the automatic welding processes used by an Italian fabricator of pressure vessels for petrochemical and nuclear plant. The automatic submerged arc welding, submerged arc strip cladding, pulsed TIG, hot wire TIG and MIG welding processes have proved satisfactory in terms of process reliability, metal deposition rate, and cost effectiveness for low alloy and carbon steels. An example shows sequences required during automatic butt welding, including heat treatments. Factors which govern satisfactory automatic welding include automatic anti-drift rotator device, electrode guidance and bead programming system, the capability of single and dual head operation, flux recovery and slag removal systems, operator environment and controls, maintaining continuity of welding and automatic reverse side grinding. Automatic welding is used for: joining vessel sections; joining tubes to tubeplate; cladding of vessel rings and tubes, dished ends and extruded nozzles; nozzle to shell and butt welds, including narrow gap welding. (author)

  19. Automatic categorization of diverse experimental information in the bioscience literature

    Directory of Open Access Journals (Sweden)

    Fang Ruihua

    2012-01-01

    Full Text Available Abstract Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus, is usually time consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM. This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental datatypes in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD. We anticipate that this method can be readily adopted by various databases in the biocuration community and thereby greatly reducing time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI. It is being used in

  20. Automatic categorization of diverse experimental information in the bioscience literature

    Science.gov (United States)

    2012-01-01

    Background Curation of information from bioscience literature into biological knowledge databases is a crucial way of capturing experimental information in a computable form. During the biocuration process, a critical first step is to identify from all published literature the papers that contain results for a specific data type the curator is interested in annotating. This step normally requires curators to manually examine many papers to ascertain which few contain information of interest and thus, is usually time consuming. We developed an automatic method for identifying papers containing these curation data types among a large pool of published scientific papers based on the machine learning method Support Vector Machine (SVM). This classification system is completely automatic and can be readily applied to diverse experimental data types. It has been in use in production for automatic categorization of 10 different experimental datatypes in the biocuration process at WormBase for the past two years and it is in the process of being adopted in the biocuration process at FlyBase and the Saccharomyces Genome Database (SGD). We anticipate that this method can be readily adopted by various databases in the biocuration community and thereby greatly reducing time spent on an otherwise laborious and demanding task. We also developed a simple, readily automated procedure to utilize training papers of similar data types from different bodies of literature such as C. elegans and D. melanogaster to identify papers with any of these data types for a single database. This approach has great significance because for some data types, especially those of low occurrence, a single corpus often does not have enough training papers to achieve satisfactory performance. Results We successfully tested the method on ten data types from WormBase, fifteen data types from FlyBase and three data types from Mouse Genomics Informatics (MGI). It is being used in the curation work flow at
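
    The classification step described above can be illustrated with a minimal, generic sketch: term-frequency features feed a linear SVM that flags papers likely to contain one data type. This is not the WormBase/FlyBase pipeline itself; the toy corpus, the labels and the use of scikit-learn are assumptions made purely for illustration.

      # Minimal sketch of SVM-based paper triage for one curation data type.
      # Hypothetical toy corpus; the real systems train on curated database abstracts.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.svm import LinearSVC
      from sklearn.pipeline import make_pipeline

      papers = [
          "RNAi knockdown of daf-16 produced a dauer-constitutive phenotype ...",
          "We report the complete mitochondrial genome sequence of ...",
          "Antibody staining shows expression of unc-119 in ventral cord neurons ...",
          "A review of recent advances in nematode ecology ...",
      ]
      has_expression_data = [0, 0, 1, 0]   # 1 = paper contains the data type of interest

      classifier = make_pipeline(
          TfidfVectorizer(ngram_range=(1, 2), min_df=1),  # word/bigram features
          LinearSVC(),                                    # linear SVM classifier
      )
      classifier.fit(papers, has_expression_data)

      new_paper = "In situ hybridization reveals expression of lin-3 in the anchor cell."
      print("flag for curation" if classifier.predict([new_paper])[0] else "skip")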

  1. Implicit proactive interference, age, and automatic versus controlled retrieval strategies.

    Science.gov (United States)

    Ikier, Simay; Yang, Lixia; Hasher, Lynn

    2008-05-01

    We assessed the extent to which implicit proactive interference results from automatic versus controlled retrieval among younger and older adults. During a study phase, targets (e.g., "ALLERGY") either were or were not preceded by nontarget competitors (e.g., "ANALOGY"). After a filled interval, the participants were asked to complete word fragments, some of which cued studied words (e.g., "A_L_ _GY"). Retrieval strategies were identified by the difference in response speed between a phase containing fragments that cued only new words and a phase that included a mix of fragments cuing old and new words. Previous results were replicated: Proactive interference was found in implicit memory, and the negative effects were greater for older than for younger adults. Novel findings demonstrate two retrieval processes that contribute to interference: an automatic one that is age invariant and a controlled process that can reduce the magnitude of the automatic interference effects. The controlled process, however, is used effectively only by younger adults. This pattern of findings potentially explains age differences in susceptibility to proactive interference.
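
    The fragment-completion task itself is easy to mimic in a few lines, which may help readers unfamiliar with the paradigm: blanks in a fragment are treated as wildcards and matched against a lexicon. The lexicon and fragments below are illustrative, not the study's materials.

      # Sketch of fragment cueing: a fragment such as "A_L__GY" (underscores = blanks)
      # is matched against a small lexicon to find possible completions.
      import re

      def completions(fragment, lexicon):
          pattern = re.compile("^" + fragment.replace("_", ".") + "$", re.IGNORECASE)
          return [word for word in lexicon if pattern.match(word)]

      lexicon = ["ALLERGY", "ANALOGY", "APOLOGY", "BIOLOGY"]
      print(completions("A_L__GY", lexicon))   # ['ALLERGY'] -- the studied target
      print(completions("AN_LO_Y", lexicon))   # ['ANALOGY'] -- the competitor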

  2. Automatic spinal cord localization, robust to MRI contrasts using global curve optimization.

    Science.gov (United States)

    Gros, Charley; De Leener, Benjamin; Dupont, Sara M; Martin, Allan R; Fehlings, Michael G; Bakshi, Rohit; Tummala, Subhash; Auclair, Vincent; McLaren, Donald G; Callot, Virginie; Cohen-Adad, Julien; Sdika, Michaël

    2018-02-01

    During the last two decades, MRI has been increasingly used for providing valuable quantitative information about spinal cord morphometry, such as quantification of the spinal cord atrophy in various diseases. However, despite the significant improvement of MR sequences adapted to the spinal cord, automatic image processing tools for spinal cord MRI data are not yet as developed as for the brain. There is nonetheless great interest in fully automatic and fast processing methods to be able to propose quantitative analysis pipelines on large datasets without user bias. The first step of most of these analysis pipelines is to detect the spinal cord, which is challenging to achieve automatically across the broad range of MRI contrasts, fields of view, resolutions and pathologies. In this paper, a fully automated, robust and fast method for detecting the spinal cord centerline on MRI volumes is introduced. The algorithm uses a global optimization scheme that attempts to strike a balance between a probabilistic localization map of the spinal cord center point and the overall spatial consistency of the spinal cord centerline (i.e. the rostro-caudal continuity of the spinal cord). Additionally, a new post-processing feature, which aims to automatically split brain and spine regions, is introduced, to be able to detect a consistent spinal cord centerline independently from the field of view. We present data on the validation of the proposed algorithm, known as "OptiC", from a large dataset involving 20 centers, 4 contrasts (T2-weighted n = 287, T1-weighted n = 120, T2*-weighted n = 307, diffusion-weighted n = 90), 501 subjects including 173 patients with a variety of neurologic diseases. Validation involved the gold-standard centerline coverage, the mean square error between the true and predicted centerlines and the ability to accurately separate brain and spine regions. Overall, OptiC was able to cover 98.77% of the gold-standard centerline, with a
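
    The balance described above, between a per-slice localization map and rostro-caudal continuity, can be sketched as a small dynamic program that picks one centre position per slice while penalizing slice-to-slice jumps. This is a toy 2D illustration of the general idea, not the OptiC algorithm; the probability map and penalty weight are invented.

      # Toy dynamic program: maximize summed probability along the path while
      # penalizing large jumps between neighbouring slices (spatial consistency).
      import numpy as np

      def consistent_centerline(prob, jump_penalty=0.5):
          n_slices, width = prob.shape
          cost = prob[0].copy()                     # running best score per position
          back = np.zeros((n_slices, width), int)   # backpointers for path recovery
          positions = np.arange(width)
          for z in range(1, n_slices):
              # score of moving from previous position j (columns) to current i (rows)
              transition = cost[None, :] - jump_penalty * np.abs(
                  positions[:, None] - positions[None, :])
              back[z] = transition.argmax(axis=1)
              cost = prob[z] + transition.max(axis=1)
          path = [int(cost.argmax())]
          for z in range(n_slices - 1, 0, -1):
              path.append(int(back[z, path[-1]]))
          return path[::-1]                         # centre x-position per slice

      rng = np.random.default_rng(0)
      prob_map = rng.random((6, 20))                # stand-in for a localization map
      prob_map[:, 10] += 2.0                        # true centerline near column 10
      print(consistent_centerline(prob_map))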

  3. Automatic differentiation of functions

    International Nuclear Information System (INIS)

    Douglas, S.R.

    1990-06-01

    Automatic differentiation is a method of computing derivatives of functions to any order in any number of variables. The functions must be expressible as combinations of elementary functions. When evaluated at specific numerical points, the derivatives have no truncation error and are automatically found. The method is illustrated by simple examples. Source code in FORTRAN is provided
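
    A minimal dual-number sketch in Python conveys the idea (the report itself supplies FORTRAN source): each value carries its derivative, and the elementary operations propagate both exactly, so no truncation error is introduced. The function f below is an arbitrary example.

      # Forward-mode automatic differentiation with dual numbers (illustration only).
      import math

      class Dual:
          def __init__(self, value, deriv=0.0):
              self.value, self.deriv = value, deriv
          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value + other.value, self.deriv + other.deriv)
          __radd__ = __add__
          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value * other.value,
                          self.deriv * other.value + self.value * other.deriv)
          __rmul__ = __mul__

      def sin(x):  # elementary function extended to dual numbers via the chain rule
          return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

      def f(x):
          return x * x + 3 * x + sin(x)

      x = Dual(1.5, 1.0)          # seed derivative dx/dx = 1
      y = f(x)
      print(y.value, y.deriv)     # f(1.5) and f'(1.5) = 2*1.5 + 3 + cos(1.5)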

  4. Examination of the semi-automatic calculation technique of vegetation cover rate by digital camera images.

    Science.gov (United States)

    Takemine, S.; Rikimaru, A.; Takahashi, K.

    Rice is one of the staple foods in the world. High-quality rice production requires periodically collecting rice growth data to control the growth of the rice. The height of the plant, the number of stems and the color of the leaves are well-known parameters that indicate rice growth. A rice growth diagnosis method based on these parameters is used operationally in Japan, although collecting these parameters by field survey requires a lot of labor and time. Recently, a labor-saving method for rice growth diagnosis has been proposed which is based on the vegetation cover rate of rice. The vegetation cover rate of rice is calculated by discriminating rice plant areas in a digital camera image photographed in the nadir direction. Discrimination of rice plant areas in the image was done by automatic binarization processing. However, when the calculation of the vegetation cover rate depends on the automatic binarization process alone, there is a possibility that the estimated cover rate decreases even as the rice grows. In this paper, a calculation method for the vegetation cover rate is proposed which is based on the automatic binarization process and refers to growth hysteresis information. For several images obtained by field survey during the rice growing season, the vegetation cover rate was calculated by the conventional automatic binarization processing and by the proposed method, and the vegetation cover rate of both methods was compared with a reference value obtained by visual interpretation. As a result of the comparison, the accuracy of discriminating rice plant areas was increased by the proposed
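
    As a rough illustration of the binarization step discussed above, the sketch below thresholds an excess-green index with Otsu's method and reports the fraction of plant pixels. The index, the threshold rule and the synthetic image are stand-ins, since the abstract does not specify the study's exact procedure.

      # Cover-rate estimation by automatic binarization (illustrative choices only).
      import numpy as np

      def otsu_threshold(values, bins=256):
          hist, edges = np.histogram(values, bins=bins)
          hist = hist.astype(float)
          centers = (edges[:-1] + edges[1:]) / 2
          w0 = np.cumsum(hist)                  # pixels at or below each candidate
          w1 = hist.sum() - w0
          cum_mass = np.cumsum(hist * centers)
          m0 = cum_mass / np.maximum(w0, 1e-12)
          m1 = (cum_mass[-1] - cum_mass) / np.maximum(w1, 1e-12)
          between_var = w0 * w1 * (m0 - m1) ** 2
          return centers[np.argmax(between_var)]

      def vegetation_cover_rate(rgb):           # rgb: H x W x 3 array, values in [0, 1]
          r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
          exg = 2 * g - r - b                   # excess-green index, high for plants
          plant_mask = exg > otsu_threshold(exg.ravel())
          return plant_mask.mean()

      rng = np.random.default_rng(1)
      image = rng.random((100, 100, 3)) * 0.2
      image[30:70, 30:70, 1] += 0.6             # a greener patch standing in for rice
      print(f"cover rate: {vegetation_cover_rate(image):.2f}")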

  5. Algorithms for the automatic identification of MARFEs and UFOs in JET database of visible camera videos

    International Nuclear Information System (INIS)

    Murari, A.; Camplani, M.; Cannas, B.; Usai, P.; Mazon, D.; Delaunay, F.

    2010-01-01

    MARFE instabilities and UFOs leave clear signatures in JET fast visible camera videos. Given the potential harmful consequences of these events, particularly as triggers of disruptions, it would be important to have the means of detecting them automatically. In this paper, the results of various algorithms to identify automatically the MARFEs and UFOs in JET visible videos are reported. The objective is to retrieve the videos, which have captured these events, exploring the whole JET database of images, as a preliminary step to the development of real-time identifiers in the future. For the detection of MARFEs, a complete identifier has been finalized, using morphological operators and Hu moments. The final algorithm manages to identify the videos with MARFEs with a success rate exceeding 80%. Due to the lack of a complete statistics of examples, the UFO identifier is less developed, but a preliminary code can detect UFOs quite reliably. (authors)
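
    A hedged sketch of the kind of processing named above (a morphological opening followed by Hu-moment shape descriptors) is given below; the threshold, kernel size, toy frames and distance-based decision rule are illustrative choices, not the JET identifier.

      # Shape signature of a bright region via morphological cleaning + Hu moments.
      import cv2
      import numpy as np

      def hu_signature(frame_gray, threshold=200):
          _, mask = cv2.threshold(frame_gray, threshold, 255, cv2.THRESH_BINARY)
          kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
          mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove small speckle
          hu = cv2.HuMoments(cv2.moments(mask)).ravel()
          return np.log10(np.abs(hu) + 1e-7)    # compress the huge dynamic range

      def looks_like_marfe(frame_gray, template_signature, tolerance=2.0):
          distance = np.linalg.norm(hu_signature(frame_gray) - template_signature)
          return bool(distance < tolerance)

      # toy frames: a bright horizontal band as a crude stand-in for a MARFE signature
      template = np.zeros((120, 160), np.uint8)
      template[80:95, :] = 255
      frame = np.zeros((120, 160), np.uint8)
      frame[78:94, :] = 255
      print(looks_like_marfe(frame, hu_signature(template)))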

  6. Identification with video game characters as automatic shift of self-perceptions

    NARCIS (Netherlands)

    Klimmt, C.; Hefner, D.; Vorderer, P.A.; Roth, C.; Blake, C.

    2010-01-01

    Two experiments tested the prediction that video game players identify with the character or role they are assigned, which leads to automatic shifts in implicit self-perceptions. Video game identification, thus, is considered as a kind of altered self-experience. In Study 1 (N = 61), participants

  7. Solar Powered Automatic Shrimp Feeding System

    Directory of Open Access Journals (Sweden)

    Dindo T. Ani

    2015-12-01

    Full Text Available Automatic systems have brought many revolutions to existing technologies. One of the technologies that has seen great development is the solar-powered automatic shrimp feeding system. For instance, solar power, which is a renewable energy, can be an alternative solution to the energy crisis and basically reduces manpower by being used in an automatic manner. The researchers believe an automatic shrimp feeding system may help solve problems with manual feeding operations. The project study aimed to design and develop a solar-powered automatic shrimp feeding system. It specifically sought to prepare the design specifications of the project, to determine the methods of fabrication and assembly, and to test the response time of the automatic shrimp feeding system. The researchers designed and developed an automatic system which utilizes a 10-hour timer to be set at intervals preferred by the user and which undergoes a continuous process. The magnetic contactor acts as a switch connected to the 10-hour timer, which controls the activation or termination of electrical loads, and is powered by means of a solar panel outputting electrical power and a rechargeable battery in electrical communication with the solar panel for storing the power. Through a series of tests, the components of the modified system were proven functional and were operating within the desired output. It was recommended that the timer to be used should be tested to avoid malfunction and achieve a fully automatic system, and that the system may be improved to handle changes in the scope of the project.

  8. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  9. Identifying cognitive predictors of reactive and proactive aggression

    NARCIS (Netherlands)

    Brugman, S.; Lobbestael, J.; Arntz, A.R.; Cima, M.; Schumann, T.; Dambacher, F.

    2015-01-01

    The aim of this study was to identify implicit cognitive predictors of aggressive behavior. Specifically, the predictive value of an attentional bias for aggressive stimuli and automatic association of the self and aggression was examined for reactive and proactive aggressive behavior in a

  10. Personality in speech assessment and automatic classification

    CERN Document Server

    Polzehl, Tim

    2015-01-01

    This work combines interdisciplinary knowledge and experience from research fields of psychology, linguistics, audio-processing, machine learning, and computer science. The work systematically explores a novel research topic devoted to automated modeling of personality expression from speech. For this aim, it introduces a novel personality assessment questionnaire and presents the results of extensive labeling sessions to annotate the speech data with personality assessments. It provides estimates of the Big 5 personality traits, i.e. openness, conscientiousness, extroversion, agreeableness, and neuroticism. Based on a database built on the questionnaire, the book presents models to tell apart different personality types or classes from speech automatically.

  11. 77 FR 3404 - Energy Conservation Standards for Automatic Commercial Ice Makers: Public Meeting and...

    Science.gov (United States)

    2012-01-24

    .... Email: [email protected] . SUPPLEMENTARY INFORMATION: I. Statutory Authority II. History of... feedback from interested parties on its analytical framework, models, and preliminary results. II. History... automatic commercial ice makers installed in the field, such as in hospitals and restaurants. Details of the...

  12. Effects of voluntary and automatic control of center of pressure sway during quiet standing.

    Science.gov (United States)

    Ueta, Kozo; Okada, Yohei; Nakano, Hideki; Osumi, Michihiro; Morioka, Shu

    2015-01-01

    The authors investigated the effects of voluntary and automatic control on the spatial variables (envelope area, maximal amplitude, and root mean square [RMS]) of center of pressure (COP) displacement during quiet standing and identified differences in their postural control strategies (mean velocity [MV], mean power frequency [MPF], and power density). COP data were recorded under relaxed (experimental control), still (voluntary control), and dual (automatic control) conditions. RMS was significantly lower in the still and dual conditions than in the relaxed condition. MV, MPF, and power density were significantly higher in the still condition than in the dual condition. These results indicate that both voluntary and automatic control decrease the spatial variables of COP displacement; however, their postural control strategies are different.
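
    For readers unfamiliar with these posturographic measures, the sketch below computes RMS displacement, mean velocity and mean power frequency from a synthetic centre-of-pressure trace; the sampling rate and the signal are assumed for illustration.

      # COP sway measures: RMS, mean velocity (MV) and mean power frequency (MPF).
      import numpy as np

      def cop_measures(cop, fs):
          cop = np.asarray(cop, float)
          cop = cop - cop.mean()                        # displacement about the mean
          rms = np.sqrt(np.mean(cop ** 2))
          mv = np.mean(np.abs(np.diff(cop))) * fs       # mean velocity of displacement
          freqs = np.fft.rfftfreq(len(cop), d=1.0 / fs)
          power = np.abs(np.fft.rfft(cop)) ** 2
          mpf = np.sum(freqs * power) / np.sum(power)   # power-weighted mean frequency
          return rms, mv, mpf

      fs = 100.0                                        # Hz, assumed sampling rate
      t = np.arange(0, 30, 1 / fs)
      sway = (2.0 * np.sin(2 * np.pi * 0.3 * t)
              + 0.3 * np.random.default_rng(2).standard_normal(t.size))
      rms, mv, mpf = cop_measures(sway, fs)
      print(f"RMS={rms:.2f} mm  MV={mv:.1f} mm/s  MPF={mpf:.2f} Hz")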

  13. Fast Appearance Modeling for Automatic Primary Video Object Segmentation.

    Science.gov (United States)

    Yang, Jiong; Price, Brian; Shen, Xiaohui; Lin, Zhe; Yuan, Junsong

    2016-02-01

    Automatic segmentation of the primary object in a video clip is a challenging problem as there is no prior knowledge of the primary object. Most existing techniques thus adopt an iterative approach for foreground and background appearance modeling, i.e., fix the appearance model while optimizing the segmentation and fix the segmentation while optimizing the appearance model. However, these approaches may rely on good initialization and can be easily trapped in local optima. In addition, they are usually time consuming for analyzing videos. To address these limitations, we propose a novel and efficient appearance modeling technique for automatic primary video object segmentation in the Markov random field (MRF) framework. It embeds the appearance constraint as auxiliary nodes and edges in the MRF structure, and can optimize both the segmentation and appearance model parameters simultaneously in one graph cut. The extensive experimental evaluations validate the superiority of the proposed approach over the state-of-the-art methods, in both efficiency and effectiveness.

  14. ORCID Author Identifiers: A Primer for Librarians.

    Science.gov (United States)

    Akers, Katherine G; Sarkozy, Alexandra; Wu, Wendy; Slyman, Alison

    2016-01-01

    The ORCID (Open Researcher and Contributor ID) registry helps disambiguate authors and streamline research workflows by assigning unique 16-digit author identifiers that enable automatic linkages between researchers and their scholarly activities. This article describes how ORCID works, the benefits of using ORCID, and how librarians can promote ORCID at their institutions by raising awareness of ORCID, helping researchers create and populate ORCID profiles, and integrating ORCID identifiers into institutional repositories and other university research information systems.

  15. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). It provides a recent survey of the state of the art. The book shows the main problems of ADS, the difficulties, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been develop

  16. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings ... on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of the medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image...

  17. Automatically generating Feynman rules for improved lattice field theories

    International Nuclear Information System (INIS)

    Hart, A.; Hippel, G.M. von; Horgan, R.R.; Storoni, L.C.

    2005-01-01

    Deriving the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially when improvement terms are present. This physically important task is, however, suitable for automation. We describe a flexible algorithm for generating Feynman rules for a wide range of lattice field theories including gluons, relativistic fermions and heavy quarks. We also present an efficient implementation of this in a freely available, multi-platform programming language (PYTHON), optimised to deal with a wide class of lattice field theories

  18. An evaluation of automatic coronary artery calcium scoring methods with cardiac CT using the orCaScore framework.

    Science.gov (United States)

    Wolterink, Jelmer M; Leiner, Tim; de Vos, Bob D; Coatrieux, Jean-Louis; Kelm, B Michael; Kondo, Satoshi; Salgado, Rodrigo A; Shahzad, Rahil; Shu, Huazhong; Snoeren, Miranda; Takx, Richard A P; van Vliet, Lucas J; van Walsum, Theo; Willems, Tineke P; Yang, Guanyu; Zheng, Yefeng; Viergever, Max A; Išgum, Ivana

    2016-05-01

    The amount of coronary artery calcification (CAC) is a strong and independent predictor of cardiovascular disease (CVD) events. In clinical practice, CAC is manually identified and automatically quantified in cardiac CT using commercially available software. This is a tedious and time-consuming process in large-scale studies. Therefore, a number of automatic methods that require no interaction and semiautomatic methods that require very limited interaction for the identification of CAC in cardiac CT have been proposed. Thus far, a comparison of their performance has been lacking. The objective of this study was to perform an independent evaluation of (semi)automatic methods for CAC scoring in cardiac CT using a publicly available standardized framework. Cardiac CT exams of 72 patients distributed over four CVD risk categories were provided for (semi)automatic CAC scoring. Each exam consisted of a noncontrast-enhanced calcium scoring CT (CSCT) and a corresponding coronary CT angiography (CCTA) scan. The exams were acquired in four different hospitals using state-of-the-art equipment from four major CT scanner vendors. The data were divided into 32 training exams and 40 test exams. A reference standard for CAC in CSCT was defined by consensus of two experts following a clinical protocol. The framework organizers evaluated the performance of (semi)automatic methods on test CSCT scans, per lesion, artery, and patient. Five (semi)automatic methods were evaluated. Four methods used both CSCT and CCTA to identify CAC, and one method used only CSCT. The evaluated methods correctly detected between 52% and 94% of CAC lesions with positive predictive values between 65% and 96%. Lesions in distal coronary arteries were most commonly missed and aortic calcifications close to the coronary ostia were the most common false positive errors. The majority (between 88% and 98%) of correctly identified CAC lesions were assigned to the correct artery. Linearly weighted Cohen's kappa

  19. Landsat 5 TM images and DEM in lithologic mapping of Payen Volcanic Field (Mendoza Province, Argentina)

    International Nuclear Information System (INIS)

    Fornaciai, A.; Bisson, M.; Mazzarini, F.; Del Carlo, P.; Pasquare, G.

    2009-01-01

    Satellite images such as Landsat 5 TM scenes provide an excellent representation of the Earth and a synoptic view of large geographic areas in different band combinations. Landsat TM images allow automatic and semi-automatic classification of land cover; nevertheless, the software may have some difficulties in distinguishing between similar radiometric surfaces. In this case, the use of a Digital Elevation Model (DEM) can be an important tool to identify different surface covers. In this study, several False Color Composites (FCC) of the Landsat 5 TM image, a DEM and the respective draped images were used to delineate lithological boundaries and tectonic features of regional significance of the Payen Volcanic Field (PVF). The PVF is a Quaternary fissural structure belonging to the back-arc extensional areas of the Andes in the Mendoza Province (Argentina), characterized by many composite basaltic lava flow fields. The need to identify different lava flows with the same composition, and therefore the same spectral features, highlights the improvement offered by the synergic use of TM images and a shaded DEM in the visual interpretation. Information obtained from the satellite data and the DEM has been compared with previous geological maps and transferred onto a topographic base map. Based on these data, a new lithological map at 1:100,000 scale is presented.

  20. 30 CFR 77.314 - Automatic temperature control instruments.

    Science.gov (United States)

    2010-07-01

    30 CFR Part 77, Mineral Resources (2010-07-01 edition), Underground Coal Mines, Thermal Dryers. § 77.314 Automatic temperature control instruments. (a) Automatic temperature control instruments for the thermal dryer system shall be of the recording type. (b) Automatic...

  1. Automatic control systems engineering

    International Nuclear Information System (INIS)

    Shin, Yun Gi

    2004-01-01

    This book gives descriptions of automatic control for electrical and electronic systems, covering the history of automatic control, the Laplace transform, block diagrams and signal flow diagrams, electrometers, linearization of systems, the state space, state-space analysis of electrical systems, sensors, hydraulic control systems, stability, the time response of linear dynamic systems, the concept of the root locus, the procedure for drawing a root locus, frequency response, and the design of control systems.

  2. Image processing applied to automatic detection of defects during ultrasonic examination

    International Nuclear Information System (INIS)

    Moysan, J.

    1992-10-01

    This work is a study of image processing applied to ultrasonic B-scan images obtained in the non-destructive testing of welds. The goal is to define what image processing techniques can bring to improve the exploitation of the collected data and, more precisely, what image processing can do to extract the meaningful echoes that make it possible to characterize and size the defects. The report presents non-destructive testing by ultrasound in the nuclear field and indicates the specificities of the propagation of ultrasonic waves in austenitic welds. It gives a state of the art of data processing applied to ultrasonic images in non-destructive evaluation. A new image analysis is then developed. It is based on a powerful tool, the co-occurrence matrix. This matrix makes it possible to represent, in a single representation, the relations between the amplitudes of pairs of pixels. From the matrix analysis, a new complete and automatic method has been devised to define a threshold that separates echoes from noise. An automatic interpretation of the ultrasonic echoes is then possible. Complete validation has been done with standard pieces
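
    The co-occurrence matrix at the heart of the method is straightforward to construct; the sketch below builds it for horizontally adjacent pixel pairs of a quantized amplitude image. Only the matrix construction is shown, since the thesis's threshold-selection rule is not reproduced here, and the synthetic B-scan is illustrative.

      # Grey-level co-occurrence matrix: entry (i, j) counts how often a pixel of
      # level i has a right-hand neighbour of level j.
      import numpy as np

      def cooccurrence_matrix(image, levels=32):
          q = np.clip((image * levels).astype(int), 0, levels - 1)  # quantize amplitudes
          left, right = q[:, :-1].ravel(), q[:, 1:].ravel()         # horizontal pairs
          matrix = np.zeros((levels, levels), dtype=int)
          np.add.at(matrix, (left, right), 1)
          return matrix

      rng = np.random.default_rng(3)
      bscan = rng.random((64, 64)) * 0.3          # background noise
      bscan[30:34, 20:50] += 0.6                  # a bright echo band
      glcm = cooccurrence_matrix(np.clip(bscan, 0, 1))
      # pairs where both amplitudes are high concentrate in the lower-right block:
      print("high/high pairs:", glcm[19:, 19:].sum(),
            " low/low pairs:", glcm[:10, :10].sum())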

  3. The Rationalization of Automatic Units for HPDC Technology

    Directory of Open Access Journals (Sweden)

    A. Herman

    2012-04-01

    Full Text Available The paper deals with the problem of the optimal use of an automatic workplace for HPDC technology - mainly from the aspects of operation sequence, efficiency of the work cycle and planning of the use and servicing of the HPDC casting machine. Possible ways to analyse automatic units for HPDC are presented. The experimental part was focused on the rationalization of the current work cycle time for die casting of an aluminium alloy. The working place was described in detail in the project. The measurements were carried out in detail with the help of charts and graphs mapping the cycle of the casting workplace. Other parameters and settings were also identified. Proposals for improvements were made after the first measurements and these improvements were subsequently verified. The main actions were mainly software modifications of the casting center. This is because today's sophisticated workplaces offer a relatively wide range of modifications without any physical harm to the machines themselves. It is possible to change settings or unlock some unsatisfactory parameters.

  4. FURTHER CONSIDERATIONS ON SPREADSHEET-BASED AUTOMATIC TREND LINES

    Directory of Open Access Journals (Sweden)

    DANIEL HOMOCIANU

    2015-12-01

    Full Text Available Most of today's business applications working with data sets allow exports to the spreadsheet format. This fact is related to the familiarity of common business users with such products and to the possibility of coupling what they have with something containing many models, functions and possibilities to process and represent data, thereby getting something dynamic and much more useful than a simple static report. The purpose of Business Intelligence is to identify clusters, profiles, association rules, decision trees and many other patterns or even behaviours, but also to generate alerts for exceptions, determine trends and make predictions about the future based on historical data. In this context, the paper shows some practical results obtained after testing both the automatic creation of scatter charts and trend lines corresponding to the user’s preferences and the automatic suggestion of the most appropriate trend for the tested data, mostly based on a statistical measure of how close the data are to the regression function.
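
    The trend-suggestion idea can be sketched by fitting the usual spreadsheet trend families and keeping the one with the highest R². The model list, the data and the scoring below are illustrative assumptions, not the paper's implementation.

      # Fit linear, logarithmic, exponential and power trends; suggest the best by R^2.
      import numpy as np

      def r_squared(y, y_hat):
          ss_res = np.sum((y - y_hat) ** 2)
          ss_tot = np.sum((y - y.mean()) ** 2)
          return 1 - ss_res / ss_tot

      def best_trend(x, y):
          fits = {}
          a, b = np.polyfit(x, y, 1)
          fits["linear"] = a * x + b
          a, b = np.polyfit(np.log(x), y, 1)
          fits["logarithmic"] = a * np.log(x) + b
          a, b = np.polyfit(x, np.log(y), 1)
          fits["exponential"] = np.exp(b) * np.exp(a * x)
          a, b = np.polyfit(np.log(x), np.log(y), 1)
          fits["power"] = np.exp(b) * x ** a
          scores = {name: r_squared(y, fit) for name, fit in fits.items()}
          return max(scores, key=scores.get), scores

      x = np.arange(1, 13, dtype=float)               # e.g. monthly periods
      y = 50 * 1.12 ** x * (1 + 0.02 * np.random.default_rng(4).standard_normal(x.size))
      name, scores = best_trend(x, y)
      print("suggested trend:", name, {k: round(v, 3) for k, v in scores.items()})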

  5. Automatic crude oil handling through a pressurized system from the wellhead to the refinery

    Energy Technology Data Exchange (ETDEWEB)

    Davis, W.B.; Truman, P.W.; Groeneman, A.R.

    1967-01-01

    Production from 51 wells completed in the 3 unitized formations of the Lost Soldier Field, Sweetwater Co., Wyoming, is brought to a central point through individual flow lines. Here the fluids are directed through separate automatic well testing and oil treating facilities, one for each formation. After separation of oil, gas and water, the oil goes to pressurized surge tanks and then to lease automatic custody transfer units. There is one surge tank and one LACT unit for each formation. The oil is automatically transferred to the Sinclair Pipe Line Co. for delivery to Sinclair's refinery at Sinclair, Wyoming, through a closed pipe line system. A central console provides: (1) supervisory control from the wellheads through the LACT units, (2) well test and production data logging, and (3) monitoring by activating alarms for abnormal conditions of flow, liquid levels, temperatures and pressures.

  6. Integration of wireless sensor networks into automatic irrigation scheduling of a center pivot

    Science.gov (United States)

    A six-span center pivot system was used as a platform for testing two wireless sensor networks (WSN) of infrared thermometers. The cropped field was a semi-circle, divided into six pie shaped sections of which three were irrigated manually and three were irrigated automatically based on the time tem...

  7. Monitoring caustic injuries from emergency department databases using automatic keyword recognition software.

    Science.gov (United States)

    Vignally, P; Fondi, G; Taggi, F; Pitidis, A

    2011-03-31

    In Italy the European Union Injury Database reports the involvement of chemical products in 0.9% of home and leisure accidents. The Emergency Department registry on domestic accidents in Italy and the Poison Control Centres record that 90% of cases of exposure to toxic substances occur in the home. It is not rare for the effects of chemical agents to be observed in hospitals, with a high potential risk of damage - the rate of this cause of hospital admission is double the domestic injury average. The aim of this study was to monitor the effects of injuries caused by caustic agents in Italy using automatic free-text recognition in Emergency Department medical databases. We created a Stata software program to automatically identify caustic or corrosive injury cases using an agent-specific list of keywords. We focused attention on the procedure's sensitivity and specificity. Ten hospitals in six regions of Italy participated in the study. The program identified 112 cases of injury by caustic or corrosive agents. Checking the cases by quality controls (based on manual reading of ED reports), we assessed 99 cases as true positive, i.e. 88.4% of the patients were automatically recognized by the software as being affected by caustic substances (99% CI: 80.6%- 96.2%), that is to say 0.59% (99% CI: 0.45%-0.76%) of the whole sample of home injuries, a value almost three times as high as that expected (p < 0.0001) from European codified information. False positives were 11.6% of the recognized cases (99% CI: 5.1%- 21.5%). Our automatic procedure for caustic agent identification proved to have excellent product recognition capacity with an acceptable level of excess sensitivity. Contrary to our a priori hypothesis, the automatic recognition system provided a level of identification of agents possessing caustic effects that was significantly much greater than was predictable on the basis of the values from current codifications reported in the European Database.
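
    The screening step amounts to agent-specific keyword matching over free-text reports, followed by comparison with manually checked cases. The sketch below shows the idea in Python (the study used a Stata program); the keywords, reports and labels are made up for the example.

      # Keyword screening of free-text ED reports, with sensitivity and PPV vs manual labels.
      import re

      KEYWORDS = ["caustic", "corrosive", "sodium hydroxide", "bleach", "drain cleaner", "lye"]
      pattern = re.compile("|".join(re.escape(k) for k in KEYWORDS), re.IGNORECASE)

      def flag_caustic(report_text):
          return bool(pattern.search(report_text))

      reports = [
          ("Child ingested drain cleaner, oral burns noted", True),
          ("Fall from ladder, wrist fracture", False),
          ("Splash of bleach to the eye while cleaning", True),
          ("Accidental ingestion of detergent, observation", True),  # missed: no listed keyword
      ]
      flags = [flag_caustic(text) for text, _ in reports]
      true_positives = sum(f and label for f, (_, label) in zip(flags, reports))
      sensitivity = true_positives / sum(label for _, label in reports)
      ppv = true_positives / max(sum(flags), 1)
      print(f"sensitivity={sensitivity:.2f}  positive predictive value={ppv:.2f}")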

  8. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    International Nuclear Information System (INIS)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship 'Mutsu'. As a result, it was shown that this system has sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  9. Development of advanced automatic control system for nuclear ship. 2. Perfect automatic operation after reactor scram events

    Energy Technology Data Exchange (ETDEWEB)

    Yabuuchi, Noriaki; Nakazawa, Toshio; Takahashi, Hiroki; Shimazaki, Junya; Hoshi, Tsutao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-11-01

    An automatic operation system has been developed for the purpose of realizing perfect automatic plant operation after reactor scram events. The goal of the automatic operation after a reactor scram event is to bring the reactor to a hot stand-by condition automatically. The basic functions of this system are as follows: to monitor the actions of the safety equipment after a reactor scram, to control the necessary equipment to bring the reactor to a hot stand-by condition automatically, and to energize a decay heat removal system. The performance evaluation of this system was carried out by comparing the results obtained using the Nuclear Ship Engineering Simulation System (NESSY) with those measured in the scram test of the nuclear ship `Mutsu`. As a result, it was shown that this system has sufficient performance to bring the reactor to a hot stand-by condition quickly and safely. (author)

  10. Automatic exchange unit for control rod drive device

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To enable automatic restart and continuation of operation, without an external power-interruption remedy device, when the interrupted power source is recovered during automatic positioning operation. Constitution: In an automatic exchange unit for a control rod drive device of the control type, which sets to zero the deviation between the positioning target position and the present position of the device, the position data of the drive device and the positioning target value of the device are automatically read, and an operation-inhibit interlock is applied to the control system until the data reading is completed and the automatic operation start or restart conditions are sequentially confirmed. After the confirmation, the interlock is released to start or restart the automatic operation. Accordingly, the automatic operation can be safely restarted and continued. (Yoshihara, H.)

  11. 2011 International Conference in Electrics, Communication and Automatic Control Proceedings

    CERN Document Server

    2012-01-01

    This two-volume set contains the very latest, cutting-edge material in electrics, communication and automatic control. As a vital field of research that is highly relevant to current developments in a number of technological domains, the subjects it covers include micro-electronics and integrated circuit control, signal processing technology, next-generation network infrastructure, wireless communication and scientific instruments. The aim of the International Conference in Electrics, Communication and Automatic Control, held in Chongqing, China, in June 2011 was to provide a valuable inclusive platform for researchers, engineers, academicians and industrial professionals from all over the world to share their research results with fellow scientists in the sector. The call for papers netted well over 600 submissions, of which 224 were selected for presentation. This fully peer-reviewed collection of papers from the conference can be viewed as a single-source compendium of the latest trends and techniques in t...

  12. Automatic process control in anaerobic digestion technology: A critical review.

    Science.gov (United States)

    Nguyen, Duc; Gadhamshetty, Venkataramana; Nitayavardhana, Saoharit; Khanal, Samir Kumar

    2015-10-01

    Anaerobic digestion (AD) is a mature technology that relies upon a synergistic effort of a diverse group of microbial communities for metabolizing diverse organic substrates. However, AD is highly sensitive to process disturbances, and thus it is advantageous to use online monitoring and process control techniques to efficiently operate AD process. A range of electrochemical, chromatographic and spectroscopic devices can be deployed for on-line monitoring and control of the AD process. While complexity of the control strategy ranges from a feedback control to advanced control systems, there are some debates on implementation of advanced instrumentations or advanced control strategies. Centralized AD plants could be the answer for the applications of progressive automatic control field. This article provides a critical overview of the available automatic control technologies that can be implemented in AD processes at different scales. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. 5th International Conference on Electrical Engineering and Automatic Control

    CERN Document Server

    Yao, Yufeng

    2016-01-01

    On the basis of instrument electrical and automatic control system, the 5th International Conference on Electrical Engineering and Automatic Control (CEEAC) was established at the crossroads of information technology and control technology, and seeks to effectively apply information technology to a sweeping trend that views control as the core of intelligent manufacturing and life. This book takes a look forward into advanced manufacturing development, an area shaped by intelligent manufacturing. It highlights the application and promotion of process control represented by traditional industries, such as the steel industry and petrochemical industry; the technical equipment and system cooperative control represented by robot technology and multi-axis CNC; and the control and support of emerging process technologies represented by laser melting and stacking, as well as the emerging industry represented by sustainable and intelligent life. The book places particular emphasis on the micro-segments field, such as...

  14. Automatic coronary calcium scoring using noncontrast and contrast CT images

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Guanyu, E-mail: yang.list@seu.edu.cn; Chen, Yang; Shu, Huazhong [Laboratory of Image Science and Technology, School of Computer Science and Engineering, Southeast University, No. 2, Si Pai Lou, Nanjing 210096 (China); Centre de Recherche en Information Biomédicale Sino-Français (LIA CRIBs), Nanjing 210096 (China); Key Laboratory of Computer Network and Information Integration, Southeast University, Ministry of Education, Nanjing 210096 (China); Ning, Xiufang; Sun, Qiaoyu [Laboratory of Image Science and Technology, School of Computer Science and Engineering, Southeast University, No. 2, Si Pai Lou, Nanjing 210096 (China); Key Laboratory of Computer Network and Information Integration, Southeast University, Ministry of Education, Nanjing 210096 (China); Coatrieux, Jean-Louis [INSERM-U1099, Rennes F-35000 (France); Labotatoire Traitement du Signal et de l’Image (LTSI), Université de Rennes 1, Campus de Beaulieu, Bat. 22, Rennes 35042 Cedex (France); Centre de Recherche en Information Biomédicale Sino-Français (LIA CRIBs), Nanjing 210096 (China)

    2016-05-15

    Purpose: Calcium scoring is widely used to assess the risk of coronary heart disease (CHD). Accurate coronary artery calcification detection in noncontrast CT image is a prerequisite step for coronary calcium scoring. Currently, calcified lesions in the coronary arteries are manually identified by radiologists in clinical practice. Thus, in this paper, a fully automatic calcium scoring method was developed to alleviate the work load of the radiologists or cardiologists. Methods: The challenge of automatic coronary calcification detection is to discriminate the calcification in the coronary arteries from the calcification in the other tissues. Since the anatomy of coronary arteries is difficult to be observed in the noncontrast CT images, the contrast CT image of the same patient is used to extract the regions of the aorta, heart, and coronary arteries. Then, a patient-specific region-of-interest (ROI) is generated in the noncontrast CT image according to the segmentation results in the contrast CT image. This patient-specific ROI focuses on the regions in the neighborhood of coronary arteries for calcification detection, which can eliminate the calcifications in the surrounding tissues. A support vector machine classifier is applied finally to refine the results by removing possible image noise. Furthermore, the calcified lesions in the noncontrast images belonging to the different main coronary arteries are identified automatically using the labeling results of the extracted coronary arteries. Results: Forty datasets from four different CT machine vendors were used to evaluate their algorithm, which were provided by the MICCAI 2014 Coronary Calcium Scoring (orCaScore) Challenge. The sensitivity and positive predictive value for the volume of detected calcifications are 0.989 and 0.948. Only one patient out of 40 patients had been assigned to the wrong risk category defined according to Agatston scores (0, 1–100, 101–300, >300) by comparing with the ground
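
    The patient-level agreement mentioned above is computed against the quoted Agatston risk bins; a small helper such as the one below (illustrative only) maps a score to its category.

      # Map an Agatston score to the risk bins quoted in the record (0, 1-100, 101-300, >300).
      def agatston_risk_category(score: float) -> str:
          if score <= 0:
              return "0 (no identifiable calcification)"
          if score <= 100:
              return "1-100"
          if score <= 300:
              return "101-300"
          return ">300"

      for s in (0, 42, 250, 812):
          print(s, "->", agatston_risk_category(s))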

  15. Position automatic determination technology

    International Nuclear Information System (INIS)

    1985-10-01

    This book covers methods of position determination and their characteristics, control methods for position determination and points of design, points to consider in choosing sensors for position detection, position determination in digital control systems, the application of clutches and brakes in high-frequency position determination, automation techniques for position determination, position determination by electromagnetic clutch and brake, air cylinder, cam and solenoid, stop position control of automatic guided vehicles, and stacker cranes and automatic transfer control.

  16. High-Speed Automatic Microscopy for Real Time Tracks Reconstruction in Nuclear Emulsion

    Science.gov (United States)

    D'Ambrosio, N.

    2006-06-01

    The Oscillation Project with Emulsion-tRacking Apparatus (OPERA) experiment will use a massive nuclear emulsion detector to search for νμ → ντ oscillation by identifying τ leptons through the direct detection of their decay topology. The feasibility of experiments using a large-mass emulsion detector is linked to the impressive progress under way in the development of automatic emulsion analysis. A new generation of scanning systems requires the development of fast automatic microscopes for emulsion scanning and image analysis to reconstruct the tracks of elementary particles. The paper presents the European Scanning System (ESS) developed in the framework of the OPERA collaboration.

  17. A neurocomputational model of automatic sequence production.

    Science.gov (United States)

    Helie, Sebastien; Roeder, Jessica L; Vucovich, Lauren; Rünger, Dennis; Ashby, F Gregory

    2015-07-01

    Most behaviors unfold in time and include a sequence of submovements or cognitive activities. In addition, most behaviors are automatic and repeated daily throughout life. Yet, relatively little is known about the neurobiology of automatic sequence production. Past research suggests a gradual transfer from the associative striatum to the sensorimotor striatum, but a number of more recent studies challenge this role of the BG in automatic sequence production. In this article, we propose a new neurocomputational model of automatic sequence production in which the main role of the BG is to train cortical-cortical connections within the premotor areas that are responsible for automatic sequence production. The new model is used to simulate four different data sets from human and nonhuman animals, including (1) behavioral data (e.g., RTs), (2) electrophysiology data (e.g., single-neuron recordings), (3) macrostructure data (e.g., TMS), and (4) neurological circuit data (e.g., inactivation studies). We conclude with a comparison of the new model with existing models of automatic sequence production and discuss a possible new role for the BG in automaticity and its implication for Parkinson's disease.

  18. Automatic ultrasound image enhancement for 2D semi-automatic breast-lesion segmentation

    Science.gov (United States)

    Lu, Kongkuo; Hall, Christopher S.

    2014-03-01

    Breast cancer is the fastest growing cancer, accounting for 29% of new cases in 2012, and is the second leading cause of cancer death among women in the United States and worldwide. Ultrasound (US) has been used as an indispensable tool for breast cancer detection/diagnosis and treatment. In computer-aided assistance, lesion segmentation is a preliminary but vital step, but the task is quite challenging in US images, due to imaging artifacts that complicate detection and measurement of the suspect lesions. The lesions usually present with poor boundary features and vary significantly in size, shape, and intensity distribution between cases. Automatic methods are highly application dependent, while manual tracing methods are extremely time consuming and have a great deal of intra- and inter-observer variability. Semi-automatic approaches are designed to counterbalance the advantages and drawbacks of the automatic and manual methods. However, considerable user interaction might be necessary to ensure reasonable segmentation for a wide range of lesions. This work proposes an automatic enhancement approach to improve the boundary-searching ability of the live-wire method and reduce the necessary user interaction while keeping the segmentation performance. Based on the results of segmentation of 50 2D breast lesions in US images, less user interaction is required to achieve the desired accuracy, i.e. < 80%, when auto-enhancement is applied for live-wire segmentation.

  19. The ‘Continuing Misfortune’ of Automatism in Early Surrealism

    Directory of Open Access Journals (Sweden)

    Tessel M. Bauduin

    2015-09-01

    Full Text Available In the 1924 Manifesto of Surrealism surrealist leader André Breton (1896-1966 defined Surrealism as ‘psychic automatism in its pure state,’ positioning ‘psychic automatism’ as both a concept and a technique. This definition followed upon an intense period of experimentation with various forms of automatism among the proto-surrealist group; predominantly automatic writing, but also induced dream states. This article explores how surrealist ‘psychic automatism’ functioned as a mechanism for communication, or the expression of thought as directly as possible through the unconscious, in the first two decades of Surrealism. It touches upon automatic writing, hysteria as an automatic bodily performance of the unconscious, dreaming and the experimentation with induced dream states, and automatic drawing and other visual arts-techniques that could be executed more or less automatically as well. For all that the surrealists reinvented automatism for their own poetic, artistic and revolutionary aims, the automatic techniques were primarily drawn from contemporary Spiritualism, psychical research and experimentation with mediums, and the article teases out the connections to mediumistic automatism. It is demonstrated how the surrealists effectively and successfully divested automatism of all things spiritual. It furthermore becomes clear that despite various mishaps, automatism in many forms was a very successful creative technique within Surrealism.

  20. Automatic Detection of Vehicles Using Intensity Laser and Anaglyph Image

    Directory of Open Access Journals (Sweden)

    Hideo Araki

    2006-12-01

    Full Text Available In this work a methodology is presented for the automatic detection of moving cars in digital aerial images of urban areas, using intensity, anaglyph and subtraction images. The anaglyph image is used to identify the moving cars in the exposure, because the cars appear in red due to the lack of homology between the objects. An implicit model was developed to provide a digital pixel value that has this specific property, using the ratio between the RGB colors of the car object in the anaglyph image. The intensity image is used to decrease the false positives and to restrict the processing to roads and streets. The subtraction image is applied to decrease the false positives caused by road markings. The goal of this paper is to automatically detect moving cars present in digital aerial images of urban areas. The implemented algorithm applies normalization to the left and right images and later forms the anaglyph using the translation. The results show the applicability of the proposed method and its potential for automatic car detection, and present the performance of the proposed methodology.
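
    The red-dominance test implied by the RGB-ratio model can be sketched in a few lines: pixels whose red channel strongly exceeds green and blue, restricted to a road mask, are flagged as moving cars. The ratio threshold, the toy image and the mask below are assumptions for illustration.

      # Flag red-dominant pixels in an anaglyph, restricted to a road mask.
      import numpy as np

      def detect_moving_pixels(anaglyph_rgb, road_mask, ratio=1.8):
          r = anaglyph_rgb[..., 0].astype(float) + 1e-6
          g = anaglyph_rgb[..., 1].astype(float) + 1e-6
          b = anaglyph_rgb[..., 2].astype(float) + 1e-6
          red_dominant = (r / g > ratio) & (r / b > ratio)   # "red due to non-homology"
          return red_dominant & road_mask                    # restrict to roads/streets

      rng = np.random.default_rng(5)
      image = (rng.random((80, 80, 3)) * 60 + 80).astype(np.uint8)  # grey-ish background
      image[40:44, 10:18] = (220, 60, 60)                           # a red blob = moving car
      roads = np.zeros((80, 80), bool)
      roads[35:50, :] = True                                        # hypothetical road band
      print("moving-car pixels:", int(detect_moving_pixels(image, roads).sum()))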

  1. Improvement of remote control system of automatic ultrasonic equipment for inspection of reactor pressure vessel

    International Nuclear Information System (INIS)

    Cheong, Yong Moo; Jung, H. K.; Joo, Y. S.; Koo, K. M.; Hyung, H.; Sim, C. M.; Gong, U. S.; Kim, S. H.; Lee, J. P.; Rhoo, H. C.; Kim, M. S.; Ryoo, S. K.; Choi, C. H.; Oh, K. I.

    1999-12-01

    One of the important issues related to nuclear safety is the in-service inspection of the reactor pressure vessel (RPV). A remote-controlled automatic ultrasonic method is applied to this inspection. At present the automatic ultrasonic inspection system owned by KAERI is out of service due to the degradation of parts. In order to resume field inspection, a new remote control system was designed and installed on the existing equipment. New ultrasonic sensors and their modules for RPV inspection were designed and fabricated in accordance with the new requirements of the inspection codes. The ultrasonic sensors were verified for use in RPV inspection. (author)

  2. An Automated Self-Learning Quantification System to Identify Visible Areas in Capsule Endoscopy Images.

    Science.gov (United States)

    Hashimoto, Shinichi; Ogihara, Hiroyuki; Suenaga, Masato; Fujita, Yusuke; Terai, Shuji; Hamamoto, Yoshihiko; Sakaida, Isao

    2017-08-01

    Visibility in capsule endoscopic images is presently evaluated through intermittent analysis of frames selected by a physician. It is thus subjective and not quantitative. A method to automatically quantify the visibility on capsule endoscopic images has not been reported. Generally, when designing automated image recognition programs, physicians must provide a training image; this process is called supervised learning. We aimed to develop a novel automated self-learning quantification system to identify visible areas on capsule endoscopic images. The technique was developed using 200 capsule endoscopic images retrospectively selected from each of three patients. The rate of detection of visible areas on capsule endoscopic images between a supervised learning program, using training images labeled by a physician, and our novel automated self-learning program, using unlabeled training images without intervention by a physician, was compared. The rate of detection of visible areas was equivalent for the supervised learning program and for our automatic self-learning program. The visible areas automatically identified by self-learning program correlated to the areas identified by an experienced physician. We developed a novel self-learning automated program to identify visible areas in capsule endoscopic images.

  3. Programmable automatic alpha--beta air sample counter

    International Nuclear Information System (INIS)

    Howell, W.P.

    1978-01-01

    A programmable automatic alpha-beta air sample counter was developed for routine sample counting by operational health physics personnel. The system is composed of an automatic sample changer utilizing a large silicon diode detector, an electronic counting system with energy analysis capability, an automatic data acquisition controller, an interface module, and a teletypewriter with paper tape punch and paper tape reader. The system is operated through the teletypewriter keyboard and the paper tape reader, which are used to instruct the automatic data acquisition controller. Paper tape programs are provided for background counting, Chi 2 test, and sample counting. Output data are printed by the teletypewriter on standard continuous roll or multifold paper. Data are automatically corrected for background and counter efficiency

  4. The epidural needle guidance with an intelligent and automatic identification system for epidural anesthesia

    Science.gov (United States)

    Kao, Meng-Chun; Ting, Chien-Kun; Kuo, Wen-Chuan

    2018-02-01

    Incorrect placement of the needle causes medical complications in the epidural block, such as dural puncture or spinal cord injury. This study proposes a system which combines an optical coherence tomography (OCT) imaging probe with an automatic identification (AI) system to objectively identify the position of the epidural needle tip. The automatic identification system uses three features as image parameters to distinguish the different tissue by three classifiers. Finally, we found that the support vector machine (SVM) classifier has highest accuracy, specificity, and sensitivity, which reached to 95%, 98%, and 92%, respectively.

  5. Robot-assisted automatic ultrasound calibration.

    Science.gov (United States)

    Aalamifar, Fereshteh; Cheng, Alexis; Kim, Younsu; Hu, Xiao; Zhang, Haichong K; Guo, Xiaoyu; Boctor, Emad M

    2016-10-01

    Ultrasound (US) calibration is the process of determining the unknown transformation from a coordinate frame such as the robot's tooltip to the US image frame and is a necessary task for any robotic or tracked US system. US calibration requires submillimeter-range accuracy for most applications, but it is a time-consuming and repetitive task. We provide a new framework for automatic US calibration with robot assistance and without the need for temporal calibration. US calibration based on active echo (AE) phantom was previously proposed, and its superiority over conventional cross-wire phantom-based calibration was shown. In this work, we use AE to guide the robotic arm motion through the process of data collection; we combine the capability of the AE point to localize itself in the frame of the US image with the automatic motion of the robotic arm to provide a framework for calibrating the arm to the US image automatically. We demonstrated the efficacy of the automated method compared to the manual method through experiments. To highlight the necessity of frequent ultrasound calibration, it is demonstrated that the calibration precision changed from 1.67 to 3.20 mm if the data collection is not repeated after a dismounting/mounting of the probe holder. In a large data set experiment, similar reconstruction precision of automatic and manual data collection was observed, while the time was reduced by 58 %. In addition, we compared ten automatic calibrations with ten manual ones, each performed in 15 min, and showed that all the automatic ones could converge in the case of setting the initial matrix as identity, while this was not achieved by manual data sets. Given the same initial matrix, the repeatability of the automatic was [0.46, 0.34, 0.80, 0.47] versus [0.42, 0.51, 0.98, 1.15] mm in the manual case for the US image four corners. The submillimeter accuracy requirement of US calibration makes frequent data collections unavoidable. We proposed an automated
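
    The geometric core of such point-based calibration, recovering a rigid transform from corresponding points expressed in two frames, can be sketched with an SVD (Kabsch) fit. The example below ignores scale and the full probe-mount chain, and its points and noise-free setup are illustrative, not the paper's formulation.

      # Rigid transform between two point sets via the SVD (Kabsch) fit.
      import numpy as np

      def rigid_fit(src, dst):
          src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
          u, _, vt = np.linalg.svd(src_c.T @ dst_c)
          d = np.sign(np.linalg.det(vt.T @ u.T))          # guard against reflections
          rot = vt.T @ np.diag([1, 1, d]) @ u.T
          trans = dst.mean(0) - rot @ src.mean(0)
          return rot, trans

      rng = np.random.default_rng(6)
      pts_image = rng.random((10, 3)) * 50                # hypothetical points, image frame
      angle = np.deg2rad(20)
      true_rot = np.array([[np.cos(angle), -np.sin(angle), 0],
                           [np.sin(angle),  np.cos(angle), 0],
                           [0, 0, 1]])
      pts_robot = pts_image @ true_rot.T + np.array([10.0, -5.0, 2.0])
      rot, trans = rigid_fit(pts_image, pts_robot)
      residual = np.linalg.norm(pts_image @ rot.T + trans - pts_robot, axis=1).max()
      print(f"max residual: {residual:.2e} mm")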

  6. Review of Stat-Spotting: A Field Guide to Identifying Dubious Data by Joel Best

    Directory of Open Access Journals (Sweden)

    Joe Swingle

    2009-07-01

    Full Text Available Best, Joel. Stat-Spotting: A Field Guide to Identifying Dubious Data. (Berkeley: University of California Press, 2008) 144 pp. $19.95. ISBN 978-0-520-25746-7. Stat-Spotting is a practical, do-it-yourself manual for detecting questionable claims reported in the media. Using examples drawn mostly from mass media sources, Stat-Spotting provides readers with a number of useful tips for identifying potentially problematic statistics. The author’s skillful analyses and explanations presented in clear and concise prose make Stat-Spotting an ideal guide for anyone who reads a newspaper, watches television, or surfs the Web. In short, everyone.

  7. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Hovland, Paul D; Norris, Boyana; Strout, Michelle Mills; Bhowmick, Sanjukta; Utke, Jean

    2005-01-01

    Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms
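
    To make the underlying idea concrete, here is a minimal forward-mode automatic differentiation sketch using dual numbers; it illustrates the general technique only and is unrelated to the specific tools surveyed in the paper.

```python
# Forward-mode AD with dual numbers: propagate (value, derivative) pairs through arithmetic.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)  # product rule
    __rmul__ = __mul__

def f(x):
    return x * x * x + 2 * x + 1        # f(x) = x^3 + 2x + 1

x = Dual(2.0, 1.0)                      # seed derivative dx/dx = 1
y = f(x)
print(y.value, y.deriv)                 # 13.0 and f'(2) = 3*4 + 2 = 14.0
```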

  8. Effect of automatic recirculation flow control on the transient response for Lungmen ABWR plant

    Energy Technology Data Exchange (ETDEWEB)

    Tzang, Y.-C., E-mail: yctzang@aec.gov.t [National Tsing Hua University, Department of Engineering and System Science, Hsinchu 30013, Taiwan (China); Chiang, R.-F.; Ferng, Y.-M.; Pei, B.-S. [National Tsing Hua University, Department of Engineering and System Science, Hsinchu 30013, Taiwan (China)

    2009-12-15

    In this study the automatic mode of the recirculation flow control system (RFCS) for the Lungmen ABWR plant has been modeled and incorporated into the basic RETRAN-02 system model. The integrated system model is then used to perform the analyses for the two transients in which the automatic RFCS is involved. The two transients selected are: (1) one reactor internal pump (RIP) trip, and (2) loss of feedwater heating. In general, the integrated system model can predict well the response of key system parameters, including neutron flux, steam dome pressure, heat flux, RIP flow, core inlet flow, feedwater flow, steam flow, and reactor water level. The transients are also analyzed for the manual RFCS case; comparisons of the transient response of the key system parameters between the automatic and manual RFCS cases show that the differences in transient response can be clearly identified. Also, the results show that the DELTACPR (delta critical power ratio) for the transients analyzed may not be less limiting for the automatic RFCS case under certain combinations of control system settings.

  9. Automatically sweeping dual-channel boxcar integrator

    International Nuclear Information System (INIS)

    Keefe, D.J.; Patterson, D.R.

    1978-01-01

    An automatically sweeping dual-channel boxcar integrator has been developed to automate the search for a signal that repeatedly follows a trigger pulse by a constant or slowly varying time delay when that signal is completely hidden in random electrical noise and dc-offset drifts. The automatically sweeping dual-channel boxcar integrator improves the signal-to-noise ratio and eliminates dc-drift errors in the same way that a conventional dual-channel boxcar integrator does, but, in addition, automatically locates the hidden signal. When the signal is found, its time delay is displayed with 100-ns resolution, and its peak value is automatically measured and displayed. This relieves the operator of the tedious, time-consuming, and error-prone search for the signal whenever the time delay changes. The automatically sweeping boxcar integrator can also be used as a conventional dual-channel boxcar integrator. In either mode, it can repeatedly integrate a signal up to 990 times and thus make accurate measurements of the signal pulse height in the presence of random noise, dc offsets, and unsynchronized interfering signals

  10. What Automaticity Deficit? Activation of Lexical Information by Readers with Dyslexia in a Rapid Automatized Naming Stroop-Switch Task

    Science.gov (United States)

    Jones, Manon W.; Snowling, Margaret J.; Moll, Kristina

    2016-01-01

    Reading fluency is often predicted by rapid automatized naming (RAN) speed, which as the name implies, measures the automaticity with which familiar stimuli (e.g., letters) can be retrieved and named. Readers with dyslexia are considered to have less "automatized" access to lexical information, reflected in longer RAN times compared with…

  11. Progressive sampling-based Bayesian optimization for efficient and automatic machine learning model selection.

    Science.gov (United States)

    Zeng, Xueqiang; Luo, Gang

    2017-12-01

    Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the bar to machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method. We show that compared to a state of the art automatic selection method, our method can significantly reduce search time, classification error rate, and standard deviation of error rate due to randomization. This is major progress towards enabling fast turnaround in identifying high-quality solutions required by many machine learning-based clinical data analysis tasks.
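
    As a hedged illustration of the general idea (not the authors' algorithm), the sketch below couples progressively larger training samples with a Gaussian-process surrogate that proposes the next hyper-parameter value to evaluate. The sampling schedule, the acquisition rule and the single tuned hyper-parameter are all simplifying assumptions.

```python
# Progressive sampling + GP-based search over one hyper-parameter (SVM's C); illustrative only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=4000, n_features=20, random_state=0)
log_c_grid = np.linspace(-3, 3, 25).reshape(-1, 1)      # candidate log10(C) values

observed_c, observed_err = [], []
sample_size = 250
rng = np.random.default_rng(0)

while sample_size <= len(X):
    idx = rng.choice(len(X), size=sample_size, replace=False)
    for _ in range(3):                                   # a few evaluations per sample size
        if len(observed_c) < 3:                          # random exploration at first
            next_c = float(rng.choice(log_c_grid.ravel()))
        else:                                            # then lower-confidence-bound on the GP
            gp = GaussianProcessRegressor(normalize_y=True).fit(
                np.array(observed_c).reshape(-1, 1), np.array(observed_err))
            mean, std = gp.predict(log_c_grid, return_std=True)
            next_c = float(log_c_grid[np.argmin(mean - std), 0])
        err = 1.0 - cross_val_score(SVC(C=10 ** next_c), X[idx], y[idx], cv=3).mean()
        observed_c.append(next_c)
        observed_err.append(err)
    sample_size *= 2                                     # progressively larger samples

best = observed_c[int(np.argmin(observed_err))]
print("best log10(C) found:", best)
```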

  12. Automatic supervision and fault detection of PV systems based on power losses analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chouder, A.; Silvestre, S. [Electronic Engineering Department, Universitat Politecnica de Catalunya, C/Jordi Girona 1-3, Campus Nord UPC, 08034 Barcelona (Spain)

    2010-10-15

    In this work, we present a new automatic supervision and fault detection procedure for PV systems, based on power losses analysis. This automatic supervision system has been developed in the Matlab and Simulink environment. It includes parameter extraction techniques to calculate the main PV system parameters from monitoring data in real working conditions, taking into account the environmental irradiance and module temperature evolution, allowing simulation of the PV system behaviour in real time. The automatic supervision method analyses the output power losses present on the DC side of the PV generator, the capture losses. Two new power losses indicators are defined: thermal capture losses (L_ct) and miscellaneous capture losses (L_cm). The processing of these indicators allows the supervision system to generate a faulty signal as an indicator of fault detection in the PV system operation. Two new indicators of the deviation of the DC variables with respect to the simulated ones have also been defined. These indicators are the current and voltage ratios: R_C and R_V. Analysing both the faulty signal and the current/voltage ratios, the type of fault can be identified. The automatic supervision system has been successfully tested experimentally. (author)

  13. Automatic supervision and fault detection of PV systems based on power losses analysis

    International Nuclear Information System (INIS)

    Chouder, A.; Silvestre, S.

    2010-01-01

    In this work, we present a new automatic supervision and fault detection procedure for PV systems, based on power losses analysis. This automatic supervision system has been developed in the Matlab and Simulink environment. It includes parameter extraction techniques to calculate the main PV system parameters from monitoring data in real working conditions, taking into account the environmental irradiance and module temperature evolution, allowing simulation of the PV system behaviour in real time. The automatic supervision method analyses the output power losses present on the DC side of the PV generator, the capture losses. Two new power losses indicators are defined: thermal capture losses (L_ct) and miscellaneous capture losses (L_cm). The processing of these indicators allows the supervision system to generate a faulty signal as an indicator of fault detection in the PV system operation. Two new indicators of the deviation of the DC variables with respect to the simulated ones have also been defined. These indicators are the current and voltage ratios: R_C and R_V. Analysing both the faulty signal and the current/voltage ratios, the type of fault can be identified. The automatic supervision system has been successfully tested experimentally.
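
    A minimal sketch of the fault-flagging idea follows: measured DC current and voltage are compared with simulated values through the ratios R_C and R_V, and a fault signal is raised when they deviate. The threshold and the toy data are illustrative assumptions; the paper's indicator definitions and decision logic are more detailed.

```python
# Toy supervision check based on current/voltage ratios between measured and simulated DC values.
import numpy as np

def supervise(i_meas, v_meas, i_sim, v_sim, tol=0.15):
    """Return per-sample current/voltage ratios and a boolean fault signal."""
    r_c = i_meas / i_sim          # current ratio R_C
    r_v = v_meas / v_sim          # voltage ratio R_V
    faulty = (np.abs(1 - r_c) > tol) | (np.abs(1 - r_v) > tol)
    return r_c, r_v, faulty

# toy monitoring data (daylight hours): a current drop after noon mimics a string fault
i_sim = np.array([0.0, 2.1, 4.0, 5.5, 6.2, 6.0, 5.1, 3.5, 1.2])
v_sim = np.full_like(i_sim, 310.0)
i_meas = i_sim * np.array([1, 1, 1, 1, 0.7, 0.7, 0.7, 0.7, 0.7])
v_meas = v_sim * 0.99

# skip the first sample (zero simulated current) to avoid division by zero
r_c, r_v, faulty = supervise(i_meas[1:], v_meas[1:], i_sim[1:], v_sim[1:])
print("fault detected at samples:", np.where(faulty)[0])
```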

  14. 14 CFR 29.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 29.1329 Section 29... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  15. 14 CFR 27.1329 - Automatic pilot system.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic pilot system. 27.1329 Section 27... pilot system. (a) Each automatic pilot system must be designed so that the automatic pilot can— (1) Be sufficiently overpowered by one pilot to allow control of the rotorcraft; and (2) Be readily and positively...

  16. Automatic differentiation algorithms in model analysis

    NARCIS (Netherlands)

    Huiskes, M.J.

    2002-01-01

    Title: Automatic differentiation algorithms in model analysis
    Author: M.J. Huiskes
    Date: 19 March, 2002

    In this thesis automatic differentiation algorithms and derivative-based methods

  17. Automatic measurement of target crossing speed

    Science.gov (United States)

    Wardell, Mark; Lougheed, James H.

    1992-11-01

    The motion of ground vehicle targets after a ballistic round is launched can be a major source of inaccuracy for small (handheld) anti-armour weapon systems. A method of automatically measuring the crossing component to compensate the fire control solution has been devised and tested against various targets in a range of environments. A photodetector array aligned with the sight's horizontal reticle obtains scene features, which are digitized and processed to separate target from sight motion. Relative motion of the target against the background is briefly monitored to deduce angular crossing rate and a compensating lead angle is introduced into the aim point. Research to gather quantitative data and optimize algorithm performance is described, and some results from field testing are presented.

  18. Automatic Shadow Detection and Removal from a Single Image.

    Science.gov (United States)

    Khan, Salman H; Bennamoun, Mohammed; Sohel, Ferdous; Togneri, Roberto

    2016-03-01

    We present a framework to automatically detect and remove shadows in real world scenes from a single image. Previous works on shadow detection put a lot of effort in designing shadow variant and invariant hand-crafted features. In contrast, our framework automatically learns the most relevant features in a supervised manner using multiple convolutional deep neural networks (ConvNets). The features are learned at the super-pixel level and along the dominant boundaries in the image. The predicted posteriors based on the learned features are fed to a conditional random field model to generate smooth shadow masks. Using the detected shadow masks, we propose a Bayesian formulation to accurately extract shadow matte and subsequently remove shadows. The Bayesian formulation is based on a novel model which accurately models the shadow generation process in the umbra and penumbra regions. The model parameters are efficiently estimated using an iterative optimization procedure. Our proposed framework consistently performed better than the state-of-the-art on all major shadow databases collected under a variety of conditions.

  19. Expert system for the automatic analysis of the Eddy current signals from the monitoring of vapor generators of a PWR, type reactor

    International Nuclear Information System (INIS)

    Lefevre, F.; Baumaire, A.; Comby, R.; Benas, J.C.

    1990-01-01

    The automation of the monitoring of steam generator tubes required some developments in the field of data processing. The monitoring is performed by means of Eddy current tests. Improvements in signal processing and in pattern recognition associated with artificial intelligence techniques led EDF (the French electricity company) to develop an automatic signal processing system. The system, named EXTRACSION (French acronym for Expert System for the Processing and Classification of Signals of Nuclear Nature), ensures coherence between the different fields of knowledge (metallurgy, measurement, signals) during data processing by applying an object-oriented representation [fr]

  20. Automatic intelligent cruise control

    OpenAIRE

    Stanton, NA; Young, MS

    2006-01-01

    This paper reports a study on the evaluation of automatic intelligent cruise control (AICC) from a psychological perspective. It was anticipated that AICC would have an effect upon the psychology of driving, namely make the driver feel they have less control, reduce the level of trust in the vehicle, and make drivers less situationally aware, but it might also reduce the workload and make driving less stressful. Drivers were asked to drive in a driving simulator under manual and automatic inte...

  1. MATHEMATICAL AND COMPUTER MODELING OF AUTOMATIC CONTROL SYSTEM FOR HYDROSTATIC BEARING

    Directory of Open Access Journals (Sweden)

    N. A. Pelevin

    2016-09-01

    Full Text Available The paper presents simulation results of hydrostatic bearing dynamics in the spindle assembly of a standard flexible production module with a throttled circuit. The necessity of increasing the dynamic quality of the automatic control system of the hydrostatic bearing by means of correcting elements in the form of RC-chains is shown. The features of the correction parameter choice arising from the cross couplings in the automatic control system structure are noted. We propose a block diagram of the automatic control system of the hydrostatic bearing in the Simulink working field and a cyclic algorithm, implemented in MATLAB, for determining the RC-chain parameters, taking into account the thermal processes typical of finishing treatment. A graphic-analytical method for the choice of correction parameters is presented, based on the gradient of the phase stability margin, for determining the dynamic quality of the automatic control system. The applicability of the method when a standard metal bellows valve is used as the hydraulic capacity of the RC-chain is also investigated, and recommendations for the bellows valve choice are formulated. The dynamic quality indicators of the transient processes are checked with the corresponding programs developed in MATLAB. Examples are given of phase stability margin gradient plots partitioned into regions of different hydrostatic bearing dynamic quality for various spindle rotation frequencies, together with a description of how to apply the data cursor function on the MATLAB toolbar. An improvement of hydrostatic bearing dynamics under the low loads typical of finishing treatment is noted, as well as a decrease of the dynamic indicators under the high loads of roughing treatment.

  2. Dynamic Artificial Potential Fields for Autonomous Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Jhala, Arnav Harish

    2009-01-01

    the implementation and evaluation of Artificial Potential Fields for automatic camera placement. We first describe the recasting of the frame composition problem as a solution for two particles suspended in an Artificial Potential Field. We demonstrate the application of this technique to control both camera...
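
    For illustration only (this is not the authors' implementation), the sketch below moves a 2-D camera position by gradient descent on a potential field that attracts it towards a desired viewpoint and repels it from an obstacle; the goal, obstacle and gain values are arbitrary assumptions.

```python
# Toy artificial potential field: attractive well at the goal, repulsive bump at an obstacle.
import numpy as np

def potential(p, goal, obstacle, k_att=1.0, k_rep=2.0, rep_radius=1.5):
    d_goal = np.linalg.norm(p - goal)
    d_obs = np.linalg.norm(p - obstacle)
    u = 0.5 * k_att * d_goal ** 2                       # attractive term
    if d_obs < rep_radius:                              # repulsive term active near the obstacle
        u += 0.5 * k_rep * (1.0 / d_obs - 1.0 / rep_radius) ** 2
    return u

def numeric_grad(p, *args, eps=1e-5):
    g = np.zeros_like(p)
    for i in range(len(p)):
        step = np.zeros_like(p)
        step[i] = eps
        g[i] = (potential(p + step, *args) - potential(p - step, *args)) / (2 * eps)
    return g

camera = np.array([5.0, 4.0])
goal, obstacle = np.array([0.0, 0.0]), np.array([2.5, 1.5])
for _ in range(200):                                    # gradient descent on the field
    camera -= 0.05 * numeric_grad(camera, goal, obstacle)
print("final camera position:", np.round(camera, 2))   # ends close to the goal
```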

  3. Application of image recognition-based automatic hyphae detection in fungal keratitis.

    Science.gov (United States)

    Wu, Xuelian; Tao, Yuan; Qiu, Qingchen; Wu, Xinyi

    2018-03-01

    The purpose of this study is to evaluate the accuracy of two methods in the diagnosis of fungal keratitis: automatic hyphae detection based on image recognition and corneal smear examination. We evaluate the sensitivity and specificity of automatic hyphae detection based on image recognition in the diagnosis of fungal keratitis, analyze the consistency between clinical symptoms and the density of hyphae, and quantify the density using the automatic detection method. In our study, 56 cases with fungal keratitis (single eye) and 23 cases with bacterial keratitis were included. All cases underwent the routine inspection of slit lamp biomicroscopy, corneal smear examination, microorganism culture and the assessment of in vivo confocal microscopy images before starting medical treatment. We then analyzed the in vivo confocal microscopy images with automatic hyphae detection based on image recognition to evaluate its sensitivity and specificity and compare it with corneal smear examination. The next step was to use the density index to assess the severity of infection, find the correlation with the patients' clinical symptoms and evaluate the consistency between them. The accuracy of this technology was superior to that of corneal smear examination. The sensitivity of the automatic hyphae detection based on image recognition was 89.29%, and the specificity was 95.65%. The area under the ROC curve was 0.946. The correlation coefficient between the grading of severity in fungal keratitis by automatic hyphae detection based on image recognition and the clinical grading is 0.87. The technology of automatic hyphae detection based on image recognition showed high sensitivity and specificity and was able to identify fungal keratitis better than corneal smear examination. This technology has advantages when compared with the conventional artificial identification of confocal

  4. Review of Stat-Spotting: A Field Guide to Identifying Dubious Data by Joel Best

    OpenAIRE

    Joe Swingle

    2009-01-01

    Best, Joel. Stat-Spotting: A Field Guide to Identifying Dubious Data. (Berkeley: University of California Press, 2008) 144 pp. $19.95. ISBN 1-978-0-520-25746-7.Stat-Spotting is a practical, do-it-yourself manual for detecting questionable claims reported in the media. Using examples drawn mostly from mass media sources, Stat-Spotting provides readers with a number of useful tips for identifying potentially problematic statistics. The author’s skillful analyses and explanations presented in cl...

  5. New Method to Identify Field Joint Coating Failures Based on MFL In-Line Inspection Signals

    Directory of Open Access Journals (Sweden)

    Lianshuang Dai

    2018-02-01

    Full Text Available Above-ground indirect detection and random excavations, as applied in past years to buried long-distance oil and gas pipelines, can identify only some damaged coating locations. Hence, a large number of field joint coating (FJC) failures go unnoticed until they lead to failures of the pipelines. Based on the analysis of magnetic flux leakage (MFL) in-line inspection (ILI) signals, combined with the statistical results of 414 excavations from two different pipeline sections, a new method to identify failed FJCs is established. Though it can only identify FJC failures when there are signs of corrosion on the pipe body, it is much more efficient and cost-saving. The resulting identification rule still needs more validation and improvement to become more widely applicable and accurate.

  6. Automatic radioxenon analyzer for CTBT monitoring

    International Nuclear Information System (INIS)

    Bowyer, T.W.; Abel, K.H.; Hensley, W.K.

    1996-12-01

    Over the past 3 years, with support from US DOE's NN-20 Comprehensive Test Ban Treaty (CTBT) R&D program, PNNL has developed and demonstrated a fully automatic analyzer for collecting and measuring the four Xe radionuclides, 131mXe (11.9 d), 133mXe (2.19 d), 133Xe (5.24 d), and 135Xe (9.10 h), in the atmosphere. These radionuclides are important signatures in monitoring for compliance with a CTBT. Activity ratios permit discriminating radioxenon from a nuclear detonation from that from nuclear reactor operations, nuclear fuel reprocessing, or medical isotope production and usage. In the analyzer, Xe is continuously and automatically separated from the atmosphere at flow rates of about 7 m³/h on a sorption bed. Aliquots collected for 6-12 h are automatically analyzed by electron-photon coincidence spectrometry to produce sensitivities in the range of 20-100 μBq/m³ of air, about 100-fold better than with reported laboratory-based procedures for short collection intervals. Spectral data are automatically analyzed and the calculated radioxenon concentrations and raw gamma-ray spectra automatically transmitted to data centers

  7. Automatic analysis of trabecular bone structure from knee MRI

    DEFF Research Database (Denmark)

    Marques, Joselene; Granlund, Rabia; Lillholm, Martin

    2012-01-01

    We investigated the feasibility of quantifying osteoarthritis (OA) by analysis of the trabecular bone structure in low-field knee MRI. Generic texture features were extracted from the images and subsequently selected by sequential floating forward selection (SFFS), following a fully automatic......, uncommitted machine-learning based framework. Six different classifiers were evaluated in cross-validation schemes and the results showed that the presence of OA can be quantified by a bone structure marker. The performance of the developed marker reached a generalization area-under-the-ROC (AUC) of 0...

  8. Automatic Detect and Trace of Solar Filaments

    Science.gov (United States)

    Fang, Cheng; Chen, P. F.; Tang, Yu-hua; Hao, Qi; Guo, Yang

    We developed a series of methods to automatically detect and trace solar filaments in solar Hα images. The programs are able not only to recognize filaments and determine their properties, such as position, area and other relevant parameters, but also to trace the daily evolution of the filaments. For solar full-disk Hα images, the method consists of three parts: first, preprocessing is applied to correct the original images; second, the Canny edge-detection method is used to detect the filaments; third, filament properties are recognized through morphological operators. For each Hα filament and its barb features, we introduced the unweighted undirected graph concept and adopted the Dijkstra shortest-path algorithm to recognize the filament spine; then the polarity-inversion-line shift method, measuring the polarities on both sides of the filament, is used to determine the filament axis chirality; finally, the connected-components labeling method is employed to identify the barbs, and the angle between each barb and the spine is calculated to indicate the barb chirality. Our algorithms are applied to observations from various observatories, including the Optical & Near Infrared Solar Eruption Tracer (ONSET) at Nanjing University, Mauna Loa Solar Observatory (MLSO) and Big Bear Solar Observatory (BBSO). The programs are demonstrated to be effective and efficient. We used our method to automatically process and analyze 3470 images obtained by MLSO from January 1998 to December 2009, and a butterfly diagram of filaments was obtained. It shows that the latitudinal migration of solar filaments has three trends in Solar Cycle 23: the drift velocity was fast from 1998 to the solar maximum; after the solar maximum, it became relatively slow; and after 2006, the migration became divergent, signifying the solar minimum. About 60% of filaments with latitudes larger than 50 degrees migrate towards the Polar Regions with relatively high velocities, and the latitudinal migrating
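
    A hedged sketch of just the detection stage (edge detection plus morphological filtering) follows; the preprocessing, spine extraction, chirality determination and tracking steps of the paper are not reproduced, and a synthetic full-disk image stands in for a real Hα observation.

```python
# Canny edges + morphological filtering to pick out an elongated dark feature on a synthetic disk.
import cv2
import numpy as np

h, w = 512, 512
yy, xx = np.mgrid[0:h, 0:w]
img = np.where((xx - 256) ** 2 + (yy - 256) ** 2 < 230 ** 2, 180, 20).astype(np.uint8)
cv2.line(img, (150, 200), (350, 260), color=60, thickness=4)   # a dark "filament"

edges = cv2.Canny(img, 50, 150)                                # edge detection
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)      # join the filament outline

n, labels, stats, _ = cv2.connectedComponentsWithStats(closed)
for i in range(1, n):                                          # label 0 is the background
    x, y, bw, bh, area = stats[i]
    # keep mid-sized components that do not span the whole disk (discards the limb contour)
    if area > 100 and max(bw, bh) < 0.7 * w:
        print(f"filament candidate {i}: bbox=({x}, {y}, {bw}, {bh}), area={area}")
```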

  9. Improvement of remote control system of automatic ultrasonic equipment for inspection of reactor pressure vessel

    Energy Technology Data Exchange (ETDEWEB)

    Cheong, Yong Moo; Jung, H. K.; Joo, Y. S.; Koo, K. M.; Hyung, H.; Sim, C. M.; Gong, U. S.; Kim, S. H.; Lee, J. P.; Rhoo, H. C.; Kim, M. S.; Ryoo, S. K.; Choi, C. H.; Oh, K. I

    1999-12-01

    One of the important issues related to nuclear safety is the in-service inspection of the reactor pressure vessel (RPV). A remote-controlled automatic ultrasonic method is applied to the inspection. At present, operation of the automatic ultrasonic inspection system owned by KAERI is interrupted due to the degradation of parts. In order to resume field inspection, a new remote control system was designed and installed on the existing equipment. New ultrasonic sensors and their modules for RPV inspection were designed and fabricated in accordance with the new requirements of the inspection codes. The ultrasonic sensors were verified for use in RPV inspection. (author)

  10. Using Probe Vehicle Data for Automatic Extraction of Road Traffic Parameters

    Directory of Open Access Journals (Sweden)

    Roman Popescu Maria Alexandra

    2016-12-01

    Full Text Available Through this paper the author aims to study and find solutions for the automatic detection of traffic light positions and for the automatic calculation of the waiting time at traffic lights. The first objective mainly serves the road transportation field, chiefly because it removes the need for collaboration with local authorities to establish a national network of traffic lights. The second objective is important not only for companies providing navigation solutions, but especially for authorities, institutions and companies operating road traffic management systems. Real-time dynamic determination of traffic queue length and of waiting time at traffic lights allows the creation of dynamic, intelligent and flexible systems, adapted to actual traffic conditions rather than to generic, theoretical models. Thus, cities can approach the Smart City concept by making road transport more efficient and environmentally friendly, as promoted in Europe through the Horizon 2020 Smart Cities and Urban Mobility initiative.

  11. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; Van Der Leu, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  12. Unsupervised segmentation of lung fields in chest radiographs using multiresolution fractal feature vector and deformable models.

    Science.gov (United States)

    Lee, Wen-Li; Chang, Koyin; Hsieh, Kai-Sheng

    2016-09-01

    Segmenting lung fields in a chest radiograph is essential for automatically analyzing an image. We present an unsupervised method based on a multiresolution fractal feature vector. The feature vector characterizes the lung field region effectively. A fuzzy c-means clustering algorithm is then applied to obtain a satisfactory initial contour. The final contour is obtained by deformable models. The results show the feasibility and high performance of the proposed method. Furthermore, based on the segmentation of lung fields, the cardiothoracic ratio (CTR) can be measured. The CTR is a simple index for evaluating cardiac hypertrophy. After identifying a suspicious symptom based on the estimated CTR, a physician can suggest that the patient undergo additional extensive tests before a treatment plan is finalized.
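
    Once the lung fields and heart shadow are available as binary masks, the CTR itself is a simple width ratio; the sketch below assumes such masks already exist and only illustrates that final measurement step.

```python
# Toy CTR computation from binary masks (heart width divided by thoracic width).
import numpy as np

def width_of(mask):
    """Widest horizontal extent (in pixels) of a binary mask."""
    cols = np.where(mask.any(axis=0))[0]
    return 0 if cols.size == 0 else cols[-1] - cols[0] + 1

def cardiothoracic_ratio(heart_mask, thorax_mask):
    return width_of(heart_mask) / width_of(thorax_mask)

# toy example: 10x20 grids with a 6-pixel-wide "heart" inside a 16-pixel-wide "thorax"
thorax = np.zeros((10, 20), dtype=bool)
thorax[:, 2:18] = True
heart = np.zeros((10, 20), dtype=bool)
heart[4:8, 7:13] = True
print("CTR =", cardiothoracic_ratio(heart, thorax))   # 6 / 16 = 0.375
```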

  13. Automatic single questionnaire intensity (SQI, EMS98 scale) estimation using ranking models built on the existing BCSF database

    Science.gov (United States)

    Schlupp, A.; Sira, C.; Schmitt, K.; Schaming, M.

    2013-12-01

    the fact that each definitive BCSF SQI is determined by an expert analysis. We compare the SQIs obtained by these methods on our database and discuss the coherency and variations between the automatic and manual processes. These methods lead to high scores, with up to 85% of the forms correctly classified and most of the remaining forms classified with a shift of only one intensity degree. This allows us to use the ranking methods as the best automatic methods for fast SQI estimation and for producing fast shakemaps. The next step, to improve the use of these methods, will be to identify explanations for the forms not classified at the correct value and a way to select the few remaining forms that should be analyzed by an expert. Note that beyond intensity VI, on-line questionnaires are insufficient and a field survey is indispensable to estimate intensity. For such surveys, in France, BCSF leads a macroseismic intervention group (GIM).

  14. SU-E-J-15: Automatically Detect Patient Treatment Position and Orientation in KV Portal Images

    International Nuclear Information System (INIS)

    Qiu, J; Yang, D

    2015-01-01

    Purpose: In the course of radiation therapy, the complex information processing workflow can result in errors such as incorrect or inaccurate patient setups. With automatic image checks and patient identification, such errors could be effectively reduced. For this purpose, we developed a simple and rapid image processing method to automatically detect the patient position and orientation in 2D portal images, so as to allow automatic checks of positions and orientations for patients' daily RT treatments. Methods: Based on the principle of portal image formation, a set of whole-body DRR images was reconstructed from multiple whole-body CT volume datasets and fused together to be used as the matching template. To identify the patient setup position and orientation shown in a 2D portal image, the portal image was preprocessed (contrast enhancement, down-sampling and couch table detection), then matched to the template image so as to identify the laterality (left or right), position, orientation and treatment site. Results: Five days' worth of clinically qualified portal images were gathered randomly and processed by the automatic detection and matching method without any additional information. The detection results were visually checked by physicists. 182 out of 200 kV portal images were correctly detected, a correct rate of 91%. Conclusion: The proposed method can detect patient setup and orientation quickly and automatically. It only requires the image intensity information in kV portal images. This method can be useful in the framework of Electronic Chart Check (ECCK) to reduce potential errors in the radiation therapy workflow and so improve patient safety. In addition, the auto-detection results, such as the patient treatment site position and patient orientation, could be useful to guide subsequent image processing procedures, e.g. verification of patient daily setup accuracy. This work was partially supported by research grant from

  15. SU-E-J-15: Automatically Detect Patient Treatment Position and Orientation in KV Portal Images

    Energy Technology Data Exchange (ETDEWEB)

    Qiu, J [Washington University in St Louis, Taian, Shandong (China); Yang, D [Washington University School of Medicine, St Louis, MO (United States)

    2015-06-15

    Purpose: In the course of radiation therapy, the complex information processing workflow can result in errors such as incorrect or inaccurate patient setups. With automatic image checks and patient identification, such errors could be effectively reduced. For this purpose, we developed a simple and rapid image processing method to automatically detect the patient position and orientation in 2D portal images, so as to allow automatic checks of positions and orientations for patients' daily RT treatments. Methods: Based on the principle of portal image formation, a set of whole-body DRR images was reconstructed from multiple whole-body CT volume datasets and fused together to be used as the matching template. To identify the patient setup position and orientation shown in a 2D portal image, the portal image was preprocessed (contrast enhancement, down-sampling and couch table detection), then matched to the template image so as to identify the laterality (left or right), position, orientation and treatment site. Results: Five days' worth of clinically qualified portal images were gathered randomly and processed by the automatic detection and matching method without any additional information. The detection results were visually checked by physicists. 182 out of 200 kV portal images were correctly detected, a correct rate of 91%. Conclusion: The proposed method can detect patient setup and orientation quickly and automatically. It only requires the image intensity information in kV portal images. This method can be useful in the framework of Electronic Chart Check (ECCK) to reduce potential errors in the radiation therapy workflow and so improve patient safety. In addition, the auto-detection results, such as the patient treatment site position and patient orientation, could be useful to guide subsequent image processing procedures, e.g. verification of patient daily setup accuracy. This work was partially supported by research grant from
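
    As a hedged illustration of the matching idea only, the sketch below compares a portal-like image against a whole-body template in its four flip states and keeps the best normalized cross-correlation score. Synthetic arrays stand in for the real DRR template and kV portal image, and mapping a flip state to a clinical orientation would rely on the template's known orientation.

```python
# Toy orientation check via normalized cross-correlation against flipped templates.
import cv2
import numpy as np

rng = np.random.default_rng(0)
template = np.zeros((120, 60), dtype=np.uint8)
cv2.rectangle(template, (20, 10), (40, 30), 200, -1)    # "head"
cv2.rectangle(template, (25, 30), (35, 110), 120, -1)   # "trunk/legs"
cv2.circle(template, (45, 20), 5, 255, -1)              # asymmetric marker

portal = cv2.flip(template, 0)                          # simulate an up-down flipped setup
portal = cv2.add(portal, rng.integers(0, 20, portal.shape, dtype=np.uint8))  # add noise

candidates = {
    "as acquired": template,
    "left-right mirrored": cv2.flip(template, 1),
    "up-down flipped": cv2.flip(template, 0),
    "rotated 180 deg": cv2.flip(template, -1),
}
scores = {name: float(cv2.matchTemplate(portal, tmpl, cv2.TM_CCOEFF_NORMED).max())
          for name, tmpl in candidates.items()}
best = max(scores, key=scores.get)
print("best match:", best, "score:", round(scores[best], 3))   # expected: up-down flipped
```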

  16. Associative priming in a masked perceptual identification task: evidence for automatic processes.

    Science.gov (United States)

    Pecher, Diane; Zeelenberg, René; Raaijmakers, Jeroen G W

    2002-10-01

    Two experiments investigated the influence of automatic and strategic processes on associative priming effects in a perceptual identification task in which prime-target pairs are briefly presented and masked. In this paradigm, priming is defined as a higher percentage of correctly identified targets for related pairs than for unrelated pairs. In Experiment 1, priming was obtained for mediated word pairs. This mediated priming effect was affected neither by the presence of direct associations nor by the presentation time of the primes, indicating that automatic priming effects play a role in perceptual identification. Experiment 2 showed that the priming effect was not affected by the proportion (.90 vs. .10) of related pairs if primes were presented briefly to prevent their identification. However, a large proportion effect was found when primes were presented for 1000 ms so that they were clearly visible. These results indicate that priming in a masked perceptual identification task is the result of automatic processes and is not affected by strategies. The present paradigm provides a valuable alternative to more commonly used tasks such as lexical decision.

  17. Deliberation versus automaticity in decision making: Which presentation format features facilitate automatic decision making?

    Directory of Open Access Journals (Sweden)

    Anke Soellner

    2013-05-01

    Full Text Available The idea of automatic decision making approximating normatively optimal decisions without necessitating much cognitive effort is intriguing. Whereas recent findings support the notion that such fast, automatic processes explain empirical data well, little is known about the conditions under which such processes are selected rather than more deliberate stepwise strategies. We investigate the role of the format of information presentation, focusing explicitly on the ease of information acquisition and its influence on information integration processes. In a probabilistic inference task, the standard matrix employed in prior research was contrasted with a newly created map presentation format and additional variations of both presentation formats. Across three experiments, a robust presentation format effect emerged: Automatic decision making was more prevalent in the matrix (with high information accessibility, whereas sequential decision strategies prevailed when the presentation format demanded more information acquisition effort. Further scrutiny of the effect showed that it is not driven by the presentation format as such, but rather by the extent of information search induced by a format. Thus, if information is accessible with minimal need for information search, information integration is likely to proceed in a perception-like, holistic manner. In turn, a moderate demand for information search decreases the likelihood of behavior consistent with the assumptions of automatic decision making.

  18. Computer vision for automatic inspection of agricultural produce

    Science.gov (United States)

    Molto, Enrique; Blasco, Jose; Benlloch, Jose V.

    1999-01-01

    Fruit and vegetables undergo various manipulations from the field to the final consumer, basically oriented towards cleaning and sorting the product into homogeneous categories. For this reason, several research projects aimed at fast, adequate produce sorting and quality control are currently under development around the world. Moreover, it is possible to find manual and semi-automatic commercial systems capable of reasonably performing these tasks. However, in many cases, their accuracy is incompatible with current European market demands, which are constantly increasing. IVIA, the Valencian Research Institute of Agriculture, located in Spain, has been involved in several European projects related to machine vision for real-time inspection of various agricultural produce. This paper focuses on work related to two products that have different requirements: fruit and olives. In the case of fruit, the Institute has developed a vision system capable of providing an assessment of the external quality of a single fruit to a robot that also receives information from other sensors. The system uses four different views of each fruit and has been tested on peaches, apples and citrus. Processing time of each image is under 500 ms using a conventional PC. The system provides information about primary and secondary color, blemishes and their extension, and stem presence and position, which allows further automatic orientation of the fruit in the final box using a robotic manipulator. Work carried out on olives was devoted to fast sorting of table olives. A prototype has been developed to demonstrate the feasibility of a machine vision system capable of automatically sorting 2500 kg/h of olives using low-cost conventional hardware.

  19. Forensic Automatic Speaker Recognition Based on Likelihood Ratio Using Acoustic-phonetic Features Measured Automatically

    Directory of Open Access Journals (Sweden)

    Huapeng Wang

    2015-01-01

    Full Text Available Forensic speaker recognition is experiencing a remarkable paradigm shift in terms of the evaluation framework and presentation of voice evidence. This paper proposes a new method of forensic automatic speaker recognition using the likelihood ratio framework to quantify the strength of voice evidence. The proposed method uses a reference database to calculate the within- and between-speaker variability. Some acoustic-phonetic features are extracted automatically using the software VoiceSauce. The effectiveness of the approach was tested using two Mandarin databases: A mobile telephone database and a landline database. The experiment's results indicate that these acoustic-phonetic features do have some discriminating potential and are worth trying in discrimination. The automatic acoustic-phonetic features have acceptable discriminative performance and can provide more reliable results in evidence analysis when fused with other kind of voice features.
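
    The sketch below illustrates the core likelihood-ratio idea with generic feature vectors and Gaussian mixture models: the numerator models the suspect's speech and the denominator a background population. The data, model sizes and scoring are illustrative assumptions, not the paper's calibrated procedure.

```python
# Toy log-likelihood-ratio computation for a questioned speech sample.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
suspect_features = rng.normal(0.0, 1.0, size=(200, 4))        # features from the suspect
background_features = rng.normal(0.5, 1.3, size=(2000, 4))    # reference population

suspect_model = GaussianMixture(n_components=4, random_state=0).fit(suspect_features)
background_model = GaussianMixture(n_components=8, random_state=0).fit(background_features)

questioned = rng.normal(0.05, 1.0, size=(150, 4))             # features from the trace
# log LR = log p(evidence | same speaker) - log p(evidence | different speaker)
log_lr = suspect_model.score(questioned) - background_model.score(questioned)
print("average log-likelihood ratio per frame:", round(float(log_lr), 3))
```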

  20. 46 CFR 63.25-1 - Small automatic auxiliary boilers.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Small automatic auxiliary boilers. 63.25-1 Section 63.25... AUXILIARY BOILERS Requirements for Specific Types of Automatic Auxiliary Boilers § 63.25-1 Small automatic auxiliary boilers. Small automatic auxiliary boilers defined as having heat-input ratings of 400,000 Btu/hr...

  1. A new method for automatic discontinuity traces sampling on rock mass 3D model

    Science.gov (United States)

    Umili, G.; Ferrero, A.; Einstein, H. H.

    2013-02-01

    A new automatic method for discontinuity traces mapping and sampling on a rock mass digital model is described in this work. The implemented procedure allows one to automatically identify discontinuity traces on a Digital Surface Model: traces are detected directly as surface breaklines, by means of maximum and minimum principal curvature values of the vertices that constitute the model surface. Color influence and user errors, that usually characterize the trace mapping on images, are eliminated. Also trace sampling procedures based on circular windows and circular scanlines have been implemented: they are used to infer trace data and to calculate values of mean trace length, expected discontinuity diameter and intensity of rock discontinuities. The method is tested on a case study: results obtained applying the automatic procedure on the DSM of a rock face are compared to those obtained performing a manual sampling on the orthophotograph of the same rock face.
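
    As a toy illustration of the selection step, the sketch below flags vertices whose maximum or minimum principal curvature exceeds a threshold as candidate trace (breakline) points; the curvature arrays and thresholds are assumed inputs from the surface-modelling stage.

```python
# Flag convex (ridge-like) and concave (valley-like) vertices from precomputed curvatures.
import numpy as np

def trace_vertices(k_max, k_min, convex_thresh, concave_thresh):
    """Return indices of vertices lying on convex or concave breaklines."""
    convex = k_max > convex_thresh
    concave = k_min < -concave_thresh
    return np.where(convex | concave)[0]

# toy data: 1000 vertices, a few with strong curvature
k_max = np.random.default_rng(0).normal(0.0, 0.2, 1000)
k_max[::97] += 3.0
k_min = -np.abs(np.random.default_rng(1).normal(0.0, 0.2, 1000))
k_min[::71] -= 3.0
print("candidate trace vertices:", trace_vertices(k_max, k_min, 1.0, 1.0)[:10], "...")
```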

  2. Automatic identification of single- and/or few-layer thin-film material

    DEFF Research Database (Denmark)

    2014-01-01

    One or more digital representations of single- (101) and/or few-layer (102) thin-film material are automatically identified robustly and reliably in a digital image (100), the digital image (100) having a predetermined number of colour components, by - determining (304) a background colour component of the digital image (100) for each colour component, and - determining or estimating (306) a colour component of thin-film material to be identified in the digital image (100) for each colour component by obtaining a pre-determined contrast value (C_R; C_G; C_B) for each colour component...

  3. Scanner OPC signatures: automatic vendor-to-vendor OPE matching

    Science.gov (United States)

    Renwick, Stephen P.

    2009-03-01

    As 193nm lithography continues to be stretched and the k1 factor decreases, optical proximity correction (OPC) has become a vital part of the lithographer's tool kit. Unfortunately, as is now well known, the design variations of lithographic scanners from different vendors cause them to have slightly different optical-proximity effect (OPE) behavior, meaning that they print features through pitch in distinct ways. This in turn means that their response to OPC is not the same, and that an OPC solution designed for a scanner from Company 1 may or may not work properly on a scanner from Company 2. Since OPC is not inexpensive, that causes trouble for chipmakers using more than one brand of scanner. Clearly a scanner-matching procedure is needed to meet this challenge. Previously, automatic matching has only been reported for scanners of different tool generations from the same manufacturer. In contrast, scanners from different companies have been matched using expert tuning and adjustment techniques, frequently requiring laborious test exposures. Automatic matching between scanners from Company 1 and Company 2 has remained an unsettled problem. We have recently solved this problem and introduce a novel method to perform the automatic matching. The success in meeting this challenge required three enabling factors. First, we recognized the strongest drivers of OPE mismatch and are thereby able to reduce the information needed about a tool from another supplier to that information readily available from all modern scanners. Second, we developed a means of reliably identifying the scanners' optical signatures, minimizing dependence on process parameters that can cloud the issue. Third, we carefully employed standard statistical techniques, checking for robustness of the algorithms used and maximizing efficiency. The result is an automatic software system that can predict an OPC matching solution for scanners from different suppliers without requiring expert intervention.

  4. Towards identifying the mechanisms underlying field-aligned edge-loss of HHFW power on NSTX

    International Nuclear Information System (INIS)

    Perkins, R. J.; Bell, R. E.; Bertelli, N.; Diallo, A.; Gerhardt, S.; Hosea, J. C.; Jaworski, M. A.; LeBlanc, B. P.; Kramer, G. J.; Maingi, R.; Phillips, C. K.; Podestà, M.; Roquemore, L.; Scotti, F.; Taylor, G.; Wilson, J. R.; Ahn, J-W.; Gray, T. K.; Green, D. L.; McLean, A.

    2014-01-01

    Fast-wave heating will be a major heating scheme on ITER, as it can heat ions directly and is relatively unaffected by the large machine size unlike neutral beams. However, fast-wave interactions with the plasma edge can lead to deleterious effects such as, in the case of the high-harmonic fast-wave (HHFW) system on NSTX, large losses of fast-wave power in the scrape off layer (SOL) under certain conditions. In such scenarios, a large fraction of the lost HHFW power is deposited on the upper and lower divertors in bright spiral shapes. The responsible mechanism(s) has not yet been identified but may include fast-wave propagation in the scrape off layer, parametric decay instability, and RF currents driven by the antenna reactive fields. Understanding and mitigating these losses is important not only for improving the heating and current-drive on NSTX-Upgrade but also for understanding fast-wave propagation across the SOL in any fast-wave system. This talk summarizes experimental results demonstrating that the flow of lost HHFW power to the divertor regions largely follows the open SOL magnetic field lines. This lost power flux is relatively large close to both the antenna and the last closed flux surface with a reduced level in between, so the loss mechanism cannot be localized to the antenna. At the same time, significant losses also occur along field lines connected to the inboard edge of the bottom antenna plate. The power lost within the spirals is roughly estimated, showing that these field-aligned losses to the divertor are significant but may not account for the total HHFW loss. To elucidate the role of the onset layer for perpendicular fast-wave propagation with regards to fast-wave propagation in the SOL, a cylindrical cold-plasma model is being developed. This model, in addition to advanced RF codes such as TORIC and AORSA, is aimed at identifying the underlying mechanism(s) behind these SOL losses, to minimize their effects in NSTX-U, and to predict

  5. Towards identifying the mechanisms underlying field-aligned edge-loss of HHFW power on NSTX

    Energy Technology Data Exchange (ETDEWEB)

    Perkins, R. J.; Bell, R. E.; Bertelli, N.; Diallo, A.; Gerhardt, S.; Hosea, J. C.; Jaworski, M. A.; LeBlanc, B. P.; Kramer, G. J.; Maingi, R.; Phillips, C. K.; Podestà, M.; Roquemore, L.; Scotti, F.; Taylor, G.; Wilson, J. R. [Princeton Plasma Physics Laboratory, Princeton, NJ (United States); Ahn, J-W.; Gray, T. K.; Green, D. L.; McLean, A. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); and others

    2014-02-12

    Fast-wave heating will be a major heating scheme on ITER, as it can heat ions directly and is relatively unaffected by the large machine size unlike neutral beams. However, fast-wave interactions with the plasma edge can lead to deleterious effects such as, in the case of the high-harmonic fast-wave (HHFW) system on NSTX, large losses of fast-wave power in the scrape off layer (SOL) under certain conditions. In such scenarios, a large fraction of the lost HHFW power is deposited on the upper and lower divertors in bright spiral shapes. The responsible mechanism(s) has not yet been identified but may include fast-wave propagation in the scrape off layer, parametric decay instability, and RF currents driven by the antenna reactive fields. Understanding and mitigating these losses is important not only for improving the heating and current-drive on NSTX-Upgrade but also for understanding fast-wave propagation across the SOL in any fast-wave system. This talk summarizes experimental results demonstrating that the flow of lost HHFW power to the divertor regions largely follows the open SOL magnetic field lines. This lost power flux is relatively large close to both the antenna and the last closed flux surface with a reduced level in between, so the loss mechanism cannot be localized to the antenna. At the same time, significant losses also occur along field lines connected to the inboard edge of the bottom antenna plate. The power lost within the spirals is roughly estimated, showing that these field-aligned losses to the divertor are significant but may not account for the total HHFW loss. To elucidate the role of the onset layer for perpendicular fast-wave propagation with regards to fast-wave propagation in the SOL, a cylindrical cold-plasma model is being developed. This model, in addition to advanced RF codes such as TORIC and AORSA, is aimed at identifying the underlying mechanism(s) behind these SOL losses, to minimize their effects in NSTX-U, and to predict

  6. Automatic selection of resting-state networks with functional magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Silvia Francesca eStorti

    2013-05-01

    Full Text Available Functional magnetic resonance imaging (fMRI) during a resting-state condition can reveal the co-activation of specific brain regions in distributed networks, called resting-state networks, which are selected by independent component analysis (ICA) of the fMRI data. One of the major difficulties with component analysis is the automatic selection of the ICA features related to brain activity. In this study we describe a method designed to automatically select networks of potential functional relevance, specifically, those regions known to be involved in motor function, visual processing, executive functioning, auditory processing, memory, and the default-mode network. To do this, image analysis was based on probabilistic ICA as implemented in FSL software. After decomposition, the optimal number of components was selected by applying a novel algorithm which takes into account, for each component, Pearson's median coefficient of skewness of the spatial maps generated by FSL, followed by clustering, segmentation, and spectral analysis. To evaluate the performance of the approach, we investigated the resting-state networks in 25 subjects. For each subject, three resting-state scans were obtained with a Siemens Allegra 3 T scanner (NYU data set). Comparison of the visually and the automatically identified neuronal networks showed that the algorithm had high accuracy (first scan: 95%, second scan: 95%, third scan: 93%) and precision (90%, 90%, 84%). The reproducibility of the networks for visual and automatic selection was very close: it was highly consistent in each subject for the default-mode network (≥ 92%) and the occipital network, which includes the medial visual cortical areas (≥ 94%), and consistent for the attention network (≥ 80%), the right and/or left lateralized frontoparietal attention networks, and the temporal-motor network (≥ 80%). The automatic selection method may be used to detect neural networks and reduce subjectivity in ICA
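
    One ingredient of the selection algorithm, ranking components by Pearson's median (second) skewness coefficient 3*(mean - median)/std, can be sketched as below; the clustering, segmentation and spectral-analysis steps of the full pipeline are not reproduced, and the threshold is purely illustrative.

```python
# Rank ICA spatial maps by Pearson's median skewness coefficient and keep the skewed ones.
import numpy as np

def pearson_median_skewness(spatial_map):
    flat = np.asarray(spatial_map).ravel()
    return 3.0 * (flat.mean() - np.median(flat)) / flat.std()

# toy example: a "network-like" map with a heavy positive tail vs. a noise-like map
rng = np.random.default_rng(0)
noise_map = rng.normal(size=50_000)
network_map = np.concatenate([rng.normal(size=48_000), rng.normal(6.0, 1.0, size=2_000)])

maps = {"component 1 (noise-like)": noise_map, "component 2 (network-like)": network_map}
scores = {k: round(pearson_median_skewness(v), 3) for k, v in maps.items()}
selected = [k for k, s in scores.items() if s > 0.2]     # threshold chosen for illustration
print(scores, "-> selected:", selected)
```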

  7. 49 CFR 236.825 - System, automatic train control.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false System, automatic train control. 236.825 Section..., INSPECTION, MAINTENANCE, AND REPAIR OF SIGNAL AND TRAIN CONTROL SYSTEMS, DEVICES, AND APPLIANCES Definitions § 236.825 System, automatic train control. A system so arranged that its operation will automatically...

  8. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings ... differential equations, but in this thesis, we describe how to use the methods for enclosing iterates of discrete mappings, and then later use them for discretizing solutions of ordinary differential equations. The theory of automatic differentiation is introduced, and three methods for obtaining derivatives ... are described: the forward, the backward, and the Taylor expansion methods. The three methods have been implemented in the C++ program packages FADBAD/TADIFF. Some examples showing how to use the three methods are presented. A feature of FADBAD/TADIFF not present in other automatic differentiation packages
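
    For readers unfamiliar with self-validating computation, here is a tiny interval-arithmetic sketch (unrelated to FADBAD/TADIFF itself): naive interval evaluation of f(x) = x^2 - 2x over [1, 2] yields a guaranteed but pessimistic enclosure, which is exactly the over-estimation that mean value enclosures are designed to reduce.

```python
# Naive interval arithmetic: the result encloses the true range but overestimates it.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        other = other if isinstance(other, Interval) else Interval(other, other)
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __sub__(self, other):
        other = other if isinstance(other, Interval) else Interval(other, other)
        return Interval(self.lo - other.hi, self.hi - other.lo)
    def __mul__(self, other):
        other = other if isinstance(other, Interval) else Interval(other, other)
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

x = Interval(1.0, 2.0)
print(x * x - x * Interval(2.0, 2.0))   # true range is [-1, 0]; naive evaluation gives [-3, 2]
```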

  9. Automatic slice identification in 3D medical images with a ConvNet regressor

    NARCIS (Netherlands)

    de Vos, Bob D.; Viergever, Max A.; de Jong, Pim A.; Išgum, Ivana

    2016-01-01

    Identification of anatomical regions of interest is a prerequisite in many medical image analysis tasks. We propose a method that automatically identifies a slice of interest (SOI) in 3D images with a convolutional neural network (ConvNet) regressor. In 150 chest CT scans two reference slices were

  10. LEARNING VECTOR QUANTIZATION FOR ADAPTED GAUSSIAN MIXTURE MODELS IN AUTOMATIC SPEAKER IDENTIFICATION

    Directory of Open Access Journals (Sweden)

    IMEN TRABELSI

    2017-05-01

    Full Text Available Speaker Identification (SI) aims at automatically identifying an individual by extracting and processing information from his/her voice. Speaker voice is a robust biometric modality that has a strong impact in several application areas. In this study, a new combination learning scheme is proposed based on the Gaussian mixture model-universal background model (GMM-UBM) and Learning Vector Quantization (LVQ) for automatic text-independent speaker identification. Feature vectors, constituted by the Mel Frequency Cepstral Coefficients (MFCC) extracted from the speech signal, are used to train models on the New England subset of the TIMIT database. The best results obtained were 90% for gender-independent speaker identification, 97% for male speakers and 93% for female speakers on test data using 36 MFCC features.
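
    A minimal MFCC-plus-GMM identification sketch follows, without the UBM adaptation or the LVQ combination proposed in the paper; synthetic noisy tones stand in for real TIMIT recordings.

```python
# Enrol one GMM per "speaker" on MFCC features and identify a test utterance by log-likelihood.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

sr = 16000
rng = np.random.default_rng(0)

def fake_utterance(f0, seconds=3.0):
    """Crude stand-in for a speech recording: a noisy tone at a speaker-specific pitch."""
    t = np.arange(int(sr * seconds)) / sr
    return (np.sin(2 * np.pi * f0 * t) + 0.1 * rng.standard_normal(t.size)).astype(np.float32)

def mfcc_features(y, n_mfcc=36):
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T      # frames x coefficients

models = {name: GaussianMixture(n_components=4, covariance_type="diag", random_state=0)
          .fit(mfcc_features(fake_utterance(f0)))
          for name, f0 in {"speaker A": 120.0, "speaker B": 210.0}.items()}

# identify an unknown utterance as the speaker whose model gives the highest log-likelihood
test = mfcc_features(fake_utterance(205.0))
scores = {name: float(m.score(test)) for name, m in models.items()}
print("identified as:", max(scores, key=scores.get), scores)      # expected: speaker B
```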

  11. 2D automatic body-fitted structured mesh generation using advancing extraction method

    Science.gov (United States)

    Zhang, Yaoxin; Jia, Yafei

    2018-01-01

    This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluids Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like topography with extrusion-like structures (i.e., branches or tributaries) and intrusion-like structures (i.e., peninsula or dikes). With the AEM, the hierarchical levels of sub-domains can be identified, and the block boundary of each sub-domain in convex polygon shape in each level can be extracted in an advancing scheme. In this paper, several examples were used to illustrate the effectiveness and applicability of the proposed algorithm for automatic structured mesh generation, and the implementation of the method.

  12. Development of an Automatic Combination System of Clothing Parts for Blind People: MyEyes

    Directory of Open Access Journals (Sweden)

    Daniel Rocha

    2018-01-01

    Full Text Available Blind people have long been a source of motivation for the development of solutions to improve their quality of life. The aim of this work is to propose a solution for one such problem, namely the selection and combination of clothing by the blind. Thus, this paper describes the project developed, in agreement with the Association of the Blind and Amblyopic of Portugal (ACAPO), for the creation of a Web platform to aid the blind in selecting combinations of clothing. Near Field Communication (NFC) technology is the basis of this project for the identification of garments. The features of the garments are inserted manually, and a combination of features is possible. There is also the possibility to automatically identify the color of the garment. The system has been tested by the ACAPO organization and preliminary feedback is positive, which is a good starting point for the future. This solution helps promote increased autonomy for blind people.

  13. NEUROIMAGING AND PATTERN RECOGNITION TECHNIQUES FOR AUTOMATIC DETECTION OF ALZHEIMER’S DISEASE: A REVIEW

    Directory of Open Access Journals (Sweden)

    Rupali Kamathe

    2017-08-01

    Full Text Available Alzheimer’s disease (AD) is the most common form of dementia, and there are currently no firm treatments that can stop or reverse its progression. A combination of brain imaging and clinical tests for checking the signs of memory impairment is used to identify patients with AD. In recent years, neuroimaging techniques combined with machine learning algorithms have received a lot of attention in this field. There is a need for the development of automated techniques to detect the disease well before the patient suffers irreversible loss. This paper reviews such semi- or fully automatic techniques, with a detailed comparison of the methods implemented, the class labels considered, the databases used and the results obtained in related studies. The review provides a detailed comparison of different neuroimaging techniques and reveals the potential of machine learning algorithms in medical image analysis, particularly in AD, enabling even early detection of the disease at the stage labelled as Mild Cognitive Impairment (MCI).

  14. The Use of Automatic Indexing for Authority Control.

    Science.gov (United States)

    Dillon, Martin; And Others

    1981-01-01

    Uses an experimental system for authority control on a collection of bibliographic records to demonstrate the resemblance between thesaurus-based automatic indexing and automatic authority control. Details of the automatic indexing system are given, results discussed, and the benefits of the resemblance examined. Included are a rules appendix and…

  15. 30 CFR 77.1401 - Automatic controls and brakes.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic controls and brakes. 77.1401 Section... MINES Personnel Hoisting § 77.1401 Automatic controls and brakes. Hoists and elevators shall be equipped with overspeed, overwind, and automatic stop controls and with brakes capable of stopping the elevator...

  16. 30 CFR 57.19006 - Automatic hoist braking devices.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic hoist braking devices. 57.19006 Section 57.19006 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND... Hoisting Hoists § 57.19006 Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  17. 30 CFR 56.19006 - Automatic hoist braking devices.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic hoist braking devices. 56.19006 Section 56.19006 Mineral Resources MINE SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR METAL AND... Hoisting Hoists § 56.19006 Automatic hoist braking devices. Automatic hoists shall be provided with devices...

  18. Process and equipment for automatic measurement of resonant frequencies in seismic detectors

    International Nuclear Information System (INIS)

    Fredriksson, O.A.; Thomas, E.L.

    1977-01-01

    This is a process for the automatic indication of the resonant frequency of one or more detector elements operating inside a geophysical data-gathering system. The detector elements are understood to be geophones, hydrophones, or groups of both types of instruments. The invention concerns the creation of a process and of equipment that work with laboratory precision, although they can also be used in the field. (orig./RW) [de

  19. Development of the automatic control rod operation system for JOYO. Verification of automatic control rod operation guide system

    International Nuclear Information System (INIS)

    Terakado, Tsuguo; Suzuki, Shinya; Kawai, Masashi; Aoki, Hiroshi; Ohkubo, Toshiyuki

    1999-10-01

    The automatic control rod operation system was developed to control the JOYO reactor power automatically in all operation modes (critical approach, cooling system heat-up, power ascent, power descent); development began in 1989. Prior to applying the system, verification tests of the automatic control rod operation guide system were conducted during the 32nd duty cycle of JOYO, from Dec. 1997 to Feb. 1998. The automatic control rod operation guide system consists of the control rod operation guide function and the plant operation guide function. The control rod operation guide function provides information on control rod movement and position, while the plant operation guide function provides guidance for plant operations corresponding to reactor power changes (power ascent or power descent). Control rod insertion and withdrawal are predicted by fuzzy algorithms. (J.P.N.)

  20. Low-cost automatic station for compost temperature monitoring

    Directory of Open Access Journals (Sweden)

    Marcelo D. L. Jordão

    Full Text Available ABSTRACT Temperature monitoring is an important procedure to control the composting process. Due to cost limitations, temperature monitoring is usually manual and limited to a daily sampling resolution. The objective of this study was to develop an automatic station, costing about US$ 150, able to monitor temperature at two different points in a compost pile with a 5-min time resolution. In the calibration test, the sensors showed an estimated uncertainty from ± 1 to ± 1.9 ºC. In the field validation test, the station ran autonomously for seven days and endured high humidity and extreme temperature (> 70 °C).

  1. Discriminative Chemical Patterns: Automatic and Interactive Design.

    Science.gov (United States)

    Bietz, Stefan; Schomburg, Karen T; Hilbig, Matthias; Rarey, Matthias

    2015-08-24

    The classification of molecules with respect to their inhibiting, activating, or toxicological potential constitutes a central aspect in the field of cheminformatics. Often, a discriminative feature is needed to distinguish two different molecule sets. Besides physicochemical properties, substructures and chemical patterns belong to the descriptors most frequently applied for this purpose. As a commonly used example of this descriptor class, SMARTS strings represent a powerful concept for the representation and processing of abstract chemical patterns. While their usage facilitates a convenient way to apply previously derived classification rules on new molecule sets, the manual generation of useful SMARTS patterns remains a complex and time-consuming process. Here, we introduce SMARTSminer, a new algorithm for the automatic derivation of discriminative SMARTS patterns from preclassified molecule sets. Based on a specially adapted subgraph mining algorithm, SMARTSminer identifies structural features that are frequent in only one of the given molecule classes. In comparison to elemental substructures, it also supports the consideration of general and specific SMARTS features. Furthermore, SMARTSminer is integrated into an interactive pattern editor named SMARTSeditor. This allows for an intuitive visualization on the basis of the SMARTSviewer concept as well as interactive adaption and further improvement of the generated patterns. Additionally, a new molecular matching feature provides an immediate feedback on a pattern's matching behavior across the molecule sets. We demonstrate the utility of the SMARTSminer functionality and its integration into the SMARTSeditor software in several different classification scenarios.
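
    To illustrate how a derived discriminative SMARTS pattern would be applied to molecule sets (independently of SMARTSminer itself), the short Python sketch below uses RDKit substructure matching; the SMILES strings and the example pattern are hypothetical.

        from rdkit import Chem

        # Hypothetical discriminative pattern: an aromatic ring bearing a carboxylic acid.
        pattern = Chem.MolFromSmarts("c1ccccc1C(=O)[OH]")

        # Hypothetical molecule sets given as SMILES strings.
        actives = ["OC(=O)c1ccccc1", "OC(=O)c1ccc(Cl)cc1"]
        inactives = ["CCO", "c1ccccc1"]

        def matches(smiles):
            mol = Chem.MolFromSmiles(smiles)
            return mol is not None and mol.HasSubstructMatch(pattern)

        for s in actives + inactives:
            print(s, "->", "match" if matches(s) else "no match")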

  2. Automatic generation of groundwater model hydrostratigraphy from AEM resistivity and boreholes

    DEFF Research Database (Denmark)

    Marker, Pernille Aabye; Foged, N.; Christiansen, A. V.

    2014-01-01

    distribution govern groundwater flow. The coupling between hydrological and geophysical parameters is managed using a translator function with spatially variable parameters followed by a 3D zonation. The translator function translates geophysical resistivities into clay fractions and is calibrated with observed lithological data. Principal components are computed for the translated clay fractions and geophysical resistivities. Zonation is carried out by k-means clustering on the principal components. The hydraulic parameters of the zones are determined in a hydrological model calibration using head and discharge observations. The method was applied to field data collected at a Danish field site. Our results show that a competitive hydrological model can be constructed from the AEM dataset using the automatic procedure outlined above.
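
    A minimal Python sketch of the clustering step described above (principal components of the translated clay fractions and resistivities followed by k-means zonation); the input arrays are random placeholders and the hydrological calibration step is not shown.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        # Placeholder co-located values per grid cell: translated clay fraction
        # and log10 resistivity.
        clay_fraction = rng.uniform(0.0, 1.0, size=1000)
        log_resistivity = rng.uniform(0.5, 2.5, size=1000)
        features = np.column_stack([clay_fraction, log_resistivity])

        # Principal components of the two variables, then k-means zonation.
        components = PCA(n_components=2).fit_transform(features)
        zones = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(components)
        print(np.bincount(zones))  # number of cells assigned to each zone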

  3. Precision about the automatic emotional brain.

    Science.gov (United States)

    Vuilleumier, Patrik

    2015-01-01

    The question of automaticity in emotion processing has been debated under different perspectives in recent years. Satisfying answers to this issue will require a better definition of automaticity in terms of relevant behavioral phenomena, ecological conditions of occurrence, and a more precise mechanistic account of the underlying neural circuits.

  4. Automatic control of nuclear power plants

    International Nuclear Information System (INIS)

    Jover, P.

    1976-01-01

    The fundamental concepts in automatic control are surveyed, and the purpose of the automatic control of pressurized water reactors is given. The response characteristics for the main components are then studied and block diagrams are given for the main control loops (turbine, steam generator, and nuclear reactors) [fr

  5. Interactivity in automatic control: foundations and experiences

    OpenAIRE

    Dormido Bencomo, Sebastián; Guzmán Sánchez, José Luis; Costa Castelló, Ramon; Berenguel, M

    2012-01-01

    The first part of this paper presents the concepts of interactivity and visualization and their essential role in learning the fundamentals and techniques of automatic control. More than 10 years of the authors' experience in the development and design of interactive tools dedicated to the study of automatic control concepts is also presented. The second part of the paper summarizes the main features of the “Automatic Control with Interactive Tools” text that has been recently published by Pea...

  6. Towards unifying inheritance and automatic program specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2002-01-01

    with covariant specialization to control the automatic application of program specialization to class members. Lapis integrates object-oriented concepts, block structure, and techniques from automatic program specialization to provide both a language where object-oriented designs can be efficiently implemented… Inheritance allows a class to be specialized and its attributes refined, but implementation specialization can only take place by overriding with manually implemented methods. Automatic program specialization can generate a specialized, efficient implementation. However, specialization of programs

  7. Identifying Architectural Technical Debt in Android Applications through Compliance Checking

    NARCIS (Netherlands)

    Verdecchia, R.

    Considering the fast pace at which mobile applications need to evolve, Architectural Technical Debt proves to be a crucial yet implicit factor of success. In this research we present an approach to automatically identify Architectural Technical Debt in Android applications. The approach takes

  8. Colour transformations and K-means segmentation for automatic cloud detection

    Directory of Open Access Journals (Sweden)

    Martin Blazek

    2015-08-01

    Full Text Available The main aim of this work is to find simple criteria for automatic recognition of several meteorological phenomena using optical digital sensors (e.g., Wide-Field cameras, automatic DSLR cameras or robotic telescopes). The output of those sensors is commonly represented in RGB channels containing information about both colour and luminosity, even when normalised. Transformation into other colour spaces (e.g., CIE 1931 xyz, CIE L*a*b*, YCbCr) can separate colour from luminosity, which is especially useful in the image processing of automatic cloud boundary recognition. Different colour transformations provide different sectorization of cloudy images. Hence, the analysed meteorological phenomena (cloud types, clear sky) project differently into the colour diagrams of each international colour system. In such diagrams, statistical tools can be applied in search of criteria which could distinguish a clear sky from a covered one and possibly even perform a meteorological classification of cloud types. For the purpose of this work, a database of sky images (both clear and cloudy), with emphasis on a variety of different observation conditions (e.g., time, altitude, solar angle, etc.), was acquired. The effectiveness of several colour transformations for meteorological application is discussed and the representation of different clouds (or clear sky) in those colour systems is analysed. Utilisation of this algorithm would be useful in all-sky surveys, supplementary meteorological observations, solar cell effectiveness predictions or daytime astronomical solar observations.
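
    As a sketch of the colour-space transformation and K-means segmentation step (not the authors' exact pipeline), the Python code below converts an RGB sky image to CIE L*a*b*, keeps only the chromatic channels so that luminosity is separated out, and clusters the pixels into two classes; scikit-image and scikit-learn are assumed, and the file name is hypothetical.

        import numpy as np
        from skimage import io, color
        from sklearn.cluster import KMeans

        # Load a sky image (hypothetical file) and convert RGB -> CIE L*a*b*.
        rgb = io.imread("sky.jpg")
        lab = color.rgb2lab(rgb)

        # Use only the a* and b* channels, discarding the luminosity channel L*.
        ab = lab[:, :, 1:3].reshape(-1, 2)

        # Two clusters as a crude cloud / clear-sky split.
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(ab)
        mask = labels.reshape(lab.shape[:2])
        print("fraction of pixels in cluster 0:", np.mean(mask == 0))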

  9. Automatic NAA. Saturation activities

    International Nuclear Information System (INIS)

    Westphal, G.P.; Grass, F.; Kuhnert, M.

    2008-01-01

    A system for Automatic NAA is based on a list of specific saturation activities determined for one irradiation position at a given neutron flux and a single detector geometry. Originally compiled from measurements of standard reference materials, the list may be extended also by the calculation of saturation activities from k0 and Q0 factors, and f and α values of the irradiation position. A systematic improvement of the SRM approach is currently being performed by pseudo-cyclic activation analysis, to reduce counting errors. From these measurements, the list of saturation activities is recalculated in an automatic procedure. (author)

  10. Automatic history matching of an offshore field in Brazil; Ajuste automatico de historico de um campo offshore no Brasil

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Jose P.M. dos [PETROBRAS S.A., Macae, RJ (Brazil). Exploracao e Producao. Bacia de Campos]. E-mail: zepedro@ep-bc.petrobras.com.br; Schiozer, Denis J. [Universidade Estadual de Campinas, SP (Brazil). Dept. de Engenharia de Petroleo]. E-mail: denis@cepetro.unicamp.br

    2000-07-01

    Efficient reservoir management is strongly influenced by good production prediction, which depends on a good reservoir characterization. Due to the complexity of the dynamics of multiphase flow in porous media and to the several geological uncertainties involved in the process, the validation of this characterization is obtained through a history matching associated with the study of the reservoir in question. History matching is usually a very complex task, and most of the time it can be a frustrating experience due to the high number of variables to be adjusted to reach a final objective, which can be a combination of several matches. Automated history matching techniques have been the object of several studies but with limited acceptance due to the large computational effort required. Nowadays they are becoming more attractive, motivated by recent hardware and software developments. This work shows an example of application of automatic history matching using an offshore field in Brazil, with emphasis on the benefits of using parallel computing and optimization techniques to reduce the total time of the process. It is shown that although the computational effort is higher, the total time of a reservoir study can be significantly reduced with a higher quality of results. (author)

  11. Automatic REM Sleep Detection Associated with Idiopathic REM Sleep Behavior Disorder

    DEFF Research Database (Denmark)

    Kempfner, Jacob; Sørensen, Gertrud Laura; Sørensen, Helge Bjarup Dissing

    2011-01-01

    Rapid eye movement sleep Behavior Disorder (RBD) is a strong early marker of later development of Parkinsonism. Currently there are no objective methods to identify and discriminate abnormal from normal motor activity during REM sleep. Therefore, REM sleep detection without the use of chin electromyography (EMG) is useful. This is addressed by analyzing the classification performance when implementing two automatic REM sleep detectors. The first detector uses electroencephalography (EEG), electrooculography (EOG) and EMG to detect REM sleep, while the second detector only uses the EEG and EOG. An automatic computerized REM detection algorithm has been implemented, using wavelet packets combined with an artificial neural network. Results: When using the EEG, EOG and EMG modalities, it was possible to correctly classify REM sleep with an average Area Under the Curve (AUC) equal to 0.90 ± 0.03 for normal subjects...

  12. Feature extraction and classification in automatic weld seam radioscopy

    International Nuclear Information System (INIS)

    Heindoerfer, F.; Pohle, R.

    1994-01-01

    The investigations conducted have shown that automatic feature extraction and classification procedures permit the identification of weld seam flaws. Within this context, the favored learning fuzzy classifier represents a very good alternative to conventional classifiers. The results have also made clear that improvements, mainly in the field of image registration, are still possible by increasing the resolution of the radioscopy system: only if the flaw is segmented correctly, i.e. in its full size, and only with improved detail recognizability and a sufficient contrast difference, will an almost error-free classification be conceivable. (orig./MM) [de

  13. [Automatic Sleep Stage Classification Based on an Improved K-means Clustering Algorithm].

    Science.gov (United States)

    Xiao, Shuyuan; Wang, Bei; Zhang, Jian; Zhang, Qunfeng; Zou, Junzhong

    2016-10-01

    Sleep stage scoring is a hotspot in the field of medicine and neuroscience. Visual inspection of sleep is laborious and the results may be subjective to different clinicians. Automatic sleep stage classification algorithms can be used to reduce the manual workload. However, there are still limitations when they encounter complicated and changeable clinical cases. The purpose of this paper is to develop an automatic sleep staging algorithm based on the characteristics of actual sleep data. In the proposed improved K-means clustering algorithm, points were selected as the initial centers by using a concept of density to avoid the randomness of the original K-means algorithm. Meanwhile, the cluster centers were updated according to the ‘Three-Sigma Rule’ during the iteration to abate the influence of the outliers. The proposed method was tested and analyzed on the overnight sleep data of healthy persons and patients with sleep disorders after continuous positive airway pressure (CPAP) treatment. The automatic sleep stage classification results were compared with the visual inspection by qualified clinicians and the averaged accuracy reached 76%. With the analysis of the morphological diversity of sleep data, it was proved that the proposed improved K-means algorithm was feasible and valid for clinical practice.
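
    The two modifications described above can be illustrated generically in Python: initial centres are chosen by a simple density criterion instead of at random, and centre updates discard points lying more than three standard deviations from their cluster centre. This is a schematic re-implementation on synthetic data, not the authors' code.

        import numpy as np

        def density_init(X, k, radius=1.0):
            # Choose the k points with the most neighbours within `radius` as initial centres.
            d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
            counts = (d < radius).sum(axis=1)
            return X[np.argsort(counts)[-k:]]

        def kmeans_three_sigma(X, k, iters=20):
            centers = density_init(X, k)
            for _ in range(iters):
                labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
                for j in range(k):
                    pts = X[labels == j]
                    if len(pts) == 0:
                        continue
                    dist = np.linalg.norm(pts - centers[j], axis=1)
                    keep = dist <= dist.mean() + 3 * dist.std()  # 'Three-Sigma Rule'
                    centers[j] = pts[keep].mean(axis=0)
            return centers, labels

        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(6, 1, (100, 2))])
        centers, labels = kmeans_three_sigma(X, k=2)
        print(centers)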

  14. Automatic Discovery and Geotagging of Objects from Street View Imagery

    Directory of Open Access Journals (Sweden)

    Vladimir A. Krylov

    2018-04-01

    Full Text Available Many applications, such as autonomous navigation, urban planning, and asset monitoring, rely on the availability of accurate information about objects and their geolocations. In this paper, we propose the automatic detection and computation of the coordinates of recurring stationary objects of interest using street view imagery. Our processing pipeline relies on two fully convolutional neural networks: the first segments objects in the images, while the second estimates their distance from the camera. To geolocate all the detected objects coherently we propose a novel custom Markov random field model to estimate the objects’ geolocation. The novelty of the resulting pipeline is the combined use of monocular depth estimation and triangulation to enable automatic mapping of complex scenes with the simultaneous presence of multiple, visually similar objects of interest. We validate experimentally the effectiveness of our approach on two object classes: traffic lights and telegraph poles. The experiments report high object recall rates and position precision of approximately 2 m, which is approaching the precision of single-frequency GPS receivers.
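
    As an illustration of the geolocation idea (a strong simplification of the paper's MRF-based fusion of depth estimates and detections), the Python sketch below intersects, in a least-squares sense, two 2-D bearing rays cast from known camera positions toward the same detected object; the camera positions and bearings are hypothetical.

        import numpy as np

        def triangulate(origins, directions):
            # Least-squares intersection of 2-D rays: minimise the summed squared
            # distance from a point to each line (origin + t * direction).
            A = np.zeros((2, 2))
            b = np.zeros(2)
            for o, d in zip(origins, directions):
                d = d / np.linalg.norm(d)
                P = np.eye(2) - np.outer(d, d)  # projector onto the line's normal space
                A += P
                b += P @ o
            return np.linalg.solve(A, b)

        # Hypothetical camera positions (local metric coordinates) and bearings
        # toward the same object, e.g. derived from detections and vehicle headings.
        origins = [np.array([0.0, 0.0]), np.array([10.0, 0.0])]
        directions = [np.array([1.0, 1.0]), np.array([-1.0, 1.0])]
        print(triangulate(origins, directions))  # approximately [5., 5.]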

  15. Automatic identification of otologic drilling faults: a preliminary report.

    Science.gov (United States)

    Shen, Peng; Feng, Guodong; Cao, Tianyang; Gao, Zhiqiang; Li, Xisheng

    2009-09-01

    A preliminary study was carried out to identify parameters to characterize drilling faults when using an otologic drill under various operating conditions. An otologic drill was modified by the addition of four sensors. Under consistent conditions, the drill was used to simulate three important types of drilling faults and the captured data were analysed to extract characteristic signals. A multisensor information fusion system was designed to fuse the signals and automatically identify the faults. When identifying drilling faults, there was a high degree of repeatability and regularity, with an average recognition rate of >70%. This study shows that the variables measured change in a fashion that allows the identification of particular drilling faults, and that it is feasible to use these data to provide rapid feedback for a control system. Further experiments are being undertaken to implement such a system.

  16. On the question of the necessity of implementation of automatic control systems in timber industry

    Science.gov (United States)

    Khasanov, E. R.; Zelenkov, P. V.; Petrosyan, M. O.; Murygin, A. V.; Laptenor, V. D.

    2016-04-01

    The paper considers the necessity of implementing automatic control systems at the level of forest farm management and the timber industry. The main areas of activity that are currently subject to automation are revealed. The objectives that are solved by the implementation of APCS are identified.

  17. Automatization of welding

    International Nuclear Information System (INIS)

    Iwabuchi, Masashi; Tomita, Jinji; Nishihara, Katsunori.

    1978-01-01

    Automatization of welding is one of the effective measures for securing a high degree of quality of nuclear power equipment, as well as for coping with the working environment at the plant site. The latest automatic welders practically used for welding nuclear power apparatus in the factories of Toshiba and IHI, namely those for pipes and lining tanks, are described here. The pipe welder performs buttering welding on the inside of the pipe end as the so-called IGSCC countermeasure, and the succeeding butt welding, through the same controller. The lining tank welder is able to perform simultaneous welding of two parallel weld lines on a large thin-plate lining tank. Both types of welders are demonstrating excellent performance in the shops as well as at the plant site. (author)

  18. Field-In-Field Technique With Intrafractionally Modulated Junction Shifts for Craniospinal Irradiation

    International Nuclear Information System (INIS)

    Yom, Sue S.; Frija, Erik K. C.; Mahajan, Anita; Chang, Eric; Klein, Kelli C.; Shiu, Almon; Ohrt, Jared; Woo, Shiao

    2007-01-01

    Purpose: To plan craniospinal irradiation with 'field-in-field' (FIF) homogenization in combination with daily, intrafractional modulation of the field junctions, to minimize the possibility of spinal cord overdose. Methods and Materials: Lateral cranial fields and posterior spinal fields were planned using a forward-planned, step-and-shoot FIF technique. Field junctions were automatically modulated and custom-weighted for maximal homogeneity within each treatment fraction. Dose-volume histogram analyses and film dosimetry were used to assess results. Results: Plan inhomogeneity improved with FIF. Planning with daily modulated junction shifts provided consistent dose delivery during each fraction of treatment across the junctions. Modulation minimized the impact of a 5-mm setup error at the junction. Film dosimetry confirmed that no point in the junction exceeded the anticipated dose. Conclusions: Field-in-field planning and modulated junction shifts improve the homogeneity and consistency of daily dose delivery, simplify treatment, and reduce the impact of setup errors

  19. Label-free sensor for automatic identification of erythrocytes using digital in-line holographic microscopy and machine learning.

    Science.gov (United States)

    Go, Taesik; Byeon, Hyeokjun; Lee, Sang Joon

    2018-04-30

    Cell types of erythrocytes should be identified because they are closely related to their functionality and viability. Conventional methods for classifying erythrocytes are time consuming and labor intensive. Therefore, an automatic and accurate erythrocyte classification system is indispensable in healthcare and biomedical fields. In this study, we proposed a new label-free sensor for automatic identification of erythrocyte cell types using a digital in-line holographic microscopy (DIHM) combined with machine learning algorithms. A total of 12 features, including information on intensity distributions, morphological descriptors, and optical focusing characteristics, is quantitatively obtained from numerically reconstructed holographic images. All individual features for discocytes, echinocytes, and spherocytes are statistically different. To improve the performance of cell type identification, we adopted several machine learning algorithms, such as decision tree model, support vector machine, linear discriminant classification, and k-nearest neighbor classification. With the aid of these machine learning algorithms, the extracted features are effectively utilized to distinguish erythrocytes. Among the four tested algorithms, the decision tree model exhibits the best identification performance for the training sets (n = 440, 98.18%) and test sets (n = 190, 97.37%). This proposed methodology, which smartly combined DIHM and machine learning, would be helpful for sensing abnormal erythrocytes and computer-aided diagnosis of hematological diseases in clinic. Copyright © 2017 Elsevier B.V. All rights reserved.
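
    A minimal scikit-learn sketch of the classification stage only (the holographic feature extraction is not reproduced here): a decision tree is trained on a feature matrix with 12 columns, mirroring the 12 quantitative features used in the study; the data below are random placeholders.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        # Placeholder feature matrix: 630 cells x 12 features (intensity,
        # morphology and focus descriptors in the real study).
        X = rng.normal(size=(630, 12))
        y = rng.integers(0, 3, size=630)  # 0: discocyte, 1: echinocyte, 2: spherocyte

        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=190, random_state=0)

        clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
        print("test accuracy:", clf.score(X_test, y_test))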

  20. Pattern-Driven Automatic Parallelization

    Directory of Open Access Journals (Sweden)

    Christoph W. Kessler

    1996-01-01

    Full Text Available This article describes a knowledge-based system for automatic parallelization of a wide class of sequential numerical codes operating on vectors and dense matrices, and for execution on distributed memory message-passing multiprocessors. Its main feature is a fast and powerful pattern recognition tool that locally identifies frequently occurring computations and programming concepts in the source code. This tool also works for dusty deck codes that have been "encrypted" by former machine-specific code transformations. Successful pattern recognition guides sophisticated code transformations including local algorithm replacement such that the parallelized code need not emerge from the sequential program structure by just parallelizing the loops. It allows access to an expert's knowledge on useful parallel algorithms, available machine-specific library routines, and powerful program transformations. The partially restored program semantics also supports local array alignment, distribution, and redistribution, and allows for faster and more exact prediction of the performance of the parallelized target code than is usually possible.

  1. Evaluation of Semi-Automatic Metadata Generation Tools: A Survey of the Current State of the Art

    Directory of Open Access Journals (Sweden)

    Jung-ran Park

    2015-09-01

    Full Text Available Assessment of the current landscape of semi-automatic metadata generation tools is particularly important considering the rapid development of digital repositories and the recent explosion of big data. Utilization of (semi)automatic metadata generation is critical in addressing these environmental changes and may be unavoidable in the future considering the costly and complex operation of manual metadata creation. To address such needs, this study examines the range of semi-automatic metadata generation tools (n=39) while providing an analysis of their techniques, features, and functions. The study focuses on open-source tools that can be readily utilized in libraries and other memory institutions. The challenges and current barriers to implementation of these tools were identified. The greatest area of difficulty lies in the fact that the piecemeal development of most semi-automatic generation tools only addresses part of the issue of semi-automatic metadata generation, providing solutions for one or a few metadata elements but not the full range of elements. This indicates that significant local efforts will be required to integrate the various tools into a coherent working whole. Suggestions toward such efforts are presented for future developments that may assist information professionals with the incorporation of semi-automatic tools within their daily workflows.

  2. Speed and automaticity of word recognition - inseparable twins?

    DEFF Research Database (Denmark)

    Poulsen, Mads; Asmussen, Vibeke; Elbro, Carsten

    'Speed and automaticity' of word recognition is a standard collocation. However, it is not clear whether speed and automaticity (i.e., effortlessness) make independent contributions to reading comprehension. In theory, both speed and automaticity may save cognitive resources for comprehension processes. Hence, the aim of the present study was to assess the unique contributions of word recognition speed and automaticity to reading comprehension while controlling for decoding speed and accuracy. Method: 139 Grade 5 students completed tests of reading comprehension and computer-based tests of speed of decoding and word recognition together with a test of effortlessness (automaticity) of word recognition. Effortlessness was measured in a dual task in which participants were presented with a word enclosed in an unrelated figure. The task was to read the word and decide whether the figure was a triangle...

  3. An Objective Approach to Identify Spectral Distinctiveness for Hearing Impairment

    Directory of Open Access Journals (Sweden)

    Yeou-Jiunn Chen

    2013-01-01

    Full Text Available To facilitate the process of developing speech perception, speech-language pathologists have to teach a subject with hearing loss the differences between two syllables by manually enhancing acoustic cues of speech. However, this process is time consuming and difficult. Thus, this study proposes an objective approach to automatically identify the regions of spectral distinctiveness between two syllables, which is used for speech-perception training. To accurately represent the characteristics of speech, mel-frequency cepstrum coefficients are selected as analytical parameters. The mismatch between two syllables in the time domain is handled by dynamic time warping. Further, a filter bank is adopted to estimate the components in different frequency bands, which are also represented as mel-frequency cepstrum coefficients. The spectral distinctiveness in different frequency bands is then easily estimated by using Euclidean metrics. Finally, a morphological gradient operator is applied to automatically identify the regions of spectral distinctiveness. To evaluate the proposed approach, the identified regions are manipulated and then the manipulated syllables are measured by a closed-set speech-perception test. The experimental results demonstrated that the identified regions of spectral distinctiveness are very useful in speech perception, which indeed can help speech-language pathologists in speech-perception training.

  4. The design philosophy for an automatic TLD system to meet current international specifications

    International Nuclear Information System (INIS)

    Haaslahti, J.

    1986-01-01

    The object of this paper is to describe the elements of a new automatic TLD system intended to meet draft IEC/ISO proposals and ANSI requirements in the USA. Dosemeter badge design is based on ICRU recommendations. The basic intent has been to produce a standard system that can measure and file raw data that can be adapted to specific user requirements with software. The system consists of a programmable automatic reader, an automatic irradiator, a computer, and dosemeters for environmental, whole body, extremity, and clinical applications. The reader uses hot nitrogen heating and photon counting, and measurement conditions may be chosen with complete freedom. The reader can produce a real-time glow curve to assist in checking performance. The irradiator has a 90Sr-90Y source to permit programmed irradiation for calibration and material sensitivity checks. Cassettes are used to hold TLD cards during processing. Cassette coding both identifies samples and calls measurement parameters into use from memory. The system can be preprogrammed to measure all common materials and all common dosemeter elements (both square and round). (author)

  5. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques

  6. Towards automatic exchange of information

    OpenAIRE

    Oberson, Xavier

    2015-01-01

    This article describes the various steps that led towards automatic exchange of information as the global standard and the issues that remain to be solved. First, the various competing models of exchange of information, such as Double Tax Treaties (DTT), TIEAs, FATCA or EU Directives, are described with a view to showing how they interact with one another. Second, the so-called Rubik Strategy is summarized and compared with automatic exchange of information (AEOI). The third part then describes ...

  7. FieldChopper, a new tool for automatic model generation and virtual screening based on molecular fields.

    Science.gov (United States)

    Kalliokoski, Tuomo; Ronkko, Toni; Poso, Antti

    2008-06-01

    Algorithms were developed for ligand-based virtual screening of molecular databases. FieldChopper (FC) is based on the discretization of the electrostatic and van der Waals field into three classes. A model is built from a set of superimposed active molecules. The similarity of the compounds in the database to the model is then calculated using matrices that define scores for comparing field values of different categories. The method was validated using 12 publicly available data sets by comparing the method to the electrostatic similarity comparison program EON. The results suggest that FC is competitive with more complex descriptors and could be used as a molecular sieve in virtual screening experiments when multiple active ligands are known.

  8. Automatic registration method for multisensor datasets adopted for dimensional measurements on cutting tools

    International Nuclear Information System (INIS)

    Shaw, L; Mehari, F; Weckenmann, A; Ettl, S; Häusler, G

    2013-01-01

    Multisensor systems with optical 3D sensors are frequently employed to capture complete surface information by measuring workpieces from different views. During coarse and fine registration the resulting datasets are afterward transformed into one common coordinate system. Automatic fine registration methods are well established in dimensional metrology, whereas there is a deficit in automatic coarse registration methods. The advantage of a fully automatic registration procedure is twofold: it enables a fast and contact-free alignment and further a flexible application to datasets of any kind of optical 3D sensor. In this paper, an algorithm adapted for a robust automatic coarse registration is presented. The method was originally developed for the field of object reconstruction or localization. It is based on a segmentation of planes in the datasets to calculate the transformation parameters. The rotation is defined by the normals of three corresponding segmented planes of two overlapping datasets, while the translation is calculated via the intersection point of the segmented planes. First results have shown that the translation is strongly shape dependent: 3D data of objects with non-orthogonal planar flanks cannot be registered with the current method. In the novel supplement for the algorithm, the translation is additionally calculated via the distance between centroids of corresponding segmented planes, which results in more than one option for the transformation. A newly introduced measure considering the distance between the datasets after coarse registration evaluates the best possible transformation. Results of the robust automatic registration method are presented on the example of datasets taken from a cutting tool with a fringe-projection system and a focus-variation system. The successful application in dimensional metrology is proven with evaluations of shape parameters based on the registered datasets of a calibrated workpiece. (paper)
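
    The core of such a plane-based coarse registration can be sketched generically: the rotation that maps three corresponding plane normals of the source dataset onto those of the target is estimated with a Kabsch-style SVD fit, and the translation is taken from the centroids of one pair of corresponding planes. This is a numpy illustration with hypothetical normals and centroids, not the authors' implementation.

        import numpy as np

        def rotation_from_normals(n_src, n_dst):
            # Kabsch-style fit: rotation that best maps source normals onto target normals.
            H = np.asarray(n_src, dtype=float).T @ np.asarray(n_dst, dtype=float)
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            return Vt.T @ D @ U.T

        # Hypothetical unit normals of three corresponding segmented planes.
        n_src = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
        R_true = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]])  # 90 deg about z
        n_dst = [R_true @ np.array(n) for n in n_src]

        R = rotation_from_normals(n_src, n_dst)

        # Translation from the centroids of one pair of corresponding planes.
        c_src, c_dst = np.array([1.0, 2.0, 0.0]), np.array([0.0, 1.0, 0.5])
        t = c_dst - R @ c_src
        print(np.round(R, 3), t)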

  9. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided

  10. Automatic Detection of Acromegaly From Facial Photographs Using Machine Learning Methods.

    Science.gov (United States)

    Kong, Xiangyi; Gong, Shun; Su, Lijuan; Howard, Newton; Kong, Yanguo

    2018-01-01

    Automatic early detection of acromegaly is theoretically possible from facial photographs, which can lessen the prevalence and increase the cure probability. In this study, several popular machine learning algorithms were used to train a retrospective development dataset consisting of 527 acromegaly patients and 596 normal subjects. We firstly used OpenCV to detect the face bounding rectangle box, and then cropped and resized it to the same pixel dimensions. From the detected faces, locations of facial landmarks which were the potential clinical indicators were extracted. Frontalization was then adopted to synthesize frontal facing views to improve the performance. Several popular machine learning methods including LM, KNN, SVM, RT, CNN, and EM were used to automatically identify acromegaly from the detected facial photographs, extracted facial landmarks, and synthesized frontal faces. The trained models were evaluated using a separate dataset, of which half were diagnosed as acromegaly by growth hormone suppression test. The best result of our proposed methods showed a PPV of 96%, a NPV of 95%, a sensitivity of 96% and a specificity of 96%. Artificial intelligence can automatically early detect acromegaly with a high sensitivity and specificity. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
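
    The face detection and normalisation step described above can be sketched with OpenCV's bundled Haar cascade (the study's landmark extraction, frontalization and classifiers are not reproduced); the image file name is hypothetical.

        import cv2

        # Load a photograph (hypothetical file) and detect face bounding boxes.
        image = cv2.imread("photo.jpg")
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

        # Crop each detected face and resize it to a fixed pixel dimension,
        # as is done before landmark extraction and classification.
        for i, (x, y, w, h) in enumerate(faces):
            face = cv2.resize(image[y:y + h, x:x + w], (128, 128))
            cv2.imwrite(f"face_{i}.png", face)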

  11. SELFADJUSTING AUTOMATIC CONTROL OF SOWING UNIT

    Directory of Open Access Journals (Sweden)

    A. Yu. Izmaylov

    2015-01-01

    Full Text Available Self-adjusting automatic control of the sowing unit and differentiated application of mineral fertilizer doses according to agrochemical indicators of the soil (precision agriculture) are now used more and more widely. It was determined that the main requirement for differentiated seeding and fertilizing is the accuracy and duration of the transition from one norm to another. It was established that at a unit speed of 10 km/h the machine moves about 1.5 m or more in 0.5 s, whereas in this device the differential correction received over the radio channel is updated every 10 s, and in the RTK mode every 0.5-2 s, which limits the accuracy of seed and fertilizer application. A block schematic diagram was worked out for the system of automatic control of the technological process of seeding and mineral fertilizing, using navigation means to orient machine-tractor aggregates in the field and technical means for realizing precision-agriculture technology at sowing and fertilizer application on the basis of electronic soil-fertility maps and navigation satellite systems. It was noted that to regulate the fertilizing dose the unit must be completed with an electric drive, and to reduce errors GLONASS, GPS and Galileo navigation receivers should be used. A receiver with 32 channels tracking the four leading navigation systems GPS/GLONASS/Galileo/Compass, developed by the domestic firm «KB NAVIS», was suggested. It was established that the automated device created by the All-Russia Research Institute of Mechanization for Agriculture, using information from the NAVSTAR and GLONASS/GPS systems, successfully controls seeding and makes differentiated fertilizing possible.

  12. Development project of an automatic sampling system for part time unmanned pipeline terminals

    Energy Technology Data Exchange (ETDEWEB)

    Moreira, Gullherme O.; De Almelda, Marcio M. G.; Ramos, Ricardo R. [Petrobas, (Brazil); Potten, Gary [Cameron Measurement Systems, (United States)

    2010-07-01

    The Sao Paulo - Brasilia Pipeline (OSBRA) is a highly automated pipeline using a SCADA system which operates from a control room. A new quality management system standard was established for transportation and storage operations. The products had to be sampled on an automatic basis. This paper reports the development of an automatic sampling system (ASS) in accordance with the new quality control standard. The prototype was developed to be implemented through a human-machine interface (HMI) from the control room SCADA screens. A technical cooperation agreement(TCA) was drawn up for development of this new ASS product. The TCA was a joint cooperation between the Holding, the Operator and the cooperators. The prototype will be on-field tested at Senador Canedo tank farm to SPEC requirements. The current performance of the ASS establishes reasonable expectations for further successful development.

  13. Automatic and manual segmentation of healthy retinas using high-definition optical coherence tomography.

    Science.gov (United States)

    Golbaz, Isabelle; Ahlers, Christian; Goesseringer, Nina; Stock, Geraldine; Geitzenauer, Wolfgang; Prünte, Christian; Schmidt-Erfurth, Ursula Margarethe

    2011-03-01

    This study compared automatic- and manual segmentation modalities in the retina of healthy eyes using high-definition optical coherence tomography (HD-OCT). Twenty retinas in 20 healthy individuals were examined using an HD-OCT system (Carl Zeiss Meditec, Inc.). Three-dimensional imaging was performed with an axial resolution of 6 μm at a maximum scanning speed of 25,000 A-scans/second. Volumes of 6 × 6 × 2 mm were scanned. Scans were analysed using a matlab-based algorithm and a manual segmentation software system (3D-Doctor). The volume values calculated by the two methods were compared. Statistical analysis revealed a high correlation between automatic and manual modes of segmentation. The automatic mode of measuring retinal volume and the corresponding three-dimensional images provided similar results to the manual segmentation procedure. Both methods were able to visualize retinal and subretinal features accurately. This study compared two methods of assessing retinal volume using HD-OCT scans in healthy retinas. Both methods were able to provide realistic volumetric data when applied to raster scan sets. Manual segmentation methods represent an adequate tool with which to control automated processes and to identify clinically relevant structures, whereas automatic procedures will be needed to obtain data in larger patient populations. © 2009 The Authors. Journal compilation © 2009 Acta Ophthalmol.

  14. Automatic Lamp and Fan Control Based on Microcontroller

    Science.gov (United States)

    Widyaningrum, V. T.; Pramudita, Y. D.

    2018-01-01

    In general, automation can be described as a process following pre-determined sequential steps with little or no human exertion. Automation is achieved with the use of various sensors suited to observing the production processes, actuators, and different techniques and devices. In this research, the automation system developed consists of an automatic lamp and an automatic fan for a smart home. Both of these systems are processed using an Arduino Mega 2560 microcontroller. The microcontroller is used to obtain values of physical conditions through the sensors connected to it. The automatic lamp system requires an LDR (Light Dependent Resistor) sensor to detect light, while the automatic fan system requires a DHT11 sensor to detect temperature. In the tests that have been performed, the lamp and fan work properly. The lamp turns on automatically when the light begins to darken, and turns off automatically when the light becomes bright again. In addition, it can also be concluded that the readings of an LDR sensor placed outside the room differ from the readings of an LDR sensor placed inside the room. This is because the light intensity received by the LDR sensor in the room is blocked by the walls of the house or by other objects. The fan turns on automatically when the temperature is greater than 25°C, and the fan speed can also be adjusted. The fan turns off automatically when the temperature is less than or equal to 25°C.

  15. Automatic control system at the ''Loviisa'' NPP

    International Nuclear Information System (INIS)

    Kukhtevich, I.V.; Mal'tsev, B.K.; Sergievskaya, E.N.

    1980-01-01

    The automatic control system of the Loviisa-1 NPP (Finland) is described. According to the operating conditions of the Finnish power system, the Loviisa-1 NPP must operate in the mode of weekly and daily control of the loading schedule and participate in the ongoing control of power system frequency and capacity. To meet these requirements, the NPP is equipped with an all-regime system for automatic control functioning during reactor start-up and shut-down, in normal and transient regimes, and in emergency situations. The automatic control system includes a data subsystem, an automatic control subsystem, a discrete control subsystem (including remote control), a subsystem for reactor control and protection, and an overall station system of protections with in-reactor control and dosimetry. The structures of the data-computer complex, the discrete control subsystems, the reactor control and protection systems, the neutron flux control system, the in-reactor control system, the station protection system and the system for monitoring fuel element tightness are presented in brief. Two years of NPP operating experience confirmed the advisability of the chosen degree of automation. The Loviisa-1 NPP operates successfully in the mode of weekly and daily control of the dispatch schedule and ongoing control of frequency (short-term control).

  16. 21 CFR 211.68 - Automatic, mechanical, and electronic equipment.

    Science.gov (United States)

    2010-04-01

    ... SERVICES (CONTINUED) DRUGS: GENERAL CURRENT GOOD MANUFACTURING PRACTICE FOR FINISHED PHARMACEUTICALS Equipment § 211.68 Automatic, mechanical, and electronic equipment. (a) Automatic, mechanical, or electronic... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Automatic, mechanical, and electronic equipment...

  17. Automatic Clustering Using FSDE-Forced Strategy Differential Evolution

    Science.gov (United States)

    Yasid, A.

    2018-01-01

    Clustering analysis is important in data mining for unsupervised data, because no adequate prior knowledge is available. One of the important tasks is defining the number of clusters without user involvement, which is known as automatic clustering. This study aims at acquiring the cluster number automatically by utilizing forced strategy differential evolution (AC-FSDE). Two mutation parameters, namely a constant parameter and a variable parameter, are employed to boost differential evolution performance. Four well-known benchmark datasets were used to evaluate the algorithm. Moreover, the results are compared with other state-of-the-art automatic clustering methods. The experimental results show that AC-FSDE is better than or competitive with other existing automatic clustering algorithms.

  18. #SupportTheCause: Identifying Motivations to Participate in Online Health Campaigns

    NARCIS (Netherlands)

    Nguyen, Dong-Phuong; van den Broek, Tijs Adriaan; Hauff, C.; Hiemstra, Djoerd; Ehrenhard, Michel Léon

    We consider the task of automatically identifying participants’ motivations in the public health campaign Movember and investigate the impact of the different motivations on the amount of campaign donations raised. Our classification scheme is based on the Social Identity Model of Collective Action

  19. Automatic defect detection in video archives: application to Montreux Jazz Festival digital archives

    Science.gov (United States)

    Hanhart, Philippe; Rerabek, Martin; Ivanov, Ivan; Dufaux, Alain; Jones, Caryl; Delidais, Alexandre; Ebrahimi, Touradj

    2013-09-01

    Archival of audio-visual databases has become an important discipline in multimedia. Various defects are typically present in such archives. Among those, one can mention recording-related defects such as interference between audio and video signals, optical artifacts, recording and play-out artifacts such as horizontal lines and dropouts, as well as those due to digitization, such as diagonal lines. Automatic or semi-automatic detection to identify such defects is useful, especially for large databases. In this paper, we propose two automatic algorithms for the detection of horizontal and diagonal lines, as well as dropouts, which are among the most typical artifacts encountered. We then evaluate the performance of these algorithms by making use of ground-truth scores obtained from human subjects.
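
    A generic sketch of line-artifact detection in a single frame (not the authors' algorithms): edges are extracted with a Canny detector and line segments with a probabilistic Hough transform, after which near-horizontal and near-diagonal segments can be flagged; OpenCV is assumed and the frame file name is hypothetical.

        import numpy as np
        import cv2

        frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame
        edges = cv2.Canny(frame, 50, 150)
        segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                                   threshold=80, minLineLength=100, maxLineGap=5)

        horizontal, diagonal = [], []
        if segments is not None:
            for x1, y1, x2, y2 in segments[:, 0]:
                angle = abs(np.degrees(np.arctan2(y2 - y1, x2 - x1))) % 180
                if angle < 5 or angle > 175:
                    horizontal.append((x1, y1, x2, y2))
                elif 30 < angle < 60 or 120 < angle < 150:
                    diagonal.append((x1, y1, x2, y2))

        print(len(horizontal), "horizontal and", len(diagonal), "diagonal candidates")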

  20. Evaluation of automatic face recognition for automatic border control on actual data recorded of travellers at Schiphol Airport

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Hendrikse, A.J.; Gerritsen, K.J.; Brömme, A.; Busch, C.

    2012-01-01

    Automatic border control at airports using automated facial recognition for checking the passport is becoming more and more common. A problem is that it is not clear how reliable these automatic gates are. Very few independent studies exist that assess the reliability of automated facial recognition

  1. An introduction to automatic radioactive sample counters

    International Nuclear Information System (INIS)

    1980-01-01

    The subject is covered in chapters, entitled; the detection of radiation in sample counters; nucleonic equipment; liquid scintillation counting; basic features of automatic sample counters; statistics of counting; data analysis; purchase, installation, calibration and maintenance of automatic sample counters. (U.K.)

  2. Automatic Cobb Angle Determination From Radiographic Images

    NARCIS (Netherlands)

    Sardjono, Tri Arief; Wilkinson, Michael H. F.; Veldhuizen, Albert G.; van Ooijen, Peter M. A.; Purnama, Ketut E.; Verkerke, Gijsbertus J.

    2013-01-01

    Study Design. Automatic measurement of Cobb angle in patients with scoliosis. Objective. To test the accuracy of an automatic Cobb angle determination method from frontal radiographical images. Summary of Background Data. Thirty-six frontal radiographical images of patients with scoliosis. Methods.

  3. Observed use of automatic seat belts in 1987 cars.

    Science.gov (United States)

    Williams, A F; Wells, J K; Lund, A K; Teed, N

    1989-10-01

    Usage of the automatic belt systems supplied by six large-volume automobile manufacturers to meet the federal requirements for automatic restraints was observed in suburban Washington, D.C., Chicago, Los Angeles, and Philadelphia. The different belt systems studied were: Ford and Toyota (motorized, nondetachable automatic shoulder belt), Nissan (motorized, detachable shoulder belt), VW and Chrysler (nonmotorized, detachable shoulder belt), and GM (nonmotorized detachable lap and shoulder belt). Use of automatic belts was significantly greater than manual belt use in otherwise comparable late-model cars for all manufacturers except Chrysler; in Chrysler cars, automatic belt use was significantly lower than manual belt use. The automatic shoulder belts provided by Ford, Nissan, Toyota, and VW increased use rates to about 90%. Because use rates were lower in Ford cars with manual belts, their increase was greater. GM cars had the smallest increase in use rates; however, lap belt use was highest in GM cars. The other manufacturers supply knee bolsters to supplement shoulder belt protection; all except VW also provide manual lap belts, which were used by about half of those who used the automatic shoulder belt. The results indicate that some manufacturers have been more successful than others in providing automatic belt systems that result in high use that, in turn, will mean fewer deaths and injuries in those cars.

  4. Automatic teeth axes calculation for well-aligned teeth using cost profile analysis along teeth center arch.

    Science.gov (United States)

    Kim, Gyehyun; Lee, Jeongjin; Seo, Jinwook; Lee, Wooshik; Shin, Yeong-Gil; Kim, Bohyoung

    2012-04-01

    In dental implantology and virtual dental surgery planning using computed tomography (CT) images, the examination of the axes of neighboring and/or biting teeth is important to improve the performance of the masticatory system as well as the aesthetic beauty. However, due to its high connectivity to neighboring teeth and jawbones, a tooth and/or its axis is very elusive to automatically identify in dental CT images. This paper presents a novel method of automatically calculating individual teeth axes. The planes separating the individual teeth are automatically calculated using cost profile analysis along the teeth center arch. In this calculation, a novel plane cost function, which considers the intensity and the gradient, is proposed to favor the teeth separation planes crossing the teeth interstice and suppress the possible inappropriately detected separation planes crossing the soft pulp. The soft pulp and dentine of each individually separated tooth are then segmented by a fast marching method with two newly proposed speed functions considering their own specific anatomical characteristics. The axis of each tooth is finally calculated using principal component analysis on the segmented soft pulp and dentine. In experimental results using 20 clinical datasets, the average angle and minimum distance differences between the teeth axes manually specified by two dentists and automatically calculated by the proposed method were 1.94° ± 0.61° and 1.13 ± 0.56 mm, respectively. The proposed method identified the individual teeth axes accurately, demonstrating that it can give dentists substantial assistance during dental surgery such as dental implant placement and orthognathic surgery.
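
    The final axis-estimation step can be illustrated generically in Python: given the 3-D coordinates of the voxels segmented as soft pulp and dentine, the tooth axis is taken as the first principal component of the point cloud. The point cloud below is a synthetic placeholder, not real CT data.

        import numpy as np

        def principal_axis(points):
            # First principal component of a 3-D point cloud (unit vector).
            centered = points - points.mean(axis=0)
            _, _, vt = np.linalg.svd(centered, full_matrices=False)
            return vt[0]

        # Synthetic elongated point cloud standing in for segmented pulp and dentine voxels.
        rng = np.random.default_rng(0)
        cloud = np.column_stack([0.5 * rng.normal(size=2000),
                                 0.5 * rng.normal(size=2000),
                                 rng.uniform(-10, 10, size=2000)])  # elongated along z
        axis = principal_axis(cloud)
        print(np.round(axis, 3))  # approximately [0, 0, 1], up to sign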

  5. Automatic sample changers maintenance manual

    International Nuclear Information System (INIS)

    Myers, T.A.

    1978-10-01

    This manual describes and provides trouble-shooting aids for the Automatic Sample Changer electronics on the automatic beta counting system, developed by the Los Alamos Scientific Laboratory Group CNC-11. The output of a gas detector is shaped by a preamplifier, then is coupled to an amplifier. Amplifier output is discriminated and is the input to a scaler. An identification number is associated with each sample. At a predetermined count length, the identification number, scaler data plus other information is punched out on a data card. The next sample to be counted is automatically selected. The beta counter uses the same electronics as the prior count did, the only difference being the sample identification number and sample itself. This manual is intended as a step-by-step aid in trouble-shooting the electronics associated with positioning the sample, counting the sample, and getting the needed data punched on an 80-column data card

  6. Automatic, semi-automatic and manual validation of urban drainage data.

    Science.gov (United States)

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark the wrong data, remove them and replace them with interpolated data. In general, the first step in detecting the wrong, anomaly data is called the data quality assessment or data validation. Data validation consists of three parts: data preparation, validation scores generation and scores interpretation. This paper will present the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, which is the scores interpretation, needs to be further investigated on the developed system.
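    As an illustration of the "validation scores generation" step, the sketch below applies two simple and commonly used checks, a physical range test and a rate-of-change (spike) test, to a measurement series. The thresholds and method choices are illustrative assumptions, not the methods applied to the Belgrade sewer data.

```python
import numpy as np

def validation_scores(values, lower=0.0, upper=5.0, max_step=0.5):
    """Return one score per sample and per validation method in [0, 1];
    1 = plausible, 0 = flagged as anomalous. Threshold values are assumptions."""
    x = np.asarray(values, dtype=float)
    # Method 1: physical range check.
    range_score = ((x >= lower) & (x <= upper)).astype(float)
    # Method 2: spike / rate-of-change check against the previous sample.
    step = np.abs(np.diff(x, prepend=x[0]))
    spike_score = (step <= max_step).astype(float)
    return np.vstack([range_score, spike_score])

# The scores interpretation step could, for example, average the scores and
# mark samples below 0.5 for removal and replacement by interpolated data.
scores = validation_scores([0.2, 0.3, 4.9, 0.4, -1.0])
flagged = scores.mean(axis=0) < 0.5
```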

  7. Automatic Encoding and Language Detection in the GSDL

    Directory of Open Access Journals (Sweden)

    Otakar Pinkas

    2014-10-01

    Full Text Available Automatic detection of encoding and language of the text is part of the Greenstone Digital Library Software (GSDL) for building and distributing digital collections. It is developed by the University of Waikato (New Zealand) in cooperation with UNESCO. The automatic encoding and language detection in Slavic languages is difficult and it sometimes fails. The aim is to detect cases of failure. The automatic detection in the GSDL is based on the n-gram method. The most frequent n-grams for Czech are presented. The whole process of automatic detection in the GSDL is described. The input documents to test collections are plain texts encoded in ISO-8859-1, ISO-8859-2 and Windows-1250. We manually evaluated the quality of automatic detection. The causes of errors include the predominance of an improper language model and incorrect switching to Windows-1250. We carried out further tests on documents that were more complex.
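    The n-gram ranking idea that underlies such detectors can be sketched as follows (this is not the GSDL implementation): character n-gram frequency profiles are built for each candidate language/encoding, and a document is assigned to the profile it matches best. The profile size and the rank-distance measure below are simplifying assumptions.

```python
from collections import Counter

def ngram_profile(text, n=3, top=300):
    """Most frequent character n-grams of a text, in rank order."""
    grams = Counter(text[i:i + n] for i in range(len(text) - n + 1))
    return [g for g, _ in grams.most_common(top)]

def rank_distance(profile, reference):
    """'Out-of-place' distance between two ranked n-gram profiles."""
    pos = {g: i for i, g in enumerate(reference)}
    penalty = len(reference)
    return sum(abs(i - pos.get(g, penalty)) for i, g in enumerate(profile))

def detect(text, references):
    """references: dict mapping a (language, encoding) label to a stored profile."""
    doc = ngram_profile(text)
    return min(references, key=lambda label: rank_distance(doc, references[label]))
```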

  8. 12th Portuguese Conference on Automatic Control

    CERN Document Server

    Soares, Filomena; Moreira, António

    2017-01-01

    The biennial CONTROLO conferences are the main events promoted by the Portuguese Association for Automatic Control – APCA, national member organization of the International Federation of Automatic Control – IFAC. CONTROLO 2016 – the 12th Portuguese Conference on Automatic Control, held in Guimarães, Portugal, September 14th to 16th – was organized by Algoritmi, School of Engineering, University of Minho, in partnership with INESC TEC. The seventy-five papers published in this volume cover a wide range of topics. Thirty-one of them, of a more theoretical nature, are distributed among the first five parts: Control Theory; Optimal and Predictive Control; Fuzzy, Neural and Genetic Control; Modeling and Identification; Sensing and Estimation. The papers go from cutting-edge theoretical research to innovative control applications and show expressively how Automatic Control can be used to increase the well-being of people.

  9. Some experimental results for an automatic helium liquefier

    International Nuclear Information System (INIS)

    Watanabe, T.; Kudo, T.; Kuraoka, Y.; Sakura, K.; Tsuruga, H.; Watanabe, T.

    1984-01-01

    This chapter describes the testing of an automatic cooldown system. The liquefying machine examined is a CTi Model 1400. The automatic helium gas liquefying system is operated by using sequence control with a programmable controller. The automatic mode is carried out by operation of two compressors. The monitoring system consists of 41 remote sensors. Liquid level is measured by a superconducting level meter. The J-T valve and return valve, which require precise control, are operated by pulse motors. The advantages of the automatic cooldown system are reduced operator manpower; smoothly changing temperatures and pressures, which keep the automation flow chart simple; and the capability for continuous liquefier operation.

  10. RESEARCH ON THE CONSTRUCTION OF REMOTE SENSING AUTOMATIC INTERPRETATION SYMBOL BIG DATA

    Directory of Open Access Journals (Sweden)

    Y. Gao

    2018-04-01

    Full Text Available Remote sensing automatic interpretation symbol (RSAIS) is an inexpensive and fast method of providing precise in-situ information for image interpretation and accuracy. This study designed a scientific and precise RSAIS data characterization method, as well as a distributed and cloud architecture massive data storage method. Additionally, it introduced an offline and online data update mode and a dynamic data evaluation mechanism, with the aim to create an efficient approach for RSAIS big data construction. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013–2015 based on the National Geographic Conditions Monitoring Project of China and then annually updated since the 2016 period. The RSAIS big data has proven to be a good method for large scale image interpretation and field validation. It is also notable that it has the potential to solve image automatic interpretation with the assistance of deep learning technology in the remote sensing big data era.

  11. Research on the Construction of Remote Sensing Automatic Interpretation Symbol Big Data

    Science.gov (United States)

    Gao, Y.; Liu, R.; Liu, J.; Cheng, T.

    2018-04-01

    Remote sensing automatic interpretation symbol (RSAIS) is an inexpensive and fast method of providing precise in-situ information for image interpretation and accuracy. This study designed a scientific and precise RSAIS data characterization method, as well as a distributed and cloud architecture massive data storage method. Additionally, it introduced an offline and online data update mode and a dynamic data evaluation mechanism, with the aim to create an efficient approach for RSAIS big data construction. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013-2015 based on the National Geographic Conditions Monitoring Project of China and then annually updated since the 2016 period. The RSAIS big data has proven to be a good method for large scale image interpretation and field validation. It is also notable that it has the potential to solve image automatic interpretation with the assistance of deep learning technology in the remote sensing big data era.

  12. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
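    The idea of finite model generation can be illustrated with a brute-force sketch, far simpler than the tool described above: enumerate all binary operation tables over a small domain and keep those satisfying a given set of equations, here associativity and commutativity as examples.

```python
from itertools import product

def finite_models(n, equations):
    """Yield all binary operation tables over {0,...,n-1} that satisfy the equations.
    An equation is a predicate taking (table, domain) as arguments."""
    domain = range(n)
    for flat in product(domain, repeat=n * n):
        op = [list(flat[i * n:(i + 1) * n]) for i in range(n)]
        if all(eq(op, domain) for eq in equations):
            yield op

associative = lambda op, d: all(op[op[x][y]][z] == op[x][op[y][z]]
                                for x in d for y in d for z in d)
commutative = lambda op, d: all(op[x][y] == op[y][x] for x in d for y in d)

# Example: enumerate the commutative semigroups of order 2.
# Brute force only works for tiny domains; real tools prune the search heavily.
models = list(finite_models(2, [associative, commutative]))
```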

  13. Grinding Parts For Automatic Welding

    Science.gov (United States)

    Burley, Richard K.; Hoult, William S.

    1989-01-01

    Rollers guide grinding tool along prospective welding path. Skatelike fixture holds rotary grinder or file for machining large-diameter rings or ring segments in preparation for welding. Operator grasps handles to push rolling fixture along part. Rollers maintain precise dimensional relationship so grinding wheel cuts precise depth. Fixture-mounted grinder machines surface to quality sufficient for automatic welding; manual welding with attendant variations and distortion not necessary. Developed to enable automatic welding of parts, manual welding of which resulted in weld bead permeated with microscopic fissures.

  14. 46 CFR 171.118 - Automatic ventilators and side ports.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Automatic ventilators and side ports. 171.118 Section 171.118 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SUBDIVISION AND STABILITY... Bulkhead or Weather Deck § 171.118 Automatic ventilators and side ports. (a) An automatic ventilator must...

  15. 30 CFR 75.1404 - Automatic brakes; speed reduction gear.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Automatic brakes; speed reduction gear. 75.1404... Automatic brakes; speed reduction gear. [Statutory Provisions] Each locomotive and haulage car used in an... permit automatic brakes, locomotives and haulage cars shall be subject to speed reduction gear, or other...

  16. Automatic Error Recovery in Robot Assembly Operations Using Reverse Execution

    DEFF Research Database (Denmark)

    Laursen, Johan Sund; Schultz, Ulrik Pagh; Ellekilde, Lars-Peter

    2015-01-01

    , in particular for small-batch productions. As an alternative, we propose a system for automatically handling certain classes of errors instead of preventing them. Specifically, we show that many operations can be automatically reversed. Errors can be handled through automatic reverse execution of the control...... program to a safe point, from which forward execution can be resumed. This paper describes the principles behind automatic reversal of robotic assembly operations, and experimentally demonstrates the use of a domain-specific language that supports automatic error handling through reverse execution. Our...

  17. A fast fiducial marker tracking model for fully automatic alignment in electron tomography

    KAUST Repository

    Han, Renmin; Zhang, Fa; Gao, Xin

    2017-01-01

    Automatic alignment, especially fiducial marker-based alignment, has become increasingly important due to the high demand of subtomogram averaging and the rapid development of large-field electron microscopy. Among the alignment steps, fiducial marker tracking is a crucial one that determines the quality of the final alignment. Yet, it is still a challenging problem to track the fiducial markers accurately and effectively in a fully automatic manner.In this paper, we propose a robust and efficient scheme for fiducial marker tracking. Firstly, we theoretically prove the upper bound of the transformation deviation of aligning the positions of fiducial markers on two micrographs by affine transformation. Secondly, we design an automatic algorithm based on the Gaussian mixture model to accelerate the procedure of fiducial marker tracking. Thirdly, we propose a divide-and-conquer strategy against lens distortions to ensure the reliability of our scheme. To our knowledge, this is the first attempt that theoretically relates the projection model with the tracking model. The real-world experimental results further support our theoretical bound and demonstrate the effectiveness of our algorithm. This work facilitates the fully automatic tracking for datasets with a massive number of fiducial markers.The C/C ++ source code that implements the fast fiducial marker tracking is available at https://github.com/icthrm/gmm-marker-tracking. Markerauto 1.6 version or later (also integrated in the AuTom platform at http://ear.ict.ac.cn/) offers a complete implementation for fast alignment, in which fast fiducial marker tracking is available by the

  18. A fast fiducial marker tracking model for fully automatic alignment in electron tomography

    KAUST Repository

    Han, Renmin

    2017-10-20

    Automatic alignment, especially fiducial marker-based alignment, has become increasingly important due to the high demand of subtomogram averaging and the rapid development of large-field electron microscopy. Among the alignment steps, fiducial marker tracking is a crucial one that determines the quality of the final alignment. Yet, it is still a challenging problem to track the fiducial markers accurately and effectively in a fully automatic manner.In this paper, we propose a robust and efficient scheme for fiducial marker tracking. Firstly, we theoretically prove the upper bound of the transformation deviation of aligning the positions of fiducial markers on two micrographs by affine transformation. Secondly, we design an automatic algorithm based on the Gaussian mixture model to accelerate the procedure of fiducial marker tracking. Thirdly, we propose a divide-and-conquer strategy against lens distortions to ensure the reliability of our scheme. To our knowledge, this is the first attempt that theoretically relates the projection model with the tracking model. The real-world experimental results further support our theoretical bound and demonstrate the effectiveness of our algorithm. This work facilitates the fully automatic tracking for datasets with a massive number of fiducial markers.The C/C ++ source code that implements the fast fiducial marker tracking is available at https://github.com/icthrm/gmm-marker-tracking. Markerauto 1.6 version or later (also integrated in the AuTom platform at http://ear.ict.ac.cn/) offers a complete implementation for fast alignment, in which fast fiducial marker tracking is available by the
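    The core geometric step in the scheme above, aligning tracked marker positions on two micrographs by a 2D affine transformation, can be sketched as a least-squares fit; the residual then plays the role of the transformation deviation that the theoretical bound refers to. This is a generic sketch with assumed inputs, not the Markerauto code.

```python
import numpy as np

def fit_affine_2d(src, dst):
    """Least-squares 2D affine transform mapping src -> dst.
    src, dst: (N, 2) arrays of corresponding fiducial marker positions."""
    n = len(src)
    # Design matrix for x' = a*x + b*y + tx and y' = c*x + d*y + ty.
    A = np.hstack([src, np.ones((n, 1))])
    params_x, _, _, _ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    params_y, _, _, _ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    return np.vstack([params_x, params_y, [0.0, 0.0, 1.0]])

def transformation_deviation(T, src, dst):
    """RMS residual of the affine alignment between matched marker sets."""
    pred = (T @ np.hstack([src, np.ones((len(src), 1))]).T).T[:, :2]
    return np.sqrt(np.mean(np.sum((pred - dst) ** 2, axis=1)))
```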

  19. Shaping electromagnetic waves using software-automatically-designed metasurfaces.

    Science.gov (United States)

    Zhang, Qian; Wan, Xiang; Liu, Shuo; Yuan Yin, Jia; Zhang, Lei; Jun Cui, Tie

    2017-06-15

    We present a fully digital procedure of designing reflective coding metasurfaces to shape reflected electromagnetic waves. The design procedure is completely automatic, controlled by a personal computer. In detail, the macro coding units of the metasurface are automatically divided into several types (e.g. two types for 1-bit coding, four types for 2-bit coding, etc.), and each type of the macro coding units is formed by discretely random arrangement of micro coding units. By combining an optimization algorithm and commercial electromagnetic software, the digital patterns of the macro coding units are optimized to possess constant phase difference for the reflected waves. The apertures of the designed reflective metasurfaces are formed by arranging the macro coding units with a certain coding sequence. To experimentally verify the performance, a coding metasurface is fabricated by automatically designing two digital 1-bit unit cells, which are arranged in array to constitute a periodic coding metasurface to generate the required four-beam radiations with specific directions. Two complicated functional metasurfaces with circularly- and elliptically-shaped radiation beams are realized by automatically designing 4-bit macro coding units, showing excellent performance of the automatic designs by software. The proposed method provides a smart tool to realize various functional devices and systems automatically.
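    The scattering of such a coding metasurface is commonly analysed with a simple array-factor model: each macro unit contributes a reflection phase of 0 or pi (for 1-bit coding), and the far field is the phased sum over the aperture. The sketch below uses this textbook model with an assumed element spacing; it is not the commercial-solver optimization loop described in the paper.

```python
import numpy as np

def array_factor(coding, d_over_lambda, theta, phi):
    """Far-field array factor of a 1-bit coding metasurface.
    coding: (M, N) matrix of 0/1 digits; element reflection phase = digit * pi.
    d_over_lambda: macro-unit period in wavelengths (assumed value).
    theta, phi: 1-D arrays of observation angles in radians (same length)."""
    M, N = coding.shape
    k_d = 2.0 * np.pi * d_over_lambda
    u = np.sin(theta) * np.cos(phi)          # direction cosines, shape (K,)
    v = np.sin(theta) * np.sin(phi)
    m = np.arange(M)[:, None, None]          # (M, 1, 1)
    n = np.arange(N)[None, :, None]          # (1, N, 1)
    phase = coding[:, :, None] * np.pi       # (M, N, 1)
    field = np.exp(1j * (phase + k_d * (m * u + n * v)))   # (M, N, K)
    return np.abs(field.sum(axis=(0, 1)))

# Example: a 010101... stripe coding sequence produces two symmetric reflected beams.
coding = np.tile([[0, 1]], (8, 4))
theta = np.linspace(0.0, np.pi / 2, 181)
pattern = array_factor(coding, 0.5, theta, np.zeros_like(theta))
```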

  20. Automatic trend estimation

    CERN Document Server

    Vamos¸, C˘alin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  1. Automatic tools for enhancing the collaborative experience in large projects

    International Nuclear Information System (INIS)

    Bourilkov, D; Rodriquez, J L

    2014-01-01

    With the explosion of big data in many fields, the efficient management of knowledge about all aspects of the data analysis gains in importance. A key feature of collaboration in large scale projects is keeping a log of what is being done and how - for private use, reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. Even better if the log is automatically created on the fly while the scientist or software developer is working in a habitual way, without the need for extra efforts. This saves time and enables a team to do more with the same resources. The CODESH - COllaborative DEvelopment SHell - and CAVES - Collaborative Analysis Versioning Environment System projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach for any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for analysis of Petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.

  2. Automatic control of commercial computer programs

    International Nuclear Information System (INIS)

    Rezvov, B.A.; Artem'ev, A.N.; Maevskij, A.G.; Demkiv, A.A.; Kirillov, B.F.; Belyaev, A.D.; Artem'ev, N.A.

    2010-01-01

    A method for the automatic control of commercial computer programs is presented. A connection was developed between the automation system of the EXAFS spectrometer (managed by a PC under DOS) and the commercial program for CCD detector control (managed by a PC under Windows). The described combined system is used to automate the processing of intermediate amplitude spectra in EXAFS spectrum measurements at the Kurchatov SR source.

  3. Automatic shadowing device for electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, F W; Bogitch, S

    1960-01-01

    For the past ten years in the laboratory of the Department of Nuclear Medicine and Radiation Biology at the University of California, and before that at Rochester, New York, every evaporation was done with the aid of an automatic shadowing device. For several months the automatic shadowing device has been available at the Atomic Bomb Casualty Commission (ABCC) Hiroshima, Japan with the modifications described. 1 reference.

  4. Approaches to identifying reservoir heterogeneity and reserve growth opportunities from subsurface data: The Oficina Formation, Budare field, Venezuela

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, D.S.; Raeuchle, S.K.; Holtz, M.H. [Bureau of Economic Geology, Austin, TX (United States)] [and others]

    1997-08-01

    We applied an integrated geologic, geophysical, and engineering approach devised to identify heterogeneities in the subsurface that might lead to reserve growth opportunities in our analysis of the Oficina Formation at Budare field, Venezuela. The approach involves 4 key steps: (1) Determine geologic reservoir architecture; (2) Investigate trends in reservoir fluid flow; (3) Integrate fluid flow trends with reservoir architecture; and (4) Estimate original oil-in-place, residual oil saturation, and remaining mobile oil, to identify opportunities for reserve growth. There are three main oil-producing reservoirs in the Oficina Formation that were deposited in a bed-load fluvial system, an incised valley-fill, and a barrier-strandplain system. Reservoir continuity is complex because, in addition to lateral facies variability, the major Oficina depositional systems were internally subdivided by high-frequency stratigraphic surfaces. These surfaces define times of intermittent lacustrine and marine flooding events that punctuated the fluvial and marginal marine sedimentation, respectively. Syn- and post-depositional faulting further disrupted reservoir continuity. Trends in fluid flow established from initial fluid levels, response to recompletion workovers, and pressure depletion data demonstrated barriers to lateral and vertical fluid flow caused by a combination of reservoir facies pinchout, flooding shale markers, and the faults. Considerable reserve growth potential exists at Budare field because the reservoir units are highly compartmentalized by the depositional heterogeneity and structural complexity. Numerous reserve growth opportunities were identified in attics updip of existing production, in untapped or incompletely drained compartments, and in field extensions.

  5. An Automatic Identification Procedure to Promote the use of FES-Cycling Training for Hemiparetic Patients

    Directory of Open Access Journals (Sweden)

    Emilia Ambrosini

    2014-01-01

    Full Text Available Cycling induced by Functional Electrical Stimulation (FES) training currently requires a manual setting of different parameters, which is a time-consuming and scarcely repeatable procedure. We proposed an automatic procedure for setting session-specific parameters optimized for hemiparetic patients. This procedure consisted of the identification of the stimulation strategy as the angular ranges during which FES drove the motion, the comparison between the identified strategy and the physiological muscular activation strategy, and the setting of the pulse amplitude and duration of each stimulated muscle. Preliminary trials on 10 healthy volunteers helped define the procedure. Feasibility tests on 8 hemiparetic patients (5 stroke, 3 traumatic brain injury) were performed. The procedure maximized the motor output within the tolerance constraint, identified a biomimetic strategy in 6 patients, and always lasted less than 5 minutes. Its reasonable duration and automatic nature make the procedure usable at the beginning of every training session, potentially enhancing the performance of FES-cycling training.

  6. Automatic extraction of discontinuity orientation from rock mass surface 3D point cloud

    Science.gov (United States)

    Chen, Jianqin; Zhu, Hehua; Li, Xiaojun

    2016-10-01

    This paper presents a new method for extracting discontinuity orientation automatically from rock mass surface 3D point cloud. The proposed method consists of four steps: (1) automatic grouping of discontinuity sets using an improved K-means clustering method, (2) discontinuity segmentation and optimization, (3) discontinuity plane fitting using Random Sample Consensus (RANSAC) method, and (4) coordinate transformation of discontinuity plane. The method is first validated by the point cloud of a small piece of a rock slope acquired by photogrammetry. The extracted discontinuity orientations are compared with measured ones in the field. Then it is applied to a publicly available LiDAR data of a road cut rock slope at Rockbench repository. The extracted discontinuity orientations are compared with the method proposed by Riquelme et al. (2014). The results show that the presented method is reliable and of high accuracy, and can meet the engineering needs.
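    Step (3) of the method, fitting a plane to each segmented discontinuity cluster with RANSAC and converting its normal into a dip direction and dip angle, can be sketched as follows. The inlier tolerance, iteration count, and the axis convention (x = east, y = north, z = up) are generic assumptions, not the authors' exact settings.

```python
import numpy as np

def ransac_plane(points, n_iter=500, inlier_tol=0.02, seed=None):
    """Fit a plane to (N, 3) points with a basic RANSAC loop.
    Returns the unit normal and a point on the best-supported plane."""
    rng = np.random.default_rng(seed)
    best_normal, best_point, best_count = None, None, -1
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                      # degenerate (collinear) sample
            continue
        normal /= norm
        dist = np.abs((points - sample[0]) @ normal)
        count = int((dist < inlier_tol).sum())
        if count > best_count:
            best_normal, best_point, best_count = normal, sample[0], count
    return best_normal, best_point

def dip_direction_and_dip(normal):
    """Convert a plane normal (x=east, y=north, z=up assumed) to dip direction/dip in degrees."""
    n = normal if normal[2] >= 0 else -normal     # make the normal point upward
    dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
    dip_direction = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return dip_direction, dip
```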

  7. Development of an automatic scaler

    International Nuclear Information System (INIS)

    He Yuehong

    2009-04-01

    A self-designed automatic scaler is introduced. A microcontroller LPC936 is used as the master chip in the scaler. A counter integrated with the micro-controller is configured to operate as an external pulse counter. Software employed in the scaler is based on an embedded real-time operating system kernel named Small RTOS. Data storage, calculation and some other functions are also provided. The scaler is designed as a low-cost, low-power-consumption solution. To date, the automatic scaler has been applied in a surface contamination instrument. (authors)

  8. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  9. A Unification of Inheritance and Automatic Program Specialization

    DEFF Research Database (Denmark)

    Schultz, Ulrik Pagh

    2004-01-01

    , inheritance is used to control the automatic application of program specialization to class members during compilation to obtain an efficient implementation. This paper presents the language JUST, which integrates object-oriented concepts, block structure, and techniques from automatic program specialization......The object-oriented style of programming facilitates program adaptation and enhances program genericness, but at the expense of efficiency. Automatic program specialization can be used to generate specialized, efficient implementations for specific scenarios, but requires the program...... to be structured appropriately for specialization and is yet another new concept for the programmer to understand and apply. We have unified automatic program specialization and inheritance into a single concept, and implemented this approach in a modified version of Java named JUST. When programming in JUST...

  10. Using automatic item generation to create multiple-choice test items.

    Science.gov (United States)

    Gierl, Mark J; Lai, Hollis; Turner, Simon R

    2012-08-01

    Many tests of medical knowledge, from the undergraduate level to the level of certification and licensure, contain multiple-choice items. Although these are efficient in measuring examinees' knowledge and skills across diverse content areas, multiple-choice items are time-consuming and expensive to create. Changes in student assessment brought about by new forms of computer-based testing have created the demand for large numbers of multiple-choice items. Our current approaches to item development cannot meet this demand. We present a methodology for developing multiple-choice items based on automatic item generation (AIG) concepts and procedures. We describe a three-stage approach to AIG and we illustrate this approach by generating multiple-choice items for a medical licensure test in the content area of surgery. To generate multiple-choice items, our method requires a three-stage process. Firstly, a cognitive model is created by content specialists. Secondly, item models are developed using the content from the cognitive model. Thirdly, items are generated from the item models using computer software. Using this methodology, we generated 1248 multiple-choice items from one item model. Automatic item generation is a process that involves using models to generate items using computer technology. With our method, content specialists identify and structure the content for the test items, and computer technology systematically combines the content to generate new test items. By combining these outcomes, items can be generated automatically. © Blackwell Publishing Ltd 2012.
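    The three-stage workflow can be illustrated with a small sketch: an item model (a stem template plus constrained content slots, stage 2) is expanded combinatorially by software into many candidate multiple-choice items (stage 3). The template, slot values, keyed-answer rule, and distractors below are invented for illustration and are not taken from the surgery item bank described in the paper.

```python
from itertools import product

# Stage 2: an illustrative item model (stem template plus content slots).
STEM = ("A {age}-year-old patient presents with {finding} after {context}. "
        "What is the most appropriate next step?")
SLOTS = {
    "age": ["25", "60"],
    "finding": ["right lower quadrant pain", "rebound tenderness"],
    "context": ["24 hours of nausea", "a recent appendectomy"],
}

def keyed_answer(values):
    # Hypothetical rule mapping a slot combination to the correct option.
    return "Urgent surgical consultation"

DISTRACTORS = ["Discharge with analgesia", "Repeat examination in one week",
               "Start empirical antibiotics only"]

# Stage 3: generate items by systematically combining the content.
def generate_items():
    for combo in product(*SLOTS.values()):
        values = dict(zip(SLOTS, combo))
        options = [keyed_answer(values)] + DISTRACTORS
        yield {"stem": STEM.format(**values), "options": options, "key": options[0]}

items = list(generate_items())   # 2 * 2 * 2 = 8 items from a single item model
```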

  11. Fusion of dynamic contrast-enhanced magnetic resonance mammography at 3.0 T with X-ray mammograms: Pilot study evaluation using dedicated semi-automatic registration software

    Energy Technology Data Exchange (ETDEWEB)

    Dietzel, Matthias, E-mail: dietzelmatthias2@hotmail.com [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany); Hopp, Torsten; Ruiter, Nicole [Karlsruhe Institute of Technology (KIT), Institute for Data Processing and Electronics, Postfach 3640, D-76021 Karlsruhe (Germany); Zoubi, Ramy [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany); Runnebaum, Ingo B. [Clinic of Gynecology and Obstetrics, Friedrich-Schiller-University Jena, Bachstrasse 18, D-07743 Jena (Germany); Kaiser, Werner A. [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany); Medical School, University of Harvard, 25 Shattuck Street, Boston, MA 02115 (United States); Baltzer, Pascal A.T. [Institute of Diagnostic and Interventional Radiology, Friedrich-Schiller-University Jena, Erlanger Allee 101, D-07740 Jena (Germany)

    2011-08-15

    Rationale and objectives: To evaluate the semi-automatic image registration accuracy of X-ray-mammography (XR-M) with high-resolution high-field (3.0 T) MR-mammography (MR-M) in an initial pilot study. Material and methods: MR-M was acquired on a high-field clinical scanner at 3.0 T (T1-weighted 3D VIBE {+-} Gd). XR-M was obtained with state-of-the-art full-field digital systems. Seven patients with clearly delineable mass lesions >10 mm both in XR-M and MR-M were enrolled (exclusion criteria: previous breast surgery; surgical intervention between XR-M and MR-M). XR-M and MR-M were matched using a dedicated image-registration algorithm allowing semi-automatic non-linear deformation of MR-M based on finite-element modeling. To identify registration errors (RE) a virtual craniocaudal 2D mammogram was calculated by the software from MR-M (with and w/o Gadodiamide/Gd) and matched with corresponding XR-M. To quantify REs, the geometric centers of the lesions in the virtual vs. conventional mammogram were subtracted. The robustness of registration was quantified by registration of X-MRs to both MR-Ms with and w/o Gadodiamide. Results: Image registration was performed successfully for all patients. Overall RE was 8.2 mm (1 min after Gd; confidence interval/CI: 2.0-14.4 mm, standard deviation/SD: 6.7 mm) vs. 8.9 mm (no Gd; CI: 4.0-13.9 mm, SD: 5.4 mm). The mean difference between pre- vs. post-contrast was 0.7 mm (SD: 1.9 mm). Conclusion: Image registration of high-field 3.0 T MR-mammography with X-ray-mammography is feasible. For this study applying a high-resolution protocol at 3.0 T, the registration was robust and the overall registration error was sufficient for clinical application.

  12. Fusion of dynamic contrast-enhanced magnetic resonance mammography at 3.0 T with X-ray mammograms: Pilot study evaluation using dedicated semi-automatic registration software

    International Nuclear Information System (INIS)

    Dietzel, Matthias; Hopp, Torsten; Ruiter, Nicole; Zoubi, Ramy; Runnebaum, Ingo B.; Kaiser, Werner A.; Baltzer, Pascal A.T.

    2011-01-01

    Rationale and objectives: To evaluate the semi-automatic image registration accuracy of X-ray-mammography (XR-M) with high-resolution high-field (3.0 T) MR-mammography (MR-M) in an initial pilot study. Material and methods: MR-M was acquired on a high-field clinical scanner at 3.0 T (T1-weighted 3D VIBE ± Gd). XR-M was obtained with state-of-the-art full-field digital systems. Seven patients with clearly delineable mass lesions >10 mm both in XR-M and MR-M were enrolled (exclusion criteria: previous breast surgery; surgical intervention between XR-M and MR-M). XR-M and MR-M were matched using a dedicated image-registration algorithm allowing semi-automatic non-linear deformation of MR-M based on finite-element modeling. To identify registration errors (RE) a virtual craniocaudal 2D mammogram was calculated by the software from MR-M (with and w/o Gadodiamide/Gd) and matched with corresponding XR-M. To quantify REs, the geometric centers of the lesions in the virtual vs. conventional mammogram were subtracted. The robustness of registration was quantified by registration of X-MRs to both MR-Ms with and w/o Gadodiamide. Results: Image registration was performed successfully for all patients. Overall RE was 8.2 mm (1 min after Gd; confidence interval/CI: 2.0-14.4 mm, standard deviation/SD: 6.7 mm) vs. 8.9 mm (no Gd; CI: 4.0-13.9 mm, SD: 5.4 mm). The mean difference between pre- vs. post-contrast was 0.7 mm (SD: 1.9 mm). Conclusion: Image registration of high-field 3.0 T MR-mammography with X-ray-mammography is feasible. For this study applying a high-resolution protocol at 3.0 T, the registration was robust and the overall registration error was sufficient for clinical application.

  13. Evaluation of automatic time gain compensated in-vivo ultrasound sequences

    DEFF Research Database (Denmark)

    Axelsen, Martin Christian; Røeboe, Kristian Frostholm; Hemmsen, Martin Christian

    2010-01-01

    algorithm for automatic time gain compensation (TGC) on in-vivo ultrasound sequences. Forty ultrasound sequences were recorded from the abdomen of two healthy volunteers. Each sequence of 5 sec was recorded with 40 frames/sec. Post processing each frame, a mask is created wherein anechoic and hyper echoic...... regions are mapped. Near field hyper intensity and deep areas with low signal strength are also included in the mask. The algorithm uses this mask to create a parallel image where anechoic and hyper echoic regions are eliminated. From this, the mean power is calculated as a function of depth. The power...
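    The quantity described, the mean received power as a function of depth computed only over ordinary tissue pixels after masking out anechoic, hyperechoic, near-field, and low-signal regions, can be sketched as below, together with a gain curve that pulls each depth toward a common target level. The variable names and the dB target are assumptions, not the published algorithm.

```python
import numpy as np

def depth_gain_curve(envelope, tissue_mask, target_db=-20.0, eps=1e-12):
    """envelope: (depth, lines) envelope-detected frame.
    tissue_mask: boolean array of the same shape, True where the pixel is
    ordinary tissue (anechoic/hyperechoic/near-field regions already excluded)."""
    power_db = 20.0 * np.log10(np.abs(envelope) + eps)
    masked = np.where(tissue_mask, power_db, np.nan)
    # Mean power per depth row, ignoring the masked-out pixels.
    mean_db = np.nanmean(masked, axis=1)
    # TGC gain that maps the mean tissue power at each depth to the target level.
    gain_db = target_db - mean_db
    return mean_db, gain_db

# Applying the compensation per frame: compensated_db = power_db + gain_db[:, None]
```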

  14. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphism) on genes related to drug metabolism. This may allow for instance to find genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack in the development of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow to test the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) to automatize the workflow of analysis of DMET-SNP data avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP), (iii) the association of SNP with pathway through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different case studies regarding the analysis of
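    The central statistical operation, testing whether a DMET SNP genotype is associated with a binary drug-response class, is typically a contingency-table test such as Fisher's exact test. The sketch below shows only this step, on an assumed genotype/response encoding; it is not the DMET-Analyzer code and omits annotation and multiple-testing correction.

```python
from scipy.stats import fisher_exact

def snp_association(genotypes, responders, carries_variant):
    """genotypes: genotype strings for one SNP, one per patient.
    responders: booleans (True = good drug response).
    carries_variant: predicate deciding whether a genotype carries the variant allele."""
    # 2x2 table: rows = carrier / non-carrier, columns = responder / non-responder.
    table = [[0, 0], [0, 0]]
    for g, r in zip(genotypes, responders):
        row = 0 if carries_variant(g) else 1
        col = 0 if r else 1
        table[row][col] += 1
    odds_ratio, p_value = fisher_exact(table)
    return table, odds_ratio, p_value

# Example with invented data for a hypothetical A/G SNP:
table, oddsr, p = snp_association(
    ["A/A", "A/G", "G/G", "A/G", "G/G", "A/A"],
    [True, True, False, False, False, True],
    carries_variant=lambda g: "A" in g)
```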

  15. Equipment for fully automatic radiographic pipe inspection

    International Nuclear Information System (INIS)

    Basler, G.; Sperl, H.; Weinschenk, K.

    1977-01-01

    The patent describes a device for fully automatic radiographic testing of large pipes with longitudinal welds. Furthermore the invention enables automatic marking of films in radiographic inspection with regard to a ticketing of the test piece and of that part of it where testing took place. (RW) [de

  16. Automatic recognition of conceptualization zones in scientific articles and two life science applications.

    Science.gov (United States)

    Liakata, Maria; Saha, Shyamasree; Dobnik, Simon; Batchelor, Colin; Rebholz-Schuhmann, Dietrich

    2012-04-01

    Scholarly biomedical publications report on the findings of a research investigation. Scientists use a well-established discourse structure to relate their work to the state of the art, express their own motivation and hypotheses and report on their methods, results and conclusions. In previous work, we have proposed ways to explicitly annotate the structure of scientific investigations in scholarly publications. Here we present the means to facilitate automatic access to the scientific discourse of articles by automating the recognition of 11 categories at the sentence level, which we call Core Scientific Concepts (CoreSCs). These include: Hypothesis, Motivation, Goal, Object, Background, Method, Experiment, Model, Observation, Result and Conclusion. CoreSCs provide the structure and context to all statements and relations within an article and their automatic recognition can greatly facilitate biomedical information extraction by characterizing the different types of facts, hypotheses and evidence available in a scientific publication. We have trained and compared machine learning classifiers (support vector machines and conditional random fields) on a corpus of 265 full articles in biochemistry and chemistry to automatically recognize CoreSCs. We have evaluated our automatic classifications against a manually annotated gold standard, and have achieved promising accuracies with 'Experiment', 'Background' and 'Model' being the categories with the highest F1-scores (76%, 62% and 53%, respectively). We have analysed the task of CoreSC annotation both from a sentence classification as well as sequence labelling perspective and we present a detailed feature evaluation. The most discriminative features are local sentence features such as unigrams, bigrams and grammatical dependencies while features encoding the document structure, such as section headings, also play an important role for some of the categories. We discuss the usefulness of automatically generated Core
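    Treated purely as sentence classification, the task can be sketched with a linear SVM over unigram and bigram features, as below. The real system additionally uses grammatical dependencies, section headings, and a CRF sequence model, and the tiny training set here is invented for illustration only.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Invented toy examples; the corpus in the paper contains 265 full articles.
sentences = [
    "We hypothesise that the enzyme is inhibited at low pH.",
    "Samples were incubated for 2 h at 37 degrees C.",
    "The peak intensity decreased by 40% after treatment.",
    "These data suggest a conformational change upon binding.",
]
labels = ["Hypothesis", "Experiment", "Result", "Conclusion"]

# Unigram + bigram features feeding a linear SVM sentence classifier.
classifier = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), lowercase=True),
    LinearSVC(),
)
classifier.fit(sentences, labels)
predicted = classifier.predict(["The mixture was centrifuged at 5000 g."])
```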

  17. Toward automatic regional analysis of pulmonary function using inspiration and expiration thoracic CT

    DEFF Research Database (Denmark)

    Murphy, Keelin; Pluim, Josien P. W.; Rikxoort, Eva M. van

    2012-01-01

    and its results; (b) verify that the quantitative, regional ventilation measurements acquired through CT are meaningful for pulmonary function analysis; (c) identify the most effective of the calculated measurements in predicting pulmonary function; and (d) demonstrate the potential of the system...... disorder). Lungs, fissures, airways, lobes, and vessels are automatically segmented in both scans and the expiration scan is registered with the inspiration scan using a fully automatic nonrigid registration algorithm. Segmentations and registrations are examined and scored by expert observers to analyze...... to have good correlation with spirometry results, with several having correlation coefficients, r, in the range of 0.85–0.90. The best performing kNN classifier succeeded in classifying 67% of subjects into the correct COPD GOLD stage, with a further 29% assigned to a class neighboring the correct one...

  18. Fast automatic analysis of antenatal dexamethasone on micro-seizure activity in the EEG

    International Nuclear Information System (INIS)

    Rastin, S.J.; Unsworth, C.P.; Bennet, L.

    2010-01-01

    Full text: In this work we develop an automatic scheme for studying the effect of antenatal Dexamethasone on EEG activity. To do so, an FFT (Fast Fourier Transform) based detector was designed and applied to the EEG recordings obtained from two groups of fetal sheep. Both groups received two injections with a time delay of 24 h between them. However, the applied medicine was different for each group (Dex and saline). The detector developed was used to automatically identify and classify micro-seizures that occurred in the frequency bands corresponding to the EEG transients known as slow waves (2.5–14 Hz). For each second of the data recordings the spectrum was computed and the rise of the energy in each predefined frequency band was then counted when the energy level exceeded a predefined corresponding threshold level (where the threshold level was obtained from the long-term average of the spectral points at each band). Our results demonstrate that it was possible to automatically count the micro-seizures for the three different bands in a time-effective manner. It was found that the number of transients did not strongly depend on the nature of the injected medicine, which was consistent with the results manually obtained by an EEG expert. In conclusion, the automatic detection scheme presented here would allow for rapid micro-seizure event identification of hours of highly sampled EEG data, thus providing a valuable time-saving device.
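    The detector described, an FFT over each one-second epoch with band energies compared against thresholds derived from the long-term average spectrum, can be sketched as follows. The sampling rate, band edges, and threshold factor are placeholders, since the abstract's own band limits are only partially legible.

```python
import numpy as np

def count_micro_seizures(eeg, fs, bands, threshold_factor=3.0):
    """eeg: 1-D signal; fs: sampling rate (Hz); bands: list of (low, high) Hz tuples.
    Counts, per band, the one-second epochs whose band energy exceeds
    threshold_factor times the long-term average energy in that band."""
    n = int(fs)                                        # samples per 1-second epoch
    epochs = eeg[: len(eeg) // n * n].reshape(-1, n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectra = np.abs(np.fft.rfft(epochs, axis=1)) ** 2
    counts = {}
    for low, high in bands:
        sel = (freqs >= low) & (freqs < high)
        band_energy = spectra[:, sel].sum(axis=1)
        threshold = threshold_factor * band_energy.mean()   # long-term average
        counts[(low, high)] = int((band_energy > threshold).sum())
    return counts

# Example call with placeholder bands (Hz):
# counts = count_micro_seizures(eeg, fs=256, bands=[(0.5, 2.5), (2.5, 14.0), (14.0, 30.0)])
```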

  19. Development of portable and automatic smear sampler for measuring surface contamination

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Bong Jae; Chang, S. Y.; Kim, B. H.; Kim, J. S.; Lee, H. S.; Kim, C. K.; Swol, C. W

    2000-12-01

    In measuring radioactive contamination in the radiation controlled area of a nuclear facility, the technical criteria for evaluating the surface contamination level and for sampling by the indirect method were established in this paper. Radioactive materials are always present on surfaces within operating nuclear facilities. Because of the interfering radiation field, health physicists take smear samples to monitor surface contamination according to a routine monitoring program. However, this manual approach suffers from large errors and variability between personnel. To solve these problems, a portable and automatic smear sampling apparatus was designed and fabricated.

  20. Development of portable and automatic smear sampler for measuring surface contamination

    International Nuclear Information System (INIS)

    Lee, Bong Jae; Chang, S. Y.; Kim, B. H.; Kim, J. S.; Lee, H. S.; Kim, C. K.; Swol, C. W.

    2000-12-01

    In measuring radioactive contamination in the radiation controlled area of a nuclear facility, the technical criteria for evaluating the surface contamination level and for sampling by the indirect method were established in this paper. Radioactive materials are always present on surfaces within operating nuclear facilities. Because of the interfering radiation field, health physicists take smear samples to monitor surface contamination according to a routine monitoring program. However, this manual approach suffers from large errors and variability between personnel. To solve these problems, a portable and automatic smear sampling apparatus was designed and fabricated.

  1. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping.

    Science.gov (United States)

    Sadeghi-Tehran, Pouria; Virlet, Nicolas; Sabermanesh, Kasra; Hawkesford, Malcolm J

    2017-01-01

    Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when applied to images acquired in dynamic field environments. In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with the state-of-the-art and other learning methods on digital images. All methods are compared and evaluated under different environmental conditions and the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements and (4) an estimation of performance along the full life cycle of a wheat canopy. The method described is capable of coping with the environmental challenges faced in field conditions, with high levels of adaptiveness and without the need for adjusting a threshold for each digital image. The proposed method is also an ideal candidate to process a time series of phenotypic information throughout the crop growth acquired in the field. Moreover, the introduced method has an advantage that it is not limited to growth measurements only but can be applied to other applications such as identifying weeds, diseases, stress, etc.
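    A compact way to see what a multi-feature pixel classifier involves is sketched below: several colour features (including the widely used excess-green index) are stacked per pixel and fed to a learned classifier, avoiding a single hand-tuned threshold per image. The feature set and classifier choice are generic assumptions, not the specific design of the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def pixel_features(rgb):
    """rgb: (H, W, 3) float image in [0, 1]. Returns an (H*W, n_features) matrix."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-8
    exg = 2 * g / total - r / total - b / total      # excess-green index
    features = np.stack([r, g, b, exg, g - r, g - b], axis=-1)
    return features.reshape(-1, features.shape[-1])

def train_segmenter(images, masks):
    """images: list of RGB arrays; masks: matching boolean vegetation masks."""
    X = np.vstack([pixel_features(im) for im in images])
    y = np.concatenate([m.ravel() for m in masks])
    clf = RandomForestClassifier(n_estimators=50, n_jobs=-1)
    clf.fit(X, y)
    return clf

def segment(clf, image):
    """Per-pixel vegetation/background prediction reshaped to image size."""
    return clf.predict(pixel_features(image)).reshape(image.shape[:2])
```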

  2. 30 CFR 75.1403-4 - Criteria-Automatic elevators.

    Science.gov (United States)

    2010-07-01

    ... appropriate on automatic elevators which will automatically shut-off the power and apply the brakes in the... telephone or other effective communication system by which aid or assistance can be obtained promptly. ...

  3. Automatic positioning control device for automatic control rod exchanger

    International Nuclear Information System (INIS)

    Nasu, Seiji; Sasaki, Masayoshi.

    1982-01-01

    Purpose: To attain accurate positioning for a control rod exchanger. Constitution: The present position for an automatic control rod exchanger is detected by a synchro generator. An aimed stopping position for the exchanger, a stop instruction range depending on the operational delay in the control system and the inertia-running distance of the mechanical system, and a coincidence confirmation range depending on the required positioning accuracy are previously set. If there is a difference between the present position and the aimed stopping position, the automatic exchanger is caused to run toward the aimed stopping position. A stop instruction is generated upon arrival at the position within said stop instruction range, and a coincidence confirmation signal is generated upon arrival at the position within the coincidence confirmation range. Since uncertain factors such as operation delay in the control system and the inertia-running distance of the mechanical system that influence the positioning accuracy are made definite by the method of actual measurement or the like and the stop instruction range and the coincidence confirmation range are set based on the measured data, the accuracy for the positioning can be improved. (Ikeda, J.)
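    The positioning logic described, running toward the target, issuing the stop instruction inside a pre-set stop-instruction range that absorbs control delay and inertial over-run, then confirming coincidence inside a tighter range, can be sketched as a simple per-cycle check. All range values below are placeholders for illustration.

```python
def positioning_step(present, target, stop_range, coincidence_range):
    """Return the command for one control cycle of the exchanger drive.
    stop_range accounts for control-system delay plus inertial running distance;
    coincidence_range reflects the required positioning accuracy (values are assumptions)."""
    error = target - present
    if abs(error) <= coincidence_range:
        return "COINCIDENCE_CONFIRMED"      # positioning complete
    if abs(error) <= stop_range:
        return "STOP"                       # stop now; inertia carries the carriage to the target
    return "RUN_FORWARD" if error > 0 else "RUN_REVERSE"

# Example: stop 12 mm early to absorb delay and inertia, accept +/-2 mm as coincidence.
command = positioning_step(present=4985.0, target=5000.0,
                           stop_range=12.0, coincidence_range=2.0)
```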

  4. DAF: differential ACE filtering image quality assessment by automatic color equalization

    Science.gov (United States)

    Ouni, S.; Chambah, M.; Saint-Jean, C.; Rizzi, A.

    2008-01-01

    Ideally, a quality assessment system would perceive and measure image or video impairments just like a human being. But in reality, objective quality metrics do not necessarily correlate well with perceived quality [1]. Plus, some measures assume that there exists a reference in the form of an "original" to compare to, which prevents their usage in digital restoration field, where often there is no reference to compare to. That is why subjective evaluation is the most used and most efficient approach up to now. But subjective assessment is expensive, time consuming and does not respond, hence, to the economic requirements [2,3]. Thus, reliable automatic methods for visual quality assessment are needed in the field of digital film restoration. The ACE method, for Automatic Color Equalization [4,6], is an algorithm for digital images unsupervised enhancement. It is based on a new computational approach that tries to model the perceptual response of our vision system merging the Gray World and White Patch equalization mechanisms in a global and local way. Like our vision system ACE is able to adapt to widely varying lighting conditions, and to extract visual information from the environment efficaciously. Moreover ACE can be run in an unsupervised manner. Hence it is very useful as a digital film restoration tool since no a priori information is available. In this paper we deepen the investigation of using the ACE algorithm as a basis for a reference free image quality evaluation. This new metric called DAF for Differential ACE Filtering [7] is an objective quality measure that can be used in several image restoration and image quality assessment systems. In this paper, we compare on different image databases, the results obtained with DAF and with some subjective image quality assessments (Mean Opinion Score MOS as measure of perceived image quality). We study also the correlation between objective measure and MOS. In our experiments, we have used for the first image

  5. Development of an automatic sampling device for the continuous measurement of atmospheric carbonyls compounds

    International Nuclear Information System (INIS)

    Perraud, V.

    2007-12-01

    Two sampling strategies were studied to develop an automatic instrument for the continuous measurement of atmospheric carbonyl compounds. Because of its specificity towards carbonyl compounds, sampling by transfer of the gaseous phase into a liquid phase, associated with a simultaneous chemical derivatization of the trapped compounds, was first studied. However, this method does not allow a quantitative sampling of all studied carbonyl compounds, nor a continuous measurement in the field. To overcome these difficulties, a second strategy was investigated: cryogenic adsorption onto a solid adsorbent followed by thermodesorption and direct analysis by GC/MS. Collection efficiency using different solid adsorbents was found to be greater than 95% for carbonyl compounds consisting of 1 to 7 carbons. This work is a successful first step towards the realization of the automatic sampling device for a continuous measurement of atmospheric carbonyl compounds. (author)

  6. Automatic verification of step-and-shoot IMRT field segments using portal imaging

    International Nuclear Information System (INIS)

    Woo, M.K.; Lightstone, A.W.; Shan, G.; Kumaraswamy, L.; Li, Y.

    2003-01-01

    In step-and-shoot IMRT, many individual beam segments are delivered. These segments are generated by the IMRT treatment planning system and subsequently transmitted electronically through computer hardware and software modules before they are finally delivered. Hence, an independent system that monitors the actual field shape during treatment delivery is an added level of quality assurance in this complicated process. In this paper we describe the development and testing of such a system. The system verifies the field shape by comparing the radiation field detected by the built-in portal imaging system on the linac to the actual field shape planned on the treatment planning system. The comparison is based on a software algorithm that detects the leaf edge positions of the radiation field on the portal image and compares that to the calculated positions. The process is fully automated and requires minimal intervention of the radiation therapists. The system has been tested with actual clinical plan sequences and was able to alert the operator of incorrect settings in real time
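    The verification step reduces to comparing detected leaf-edge positions against the planned MLC positions within a tolerance, per segment. The sketch below shows only this comparison on assumed data structures; edge detection from the portal image and unit calibration are outside its scope, and the tolerance is a placeholder.

```python
import numpy as np

def verify_segment(planned, detected, tolerance_mm=2.0):
    """planned, detected: (n_leaf_pairs, 2) arrays of leaf positions in mm at the
    isocentre plane (columns: left bank, right bank). Tolerance is an assumption."""
    deviation = np.abs(np.asarray(detected) - np.asarray(planned))
    failing = np.argwhere(deviation > tolerance_mm)
    return {
        "max_deviation_mm": float(deviation.max()),
        "pass": failing.size == 0,
        "failing_leaves": [(int(i), "left" if j == 0 else "right") for i, j in failing],
    }

# A monitoring loop would call verify_segment for every delivered segment and
# alert the radiation therapists in real time if any segment fails.
```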

  7. A simple semi-automatic approach for land cover classification from multispectral remote sensing imagery.

    Directory of Open Access Journals (Sweden)

    Dong Jiang

    Full Text Available Land cover data represent a fundamental data source for various types of scientific research. The classification of land cover based on satellite data is a challenging task, and an efficient classification method is needed. In this study, an automatic scheme is proposed for the classification of land use using multispectral remote sensing images based on change detection and a semi-supervised classifier. The satellite image can be automatically classified using only the prior land cover map and existing images; therefore human involvement is reduced to a minimum, ensuring the operability of the method. The method was tested in the Qingpu District of Shanghai, China. Using Environment Satellite 1 (HJ-1) images of 2009 with 30 m spatial resolution, the areas were classified into five main types of land cover based on previous land cover data and spectral features. The results agreed well with validation land cover maps, with a Kappa value of 0.79 and statistical area proportion biases of less than 6%. This study proposed a simple semi-automatic approach for land cover classification by using prior maps with satisfactory accuracy, which integrated the accuracy of visual interpretation and the performance of automatic classification methods. The method can be used for land cover mapping in areas lacking ground reference information or for identifying rapid variation of land cover regions (such as rapid urbanization) with convenience.
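    The accuracy figure quoted (Kappa = 0.79) is Cohen's kappa between the automatically produced map and the reference land cover map; a minimal computation from paired class labels is sketched below.

```python
import numpy as np

def cohens_kappa(reference, predicted, n_classes):
    """reference, predicted: 1-D integer class labels for the same validation pixels."""
    cm = np.zeros((n_classes, n_classes), dtype=float)
    for r, p in zip(reference, predicted):
        cm[r, p] += 1
    total = cm.sum()
    observed = np.trace(cm) / total                       # overall agreement
    expected = (cm.sum(axis=1) @ cm.sum(axis=0)) / total ** 2   # chance agreement
    return (observed - expected) / (1.0 - expected)

# Example with the five land cover classes of the study, on validation samples:
# kappa = cohens_kappa(reference_labels, map_labels, n_classes=5)
```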

  8. Automatic Operation For A Robot Lawn Mower

    Science.gov (United States)

    Huang, Y. Y.; Cao, Z. L.; Oh, S. J.; Kattan, E. U.; Hall, E. L.

    1987-02-01

    A domestic mobile robot, a lawn mower that performs in an automatic operation mode, has been built at the Center of Robotics Research, University of Cincinnati. The robot lawn mower automatically completes its work with the region filling operation, a new kind of path planning for mobile robots. Some strategies for region filling of path planning have been developed for a partly known or an unknown environment. Also, an advanced omnidirectional navigation system and a multisensor-based control system are used in the automatic operation. Research on the robot lawn mower, especially on the region filling of path planning, is significant in industrial and agricultural applications.

  9. Automatic face morphing for transferring facial animation

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Bui, T.D.; Poel, Mannes; Heylen, Dirk K.J.; Nijholt, Antinus; Hamza, H.M.

    2003-01-01

    In this paper, we introduce a novel method of automatically finding the training set of RBF networks for morphing a prototype face to represent a new face. This is done by automatically specifying and adjusting corresponding feature points on a target face. The RBF networks are then used to transfer

  10. Automatic needle insertion diminishes pain during growth hormone injection

    DEFF Research Database (Denmark)

    Main, K M; Jørgensen, J T; Hertel, N T

    1995-01-01

    prototype pens for GH administration, providing either manual or automatic sc needle insertion, using a combined visual analogue/facial scale and a five-item scale in 18 children. With the automatic pen there was a significantly lower maximum pain score compared with the manual pen (median 28.5 versus 52.......0 mm) as well as a lower mean pain score (mean 13.7 versus 23.5 mm). The five-item scale revealed that automatic needle insertion was significantly less painful than manual insertion and 13 patients chose to continue treatment with the automatic pen. In conclusion, pain during GH injection can...

  11. An automatic method to generate domain-specific investigator networks using PubMed abstracts

    Directory of Open Access Journals (Sweden)

    Gwinn Marta

    2007-06-01

    Full Text Available Abstract Background Collaboration among investigators has become critical to scientific research. This includes ad hoc collaboration established through personal contacts as well as formal consortia established by funding agencies. Continued growth in online resources for scientific research and communication has promoted the development of highly networked research communities. Extending these networks globally requires identifying additional investigators in a given domain, profiling their research interests, and collecting current contact information. We present a novel strategy for building investigator networks dynamically and producing detailed investigator profiles using data available in PubMed abstracts. Results We developed a novel strategy to obtain detailed investigator information by automatically parsing the affiliation string in PubMed records. We illustrated the results by using a published literature database in human genome epidemiology (HuGE Pub Lit) as a test case. Our parsing strategy extracted country information from 92.1% of the affiliation strings in a random sample of PubMed records and in 97.0% of HuGE records, with accuracies of 94.0% and 91.0%, respectively. Institution information was parsed from 91.3% of the general PubMed records (accuracy 86.8%) and from 94.2% of HuGE PubMed records (accuracy 87.0). We demonstrated the application of our approach to dynamic creation of investigator networks by creating a prototype information system containing a large database of PubMed abstracts relevant to human genome epidemiology (HuGE Pub Lit), indexed using PubMed medical subject headings converted to Unified Medical Language System concepts. Our method was able to identify 70–90% of the investigators/collaborators in three different human genetics fields; it also successfully identified 9 of 10 genetics investigators within the PREBIC network, an existing preterm birth research network. Conclusion We successfully created a

  12. An automatic method to generate domain-specific investigator networks using PubMed abstracts

    Science.gov (United States)

    Yu, Wei; Yesupriya, Ajay; Wulf, Anja; Qu, Junfeng; Gwinn, Marta; Khoury, Muin J

    2007-01-01

    Background Collaboration among investigators has become critical to scientific research. This includes ad hoc collaboration established through personal contacts as well as formal consortia established by funding agencies. Continued growth in online resources for scientific research and communication has promoted the development of highly networked research communities. Extending these networks globally requires identifying additional investigators in a given domain, profiling their research interests, and collecting current contact information. We present a novel strategy for building investigator networks dynamically and producing detailed investigator profiles using data available in PubMed abstracts. Results We developed a novel strategy to obtain detailed investigator information by automatically parsing the affiliation string in PubMed records. We illustrated the results by using a published literature database in human genome epidemiology (HuGE Pub Lit) as a test case. Our parsing strategy extracted country information from 92.1% of the affiliation strings in a random sample of PubMed records and in 97.0% of HuGE records, with accuracies of 94.0% and 91.0%, respectively. Institution information was parsed from 91.3% of the general PubMed records (accuracy 86.8%) and from 94.2% of HuGE PubMed records (accuracy 87.0). We demonstrated the application of our approach to dynamic creation of investigator networks by creating a prototype information system containing a large database of PubMed abstracts relevant to human genome epidemiology (HuGE Pub Lit), indexed using PubMed medical subject headings converted to Unified Medical Language System concepts. Our method was able to identify 70–90% of the investigators/collaborators in three different human genetics fields; it also successfully identified 9 of 10 genetics investigators within the PREBIC network, an existing preterm birth research network. Conclusion We successfully created a web-based prototype
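
    A toy illustration of the affiliation-parsing step is given below; the tokenisation rules and the country list are assumptions for demonstration, not the authors' actual parsing strategy.

        import re

        COUNTRIES = {"USA", "United States", "United Kingdom", "China",
                     "Netherlands", "Denmark", "Germany"}          # toy subset

        def parse_affiliation(affiliation):
            """Guess (institution, country) from a PubMed-style affiliation string."""
            affiliation = re.sub(r"\S+@\S+", "", affiliation)       # drop e-mail addresses
            parts = [p.strip(" .") for p in affiliation.split(",") if p.strip(" .")]
            institution = parts[0] if parts else None
            country = next((p for p in reversed(parts) if p in COUNTRIES), None)
            return institution, country

        print(parse_affiliation(
            "Department of Human Genetics, Example University Medical Center, "
            "Amsterdam, Netherlands."))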

  13. Learning algorithms and automatic processing of languages; Algorithmes a apprentissage et traitement automatique des langues

    Energy Technology Data Exchange (ETDEWEB)

    Fluhr, Christian Yves Andre

    1977-06-15

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to the automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to describe how these mechanisms are simulated on a computer, and outlines the specific role of learning in various manifestations of intelligence. Then, based on the theory of Markov algorithms, the author discusses the notion of a learning algorithm. Two main types of learning algorithms are then addressed: first, a sanction-based algorithm of the 'algorithm-teacher dialogue' type, which aims at learning how to resolve grammatical ambiguities in submitted texts; second, an algorithm related to a document system which automatically structures semantic data obtained from a set of texts in order to be able to respond, by references, to any question on the content of these texts.

  14. Exposure to violent video games increases automatic aggressiveness.

    Science.gov (United States)

    Uhlmann, Eric; Swanson, Jane

    2004-02-01

    The effects of exposure to violent video games on automatic associations with the self were investigated in a sample of 121 students. Playing the violent video game Doom led participants to associate themselves with aggressive traits and actions on the Implicit Association Test. In addition, self-reported prior exposure to violent video games predicted automatic aggressive self-concept, above and beyond self-reported aggression. Results suggest that playing violent video games can lead to the automatic learning of aggressive self-views.

  15. MadEvent: automatic event generation with MadGraph

    International Nuclear Information System (INIS)

    Maltoni, Fabio; Stelzer, Tim

    2003-01-01

    We present a new multi-channel integration method and its implementation in the multi-purpose event generator MadEvent, which is based on MadGraph. Given a process, MadGraph automatically identifies all the relevant subprocesses, generates both the amplitudes and the mappings needed for an efficient integration over the phase space, and passes them to MadEvent. As a result, a process-specific, stand-alone code is produced that allows the user to calculate cross sections and produce unweighted events in a standard output format. Several examples are given for processes that are relevant for physics studies at present and forthcoming colliders. (author)

  16. Automatic Knowledge Extraction and Knowledge Structuring for a National Term Bank

    DEFF Research Database (Denmark)

    Lassen, Tine; Madsen, Bodil Nistrup; Erdman Thomsen, Hanne

    2011-01-01

    This paper gives an introduction to the plans and ongoing work in a project, the aim of which is to develop methods for automatic knowledge extraction and automatic construction and updating of ontologies. The project also aims at developing methods for automatic merging of terminological data from various existing sources, as well as methods for target group oriented knowledge dissemination. In this paper, we mainly focus on the plans for automatic knowledge extraction and knowledge structuring that will result in ontologies for a national term bank.

  17. Development of automatic ultrasonic testing system and its application

    International Nuclear Information System (INIS)

    Oh, Sang Hong; Matsuura, Toshihiko; Iwata, Ryusuke; Nakagawa, Michio; Horikawa, Kohsuke; Kim, You Chul

    1997-01-01

    Radiographic testing (RT) has usually been applied as the nondestructive test to detect internal defects at the welded joints of a penstock. In cases where RT could not be applied, ultrasonic testing (UT) was performed instead. UT was generally carried out by manual scanning, and the inspection data were recorded by the inspector on site, so, as a weak point, there were no objective inspection records corresponding to the RT films. An automatic ultrasonic testing system capable of automatic scanning and automatic recording was therefore needed, and such a system has now been developed. Test results obtained with the newly developed automatic ultrasonic testing system on the circumferential welded joints of a penstock at a site are presented in this paper.

  18. Effective field theory for NN interactions

    International Nuclear Information System (INIS)

    Tran Duy Khuong; Vo Hanh Phuc

    2003-01-01

    The effective field theory of NN interactions is formulated and the power counting appropriate to this case is reviewed. It is more subtle than in most effective field theories since, in the limit that the S-wave NN scattering lengths go to infinity, it is governed by a nontrivial fixed point. The leading two-body terms in the effective field theory for nucleon self-interactions are scale invariant and invariant under Wigner SU(4) spin-isospin symmetry in this limit. Higher-body terms with no derivatives (i.e. three- and four-body terms) are automatically invariant under Wigner symmetry. (author)

  19. The development of auto-sealing system for field joints of polyethylene coated pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Okano, Yoshihiro [NKK Corp., Tsu, Mie (Japan); Shoji, Norio [NKK Corp., Yokohama (Japan); Namioka, Toshiyuki [Nippon Kokan Koji Corp., Osaka (Japan); Komura, Minoru [Nitto Denko Corp., Fukaya, Saitama (Japan)

    1997-08-01

    The paper describes the development of a system to create high quality, automatic sealing of field joints of polyethylene coated pipelines. The system uses a combination of electrically heated shrink sleeves and a low pressure chamber. The self-heating shrink sleeves include electric wires and heat themselves when connected to electricity. A method was developed to eliminate air trapped between the sleeve and steel pipe by shrinking the sleeves under low pressure. The low pressure condition was automatically and easily attained by using a vacuum chamber. The authors verified that the system produces high quality sealing of field joints.

  20. Automatic welding of stainless steel tubing

    Science.gov (United States)

    Clautice, W. E.

    1978-01-01

    The use of automatic welding for making girth welds in stainless steel tubing was investigated, as well as the reduction in fabrication costs resulting from the elimination of radiographic inspection. Test methodology, materials, and techniques are discussed, and data sheets for individual tests are included. Process variables studied include welding amperes, revolutions per minute, and shielding gas flow. Strip chart recordings, as a definitive method of ensuring weld quality, are studied. Test results, determined by both radiographic and visual inspection, are presented and indicate that once optimum welding procedures for specific sizes of tubing are established, and the welding machine operations are certified, the automatic tube welding process produces good quality welds repeatedly, with a high degree of reliability. Revised specifications for welding tubing using the automatic process and weld visual inspection requirements at the Kennedy Space Center are enumerated.

  1. Automatic change detection to facial expressions in adolescents

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Jiannong, Shi

    2016-01-01

    Adolescence is a critical period for the neurodevelopment of social-emotional processing, wherein the automatic detection of changes in facial expressions is crucial for the development of interpersonal communication. Two groups of participants (an adolescent group and an adult group) were...... in facial expressions between the two age groups. The current findings demonstrated that the adolescent group featured more negative vMMN amplitudes than the adult group in the fronto-central region during the 120–200 ms interval. During the time window of 370–450 ms, only the adult group showed better...... automatic processing of fearful faces than happy faces. The present study indicated that adolescents possess stronger automatic detection of changes in emotional expression relative to adults, and sheds light on the neurodevelopment of automatic processes concerning social-emotional information....

  2. Automatic Adviser on stationary devices status identification and anticipated change

    Science.gov (United States)

    Shabelnikov, A. N.; Liabakh, N. N.; Gibner, Ya M.; Pushkarev, E. A.

    2018-05-01

    A task is defined to synthesize an Automatic Adviser that identifies the status of the automation systems' stationary devices using an autoregressive model of the changes in their key parameters. The choice of model type is justified, and an algorithm for monitoring the research objects was developed. A complex for simulating the operation status of mobile objects and analysing the prediction results was proposed. Research results are illustrated using the specific example of a hump yard compressor station. The work was supported by the Russian Fundamental Research Fund, project No. 17-20-01040.
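
    A minimal sketch of the autoregressive idea behind such an adviser is shown below (an order-2 AR model fitted by least squares with numpy); the model order, the synthetic parameter history and the forecasting helper are assumptions, not the published system.

        import numpy as np

        def fit_ar(series, order=2):
            """Least-squares fit of x_t = a1*x_(t-1) + ... + ap*x_(t-p)."""
            series = np.asarray(series, dtype=float)
            n = len(series)
            # column k of the design matrix holds the lag-(k+1) values
            X = np.column_stack([series[order - k - 1: n - k - 1] for k in range(order)])
            y = series[order:]
            coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coeffs

        def one_step_forecast(series, coeffs):
            lags = np.asarray(series)[-1: -len(coeffs) - 1: -1]   # most recent value first
            return float(np.dot(coeffs, lags))

        history = np.array([5.0, 5.1, 5.3, 5.2, 5.4, 5.5, 5.6, 5.8, 5.7, 5.9])  # key parameter
        a = fit_ar(history, order=2)
        print("AR coefficients:", a, "next-step forecast:", one_step_forecast(history, a))

    A status change would then be flagged when the observed value drifts away from such a forecast by more than a chosen tolerance.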

  3. Automatic Control of Silicon Melt Level

    Science.gov (United States)

    Duncan, C. S.; Stickel, W. B.

    1982-01-01

    A new circuit, when combined with melt-replenishment system and melt level sensor, offers continuous closed-loop automatic control of melt-level during web growth. Installed on silicon-web furnace, circuit controls melt-level to within 0.1 mm for as long as 8 hours. Circuit affords greater area growth rate and higher web quality, automatic melt-level control also allows semiautomatic growth of web over long periods which can greatly reduce costs.

  4. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  5. Thai Automatic Speech Recognition

    National Research Council Canada - National Science Library

    Suebvisai, Sinaporn; Charoenpornsawat, Paisarn; Black, Alan; Woszczyna, Monika; Schultz, Tanja

    2005-01-01

    .... We focus on the discussion of the rapid deployment of ASR for Thai under limited time and data resources, including rapid data collection issues, acoustic model bootstrap, and automatic generation of pronunciations...

  6. Automatic detection and classification of artifacts in single-channel EEG

    DEFF Research Database (Denmark)

    Olund, Thomas; Duun-Henriksen, Jonas; Kjaer, Troels W.

    2014-01-01

    Ambulatory EEG monitoring can provide medical doctors important diagnostic information without hospitalizing the patient. These recordings are, however, more exposed to noise and artifacts compared to clinically recorded EEG. An automatic artifact detection and classification algorithm for single-channel EEG is proposed to help identify these artifacts. Features are extracted from the EEG signal and wavelet subbands. Subsequently a selection algorithm is applied in order to identify the best discriminating features. A non-linear support vector machine is used to discriminate among different artifact classes using the selected features. Single-channel (Fp1-F7) EEG recordings are obtained from experiments with 12 healthy subjects performing artifact inducing movements. The dataset was used to construct and validate the model. Both subject-specific and generic implementations are investigated...

  7. Use of Automatic Interaction Detector in Monitoring Faculty Salaries. AIR 1983 Annual Forum Paper.

    Science.gov (United States)

    Cohen, Margaret E.

    A university's use of the Automatic Interaction Detector (AID) to monitor faculty salary data is described. The first step consists of examining a tree diagram and summary table produced by AID. The tree is used to identify the characteristics of faculty at different salary levels. The table is used to determine the explanatory power of the…

  8. A Multi-threaded Version of Field II

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2014-01-01

    A multi-threaded version of Field II has been developed, which can automatically use the multi-core capabilities of modern CPUs. The memory allocation routines were rewritten to minimize the number of dynamic allocations and to make pre-allocations possible for each thread. This ensures that the simulation job can be automatically partitioned and the interdependence between threads minimized. The new code has been compared to Field II version 3.22, October 27, 2013 (latest free-ware version). A 64 element 5 MHz focused array transducer was simulated. One million point scatterers randomly distributed in a plane of 20 x 50 mm (width x depth) with random Gaussian amplitudes were simulated using the command calc scat. Dual Intel Xeon CPU E5-2630 2.60 GHz CPUs were used under Ubuntu Linux 10.02 and Matlab version 2013b. Each CPU holds 6 cores with hyper-threading, corresponding to a total of 24 hyper...

  9. Examples of testing global identifiability of biological and biomedical models with the DAISY software.

    Science.gov (United States)

    Saccomani, Maria Pia; Audoly, Stefania; Bellu, Giuseppina; D'Angiò, Leontina

    2010-04-01

    DAISY (Differential Algebra for Identifiability of SYstems) is a recently developed computer algebra software tool which can be used to automatically check global identifiability of (linear and) nonlinear dynamic models described by differential equations involving polynomial or rational functions. Global identifiability is a fundamental prerequisite for model identification, which is important not only for biological or medical systems but also for many physical and engineering systems derived from first principles. Lack of identifiability implies that the parameter estimation techniques may not fail, but any obtained numerical estimates will be meaningless. The software does not require understanding of the underlying mathematical principles and can be used by researchers in applied fields with a minimum of mathematical background. We illustrate the DAISY software by checking the a priori global identifiability of two benchmark nonlinear models taken from the literature. The analysis of these two examples includes comparison with other methods and demonstrates how identifiability analysis is simplified by this tool. We then illustrate the identifiability analysis of two other examples, including discussion of some specific aspects related to the role of observability and knowledge of initial conditions in testing identifiability and to the computational complexity of the software. The main focus of this paper is not on the description of the mathematical background of the algorithm, which has been presented elsewhere, but on illustrating its use and on some of its more interesting features. DAISY is available on the web site http://www.dei.unipd.it/~pia/. 2010 Elsevier Ltd. All rights reserved.

  10. Automatic Detection and Visualization of Qualitative Hemodynamic Characteristics in Cerebral Aneurysms.

    Science.gov (United States)

    Gasteiger, R; Lehmann, D J; van Pelt, R; Janiga, G; Beuing, O; Vilanova, A; Theisel, H; Preim, B

    2012-12-01

    Cerebral aneurysms are a pathological vessel dilatation that bear a high risk of rupture. For the understanding and evaluation of the risk of rupture, the analysis of hemodynamic information plays an important role. Besides quantitative hemodynamic information, also qualitative flow characteristics, e.g., the inflow jet and impingement zone are correlated with the risk of rupture. However, the assessment of these two characteristics is currently based on an interactive visual investigation of the flow field, obtained by computational fluid dynamics (CFD) or blood flow measurements. We present an automatic and robust detection as well as an expressive visualization of these characteristics. The detection can be used to support a comparison, e.g., of simulation results reflecting different treatment options. Our approach utilizes local streamline properties to formalize the inflow jet and impingement zone. We extract a characteristic seeding curve on the ostium, on which an inflow jet boundary contour is constructed. Based on this boundary contour we identify the impingement zone. Furthermore, we present several visualization techniques to depict both characteristics expressively. Thereby, we consider accuracy and robustness of the extracted characteristics, minimal visual clutter and occlusions. An evaluation with six domain experts confirms that our approach detects both hemodynamic characteristics reasonably.

  11. Adaptive pseudolinear compensators of dynamic characteristics of automatic control systems

    Science.gov (United States)

    Skorospeshkin, M. V.; Sukhodoev, M. S.; Timoshenko, E. A.; Lenskiy, F. V.

    2016-04-01

    Adaptive pseudolinear gain and phase compensators of dynamic characteristics of automatic control systems are suggested. The automatic control system performance with adaptive compensators has been explored. The efficiency of pseudolinear adaptive compensators in the automatic control systems with time-varying parameters has been demonstrated.

  12. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of input. We describe a system to derive such time bounds automatically using abstract...

  13. Human-competitive automatic topic indexing

    CERN Document Server

    Medelyan, Olena

    2009-01-01

    Topic indexing is the task of identifying the main topics covered by a document. These are useful for many purposes: as subject headings in libraries, as keywords in academic publications and as tags on the web. Knowing a document’s topics helps people judge its relevance quickly. However, assigning topics manually is labor intensive. This thesis shows how to generate them automatically in a way that competes with human performance. Three kinds of indexing are investigated: term assignment, a task commonly performed by librarians, who select topics from a controlled vocabulary; tagging, a popular activity of web users, who choose topics freely; and a new method of keyphrase extraction, where topics are equated to Wikipedia article names. A general two-stage algorithm is introduced that first selects candidate topics and then ranks them by significance based on their properties. These properties draw on statistical, semantic, domain-specific and encyclopedic knowledge. They are combined using a machine learn...

  14. Process and device for automatically surveying complex installations

    International Nuclear Information System (INIS)

    Pekrul, P.J.; Thiele, A.W.

    1976-01-01

    A description is given of a process for automatically analysing separate signal-processing channels in real time, one channel per signal, in a facility with significant background noise in time-varying signals coming from transducers at selected points, for the continuous monitoring of the operating conditions of the various components of the installation. The signals are intended to determine potential breakdowns, to draw conclusions as to the severity of these potential breakdowns, and to indicate to an operator the measures to be taken in consequence. The feature of this process is that it comprises the automatic and successive selection of each channel for spectral analysis, the automatic processing of the signal of each selected channel to produce energy spectral density data at pre-determined frequencies, the automatic comparison of the energy spectral density data of each channel with pre-determined, frequency-dependent sets of limits, and the automatic indication to the operator of the condition of the various components of the installation associated with each channel and of the measures to be taken depending on the set of limits.
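
    The per-channel spectral check described above can be sketched as follows; the sampling rate, limit curve and test signal are assumptions, and Welch's method stands in for whichever spectral estimator the patented process actually uses.

        import numpy as np
        from scipy.signal import welch

        FS = 1000.0                          # assumed sampling rate in Hz

        def check_channel(signal, limit_curve):
            """limit_curve: callable f(freq_hz) -> allowed PSD level at that frequency."""
            freqs, psd = welch(signal, fs=FS, nperseg=256)
            return freqs[psd > limit_curve(freqs)]      # frequencies violating the limit

        rng = np.random.default_rng(0)
        t = np.arange(4096) / FS
        channel = rng.normal(size=t.size) + 0.5 * np.sin(2 * np.pi * 120 * t)
        flagged = check_channel(channel, limit_curve=lambda f: np.full_like(f, 5e-3))
        print("limit exceeded near (Hz):", flagged)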

  15. Impact of automatic calibration techniques on HMD life cycle costs and sustainable performance

    Science.gov (United States)

    Speck, Richard P.; Herz, Norman E., Jr.

    2000-06-01

    Automatic test and calibration has become a valuable feature in many consumer products--ranging from antilock braking systems to auto-tune TVs. This paper discusses HMDs (Helmet Mounted Displays) and how similar techniques can reduce life cycle costs and increase sustainable performance if they are integrated into a program early enough. Optical ATE (Automatic Test Equipment) is already zeroing distortion in the HMDs and thereby making binocular displays a practical reality. A suitcase sized, field portable optical ATE unit could re-zero these errors in the Ready Room to cancel the effects of aging, minor damage and component replacement. Planning on this would yield large savings through relaxed component specifications and reduced logistic costs. Yet, the sustained performance would far exceed that attained with fixed calibration strategies. Major tactical benefits can come from reducing display errors, particularly in information fusion modules and virtual `beyond visual range' operations. Some versions of the ATE described are in production and examples of high resolution optical test data will be discussed.

  16. LAMOST OBSERVATIONS IN THE KEPLER FIELD: SPECTRAL CLASSIFICATION WITH THE MKCLASS CODE

    Energy Technology Data Exchange (ETDEWEB)

    Gray, R. O. [Department of Physics and Astronomy, Appalachian State University, Boone, NC 28608 (United States); Corbally, C. J. [Vatican Observatory Research Group, Steward Observatory, Tucson, AZ 85721-0065 (United States); Cat, P. De [Royal Observatory of Belgium, Ringlaan 3, B-1180 Brussel (Belgium); Fu, J. N.; Ren, A. B. [Department of Astronomy, Beijing Normal University, 19 Avenue Xinjiekouwai, Beijing 100875 (China); Shi, J. R.; Luo, A. L.; Zhang, H. T.; Wu, Y.; Cao, Z.; Li, G. [Key Laboratory for Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China); Zhang, Y.; Hou, Y.; Wang, Y. [Nanjing Institute of Astronomical Optics and Technology, National Astronomical Observatories, Chinese Academy of Sciences, Nanjing 210042 (China)

    2016-01-15

    The LAMOST-Kepler project was designed to obtain high-quality, low-resolution spectra of many of the stars in the Kepler field with the Large Sky Area Multi Object Fiber Spectroscopic Telescope (LAMOST) spectroscopic telescope. To date 101,086 spectra of 80,447 objects over the entire Kepler field have been acquired. Physical parameters, radial velocities, and rotational velocities of these stars will be reported in other papers. In this paper we present MK spectral classifications for these spectra determined with the automatic classification code MKCLASS. We discuss the quality and reliability of the spectral types and present histograms showing the frequency of the spectral types in the main table organized according to luminosity class. Finally, as examples of the use of this spectral database, we compute the proportion of A-type stars that are Am stars, and identify 32 new barium dwarf candidates.

  17. Automatic keywording of High Energy Physics

    CERN Document Server

    Dallman, David Peter

    1999-01-01

    Bibliographic databases were developed from the traditional library card catalogue in order to enable users to access library documents via various types of bibliographic information, such as title, author, series or conference date. In addition these catalogues sometimes contained some form of indexation by subject, such as the Universal (or Dewey) Decimal Classification used for books. With the introduction of the eprint archives, set up by the High Energy Physics (HEP) Community in the early 90s, huge collections of documents in several fields have been made available on the World Wide Web. These developments however have not yet been followed up from a keywording point of view. We will see in this paper how important it is to attribute keywords to all documents in the area of HEP Grey Literature. As libraries are facing a future with less and less manpower available and more and more documents, we will explore the possibility of being helped by automatic classification software. We will specifically menti...

  18. Knowledge-based full-automatic control system for a nuclear ship reactor

    International Nuclear Information System (INIS)

    Shimazaki, J.; Nakazawa, T.; Yabuuchi, N.

    2000-01-01

    Plant operations aboard nuclear ships require quick judgements and actions due to changing marine conditions such as wind, waves and currents. Furthermore, additional human support is not available for nuclear ship operation at sea, so advanced automatic operation is necessary to reduce the number of operators required. Therefore, an advanced automatic operating system has been developed based on operational knowledge of the nuclear ship 'Mutsu' plant. The advanced automatic operating system includes both the automatic operation system and the operator-support system, which assists operators in completing actions during plant accidents, anomaly diagnosis and plant supervision. These systems are largely being developed using artificial intelligence techniques such as neural networks, fuzzy logic and knowledge-based expert systems. The automatic operation system is fundamentally based upon the application of an operator's knowledge of both normal (start-up to rated power level) and abnormal (after scram) operations. Comparing plant behaviour from start-up to rated power under automatic operation with that under manual operation of 'Mutsu', stable automatic operation was obtained, almost the same as manual operation, within all operating limits. The abnormal automatic system addresses the demanding manual operations required after scram or LOCA accidents. An integrated system combining the normal and abnormal automatic systems is being developed so that the two interact smoothly. (author)

  19. Intracellular recording, sensory field mapping, and culturing identified neurons in the leech, Hirudo medicinalis.

    Science.gov (United States)

    Titlow, Josh; Majeed, Zana R; Nicholls, John G; Cooper, Robin L

    2013-11-04

    The freshwater leech, Hirudo medicinalis, is a versatile model organism that has been used to address scientific questions in the fields of neurophysiology, neuroethology, and developmental biology. The goal of this report is to consolidate experimental techniques from the leech system into a single article that will be of use to physiologists with expertise in other nervous system preparations, or to biology students with little or no electrophysiology experience. We demonstrate how to dissect the leech for recording intracellularly from identified neural circuits in the ganglion. Next we show how individual cells of known function can be removed from the ganglion to be cultured in a Petri dish, and how to record from those neurons in culture. Then we demonstrate how to prepare a patch of innervated skin to be used for mapping sensory or motor fields. These leech preparations are still widely used to address basic electrical properties of neural networks, behavior, synaptogenesis, and development. They are also an appropriate training module for neuroscience or physiology teaching laboratories.

  20. Detection of Degradation Effects in Field-Aged c-Si Solar Cells through IR Thermography and Digital Image Processing

    Directory of Open Access Journals (Sweden)

    E. Kaplani

    2012-01-01

    Full Text Available Due to the vast expansion of photovoltaic (PV) module production nowadays, a great interest is shown in factors affecting PV performance and efficiency under real conditions. Particular attention is being given to degradation effects of PV cells and modules, which during the last decade have been seen to be responsible for significant power losses observed in PV systems. This paper presents and analyses degradation effects observed in severely EVA discoloured PV cells from field-aged modules operating already for 18–22 years. Temperature degradation effects are identified through IR thermography in bus bars, contact solder bonds, blisters, hot spots, and hot areas. I-V curve analysis results showed an agreement between the source of electrical performance degradation and the degradation effects in the defective cells identified by IR thermography. Finally, an algorithm was developed to automatically detect EVA discoloration in PV cells through processing of the digital image alone, in a way closely imitating human perception of color. This nondestructive and low-cost solution could be applied to the detection of EVA discoloration in existing PV installations and to the automatic monitoring and remote inspection of PV systems.
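
    As an illustration only (not the published algorithm), EVA discoloration can be flagged by thresholding the photograph in a perceptual color space; the hue band, the saturation limit and the input file name below are assumptions that would need tuning on real module images.

        import numpy as np
        from skimage import io, color

        def discoloration_mask(rgb_image, hue_range=(0.08, 0.18), min_sat=0.35):
            """Mark pixels whose hue sits in a yellow/brown band with high saturation."""
            hsv = color.rgb2hsv(rgb_image)
            h, s = hsv[..., 0], hsv[..., 1]
            return (h >= hue_range[0]) & (h <= hue_range[1]) & (s >= min_sat)

        if __name__ == "__main__":
            img = io.imread("pv_module.jpg")          # hypothetical module photograph
            mask = discoloration_mask(img[..., :3].astype(float) / 255.0)
            print("discoloured fraction: %.1f%%" % (100.0 * mask.mean()))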

  1. Algorithm of automatic generation of technology process and process relations of automotive wiring harnesses

    Institute of Scientific and Technical Information of China (English)

    XU Benzhu; ZHU Jiman; LIU Xiaoping

    2012-01-01

    Identifying each process and the constraint relations among them quickly and accurately from complex wiring harness drawings is the basis for formulating process routes. Based on knowledge of automotive wiring harnesses and the characteristics of wiring harness components, we established a model of the wiring harness graph. We then investigate an algorithm for identifying technology processes automatically, and finally we describe the relationships between processes by introducing a constraint matrix, in order to lay a good foundation for harness process planning and production scheduling.
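
    The constraint matrix mentioned above can be turned into a feasible process route with a topological sort; the encoding (C[i][j] == 1 meaning process i must precede process j) and the toy process names are assumptions for illustration.

        from collections import deque

        processes = ["cut wires", "crimp terminals", "insert into connector", "tape bundle"]
        C = [[0, 1, 1, 1],          # constraint matrix: row must precede column
             [0, 0, 1, 1],
             [0, 0, 0, 1],
             [0, 0, 0, 0]]

        def process_route(constraints):
            """Kahn's algorithm: return one ordering consistent with the constraints."""
            n = len(constraints)
            indegree = [sum(constraints[i][j] for i in range(n)) for j in range(n)]
            queue = deque(j for j in range(n) if indegree[j] == 0)
            order = []
            while queue:
                i = queue.popleft()
                order.append(i)
                for j in range(n):
                    if constraints[i][j]:
                        indegree[j] -= 1
                        if indegree[j] == 0:
                            queue.append(j)
            return order

        print([processes[i] for i in process_route(C)])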

  2. Automatic structural scene digitalization.

    Science.gov (United States)

    Tang, Rui; Wang, Yuhan; Cosker, Darren; Li, Wenbin

    2017-01-01

    In this paper, we present an automatic system for the analysis and labeling of structural scenes, floor plan drawings in Computer-aided Design (CAD) format. The proposed system applies a fusion strategy to detect and recognize various components of CAD floor plans, such as walls, doors, windows and other ambiguous assets. Technically, a general rule-based filter parsing method is fist adopted to extract effective information from the original floor plan. Then, an image-processing based recovery method is employed to correct information extracted in the first step. Our proposed method is fully automatic and real-time. Such analysis system provides high accuracy and is also evaluated on a public website that, on average, archives more than ten thousands effective uses per day and reaches a relatively high satisfaction rate.

  3. Automatic progressive damage detection of rotor bar in induction motor using vibration analysis and multiple classifiers

    International Nuclear Information System (INIS)

    Cruz-Vega, Israel; Rangel-Magdaleno, Jose; Ramirez-Cortes, Juan; Peregrina-Barreto, Hayde

    2017-01-01

    There is an increased interest in developing reliable condition monitoring and fault diagnosis systems for machines like induction motors; such interest is not only in the final phase of the failure but also at early stages. In this paper, several levels of damage of rotor bars under different load conditions are identified by means of vibration signals. The importance of this work relies on a simple but effective automatic detection algorithm for the damage before a break occurs. The feature extraction is based on discrete wavelet analysis and an autocorrelation process. Then, the automatic classification of the fault degree is carried out by a binary classification tree. In each node, comparison against the learned levels of breakage correctly identifies the fault degree. The best results of classification are obtained employing computational intelligence techniques like support vector machines, multilayer perceptron, and the k-NN algorithm, with a proper selection of their optimal parameters.
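
    A hedged sketch of the feature/classification pipeline on synthetic vibration signals is shown below (PyWavelets and scikit-learn); the wavelet choice, the autocorrelation feature, the synthetic fault signature and the SVM stand in for the authors' settings and their classification tree.

        import numpy as np
        import pywt
        from sklearn.svm import SVC

        def wavelet_features(signal, wavelet="db4", level=4):
            """Sub-band energies from a wavelet decomposition plus lag-1 autocorrelation."""
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            feats = [np.sum(c ** 2) for c in coeffs]
            feats.append(np.corrcoef(signal[:-1], signal[1:])[0, 1])
            return np.array(feats)

        rng = np.random.default_rng(1)
        healthy = [rng.normal(size=2048) for _ in range(20)]
        damaged = [rng.normal(size=2048) + 0.4 * np.sin(np.linspace(0, 60 * np.pi, 2048))
                   for _ in range(20)]                       # synthetic fault signature

        X = np.array([wavelet_features(s) for s in healthy + damaged])
        y = np.array([0] * 20 + [1] * 20)                    # 0 = healthy, 1 = damaged
        clf = SVC(kernel="rbf", C=10.0).fit(X, y)
        print("training accuracy:", clf.score(X, y))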

  4. Automatic progressive damage detection of rotor bar in induction motor using vibration analysis and multiple classifiers

    Energy Technology Data Exchange (ETDEWEB)

    Cruz-Vega, Israel; Rangel-Magdaleno, Jose; Ramirez-Cortes, Juan; Peregrina-Barreto, Hayde [Santa María Tonantzintla, Puebla (Mexico)

    2017-06-15

    There is an increased interest in developing reliable condition monitoring and fault diagnosis systems for machines like induction motors; such interest is not only in the final phase of the failure but also at early stages. In this paper, several levels of damage of rotor bars under different load conditions are identified by means of vibration signals. The importance of this work relies on a simple but effective automatic detection algorithm for the damage before a break occurs. The feature extraction is based on discrete wavelet analysis and an autocorrelation process. Then, the automatic classification of the fault degree is carried out by a binary classification tree. In each node, comparison against the learned levels of breakage correctly identifies the fault degree. The best results of classification are obtained employing computational intelligence techniques like support vector machines, multilayer perceptron, and the k-NN algorithm, with a proper selection of their optimal parameters.

  5. Automatic neutron dosimetry system based on fluorescent nuclear track detector technology

    International Nuclear Information System (INIS)

    Akselrod, M.S.; Fomenko, V.V.; Bartz, J.A.; Haslett, T.L.

    2014-01-01

    For the first time, the authors are describing an automatic fluorescent nuclear track detector (FNTD) reader for neutron dosimetry. FNTD is a luminescent integrating type of detector made of aluminium oxide crystals that does not require electronics or batteries during irradiation. Non-destructive optical readout of the detector is performed using a confocal laser scanning fluorescence imaging with near-diffraction limited resolution. The fully automatic table-top reader allows one to load up to 216 detectors on a tray, read their engraved IDs using a CCD camera and optical character recognition, scan and process simultaneously two types of images in fluorescent and reflected laser light contrast to eliminate false-positive tracks related to surface and volume crystal imperfections. The FNTD dosimetry system allows one to measure neutron doses from 0.1 mSv to 20 Sv and covers neutron energies from thermal to 20 MeV. The reader is characterised by a robust, compact optical design, fast data processing electronics and user-friendly software. The first table-top automatic FNTD neutron dosimetry system was successfully tested for LLD, linearity and ability to measure neutrons in mixed neutron-photon fields satisfying US and ISO standards. This new neutron dosimetry system provides advantages over other technologies including environmental stability of the detector material, wide range of detectable neutron energies and doses, detector re-readability and re-usability and all-optical readout. A new adaptive image processing algorithm reliably removes false-positive tracks associated with surface and bulk crystal imperfections. (authors)

  6. CAnat: An algorithm for the automatic segmentation of anatomy of medical images

    International Nuclear Information System (INIS)

    Caon, M.; Gobert, L.; Mariusz, B.

    2011-01-01

    Full text: To develop a method to automatically categorise organs and tissues displayed in medical images. Dosimetry calculations using Monte Carlo methods require a mathematical representation of human anatomy, e.g. a voxel phantom. For a whole body, their construction involves processing several hundred images to identify each organ and tissue; the process is very time-consuming. This project is developing a Computational Anatomy (CAnat) algorithm to automatically recognise and classify the different tissues in a tomographic image. Methods: The algorithm utilizes the Statistical Region Merging technique (SRM). The SRM depends on one estimated parameter, which is a measure of the statistical complexity of the image and can be automatically adjusted to suit individual image features. This allows automatic tuning of the coarseness of the overall segmentation as well as object-specific selection for further tasks. CAnat is tested on two CT images selected to represent different anatomical complexities. In the mid-thigh image, the tissues/regions of interest are air, fat, muscle, bone marrow and compact bone; in the pelvic image, fat, urinary bladder and anus/colon, muscle, cancellous bone, and compact bone. Segmentation results were evaluated using the Jaccard index, which is a measure of set agreement; an index of one indicates perfect agreement between CAnat and manual segmentation. The Jaccard indices for the mid-thigh CT were 0.99, 0.89, 0.97, 0.63 and 0.88, respectively, and for the pelvic CT were 0.99, 0.81, 0.77, 0.93, 0.53, 0.76, respectively. Conclusion: The high accuracy of the preliminary segmentation results demonstrates the feasibility of the CAnat algorithm.
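
    The Jaccard index used above to compare automatic and manual segmentations reduces to a few lines on boolean masks (numpy sketch; the small masks are made up).

        import numpy as np

        def jaccard(mask_a, mask_b):
            """Intersection over union of two boolean segmentation masks."""
            intersection = np.logical_and(mask_a, mask_b).sum()
            union = np.logical_or(mask_a, mask_b).sum()
            return intersection / union if union else 1.0

        auto   = np.array([[1, 1, 0], [0, 1, 0]], dtype=bool)
        manual = np.array([[1, 1, 0], [0, 1, 1]], dtype=bool)
        print(jaccard(auto, manual))      # 0.75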

  7. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping

    Directory of Open Access Journals (Sweden)

    Pouria Sadeghi-Tehran

    2017-11-01

    Full Text Available Abstract Background Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when applied to images acquired in dynamic field environments. Results In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with the state-of-the-art and other learning methods on digital images. All methods are compared and evaluated with different environmental conditions and the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements and (4) an estimation of performance along the full life cycle of a wheat canopy. Conclusion The method described is capable of coping with the environmental challenges faced in field conditions, with high levels of adaptiveness and without the need for adjusting a threshold for each digital image. The proposed method is also an ideal candidate to process a time series of phenotypic information throughout the crop growth acquired in the field. Moreover, the introduced method has the advantage that it is not limited to growth measurements only but can be applied to other applications such as identifying weeds, diseases, stress, etc.
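
    A common single-threshold baseline that such a multi-feature model is typically compared against is the excess-green index with Otsu thresholding; the sketch below is that baseline only, not the proposed method, and the image file name is an assumption.

        import numpy as np
        from skimage import io
        from skimage.filters import threshold_otsu

        def green_cover_fraction(path):
            """Fraction of pixels classified as green vegetation via the excess-green index."""
            rgb = io.imread(path)[..., :3].astype(float) / 255.0
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            exg = 2.0 * g - r - b                       # excess-green index
            mask = exg > threshold_otsu(exg)
            return mask.mean(), mask

        if __name__ == "__main__":
            fraction, _ = green_cover_fraction("plot_image.png")   # hypothetical field image
            print("green fractional vegetation cover: %.2f" % fraction)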

  8. 14 CFR 25.904 - Automatic takeoff thrust control system (ATTCS).

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Automatic takeoff thrust control system... Automatic takeoff thrust control system (ATTCS). Each applicant seeking approval for installation of an engine power control system that automatically resets the power or thrust on the operating engine(s) when...

  9. Development of an automatic prompt gamma-ray activation analysis system

    International Nuclear Information System (INIS)

    Osawa, Takahito

    2013-01-01

    An automatic prompt gamma-ray activation analysis system was developed and installed at the Japan Research Reactor No. 3 Modified (JRR-3M). The main control software, referred to as AutoPGA, was developed using LabVIEW 2011; this in-house program can control all functions of the analytical system. The core of the new system is an automatic sample exchanger and measurement system, with several additional automatic control functions integrated into the system. Up to fourteen samples can be measured automatically by the system. (author)

  10. Automatic segmentation of the heart in radiotherapy for breast cancer

    DEFF Research Database (Denmark)

    Laugaard Lorenzen, Ebbe; Ewertz, Marianne; Brink, Carsten

    2014-01-01

    Background. The aim of this study was to evaluate two fully automatic segmentation methods in comparison with manual delineations for their use in delineating the heart on planning computed tomography (CT) used in radiotherapy for breast cancer. Material and methods. Automatic delineation of the heart in 15 breast cancer patients was performed by two different automatic delineation systems. Accuracy and precision of the differences between manual and automatic delineations were evaluated on volume, mean dose, maximum dose and spatial distance differences. Two sets of manual delineations...

  11. Automatic identification of watercourses in flat and engineered landscapes by computing the skeleton of a LiDAR point cloud

    Science.gov (United States)

    Broersen, Tom; Peters, Ravi; Ledoux, Hugo

    2017-09-01

    Drainage networks play a crucial role in protecting land against floods. It is therefore important to have an accurate map of the watercourses that form the drainage network. Previous work on the automatic identification of watercourses was typically based on grids, focused on natural landscapes, and used mostly the slope and curvature of the terrain. In this paper we focus on areas characterised by low-lying, flat, and engineered landscapes, such as those typical of the Netherlands. We propose a new methodology to identify watercourses automatically from elevation data, using solely a raw classified LiDAR point cloud as input. We show that by computing a skeleton of the point cloud twice (once in 2D and once in 3D) and by using the properties of the skeletons, we can identify most of the watercourses. We have implemented our methodology and tested it for three different soil types around Utrecht, the Netherlands. We were able to detect 98% of the watercourses for one soil type, and around 75% in the worst case, when compared to a reference dataset that was obtained semi-automatically.

  12. Group Dynamics in Automatic Imitation.

    Science.gov (United States)

    Gleibs, Ilka H; Wilson, Neil; Reddy, Geetha; Catmur, Caroline

    Imitation, matching the configural body movements of another individual, plays a crucial part in social interaction. We investigated whether automatic imitation is not only influenced by whom we imitate (ingroup vs. outgroup member) but also by the nature of an expected interaction situation (competitive vs. cooperative). In line with assumptions from Social Identity Theory, we predicted that both social group membership and the expected situation impact on the level of automatic imitation. We adopted a 2 (group membership target: ingroup, outgroup) x 2 (situation: cooperative, competitive) design. The dependent variable was the degree to which participants imitated the target in a reaction time automatic imitation task. 99 female students from two British universities participated. We found a significant two-way interaction on the imitation effect. When interacting in expectation of cooperation, imitation was stronger for an ingroup target compared to an outgroup target. However, this was not the case in the competitive condition, where imitation did not differ between ingroup and outgroup targets. This demonstrates that the goal structure of an expected interaction will determine the extent to which intergroup relations influence imitation, supporting a social identity approach.

  13. Automatic programming for critical applications

    Science.gov (United States)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Usually, execution performance is an expected requirement in a software development process. Unfortunately, the verification and the maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plant systems. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. The definition of automatic programming, or what is understood by it, has changed with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  14. Identification of mycobacterium tuberculosis in sputum smear slide using automatic scanning microscope

    Science.gov (United States)

    Rulaningtyas, Riries; Suksmono, Andriyan B.; Mengko, Tati L. R.; Saptawati, Putri

    2015-04-01

    Sputum smear observation plays an important role in tuberculosis (TB) diagnosis, and accurate identification is needed to avoid diagnostic errors. In developing countries, sputum smear slides are commonly observed with a conventional light microscope on Ziehl-Neelsen stained tissue, which does not require a high maintenance cost. Clinicians screen sputum smear slides manually, which is time consuming and requires extensive training to detect the presence of TB bacilli (Mycobacterium tuberculosis) accurately, especially for negative slides and slides with few bacilli. To help clinicians, we propose an automatic scanning microscope with automatic identification of TB bacilli. The designed system drives the field movement of a light microscope with a stepper motor controlled by a microcontroller. Every sputum smear field is captured by a camera, and several image processing steps are then applied to the images. A color threshold on the hue channel in HSV color space is used for background subtraction, and the Sobel edge detection algorithm is used for segmentation of the TB bacilli. Shape-based features are extracted to analyse the bacilli, and a neural network then classifies each object as a TB bacillus or not. The results indicate that the identification works well and detects TB bacilli accurately in sputum smear slides with normal staining, but not in over-stained or under-stained slides. Overall, the designed system can make sputum smear observation easier for clinicians.
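
    A rough sketch of the identification chain is given below: hue-channel background subtraction followed by shape features. The staining hue band, the size limit and the simple elongation rule replace the Sobel-based segmentation and the neural-network classifier of the actual system, and the input file name is an assumption.

        from skimage import io, color, measure

        def candidate_bacilli(path, min_area=20, min_eccentricity=0.9):
            rgb = io.imread(path)[..., :3]
            hsv = color.rgb2hsv(rgb)
            # Ziehl-Neelsen stained bacilli appear reddish: keep strongly
            # saturated pixels whose hue lies near the red end of the circle.
            stained = ((hsv[..., 0] > 0.9) | (hsv[..., 0] < 0.05)) & (hsv[..., 1] > 0.3)
            labels = measure.label(stained)
            return [r.centroid for r in measure.regionprops(labels)
                    if r.area >= min_area and r.eccentricity >= min_eccentricity]

        if __name__ == "__main__":
            rods = candidate_bacilli("smear_field.png")    # hypothetical captured field
            print(len(rods), "candidate bacilli")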

  15. Automatically repairing invalid polygons with a constrained triangulation

    NARCIS (Netherlands)

    Ledoux, H.; Arroyo Ohori, K.; Meijers, M.

    2012-01-01

    Although the validation of single polygons has received considerable attention, the automatic repair of invalid polygons has not. Automated repair methods can be considered as interpreting ambiguous or ill-defined polygons and giving a coherent and clearly defined output. At this moment, automatic

  16. Automatic detection of P- and S-wave arrival times: new strategies based on the modified fractal method and basic matching pursuit.

    Science.gov (United States)

    Chi Durán, R. K.; Comte, D.; Diaz, M. A.; Silva, J. F.

    2017-12-01

    In this work, new strategies for the automatic identification of P- and S-wave arrival times from digitally recorded local seismograms are proposed and analyzed. A database of arrival times previously identified by a human reader was compared with automatic identification techniques based on the Fourier transform in reduced time (spectrograms), fractal analysis, and the basic matching pursuit algorithm. The first two techniques were used to identify the P-wave arrival times, while the third was used for the identification of the S-wave. For validation, the results were compared with the short-time average over long-time average (STA/LTA) picker of Rietbrock et al., Geophys Res Lett 39(8), (2012), for the database of aftershocks of the 2010 Maule Mw = 8.8 earthquake. The identifiers proposed in this work exhibit good results that outperform the STA/LTA identifier in many scenarios. The average difference from the reference picks (times obtained by the human reader) in P- and S-wave arrival times is 1 s.
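
    For reference, the classical STA/LTA trigger that the proposed identifiers are compared against can be written in a few lines; the window lengths, threshold and synthetic trace below are typical illustrative values, not those of Rietbrock et al.

        import numpy as np

        def sta_lta_pick(trace, fs, sta_win=0.5, lta_win=10.0, threshold=4.0):
            """Time (s) where the short/long-term average ratio first exceeds the threshold."""
            sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
            csum = np.concatenate(([0.0], np.cumsum(trace.astype(float) ** 2)))
            sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n   # running short-window energy average
            lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n   # running long-window energy average
            ratio = sta[lta_n - sta_n:] / np.maximum(lta, 1e-12)
            onset = int(np.argmax(ratio > threshold))
            return (onset + lta_n - 1) / fs if ratio[onset] > threshold else None

        fs = 100.0
        t = np.arange(0, 60, 1 / fs)
        trace = np.random.default_rng(2).normal(scale=0.1, size=t.size)
        trace[int(30 * fs):] += np.sin(2 * np.pi * 5 * t[: t.size - int(30 * fs)])  # arrival at 30 s
        print("triggered at %.2f s" % sta_lta_pick(trace, fs))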

  17. The irace package: Iterated racing for automatic algorithm configuration

    Directory of Open Access Journals (Sweden)

    Manuel López-Ibáñez

    2016-01-01

    Full Text Available Modern optimization algorithms typically require the setting of a large number of parameters to optimize their performance. The immediate goal of automatic algorithm configuration is to find, automatically, the best parameter settings of an optimizer. Ultimately, automatic algorithm configuration has the potential to lead to new design paradigms for optimization software. The irace package is a software package that implements a number of automatic configuration procedures. In particular, it offers iterated racing procedures, which have been used successfully to automatically configure various state-of-the-art algorithms. The iterated racing procedures implemented in irace include the iterated F-race algorithm and several extensions and improvements over it. In this paper, we describe the rationale underlying the iterated racing procedures and introduce a number of recent extensions. Among these, we introduce a restart mechanism to avoid premature convergence, the use of truncated sampling distributions to correctly handle parameter bounds, and an elitist racing procedure for ensuring that the best configurations returned are also those evaluated in the highest number of training instances. We experimentally evaluate the most recent version of irace and demonstrate with a number of example applications the use and potential of irace, in particular, and automatic algorithm configuration, in general.

  18. A heads-up no-limit Texas Hold'em poker player: Discretized betting models and automatically generated equilibrium-finding programs

    DEFF Research Database (Denmark)

    Gilpin, Andrew G.; Sandholm, Tuomas; Sørensen, Troels Bjerre

    2008-01-01

    choices in the game. Second, we employ potential-aware automated abstraction algorithms for identifying strategically similar situations in order to decrease the size of the game tree. Third, we develop a new technique for automatically generating the source code of an equilibrium-finding algorithm from an XML-based description of a game. This automatically generated program is more efficient than what would be possible with a general-purpose equilibrium-finding program. Finally, we present results from the AAAI-07 Computer Poker Competition, in which Tartanian placed second out of ten entries.

  19. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction

    Science.gov (United States)

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    The Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is not satisfactory. PCA has been extended into kernel PCA in order to capture higher-order statistics. However, thus far no researchers have explicitly proposed a kernel FKT (KFKT) or investigated its detection performance. For accurately detecting potential small targets in infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and that the proposed framework is capable of automatically detecting and tracking infrared point targets.

  20. Mimicry and automatic imitation are not correlated

    Science.gov (United States)

    van Den Bossche, Sofie; Cracco, Emiel; Bardi, Lara; Rigoni, Davide; Brass, Marcel

    2017-01-01

    It is widely known that individuals have a tendency to imitate each other. However, different psychological disciplines assess imitation in different manners. While social psychologists assess mimicry by means of action observation, cognitive psychologists assess automatic imitation with reaction time based measures on a trial-by-trial basis. Although these methods differ in crucial methodological aspects, both phenomena are assumed to rely on similar underlying mechanisms. This raises the fundamental question whether mimicry and automatic imitation are actually correlated. In the present research we assessed both phenomena and did not find a meaningful correlation. Moreover, personality traits such as empathy, autism traits, and traits related to self- versus other-focus did not correlate with mimicry or automatic imitation either. Theoretical implications are discussed. PMID:28877197

  1. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for the recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects faces in stored video frames using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural-network-based and eigenimage-based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.
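
    The classification stage alone can be sketched with scikit-learn; the feature vectors below are random stand-ins for the location and shape features described above, so only the SVM wiring is meaningful.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        # 200 face samples x 12 geometric features (e.g. brow, eye and mouth measurements)
        X = rng.normal(size=(200, 12))
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)  # 1 = pain

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
        print("held-out accuracy: %.2f" % clf.score(X_te, y_te))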

  2. Automatic color preference correction for color reproduction

    Science.gov (United States)

    Tsukada, Masato; Funayama, Chisato; Tajima, Johji

    2000-12-01

    The reproduction of natural objects in color images has attracted a great deal of attention. Reproducing more pleasing colors of natural objects is one way to improve image quality. We developed an automatic color correction method to maintain preferred color reproduction for three significant categories: facial skin color, green grass and blue sky. In this method, a representative color in an object area to be corrected is automatically extracted from an input image, and a set of color correction parameters is selected depending on that representative color. The improvement in image quality for reproductions of natural images was more than 93 percent in subjective experiments. These results show the usefulness of our automatic color correction method for the reproduction of preferred colors.

  3. Automatic generation of natural language nursing shift summaries in neonatal intensive care: BT-Nurse.

    Science.gov (United States)

    Hunter, James; Freer, Yvonne; Gatt, Albert; Reiter, Ehud; Sripada, Somayajulu; Sykes, Cindy

    2012-11-01

    Our objective was to determine whether and how a computer system could automatically generate helpful natural language nursing shift summaries solely from an electronic patient record system, in a neonatal intensive care unit (NICU). A system was developed which automatically generates partial NICU shift summaries (for the respiratory and cardiovascular systems), using data-to-text technology. It was evaluated for 2 months in the NICU at the Royal Infirmary of Edinburgh, under supervision. In an on-ward evaluation, a substantial majority of the summaries was found by outgoing and incoming nurses to be understandable (90%), and a majority was found to be accurate (70%), and helpful (59%). The evaluation also served to identify some outstanding issues, especially with regard to extra content the nurses wanted to see in the computer-generated summaries. It is technically possible automatically to generate limited natural language NICU shift summaries from an electronic patient record. However, it proved difficult to handle electronic data that was intended primarily for display to the medical staff, and considerable engineering effort would be required to create a deployable system from our proof-of-concept software. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Automatic weld torch guidance control system

    Science.gov (United States)

    Smaith, H. E.; Wall, W. A.; Burns, M. R., Jr.

    1982-01-01

    A highly reliable, fully digital, closed-circuit television optical type of automatic weld seam tracking control system was developed. This automatic tracking equipment is used to reduce weld tooling costs and increase overall automatic welding reliability. The system utilizes a charge injection device digital camera which has 60,512 individual pixels as the light-sensing elements. Through conventional scanning means, each pixel in the focal plane is sequentially scanned, the light-level signal digitized, and an 8-bit word transmitted to scratch-pad memory. From memory, the microprocessor analyzes the digital signal and computes the tracking error. Lastly, the corrective signal is transmitted to a cross-seam actuator digital drive motor controller to complete the closed-loop feedback tracking system. This weld seam tracking control system is capable of a tracking accuracy of ±0.2 mm or better. As configured, the system is applicable to square butt, V-groove, and lap joint weldments.
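
    A rough sketch of the closed-loop idea described above, assuming the seam appears as an intensity minimum in a single camera scan line and a simple proportional correction drives the cross-seam actuator; the pixel pitch, gain and clipping values are invented for illustration and are not the system's actual parameters.

```python
import numpy as np

def seam_error(scan_line, pixel_pitch_mm=0.05):
    """Locate the seam as the intensity minimum of one camera scan line and
    return the cross-seam error (mm) relative to the line centre."""
    seam_px = int(np.argmin(scan_line))          # dark gap between plates (assumed)
    centre_px = len(scan_line) // 2
    return (seam_px - centre_px) * pixel_pitch_mm

def correction_step(scan_line, k_p=0.8, max_step_mm=0.2):
    """Proportional correction for the cross-seam actuator, clipped to an
    assumed per-cycle actuator travel."""
    err = seam_error(scan_line)
    return float(np.clip(-k_p * err, -max_step_mm, max_step_mm))

# Example: seam 10 pixels left of centre -> positive corrective move.
line = np.ones(256)
line[118] = 0.0
print(correction_step(line))
```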

  5. FRICTION - WELDING MACHINE AUTOMATIC CONTROL CIRCUIT DESIGN AND APPLICATION

    OpenAIRE

    Hakan ATEŞ; Ramazan BAYINDIR

    2003-01-01

    In this work, automatic controllability of a laboratory-sized friction-welding machine has been investigated. The laboratory-sized friction-welding machine was composed of motor, brake, rotary and constant samples late pliers, and hydraulic unit. In automatic method, welding parameters such as friction time, friction pressure, forge time and forge pressure can be applied sensitively using time relays and contactors. At the end of the experimental study it's observed that automatic control sys...

  6. Microcomputer-based systems for automatic control of sample irradiation and chemical analysis of short-lived isotopes

    International Nuclear Information System (INIS)

    Bourret, S.C.

    1974-01-01

    Two systems resulted from the need for the study of the nuclear decay of short-lived radionuclides. Automation was required for better repeatability, speed of chemical separation after irradiation and for protection from the high radiation fields of the samples. A MCS-8 computer was used as the nucleus of the automatic sample irradiation system because the control system required an extensive multiple-sequential circuit. This approach reduced the sequential problem to a computer program. The automatic chemistry control system is a mixture of a fixed and a computer-based programmable control system. The fixed control receives the irradiated liquid sample from the reactor, extracts the liquid and disposes of the used sample container. The programmable control executes the chemistry program that the user has entered through the teletype. (U.S.)

  7. AUTOMATIC DETECTION AND CLASSIFICATION OF RETINAL VASCULAR LANDMARKS

    Directory of Open Access Journals (Sweden)

    Hadi Hamad

    2014-06-01

    Full Text Available The main contribution of this paper is introducing a method to distinguish between different landmarks of the retina: bifurcations and crossings. The methodology may help in differentiating between arteries and veins and is also useful in identifying diseases and other special pathologies. The method does not need any special skills, thus it can be assimilated to an automatic way of pinpointing landmarks; moreover, it gives good responses for very small vessels. A skeletonized representation, taken from the segmented binary image (obtained through a preprocessing step), is used to identify pixels with three or more neighbors. Then, the junction points are classified into bifurcations or crossovers depending on their geometrical and topological properties such as width, direction and connectivity of the surrounding segments. The proposed approach is applied to the public-domain DRIVE and STARE datasets and compared with the state-of-the-art methods using proper validation parameters. The method was successful in identifying the majority of the landmarks: for bifurcations, the average recall and precision over the DRIVE and STARE datasets are 95.4% and 87.1%, respectively, and for crossovers they are 87.6% and 90.5%, respectively, thus outperforming other studies.
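
    The junction-detection step (pixels with three or more skeleton neighbours) can be sketched as below; this is a generic neighbour-count implementation and does not include the paper's subsequent bifurcation/crossover classification.

```python
import numpy as np
from scipy.ndimage import convolve

def junction_points(skeleton):
    """Return (row, col) coordinates of skeleton pixels with three or more
    8-connected neighbours (candidate bifurcations/crossovers)."""
    skel = (skeleton > 0).astype(np.uint8)
    kernel = np.array([[1, 1, 1],
                       [1, 0, 1],
                       [1, 1, 1]], np.uint8)       # counts the 8 neighbours
    n_neigh = convolve(skel, kernel, mode="constant", cval=0)
    return np.argwhere((skel == 1) & (n_neigh >= 3))

# Tiny synthetic example: a vertical vessel with a branch to the right.
skel = np.zeros((7, 7), np.uint8)
skel[1:6, 3] = 1       # main vessel
skel[3, 4:6] = 1       # branch
print(junction_points(skel))   # the branching pixel(s) around (3, 3)
```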

  8. Dynamic Analysis of a Pendulum Dynamic Automatic Balancer

    Directory of Open Access Journals (Sweden)

    Jin-Seung Sohn

    2007-01-01

    Full Text Available The automatic dynamic balancer is a device to reduce the vibration caused by the unbalanced mass of rotors. Instead of the prevailing ball-type automatic dynamic balancer, a pendulum automatic dynamic balancer is analyzed here. For the analysis of dynamic stability and behavior, the nonlinear equations of motion for the system are derived in polar coordinates using Lagrange's equations. The perturbation method is applied to investigate the dynamic behavior of the system around the equilibrium position. Based on the linearized equations, the dynamic stability of the system around the equilibrium positions is investigated by eigenvalue analysis.

  9. Automatic Detection of Wild-type Mouse Cranial Sutures

    DEFF Research Database (Denmark)

    Ólafsdóttir, Hildur; Darvann, Tron Andre; Hermann, Nuno V.

    , automatic detection of the cranial sutures becomes important. We have previously built a craniofacial, wild-type mouse atlas from a set of 10 Micro CT scans using a B-spline-based nonrigid registration method by Rueckert et al. Subsequently, all volumes were registered nonrigidly to the atlas. Using......, the observer traced the sutures on each of the mouse volumes as well. The observer outperforms the automatic approach by approximately 0.1 mm. All mice have similar errors while the suture error plots reveal that suture 1 and 2 are cumbersome, both for the observer and the automatic approach. These sutures can...

  10. Automatic Test Systems Aquisition

    National Research Council Canada - National Science Library

    1994-01-01

    We are providing this final memorandum report for your information and use. This report discusses the efforts to achieve commonality in standards among the Military Departments as part of the DoD policy for automatic test systems (ATS...

  11. Automatic Classification of the Sub-Techniques (Gears Used in Cross-Country Ski Skating Employing a Mobile Phone

    Directory of Open Access Journals (Sweden)

    Thomas Stöggl

    2014-10-01

    Full Text Available The purpose of the current study was to develop and validate an automatic algorithm for classification of cross-country (XC) ski-skating gears (G) using Smartphone accelerometer data. Eleven XC skiers (seven men, four women) with regional-to-international levels of performance carried out roller skiing trials on a treadmill using fixed gears (G2left, G2right, G3, G4left, G4right) and a 950-m trial using different speeds and inclines, applying gears and sides as they normally would. Gear classification by the Smartphone (on the chest) and based on video recordings were compared. For machine learning, a collective database was compared to individual data. The Smartphone application identified the trials with fixed gears correctly in all cases. In the 950-m trial, participants executed 140 ± 22 cycles as assessed by video analysis, with the automatic Smartphone application giving a similar value. Based on collective data, gears were identified correctly 86.0% ± 8.9% of the time, a value that rose to 90.3% ± 4.1% (P < 0.01) with machine learning from individual data. Classification was most often incorrect during transition between gears, especially to or from G3. Identification was most often correct for skiers who made relatively few transitions between gears. The accuracy of the automatic procedure for identifying G2left, G2right, G3, G4left and G4right was 96%, 90%, 81%, 88% and 94%, respectively. The algorithm identified gears correctly 100% of the time when a single gear was used and 90% of the time when different gears were employed during a variable protocol. This algorithm could be improved with respect to identification of transitions between gears or the side employed within a given gear.

  12. Automatic Classification of the Sub-Techniques (Gears) Used in Cross-Country Ski Skating Employing a Mobile Phone

    Science.gov (United States)

    Stöggl, Thomas; Holst, Anders; Jonasson, Arndt; Andersson, Erik; Wunsch, Tobias; Norström, Christer; Holmberg, Hans-Christer

    2014-01-01

    The purpose of the current study was to develop and validate an automatic algorithm for classification of cross-country (XC) ski-skating gears (G) using Smartphone accelerometer data. Eleven XC skiers (seven men, four women) with regional-to-international levels of performance carried out roller skiing trials on a treadmill using fixed gears (G2left, G2right, G3, G4left, G4right) and a 950-m trial using different speeds and inclines, applying gears and sides as they normally would. Gear classification by the Smartphone (on the chest) and based on video recordings were compared. For machine learning, a collective database was compared to individual data. The Smartphone application identified the trials with fixed gears correctly in all cases. In the 950-m trial, participants executed 140 ± 22 cycles as assessed by video analysis, with the automatic Smartphone application giving a similar value. Based on collective data, gears were identified correctly 86.0% ± 8.9% of the time, a value that rose to 90.3% ± 4.1% (P < 0.01) with machine learning from individual data. Classification was most often incorrect during transition between gears, especially to or from G3. Identification was most often correct for skiers who made relatively few transitions between gears. The accuracy of the automatic procedure for identifying G2left, G2right, G3, G4left and G4right was 96%, 90%, 81%, 88% and 94%, respectively. The algorithm identified gears correctly 100% of the time when a single gear was used and 90% of the time when different gears were employed during a variable protocol. This algorithm could be improved with respect to identification of transitions between gears or the side employed within a given gear. PMID:25365459
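
    A hedged sketch of the general pipeline (windowed accelerometer features plus a generic classifier); the window length, feature choices and the random-forest model are stand-ins, since the abstract does not specify the exact machine-learning method used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def window_features(acc_xyz, fs=50, win_s=2.0):
    """Cut a 3-axis accelerometer stream into windows and compute simple
    per-axis statistics (mean, std, dominant frequency) for each window."""
    win = int(fs * win_s)
    feats = []
    for start in range(0, len(acc_xyz) - win + 1, win):
        seg = acc_xyz[start:start + win]
        spec = np.abs(np.fft.rfft(seg, axis=0))
        dom_freq = np.fft.rfftfreq(win, 1.0 / fs)[np.argmax(spec[1:], axis=0) + 1]
        feats.append(np.hstack([seg.mean(0), seg.std(0), dom_freq]))
    return np.array(feats)

# Placeholder data: acc is (n_samples, 3); labels name one gear per window.
rng = np.random.default_rng(1)
acc = rng.normal(size=(50 * 60, 3))
X = window_features(acc)
gears = ["G2left", "G2right", "G3", "G4left", "G4right"]
y = np.resize(gears, len(X))

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())
```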

  13. Patients with schizophrenia do not preserve automatic grouping when mentally re-grouping figures: shedding light on an ignored difficulty

    Directory of Open Access Journals (Sweden)

    Anne eGiersch

    2012-08-01

    Full Text Available Looking at a pair of objects is easy when automatic grouping mechanisms bind these objects together, but visual exploration can also be more flexible. It is possible to mentally ‘re-group’ two objects that are not only separate but belong to different pairs of objects. ‘Re-grouping’ is in conflict with automatic grouping, since it entails a separation of each item from the set it belongs to. This ability appears to be impaired in patients with schizophrenia. Here we check if this impairment is selective, which would suggest a dissociation between grouping and ‘re-grouping’, or if it impacts on usual, automatic grouping, which would call for a better understanding of the interactions between automatic grouping and ‘re-grouping’. Sixteen outpatients with schizophrenia and healthy controls had to identify two identical and contiguous target figures within a display of circles and squares alternating around a fixation point. Eye-tracking was used to check central fixation. The target pair could be located in the same or separate hemifields. Identical figures were grouped by a connector (grouped automatically) or not (to be re-grouped). Attention modulation of automatic grouping was tested by manipulating the proportion of connected and unconnected targets, thus prompting subjects to focalize on either connected or unconnected pairs. Both groups were sensitive to automatic grouping in most conditions, but patients were unusually slowed down for connected targets while focalizing on unconnected pairs. In addition, this unusual effect occurred only when targets were presented within the same hemifield. Patients and controls differed on this asymmetry between within- and across-hemifield presentation, suggesting that patients with schizophrenia do not re-group figures in the same way as controls do. We discuss possible implications for how ‘re-grouping’ ties in with ongoing, automatic perception in healthy volunteers.

  14. Children’s Behavioral Pain Cues: Implicit Automaticity and Control Dimensions in Observational Measures

    Directory of Open Access Journals (Sweden)

    Kamal Kaur Sekhon

    2017-01-01

    Full Text Available Some pain behaviors appear to be automatic, reflexive manifestations of pain, whereas others present as voluntarily controlled. This project examined whether this distinction would characterize pain cues used in observational pain measures for children aged 4–12. To develop a comprehensive list of cues, a systematic literature search of studies describing development of children’s observational pain assessment tools was conducted using MEDLINE, PsycINFO, and Web of Science. Twenty-one articles satisfied the criteria. A total of 66 nonredundant pain behavior items were identified. To determine whether items would be perceived as automatic or controlled, 277 research participants rated each on multiple scales associated with the distinction. Factor analyses yielded three major factors: the “Automatic” factor included items related to facial expression, paralinguistics, and consolability; the “Controlled” factor included items related to intentional movements, verbalizations, and social actions; and the “Ambiguous” factor included items related to voluntary facial expressions. Pain behaviors in observational pain scales for children can be characterized as automatic, controlled, and ambiguous, supporting a dual-processing, neuroregulatory model of pain expression. These dimensions would be expected to influence judgments of the nature and severity of pain being experienced and the extent to which the child is attempting to control the social environment.

  15. Automatic Generation System of Multiple-Choice Cloze Questions and its Evaluation

    Directory of Open Access Journals (Sweden)

    Takuya Goto

    2010-09-01

    Full Text Available Since English expressions vary by genre, it is important for students to study questions generated from sentences of the target genre. Although various questions have been prepared, they are still not enough to cover the various genres students want to learn. On the other hand, producing English questions requires sufficient grammatical knowledge and vocabulary, so it is difficult for non-experts to prepare English questions by themselves. In this paper, we propose an automatic generation system for multiple-choice cloze questions from English texts. Empirical knowledge is necessary to produce appropriate questions, so machine learning is introduced to acquire that knowledge from existing questions. To generate the questions from texts automatically, the system (1) extracts appropriate sentences for questions from texts based on Preference Learning, (2) estimates the blank part based on a Conditional Random Field, and (3) generates distracters based on statistical patterns of existing questions. Experimental results show our method works well for selecting appropriate sentences and blank parts. Moreover, our method is suitable for generating usable distracters, especially for sentences that do not contain proper nouns.
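
    As a highly simplified stand-in for step (3) only, the sketch below picks distracters whose corpus frequency is closest to that of the answer word; the paper's Preference Learning and Conditional Random Field components are not reproduced, and the corpus and heuristic are invented for illustration.

```python
from collections import Counter

def frequency_distractors(answer, corpus_tokens, n=3):
    """Simplified stand-in for distractor generation: pick words whose corpus
    frequency is closest to the answer's, excluding the answer itself."""
    freq = Counter(w.lower() for w in corpus_tokens)
    target = freq[answer.lower()]
    candidates = [(abs(c - target), w) for w, c in freq.items()
                  if w != answer.lower() and len(w) > 2]
    return [w for _, w in sorted(candidates)[:n]]

corpus = ("the committee will review the proposal and approve the budget "
          "before the board can release the funds and review the plan").split()
print(frequency_distractors("approve", corpus))
```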

  16. Research on wireless communication technology based on automatic logistics system of welder

    Directory of Open Access Journals (Sweden)

    Sun Xuan

    2018-01-01

    Full Text Available In order to meet the requirements of high real-time performance and high stability of data transmission in an automatic welding system, the RTU data format and a real-time communication mechanism are adopted in this system. In the automatic logistics system, Ethernet and wireless Wi-Fi technology link the palletizer, the stacker and the AGV cart so that the palletizer automatically picks up the goods, the AGV cart automatically delivers them, and the stacker automatically moves them in and out of the three-dimensional warehouse.

  17. Research on wireless communication technology based on automatic logistics system of welder

    OpenAIRE

    Sun Xuan; Wang Zhi-yong; Ma Zhe-dong

    2018-01-01

    In order to meet the requirements of high real-time performance and high stability of data transmission in an automatic welding system, the RTU data format and a real-time communication mechanism are adopted in this system. In the automatic logistics system, Ethernet and wireless Wi-Fi technology link the palletizer, the stacker and the AGV cart so that the palletizer automatically picks up the goods, the AGV cart automatically delivers them, and the stacker automatically moves them in and out of the three-dimensional warehouse.

  18. Development of automatic pipe welder for nuclear power plant

    International Nuclear Information System (INIS)

    Iwamoto, Taro; Ando, Shimon; Omae, Tsutomu; Ito, Yoshitoshi; Araya, Takeshi.

    1978-01-01

    Numerous pipings are installed in nuclear power plants, and the reliability of these pipings is, of course, very important for preserving the safety of the plants. These pipings undergo periodic inspection yearly, and when defects are found or reconstructions to improved systems are made, field welding in the plants is required. When the places to be welded are inside containment vessels, the work must be carried out in a radiation environment. To maintain the highest quality of welding and to reduce the radiation exposure of workers, many skilled workers are required. This automatic pipe welder was developed to solve these problems, with the aim of carrying out welding work by remote control from safe places outside the containment vessels. In particular, in order to obtain the highest welding quality, it was not fully automated; instead, a man-machine system was adopted so that the delicate judgment of the workers could still be utilized. Visual and contact detection systems for monitoring the welding work, a remote control system, computer control, a light, small and easily installed welding head, grinding and ultrasonic flaw detection equipment, a transistor-switching power source, air cooling equipment, and a function for setting welding conditions according to an algorithm were added to the welding machine. The outline and main components of this automatic pipe welder are explained. (Kako, I.)

  19. An Automatic Framework Using Space-Time Processing and TR-MUSIC for Subsurface and Through-Wall Multitarget Imaging

    Directory of Open Access Journals (Sweden)

    Si-hao Tan

    2012-01-01

    Full Text Available We present an automatic framework combining space-time signal processing with Time Reversal electromagnetic (EM) inversion for subsurface and through-wall multitarget imaging using electromagnetic waves. This framework is composed of a frequency-wavenumber (FK) filter to suppress the direct wave and medium bounce, an FK migration algorithm to automatically estimate the number of targets and identify target regions, which can be used to reduce the computational complexity of the subsequent imaging algorithm, and an EM inversion algorithm using Time Reversal Multiple Signal Classification (TR-MUSIC) to reconstruct hidden objects. The feasibility of the framework is demonstrated with simulated data generated by GPRMAX.
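
    A minimal sketch of the first stage only: removing laterally invariant energy (direct wave, medium bounce) by zeroing small wavenumbers in the f-k domain. The cutoff and the synthetic B-scan are assumptions, and the FK migration and TR-MUSIC stages are not shown.

```python
import numpy as np

def fk_dereverb(bscan, dx, k_cut_cycles_per_m=0.5):
    """Suppress laterally invariant energy (direct wave, medium/wall bounce)
    in a B-scan of shape (n_time, n_traces) by zeroing small |k| in the
    frequency-wavenumber (f-k) domain."""
    F = np.fft.fft2(bscan)
    k = np.fft.fftfreq(bscan.shape[1], dx)          # spatial wavenumber along traces
    mask = (np.abs(k) >= k_cut_cycles_per_m)[None, :]
    return np.real(np.fft.ifft2(F * mask))

# Synthetic check: a flat (trace-invariant) arrival plus one localized target.
nt, nx = 256, 64
data = np.zeros((nt, nx))
data[60, :] = 1.0                     # "direct wave": same time on every trace
data[150, 30] = 1.0                   # point-like target response
out = fk_dereverb(data, dx=0.02)
print(np.abs(out[60]).max(), np.abs(out[150, 30]))   # flat event heavily attenuated
```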

  20. Automatic sweep circuit

    Science.gov (United States)

    Keefe, Donald J.

    1980-01-01

    An automatically sweeping circuit for searching for an evoked response in an output signal in time with respect to a trigger input. Digital counters are used to activate a detector at precise intervals, and monitoring is repeated for statistical accuracy. If the response is not found then a different time window is examined until the signal is found.

  1. Bottom-up communication. Identifying opportunities and limitations through an exploratory field-based evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.; Irvine, K.N. [Institute of Energy and Sustainable Development, De Montfort University, Leicester, LE1 9BH (United Kingdom)

    2013-02-15

    Communication to promote behaviours like energy saving can use significant resources. What is less clear is the comparative value of different approaches available to communicators. While it is generally agreed that 'bottom-up' approaches, where individuals are actively involved rather than passive, are preferable to 'top-down' authority-led projects, there is a dearth of evidence that verifies why this should be. Additionally, while the literature has examined the mechanics of the different approaches, there has been less attention paid to the associated psychological implications. This paper reports on an exploratory comparative study that examined the effects of six distinct communication activities. The activities used different communication approaches, some participative and others more top-down informational. Two theories, from behavioural studies and communication, were used to identify key variables for consideration in this field-based evaluation. The evaluation aimed to assess not just which activity might be most successful, as this has limited generalisability, but to also gain insight into what psychological impacts might contribute to success. Analysis found support for the general hypothesis that bottom-up approaches have more impact on behaviour change than top-down. The study also identified that, in this instance, the difference in reported behaviour across the activities related partly to the extent to which intentions to change behaviour were implemented. One possible explanation for the difference in reported behaviour change across the activities is that a bottom-up approach may offer a supportive environment where participants can discuss progress with like-minded individuals. A further possible explanation is that despite controlling for intention at an individual level, the pre-existence of strong intentions may have an effect on group success. These suggestive findings point toward the critical need for additional and larger-scale studies

  2. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated, a methodology was developed to automate each production phase (generation of hydrological terrain models with a 2-meter grid size, and river network extraction combining hydrographic criteria, i.e. the topographic network, with hydrological criteria, i.e. the flow-accumulation river network), and finally the production was launched. The key points of this work have been managing a big-data environment of more than 160,000 LiDAR data files, with infrastructure to store (up to 40 TB between results and intermediate files) and process them using local virtualization and Amazon Web Services (AWS), which made it possible to complete this automatic production within 6 months; software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and human resources management were also important. The result of this production has been an accurate automatic river network extraction for the whole country with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.
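
    The flow-accumulation part of the hydrological criterion can be illustrated with a toy D8 scheme, sketched below under the assumption of a small, pit-free elevation grid; the production system itself relies on the commercial tools listed above, not on this sketch.

```python
import numpy as np

def d8_flow_accumulation(dem):
    """Toy D8 flow accumulation: each cell drains to its steepest downslope
    8-neighbour; cells are processed from highest to lowest elevation."""
    rows, cols = dem.shape
    acc = np.ones_like(dem, dtype=float)          # each cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]      # highest elevation first
    neigh = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]
    for idx in order:
        r, c = divmod(int(idx), cols)
        best, drop_max = None, 0.0
        for dr, dc in neigh:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                if drop > drop_max:
                    best, drop_max = (rr, cc), drop
        if best is not None:                      # pits and edge cells keep their water
            acc[best] += acc[r, c]
    return acc

# A tilted plane with a small central valley: accumulation grows downslope.
dem = np.add.outer(np.arange(6, 0, -1), np.zeros(5)) + 0.1 * np.abs(np.arange(5) - 2)
print(d8_flow_accumulation(dem))
```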

  3. AUTOMATIC RIVER NETWORK EXTRACTION FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    E. N. Maderal

    2016-06-01

    Full Text Available The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated, a methodology was developed to automate each production phase (generation of hydrological terrain models with a 2-meter grid size, and river network extraction combining hydrographic criteria, i.e. the topographic network, with hydrological criteria, i.e. the flow-accumulation river network), and finally the production was launched. The key points of this work have been managing a big-data environment of more than 160,000 LiDAR data files, with infrastructure to store (up to 40 TB between results and intermediate files) and process them using local virtualization and Amazon Web Services (AWS), which made it possible to complete this automatic production within 6 months; software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and human resources management were also important. The result of this production has been an accurate automatic river network extraction for the whole country with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.

  4. Automatic re-contouring in 4D radiotherapy

    International Nuclear Information System (INIS)

    Lu, Weiguo; Olivera, Gustavo H; Chen, Quan; Chen, Ming-Li; Ruchala, Kenneth J

    2006-01-01

    Delineating regions of interest (ROIs) on each phase of four-dimensional (4D) computed tomography (CT) images is an essential step for 4D radiotherapy. The requirement of manual phase-by-phase contouring prohibits the routine use of 4D radiotherapy. This paper develops an automatic re-contouring algorithm that combines techniques of deformable registration and surface construction. ROIs are manually contoured slice-by-slice in the reference phase image. A reference surface is constructed based on these reference contours using a triangulated surface construction technique. The deformable registration technique provides the voxel-to-voxel mapping between the reference phase and the test phase. The vertices of the reference surface are displaced in accordance with the deformation map, resulting in a deformed surface. The new contours are reconstructed by cutting the deformed surface slice-by-slice along the transversal, sagittal or coronal direction. Since both the inputs and outputs of our automatic re-contouring algorithm are contours, it is relatively easy to cope with any treatment planning system. We tested our automatic re-contouring algorithm using a deformable phantom and 4D CT images of six lung cancer patients. The proposed algorithm is validated by visual inspections and quantitative comparisons of the automatic re-contours with both the gold standard segmentations and the manual contours. Based on the automatic delineated ROIs, changes of tumour and sensitive structures during respiration are quantitatively analysed. This algorithm could also be used to re-contour daily images for treatment evaluation and adaptive radiotherapy
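
    A minimal sketch of the vertex-displacement step, assuming the deformable registration has already produced a dense per-voxel displacement field; the field values, grid size and vertex coordinates below are placeholders, and the registration and surface-construction steps themselves are not shown.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def deform_vertices(vertices_vox, disp_z, disp_y, disp_x):
    """Displace surface vertices (N x 3, voxel coordinates ordered z, y, x)
    by a dense per-voxel displacement field from deformable registration."""
    coords = vertices_vox.T                                   # shape (3, N)
    dz = map_coordinates(disp_z, coords, order=1)             # trilinear sampling
    dy = map_coordinates(disp_y, coords, order=1)
    dx = map_coordinates(disp_x, coords, order=1)
    return vertices_vox + np.stack([dz, dy, dx], axis=1)

# Placeholder field: a uniform 2-voxel shift along y over a 64^3 grid.
shape = (64, 64, 64)
disp_z = np.zeros(shape)
disp_y = np.full(shape, 2.0)
disp_x = np.zeros(shape)
verts = np.array([[32.0, 20.5, 10.0], [33.0, 21.0, 11.5]])
print(deform_vertices(verts, disp_z, disp_y, disp_x))
```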

  5. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    Directory of Open Access Journals (Sweden)

    Raymond Mui

    2010-09-01

    Full Text Available Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and program transformation is used to automatically instrument web applications with sanitization code. We have implemented this technique in a tool named ASSIST (Automatic and Static SQL Injection Sanitization Tool) for protecting Java-based web applications. Our experimental evaluation showed that our technique is effective against SQL injection vulnerabilities and has a low overhead.
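
    ASSIST itself instruments Java web applications through static analysis; the effect of query sanitization can nevertheless be illustrated with a small Python/sqlite3 example contrasting string concatenation with a parameterized query (the table, data and payload below are invented for illustration).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

user_input = "alice' OR '1'='1"     # classic injection payload

# Vulnerable pattern: the payload becomes part of the SQL text.
vulnerable = "SELECT role FROM users WHERE name = '%s'" % user_input
print(conn.execute(vulnerable).fetchall())            # returns every row

# Sanitized pattern: the payload is bound as data, not as SQL.
safe = "SELECT role FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())   # returns nothing
```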

  6. Automatic contact in DYNA3D for vehicle crashworthiness

    International Nuclear Information System (INIS)

    Whirley, R.G.; Engelmann, B.E.

    1994-01-01

    This paper presents a new formulation for the automatic definition and treatment of mechanical contact in explicit, nonlinear, finite element analysis. Automatic contact offers the benefits of significantly reduced model construction time and fewer opportunities for user error, but faces significant challenges in reliability and computational costs. The authors have used a new four-step automatic contact algorithm. Key aspects of the proposed method include (1) automatic identification of adjacent and opposite surfaces in the global search phase, and (2) the use of a smoothly varying surface normal that allows a consistent treatment of shell intersection and corner contact conditions without ad hoc rules. Three examples are given to illustrate the performance of the newly proposed algorithm in the public DYNA3D code

  7. Automatic reel controls filler wire in welding machines

    Science.gov (United States)

    Millett, A. V.

    1966-01-01

    Automatic reel on automatic welding equipment takes up slack in the reel-fed filler wire when welding operation is terminated. The reel maintains constant, adjustable tension on the wire during the welding operation and rewinds the wire from the wire feed unit when the welding is completed.

  8. Remote Sensing and GIS as Tools for Identifying Risk for Phreatomagmatic Eruptions in the Bishoftu Volcanic Field, Ethiopia

    Science.gov (United States)

    Pennington, H. G.; Graettinger, A.

    2017-12-01

    Bishoftu is a fast-growing town in the Oromia region of Ethiopia, located 47 km southeast of the nation's capital, Addis Ababa. It is situated atop a monogenetic basaltic volcanic field, called the Bishoftu Volcanic Field (BVF), which is composed of maar craters, scoria cones, lava flows, and rhyolite domes. Although not well dated, the morphology and archeological evidence have been used to infer a Holocene age, indicating that the community is exposed to continued volcanic risk. The presence of phreatomagmatic constructs in particular indicates that the hazards are not only vent-localized, but may have far reaching impacts. Hazard mapping is an essential tool for evaluating and communicating risks. This study presents the results of GIS analyses of proximal and distal syn-eruptive hazards associated with phreatomagmatic eruptions in the BVF. A digitized infrastructure map based on a SPOT 6 satellite image is used to identify the areas at risk from eruption scenarios. Parameters such as wind direction, vent location, and explosion energy are varied for hazard simulations to quantify the area impacted by different eruption scenarios. Proximal syn-eruptive hazards include tephra fall, base pyroclastic surges, and ballistic bombs. Distal hazards include predominantly ash fall. Eruption scenarios are simulated using Eject and Plumeria models as well as similar case studies from other urban volcanic fields. Within 5 km of the volcanic field center, more than 30 km2 of residential and commercial/industrial infrastructure will be damaged by proximal syn-eruptive hazards, in addition to 34 km2 of agricultural land, 291 km of roads, more than 10 km of railway, an airport, and two health centers. Within 100 km of the volcanic field center, ash fall will affect 3946 km2 of agricultural land, 179 km2 of residential land, and 28 km2 of commercial/industrial land. Approximately 2700 km of roads and railways, 553 km of waterways, an airport, and 14 health centers are located

  9. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic association analysis between the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) automating the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through a search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different
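
    The kind of per-SNP association test such a tool runs can be illustrated with a 2x2 Fisher's exact test; the genotype/response counts below are invented for illustration and are not data from the paper.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table for one DMET SNP: rows = genotype carries the variant
# allele (yes/no), columns = patient responded to the drug (yes/no).
table = [[18,  7],    # variant carriers: 18 responders, 7 non-responders
         [11, 24]]    # non-carriers:     11 responders, 24 non-responders

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print("OR = %.2f, p = %.4f" % (odds_ratio, p_value))
```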

  10. Automatic quantitative analysis of liver functions by a computer system

    International Nuclear Information System (INIS)

    Shinpo, Takako

    1984-01-01

    In the previous paper, we confirmed the clinical usefulness of hepatic clearance (hepatic blood flow), which is given by the hepatic uptake and blood disappearance rate coefficients. These were obtained from the initial slope index of each minute during a period of five frames of a hepatogram after injecting 37 MBq of sup(99m)Tc-Sn-colloid. To analyze the information simply, rapidly and accurately, we developed an automatic quantitative analysis for liver functions. Information was obtained every quarter minute during a period of 60 frames of the sequential image. The sequential counts were measured for the heart, the whole liver, and the left and right lobes using a computer connected to a scintillation camera. We measured the effective hepatic blood flow as the disappearance rate multiplied by the percentage of hepatic uptake, i.e. (liver counts)/(total counts of the field). Our method of analysis automatically recorded the disappearance curve and the uptake curve on the basis of the heart and the whole liver, respectively, and computed the results using the BASIC language. This method makes it possible to obtain the image of the initial uptake of sup(99m)Tc-Sn-colloid into the liver with a small dose of it. (author)
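
    A minimal sketch of the quantities described (an uptake fraction and an initial-slope disappearance rate combined into an effective hepatic blood flow index), with placeholder time-activity curves; it is not the authors' BASIC implementation, and the curve shapes and sampling interval are assumptions.

```python
import numpy as np

# Early time-activity curves (counts per frame); values are placeholders.
t = np.arange(0, 5.0, 0.25)                      # minutes, one frame per 15 s
heart = 1000 * np.exp(-0.12 * t) + 50            # blood-pool (heart) curve
liver = 400 * (1 - np.exp(-0.30 * t)) + 20       # whole-liver uptake curve
total_field = 2500 * np.ones_like(t)             # total counts in the field

uptake_fraction = liver[-1] / total_field[-1]    # (liver counts)/(total counts)
# Initial-slope disappearance rate from the log of the blood-pool curve.
k_disappear = -np.polyfit(t[:8], np.log(heart[:8]), 1)[0]

effective_hepatic_blood_flow = k_disappear * uptake_fraction
print(k_disappear, uptake_fraction, effective_hepatic_blood_flow)
```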

  11. Automatic Vetting for Malice in Android Platforms

    Science.gov (United States)

    2016-05-01

    Automatic Vetting for Malice in Android Platforms. Final technical report, Iowa State University, May 2016; period covered December 2013 - December 2015; contract number FA8750-14-2. The available record text consists of report front matter and a cited news item, "Android Apps from Play Store Infected with Brain Test Malware" (http://www.ibtimes.co.uk/google-removes-13-android-apps-play-store-infected-brain-test...).

  12. Automatic penalty continuation in structural topology optimization

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana; Stolpe, Mathias

    2015-01-01

    this issue is addressed. We propose an automatic continuation method, where the material penalization parameter is included as a new variable in the problem and a constraint guarantees that the requested penalty is eventually reached. The numerical results suggest that this approach is an appealing...... alternative to continuation methods. Automatic continuation also generally obtains better designs than the classical formulation using a reduced number of iterations....

  13. Automatic first-break picking using the instantaneous traveltime attribute

    KAUST Repository

    Saragiotis, Christos

    2012-01-01

    Picking the first breaks is an important step in seismic processing. The large volume of the seismic data calls for automatic and objective picking. We introduce a new automatic first-break picker, which uses specifically designed time windows and an iterative procedure based on the instantaneous traveltime attribute. The method is fast as it only uses a few FFTs per trace. We demonstrate the effectiveness of this automatic method by applying it to real test data.

  14. Real-Time FPGA-Based Object Tracker with Automatic Pan-Tilt Features for Smart Video Surveillance Systems

    Directory of Open Access Journals (Sweden)

    Sanjay Singh

    2017-05-01

    Full Text Available The design of smart video surveillance systems is an active research field among the computer vision community because of their ability to perform automatic scene analysis by selecting and tracking the objects of interest. In this paper, we present the design and implementation of an FPGA-based standalone working prototype system for real-time tracking of an object of interest in live video streams for such systems. In addition to real-time tracking of the object of interest, the implemented system is also capable of providing purposive automatic camera movement (pan-tilt) in the direction determined by movement of the tracked object. The complete system, including camera interface, DDR2 external memory interface controller, designed object tracking VLSI architecture, camera movement controller and display interface, has been implemented on the Xilinx ML510 (Virtex-5 FX130T) FPGA Board. Our proposed, designed and implemented system robustly tracks the target object present in the scene in real time for standard PAL (720 × 576) resolution color video and automatically controls camera movement in the direction determined by the movement of the tracked object.

  15. Identifying the Tunneling Site in Strong-Field Ionization of H_{2}^{+}.

    Science.gov (United States)

    Liu, Kunlong; Barth, Ingo

    2017-12-15

    The tunneling site of the electron in a molecule exposed to a strong laser field determines the initial position of the ionizing electron and, as a result, has a large impact on the subsequent ultrafast electron dynamics on the polyatomic Coulomb potential. Here, the tunneling site of the electron of H_{2}^{+} ionized by a strong circularly polarized (CP) laser pulse is studied by numerically solving the time-dependent Schrödinger equation. We show that the electron removed from the down-field site is directly driven away by the CP field and the lateral photoelectron momentum distribution (LPMD) exhibits a Gaussian-like distribution, whereas the corresponding LPMD of the electron removed from the up-field site differs from the Gaussian shape due to the Coulomb focusing and scattering by the down-field core. Our current study presents the direct evidence clarifying a long-standing controversy over the tunneling site in H_{2}^{+} and raises the important role of the tunneling site in strong-field molecular ionization.

  16. A consideration of the operation of automatic production machines.

    Science.gov (United States)

    Hoshi, Toshiro; Sugimoto, Noboru

    2015-01-01

    At worksites, various automatic production machines are in use to release workers from muscular labor or from labor in detrimental environments. On the other hand, a large number of industrial accidents have been caused by automatic production machines. In view of this, this paper considers the operation of automatic production machines from the viewpoint of accident prevention and points out two types of machine operation: operation for which quick performance is required (operation that must not be delayed) and operation for which composed performance is required (operation that must not be performed in haste). These operations are distinguished by operation buttons of suitable colors and shapes. This paper shows that these characteristics can be evaluated as "asymmetry on the time axis". For workers to accept the risk of automatic production machines, a general precondition is that the harm is sufficiently small or that avoiding the harm is easy. In this connection, this paper shows the possibility of facilitating the acceptance of the risk of automatic production machines by enhancing this asymmetry on the time axis.

  17. Methodology for Automatic Ontology Generation Using Database Schema Information

    Directory of Open Access Journals (Sweden)

    JungHyen An

    2018-01-01

    Full Text Available An ontology is a modeling language that supports functions to integrate conceptually distributed domain knowledge and to infer relationships among the concepts. Ontologies are developed based on the target domain knowledge. As a result, methodologies that automatically generate an ontology from metadata characterizing the domain knowledge are becoming important. However, existing methodologies for automatically generating an ontology from metadata require the domain metadata to be supplied in a predetermined template, and it is difficult to manage the data added to the ontology itself when the domain OWL (Ontology Web Language) individuals increase continuously. The database schema captures features of the domain knowledge and provides structural functions to efficiently process the knowledge-based data. In this paper, we propose a methodology to automatically generate ontologies and manage the OWL individuals through an interaction of the database and the ontology. We describe the automatic ontology generation process with an example schema and demonstrate the effectiveness of the automatically generated ontology by comparing it with existing ontologies using an ontology quality score.
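
    A minimal sketch of the schema-to-ontology mapping idea using rdflib, assuming a toy schema dictionary and an example namespace; the paper's full methodology, individual management and quality scoring are not reproduced.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS, XSD

# Hypothetical schema metadata: table name -> {column name: SQL type}.
schema = {"Employee": {"emp_id": "INTEGER", "name": "VARCHAR", "salary": "FLOAT"}}
sql_to_xsd = {"INTEGER": XSD.integer, "VARCHAR": XSD.string, "FLOAT": XSD.double}

EX = Namespace("http://example.org/onto#")
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

for table, columns in schema.items():
    cls = EX[table]
    g.add((cls, RDF.type, OWL.Class))                     # table -> OWL class
    for col, sql_type in columns.items():
        prop = EX[f"{table}_{col}"]
        g.add((prop, RDF.type, OWL.DatatypeProperty))     # column -> datatype property
        g.add((prop, RDFS.domain, cls))
        g.add((prop, RDFS.range, sql_to_xsd.get(sql_type, XSD.string)))

print(g.serialize(format="turtle"))
```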

  18. Automatic Deficits can lead to executive deficits in ADHD

    Directory of Open Access Journals (Sweden)

    Gabriella Martino

    2017-12-01

    Full Text Available Executive dysfunction has been well documented in children with Attention Deficit Hyperactivity Disorder (ADHD) and with Reading Disorder (RD). The purpose of the present study was to test the alternative hypothesis that deficits in executive functioning within ADHD may be partially due to an impairment of automatic processing. In addition, given the co-occurrence of ADHD and RD, we tested the hypothesis that automatic processing may be a common cognitive factor between ADHD and RD. We investigated the automatic processing of selective visual attention through two experiments. Twelve children with ADHD, 17 with ADHD+RD and 29 typically developing children, matched for age and gender, performed two tasks: a Visual Information Processing Task and the Clock Test. As expected, the ADHD and ADHD+RD groups differed from the control group in the controlled-processing task, suggesting a deficit in executive functioning. All clinical subjects also exhibited lower performance in automatic processes compared to the control group. The results of this study suggest that executive deficits within ADHD can be partially due to an impairment of automatic processing.

  19. Higher-order automatic differentiation of mathematical functions

    Science.gov (United States)

    Charpentier, Isabelle; Dal Cappello, Claude

    2015-04-01

    Functions of mathematical physics, such as the Bessel functions, the Chebyshev polynomials, the Gauss hypergeometric function and so forth, have practical applications in many scientific domains. On the one hand, the differentiation formulas provided in reference books apply to real or complex variables and do not account for the chain rule. On the other hand, automatic differentiation, which is based on the chain rule, has become a natural tool in numerical modeling. Nevertheless, automatic differentiation tools do not handle these numerous mathematical functions. This paper describes formulas and provides codes for the higher-order automatic differentiation of mathematical functions. The first method is based on Faà di Bruno's formula, which generalizes the chain rule. The second one makes use of the second-order differential equations these functions satisfy. Both methods are exemplified with the aforementioned functions.
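
    A tiny truncated-Taylor (jet) arithmetic, sketched below, illustrates how higher-order derivatives propagate through composition; the exp recurrence follows from y' = u'y. This is a generic illustration of higher-order automatic differentiation, not the paper's code, and the example function is chosen arbitrarily.

```python
import math

class Taylor:
    """Truncated Taylor coefficients c[k] = f^(k)(x0)/k! up to a fixed order."""
    def __init__(self, coeffs, order):
        self.c = list(coeffs) + [0.0] * (order + 1 - len(coeffs))
        self.n = order

    def __mul__(self, other):
        # Cauchy product of the two coefficient sequences, truncated.
        out = [sum(self.c[j] * other.c[k - j] for j in range(k + 1))
               for k in range(self.n + 1)]
        return Taylor(out, self.n)

def taylor_exp(u):
    # y = exp(u):  y' = u' y  gives  k*y_k = sum_{j=1..k} j*u_j*y_{k-j}.
    y = [math.exp(u.c[0])] + [0.0] * u.n
    for k in range(1, u.n + 1):
        y[k] = sum(j * u.c[j] * y[k - j] for j in range(1, k + 1)) / k
    return Taylor(y, u.n)

def derivatives(f, x0, order):
    x = Taylor([x0, 1.0], order)          # seed the independent variable
    return [math.factorial(k) * c for k, c in enumerate(f(x).c)]

# f(x) = exp(x*x); at x0 = 1 the exact values are e, 2e, 6e, 20e, 76e.
print(derivatives(lambda x: taylor_exp(x * x), 1.0, 4))
```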

  20. Oocytes Polar Body Detection for Automatic Enucleation

    Directory of Open Access Journals (Sweden)

    Di Chen

    2016-02-01

    Full Text Available Enucleation is a crucial step in cloning. In order to achieve automatic blind enucleation, the polar body of the oocyte must be detected automatically. Conventional polar body detection approaches have a low success rate or low efficiency. In this paper, we propose a polar body detection method based on machine learning. On the one hand, an improved Histogram of Oriented Gradients (HOG) algorithm is employed to extract features of polar body images, which increases the success rate. On the other hand, a position prediction method is put forward to narrow the search range of the polar body, which improves efficiency. Experimental results show that the success rate is 96% for various types of polar bodies. Furthermore, the method is applied to an enucleation experiment and improves the degree of automatic enucleation.
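
    A hedged sketch of the feature-plus-classifier idea, using the standard HOG descriptor from scikit-image and a linear SVM as stand-ins for the paper's improved HOG and its classifier; the patches, labels and parameter values are placeholders.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

def hog_features(patch):
    """Standard HOG descriptor of a grayscale image patch (e.g. 64x64)."""
    return hog(patch, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2), block_norm="L2-Hys")

# Placeholder patches: in practice these would be candidate regions cropped
# around the oocyte boundary, labelled 1 if they contain the polar body.
rng = np.random.default_rng(3)
patches = rng.random((80, 64, 64))
labels = rng.integers(0, 2, size=80)

X = np.array([hog_features(p) for p in patches])
clf = LinearSVC(C=1.0).fit(X, labels)

def contains_polar_body(patch):
    return bool(clf.predict([hog_features(patch)])[0])

print(contains_polar_body(patches[0]))
```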