WorldWideScience

Sample records for automatic data processing

  1. Automatically processing physical data from LHD experiments

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M., E-mail: emoto.masahiko@nifs.ac.jp; Ida, K.; Suzuki, C.; Yoshida, M.; Akiyama, T.; Nakamura, Y.; Sakamoto, R.; Yokoyama, M.; Yoshinuma, M.

    2014-05-15

    Physical data produced by Large Helical Device (LHD) experiments are supplied by the Kaiseki server, which registers more than 200 types of diagnostic data. Dependencies exist among the data; i.e., in many cases, calculating one data item requires other data items. Therefore, to obtain unregistered data, one needs to calculate not only the diagnostic data itself but also the data it depends on; however, because the data are registered by different scientists, each scientist must separately calculate and register their respective data. To simplify this complicated procedure, we have developed an automatic calculation system called AutoAna. The calculation programs of AutoAna are distributed on a network, and their number can easily be increased dynamically. Our system is therefore scalable and ready for substantial increases in the size of the target data.
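
    As a rough sketch of the dependency handling the abstract describes, the snippet below resolves the calculation order for a requested data item from a dependency registry. The registry contents and diagnostic names are invented for illustration; they are not the actual Kaiseki/AutoAna interfaces.

```python
# Minimal sketch of dependency-aware calculation scheduling, as a system like
# AutoAna might perform it. Registry contents are illustrative assumptions.
from graphlib import TopologicalSorter

# Hypothetical dependency registry: each data item lists the inputs it requires.
DEPENDENCIES = {
    "electron_temperature": ["thomson_scattering"],
    "ion_temperature": ["charge_exchange", "electron_temperature"],
    "thomson_scattering": [],
    "charge_exchange": [],
}

def calculation_order(target: str) -> list[str]:
    """Return an order in which data must be calculated to obtain `target`."""
    # Collect the subgraph reachable from the target.
    subgraph, stack = {}, [target]
    while stack:
        node = stack.pop()
        if node not in subgraph:
            subgraph[node] = DEPENDENCIES.get(node, [])
            stack.extend(subgraph[node])
    # TopologicalSorter guarantees every input precedes its dependents.
    return list(TopologicalSorter(subgraph).static_order())

print(calculation_order("ion_temperature"))
# e.g. ['thomson_scattering', 'charge_exchange', 'electron_temperature', 'ion_temperature']
```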

  2. A method of automatic data processing in radiometric control

    International Nuclear Information System (INIS)

    Adonin, V.M.; Gulyukina, N.A.; Nemirov, Yu.V.; Mogil'nitskij, M.I.

    1980-01-01

    Described is an algorithm for automatic data processing in the gamma radiography of products. A specific feature of the processing is its rapidity, achieved through recurrent evaluation. Experimental data from in-line testing are presented. The results obtained show that automatic signal processing is applicable to testing under industrial conditions; it would increase testing efficiency, eliminate subjectivity in the assessment of test results, and improve working conditions.

  3. Automatic processing of radioimmunological research data on a computer

    International Nuclear Information System (INIS)

    Korolyuk, I.P.; Gorodenko, A.N.; Gorodenko, S.I.

    1979-01-01

    A program, ''CRITEST'', written in PL/1 for the EC computer and intended for automatic processing of the results of radioimmunological research, has been developed. The program runs under the EC computer's operating system in a 60 kb memory partition. A modified Aitken's algorithm was used in compiling the program. The program was clinically validated in the determination of a number of hormones: CTH, T4, T3 and TSH. Automatic processing of radioimmunological research data on the computer makes it possible to simplify the labour-intensive analysis and to raise its accuracy.

  4. Statistical data processing with automatic system for environmental radiation monitoring

    International Nuclear Information System (INIS)

    Zarkh, V.G.; Ostroglyadov, S.V.

    1986-01-01

    The practice of statistical data processing for radiation monitoring is exemplified, and some of the results obtained are presented. Experience in the practical application of mathematical-statistics methods to radiation monitoring data made it possible to develop a concrete statistical processing algorithm, implemented on an M-6000 minicomputer. The algorithm is divided into three parts: parametric data processing and hypothesis testing, pair correlation analysis, and multiple correlation analysis. The statistical processing programs run in dialogue mode. The algorithm was used to process observation data from a radioactive waste disposal control region; results of processing the surface-water monitoring data are presented.

  5. A system for classifying wood-using industries and recording statistics for automatic data processing.

    Science.gov (United States)

    E.W. Fobes; R.W. Rowe

    1968-01-01

    A system for classifying wood-using industries and recording pertinent statistics for automatic data processing is described. Forms and coding instructions for recording data of primary processing plants are included.

  6. Towards Automatic Capturing of Manual Data Processing Provenance

    NARCIS (Netherlands)

    Wombacher, Andreas; Huq, M.R.

    2011-01-01

    Often data processing is not implemented by a workflow system or an integration application but is performed manually by humans along the lines of a more or less specified procedure. Collecting provenance information during manual data processing cannot be automated. Further, manual collection of

  7. Identification and structuring of data for automatic processing

    International Nuclear Information System (INIS)

    Wohland, H.; Rexer, G.; Ruehle, R.

    1976-01-01

    The data structure of a technical and scientific application system is described. The description of the structure is divided into different sections in which the user can describe his own data. By fixing a section of this structure, a high degree of automation of the problem-solving process can be achieved while preserving flexibility. (orig.) [de]

  8. [Complex automatic data processing in multi-profile hospitals].

    Science.gov (United States)

    Dovzhenko, Iu M; Panov, G D

    1990-01-01

    The computerization of data processing in multi-disciplinary hospitals is the key factor in raising the quality of medical care provided to the population, intensifying the work of the personnel, improving the curative and diagnostic process, and improving the use of resources. Even limited experience with complex computerization at the Botkin Hospital indicates that the automated system improves the quality of data processing, provides a high level of patient examination, speeds the training of young specialists, and creates conditions for the continuing education of physicians through analysis of their own activity. In large hospitals, the most promising form of computerization is the complex solution of administrative and curative-diagnostic tasks on the basis of a hospital-wide display network and a hospital-wide data bank.

  9. Dialog system for automatic data input/output and processing with two BESM-6 computers

    International Nuclear Information System (INIS)

    Belyaev, Y.N.; Gorlov, Y.P.; Makarychev, S.V.; Monakov, A.A.; Shcherbakov, S.A.

    1985-01-01

    This paper presents a system for conducting experiments with fully automatic processing of data from multichannel recorders in dialog mode. The system acquires data at a rate of 2.5 × 10³ readings/s, processes them in real time, and outputs digital and graphical material in a multitasking environment.

  10. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify the commonalities shared by all branches and, at the same time, describe their differences. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic) that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested both on business-like event logs as recorded in a higher educational enterprise resource planning system and on a real case scenario involving a set of Dutch municipalities.

  11. Microsoft Excel's automatic data processing and diagram drawing of RIA internal quality control parameters

    International Nuclear Information System (INIS)

    Zeng Pingfan; Liu Guoqiang

    2006-01-01

    We carried out automatic data processing and diagram drawing of the various parameters of RIA's internal quality control (IQC) using Microsoft Excel (ME). With ME's AVERAGE and STDEV functions, we obtained x-bar, s and CV%. With PEARSON, we obtained the serum quality-control coefficients (r). By entering the original data into the diagram's self-definition item, the diagram was drawn automatically. Using logical tests, we obtained the quality-control judgments, with the status, timing and data of the various quality-control parameters. Over the past four years, the ME data processing, diagram drawing and quality-control judgments have proven to be accurate, convenient and correct. The method is quick, easy to manage, and realizes automatic computer processing of RIA's IQC. Conclusion: the method is applicable to all types of RIA IQC. (authors)
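
    A minimal sketch of the arithmetic behind the Excel functions named above (AVERAGE, STDEV, PEARSON), expressed in Python; the control-pool values are invented.

```python
# Quality-control arithmetic mirroring Excel's AVERAGE, STDEV and PEARSON.
import statistics

def iqc_summary(values):
    """x-bar, s (sample SD) and CV% for one quality-control serum pool."""
    x_bar = statistics.mean(values)
    s = statistics.stdev(values)            # Excel STDEV also uses the n-1 form
    cv_percent = 100.0 * s / x_bar
    return x_bar, s, cv_percent

def pearson_r(xs, ys):
    """Pearson correlation coefficient, as Excel's PEARSON computes it."""
    return statistics.correlation(xs, ys)   # Python 3.10+

control_counts = [102.1, 98.7, 101.4, 99.9, 100.3]   # hypothetical IQC results
print("x-bar=%.2f  s=%.2f  CV%%=%.2f" % iqc_summary(control_counts))
```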

  12. 10 CFR 95.49 - Security of automatic data processing (ADP) systems.

    Science.gov (United States)

    2010-01-01

    ... 10 CFR 95.49, Security of automatic data processing (ADP) systems. Title 10, Energy; NUCLEAR REGULATORY COMMISSION (CONTINUED); FACILITY SECURITY CLEARANCE AND SAFEGUARDING OF NATIONAL SECURITY INFORMATION AND RESTRICTED DATA; Control of Information; § 95.49 Security of...

  13. Automatic methods for processing track-detector data at the PAVICOM facility

    International Nuclear Information System (INIS)

    Aleksandrov, A.B.; Goncharova, L.A.; Polukhina, N.G.; Fejnberg, E.L.; Davydov, D.A.; Publichenko, P.A.; Roganova, T.M.

    2007-01-01

    New automatic methods essentially simplify and accelerate the processing of data from track detectors. They make it possible to handle large data files and appreciably improve their statistics; this fact predetermines the development of new experiments that plan to use large-volume targets and emulsion and solid-state track detectors of large area. In this regard, the problem of training competent physicists able to work on modern automatic equipment is very relevant. About ten Moscow students working at LPI on the PAVICOM facility master the new methods every year. Most students working in high-energy physics are given an idea only of archaic manual methods of processing data from track detectors. In 2005, on the basis of the PAVICOM facility and the physics training course of MSU, a new educational exercise was prepared for determining the energy of neutrons passing through a nuclear emulsion; it lets students acquire basic skills in processing data from track detectors using an automatic facility, and it can be included in the training of students of any physics faculty. Specialists who have mastered the methods of automatic processing on the simple and clear example of track detectors will be able to use their knowledge in various areas of science and technology. The organization of upper-division courses is a new, additional aspect of using the PAVICOM facility, described in an earlier paper [4].

  14. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Full Text Available Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only because of the massive data volume but also because of the intrinsic complexity and high dimensionality of geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies have investigated adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism, using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce computing resource utilization (by 80% in our example) while delivering similar performance to a full-powered cluster; and (2) effectively handle spikes in the processing workload by automatically increasing the computing resources to ensure that processing finishes within an acceptable time. Such an auto-scaling approach provides a valuable reference for optimizing the performance of geospatial applications and addressing data- and computational-intensity challenges in GIScience in a more cost-efficient manner.
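
    A minimal sketch of a threshold-based auto-scaling loop of the kind the abstract describes; the thresholds, the utilization metric and the scale_cluster() hook are assumptions, not the authors' algorithm or a real Hadoop/cloud API.

```python
# Threshold-based auto-scaling decision loop (illustrative sketch only).
import time

SCALE_UP_UTIL = 0.85    # add nodes when sustained utilization exceeds this
SCALE_DOWN_UTIL = 0.30  # remove idle nodes below this
MIN_NODES, MAX_NODES = 2, 64

def autoscale(get_utilization, scale_cluster, nodes, poll_seconds=60):
    """Poll workload and grow/shrink the cluster between fixed bounds."""
    while True:
        util = get_utilization()           # e.g., pending tasks / total slots
        if util > SCALE_UP_UTIL and nodes < MAX_NODES:
            nodes = min(MAX_NODES, nodes * 2)      # react quickly to spikes
            scale_cluster(nodes)
        elif util < SCALE_DOWN_UTIL and nodes > MIN_NODES:
            nodes = max(MIN_NODES, nodes - 1)      # shrink conservatively
            scale_cluster(nodes)
        time.sleep(poll_seconds)

# Example with stub hooks (commented out, since the loop runs forever):
# autoscale(lambda: 0.9, lambda n: print("scale to", n), nodes=4)
```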

  15. Automatic discovery of data-centric and artifact-centric processes

    NARCIS (Netherlands)

    Nooijen, E.H.J.; Dongen, van B.F.; Fahland, D.; La Rosa, M.; Soffer, P.

    2013-01-01

    Process discovery is a technique that allows for automatically discovering a process model from recorded executions of a process as it happens in reality. This technique has successfully been applied for classical processes where one process execution is constituted by a single case with a unique

  16. Grid infrastructure for automatic processing of SAR data for flood applications

    Science.gov (United States)

    Kussul, Natalia; Skakun, Serhiy; Shelestov, Andrii

    2010-05-01

    More and more geoscience applications are being put onto Grids. Because geoscience applications are complex, with complex workflows, computationally intensive environmental models, and the need to manage and integrate heterogeneous data sets, Grid computing offers solutions to tackle these problems. Many geoscience applications, especially those related to disaster management and mitigation, require geospatial services to be delivered in proper time. For example, information on flooded areas should be provided to the corresponding organizations (local authorities, civil protection agencies, UN agencies, etc.) within 24 h to be able to effectively allocate the resources required to mitigate the disaster. Therefore, providing an infrastructure and services that enable the automatic generation of products based on the integration of heterogeneous data is a task of great importance. In this paper we present a Grid infrastructure for the automatic processing of synthetic-aperture radar (SAR) satellite images to derive flood products. In particular, we use SAR data acquired by ESA's ENVISAT satellite, and neural networks to derive flood extent. The data are provided in operational mode from the ESA rolling archive (within an ESA Category-1 grant). We developed a portal that is based on the OpenLayers framework and provides an access point to the developed services. Through the portal the user can define a geographical region and search for the required data. Upon selection of the data sets, a workflow is automatically generated and executed on the resources of the Grid infrastructure. For workflow execution and management we use the Karajan language. The workflow of SAR data processing consists of the following steps: image calibration, image orthorectification, image processing with neural networks, topographic effects removal, geocoding and transformation to lat/long projection, and visualisation. These steps are executed by different software, and can be

  17. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    Science.gov (United States)

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large-scale collection of driver, vehicle, and environment information in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for decelerations below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data are growing exponentially over time, this reviewing procedure may no longer be viable in the very near future. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that the driver's individual reaction may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms able to automatically classify driver reaction from video data have been compared. The results presented in this paper show that the state-of-the-art subjective review procedures used to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential

  18. Automatic methods of the processing of data from track detectors on the basis of the PAVICOM facility

    Science.gov (United States)

    Aleksandrov, A. B.; Goncharova, L. A.; Davydov, D. A.; Publichenko, P. A.; Roganova, T. M.; Polukhina, N. G.; Feinberg, E. L.

    2007-02-01

    New automatic methods essentially simplify and increase the rate of processing of data from track detectors. This provides the possibility of processing large data arrays and considerably improves their statistical significance. This fact predetermines the development of new experiments which plan to use large-volume targets and large-area emulsion and solid-state track detectors [1]. In this regard, the problem of training qualified physicists who are capable of operating modern automatic equipment is very important. Annually, about ten Moscow students master the new methods, working at the Lebedev Physical Institute at the PAVICOM facility [2-4]. Most students specializing in high-energy physics are only given an idea of archaic manual methods of processing data from track detectors. In 2005, on the basis of the PAVICOM facility and the physics training course of Moscow State University, a new training exercise was prepared. This exercise is devoted to determining the energy of neutrons passing through a nuclear emulsion. It provides the possibility of acquiring basic practical skills in processing data from track detectors using automatic equipment and can be included in the educational process of students of any physics faculty. Those who have mastered the methods of automatic data processing in the simple and pictorial example of track detectors will be able to apply their knowledge in various fields of science and technology. The formulation of training exercises for undergraduate and graduate students is a new, additional aspect of the application of the PAVICOM facility described earlier in [4].

  19. Automatic analysis of ultrasonic data

    International Nuclear Information System (INIS)

    Horteur, P.; Colin, J.; Benoist, P.; Bonis, M.; Paradis, L.

    1986-10-01

    This paper describes an automatic and self-contained data processing system, transportable on site, able to produce images such as ''A-scan'' and ''B-scan'' and to present the results of the inspection very quickly. It can be used for pressure vessel inspection. [fr]

  20. Discrete data processing from a scintillation sensor for exposure automatic machine in medical radiography

    International Nuclear Information System (INIS)

    Karadimov, D.; Petukhov, N.; Dimitrova, S.; Vasilev, M.

    1983-01-01

    An automatic exposure machine with a thin plastic scintillator plate, collimated by a light guide with a photoconverter, is described. Owing to the nature of the physical processes in the scintillator registered by the photomultiplier, both radiation analysis and digital processing of the results are possible. Two alternative devices are described. In the first, the photomultiplier output signal is fed to a pulse-height selector, at whose output pulses of standard logic level are obtained. These pulses are fed to a data register and then to a digital comparator, where they are compared with a preset quantity selected according to X-ray film sensitivity and foil combination, as well as the desired film darkening. The interruption of the ionizing radiation is controlled by a switch unit. The spectral sensitivity of the detector is corrected by changing the photomultiplier supply voltage. In the second alternative, a discriminator of noise and ionizing radiation is used, in which pulse-height selection according to radiant energy is carried out. A digital comparator and a switching circuit control the ionizing radiation. Through a second switching circuit, the spectrally distributed pulses from the discriminator are fed to a spectral analyser that dynamically controls the digital comparator, compensating for the influence of the spectral response of the ionizing radiation. The advantage of the second alternative is that it allows the radiation parameters to be controlled both in radiography mode and in X-ray examination mode. Owing to the system's speed, the device can be used to measure very short exposures as well as in serial examinations.

  1. Automatic digitization of SMA data

    Science.gov (United States)

    Väänänen, Mika; Tanskanen, Eija

    2017-04-01

    In the 1970s and 1980s the Scandinavian Magnetometer Array produced large amounts of excellent data from over 30 stations in Norway, Sweden and Finland. 620 film reels and 20 kilometers of film have been preserved, and the longest time series produced in the campaign spans almost five years without interruption, but the data has never seen widespread use due to the choice of medium. Film is a difficult medium to digitize efficiently. Previously, events of interest were searched for by hand, and digitization was done by projecting the film on paper and plotting it by hand. We propose a method of automatically digitizing geomagnetic data stored on film and extracting the numerical values from the digitized data. The automatic digitization process helps preserve old, valuable data that might otherwise go unused.

  2. Well scintillation counter with automatic sample changing and data processing: an inexpensive instrument incorporating consumer products

    International Nuclear Information System (INIS)

    Dudley, R.A.; Figdor, H.C.; Keroe, E.A.; Morris, A.C. Jr.; Mutz, O.J.

    1977-01-01

    An automatic well scintillation-counting system suitable for in vitro assays with ¹²⁵I has been designed with the express purpose of allowing effective operation and maintenance in laboratories in developing countries. The system incorporates comparatively simple components, notably two consumer products: a Kodak Carousel slide projector as sample changer and a Hewlett-Packard HP-97 programmable printing calculator as system controller and data processor. The instrument can accommodate 80 counting vials of dimensions 12 mm diameter x 75 mm, or 40 vials of 16 mm diameter x 100 mm. The calculator provides on-line control and data reduction through an interface somewhat resembling that required between a scaler and a printer. Its program capacity is adequate for fairly complicated on-line operations, including interpolation from a standard curve in logit-log space, calculation of the error in hormone concentration, and termination of counting when the counting error is reduced to a prescribed fraction of the composite of the other random assay errors (as stored in the calculator's memory). This system is inexpensive, robust, and capable of being operated manually if the automatic accessories fail. It could be improved in several ways, particularly by providing for operation from batteries and, no doubt in the immediate future, by substitution of the next generation of cheaper and more powerful calculators. The instrument may be cost-effective in any small to medium-sized laboratory. (orig.) [de]
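
    The stopping rule mentioned above lends itself to a short worked example: Poisson counting error falls as 1/sqrt(N), so one can precompute how many counts are needed before the counting error is a prescribed fraction of the other assay errors. A sketch, with invented numbers:

```python
# Counting stops once the Poisson counting CV (1/sqrt(N)) is at most a
# prescribed fraction of the composite CV of the other random assay errors.
import math

def counts_needed(other_error_cv: float, fraction: float = 0.5) -> int:
    """Counts N such that 1/sqrt(N) <= fraction * other_error_cv."""
    target_cv = fraction * other_error_cv
    return math.ceil(1.0 / target_cv**2)

# If the other assay errors amount to a 6% CV, stop counting once the
# counting error alone is down to 3%:
print(counts_needed(0.06))   # -> 1112 counts
```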

  3. Automatic computation of radioimmunoassay data

    International Nuclear Information System (INIS)

    Toyota, Takayoshi; Kudo, Mikihiko; Abe, Kanji; Kawamata, Fumiaki; Uehata, Shigeru.

    1975-01-01

    Radioimmunoassay provides dose-response curves which become linear through the logistic transformation (Rodbard). This transformation, applicable to radioimmunoassay, should be useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double-antibody technique was written by Dr. Kawamata. This approach proved useful for the automatic computation of data derived from double-antibody assays of insulin and C-peptides. Automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory. (auth.)

  4. Automatic data-processing equipment of moon mark of nail for verifying some experiential theory of Traditional Chinese Medicine.

    Science.gov (United States)

    Niu, Renjie; Fu, Chenyu; Xu, Zhiyong; Huang, Jianyuan

    2016-04-29

    Doctors who practice Traditional Chinese Medicine (TCM) diagnose using four methods: inspection, auscultation and olfaction, interrogation, and pulse feeling/palpation. The shape of the moon marks on the nails, and changes in that shape, are an important indication when judging the patient's health. There is a series of classical and experiential theories about moon marks in TCM that lacks support from statistical data. Our aim is to verify some experiential theories on moon marks in TCM using automatic data-processing equipment. This paper proposes equipment that utilizes image processing technology to collect moon mark data from different target groups conveniently and quickly, building a database that combines this information with that gathered from the health and mental status questionnaire in each test. The equipment has a simple design, a low cost, and an optimized algorithm. In practice it has been proven to quickly complete the automatic acquisition and preservation of key data about moon marks. In the future, conclusions will likely be obtained from these data, and changes of moon marks related to specific pathological changes will be established with statistical methods.

  5. Automatic drawing and CAD actualization in processing data of radiant sampling in physics prospect

    International Nuclear Information System (INIS)

    Liu Jinsheng

    2010-01-01

    This paper discusses a method of processing radiant sampling data with a computer. With this method the curve of the radiant sampling data can be plotted; mineral masses can be combined, analysed and calculated; and the results can be recorded in a notebook. The method has many merits: it is easy to learn, simple to use and highly efficient, and it adapts to all sorts of mines. (authors)

  6. Automatic drawing and CAD actualization in processing data of radiant sampling in physics prospect

    International Nuclear Information System (INIS)

    Liu Jinsheng

    2010-01-01

    This paper discusses a method of processing radiant sampling data with a computer. With this method the curve of the radiant sampling data can be plotted; mineral masses can be combined, analysed and calculated; and the results can be recorded in a notebook. The method has many merits: it is easy to learn, simple to use and highly efficient, and it adapts to all sorts of mines. (authors)

  7. Automatic processing, quality assurance and serving of real-time weather data

    Science.gov (United States)

    Williams, Matthew; Cornford, Dan; Bastin, Lucy; Jones, Richard; Parker, Stephen

    2011-03-01

    Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real time data such as temperature, rainfall and wind speed can now be obtained for most of the world. Despite the abundance of available data, the production of usable information about the weather in individual local neighbourhoods requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this instance, this allows a user to make direct queries about the weather at any location, even when this is not directly instrumented, using interpolation methods provided by the INTAMAP project. A simplified example illustrates how the INTAMAP web processing service can be employed as part of a quality control procedure to estimate the bias and residual variance of user contributed temperature observations, using a reference standard based on temperature observations with carefully controlled quality. We also consider how the uncertainty introduced by the interpolation can be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
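
    A minimal stand-in for the quality-control step described above, estimating the bias and residual variance of user-contributed observations against co-located reference values; plain NumPy with invented data, not the INTAMAP web processing service or UncertML itself.

```python
# Bias and residual variance of contributed observations vs. a reference.
import numpy as np

def bias_and_residual_variance(contributed, reference):
    """Bias = mean(contributed - reference); residual variance is the
    spread that remains after removing that bias."""
    residuals = np.asarray(contributed) - np.asarray(reference)
    return residuals.mean(), residuals.var(ddof=1)

amateur   = [14.2, 15.1, 13.8, 16.0, 14.9]   # hypothetical amateur readings
reference = [13.6, 14.5, 13.4, 15.2, 14.3]   # co-located reference values
b, v = bias_and_residual_variance(amateur, reference)
print(f"bias = {b:+.2f} degC, residual variance = {v:.3f} degC^2")
```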

  8. Big Data Analysis for Personalized Health Activities: Machine Learning Processing for Automatic Keyword Extraction Approach

    Directory of Open Access Journals (Sweden)

    Jun-Ho Huh

    2018-04-01

    Full Text Available The obese population is increasing rapidly due to changes in lifestyle and diet habits. Obesity can cause various complications and is becoming a social disease. Nonetheless, many obese patients are unaware of the medical treatments that are right for them. Although a variety of online and offline obesity management services have been introduced, they are still not enough to attract the attention of users and are not much help in solving the problem. Obesity healthcare and personalized health activities are the important factors. Since obesity is related to lifestyle habits, eating habits, and interests, I concluded that big data analysis of these factors could characterize the problem. Therefore, I collected big data by applying machine learning and crawling methods to the unstructured citizen health data in Korea and to the search data of Naver, a Korean portal company, and Google, for keyword analysis for personalized health activities, and visualized the big data using text mining and word clouds. This study collected and analyzed data concerning interests related to obesity, changes of interest in obesity, and treatment articles. The analysis showed a wide range of seasonal factors across spring, summer, fall, and winter. It also visualized and completed the process of extracting keywords appropriate for the treatment of abdominal obesity and lower body obesity. The keyword big data analysis technique for personalized health activities proposed in this paper is based on an individual's interests, level of interest, and body type. A user interface (UI) that visualizes the big data is compatible with Android and Apple iOS, so users can see the data on the app screen. Many graphs and pictures can be seen via the menu, and the significant data values are visualized through machine learning. Therefore, I expect that big data analysis using various keywords specific to a person will result in measures for personalized

  9. Anomaly disentanglement on the basis of automatic processing of the aerial gamma-spectrometric survey data

    International Nuclear Information System (INIS)

    Perlovskij, V.A.; Zaika, S.D.; Lomtadze, V.V.

    1976-01-01

    Automated processing of airborne gamma-spectrometric data has been introduced using the BESM-4 computer. Anomalies are discriminated using a variable interval of 200-300 m, within which the maximum values in the general counting channel and their coordinates are found; reference points are also detected, against which the excess and amplitude of the anomalies can be determined. The discriminated anomalies are printed out, and charts of the anomalies are constructed. The charts and tables characterizing an anomaly help in identifying its potential and hence underlie the decision to start ground work.

  10. Automatic Defect Detection for TFT-LCD Array Process Using Quasiconformal Kernel Support Vector Data Description

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liu

    2011-09-01

    Full Text Available Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process, since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them can cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects in the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. QK-SVDD can significantly improve the generalization performance of traditional SVDD by introducing a quasiconformal transformation into a predefined kernel. Experimental results, obtained on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves the generalization performance of SVDD. The improvement is shown to be over 30%. In addition, results show that the QK-SVDD defect detector is able to accomplish defect detection on an LCD image within 60 ms.
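
    A sketch of a quasiconformal kernel transformation in its common multiplicative form K~(x, y) = c(x)c(y)K(x, y), which magnifies the kernel's spatial resolution near chosen points; the RBF base kernel and the Gaussian scaling function below are illustrative choices, not necessarily the exact construction of the QK-SVDD paper.

```python
# Quasiconformal (multiplicative) transformation of a base kernel.
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def scaling(X, anchors, tau=1.0):
    """c(x): larger magnification close to the anchor points."""
    d2 = ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * tau**2)).sum(axis=1) + 1e-12

def quasiconformal_kernel(X, Y, anchors):
    # K~ = D K D with positive diagonal D, so it stays symmetric and PSD.
    cX, cY = scaling(X, anchors), scaling(Y, anchors)
    return cX[:, None] * cY[None, :] * rbf_kernel(X, Y)

X = np.random.default_rng(0).normal(size=(5, 2))
K = quasiconformal_kernel(X, X, anchors=X[:2])
print(K.shape)   # (5, 5)
```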

  11. Achieving Accurate Automatic Sleep Staging on Manually Pre-processed EEG Data Through Synchronization Feature Extraction and Graph Metrics.

    Science.gov (United States)

    Chriskos, Panteleimon; Frantzidis, Christos A; Gkivogkli, Polyxeni T; Bamidis, Panagiotis D; Kourtidou-Papadeli, Chrysoula

    2018-01-01

    Sleep staging, the process of assigning labels to epochs of sleep depending on the stage of sleep to which they belong, is an arduous, time-consuming and error-prone process, as the initial recordings are quite often polluted by noise from different sources. To properly analyze such data and extract clinical knowledge, noise components must be removed or alleviated. In this paper a pre-processing and subsequent sleep staging pipeline for the analysis of electroencephalographic sleep signals is described. Two novel methods of functional connectivity estimation (Synchronization Likelihood/SL and Relative Wavelet Entropy/RWE) are comparatively investigated for automatic sleep staging through manually pre-processed electroencephalographic recordings. A multi-step process that renders signals suitable for further analysis is initially described. Then, two methods that rely on extracting synchronization features from electroencephalographic recordings to achieve computerized sleep staging are proposed, based on bivariate features which provide a functional overview of the brain network, contrary to most proposed methods, which rely on extracting univariate time and frequency features. Annotation of sleep epochs is achieved through the presented feature extraction methods by training classifiers, which are in turn able to accurately classify new epochs. Analysis of data from sleep experiments in a randomized, controlled bed-rest study, which was organized by the European Space Agency and conducted in the "ENVIHAB" facility of the Institute of Aerospace Medicine at the German Aerospace Center (DLR) in Cologne, Germany, attains high accuracy rates of over 90%, based on ground truth that resulted from manual sleep staging by two experienced sleep experts. Therefore, it can be concluded that the above feature extraction methods are suitable for semi-automatic sleep staging.
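
    Of the two synchronization features named above, Relative Wavelet Entropy admits a compact sketch: a divergence between the normalized wavelet-band energy distributions of two channels. The wavelet, decomposition level and random test signals below are assumptions for illustration, not the authors' pipeline.

```python
# Relative Wavelet Entropy (RWE) between two signals, using PyWavelets.
import numpy as np
import pywt

def band_energies(signal, wavelet="db4", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()        # normalized energy distribution

def relative_wavelet_entropy(x, y, **kw):
    p, q = band_energies(x, **kw), band_energies(y, **kw)
    return float(np.sum(p * np.log(p / q)))  # 0 when the distributions match

rng = np.random.default_rng(1)
eeg_a, eeg_b = rng.normal(size=1024), rng.normal(size=1024)  # toy "EEG"
print(relative_wavelet_entropy(eeg_a, eeg_b))
```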

  12. Selective and validated data processing techniques for performance improvement of automatic lines

    Directory of Open Access Journals (Sweden)

    D’Aponte Francesco

    2016-01-01

    Full Text Available Optimization of the processing of data from accelerometers and force transducers made it possible to obtain information about the acting forces and thereby improve the behavior of the cutting stage of a converting machine for diaper production. In particular, different mechanical configurations were studied and compared in order to reduce the loads due to the impacts between knives and anvil, to obtain clean and accurate cuts, and to reduce wear of the knives themselves. Reducing the uncertainty of the measurements made it possible to correctly identify the best configuration for the pneumatic system that realizes the coupling between anvil and knife. The size of the pipes, the working pressure and the type of fluid used in the coupling system were examined. Experimental results obtained by means of acceleration and force measurements allowed the geometry of the pushing device and the working pressure range of the hydraulic fluid to be identified in a reproducible and coherent way. The remarkable reduction of knife and anvil vibrations is expected to strongly reduce the wear of the cutting stage components.

  13. Automatic support for product based workflow design : generation of process models from a product data model

    NARCIS (Netherlands)

    Vanderfeesten, I.T.P.; Reijers, H.A.; Aalst, van der W.M.P.; Vogelaar, J.J.C.L.; Meersman, R.; Dillon, T.; Herrero, P.

    2010-01-01

    Product Based Workflow Design (PBWD) is one of the few scientific methodologies for the (re)design of workflow processes. It is based on an analysis of the product that is produced in the workflow process and derives a process model from the product structure. Until now this derivation has been a

  14. Genetic analysis of seasonal runoff based on automatic techniques of hydrometeorological data processing

    Science.gov (United States)

    Kireeva, Maria; Sazonov, Alexey; Rets, Ekaterina; Ezerova, Natalia; Frolova, Natalia; Samsonov, Timofey

    2017-04-01

    Detection of a river's feeding type is a complex, multifactor task. Such partitioning should be based, on the one hand, on the genesis of the feeding water and, on the other hand, on its physical path, while also considering the relationship of the feeding type with the corresponding phase of the water regime. Because of these difficulties and the complexity of the approach, there are many variants of separating the flow hydrograph into feeding types. The most common method is extraction of the so-called base component, which in one way or another reflects the groundwater feeding of the river. In this case, the separation is most often based on the principle of local minima or on graphical separation of this component; however, neither the origin of the water nor the corresponding phase of the water regime is considered. In this paper, the authors offer a method of complex automated analysis of the genetic components of a river's feeding together with the separation of specific phases of the water regime. The objects of the study are medium and large rivers of European Russia that have a pronounced spring flood, formed by meltwater, and summer-autumn and winter low-water periods that are periodically interrupted by rain or thaw floods. The method is based on the genetic separation of the hydrograph proposed in the 1960s by B. I. Kudelin, considered here for large rivers having a hydraulic connection with groundwater horizons during floods. For better detection of flood genesis, the analysis involves reanalysis data on temperature and precipitation. The separation is based on the following fundamental graphic-analytical principles: ground feeding tends to zero during the passage of the flood peak; the beginning of a flood is determined as the exceedance of a critical low-water discharge; flood periods are determined on the basis of exceeding the critical low-water discharge and relate to thaw in the case of above-zero temperatures; during thaw and rain floods
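
    As a sketch of the local-minima principle that the abstract contrasts with its genetic approach, the snippet below draws a base component through windowed local minima of discharge and flags flood periods above a critical low-water discharge; the window size and threshold are placeholders, not the authors' calibrated values.

```python
# Local-minima base-flow separation plus a critical-discharge flood mask.
import numpy as np

def separate_baseflow(q, window=15):
    """Piecewise-linear base component through local minima of discharge q."""
    q = np.asarray(q, dtype=float)
    idx = [i for i in range(len(q))
           if q[i] == q[max(0, i - window):i + window + 1].min()]
    return np.interp(np.arange(len(q)), idx, q[idx])

def flood_mask(q, critical_low_water):
    return np.asarray(q) > critical_low_water

q = np.array([5, 5, 6, 30, 80, 60, 25, 10, 7, 6, 6, 5])  # toy hydrograph
base = separate_baseflow(q, window=3)
print(flood_mask(q, critical_low_water=12))   # flags the flood wave
```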

  15. Localized Segment Based Processing for Automatic Building Extraction from LiDAR Data

    Science.gov (United States)

    Parida, G.; Rajan, K. S.

    2017-05-01

    Current methods for the object segmentation, extraction and classification of aerial LiDAR data are manual and tedious. This work proposes a technique for object segmentation from LiDAR data. A bottom-up, geometric rule-based approach was used initially to devise a way to segment buildings from LiDAR datasets. For curved wall surfaces, comparison of localized surface normals was used to segment buildings. The algorithm has been applied to both synthetic datasets and a real-world dataset of Vaihingen, Germany. Preliminary results show successful segmentation of the building objects from a given scene for the synthetic datasets and promising results for the real-world data. The advantage of the proposed work is that it depends on no form of data other than LiDAR. It is an unsupervised method of building segmentation and thus requires no model training, as is needed in supervised techniques. It focuses on extracting the walls of the buildings to construct the footprint, rather than focusing on the roof; this focus on extracting the wall to reconstruct the buildings from a LiDAR scene is the crux of the proposed method. The current segmentation approach can be used to obtain 2D footprints of the buildings, with further scope to generate 3D models. Thus, the proposed method can be used as a tool to obtain footprints of buildings in urban landscapes, helping urban planning and the smart cities endeavour.

  16. LOCALIZED SEGMENT BASED PROCESSING FOR AUTOMATIC BUILDING EXTRACTION FROM LiDAR DATA

    Directory of Open Access Journals (Sweden)

    G. Parida

    2017-05-01

    Full Text Available Current methods for the object segmentation, extraction and classification of aerial LiDAR data are manual and tedious. This work proposes a technique for object segmentation from LiDAR data. A bottom-up, geometric rule-based approach was used initially to devise a way to segment buildings from LiDAR datasets. For curved wall surfaces, comparison of localized surface normals was used to segment buildings. The algorithm has been applied to both synthetic datasets and a real-world dataset of Vaihingen, Germany. Preliminary results show successful segmentation of the building objects from a given scene for the synthetic datasets and promising results for the real-world data. The advantage of the proposed work is that it depends on no form of data other than LiDAR. It is an unsupervised method of building segmentation and thus requires no model training, as is needed in supervised techniques. It focuses on extracting the walls of the buildings to construct the footprint, rather than focusing on the roof; this focus on extracting the wall to reconstruct the buildings from a LiDAR scene is the crux of the proposed method. The current segmentation approach can be used to obtain 2D footprints of the buildings, with further scope to generate 3D models. Thus, the proposed method can be used as a tool to obtain footprints of buildings in urban landscapes, helping urban planning and the smart cities endeavour.
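
    The localized surface-normal comparison named in the two records above can be sketched as follows: estimate a normal per point from its neighbourhood covariance, then keep points whose normals are near-horizontal (i.e., lying on vertical wall surfaces). The neighbourhood size and threshold are illustrative assumptions, not the authors' parameters.

```python
# Per-point normal estimation and a simple wall-point filter for LiDAR data.
import numpy as np
from scipy.spatial import cKDTree

def point_normals(points, k=12):
    """Normal per point = eigenvector of the smallest eigenvalue of the
    neighbourhood covariance matrix."""
    tree = cKDTree(points)
    _, nbrs = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, idx in enumerate(nbrs):
        cov = np.cov(points[idx].T)            # 3x3 covariance of neighbours
        eigval, eigvec = np.linalg.eigh(cov)   # ascending eigenvalues
        normals[i] = eigvec[:, 0]
    return normals

def wall_candidates(points, max_vertical_component=0.2):
    n = point_normals(points)
    return np.abs(n[:, 2]) < max_vertical_component  # near-horizontal normals

# Hypothetical usage on an x/y/z point file:
# pts = np.loadtxt("tile.xyz")
# walls = pts[wall_candidates(pts)]
```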

  17. On-line measurement with automatic emulsion analysis system and off-line data processing (E531 neutrino experiment)

    International Nuclear Information System (INIS)

    Miyanishi, Motoaki

    1984-01-01

    The automatic emulsion analysis system developed by the Nagoya cosmic ray observation group was used in practice, for the first time in the world, in the experiment determining the lifetimes of charm particles (FNAL-E531), and achieved great success. The system consists of four large precise coordinate-measuring stages capable of simultaneous measurement and multiple (currently four) DOMs (digitized on-line microscopes), supported by one mini-computer (ECLIPS S/130). The purpose of the E531 experiment was the determination of charm particle lifetimes. The experiment was carried out at FNAL, USA, with irradiation by a wide-band ν_μ beam equivalent to 7 × 10¹⁸ protons of 350 GeV/c. The detector was a hybrid system of emulsions and a counter spectrometer. The scanning of neutrino reactions, the scanning of charm particles, and charm event measurement were performed in the emulsions, and on-line programs for the respective analyses were created. The Nagoya group found 726 neutrino reactions in the first run, obtaining 37 charm particle candidates, and 1442 neutrino reactions in the second run, obtaining 56 charm particle candidates. In terms of the time required for analysis, the capability of the automatic emulsion analysis system is 3.5 hours per event in total: 15 minutes for C.S. scanning, 15 minutes for coupling to the module, 20 minutes for tracing to the vertex, 1 hour for neutrino reaction measurement, 10 minutes for off-line data processing and 1.5 hours for charm particle scanning. (Wakatsuki, Y.)

  18. Automatic calculations of electroweak processes

    International Nuclear Information System (INIS)

    Ishikawa, T.; Kawabata, S.; Kurihara, Y.; Shimizu, Y.; Kaneko, T.; Kato, K.; Tanaka, H.

    1996-01-01

    The GRACE system is an excellent tool for automatically calculating the cross section of an elementary process and generating events for it. However, it is not always easy for beginners to use. An interactive version of GRACE is being developed to make the system more user friendly. Since it works in exactly the same environment as PAW, all functions of PAW are available for handling any histogram information produced by GRACE. As an application, the cross sections of all elementary processes with up to 5-body final states induced by e⁺e⁻ interactions are going to be calculated and summarized in a catalogue. (author)

  19. Challenges and opportunities : One stop processing of automatic large-scale base map production using airborne lidar data within gis environment case study: Makassar City, Indonesia

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographic base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme for accelerating the provision of large-scale topographic base maps by the Geospatial Information

  20. Quantitative analysis of the patellofemoral motion pattern using semi-automatic processing of 4D CT data.

    Science.gov (United States)

    Forsberg, Daniel; Lindblom, Maria; Quick, Petter; Gauffin, Håkan

    2016-09-01

    To present a semi-automatic method with minimal user interaction for quantitative analysis of the patellofemoral motion pattern. 4D CT data capturing the patellofemoral motion pattern of a continuous flexion and extension were collected for five patients prone to patellar luxation, both pre- and post-surgically. In the proposed method, an observer places landmarks in a single 3D volume, which are then automatically propagated to the other volumes in the time sequence. From the landmarks in each volume, the measures patellar displacement, patellar tilt and angle between femur and tibia were computed. Evaluation of the observer variability showed the proposed semi-automatic method to be favorable over a fully manual counterpart, with an observer variability of approximately 1.5° for the angle between femur and tibia, 1.5 mm for the patellar displacement, and 4.0°-5.0° for the patellar tilt. The proposed method showed that surgery reduced the patellar displacement and tilt at maximum extension by approximately 10-15 mm and 15°-20° for three patients, with less evident differences for two of the patients. A semi-automatic method suitable for quantification of the patellofemoral motion pattern as captured by 4D CT data has been presented. Its observer variability is on par with that of other methods, but with the distinct advantage of supporting continuous motion during image acquisition.
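
    Once landmarks are propagated into each 3D volume, the reported measures reduce to simple vector geometry. A sketch of the femorotibial angle computation, with invented landmark coordinates:

```python
# Angle between femur and tibia from long-axis vectors defined by landmarks.
import numpy as np

def angle_deg(u, v):
    u, v = np.asarray(u, float), np.asarray(v, float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Hypothetical landmark positions (mm) in one time frame:
femur_prox, femur_dist = np.array([0, 0, 400]), np.array([5, 8, 120])
tibia_prox, tibia_dist = np.array([6, 9, 110]), np.array([12, 20, -180])

femur_axis = femur_dist - femur_prox
tibia_axis = tibia_dist - tibia_prox
print(f"femorotibial angle: {angle_deg(femur_axis, tibia_axis):.1f} deg")
```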

  1. Automatic generation of data merging program codes.

    OpenAIRE

    Hyensook, Kim; Oussena, Samia; Zhang, Ying; Clark, Tony

    2010-01-01

    Data merging is an essential part of the ETL (Extract-Transform-Load) processes used to build a data warehouse system. To avoid reinventing merging techniques, we propose a Data Merging Meta-model (DMM) and its transformation into executable program codes in the manner of model-driven engineering. DMM allows the relationships of different model entities and their merging types to be defined at the conceptual level. Our formalized transformation, described using ATL (ATLAS Transformation Language), enables automatic g...

  2. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Natural language processing techniques for automatic test questions generation using discourse connectives. ... Journal of Computer Science and Its Application.

  3. Data-driven management using quantitative metric and automatic auditing program (QMAP) improves consistency of radiation oncology processes.

    Science.gov (United States)

    Yu, Naichang; Xia, Ping; Mastroianni, Anthony; Kolar, Matthew D; Chao, Samuel T; Greskovich, John F; Suh, John H

    Process consistency in the planning and delivery of radiation therapy is essential to maintain patient safety and treatment quality and efficiency. Ensuring the timely completion of each critical clinical task is one aspect of process consistency. The purpose of this work is to report our experience in implementing a quantitative metric and automatic auditing program (QMAP) with the goal of improving the timely completion of critical clinical tasks. Based on our clinical electronic medical records system, we developed a software program to automatically capture the completion timestamp of each critical clinical task while providing frequent alerts of potential delinquency. These alerts were directed to designated triage teams within a time window that would offer an opportunity to mitigate the potential for late completion. Since July 2011, 18 metrics have been introduced into our clinical workflow. We compared the delinquency rates for 4 selected metrics before the implementation of the metric with the delinquency rates of 2016; a one-tailed Student t test was used for statistical analysis. RESULTS: With an average of 150 daily patients on treatment at our main campus, the late treatment plan completion rate and the late weekly physics check rate were reduced from 18.2% and 8.9% in 2011 to 4.2% and 0.1% in 2016, respectively (P < .01). The late weekly on-treatment physician visit rate was reduced from 7.2% in 2012 to <1.6% in 2016. The yearly late cone beam computed tomography review rate was reduced from 1.6% in 2011 to <0.1% in 2016. QMAP is effective in reducing late completion of critical tasks, which can positively impact treatment quality and patient safety by reducing the potential for errors resulting from distraction, interruption, and rush in the completion of critical tasks. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
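
    A toy sketch of the auditing idea described above: capture a completion timestamp per critical task and alert a triage team while there is still time to act. The task names, deadlines and alert hook are assumptions, not the authors' EMR integration.

```python
# Deadline auditing with an early-warning window for a triage team.
from datetime import datetime, timedelta

DEADLINES = {                       # hours allowed before the task is late
    "treatment_plan_approval": 72,
    "weekly_physics_check": 168,
}
ALERT_MARGIN = timedelta(hours=12)  # warn this long before the deadline

def audit(task, started, completed=None, now=None, alert=print):
    deadline = started + timedelta(hours=DEADLINES[task])
    now = now or datetime.now()
    if completed is not None:
        return completed <= deadline             # True = completed on time
    if now >= deadline:
        alert(f"LATE: {task} missed deadline {deadline:%Y-%m-%d %H:%M}")
    elif now >= deadline - ALERT_MARGIN:
        alert(f"DUE SOON: {task} must be done by {deadline:%Y-%m-%d %H:%M}")
    return None

start = datetime(2016, 5, 1, 8, 0)
audit("treatment_plan_approval", started=start,
      now=start + timedelta(hours=65))   # prints a DUE SOON alert
```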

  4. Learning algorithms and automatic processing of languages

    International Nuclear Information System (INIS)

    Fluhr, Christian Yves Andre

    1977-01-01

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to the automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to describe how these mechanisms are simulated on a computer, and outlines the specific role of learning in various manifestations of intelligence. Then, based on Markov algorithm theory, the author discusses the notion of a learning algorithm. Two main types of learning algorithm are addressed: first, a sanction-based algorithm of the 'algorithm-teacher dialogue' type, which aims at learning how to resolve grammatical ambiguities in submitted texts; second, an algorithm related to a document system, which automatically structures semantic data obtained from a set of texts in order to be able to answer, by reference, any question on the content of these texts.

  5. Automatic River Network Extraction from LIDAR Data

    Science.gov (United States)

    Maderal, E. N.; Valcarcel, N.; Delgado, J.; Sevilla, C.; Ojeda, J. C.

    2016-06-01

    The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with a 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files, with the infrastructure to store (up to 40 Tb between results and intermediate files) and process the data using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and the management of human resources have also been important. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.

  6. AUTOMATIC RIVER NETWORK EXTRACTION FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    E. N. Maderal

    2016-06-01

    Full Text Available The National Geographic Institute of Spain (IGN-ES) has launched a new production system for automatic river network extraction for the Geospatial Reference Information (GRI) within the hydrography theme. The goal is to obtain an accurate and updated river network, extracted as automatically as possible. For this, IGN-ES has full LiDAR coverage of the whole Spanish territory with a density of 0.5 points per square meter. To implement this work, the technical feasibility was validated and a methodology was developed to automate each production phase: generation of hydrological terrain models with a 2 meter grid size, and river network extraction combining hydrographic criteria (topographic network) and hydrological criteria (flow accumulation river network); finally, production was launched. The key points of this work have been managing a big data environment of more than 160,000 LiDAR data files, with the infrastructure to store (up to 40 Tb between results and intermediate files) and process the data using local virtualization and Amazon Web Services (AWS), which allowed this automatic production to be completed within 6 months; software stability (TerraScan-TerraSolid, GlobalMapper-Blue Marble, FME-Safe, ArcGIS-Esri) and the management of human resources have also been important. The result of this production has been an accurate automatic river network extraction for the whole country, with a significant improvement in the altimetric component of the 3D linear vector. This article presents the technical feasibility, the production methodology, the automatic river network extraction production and its advantages over traditional vector extraction systems.
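
    The "flow accumulation river network" criterion named in the two records above can be illustrated with a toy D8 flow-accumulation pass over a small DEM, keeping cells above a threshold as channel cells. The grid and threshold are invented; the production system uses the dedicated tools listed in the abstract.

```python
# D8 flow accumulation on a DEM grid; high-accumulation cells become channels.
import numpy as np

OFFSETS = [(-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)]

def flow_accumulation(dem):
    rows, cols = dem.shape
    acc = np.ones_like(dem, dtype=float)       # each cell drains itself
    # Visit cells from highest to lowest so every donor is handled first.
    flat = np.argsort(dem, axis=None)[::-1]
    order = np.dstack(np.unravel_index(flat, dem.shape))[0]
    for r, c in order:
        # D8 rule: send accumulation to the steepest downslope neighbour.
        best, drop = None, 0.0
        for dr, dc in OFFSETS:
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and dem[r, c] - dem[rr, cc] > drop:
                best, drop = (rr, cc), dem[r, c] - dem[rr, cc]
        if best:
            acc[best] += acc[r, c]
    return acc

dem = np.array([[5., 4., 3.], [4., 3., 2.], [3., 2., 1.]])   # toy DEM
channels = flow_accumulation(dem) >= 3.0   # threshold picks the channel cells
print(channels)
```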

  7. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    ... In the second dataset an accidental release of radioactivity in the environment was simulated in the south-western corner of the monitored area. The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset. However, it is capable of identifying a release of radioactivity provided that the number of sampling locations is sufficiently high. Consequently, we believe that a combination of applying the presented mapping approach and the physical knowledge of the transport processes of radioactivity should be used to predict the extreme values.

  8. Automatic, semi-automatic and manual validation of urban drainage data.

    Science.gov (United States)

    Branisavljević, N; Prodanović, D; Pavlović, D

    2010-01-01

    Advances in sensor technology and the possibility of automated long-distance data transmission have made continuous measurements the preferable way of monitoring urban drainage processes. Usually, the collected data have to be processed by an expert in order to detect and mark the wrong data, remove them and replace them with interpolated data. In general, the first step, detecting the wrong or anomalous data, is called data quality assessment or data validation. Data validation consists of three parts: data preparation, validation scores generation and scores interpretation. This paper presents the overall framework for the data quality improvement system, suitable for automatic, semi-automatic or manual operation. The first two steps of the validation process are explained in more detail, using several validation methods on the same set of real-case data from the Belgrade sewer system. The final part of the validation process, the scores interpretation, needs to be investigated further on the developed system.
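
    A minimal sketch of the "validation scores generation" step described above, assuming simple range and spike checks as the validation methods (the Belgrade system's actual methods and thresholds are not reproduced here): each method scores every sample, and score interpretation is deliberately left as a separate, possibly manual, step.

        # Hypothetical two-method validation scoring; thresholds are invented.
        import numpy as np

        def range_score(x, lo, hi):
            # 1.0 where the sample lies inside the physically plausible range
            return ((x >= lo) & (x <= hi)).astype(float)

        def spike_score(x, max_step):
            # 1.0 where the jump from the previous sample is plausible
            step = np.abs(np.diff(x, prepend=x[0]))
            return (step <= max_step).astype(float)

        level = np.array([0.42, 0.44, 0.43, 3.90, 0.45])   # water level [m], toy data
        scores = np.vstack([range_score(level, 0.0, 2.0),
                            spike_score(level, 0.5)])
        suspect = scores.min(axis=0) < 1.0                 # candidates for review
        print(np.flatnonzero(suspect))                     # -> [3 4]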

  9. Automatically processed alpha-track radon monitor

    International Nuclear Information System (INIS)

    Langner, G.H. Jr.

    1993-01-01

    An automatically processed alpha-track radon monitor is provided which includes a housing having an aperture allowing radon entry, and a filter that excludes the entry of radon daughters into the housing. A flexible track registration material is located within the housing that records alpha-particle emissions from the decay of radon and radon daughters inside the housing. The flexible track registration material is capable of being spliced such that the registration material from a plurality of monitors can be spliced into a single strip to facilitate automatic processing of the registration material from the plurality of monitors. A process for the automatic counting of radon registered by a radon monitor is also provided

  10. Automatic and strategic processes in advertising effects

    DEFF Research Database (Denmark)

    Grunert, Klaus G.

    1996-01-01

    Two kinds of cognitive processes can be distinguished: automatic processes, which are mostly subconscious, are learned and changed very slowly, and are not subject to the capacity limitations of working memory; and strategic processes, which are conscious and are subject to capacity limitations. Automatic processes govern the retrieval of information and provide a heuristic for brand evaluation, while strategic processes govern learning and inference formation. The relative importance of both types of processes will depend on product involvement. The distinction of these two types of processes leads to some conclusions which are at variance with current notions about advertising effects. For example, the attention span problem will be relevant only for strategic processes, not for automatic processes; a certain amount of learning can occur with very little conscious effort; and advertising's effect on brand evaluation may be more stable ...

  11. A CityGML extension for traffic-sign objects that guides the automatic processing of data collected using Mobile Mapping technology

    Science.gov (United States)

    Varela-González, M.; Riveiro, B.; Arias-Sánchez, P.; González-Jorge, H.; Martínez-Sánchez, J.

    2014-11-01

    The rapid evolution of integral schemes, accounting for geometric and semantic data, has been largely motivated by the advances of the last decade in mobile laser scanning technology; automation in data processing has also recently influenced the expansion of the new model concepts. This paper reviews some important issues involved in the new paradigms of city 3D modelling: an interoperable schema for city 3D modelling (CityGML) and mobile mapping technology to provide the features that compose the city model. This paper focuses on traffic signs, discussing their characterization using CityGML in order to ease the implementation of LiDAR technology in road management software, as well as analysing some limitations of the current technology in the automatic detection and classification of such objects.

  12. Data processing

    CERN Document Server

    Fry, T F

    2013-01-01

    Data Processing discusses the principles, practices, and associated tools in data processing. The book is comprised of 17 chapters that are organized into three parts. The first part covers the characteristics, systems, and methods of data processing. Part 2 deals with the data processing practice; this part discusses the data input, output, and storage. The last part discusses topics related to systems and software in data processing, which include checks and controls, computer language and programs, and program elements and structures. The text will be useful to practitioners of computer-rel

  13. Process and device for automatically surveying complex installations

    International Nuclear Information System (INIS)

    Pekrul, P.J.; Thiele, A.W.

    1976-01-01

    A description is given of a process for automatically analysing, in real time, separate signal processing channels, one channel per signal, in a facility with significant background noise in the time-varying signals coming from transducers at selected points for the continuous monitoring of the operating conditions of the various components of the installation. The signals are intended to reveal potential breakdowns, to support conclusions as to the severity of these potential breakdowns, and to indicate to an operator the measures to be taken in consequence. The feature of this process is that it comprises the automatic and successive selection of each channel for spectral analysis, the automatic processing of the signal of each selected channel to produce energy spectral density data at pre-determined frequencies, the automatic comparison of the energy spectral density data of each channel with pre-determined sets of limits varying with frequency, and the automatic indication to the operator of the condition of the various components of the installation associated with each channel and of the measures to be taken depending on the set of limits [fr]
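
    A hedged sketch of the per-channel processing just described: estimate the energy spectral density of one channel and compare it against a pre-determined, frequency-dependent set of limits. The sampling rate, test signal and limit curve below are invented for illustration.

        # Per-channel spectral check; all numbers are illustrative.
        import numpy as np
        from scipy.signal import welch

        fs = 2000.0                                   # sampling rate [Hz], assumed
        t = np.arange(0, 4.0, 1.0 / fs)
        rng = np.random.default_rng(0)
        channel = np.sin(2 * np.pi * 120 * t) + 0.5 * rng.standard_normal(t.size)

        freqs, psd = welch(channel, fs=fs, nperseg=1024)

        limit = np.where(freqs < 500.0, 1e-2, 1e-3)   # limits varying with frequency
        exceeded = psd > limit
        if exceeded.any():
            print("condition alert near", freqs[exceeded][:3], "Hz")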

  14. Development of automatic techniques for GPS data management

    International Nuclear Information System (INIS)

    Park, Pil Ho

    2001-06-01

    For a GPS center, automation is necessary for effective management of a GPS network, including data gathering, data transformation, data backup, data sending to the IGS (International GPS Service for Geodynamics), and the gathering of precise ephemerides. The operating programs of GPS centers have been adopted at KCSC (Korea Cadastral Survey Corporation), NGI (National Geography Institute) and MOMAF (Ministry of Maritime Affairs and Fisheries) without in-house development of the core techniques. Automatic management of a GPS network consists of GPS data management and data processing. It is also a fundamental technique that should be mastered by every GPS center. Therefore, this study analyzed the Japanese GPS center, which has accomplished automation by module, considering the applicability for domestic GPS centers

  15. Semi-automatic Data Integration using Karma

    Science.gov (United States)

    Garijo, D.; Kejriwal, M.; Pierce, S. A.; Houser, P. I. Q.; Peckham, S. D.; Stanko, Z.; Hardesty Lewis, D.; Gil, Y.; Pennington, D. D.; Knoblock, C.

    2017-12-01

    Data integration applications are ubiquitous in scientific disciplines. A state-of-the-art data integration system accepts both a set of data sources and a target ontology as input, and semi-automatically maps the data sources in terms of concepts and relationships in the target ontology. Mappings can be both complex and highly domain-specific. Once such a semantic model, expressing the mapping using community-wide standard, is acquired, the source data can be stored in a single repository or database using the semantics of the target ontology. However, acquiring the mapping is a labor-prone process, and state-of-the-art artificial intelligence systems are unable to fully automate the process using heuristics and algorithms alone. Instead, a more realistic goal is to develop adaptive tools that minimize user feedback (e.g., by offering good mapping recommendations), while at the same time making it intuitive and easy for the user to both correct errors and to define complex mappings. We present Karma, a data integration system that has been developed over multiple years in the information integration group at the Information Sciences Institute, a research institute at the University of Southern California's Viterbi School of Engineering. Karma is a state-of-the-art data integration tool that supports an interactive graphical user interface, and has been featured in multiple domains over the last five years, including geospatial, biological, humanities and bibliographic applications. Karma allows a user to import their own ontology and datasets using widely used formats such as RDF, XML, CSV and JSON, can be set up either locally or on a server, supports a native backend database for prototyping queries, and can even be seamlessly integrated into external computational pipelines, including those ingesting data via streaming data sources, Web APIs and SQL databases. We illustrate a Karma workflow at a conceptual level, along with a live demo, and show use cases of
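
    Karma's mappings are built interactively in its graphical interface, so the tool itself is not scripted here; the sketch below only illustrates the end product of such a mapping -- source records materialised as RDF under a target ontology. The namespace, class and property names are hypothetical.

        # Result of a (hypothetical) source-to-ontology mapping, using rdflib.
        import csv, io
        from rdflib import Graph, Literal, Namespace, RDF

        HYDRO = Namespace("http://example.org/hydro#")     # invented ontology
        g = Graph()

        source = io.StringIO("station,discharge\nS1,12.3\nS2,8.7\n")
        for row in csv.DictReader(source):
            s = HYDRO[row["station"]]
            g.add((s, RDF.type, HYDRO.GaugingStation))     # concept mapping
            g.add((s, HYDRO.discharge, Literal(float(row["discharge"]))))

        print(g.serialize(format="turtle"))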

  16. DEVELOPMENT AND TESTING OF GEO-PROCESSING MODELS FOR THE AUTOMATIC GENERATION OF REMEDIATION PLAN AND NAVIGATION DATA TO USE IN INDUSTRIAL DISASTER REMEDIATION

    Directory of Open Access Journals (Sweden)

    G. Lucas

    2015-08-01

    Full Text Available This paper introduces research done on the automatic preparation of remediation plans and navigation data for the precise guidance of heavy machinery in clean-up work after an industrial disaster. The input test data consist of a pollution extent shapefile derived from the processing of hyperspectral aerial survey data from the Kolontár red mud disaster. Three algorithms were developed and the respective scripts were written in Python. The first model aims at drawing a parcel clean-up plan. The model tests four different parcel orientations (0, 90, 45 and 135 degrees) and keeps the plan in which the clean-up parcels are least numerous, considering it the optimal spatial configuration. The second model drifts the clean-up parcels of a work plan both vertically and horizontally, following a grid pattern with a sampling distance of a fifth of a parcel width, and keeps the most optimal drifted version, again with the aim of reducing the final number of parcel features. The last model draws a navigation line in the middle of each clean-up parcel. The models work efficiently and achieve automatic optimized plan generation (parcels and navigation lines). Applying the first model we demonstrated that, depending on the size and geometry of the features of the contaminated area layer, the number of clean-up parcels generated by the model varies in a range of 4% to 38% from plan to plan. Such a significant variation in the resulting feature numbers shows that identifying the optimal orientation can save work, time and money in remediation. The various tests demonstrated that the model gains efficiency when (1) the individual features of the contaminated area present a significant orientation in their geometry (features are long), and (2) the size of the pollution extent features becomes closer to the size of the parcels (scale effect). The second model shows only a 1% difference with the variation of feature number, so it is less interesting for
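
    The record above describes the orientation search concretely enough to sketch: lay a parcel grid over the pollution extent at each of the four orientations and keep the orientation needing the fewest parcels. The sketch below uses shapely with a toy polygon and invented parcel dimensions; the published models read shapefiles instead.

        # Toy version of the first model's orientation test.
        from shapely import affinity
        from shapely.geometry import Polygon, box

        pollution = Polygon([(0, 0), (40, 5), (38, 18), (2, 12)])  # toy extent
        W, H = 10.0, 4.0                                           # parcel size [m]

        def parcels_needed(poly, angle_deg):
            # Rotating the extent is equivalent to rotating the grid.
            rot = affinity.rotate(poly, -angle_deg, origin="centroid")
            minx, miny, maxx, maxy = rot.bounds
            count, y = 0, miny
            while y < maxy:
                x = minx
                while x < maxx:
                    if box(x, y, x + W, y + H).intersects(rot):
                        count += 1
                    x += W
                y += H
            return count

        best = min((parcels_needed(pollution, a), a) for a in (0, 90, 45, 135))
        print("fewest parcels: %d at %d degrees" % best)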

  17. Draft Automatic Data Acquisition System Plan

    International Nuclear Information System (INIS)

    1987-04-01

    This Automatic Data Acquisition System (ADAS) Plan has been prepared in support of the requirement for detailed site characterization of the Deaf Smith County candidate repository site in salt, and describes the data acquisition system which will be used for unattended data collection from the geotechnical instrumentation installed at the site. Section 1.1 discusses the programmatic background to the plan, Section 1.2 presents the scope and purpose of the plan, and the organization of the document is given in Section 1.3. 31 refs., 34 figs., 8 tabs

  18. Data processing

    International Nuclear Information System (INIS)

    Cousot, P.

    1988-01-01

    The 1988 progress report of the Data Processing Laboratory (Polytechnic School, France) is presented. The laboratory's research fields are: semantics, testing and semantic analysis of code, formal calculus, software applications, algorithms, neural networks and VLSI (Very Large Scale Integration). The investigations concerning polynomial rings are performed by means of the standard basis approach. Other research topics include Pascal codes, parallel processing, the combinatorial, statistical and asymptotic properties of fundamental data processing tools, signal processing and pattern recognition. The published papers, the congress communications and the theses are also included [fr]

  19. Accessories for Enhancement of the Semi-Automatic Welding Processes

    National Research Council Canada - National Science Library

    Wheeler, Douglas M; Sawhill, James M

    2000-01-01

    The project's objective is to identify specific areas of the semi-automatic welding operation that is performed with the major semi-automatic processes, which would be more productive if a suitable...

  20. Automation of chromosomes analysis. Automatic system for image processing

    International Nuclear Information System (INIS)

    Le Go, R.; Cosnac, B. de; Spiwack, A.

    1975-01-01

    The A.S.T.I. is an automatic system for the fast conversational processing of all kinds of images (cells, chromosomes) converted to a numerical data set (120,000 points, 16 grey levels stored in a MOS memory) through a fast D.O. analyzer. The system automatically performs the isolation of any individual image, whose area and weighted area are computed. These results are displayed directly on the command panel and can be transferred to a minicomputer for further computations. A bright spot allows parts of an image to be picked out and the results to be displayed. This study is particularly directed towards automatic karyotyping [fr]

  1. Automatic generation of combinatorial test data

    CERN Document Server

    Zhang, Jian; Ma, Feifei

    2014-01-01

    This book reviews the state-of-the-art in combinatorial testing, with particular emphasis on the automatic generation of test data. It describes the most commonly used approaches in this area - including algebraic construction, greedy methods, evolutionary computation, constraint solving and optimization - and explains major algorithms with examples. In addition, the book lists a number of test generation tools, as well as benchmarks and applications. Addressing a multidisciplinary topic, it will be of particular interest to researchers and professionals in the areas of software testing, combi
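
    As a taste of the greedy methods the book surveys, here is a minimal pairwise (2-way) test data generator: repeatedly pick the full combination that covers the most still-uncovered value pairs. The factor model is invented and the search is brute-force, so this is only the core idea, not an optimised tool.

        # Greedy pairwise covering-array construction (toy scale).
        from itertools import combinations, product

        factors = {"os": ["linux", "mac"], "db": ["pg", "my", "sq"], "tls": [0, 1]}
        names = list(factors)

        def pairs(row):
            vals = dict(zip(names, row))
            return {((a, vals[a]), (b, vals[b])) for a, b in combinations(names, 2)}

        uncovered = set().union(*(pairs(r) for r in product(*factors.values())))
        tests = []
        while uncovered:
            row = max(product(*factors.values()),
                      key=lambda r: len(pairs(r) & uncovered))
            tests.append(dict(zip(names, row)))
            uncovered -= pairs(row)

        print(len(tests), "tests instead of", len(list(product(*factors.values()))))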

  2. Word Processing in Dyslexics: An Automatic Decoding Deficit?

    Science.gov (United States)

    Yap, Regina; Van Der Leu, Aryan

    1993-01-01

    Compares dyslexic children with normal readers on measures of phonological decoding and automatic word processing. Finds that dyslexics have a deficit in automatic phonological decoding skills. Discusses results within the framework of the phonological deficit and the automatization deficit hypotheses. (RS)

  3. Automatic protein structure solution from weak X-ray data

    Science.gov (United States)

    Skubák, Pavol; Pannu, Navraj S.

    2013-11-01

    Determining new protein structures from X-ray diffraction data at low resolution or with a weak anomalous signal is a difficult and often an impossible task. Here we propose a multivariate algorithm that simultaneously combines the structure determination steps. In tests on over 140 real data sets from the protein data bank, we show that this combined approach can automatically build models where current algorithms fail, including an anisotropically diffracting 3.88 Å RNA polymerase II data set. The method seamlessly automates the process, is ideal for non-specialists and provides a mathematical framework for successfully combining various sources of information in image processing.

  4. Semi-automatic film processing unit

    International Nuclear Information System (INIS)

    Mohamad Annuar Assadat Husain; Abdul Aziz Bin Ramli; Mohd Khalid Matori

    2005-01-01

    The design concept applied in the development of a semi-automatic film processing unit requires creativity and user input in channelling the information needed to select materials and an operating system that suit the resulting design. Low cost and efficient operation are challenges that must be met while keeping abreast of fast technological advancement. In producing this processing unit, a few elements need to be considered in order to produce a high quality image. Consistent movement and correct time coordination for developing and drying are among the elements that need to be controlled. Other elements that need serious attention are temperature, liquid density and the amount of time for the chemical liquids to react. The subsequent chemical reactions that take place cause the liquid chemicals to age, which adversely affects the quality of the image produced. The unit is also equipped with a liquid chemical drainage system and a chemical disposal tank. It would be useful in GP clinics, especially in rural areas which still develop film manually and require low operational cost. (Author)

  5. Distributed automatic control of technological processes in conditions of weightlessness

    Science.gov (United States)

    Kukhtenko, A. I.; Merkulov, V. I.; Samoylenko, Y. I.; Ladikov-Royev, Y. P.

    1986-01-01

    Some problems associated with the automatic control of liquid metal and plasma systems under conditions of weightlessness are examined, with particular reference to the problem of stability of liquid equilibrium configurations. The theoretical fundamentals of automatic control of processes in electrically conducting continuous media are outlined, and means of using electromagnetic fields for simulating technological processes in a space environment are discussed.

  6. Automatic computation of radioimmunoassay data. Insulin and C-peptide

    Energy Technology Data Exchange (ETDEWEB)

    Toyota, T; Kudo, M; Abe, K [Hirosaki Univ., Aomori (Japan). School of Medicine; Kawamata, F; Uehata, S

    1975-09-01

    Radioimmunoassay provided dose-response curves which showed linearity with the use of the logistic transformation (Rodbard). This transformation, which is applicable to radioimmunoassay, should be useful for the computer processing of insulin and C-peptide assays. In the present studies, standard curves were analysed by testing the fit of analytic functions to radioimmunoassays of insulin and C-peptides. A program for use in combination with the double antibody technique was written by Dr. Kawamata. This approach proved useful for the automatic computation of data derived from double antibody assays of insulin and C-peptides. The automatic corrected calculation of insulin radioimmunoassay data was found to be satisfactory.
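
    A sketch of the kind of curve handling the record describes, assuming the common four-parameter logistic form associated with Rodbard's work; the doses and counts below are invented toy values, not data from the study.

        # Four-parameter logistic standard curve with dose read-back.
        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, a, b, c, d):
            # a: response at zero dose, d: at infinite dose, c: ED50, b: slope
            return d + (a - d) / (1.0 + (x / c) ** b)

        dose  = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # standards
        bound = np.array([92., 85., 66., 40., 18., 8.])      # % bound (toy)

        (a, b, c, d), _ = curve_fit(four_pl, dose, bound, p0=[95., 1., 2., 5.])

        def dose_of(y):
            # Invert the fitted curve for an unknown sample's response.
            return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

        print(round(dose_of(50.0), 2))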

  7. Some results of automatic processing of images

    International Nuclear Information System (INIS)

    Golenishchev, I.A.; Gracheva, T.N.; Khardikov, S.V.

    1975-01-01

    The problems of automatically deciphering radiographic pictures, the purpose of which is to reach a conclusion concerning the quality of the inspected product on the basis of the product defect images in the picture, are considered. The methods of defect image recognition are listed, and the algorithms and the class features of defects are described. The results of deciphering a small radiographic picture by means of the "Minsk-22" computer are presented. It is established that the sensitivity of the automatic deciphering method is close to that obtained with visual deciphering.

  8. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    Barisits, Martin-Stefan; The ATLAS collaboration; Garonne, Vincent; Lassnig, Mario; Beermann, Thomas

    2016-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration has now been successfully operated for over a year. However, with the forthcoming start of run-2 and its expected workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing data imbalances as well as storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only ...

  9. Automatic rebalancing of data in ATLAS distributed data management

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00473067; The ATLAS collaboration; Serfon, Cedric; Garonne, Vincent; Lassnig, Mario; Beermann, Thomas; Javurek, Tomas

    2017-01-01

    The ATLAS Distributed Data Management system stores more than 220PB of physics data across more than 130 sites globally. Rucio, the next generation data management system of the ATLAS collaboration, has now been successfully operated for two years. However, with the increasing workload and utilization, more automated and advanced methods of managing the data are needed. In this article we present an extension to the data management system, which is in charge of detecting and foreseeing storage elements reaching and surpassing their capacity limit. The system automatically and dynamically rebalances the data to other storage elements, while respecting and guaranteeing data distribution policies and ensuring the availability of the data. This concept not only lowers the operational burden, as these cumbersome procedures had previously to be done manually, but it also enables the system to use its distributed resources more efficiently, which not only affects the data management system itself, but in consequence...
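
    The two records above describe the same decision logic at different stages of the project; the sketch below is not Rucio code, only a toy rendering of that logic: find storage elements above a utilisation threshold and move their largest datasets to the least-utilised element. Policy and availability checks, central to the real system, are omitted, and all structures and numbers are invented.

        # Hypothetical rebalancing pass over two storage elements.
        THRESHOLD = 0.85

        storage = {
            "SE_A": {"capacity": 100, "used": 92, "datasets": {"d1": 30, "d2": 15}},
            "SE_B": {"capacity": 100, "used": 40, "datasets": {"d3": 40}},
        }

        def utilisation(se):
            return storage[se]["used"] / storage[se]["capacity"]

        moves = []
        for se in sorted(storage, key=utilisation, reverse=True):
            while utilisation(se) > THRESHOLD:
                name, size = max(storage[se]["datasets"].items(), key=lambda kv: kv[1])
                dest = min(storage, key=utilisation)
                if dest == se:
                    break                              # nowhere better to move it
                storage[se]["datasets"].pop(name)
                storage[se]["used"] -= size
                storage[dest]["datasets"][name] = size
                storage[dest]["used"] += size
                moves.append((name, se, dest))

        print(moves)                                   # [('d1', 'SE_A', 'SE_B')]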

  10. Scheduling algorithms for automatic control systems for technological processes

    Science.gov (United States)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    The wide use of automatic process control systems and of high-performance systems containing a number of computers (processors) creates opportunities for high-quality and fast production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and the processing of big data arrays all require a high level of productivity together with minimum time for data handling and the delivery of results. In order to reach the best time, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. Some basic task scheduling methods for multi-machine process control systems are considered in this paper, their advantages and disadvantages are brought to light, and some considerations on their use in developing software for automatic process control systems are made.
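
    One classic member of this family of techniques is longest-processing-time-first (LPT) list scheduling, which assigns each job, longest first, to the currently least-loaded processor. The sketch below is a generic illustration of that method, not an algorithm taken from the paper.

        # LPT list scheduling onto identical machines.
        import heapq

        def lpt_schedule(durations, n_machines):
            heap = [(0.0, m) for m in range(n_machines)]   # (load, machine)
            assignment = {m: [] for m in range(n_machines)}
            for job, d in sorted(enumerate(durations), key=lambda jd: -jd[1]):
                load, m = heapq.heappop(heap)              # least-loaded machine
                assignment[m].append(job)
                heapq.heappush(heap, (load + d, m))
            return assignment, max(load for load, _ in heap)

        plan, makespan = lpt_schedule([7.0, 5.0, 4.0, 3.0, 3.0, 2.0], 2)
        print(plan, makespan)                              # makespan 12.0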

  11. Automatic image processing as a means of safeguarding nuclear material

    International Nuclear Information System (INIS)

    Kahnmeyer, W.; Willuhn, K.; Uebel, W.

    1985-01-01

    Problems involved in computerized analysis of pictures taken by automatic film or video cameras in the context of international safeguards implementation are described. They include technical ones as well as the need to establish objective criteria for assessing image information. In the near future automatic image processing systems will be useful in verifying the identity and integrity of IAEA seals. (author)

  12. Data Processing

    Science.gov (United States)

    Grangeat, P.

    A new area of biology has been opened up by nanoscale exploration of the living world. This has been made possible by technological progress, which has provided the tools needed to make devices that can measure things on such length and time scales. In a sense, this is a new window upon the living world, so rich and so diverse. Many of the investigative methods described in this book seek to obtain complementary physical, chemical, and biological data to understand the way it works and the way it is organised. At these length and time scales, only dedicated instrumentation could apprehend the relevant phenomena. There is no way for our senses to observe these things directly. One important field of application is molecular medicine, which aims to explain the mechanisms of life and disease by the presence and quantification of specific molecular entities. This involves combining information about genes, proteins, cells, and organs. This in turn requires the association of instruments for molecular diagnosis, either in vitro, e.g., the microarray or the lab-on-a-chip, or in vivo, e.g., probes for molecular biopsy, and tools for molecular imaging, used to localise molecular information in living organisms in a non-invasive way. These considerations concern both preclinical research for drug design and human medical applications. With the development of DNA and RNA chips [1], genomics has revolutionised investigative methods for cells and cell processes [2,3]. By sequencing the human genome, new ways have been found for understanding the fundamental mechanisms of life [4]. A revolution is currently under way with the analysis of the proteome [5-8], i.e., the complete set of proteins that can be found in some given biological medium, such as the blood plasma. The goal is to characterise certain diseases by recognisable signatures in the proteomic profile, as determined from a blood sample or a biopsy, for example [9-13]. What is at stake is the early detection of

  13. Automatic processing of multimodal tomography datasets.

    Science.gov (United States)

    Parsons, Aaron D; Price, Stephen W T; Wadeson, Nicola; Basham, Mark; Beale, Andrew M; Ashton, Alun W; Mosselmans, J Frederick W; Quinn, Paul D

    2017-01-01

    With the development of fourth-generation high-brightness synchrotrons on the horizon, the already large volume of data collected on imaging and mapping beamlines is set to increase by orders of magnitude. As such, an easy and accessible way of dealing with such large datasets as quickly as possible is required in order to be able to address the core scientific problems during the experimental data collection. Savu is an accessible and flexible big data processing framework that is able to deal with both the variety and the volume of multimodal and multidimensional scientific datasets, such as those output by chemical tomography experiments on the I18 microfocus scanning beamline at Diamond Light Source.

  14. Advancements in Big Data Processing

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration

    2012-01-01

    The ever-increasing volumes of scientific data present new challenges for Distributed Computing and Grid technologies. The emerging Big Data revolution drives new discoveries in scientific fields including nanotechnology, astrophysics, high-energy physics, biology and medicine. New initiatives are transforming data-driven scientific fields by pushing Big Data limits, enabling massive data analysis in new ways. In petascale data processing scientists deal with datasets, not individual files. As a result, a task (comprised of many jobs) became a unit of petascale data processing on the Grid. Splitting of a large data processing task into jobs enabled fine-granularity checkpointing analogous to the splitting of a large file into smaller TCP/IP packets during data transfers. Transferring large data in small packets achieves reliability through automatic re-sending of the dropped TCP/IP packets. Similarly, transient job failures on the Grid can be recovered by automatic re-tries to achieve reliable Six Sigma produc...

  15. Multichannel display system with automatic sequential output of analog data

    International Nuclear Information System (INIS)

    Bykovskii, Yu.A.; Gruzinov, A.E.; Lagoda, V.B.

    1989-01-01

    The authors describe a device that, with maximum simplicity and autonomy, permits parallel data display from 16 measuring channels with automatic output to the screen of a storage oscilloscope in ∼ 50 μsec. The described device can be used to study the divergence characteristics of the ion component of plasma sources and in optical and x-ray spectroscopy of pulsed processes. Owing to its compactness and autonomy, the device can be located in the immediate vicinity of the detectors (for example, inside a vacuum chamber), which allows the number of vacuum electrical lead-ins and the induction level to be reduced

  16. Beyond behaviorism: on the automaticity of higher mental processes.

    Science.gov (United States)

    Bargh, J A; Ferguson, M J

    2000-11-01

    The first 100 years of experimental psychology were dominated by 2 major schools of thought: behaviorism and cognitive science. Here the authors consider the common philosophical commitment to determinism by both schools, and how the radical behaviorists' thesis of the determined nature of higher mental processes is being pursued today in social cognition research on automaticity. In harmony with "dual process" models in contemporary cognitive science, which equate determined processes with those that are automatic and which require no intervening conscious choice or guidance, as opposed to "controlled" processes which do, the social cognition research on the automaticity of higher mental processes provides compelling evidence for the determinism of those processes. This research has revealed that social interaction, evaluation and judgment, and the operation of internal goal structures can all proceed without the intervention of conscious acts of will and guidance of the process.

  17. MEMOPS: data modelling and automatic code generation.

    Science.gov (United States)

    Fogh, Rasmus H; Boucher, Wayne; Ionides, John M C; Vranken, Wim F; Stevens, Tim J; Laue, Ernest D

    2010-03-25

    In recent years the amount of biological data has exploded to the point where much useful information can only be extracted by complex computational analyses. Such analyses are greatly facilitated by metadata standards, both in terms of the ability to compare data originating from different sources, and in terms of exchanging data in standard forms, e.g. when running processes on a distributed computing infrastructure. However, standards thrive on stability whereas science tends to constantly move, with new methods being developed and old ones modified. Therefore maintaining both metadata standards, and all the code that is required to make them useful, is a non-trivial problem. Memops is a framework that uses an abstract definition of the metadata (described in UML) to generate internal data structures and subroutine libraries for data access (application programming interfaces--APIs--currently in Python, C and Java) and data storage (in XML files or databases). For the individual project these libraries obviate the need for writing code for input parsing, validity checking or output. Memops also ensures that the code is always internally consistent, massively reducing the need for code reorganisation. Across a scientific domain a Memops-supported data model makes it easier to support complex standards that can capture all the data produced in a scientific area, share them among all programs in a complex software pipeline, and carry them forward to deposition in an archive. The principles behind the Memops generation code will be presented, along with example applications in Nuclear Magnetic Resonance (NMR) spectroscopy and structural biology.
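
    A toy rendering of the principle the paper describes -- deriving data-access classes, with generated validity checking, from a single abstract model description. Memops itself works from UML and emits Python, C and Java APIs; the one-entity model below is invented purely for illustration.

        # Generate a class with type-checked attributes from a model definition.
        model = {"Molecule": {"name": str, "mass": float}}

        def make_class(cls_name, attrs):
            def __init__(self, **kw):
                for attr, typ in attrs.items():
                    value = kw[attr]
                    if not isinstance(value, typ):      # generated validity check
                        raise TypeError("%s.%s must be %s"
                                        % (cls_name, attr, typ.__name__))
                    setattr(self, attr, value)
            return type(cls_name, (), {"__init__": __init__})

        Molecule = make_class("Molecule", model["Molecule"])
        m = Molecule(name="lysozyme", mass=14307.0)     # passes the checks
        # Molecule(name="x", mass="heavy") would raise TypeError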

  18. Neural Correlates of Automatic and Controlled Auditory Processing in Schizophrenia

    Science.gov (United States)

    Morey, Rajendra A.; Mitchell, Teresa V.; Inan, Seniha; Lieberman, Jeffrey A.; Belger, Aysenil

    2009-01-01

    Individuals with schizophrenia demonstrate impairments in selective attention and sensory processing. The authors assessed differences in brain function between 26 participants with schizophrenia and 17 comparison subjects engaged in automatic (unattended) and controlled (attended) auditory information processing using event-related functional MRI. Lower regional neural activation during automatic auditory processing in the schizophrenia group was not confined to just the temporal lobe, but also extended to prefrontal regions. Controlled auditory processing was associated with a distributed frontotemporal and subcortical dysfunction. Differences in activation between these two modes of auditory information processing were more pronounced in the comparison group than in the patient group. PMID:19196926

  19. Estimating spatial travel times using automatic vehicle identification data

    Science.gov (United States)

    2001-01-01

    Prepared ca. 2001. The paper describes an algorithm that was developed for estimating reliable and accurate average roadway link travel times using Automatic Vehicle Identification (AVI) data. The algorithm presented is unique in two aspects. First, ...
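
    The report's own algorithm is only partly quoted above, so the sketch below shows just the generic idea behind AVI-based link travel times: match tag IDs observed at both ends of a link, difference the timestamps, and filter outliers (e.g. vehicles that stopped en route) before averaging. The data and the filter rule are invented.

        # Link travel time from matched AVI reads (toy data, seconds).
        import statistics

        upstream   = {"tag1": 100.0, "tag2": 105.0, "tag3": 107.0}
        downstream = {"tag1": 160.0, "tag2": 171.0, "tag3": 600.0}  # tag3 detoured

        raw = [downstream[t] - upstream[t] for t in upstream if t in downstream]

        med = statistics.median(raw)
        valid = [tt for tt in raw if abs(tt - med) <= 0.5 * med]    # crude screen
        print(round(statistics.fmean(valid), 1), "s from", len(valid), "matches")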

  20. Automatic digital surface model (DSM) generation from aerial imagery data

    Science.gov (United States)

    Zhou, Nan; Cao, Shixiang; He, Hongyan; Xing, Kun; Yue, Chunyu

    2018-04-01

    Aerial sensors are widely used to acquire imagery for photogrammetric and remote sensing applications. In general, the images have large overlapping regions, which provide a lot of redundant geometric and radiometric information for matching. This paper presents a POS-supported dense matching procedure for automatic DSM generation from aerial imagery data. The method uses a coarse-to-fine hierarchical strategy with an effective combination of several image matching algorithms: image radiation pre-processing, image pyramid generation, feature point extraction and grid point generation, multi-image geometrically constrained cross-correlation (MIG3C), global relaxation optimization, multi-image geometrically constrained least squares matching (MIGCLSM), TIN generation and point cloud filtering. The image radiation pre-processing is used to reduce the effects of the inherent radiometric problems and optimize the images. The presented approach essentially consists of 3 components: a feature point extraction and matching procedure, a grid point matching procedure and a relational matching procedure. The MIGCLSM method is used to achieve potentially sub-pixel accuracy matches and identify some inaccurate and possibly false matches. The feasibility of the method has been tested on aerial images of different scales over different landcover types. The accuracy evaluation is based on the comparison between the automatically extracted DSMs derived from the precise exterior orientation parameters (EOPs) and the POS.

  1. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    Continuous recording of intraluminal pressures for extended periods of time is currently regarded as a valuable method for detection of esophageal motor abnormalities. A subsequent automatic analysis of the resulting motility data relies on strict mathematical criteria for recognition of pressure ... comparable. However, the neural network recognized pressure peaks clearly generated by muscular activity that had escaped detection by the conventional program. In conclusion, we believe that neurocomputing has potential advantages for automatic analysis of gastrointestinal motility data.

  2. Resource depletion promotes automatic processing: implications for distribution of practice.

    Science.gov (United States)

    Scheel, Matthew H

    2010-12-01

    Recent models of cognition include two processing systems: an automatic system that relies on associative learning, intuition, and heuristics, and a controlled system that relies on deliberate consideration. Automatic processing requires fewer resources and is more likely when resources are depleted. This study showed that prolonged practice on a resource-depleting mental arithmetic task promoted automatic processing on a subsequent problem-solving task, as evidenced by faster responding and more errors. Distribution of practice effects (0, 60, 120, or 180 sec. between problems) on rigidity also disappeared when groups had equal time on resource-depleting tasks. These results suggest that distribution of practice effects is reducible to resource availability. The discussion includes implications for interpreting discrepancies in the traditional distribution of practice effect.

  3. Automatic Data Traffic Control on DSM Architecture

    Science.gov (United States)

    Frumkin, Michael; Jin, Hao-Qiang; Yan, Jerry; Kwak, Dochan (Technical Monitor)

    2000-01-01

    We study data traffic on distributed shared memory machines and conclude that data placement and grouping improve the performance of scientific codes. We present several methods which users can employ to improve data traffic in their code. We report on the implementation of a tool which detects the code fragments causing data congestion and advises the user on improvements to data routing in these fragments. The capabilities of the tool include deduction of data alignment and affinity from the source code; detection of the code constructs having abnormally high cache or TLB misses; and generation of data placement constructs. We demonstrate the capabilities of the tool in experiments with the NAS parallel benchmarks and with a simple computational fluid dynamics application, ARC3D.

  4. Data mining to detect clinical mastitis with automatic milking

    NARCIS (Netherlands)

    Kamphuis, C.; Mollenhorst, H.; Heesterbeek, J.A.P.; Hogeveen, H.

    2010-01-01

    Our objective was to use data mining to develop and validate a detection model for clinical mastitis (CM) using sensor data collected at nine Dutch dairy herds milking automatically. Sensor data was available for almost 3.5 million quarter milkings (QM) from 1,109 cows; 348 QM with CM were observed

  5. Automatic data acquisition of anthropological measurements

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O

    1993-01-01

    A computer program in BASIC is presented which enables the input of measurement data from a caliper directly into specific records in a dBASE IV or PARADOX database. The program circumvents the tedious procedure of first recording measurement data manually and then entering the data into a computer. Thus much time can be saved and the risk of wrong data entry is lowered. The program was easy to use, and no significant problems were encountered. The necessary hardware is a standard IBM compatible desktop computer, Mitutoyo Digimatic (TM) calipers and a Mitutoyo Digimatic MUX-10 Multiplexer (TM).

  6. Image processing. A system for the automatic sorting of chromosomes

    International Nuclear Information System (INIS)

    Najai, Amor

    1977-01-01

    The present paper deals with two aspects of the system: - an automaton (specialized hardware) dedicated to image processing. Images are digitized, divided into sub-units, and computations are carried out on their main parameters. - Software for the automatic recognition and sorting of chromosomes, implemented on a Multi-20 minicomputer connected to the automaton. (author) [fr]

  7. Automatic Detection and Resolution of Lexical Ambiguity in Process Models

    NARCIS (Netherlands)

    Pittke, F.; Leopold, H.; Mendling, J.

    2015-01-01

    System-related engineering tasks are often conducted using process models. In this context, it is essential that these models do not contain structural or terminological inconsistencies. To this end, several automatic analysis techniques have been proposed to support quality assurance. While formal

  8. Automatic process control in anaerobic digestion technology: A critical review.

    Science.gov (United States)

    Nguyen, Duc; Gadhamshetty, Venkataramana; Nitayavardhana, Saoharit; Khanal, Samir Kumar

    2015-10-01

    Anaerobic digestion (AD) is a mature technology that relies upon a synergistic effort of a diverse group of microbial communities for metabolizing diverse organic substrates. However, AD is highly sensitive to process disturbances, and thus it is advantageous to use online monitoring and process control techniques to operate the AD process efficiently. A range of electrochemical, chromatographic and spectroscopic devices can be deployed for on-line monitoring and control of the AD process. While the complexity of the control strategy ranges from simple feedback control to advanced control systems, there is some debate on the implementation of advanced instrumentation and advanced control strategies. Centralized AD plants could be the answer for the application of advanced automatic control in this field. This article provides a critical overview of the available automatic control technologies that can be implemented in AD processes at different scales. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    Science.gov (United States)

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that contrary to prediction strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy and challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  10. Childhood trauma exposure disrupts the automatic regulation of emotional processing.

    Science.gov (United States)

    Marusak, Hilary A; Martin, Kayla R; Etkin, Amit; Thomason, Moriah E

    2015-03-13

    Early-life trauma is one of the strongest risk factors for later emotional psychopathology. Although research in adults highlights that childhood trauma predicts deficits in emotion regulation that persist decades later, it is unknown whether neural and behavioral changes that may precipitate illness are evident during formative, developmental years. This study examined whether automatic regulation of emotional conflict is perturbed in a high-risk urban sample of trauma-exposed children and adolescents. A total of 14 trauma-exposed and 16 age-, sex-, and IQ-matched comparison youth underwent functional MRI while performing an emotional conflict task that involved categorizing facial affect while ignoring an overlying emotion word. Engagement of the conflict regulation system was evaluated at neural and behavioral levels. Results showed that trauma-exposed youth failed to dampen dorsolateral prefrontal cortex activity and engage amygdala-pregenual cingulate inhibitory circuitry during the regulation of emotional conflict, and were less able to regulate emotional conflict. In addition, trauma-exposed youth showed greater conflict-related amygdala reactivity that was associated with diminished levels of trait reward sensitivity. These data point to a trauma-related deficit in automatic regulation of emotional processing, and increase in sensitivity to emotional conflict in neural systems implicated in threat detection. Aberrant amygdala response to emotional conflict was related to diminished reward sensitivity that is emerging as a critical stress-susceptibility trait that may contribute to the emergence of mental illness during adolescence. These results suggest that deficits in conflict regulation for emotional material may underlie heightened risk for psychopathology in individuals that endure early-life trauma.

  11. From Automatic to Adaptive Data Acquisition

    DEFF Research Database (Denmark)

    Chang, Marcus

    2009-01-01

    ... the flexibility of sensornets and reduce the complexity for the domain scientist, we developed an AI-based controller to act as a proxy between the scientist and the sensornet. This controller is driven by the scientist's requirements for the collected data, and uses adaptive sampling in order to reach these goals.

  12. Automatic User Interface Generation for Visualizing Big Geoscience Data

    Science.gov (United States)

    Yu, H.; Wu, J.; Zhou, Y.; Tang, Z.; Kuo, K. S.

    2016-12-01

    Along with advances in computing and observation technologies, geoscience and its related fields have been generating a large amount of data at an unprecedented growth rate. Visualization becomes an increasingly attractive and feasible means for researchers to effectively and efficiently access and explore data to gain new understandings and discoveries. However, visualization has been challenging due to a lack of effective data models and visual representations to tackle the heterogeneity of geoscience data. We propose a new geoscience data visualization framework that leverages interface automata theory to automatically generate the user interface (UI). Our study makes the following three main contributions. First, geoscience data has a unique hierarchical structure and complex formats, and therefore it is relatively easy for users to get lost or confused during their exploration of the data. By applying the interface automata model to the UI design, users can be clearly guided to find the exact visualization and analysis that they want. In addition, from a development perspective, interface automata are also easier to understand than conditional statements, which can simplify the development process. Second, it is common for geoscience data to have discontinuities in its hierarchical structure. The application of interface automata can prevent users from suffering automation surprises, and enhance user experience. Third, to support a variety of different data visualizations and analyses, our design with interface automata also makes applications extendable, in that a new visualization function or a new data group can easily be added to an existing application, which reduces the overhead of maintenance significantly. We demonstrate the effectiveness of our framework using real-world applications.
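
    A toy interface automaton in the spirit of the framework described: the generated UI offers only the actions that are legal in the current state, which is what guides users through the data hierarchy. The states and actions below are invented, not taken from the paper.

        # Minimal interface automaton driving a (hypothetical) visualization UI.
        TRANSITIONS = {
            ("start", "load_dataset"): "loaded",
            ("loaded", "pick_variable"): "variable",
            ("variable", "render_map"): "viewing",
            ("variable", "render_histogram"): "viewing",
            ("viewing", "pick_variable"): "variable",
        }

        class InterfaceAutomaton:
            def __init__(self, state="start"):
                self.state = state

            def offered_actions(self):          # what the generated UI displays
                return [a for (s, a) in TRANSITIONS if s == self.state]

            def act(self, action):              # KeyError == illegal transition
                self.state = TRANSITIONS[(self.state, action)]

        ui = InterfaceAutomaton()
        print(ui.offered_actions())             # ['load_dataset']
        ui.act("load_dataset"); ui.act("pick_variable"); ui.act("render_map")
        print(ui.state)                         # viewing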

  13. Robust indexing for automatic data collection

    International Nuclear Information System (INIS)

    Sauter, Nicholas K.; Grosse-Kunstleve, Ralf W.; Adams, Paul D.

    2003-01-01

    We present improved methods for indexing diffraction patterns from macromolecular crystals. The novel procedures include a more robust way to verify the position of the incident X-ray beam on the detector, an algorithm to verify that the deduced lattice basis is consistent with the observations, and an alternative approach to identify the metric symmetry of the lattice. These methods help to correct failures commonly experienced during indexing, and increase the overall success rate of the process. Rapid indexing, without the need for visual inspection, will play an important role as beamlines at synchrotron sources prepare for high-throughput automation

  14. Automatized material and radioactivity flow control tool in decommissioning process

    International Nuclear Information System (INIS)

    Rehak, I.; Vasko, M.; Daniska, V.; Schultz, O.

    2009-01-01

    In this presentation the automatized material and radioactivity flow control tool in the decommissioning process is discussed. It is concluded that: computer simulation of the decommissioning process is one of the important attributes of the computer code Omega; one of the basic tools for the computer optimisation of decommissioning waste processing is the integral material and radioactivity flow tool; all calculated material parameters are stored at each point of the calculation process and can be viewed; the computer code Omega represents an open modular system which can be improved; and improvement of the module for the optimisation of decommissioning waste processing will be performed in the frame of the improvement of material procedures and scenarios.

  15. Automatic Analysis of Swift-XRT data

    Science.gov (United States)

    Evans, P. A.; Tyler, L. G.; Beardmore, A. P.; Osborne, J. P.

    2008-08-01

    The Swift spacecraft detects and autonomously observes ˜100 Gamma Ray Bursts (GRBs) per year, ˜96% of which are detected by the X-ray telescope (XRT). GRBs are accompanied by optical transients and the field of ground-based follow-up of GRBs has expanded significantly over the last few years, with rapid response instruments capable of responding to Swift triggers on timescales of minutes. To make the most efficient use of limited telescope time, follow-up astronomers need accurate positions of GRBs as soon as possible after the trigger. Additionally, information such as the X-ray light curve, is of interest when considering observing strategy. The Swift team at Leicester University have developed techniques to improve the accuracy of the GRB positions available from the XRT, and to produce science-grade X-ray light curves of GRBs. These techniques are fully automated, and are executed as soon as data are available.

  16. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  17. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, the resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. The Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  18. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    International Nuclear Information System (INIS)

    Benkirane, A.; Auger, G.; Chbihi, A.; Bloyet, D.; Plagnol, E.

    1994-01-01

    This paper presents an original approach to solve an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques for extracting the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from the underlying physics (and adapted to the image segmentation problem). This data fusion is widely used at different stages of the segmentation process. This approach yields interesting results in terms of segmentation performance, even in very noisy cases. Satisfactory classification results are obtained in cases where more "classical" automatic data classification methods fail. (authors). 25 refs., 14 figs., 1 append

  19. A contextual image segmentation system using a priori information for automatic data classification in nuclear physics

    Energy Technology Data Exchange (ETDEWEB)

    Benkirane, A; Auger, G; Chbihi, A [Grand Accelerateur National d'Ions Lourds (GANIL), 14 - Caen (France); Bloyet, D [Caen Univ., 14 (France); Plagnol, E [Paris-11 Univ., 91 - Orsay (France). Inst. de Physique Nucleaire

    1994-12-31

    This paper presents an original approach to solve an automatic data classification problem by means of image processing techniques. The classification is achieved using image segmentation techniques for extracting the meaningful classes. Two types of information are merged for this purpose: the information contained in experimental images and a priori information derived from the underlying physics (and adapted to the image segmentation problem). This data fusion is widely used at different stages of the segmentation process. This approach yields interesting results in terms of segmentation performance, even in very noisy cases. Satisfactory classification results are obtained in cases where more "classical" automatic data classification methods fail. (authors). 25 refs., 14 figs., 1 append.

  20. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high-energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and finally, based on the derived information, analyzing temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  1. Automatic Compound Annotation from Mass Spectrometry Data Using MAGMa.

    NARCIS (Netherlands)

    Ridder, L.O.; van der Hooft, J.J.J.; Verhoeven, S.

    2014-01-01

    The MAGMa software for automatic annotation of mass spectrometry based fragmentation data was applied to 16 MS/MS datasets of the CASMI 2013 contest. Eight solutions were submitted in category 1 (molecular formula assignments) and twelve in category 2 (molecular structure assignment). The MS/MS

  2. Experimental Study for Automatic Colony Counting System Based on Image Processing

    Science.gov (United States)

    Fang, Junlong; Li, Wenzhe; Wang, Guoxin

    Colony counting in many colony experiments is currently performed manually, which makes quick and accurate counting difficult. A new automatic colony counting system was therefore developed. Making use of image-processing technology, a study was made on the feasibility of objectively distinguishing white bacterial colonies on clear plates according to the RGB color theory. An optimal chromatic value was obtained based upon a large number of experiments on the distribution of the chromatic value. It was shown that the method greatly improves the accuracy and efficiency of colony counting and that the counting result is not affected by the inoculation, shape or size of the colonies. This reveals that automatic detection of colony quantity using image-processing technology could be an effective approach.

  3. Fluid Intelligence and Automatic Neural Processes in Facial Expression Perception

    DEFF Research Database (Denmark)

    Liu, Tongran; Xiao, Tong; Li, Xiaoyan

    2015-01-01

    The relationship between human fluid intelligence and social-emotional abilities has been a topic of considerable interest. The current study investigated whether adolescents with different intellectual levels had different automatic neural processing of facial expressions. Two groups of adolescent… males were enrolled: a high IQ group and an average IQ group. Age and parental socioeconomic status were matched between the two groups. Participants counted the numbers of the central cross changes while paired facial expressions were presented bilaterally in an oddball paradigm. There were two… 2). Participants were required to concentrate on the primary task of counting the central cross changes and to ignore the expressions to ensure that facial expression processing was automatic. Event-related potentials (ERPs) were obtained during the tasks. The visual mismatch negativity (vMMN) components were…

  4. Semiautomatic and Automatic Cooperative Inversion of Seismic and Magnetotelluric Data

    Science.gov (United States)

    Le, Cuong V. A.; Harris, Brett D.; Pethick, Andrew M.; Takam Takougang, Eric M.; Howe, Brendan

    2016-09-01

    Natural source electromagnetic methods have the potential to recover rock property distributions from the surface to great depths. Unfortunately, results in complex 3D geo-electrical settings can be disappointing, especially where significant near-surface conductivity variations exist. In such settings, unconstrained inversion of magnetotelluric data is inexorably non-unique. We believe that: (1) correctly introduced information from seismic reflection can substantially improve MT inversion, (2) a cooperative inversion approach can be automated, and (3) massively parallel computing can make such a process viable. Nine inversion strategies including baseline unconstrained inversion and new automated/semiautomated cooperative inversion approaches are applied to industry-scale co-located 3D seismic and magnetotelluric data sets. These data sets were acquired in one of the Carlin gold deposit districts in north-central Nevada, USA. In our approach, seismic information feeds directly into the creation of sets of prior conductivity model and covariance coefficient distributions. We demonstrate how statistical analysis of the distribution of selected seismic attributes can be used to automatically extract subvolumes that form the framework for prior model 3D conductivity distribution. Our cooperative inversion strategies result in detailed subsurface conductivity distributions that are consistent with seismic, electrical logs and geochemical analysis of cores. Such 3D conductivity distributions would be expected to provide clues to 3D velocity structures that could feed back into full seismic inversion for an iterative practical and truly cooperative inversion process. We anticipate that, with the aid of parallel computing, cooperative inversion of seismic and magnetotelluric data can be fully automated, and we hold confidence that significant and practical advances in this direction have been accomplished.

  5. Active Learning for Automatic Audio Processing of Unwritten Languages (ALAPUL)

    Science.gov (United States)

    2016-07-01

    AFRL-RH-WP-TR-2016-0074, Active Learning for Automatic Audio Processing of Unwritten Languages (ALAPUL), by Dimitra Vergyri, Andreas Kathol, Wen Wang, Chris Bartels, and Julian VanHout, under contract FA8650-15-C-9101. The work targets a feature transform learned through deep auto-encoders for better phone recognition performance, with iterative learning to improve the system.

  6. The protein crystallography beamline BW6 at DORIS - automatic operation and high-throughput data collection

    CERN Document Server

    Blume, H; Bourenkov, G P; Kosciesza, D; Bartunik, H D

    2001-01-01

    The wiggler beamline BW6 at DORIS has been optimized for de-novo solution of protein structures on the basis of MAD phasing. Facilities for automatic data collection, rapid data transfer and storage, and online processing have been developed which provide adequate conditions for high-throughput applications, e.g., in structural genomics.

  7. Randomized algorithms in automatic control and data mining

    CERN Document Server

    Granichin, Oleg; Toledano-Kitai, Dvora

    2015-01-01

    In the fields of data mining and control, the huge amount of unstructured data and the presence of uncertainty in system descriptions have always been critical issues. The book Randomized Algorithms in Automatic Control and Data Mining introduces the readers to the fundamentals of randomized algorithm applications in data mining (especially clustering) and in automatic control synthesis. The methods proposed in this book guarantee that the computational complexity of classical algorithms and the conservativeness of standard robust control techniques will be reduced. It is shown that when a problem requires "brute force" in selecting among options, algorithms based on random selection of alternatives offer good results with certain probability for a restricted time and significantly reduce the volume of operations.

  8. Experience in automatic processing of 340,000 images from ITEF 3-m magnetic spectrometer

    International Nuclear Information System (INIS)

    Dzhelyadin, R.I.; Dukhovskoj, I.A.; Ivanov, L.V.; Kishkurno, V.V.; Krutenkova, A.P.; Kulikov, V.V.; Lyulevich, V.I.; Polikarpov, V.M.; Radkevich, I.A.; Fedorets, V.S.; Fedotov, O.P.

    1974-01-01

    A number of conclusions were made regarding the automatic processing of 340,000 pictures (1,020,000 frames) obtained from a three-meter magnetic spectrometer with spark chambers. Possibilities for time optimization of the automatic processing programs are discussed. The results of processing a series of photographs were analysed to compare the parameters of automatic and semi-automatic processing. Some problems relating to the organization and technology of picture processing are also outlined [ru]

  9. Software of the BESM-6 computer for automatic image processing from liquid-hydrogen bubble chambers

    International Nuclear Information System (INIS)

    Grebenikov, E.A.; Kiosa, M.N.; Kobzarev, K.K.; Kuznetsova, N.A.; Mironov, S.V.; Nasonova, L.P.

    1978-01-01

    A set of programs, used in ''road guidance'' mode on the BESM-6 computer to process picture information taken in liquid-hydrogen bubble chambers, is discussed. This mode allows the system to process data from an automatic scanner (AS), taking into account the results of manual scanning. The system hardware includes an automatic scanner, an M-6000 mini-controller and a BESM-6 computer. Software is functionally divided into the following units: computation of event mask parameters and generation of data files controlling the AS; front-end processing of data coming from the AS; filtering of track data; simulation of AS operation and gauging of the AS reference system. To speed up overall performance, the programs that receive and decode data coming from the AS via the M-6000 controller and the data link to the BESM-6 computer are written in machine language

  10. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  11. The FAST-DATA System: Fully Automatic Stochastic Technology for Data Acquisition, Transmission, and Analysis

    International Nuclear Information System (INIS)

    Albrecht, R.W.; Crowe, R.D.; McGuire, J.J.

    1978-01-01

    The potential to automatically collect, classify, and report on stochastic data (signals with random, time-varying components) from power plants has long been discussed by utilities, government, industry, national laboratories and universities. It has become clear to all concerned that such signals often contain information about plant conditions which may provide the basis for increased plant availability through early detection and warning of developing malfunctions. Maintenance can then be scheduled at opportune times. Inopportune failures of major and minor power plant components are a major cause of downtime and detract significantly from the availability of the plant. A complete system to realize automatic stochastic data processing has been conceptually designed. Development of the FAST-DATA system has been initiated through a program of periodic measurements performed on the vibration and loose-parts monitoring system of the Trojan reactor (1130-MW(e) PWR) operated by Portland General Electric Company. The development plan for the system consists of a six-step procedure. The initial steps depend on a significant level of human involvement. In the course of development of the system, the routine duties of operators and analysts are gradually replaced by computerized automatic data-handling procedures. In the final configuration, the operators and analysts are completely freed of routine chores by logical machinery. The results achieved to date from actual application of the proof-of-principle system are discussed. The early developmental phases have concentrated on system organization and examination of a representative data base. Preliminary results from the signature analysis program using Trojan data indicate that the performance specifications predicted for the FAST-DATA system are achievable in practice. (author)
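
    As a rough illustration of the signature-analysis idea described above (not the actual FAST-DATA algorithm), the sketch below computes the Welch power spectral density of a vibration record and flags deviations of the monitored band power from a baseline; the sampling rate, band edges, and threshold factor are illustrative assumptions.

        import numpy as np
        from scipy.signal import welch

        FS = 1000.0           # sampling rate in Hz (assumed)
        BAND = (50.0, 200.0)  # monitored frequency band in Hz (assumed)

        def band_power(x, fs=FS, band=BAND):
            """Integrate the Welch PSD of signal x over the monitored band."""
            f, pxx = welch(x, fs=fs, nperseg=1024)
            mask = (f >= band[0]) & (f <= band[1])
            return np.trapz(pxx[mask], f[mask])

        def flag_anomaly(new_record, baseline_records, factor=2.0):
            """Flag a record whose band power deviates from the baseline mean."""
            powers = np.array([band_power(r) for r in baseline_records])
            mu, sigma = powers.mean(), powers.std()
            return abs(band_power(new_record) - mu) > factor * sigma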

  12. Can Automatic Classification Help to Increase Accuracy in Data Collection?

    Directory of Open Access Journals (Sweden)

    Frederique Lang

    2016-09-01

    Purpose: The authors aim at testing the performance of a set of machine learning algorithms that could improve the process of data cleaning when building datasets. Design/methodology/approach: The paper is centered on cleaning datasets gathered from publishers and online resources by the use of specific keywords. In this case, we analyzed data from the Web of Science. The accuracy of various forms of automatic classification was tested here in comparison with manual coding in order to determine their usefulness for data collection and cleaning. We assessed the performance of seven supervised classification algorithms (Support Vector Machine (SVM), Scaled Linear Discriminant Analysis, Lasso and elastic-net regularized generalized linear models, Maximum Entropy, Regression Tree, Boosting, and Random Forest) and analyzed two properties: accuracy and recall. We assessed not only each algorithm individually, but also their combinations through a voting scheme. We also tested the performance of these algorithms with different sizes of training data. When assessing the performance of different combinations, we used an indicator of coverage to account for the agreement and disagreement on classification between algorithms. Findings: We found that the performance of the algorithms used varies with the size of the sample for training. However, for the classification exercise in this paper the best-performing algorithms were SVM and Boosting. The combination of these two algorithms achieved a high agreement on coverage and was highly accurate. This combination performs well with a small training dataset (10%), which may reduce the manual work needed for classification tasks. Research limitations: The dataset gathered has significantly more records related to the topic of interest compared to unrelated topics. This may affect the performance of some algorithms, especially in their identification of unrelated papers. Practical implications: Although the
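
    The voting idea reported above can be sketched as follows: train the two best-performing classifiers and auto-accept a label only where they agree, sending disagreements to manual coding; the agreement rate plays the role of the coverage indicator. The scikit-learn models are stand-ins for the authors' exact setup, and integer-coded classes are assumed.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.svm import SVC

        def vote_with_coverage(X_train, y_train, X_new):
            """Return auto-accepted labels (-1 = send to manual coding) and coverage."""
            svm = SVC().fit(X_train, y_train)
            boost = GradientBoostingClassifier().fit(X_train, y_train)
            a, b = svm.predict(X_new), boost.predict(X_new)
            agree = a == b                   # coverage: fraction of agreed records
            labels = np.where(agree, a, -1)  # -1 marks records left for manual coding
            return labels, agree.mean()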

  13. Current position on software for the automatic data acquisition system

    International Nuclear Information System (INIS)

    1988-01-01

    This report describes the current concepts for software to control the operation of the Automatic Data Acquisition System (ADAS) proposed for the Deaf Smith County, Texas, Exploratory Shaft Facility (ESF). The purpose of this report is to provide conceptual details of how the ADAS software will execute the data acquisition function, and how the software will make collected information available to the test personnel, the Data Management Group (DMG), and other authorized users. It is not intended that this report describe all of the ADAS functions in exact detail, but the concepts included herein will form the basis for the formal ADAS functional requirements definition document. 5 refs., 14 figs

  14. Automatic rebuilding and optimization of crystallographic structures in the Protein Data Bank.

    NARCIS (Netherlands)

    Joosten, R.P.; Joosten, K.; Cohen, S.X.; Vriend, G.; Perrakis, A.

    2011-01-01

    MOTIVATION: Macromolecular crystal structures in the Protein Data Bank (PDB) are a key source of structural insight into biological processes. These structures, some >30 years old, were constructed with methods of their era. With PDB_REDO, we aim to automatically optimize these structures to better

  15. AUTOMATIC CLASSIFICATION OF VARIABLE STARS IN CATALOGS WITH MISSING DATA

    International Nuclear Information System (INIS)

    Pichara, Karim; Protopapas, Pavlos

    2013-01-01

    We present an automatic classification method for astronomical catalogs with missing data. We use Bayesian networks and a probabilistic graphical model that allows us to perform inference to predict missing values given observed data and dependency relationships between variables. To learn a Bayesian network from incomplete data, we use an iterative algorithm that utilizes sampling methods and expectation maximization to estimate the distributions and probabilistic dependencies of variables from data with missing values. To test our model, we use three catalogs with missing data (SAGE, Two Micron All Sky Survey, and UBVI) and one complete catalog (MACHO). We examine how classification accuracy changes when information from missing data catalogs is included, how our method compares to traditional missing data approaches, and at what computational cost. Integrating these catalogs with missing data, we find that classification of variable objects improves by a few percent and by 15% for quasar detection while keeping the computational cost the same

  16. AUTOMATIC CLASSIFICATION OF VARIABLE STARS IN CATALOGS WITH MISSING DATA

    Energy Technology Data Exchange (ETDEWEB)

    Pichara, Karim [Computer Science Department, Pontificia Universidad Católica de Chile, Santiago (Chile); Protopapas, Pavlos [Institute for Applied Computational Science, Harvard University, Cambridge, MA (United States)

    2013-11-10

    We present an automatic classification method for astronomical catalogs with missing data. We use Bayesian networks and a probabilistic graphical model that allows us to perform inference to predict missing values given observed data and dependency relationships between variables. To learn a Bayesian network from incomplete data, we use an iterative algorithm that utilizes sampling methods and expectation maximization to estimate the distributions and probabilistic dependencies of variables from data with missing values. To test our model, we use three catalogs with missing data (SAGE, Two Micron All Sky Survey, and UBVI) and one complete catalog (MACHO). We examine how classification accuracy changes when information from missing data catalogs is included, how our method compares to traditional missing data approaches, and at what computational cost. Integrating these catalogs with missing data, we find that classification of variable objects improves by a few percent and by 15% for quasar detection while keeping the computational cost the same.
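
    The records above learn a Bayesian network with expectation maximization; as a rough stand-in for that pipeline, the sketch below imputes missing catalog features iteratively and then classifies, using scikit-learn's IterativeImputer and a random forest rather than the authors' EM-learned network.

        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        from sklearn.ensemble import RandomForestClassifier

        def classify_with_missing(X_train, y_train, X_test):
            """Fill NaN features from learned dependencies, then classify."""
            imputer = IterativeImputer(random_state=0)
            Xt = imputer.fit_transform(X_train)
            clf = RandomForestClassifier(random_state=0).fit(Xt, y_train)
            return clf.predict(imputer.transform(X_test))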

  17. Rotor assembly and method for automatically processing liquids

    Science.gov (United States)

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1992-12-22

    A rotor assembly is described for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water. It includes a rotor body for rotation about an axis and includes a network of chambers within which various processing steps are performed upon the sample and diluent and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body by the influence of a magnetic field generated adjacent the transfer mechanism and movable along the rotor body, and the assembly utilizes centrifugal force, a transfer of momentum and capillary action to perform any of a number of processing steps such as separation, aliquoting, transference, washing, reagent addition and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses. 34 figs.

  18. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patients' genomes and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may make it possible, for instance, to find genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. different responses to drugs. The proposed system allows: (i) automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different case studies regarding the analysis of

  19. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Background: Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machinery may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patients' genomes and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may make it possible, for instance, to find genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results: We developed DMET-Analyzer, a tool for automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. different responses to drugs. The proposed system allows: (i) automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing SNP databases (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different

  20. ExcelAutomat: a tool for systematic processing of files as applied to quantum chemical calculations

    Science.gov (United States)

    Laloo, Jalal Z. A.; Laloo, Nassirah; Rhyman, Lydia; Ramasami, Ponnadurai

    2017-07-01

    The processing of the input and output files of quantum chemical calculations often necessitates a spreadsheet as a key component of the workflow. Spreadsheet packages with a built-in programming language editor can automate the steps involved and thus provide a direct link between processing files and the spreadsheet. This helps to reduce user interventions as well as the need to switch between different programs to carry out each step. The ExcelAutomat tool is the implementation of this method in Microsoft Excel (MS Excel) using the default Visual Basic for Applications (VBA) programming language. The code in ExcelAutomat was adapted to work with the platform-independent open-source LibreOffice Calc, which also supports VBA. ExcelAutomat provides an interface through the spreadsheet to automate repetitive tasks such as merging input files, splitting, parsing and compiling data from output files, and generation of unique filenames. Selected extracted parameters can be retrieved as variables which can be included in custom codes for a tailored approach. ExcelAutomat works with Gaussian files and is adapted for use with other computational packages including the non-commercial GAMESS. ExcelAutomat is available as a downloadable MS Excel workbook or as a LibreOffice workbook.
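
    A Python sketch of the same parse-and-compile step (the tool itself is written in VBA) is given below: it pulls the final SCF energy from each Gaussian output file and tabulates the values. The "SCF Done" line format is the usual Gaussian convention, and the directory layout is an assumption; adjust the pattern for other packages.

        import csv, re
        from pathlib import Path

        ENERGY_RE = re.compile(r"SCF Done:\s+E\(\S+\)\s+=\s+(-?\d+\.\d+)")

        rows = []
        for log in sorted(Path("outputs").glob("*.log")):    # assumed layout
            matches = ENERGY_RE.findall(log.read_text(errors="ignore"))
            if matches:
                rows.append((log.name, float(matches[-1])))  # last SCF = final energy

        with open("energies.csv", "w", newline="") as fh:
            csv.writer(fh).writerows([("file", "scf_energy_hartree"), *rows])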

  1. Automatic and controlled processing and the Broad Autism Phenotype.

    Science.gov (United States)

    Camodeca, Amy; Voelker, Sylvia

    2016-01-30

    Research related to verbal fluency in the Broad Autism Phenotype (BAP) is limited and dated, but generally suggests intact abilities in the context of weaknesses in other areas of executive function (Hughes et al., 1999; Wong et al., 2006; Delorme et al., 2007). Controlled processing, the generation of search strategies after initial, automated responses are exhausted (Spat, 2013), has yet to be investigated in the BAP, and may be evidenced in verbal fluency tasks. One hundred twenty-nine participants completed the Delis-Kaplan Executive Function System Verbal Fluency test (D-KEFS; Delis et al., 2001) and the Broad Autism Phenotype Questionnaire (BAPQ; Hurley et al., 2007). The BAP group (n=53) produced significantly fewer total words during the 2nd 15" interval compared to the Non-BAP (n=76) group. Partial correlations indicated similar relations between verbal fluency variables for each group. Regression analyses predicting 2nd 15" interval scores suggested differentiation between controlled and automatic processing skills in both groups. Results suggest adequate automatic processing, but slowed development of controlled processing strategies in the BAP, and provide evidence for similar underlying cognitive constructs for both groups. Controlled processing was predictive of Block Design score for Non-BAP participants, and was predictive of Pragmatic Language score on the BAPQ for BAP participants. These results are similar to past research related to strengths and weaknesses in the BAP, respectively, and suggest that controlled processing strategy use may be required in instances of weak lower-level skills. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  2. Automatic analysis of attack data from distributed honeypot network

    Science.gov (United States)

    Safarik, Jakub; Voznak, MIroslav; Rezac, Filip; Partila, Pavol; Tomala, Karel

    2013-05-01

    There are many ways of getting real data about malicious activity in a network. One of them relies on masquerading monitoring servers as production ones. These servers are called honeypots, and data about attacks on them bring valuable information about actual attacks and techniques used by hackers. The article describes a distributed topology of honeypots developed with a strong orientation towards monitoring of IP telephony traffic. IP telephony servers can be easily exposed to various types of attacks, and without protection this situation can lead to loss of money and other unpleasant consequences. Using a distributed topology with honeypots placed in different geographical locations and networks provides more valuable and independent results. With an automatic system gathering information from all honeypots, it is possible to work with all the information at one centralized point. Communication between the honeypots and the centralized data store uses secure SSH tunnels, and the server communicates only with authorized honeypots. The centralized server also automatically analyses the data from each honeypot. The results of this analysis, as well as other statistical data about malicious activity, are easily accessible through a built-in web server. All statistical and analysis reports serve as the information basis for an algorithm which classifies the different types of VoIP attacks used. The web interface then provides a tool for quick comparison and evaluation of actual attacks in all monitored networks. The article describes both the honeypot nodes in the distributed architecture, which monitor suspicious activity, and the methods and algorithms used on the server side for analysis of the gathered data.
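
    The centralized analysis step can be sketched as a tally of attack types across the logs gathered from all honeypots, as below; the "timestamp,node,attack_type" log format is a hypothetical placeholder, not the system's actual schema.

        from collections import Counter
        from pathlib import Path

        def summarize(log_dir="collected_logs"):
            """Count (node, attack_type) pairs across all collected honeypot logs."""
            counts = Counter()
            for log in Path(log_dir).glob("*.csv"):
                for line in log.read_text().splitlines():
                    if not line.strip():
                        continue
                    _, node, attack_type = line.split(",", 2)
                    counts[(node, attack_type)] += 1
            return counts.most_common()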

  3. Process and equipment for automatic measurement of resonant frequencies in seismic detectors

    International Nuclear Information System (INIS)

    Fredriksson, O.A.; Thomas, E.L.

    1977-01-01

    This is a process for the automatic indication of the resonant frequency of one or more detector elements operating inside a geophysical data-gathering system. The detector elements are understood to comprise geophones or hydrophones, or groups of both instruments. The invention concerns the creation of a process and of equipment working with laboratory precision, although it can be used in the field. (orig./RW) [de]

  4. A data management program for the Electra 800 automatic analyser.

    Science.gov (United States)

    Cambus, J P; Nguyen, F; de Graeve, J; Aragon, B; Valdiguie, P

    1994-10-01

    The Electra 800 automatic coagulation analyser rapidly performs most chronometric coagulation tests with high precision. To facilitate data handling, software, adaptable to any PC running under MS-DOS, was written to manage the analyser. Data are automatically collected via the RS232 interface or can be manually input. The software can handle 64 different analyses, all entirely 'user defined'. An 'electronic worksheet' presents the results in pages of ten patients. This enables the operator to assess the data and to perform verifications or complementary tests if necessary. All results outside a predetermined range can be flagged and results can be deleted, modified or added. A patient's previous files can be recalled as the data are archived at the end of the day. A 120 Mb disk can store approximately 130,000 patient files. A daily archive function can print the day's work in alphabetical order. A communication protocol allows connection to a mainframe computer. This program and the user's manual are available on request, free of charge, from the authors.

  5. Automatic humidification system to support the assessment of food drying processes

    Science.gov (United States)

    Ortiz Hernández, B. D.; Carreño Olejua, A. R.; Castellanos Olarte, J. M.

    2016-07-01

    This work shows the main features of an automatic humidification system providing drying air that matches the environmental conditions of different climate zones. This conditioned air is then used to assess the drying process of different agro-industrial products at the Automation and Control for Agro-industrial Processes Laboratory of the Pontifical Bolivarian University of Bucaramanga, Colombia. The automatic system allows creating and improving control strategies to supply drying air under specified conditions of temperature and humidity. The development of automatic routines to control and acquire real-time data was made possible by the use of robust control systems and suitable instrumentation. The signals are read and directed to a controller memory, where they are scaled and transferred to a memory unit. Using the IP address, it is possible to access the data to perform supervision tasks. One important characteristic of this automatic system is the Dynamic Data Exchange (DDE) server, which allows direct communication between the control unit and the computer used to build experimental curves.

  6. Algorithm for automatic analysis of electro-oculographic data.

    Science.gov (United States)

    Pettersson, Kati; Jagadeesan, Sharman; Lukander, Kristian; Henelius, Andreas; Haeggström, Edward; Müller, Kiti

    2013-10-25

    Large amounts of electro-oculographic (EOG) data, recorded during electroencephalographic (EEG) measurements, go underutilized. We present an automatic, auto-calibrating algorithm that allows efficient analysis of such data sets. The auto-calibration is based on automatic threshold value estimation. Amplitude threshold values for saccades and blinks are determined based on features in the recorded signal. The performance of the developed algorithm was tested by analyzing 4854 saccades and 213 blinks recorded in two different conditions: a task where the eye movements were controlled (saccade task) and a task with free viewing (multitask). The results were compared with results from a video-oculography (VOG) device and manually scored blinks. The algorithm achieved 93% detection sensitivity for blinks with 4% false positive rate. The detection sensitivity for horizontal saccades was between 98% and 100%, and for oblique saccades between 95% and 100%. The classification sensitivity for horizontal and large oblique saccades (10 deg) was larger than 89%, and for vertical saccades larger than 82%. The duration and peak velocities of the detected horizontal saccades were similar to those in the literature. In the multitask measurement the detection sensitivity for saccades was 97% with a 6% false positive rate. The developed algorithm enables reliable analysis of EOG data recorded both during EEG and as a separate metrics.
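
    A simplified sketch of the auto-calibrating threshold idea is shown below: estimate a robust velocity threshold from the recording itself, then mark contiguous supra-threshold runs as candidate saccades. The published algorithm uses richer features; the constants here are illustrative, and the recording is assumed to start and end below threshold.

        import numpy as np

        def detect_saccades(eog, fs, k=3.0):
            """Return (start, end) sample indices of candidate saccades."""
            vel = np.gradient(eog) * fs                   # deg/s if eog is in degrees
            thresh = k * np.median(np.abs(vel)) / 0.6745  # robust sigma estimate
            above = np.abs(vel) > thresh
            edges = np.flatnonzero(np.diff(above.astype(int)))
            starts, ends = edges[::2] + 1, edges[1::2] + 1
            return list(zip(starts, ends))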

  7. Improving labeling efficiency in automatic quality control of MRSI data.

    Science.gov (United States)

    Pedrosa de Barros, Nuno; McKinley, Richard; Wiest, Roland; Slotboom, Johannes

    2017-12-01

    To improve the efficiency of the labeling task in automatic quality control of MR spectroscopy imaging data. 28,432 short and long echo time (TE) spectra (1.5 tesla; point-resolved spectroscopy (PRESS); repetition time (TR) = 1,500 ms) from 18 different brain tumor patients were labeled by two experts as either accept or reject, depending on their quality. For each spectrum, 47 signal features were extracted. The data were then used to run several simulations and test an active learning approach using uncertainty sampling. The performance of the classifiers was evaluated as a function of the number of patients in the training set, the number of spectra in the training set, and a parameter α used to control the level of classification uncertainty required for a new spectrum to be selected for labeling. The results showed that the proposed strategy allows reductions of up to 72.97% for short TE and 62.09% for long TE in the amount of data that needs to be labeled, without significant impact on classification accuracy. Further reductions are possible with a significant but minimal impact on performance. Active learning using uncertainty sampling is an effective way to increase the labeling efficiency for training automatic quality control classifiers. Magn Reson Med 78:2399-2405, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
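
    The uncertainty-sampling loop can be sketched as follows: train on the labeled pool and query only the spectra the current classifier is least sure about. Logistic regression is a stand-in for the actual classifier, binary accept/reject labels are assumed, and alpha plays the role of the paper's uncertainty cutoff.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def query_uncertain(X_labeled, y_labeled, X_pool, alpha=0.2):
            """Return indices of pool spectra to send for expert labeling."""
            clf = LogisticRegression(max_iter=1000).fit(X_labeled, y_labeled)
            proba = clf.predict_proba(X_pool)
            margin = np.abs(proba[:, 0] - proba[:, 1])  # small margin = uncertain
            return np.flatnonzero(margin < alpha)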

  8. ASAP (Automatic Software for ASL Processing): A toolbox for processing Arterial Spin Labeling images.

    Science.gov (United States)

    Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando

    2016-04-01

    The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Automatic Road Pavement Assessment with Image Processing: Review and Comparison

    Directory of Open Access Journals (Sweden)

    Sylvie Chambon

    2011-01-01

    In the field of noninvasive sensing techniques for civil infrastructure monitoring, this paper addresses the problem of crack detection in the surface of French national roads by automatic analysis of optical images. The first contribution is a state of the art of the image-processing tools applied to civil engineering. The second contribution is about fine-defect detection in pavement surfaces. The approach is based on a multi-scale extraction and a Markovian segmentation. Third, an evaluation and comparison protocol designed for evaluating this difficult task, road pavement crack detection, is introduced. Finally, the proposed method is validated, analysed, and compared to a detection approach based on morphological tools.

  10. Enhancement of the automatic ultrasonic signal processing system using digital technology

    International Nuclear Information System (INIS)

    Koo, In Soo; Park, H. Y.; Suh, Y. S.; Kim, D. Hoon; Huh, S.; Sung, S. H.; Jang, G. S.; Ryoo, S. G.; Choi, J. H.; Kim, Y. H.; Lee, J. C.; Kim, D. Hyun; Park, H. J.; Kim, Y. C.; Lee, J. P.; Park, C. H.; Kim, M. S.

    1999-12-01

    The objective of this study is to develop an automatic ultrasonic signal processing system which can be used in inspection equipment to assess the integrity of the reactor vessel, by enhancing the performance of the ultrasonic signal processing system. The main activities of this study were divided into three categories: the development of the circuits for generating the ultrasonic signal and receiving the signal from the inspection equipment, the development of the signal processing algorithm and the H/W of the data processing system, and the development of the specification for the application programs and system S/W for the analysis and evaluation computer. The results of the main activities are as follows: 1) the design of the ultrasonic detector and the automatic ultrasonic signal processing system, based on an investigation of the state-of-the-art technology at home and abroad; 2) the development of the H/W and S/W of the data processing system based on these results. In particular, the H/W of the data processing system, which has the advantages of both digital and analog controls through real-time digital signal processing, was developed using a DSP capable of processing digital signals in real time; in addition, not only the firmware of the data processing system for the peripherals but also the specimen test algorithm for calibration were developed. The application programs and the system S/W of the analysis/evaluation computer were also developed. The developed equipment was verified by a performance test. Based on the developed prototype of the automatic ultrasonic signal processing system, localization of the overall ultrasonic inspection equipment for nuclear industries would be expected through further studies on establishing the H/W in real applications and developing the S/W specification of the analysis computer. (author)

  11. Automatic Road Centerline Extraction from Imagery Using Road GPS Data

    Directory of Open Access Journals (Sweden)

    Chuqing Cao

    2014-09-01

    Road centerline extraction from imagery constitutes a key element in numerous geospatial applications and has been addressed through a variety of approaches. However, most existing methods are not capable of dealing with challenges such as different road shapes, complex scenes, and variable resolutions. This paper presents a novel method for road centerline extraction from imagery in a fully automatic approach that addresses the aforementioned challenges by exploiting road GPS data. The proposed method combines road color features with road GPS data to detect road centerline seed points. After global alignment of the road GPS data, a novel road centerline extraction algorithm is developed to extract each individual road centerline in local regions. Through road connection, the road centerline network is generated as the final output. Extensive experiments demonstrate that the proposed method can rapidly and accurately extract road centerlines from remotely sensed imagery.

  12. AUTOMATION OF IMAGE DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Preuss Ryszard

    2014-12-01

    This article discusses the current capabilities of automated processing of image data, using the PhotoScan software by Agisoft as an example. At present, image data obtained by various registration systems (metric and non-metric cameras placed on airplanes, satellites, or more often on UAVs) is used to create photogrammetric products. Multiple registrations of an object or land area (large groups of photos are captured) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of this, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied. They can create a model of a block in the local coordinate system or, using initial exterior orientation and measured control points, can provide image georeference in an external reference frame. In the case of non-metric images, it is also possible to carry out a self-calibration process at this stage. The image matching algorithm is also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM and a photorealistic solid model of an object. All aforementioned processing steps are implemented in a single program, in contrast to standard commercial software dividing the steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential implementation of the processing steps at predetermined control parameters. The paper presents the practical results of fully automatic generation of an orthomosaic for both images obtained by a metric Vexcel camera and a block of images acquired by a non-metric UAV system.

  13. Data processing made simple

    CERN Document Server

    Wooldridge, Susan

    2013-01-01

    Data Processing: Made Simple, Second Edition presents discussions of a number of trends and developments in the world of commercial data processing. The book covers the rapid growth of micro- and mini-computers for both home and office use; word processing and the 'automated office'; the advent of distributed data processing; and the continued growth of database-oriented systems. The text also discusses modern digital computers; fundamental computer concepts; information and data processing requirements of commercial organizations; and the historical perspective of the computer industry. The

  14. AUTOMATIC RAILWAY POWER LINE EXTRACTION USING MOBILE LASER SCANNING DATA

    Directory of Open Access Journals (Sweden)

    S. Zhang

    2016-06-01

    Research on power line extraction technology using mobile laser point clouds has important practical significance for railway power line patrol work. In this paper, we present a new method for automatically extracting railway power lines from MLS (Mobile Laser Scanning) data. Firstly, according to the spatial structure characteristics of the power lines and the trajectory, the relevant data are segmented piecewise. Then, a self-adaptive spatial region-growing method is used to extract power lines parallel with the rails. Finally, PCA (Principal Component Analysis) combined with an information entropy measure is used to judge whether a section of the power line is a junction and which type of junction it belongs to. The least-squares fitting algorithm is introduced to model the power line. An evaluation of the proposed method over a complicated railway point cloud acquired by a RIEGL VMX450 MLS system shows that the proposed method is promising.
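
    The PCA-plus-entropy test can be sketched as below: a straight wire section has one dominant eigenvalue and hence low entropy over the normalized eigenvalue spectrum, while at junctions the spread, and thus the entropy, rises. The threshold value is illustrative, not the one tuned in the paper.

        import numpy as np

        def eigen_entropy(points):
            """points: (N, 3) array of a power-line section's 3D coordinates."""
            w = np.linalg.eigvalsh(np.cov(points.T))
            p = w / w.sum()
            p = p[p > 0]
            return float(-(p * np.log(p)).sum())

        def is_junction(points, thresh=0.5):
            return eigen_entropy(points) > thresh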

  15. Sensitometric comparison of E and F dental radiographic films using manual and automatic processing systems

    Directory of Open Access Journals (Sweden)

    Dabaghi A.

    2008-04-01

    Background and Aim: Processing conditions affect the sensitometric properties of X-ray films. In this study, we aimed to evaluate the sensitometric characteristics of InSight (IP), a new F-speed film, in fresh and used processing solutions under dental office conditions and compare them with Ektaspeed Plus (EP). Materials and Methods: In this experimental in vitro study, an aluminium step wedge was used to construct characteristic curves for InSight and Ektaspeed Plus films (Kodak Eastman, Rochester, USA). All films were processed in Champion solution (X-ray Iran, Tehran, Iran) both manually and automatically over a period of six days. Unexposed films of both types were processed manually and automatically to determine base-plus-fog density. Speed and film contrast were measured according to the ISO definition. Data were analyzed using one-way ANOVA and t tests with P<0.05 as the level of significance. Results: IP was 20 to 22% faster than EP and proved to be an F-speed film when processed automatically and an E-F film when processed manually. It was also F-speed in fresh solution and E-speed in old solution. IP and EP contrasts were similar with automatic processing, but EP contrast was higher when processed manually. Both EP and IP films had standard values of base plus fog (<0.35), and B+F densities decreased in old solution. Conclusion: Based on the results of this study, InSight is an F-speed film with a speed at least 20% greater than Ektaspeed. In addition, it reduces patient exposure with no damage to image quality.

  16. [Automatic Extraction and Analysis of Dosimetry Data in Radiotherapy Plans].

    Science.gov (United States)

    Song, Wei; Zhao, Di; Lu, Hong; Zhang, Biyun; Ma, Jun; Yu, Dahai

    To improve the efficiency and accuracy of extraction and analysis of dosimetry data in radiotherapy plans for a batch of patients, a program was written, using the interface functions provided by the Matlab platform, to extract the dosimetry data exported from the treatment planning system in DICOM RT format and export the dose-volume data to an Excel file in an SPSS-compatible format. This method was compared with manual operation for 14 gastric carcinoma patients to validate its efficiency and accuracy. The output Excel data were compatible with SPSS in format; the dosimetry data errors for the PTV dose interval of 90%-98%, the PTV dose interval of 99%-106% and all OARs were -3.48E-5 ± 3.01E-5, -1.11E-3 ± 7.68E-4 and -7.85E-5 ± 9.91E-5, respectively. Compared with manual operation, the time required was reduced from 5.3 h to 0.19 h and the input error was reduced from 0.002 to 0. Automatic extraction of dosimetry data in DICOM RT format for batches of patients, SPSS-compatible data export and quick analysis were achieved in this paper. The efficiency of clinical research based on dosimetry data analysis of large numbers of patients will be improved with this method.
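
    The same extraction idea can be sketched in Python with pydicom (the paper used Matlab's interface functions): read the DVH Sequence from a DICOM RT Dose export and dump the dose/volume pairs to a CSV that statistics software can ingest. This assumes the planning system wrote cumulative DVHs into the RTDOSE file.

        import csv
        import pydicom

        ds = pydicom.dcmread("rtdose.dcm")  # RT Dose file exported by the TPS (assumed)
        with open("dvh.csv", "w", newline="") as fh:
            writer = csv.writer(fh)
            writer.writerow(["roi_number", "bin", "dose_bin_width", "volume"])
            for dvh in ds.DVHSequence:
                roi = dvh.DVHReferencedROISequence[0].ReferencedROINumber
                data = dvh.DVHData  # alternating dose-bin-width / volume values
                for i in range(0, len(data), 2):
                    writer.writerow([roi, i // 2, float(data[i]), float(data[i + 1])])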

  17. AUTOMATIC 3D BUILDING MODEL GENERATIONS WITH AIRBORNE LiDAR DATA

    Directory of Open Access Journals (Sweden)

    N. Yastikli

    2017-11-01

    LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, with major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication and mobile navigation etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed in a simple and quick way for the many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified

  18. Automatic 3d Building Model Generations with Airborne LiDAR Data

    Science.gov (United States)

    Yastikli, N.; Cetin, Z.

    2017-11-01

    LiDAR systems are becoming more and more popular because of their potential for obtaining point clouds of vegetation and man-made objects on the earth surface in an accurate and quick way. Nowadays, these airborne systems are frequently used in a wide range of applications such as DEM/DSM generation, topographic mapping, object extraction, vegetation mapping, 3-dimensional (3D) modelling and simulation, change detection, engineering works, revision of maps, coastal management and bathymetry. 3D building model generation is one of the most prominent applications of LiDAR systems, with major importance for urban planning, illegal construction monitoring, 3D city modelling, environmental simulation, tourism, security, telecommunication and mobile navigation etc. Manual or semi-automatic 3D building model generation is a costly and very time-consuming process for these applications. Thus, an approach for automatic 3D building model generation is needed in a simple and quick way for the many studies which include building modelling. In this study, automatic generation of 3D building models from airborne LiDAR data is aimed at. An approach is proposed for automatic 3D building model generation that includes automatic point-based classification of the raw LiDAR point cloud. The proposed point-based classification includes hierarchical rules for the automatic production of 3D building models. Detailed analyses of the parameters used in the hierarchical rules have been performed to improve classification results using different test areas identified in the study area. The proposed approach has been tested in the study area, which has partly open areas, forest areas and many types of buildings, in Zekeriyakoy, Istanbul, using the TerraScan module of TerraSolid. The 3D building model was generated automatically using the results of the automatic point-based classification. The obtained results of this research on the study area verified that automatic 3D
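
    In the spirit of the hierarchical rules described above, the sketch below classifies points first by height above ground and then by local planarity; the thresholds are illustrative assumptions, not the rules tuned in the study.

        import numpy as np

        def classify_points(height_above_ground, planarity,
                            h_ground=0.3, h_low=2.5, planar_min=0.8):
            """Rule-based labels (height thresholds in metres, planarity in [0, 1])."""
            labels = np.empty(height_above_ground.shape, dtype=object)
            labels[height_above_ground < h_ground] = "ground"
            low = (height_above_ground >= h_ground) & (height_above_ground < h_low)
            labels[low] = "low vegetation"
            high = height_above_ground >= h_low
            labels[high & (planarity >= planar_min)] = "building roof"
            labels[high & (planarity < planar_min)] = "high vegetation"
            return labels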

  19. Prosody's Contribution to Fluency: An Examination of the Theory of Automatic Information Processing

    Science.gov (United States)

    Schrauben, Julie E.

    2010-01-01

    LaBerge and Samuels' (1974) theory of automatic information processing in reading offers a model that explains how and where the processing of information occurs and the degree to which processing of information occurs. These processes are dependent upon two criteria: accurate word decoding and automatic word recognition. However, LaBerge and…

  20. Automatic processing of unattended object features by functional connectivity

    Directory of Open Access Journals (Sweden)

    Katja Martina Mayer

    2013-05-01

    Observers can selectively attend to object features that are relevant for a task. However, unattended task-irrelevant features may still be processed and possibly integrated with the attended features. This study investigated the neural mechanisms for processing both task-relevant (attended) and task-irrelevant (unattended) object features. The Garner paradigm was adapted for functional magnetic resonance imaging (fMRI) to test whether specific brain areas process the conjunction of features or whether multiple interacting areas are involved in this form of feature integration. Observers attended to the shape, colour, or non-rigid motion of novel objects while unattended features changed from trial to trial (change blocks) or remained constant (no-change blocks) during a given block. This block manipulation allowed us to measure the extent to which unattended features affected neural responses, which would reflect the extent to which multiple object features are automatically processed. We did not find Garner interference at the behavioural level. However, we designed the experiment to equate performance across block types so that any fMRI results could not be due solely to differences in task difficulty between change and no-change blocks. Attention to specific features localised several areas known to be involved in object processing. No area showed larger responses on change blocks compared to no-change blocks. However, psychophysiological interaction analyses revealed that several functionally localised areas showed significant positive interactions with areas in occipito-temporal and frontal cortex that depended on block type. Overall, these findings suggest that both regional responses and functional connectivity are crucial for processing multi-featured objects.

  1. Grammar-based Automatic 3D Model Reconstruction from Terrestrial Laser Scanning Data

    Science.gov (United States)

    Yu, Q.; Helmholz, P.; Belton, D.; West, G.

    2014-04-01

    The automatic reconstruction of 3D buildings has been an important research topic in recent years. In this paper, a novel method is proposed to automatically reconstruct 3D building models from segmented data based on pre-defined formal grammars and rules. Such segmented data can be extracted, e.g., from terrestrial or mobile laser scanning devices. Two steps are considered in detail. The first step is to transform the segmented data into 3D shapes, for instance using the DXF (Drawing Exchange Format) format, which is a CAD file format used for data interchange between AutoCAD and other programs. Second, we develop a formal grammar to describe the building model structure and integrate the pre-defined grammars into the reconstruction process. Depending on the segmented data, the selected grammar and rules are applied to drive the reconstruction process in an automatic manner. Compared with other existing approaches, our proposed method allows model reconstruction directly from 3D shapes and takes the whole building into account.

  2. Crowd-sourced data collection to support automatic classification of building footprint data

    Science.gov (United States)

    Hecht, Robert; Kalla, Matthias; Krüger, Tobias

    2018-05-01

    Human settlements are mainly formed by buildings with their different characteristics and usage. Despite the importance of buildings for the economy and society, complete regional or even national figures of the entire building stock and its spatial distribution are still hardly available. Available digital topographic data sets created by National Mapping Agencies or mapped voluntarily through a crowd via Volunteered Geographic Information (VGI) platforms (e.g. OpenStreetMap) contain building footprint information but often lack additional information on building type, usage, age or number of floors. For this reason, predictive modeling is becoming increasingly important in this context. The capabilities of machine learning allow for the prediction of building types and other building characteristics and thus, the efficient classification and description of the entire building stock of cities and regions. However, such data-driven approaches always require a sufficient amount of ground truth (reference) information for training and validation. The collection of reference data is usually cost-intensive and time-consuming. Experiences from other disciplines have shown that crowdsourcing offers the possibility to support the process of obtaining ground truth data. Therefore, this paper presents the results of an experimental study aiming at assessing the accuracy of non-expert annotations on street view images collected from an internet crowd. The findings provide the basis for a future integration of a crowdsourcing component into the process of land use mapping, particularly the automatic building classification.
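
    As an illustration of the predictive-modeling step described above, the following sketch trains a scikit-learn random forest on crowd-annotated footprints. The file name, feature columns and label values are hypothetical stand-ins for the study's actual data.

```python
# Hypothetical sketch: training a building-type classifier on
# crowd-annotated footprint features. Column names and the CSV layout
# are assumptions, not part of the study.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Assumed columns: footprint geometry features plus the crowd label
# (e.g. residential / commercial / industrial).
df = pd.read_csv("footprints_with_crowd_labels.csv")
X = df[["area_m2", "perimeter_m", "n_neighbours", "n_floors"]]
y = df["crowd_label"]

clf = RandomForestClassifier(n_estimators=200, random_state=0)
# Cross-validated accuracy indicates whether the crowd annotations are
# consistent enough to serve as training data.
print(cross_val_score(clf, X, y, cv=5).mean())
```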

  3. Automatic processing of unattended lexical information in visual oddball presentation: neurophysiological evidence

    Directory of Open Access Journals (Sweden)

    Yury eShtyrov

    2013-08-01

    Previous electrophysiological studies of automatic language processing revealed early (100-200 ms) reflections of access to lexical characteristics of the speech signal using the so-called mismatch negativity (MMN), a negative ERP deflection elicited by infrequent irregularities in unattended repetitive auditory stimulation. In those studies, lexical processing of spoken stimuli became manifest as an enhanced ERP in response to unattended real words as opposed to phonologically matched but meaningless pseudoword stimuli. This lexical ERP enhancement was explained by the automatic activation of word memory traces realised as distributed, strongly intra-connected neuronal circuits, whose robustness guarantees memory trace activation even in the absence of attention to spoken input. Such an account would predict the automatic activation of these memory traces upon any presentation of linguistic information, irrespective of the presentation modality. As previous lexical MMN studies exclusively used auditory stimulation, we here adapted the lexical MMN paradigm to investigate early automatic lexical effects in the visual modality. In a visual oddball sequence, matched short word and pseudoword stimuli were presented tachistoscopically in the perifoveal area outside the visual focus of attention, while the subjects’ attention was concentrated on a concurrent non-linguistic visual dual task in the centre of the screen. Using EEG, we found a visual analogue of the lexical ERP enhancement effect, with unattended written words producing larger brain response amplitudes than matched pseudowords, starting at ~100 ms. Furthermore, we also found a significant visual MMN, reported here for the first time for unattended lexical stimuli presented perifoveally. The data suggest early automatic lexical processing of visually presented language outside the focus of attention.

  4. Automatic delimitation of microwatershed using SRTM data of the NASA

    Directory of Open Access Journals (Sweden)

    Freddy Aníbal Jumbo Castillo

    2015-12-01

    The watershed, as the basic territorial unit for planning and management of water resources, requires proper delimitation of its catchment or drainage area. The lack of geographic information on the micro-watersheds of the Casacay river hydrographic unit therefore needed to be resolved. To this end, the research aimed at the automatic delimitation of micro-watersheds using Geographic Information Systems (GIS) techniques and Shuttle Radar Topography Mission (SRTM) data with 30 m spatial resolution. The selected methodology was the Pfafstetter one, with which nine micro-watersheds were obtained, each with its respective codification, allowing continuity with the watershed standardization adopted by Ecuador's Water Secretariat. With the results of the investigation, the watersheds will be updated with more detailed information, promoting the execution of tasks and activities related to the integrated management of the hydrographic unit studied.
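
    The delineation step itself can be sketched with an off-the-shelf GIS stack; the minimal example below uses the pysheds library on an SRTM raster. The file name and outlet coordinates are placeholders, and the Pfafstetter codification stage is not reproduced.

```python
# A minimal sketch of automatic catchment delineation from an SRTM DEM
# using pysheds; paths and the outlet point are illustrative only.
from pysheds.grid import Grid

grid = Grid.from_raster("srtm_30m.tif")
dem = grid.read_raster("srtm_30m.tif")

# Standard DEM conditioning chain: fill pits and depressions, resolve flats.
pit_filled = grid.fill_pits(dem)
flooded = grid.fill_depressions(pit_filled)
inflated = grid.resolve_flats(flooded)

fdir = grid.flowdir(inflated)      # D8 flow directions
acc = grid.accumulation(fdir)      # flow accumulation

# Delineate the catchment draining through an assumed outlet coordinate.
catchment = grid.catchment(x=-79.95, y=-3.35, fdir=fdir,
                           xytype="coordinate")
```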

  5. Automatic Derivation of Statistical Data Analysis Algorithms: Planetary Nebulae and Beyond

    OpenAIRE

    Fischer, Bernd; Knuth, Kevin; Hajian, Arsen; Schumann, Johann

    2004-01-01

    AUTOBAYES is a fully automatic program synthesis system for the data analysis domain. Its input is a declarative problem description in the form of a statistical model; its output is documented and optimized C/C++ code. The synthesis process relies on the combination of three key techniques. Bayesian networks are used as a compact internal representation mechanism which enables problem decompositions and guides the algorithm derivation. Program schemas are used as independently composable buildin...

  6. Automatic segmentation of coronary arteries from computed tomography angiography data cloud using optimal thresholding

    Science.gov (United States)

    Ansari, Muhammad Ahsan; Zai, Sammer; Moon, Young Shik

    2017-01-01

    Manual analysis of the bulk data generated by computed tomography angiography (CTA) is time consuming, and interpretation of such data requires the prior knowledge and expertise of a radiologist. Therefore, an automatic method that can isolate the coronary arteries from a given CTA dataset is required. We present an automatic yet effective segmentation method to delineate the coronary arteries from a three-dimensional CTA data cloud. Instead of a region growing process, which is usually time consuming and prone to leakages, the method is based on optimal thresholding applied to the Hessian-based vesselness measure in a localized way (slice by slice) to track the coronaries carefully to their distal ends. Moreover, to make the process automatic, we detect the aorta using the Hough transform technique. The proposed segmentation method is independent of the starting point used to initiate its process, and is fast in the sense that the coronary arteries are obtained without any preprocessing or postprocessing steps. We used 12 real clinical datasets to show the efficiency and accuracy of the presented method. Experimental results reveal that the proposed method achieves 95% average accuracy.
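
    The slice-wise thresholding of a Hessian-based vesselness measure can be illustrated as follows, using skimage's Frangi filter and Otsu thresholding as stand-ins for the paper's own optimal-thresholding scheme and aorta tracking.

```python
# Sketch: per-slice thresholding of a Hessian-based vesselness measure.
# Frangi + Otsu are substitutes, not the authors' exact algorithm.
import numpy as np
from skimage.filters import frangi, threshold_otsu

def segment_slices(volume):
    """volume: 3D CTA array (slices, rows, cols); returns a binary mask."""
    mask = np.zeros_like(volume, dtype=bool)
    for k, sl in enumerate(volume):
        vesselness = frangi(sl.astype(float))   # tubular-structure response
        if vesselness.max() > 0:                # skip slices with no response
            mask[k] = vesselness > threshold_otsu(vesselness)
    return mask
```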

  7. HYDICE postflight data processing

    Science.gov (United States)

    Aldrich, William S.; Kappus, Mary E.; Resmini, Ronald G.; Mitchell, Peter A.

    1996-06-01

    The hyperspectral digital imagery collection experiment (HYDICE) sensor records instrument counts for scene data, in-flight spectral and radiometric calibration sequences, and dark current levels onto an AMPEX DCRsi data tape. Following flight, the HYDICE ground data processing subsystem (GDPS) transforms selected scene data from digital numbers (DN) to calibrated radiance levels at the sensor aperture. This processing includes: dark current correction, spectral and radiometric calibration, conversion to radiance, and replacement of bad detector elements. A description of the algorithms for post-flight data processing is presented. A brief analysis of the original radiometric calibration procedure is given, along with a description of the development of the modified procedure currently used. Example data collected during the 1995 flight season, shown both uncorrected and processed, demonstrate the removal of apparent sensor artifacts (e.g., non-uniformities in detector response over the array) as a result of this transformation.
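
    The core DN-to-radiance chain (dark-current subtraction, gain application, bad-detector replacement) reduces to a few array operations; the sketch below is a simplified illustration under an assumed linear calibration model, not the GDPS implementation.

```python
# Simplified DN-to-radiance conversion; array names and the linear
# per-detector calibration model are illustrative assumptions.
import numpy as np

def dn_to_radiance(dn, dark, gain, bad_mask):
    """dn, dark, gain: (lines, samples, bands) arrays; bad_mask: True = bad."""
    radiance = gain * (dn.astype(float) - dark)   # per-detector calibration
    # Replace bad detector elements with the mean of their spectral
    # neighbours (bands wrap at the edges in this toy version).
    filled = 0.5 * (np.roll(radiance, 1, axis=2) +
                    np.roll(radiance, -1, axis=2))
    return np.where(bad_mask, filled, radiance)
```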

  8. Data process of liquid scintillation counting

    International Nuclear Information System (INIS)

    Ishikawa, Hiroaki; Kuwajima, Susumu.

    1975-01-01

    The use of liquid scintillation counting systems has spread significantly since automatic sample changers and printers came to be incorporated. However, the system will only be fully integrated once automatic data processing and the preparation of the radioactive samples to be measured are realized. Dry or wet oxidation is applied in sample preparation when radioactive materials are difficult to dissolve in the scintillator solution. In recent years, automatic sample combustion systems, in which dry oxidation is automated, have spread rapidly and contribute greatly to labor saving. Since printers generally indicate only the counted number, a data processing system has been developed that speeds up the calculation which automatically corrects for sample quenching to obtain the required final radioactivity. Data processing systems are roughly divided into on-line and off-line systems according to whether computers are connected directly or indirectly, while the hardware is classified into input, calculating and output devices. The calculation that determines sample activity by the external standard method is also explained. (Wakatsuki, Y.)
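
    The external-standard quench correction mentioned above can be illustrated in a few lines: counting efficiency is interpolated from a calibration (quench) curve and used to convert the count rate into activity. All numbers below are illustrative, not calibration data.

```python
# Sketch of external-standard quench correction; the calibration curve
# and sample values are made-up illustrative numbers.
import numpy as np

# Calibration curve: external-standard quench parameter vs. efficiency.
quench_param = np.array([0.2, 0.4, 0.6, 0.8])
efficiency   = np.array([0.35, 0.55, 0.75, 0.90])

def activity_bq(count_rate_cpm, sample_quench):
    eff = np.interp(sample_quench, quench_param, efficiency)
    return count_rate_cpm / 60.0 / eff   # cpm -> cps -> Bq

print(activity_bq(12000, 0.5))  # ~308 Bq for these illustrative values
```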

  9. Automatic Control of Arc Process for Making Carbon Nanotubes

    Science.gov (United States)

    Scott, Carl D.; Pulumbarit, Robert B.; Victor, Joe

    2004-01-01

    An automatic-control system has been devised for a process in which carbon nanotubes are produced in an arc between a catalyst-filled carbon anode and a graphite cathode. The control system includes a motor-driven screw that adjusts the distance between the electrodes. The system also includes a bridge circuit that puts out a voltage proportional to the difference between (1) the actual value of potential drop across the arc and (2) a reference value between 38 and 40 V (corresponding to a current of about 100 A) at which the yield of carbon nanotubes is maximized. Utilizing the fact that the potential drop across the arc increases with the interelectrode gap, the output of the bridge circuit is fed to a motor-control circuit that causes the motor to move the anode toward or away from the cathode if the actual potential drop is more or less, respectively, than the reference potential. Thus, the system regulates the interelectrode gap to maintain the optimum potential drop. The system also includes circuitry that records the potential drop across the arc and the relative position of the anode holder as functions of time.
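
    The regulation logic amounts to a simple feedback loop on the arc voltage; a minimal sketch follows, with placeholder functions standing in for the real bridge-circuit readout and motor interface, and a deadband added as an assumption to avoid hunting.

```python
# Minimal feedback-loop sketch of the gap regulation described above.
# read_arc_voltage and move_anode are placeholders for hardware I/O.
V_REF = 39.0     # target arc voltage, within the 38-40 V band
DEADBAND = 0.5   # volts; assumed tolerance around the setpoint

def control_step(read_arc_voltage, move_anode):
    error = read_arc_voltage() - V_REF
    if error > DEADBAND:      # gap too wide -> voltage high -> close gap
        move_anode(-1)
    elif error < -DEADBAND:   # gap too narrow -> voltage low -> open gap
        move_anode(+1)
```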

  10. Automatic processing of CERN video, audio and photo archives

    Energy Technology Data Exchange (ETDEWEB)

    Kwiatek, M [CERN, Geneva (Switzerland)], E-mail: Michal.Kwiatek@cern.ch

    2008-07-15

    The digitization of CERN audio-visual archives, a major task currently in progress, will generate over 40 TB of video, audio and photo files. Storing these files is one issue, but a far more important challenge is to provide long-term coherence of the archive and to make these files available on-line with minimum manpower investment. An infrastructure, based on standard CERN services, has been implemented, whereby master files, stored in the CERN Distributed File System (DFS), are discovered and scheduled for encoding into lightweight web formats based on predefined profiles. Changes in master files, conversion profiles or in the metadata database (read from CDS, the CERN Document Server) are automatically detected and the media re-encoded whenever necessary. The encoding processes are run on virtual servers provided on-demand by the CERN Server Self Service Centre, so that new servers can be easily configured to adapt to higher load. Finally, the generated files are made available from the CERN standard web servers with streaming implemented using Windows Media Services.

  12. Algorithm of automatic generation of technology process and process relations of automotive wiring harnesses

    Institute of Scientific and Technical Information of China (English)

    XU Benzhu; ZHU Jiman; LIU Xiaoping

    2012-01-01

    Identifying each process and the constraint relations among processes from complex wiring harness drawings quickly and accurately is the basis for formulating process routes. Based on knowledge of automotive wiring harnesses and the characteristics of wiring harness components, we established a wiring harness graph model. We then investigate an algorithm for identifying technology processes automatically, and finally describe the relationships between processes by introducing a constraint matrix, in order to lay a good foundation for harness process planning and production scheduling.

  13. AUTOMATIC REGISTRATION OF MULTI-SOURCE DATA USING MUTUAL INFORMATION

    Directory of Open Access Journals (Sweden)

    E. G. Parmehr

    2012-07-01

    Automatic image registration is a basic step in multi-sensor data integration in remote sensing and photogrammetric applications such as data fusion. The effectiveness of Mutual Information (MI) as a technique for automated multi-sensor image registration has previously been demonstrated for medical and remote sensing applications. In this paper, a new General Weighted MI (GWMI) approach that improves the robustness of MI to local maxima, particularly in the case of registering optical imagery and 3D point clouds, is presented. Two different methods, a Gaussian Mixture Model (GMM) and Kernel Density Estimation, have been used to define the weight function of the joint probability, regardless of the modality of the data being registered. The Expectation Maximization method is then used to estimate the parameters of the GMM, and a multi-resolution strategy has been used in order to reduce the cost of computation. The performance of the proposed GWMI method for the registration of aerial orthoimagery and LiDAR range and intensity information has been experimentally evaluated and the results obtained are presented.
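
    For reference, the baseline (unweighted) mutual information that GWMI reweights can be computed from a joint intensity histogram; the following is a minimal numpy sketch, not the authors' implementation.

```python
# Unweighted mutual information between two images from their joint
# histogram -- the quantity that the GWMI method builds upon.
import numpy as np

def mutual_information(a, b, bins=64):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                # joint probability
    px = pxy.sum(axis=1, keepdims=True)      # marginal of a
    py = pxy.sum(axis=0, keepdims=True)      # marginal of b
    nz = pxy > 0                             # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```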

  14. Soft computing in big data processing

    CERN Document Server

    Park, Seung-Jong; Lee, Jee-Hyong

    2014-01-01

    Big data is an essential key to building a smart world, encompassing the streaming, continuous integration of large-volume, high-velocity data from all sources to final destinations. Big data ranges over data mining, data analysis and decision making, drawing statistical rules and mathematical patterns through systematic or automatic reasoning. Big data helps serve our life better, clarify our future and deliver greater value. We can discover how to capture and analyze data. Readers will be guided through processing system integrity and implementing intelligent systems. With intelligent systems, we deal with the fundamental data management and visualization challenges in effective management of dynamic and large-scale data, and efficient processing of real-time and spatio-temporal data. Advanced intelligent systems have led to managing data monitoring, data processing and decision-making in a realistic and effective way. Considering a big size of data, variety of data and frequent chan...

  15. AUTOMATIC EXTRACTION OF ROAD MARKINGS FROM MOBILE LASER SCANNING DATA

    Directory of Open Access Journals (Sweden)

    H. Ma

    2017-09-01

    Road markings, as critical features in the high-definition maps required by Advanced Driver Assistance Systems (ADAS) and self-driving technology, have important functions in providing guidance and information to moving cars. A mobile laser scanning (MLS) system is an effective way to obtain 3D information on the road surface, including road markings, at highway speeds and at less than traditional survey costs. This paper presents a novel method to automatically extract road markings from MLS point clouds. Ground points are first filtered from the raw input point clouds using a neighborhood elevation consistency method. The basic assumption of the method is that the road surface is smooth; points with small elevation differences within their neighborhood are considered to be ground points. The ground points are then partitioned into a set of profiles according to trajectory data. The intensity histogram of the points in each profile is generated to find intensity jumps at a threshold that varies inversely with laser distance. The separated points are used as seeds for intensity-based region growing so as to obtain complete road markings. We use a point cloud template-matching method to refine the road marking candidates by removing noise clusters with low correlation coefficients. In an experiment with an MLS point set of about 2 kilometres in a city center, our method provides a promising solution to road marking extraction from MLS data.
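
    The neighborhood elevation consistency filter can be sketched as follows; the search radius and elevation threshold are illustrative values, not the paper's settings.

```python
# Sketch of a neighborhood-elevation-consistency ground filter: a point
# is kept as ground if its height is close to the local neighbourhood
# median. Radius and threshold are illustrative assumptions.
import numpy as np
from scipy.spatial import cKDTree

def filter_ground(xyz, radius=0.5, dz_max=0.05):
    """xyz: (N, 3) MLS points; returns a boolean ground mask."""
    tree = cKDTree(xyz[:, :2])               # index on planar coordinates
    mask = np.zeros(len(xyz), dtype=bool)
    for i, p in enumerate(xyz):
        idx = tree.query_ball_point(p[:2], radius)
        # Small elevation difference to the neighbourhood median -> ground.
        mask[i] = abs(p[2] - np.median(xyz[idx, 2])) < dz_max
    return mask
```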

  17. Learning algorithms and automatic processing of languages; Algorithmes a apprentissage et traitement automatique des langues

    Energy Technology Data Exchange (ETDEWEB)

    Fluhr, Christian Yves Andre

    1977-06-15

    This research thesis concerns the field of artificial intelligence. It addresses learning algorithms applied to the automatic processing of languages. The author first briefly describes some mechanisms of human intelligence in order to explain how these mechanisms are simulated on a computer, and outlines the specific role of learning in various manifestations of intelligence. Then, based on Markov algorithm theory, the author discusses the notion of a learning algorithm. Two main types of learning algorithms are then addressed: first, a sanction-based algorithm of the 'algorithm-teacher dialogue' type, which aims at learning how to solve grammatical ambiguities in submitted texts; second, an algorithm related to a document system which automatically structures semantic data obtained from a set of texts in order to be able to respond, by references, to any question on the content of these texts.

  18. Graphics on demand: the automatic data visualization on the WEB

    Directory of Open Access Journals (Sweden)

    Ramzi Guetari

    2017-06-01

    Data visualization is an effective tool for communicating the results of opinion surveys, epidemiological studies, statistics on consumer habits, etc. The graphical representation of data usually assists human information processing by reducing demands on attention, working memory, and long-term memory. It allows, among other things, faster reading of the information (by acting on forms, directions, colors…), independence from language (or culture), and better capture of the audience's attention. Data that could be graphically represented may be structured or unstructured. Unstructured data, whose volume grows exponentially, often hide important and even vital information for society and companies. It therefore takes a lot of work to extract valuable information from unstructured data. If it is easier to understand a message through structured data, such as a table, than through a long narrative text, it is even easier to convey a message through a graphic than through a table. In our opinion, it is often very useful to synthesize unstructured data in the form of graphical representations. In this paper, we present an approach for processing unstructured data containing statistics in order to represent them graphically. This approach transforms the unstructured data into structured data that globally convey the same countable information; the graphical representation of such structured data is then straightforward. The approach deals with both quantitative and qualitative data, and is based on Natural Language Processing and Text Mining techniques. An application that implements this process is also presented in this paper.
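
    As a toy illustration of the unstructured-to-structured step, the sketch below pulls quantities out of free text with a regular expression; the sentence and the pattern are simplified stand-ins for the paper's NLP and text-mining stage.

```python
# Toy pipeline: extract (percentage, category) pairs from free text,
# structure them, and hand the structured form to a plotting call.
import re

text = ("In the survey, 42% of respondents preferred trains, "
        "35% preferred cars and 23% preferred bicycles.")

pairs = re.findall(r"(\d+)%\s+(?:of respondents\s+)?preferred\s+(\w+)", text)
data = {category: int(pct) for pct, category in pairs}
print(data)  # {'trains': 42, 'cars': 35, 'bicycles': 23}

# The structured dict can now be passed to any charting library,
# e.g. matplotlib: plt.bar(data.keys(), data.values())
```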

  19. Image processing techniques for remote sensing data

    Digital Repository Service at National Institute of Oceanography (India)

    RameshKumar, M.R.

    interpretation and for processing of scene data for autonomous machine perception. The techniques of digital image processing are used for automatic character/pattern recognition, industrial robots for product assembly and inspection, military reconnaissance... and spatial co-ordinates into discrete components. The mathematical concepts involved are sampling and transform theory. Two-dimensional transforms are used for image enhancement, restoration, encoding and description too. The main objective of the image...

  20. Data processing on FPGAs

    CERN Document Server

    Teubner, Jens

    2013-01-01

    Roughly a decade ago, power consumption and heat dissipation concerns forced the semiconductor industry to radically change its course, shifting from sequential to parallel computing. Unfortunately, improving performance of applications has now become much more difficult than in the good old days of frequency scaling. This is also affecting databases and data processing applications in general, and has led to the popularity of so-called data appliances-specialized data processing engines, where software and hardware are sold together in a closed box. Field-programmable gate arrays (FPGAs) incr

  1. Automatic Indoor Building Reconstruction from Mobile Laser Scanning Data

    Science.gov (United States)

    Xie, L.; Wang, R.

    2017-09-01

    Indoor reconstruction from point clouds is a hot topic in photogrammetry, computer vision and computer graphics. Reconstructing indoor scenes from point clouds is challenging due to complex room floorplans and line-of-sight occlusions. Most existing methods deal with stationary terrestrial laser scanning point clouds or RGB-D point clouds. In this paper, we propose an automatic method for reconstructing indoor 3D building models from mobile laser scanning point clouds. The method includes 2D floorplan generation, 3D building modeling, door detection and room segmentation. The main idea behind our approach is to separate the wall structure into two different types, inner walls and outer walls, based on the observed point distribution. We then utilize a graph-cut based optimization method to solve the labeling problem and generate the 2D floorplan from the optimization result. Subsequently, we leverage an α-shape based method to detect doors on the 2D projected point clouds and utilize the floorplan to segment the individual rooms. The experiments show that the door detection method achieves a recognition rate of 97% and the room segmentation method attains correct segmentation results. We also evaluate the reconstruction accuracy on synthetic data, which indicates that the accuracy of our method is comparable to the state of the art.

  2. Automatic Compound Annotation from Mass Spectrometry Data Using MAGMa.

    Science.gov (United States)

    Ridder, Lars; van der Hooft, Justin J J; Verhoeven, Stefan

    2014-01-01

    The MAGMa software for automatic annotation of mass spectrometry-based fragmentation data was applied to 16 MS/MS datasets of the CASMI 2013 contest. Eight solutions were submitted in category 1 (molecular formula assignments) and twelve in category 2 (molecular structure assignment). The MS/MS peaks of each challenge were matched with in silico generated substructures of candidate molecules from PubChem, resulting in penalty scores that were used for candidate ranking. In 6 of the 12 submitted solutions in category 2, the correct chemical structure obtained the best score, whereas 3 molecules were ranked outside the top 5. All top-ranked molecular formulas submitted in category 1 were correct. In addition, we present MAGMa results generated retrospectively for the remaining challenges. Successful application of the MAGMa algorithm required inclusion of the relevant candidate molecules, application of the appropriate mass tolerance, and a sufficient degree of in silico fragmentation of the candidate molecules. Furthermore, the effect of the exhaustiveness of the candidate lists and the limitations of substructure-based scoring are discussed.

  3. Controlled versus automatic processes: which is dominant to safety? The moderating effect of inhibitory control.

    Directory of Open Access Journals (Sweden)

    Yaoshan Xu

    This study explores the precursors of employees' safety behaviors based on a dual-process model, which suggests that human behaviors are determined by both controlled and automatic cognitive processes. Employees' responses to a self-reported survey on safety attitudes capture their controlled cognitive processes, while the automatic associations concerning safety measured by an Implicit Association Test (IAT) reflect employees' automatic cognitive processes about safety. In addition, this study investigates the moderating effects of inhibition on the relationship between self-reported safety attitude and safety behavior, and on that between automatic associations towards safety and safety behavior. The results suggest significant main effects of self-reported safety attitude and automatic association on safety behaviors. Further, the interaction between self-reported safety attitude and inhibition and that between automatic association and inhibition each predict unique variance in safety behavior. Specifically, the safety behaviors of employees with a lower level of inhibitory control are influenced more by automatic associations, whereas those of employees with a higher level of inhibitory control are guided more by self-reported safety attitudes. These results suggest that safety behavior is the joint outcome of both controlled and automatic cognitive processes, and that the relative importance of these cognitive processes depends on employees' individual differences in inhibitory control. The implications of these findings for theoretical and practical issues are discussed at the end.

  4. Automatic system for processing the plasma radiation spectra

    International Nuclear Information System (INIS)

    Isakaev, Eh.Kh.; Markin, A.V.; Khajmin, V.A.; Chinnov, V.F.

    2001-01-01

    The problem addressed is the computerized processing of experimental data obtained when studying plasma with present-day data acquisition systems. Rather simple and reliable processing programs were elaborated. The system is used for quantitative plasma spectroscopy, the classical and most widely used method for analyzing the parameters and properties of low-temperature and high-temperature plasma [ru]

  5. Process air quality data

    Science.gov (United States)

    Butler, C. M.; Hogge, J. E.

    1978-01-01

    Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of the data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness-of-fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily, monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness-of-fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuing the development of research on processing air quality data.

  6. Image processing applied to automatic detection of defects during ultrasonic examination

    International Nuclear Information System (INIS)

    Moysan, J.

    1992-10-01

    This work is a study of image processing applied to the ultrasonic B-scan images obtained in the non-destructive testing of welds. The goal is to define what image processing techniques can bring to improve the exploitation of the collected data and, more precisely, what image processing can do to extract the meaningful echoes which enable the defects to be characterized and sized. The report presents non-destructive testing by ultrasound in the nuclear field and indicates the specificities of the propagation of ultrasonic waves in austenitic welds. It gives a state of the art of data processing applied to ultrasonic images in non-destructive evaluation. A new image analysis is then developed, based on a powerful tool, the co-occurrence matrix. This matrix makes it possible to represent, in a single representation, the relations between the amplitudes of pairs of pixels. From the matrix analysis, a new complete and automatic method has been devised to define a threshold which separates echoes from noise. An automatic interpretation of the ultrasonic echoes is then possible. Complete validation has been done with standard pieces.
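
    The co-occurrence matrix at the core of this analysis is available off the shelf; below is a minimal sketch using scikit-image, with a random stand-in array in place of a real B-scan image.

```python
# Minimal co-occurrence (GLCM) computation with scikit-image; the
# random array is a stand-in for a real ultrasonic B-scan image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

bscan = np.random.randint(0, 256, (128, 128), dtype=np.uint8)

# Pixel-pair statistics for one offset (distance 1, horizontal).
glcm = graycomatrix(bscan, distances=[1], angles=[0],
                    levels=256, symmetric=True, normed=True)

# Scalar texture features that could help separate echoes from noise.
print(graycoprops(glcm, "contrast")[0, 0],
      graycoprops(glcm, "homogeneity")[0, 0])
```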

  7. Automatic physiological waveform processing for FMRI noise correction and analysis.

    Directory of Open Access Journals (Sweden)

    Daniel J Kelley

    2008-03-01

    Functional MRI resting state and connectivity studies of the brain focus on neural fluctuations at low frequencies which share power with physiological fluctuations originating from the lungs and heart. Due to the lack of automated software to process physiological signals collected at high magnetic fields, a gap exists in the processing pathway between the acquisition of physiological data and its use in fMRI software for both physiological noise correction and functional analyses of brain activation and connectivity. To fill this gap, we developed an open-source physiological signal processing program, called PhysioNoise, in the Python language. We tested its automated processing algorithms and dynamic signal visualization on resting monkey cardiac and respiratory waveforms. PhysioNoise consistently identifies physiological fluctuations for fMRI noise correction and also generates covariates for subsequent analyses of brain activation and connectivity.

  8. Automatic system of production, transfer and processing of coin targets for the production of metallic radioisotopes

    Science.gov (United States)

    Pellicioli, M.; Ouadi, A.; Marchand, P.; Foehrenbacher, T.; Schuler, J.; Dick-Schuler, N.; Brasse, D.

    2017-05-01

    The work presented in this paper gathers three main technical developments aiming at (1) optimizing nuclide production by means of solid targets, (2) automatically transferring coin targets from the vault to the hotcell without human intervention, and (3) automatically processing target dilution and purification in the hotcell. This system has been installed on an ACSI TR24 cyclotron in Strasbourg, France.

  9. Online data processing system

    International Nuclear Information System (INIS)

    Nakahara, Yoshinori; Yagi, Hideyuki; Yamada, Takayuki

    1979-02-01

    A pulse height analyzer terminal system, PHATS, has been developed for online data processing via the JAERI-TOKAI computer network. The system is controlled by a micro-computer, MICRO-8, which was developed for the JAERI-TOKAI network. The system program consists of two subprograms: the online control system ONLCS and the pulse height analyzer control system PHACS. ONLCS links the terminal with the conversational programming system of the FACOM 230/75 through the JAERI-TOKAI network and controls data processing in TSS and remote batch modes. PHACS is used to control the input/output of data between the pulse height analyzer and cassette-MT or typewriter. This report describes the hardware configuration and the system program in detail. In the appendix, the real time monitor, the types of messages, and the PEX-to-PEX and Host-to-Host protocols required for the system programming are explained. (author)

  10. Data processing device

    International Nuclear Information System (INIS)

    Kita, Yoshio.

    1994-01-01

    A data processing device for use in a thermonuclear test device comprises a frequency component judging section for analog signals, a sampling time selection section based on the result of the judgement, and a storing memory section for selecting the digital data memorized during the sampling time. Namely, the frequency components of the analog signals are detected by the frequency component judging section, and one of a plurality of previously set sampling times is selected by the sampling time selection section based on the result of the judgement. Then, digital data obtained by A/D conversion are read and preliminarily memorized in the storing memory section. Subsequently, the digital data memorized during the sampling time selected by the sampling time selection section are selected and transmitted to a host computer. The amount of data to be memorized can thus be greatly reduced, lowering the cost. (N.H.)

  11. Experience with automatic orientation from different data sets

    DEFF Research Database (Denmark)

    Potucková, Marketa

    2003-01-01

    Automatic orientation of aerial images based on existing databases was a topic of the OEEPE research project running in 1998 and 1999. Different approaches for solving this task have been published until now. The method developed at Aalborg University uses the existing topographic database...

  12. Mirion--a software package for automatic processing of mass spectrometric images.

    Science.gov (United States)

    Paschke, C; Leisner, A; Hester, A; Maass, K; Guenther, S; Bouschen, W; Spengler, B

    2013-08-01

    Mass spectrometric imaging (MSI) techniques are of growing interest for the Life Sciences. In recent years, the development of new instruments employing ion sources that are tailored for spatial scanning allowed the acquisition of large data sets. A subsequent data processing, however, is still a bottleneck in the analytical process, as a manual data interpretation is impossible within a reasonable time frame. The transformation of mass spectrometric data into spatial distribution images of detected compounds turned out to be the most appropriate method to visualize the results of such scans, as humans are able to interpret images faster and easier than plain numbers. Image generation, thus, is a time-consuming and complex yet very efficient task. The free software package "Mirion," presented in this paper, allows the handling and analysis of data sets acquired by mass spectrometry imaging. Mirion can be used for image processing of MSI data obtained from many different sources, as it uses the HUPO-PSI-based standard data format imzML, which is implemented in the proprietary software of most of the mass spectrometer companies. Different graphical representations of the recorded data are available. Furthermore, automatic calculation and overlay of mass spectrometric images promotes direct comparison of different analytes for data evaluation. The program also includes tools for image processing and image analysis.

  13. Big Data technology in traffic: A case study of automatic counters

    Directory of Open Access Journals (Sweden)

    Janković Slađana R.

    2016-01-01

    Modern information and communication technologies together with intelligent devices provide a continuous inflow of large amounts of data that are used by traffic and transport systems. Collecting traffic data is not a challenge nowadays, but issues remain in storing and processing increasing amounts of data. In this paper we have investigated the possibilities of using Big Data technology to store and process data in the transport domain. The term Big Data refers to information resources of large volume, high velocity and wide variety, far beyond the capabilities of commonly used software for storing, processing and managing data. In our case study, the Apache™ Hadoop® Big Data platform was used for processing data collected from 10 automatic traffic counters set up in Novi Sad and its surroundings. Indicators of traffic load calculated on the Big Data platform were presented using tables and graphs in Microsoft Office Excel. The visualization and geolocation of the obtained indicators were performed using Microsoft Business Intelligence (BI) tools such as Excel Power View and Excel Power Map. This case study showed that Big Data technologies combined with BI tools can be used as reliable support in the monitoring of traffic management systems.
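
    The kind of aggregation run on such a platform can be sketched with PySpark, used here as a stand-in for the paper's exact Hadoop toolchain; the CSV layout with counter_id, timestamp and vehicles columns is an assumption.

```python
# Hedged sketch: hourly traffic load per counter from raw count records.
# The input schema (counter_id, timestamp, vehicles) is hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("traffic-load").getOrCreate()
counts = spark.read.csv("counters.csv", header=True, inferSchema=True)

hourly = (counts
          .withColumn("hour", F.hour("timestamp"))
          .groupBy("counter_id", "hour")
          .agg(F.sum("vehicles").alias("total_vehicles")))

hourly.write.csv("hourly_load", header=True)  # ready for Excel/BI tools
```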

  14. Automatic identification of corrosion damage using image processing techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bento, Mariana P.; Ramalho, Geraldo L.B.; Medeiros, Fatima N.S. de; Ribeiro, Elvis S. [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil); Medeiros, Luiz C.L. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper proposes a Nondestructive Evaluation (NDE) method for atmospheric corrosion detection on metallic surfaces using digital images. In this study, uniform corrosion is characterized by texture attributes extracted from the co-occurrence matrix and by the Self-Organizing Map (SOM) clustering algorithm. We present a technique for automatic inspection of the oil and gas storage tanks and pipelines of petrochemical industries without disturbing their properties and performance. Experimental results are promising and encourage the possibility of using this methodology in designing trustworthy and robust early failure detection systems. (author)

  15. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    Science.gov (United States)

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers, and this reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques, tabular CUSUM, standardized CUSUM and EWMA, known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
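
    An EWMA control chart over daily transfer times can be sketched as follows; the smoothing constant, control-limit width and baseline length are common textbook choices rather than the study's exact settings.

```python
# EWMA control chart sketch: flags days where the smoothed transfer
# time drifts outside the control limits. Parameters are illustrative.
import numpy as np

def ewma_alarms(x, lam=0.2, L=3.0, baseline=20):
    x = np.asarray(x, dtype=float)
    mu, sigma = x[:baseline].mean(), x[:baseline].std()  # in-control estimate
    # Steady-state control limit for the EWMA statistic.
    limit = L * sigma * np.sqrt(lam / (2 - lam))
    z, alarms = mu, []
    for t, xt in enumerate(x):
        z = lam * xt + (1 - lam) * z          # exponential smoothing
        if abs(z - mu) > limit:
            alarms.append(t)
    return alarms
```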

  16. Automatic data acquisition system for a photovoltaic solar plant

    Energy Technology Data Exchange (ETDEWEB)

    Santos, A.; Barrio, C.L.; Guerra, A.G.

    1986-01-01

    An autonomous monitoring system for photovoltaic solar plants is described. The system is able to collect data about the plant's physical and electrical characteristics and also about the environmental conditions. It may present the results on a display, if requested, but its main function is measuring periodically a set of parameters, including several points in the panel I-V characteristics, in an unattended mode. The data are stored on a magnetic tape for later processing on a computer. The system hardware and software are described, as well as their main functions.

  17. Automatic tracking of wake vortices using ground-wind sensor data

    Science.gov (United States)

    1977-01-03

    Algorithms for automatic tracking of wake vortices using ground-wind anemometer data are developed. Methods of bad-data suppression, track initiation, and track termination are included. An effective sensor-failure detection and identification ...

  18. The N400 and Late Positive Complex (LPC) Effects Reflect Controlled Rather than Automatic Mechanisms of Sentence Processing

    Directory of Open Access Journals (Sweden)

    Boris Kotchoubey

    2012-08-01

    This study compared the automatic and controlled cognitive processes that underlie event-related potential (ERP) effects during speech perception. Sentences were presented to French native speakers, and the final word could be congruent or incongruent and presented at one of four levels of degradation (using modulation with pink noise): no degradation, mild degradation (two levels), or strong degradation. We assumed that degradation impairs controlled more than automatic processes. The N400 and Late Positive Complex (LPC) effects were defined as the differences between the corresponding wave amplitudes to incongruent words minus congruent words. Under mild degradation, where controlled sentence-level processing could still occur (as indicated by behavioral data), both N400 and LPC effects were delayed and the latter effect was reduced. Under strong degradation, where sentence processing was rather automatic (as indicated by behavioral data), no ERP effect remained. These results suggest that ERP effects elicited in complex contexts, such as sentences, reflect controlled rather than automatic mechanisms of speech processing. These results differ from those of experiments that used word-pair or word-list paradigms.

  19. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  20. Study of automatic boat loading unit and horizontal sintering process of uranium dioxide pellet

    International Nuclear Information System (INIS)

    He Zhongjing; Chen Yu; Yao Dengfeng; Wang Youliang; Shu Binhua; Wu Genjiu

    2014-01-01

    Sintering is a key process in the manufacture of nuclear fuel UO2 pellets. In our factory, a continuous high-temperature sintering furnace is used for the sintering process. During the sintering of green pellets, the furnace, the boat and the stacking arrangement can influence the quality of the final product. In this text, on the basis of earlier process research, an automatic boat loading unit and a horizontal sintering process are studied. The results show that the physical and chemical properties of the products manufactured with the automatic boat loading unit and the horizontal sintering process fully meet the technique requirements, and that the system is reliable and continuous. (authors)

  1. Study on automatic ECT data evaluation by using neural network

    International Nuclear Information System (INIS)

    Komatsu, H.; Matsumoto, Y.; Badics, Z.; Aoki, K.; Nakayasu, F.; Hashimoto, M.; Miya, K.

    1994-01-01

    For the in-service inspection of steam generator (SG) tubing in Pressurized Water Reactor (PWR) plants, eddy current testing (ECT) has been widely used at each outage. At present, ECT data evaluation is mainly performed by ECT data analysts and therefore has the following problems: only the ECT signal configuration on the impedance trajectory is used in the evaluation; it is an enormously time-consuming process; and the evaluation result is influenced by the ability and experience of the analyst. In particular, it is difficult to identify a true defect signal hidden in background signals such as lift-off noise and deposit signals. In this work, the authors studied the possibility of applying a neural network to ECT data evaluation. It was demonstrated that the neural network is effective in identifying the nature of defects when several optimum input parameters are selected to categorize the raw ECT signals.
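
    The idea of mapping a few ECT signal parameters to a defect label can be sketched with a small scikit-learn network; the features and labels below are synthetic placeholders, not the study's input parameters.

```python
# Sketch: a small neural network classifying ECT signals as defect /
# non-defect from a handful of signal parameters. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Assumed features per signal: e.g. amplitude, phase angle, signal width.
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic labels

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)
print(clf.score(X, y))   # training accuracy on the toy data
```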

  2. Automatic system for crystallographic data collection and analysis

    International Nuclear Information System (INIS)

    Minor, W.; Cymborowski, M.; Otwinowski, Z.

    2002-01-01

    During the last 10 years the rate at which new protein structures are determined by X-ray crystallography has risen about tenfold. The use of high-flux sources was instrumental in this growth. There are numerous advantages to using synchrotron radiation for protein crystallography: rapid data collection, the use of microcrystals and the ability to conduct measurements at a wide range of wavelengths. The rate-limiting step is often the ability to analyze and back up the fast stream of data produced by a multi-module CCD detector. A goal of the newly developed HKL-2000 package is to integrate all computational activities that have to be performed during the data collection experiment. The graphical Command Center of HKL-2000 organizes and forwards the data collection parameters to the display, indexing, strategy, simulation, refinement, integration, scaling, and merging tasks. Data acquisition can become a part of data processing (or vice versa), which includes indexing, integration, scaling, and even phasing. The increase in Internet bandwidth will provide an opportunity to remotely interact with the experimental setup and perform the synchrotron experiment from the home laboratory. (author)

  3. Dissociation between controlled and automatic processes in the behavioral variant of fronto-temporal dementia.

    Science.gov (United States)

    Collette, Fabienne; Van der Linden, Martial; Salmon, Eric

    2010-01-01

    A decline in cognitive functioning affecting several cognitive domains has frequently been reported in patients with frontotemporal dementia. We were interested in determining whether these deficits can be interpreted as reflecting an impairment of controlled cognitive processes, using an assessment tool specifically developed to explore the distinction between automatic and controlled processes, namely the process dissociation procedure (PDP) developed by Jacoby. The PDP was applied to a word-stem completion task to determine the contribution of automatic and controlled processes to episodic memory performance, and was administered to a group of 12 patients with the behavioral variant of frontotemporal dementia (bv-FTD) and 20 control subjects (CS). Bv-FTD patients obtained a lower performance than CS on the estimates of controlled processes, but no group difference was observed for the estimates of automatic processes. The between-groups comparison of the estimates of controlled and automatic processes showed a larger contribution of automatic processes to performance in bv-FTD, while a slightly larger contribution of controlled processes was observed in control subjects. These results are clearly indicative of an alteration of controlled memory processes in bv-FTD.

  4. Automatic Feature Detection, Description and Matching from Mobile Laser Scanning Data and Aerial Imagery

    Science.gov (United States)

    Hussnain, Zille; Oude Elberink, Sander; Vosselman, George

    2016-06-01

    In mobile laser scanning systems, the platform's position is measured by GNSS and IMU, which is often not reliable in urban areas. Consequently, the derived Mobile Laser Scanning Point Cloud (MLSPC) lacks the expected positioning reliability and accuracy. Many current solutions are either semi-automatic or unable to achieve pixel-level accuracy. We propose an automatic feature extraction method which utilizes corresponding aerial images as a reference data set. The proposed method comprises three steps: image feature detection, description and matching between corresponding patches of nadir aerial and MLSPC ortho images. In the data pre-processing step, the MLSPC is cropped patch-wise and converted to ortho images, and each aerial image patch covering the area of the corresponding MLSPC patch is also cropped from the aerial image. For feature detection, we implemented an adaptive variant of the Harris operator to automatically detect corner feature points on the vertices of road markings. In the feature description phase, we used the LATCH binary descriptor, which is robust to data from different sensors. For descriptor matching, we developed an outlier filtering technique which exploits the arrangements of relative Euclidean distances and angles between corresponding sets of feature points. We found that the positioning accuracy of the computed correspondence achieves pixel-level accuracy, where the image resolution is 12 cm. Furthermore, the developed approach is reliable when enough road markings are available in the data sets. We conclude that, in urban areas, the developed approach can reliably extract the features necessary to improve the MLSPC accuracy to pixel level.

  5. A Cloud-Based System for Automatic Hazard Monitoring from Sentinel-1 SAR Data

    Science.gov (United States)

    Meyer, F. J.; Arko, S. A.; Hogenson, K.; McAlpin, D. B.; Whitley, M. A.

    2017-12-01

    Despite the all-weather capabilities of Synthetic Aperture Radar (SAR), and its high performance in change detection, the application of SAR for operational hazard monitoring was limited in the past. This has largely been due to high data costs, slow product delivery, and limited temporal sampling associated with legacy SAR systems. Only since the launch of ESA's Sentinel-1 sensors have routinely acquired and free-of-charge SAR data become available, allowing—for the first time—for a meaningful contribution of SAR to disaster monitoring. In this paper, we present recent technical advances of the Sentinel-1-based SAR processing system SARVIEWS, which was originally built to generate hazard products for volcano monitoring centers. We outline the main functionalities of SARVIEWS including its automatic database interface to Sentinel-1 holdings of the Alaska Satellite Facility (ASF), and its set of automatic processing techniques. Subsequently, we present recent system improvements that were added to SARVIEWS and allowed for a vast expansion of its hazard services; specifically: (1) In early 2017, the SARVIEWS system was migrated into the Amazon Cloud, providing access to cloud capabilities such as elastic scaling of compute resources and cloud-based storage; (2) we co-located SARVIEWS with ASF's cloud-based Sentinel-1 archive, enabling the efficient and cost effective processing of large data volumes; (3) we integrated SARVIEWS with ASF's HyP3 system (http://hyp3.asf.alaska.edu/), providing functionality such as subscription creation via API or map interface as well as automatic email notification; (4) we automated the production chains for seismic and volcanic hazards by integrating SARVIEWS with the USGS earthquake notification service (ENS) and the USGS eruption alert system. Email notifications from both services are parsed and subscriptions are automatically created when certain event criteria are met; (5) finally, SARVIEWS-generated hazard products are now

  6. An Automatic Framework Using Space-Time Processing and TR-MUSIC for Subsurface and Through-Wall Multitarget Imaging

    Directory of Open Access Journals (Sweden)

    Si-hao Tan

    2012-01-01

    We present an automatic framework that combines space-time signal processing with Time Reversal electromagnetic (EM) inversion for subsurface and through-wall multitarget imaging using electromagnetic waves. The framework is composed of a frequency-wavenumber (FK) filter to suppress the direct wave and medium bounce, an FK migration algorithm to automatically estimate the number of targets and identify target regions, which can be used to reduce the computational complexity of the subsequent imaging algorithm, and an EM inversion algorithm using Time Reversal Multiple Signal Classification (TR-MUSIC) to reconstruct hidden objects. The feasibility of the framework is demonstrated with simulated data generated by GPRMAX.
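
    The FK filtering stage can be illustrated conceptually with a 2D FFT and a wavenumber mask; the mask geometry below is a placeholder, not the paper's design.

```python
# Conceptual FK (frequency-wavenumber) filter: transform the B-scan,
# mask part of the f-k plane, transform back. Mask shape is illustrative.
import numpy as np

def fk_filter(bscan, mask_fn):
    spec = np.fft.fft2(bscan)
    f = np.fft.fftfreq(bscan.shape[0])   # temporal frequencies (rows)
    k = np.fft.fftfreq(bscan.shape[1])   # spatial wavenumbers (columns)
    F, K = np.meshgrid(f, k, indexing="ij")
    return np.real(np.fft.ifft2(spec * mask_fn(F, K)))

# Example mask: attenuate near-zero wavenumbers, where flat, direct
# arrivals concentrate in the f-k plane.
suppressed = fk_filter(np.random.rand(256, 64),
                       lambda F, K: (np.abs(K) > 0.02).astype(float))
```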

  7. Towards automatic parameter tuning of stream processing systems

    KAUST Repository

    Bilal, Muhammad

    2017-09-27

    Optimizing the performance of big-data streaming applications has become a daunting and time-consuming task: parameters may be tuned from a space of hundreds or even thousands of possible configurations. In this paper, we present a framework for automating parameter tuning for stream-processing systems. Our framework supports standard black-box optimization algorithms as well as a novel gray-box optimization algorithm. We demonstrate the multiple benefits of automated parameter tuning in optimizing three benchmark applications in Apache Storm. Our results show that a hill-climbing algorithm that uses a new heuristic sampling approach based on Latin Hypercube provides the best results. Our gray-box algorithm provides comparable results while being two to five times faster.
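
    A minimal sketch of the winning combination, Latin Hypercube seeding followed by hill climbing, under the assumptions of a normalized parameter space and a stand-in objective function in place of an actual benchmark run:

```python
# Latin Hypercube seeding + simple hill climbing over a normalized
# parameter space; the objective stands in for running a benchmark job.
import numpy as np

def latin_hypercube(n, dims, rng):
    # One sample per stratum in each dimension, shuffled independently.
    u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
    for d in range(dims):
        rng.shuffle(u[:, d])
    return u

def tune(objective, dims=2, n_seeds=10, iters=50, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    # Best Latin Hypercube seed becomes the hill climber's start point.
    best = max(latin_hypercube(n_seeds, dims, rng), key=objective)
    for _ in range(iters):
        cand = np.clip(best + rng.normal(0, step, dims), 0, 1)
        if objective(cand) > objective(best):   # maximize throughput
            best = cand
    return best
```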

  8. Adaptive Automatic Gauge Control of a Cold Strip Rolling Process

    Directory of Open Access Journals (Sweden)

    ROMAN, N.

    2010-02-01

    The paper deals with the thickness control structure of cold-rolled strips. This structure is based on the roll position control of a reversible quarto rolling mill. The main feature of the system proposed in the paper consists in compensating the errors introduced by the deficient dynamics of the hydraulic servo-system used for roll positioning, by means of a dynamic compensator that approximates the inverse of the servo-system. Because the servo-system is considered time-variant, on-line identification of the servo-system and parameter adaptation of the compensator are performed. The results obtained by numerical simulation are presented together with data taken from the real process, and illustrate the efficiency of the proposed solutions.

  9. Automatic cortical surface reconstruction of high-resolution T1 echo planar imaging data.

    Science.gov (United States)

    Renvall, Ville; Witzel, Thomas; Wald, Lawrence L; Polimeni, Jonathan R

    2016-07-01

    Echo planar imaging (EPI) is the method of choice for the majority of functional magnetic resonance imaging (fMRI), yet EPI is prone to geometric distortions and thus misaligns with conventional anatomical reference data. The poor geometric correspondence between functional and anatomical data can lead to severe misplacements and corruption of detected activation patterns. However, recent advances in imaging technology have provided EPI data with increasing quality and resolution. Here we present a framework for deriving cortical surface reconstructions directly from high-resolution EPI-based reference images that provide anatomical models exactly geometric distortion-matched to the functional data. Anatomical EPI data with 1 mm isotropic voxel size were acquired using a fast multiple inversion recovery time EPI sequence (MI-EPI) at 7T, from which quantitative T1 maps were calculated. Using these T1 maps, volumetric data mimicking the tissue contrast of standard anatomical data were synthesized using the Bloch equations, and these T1-weighted data were automatically processed using FreeSurfer. The spatial alignment between T2*-weighted EPI data and the synthetic T1-weighted anatomical MI-EPI-based images was improved compared to the conventional anatomical reference. In particular, the alignment near the regions vulnerable to distortion due to magnetic susceptibility differences was improved, and sampling of the adjacent tissue classes outside of the cortex was reduced when using cortical surface reconstructions derived directly from the MI-EPI reference. The MI-EPI method therefore produces high-quality anatomical data that can be automatically segmented with standard software, providing cortical surface reconstructions that are geometrically matched to the BOLD fMRI data. Copyright © 2016 Elsevier Inc. All rights reserved.
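
    The contrast-synthesis step can be illustrated with a reduced sketch: for a single inversion-recovery experiment, the Bloch equations give a closed-form signal as a function of T1. The inversion time and magnitude convention below are assumptions; the paper's MI-EPI model is more complete:

        import numpy as np

        def synthesize_t1w(t1_map_ms, ti_ms=900.0, m0=1.0):
            """Magnitude signal of an inversion-recovery experiment at time TI."""
            t1 = np.clip(t1_map_ms, 1.0, None)   # avoid division by zero
            return np.abs(m0 * (1.0 - 2.0 * np.exp(-ti_ms / t1)))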

  10. Marcoule pilot work-room: process automatic operation

    International Nuclear Information System (INIS)

    Mus, G.; Linger, C.

    1987-01-01

    Commissioned in the early 1960s, the Marcoule Pilot Plant has undergone a series of sweeping transformations. The research and development resources concerning irradiated fuel processing have been expanded and modified. Its reprocessing capacity has also been raised from 2 to 5 t/year. Simultaneously, the installation control system was completely remodelled. The control consoles, which were previously positioned locally near the different units, have been grouped together in a centralized control room. To do this, the measurement and operating circuits were replaced by new data acquisition and processing systems requiring the use of numerical algorithms. The management and control of certain units, including mechanical fuel preparation, sampling, and sample transport to the laboratories, have been entrusted to programmable logic controllers. Certain unit operations, such as concentration by evaporation, are fully automated. These new arrangements will expand the resources for analysing the operation of the Pilot Plant, while offering a more global view of the operations. They have been made possible by a major effort in the development of sensors, and represent the indispensable prerequisite for the installation of expert systems [fr

  11. Automatic Determination of Fiber-Length Distribution in Composite Material Using 3D CT Data

    Directory of Open Access Journals (Sweden)

    Günther Greiner

    2010-01-01

    Full Text Available Determining fiber length distribution in fiber reinforced polymer components is a crucial step in quality assurance, since fiber length has a strong influence on overall strength, stiffness, and stability of the material. The approximate fiber length distribution is usually determined early in the development process, as conventional methods require a destruction of the sample component. In this paper, a novel, automatic, and nondestructive approach for the determination of fiber length distribution in fiber reinforced polymers is presented. For this purpose, high-resolution computed tomography is used as imaging method together with subsequent image analysis for evaluation. The image analysis consists of an iterative process where single fibers are detected automatically in each iteration step after having applied image enhancement algorithms. Subsequently, a model-based approach is used together with a priori information in order to guide a fiber tracing and segmentation process. Thereby, the length of the segmented fibers can be calculated and a length distribution can be deduced. The performance and the robustness of the segmentation method is demonstrated by applying it to artificially generated test data and selected real components.

  12. Automatic Determination of Fiber-Length Distribution in Composite Material Using 3D CT Data

    Science.gov (United States)

    Teßmann, Matthias; Mohr, Stephan; Gayetskyy, Svitlana; Haßler, Ulf; Hanke, Randolf; Greiner, Günther

    2010-12-01

    Determining fiber length distribution in fiber reinforced polymer components is a crucial step in quality assurance, since fiber length has a strong influence on overall strength, stiffness, and stability of the material. The approximate fiber length distribution is usually determined early in the development process, as conventional methods require a destruction of the sample component. In this paper, a novel, automatic, and nondestructive approach for the determination of fiber length distribution in fiber reinforced polymers is presented. For this purpose, high-resolution computed tomography is used as imaging method together with subsequent image analysis for evaluation. The image analysis consists of an iterative process where single fibers are detected automatically in each iteration step after having applied image enhancement algorithms. Subsequently, a model-based approach is used together with a priori information in order to guide a fiber tracing and segmentation process. Thereby, the length of the segmented fibers can be calculated and a length distribution can be deduced. The performance and the robustness of the segmentation method is demonstrated by applying it to artificially generated test data and selected real components.

  13. Automatic Gap Detection in Friction Stir Welding Processes (Preprint)

    National Research Council Canada - National Science Library

    Yang, Yu; Kalya, Prabhanjana; Landers, Robert G; Krishnamurthy, K

    2006-01-01

    .... This paper develops a monitoring algorithm to detect gaps in Friction Stir Welding (FSW) processes. Experimental studies are conducted to determine how the process parameters and the gap width affect the welding process...

  14. Automatic Tracking Of Remote Sensing Precipitation Data Using Genetic Algorithm Image Registration Based Automatic Morphing: September 1999 Storm Floyd Case Study

    Science.gov (United States)

    Chiu, L.; Vongsaard, J.; El-Ghazawi, T.; Weinman, J.; Yang, R.; Kafatos, M.

    Due to the poor temporal sampling by satellites, data gaps exist in satellite-derived time series of precipitation. This poses a challenge for assimilating rainfall data into forecast models. To yield a continuous time series, the classic image processing technique of digital image morphing has been used. However, the digital morphing technique was applied manually, which is time consuming. In order to avoid human intervention in the process, an automatic procedure for image morphing is needed for real-time operations. For this purpose, the Genetic Algorithm Based Image Registration Automatic Morphing (GRAM) model was developed and tested in this paper. Specifically, the automatic morphing technique was integrated with a Genetic Algorithm and the Feature Based Image Metamorphosis technique to fill in data gaps between satellite coverage. The technique was tested using NOWRAD data, which are generated from the network of NEXRAD radars. Time series of NOWRAD data from storm Floyd, which occurred in the US eastern region on September 16, 1999, for 00:00, 01:00, 02:00, 03:00, and 04:00 am were used. The GRAM technique was applied to data collected at 00:00 and 04:00 am. These images were also manually morphed. Images at 01:00, 02:00 and 03:00 am were interpolated from the GRAM and manual morphing and compared with the original NOWRAD rain rates. The results show that the GRAM technique outperforms manual morphing. The correlation coefficients between the images generated using manual morphing are 0.905, 0.900, and 0.905 for the images at 01:00, 02:00, and 03:00 am, while the corresponding correlation coefficients are 0.946, 0.911, and 0.913, respectively, based on the GRAM technique. Index terms: Remote Sensing, Image Registration, Hydrology, Genetic Algorithm, Morphing, NEXRAD

  15. Automatic Data Logging and Quality Analysis System for Mobile Devices

    Directory of Open Access Journals (Sweden)

    Yong-Yi Fanjiang

    2017-01-01

    Full Text Available The testing phase of mobile device products includes two important test projects that must be completed before shipment: the field trial and the beta user trial. During the field trial, the product is certified based on its integration and stability with the local operator’s system, and, during the beta user trial, the product is certified by multiple users regarding its daily use, where the goal is to detect and solve early problems. In the traditional approach used to issue returns, testers must log into a web site, fill out a problem form, and then go through a browser or FTP to upload logs; however, this is inconvenient, and problems are reported slowly. Therefore, we propose an “automatic logging analysis system” (ALAS to construct a convenient test environment and, using a record analysis (log parser program, automate the parsing of log files and have questions automatically sent to the database by the system. Finally, the mean time between failures (MTBF is used to establish measurement indicators for the beta user trial.
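
    The MTBF indicator at the end of this record can be made concrete with a small sketch; the log record format below is an assumption, not the ALAS schema:

        from datetime import datetime

        def mtbf_hours(records):
            """records: list of (ISO-8601 timestamp string, event) tuples."""
            times = [datetime.fromisoformat(t) for t, _ in records]
            failures = sum(1 for _, event in records if event == "FAILURE")
            if failures == 0 or len(times) < 2:
                return float("inf")
            total_hours = (max(times) - min(times)).total_seconds() / 3600.0
            return total_hours / failures  # mean time between failures

        logs = [("2017-01-01T00:00:00", "BOOT"),
                ("2017-01-03T06:00:00", "FAILURE"),
                ("2017-01-05T12:00:00", "FAILURE")]
        print(round(mtbf_hours(logs), 1))  # 54.0 hours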

  16. Automatic welding processes for reactor coolant pipes used in PWR type nuclear power plant

    International Nuclear Information System (INIS)

    Hamada, T.; Nakamura, A.; Nagura, Y.; Sakamoto, N.

    1979-01-01

    The authors developed automatic welding processes (submerged arc welding process and TIG welding process) for application to the welding of reactor coolant pipes which constitute the most important part of the PWR type nuclear power plant. Submerged arc welding process is suitable for flat position welding in which pipes can be rotated, while TIG welding process is suitable for all position welding. This paper gives an outline of the two processes and the results of tests performed using these processes. (author)

  17. Automatic generation of optimal business processes from business rules

    NARCIS (Netherlands)

    Steen, B.; Ferreira Pires, Luis; Iacob, Maria Eugenia

    2010-01-01

    In recent years, business process models are increasingly being used as a means for business process improvement. Business rules can be seen as requirements for business processes, in that they describe the constraints that must hold for business processes that implement these business rules.

  18. Automatic Derivation of Statistical Data Analysis Algorithms: Planetary Nebulae and Beyond

    Science.gov (United States)

    Fischer, Bernd; Hajian, Arsen; Knuth, Kevin; Schumann, Johann

    2004-04-01

    AUTOBAYES is a fully automatic program synthesis system for the data analysis domain. Its input is a declarative problem description in the form of a statistical model; its output is documented and optimized C/C++ code. The synthesis process relies on the combination of three key techniques. Bayesian networks are used as a compact internal representation mechanism which enables problem decompositions and guides the algorithm derivation. Program schemas are used as independently composable building blocks for the algorithm construction; they can encapsulate advanced algorithms and data structures. A symbolic-algebraic system is used to find closed-form solutions for problems and emerging subproblems. In this paper, we describe the application of AUTOBAYES to the analysis of planetary nebulae images taken by the Hubble Space Telescope. We explain the system architecture, and present in detail the automatic derivation of the scientists' original analysis as well as a refined analysis using clustering models. This study demonstrates that AUTOBAYES is now mature enough that it can be applied to realistic scientific data analysis tasks.

  19. Automatic rebuilding and optimization of crystallographic structures in the Protein Data Bank.

    Science.gov (United States)

    Joosten, Robbie P; Joosten, Krista; Cohen, Serge X; Vriend, Gert; Perrakis, Anastassis

    2011-12-15

    Macromolecular crystal structures in the Protein Data Bank (PDB) are a key source of structural insight into biological processes. These structures, some >30 years old, were constructed with methods of their era. With PDB_REDO, we aim to automatically optimize these structures to better fit their corresponding experimental data, passing the benefits of new methods in crystallography on to a wide base of non-crystallographer structure users. We developed new algorithms to allow automatic rebuilding and remodeling of main chain peptide bonds and side chains in crystallographic electron density maps, and incorporated these and further enhancements in the PDB_REDO procedure. Applying the updated PDB_REDO to the oldest, but also to some of the newest models in the PDB, corrects existing modeling errors and brings these models to a higher quality, as judged by standard validation methods. The PDB_REDO database and links to all software are available at http://www.cmbi.ru.nl/pdb_redo. r.joosten@nki.nl; a.perrakis@nki.nl Supplementary data are available at Bioinformatics online.

  20. Covariance data processing code. ERRORJ

    International Nuclear Information System (INIS)

    Kosako, Kazuaki

    2001-01-01

    The covariance data processing code, ERRORJ, was developed to process the covariance data of JENDL-3.2. ERRORJ has the processing functions of covariance data for cross sections including resonance parameters, angular distribution and energy distribution. (author)

  1. Internal plasma diagnostic with a multichannel magnetic probe system using automatic data acquisition

    International Nuclear Information System (INIS)

    Korten, M.; Carolan, P.G.; Sand, F.; Waelbroeck, F.

    1975-04-01

    A 20-channel magnetic probe system inserted into the plasma is used to measure spatial distributions of poloidal and toroidal magnetic fields in the pulsed toroidal high β-experiment TEE. Plasma parameters, e.g. the β-value, toroidal current density and radial pressure distribution were derived applying static equilibrium theory and can be calculated from the measurements. A data acquisition system used in conjuction with a process computer was operated to obtain the experimental data automatically and to perform the multiple computational tasks. The program system described was built to serve as a first stage of a more common software system applicable for computational data handling for different diagnostics of a plasma physics confinement experiment. (orig.) [de

  2. [Use of nondeclarative and automatic memory processes in motor learning: how to mitigate the effects of aging].

    Science.gov (United States)

    Chauvel, Guillaume; Maquestiaux, François; Didierjean, André; Joubert, Sven; Dieudonné, Bénédicte; Verny, Marc

    2011-12-01

    Does normal aging inexorably lead to diminished motor learning abilities? This article provides an overview of the literature on the question, with particular emphasis on the functional dissociation between two sets of memory processes: declarative, effortful processes, and non-declarative, automatic processes. There is abundant evidence suggesting that aging does impair learning when past memories of former actions are required (episodic memory) and recollected through controlled processing (working memory). However, other studies have shown that aging does not impair learning when motor actions are performed non verbally and automatically (tapping procedural memory). These findings led us to hypothesize that one can minimize the impact of aging on the ability to learn new motor actions by favouring procedural learning. Recent data validating this hypothesis are presented. Our findings underline the importance of developing new motor learning strategies, which "bypass" declarative, effortful memory processes.

  3. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    Science.gov (United States)

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes.
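
    For reference, the standard process dissociation estimates (Jacoby, 1991) that underlie such model fits can be written out; assuming independence of conscious (C) and automatic (A) influences:

        Inclusion = C + A(1 - C)
        Exclusion = A(1 - C)

        =>  C = Inclusion - Exclusion
            A = Exclusion / (1 - C)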

  4. Automatic extraction of pavement markings on streets from point cloud data of mobile LiDAR

    International Nuclear Information System (INIS)

    Gao, Yang; Zhong, Ruofei; Liu, Xianlin; Tang, Tao; Wang, Liuzhao

    2017-01-01

    Pavement markings provide an important foundation as they help to keep road users safe. Accurate and comprehensive information about pavement markings assists road regulators and is useful in developing driverless technology. Mobile light detection and ranging (LiDAR) systems offer new opportunities to collect and process accurate pavement marking information. Mobile LiDAR systems can directly obtain the three-dimensional (3D) coordinates and intensity of objects in a fast and efficient way. The RGB attribute information of data points can be obtained from the panoramic camera in the system. In this paper, we present a novel method to automatically extract pavement markings using multiple attributes of the laser scanning point cloud from mobile LiDAR data. This method utilizes the differential grayscale of the RGB color, the laser pulse reflection intensity, and the differential intensity to identify and extract pavement markings. We utilized point cloud density to remove noise and used morphological operations to eliminate errors. In the application, we tested our method on different sections of roads in Beijing, China, and Buffalo, NY, USA. The results indicated that both correctness (p) and completeness (r) were higher than 90%. The method of this research can be applied to extract pavement markings from the huge point cloud data produced by mobile LiDAR. (paper)

  5. Automatic extraction of pavement markings on streets from point cloud data of mobile LiDAR

    Science.gov (United States)

    Gao, Yang; Zhong, Ruofei; Tang, Tao; Wang, Liuzhao; Liu, Xianlin

    2017-08-01

    Pavement markings provide an important foundation as they help to keep road users safe. Accurate and comprehensive information about pavement markings assists road regulators and is useful in developing driverless technology. Mobile light detection and ranging (LiDAR) systems offer new opportunities to collect and process accurate pavement marking information. Mobile LiDAR systems can directly obtain the three-dimensional (3D) coordinates and intensity of objects in a fast and efficient way. The RGB attribute information of data points can be obtained from the panoramic camera in the system. In this paper, we present a novel method to automatically extract pavement markings using multiple attributes of the laser scanning point cloud from mobile LiDAR data. This method utilizes the differential grayscale of the RGB color, the laser pulse reflection intensity, and the differential intensity to identify and extract pavement markings. We utilized point cloud density to remove noise and used morphological operations to eliminate errors. In the application, we tested our method on different sections of roads in Beijing, China, and Buffalo, NY, USA. The results indicated that both correctness (p) and completeness (r) were higher than 90%. The method of this research can be applied to extract pavement markings from the huge point cloud data produced by mobile LiDAR.
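
    The intensity-based extraction idea shared by both versions of this record can be sketched on a rasterized intensity grid; the thresholds and window sizes below are illustrative assumptions, not the paper's calibrated values:

        import numpy as np
        from scipy import ndimage

        def extract_markings(intensity_img, road_mask, thresh=0.7):
            """intensity_img: 2D array of rasterized pulse intensity in [0, 1]."""
            candidates = (intensity_img > thresh) & road_mask   # bright returns only
            # remove isolated noise points via a local-density criterion
            density = ndimage.uniform_filter(candidates.astype(float), size=5)
            dense = candidates & (density > 0.3)
            # morphological opening eliminates remaining speckle errors
            return ndimage.binary_opening(dense, structure=np.ones((3, 3)))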

  6. Development of an automatic sample changer and a data acquisition system

    International Nuclear Information System (INIS)

    Bianchini, Ricardo M.; Estevez, Jorge; Vollmer, Alberto E.; Iglicki, Flora A.

    1999-01-01

    An automatic electro-pneumatic sample changer with a rotating sample holder is described. The changer is coupled through an electronic interface with the data acquisition station. The software to automate the system has been designed. (author)

  7. Roadway system assessment using bluetooth-based automatic vehicle identification travel time data.

    Science.gov (United States)

    2012-12-01

    This monograph is an exposition of several practice-ready methodologies for automatic vehicle identification (AVI) data collection : systems. This includes considerations in the physical setup of the collection system as well as the interpretation of...

  8. Towards Automatic Classification of Exoplanet-Transit-Like Signals: A Case Study on Kepler Mission Data

    Science.gov (United States)

    Valizadegan, Hamed; Martin, Rodney; McCauliff, Sean D.; Jenkins, Jon Michael; Catanzarite, Joseph; Oza, Nikunj C.

    2015-08-01

    Building new catalogues of planetary candidates, astrophysical false alarms, and non-transiting phenomena is a challenging task that currently requires a reviewing team of astrophysicists and astronomers. These scientists need to examine more than 100 diagnostic metrics and associated graphics for each candidate exoplanet-transit-like signal to classify it into one of the three classes. Considering that the NASA Explorer Program's TESS mission and ESA's PLATO mission survey an even larger area of space, the classification of their transit-like signals is more time-consuming for human agents and becomes a bottleneck to constructing the new catalogues in a timely manner. This encourages building automatic classification tools that can quickly and reliably classify the new signal data from these missions. The standard tool for building automatic classification systems is supervised machine learning, which requires a large set of highly accurate labeled examples in order to build an effective classifier. This requirement cannot be easily met for classifying transit-like signals because not only are existing labeled signals very limited, but also the current labels may not be reliable (because the labeling process is a subjective task). Our experiments with using different supervised classifiers to categorize transit-like signals verify that the labeled signals are not rich enough to provide the classifier with enough power to generalize well beyond the observed cases (e.g. to unseen or test signals). That motivated us to utilize a new category of learning techniques, so-called semi-supervised learning, that combines the label information from the costly labeled signals, and distribution information from the cheaply available unlabeled signals in order to construct more effective classifiers. Our study on the Kepler Mission data shows that semi-supervised learning can significantly improve the result of multiple base classifiers (e.g. Support Vector Machines, Ada
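
    A minimal illustration of the semi-supervised idea (not the authors' pipeline) is a graph-based label-spreading model in scikit-learn, where unlabeled signals carry the label -1 and the synthetic features stand in for the diagnostic metrics:

        import numpy as np
        from sklearn.semi_supervised import LabelSpreading

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 5))              # 200 signals, 5 toy metrics
        y = np.where(X[:, 0] > 0, 1, 0)            # hidden ground truth
        y_train = y.copy()
        y_train[rng.random(200) < 0.9] = -1        # 90% of labels withheld

        # unlabeled points contribute distribution information via the knn graph
        model = LabelSpreading(kernel="knn", n_neighbors=7).fit(X, y_train)
        print("transductive accuracy:", (model.transduction_ == y).mean())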

  9. Automatic Data Filter Customization Using a Genetic Algorithm

    Science.gov (United States)

    Mandrake, Lukas

    2013-01-01

    This work predicts whether a retrieval algorithm will usefully determine CO2 concentration from an input spectrum of GOSAT (Greenhouse Gases Observing Satellite). This was done to eliminate needless runtime on atmospheric soundings that would never yield useful results. A space of 50 dimensions was examined for predictive power on the final CO2 results. Retrieval algorithms are frequently expensive to run, and wasted effort defeats requirements and expends needless resources. This algorithm could be used to help predict and filter unneeded runs in any computationally expensive regime. Traditional methods such as Fisher discriminant analysis and decision trees can attempt to predict whether a sounding will be properly processed. However, this work sought to detect a subsection of the dimensional space that can be simply filtered out to eliminate unwanted runs. LDAs (linear discriminant analyses) and other systems examine the entire dataset and judge a "best fit," giving equal weight to complex and problematic regions as well as simple, clear-cut regions. In this implementation, a genetic space of "left" and "right" thresholds outside of which all data are rejected was defined. These left/right pairs are created for each of the 50 input dimensions. A genetic algorithm then runs through countless potential filter settings using a JPL computer cluster, optimizing the tossed-out data's yield (proper vs. improper run removal) and the number of points tossed. This solution is robust to an arbitrary decision boundary within the data and avoids the global optimization problem of whole-dataset fitting using LDA or decision trees. It filters out runs that would not have produced useful CO2 values to save needless computation. This would be an algorithmic preprocessing improvement to any computationally expensive system.
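
    The filter representation is easy to sketch: one (left, right) threshold pair per input dimension, with data outside any interval rejected. The fitness trade-off weight and mutation scheme below are assumptions, not the JPL implementation:

        import numpy as np

        rng = np.random.default_rng(2)

        def fitness(filt, X, good):
            """filt: (D, 2) thresholds; good: bool array, True = useful sounding."""
            keep = np.all((X >= filt[:, 0]) & (X <= filt[:, 1]), axis=1)
            rejected_bad = (~keep & ~good).sum()   # wasted runs avoided
            rejected_good = (~keep & good).sum()   # useful runs lost (penalized)
            return rejected_bad - 5.0 * rejected_good

        def evolve(X, good, pop_size=40, gens=100):
            d = X.shape[1]
            lo, hi = X.min(0), X.max(0)
            pop = rng.uniform(lo[:, None], hi[:, None], (pop_size, d, 2))
            pop.sort(axis=2)                       # ensure left <= right
            for _ in range(gens):
                scores = np.array([fitness(f, X, good) for f in pop])
                parents = pop[np.argsort(scores)[-pop_size // 2:]]  # keep best half
                children = parents + rng.normal(0, 0.05, parents.shape) * (hi - lo)[:, None]
                pop = np.concatenate([parents, children])
                pop.sort(axis=2)
            return max(pop, key=lambda f: fitness(f, X, good))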

  10. Automatic data-driven real-time segmentation and recognition of surgical workflow.

    Science.gov (United States)

    Dergachyova, Olga; Bouget, David; Huaulmé, Arnaud; Morandi, Xavier; Jannin, Pierre

    2016-06-01

    With the intention of extending the perception and action of surgical staff inside the operating room, the medical community has expressed a growing interest towards context-aware systems. Requiring an accurate identification of the surgical workflow, such systems make use of data from a diverse set of available sensors. In this paper, we propose a fully data-driven and real-time method for segmentation and recognition of surgical phases using a combination of video data and instrument usage signals, exploiting no prior knowledge. We also introduce new validation metrics for assessment of workflow detection. The segmentation and recognition are based on a four-stage process. Firstly, during the learning time, a Surgical Process Model is automatically constructed from data annotations to guide the following process. Secondly, data samples are described using a combination of low-level visual cues and instrument information. Then, in the third stage, these descriptions are employed to train a set of AdaBoost classifiers capable of distinguishing one surgical phase from others. Finally, AdaBoost responses are used as input to a Hidden semi-Markov Model in order to obtain a final decision. On the MICCAI EndoVis challenge laparoscopic dataset we achieved a precision and a recall of 91 % in classification of 7 phases. Compared to the analysis based on one data type only, a combination of visual features and instrument signals allows better segmentation, reduction of the detection delay and discovery of the correct phase order.

  11. Automatic pre-processing for an object-oriented distributed hydrological model using GRASS-GIS

    Science.gov (United States)

    Sanzana, P.; Jankowfsky, S.; Branger, F.; Braud, I.; Vargas, X.; Hitschfeld, N.

    2012-04-01

    Landscapes are very heterogeneous, which impacts the hydrological processes occurring in the catchments, especially in the modeling of peri-urban catchments. The Hydrological Response Units (HRUs), resulting from the intersection of different maps, such as land use, soil types and geology, and flow networks, allow the representation of these elements in an explicit way, preserving natural and artificial contours of the different layers. These HRUs are used as the model mesh in some distributed object-oriented hydrological models, allowing the application of a topologically oriented approach. The connectivity between polygons and polylines provides a detailed representation of the water balance and overland flow in these distributed hydrological models, based on irregular hydro-landscape units. When computing fluxes between these HRUs, geometrical parameters, such as the distance between the centroid of the HRUs and the river network, and the length of the perimeter, can impact the realism of the calculated overland, sub-surface and groundwater fluxes. Therefore, it is necessary to process the original model mesh in order to avoid these numerical problems. We present an automatic pre-processing implemented in the open source GRASS-GIS software, for which several Python scripts or some algorithms already available were used, such as the Triangle software. First, some scripts were developed to improve the topology of the various elements, such as snapping of the river network to the closest contours. When data are derived with remote sensing, such as vegetation areas, their perimeter has lots of right angles that were smoothed. Second, the algorithms more particularly address badly-shaped elements of the model mesh such as polygons with narrow shapes, marked irregular contours and/or the centroid outside of the polygons. To identify these elements we used shape descriptors. The convexity index was considered the best descriptor to identify them with a threshold
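
    The convexity index mentioned at the (truncated) end of this record is easy to state: the ratio of a polygon's area to the area of its convex hull, which equals 1.0 for convex shapes. A small sketch with shapely, using an illustrative 0.8 cutoff rather than the paper's calibrated threshold:

        from shapely.geometry import Polygon

        def convexity_index(polygon):
            return polygon.area / polygon.convex_hull.area

        hru = Polygon([(0, 0), (4, 0), (4, 4), (2, 1), (0, 4)])  # concave example
        if convexity_index(hru) < 0.8:            # 10 / 16 = 0.625 here
            print("badly-shaped HRU, candidate for mesh correction")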

  12. Historical Patterns Based on Automatically Extracted Data: the Case of Classical Composers

    DEFF Research Database (Denmark)

    Borowiecki, Karol; O'Hagan, John

    2012-01-01

    The purpose of this paper is to demonstrate the potential for generating interesting aggregate data on certain aspects of the lives of thousands of composers, and indeed other creative groups, from large on-line dictionaries, and to be able to do so relatively quickly. A purpose-built java application that automatically extracts and processes information was developed to generate data on the birth location, occupations and importance (using word count methods) of over 12,000 composers over six centuries. Quantitative measures of the relative importance of different types of music and of the different music instruments over the centuries were also generated. Finally, quantitative indicators of the importance of different cities over the different centuries in the lives of these composers are constructed. A range of interesting findings emerge in relation to all of these aspects of the lives...

  13. Automatic denoising of functional MRI data: combining independent component analysis and hierarchical fusion of classifiers.

    Science.gov (United States)

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M

    2014-04-15

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has been shown to be a powerful technique in identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject "at rest"). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing "signal" (brain activity) can be distinguished from the "noise" components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour-intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX ("FMRIB's ICA-based X-noiseifier"), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets. The noise components can then be subtracted from (or regressed out of) the original
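
    One temporal feature of the kind described above (the fraction of spectral power at high frequencies) can be sketched directly; the repetition time and 0.1 Hz cutoff are illustrative assumptions:

        import numpy as np

        def high_freq_fraction(timeseries, tr_s=0.72, cutoff_hz=0.1):
            # power spectrum of the demeaned component time series
            power = np.abs(np.fft.rfft(timeseries - timeseries.mean())) ** 2
            freqs = np.fft.rfftfreq(len(timeseries), d=tr_s)
            return power[freqs > cutoff_hz].sum() / power.sum()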

  14. AUTOMATIC 3D BUILDING MODEL GENERATION FROM LIDAR AND IMAGE DATA USING SEQUENTIAL MINIMUM BOUNDING RECTANGLE

    Directory of Open Access Journals (Sweden)

    E. Kwak

    2012-07-01

    Full Text Available A digital building model is an important component in many applications such as city modelling, natural disaster planning, and aftermath evaluation. The importance of accurate and up-to-date building models has been discussed by many researchers, and many different approaches for efficient building model generation have been proposed. They can be categorised according to the data source used, the data processing strategy, and the amount of human interaction. In terms of data source, due to the limitations of using single-source data, integration of multi-sensor data is desired since it preserves the advantages of the involved datasets. Aerial imagery and LiDAR data are among the commonly combined sources to obtain 3D building models with good vertical accuracy from laser scanning and good planimetric accuracy from aerial images. The most used data processing strategies are data-driven and model-driven ones. Theoretically one can model any shape of building using data-driven approaches, but in practice this leaves the question of how to impose constraints and set the rules during the generation process. Due to the complexity of the implementation of the data-driven approaches, model-based approaches draw the attention of the researchers. However, the major drawback of model-based approaches is that the establishment of representative models involves a manual process that requires human intervention. Therefore, the objective of this research work is to automatically generate building models using the Minimum Bounding Rectangle algorithm and sequentially adjusting them to combine the advantages of image and LiDAR datasets.

  15. Automatic Modelling of Rubble Mound Breakwaters from LIDAR Data

    Science.gov (United States)

    Bueno, M.; Díaz-Vilariño, L.; González-Jorge, H.; Martínez-Sánchez, J.; Arias, P.

    2015-08-01

    Rubble mound breakwater maintenance is critical to the protection of beaches and ports. LiDAR systems provide accurate point clouds from the emerged part of the structure that can be modelled to make it more useful and easy to handle. This work introduces a methodology for the automatic modelling of breakwaters with armour units of cube shape. The algorithm is divided into three main steps: normal vector computation, plane segmentation, and cube reconstruction. Plane segmentation uses the normal orientation of the points and the edge length of the cube. Cube reconstruction uses the intersection of three perpendicular planes and the edge length. Three point clouds cropped from the main point cloud of the structure are used for the tests. The number of cubes detected is around 56% of the total physical cubes for two of the point clouds and 32% for the third. Accuracy assessment is done by comparison with manually drawn cubes, calculating the differences between the vertices. It ranges between 6.4 cm and 15 cm. Computing time ranges between 578.5 s and 8018.2 s. The computing time increases with the number of cubes and the requirements of collision detection.
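
    The cube reconstruction step rests on a small geometric fact: a cube vertex is the unique point shared by three mutually perpendicular planes n_i . x = d_i. A minimal sketch of that intersection:

        import numpy as np

        def plane_intersection(normals, offsets):
            """normals: (3, 3) unit normals; offsets: (3,) plane distances."""
            return np.linalg.solve(np.asarray(normals), np.asarray(offsets))

        vertex = plane_intersection([[1, 0, 0], [0, 1, 0], [0, 0, 1]], [2, 3, 4])
        print(vertex)  # [2. 3. 4.]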

  16. A Development Process for Enterprise Information Systems Based on Automatic Generation of the Components

    Directory of Open Access Journals (Sweden)

    Adrian ALEXANDRESCU

    2008-01-01

    Full Text Available This paper contains some ideas concerning Enterprise Information Systems (EIS) development. It combines known elements from the software engineering domain with original elements that the author has conceived and experimented with. The author has pursued two major objectives: to use a simple description for the concepts of an EIS, and to achieve a rapid and reliable EIS development process with minimal cost. The first goal was achieved by defining some models, which describe the conceptual elements of the EIS domain: entities, events, actions, states and attribute-domains. The second goal is based on a predefined architectural model for the EIS, on predefined analysis and design models for the elements of the domain, and finally on the automatic generation of the system components. The proposed methods do not depend on a special programming language or a database management system. They are general and may be applied to any combination of such technologies.

  17. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David; Gereige, Issam; Gourgon, Cécile

    2013-01-01

    patterns, sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications

  18. Towards automatic parameter tuning of stream processing systems

    KAUST Repository

    Bilal, Muhammad; Canini, Marco

    2017-01-01

    for automating parameter tuning for stream-processing systems. Our framework supports standard black-box optimization algorithms as well as a novel gray-box optimization algorithm. We demonstrate the multiple benefits of automated parameter tuning in optimizing

  19. Measuring automatic retrieval: a comparison of implicit memory, process dissociation, and speeded response procedures.

    Science.gov (United States)

    Horton, Keith D; Wilson, Daryl E; Vonk, Jennifer; Kirby, Sarah L; Nielsen, Tina

    2005-07-01

    Using the stem completion task, we compared estimates of automatic retrieval from an implicit memory task, the process dissociation procedure, and the speeded response procedure. Two standard manipulations were employed. In Experiment 1, a depth of processing effect was found on automatic retrieval using the speeded response procedure although this effect was substantially reduced in Experiment 2 when lexical processing was required of all words. In Experiment 3, the speeded response procedure showed an advantage of full versus divided attention at study on automatic retrieval. An implicit condition showed parallel effects in each study, suggesting that implicit stem completion may normally provide a good estimate of automatic retrieval. Also, we replicated earlier findings from the process dissociation procedure, but estimates of automatic retrieval from this procedure were consistently lower than those from the speeded response procedure, except when conscious retrieval was relatively low. We discuss several factors that may contribute to the conflicting outcomes, including the evidence for theoretical assumptions and criterial task differences between implicit and explicit tests.

  20. Process Concepts for Semi-automatic Dismantling of LCD Televisions

    OpenAIRE

    Elo, Kristofer; Sundin, Erik

    2014-01-01

    There is a large variety of electrical and electronic equipment products, for example liquid crystal display television sets (LCD TVs), in the waste stream today. Many LCD TVs contain mercury, which is a challenge to treat at the recycling plants. Two currently used processes to recycle LCD TVs are automated shredding and manual disassembly. This paper aims to present concepts for semi-automated dismantling processes for LCD TVs in order to achieve higher productivity and flexibility, and in tu...

  1. Method for Processing Liver Spheroids Using an Automatic Tissue Processor

    Science.gov (United States)

    2016-05-01

    alcohol dehydration and hot liquid wax infiltration. After the water in the tissue is replaced with wax and cooled, it then becomes possible to cut ... effective for processing and preparing microscopy slides of liver spheroids. The general process involved formalin fixation, dehydration in a ... DPBS; formalin (37% neutral buffer formaldehyde); series of alcohol solutions: 70, 80, 95, and 100% ethanol in water; xylene

  2. Using Automatic Code Generation in the Attitude Control Flight Software Engineering Process

    Science.gov (United States)

    McComas, David; O'Donnell, James R., Jr.; Andrews, Stephen F.

    1999-01-01

    This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.

  3. BES offline data processing

    International Nuclear Information System (INIS)

    Xu Rongsheng; Lang Pengfei; Chen Yaqing

    1991-01-01

    Data production for BES offline analysis is described, the filter criteria for events are presented, and the division of the DST data (the reconstructed events) is introduced. Some physics signals first seen in BES in preliminary analyses are also shown, demonstrating the reliability of the BES data

  4. SEMI-AUTOMATIC CO-REGISTRATION OF PHOTOGRAMMETRIC AND LIDAR DATA USING BUILDINGS

    Directory of Open Access Journals (Sweden)

    C. Armenakis

    2012-07-01

    Full Text Available In this work, the co-registration steps between LiDAR and photogrammetric DSM 3D data are analyzed and a solution based on automated plane matching is proposed and implemented. For a robust 3D geometric transformation both planes and points are used. Initially, planes are chosen as the co-registration primitives. To confine the search space for the plane matching, a sequential automatic building matching is performed first. For matching buildings from the LiDAR and the photogrammetric data, a similarity objective function is formed based on the roof height difference (RHD), the 3D histogram of the building attributes, and the building boundary area of a building. A region growing algorithm based on a Triangulated Irregular Network (TIN) is implemented to extract planes from both datasets. Next, an automatic successive process for identifying and matching corresponding planes from the two datasets has been developed and implemented. It is based on the building boundary region and determines plane pairs through a robust matching process, thus eliminating outlier pairs. The selected correct plane pairs are the input data for the geometric transformation process. The 3D conformal transformation method in conjunction with the attitude quaternion is applied to obtain the transformation parameters using the normal vectors of the corresponding plane pairs. Following the mapping of one dataset onto the coordinate system of the other, the Iterative Closest Point (ICP) algorithm is then applied, using the corresponding building point clouds to further refine the transformation solution. The results indicate that the combination of planes and points improves the co-registration outcomes.
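
    The rotation part of such a transformation can also be estimated directly from the matched plane normals with an SVD-based (Kabsch) solution; the sketch below is that alternative, not the paper's quaternion formulation:

        import numpy as np

        def rotation_from_normals(src, dst):
            """src, dst: (N, 3) corresponding unit normal vectors."""
            h = src.T @ dst                           # cross-covariance of the normals
            u, _, vt = np.linalg.svd(h)
            d = np.sign(np.linalg.det(vt.T @ u.T))    # guard against reflections
            return vt.T @ np.diag([1.0, 1.0, d]) @ u.T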

  5. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    Science.gov (United States)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language \\cal{C}. The use of \\cal{C} allows for a natural and concise modeling of the business process and the associated security policy and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  6. Modular toolkit for Data Processing (MDP: a Python data processing framework

    Directory of Open Access Journals (Sweden)

    Tiziano Zito

    2009-01-01

    Full Text Available Modular toolkit for Data Processing (MDP is a data processing framework written in Python. From the user's perspective, MDP is a collection of supervised and unsupervised learning algorithms and other data processing units that can be combined into data processing sequences and more complex feed-forward network architectures. Computations are performed efficiently in terms of speed and memory requirements. From the scientific developer's perspective, MDP is a modular framework, which can easily be expanded. The implementation of new algorithms is easy and intuitive. The new implemented units are then automatically integrated with the rest of the library. MDP has been written in the context of theoretical research in neuroscience, but it has been designed to be helpful in any context where trainable data processing algorithms are used. Its simplicity on the user's side, the variety of readily available algorithms, and the reusability of the implemented units make it also a useful educational tool.
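
    A typical usage sketch (assuming the MDP API as documented in its tutorial) chains two nodes into a flow, trains it, and then executes it on data:

        import mdp
        import numpy as np

        x = np.random.random((1000, 20))   # 1000 observations, 20 variables
        # a feed-forward sequence: PCA reduction followed by slow feature analysis
        flow = mdp.Flow([mdp.nodes.PCANode(output_dim=5), mdp.nodes.SFANode()])
        flow.train(x)                      # trainable nodes are trained in order
        y = flow(x)                        # execute the whole sequence on the data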

  7. Data-driven automatic parking constrained control for four-wheeled mobile vehicles

    OpenAIRE

    Wenxu Yan; Jing Deng; Dezhi Xu

    2016-01-01

    In this article, a novel data-driven constrained control scheme is proposed for automatic parking systems. The design of the proposed scheme only depends on the steering angle and the orientation angle of the car, and it does not involve any model information of the car. Therefore, the proposed scheme-based automatic parking system is applicable to different kinds of cars. In order to further reduce the desired trajectory coordinate tracking errors, a coordinates compensation algorithm is als...

  8. Automatic processing of isotopic dilution curves obtained by precordial detection

    International Nuclear Information System (INIS)

    Verite, J.C.

    1973-01-01

    Dilution curves pose two distinct problems: that of their acquisition and that of their processing. A study devoted only to the latter aspect is presented here. Two important conditions had to be satisfied: the processing procedure, although applied to a single category of curves (isotopic dilution curves obtained by precordial detection), had to be as general as possible, and, to allow dissemination of the method, the equipment used had to be relatively modest and inexpensive. A simple method, treating the curve processing as a problem of process identification, was developed; it should enable the mean heart cavity volume and certain pulmonary circulation parameters to be determined. Considerable difficulties were encountered, limiting the value of the results obtained though not condemning the method itself. The curve processing question raises the problem of curve acquisition, i.e. the number of these curves and their meaning. A list of the difficulties encountered is followed by a set of possible solutions, a solution being understood to mean a curve processing combination in which the overlap between the two aspects of the problem is accounted for [fr

  9. 3D automatic segmentation method for retinal optical coherence tomography volume data using boundary surface enhancement

    Directory of Open Access Journals (Sweden)

    Yankui Sun

    2016-03-01

    Full Text Available With the introduction of spectral-domain optical coherence tomography (SD-OCT), much larger image datasets are routinely acquired compared to what was possible using the previous generation of time-domain OCT. Thus, there is a critical need for the development of three-dimensional (3D) segmentation methods for processing these data. We present here a novel 3D automatic segmentation method for retinal OCT volume data. Briefly, to segment a boundary surface, two OCT volume datasets are obtained by using a 3D smoothing filter and a 3D differential filter. Their linear combination is then calculated to generate new volume data with an enhanced boundary surface, where pixel intensity, boundary position information, and intensity changes on both sides of the boundary surface are used simultaneously. Next, preliminary discrete boundary points are detected from the A-Scans of the volume data. Finally, surface smoothness constraints and a dynamic threshold are applied to obtain a smoothed boundary surface by correcting a small number of error points. Our method can extract retinal layer boundary surfaces sequentially with a decreasing search region of volume data. We performed automatic segmentation on eight human OCT volume datasets acquired from a commercial Spectralis OCT system, where each volume contains 97 OCT B-Scan images with a resolution of 496×512 (each B-Scan comprising 512 A-Scans containing 496 pixels); experimental results show that this method can accurately segment seven layer boundary surfaces in normal as well as some abnormal eyes.
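
    The boundary-enhancement step (a linear combination of a smoothed volume and a differential volume) can be sketched as follows; the weights, the smoothing sigma, and the choice of axis are illustrative assumptions:

        import numpy as np
        from scipy import ndimage

        def enhance_boundary(volume, alpha=0.5, beta=0.5, sigma=1.5):
            """volume: 3D array indexed (B-scan, A-scan, depth)."""
            smoothed = ndimage.gaussian_filter(volume, sigma=sigma)
            axial_gradient = ndimage.sobel(volume, axis=2)  # intensity change with depth
            return alpha * smoothed + beta * axial_gradient

        def detect_boundary(enhanced):
            """Preliminary boundary depth per A-scan: the strongest response."""
            return np.argmax(enhanced, axis=2)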

  10. Application of parallel processing for automatic inspection of printed circuits

    International Nuclear Information System (INIS)

    Lougheed, R.M.

    1986-01-01

    Automated visual inspection of printed electronic circuits is a challenging application for image processing systems. Detailed inspection requires high speed analysis of gray scale imagery along with high quality optics, lighting, and sensing equipment. A prototype system has been developed and demonstrated at the Environmental Research Institute of Michigan (ERIM) for inspection of multilayer thick-film circuits. The central problem of real-time image processing is solved by a special-purpose parallel processor which includes a new high-speed Cytocomputer. In this chapter the inspection process and the algorithms used are summarized, along with the functional requirements of the machine vision system. Next, the parallel processor is described in detail and then performance on this application is given

  11. DEVELOPING A SPATIAL PROCESSING SERVICE FOR AUTOMATIC CALCULATION OF STORM INUNDATION

    Directory of Open Access Journals (Sweden)

    H. Jafari

    2017-09-01

    Full Text Available With the increase in urbanization, the surface of the earth and its climate are changing. These changes have resulted in more frequent flooding and storm inundation in urban areas. The challenges of flooding can be addressed through several computational procedures. Owing to their numerous advantages, accessible web services are a suitable format for determining storm inundation. Web services have facilitated the integration and interactivity of web applications and have made the interaction between machines more feasible. Web services enable heterogeneous software systems to communicate with each other. A Web Processing Service (WPS) makes it possible to process spatial data in different formats. In this study, we developed a WPS to automatically calculate the extent of storm inundation caused by rainfall in urban areas. The method we used for calculating the storm inundation is based on a simplified hydrologic model which estimates the final state of inundation. The simulation process and the water transfer between subcatchments are carried out in sequence, without user intervention. Implementing the processing functions as web services makes them reusable and applicable within other services. As a result, it avoids creating duplicate resources.

  12. Automatic Data Collection Design for Neural Networks Detection of ...

    African Journals Online (AJOL)

    Automated data collection is necessary to alleviate problems inherent in data collection for investigation of management frauds. Once we have gathered realistic data, several methods exist for proper analysis and detection of anomalous transactions. However, in Nigeria, collecting fraudulent data is relatively difficult ...

  13. Cooperative processing data bases

    Science.gov (United States)

    Hasta, Juzar

    1991-01-01

    Cooperative processing for the 1990's using client-server technology is addressed. The main theme is concepts of downsizing from mainframes and minicomputers to workstations on a local area network (LAN). This document is presented in view graph form.

  14. The Development from Effortful to Automatic Processing in Mathematical Cognition.

    Science.gov (United States)

    Kaye, Daniel B.; And Others

    This investigation capitalizes upon information processing models that depend upon measurement of the latency of response to a mathematical problem and the decomposition of reaction time (RT). Simple two-term addition problems were presented with possible solutions for true-false verification, and accuracy and RT were recorded. Total…

  15. Process for automatic filling of nuclear fuel rod cans

    International Nuclear Information System (INIS)

    Bezold, H.

    1977-01-01

    A drying section is inserted in the production line for the automation of the filling process for fuel rods with nuclear fuel pellets. The pellets are taken in a drum magazine to a drying furnace and then pushed out one after the other into the can to be filled. (TK) [de

  16. Automatic Processing of Changes in Facial Emotions in Dysphoria: A Magnetoencephalography Study.

    Science.gov (United States)

    Xu, Qianru; Ruohonen, Elisa M; Ye, Chaoxiong; Li, Xueqiao; Kreegipuu, Kairi; Stefanics, Gabor; Luo, Wenbo; Astikainen, Piia

    2018-01-01

    It is not known to what extent the automatic encoding and change detection of peripherally presented facial emotion is altered in dysphoria. The negative bias in automatic face processing in particular has rarely been studied. We used magnetoencephalography (MEG) to record automatic brain responses to happy and sad faces in dysphoric (Beck's Depression Inventory ≥ 13) and control participants. Stimuli were presented in a passive oddball condition, which allowed potential negative bias in dysphoria at different stages of face processing (M100, M170, and M300) and alterations of change detection (visual mismatch negativity, vMMN) to be investigated. The magnetic counterpart of the vMMN was elicited at all stages of face processing, indexing automatic deviance detection in facial emotions. The M170 amplitude was modulated by emotion, response amplitudes being larger for sad faces than happy faces. Group differences were found for the M300, and they were indexed by two different interaction effects. At the left occipital region of interest, the dysphoric group had larger amplitudes for sad than happy deviant faces, reflecting negative bias in deviance detection, which was not found in the control group. On the other hand, the dysphoric group showed no vMMN to changes in facial emotions, while the vMMN was observed in the control group at the right occipital region of interest. Our results indicate that there is a negative bias in automatic visual deviance detection, but also a general change detection deficit in dysphoria.

  17. automatic data collection design for neural networks detection

    African Journals Online (AJOL)

    Dr Obe

    Automated data collection is necessary to alleviate problems inherent in data collection for investigation ... (iv) costs the employing organisation assets ... velocity (rate of cash flow over a period of time) ... Mining and Knowledge Discovery ...

  18. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low-cost and large-scale nanostructure fabrication. This technique is based on a contact molding-demolding process that can produce a number of defects such as incomplete filling, negative patterns, and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. The results are independent of the device which captures the image (optical, confocal or electron microscope). The use of numerical images makes it possible to automate the detection and to compute a statistical analysis of defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures. © 2013 Elsevier B.V. All rights reserved.
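
    A minimal sketch of one way such a detector could work for a 1-D grating profile, assuming the defect criterion is deviation from the median period; the function name, thresholds, and criterion are illustrative, not taken from the paper:

```python
# Illustrative take on grating defect detection: estimate the period from the
# FFT, then flag periods that deviate from the median period profile.
import numpy as np

def defect_map(profile):
    """profile: 1-D intensity line scanned across the imprinted grating."""
    spec = np.abs(np.fft.rfft(profile - profile.mean()))
    k = max(spec[1:].argmax() + 1, 1)             # dominant non-DC frequency bin
    period = int(round(profile.size / k))         # samples per grating period
    n = profile.size // period
    blocks = profile[:n * period].reshape(n, period)
    template = np.median(blocks, axis=0)          # the "typical" period
    err = np.abs(blocks - template).mean(axis=1)  # deviation of each period
    return err > err.mean() + 3 * err.std()       # True where a defect is likely
```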

  19. Automatic Optimization of Hardware Accelerators for Image Processing

    OpenAIRE

    Reiche, Oliver; Häublein, Konrad; Reichenbach, Marc; Hannig, Frank; Teich, Jürgen; Fey, Dietmar

    2015-01-01

    In the domain of image processing, often real-time constraints are required. In particular, in safety-critical applications, such as X-ray computed tomography in medical imaging or advanced driver assistance systems in the automotive domain, timing is of utmost importance. A common approach to maintain real-time capabilities of compute-intensive applications is to offload those computations to dedicated accelerator hardware, such as Field Programmable Gate Arrays (FPGAs). Programming such arc...

  20. The Structure of Processing Resource Demands in Monitoring Automatic Systems.

    Science.gov (United States)

    1981-01-01

    Attempts at modelling the human failure detection process have continually focused on normative predictions of optimal operator behavior (Smallwood) ... from Broadbent's filter model (Broadbent, 1957), to Treisman's attenuation model (Treisman, 1964), to Norman's late selection model (Norman, 1968), the concept ... survey and a model. Acta Psychologica, 1967, 27, 84-92. Moray, N. Mental workload: Its theory and measurement. New York: Plenum Press, 1979.

  1. Automatic concrete cracks detection and mapping of terrestrial laser scan data

    Directory of Open Access Journals (Sweden)

    Mostafa Rabah

    2013-12-01

    The current paper presents a method for automatic concrete crack detection and mapping from data obtained during a laser scanning survey. The method comprises three steps: shading correction of the original image, crack detection, and crack mapping and processing. The detected crack is defined in a pixel coordinate system. To remap the crack into the referenced coordinate system, reverse engineering is used. This is achieved by a hybrid concept of terrestrial laser-scanner point clouds and the corresponding camera image, i.e. a conversion from the pixel coordinate system to the terrestrial laser-scanner or global coordinate system. The results of the experiment show that the mean differences between the terrestrial laser scan and the total station are about 30.5, 16.4 and 14.3 mm in the x, y and z directions, respectively.

  2. Study on Classification Accuracy Inspection of Land Cover Data Aided by Automatic Image Change Detection Technology

    Science.gov (United States)

    Xie, W.-J.; Zhang, L.; Chen, H.-P.; Zhou, J.; Mao, W.-J.

    2018-04-01

    The purpose of carrying out national geographic conditions monitoring is to obtain information on surface changes caused by human social and economic activities, so that the geographic information can be used to offer better services to government, enterprises and the public. Land cover data contains detailed geographic conditions information and has thus been listed as one of the important achievements of the national geographic conditions monitoring project. At present, the main issue in the production of land cover data is how to improve classification accuracy. For land cover data quality inspection and acceptance, classification accuracy is also an important check point. So far, classification accuracy inspection in the project has mainly been based on human-computer interaction or manual inspection, which is time consuming and laborious. By harnessing automatic high-resolution remote sensing image change detection technology based on the ERDAS IMAGINE platform, this paper carried out a classification accuracy inspection test of land cover data in the project and presents a corresponding technical route, which includes data pre-processing, change detection, result output and information extraction. The result of the quality inspection test shows the effectiveness of the technical route, which can meet the inspection needs for the two typical errors, namely missing and incorrect updates, effectively reduces the intensity of human-computer interaction inspection work for quality inspectors, and also provides a technical reference for the data production and quality control of land cover data.

  3. An Automatic and Real-time Restoration of Gamma Dose Data by Radio Telemetry

    International Nuclear Information System (INIS)

    Lee, Wan No; Kim, Hee Reyoung; Chung, Kun Ho; Cho, Young Hyun; Choi, Geun Sik; Lee, Chang Woo; Kim, Young Soo

    2006-01-01

    An on-line gamma monitoring system based on a high-pressure ionization chamber has been used for determining airborne doses surrounding the HANARO research reactor at KAERI (Korea Atomic Energy Research Institute). It is composed of a network of six monitoring stations and an on-line computer system. It has been operated by radio telemetry at a frequency of 468.8 MHz, transmitting the real-time dose data measured by a remote ion chamber to the central computer every ten seconds. Although radio telemetry has several advantages, such as effective and economical transmission, there is one main problem: data loss happens because each monitoring post stores only 300 radiation data points, covering the previous 50 minutes of sequential data at a recording interval of 10 seconds. It is possible to restore the lost data by an off-line process such as a floppy disk or portable memory disk, but this is an ineffective method for a real-time monitoring system. Restoration, storage, and display of the current data as well as the lost data are also difficult in the present system. In this paper, an automatic and real-time restoration method by radio telemetry is introduced.
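
    The restoration problem described here (a 300-point, 10-second ring buffer on each post) lends itself to a simple gap-detection-and-backfill scheme. The sketch below is a hypothetical illustration of that idea only; the names and data layout are assumptions, not the paper's design:

```python
# Hypothetical sketch: detect gaps in the central log and backfill them from
# a monitoring post's ring buffer while the data is still held there.
from datetime import timedelta

INTERVAL = timedelta(seconds=10)   # recording interval from the abstract
BUFFER_SIZE = 300                  # points kept on each post (~50 minutes)

def find_gaps(timestamps):
    """Return (start, end) pairs where expected 10 s samples are missing."""
    gaps = []
    times = sorted(timestamps)
    for prev, cur in zip(times, times[1:]):
        if cur - prev > INTERVAL:
            gaps.append((prev + INTERVAL, cur - INTERVAL))
    return gaps

def restore(central_log, post_buffer):
    """central_log: {timestamp: dose}; post_buffer: [(timestamp, dose), ...]."""
    held = dict(post_buffer)       # at most BUFFER_SIZE entries on the post
    for start, end in find_gaps(list(central_log)):
        t = start
        while t <= end:
            if t in held:          # restorable only inside the 50 min window
                central_log[t] = held[t]
            t += INTERVAL
    return central_log
```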

  4. Automatic Model-Based Generation of Parameterized Test Cases Using Data Abstraction

    NARCIS (Netherlands)

    Calamé, Jens R.; Ioustinova, Natalia; Romijn, J.M.T.; Smith, G.; van de Pol, Jan Cornelis

    2007-01-01

    Developing test suites is a costly and error-prone process. Model-based test generation tools facilitate this process by automatically generating test cases from system models. The applicability of these tools, however, depends on the size of the target systems. Here, we propose an approach to

  5. Automatic Control and Data Acquisition System for Combustion Laboratory Applications.

    Science.gov (United States)

    1982-10-01

    Approved for public release; distribution unlimited. Automatic Control and Data Acquisition System for ... unit. The CPU/ROM board includes a 16-bit microprocessor chip which decodes and executes all instructions and controls all data transfers. The 12K ... in the limited memory space of 32K of the HP-85 ... ACQDTA: 1) controls devices, 2) acquires photodiode outputs, 3) stores data on disc ...

  6. Development and Utility of Automatic Language Processing Technologies. Volume 2

    Science.gov (United States)

    2014-04-01

    ... the course of this task. The earlier one, A3Threshold, focuses on filtering sets of alignments using the reported score. Experiments run with this ... directories and parse out the scoring data. This information was then saved to a MySQL database so that the scores could be individually displayed and ... MT09 - Machine Translation Evaluation; MultiBLEU - an additional MT scoring metric based on BLEU; MySQL - ...

  7. MOPITT Near Real-Time Data for LANCE: Automatic Quality Assurance and Comparison to Operational Products

    Science.gov (United States)

    Martinez-Alonso, S.; Deeter, M. N.; Worden, H. M.; Ziskin, D.

    2017-12-01

    Terra-MOPITT (the Measurements of Pollution in the Troposphere instrument) near real-time (NRT) carbon monoxide (CO) products have been selected for distribution through NASA's LANCE (the Land, Atmosphere Near Real-Time Capability for EOS). MOPITT version 7 NRT data will be made publicly available within 3 hours from observation. The retrieval process is the same for both MOPITT NRT and operational products, albeit for the former it is constrained to use ancillary data available within the latency time. Among other requirements, LANCE NRT products must be examined for quality assurance (QA) purposes and relative errors between NRT and operational products must be quantified. Here we present an algorithm for automatic MOPITT NRT QA aimed to identify artifacts and separate those from anomalously high but real CO values. The algorithm is based on a comparison to the statistics of MOPITT operational products. We discuss the algorithm's performance when tested by applying it to three MOPITT datasets: a known (and corrected) artifact in version 4 operational data, anomalously high CO values in operational data during the 2015 Indonesia fires, and actual NRT data. Last, we describe results from a quantitative comparison between MOPITT NRT data and their operational counterparts.

  8. Development of Automatic Visceral Fat Volume Calculation Software for CT Volume Data

    Directory of Open Access Journals (Sweden)

    Mitsutaka Nemoto

    2014-01-01

    Objective. To develop automatic visceral fat volume calculation software for computed tomography (CT) volume data and to evaluate its feasibility. Methods. A total of 24 sets of whole-body CT volume data and anthropometric measurements were obtained, with three sets for each of four BMI categories (under 20, 20 to 25, 25 to 30, and over 30) in both sexes. True visceral fat volumes were defined on the basis of manual segmentation of the whole-body CT volume data by an experienced radiologist. Software to automatically calculate visceral fat volumes was developed using a region segmentation technique based on morphological analysis with a CT value threshold. Automatically calculated visceral fat volumes were evaluated in terms of the correlation coefficient with the true volumes and the error relative to the true volume. Results. Automatic visceral fat volume calculation results for all 24 data sets were obtained successfully, and the average calculation time was 252.7 seconds/case. The correlation coefficients between the true visceral fat volume and the automatically calculated visceral fat volume were over 0.999. Conclusions. The newly developed software is feasible for calculating visceral fat volumes in a reasonable time and proved to have high accuracy.
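
    A minimal sketch of threshold-plus-morphology segmentation in the spirit of this record, assuming a commonly used adipose HU window of -190 to -30 and a crude erosion step to approximate the region inside the abdominal wall; the paper's actual thresholds and morphological analysis are not reproduced:

```python
# Sketch: CT-value thresholding for fat, erosion to isolate the visceral
# compartment, and a size filter to suppress noise components.
import numpy as np
from scipy import ndimage

def visceral_fat_volume(ct, body_mask, voxel_volume_ml, lo=-190, hi=-30):
    """ct: 3-D HU array; body_mask: 3-D bool; returns volume in millilitres."""
    fat = (ct >= lo) & (ct <= hi) & body_mask          # CT value threshold
    inner = ndimage.binary_erosion(body_mask, iterations=10)  # inside the wall
    visceral = fat & inner
    labels, n = ndimage.label(visceral)                # connected components
    sizes = ndimage.sum(visceral, labels, range(1, n + 1))
    keep = np.isin(labels, np.nonzero(sizes > 50)[0] + 1)
    return keep.sum() * voxel_volume_ml
```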

  9. Necessary Processing of Personal Data

    DEFF Research Database (Denmark)

    Tranberg, Charlotte Bagger

    2006-01-01

    The Data Protection Directive prohibits processing of sensitive data (racial or ethnic origin, political, religious or philosophical convictions, trade union membership and information on health and sex life). All other personal data may be processed, provided processing is deemed necessary in re...

  10. Automatic Methods in Image Processing and Their Relevance to Map-Making.

    Science.gov (United States)

    1981-02-11

    ... folding frequency = .5) and s is the "shaping factor" which controls the spatial frequency content of the signal; the signal bandwidth increases ...

  11. Automatic/Control Processing Concepts and Their Implications for the Training of Skills.

    Science.gov (United States)

    1982-04-01

    ... driving a car are examples of automatic processes. Controlled processing is comparatively slow, serial, limited by short-term memory, and requires subject effort ... development has convinced us that motivation is often more important than ... automatic ... Motivation is much more ...

  12. Automatic and Controlled Processing in Sentence Recall: The Role of Long-Term and Working Memory

    Science.gov (United States)

    Jefferies, E.; Lambon Ralph, M.A.; Baddeley, A.D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley,…

  13. The Development of Automatic and Controlled Inhibitory Retrieval Processes in True and False Recall

    Science.gov (United States)

    Knott, Lauren M.; Howe, Mark L.; Wimmer, Marina C.; Dewhurst, Stephen A.

    2011-01-01

    In three experiments, we investigated the role of automatic and controlled inhibitory retrieval processes in true and false memory development in children and adults. Experiment 1 incorporated a directed forgetting task to examine controlled retrieval inhibition. Experiments 2 and 3 used a part-set cue and retrieval practice task to examine…

  14. Automatic Processing of Metallurgical Abstracts for the Purpose of Information Retrieval. Final Report.

    Science.gov (United States)

    Melton, Jessica S.

    The objectives of this project were to develop and test a method for automatically processing the text of abstracts for a document retrieval system. The test corpus consisted of 768 abstracts from the metallurgical section of Chemical Abstracts (CA). The system, based on a subject indexing rationale, had two components: (1) a stored dictionary of words…

  15. REALIZATION OF TRAINING PROGRAMME ON THE BASIS OF LINGUISTIC DATABASE FOR AUTOMATIC TEXTS PROCESSING SYSTEM

    Directory of Open Access Journals (Sweden)

    M. A. Makarych

    2016-01-01

    Due to the constant increase of electronic textual information, modern society needs automatic processing of natural language (NL). The main purpose of NL automatic text processing systems is to analyze and create texts and represent their content. The purpose of the paper is the development of the linguistic and software bases of an automatic system for processing English publicistic texts. This article discusses examples of different approaches to the creation of linguistic databases for processing systems. The author gives a detailed description of the basic building blocks of a new linguistic processor: lexical-semantic, syntactic and semantic-syntactic. The main advantage of the processor is the use of special semantic codes in the alphabetical dictionary. The semantic codes have been developed in accordance with a lexical-semantic classification. This helps to precisely define the semantic functions of keywords situated in parsing groups and allows the automatic system to avoid typical mistakes. The author also presents the realization of the developed linguistic database in the form of a training computer program.

  16. Examining the influence of psychopathy, hostility biases, and automatic processing on criminal offenders' Theory of Mind

    NARCIS (Netherlands)

    Nentjes, L.; Bernstein, D.; Arntz, A.; van Breukelen, G.; Slaats, M.

    2015-01-01

    Theory of Mind (ToM) is a social perceptual skill that refers to the ability to take someone else's perspective and infer what others think. The current study examined the effect of potential hostility biases, as well as controlled (slow) versus automatic (fast) processing on ToM performance in

  17. Automatic detection of adverse events to predict drug label changes using text and data mining techniques.

    Science.gov (United States)

    Gurulingappa, Harsha; Toldo, Luca; Rajput, Abdul Mateen; Kors, Jan A; Taweel, Adel; Tayrouz, Yorki

    2013-11-01

    The aim of this study was to assess the impact of automatically detected adverse event signals from text and open-source data on the prediction of drug label changes. Open-source adverse effect data were collected from the FAERS, Yellow Cards and SIDER databases. A shallow linguistic relation extraction system (JSRE) was applied for the extraction of adverse effects from MEDLINE case reports. A statistical approach was applied to the extracted datasets for signal detection and subsequent prediction of the label changes issued for 29 drugs by the UK Regulatory Authority in 2009. 76% of drug label changes were automatically predicted. Of these, 6% were detected only by text mining. JSRE enabled precise identification of four adverse drug events from MEDLINE that were undetectable otherwise. Changes in drug labels can be predicted automatically using data and text mining techniques. Text mining technology is mature and well placed to support pharmacovigilance tasks. Copyright © 2013 John Wiley & Sons, Ltd.

  18. Automatically identifying scatter in fluorescence data using robust techniques

    DEFF Research Database (Denmark)

    Engelen, S.; Frosch, Stina; Hubert, M.

    2007-01-01

    ... as input data for three different PARAFAC methods. Firstly, inserting missing values in the scatter regions is tested; secondly, an interpolation of the scatter regions is performed; and finally, the scatter regions are down-weighted. These results show that the PARAFAC method to choose after scatter...

  19. An Automatic Building Extraction and Regularisation Technique Using LiDAR Point Cloud Data and Orthoimage

    Directory of Open Access Journals (Sweden)

    Syed Ali Naqi Gilani

    2016-03-01

    The development of robust and accurate methods for automatic building detection and regularisation using multisource data continues to be a challenge due to point cloud sparsity, high spectral variability, differences among urban objects, surrounding complexity, and data misalignment. To address these challenges, constraints on an object's size, height, area, and orientation are generally imposed, which adversely affects detection performance. Buildings that are small in size, under shadows, or partly occluded are often discarded during the elimination of superfluous objects. To overcome these limitations, a methodology is developed to extract and regularise buildings using features from point clouds and orthoimagery. The building delineation process is carried out by identifying candidate building regions and segmenting them into grids. Vegetation elimination, building detection and extraction of their partially occluded parts are achieved by synthesising the point cloud and image data. Finally, the detected buildings are regularised by exploiting the image lines in the building regularisation process. The detection and regularisation processes have been evaluated using the ISPRS benchmark and four Australian data sets which differ in point density (1 to 29 points/m2), building sizes, shadows, terrain, and vegetation. Results indicate 83% to 93% per-area completeness with a correctness above 95%, demonstrating the robustness of the approach. The absence of over- and many-to-many segmentation errors in the ISPRS data set indicates that the technique has higher per-object accuracy. When compared with six existing similar methods, the proposed detection and regularisation approach performs significantly better on the more complex (Australian) data sets, in contrast to the ISPRS benchmark, where it does better than or equal to its counterparts.

  20. Evaluation of automatic face recognition for automatic border control on actual data recorded of travellers at Schiphol Airport

    NARCIS (Netherlands)

    Spreeuwers, Lieuwe Jan; Hendrikse, A.J.; Gerritsen, K.J.; Brömme, A.; Busch, C.

    2012-01-01

    Automatic border control at airports using automated facial recognition for checking the passport is becoming more and more common. A problem is that it is not clear how reliable these automatic gates are. Very few independent studies exist that assess the reliability of automated facial recognition

  1. Big Data Analysis of Manufacturing Processes

    Science.gov (United States)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.

  2. Big Data Analysis of Manufacturing Processes

    International Nuclear Information System (INIS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-01-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results. (paper)

  3. Memory biases in remitted depression: the role of negative cognitions at explicit and automatic processing levels.

    Science.gov (United States)

    Romero, Nuria; Sanchez, Alvaro; Vazquez, Carmelo

    2014-03-01

    Cognitive models propose that depression is caused by dysfunctional schemas that endure beyond the depressive episode, representing vulnerability factors for recurrence. However, research testing negative cognitions linked to dysfunctional schemas in formerly depressed individuals is still scarce. Furthermore, negative cognitions are presumed to be linked to biases in recalling negative self-referent information in formerly depressed individuals, but no studies have directly tested this association. In the present study, we evaluated differences between formerly and never-depressed individuals in several experimental indices of negative cognitions and their associations with the recall of emotional self-referent material. Formerly (n = 30) and never depressed individuals (n = 40) completed measures of explicit (i.e., scrambled sentence test) and automatic (i.e., lexical decision task) processing to evaluate negative cognitions. Furthermore participants completed a self-referent incidental recall task to evaluate memory biases. Formerly compared to never depressed individuals showed greater negative cognitions at both explicit and automatic levels of processing. Results also showed greater recall of negative self-referent information in formerly compared to never-depressed individuals. Finally, individual differences in negative cognitions at both explicit and automatic levels of processing predicted greater recall of negative self-referent material in formerly depressed individuals. Analyses of the relationship between explicit and automatic processing indices and memory biases were correlational and the majority of participants in both groups were women. Our findings provide evidence of negative cognitions in formerly depressed individuals at both automatic and explicit levels of processing that may confer a cognitive vulnerability to depression. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Parallelization and automatic data distribution for nuclear reactor simulations

    Energy Technology Data Exchange (ETDEWEB)

    Liebrock, L.M. [Liebrock-Hicks Research, Calumet, MI (United States)

    1997-07-01

    Detailed attempts at realistic nuclear reactor simulations currently take many times real time to execute on high performance workstations. Even the fastest sequential machine can not run these simulations fast enough to ensure that the best corrective measure is used during a nuclear accident to prevent a minor malfunction from becoming a major catastrophe. Since sequential computers have nearly reached the speed of light barrier, these simulations will have to be run in parallel to make significant improvements in speed. In physical reactor plants, parallelism abounds. Fluids flow, controls change, and reactions occur in parallel with only adjacent components directly affecting each other. These do not occur in the sequentialized manner, with global instantaneous effects, that is often used in simulators. Development of parallel algorithms that more closely approximate the real-world operation of a reactor may, in addition to speeding up the simulations, actually improve the accuracy and reliability of the predictions generated. Three types of parallel architecture (shared memory machines, distributed memory multicomputers, and distributed networks) are briefly reviewed as targets for parallelization of nuclear reactor simulation. Various parallelization models (loop-based model, shared memory model, functional model, data parallel model, and a combined functional and data parallel model) are discussed along with their advantages and disadvantages for nuclear reactor simulation. A variety of tools are introduced for each of the models. Emphasis is placed on the data parallel model as the primary focus for two-phase flow simulation. Tools to support data parallel programming for multiple component applications and special parallelization considerations are also discussed.

  5. Parallelization and automatic data distribution for nuclear reactor simulations

    International Nuclear Information System (INIS)

    Liebrock, L.M.

    1997-01-01

    Detailed attempts at realistic nuclear reactor simulations currently take many times real time to execute on high performance workstations. Even the fastest sequential machine can not run these simulations fast enough to ensure that the best corrective measure is used during a nuclear accident to prevent a minor malfunction from becoming a major catastrophe. Since sequential computers have nearly reached the speed of light barrier, these simulations will have to be run in parallel to make significant improvements in speed. In physical reactor plants, parallelism abounds. Fluids flow, controls change, and reactions occur in parallel with only adjacent components directly affecting each other. These do not occur in the sequentialized manner, with global instantaneous effects, that is often used in simulators. Development of parallel algorithms that more closely approximate the real-world operation of a reactor may, in addition to speeding up the simulations, actually improve the accuracy and reliability of the predictions generated. Three types of parallel architecture (shared memory machines, distributed memory multicomputers, and distributed networks) are briefly reviewed as targets for parallelization of nuclear reactor simulation. Various parallelization models (loop-based model, shared memory model, functional model, data parallel model, and a combined functional and data parallel model) are discussed along with their advantages and disadvantages for nuclear reactor simulation. A variety of tools are introduced for each of the models. Emphasis is placed on the data parallel model as the primary focus for two-phase flow simulation. Tools to support data parallel programming for multiple component applications and special parallelization considerations are also discussed

  6. Automatic generation of water distribution systems based on GIS data.

    Science.gov (United States)

    Sitzenfrei, Robert; Möderl, Michael; Rauch, Wolfgang

    2013-09-01

    In the field of water distribution system (WDS) analysis, case study research is needed for testing or benchmarking optimisation strategies and newly developed software. However, data availability for the investigation of real cases is limited due to time and cost needed for data collection and model setup. We present a new algorithm that addresses this problem by generating WDSs from GIS using population density, housing density and elevation as input data. We show that the resulting WDSs are comparable to actual systems in terms of network properties and hydraulic performance. For example, comparing the pressure heads for an actual and a generated WDS results in pressure head differences of ±4 m or less for 75% of the supply area. Although elements like valves and pumps are not included, the new methodology can provide water distribution systems of varying levels of complexity (e.g., network layouts, connectivity, etc.) to allow testing design/optimisation algorithms on a large number of networks. The new approach can be used to estimate the construction costs of planned WDSs aimed at addressing population growth or at comparisons of different expansion strategies in growth corridors.
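
    As a rough illustration of generating a network layout from gridded input data, the sketch below picks demand nodes where population density exceeds a threshold and connects them with a minimum spanning tree. This is only a stand-in for the published algorithm, which also uses housing density and elevation; all names and thresholds are assumptions:

```python
# Illustrative sketch: demand nodes from a density grid, connected by the
# cheapest (shortest total pipe length) spanning tree.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree

def generate_layout(density, cell_size=100.0, threshold=20.0):
    """density: 2-D persons-per-cell grid; returns node coords and pipe pairs."""
    ys, xs = np.nonzero(density > threshold)        # cells that need supply
    nodes = np.column_stack([xs, ys]) * cell_size   # coordinates in metres
    dist = cdist(nodes, nodes)                      # candidate pipe lengths
    mst = minimum_spanning_tree(dist).tocoo()       # minimal connecting tree
    pipes = list(zip(mst.row, mst.col))             # (node_i, node_j) links
    return nodes, pipes
```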

  7. Automatic car driving detection using raw accelerometry data.

    Science.gov (United States)

    Strączkiewicz, M; Urbanek, J K; Fadel, W F; Crainiceanu, C M; Harezlak, J

    2016-09-21

    Measuring physical activity using wearable devices has become increasingly popular. Raw data collected from such devices is usually summarized as 'activity counts', which combine information of human activity with environmental vibrations. Driving is a major sedentary activity that artificially increases the activity counts due to various car and body vibrations that are not connected to human movement. Thus, it has become increasingly important to identify periods of driving and quantify the bias induced by driving in activity counts. To address these problems, we propose a detection algorithm of driving via accelerometry (DADA), designed to detect time periods when an individual is driving a car. DADA is based on detection of vibrations generated by a moving vehicle and recorded by an accelerometer. The methodological approach is based on short-time Fourier transform (STFT) applied to the raw accelerometry data and identifies and focuses on frequency vibration ranges that are specific to car driving. We test the performance of DADA on data collected using wrist-worn ActiGraph devices in a controlled experiment conducted on 24 subjects. The median area under the receiver-operating characteristic curve (AUC) for predicting driving periods was 0.94, indicating an excellent performance of the algorithm. We also quantify the size of the bias induced by driving and obtain that per unit of time the activity counts generated by driving are, on average, 16% of the average activity counts generated during walking.
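
    The core of DADA, as described, is a short-time Fourier transform over raw accelerometry with attention to driving-specific frequency bands. A hypothetical sketch of that idea follows; the band limits and decision rule here are assumptions, not the paper's values:

```python
# Sketch of STFT band-energy scoring: the fraction of signal power inside an
# assumed car-vibration band, computed per analysis window.
import numpy as np
from scipy.signal import stft

def driving_score(acc, fs=100.0, band=(15.0, 25.0)):
    """acc: 1-D raw accelerometer magnitude; returns window times and scores."""
    f, t, Z = stft(acc, fs=fs, nperseg=int(fs * 5))   # 5 s analysis windows
    power = np.abs(Z) ** 2
    in_band = (f >= band[0]) & (f <= band[1])
    ratio = power[in_band].sum(axis=0) / (power.sum(axis=0) + 1e-12)
    return t, ratio          # threshold the ratio to flag likely driving
```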

  8. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter

    Directory of Open Access Journals (Sweden)

    Hili Eidlin-Levy

    2017-12-01

    The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD) are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development) performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter) while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.

  9. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter.

    Science.gov (United States)

    Eidlin-Levy, Hili; Rubinsten, Orly

    2017-01-01

    The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD) are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development) performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter) while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.

  10. GPU applications for data processing

    Energy Technology Data Exchange (ETDEWEB)

    Vladymyrov, Mykhailo, E-mail: mykhailo.vladymyrov@cern.ch [LPI - Lebedev Physical Institute of the Russian Academy of Sciences, RUS-119991 Moscow (Russian Federation); Aleksandrov, Andrey [LPI - Lebedev Physical Institute of the Russian Academy of Sciences, RUS-119991 Moscow (Russian Federation); INFN sezione di Napoli, I-80125 Napoli (Italy); Tioukov, Valeri [INFN sezione di Napoli, I-80125 Napoli (Italy)

    2015-12-31

    Modern experiments that use nuclear photoemulsion require fast and efficient data acquisition from the emulsion. The new approaches in developing scanning systems require real-time processing of large amounts of data. Methods that use Graphical Processing Unit (GPU) computing power for emulsion data processing are presented here. It is shown how GPU-accelerated emulsion processing helped us to raise the scanning speed by a factor of nine.

  11. An algorithm for discovering Lagrangians automatically from data

    Directory of Open Access Journals (Sweden)

    Daniel J.A. Hills

    2015-11-01

    An activity fundamental to science is building mathematical models. These models are used both to predict the results of future experiments and to gain insight into the structure of the system under study. We present an algorithm that automates the model-building process in a scientifically principled way. The algorithm can take observed trajectories from a wide variety of mechanical systems and, without any other prior knowledge or tuning of parameters, predict the future evolution of the system. It does this by applying the principle of least action and searching for the simplest Lagrangian that describes the system's behaviour. By generating this Lagrangian in a human-interpretable form, it can also provide insight into the workings of the system.
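
    A toy version of the principle: if the kinetic term is fixed to q̇²/2, the Euler-Lagrange equation reduces to q̈ = -V'(q), and V can then be recovered by linear least squares over a small basis. The published algorithm searches a far larger space of candidate Lagrangians; this sketch only conveys the idea:

```python
# Fit V'(q) = sum_k c_k q^k so that the observed acceleration satisfies
# qdd ≈ -V'(q), i.e. the Euler-Lagrange equation for L = qdot^2/2 - V(q).
import numpy as np

def fit_potential(q, qdd, degree=4):
    basis = np.column_stack([q ** k for k in range(degree + 1)])
    coeffs, *_ = np.linalg.lstsq(basis, -qdd, rcond=None)
    return coeffs

# Example: harmonic-oscillator data, q(t) = cos(t), so qdd = -q.
t = np.linspace(0, 10, 500)
q, qdd = np.cos(t), -np.cos(t)
print(fit_potential(q, qdd))   # ≈ [0, 1, 0, 0, 0], i.e. V'(q) = q, V = q²/2
```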

  12. Maritime over the Horizon Sensor Integration: High Frequency Surface-Wave-Radar and Automatic Identification System Data Integration Algorithm.

    Science.gov (United States)

    Nikolic, Dejan; Stojkovic, Nikola; Lekic, Nikola

    2018-04-09

    Obtaining the complete operational picture of the maritime situation in the Exclusive Economic Zone (EEZ) which lies over the horizon (OTH) requires the integration of data obtained from various sensors. These sensors include high frequency surface-wave-radar (HFSWR), the satellite automatic identification system (SAIS) and the land automatic identification system (LAIS). The algorithm proposed in this paper utilizes radar tracks obtained from a network of HFSWRs, which have already been processed by a multi-target tracking algorithm, and associates SAIS and LAIS data with the corresponding radar tracks, thus forming integrated data pairs. During the integration process, all HFSWR targets in the vicinity of AIS data are evaluated and the one with the highest matching factor is used for data association. On the other hand, if there are multiple AIS data in the vicinity of a single HFSWR track, the algorithm still makes only one data pair, consisting of the AIS and HFSWR data with the highest mutual matching factor. During design and testing, special attention was given to the latency of AIS data, which can be very high in the EEZs of developing countries. The algorithm was designed, implemented and tested in a real working environment. The testing environment is located in the Gulf of Guinea and includes a network of two HFSWRs, several coastal sites with LAIS receivers, and SAIS data provided by an SAIS data provider.
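
    The association rule described (one AIS report pairs with the free radar track of highest mutual matching factor) can be sketched as a greedy assignment. The matching factor below, an inverse-distance score with a coarse speed check, is a stand-in; the paper's factor and its latency handling are not reproduced:

```python
# Greedy one-to-one association of AIS reports with HFSWR tracks.
import numpy as np

def match_factor(track, ais):
    d = np.hypot(track["x"] - ais["x"], track["y"] - ais["y"])
    if d > 5000.0:                       # outside the search vicinity (m)
        return 0.0
    speed_penalty = abs(track["v"] - ais["v"]) / 10.0
    return 1.0 / (1.0 + d / 1000.0 + speed_penalty)

def associate(tracks, ais_reports):
    """Each AIS report joins the still-free track it matches best."""
    pairs, used = [], set()
    for ais in ais_reports:
        scored = [(match_factor(t, ais), i) for i, t in enumerate(tracks)
                  if i not in used]
        best, i = max(scored, default=(0.0, -1))
        if best > 0.0:
            pairs.append((i, ais["mmsi"]))
            used.add(i)
    return pairs
```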

  13. Low-cost automatic activity data recording system

    Directory of Open Access Journals (Sweden)

    Moraes M.F.D.

    1997-01-01

    We describe a low-cost, high-quality device capable of monitoring indirect activity by detecting touch-release events on a conducting surface, i.e., the animal's cage cover. In addition to the detecting sensor itself, the system includes an IBM PC interface for prompt data storage. The hardware/software design, while serving other purposes as well, is used to record the circadian activity rhythm pattern of rats over time in an automated, computerized fashion using minimal-cost computer equipment (IBM PC XT). Once the sensor detects a touch-release action of the rat in the upper portion of the cage, the interface sends a command to the PC, which records the time (hours-minutes-seconds) when the activity occurred. As a result, the computer builds up several files (one per detector/sensor) containing a time list of all recorded events. Data can be visualized in terms of actograms, indicating the number of detections per hour, and analyzed by mathematical tools such as the Fast Fourier Transform (FFT) or cosinor. In order to demonstrate method validation, an experiment was conducted on 8 Wistar rats under 12/12-h light/dark cycle conditions (lights on at 7:00 a.m.). The results show a biological validation of the method, since it detected the presence of circadian activity rhythm patterns in the behavior of the rats.
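
    The analysis pipeline this record describes (hourly actograms plus an FFT) is straightforward to sketch. Below is a hypothetical version, assuming event timestamps have already been converted to hours since the start of recording:

```python
# Bin touch-release timestamps into hourly counts, then find the dominant
# rhythm period with an FFT (~24 h for a circadian rhythm).
import numpy as np

def actogram(event_hours, days):
    counts, _ = np.histogram(event_hours, bins=24 * days, range=(0, 24 * days))
    return counts                                 # detections per hour

def dominant_period_hours(counts):
    spectrum = np.abs(np.fft.rfft(counts - counts.mean()))
    freqs = np.fft.rfftfreq(counts.size, d=1.0)   # cycles per hour
    k = spectrum[1:].argmax() + 1                 # skip the zero-frequency bin
    return 1.0 / freqs[k]
```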

  14. Intentional and Automatic Numerical Processing as Predictors of Mathematical Abilities in Primary School Children

    Directory of Open Access Journals (Sweden)

    Violeta Pina

    2015-03-01

    Previous studies have suggested that numerical processing relates to mathematical performance, but it seems that such a relationship is more evident for intentional than for automatic numerical processing. In the present study we assessed the relationship between the two types of numerical processing and specific mathematical abilities in a sample of 109 children in grades 1 to 6. Participants were tested on an ample range of mathematical tests and also performed both a numerical and a size comparison task. The results showed that numerical processing related to mathematical performance only when inhibitory control was involved in the comparison tasks. Concretely, we found that intentional numerical processing, as indexed by the numerical distance effect in the numerical comparison task, was related to mathematical reasoning skills only when the task-irrelevant dimension (the physical size) was incongruent; whereas automatic numerical processing, indexed by the congruency effect in the size comparison task, was related to mathematical calculation skills only when digits were separated by a small distance. The observed double dissociation highlights the relevance of both intentional and automatic numerical processing in mathematical skills, but only when inhibitory control is also involved.

  15. Chemometric strategy for automatic chromatographic peak detection and background drift correction in chromatographic data.

    Science.gov (United States)

    Yu, Yong-Jie; Xia, Qiao-Ling; Wang, Sheng; Wang, Bing; Xie, Fu-Wei; Zhang, Xiao-Bing; Ma, Yun-Ming; Wu, Hai-Long

    2014-09-12

    Peak detection and background drift correction (BDC) are key stages in using chemometric methods to analyze chromatographic fingerprints of complex samples. This study developed a novel chemometric strategy for simultaneous automatic chromatographic peak detection and BDC. A robust statistical method was used for intelligent estimation of the instrumental noise level, coupled with the first-order derivative of the chromatographic signal, to automatically extract chromatographic peaks from the data. A local curve-fitting strategy was then employed for BDC. Simulated and real liquid chromatographic data were designed with various kinds of background drift and degrees of overlap between chromatographic peaks to verify the performance of the proposed strategy. The underlying chromatographic peaks can be automatically detected and reasonably integrated by this strategy. Meanwhile, chromatograms with BDC can be precisely obtained. The proposed method was used to analyze a complex gas chromatography dataset that monitored quality changes in plant extracts during a storage procedure. Copyright © 2014 Elsevier B.V. All rights reserved.
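
    The two stages named in the abstract, a robust noise estimate coupled with the first-order derivative for peak detection and a local curve fit for background drift, might look roughly like the following; the thresholds and the polynomial background model are illustrative assumptions:

```python
# Robust (MAD-based) noise estimate plus derivative sign changes for peaks,
# then a polynomial background fitted on peak-free points and subtracted.
import numpy as np

def detect_peaks(signal, k=5.0):
    noise = 1.4826 * np.median(np.abs(np.diff(signal)))   # robust noise level
    d = np.diff(signal)
    idx = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1    # local maxima
    return idx[signal[idx] > np.median(signal) + k * noise]

def correct_drift(signal, peaks, half_width=20, degree=3):
    mask = np.ones(signal.size, bool)
    for p in peaks:                       # exclude peak regions from the fit
        mask[max(0, p - half_width):p + half_width] = False
    x = np.arange(signal.size)
    base = np.polyval(np.polyfit(x[mask], signal[mask], degree), x)
    return signal - base
```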

  16. Using dual-task methodology to dissociate automatic from nonautomatic processes involved in artificial grammar learning.

    Science.gov (United States)

    Hendricks, Michelle A; Conway, Christopher M; Kellogg, Ronald T

    2013-09-01

    Previous studies have suggested that both automatic and intentional processes contribute to the learning of grammar and fragment knowledge in artificial grammar learning (AGL) tasks. To explore the relative contribution of automatic and intentional processes to knowledge gained in AGL, we utilized dual-task methodology to dissociate automatic and intentional grammar- and fragment-based knowledge in AGL at both acquisition and at test. Both experiments used a balanced chunk strength grammar to assure an equal proportion of fragment cues (i.e., chunks) in grammatical and nongrammatical test items. In Experiment 1, participants engaged in a working memory dual-task either during acquisition, test, or both acquisition and test. The results showed that participants performing the dual-task during acquisition learned the artificial grammar as well as the single-task group, presumably by relying on automatic learning mechanisms. A working memory dual-task at test resulted in attenuated grammar performance, suggesting a role for intentional processes for the expression of grammatical learning at test. Experiment 2 explored the importance of perceptual cues by changing letters between the acquisition and test phase; unlike Experiment 1, there was no significant learning of grammatical information for participants under dual-task conditions in Experiment 2, suggesting that intentional processing is necessary for successful acquisition and expression of grammar-based knowledge under transfer conditions. In sum, it appears that some aspects of learning in AGL are indeed relatively automatic, although the expression of grammatical information and the learning of grammatical patterns when perceptual similarity is eliminated both appear to require explicit resources. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  17. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri Cusack

    2015-01-01

    Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  18. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
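
    The dependency-tracking idea behind the processing engine (rerun a module only when its outputs are missing or older than its inputs) can be sketched in a few lines. aa itself is Matlab-based; the Python below and its file-timestamp convention are purely illustrative:

```python
# Minimal stand-in for a dependency-tracking pipeline engine.
import os

class Module:
    def __init__(self, name, inputs, outputs, run):
        self.name, self.inputs, self.outputs, self.run = name, inputs, outputs, run

def stale(module):
    """A module needs (re)running if any output is missing or older than an input."""
    if not all(os.path.exists(o) for o in module.outputs):
        return True
    newest_in = max((os.path.getmtime(i) for i in module.inputs), default=0.0)
    oldest_out = min(os.path.getmtime(o) for o in module.outputs)
    return newest_in > oldest_out

def run_pipeline(modules):
    for m in modules:            # assumed already topologically ordered
        if stale(m):
            m.run()
        else:
            print(f"skipping {m.name}: up to date")
```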

  19. AUTOMR: An automatic processing program system for the molecular replacement method

    International Nuclear Information System (INIS)

    Matsuura, Yoshiki

    1991-01-01

    An automatic processing program system for the molecular replacement method, AUTOMR, is presented. The program solves the initial model of the target crystal structure using a homologous molecule as the search model. It processes the structure-factor calculation of the model molecule, the rotation function, the translation function and the rigid-group refinement successively in one computer job. Test calculations were performed for six protein crystals and the structures were solved in all cases. (orig.)

  20. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    Science.gov (United States)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  1. Roadway System Assessment Using Bluetooth-Based Automatic Vehicle Identification Travel Time Data

    OpenAIRE

    Day, Christopher M.; Brennan, Thomas M.; Hainen, Alexander M.; Remias, Stephen M.; Bullock, Darcy M.

    2012-01-01

    This monograph is an exposition of several practice-ready methodologies for automatic vehicle identification (AVI) data collection systems. This includes considerations in the physical setup of the collection system as well as the interpretation of the data. An extended discussion is provided, with examples, demonstrating data techniques for converting the raw data into more concise metrics and views. Examples of statistical before-after tests are also provided. A series of case studies were ...

  2. AUTOMATING THE DATA SECURITY PROCESS

    OpenAIRE

    Florin Ogigau-Neamtiu

    2017-01-01

    Contemporary organizations face big data security challenges in the cyber environment due to modern threats and an actual business working model which relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. Current data classification and data obfuscation selection processes (encryption, masking or tokenization) suffer because of the human involvement in the process. Organizations need to shrink the data security domain by classifying information based on its...

  3. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathan; Calo, Victor M.

    2010-01-01

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques

  4. Research on the Construction of Remote Sensing Automatic Interpretation Symbol Big Data

    Science.gov (United States)

    Gao, Y.; Liu, R.; Liu, J.; Cheng, T.

    2018-04-01

    Remote sensing automatic interpretation symbol (RSAIS) data are an inexpensive and fast method for providing precise in-situ information for image interpretation and accuracy. This study designed a scientific and precise RSAIS data characterization method, as well as a massive data storage method with a distributed cloud architecture. Additionally, it introduced an offline and online data update mode and a dynamic data evaluation mechanism, with the aim of creating an efficient approach for RSAIS big data construction. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013-2015 based on the National Geographic Conditions Monitoring Project of China, and it has been updated annually since 2016. RSAIS big data has proven to be a good method for large-scale image interpretation and field validation. It is also notable that it has the potential to solve automatic image interpretation with the assistance of deep learning technology in the remote sensing big data era.

  5. AUTOMATING THE DATA SECURITY PROCESS

    Directory of Open Access Journals (Sweden)

    Florin Ogigau-Neamtiu

    2017-11-01

    Contemporary organizations face big data security challenges in the cyber environment due to modern threats and an actual business working model which relies heavily on collaboration, data sharing, tool integration, increased mobility, etc. Current data classification and data obfuscation selection processes (encryption, masking or tokenization) suffer because of the human involvement in the process. Organizations need to shrink the data security domain by classifying information based on its importance, conduct risk assessment plans and use the most cost-effective data obfuscation techniques. The paper proposes a new model for data protection that uses automated machine decision-making procedures to classify data and to select the appropriate data obfuscation technique. The proposed system uses natural language processing capabilities to analyze input data and to select the best course of action. The system has capabilities to learn from previous experiences, thus improving itself and reducing the risk of wrong data classification.
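
    A drastically simplified stand-in for the classify-then-select pipeline described here: score keywords to classify sensitivity, then map the class to the cheapest adequate obfuscation technique. The categories, keywords, and mapping are all invented for illustration; the paper's system uses natural language processing and learning instead:

```python
# Toy sensitivity classification and obfuscation selection.
SENSITIVE = {"password": 3, "ssn": 3, "salary": 2, "address": 1, "email": 1}

def classify(text):
    score = sum(w for k, w in SENSITIVE.items() if k in text.lower())
    return "high" if score >= 3 else "medium" if score >= 1 else "low"

def select_obfuscation(level):
    # cheapest technique that still covers the assessed risk
    return {"high": "encryption", "medium": "tokenization", "low": "masking"}[level]

record = "Employee SSN and salary for payroll export"
print(select_obfuscation(classify(record)))   # -> encryption
```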

  6. Semi-Automatic Registration of Airborne and Terrestrial Laser Scanning Data Using Building Corner Matching with Boundaries as Reliability Check

    Directory of Open Access Journals (Sweden)

    Liang Cheng

    2013-11-01

    Full Text Available Data registration is a prerequisite for the integration of multi-platform laser scanning in various applications. A new approach is proposed for the semi-automatic registration of airborne and terrestrial laser scanning data with buildings without eaves. Firstly, an automatic calculation procedure for thresholds in the density of projected points (DoPP) method is introduced to extract boundary segments from terrestrial laser scanning data. A new algorithm, using a self-extending procedure, is developed to recover the extracted boundary segments, which then intersect to form the corners of buildings. The building corners extracted from airborne and terrestrial laser scanning are reliably matched through an automatic iterative process in which boundaries from the two datasets are compared as the reliability check. The experimental results illustrate that the proposed approach provides both high reliability and high geometric accuracy (average error of 0.44 m/0.15 m in the horizontal/vertical direction for corresponding building corners) for the final registration of airborne laser scanning (ALS) and tripod-mounted terrestrial laser scanning (TLS) data.
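
    Once corresponding building corners have been matched, the final ALS-TLS registration step amounts to estimating a rigid transform from point correspondences. A hedged sketch using the standard SVD (Kabsch) solution, with invented corner coordinates rather than the paper's data:

      # Hedged sketch: rigid transform from matched corner pairs via SVD.
      # Coordinates are invented; this is the generic Kabsch solution, not the
      # paper's full iterative matching procedure.
      import numpy as np

      def rigid_transform(src, dst):
          """Least-squares rotation R and translation t with dst ~ R @ src + t."""
          c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
          H = (src - c_src).T @ (dst - c_dst)
          U, _, Vt = np.linalg.svd(H)
          R = Vt.T @ U.T
          if np.linalg.det(R) < 0:          # guard against a reflection
              Vt[-1] *= -1
              R = Vt.T @ U.T
          t = c_dst - R @ c_src
          return R, t

      tls_corners = np.array([[0., 0., 0.], [10., 0., 0.], [10., 8., 0.], [0., 8., 3.]])
      angle = np.deg2rad(12.0)
      R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                         [np.sin(angle),  np.cos(angle), 0],
                         [0., 0., 1.]])
      als_corners = tls_corners @ R_true.T + np.array([100.0, -40.0, 2.0])

      R, t = rigid_transform(tls_corners, als_corners)
      residual = als_corners - (tls_corners @ R.T + t)
      print("max residual:", np.abs(residual).max())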

  7. Towards Automatic Music Transcription: Extraction of MIDI-Data out of Polyphonic Piano Music

    Directory of Open Access Journals (Sweden)

    Jens Wellhausen

    2005-06-01

    Full Text Available Driven by the increasing amount of music available electronically, the need for automatic search and retrieval systems for music becomes more and more important. In this paper an algorithm for automatic transcription of polyphonic piano music into MIDI data is presented, which is a very interesting basis for database applications and music analysis. The first part of the algorithm performs a note-accurate temporal audio segmentation. In the second part, the resulting segments are examined to extract the notes played, and an algorithm for chord separation based on Independent Subspace Analysis is presented. Finally, the results are used to build a MIDI file.
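
    As a heavily simplified, hedged sketch of the overall pipeline (onset-based temporal segmentation, per-segment pitch estimation, MIDI export), the following Python toy handles only monophonic material and substitutes librosa and pretty_midi for the authors' algorithms; the input file name is assumed:

      # Hedged monophonic toy of the pipeline, not the paper's polyphonic
      # ISA-based method. "piano.wav" is an assumed input file.
      import librosa
      import numpy as np
      import pretty_midi

      y, sr = librosa.load("piano.wav", mono=True)
      onsets = librosa.onset.onset_detect(y=y, sr=sr, units="time")
      bounds = np.append(onsets, len(y) / sr)       # segment boundaries in seconds

      pm = pretty_midi.PrettyMIDI()
      piano = pretty_midi.Instrument(program=0)
      for start, end in zip(bounds[:-1], bounds[1:]):
          seg = y[int(start * sr):int(end * sr)]
          if len(seg) < 2048:                        # too short for the yin frame
              continue
          f0 = librosa.yin(seg, fmin=27.5, fmax=4186.0, sr=sr)   # piano range
          pitch = int(np.round(librosa.hz_to_midi(np.median(f0))))
          piano.notes.append(pretty_midi.Note(velocity=90, pitch=pitch,
                                              start=float(start), end=float(end)))
      pm.instruments.append(piano)
      pm.write("piano_transcription.mid")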

  8. Automatic data acquisition system of environmental radiation monitor with a personal computer

    International Nuclear Information System (INIS)

    Ohkubo, Tohru; Nakamura, Takashi.

    1984-05-01

    The automatic data acquisition system of an environmental radiation monitor was developed at low cost by using a PET personal computer. The count pulses from eight monitors installed at four site boundaries were transmitted to a radiation control room by a signal transmission device and analyzed by the computer via a 12-channel scaler and PET-CAMAC interface for graphic display and printing. (author)

  9. Evaluation of missing data techniques for in-car automatic speech recognition

    OpenAIRE

    Wang, Y.; Vuerinckx, R.; Gemmeke, J.F.; Cranen, B.; Hamme, H. Van

    2009-01-01

    Wang Y., Vuerinckx R., Gemmeke J., Cranen B., Van hamme H., ''Evaluation of missing data techniques for in-car automatic speech recognition'', Proceedings NAG/DAGA 2009 - international conference on acoustics, 4 pp., March 23-26, 2009, Rotterdam, The Netherlands.

  10. Manual editing of automatically recorded data in an anesthesia information management system.

    Science.gov (United States)

    Wax, David B; Beilin, Yaakov; Hossain, Sabera; Lin, Hung-Mo; Reich, David L

    2008-11-01

    Anesthesia information management systems allow automatic recording of physiologic and anesthetic data, which clinicians can subsequently edit or invalidate. The authors investigated the prevalence of such data modification in an academic medical center. The authors queried their anesthesia information management system database of anesthetics performed in 2006 and tabulated the counts of data points for automatically recorded physiologic and anesthetic parameters, as well as the subset of those data that were manually invalidated by clinicians (both with and without alternate values manually appended). Patient, practitioner, data source, and timing characteristics of recorded values were also extracted to determine their associations with editing of various parameters in the anesthesia information management system record. A total of 29,491 cases were analyzed, 19% of which had one or more data points manually invalidated. Among 58 attending anesthesiologists, each invalidated data in a median of 7% of their cases when working as a sole practitioner. A minority of invalidated values were manually appended with alternate values. Pulse rate, blood pressure, and pulse oximetry were the most commonly invalidated parameters. Data invalidation usually resulted in a decrease in parameter variance. Factors independently associated with invalidation included extreme physiologic values, American Society of Anesthesiologists physical status classification, emergency status, timing (phase of the procedure/anesthetic), presence of an intraarterial catheter, resident or certified registered nurse anesthetist involvement, and procedure duration. Editing of physiologic data automatically recorded in an anesthesia information management system is a common practice and results in decreased variability of intraoperative data. Further investigation may clarify the reasons for and consequences of this behavior.

  11. Semi-Automatic Detection of Indigenous Settlement Features on Hispaniola through Remote Sensing Data

    Directory of Open Access Journals (Sweden)

    Till F. Sonnemann

    2017-12-01

    Full Text Available Satellite imagery has had limited application in the analysis of pre-colonial settlement archaeology in the Caribbean; visible evidence of wooden structures perishes quickly in tropical climates. Only slight topographic modifications remain, typically associated with middens. Nonetheless, surface scatters, as well as the soil characteristics they produce, can serve as quantifiable indicators of an archaeological site, detectable by analyzing remote sensing imagery. A variety of pre-processed, very diverse data sets went through a process of image registration, with the intention of combining multispectral bands to feed two different semi-automatic direct detection algorithms: a posterior-probability approach and a frequentist approach. Two 5 × 5 km areas in the northwestern Dominican Republic with diverse environments, sufficient imagery coverage, and a representative number of known indigenous site locations served one for each approach. Buffers around the locations of known sites, as well as areas with no likely archaeological evidence, were used as samples. The resulting maps offer quantifiable statistical outcomes for locations with pixel-value combinations similar to the identified sites, indicating a higher probability of archaeological evidence. These trials are still experimental and have not yet been groundtruthed, but they show the variable potential of this method in diverse environments.
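
    The posterior-probability variant can be sketched per pixel: model the spectra of the site and background training samples as multivariate Gaussians and map P(site | pixel) over the scene. A hedged illustration with synthetic band values and an assumed prior:

      # Hedged sketch of per-pixel posterior-probability detection. All arrays
      # and the prior are illustrative, not the study's data or exact model.
      import numpy as np
      from scipy.stats import multivariate_normal

      rng = np.random.default_rng(1)
      site_samples = rng.normal([0.30, 0.25, 0.40], 0.03, size=(200, 3))   # band values
      background = rng.normal([0.20, 0.35, 0.30], 0.05, size=(2000, 3))

      site_pdf = multivariate_normal(site_samples.mean(0), np.cov(site_samples.T))
      bg_pdf = multivariate_normal(background.mean(0), np.cov(background.T))
      prior_site = 0.05            # assumed rarity of archaeological evidence

      scene = rng.normal([0.25, 0.30, 0.35], 0.06, size=(100, 100, 3))
      flat = scene.reshape(-1, 3)
      num = site_pdf.pdf(flat) * prior_site
      den = num + bg_pdf.pdf(flat) * (1 - prior_site)
      posterior = (num / den).reshape(100, 100)   # probability map of site evidence
      print("pixels above 0.5 posterior:", int((posterior > 0.5).sum()))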

  12. Cognitive regulation of smoking behavior within a cigarette: Automatic and nonautomatic processes.

    Science.gov (United States)

    Motschman, Courtney A; Tiffany, Stephen T

    2016-06-01

    There has been limited research on cognitive processes governing smoking behavior in individuals who are tobacco dependent. In a replication (Baxter & Hinson, 2001) and extension, this study examined the theory (Tiffany, 1990) that drug use may be controlled by automatic processes that develop over repeated use. Heavy and occasional cigarette smokers completed a button-press reaction time (RT) task while concurrently smoking a cigarette, pretending to smoke a lit cigarette, or not smoking. Slowed RT during the button-press task indexed the cognitive disruption associated with nonautomatic control of behavior. Occasional smokers' RTs were slowed when smoking or pretending to smoke compared with when not smoking. Heavy smokers' RTs were slowed when pretending to smoke versus not smoking; however, their RTs were similarly fast when smoking compared with not smoking. The results indicated that smoking behavior was more highly regulated by controlled, nonautomatic processes among occasional smokers and by automatic processes among heavy smokers. Patterns of RT across the interpuff interval indicated that occasional smokers were significantly slowed in anticipation of and immediately after puffing onset, whereas heavy smokers were only slowed significantly after puffing onset. These findings suggest that the entirety of the smoking sequence becomes automatized, with the behaviors leading up to puffing becoming more strongly regulated by automatic processes with experience. These results have relevance to theories on the cognitive regulation of cigarette smoking and support the importance of interventions that focus on routinized behaviors that individuals engage in during and leading up to drug use.

  13. Processes meet big data : connecting data science with process science

    NARCIS (Netherlands)

    van der Aalst, W.; Damiani, E.

    2015-01-01

    As more and more companies are embracing Big data, it has become apparent that the ultimate challenge is to relate massive amounts of event data to processes that are highly dynamic. To unleash the value of event data, events need to be tightly connected to the control and management of operational processes.

  14. Progress on statistical learning systems as data mining tools for the creation of automatic databases in Fusion environments

    International Nuclear Information System (INIS)

    Vega, J.; Murari, A.; Ratta, G.A.; Gonzalez, S.; Dormido-Canto, S.

    2010-01-01

    Nowadays, processing all information of a fusion database is a much more important issue than acquiring more data. Although typically fusion devices produce tens of thousands of discharges, specialized databases for physics studies are normally limited to a few tens of shots. This is due to the fact that these databases are almost always generated manually, which is a very time consuming and unreliable activity. The development of automatic methods to create specialized databases ensures first, the reduction of human efforts to identify and locate physical events, second, the standardization of criteria (reducing the vulnerability to human errors) and, third, the improvement of statistical relevance. Classification and regression techniques have been used for these purposes. The objective has been the automatic recognition of physical events (that can appear in a random and/or infrequent way) in waveforms and video-movies. Results are shown for the JET database.

  15. Digital Data Processing of Images

    African Journals Online (AJOL)

    Digital data processing was investigated to perform image processing. Image smoothing and restoration were explored and promising results obtained. The use of the computer, not only as a data management device, but as an important tool to render quantitative information, was illustrated by lung function determination.

  16. Processing of nuclear data - demonstrations

    International Nuclear Information System (INIS)

    Panini, G.C.

    1995-01-01

    Experimental data are not suitable for direct processing by computer codes. Data must be compiled in order to be compared, normalized, amended; gaps should be filled, differential data supplied. Evaluated data are given in a consistent computer readable format so as to facilitate the checking, the plotting, the comparison with the experimental source of the data. Processing codes have been developed for producing working libraries from the different sources. EXFOR is a complementary and essential tool for evaluators. It consists of a collection of experimental data in computer readable format. (R.P.)

  17. Multibeam sonar backscatter data processing

    Science.gov (United States)

    Schimel, Alexandre C. G.; Beaudoin, Jonathan; Parnum, Iain M.; Le Bas, Tim; Schmidt, Val; Keith, Gordon; Ierodiaconou, Daniel

    2018-06-01

    Multibeam sonar systems now routinely record seafloor backscatter data, which are processed into backscatter mosaics and angular responses, both of which can assist in identifying seafloor types and morphology. Those data products are obtained from the multibeam sonar raw data files through a sequence of data processing stages that follows a basic plan, but the implementation of which varies greatly between sonar systems and software. In this article, we provide a comprehensive review of this backscatter data processing chain, with a focus on the variability in the possible implementation of each processing stage. Our objective for undertaking this task is twofold: (1) to provide an overview of backscatter data processing for the consideration of the general user and (2) to provide suggestions to multibeam sonar manufacturers, software providers and the operators of these systems and software for eventually reducing the lack of control, uncertainty and variability associated with current data processing implementations and the resulting backscatter data products. One such suggestion is the adoption of a nomenclature for increasingly refined levels of processing, akin to the nomenclature adopted for satellite remote-sensing data deliverables.

  18. Automatic and controlled processing in sentence recall: The role of long-term and working memory

    OpenAIRE

    Jefferies, Elizabeth; Lambon Ralph, Matthew A.; Baddeley, Alan D.

    2004-01-01

    Immediate serial recall is better for sentences than word lists presumably because of the additional support that meaningful material receives from long-term memory. This may occur automatically, without the involvement of attention, or may require additional attentionally demanding processing. For example, the episodic buffer model (Baddeley, 2000) proposes that the executive component of working memory plays a crucial role in the formation of links between different representational formats...

  19. Automatic extraction of road features in urban environments using dense ALS data

    Science.gov (United States)

    Soilán, Mario; Truong-Hong, Linh; Riveiro, Belén; Laefer, Debra

    2018-02-01

    This paper describes a methodology that automatically extracts semantic information from urban ALS data for urban parameterization and road network definition. First, building façades are segmented from the ground surface by combining knowledge-based information with both voxel and raster data. Next, heuristic rules and unsupervised learning are applied to the ground surface data to distinguish sidewalk and pavement points as a means for curb detection. Then, radiometric information is employed for road marking extraction. Using high-density ALS data from Dublin, Ireland, this fully automatic workflow was able to generate an F-score close to 95% for pavement and sidewalk identification at a resolution of 20 cm, and better than 80% for road marking detection.

  20. NMRFx Processor: a cross-platform NMR data processing program

    International Nuclear Information System (INIS)

    Norris, Michael; Fetler, Bayard; Marchant, Jan; Johnson, Bruce A.

    2016-01-01

    NMRFx Processor is a new program for the processing of NMR data. Written in the Java programming language, NMRFx Processor is a cross-platform application and runs on Linux, Mac OS X and Windows operating systems. The application can be run in both a graphical user interface (GUI) mode and from the command line. Processing scripts are written in the Python programming language and executed so that the low-level Java commands are automatically run in parallel on computers with multiple cores or CPUs. Processing scripts can be generated automatically from the parameters of NMR experiments or interactively constructed in the GUI. A wide variety of processing operations are provided, including methods for processing of non-uniformly sampled datasets using iterative soft thresholding. The interactive GUI also enables the use of the program as an educational tool for teaching basic and advanced techniques in NMR data analysis.

  1. NMRFx Processor: a cross-platform NMR data processing program

    Energy Technology Data Exchange (ETDEWEB)

    Norris, Michael; Fetler, Bayard [One Moon Scientific, Inc. (United States); Marchant, Jan [University of Maryland Baltimore County, Howard Hughes Medical Institute (United States); Johnson, Bruce A., E-mail: bruce.johnson@asrc.cuny.edu [One Moon Scientific, Inc. (United States)

    2016-08-15

    NMRFx Processor is a new program for the processing of NMR data. Written in the Java programming language, NMRFx Processor is a cross-platform application and runs on Linux, Mac OS X and Windows operating systems. The application can be run in both a graphical user interface (GUI) mode and from the command line. Processing scripts are written in the Python programming language and executed so that the low-level Java commands are automatically run in parallel on computers with multiple cores or CPUs. Processing scripts can be generated automatically from the parameters of NMR experiments or interactively constructed in the GUI. A wide variety of processing operations are provided, including methods for processing of non-uniformly sampled datasets using iterative soft thresholding. The interactive GUI also enables the use of the program as an educational tool for teaching basic and advanced techniques in NMR data analysis.
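
    Iterative soft thresholding for non-uniformly sampled data, one of the processing operations mentioned, can be illustrated generically. The following hedged 1-D sketch is a textbook IST loop on a synthetic two-peak signal, not NMRFx code:

      # Hedged sketch of iterative soft thresholding (IST) for a non-uniformly
      # sampled 1-D signal; generic textbook loop, not NMRFx's implementation.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 512
      t = np.arange(n)
      fid = (np.exp(2j * np.pi * 0.11 * t) + 0.5 * np.exp(2j * np.pi * 0.23 * t)) \
            * np.exp(-t / 150.0)                      # synthetic two-peak FID

      mask = np.zeros(n, dtype=bool)
      mask[rng.choice(n, size=n // 4, replace=False)] = True   # 25% NUS schedule
      measured = fid * mask

      x = measured.copy()
      for _ in range(100):
          spec = np.fft.fft(x)
          thresh = 0.1 * np.abs(spec).max()
          # Soft threshold: shrink magnitudes toward zero, keep the phases.
          mag = np.maximum(np.abs(spec) - thresh, 0.0)
          spec = mag * np.exp(1j * np.angle(spec))
          x = np.fft.ifft(spec)
          x[mask] = measured[mask]        # enforce consistency on sampled points

      print("reconstruction error:", np.linalg.norm(x - fid) / np.linalg.norm(fid))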

  2. Northwest range-plant symbols adapted to automatic data processing.

    Science.gov (United States)

    George A. Garrison; Jon M. Skovlin

    1960-01-01

    Many range technicians, agronomists, foresters, biologists, and botanists of various educational institutions and government agencies in the Northwest have been using a four-letter symbol list or code compiled 12 years ago from records of plants collected by the U.S. Forest Service in Oregon and Washington, This code has served well as a means of entering plant names...

  3. Process acceptance and adjustment techniques for Swiss automatic screw machine parts. Final report

    International Nuclear Information System (INIS)

    Robb, J.M.

    1976-01-01

    Product tolerance requirements for small, cylindrical piece parts produced on Swiss automatic screw machines have progressed to the reliability limits of inspection equipment. The miniature size, configuration, and tolerance requirements (±0.0001 in., i.e. 0.00254 mm) of these parts preclude the use of screening techniques to accept product or adjust processes during setup and production runs; therefore, existing means of product acceptance and process adjustment must be refined or new techniques must be developed. The purpose of this endeavor has been to determine the benefits gained through the implementation of a process acceptance technique (PAT) on Swiss automatic screw machine processes. PAT is a statistical approach developed for accepting product and centering processes for parts produced by selected, controlled processes. Through this endeavor, a determination has been made of the conditions under which PAT can benefit a controlled process, and of some specific types of screw machine processes to which PAT could be applied. However, it was also determined that PAT, if used indiscriminately, may become a record-keeping burden when applied to more than one dimension at a given machining operation.

  4. Implementation of the Automatic Packet Reporting System (APRS) for Monitoring and Measurement Data Packets

    OpenAIRE

    Arief Goeritno

    2016-01-01

    An implementation of the Automatic Packet Reporting System (APRS) for monitoring and measurement data packets has been carried out, with the research objectives of: (a) setting up the application software on the APRS network, and (b) measuring data reception based on the performance of the sensors. Setting up the APRS application software means configuring the software used at the APRS data reception station with the commonly used applications, namely HyperTerminal and UI-View32. ...

  5. Process Mining Online Assessment Data

    Science.gov (United States)

    Pechenizkiy, Mykola; Trcka, Nikola; Vasilyeva, Ekaterina; van der Aalst, Wil; De Bra, Paul

    2009-01-01

    Traditional data mining techniques have been extensively applied to find interesting patterns, build descriptive and predictive models from large volumes of data accumulated through the use of different information systems. The results of data mining can be used for getting a better understanding of the underlying educational processes, for…

  6. Graphical Language for Data Processing

    Science.gov (United States)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.
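
    The interpreted execution model described, traversing nodes in dependency order and passing results along wires, can be sketched in a few lines. The actual system is built on .NET classes; this minimal Python stand-in with invented node functions only illustrates the model:

      # Hedged sketch of a process-graph executor: nodes wrap processing
      # functions, wires carry data between them, and execution follows a
      # topological order. Node functions here are invented for illustration.
      from graphlib import TopologicalSorter

      def load(_):        return [3.0, 1.0, 4.0, 1.0, 5.0]
      def scale(inputs):  return [2.0 * v for v in inputs["load"]]
      def total(inputs):  return sum(inputs["scale"])

      nodes = {"load": load, "scale": scale, "total": total}
      wires = {"load": set(), "scale": {"load"}, "total": {"scale"}}  # node -> deps

      results = {}
      for name in TopologicalSorter(wires).static_order():
          upstream = {dep: results[dep] for dep in wires[name]}
          results[name] = nodes[name](upstream)     # interpreted execution per node

      print(results["total"])   # 28.0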

  7. Wave data processing toolbox manual

    Science.gov (United States)

    Sullivan, Charlene M.; Warner, John C.; Martini, Marinna A.; Lightsom, Frances S.; Voulgaris, George; Work, Paul

    2006-01-01

    Researchers routinely deploy oceanographic equipment in estuaries, coastal nearshore environments, and shelf settings. These deployments usually include tripod-mounted instruments to measure a suite of physical parameters such as currents, waves, and pressure. Instruments such as the RD Instruments Acoustic Doppler Current Profiler (ADCP™), the Sontek Argonaut, and the Nortek Aquadopp™ Profiler (AP) can measure these parameters. The data from these instruments must be processed using proprietary software unique to each instrument to convert measurements to real physical values. These processed files are then available for dissemination and scientific evaluation. For example, the proprietary processing program used to process data from the RD Instruments ADCP for wave information is called WavesMon. Depending on the length of the deployment, WavesMon will typically produce thousands of processed data files. These files are difficult to archive, and further analysis of the data becomes cumbersome. More important, these files alone do not include sufficient information pertinent to that deployment (metadata), which could hinder future scientific interpretation. This open-file report describes a toolbox developed to compile, archive, and disseminate the processed wave measurement data from an RD Instruments ADCP, a Sontek Argonaut, or a Nortek AP. This toolbox will be referred to as the Wave Data Processing Toolbox. The Wave Data Processing Toolbox consolidates the processed files output from the proprietary software into two NetCDF files: one file contains the statistics of the burst data and the other file contains the raw burst data (additional details described below). One important advantage of this toolbox is that it converts the data into NetCDF format. Data in NetCDF format is easy to disseminate, is portable to any computer platform, and is viewable with public-domain, freely available software. Another important advantage is that a metadata ...
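
    The toolbox's central idea, bundling processed burst statistics together with deployment metadata in a self-describing NetCDF file, can be illustrated as follows; the variable names and attributes below are invented and do not reflect the toolbox's actual schema:

      # Hedged sketch: write wave statistics plus deployment metadata to NetCDF.
      # Variable names, attributes and values are invented for illustration.
      import numpy as np
      from netCDF4 import Dataset

      hsig = np.array([1.2, 1.4, 1.1])          # significant wave height per burst
      tpeak = np.array([8.0, 8.5, 7.9])         # peak period per burst

      with Dataset("wave_stats.nc", "w") as nc:
          nc.setncattr("instrument", "RDI ADCP (example)")
          nc.setncattr("deployment_site", "hypothetical shelf mooring")
          nc.createDimension("burst", len(hsig))
          v_h = nc.createVariable("Hsig", "f4", ("burst",))
          v_h.units = "m"
          v_t = nc.createVariable("Tpeak", "f4", ("burst",))
          v_t.units = "s"
          v_h[:] = hsig
          v_t[:] = tpeak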

  8. Automatic-heuristic and executive-analytic processing during reasoning: Chronometric and dual-task considerations.

    Science.gov (United States)

    De Neys, Wim

    2006-06-01

    Human reasoning has been shown to overly rely on intuitive, heuristic processing instead of a more demanding analytic inference process. Four experiments tested the central claim of current dual-process theories that analytic operations involve time-consuming executive processing whereas the heuristic system would operate automatically. Participants solved conjunction fallacy problems and indicative and deontic selection tasks. Experiment 1 established that making correct analytic inferences demanded more processing time than did making heuristic inferences. Experiment 2 showed that burdening the executive resources with an attention-demanding secondary task decreased correct, analytic responding and boosted the rate of conjunction fallacies and indicative matching card selections. Results were replicated in Experiments 3 and 4 with a different secondary-task procedure. Involvement of executive resources for the deontic selection task was less clear. Findings validate basic processing assumptions of the dual-process framework and complement the correlational research programme of K. E. Stanovich and R. F. West (2000).

  9. A dual-task investigation of automaticity in visual word processing

    Science.gov (United States)

    McCann, R. S.; Remington, R. W.; Van Selst, M.

    2000-01-01

    An analysis of activation models of visual word processing suggests that frequency-sensitive forms of lexical processing should proceed normally while unattended. This hypothesis was tested by having participants perform a speeded pitch discrimination task followed by lexical decisions or word naming. As the stimulus onset asynchrony between the tasks was reduced, lexical-decision and naming latencies increased dramatically. Word-frequency effects were additive with the increase, indicating that frequency-sensitive processing was subject to postponement while attention was devoted to the other task. Either (a) the same neural hardware shares responsibility for lexical processing and central stages of choice reaction time task processing and cannot perform both computations simultaneously, or (b) lexical processing is blocked in order to optimize performance on the pitch discrimination task. Either way, word processing is not as automatic as activation models suggest.

  10. Automatic registration of imaging mass spectrometry data to the Allen Brain Atlas transcriptome

    Science.gov (United States)

    Abdelmoula, Walid M.; Carreira, Ricardo J.; Shyti, Reinald; Balluff, Benjamin; Tolner, Else; van den Maagdenberg, Arn M. J. M.; Lelieveldt, B. P. F.; McDonnell, Liam; Dijkstra, Jouke

    2014-03-01

    Imaging Mass Spectrometry (IMS) is an emerging molecular imaging technology that provides spatially resolved information on biomolecular structures; each image pixel effectively represents a molecular mass spectrum. By combining histological images and IMS images, neuroanatomical structures can be distinguished based on their biomolecular features as opposed to morphological features. The combination of IMS data with spatially resolved gene expression maps of the mouse brain, as provided by the Allen Mouse Brain Atlas, would enable comparative studies of spatial metabolic and gene expression patterns in life-sciences research and biomarker discovery. As such, it would be highly desirable to spatially register IMS slices to the Allen Brain Atlas (ABA). In this paper, we propose a multi-step automatic registration pipeline to register ABA histology to IMS images. A key novelty of the method is the selection of the best reference section from the ABA, based on pre-processed histology sections. First, we extracted a hippocampus-specific geometrical feature from the given experimental histological section to initially localize it among the ABA sections. Then, feature-based linear registration is applied to the initially localized section and its two neighbors in the ABA to select the most similar reference section. A non-rigid registration yields a one-to-one mapping of the experimental IMS slice to the ABA. The pipeline was applied to 6 coronal sections from two mouse brains, showing high anatomical correspondence and demonstrating the feasibility of complementing biomolecule distributions from individual mice with the genome-wide ABA transcriptome.

  11. Associative priming in a masked perceptual identification task: evidence for automatic processes.

    Science.gov (United States)

    Pecher, Diane; Zeelenberg, René; Raaijmakers, Jeroen G W

    2002-10-01

    Two experiments investigated the influence of automatic and strategic processes on associative priming effects in a perceptual identification task in which prime-target pairs are briefly presented and masked. In this paradigm, priming is defined as a higher percentage of correctly identified targets for related pairs than for unrelated pairs. In Experiment 1, priming was obtained for mediated word pairs. This mediated priming effect was affected neither by the presence of direct associations nor by the presentation time of the primes, indicating that automatic priming effects play a role in perceptual identification. Experiment 2 showed that the priming effect was not affected by the proportion (.90 vs. .10) of related pairs if primes were presented briefly to prevent their identification. However, a large proportion effect was found when primes were presented for 1000 ms so that they were clearly visible. These results indicate that priming in a masked perceptual identification task is the result of automatic processes and is not affected by strategies. The present paradigm provides a valuable alternative to more commonly used tasks such as lexical decision.

  12. Is place-value processing in four-digit numbers fully automatic? Yes, but not always.

    Science.gov (United States)

    García-Orza, Javier; Estudillo, Alejandro J; Calleja, Marina; Rodríguez, José Miguel

    2017-12-01

    Knowing the place-value of digits in multi-digit numbers allows us to identify, understand and distinguish between numbers with the same digits (e.g., 1492 vs. 1942). Research using the size congruency task has shown that the place-value in a string of three zeros and a non-zero digit (e.g., 0090) is processed automatically. In the present study, we explored whether place-value is also automatically activated when more complex numbers (e.g., 2795) are presented. Twenty-five participants were exposed to pairs of four-digit numbers that differed regarding the position of some digits and their physical size. Participants had to decide which of the two numbers was presented in a larger font size. In the congruent condition, the number shown in a bigger font size was numerically larger. In the incongruent condition, the number shown in a smaller font size was numerically larger. Two types of numbers were employed: numbers composed of three zeros and one non-zero digit (e.g., 0040-0400) and numbers composed of four non-zero digits (e.g., 2795-2759). Results showed larger congruency effects in more distant pairs in both types of numbers. Interestingly, this effect was considerably stronger in the strings composed of zeros. These results indicate that place-value coding is partially automatic, as it depends on the perceptual and numerical properties of the numbers to be processed.

  13. Automatic tissue image segmentation based on image processing and deep learning

    Science.gov (United States)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in multimodality imaging, especially in fusing structural images offered by CT and MRI with functional images collected by optical or other novel imaging technologies. Image segmentation also provides the detailed structural description needed for quantitative visualization of treatment light distribution in the human body when incorporated with 3D light transport simulation. Here we used image enhancement, operators, and morphometry methods to extract accurate contours of different tissues, such as skull, cerebrospinal fluid (CSF), grey matter (GM) and white matter (WM), from 5 fMRI head image datasets. We then utilized a convolutional neural network to realize automatic segmentation of the images by deep learning, and introduced parallel computing. Such approaches greatly reduce processing time compared with manual and semi-automatic segmentation, and are of great importance in improving speed and accuracy as more and more samples are learned. Our results can be used as criteria when diagnosing diseases such as cerebral atrophy, which is caused by pathological changes in gray matter or white matter. We demonstrated the great potential of such combined image processing and deep learning in automatic tissue segmentation for personalized medicine, especially in monitoring and treatment.

  14. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

    Full Text Available Automatic Differentiation is a relatively recent technique for differentiating functions that is applied directly to the source code computing the function, written in standard programming languages. The technique permits automation of the differentiation step, which is crucial for dynamic simulation and optimization of processes. The derivative values obtained with AD are exact (to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of Automatic Differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code and the incorporation of AD tools in consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.
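
    The mechanism behind the exactness claim is easy to demonstrate with forward-mode AD via dual numbers: the derivative propagates through the same arithmetic the function itself uses. A minimal self-contained sketch (not one of the AD tools discussed in the paper):

      # Hedged sketch of forward-mode automatic differentiation with dual
      # numbers; exact to roundoff, as the abstract notes.
      import math

      class Dual:
          def __init__(self, value, deriv=0.0):
              self.value, self.deriv = value, deriv
          def __add__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value + other.value, self.deriv + other.deriv)
          __radd__ = __add__
          def __mul__(self, other):
              other = other if isinstance(other, Dual) else Dual(other)
              return Dual(self.value * other.value,
                          self.value * other.deriv + self.deriv * other.value)
          __rmul__ = __mul__

      def sin(x: Dual) -> Dual:
          # Chain rule applied alongside the function evaluation.
          return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

      # d/dx [x * sin(x) + 3x] at x = 2 equals sin(2) + 2 cos(2) + 3.
      x = Dual(2.0, 1.0)                     # seed derivative dx/dx = 1
      f = x * sin(x) + 3 * x
      print(f.deriv, math.sin(2) + 2 * math.cos(2) + 3)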

  15. ERPs reveal deficits in automatic cerebral stimulus processing in patients with NIDDM.

    Science.gov (United States)

    Vanhanen, M; Karhu, J; Koivisto, K; Pääkkönen, A; Partanen, J; Laakso, M; Riekkinen, P

    1996-11-04

    We compared auditory event-related potentials (ERPs) and neuropsychological test scores in nine patients with non-insulin-dependent diabetes mellitus (NIDDM) and in nine control subjects. The measures of automatic stimulus processing, habituation of auditory N100 and mismatch negativity (MMN) were impaired in patients. No differences were observed in the N2b and P3 components, which presumably reflect conscious cognitive analysis of the stimuli. A trend towards impaired performance in the Digit Span backward was found in diabetic subjects, but in the tests of secondary or long-term memory the groups were comparable. Patients with NIDDM may have defects in arousal and in the automatic ability to redirect attention, which can affect their cognitive performance.

  16. ACTIV - a program for automatic processing of gamma-ray spectra

    International Nuclear Information System (INIS)

    Zlokazov, V.B.

    1982-01-01

    Program ACTIV is intended for precise analysis of γ-ray and X-ray spectra and allows the user to carry out the full cycle of automatic processing of a series of spectra, i.e. calibration, automatic peak search, determination of peak positions and areas, identification of the radioisotopes and transformation of the areas found into masses of isotopes in the irradiated sample. ACTIV uses a complex mathematical technique and is oriented mainly to large computers but, using overlaid loading, it can also be run on small computers like the PDP 11/70. Compared with other similar programs, ACTIV has some advantages in the accuracy of peak-shape description and in the reliability of the peak search and its least-squares analysis. The program can be used for activation analysis and can analyze spectra with poor statistics and with broad and narrow peaks. (orig.)
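
    The core numerical task in such programs, fitting a peak model over background by least squares, can be illustrated generically. The sketch below fits a Gaussian plus linear background with SciPy; it is not ACTIV's actual peak-shape model, and the spectrum is synthetic:

      # Hedged sketch: least-squares fit of a Gaussian peak plus linear
      # background to a synthetic spectrum region (generic, not ACTIV's model).
      import numpy as np
      from scipy.optimize import curve_fit

      def peak(ch, area, centroid, sigma, b0, b1):
          g = area / (sigma * np.sqrt(2 * np.pi)) \
              * np.exp(-0.5 * ((ch - centroid) / sigma) ** 2)
          return g + b0 + b1 * ch

      ch = np.arange(100.0)
      rng = np.random.default_rng(7)
      counts = peak(ch, 5000, 48.0, 3.0, 20.0, 0.1) + rng.poisson(25, ch.size)

      p0 = [counts.max() * 5, ch[np.argmax(counts)], 2.0, counts.min(), 0.0]
      popt, pcov = curve_fit(peak, ch, counts, p0=p0)
      area_err = np.sqrt(pcov[0, 0])     # 1-sigma uncertainty on the peak area
      print(f"area = {popt[0]:.0f} ± {area_err:.0f}, centroid = {popt[1]:.2f}")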

  17. The Automatic Conservative: Ideology-Based Attentional Asymmetries in the Processing of Valenced Information

    Science.gov (United States)

    Carraro, Luciana; Castelli, Luigi; Macchiella, Claudia

    2011-01-01

    Research has widely explored the differences between conservatives and liberals, and it has been also recently demonstrated that conservatives display different reactions toward valenced stimuli. However, previous studies have not yet fully illuminated the cognitive underpinnings of these differences. In the current work, we argued that political ideology is related to selective attention processes, so that negative stimuli are more likely to automatically grab the attention of conservatives as compared to liberals. In Experiment 1, we demonstrated that negative (vs. positive) information impaired the performance of conservatives, more than liberals, in an Emotional Stroop Task. This finding was confirmed in Experiment 2 and in Experiment 3 employing a Dot-Probe Task, demonstrating that threatening stimuli were more likely to attract the attention of conservatives. Overall, results support the conclusion that people embracing conservative views of the world display an automatic selective attention for negative stimuli. PMID:22096486

  18. A Method of Generating Indoor Map Spatial Data Automatically from Architectural Plans

    Directory of Open Access Journals (Sweden)

    SUN Weixin

    2016-06-01

    Full Text Available Taking architectural plans as the data source, we propose a method that automatically generates indoor map spatial data. First, considering the spatial data requirements of indoor maps, we analyze the basic characteristics of architectural plans and introduce the concepts of wall segment, adjoining node and adjoining wall segment, on which the basic workflow of automatic indoor map spatial data generation is established. Then, according to the adjacency between wall lines where they intersect a column, we construct a method for repairing wall connectivity at columns. Using gradual expansion and graphic reasoning to judge the local wall-symbol feature type on both sides of a door or window, and by updating the enclosing rectangle of the door or window, we develop a method for repairing wall connectivity at doors and windows, and for converting doors and windows into indoor-map point features. Finally, based on the geometric relation between the median lines of adjoining wall segments, a wall centerline extraction algorithm is presented. Taking the architectural plan of an exhibition hall as an example, experiments show that the proposed methods handle various complex situations well and realize automatic extraction of indoor map spatial data effectively.

  19. VACTIV: A graphical dialog based program for an automatic processing of line and band spectra

    Science.gov (United States)

    Zlokazov, V. B.

    2013-05-01

    ... and estimation of parameters of interest. VACTIV can run on any standard modern laptop.

    Reasons for the new version: At the time of its creation (1999), VACTIV was seemingly the first attempt to apply the newest programming languages and styles to systems of spectrum analysis. Its goal was both to provide a convenient and efficient technique for data processing and to elaborate the formalism of spectrum analysis in terms of the classes, properties, methods and events of an object-oriented programming language.

    Summary of revisions: Compared with ACTIV, VACTIV preserves all the mathematical algorithms but provides the user with all the benefits of an interface based on a graphical dialog. It allows quick intervention in the work of the program, in particular on-line control of the fitting process: depending on the intermediate results, and using the visual form of data representation, the user can change the conditions for the fitting and so achieve optimum performance by selecting the optimum strategy. To find the best conditions for the fitting, one can compress the spectrum, delete blunders from it, smooth it using a high-frequency spline filter and build the background using a low-frequency spline filter; one can use not only automatic methods for blunder deletion, peak search, peak model forming and calibration, but also manual mouse clicking on the spectrum graph.

    Restrictions: To enhance the reliability and portability of the program, the majority of the most important arrays have a static allocation; all arrays are allocated with a surplus, and the total pool of the program is restricted only by the size of the computer's virtual memory. A spectrum has a static size of 32 K real words. The maximum size of the least-squares matrix is 314 (the maximum number of fitted parameters per analyzed spectrum interval, not for the whole spectrum), from which it follows that the maximum number of peaks in one spectrum ...

  20. Standardization of GPS data processing

    International Nuclear Information System (INIS)

    Park, Pil Ho

    2001-06-01

    A nationwide GPS network with about 60 permanent GPS stations has been constructed in Korea since the late 1990s. To use GPS in a variety of application areas, such as crustal deformation, positioning, or monitoring of the upper atmosphere, it is necessary to be able to process the data precisely. The Korea Astronomy Observatory now provides the precise GPS data processing capability in Korea, because it is difficult to understand the characteristics of the parameters to be estimated, to resolve the integer ambiguities, and to analyze the many error sources. There are three reliable GPS data processing software packages in the world: Bernese (University of Berne), GIPSY-OASIS (JPL) and GAMIT (MIT). These packages allow millimeter accuracy in the horizontal position, and about 1 cm accuracy vertically, even for regional networks with a diameter of several thousand kilometers. We established a standard for GPS data processing using Bernese as the main tool and GIPSY-OASIS as a secondary tool.

  1. Gigabit Per Second Data Processing

    Data.gov (United States)

    National Aeronautics and Space Administration — Solve the existing problem of handling the on-board, real time, memory intensive processing of the Gb/s data stream of the scientific instrument. Examine and define...

  2. Study of an automatic dosing of neptunium in the industrial process of separation neptunium 237-plutonium 238

    International Nuclear Information System (INIS)

    Ros, Pierre

    1973-01-01

    The objective is to study and to adapt a method for the automatic dosing of neptunium in the industrial process of separation and purification of plutonium 238, taking information quality and economic aspects into account. After recalling some generalities on the production of plutonium 238 and the plutonium-neptunium separation process, the author addresses the dosing of neptunium. The adopted measurement technique is spectrophotometry (of neptunium and of neptunium peroxide), which is the most flexible and economical to adapt to automatic control. The author proposes a design for an automatic chemical analysis machine, and discusses the complex (stoichiometry, form) and some aspects of neptunium dosing (redox reactions, process control). [fr]

  3. [Perspectives of an electronic data processing-controlled anesthesia protocol].

    Science.gov (United States)

    Martens, G; Naujoks, B

    1987-10-01

    There are two ways to introduce electronic data processing in anesthesia recording, which should be combined in the future: (1) computer-aided data collection (during anesthesia) and (2) data analysis. Both procedures have their own advantages and disadvantages. The first step in data collection is a system whereby the on-line registered data are automatically plotted and the discrete data are noted by hand (semi-automatic recording). The second step is to keep the minutes on a display screen instead of on paper, thus producing a protocol in digital form (automatic recording). We discuss the problems of these computer-aided recording systems and future trends, in particular the problems caused by the "human-computer interface" and by uncertainty with respect to the validity of the stored data. For computer-aided data analysis of anesthesia records, one has to select appropriate data in order to build up data bases. This selection is necessary whether the protocol is in analogical or in digital form, and we attempt to develop some general rules, the concrete selection depends, of course, on the aim of the evaluation. As an example we discuss evaluations for administrative purposes. Evaluations for scientific questions are even more affected by the quality of data definitions, and the efforts involved in data management are considerably higher. At the end of this paper we sketch a hybrid information system for computer-aided anesthesia recording that combines data collection and data analysis.

  4. Fully automatic characterization and data collection from crystals of biological macromolecules

    International Nuclear Information System (INIS)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier; Bowler, Matthew W.

    2015-01-01

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention

  5. Fully automatic characterization and data collection from crystals of biological macromolecules

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier, E-mail: nurizzo@esrf.fr [European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France); Bowler, Matthew W., E-mail: nurizzo@esrf.fr [European Molecular Biology Laboratory, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); Université Grenoble Alpes–EMBL–CNRS, Grenoble Outstation, 71 Avenue des Martyrs, CS 90181, 38042 Grenoble (France); European Synchrotron Radiation Facility, 71 Avenue des Martyrs, CS 40220, 38043 Grenoble (France)

    2015-07-31

    A fully automatic system has been developed that performs X-ray centring and characterization of, and data collection from, large numbers of cryocooled crystals without human intervention. Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  6. A New Automatic Method of Urban Areas Mapping in East Asia from LANDSAT Data

    Science.gov (United States)

    XU, R.; Jia, G.

    2012-12-01

    Cities, as places where human activities are concentrated, account for a small percentage of global land cover but are frequently cited as chief drivers of, and potential solutions to, climate, biogeochemical, and hydrological processes at local, regional, and global scales. Accompanying uncontrolled economic growth, urban sprawl has been attributed to the accelerating integration of East Asia into the world economy and has involved dramatic changes in urban form and land use. To understand the impact of urban extent on biogeophysical processes, reliable mapping of built-up areas is particularly essential in East Asian cities, where urban patches are smaller and more fragmented than in the West. Segmentation of urban land from other land-cover types in remote sensing imagery can be done by standard classification processes as well as by logic rules based on spectral indices and their derivations; efforts to establish such a logic rule with no threshold, enabling automatic mapping, are highly worthwhile. Existing automatic methods are reviewed, and a proposed approach is introduced, including the calculation of a new index and an improved logic rule. The existing methods and the proposed approach are then compared in a common context, and the proposed approach is tested on cities of large, medium, and small scale in East Asia, selected from different LANDSAT images. The results are promising, as the approach can efficiently segment urban areas, even in the more complex eastern cities. Keywords: urban extraction; automatic method; logic rule; LANDSAT images; East Asia. [Figure: the proposed approach applied to extraction of urban built-up areas in Guangzhou, China.]
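
    A threshold-free logic rule in the spirit of this work can be as simple as comparing spectral indices per pixel, for example labeling a pixel built-up when NDBI exceeds NDVI, so no scene-specific threshold needs tuning. A hedged sketch with synthetic stand-ins for LANDSAT band reflectances:

      # Hedged sketch of a threshold-free index-comparison rule (NDBI > NDVI).
      # Band arrays are synthetic stand-ins, not actual LANDSAT reflectances,
      # and this generic rule is not necessarily the paper's exact formulation.
      import numpy as np

      rng = np.random.default_rng(5)
      red = rng.uniform(0.05, 0.3, (200, 200))
      nir = rng.uniform(0.1, 0.5, (200, 200))
      swir = rng.uniform(0.1, 0.5, (200, 200))

      ndvi = (nir - red) / (nir + red)       # vegetation index
      ndbi = (swir - nir) / (swir + nir)     # built-up index
      urban = ndbi > ndvi                    # logic rule: no tuned threshold
      print(f"built-up fraction: {urban.mean():.2%}")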

  7. Big Data in Market Research: Why More Data Does Not Automatically Mean Better Information

    Directory of Open Access Journals (Sweden)

    Bosch Volker

    2016-11-01

    Full Text Available Big data will change market research at its core in the long term because consumption of products and media can be logged electronically more and more, making it measurable on a large scale. Unfortunately, big data datasets are rarely representative, even if they are huge. Smart algorithms are needed to achieve high precision and prediction quality for digital and non-representative approaches. Also, big data can only be processed with complex and therefore error-prone software, which leads to measurement errors that need to be corrected. Another challenge is posed by missing but critical variables. The amount of data can indeed be overwhelming, but it often lacks important information. The missing observations can only be filled in by using statistical data imputation. This requires an additional data source with the additional variables, for example a panel. Linear imputation is a statistical procedure that is anything but trivial. It is an instrument to “transport information,” and the higher the observed data correlates with the data to be imputed, the better it works. It makes structures visible even if the depth of the data is limited.
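
    Linear imputation as described, training on a donor source that observes both the common and the missing variables and then transporting the missing variable into the big data set, can be sketched as follows; all data and the model choice are invented for illustration:

      # Hedged sketch of regression-based imputation from a donor panel into a
      # big data set missing the target variable. Data are invented.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(11)
      # Donor source (e.g., a panel): common variables X plus target variable y.
      X_panel = rng.normal(size=(500, 3))
      y_panel = X_panel @ np.array([1.5, -0.7, 0.3]) + rng.normal(0, 0.5, 500)

      model = LinearRegression().fit(X_panel, y_panel)

      # Big data set: only the common variables are observed; impute the target.
      X_big = rng.normal(size=(100000, 3))
      y_imputed = model.predict(X_big)

      # The higher the observed variables correlate with the target on the
      # donor, the better the "transported" information.
      print("donor R^2:", round(model.score(X_panel, y_panel), 2))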

  8. Data acquisition and processing for flame spectrophotometry using a programmable desk calculator

    International Nuclear Information System (INIS)

    Hurteau, M.T.; Ashley, R.W.

    1976-02-01

    A programmable calculator has been used to provide automatic data acquisition and processing for flame spectrophotometric measurements. When coupled with an automatic wavelength selector, complete automation of sample analysis is provided for one or more elements in solution. The program takes into account deviation of analytical curves from linearity. Increased sensitivity and precision over manual calculations are obtained. (author)
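
    Accounting for deviation of the analytical curve from linearity typically means fitting a low-order polynomial to the standards and inverting it for unknowns. A hedged sketch of that idea (illustrative values, not the calculator program's actual algorithm):

      # Hedged sketch: quadratic calibration curve fitted to standards, then
      # inverted for an unknown sample. Values are invented for illustration.
      import numpy as np

      conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])        # standard concentrations
      signal = np.array([0.02, 0.48, 0.90, 2.00, 3.40])  # instrument readings

      a, b, c = np.polyfit(conc, signal, 2)    # signal = a*conc^2 + b*conc + c

      def to_concentration(s):
          """Invert the quadratic calibration for a measured signal."""
          roots = np.roots([a, b, c - s])
          real = roots[np.isreal(roots)].real
          # Keep the physically meaningful root within the calibrated range.
          return real[(real >= 0) & (real <= conc.max() * 1.2)][0]

      print(to_concentration(1.5))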

  9. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    Energy Technology Data Exchange (ETDEWEB)

    Boyer, W.B.

    1979-09-01

    This report describes both the hardware and software components of an automatic calibration and signal switching system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described.

  10. Automatic calibration and signal switching system for the particle beam fusion research data acquisition facility

    International Nuclear Information System (INIS)

    Boyer, W.B.

    1979-09-01

    This report describes both the hardware and software components of an automatic calibration and signal switching system (Autocal) for the data acquisition system for the Sandia particle beam fusion research accelerators Hydra, Proto I, and Proto II. The Autocal hardware consists of off-the-shelf commercial equipment. The various hardware components, special modifications and overall system configuration are described. Special software has been developed to support the Autocal hardware. Software operation and maintenance are described

  11. Reliability Engineering for ATLAS Petascale Data Processing on the Grid

    CERN Document Server

    Golubkov, D V; The ATLAS collaboration; Vaniachine, A V

    2012-01-01

    The ATLAS detector is in its third year of continuous LHC running taking data for physics analysis. A starting point for ATLAS physics analysis is reconstruction of the raw data. First-pass processing takes place shortly after data taking, followed later by reprocessing of the raw data with updated software and calibrations to improve the quality of the reconstructed data for physics analysis. Data reprocessing involves a significant commitment of computing resources and is conducted on the Grid. The reconstruction of one petabyte of ATLAS data with 1B collision events from the LHC takes about three million core-hours. Petascale data processing on the Grid involves millions of data processing jobs. At such scales, the reprocessing must handle a continuous stream of failures. Automatic job resubmission recovers transient failures at the cost of CPU time used by the failed jobs. Orchestrating ATLAS data processing applications to ensure efficient usage of tens of thousands of CPU-cores, reliability engineering ...
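
    The failure-handling pattern described, automatic resubmission of transiently failed jobs, is commonly implemented as capped exponential backoff. A hedged, generic sketch (the job function, failure mode and limits are invented, not ATLAS production code):

      # Hedged sketch of automatic resubmission with capped exponential backoff.
      # run_job and its failure rate are simulated for illustration.
      import random
      import time

      def run_job(job_id: str) -> str:
          if random.random() < 0.3:                 # simulated transient failure
              raise RuntimeError("worker node lost")
          return f"{job_id}: done"

      def submit_with_retries(job_id: str, max_attempts: int = 5) -> str:
          for attempt in range(1, max_attempts + 1):
              try:
                  return run_job(job_id)
              except RuntimeError as err:
                  if attempt == max_attempts:
                      raise                          # permanent failure, escalate
                  delay = min(30.0, 2.0 ** attempt)  # capped exponential backoff
                  print(f"{job_id} attempt {attempt} failed ({err}); retry in {delay}s")
                  time.sleep(delay)

      print(submit_with_retries("reco_task_001"))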

  12. Data processing in radiology: Resume and prospects

    Energy Technology Data Exchange (ETDEWEB)

    Heilmann, H.P.; Tiemann, J.

    1985-12-01

    The technical aspects of radiology are particularly suitable for electronic data processing. In addition to automation of radiological apparatus and tumour registration, there are three areas in radiology particularly suitable for electronic data processing: treatment planning, dose calculations and supervision of radiotherapy techniques in radio-oncology. It can be used for word processing in the office and for documentation, both in diagnostic and therapeutic radiology, and digital techniques can be employed for image transmission, storage and manipulation. Computers for treatment planning and dose calculation are standard techniques, and suitable computers allow one to spot occasional and systematic errors during radiation treatment and to eliminate these. They also provide for the automatic generation of the required protocols. Word processors have proved particularly valuable in private practice. They are valuable for composing reports from basic text elements, but less valuable for texts that are not stereotyped. The most important developments are in digital imaging, image storage and image transmission. The storage of images on video discs, transmission through fibre-optic cables and computer manipulation of images are described, and the consequences and problems which may affect the radiologist are discussed.

  13. Data processing in radiology: Resume and prospects

    International Nuclear Information System (INIS)

    Heilmann, H.P.; Tiemann, J.

    1985-01-01

    The technical aspects of radiology are particularly suitable for electronic data processing. In addition to automation of radiological apparatus and tumour registration, there are three areas in radiology particularly suitable for electronic data processing: treatment planning, dose calculations and supervision of radiotherapy techniques in radio-oncology. It can be used for word processing in the office and for documentation, both in diagnostic and therapeutic radiology, and digital techniques can be employed for image transmission, storage and manipulation. Computers for treatment planning and dose calculation are standard techniques, and suitable computers allow one to spot occasional and systematic errors during radiation treatment and to eliminate these. They also provide for the automatic generation of the required protocols. Word processors have proved particularly valuable in private practice. They are valuable for composing reports from basic text elements, but less valuable for texts that are not stereotyped. The most important developments are in digital imaging, image storage and image transmission. The storage of images on video discs, transmission through fibre-optic cables and computer manipulation of images are described, and the consequences and problems which may affect the radiologist are discussed. (orig.)

  14. Digital Data Processing of Stilbene

    International Nuclear Information System (INIS)

    AMIRI, Moslem; MATEJ, Zdenek; MRAVEC, Filip; PRENOSIL, Vaclav; CVACHOVEC, Frantisek

    2013-06-01

    Stilbene is a proven spectrometric detector for mixed fields of neutrons and gamma rays. Digital processing of the shapes of the output pulses from the detector makes it possible to obtain information about the energy of the interacting neutron/photon and to distinguish which of these two particles interacted in the detector. Further numerical processing of the digitized data yields the energy spectrum of both components of the mixed field. The quality of the digitized data is highly dependent on the parameters of the hardware used for digitization and on the quality of the software processing. Our results also show how the quality of the particle-type identification depends on the sampling rate as well as on the method of processing the sampled data. (authors)
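
    One common way to realize such digital pulse-shape discrimination, not necessarily the method used by the authors, is the charge-comparison technique: the ratio of the slow tail integral to the total pulse integral is larger for neutrons than for gamma rays in stilbene. The gate lengths below are placeholders.

        import numpy as np

        def psd_ratio(pulse, peak_idx, short_gate=10, long_gate=60):
            """Tail-to-total charge ratio of a digitized detector pulse.
            Larger ratios indicate the slower scintillation tail of neutron events."""
            total = pulse[peak_idx:peak_idx + long_gate].sum()
            tail = pulse[peak_idx + short_gate:peak_idx + long_gate].sum()
            return tail / total if total > 0 else 0.0

    At low sampling rates the tail is represented by only a few samples, which is one reason the identification quality depends on the digitizer parameters noted above.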

  15. Automatic processing of gamma ray spectra employing classical and modified Fourier transform approach

    International Nuclear Information System (INIS)

    Rattan, S.S.; Madan, V.K.

    1994-01-01

    This report describes methods for the automatic processing of gamma-ray spectra acquired with HPGe detectors. The processing incorporated both classical and signal-processing approaches. The classical method was used for smoothing, detecting significant peaks, finding peak envelope limits (including a proposed method of finding peak limits), computing a peak significance index and full width at half maximum, and detecting doublets for further analysis. To facilitate the application of signal processing to nuclear spectra, Madan et al. gave a new classification of signals, identified nuclear spectra as Type II signals, mathematically formalized the modified Fourier transform, and pioneered its application to processing doublet envelopes acquired with modern spectrometers. This was extended to facilitate routine analysis of the spectra. A facility for energy and efficiency calibration was also included. The results obtained by analyzing observed gamma-ray spectra using the above approach compared favourably with those obtained with SAMPO and also with those derived from the table of radioisotopes. (author). 15 refs., 3 figs., 3 tabs
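
    For the classical part of such a pipeline, smoothing followed by prominence-based peak detection can be sketched as follows; the window length and significance threshold are illustrative, not the report's values.

        import numpy as np
        from scipy.signal import savgol_filter, find_peaks

        counts = np.loadtxt("spectrum.txt")  # hypothetical channel counts
        smoothed = savgol_filter(counts, window_length=7, polyorder=3)

        # Treat peaks as significant when their prominence clears a
        # Poisson-like noise estimate, in the spirit of a significance index.
        noise = np.sqrt(np.maximum(smoothed, 1.0))
        peaks, props = find_peaks(smoothed, prominence=3 * noise.mean())
        print("significant peaks at channels:", peaks)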

  16. Generating Impact Maps from Automatically Detected Bomb Craters in Aerial Wartime Images Using Marked Point Processes

    Science.gov (United States)

    Kruse, Christian; Rottensteiner, Franz; Hoberg, Thorsten; Ziems, Marcel; Rebke, Julia; Heipke, Christian

    2018-04-01

    The aftermath of wartime attacks is often felt long after the war has ended, as numerous unexploded bombs may still exist in the ground. Typically, such areas are documented in so-called impact maps, which are based on the detection of bomb craters. This paper proposes a method for the automatic detection of bomb craters in aerial wartime images taken during the Second World War. The object model for the bomb craters is represented by ellipses. A probabilistic approach based on marked point processes determines the most likely configuration of objects within the scene. New object configurations are created by randomly adding objects to and removing them from the current configuration, changing their positions, and modifying the ellipse parameters. Each configuration is evaluated using an energy function: high gradient magnitudes along the border of an ellipse are favored, and overlapping ellipses are penalized. Reversible jump Markov chain Monte Carlo sampling in combination with simulated annealing provides the global energy optimum, which describes the conformance with a predefined model. For generating the impact map, a probability map is created from the automatic detections via kernel density estimation. By setting a threshold, areas around the detections are classified as contaminated or uncontaminated sites, respectively. Our results show the general potential of the method for the automatic detection of bomb craters and the automated generation of an impact map from a heterogeneous image stock.
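
    The final mapping step, turning discrete detections into contaminated or uncontaminated areas, can be sketched with a kernel density estimate and a threshold; the coordinates, bandwidth and threshold below are invented.

        import numpy as np
        from scipy.stats import gaussian_kde

        # Hypothetical crater detections as (x, y) image coordinates, shape (2, n).
        detections = np.array([[12.0, 13.1, 80.2], [30.5, 29.8, 55.0]])

        kde = gaussian_kde(detections, bw_method=0.2)  # bandwidth is a guess

        xs, ys = np.mgrid[0:100:200j, 0:100:200j]
        density = kde(np.vstack([xs.ravel(), ys.ravel()])).reshape(xs.shape)

        # Classify areas around the detections as contaminated above a cutoff.
        contaminated = density > 0.1 * density.max()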

  17. AUTOMATIC GLOBAL REGISTRATION BETWEEN AIRBORNE LIDAR DATA AND REMOTE SENSING IMAGE BASED ON STRAIGHT LINE FEATURES

    Directory of Open Access Journals (Sweden)

    Z. Q. Liu

    2018-04-01

    Full Text Available An automatic global registration approach for point clouds and remote sensing images based on straight-line features is proposed; it is insensitive to rotational and scale transformations. First, building ridge lines and contour lines in the point clouds are automatically detected as registration primitives by integrating region growing and topology identification. Second, the collinearity condition equation is selected as the registration transformation function, based on a rotation matrix described by a unit quaternion. The similarity measure is established according to the distance between corresponding straight-line features from the point clouds and the image in the same reference coordinate system. Finally, an iterative Hough transform is adopted to simultaneously estimate the parameters and obtain correspondences between registration primitives. Experimental results show that the proposed method is valid and that the spectral information is useful for subsequent classification processing.
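
    The rotation matrix described by a unit quaternion, as used in the collinearity-based transformation above, follows a standard closed form; a short sketch:

        import numpy as np

        def quat_to_rot(q):
            """Rotation matrix from a unit quaternion q = (w, x, y, z)."""
            w, x, y, z = np.asarray(q, dtype=float) / np.linalg.norm(q)
            return np.array([
                [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
            ])

        print(quat_to_rot([1.0, 0.0, 0.0, 0.0]))  # identity rotation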

  18. Information management data base for fusion target fabrication processes

    International Nuclear Information System (INIS)

    Reynolds, J.

    1983-01-01

    A computer-based data management system has been developed to handle data associated with target fabrication processes, including glass microballoon characterization, gas filling, materials coating, and storage locations. The system provides automatic data storage and computation, flexible data entry procedures, fast access, automated report generation, and secure data transfer. It resides on a CDC CYBER 175 computer and is compatible with the CDC data base language Query Update, but is based on custom FORTRAN software interacting directly with the CYBER's file management system. The described data base maintains detailed, accurate, and readily available records of fusion target information.

  19. STADUS - Ultrasonic data acquisition and processing system

    International Nuclear Information System (INIS)

    Saglio, Robert; Birac, A.M.; Frappier, J.C.

    1982-05-01

    The CEA (Commissariat a l'Energie Atomique) has developed a system for the acquisition and analysis of data recorded during ultrasonic testing. Initially this system was designed and built for the needs of in-service inspection of PWR-type power reactors. It is in far wider use today for miscellaneous automatic ultrasonic inspection procedures. This system records, in digital form, the ultrasonic data supplied by the transducers (maximum 16 simultaneous channels) and the geometric coordinates defining the position of the inspection tool. Based on these data, which are recorded on floppy disk, the system can display the results in the form of A-scan, B-scan and C-scan images. In addition, programs for processing and transferring data from the STADUS floppy disks have been developed and installed on computers more powerful than the one used in the STADUS system. These programs serve to obtain various fault charts on an adjustable scale, as well as listings of defect positions and dimensions.

  20. Adjustment of automatic control systems of production facilities at coal processing plants using multivariant physico- mathematical models

    Science.gov (United States)

    Evtushenko, V. F.; Myshlyaev, L. P.; Makarov, G. V.; Ivushkin, K. A.; Burkova, E. V.

    2016-10-01

    The structure of multi-variant physical and mathematical models of a control system is presented, together with its application to the adjustment of automatic control systems (ACS) of production facilities, using a coal processing plant as an example.

  1. Data base structure and Management for Automatic Calculation of 210Pb Dating Methods Applying Different Models

    International Nuclear Information System (INIS)

    Gasco, C.; Anton, M. P.; Ampudia, J.

    2003-01-01

    The introduction of macros into the calculation sheets allows the automatic application of various dating models using unsupported 210Pb data from a data base. The calculation books that contain the models have been modified to permit the implementation of these macros. The Marine and Aquatic Radioecology group of CIEMAT (MARG) will be involved in new European projects, so new models have been developed. This report contains a detailed description of: a) the newly implemented macros, b) the design of a dating menu in the calculation sheet, and c) the organization and structure of the data base. (Author) 4 refs
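
    As one example of the dating models such macros implement, the constant rate of supply (CRS) model assigns ages from the cumulative unsupported 210Pb inventory below each depth; the sketch below ignores the dry-mass weighting of the inventory for brevity, and the activity profile is invented.

        import numpy as np

        LAMBDA_PB210 = np.log(2) / 22.3  # decay constant (half-life 22.3 y)

        def crs_ages(unsupported_activity):
            """CRS ages t(z) = ln(A0 / A(z)) / lambda, where A(z) is the
            unsupported 210Pb inventory below depth index z."""
            inventory_below = np.cumsum(unsupported_activity[::-1])[::-1]
            return np.log(inventory_below[0] / inventory_below) / LAMBDA_PB210

        profile = np.array([52.0, 38.0, 25.0, 14.0, 6.0, 2.0])  # Bq/kg, made up
        print(crs_ages(profile))  # years before coring, by layer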

  2. Performing the processing required for automatically get a PDF/A version of the CERN Library documentation

    CERN Document Server

    Molina Garcia-Retamero, Antonio

    2015-01-01

    The aim of the project was to perform the processing required to automatically produce a PDF/A version of the CERN Library documentation. For this, it is necessary to extract as much metadata as possible from the source files and to inject the required data into the original source files, creating new ones ready for compilation with all related dependencies. Besides, I have proposed the creation of an HTML version consistent with the PDF and navigable for easy access; I have been experimenting with Natural Language Processing for extracting metadata; and I have proposed the injection of the CERN Library documentation into the HTML version of the long write-ups where it is referenced (for instance, when a CERN Library function is referenced in a sample code). Finally, I designed and implemented a Graphical User Interface in order to simplify the process for the user.

  3. VISA: AN AUTOMATIC AWARE AND VISUAL AIDS MECHANISM FOR IMPROVING THE CORRECT USE OF GEOSPATIAL DATA

    Directory of Open Access Journals (Sweden)

    J. H. Hong

    2016-06-01

    Full Text Available With the fast growth of internet-based sharing mechanisms and OpenGIS technology, users nowadays enjoy the luxury of quickly locating and accessing a variety of geospatial data for the tasks at hand. While this sharing innovation tremendously expands the possibilities of application and reduces development cost, users nevertheless have to deal with all kinds of “differences” implicitly hidden behind the acquired georesources. We argue that the next generation of GIS-based environments, whether internet-based or not, must have built-in knowledge to automatically and correctly assess the fitness of data use and present the analyzed results to users in an intuitive and meaningful way. The VISA approach proposed in this paper refers to four different types of visual aids that can be used for presenting analyzed results, namely, virtual layer, informative window, symbol transformation and augmented TOC. The VISA-enabled interface works in an automatic-aware fashion: the standardized metadata serve as the known facts about the selected geospatial resources, algorithms for analyzing the differences in temporality and quality of the geospatial resources were designed, and the transformation of analyzed results into visual aids is executed automatically. It successfully presents a new way of bridging the communication gaps between systems and users. GIS has long been seen as a powerful integration tool, but its achievements would be highly restricted if it failed to provide a friendly and correct working platform.

  4. Automatic Mexico Gulf Oil Spill Detection from Radarsat-2 SAR Satellite Data Using Genetic Algorithm

    Science.gov (United States)

    Marghany, Maged

    2016-10-01

    In this work, a genetic algorithm is exploited for the automatic detection of oil spills of small and large size. Detection is performed on arrays of RADARSAT-2 SAR ScanSAR Narrow single-beam data acquired in the Gulf of Mexico. The study shows that the genetic algorithm automatically segmented the dark-spot patches related to small and large oil-spill pixels. This conclusion is confirmed by the receiver operating characteristic (ROC) curve and by documented ground data. The ROC curve indicates that the existence of oil-slick footprints can be identified with an area of 90% between the ROC curve and the no-discrimination line, which is greater than that of the other surrounding environmental features. Small oil spills represented 30% of the discriminated oil-spill pixels in the ROC analysis. In conclusion, the genetic algorithm can be used as a tool for the automatic detection of oil spills of either small or large size, and the ScanSAR Narrow single-beam mode serves as an excellent sensor for detecting and surveying oil-spill patterns in the Gulf of Mexico.
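
    The ROC figure of merit quoted above can be reproduced for any labeled segmentation with a few lines; the labels and scores below are dummy values, not the RADARSAT-2 results.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        # Hypothetical per-pixel ground truth (1 = oil spill) and classifier scores.
        y_true = np.array([1, 1, 0, 0, 1, 0, 0, 1])
        y_score = np.array([0.91, 0.85, 0.30, 0.12, 0.78, 0.44, 0.05, 0.88])

        # Area under the ROC curve; the study reports about 0.90 for oil slicks.
        print(roc_auc_score(y_true, y_score))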

  5. Intentional and automatic processing of numerical information in mathematical anxiety: testing the influence of emotional priming.

    Science.gov (United States)

    Ashkenazi, Sarit

    2018-02-05

    Current theoretical approaches suggest that mathematical anxiety (MA) manifests itself as a weakness in quantity manipulations. This study is the first to examine automatic versus intentional processing of numerical information using the numerical Stroop paradigm in participants with high MA. To manipulate anxiety levels, we combined the numerical Stroop task with an affective priming paradigm. We took a group of college students with high MA and compared their performance to a group of participants with low MA. Under low anxiety conditions (neutral priming), participants with high MA showed relatively intact number processing abilities. However, under high anxiety conditions (mathematical priming), participants with high MA showed (1) higher processing of the non-numerical irrelevant information, which aligns with the theoretical view regarding deficits in selective attention in anxiety and (2) an abnormal numerical distance effect. These results demonstrate that abnormal, basic numerical processing in MA is context related.

  6. Automatic sample changer control software for automation of neutron activation analysis process in Malaysian Nuclear Agency

    Science.gov (United States)

    Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.

    2018-01-01

    Most of the procedures in the neutron activation analysis (NAA) process that has been established in the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s were performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient, especially the sample counting and measurement process: the sample needs to be changed and the measurement software needs to be set up for every one-hour counting time, and both of these steps are performed manually for every sample. Hence, an automatic sample changer system (ASC) consisting of hardware and software was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.

  7. Information processing requirements for on-board monitoring of automatic landing

    Science.gov (United States)

    Sorensen, J. A.; Karmarkar, J. S.

    1977-01-01

    A systematic procedure is presented for determining the information processing requirements for on-board monitoring of automatic landing systems. The monitoring system detects landing anomalies through use of appropriate statistical tests. The time-to-correct aircraft perturbations is determined from covariance analyses using a sequence of suitable aircraft/autoland/pilot models. The covariance results are used to establish landing safety and a fault recovery operating envelope via an event outcome tree. This procedure is demonstrated with examples using the NASA Terminal Configured Vehicle (B-737 aircraft). The procedure can also be used to define decision height, assess monitoring implementation requirements, and evaluate alternate autoland configurations.

  8. Age-related differences in the automatic processing of single letters: implications for selective attention.

    Science.gov (United States)

    Daffner, Kirk R; Alperin, Brittany R; Mott, Katherine K; Holcomb, Phillip J

    2014-01-22

    Older adults exhibit diminished ability to inhibit the processing of visual stimuli that are supposed to be ignored. The extent to which age-related changes in early visual processing contribute to impairments in selective attention remains to be determined. Here, 103 adults, 18-85 years of age, completed a color selective attention task in which they were asked to attend to a specified color and respond to designated target letters. An optimal approach would be to initially filter according to color and then process letter forms in the attend color to identify targets. An asymmetric N170 ERP component (larger amplitude over left posterior hemisphere sites) was used as a marker of the early automatic processing of letter forms. Young and middle-aged adults did not generate an asymmetric N170 component. In contrast, young-old and old-old adults produced a larger N170 over the left hemisphere. Furthermore, older adults generated a larger N170 to letter than nonletter stimuli over the left, but not right hemisphere. More asymmetric N170 responses predicted greater allocation of late selection resources to target letters in the ignore color, as indexed by P3b amplitude. These results suggest that unlike their younger counterparts, older adults automatically process stimuli as letters early in the selection process, when it would be more efficient to attend to color only. The inability to ignore letters early in the processing stream helps explain the age-related increase in subsequent processing of target letter forms presented in the ignore color.

  9. Algorithms and programs for processing of satellite data on ozone layer and UV radiation levels

    International Nuclear Information System (INIS)

    Borkovskij, N.B.; Ivanyukovich, V.A.

    2012-01-01

    Some algorithms and programs for the automatic retrieval and processing of ozone-layer satellite data are discussed. These techniques are used for reliable short-term forecasting of UV-radiation levels. (authors)

  10. Processing Terrain Point Cloud Data

    KAUST Repository

    DeVore, Ronald

    2013-01-10

    Terrain point cloud data are typically acquired through some form of Light Detection And Ranging sensing. They form a rich resource that is important in a variety of applications including navigation, line of sight, and terrain visualization. Processing terrain data has not received the same attention as other forms of surface reconstruction or image processing. The goal of terrain data processing is to convert the point cloud into a succinct representation system that is amenable to the various application demands. The present paper presents a platform for terrain processing built on the following principles: (i) measuring distortion in the Hausdorff metric, which we argue is a good match for the application demands, and (ii) a multiscale representation based on tree approximation using local polynomial fitting. The basic elements held in the nodes of the tree can be efficiently encoded, transmitted, visualized, and utilized for the various target applications. Several challenges emerge because of the variable resolution of the data, missing data, occlusions, and noise. Techniques for identifying and handling these challenges are developed. © 2013 Society for Industrial and Applied Mathematics.
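
    Measuring distortion in the Hausdorff metric, the first principle above, can be computed directly for two point sets; the point cloud below is synthetic.

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        original = np.random.rand(1000, 3)  # stand-in terrain point cloud
        simplified = original[::10]         # coarse approximation, e.g. from tree pruning

        # Symmetric Hausdorff distance = max of the two directed distances.
        d = max(directed_hausdorff(original, simplified)[0],
                directed_hausdorff(simplified, original)[0])
        print(f"Hausdorff distortion of the approximation: {d:.4f}")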

  11. Oscillatory brain dynamics associated with the automatic processing of emotion in words.

    Science.gov (United States)

    Wang, Lin; Bastiaansen, Marcel

    2014-10-01

    This study examines the automaticity of processing the emotional aspects of words, and characterizes the oscillatory brain dynamics that accompany this automatic processing. Participants read emotionally negative, neutral and positive nouns while performing a color detection task in which only perceptual-level analysis was required. Event-related potentials and time-frequency representations were computed from the concurrently measured EEG. Negative words elicited a larger P2 and a larger late positivity than positive and neutral words, indicating deeper semantic/evaluative processing of negative words. In addition, sustained alpha power suppressions were found for the emotional compared to neutral words, in the time range from 500 to 1000 ms post-stimulus. These results suggest that sustained attention was allocated to the emotional words, whereas the attention allocated to the neutral words was released after an initial analysis. This seems to hold even when the emotional content of the words is task-irrelevant. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Communicating Processes with Data for Supervisory Coordination

    Directory of Open Access Journals (Sweden)

    Jasen Markovski

    2012-08-01

    Full Text Available We employ supervisory controllers to safely coordinate high-level discrete(-event) behavior of distributed components of complex systems. Supervisory controllers observe discrete-event system behavior, make a decision on allowed activities, and communicate the control signals to the involved parties. Models of the supervisory controllers can be automatically synthesized based on formal models of the system components and a formalization of the safe coordination (control) requirements. Based on the obtained models, code generation can be used to implement the supervisory controllers in software, on a PLC, or on an embedded (micro)processor. In this article, we develop a process theory with data that supports a model-based systems engineering framework for supervisory coordination. We employ communication to distinguish between the different flows of information, i.e., observation and supervision, whereas we employ data to specify the coordination requirements more compactly and to increase the expressivity of the framework. To illustrate the framework, we remodel an industrial case study involving coordination of maintenance procedures of the printing process of a high-tech Océ printer.

  13. RESEARCH ON THE CONSTRUCTION OF REMOTE SENSING AUTOMATIC INTERPRETATION SYMBOL BIG DATA

    Directory of Open Access Journals (Sweden)

    Y. Gao

    2018-04-01

    Full Text Available Remote sensing automatic interpretation symbol (RSAIS) data are an inexpensive and fast means of providing precise in-situ information for image interpretation and accuracy assessment. This study designed a scientific and precise RSAIS data characterization method, as well as a distributed, cloud-architecture massive data storage method. Additionally, it introduced offline and online data update modes and a dynamic data evaluation mechanism, with the aim of creating an efficient approach to RSAIS big data construction. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013–2015 based on the National Geographic Conditions Monitoring Project of China; it has been updated annually since 2016. The RSAIS big data has proven to be a good resource for large-scale image interpretation and field validation. It is also notable that it has the potential to support automatic image interpretation with the assistance of deep learning technology in the remote sensing big data era.

  14. Type-assisted automatic garbage collection for lock-free data structures

    OpenAIRE

    Yang, Albert Mingkun; Wrigstad, Tobias

    2017-01-01

    We introduce Isolde, an automatic garbage collection scheme designed specifically for managing memory in lock-free data structures, such as stacks, lists, maps and queues. Isolde exists as a plug-in memory manager, designed to sit on top of another memory manager and use its allocator and reclaimer (if one exists). Isolde treats a lock-free data structure as a logical heap, isolated from the rest of the program. This allows garbage collection outside of Isolde to take place without affecting th...

  15. Data Quality Monitoring : Automatic MOnitoRing Environment (AMORE ) Web Administration Tool in ALICE Experiment

    CERN Document Server

    Nagi, Imre

    2013-01-01

    ALICE (A Large Ion Collider Experiment) is the heavy-ion detector designed to study the physics of strongly interacting matter and the quark-gluon plasma at the CERN Large Hadron Collider (LHC). The quality of the acquired data evolves over time depending on the status of the detectors, their components and the operating environment. To get excellent performance from the detector, all detector configurations have to be set correctly so that data-taking can be done in an optimal way. This report describes a new implementation, using web technologies, of the administration tools of ALICE's DQM framework, called AMORE (Automatic MonitoRing Environment).

  16. Sensitometric characteristics of D-, E- and F-speed dental radiographic films in manual and automatic processing

    Directory of Open Access Journals (Sweden)

    Jahangir Haghani

    2012-12-01

    Full Text Available BACKGROUND: The purpose of this study was to evaluate the sensitometric characteristics of Ultraspeed, Ektaspeed Plus and Insight dental radiographic films using manual and automatic processing systems. METHODS: In this experimental in vitro study, an aluminum step-wedge was used to construct characteristic curves for D-, E- and F-speed radiographic films (Kodak Eastman, Rochester, USA). All films were processed in an Iranian processing solution (Chemical Industries Co., Tehran, Iran), both manually and automatically, over a period of six days. Unexposed films of the three types were processed manually and automatically to determine base-plus-fog density. Speed and film contrast were measured according to the International Standards Organization definition. RESULTS: There was a significant difference in density obtained with the D-, E- and F-speed films in both the manual and automatic processing systems (P < 0.001). There was a significant difference in density obtained with the Ultraspeed and Insight films. There was no significant difference in contrast obtained with the D-, E- and F-speed films in either processing system (P = 0.255, P = 0.26). There was a significant difference in speed obtained with the D-, E- and F-speed films in both processing systems (P = 0.034, P = 0.04). CONCLUSIONS: The choice of processing system can affect radiographic characteristics. The F-speed film has greater speed when processed in the automatic system than in the manual processing system, and it provides a further reduction in radiation exposure without detriment to image quality.

  17. Fast data processing with Spark

    CERN Document Server

    Sankar, Krishna

    2015-01-01

    Fast Data Processing with Spark - Second Edition is for software developers who want to learn how to write distributed programs with Spark. It will help developers who have had problems that were too big to be dealt with on a single computer. No previous experience with distributed programming is necessary. This book assumes knowledge of either Java, Scala, or Python.

  18. Automatable algorithms to identify nonmedical opioid use using electronic data: a systematic review.

    Science.gov (United States)

    Canan, Chelsea; Polinski, Jennifer M; Alexander, G Caleb; Kowal, Mary K; Brennan, Troyen A; Shrank, William H

    2017-11-01

    Improved methods to identify nonmedical opioid use can help direct health care resources to individuals who need them. Automated algorithms that use large databases of electronic health care claims or records for surveillance are a potential means to achieve this goal. In this systematic review, we reviewed the utility, attempts at validation, and application of such algorithms to detect nonmedical opioid use. We searched PubMed and Embase for articles describing automatable algorithms that used electronic health care claims or records to identify patients or prescribers with likely nonmedical opioid use. We assessed algorithm development, validation, and performance characteristics and the settings where they were applied. Study variability precluded a meta-analysis. Of 15 included algorithms, 10 targeted patients, 2 targeted providers, 2 targeted both, and 1 identified medications with high abuse potential. Most patient-focused algorithms (67%) used prescription drug claims and/or medical claims, with diagnosis codes of substance abuse and/or dependence as the reference standard. Eleven algorithms were developed via regression modeling. Four used natural language processing, data mining, audit analysis, or factor analysis. Automated algorithms can facilitate population-level surveillance. However, there is no true gold standard for determining nonmedical opioid use. Users must recognize the implications of identifying false positives and, conversely, false negatives. Few algorithms have been applied in real-world settings. Automated algorithms may facilitate identification of patients and/or providers most likely to need more intensive screening and/or intervention for nonmedical opioid use. Additional implementation research in real-world settings would clarify their utility. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  19. GIS Data Based Automatic High-Fidelity 3D Road Network Modeling

    Science.gov (United States)

    Wang, Jie; Shen, Yuzhong

    2011-01-01

    3D road models are widely used in many computer applications such as racing games and driving simulations. However, almost all high-fidelity 3D road models have been generated manually by professional artists at the expense of intensive labor. There are very few existing methods for automatically generating high-fidelity 3D road networks, especially for roads existing in the real world. This paper presents a novel approach that can automatically produce high-fidelity 3D road network models from real 2D road GIS data that mainly contain road centerline information. The proposed method first builds parametric representations of the road centerlines through segmentation and fitting. A basic set of civil engineering rules for road design (e.g., cross slope, superelevation, grade) is then selected in order to generate realistic road surfaces in compliance with these rules. While the proposed method applies to any type of road, this paper mainly addresses the automatic generation of complex traffic interchanges and intersections, which are the most sophisticated elements in road networks.

  20. Data-driven automatic parking constrained control for four-wheeled mobile vehicles

    Directory of Open Access Journals (Sweden)

    Wenxu Yan

    2016-11-01

    Full Text Available In this article, a novel data-driven constrained control scheme is proposed for automatic parking systems. The design of the proposed scheme depends only on the steering angle and the orientation angle of the car, and it does not involve any model information of the car; the proposed scheme-based automatic parking system is therefore applicable to different kinds of cars. In order to further reduce the tracking errors of the desired trajectory coordinates, a coordinate compensation algorithm is also proposed. In the design procedure of the controller, a novel dynamic anti-windup compensator is used to deal with the magnitude and rate saturations of the automatic parking control input. It is theoretically proven, using the Lyapunov stability analysis method, that all the signals in the closed-loop system are uniformly ultimately bounded. Finally, a simulation comparison between the proposed scheme with coordinate compensation and a proportional-integral-derivative (PID) control algorithm is given. It is shown that the proposed scheme with coordinate compensation has smaller tracking errors and more rapid responses than the PID scheme.
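
    The paper's dynamic anti-windup compensator is its own contribution; as background, the classical back-calculation form that such compensators generalize can be sketched as follows, with illustrative gains and actuator limits.

        class PIDAntiWindup:
            """PID controller with magnitude saturation and back-calculation
            anti-windup; all parameters here are illustrative."""

            def __init__(self, kp, ki, kd, u_min, u_max, kb, dt):
                self.kp, self.ki, self.kd = kp, ki, kd
                self.u_min, self.u_max, self.kb, self.dt = u_min, u_max, kb, dt
                self.integral, self.prev_err = 0.0, 0.0

            def step(self, err):
                deriv = (err - self.prev_err) / self.dt
                u_raw = self.kp * err + self.ki * self.integral + self.kd * deriv
                u = min(self.u_max, max(self.u_min, u_raw))  # magnitude saturation
                # Back-calculation: bleed the integrator while the actuator
                # saturates, preventing windup of the steering command.
                self.integral += (err + self.kb * (u - u_raw)) * self.dt
                self.prev_err = err
                return u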

  1. Automatic assessment of functional health decline in older adults based on smart home data.

    Science.gov (United States)

    Alberdi Aramendi, Ane; Weakley, Alyssa; Aztiria Goenaga, Asier; Schmitter-Edgecombe, Maureen; Cook, Diane J

    2018-05-01

    In the context of an aging population, tools must be developed to help elderly people live independently. The goal of this paper is to evaluate the possibility of using unobtrusively collected, activity-aware smart home behavioral data to automatically detect one of the most common consequences of aging: functional health decline. After gathering longitudinal smart home data from 29 older adults for an average of more than 2 years, we automatically labeled the data with corresponding activity classes and extracted time-series statistics containing 10 behavioral features. Using these data, we created regression models to predict absolute and standardized functional health scores, as well as classification models to detect reliable absolute change and positive and negative fluctuations in everyday functioning. Functional health was assessed every six months by means of the Instrumental Activities of Daily Living-Compensation (IADL-C) scale. Results show that the total IADL-C score and subscores can be predicted by means of activity-aware smart home data, as can reliable changes in these scores. Positive and negative fluctuations in everyday functioning are harder to detect using in-home behavioral data, yet changes in social skills have been shown to be predictable. Future work must focus on improving the sensitivity of the presented models and performing an in-depth feature selection to improve overall accuracy. Copyright © 2018 Elsevier Inc. All rights reserved.

  2. Feasibility of Automatic Extraction of Electronic Health Data to Evaluate a Status Epilepticus Clinical Protocol.

    Science.gov (United States)

    Hafeez, Baria; Paolicchi, Juliann; Pon, Steven; Howell, Joy D; Grinspan, Zachary M

    2016-05-01

    Status epilepticus is a common neurologic emergency in children. Pediatric medical centers often develop protocols to standardize care. Widespread adoption of electronic health records by hospitals affords the opportunity for clinicians to rapidly and electronically evaluate protocol adherence. We reviewed the clinical data of a small sample of 7 children with status epilepticus, in order to (1) qualitatively determine the feasibility of automated data extraction and (2) demonstrate a timeline-style visualization of each patient's first 24 hours of care. Qualitatively, our observations indicate that most clinical data are well labeled in structured fields within the electronic health record, though some important information, particularly electroencephalography (EEG) data, may require manual abstraction. We conclude that a visualization that clarifies a patient's clinical course can be automatically created using the patient's electronic clinical data, supplemented with some manually abstracted data. Future work could use this timeline to evaluate adherence to status epilepticus clinical protocols. © The Author(s) 2015.

  3. A framework for automatic feature extraction from airborne light detection and ranging data

    Science.gov (United States)

    Yan, Jianhua

    Recent advances in airborne Light Detection and Ranging (LIDAR) technology allow rapid and inexpensive measurements of topography over large areas. Airborne LIDAR systems usually return a 3-dimensional cloud of point measurements from reflective objects scanned by the laser beneath the flight path. This technology is becoming a primary method for extracting information about different kinds of geometrical objects, such as high-resolution digital terrain models (DTMs), buildings and trees. In the past decade, LIDAR has attracted more and more interest from researchers in the fields of remote sensing and GIS. Compared to traditional data sources, such as aerial photography and satellite images, LIDAR measurements are not influenced by sun shadow and relief displacement. However, the voluminous data pose a new challenge for the automated extraction of geometrical information, because many raster image processing techniques cannot be directly applied to irregularly spaced LIDAR points. In this dissertation, a framework is proposed to extract information about different kinds of geometrical objects, such as terrain and buildings, from LIDAR data automatically. These are essential to numerous applications such as flood modeling, landslide prediction and hurricane animation. The framework consists of several intuitive algorithms. First, a progressive morphological filter was developed to detect non-ground LIDAR measurements. By gradually increasing the window size and elevation difference threshold of the filter, the measurements of vehicles, vegetation, and buildings are removed, while ground data are preserved. Then, building measurements are identified from non-ground measurements using a region-growing algorithm based on the plane-fitting technique. Raw footprints for segmented building measurements are derived by connecting boundary points and are further simplified and adjusted by several proposed operations to remove noise, which is caused by irregularly
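
    A compact sketch of the progressive morphological filtering idea, applied to a gridded surface model with invented window sizes and elevation thresholds (the dissertation operates on irregular LIDAR points, which would require gridding first):

        import numpy as np
        from scipy.ndimage import grey_opening

        def progressive_morphological_filter(dsm, windows=(3, 5, 9, 17),
                                             dh_thresholds=(0.3, 0.5, 1.0, 2.0)):
            """Flag ground cells of a gridded surface model (True = ground).
            Window sizes grow and elevation-difference thresholds relax so that
            vehicles, vegetation and buildings are removed in turn."""
            ground = np.ones(dsm.shape, dtype=bool)
            surface = dsm.astype(float).copy()
            for w, dh in zip(windows, dh_thresholds):
                opened = grey_opening(surface, size=(w, w))
                ground &= (surface - opened) <= dh  # big residuals = non-ground
                surface = opened
            return ground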

  4. Automatic processes in at-risk adolescents: the role of alcohol-approach tendencies and response inhibition in drinking behavior.

    Science.gov (United States)

    Peeters, Margot; Wiers, Reinout W; Monshouwer, Karin; van de Schoot, Rens; Janssen, Tim; Vollebergh, Wilma A M

    2012-11-01

    This study examined the association between automatic processes and drinking behavior in relation to individual differences in response inhibition in young adolescents who had just started drinking. It was hypothesized that strong automatic behavioral tendencies toward alcohol-related stimuli (an alcohol-approach bias) would be associated with higher levels of alcohol use, especially among adolescents with relatively weak inhibition skills. To test this hypothesis, structural equation analyses (SEM) were performed using a zero-inflated Poisson (ZIP) model. A well-known problem in studying risk behavior is the low incidence rate, which results in a zero-dominated distribution; a ZIP model accounts for this non-normality of the data. Adolescents were selected from secondary Special Education schools (a risk group for the development of substance use problems). Participants were 374 adolescents (mean age M = 13.6 years). Adolescents completed the alcohol approach avoidance task (a-AAT), the Stroop colour naming task (Stroop) and a questionnaire that assessed alcohol use. The ZIP model established stronger alcohol-approach tendencies for adolescent drinkers, indicating that automatic processes are associated with the drinking behavior of young, at-risk adolescents. It appears that alcohol-approach tendencies are formed shortly after the initiation of drinking and particularly affect the drinking behavior of adolescents with relatively weak inhibition skills. Implications for the prevention of problem drinking in adolescents are discussed. © 2012 The Authors. Addiction © 2012 Society for the Study of Addiction.
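
    A zero-inflated Poisson regression of the kind used here can be reproduced with statsmodels; the simulated data below (an approach-bias score predicting weekly drinks, with structural zeros for abstainers) are purely illustrative.

        import numpy as np
        import statsmodels.api as sm
        from statsmodels.discrete.count_model import ZeroInflatedPoisson

        rng = np.random.default_rng(0)
        approach_bias = rng.normal(size=200)
        drinks = rng.poisson(np.exp(0.2 + 0.5 * approach_bias))
        drinks[rng.random(200) < 0.5] = 0  # structural zeros: non-drinkers

        X = sm.add_constant(approach_bias)
        zip_model = ZeroInflatedPoisson(drinks, X, exog_infl=np.ones((200, 1)))
        print(zip_model.fit(disp=False).summary())

    The inflation component models the excess zeros separately from the count process, which is why such models suit zero-dominated drinking data.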

  5. Automatical and accurate segmentation of cerebral tissues in fMRI dataset with combination of image processing and deep learning

    Science.gov (United States)

    Kong, Zhenglun; Luo, Junyi; Xu, Shengpu; Li, Ting

    2018-02-01

    Image segmentation plays an important role in medical science. One application is multimodality imaging, especially the fusion of structural imaging with functional imaging, which includes CT, MRI and new types of imaging technology such as optical imaging to obtain functional images. The fusion process requires precisely extracted structural information in order to register the images. Here we used image enhancement and morphometry methods to extract accurate contours of different tissues, such as skull, cerebrospinal fluid (CSF), grey matter (GM) and white matter (WM), on 5 fMRI head image datasets. We then utilized a convolutional neural network to realize automatic segmentation of the images in a deep learning way. This approach greatly reduced the processing time compared to manual and semi-automatic segmentation, and is of great importance in improving speed and accuracy as more and more samples are learned. The contours of the borders of different tissues on all images were accurately extracted and visualized in 3D. This can be used in low-level light therapy and in optical simulation software such as MCVM. We obtained a precise three-dimensional distribution of the brain, which offers doctors and researchers quantitative volume data and detailed morphological characterization for personalized precision medicine of cerebral atrophy/expansion. We hope this technique can bring convenience to medical visualization and personalized medicine.

  6. Leveraging Parallel Data Processing Frameworks with Verified Lifting

    Directory of Open Access Journals (Sweden)

    Maaz Bin Safeer Ahmad

    2016-11-01

    Full Text Available Many parallel data frameworks have been proposed in recent years that let sequential programs access parallel processing. To capitalize on the benefits of such frameworks, existing code must often be rewritten in the domain-specific languages that each framework supports. This rewriting, tedious and error-prone, also requires developers to choose the framework that best optimizes performance for a given workload. This paper describes Casper, a novel compiler that automatically retargets sequential Java code for execution on Hadoop, a parallel data processing framework that implements the MapReduce paradigm. Given a sequential code fragment, Casper uses verified lifting to infer a high-level summary expressed in our program specification language, which is then compiled for execution on Hadoop. We demonstrate that Casper automatically translates Java benchmarks into Hadoop. The translated results execute on average 3.3x faster than the sequential implementations and also scale better to larger datasets.

  7. Data from configuration management tools as sources for software process mining

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Samalik, J.; Weijters, A.J.M.M.; Lavazza, L; Oberhauser, R; Martin, A; Hassine, J; Gebhart, M; Jäntti, M

    2013-01-01

    Process mining has proven to be a valuable approach that provides new and objective insights into processes within organizations. Based on sets of well-structured data, the underlying ‘actual’ processes can be extracted and process models can be constructed automatically, i.e., the process model can

  8. Automatic two- and three-dimensional mesh generation based on fuzzy knowledge processing

    Science.gov (United States)

    Yagawa, G.; Yoshimura, S.; Soneda, N.; Nakao, K.

    1992-09-01

    This paper describes the development of a novel automatic FEM mesh generation algorithm based on the fuzzy knowledge processing technique. A number of local nodal patterns are stored in a nodal pattern database of the mesh generation system. These nodal patterns are determined a priori based on certain theories or the past experience of experts in FEM analyses. For example, such human experts can determine nodal patterns suitable for stress concentration analyses of cracks, corners, holes and so on. Each nodal pattern possesses a membership function and a procedure for node placement according to this function. In the case of nodal patterns for stress concentration regions, the membership function utilized in the fuzzy knowledge processing has two meanings, i.e. the “closeness” of a nodal location to each stress concentration field as well as the “nodal density”. This is attributed to the fact that a denser nodal pattern is required near a stress concentration field. All a user has to do in a practical mesh generation process is to choose several local nodal patterns properly and to designate the maximum nodal density of each pattern. After these simple operations by the user, the system automatically places the chosen nodal patterns in the analysis domain and on its boundary, and connects them smoothly by the fuzzy knowledge processing technique. Triangular or tetrahedral elements are then generated by means of the advancing front method. The key feature of the present algorithm is the easy control of complex two- or three-dimensional nodal density distributions by means of the fuzzy knowledge processing technique. To demonstrate the fundamental performance of the present algorithm, a prototype system was constructed with an object-oriented language, Smalltalk-80, on a 32-bit microcomputer, Macintosh II. Mesh generation for several two- and three-dimensional domains with cracks, holes and junctions is presented as an example.

  9. Automatic Prompt System in the Process of Mapping plWordNet on Princeton WordNet

    Directory of Open Access Journals (Sweden)

    Paweł Kędzia

    2015-06-01

    Full Text Available The paper offers a critical evaluation of the power and usefulness of an automatic prompt system based on the extended Relaxation Labelling algorithm in the process of (manual) mapping plWordNet onto Princeton WordNet. To this end, the results of manual mapping, that is, inter-lingual relations between plWN and PWN synsets, are juxtaposed with the automatic prompts that were generated for the source-language synsets to be mapped. We check the number and type of inter-lingual relations introduced on the basis of automatic prompts and the distance of the respective prompt synsets from the actual target-language synsets.

  10. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    Science.gov (United States)

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA

  11. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique

    Science.gov (United States)

    2015-01-01

    Background DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. Results We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. Conclusions This work presents an

  12. Processing TES Level-2 Data

    Science.gov (United States)

    Poosti, Sassaneh; Akopyan, Sirvard; Sakurai, Regina; Yun, Hyejung; Saha, Pranjit; Strickland, Irina; Croft, Kevin; Smith, Weldon; Hoffman, Rodney; Koffend, John

    2006-01-01

    TES Level 2 Subsystem is a set of computer programs that performs functions complementary to those of the program summarized in the immediately preceding article. TES Level-2 data pertain to retrieved species (or temperature) profiles, and errors thereof. Geolocation, quality, and other data (e.g., surface characteristics for nadir observations) are also included. The subsystem processes gridded meteorological information and extracts parameters that can be interpolated to the appropriate latitude, longitude, and pressure level based on the date and time. Radiances are simulated using the aforementioned meteorological information for initial guesses, and spectroscopic-parameter tables are generated. At each step of the retrieval, a nonlinear-least-squares- solving routine is run over multiple iterations, retrieving a subset of atmospheric constituents, and error analysis is performed. Scientific TES Level-2 data products are written in a format known as Hierarchical Data Format Earth Observing System 5 (HDF-EOS 5) for public distribution.

  13. Automatic design of decision-tree induction algorithms tailored to flexible-receptor docking data.

    Science.gov (United States)

    Barros, Rodrigo C; Winck, Ana T; Machado, Karina S; Basgalupp, Márcio P; de Carvalho, André C P L F; Ruiz, Duncan D; de Souza, Osmar Norberto

    2012-11-21

    This paper addresses the prediction of the free energy of binding of a drug candidate with the enzyme InhA associated with Mycobacterium tuberculosis. This problem arises within rational drug design, where interactions between drug candidates and target proteins are verified through molecular docking simulations. In this application, it is important not only to correctly predict the free energy of binding, but also to provide a comprehensible model that can be validated by a domain specialist. Decision-tree induction algorithms have been successfully used in drug-design related applications, especially considering that decision trees are simple to understand, interpret, and validate. There are several decision-tree induction algorithms available for general use, but each one has a bias that makes it more suitable for a particular data distribution. In this article, we propose and investigate the automatic design of decision-tree induction algorithms tailored to particular drug-enzyme binding data sets. We investigate the performance of our new method for evaluating binding conformations of different drug candidates to InhA, and we analyze our findings with respect to decision-tree accuracy, comprehensibility, and biological relevance. The empirical analysis indicates that our method is capable of automatically generating decision-tree induction algorithms that significantly outperform the traditional C4.5 algorithm with respect to both accuracy and comprehensibility. In addition, we provide the biological interpretation of the rules generated by our approach, reinforcing the importance of comprehensible predictive models in this particular bioinformatics application. We conclude that automatically designing a decision-tree algorithm tailored to molecular docking data is a promising alternative for predicting the free energy of binding of a drug candidate with a flexible receptor.
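
    For intuition about why decision trees suit this setting, even a plain regression tree over docking features yields human-readable rules; the features and energies below are fabricated stand-ins for flexible-receptor docking data, and the paper designs tailored induction algorithms rather than using scikit-learn.

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor, export_text

        # Fabricated features: distances (angstrom) from a ligand atom to three
        # receptor landmarks; target is the docking free energy (kcal/mol).
        X = np.array([[3.1, 5.2, 7.4], [2.8, 4.9, 6.9],
                      [6.5, 8.1, 9.0], [3.0, 5.0, 7.1]])
        y = np.array([-9.2, -9.8, -4.1, -9.5])

        tree = DecisionTreeRegressor(max_depth=2).fit(X, y)
        print(export_text(tree, feature_names=["d_res1", "d_res2", "d_res3"]))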

  14. flowAI: automatic and interactive anomaly discerning tools for flow cytometry data.

    Science.gov (United States)

    Monaco, Gianni; Chen, Hao; Poidinger, Michael; Chen, Jinmiao; de Magalhães, João Pedro; Larbi, Anis

    2016-08-15

    Flow cytometry (FCM) is widely used in both clinical and basic research to characterize cell phenotypes and functions. The latest FCM instruments analyze up to 20 markers of individual cells, producing high-dimensional data. This requires the use of the latest clustering and dimensionality-reduction techniques to automatically segregate cell sub-populations in an unbiased manner. However, automated analyses may lead to false discoveries due to inter-sample differences in quality and properties. We present an R package, flowAI, containing two methods to clean FCM files of unwanted events: (i) an automatic method that adopts algorithms for the detection of anomalies and (ii) an interactive method with a graphical user interface implemented as an R Shiny application. The general approach behind the two methods consists of three key steps to check and remove suspected anomalies that derive from (i) abrupt changes in the flow rate, (ii) instability of signal acquisition and (iii) outliers at the lower limit and margin events at the upper limit of the dynamic range. For each file analyzed, our software generates a summary of the quality assessment from the aforementioned steps. The software presented is an intuitive solution seeking to improve the results not only of manual but also, and in particular, of automatic analysis of FCM data. R source code is available through Bioconductor: http://bioconductor.org/packages/flowAI/. Contacts: mongianni1@gmail.com or Anis_Larbi@immunol.a-star.edu.sg. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved.
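
    As a rough illustration of step (i), abrupt flow-rate changes can be flagged by binning event time stamps and marking bins that deviate strongly from the median bin count. This is a generic sketch, not the flowAI code; the bin width and cutoff are arbitrary choices.

        import numpy as np

        def flag_flow_rate_anomalies(event_times, bin_width=0.1, cutoff=3.0):
            """Bin event arrival times (seconds) and flag bins whose event count
            deviates from the median by more than `cutoff` robust z-units."""
            edges = np.arange(event_times.min(), event_times.max() + bin_width, bin_width)
            counts, _ = np.histogram(event_times, bins=edges)
            med = np.median(counts)
            mad = np.median(np.abs(counts - med)) or 1.0   # avoid division by zero
            robust_z = 0.6745 * (counts - med) / mad
            return np.abs(robust_z) > cutoff               # True = suspect bin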

  15. P19-S Managing Proteomics Data from Data Generation and Data Warehousing to Central Data Repository and Journal Reviewing Processes

    Science.gov (United States)

    Thiele, H.; Glandorf, J.; Koerting, G.; Reidegeld, K.; Blüggel, M.; Meyer, H.; Stephan, C.

    2007-01-01

    In today's proteomics research, where a variety of techniques and instruments are used, bioinformatics tools are necessary to manage the large amount of heterogeneous data with automatic quality control so as to produce reliable and comparable results. A data-processing pipeline is therefore mandatory for data validation and comparison in a data-warehousing system. The proteome bioinformatics platform ProteinScape has been proven to cover these needs. The reprocessing of HUPO BPP participants' MS data was done within ProteinScape. The reprocessed information was transferred into the global data repository PRIDE. ProteinScape as a data-warehousing system covers two main aspects: archiving relevant data of the proteomics workflow and information extraction functionality (protein identification, quantification and generation of biological knowledge). As a strategy for automatic data validation, different protein search engines are integrated. Result analysis is performed using a decoy database search strategy, which allows the measurement of the false-positive identification rate. Peptide identifications across different workflows, different MS techniques, and different search engines are merged to obtain a quality-controlled protein list. The proteomics identifications database (PRIDE), as a public data repository, is an archiving system where data are finally stored and no longer changed by further processing steps. Data submission to PRIDE is open to proteomics laboratories generating protein and peptide identifications. An export tool has been developed for transferring all relevant HUPO BPP data from ProteinScape into PRIDE using the PRIDE.xml format. The EU-funded ProDac project will coordinate the development of software tools covering international standards for the representation of proteomics data. The implementation of data submission pipelines and systematic data collection in public standards-compliant repositories will cover all aspects, from the generation of MS data to central data repositories and the journal reviewing process.
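
    The decoy strategy mentioned above has a simple arithmetic core: spectra are searched against a database holding both target and reversed ("decoy") sequences, and the decoy hits passing a score threshold estimate the number of false target hits. A minimal sketch under that standard target-decoy assumption:

        def decoy_fdr(target_scores, decoy_scores, threshold):
            """Estimate the false-positive identification rate at a score threshold:
            decoys above the threshold approximate false target identifications."""
            n_target = sum(s >= threshold for s in target_scores)
            n_decoy = sum(s >= threshold for s in decoy_scores)
            return n_decoy / n_target if n_target else 0.0

    In practice the threshold is tightened until the estimated rate falls below the desired level (e.g. 1%) before the merged protein list is accepted.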

  16. Automatic analysis algorithm for radionuclide pulse-height data from beta-gamma coincidence systems

    International Nuclear Information System (INIS)

    Foltz Biegalski, K.M.

    2001-01-01

    There are two acceptable noble gas monitoring measurement modes for Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification purposes defined in CTBT/PC/II/WG.B/1. These are beta-gamma coincidence counting and high-resolution gamma spectrometry. There are at present no commercial, off-the-shelf (COTS) applications for the analysis of β-γ coincidence data. Development of such software is in progress at the Prototype International Data Centre (PIDC) for eventual deployment at the International Data Centre (IDC). Flowcharts detailing the automatic analysis algorithm for β-γ coincidence data to be coded at the PIDC are included. The program is being written in C with Oracle databasing capabilities. (author)
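
    Beta-gamma coincidence data are conventionally analysed as a two-dimensional histogram (beta energy versus gamma energy) in which each radioxenon isotope populates a characteristic region of interest (ROI). The sketch below shows ROI counting on per-event energies; the rectangular bounds are placeholders, not treaty-defined values, and this is not the PIDC code.

        import numpy as np

        def roi_counts(beta_e, gamma_e, roi):
            """Count coincidence events inside a rectangular ROI.
            beta_e, gamma_e: per-event energies (keV); roi: (b_lo, b_hi, g_lo, g_hi)."""
            b_lo, b_hi, g_lo, g_hi = roi
            inside = (beta_e >= b_lo) & (beta_e < b_hi) & \
                     (gamma_e >= g_lo) & (gamma_e < g_hi)
            return int(inside.sum())

        # Net counts follow by subtracting a live-time-scaled background
        # measurement for the same ROI.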

  17. Fully automatic characterization and data collection from crystals of biological macromolecules.

    Science.gov (United States)

    Svensson, Olof; Malbet-Monaco, Stéphanie; Popov, Alexander; Nurizzo, Didier; Bowler, Matthew W

    2015-08-01

    Considerable effort is dedicated to evaluating macromolecular crystals at synchrotron sources, even for well established and robust systems. Much of this work is repetitive, and the time spent could be better invested in the interpretation of the results. In order to decrease the need for manual intervention in the most repetitive steps of structural biology projects, initial screening and data collection, a fully automatic system has been developed to mount, locate, centre to the optimal diffraction volume, characterize and, if possible, collect data from multiple cryocooled crystals. Using the capabilities of pixel-array detectors, the system is as fast as a human operator, taking an average of 6 min per sample depending on the sample size and the level of characterization required. Using a fast X-ray-based routine, samples are located and centred systematically at the position of highest diffraction signal and important parameters for sample characterization, such as flux, beam size and crystal volume, are automatically taken into account, ensuring the calculation of optimal data-collection strategies. The system is now in operation at the new ESRF beamline MASSIF-1 and has been used by both industrial and academic users for many different sample types, including crystals of less than 20 µm in the smallest dimension. To date, over 8000 samples have been evaluated on MASSIF-1 without any human intervention.

  18. Automatic Testing of the Trigger Data Serializer ASIC for the Upgrade of the ATLAS Muon Spectrometer

    CERN Document Server

    Pinkham, Reid; Schwarz, Thomas

    The Trigger Data Serializer (TDS) is a custom Application-Specific Integrated Circuit (ASIC) designed at the University of Michigan to be used on the ATLAS New Small Wheel (NSW) detector. The TDS is a central hub of the NSW trigger system. It prepares the trigger data for both pad and strip detectors, performs pad-strip matching, and serializes the matched strip data to other circuits on the rim of the NSW. In total, 6000 TDS chips will be produced. As part of the TDS' initial production run, a test platform was developed to verify the functionality of each chip before being sent to users. The test platform consisted of multiple FPGA evaluation boards with custom-designed mezzanine boards to hold the TDS chip during testing, and control software running on a local computer. Of the initial run of 200 chips, 161 were tested with the automatic setup, of which 158 passed. A detailed description of the TDS and the automatic test fixture can be found in this thesis.

  19. Automatic Measurement of Fetal Brain Development from Magnetic Resonance Imaging: New Reference Data.

    Science.gov (United States)

    Link, Daphna; Braginsky, Michael B; Joskowicz, Leo; Ben Sira, Liat; Harel, Shaul; Many, Ariel; Tarrasch, Ricardo; Malinger, Gustavo; Artzi, Moran; Kapoor, Cassandra; Miller, Elka; Ben Bashat, Dafna

    2018-01-01

    Accurate fetal brain volume estimation is of paramount importance in evaluating fetal development. The aim of this study was to develop an automatic method for fetal brain segmentation from magnetic resonance imaging (MRI) data, and to create for the first time a normal volumetric growth chart based on a large cohort. A semi-automatic segmentation method based on the Seeded Region Growing algorithm was developed and applied to MRI data of 199 typically developed fetuses between 18 and 37 weeks' gestation. The accuracy of the algorithm was tested against a sub-cohort of ground-truth manual segmentations. A quadratic regression analysis was used to create normal growth charts. The sensitivity of the method in identifying developmental disorders was demonstrated on 9 fetuses with intrauterine growth restriction (IUGR). The developed method showed high correlation with manual segmentation (r² = 0.9183). The method is user independent, applicable to retrospective data, and is suggested for use in routine clinical practice. © 2017 S. Karger AG, Basel.
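
    Seeded Region Growing, the algorithm named above, starts from a seed pixel and repeatedly absorbs the neighbouring pixel whose intensity is closest to the running region mean, stopping when no neighbour is close enough. A compact generic 2D sketch (4-connectivity, single seed); the tolerance is an illustrative assumption and this is not the authors' implementation.

        import heapq
        import numpy as np

        def region_grow(img, seed, tol=10.0):
            """Grow a region from `seed` (row, col); absorb neighbours whose
            intensity stays within `tol` of the running region mean."""
            grown = np.zeros(img.shape, dtype=bool)
            total, n = 0.0, 0
            heap = [(0.0, seed)]
            while heap:
                _, (r, c) = heapq.heappop(heap)
                if grown[r, c]:
                    continue
                if n and abs(float(img[r, c]) - total / n) > tol:
                    continue                      # too dissimilar: do not absorb
                grown[r, c] = True
                total, n = total + float(img[r, c]), n + 1
                mean = total / n
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < img.shape[0] and 0 <= cc < img.shape[1] and not grown[rr, cc]:
                        heapq.heappush(heap, (abs(float(img[rr, cc]) - mean), (rr, cc)))
            return grown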

  20. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    Science.gov (United States)

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, the neutral atmospheric gas and the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software package, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak-finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields isotope ratios with an accuracy of up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width T_s as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
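
    The peak-finding-plus-integration scheme described above can be illustrated in a few lines: locate peaks above a noise-derived threshold, then numerically integrate each peak so that area ratios yield isotope ratios. This is a generic sketch, not the authors' software; the noise estimate, window size and threshold rule are assumptions.

        import numpy as np
        from scipy.signal import find_peaks

        def isotope_ratio(spectrum, half_window=5, snr_min=5.0):
            """Return the area ratio (smaller/larger) of the two largest
            peaks in a 1D time-of-flight spectrum."""
            noise = np.std(spectrum[:100])       # assume a signal-free leading region
            peaks, props = find_peaks(spectrum, height=snr_min * noise)
            top2 = peaks[np.argsort(props["peak_heights"])[-2:]]
            areas = [np.trapz(spectrum[p - half_window:p + half_window + 1])
                     for p in top2]
            return min(areas) / max(areas)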

  1. ODM Data Analysis-A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    Science.gov (United States)

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step in presenting the results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system used and import it into statistical software such as SAS or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open-source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After a file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open-source web application (available at https://odmanalysis.uni-muenster.de) and is also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are only stored in the application for as long as the calculations are performed, which is compliant with data-protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can be used as a starting point for their examination and reporting.
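
    The core of such a tool (validate the upload, then summarize each item) reduces to walking the ODM XML and computing per-item statistics. A much-simplified sketch that ignores XML namespaces; the real CDISC ODM format is namespaced and far richer than this:

        import statistics
        import xml.etree.ElementTree as ET

        def summarize_items(odm_path):
            """Collect ItemData values per item OID and print basic statistics."""
            values = {}
            for item in ET.parse(odm_path).getroot().iter("ItemData"):
                values.setdefault(item.get("ItemOID"), []).append(item.get("Value"))
            for oid, vals in values.items():
                try:
                    nums = [float(v) for v in vals]
                    print(oid, "n =", len(nums), "mean =", statistics.mean(nums))
                except (TypeError, ValueError):   # non-numeric item: frequency table
                    print(oid, "n =", len(vals), {v: vals.count(v) for v in set(vals)})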

  2. The origins of levels-of-processing effects in a conceptual test: evidence for automatic influences of memory from the process-dissociation procedure.

    Science.gov (United States)

    Bergerbest, Dafna; Goshen-Gottstein, Yonatan

    2002-12-01

    In three experiments, we explored automatic influences of memory in a conceptual memory task, as affected by a levels-of-processing (LoP) manipulation. We also explored the origins of the LoP effect by examining whether the effect emerged only when participants in the shallow condition truncated the perceptual processing (the lexical-processing hypothesis) or even when the entire word was encoded in this condition (the conceptual-processing hypothesis). Using the process-dissociation procedure and an implicit association-generation task, we found that the deep encoding condition yielded higher estimates of automatic influences than the shallow condition. In support of the conceptual processing hypothesis, the LoP effect was found even when the shallow task did not lead to truncated processing of the lexical units. We suggest that encoding for meaning is a prerequisite for automatic processing on conceptual tests of memory.
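
    For readers unfamiliar with the process-dissociation procedure, the standard estimating equations (not spelled out in the abstract) assume controlled (C) and automatic (A) influences contribute independently. With I and E the probabilities of producing a studied item under inclusion and exclusion instructions:

        I = C + A(1 - C), \qquad E = A(1 - C)
        \Rightarrow \quad C = I - E, \qquad A = \frac{E}{1 - C}

    The LoP comparison reported above is then a comparison of the A estimates obtained under deep versus shallow encoding.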

  3. Magnonic crystals for data processing

    International Nuclear Information System (INIS)

    Chumak, A V; Serga, A A; Hillebrands, B

    2017-01-01

    Magnons (the quanta of spin waves) propagating in magnetic materials with wavelengths at the nanometer-scale and carrying information in the form of an angular momentum can be used as data carriers in next-generation, nano-sized low-loss information processing systems. In this respect, artificial magnetic materials with properties periodically varied in space, known as magnonic crystals, are especially promising for controlling and manipulating magnon currents. In this article, different approaches for the realization of static, reconfigurable, and dynamic magnonic crystals are presented along with a variety of novel wave phenomena discovered in these crystals. Special attention is devoted to the utilization of magnonic crystals for processing of analog and digital information. (paper)

  4. Automatic data-acquisition and communications computer network for fusion experiments

    International Nuclear Information System (INIS)

    Kemper, C.O.

    1981-01-01

    A network of more than twenty computers serves the data acquisition, archiving, and analysis requirements of the ISX, EBT, and beam-line test facilities at the Fusion Division of Oak Ridge National Laboratory. The network includes PDP-8, PDP-12, PDP-11, PDP-10, and Interdata 8-32 processors, and is unified by a variety of high-speed serial and parallel communications channels. While some processors are dedicated to experimental data acquisition, and others are dedicated to later analysis and theoretical work, many processors perform a combination of acquisition, real-time analysis and display, and archiving and communications functions. A network software system has been developed which runs in each processor and automatically transports data files from the point of acquisition to the point or points of analysis, display, and storage, providing conversion and formatting functions as required.

  5. Computerization of reporting and data storage using automatic coding method in the department of radiology

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Byung Hee; Lee, Kyung Sang; Kim, Woo Ho; Han, Joon Koo; Choi, Byung Ihn; Han, Man Chung [College of Medicine, Seoul National Univ., Seoul (Korea, Republic of)

    1990-10-15

    The authors developed a computer program for use in printing reports as well as in data storage and retrieval in the radiology department. This program ran on an IBM PC AT and was written in the dBASE III Plus language. The automatic coding method for the ACR code, developed by Kim et al., was applied in this program, and the framework of this program is the same as that developed for the surgical pathology department. The working sheet, which contained the name card for X-ray film identification and the results of previous radiologic studies, was printed during registration. The word-processing function was applied for issuing the formal report of a radiologic study, and data storage was carried out while the report was being typed. Two kinds of data files were stored on the hard disk: the temporary file contained full information, and the permanent file contained the patient's identification data and the ACR code. Searching for a specific case could be performed by chart number, patient's name, date of study, or ACR code within a second. All the cases were arranged by the ACR codes for procedure, anatomy, and pathology. All new data were copied to a diskette automatically after daily work, so that data could be restored in case of hard disk failure. The main advantage of this program compared with a larger computer system is its low price. Based on the experience in the Seoul District Armed Forces General Hospital, we assume that this program provides a solution to various problems in radiology departments where a large computer system with well-designed software is not available.

  6. AN AUTOMATIC PROCEDURE FOR COMBINING DIGITAL IMAGES AND LASER SCANNER DATA

    Directory of Open Access Journals (Sweden)

    W. Moussa

    2012-07-01

    Besides improving both the geometry and the visual quality of the model, the integration of close-range photogrammetry and terrestrial laser scanning techniques aims at filling gaps in laser scanner point clouds to avoid modeling errors, reconstructing more details in higher resolution and recovering simple structures with less geometric detail. Thus, within this paper a flexible approach for the automatic combination of digital images and laser scanner data is presented. Our approach comprises two methods for data fusion. The first method starts with a marker-free registration of digital images based on a point-based environment model (PEM) of a scene, which stores the 3D laser scanner point clouds associated with intensity and RGB values. The PEM allows the extraction of accurate control information for the direct computation of absolute camera orientations with redundant information by means of accurate space resection methods. In order to use the computed relations between the digital images and the laser scanner data, an extended Helmert (seven-parameter) transformation is introduced and its parameters are estimated. Prior to that, in the second method, the local relative orientation parameters of the camera images are calculated by means of an optimized Structure and Motion (SaM) reconstruction method. Then, using the determined transformation parameters results in absolutely oriented images in relation to the laser scanner data. With the resulting absolute orientations we have employed robust dense image reconstruction algorithms to create oriented dense image point clouds, which are automatically combined with the laser scanner data to form a complete, detailed representation of a scene. Examples of different data sets are shown and experimental results demonstrate the effectiveness of the presented procedures.
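
    The extended Helmert (seven-parameter) transformation referred to above maps image-frame coordinates into the laser-scanner frame with one scale factor, three rotation angles and three translations. A minimal sketch of applying such a transformation; the parameter estimation treated in the paper is omitted, and the angle convention is an assumption.

        import numpy as np

        def rot_matrix(rx, ry, rz):
            """Rotation matrix from Euler angles (radians), R = Rz @ Ry @ Rx."""
            cx, sx = np.cos(rx), np.sin(rx)
            cy, sy = np.cos(ry), np.sin(ry)
            cz, sz = np.cos(rz), np.sin(rz)
            Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
            Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
            Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
            return Rz @ Ry @ Rx

        def helmert(points, scale, angles, translation):
            """Apply X' = t + s * R @ X to an (N, 3) array of points."""
            return translation + scale * (points @ rot_matrix(*angles).T)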

  7. Smart-card-based automatic meal record system intervention tool for analysis using data mining approach.

    Science.gov (United States)

    Zenitani, Satoko; Nishiuchi, Hiromu; Kiuchi, Takahiro

    2010-04-01

    The Smart-card-based Automatic Meal Record system for company cafeterias (AutoMealRecord system) was recently developed and used to monitor employee eating habits. The system could be a unique nutrition assessment tool for automatically monitoring the meal purchases of all employees, although it only covers company cafeterias and has never been validated. Before starting an interventional study, we tested the reliability of the data collected by the system using a data mining approach. The AutoMealRecord data were examined to determine whether they could predict current obesity. All data used in this study (n = 899) were collected by a major electric company based in Tokyo, which has been operating the AutoMealRecord system for several years. We analyzed dietary patterns by principal component analysis using data from the system and extracted 5 major dietary patterns: healthy, traditional Japanese, Chinese, Japanese noodles, and pasta. The ability to predict current body mass index (BMI) from dietary preference was assessed with multiple linear regression analyses; in the current study, BMI was positively correlated with male gender, preference for "Japanese noodles," mean energy intake, protein content, and frequency of body measurement at a body measurement booth in the cafeteria. There was a negative correlation with age, dietary fiber, and lunchtime cafeteria use (R² = 0.22). This regression model predicted "would-be obese" participants (BMI ≥ 23) with 68.8% accuracy by leave-one-out cross-validation. This shows that there was sufficient predictability of BMI based on data from the AutoMealRecord system. We conclude that the AutoMealRecord system is valuable for further consideration as a health care intervention tool. Copyright 2010 Elsevier Inc. All rights reserved.
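
    The analysis pipeline described (dietary patterns via principal component analysis, then BMI prediction by multiple linear regression with leave-one-out validation) follows a standard recipe. A hedged sketch with made-up arrays standing in for the AutoMealRecord variables:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        def dietary_pattern_regression(X_food, X_context, bmi, n_patterns=5):
            """X_food: (subjects, food groups) purchase frequencies;
            X_context: covariates such as age and gender; bmi: targets."""
            patterns = PCA(n_components=n_patterns).fit_transform(X_food)
            X = np.hstack([patterns, X_context])
            pred = cross_val_predict(LinearRegression(), X, bmi, cv=LeaveOneOut())
            return np.mean((pred >= 23) == (bmi >= 23))   # "would-be obese" accuracy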

  8. Application of digital process controller for automatic pulse operation in the NSRR

    International Nuclear Information System (INIS)

    Ishijima, K.; Ueda, T.; Saigo, M.

    1992-01-01

    The NSRR at JAERI is a modified TRIGA reactor. It was built for investigating reactor fuel behavior under reactivity-initiated accident (RIA) conditions. Recently, there has been a need to improve the flexibility of pulsing operations in the NSRR to cover a wide range of accident situations, including RIA events at elevated power levels and various abnormal power transients. To satisfy this need, we developed a new reactor control system which allows us to perform 'Shaped Pulse Operation (SP)' and 'Combined Pulse Operation (CP)'. Quick, accurate and complicated manipulation of control rods was required to realize these operations. We therefore installed a new reactor control system, which we call an automatic pulse control system. This control system is composed of digital processing controllers and other digital equipment, and is fully automated and highly accurate. (author)

  9. Developing an Intelligent Automatic Appendix Extraction Method from Ultrasonography Based on Fuzzy ART and Image Processing

    Directory of Open Access Journals (Sweden)

    Kwang Baek Kim

    2015-01-01

    Ultrasound examination (US) plays a key role in the diagnosis and management of patients with clinically suspected appendicitis, which is the most common abdominal surgical emergency. Among the various sonographic findings of appendicitis, the outer diameter of the appendix is the most important. Therefore, clear delineation of the appendix on US images is essential. In this paper, we propose a new intelligent method to extract the appendix automatically from abdominal sonographic images as a basic building block in developing such an intelligent tool for medical practitioners. Knowing that the appendix is located in the lower organ area below the bottom fascia line, we conduct a series of image processing techniques to find the fascia line correctly. We then apply the fuzzy ART learning algorithm to the organ area in order to extract the appendix accurately. The experiment verifies that the proposed method is highly accurate (successful in 38 out of 40 cases) in extracting the appendix.
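
    Fuzzy ART, the learning algorithm applied to the organ area above, clusters complement-coded inputs with a choice function, a vigilance test, and fast learning. A compact generic sketch; the vigilance rho, choice parameter alpha and learning rate beta are illustrative, and this is not the authors' implementation.

        import numpy as np

        def fuzzy_art(inputs, rho=0.75, alpha=0.001, beta=1.0):
            """Cluster rows of `inputs` (features scaled to [0, 1]) with Fuzzy ART."""
            weights, labels = [], []
            for x in inputs:
                i = np.concatenate([x, 1.0 - x])          # complement coding
                # Rank categories by choice T_j = |i ^ w_j| / (alpha + |w_j|).
                order = sorted(range(len(weights)),
                               key=lambda j: -np.minimum(i, weights[j]).sum()
                                             / (alpha + weights[j].sum()))
                for j in order:
                    if np.minimum(i, weights[j]).sum() / i.sum() >= rho:  # vigilance
                        weights[j] = beta * np.minimum(i, weights[j]) \
                                     + (1 - beta) * weights[j]            # learn
                        labels.append(j)
                        break
                else:                                     # no match: new category
                    weights.append(i.copy())
                    labels.append(len(weights) - 1)
            return labels, weights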

  10. Neural dynamics of morphological processing in spoken word comprehension: Laterality and automaticity

    Directory of Open Access Journals (Sweden)

    Caroline M. Whiting

    2013-11-01

    Rapid and automatic processing of grammatical complexity is argued to take place during speech comprehension, engaging a left-lateralised fronto-temporal language network. Here we address how neural activity in these regions is modulated by the grammatical properties of spoken words. We used combined magneto- and electroencephalography (MEG, EEG) to delineate the spatiotemporal patterns of activity that support the recognition of morphologically complex words in English with inflectional (-s) and derivational (-er) affixes (e.g. bakes, baker). The mismatch negativity (MMN), an index of linguistic memory traces elicited in a passive listening paradigm, was used to examine the neural dynamics elicited by morphologically complex words. Results revealed an initial peak 130-180 ms after the deviation point with a major source in left superior temporal cortex. The localisation of this early activation showed a sensitivity to two grammatical properties of the stimuli: 1) the presence of morphological complexity, with affixed words showing increased left-laterality compared to non-affixed words; and 2) the grammatical category, with affixed verbs showing greater left-lateralisation in inferior frontal gyrus compared to affixed nouns (bakes vs. beaks). This automatic brain response was additionally sensitive to semantic coherence (the meaning of the stem vs. the meaning of the whole form) in fronto-temporal regions. These results demonstrate that the spatiotemporal pattern of neural activity in spoken word processing is modulated by the presence of morphological structure, predominantly engaging the left hemisphere's fronto-temporal language network, and does not require focused attention on the linguistic input.

  11. Event-related potential evidence for separable automatic and controlled retrieval processes in proactive interference.

    Science.gov (United States)

    Bergström, Zara M; O'Connor, Richard J; Li, Martin K-H; Simons, Jon S

    2012-05-21

    Interference between competing memories is a major source of retrieval failure, yet, surprisingly little is known about how competitive memory activation arises in the brain. One possibility is that interference during episodic retrieval might be produced by relatively automatic conceptual priming mechanisms that are independent of strategic retrieval processes. Such priming-driven interference might occur when the competing memories have strong pre-existing associations to the retrieval cue. We used ERPs to measure the neural dynamics of retrieval competition, and investigated whether the ERP correlates of interference were affected by varying task demands for selective retrieval. Participants encoded cue words that were presented either two or four times, paired either with the same or different strongly associated words across repetitions. In a subsequent test, participants either selectively recalled each cue's most recent associate, or simply judged how many times a cue had been presented, without requiring selective recall. Interference effects on test performance were only seen in the recall task. In contrast, ERPs during test revealed an early posterior positivity for high interference items that was present in both retrieval tasks. This early ERP effect likely reflects a conceptual priming-related N400 reduction when many associations to a cue were pre-activated. A later parietal positivity resembling the ERP correlate of conscious recollection was found only in the recall task. The results suggest that early effects of proactive interference are relatively automatic and independent of intentional retrieval processes, consistent with suggestions that interference can arise through conceptual priming. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Automatic detection of referral patients due to retinal pathologies through data mining.

    Science.gov (United States)

    Quellec, Gwenolé; Lamard, Mathieu; Erginay, Ali; Chabouis, Agnès; Massin, Pascale; Cochener, Béatrice; Cazuguel, Guy

    2016-04-01

    With the increased prevalence of retinal pathologies, automating the detection of these pathologies is becoming more and more relevant. In the past few years, many algorithms have been developed for the automated detection of a specific pathology, typically diabetic retinopathy, using eye fundus photography. No matter how good these algorithms are, we believe many clinicians would not use automatic detection tools focusing on a single pathology and ignoring any other pathology present in the patient's retinas. To solve this issue, an algorithm for characterizing the appearance of abnormal retinas, as well as the appearance of normal ones, is presented. This algorithm does not focus on individual images: it considers examination records consisting of multiple photographs of each retina, together with contextual information about the patient. Specifically, it relies on data mining in order to learn diagnosis rules from characterizations of fundus examination records. The main novelty is that the content of examination records (images and context) is characterized at multiple levels of spatial and lexical granularity: 1) spatial flexibility is ensured by an adaptive decomposition of composite retinal images into a cascade of regions, 2) lexical granularity is ensured by an adaptive decomposition of the feature space into a cascade of visual words. This multigranular representation allows for great flexibility in automatically characterizing normality and abnormality: it is possible to generate diagnosis rules whose precision and generalization ability can be traded off depending on data availability. A variation on usual data mining algorithms, originally designed to mine static data, is proposed so that contextual and visual data at adaptive granularity levels can be mined. This framework was evaluated in e-ophtha, a dataset of 25,702 examination records from the OPHDIAT screening network, as well as in the publicly available Messidor dataset. It was successful in both datasets.

  13. Automatic first-break picking using the instantaneous traveltime attribute

    KAUST Repository

    Saragiotis, Christos; Alkhalifah, Tariq Ali

    2012-01-01

    Picking the first breaks is an important step in seismic processing. The large volume of the seismic data calls for automatic and objective picking. We introduce a new automatic first-break picker, which uses specifically designed time windows
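
    The instantaneous-traveltime attribute itself is beyond a short sketch, but the task it automates is often approximated with the classic STA/LTA ratio: declare the first break where a short-term average of signal energy jumps relative to the long-term average. The following generic picker is that simpler stand-in, not the authors' method; window lengths and trigger level are assumptions.

        import numpy as np

        def sta_lta_pick(trace, sta=10, lta=100, trigger=3.0):
            """Return the first sample where STA/LTA of squared amplitude
            exceeds `trigger`, or None if it never does."""
            energy = trace.astype(float) ** 2
            csum = np.concatenate([[0.0], np.cumsum(energy)])
            for n in range(lta, len(trace) - sta):
                lta_avg = (csum[n] - csum[n - lta]) / lta
                sta_avg = (csum[n + sta] - csum[n]) / sta
                if lta_avg > 0 and sta_avg / lta_avg >= trigger:
                    return n
            return None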

  14. The Effect of Orthographic Depth on Letter String Processing: The Case of Visual Attention Span and Rapid Automatized Naming

    Science.gov (United States)

    Antzaka, Alexia; Martin, Clara; Caffarra, Sendy; Schlöffel, Sophie; Carreiras, Manuel; Lallier, Marie

    2018-01-01

    The present study investigated whether orthographic depth can increase the bias towards multi-letter processing in two reading-related skills: visual attention span (VAS) and rapid automatized naming (RAN). VAS (i.e., the number of visual elements that can be processed at once in a multi-element array) was tested with a visual 1-back task and RAN…

  15. Sensitometric characteristics of D-, E- and F-speed dental radiographic films in manual and automatic processing

    Directory of Open Access Journals (Sweden)

    Jahangir Haghani DDS, MSc

    2012-09-01

    BACKGROUND AND AIM: The purpose of this study was to evaluate the sensitometric characteristics of Ultraspeed, Ektaspeed Plus and Insight dental radiographic films using manual and automatic processing systems. METHODS: In this experimental in vitro study, an aluminum step wedge was used to construct characteristic curves for D-, E- and F-speed radiographic films (Kodak Eastman, Rochester, USA). All films were processed in Iranian processing solution (Chemical Industries Co., Tehran, Iran) both manually and automatically over a period of six days. Unexposed films of the three types were processed manually and automatically to determine base-plus-fog density. Speed and film contrast were measured according to the International Standards Organization definition. RESULTS: There was a significant difference in density obtained with the D-, E- and F-speed films in both the manual and automatic processing systems (P < 0.001). There was a significant difference in density obtained with the Ultraspeed and Insight films. There was no significant difference in contrast obtained with the D-, E- and F-speed films in either processing system (P = 0.255, P = 0.260). There was a significant difference in speed obtained with the D-, E- and F-speed films in both the manual and automatic processing systems (P = 0.034, P = 0.040). CONCLUSIONS: The choice of processing system can affect radiographic characteristics. The F-speed film processed in the automatic system has greater speed in comparison with the manual processing system, and it provides a further reduction in radiation exposure without detriment to image quality.

  16. Dynamics of processing invisible faces in the brain: automatic neural encoding of facial expression information.

    Science.gov (United States)

    Jiang, Yi; Shannon, Robert W; Vizueta, Nathalie; Bernat, Edward M; Patrick, Christopher J; He, Sheng

    2009-02-01

    The fusiform face area (FFA) and the superior temporal sulcus (STS) are suggested to process facial identity and facial expression information respectively. We recently demonstrated a functional dissociation between the FFA and the STS as well as correlated sensitivity of the STS and the amygdala to facial expressions using an interocular suppression paradigm [Jiang, Y., He, S., 2006. Cortical responses to invisible faces: dissociating subsystems for facial-information processing. Curr. Biol. 16, 2023-2029.]. In the current event-related brain potential (ERP) study, we investigated the temporal dynamics of facial information processing. Observers viewed neutral, fearful, and scrambled face stimuli, either visibly or rendered invisible through interocular suppression. Relative to scrambled face stimuli, intact visible faces elicited larger positive P1 (110-130 ms) and larger negative N1 or N170 (160-180 ms) potentials at posterior occipital and bilateral occipito-temporal regions respectively, with the N170 amplitude significantly greater for fearful than neutral faces. Invisible intact faces generated a stronger signal than scrambled faces at 140-200 ms over posterior occipital areas whereas invisible fearful faces (compared to neutral and scrambled faces) elicited a significantly larger negative deflection starting at 220 ms along the STS. These results provide further evidence for cortical processing of facial information without awareness and elucidate the temporal sequence of automatic facial expression information extraction.

  17. Summer Student Work Project Report: SCADA Bridge Tool Development Automatically Capturing Data from SCADA to the Maintenance System

    CERN Document Server

    Alhambra-Moron, Alfonso

    2015-01-01

    The main purpose of this report is to summarize the work project I have been doing at CERN during the last 3 months as a Summer Student. My name is Alfonso Alhambra Morón, and on the 8th of June 2015 I joined the EN-HE-LM team as a summer student supervised by Damien Lafarge in order to collaborate in the automation of the transfer of meter readings from SCADA to Infor EAM, the computerized maintenance management system at CERN. The main objective of my project was to enable automatic updates of meters in Infor EAM by fetching data from SCADA, so as to automate a process which was previously done manually and consumed resources in terms of having to consult the meter physically, import this information into Infor EAM by hand, and detect and correct the errors that can occur when doing all of this manually. This problem is shared by several other teams at CERN apart from the Lift Maintenance team, and for this reason the main targets I had when developing my solution were flexibility and scalability so as to make...

  18. AUTOMATIC EXTRACTION AND TOPOLOGY RECONSTRUCTION OF URBAN VIADUCTS FROM LIDAR DATA

    Directory of Open Access Journals (Sweden)

    Y. Wang

    2015-08-01

    Urban viaducts are important infrastructure for the transportation system of a city. In this paper, an original method is proposed to automatically extract urban viaducts and reconstruct the topology of the viaduct network using only airborne LiDAR point cloud data, greatly simplifying the labor-intensive procedure of viaduct extraction and reconstruction. In our method, the point cloud is first filtered to divide all the points into ground points and non-ground points. A region-growing algorithm is adopted to find the viaduct points among the non-ground points, using features derived from the general prescriptive design rules for viaducts. Then, the viaduct points are projected into 2D images to extract the centerline of every viaduct, and cubic functions are generated by least-squares fitting to represent the passages of the viaducts, from which the topology of the viaduct network can be rebuilt by combining the height information. Finally, a topological graph of the viaduct network is produced. This fully automatic method can potentially benefit applications in urban navigation and city model reconstruction.
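
    The final fitting step, representing each viaduct passage by a cubic function of the projected centerline points, is ordinary least squares. A minimal sketch assuming the centerline points have already been extracted:

        import numpy as np

        def fit_passage(x, y):
            """Least-squares cubic y = c3*x^3 + c2*x^2 + c1*x + c0
            through one passage's planimetric centerline points."""
            return np.poly1d(np.polyfit(x, y, deg=3))     # callable cubic model

        # curve = fit_passage(x, y); curve(x0) evaluates the centerline, and
        # attaching per-point heights lets passages be linked into a topology graph.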

  19. Automatic Multi-sensor Data Quality Checking and Event Detection for Environmental Sensing

    Science.gov (United States)

    LIU, Q.; Zhang, Y.; Zhao, Y.; Gao, D.; Gallaher, D. W.; Lv, Q.; Shang, L.

    2017-12-01

    With the advances in sensing technologies, large-scale environmental sensing infrastructures are pervasively deployed to continuously collect data for various research and application fields, such as air quality studies and weather condition monitoring. In such infrastructures, many sensor nodes are distributed over a specific area, and each individual sensor node is capable of measuring several parameters (e.g., humidity, temperature, and pressure), providing massive data for natural event detection and analysis. However, due to the dynamics of the ambient environment, sensor data can be contaminated by errors or noise. Thus, data quality is still a primary concern for scientists before drawing any reliable scientific conclusions. To help researchers identify potential data quality issues and detect meaningful natural events, this work proposes a novel algorithm to automatically identify and rank anomalous time windows from multiple sensor data streams. More specifically, the algorithm (1) adaptively learns the characteristics of normal evolving time series and (2) models the spatial-temporal relationship among multiple sensor nodes to infer the anomaly likelihood of a time-series window for a particular parameter in a sensor node. Case studies using different data sets are presented, and the experimental results demonstrate that the proposed algorithm can effectively identify anomalous time windows, which may result from data quality issues and natural events.
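
    A simplified stand-in for the window-ranking idea above is to score each time window of each sensor by its deviation both from that sensor's own history (temporal) and from neighbouring sensors in the same interval (spatial), then rank windows by the combined score. This sketch is generic, not the authors' algorithm, and the equal weighting is arbitrary.

        import numpy as np

        def rank_windows(data, win=60):
            """data: (n_sensors, n_samples) array for one parameter.
            Returns (sensor, window_start, score) triples, most anomalous first."""
            scores = []
            for s in range(data.shape[0]):
                for start in range(0, data.shape[1] - win, win):
                    w = data[s, start:start + win]
                    temporal = abs(w.mean() - data[s].mean()) / (data[s].std() + 1e-9)
                    spatial = abs(w.mean() - np.median(data[:, start:start + win]))
                    scores.append((s, start, temporal + spatial))
            return sorted(scores, key=lambda t: -t[2])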

  20. Data pre-processing for database marketing

    OpenAIRE

    Pinto, Filipe; Santos, Manuel Filipe; Cortez, Paulo; Quintela, Hélder

    2004-01-01

    To increase the effectiveness of their marketing and Customer Relationship Management activities, many organizations are adopting strategies of Database Marketing (DBM). Nowadays, DBM faces new challenges in business knowledge since current strategies are mainly approached by classical statistical inference, which may fail when complex, multi-dimensional and incomplete data is available. An alternative is to use Knowledge Discovery from Databases (KDD), which aims at automatic extraction of useful p...

  1. Automatically exposing OpenLifeData via SADI semantic Web Services.

    Science.gov (United States)

    González, Alejandro Rodríguez; Callahan, Alison; Cruz-Toledo, José; Garcia, Adrian; Egaña Aranguren, Mikel; Dumontier, Michel; Wilkinson, Mark D

    2014-01-01

    Two distinct trends are emerging with respect to how data is shared, collected, and analyzed within the bioinformatics community. First, Linked Data, exposed as SPARQL endpoints, promises to make data easier to collect and integrate by moving towards the harmonization of data syntax, descriptive vocabularies, and identifiers, as well as providing a standardized mechanism for data access. Second, Web Services, often linked together into workflows, normalize data access and create transparent, reproducible scientific methodologies that can, in principle, be re-used and customized to suit new scientific questions. Constructing queries that traverse semantically rich Linked Data requires substantial expertise, yet traditional RESTful or SOAP Web Services cannot adequately describe the content of a SPARQL endpoint. We propose that content-driven Semantic Web Services can enable facile discovery of Linked Data, independent of their location. We use a well-curated Linked Dataset - OpenLifeData - and utilize its descriptive metadata to automatically configure a series of more than 22,000 Semantic Web Services that expose all of its content via the SADI set of design principles. The OpenLifeData SADI services are discoverable via queries to the SHARE registry and easy to integrate into new or existing bioinformatics workflows and analytical pipelines. We demonstrate the utility of this system through a comparison of Web Service-mediated data access with traditional SPARQL, and note that this approach not only simplifies data retrieval, but simultaneously provides protection against resource-intensive queries. We show, through a variety of different clients and examples of varying complexity, that data from the myriad OpenLifeData endpoints can be recovered without any need for prior knowledge of the content or structure of the SPARQL endpoints. We also demonstrate that, via clients such as SHARE, the complexity of federated SPARQL queries is dramatically reduced.
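
    For comparison with the Web-Service route, direct SPARQL access to a Linked Data endpoint looks like the sketch below, using the SPARQLWrapper library; the endpoint URL and query are illustrative placeholders, not a documented OpenLifeData address.

        from SPARQLWrapper import SPARQLWrapper, JSON

        endpoint = SPARQLWrapper("http://example.org/sparql")   # placeholder
        endpoint.setQuery("""
            PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
            SELECT ?s ?label WHERE { ?s rdfs:label ?label . } LIMIT 10
        """)
        endpoint.setReturnFormat(JSON)
        for row in endpoint.query().convert()["results"]["bindings"]:
            print(row["s"]["value"], row["label"]["value"])

    Writing such queries by hand is exactly the expertise barrier the SADI services are meant to remove.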

  2. Development of a multiplexer for an automatic data acquisition system for the control and monitoring of microbiological cultures

    Energy Technology Data Exchange (ETDEWEB)

    Morales Rondon, A.; Paredes Puente, J.; Arana Alonso, S.

    2016-07-01

    An automatic data acquisition system has been developed for the control and monitoring of microbiological cultures. It turns an otherwise time-consuming process into a smooth one by allowing the researcher to set the parameters at the beginning of the experiment and move on to the next task. The development of the hardware and software is key to achieving this system. The mux is custom-made with 22 channels and lightweight, therefore easy to move around the lab. Furthermore, the software allows the researcher to check the measurements in real time. It is based on virtual instrumentation software, so new features can be added easily; thus, the mux is capable of adapting to the scientist's necessities. (Author)

  3. Integrating automatic and controlled processes into neurocognitive models of social cognition.

    Science.gov (United States)

    Satpute, Ajay B; Lieberman, Matthew D

    2006-03-24

    Interest in the neural systems underlying social perception has expanded tremendously over the past few decades. However, gaps between behavioral literatures in social perception and neuroscience are still abundant. In this article, we apply the concept of dual-process models to neural systems in an effort to bridge the gap between many of these behavioral studies and neural systems underlying social perception. We describe and provide support for a neural division between reflexive and reflective systems. Reflexive systems correspond to automatic processes and include the amygdala, basal ganglia, ventromedial prefrontal cortex, dorsal anterior cingulate cortex, and lateral temporal cortex. Reflective systems correspond to controlled processes and include lateral prefrontal cortex, posterior parietal cortex, medial prefrontal cortex, rostral anterior cingulate cortex, and the hippocampus and surrounding medial temporal lobe region. This framework is considered to be a working model rather than a finished product. Finally, the utility of this model and its application to other social cognitive domains such as Theory of Mind are discussed.

  4. Automatic screening and classification of diabetic retinopathy and maculopathy using fuzzy image processing.

    Science.gov (United States)

    Rahim, Sarni Suhaila; Palade, Vasile; Shuttleworth, James; Jayne, Chrisina

    2016-12-01

    Digital retinal imaging is a challenging screening method for which effective, robust and cost-effective approaches are still to be developed. Regular screening for diabetic retinopathy and diabetic maculopathy is necessary in order to identify the group at risk of visual impairment. This paper presents a novel automatic detection system for diabetic retinopathy and maculopathy in eye fundus images, employing fuzzy image processing techniques. The paper first introduces the existing systems for diabetic retinopathy screening, with an emphasis on maculopathy detection methods. The proposed medical decision support system consists of four parts, namely: image acquisition, image preprocessing (including localisation of four retinal structures), feature extraction, and the classification of diabetic retinopathy and maculopathy. A combination of fuzzy image processing techniques, the Circular Hough Transform and several feature extraction methods are implemented in the proposed system. The paper also presents a novel technique for localising the macula region in order to detect maculopathy. In addition to the proposed detection system, the paper highlights a novel online dataset and presents the dataset collection, the expert diagnosis process and the advantages of this online database compared with other public eye fundus image databases for diabetic retinopathy research.
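
    Of the techniques listed, the Circular Hough Transform is the most self-contained to illustrate: in fundus images it is commonly used to localise roughly circular structures such as the optic disc. A generic OpenCV sketch on an 8-bit grayscale image; the radius bounds and accumulator parameters are illustrative, not the paper's values.

        import cv2

        def find_circles(gray_fundus):
            """Detect circular structures (e.g. optic disc candidates) in an
            8-bit grayscale fundus image with the Circular Hough Transform."""
            blurred = cv2.medianBlur(gray_fundus, 5)
            circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1,
                                       minDist=100, param1=100, param2=30,
                                       minRadius=20, maxRadius=80)
            return [] if circles is None else circles[0]   # rows of (x, y, radius)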

  5. Examining the influence of psychopathy, hostility biases, and automatic processing on criminal offenders' Theory of Mind.

    Science.gov (United States)

    Nentjes, Lieke; Bernstein, David; Arntz, Arnoud; van Breukelen, Gerard; Slaats, Mariëtte

    2015-01-01

    Theory of Mind (ToM) is a social-perceptual skill that refers to the ability to take someone else's perspective and infer what others think. The current study examined the effect of potential hostility biases, as well as controlled (slow) versus automatic (fast) processing, on ToM performance in psychopathy. ToM abilities (as assessed with the Reading the Mind in the Eyes Test; RMET; Baron-Cohen, Wheelwright, Hill, Raste, & Plumb, 2001) were compared between 39 PCL-R-diagnosed psychopathic offenders, 37 non-psychopathic offenders, and 26 nonoffender controls. Contrary to our hypothesis, psychopathic individuals presented with intact overall RMET performance when restrictions were imposed on how long task stimuli could be processed. In addition, psychopaths did not over-ascribe hostility to task stimuli (i.e., they lacked a hostility bias). However, there was a significant three-way interaction between hostility, processing speed, and psychopathy: when there was no time limit on stimulus presentation, psychopathic offenders made fewer errors in identifying more hostile eye stimuli compared to nonoffender controls, who seemed to be less accurate in detecting hostility. Psychopaths' more realistic appraisal of others' malevolent mental states is discussed in the light of theories that stress its potential adaptive function. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. The role of automaticity and attention in neural processes underlying empathy for happiness, sadness, and anxiety

    Directory of Open Access Journals (Sweden)

    Sylvia A. Morelli

    2013-05-01

    Although many studies have examined the neural basis of experiencing empathy, relatively little is known about how empathic processes are affected by different attentional conditions. Thus, we examined whether instructions to empathize might amplify responses in empathy-related regions and whether cognitive load would diminish the involvement of these regions. 32 participants completed a functional magnetic resonance imaging session assessing empathic responses to individuals experiencing happy, sad, and anxious events. Stimuli were presented under three conditions: watching naturally, while instructed to empathize, and under cognitive load. Across analyses, we found evidence for a core set of neural regions that support empathic processes (dorsomedial prefrontal cortex, DMPFC; medial prefrontal cortex, MPFC; temporoparietal junction, TPJ; amygdala; ventral anterior insula, AI; septal area, SA). Two key regions – the ventral AI and SA – were consistently active across all attentional conditions, suggesting that they are automatically engaged during empathy. In addition, watching versus empathizing with targets was not markedly different and instead led to similar subjective and neural responses to others' emotional experiences. In contrast, cognitive load reduced the subjective experience of empathy and diminished neural responses in several regions related to empathy (DMPFC, MPFC, TPJ, amygdala) and social cognition. The current results reveal how attention impacts empathic processes and provide insight into how empathy may unfold in everyday interactions.

  7. Distributed Processing of SETI Data

    Science.gov (United States)

    Korpela, Eric

    As you have read in prior chapters, researchers have been performing progressively more sensitive SETI searches since 1960. Each search has been limited by the technologies available at the time. As radio frequency technologies have become more efficient and computers have become faster, the searches have increased in capacity and become more sensitive. Often, the hardware that performs the calculations required to process the telescope data and expose any embedded signals is what limits the sensitivity of the search. Shortly before the start of the 21st century, projects began to appear that exploited the processing capabilities of computers connected to the Internet in order to solve problems that required a large amount of computing power. The SETI@home project, managed by myself and a group of researchers at the Space Sciences Laboratory of the University of California, Berkeley, was the first attempt to use large-scale distributed computing to solve the problems of performing a sensitive search for narrow-band radio signals from extraterrestrial civilizations (Korpela et al., 2001). A follow-on project, Astropulse, searches for extraterrestrial signals with wider bandwidths and shorter time durations. Both projects are ongoing at the present time (mid-2010).

  8. System of automatic control over data Acquisition and Transmission to IGR NNC RK Data Center

    International Nuclear Information System (INIS)

    Komarov, I.I.; Gordienko, D.D.; Kunakov, A.V.

    2005-01-01

    An automated system for seismic and acoustic data acquisition and transmission in real time has been established in the Data Center of IGR NNC RK, where it functions very successfully. The system monitors the quality and volume of acquired information and also tracks the status of the system and communication channels. Statistical data on system operation are accumulated in a purpose-built database. Information on system status is displayed on the Center's Web page. (author)

  9. Real-time environmental radiation monitoring system with automatic restoration of backup data in site detector via communication using radio frequency

    International Nuclear Information System (INIS)

    Lee, Wan No; Kim, Eun Han; Chung, Kun Ho; Cho, Young Hyun; Choi, Geun Sik; Lee, Chang Woo; Park, Ki Hyun; Kim, Yun Goo

    2003-01-01

    An environmental radiation monitoring system based on a high-pressure ionization chamber has been used for on-line gamma monitoring around KAERI (Korea Atomic Energy Research Institute); it transmits the dose data measured by the ionization chambers on site via radio frequency to a central processing computer and stores the transmitted real-time data. Although communication by radio frequency has several advantages, such as efficient and economical transmission, storage, and data processing, its main disadvantage is that data loss during transmission often occurs because of unexpected communication problems. It is possible to restore the lost data off-line, for example by floppy disk, but the simultaneous processing and display of the current data as well as the backup data are very difficult in the present on-line system. In this work, a new electronic circuit board and operation software applicable to the conventional environmental radiation monitoring system were developed, and the ionization chamber unit and the central processing computer are synchronized automatically every day. This system is able to restore the backup data automatically within 34 hours without additional equipment and can also display the current data together with the transmitted backup data after checking the time flag.
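
    The restore logic amounts to detecting gaps in the received series and merging the site detector's backup records by time stamp. A hedged sketch of that bookkeeping; the record format and sampling interval are assumptions.

        def merge_backup(received, backup, interval=120):
            """received/backup: {unix_time: dose} dicts. Fill any gap in
            `received` longer than `interval` seconds from the backup."""
            merged = dict(received)
            times = sorted(received)
            for t0, t1 in zip(times, times[1:]):
                if t1 - t0 > interval:                 # transmission gap detected
                    for t, dose in backup.items():
                        if t0 < t < t1:
                            merged[t] = dose           # restore lost records
            return dict(sorted(merged.items()))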

  10. Automatic translation of MPI source into a latency-tolerant, data-driven form

    International Nuclear Information System (INIS)

    Nguyen, Tan; Cicotti, Pietro; Bylaska, Eric; Quinlan, Dan; Baden, Scott

    2017-01-01

    Hiding communication behind useful computation is an important performance programming technique but remains an inscrutable programming exercise even for the expert. We present Bamboo, a code transformation framework that can realize communication overlap in applications written in MPI without the need to intrusively modify the source code. We reformulate MPI source into a task dependency graph representation, which partially orders the tasks, enabling the program to execute in a data-driven fashion under the control of an external runtime system. Experimental results demonstrate that Bamboo significantly reduces communication delays while requiring only modest amounts of programmer annotation for a variety of applications and platforms, including those employing co-processors and accelerators. Moreover, Bamboo’s performance meets or exceeds that of labor-intensive hand coding. As a result, the translator is more than a means of hiding communication costs automatically; it demonstrates the utility of semantic level optimization against a well-known library.
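
    The transformation target, communication overlapped with computation, is conventionally written by hand with nonblocking MPI calls, as in the mpi4py sketch of a ring exchange below; Bamboo's contribution is deriving equivalent data-driven behaviour from plain blocking MPI source automatically.

        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        left, right = (rank - 1) % size, (rank + 1) % size

        local = np.random.rand(1_000_000)
        halo = np.empty(1, dtype=np.float64)

        # Post communication early ...
        reqs = [comm.Isend(local[:1], dest=left), comm.Irecv(halo, source=right)]
        interior = local[1:-1].sum()        # ... compute on the interior meanwhile
        MPI.Request.Waitall(reqs)           # wait only before touching the halo
        total = interior + halo[0]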

  11. Automatic sleep classification using a data-driven topic model reveals latent sleep states

    DEFF Research Database (Denmark)

    Koch, Henriette; Christensen, Julie Anja Engelhard; Frandsen, Rune

    2014-01-01

    Background: The golden standard for sleep classification uses manual scoring of polysomnography despite points of criticism such as oversimplification, low inter-rater reliability and the standard being designed on young and healthy subjects. New method: To meet the criticism and reveal the latent sleep states, this study developed a general and automatic sleep classifier using a data-driven approach. Spectral EEG and EOG measures and eye correlation in 1 s windows were calculated and each sleep epoch was expressed as a mixture of probabilities of latent sleep states by using the topic model Latent Dirichlet Allocation. Model application was tested on control subjects and patients with periodic leg movements (PLM) representing a non-neurodegenerative group, and patients with idiopathic REM sleep behavior disorder (iRBD) and Parkinson's Disease (PD) representing a neurodegenerative group.
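
    The topic-model step can be sketched with off-the-shelf LDA: each epoch is treated as a "document" of discrete "words" (quantized EEG/EOG features from the 1 s windows), and the fitted model expresses each epoch as a mixture over latent states. A generic sketch, not the paper's pipeline; it assumes the features have already been quantized into non-negative counts.

        from sklearn.decomposition import LatentDirichletAllocation

        def latent_sleep_states(counts, n_states=5):
            """counts: (n_epochs, n_feature_words) matrix of quantized
            EEG/EOG feature occurrences per sleep epoch."""
            lda = LatentDirichletAllocation(n_components=n_states, random_state=0)
            return lda.fit_transform(counts)    # rows ~ P(latent state | epoch)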

  12. Influence of vinasse on water movement in soil, using an automatic data acquisition and handling system

    International Nuclear Information System (INIS)

    Nascimento Filho, V.F. do; Barros Ferraz, E.S. de

    1986-01-01

    Vinasse, a by-product of the ethyl alcohol industry from the yeast fermentation of sugar cane juice or molasses, has been incorporated into the soil as fertilizer owing to its high organic matter (2-6%), potassium and sulphate (0.1-0.5%) and other nutrient contents. The influence of vinasse on water movement in the soil was studied by employing the monoenergetic gamma-ray beam attenuation technique (241Am; 59.5 keV; 100 mCi). For this, an automatic data acquisition and handling system was used, based on a multichannel analyser operated in multi-scaling mode and coupled to a personal microcomputer and plotter. Despite the small depth studied (6 cm), it was observed that vinasse decreases the water infiltration velocity in the soil. (Author) [pt
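
    The measurement principle can be worked through in a few lines: the beam is attenuated following the Beer-Lambert law, so a change in volumetric water content follows from the count-rate ratio. The coefficient and count values below are illustrative, not those of the study.

```python
import numpy as np

mu_w = 0.20   # assumed linear attenuation coefficient of water near 60 keV, cm^-1
x = 6.0       # beam path length through the soil column, cm

# Example count rates before and after infiltration (illustrative numbers).
I_dry, I_wet = 12000.0, 9500.0

# I = I0 * exp(-mu_w * theta * x)  =>  delta_theta = ln(I_dry / I_wet) / (mu_w * x)
delta_theta = np.log(I_dry / I_wet) / (mu_w * x)
print(f"change in volumetric water content: {delta_theta:.3f}")
```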

  13. Engineering studies related to Skylab program. [assessment of automatic gain control data

    Science.gov (United States)

    Hayne, G. S.

    1973-01-01

    The relationship between the S-193 Automatic Gain Control data and the magnitude of received signal power was studied in order to characterize performance parameters for Skylab equipment. The r-factor, used for the assessment, is defined to be less than unity and is a function of off-nadir angle, ocean surface roughness, and receiver signal-to-noise ratio. A digital computer simulation was also used to assess the effect of additive receiver, or white, noise. The system model for the digital simulation is described, along with the intermediate frequency and video impulse response functions used, details of the input waveforms, and results to date. A specific discussion of the digital computer programs used is also provided.

  14. A comparison of density of Insight and Ektaspeed plus dental x-ray films using automatic and manual processing

    International Nuclear Information System (INIS)

    Yoon, Suk Ja

    2001-01-01

    To compare the film density of Insight dental X-ray film (Eastman Kodak Co., Rochester, NY, USA) with that of Ektaspeed Plus film (Eastman Kodak) under manual and automatic processing conditions, Insight and Ektaspeed Plus films were exposed through an aluminum step wedge at three different exposure times. The exposed films were processed both manually and automatically. The base-plus-fog density and the optical densities produced by exposing the step wedge were measured using a digital densitometer (model 07-443, Victoreen Inc., Cleveland, Ohio, USA). The optical densities of the Insight and Ektaspeed Plus films versus the thickness of the aluminum wedge at the same exposure time were plotted on graphs, and statistical analyses were applied to compare the optical densities of the two films. The film density of both Insight and Ektaspeed Plus films was significantly higher under automatic processing than under manual processing, and the film density of Insight film was higher than that of Ektaspeed Plus film. To take full advantage of reduced exposure times, Insight film should be processed automatically.

  15. In vitro motility evaluation of aggregated cancer cells by means of automatic image processing.

    Science.gov (United States)

    De Hauwer, C; Darro, F; Camby, I; Kiss, R; Van Ham, P; Decaesteker, C

    1999-05-01

    Set-up of an automatic image-processing-based method that enables the motility of in vitro aggregated cells to be evaluated over a number of hours. Our biological model included the PC-3 human prostate cancer cell line growing as a monolayer on the bottom of Falcon plastic dishes containing conventional culture media. Our equipment consisted of an incubator, an inverted phase contrast microscope, a Charge Coupled Device (CCD) video camera, and a computer equipped with image processing software developed in our laboratory. This computer-assisted microscope analysis of aggregated cells enables global cluster motility to be evaluated. The analysis also enables the trajectory of each cell to be isolated and parametrized within a given cluster or, indeed, the trajectories of individual cells outside a cluster. The results show that motility inside a PC-3 cluster is not restricted to slight motion due to cluster expansion, but rather consists of marked cell movement within the cluster. The proposed equipment enables in vitro aggregated cell motility to be studied. The method can, therefore, be used in pharmacological studies in order to select anti-motility compounds. The compounds selected by the equipment described could then be tested in vivo as potential anti-metastatic agents.

  16. Automatic crack detection method for loaded coal in vibration failure process.

    Directory of Open Access Journals (Sweden)

    Chengwu Li

    In the coal mining process, the destabilization of a loaded coal mass is a prerequisite for coal and rock dynamic disasters, and surface cracks of the coal and rock mass are important indicators reflecting the current state of the coal body. The detection of surface cracks in the coal body plays an important role in coal mine safety monitoring. In this paper, a method for detecting the surface cracks of loaded coal in the vibration failure process is proposed, based on the characteristics of the surface cracks of coal and a support vector machine (SVM). A large number of crack images were obtained by establishing a vibration-induced failure test system with an industrial camera. Histogram equalization and a hysteresis threshold algorithm were used to reduce the noise and emphasize the cracks; then, 600 images and regions, including cracks and non-cracks, were manually labelled. In the crack feature extraction stage, eight features of the cracks are extracted to distinguish cracks from other objects. Finally, a crack identification model with an accuracy over 95% was trained by inputting the labelled sample images into the SVM classifier. The experimental results show that the proposed algorithm has a higher accuracy than the conventional algorithm and can effectively identify cracks on the surface of the coal and rock mass automatically.
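
    A minimal sketch of the final training stage, assuming the eight crack features have already been extracted into a feature matrix; random placeholders stand in for the 600 labelled regions.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(600, 8))        # 600 labelled regions x 8 crack features
y = rng.integers(0, 2, size=600)     # 1 = crack, 0 = non-crack (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))  # scale, then SVM
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```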

  17. A method for automatic control of the process of producing electrode pitch

    Energy Technology Data Exchange (ETDEWEB)

    Rozenman, E.S.; Bugaysen, I.M.; Chernyshov, Yu.A.; Klyusa, M.D.; Krysin, V.P.; Livshits, B.Ya.; Martynenko, V.V.; Meniovich, B.I.; Sklyar, M.G.; Voytenko, B.I.

    1983-01-01

    A method is proposed for automatic control of the process of producing electrode pitch through regulation of the feed of the starting raw material, with correction based on the pitch level in the last apparatus of the technological line, and through changes in the feed of air into the reactors based on the flow rates of the starting raw material and the temperature of the liquid phase in the reactors. In order to increase the stability of the quality of the electrode pitch under changes in the properties of the starting resin, the heating temperature of the dehydrated resin in the pipe furnace is regulated relative to the quality of the medium-temperature pitch produced from it, while the level of the liquid phase in the reactor is regulated relative to the quality of the final product. The proposed method improves the quality of process regulation, which makes it possible to improve the properties of the anode mass and to reduce its consumption in the production of aluminum.

  18. Automatic and directed search processes in solving simple semantic-memory problems.

    Science.gov (United States)

    Ben-Zur, H

    1989-09-01

    The cognitive processes involved in simple semantic-memory problems were investigated in four experiments. On each trial of Experiments 1 and 2, two stimulus words were presented, with the instructions to find a third word (i.e., the solution) that, when coupled with each of the stimuli, would yield two word pairs used in everyday language (e.g., surprise and birthday, for which the solution is party). The results of the two experiments indicated that informing the subject whether the solution constituted the first or the second element in the word pairs facilitated both likelihood and speed of solution attainment. In addition, solution attainment was relatively high for items based on frequently used word pairs (Experiment 1) and for items in which the stimuli appear, in everyday language, in a small number of word pairs (Experiment 2). In Experiment 3, the subjects were required to produce word pairs containing one of the two stimulus words from the items used in Experiment 2. Solution production was facilitated by rehearsing the second stimulus word of the specific item. The conclusion, supported by a post hoc analysis of the results of Experiments 2 and 3 (Experiment 4), was that indirect priming from one stimulus word may facilitate solution production from a searched word. These results are interpreted in terms of automatic and controlled processes, and their relevance to two different models for retrieval from semantic memory is discussed.

  19. AN EFFICIENT METHOD FOR AUTOMATIC ROAD EXTRACTION BASED ON MULTIPLE FEATURES FROM LiDAR DATA

    Directory of Open Access Journals (Sweden)

    Y. Li

    2016-06-01

    Road extraction in urban areas is a difficult task due to the complicated patterns and many contextual objects. LiDAR data directly provides three-dimensional (3D) points with fewer occlusions and smaller shadows. The elevation information and surface roughness are distinguishing features for separating roads. However, LiDAR data has some disadvantages that are not beneficial to object extraction, such as the irregular distribution of point clouds and the lack of clear road edges. To address these problems, this paper proposes an automatic road centerline extraction method with three major steps: (1) road center point detection based on multiple-feature spatial clustering for separating road points from ground points, (2) local principal component analysis with least-squares fitting for extracting the primitives of road centerlines, and (3) hierarchical grouping for connecting the primitives into a complete road network. Compared with MTH (consisting of the Mean shift algorithm, Tensor voting, and the Hough transform), proposed in our previous article, this method greatly reduces the computational cost. To evaluate the proposed method, the Vaihingen data set, a benchmark provided by ISPRS for the “Urban Classification and 3D Building Reconstruction” project, was selected. The experimental results show that our method achieves the same performance in less time for road extraction from LiDAR data.

  20. An Efficient Method for Automatic Road Extraction Based on Multiple Features from LiDAR Data

    Science.gov (United States)

    Li, Y.; Hu, X.; Guan, H.; Liu, P.

    2016-06-01

    Road extraction in urban areas is a difficult task due to the complicated patterns and many contextual objects. LiDAR data directly provides three-dimensional (3D) points with fewer occlusions and smaller shadows. The elevation information and surface roughness are distinguishing features for separating roads. However, LiDAR data has some disadvantages that are not beneficial to object extraction, such as the irregular distribution of point clouds and the lack of clear road edges. To address these problems, this paper proposes an automatic road centerline extraction method with three major steps: (1) road center point detection based on multiple-feature spatial clustering for separating road points from ground points, (2) local principal component analysis with least-squares fitting for extracting the primitives of road centerlines, and (3) hierarchical grouping for connecting the primitives into a complete road network. Compared with MTH (consisting of the Mean shift algorithm, Tensor voting, and the Hough transform), proposed in our previous article, this method greatly reduces the computational cost. To evaluate the proposed method, the Vaihingen data set, a benchmark provided by ISPRS for the "Urban Classification and 3D Building Reconstruction" project, was selected. The experimental results show that our method achieves the same performance in less time for road extraction from LiDAR data.
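
    Step (2), local principal component analysis for centerline primitives, can be sketched as follows; the neighbourhood radius and synthetic points are assumptions for illustration, not the authors' parameters.

```python
import numpy as np

def local_direction(points, idx, radius=2.0):
    """Return the unit vector of the first principal component around one point."""
    d = np.linalg.norm(points - points[idx], axis=1)
    nbrs = points[d < radius]                    # local neighbourhood
    centred = nbrs - nbrs.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return vt[0]                                 # dominant (centerline) direction

rng = np.random.default_rng(0)
# Synthetic road-center points: a roughly straight strip with small jitter.
pts = np.column_stack([np.linspace(0, 10, 50), 0.1 * rng.standard_normal(50)])
print(local_direction(pts, idx=25))
```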

  1. On the Automatic Generation of Plans for Life Cycle Assembly Processes

    Energy Technology Data Exchange (ETDEWEB)

    CALTON,TERRI L.

    2000-01-01

    Designing products for easy assembly and disassembly during their entire life cycles, for purposes including product assembly, product upgrade, product servicing and repair, and product disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and its intended life cycle. Different goals and manufacturing plan selection criteria, as compared to initial assembly, require re-visiting significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited either to academic studies of issues in assembly planning or to applied studies of life cycle assembly processes that give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life cycle assembly processes. The study of assembly planning is at the very heart of manufacturing research facilities and academic engineering institutions, and in recent years a number of significant advances in the field of assembly planning have been made. These advances have ranged from the development of automated assembly planning systems, such as Sandia's Automated Assembly Analysis System Archimedes 3.0©, to the startling revolution in microprocessors and computer-controlled production tools such as computer-aided design (CAD), computer-aided manufacturing (CAM), flexible manufacturing systems (FMS), and computer-integrated manufacturing (CIM). These results have kindled considerable interest in the study of algorithms for life cycle related assembly processes, which has blossomed into a field of intense interest. The intent of this manuscript is to bring together the fundamental results in this area, so that the unifying principles and underlying concepts of algorithm design may more easily be implemented in practice.

  2. Segmentation of Multi-Isotope Imaging Mass Spectrometry Data for Semi-Automatic Detection of Regions of Interest

    Science.gov (United States)

    Poczatek, J. Collin; Turck, Christoph W.; Lechene, Claude

    2012-01-01

    Multi-isotope imaging mass spectrometry (MIMS) associates secondary ion mass spectrometry (SIMS) with the detection of several atomic masses, the use of stable isotopes as labels, and affiliated quantitative image-analysis software. By associating image and measure, MIMS allows one to obtain quantitative information about biological processes in sub-cellular domains. MIMS can be applied to a wide range of biomedical problems, in particular metabolism and cell fate [1], [2], [3]. In order to obtain morphologically pertinent data from MIMS images, regions of interest (ROIs) have to be defined. ROIs are drawn by hand, a tedious and time-consuming process. We have developed and successfully applied a support vector machine (SVM) for the segmentation of MIMS images that allows fast, semi-automatic boundary detection of regions of interest. Using the SVM, high-quality ROIs (as compared to an expert's manual delineation) were obtained for 2 types of images derived from unrelated data sets. This automation simplifies, accelerates and improves the post-processing analysis of MIMS images. This approach has been integrated into “Open MIMS,” an ImageJ plugin for comprehensive analysis of MIMS images that is available online at http://www.nrims.hms.harvard.edu/NRIMS_ImageJ.php. PMID:22347386

  3. Automatic registration of optical imagery with 3d lidar data using local combined mutual information

    Directory of Open Access Journals (Sweden)

    E. G. Parmehr

    2013-10-01

    Automatic registration of multi-sensor data is a basic step in data fusion for photogrammetric and remote sensing applications. The effectiveness of intensity-based methods such as Mutual Information (MI) for the automated registration of multi-sensor images has been previously reported for medical and remote sensing applications. In this paper, a new multivariable MI approach is presented that exploits the complementary information of inherently registered LiDAR DSM and intensity data to improve the robustness of registering optical imagery to a LiDAR point cloud. LiDAR DSM and intensity information are utilised in measuring the similarity of LiDAR and optical imagery via Combined MI. An effective histogramming technique is adopted to facilitate estimation of the 3D probability density function (pdf). In addition, a local similarity measure is introduced to decrease the complexity of optimisation at higher dimensions and the computational cost. The reliability of registration is thereby improved due to the use of redundant observations of similarity. The performance of the proposed method for the registration of satellite and aerial images with LiDAR data in urban and rural areas is experimentally evaluated and the results are discussed.
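
    The similarity core, mutual information estimated from a joint histogram, can be sketched in a few lines; the bin count and the surrogate image layers below are illustrative assumptions.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information of two images from their joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()                      # joint pdf estimate
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)      # marginals
    nz = pxy > 0                                   # avoid log(0)
    return np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))

rng = np.random.default_rng(0)
img = rng.random((64, 64))                         # stand-in optical band
dsm = img + 0.1 * rng.random((64, 64))             # correlated surrogate LiDAR layer
print(f"MI: {mutual_information(img, dsm):.3f}")
```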

  4. Deep Learning Approach for Automatic Classification of Ocular and Cardiac Artifacts in MEG Data

    Directory of Open Access Journals (Sweden)

    Ahmad Hasasneh

    2018-01-01

    We propose an artifact classification scheme based on a combined deep and convolutional neural network (DCNN) model to automatically identify cardiac and ocular artifacts in neuromagnetic data, without the need for additional electrocardiogram (ECG) and electrooculogram (EOG) recordings. From independent components, the model uses both the spatial and temporal information of the decomposed magnetoencephalography (MEG) data. In total, 7122 samples were used after data augmentation, with task and non-task related MEG recordings from 48 subjects serving as the database for this study. Artifact rejection was applied using the combined model, which achieved a sensitivity and specificity of 91.8% and 97.4%, respectively. The overall accuracy of the model was validated using a cross-validation test and revealed a median accuracy of 94.4%, indicating high reliability of DCNN-based artifact removal in task and non-task related MEG experiments. The major advantages of the proposed method are as follows: (1) it is a fully automated and user-independent workflow for artifact classification in MEG data; (2) once the model is trained, there is no need for auxiliary signal recordings; (3) the flexibility in the model design and training allows for various modalities (MEG/EEG) and various sensor types.

  5. Welding qualification procedure for fuel rods tubes of Zr-Sn alloys by the TIG automatic process

    International Nuclear Information System (INIS)

    1984-11-01

    The requirements to be used in the welding qualification procedure for tubes of Zr-Sn alloys, specified in the ASTM B353 regulatory guide and used in the fabrication of fuel rods for PWR reactors by the automatic TIG process, are presented. (E.G.) [pt

  6. Data taking and processing system for nuclear experimental physics study

    International Nuclear Information System (INIS)

    Nagashima, Y.; Kimura, H.; Katori, K.; Kuriyama, K.

    1979-01-01

    A multi-input, multi-mode, multi-user data taking and processing system was developed. This system has the following special features. 1) It is a multi-computer system constituted of two special processors and two minicomputers. 2) Pseudo devices are introduced to make operating procedures simple and easy; in particular, the selection or modification of a 1-8 coincidence mode can be done very easily and quickly. 3) A 16 K-channel spectrum storage has 8 partitions. The partitions, which have floating sizes, are all handled automatically by the data taking software SHINE. 4) On-line real-time data processing can be done. Using the FORTRAN language, users may prepare the processing software separately from the data taking software. Under the RSX-11D system software, this software runs concurrently with the data taking software in multi-programming mode. 5) Data communication between arbitrary external devices and this system is possible. With these communication procedures, not only data transfer between computers but also control of the experimental devices is realized. Like the real-time processing software, this software can be prepared by users and run concurrently with other software. 6) For data monitoring, two different graphic displays are used complementarily. One is a refresh-type high-speed display; the other is a storage-type large-screen display. Raw data are displayed on the former; processed data or multi-parametric large-volume data are displayed on the latter. (author)

  7. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    Science.gov (United States)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O'Donovan, Peter; O'Sullivan, Dominic T. J.

    2017-11-01

    Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation, as well as the sub-system the fault was attributed to. Manually identifying faults using maintenance logs can be effective, but it is also highly time-consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages on the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine's sub-systems, a routine maintenance activity, a grid-related event, or one of a number of other categories. This is then checked against maintenance logs for accuracy, and the labelled data are fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage, and can be effectively used for SCADA-based prediction of turbine faults.
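
    One way such labelling could look in practice is sketched below; the column names and the alarm-derived stoppage window are assumptions for illustration, not the authors' exact schema.

```python
import pandas as pd

# 10-minute SCADA rows (synthetic values).
scada = pd.DataFrame({
    "timestamp": pd.date_range("2017-06-01", periods=6, freq="10min"),
    "power_kw": [1500, 1480, 0, 0, 1490, 1505],
})
# A stoppage window derived from the alarms system (hypothetical).
stoppages = pd.DataFrame({
    "start": [pd.Timestamp("2017-06-01 00:20")],
    "end": [pd.Timestamp("2017-06-01 00:40")],
    "category": ["pitch fault"],
})

scada["label"] = "normal"
for _, s in stoppages.iterrows():
    inside = scada["timestamp"].between(s["start"], s["end"], inclusive="left")
    scada.loc[inside, "label"] = s["category"]   # tag rows inside the stoppage
print(scada)
```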

  8. Data triggered data processing at MFTF-B

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1985-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory we schedule jobs to process experimental data to be collected during a five minute shot cycle. Our data-driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs "fire", or execute, as input data becomes available. Similar to UNIX "pipes", data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on our networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. We report here on details of diagnostic data processing and our experiences.

  9. Data triggered data processing at the Mirror Fusion Test Facility

    International Nuclear Information System (INIS)

    Jackson, R.J.; Balch, T.R.; Preckshot, G.G.

    1986-01-01

    A primary characteristic of most batch systems is that the input data files must exist before jobs are scheduled. On the Mirror Fusion Test Facility (MFTF-B) at Lawrence Livermore National Laboratory the authors schedule jobs to process experimental data to be collected during a five minute shot cycle. The data driven processing system emulates a coarsely granular data flow architecture. Processing jobs are scheduled before the experimental data is collected. Processing jobs "fire", or execute, as input data becomes available. Similar to UNIX "pipes", data produced by upstream processing nodes may be used as inputs by following nodes. Users, working on the networked SUN workstations, specify data processing templates which define processes and their data dependencies. Data specifications indicate the source of data; actual associations with specific data instantiations are made when the jobs are scheduled. The authors report here on details of diagnostic data processing and their experiences.
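
    A toy sketch of the "fire when inputs exist" rule described in these two records; the node and product names are invented for illustration.

```python
def run_dataflow(nodes, available):
    """nodes: {name: (inputs, fn)}; available: {product: value}."""
    pending = dict(nodes)
    while pending:
        # A node fires as soon as all of its input products exist.
        ready = [n for n, (ins, _) in pending.items()
                 if all(i in available for i in ins)]
        if not ready:
            break                        # remaining nodes wait on missing data
        for name in ready:
            ins, fn = pending.pop(name)
            available[name] = fn(*[available[i] for i in ins])
    return available

results = run_dataflow(
    {"calibrated": (["raw"], lambda r: [x * 2 for x in r]),   # upstream node
     "summary": (["calibrated"], sum)},                       # "pipe"-like consumer
    {"raw": [1, 2, 3]},
)
print(results["summary"])   # 12
```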

  10. Seeing race: N170 responses to race and their relation to automatic racial attitudes and controlled processing.

    Science.gov (United States)

    Ofan, Renana H; Rubin, Nava; Amodio, David M

    2011-10-01

    We examined the relation between neural activity reflecting early face perception processes and automatic and controlled responses to race. Participants completed a sequential evaluative priming task, in which two-tone images of Black faces, White faces, and cars appeared as primes, followed by target words categorized as pleasant or unpleasant, while electroencephalography was recorded. Half of these participants were alerted that the task assessed racial prejudice and could reveal their personal bias ("alerted" condition). To assess face perception processes, the N170 component of the ERP was examined. For all participants, stronger automatic pro-White bias was associated with larger N170 amplitudes to Black than to White faces. For participants in the alerted condition only, larger N170 amplitudes to Black versus White faces were also associated with less controlled processing on the word categorization task. These findings suggest that preexisting racial attitudes affect early face processing and that situational factors moderate the link between early face processing and behavior.

  11. An algorithm for generating data accessibility recommendations for flight deck Automatic Dependent Surveillance-Broadcast (ADS-B) applications

    Science.gov (United States)

    2014-09-09

    Automatic Dependent Surveillance-Broadcast (ADS-B) In technology supports the display of traffic data on Cockpit Displays of Traffic Information (CDTIs). The data are used by flightcrews to perform defined self-separation procedures, such as the in-t...

  12. Runtime Modifications of Spark Data Processing Pipelines

    NARCIS (Netherlands)

    Lazovik, E.; Medema, M.; Albers, T.; Langius, E.A.F.; Lazovik, A.

    2017-01-01

    Distributed data processing systems are the standard means for large-scale data analysis in the Big Data field. These systems are based on processing pipelines where the processing is done via a composition of multiple elements or steps. In current distributed data processing systems, the code and

  13. Acute stress shifts the balance between controlled and automatic processes in prospective memory.

    Science.gov (United States)

    Möschl, Marcus; Walser, Moritz; Plessow, Franziska; Goschke, Thomas; Fischer, Rico

    2017-10-01

    In everyday life we frequently rely on our abilities to postpone intentions until later occasions (prospective memory; PM) and to deactivate completed intentions even in stressful situations. Yet, little is known about the effects of acute stress on these abilities. In the present work we investigated the impact of acute stress on PM functioning under high task demands. (1) Different from previous studies, in which intention deactivation required mostly low processing demands, we used salient focal PM cues to induce high processing demands during intention-deactivation phases. (2) We systematically manipulated PM-monitoring demands in a nonfocal PM task that required participants to monitor for either one or six specific syllables that could occur in ongoing-task words. Eighty participants underwent the Trier Social Stress Test, a standardized stress induction protocol, or a standardized control situation, before performing a computerized PM task. Our primary interests were whether PM performance, PM-monitoring costs, aftereffects of completed intentions and/or commission-error risk would differ between stressed and non-stressed individuals and whether these effects would differ under varying task demands. Results revealed that PM performance and aftereffects of completed intentions during subsequent performance were not affected by acute stress induction, replicating previous findings. Under high demands on intention deactivation (focal condition), however, acute stress produced a nominal increase in erroneous PM responses after intention completion (commission errors). Most importantly, under high demands on PM monitoring (nonfocal condition), acute stress led to a substantial reduction in PM-monitoring costs. These findings support ideas of selective and demand-dependent effects of acute stress on cognitive functioning. Under high task demands, acute stress might induce a shift in processing strategy towards resource-saving behavior, which seems to increase the

  14. Development of Web Tools for the automatic Upload of Calibration Data into the CMS Condition Data

    Science.gov (United States)

    di Guida, Salvatore; Innocente, Vincenzo; Pierro, Antonio

    2010-04-01

    This article explains the recent evolution of the Condition Database Application Service. The Condition Database Application Service is part of the condition database system of the CMS experiment, and it is used for handling and monitoring the CMS detector condition data and the corresponding computing resources, such as Oracle databases, storage services and network devices. We deployed a service, the offline Dropbox service, which will be used by the Alignment and Calibration Group in order to upload, from the offline network (GPN), the calibration constants produced by running offline analysis.

  15. Fidelity of Automatic Speech Processing for Adult and Child Talker Classifications.

    Directory of Open Access Journals (Sweden)

    Mark VanDam

    Automatic speech processing (ASP) has recently been applied to very large datasets of naturalistically collected, daylong recordings of child speech via an audio recorder worn by young children. The system developed by the LENA Research Foundation analyzes children's speech for research and clinical purposes, with special focus on identifying and tagging family speech dynamics and the at-home acoustic environment from the auditory perspective of the child. A primary issue for researchers, clinicians, and families using the Language ENvironment Analysis (LENA) system is to what degree the segment labels are valid. This classification study evaluates the performance of the computer ASP output against 23 trained human judges who made about 53,000 classification judgements of segments tagged by the LENA ASP. Results indicate performance consistent with modern ASP systems, such as those using HMM methods, with acoustic characteristics of fundamental frequency and segment duration most important for both human and machine classifications. The results are likely to be important for interpreting and improving ASP output.

  16. A method of automatic control of the process of compressing pyrogas in olefin production

    Energy Technology Data Exchange (ETDEWEB)

    Podval' niy, M.L.; Bobrovnikov, N.R.; Kotler, L.D.; Shib, L.M.; Tuchinskiy, M.R.

    1982-01-01

    In the known method of automatic control of the process of compressing pyrogas in olefin production, the supply of cooling agents to the interstage coolers of the compression unit is regulated depending on the flow of hydrocarbons to the compression unit. To raise performance by lowering the deposition of polymers on the flow-through surfaces of the equipment, the coolant supply is also regulated as a function of the flows of hydrocarbons from the upper and lower parts of the demethanizer and the bottoms of the stripping tower. The coolant supply is regulated proportionally to the difference between the flow of stripping tower bottoms and the ratio of the hydrocarbon flow from the upper and lower parts of the demethanizer to the hydrocarbon flow in the compression unit. With an increase in the proportion of light hydrocarbons (the sum of the upper and lower demethanizer products) in the total flow of pyrogas going to compression, the flow of coolant to the compression unit is reduced; condensation of the given fractions in the separators, and their amount in the condensate going through the piping to the stripping tower, is thereby reduced. With a reduction in the proportion of light hydrocarbons in the pyrogas, the flow of coolant is increased, thus improving the condensation of heavy hydrocarbons in the separators and removing them from the compression unit in the bottoms of the stripping tower.

  17. Extended morphological processing: a practical method for automatic spot detection of biological markers from microscopic images.

    Science.gov (United States)

    Kimori, Yoshitaka; Baba, Norio; Morone, Nobuhiro

    2010-07-08

    A reliable extraction technique for resolving multiple spots in light or electron microscopic images is essential in investigations of the spatial distribution and dynamics of specific proteins inside cells and tissues. Currently, automatic spot extraction and characterization in complex microscopic images poses many challenges to conventional image processing methods. A new method to extract closely located, small target spots from biological images is proposed. The method starts with a simple but practical operation based on the extended morphological top-hat transformation to subtract an uneven background. The core of our novel approach is the following: first, the original image is rotated in an arbitrary direction and each rotated image is opened with a single straight line-segment structuring element. Second, the opened images are unified and then subtracted from the original image. To evaluate these procedures, model images of simulated spots with closely located targets were created, and the efficacy of our method was compared to that of conventional morphological filtering methods. The results showed the better performance of our method. Spots in real microscope images can be quantified, confirming that the method is applicable in practice. Our method achieved effective spot extraction under various image conditions, including aggregated target spots, poor signal-to-noise ratio, and large variations in the background intensity. Furthermore, it has no restrictions with respect to the shape of the extracted spots. These features allow the broad application of our method in biological and biomedical image information analysis.
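
    A hedged sketch of the core operation follows; for simplicity it rotates the line-segment structuring element rather than the image, which has an equivalent effect, and the element length and angular step are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def line_footprint(length, angle_deg):
    """Binary line-segment footprint at the given orientation."""
    theta = np.deg2rad(angle_deg)
    r = np.arange(length) - length // 2
    ys = np.round(r * np.sin(theta)).astype(int) + length // 2
    xs = np.round(r * np.cos(theta)).astype(int) + length // 2
    fp = np.zeros((length, length), dtype=bool)
    fp[ys, xs] = True
    return fp

rng = np.random.default_rng(0)
img = rng.random((128, 128))
img[60:63, 60:63] += 3.0                       # a small bright "spot"

# Open with line segments at several orientations, then unify the openings.
openings = [ndimage.grey_opening(img, footprint=line_footprint(15, a))
            for a in range(0, 180, 15)]
background = np.maximum.reduce(openings)        # union of directional openings
spots = img - background                        # top-hat: small spots stand out
print(spots.max())
```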

  18. DDT: A Research Tool for Automatic Data Distribution in High Performance Fortran

    Directory of Open Access Journals (Sweden)

    Eduard Ayguadé

    1997-01-01

    This article describes the main features and implementation of our automatic data distribution research tool. The tool (DDT) accepts programs written in Fortran 77 and generates High Performance Fortran (HPF) directives to map arrays onto the memories of the processors and parallelize loops, and executable statements to remap these arrays. DDT works by identifying a set of computational phases (procedures and loops). The algorithm builds a search space of candidate solutions for these phases which is explored looking for the combination that minimizes the overall cost; this cost includes data movement cost and computation cost. The movement cost reflects the cost of accessing remote data during the execution of a phase and the remapping costs that have to be paid in order to execute the phase with the selected mapping. The computation cost includes the cost of executing a phase in parallel according to the selected mapping and the owner computes rule. The tool supports interprocedural analysis and uses control flow information to identify how phases are sequenced during the execution of the application.

  19. Chemical name extraction based on automatic training data generation and rich feature set.

    Science.gov (United States)

    Yan, Su; Spangler, W Scott; Chen, Ying

    2013-01-01

    The automation of extracting chemical names from text has significant value for biomedical and life science research. A major barrier in this task is the difficulty of obtaining sizable, good-quality data to train a reliable entity extraction model. Another difficulty is the selection of informative features of chemical names, since comprehensive domain knowledge of chemistry nomenclature is required. Leveraging random text generation techniques, we explore the idea of automatically creating training sets for the task of chemical name extraction. Assuming the availability of an incomplete list of chemical names, called a dictionary, we are able to generate well-controlled, random, yet realistic chemical-like training documents. We statistically analyze the construction of chemical names based on the incomplete dictionary, and propose a series of new features, without relying on any domain knowledge. Compared to state-of-the-art models learned from manually labeled data and domain knowledge, our solution shows better or comparable results in annotating real-world data with less human effort. Moreover, we report an interesting observation about the language of chemical names: both the structural and semantic components of chemical names follow a Zipfian distribution, which resembles many natural languages.

  20. Automatic cardiac gating of small-animal PET from list-mode data

    Energy Technology Data Exchange (ETDEWEB)

    Herraiz, J.L.; Udias, J.M. [Universidad Complutense de Madrid Univ. (Spain). Grupo de Fisica Nuclear; Vaquero, J.J.; Desco, M. [Universidad Carlos III de Madrid (Spain). Dept. de Bioingenieria e Ingenieria Aeroespacial; Cusso, L. [Hospital General Universitario Gregorio Maranon, Madrid (Spain). Unidad de Medicina y Cirugia Experimental

    2011-07-01

    This work presents a method to automatically obtain the cardiac gating signal in a PET study of rats by employing the variation with time of the counts in the cardiac region, which can be extracted from list-mode data. In an initial step, the cardiac region is identified in image space by backward-projecting a small fraction of the acquired data and studying the variation with time of the counts in each voxel inside said region, at frequencies between 2 and 8 Hz. The region obtained corresponds accurately to the left ventricle of the rat's heart. In a second step, the lines of response (LORs) connected with this region are found by forward-projecting the region. The time variation of the number of counts in these LORs contains the cardiac motion information that we want to extract. This variation of counts with time is band-pass filtered to reduce noise, and the time signal so obtained is used to create the gating signal. The result was compared with a cardiac gating signal obtained from an ECG acquired simultaneously with the PET study. Reconstructed gated images obtained from both gating signals are similar. The proposed method demonstrates that valid cardiac gating signals can be obtained for rats from PET list-mode data. (orig.)
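
    The band-pass step can be sketched on a synthetic count-rate trace; the sampling rate, filter order, and peak-picking rule below are assumptions, not the authors' exact parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fs = 100.0                                    # count-rate sampling frequency, Hz
t = np.arange(0, 10, 1 / fs)
# Synthetic LOR count trace: baseline + ~5 Hz cardiac modulation + noise.
counts = 1000 + 50 * np.sin(2 * np.pi * 5.0 * t) + 20 * np.random.randn(t.size)

b, a = butter(4, [2.0, 8.0], btype="bandpass", fs=fs)   # the 2-8 Hz band
cardiac = filtfilt(b, a, counts)              # isolates the cardiac component
gates, _ = find_peaks(cardiac, distance=int(fs / 8))    # one trigger per cycle
print(f"estimated heart rate: {gates.size / t[-1] * 60:.0f} bpm")
```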

  1. Deep Learning-Based Data Forgery Detection in Automatic Generation Control

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Fengli [Univ. of Arkansas, Fayetteville, AR (United States); Li, Qinghua [Univ. of Arkansas, Fayetteville, AR (United States)

    2017-10-09

    Automatic Generation Control (AGC) is a key control system in the power grid. It is used to calculate the Area Control Error (ACE) based on frequency and tie-line power flow between balancing areas, and then to adjust power generation to maintain the power system frequency within an acceptable range. However, attackers might inject malicious frequency or tie-line power flow measurements to mislead AGC into performing false generation corrections, which would harm power grid operation. Such attacks are hard to detect since they do not violate physical power system models. In this work, we propose algorithms based on neural networks and the Fourier transform to detect data forgery attacks on AGC. Different from the few previous works that rely on accurate load prediction to detect data forgery, our solution uses only the ACE data already available in existing AGC systems. In particular, our solution learns the normal patterns of the ACE time series and detects abnormal patterns caused by artificial attacks. Evaluations on a real ACE dataset show that our methods have high detection accuracy.
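
    An illustrative sketch of the Fourier-based idea (not the authors' exact model): learn a normal spectral template from historical ACE windows and flag windows whose spectrum deviates from it. All signals and thresholds here are synthetic assumptions.

```python
import numpy as np

def spectrum(window):
    """Magnitude spectrum of a zero-mean window."""
    return np.abs(np.fft.rfft(window - window.mean()))

rng = np.random.default_rng(0)
normal = [rng.normal(0, 1, 256) for _ in range(200)]       # historical ACE windows
template = np.mean([spectrum(w) for w in normal], axis=0)  # normal pattern
threshold = 3 * np.std([np.linalg.norm(spectrum(w) - template) for w in normal])

# A forged window with an injected periodic component.
attack = rng.normal(0, 1, 256) + 2 * np.sin(np.linspace(0, 40 * np.pi, 256))
score = np.linalg.norm(spectrum(attack) - template)
print("forgery suspected" if score > threshold else "normal")
```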

  2. An Overview of Automaticity and Implications For Training the Thinking Process

    National Research Council Canada - National Science Library

    Holt, Brian

    2002-01-01

    ...., visual search to battlefield thinking). The results of this examination suggest that automaticity can be developed using consistent rules and extensive practice that vary depending on the type of task...

  3. SU-D-BRD-07: Automatic Patient Data Audit and Plan Quality Check to Support ARIA and Eclipse

    Energy Technology Data Exchange (ETDEWEB)

    Li, X; Li, H; Wu, Y; Mutic, S; Yang, D [Washington University School of Medicine, St. Louis, MO (United States)

    2014-06-01

    Purpose: To ensure patient safety and treatment quality in RT departments that use Varian ARIA and Eclipse, we developed a computer software system and interface functions that allow previously developed electronic chart checking (EcCk) methodologies to support these Varian systems. Methods: ARIA and Eclipse store most patient information in their MSSQL database. We studied the contents of the hundreds of database tables and identified the data elements used for patient treatment management and treatment planning. Interface functions were developed in both C# and MATLAB to support data access from the ARIA and Eclipse servers using SQL queries. These functions and additional data processing functions allow the existing rules and logic from EcCk to support ARIA and Eclipse. Dose and structure information are important for plan quality checks; however, they are stored not in the MSSQL database but as files in Varian's proprietary formats, and so cannot be processed by external programs. We therefore implemented a service program which uses the DB Daemon and File Daemon services on the ARIA server to automatically and seamlessly retrieve dose and structure data as DICOM files. This service was designed to 1) continuously monitor the data access requests from EcCk programs, 2) translate the requests for ARIA daemon services to obtain dose and structure DICOM files, and 3) monitor the process and return the obtained DICOM files back to EcCk programs for plan quality check purposes. Results: EcCk, which was previously designed to support only MOSAIQ TMS and Pinnacle TPS, can now support Varian ARIA and Eclipse. The new EcCk software has been tested and worked well in physics new-start plan checks, IMRT plan integrity checks and plan quality checks. Conclusion: Methods and computer programs have been implemented to allow EcCk to support Varian ARIA and Eclipse systems. This project was supported by a research grant from Varian Medical Systems.

  4. SU-D-BRD-07: Automatic Patient Data Audit and Plan Quality Check to Support ARIA and Eclipse

    International Nuclear Information System (INIS)

    Li, X; Li, H; Wu, Y; Mutic, S; Yang, D

    2014-01-01

    Purpose: To ensure patient safety and treatment quality in RT departments that use Varian ARIA and Eclipse, we developed a computer software system and interface functions that allow previously developed electronic chart checking (EcCk) methodologies to support these Varian systems. Methods: ARIA and Eclipse store most patient information in their MSSQL database. We studied the contents of the hundreds of database tables and identified the data elements used for patient treatment management and treatment planning. Interface functions were developed in both C# and MATLAB to support data access from the ARIA and Eclipse servers using SQL queries. These functions and additional data processing functions allow the existing rules and logic from EcCk to support ARIA and Eclipse. Dose and structure information are important for plan quality checks; however, they are stored not in the MSSQL database but as files in Varian's proprietary formats, and so cannot be processed by external programs. We therefore implemented a service program which uses the DB Daemon and File Daemon services on the ARIA server to automatically and seamlessly retrieve dose and structure data as DICOM files. This service was designed to 1) continuously monitor the data access requests from EcCk programs, 2) translate the requests for ARIA daemon services to obtain dose and structure DICOM files, and 3) monitor the process and return the obtained DICOM files back to EcCk programs for plan quality check purposes. Results: EcCk, which was previously designed to support only MOSAIQ TMS and Pinnacle TPS, can now support Varian ARIA and Eclipse. The new EcCk software has been tested and worked well in physics new-start plan checks, IMRT plan integrity checks and plan quality checks. Conclusion: Methods and computer programs have been implemented to allow EcCk to support Varian ARIA and Eclipse systems. This project was supported by a research grant from Varian Medical Systems.
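
    The SQL-based data-access pattern might look like the sketch below; the connection string, table name, and column names are placeholders invented for illustration, not Varian's actual schema.

```python
import pyodbc  # assumes an ODBC driver for SQL Server is installed

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=aria-db-host;DATABASE=VARIAN;UID=reader;PWD=secret"  # placeholders
)
cursor = conn.cursor()
# Placeholder query: pull identifiers and plan names for one patient.
cursor.execute(
    "SELECT PatientId, PlanName FROM PlanTable WHERE PatientId = ?", "PAT001"
)
for row in cursor.fetchall():
    print(row.PatientId, row.PlanName)
conn.close()
```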

  5. Process mining online assessment data

    NARCIS (Netherlands)

    Pechenizkiy, M.; Trcka, N.; Vasilyeva, E.; Aalst, van der W.M.P.; De Bra, P.M.E.; Barnes, T.; Desmarais, M.; Romero, C.; Ventura, S.

    2009-01-01

    Traditional data mining techniques have been extensively applied to find interesting patterns, build descriptive and predictive models from large volumes of data accumulated through the use of different information systems. The results of data mining can be used for getting a better understanding of

  6. Composable Data Processing in Environmental Science - A Process View

    NARCIS (Netherlands)

    Wombacher, Andreas

    Data processing in environmental science is essential for doing science. The heterogeneity of data sources, data processing operations and infrastructures results in a lot of manual data and process integration work done by each scientist individually. This is very inefficient and time consuming.

  7. Utilization of ARR (Automatic Rainfall Recorder) Data to Improve the Effectiveness of Satellite Rainfall Models (Case Study: Indragiri Watershed)

    OpenAIRE

    Hendra, Yuli; Fauzi, Manyuk; Sutikno, Sigit

    2015-01-01

    The availability of data is a perennial problem in hydrological modeling, given the incompleteness and imprecision of the data. With the development of technology, many hydrological models using data acquired from satellites have emerged. Adequate accuracy and model correlation were still not achieved in previous research using satellite data. This problem was caused by unstable weather conditions, which make the process of recording and downloading the satellite data less opt...

  8. Automatic Assessment of Acquisition and Transmission Losses in Indian Remote Sensing Satellite Data

    Science.gov (United States)

    Roy, D.; Purna Kumari, B.; Manju Sarma, M.; Aparna, N.; Gopal Krishna, B.

    2016-06-01

    The quality of Remote Sensing data is an important parameter that defines the extent of its usability in various applications. The data from Remote Sensing satellites is received as raw data frames at the ground station. This data may be corrupted with data losses due to interferences during data transmission, data acquisition and sensor anomalies. Thus it is important to assess the quality of the raw data before product generation for early anomaly detection, faster corrective actions and product rejection minimization. Manual screening of raw images is a time consuming process and not very accurate. In this paper, an automated process for identification and quantification of losses in raw data like pixel drop out, line loss and data loss due to sensor anomalies is discussed. Quality assessment of raw scenes based on these losses is also explained. This process is introduced in the data pre-processing stage and gives crucial data quality information to users at the time of browsing data for product ordering. It has also improved the product generation workflow by enabling faster and more accurate quality estimation.

  9. Processing multidimensional nuclear physics data

    Energy Technology Data Exchange (ETDEWEB)

    Becker, J. [Lawrence Livermore National Lab., CA (United States)]

    1994-11-15

    Modern Ge detector arrays for gamma-ray spectroscopy are producing data sets unprecedented in size and event multiplicity. Gammasphere, the DOE-sponsored array, has the following characteristics: (1) high granularity (110 detectors); (2) high efficiency (10%); and (3) precision energy measurements (ΔE/E = 0.2%). Characteristics of the detector line shape, the data set, and the standard practice in the nuclear physics community for extracting nuclear gamma-ray cascades from the 4096 × 4096 × 4096 data cube will be discussed.

  10. Simple Approaches to Improve the Automatic Inventory of ZEBRA Crossing from Mls Data

    Science.gov (United States)

    Arias, P.; Riveiro, B.; Soilán, M.; Díaz-Vilariño, L.; Martínez-Sánchez, J.

    2015-08-01

    City management is increasingly supported by information technologies, leading to paradigms such as smart cities, where decision-makers, companies and citizens are continuously interconnected. 3D modelling becomes highly relevant when the city has to be managed using geospatial databases or Geographic Information Systems. On the other hand, laser scanning technology has experienced significant growth in recent years; in particular, terrestrial mobile laser scanning platforms are being used more and more for inventory purposes in both cities and road environments. Consequently, large datasets are available to produce the geometric basis for the city model; however, these data are not directly exploitable by management systems, constraining the implementation of the technology for such applications. This paper presents a new algorithm for the automatic detection of zebra crossings. The algorithm is divided into three main steps: road segmentation (based on a PCA analysis of the points contained in each cycle collected by the mobile laser system), rasterization (conversion of the point cloud to a raster image coloured as a function of intensity data), and zebra crossing detection (using the Hough Transform and logical constraints for line classification). After evaluating different datasets collected in three cities located in northwest Spain (comprising 25 strips with 30 visible zebra crossings), a completeness of 83% was achieved.
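
    The detection stage can be sketched with OpenCV on a synthetic stripe raster; the Hough thresholds and the parallel-line constraint below are illustrative assumptions, not the authors' parameters.

```python
import numpy as np
import cv2

# Synthetic intensity raster with zebra-like parallel stripes.
raster = np.zeros((200, 200), dtype=np.uint8)
for y in range(40, 160, 20):
    cv2.rectangle(raster, (30, y), (170, y + 8), 255, -1)

edges = cv2.Canny(raster, 50, 150)
lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=120)

# Zebra heuristic: several detected lines sharing nearly the same angle.
angles = [line[0][1] for line in lines] if lines is not None else []
is_zebra = len(angles) >= 4 and np.ptp(angles) < np.deg2rad(5)
print(f"{len(angles)} candidate lines, zebra-like: {is_zebra}")
```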

  11. Automatic extraction of property norm-like data from large text corpora.

    Science.gov (United States)

    Kelly, Colin; Devereux, Barry; Korhonen, Anna

    2014-01-01

    Traditional methods for deriving property-based representations of concepts from text have focused on either extracting only a subset of possible relation types, such as hyponymy/hypernymy (e.g., car is-a vehicle) or meronymy/metonymy (e.g., car has wheels), or unspecified relations (e.g., car--petrol). We propose a system for the challenging task of automatic, large-scale acquisition of unconstrained, human-like property norms from large text corpora, and discuss the theoretical implications of such a system. We employ syntactic, semantic, and encyclopedic information to guide our extraction, yielding concept-relation-feature triples (e.g., car be fast, car require petrol, car cause pollution), which approximate property-based conceptual representations. Our novel method extracts candidate triples from parsed corpora (Wikipedia and the British National Corpus) using syntactically and grammatically motivated rules, then reweights triples with a linear combination of their frequency and four statistical metrics. We assess our system output in three ways: lexical comparison with norms derived from human-generated property norm data, direct evaluation by four human judges, and a semantic distance comparison with both WordNet similarity data and human-judged concept similarity ratings. Our system offers a viable and performant method of plausible triple extraction: Our lexical comparison shows comparable performance to the current state-of-the-art, while subsequent evaluations exhibit the human-like character of our generated properties.

  12. 13C-detected NMR experiments for automatic resonance assignment of IDPs and multiple-fixing SMFT processing

    International Nuclear Information System (INIS)

    Dziekański, Paweł; Grudziąż, Katarzyna; Jarvoll, Patrik; Koźmiński, Wiktor; Zawadzka-Kazimierczuk, Anna

    2015-01-01

    Intrinsically disordered proteins (IDPs) have recently attracted much interest due to their role in many biological processes, including signaling and regulation mechanisms. High-dimensional 13C direct-detected NMR experiments have proven exceptionally useful in the case of IDPs, providing spectra with superior peak dispersion. Here, two such novel experiments recorded with non-uniform sampling are introduced: 5D HabCabCO(CA)NCO and 5D HNCO(CA)NCO. Together with the 4D (HACA)CON(CA)NCO, an extension of the previously published 3D experiments (Pantoja-Uceda and Santoro in J Biomol NMR 59:43-50, 2014, doi: 10.1007/s10858-014-9827-1), they form a set allowing for complete and reliable resonance assignment of difficult IDPs. The processing is performed with the sparse multidimensional Fourier transform, based on the concept of restricting (fixing) some of the spectral dimensions to a priori known resonance frequencies. In our study, a multiple-fixing method was developed that allows easy access to the spectral data. The experiments were tested on a resolution-demanding alpha-synuclein sample. Due to the superior peak dispersion in the high-dimensional spectra and the availability of sequential connectivities between four consecutive residues, the overwhelming majority of resonances could be assigned automatically using the TSAR program.

  13. Automatic processing of semantic relations in fMRI: neural activation during semantic priming of taxonomic and thematic categories.

    Science.gov (United States)

    Sachs, Olga; Weis, Susanne; Zellagui, Nadia; Huber, Walter; Zvyagintsev, Mikhail; Mathiak, Klaus; Kircher, Tilo

    2008-07-07

    Most current models of knowledge organization are based on hierarchical or taxonomic categories (animals, tools). Another important organizational pattern is thematic categorization, i.e. categories held together by external relations, a unifying scene or event (car and garage). The goal of this study was to compare the neural correlates of these categories under automatic processing conditions that minimize strategic influences. We used fMRI to examine neural correlates of semantic priming for category members with a short stimulus onset asynchrony (SOA) of 200 ms as subjects performed a lexical decision task. Four experimental conditions were compared: thematically related words (car-garage); taxonomically related (car-bus); unrelated (car-spoon); non-word trials (car-derf). We found faster reaction times for related than for unrelated prime-target pairs for both thematic and taxonomic categories. However, the size of the thematic priming effect was greater than that of the taxonomic. The imaging data showed signal changes for the taxonomic priming effects in the right precuneus, postcentral gyrus, middle frontal and superior frontal gyri and thematic priming effects in the right middle frontal gyrus and anterior cingulate. The contrast of neural priming effects showed larger signal changes in the right precuneus associated with the taxonomic but not with thematic priming response. We suggest that the greater involvement of precuneus in the processing of taxonomic relations indicates their reduced salience in the knowledge structure compared to more prominent thematic relations.

  14. Fast processing the film data file

    International Nuclear Information System (INIS)

    Abramov, B.M.; Avdeev, N.F.; Artemov, A.V.

    1978-01-01

    The problems of processing images obtained from the three-meter magnetic spectrometer on a new PSP-2 automatic device are considered. A detailed description is given of the filtration program, which checks the correct operation of the connection line as well as the scanning parameters and the technical quality of the information. The filtration process can be subdivided into the following main stages: search for fiducial marks; binding of tracks to fiducial marks; reconstruction of track fragments in the chambers from sparks. The BESM-6 computer was chosen for the filtration. The complex of filtration programs is shaped as a RAM file, and the required version of the program is assembled by the PATCHY program. The subprograms performing the greater part of the calculations are written in the MADLEN autocode; the remaining subprograms are written in FORTRAN and ALGOL. The filtration time is 1.2-2 s of computation per image. The BESM-6 computer processes up to 12 thousand images a day

  15. Digital curation: a proposal of a semi-automatic digital object selection-based model for digital curation in Big Data environments

    Directory of Open Access Journals (Sweden)

    Moisés Lima Dutra

    2016-08-01

    Introduction: This work presents a new approach to digital curation from a Big Data perspective. Objective: The objective is to propose techniques for digital curation for selecting and evaluating digital objects that take into account the volume, velocity, variety, veracity, and value of the data collected from multiple knowledge domains. Methodology: This is exploratory research of an applied nature, which addresses the research problem in a qualitative way. Heuristics allow this semi-automatic process to be carried out either by human curators or by software agents. Results: As a result, a model was proposed for searching, processing, evaluating and selecting digital objects to be handled by digital curation. Conclusions: It is possible to use Big Data environments as a source of information resources for digital curation; moreover, Big Data techniques and tools can support the search for and selection of information resources by digital curators.

  16. An Extensible Processing Framework for Eddy-covariance Data

    Science.gov (United States)

    Durden, D.; Fox, A. M.; Metzger, S.; Sturtevant, C.; Durden, N. P.; Luo, H.

    2016-12-01

    The evolution of large data collecting networks has led not only to an increase in available information, but also to greater complexity in analyzing the observations. Timely dissemination of readily usable data products necessitates a streaming processing framework that is both automatable and flexible. Tower networks, such as ICOS, Ameriflux, and NEON, exemplify this issue by requiring large amounts of data to be processed from dispersed measurement sites. Eddy-covariance data from across the NEON network are expected to amount to 100 Gigabytes per day. The complexity of the algorithmic processing necessary to produce high-quality data products, together with the continued development of new analysis techniques, led to the development of a modular R package, eddy4R. This allows algorithms provided by NEON and the larger community to be deployed in streaming processing and to be used by community members alike. In order to control the processing environment, provide a proficient parallel processing structure, and ensure that dependencies are available during processing, we chose Docker as our "Development and Operations" (DevOps) platform. The Docker framework allows our processing algorithms to be developed, maintained and deployed at scale. Additionally, the eddy4R-Docker framework fosters community use and extensibility via pre-built Docker images and the GitHub distributed version control system. The capability to process large data sets relies upon efficient input and output of data, data compressibility to reduce compute resource loads, and the ability to easily package metadata. The Hierarchical Data Format (HDF5) is a file format that can meet these needs. A NEON-standard HDF5 file structure and metadata attributes allow users to explore larger data sets in an intuitive "directory-like" structure adopting the NEON data product naming conventions.
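
    As a rough illustration of the "directory-like" HDF5 layout the abstract describes, the snippet below walks the group hierarchy of an HDF5 file with h5py; the file name is a placeholder, and the layout shown is generic rather than the actual NEON convention.

    ```python
    import h5py

    def walk(name, obj):
        # Print each group/dataset path, mimicking a directory listing.
        kind = "group" if isinstance(obj, h5py.Group) else "dataset"
        print(f"{kind:7s} /{name}")

    # 'tower_site.h5' is a placeholder file name.
    with h5py.File("tower_site.h5", "r") as f:
        f.visititems(walk)          # recurse through the whole hierarchy
        # Metadata travel with the data as HDF5 attributes:
        for key, value in f.attrs.items():
            print("attr", key, value)
    ```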

  17. Bus Travel Time Deviation Analysis Using Automatic Vehicle Location Data and Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Xiaolin Gong

    2015-01-01

    To investigate the influence of causes of unreliability and of the bus schedule recovery phenomenon on microscopic segment-level travel time variance, this study adopts Structural Equation Modeling (SEM) to specify, estimate, and measure the theoretically proposed models. The SEM model establishes and verifies hypotheses about the interrelationships among travel time deviations, departure delays, segment lengths, dwell times, and the number of traffic signals and access connections. The finally accepted model demonstrates excellent fit. Most of the hypotheses are supported by the sample dataset from a bus Automatic Vehicle Location system. The SEM model confirms the bus schedule recovery phenomenon. The departure delays at bus terminals and upstream travel time deviations indeed have negative impacts on the travel time fluctuation of buses en route. Meanwhile, segment length directly and negatively impacts travel time variability and, conversely, contributes positively to the schedule recovery process; this exogenous variable also indirectly and positively influences travel times through the presence of signalized intersections and access connections. This study offers a rational approach to analyzing travel time deviation features. The SEM model structure and estimation results facilitate the understanding of bus service performance characteristics and provide several implications for bus service planning, management, and operation.
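
    For readers unfamiliar with SEM specification, a minimal sketch of how such interrelationships could be encoded is shown below, using the third-party semopy package; the variable names, the dataset file and the equation structure are illustrative guesses at the kind of hypotheses described, not the paper's actual model.

    ```python
    import pandas as pd
    import semopy

    # Illustrative structural equations: travel time deviation explained by
    # departure delay, upstream deviation, segment length, dwell time and
    # counts of signals/access connections (all variable names are assumptions).
    desc = """
    travel_time_dev ~ departure_delay + upstream_dev + segment_length + dwell_time + n_signals + n_access
    n_signals ~ segment_length
    n_access ~ segment_length
    """

    data = pd.read_csv("avl_segments.csv")  # placeholder AVL-derived dataset
    model = semopy.Model(desc)
    model.fit(data)
    print(model.inspect())  # path coefficients, standard errors, p-values
    ```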

  18. Processing Terrain Point Cloud Data

    KAUST Repository

    DeVore, Ronald; Petrova, Guergana; Hielsberg, Matthew; Owens, Luke; Clack, Billy; Sood, Alok

    2013-01-01

    Terrain point cloud data are typically acquired through some form of Light Detection And Ranging sensing. They form a rich resource that is important in a variety of applications including navigation, line of sight, and terrain visualization.

  19. Parallel processing of genomics data

    Science.gov (United States)

    Agapito, Giuseppe; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-10-01

    The availability of high-throughput experimental platforms for the analysis of biological samples, such as mass spectrometry, microarrays and Next Generation Sequencing, has made it possible to analyze a whole genome in a single experiment. Such platforms produce an enormous volume of data per experiment, so the analysis of this enormous flow of data poses several challenges in terms of data storage, preprocessing, and analysis. To face those issues, efficient, possibly parallel, bioinformatics software needs to be used to preprocess and analyze the data, for instance to highlight genetic variation associated with complex diseases. In this paper we present a parallel algorithm for the preprocessing and statistical analysis of genomics data that copes with high-dimensional data and achieves good response times. The proposed system is able to find statistically significant biological markers that discriminate between classes of patients who respond to drugs in different ways. Experiments performed on real and synthetic genomic datasets show good speed-up and scalability.
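
    A common pattern for this kind of embarrassingly parallel preprocessing is to fan marker-level work out over a process pool, as in the hedged sketch below; the per-marker statistic (a chi-squared association test) is a generic stand-in, not the paper's algorithm, and the data are synthetic.

    ```python
    import numpy as np
    from multiprocessing import Pool
    from scipy.stats import chi2_contingency

    def test_marker(args):
        """Chi-squared test of one genotype column against case/control labels."""
        genotypes, labels = args
        table = np.zeros((3, 2))  # genotypes 0/1/2 x two classes
        for g, y in zip(genotypes, labels):
            table[g, y] += 1
        _, p, _, _ = chi2_contingency(table + 1)  # +1 avoids empty cells
        return p

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        genotypes = rng.integers(0, 3, size=(500, 2000))  # samples x markers
        labels = rng.integers(0, 2, size=500)
        with Pool() as pool:  # one worker per core by default
            pvalues = pool.map(test_marker, ((genotypes[:, j], labels)
                                             for j in range(genotypes.shape[1])))
        print("significant markers:", sum(p < 1e-5 for p in pvalues))
    ```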

  20. Towards a Modernization Process for Secure Data Warehouses

    Science.gov (United States)

    Blanco, Carlos; Pérez-Castillo, Ricardo; Hernández, Arnulfo; Fernández-Medina, Eduardo; Trujillo, Juan

    Data Warehouses (DWs) manage crucial enterprise information used for the decision-making process, which has to be protected from unauthorized access. However, security constraints are not properly integrated into the complete DW development process, being traditionally considered only in the final stages. Furthermore, legacy systems need a reverse engineering process in order to accomplish re-documentation for detecting new security requirements, as well as recovery of the system's design to enable migration and reuse. Thus, we have proposed a model-driven architecture (MDA) for secure DWs which takes security issues into account from the early stages of development and provides automatic transformations between models. This paper completes this architecture by providing an architecture-driven modernization (ADM) process focused on obtaining conceptual security models from legacy OLAP systems.

  1. Implementation of an Automatic Packet Reporting System (APRS) for Monitoring and Measurement Data Packets [Implementasi Automatic Packet Reporting System (APRS) untuk Paket Data Pemantauan dan Pengukuran]

    Directory of Open Access Journals (Sweden)

    Arief Goeritno

    2016-03-01

    An Automatic Packet Reporting System (APRS) for monitoring and measurement data packets has been implemented, with two research objectives: (a) configuring the application programs on the APRS network and (b) measuring data reception based on the performance of the sensors. Configuring the APRS application programs means setting up the software used at the APRS data-receiving station, using the commonly used applications hyperterminal and UI-View 32. Data-reception measurements based on sensor performance were carried out through a recording process at the APRS data-receiving station. The performance of the sensors is observed at the transmitting station, and the observed data can be received at the receiving station in real time. The hyperterminal- and UI-View 32-based applications successfully carried out the handshaking process between the Terminal Node Controller (TNC) and the computer, so that telemetry data from the packet-transmitting station could be received at the receiving station. The telemetry data can be observed at the receiving station and obtained in real time in the format: YB0LRB-11>BEACON,WIDE2-1 [05/18/2014 04:03:49]: : T#010,008,093,004,122,075. The call sign YB0LRB-11 identifies the telemetry packet-transmitting station; the data are then received at receiving station YD1PRY in the format: YD1PRY-2>APLPN,ARISS [05/18/2014 04:03:07]: : !06.30.37S/106.48.26E#. This notification is the position report transmitted by station YD1PRY for initialization on the APRS network. Station YB0LRB-11 transmits telemetry data packets in the format: YB0LRB-11>BEACON,WIDE2-1 [05/18/2014 04:00:49]: : T#001,004,035,005,122,075. Packets from station YB0LRB repeated by the digipeater have the format: YB0LRB-11>BEACON,YD1PRY-2,WIDE2* [05/18/2014 04:00:50]: : T#001,004,035,005,122,075. The measurement sensors performed relatively stably, although there was a measurement deviation of 1 cm or
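
    The telemetry strings quoted above follow the standard APRS layout (source call sign, path, then a T#sequence payload with analog channel values). As an illustration only, and not part of the cited work, a small parser for that payload might look like this:

    ```python
    import re

    # Matches APRS telemetry payloads such as "T#010,008,093,004,122,075".
    TELEMETRY = re.compile(r"T#(\d{1,3}),(\d{1,3}),(\d{1,3}),(\d{1,3}),(\d{1,3}),(\d{1,3})")

    def parse_telemetry(frame: str):
        """Extract the sequence number and five analog channel values from a frame."""
        m = TELEMETRY.search(frame)
        if m is None:
            return None
        seq, *channels = (int(g) for g in m.groups())
        return {"sequence": seq, "analog": channels}

    frame = "YB0LRB-11>BEACON,WIDE2-1 [05/18/2014 04:03:49]: : T#010,008,093,004,122,075"
    print(parse_telemetry(frame))
    # {'sequence': 10, 'analog': [8, 93, 4, 122, 75]}
    ```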

  2. Using sensor data patterns from an automatic milking system to develop predictive variables for classifying clinical mastitis and abnormal milk

    NARCIS (Netherlands)

    Kamphuis, A.; Pietersma, D.; Tol, van der R.; Wiedermann, M.; Hogeveen, H.

    2008-01-01

    Dairy farmers using automatic milking are able to manage mastitis successfully with the help of mastitis attention lists. These attention lists are generated with mastitis detection models that make use of sensor data obtained throughout each quarter milking. The models tend to be limited to using

  3. Process and device for automatic control of air ratio in combustion

    Energy Technology Data Exchange (ETDEWEB)

    Rohr, F J; Holick, H

    1976-06-24

    The invention concerns a process for the automatic control of the air ratio in combustion, by setting the fuel-air mixture depending on the air number lambda. The air ratio of combustion engines is controlled using a zirconium dioxide measuring probe situated in the exhaust gas; its disadvantage is that it is only sensitive at an air number of lambda = 1. In order to control the air ratio at air numbers greater or smaller than 1, according to the invention an auxiliary gas is mixed with the hot exhaust gas, or a component of the gas is withdrawn, so that a corrected exhaust gas flow is produced whose air number is detected by the measuring sensor and controlled to a value of about 1. The auxiliary gas flow is chosen so that an air ratio differing from lambda = 1 is established when the air number of the corrected exhaust gas flow is regulated to approximately lambda = 1. In order to keep the demand for auxiliary gas low, only part of the exhaust gas flow is used for the measurement. The exhaust gas part flow is kept constant while the auxiliary gas flow or the withdrawn gas component flow is altered. Hydrogen or oxygen is used as the auxiliary gas, depending on whether excess or reduced air is required. Instead of hydrogen, fuel or its combustion products can be used. According to the invention, the hydrogen or oxygen can be produced electrolytically, with dosing controlled by the current used for electrolysis.

  4. Automated data processing and radioassays.

    Science.gov (United States)

    Samols, E; Barrows, G H

    1978-04-01

    Radioassays include (1) radioimmunoassays, (2) competitive protein-binding assays based on competition for limited antibody or specific binding protein, (3) immunoradiometric assays, based on competition for excess labeled antibody, and (4) radioreceptor assays. Most mathematical models describing the relationship between labeled ligand binding and unlabeled ligand concentration have been based on the law of mass action or the isotope dilution principle. These models provide useful data reduction programs, but are theoretically unsatisfactory because competitive radioassay usually is not based on classical dilution principles, labeled and unlabeled ligand do not have to be identical, antibodies (or receptors) are frequently heterogeneous, equilibrium usually is not reached, and there are probably steric and cooperative influences on binding. An alternative, more flexible mathematical model, based on the probability of binding collisions being restricted by the surface area of reactive divalent sites on antibody and on univalent antigen, has been derived. Application of these models to automated data reduction allows standard curves to be fitted by a mathematical expression, and unknown values are calculated from binding data. The virtues and pitfalls of point-to-point data reduction, linear transformations, and curvilinear fitting approaches are presented. A third-order polynomial in the square root of concentration closely approximates the mathematical model based on probability, and in our experience this method provides the most acceptable results with all varieties of radioassays. With this curvilinear system, linear point connection should be used between the zero standard and the beginning of the significant dose response, and also towards saturation. The importance is stressed of limiting the range of reported automated assay results to the portion of the standard curve that delivers optimal sensitivity. Published methods for automated data reduction of Scatchard plots
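
    The third-order polynomial in the square root of concentration mentioned above is easy to reproduce; the sketch below fits such a standard curve with NumPy and inverts it numerically for an unknown, using made-up calibration numbers purely for illustration.

    ```python
    import numpy as np

    # Hypothetical standard curve: concentration (e.g. ng/mL) vs. bound counts.
    conc  = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
    bound = np.array([9500, 8100, 7200, 6000, 4100, 2800, 1700])

    # Fit bound counts as a cubic polynomial in sqrt(concentration).
    coeffs = np.polyfit(np.sqrt(conc), bound, deg=3)

    def estimate_concentration(counts, lo=0.0, hi=20.0):
        """Invert the fitted curve numerically on a dense grid."""
        grid = np.linspace(lo, hi, 20001)
        curve = np.polyval(coeffs, np.sqrt(grid))
        return grid[np.argmin(np.abs(curve - counts))]

    print(estimate_concentration(5000))  # unknown sample with 5000 counts
    ```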

  5. Automatic Censoring CFAR Detector Based on Ordered Data Difference for Low-Flying Helicopter Safety

    Directory of Open Access Journals (Sweden)

    Wen Jiang

    2016-07-01

    Being equipped with a millimeter-wave radar allows a low-flying helicopter to sense its surroundings in real time, which significantly increases its safety. However, nonhomogeneous clutter environments, such as multiple-target situations and clutter edge environments, can dramatically affect radar signal detection performance. In order to improve detection performance in nonhomogeneous clutter environments, this paper proposes a new automatic censored cell-averaging CFAR detector. The proposed CFAR detector does not require any prior information about the background environment; it uses a hypothesis test on the first-order difference (FOD) of the ordered data to reject unwanted samples in the reference window. After censoring the unwanted ranked cells, the remaining samples are combined to form an estimate of the background power level, thus yielding better radar signal detection performance. The simulation results show that the FOD-CFAR detector provides low-loss CFAR performance in a homogeneous environment and also performs robustly in nonhomogeneous environments. Furthermore, the measured results of a low-flying helicopter validate the basic performance of the proposed method.

  6. Automatic Censoring CFAR Detector Based on Ordered Data Difference for Low-Flying Helicopter Safety

    Science.gov (United States)

    Jiang, Wen; Huang, Yulin; Yang, Jianyu

    2016-01-01

    Being equipped with a millimeter-wave radar allows a low-flying helicopter to sense the surroundings in real time, which significantly increases its safety. However, nonhomogeneous clutter environments, such as a multiple target situation and a clutter edge environment, can dramatically affect the radar signal detection performance. In order to improve the radar signal detection performance in nonhomogeneous clutter environments, this paper proposes a new automatic censored cell averaging CFAR detector. The proposed CFAR detector does not require any prior information about the background environment and uses the hypothesis test of the first-order difference (FOD) result of ordered data to reject the unwanted samples in the reference window. After censoring the unwanted ranked cells, the remaining samples are combined to form an estimate of the background power level, thus getting better radar signal detection performance. The simulation results show that the FOD-CFAR detector provides low loss CFAR performance in a homogeneous environment and also performs robustly in nonhomogeneous environments. Furthermore, the measured results of a low-flying helicopter validate the basic performance of the proposed method. PMID:27399714
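
    To make the censoring idea concrete, here is a rough NumPy sketch of an ordered-data-difference censoring step followed by cell averaging; the threshold constants are arbitrary placeholders and the statistic is a simplification of the paper's FOD hypothesis test.

    ```python
    import numpy as np

    def fod_cfar(reference_cells, cut, fod_thresh=3.0, scale=4.0):
        """Simplified automatic-censoring CFAR (illustrative, not the paper's exact test).

        reference_cells : power samples from the sliding reference window
        cut             : power of the cell under test
        fod_thresh      : censoring threshold on normalized first-order differences
        scale           : threshold multiplier applied to the background estimate
        """
        ordered = np.sort(reference_cells)
        fod = np.diff(ordered)                      # first-order differences
        typical = np.median(fod) + 1e-12
        # Censor from the first abnormally large jump upward: cells above the
        # jump are treated as interfering targets or clutter-edge samples.
        jumps = np.nonzero(fod > fod_thresh * typical)[0]
        keep = ordered[: jumps[0] + 1] if jumps.size else ordered
        background = keep.mean()                    # cell averaging on survivors
        return cut > scale * background             # detection decision

    rng = np.random.default_rng(1)
    ref = rng.exponential(1.0, 32)
    ref[[3, 17]] = 40.0                             # two interfering targets
    print(fod_cfar(ref, cut=25.0))                  # True -> target declared
    ```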

  7. Automatic registration of panoramic image sequence and mobile laser scanning data using semantic features

    Science.gov (United States)

    Li, Jianping; Yang, Bisheng; Chen, Chi; Huang, Ronggang; Dong, Zhen; Xiao, Wen

    2018-02-01

    Inaccurate exterior orientation parameters (EoPs) between sensors, obtained by pre-calibration, lead to failure of registration between a panoramic image sequence and mobile laser scanning data. To address this challenge, this paper proposes an automatic registration method based on semantic features extracted from panoramic images and point clouds. Firstly, accurate rotation parameters between the panoramic camera and the laser scanner are estimated using GPS- and IMU-aided structure from motion (SfM); the initial EoPs of the panoramic images are obtained at the same time. Secondly, vehicles in the panoramic images are extracted by Faster R-CNN as candidate primitives to be matched with potential corresponding primitives in the point clouds according to the initial EoPs. Finally, the translation between the panoramic camera and the laser scanner is refined by maximizing the overlapping area of corresponding primitive pairs using Particle Swarm Optimization (PSO), resulting in a finer registration between the panoramic image sequence and the point clouds. Two challenging urban scenes were used to assess the proposed method; the final registration errors in both scenes were less than three pixels, which demonstrates a high level of automation, robustness and accuracy.

  8. Automatic generation of co-embeddings from relational data with adaptive shaping.

    Science.gov (United States)

    Mu, Tingting; Goulermas, John Yannis

    2013-10-01

    In this paper, we study the co-embedding problem of how to map different types of patterns into one common low-dimensional space, given only the associations (relation values) between samples. We conduct a generic analysis to discover the commonalities between existing co-embedding algorithms and indirectly related approaches and investigate possible factors controlling the shapes and distributions of the co-embeddings. The primary contribution of this work is a novel method for computing co-embeddings, termed the automatic co-embedding with adaptive shaping (ACAS) algorithm, based on an efficient transformation of the co-embedding problem. Its advantages include flexible model adaptation to the given data, an economical set of model variables leading to a parametric co-embedding formulation, and a robust model fitting criterion for model optimization based on a quantization procedure. The secondary contribution of this work is the introduction of a set of generic schemes for the qualitative analysis and quantitative assessment of the output of co-embedding algorithms, using existing labeled benchmark datasets. Experiments with synthetic and real-world datasets show that the proposed algorithm is very competitive compared to existing ones.

  9. The effect of a low-speed automatic brake system estimated from real life data.

    Science.gov (United States)

    Isaksson-Hellman, Irene; Lindman, Magdalena

    2012-01-01

    A substantial proportion of all traffic accidents involving passenger cars are rear-end collisions, and most of them occur at low speed. Auto Brake is a feature that has been launched in several passenger car models during the last few years. City Safety is a technology designed to help the driver mitigate, and in certain situations avoid, rear-end collisions at low speed by automatically braking the vehicle. Studies have been presented that predict promising benefits from these kinds of systems, but few attempts have been made to show the actual effect of Auto Brake. In this study, the effect of City Safety, a standard feature on the Volvo XC60 model, is calculated based on insurance claims data from cars in real traffic crashes in Sweden. The estimated claim frequency of rear-end frontal collisions, measured in claims per 1,000 insured vehicle years, was 23% lower for the City Safety-equipped XC60 model than for other Volvo models without the system.

  10. AUTOMATIC THICKNESS AND VOLUME ESTIMATION OF SPRAYED CONCRETE ON ANCHORED RETAINING WALLS FROM TERRESTRIAL LIDAR DATA

    Directory of Open Access Journals (Sweden)

    J. Martínez-Sánchez

    2016-06-01

    When ground conditions are weak, particularly in free-formed tunnel linings or retaining walls, sprayed concrete can be applied on the exposed surfaces immediately after excavation for shotcreting rock outcrops. In these situations, shotcrete is normally applied conjointly with rock bolts and mesh, thereby supporting the loose material that causes many of the small ground falls. On the other hand, contractors want to determine the thickness and volume of sprayed concrete for both technical and economic reasons: to guarantee structural strength, but also not to deliver excess material that they will not be paid for. In this paper, we first introduce a terrestrial LiDAR-based method for the automatic detection of rock bolts, as typically used in anchored retaining walls. These ground support elements are segmented based on their geometry, and they serve as control points for the co-registration of two successive scans, before and after shotcreting. We then compare the two point clouds to estimate the sprayed concrete thickness and the volume of concrete expended on the wall. This novel methodology is demonstrated on repeated scan data from a retaining wall in the city of Vigo (Spain), resulting in a rock bolt detection rate of 91%, which permits obtaining detailed thickness information and calculating a total volume of 3597 litres of concrete. These results verify the effectiveness of the developed approach, increasing productivity and improving on previous empirical proposals for real-time thickness estimation.

  11. Automatic Thickness and Volume Estimation of Sprayed Concrete on Anchored Retaining Walls from Terrestrial LIDAR Data

    Science.gov (United States)

    Martínez-Sánchez, J.; Puente, I.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2016-06-01

    When ground conditions are weak, particularly in free-formed tunnel linings or retaining walls, sprayed concrete can be applied on the exposed surfaces immediately after excavation for shotcreting rock outcrops. In these situations, shotcrete is normally applied conjointly with rock bolts and mesh, thereby supporting the loose material that causes many of the small ground falls. On the other hand, contractors want to determine the thickness and volume of sprayed concrete for both technical and economic reasons: to guarantee structural strength, but also not to deliver excess material that they will not be paid for. In this paper, we first introduce a terrestrial LiDAR-based method for the automatic detection of rock bolts, as typically used in anchored retaining walls. These ground support elements are segmented based on their geometry, and they serve as control points for the co-registration of two successive scans, before and after shotcreting. We then compare the two point clouds to estimate the sprayed concrete thickness and the volume of concrete expended on the wall. This novel methodology is demonstrated on repeated scan data from a retaining wall in the city of Vigo (Spain), resulting in a rock bolt detection rate of 91%, which permits obtaining detailed thickness information and calculating a total volume of 3597 litres of concrete. These results verify the effectiveness of the developed approach, increasing productivity and improving on previous empirical proposals for real-time thickness estimation.
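
    A simple way to approximate per-point thickness between the co-registered "before" and "after" scans is a nearest-neighbour distance query, sketched below with SciPy; real shotcrete thickness estimation would measure along surface normals, which this toy version ignores, and the grid-based area estimate is an assumption of this sketch, not the paper's method.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def thickness_and_volume(before, after, cell=0.05):
        """Estimate thickness per 'after' point and total sprayed volume.

        before, after : (N,3) arrays of co-registered points (metres)
        cell          : grid cell size used to approximate surface area (metres)
        """
        tree = cKDTree(before)
        thickness, _ = tree.query(after)          # distance to pre-shotcrete wall
        # Approximate area per point by binning onto a 2D grid (a deliberate
        # simplification: one representative point per grid cell).
        ij = np.floor(after[:, :2] / cell).astype(int)
        _, first = np.unique(ij, axis=0, return_index=True)
        volume = np.sum(thickness[first]) * cell**2   # m^3
        return thickness, volume

    before = np.random.rand(10000, 3)
    after = before + np.array([0.0, 0.0, 0.08])      # fake 8 cm layer
    t, v = thickness_and_volume(before, after)
    print(f"mean thickness {t.mean():.3f} m, volume {v*1000:.0f} litres")
    ```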

  12. Automatic Tree Data Removal Method for Topography Measurement Result Using Terrestrial Laser Scanner

    Science.gov (United States)

    Yokoyama, H.; Chikatsu, H.

    2017-02-01

    Recently, laser scanning has been receiving greater attention as a useful tool for real-time 3D data acquisition, and various applications such as city modelling, DTM generation and 3D modelling of cultural heritage sites have been proposed. Digital archiving of cultural heritage sites demands careful processing of the acquired data. However, robust filtering methods for distinguishing on- and off-terrain points in terrestrial laser scanner data still face many issues. Past investigations have reported data processing using airborne laser scanners, but efficient methods for removing tree data from terrain points at cultural heritage sites have not been considered. In this paper, the authors describe a new robust filtering method for cultural heritage sites using a terrestrial laser scanner with "echo digital processing technology" as one of the latest data processing techniques of terrestrial laser scanners.

  13. X-ray data processing

    OpenAIRE

    Powell, Harold R.

    2017-01-01

    The method of molecular structure determination by X-ray crystallography is a little over a century old. The history is described briefly, along with developments in X-ray sources and detectors. The fundamental processes involved in measuring diffraction patterns on area detectors, i.e. autoindexing, refining crystal and detector parameters, integrating the reflections themselves and putting the resultant measurements on to a common scale, are discussed, with particular reference to the most commonly used software in the field.

  14. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

    A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline, without any manual operations. Data processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiment via a Web browser. (author)

  15. Processing data base information having nonwhite noise

    Science.gov (United States)

    Gross, Kenneth C.; Morreale, Patricia

    1995-01-01

    A method and system for processing a set of data from an industrial process and/or a sensor. The method and system can process either real or calculated data related to an industrial process variable. One of the data sets can be an artificial signal data set generated by an autoregressive moving average technique. After obtaining two data sets associated with one physical variable, a difference function data set is obtained by determining the arithmetic difference between the two data sets over time. A frequency-domain transformation is made of the difference function data set to obtain Fourier modes describing a composite function data set. A residual function data set is obtained by subtracting the composite function data set from the difference function data set, and the residual function data set (free of nonwhite noise) is analyzed by a statistical probability ratio test to provide a validated data base.
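
    A toy rendition of that pipeline (surrogate signal, difference, Fourier fit of the serially correlated structure, residual whitening, then a sequential probability ratio test) might look like the following; the window sizes, mode count and thresholds are arbitrary assumptions, not the patented parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Sensor signal plus a surrogate of the same physical variable.
    n = 2048
    t = np.arange(n)
    sensor = np.sin(2 * np.pi * t / 200) + 0.1 * rng.standard_normal(n)
    surrogate = np.sin(2 * np.pi * t / 200) + 0.1 * rng.standard_normal(n)

    diff = sensor - surrogate                       # difference function

    # Keep the dominant Fourier modes as the 'composite' (nonwhite) part.
    spectrum = np.fft.rfft(diff)
    power = np.abs(spectrum)
    mask = power < np.sort(power)[-8]               # zero all but the 8 largest modes
    composite = np.fft.irfft(np.where(mask, 0, spectrum), n)

    residual = diff - composite                     # should now be near-white

    def sprt(x, sigma, mean1=0.5, alpha=0.001, beta=0.001):
        """Sequential probability ratio test: H0 zero mean vs H1 shifted mean."""
        lo, hi = np.log(beta / (1 - alpha)), np.log((1 - beta) / alpha)
        llr = 0.0
        for xi in x:
            llr += (mean1 * xi - mean1**2 / 2) / sigma**2
            if llr <= lo:
                return "accept H0 (no fault)"
            if llr >= hi:
                return "accept H1 (fault)"
        return "undecided"

    print(sprt(residual, residual.std()))
    ```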

  16. Front-end data processing the SLD data acquisition system

    International Nuclear Information System (INIS)

    Nielsen, B.S.

    1986-07-01

    The data acquisition system for the SLD detector will make extensive use of parallel processing at the front-end level. Fastbus acquisition modules are being built with powerful processing capabilities for calibration, data reduction and further pre-processing of the large amount of analog data handled by each module. This paper describes the read-out electronics chain and the data pre-processing system adapted for most of the detector channels, exemplified by the central drift chamber waveform digitization and processing system

  17. X-ray data processing.

    Science.gov (United States)

    Powell, Harold R

    2017-10-31

    The method of molecular structure determination by X-ray crystallography is a little over a century old. The history is described briefly, along with developments in X-ray sources and detectors. The fundamental processes involved in measuring diffraction patterns on area detectors, i.e. autoindexing, refining crystal and detector parameters, integrating the reflections themselves and putting the resultant measurements on to a common scale are discussed, with particular reference to the most commonly used software in the field. © 2017 The Author(s).

  18. SCICEX Topsounder Data Processing Transition

    Science.gov (United States)

    2018-05-29

    Arctic sea ice thickness is critical to geophysical research into climate change, shipping, biological productivity and other things. The work included declassifying Topeka 2012 and Hampton 2014 data. Part of the work included the training of a new analyst/programmer (Beth Kirby) to transfer and …

  19. Instrumental development and data processing

    International Nuclear Information System (INIS)

    Franzen, J.

    1978-01-01

    A review of recent developments in mass spectrometry instrumentation is presented under the following headings: introduction (scope of mass spectrometry compared with neighbouring fields); ion sources and ionization techniques; spectrometers (instrumental developments); measuring procedures; coupling techniques; data systems; conclusions (that mass spectrometry should have a broader basis and that there would be mutual profit from a better penetration of mass spectrometry into fields of routine application). (U.K.)

  20. An analysis of metropolitan land-use by machine processing of earth resources technology satellite data

    Science.gov (United States)

    Mausel, P. W.; Todd, W. J.; Baumgardner, M. F.

    1976-01-01

    A successful application of state-of-the-art remote sensing technology to classifying an urban area into its broad land use classes is reported. This research proves that numerous urban features are amenable to classification using ERTS multispectral data automatically processed by computer. Furthermore, such automatic data processing (ADP) techniques permit areal analysis on an unprecedented scale with a minimum expenditure of time. Also, classification results obtained using ADP procedures are consistent, comparable, and replicable. The classification results are compared with the proposed U.S.G.S. land use classification system in order to determine the level of classification that it is feasible to obtain through ERTS analysis of metropolitan areas.

  1. Sample and data management process description

    International Nuclear Information System (INIS)

    Kessner, J.H.

    2000-01-01

    The sample and data management process was initiated in 1994 as a result of a process improvement workshop. The purpose of the workshop was to develop a sample and data management process that would reduce cycle time and costs, simplify systems and procedures, and improve customer satisfaction for sampling, analytical services, and data management activities.

  2. Digital processing data communication systems (bus systems)

    International Nuclear Information System (INIS)

    Fleck, K.

    1980-01-01

    After an introduction to the technology of digital processing data communication systems, the following chapters are presented: digital communication of processing data in automation technology; the technology of bit-serial communication; the implementation of a bus system; data transmission in Honeywell's TDC-2000 system; and the CS 275 process bus in the Siemens AG automation system TELEPERM M. (WB) [de]

  3. Automatic Sky View Factor Estimation from Street View Photographs—A Big Data Approach

    Directory of Open Access Journals (Sweden)

    Jianming Liang

    2017-04-01

    Hemispherical (fisheye) photography is a well-established approach for estimating the sky view factor (SVF). High-resolution urban models from LiDAR and oblique airborne photogrammetry can provide continuous SVF estimates over a large urban area, but such data are not always available and are difficult to acquire. Street view panoramas have become widely available in urban areas worldwide: Google Street View (GSV) maintains a global network of panoramas excluding China and several other countries; Baidu Street View (BSV) and Tencent Street View (TSV) focus their panorama acquisition efforts within China and have covered hundreds of cities there. In this paper, we approach this issue from a big data perspective by presenting and validating a method for automatic estimation of SVF from massive amounts of street view photographs. Comparisons were made with SVF estimates derived from two independent sources, a LiDAR-based Digital Surface Model (DSM) and an oblique airborne photogrammetry-based 3D city model (OAP3D), resulting in correlation coefficients of 0.863 and 0.987, respectively. The comparisons demonstrated the capacity of the proposed method to provide reliable SVF estimates. Additionally, we present an application of the proposed method with about 12,000 GSV panoramas to characterize the spatial distribution of SVF over Manhattan Island in New York City. Although this is a proof-of-concept study, it has shown the potential of the proposed approach to assist urban climate and urban planning research. However, further development is needed before this approach can be delivered to the urban climate and urban planning communities for practical applications.
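
    Once a street view panorama has been reprojected to a fisheye view and the sky segmented, the SVF itself reduces to a weighted sum over zenith-angle annuli; the sketch below implements only that final step for an equiangular fisheye sky mask, with the segmentation assumed already done.

    ```python
    import numpy as np

    def sky_view_factor(sky_mask, n_rings=36):
        """SVF from a square, equiangular fisheye sky mask (True = sky pixel).

        The image is assumed centred on the zenith, with the horizon at the
        inscribed circle; SVF = sum_i p_i * sin(2*theta_i) * dtheta over annuli.
        """
        h, w = sky_mask.shape
        cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
        radius = min(cx, cy)
        y, x = np.mgrid[0:h, 0:w]
        r = np.hypot(y - cy, x - cx) / radius      # 0 at zenith, 1 at horizon
        theta = r * (np.pi / 2)                    # equiangular: zenith angle
        dtheta = (np.pi / 2) / n_rings
        svf = 0.0
        for i in range(n_rings):
            ring = (theta >= i * dtheta) & (theta < (i + 1) * dtheta)
            if ring.any():
                p_sky = sky_mask[ring].mean()      # sky fraction in this annulus
                mid = (i + 0.5) * dtheta
                svf += p_sky * np.sin(2 * mid) * dtheta
        return svf

    mask = np.ones((512, 512), dtype=bool)         # fully open sky
    print(round(sky_view_factor(mask), 3))         # ~1.0
    ```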

  4. Methodology for automatic process of the fired ceramic tile's internal defect using IR images and artificial neural network

    OpenAIRE

    Andrade, Roberto Márcio de; Eduardo, Alexandre Carlos

    2011-01-01

    In the ceramic industry, testing systems have rarely been employed to detect the presence of defects in ceramic tiles on-line. This paper is concerned with the problem of automatic inspection of ceramic tiles using infrared images and an Artificial Neural Network (ANN). The performance of the technique has been evaluated theoretically and experimentally on laboratory and on-line tile samples. A system for IR image processing has been implemented and, utilizing an Artificial Neural Network (ANN), det...

  5. Automatic Service Derivation from Business Process Model Repositories via Semantic Technology

    NARCIS (Netherlands)

    Leopold, H.; Pittke, F.; Mendling, J.

    2015-01-01

    Although several approaches for service identification have been defined in research and practice, there is a notable lack of fully automated techniques. In this paper, we address the problem of manual work in the context of service derivation and present an approach for automatically deriving

  6. Experiences with automatic N and P measurements of an activated sludge process in a research environment

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard; Temmink, H.

    1996-01-01

    Some of the advantages of on-line automatic measurement of ammonia, nitrate and phosphate for studying activated sludge systems are pointed out with the help of examples of batch experiments. Sample taking is performed by cross-flow filtration and measurement of all three analytes is performed by...

  7. Cart'Eaux: an automatic mapping procedure for wastewater networks using machine learning and data mining

    Science.gov (United States)

    Bailly, J. S.; Delenne, C.; Chahinian, N.; Bringay, S.; Commandré, B.; Chaumont, M.; Derras, M.; Deruelle, L.; Roche, M.; Rodriguez, F.; Subsol, G.; Teisseire, M.

    2017-12-01

    In France, local government institutions must establish a detailed description of their wastewater networks. The information should be available, but it remains fragmented (different formats held by different stakeholders) and incomplete. In the "Cart'Eaux" project, a multidisciplinary team, including an industrial partner, is developing a global methodology using machine learning and data mining approaches applied to various types of large data to recover information, with the aim of mapping urban sewage systems for hydraulic modelling. Deep learning is first applied, using a convolutional neural network, to localize manhole covers on 5 cm resolution aerial RGB images. The detected manhole covers are then automatically connected using a tree-shaped graph constrained by industry rules. Based on a Delaunay triangulation, connections are chosen to minimize a cost function depending on pipe length, slope and possible intersections with roads or buildings. A stochastic version of this algorithm is currently being developed to account for positional uncertainty and detection errors and to generate sets of probable networks. As more information is required for hydraulic modelling (slopes, diameters, materials, etc.), text data mining is used to extract network characteristics from data posted on the Web or available through governmental or specific databases. Using an appropriate list of keywords, the Web is scoured for documents, which are saved in text format. The thematic entities are identified and linked to the surrounding spatial and temporal entities. The methodology is developed and tested on two towns in southern France. The primary results are encouraging: 54% of manhole covers are detected with few false detections, enabling the reconstruction of probable networks. The data mining results are still being investigated. It is clear at this stage that obtaining numerical values for specific pipes will be challenging. Thus, when no information is found, decision rules will be used to
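
    The connection step described above (candidate edges from a Delaunay triangulation, then a tree minimizing an edge cost) can be prototyped in a few lines with SciPy and NetworkX; the cost function below uses length only, with slope and obstacle penalties left as placeholders, since the project's actual industry rules are not given.

    ```python
    import numpy as np
    import networkx as nx
    from scipy.spatial import Delaunay

    def probable_network(manholes):
        """Tree-shaped network over detected manhole positions (x, y in metres)."""
        tri = Delaunay(manholes)
        g = nx.Graph()
        for simplex in tri.simplices:               # candidate edges from Delaunay
            for i in range(3):
                a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
                length = np.linalg.norm(manholes[a] - manholes[b])
                cost = length                        # + slope/obstacle penalties here
                g.add_edge(a, b, weight=cost)
        return nx.minimum_spanning_tree(g)          # tree minimizing total cost

    manholes = np.random.rand(30, 2) * 500.0        # fake detections in a 500 m tile
    net = probable_network(manholes)
    print(net.number_of_nodes(), "nodes,", net.number_of_edges(), "pipes")
    ```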

  8. Automatic classification of fluorescence and optical diffusion spectroscopy data in neuro-oncology

    Science.gov (United States)

    Savelieva, T. A.; Loshchenov, V. B.; Goryajnov, S. A.; Potapov, A. A.

    2018-04-01

    This work is motivated by the complexity of the spectroscopic analysis of biological tissue, which arises from the overlap of the absorption spectra of biological molecules, multiple scattering effects, and the measurement geometry in vivo. In neuro-oncology the problem of tumor boundary delineation is especially acute and requires the development of new methods of intraoperative diagnosis. Methods of optical spectroscopy allow various diagnostically significant parameters to be detected non-invasively. 5-ALA-induced protoporphyrin IX is frequently used as a fluorescent tumor marker in neuro-oncology. At the same time, analysis of the concentration and oxygenation level of haemoglobin, and of the significant changes in light scattering in tumor tissues, has high diagnostic value. This paper presents an original method for the simultaneous registration of backward diffuse reflectance and fluorescence spectra, which allows all the parameters listed above to be determined simultaneously. Clinical studies involving 47 patients with intracranial glial tumors of Grades II-IV were carried out at the N.N. Burdenko National Medical Research Center of Neurosurgery. To register the spectral dependences, the LESA-01-BIOSPEC spectroscopic system was used with a specially developed W-shaped diagnostic fiber-optic probe. An original algorithm for combined spectroscopic signal processing was developed. We have created software and hardware which (as compared with the methods currently used in neurosurgical practice) increased the sensitivity of intraoperative demarcation of intracranial tumors from 78% to 96% and the specificity from 60% to 82%. Analysis of different automatic classification techniques shows that the most appropriate in our case is the k-nearest neighbors algorithm with a cubic metric.
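
    If the "cubic metric" is read as a Minkowski distance with p = 3, such a classifier can be reproduced with scikit-learn as below; the features and data are placeholders standing in for the spectroscopy-derived parameters, not the clinical dataset.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Placeholder features: PpIX fluorescence, haemoglobin concentration,
    # oxygenation, scattering index -- one row per measured tissue site.
    rng = np.random.default_rng(42)
    X_tumor = rng.normal([5.0, 1.2, 0.6, 2.0], 0.8, size=(100, 4))
    X_normal = rng.normal([1.0, 1.0, 0.8, 1.0], 0.8, size=(100, 4))
    X = np.vstack([X_tumor, X_normal])
    y = np.array([1] * 100 + [0] * 100)             # 1 = tumor, 0 = normal

    # k-nearest neighbors with a cubic (Minkowski p=3) distance metric.
    clf = make_pipeline(
        StandardScaler(),
        KNeighborsClassifier(n_neighbors=5, metric="minkowski", p=3),
    )
    print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy
    ```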

  9. Development of automatic radiographic inspection system using digital image processing and artificial intelligence

    International Nuclear Information System (INIS)

    Itoga, Kouyu; Sugimoto, Koji; Michiba, Koji; Kato, Yuhei; Sugita, Yuji; Onda, Katsuhiro.

    1991-01-01

    The application of computers to welding inspection is expanding rapidly. The applications can be classified as the collection, analysis and processing of data; the graphic display of results; the identification of the kinds of defects; and the evaluation of the harmfulness of defects with a judgement of acceptance or rejection. The application of computer techniques to the automation of data collection was realized at a relatively early stage. Data processing and the graphic display of results are techniques now in progress, and the application of artificial intelligence to the identification of the kinds of defects and the evaluation of their harmfulness is expected to expand rapidly. In order to computerize radiographic inspection, computers must be given the capabilities of image processing technology and knowledge engineering. The object of this system is arc-welded butt joints in steel materials of up to 30 mm thickness. Digitization of the radiographs and determination and evaluation of transmissivity and gradation by image processing are carried out, and, only for those images whose picture quality satisfies the standard, extraction of defect images, their display, identification of the kinds of defects and the final judgement follow. The techniques of image processing, the knowledge for identifying the kinds of defects and the concept of the practical system are reported. (K.I.)

  10. Evaluation and processing of covariance data

    International Nuclear Information System (INIS)

    Wagner, M.

    1993-01-01

    These proceedings of a specialists' meeting on evaluation and processing of covariance data are divided into four parts: Part 1, needs for evaluated covariance data (2 papers); Part 2, generation of covariance data (15 papers); Part 3, processing of covariance files (2 papers); Part 4, experience in the use of evaluated covariance data (2 papers)

  11. Combined Acquisition/Processing For Data Reduction

    Science.gov (United States)

    Kruger, Robert A.

    1982-01-01

    Digital image processing systems necessarily consist of three components: acquisition, storage/retrieval and processing. The acquisition component requires the greatest data handling rates. By coupling the acquisition with some online hardwired processing, data rates and capacities for short-term storage can be reduced. Furthermore, long-term storage requirements can be reduced further by appropriate processing and editing of image data contained in short-term memory. The net result could be reduced performance requirements for mass storage, processing and communication systems. Reduced amounts of data should also speed later data analysis and diagnostic decision making.

  12. It's a two-way street: Automatic and controlled processes in children's emotional responses to moral transgressions.

    Science.gov (United States)

    Dys, Sebastian P; Malti, Tina

    2016-12-01

    This study examined children's automatic, spontaneous emotional reactions to everyday moral transgressions and their relations with self-reported emotions, which are more complex and infused with controlled cognition. We presented children (N = 242; 4-, 8-, and 12-year-olds) with six everyday moral transgression scenarios in an experimental setting, and both their spontaneous facial emotional reactions and their self-reported emotions in the role of the transgressor were recorded. We found that, across ages, self-reported guilt was positively associated with spontaneous fear, and self-reported anger was positively related to spontaneous sadness. In addition, we found a developmental increase in spontaneous sadness and a decrease in spontaneous happiness. These results support the importance of both automatic and controlled processes in evoking children's emotional responses to everyday moral transgressions. We conclude by providing potential explanations for how automatic and controlled processes function in children's everyday moral experiences and how these processes may change with age. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Buffer mass test - data acquisition and data processing systems

    International Nuclear Information System (INIS)

    Hagvall, B.

    1982-08-01

    This report describes the data acquisition and data processing systems used for the Buffer Mass Test at Stripa. The data acquisition system in Stripa, designed mainly for high reliability, produces raw-data log tapes. Copies of these tapes are mailed to the computer center at the University of Luleå for processing of the raw data. The computer systems in Luleå offer a wide range of processing facilities: large mass storage units, several plotting facilities, programs for processing and monitoring vast amounts of data, etc. (Author)

  14. Automatic Sample and Data Processing in Studies of Calcium Metabolism in Rats

    Energy Technology Data Exchange (ETDEWEB)

    Onkelinx, C.; Richelle, L. J.; Debras, J. [Universite de Liege (Belgium)

    1965-10-15

    The study of calcium metabolism in rats as a function of age or of various forms of treatment entails experiments on large numbers of animals. These investigations involve: (i) studying the way in which the serum concentration of a tracer dose of ⁴⁵Ca injected intravenously varies as a function of time, and (ii) carrying out measurements of chemical and radiochemical balance. By combining these two types of information and subjecting them to mathematical analysis it is possible to evolve a general model of calcium metabolism. This model can then be used to deduce the size of the exchangeable compartments and the relative importance of the different metabolic paths, such as intestinal absorption, renal and intestinal excretion, and deposition and elimination of bone calcium. The authors' work on these subjects was facilitated by the development of automatic methods for measuring the samples and processing the data, and these methods are the subject of their paper. Processing of samples: the radioactivity measurements are carried out on small samples (20-40 λ) of plasma, removed at repeated intervals, and on the total quantities of faeces and urine excreted in a given period. The measuring apparatus comprises a sample feed, a low-background anti-coincidence counter and a digital computer; the measurements obtained from the computer are recorded on a printer. The novel features of the sample preparation techniques and the performance achieved by the measuring apparatus are discussed, with special reference to the (statistical) counting conditions, which are checked by the computer each time new measurements have to be calculated. Processing of data: this is done by an IBM-7040 digital computer, into which are fed the programme for the calculation and all the uncorrected experimental data in the form of punched cards, separately for each animal. There are three stages to the data-processing operation, namely: (1) converting the raw data and calculating

  15. Reproducibility of wrist home blood pressure measurement with position sensor and automatic data storage

    Directory of Open Access Journals (Sweden)

    Nickenig Georg

    2009-05-01

    Background: Wrist blood pressure (BP) devices have physiological limits with regard to accuracy, so they have not been preferred for home BP monitoring. However, some wrist devices have been successfully validated using established validation protocols. This study therefore assessed the reproducibility of wrist home BP measurement with position sensor and automatic data storage. Methods: To compare the reproducibility of three different BP measurement methods, namely (1) office BP, (2) home BP (Omron wrist device HEM-637 IT with position sensor) and (3) 24-hour ambulatory BP (24-h ABPM; ABPM-04, Meditech, Hungary), conventional sphygmomanometric office BP was measured on study days 1 and 7, 24-h ABPM on study days 7 and 14, and home BP between study days 1 and 7 and between study days 8 and 14 in 69 hypertensive and 28 normotensive subjects. The correlation coefficient of each BP measurement method with the echocardiographic left ventricular mass index was analyzed. The schedule of home readings followed recently published European Society of Hypertension (ESH) guidelines. Results: The reproducibility of home BP measurement, analyzed by the standard deviation as well as the squared differences of the mean individual differences between the respective BP measurements, was significantly higher than that of office BP (p …). Conclusion: The short-term reproducibility of home BP measurement with the Omron HEM-637 IT wrist device was superior to that of office BP and 24-h ABPM measurement. Furthermore, home BP with the wrist device showed correlations with target organ damage similar to those recently reported for upper arm devices. Although wrist devices have to be used cautiously and within defined limitations, the use of validated devices with position sensor according to recently recommended measurement schedules may have the potential to be used for therapy monitoring.

  16. Automatic UAV Image Geo-Registration by Matching UAV Images to Georeferenced Image Data

    Directory of Open Access Journals (Sweden)

    Xiangyu Zhuo

    2017-04-01

    Recent years have witnessed the fast development of UAVs (unmanned aerial vehicles). As an alternative to traditional image acquisition methods, UAVs bridge the gap between terrestrial and airborne photogrammetry and enable flexible acquisition of high-resolution images. However, the georeferencing accuracy of UAVs is still limited by the low-performance on-board GNSS and INS. This paper investigates automatic geo-registration of an individual UAV image or of UAV image blocks by matching the UAV image(s) with a previously taken georeferenced image, such as an individual aerial or satellite image with a height map attached, or an aerial orthophoto with a DSM (digital surface model) attached. As the biggest challenge for matching UAV and aerial images lies in their large differences in scale and rotation, we propose a novel feature matching method for nadir or slightly tilted images. The method comprises a dense feature detection scheme, a one-to-many matching strategy and a global geometric verification scheme. The proposed method is able to find thousands of valid matches in cases where SIFT and ASIFT fail. Those matches can be used to geo-register the whole UAV image block towards the reference image data. When the reference images offer high georeferencing accuracy, the UAV images can also be geolocalized in a global coordinate system. A series of experiments involving different scenarios was conducted to validate the proposed method. The results demonstrate that our approach achieves not only decimeter-level registration accuracy, but also global accuracy comparable to that of the reference images.

  17. Editorial: "Business process intelligence : connecting data and processes"

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Zhao, J.L.; Wang, H.; Wang, Harry Jiannan

    2015-01-01

    This introduction to the special issue on Business Process Intelligence (BPI) discusses the relation between data and processes. The recent attention for Big Data illustrates that organizations are aware of the potential of the torrents of data generated by today's information systems. However, at

  18. Automatic processing of list of journals and publications in the Nuclear Research Institute

    International Nuclear Information System (INIS)

    Vymetal, L.

    Using an EC 1040 computer, the Institute of Nuclear Research processed the list of journals in the reference library of the Czechoslovak Atomic Energy Commission, including journals acquired by all institutions subordinated to the Commission, i.e., UJV Rez (Nuclear Research Institute), the Nuclear Information Centre Prague, UVVVR Prague (Institute for Research, Production and Application of Radioisotopes) and the Institute of Radioecology and Applied Nuclear Techniques Kosice. Computer processing allowed files to be obtained arranged by library, subject matter of the journals, country of publication, and journal title. Automated processing of publications by UJV staff is being prepared. The preparation of data for computer processing of both files is described, and specimens of printouts are shown. (Ha)

  19. A framework for automatic segmentation in three dimensions of microstructural tomography data

    DEFF Research Database (Denmark)

    Jørgensen, Peter Stanley; Hansen, Karin Vels; Larsen, Rasmus

    2010-01-01

    Routine use of quantitative three-dimensional analysis of material microstructure, in particular by focused ion beam (FIB) serial sectioning, is generally restricted by the time-consuming task of manually delineating structures within each image slice or by the quality of manual and automatic segmentation schemes. We present here a framework for performing automatic segmentation of complex microstructures using a level set method. The technique is based on numerical approximations to partial differential equations that evolve a 3D surface to capture the phase boundaries. Vector fields derived from
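
    Level set segmentation of a 3D volume in the spirit described above can be tried with scikit-image's morphological Chan-Vese implementation, which evolves an implicit surface without an explicit PDE solver; the synthetic volume below is a stand-in for a FIB-SEM stack, and the iteration count and smoothing are arbitrary choices.

    ```python
    import numpy as np
    from skimage.segmentation import morphological_chan_vese

    # Synthetic 3D 'two-phase' volume: a bright ball in a noisy matrix,
    # standing in for a FIB-SEM image stack.
    z, y, x = np.mgrid[0:64, 0:64, 0:64]
    volume = ((z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 15 ** 2).astype(float)
    volume += 0.3 * np.random.default_rng(0).standard_normal(volume.shape)

    # Evolve a 3D level set for 40 iterations from a checkerboard initialization;
    # the result is a binary mask capturing the phase boundary.
    mask = morphological_chan_vese(volume, 40, init_level_set="checkerboard",
                                   smoothing=1)
    print(mask.shape, mask.sum())  # voxels assigned to the segmented phase
    ```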

  20. A prototype for JDEM science data processing

    International Nuclear Information System (INIS)

    Gottschalk, Erik E

    2011-01-01

    Fermilab is developing a prototype science data processing and data quality monitoring system for dark energy science. The purpose of the prototype is to demonstrate distributed data processing capabilities for astrophysics applications, and to evaluate candidate technologies for trade-off studies. We present the architecture and technical aspects of the prototype, including an open source scientific execution and application development framework, distributed data processing, and publish/subscribe message passing for quality control.